Blank Review for 0.8 DRAFT
Score:

Overview

This is a Process Quality Review completed in 2021. It was performed using the Process Review process (version 0.8) and is documented here. The review was performed by ___ of DeFiSafety. Check out our Telegram.
The final score of the review is ___%, a ____. The breakdown of the scoring is in the Scoring Appendix. For our purposes, a pass is 70%.

Summary of the Process

Very simply, the review looks for the following declarations from the developer's site. With these declarations, it is reasonable to trust the smart contracts.
  • Here are my smart contracts on the blockchain
  • Here is the documentation that explains what my smart contracts do
  • Here are the tests I ran to verify my smart contract
  • Here are the audit(s) performed on my code by third party experts
  • Here are the admin controls and strategies

Disclaimer

This report is for informational purposes only and does not constitute investment advice of any kind, nor does it constitute an offer to provide investment advisory or other services. Nothing in this report shall be considered a solicitation or offer to buy or sell any security, token, future, option or other financial instrument or to offer or provide any investment advice or service to any person in any jurisdiction. Nothing contained in this report constitutes investment advice or offers any opinion with respect to the suitability of any security, and the views expressed in this report should not be taken as advice to buy, sell or hold any security. The information in this report should not be relied upon for the purpose of investing. In preparing the information contained in this report, we have not taken into account the investment needs, objectives and financial circumstances of any particular investor. This information has no regard to the specific investment objectives, financial situation and particular needs of any specific recipient of this information and investments discussed may not be suitable for all investors.
Any views expressed in this report by us were prepared based upon the information available to us at the time such views were written. The views expressed within this report are limited to DeFiSafety and the author and do not reflect those of any additional or third party and are strictly based upon DeFiSafety, its authors, interpretations and evaluation of relevant data. Changed or additional information could cause such views to change. All information is subject to possible correction. Information may quickly become unreliable for various reasons, including changes in market conditions or economic circumstances.
This completed report is copyright (c) DeFiSafety 2021. Permission is given to copy in whole, retaining this copyright label.

Chain

This section indicates the blockchain used by this protocol.
Chain:
Guidance: Ethereum, Binance Smart Chain, Polygon, Avalanche, Terra

Code and Team

This section looks at the code deployed on the Mainnet that gets reviewed and its corresponding software repository. The document explaining these questions is here. This review will answer the following questions:
1) Are the executing code addresses readily available? (%)
2) Is the code actively being used? (%)
3) Is there a public software repository? (Y/N)
4) Is there a development history visible? (%)
5) Is the team public (not anonymous)? (Y/N)

1) Are the executing code addresses readily available? (%)

Answer:
They are available at website ____, as indicated in the Appendix.
Guidance:
100% Clearly labelled and on website, docs or repo, quick to find
70% Clearly labelled and on website, docs or repo but takes a bit of looking
40% Addresses in mainnet.json, in Discord or a subgraph, etc.
20% Address found but labelling not clear or easy to find
0% Executing addresses could not be found

How to improve this score:

Make the Ethereum addresses of the smart contracts utilized by your application available on either your website or your GitHub (in the README, for instance). Ensure the addresses are up to date. This is a very important question for the final score.

2) Is the code actively being used? (%)

Answer:
Activity is ___ transactions a day on contract ____, as indicated in the Appendix.

Guidance:

100% More than 10 transactions a day
70% More than 10 transactions a week
40% More than 10 transactions a month
10% Less than 10 transactions a month
0% No activity

3) Is there a public software repository? (Y/N)

Answer:
GitHub:
This question asks whether there is a public software repository containing, at a minimum, the deployed code, but normally also tests and scripts. Even if the repository was created just to hold the files and has only one commit, it gets a "Yes". For teams with private repositories, this answer is "No".

4) Is there a development history visible? (%)

Answer:
This metric checks whether the software repository demonstrates a strong, steady development history, normally shown through commits, branches and releases. A healthy repository demonstrates more than a month of history (at a minimum).
Guidance:
100% Any one of 100+ commits, 10+ branches
70% Any one of 70+ commits, 7+ branches
50% Any one of 50+ commits, 5+ branches
30% Any one of 30+ commits, 3+ branches
0% Less than 2 branches or less than 30 commits

How to improve this score:

Continue to test and perform other verification activities after deployment, including routine maintenance such as updating to new releases of testing and deployment tools. A public development history clearly shows the level of continued investment and activity by the developers on the application. This gives users a degree of security and confidence in the application.

5) Is the team public (not anonymous)? (Y/N)

Answer:
Location:
For a "Yes" in this question, the real names of some team members must be public on the website or other documentation (LinkedIn, etc). If the team is anonymous, then this question is a "No".

Documentation

This section looks at the software documentation. The document explaining these questions is here.
Required questions are:
6) Is there a whitepaper? (Y/N)
7) Does the documentation cover protocol architecture? (%)
8) Does the software function documentation fully (100%) cover the deployed contracts? (%)
9) Are there sufficiently detailed comments for all functions within the deployed contract code? (%)
10) Is it possible to trace from software documentation to the implementation in code? (%)

6) Is there a whitepaper? (Y/N)

Answer:
Location:

How to improve this score:

Ensure that the whitepaper is available for download from your website or at least from the software repository. Ideally, update the whitepaper to reflect the capabilities of your present application.

7) Does the documentation cover protocol architecture? (%)

Answer:

Guidance

100% Documentation includes a diagram of the smart contract architecture as well as a written explanation of how the contracts interact with each other.
70% Documentation includes a written explanation of how the smart contracts interact with each other, but no diagram.
40% A diagram of the smart contract architecture is included in the documentation.
0% No information about protocol architecture.

How to improve this score:

Write the document based on the deployed code. For guidance, refer to the SecurEth System Description Document. https://consensys.github.io/smart-contract-best-practices/security_tools/

8) Does the software function documentation fully (100%) cover the deployed contracts? (%)

Answer:
Guidance:
100% All contracts and functions documented
80% Only the major functions documented
79-1% Estimate of the level of software documentation
0% No software documentation

How to improve this score:

This score can be improved by adding content to the software functions document such that it comprehensively covers the requirements. For guidance, refer to the SecurEth System Description Document. Using tools that aid traceability detection will help.

9) Are there sufficiently detailed comments for all functions within the deployed contract code (%)

Answer:
Code examples are in the Appendix. As per the SLOC, there is ___% commenting to code (CtC).
The Comments to Code (CtC) ratio is the primary metric for this score. For example, 600 comment lines against 1,000 lines of code gives a CtC of 60%.
Guidance:
100% CtC > 100% Useful comments consistently on all code
90-70% CtC > 70% Useful comments on most code
60-20% CtC > 20% Some useful commenting
0% CtC < 20% No useful commenting

How to improve this score

This score can be improved by adding comments to the deployed code such that they comprehensively cover the code. For guidance, refer to the SecurEth Software Requirements.
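For illustration only, here is a hedged sketch of the kind of NatSpec-style commenting this question rewards; the contract and function names are invented for this example and do not refer to any reviewed protocol.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

/// @title ExampleVault - hypothetical contract used only to illustrate commenting style.
contract ExampleVault {
    /// @notice Amount of ETH (in wei) each depositor currently holds in the vault.
    mapping(address => uint256) public balances;

    /// @notice Deposit ETH into the vault.
    /// @dev Reverts on zero-value deposits so that empty accounts are never created.
    function deposit() external payable {
        require(msg.value > 0, "zero deposit");
        balances[msg.sender] += msg.value;
    }

    /// @notice Withdraw previously deposited ETH.
    /// @param amount Amount of wei to withdraw; must not exceed the caller's balance.
    function withdraw(uint256 amount) external {
        require(balances[msg.sender] >= amount, "insufficient balance");
        balances[msg.sender] -= amount;
        (bool ok, ) = msg.sender.call{value: amount}("");
        require(ok, "transfer failed");
    }
}
```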

10) Is it possible to trace from software documentation to the implementation in code (%)

Answer:
Guidance:
100% Clear explicit traceability between code and documentation at a requirement level for all code
60% Clear association between code and documents via non-explicit traceability
40% Documentation lists all the functions and describes what they do
0% No connection between documentation and code

How to improve this score:

This score can improve by adding traceability from documentation to code such that it is clear where each outlined function is coded in the source code. For reference, check the SecurEth guidelines on traceability.
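As a hypothetical illustration of explicit traceability (the requirement identifier and all names below are invented), a deployed function can reference the documentation section it implements directly in its NatSpec comments:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

/// @title Hypothetical staking contract, shown only to illustrate requirement traceability.
contract ExampleStaking {
    mapping(address => uint256) public staked;

    /// @notice Stake ETH on behalf of the caller.
    /// @dev Implements requirement SDD-3.2 "Staking" of the System Description Document
    ///      (hypothetical reference; a real project would cite its own document sections).
    function stake() external payable {
        staked[msg.sender] += msg.value;
    }
}
```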

Testing

This section looks at the software testing available. It is explained in this document. This section answers the following questions:
11) Full test suite (Covers all the deployed code) (%)
12) Code coverage (Covers all the deployed lines of code, or explains misses) (%)
13) Scripts and instructions to run the tests (Y/N)
14) Report of the results (%)
15) Formal Verification test done (Y/N)
16) Stress Testing environment (Y/N)

11) Is there a Full test suite? (%)

Answer:
Code examples are in the Appendix. As per the SLOC, there is ___% testing to code (TtC).
This score is guided by the Test to Code ratio (TtC). Generally, a good test to code ratio is over 100%. However, the reviewer's best judgement is the final deciding factor.
Guidance:
100% TtC > 120% Both unit and system tests visible
80% TtC > 80% Both unit and system tests visible
40% TtC < 80% Some tests visible
0% No tests obvious

How to improve this score:

This score can be improved by adding tests to fully cover the code. Document what is covered by traceability or test results in the software repository.
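As one possible sketch only: a Foundry-style Solidity unit test for a hypothetical vault, assuming the forge-std library is available (many protocols use JavaScript/Hardhat test suites instead; the contract and its behaviour are invented for this example):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

import "forge-std/Test.sol"; // assumes a Foundry project with forge-std installed

// Minimal vault defined inline so the sketch is self-contained.
contract ExampleVault {
    mapping(address => uint256) public balances;

    function deposit() external payable {
        require(msg.value > 0, "zero deposit");
        balances[msg.sender] += msg.value;
    }
}

contract ExampleVaultTest is Test {
    ExampleVault vault;

    function setUp() public {
        vault = new ExampleVault();
    }

    // Unit test: a deposit is credited to the sender's balance.
    function testDepositCreditsBalance() public {
        vault.deposit{value: 1 ether}();
        assertEq(vault.balances(address(this)), 1 ether);
    }

    // Unit test: zero-value deposits must revert with the documented reason.
    function testZeroDepositReverts() public {
        vm.expectRevert(bytes("zero deposit"));
        vault.deposit{value: 0}();
    }
}
```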

12) Code coverage (Covers all the deployed lines of code, or explains misses) (%)

Answer:
Guidance:
100% Documented full coverage
99-51% Value of test coverage from documented results
50% No indication of code coverage but clearly there is a reasonably complete set of tests
30% Some tests evident but not complete
0% No test for coverage seen

How to improve this score:

This score can be improved by adding tests that achieve full code coverage. A clear report and scripts in the software repository will guarantee a high score.

13) Scripts and instructions to run the tests (Y/N)

Answer:
Scripts/Instructions location:

How to improve this score:

Add the scripts to the repository and ensure they work. Ask an outsider to create the environment and run the tests. Improve the scripts and docs based on their feedback.

14) Report of the results (%)

Answer:
Guidance:
100% Detailed test report as described below
70% GitHub code coverage report visible
0% No test report evident

How to improve this score

Add a report with the results. The test scripts should generate the report or elements of it.

15) Formal Verification test done (Y/N)

Answer:
Guidance:
For a "Yes", there must be evidence of formal verification testing with a link to a report somewhere in the protocol's documentation.
For a "No", there is no evidence of formal verification testing having been completed by the protocol in their documentation.
Note: Evidence of formal verification performed by a firm but NOT linked in the protocol's own documentation does not count.

16) Stress Testing environment (Y/N)

Answer:
Guidance:
For a "Yes", the protocol must have testnet addresses labelled in a section of their documentation that have clearly been transacted upon (there must be transactions on Etherscan).
For a "No", the protocol has no evident testnet addresses available.

Security

This section looks at the 3rd party software audits done. It is explained in this document. This section answers the following questions:
17) Did 3rd party audits take place? (%)
18) Is the bounty value acceptably high? (%)

17) Did 3rd Party audits take place? (%)

Answer:
Guidance:
100% Multiple audits performed before deployment and results public and implemented or not required
90% Single audit performed before deployment and results public and implemented or not required
70% Audit(s) performed after deployment and no changes required; audit report is public
60% Audit(s) performed before deployment and changes needed but not implemented
50% Audit(s) performed after deployment and changes needed but not implemented
30% Audit(s) performed are low quality and do not indicate proper due diligence
20% No audit performed
0% Audit performed after deployment, existence is public, report is not public and no improvements deployed, OR smart contract addresses not found (where question 1 is 0%)
Deduct 25% if the code is in a private repo and there is no note from the auditors that the audit applies to the deployed code.

18) Is the bounty value acceptably high (%)

Answer:
Guidance:
100% Bounty is 10% of TVL or at least $1M, AND an active program (see below)
90% Bounty is 5% of TVL or at least $500k, AND an active program
80% Bounty is 5% of TVL or at least $500k
70% Bounty is $100k or over, AND an active program
60% Bounty is $100k or over
50% Bounty is $50k or over, AND an active program
40% Bounty is $50k or over
20% Bug bounty program bounty is less than $50k
0% No bug bounty program offered
An active program means that a third party (such as Immunefi) is actively driving hackers to the site. An inactive program would be static mentions in the docs.

Access Controls

This section covers the documentation of special access controls for a DeFi protocol. The admin access controls are the contracts that allow updating contracts or coefficients in the protocol. Since these contracts can allow the protocol admins to "change the rules", complete disclosure of these capabilities is vital for users' transparency. It is explained in this document. The questions this section asks are as follows:
19) Can a user clearly and quickly find the status of the access controls? (%)
20) Are the contracts clearly labelled as upgradeable? (%)
21) Is the type of ownership clearly indicated (OnlyOwner / MultiSig / Roles)? (%)
22) Are the capabilities for change in the contract described? (%)
23) Is the information in non-technical terms that pertain to the investments? (%)
24) Is there Pause Control documentation including records of tests? (%)
25) Is there Timelock function documentation?
26) Is the Timelock an adequate length?

19) Can a user clearly and quickly find the status of the access controls (%)

Answer:
Guidance:
100% Clearly labelled and on website, docs or repo, quick to find
70% Clearly labelled and on website, docs or repo but takes a bit of looking
40% Access control docs in multiple places and not well labelled
20% Access control docs in multiple places and not labelled
0% Admin control information could not be found

20) Are the contracts clearly labelled as upgradeable? (%)

Answer:
Guidance:
100% Both the contract documentation and the smart contract code state that the code is not upgradeable (i.e. immutable)
80% All contracts are clearly labelled as upgradeable (or not)
50% The code is immutable but this is not mentioned anywhere in the documentation
0% Admin control information could not be found

How to improve this score:

Create a document that covers the items described above. An example is enclosed.
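For instance, upgradeability status can also be flagged directly in the verified source with a NatSpec custom tag. This is only a hypothetical sketch; the contract name, proxy pattern and multisig arrangement are invented:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

/// @title Hypothetical fee manager.
/// @custom:upgradeability Deployed behind a transparent proxy; the implementation can be
/// replaced by the admin multisig described on the "Admin Controls" documentation page.
contract FeeManager {
    /// @notice Protocol fee in basis points.
    uint256 public feeBps;
}
```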

21) The type of ownership is clearly indicated (OnlyOwner / MultiSig / Roles) (%)

Answer:
Guidance:
100% The type of ownership is clearly indicated in their documentation (OnlyOwner / MultiSig / etc.)
50% The type of ownership is indicated, but only in the code (OnlyOwner / MultiSig / etc.)
0% Admin control information could not be found

How to improve this score:

Create a document that covers the items described above. An example is enclosed.
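As a minimal, hypothetical sketch of the "OnlyOwner" pattern this question refers to (in practice the owner address is often a multisig; all names here are invented):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

/// @notice Hypothetical contract showing single-owner ("OnlyOwner") admin control.
contract OwnedParameters {
    address public owner;
    uint256 public interestRateBps;

    modifier onlyOwner() {
        require(msg.sender == owner, "not owner");
        _;
    }

    constructor() {
        owner = msg.sender; // often set to a multisig in production deployments
    }

    /// @notice Admin capability: update the interest rate coefficient.
    function setInterestRateBps(uint256 newRateBps) external onlyOwner {
        interestRateBps = newRateBps;
    }
}
```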

22) Are the capabilities for change in the contract described? (%)

Answer:
Guidance:
100% The capabilities for change in a contract are clearly described
70% The capabilities for change in a contract are described, but require some looking or are in multiple places
0% Admin control information could not be found

How to improve this score:

Create a document that covers the items described above. An example is enclosed.

23) Is the information in non-technical terms that pertain to the investments (%)

Answer:
Guidance:
100% Description relates to investment safety and updates, in clear, complete, non-software language
30% Description is all in software-specific language
0% No admin control information could be found

How to improve this score:

Create a document that covers the items described above in plain language that investors can understand. An example is enclosed.

24) Is there Pause Control documentation including records of tests (%)

Answer:
Guidance:
100% All the contracts are immutable or no pause control is needed and this is explained, OR pause control(s) are clearly documented and there are records of at least one test within 3 months
80% Pause control(s) explained clearly but no evidence of regular tests
40% Pause controls mentioned with no detail on capability or tests, OR a pause software function exists with no explanation
0% Pause control not documented or explained

How to improve this score:

Create a document that covers the items described above in plain language that investors can understand. An example is enclosed.
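For context, a minimal hypothetical sketch of the kind of pause control this question asks to be documented (many projects use OpenZeppelin's Pausable instead; names are invented):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

/// @notice Hypothetical pausable vault used only to illustrate a pause control.
contract PausableVault {
    address public guardian;
    bool public paused;
    mapping(address => uint256) public balances;

    modifier whenNotPaused() {
        require(!paused, "paused");
        _;
    }

    constructor() {
        guardian = msg.sender;
    }

    /// @notice Emergency stop: halts deposits until unpaused by the guardian.
    function setPaused(bool state) external {
        require(msg.sender == guardian, "not guardian");
        paused = state;
    }

    function deposit() external payable whenNotPaused {
        balances[msg.sender] += msg.value;
    }
}
```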

25) Is there Timelock function Documentation?

Answer:
Guidance:
100% The Timelock function is clearly explained, with the functions, conditions for usage and Timelock length described, OR no Timelock function is needed and this is explained
80% Timelock explained without mention of functions
40% Timelock mentioned with no detail on capability
0% Timelock not documented or explained

How to improve this score:

Create a document that covers the items described above in plain language that investors can understand. An example is enclosed.
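As a hypothetical sketch of a timelocked admin change, here with a 48-hour delay consistent with the length guidance in the next question (real deployments often use a dedicated timelock contract such as OpenZeppelin's TimelockController; names are invented):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

/// @notice Hypothetical 48-hour timelock on a single fee parameter.
contract TimelockedFee {
    uint256 public constant DELAY = 48 hours;

    address public admin;
    uint256 public feeBps;
    uint256 public pendingFeeBps;
    uint256 public eta; // earliest time the pending change can be executed

    constructor() {
        admin = msg.sender;
    }

    /// @notice Queue a fee change; it can only take effect after DELAY has passed.
    function queueFee(uint256 newFeeBps) external {
        require(msg.sender == admin, "not admin");
        pendingFeeBps = newFeeBps;
        eta = block.timestamp + DELAY;
    }

    /// @notice Execute a previously queued fee change once the delay has elapsed.
    function executeFee() external {
        require(eta != 0 && block.timestamp >= eta, "timelock not elapsed");
        feeBps = pendingFeeBps;
        eta = 0;
    }
}
```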

26) Is the TimeLock an adequate length?

Answer:
Guidance:
100% Timelock is between 48 hours and 6 days
80% Timelock is 7 days or more
40% Timelock is 24 hours or less
0% Timelock not documented or explained

How to improve this score:

Create a document that covers the items described above in plain language that investors can understand. An example is enclosed.

Oracles

27) Is the Oracle Data Source Mentioned? (Y/N)

Answer:
Guidance: Yes means that the oracle source is stated explicitly somewhere in their documentation (Chainlink, Uniswap). No means that there is no mention of the oracle source.
If the protocol does not use oracles and explicitly states it, then this question does not count towards overall score.

How to improve this score:

Create a document that covers the items described above in plain language that investors can understand. An example is enclosed.
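For context, this is roughly what a Chainlink-based price read looks like in a consuming contract. It is a hedged sketch, not any reviewed protocol's code: the interface is declared inline and the feed address is a constructor parameter rather than a real deployment.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

/// @dev Minimal subset of Chainlink's AggregatorV3Interface.
interface IAggregatorV3 {
    function latestRoundData()
        external
        view
        returns (uint80 roundId, int256 answer, uint256 startedAt, uint256 updatedAt, uint80 answeredInRound);
}

/// @notice Hypothetical consumer that documents its oracle source in code and docs.
contract PriceConsumer {
    /// @notice Chainlink price feed; the concrete feed address belongs in the protocol's documentation.
    IAggregatorV3 public immutable priceFeed;

    constructor(address feed) {
        priceFeed = IAggregatorV3(feed);
    }

    function latestPrice() external view returns (int256) {
        (, int256 answer, , , ) = priceFeed.latestRoundData();
        return answer;
    }
}
```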

28) Could Frontrunning be applied to the protocol, and if so, are those frontrunning risks mitigated? (Y/N)

Answer:
Guidance
Yes means that the protocol explicitly states their strategy for dealing with frontrunning. No means no information is provided about frontrunning. N/A means the protocol has no risk of frontrunning, and this is explicitly stated in the documentation. This means the question is null, and doesn't count towards the overall score.
Note: Some types of DeFi protocols do not have frontrunning risks. The most prominent type of protocol exposed to frontrunning is a DEX.

29) Can flashloan attacks be applied to the protocol, and if so, are those flashloan attack risks mitigated? (Y/N)

Answer:
Guidance
Yes means that the protocol explicitly states their strategy for dealing with flashloan attacks. No means that no information is provided about flashloan attacks. N/A means the protocol has no risk of flashloan attacks, and this is explained in the documentation. This means the question is null and doesn't count towards the overall score.

Appendices

Author Details

The author of this review is Rex of DeFiSafety.
Email : [email protected] Twitter : @defisafety
I started with Ethereum just before the DAO and that was a wonderful education. It showed the importance of code quality. The second Parity hack also showed the importance of good process. Here my aviation background offers some value. Aerospace knows how to make reliable code using quality processes.
I was coaxed to go to EthDenver 2018 and there I started SecurEth.org with Bryant and Roman. We created guidelines on good processes for blockchain code development. We received Ethereum Foundation funding to assist in their development.
Process Quality Reviews are an extension of the SecurEth guidelines that will further increase the quality processes in Solidity and Vyper development.
DeFiSafety is my full time gig and we are working on funding vehicles for a permanent staff.

Scoring Appendix

Executing Code Appendix

Code Used Appendix

Example Code Appendix


SLOC Appendix

Solidity Contracts

Language | Files | Lines | Blanks | Comments | Code | Complexity
Solidity | | | | | |
Comments to Code: ___ / ___ = ___%

Javascript Tests

Language | Files | Lines | Blanks | Comments | Code | Complexity
JavaScript | | | | | |
Tests to Code: ___ / ___ = ___%

Changelog

Code and team

Documentation

Changed question from "basic Software Functions" to "protocol Architecture"

Testing

Changed question 15 to add Yes/No guidance
Changed question 16 from a % guidance to a Yes/No guidance

Security

Changed Question 17 to include guidance for a bad audit

Access Controls

Turned question 20 "Is the information clear and complete (%)" into 3 distinct questions
Added Q20, Q21, Q22
Added Timelock questions Q25, Q26
New guidance added to Q24

Oracles

Added Oracles Section
Added Q27, Q28, Q29