Added to the Q1 section: each contract address must have its source code visible on Etherscan; bytecode only for a contract is equivalent to no address. This change was prompted by Wing Finance, which published its addresses but whose contracts were hidden and had no GitHub repository.
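The new Q1 rule can be checked mechanically. A minimal sketch, assuming Etherscan's `getsourcecode` contract API (an unverified, bytecode-only contract returns an empty `SourceCode` field); the helper names are our own, not part of the review tooling:

```python
import json
import urllib.request

ETHERSCAN = ("https://api.etherscan.io/api?module=contract"
             "&action=getsourcecode&address={addr}&apikey={key}")

def fetch_source_record(addr: str, api_key: str) -> dict:
    """Fetch the first source-code record for a contract address (network call)."""
    with urllib.request.urlopen(ETHERSCAN.format(addr=addr, key=api_key)) as resp:
        return json.load(resp)["result"][0]

def is_source_visible(record: dict) -> bool:
    """Per the Q1 rule: an empty SourceCode field means bytecode only,
    which the review treats the same as having no address at all."""
    return bool(record.get("SourceCode", "").strip())

# is_source_visible({"SourceCode": ""}) -> False  (bytecode only)
# is_source_visible({"SourceCode": "contract C { }"}) -> True
```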
Changed the guidance on Q18 (Bug Bounty) to:
60% — Bounty is 100k or over
50% — Bounty is 50k or over AND an active program
40% — Bounty is 50k or over
from:
50% — Bounty is 100k or over
40% — Bounty is 50k or over
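The revised tiers amount to a simple threshold lookup. A sketch (the function name and `active_program` flag are our own; the tier values are from the guidance above):

```python
def bug_bounty_score(max_bounty_usd: int, active_program: bool) -> float:
    """Return the Q18 score fraction under the revised guidance (sketch)."""
    if max_bounty_usd >= 100_000:
        return 0.60
    if max_bounty_usd >= 50_000 and active_program:
        return 0.50
    if max_bounty_usd >= 50_000:
        return 0.40
    return 0.0

# bug_bounty_score(150_000, False) -> 0.60
# bug_bounty_score(60_000, True)   -> 0.50
# bug_bounty_score(60_000, False)  -> 0.40
```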
Added to the guidance on Audits: 50% — audit(s) performed after deployment and changes are needed but not implemented.
1) Added Access Controls section 2) Added multi-chain review capability 3) Added bug bounty question
1) Added Access Controls section
2) Added multi-chain review capability
3) Removed the question "Packaged with the deployed code (Y/N)" from Testing, as it added little value (always YES)
4) Added numbers to the questions to make referencing easier
5) Added an Overview title at the start of the review
6) Changed the section name from "Audits" to "Security", as it now has both Audits and Bug Bounty questions
7) Weight changes (see below)
Added "For our purposes, a pass is 70%." after the Scoring section at the start.
Added % guidance for Report of the Results in the Testing section
Improved “Summary of the Process”
Improved Disclaimer after legal review
Improved “Sections of the Review Process” to match the revised report
Changed the Scoring rubric to match all 0.6 changes
Added “Private Software Repos” section
Improved the wording of the "Are the executing code addresses readily available? (Y/N)" question to emphasize its importance and the impact of not having the addresses public.
Removed the question "Are the Contracts Verified?". This question asked if the contracts are "verified" on Etherscan. Frankly, it was true on every review we did. For this reason, I am removing the question, as it adds no value.
Changed the question "Does the code match a tagged version on a code hosting platform?" to "Is There a Public Software Repository?". Finding and matching all the contracts was time-consuming and added little value.
Changed the question "Is the Software Repository Healthy?" to "Is There Development History Visible?". This is effectively the same question asked in a different way. Asked this way, a software repository is not mandatory, merely convenient.
Added the question “Is the team public (not anonymous)? (Y/N)” as this is an element of trust we felt needed to be added.
Improved the wording of the "Is there a white paper?" question to also allow Medium articles.
Changed the wording of the question from "Do the requirements fully (100%) cover the deployed contracts?" to "Does the software function documentation fully (100%) cover the deployed contracts?" because many people do not really understand the word "requirements". Also added guidance to this question.
Added guidance based on the Comments to Code (CtC) ratio to the question "Are there sufficiently detailed comments for all functions within the deployed contract code?"
Changed the wording of the question from "Is it possible to trace software requirements to the implementation in code?" to "Is it possible to trace from software documentation to the implementation in code?" because many people do not really understand the word "requirements".
Added guidance based on the Test to Code (TtC) ratio to the "Full test suite" question.
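Both ratios can be computed mechanically from the source tree. A rough sketch (the line classification is naive and Solidity-specific, and the helpers are hypothetical, not part of the review tooling):

```python
def comment_and_code_lines(solidity_source: str) -> tuple[int, int]:
    """Naively count (comment lines, code lines) in Solidity source.
    Block comments are tracked line by line; blank lines are ignored."""
    comments = code = 0
    in_block = False
    for raw in solidity_source.splitlines():
        line = raw.strip()
        if not line:
            continue
        if in_block:
            comments += 1
            if "*/" in line:
                in_block = False
        elif line.startswith("//"):
            comments += 1
        elif line.startswith("/*"):
            comments += 1
            in_block = "*/" not in line
        else:
            code += 1
    return comments, code

def ctc_ratio(source: str) -> float:
    """Comments to Code ratio for one source file."""
    comments, code = comment_and_code_lines(source)
    return comments / code if code else 0.0

def ttc_ratio(test_loc: int, code_loc: int) -> float:
    """Test to Code ratio: lines of test code per line of deployed code."""
    return test_loc / code_loc if code_loc else 0.0

sample = "// set x\ncontract C {\n    /* dev\n       note */\n    uint256 x;\n}"
# comment_and_code_lines(sample) -> (3, 3), so ctc_ratio(sample) -> 1.0
```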
Added wording covering cases where audits do not cover economic issues, and on the specific impact of audits on private repos.
Added % Guidance to "Does the code match a tagged version on a code hosting platform (GitHub, GitLab, etc.)?"
Added % Guidelines to "Code coverage (Covers all the deployed lines of code, or explains misses) (%)" in Testing
Added % Guidelines to "Is it possible to trace requirements to the implementation in code (%)"
General rebalancing of the scoring weights
Added Summary of the Process section
Changed Deployed code to "Executing Code Verification"
Changed Requirements to Documentation
Changed the scoring weight for "Are the deployed code address(es) readily available?" from 10 to 30 because it is fundamentally important.
Changed the scoring weight for "Does the code match a tagged version on a code hosting platform (GitHub, GitLab, etc.)?" from 10 to 20.
Changed the scoring weight for "Is the development software repository healthy?" from 20 to 10.
Changed the heading of Requirements to Documentation for better clarity for the reader.
Deleted the "Are the requirements available publicly?" question as it added little value.
Changed the scoring weight for "Are there sufficiently detailed comments for all functions within the deployed contract code?" from 5 to 10 because it is important.
Changed the scoring weights for "Code coverage", "Scripts and instructions to run the tests", and "Packaged with the deployed code" from 10 to 5 for balancing.
Changed the weight of Audit from 50 to 70 for balancing
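The rebalanced weights feed a straightforward weighted percentage (the review's stated pass mark is 70%). A sketch with a hypothetical subset of questions; the weight values mirror the changes above, but the question keys and helper are our own:

```python
def review_score(results: dict[str, float], weights: dict[str, int]) -> float:
    """Weighted percentage across review questions; results are 0.0-1.0 fractions."""
    total = sum(weights.values())
    earned = sum(w * results.get(q, 0.0) for q, w in weights.items())
    return 100.0 * earned / total

# Hypothetical subset of questions, using the weights changed above.
weights = {"addresses_readily_available": 30,
           "tagged_version_match": 20,
           "repo_healthy": 10}
results = {"addresses_readily_available": 1.0,
           "tagged_version_match": 0.5,
           "repo_healthy": 1.0}
score = review_score(results, weights)  # (30 + 10 + 10) / 60 -> 83.3%
passed = score >= 70.0
```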
Changed "Is the development software repository healthy?" in "Deployed Code" from Y/N to %. The AAVE code was developed in a private repository that the auditor cannot view; the public repository was created just to display the final code, which makes the public repository appear unhealthy. But as they have a valid reason, and the auditor is comfortable that a valid repository exists but cannot be seen, we needed something better than a binary Y/N. So we changed it to a percentage and changed the explanation.
This is the initial version used for the first three Audits