The vulnerability is due to precision loss in Solidity’s integer division, which truncates decimal remainders. This leads to inaccurate averages and variances in functions like avg and variance. As a result, the protocol’s assessments based on these values may accept or reject validation scores incorrectly.
Solidity does not keep track of decimals or remainders when dividing: integer division truncates toward zero.
99 / 10 gives 9,
9 / 10 gives 0.
Standard practice when dividing in Solidity is to multiply the dividend by a large power of 10 (usually 1e18) before dividing, in order to make the precision loss negligible.
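A minimal sketch of both behaviors (Python's `//` operator mimics Solidity's truncating division for unsigned values; `SCALE` is the conventional 1e18 fixed-point factor, not a name from the codebase):

```python
SCALE = 10**18  # common Solidity fixed-point scaling factor (often called WAD)

# Truncating integer division, as in Solidity: the remainder is discarded.
assert 99 // 10 == 9   # remainder 9 is lost
assert 9 // 10 == 0    # the entire value is lost

# Scaling the dividend by 1e18 before dividing preserves 18 decimals:
scaled = 9 * SCALE // 10   # 0.9 in 1e18 fixed-point
assert scaled == 9 * 10**17
```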
Take, for example, the Statistics#avg function.
Let's say the scores for a generation are:
[1,2,1,2,3]
avg will return 1 (9 / 5 truncates to 1), which deviates greatly from the correct value of 1.8.
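The arithmetic above can be checked directly (a sketch in Python integer arithmetic, not the contract's actual implementation):

```python
scores = [1, 2, 1, 2, 3]

# Truncating average, as an unscaled Statistics#avg would compute it:
avg_truncated = sum(scores) // len(scores)
assert avg_truncated == 1          # 9 // 5, far from the true 1.8

# Scaled average, keeping 18 decimals of precision:
SCALE = 10**18
avg_scaled = sum(scores) * SCALE // len(scores)
assert avg_scaled == 18 * 10**17   # i.e. 1.8 in fixed-point
```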
Given a set of validation scores, the protocol tries to pick scores that do not deviate greatly from the mean.
An imprecise mean would cause acceptance of scores that should be rejected, and rejection of scores that should be accepted.
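To make the impact concrete, here is a sketch with a hypothetical acceptance bound of 1.5 (the real protocol derives its bound from the variance; the threshold here is for illustration only):

```python
scores = [1, 2, 1, 2, 3]
BOUND = 1.5  # hypothetical acceptance bound, for illustration only

true_mean = sum(scores) / len(scores)     # 1.8
trunc_mean = sum(scores) // len(scores)   # 1, after truncation

# The score 3 should be accepted: |3 - 1.8| = 1.2, within the bound...
assert abs(3 - true_mean) <= BOUND
# ...but against the truncated mean it is wrongly rejected: |3 - 1| = 2.
assert abs(3 - trunc_mean) > BOUND
```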
Instances:
- Statistics#avg
- Statistics#variance
- LLMOracleCoordinator#finalizeValidation
Precision loss leads to inaccurate mean and variance values, allowing incorrect scores to be accepted or correct scores to be rejected.
Manual Review
Standard practice is to multiply by 1e18 before dividing so the precision loss becomes negligible.
Solady and OpenZeppelin both provide fixed-point math libraries for this kind of operation (e.g. Solady's FixedPointMathLib and OpenZeppelin's Math.mulDiv).
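A sketch of the recommended fix in Python integer arithmetic (the function names `avg_scaled`/`variance_scaled` are illustrative, not from the codebase; a Solidity implementation would additionally need to guard against overflow when squaring, e.g. via mulDiv):

```python
SCALE = 10**18

def avg_scaled(data):
    # Multiply before dividing so the quotient keeps 18 decimals.
    return sum(data) * SCALE // len(data)

def variance_scaled(data):
    mean = avg_scaled(data)
    # Deviations are taken in the scaled domain; squaring doubles the
    # scale, so divide the result back down by SCALE once at the end.
    total = sum((x * SCALE - mean) ** 2 for x in data)
    return total // len(data) // SCALE

scores = [1, 2, 1, 2, 3]
assert avg_scaled(scores) == 18 * 10**17        # 1.8 in fixed-point
assert variance_scaled(scores) == 56 * 10**16   # 0.56 in fixed-point
```

With scaling, both the mean (1.8) and the variance (0.56) of the earlier example are represented exactly, instead of truncating to 1.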