The getPeriodReward() function doesn't work as expected. getPeriodReward() calculates the reward for a period using a linear distribution with an interval decrease: initialAmount_ should decrease by a constant decreaseAmount_ after each interval_.
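For clarity, here is a minimal sketch of one direct reading of that schedule (my own illustration, not the library's actual code; the library/function names are mine, and partial intervals are ignored for simplicity): the k-th full interval after payoutStart_ (k = 0, 1, 2, ...) pays initialAmount_ - k * decreaseAmount_.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

/// Reference sketch (NOT the library's code) of the documented schedule:
/// the k-th full interval after payoutStart_ pays
/// initialAmount_ - k * decreaseAmount_, until the payout reaches zero.
library ReferenceSchedule {
    function periodReward(
        uint256 initialAmount_,
        uint256 decreaseAmount_,
        uint128 payoutStart_,
        uint128 interval_,
        uint128 startTime_,
        uint128 endTime_
    ) internal pure returns (uint256 reward_) {
        if (startTime_ < payoutStart_ || endTime_ <= startTime_) return 0;

        // Indices of the first and last full intervals covered by the period.
        uint256 first_ = (startTime_ - payoutStart_) / interval_;
        uint256 last_ = (endTime_ - payoutStart_) / interval_;

        for (uint256 k = first_; k < last_; ++k) {
            uint256 decrease_ = k * decreaseAmount_;
            if (decrease_ >= initialAmount_) break; // schedule exhausted
            reward_ += initialAmount_ - decrease_;
        }
    }
}
```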
Let's suppose the following inputs (all time values in seconds):
| initialAmount_ | decreaseAmount_ | payoutStart_ | interval_ | startTime_ | endTime_ |
| --- | --- | --- | --- | --- | --- |
| 100 | 1 | 86400 | 86400 | 86400 | dynamic |
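A minimal harness to reproduce this in Remix (the import path and the function signature are assumed from the parameter names above; adjust them to the repository's actual interface):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Path and signature assumed; adjust to the repository's actual layout.
import {LinearDistributionIntervalDecrease} from "./LinearDistributionIntervalDecrease.sol";

/// Every parameter except endTime_ is fixed to the values in the table above.
contract RewardProbe {
    function probe(uint128 endTime_) external view returns (uint256) {
        return LinearDistributionIntervalDecrease.getPeriodReward(
            100,    // initialAmount_
            1,      // decreaseAmount_
            86400,  // payoutStart_
            86400,  // interval_
            86400,  // startTime_
            endTime_
        );
    }
}
```

Deploying RewardProbe and calling probe() with each endTime_ should reproduce the outputs below.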
In Remix, we get:

| endTime_ | getPeriodReward | endTime_ - startTime_ |
| --- | --- | --- |
| 86400 (1 day) | 0 | 0 |
| 172800 (2 days) | 100 | 86400 (1 day) |
| 259200 (3 days) | 199 | 172800 (2 days) |
| 345600 (4 days) | 297 | 259200 (3 days) |
| 432000 (5 days) | 394 | 345600 (4 days) |
| 864000 (10 days) | 864 | 777600 (9 days) |
From the table above, we can see that the reward is decreased by an incorrect decreaseAmount_ as the length of endTime_ - startTime_ grows. For example, after 3 days (endTime_ - startTime_ = 259200), the function returns 297 instead of the expected 298, and the deviation continues for longer periods.
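For reference (my own analysis of the figures, inferred from the example above): the observed outputs match the cumulative formula n * initialAmount_ - decreaseAmount_ * n * (n - 1) / 2 over n full intervals (e.g., 3 * 100 - 3 = 297 for n = 3), whereas the expected 298 corresponds to applying the decrease once per elapsed interval, i.e., n * initialAmount_ - (n - 1) * decreaseAmount_ = 300 - 2 = 298.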
Impact: Incorrect calculation of period rewards.

Tools used: Manual analysis, Remix.
Recommendation: I couldn't pinpoint the exact piece of code causing this issue, so my recommendation is to completely rework the reward calculation in the LinearDistributionIntervalDecrease library.