The QuantAMMMathGuard._clampWeights function is supposed to keep every entry of _weights between the absoluteWeightGuardRail value (absoluteMin below) and the absoluteMax value. The problem is that sumOtherWeights does not include the weights that were not adjusted. As a result, proportionalRemainder can exceed ONE, which pushes some entries of _weights above absoluteMax and makes the sum of the weights exceed ONE. The _normalizeWeightUpdates function cannot compensate for this, since it only catches small rounding errors.
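To make the issue concrete, here is a minimal Solidity sketch of the clamping pattern the finding describes. It is not the audited implementation: the constant ONE, the free function clampWeightsSketch and the plain 1e18 fixed-point arithmetic are illustrative assumptions.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.24;

int256 constant ONE = 1e18; // 18-decimal fixed point, as assumed for this sketch

/// @notice Illustrative sketch of the clamping pattern described in the finding.
/// @dev Not the audited implementation; it only reproduces the accumulation issue.
function clampWeightsSketch(
    int256[] memory _weights,
    int256 _absoluteWeightGuardRail
) pure returns (int256[] memory) {
    int256 absoluteMin = _absoluteWeightGuardRail;
    int256 absoluteMax = ONE - int256(_weights.length - 1) * absoluteMin;

    int256 sumRemainerWeight = ONE; // weight mass left for the non-min-clamped entries
    int256 sumOtherWeights;         // BUG: only entries clamped to absoluteMax are counted

    for (uint256 i; i < _weights.length; ++i) {
        if (_weights[i] < absoluteMin) {
            _weights[i] = absoluteMin;
            sumRemainerWeight -= absoluteMin;
        } else if (_weights[i] > absoluteMax) {
            _weights[i] = absoluteMax;
            sumOtherWeights += absoluteMax;
        }
        // Weights already inside [absoluteMin, absoluteMax] are never added to sumOtherWeights.
    }

    if (sumOtherWeights != 0) {
        // The remainder is spread over every non-min weight, but the denominator only
        // covers the max-clamped ones, so proportionalRemainder can exceed ONE.
        int256 proportionalRemainder = (sumRemainerWeight * ONE) / sumOtherWeights;
        for (uint256 i; i < _weights.length; ++i) {
            if (_weights[i] != absoluteMin) {
                _weights[i] = (_weights[i] * proportionalRemainder) / ONE;
            }
        }
    }
    return _weights;
}
```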
Suppose there are three tokens with _weights[0] < absoluteMin, _weights[1] > absoluteMax, and absoluteMin < _weights[2] < absoluteMax.
With the current implementation this yields:
absoluteMax = ONE - 2 * absoluteMin
proportionalRemainder = (ONE - absoluteMin) / (ONE - 2 * absoluteMin)
Hence proportionalRemainder > ONE, so _weights[1] = absoluteMax * proportionalRemainder > absoluteMax.
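For concreteness, take ONE = 1e18, absoluteMin = 0.2e18 and starting weights of roughly 0.1e18, 0.65e18 and 0.25e18 (illustrative numbers, and assuming, as in the sketch above, that the rescale is applied to every weight not clamped to absoluteMin):

```
absoluteMax           = 1e18 - 2 * 0.2e18             = 0.6e18
proportionalRemainder = (1e18 - 0.2e18) / 0.6e18      ≈ 1.3333       (> ONE)
_weights[1]           = 0.6e18  * 1.3333              ≈ 0.8e18       (> absoluteMax)
_weights[2]           = 0.25e18 * 1.3333              ≈ 0.3333e18
sum of weights        = 0.2e18 + 0.8e18 + 0.3333e18   ≈ 1.3333e18    (> ONE)
```

A deviation of this size is far beyond the small rounding errors that _normalizeWeightUpdates is built to correct.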
Unexpected behavior, incorrect rebalancing, and broken invariants (individual weights above absoluteMax, a weight sum above ONE), with potential asset losses.
Manual Review
Consider including the weights that were not adjusted in the sumOtherWeights variable, as illustrated in the sketch below:
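The following is a minimal sketch of that change, again against the illustrative clampWeightsSketch above rather than the audited source. The only difference is the added else branch, so sumOtherWeights covers every weight the remainder is later applied to.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.24;

int256 constant ONE = 1e18; // 18-decimal fixed point, as assumed for this sketch

/// @notice Fixed variant of the illustrative sketch: unadjusted weights are counted too.
function clampWeightsFixedSketch(
    int256[] memory _weights,
    int256 _absoluteWeightGuardRail
) pure returns (int256[] memory) {
    int256 absoluteMin = _absoluteWeightGuardRail;
    int256 absoluteMax = ONE - int256(_weights.length - 1) * absoluteMin;

    int256 sumRemainerWeight = ONE;
    int256 sumOtherWeights;

    for (uint256 i; i < _weights.length; ++i) {
        if (_weights[i] < absoluteMin) {
            _weights[i] = absoluteMin;
            sumRemainerWeight -= absoluteMin;
        } else if (_weights[i] > absoluteMax) {
            _weights[i] = absoluteMax;
            sumOtherWeights += absoluteMax;
        } else {
            // Change: count the weights that stay inside the guard rails as well, so
            // sumOtherWeights equals the total weight the remainder is spread over.
            sumOtherWeights += _weights[i];
        }
    }

    if (sumOtherWeights != 0) {
        // The rescaled (non-min) weights now sum to sumRemainerWeight, so the pool total
        // returns to ONE up to rounding (weights exactly equal to absoluteMin are a corner
        // case this sketch does not handle specially).
        int256 proportionalRemainder = (sumRemainerWeight * ONE) / sumOtherWeights;
        for (uint256 i; i < _weights.length; ++i) {
            if (_weights[i] != absoluteMin) {
                _weights[i] = (_weights[i] * proportionalRemainder) / ONE;
            }
        }
    }
    return _weights;
}
```

With this change the three-token example above gives proportionalRemainder = (ONE - absoluteMin) / (absoluteMax + _weights[2]), which keeps _weights[1] at or below absoluteMax and brings the weight sum back to ONE, leaving only rounding errors for _normalizeWeightUpdates to absorb.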
Likelihood: Medium/High — occurs whenever a weight goes above absoluteMax. Impact: Low/Medium — the weights deviate much faster than intended, and so does their sum.