QuantAMM

QuantAMM
49,600 OP
View results
Submission Details
Severity: medium
Invalid

Precision Loss in QuantAMM's Pack/Unpack Operations

Description

The QuantAMM protocol's packing and unpacking mechanism in QuantAMMStorage.sol exhibits systematic precision loss when handling decimal values. The issue occurs during the pack/unpack cycle and affects both positive and negative numbers, leading to value truncation and potential loss of funds in extreme cases.

The protocol's fixed-point arithmetic implementation in these operations:

  1. Truncates values below certain precision thresholds

  2. Loses precision in least significant digits

  3. Completely drops sub-unit values in certain scenarios

This is particularly concerning because:

  • The loss is systematic and affects all operations

  • It compounds over multiple operations

  • It disproportionately affects smaller values

  • It can lead to asymmetric rounding in negative values

Impact

The precision loss has several implications for the protocol:

  1. Value Misrepresentation

    • Original values are not accurately preserved after pack/unpack cycles

    • Example: 13.421772700000000 becomes 13.421772000000000 (loss of 0.000000700000000)

  2. Asymmetric Treatment of Negative Values

    • Negative values lose precision differently than positive ones

    • Example: 4.200000000 becomes 4.000000000 (loss of 0.200000000)

  3. Complete Loss of Sub-unit Values

    • Values below certain thresholds are completely lost

    • Example: 0.999999999 becomes 0.000000000 (complete loss)

  4. Cumulative Effects

    • In token operations: 1000.123456789000000 tokens become 1000.123456000000000

    • Loss compounds over multiple operations

    • Affects pool share calculations and user balances

Proof of Concept

// SPDX-License-Identifier: MIT
pragma solidity ^0.8.26;
import "forge-std/Test.sol";
import "forge-std/console2.sol";
import "../../contracts/mock/MockQuantAMMStorage.sol";
contract QuantAMMPrecisionLossTest is Test {
MockQuantAMMStorage internal mockQuantAMMStorage;
function setUp() public {
mockQuantAMMStorage = new MockQuantAMMStorage();
}
function testPrecisionLoss() public {
// Create test array with carefully chosen values
int256[] memory sourceArray = new int256[]();
// Test Case 1: Value with significant decimal precision
sourceArray[0] = 13421772700000000; // 13.421772700000000
// Test Case 2: Negative value with precision
sourceArray[1] = -4200000000; // -4.200000000
// Test Case 3: Value near maximum precision
sourceArray[2] = 999999999; // 0.999999999
// Pack and unpack the values
int256[] memory result = mockQuantAMMStorage.ExternalEncodeDecode32Array(
sourceArray,
3
);
// Verify and display precision loss
console2.log("Case 1 - Original:", uint(sourceArray[0]));
console2.log("Case 1 - Result:", uint(result[0]));
console2.log("Case 1 - Lost:", uint(sourceArray[0] - result[0]));
assertEq(
sourceArray[0] - result[0],
700000000,
"Precision loss in case 1 incorrect"
);
assertEq(
sourceArray[1] - result[1],
-200000000,
"Precision loss in case 2 incorrect"
);
assertEq(
sourceArray[2] - result[2],
999999999,
"Precision loss in case 3 incorrect"
);
}
}

Running this test demonstrates the precision loss:

Test Case 1 - Original Value: 13421772700000000
Test Case 1 - After Pack/Unpack: 13421772000000000
Test Case 1 - Precision Lost: 700000000
Test Case 2 - Original Value: -4200000000
Test Case 2 - After Pack/Unpack: -4000000000
Test Case 2 - Precision Lost: -200000000
Test Case 3 - Original Value: 999999999
Test Case 3 - After Pack/Unpack: 0
Test Case 3 - Precision Lost: 999999999

Mitigation

Consider implementing one of the following solutions:

  1. Increase the precision of the internal representation

  2. Add rounding mechanisms instead of truncation

  3. Implement a compensation mechanism for lost precision

  4. Store additional precision bits in unused bit spaces

Updates

Lead Judging Commences

n0kto Lead Judge 7 months ago
Submission Judgement Published
Invalidated
Reason: Design choice

Support

FAQs

Can't find an answer? Chat with us on Discord, Twitter or Linkedin.