QuantAMM
Submission Details
Severity: high
Valid

Locked Protocol Fees Due to Incorrect Fee Collection Method

Summary

In the UpliftOnlyExample contract, protocol fees collected during withdrawals may become permanently locked in the contract because maxAmountsIn is used instead of exact input amounts when minting BPT for the admin. This leads to a loss of yield for the protocol, as portions of the collected fees become inaccessible.

Vulnerability Details

The issue occurs in the fee collection logic where maxAmountsIn is used with localData.accruedQuantAMMFees:

maxAmountsIn: localData.accruedQuantAMMFees
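For context, below is a minimal sketch of the surrounding call as I understand it, using the Balancer V3 AddLiquidityParams layout; localData field names other than accruedQuantAMMFees are illustrative, not verbatim from the contract:

// Sketch: minting the admin's fee BPT inside the withdrawal flow.
_vault.addLiquidity(
    AddLiquidityParams({
        pool: localData.pool,
        to: localData.adminAddress,                  // illustrative field name
        maxAmountsIn: localData.accruedQuantAMMFees, // only a cap, not an exact input
        minBptAmountOut: localData.adminFeeBpt,      // illustrative: exact BPT out drives the amounts in
        kind: AddLiquidityKind.PROPORTIONAL,         // exact-out kind
        userData: bytes("")
    })
);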

When BPT is minted for the admin during withdrawals, the full accrued fee amount is supplied as a maximum input amount rather than as an exact input amount. Because the operation uses AddLiquidityKind.PROPORTIONAL, which works with an exact BPT output, the actual token amounts pulled in are derived from that BPT amount; maxAmountsIn is only a slippage cap. As a result, an amount up to accruedQuantAMMFees will be used, but not necessarily the full amount. The net effect is:

  1. The full fee amount is deducted from users during withdrawals

  2. Only a portion of those fees may actually be transferred to the protocol

  3. The unused portion remains locked in the contract with no mechanism to recover it

This creates a discrepancy: the user's withdrawal is reduced by accruedQuantAMMFees, but the protocol mints BPT whose amounts in are <= accruedQuantAMMFees. Whenever the amount in is less than accruedQuantAMMFees, the difference remains in the contract as locked fees that cannot be accessed.
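As a hypothetical illustration (numbers made up): suppose 100 wei of DAI is deducted from a withdrawing user as accruedQuantAMMFees, but the proportional math for the exact BPT output only requires 97 wei. The vault pulls 97 wei to back the admin's BPT, and the remaining 3 wei stay in the contract with no function available to sweep them.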

Additionally:

There is another issue with the current logic. Because the fix for both is the same, I am including both in this report.

Due to the way maxAmountsIn is calculated, there are cases where it will be less than amountInRaw. When this happens, the call reverts with the AmountInAboveMax error. This is detrimental to the protocol because users will be unable to remove liquidity whenever the amount they are removing is affected by the rounding that takes place in this flow. Their liquidity becomes locked, not because of anything wrong with their position, but because the admin fee calculation produces slippage parameters that are impossible to satisfy.
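A hypothetical example of how the rounding collides (the formulas below are illustrative, not copied from the contract): the per-token fee share is computed with integer division rounding down, while the vault's proportional math rounds the required amounts in up.

// Hypothetical numbers; only the rounding directions matter.
//
// Per-token admin fee share, rounded DOWN:
//     maxAmountsIn[i] = feeTokens[i] * adminFeeBps / 10_000
//                     = 1 * 5 / 10_000 = 0
//
// Vault's proportional math for the exact BPT out, rounded UP:
//     amountInRaw[i]  = 1
//
// amountInRaw[i] > maxAmountsIn[i]  =>  revert AmountInAboveMax(token, 1, 0)

This is exactly the shape of the revert produced by the PoC below.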

PoC

In this PoC you will see that removing liquidity sometimes reverts when an admin fee exists. It appears this was not caught because admin fees were set to 0 in the test suite; once the fee is set to even 5 basis points, the issue presents itself.

By running this test in UpliftExample.t.sol you will see a revert with the message:

[FAIL: AmountInAboveMax(0x2e234DAe75C793f67A35089C9d99245E1C58470b, 1, 0)]

It is important to note that I ran this test with some of the fixes from my other issues applied, and the same revert exists, just with more sensible numbers.

function testFeeCalculationCausesRevert() public {
    vm.startPrank(address(vaultAdmin));
    updateWeightRunner.setQuantAMMSwapFeeTake(5); // set admin fee to 5 basis points (same as min withdrawal fee)
    vm.stopPrank();

    // Add liquidity so bob has BPT to remove.
    uint256[] memory maxAmountsIn = [dai.balanceOf(bob), usdc.balanceOf(bob)].toMemoryArray();
    vm.prank(bob);
    upliftOnlyRouter.addLiquidityProportional(pool, maxAmountsIn, bptAmount, false, bytes(""));

    assertEq(upliftOnlyRouter.getUserPoolFeeData(pool, bob).length, 1, "deposit record should exist");
    assertEq(upliftOnlyRouter.getUserPoolFeeData(pool, bob)[0].amount, bptAmount, "recorded amount should equal bptAmount");
    assertEq(
        upliftOnlyRouter.getUserPoolFeeData(pool, bob)[0].blockTimestampDeposit,
        block.timestamp,
        "deposit timestamp should match"
    );
    assertEq(
        upliftOnlyRouter.getUserPoolFeeData(pool, bob)[0].lpTokenDepositValue,
        500000000000000000,
        "should match sum(amount * price)"
    );
    assertEq(upliftOnlyRouter.getUserPoolFeeData(pool, bob)[0].upliftFeeBps, 200, "fee");

    int256[] memory prices = new int256[](tokens.length);
    for (uint256 i = 0; i < tokens.length; ++i) {
        prices[i] = int256(i) * 1.5e18; // make the price 1.5 times higher
    }
    updateWeightRunner.setMockPrices(pool, prices);

    uint256[] memory minAmountsOut = [uint256(0), uint256(0)].toMemoryArray();

    // Removing a fraction of the position triggers the rounding mismatch and reverts.
    vm.startPrank(bob);
    upliftOnlyRouter.removeLiquidityProportional(bptAmount / 3, minAmountsOut, false, pool);
    vm.stopPrank();
}
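The test can be run with Foundry in the usual way, for example:

forge test --match-test testFeeCalculationCausesRevert -vvvv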

Impact

These are two separate issues, both of high severity; however, because both can be resolved with the single fix described below, I am reporting them as one high severity issue.
Loss of yield and locked funds - protocol fees that are collected but not fully transferred become permanently locked in the contract, depriving the protocol of revenue it should have received. Locked user funds - withdrawals that hit the impossible slippage check revert, leaving users unable to remove their liquidity.

Tools Used

Manual Review

Recommendations

Modify the logic to use exact input amounts instead of exact output amounts, so that all collected fees are transferred to the protocol:
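A minimal sketch of the fix, assuming the Balancer V3 vault API where AddLiquidityKind.UNBALANCED treats maxAmountsIn as exact amounts in (field names other than accruedQuantAMMFees are illustrative):

// Sketch: pull the exact accrued fee amounts and mint whatever BPT they are worth.
_vault.addLiquidity(
    AddLiquidityParams({
        pool: localData.pool,
        to: localData.adminAddress,                  // illustrative field name
        maxAmountsIn: localData.accruedQuantAMMFees, // interpreted as exact amounts in for UNBALANCED
        minBptAmountOut: 0,                          // accept whatever BPT the exact inputs mint
        kind: AddLiquidityKind.UNBALANCED,           // exact-in kind, replacing PROPORTIONAL
        userData: bytes("")
    })
);

Because there is no longer an exact BPT output to solve for, there is no rounding gap between what users are charged and what the vault pulls in, and maxAmountsIn can never be exceeded.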

This ensures that:

  1. The exact fee amount collected from users is transferred to the protocol

  2. No fees remain trapped in the contract

  3. The protocol receives all revenue it is entitled to

  4. Positions won't become stuck due to impossible-to-satisfy slippage calculations.

Updates

Lead Judging Commences

n0kto Lead Judge 7 months ago
Submission Judgement Published
Validated
Assigned finding tags:

finding_onAfterRemoveLiquidity_vault.addLiquidity_rounding_precision_DoS

Likelihood: High, multiple rounding down and little value can trigger the bug.
Impact: High, DoS of the removal.

finding_onAfterRemoveLiquidity_maxAmountsIn_accruedQuantAMMFees_can_lock_funds

Likelihood: Medium/High, every time maxAmountsIn is bigger than needed for the exact amount of BPT.
Impact: High, fees stuck in the contract.

Appeal created

n0kto Lead Judge 7 months ago
Submission Judgement Published
Validated
Assigned finding tags:

finding_onAfterRemoveLiquidity_vault.addLiquidity_rounding_precision_DoS

Likelihood: High, multiple rounding down and little value can trigger the bug.
Impact: High, DoS of the removal.
