HardhatDeFi
15,000 USDC
Submission Details
Severity: low
Invalid

Unbounded Loop in batchRegisterCollateralToken Leading to Denial of Service (DoS)

Summary

A vulnerability has been identified in the batchRegisterCollateralToken function: there is no limit on the size of the token array that can be registered in a single transaction. Because the loop's gas cost grows with the array length, sufficiently large batches can hit the block gas limit, creating a Denial of Service (DoS) condition for the function.

Vulnerability Details

Description

The function batchRegisterCollateralToken accepts an unbounded array of token addresses:

function batchRegisterCollateralToken(
    address[] calldata _collateralTokens
) external override onlyOwner nonReentrant returns (address[] memory) {
    uint256 _length = _collateralTokens.length;
    address[] memory _wTokens = new address[](_length);
    // No maximum length check before iterating
    for (uint256 i = 0; i < _length; i++) {
        _wTokens[i] = _registerCollateralToken(_collateralTokens[i]);
    }
    return _wTokens;
}

Root Cause

The function lacks a maximum limit check on the _collateralTokens array length. Each token registration operation (_registerCollateralToken) consumes gas, and the cumulative gas consumption increases linearly with the array size.
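The linear growth can be sketched with a back-of-envelope gas model. All figures below are illustrative assumptions, not measured values: a ~21k base transaction cost plus an assumed ~200k gas per _registerCollateralToken call (wToken deployment/configuration):

```javascript
// Illustrative model of why the loop eventually hits the block gas limit.
// BASE_GAS and GAS_PER_TOKEN are assumptions, not measured figures.
const BASE_GAS = 21_000;
const GAS_PER_TOKEN = 200_000; // assumed cost of one _registerCollateralToken
const BLOCK_GAS_LIMIT = 30_000_000; // Ethereum mainnet block gas limit

// Linear cost: totalGas(n) = BASE_GAS + n * GAS_PER_TOKEN
function estimatedGas(n) {
  return BASE_GAS + n * GAS_PER_TOKEN;
}

// Largest batch that still fits in one block under this model
function maxBatchSize() {
  return Math.floor((BLOCK_GAS_LIMIT - BASE_GAS) / GAS_PER_TOKEN);
}
```

Under these assumptions, a batch of roughly 150 tokens already exceeds a full block, and the 1000-element array from the proof of concept below is far beyond it.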

Proof of Concept

// Attack scenario: build a batch large enough to exceed the block gas limit
function testGasLimitExploit() public {
    address[] memory tokens = new address[](1000); // large array
    // Fill array with distinct, valid-looking addresses
    for (uint i = 0; i < 1000; i++) {
        tokens[i] = address(uint160(i + 1));
    }
    // This call can exceed the block gas limit and revert
    batchRegisterCollateralToken(tokens);
}

Impact

  1. Transactions with large arrays will revert due to exceeding block gas limit

  2. Function becomes unusable for legitimate batch operations

  3. Owner cannot register tokens in large batches

  4. Wastes gas on failed transactions

Tools Used

  • Manual code review

Recommendations

Short-term Fix

Implement a maximum array length check:

// Contract-level constant (a constant cannot be declared inside a function)
uint256 private constant MAX_BATCH_SIZE = 100;

function batchRegisterCollateralToken(
    address[] calldata _collateralTokens
) external override onlyOwner nonReentrant returns (address[] memory) {
    uint256 _length = _collateralTokens.length;
    // Enforce the maximum batch size
    require(_length <= MAX_BATCH_SIZE, "Batch size exceeds limit");
    address[] memory _wTokens = new address[](_length);
    for (uint256 i = 0; i < _length; i++) {
        _wTokens[i] = _registerCollateralToken(_collateralTokens[i]);
    }
    return _wTokens;
}

Long-term Considerations

  1. Make MAX_BATCH_SIZE configurable by governance

  2. Implement batch processing with pagination

  3. Add gas consumption estimation before execution
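The pagination idea in item 2 can be sketched caller-side: split a large token list into batches no larger than the on-chain cap, then submit one transaction per batch. MAX_BATCH_SIZE here mirrors the constant suggested in the short-term fix; the contract instance name is a placeholder.

```javascript
// Caller-side pagination sketch (assumes the on-chain MAX_BATCH_SIZE cap
// recommended above is in place).
const MAX_BATCH_SIZE = 100;

function chunk(items, size) {
  const out = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Usage (assumes an `aaveDIVAWrapper` contract instance is in scope):
// for (const batch of chunk(tokens, MAX_BATCH_SIZE)) {
//   await aaveDIVAWrapper.batchRegisterCollateralToken(batch);
// }
```

This keeps every individual transaction comfortably below the block gas limit while still allowing arbitrarily large registration campaigns.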

Test Case for batchRegisterCollateralToken

To ensure the function works as expected and to verify the gas limit constraint, we can add a test case. Here is an example test case using Hardhat and Chai:

const { expect } = require("chai");
const { ethers } = require("hardhat");

describe("AaveDIVAWrapper", function () {
  let aaveDIVAWrapper;
  let owner;
  let addr1;

  beforeEach(async function () {
    [owner, addr1] = await ethers.getSigners();
    const AaveDIVAWrapper = await ethers.getContractFactory("AaveDIVAWrapper");
    aaveDIVAWrapper = await AaveDIVAWrapper.deploy();
    await aaveDIVAWrapper.waitForDeployment(); // ethers v6; use .deployed() on v5
  });

  it("Should revert if batch size exceeds the maximum limit", async function () {
    const MAX_BATCH_SIZE = 100;
    const largeBatch = new Array(MAX_BATCH_SIZE + 1).fill(addr1.address);
    await expect(
      aaveDIVAWrapper.batchRegisterCollateralToken(largeBatch)
    ).to.be.revertedWith("Batch size exceeds limit");
  });

  it("Should register tokens within the maximum batch size", async function () {
    const MAX_BATCH_SIZE = 100;
    // Use distinct addresses in case duplicate registrations revert
    const validBatch = Array.from(
      { length: MAX_BATCH_SIZE },
      () => ethers.Wallet.createRandom().address
    );
    const tx = await aaveDIVAWrapper.batchRegisterCollateralToken(validBatch);
    const receipt = await tx.wait();
    const wTokenAddresses = receipt.logs
      .filter((log) => {
        try {
          const parsedLog = aaveDIVAWrapper.interface.parseLog(log);
          return parsedLog?.name === "CollateralTokenRegistered";
        } catch {
          return false;
        }
      })
      .map((log) => {
        const parsedLog = aaveDIVAWrapper.interface.parseLog(log);
        return parsedLog?.args[1]; // wToken address is the second argument
      });
    expect(wTokenAddresses.length).to.equal(MAX_BATCH_SIZE);
  });
});

Explanation

  1. Maximum Batch Size Check: The function now includes a check to ensure that the number of tokens to be registered does not exceed the MAX_BATCH_SIZE limit. If it does, the transaction will revert with the message "Batch size exceeds limit".

  2. Test Case: The test case verifies two scenarios:

    • Exceeding Batch Size: Ensures that the function reverts when the batch size exceeds the maximum limit.

    • Valid Batch Size: Ensures that the function successfully registers tokens when the batch size is within the limit.

Updates

Lead Judging Commences

bube Lead Judge 5 months ago
Submission Judgement Published
Invalidated
Reason: Known issue
