Summary
A critical vulnerability has been identified in the batchRegisterCollateralToken function: there is no limit on the size of the token array that can be registered in a single transaction. Because the transaction's gas cost grows with the array size, this can lead to Denial of Service (DoS) conditions due to block gas limit constraints.
Vulnerability Details
Description
The function batchRegisterCollateralToken accepts an unbounded array of token addresses:
function batchRegisterCollateralToken(
    address[] calldata _collateralTokens
) external override onlyOwner nonReentrant returns (address[] memory) {
    uint256 _length = _collateralTokens.length;
    address[] memory _wTokens = new address[](_length);
    for (uint256 i = 0; i < _length; i++) {
        _wTokens[i] = _registerCollateralToken(_collateralTokens[i]);
    }
    return _wTokens;
}
Root Cause
The function lacks a maximum length check on the _collateralTokens array. Each token registration (_registerCollateralToken) consumes gas, and the cumulative gas consumption increases linearly with the array size, so a sufficiently large batch will exceed the block gas limit and the transaction will revert.
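To illustrate the linear growth (and the gas-estimation idea suggested in the recommendations below), the registration can be dry-run off-chain with ethers' estimateGas before submitting the transaction. This is a minimal sketch in the Hardhat/ethers v5 style used in the test further down; the helper name is hypothetical, and it assumes aaveDIVAWrapper is connected to the owner signer and tokenAddresses contains valid, not-yet-registered ERC20 token addresses:
// Hypothetical helper: probe how the gas estimate grows with the batch size.
// Assumes `aaveDIVAWrapper` is connected to the owner signer and every entry
// in `tokenAddresses` is a valid, not-yet-registered ERC20 token address.
async function profileBatchGas(aaveDIVAWrapper, tokenAddresses) {
  for (const size of [10, 50, 100, 200]) {
    const batch = tokenAddresses.slice(0, size);
    // estimateGas throws if the simulated call would revert
    const gas = await aaveDIVAWrapper.estimateGas.batchRegisterCollateralToken(batch);
    console.log(`batch size ${size}: ~${gas.toString()} gas`);
  }
}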
Proof of Concept
function testGasLimitExploit() public {
    // Build an oversized batch; with no cap, registering all of these
    // in one transaction can exceed the block gas limit.
    address[] memory tokens = new address[](1000);
    for (uint i = 0; i < 1000; i++) {
        tokens[i] = address(uint160(i + 1));
    }
    // Called by the owner on the deployed wrapper contract
    batchRegisterCollateralToken(tokens);
}
Impact
- Transactions with large arrays will revert due to exceeding the block gas limit
- The function becomes unusable for legitimate large-batch operations
- The owner cannot register tokens in large batches
- Gas is wasted on failed transactions
Tools Used
Recommendations
Short-term Fix
Implement a maximum array length check:
// Contract-level constant (declare alongside the other state variables)
uint256 private constant MAX_BATCH_SIZE = 100;

function batchRegisterCollateralToken(
    address[] calldata _collateralTokens
) external override onlyOwner nonReentrant returns (address[] memory) {
    uint256 _length = _collateralTokens.length;
    require(_length <= MAX_BATCH_SIZE, "Batch size exceeds limit");
    address[] memory _wTokens = new address[](_length);
    for (uint256 i = 0; i < _length; i++) {
        _wTokens[i] = _registerCollateralToken(_collateralTokens[i]);
    }
    return _wTokens;
}
Long-term Considerations
- Make MAX_BATCH_SIZE configurable by governance
- Implement batch processing with pagination (a client-side sketch follows this list)
- Add gas consumption estimation before execution
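As a sketch of the pagination idea (handled client-side rather than on-chain), a large token list can be split into chunks that respect the on-chain limit and submitted as separate transactions. The helper below is illustrative only; its name is hypothetical and it assumes a deployed aaveDIVAWrapper instance connected to the owner signer:
// Hypothetical client-side pagination: register tokens in chunks no larger
// than the on-chain MAX_BATCH_SIZE, one transaction per chunk.
async function registerInChunks(aaveDIVAWrapper, tokenAddresses, maxBatchSize = 100) {
  const receipts = [];
  for (let i = 0; i < tokenAddresses.length; i += maxBatchSize) {
    const chunk = tokenAddresses.slice(i, i + maxBatchSize);
    const tx = await aaveDIVAWrapper.batchRegisterCollateralToken(chunk);
    // wToken addresses can be read from the CollateralTokenRegistered events
    receipts.push(await tx.wait());
  }
  return receipts;
}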
Test Case for batchRegisterCollateralToken
To ensure the function works as expected and to verify the batch size limit, we can add a test case. Here is an example using Hardhat and Chai:
const { expect } = require("chai");
const { ethers } = require("hardhat");

describe("AaveDIVAWrapper", function () {
  let AaveDIVAWrapper;
  let aaveDIVAWrapper;
  let owner;
  let addr1;
  let addr2;
  let addrs;

  beforeEach(async function () {
    [owner, addr1, addr2, ...addrs] = await ethers.getSigners();
    AaveDIVAWrapper = await ethers.getContractFactory("AaveDIVAWrapper");
    // Constructor arguments omitted here; supply whatever parameters the contract requires.
    aaveDIVAWrapper = await AaveDIVAWrapper.deploy();
    await aaveDIVAWrapper.deployed();
  });

  it("Should revert if batch size exceeds the maximum limit", async function () {
    const MAX_BATCH_SIZE = 100;
    const largeBatch = new Array(MAX_BATCH_SIZE + 1).fill(addr1.address);
    await expect(
      aaveDIVAWrapper.batchRegisterCollateralToken(largeBatch)
    ).to.be.revertedWith("Batch size exceeds limit");
  });

  it("Should register tokens within the maximum batch size", async function () {
    const MAX_BATCH_SIZE = 100;
    // Note: in a real test each entry should be a distinct, unregistered ERC20
    // token address; a repeated signer address is used here for brevity.
    const validBatch = new Array(MAX_BATCH_SIZE).fill(addr1.address);
    const tx = await aaveDIVAWrapper.batchRegisterCollateralToken(validBatch);
    const receipt = await tx.wait();

    // Collect the wToken addresses emitted via CollateralTokenRegistered events.
    const wTokenAddresses = receipt.logs
      .filter((log) => {
        try {
          const parsedLog = aaveDIVAWrapper.interface.parseLog(log);
          return parsedLog?.name === "CollateralTokenRegistered";
        } catch {
          return false;
        }
      })
      .map((log) => {
        const parsedLog = aaveDIVAWrapper.interface.parseLog(log);
        return parsedLog?.args[1];
      });

    expect(wTokenAddresses.length).to.equal(MAX_BATCH_SIZE);
  });
});
Explanation
- Maximum Batch Size Check: The function now includes a check to ensure that the number of tokens to be registered does not exceed the MAX_BATCH_SIZE limit. If it does, the transaction reverts with the message "Batch size exceeds limit".
- Test Case: The test case verifies two scenarios:
  - Exceeding Batch Size: Ensures that the function reverts when the batch size exceeds the maximum limit.
  - Valid Batch Size: Ensures that the function successfully registers tokens when the batch size is within the limit.