HardhatDeFi
15,000 USDC
Submission Details
Severity: low
Invalid

Unbounded Loop in batchCreateContingentPool Leading to Denial of Service (DoS)

Summary

The batchCreateContingentPool function places no limit on the size of the _poolParams array processed in a single transaction. A sufficiently large array can push the transaction past the block gas limit, causing it to revert and creating a Denial of Service (DoS) condition for batch pool creation.

Vulnerability Details

Description

The function batchCreateContingentPool accepts an unbounded array of PoolParams:

function batchCreateContingentPool(
    PoolParams[] calldata _poolParams
) external override nonReentrant returns (bytes32[] memory) {
    uint256 _length = _poolParams.length;
    bytes32[] memory _poolIds = new bytes32[](_length);
    for (uint256 i = 0; i < _length; i++) {
        _poolIds[i] = _createContingentPool(_poolParams[i]);
    }
    return _poolIds;
}

Root Cause

The function lacks a maximum limit check on the _poolParams array length. Each pool creation operation (_createContingentPool) consumes gas, and the cumulative gas consumption increases linearly with the array size. If the array size is too large, the transaction could exceed the block gas limit, causing it to revert.
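A rough back-of-envelope calculation illustrates the constraint. The per-pool gas figure below is an assumption for illustration, not a measured value for _createContingentPool; the block gas limit of 30M matches Ethereum mainnet at the time of writing:

```javascript
// Rough upper bound on feasible batch size.
// GAS_PER_POOL is a hypothetical per-iteration cost, not measured on-chain.
const BLOCK_GAS_LIMIT = 30_000_000; // Ethereum mainnet block gas limit
const BASE_TX_GAS = 21_000;         // intrinsic transaction cost
const GAS_PER_POOL = 300_000;       // assumed cost of one _createContingentPool call

const maxPools = Math.floor((BLOCK_GAS_LIMIT - BASE_TX_GAS) / GAS_PER_POOL);
console.log(maxPools); // 99
```

Under these assumptions, any batch of roughly 100 or more pools is guaranteed to revert, regardless of how much gas the caller supplies.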

Proof of Concept

// Attack scenario (Foundry-style test)
function testGasLimitExploit() public {
    uint256 n = 1000; // large array
    PoolParams[] memory poolParams = new PoolParams[](n);
    // Fill array with valid pool parameters
    for (uint256 i = 0; i < n; i++) {
        poolParams[i] = PoolParams({
            // Initialize pool parameters
        });
    }
    // This call could exceed the block gas limit and revert
    batchCreateContingentPool(poolParams);
}

Impact

  • Severity: MEDIUM

  • Likelihood: MEDIUM

Technical Impact

  1. Transactions with large arrays will revert due to exceeding the block gas limit.

  2. The function becomes unusable for legitimate batch operations.

  3. Users cannot create pools in large batches.

  4. Users waste gas on failed transactions.

Tools Used

  • Manual code review

  • Hardhat for testing

Recommendations

Short-term Fix

Implement a maximum array length check:

// Declared at contract level (constants cannot be declared inside a function)
uint256 private constant MAX_BATCH_SIZE = 100;

function batchCreateContingentPool(
    PoolParams[] calldata _poolParams
) external override nonReentrant returns (bytes32[] memory) {
    uint256 _length = _poolParams.length;
    // Add maximum length check
    require(_length <= MAX_BATCH_SIZE, "Batch size exceeds limit");
    bytes32[] memory _poolIds = new bytes32[](_length);
    for (uint256 i = 0; i < _length; i++) {
        _poolIds[i] = _createContingentPool(_poolParams[i]);
    }
    return _poolIds;
}

Long-term Considerations

  1. Make MAX_BATCH_SIZE configurable by governance.

  2. Implement batch processing with pagination.

  3. Add gas consumption estimation before execution.
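The pagination idea in point 2 can also be handled client-side: split a large parameter array into chunks no larger than the on-chain limit and submit each chunk as a separate transaction. A minimal sketch (the helper name chunkPoolParams is hypothetical):

```javascript
// Hypothetical client-side pagination helper: splits a large _poolParams
// array into chunks of at most MAX_BATCH_SIZE elements each, so that every
// chunk can be submitted as a separate batchCreateContingentPool call.
const MAX_BATCH_SIZE = 100; // must match the on-chain limit

function chunkPoolParams(poolParams, chunkSize = MAX_BATCH_SIZE) {
  const chunks = [];
  for (let i = 0; i < poolParams.length; i += chunkSize) {
    chunks.push(poolParams.slice(i, i + chunkSize));
  }
  return chunks;
}

// Example: 250 pools become three transactions of 100, 100, and 50 pools.
const chunks = chunkPoolParams(new Array(250).fill({}));
console.log(chunks.map((c) => c.length)); // [ 100, 100, 50 ]
```

Each chunk then succeeds or fails independently, so one oversized request no longer blocks the entire batch.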


Test Case for batchCreateContingentPool

To ensure the function works as expected and to verify the gas limit constraint, we can add a test case. Here is an example test case using Hardhat and Chai:

const { expect } = require("chai");
const { ethers } = require("hardhat");

describe("AaveDIVAWrapper", function () {
  let aaveDIVAWrapper;
  let owner;
  let addr1;
  let addr2;
  let addrs;

  beforeEach(async function () {
    [owner, addr1, addr2, ...addrs] = await ethers.getSigners();
    const AaveDIVAWrapper = await ethers.getContractFactory("AaveDIVAWrapper");
    aaveDIVAWrapper = await AaveDIVAWrapper.deploy();
    await aaveDIVAWrapper.waitForDeployment(); // ethers v6; use .deployed() with ethers v5
  });

  it("Should revert if batch size exceeds the maximum limit", async function () {
    const MAX_BATCH_SIZE = 100;
    const largeBatch = new Array(MAX_BATCH_SIZE + 1).fill({
      // Initialize pool parameters
    });
    await expect(
      aaveDIVAWrapper.batchCreateContingentPool(largeBatch)
    ).to.be.revertedWith("Batch size exceeds limit");
  });

  it("Should create pools within the maximum batch size", async function () {
    const MAX_BATCH_SIZE = 100;
    const validBatch = new Array(MAX_BATCH_SIZE).fill({
      // Initialize pool parameters
    });
    const tx = await aaveDIVAWrapper.batchCreateContingentPool(validBatch);
    const receipt = await tx.wait();
    const poolIds = receipt.logs
      .filter((log) => {
        try {
          const parsedLog = aaveDIVAWrapper.interface.parseLog(log);
          return parsedLog?.name === "PoolCreated";
        } catch {
          return false;
        }
      })
      .map((log) => {
        const parsedLog = aaveDIVAWrapper.interface.parseLog(log);
        return parsedLog?.args[0]; // poolId is the first argument
      });
    expect(poolIds.length).to.equal(MAX_BATCH_SIZE);
  });
});

Explanation

  1. Maximum Batch Size Check: The recommended fix adds a check that the number of pools to be created does not exceed MAX_BATCH_SIZE. If it does, the transaction reverts with the message "Batch size exceeds limit".

  2. Test Case: The test case verifies two scenarios:

    • Exceeding Batch Size: Ensures that the function reverts when the batch size exceeds the maximum limit.

    • Valid Batch Size: Ensures that the function successfully creates pools when the batch size is within the limit.

Updates

Lead Judging Commences

bube Lead Judge 5 months ago
Submission Judgement Published
Invalidated
Reason: Known issue
