DeFiHardhatFoundry
250,000 USDC
Submission Details
Severity: low
Invalid

block.number should not be used as a measure of time on L2.

Line of code

https://github.com/Cyfrin/2024-05-beanstalk-the-finale/blob/df2dd129a878d16d4adc75049179ac0029d9a96b/protocol/contracts/beanstalk/sun/SeasonFacet/SeasonFacet.sol#L82

https://github.com/Cyfrin/2024-05-beanstalk-the-finale/blob/df2dd129a878d16d4adc75049179ac0029d9a96b/protocol/contracts/libraries/Convert/LibConvert.sol#L211

Summary

block.number on an L2 is not a reliable measure of elapsed time.

Vulnerability Details

The protocol mostly uses block.timestamp to measure elapsed time, but in several places it uses block.number for the same purpose.

The problem is that block.number is not a reliable way to track elapsed time on an L2, because block production differs across L2s.

As a concrete scenario, block.number on Arbitrum refers to the Ethereum mainnet block number, not the Arbitrum block number. A developer may wrongly assume they are counting Arbitrum's ~0.2-second blocks when in fact they are counting Ethereum mainnet's 12-second blocks.

https://docs.arbitrum.io/build-decentralized-apps/arbitrum-vs-ethereum/block-numbers-and-time#ethereum-block-numbers-within-arbitrum

Accessing block numbers within an Arbitrum smart contract (i.e., block.number in Solidity) will return a value close to (but not necessarily exactly) the L1 block number at which the sequencer received the transaction.

This results in a poor timekeeping mechanism: on L2, block.number advances far more slowly than L2 blocks are actually produced, which disrupts any function that relies on the block number as a clock.

function stepSeason() private returns (uint32 season) {
    s.sys.season.current += 1;
    season = s.sys.season.current;
    s.sys.season.sunriseBlock = uint32(block.number); // Note: Will overflow in the year 3650.
    emit Sunrise(season);
}

Another place that heavily relies on an accurate block.number is where the code tracks convert capacity when computing the stalk penalty:

https://github.com/Cyfrin/2024-05-beanstalk-the-finale/blob/df2dd129a878d16d4adc75049179ac0029d9a96b/protocol/contracts/libraries/Convert/LibConvert.sol#L211

// Update penalties in storage.
ConvertCapacity storage convertCap = s.sys.convertCapacity[block.number];
convertCap.overallConvertCapacityUsed = convertCap.overallConvertCapacityUsed.add(
    overallConvertCapacityUsed
);
convertCap.wellConvertCapacityUsed[inputToken] = convertCap
    .wellConvertCapacityUsed[inputToken]
    .add(inputTokenAmountUsed);
convertCap.wellConvertCapacityUsed[outputToken] = convertCap
    .wellConvertCapacityUsed[outputToken]
    .add(outputTokenAmountUsed);

If the contract is deployed on Arbitrum but block.number refers to the L1 block number, the per-block capacity tracking is broken.

Suppose the convert capacity per block is 100, Arbitrum produces one block every 0.2 seconds, and Ethereum mainnet produces one block every 12 seconds.

Then one L1 block spans 12 / 0.2 = 60 L2 blocks.

If capacity were tracked per L2 block, the total capacity over that span would be 60 * 100 = 6,000; but because the L1 block number is used on L2, only 100 of capacity is actually available over the same period.

The convert capacity will therefore lag badly, and the penalty will be computed incorrectly.
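The arithmetic above can be checked with a short Python model (the block times and the capacity figure are the hypothetical numbers from the example, not values taken from the protocol):

```python
# Worked example with the hypothetical numbers above.
# Integer milliseconds are used to avoid float rounding.
L1_BLOCK_TIME_MS = 12_000   # Ethereum mainnet: one block per 12 s
L2_BLOCK_TIME_MS = 200      # Arbitrum: roughly one block per 0.2 s (assumed)
CAPACITY_PER_BLOCK = 100    # hypothetical convert capacity per block

# Number of L2 blocks produced while block.number (an L1 number) advances once.
l2_blocks_per_l1_block = L1_BLOCK_TIME_MS // L2_BLOCK_TIME_MS  # 60

# Capacity the protocol would grant if block.number counted L2 blocks...
intended_capacity = l2_blocks_per_l1_block * CAPACITY_PER_BLOCK  # 6000
# ...versus what it actually grants, because block.number is the L1 number.
actual_capacity = CAPACITY_PER_BLOCK  # 100
```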

Impact

Time-sensitive functions that rely on block.number will behave inconsistently across L2s, leading to divergent state as shown in the example above.

Tools Used

Manual review

Recommendations

Rely on block.timestamp rather than block.number to measure elapsed time. (If the L2-native block number is genuinely needed on Arbitrum, it is exposed via the ArbSys precompile rather than block.number.)
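As a sketch of the recommendation, capacity can be keyed by a timestamp window instead of a block number. This is a minimal Python model of the idea, not the Beanstalk Solidity code; the window length and capacity are illustrative assumptions:

```python
# Hypothetical parameters: replenish capacity per fixed time window
# (keyed by timestamp) instead of per block number.
WINDOW_SECONDS = 12          # assumed replenishment window
CAPACITY_PER_WINDOW = 100    # assumed capacity per window

capacity_used = {}  # window index -> capacity already consumed

def try_convert(timestamp, amount):
    """Consume `amount` of capacity in the window containing `timestamp`.

    Returns False when the window's capacity would be exceeded
    (the path where a penalty/rejection would apply).
    """
    window = timestamp // WINDOW_SECONDS
    used = capacity_used.get(window, 0)
    if used + amount > CAPACITY_PER_WINDOW:
        return False
    capacity_used[window] = used + amount
    return True
```

Because timestamps advance at wall-clock rate on every chain, this behaves the same on L1 and on any L2, whereas a block.number key replenishes at whatever rate that chain's block number happens to advance.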

Updates

Lead Judging Commences

inallhonesty Lead Judge about 1 year ago
Submission Judgement Published
Invalidated
Reason: Known issue
Assigned finding tags:

Known - LightChaser
