Submission Details
Severity: Medium
Status: Invalid

Precision Loss in Sablier's Debt Calculations Due to Non-Optimized Order of Operations

Summary

The _ongoingDebtScaledOf function is a core calculation method in the Sablier protocol that determines accumulated debt for token streams. It takes a stream ID and returns the amount of tokens that have accrued since the last snapshot, calculated as elapsed time multiplied by the per-second rate. The result is used by higher-level functions for withdrawals, status checks, and solvency determinations. The function works with scaled values (18 decimals) and is designed to handle the precision requirements of continuous token streaming, where exact accounting of streamed amounts is crucial for the protocol's financial accuracy.

While the current implementation is functional, the order of operations and precision handling could be optimized to minimize rounding errors, especially for edge cases with very small rates or long time periods.

The issue stems from the order of operations in _ongoingDebtScaledOf:

function _ongoingDebtScaledOf(uint256 streamId) internal view returns (uint256) {
    uint256 blockTimestamp = block.timestamp;
    uint256 snapshotTime = _streams[streamId].snapshotTime;
    uint256 ratePerSecond = _streams[streamId].ratePerSecond.unwrap();

    if (ratePerSecond == 0 || blockTimestamp <= snapshotTime) {
        return 0;
    }

    uint256 elapsedTime;
    unchecked {
        elapsedTime = blockTimestamp - snapshotTime;
    }

    // Critical line: elapsedTime * ratePerSecond
    return elapsedTime * ratePerSecond;
}

Examples of Precision Loss

Scenario 1: Small Rate, Long Time

// Rate: 0.1 tokens per second (scaled to 18 decimals)
ratePerSecond = 100000000000000000   // 0.1 * 1e18
elapsedTime   = 1000000              // ~11.5 days

Calculation Method 1 (current):
    elapsedTime * ratePerSecond
    = 1000000 * 100000000000000000
    = 100000000000000000000000 (1e23, scaled)

Calculation Method 2 (alternative):
    (elapsedTime * ratePerSecond) / 1e18
    = 100000 tokens

The intermediate result (1e23) requires significant precision handling.

Scenario 2: Very Small Rate, Long Time

// Rate: 0.000001 tokens per second (scaled to 18 decimals)
ratePerSecond = 1000000000000   // 0.000001 * 1e18
elapsedTime   = 2000000         // ~23 days

Calculation Method 1 (current):
    elapsedTime * ratePerSecond
    = 2000000 * 1000000000000
    = 2000000000000000000 (2e18, scaled)

Potential rounding errors can arise when this value is descaled later.
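The numbers above can be reproduced with a small standalone sketch. It assumes that descaling means dividing the 18-decimal scaled amount by 10^(18 - tokenDecimals) (1e12 for a 6-decimal token such as USDC); the contract and function names are illustrative only and are not part of the Sablier codebase.

// Minimal sketch reproducing the two scenarios, assuming descaling divides
// the 18-decimal scaled amount by 10 ** (18 - tokenDecimals).
contract ScenarioSketch {
    function scenario1() external pure returns (uint256 scaled, uint256 descaled6) {
        uint256 ratePerSecond = 100000000000000000; // 0.1 tokens/sec, 18 decimals
        uint256 elapsedTime = 1000000;              // ~11.5 days in seconds
        scaled = elapsedTime * ratePerSecond;       // 1e23 (scaled)
        descaled6 = scaled / (10 ** (18 - 6));      // 1e11 = 100,000 tokens at 6 decimals
    }

    function scenario2() external pure returns (uint256 scaled, uint256 descaled6) {
        uint256 ratePerSecond = 1000000000000;      // 0.000001 tokens/sec, 18 decimals
        uint256 elapsedTime = 2000000;              // ~23 days in seconds
        scaled = elapsedTime * ratePerSecond;       // 2e18 (scaled)
        descaled6 = scaled / (10 ** (18 - 6));      // 2e6 = 2 tokens at 6 decimals
    }
}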

How This Affects Total Debt Calculation

The precision loss cascades into _totalDebtOf:

function _totalDebtOf(uint256 streamId) internal view returns (uint256) {
    uint256 totalDebtScaled = _ongoingDebtScaledOf(streamId) + _streams[streamId].snapshotDebtScaled;
    return Helpers.descaleAmount({ amount: totalDebtScaled, decimals: _streams[streamId].tokenDecimals });
}
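Descaling is where any truncation would materialize. Below is a minimal sketch of what a descale helper typically does; the function body is an assumption for illustration, not the verbatim Helpers implementation.

// Sketch of a descale helper, assuming the scaled amount uses 18 decimals
// and the token uses `decimals` (<= 18). Integer division truncates anything
// below 10 ** (18 - decimals).
library HelpersSketch {
    function descaleAmount(uint256 amount, uint8 decimals) internal pure returns (uint256) {
        if (decimals == 18) {
            return amount;
        }
        uint256 scaleFactor = uint256(10) ** (18 - decimals);
        return amount / scaleFactor;
    }
}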

Impact Examples

// Example demonstrating precision differences
contract PrecisionTest {
    // Scenario setup
    uint256 constant RATE = 100000000000000000; // 0.1 tokens/sec (18 decimals)
    uint256 constant TIME = 1000000;            // ~11.5 days in seconds

    function calculateDebt1() public pure returns (uint256) {
        // Current method
        return TIME * RATE;
    }

    function calculateDebt2() public pure returns (uint256) {
        // Alternative method using intermediate scaling
        uint256 timeScaled = TIME * 1e18;
        return (timeScaled * RATE) / 1e18;
    }
}

Real-World Implications

  1. Withdrawal Calculations

// In the _withdraw function:
uint256 totalDebtScaled = _ongoingDebtScaledOf(streamId) + _streams[streamId].snapshotDebtScaled;
uint256 totalDebt = Helpers.descaleAmount(totalDebtScaled, tokenDecimals);
// Precision loss here could affect actual withdrawal amounts

  2. Stream Status Determinations

// In statusOf:
bool hasDebt = _uncoveredDebtOf(streamId) > 0;
// Precision loss could affect the solvency determination
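To make the solvency concern concrete, the following is a minimal sketch of how an uncovered-debt check can be derived from the total debt and the stream's balance. The field name `balance` and the exact shape of the comparison are assumptions for illustration, not the verbatim protocol code.

// Sketch: uncovered debt is the portion of total debt the stream balance
// cannot cover; any imprecision in total debt feeds directly into this check.
function _uncoveredDebtOfSketch(uint256 streamId) internal view returns (uint256) {
    uint256 balance = _streams[streamId].balance; // assumed field name
    uint256 totalDebt = _totalDebtOf(streamId);

    if (balance < totalDebt) {
        return totalDebt - balance;
    }
    return 0;
}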

Impact

The precision loss in _ongoingDebtScaledOf can materially impact the protocol's accounting accuracy. When calculating elapsedTime * ratePerSecond, the multiplication of a large time value with a small rate (in scaled 18-decimal format) leads to rounding artifacts. These errors propagate through to _totalDebtOf and affect critical operations like withdrawals and stream solvency checks. In scenarios with very small rates (e.g., 0.000001 tokens/sec) running over extended periods, users could receive slightly less than they are entitled to during withdrawals due to truncation of the least significant digits during intermediate calculations. While individual rounding errors may be tiny, they compound over time and particularly impact high-precision or high-volume use cases where small discrepancies multiply into material losses. This is not a direct vulnerability but rather a technical limitation that could undermine the protocol's reliability for certain streaming configurations.
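To bound the effect of a single truncation step (again assuming descaling divides by 10^(18 - tokenDecimals)), the dropped remainder is always strictly less than one token unit:

// Illustrative bound for a 6-decimal token: integer division by 1e12 drops
// at most 1e12 - 1 scaled units, i.e. just under one token unit (0.000001 tokens).
uint256 scaled = 1999999999999;     // just under 2 * 1e12 scaled units
uint256 descaled = scaled / 1e12;   // = 1 token unit; 999999999999 scaled units are truncated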

Potential Fixes

  1. Scaled Time Approach:

function _ongoingDebtScaledOf(uint256 streamId) internal view returns (uint256) {
    // ... existing checks ...
    uint256 elapsedTime = blockTimestamp - snapshotTime;
    uint256 elapsedTimeScaled = elapsedTime * 1e18;
    return (elapsedTimeScaled * ratePerSecond) / 1e18;
}

  2. Intermediate Precision Method:

function _ongoingDebtScaledOf(uint256 streamId) internal view returns (uint256) {
    // ... existing checks ...
    uint256 elapsedTime = blockTimestamp - snapshotTime;
    uint256 ratePerSecond = _streams[streamId].ratePerSecond.unwrap();

    // Use higher precision for intermediate calculations
    uint256 PRECISION_FACTOR = 1e9;
    uint256 scaledTime = elapsedTime * PRECISION_FACTOR;
    uint256 result = (scaledTime * ratePerSecond) / PRECISION_FACTOR;
    return result;
}

Test

function testPrecisionLoss() public {
    // Setup a stream with a very small rate
    uint256 streamId = createStream(
        address(this),
        recipient,
        0.000001 ether, // tiny rate
        token
    );

    // Advance time significantly
    vm.warp(block.timestamp + 365 days);

    // Compare different calculation methods
    uint256 debt1 = sablierFlow.ongoingDebtScaledOf(streamId);
    uint256 debt2 = calculateWithHigherPrecision(streamId);
    assertGt(debt2, debt1); // Shows precision loss
}

Recommendations

  1. Consider using PRBMath's fixed-point operations more extensively (see the sketch after this list)

  2. Add sanity checks for minimum rates to prevent extreme precision loss
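A minimal sketch of what recommendation 1 could look like with PRBMath's UD60x18 type, assuming both the elapsed time and the rate are represented as 18-decimal fixed-point values. This illustrates the library usage only and is not a drop-in replacement for the existing function; the contract and function names are hypothetical.

import { UD60x18, ud } from "@prb/math/src/UD60x18.sol";

contract PRBMathSketch {
    /// @dev Multiplies an elapsed time (in seconds) by an 18-decimal rate using
    /// PRBMath's fixed-point multiplication, which keeps full 512-bit precision
    /// in the intermediate product before dividing by 1e18.
    function ongoingDebtScaled(uint256 elapsedTime, uint256 ratePerSecondScaled)
        external
        pure
        returns (uint256)
    {
        // ud() wraps a raw 18-decimal value; mul() computes (x * y) / 1e18.
        UD60x18 result = ud(elapsedTime * 1e18).mul(ud(ratePerSecondScaled));
        return result.unwrap();
    }
}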

Updates

Lead Judging Commences

inallhonesty Lead Judge 10 months ago
Submission Judgement Published
Invalidated
Reason: Non-acceptable severity
