Token-gated arbitrage intelligence. Encrypted on IPFS. Pay-per-decrypt.
Launch App · Demo Video · Contracts · Get Tokens
Full walkthrough: claiming tokens, connecting wallet, burning DADC, decrypting live arbitrage data.
You're a trader looking for arbitrage opportunities across DEX pools. The data is valuable; you don't want bots scraping it for free.
Alpha Foundry solves this by building an encrypted data pipeline where every access costs real tokens:
- Monitors DEX swaps on Ethereum mainnet (Uniswap V3 USDC/WETH pool via Blockscout API)
- Detects price deltas that signal arbitrage opportunities using a 24-hour rolling window
- Encrypts everything using Lighthouse's native Kavach encryption (threshold BLS cryptography)
- Gates access with ERC20 tokens → Lighthouse checks `balanceOf(wallet) >= 1 DADC` before serving any decryption key
- Burns tokens on every unlock → 1 DADC is transferred to `0xdead` before each decrypt, destroyed permanently
Think of it as a Bloomberg Terminal for on-chain arbitrage data, but encrypted on IPFS and unlocked with crypto tokens. 100 tokens = exactly 100 uses, then you need more.
```
┌────────────────────────────────────────────────────────────────┐
│                          User Browser                          │
│  ┌──────────────────────────────────────────────────────────┐  │
│  │  Next.js Frontend (Vercel)                               │  │
│  │  • Connect MetaMask wallet                               │  │
│  │  • Check DADC balance → show "X decrypts available"      │  │
│  │  • Burn 1 DADC → transfer to 0xdead                      │  │
│  │  • Sign auth message → request decryption key            │  │
│  │  • Download + decrypt JSONL locally                      │  │
│  │  • Display arbitrage data with analytics                 │  │
│  └───────────┬──────────────────────────────────────────────┘  │
└──────────────┼─────────────────────────────────────────────────┘
               │
               ├──────────▶ Sepolia Testnet
               │            ├─ DataCoin.balanceOf() → access check
               │            ├─ DataCoin.transfer(0xdead, 1e18) → burn
               │            └─ Faucet.claimTokens() → get 100 DADC
               │
               ├──────────▶ Lighthouse Storage
               │            ├─ getAuthMessage() → sign with wallet
               │            ├─ fetchEncryptionKey() → verify ERC20 balance
               │            ├─ decryptFile() → serve encrypted CID
               │            └─ Key shards distributed across 5 nodes
               │
┌──────────────┼─────────────────────────────────────────────────┐
│              ▼                                                 │
│                     Python Worker (Railway)                    │
│  ┌───────────────────────────────────────────────────────┐     │
│  │  1. Poll Blockscout REST API (Uniswap V3 swaps)       │     │
│  │  2. Normalize amounts (token decimals, USD est.)      │     │
│  │  3. Enrich with analytics (delta vs MA, MEV flags)    │     │
│  │  4. Compute rolling 24h price window (min/max/mean)   │     │
│  │  5. Package as JSONL with timestamps                  │     │
│  │  6. Encrypt via Lighthouse SDK (Kavach)               │     │
│  │  7. Apply ERC20 access condition (DADC >= 1)          │     │
│  │  8. Upload to IPFS (distributed key shards)           │     │
│  │  9. Auto-cleanup old files (keep only latest)         │     │
│  │ 10. Serve metadata via HTTP (/metadata, /health)      │     │
│  └───────────────────────────────────────────────────────┘     │
└────────────────────────────────────────────────────────────────┘
               │
               ├──────────▶ Ethereum Mainnet RPC
               │            └─ Uniswap V3 swap events
               │
               └──────────▶ Lighthouse API
                            ├─ uploadEncrypted()
                            └─ applyAccessCondition()
```
The Python worker (`apps/worker/run.py`) runs a continuous loop every 15 seconds:

- Fetch swaps → `BlockscoutRESTClient` queries the Blockscout API for recent Uniswap V3 swap events on the target pool, paginating up to 5 pages per cycle
- Deduplicate → `DedupeTracker` keeps a set of seen transaction hashes across restarts (persisted to `state/dedupe.json`)
- Normalize → `transform.py` decodes raw swap amounts using token decimals and resolves pool tokens from hardcoded pool mappings
- Estimate USD values → stablecoin amounts are used directly; WETH amounts are multiplied by a reference ETH price (Chainlink feed or fallback)
- Enrich with analytics → each row gets a delta vs the 5-swap moving average, price impact vs the rolling median, MEV warning flags (>2 stddev deviation), and emoji markers for notable swaps
- Rolling price buffer → `RollingPriceBuffer` maintains a 24-hour sliding window per token pair, calculating min/max/mean prices; persisted to `state/price_buffer.json`
- Write JSONL → enriched rows are written to a timestamped `.jsonl` file with schema version headers
- Encrypt + upload → `LighthouseNativeEncryption` runs Lighthouse's `uploadEncrypted()` via a Node.js subprocess, then applies an ERC20 access condition: `balanceOf(userAddress) >= 1 DADC` on Sepolia (chain ID 11155111)
- Auto-cleanup → `LighthouseCleanup` lists all uploaded files and deletes everything except the latest CID and permanently protected CIDs, using the Lighthouse CLI for reliable deletion
- Serve metadata → `ReadOnlyHTTPServer` exposes `/metadata` (latest CID, row count, freshness) and `/health` on port 8787
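The enrichment step (delta vs a short moving average, stddev-based MEV flags) can be sketched roughly as follows. This is an illustrative model of the logic described above, not the worker's actual code; the `price`, `delta_vs_ma`, and `mev_warning` field names are assumptions, not the real JSONL schema.

```python
import statistics

def enrich(rows, ma_window=5, mev_stddev=2.0):
    """Annotate each swap row with its delta vs the trailing moving average,
    flagging a MEV warning when the price deviates by > mev_stddev sigmas.
    Sketch only: field names are illustrative, not the worker's schema."""
    out = []
    for i, row in enumerate(rows):
        window = [r["price"] for r in rows[max(0, i - ma_window):i]]
        ma = statistics.mean(window) if window else row["price"]
        row = dict(row, delta_vs_ma=row["price"] - ma)
        if len(window) >= 2:
            sd = statistics.stdev(window)
            row["mev_warning"] = sd > 0 and abs(row["price"] - ma) > mev_stddev * sd
        else:
            row["mev_warning"] = False  # not enough history to judge
        out.append(row)
    return out

rows = enrich([{"price": p} for p in [2500.0, 2501.0, 2499.5, 2500.5, 2500.0, 2650.0]])
print(rows[-1]["mev_warning"])  # the 2650 outlier trips the >2-sigma flag
```

The same trailing-window pattern extends naturally to the 24-hour min/max/mean buffer; only the window key (timestamp instead of row count) changes.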
The Next.js app (`frontend/app/page.tsx`) implements the full unlock flow:

- Connect wallet → `ethers.BrowserProvider` connects to MetaMask, reads the chain ID, and checks that the wallet is on Sepolia (11155111)
- Check balance → calls `DataCoin.balanceOf(address)` and `Faucet.hasClaimed(address)` to show current state
- Claim tokens (if needed) → calls `Faucet.claimTokens()` to mint 100 DADC (one-time per address)
- Unlock & Decrypt → a 4-step process:
  - Step 1: Burn 1 DADC → `DataCoin.transfer(0xdead, 1e18)`, wait for confirmation
  - Step 2: Get auth message → `lighthouse.getAuthMessage(address)`
  - Step 3: Sign message → `signer.signMessage(message)` via MetaMask popup
  - Step 4: Fetch key + decrypt → `lighthouse.fetchEncryptionKey(cid, address, signature)`, then `lighthouse.decryptFile(cid, key)`; Lighthouse internally verifies that the wallet's DADC balance (post-burn) is still ≥ 1
- Display data → decrypted JSONL is parsed and rendered with arbitrage analytics, swap details, and price impact indicators
- Lighthouse checks `balanceOf(userAddress) >= 1 DADC` on every decrypt attempt; this is enforced at the encryption layer, not by the frontend
- The frontend burns 1 DADC before requesting the decryption key, so the user's balance drops
- When the balance reaches 0, Lighthouse denies the key automatically; no backend enforcement is needed
- Tokens are sent to `0x000000000000000000000000000000000000dEaD`; they're gone forever
- This creates genuine deflationary economics: fixed supply, decreasing on every use
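The gate above reduces to an access-condition record stored alongside the encrypted CID, plus simple integer arithmetic on the 18-decimal balance. A hedged sketch in Python; the field names follow Lighthouse's documented ERC20 condition shape, but treat the payload as illustrative rather than a verbatim copy of what the worker uploads:

```python
# 18-decimal token, so 1 DADC = 10**18 base units
ONE_DADC = 10**18

# Sketch of the ERC20 access condition gating each decrypt (illustrative;
# field names mirror Lighthouse's documented condition format).
condition = {
    "id": 1,
    "chain": "Sepolia",
    "method": "balanceOf",
    "standardContractType": "ERC20",
    "contractAddress": "0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC",
    "returnValueTest": {"comparator": ">=", "value": str(ONE_DADC)},
    "parameters": [":userAddress"],
}

def decrypts_available(balance_base_units: int) -> int:
    """Each unlock burns exactly 1 DADC, so the remaining balance maps
    one-to-one onto remaining decrypts."""
    return balance_base_units // ONE_DADC

print(decrypts_available(100 * ONE_DADC))  # a fresh faucet claim buys 100 unlocks
```

Note that the comparator runs against raw base units: a wallet holding 0.9 DADC (`9 * 10**17`) fails the `>= 1e18` test even though its balance is nonzero.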
| Property | Value |
|---|---|
| Address | 0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC |
| Standard | ERC20 |
| Symbol | DADC |
| Decimals | 18 |
| Total Supply | 100,000,000,000 DADC |
| Created via | 1MB.io factory with 10,000 LSDC lock |
| Creation Tx | 0x0bf7c4da... |
| Property | Value |
|---|---|
| Address | 0xB0864079e5A5f898Da37ffF6c8bce762A2eD35BB |
| Function | `claimTokens()` → mints 100 DADC to caller |
| Limit | One claim per address (reverts with `AlreadyClaimed` on repeat) |
| Cooldown | 1 hour (currently only the single-claim limit is enforced) |
The faucet contract calls `dataCoin.mint(msg.sender, 100e18)` and tracks claims via a `totalClaimed` mapping. Source: `contracts/DataCoinFaucet.sol`.
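The one-shot guard is simple enough to model off-chain. This Python sketch mirrors the behavior described above (mint 100 DADC once, revert with `AlreadyClaimed` on repeat); it is a stand-in for illustration, not a line-for-line port of the Solidity source:

```python
class AlreadyClaimed(Exception):
    """Stand-in for the Solidity custom error of the same name."""

class FaucetModel:
    """Off-chain model of the one-time faucet described above (illustrative)."""
    CLAIM_AMOUNT = 100 * 10**18  # 100 DADC in 18-decimal base units

    def __init__(self):
        self.total_claimed = {}  # mirrors the totalClaimed mapping
        self.balances = {}

    def claim_tokens(self, sender: str) -> int:
        if self.total_claimed.get(sender):
            raise AlreadyClaimed(sender)  # repeat claim reverts
        self.total_claimed[sender] = self.CLAIM_AMOUNT
        # on-chain this is dataCoin.mint(msg.sender, 100e18)
        self.balances[sender] = self.balances.get(sender, 0) + self.CLAIM_AMOUNT
        return self.CLAIM_AMOUNT

faucet = FaucetModel()
faucet.claim_tokens("0xabc")      # first claim succeeds
try:
    faucet.claim_tokens("0xabc")  # second claim reverts
except AlreadyClaimed:
    print("AlreadyClaimed")
```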
| Layer | Technology | Role |
|---|---|---|
| Frontend | Next.js 14 · ethers.js v6 · Tailwind · SWR | Wallet connection, token burning, Lighthouse decrypt, live metadata polling |
| Encryption | Lighthouse SDK (`@lighthouse-web3/sdk`) | Kavach encryption with BLS threshold cryptography, ERC20 access conditions |
| Backend | Python 3.12 · aiohttp · pydantic-settings | Async worker loop with typed configuration |
| Blockchain Data | Blockscout REST API · httpx · tenacity | Swap event ingestion with retry/backoff, MCP-compatible client |
| Price Oracle | Chainlink price feeds (fallback: swap inference) | ETH/USD reference price for USD value estimation |
| Smart Contracts | Solidity 0.8.20 | ERC20 token (DataCoin) + Faucet on Sepolia |
| State | JSON files (checkpointed) | Block cursors, deduplication sets, rolling price buffers; all survive restarts |
| Infra | Vercel (frontend) · Railway (backend via nixpacks) | Auto-deploy from main branch |
```
├── apps/
│   └── worker/                            # Python backend (Railway)
│       ├── run.py                         # Main loop: poll → transform → enrich → encrypt → upload → cleanup
│       ├── lighthouse_native_encryption.py # Kavach encryption + ERC20 access conditions via Node.js subprocess
│       ├── lighthouse_cleanup.py          # Auto-delete old Lighthouse files, protect latest + permanent CIDs
│       ├── blockscout_client.py           # MCP-first async client with session management
│       ├── blockscout_rest.py             # REST fallback client with pagination + retry
│       ├── transform.py                   # Swap normalization, token resolution, amount decoding
│       ├── state.py                       # DedupeTracker, RollingPriceBuffer, PreviewStateTracker
│       ├── chainlink_price.py             # ETH/USD price feed (Chainlink + swap inference fallback)
│       ├── http_server.py                 # /metadata and /health endpoints (port 8787)
│       ├── settings.py                    # Pydantic settings from env vars (poll interval, window size, etc.)
│       └── requirements.txt
│
├── frontend/                              # Next.js app (Vercel)
│   ├── app/
│   │   ├── page.tsx                       # Full unlock flow: connect → claim → burn → sign → decrypt → display
│   │   ├── layout.tsx                     # Root layout with metadata
│   │   └── globals.css                    # Tailwind base styles
│   ├── types/
│   │   └── window.d.ts                    # MetaMask ethereum provider types
│   ├── next.config.js
│   ├── tailwind.config.js
│   └── package.json
│
├── contracts/
│   └── DataCoinFaucet.sol                 # One-time claim faucet (100 DADC per wallet)
│
├── scripts/
│   ├── createDEXArbDataCoin.js            # 1MB.io factory token creation
│   ├── deployFaucet.js                    # Faucet deployment script
│   ├── mintTokens.js                      # Manual token minting
│   ├── testFaucet.js                      # Faucet integration test
│   ├── lighthouse_upload.py               # Standalone upload test
│   ├── verify_lighthouse_protection.py    # Access control verification
│   └── verify_demo.py                     # End-to-end demo data generator
│
├── infra/
│   └── autoscout/                         # Block explorer configuration
│
├── Dockerfile                             # Container build
├── requirements.txt                       # Python dependencies
├── package.json                           # Root JS dependencies (token scripts)
└── metadata.json                          # Token metadata (IPFS-pinned)
```
- Python 3.12+
- Node.js 18+ (required for Lighthouse CLI)
- MetaMask wallet on Sepolia testnet
- Lighthouse API key (files.lighthouse.storage)
```bash
cd apps/worker
python -m venv venv && source venv/bin/activate
pip install -r requirements.txt
npm install -g @lighthouse-web3/sdk

# Required env vars
export BLOCKSCOUT_MCP_BASE="https://eth-sepolia.blockscout.com/api/v2"
export CHAIN_ID=11155111
export LIGHTHOUSE_API_KEY="your_key"
export LIGHTHOUSE_WALLET_PRIVATE_KEY="0x..."
export DATACOIN_CONTRACT_ADDRESS="0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC"

python run.py
```

Expected output:

```
✓ Blockscout REST client initialized
✓ Lighthouse encryption configured (wallet: 0x...)
✓ HTTP server started on http://0.0.0.0:8787
✓ Fetched 47 swaps from Uniswap V3 USDC/WETH
✓ Enriched 47 rows (3 MEV warnings, 12 new)
✓ Lighthouse upload successful: QmXXX...
✓ Access condition applied: DADC >= 1 on chain 11155111
✓ Auto-cleanup: deleted 3 old files, 1 remaining
```
```bash
cd frontend
npm install

export NEXT_PUBLIC_DATACOIN_ADDRESS="0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC"
export NEXT_PUBLIC_FAUCET_ADDRESS="0xB0864079e5A5f898Da37ffF6c8bce762A2eD35BB"
export NEXT_PUBLIC_CHAIN_ID=11155111
export NEXT_PUBLIC_METADATA_API="http://localhost:8787"

npm run dev
# → http://localhost:3000
```

Backend (Railway):

- Connect the repo to Railway
- Set environment variables:

  ```
  BLOCKSCOUT_MCP_BASE=https://eth-sepolia.blockscout.com/api/v2
  CHAIN_ID=11155111
  LIGHTHOUSE_API_KEY=...
  LIGHTHOUSE_WALLET_PRIVATE_KEY=0x...
  DATACOIN_CONTRACT_ADDRESS=0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC
  ```

- Deploy → Railway auto-detects the build via nixpacks
- Verify → check logs for `✓ Lighthouse upload successful`

Frontend (Vercel):

- Connect the repo to Vercel
- Set the `NEXT_PUBLIC_*` environment variables
- Deploy → auto-deploys from `main`
- Live at alpha-foundry.vercel.app
Key settings in `apps/worker/settings.py` (all via env vars):

| Variable | Default | Description |
|---|---|---|
| `WORKER_POLL_SECONDS` | `15` | Seconds between swap fetch cycles |
| `WINDOW_MINUTES` | `2` | Lookback window for recent swaps |
| `BLOCK_LOOKBACK` | `100` | Max blocks to scan per cycle |
| `MAX_PAGES_PER_CYCLE` | `5` | Pagination limit per Blockscout request |
| `PREVIEW_ROWS` | `8` | Rows included in preview metadata |
| `ROLLING_WINDOW_SIZE` | `1000` | Max rows in the rolling JSONL file |
| `REFERENCE_ETH_PRICE_USD` | `2500.0` | Fallback ETH price if Chainlink unavailable |
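The env-var-with-default pattern behind this table can be sketched with a plain dataclass. The real `settings.py` uses pydantic-settings for typed configuration; this dependency-free stand-in only illustrates the mechanism (each field reads its upper-cased env var and falls back to the table's default):

```python
import os
from dataclasses import dataclass, fields

@dataclass
class WorkerSettings:
    """Illustrative stand-in for the pydantic-settings class in settings.py."""
    worker_poll_seconds: int = 15
    window_minutes: int = 2
    block_lookback: int = 100
    max_pages_per_cycle: int = 5
    preview_rows: int = 8
    rolling_window_size: int = 1000
    reference_eth_price_usd: float = 2500.0

    @classmethod
    def from_env(cls) -> "WorkerSettings":
        # Read WORKER_POLL_SECONDS, WINDOW_MINUTES, ... from the environment,
        # coercing each value with the field's annotated type.
        kwargs = {}
        for f in fields(cls):
            raw = os.environ.get(f.name.upper())
            if raw is not None:
                kwargs[f.name] = f.type(raw)
        return cls(**kwargs)

os.environ["WORKER_POLL_SECONDS"] = "30"  # env vars override the defaults
settings = WorkerSettings.from_env()
print(settings.worker_poll_seconds)  # 30
```

With pydantic-settings the coercion and validation come for free; the lookup order (env var first, declared default second) is the same.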
Lighthouse · 1MB.io · Blockscout · Chainlink · Railway · Vercel · ethers.js
This project is licensed under the MIT License; see the LICENSE file for details.