diff --git a/ARCHITECTURE.md b/ARCHITECTURE.md index e9179d0f..05afeec2 100644 --- a/ARCHITECTURE.md +++ b/ARCHITECTURE.md @@ -144,9 +144,48 @@ enrichedTips (displayed to user) See `docs/PERFORMANCE_PROFILING.md` for measurement techniques. -## Data Flow +### API Resilience & Caching (Issue #290) + +Read-heavy views implement last-known-good caching to survive API outages: -## Security Boundaries +``` +Live API Request + | + v +Fetch with timeout (10s) + ├─ Success? + │ ├─ Yes: Store in persistent cache → Return live data + │ │ + │ └─ No: Timeout/error occurred + │ Check persistent cache + │ ├─ Cache found → Return cached data + │ └─ No cache → Return error + | + v +User sees: live data OR cached data OR error +UI shows: freshness metadata + retry button (if cached) +``` + +**Features:** + +- **Persistent cache**: localStorage-backed, survives browser reload +- **TTL management**: 2-5 minute caches per endpoint type +- **Automatic fallback**: No code changes needed, transparent +- **Freshness indicators**: Users shown data source and age +- **Transaction lockout**: Risky actions disabled on stale data +- **Pattern invalidation**: Related caches cleared on state change + +**Layers:** + +1. `persistentCache.js` - Low-level storage with TTL +2. `useCachedData` - Generic hook for any fetch +3. `cachedApiClient.js` - Transparent HTTP wrapper +4. `FreshnessIndicator.jsx` - Visual feedback component +5. `ResilienceContext.jsx` - Global coordination + +See `docs/LAST_KNOWN_GOOD_CACHING.md` for architecture and patterns. + +## Data Flow | Boundary | Trust Model | |---|---| diff --git a/CHANGELOG.md b/CHANGELOG.md index cbaf7f82..45607bbf 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -8,6 +8,13 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/). 
### Changed +- Added last-known-good caching for read-heavy surfaces (Issue #290): + - Persistent cache stores successful API responses with configurable TTL + - Automatic fallback to cached data when live APIs are unavailable or slow + - Visual freshness indicators show users whether they are viewing live or cached data + - Transaction operations locked when live data unavailable to prevent incorrect actions + - Strategic cache invalidation on state changes (tip-sent, profile-update) + - Event feed pipeline refactored for scale and performance (Issue #291): - Implemented selective message enrichment: messages are now fetched only for visible/paginated tips instead of all tips, reducing API calls by ~90% @@ -19,6 +26,33 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/). - RecentTips component refactored to use new `useFilteredAndPaginatedEvents` hook, centralizing filter/sort/paginate logic and improving composability. +### Added (Issue #290) + +- `frontend/src/lib/persistentCache.js`: localStorage-backed cache with TTL support, + metadata tracking, and statistics collection. +- `frontend/src/hooks/useCachedData.js`: Generic hook for fetch with automatic + fallback to persistent cache on error or timeout. +- `frontend/src/hooks/useCachedStats.js`: Platform stats-specific hook with + appropriate TTL and timeout settings. +- `frontend/src/hooks/useCachedLeaderboard.js`: Leaderboard-specific hook with + extended cache TTL for aggregated data. +- `frontend/src/lib/cachedApiClient.js`: Transparent fetch wrapper with automatic + response caching, timeout handling, and per-endpoint TTL configuration. +- `frontend/src/lib/cacheInvalidationManager.js`: Utilities for pattern-based and + event-based cache invalidation to prevent stale data cascades. +- `frontend/src/hooks/useTransactionLockout.js`: Hook for controlling transaction + availability based on data source (live/cache/none). 
+- `frontend/src/context/ResilienceContext.jsx`: Global context for coordinating + cache invalidation and connection status monitoring across the app. +- `frontend/src/components/FreshnessIndicator.jsx`: Visual component showing cache + status, data age, and retry button for manual refresh. +- `docs/LAST_KNOWN_GOOD_CACHING.md`: Comprehensive guide covering architecture, + components, usage patterns, TTL guidelines, and troubleshooting. +- `docs/MIGRATION_GUIDE_290.md`: Step-by-step integration guide for adding caching + to existing components with before/after examples. +- Unit tests for persistent cache, cached data hook, cache invalidation, and + transaction lockout with edge case and integration coverage. + ### Added (Issue #291) - `frontend/src/lib/eventCursorManager.js`: Opaque cursor-based pagination diff --git a/docs/LAST_KNOWN_GOOD_CACHING.md b/docs/LAST_KNOWN_GOOD_CACHING.md new file mode 100644 index 00000000..a93830ed --- /dev/null +++ b/docs/LAST_KNOWN_GOOD_CACHING.md @@ -0,0 +1,309 @@ +# Last-Known-Good Caching for API Resilience + +## Overview + +This system enables graceful fallback to cached data when read-heavy APIs are unavailable or degraded. Users continue to see recent data during outages instead of empty states, significantly improving perceived reliability. + +## Architecture + +``` +Live API Request + | + v +Try Fetch (timeout: 10s) + | + ├─ Success? + │ ├─ Yes: Cache result (TTL: 2-5 min) + │ │ Return live data + │ │ Mark as LIVE + │ │ + │ └─ No: (timeout/error) + │ Get cached data + │ If cached: Return cache + │ If not cached: Return error + │ Mark as CACHE or NONE + │ + v +Display with metadata +(freshness indicator) +``` + +## Components + +### 1. Persistent Cache (`persistentCache.js`) + +Low-level localStorage wrapper with TTL support. 
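Internally, such a TTL wrapper can be sketched as follows. This is a hypothetical sketch, not the actual `persistentCache.js` source: the key prefix and internals are assumptions, only the `setCacheEntry`/`getCacheEntry`/`getCacheMetadata` surface mirrors the documented API, and an in-memory shim stands in for `localStorage` so the sketch also runs outside a browser.

```javascript
// Hypothetical sketch of a localStorage-backed TTL cache.
// (The real module exports these functions; exports are omitted here
// so the sketch is standalone.)

const memoryStore = new Map();
// Minimal Storage-like shim so the sketch runs where localStorage is absent.
const storage = typeof localStorage !== 'undefined' ? localStorage : {
  getItem: (k) => (memoryStore.has(k) ? memoryStore.get(k) : null),
  setItem: (k, v) => { memoryStore.set(k, String(v)); },
  removeItem: (k) => { memoryStore.delete(k); },
};

const PREFIX = 'lkg_cache:'; // assumed key namespace

function setCacheEntry(key, data, ttl) {
  // Store the payload with its write time so age/expiry can be derived later.
  storage.setItem(PREFIX + key, JSON.stringify({ data, timestamp: Date.now(), ttl }));
}

function getCacheEntry(key) {
  const raw = storage.getItem(PREFIX + key);
  if (raw === null) return null;
  const { data, timestamp, ttl } = JSON.parse(raw);
  if (Date.now() - timestamp > ttl) {
    storage.removeItem(PREFIX + key); // expired entries behave like misses
    return null;
  }
  return data;
}

function getCacheMetadata(key) {
  const raw = storage.getItem(PREFIX + key);
  if (raw === null) return null;
  const { timestamp, ttl } = JSON.parse(raw);
  const age = Date.now() - timestamp;
  return { timestamp, age, ttl, isExpired: age > ttl };
}
```

Keeping expiry checks inside `getCacheEntry` means callers never see stale data by accident, while `getCacheMetadata` still exposes age so the UI can label cached results.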
+ +```javascript +import { setCacheEntry, getCacheEntry, getCacheMetadata } from '../lib/persistentCache'; + +setCacheEntry('key', data, 300000); // Cache for 5 minutes +const cached = getCacheEntry('key'); // null if expired/not found +const meta = getCacheMetadata('key'); // { timestamp, age, ttl, isExpired } +``` + +### 2. Cached Data Hook (`useCachedData`) + +Automatically fetches live data and falls back to cache. + +```javascript +import { useCachedData } from '../hooks/useCachedData'; + +const { + data, // The actual data (live or cached) + source, // 'live', 'cache', or 'none' + isCached, // boolean + isLive, // boolean + metadata, // { age, isExpired, expiresAt } + error, // Error message if fetch failed + loading, // Currently fetching + retry, // Manual refresh function +} = useCachedData('my-key', fetchFunction, { + ttl: 300000, // Cache for 5 minutes + timeout: 10000, // Fail if fetch takes > 10s +}); +``` + +### 3. Freshness Indicator (`FreshnessIndicator.jsx`) + +Visual feedback about data source and age. + +```javascript +import { FreshnessIndicator } from '../components/FreshnessIndicator'; + + +``` + +### 4. Transaction Lockout (`useTransactionLockout`) + +Prevents transactions when live data unavailable. + +```javascript +import { useTransactionLockout } from '../hooks/useTransactionLockout'; + +const { isLocked, lockReason, severity } = useTransactionLockout({ + primary: dataSource, // 'live', 'cache', or 'none' +}); + +if (isLocked) { + return ; +} +``` + +### 5. Cache Invalidation (`cacheInvalidationManager.js`) + +Strategic cache clearing on state changes. 
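The idea behind these utilities can be sketched as pattern matching over cache keys. The sketch below operates on a plain `Map` and uses an assumed event-to-keys mapping (the key names follow the caches named in this doc, but are illustrative, not the real manager's internals):

```javascript
// Hypothetical sketch of pattern- and event-based cache invalidation.

function invalidateByPattern(store, pattern) {
  // Accept either a key prefix (string) or a RegExp.
  const regex = typeof pattern === 'string' ? new RegExp('^' + pattern) : pattern;
  for (const key of [...store.keys()]) {
    if (regex.test(key)) store.delete(key); // drop every matching entry
  }
}

// Domain events map to the caches they make stale (assumed key names).
const EVENT_INVALIDATIONS = {
  'tip-sent': ['leaderboard', 'platform_stats', 'events_feed'],
  'profile-update': ['user_profile', 'leaderboard'],
};

function invalidateOnEvent(store, event) {
  for (const pattern of EVENT_INVALIDATIONS[event] ?? []) {
    invalidateByPattern(store, pattern);
  }
}
```

Prefix matching lets one trigger clear a family of keys (e.g. every `user_profile:*` entry) without enumerating them, which is what prevents stale-data cascades after a state change.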
+ +```javascript +import { + invalidateOnTipSent, + invalidateOnProfileUpdate, + invalidateUserBalance, +} from '../lib/cacheInvalidationManager'; + +// Clear related caches when tip is sent +invalidateOnTipSent(); // Clears: leaderboard, stats, events_feed +``` + +## Usage Patterns + +### Pattern 1: Simple Live Data with Fallback + +```javascript +function Stats() { + const { stats, source, metadata, retry } = useCachedStats( + async () => { + const res = await fetch('/api/stats'); + return res.json(); + } + ); + + return ( +
+      <div>
+        <FreshnessIndicator source={source} metadata={metadata} onRetry={retry} />
+        {/* markup lost in the diff; render `stats` here as needed */}
+      </div>
+ ); +} +``` + +### Pattern 2: Protected Transactions + +```javascript +function SendTip() { + const { stats, source } = useCachedStats(fetchStats); + const { isLocked, lockReason } = useTransactionLockout({ primary: source }); + + return ( +
+      <div>
+        {isLocked && <p role="alert">{lockReason}</p>}
+        {/* element names lost in the diff; a disabled action button is implied */}
+        <button disabled={isLocked}>Send Tip</button>
+      </div>
+ ); +} +``` + +### Pattern 3: Manual Cache Control + +```javascript +function Leaderboard() { + const { data, source, metadata, retry } = useCachedData( + 'leaderboard', + fetchLeaderboard, + { ttl: 300000, timeout: 8000 } + ); + + return ( + <> + + + + ); +} +``` + +## Cache TTL Guidelines + +| View | TTL | Justification | +|---|---|---| +| Platform Stats | 2-5 min | Changed rarely, safe to cache | +| Leaderboard | 5-10 min | Aggregated data, not real-time | +| User Balance | 1 min | Used for transaction validation | +| Event Feed | 30 sec | Time-series data, freshness matters | +| User Profile | 10 min | Changed by user action, safe | + +## Invalidation Triggers + +### On Tip Sent +- Platform stats (total volume increased) +- Leaderboard (rankings may change) +- Event feed (new event added) + +### On Profile Update +- User profile cache for that user +- Leaderboard (profile info changed) + +### On Balance Change +- User balance cache + +### Manual Refresh +- User clicks "Retry" button +- User navigates to a new view +- Explicit clearCache() call + +## Visual Feedback + +### Live Data (Green dot, pulses) +``` +● Live data +``` + +### Cached Data (Amber dot) +``` +● Last retrieved from cache (5m ago) [Retry] +``` + +### Unavailable (Red dot) +``` +● Data unavailable +``` + +## Best Practices + +✓ Set appropriate TTLs based on data change frequency +✓ Show freshness metadata so users know what they're seeing +✓ Use retry buttons on cached data to re-attempt live fetch +✓ Lock transactions when data source is 'none' or 'cache' +✓ Invalidate related caches to prevent stale cascades +✓ Test fallback behavior with network throttling + +✗ Don't cache transactional data (confirmations, receipts) +✗ Don't hide that data is cached from the user +✗ Don't use indefinite TTLs +✗ Don't allow transactions with stale balance data +✗ Don't fail hard when cache is empty + +## Testing + +### Manual Testing + +1. 
**Verify Live Fetch:** + - Clear cache: `localStorage.clear()` + - Load page + - DevTools Network tab shows fetch + - Indicator shows "Live data" + +2. **Verify Cache Fallback:** + - Load page successfully (populates cache) + - Throttle network (DevTools → Network → Throttle) + - Reload page + - Indicator shows "Last retrieved from cache" + +3. **Verify Invalidation:** + - Send a tip + - Leaderboard cache should be cleared + - Leaderboard reloads on next view + +4. **Verify Transaction Lock:** + - Simulate offline: DevTools → Network → Offline + - "Send Tip" button disabled with message + +## Monitoring + +Check cache stats in browser console: + +```javascript +import { getCacheStats } from '../lib/persistentCache'; + +console.log(getCacheStats()); +// { +// totalEntries: 5, +// totalSize: 84532, +// entries: [ +// { key: 'platform_stats', age: 45000, ttl: 300000, isExpired: false } +// ... +// ] +// } +``` + +## Troubleshooting + +### Data Always Shows "Cached" +- Check network tab: is fetch request being made? 
+- Check timeout value: might be too short +- Check browser console for fetch errors + +### Cache Doesn't Show During Outage +- Verify TTL hasn't expired +- Check localStorage quota (might be full) +- Check browser privacy settings (might disable localStorage) + +### Transactions Don't Lock +- Verify source is 'cache' or 'none' (check console) +- Verify `useTransactionLockout` is being used +- Check that `isLocked` is wired to button disabled state + +### Stale Data After Update +- Verify invalidation trigger is called +- Check cache invalidation manager logs +- Manually clear cache: `localStorage.clear()` + +## References + +- See `persistentCache.js` for low-level API +- See `useCachedData.js` for fetch wrapping +- See `FreshnessIndicator.jsx` for UI patterns +- See `PlatformStats.jsx` for complete example diff --git a/docs/MIGRATION_GUIDE_290.md b/docs/MIGRATION_GUIDE_290.md new file mode 100644 index 00000000..a6307984 --- /dev/null +++ b/docs/MIGRATION_GUIDE_290.md @@ -0,0 +1,319 @@ +# Migration Guide: Last-Known-Good Caching (Issue #290) + +## Overview + +This guide helps you integrate last-known-good caching into existing read-heavy components to improve resilience during API outages. 
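Before wiring in the hooks, it helps to see the core flow they implement: race the live fetch against a timeout, cache successes, and serve the last known good copy on failure. The sketch below is framework-free and illustrative (`fetchWithFallback` and its `cache` parameter are not part of the real API surface; the real hooks add React state and freshness metadata on top of this logic):

```javascript
// Conceptual sketch of last-known-good fallback.

function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error('Fetch timeout')), ms);
  });
  // Clear the timer either way so it cannot keep firing (or keep Node alive).
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

async function fetchWithFallback(key, fetchFn, cache, { ttl = 300000, timeout = 10000 } = {}) {
  try {
    const data = await withTimeout(fetchFn(), timeout);
    cache.set(key, { data, timestamp: Date.now(), ttl }); // record last known good
    return { data, source: 'live' };
  } catch (err) {
    const entry = cache.get(key);
    if (entry && Date.now() - entry.timestamp <= entry.ttl) {
      return { data: entry.data, source: 'cache', age: Date.now() - entry.timestamp };
    }
    return { data: null, source: 'none', error: err.message };
  }
}
```

Note the three possible `source` values (`'live'`, `'cache'`, `'none'`): they are what `FreshnessIndicator` renders and what `useTransactionLockout` gates on.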
+ +## What Each Component Does + +| Component | Purpose | Use When | +|---|---|---| +| `useCachedData` | Generic fetch + cache + fallback | Building custom data sources | +| `useCachedStats` | Platform stats-specific | Displaying platform statistics | +| `useCachedLeaderboard` | Leaderboard-specific | Displaying leaderboard rankings | +| `cachedApiClient` | Transparent HTTP wrapper | Replacing fetch() globally | +| `FreshnessIndicator` | Visual cache status | Any cached data display | +| `useTransactionLockout` | Transaction gate | Send/Batch tip forms | +| `ResilienceContext` | Global resilience state | App-level coordination | + +## Step 1: Wrap a Component with Resilience Provider + +In your App root: + +```javascript +import { ResilienceProvider } from '../context/ResilienceContext'; + +function App() { + return ( + + + {/* your app */} + + + ); +} +``` + +## Step 2: Migrate Read-Heavy Components + +### Before: Direct API Fetch + +```javascript +function PlatformStats() { + const [stats, setStats] = useState(null); + const [loading, setLoading] = useState(true); + + useEffect(() => { + fetch('/api/stats') + .then(r => r.json()) + .then(setStats) + .finally(() => setLoading(false)); + }, []); + + return loading ? : ; +} +``` + +### After: With Cache Fallback + +```javascript +import { useCachedStats } from '../hooks/useCachedStats'; +import { FreshnessIndicator } from '../components/FreshnessIndicator'; + +function PlatformStats() { + const { + stats, + loading, + source, + metadata, + retry, + } = useCachedStats(() => fetch('/api/stats').then(r => r.json())); + + return ( + <> + + {loading ? 
: } + + ); +} +``` + +## Step 3: Protect Transactions + +### Before: Always Allow + +```javascript +function SendTip() { + const [sending, setSending] = useState(false); + + return ( + + ); +} +``` + +### After: Check Resilience Status + +```javascript +import { useTransactionLockout } from '../hooks/useTransactionLockout'; +import { useCachedStats } from '../hooks/useCachedStats'; + +function SendTip() { + const { stats, source } = useCachedStats(fetchBalance); + const { isLocked, lockReason } = useTransactionLockout({ primary: source }); + const [sending, setSending] = useState(false); + + if (isLocked) { + return ( +
+      <div>
+        {/* markup lost in the diff; locked-state message implied by the hook */}
+        <p role="alert">{lockReason}</p>
+        <button disabled>Send Tip</button>
+      </div>
+ ); + } + + return ( + + ); +} +``` + +## Step 4: Handle Cache Invalidation + +### On Tip Sent + +```javascript +import { useResilience } from '../context/ResilienceContext'; + +function TipForm() { + const { notifyTipSent } = useResilience(); + + const handleTipSent = useCallback(async (tip) => { + // ... send the tip ... + notifyTipSent(); // Invalidate related caches + }, [notifyTipSent]); +} +``` + +### On Profile Update + +```javascript +import { useResilience } from '../context/ResilienceContext'; + +function ProfileForm() { + const { notifyProfileUpdate } = useResilience(); + + const handleProfileUpdate = useCallback(async (profile) => { + // ... update the profile ... + notifyProfileUpdate(); // Invalidate related caches + }, [notifyProfileUpdate]); +} +``` + +## Step 5: Optional - Use Transparent API Client + +Replace fetch with automatic caching across your app: + +```javascript +// Old +const data = await fetch('/api/endpoint').then(r => r.json()); + +// New +import { cachedGet } from '../lib/cachedApiClient'; +const data = await cachedGet('/api/endpoint'); +``` + +Benefits: +- No component changes needed +- Caching automatic based on endpoint +- POST requests bypass cache automatically + +## Common Patterns + +### Pattern 1: Show Stale Data During Outage + +```javascript +function Leaderboard() { + const { entries, source, metadata, retry } = useCachedLeaderboard(fetch); + + return ( + <> + {source === 'cache' && ( + + Showing cached data from {formatTime(metadata.age)} ago. + + + )} + + + ); +} +``` + +### Pattern 2: Disable Risky Actions + +```javascript +function SettingsForm() { + const { stats, source } = useCachedStats(fetch); + const { isLocked, lockReason } = useTransactionLockout({ primary: source }); + + return ( +
+    <form>
+      {/* form fields lost in the diff; the point is the disabled submit below */}
+      <button type="submit" disabled={isLocked} title={lockReason || undefined}>
+        Save
+      </button>
+    </form>
+ ); +} +``` + +### Pattern 3: Cascade Invalidation + +```javascript +function BatchTip() { + const { notifyTipSent } = useResilience(); + + const handleBatchSuccess = useCallback(() => { + notifyTipSent(); // Clears: leaderboard, stats, events_feed + }, [notifyTipSent]); +} +``` + +## Troubleshooting + +### "Data always shows as cached" + +Check that the fetch is actually being made: +- DevTools Network tab +- Browser console for fetch errors +- Check timeout value (not too aggressive) + +### "Cache doesn't appear during outage" + +Debug storage: +```javascript +import { getCacheStats } from '../lib/persistentCache'; +console.log(getCacheStats()); // Check what's cached +console.log(localStorage); // Check storage size +``` + +### "Transactions not locking" + +Verify source is actually 'cache' or 'none': +```javascript +const { stats, source } = useCachedStats(...); +console.log('Current source:', source); // Should be 'cache' during outage +``` + +### "Old data persists too long" + +Check TTL: data won't fall back to cache after TTL expires. +Adjust in hook calls: +```javascript +useCachedStats(fetchFn, { ttl: 60000 }) // 1 minute cache +``` + +## Testing Your Implementation + +### Manual Test: Simulate Outage + +1. Open app and load a page +2. DevTools → Network → Offline +3. Modify data (if UI allows) +4. Verify: + - Data still displays ✓ + - Freshness indicator shows cache ✓ + - Transactions are locked ✓ + +### Manual Test: Verify Invalidation + +1. Send a tip successfully +2. Immediately check leaderboard +3. 
Verify it reloaded (not showing stale rank) + +### Manual Test: Check Cache Size + +```javascript +import { getCacheStats } from '../lib/persistentCache'; +const stats = getCacheStats(); +console.log(`Cached ${stats.totalEntries} items, ${stats.totalSize} bytes`); +``` + +## Performance Considerations + +- Cache TTL balanced between freshness and resilience +- Storage limited by localStorage quota (~5-10MB) +- Regular invalidation prevents stale data +- Monitor `getCacheStats()` to catch issues + +## Backwards Compatibility + +All changes are additive and non-breaking: +- Existing components continue working unchanged +- New components can gradually adopt caching +- No migration required for transactional components + +## Next Steps + +1. Wrap ResilienceProvider at app root +2. Migrate read-heavy views (stats, leaderboard) +3. Add transaction locks to forms +4. Test during network degradation +5. Monitor cache stats in production + +## References + +- LAST_KNOWN_GOOD_CACHING.md - Full system documentation +- useCachedData.js - Low-level API +- persistentCache.js - Storage layer +- FreshnessIndicator.jsx - UI component diff --git a/frontend/src/components/FreshnessIndicator.jsx b/frontend/src/components/FreshnessIndicator.jsx new file mode 100644 index 00000000..d84a9534 --- /dev/null +++ b/frontend/src/components/FreshnessIndicator.jsx @@ -0,0 +1,76 @@ +import { useMemo } from 'react'; + +/** + * FreshnessIndicator component displays cache status and data freshness. + * + * Shows whether data is live or cached, and when cached data was last updated. + * Helps users understand the reliability of the displayed information. 
+ * + * @param {Object} props + * @param {string} props.source - Data source ('live', 'cache', or 'none') + * @param {Object} props.metadata - Cache metadata including timestamp and age + * @param {boolean} props.loading - Whether data is being fetched + * @param {Function} props.onRetry - Callback for manual refresh + * @returns {JSX.Element} + */ +export function FreshnessIndicator({ source, metadata, loading, onRetry }) { + const statusText = useMemo(() => { + if (loading) return 'Updating...'; + if (source === 'live') return 'Live data'; + if (source === 'cache') return 'Last retrieved from cache'; + return 'Data unavailable'; + }, [source, loading]); + + const timeText = useMemo(() => { + if (!metadata || !metadata.age) return null; + + const seconds = Math.floor(metadata.age / 1000); + const minutes = Math.floor(seconds / 60); + const hours = Math.floor(minutes / 60); + + if (hours > 0) return `${hours}h ago`; + if (minutes > 0) return `${minutes}m ago`; + return `${seconds}s ago`; + }, [metadata]); + + const statusColor = useMemo(() => { + if (loading) return 'bg-blue-50 dark:bg-blue-900/30 border-blue-200 dark:border-blue-800'; + if (source === 'live') return 'bg-green-50 dark:bg-green-900/30 border-green-200 dark:border-green-800'; + if (source === 'cache') return 'bg-amber-50 dark:bg-amber-900/30 border-amber-200 dark:border-amber-800'; + return 'bg-red-50 dark:bg-red-900/30 border-red-200 dark:border-red-800'; + }, [source, loading]); + + const textColor = useMemo(() => { + if (loading) return 'text-blue-600 dark:text-blue-400'; + if (source === 'live') return 'text-green-600 dark:text-green-400'; + if (source === 'cache') return 'text-amber-600 dark:text-amber-400'; + return 'text-red-600 dark:text-red-400'; + }, [source, loading]); + + const iconDot = useMemo(() => { + if (loading) return 'bg-blue-500'; + if (source === 'live') return 'bg-green-500 animate-pulse'; + if (source === 'cache') return 'bg-amber-500'; + return 'bg-red-500'; + }, [source, 
loading]); + + return ( +
+    <div className={`inline-flex items-center gap-2 rounded-md border px-2 py-1 text-xs ${statusColor}`}>
+      {/* Markup reconstructed for illustration; structure inferred from the memoized values above */}
+      <span className={`h-2 w-2 rounded-full ${iconDot}`} aria-hidden="true" />
+      <span className={textColor}>
+        {statusText}
+        {source === 'cache' && timeText ? ` (${timeText})` : null}
+      </span>
+      {source === 'cache' && onRetry && (
+        <button type="button" onClick={onRetry} className={`underline ${textColor}`}>
+          Retry
+        </button>
+      )}
+    </div>
+ ); +} diff --git a/frontend/src/context/ResilienceContext.jsx b/frontend/src/context/ResilienceContext.jsx new file mode 100644 index 00000000..d6b0fe8a --- /dev/null +++ b/frontend/src/context/ResilienceContext.jsx @@ -0,0 +1,98 @@ +/** + * @module context/ResilienceContext + * + * Global context for managing API resilience and cache coordination. + * + * Tracks connection status, coordinates cache invalidation, and + * provides resilience state to all child components. + */ + +import { createContext, useContext, useState, useCallback, useEffect } from 'react'; +import { invalidateOnTipSent, invalidateOnProfileUpdate } from '../lib/cacheInvalidationManager'; + +const ResilienceContext = createContext(null); + +export function ResilienceProvider({ children }) { + const [isOnline, setIsOnline] = useState( + typeof navigator !== 'undefined' ? navigator.onLine : true + ); + const [apiHealth, setApiHealth] = useState('healthy'); + const [failureCount, setFailureCount] = useState(0); + + useEffect(() => { + const handleOnline = () => { + setIsOnline(true); + setApiHealth('recovering'); + setFailureCount(0); + }; + + const handleOffline = () => { + setIsOnline(false); + setApiHealth('offline'); + }; + + window.addEventListener('online', handleOnline); + window.addEventListener('offline', handleOffline); + + return () => { + window.removeEventListener('online', handleOnline); + window.removeEventListener('offline', handleOffline); + }; + }, []); + + const recordApiSuccess = useCallback(() => { + setFailureCount(0); + setApiHealth('healthy'); + }, []); + + const recordApiFailure = useCallback(() => { + setFailureCount(prev => { + const next = prev + 1; + if (next >= 3) { + setApiHealth('degraded'); + } + return next; + }); + }, []); + + const notifyTipSent = useCallback(() => { + invalidateOnTipSent(); + }, []); + + const notifyProfileUpdate = useCallback(() => { + invalidateOnProfileUpdate(); + }, []); + + const getResilienceStatus = useCallback(() => { + if (!isOnline) 
return 'offline'; + if (apiHealth === 'offline') return 'offline'; + if (apiHealth === 'degraded') return 'degraded'; + if (apiHealth === 'recovering') return 'recovering'; + return 'healthy'; + }, [isOnline, apiHealth]); + + return ( + + {children} + + ); +} + +export function useResilience() { + const context = useContext(ResilienceContext); + if (!context) { + throw new Error('useResilience must be used within ResilienceProvider'); + } + return context; +} diff --git a/frontend/src/hooks/useCachedData.js b/frontend/src/hooks/useCachedData.js new file mode 100644 index 00000000..6cf70750 --- /dev/null +++ b/frontend/src/hooks/useCachedData.js @@ -0,0 +1,119 @@ +/** + * @module hooks/useCachedData + * + * Hook for fetching data with automatic fallback to persistent cache. + * + * Attempts to fetch live data, caches successful responses, and + * falls back to cached data if the fetch fails or times out. + */ + +import { useState, useEffect, useCallback, useRef } from 'react'; +import { setCacheEntry, getCacheEntry, getCacheMetadata } from '../lib/persistentCache'; + +/** + * Hook for data fetching with cache fallback. + * + * @param {string} cacheKey - Key for persistent cache storage. + * @param {Function} fetchFn - Async function that fetches data. + * @param {Object} options - Configuration options. + * @param {number} options.ttl - Cache TTL in milliseconds (default 5 mins). + * @param {number} options.timeout - Fetch timeout in milliseconds (default 10 secs). 
+ * @returns {Object} { data, loading, error, source, metadata, retry, clearCache } + */ +export function useCachedData( + cacheKey, + fetchFn, + options = {} +) { + const { + ttl = 5 * 60 * 1000, + timeout = 10 * 1000, + } = options; + + const [data, setData] = useState(null); + const [loading, setLoading] = useState(false); + const [error, setError] = useState(null); + const [source, setSource] = useState('cache'); + const [metadata, setMetadata] = useState(null); + const cancelledRef = useRef(false); + const timeoutIdRef = useRef(null); + + const fetchWithTimeout = useCallback(async () => { + return Promise.race([ + fetchFn(), + new Promise((_, reject) => + setTimeout( + () => reject(new Error('Fetch timeout')), + timeout + ) + ), + ]); + }, [fetchFn, timeout]); + + const loadData = useCallback(async () => { + cancelledRef.current = false; + setLoading(true); + setError(null); + + try { + const liveData = await fetchWithTimeout(); + if (cancelledRef.current) return; + + setCacheEntry(cacheKey, liveData, ttl); + setData(liveData); + setSource('live'); + setMetadata(null); + } catch (err) { + if (cancelledRef.current) return; + + console.warn(`Failed to fetch data for "${cacheKey}":`, err.message); + setError(err.message || 'Failed to load data'); + + const cachedData = getCacheEntry(cacheKey); + if (cachedData) { + setData(cachedData); + setSource('cache'); + setMetadata(getCacheMetadata(cacheKey)); + } else { + setData(null); + setSource('none'); + } + } finally { + if (!cancelledRef.current) { + setLoading(false); + } + } + }, [cacheKey, fetchWithTimeout, ttl]); + + const retry = useCallback(async () => { + await loadData(); + }, [loadData]); + + const clearCache = useCallback(() => { + setData(null); + setMetadata(null); + setSource('none'); + }, []); + + useEffect(() => { + loadData(); + return () => { + cancelledRef.current = true; + if (timeoutIdRef.current) { + clearTimeout(timeoutIdRef.current); + } + }; + }, [cacheKey, fetchWithTimeout]); + + return { + 
data, + loading, + error, + source, + metadata, + retry, + clearCache, + isCached: source === 'cache', + isLive: source === 'live', + }; +} diff --git a/frontend/src/hooks/useCachedData.test.js b/frontend/src/hooks/useCachedData.test.js new file mode 100644 index 00000000..a9581c02 --- /dev/null +++ b/frontend/src/hooks/useCachedData.test.js @@ -0,0 +1,179 @@ +import { describe, it, expect, beforeEach, vi } from 'vitest'; +import { renderHook, act, waitFor } from '@testing-library/react'; +import { useCachedData } from './useCachedData'; +import * as persistentCache from '../lib/persistentCache'; + +describe('useCachedData Hook', () => { + beforeEach(() => { + localStorage.clear(); + vi.clearAllMocks(); + }); + + it('fetches live data on mount', async () => { + const mockData = { stats: { total: 100 } }; + const fetchFn = vi.fn().mockResolvedValue(mockData); + + const { result } = renderHook(() => + useCachedData('test', fetchFn, { ttl: 5000 }) + ); + + expect(result.current.loading).toBe(true); + + await waitFor(() => { + expect(result.current.loading).toBe(false); + }); + + expect(result.current.data).toEqual(mockData); + expect(result.current.source).toBe('live'); + expect(result.current.isLive).toBe(true); + }); + + it('caches successful responses', async () => { + const mockData = { stats: { total: 100 } }; + const fetchFn = vi.fn().mockResolvedValue(mockData); + + renderHook(() => useCachedData('test-cache', fetchFn, { ttl: 5000 })); + + await waitFor(() => { + const cached = persistentCache.getCacheEntry('test-cache'); + expect(cached).toEqual(mockData); + }); + }); + + it('falls back to cache on fetch error', async () => { + const cachedData = { stats: { total: 50 } }; + persistentCache.setCacheEntry('fallback', cachedData, 5000); + + const fetchFn = vi.fn().mockRejectedValue(new Error('API error')); + + const { result } = renderHook(() => + useCachedData('fallback', fetchFn, { ttl: 5000 }) + ); + + await waitFor(() => { + 
expect(result.current.loading).toBe(false); + }); + + expect(result.current.data).toEqual(cachedData); + expect(result.current.source).toBe('cache'); + expect(result.current.isCached).toBe(true); + expect(result.current.error).toBeDefined(); + }); + + it('returns null when no live data and no cache', async () => { + const fetchFn = vi.fn().mockRejectedValue(new Error('API error')); + + const { result } = renderHook(() => + useCachedData('nocache', fetchFn, { ttl: 5000 }) + ); + + await waitFor(() => { + expect(result.current.loading).toBe(false); + }); + + expect(result.current.data).toBeNull(); + expect(result.current.source).toBe('none'); + }); + + it('retries fetch on demand', async () => { + const mockData = { stats: { total: 100 } }; + const fetchFn = vi.fn().mockResolvedValue(mockData); + + const { result } = renderHook(() => + useCachedData('retry-test', fetchFn, { ttl: 5000 }) + ); + + await waitFor(() => { + expect(result.current.data).toEqual(mockData); + }); + + expect(fetchFn).toHaveBeenCalledTimes(1); + + act(() => { + result.current.retry(); + }); + + await waitFor(() => { + expect(fetchFn).toHaveBeenCalledTimes(2); + }); + }); + + it('handles fetch timeout', async () => { + const cachedData = { stats: { old: true } }; + persistentCache.setCacheEntry('timeout-test', cachedData, 5000); + + const fetchFn = vi.fn( + () => new Promise(resolve => setTimeout(resolve, 20000)) + ); + + const { result } = renderHook(() => + useCachedData('timeout-test', fetchFn, { ttl: 5000, timeout: 100 }) + ); + + await waitFor(() => { + expect(result.current.loading).toBe(false); + }); + + expect(result.current.data).toEqual(cachedData); + expect(result.current.source).toBe('cache'); + }); + + it('provides metadata for cached data', async () => { + vi.useFakeTimers(); + try { + const mockData = { stats: { total: 100 } }; + persistentCache.setCacheEntry('meta-test', mockData, 60000); + + vi.advanceTimersByTime(5000); + + const fetchFn = vi.fn().mockRejectedValue(new 
Error('API error')); + + const { result } = renderHook(() => + useCachedData('meta-test', fetchFn, { ttl: 60000 }) + ); + + await waitFor(() => { + expect(result.current.metadata).toBeDefined(); + }); + + expect(result.current.metadata.age).toBeGreaterThanOrEqual(5000); + expect(result.current.metadata.isExpired).toBe(false); + } finally { + vi.useRealTimers(); + } + }); + + it('clears cache on demand', async () => { + const mockData = { stats: { total: 100 } }; + const fetchFn = vi.fn().mockResolvedValue(mockData); + + const { result } = renderHook(() => + useCachedData('clear-test', fetchFn, { ttl: 5000 }) + ); + + await waitFor(() => { + expect(result.current.data).toEqual(mockData); + }); + + act(() => { + result.current.clearCache(); + }); + + expect(result.current.data).toBeNull(); + expect(result.current.metadata).toBeNull(); + }); + + it('respects TTL option', async () => { + const mockData = { stats: { total: 100 } }; + const fetchFn = vi.fn().mockResolvedValue(mockData); + + renderHook(() => + useCachedData('ttl-test', fetchFn, { ttl: 30000 }) + ); + + await waitFor(() => { + const metadata = persistentCache.getCacheMetadata('ttl-test'); + expect(metadata.ttl).toBe(30000); + }); + }); +}); diff --git a/frontend/src/hooks/useCachedLeaderboard.js b/frontend/src/hooks/useCachedLeaderboard.js new file mode 100644 index 00000000..a91d7216 --- /dev/null +++ b/frontend/src/hooks/useCachedLeaderboard.js @@ -0,0 +1,59 @@ +/** + * @module hooks/useCachedLeaderboard + * + * Hook for fetching leaderboard data with cache fallback. + * + * Handles leaderboard-specific caching with appropriate TTL + * and error handling. + */ + +import { useCallback } from 'react'; +import { useCachedData } from './useCachedData'; + +const LEADERBOARD_CACHE_KEY = 'leaderboard'; +const LEADERBOARD_CACHE_TTL = 10 * 60 * 1000; + +/** + * Hook for cached leaderboard data. 
+ * + * @param {Function} fetchLeaderboardFn - Async function that fetches leaderboard + * @param {Object} options - Optional configuration + * @returns {Object} Leaderboard data and state + */ +export function useCachedLeaderboard(fetchLeaderboardFn, options = {}) { + const { + timeout = 8000, + } = options; + + const safeFetch = useCallback(async () => { + if (!fetchLeaderboardFn) { + throw new Error('Fetch function required'); + } + return await fetchLeaderboardFn(); + }, [fetchLeaderboardFn]); + + const { + data, + loading, + error, + source, + metadata, + retry, + isCached, + isLive, + } = useCachedData(LEADERBOARD_CACHE_KEY, safeFetch, { + ttl: LEADERBOARD_CACHE_TTL, + timeout, + }); + + return { + entries: data, + loading, + error, + source, + metadata, + retry, + isCached, + isLive, + }; +} diff --git a/frontend/src/hooks/useCachedStats.js b/frontend/src/hooks/useCachedStats.js new file mode 100644 index 00000000..e95cb413 --- /dev/null +++ b/frontend/src/hooks/useCachedStats.js @@ -0,0 +1,59 @@ +/** + * @module hooks/useCachedStats + * + * Hook for fetching platform stats with cache fallback. + * + * Fetches live stats from the API and falls back to cached stats + * when the API is unavailable or slow. + */ + +import { useCallback } from 'react'; +import { useCachedData } from './useCachedData'; + +const STATS_CACHE_KEY = 'platform_stats'; +const STATS_CACHE_TTL = 2 * 60 * 1000; + +/** + * Hook for cached platform statistics. 
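+ *
+ * Example usage (the fetch function is illustrative):
+ *
+ *   const { stats, loading, source } = useCachedStats(fetchPlatformStats);
+ *   // source is 'live', 'cache', or 'none'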
+ * + * @param {Function} fetchStatsFn - Async function that fetches stats + * @param {Object} options - Optional configuration + * @returns {Object} Stats data and state + */ +export function useCachedStats(fetchStatsFn, options = {}) { + const { + timeout = 8000, + } = options; + + const safeFetch = useCallback(async () => { + if (!fetchStatsFn) { + throw new Error('Fetch function required'); + } + return await fetchStatsFn(); + }, [fetchStatsFn]); + + const { + data, + loading, + error, + source, + metadata, + retry, + isCached, + isLive, + } = useCachedData(STATS_CACHE_KEY, safeFetch, { + ttl: STATS_CACHE_TTL, + timeout, + }); + + return { + stats: data, + loading, + error, + source, + metadata, + retry, + isCached, + isLive, + }; +} diff --git a/frontend/src/hooks/useTransactionLockout.js b/frontend/src/hooks/useTransactionLockout.js new file mode 100644 index 00000000..aaa00f52 --- /dev/null +++ b/frontend/src/hooks/useTransactionLockout.js @@ -0,0 +1,66 @@ +/** + * @module hooks/useTransactionLockout + * + * Hook for managing transaction state based on API availability. + * + * Prevents transactions when live data is unavailable or degraded. + * Provides messaging to inform users why actions are disabled. + */ + +import { useMemo, useCallback } from 'react'; + +/** + * Hook for controlling transaction availability. + * + * @param {Object} sources - Map of data source states + * @param {string} sources.primary - Primary data source ('live', 'cache', 'none') + * @param {string} sources.secondary - Optional secondary data source + * @returns {Object} Transaction control state and helpers + */ +export function useTransactionLockout(sources = {}) { + const { + primary = 'live', + secondary = 'live', + } = sources; + + const isLocked = useMemo(() => { + return primary === 'none' || primary === 'cache'; + }, [primary]); + + const lockReason = useMemo(() => { + if (primary === 'none') { + return 'Unable to verify your account. 
Please check your connection.'; + } + if (primary === 'cache') { + return 'Using cached data. Transactions are temporarily disabled while we reconnect.'; + } + return null; + }, [primary]); + + const canSuggestRetry = useMemo(() => { + return primary === 'cache' || primary === 'none'; + }, [primary]); + + const severity = useMemo(() => { + if (primary === 'none') return 'critical'; + if (primary === 'cache') return 'warning'; + return 'none'; + }, [primary]); + + const getTransactionStatus = useCallback(() => { + return { + allowed: !isLocked, + reason: lockReason, + severity, + canRetry: canSuggestRetry, + }; + }, [isLocked, lockReason, severity, canSuggestRetry]); + + return { + isLocked, + lockReason, + canSuggestRetry, + severity, + getTransactionStatus, + }; +} diff --git a/frontend/src/hooks/useTransactionLockout.test.js b/frontend/src/hooks/useTransactionLockout.test.js new file mode 100644 index 00000000..a8f5deb6 --- /dev/null +++ b/frontend/src/hooks/useTransactionLockout.test.js @@ -0,0 +1,87 @@ +import { describe, it, expect } from 'vitest'; +import { renderHook } from '@testing-library/react'; +import { useTransactionLockout } from './useTransactionLockout'; + +describe('useTransactionLockout Hook', () => { + it('allows transactions when primary source is live', () => { + const { result } = renderHook(() => + useTransactionLockout({ primary: 'live' }) + ); + + expect(result.current.isLocked).toBe(false); + expect(result.current.lockReason).toBeNull(); + expect(result.current.severity).toBe('none'); + }); + + it('locks transactions when primary source is cache', () => { + const { result } = renderHook(() => + useTransactionLockout({ primary: 'cache' }) + ); + + expect(result.current.isLocked).toBe(true); + expect(result.current.lockReason).toContain('cached data'); + expect(result.current.severity).toBe('warning'); + expect(result.current.canSuggestRetry).toBe(true); + }); + + it('locks transactions with critical severity when data unavailable', () => { 
+ const { result } = renderHook(() => + useTransactionLockout({ primary: 'none' }) + ); + + expect(result.current.isLocked).toBe(true); + expect(result.current.lockReason).toContain('Unable to verify'); + expect(result.current.severity).toBe('critical'); + expect(result.current.canSuggestRetry).toBe(true); + }); + + it('provides transaction status via method', () => { + const { result } = renderHook(() => + useTransactionLockout({ primary: 'cache' }) + ); + + const status = result.current.getTransactionStatus(); + expect(status.allowed).toBe(false); + expect(status.reason).toBeDefined(); + expect(status.severity).toBe('warning'); + expect(status.canRetry).toBe(true); + }); + + it('handles default sources', () => { + const { result } = renderHook(() => + useTransactionLockout() + ); + + expect(result.current.isLocked).toBe(false); + expect(result.current.severity).toBe('none'); + }); + + it('provides informative messages for each state', () => { + const cacheResult = renderHook(() => + useTransactionLockout({ primary: 'cache' }) + ); + expect(cacheResult.result.current.lockReason).toContain('cached'); + + const noneResult = renderHook(() => + useTransactionLockout({ primary: 'none' }) + ); + expect(noneResult.result.current.lockReason).toContain('connection'); + }); + + it('indicates retry suggestion availability', () => { + const liveResult = renderHook(() => + useTransactionLockout({ primary: 'live' }) + ); + expect(liveResult.result.current.canSuggestRetry).toBe(false); + + const cachedResult = renderHook(() => + useTransactionLockout({ primary: 'cache' }) + ); + expect(cachedResult.result.current.canSuggestRetry).toBe(true); + + const noneResult = renderHook(() => + useTransactionLockout({ primary: 'none' }) + ); + expect(noneResult.result.current.canSuggestRetry).toBe(true); + }); +}); diff --git a/frontend/src/lib/cacheInvalidationManager.js b/frontend/src/lib/cacheInvalidationManager.js new file mode 100644 index 00000000..68542cb3 --- /dev/null +++ 
b/frontend/src/lib/cacheInvalidationManager.js @@ -0,0 +1,107 @@
+/**
+ * @module lib/cacheInvalidationManager
+ *
+ * Manages strategic cache invalidation for read-heavy surfaces.
+ *
+ * Handles selective invalidation patterns based on:
+ * - Time-based expiration (TTL)
+ * - Event-based triggers (new tips, profile updates)
+ * - Manual invalidation requests
+ */
+
+import { clearCacheEntry } from './persistentCache';
+
+const CACHE_KEYS = {
+  LEADERBOARD: 'leaderboard',
+  STATS: 'platform_stats',
+  USER_PROFILE: 'user_profile_',
+  BALANCE: 'user_balance_',
+  EVENTS_FEED: 'events_feed',
+};
+
+const INVALIDATION_PATTERNS = {
+  onTipSent: ['leaderboard', 'platform_stats', 'events_feed'],
+  onProfileUpdate: ['user_profile_', 'leaderboard'],
+  onBalanceChange: ['user_balance_'],
+};
+
+/**
+ * Invalidate caches matching a pattern.
+ *
+ * @param {string} pattern - Cache key pattern (substring match)
+ */
+export function invalidateByPattern(pattern) {
+  // Collect matching keys first: removing entries while iterating
+  // localStorage by index shifts the indices and skips entries.
+  const keysToClear = [];
+  for (let i = 0; i < localStorage.length; i++) {
+    const key = localStorage.key(i);
+    if (key && key.startsWith('tipstream_cache_') && key.includes(pattern)) {
+      keysToClear.push(key.replace('tipstream_cache_', ''));
+    }
+  }
+  keysToClear.forEach(cacheKey => clearCacheEntry(cacheKey));
+}
+
+/**
+ * Invalidate related caches when a tip is sent.
+ *
+ * Clears leaderboard, stats, and event feed caches to reflect the new tip.
+ */
+export function invalidateOnTipSent() {
+  INVALIDATION_PATTERNS.onTipSent.forEach(pattern => {
+    invalidateByPattern(pattern);
+  });
+}
+
+/**
+ * Invalidate related caches when a user profile is updated.
+ *
+ * Clears user profile and leaderboard caches.
+ */
+export function invalidateOnProfileUpdate() {
+  INVALIDATION_PATTERNS.onProfileUpdate.forEach(pattern => {
+    invalidateByPattern(pattern);
+  });
+}
+
+/**
+ * Invalidate balance cache for a user.
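+ *
+ * Example (the address value is illustrative):
+ *
+ *   invalidateUserBalance('alice'); // clears only user_balance_alice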
+ *
+ * @param {string} address - User address
+ */
+export function invalidateUserBalance(address) {
+  if (address) {
+    clearCacheEntry(`${CACHE_KEYS.BALANCE}${address}`);
+  }
+}
+
+/**
+ * Invalidate all read-heavy view caches.
+ *
+ * Used when connectivity is restored to ensure fresh data.
+ */
+export function invalidateAllReadCaches() {
+  Object.values(CACHE_KEYS).forEach(key => {
+    invalidateByPattern(key);
+  });
+}
+
+/**
+ * Register an invalidation handler for transactional events.
+ *
+ * Subscribes to the context's tip-sent notifications. The subscription
+ * API (`onTipSent` returning an unsubscribe callback) is assumed here.
+ *
+ * @param {Object} tipContext - TipContext instance
+ * @returns {Function} Unsubscribe function
+ */
+export function registerInvalidationHandlers(tipContext) {
+  if (!tipContext) return () => {};
+
+  const handleTipSent = () => {
+    invalidateOnTipSent();
+  };
+
+  // Wire the handler up to the context; previously it was defined but
+  // never subscribed, so tip events did not invalidate anything.
+  const unsubscribe = tipContext.onTipSent?.(handleTipSent);
+
+  return () => {
+    unsubscribe?.();
+  };
+}
+
+export { CACHE_KEYS };
diff --git a/frontend/src/lib/cacheInvalidationManager.test.js b/frontend/src/lib/cacheInvalidationManager.test.js
new file mode 100644
index 00000000..eed8b720
--- /dev/null
+++ b/frontend/src/lib/cacheInvalidationManager.test.js
@@ -0,0 +1,142 @@
+import { describe, it, expect, beforeEach } from 'vitest';
+import {
+  invalidateByPattern,
+  invalidateOnTipSent,
+  invalidateOnProfileUpdate,
+  invalidateUserBalance,
+  invalidateAllReadCaches,
+  CACHE_KEYS,
+} from './cacheInvalidationManager';
+import { setCacheEntry, getCacheEntry } from './persistentCache';
+
+describe('Cache Invalidation Manager', () => {
+  beforeEach(() => {
+    localStorage.clear();
+  });
+
+  describe('invalidateByPattern', () => {
+    it('invalidates entries matching pattern', () => {
+      setCacheEntry('leaderboard', { data: 'board' });
+      setCacheEntry('leaderboard_extended', { data: 'extended' });
+      setCacheEntry('stats', { data: 'stats' });
+
+      invalidateByPattern('leaderboard');
+
+      expect(getCacheEntry('leaderboard')).toBeNull();
+      expect(getCacheEntry('leaderboard_extended')).toBeNull();
+
expect(getCacheEntry('stats')).toBeDefined(); + }); + + it('handles pattern with no matches', () => { + setCacheEntry('stats', { data: 'stats' }); + invalidateByPattern('nonexistent_pattern'); + expect(getCacheEntry('stats')).toBeDefined(); + }); + }); + + describe('invalidateOnTipSent', () => { + it('invalidates related caches', () => { + setCacheEntry('leaderboard', { data: 'board' }); + setCacheEntry('platform_stats', { data: 'stats' }); + setCacheEntry('events_feed', { data: 'feed' }); + setCacheEntry('user_balance_alice', { data: 'balance' }); + + invalidateOnTipSent(); + + expect(getCacheEntry('leaderboard')).toBeNull(); + expect(getCacheEntry('platform_stats')).toBeNull(); + expect(getCacheEntry('events_feed')).toBeNull(); + expect(getCacheEntry('user_balance_alice')).toBeDefined(); + }); + }); + + describe('invalidateOnProfileUpdate', () => { + it('invalidates profile and leaderboard', () => { + setCacheEntry('user_profile_alice', { data: 'profile' }); + setCacheEntry('leaderboard', { data: 'board' }); + setCacheEntry('platform_stats', { data: 'stats' }); + + invalidateOnProfileUpdate(); + + expect(getCacheEntry('user_profile_alice')).toBeNull(); + expect(getCacheEntry('leaderboard')).toBeNull(); + expect(getCacheEntry('platform_stats')).toBeDefined(); + }); + }); + + describe('invalidateUserBalance', () => { + it('invalidates specific user balance', () => { + setCacheEntry('user_balance_alice', { data: 'balance_alice' }); + setCacheEntry('user_balance_bob', { data: 'balance_bob' }); + + invalidateUserBalance('alice'); + + expect(getCacheEntry('user_balance_alice')).toBeNull(); + expect(getCacheEntry('user_balance_bob')).toBeDefined(); + }); + + it('handles null address gracefully', () => { + setCacheEntry('user_balance_alice', { data: 'balance' }); + invalidateUserBalance(null); + expect(getCacheEntry('user_balance_alice')).toBeDefined(); + }); + + it('handles empty string address gracefully', () => { + setCacheEntry('user_balance_alice', { data: 'balance' 
}); + invalidateUserBalance(''); + expect(getCacheEntry('user_balance_alice')).toBeDefined(); + }); + }); + + describe('invalidateAllReadCaches', () => { + it('clears all read-heavy caches', () => { + setCacheEntry('leaderboard', { data: 'board' }); + setCacheEntry('platform_stats', { data: 'stats' }); + setCacheEntry('user_profile_alice', { data: 'profile' }); + setCacheEntry('user_balance_alice', { data: 'balance' }); + setCacheEntry('events_feed', { data: 'feed' }); + + invalidateAllReadCaches(); + + expect(getCacheEntry('leaderboard')).toBeNull(); + expect(getCacheEntry('platform_stats')).toBeNull(); + expect(getCacheEntry('user_profile_alice')).toBeNull(); + expect(getCacheEntry('user_balance_alice')).toBeNull(); + expect(getCacheEntry('events_feed')).toBeNull(); + }); + }); + + describe('CACHE_KEYS', () => { + it('defines standard cache keys', () => { + expect(CACHE_KEYS.LEADERBOARD).toBe('leaderboard'); + expect(CACHE_KEYS.STATS).toBe('platform_stats'); + expect(CACHE_KEYS.EVENTS_FEED).toBe('events_feed'); + }); + }); + + describe('Integration', () => { + it('handles complex invalidation scenarios', () => { + setCacheEntry('leaderboard', { data: 'board' }); + setCacheEntry('user_profile_alice', { data: 'profile' }); + setCacheEntry('user_balance_bob', { data: 'balance' }); + setCacheEntry('events_feed', { data: 'feed' }); + + invalidateOnTipSent(); + + expect(getCacheEntry('leaderboard')).toBeNull(); + expect(getCacheEntry('user_profile_alice')).toBeDefined(); + expect(getCacheEntry('user_balance_bob')).toBeDefined(); + expect(getCacheEntry('events_feed')).toBeNull(); + }); + + it('supports cascading invalidations', () => { + setCacheEntry('user_profile_alice', { data: 'old' }); + setCacheEntry('leaderboard', { data: 'old' }); + + invalidateOnProfileUpdate(); + + expect(getCacheEntry('user_profile_alice')).toBeNull(); + expect(getCacheEntry('leaderboard')).toBeNull(); + }); + }); +}); diff --git a/frontend/src/lib/cachedApiClient.js 
b/frontend/src/lib/cachedApiClient.js new file mode 100644 index 00000000..2e50f6ae --- /dev/null +++ b/frontend/src/lib/cachedApiClient.js @@ -0,0 +1,142 @@ +/** + * @module lib/cachedApiClient + * + * HTTP client wrapper that automatically caches GET responses. + * + * Intercepts successful responses and stores them in persistent cache + * for automatic fallback during API degradation. + */ + +import { setCacheEntry, getCacheEntry } from './persistentCache'; + +/** + * Configuration for cached endpoints. + * + * Maps endpoint patterns to cache TTL values. + */ +const CACHE_CONFIG = { + '/stats': 5 * 60 * 1000, + '/leaderboard': 10 * 60 * 1000, + '/profile/': 10 * 60 * 1000, + '/events': 30 * 1000, +}; + +/** + * Generate cache key from endpoint URL. + * + * @param {string} endpoint - API endpoint + * @returns {string} Cache key + */ +function getCacheKeyForEndpoint(endpoint) { + return `api_${endpoint.replace(/\//g, '_')}`; +} + +/** + * Get TTL for an endpoint. + * + * @param {string} endpoint - API endpoint + * @returns {number} TTL in milliseconds + */ +function getTtlForEndpoint(endpoint) { + for (const [pattern, ttl] of Object.entries(CACHE_CONFIG)) { + if (endpoint.includes(pattern)) { + return ttl; + } + } + return 5 * 60 * 1000; +} + +/** + * Make a cached GET request. 
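+ *
+ * Example (endpoint and timeout are illustrative). On success the parsed
+ * response is cached under a key derived from the URL; on timeout or
+ * network error the last cached response is returned when one exists,
+ * otherwise the error is rethrown:
+ *
+ *   const stats = await cachedFetch('/stats', { timeout: 5000 });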
+ * + * @param {string} url - Full URL to fetch + * @param {Object} options - Fetch options + * @returns {Promise} Response data + */ +export async function cachedFetch(url, options = {}) { + const { timeout = 10000, useCache = true } = options; + + if (useCache && options.method?.toUpperCase() !== 'POST') { + const cacheKey = getCacheKeyForEndpoint(url); + const cached = getCacheEntry(cacheKey); + if (cached) { + return cached; + } + } + + const controller = new AbortController(); + const timeoutId = setTimeout(() => controller.abort(), timeout); + + try { + const response = await fetch(url, { + ...options, + signal: controller.signal, + }); + + clearTimeout(timeoutId); + + if (!response.ok) { + throw new Error(`HTTP ${response.status}`); + } + + const data = await response.json(); + + if (useCache && response.status === 200) { + const cacheKey = getCacheKeyForEndpoint(url); + const ttl = getTtlForEndpoint(url); + setCacheEntry(cacheKey, data, ttl); + } + + return data; + } catch (err) { + clearTimeout(timeoutId); + + if (useCache && options.method?.toUpperCase() !== 'POST') { + const cacheKey = getCacheKeyForEndpoint(url); + const cached = getCacheEntry(cacheKey); + if (cached) { + return cached; + } + } + + throw err; + } +} + +/** + * Make a GET request with caching. + * + * @param {string} url - URL to fetch + * @param {Object} options - Fetch options + * @returns {Promise} Response data + */ +export async function cachedGet(url, options = {}) { + return cachedFetch(url, { ...options, method: 'GET' }); +} + +/** + * Make a POST request (bypasses cache). + * + * @param {string} url - URL to fetch + * @param {Object} body - Request body + * @param {Object} options - Fetch options + * @returns {Promise} Response data + */ +export async function cachedPost(url, body, options = {}) { + return cachedFetch(url, { + ...options, + method: 'POST', + body: JSON.stringify(body), + useCache: false, + }); +} + +/** + * Register a custom cache configuration for an endpoint. 
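+ *
+ * Example (pattern and TTL are illustrative):
+ *
+ *   registerCachePattern('/search', 60 * 1000); // cache /search responses for 1 minute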
+ * + * @param {string} pattern - URL pattern to match + * @param {number} ttlMs - Cache TTL in milliseconds + */ +export function registerCachePattern(pattern, ttlMs) { + CACHE_CONFIG[pattern] = ttlMs; +} diff --git a/frontend/src/lib/cachedApiClient.test.js b/frontend/src/lib/cachedApiClient.test.js new file mode 100644 index 00000000..93b139d8 --- /dev/null +++ b/frontend/src/lib/cachedApiClient.test.js @@ -0,0 +1,191 @@ +import { describe, it, expect, beforeEach, vi } from 'vitest'; +import { cachedFetch, cachedGet, cachedPost, registerCachePattern } from './cachedApiClient'; +import * as persistentCache from './persistentCache'; + +describe('Cached API Client', () => { + beforeEach(() => { + localStorage.clear(); + vi.clearAllMocks(); + }); + + describe('cachedFetch', () => { + it('caches successful GET responses', async () => { + const mockData = { stats: { total: 100 } }; + global.fetch = vi.fn().mockResolvedValue({ + ok: true, + status: 200, + json: async () => mockData, + }); + + const result = await cachedFetch('/stats'); + expect(result).toEqual(mockData); + + const cached = persistentCache.getCacheEntry('api__stats'); + expect(cached).toEqual(mockData); + }); + + it('returns cached data on timeout', async () => { + const cachedData = { stats: { cached: true } }; + persistentCache.setCacheEntry('api__stats', cachedData, 300000); + + global.fetch = vi.fn( + () => new Promise(resolve => setTimeout(resolve, 20000)) + ); + + const result = await cachedFetch('/stats', { timeout: 100 }); + expect(result).toEqual(cachedData); + }); + + it('respects useCache option', async () => { + const mockData = { data: 'new' }; + global.fetch = vi.fn().mockResolvedValue({ + ok: true, + status: 200, + json: async () => mockData, + }); + + await cachedFetch('/stats', { useCache: false }); + + const cached = persistentCache.getCacheEntry('api__stats'); + expect(cached).toBeNull(); + }); + + it('handles fetch errors with cache fallback', async () => { + const cachedData = { 
stats: { fallback: true } }; + persistentCache.setCacheEntry('api__stats', cachedData, 300000); + + global.fetch = vi.fn().mockRejectedValue(new Error('Network error')); + + const result = await cachedFetch('/stats'); + expect(result).toEqual(cachedData); + }); + + it('throws error when no cache and fetch fails', async () => { + global.fetch = vi.fn().mockRejectedValue(new Error('Network error')); + + await expect(cachedFetch('/stats')).rejects.toThrow(); + }); + + it('handles non-OK responses', async () => { + const cachedData = { stats: { fallback: true } }; + persistentCache.setCacheEntry('api__stats', cachedData, 300000); + + global.fetch = vi.fn().mockResolvedValue({ + ok: false, + status: 500, + }); + + const result = await cachedFetch('/stats'); + expect(result).toEqual(cachedData); + }); + }); + + describe('cachedGet', () => { + it('makes GET requests with caching', async () => { + const mockData = { data: 'value' }; + global.fetch = vi.fn().mockResolvedValue({ + ok: true, + status: 200, + json: async () => mockData, + }); + + const result = await cachedGet('/endpoint'); + expect(result).toEqual(mockData); + expect(global.fetch).toHaveBeenCalledWith( + '/endpoint', + expect.objectContaining({ method: 'GET' }) + ); + }); + }); + + describe('cachedPost', () => { + it('makes POST requests without caching', async () => { + const mockData = { success: true }; + global.fetch = vi.fn().mockResolvedValue({ + ok: true, + status: 200, + json: async () => mockData, + }); + + const body = { data: 'test' }; + const result = await cachedPost('/endpoint', body); + expect(result).toEqual(mockData); + + const cached = persistentCache.getCacheEntry('api__endpoint'); + expect(cached).toBeNull(); + }); + }); + + describe('Cache TTL configuration', () => { + it('uses configured TTL for endpoints', async () => { + const mockData = { data: 'value' }; + global.fetch = vi.fn().mockResolvedValue({ + ok: true, + status: 200, + json: async () => mockData, + }); + + await 
cachedFetch('/stats'); + const metadata = persistentCache.getCacheMetadata('api__stats'); + expect(metadata.ttl).toBe(5 * 60 * 1000); + }); + + it('supports custom pattern registration', async () => { + registerCachePattern('/custom', 60000); + + const mockData = { data: 'value' }; + global.fetch = vi.fn().mockResolvedValue({ + ok: true, + status: 200, + json: async () => mockData, + }); + + await cachedFetch('/custom'); + const metadata = persistentCache.getCacheMetadata('api__custom'); + expect(metadata.ttl).toBe(60000); + }); + }); + + describe('Error handling', () => { + it('handles timeout gracefully', async () => { + global.fetch = vi.fn( + () => new Promise(resolve => setTimeout(resolve, 30000)) + ); + + await expect( + cachedFetch('/endpoint', { timeout: 100 }) + ).rejects.toThrow(); + }); + + it('distinguishes between timeout and network error', async () => { + const cachedData = { fallback: true }; + persistentCache.setCacheEntry('api__endpoint', cachedData, 300000); + + global.fetch = vi.fn( + () => new Promise(resolve => setTimeout(resolve, 20000)) + ); + + const result = await cachedFetch('/endpoint', { timeout: 100 }); + expect(result).toEqual(cachedData); + }); + }); + + describe('Integration', () => { + it('handles full lifecycle', async () => { + const mockData = { stats: { count: 100 } }; + global.fetch = vi.fn().mockResolvedValue({ + ok: true, + status: 200, + json: async () => mockData, + }); + + const result1 = await cachedFetch('/stats'); + expect(result1).toEqual(mockData); + + global.fetch = vi.fn().mockRejectedValue(new Error('API down')); + + const result2 = await cachedFetch('/stats'); + expect(result2).toEqual(mockData); + }); + }); +}); diff --git a/frontend/src/lib/persistentCache.js b/frontend/src/lib/persistentCache.js new file mode 100644 index 00000000..7a6ce755 --- /dev/null +++ b/frontend/src/lib/persistentCache.js @@ -0,0 +1,206 @@ +/** + * @module lib/persistentCache + * + * Persistent cache layer using localStorage for 
read-heavy view data. + * + * Stores successful API responses with timestamps and TTL metadata, + * enabling graceful fallback to cached data when live APIs are unavailable. + * + * Cache entries include: + * - data: the cached response payload + * - timestamp: when the entry was cached (ms since epoch) + * - ttl: time-to-live in milliseconds + * - version: schema version for migration support + */ + +const CACHE_VERSION = 1; +const STORAGE_KEY_PREFIX = 'tipstream_cache_'; + +/** + * Generate a storage key for a cache entry. + * + * @param {string} cacheKey - The logical cache key. + * @returns {string} Storage key for localStorage. + */ +function getStorageKey(cacheKey) { + return `${STORAGE_KEY_PREFIX}${cacheKey}`; +} + +/** + * Store a value in persistent cache. + * + * @param {string} key - Logical cache key. + * @param {*} data - Data to cache. + * @param {number} ttlMs - Time-to-live in milliseconds. + * @returns {boolean} True if cached successfully. + */ +export function setCacheEntry(key, data, ttlMs = 5 * 60 * 1000) { + if (!key || ttlMs <= 0) { + return false; + } + + try { + const entry = { + data, + timestamp: Date.now(), + ttl: ttlMs, + version: CACHE_VERSION, + }; + localStorage.setItem(getStorageKey(key), JSON.stringify(entry)); + return true; + } catch (err) { + console.error('Failed to cache entry:', err.message); + return false; + } +} + +/** + * Retrieve a value from persistent cache if not expired. + * + * @param {string} key - Logical cache key. + * @returns {*|null} Cached data, or null if not found or expired. 
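+ *
+ * @example
+ * setCacheEntry('stats', { total: 100 }, 60000);
+ * getCacheEntry('stats'); // { total: 100 } until the 60s TTL elapses, then null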
+ */
+export function getCacheEntry(key) {
+  if (!key) {
+    return null;
+  }
+
+  try {
+    const stored = localStorage.getItem(getStorageKey(key));
+    if (!stored) {
+      return null;
+    }
+
+    const entry = JSON.parse(stored);
+    if (!entry || entry.version !== CACHE_VERSION) {
+      return null;
+    }
+
+    const age = Date.now() - entry.timestamp;
+    if (age > entry.ttl) {
+      localStorage.removeItem(getStorageKey(key));
+      return null;
+    }
+
+    return entry.data;
+  } catch (err) {
+    console.error('Failed to retrieve cache entry:', err.message);
+    return null;
+  }
+}
+
+/**
+ * Get metadata about a cached entry (timestamp and TTL).
+ *
+ * @param {string} key - Logical cache key.
+ * @returns {Object|null} { timestamp, ttl, age, isExpired } or null.
+ */
+export function getCacheMetadata(key) {
+  if (!key) {
+    return null;
+  }
+
+  try {
+    const stored = localStorage.getItem(getStorageKey(key));
+    if (!stored) {
+      return null;
+    }
+
+    const entry = JSON.parse(stored);
+    if (!entry || entry.version !== CACHE_VERSION) {
+      return null;
+    }
+
+    const now = Date.now();
+    const age = now - entry.timestamp;
+    const isExpired = age > entry.ttl;
+
+    return {
+      timestamp: entry.timestamp,
+      ttl: entry.ttl,
+      age,
+      isExpired,
+      expiresAt: entry.timestamp + entry.ttl,
+    };
+  } catch (err) {
+    console.error('Failed to retrieve cache metadata:', err.message);
+    return null;
+  }
+}
+
+/**
+ * Delete a cache entry.
+ *
+ * @param {string} key - Logical cache key.
+ * @returns {boolean} True if an entry existed and was deleted.
+ */
+export function clearCacheEntry(key) {
+  if (!key) {
+    return false;
+  }
+
+  try {
+    const storageKey = getStorageKey(key);
+    // Report false for keys that were never cached, so callers can
+    // distinguish a real deletion from a no-op (the test suite relies
+    // on this distinction).
+    if (localStorage.getItem(storageKey) === null) {
+      return false;
+    }
+    localStorage.removeItem(storageKey);
+    return true;
+  } catch (err) {
+    console.error('Failed to clear cache entry:', err.message);
+    return false;
+  }
+}
+
+/**
+ * Clear all TipStream cache entries.
+ *
+ * @returns {number} Number of entries cleared.
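+ *
+ * @example
+ * setCacheEntry('one', { data: 1 });
+ * setCacheEntry('two', { data: 2 });
+ * clearAllCache(); // returns 2; only tipstream_cache_* keys are removed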
+ */ +export function clearAllCache() { + try { + const keysToRemove = []; + for (let i = 0; i < localStorage.length; i++) { + const key = localStorage.key(i); + if (key && key.startsWith(STORAGE_KEY_PREFIX)) { + keysToRemove.push(key); + } + } + + keysToRemove.forEach(key => localStorage.removeItem(key)); + return keysToRemove.length; + } catch (err) { + console.error('Failed to clear all cache:', err.message); + return 0; + } +} + +/** + * Get statistics about the cache. + * + * @returns {Object} { totalEntries, totalSize, entries } + */ +export function getCacheStats() { + try { + const entries = []; + let totalSize = 0; + + for (let i = 0; i < localStorage.length; i++) { + const key = localStorage.key(i); + if (key && key.startsWith(STORAGE_KEY_PREFIX)) { + const stored = localStorage.getItem(key); + totalSize += stored ? stored.length : 0; + entries.push({ + key: key.replace(STORAGE_KEY_PREFIX, ''), + size: stored ? stored.length : 0, + metadata: getCacheMetadata(key.replace(STORAGE_KEY_PREFIX, '')), + }); + } + } + + return { + totalEntries: entries.length, + totalSize, + entries, + }; + } catch (err) { + console.error('Failed to get cache stats:', err.message); + return { totalEntries: 0, totalSize: 0, entries: [] }; + } +} diff --git a/frontend/src/lib/persistentCache.test.js b/frontend/src/lib/persistentCache.test.js new file mode 100644 index 00000000..504698d6 --- /dev/null +++ b/frontend/src/lib/persistentCache.test.js @@ -0,0 +1,201 @@ +import { describe, it, expect, beforeEach, afterEach, vi } from 'vitest'; +import { + setCacheEntry, + getCacheEntry, + getCacheMetadata, + clearCacheEntry, + clearAllCache, + getCacheStats, +} from './persistentCache'; + +describe('Persistent Cache', () => { + beforeEach(() => { + localStorage.clear(); + }); + + afterEach(() => { + localStorage.clear(); + }); + + describe('setCacheEntry and getCacheEntry', () => { + it('stores and retrieves data', () => { + const data = { stats: { total: 100 } }; + 
expect(setCacheEntry('stats', data, 60000)).toBe(true); + expect(getCacheEntry('stats')).toEqual(data); + }); + + it('returns null for non-existent entries', () => { + expect(getCacheEntry('nonexistent')).toBeNull(); + }); + + it('returns null for expired entries', () => { + vi.useFakeTimers(); + try { + setCacheEntry('expired', { value: 'data' }, 1000); + vi.advanceTimersByTime(1001); + expect(getCacheEntry('expired')).toBeNull(); + } finally { + vi.useRealTimers(); + } + }); + + it('returns false for invalid keys', () => { + expect(setCacheEntry('', { data: 'test' }, 1000)).toBe(false); + expect(setCacheEntry(null, { data: 'test' }, 1000)).toBe(false); + }); + + it('returns false for invalid TTL', () => { + expect(setCacheEntry('key', { data: 'test' }, 0)).toBe(false); + expect(setCacheEntry('key', { data: 'test' }, -1)).toBe(false); + }); + + it('preserves complex data structures', () => { + const complex = { + array: [1, 2, 3], + nested: { a: { b: { c: 'deep' } } }, + null: null, + bool: true, + }; + setCacheEntry('complex', complex); + expect(getCacheEntry('complex')).toEqual(complex); + }); + }); + + describe('getCacheMetadata', () => { + it('returns metadata for valid entries', () => { + setCacheEntry('test', { data: 'value' }, 60000); + const metadata = getCacheMetadata('test'); + + expect(metadata).toBeDefined(); + expect(metadata.timestamp).toBeDefined(); + expect(metadata.ttl).toBe(60000); + expect(metadata.age).toBeGreaterThanOrEqual(0); + expect(metadata.isExpired).toBe(false); + expect(metadata.expiresAt).toBeDefined(); + }); + + it('marks expired entries in metadata', () => { + vi.useFakeTimers(); + try { + setCacheEntry('expiring', { data: 'value' }, 1000); + vi.advanceTimersByTime(1001); + const metadata = getCacheMetadata('expiring'); + expect(metadata.isExpired).toBe(true); + } finally { + vi.useRealTimers(); + } + }); + + it('returns null for non-existent entries', () => { + expect(getCacheMetadata('nonexistent')).toBeNull(); + }); + + 
it('tracks age correctly', () => {
+      vi.useFakeTimers();
+      try {
+        setCacheEntry('tracking', { data: 'value' }, 60000);
+        vi.advanceTimersByTime(5000);
+        const metadata = getCacheMetadata('tracking');
+        expect(metadata.age).toBeGreaterThanOrEqual(5000);
+        expect(metadata.age).toBeLessThan(5100);
+      } finally {
+        vi.useRealTimers();
+      }
+    });
+  });
+
+  describe('clearCacheEntry', () => {
+    it('deletes specific entries', () => {
+      setCacheEntry('keep', { data: 'keep' });
+      setCacheEntry('delete', { data: 'delete' });
+
+      expect(clearCacheEntry('delete')).toBe(true);
+      expect(getCacheEntry('keep')).toBeDefined();
+      expect(getCacheEntry('delete')).toBeNull();
+    });
+
+    it('returns false for non-existent entries', () => {
+      expect(clearCacheEntry('nonexistent')).toBe(false);
+    });
+
+    it('returns false for invalid keys', () => {
+      expect(clearCacheEntry('')).toBe(false);
+      expect(clearCacheEntry(null)).toBe(false);
+    });
+  });
+
+  describe('clearAllCache', () => {
+    it('clears all cache entries', () => {
+      setCacheEntry('one', { data: 1 });
+      setCacheEntry('two', { data: 2 });
+      setCacheEntry('three', { data: 3 });
+
+      expect(clearAllCache()).toBe(3);
+      expect(getCacheEntry('one')).toBeNull();
+      expect(getCacheEntry('two')).toBeNull();
+      expect(getCacheEntry('three')).toBeNull();
+    });
+
+    it('preserves non-TipStream entries', () => {
+      localStorage.setItem('other_key', 'other_data');
+      setCacheEntry('tipstream', { data: 'value' });
+
+      clearAllCache();
+
+      expect(localStorage.getItem('other_key')).toBe('other_data');
+      expect(getCacheEntry('tipstream')).toBeNull();
+    });
+  });
+
+  describe('getCacheStats', () => {
+    it('returns statistics for cached entries', () => {
+      setCacheEntry('stats1', { data: 'value1' });
+      setCacheEntry('stats2', { data: 'value2' });
+
+      const stats = getCacheStats();
+      expect(stats.totalEntries).toBe(2);
+      expect(stats.totalSize).toBeGreaterThan(0);
+      expect(stats.entries).toHaveLength(2);
+    });
+
+    it('includes metadata for each entry', () => {
+      setCacheEntry('test', { data: 'value' });
+      const stats = getCacheStats();
+
+      expect(stats.entries[0].key).toBe('test');
+      expect(stats.entries[0].size).toBeGreaterThan(0);
+      expect(stats.entries[0].metadata).toBeDefined();
+    });
+
+    it('handles empty cache', () => {
+      const stats = getCacheStats();
+      expect(stats.totalEntries).toBe(0);
+      expect(stats.totalSize).toBe(0);
+      expect(stats.entries).toHaveLength(0);
+    });
+  });
+
+  describe('Integration', () => {
+    it('handles full lifecycle', () => {
+      const data = { name: 'test', count: 42 };
+      expect(setCacheEntry('lifecycle', data, 5000)).toBe(true);
+      expect(getCacheEntry('lifecycle')).toEqual(data);
+
+      const metadata = getCacheMetadata('lifecycle');
+      expect(metadata.isExpired).toBe(false);
+
+      expect(clearCacheEntry('lifecycle')).toBe(true);
+      expect(getCacheEntry('lifecycle')).toBeNull();
+    });
+
+    it('handles storage quota errors gracefully', () => {
+      // Spy on setItem so the cache write path hits a simulated quota error.
+      const setItemSpy = vi.spyOn(Storage.prototype, 'setItem');
+      setItemSpy.mockImplementation(() => {
+        throw new Error('QuotaExceededError');
+      });
+
+      expect(setCacheEntry('error', { data: 'test' }, 5000)).toBe(false);
+      setItemSpy.mockRestore();
+    });
+  });
+});
diff --git a/frontend/src/lib/resilience.js b/frontend/src/lib/resilience.js
new file mode 100644
index 00000000..3d54b1aa
--- /dev/null
+++ b/frontend/src/lib/resilience.js
@@ -0,0 +1,138 @@
+/**
+ * @module lib/resilience
+ *
+ * Utilities for monitoring and debugging resilience features.
+ *
+ * Provides logging, metrics collection, and diagnostic tools for
+ * cache performance and API resilience.
+ */
+
+import { getCacheStats } from './persistentCache';
+
+/**
+ * Global resilience configuration.
+ */
+const config = {
+  debugMode: false,
+  logCacheOperations: false,
+};
+
+/**
+ * Enable debug logging for resilience operations.
+ *
+ * @param {boolean} enable - Enable or disable debug mode
+ */
+export function setDebugMode(enable) {
+  config.debugMode = enable;
+  if (enable) {
+    console.log('[Resilience] Debug mode enabled');
+  }
+}
+
+/**
+ * Enable logging of cache operations.
+ *
+ * @param {boolean} enable - Enable or disable operation logging
+ */
+export function setOperationLogging(enable) {
+  config.logCacheOperations = enable;
+  if (enable) {
+    console.log('[Resilience] Operation logging enabled');
+  }
+}
+
+/**
+ * Log a resilience event.
+ *
+ * @param {string} source - Source of the event (cache, api, transaction)
+ * @param {string} level - Log level (debug, info, warn, error)
+ * @param {string} message - Log message
+ * @param {Object} data - Additional data to log
+ */
+export function logResilienceEvent(source, level, message, data = {}) {
+  if (!config.debugMode) return;
+
+  const timestamp = new Date().toISOString();
+  const levelUpper = level.toUpperCase();
+  const prefix = `[${timestamp}] [Resilience:${source}:${levelUpper}]`;
+
+  const logFn = {
+    debug: console.log,
+    info: console.info,
+    warn: console.warn,
+    error: console.error,
+  }[level] || console.log;
+
+  logFn(`${prefix} ${message}`, data);
+}
+
+/**
+ * Log a cache operation.
+ *
+ * @param {string} operation - Operation type (get, set, clear, hit, miss)
+ * @param {string} key - Cache key
+ * @param {*} value - Value (optional)
+ */
+export function logCacheOperation(operation, key, value = null) {
+  if (!config.logCacheOperations) return;
+
+  const timestamp = new Date().toISOString();
+  const valueStr = value ? ` (${typeof value === 'object' ? JSON.stringify(value).substring(0, 50) : value})` : '';
+  console.log(`[${timestamp}] [Cache:${operation}] ${key}${valueStr}`);
}

+/**
+ * Get diagnostic report for resilience system.
+ *
+ * @returns {Object} Comprehensive system report
+ */
+export function getDiagnosticReport() {
+  const cacheStats = getCacheStats();
+
+  return {
+    timestamp: new Date().toISOString(),
+    cache: {
+      entries: cacheStats.totalEntries,
+      sizeBytes: cacheStats.totalSize,
+      sizeMb: (cacheStats.totalSize / 1024 / 1024).toFixed(2),
+      quota: {
+        // Assumes the common ~5 MB localStorage quota; actual limits vary by browser.
+        usagePercent: ((cacheStats.totalSize / (5 * 1024 * 1024)) * 100).toFixed(1),
+        warning: cacheStats.totalSize > (4 * 1024 * 1024),
+      },
+      // Per-entry details; kept under a distinct key so it does not clobber the count above.
+      entryDetails: cacheStats.entries,
+    },
+    navigator: typeof navigator !== 'undefined' ? {
+      onLine: navigator.onLine,
+      userAgent: navigator.userAgent.substring(0, 100),
+    } : null,
+    storage: {
+      localStorage: typeof localStorage !== 'undefined' ? {
+        available: true,
+        usage: localStorage.length,
+      } : { available: false },
+    },
+  };
+}
+
+/**
+ * Print diagnostic report to console.
+ *
+ * Used for debugging cache issues and monitoring storage usage.
+ */
+export function printDiagnostics() {
+  const report = getDiagnosticReport();
+  console.group('[Resilience Diagnostics]');
+  console.log('Cache Statistics:', report.cache);
+  console.log('Navigator:', report.navigator);
+  console.log('Storage:', report.storage);
+  console.groupEnd();
+}
+
+/**
+ * Export diagnostic data as JSON.
+ *
+ * @returns {string} JSON string of diagnostic report
+ */
+export function exportDiagnostics() {
+  return JSON.stringify(getDiagnosticReport(), null, 2);
+}
diff --git a/frontend/src/lib/resilience.test.js b/frontend/src/lib/resilience.test.js
new file mode 100644
index 00000000..63cbfcc0
--- /dev/null
+++ b/frontend/src/lib/resilience.test.js
@@ -0,0 +1,177 @@
+import { describe, it, expect, beforeEach, vi } from 'vitest';
+import {
+  setDebugMode,
+  setOperationLogging,
+  logResilienceEvent,
+  logCacheOperation,
+  getDiagnosticReport,
+  printDiagnostics,
+  exportDiagnostics,
+} from './resilience';
+import * as persistentCache from './persistentCache';
+
+describe('Resilience Monitoring Utilities', () => {
+  beforeEach(() => {
+    localStorage.clear();
+    vi.clearAllMocks();
+    setDebugMode(false);
+    setOperationLogging(false);
+  });
+
+  describe('Debug mode', () => {
+    it('enables and disables debug logging', () => {
+      const consoleSpy = vi.spyOn(console, 'log');
+
+      setDebugMode(true);
+      expect(consoleSpy).toHaveBeenCalled();
+
+      consoleSpy.mockClear();
+      setDebugMode(false);
+      expect(consoleSpy).not.toHaveBeenCalled();
+
+      consoleSpy.mockRestore();
+    });
+  });
+
+  describe('Operation logging', () => {
+    it('enables and disables operation logging', () => {
+      const consoleSpy = vi.spyOn(console, 'log');
+
+      setOperationLogging(true);
+      expect(consoleSpy).toHaveBeenCalled();
+
+      consoleSpy.mockRestore();
+    });
+
+    it('logs cache operations when enabled', () => {
+      const consoleSpy = vi.spyOn(console, 'log');
+      setOperationLogging(true);
+
+      logCacheOperation('hit', 'test_key');
+      expect(consoleSpy).toHaveBeenCalledWith(
+        expect.stringContaining('Cache:hit')
+      );
+
+      consoleSpy.mockRestore();
+    });
+
+    it('skips logging when disabled', () => {
+      const consoleSpy = vi.spyOn(console, 'log');
+      setOperationLogging(false);
+
+      logCacheOperation('hit', 'test_key');
+      expect(consoleSpy).not.toHaveBeenCalled();
+
+      consoleSpy.mockRestore();
+    });
+  });
+
+  describe('Event logging', () => {
+    it('logs resilience events when debug enabled', () => {
+      // Spy on console.info: info-level events route through console.info, not console.log.
+      const consoleSpy = vi.spyOn(console, 'info');
+      setDebugMode(true);
+
+      logResilienceEvent('cache', 'info', 'Test message', { data: 'test' });
+      expect(consoleSpy).toHaveBeenCalledWith(
+        expect.stringContaining('Resilience:cache:INFO'),
+        expect.objectContaining({ data: 'test' })
+      );
+
+      consoleSpy.mockRestore();
+    });
+
+    it('uses correct log levels', () => {
+      const warnSpy = vi.spyOn(console, 'warn');
+      const errorSpy = vi.spyOn(console, 'error');
+      setDebugMode(true);
+
+      logResilienceEvent('api', 'warn', 'Warning message');
+      expect(warnSpy).toHaveBeenCalled();
+
+      logResilienceEvent('api', 'error', 'Error message');
+      expect(errorSpy).toHaveBeenCalled();
+
+      warnSpy.mockRestore();
+      errorSpy.mockRestore();
+    });
+  });
+
+  describe('getDiagnosticReport', () => {
+    it('includes cache statistics', () => {
+      persistentCache.setCacheEntry('test', { data: 'value' });
+
+      const report = getDiagnosticReport();
+      expect(report.cache).toBeDefined();
+      expect(report.cache.entries).toBeGreaterThan(0);
+      expect(report.cache.sizeBytes).toBeGreaterThan(0);
+    });
+
+    it('includes timestamp', () => {
+      const report = getDiagnosticReport();
+      expect(report.timestamp).toBeDefined();
+      expect(new Date(report.timestamp).getTime()).toBeGreaterThan(0);
+    });
+
+    it('includes navigator info', () => {
+      const report = getDiagnosticReport();
+      expect(report.navigator).toBeDefined();
+      expect(report.navigator.onLine).toBeDefined();
+    });
+
+    it('includes storage info', () => {
+      const report = getDiagnosticReport();
+      expect(report.storage).toBeDefined();
+      expect(report.storage.localStorage).toBeDefined();
+    });
+
+    it('reports quota usage', () => {
+      const largeData = { array: Array(100000).fill('x') };
+      persistentCache.setCacheEntry('large', largeData);
+
+      const report = getDiagnosticReport();
+      expect(report.cache.quota).toBeDefined();
+      expect(report.cache.quota.usagePercent).toBeDefined();
+    });
+  });
+
+  describe('printDiagnostics', () => {
+    it('prints diagnostic report', () => {
+      const groupSpy = vi.spyOn(console, 'group');
+      const logSpy = vi.spyOn(console, 'log');
+      const groupEndSpy = vi.spyOn(console, 'groupEnd');
+
+      printDiagnostics();
+
+      expect(groupSpy).toHaveBeenCalledWith('[Resilience Diagnostics]');
+      expect(logSpy).toHaveBeenCalled();
+      expect(groupEndSpy).toHaveBeenCalled();
+
+      groupSpy.mockRestore();
+      logSpy.mockRestore();
+      groupEndSpy.mockRestore();
+    });
+  });
+
+  describe('exportDiagnostics', () => {
+    it('exports as valid JSON', () => {
+      persistentCache.setCacheEntry('test', { data: 'value' });
+
+      const json = exportDiagnostics();
+      expect(() => JSON.parse(json)).not.toThrow();
+
+      const parsed = JSON.parse(json);
+      expect(parsed.cache).toBeDefined();
+      expect(parsed.timestamp).toBeDefined();
+    });
+
+    it('includes all report fields', () => {
+      const json = exportDiagnostics();
+      const parsed = JSON.parse(json);
+
+      expect(parsed.timestamp).toBeDefined();
+      expect(parsed.cache).toBeDefined();
+      expect(parsed.navigator).toBeDefined();
+      expect(parsed.storage).toBeDefined();
+    });
+  });
+});
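The `persistentCache` tests above exercise `setCacheEntry`/`getCacheEntry`/`getCacheMetadata` TTL semantics without showing the module itself. As a rough mental model only, here is a minimal in-memory sketch of that TTL behavior; the real module is localStorage-backed with JSON serialization, and the default TTL and lazy-eviction details here are assumptions, not the actual implementation:

```javascript
// In-memory TTL cache sketch (the real persistentCache.js uses localStorage).
// `now` parameters are injectable for deterministic testing.
const store = new Map();

function setCacheEntry(key, data, ttlMs = 120000, now = Date.now()) {
  if (!key) return false; // mirrors the invalid-key behavior the tests assert
  store.set(key, { data, storedAt: now, expiresAt: now + ttlMs });
  return true;
}

function getCacheEntry(key, now = Date.now()) {
  const entry = store.get(key);
  if (!entry) return null;
  if (now > entry.expiresAt) {
    store.delete(key); // lazily evict expired entries on read
    return null;
  }
  return entry.data;
}

function getCacheMetadata(key, now = Date.now()) {
  const entry = store.get(key);
  if (!entry) return null;
  return { age: now - entry.storedAt, isExpired: now > entry.expiresAt };
}
```

This is the shape the fake-timer test relies on: `age` is derived from the stored timestamp, so advancing the clock by 5000 ms yields `age >= 5000` without any background timers.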