A full-stack application for managing and analyzing customer reviews with AI-powered responses.
RevEase is a full-stack application that helps businesses ingest, analyze, search, and respond to customer reviews.
- Review Management: Ingest and manage customer reviews with session-based isolation
- AI-Powered Analytics: Automatic sentiment analysis and topic categorization
- Smart Search: Find similar reviews using TF-IDF and cosine similarity
- AI Reply Generation: Generate empathetic, context-aware responses to reviews
- Data Visualization: Interactive charts for sentiment and topic analysis
- Bulk Import: Upload reviews via JSON files
- Privacy Focused: Session-based data isolation with automatic cleanup after 30 minutes
- React + TypeScript
- Vite (build tool)
- Tailwind CSS (styling)
- Recharts (data visualization)
- Lucide React (icons)
- FastAPI (Python)
- Supabase PostgreSQL (database)
- Google Gemini API (AI features)
- Node.js 16+
- Python 3.8+
- Supabase account (for database)
- Google Gemini API key (for AI features)
- Navigate to the backend directory:
  cd backend
- Create a virtual environment:
  python -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
- Install dependencies:
  pip install -r requirements.txt
- Set up environment variables:
  cp .env.example .env
  Update the values in .env as needed:
  - SUPABASE_URL: Your Supabase project URL
  - SUPABASE_KEY: Your Supabase service role key
  - GEMINI_API_KEY: Your Google Gemini API key (optional)
- Run the backend:
  python start.py
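For reference, a minimal start.py along these lines would honor the HOST and PORT values from .env. This is only a sketch: it assumes the FastAPI instance is named app inside main.py and that python-dotenv is installed; the actual script in the repository may differ.

import os
import uvicorn
from dotenv import load_dotenv

load_dotenv()  # pull HOST/PORT (and the Supabase/Gemini keys) from .env

if __name__ == "__main__":
    # "main:app" assumes the FastAPI instance is named `app` in main.py.
    uvicorn.run(
        "main:app",
        host=os.getenv("HOST", "localhost"),
        port=int(os.getenv("PORT", "8000")),
        reload=True,
    )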
- Navigate to the frontend directory:
  cd frontend
- Install dependencies:
  npm install
- Set up environment variables (optional):
  cp .env.example .env
  Update VITE_API_URL if your backend is not running on http://localhost:8000
- Run the development server:
  npm run dev
Create the reviews table in your Supabase database:
create table reviews (
id bigint generated by default as identity primary key,
session_id uuid not null,
location text not null,
rating int check (rating >= 1 and rating <= 5),
text text not null,
date timestamp with time zone not null,
sentiment text,
topic text,
reply text,
created_at timestamp with time zone default now()
);
Set up automatic cleanup of old reviews using Supabase cron jobs:
-- Run this in Supabase SQL editor to set up automatic cleanup
-- Could not implement here due to time constraints
select
cron.schedule(
'clean-old-reviews',
'* * * * *', -- Every minute
$$delete from reviews where created_at < now() - interval '30 minutes'$$
);
- GET /health - Health check
- POST /ingest - Store reviews with session_id
- GET /reviews - List reviews with filters and pagination
- GET /reviews/{id} - Fetch review details
- POST /reviews/{id}/suggest-reply - Generate AI reply
- GET /analytics - Get sentiment and topic analytics
- GET /search - Search reviews with TF-IDF
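As a quick smoke test of these endpoints, a script like the one below can ingest a review and pull analytics. It is a hedged sketch: the exact request shapes (whether session_id travels in the body, a query parameter, or a cookie, and the name of the reviews field) are assumptions, so check main.py and models.py for the real contract.

import uuid
import requests

BASE_URL = "http://localhost:8000"      # default backend address
session_id = str(uuid.uuid4())          # normally generated by the frontend and kept in a cookie

# Ingest one review; field names mirror the JSON import format shown below.
review = {
    "location": "Downtown Store",
    "rating": 4,
    "text": "Friendly staff, slightly slow checkout.",
    "date": "2024-01-01T10:00:00Z",
    "topic": "service",
}
resp = requests.post(f"{BASE_URL}/ingest", json={"session_id": session_id, "reviews": [review]})
resp.raise_for_status()

# Fetch sentiment/topic analytics for the same session.
print(requests.get(f"{BASE_URL}/analytics", params={"session_id": session_id}).json())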
- Start both the frontend and backend servers
- Open your browser to http://localhost:5173
- Use the "Use Sample Data" button to populate with sample reviews
- Or use the "Upload JSON" button to import your own review data
- Navigate between the Inbox and Analytics pages using the top navigation
The JSON file should contain an array of review objects with the following structure:
[
{
"location": "Store Location Name",
"rating": 5,
"text": "Review text content",
"date": "2024-01-01T10:00:00Z",
"topic": "service"
}
]
Note: The session_id will be automatically added based on your browser session.
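To sanity-check a file against this format before uploading it, a small standalone script such as the one below can help. It is a hypothetical helper, not part of the repository, and only enforces the fields and rating range shown above.

import json
from datetime import datetime

REQUIRED = {"location", "rating", "text", "date"}

def validate_reviews(path: str) -> None:
    with open(path) as f:
        reviews = json.load(f)
    assert isinstance(reviews, list), "Top-level JSON must be an array of review objects"
    for i, review in enumerate(reviews):
        missing = REQUIRED - review.keys()
        assert not missing, f"Review {i} is missing fields: {missing}"
        assert 1 <= review["rating"] <= 5, f"Review {i}: rating must be between 1 and 5"
        # Dates should be ISO 8601, e.g. 2024-01-01T10:00:00Z
        datetime.fromisoformat(review["date"].replace("Z", "+00:00"))

validate_reviews("reviews.json")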
The frontend is built with React and TypeScript. Key components include:
- InboxPage: Review listing with filters and upload functionality
- DetailPage: Review details and AI reply generation
- AnalyticsPage: Data visualization dashboard
The backend uses FastAPI with Pydantic models. Key modules include:
- main.py: API endpoints
- models.py: Pydantic schemas
- services.py: Business logic
- database.py: Supabase client initialization
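The TF-IDF plus cosine-similarity search behind GET /search (presumably part of services.py) can be sketched roughly as follows; the real code may differ, and the use of scikit-learn here is an assumption.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def search_reviews(query: str, reviews: list[dict], top_k: int = 5) -> list[dict]:
    """Rank reviews by TF-IDF cosine similarity between their text and the query."""
    corpus = [r["text"] for r in reviews]
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform(corpus + [query])      # last row is the query vector
    scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    top = scores.argsort()[::-1][:top_k]                     # indices of the best matches
    return [reviews[i] for i in top]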
For cloud deployment on platforms like Render:
- Set the environment variables in your deployment platform:
  - SUPABASE_URL: Your Supabase project URL
  - SUPABASE_KEY: Your Supabase service role key
  - GEMINI_API_KEY: Your Google Gemini API key (optional)
  - FRONTEND_URL: Your frontend deployment URL (for CORS)
- Deploy the backend as a Python application
- Ensure the Supabase database is accessible
For cloud deployment on Vercel:
- Create a new Vercel project
- Connect your GitHub repository
- Set the build command to npm run build
- Set the output directory to dist
- Add environment variables if needed:
  - VITE_API_URL: Your backend deployment URL
# Supabase configuration
SUPABASE_URL=https://your-project-id.supabase.co
SUPABASE_KEY=your-supabase-anon-or-service-role-key
# Gemini API Key (for AI features) - optional
GEMINI_API_KEY=your-gemini-api-key
# Server configuration
HOST=localhost
PORT=8000
# Frontend URL for CORS (optional, defaults to localhost:5173)
FRONTEND_URL=http://localhost:5173

# API Base URL (optional, defaults to http://localhost:8000)
# VITE_API_URL=https://your-backend-domain.vercel.app

~6–7 hours total within the 24-hour limit. Focused on backend correctness and end-to-end integration. Frontend was kept simple due to timeboxing.
- Limited to 15 rows per batch for analysis/replies
- Chose batching over streaming to stay within quota
- Database on Supabase free tier with a ~500 MB cap (sufficient for demo, but not scalable)
- Backend on Render free tier and frontend on Vercel free tier
- Basic UI (no advanced animations, minimal theming) due to time constraints
- Used simple bar/donut charts for analytics instead of interactive dashboards
- No streaming AI responses: Gemini API streaming is not implemented, so replies are generated in one shot
- Search functionality implemented using TF-IDF
- Very basic search functionality; no relevance tuning beyond raw cosine similarity
- Used UUID session IDs stored in cookies
- Data isolation works, but users lose data after 30 minutes (TTL)
- Chosen for simplicity over building a full auth system
- Could not implement cron job for cleanup due to time constraints
- Chose Supabase SDK for easier integration in a short timeframe
- Less control over raw queries and type handling, requiring extra conversions (UUIDs, timestamps)
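To illustrate the SDK choice and the extra conversions mentioned above, a rough sketch of database.py-style access is shown below. The table name and environment variables match the schema and configuration documented earlier; everything else is an assumption about the actual code.

import os
import uuid
from datetime import datetime, timezone
from supabase import create_client

# Client initialization, as done in database.py.
supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_KEY"])

# The SDK exchanges JSON, so UUIDs and timestamps must be serialized to strings
# on insert and parsed back after select: the "extra conversions" noted above.
row = {
    "session_id": str(uuid.uuid4()),
    "location": "Downtown Store",
    "rating": 5,
    "text": "Great service!",
    "date": datetime.now(timezone.utc).isoformat(),
}
supabase.table("reviews").insert(row).execute()

# Reading back: stored timestamps come back as ISO strings.
rows = supabase.table("reviews").select("*").limit(1).execute().data
created_at = datetime.fromisoformat(rows[0]["created_at"].replace("Z", "+00:00"))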
The project delivers a functional review management system with:
- Review ingestion
- Search and analytics
- AI-assisted replies (Gemini)
- Session-based isolation with TTL cleanup
Despite trade-offs, the app is production-like in architecture while staying within the 6–8 hour scope.