A full-stack AI chat application built with Next.js and Convex, featuring support for multiple AI providers, file attachments, conversation management, and advanced model capabilities such as reasoning, web search, and image generation.
Supported Models:
- OpenAI: GPT-4.1, GPT-4.1 Mini, GPT-4.1 Nano, GPT-4o, GPT-4o Mini, o4-mini (with reasoning)
- Anthropic: Claude 4 Sonnet (with reasoning mode)
- Google: Gemini 2.0 Flash, Gemini 2.5 Flash (with thinking), Gemini 2.5 Pro (with reasoning)
- DeepSeek: R1 (0528), R1 Llama Distilled
- OpenRouter: Access to a wide range of additional models through the OpenRouter API
- Image Generation: GPT ImageGen for AI image creation
- Real-time Streaming: Live response streaming with smooth animations and resumable streams
- Markdown Rendering: Full markdown support with syntax highlighting
- Thread Management: Organize conversations with folders, tags, and pinning
- Thread Branching: Create conversation branches from any message
- Conversation Sharing: Share public links to conversations
- Message Actions: Edit, retry, branch, and regenerate messages
- Model Switching: Change AI models mid-conversation
- Search Integration: Web search capabilities for supported models
- Multiple Formats: Text files, images, PDFs, code files
- Drag & Drop: Intuitive file upload interface
- Visual Previews: In-app preview for images, PDFs, and text files
- Smart Processing: Context-aware file handling based on model capabilities
- Attachment Management: Organize and manage uploaded files
- Model Parameters: Fine-tune temperature, top-p, max tokens, and more
- Reasoning Effort: Control thinking depth for reasoning models
- Voice Input: Speech-to-text integration
- API Key Management: Secure storage and management of API keys
- Gallery View: Browse generated images with multiple view modes
- Attachment Analytics: File storage insights and management
- Framework: Next.js 15.3.3 with App Router
- UI: React 19.1.0 with TypeScript 5
- Styling: Tailwind CSS 4 with custom design system
- Components: shadcn/ui components with custom styling
- State: React Context + Convex real-time subscriptions
- Routing: React Router DOM
- AI SDK: Vercel AI SDK for streaming responses (see the sketch after this list)
- BaaS: Convex for real-time database and API functions
- Authentication: Better Auth with email/password
- File Storage: Convex file storage with signed URLs
- Caching: Redis for session management and caching
- Streaming: Real-time response streaming with resumable connections
- Multi-provider: Unified interface for different AI providers
- Error Handling: Robust error recovery and retry mechanisms
- Token Management: Usage tracking and quota monitoring
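
To make the streaming layer concrete, here is a minimal sketch of a chat route using the Vercel AI SDK. It assumes an AI SDK 4.x-style API (`streamText`, `toDataStreamResponse`); the route path, model id, and parameter values are assumptions, and the project's actual handler, provider selection, and resumable-stream logic will differ.

```typescript
// app/api/chat/route.ts -- illustrative sketch only, not the project's actual handler.
// Route path, model id, and parameter values are assumptions.
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Stream tokens from the provider to the client as they are generated.
  // temperature / topP / maxTokens correspond to the user-tunable model parameters.
  const result = streamText({
    model: openai("gpt-4o-mini"),
    messages,
    temperature: 0.7,
    topP: 1,
    maxTokens: 2048,
  });

  return result.toDataStreamResponse();
}
```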
- Node.js 22+ (required for react-pdf)
- npm/pnpm/yarn
- Convex account
- AI provider API keys (OpenAI, Anthropic, Google, etc.)
- Clone the repository
```bash
git clone https://github.com/nizarlj/nizars-chat.git
cd nizars-chat
```
- Install dependencies
```bash
npm install
# or
pnpm install
```
- Set up Convex
```bash
npx convex dev
```
- Configure environment variables for Next.js
Create `.env.local` and add your API keys:
```env
# Convex
CONVEX_DEPLOYMENT=your-convex-deployment
NEXT_PUBLIC_CONVEX_URL=your-convex-url # ends with .cloud
NEXT_PUBLIC_CONVEX_SITE_URL=your-site-url # ends with .site

# AI Provider Keys
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
GOOGLE_API_KEY=your-google-key
OPENROUTER_API_KEY=your-openrouter-key

# Redis (optional)
REDIS_URL=your-redis-url

# Auth
BETTER_AUTH_SECRET=your-auth-secret
```
- Configure environment variables for Convex
```env
# Auth & Security
BETTER_AUTH_SECRET=your-auth-secret
SITE_URL=http://localhost:3000
API_KEY_ENCRYPTION_KEY=your-encryption-key
```
- Run the development server
```bash
pnpm run dev
```
This starts both the Next.js frontend and Convex backend concurrently.

Available scripts:
```bash
npm run dev:frontend # Start Next.js only
npm run dev:backend  # Start Convex only
npm run dev          # Start both concurrently
npm run build        # Build for production
npm run start        # Start production server
npm run lint         # Run ESLint
```

Models are configured in `src/lib/models.ts` with capabilities and parameters:
```typescript
{
  id: "gpt-4.1",
  name: "GPT-4.1",
  provider: "openai",
  capabilities: {
    vision: true,
    reasoning: false,
    maxTokens: 32768,
    contextWindow: 1047576
  }
}
```

The Convex schema defines the data structure:
- Users: Authentication and profile data
- Threads: Conversation containers with metadata
- Messages: Individual chat messages with AI responses
- Attachments: File uploads with metadata
- API Keys: Encrypted user API keys
- User Preferences: Settings and model preferences
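
For illustration, a Convex schema along these lines could be declared as follows; the table fields and indexes shown are assumptions, not the project's actual definitions in `convex/schema.ts`.

```typescript
// convex/schema.ts -- hypothetical sketch; field names and indexes are assumptions.
import { defineSchema, defineTable } from "convex/server";
import { v } from "convex/values";

export default defineSchema({
  // Conversation containers with metadata (folders, tags, pinning).
  threads: defineTable({
    userId: v.string(),
    title: v.string(),
    pinned: v.boolean(),
  }).index("by_user", ["userId"]),

  // Individual chat messages, linked to their thread.
  messages: defineTable({
    threadId: v.id("threads"),
    role: v.union(v.literal("user"), v.literal("assistant")),
    content: v.string(),
    model: v.optional(v.string()),
  }).index("by_thread", ["threadId"]),
});
```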
Uses Better Auth for secure authentication (see the configuration sketch after this list):
- Email/password authentication
- Session management with Convex
- Protected routes and API endpoints
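
A minimal Better Auth setup for email/password might look like the sketch below; the actual configuration (database adapter, Convex integration, session options) is not shown and will differ.

```typescript
// lib/auth.ts -- illustrative sketch only; the real setup also wires in a
// database adapter and the Convex/session integration.
import { betterAuth } from "better-auth";

export const auth = betterAuth({
  emailAndPassword: {
    enabled: true,
  },
});
```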
For detailed instructions on deploying to Vercel with Convex, see the official Convex Vercel hosting guide.
- Connect your GitHub repository to Vercel
- Set up environment variables in Vercel dashboard
- Deploy with automatic CI/CD
- Build the application:
```bash
npm run build
```
- Set up a Node.js server with environment variables
- Configure Convex deployment for production
- Set up Redis for session storage
Ensure all required environment variables are configured:
- Convex deployment URL and credentials
- AI provider API keys
- Redis connection string
- Authentication secrets
Feel free to:
- Fork the repository
- Make your changes
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.
- Vercel: Next.js framework and AI SDK
- Convex: Real-time backend infrastructure
- shadcn/ui: Beautiful component library
- Tailwind CSS: Utility-first CSS framework
- AI Providers: OpenAI, Anthropic, Google, DeepSeek, and others
- Open Source Community: Various libraries and tools used
Built by Nizar 🥸