An AI-powered answer engine with a generative UI.
- 🧱 Stack
- 🚀 Quickstart
- 🌐 Deploy
- ✅ Verified models
- App framework: Next.js
- Text streaming / Generative UI: Vercel AI SDK
- Generative Model: OpenAI
- Search API: Tavily AI
- Component library: shadcn/ui
- Headless component primitives: Radix UI
- Styling: Tailwind CSS
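To give a feel for how these pieces fit together, here is a minimal, hypothetical sketch of an answer-engine loop: fetch web results from Tavily, then stream a grounded answer from OpenAI through the Vercel AI SDK. The function names, request shape, and prompt are illustrative assumptions, not Morphic's actual code.

```ts
// Illustrative sketch only -- not Morphic's actual implementation.
// Assumes the Vercel AI SDK packages `ai` and `@ai-sdk/openai`, and a Tavily
// search request roughly shaped as below; check each project's docs for the real API.
import { streamText } from 'ai'
import { openai } from '@ai-sdk/openai'

async function searchTavily(query: string) {
  // Hypothetical request shape for the Tavily Search API.
  const res = await fetch('https://api.tavily.com/search', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ api_key: process.env.TAVILY_API_KEY, query }),
  })
  return res.json()
}

export async function answer(query: string) {
  const results = await searchTavily(query)

  // Stream the answer so the UI can render it token by token.
  const result = await streamText({
    model: openai(process.env.OPENAI_API_MODEL ?? 'gpt-4-turbo'),
    system: 'Answer the question using the provided search results.',
    prompt: `Question: ${query}\nSearch results: ${JSON.stringify(results)}`,
  })

  return result.textStream
}
```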
Fork the repo to your GitHub account, then run the following commands to clone it:
git clone [email protected]:[YOUR_GITHUB_ACCOUNT]/morphic.git
cd morphic
bun i
cp .env.local.example .env.local
Your .env.local file should look like this:
# Used to set the base URL path for OpenAI API requests.
# If you need to set a BASE URL, uncomment and set the following:
# OPENAI_API_BASE=
# Used to set the model for OpenAI API requests.
# If not set, the default is gpt-4-turbo.
# OPENAI_API_MODEL='gpt-4-turbo'
# OpenAI API key retrieved here: https://platform.openai.com/api-keys
OPENAI_API_KEY=[YOUR_OPENAI_API_KEY]
# Tavily API Key retrieved here: https://app.tavily.com/home
TAVILY_API_KEY=[YOUR_TAVILY_API_KEY]
# A separate model can be configured for the writer only. It must be compatible with the OpenAI API.
# USE_SPECIFIC_API_FOR_WRITER=true
# SPECIFIC_API_BASE=
# SPECIFIC_API_KEY=
# SPECIFIC_API_MODEL=
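As a rough illustration of how these variables might be consumed (the exact wiring inside the app may differ), the model providers could be constructed along these lines:

```ts
// Hypothetical sketch of provider selection driven by the env vars above.
import { createOpenAI } from '@ai-sdk/openai'

// Default provider: the official OpenAI API, optionally behind a custom base URL.
const defaultProvider = createOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: process.env.OPENAI_API_BASE, // undefined -> the SDK's default endpoint
})

// The writer switches to the OpenAI-compatible SPECIFIC_* endpoint only when enabled.
const useSpecific = process.env.USE_SPECIFIC_API_FOR_WRITER === 'true'

const writerProvider = useSpecific
  ? createOpenAI({
      apiKey: process.env.SPECIFIC_API_KEY,
      baseURL: process.env.SPECIFIC_API_BASE,
    })
  : defaultProvider

const writerModel = writerProvider(
  useSpecific
    ? process.env.SPECIFIC_API_MODEL ?? 'gpt-4-turbo'
    : process.env.OPENAI_API_MODEL ?? 'gpt-4-turbo'
)
```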
**Note: This project focuses on Generative UI and requires complex output from LLMs. It currently assumes the official OpenAI models are used. While it is possible to configure other OpenAI-compatible models, we don't guarantee that they will work.**
Run the app locally:
bun dev
You can now visit http://localhost:3000.
Host your own live version of Morphic with Vercel.
List of verified models that can be specified for the writer:
- Groq
- LLaMA3 8b
- LLaMA3 70b
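For example, to use Groq's LLaMA3 70b as the writer model through Groq's OpenAI-compatible endpoint, the writer-specific settings could look roughly like this (the base URL and model ID are assumptions; confirm them against Groq's documentation):

```ts
// Hypothetical example: Groq's OpenAI-compatible endpoint as the writer model.
import { createOpenAI } from '@ai-sdk/openai'

const groq = createOpenAI({
  baseURL: 'https://api.groq.com/openai/v1', // corresponds to SPECIFIC_API_BASE
  apiKey: process.env.SPECIFIC_API_KEY,      // your Groq API key
})

// Corresponds to SPECIFIC_API_MODEL; 'llama3-8b-8192' would be the 8b variant.
const writerModel = groq('llama3-70b-8192')
```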