Engine Core demonstrates a pattern for enabling LLMs to undertake tasks of a given scope using a dynamic system prompt and a collection of tool functions. We call these chat strategies.
Chat strategies let you dynamically alter the chat history, system prompt, and available tools on every run.
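As a rough illustration of the idea, a strategy can be modeled as an object that rebuilds its system prompt and tool list from the current history on each turn. The names and shapes below are hypothetical, not the project's actual API:

```typescript
// Hypothetical sketch of a chat strategy. All names here are illustrative.
type Message = { role: "system" | "user" | "assistant"; content: string };

type Tool = {
  name: string;
  description: string;
  run: (input: string) => Promise<string>;
};

interface ChatStrategy {
  // Called on every run, so the prompt can reflect the current conversation.
  buildSystemPrompt(history: Message[]): string;
  // Also called on every run, so the available tools can change dynamically.
  tools(history: Message[]): Tool[];
}

const exampleStrategy: ChatStrategy = {
  buildSystemPrompt: (history) =>
    `You are a helpful assistant. Turns so far: ${history.length}.`,
  tools: () => [
    {
      name: "echo",
      description: "Echoes its input back",
      run: async (input) => input,
    },
  ],
};
```

Because both methods receive the history, a strategy can, for example, expose different tools once the conversation reaches a certain state.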
This project includes three example strategies:
- `demoStrategy` - a simple illustrative example that serves as a starting point for creating new strategies.
- `backendStrategy` - a slightly more comprehensive example in which the LLM works on a local Fastify app (running on http://localhost:8080) to create database migrations and API endpoints.
- `shellStrategy` - an LLM-powered shell that can write files and run processes.
Additionally, we have extracted the LLM integrations (e.g., Anthropic or OpenAI) into adapters, which allow you to run the same app code and strategies while switching foundation models.
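One way to picture the adapter layer: strategies talk to a single completion interface, and each provider implements it. The interface and the fake adapter below are hypothetical sketches, not the project's actual code:

```typescript
// Hypothetical adapter interface; names are illustrative.
type Message = { role: "system" | "user" | "assistant"; content: string };

interface LlmAdapter {
  readonly model: string;
  complete(systemPrompt: string, messages: Message[]): Promise<string>;
}

// A fake adapter standing in for a real OpenAI or Anthropic client,
// so the surrounding code can be exercised without network calls.
const fakeAdapter: LlmAdapter = {
  model: "fake-model",
  complete: async (systemPrompt, messages) =>
    `[${systemPrompt}] last: ${messages[messages.length - 1]?.content ?? ""}`,
};

// App code and strategies depend only on the interface, so swapping
// foundation models means swapping the adapter instance.
async function runTurn(adapter: LlmAdapter, user: string): Promise<string> {
  return adapter.complete("You are helpful.", [{ role: "user", content: user }]);
}
```

The design choice here is the usual one: keep provider-specific request/response handling inside each adapter so strategies never branch on which model is running.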
- Ensure Docker is installed and running
- Copy `.env.example` to `.env` and add at least one of `OPENAI_API_KEY` or `ANTHROPIC_API_KEY`
- Run `bin/cli`
- Select an LLM for which you have provided an API key
- Type `help` to see what you can do
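The steps above can be sketched as a shell session. This is a setup fragment run from the repo root; the key value is a placeholder you must replace:

```shell
# Assumes Docker is already installed and running.
cp .env.example .env
# Add at least one provider key (placeholder value shown):
echo "OPENAI_API_KEY=sk-..." >> .env
# Start the CLI, then pick a model and type `help` at the prompt:
bin/cli
```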
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.