# Multi-Agent Book Writer

A collaborative AI book-writing system built from multiple specialized agents. Each agent plays a specific role in the writing pipeline, and together they produce a complete, polished book draft.

## Project Overview

This project simulates a writing team of AI agents that co-author a book. Each agent has a distinct role:
- Planner: Defines the book structure and chapters
- Researcher: Gathers background information and supporting content
- Writer: Drafts content for each chapter
- Editor: Reviews and polishes the output for grammar and coherence
The agents communicate through shared memory, enabling seamless collaboration.
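The collaboration pattern can be sketched as a sequential pipeline over a shared context dictionary. Class and method names below are illustrative, not the project's actual API:

```python
# Minimal sketch of the shared-memory agent pipeline; names are
# illustrative, not the project's real classes.
class Agent:
    """Base class: each agent reads and updates the shared context."""

    def run(self, context: dict) -> dict:
        raise NotImplementedError


class Planner(Agent):
    def run(self, context: dict) -> dict:
        # A real implementation would call an LLM to produce the outline.
        context["chapters"] = ["Introduction", "Middle", "Conclusion"]
        return context


class Writer(Agent):
    def run(self, context: dict) -> dict:
        # Drafts are derived from whatever the previous agents stored.
        context["drafts"] = [f"Draft of {c}" for c in context["chapters"]]
        return context


def run_pipeline(agents: list, context: dict) -> dict:
    # Sequential pipeline: each agent receives the context produced by
    # the previous one -- this is the "shared memory" collaboration.
    for agent in agents:
        context = agent.run(context)
    return context


result = run_pipeline([Planner(), Writer()], {"title": "Demo Book"})
```

Because every agent receives and returns the same dictionary, later agents automatically see everything earlier agents produced.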
## Tech Stack

| Component | Tool/Library |
|---|---|
| LLMs | Ollama (mistral, llama3, deepseek) |
| Agent Orchestration | Python classes with sequential pipeline |
| Context Sharing | In-memory Python objects |
| HTTP Client | requests library |
| Configuration | YAML |
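Calling a local Ollama server with the `requests` library can be sketched as follows. The endpoint and payload follow Ollama's `/api/generate` REST API; the model name and prompt are placeholders:

```python
# Sketch of a non-streaming Ollama generation call via requests.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    # stream=False asks Ollama to return one JSON object
    # instead of a stream of chunks.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str, timeout: int = 120) -> str:
    resp = requests.post(
        OLLAMA_URL, json=build_payload(model, prompt), timeout=timeout
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    # Requires a running Ollama server with the model pulled.
    print(generate("mistral", "Outline a three-chapter book about tea."))
```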
## Project Structure

```text
multiagent-book-writer/
├── main.py              # Main pipeline orchestrator
├── agents/              # Agent implementations
│   ├── planner.py       # Chapter planning agent
│   ├── researcher.py    # Research gathering agent
│   ├── writer.py        # Content writing agent
│   └── editor.py        # Content editing agent
├── shared/
│   └── context.py       # Shared state management
├── output/
│   └── draft.txt        # Final output (generated)
├── config.yaml          # Configuration file
├── requirements.txt     # Python dependencies
└── README.md            # This file
```
## Prerequisites

- Python 3.8+
- Ollama installed and running
## Installation

1. Clone or download the project:

   ```shell
   cd "c:\Users\Dell\Downloads\Multi-Agent Book Writer"
   ```

2. Pull an LLM model with Ollama:

   ```shell
   ollama pull mistral
   ```

   Alternative options: `ollama pull llama3`, `ollama pull deepseek`

3. Start Ollama (keep it running in a separate terminal):

   ```shell
   ollama serve
   ```

4. Install the Python dependencies:

   ```shell
   pip install -r requirements.txt
   ```
## Usage

Run the complete pipeline:

```shell
python main.py
```

Generate a book with a specific number of chapters:

```shell
python main.py 7
```

The final book draft is saved to `output/draft.txt` with:

- All chapters with proper formatting
- Edited and polished content
- Coherent flow and consistency
## How It Works

1. Planning Phase: the Planner agent creates a structured outline with chapter titles and descriptions
2. Research Phase: the Researcher agent gathers background information, facts, and context for each chapter
3. Writing Phase: the Writer agent creates full chapter drafts based on the research data
4. Editing Phase: the Editor agent polishes each chapter for grammar, clarity, and coherence
5. Output: the final draft is saved to `output/draft.txt`
## Shared Context

All agents share a central context dictionary that includes:

- `title`: Book title
- `chapters`: List of chapter titles and descriptions
- `research`: Dictionary of research data per chapter
- `drafts`: List of chapter drafts
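An illustrative shape for that dictionary (key names from the list above; the values are made-up examples, not real output):

```python
# Example shared context after a short run; values are placeholders.
context = {
    "title": "The History of Tea",
    "chapters": [
        {"title": "Origins", "description": "Early cultivation in China"},
    ],
    "research": {"Origins": "Notes gathered by the Researcher agent..."},
    "drafts": ["Edited text of chapter one..."],
}
```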
## Configuration

Edit `config.yaml` to customize:

- Book settings: title, number of chapters, output format
- Ollama settings: API URL, model selection, timeout
- Agent parameters: temperature and behavior for each agent
- Output settings: directory and filename preferences
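A config file covering those settings might look like this. The key names are assumptions based on the list above, not the project's exact schema:

```yaml
# Illustrative config.yaml; key names are assumptions.
book:
  title: "My AI-Written Book"
  chapters: 5
  format: "txt"
ollama:
  url: "http://localhost:11434"
  model: "mistral"
  timeout: 120
agents:
  writer:
    temperature: 0.8
  editor:
    temperature: 0.3
output:
  directory: "output"
  filename: "draft.txt"
```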
## Troubleshooting

Quick test (generate a short book and inspect the result):

```shell
python main.py 3
type output\draft.txt
```

Connection errors:

- Make sure Ollama is running: `ollama serve`
- Verify it's accessible at `http://localhost:11434`
- Check firewall settings

Memory or model errors:

- Try a smaller model: `ollama pull orca-mini`
- Reduce the chapter count: `python main.py 2`

Slow generation:

- Check whether Ollama is using the GPU: `ollama list --all`
- Consider using a smaller, faster model
## Future Enhancements

Potential enhancements to explore:
- Add LangGraph for advanced orchestration with retries
- Implement feedback loops for iterative refinement
- Build Streamlit UI for real-time monitoring
- Add PDF export functionality
- Integrate with web search APIs for richer research
- Add support for multiple LLMs
- Implement memory persistence (JSON/database)
- Add custom style guides for writing consistency
## Dependencies

See `requirements.txt`:

- `requests`: HTTP library for Ollama API calls
- `ollama`: Ollama Python client library
- `python-dotenv`: Environment variable management (optional)
## License

This project is open source and available for educational and commercial use.
## Notes

- Ollama must be running locally for this project to work
- Generated content quality depends on the selected LLM model
- Generation time depends on model size and hardware capabilities
- All content is generated in memory before being written to disk

Getting started: install the dependencies, start Ollama, and run `python main.py` to generate your first AI-written book!
