Pate is an intelligent reader application that allows you to read FB2 and EPUB books while providing real-time AI-powered translation capabilities. Perfect for language learners and readers who want to enjoy books in their original language while having instant access to translations.
- Support for FB2 and EPUB formats.
- Handles zipped FB2 files.
- AI-powered translation using Ollama.
- Responsive side-by-side view of original and translated text.
- Dark/Light theme support.
- Chapter navigation.
- Clone the repository:

```bash
git clone https://github.com/yourusername/pate.git
cd pate
```
- Install the required dependencies:

```bash
pip install -r requirements.txt
```
- Install Ollama from [ollama.ai](https://ollama.ai) and pull the required model:

```bash
ollama pull aya-expanse:32b-q4_K_M
```
- Make sure the Ollama server is running:

```bash
ollama serve
```
- Start the application:

```bash
make run
# or, from the project root:
PYTHONPATH=$(PWD) python -m streamlit run src/app.py
```
- Open your browser and navigate to `http://localhost:8501`.
- Click the "Upload an FB2, EPUB, or ZIP" button in the sidebar.
- Select your book file (supported formats: .fb2, .epub, .zip).
- Wait for the file to process.
You can also drag and drop the file onto the uploader to load your book.
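For zipped uploads, the app only needs to locate the first `.fb2` entry inside the archive before parsing it. The following is a minimal, standard-library-only sketch of that step; the function names (`extract_fb2_from_zip`, `book_title`) are illustrative, and Pate's actual loader (which uses beautifulsoup4) may differ:

```python
import io
import zipfile
import xml.etree.ElementTree as ET

def extract_fb2_from_zip(data: bytes) -> str:
    """Return the raw XML of the first .fb2 file found in a ZIP archive."""
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        for name in zf.namelist():
            if name.lower().endswith(".fb2"):
                return zf.read(name).decode("utf-8")
    raise ValueError("No .fb2 file found in the archive")

def book_title(fb2_xml: str) -> str:
    """Pull the book title out of the FB2 description block."""
    ns = {"fb": "http://www.gribuser.ru/xml/fictionbook/2.0"}
    root = ET.fromstring(fb2_xml)
    node = root.find(".//fb:book-title", ns)
    return node.text if node is not None else ""
```

The namespace URI is the standard FictionBook 2.0 one; a robust loader would also handle archives with multiple `.fb2` entries or non-UTF-8 encodings.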
- Use the chapter dropdown in the sidebar to select your desired chapter.
- The content will automatically update with the original text and translation.
- Original text appears on the left.
- AI-translated text appears on the right.
- Translations are generated in real-time as you navigate through the content.
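Under the hood, each block of text can be sent to Ollama's HTTP API as you reach it. Below is a minimal sketch of that round trip, assuming Ollama's documented `/api/generate` endpoint on the default port and the prompt template shown in the configuration section; the function names are illustrative, not Pate's actual code:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint
PROMPT_TEMPLATE = ("Translate this text to {language}, "
                   "don't say anything else: {text} \n{language}:")

def build_request(text, language="Russian", model="aya-expanse:32b-q4_K_M"):
    """Assemble the JSON payload for one non-streaming generation call."""
    return {
        "model": model,
        "prompt": PROMPT_TEMPLATE.format(text=text, language=language),
        "stream": False,
    }

def translate(text, **kwargs):
    """Send one block of text to Ollama and return the translated string."""
    payload = json.dumps(build_request(text, **kwargs)).encode("utf-8")
    req = urllib.request.Request(OLLAMA_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `"stream": False`, Ollama returns a single JSON object whose `response` field holds the generated text, which keeps the client code trivial at the cost of waiting for the full translation.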
You can modify the translation settings in `src/app_config.py`:

```python
OLLAMA_MODEL = "aya-expanse:32b-q4_K_M"  # Change the AI model
TRANSLATION_LANGUAGE = "Russian"  # Change target language
TRANSLATION_PROMPT = "Translate this text to {language}, don't say anything else: {text} \n{language}:"  # Change prompt
```
The main translation logic lives in `src/utils/llm_translator`. You can modify this file if necessary (e.g. to use the OpenAI API for translation).
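As an example of such a swap, an OpenAI-based variant could look like the sketch below. It calls the Chat Completions REST endpoint directly via the standard library so no extra package is needed; the helper names and the `gpt-4o-mini` model choice are assumptions for illustration, not part of Pate:

```python
import json
import os
import urllib.request

OPENAI_URL = "https://api.openai.com/v1/chat/completions"

def build_messages(text, language="Russian"):
    """Build a chat-completion message list for one translation request."""
    return [
        {"role": "system",
         "content": f"Translate the user's text to {language}. "
                    "Reply with the translation only."},
        {"role": "user", "content": text},
    ]

def translate_openai(text, language="Russian", model="gpt-4o-mini"):
    """Send a translation request to the OpenAI Chat Completions API."""
    payload = json.dumps({
        "model": model,
        "messages": build_messages(text, language),
    }).encode("utf-8")
    req = urllib.request.Request(
        OPENAI_URL, data=payload,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"})
    with urllib.request.urlopen(req) as resp:
        data = json.loads(resp.read())
    return data["choices"][0]["message"]["content"]
```

Requires an `OPENAI_API_KEY` environment variable; the translated text comes back in `choices[0].message.content` per the Chat Completions response format.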
- Python 3.8+
- Streamlit 1.39.0
- ebooklib 0.18
- beautifulsoup4 4.12.3
- Ollama 0.4.2+
You also need a machine capable of running your selected LLM under Ollama; in my case, that was a PC with an RTX 3090.
Contributions are welcome! Please feel free to submit a Pull Request.
- OpenAI API integration
- Claude/Anthropic API support
- Configurable provider selection
- API key management interface
- Fallback options between providers
- Save translations back to FB2/EPUB files
- Export bilingual versions of books
- Cache translations for faster loading
- Batch translation of multiple chapters
- Translation memory system
- Interactive translation editing
- Translation quality verification
- Glossary management for consistent translations
- Comments and annotations support
- Translation style guidelines enforcement
- Version control for edited translations
- Customizable side-by-side view
- Reading progress tracking
- Bookmarks and notes
- Translation quality rating system
- User preferences persistence