Charla: Chat with Language Models in a Terminal


Charla is a terminal-based chat application that integrates with Ollama, a backend designed to serve language models. To use Charla, make sure the Ollama server is running and at least one language model is installed.
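For example, assuming Ollama is already installed, you can start the server and download a model with the standard Ollama CLI commands (phi3 is only an example model name):

ollama serve
ollama pull phi3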


Installation

Install Charla using pipx:

pipx install charla

Usage

Launch the chat console by typing charla in your terminal, or view all available command line options with charla -h.
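For example:

charla
charla -h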

Features

  • Terminal-based chat system that supports context-aware conversations using local language models.
  • Chat sessions are saved as Markdown files in the user's documents directory when a chat ends.
  • Prompt history is saved, and previously entered prompts are auto-suggested.
  • Switch between single-line and multi-line input modes without interrupting your chat session.

Development

Run the command-line interface directly from the project source without installing the package:

python -m charla.cli
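A minimal development setup might look like the following; the editable install step is an assumption about the project's packaging, not taken from the source:

git clone https://github.com/yaph/charla.git
cd charla
pip install -e .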

Ollama API

Installed models:

curl http://localhost:11434/api/tags

Model info:

curl http://localhost:11434/api/show -d '{"name": "phi3"}'
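For reference, a prompt can also be sent directly to a model through Ollama's generate endpoint; the model name and prompt below are placeholders, and the request shape follows the standard Ollama API:

curl http://localhost:11434/api/generate -d '{"model": "phi3", "prompt": "Hello", "stream": false}'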

License

Charla is distributed under the terms of the MIT license.