Releases: twinnydotdev/twinny

3.10.0

28 Mar 13:08
  • Enable a fully configurable API for both chat and FIM endpoints
  • Remove defaults that caused confusion; users can now add their own
  • Add support for custom FIM templates (see the sketch below)
  • Add LiteLLM support
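A custom FIM template controls how the text before and after the cursor is assembled into the prompt format a given model expects. The snippet below is a minimal sketch of that idea; the sentinel tokens follow the CodeLlama infill convention, and the function and placeholder names are illustrative rather than twinny's actual template syntax.

```typescript
// Minimal sketch of a fill-in-the-middle (FIM) prompt template.
// Token names follow the CodeLlama infill convention; other models
// use different sentinel tokens, which is why a configurable
// template is useful.
interface FimContext {
  prefix: string; // code before the cursor
  suffix: string; // code after the cursor
}

// Hypothetical template string with {prefix}/{suffix} placeholders.
const codeLlamaTemplate = "<PRE> {prefix} <SUF>{suffix} <MID>";

function renderFimPrompt(template: string, ctx: FimContext): string {
  return template
    .replace("{prefix}", ctx.prefix)
    .replace("{suffix}", ctx.suffix);
}

// Example: completing the body of a function.
const prompt = renderFimPrompt(codeLlamaTemplate, {
  prefix: "function add(a: number, b: number) {\n  return ",
  suffix: "\n}",
});
```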

v3.8.0

19 Mar 13:28
  • Support automatic multiline completions
  • Enable multiline completions by default
  • Keep an option to disable them
  • Use more sophisticated heuristics to decide when a completion should span multiple lines (illustrated in the sketch below)
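A common way to decide between single-line and multiline completion is to inspect the code around the cursor. The heuristic below is purely illustrative and is not twinny's actual logic; it opts into multiline output when the cursor sits at the end of a line that opens a block and nothing follows it.

```typescript
// Illustrative heuristic only -- not twinny's actual implementation.
// Returns true when a multiline completion seems appropriate, e.g. the
// cursor is at the end of a line that opens a new block.
function shouldCompleteMultiline(
  lineBeforeCursor: string,
  lineAfterCursor: string
): boolean {
  const opensBlock = /[{[(:]\s*$/.test(lineBeforeCursor);
  const restIsEmpty = lineAfterCursor.trim().length === 0;
  return opensBlock && restIsEmpty;
}

// Example: an opened function body suggests a multiline completion,
// an unfinished assignment does not.
shouldCompleteMultiline("function greet(name: string) {", ""); // true
shouldCompleteMultiline("const x = ", "");                     // false
```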

v3.7.0

27 Feb 20:01

Updates Ollama chat completions to use the OpenAI-compatible chat specification at /v1/chat/completions.

This is a minor update that affects the chat completions API.

If you were previously using /api/generate or /api/chat for Ollama chat completions, please change the endpoint to /v1/chat/completions or it will no longer work.
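For reference, an OpenAI-style chat request against a local Ollama server looks roughly like the sketch below. The host and port follow Ollama's default (11434), and the model name is an assumption for illustration, not tied to any particular twinny configuration.

```typescript
// Minimal OpenAI-style chat completion request against a local Ollama
// server. The model name is an assumption for illustration.
async function chat(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "codellama:7b-instruct", // any model pulled into Ollama
      messages: [{ role: "user", content: prompt }],
      stream: false,
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}

chat("Explain fill-in-the-middle completion in one sentence.")
  .then(console.log);
```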

v3.6.6

23 Feb 21:28

  • Better context
  • Style updates
  • Bug fixes
  • Other features

v3.5.0

08 Feb 20:34
  • Add a new-document button to code blocks
  • Fix some style issues with code blocks
  • New button styling in code blocks
  • Add LMStudio support
  • Automatically set the port and path when selecting a provider (see the sketch below)
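Local providers tend to listen on well-known defaults, which is what makes auto-filling the port and path possible. The mapping below is a sketch using commonly cited defaults (Ollama on 11434, LM Studio on 1234, llama.cpp's server on 8080); the exact values and names twinny uses may differ.

```typescript
// Sketch of provider defaults; the values are common out-of-the-box
// ports/paths for each tool, not necessarily what twinny writes.
type Provider = "ollama" | "lmstudio" | "llamacpp";

interface ProviderDefaults {
  port: number;
  chatPath: string;
}

const PROVIDER_DEFAULTS: Record<Provider, ProviderDefaults> = {
  ollama: { port: 11434, chatPath: "/v1/chat/completions" },
  lmstudio: { port: 1234, chatPath: "/v1/chat/completions" },
  llamacpp: { port: 8080, chatPath: "/completion" },
};

function applyProviderDefaults(provider: Provider): ProviderDefaults {
  return PROVIDER_DEFAULTS[provider];
}
```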

v3.4.0

06 Feb 20:50
  • Add and edit custom templates
  • Choose default templates for the chat window

v3.1.0

01 Feb 20:29

Major refactor of types, event handlers, and more.

v3.0.0

25 Jan 14:46
  • Added support for hosted llama.cpp servers
  • Added configuration options for separate FIM and chat completion endpoints, since a llama.cpp server can only host one model at a time and FIM and chat do not work interchangeably with the same model (see the sketch below)
  • Some settings have been renamed, but the defaults stay the same
  • Removed support for Deepseek models, as they were causing code smell inside the prompt templates (model support needs improvement)
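Because one llama.cpp server instance serves a single model, running a chat model and a FIM model side by side means pointing each feature at its own server. The configuration shape below is a sketch with hypothetical field names, ports, and paths, not twinny's actual settings schema.

```typescript
// Sketch only: hypothetical shape for keeping chat and FIM endpoints
// separate when each llama.cpp server instance hosts a single model.
interface EndpointConfig {
  hostname: string;
  port: number;
  path: string;
}

const chatEndpoint: EndpointConfig = {
  hostname: "localhost",
  port: 8080, // llama.cpp server running an instruct/chat model
  path: "/completion",
};

const fimEndpoint: EndpointConfig = {
  hostname: "localhost",
  port: 8081, // second llama.cpp server running a code/FIM model
  path: "/completion",
};

function endpointUrl(cfg: EndpointConfig): string {
  return `http://${cfg.hostname}:${cfg.port}${cfg.path}`;
}
```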

v2.6.14

22 Jan 15:00

Enabled cancellation of the model download when starting twinny, and added an option to re-enable it.

v2.6.13

21 Jan 19:40
  • Add an option to click the status bar icon to stop generation and destroy the stream (see the sketch below)
  • Add max token settings for FIM and chat to the options
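Stopping generation mid-stream is typically done by aborting the underlying HTTP request. The sketch below shows the general pattern with an AbortController and a streaming fetch; the endpoint, model name, and wiring are assumptions for illustration, not twinny's actual code.

```typescript
// General pattern for cancelling a streaming completion request.
// The endpoint is illustrative; any streaming HTTP API works the same way.
const controller = new AbortController();

async function streamCompletion(prompt: string): Promise<void> {
  const response = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "codellama:7b-instruct",
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
    signal: controller.signal, // aborting rejects the pending read and destroys the stream
  });

  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    process.stdout.write(decoder.decode(value));
  }
}

// e.g. wired to a status bar click handler:
// controller.abort();
```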