3 | 3 | ;; Copyright (C) 2023 Karthik Chikmagalur
4 | 4 |
5 | 5 | ;; Author: Karthik Chikmagalur <[email protected]>
6 |   | -;; Version: 0.9.7
  | 6 | +;; Version: 0.9.8
7 | 7 | ;; Package-Requires: ((emacs "27.1") (transient "0.7.4") (compat "29.1.4.1"))
8 | 8 | ;; Keywords: convenience, tools
9 | 9 | ;; URL: https://github.com/karthink/gptel

36 | 36 | ;;
37 | 37 | ;; - The services ChatGPT, Azure, Gemini, Anthropic AI, Anyscale, Together.ai,
38 | 38 | ;; Perplexity, OpenRouter, Groq, PrivateGPT, DeepSeek, Cerebras,
39 |    | -;; Github Models, xAI and Kagi (FastGPT & Summarizer).
   | 39 | +;; Github Models, Novita AI, xAI and Kagi (FastGPT & Summarizer).
40 | 40 | ;; - Local models via Ollama, Llama.cpp, Llamafiles or GPT4All
41 | 41 | ;;
42 | 42 | ;; Additionally, any LLM service (local or remote) that provides an

51 | 51 | ;; - Supports conversations and multiple independent sessions.
52 | 52 | ;; - Supports tool-use to equip LLMs with agentic capabilities.
53 | 53 | ;; - Supports multi-modal models (send images, documents).
   | 54 | +;; - Supports "reasoning" content in LLM responses.
54 | 55 | ;; - Save chats as regular Markdown/Org/Text files and resume them later.
55 | 56 | ;; - You can go back and edit your previous prompts or LLM responses when
56 | 57 | ;; continuing a conversation. These will be fed back to the model.

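The tool-use support mentioned in the feature list above is built around tools registered with `gptel-make-tool'. Below is a minimal sketch, assuming gptel is installed and a tool-capable model is selected; the tool name, description and body are illustrative only and are not part of this change:

(require 'gptel)

;; Register a simple tool the LLM can call: read the contents of a named buffer.
(gptel-make-tool
 :name "read_buffer"
 :description "Return the text contents of an Emacs buffer"
 :args (list '(:name "buffer" :type string
               :description "Name of the buffer to read"))
 :category "emacs"
 :function (lambda (buffer)
             (if (buffer-live-p (get-buffer buffer))
                 (with-current-buffer buffer
                   (buffer-substring-no-properties (point-min) (point-max)))
               (format "No buffer named %s" buffer))))

Registered tools can then be toggled on from gptel's menu before sending a request.
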
125 | 126 | ;; Include more context with requests:
126 | 127 | ;;
127 | 128 | ;; If you want to provide the LLM with more context, you can add arbitrary
128 |     | -;; regions, buffers or files to the query with `gptel-add'. To add text or
129 |     | -;; media files, call `gptel-add' in Dired or use the dedicated `gptel-add-file'.
    | 129 | +;; regions, buffers, files or directories to the query with `gptel-add'. To add
    | 130 | +;; text or media files, call `gptel-add' in Dired or use the dedicated
    | 131 | +;; `gptel-add-file'.
130 | 132 | ;;
131 |     | -;; You can also add context from gptel's menu instead (gptel-send with a prefix
132 |     | -;; arg), as well as examine or modify context.
    | 133 | +;; You can also add context from gptel's menu instead (`gptel-send' with a
    | 134 | +;; prefix arg), as well as examine or modify context.
133 | 135 | ;;
134 | 136 | ;; When context is available, gptel will include it with each LLM query.
135 | 137 | ;;

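To make the context workflow in the hunk above concrete, here is a minimal sketch; it assumes gptel is installed and configured, and the file path is hypothetical:

(require 'gptel-context)  ; provides `gptel-add' and `gptel-add-file'

;; Interactively: select a region (or mark files in Dired) and call M-x gptel-add.
;; From Lisp, a text or media file can be added to the context directly:
(gptel-add-file "~/notes/project-brief.org")

;; Until it is removed, this context is included with every subsequent query.
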
156 | 158 | ;; will always use these settings, allowing you to create mostly reproducible
157 | 159 | ;; LLM chat notebooks.
158 | 160 | ;;
159 |     | -;; Finally, gptel offers a general purpose API for writing LLM interactions
160 |     | -;; that suit your workflow, see `gptel-request'.
    | 161 | +;; Finally, gptel offers a general purpose API for writing LLM interactions that
    | 162 | +;; suit your workflow. See `gptel-request' and `gptel-fsm' for more advanced
    | 163 | +;; usage.
161 | 164 |
162 | 165 | ;;; Code:
163 | 166 | (declare-function markdown-mode "markdown-mode")
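Finally, a usage sketch for the `gptel-request' API referenced in the Commentary above; the prompt is arbitrary, and a configured backend with credentials is assumed:

(require 'gptel)

;; One-shot request: the callback receives the response (a string on success)
;; and an info plist describing the request.
(gptel-request
 "Summarize the Unix philosophy in one sentence."
 :callback (lambda (response info)
             (if (stringp response)
                 (message "gptel: %s" response)
               (message "gptel request failed: %s" (plist-get info :status)))))

When no :callback is supplied, gptel-request inserts the response in the requesting buffer instead.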