DRAFT : PromptStrategy can use Individual ChatModelProviders & set own configuration (llm model, temperature, top_k, top_p...) #6898
Conversation
This PR exceeds the recommended size of 500 lines. Please make sure you are NOT addressing multiple issues with one PR.
✅ Deploy Preview for auto-gpt-docs ready!
Signed-off-by: ph-ausseil <[email protected]>
* Remove has_oa_function_call_api
* Create AbstractChatModelProvider.llm_api_client for dependency injection of the client
* Improve defaulting mechanism when looking for a model name
Related:
I am loving it so far; it sort of works like version 0.4.1, before function calling was a thing.
Hey @Wladastic! You are welcome to help me over the last stretch! If the imports are fixed, the PR will be fully functional. It's code that has been running for over a month on a fork; however, the fork has 30,000+ line changes. I have not tried to integrate it. I just did a bunch of Ctrl+F to fix the imports, but did not bother to run the agent (as there is a 30,000+ line difference). I hope most imports are right and I have not forgotten any important file. If any are missing, hit me up and I will add the missing file. To close the PR, we need:
It's spring and I will be gardening in my spare time, not coding 🙁
This pull request has conflicts with the base branch; please resolve them so we can evaluate the pull request.
Has had conflicts since March and no updates. Let me know if you are still working on this, and I can reopen.
Hi,
The lib is fully functional & works on a fork (which I do not have time to maintain); it's really a very minor effort to have it integrated, most likely fixing imports.
Pwut said it wasn't a matter of "if" but "when".
Pierre
PromptStrategy can use individual ChatModelProviders & set its own configuration (llm model, temperature, top_k, top_p, ...).

Overview:
* ChatModelWrapper that can wrap different ChatModelProviders (including OpenAI); new providers can be created for Gemini, Mixtral...
* PromptManager to create the ChatModelWrapper object; PromptManager is now an intermediary between a PromptStrategy and a ChatModelWrapper.
* PromptStrategy.build_prompt() still returns a ChatPrompt; its data is channeled via a ChatCompletionKwargs object that structures the data exchanged between PromptStrategy, ChatModelWrapper & ChatModelProviders.
* AbstractChatMessage, so each provider can have its own roles & messages.
* LanguageModelFunction (via dependency injection), ChatModelResponse (via extension of an interface) & the arguments passed to the LLM will be formatted as the ChatModelProviders specify, thus enabling different providers & APIs.

Remaining work:
* PromptStrategy & PromptManager had an AgentMixin that adds methods such as set_agent() and _agent; these will need to be added.

More info:
* Messages can be built via OpenAIChatMessage (an evolution of the AGPT ChatMessage for OpenAI; should not work for Gemini) or via LangChain. The choice is left to the developer for now; however, the LangChain dependency has not been added to the poetry configuration.
* To add a provider, extend AbstractChatModelAdapter & implement the chat() method. In the chat() method, any client can be used, starting with the OpenAI client (commented out in the file) or LangChain (if a LangChain dependency is added to the poetry.lock).

New functionalities:
* tool_choice to select a specific tool (for better guidance of the LLM and new use cases).
* A missing tool_call can trigger a new attempt (useful as GPT-3.5 tends not to call functions). This mechanism offers the possibility to force a specific tool (via tool_choice) after X (default: 3) failed attempts.
* Jinja2 integration to build prompts (requires a poetry update).
* Removed has_oa_tool_calls_api / has_function_call_api, on the assumption that providers must offer a function_call API in 2024 (Ollama, Gemini, Mixtral do...).

Considerations:
* ChatPrompt might integrate ChatCompletionKwargs.
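To make the adapter design above concrete, here is a minimal runnable sketch. The names ChatCompletionKwargs, AbstractChatModelAdapter and ChatModelWrapper come from this PR, but the fields, the EchoAdapter stand-in and the create_chat_completion signature are my own simplifications for illustration, not the PR's actual code.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Optional


@dataclass
class ChatCompletionKwargs:
    """Per-call settings a PromptStrategy can choose (hypothetical fields)."""
    model_name: str
    temperature: float = 0.7
    top_p: Optional[float] = None
    tool_choice: Optional[str] = None


class AbstractChatModelAdapter(ABC):
    """Each provider (OpenAI, Gemini, Mixtral, ...) supplies its own adapter."""

    @abstractmethod
    def chat(self, messages: list, kwargs: ChatCompletionKwargs) -> str:
        ...


class EchoAdapter(AbstractChatModelAdapter):
    """Stand-in provider so the sketch runs without an API key or client."""

    def chat(self, messages: list, kwargs: ChatCompletionKwargs) -> str:
        # A real adapter would call its client (OpenAI SDK, LangChain, ...)
        # and translate messages into the provider's own role/message format.
        return f"[{kwargs.model_name}] {messages[-1]['content']}"


class ChatModelWrapper:
    """Delegates to whichever adapter the PromptStrategy was configured with."""

    def __init__(self, adapter: AbstractChatModelAdapter) -> None:
        self.adapter = adapter

    def create_chat_completion(self, messages: list, kwargs: ChatCompletionKwargs) -> str:
        return self.adapter.chat(messages, kwargs)


wrapper = ChatModelWrapper(EchoAdapter())
reply = wrapper.create_chat_completion(
    [{"role": "user", "content": "hello"}],
    ChatCompletionKwargs(model_name="gpt-3.5-turbo", temperature=0.2),
)
print(reply)  # [gpt-3.5-turbo] hello
```

Swapping providers then means instantiating a different adapter; the PromptStrategy only ever sees the wrapper and its ChatCompletionKwargs.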
Disclaimer:
* Can't share unit tests at the moment, but I have shared the run I have on my branch in Discord.
* Not ported to autogpt, but very close to the finish line.
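The retry-then-force-tool mechanism from "New functionalities" can be sketched as follows. The function name, the forced_tool default and the response shape are illustrative assumptions, not the PR's actual API; the idea is simply "retry up to X times, then pin tool_choice".

```python
def complete_with_tool_retry(call_llm, messages, max_attempts=3, forced_tool="finish"):
    """Retry until the model returns a tool call; after max_attempts failed
    attempts, force a specific tool via tool_choice.

    call_llm(messages, tool_choice) -> dict with an optional "tool_calls" key.
    """
    for _ in range(max_attempts):
        response = call_llm(messages, tool_choice=None)
        if response.get("tool_calls"):
            return response  # the model called a tool; nothing to force
    # GPT-3.5 sometimes answers in prose instead of calling a function;
    # as a last resort, pin the tool the model must use.
    return call_llm(messages, tool_choice=forced_tool)


# Demo with a stub "LLM" that only calls a tool when one is forced.
calls = []

def stub_llm(messages, tool_choice):
    calls.append(tool_choice)
    if tool_choice is None:
        return {"content": "I would rather chat than call a tool."}
    return {"tool_calls": [{"name": tool_choice, "arguments": {}}]}

result = complete_with_tool_retry(stub_llm, [{"role": "user", "content": "do it"}])
print(result["tool_calls"][0]["name"])  # finish
```

The stub never calls a tool on its own, so after three plain attempts the helper forces tool_choice, mirroring the PR's fallback for models that tend not to call functions.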