5663 ollama client host #5674
base: main
Conversation
…querytime when passing host (and other) kwargs to the OllamaChatCompletionClient constructor
…y kwargs are not passed through to the ollama AsyncClient's .chat() method
…ich will install ollama client and tiktoken required by autogen-ext/src/autogen_ext/models/ollama/_ollama_client.py
@rylativity you need to agree to the contributor license for this to be merged.
@microsoft-github-policy-service agree
On Feb 23, 2025, at 1:02 PM, microsoft-github-policy-service[bot] left a comment (microsoft/autogen#5674):
@rylativity please read the following Contributor License Agreement (CLA). If you agree with the CLA, please reply with the following information.
@microsoft-github-policy-service agree [company="{your company}"]
Options:
(default - no company specified) I have sole ownership of intellectual property rights to my Submissions and I am not making Submissions in the course of work for my employer.
@microsoft-github-policy-service agree
(when company given) I am making Submissions in the course of work for my employer (or my employer has intellectual property rights in my Submissions by contract or applicable law). I have permission from my employer to make Submissions and enter into this Agreement on behalf of my employer. By signing below, the defined term “You” includes me and my employer.
@microsoft-github-policy-service agree company="Microsoft"
Contributor License Agreement
@ekzhu I have agreed to the contributor license. I called this out in the PR comment, but since the tests will require …
We currently don't have an environment to run ollama integration tests. You can see this example for the OpenAI client to see how to run tests locally: autogen/python/packages/autogen-ext/tests/models/test_openai_model_client.py (lines 1151 to 1176 in a226966).
For unit tests, we rely on mocks. @peterychang can you help with reviewing this PR? I think it falls into the TODOs of the Ollama client.
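One common way to keep such a test local-only, which the linked OpenAI example appears to follow, is to gate it on an environment variable and skip otherwise. A rough sketch of that idea applied to Ollama (the OLLAMA_HOST variable and the model name are assumptions, not the repo's actual conventions):

```python
import os

import pytest
from autogen_core.models import UserMessage
from autogen_ext.models.ollama import OllamaChatCompletionClient


@pytest.mark.asyncio
async def test_ollama_create_against_local_server() -> None:
    # Skip unless a reachable Ollama endpoint is explicitly provided, so CI
    # (which has no Ollama server) never attempts the call.
    host = os.environ.get("OLLAMA_HOST")
    if not host:
        pytest.skip("Set OLLAMA_HOST to run Ollama integration tests locally.")

    client = OllamaChatCompletionClient(model="llama3.2", host=host)
    result = await client.create([UserMessage(content="Say hello.", source="user")])
    assert isinstance(result.content, str) and result.content
```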
…_kwarg to use monkeypatched ollama AsyncClient to mock httpx request to ollama server
@ekzhu ok thanks, I've updated the test to use a monkeypatched ollama AsyncClient with a mocked httpx call to the ollama server.
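For readers following along, a unit test for this fix can avoid a real server entirely. The sketch below is not the PR's actual test (which mocks the httpx call); it is a simplified check that the host kwarg reaches the underlying ollama AsyncClient, and it assumes the AsyncClient is constructed eagerly in the OllamaChatCompletionClient constructor; the model name is illustrative:

```python
import ollama
import pytest
from autogen_ext.models.ollama import OllamaChatCompletionClient


def test_host_reaches_the_ollama_async_client(monkeypatch: pytest.MonkeyPatch) -> None:
    seen: dict[str, object] = {}
    real_init = ollama.AsyncClient.__init__

    def spy_init(self, host=None, **kwargs):
        # Record what the underlying ollama client receives, then defer to the
        # real constructor (no network traffic happens at construction time).
        seen["host"] = host
        real_init(self, host=host, **kwargs)

    monkeypatch.setattr(ollama.AsyncClient, "__init__", spy_init)

    OllamaChatCompletionClient(model="llama3.2", host="http://localhost:11434")
    assert seen["host"] == "http://localhost:11434"
```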
@ekzhu should likely be assigned as a reviewer.
Why are these changes needed?
These changes address the bug reported in #5663. They prevent a TypeError from being thrown at inference time by the ollama AsyncClient when host (and other) kwargs are passed to the autogen OllamaChatCompletionClient constructor. They also add ollama as a named optional extra so that the ollama requirements can be installed alongside autogen-ext (e.g. pip install autogen-ext[ollama]).
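For illustration, this is the kind of constructor call the fix targets (a minimal sketch; the model name and local host URL are assumptions, not part of this PR):

```python
from autogen_ext.models.ollama import OllamaChatCompletionClient

# Previously, `host` was accepted here but leaked into the kwargs forwarded
# to ollama.AsyncClient.chat(), raising a TypeError at inference time.
# With this PR, per the commit messages above, such kwargs are no longer
# passed through to .chat().
client = OllamaChatCompletionClient(
    model="llama3.2",
    host="http://localhost:11434",
)
```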
@ekzhu, I will need some help or guidance to ensure that the associated test (which requires ollama and tiktoken as dependencies of the OllamaChatCompletionClient) can run successfully in autogen's test execution environment.
I have also left the "I've made sure all auto checks have passed" check below unchecked, as this PR is coming from my fork. (UPDATE: auto checks appear to have passed after opening the PR, so I have checked the box below.)
Related issue number
Intended to close #5663
Checks