
[Feature] Any plans to extend this to other LLMs especially local ones? #29

Open
PRESIDENT810 opened this issue Jul 24, 2024 · 4 comments

Comments

@PRESIDENT810

It looks like magentic already supports running local LLMs via MAGENTIC_LITELLM_MODEL, which can point to a local server run by Ollama. Are there any plans to let users run LLMs locally on their own device instead of only OpenAI's models?
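For reference, a minimal sketch of what that could look like, assuming magentic's litellm backend and an Ollama server on its default port (the model name `llama3` and the prompt are just placeholders):

```python
# Sketch only — assumes magentic's litellm backend and a local Ollama server
# at the default address (http://localhost:11434). Shell configuration:
#   export MAGENTIC_BACKEND=litellm
#   export MAGENTIC_LITELLM_MODEL=ollama/llama3   # "llama3" is an example model name

from magentic import prompt


@prompt("Answer in one sentence: {question}")
def ask(question: str) -> str: ...  # magentic fills in the body via the configured LLM


print(ask("What is function calling in the context of LLMs?"))
```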

@LegalPrimes

I'm also curious to know of any plans to support local LLMs, especially in light of the Llama 3.1 405B release.

@jmaslek

jmaslek commented Jul 26, 2024

This is definitely something we are interested in exploring, but it will likely not happen for a little while. It will be interesting to see how the function calling compares.

This is also definitely open to be explored by the community.

@kairoswealth

It would be great if you could at least support Azure OpenAI.
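If the litellm route above works out, Azure OpenAI might be reachable the same way. A hedged sketch, assuming litellm's `azure/<deployment>` naming convention; the deployment name, endpoint, and API version below are placeholders:

```python
# Sketch only — assumes magentic's litellm backend and litellm's Azure support.
#   export MAGENTIC_BACKEND=litellm
#   export MAGENTIC_LITELLM_MODEL=azure/my-gpt-4o-deployment   # placeholder deployment name
#   export AZURE_API_KEY=<your key>
#   export AZURE_API_BASE=https://<your-resource>.openai.azure.com
#   export AZURE_API_VERSION=2024-02-01

from magentic import prompt


@prompt("Give a one-line answer: {question}")
def ask(question: str) -> str: ...
```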

@bala-ceg

I can explore this for running with local LLMs.
