LM Studio #2
-
Hi there, very nice job! Can we use LM Studio locally instead?
-
I was just going to ask the same thing. Super excited to check this out in conjunction with a locally running LLM. Here is an article I spotted recently, mentioned in a documentation update PR on the AutoGen repo. It covers an example of using AutoGen with a locally running LLM via LM Studio. Hopefully just exposing the config out to the env vars, so we can set an alternative base URL, should be enough to let us try this out with LM Studio and other OpenAI-compatible LLM API solutions. See the sketch below for the general shape of such a config.
https://medium.com/analytics-vidhya/microsoft-autogen-using-open-source-models-97cba96b0f75
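A minimal sketch of what that article describes, not this project's actual config: pointing AutoGen at LM Studio's local OpenAI-compatible server, which listens on http://localhost:1234/v1 by default. The model name and API key values here are placeholders; recent pyautogen releases use the "base_url" key (older ones used "api_base").

```python
# Sketch: AutoGen agents backed by a model served locally through LM Studio.
# Assumes pyautogen is installed and LM Studio's local server is running.
import autogen

config_list = [
    {
        "model": "local-model",                   # placeholder; use the model loaded in LM Studio
        "base_url": "http://localhost:1234/v1",   # LM Studio's default OpenAI-compatible endpoint
        "api_key": "lm-studio",                   # LM Studio ignores the key, but it must be non-empty
    }
]

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
)
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config=False,
)

# Route a simple chat through the local model to confirm the wiring works.
user_proxy.initiate_chat(assistant, message="Say hello from the local model.")
```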
-
Hey @heresandyboy, thank you for your suggestion, that was actually very helpful. We've exposed the config file, so you can now add your own custom base URL and utilise local LLM models. Please find the file here. @kodiii, I hope this helps.
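For anyone wiring this up: the exact keys in the exposed config file aren't shown in this thread, but the custom base URL is the same one a plain OpenAI client would use. A quick way to confirm your LM Studio server is reachable before editing the project's config (sketch, assuming the openai Python package v1+ and a model loaded in LM Studio; "local-model" is a placeholder name):

```python
# Connectivity check against LM Studio's local server (not part of the project).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's default OpenAI-compatible endpoint
    api_key="lm-studio",                  # any non-empty string; LM Studio does not validate it
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio serves whichever model is currently loaded
    messages=[{"role": "user", "content": "Hello from a local LLM"}],
)
print(response.choices[0].message.content)
```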