Connect any VLM using the OpenAI API, local or online #361
LoFiApostasy started this conversation in Show and tell
Replies: 1 comment
Folks, I made an (almost) drop-in module that lets you use the well-established OpenAI API. There are several server choices out there, but I use LM Studio.
Edit: fixed my copy-paste errors and clarified the paths.
models_list.py
openai_compatible.py
Install_win.ps1
Launch.ps1
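To show the idea behind an OpenAI-compatible connector, here is a minimal sketch of the request body such a client would send to a local vision server (e.g. LM Studio's default endpoint at `http://localhost:1234/v1`). The function name, model name, and endpoint are my assumptions for illustration, not taken from the scripts above.

```python
import base64
import json

def build_vision_request(model, prompt, image_bytes, mime="image/png"):
    """Build an OpenAI-compatible /v1/chat/completions body with one inline image.

    The image is embedded as a base64 data URL, which is how the OpenAI chat
    API accepts inline images in an `image_url` content part.
    """
    data_url = f"data:{mime};base64," + base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": model,  # assumed model id; whatever your local server loaded
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": prompt},
                    {"type": "image_url", "image_url": {"url": data_url}},
                ],
            }
        ],
    }

# Example: build (but don't send) a request for a hypothetical local VLM.
body = build_vision_request("local-vlm", "Describe this image.", b"\x89PNG...")
print(json.dumps(body)[:60])
```

Posting this body to the local server's `/v1/chat/completions` route with any HTTP client should work against any OpenAI-compatible backend.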
If you really want the one-click run experience, you can make a .bat file.
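A minimal sketch of such a launcher, assuming it sits in the same folder as Launch.ps1 (the relative path is my assumption):

```bat
@echo off
rem One-click launcher: run Launch.ps1 from this .bat file's own directory.
rem -ExecutionPolicy Bypass avoids script-execution policy errors on stock Windows.
powershell -ExecutionPolicy Bypass -File "%~dp0Launch.ps1"
pause
```

Double-clicking the .bat then starts the server without opening a terminal yourself; `pause` keeps the window open so you can read any errors.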
Beta Was this translation helpful? Give feedback.
All reactions