I'm working on a personal project built mainly with LangChain and ChatOpenAI prompting, and I want to integrate guidance into it.
The best approach would be to load the model with guidance's LlamaCpp backend, but I cannot use LangChain with a model loaded this way.
Is there a way to use a model loaded with guidance in LangChain, for both Chat and Embeddings?
Alternatively, would it be possible to add an option that starts the llama.cpp server when the model is loaded in guidance, so the same server can then handle generic OpenAI-style calls?
I need help here: repeatedly loading and unloading the model seems wasteful and not a very smart solution.
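To make the idea concrete, here is a minimal sketch of what I mean by "generic OpenAI calls" against the same local model. This assumes an OpenAI-compatible server (e.g. llama-cpp-python's server, started with something like `python -m llama_cpp.server --model ./model.gguf --port 8000`) is already running; the `BASE_URL`, port, and `"local-model"` name are assumptions for illustration, not guidance's actual API:

```python
import json
from urllib import request

# Assumed endpoint of a locally running OpenAI-compatible llama.cpp server
# (default port of llama-cpp-python's server; adjust to your setup).
BASE_URL = "http://localhost:8000/v1"


def build_chat_request(messages, model="local-model"):
    """Build a generic OpenAI-style /chat/completions request body."""
    return {"model": model, "messages": messages}


def chat_completion(messages, base_url=BASE_URL):
    """POST a chat completion to the OpenAI-compatible server (server must be up)."""
    body = json.dumps(build_chat_request(messages)).encode("utf-8")
    req = request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())
```

If guidance could expose (or reuse) such a server for the model it has loaded, LangChain could target the same endpoint directly, e.g. `ChatOpenAI(base_url=BASE_URL, api_key="not-needed")`, with no second copy of the model in memory.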