Hi there,
Back again with three questions, the answers to which I'm happy to PR into the main README or other documentation pages.
1/ "Source" suffix
It seems that when Canopy creates its response message, it appends the source (or sources) at the end of the message, but does so in a non-deterministic way. Is this intentional, or is it a known "we'll clean that up later"?
2/ Chat history
Chat history is mentioned in the main README.md, but I'm not seeing any documentation about supplying chat history when using Canopy's REST API endpoint. Perhaps the chat CLI handles this within its own session (my apologies, I haven't looked at that piece of the code yet), but it would be great to have documentation on chat history for the REST API, since that's how most people will use Canopy in production. Do we need to keep track of the history ourselves and pass it to the REST API, for example?
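For reference, this is the pattern I'm assuming is required: a minimal sketch, not confirmed Canopy behavior. It assumes the Canopy server is running locally on port 8000 and exposes an OpenAI-compatible chat completions endpoint, and that the client must re-send the accumulated history on every call. The `api_key` and `model` values are placeholders of my own, not documented requirements:

```python
# Minimal sketch: client-side chat history against the Canopy REST API.
# Assumption: the server runs at http://localhost:8000 and speaks the
# OpenAI-compatible chat completions protocol; api_key/model below are
# placeholders, not confirmed Canopy requirements.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="unused")

history: list[dict] = []  # client-maintained chat history

def chat(user_message: str) -> str:
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="canopy",    # assumption: the model is configured server-side
        messages=history,  # full history re-sent on every request
    )
    answer = response.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer

print(chat("What does Canopy index?"))
print(chat("And how do I query it?"))  # previous turns travel with the request
```

If that is indeed the expected pattern, a short section in the README confirming it would be all the documentation needed.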
3/ Allowing LLM only responses
Is there a way, in the absence of RAG material, to let the LLM generate the reply on its own, or is this something the client will need to handle?
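If it has to be handled client-side, this is the kind of fallback I have in mind, purely a sketch under the assumption that Canopy returns a recognizable "I don't know"-style reply when no relevant context is retrieved. The `NO_CONTEXT_PHRASE` marker and the direct-to-OpenAI fallback are my assumptions, not documented behavior:

```python
# Hypothetical client-side fallback: ask Canopy first, and if the grounded
# answer looks like a "no relevant context" refusal, re-ask the bare LLM.
# NO_CONTEXT_PHRASE is an assumed marker, not part of Canopy's contract.
from openai import OpenAI

canopy = OpenAI(base_url="http://localhost:8000/v1", api_key="unused")
llm = OpenAI()  # plain OpenAI client, reads OPENAI_API_KEY from the environment

NO_CONTEXT_PHRASE = "I don't know"  # assumption: Canopy's refusal wording

def answer(question: str) -> str:
    messages = [{"role": "user", "content": question}]
    grounded = canopy.chat.completions.create(model="canopy", messages=messages)
    text = grounded.choices[0].message.content
    if NO_CONTEXT_PHRASE.lower() in text.lower():
        # No RAG material retrieved: fall back to the LLM's own knowledge.
        fallback = llm.chat.completions.create(model="gpt-4o", messages=messages)
        return fallback.choices[0].message.content
    return text
```

A built-in toggle for this would obviously be nicer than string-matching on the refusal, which is why I'm asking.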
Thanks so much in advance and I'm happy to PR any answers to the above questions!