Do local models (llama.cpp) have provider classes in TruLens? #1000
YasaminAbbaszadegan started this conversation in General
Replies: 2 comments
- Local models can be run through the LiteLLM or Langchain feedback provider classes, or with a custom feedback function (see the sketch after these replies).
- If you're willing to share an example with the community, please open up a PR. This would be a fantastic contribution!
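For instance, a llama.cpp model served behind an OpenAI-compatible endpoint (such as llama.cpp's built-in `llama-server`) can be reached through the LiteLLM provider. This is a minimal sketch, not confirmed by the thread: the server URL, the `openai/` model route, and the forwarding of `completion_kwargs` to LiteLLM are all assumptions.

```python
from trulens_eval import Feedback
from trulens_eval.feedback.provider import LiteLLM

# Assumption: a llama.cpp server exposes an OpenAI-compatible API locally.
provider = LiteLLM(
    model_engine="openai/local-model",  # LiteLLM's OpenAI-compatible route; model name assumed
    completion_kwargs={"api_base": "http://localhost:8080/v1"},  # assumed kwarg and URL
)

# Provider-backed feedback functions then score with the local model.
f_coherence = Feedback(provider.coherence).on_output()
```

Routing through an OpenAI-compatible server keeps the feedback provider decoupled from how the model is loaded, so the same `Feedback` definitions work whether the backend is llama.cpp, Ollama, or a hosted API.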
- Do local models (llama.cpp) have provider classes in TruLens?
```python
from llama_index.llms.llama_cpp import LlamaCPP

llm = LlamaCPP(
    model_url="https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.2-GGUF/resolve/main/mistral-7b-instruct-v0.2.Q4_K_M.gguf",
    temperature=1,
    model_kwargs={"n_gpu_layers": -1},
    verbose=True,
)
```
--> from trulens_eval.feedback.provider import ?????
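There is no llama.cpp-specific provider to import; per the first reply, one plausible route for this setup is the Langchain provider wrapping LangChain's `LlamaCpp` class. A minimal sketch, assuming a recent trulens_eval that exports `Langchain` from `trulens_eval.feedback.provider` and a local copy of the GGUF file (the path is hypothetical):

```python
from langchain_community.llms import LlamaCpp
from trulens_eval import Feedback
from trulens_eval.feedback.provider import Langchain

# Load the same Mistral GGUF weights through LangChain's llama.cpp wrapper.
local_llm = LlamaCpp(
    model_path="mistral-7b-instruct-v0.2.Q4_K_M.gguf",  # hypothetical local path
    n_gpu_layers=-1,
)

# Wrap the LangChain LLM as a TruLens feedback provider.
provider = Langchain(chain=local_llm)
f_relevance = Feedback(provider.relevance).on_input_output()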