Replies: 1 comment
-
That's not supported yet. Supported LLMs are here: https://ts.llamaindex.ai/modules/llms/ and supported embeddings here: https://ts.llamaindex.ai/modules/embeddings/
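Until a dedicated integration lands, a common workaround is to call the llamafile server directly, since llamafile serves an OpenAI-compatible HTTP API (by default at `http://localhost:8080/v1`). The sketch below is not LlamaIndex.TS code, just plain `fetch` against that endpoint; the base URL and model name are assumptions about a default local setup.

```typescript
// Sketch: talk to a local llamafile server over its OpenAI-compatible API.
// Assumes the default llamafile address http://localhost:8080/v1; adjust as needed.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the URL and fetch options for the /chat/completions endpoint.
function buildChatRequest(baseURL: string, model: string, messages: ChatMessage[]) {
  return {
    url: `${baseURL}/chat/completions`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Usage (requires a running llamafile server; the model name is a placeholder):
const req = buildChatRequest("http://localhost:8080/v1", "LLaMA_CPP", [
  { role: "user", content: "Hello!" },
]);
// const res = await fetch(req.url, req.init); // uncomment with a live server
console.log(req.url);
```

This keeps the request shape identical to what an OpenAI-style client would send, so swapping in an officially supported LLM later only means changing the base URL and model.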
-
Hi, I am trying to integrate my local llamafile server with LlamaIndex.TS, but I couldn't find documentation for that. I saw that the Python version does have this option in its documentation.
Can anyone please help here?
Thanks!