llama.cpp ./embedding #17
Comments
Have you tried this? Line 287 in baa1bcf
This will get you the embeddings for the prompt.
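If the call does return per-token embeddings rather than a single vector, they still have to be pooled into one sentence embedding before use. A minimal, self-contained sketch in Rust with toy data standing in for real model output (`mean_pool` and `l2_normalize` are illustrative helpers, not part of llama.cpp or any binding):

```rust
// Mean-pool per-token embedding vectors into one sentence vector,
// then L2-normalize so cosine similarity reduces to a dot product.
fn mean_pool(token_embeddings: &[Vec<f32>]) -> Vec<f32> {
    let dim = token_embeddings[0].len();
    let mut pooled = vec![0.0f32; dim];
    for emb in token_embeddings {
        for (p, v) in pooled.iter_mut().zip(emb) {
            *p += v; // accumulate each dimension across tokens
        }
    }
    let n = token_embeddings.len() as f32;
    for p in pooled.iter_mut() {
        *p /= n; // divide by token count to get the mean
    }
    pooled
}

fn l2_normalize(v: &mut [f32]) {
    let norm = v.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm > 0.0 {
        for x in v.iter_mut() {
            *x /= norm;
        }
    }
}

fn main() {
    // Two fake 2-dimensional token embeddings.
    let tokens = vec![vec![1.0, 2.0], vec![3.0, 4.0]];
    let mut sentence = mean_pool(&tokens);
    l2_normalize(&mut sentence);
    println!("{:?}", sentence);
}
```

An empty result vector from a binding usually means the context was not set up to emit embeddings at all, so it is worth checking the embedding flag in the context parameters before suspecting the pooling step.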
@mdrokz, thank you for your response. I tried it before trying the llama.cpp ./embedding executable directly; the function would always return an empty vector: []. I tried multiple configurations but could not fix the issue.
Alright, I will test on my end and see what's happening. Thanks.
I was using zephyr-7B-alpha-GGUF with:
without any GPU assistance. Note:
Right now my workaround is to use a Rust wrapper (
Is there no Rust binding to get the embeddings?
Using llama.cpp one would use:
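The original command was stripped from the thread, so as an assumption this is the typical invocation of llama.cpp's stock embedding example (the model path and prompt are placeholders, not from the thread):

```shell
# Hedged sketch: run llama.cpp's embedding example against a GGUF model.
# -m selects the model file, -p supplies the prompt to embed.
./embedding -m ./models/zephyr-7b-alpha.gguf -p "Hello world"
```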