Hi,

First, this is a great project. I love it!

I tried to run v3 after installing a few LLMs with Ollama (which works fine), but I keep hitting this error:

ValueError: The number of documents in the SQL database (229) doesn't match the number of embeddings in FAISS (0). Make sure your FAISS configuration file points to the same database that you used when you saved the original index.

This happens whenever I ask a question, whether or not I upload a document first; both cases give the same error.

I checked, and Ollama is running on port 11434 (the default).

For reference, I'm on Fedora with Python 3.10.13 in a venv.
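For anyone hitting the same error: it is just a consistency check between the two stores — the document rows in the SQL database and the vectors in the FAISS index (its `ntotal` count) must line up one-to-one, and a stale or never-populated index reports 0. A minimal sketch of that check, using an in-memory SQLite database as a stand-in (the `documents` table name is an assumption, not the project's actual schema):

```python
import sqlite3

def check_stores(conn: sqlite3.Connection, faiss_ntotal: int) -> None:
    """Raise the same kind of error when the two stores disagree."""
    (doc_count,) = conn.execute("SELECT COUNT(*) FROM documents").fetchone()
    if doc_count != faiss_ntotal:
        raise ValueError(
            f"The number of documents in the SQL database ({doc_count}) doesn't "
            f"match the number of embeddings in FAISS ({faiss_ntotal})."
        )

# Demo: 229 document rows vs. a stale (empty) index reproduces the
# exact mismatch reported in this issue.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE documents (id INTEGER PRIMARY KEY, text TEXT)")
conn.executemany("INSERT INTO documents (text) VALUES (?)", [("doc",)] * 229)

try:
    check_stores(conn, faiss_ntotal=0)  # a fresh/stale FAISS index has ntotal == 0
except ValueError as e:
    print(e)
```

So the error appears whenever one store is rebuilt (or left over from an earlier run) while the other is not, regardless of what question you ask.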
@odevroed Thank you for your kind words, I am glad you are enjoying it! That one is my fault: I need to push a fix that deletes the existing indexes each time you run the program. In the meantime, if you delete both .db files as well as the FAISS config files/JSON, it will work again.
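The workaround above can be scripted. The comment only says ".db files" and "FAISS config files/json", so the globs below are assumptions — check which files the program actually writes in your project directory before deleting anything:

```shell
#!/bin/sh
# Remove the stale stores so the program rebuilds them on the next run.
# Run from the project directory; adjust the globs to match your files.
rm -f ./*.db          # the SQLite document databases
rm -f ./*.faiss       # saved FAISS index files (name/extension is a guess)
rm -f ./*config*.json # the FAISS configuration JSON
```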
I wonder: will you implement a more straightforward way to change the model than editing it in the code?
Also, I tried gemma, and the results are not good. Which types of models will work with your v3?
@odevroed That is also on my list, haha. The plan is to include a drop-down for selecting whichever model you like, dynamically swapping the prompt as well to fit each model to the same task.
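The per-model prompt swapping described above could be as simple as a lookup table keyed by the selected model name. Everything below — the model names, the templates, and the function name — is hypothetical, not the project's actual code:

```python
# Hypothetical mapping from model name to a prompt template, sketching the
# "dynamically swap the prompt per model" idea. Templates are illustrative only.
PROMPT_TEMPLATES = {
    "llama2": "[INST] Use the context to answer.\n{context}\n{question} [/INST]",
    "mistral": "[INST] {context}\n\nQuestion: {question} [/INST]",
    "gemma": (
        "<start_of_turn>user\n{context}\n{question}<end_of_turn>\n"
        "<start_of_turn>model\n"
    ),
}

def build_prompt(model: str, context: str, question: str) -> str:
    """Pick the template for the selected model, with a plain-text fallback."""
    template = PROMPT_TEMPLATES.get(model, "{context}\n\nQuestion: {question}")
    return template.format(context=context, question=question)

print(build_prompt("gemma", "Doc text here.", "What does it say?"))
```

A drop-down in the UI would then just feed its selected value into `build_prompt`, so adding a model means adding one dictionary entry rather than editing code in several places.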