The choice of QMD locks out all the OpenClaw users who run on compute- and memory-constrained VPSes or devices like a Raspberry Pi. Since the project uses hosted LLMs anyway, why not also offer hosted embeddings and a re-ranker?
QMD gained quite some popularity because of its ease of use (no complicated configuration to set up), but it is a very immature product. It's fine for a collection of small notes, but it cannot deal with larger documents, and by "large" I don't mean hundreds of pages. QMD returns exactly one hit per document, even if the document contains multiple relevant passages, and that hit might be the abstract or introduction of the note.
People quickly run into OOM errors, especially if they don't run their claw on a Mac Mini. With all respect, I think QMD is great, but it needs to mature. Meanwhile, there are mature, battle-proven solutions. LanceDB, for instance, has all the bells and whistles: vector search, BM25, reranking. It can run with local models as well as hosted ones, so users get the choice.