Project
vgrep
Description
The embed_batch() function in src/core/embeddings.rs creates a new LlamaContext with default parameters, completely ignoring the user's configured context_size value stored in self.n_ctx. This means users cannot effectively control the context window size for batch embedding operations.
Error Message
Debug Logs
System Information
Bounty Version: 0.1.0
OS: Ubuntu 24.04 LTS
CPU: AMD EPYC-Genoa Processor (8 cores)
RAM: 15 GB
Screenshots
No response
Steps to Reproduce
- Set a custom context size:
vgrep config set context-size 256
- Index a project with large files:
vgrep index .
- Observe that the configured context size is not used during batch embedding
Expected Behavior
- embed_batch() should create the context with self.n_ctx as the context size
- The user's context_size configuration should be respected
- Token truncation and the context window should match the configuration
Actual Behavior
- embed_batch() uses LlamaContextParams::default() without setting the context size
- llama.cpp's default context size is used instead
- The user's configuration has no effect on batch operations
Additional Context
Location: src/core/embeddings.rs:80-83
let ctx_params = LlamaContextParams::default() // n_ctx NOT set!
.with_n_threads_batch(n_threads)
.with_n_threads(n_threads)
.with_embeddings(true);
Note: self.n_ctx exists and is used for token truncation at line 98, but not for context creation.
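A minimal, self-contained sketch of the fix: thread the configured value into the context params via the builder chain instead of relying on the default. The ContextParams type below is a mock standing in for the crate's LlamaContextParams; the real setter name (with_n_ctx) and its Option<NonZeroU32> parameter are assumptions about the llama-cpp-2 API, so verify against the version vgrep pins.

```rust
use std::num::NonZeroU32;

// Mock of the builder used in embed_batch(); stands in for LlamaContextParams.
// `n_ctx: None` models "use llama.cpp's built-in default context size".
#[derive(Debug, Clone)]
struct ContextParams {
    n_ctx: Option<NonZeroU32>,
    embeddings: bool,
}

impl ContextParams {
    fn default_params() -> Self {
        Self { n_ctx: None, embeddings: false }
    }
    // Assumed to mirror llama-cpp-2's `with_n_ctx` setter.
    fn with_n_ctx(mut self, n_ctx: Option<NonZeroU32>) -> Self {
        self.n_ctx = n_ctx;
        self
    }
    fn with_embeddings(mut self, on: bool) -> Self {
        self.embeddings = on;
        self
    }
}

fn main() {
    let configured_n_ctx: u32 = 256; // the user's `context-size` setting (self.n_ctx)

    // Buggy shape: the configured value is never propagated.
    let buggy = ContextParams::default_params().with_embeddings(true);
    assert_eq!(buggy.n_ctx, None);

    // Fixed shape: pass self.n_ctx into the builder chain.
    let fixed = ContextParams::default_params()
        .with_n_ctx(NonZeroU32::new(configured_n_ctx))
        .with_embeddings(true);
    assert_eq!(fixed.n_ctx, NonZeroU32::new(256));
    println!("fixed n_ctx = {:?}", fixed.n_ctx.map(|n| n.get()));
}
```

This keeps context creation and the truncation logic at line 98 driven by the same value, so the configured window and the truncation limit can no longer diverge.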