LLMLingua typically operates using the llama-2-7b model directly within the virtual machine (VM) environment. However, is it possible to configure LLMLingua to make API calls to an external instance of llama-2-7b? Thanks!
Answered by iofu728 on Apr 7, 2024
Hi @flthibau, thanks for your interest in LLMLingua. First, I'd like to confirm whether the requirement you mentioned involves invoking Llama-2 via an API to implement LLMLingua. If so, we are currently working on this feature and expect to release it soon. For more details, please see issues #70 and #118.
Answer selected by flthibau