Commit
Bugfix: only eval new tokens
abetlen committed Apr 15, 2023
1 parent 887f3b7 commit 89856ef
Showing 1 changed file with 1 addition and 0 deletions.
1 change: 1 addition & 0 deletions llama_cpp/llama.py
```diff
@@ -280,6 +280,7 @@ def generate(
             if self.verbose:
                 print("generate cache hit", file=sys.stderr)
             reset = False
+            tokens = tokens[len(self.tokens) :]
         ###
         if reset:
             self.reset()
```
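The one-line fix above addresses prompt-cache reuse: when the new prompt starts with the tokens already evaluated (a "cache hit"), only the suffix beyond the cached prefix needs to be fed to the model, so `tokens` is sliced to drop the shared prefix. A minimal standalone sketch of that slicing behavior (the `KVCacheSim` class and its methods are hypothetical stand-ins for `llama_cpp.Llama.generate`, which holds far more state):

```python
# Hypothetical simulation of the cache-hit slicing added in this commit.
# Only the prefix comparison and the suffix slice mirror the real code.

class KVCacheSim:
    def __init__(self):
        self.tokens = []  # tokens already evaluated (KV state cached)

    def generate(self, tokens):
        reset = True
        # Cache hit: the new prompt begins with the cached tokens.
        if tokens[: len(self.tokens)] == self.tokens:
            reset = False
            # The bugfix: evaluate only the new suffix, not the whole prompt.
            tokens = tokens[len(self.tokens) :]
        if reset:
            self.tokens = []  # cache miss: start over
        self.tokens.extend(tokens)
        return tokens  # tokens actually evaluated this call


sim = KVCacheSim()
sim.generate([1, 2, 3])                    # cold start: evaluates all three
evaluated = sim.generate([1, 2, 3, 4, 5])  # cache hit: evaluates only [4, 5]
```

Without the slice, a cache hit would still re-evaluate the entire prompt, defeating the purpose of the cache; with it, the second call above touches only the two new tokens.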
