Counting Input and Output Tokens from an OpenAI Chat Engine #1221
-
I am building a Next.js application largely based on the default generated project. Ideally, I would like to use the LlamaIndex context chat engine directly and save the token counts once the stream has completed, via some callback. I believe there is a token-counting handler in the Python version of this package, but I haven't been able to find anything similar in the TS version (maybe it's yet to be implemented). I could probably hack together a solution to count the tokens myself, but it would be nicer to use the usage data provided directly by OpenAI for better accuracy. Any advice would be greatly appreciated.
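For reference, a minimal sketch of the "use OpenAI's own usage data" approach, assuming the OpenAI Chat Completions streaming behavior where setting `stream_options: { include_usage: true }` makes the final streamed chunk carry a `usage` object. The accumulator below is a hypothetical helper (not a LlamaIndexTS API) that collects the streamed text and captures that usage payload:

```typescript
// Subset of OpenAI's usage payload (prompt = input, completion = output).
interface UsageInfo {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}

// Minimal shape of a streamed chat-completion chunk; the final chunk
// carries `usage` when `stream_options: { include_usage: true }` is set.
interface StreamChunk {
  choices: { delta: { content?: string } }[];
  usage?: UsageInfo | null;
}

// Accumulate streamed text and capture the usage object OpenAI attaches
// to the last chunk of the stream (null if usage was never sent).
function collectStream(chunks: Iterable<StreamChunk>): {
  text: string;
  usage: UsageInfo | null;
} {
  let text = "";
  let usage: UsageInfo | null = null;
  for (const chunk of chunks) {
    // The usage-only final chunk has an empty `choices` array.
    text += chunk.choices[0]?.delta.content ?? "";
    if (chunk.usage) usage = chunk.usage;
  }
  return { text, usage };
}
```

With the real SDK you would iterate the live stream with `for await (const chunk of stream)` and feed each chunk into the same accumulation logic, then persist `usage` in your completion callback.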
Replies: 1 comment
-
Nevermind... Found the answer here.