When using the relatively recent "Functions" feature of the ChatGPT API, it seems like tiktoken-rs underestimates the total number of tokens in the request. Here's a minimal example request:
{
  "messages": [
    {
      "role": "system",
      "content": "You are a friendly chatbot.\n"
    },
    {
      "role": "assistant",
      "content": "Hello, I am a friendly chatbot!\n"
    },
    {
      "role": "user",
      "content": "What is the weather in New York?"
    },
    {
      "content": "",
      "function_call": {
        "arguments": "{\n\"city\": \"New York\"\n}",
        "name": "get_weather"
      },
      "role": "assistant"
    },
    {
      "role": "function",
      "name": "get_weather",
      "content": "{\"temperature\": 72, \"conditions\": \"partly_cloudy\"}"
    }
  ],
  "model": "gpt-4-0613",
  "temperature": 0,
  "stream": false
}
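Here is roughly how I build the equivalent messages for tiktoken-rs — a minimal sketch assuming the `ChatCompletionRequestMessage` and `FunctionCall` types exported by recent tiktoken-rs releases (exact field shapes may differ between versions):

```rust
use tiktoken_rs::{num_tokens_from_messages, ChatCompletionRequestMessage, FunctionCall};

// Convenience constructor for a plain text message.
fn msg(role: &str, content: &str) -> ChatCompletionRequestMessage {
    ChatCompletionRequestMessage {
        role: role.to_string(),
        content: Some(content.to_string()),
        name: None,
        function_call: None,
    }
}

fn main() {
    let messages = vec![
        msg("system", "You are a friendly chatbot.\n"),
        msg("assistant", "Hello, I am a friendly chatbot!\n"),
        msg("user", "What is the weather in New York?"),
        // The assistant's function call, mirroring the JSON request above.
        ChatCompletionRequestMessage {
            role: "assistant".to_string(),
            content: Some("".to_string()),
            name: None,
            function_call: Some(FunctionCall {
                name: "get_weather".to_string(),
                arguments: "{\n\"city\": \"New York\"\n}".to_string(),
            }),
        },
        // The function's result message.
        ChatCompletionRequestMessage {
            role: "function".to_string(),
            content: Some("{\"temperature\": 72, \"conditions\": \"partly_cloudy\"}".to_string()),
            name: Some("get_weather".to_string()),
            function_call: None,
        },
    ];

    let estimate = num_tokens_from_messages("gpt-4-0613", &messages).unwrap();
    println!("tiktoken-rs estimate: {estimate} tokens");
}
```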
I get this response from OpenAI:
...indicating the request consumed 78 tokens for the prompt. However, `tiktoken_rs::num_tokens_from_messages` returns a value of 66 tokens.
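That leaves a 12-token gap. For anyone hitting the same issue, a rough stopgap is to count the `function_call` fields with the model's BPE and add that on top of the library's estimate. This is purely a heuristic sketch, since OpenAI's actual serialization of function calls into the prompt is undocumented:

```rust
use tiktoken_rs::get_bpe_from_model;

// Heuristic only: count the raw tokens of a function_call's fields so they
// can be added on top of num_tokens_from_messages. This does not reproduce
// OpenAI's exact (undocumented) serialization, so treat it as approximate.
fn function_call_tokens(model: &str, name: &str, arguments: &str) -> anyhow::Result<usize> {
    let bpe = get_bpe_from_model(model)?;
    Ok(bpe.encode_with_special_tokens(name).len()
        + bpe.encode_with_special_tokens(arguments).len())
}

fn main() -> anyhow::Result<()> {
    let extra = function_call_tokens("gpt-4-0613", "get_weather", "{\n\"city\": \"New York\"\n}")?;
    println!("extra tokens from the function call: {extra}");
    Ok(())
}
```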