Description
In the chat-completion-accumulating example under the examples folder, the StreamOptions.IncludeUsage option is missing. Per the OpenAI docs, it should be set to true so that accurate token usage information is returned by the API.
Reproducible Example
The following code snippet is the same code from the chat-completion-accumulating example.
```go
package main

import (
	"context"

	"github.com/openai/openai-go"
)

// Mock function to simulate weather data retrieval
func getWeather(location string) string {
	// In a real implementation, this function would call a weather API
	return "Sunny, 25°C"
}

func main() {
	client := openai.NewClient()
	ctx := context.Background()

	question := "Begin a very brief introduction of Greece, then incorporate the local weather of a few towns"

	print("> ")
	println(question)
	println()

	params := openai.ChatCompletionNewParams{
		Messages: openai.F([]openai.ChatCompletionMessageParamUnion{
			openai.UserMessage(question),
		}),
		Seed:  openai.Int(0),
		Model: openai.F(openai.ChatModelGPT4o),
		Tools: openai.F([]openai.ChatCompletionToolParam{
			{
				Type: openai.F(openai.ChatCompletionToolTypeFunction),
				Function: openai.F(openai.FunctionDefinitionParam{
					Name:        openai.String("get_live_weather"),
					Description: openai.String("Get weather at the given location"),
					Parameters: openai.F(openai.FunctionParameters{
						"type": "object",
						"properties": map[string]interface{}{
							"location": map[string]string{
								"type": "string",
							},
						},
						"required": []string{"location"},
					}),
				}),
			},
		}),
	}

	stream := client.Chat.Completions.NewStreaming(ctx, params)
	acc := openai.ChatCompletionAccumulator{}

	for stream.Next() {
		chunk := stream.Current()
		acc.AddChunk(chunk)

		// When this fires, the current chunk value will not contain content data
		if content, ok := acc.JustFinishedContent(); ok {
			println("Content stream finished:", content)
			println()
		}

		if tool, ok := acc.JustFinishedToolCall(); ok {
			println("Tool call stream finished:", tool.Index, tool.Name, tool.Arguments)
			println()
		}

		if refusal, ok := acc.JustFinishedRefusal(); ok {
			println("Refusal stream finished:", refusal)
			println()
		}

		// It's best to use chunks after handling JustFinished events
		if len(chunk.Choices) > 0 {
			println(chunk.Choices[0].Delta.JSON.RawJSON())
		}
	}

	if err := stream.Err(); err != nil {
		panic(err)
	}

	// After the stream is finished, acc can be used like a ChatCompletion
	_ = acc.Choices[0].Message.Content
	println("Total Tokens:", acc.Usage.TotalTokens)
	println("Finish Reason:", acc.Choices[0].FinishReason)
}
```
Current Behavior
The example code does not set the StreamOptions.IncludeUsage request option, resulting in the token usage always being reported as 0.
Expected Behavior
The example should set the StreamOptions.IncludeUsage request option to true to enable accurate reporting of token usage.
Suggested Solution
Set StreamOptions.IncludeUsage to true in the example's request params.
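One way to apply the fix (a sketch, assuming the `ChatCompletionStreamOptionsParam` type and the same `openai.F` wrappers the example already uses) is to add a StreamOptions field alongside the existing ones in the params struct:

```go
params := openai.ChatCompletionNewParams{
	// ... existing Messages, Seed, Model, and Tools fields ...

	// Ask the API to append a final chunk carrying token usage;
	// without this, acc.Usage.TotalTokens remains 0 in streaming mode.
	StreamOptions: openai.F(openai.ChatCompletionStreamOptionsParam{
		IncludeUsage: openai.Bool(true),
	}),
}
```

With this set, the final stream chunk carries the usage data, which the accumulator picks up so that the example's `println("Total Tokens:", acc.Usage.TotalTokens)` reports a non-zero count.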
bytesizedwizard changed the title to "Missing request option in example code results in incorrect token usage reporting" on Nov 13, 2024