Support for max_completion_tokens in the o1 series #10523

Closed Answered by RogerBarreto
aeras3637 asked this question in Q&A
@aeras3637 Thanks for raising this. I was able to identify and replicate the issue, and we have merged a fix that will be available in the next release.

After the fix, to use the new max_completion_tokens parameter with Azure, you will need to update your code to something like:

// Opt in so that MaxTokens is sent as max_completion_tokens
var result = await service.GetChatMessageContentAsync("my prompt", new AzureOpenAIPromptExecutionSettings
{
    SetNewMaxCompletionTokensEnabled = true, // required for o1-series models
    MaxTokens = 1000,                        // mapped to max_completion_tokens
});
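For context, here is a minimal sketch of how the service in the snippet above might be obtained via the Semantic Kernel builder. The deployment name, endpoint, and API key below are placeholders, and the exact builder call assumed here (`AddAzureOpenAIChatCompletion` from the Azure OpenAI connector package) should be checked against the Semantic Kernel version you are using:

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

// Build a kernel with an Azure OpenAI chat completion service.
// Deployment name, endpoint, and key are placeholders.
var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "o1-mini",
        endpoint: "https://your-resource.openai.azure.com/",
        apiKey: "your-api-key")
    .Build();

// Resolve the chat completion service used in the snippet above.
var service = kernel.GetRequiredService<IChatCompletionService>();

var result = await service.GetChatMessageContentAsync("my prompt",
    new AzureOpenAIPromptExecutionSettings
    {
        SetNewMaxCompletionTokensEnabled = true,
        MaxTokens = 1000,
    });

Console.WriteLine(result.Content);
```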

Answer selected by aeras3637
Category
Q&A
Labels
.NET: Issues or pull requests regarding .NET code
ai connector: Anything related to AI connectors
3 participants