Error "Thread ... already has an active run" when using assistants with function calling and streaming #214

Open
rstropek opened this issue Sep 17, 2024 · 4 comments
Labels: bug (Something isn't working)


rstropek commented Sep 17, 2024

Confirm this is not an issue with the OpenAI Python Library

  • This is not an issue with the OpenAI Python Library

Confirm this is not an issue with the underlying OpenAI API

  • This is not an issue with the OpenAI API

Confirm this is not an issue with Azure OpenAI

  • This is not an issue with Azure OpenAI

Describe the bug

When using the C# OpenAI SDK with assistants, function calling, and streaming, an error is thrown when calling CreateRunStreamingAsync. I described the issue already in https://community.openai.com/t/error-400-already-has-an-active-run/930753/6?u=rainer1, but got no feedback. Other users are reporting this problem, too.

To Reproduce

I copied the corresponding example from this repository and ran it with GPT-4o, GPT-4o-2024-08-06, and GPT-4-turbo. All models show the same behavior.

Code snippets

Here is the code that I used (an exact copy of the example from the OpenAI repo, except that it is no longer a unit test):

using System.ClientModel;
using System.ClientModel.Primitives;
using System.Text.Json;
using dotenv.net;
using OpenAI.Assistants;
#pragma warning disable OPENAI001

var env = DotEnv.Read(options: new DotEnvOptions(probeForEnv: true, probeLevelsToSearch: 7));

// This example parallels the content at the following location:
// https://platform.openai.com/docs/assistants/tools/function-calling/function-calling-beta
#region Step 1 - Define Functions

// First, define the functions that the assistant will use in its defined tools.

FunctionToolDefinition getTemperatureTool = new()
{
    FunctionName = "get_current_temperature",
    Description = "Gets the current temperature at a specific location.",
    Parameters = BinaryData.FromString("""
    {
        "type": "object",
        "properties": {
        "location": {
            "type": "string",
            "description": "The city and state, e.g., San Francisco, CA"
        },
        "unit": {
            "type": "string",
            "enum": ["Celsius", "Fahrenheit"],
            "description": "The temperature unit to use. Infer this from the user's location."
        }
        }
    }
    """),
};

FunctionToolDefinition getRainProbabilityTool = new()
{
    FunctionName = "get_current_rain_probability",
    Description = "Gets the current forecasted probability of rain at a specific location,"
        + " represented as a percent chance in the range of 0 to 100.",
    Parameters = BinaryData.FromString("""
    {
        "type": "object",
        "properties": {
        "location": {
            "type": "string",
            "description": "The city and state, e.g., San Francisco, CA"
        }
        },
        "required": ["location"]
    }
    """),
};

#endregion

// Assistants is a beta API and subject to change; acknowledge its experimental status by suppressing the matching warning.
AssistantClient client = new(env["OPENAI_KEY"]!);

#region Create a new assistant with function tools
// Create an assistant that can call the function tools.
AssistantCreationOptions assistantOptions = new()
{
    Name = "Example: Function Calling",
    Instructions =
        "Don't make assumptions about what values to plug into functions."
        + " Ask for clarification if a user request is ambiguous.",
    Tools = { getTemperatureTool, getRainProbabilityTool },
};

Assistant assistant = await client.CreateAssistantAsync(env["OPENAI_MODEL"]!, assistantOptions);
#endregion

#region Step 2 - Create a thread and add messages
AssistantThread thread = await client.CreateThreadAsync();
ThreadMessage message = await client.CreateMessageAsync(
    thread,
    MessageRole.User,
    [
        "What's the weather in San Francisco today and the likelihood it'll rain?"
    ]);
#endregion

#region Step 3 - Initiate a streaming run
AsyncCollectionResult<StreamingUpdate> asyncUpdates
    = client.CreateRunStreamingAsync(thread, assistant);

ThreadRun? currentRun = null;
do
{
    currentRun = null;
    List<ToolOutput> outputsToSubmit = [];
    await foreach (StreamingUpdate update in asyncUpdates)
    {
        if (update is RunUpdate runUpdate)
        {
            currentRun = runUpdate;
        }
        else if (update is RequiredActionUpdate requiredActionUpdate)
        {
            if (requiredActionUpdate.FunctionName == getTemperatureTool.FunctionName)
            {
                outputsToSubmit.Add(new ToolOutput(requiredActionUpdate.ToolCallId, "57"));
            }
            else if (requiredActionUpdate.FunctionName == getRainProbabilityTool.FunctionName)
            {
                outputsToSubmit.Add(new ToolOutput(requiredActionUpdate.ToolCallId, "25%"));
            }
        }
        else if (update is MessageContentUpdate contentUpdate)
        {
            Console.Write(contentUpdate.Text);
        }
    }
    if (outputsToSubmit.Count > 0)
    {
        asyncUpdates = client.SubmitToolOutputsToRunStreamingAsync(currentRun, outputsToSubmit);
    }
}
while (currentRun?.Status.IsTerminal == false);

#endregion

// Optionally, delete the resources for tidiness if no longer needed.
RequestOptions noThrowOptions = new() { ErrorOptions = ClientErrorBehaviors.NoThrow };
_ = await client.DeleteThreadAsync(thread.Id, noThrowOptions);
_ = await client.DeleteAssistantAsync(assistant.Id, noThrowOptions);

OS

Ubuntu

.NET version

8

Library version

2.0.0-beta.11

rstropek added the bug label on Sep 17, 2024
rstropek added a commit to rstropek/microsoft-ai-day that referenced this issue Sep 17, 2024
@winston0410

Encountering the same issue, even using the example here
https://github.com/openai/openai-dotnet/blob/7a8bc8bebc831cf95707220e38a56b14297adf33/examples/Assistants/Example02b_FunctionCallingStreaming.cs

@rstropek Have you found a solution to this yet?

@rstropek (Author)

> Encountering the same issue, even using the example here https://github.com/openai/openai-dotnet/blob/7a8bc8bebc831cf95707220e38a56b14297adf33/examples/Assistants/Example02b_FunctionCallingStreaming.cs
>
> @rstropek Have you found a solution to this yet?

No, not yet. However, at https://community.openai.com/t/error-400-already-has-an-active-run/930753/10?u=rainer1, some people have posted workarounds and/or fixes that worked for them. Maybe one of them works for you.
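
For what it's worth, the restructuring I would try first is to make sure the AsyncCollectionResult returned by CreateRunStreamingAsync is never enumerated a second time, since each enumeration appears to re-send the create-run request against a thread that already has an active run. That is only an assumption on my part (I have not confirmed it against the SDK source), and the sketch below is not the actual fix that was merged later; it just reuses the variables from the repro code above to illustrate the idea:

// Sketch only: Step 3 restructured so the stream from CreateRunStreamingAsync
// is enumerated exactly once; the loop only continues when a *new* stream has
// been obtained from SubmitToolOutputsToRunStreamingAsync.
AsyncCollectionResult<StreamingUpdate> asyncUpdates
    = client.CreateRunStreamingAsync(thread, assistant);

ThreadRun? currentRun = null;
do
{
    currentRun = null;
    List<ToolOutput> outputsToSubmit = [];
    await foreach (StreamingUpdate update in asyncUpdates)
    {
        if (update is RunUpdate runUpdate)
        {
            currentRun = runUpdate;
        }
        else if (update is RequiredActionUpdate requiredActionUpdate)
        {
            if (requiredActionUpdate.FunctionName == getTemperatureTool.FunctionName)
            {
                outputsToSubmit.Add(new ToolOutput(requiredActionUpdate.ToolCallId, "57"));
            }
            else if (requiredActionUpdate.FunctionName == getRainProbabilityTool.FunctionName)
            {
                outputsToSubmit.Add(new ToolOutput(requiredActionUpdate.ToolCallId, "25%"));
            }
        }
        else if (update is MessageContentUpdate contentUpdate)
        {
            Console.Write(contentUpdate.Text);
        }
    }

    if (outputsToSubmit.Count > 0 && currentRun is not null)
    {
        // Continue with the fresh stream returned for the tool-output submission.
        asyncUpdates = client.SubmitToolOutputsToRunStreamingAsync(currentRun, outputsToSubmit);
    }
    else
    {
        // No outputs to submit means no new stream was obtained; stop here instead
        // of re-enumerating asyncUpdates, which would try to create a second run.
        break;
    }
}
while (currentRun?.Status.IsTerminal == false);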

@winston0410

At the moment, my workaround is to avoid the streaming run API (CreateRunStreamingAsync) and use the non-streaming async API instead. That works, but it is obviously not ideal for a chat application.
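
For reference, here is a minimal sketch of that non-streaming fallback, roughly following the polling pattern of the SDK's non-streaming function-calling example. It assumes the same client, thread, assistant, and tool definitions as in the repro code above; exact type and overload names may differ between beta releases, so treat it as an illustration rather than a drop-in replacement:

// Non-streaming fallback: create a run, poll its status, and submit tool
// outputs whenever the run requires action.
ThreadRun run = await client.CreateRunAsync(thread.Id, assistant.Id);

while (!run.Status.IsTerminal)
{
    await Task.Delay(TimeSpan.FromSeconds(1));
    run = await client.GetRunAsync(run.ThreadId, run.Id);

    if (run.Status == RunStatus.RequiresAction)
    {
        List<ToolOutput> outputs = [];
        foreach (RequiredAction action in run.RequiredActions)
        {
            if (action.FunctionName == getTemperatureTool.FunctionName)
            {
                outputs.Add(new ToolOutput(action.ToolCallId, "57"));
            }
            else if (action.FunctionName == getRainProbabilityTool.FunctionName)
            {
                outputs.Add(new ToolOutput(action.ToolCallId, "25%"));
            }
        }

        // Submitting the outputs moves the run out of the requires_action state.
        run = await client.SubmitToolOutputsToRunAsync(run.ThreadId, run.Id, outputs);
    }
}

Once the loop exits, the assistant's reply can be read back from the thread via the client's message-listing call, so you only lose the token-by-token output, not the result.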

@joseharriaga (Collaborator)

Thank you for reaching out, @rstropek! We identified a bug in our example and merged a fix. Could you try the latest version and see if it works for you?
🔗 https://github.com/openai/openai-dotnet/blob/main/examples/Assistants/Example02b_FunctionCallingStreaming.cs

joseharriaga self-assigned this on Oct 3, 2024