Streaming endpoint with C# client #2634
-
I am trying to consume an endpoint created in Azure ML Studio with a C# client, but the output is only sent back as a stream once the answer is fully generated. With a Python client consuming the same endpoint, the answer is read as it is being generated. Is C# streaming not currently supported? The Docker container hosting the endpoint is deployed with Prompt Flow 1.7.0.
-
Hi @CyrilMercier00, thanks for reaching out. Is the C# client the one you are using to consume the endpoint? We would expect the streaming capability not to depend on the client.
-
I think I am setting the header correctly to accept an `event-stream` in my HttpClient. Here is how I generate it:

```csharp
return new HttpClient(handler)
{
    Timeout = TimeSpan.FromSeconds(180),
    BaseAddress = new Uri(EndpointUrl),
    DefaultRequestHeaders =
    {
        Authorization = new AuthenticationHeaderValue("Bearer", EndpointApiKey),
        Accept = { new MediaTypeWithQualityHeaderValue("text/event-stream") }
    }
};
```

If you have the time to inspect the test code, I've uploaded it to a repository over here.
Hi @CyrilMercier00, you likely need a different interface to read the streaming response in C#. Could you please give it a try?
ref: link
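For reference, a minimal sketch of what such an interface could look like. The header setup above only tells the server to stream; on the client side, `HttpClient.PostAsync` and friends buffer the whole body by default, so the request must be sent with `HttpCompletionOption.ResponseHeadersRead` and the body read line by line. The `client`, `requestBody`, and `"score"` route below are placeholders, not from the original post:

```csharp
// Assumption: `client` is the HttpClient built above and `requestBody` is the JSON payload.
using var request = new HttpRequestMessage(HttpMethod.Post, "score")
{
    Content = new StringContent(requestBody, Encoding.UTF8, "application/json")
};

// ResponseHeadersRead is the key: by default SendAsync buffers the entire
// response body, which is why the answer only appears after generation finishes.
using var response = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
response.EnsureSuccessStatusCode();

await using var stream = await response.Content.ReadAsStreamAsync();
using var reader = new StreamReader(stream);

// An SSE stream delivers events as "data: ..." lines separated by blank lines.
while (!reader.EndOfStream)
{
    var line = await reader.ReadLineAsync();
    if (line != null && line.StartsWith("data:"))
    {
        Console.Write(line["data:".Length..].Trim());
    }
}
```

With this, each chunk is printed as the server emits it instead of after the full answer is generated.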