The ChatCompletionService.GetChatMessageContentAsync method should be extended #7403
Closed · takashiuesaka started this conversation in Ideas
-
Also, while using the ChatCompletionService and invoking GetStreamingChatMessageContentsAsync, I noticed that any filters you have registered (IFunctionInvocationFilter, IPromptRenderFilter, IAutoFunctionInvocationFilter) are not invoked. I have tested this: when invoking InvokePromptStreamingAsync on the Kernel class, registered filters are indeed triggered. But I would prefer to use the ChatCompletionService directly. Is this currently supported, or am I just configuring something wrong?
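A minimal sketch of the behavior described above might look like the following. The model ID, API key, and filter body are placeholders, and the setup assumes the OpenAI connector package; the point is that the filter fires on the Kernel path but not on the direct service path:

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Hypothetical filter that logs every prompt render.
public sealed class LoggingPromptFilter : IPromptRenderFilter
{
    public async Task OnPromptRenderAsync(
        PromptRenderContext context,
        Func<PromptRenderContext, Task> next)
    {
        Console.WriteLine("Prompt render filter invoked.");
        await next(context);
    }
}

public static class Program
{
    public static async Task Main()
    {
        var builder = Kernel.CreateBuilder();
        builder.AddOpenAIChatCompletion("gpt-4o", "<api-key>"); // placeholder credentials
        var kernel = builder.Build();
        kernel.PromptRenderFilters.Add(new LoggingPromptFilter());

        // Path 1: Kernel.InvokePromptStreamingAsync -- the filter fires.
        await foreach (var chunk in kernel.InvokePromptStreamingAsync("Hello"))
        {
            Console.Write(chunk);
        }

        // Path 2: calling the chat completion service directly -- the filter
        // is bypassed, because filters hook into the Kernel's own pipeline.
        var chat = kernel.GetRequiredService<IChatCompletionService>();
        var history = new ChatHistory();
        history.AddUserMessage("Hello");
        await foreach (var chunk in chat.GetStreamingChatMessageContentsAsync(history, kernel: kernel))
        {
            Console.Write(chunk);
        }
    }
}
```

This matches the observation in the comment: filters are wired into the Kernel, so invoking the service directly skips them even when a `Kernel` instance is passed along.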
-
In my opinion, when communicating with an LLM using Semantic Kernel, there are two main methods to use: ChatCompletionService.GetChatMessageContentAsync and Kernel.InvokePromptAsync. The first is meant for scenarios that use a ChatHistory, while the second is meant for scenarios that do not. However, GetChatMessageContentAsync also has an overload that accepts a plain string instead of a ChatHistory, which makes the two methods almost identical in usage. I believe this overlap is quite confusing, and in my view the current partial implementation of both methods is problematic.
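The overlap can be seen side by side in a short sketch (connector setup and credentials are placeholders):

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

var builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion("gpt-4o", "<api-key>"); // placeholder credentials
var kernel = builder.Build();
var chat = kernel.GetRequiredService<IChatCompletionService>();

// ChatHistory-based call: the intended use of the chat completion service.
var history = new ChatHistory();
history.AddUserMessage("What is Semantic Kernel?");
var reply1 = await chat.GetChatMessageContentAsync(history);

// String overload: almost identical in usage...
var reply2 = await chat.GetChatMessageContentAsync("What is Semantic Kernel?");

// ...to the Kernel-level prompt call, which also takes a string, but
// additionally supports KernelArguments and the prompt-render pipeline.
var reply3 = await kernel.InvokePromptAsync(
    "What is {{$topic}}?",
    new KernelArguments { ["topic"] = "Semantic Kernel" });
```

With the string overload in play, the second and third calls do nearly the same thing through two different entry points, which is the confusion being described.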
Issues with ChatCompletionService.GetChatMessageContentAsync: the string-prompt overload does not accept KernelArguments, and it does not invoke IPromptRenderFilter.
Issues with Kernel.InvokePromptAsync: it has no overload that accepts a ChatHistory, so it cannot drive a multi-turn conversation.
My suggestion is to either add an overload to ChatCompletionService.GetChatMessageContentAsync that accepts KernelArguments and ensure it calls IPromptRenderFilter, or plan to deprecate the overload that accepts a string as a prompt. This is because it would be unnatural to create an overload for Kernel.InvokePromptAsync that accepts ChatHistory, an object meant for chat completion.
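The first option suggested above might look something like this sketch. This is a hypothetical signature, not an existing Semantic Kernel API; the extension-class name and parameter names are assumptions, and the body simply delegates to the Kernel's prompt pipeline so that prompt-render filters run:

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Hypothetical extension method illustrating the suggested overload:
// a string prompt plus KernelArguments, rendered through the Kernel so
// that IPromptRenderFilter implementations fire before the LLM call.
public static class ChatCompletionServiceProposedExtensions
{
    public static async Task<ChatMessageContent> GetChatMessageContentAsync(
        this IChatCompletionService service,
        Kernel kernel,
        string promptTemplate,
        KernelArguments arguments,
        CancellationToken cancellationToken = default)
    {
        // Sketch only: reusing Kernel.InvokePromptAsync is the simplest way
        // to get template rendering and filters, though a real implementation
        // would presumably call the service instance it was invoked on.
        var result = await kernel.InvokePromptAsync(
            promptTemplate, arguments, cancellationToken: cancellationToken);

        return new ChatMessageContent(AuthorRole.Assistant, result.ToString());
    }
}
```

Such an overload would give the service-level API parity with the Kernel-level one for templated prompts, while leaving the ChatHistory overloads as the chat-oriented path.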