Commit

Merge branch 'main' into feature-connectors-anthropic
RogerBarreto authored Sep 16, 2024
2 parents 30b503a + 77aa4e3 commit 2f47d9b
Showing 10 changed files with 73 additions and 63 deletions.
4 changes: 2 additions & 2 deletions dotnet/Directory.Packages.props
@@ -6,10 +6,10 @@
</PropertyGroup>
<ItemGroup>
<PackageVersion Include="Azure.AI.Inference" Version="1.0.0-beta.1" />
<PackageVersion Include="OpenAI" Version="2.0.0-beta.10" />
<PackageVersion Include="OpenAI" Version="2.0.0-beta.11" />
<PackageVersion Include="System.ClientModel" Version="1.1.0-beta.7" />
<PackageVersion Include="Azure.AI.ContentSafety" Version="1.0.0" />
<PackageVersion Include="Azure.AI.OpenAI" Version="2.0.0-beta.4" />
<PackageVersion Include="Azure.AI.OpenAI" Version="2.0.0-beta.5" />
<PackageVersion Include="Azure.Identity" Version="1.12.0" />
<PackageVersion Include="Azure.Monitor.OpenTelemetry.Exporter" Version="1.3.0" />
<PackageVersion Include="Azure.Search.Documents" Version="11.6.0" />
54 changes: 37 additions & 17 deletions dotnet/docs/OPENAI-CONNECTOR-MIGRATION.md
@@ -53,35 +53,55 @@ tags: `ResultsPerPrompt`,`results_per_prompt`

The `OpenAIFileService` was deprecated in the latest version of the OpenAI Connector. We strongly recommend updating your code to use the new `OpenAIClient.GetFileClient()` for file management operations.
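A minimal upload sketch follows, assuming the `OpenAI` 2.0.0-beta surface (where `GetFileClient()` returns a `FileClient`); the file name and purpose are placeholders:

```csharp
using System.ClientModel;
using OpenAI;
using OpenAI.Files;

OpenAIClient client = new(new ApiKeyCredential("your-api-key"));
FileClient fileClient = client.GetFileClient();

// Upload a local file; the purpose depends on your scenario.
ClientResult<OpenAIFileInfo> upload = await fileClient.UploadFileAsync("data.jsonl", FileUploadPurpose.Assistants);
Console.WriteLine(upload.Value.Id);
```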

-## 5. SemanticKernel MetaPackage
+## 5. OpenAI ChatCompletion custom endpoint

The `OpenAIChatCompletionService` **experimental** constructor for custom endpoints no longer attempts to auto-correct the endpoint; the endpoint is used exactly as provided.

Previously, there were only two specific cases where we attempted to auto-correct the endpoint; both are listed below, followed by a usage sketch.

1. If you previously provided a `chat/completions` path, it now needs to be removed, as the `OpenAI SDK` automatically appends it to the end of your original endpoint.

```diff
- http://any-host-and-port/v1/chat/completions
+ http://any-host-and-port/v1
```

2. If you provided a custom endpoint without any path, we no longer add `v1/` as the first path segment; the `v1` path now needs to be provided as part of your endpoint.

```diff
- http://any-host-and-port/
+ http://any-host-and-port/v1
```
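Putting both rules together, a minimal sketch of the new behavior, assuming the experimental custom-endpoint constructor (you may need to suppress `SKEXP0010`) and a placeholder local host and model name:

```csharp
// The endpoint is now used exactly as provided; the OpenAI SDK appends
// "chat/completions" itself, so include the "v1" segment when your server expects it.
var chatService = new OpenAIChatCompletionService(
    modelId: "my-local-model",
    endpoint: new Uri("http://localhost:1234/v1"),
    apiKey: "unused-by-most-local-servers");
```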

## 6. SemanticKernel MetaPackage

To remain backward compatible with the new OpenAI and AzureOpenAI Connectors, our `Microsoft.SemanticKernel` metapackage changed its dependency to the new `Microsoft.SemanticKernel.Connectors.AzureOpenAI` package, which in turn depends on the `Microsoft.SemanticKernel.Connectors.OpenAI` package. This way, if you are using the metapackage, no change is needed to get access to `Azure`-related types.

-## 6. Contents
+## 7. Contents

-### 6.1 OpenAIChatMessageContent
+### 7.1 OpenAIChatMessageContent

- The `Tools` property type has changed from `IReadOnlyList<ChatCompletionsToolCall>` to `IReadOnlyList<ChatToolCall>`.

- Inner content type has changed from `ChatCompletionsFunctionToolCall` to `ChatToolCall`.

- Metadata type `FunctionToolCalls` has changed from `IEnumerable<ChatCompletionsFunctionToolCall>` to `IEnumerable<ChatToolCall>` (see the sketch below).
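A hedged sketch of reading tool calls with the new types, assuming `chatMessageContent` is a result returned by the connector (property names follow the `OpenAI` 2.0.0-beta surface):

```csharp
using OpenAI.Chat;

if (chatMessageContent is OpenAIChatMessageContent openAIContent)
{
    foreach (ChatToolCall toolCall in openAIContent.ToolCalls)
    {
        // FunctionName/FunctionArguments are the beta SDK's property names.
        Console.WriteLine($"{toolCall.FunctionName}({toolCall.FunctionArguments})");
    }
}
```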

-### 6.2 OpenAIStreamingChatMessageContent
+### 7.2 OpenAIStreamingChatMessageContent

- The `FinishReason` property type has changed from `CompletionsFinishReason` to `FinishReason`.
- The `ToolCallUpdate` property has been renamed to `ToolCallUpdates`, and its type has changed from `StreamingToolCallUpdate?` to `IReadOnlyList<StreamingToolCallUpdate>?` (see the sketch below).
- The `AuthorName` property is not initialized because the underlying library no longer provides it.
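A hedged streaming sketch, assuming a populated `chatHistory` and an `IChatCompletionService` named `chatService`; the `StreamingChatToolCallUpdate` property names follow the beta SDK:

```csharp
await foreach (var update in chatService.GetStreamingChatMessageContentsAsync(chatHistory))
{
    if (update is OpenAIStreamingChatMessageContent { ToolCallUpdates: { } toolCallUpdates })
    {
        foreach (var toolCallUpdate in toolCallUpdates)
        {
            Console.WriteLine($"[{toolCallUpdate.Index}] {toolCallUpdate.FunctionName}");
        }
    }
}
```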

-## 6.3 Metrics for AzureOpenAI Connector
+## 7.3 Metrics for AzureOpenAI Connector

The meter `s_meter = new("Microsoft.SemanticKernel.Connectors.OpenAI");` and the relevant counters still have old names that contain "openai" in them, even for the AzureOpenAI Connector, such as the following (a listener sketch appears after the list):

- `semantic_kernel.connectors.openai.tokens.prompt`
- `semantic_kernel.connectors.openai.tokens.completion`
- `semantic_kernel.connectors.openai.tokens.total`
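A minimal sketch of observing these counters with `System.Diagnostics.Metrics`; the callback and filtering are illustrative:

```csharp
using System.Diagnostics.Metrics;

using var listener = new MeterListener();
listener.InstrumentPublished = (instrument, l) =>
{
    // The meter name still says "OpenAI" even for the AzureOpenAI Connector.
    if (instrument.Meter.Name == "Microsoft.SemanticKernel.Connectors.OpenAI")
    {
        l.EnableMeasurementEvents(instrument);
    }
};
listener.SetMeasurementEventCallback<int>((instrument, value, tags, state) =>
    Console.WriteLine($"{instrument.Name}: {value}"));
listener.Start();
```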

-## 7. Using Azure with your data (Data Sources)
+## 8. Using Azure with your data (Data Sources)

With the new `AzureOpenAIClient`, you can now specify your data source through the options, which requires a small change in your code to the new type.

@@ -116,41 +136,41 @@ var promptExecutionSettings = new AzureOpenAIPromptExecutionSettings
};
```
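For reference, a hedged sketch of wiring an Azure AI Search data source, assuming the `Azure.AI.OpenAI` beta's `AzureSearchChatDataSource` type and placeholder resource names (the SDK marks data sources experimental via `AOAI001`):

```csharp
#pragma warning disable AOAI001 // Data sources are experimental in the Azure.AI.OpenAI beta.
var settings = new AzureOpenAIPromptExecutionSettings
{
    AzureChatDataSource = new AzureSearchChatDataSource
    {
        Endpoint = new Uri("https://my-search.search.windows.net"),
        Authentication = DataSourceAuthentication.FromApiKey("my-search-key"),
        IndexName = "my-index",
    },
};
#pragma warning restore AOAI001
```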

-## 8. Breaking glass scenarios
+## 9. Breaking glass scenarios

Breaking glass scenarios are situations where you may need to update your code to work with the new OpenAI Connector. Below are some of the breaking changes you may need to be aware of.

-#### 8.1 KernelContent Metadata
+#### 9.1 KernelContent Metadata

Some of the keys in the content metadata dictionary have changed; you will need to update your code if you were using the previous key names:

- `Created` -> `CreatedAt`
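A before/after sketch, assuming `content` is a `KernelContent` returned by the connector:

```diff
- var created = content.Metadata?["Created"];
+ var created = content.Metadata?["CreatedAt"];
```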

-#### 8.2 Prompt Filter Results
+#### 9.2 Prompt Filter Results

The `PromptFilterResults` metadata type has changed from `IReadOnlyList<ContentFilterResultsForPrompt>` to `ContentFilterResultForPrompt`.

-#### 8.3 Content Filter Results
+#### 9.3 Content Filter Results

The `ContentFilterResultsForPrompt` type has changed from `ContentFilterResultsForChoice` to `ContentFilterResultForResponse`.

-#### 8.4 Finish Reason
+#### 9.4 Finish Reason

The `FinishReason` metadata string value has changed from `stop` to `Stop`.
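A before/after sketch, assuming you branch on the metadata value:

```diff
- if (content.Metadata?["FinishReason"] as string == "stop") { /* done */ }
+ if (content.Metadata?["FinishReason"] as string == "Stop") { /* done */ }
```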

-#### 8.5 Tool Calls
+#### 9.5 Tool Calls

The `ToolCalls` metadata string value has changed from `tool_calls` to `ToolCalls`.
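A before/after sketch, assuming the value is read from the same `FinishReason` metadata entry:

```diff
- if (finishReason == "tool_calls") { /* invoke the requested functions */ }
+ if (finishReason == "ToolCalls") { /* invoke the requested functions */ }
```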

-#### 8.6 LogProbs / Log Probability Info
+#### 9.6 LogProbs / Log Probability Info

The `LogProbabilityInfo` type has changed from `ChatChoiceLogProbabilityInfo` to `IReadOnlyList<ChatTokenLogProbabilityInfo>`.
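A hedged sketch of consuming the new shape; the type and property names here follow the `OpenAI` beta SDK and may shift in later versions:

```csharp
using OpenAI.Chat;

if (content.Metadata?["LogProbabilityInfo"] is IReadOnlyList<ChatTokenLogProbabilityInfo> logProbs)
{
    foreach (var tokenInfo in logProbs)
    {
        Console.WriteLine($"{tokenInfo.Token}: {tokenInfo.LogProbability}");
    }
}
```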

-#### 8.7 Finish Details, Index, and Enhancements
+#### 9.7 Finish Details, Index, and Enhancements

All of the above have been removed.

-#### 8.8 Token Usage
+#### 9.8 Token Usage

The token usage naming convention from `OpenAI` changed from `Completion` and `Prompt` tokens to `Output` and `Input`, respectively. You will need to update your code to use the new naming.

@@ -172,13 +192,13 @@ The type also changed from `CompletionsUsage` to `ChatTokenUsage`.
totalTokens: usage?.TotalTokens ?? 0;
```
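A hedged before/after of reading the counts; the `ChatTokenUsage` property names here follow the `OpenAI` beta surface:

```diff
- var promptTokens = usage?.PromptTokens ?? 0;
- var completionTokens = usage?.CompletionTokens ?? 0;
+ var inputTokens = usage?.InputTokens ?? 0;
+ var outputTokens = usage?.OutputTokens ?? 0;
```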

-#### 8.9 OpenAIClient
+#### 9.9 OpenAIClient

The `OpenAIClient` type was previously an Azure-specific namespace type; it is now an `OpenAI` SDK namespace type. You will need to update your code to use the new `OpenAIClient` type.

When using Azure, you will need to update your code to use the new `AzureOpenAIClient` type.
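A minimal construction sketch, assuming the `Azure.AI.OpenAI` 2.x beta surface; the endpoint and key are placeholders:

```csharp
using System.ClientModel;
using Azure.AI.OpenAI;

AzureOpenAIClient azureClient = new(
    new Uri("https://my-resource.openai.azure.com"),
    new ApiKeyCredential("api-key"));
```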

-#### 8.10 Pipeline Configuration
+#### 9.10 Pipeline Configuration

The new `OpenAI` SDK uses a different pipeline configuration and depends on the `System.ClientModel` package. You will need to update your code to use the new `HttpClientPipelineTransport` transport configuration where you previously used `HttpClientTransport` from `Azure.Core.Pipeline`.
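A hedged sketch of the new transport configuration, assuming you previously routed requests through a custom `HttpClient`:

```csharp
using System.ClientModel;
using System.ClientModel.Primitives;
using OpenAI;

HttpClient httpClient = new();
OpenAIClientOptions options = new()
{
    // System.ClientModel's transport replaces Azure.Core.Pipeline.HttpClientTransport.
    Transport = new HttpClientPipelineTransport(httpClient),
};
OpenAIClient client = new(new ApiKeyCredential("api-key"), options);
```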

@@ -29,7 +29,7 @@ public class ChatHistorySummarizationReducer : IChatHistoryReducer
/// </summary>
public const string DefaultSummarizationPrompt =
"""
-    Provide a concise and complete summarizion of the entire dialog that does not exceed 5 sentences
+    Provide a concise and complete summarization of the entire dialog that does not exceed 5 sentences

This summary must always:
- Consider both user and assistant interactions
@@ -43,7 +43,9 @@ protected override ChatCompletionOptions CreateChatCompletionOptions(
TopP = (float?)executionSettings.TopP,
FrequencyPenalty = (float?)executionSettings.FrequencyPenalty,
PresencePenalty = (float?)executionSettings.PresencePenalty,
+#pragma warning disable OPENAI001 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed.
    Seed = executionSettings.Seed,
+#pragma warning restore OPENAI001 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed.
EndUserId = executionSettings.User,
TopLogProbabilityCount = executionSettings.TopLogprobs,
IncludeLogProbabilities = executionSettings.Logprobs,
@@ -67,9 +67,9 @@ public void ItUsesEndpointAsExpected(string? clientBaseAddress, string? provided
var clientCore = new ClientCore("model", "apiKey", endpoint: endpoint, httpClient: client);

// Assert
Assert.Equal(endpoint ?? client?.BaseAddress ?? new Uri("https://api.openai.com/"), clientCore.Endpoint);
Assert.Equal(endpoint ?? client?.BaseAddress ?? new Uri("https://api.openai.com/v1"), clientCore.Endpoint);
Assert.True(clientCore.Attributes.ContainsKey(AIServiceExtensions.EndpointKey));
-    Assert.Equal(endpoint?.ToString() ?? client?.BaseAddress?.ToString() ?? "https://api.openai.com/", clientCore.Attributes[AIServiceExtensions.EndpointKey]);
+    Assert.Equal(endpoint?.ToString() ?? client?.BaseAddress?.ToString() ?? "https://api.openai.com/v1", clientCore.Attributes[AIServiceExtensions.EndpointKey]);

client?.Dispose();
}
@@ -76,10 +76,11 @@ public void ConstructorWithApiKeyWorksCorrectly(bool includeLoggerFactory)
}

[Theory]
[InlineData("http://localhost:1234/v1/chat/completions", "http://localhost:1234/v1/chat/completions")] // Uses full path when provided
[InlineData("http://localhost:1234/", "http://localhost:1234/v1/chat/completions")]
[InlineData("http://localhost:8080", "http://localhost:8080/v1/chat/completions")]
[InlineData("https://something:8080", "https://something:8080/v1/chat/completions")] // Accepts TLS Secured endpoints
[InlineData("http://localhost:1234", "http://localhost:1234/chat/completions")]
[InlineData("http://localhost:8080", "http://localhost:8080/chat/completions")]
[InlineData("https://something:8080", "https://something:8080/chat/completions")] // Accepts TLS Secured endpoints
[InlineData("http://localhost:1234/v2", "http://localhost:1234/v2/chat/completions")]
[InlineData("http://localhost:8080/v2", "http://localhost:8080/v2/chat/completions")]
public async Task ItUsesCustomEndpointsWhenProvidedDirectlyAsync(string endpointProvided, string expectedEndpoint)
{
// Arrange
@@ -95,10 +96,11 @@ public async Task ItUsesCustomEndpointsWhenProvidedDirectlyAsync(string endpoint
}

[Theory]
[InlineData("http://localhost:1234/v1/chat/completions", "http://localhost:1234/v1/chat/completions")] // Uses full path when provided
[InlineData("http://localhost:1234/", "http://localhost:1234/v1/chat/completions")]
[InlineData("http://localhost:8080", "http://localhost:8080/v1/chat/completions")]
[InlineData("https://something:8080", "https://something:8080/v1/chat/completions")] // Accepts TLS Secured endpoints
[InlineData("http://localhost:1234", "http://localhost:1234/chat/completions")]
[InlineData("http://localhost:8080", "http://localhost:8080/chat/completions")]
[InlineData("https://something:8080", "https://something:8080/chat/completions")] // Accepts TLS Secured endpoints
[InlineData("http://localhost:1234/v2", "http://localhost:1234/v2/chat/completions")]
[InlineData("http://localhost:8080/v2", "http://localhost:8080/v2/chat/completions")]
public async Task ItUsesCustomEndpointsWhenProvidedAsBaseAddressAsync(string endpointProvided, string expectedEndpoint)
{
// Arrange
@@ -127,7 +129,7 @@ public async Task ItUsesHttpClientEndpointIfProvidedEndpointIsMissingAsync()
await chatCompletion.GetChatMessageContentsAsync(this._chatHistoryForTest, this._executionSettings);

// Assert
Assert.Equal("http://localhost:12312/v1/chat/completions", this._messageHandlerStub.RequestUri!.ToString());
Assert.Equal("http://localhost:12312/chat/completions", this._messageHandlerStub.RequestUri!.ToString());
}

[Fact]
@@ -650,7 +650,9 @@ protected virtual ChatCompletionOptions CreateChatCompletionOptions(
TopP = (float?)executionSettings.TopP,
FrequencyPenalty = (float?)executionSettings.FrequencyPenalty,
PresencePenalty = (float?)executionSettings.PresencePenalty,
+#pragma warning disable OPENAI001 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed.
    Seed = executionSettings.Seed,
+#pragma warning restore OPENAI001 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed.
EndUserId = executionSettings.User,
TopLogProbabilityCount = executionSettings.TopLogprobs,
IncludeLogProbabilities = executionSettings.Logprobs,
@@ -37,7 +37,7 @@ internal async Task<IReadOnlyList<AudioContent>> GetAudioContentsAsync(
SpeechGenerationOptions options = new()
{
ResponseFormat = responseFormat,
-        Speed = audioExecutionSettings.Speed,
+        SpeedRatio = audioExecutionSettings.Speed,
};

ClientResult<BinaryData> response = await RunRequestAsync(() => this.Client!.GetAudioClient(targetModel).GenerateSpeechAsync(prompt, GetGeneratedSpeechVoice(audioExecutionSettings?.Voice), options, cancellationToken)).ConfigureAwait(false);
@@ -58,15 +58,17 @@ private static GeneratedSpeechVoice GetGeneratedSpeechVoice(string? voice)
};

    private static (GeneratedSpeechFormat? Format, string? MimeType) GetGeneratedSpeechFormatAndMimeType(string? format)
-        => format?.ToUpperInvariant() switch
-        {
-            "WAV" => (GeneratedSpeechFormat.Wav, "audio/wav"),
-            "MP3" => (GeneratedSpeechFormat.Mp3, "audio/mpeg"),
-            "OPUS" => (GeneratedSpeechFormat.Opus, "audio/opus"),
-            "FLAC" => (GeneratedSpeechFormat.Flac, "audio/flac"),
-            "AAC" => (GeneratedSpeechFormat.Aac, "audio/aac"),
-            "PCM" => (GeneratedSpeechFormat.Pcm, "audio/l16"),
-            null => (null, null),
-            _ => throw new NotSupportedException($"The format '{format}' is not supported.")
-        };
+    {
+        switch (format?.ToUpperInvariant())
+        {
+            case "WAV": return (GeneratedSpeechFormat.Wav, "audio/wav");
+            case "MP3": return (GeneratedSpeechFormat.Mp3, "audio/mpeg");
+            case "OPUS": return (GeneratedSpeechFormat.Opus, "audio/opus");
+            case "FLAC": return (GeneratedSpeechFormat.Flac, "audio/flac");
+            case "AAC": return (GeneratedSpeechFormat.Aac, "audio/aac");
+            case "PCM": return (GeneratedSpeechFormat.Pcm, "audio/l16");
+            case null: return (null, null);
+            default: throw new NotSupportedException($"The format '{format}' is not supported.");
+        }
+    }
4 changes: 2 additions & 2 deletions dotnet/src/Connectors/Connectors.OpenAI/Core/ClientCore.cs
@@ -36,7 +36,7 @@ internal partial class ClientCore
/// <summary>
/// Default OpenAI API endpoint.
/// </summary>
-    private const string OpenAIEndpoint = "https://api.openai.com/";
+    private const string OpenAIV1Endpoint = "https://api.openai.com/v1";

/// <summary>
/// Identifier of the default model to use
@@ -104,7 +104,7 @@ internal ClientCore(
if (this.Endpoint is null)
{
Verify.NotNullOrWhiteSpace(apiKey); // For Public OpenAI Endpoint a key must be provided.
-        this.Endpoint = new Uri(OpenAIEndpoint);
+        this.Endpoint = new Uri(OpenAIV1Endpoint);
}
else if (string.IsNullOrEmpty(apiKey))
{
@@ -67,29 +67,11 @@ public OpenAIChatCompletionService(
HttpClient? httpClient = null,
ILoggerFactory? loggerFactory = null)
{
-        Uri? internalClientEndpoint = null;
-        var providedEndpoint = endpoint ?? httpClient?.BaseAddress;
-        if (providedEndpoint is not null)
-        {
-            // As OpenAI Client automatically adds the chat completions endpoint, we remove it to avoid duplication.
-            const string PathAndQueryPattern = "v1/chat/completions";
-            var providedEndpointText = providedEndpoint.ToString();
-            int index = providedEndpointText.IndexOf(PathAndQueryPattern, StringComparison.OrdinalIgnoreCase);
-            if (index >= 0)
-            {
-                internalClientEndpoint = new Uri($"{providedEndpointText.Substring(0, index)}{providedEndpointText.Substring(index + PathAndQueryPattern.Length)}");
-            }
-            else
-            {
-                internalClientEndpoint = providedEndpoint;
-            }
-        }
-
        this._client = new(
            modelId,
            apiKey,
            organization,
-            internalClientEndpoint,
+            endpoint ?? httpClient?.BaseAddress,
            httpClient,
            loggerFactory?.CreateLogger(typeof(OpenAIChatCompletionService)));
}
