README.md (+111, -1)
@@ -13,6 +13,7 @@ Anthropic.SDK is an unofficial C# client designed for interacting with the Claud
 -[Examples](#examples)
 -[Non-Streaming Call](#non-streaming-call)
 -[Streaming Call](#streaming-call)
+-[IChatClient](#ichatclient)
 -[Prompt Caching](#prompt-caching)
 -[Batching](#batching)
 -[Tools](#tools)
@@ -37,7 +38,24 @@ The `AnthropicClient` can optionally take a custom `HttpClient` in the `Anthropi
 ## Usage
-To start using the Claude AI API, simply create an instance of the `AnthropicClient` class.
+There are two ways to start using the `AnthropicClient`. The first is to simply new up an instance of the `AnthropicClient` and start using it; the second is to use the messaging client with the new `Microsoft.Extensions.AI.Abstractions` builder.
+Brief examples of each are below.
+
+Option 1:
+
+```csharp
+var client = new AnthropicClient();
+```
+
+Option 2:
+
+```csharp
+IChatClient client = new ChatClientBuilder()
+    .UseFunctionInvocation() // optional
+    .Use(new AnthropicClient().Messages);
+```
+
+Both support all the core features of the `AnthropicClient`'s messaging and tooling capabilities, but the latter will be fully featured in .NET 9, provides built-in telemetry and dependency injection, and makes it easier to choose which SDK you are using.
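For orientation (not part of the diff itself), a minimal non-streaming call with the directly constructed client from Option 1 might look like the sketch below; the `MessageParameters`, `Message`, `RoleType`, `AnthropicModels`, and `GetClaudeMessageAsync` names are assumptions about the SDK's message API rather than text from this commit.

```csharp
// Sketch only, not copied from the diff: assumes the SDK's message API exposes
// MessageParameters, Message, RoleType and Messages.GetClaudeMessageAsync, and that
// this runs as a top-level-statements program with implicit usings enabled.
using Anthropic.SDK;
using Anthropic.SDK.Constants;
using Anthropic.SDK.Messaging;

var client = new AnthropicClient(); // picks up ANTHROPIC_API_KEY from the environment

var parameters = new MessageParameters
{
    Messages = new List<Message>
    {
        new Message(RoleType.User, "Write me a haiku about the C# language.")
    },
    Model = AnthropicModels.Claude35Sonnet, // assumed constant name
    MaxTokens = 512,
    Stream = false
};

var response = await client.Messages.GetClaudeMessageAsync(parameters);
Console.WriteLine(response.Message); // convenience accessor for the reply text (assumed)
```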
The `AnthropicClient` has support for the new `IChatClient` from Microsoft, which offers a slightly different mechanism for using the client. Below are a few examples.
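Those `IChatClient` examples sit outside this excerpt of the diff. As a rough sketch of what a call might look like with the builder shown above, assuming the preview `Microsoft.Extensions.AI.Abstractions` surface that this builder syntax belongs to (the `CompleteAsync`, `ChatMessage`, and `ChatOptions` members are assumptions and may be named differently in later releases):

```csharp
// Sketch only: the builder lines mirror the diff above; the CompleteAsync call and
// ChatOptions members follow the Microsoft.Extensions.AI.Abstractions preview and
// may differ in later versions of the abstractions package.
using Anthropic.SDK;
using Microsoft.Extensions.AI;

IChatClient client = new ChatClientBuilder()
    .UseFunctionInvocation() // optional: enables automatic tool invocation
    .Use(new AnthropicClient().Messages);

var completion = await client.CompleteAsync(
    new List<ChatMessage> { new(ChatRole.User, "What color is the sky?") },
    new ChatOptions { ModelId = "claude-3-5-sonnet-20241022", MaxOutputTokens = 512 });

Console.WriteLine(completion.Message.Text);
```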
The `AnthropicClient` supports prompt caching of system messages, user messages (including images), assistant messages, tool_results, and tools, in accordance with model limitations. Because the `AnthropicClient` does not have its own tokenizer, you must ensure that when enabling prompt caching you provide enough context to the qualifying model for it to cache; otherwise nothing will be cached. Check out the documentation on Anthropic's website for specific model limitations and requirements.
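A loose sketch of what enabling prompt caching might look like follows; the `System`/`SystemMessage`, `PromptCaching`, and `PromptCacheType` names are assumptions about the SDK surface, not taken from this commit, so verify the actual members against the SDK documentation.

```csharp
// Loose sketch: the System/SystemMessage, PromptCaching and PromptCacheType names are
// assumptions about the SDK surface, not taken from this excerpt; check the docs.
using Anthropic.SDK;
using Anthropic.SDK.Constants;
using Anthropic.SDK.Messaging;

var client = new AnthropicClient();

// The system prompt must exceed the model's minimum cacheable token count,
// otherwise the API silently skips caching.
string bigSystemPrompt = File.ReadAllText("large-context.txt"); // placeholder path

var parameters = new MessageParameters
{
    System = new List<SystemMessage> { new SystemMessage(bigSystemPrompt) },
    Messages = new List<Message> { new Message(RoleType.User, "Summarize the context above.") },
    Model = AnthropicModels.Claude35Sonnet, // assumed constant name
    MaxTokens = 1024,
    PromptCaching = PromptCacheType.AutomaticToolsAndSystem // assumed enum: cache system prompt + tools
};

var response = await client.Messages.GetClaudeMessageAsync(parameters);
Console.WriteLine(response.Message);
```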