Sourced from OllamaSharp's releases.
Release 3.0.8
- Allow `SendToOllamaAsync()` to be overridden #87 to support streaming responses in Blazor WASM #84
- Update `RequestOptions` #88 with new properties `MainGpu`, `UseMmap`, `LowVram`, and more.

Release 3.0.7
- Exposed `RequestOptions` (to set the temperature, etc.) when using the `Chat` class #77

Release 3.0.6
- Added support for default request headers to the `OllamaApiClient` and custom request headers to requests against the Ollama API.

Release 3.0.5
- Added `OllamaApiClient.IncomingJsonSerializerOptions` to control how responses from Ollama are deserialized #78

Release 3.0.4
- Removed the OllamaApiConsole demo, which now lives in its own repository, to reduce package updates
- Added `OllamaApiClient.OutgoingJsonSerializerOptions` to control how requests to Ollama are serialized #75

Release 3.0.3
No release notes provided.
Release 3.0.2
- Improved tool chat demos

Commits

- 79b7fff Merge pull request #89 from awaescher/milimerge
- 9f7d340 Update RequestOptions.cs
- 04e2ac0 Merge pull request #87 from milbk/overridable
- 331d4d4 Merge pull request #88 from milbk/main
- ebe5bb3 Update RequestOptions
- 564efc7 Update RequestOptions
- b404041 Update RequestOptions
- e68fa56 Make SendToOllamaAsync be overridable
- d0bc647 Merge pull request #77 from Plootie/main
- 7653519 Add Summary
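
Taken together, the serializer hooks (3.0.4/3.0.5) and the exposed `RequestOptions` (3.0.7/3.0.8) can be sketched as below. This is a hedged C# illustration, not verbatim library code: `OllamaApiClient`, `Chat`, `RequestOptions`, `IncomingJsonSerializerOptions`, and `OutgoingJsonSerializerOptions` are names taken from the notes above, but the `Options` property on `Chat`, the value types of `MainGpu`/`UseMmap`/`LowVram`, and the streaming `SendAsync` shape are assumptions about the API.

```csharp
using System;
using System.Text.Json;
using OllamaSharp;
using OllamaSharp.Models;

var client = new OllamaApiClient(new Uri("http://localhost:11434"))
{
    SelectedModel = "llama3"
};

// 3.0.4 / 3.0.5: tweak how requests are serialized and responses deserialized
// (mutating the default options here is illustrative)
client.OutgoingJsonSerializerOptions.WriteIndented = false;
client.IncomingJsonSerializerOptions.AllowTrailingCommas = true;

// 3.0.7 / 3.0.8: per-chat RequestOptions, including the new GPU/memory knobs
// (the "Options" property name and the value types are assumptions)
var chat = new Chat(client)
{
    Options = new RequestOptions
    {
        Temperature = 0.2f,
        MainGpu = 0,     // assumed int: index of the primary GPU
        UseMmap = true,  // assumed bool: memory-map the model file
        LowVram = false  // assumed bool: low-VRAM mode
    }
};

// Stream the answer token by token (requires a running Ollama server)
await foreach (var token in chat.SendAsync("Hello!"))
    Console.Write(token);
```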