[Breaking change]: Ollama integration updates #2009
Labels
- ⛓️💥 breaking-change: Issues or PRs tracking breaking changes.
- doc-idea: Indicates issues that are suggestions for new topics [org][type][category]
- documentation: Improvements or additions to documentation
- in-pr
- okr-freshness: OKR: Freshness of content
- Pri1: High priority, do before Pri2 and Pri3
- 📌 seQUESTered: Identifies that an issue has been imported into Quest.
- 📦 release-9.0: Used to track doc updates for release 9.0 of .NET Aspire.
Description
For the Aspire 9 release of the Ollama Community Toolkit integrations, a large overhaul is coming (see CommunityToolkit/Aspire#194). The docs here need to be updated to reflect those changes.
Version
.NET Aspire 9.0 GA
Previous behavior
Ollama hosting
The Ollama hosting resource had to be passed as the reference to other resources, and it provided model information as a set of environment variables.
The connection string was only the HTTP endpoint, not a real "connection string" format.
OllamaSharp client
The client integration supported v3 of the OllamaSharp library, which doesn't implement the Microsoft.Extensions.AI (MEAI) interfaces.
New behavior
Models as resources
Originally, models were just appended to the Ollama resource and you passed the Ollama resource itself as a reference, which led to workarounds for setting (and discovering) the default model to use. In the v9 release there is an `OllamaModelResource` that is returned and can be passed as a reference, which then provides clients with connection information on which model to use.

New connection string format

Originally, the "connection string" from an Ollama resource was just the HTTP endpoint, but to better support the model-as-resource feature, the resources now create a "real" connection string of the form `Endpoint=<...>;Model=<...>`. The `Model` part is only included when you pass an `OllamaModelResource`, though. This is a breaking change that you may need to handle if you're not using the OllamaSharp integration.
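The hosting-side changes above might look like this in an app host (a minimal sketch; the resource and model names are illustrative, and the exact `AddOllama`/`AddModel` overloads should be checked against the CommunityToolkit.Aspire.Hosting.Ollama v9 docs):

```csharp
// App host: the model is now a first-class resource.
var builder = DistributedApplication.CreateBuilder(args);

var ollama = builder.AddOllama("ollama");

// AddModel returns an OllamaModelResource instead of just
// appending the model to the parent Ollama resource.
var llama = ollama.AddModel("llama3");

// Referencing the model resource gives the consuming project a
// connection string of the form Endpoint=<...>;Model=<...>
// rather than a bare HTTP endpoint.
builder.AddProject<Projects.MyApp>("myapp")
       .WithReference(llama);

builder.Build().Run();
```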
OllamaSharp 4 + Microsoft.Extensions.AI

OllamaSharp has been updated to a new major version and now supports the interfaces from Microsoft.Extensions.AI. We've bumped our integration to support that, so you can register the `IOllamaApiClient` (native OllamaSharp client), or the `IChatClient` and `IEmbeddingGenerator` interfaces from M.E.AI (depending on your model type). Using these new interfaces makes code more portable across LLM/SLM options.
API deprecations and removals
With all the refactoring that's gone on, some APIs are being deprecated or removed outright.
Type of breaking change
Reason for change
To make the library more functional and better integrated with the Aspire API design.
Recommended action
Upgrade to v9 of the CommunityToolkit.Aspire packages.
Affected APIs
CommunityToolkit.Aspire.Hosting.Ollama and CommunityToolkit.Aspire.OllamaSharp
Associated WorkItem - 340203