
[API Proposal]: IDistributedCache overloads with ReadOnlyMemory<byte> support #70137

Open
kerams opened this issue Jun 2, 2022 · 10 comments
Labels
api-suggestion Early API idea and discussion, it is NOT ready for implementation area-Extensions-Caching feature-request
Milestone
Future

Comments

@kerams

kerams commented Jun 2, 2022

Background and motivation

IDistributedCache.Set and SetAsync only take a byte array as the input. This is not ideal when working with oversized pooled arrays, for example, because you have to copy the data into an exactly-sized array just to pass it to the cache. Having overloads with ReadOnlyMemory<byte> parameters would alleviate this issue (supposing the actual caching implementation itself supports Memory, which StackExchange.Redis does).
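
For context, a minimal sketch of the status quo with only the byte[] overloads (FillBuffer is a hypothetical stand-in for any serializer that writes into a pooled buffer):

public static async Task CacheToday(string key, IDistributedCache cache, DistributedCacheEntryOptions options)
{
    var pooled = ArrayPool<byte>.Shared.Rent(4096); // oversized pooled buffer
    try
    {
        var bytesWritten = FillBuffer(pooled); // hypothetical: serialize the payload into the buffer

        // The existing Set/SetAsync overloads take byte[] only, so the written slice
        // has to be copied into an exactly-sized array before it can be cached.
        await cache.SetAsync(key, pooled.AsSpan(0, bytesWritten).ToArray(), options);
    }
    finally
    {
        ArrayPool<byte>.Shared.Return(pooled);
    }
}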

API Proposal

Additions to Microsoft.Extensions.Caching.Distributed.IDistributedCache:

void Set(string key, ReadOnlyMemory<byte> value, DistributedCacheEntryOptions options);
Task SetAsync(string key, ReadOnlyMemory<byte> value, DistributedCacheEntryOptions options, CancellationToken token = default(CancellationToken));

API Usage

public static async Task Cache(string key, string looongString, IDistributedCache cache, DistributedCacheEntryOptions options)
{
    // Rent a buffer large enough for the worst-case UTF-8 size of the string.
    var maxLength = Encoding.UTF8.GetMaxByteCount(looongString.Length);
    var pooled = ArrayPool<byte>.Shared.Rent(maxLength);

    try
    {
        var bytesWritten = Encoding.UTF8.GetBytes(looongString, pooled);

        // Pass only the written slice of the oversized pooled buffer; with the
        // proposed overload, no copy into an exactly-sized array is needed.
        await cache.SetAsync(key, pooled.AsMemory(0, bytesWritten), options);
    }
    finally
    {
        ArrayPool<byte>.Shared.Return(pooled);
    }
}

Alternative Designs

No response

Risks

No response

@kerams kerams added the api-suggestion Early API idea and discussion, it is NOT ready for implementation label Jun 2, 2022
@ghost ghost added the untriaged New issue has not been triaged by the area owner label Jun 2, 2022
@ghost

ghost commented Jun 2, 2022

Tagging subscribers to this area: @dotnet/area-extensions-caching
See info in area-owners.md if you want to be subscribed.


@maryamariyan maryamariyan added feature-request and removed untriaged New issue has not been triaged by the area owner labels Jun 7, 2022
@maryamariyan maryamariyan added this to the Future milestone Jun 7, 2022
@eerhardt
Member

eerhardt commented Jun 7, 2022

Having overloads with ReadOnlyMemory<byte> parameters

Is there a reason this has to be ReadOnlyMemory<byte> and not ReadOnlySpan<byte>? My assumption is that since this is a "distributed cache", the value would need to be sent to wherever it is being cached. I wouldn't expect the IDistributedCache to "hold on" to the ReadOnlyMemory<byte>, but instead to send it.

@Clockwork-Muse
Contributor

Is there a reason this has to be ReadOnlyMemory<byte> and not ReadOnlySpan<byte>?

Because you can't use a Span across an async suspension point. See, for instance, Stream.WriteAsync, which would almost certainly be called under the covers by any cross-machine distributed cache.

In theory the synchronous version could still use the Span versions (as Stream.Write does), but using Memory in both places for consistency might be preferable.
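
For illustration, a minimal sketch of the distinction: ReadOnlySpan<byte> is a ref struct that cannot be captured by an async state machine, which is why the asynchronous Stream APIs take ReadOnlyMemory<byte>:

// Synchronous path: a Span is fine because nothing outlives the call.
static void Write(Stream destination, ReadOnlySpan<byte> payload)
    => destination.Write(payload); // Stream.Write(ReadOnlySpan<byte>)

// Asynchronous path: the value must survive across the await, so it has to be
// ReadOnlyMemory<byte>; a ReadOnlySpan<byte> parameter would not even compile here.
static async Task WriteAsync(Stream destination, ReadOnlyMemory<byte> payload, CancellationToken token)
    => await destination.WriteAsync(payload, token); // Stream.WriteAsync(ReadOnlyMemory<byte>, CancellationToken)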

@kerams
Author

kerams commented Jun 8, 2022

I would not mind contributing this change if it is approved within the .NET 7 timeframe. I assume it will just involve adding the methods and updating any surface-area tests. I am not sure, though, how such changes flow downstream to, among others, the ASP.NET Core repo, where the Redis implementation will have to be updated, because the build will break as soon as the new interface is consumed.

In theory the synchronous version could still use the Span versions

StackExchange.Redis is my main concern, and since Span is not convertible to RedisValue, I don't see much use for Span parameters even in the sync version.

@eerhardt
Member

eerhardt commented Jun 8, 2022

Span is not convertible to RedisValue

I think that is a good reason to leave the sync version using ReadOnlyMemory<byte>.

where the Redis implementation will have to be updated because the build will break as soon as the new interface is imported.

We won't be accepting a breaking change. Instead, the new methods would have to be DIMs (default interface methods) - https://docs.microsoft.com/en-us/dotnet/csharp/whats-new/tutorials/default-interface-methods-versions.
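
For illustration, a rough sketch of what DIM-based overloads could look like; this is an assumed shape, not an approved design. The default bodies fall back to a copy plus the existing byte[] methods, so current implementations keep compiling unchanged, and implementations that can avoid the copy (such as the Redis cache) would override them:

public interface IDistributedCache
{
    // ... existing members, including the byte[] overloads ...
    void Set(string key, byte[] value, DistributedCacheEntryOptions options);
    Task SetAsync(string key, byte[] value, DistributedCacheEntryOptions options, CancellationToken token = default);

    // Hypothetical DIM overloads with copying defaults.
    void Set(string key, ReadOnlyMemory<byte> value, DistributedCacheEntryOptions options)
        => Set(key, value.ToArray(), options);

    Task SetAsync(string key, ReadOnlyMemory<byte> value, DistributedCacheEntryOptions options, CancellationToken token = default)
        => SetAsync(key, value.ToArray(), options, token);
}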

@aKzenT

aKzenT commented Mar 25, 2023

Any update on this?

Who would need to approve this, @eerhardt? What are the next steps?

ASP.NET Core has spent a lot of effort over the years avoiding even small allocations on each request. IDistributedCache is often used as a session store that is hit on every request on many websites, often with relatively large chunks of data and serialization involved. So it seems very worthwhile to be able to avoid unnecessary allocations/copies in this common scenario.

@adamsitnik
Member

@davidfowl I can see that most of the implementations of this interface are provided by the ASP.NET Team. Whom should we ask for feedback about extending that interface with the proposed methods?

@jodydonetti
Contributor

Update: see IBufferDistributedCache here.
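
For later readers, the shape of that interface is approximately the following (paraphrased; see the linked definition for the authoritative signatures). It is an optional, allocation-friendly companion to IDistributedCache built around ReadOnlySequence<byte> and IBufferWriter<byte>:

public interface IBufferDistributedCache : IDistributedCache
{
    bool TryGet(string key, IBufferWriter<byte> destination);
    ValueTask<bool> TryGetAsync(string key, IBufferWriter<byte> destination, CancellationToken token = default);
    void Set(string key, ReadOnlySequence<byte> value, DistributedCacheEntryOptions options);
    ValueTask SetAsync(string key, ReadOnlySequence<byte> value, DistributedCacheEntryOptions options, CancellationToken token = default);
}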

@julealgon

Is there a reason for this proposal to remain open now that the new HybridCache exists? Would there be any situation where one shouldn't just migrate from IDistributedCache to HybridCache?

@mgravell
Member

mgravell commented Jun 4, 2024

The HybridCache work adds a new optional backend API that sits alongside IDistributedCache and fills this niche; see #100290. However, this new API is mostly intended for backend implementations - think "I'm Cosmos, and I want my existing distributed cache backend to allocate less".

For application code, @julealgon is correct that HybridCache would be the obvious switch here, removing the need for user code to even worry about this API (other than to register a backend cache).
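
For illustration, a rough sketch of what that application-code switch looks like with HybridCache.GetOrCreateAsync; the service and data-source names here are made up, and serialization and buffer pooling are handled inside the cache rather than by the caller:

public class GreetingService
{
    private readonly HybridCache _cache; // injected after services.AddHybridCache()

    public GreetingService(HybridCache cache) => _cache = cache;

    public ValueTask<string> GetGreetingAsync(string userId, CancellationToken ct = default)
        => _cache.GetOrCreateAsync(
            $"greeting:{userId}",                                     // cache key
            async token => await LoadFromBackendAsync(userId, token), // factory runs only on a cache miss
            cancellationToken: ct);

    // Hypothetical data source standing in for the real backend call.
    private static ValueTask<string> LoadFromBackendAsync(string userId, CancellationToken token)
        => ValueTask.FromResult($"Hello, {userId}!");
}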
