High performance, thread-safe in-memory caching primitives for .NET.
Please refer to the wiki for full API documentation and a complete analysis of hit rate, latency, and throughput.
BitFaster.Caching is installed from NuGet:

```shell
dotnet add package BitFaster.Caching
```
## ConcurrentLru

`ConcurrentLru` is a lightweight drop-in replacement for `ConcurrentDictionary`, but with bounded size enforced by the TU-Q eviction policy (derived from 2Q). There are no background threads and no global locks; concurrent throughput is high, lookups are fast, and hit rate outperforms a pure LRU in all tested scenarios.
Choose a capacity and use just like `ConcurrentDictionary`, but with bounded size:

```csharp
int capacity = 128;
var lru = new ConcurrentLru<string, SomeItem>(capacity);

var value = lru.GetOrAdd("key", (key) => new SomeItem(key));
```
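Since `ConcurrentLru` mirrors a subset of the `ConcurrentDictionary` surface, reads and removals follow the familiar try-pattern. A minimal sketch, assuming the `TryGet` and `TryRemove` members described in the wiki:

```csharp
using System;
using BitFaster.Caching.Lru;

class SomeItem
{
    public string Key { get; }
    public SomeItem(string key) => Key = key;
}

class Program
{
    static void Main()
    {
        var lru = new ConcurrentLru<string, SomeItem>(capacity: 128);

        // Populate on miss; later calls for "key" return the cached item.
        var value = lru.GetOrAdd("key", k => new SomeItem(k));

        // Read without populating: returns false on a miss.
        if (lru.TryGet("key", out var cached))
        {
            Console.WriteLine(cached.Key);
        }

        // Explicitly remove an entry instead of waiting for eviction.
        lru.TryRemove("key");
    }
}
```

Items beyond the configured capacity are evicted according to the TU-Q policy, so `TryGet` can also return false for keys that were previously added.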
## ConcurrentLfu

`ConcurrentLfu` is a drop-in replacement for `ConcurrentDictionary`, but with bounded size enforced by the W-TinyLFU eviction policy. `ConcurrentLfu` has near-optimal hit rate and high scalability. Reads and writes are buffered, then replayed asynchronously to mitigate lock contention.
Choose a capacity and use just like `ConcurrentDictionary`, but with bounded size:

```csharp
int capacity = 128;
var lfu = new ConcurrentLfu<string, SomeItem>(capacity);

var value = lfu.GetOrAdd("key", (key) => new SomeItem(key));
```
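When cached values are produced by async work, the factory itself can be asynchronous. A hedged sketch, assuming the `GetOrAddAsync` overload documented in the wiki (the `CreateItemAsync` factory here is purely illustrative, e.g. standing in for a database or HTTP fetch):

```csharp
using System;
using System.Threading.Tasks;
using BitFaster.Caching.Lfu;

class SomeItem
{
    public string Key { get; }
    public SomeItem(string key) => Key = key;
}

class Program
{
    static async Task Main()
    {
        var lfu = new ConcurrentLfu<string, SomeItem>(capacity: 128);

        // The async factory runs only on a cache miss.
        var value = await lfu.GetOrAddAsync("key", CreateItemAsync);

        Console.WriteLine(value.Key);
    }

    // Illustrative async factory (an assumption, not part of the library).
    static Task<SomeItem> CreateItemAsync(string key)
    {
        return Task.FromResult(new SomeItem(key));
    }
}
```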