ThreadSafeCacheFactory.Create&lt;TKey, TValue&gt;(Func&lt;TKey, TValue&gt;, IEqualityComparer&lt;TKey&gt;, ThreadSafeCacheOptionsBase) Method

Creates a thread-safe cache instance that can be accessed as an IThreadSafeCacheAccessor&lt;TKey, TValue&gt; instance.

Definition

Namespace: KGySoft.Collections
Assembly: KGySoft.CoreLibraries (in KGySoft.CoreLibraries.dll) Version: 8.1.0
C#
public static IThreadSafeCacheAccessor<TKey, TValue> Create<TKey, TValue>(
	Func<TKey, TValue> itemLoader,
	IEqualityComparer<TKey>? comparer,
	ThreadSafeCacheOptionsBase? options = null
)

Parameters

itemLoader  Func&lt;TKey, TValue&gt;
A delegate for loading a value, which is invoked when a key is not present in the cache.
comparer  IEqualityComparer&lt;TKey&gt;
An equality comparer to be used for hashing and comparing keys. If null, then a default comparison is used.
options  ThreadSafeCacheOptionsBase  (Optional)
The options for creating the cache. If null, then a default LockFreeCacheOptions instance will be used. This parameter is optional.
Default value: null.

Type Parameters

TKey
The type of the key in the cache.
TValue
The type of the value in the cache.

Return Value

IThreadSafeCacheAccessor&lt;TKey, TValue&gt;
An IThreadSafeCacheAccessor&lt;TKey, TValue&gt; instance that can be used to read the underlying cache in a thread-safe manner.

Remarks

A cache is similar to a dictionary (in terms of using a fast, associative storage) but additionally provides capacity management and transparent access (meaning, all that is needed is to read the indexer of the returned IThreadSafeCacheAccessor&lt;TKey, TValue&gt; instance, and it is transparent for the consumer whether the returned item came from the cache or was loaded by invoking the specified itemLoader).

If options is null, then a lock-free cache instance will be created, as if a LockFreeCacheOptions was used with its default settings.
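The transparent access described above can be sketched as follows. This is a minimal, hypothetical example based on the signature in this topic; the squaring loader is purely illustrative:

```csharp
using System;
using KGySoft.Collections;

class Example
{
    static void Main()
    {
        // Passing null for both comparer and options creates a lock-free cache,
        // as if LockFreeCacheOptions with default settings had been specified.
        IThreadSafeCacheAccessor<int, int> cache =
            ThreadSafeCacheFactory.Create<int, int>(key => key * key, comparer: null);

        // Reading the indexer is transparent: the first access for a key invokes
        // the itemLoader delegate; later accesses are served from the cache.
        int value = cache[4]; // 16, loaded on first access
        int again = cache[4]; // 16, returned from the cache
    }
}
```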

In KGy SOFT Core Libraries there are two predefined classes that can be used to create a thread-safe cache instance: LockFreeCacheOptions and LockingCacheOptions.

  Tip

  • LockFreeCacheOptions: Use this one if you want the fastest, well-scaling solution, and it is not a problem that the itemLoader delegate might be called concurrently, or that capacity management is not too strict (when the cache is full, about half of the elements are dropped at once). Though rarely, it may also happen that itemLoader is invoked multiple times when the same key is accessed consecutively and the first call occurred during an internal merge session.
  • LockingCacheOptions: Use this one if you need strict capacity management, you want to dispose the dropped-out values, you want to ensure that the oldest or least recently used element is dropped first, you want to protect the itemLoader delegate from being called concurrently, or you want to specify an expiration time period for the values. If elements are dropped often, then it also uses less memory than the lock-free implementation. Depending on the configuration, the actual type of the returned instance may vary, but in all cases an instance of the public Cache&lt;TKey, TValue&gt; type is wrapped internally.
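To illustrate the locking variant, the sketch below configures a strictly capacity-managed cache with expiring values. The property names Capacity, Expiration and ProtectItemLoader are assumptions based on the description above; check the LockingCacheOptions topic for the actual members:

```csharp
using System;
using System.IO;
using KGySoft.Collections;

class LockingExample
{
    // A hypothetical loader: reads a file's contents for a given path.
    static byte[] LoadFile(string path) => File.ReadAllBytes(path);

    static void Main()
    {
        // Assumed LockingCacheOptions members; see the class documentation.
        var options = new LockingCacheOptions
        {
            Capacity = 256,                          // strict capacity management
            Expiration = TimeSpan.FromMinutes(10),   // values expire after 10 minutes
            ProtectItemLoader = true                 // loader is never called concurrently
        };

        // A custom comparer can be passed; here paths are compared case-insensitively.
        IThreadSafeCacheAccessor<string, byte[]> cache =
            ThreadSafeCacheFactory.Create<string, byte[]>(
                LoadFile, StringComparer.OrdinalIgnoreCase, options);
    }
}
```

Compared to the lock-free variant, accesses here may block briefly while another thread loads a value, which is the price of the stricter guarantees listed above.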

See Also