I would like to implement a simple in-memory LRU cache and was thinking about a solution based on an IDictionary implementation that could handle a hashed LRU mechanism. Coming from Java, I have experience with LinkedHashMap, which works fine for what I need, but I can't find a similar solution for .NET anywhere.
Has anyone developed such a solution, or had any experience with this?
9 Answers
#1
There is nothing in the base class libraries that does this.
On the free side, maybe something like C5's HashedLinkedList would work.
If you're willing to pay, maybe check out this C# toolkit. It contains an implementation.
#2
This is a very simple and fast implementation we developed for a web site we own.
We tried to improve the code as much as possible while keeping it thread safe. I think the code is very simple and clear, but if you need an explanation or a guide on how to use it, don't hesitate to ask.
namespace LRUCache
{
    using System.Collections.Generic;
    using System.Runtime.CompilerServices;

    public class LRUCache<K, V>
    {
        private int capacity;
        private Dictionary<K, LinkedListNode<LRUCacheItem<K, V>>> cacheMap = new Dictionary<K, LinkedListNode<LRUCacheItem<K, V>>>();
        private LinkedList<LRUCacheItem<K, V>> lruList = new LinkedList<LRUCacheItem<K, V>>();

        public LRUCache(int capacity)
        {
            this.capacity = capacity;
        }

        [MethodImpl(MethodImplOptions.Synchronized)]
        public V get(K key)
        {
            LinkedListNode<LRUCacheItem<K, V>> node;
            if (cacheMap.TryGetValue(key, out node))
            {
                V value = node.Value.value;
                // Move the node to the tail of the list to mark it most recently used.
                lruList.Remove(node);
                lruList.AddLast(node);
                return value;
            }
            return default(V);
        }

        [MethodImpl(MethodImplOptions.Synchronized)]
        public void add(K key, V val)
        {
            if (cacheMap.Count >= capacity)
            {
                RemoveFirst();
            }

            LRUCacheItem<K, V> cacheItem = new LRUCacheItem<K, V>(key, val);
            LinkedListNode<LRUCacheItem<K, V>> node = new LinkedListNode<LRUCacheItem<K, V>>(cacheItem);
            lruList.AddLast(node);
            cacheMap.Add(key, node);
        }

        private void RemoveFirst()
        {
            // Remove from the LRU list
            LinkedListNode<LRUCacheItem<K, V>> node = lruList.First;
            lruList.RemoveFirst();

            // Remove from cache
            cacheMap.Remove(node.Value.key);
        }
    }

    class LRUCacheItem<K, V>
    {
        public LRUCacheItem(K k, V v)
        {
            key = k;
            value = v;
        }

        public K key;
        public V value;
    }
}
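To make the eviction behavior concrete, here is a condensed, self-contained variant of the class above (using an explicit lock object instead of MethodImplOptions.Synchronized, and illustrative names) together with a small usage demo:

```csharp
using System;
using System.Collections.Generic;

public class LruDemo
{
    // Condensed variant of the Dictionary + LinkedList cache above.
    public class SimpleLruCache<K, V>
    {
        private readonly int capacity;
        private readonly object sync = new object();
        private readonly Dictionary<K, LinkedListNode<KeyValuePair<K, V>>> map =
            new Dictionary<K, LinkedListNode<KeyValuePair<K, V>>>();
        private readonly LinkedList<KeyValuePair<K, V>> list =
            new LinkedList<KeyValuePair<K, V>>();

        public SimpleLruCache(int capacity) { this.capacity = capacity; }

        public V Get(K key)
        {
            lock (sync)
            {
                LinkedListNode<KeyValuePair<K, V>> node;
                if (!map.TryGetValue(key, out node)) return default(V);
                // Move to the tail: the tail is the most recently used entry.
                list.Remove(node);
                list.AddLast(node);
                return node.Value.Value;
            }
        }

        public void Add(K key, V value)
        {
            lock (sync)
            {
                if (map.Count >= capacity)
                {
                    // Evict the head: the least recently used entry.
                    map.Remove(list.First.Value.Key);
                    list.RemoveFirst();
                }
                var node = new LinkedListNode<KeyValuePair<K, V>>(
                    new KeyValuePair<K, V>(key, value));
                list.AddLast(node);
                map.Add(key, node);
            }
        }
    }

    public static void Main()
    {
        var cache = new SimpleLruCache<int, string>(2);
        cache.Add(1, "one");
        cache.Add(2, "two");
        cache.Get(1);          // touch key 1, so key 2 is now least recently used
        cache.Add(3, "three"); // evicts key 2
        Console.WriteLine(cache.Get(2) == null); // True (evicted)
        Console.WriteLine(cache.Get(1));         // one
    }
}
```

Both get and add are O(1) because the dictionary maps each key directly to its LinkedListNode, so no list traversal is ever needed.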
#3
Found your answer while googling; I also found this:
http://code.google.com/p/csharp-lru-cache/
csharp-lru-cache: LRU cache collection class library
This is a collection class that functions as a least-recently-used cache. It implements ICollection<T>, but also exposes three other members:
Capacity, the maximum number of items the cache can contain. Once the collection is at capacity, adding a new item to the cache will cause the least recently used item to be discarded. If Capacity is set to 0 at construction, the cache will not automatically discard items.
Oldest, the oldest (i.e. least recently used) item in the collection.
DiscardingOldestItem, an event raised when the cache is about to discard its oldest item. This is an extremely simple implementation. While its Add and Remove methods are thread-safe, it shouldn't be used in heavy multithreading environments because the entire collection is locked during those methods.
#4
I've recently released a class called LurchTable to address the need for a C# variant of the LinkedHashMap. A brief discussion of the LurchTable can be found here.
Basic features:
- Linked Concurrent Dictionary by Insertion, Modification, or Access
- Dictionary/ConcurrentDictionary interface support
- Peek/TryDequeue/Dequeue access to 'oldest' entry
- Allows hard-limit on items enforced at insertion
- Exposes events for add, update, and remove
Source Code: http://csharptest.net/browse/src/Library/Collections/LurchTable.cs
GitHub: https://github.com/csharptest/CSharpTest.Net.Collections
HTML Help: http://help.csharptest.net/
PM> Install-Package CSharpTest.Net.Collections
#5
This takes Martin's code, incorporates Mr T's suggestions, and makes it StyleCop friendly. It also allows for disposal of values as they cycle out of the cache.
namespace LruCache
{
    using System;
    using System.Collections.Generic;

    /// <summary>
    /// A least-recently-used cache stored like a dictionary.
    /// </summary>
    /// <typeparam name="TKey">
    /// The type of the key to the cached item
    /// </typeparam>
    /// <typeparam name="TValue">
    /// The type of the cached item.
    /// </typeparam>
    /// <remarks>
    /// Derived from https://*.com/a/3719378/240845
    /// </remarks>
    public class LruCache<TKey, TValue>
    {
        private readonly Dictionary<TKey, LinkedListNode<LruCacheItem>> cacheMap =
            new Dictionary<TKey, LinkedListNode<LruCacheItem>>();

        private readonly LinkedList<LruCacheItem> lruList =
            new LinkedList<LruCacheItem>();

        private readonly Action<TValue> dispose;

        /// <summary>
        /// Initializes a new instance of the <see cref="LruCache{TKey, TValue}"/>
        /// class.
        /// </summary>
        /// <param name="capacity">
        /// Maximum number of elements to cache.
        /// </param>
        /// <param name="dispose">
        /// When elements cycle out of the cache, disposes them. May be null.
        /// </param>
        public LruCache(int capacity, Action<TValue> dispose = null)
        {
            this.Capacity = capacity;
            this.dispose = dispose;
        }

        /// <summary>
        /// Gets the capacity of the cache.
        /// </summary>
        public int Capacity { get; }

        /// <summary>Gets the value associated with the specified key.</summary>
        /// <param name="key">
        /// The key of the value to get.
        /// </param>
        /// <param name="value">
        /// When this method returns, contains the value associated with the specified
        /// key, if the key is found; otherwise, the default value for the type of the
        /// <paramref name="value" /> parameter. This parameter is passed
        /// uninitialized.
        /// </param>
        /// <returns>
        /// true if the <see cref="T:System.Collections.Generic.Dictionary`2" />
        /// contains an element with the specified key; otherwise, false.
        /// </returns>
        public bool TryGetValue(TKey key, out TValue value)
        {
            lock (this.cacheMap)
            {
                LinkedListNode<LruCacheItem> node;
                if (this.cacheMap.TryGetValue(key, out node))
                {
                    value = node.Value.Value;
                    this.lruList.Remove(node);
                    this.lruList.AddLast(node);
                    return true;
                }

                value = default(TValue);
                return false;
            }
        }

        /// <summary>
        /// Looks for a value for the matching <paramref name="key"/>. If not found,
        /// calls <paramref name="valueGenerator"/> to retrieve the value and add it to
        /// the cache.
        /// </summary>
        /// <param name="key">
        /// The key of the value to look up.
        /// </param>
        /// <param name="valueGenerator">
        /// Generates a value if one isn't found.
        /// </param>
        /// <returns>
        /// The requested value.
        /// </returns>
        public TValue Get(TKey key, Func<TValue> valueGenerator)
        {
            lock (this.cacheMap)
            {
                LinkedListNode<LruCacheItem> node;
                TValue value;
                if (this.cacheMap.TryGetValue(key, out node))
                {
                    value = node.Value.Value;
                    this.lruList.Remove(node);
                    this.lruList.AddLast(node);
                }
                else
                {
                    value = valueGenerator();
                    if (this.cacheMap.Count >= this.Capacity)
                    {
                        this.RemoveFirst();
                    }

                    LruCacheItem cacheItem = new LruCacheItem(key, value);
                    node = new LinkedListNode<LruCacheItem>(cacheItem);
                    this.lruList.AddLast(node);
                    this.cacheMap.Add(key, node);
                }

                return value;
            }
        }

        /// <summary>
        /// Adds the specified key and value to the dictionary.
        /// </summary>
        /// <param name="key">
        /// The key of the element to add.
        /// </param>
        /// <param name="value">
        /// The value of the element to add. The value can be null for reference types.
        /// </param>
        public void Add(TKey key, TValue value)
        {
            lock (this.cacheMap)
            {
                if (this.cacheMap.Count >= this.Capacity)
                {
                    this.RemoveFirst();
                }

                LruCacheItem cacheItem = new LruCacheItem(key, value);
                LinkedListNode<LruCacheItem> node =
                    new LinkedListNode<LruCacheItem>(cacheItem);
                this.lruList.AddLast(node);
                this.cacheMap.Add(key, node);
            }
        }

        private void RemoveFirst()
        {
            // Remove from the LRU list
            LinkedListNode<LruCacheItem> node = this.lruList.First;
            this.lruList.RemoveFirst();

            // Remove from cache
            this.cacheMap.Remove(node.Value.Key);

            // Dispose the evicted value, if a dispose callback was provided
            this.dispose?.Invoke(node.Value.Value);
        }

        private class LruCacheItem
        {
            public LruCacheItem(TKey k, TValue v)
            {
                this.Key = k;
                this.Value = v;
            }

            public TKey Key { get; }

            public TValue Value { get; }
        }
    }
}
#6
I don't believe so. I've certainly seen hand-rolled ones implemented several times in various unrelated projects (which more or less confirms this. If there was one, surely at least one of the projects would have used it).
It's pretty simple to implement, and usually gets done by creating a class which contains both a Dictionary and a List.
The keys go in the list (in order) and the items go in the dictionary.
When you Add a new item to the collection, the function checks the length of the list, pulls out the oldest key (if the list is too long) and then evicts that key and its value from the dictionary to match. Not much more to it, really.
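A minimal sketch of that Dictionary + List approach (names are illustrative; note that refreshing recency with List is O(n), which is why the other answers prefer a LinkedList):

```csharp
using System;
using System.Collections.Generic;

// Minimal sketch of the Dictionary + List approach described above:
// the List holds keys in recency order, the Dictionary holds the items.
public class ListBackedLru<K, V>
{
    private readonly int maxItems;
    private readonly List<K> keys = new List<K>();  // index 0 = least recently used
    private readonly Dictionary<K, V> items = new Dictionary<K, V>();

    public ListBackedLru(int maxItems) { this.maxItems = maxItems; }

    public void Add(K key, V value)
    {
        if (keys.Count >= maxItems)
        {
            // List too long: evict the oldest key and its dictionary entry.
            K oldest = keys[0];
            keys.RemoveAt(0);
            items.Remove(oldest);
        }
        keys.Add(key);
        items[key] = value;
    }

    public bool TryGet(K key, out V value)
    {
        if (!items.TryGetValue(key, out value)) return false;
        // Refresh recency: move the key to the end of the list.
        keys.Remove(key);  // O(n); a LinkedList of nodes avoids this cost
        keys.Add(key);
        return true;
    }

    public int Count { get { return items.Count; } }
}
```

This version is not thread-safe; callers would need to add their own locking, as in the other answers.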
#7
The Caching Application Block of EntLib has an LRU scavenging option out of the box and can run in memory. It might be a bit heavyweight for what you want, though.
#8
I like Lawrence's implementation. Hashtable + LinkedList is a good solution. Regarding threading, rather than locking with [MethodImpl(MethodImplOptions.Synchronized)], I would use ReaderWriterLockSlim or a spin lock (since contention is usually brief) instead. In the get function I would first check whether the node is already the most recent item, rather than always removing and re-adding it. That lets the common case stay within a reader lock that doesn't block other readers.
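One way to sketch that suggestion (a hypothetical, simplified illustration, not a vetted concurrent implementation): a read lock covers the fast path where the node is already at the tail, and the write lock is only taken when the list actually has to be mutated.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Sketch of the reader/writer-lock idea: reads that don't need to move the
// node stay inside a read lock; only a genuine reorder takes the write lock.
public class RwLockLruCache<K, V>
{
    private readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();
    private readonly Dictionary<K, LinkedListNode<KeyValuePair<K, V>>> map =
        new Dictionary<K, LinkedListNode<KeyValuePair<K, V>>>();
    private readonly LinkedList<KeyValuePair<K, V>> list =
        new LinkedList<KeyValuePair<K, V>>();
    private readonly int capacity;

    public RwLockLruCache(int capacity) { this.capacity = capacity; }

    public bool TryGet(K key, out V value)
    {
        rwLock.EnterReadLock();
        try
        {
            LinkedListNode<KeyValuePair<K, V>> node;
            if (map.TryGetValue(key, out node) && node == list.Last)
            {
                // Already the most recently used entry: no reordering needed,
                // so concurrent readers are not blocked on this path.
                value = node.Value.Value;
                return true;
            }
        }
        finally { rwLock.ExitReadLock(); }

        // Slow path: the node (if present) must be moved to the tail.
        rwLock.EnterWriteLock();
        try
        {
            LinkedListNode<KeyValuePair<K, V>> node;
            if (map.TryGetValue(key, out node))
            {
                list.Remove(node);
                list.AddLast(node);
                value = node.Value.Value;
                return true;
            }
            value = default(V);
            return false;
        }
        finally { rwLock.ExitWriteLock(); }
    }

    public void Add(K key, V value)
    {
        rwLock.EnterWriteLock();
        try
        {
            if (map.Count >= capacity)
            {
                map.Remove(list.First.Value.Key);
                list.RemoveFirst();
            }
            var node = new LinkedListNode<KeyValuePair<K, V>>(
                new KeyValuePair<K, V>(key, value));
            list.AddLast(node);
            map.Add(key, node);
        }
        finally { rwLock.ExitWriteLock(); }
    }
}
```

The trade-off is a second dictionary lookup on the slow path (the read lock is released before the write lock is taken, so the state must be re-checked).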
#9
If it's an ASP.NET app you can use the Cache class [1], but you'll be competing for space with other cached content, which may or may not be what you want.
[1] http://msdn.microsoft.com/en-us/library/system.web.caching.cache.aspx