Is setting maxElementsInMemory too high in the ehcache configuration a performance problem?

Time: 2022-05-29 16:53:38

Just wanted to know if there is a performance impact of setting maxElementsInMemory much higher than what is actually used? For example, a max of 10,000 while using only 100.

ehcache.xml

<defaultCache
    eternal="false"
    overflowToDisk="false"
    maxElementsInMemory="10000"
    timeToIdleSeconds="7200"
    timeToLiveSeconds="0"/>

Context: I am using ehcache with Hibernate and I want all records of a table (all entities) to be cached. From one customer to another, the number of records in that table varies, so it is hard to set a precise max.
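One common way to cache every row of that table is to mark the entity as cacheable and give it its own region, then size that region (or the defaultCache above) generously. The sketch below is illustrative only: the entity class Product and the region name com.example.Product are hypothetical names, not from the original post, and it assumes Hibernate with the javax.persistence and org.hibernate.annotations APIs plus an Ehcache region factory already configured.

package com.example;

import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.annotations.Cache;
import org.hibernate.annotations.CacheConcurrencyStrategy;

// Hypothetical entity used only to illustrate second-level caching;
// every Product row loaded by Hibernate lands in the "com.example.Product" region.
@Entity
@Cache(usage = CacheConcurrencyStrategy.READ_WRITE, region = "com.example.Product")
public class Product {

    @Id
    private Long id;

    private String name;

    // A no-arg constructor and getters/setters would normally follow;
    // they are omitted to keep the sketch short.
}

A matching <cache name="com.example.Product" .../> entry in ehcache.xml (or, absent one, the defaultCache shown above) then supplies the maxElementsInMemory ceiling for that region.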

Thanks!

Marc

1 solution

#1


4  

No, there is none. This is just a max value. If your cache only holds 100 items, you will pay the cost of a map holding 100 elements. The upper limit plays no role here.

You can safely use a much higher limit (underneath it is a simple ConcurrentHashMap), although it is hard to justify such a choice.
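To see this for yourself, the Ehcache 2.x (net.sf.ehcache) API can report how many entries a region actually holds next to its configured ceiling. A rough sketch, assuming ehcache.xml is on the classpath and reusing the illustrative region name com.example.Product from the earlier sketch:

import net.sf.ehcache.CacheManager;
import net.sf.ehcache.Ehcache;

public class CacheSizeCheck {
    public static void main(String[] args) {
        // Loads ehcache.xml from the classpath.
        CacheManager cacheManager = CacheManager.create();

        // Creates the region from the defaultCache settings if it does not
        // exist yet ("com.example.Product" is an illustrative name).
        Ehcache cache = cacheManager.addCacheIfAbsent("com.example.Product");

        // getSize() counts the entries actually held; the configured maximum
        // is only an eviction ceiling, nothing is pre-allocated for it.
        System.out.println("entries held      : " + cache.getSize());
        // maxElementsInMemory maps to maxEntriesLocalHeap in Ehcache 2.x.
        System.out.println("configured ceiling: "
                + cache.getCacheConfiguration().getMaxEntriesLocalHeap());

        cacheManager.shutdown();
    }
}

With only 100 entries present, the first number stays at 100 whether the ceiling is 10,000 or 1,000,000.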
