Java – Solr filter cache (FastLRUCache) takes up too much memory and causes out-of-memory errors?

I have a Solr setup: one master server and two slave servers used for replication. We have about 70 million documents in the index. The slaves have 16 GB of RAM: 10 GB for the OS and HD, 6 GB for Solr.

However, the slaves run out of memory from time to time. When we looked at the heap dump file taken right before running out of memory, we could see that this class:

org.apache.solr.util.ConcurrentLRUCache$Stats @ 0x6eac8fb88

is using up to 5 GB of memory. We use the filter cache extensively, and it has a 93% hit ratio. Here is the filter cache configuration from solrconfig.xml:

<property name="filterCache.size" value="2000" />
<property name="filterCache.initialSize" value="1000" />
<property name="filterCache.autowarmCount" value="20" />

<filterCache class="solr.FastLRUCache"
             size="${filterCache.size}"
             initialSize="${filterCache.initialSize}"
             autowarmCount="${filterCache.autowarmCount}"/>

The query result cache has the same settings, but it uses LRUCache and only takes about 35 MB of memory. Is there something wrong with the configuration that needs to be fixed, or do I just need more memory for the filter cache?

Solution

After a friend explained to me roughly how the filter cache works, it became obvious why we get out-of-memory errors from time to time.

So what does the filter cache actually do? Basically, it creates something like a bit array that records which documents match the filter. Something like:

cache = [1, 0, 0, 1, .. 0]

1 means hit, 0 means no hit. So for this example it means the filter cache matches documents 0 and 3. The cache is therefore essentially a set of bits whose length is the total number of documents. So say I have 50.000.000 documents: the array length will be 50.000.000, which means that one filter cache entry will take up 50.000.000 bits in memory.
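
To make that concrete, here is a rough Java sketch of what one cached filter conceptually looks like (this is only my own illustration; Solr internally uses its own DocSet/bitset classes, not java.util.BitSet):

import java.util.BitSet;

public class FilterCacheEntrySketch {
    public static void main(String[] args) {
        int maxDoc = 50_000_000;                   // total number of documents in the index
        BitSet matchingDocs = new BitSet(maxDoc);  // one bit per document

        // pretend the filter query matched documents 0 and 3, as in the example above
        matchingDocs.set(0);
        matchingDocs.set(3);

        // one cached entry costs roughly one bit per document, regardless of how many match
        long bytes = (long) maxDoc / 8;
        System.out.println("one entry ~ " + bytes / 1000 / 1000 + " MB");  // ~6 MB
    }
}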

Since we specified that we want up to 2000 entries in the filter cache, the RAM it will take is roughly:

50.000.000 * 2000 = 100.000.000.000 bit

Converted to GB, this is:

100.000.000.000 bit / 8 (to bytes) / 1000 (to KB) / 1000 (to MB) / 1000 (to GB) = 12.5 GB

So the total RAM needed just for the filter cache is roughly 12.5 GB, which means that if Solr only has 6 GB of heap space, it will never be able to hold 2000 filter cache entries.
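
As a quick sanity check, here is the same back-of-the-envelope estimate in a few lines of Java (the 50 million documents and 2000 entries are just the example figures used above):

public class FilterCacheMemoryEstimate {
    public static void main(String[] args) {
        long docs = 50_000_000L;     // documents in the index (example figure from above)
        long entries = 2_000L;       // filterCache.size
        long bits = docs * entries;  // 100.000.000.000 bits

        double gb = bits / 8.0 / 1000 / 1000 / 1000;  // bits -> bytes -> KB -> MB -> GB
        System.out.println(gb + " GB");               // prints 12.5 GB
    }
}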

Yes, I know Solr doesn't always create this full array. If the number of results for a filter query is low, it can create something that takes less memory. This calculation is only a rough upper bound on how much RAM the filter cache can take when it holds 2000 entries; in more favourable cases it can be lower.
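
As a rough illustration of why a sparse result is so much cheaper (again my own sketch, not Solr's actual code), compare storing a handful of matching doc IDs with storing one bit per document:

public class SparseFilterSketch {
    public static void main(String[] args) {
        int maxDoc = 50_000_000;

        // a filter matching only a handful of documents can be kept as a
        // sorted list of doc IDs instead of one bit per document
        int[] sparseMatches = {17, 4_211, 903_774};

        long sparseBytes = sparseMatches.length * 4L;  // ~4 bytes per matching doc
        long bitsetBytes = maxDoc / 8L;                // one bit per document in the index

        System.out.println(sparseBytes + " bytes vs " + bitsetBytes + " bytes");
    }
}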

So one solution is to reduce the maximum size of the filter cache in the Solr configuration. We checked the Solr stats and, most of the time, we only have about 600 entries in the filter cache, so we can lower the filter cache's maximum size to somewhere around that number.
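
For example, reusing the property placeholders from the question, the change could look something like this (700 is just an illustrative value that leaves some headroom above the ~600 entries we observed; pick whatever fits your own stats):

<property name="filterCache.size" value="700" />
<property name="filterCache.initialSize" value="700" />
<property name="filterCache.autowarmCount" value="20" />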

Another option, of course, is to add more RAM.
