Java LRU algorithm introduction and usage examples

This article introduces the LRU algorithm in Java and shows how to use it. It is shared here for your reference, as follows:

1. Preface

When users run networked software, they constantly fetch data from the network. If the same data is needed many times over a period of time, it makes no sense to issue a network request every single time; that wastes both time and network bandwidth.

In this situation, the requested data can be cached locally. However, not every piece of data can be kept, or memory would be wasted. This is where the idea of the LRU algorithm comes in.

2. Introduction to LRU

LRU stands for Least Recently Used. The algorithm evicts the data that has gone unused for the longest time.

LRU is also reflected in everyday relationships. Among a group of friends, the ones you contact often stay close; a friend you have not contacted for a long time will probably drift away and, in the end, be lost.

3. Implementing the LRU algorithm in code

The simplest approach is to use LinkedHashMap, because it already provides a hook for removing the oldest entries once the cache grows beyond its configured size.
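A minimal sketch of this approach is shown below (the class and variable names are illustrative, not necessarily the author's original code), assuming a cache capacity of 3 and the six puts described in the result that follows:

import java.util.LinkedHashMap;
import java.util.Map;

public class LruCache<K, V> extends LinkedHashMap<K, V> {

    private final int capacity;

    public LruCache(int capacity) {
        // accessOrder = true orders entries by most recent access instead of insertion order
        super(capacity, 0.75f, true);
        this.capacity = capacity;
    }

    // Called by LinkedHashMap after every put; returning true evicts the eldest entry
    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }

    public static void main(String[] args) {
        LruCache<Integer, String> cache = new LruCache<>(3);
        for (int i = 1; i <= 6; i++) {
            cache.put(i, "value" + i);
        }
        // Only the three most recently added entries remain: {4=value4, 5=value5, 6=value6}
        System.out.println(cache);
    }
}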

Result of running the demo: although six entries are put into the cache, only the last three remain at the end. The three older ones are evicted directly.

The second approach is to use a doubly linked list plus a Hashtable.

The doubly linked list records the access order of the entries, while the Hashtable serves as the container that stores the data.

Why use Hashtable instead of HashMap?

1. The key and value of a Hashtable cannot be null. 2. The LRU implemented with LinkedHashMap above must be synchronized explicitly if multiple threads operate on the same data, otherwise concurrent access can cause problems.

Hashtable, by contrast, has its own built-in synchronization, so multiple threads can access it correctly.
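As an aside, one possible way to make the LinkedHashMap-based cache above thread-safe is to wrap it with Collections.synchronizedMap. This is only a sketch, reusing the illustrative LruCache class from the first example:

import java.util.Collections;
import java.util.Map;

public class SyncLruDemo {
    public static void main(String[] args) {
        // Wrap the non-thread-safe LinkedHashMap-based cache so every call is synchronized
        Map<Integer, String> syncCache =
                Collections.synchronizedMap(new LruCache<Integer, String>(3));
        syncCache.put(1, "value1");
        System.out.println(syncCache.get(1));
    }
}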

All cached entries are connected by a doubly linked list that records their positions. When an entry is hit, it is moved to the head of the list by adjusting the links, and newly added entries are also inserted at the head. After many cache operations, the most recently hit entries gather near the head, while entries that are not hit drift toward the tail, so the tail of the list represents the least recently used entry. When content has to be replaced, the node at the tail is the least-used one, and we simply delete that last node.
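A minimal sketch of this design follows (the Node class and all names are illustrative assumptions, not the author's original code):

import java.util.Hashtable;

public class LruLinkedCache<K, V> {

    // Node of the doubly linked list that records access order
    private static class Node<K, V> {
        K key;
        V value;
        Node<K, V> prev, next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Hashtable<K, Node<K, V>> table = new Hashtable<>();
    private Node<K, V> head; // most recently used
    private Node<K, V> tail; // least recently used

    public LruLinkedCache(int capacity) {
        this.capacity = capacity;
    }

    public synchronized V get(K key) {
        Node<K, V> node = table.get(key);
        if (node == null) {
            return null;
        }
        moveToHead(node);   // a hit moves the node to the head of the list
        return node.value;
    }

    public synchronized void put(K key, V value) {
        Node<K, V> node = table.get(key);
        if (node != null) {
            node.value = value;
            moveToHead(node);
            return;
        }
        if (table.size() >= capacity) {
            // evict the least recently used entry at the tail
            table.remove(tail.key);
            removeNode(tail);
        }
        node = new Node<>(key, value);
        table.put(key, node);
        addToHead(node);
    }

    private void moveToHead(Node<K, V> node) {
        removeNode(node);
        addToHead(node);
    }

    private void addToHead(Node<K, V> node) {
        node.prev = null;
        node.next = head;
        if (head != null) head.prev = node;
        head = node;
        if (tail == null) tail = node;
    }

    private void removeNode(Node<K, V> node) {
        if (node.prev != null) node.prev.next = node.next; else head = node.next;
        if (node.next != null) node.next.prev = node.prev; else tail = node.prev;
    }
}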

For more on Java algorithms, interested readers can refer to the related topics on this site: Java data structure and algorithm tutorial, summary of Java DOM node operation techniques, summary of Java file and directory operation techniques, and summary of Java cache operation techniques.
