Java Concurrency Learning: The Implementation of BlockingQueue
1. Introduction
BlockingQueue is an important data structure under the java.util.concurrent package. It provides thread-safe queue access: when a thread inserts into a full blocking queue, it blocks until the queue is no longer full; when a thread takes from an empty blocking queue, it blocks until the queue is no longer empty. Many of the higher-level synchronization classes in java.util.concurrent are built on top of BlockingQueue.
JDK 7 provides the following seven blocking queues:
- ArrayBlockingQueue: a bounded blocking queue backed by an array.
- LinkedBlockingQueue: an optionally bounded blocking queue backed by a linked list.
- PriorityBlockingQueue: an unbounded blocking queue that supports priority ordering.
- DelayQueue: an unbounded blocking queue implemented on top of a priority queue.
- SynchronousQueue: a blocking queue that does not store elements.
- LinkedTransferQueue: an unbounded blocking queue backed by a linked list.
- LinkedBlockingDeque: a bidirectional (double-ended) blocking queue backed by a linked list.
BlockingQueue provides four families of processing methods for insertion and removal:
- Throw an exception: add(e) / remove() throw when the queue is full / empty.
- Return a special value: offer(e) / poll() return false / null instead.
- Block indefinitely: put(e) / take() block until the operation can proceed.
- Time out: offer(e, time, unit) / poll(time, unit) block up to the given timeout, then give up.
When the queue is full (or empty), some of these methods throw exceptions, some return true/false, some block indefinitely, and some accept a timeout and automatically exit the blocked state when time is up. In a real project, choose whichever behavior fits the situation.
2. Queue
Queue and stack are two commonly used data structures. A queue is first-in-first-out (FIFO), while a stack is first-in-last-out (LIFO). To put it more colloquially: a queue is like the line at a cinema entrance, where the first comers (the people at the front) enter first; a stack is like people walking one by one into a dead-end alley, where whoever went in first (the innermost person) must wait for everyone who entered later to come out before they can come out.
The difference between a blocking queue and an ordinary queue is that its operations can block: taking an element from an empty queue blocks, and adding an element to a full queue blocks. A thread trying to take from an empty blocking queue is suspended until another thread inserts a new element. Likewise, a thread trying to add to a full blocking queue is suspended until another thread frees up space, for example by removing one or more elements or by emptying the queue entirely.
In multithreaded applications, queues often appear in producer-consumer scenarios. For example, many people like to buy fried dough sticks in the morning. A customer buying them is a consumer, and the master frying them is a producer. The rack beside the frying pan can be seen as a shared queue: the master takes finished dough sticks out of the pan and puts them on the rack one by one, while customers pay and take them from the rack in queue order. That is, at one end of the queue someone keeps putting things in (producing elements), while at the other end someone keeps taking things out (consuming elements). An interesting phenomenon arises here. If there are many buyers and the master cannot keep up, the first customer has to wait (and everyone behind is blocked too) until the master fries the next one; only after the first customer buys it can the people behind move up. Conversely, if the rack is full and nobody is buying, the master stops and waits for someone to buy one before frying more. This is so-called queue blocking, and a queue that can exhibit this blocking behavior is called a blocking queue.
As the description shows, blocking occurs under at least one of the following conditions (premise: the queue is bounded):
1. When taking an element from the queue: if the queue is empty, the code waits there (blocks) until something appears in the queue and an element is obtained; only then can the subsequent code continue.
2. When putting an element into the queue: if the queue is full (no more elements can be added), the code likewise gets stuck until something is removed from the queue (freeing a slot for the new element); only then can the subsequent code continue.
3. Simulate producers and consumers
Let's simulate the scene of buying fried dough sticks: one boss fries the dough sticks while three customers queue to buy them:
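A minimal sketch of that scenario (the class name, field names, and speeds here are my own, not from the original article): the rack is an ArrayBlockingQueue with capacity 5, so put() blocks while the rack is full and take() blocks while it is empty.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.atomic.AtomicInteger;

public class FriedDoughStickShop {
    // The rack beside the pan: a bounded blocking queue holding at most 5 sticks.
    static final BlockingQueue<Integer> shelf = new ArrayBlockingQueue<>(5);
    static final int TOTAL = 9;                   // sticks the boss fries in total
    static final AtomicInteger bought = new AtomicInteger();

    public static void main(String[] args) throws InterruptedException {
        Thread boss = new Thread(() -> {
            try {
                for (int i = 1; i <= TOTAL; i++) {
                    Thread.sleep(50);             // produceSpeed: frying takes time
                    shelf.put(i);                 // blocks while the rack is full
                    System.out.println("Boss put stick #" + i + " on the rack");
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "Boss");

        Runnable customer = () -> {
            try {
                for (int i = 0; i < TOTAL / 3; i++) {
                    Integer stick = shelf.take(); // blocks while the rack is empty
                    bought.incrementAndGet();
                    System.out.println(Thread.currentThread().getName()
                            + " bought stick #" + stick);
                    Thread.sleep(20);             // consumeSpeed: faster than production
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        };

        Thread[] customers = { new Thread(customer, "Customer-1"),
                               new Thread(customer, "Customer-2"),
                               new Thread(customer, "Customer-3") };
        boss.start();
        for (Thread c : customers) c.start();
        boss.join();
        for (Thread c : customers) c.join();
        System.out.println("Total sticks sold: " + bought.get());
    }
}
```

Because put() and take() do all the waiting internally, neither the boss nor the customers need any explicit synchronization of their own.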
Result: with the settings above, consumption is faster than production. In the console output, as soon as the boss fries a dough stick it sells quickly; even though the rack can hold four or five, the customers buy so fast that there are often 0 dough sticks left on it. If we swap the production speed (produceSpeed) and consumption speed (consumeSpeed), we observe the opposite result.
4. Principle analysis
Let's write a simple simulation of BlockingQueue to get familiar with the principle behind it:
Now let's take a look at how the JDK actually implements it.
Part of the source code of ArrayBlockingQueue:
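The fields in question, reconstructed here from the JDK 8 source (abridged):

```java
/** Main lock guarding all access */
final ReentrantLock lock;

/** Condition for waiting takes */
private final Condition notEmpty;

/** Condition for waiting puts */
private final Condition notFull;
```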
These three variables are very important: lock is a ReentrantLock (a reentrant lock), notEmpty is the condition used to wait until the queue is not empty, and notFull is the condition used to wait until the queue is not full.
Condition is an interface with two important methods:
await(): Causes the current thread to wait until it is signalled or interrupted. That is, the current thread blocks until it is notified (awakened) or interrupted.
signal(): Wakes up one waiting thread. That is, it wakes up a blocked thread.
Let's look at the put() method (JDK 1.8):
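Reconstructed from the JDK 8 source (abridged, explanatory comments added):

```java
public void put(E e) throws InterruptedException {
    checkNotNull(e);
    final ReentrantLock lock = this.lock;
    lock.lockInterruptibly();
    try {
        while (count == items.length)
            notFull.await();      // queue full: block until a slot frees up
        enqueue(e);
    } finally {
        lock.unlock();
    }
}
```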
1. Acquire the lock first.
2. Then use a while loop to check whether the number of elements equals the length of items; if so, the queue is full, and notFull's await() method is called to block the thread.
3. Otherwise, the enqueue() method is called to add the element.
4. Finally, unlock.
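The enqueue() method, reconstructed from the JDK 8 source (abridged, explanatory comments added):

```java
private void enqueue(E x) {
    final Object[] items = this.items;
    items[putIndex] = x;
    if (++putIndex == items.length)
        putIndex = 0;             // circular array: wrap around to the front
    count++;
    notEmpty.signal();            // notify one thread waiting in take()
}
```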
This is the code that adds an element (JDK 1.8). Note the last line: after the element is added, notEmpty.signal() is called to notify a thread waiting to take an element from the queue that the queue is no longer empty and there is now something to take.
The take() and dequeue() methods are essentially the inverse process (note: also JDK 1.8):
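The take() method, reconstructed from the JDK 8 source (abridged, explanatory comments added):

```java
public E take() throws InterruptedException {
    final ReentrantLock lock = this.lock;
    lock.lockInterruptibly();
    try {
        while (count == 0)
            notEmpty.await();     // queue empty: block until an element arrives
        return dequeue();
    } finally {
        lock.unlock();
    }
}
```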
Similarly:
1. Acquire the lock first.
2. If the number of elements is zero, the queue is empty, so notEmpty's await() is called to block the thread; otherwise the element can be taken straight away.
3. Then dequeue() is called to remove the element from the queue.
4. Finally, unlock.
The dequeue() method:
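Reconstructed from the JDK 8 source (abridged, explanatory comments added):

```java
private E dequeue() {
    final Object[] items = this.items;
    @SuppressWarnings("unchecked")
    E x = (E) items[takeIndex];
    items[takeIndex] = null;
    if (++takeIndex == items.length)
        takeIndex = 0;            // circular array: wrap around to the front
    count--;
    if (itrs != null)
        itrs.elementDequeued();
    notFull.signal();             // notify one thread waiting in put()
    return x;
}
```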
Note the second-to-last line: after the element is removed, notFull.signal() is called to wake up a thread waiting to add an element to the queue; the queue now has free space, so elements can be added to it.
5. Summary
That concludes this article's look at BlockingQueue and the producer-consumer pattern in Java concurrency learning. I hope it is helpful. Interested readers can continue with related topics such as nested monitor lockout in Java concurrency, CAS in Java concurrent programming, and the Semaphore counting semaphore in Java concurrent programming.