How synchronized and volatile achieve thread safety

What is thread unsafety?

My understanding of thread safety is this: when multiple threads operate on a shared variable at the same time and unexpected results can occur, the code is thread-unsafe. Note that thread unsafety can only arise when at least one thread writes; threads that merely read a shared variable are always safe.

A classic example of thread unsafety: two threads each perform 100 increment operations on a shared variable x = 0, yet the final value of x is not always 200.
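This classic example can be sketched as the following minimal program (class and field names are my own; with only 100 iterations per thread the race may not show up on every run, but `x++` is still three separate steps, so updates can be lost):

```java
// Sketch of the lost-update race: x++ compiles to read, add, write.
// Two threads can read the same old value and overwrite each other.
public class UnsafeCounter {
    static int x = 0; // shared variable, no synchronization

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            for (int i = 0; i < 100; i++) {
                x++; // read x, add 1, write back: three separate steps
            }
        };
        Thread a = new Thread(task);
        Thread b = new Thread(task);
        a.start();
        b.start();
        a.join();
        b.join();
        System.out.println("x = " + x); // may print a value below 200
    }
}
```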

So the conditions for thread unsafety are: multiple threads + a shared variable + write operations.

Java Memory Model

You may be wondering how threads get at shared variables in the first place; the Java Memory Model is the answer.

Communication between Java threads is controlled by the Java Memory Model (JMM for short). From an abstract point of view, the JMM defines the relationship between threads and main memory: shared variables live in heap memory, while each thread operates on private copies of them.

[Figure: abstract diagram of the JMM]

The diagram is only an abstraction. In a thread's actual working model, its stack-side storage acts as a two-level buffer (cache) between the thread and heap memory: the thread reads and writes its cached copies, which are synchronized with heap memory only at certain points.

Next, let's look at an example of thread unsafety:

Suppose threads A and B operate on the same shared variable x, with both threads' two-level caches initially empty. If A reads x and updates its cached copy but has not yet flushed the new value back to heap memory, B still reads the stale value of x, and updates can be lost.

Because the Java memory mechanism is designed this way, multiple threads operating on the same variable can produce unsafe results. The volatile keyword was designed to solve this problem; it can only be applied to a single variable.

How volatile addresses thread unsafety of shared variables

Let's continue with the example above and declare x as follows:

volatile int x = 0;

The memory semantics of volatile are:

When a thread writes a variable declared volatile, the JMM immediately flushes the value from the thread's stack-side copy to heap memory. When a thread reads a volatile variable, the JMM invalidates the thread's first- and second-level cached copies of that variable and reads the value directly from heap memory.
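As a minimal sketch of these semantics (class and field names are my own), consider a writer thread setting a volatile flag that a reader thread spins on. The volatile write is flushed to heap memory and the volatile read bypasses any stale cached copy, so the reader's loop terminates promptly; without volatile, the reader could spin on a stale copy indefinitely:

```java
// Sketch of volatile's visibility guarantee: the writer's update to
// `stop` is flushed to heap memory, and the reader rereads it from
// heap memory instead of a stale local copy.
public class VolatileFlag {
    static volatile boolean stop = false;

    public static void main(String[] args) throws InterruptedException {
        Thread reader = new Thread(() -> {
            while (!stop) {
                // busy-wait until the writer's update becomes visible
            }
            System.out.println("reader saw stop = true");
        });
        reader.start();
        Thread.sleep(100); // let the reader enter its loop
        stop = true;       // volatile write: made visible to the reader
        reader.join(1000); // with volatile, the reader terminates promptly
    }
}
```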

Volatile can serve as a lightweight lock, but it only guarantees the memory visibility of a shared variable; it cannot guarantee the atomicity of operations on it. A lock (such as synchronized) guarantees atomicity for all the code within the locked region.
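To see that visibility is not atomicity, note that declaring the counter from the opening example volatile does not fix it (a sketch with illustrative names; the exact final value varies from run to run). Each `x++` is still a separate read, add, and write, and two threads can interleave between those steps:

```java
// Sketch: volatile makes every read see the latest flushed value,
// but x++ remains three steps (read, add, write), so concurrent
// increments can still interleave and lose updates.
public class VolatileCounter {
    static volatile int x = 0;

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            for (int i = 0; i < 10_000; i++) {
                x++; // volatile read + add + volatile write: not atomic
            }
        };
        Thread a = new Thread(task);
        Thread b = new Thread(task);
        a.start();
        b.start();
        a.join();
        b.join();
        System.out.println("x = " + x); // often below 20000
    }
}
```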

synchronized and locks

First of all, be clear that synchronized is not itself a lock. Locks are based on objects (every class descends from Object), and every object in Java can be used as a lock.

synchronized is a Java keyword that guarantees that only one thread at a time can execute the code in a critical section.

Before entering a synchronized block, a thread automatically acquires the block's intrinsic (built-in) lock; while it holds the lock, other threads that reach the same synchronized block are blocked and suspended. The thread releases the intrinsic lock when it exits the block normally, when it exits by throwing an exception, or when it calls one of the wait methods on the lock object inside the block. An intrinsic lock is exclusive: once one thread holds it, every other thread must wait for it to be released before acquiring it.
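As a sketch of how this fixes the counter from the opening example (names are my own), each increment below acquires a shared object's intrinsic lock, making the read-modify-write atomic, so two threads of 100 increments always total 200:

```java
// Sketch: synchronized makes x++ atomic by serializing access
// through the intrinsic lock of a shared object.
public class SafeCounter {
    static int x = 0;
    static final Object lock = new Object(); // any Java object can serve as the lock

    static void increment() {
        synchronized (lock) { // only one thread at a time executes this block
            x++;
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            for (int i = 0; i < 100; i++) increment();
        };
        Thread a = new Thread(task);
        Thread b = new Thread(task);
        a.start();
        b.start();
        a.join();
        b.join();
        System.out.println("x = " + x); // always prints x = 200
    }
}
```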

Memory semantics of synchronized

These differ from volatile's:

The memory semantics of entering a synchronized block are to clear from the thread's stack-side cache all shared variables used in the block, so that they must be read from heap memory, guaranteeing memory visibility. The memory semantics of exiting a synchronized block are to flush any changes made to those shared variables back to heap memory.

Think about it carefully: this lock-and-unlock process is exactly what guarantees that modifications to shared variables are visible to other threads.

This article was collected from the web and is shared for learning and reference; the copyright belongs to the original author.