Implementing distributed application rate limiting based on Redis
The purpose of rate limiting is to protect the system by restricting the rate of concurrent access or the number of requests within a time window. Once the limit is reached, further requests can be rejected.
A few days ago, on DD's official WeChat account, I saw a post about using Redis to implement rate limiting in a single application. The hands-on version in the original post was based on Jedis. Both belong to business-level rate limiting. Rate limiting strategies commonly used in real scenarios include:
Nginx access-layer rate limiting
Limiting traffic at the Nginx layer according to rules such as account, IP, or system call logic.
Rate limiting in the business application
Controlling traffic in business code, typically with a semaphore. A semaphore can be thought of as a lock that limits the maximum number of threads allowed to access a resource at the same time.
code implementation
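A minimal sketch of a semaphore-based limiter, using `java.util.concurrent.Semaphore`; the class name `SemaphoreLimitService` and the permit count of 2 are illustrative, not the original post's exact code.

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

// Hypothetical service: at most MAX_CONCURRENT callers may hold a permit at once.
public class SemaphoreLimitService {

    // Maximum number of threads allowed to run the protected logic concurrently.
    private static final int MAX_CONCURRENT = 2;

    private final Semaphore semaphore = new Semaphore(MAX_CONCURRENT);

    public String execute() {
        // tryAcquire() returns immediately instead of blocking; callers that
        // fail to get a permit are rejected.
        if (!semaphore.tryAcquire()) {
            throw new IllegalStateException("Too many concurrent requests, please retry later");
        }
        try {
            // Simulate the protected business logic.
            TimeUnit.SECONDS.sleep(1);
            return "success";
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return "interrupted";
        } finally {
            // Always release the permit so the slot becomes available again.
            semaphore.release();
        }
    }
}
```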
call
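A sketch of a Spring MVC controller calling the limited service; the path `/limit/test` and the class names are illustrative.

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical controller exposing the limited service.
@RestController
public class LimitController {

    private final SemaphoreLimitService limitService = new SemaphoreLimitService();

    @GetMapping("/limit/test")
    public String test() {
        // When no permit is available the service throws, which Spring turns
        // into a 500 response; successful calls return 200.
        return limitService.execute();
    }
}
```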
Test interface call
optimization
Optimize the code with an interceptor and a custom annotation
Interceptor
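A sketch of what such an interceptor could look like, assuming Spring MVC's `HandlerInterceptor` and the `@RateLimit` annotation defined in the next step; the per-method semaphore map and the request attribute name are illustrative choices.

```java
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.Semaphore;
import org.springframework.web.method.HandlerMethod;
import org.springframework.web.servlet.HandlerInterceptor;

// Hypothetical interceptor: if the target handler method carries @RateLimit,
// acquire a permit from a per-method semaphore before letting the request through.
public class RateLimitInterceptor implements HandlerInterceptor {

    // One semaphore per annotated handler method, created lazily.
    private final Map<String, Semaphore> semaphores = new ConcurrentHashMap<>();

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response,
                             Object handler) throws Exception {
        if (!(handler instanceof HandlerMethod)) {
            return true;
        }
        HandlerMethod method = (HandlerMethod) handler;
        RateLimit limit = method.getMethodAnnotation(RateLimit.class);
        if (limit == null) {
            return true; // not annotated, no limiting
        }
        Semaphore semaphore = semaphores.computeIfAbsent(
                method.getMethod().toGenericString(),
                key -> new Semaphore(limit.permits()));
        if (!semaphore.tryAcquire()) {
            // No permit available: reject with HTTP 500, as in the test results below.
            response.sendError(HttpServletResponse.SC_INTERNAL_SERVER_ERROR,
                    "Too many concurrent requests");
            return false;
        }
        // Store the semaphore so afterCompletion can release the permit.
        request.setAttribute("rateLimitSemaphore", semaphore);
        return true;
    }

    @Override
    public void afterCompletion(HttpServletRequest request, HttpServletResponse response,
                                Object handler, Exception ex) {
        Semaphore semaphore = (Semaphore) request.getAttribute("rateLimitSemaphore");
        if (semaphore != null) {
            semaphore.release();
        }
    }
}
```

The interceptor still has to be registered with Spring MVC, for example through a `WebMvcConfigurer`'s `addInterceptors` method.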
Define annotation
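A possible definition for the annotation, with a `permits` attribute (an assumed name) carrying the concurrency limit:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical annotation: marks a handler method as rate limited and
// carries the number of concurrent permits it allows.
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
public @interface RateLimit {

    // Maximum number of requests allowed to run the method concurrently.
    int permits() default 2;
}
```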
use
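A sketch of how the annotation might be applied to a controller method; the path and the one-second sleep simulating business work are illustrative.

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical usage: the controller method only declares the limit;
// the interceptor does the acquiring and releasing.
@RestController
public class AnnotatedLimitController {

    @GetMapping("/limit/annotation")
    @RateLimit(permits = 2)
    public String test() throws InterruptedException {
        Thread.sleep(1000); // simulate slow business logic
        return "success";
    }
}
```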
Concurrent testing
Tool: apache-jmeter-3.2
Note: requests that fail to acquire the semaphore return 500 (shown red in JMeter), while requests that acquire it return 200 (shown green).
With the semaphore limit set to 2 and 5 concurrent threads:
With the semaphore limit set to 5 and 10 concurrent threads:
data
Implementation based on Redis + Lua
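The Redis + Lua approach moves the counting into Redis, so every instance of the application shares one limit. Below is a sketch using a common counter-in-a-window script executed through Jedis's `eval`; the script, the key naming, and the `RedisRateLimiter` class are assumptions for illustration, not the original post's exact code.

```java
import java.util.Arrays;
import redis.clients.jedis.Jedis;

// Hypothetical sketch of distributed rate limiting with Redis + Lua.
// The script is executed atomically on the Redis server, so the
// check-and-increment cannot be interleaved by concurrent callers.
public class RedisRateLimiter {

    // Lua: increment the counter for this key; the first request in a window
    // sets the expiry; return 0 when the limit for the window is exceeded.
    private static final String SCRIPT =
            "local key = KEYS[1] " +
            "local limit = tonumber(ARGV[1]) " +
            "local current = tonumber(redis.call('get', key) or '0') " +
            "if current + 1 > limit then " +
            "  return 0 " +
            "else " +
            "  redis.call('incrby', key, 1) " +
            "  redis.call('expire', key, ARGV[2]) " +
            "  return 1 " +
            "end";

    private final Jedis jedis;

    public RedisRateLimiter(Jedis jedis) {
        this.jedis = jedis;
    }

    /**
     * @param key           identifies the limited resource, e.g. an interface name or IP
     * @param limit         maximum number of requests allowed per window
     * @param windowSeconds length of the time window in seconds
     * @return true if the request is allowed
     */
    public boolean allow(String key, int limit, int windowSeconds) {
        Object result = jedis.eval(SCRIPT,
                Arrays.asList(key),
                Arrays.asList(String.valueOf(limit), String.valueOf(windowSeconds)));
        return Long.valueOf(1L).equals(result);
    }
}
```

Because the counting and the expiry both happen inside a single Lua script on the Redis side, this variant limits total traffic across all application instances, which is what makes it suitable for distributed rate limiting.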