An example of linear-time selection using divide and conquer in Java
This article describes a linear-time selection operation implemented in Java using the divide-and-conquer approach, shared here for your reference.
The linear-time selection problem: given n elements from a linearly ordered set and an integer k with 1 ≤ k ≤ n, find the k-th smallest of the n elements (the given set here is unordered).
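For concreteness, the problem can first be solved with a naive O(n log n) baseline that simply sorts a copy of the array and indexes it; this is not the linear-time algorithm the article builds, and the class and method names below are illustrative:

```java
import java.util.Arrays;

public class NaiveSelect {
    // Naive baseline: sort a copy of the array and take index k-1 (k is 1-based).
    // Costs O(n log n) time, versus O(n) expected for the randomized algorithm below.
    public static int kthSmallest(int[] a, int k) {
        int[] copy = a.clone();
        Arrays.sort(copy);
        return copy[k - 1];
    }

    public static void main(String[] args) {
        System.out.println(kthSmallest(new int[]{5, 2, 9, 1, 7}, 2)); // prints 2
    }
}
```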
Linear selection by randomized partitioning
The randomized-partition approach to linear-time selection can imitate the design of the randomized quicksort algorithm. The basic idea is to partition the input array recursively; unlike quicksort, however, it recurses into only one of the two subarrays produced by the partition.
Program explanation: use a random function to choose the partition pivot, and partition the array a[p:r] into two subarrays a[p:i] and a[i+1:r] such that every element of a[p:i] is no greater than every element of a[i+1:r]. Then j = i - p + 1 gives the number of elements in a[p:i]. If k <= j, the k-th smallest element of a[p:r] lies in the subarray a[p:i]; if k > j, it lies in the subarray a[i+1:r]. Note: in the latter case the j elements of a[p:i] are all no greater than the k-th smallest element being sought, so the k-th smallest element of a[p:r] is the (k-j)-th smallest element of a[i+1:r].
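The steps above can be sketched in Java as follows. This is one possible implementation under the stated scheme (a randomly chosen pivot with a Hoare-style partition, so that a[p..i] <= a[i+1..r]); the class and method names are assumptions, not taken from the original article:

```java
import java.util.Random;

public class RandomizedSelect {
    private static final Random RAND = new Random();

    // Returns the k-th smallest element (1-based) of a[p..r].
    public static int randomizedSelect(int[] a, int p, int r, int k) {
        if (p == r) {
            return a[p];
        }
        int i = randomizedPartition(a, p, r);
        int j = i - p + 1;                               // number of elements in a[p..i]
        if (k <= j) {
            return randomizedSelect(a, p, i, k);         // answer lies in a[p..i]
        } else {
            return randomizedSelect(a, i + 1, r, k - j); // answer is (k-j)-th smallest in a[i+1..r]
        }
    }

    // Hoare-style partition around a randomly chosen pivot: afterwards every
    // element of a[p..i] is <= every element of a[i+1..r], where i is returned.
    private static int randomizedPartition(int[] a, int p, int r) {
        int pivotIndex = p + RAND.nextInt(r - p + 1);    // random pivot position in [p, r]
        swap(a, p, pivotIndex);
        int pivot = a[p];
        int i = p - 1, j = r + 1;
        while (true) {
            do { i++; } while (a[i] < pivot);
            do { j--; } while (a[j] > pivot);
            if (i >= j) {
                return j;                                // j is always in [p, r-1], so recursion shrinks
            }
            swap(a, i, j);
        }
    }

    private static void swap(int[] a, int x, int y) {
        int t = a[x];
        a[x] = a[y];
        a[y] = t;
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 9, 1, 7, 3, 8, 6, 4};
        System.out.println(randomizedSelect(a, 0, a.length - 1, 3)); // prints 3
    }
}
```

Note that the partition is done in place, so the input array is reordered as a side effect; callers that need the original order should pass a copy.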
In the worst case (for example, when searching for the smallest element while the partition repeatedly splits off only the largest element), the time complexity is O(n^2). The average time complexity, however, is linear in n, i.e., O(n).