Sorting confusion: why is random data a worse case than a descending array?
I'm learning Java.
I wrote bubble sort and quicksort, but this surprised me: random data takes the longest to sort.
I assumed the worst case should be a descending array (largest to smallest), but it sorts faster than random data.
I generate the random array separately, outside the timed code, so the measurement itself should be correct.
Or is a descending array not supposed to be the worst case?
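For reference, a minimal timing harness along these lines (the names, sizes, and seed are illustrative, not the asker's actual code; a proper microbenchmark would use JMH) might look like:

```java
import java.util.Arrays;
import java.util.Random;

public class SortTiming {
    // Simple bubble sort used as the measured workload.
    static void bubbleSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++)
            for (int j = 0; j < a.length - 1 - i; j++)
                if (a[j] > a[j + 1]) {
                    int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
                }
    }

    static long timeSort(int[] data) {
        int[] copy = Arrays.copyOf(data, data.length); // always sort a fresh copy
        long start = System.nanoTime();
        bubbleSort(copy);
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        int n = 5_000;
        int[] random = new Random(42).ints(n).toArray();
        int[] descending = new int[n];
        for (int i = 0; i < n; i++) descending[i] = n - i;

        // Warm up the JIT before taking the measurements that count.
        for (int i = 0; i < 5; i++) { timeSort(random); timeSort(descending); }

        System.out.printf("random:     %d ms%n", timeSort(random) / 1_000_000);
        System.out.printf("descending: %d ms%n", timeSort(descending) / 1_000_000);
    }
}
```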
For bubble sort a descending array really is the worst possible case, and the only explanation I can think of is the JIT and processor-level branch prediction. Out of curiosity, try an old virtual machine without a JIT - I suspect this case will immediately become the worst.
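One way to see that the descending array is still the worst case by operation count, even when wall-clock time says otherwise, is to count swaps (a hypothetical helper, not the asker's code):

```java
public class BubbleSwapCount {
    // Bubble sort that reports how many swaps it performed.
    static long bubbleSwaps(int[] a) {
        long swaps = 0;
        for (int i = 0; i < a.length - 1; i++)
            for (int j = 0; j < a.length - 1 - i; j++)
                if (a[j] > a[j + 1]) {
                    int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
                    swaps++;
                }
        return swaps;
    }

    public static void main(String[] args) {
        int n = 1_000;
        int[] descending = new int[n];
        for (int i = 0; i < n; i++) descending[i] = n - i;
        // Every comparison swaps: n*(n-1)/2 swaps, the maximum possible.
        System.out.println(bubbleSwaps(descending)); // prints 499500
    }
}
```

On random data the branch `a[j] > a[j + 1]` is taken about half the time in an unpredictable pattern, which is where branch misprediction hurts; on the descending array it is always taken, so the CPU predicts it perfectly despite doing more work.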
For quicksort a decreasing array is one of the best cases: exactly [n/2] swaps are made. It is easy to show that elements at the beginning and end of the array that already stand in their final places will never be moved by quicksort. So the first thing the sort does is split the array into the smaller elements (which end up in order) and the larger ones (partly in the right order, partly still reversed). The part that is already in order the sort will never touch again; to the part that is still reversed we apply induction, and we get exactly [n/2] swaps.
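This is easy to check empirically. The sketch below uses a Hoare-partition quicksort with the middle element as pivot (one common textbook variant; the asker's pivot rule may differ) and counts real swaps on a reversed array:

```java
public class QuickSwapCount {
    static long swaps = 0;

    // Hoare-style partition, middle element as pivot, counting swaps.
    static void quickSort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        int pivot = a[(lo + hi) >>> 1];
        int i = lo, j = hi;
        while (i <= j) {
            while (a[i] < pivot) i++;
            while (a[j] > pivot) j--;
            if (i <= j) {
                if (i < j) {  // don't count an element swapped with itself
                    int t = a[i]; a[i] = a[j]; a[j] = t;
                    swaps++;
                }
                i++; j--;
            }
        }
        quickSort(a, lo, j);
        quickSort(a, i, hi);
    }

    public static void main(String[] args) {
        int n = 1_000;
        int[] descending = new int[n];
        for (int i = 0; i < n; i++) descending[i] = n - i;
        quickSort(descending, 0, n - 1);
        // The first partition reverses the array pair by pair, and the
        // recursion then finds every subarray already sorted: n/2 swaps.
        System.out.println(swaps); // prints 500
    }
}
```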
Neither algorithm is particularly sensitive to the input data.
For bubble sort all inputs are about equally bad except an already sorted array. For quicksort, how close your pivot is to the median matters far more than the distribution of the data. The whole point of quicksort optimizations is choosing the pivot well (a random choice is considered good enough).
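The random-pivot idea fits into a standard Hoare-partition quicksort with one extra line; a minimal sketch (not a production sort, and not the asker's code):

```java
import java.util.concurrent.ThreadLocalRandom;

public class RandomPivotQuickSort {
    static void quickSort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        // Random pivot: no fixed input pattern can reliably force O(n^2).
        int pivot = a[ThreadLocalRandom.current().nextInt(lo, hi + 1)];
        int i = lo, j = hi;
        while (i <= j) {
            while (a[i] < pivot) i++;
            while (a[j] > pivot) j--;
            if (i <= j) {
                int t = a[i]; a[i] = a[j]; a[j] = t;
                i++; j--;
            }
        }
        quickSort(a, lo, j);
        quickSort(a, i, hi);
    }
}
```

With a randomized pivot the expected running time is O(n log n) for every input, sorted, reversed, or random alike, which is why the distribution of the data stops mattering.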