Time complexity for Shell sort?

耶瑟儿~ 2020-12-03 08:46

First, here's my Shell sort code (in Java):

public char[] shellSort(char[] chars) {
    int n = chars.length;
    int increment = n / 2;
    while (increment > 0) {
        // one insertion-sort pass over each increment-spaced subsequence
        for (int i = increment; i < n; i++) {
            char temp = chars[i];
            int j = i;
            while (j >= increment && chars[j - increment] > temp) {
                chars[j] = chars[j - increment];
                j -= increment;
            }
            chars[j] = temp;
        }
        increment /= 2;
    }
    return chars;
}

2 Answers
  • 2020-12-03 09:22

    Insertion Sort

    If we analyse plain insertion sort first:

    static void sort(int[] ary) {
        int i, j, insertVal;
        int aryLen = ary.length;
        for (i = 1; i < aryLen; i++) {
            insertVal = ary[i];
            j = i;
            /*
             * the while loop exits as soon as the element to the
             * left is less than or equal to insertVal
             */
            while (j >= 1 && ary[j - 1] > insertVal) { 
                ary[j] = ary[j - 1];
                j--;
            }
            ary[j] = insertVal;
        }
    }
    

    Hence, in the average case, the inner while loop exits about halfway through each scan:

    1/2 + 2/2 + 3/2 + ... + (n-1)/2 = n(n-1)/4 = Theta(n^2)

    So even the average case is quadratic; the constant factor of two does not change the asymptotic class.
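    To make the counting argument concrete, here is a minimal sketch (the class and method names are ours, not from the answer) that instruments the insertion sort above to count how many elements the inner while loop shifts. On reversed input every pair is an inversion, so the count is exactly n(n-1)/2.

    ```java
    public class ShiftCount {
        // The insertion sort from above, instrumented to count shifts.
        static long countShifts(int[] ary) {
            long shifts = 0;
            for (int i = 1; i < ary.length; i++) {
                int insertVal = ary[i];
                int j = i;
                while (j >= 1 && ary[j - 1] > insertVal) {
                    ary[j] = ary[j - 1];
                    j--;
                    shifts++; // one shift per displaced element
                }
                ary[j] = insertVal;
            }
            return shifts;
        }

        public static void main(String[] args) {
            // Reversed input: shifts = n(n-1)/2, the worst case.
            int[] reversed = new int[100];
            for (int i = 0; i < 100; i++) reversed[i] = 100 - i;
            System.out.println(countShifts(reversed)); // 100*99/2 = 4950
        }
    }
    ```

    A sorted input makes the same method return 0, which is the best-case behaviour the rest of this answer relies on.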

    Shell sort is nothing but insertion sort applied over the gaps n/2, n/4, n/8, ..., 2, 1. The early passes leave the array nearly sorted within each gap, so in the later passes the inner while loop exits almost immediately (the best-case behaviour of insertion sort), which keeps the total execution time down.

    With gaps n/2, n/4, n/8, ..., 2, 1 there are about log n passes, and each pass touches roughly n elements, so in the near-sorted case the total is about n log n comparisons.

    Hence, with the extra work the later passes still have to do, its time complexity works out to something close to n(log n)^2.
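    The pass-count arithmetic can be checked with an instrumented shell sort (a sketch; the class and field names are ours). On already-sorted input each of the log2 n passes makes exactly n - gap comparisons, so for n = 1024 the total is 10*1024 - 1023 = 9217, right around n log2 n.

    ```java
    public class ShellCompareCount {
        static long compares;

        // Shell sort with gaps n/2, n/4, ..., 1, counting key comparisons.
        static void shellSort(int[] a) {
            compares = 0;
            for (int gap = a.length / 2; gap > 0; gap /= 2) {
                for (int i = gap; i < a.length; i++) {
                    int temp = a[i];
                    int j = i;
                    while (j >= gap && greater(a[j - gap], temp)) {
                        a[j] = a[j - gap];
                        j -= gap;
                    }
                    a[j] = temp;
                }
            }
        }

        static boolean greater(int x, int y) {
            compares++; // count every key comparison
            return x > y;
        }

        public static void main(String[] args) {
            int n = 1024;
            int[] sorted = new int[n];
            for (int i = 0; i < n; i++) sorted[i] = i;
            shellSort(sorted);
            // 10 passes, each making n - gap comparisons: 10240 - 1023 = 9217
            System.out.println(compares);
        }
    }
    ```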

  • 2020-12-03 09:42

    The worst case of your implementation is Θ(n^2) and the best case is O(n log n), which is reasonable for shell sort.

    The best case ∊ O(nlogn):

    The best case is when the array is already sorted. This would mean that the inner while loop's comparison is never true, making the inner loop a constant-time operation. Using the bounds you've used for the other loops gives O(n log n). A best case of O(n) is reached by using a constant number of increments.

    The worst case ∊ O(n^2):

    Given your upper bound for each loop you get O(n^2 log n) for the worst case. But introduce a variable g for the gap size: the inner while loop now needs <= n/g compare/exchanges per insertion, so the middle loop needs <= n^2/g per pass. Adding this upper bound over all the gaps gives n^2 + n^2/2 + n^2/4 + ... <= 2n^2 ∊ O(n^2). This matches the known worst-case complexity for the gaps you've used.

    The worst case ∊ Ω(n^2):

    Consider an array where all the even-positioned elements are greater than the median. The odd- and even-positioned elements are not compared until we reach the last increment of 1, so the number of compare/exchanges needed for that final pass alone is Ω(n^2).
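    A sketch of that construction (all names are ours), assuming n is a power of two so every gap before the last is even: even positions hold the n/2 largest values and odd positions the smallest, each parity chain already sorted. The even gaps then never mix the parities and move nothing, leaving (n/2)(n/2 + 1)/2 = Ω(n^2) shifts for the final gap-1 pass.

    ```java
    public class ShellWorstCase {
        // Shell sort with gaps n/2, ..., 1; count only the shifts
        // performed during the final gap-1 pass.
        static long gap1Shifts(int[] a) {
            long shifts = 0;
            for (int gap = a.length / 2; gap > 0; gap /= 2) {
                for (int i = gap; i < a.length; i++) {
                    int temp = a[i];
                    int j = i;
                    while (j >= gap && a[j - gap] > temp) {
                        a[j] = a[j - gap];
                        j -= gap;
                        if (gap == 1) shifts++; // last pass only
                    }
                    a[j] = temp;
                }
            }
            return shifts;
        }

        // Even positions get the n/2 largest values, odd positions the
        // smallest; every even gap keeps the two parities separate.
        static int[] adversarial(int n) {
            int[] a = new int[n];
            for (int m = 0; m < n / 2; m++) {
                a[2 * m] = n / 2 + m;
                a[2 * m + 1] = m;
            }
            return a;
        }

        public static void main(String[] args) {
            int n = 1024;
            long shifts = gap1Shifts(adversarial(n));
            // (n/2)(n/2 + 1)/2 = 131328 shifts >= n^2/8 = 131072
            System.out.println(shifts);
        }
    }
    ```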
