Tuesday, 25 December 2018

Analysis of insertion sort

Add statement costs and number of times executed to INSERTION-SORT
• Assume that the ith line takes time ci, which is a constant. (Since the third line
is a comment, it takes no time.)
• For j = 2, 3, . . . , n, let tj be the number of times that the while loop test is
executed for that value of j.
• Note that when a for or while loop exits in the usual way—due to the test in the
loop header—the test is executed one time more than the loop body.
The running time of the algorithm is the sum, over all statements, of

(cost of statement) · (number of times statement is executed) .

Let T(n) = running time of INSERTION-SORT.
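The annotated pseudocode itself is not reproduced in these notes, so here is a rough Python translation with each statement's cost and execution count marked in comments (the names c1, . . . , c8 mirror the usual numbering of the pseudocode lines; treat this as a sketch of the bookkeeping rather than the pseudocode itself):

    def insertion_sort(A):
        # Python translation of the 1-indexed INSERTION-SORT pseudocode (here 0-indexed).
        n = len(A)
        for j in range(1, n):              # cost c1, runs n times (test runs once more than the body)
            key = A[j]                     # cost c2, runs n - 1 times
            # Insert A[j] into the sorted subarray A[0 .. j-1].   (comment line: no cost)
            i = j - 1                      # cost c4, runs n - 1 times
            while i >= 0 and A[i] > key:   # cost c5, test runs tj times for each j
                A[i + 1] = A[i]            # cost c6, runs tj - 1 times for each j
                i = i - 1                  # cost c7, runs tj - 1 times for each j
            A[i + 1] = key                 # cost c8, runs n - 1 times
        return A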

The running time depends on the values of tj . These vary according to the input.
Best case: The array is already sorted.
• Always find that A[i] ≤ key the first time the while loop test is run (when
i = j − 1).
• All tj are 1 (see the quick check after this list).
• Running time is
T(n) = c1n + c2(n − 1) + c4(n − 1) + c5(n − 1) + c8(n − 1)
= (c1 + c2 + c4 + c5 + c8)n − (c2 + c4 + c5 + c8) .
• Can express T(n) as an + b for constants a and b (that depend on the statement
costs ci) ⇒ T(n) is a linear function of n.
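One way to convince yourself of the tj = 1 claim is to instrument the while-loop test and count how many times it runs on an already sorted array; a small Python sketch (the counting, not the sorting, is the point):

    def count_while_tests(A):
        # Return, for each j, how many times the while-loop test executes (tj).
        A = list(A)
        tests = []
        for j in range(1, len(A)):
            key = A[j]
            i = j - 1
            t = 0
            while True:
                t += 1                       # one execution of the while test
                if i >= 0 and A[i] > key:    # the actual test from INSERTION-SORT
                    A[i + 1] = A[i]
                    i -= 1
                else:
                    break
            A[i + 1] = key
            tests.append(t)
        return tests

    print(count_while_tests([1, 2, 3, 4, 5]))  # sorted input -> [1, 1, 1, 1]: every tj is 1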


Worst case: The array is in reverse sorted order.
• Always find that A[i] > key in the while loop test.
• Have to compare key with all elements to the left of the jth position ⇒ compare
with j − 1 elements.
• Since the while loop exits because i reaches 0, there's one additional test after
the j − 1 tests ⇒ tj = j (the resulting running time is worked out below).
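Plugging tj = j into the sum, and writing c6 and c7 for the costs of the two statements in the while-loop body (a sketch that follows the same bookkeeping as the best case, using sum over j = 2, . . . , n of j = n(n + 1)/2 − 1 and of (j − 1) = n(n − 1)/2):

T(n) = c1n + c2(n − 1) + c4(n − 1) + c5(n(n + 1)/2 − 1)
       + c6(n(n − 1)/2) + c7(n(n − 1)/2) + c8(n − 1)
     = (c5/2 + c6/2 + c7/2)n² + (c1 + c2 + c4 + c5/2 − c6/2 − c7/2 + c8)n
       − (c2 + c4 + c5 + c8) .

This has the form an² + bn + c for constants a, b, c that depend on the ci ⇒ T(n) is a quadratic function of n.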


Worst-case and average-case analysis
We usually concentrate on finding the worst-case running time: the longest running
time for any input of size n.
Reasons:
• The worst-case running time gives a guaranteed upper bound on the running
time for any input.
• For some algorithms, the worst case occurs often. For example, when searching,
the worst case often occurs when the item being searched for is not present,
and searches for absent items may be frequent.
• Why not analyze the average case? Because it’s often about as bad as the worst
case.
Example: Suppose that we randomly choose n numbers as the input to insertion
sort.
On average, the key in A[j] is less than half of the elements in A[1 .. j − 1] and
greater than the other half.
⇒ On average, the while loop has to look halfway through the sorted subarray
A[1 .. j − 1] to decide where to drop key.
⇒ tj ≈ j/2.
Although the average-case running time is approximately half of the worst-case
running time, it’s still a quadratic function of n.
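The "approximately half" claim is easy to check empirically. The sketch below (Python, illustration only) counts the total number of while-loop tests on random permutations and compares it with the reverse-sorted worst case; the ratio comes out near 0.5, yet both counts grow quadratically:

    import random

    def total_while_tests(A):
        # Total number of while-loop test executions over the whole sort.
        A = list(A)
        total = 0
        for j in range(1, len(A)):
            key = A[j]
            i = j - 1
            while True:
                total += 1
                if i >= 0 and A[i] > key:
                    A[i + 1] = A[i]
                    i -= 1
                else:
                    break
            A[i + 1] = key
        return total

    n = 1000
    worst = total_while_tests(list(range(n, 0, -1)))          # reverse-sorted input
    avg = sum(total_while_tests(random.sample(range(n), n))   # random permutations
              for _ in range(20)) / 20
    print(worst, avg, avg / worst)   # ratio is roughly 0.5, but avg still grows like n²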
Order of growth
Another abstraction to ease analysis and focus on the important features.
Look only at the leading term of the formula for running time.
• Drop lower-order terms.
• Ignore the constant coefficient in the leading term.
Example: For insertion sort, we already abstracted away the actual statement costs
to conclude that the worst-case running time is an² + bn + c.
Drop lower-order terms ⇒ an².
Ignore constant coefficient ⇒ n².
But we cannot say that the worst-case running time T(n) equals n².
It grows like n². But it doesn't equal n².
We say that the running time is Θ(n²) to capture the notion that the order of growth
is n².
We usually consider one algorithm to be more efficient than another if its worst-case
running time has a smaller order of growth.
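A quick numeric illustration of "grows like n²": divide a quadratic running-time formula by n² and watch the lower-order terms fade. The constants a = 3, b = 50, c = 100 below are made up for the example:

    a, b, c = 3, 50, 100           # hypothetical statement-cost constants
    for n in [10, 100, 1000, 10000, 100000]:
        t = a * n * n + b * n + c
        print(n, t, t / (n * n))   # ratio approaches a = 3 as n grows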

Designing algorithms
There are many ways to design algorithms.
For example, insertion sort is incremental: having sorted A[1 .. j − 1], place A[j]
correctly, so that A[1 .. j] is sorted.
Divide and conquer
Another common approach.
Divide the problem into a number of subproblems.
Conquer the subproblems by solving them recursively.
Base case: If the subproblems are small enough, just solve them by brute force.
[It would be a good idea to make sure that your students are comfortable with
recursion. If they are not, then they will have a hard time understanding divide
and conquer.]
Combine the subproblem solutions to give a solution to the original problem.
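To make the three steps concrete, here is a minimal Python sketch of merge sort, a standard divide-and-conquer algorithm (an illustration, not pseudocode from the text):

    def merge_sort(A):
        # Base case: a list of length 0 or 1 is already sorted (solve by brute force).
        if len(A) <= 1:
            return A
        # Divide: split the problem into two subproblems of half the size.
        mid = len(A) // 2
        # Conquer: solve each subproblem recursively.
        left = merge_sort(A[:mid])
        right = merge_sort(A[mid:])
        # Combine: merge the two sorted halves into one sorted list.
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))   # -> [1, 2, 2, 3, 4, 5, 6, 7]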
