COMP 320: Introduction to Computer Organization

Asymptotes: Why?
How to describe an algorithm’s running time?
(or space, …)
How does the running time depend on the input?
T(x) = running time for instance x
Problem: Impractical to use, e.g.,
“15 steps to sort [3 9 1 7], 13 steps to sort [1 2 0 3 9], …”
Need to abstract away from the individual instances.
Asymptotes: Why?
Standard solution: Abstract based on size of input.
How does the running time depend on the input?
T(n) = running time for instances of size n
Problem: Time also depends on other factors.
E.g., on sortedness of array.
Asymptotes: Why?
Solution: Provide a bound over these instances.
Worst case (most common; the default):
  T(n) = max{ T(x) | x is an instance of size n }
Best case:
  T(n) = min{ T(x) | x is an instance of size n }
Average case:
  T(n) = Σ_{|x|=n} Pr{x} · T(x)
Determining the input probability distribution can be difficult.
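A minimal sketch of these three definitions in Python, assuming insertion-sort comparison counts as the cost T(x) and a uniform distribution over permutations (both modeling choices are mine, not the slides'):

from itertools import permutations

def comparisons(x):
    """T(x): number of comparisons insertion sort makes on the instance x."""
    a, count = list(x), 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            count += 1                  # one comparison of key against a[j]
            if a[j] <= key:
                break
            a[j + 1] = a[j]             # shift the larger element right
            j -= 1
        a[j + 1] = key
    return count

n = 4                                   # brute force is only feasible for tiny n
costs = [comparisons(p) for p in permutations(range(n))]
print("worst case   T(n) =", max(costs))                 # max over all instances of size n
print("best case    T(n) =", min(costs))                 # min over all instances of size n
print("average case T(n) =", sum(costs) / len(costs))    # uniform Pr{x} = 1/n!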
Asymptotes: Why?
What’s confusing about this notation?
Worst case:    T(n) = max{ T(x) | x is an instance of size n }
Best case:     T(n) = min{ T(x) | x is an instance of size n }
Average case:  T(n) = Σ_{|x|=n} Pr{x} · T(x)
Two different kinds of functions:
T(instance)
T(size of instance)
Won’t use T(instance) notation again, so can ignore.
Asymptotes: Why?
Problem: T(n) = 3n^2 + 14n + 17
Too much detail: the constants may reflect implementation
details, and the lower-order terms are insignificant.
Solution: Ignore the constants & low-order terms.
(Omitted details still important pragmatically.)

      n         3n^2   14n+17
      1            3       31
     10          300      157
    100       30,000    1,417
   1000    3,000,000   14,017
  10000  300,000,000  140,017

3n^2 > 14n+17 for all "large enough" n.
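A quick check of where the crossover happens (a small sketch I added, not from the slides):

# Find the smallest n at which the leading term wins: 3n^2 > 14n + 17.
n = 1
while 3 * n**2 <= 14 * n + 17:
    n += 1
print(n)   # 6 -- from n = 6 on, 3n^2 exceeds 14n + 17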
Upper Bounds
Creating an algorithm proves we can solve the
problem within a given bound.
But another algorithm might be faster.
E.g., sorting an array.
Insertion sort ⇒ O(n^2)
What are example algorithms for
O(1), O(log n), O(n), O(n log n), O(n^2), O(n^3), O(2^n)?
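One possible set of answers, sketched in Python below; these are my illustrative picks, not an official list from the course:

# One illustrative algorithm per class (worst-case bounds):
#   O(1)        array indexing:  a[i]
#   O(log n)    binary search (below)
#   O(n)        linear scan for the maximum:  max(a)
#   O(n log n)  merge sort
#   O(n^2)      insertion sort
#   O(n^3)      schoolbook matrix multiplication
#   O(2^n)      enumerating all subsets of an n-element set (below)

def binary_search(a, target):
    """O(log n): the sorted range [lo, hi] is halved on every iteration."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def subsets(items):
    """O(2^n): each element is either in or out, so 2^n subsets get built."""
    if not items:
        return [[]]
    rest = subsets(items[1:])
    return rest + [[items[0]] + s for s in rest]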
Lower Bounds
Sometimes we can prove that a problem cannot be solved
in less than a certain amount of time.
That doesn't necessarily mean we know an algorithm that
achieves this lower bound.
E.g., sorting an array.
# comparisons needed in worst case ∈ Ω(n log n)
Shown in COMP 482.
Definitions: O, Ω

T(n) ∈ O(g(n))
  ⇔  ∃ constants C, k > 0 such that
      ∀ n ≥ k, T(n) ≤ C·g(n)

T(n) ∈ Ω(g'(n))
  ⇔  ∃ constants C', k' > 0 such that
      ∀ n ≥ k', T(n) ≥ C'·g'(n)

[Figure: T(n) lies below C·g(n) for all n ≥ k, and above C'·g'(n) for all n ≥ k'.]
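As a concrete check of the definitions, here is a small sketch of my own exhibiting witnesses for 2n+13; the finite loop is only a spot-check, and it is the algebra in the comments that establishes the "for all n" claims:

# Witnesses for 2n+13 ∈ O(n): C = 3, k = 13 work, since 2n + 13 <= 3n exactly when n >= 13.
C, k = 3, 13
assert all(2*n + 13 <= C * n for n in range(k, 10_000))

# Witnesses for 2n+13 ∈ Ω(n): C' = 2, k' = 1 work, since 2n + 13 >= 2n for every n >= 1.
C_prime, k_prime = 2, 1
assert all(2*n + 13 >= C_prime * n for n in range(k_prime, 10_000))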
Examples: O, Ω

2n+13 ∈ O( ? )    O(n); also O(n^2), O(5n), … (can always weaken the bound)
2n+13 ∈ Ω( ? )    Ω(n); also Ω(log n), Ω(1), …

2^n ∈ O(n)? Ω(n)?
  Given any C, 2^n ≥ C·n for all but small n.
  So Ω(n), not O(n).

n^(log n) ∈ O(n^5)?
  No. Given any C, n^(log n) ≥ C·n^5 for all large enough n.
  Thus Ω(n^5).
Definitions: Θ

T(n) ∈ Θ(g(n))
  ⇔  T(n) ∈ O(g(n)) and T(n) ∈ Ω(g(n))
Ideally, find algorithms that are asymptotically as good
as possible.
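For instance, 2n+13 ∈ Θ(n): it is O(n) (take C = 3, k = 13) and Ω(n) (take C' = 2, k' = 1).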
Notation
O(·), Ω(·), Θ(·) are sets of functions.
But common to abuse notation, writing
T(n) = O(…)
instead of T(n) ∈ O(…)
as well as T(n) = f(n) + O(…)
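For instance, T(n) = 2n^2 + O(n) means T(n) = 2n^2 + f(n) for some f(n) ∈ O(n).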