Lecture 2

Asymptotic Analysis
• Motivation
• Definitions
• Common complexity functions
• Example problems
Motivation
• Let's agree that we are interested in
performing a worst-case analysis of
algorithms
• Do we need to do an exact analysis?
Exact Analysis is Hard!
Even Harder Exact Analysis
Simplifications
• Ignore constants
• Asymptotic Efficiency
Why ignore constants?
• Implementation issues (hardware, code
optimizations) can speed up an algorithm by
constant factors
– We want to understand how effective an
algorithm is independently of these factors
• Simplification of analysis
– Much easier to analyze if we focus only on n²
rather than worrying about 3.7n² or 3.9n²
Asymptotic Analysis
• We focus on behavior for arbitrarily large n,
ignoring small values of n
• Usually, an algorithm that is asymptotically
more efficient will be the best choice for all
but very small inputs.
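This crossover effect can be made concrete. A minimal sketch, where the running-time formulas 2n² and 50·n·log₂n are invented for illustration (not taken from the slides):

```python
import math

# Two made-up running-time formulas: a quadratic algorithm with a small
# constant factor vs. an n log n algorithm with a much larger one.
def quadratic(n):
    return 2 * n * n

def nlogn(n):
    return 50 * n * math.log2(n)

print(quadratic(8), nlogn(8))   # 128 1200.0 -- quadratic wins for small n
# Past a crossover point the asymptotically better algorithm wins forever:
crossover = next(n for n in range(2, 10**6) if quadratic(n) > nlogn(n))
print(crossover)                # 190
```

Beyond n = 190 the quadratic algorithm never catches up again, which is exactly why asymptotic comparisons ignore the constants.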
[Figure: the number line from 0 to infinity; the analysis concerns behavior as n → ∞]
“Big Oh” Notation
• O(f(n)) =
{g(n) : there exist positive constants c and n0
such that 0 <= g(n) <= c·f(n) for all n >= n0}
– What are the roles of the two constants?
• n0: the threshold; the inequality only needs to hold for all n >= n0
• c: the scale factor; it absorbs constant-factor differences between g(n) and f(n)
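The definition can be spot-checked mechanically once witnesses c and n0 are chosen. A sketch (the function `bound_holds` and its example inputs are ours, not from the slides):

```python
def bound_holds(g, f, c, n0, n_max=10_000):
    """Spot-check the Big-Oh condition 0 <= g(n) <= c*f(n) for every
    tested n in [n0, n_max).  A finite check, not a proof."""
    return all(0 <= g(n) <= c * f(n) for n in range(n0, n_max))

# Witnesses c = 4, n0 = 1 support the claim n^2 + 3n = O(n^2):
# n^2 + 3n <= 4n^2 whenever 3n <= 3n^2, i.e. for all n >= 1.
print(bound_holds(lambda n: n * n + 3 * n, lambda n: n * n, c=4, n0=1))  # True
# No witnesses exist for n^2 = O(n); any candidate c eventually fails:
print(bound_holds(lambda n: n * n, lambda n: n, c=100, n0=1))            # False
```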
Set Notation Comment
• O(f(n)) is a set of functions.
• However, we will use one-way equalities like
n = O(n²)
• This really means that the function n belongs to the
set of functions O(n²)
• Incorrect notation: O(n²) = n
• Analogy
– “A dog is an animal” but not “an animal is a dog”
Three Common Sets
g(n) = O(f(n)) means c·f(n) is an Upper Bound on g(n)
g(n) = Ω(f(n)) means c·f(n) is a Lower Bound on g(n)
g(n) = Θ(f(n)) means c1·f(n) is an Upper Bound on g(n)
and c2·f(n) is a Lower Bound on g(n)
These bounds hold for all inputs beyond some threshold n0.
[Figure: the regions O(f(n)), Ω(f(n)), and Θ(f(n)) = O(f(n)) ∩ Ω(f(n)),
with example curves (1/100)·n² and (1/25)·n plotted against f(n)]
Example Function
f(n) = 3n² - 100n + 6
Quick Questions
For each true claim, what witnesses c and n0 work?
3n² - 100n + 6 = O(n²)
3n² - 100n + 6 = O(n³)
3n² - 100n + 6 ≠ O(n)
3n² - 100n + 6 = Ω(n²)
3n² - 100n + 6 ≠ Ω(n³)
3n² - 100n + 6 = Ω(n)
3n² - 100n + 6 = Θ(n²)?
3n² - 100n + 6 = Θ(n³)?
3n² - 100n + 6 = Θ(n)?
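Sample witnesses for the first few claims can be checked numerically. A sketch (the witness choices c = 4, n0 = 1 and c = 2, n0 = 100 are ours):

```python
def holds(g, f, c, n0, hi=100_000):
    # spot-check g(n) <= c*f(n) over n0 <= n < hi (evidence, not proof)
    return all(g(n) <= c * f(n) for n in range(n0, hi))

f = lambda n: 3 * n * n - 100 * n + 6

# O(n^2): with c = 4 and n0 = 1, 3n^2 - 100n + 6 <= 4n^2 holds,
# since that is equivalent to 0 <= n^2 + 100n - 6, true for all n >= 1.
print(holds(f, lambda n: n * n, c=4, n0=1))        # True
# Omega(n^2): 2n^2 <= 3n^2 - 100n + 6 first holds at n0 = 100
# (equivalent to n^2 - 100n + 6 >= 0), giving Theta(n^2) overall.
print(holds(lambda n: 2 * n * n, f, c=1, n0=100))  # True
print(2 * 99 * 99 <= f(99))                        # False: n0 = 99 is too small
```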
“Little Oh” Notation
• o(g(n)) =
{f(n) : for any positive constant c > 0,
there exists a constant n0 > 0 such that
0 <= f(n) < c·g(n) for all n >= n0}
– Intuitively, lim (n→∞) f(n)/g(n) = 0
– f(n) < c·g(n) eventually, no matter how small c is
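The vanishing-ratio intuition is easy to see numerically (example functions are ours):

```python
# n = o(n^2): the ratio n / n^2 = 1/n tends to 0, so for ANY c > 0
# there is a threshold n0 past which n < c * n^2.
for n in (10, 100, 1000, 10000):
    print(n / (n * n))              # 0.1, 0.01, 0.001, 0.0001
# In contrast, 3n^2 is O(n^2) but NOT o(n^2): the ratio is stuck at 3,
# so it never drops below, say, c = 1.
n = 10**6
print((3 * n * n) / (n * n))        # 3.0
```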
Two Other Sets
g(n) = o(f(n)) means c·f(n) is a strict upper bound on g(n)
g(n) = ω(f(n)) means c·f(n) is a strict lower bound on g(n)
These bounds hold for all inputs beyond some threshold n0,
where n0 now depends on c.
Common Complexity Functions
Complexity   n=10           n=20           n=30           n=40           n=50           n=60
n            1×10⁻⁵ sec     2×10⁻⁵ sec     3×10⁻⁵ sec     4×10⁻⁵ sec     5×10⁻⁵ sec     6×10⁻⁵ sec
n²           0.0001 sec     0.0004 sec     0.0009 sec     0.0016 sec     0.0025 sec     0.0036 sec
n³           0.001 sec      0.008 sec      0.027 sec      0.064 sec      0.125 sec      0.216 sec
n⁵           0.1 sec        3.2 sec        24.3 sec       1.7 min        5.2 min        13.0 min
2ⁿ           0.001 sec      1.0 sec        17.9 min       12.7 days      35.7 years     366 cent
3ⁿ           0.059 sec      58 min         6.5 years      3855 cent      2×10⁸ cent     1.3×10¹³ cent
log₂ n       3×10⁻⁶ sec     4×10⁻⁶ sec     5×10⁻⁶ sec     5×10⁻⁶ sec     6×10⁻⁶ sec     6×10⁻⁶ sec
n log₂ n     3×10⁻⁵ sec     9×10⁻⁵ sec     0.0001 sec     0.0002 sec     0.0003 sec     0.0004 sec
Complexity Graphs
[Figure: log(n) vs. n]
Complexity Graphs
[Figure: log(n), n, and n·log(n)]
Complexity Graphs
[Figure: n·log(n), n², n³, and n¹⁰]
Complexity Graphs (log scale)
[Figure, log scale: 1.1ⁿ, n¹⁰, 2ⁿ, n²⁰, 3ⁿ, and nⁿ]
Logarithms
Properties:
b^x = y  ⇔  x = log_b y
b^(log_b x) = x
log(a^b) = b·log(a)
log_a x = c·log_b x
(where c = 1/log_b a)
Questions:
* How do log_a n and log_b n compare?
* How can we compare n·log n with n²?
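Both questions can be answered from the change-of-base property. A quick numeric check (the base and sample values are ours):

```python
import math

# Change of base: log_a x = c * log_b x with c = 1/log_b a, a CONSTANT.
# So log_a n and log_b n differ only by a constant factor, and bases
# are irrelevant inside O(.).
a, b = 2, 10
c = 1 / math.log(a, b)                  # c = 1 / log_10(2)
for x in (100, 10**6):
    print(math.log(x, a), c * math.log(x, b))   # pairs agree (up to rounding)
# n log n vs n^2: dividing both by n leaves log n vs n, and since
# (log n)/n -> 0, we get n log n = o(n^2).
print(math.log2(10**6) / 10**6)
```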
Example Problems
1. What does it mean if:
f(n)  O(g(n)) and g(n)  O(f(n))
2. Is 2n+1 = O(2n) ?
Is 22n = O(2n) ?
3. Does f(n) = O(f(n)) ?
4. If f(n) = O(g(n)) and g(n) = O(h(n)),
can we say f(n) = O(h(n)) ?
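Problem 2 can be explored numerically before proving it (a sketch; the ratio argument is the standard one, not text from the slides):

```python
# 2^(n+1) = 2 * 2^n, so the ratio to 2^n is always the constant 2:
# witnesses c = 2, n0 = 1 give 2^(n+1) = O(2^n).
print([2**(n + 1) // 2**n for n in range(1, 5)])    # [2, 2, 2, 2]
# But 2^(2n) / 2^n = 2^n grows without bound, so no constant c can
# satisfy 2^(2n) <= c * 2^n for all large n: 2^(2n) is NOT O(2^n).
print([2**(2 * n) // 2**n for n in range(1, 5)])    # [2, 4, 8, 16]
```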
