### Summations & Recurrence Relations

EECS 3101
Prof. Andy Mirzaian
STUDY MATERIAL:
• [CLRS]
Appendix A, chapter 4
• Lecture Note 3
Summations:

    S(n) = Σ_{i=1}^{n} i = 1 + 2 + ⋯ + n = Θ( ? )   ⟹   S(n) = Θ(n²)

Recurrence Relations:

    T(n) = T(n−1) + f(n)  ∀n ≥ 1 ;  T(n) = 0  ∀n < 1   ⟹   T(n) = Σ_{i=1}^{n} f(i)
    T(n) = 2T(n−1) + 1    ∀n ≥ 1 ;  T(n) = 0  ∀n < 1   ⟹   T(n) = Θ(2^n)

TOPICS
Summations & Recurrence Relations
arise in the running time analysis of algorithms.
• SUMMATIONS:
  • Classic Methods: Arithmetic, Geometric, Harmonic, Bounded Tail
  • Approximating Summation by Integration
  • Summation by Parts
  • ASSUME: ASymptotic SUmmation Made Easy
• RECURRENCE RELATIONS:
  • Iteration method
  • Recursion tree method
  • Divide-&-Conquer Recurrence: The Master Method
  • Guess-&-Verify method
  • Full History Recurrences
  • Variable Substitution method
SUMMATION

f : ℕ → ℝ

    S(n) = Σ_{i=1}^{n} f(i) = f(1) + f(2) + f(3) + ⋯ + f(n)
Classic Methods

• Arithmetic (or polynomial):  S(n) = Θ( n·f(n) )
    f(n) = n :    1 + 2 + ⋯ + n = n(n+1)/2 = Θ(n²)
    f(n) = n² :   1² + 2² + ⋯ + n² = n(n+1)(2n+1)/6 = Θ(n³)
    f(n) = n^d :  1^d + 2^d + ⋯ + n^d = Θ(n^{d+1}),  for any constant real d > −1
• Geometric (or exponential):  S(n) = Θ( f(n) )
    f(n) = x^n :  1 + x + x² + ⋯ + x^n = (x^{n+1} − 1)/(x − 1) for any constant real x ≠ 1
                  = Θ(x^n) if x > 1  (geometric increasing)
    f(n) = 2^n :  1 + 2 + 2² + ⋯ + 2^n = Θ(2^n)  (with x = 2)
• Harmonic:  S(n) = Θ(log n)
    f(n) = 1/n :  H(n) = 1 + 1/2 + 1/3 + ⋯ + 1/n ≈ ln n = Θ(log n)
• Bounded Tail:  S(n) = Θ(1)
    1 + x + x² + ⋯ + x^n = (1 − x^{n+1})/(1 − x) = Θ(1) if 0 < x < 1  (geometric decreasing)
    f(n) = 2^{−n} :  1 + 1/2 + 1/4 + ⋯ + 1/2^n = Θ(1)  (with x = ½)
    f(n) = n^{−2} :  1 + 1/2² + 1/3² + ⋯ + 1/n² = Θ(1)  (arithmetic decreasing)
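These four growth classes are easy to check numerically. The sketch below (the helper name `S` is mine, not from the slides) evaluates each sum directly and compares it with the claimed Θ-class:

```python
import math

def S(f, n):
    # Direct evaluation of S(n) = f(1) + f(2) + ... + f(n).
    return sum(f(i) for i in range(1, n + 1))

# Arithmetic: sum of i is Theta(n^2); the ratio S(n)/n^2 approaches 1/2.
assert abs(S(lambda i: i, 10**4) / 10**8 - 0.5) < 0.001

# Geometric increasing: sum of 2^i is Theta(2^n), within a constant of its last term.
assert 1 <= S(lambda i: 2**i, 50) / 2**50 < 2

# Harmonic: H(n) = ln(n) + O(1).
assert abs(S(lambda i: 1 / i, 10**5) - math.log(10**5)) < 1

# Bounded tail: sum of 1/i^2 converges (to pi^2/6).
assert abs(S(lambda i: 1 / i**2, 10**5) - math.pi**2 / 6) < 0.001
```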
Approximating S(n) = Σ_{i=m}^{n} f(i) by integrals

If f is monotonically non-decreasing:

    ∫_{m−1}^{n} f(x) dx  ≤  Σ_{i=m}^{n} f(i)  ≤  ∫_{m}^{n+1} f(x) dx

[Figure: the sum is the total area of unit-width bars of heights f(m), …, f(n); the two integrals are the areas under f(x) over [m−1, n] and [m, n+1].]

If f is monotonically non-increasing:

    ∫_{m−1}^{n} f(x) dx  ≥  Σ_{i=m}^{n} f(i)  ≥  ∫_{m}^{n+1} f(x) dx

NOTE: the direction of the inequalities changed.
Approximating S(n) = Σ_{i=1}^{n} f(i) by integrals (m = 1)

    f ↑ :  ∫_{0}^{n} f(x) dx  ≤  S(n)  ≤  ∫_{1}^{n+1} f(x) dx
    f ↓ :  ∫_{1}^{n+1} f(x) dx  ≤  S(n)  ≤  ∫_{0}^{n} f(x) dx

Example 1:  S(n) = 1³ + 2³ + ⋯ + n³.
f(x) = x³ :  f ↑ and ∫ f(x) dx = ∫ x³ dx = x⁴/4.

    n⁴/4  ≤  S(n)  ≤  ((n+1)⁴ − 1)/4   ⟹   S(n) = Θ(n⁴)

Example 2:  H(n) = 1 + 1/2 + 1/3 + ⋯ + 1/n.
Caution against dividing by zero! Separate f(1) from the summation, then apply the bounds with m = 2.
f(x) = 1/x :  f ↓ and ∫ f(x) dx = ∫ dx/x = ln x.

    ln(n+1)  ≤  H(n)  ≤  1 + ln n   ⟹   H(n) = Θ(log n)
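As a quick numerical illustration of Example 2 (the helper `H` below is my own name for the harmonic number):

```python
import math

def H(n):
    # Harmonic number H(n) = 1 + 1/2 + ... + 1/n.
    return sum(1.0 / i for i in range(1, n + 1))

# Integral bounds derived above: ln(n+1) <= H(n) <= 1 + ln(n).
for n in (10, 1000, 10**5):
    assert math.log(n + 1) <= H(n) <= 1 + math.log(n)
```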
Summation by Parts: Arithmetic

f(n) = n :  S(n) = f(1) + f(2) + f(3) + ⋯ + f(n) = 1 + 2 + 3 + ⋯ + n.

Upper bound: n terms, each ≤ n  ⟹  S(n) ≤ n·n = O(n²).
Lower bound: the last n/2 terms  n/2, n/2 + 1, n/2 + 2, …, n  are each ≥ n/2
    ⟹  S(n) ≥ (n/2)·(n/2) = Ω(n²).

⟹  S(n) = Θ(n²)
Summation by Parts: Harmonic

    H(n) = 1 + 1/2 + (1/3 + 1/4) + (1/5 + ⋯ + 1/8) + (1/9 + ⋯ + 1/16) + ⋯

Group the terms into at most 1 + log n blocks of sizes 1, 2, 4, 8, … .

Upper bound: each block sums to at most 1:
    1/3 + 1/4 < 2(1/2) = 1,   1/5 + ⋯ + 1/8 < 4(1/4) = 1,   1/9 + ⋯ + 1/16 < 8(1/8) = 1, …
    ⟹  H(n) ≤ 1 + log n = O(log n)

Lower bound: each block after the first sums to more than ½:
    1/3 + 1/4 > 2(1/4) = ½,   1/5 + ⋯ + 1/8 > 4(1/8) = ½,   1/9 + ⋯ + 1/16 > 8(1/16) = ½, …
    ⟹  H(n) ≥ 1 + ½ log n = Ω(log n)

⟹  H(n) = Θ(log n)
ASSUME : ASymptotic SUmmation Made Easy

f : ℕ → ℝ⁺ :  S(n) = f(1) + f(2) + f(3) + ⋯ + f(n)

    maxₙ(f) = max { f(1), f(2), …, f(n) }
    minₙ(f) = min { f(1), f(2), …, f(n) }
    aveₙ(f) = average { f(1), f(2), …, f(n) }

    S(n) = n · aveₙ(f)
    0 ≤ minₙ(f) ≤ aveₙ(f) ≤ maxₙ(f)
    f ↑  ⟹  minₙ(f) = f(1),  maxₙ(f) = f(n)
    f ↓  ⟹  minₙ(f) = f(n),  maxₙ(f) = f(1)

    max{ n·minₙ(f), maxₙ(f) }  ≤  S(n)  ≤  n·maxₙ(f)
    ⟹  S(n) = Θ( g(n) · maxₙ(f) )   for some 1 ≤ g(n) ≤ n

ASSUME: a method for finding g(n) (and hence, S(n)).
ASSUME on monotone f(n)

f : ℕ → ℝ⁺ ,   S(n) = f(1) + f(2) + f(3) + ⋯ + f(n)

FACT: f is monotone  ⟹  exactly one of the following holds:
(1) f(n) = 0 for all n
(2) f(n) = Θ(1)
(3) f(n) = ω(1) and f ↑
(4) f(n) = o(1) and f ↓ and f(1) > 0

Cases (1) & (2):  g(n) = Θ(n),  S(n) = Θ(g(n) f(n)) = Θ(n f(n))
Case (3):  considered next.
Case (4):  similar methods apply. Exercise.
ASSUME for f ↑ & f(n) = ω(1)

S(n) = f(1) + ⋯ + f(n)

[Figure: graph of f, with m chosen so that f(m) ≤ c·f(n) ≤ f(m+1) for a constant 0 < c < 1, and the rectangle of width g(n) = n − m and height f(n).]

Step 1)  Find m such that  log f(m) = log f(n) − Θ(1).
Step 2)  g(n) ← n − m.   [ Note: 1 ≤ g(n) ≤ n ]

THEOREM:  f ↑ & g ↑  ⟹  S(n) = Θ( f(n)·g(n) ).
ASSUME: examples

Polynomial:  f(n) = n^d (const. d ≥ 0) :
    log f(m) = log f(n) − Θ(1) :   d log m = d log n − d = d log(n/2)
    ⟹  m = n/2  ⟹  g(n) = n − m = n/2.
    f ↑ & g ↑  ⟹  S(n) = Θ(f(n) g(n)) = Θ(n^{d+1}).

Exponential:  f(n) = 2^n :
    log f(m) = log f(n) − Θ(1) :   m = n − 1  ⟹  g(n) = n − m = 1.
    f ↑ & g ↑  ⟹  S(n) = Θ(f(n) g(n)) = Θ(2^n).
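Both examples can be checked empirically; the sketch below (the function name `S` is mine) confirms the predicted Θ-classes at moderate n:

```python
def S(f, n):
    # S(n) = f(1) + ... + f(n), evaluated directly.
    return sum(f(i) for i in range(1, n + 1))

# Polynomial f(n) = n^3: ASSUME predicts S(n) = Theta(f(n)*g(n)) = Theta(n^4);
# the hidden constant here is near 1/4.
n = 2000
assert 0.2 < S(lambda i: i**3, n) / n**4 < 0.3

# Exponential f(n) = 2^n: g(n) = 1, so S(n) = Theta(2^n).
n = 40
assert 1 <= S(lambda i: 2**i, n) / 2**n < 2
```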
log(1 − x) = Θ( ln(1 − x) ) = −Θ(x)   for x = o(1)

Taylor Series Expansion:

    f(x) = f(0) + f′(0)x/1! + f″(0)x²/2! + f‴(0)x³/3! + ⋯

Example:  f(x) = ln(1 − x) :

    ln(1 − x) = −x − x²/2 − x³/3 − x⁴/4 − ⋯

    log(n − Θ(log n)) = log n + log(1 − Θ(log n / n))     [ take x = Θ(log n / n) ]
                      = log n − Θ(log n / n)
                      = (log n)·(1 − Θ(1/n))
ASSUME: examples

Super-polynomial sub-exponential 1:  f(n) = 2^{√n} :
    log f(m) = log f(n) − Θ(1) :   √m = √n − 1     [now square both sides]
    m = (√n − 1)² = n − 2√n + 1
    ⟹  g(n) = n − m = 2√n − 1 = Θ(√n).
    f ↑ & g ↑  ⟹  S(n) = Θ(f(n) g(n)) = Θ(2^{√n} √n).

Super-polynomial sub-exponential 2:  f(n) = 2^{n/log n} :
    (*)  log f(m) = log f(n) − Θ(1) :   m/log m = n/log n − Θ(1)
    [Multiply across by log m ≈ log n]   m = n − Θ(log n)
    [Verify this satisfies (*) with “=”:]
    m/log m = (n − Θ(log n)) / log(n − Θ(log n))
            = (n − Θ(log n)) / ((log n)(1 − Θ(1/n)))      [see previous page]
            = (n/log n − Θ(1)) / (1 − Θ(1/n))
            = n/log n − Θ(1)      [why? multiply both sides by 1 − Θ(1/n)]
    ⟹  g(n) = n − m = Θ(log n)
    f ↑ & g ↑  ⟹  S(n) = Θ(f(n) g(n)) = Θ(2^{n/log n} log n).
ASSUME: Lower Bound Proof

THEOREM:  f ↑ & g ↑  ⟹  S(n) = f(1) + ⋯ + f(n) = Θ(f(n) g(n)).

[Figure: the last g(n) = n − m terms of the sum each have height at least f(m+1), where f(m) ≤ c·f(n) ≤ f(m+1).]

Lower bound:
    S(n) ≥ S(n) − S(m)
         = f(m+1) + f(m+2) + ⋯ + f(n)
         ≥ f(m+1) · g(n)         [since f ↑]
         ≥ c · f(n) · g(n)
         = Ω( f(n) g(n) ).
ASSUME: Upper Bound Proof

THEOREM:  f ↑ & g ↑  ⟹  S(n) = f(1) + ⋯ + f(n) = Θ(f(n) g(n)).

[Figure: starting from n₀ = n, repeatedly step back by g(nᵢ) to nᵢ₊₁ = nᵢ − g(nᵢ); the sum is covered by rectangles of width g(nᵢ) and height f(nᵢ), where f(nᵢ₊₁) ≤ c·f(nᵢ).]

Upper bound:
    S(n) ≤ sum of shaded rectangular areas
         = g(n₀)f(n₀) + g(n₁)f(n₁) + g(n₂)f(n₂) + ⋯
         ≤ g(n) [ f(n₀) + f(n₁) + f(n₂) + ⋯ ]       [since g ↑]
         ≤ g(n) f(n) [ 1 + c + c² + ⋯ ]             [since f(nᵢ₊₁) ≤ c f(nᵢ)]
         ≤ g(n) f(n) / (1 − c)                      [classic geometric decreasing sum]
         = O( f(n) g(n) ).
Caution!

To guarantee the conclusion of the theorem, its premise must hold.
The premise insists that not only f ↑, but also g ↑ holds.
f ↑ without g ↑ is possible. The following is an example:

    f(n) = 2^{2^{⌊log log n⌋}}

f(n) is a step function sandwiched between √n and n. Why is “g ↑” not true?

However, with Simple Analytical Functions (SAF), i.e., functions composed of only constants, arithmetic, exponential, logarithm, and functional composition, such a “bizarre” situation will not arise, and hence, the statement requiring “g ↑” can be omitted from the premise of the theorem.
Simple Analytical Function (SAF)

SAF : a function composed of a finite number of applications of constants, arithmetic, exponential, logarithm, & functional composition.

Example:   4^n 2^{√n} log log n / (1 + 6(log n)/n² + 7√n)

FACTS: A SAF f(n) has the following properties:
a) log f(n) and f(x)/x are also SAFs.
b) f has only a finite # of roots.   (does not oscillate)
c) f is asymptotically of one sign.   (past its last real root)
d) f is asymptotically monotonic.   (apply part (c) to its derivative)
e) In ASSUME:  f ↑  ⟹  g ↑.
f) When comparing SAFs asymptotically, “∃n₀ ∀n ≥ n₀” can be replaced by the phrase “for infinitely many n” (“for i.m. n” for short), e.g., “for all n that are positive integer powers of 2”.   (“past last real root”)

Our short article says more on this topic.
The next table shows a summary of ASSUME on SAF.
f(n) : a SAF ,   S(n) = f(1) + f(2) + ⋯ + f(n)

    Sum Type                      S(n)
    Geometric (or exponential)    Θ( f(n) )
    Arithmetic (or polynomial)    Θ( n · f(n) )
    quasi-Harmonic                Θ( log^{e+1} n )   if e > −1
      ( f(n) = (log^e n)/n )      Θ( log log n )     if e = −1
                                  Θ( 1 )             if e < −1
    Bounded Tail                  Θ( 1 )
A Simple Form

The SAFs we encounter usually have the following simple form:

    f(n) = Θ( t^n · n^d · log^e n )

for some real constants t > 0, d, e. We classify S(n) for such f(n) as:

    t > 1               Geometric (increasing)
    t = 1,  d > −1      Arithmetic
    t = 1,  d = −1      quasi-Harmonic
    t = 1,  d < −1      Bounded tail (polynomially decreasing)
    0 < t < 1           Bounded tail (geometrically decreasing)
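The classification above is mechanical, so it can be written down as a small helper; the function name and return strings below are my own, not from the slides:

```python
def classify_sum(t, d, e):
    """Classify S(n) = sum of f(i) for f(n) = Theta(t^n * n^d * log^e n).

    Follows the table for the simple SAF form; parameter e is unused for
    the classification itself but kept for symmetry with the slides.
    """
    if t > 1:
        return "geometric (increasing)"
    if 0 < t < 1:
        return "bounded tail (geometrically decreasing)"
    # t == 1: compare the polynomial degree d against -1.
    if d > -1:
        return "arithmetic"
    if d == -1:
        return "quasi-harmonic"
    return "bounded tail (polynomially decreasing)"

assert classify_sum(2, 0, 0) == "geometric (increasing)"    # f(n) = 2^n
assert classify_sum(1, 3, 0) == "arithmetic"                # f(n) = n^3
assert classify_sum(1, -1, 0) == "quasi-harmonic"           # f(n) = 1/n
assert classify_sum(1, -2, 0) == "bounded tail (polynomially decreasing)"
```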
RECURRENCE RELATIONS

    T(n) = 2T(n/2) + Θ(n)   for n > O(1)
    T(n) = O(1)             for n ≤ O(1)

    ⟹   T(n) = Θ(n log n)
The Iteration Method

    T(n) = 0                for n ≤ 0
    T(n) = T(n−1) + f(n)    for n ≥ 1

The iteration method (or unwinding the recurrence), where f(n) is the driving function of the recurrence:

    T(n) = f(n) + T(n−1)
         = f(n) + f(n−1) + T(n−2)
         = f(n) + f(n−1) + f(n−2) + T(n−3)
         = …
         = f(n) + f(n−1) + f(n−2) + ⋯ + f(3) + f(2) + T(1)
         = f(n) + f(n−1) + f(n−2) + ⋯ + f(3) + f(2) + f(1) + T(0)
         = f(n) + f(n−1) + f(n−2) + ⋯ + f(3) + f(2) + f(1).

Example:   f(n) = Θ(n)    ⟹   T(n) = Θ(n²).
           f(n) = Θ(2^n)  ⟹   T(n) = Θ(2^n).
The Iteration Method

    T(n) = 0               for n ≤ 1
    T(n) = 2T(n/2) + n     for n > 1

    T(n) = n + 2T(n/2)
         = n + 2[ n/2 + 2T(n/2²) ]       = 2n + 2² T(n/2²)
         = 2n + 2²[ n/2² + 2T(n/2³) ]    = 3n + 2³ T(n/2³)
         = 3n + 2³[ n/2³ + 2T(n/2⁴) ]    = 4n + 2⁴ T(n/2⁴)
         = …
         = kn + 2^k T(n/2^k)             [take k = log n, so T(n/2^k) = T(1) = 0]
         = n log n
         = Θ(n log n)
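The unwinding can be cross-checked by evaluating the recurrence directly at powers of two, where the closed form n log n is exact (a sketch; the memoized helper is my own):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(n/2) + n for n > 1, else 0; evaluated at powers of two
    # so that n/2 stays an integer, matching the unwinding above.
    return 0 if n <= 1 else 2 * T(n // 2) + n

# Unwinding gives T(n) = n * log2(n) exactly at n = 2^j.
for j in range(1, 15):
    n = 2 ** j
    assert T(n) == n * j
```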
The Iteration Method

    T(n) = 0                for n ≤ 1
    T(n) = 2T(n/2) + n²     for n > 1

    T(n) = n² + 2T(n/2)
         = n² + 2[ (n/2)² + 2T(n/2²) ]           = n²(1 + ½) + 2² T(n/2²)
         = n²(1 + ½) + 2²[ (n/2²)² + 2T(n/2³) ]  = n²(1 + ½ + ½²) + 2³ T(n/2³)
         = n²(1 + ½ + ½²) + 2³[ (n/2³)² + 2T(n/2⁴) ] = n²(1 + ½ + ½² + ½³) + 2⁴ T(n/2⁴)
         = …
         = n²(1 + ½ + ½² + ½³ + ⋯ + ½^k) + 2^{k+1} T(n/2^{k+1})   [take k = log n − 1, so T(n/2^{k+1}) = T(1) = 0]
         = n² · Θ(1)                              [geometric decreasing]
         = Θ(n²)
The Recursion Tree Method

    T(n) = 0                   for n ≤ 1
    T(n) = 2T(n/2) + f(n)      for n > 1

[Figure: the recursion tree. The root costs f(n); its 2 children cost f(n/2) each; the 4 grandchildren cost f(n/4) each; the 8 nodes at the next level cost f(n/8) each; and so on. Level i contributes 2^i f(n/2^i).]

    T(n) = f(n) + 2f(n/2) + 4f(n/4) + 8f(n/8) + ⋯
         = Σ_{i=0}^{⌈log n⌉} 2^i f(n/2^i)
The Recursion Tree Method

    T(n) = n                          for n < 4
    T(n) = T(n/4) + T(n/2) + n        for n ≥ 4

CLAIM: T(n) = Θ(n).

Lower bound:  T(n) ≥ n = Ω(n), obvious.

[Figure: the recursion tree. The root costs n; its children cost n/4 and n/2, summing to 3n/4; the next level (n/16, n/8, n/8, n/4) sums to (3/4)²n; in general level i sums to at most (3/4)^i n. The shallowest leaf is at depth log₄ n, the deepest at depth log₂ n.]

Upper bound:

    T(n) ≤ n + (3/4)n + (3/4)²n + (3/4)³n + ⋯
         = n [ 1 + (3/4) + (3/4)² + (3/4)³ + ⋯ ]
         ≤ 4n = O(n).
The Recursion Tree Method

The recurrence T(n) = T(n/4) + T(n/2) + n is a special case of the following recurrence:

    T(n) = T(an) + T(bn) + Θ(n)

where 0 ≤ a < 1 and 0 ≤ b < 1 are real constant parameters.

FACT:
(1)  a + b < 1   ⟹   T(n) = Θ(n)          [linear]
(2)  a + b = 1   ⟹   T(n) = Θ(n log n)
(3)  a + b > 1   ⟹   T(n) = Θ(n^d)        [super-linear poly]
     where d > 1 is the unique constant that satisfies the equation a^d + b^d = 1.

[See the guess-&-verify method later & Exercise 9.]
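The first two regimes of the FACT can be probed numerically; `make_T` below is my own helper that evaluates the recurrence with floors and a small base case:

```python
from functools import lru_cache
import math

def make_T(a, b):
    @lru_cache(maxsize=None)
    def T(n):
        # T(n) = T(floor(a*n)) + T(floor(b*n)) + n, with T(n) = n for n < 4.
        if n < 4:
            return n
        return T(int(a * n)) + T(int(b * n)) + n
    return T

# Case a + b < 1 (a = 1/4, b = 1/2): T(n) = Theta(n), so T(n)/n stays bounded.
T = make_T(0.25, 0.5)
assert T(2**16) / 2**16 < 8

# Case a + b = 1 (a = b = 1/2): T(n) = Theta(n log n).
T = make_T(0.5, 0.5)
n = 2**16
assert 0.5 < T(n) / (n * math.log2(n)) < 2
```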
Divide-&-Conquer

MergeSort is the prototypical divide-&-conquer algorithm. § John von Neumann
Here is a high level description:

Algorithm MergeSort( S )
    Pre-Condition:  input S is a sequence of numbers
    Post-Condition: output is a sorted permutation of the input sequence
    Base:     if |S| ≤ 1 then return S
    Divide:   divide S into its left and right halves S = L, R,  |L| ≈ |R| ≈ ½|S|
    Conquer:  L′ ← MergeSort( L ) ;  R′ ← MergeSort( R )
    Combine:  S′ ← Merge( L′, R′ )
    Output:   return S′
end
MergeSort Example

INPUT:            23 15 | 41 82 | 37 92 | 66 31 | 34 73 | 25 19 | 33 15 | 16 58

Merge pairs:      15 23 | 41 82 | 37 92 | 31 66 | 34 73 | 19 25 | 15 33 | 16 58
Merge quarters:   15 23 41 82 | 31 37 66 92 | 19 25 34 73 | 15 16 33 58
Merge halves:     15 23 31 37 41 66 82 92 | 15 16 19 25 33 34 58 73

OUTPUT:           15 15 16 19 23 25 31 33 34 37 41 58 66 73 82 92
Algorithm MergeSort( A[1..n] )
    B[0..n+1] ← auxiliary array for merging
    MS(1, n)
end

procedure MS( s, f )   § sort A[s..f]
    Base:     if s ≥ f then return
    Divide:   m ← ⌈(s+f)/2⌉
    Conquer:  MS( s, m−1 ) ;  MS( m, f )
    Combine:  Merge( s, m, f )
end

procedure Merge( s, m, f )
    B[s−1 .. m−2] ← A[s .. m−1]      § left half
    B[m .. f] ← A[m .. f]            § right half
    B[m−1] ← B[f+1] ← ∞              § barrier sentinels
    i ← s−1 ;  j ← m
    for k ← s .. f do
        if B[i] ≤ B[j]
            then A[k] ← B[i]; i++
            else A[k] ← B[j]; j++
end

Merge time = Θ(n), where n = f − s + 1.

[Figure: MERGE loop invariant. After each iteration, A[s..k−1] holds the elements merged so far; B[i..m−2] and B[j..f] hold the remaining left-half and right-half elements, each guarded by an ∞ sentinel.]
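A direct Python rendering of the pseudocode above (a sketch: 0-indexed lists replace the 1-indexed arrays, and `INF` plays the role of the ∞ barrier sentinels):

```python
INF = float("inf")

def merge_sort(A, s=0, f=None):
    """Sort list A[s..f] in place, mirroring MS/Merge from the slides."""
    if f is None:
        f = len(A) - 1
    if s >= f:                        # Base
        return
    m = (s + f + 1) // 2              # Divide: m = ceil((s+f)/2)
    merge_sort(A, s, m - 1)           # Conquer
    merge_sort(A, m, f)
    left = A[s:m] + [INF]             # Combine: merge with barrier sentinels
    right = A[m:f + 1] + [INF]
    i = j = 0
    for k in range(s, f + 1):
        if left[i] <= right[j]:
            A[k] = left[i]; i += 1
        else:
            A[k] = right[j]; j += 1

data = [23, 15, 41, 82, 37, 92, 66, 31, 34, 73, 25, 19, 33, 15, 16, 58]
merge_sort(data)
assert data == sorted([23, 15, 41, 82, 37, 92, 66, 31, 34, 73, 25, 19, 33, 15, 16, 58])
```

Copying each half into a fresh list (rather than a shared auxiliary array B) is the idiomatic Python substitute; the sentinel trick still removes the end-of-half tests from the merge loop.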
MergeSort Recurrence

From the MS/Merge pseudocode:

    T(n) = Θ(1)                               for n ≤ 1
    T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + Θ(n)         for n > 1

Simplified Recurrence:

    T(n) = Θ(1)                for n ≤ 1
    T(n) = 2T(n/2) + Θ(n)      for n > 1

Solution:   T(n) = Θ(n log n).
Divide-&-Conquer Recurrence

    T(n) = aT(n/b) + f(n)     ∀n > 1
    T(n) = Θ(1)               ∀n ≤ 1

    a    = # of recursive calls (a > 0)
    n/b  = size of each sub-instance called recursively (b > 1)
    f(n) = cost of divide + combine steps (everything except the recursive calls).

In MergeSort we have:  a = 2 recursive calls of size n/b = n/2 each.
Cost of Divide = Θ(1). Cost of Combine, i.e., merge, is Θ(n). So, f(n) = Θ(n).
    T(n) = aT(n/b) + f(n)  for n > 1 ;   T(n) = Θ(1)  for n ≤ 1

[Figure: recursion tree. The root costs f(n); level i has a^i nodes, each costing f(n/b^i); there are k = log_b n = Θ(log n) levels of internal nodes and a^k leaves, each costing Θ(1).]

Internal nodes (level sums):

    S(n) = Σ_{i=0}^{k−1} a^i f(n/b^i)
         = f(n) + a f(n/b) + a² f(n/b²) + ⋯ + a^{k−1} f(n/b^{k−1})

Leaves:

    h(n) = a^k · Θ(1) = Θ(a^k) = Θ(a^{log_b n}) = Θ(n^{log_b a})
    [ a^{log_b n} = n^{log_b a} : compare their logarithms ]

    T(n) = S(n) + h(n) = Θ( max{ S(n), h(n) } )
    T(n) = aT(n/b) + f(n)  for n > 1 ;   T(n) = Θ(1)  for n ≤ 1
    T(n) = S(n) + h(n)

Special Solution S(n): contribution of the internal nodes of the recursion tree.
Homogeneous Solution h(n): contribution of the leaves of the recursion tree.

    h(n) = a·h(n/b)  for n > 1 ;   h(n) = Θ(1)  for n ≤ 1   ⟹   h(n) = Θ(n^{log_b a})

Define:

    Q(n) = S(n)/h(n) ,    r(n) = f(n)/h(n)

    Q(n) = Q(n/b) + r(n)  for n > 1 ;   Q(n) = 0  for n ≤ 1
    T(n) = h(n)·( Q(n) + 1 )
    Q(n) = r(n) + r(n/b) + r(n/b²) + r(n/b³) + ⋯

f(n) is a SAF  ⟹  r(n) is a SAF.
    Q(n) = S(n)/h(n) ,   r(n) = f(n)/h(n) ,   T(n) = h(n)·( Q(n) + 1 ) ,   r(n) is a SAF.

↪ Let n = b^k for integer k = log_b n = Θ(log n). Then

    Q(n) = Θ( Σ_{i=1}^{k} r(b^i) )

For f(n) a SAF, with constants ε > 0 and e:

    r(n)                     r(b^i)         Q(n) = Θ( ? )       T(n) = Θ( ? )
    Ω(n^{+ε})                Ω(b^{+iε})     Θ( r(n) )           Θ( f(n) )
    Θ(log^e n),  e > −1      Θ(i^e)         Θ( r(n) · k )       Θ( f(n) log n )
    Θ(log^{−1} n),  e = −1   Θ(1/i)         Θ( log k )          Θ( h(n) log log n )
    O(log^e n),  e < −1      O(i^e)         Θ( 1 )              Θ( h(n) )
    O(n^{−ε})                O(b^{−iε})     Θ( 1 )              Θ( h(n) )
THE MASTER THEOREM

    T(n) = aT(n/b) + f(n)  for n > 1 ;   T(n) = Θ(1)  for n ≤ 1 ;   h(n) = Θ(n^{log_b a})

Suppose f(n) is a SAF, and ε > 0 & e are constant reals.

    f(n)/h(n)                    (comparison)        T(n)
    Ω(n^{+ε})                    f » h               Θ( f(n) )
    Θ(log^e n),  e > −1          f ≈ h               Θ( f(n) log n ) = Θ( h(n) log^{e+1} n )
    Θ(log^{−1} n),  e = −1       f ≈ h               Θ( h(n) log log n )
    O(log^e n),  e < −1          f ≈ h               Θ( h(n) )
    O(n^{−ε})                    f « h               Θ( h(n) )

Compare with CLRS. See Exercise 4.
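The case analysis can be packaged as a small solver for driving functions of the simple form f(n) = Θ(n^d log^e n); the function name and output strings below are my own, not from the slides:

```python
import math

def master(a, b, d, e=0):
    """T(n) = a*T(n/b) + Theta(n^d * log^e n): return T(n) as a Theta-string.

    A sketch of the theorem's case analysis for SAF driving functions of
    the simple polynomial-logarithmic form.
    """
    c = math.log(a, b)                     # h(n) = Theta(n^c), c = log_b a
    if d > c:
        return f"Theta(n^{d} log^{e} n)"   # f >> h: f(n) dominates
    if d < c:
        return f"Theta(n^{c:g})"           # f << h: leaves dominate
    # d == c: compare the log powers
    if e > -1:
        return f"Theta(n^{d} log^{e + 1} n)"
    if e == -1:
        return f"Theta(n^{d} loglog n)"
    return f"Theta(n^{d})"

assert master(2, 2, 1) == "Theta(n^1 log^1 n)"    # MergeSort: Theta(n log n)
assert master(4, 2, 1) == "Theta(n^2)"            # leaves dominate: h(n) = n^2
assert master(2, 2, 2) == "Theta(n^2 log^0 n)"    # driving function dominates
```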
    T(n) = aT(n/b) + f(n)  for n > 1 ;   T(n) = Θ(1)  for n ≤ 1 ;   h(n) = Θ(n^{log_b a})

For  f(n) = Θ( t^n n^d log^e n ),  constants t > 0, d, e:

    0 < t < 1                        T(n) = Θ( h(n) )
    t > 1                            T(n) = Θ( f(n) )
    t = 1,  b^d > a                  T(n) = Θ( f(n) )
    t = 1,  b^d = a  (d = log_b a):
        e > −1                       T(n) = Θ( f(n) log n )
        e = −1                       T(n) = Θ( h(n) log log n )
        e < −1                       T(n) = Θ( h(n) )
    t = 1,  b^d < a                  T(n) = Θ( h(n) )
CSE 3101 Sums & Recurrences Fact Sheet

Logarithms:
    log(ab) = log a + log b ;   log a^b = b log a ;   log_b a = log a / log b
    log(1 ± x) = ±Θ(x)  for x = o(1)

Sums:
    Σ_{i=1}^{n} i^d = Θ(n^{d+1}) if d > −1 ;  Θ(log n) if d = −1 ;  Θ(1) if d < −1
    Σ_{i=0}^{n} x^i = (x^{n+1} − 1)/(x − 1) if 0 ≠ x ≠ 1 ;  = Θ(x^n) if x > 1 ;  = Θ(1) if 0 < x < 1
    For f ↑ :  ∫_{m−1}^{n} f(x)dx  ≤  Σ_{i=m}^{n} f(i)  ≤  ∫_{m}^{n+1} f(x)dx
    For f ↓ :  ∫_{m−1}^{n} f(x)dx  ≥  Σ_{i=m}^{n} f(i)  ≥  ∫_{m}^{n+1} f(x)dx

ASSUME [ ASymptotic SUmmation Made Easy ]:
    S(n) = Σ_{i=1}^{n} f(i),  f ↑,  f(n) = ω(1) ;  g(n) ← n − m  where  log f(m) = log f(n) − Θ(1):
    f ↑ & g ↑  ⟹  S(n) = Θ( f(n) g(n) )

    Sum Type        f(n)           S(n)
    Geometric       2^{Ω(n)}       Θ( f(n) )
    Arithmetic      n^{Θ(1)}       Θ( n f(n) )
    Harmonic        Θ(1/n)         Θ( log n )
    Bounded Tail    2^{−Ω(n)}      Θ( 1 )

MASTER METHOD:
    T(n) = aT(n/b) + f(n)  ∀n > n₀ ;   T(n) = Θ(1)  ∀n ≤ n₀
    h(n) = Θ( n^{log_b a} ) ;   f : a SAF ;   constants ε > 0 & e:

    f(n)/h(n) :                      T(n) =
    Ω(n^{+ε})       ( f » h )        Θ( f(n) )
    Θ(log^e n),  e > −1  ( f ≈ h )   Θ( f(n) log n ) = Θ( h(n) log^{e+1} n )
    Θ(log^{−1} n),  e = −1           Θ( h(n) log log n )
    O(log^e n),  e < −1              Θ( h(n) )
    O(n^{−ε})       ( f « h )        Θ( h(n) )
Example 1: Assume a = 2, b = 4, f(n) = Θ(n^d log² n).
Express T(n) based on the constant parameter d.

Solution:  log_b a = log₄ 2 = ½  ⟹  h(n) = Θ(√n).
    d > ½   ⟹   T(n) = Θ(f(n)) = Θ(n^d log² n)
    d = ½   ⟹   T(n) = Θ(f(n) log n) = Θ(√n log³ n)
    d < ½   ⟹   T(n) = Θ(h(n)) = Θ(√n)

Example 2: Assume b = 3, f(n) = Θ(n⁴ / log n).
Express T(n) based on the constant parameter a.

Solution:  h(n) = Θ(n^{log₃ a}) ,   b^d = 3⁴ = 81.
    a < 81   ⟹   T(n) = Θ(f(n)) = Θ(n⁴ / log n)
    a = 81   ⟹   T(n) = Θ(h(n) log log n) = Θ(n⁴ log log n)
    a > 81   ⟹   T(n) = Θ(h(n)) = Θ(n^{log₃ a})
Guess-&-Verify

BASIC INGREDIENTS:
• Guess a solution to the recurrence.
• Verify it by mathematical induction. The induction variable must be a natural number, e.g., height of the recursion tree (max recursion depth), or size of the recursion tree (# times you apply the recurrence). n may also be a candidate if it “stays” integer.
• If you spot a place where the verification does not go through, examine it to help revise your guess, and try again.
• One key guiding principle, taken with a grain of salt, is to first make sure the leading term (with its constant coefficient) matches on the LHS and RHS of the recurrence. That will show you whether to guess higher or lower the next time! After figuring out the leading term, you can apply the same principle to find the lower order terms.
Guess-&-Verify: Example 1

    T(n) = 4T(n/2) + n²    for n > 1
    T(n) = 1               for n ≤ 1

• Clearly T(n) = Ω(n²).

• Guess # 1:  Is T(n) = Θ(n²)?  Try

    T(n) = cn² + l(n),   const. c > 0,  l(n) = o(n²) lower order terms (to be determined).

• Plug this guess in the recurrence and verify:

    cn² + l(n)  =?  4[ c(n/2)² + l(n/2) ] + n²  =  (c + 1)n² + 4l(n/2)

The RHS leading term (c+1)n² exceeds the LHS leading term cn²  ⟹  guess is too low.

• Guess # 2:

    T(n) = cn^{2+ε} + l(n),   const. c > 0, ε > 0,  l(n) lower order terms (to be determined).

• Plug this guess in the recurrence and verify:

    cn^{2+ε} + l(n)  =?  4[ c(n/2)^{2+ε} + l(n/2) ] + n²  =  (c/2^ε) n^{2+ε} + 4l(n/2) + n²

The RHS leading term (c/2^ε)n^{2+ε} falls short of the LHS leading term cn^{2+ε}  ⟹  guess is too high.

• Guess # 3:

    T(n) = cn² log n + l(n),   const. c > 0,  l(n) lower order terms (to be determined).

• Plug this guess in the recurrence and verify:

    cn² log n + l(n)  =?  4[ c(n/2)² log(n/2) + l(n/2) ] + n²
                      =   cn²(log n − 1) + 4l(n/2) + n²
                      =   cn² log n + 4l(n/2) + (1 − c)n²

The leading terms match  ⟹  correct guess of the high order term.

• Solve the residual recurrence for l(n) (with revised boundary condition):

    l(n) = 4 l(n/2) + (1 − c)n²     (must have 1 − c ≯ 0; set c = 1, why?)

• Solution:   l(n) = Θ(n²)   and   T(n) = n² log n + Θ(n²).
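Evaluating the recurrence at powers of two confirms guess # 3: with T(1) = 1 the exact solution there is T(2^j) = (j + 1)·4^j, i.e., n² log n + n² (a sketch; the memoized helper is my own):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 4*T(n/2) + n^2 for n > 1, else 1; evaluated at powers of two.
    return 1 if n <= 1 else 4 * T(n // 2) + n * n

# Closed form at n = 2^j: T(n) = n^2 * log2(n) + n^2 = (j + 1) * 4^j.
for j in range(15):
    assert T(2 ** j) == (j + 1) * 4 ** j
```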
Guess-&-Verify: Example 2

Recurrence:   T(n) = 2T(⌊n/2⌋) + n² ,   T(0) = 0

T(n) ≥ n²  ⟹  Ω(n²) is obvious. Let’s see if O(n²) is an upper bound:

Guess:   T(n) ≤ an² + bn + c    (for some constants a > 0, b, c to be determined)

Basis (n = 0):   T(0) = 0 ≤ a·0² + b·0 + c   :   we need c ≥ 0.

Ind. Step (n ≥ 1):   T(n) = 2T(⌊n/2⌋) + n²
    ≤ 2[ a⌊n/2⌋² + b⌊n/2⌋ + c ] + n²            ← by induction hypothesis

If n is even:
    = 2[ a(n/2)² + b(n/2) + c ] + n²            ← not “≤” for odd n, if b < 0
    = (1 + a/2)n² + bn + 2c
    ≤? an² + bn + c                             ← our guessed upper bound
We need, for all even n ≥ 1:
    an² + bn + c ≥ (1 + a/2)n² + bn + 2c,  i.e.,  (a/2 − 1)n² − c ≥ 0.
If n is odd:
    = 2[ a((n−1)/2)² + b((n−1)/2) + c ] + n²
    = (1 + a/2)n² + (b − a)n + (2c − b + a/2)
    ≤? an² + bn + c                             ← our guessed upper bound
We need, for all odd n ≥ 1:
    an² + bn + c ≥ (1 + a/2)n² + (b − a)n + (2c − b + a/2),  i.e.,
    (a/2 − 1)n² + an + (b − c − a/2) ≥ 0.

a = 2, b = −1, c = 0 works for all n ≥ 0:

    n²  ≤  T(n)  ≤  2n² − n

Exercise 8: Using this same method, verify the following tighter LOWER BOUND:

    ∀n ≥ 1:   2n² − n − 2n log n  ≤  T(n)  ≤  2n² − n
    ⟹   T(n) = 2n² − n − O(n log n).

For more examples of guess-&-verify see Lecture Note 3 & Sample Solutions.
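The proved bounds, and Exercise 8's tighter lower bound, can be checked directly for small n (a sketch; the helper name is mine):

```python
from functools import lru_cache
import math

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(floor(n/2)) + n^2 with T(0) = 0; floors kept, so any n works.
    return 0 if n == 0 else 2 * T(n // 2) + n * n

for n in range(1, 2000):
    assert n * n <= T(n) <= 2 * n * n - n              # bounds proved above
    if n > 1:                                          # Exercise 8's tighter lower bound
        assert T(n) >= 2 * n * n - n - 2 * n * math.log2(n)
```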
Full History Recurrence

Such recurrences often arise in average-case analysis.

Example 1:   T(n) = Σ_{i=0}^{n−1} T(i) + f(n),   n ≥ 0

Values of T(n) for some small n:
    T(0) = f(0)
    T(1) = T(0) + f(1) = f(0) + f(1)
    T(2) = T(0) + T(1) + f(2) = 2f(0) + f(1) + f(2)
    T(3) = T(0) + T(1) + T(2) + f(3) = 4f(0) + 2f(1) + f(2) + f(3)
    T(4) = T(0) + T(1) + T(2) + T(3) + f(4) = 8f(0) + 4f(1) + 2f(2) + f(3) + f(4)

Example 2:   T(n) = (2/n) Σ_{i=0}^{n−1} T(i) + n,   n > 0

This is the QuickSort expected time recurrence (details shown later in LS5).
Full History Recurrence

Example 1:   T(n) = Σ_{i=0}^{n−1} T(i) + f(n),   n ≥ 0

A Solution Method:
n → n−1:    T(n−1) = Σ_{i=0}^{n−2} T(i) + f(n−1),   n−1 ≥ 0 (i.e., n ≥ 1)
Subtract:   T(n) − T(n−1) = T(n−1) + f(n) − f(n−1),   n ≥ 1
Rearrange:  T(n) = 2T(n−1) + f(n) − f(n−1)   for n ≥ 1
            T(0) = f(0)

Now there is only one appearance of T(·) on the RHS.
Continue with conventional methods.
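The subtract-and-rearrange step can be sanity-checked by evaluating both forms of the recurrence (the helper names and the sample driving function are mine):

```python
def T_full(n, f):
    # Full-history evaluation: T(k) = sum of T(0..k-1) + f(k).
    T = []
    for k in range(n + 1):
        T.append(sum(T) + f(k))
    return T[n]

def T_rearranged(n, f):
    # After subtracting: T(k) = 2*T(k-1) + f(k) - f(k-1), T(0) = f(0).
    t = f(0)
    for k in range(1, n + 1):
        t = 2 * t + f(k) - f(k - 1)
    return t

f = lambda k: k * k + 1        # an arbitrary driving function for the check
for n in range(12):
    assert T_full(n, f) == T_rearranged(n, f)
```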
Variable Substitution: Example 1

Sometimes we can considerably simplify the recurrence by a change of variable.

    T(n) = T(n/2) + log n     [assume the usual boundary condition T(O(1)) = O(1)]

Change of variable: n = 2^m, i.e., m = log n.
Now T(n) = T(2^m). It is some function of m. Rename it S(m). That is, T(n) = T(2^m) = S(m).
The recurrence becomes:

    S(m) = S(m−1) + m.

By the iteration method: S(m) = Θ(m²).
So, T(n) = T(2^m) = S(m) = Θ(m²) = Θ(log² n).

    T(n) = Θ(log² n)
Variable Substitution: How?

    Recurrence:           T(n) = T(n/2) + f(n)
    Boundary Condition:   T(n) = 0 for n < 1

Rename:  n = g(m),  n/2 = g(m−1),  S(m) = T(g(m)),  h(m) = f(g(m)).

Now solve 2 recurrences:
1)  g(m) = 2g(m−1),  g(0) = 1 ← our choice   ⟹   g(m) = 2^m g(0) = 2^m.
2)  S(m) = S(m−1) + h(m),  S(0) = T(g(0)) = T(1) = f(1) = h(0)
    ⟹   S(m) = h(0) + h(1) + ⋯ + h(m).

Now back substitute:  T(n) = T(g(m)) = S(m) = S(g⁻¹(n)),  where g⁻¹ is the functional inverse of g.

See another example on the next page.
Variable Substitution: Example 2
T() = T( ) + 1
g(m)
g(m-1)
Rename: n = g(m),
= g(m-1), S(m) = T(g(m)).
g (m)  g(m 1)  g (m  2)  g(m  3)      g (0)
2
22
g(0) = 2  n = g(m) = 22m
S(m) = S(m-1)+1
S(0) = T(2) = Q(1)
23
2m
 m = log log n.
 S(m) = Q(m) = Q(log log n).
T(n) = Q(log log n).
56
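The change of variable predicts that repeated square roots reach the base case in log log n steps; a quick check at n = 2^(2^m) (the helper name is mine):

```python
import math

def T(n):
    # T(n) = T(sqrt(n)) + 1 until n drops to at most 2 (base case cost 0 here).
    count = 0
    while n > 2:
        n = math.sqrt(n)
        count += 1
    return count

# With n = 2^(2^m), repeated square roots reach 2 in exactly m = loglog(n) steps.
for m in range(1, 6):
    n = 2.0 ** (2 ** m)
    assert T(n) == m
```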
Exercises
1. For each pseudo-code below derive the simplified asymptotic running time in Θ(?) notation.

(a) for i ← 1 .. n do
        for j ← 1 .. 2*i do print( i+j )

(b) for i ← 1 .. n do
        for j ← 2*i .. n do print( i+j )

(c) for i ← 1 .. n do
        j ← 1
        while j ≤ i do j ← j+3

(d) for i ← 1 .. n do
        j ← 1
        while i+j ≤ n do j ← j+3

(e) for i ← 1 .. n do
        j ← 1
        while j ≤ i do j ← j+j

(f) for i ← 1 .. n do
        j ← 1
        while j*j ≤ i do j ← j+1

(g) for i ← 1 .. n do
        j ← 2
        while j ≤ n do j ← j*j

(h) for i ← 1 .. n do
        j ← 2
        while j ≤ i do j ← j*j

(i) for i ← 1 .. n do
        j ← 1
        while i*j ≤ n do j ← j+1

(j) for i ← 1 .. n do
        j ← 1
        while i*i ≥ j do j ← j+1

(k) for i ← 1 .. n do
        j ← n
        while i < j*j do j ← j − 2

(l) for i ← 1 .. n do
        j ← n
        while i ≤ j*j do j ← j div 2
4. The Master Theorem: CLRS versus these Slides:
The case “f(n)/h(n) = Ω(n^ε)” of the Master Theorem [page 94 of CLRS] has the following extra condition:
    (*)  a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n.
The same theorem on page 40 of these slides instead uses the condition:
    (**)  f(n) is a SAF.
Show that (**) implies (*) in the case “f(n)/h(n) = Ω(n^ε)”.
[Hint: Consider the SAF r(n) = f(n)/h(n) = Ω(n^{+ε}). So, r(b^i) = Ω(b^{+iε}) is increasing at least exponentially as a function of i. Then, with some extra derivation, show that r(n/b)/r(n) ≤ c < 1 for some constant c and all sufficiently large n. c = b^{−ε/2} works.]

5. Exercise 6 of Lecture Note 3 (page 9) solves the recurrence T(n) = 3T(n/3 + 5) + n/2 by the guess-&-verify method. Such recurrences can also be solved by the variable substitution method. Consider the somewhat generalized recurrence T(n) = aT(n/b + c) + f(n), where a > 0, b > 1, and c are constants, and f(n) is the driving function, which is a SAF.
a) Show the substitution S(n) = T(n + cb/(b−1)) results in the new recurrence S(n) = aS(n/b) + f(n + cb/(b−1)). [Now this can be solved by the Master Theorem.]
b) Use this method to solve the recurrence T(n) = 4T(n/2 + 7) + n².
(m)  T(n) = Σ_{i=0}^{n−1} i·T(i) + n

(n)  T(n) = (1/(n−1)) Σ_{i=0}^{n−1} T(i) + 1

(o)  T(n) = (n−1) Σ_{i=0}^{n−1} T(i) + 1
8. Show that the recurrence

    T(n) = 2T(⌊n/2⌋) + n² ,   T(0) = 0

has the solution:  T(n) = 2n² − n − O(n log n).

9. Show that the recurrence

    T(n) = T(⌊an⌋) + T(⌊bn⌋) + cn     ∀n ≥ n₀
    T(n) = Θ(1)                        ∀ 0 ≤ n < n₀

for any real constants 0 ≤ a < 1, 0 ≤ b < 1, c > 0, n₀ > 0, has the following solutions:
a)  a + b < 1   ⟹   T(n) = Θ(n)
b)  a + b = 1   ⟹   T(n) = Θ(n log n)
c)  a + b > 1   ⟹   T(n) = Θ(n^d), where d > 1 is the unique constant that satisfies the equation a^d + b^d = 1.
12. Derive the most simplified answers to the following summations. Mention which methods you used to derive your solution.

a)  Σ_{i=1}^{n} log i = Θ( ? )
b)  Σ_{i=1}^{n} i²(2i+7)⁵ / ((3i+4)(8i+3)⁵(9i−1)³) = Θ( ? )
c)  Σ_{i=1}^{n} (i² + i + 7)(2i⁵ + 4i⁶ log i) = Θ( ? )
d)  Σ_{i=1}^{n} (i² + i)⁴ / (i² log i + 5) = Θ( ? )
e)  Σ_{i=1}^{n} (7i³ + 5i²) / (2i³ + 9 log i) = Θ( ? )
13. Recursion Time Analysis:
A certain recursive algorithm takes an input list of n elements. It divides the list into √n sub-lists, each with √n elements, recursively solves each of these √n smaller sub-instances, then spends an additional Θ(n) time to combine the solutions of these sub-instances to obtain the solution of the main instance. As a base case, if the size of the input list is at most a specified positive constant, then the algorithm solves such a small instance directly in Θ(1) time.
a) Express the recurrence relation that governs T(n), the time complexity of this algorithm.
b) Derive the solution to this recurrence relation: T(n) = Θ( ? ). Mention which methods you used to derive your solution.
END