### Slides

```text
Stochastic processes
Lecture 6
Power spectral density (PSD)
1
Random process
2
1st order Distribution & density function
First-order distribution
First-order density function
3
2nd order Distribution & density function
2nd order distribution
2nd order density function
4
EXPECTATIONS
• Expected value
• The autocorrelation
5
Some random processes
• Single pulse
• Multiple pulses
• Periodic Random Processes
• The Gaussian Process
• The Poisson Process
• Bernoulli and Binomial Processes
• The Random Walk and Wiener Processes
• The Markov Process
6
Single pulse
Single pulse with random amplitude and arrival time:

  X(t) = A·S(t − Θ)

A and Θ are statistically independent.

Deterministic pulse:
  S(t): deterministic function
Random variables:
  A: gain, a random variable
  Θ: arrival time

[Figure: three realizations of a nerve-spike pulse; amplitude vs. t (ms), 0–3 ms]
7
Multiple pulses
Multiple pulses with random amplitudes and arrival times:

  X(t) = Σ_{k=1}^{n} A_k·S(t − Θ_k)

A_k and Θ_k are statistically independent.

Deterministic pulse:
  S(t): deterministic function
Random variables:
  A_k: gain, a random variable
  Θ_k: arrival time
  n: number of pulses

[Figure: three realizations of multiple nerve spikes; amplitude vs. t (ms), 0–3 ms]
8
Periodic Random Processes
• A process which is periodic with period T:

  X(t) = X(t + nT), where n is an integer

Example (T = 100):

  x(t) = sin(2πt/50 + Θ) + sin(2πt/100 + Θ)

[Figure: one realization of the periodic signal X(t), t = 0–1000, T = 100]
9
The Gaussian Process
• X(t1), X(t2), X(t3), …, X(tn) are jointly Gaussian for all t and n values
• Example: randn() in Matlab

[Figure: one realization of a Gaussian process (10,000 samples) and its histogram]
10
The Poisson Process
• Typically used for modeling the cumulative number of events over time.
  – Example: counting the number of phone calls from a phone

  P[N(t) = n] = ((λt)^n / n!)·e^(−λt)
11
Alternative definition: Poisson points
• The number of events in an interval (t1, t2] is N(t1, t2) = N(t2) − N(t1):

  P[N(t1, t2) = n] = ((λ(t2 − t1))^n / n!)·e^(−λ(t2−t1))

• In particular, for N(0, t) = N(t):

  P[N(t) = n] = ((λt)^n / n!)·e^(−λt)
12
Bernoulli Processes
• A process of zeros and ones:
  X = [0 0 1 1 0 1 0 0 1 1 1 0]
• The samples are independent and identically distributed Bernoulli variables.
  – The probability of 1 is p
  – The probability of 0 is q = 1 − p
13
Binomial process
• A binomial process is a summed Bernoulli process:

  Y[n] = Σ_{i=1}^{n} X[i]

  where X[i] is a Bernoulli process
14
Random walk
• Every T seconds, take a step of size Δ to the left or right after tossing a fair coin.

[Figure: several random-walk realizations; x[n] vs. n, n = 0–50]
15
The Markov Process
• 1st order Markov process
  – The current sample depends only on the previous sample
• Density function: f(x_n | x_{n−1}, …, x_1) = f(x_n | x_{n−1})
• Expected value: E[x_n | x_{n−1}, …, x_1] = E[x_n | x_{n−1}]
16
The frequency of earthquakes
• Statement: the number of large earthquakes has increased dramatically in the last 10 years!
17
The frequency of earthquakes
• Is the frequency of large earthquakes unusually high?
• Which random processes can we use for modeling the earthquake frequency?
18
The frequency of earthquakes
• Data:
  http://earthquake.usgs.gov/earthquakes/eqarchives/year/graphs.php
19
Agenda (Lec 16)
• Power spectral density
– Definition and background
– Wiener-Khinchin
– Cross spectral densities
– Practical implementations
– Examples
20
Fourier transform recap 1
Transform between time and frequency domain:

  Fourier transform: X(f) = ∫ x(t)·e^(−j2πft) dt

  Inverse Fourier transform: x(t) = ∫ X(f)·e^(j2πft) df
21
Fourier transform recap 2
• Assumption: the signal can be reconstructed from sine and cosine functions:

  e^(−j2πfn) = cos(2πfn) − j·sin(2πfn)

[Figure: real and imaginary parts of e^(jωn); amplitude vs. n, n = 0–100]

• Requirement: absolutely integrable:

  Σ_n |x[n]| < ∞
22
Fourier transform of a stochastic
process
• A stationary stochastic process is typically not absolutely integrable
• Therefore the signal is truncated before the Fourier transform
23
What is power?
• In the power spectral density, "power" is related to electrical power:

  P = V²/R
24
Power of a signal
• The power of a signal is calculated by squaring the signal: x(t)²
• The average power over a period T is:

  P = (1/T) ∫_0^T x(t)² dt
25
Parseval's theorem
• The power of the squared absolute Fourier transform equals the power of the signal:

  ∫ |x(t)|² dt = ∫ |X(f)|² df
26
Power of a stochastic process
• Thereby the expected power can be calculated from the Fourier spectrum
27
Power spectrum density
• Since the integral of the squared absolute Fourier transform contains the full power of the signal, it is a density function.
• So the power spectral density of a random process is:

  Sxx(f) = lim_{T→∞} E[|X_T(f)|²] / (2T)

• Due to the absolute value, the PSD is always real
28
PSD Example
[Figure: Fourier transform and resulting PSD Sxx(f) of an example signal, f = −15 … 15]
29
Power spectrum density
• The PSD is a density function.
  – For a random process, the PSD is the density function of the random process and not necessarily the frequency spectrum of a single realization.
• Example
  – A random process is defined as X(t) = sin(ω_r t)
  – where ω_r is a uniformly distributed random variable with a range from 0 to π
  – What is the PSD of the process, and
  – what is the power spectrum of a single realization?
30
PSD of random process versus
spectrum of deterministic signals
• For a random process, the PSD usually refers to the expected value E[Sxx(f)]
• For deterministic signals, the PSD is exact (though estimation error remains in practice)
31
Properties of the PSD
1. Sxx(f) is real and nonnegative
2. The average power in X(t) is given by:

   E[X(t)²] = Rxx(0) = ∫_{−∞}^{∞} Sxx(f) df

3. If X(t) is real, Rxx(τ) and Sxx(f) are also even:

   Sxx(−f) = Sxx(f)

4. If X(t) has periodic components, Sxx(f) has impulses
5. Independent of phase
32
Wiener-Khinchin 1
• If X(t) is stationary in the wide sense, the PSD is the Fourier transform of the autocorrelation:

  Sxx(f) = ∫ Rxx(τ)·e^(−j2πfτ) dτ

Proof: page 175
33
Wiener-Khinchin
Two methods for estimation of the PSD:

1. Fourier-transform route:  X(t) → Fourier transform → X(f) → |X(f)|² → Sxx(f)
2. Autocorrelation route:    X(t) → autocorrelation → Rxx(τ) → Fourier transform → Sxx(f)
34
The inverse Fourier Transform of the
PSD
• Since the PSD is the Fourier transform of the autocorrelation,
• the inverse Fourier transform of the PSD is the autocorrelation:

  Rxx(τ) = ∫ Sxx(f)·e^(j2πfτ) df
35
Cross spectral densities
• If X(t) and Y(t) are two jointly wide-sense stationary processes, the cross spectral density is:

  Sxy(f) = ∫ Rxy(τ)·e^(−j2πfτ) dτ

• or

  Syx(f) = ∫ Ryx(τ)·e^(−j2πfτ) dτ
36
Properties of Cross spectral densities
1. Since Rxy(τ) = Ryx(−τ): Sxy(f) = Syx(−f) = Syx*(f)
2. Syx(f) is not necessarily real
3. If X(t) and Y(t) are orthogonal: Sxy(f) = 0
4. If X(t) and Y(t) are independent: Sxy(f) = E[X(t)]·E[Y(t)]·δ(f)
37
Cross spectral densities example
• 1 Hz sine curves in white noise:

  X(t) = sin(2πt) + 3·w(t)
  Y(t) = sin(2πt + π/2) + 3·w(t)

  where w(t) is Gaussian noise

[Figure: signals X(t) and Y(t) over 20 s, and the Welch cross power spectral density estimate; Power/frequency (dB/Hz) vs. Frequency (Hz), 0–25 Hz]
38
Implementations issues
• The challenges include:
  – Finite signals
  – Discrete time
39
The periodogram
The estimate of the PSD
• The PSD can be estimated from the autocorrelation:

  S̃xx(ω) = Σ_{m=−N+1}^{N−1} Rxx[m]·e^(−jωm)

• or directly from the signal:

  S̃xx(ω) = (1/N) |Σ_{n=0}^{N−1} x[n]·e^(−jωn)|²
40
The discrete version of the
autocorrelation
Rxx(τ) = E[X(t)·X(t+τ)] ≈ Rxx[m]
where τ corresponds to lag m, and m is an integer

  Rxx[m] = Σ_{n=0}^{N−|m|−1} x[n]·x[n+m]

N: number of samples

Normalized version:

  Rxx[m] = (1/N) Σ_{n=0}^{N−|m|−1} x[n]·x[n+m]
41
Bias in the estimates of the
autocorrelation
N = 12

  Rxx[m] = Σ_{n=0}^{N−|m|−1} x[n]·x[n+m]

[Figure: x[n] aligned with the shifted x[n+m] for m = −10, 0, and 4, and the resulting autocorrelation estimate over m = −15 … 15]
42
Bias in the estimates of the
autocorrelation
• The edge effect corresponds to multiplying the true autocorrelation with a Bartlett window:

  Rxx_biased[m] = Rxx[m]·w_B[m]

  w_B[m] = (N − |m|)/N for |m| < N, 0 otherwise

[Figure: Bartlett window w[m], m = −15 … 15]
43
Alternative estimation of
autocorrelation
• The unbiased estimate:

  Rxx[m] = (1/(N − |m|)) Σ_{n=0}^{N−|m|−1} x[n]·x[n+m]

[Figure: biased and unbiased autocorrelation estimates Rxx[m] vs. m, m = −15 … 15]

• Disadvantage: high variance when |m| → N
44
Influence at the power spectrum
• Biased version: a Bartlett window is applied:

  S̃xx(ω) = Σ_{m=−∞}^{∞} Rxx[m]·w_B[m]·e^(−jωm)

• Unbiased version: a rectangular window is applied:

  S̃xx(ω) = Σ_{m=−∞}^{∞} Rxx[m]·w_R[m]·e^(−jωm)

  w_R[m] = 1 for |m| < N, 0 otherwise
45
Example
Autocorrelation, biased and unbiased, and the resulting PSD estimates:

[Figure: biased and unbiased autocorrelation estimates Rxx[m], m = −15 … 15, and the corresponding estimated PSDs Sxx(ω), ω = 0 … 7]
46
Variance in the PSD
• The variance of the periodogram is on the order of the squared PSD:

  Var[S̃xx(f)] ≈ Sxx(f)²

[Figure: three realizations of a noisy signal and their individual periodograms compared with the true PSD Sxx(f), f = 0–200 Hz]
47
Averaging
• Divide the signal into K segments of length M:

  x_i = x[(i − 1)M + 1 : iM],  1 ≤ i ≤ K

• Calculate the periodogram of each segment:

  S̃_i(ω) = (1/M) |Σ_{n=0}^{M−1} x_i[n]·e^(−jωn)|²

• Calculate the average periodogram:

  S̄[ω] = (1/K) Σ_{i=1}^{K} S̃_i[ω]
48
Illustrations of Averaging
[Figure: a signal X(t) split into segments, the periodogram of each segment, and the averaged periodogram, f = 0–200 Hz]
49
Effect of Averaging
• The variance is decreased:

  Var[S̄(f)] = (1/K)·Sxx(f)²

• But the spectral resolution is also decreased
50
The Welch method
• Introduce overlap between segments:

  x_i = x[(i − 1)Q + 1 : (i − 1)Q + M],  1 ≤ i ≤ K

  – where Q is the shift between segments
• Multiply the segments with windows:

  S̃_i(ω) = (1/M) |Σ_{n=0}^{M−1} w[n]·x_i[n]·e^(−jωn)|²
51
Example
• Heart rate variability
• http://circ.ahajournals.org/cgi/content/full/93/5/1043#F3
• High-frequency component related to the parasympathetic nervous system ("rest and digest")
• Low-frequency component related to the sympathetic nervous system ("fight or flight")
52
Agenda (Lec 16)
• Power spectral density
– Definition and background
– Wiener-Khinchin
– Cross spectral densities
– Practical implementations
– Examples
53
```
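The single- and multiple-pulse processes of slides 7–8 can be sketched numerically. The half-sine pulse shape, the Gaussian gain distribution, and the uniform arrival-time range below are illustrative assumptions, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical deterministic pulse shape S(t): a short half-sine "nerve spike"
def S(t, width=0.2):
    return np.where((t >= 0) & (t < width), np.sin(np.pi * t / width), 0.0)

t = np.linspace(0, 3, 3000)            # time axis in ms, as on the slides
n = 5                                  # number of pulses
A = rng.normal(1.0, 0.3, size=n)       # random gains A_k (assumed Gaussian)
theta = rng.uniform(0.0, 2.5, size=n)  # random arrival times Theta_k (assumed uniform)

# Multiple-pulse process: X(t) = sum_k A_k * S(t - Theta_k)
X = sum(A[k] * S(t - theta[k]) for k in range(n))
```

Setting n = 1 gives the single-pulse process X(t) = A·S(t − Θ).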
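The Poisson pmf on slide 11 can be checked by simulating the process from its exponential inter-arrival times; the rate λ = 2 and window t = 3 are arbitrary illustration values:

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(3)
lam, t, trials = 2.0, 3.0, 100_000

# Arrival times are cumulative sums of exponential inter-arrival times (rate lam);
# N(t) counts how many arrivals fall in [0, t].  60 draws is plenty for lam*t = 6.
arrivals = np.cumsum(rng.exponential(1.0 / lam, size=(trials, 60)), axis=1)
counts = (arrivals <= t).sum(axis=1)

# Empirical P[N(t) = n] should match (lam*t)^n e^(-lam*t) / n!
for n in range(8):
    theory = (lam * t) ** n * exp(-lam * t) / factorial(n)
    print(n, round(np.mean(counts == n), 3), round(theory, 3))
```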
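Slide 30's point, that the process PSD is not the spectrum of a single realization, can be illustrated for X(t) = sin(ω_r t): one draw of ω_r gives a single spectral peak, while averaging periodograms over many draws spreads the power over the whole band. Record length and number of draws are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(5)
n = np.arange(4096)

def pgram(sig):
    """Periodogram via FFT: |X(f)|^2 / N (one-sided)."""
    return np.abs(np.fft.rfft(sig)) ** 2 / len(sig)

# One realization: a single random frequency -> one concentrated spectral peak
P_single = pgram(np.sin(rng.uniform(0, np.pi) * n))

# Approximate the process PSD E[Sxx(f)] by averaging over many draws of w_r
P_mean = np.mean([pgram(np.sin(rng.uniform(0, np.pi) * n)) for _ in range(300)],
                 axis=0)

# Fraction of total power in the strongest bin: large for one realization,
# small for the ensemble average
frac_single = P_single.max() / P_single.sum()
frac_mean = P_mean.max() / P_mean.sum()
```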
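The two estimation routes of the Wiener–Khinchin diagram (slide 34) can be verified numerically: the periodogram equals the discrete-time Fourier transform of the biased sample autocorrelation, exactly. The test frequency below is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
x = rng.standard_normal(N)

# Biased sample autocorrelation: Rxx[m] = (1/N) sum_n x[n] x[n+m]
r = np.correlate(x, x, mode="full") / N   # lags m = -(N-1) .. N-1
lags = np.arange(-(N - 1), N)

w = 1.3                                   # test frequency in rad/sample (arbitrary)
# Route 1: periodogram, (1/N) |sum_n x[n] e^(-jwn)|^2
direct = np.abs(np.sum(x * np.exp(-1j * w * np.arange(N)))) ** 2 / N
# Route 2: DTFT of the autocorrelation, sum_m Rxx[m] e^(-jwm)
via_r = np.sum(r * np.exp(-1j * w * lags)).real
```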
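The biased and unbiased autocorrelation estimates of slides 43–44 differ exactly by the Bartlett window (N − |m|)/N, which this sketch verifies on random data (N = 12 as in slide 42's example):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 12
x = rng.standard_normal(N)

lag_sums = np.correlate(x, x, mode="full")   # sum_n x[n] x[n+m] for each lag m
lags = np.arange(-(N - 1), N)

biased = lag_sums / N                        # normalized (biased): divide by N
unbiased = lag_sums / (N - np.abs(lags))     # unbiased: divide by N - |m|

# The edge effect: biased = unbiased * Bartlett window
bartlett = (N - np.abs(lags)) / N
```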
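The segment averaging of slides 48–51 trades spectral resolution for variance. With scipy.signal (assumed available), the variance reduction is visible directly on white noise, whose true PSD is flat; the sampling rate and segment length below are arbitrary:

```python
import numpy as np
from scipy.signal import periodogram, welch

rng = np.random.default_rng(2)
fs, N = 400.0, 4096
x = rng.standard_normal(N)            # white noise: the true PSD is flat

# Raw periodogram of the whole record: variance ~ Sxx(f)^2, regardless of N
f_raw, P_raw = periodogram(x, fs=fs)

# Welch: split into overlapping, windowed segments and average their periodograms
f_avg, P_avg = welch(x, fs=fs, nperseg=512)

print("relative spread, raw periodogram:", np.std(P_raw) / np.mean(P_raw))
print("relative spread, Welch estimate: ", np.std(P_avg) / np.mean(P_avg))
```

The Welch estimate fluctuates far less around the flat true PSD, at the cost of a coarser frequency grid (fs/512 instead of fs/4096).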