### Chapter 5: Joint Probability Distributions
CHAPTER OUTLINE

5-1 Two or More Random Variables
    5-1.1 Joint Probability Distributions
    5-1.2 Marginal Probability Distributions
    5-1.3 Conditional Probability Distributions
    5-1.4 Independence
    5-1.5 More Than Two Random Variables
5-2 Covariance and Correlation
5-3 Common Joint Distributions
    5-3.1 Multinomial Probability Distribution
    5-3.2 Bivariate Normal Distribution
5-4 Linear Functions of Random Variables
5-5 General Functions of Random Variables
Learning Objectives for Chapter 5

After careful study of this chapter, you should be able to do the following:

1. Use joint probability mass functions and joint probability density functions to calculate probabilities.
2. Calculate marginal and conditional probability distributions from joint probability distributions.
3. Interpret and calculate covariances and correlations between random variables.
4. Use the multinomial distribution to determine probabilities.
5. Understand properties of a bivariate normal distribution and be able to draw contour plots for the probability density function.
6. Calculate means and variances for linear combinations of random variables, and calculate probabilities for linear combinations of normally distributed random variables.
7. Determine the distribution of a general function of a random variable.

© John Wiley & Sons, Inc. Applied Statistics and Probability for Engineers, by Montgomery and Runger.
Concept of Joint Probabilities

• Some random variables are not independent of each other, i.e., they tend to be related.
  – Urban atmospheric ozone and airborne particulate matter tend to vary together.
  – Urban vehicle speeds and fuel consumption rates tend to vary inversely.
• The length (X) of an injection-molded part might not be independent of the width (Y). Individual parts will vary due to random variation in materials and pressure.
• A joint probability distribution describes the behavior of several random variables, say, X and Y. The graph of the distribution is 3-dimensional: x, y, and f(x, y).

Chapter 5 Introduction
Example 5-1: Signal Bars

You use your cell phone to check your airline reservation. The airline system requires that you speak the name of your departure city to the voice recognition system.

• Let Y denote the number of times that you have to state your departure city.
• Let X denote the number of bars of signal strength on your cell phone.

Figure 5-1 Joint probability distribution of X and Y. The table cells are the probabilities. Observe that more bars relate to less repeating.

| y = number of times city name is stated | x = 1 bar | x = 2 bars | x = 3 bars |
|---|---|---|---|
| 1 | 0.01 | 0.02 | 0.25 |
| 2 | 0.02 | 0.03 | 0.20 |
| 3 | 0.02 | 0.10 | 0.05 |
| 4 | 0.15 | 0.10 | 0.05 |

The accompanying bar chart ("Bar Chart of Number of Repeats vs. Cell Phone Bars") plots these probabilities against cell phone bars for once, twice, 3 times, and 4 times.

Sec 5-1.1 Joint Probability Distributions
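The joint PMF in Figure 5-1 is small enough to check by machine. A minimal Python sketch (not from the text; the dict layout is my own) stores the table and confirms the defining properties of a joint PMF:

```python
# Joint PMF of Example 5-1, keyed by (x, y):
# x = number of signal bars, y = number of times the city name is stated.
f_xy = {
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
}

# Property (1): every probability is non-negative.
all_nonneg = all(p >= 0 for p in f_xy.values())

# Property (2): the probabilities sum to 1.
total = sum(f_xy.values())

# Property (3): f(x, y) = P(X = x, Y = y), e.g. full bars and one statement.
p_3_1 = f_xy[(3, 1)]
```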
Joint Probability Mass Function Defined

The joint probability mass function of the discrete random variables X and Y, denoted as f_XY(x, y), satisfies:

(1) f_XY(x, y) ≥ 0                          (all probabilities are non-negative)
(2) Σ_x Σ_y f_XY(x, y) = 1                  (the sum of all probabilities is 1)
(3) f_XY(x, y) = P(X = x, Y = y)            (5-1)
Joint Probability Density Function Defined

The joint probability density function for the continuous random variables X and Y, denoted as f_XY(x, y), satisfies the following properties:

(1) f_XY(x, y) ≥ 0 for all x, y
(2) ∫_-∞^∞ ∫_-∞^∞ f_XY(x, y) dx dy = 1
(3) P((X, Y) ∈ R) = ∫∫_R f_XY(x, y) dx dy            (5-2)

Figure 5-2 Joint probability density function for the random variables X and Y. The probability that (X, Y) is in the region R is determined by the volume of f_XY(x, y) over the region R.
Joint Probability Density Function Graph

Figure 5-3 Joint probability density function for the continuous random variables X and Y of different dimensions of an injection-molded part. Note the asymmetric, narrow ridge shape of the PDF, indicating that small values in the X dimension are more likely to occur when small values in the Y dimension occur.
Example 5-2: Server Access Time-1

Let the random variable X denote the time (in milliseconds) until a computer server connects to your machine. Let Y denote the time until the server authorizes you as a valid user. X and Y measure the wait from a common starting point, so x < y. The ranges of x and y are shown here.

Figure 5-4 The joint probability density function of X and Y is nonzero over the shaded region where x < y.
Example 5-2: Server Access Time-2

• The joint probability density function is:

    f_XY(x, y) = k e^(−0.001x − 0.002y)    for 0 < x < y < ∞, where k = 6 × 10⁻⁶

• We verify that it integrates to 1 as follows:

    ∫_-∞^∞ ∫_-∞^∞ f_XY(x, y) dx dy = ∫_0^∞ ∫_x^∞ k e^(−0.001x − 0.002y) dy dx
        = k ∫_0^∞ e^(−0.001x) [ ∫_x^∞ e^(−0.002y) dy ] dx
        = k ∫_0^∞ e^(−0.001x) · (e^(−0.002x) / 0.002) dx
        = 0.003 ∫_0^∞ e^(−0.003x) dx
        = 0.003 · (1 / 0.003) = 1
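The same verification can be done numerically. In this sketch (my own; the 20,000 ms cutoff and grid size are arbitrary choices), y is integrated out analytically first, and a midpoint rule handles the remaining one-dimensional integral:

```python
import math

k = 6e-6  # constant of the joint PDF in Example 5-2

# Integrating out y over (x, ∞) analytically leaves the marginal of X:
#   f_X(x) = k e^{-0.001x} * e^{-0.002x} / 0.002 = 0.003 e^{-0.003x}
def f_x(x):
    return k * math.exp(-0.001 * x) * math.exp(-0.002 * x) / 0.002

# Composite midpoint rule; the tail beyond 20,000 contributes only ~e^{-60}.
def midpoint(f, a, b, n=200_000):
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

total = midpoint(f_x, 0.0, 20_000.0)  # should be very close to 1
```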
Example 5-2: Server Access Time-3

Now calculate a probability:

    P(X ≤ 1000, Y ≤ 2000) = ∫_0^1000 ∫_x^2000 k e^(−0.001x − 0.002y) dy dx
        = k ∫_0^1000 e^(−0.001x) · (e^(−0.002x) − e^(−4)) / 0.002 dx
        = 0.003 ∫_0^1000 [ e^(−0.003x) − e^(−4) e^(−0.001x) ] dx
        = 0.003 [ (1 − e^(−3)) / 0.003 − e^(−4) (1 − e^(−1)) / 0.001 ]
        = 0.003 (316.738 − 11.578) = 0.915

Figure 5-5 Region of integration for the probability that X < 1000 and Y < 2000.
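The closed-form evaluation above is easy to reproduce in Python (a sketch mirroring the text's steps):

```python
import math

# P(X <= 1000, Y <= 2000) for Example 5-2, following the same integration steps.
term1 = (1 - math.exp(-3)) / 0.003                   # ∫_0^1000 e^{-0.003x} dx
term2 = math.exp(-4) * (1 - math.exp(-1)) / 0.001    # e^{-4} ∫_0^1000 e^{-0.001x} dx
prob = 0.003 * (term1 - term2)
```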
Marginal Probability Distributions (discrete)

For a discrete joint PMF, there are marginal distributions for each random variable, formed by summing the joint PMF over the other variable:

    f_X(x) = Σ_y f_XY(x, y)        f_Y(y) = Σ_x f_XY(x, y)

| y = times city name is stated | x = 1 | x = 2 | x = 3 | f(y) |
|---|---|---|---|---|
| 1 | 0.01 | 0.02 | 0.25 | 0.28 |
| 2 | 0.02 | 0.03 | 0.20 | 0.25 |
| 3 | 0.02 | 0.10 | 0.05 | 0.17 |
| 4 | 0.15 | 0.10 | 0.05 | 0.30 |
| f(x) | 0.20 | 0.25 | 0.55 | 1.00 |

Figure 5-6 From the prior example, the joint PMF is shown in green while the two marginal PMFs are shown in blue.

Sec 5-1.2 Marginal Probability Distributions
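For the discrete case, the marginals can be computed directly from the joint table of Example 5-1 (a Python sketch; the dict layout is my own, not the text's notation):

```python
# Marginal PMFs computed from the joint PMF of Example 5-1.
f_xy = {
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
}

# f_X(x) = sum over y of f(x, y); f_Y(y) = sum over x of f(x, y).
f_x = {x: sum(p for (xi, _), p in f_xy.items() if xi == x) for x in (1, 2, 3)}
f_y = {y: sum(p for (_, yi), p in f_xy.items() if yi == y) for y in (1, 2, 3, 4)}
```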
Marginal Probability Distributions (continuous)

• Rather than summing a discrete joint PMF, we integrate a continuous joint PDF.
• The marginal PDFs are used to make probability statements about each variable individually.
• If the joint probability density function of random variables X and Y is f_XY(x, y), the marginal probability density functions of X and Y are:

    f_X(x) = ∫ f_XY(x, y) dy        f_Y(y) = ∫ f_XY(x, y) dx            (5-3)

  where the first integral is over the range of y for the given x, and the second is over the range of x for the given y.
Example 5-4: Server Access Time-1

For the random variables (times) in Example 5-2, find the probability that Y exceeds 2000. Integrate the joint PDF directly, using the picture to determine the limits:

    P(Y > 2000) = ∫_0^2000 [ ∫_2000^∞ f_XY(x, y) dy ] dx  +  ∫_2000^∞ [ ∫_x^∞ f_XY(x, y) dy ] dx

The first term is the left dark region of the figure; the second is the right dark region.
Example 5-4: Server Access Time-2

Alternatively, find the marginal PDF and then integrate it to find the desired probability:

    f_Y(y) = ∫_0^y k e^(−0.001x − 0.002y) dx
           = k e^(−0.002y) · (1 − e^(−0.001y)) / 0.001
           = 6 × 10⁻³ e^(−0.002y) (1 − e^(−0.001y))    for y > 0

    P(Y > 2000) = ∫_2000^∞ f_Y(y) dy
                = 6 × 10⁻³ ∫_2000^∞ [ e^(−0.002y) − e^(−0.003y) ] dy
                = 6 × 10⁻³ [ e^(−4) / 0.002 − e^(−6) / 0.003 ]
                = 0.05
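The final closed-form value can be confirmed in a line of Python (a sketch following the derivation above):

```python
import math

# P(Y > 2000) via the closed-form marginal f_Y(y) = 6e-3 (e^{-0.002y} - e^{-0.003y}).
p = 6e-3 * (math.exp(-4) / 0.002 - math.exp(-6) / 0.003)
```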
Mean & Variance of a Marginal Distribution

Means E(X) and E(Y) are calculated from the discrete and continuous marginal distributions (sums and integrals are over the range R of the variable):

Discrete:
    E(X) = Σ_R x · f_X(x) = μ_X        V(X) = Σ_R (x − μ_X)² f_X(x) = Σ_R x² f_X(x) − μ_X²
    E(Y) = Σ_R y · f_Y(y) = μ_Y        V(Y) = Σ_R (y − μ_Y)² f_Y(y) = Σ_R y² f_Y(y) − μ_Y²

Continuous:
    E(X) = ∫_R x · f_X(x) dx = μ_X     V(X) = ∫_R (x − μ_X)² f_X(x) dx = ∫_R x² f_X(x) dx − μ_X²
    E(Y) = ∫_R y · f_Y(y) dy = μ_Y     V(Y) = ∫_R (y − μ_Y)² f_Y(y) dy = ∫_R y² f_Y(y) dy − μ_Y²
Mean & Variance for Example 5-1

| y | x = 1 | x = 2 | x = 3 | f(y) | y·f(y) | y²·f(y) |
|---|---|---|---|---|---|---|
| 1 | 0.01 | 0.02 | 0.25 | 0.28 | 0.28 | 0.28 |
| 2 | 0.02 | 0.03 | 0.20 | 0.25 | 0.50 | 1.00 |
| 3 | 0.02 | 0.10 | 0.05 | 0.17 | 0.51 | 1.53 |
| 4 | 0.15 | 0.10 | 0.05 | 0.30 | 1.20 | 4.80 |
| f(x) | 0.20 | 0.25 | 0.55 | 1.00 | 2.49 | 7.61 |
| x·f(x) | 0.20 | 0.50 | 1.65 | 2.35 | | |
| x²·f(x) | 0.20 | 1.00 | 4.95 | 6.15 | | |

E(X) = 2.35        V(X) = 6.15 − 2.35² = 6.15 − 5.5225 = 0.6275
E(Y) = 2.49        V(Y) = 7.61 − 2.49² = 7.61 − 6.2001 = 1.4099
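The tabulated means and variances can be re-derived from the marginal PMFs (a Python sketch):

```python
# Mean and variance from the marginal PMFs of Example 5-1.
f_x = {1: 0.20, 2: 0.25, 3: 0.55}
f_y = {1: 0.28, 2: 0.25, 3: 0.17, 4: 0.30}

def mean_var(pmf):
    mu = sum(v * p for v, p in pmf.items())
    var = sum(v * v * p for v, p in pmf.items()) - mu * mu
    return mu, var

ex, vx = mean_var(f_x)  # 2.35 and 0.6275
ey, vy = mean_var(f_y)  # 2.49 and 1.4099
```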
Conditional Probability Distributions

Recall that P(B | A) = P(A ∩ B) / P(A).

From Example 5-1, conditioning on X = 3 (using the joint table and f(3) = 0.55):

    P(Y = 1 | X = 3) = 0.25 / 0.55 = 0.455
    P(Y = 2 | X = 3) = 0.20 / 0.55 = 0.364
    P(Y = 3 | X = 3) = 0.05 / 0.55 = 0.091
    P(Y = 4 | X = 3) = 0.05 / 0.55 = 0.091
    Sum = 1.001 (the departure from 1 is rounding error)

Note that there are 12 probabilities conditional on X, and 12 more probabilities conditional on Y.

Sec 5-1.3 Conditional Probability Distributions
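The conditional PMF of Y given X = 3 can be computed the same way (a Python sketch using the joint and marginal probabilities of Example 5-1):

```python
# Conditional PMF of Y given X = 3, from the joint and marginal PMFs.
f_x3y = {(3, 1): 0.25, (3, 2): 0.20, (3, 3): 0.05, (3, 4): 0.05}
f_x3 = 0.55  # marginal P(X = 3)

f_y_given_x3 = {y: f_x3y[(3, y)] / f_x3 for y in (1, 2, 3, 4)}
```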
Conditional Probability Density Function Defined

Given continuous random variables X and Y with joint probability density function f_XY(x, y), the conditional probability density function of Y given X = x is

    f_{Y|x}(y) = f_XY(x, y) / f_X(x)    for f_X(x) > 0            (5-4)

which satisfies the following properties:

(1) f_{Y|x}(y) ≥ 0
(2) ∫ f_{Y|x}(y) dy = 1
(3) P(Y ∈ B | X = x) = ∫_B f_{Y|x}(y) dy    for any set B in the range of Y
Example 5-6: Conditional Probability-1

From Example 5-2, determine the conditional PDF for Y given X = x.

First find the marginal density of X:

    f_X(x) = ∫_x^∞ k e^(−0.001x − 0.002y) dy
           = k e^(−0.001x) · (e^(−0.002x) / 0.002)
           = 0.003 e^(−0.003x)    for x > 0

Then divide the joint density by it:

    f_{Y|x}(y) = f_XY(x, y) / f_X(x)
               = k e^(−0.001x − 0.002y) / (0.003 e^(−0.003x))
               = 0.002 e^(0.002x − 0.002y)    for 0 < x < y < ∞
Example 5-6: Conditional Probability-2

Now find the probability that Y exceeds 2000 given that X = 1500:

    P(Y > 2000 | X = 1500) = ∫_2000^∞ f_{Y|1500}(y) dy
        = ∫_2000^∞ 0.002 e^(0.002(1500) − 0.002y) dy
        = 0.002 e³ · (e^(−0.002(2000)) / 0.002)
        = e³ e⁻⁴ = e⁻¹ = 0.368

Figure 5-8 (again) The conditional PDF is nonzero on the solid line in the figure.
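The closed-form answer is quickly checked (a Python sketch of the integration result above):

```python
import math

# P(Y > 2000 | X = 1500): the conditional density 0.002 e^{0.002(1500) - 0.002y}
# integrates over (2000, ∞) to e^{3} * e^{-0.002*2000} = e^{-1}.
x = 1500
p = math.exp(0.002 * x) * math.exp(-0.002 * 2000)  # = e^{-1}
```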
Example 5-7: Conditional Discrete PMFs

Conditional discrete PMFs can be shown as tables. Conditioning on y gives f(x | y) = f(x, y) / f(y):

| y | f(x,y): x=1 | x=2 | x=3 | f(y) | f(x|y): x=1 | x=2 | x=3 | Sum of f(x|y) |
|---|---|---|---|---|---|---|---|---|
| 1 | 0.01 | 0.02 | 0.25 | 0.28 | 0.036 | 0.071 | 0.893 | 1.000 |
| 2 | 0.02 | 0.03 | 0.20 | 0.25 | 0.080 | 0.120 | 0.800 | 1.000 |
| 3 | 0.02 | 0.10 | 0.05 | 0.17 | 0.118 | 0.588 | 0.294 | 1.000 |
| 4 | 0.15 | 0.10 | 0.05 | 0.30 | 0.500 | 0.333 | 0.167 | 1.000 |

Conditioning on x gives f(y | x) = f(x, y) / f(x), where f(x) = 0.20, 0.25, 0.55 for x = 1, 2, 3:

| y | f(y|x=1) | f(y|x=2) | f(y|x=3) |
|---|---|---|---|
| 1 | 0.050 | 0.080 | 0.455 |
| 2 | 0.100 | 0.120 | 0.364 |
| 3 | 0.100 | 0.400 | 0.091 |
| 4 | 0.750 | 0.400 | 0.091 |
| Sum of f(y|x) | 1.000 | 1.000 | 1.001 |
Mean & Variance of Conditional Random Variables

• The conditional mean of Y given X = x, denoted as E(Y | x) or μ_{Y|x}, is:

    E(Y | x) = ∫_y y · f_{Y|x}(y) dy            (5-6)

• The conditional variance of Y given X = x, denoted as V(Y | x) or σ²_{Y|x}, is:

    V(Y | x) = ∫_y (y − μ_{Y|x})² f_{Y|x}(y) dy = ∫_y y² f_{Y|x}(y) dy − μ²_{Y|x}

(For a discrete conditional distribution, replace the integrals with sums.)
Example 5-8: Conditional Mean & Variance

From Examples 5-2 and 5-6, what is the conditional mean for Y given that x = 1500? Integrate by parts.

    E(Y | X = 1500) = ∫_1500^∞ y · 0.002 e^(0.002(1500) − 0.002y) dy
                    = 0.002 e³ ∫_1500^∞ y e^(−0.002y) dy

Integration by parts gives

    ∫_1500^∞ y e^(−0.002y) dy = [ −y e^(−0.002y) / 0.002 ]_1500^∞ + (1 / 0.002) ∫_1500^∞ e^(−0.002y) dy
                              = 1500 e⁻³ / 0.002 + e⁻³ / 0.002²

so that

    E(Y | X = 1500) = 0.002 e³ [ 1500 e⁻³ / 0.002 + e⁻³ / 0.002² ] = 1500 + 1 / 0.002 = 2000

If the connect time is 1500 ms, then the expected time to be authorized is 2000 ms.
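Rather than integrating by parts, the conditional mean can be approximated numerically (a sketch; the 12,000 ms window and grid size are arbitrary choices of mine):

```python
import math

# Numeric check of E(Y | X = 1500) = 2000 from Example 5-8, using the
# conditional density f_{Y|1500}(y) = 0.002 e^{0.002(1500) - 0.002y}, y > 1500.
x = 1500.0

def f_cond(y):
    return 0.002 * math.exp(0.002 * x - 0.002 * y)

# Midpoint rule on (1500, 1500 + 12000]; the truncated tail is ~e^{-24}.
n, upper = 120_000, x + 12_000.0
h = (upper - x) / n
mean = 0.0
for i in range(n):
    y = x + (i + 0.5) * h
    mean += y * f_cond(y) * h
```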
Example 5-9

For the discrete random variables in Example 5-1, what is the conditional mean of Y given X = 1?

| y | f(y|x=1) | y·f(y|x=1) | y²·f(y|x=1) |
|---|---|---|---|
| 1 | 0.050 | 0.05 | 0.05 |
| 2 | 0.100 | 0.20 | 0.40 |
| 3 | 0.100 | 0.30 | 0.90 |
| 4 | 0.750 | 3.00 | 12.00 |
| Sum | 1.000 | 3.55 | 13.35 |

E(Y | X = 1) = 3.55        V(Y | X = 1) = 13.35 − 3.55² = 13.35 − 12.6025 = 0.7475

The mean number of attempts given one bar is 3.55, with a variance of 0.7475.
Joint Random Variable Independence

• Random variable independence means that knowledge of the values of X does not change any of the probabilities associated with the values of Y.
• X and Y vary independently.
• Dependence implies that the values of X are influenced by the values of Y.
• Do you think that a person's height and weight are independent?

Sec 5-1.4 Independence
Exercise 5-10: Independent Random Variables

In a plastic molding operation, each part is classified as to whether it conforms to color and length specifications:

    X = 1 if the part conforms to color specs, 0 otherwise
    Y = 1 if the part conforms to length specs, 0 otherwise

Figure 5-10(a) shows the marginal & joint probabilities, f_XY(x, y) = f_X(x) · f_Y(y).
Figure 5-10(b) shows the conditional probabilities, f_{Y|x}(y) = f_Y(y).
Properties of Independence

For random variables X and Y, if any one of the following properties is true, the others are also true, and X and Y are independent.

(1) f_XY(x, y) = f_X(x) · f_Y(y) for all x and y
(2) f_{Y|x}(y) = f_Y(y) for all x and y with f_X(x) > 0
(3) f_{X|y}(x) = f_X(x) for all x and y with f_Y(y) > 0
(4) P(X ∈ A, Y ∈ B) = P(X ∈ A) · P(Y ∈ B) for any sets A and B in the range of X and Y, respectively.            (5-7)
Rectangular Range for (X, Y)

• A rectangular range for X and Y is a necessary, but not sufficient, condition for the independence of the variables.
• If the range of X and Y is not rectangular, then the range of one variable is limited by the value of the other variable.
• If the range of X and Y is rectangular, then one of the properties of (5-7) must be demonstrated to prove independence.
Example 5-11: Independent Random Variables

• Suppose Example 5-2 is modified such that the joint PDF is:

    f_XY(x, y) = 2 × 10⁻⁶ e^(−0.001x − 0.002y)    for x ≥ 0 and y ≥ 0

• Are X and Y independent? Is the product of the marginal PDFs equal to the joint PDF? Yes, by inspection:

    f_X(x) = ∫_0^∞ 2 × 10⁻⁶ e^(−0.001x − 0.002y) dy = 0.001 e^(−0.001x)    for x > 0
    f_Y(y) = ∫_0^∞ 2 × 10⁻⁶ e^(−0.001x − 0.002y) dx = 0.002 e^(−0.002y)    for y > 0

• Find this probability:

    P(X > 1000, Y < 1000) = P(X > 1000) · P(Y < 1000) = e⁻¹ (1 − e⁻²) = 0.318
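Because the factorization holds, the probability is a product of one-dimensional exponential probabilities (a Python sketch):

```python
import math

# Example 5-11: X ~ Exp(0.001) and Y ~ Exp(0.002) are independent, so
# P(X > 1000, Y < 1000) = P(X > 1000) * P(Y < 1000).
p = math.exp(-0.001 * 1000) * (1 - math.exp(-0.002 * 1000))  # = e^{-1}(1 - e^{-2})
```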
Example 5-12: Machined Dimensions

Let the random variables X and Y denote the lengths of two dimensions of a machined part. Assume that X and Y are independent and normally distributed with the parameters below. Find the desired probability.

| Normal random variables | X | Y |
|---|---|---|
| Mean | 10.5 | 3.2 |
| Variance | 0.0025 | 0.0036 |

    P(10.4 < X < 10.6, 3.15 < Y < 3.25) = P(10.4 < X < 10.6) · P(3.15 < Y < 3.25)
        = P( (10.4 − 10.5)/0.05 < Z < (10.6 − 10.5)/0.05 ) · P( (3.15 − 3.2)/0.06 < Z < (3.25 − 3.2)/0.06 )
        = P(−2 < Z < 2) · P(−0.833 < Z < 0.833)
        = 0.568
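The standard normal probabilities can be computed with `math.erf` rather than tables (a Python sketch; Φ denotes the standard normal CDF):

```python
import math

# Example 5-12 via the standard normal CDF, Phi(z) = (1 + erf(z / sqrt(2))) / 2.
def phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

p_x = phi(2.0) - phi(-2.0)                  # P(10.4 < X < 10.6), z = 0.1/0.05
p_y = phi(0.05 / 0.06) - phi(-0.05 / 0.06)  # P(3.15 < Y < 3.25), z = 0.05/0.06
p = p_x * p_y
```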
Example 5-13: More Than Two Random Variables

• Many dimensions of a machined part are routinely measured during production. Let the random variables X₁, X₂, X₃ and X₄ denote the lengths of four dimensions of a part.
• What we have learned about joint, marginal and conditional PDFs in two variables extends to many (p) random variables.

Sec 5-1.5 More Than Two Random Variables
Joint Probability Density Function Redefined

The joint probability density function for the continuous random variables X₁, X₂, X₃, …, X_p, denoted as f_{X₁X₂…X_p}(x₁, x₂, …, x_p), satisfies the following properties:

(1) f_{X₁X₂…X_p}(x₁, x₂, …, x_p) ≥ 0
(2) ∫ … ∫ f_{X₁X₂…X_p}(x₁, x₂, …, x_p) dx₁ dx₂ … dx_p = 1 (over all of p-dimensional space)
(3) For any region B of p-dimensional space,
    P((X₁, X₂, …, X_p) ∈ B) = ∫…∫_B f_{X₁X₂…X_p}(x₁, x₂, …, x_p) dx₁ dx₂ … dx_p            (5-8)
Example 5-14

In an electronic assembly, let X₁, X₂, X₃, X₄ denote the lifetimes of 4 components in hours. The joint PDF is:

    f(x₁, x₂, x₃, x₄) = 9 × 10⁻¹² e^(−0.001x₁ − 0.002x₂ − 0.0015x₃ − 0.003x₄)    for xᵢ ≥ 0

What is the probability that the device operates more than 1000 hours? The joint PDF is a product of exponential PDFs, so

    P(X₁ > 1000, X₂ > 1000, X₃ > 1000, X₄ > 1000) = e^(−1 − 2 − 1.5 − 3) = e^(−7.5) = 0.00055
Example 5-15: Probability as a Ratio of Volumes

• Suppose the joint PDF of the continuous random variables X and Y is constant over the region x² + y² ≤ 4. The graph is a round cake with radius 2 and height 1/(4π).
• A cake round of radius 1 is cut from the center of the cake, the region x² + y² ≤ 1.
• What is the probability that a randomly selected bit of cake came from the center round?
• The volume of the cake is 1. The volume of the round is π · 1² · 1/(4π) = 1/4. The desired probability is 1/4.
Marginal Probability Density Function

If the joint probability density function of continuous random variables X₁, X₂, …, X_p is f_{X₁X₂…X_p}(x₁, x₂, …, x_p), the marginal probability density function of X_i is

    f_{X_i}(x_i) = ∫ … ∫ f_{X₁X₂…X_p}(x₁, x₂, …, x_p) dx₁ dx₂ … dx_{i−1} dx_{i+1} … dx_p            (5-9)

where the integral is over all points in the range of X₁, X₂, …, X_p for which X_i = x_i. (Don't integrate out x_i.)
Mean & Variance of a Joint PDF

The mean and variance of X_i can be determined from either the marginal PDF or the joint PDF as follows:

    E(X_i) = ∫ … ∫ x_i f_{X₁X₂…X_p}(x₁, x₂, …, x_p) dx₁ dx₂ … dx_p
                                                                           (5-10)
    V(X_i) = ∫ … ∫ (x_i − μ_{X_i})² f_{X₁X₂…X_p}(x₁, x₂, …, x_p) dx₁ dx₂ … dx_p
Example 5-16

• There are 10 points in this discrete joint PMF.
• Note that x₁ + x₂ + x₃ = 3.
• List the marginal PMF of X₂:

    P(X₂ = 0) = f(0, 0, 3) + f(1, 0, 2) + f(2, 0, 1) + f(3, 0, 0)
    P(X₂ = 1) = f(0, 1, 2) + f(1, 1, 1) + f(2, 1, 0)
    P(X₂ = 2) = f(0, 2, 1) + f(1, 2, 0)
    P(X₂ = 3) = f(0, 3, 0)

Note the index pattern.
Reduced Dimensionality

If the joint probability density function of continuous random variables X₁, X₂, …, X_p is f_{X₁X₂…X_p}(x₁, x₂, …, x_p), then the probability density function of X₁, X₂, …, X_k, k < p, is

    f_{X₁X₂…X_k}(x₁, x₂, …, x_k) = ∫ … ∫ f_{X₁X₂…X_p}(x₁, x₂, …, x_p) dx_{k+1} dx_{k+2} … dx_p            (5-11)

where the integral is over all points in the range of X₁, X₂, …, X_p for which X_i = x_i for i = 1 through k. (Integrate out the p − k remaining variables.)
Conditional Probability Distributions

• Conditional probability distributions can be developed for multiple random variables by extension of the ideas used for two random variables.
• Suppose p = 5 and we wish to find the distribution conditional on X₄ and X₅:

    f_{X₁X₂X₃|x₄x₅}(x₁, x₂, x₃) = f_{X₁X₂X₃X₄X₅}(x₁, x₂, x₃, x₄, x₅) / f_{X₄X₅}(x₄, x₅)    for f_{X₄X₅}(x₄, x₅) > 0
Independence with Multiple Variables

The concept of independence can be extended to multiple variables. Random variables X₁, X₂, …, X_p are independent if and only if

    f_{X₁X₂…X_p}(x₁, x₂, …, x_p) = f_{X₁}(x₁) · f_{X₂}(x₂) · … · f_{X_p}(x_p)    for all x₁, x₂, …, x_p            (5-12)

(The joint PDF equals the product of all the marginal PDFs.)
Example 5-17

• In Chapter 3, we showed that a negative binomial random variable with parameters p and r can be represented as a sum of r geometric random variables X₁, X₂, …, X_r, each with parameter p.
• Because the binomial trials are independent, X₁, X₂, …, X_r are independent random variables.
Example 5-18: Layer Thickness

Suppose X₁, X₂, X₃ represent the thickness in μm of a substrate, an active layer, and a coating layer of a chemical product. Assume that these variables are independent and normally distributed with the parameters and specified limits tabled below.

| Normal random variables | X₁ | X₂ | X₃ |
|---|---|---|---|
| Mean (μ) | 10,000 | 1,000 | 80 |
| Std dev (σ) | 250 | 20 | 4 |
| Lower limit | 9,200 | 950 | 75 |
| Upper limit | 10,800 | 1,050 | 85 |
| P(in limits) | 0.99863 | 0.98758 | 0.78870 |

What proportion of the product meets all specifications?

    P(all in limits) = 0.99863 × 0.98758 × 0.78870 = 0.77783

Which one of the three thicknesses has the least probability of meeting specs? Answer: layer 3 (the coating layer) has the least probability.
Covariance

• Covariance is a measure of the relationship between two random variables.
• First, we need to describe the expected value of a function of two random variables. Let h(X, Y) denote the function of interest.

    E[h(X, Y)] = Σ_x Σ_y h(x, y) f_XY(x, y)                    for X, Y discrete
                                                               (5-13)
    E[h(X, Y)] = ∫_-∞^∞ ∫_-∞^∞ h(x, y) f_XY(x, y) dx dy        for X, Y continuous

Sec 5-2 Covariance & Correlation
Example 5-19: E(Function of 2 Random Variables)

Here h(X, Y) = (X − μ_X)(Y − μ_Y), whose expected value is the covariance.

| x | y | f(x, y) | x − μ_X | y − μ_Y | f · (x − μ_X)(y − μ_Y) |
|---|---|---|---|---|---|
| 1 | 1 | 0.1 | −1.4 | −1.0 | 0.14 |
| 1 | 2 | 0.2 | −1.4 | 0.0 | 0.00 |
| 3 | 1 | 0.2 | 0.6 | −1.0 | −0.12 |
| 3 | 2 | 0.2 | 0.6 | 0.0 | 0.00 |
| 3 | 3 | 0.3 | 0.6 | 1.0 | 0.18 |

Marginals: f_X(1) = 0.3, f_X(3) = 0.7; f_Y(1) = 0.3, f_Y(2) = 0.4, f_Y(3) = 0.3.
Means: μ_X = 2.4, μ_Y = 2.0. Covariance = 0.14 − 0.12 + 0.18 = 0.20.

Figure 5-12 Discrete joint distribution of X and Y.
Covariance Defined

The covariance between the random variables X and Y, denoted as cov(X, Y) or σ_XY, is

    σ_XY = E[(X − μ_X)(Y − μ_Y)] = E(XY) − μ_X μ_Y            (5-14)

The units of σ_XY are units of X times units of Y. For example, if the units of X are feet and the units of Y are pounds, the units of the covariance are foot-pounds. Unlike the variance, the covariance can take any sign: −∞ < σ_XY < ∞.
Covariance and Scatter Patterns

Figure 5-13 Joint probability distributions and the sign of cov(X, Y). Note that covariance is a measure of linear relationship. Variables with non-zero covariance are correlated.
Example 5-20: Intuitive Covariance

The probability distribution of Example 5-1 is shown:

| y = times city name is stated | x = 1 | x = 2 | x = 3 |
|---|---|---|---|
| 1 | 0.01 | 0.02 | 0.25 |
| 2 | 0.02 | 0.03 | 0.20 |
| 3 | 0.02 | 0.10 | 0.05 |
| 4 | 0.15 | 0.10 | 0.05 |

By inspection, note that the larger probabilities occur as X and Y move in opposite directions. This indicates a negative covariance.
Correlation (ρ = rho)

The correlation between random variables X and Y, denoted as ρ_XY, is

    ρ_XY = cov(X, Y) / √(V(X) · V(Y)) = σ_XY / (σ_X σ_Y)            (5-15)

Since σ_X > 0 and σ_Y > 0, ρ_XY and cov(X, Y) have the same sign. We say that ρ_XY is normalized, so

    −1 ≤ ρ_XY ≤ 1            (5-16)

Note that ρ_XY is dimensionless. Variables with non-zero correlation are correlated.
Example 5-21: Covariance & Correlation

Determine the covariance and correlation for the discrete joint distribution f(x, y) of Figure 5-14, with μ_X = μ_Y = 1.8:

| x | y | f(x, y) | x − μ_X | y − μ_Y | f · (x − μ_X)(y − μ_Y) |
|---|---|---|---|---|---|
| 0 | 0 | 0.2 | −1.8 | −1.8 | 0.648 |
| 1 | 1 | 0.1 | −0.8 | −0.8 | 0.064 |
| 1 | 2 | 0.1 | −0.8 | 0.2 | −0.016 |
| 2 | 1 | 0.1 | 0.2 | −0.8 | −0.016 |
| 2 | 2 | 0.1 | 0.2 | 0.2 | 0.004 |
| 3 | 3 | 0.4 | 1.2 | 1.2 | 0.576 |

    covariance = 1.260
    σ_X = σ_Y = 1.1662
    correlation = 1.260 / (1.1662 · 1.1662) = 0.926

Note the strong positive correlation.

Figure 5-14 Discrete joint distribution, f(x, y).
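The covariance and correlation can be recomputed from the point masses (a Python sketch; the (x, y) support listed in the code is an assumption reconstructed from the tabulated marginals of Example 5-21):

```python
# Covariance and correlation for a discrete joint distribution.
f = {(0, 0): 0.2, (1, 1): 0.1, (1, 2): 0.1,
     (2, 1): 0.1, (2, 2): 0.1, (3, 3): 0.4}

mu_x = sum(x * p for (x, _), p in f.items())
mu_y = sum(y * p for (_, y), p in f.items())
cov = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in f.items())
var_x = sum((x - mu_x) ** 2 * p for (x, _), p in f.items())
var_y = sum((y - mu_y) ** 2 * p for (_, y), p in f.items())
rho = cov / (var_x * var_y) ** 0.5
```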
Example 5-22: Covariance & Correlation

Calculate the correlation for the joint distribution below.

| x | y | f(x, y) |
|---|---|---|
| 1 | 7 | 0.2 |
| 2 | 9 | 0.6 |
| 3 | 11 | 0.2 |

    μ_X = 2.0,   μ_Y = 9.0
    σ_X = 0.6325,   σ_Y = 1.2649
    E(XY) = 1·7·0.2 + 2·9·0.6 + 3·11·0.2 = 18.8
    cov(X, Y) = E(XY) − μ_X μ_Y = 18.8 − 18.0 = 0.80
    ρ_XY = 0.80 / (0.6325 · 1.2649) = 1.00

using Equations 5-14 & 5-15.

Figure 5-15 Discrete joint distribution. Steepness of the line connecting the points is immaterial.
Independence Implies ρ = 0

• If X and Y are independent random variables, then

    σ_XY = ρ_XY = 0            (5-17)

• ρ_XY = 0 is a necessary, but not sufficient, condition for independence.
  – Figure 5-13d (x, y plots as a circle) provides an example.
  – Figure 5-13b (x, y plots as a square) indicates independence, but a non-rectangular pattern would indicate dependence.
Example 5-23: Independence Implies Zero Covariance

Let f_XY(x, y) = xy/16 for 0 ≤ x ≤ 2 and 0 ≤ y ≤ 4. Show that σ_XY = E(XY) − E(X)·E(Y) = 0.

    E(X) = (1/16) ∫_0^4 ∫_0^2 x² y dx dy = (1/16) ∫_0^4 (8/3) y dy = (1/16)(8/3)(8) = 4/3

    E(Y) = (1/16) ∫_0^4 ∫_0^2 x y² dx dy = (1/16) ∫_0^4 2 y² dy = (1/16)(2)(64/3) = 8/3

    E(XY) = (1/16) ∫_0^4 ∫_0^2 x² y² dx dy = (1/16) ∫_0^4 (8/3) y² dy = (1/16)(8/3)(64/3) = 32/9

    σ_XY = E(XY) − E(X)·E(Y) = 32/9 − (4/3)(8/3) = 32/9 − 32/9 = 0

Figure 5-15 A planar joint distribution.
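The zero covariance can also be checked numerically with a midpoint rule over the rectangle (a sketch; the 400 × 400 grid is an arbitrary choice of mine):

```python
# Numeric check that cov(X, Y) = 0 for f(x, y) = x*y/16 on [0,2] x [0,4].
def expect(g, n=400):
    # E[g(X, Y)] by a 2-D composite midpoint rule.
    hx, hy = 2.0 / n, 4.0 / n
    s = 0.0
    for i in range(n):
        x = (i + 0.5) * hx
        for j in range(n):
            y = (j + 0.5) * hy
            s += g(x, y) * (x * y / 16.0) * hx * hy
    return s

ex = expect(lambda x, y: x)
ey = expect(lambda x, y: y)
exy = expect(lambda x, y: x * y)
cov = exy - ex * ey
```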
Common Joint Distributions

• There are two common joint distributions:
  – The multinomial probability distribution (discrete), an extension of the binomial distribution.
  – The bivariate normal probability distribution (continuous), a two-variable extension of the normal distribution. Although higher-dimensional extensions exist, we do not deal with more than two random variables here.
• There are many lesser-known and custom joint probability distributions, as you have already seen.

Sec 5-3 Common Joint Distributions
Multinomial Probability Distribution
• Suppose a random experiment consists of a series of n trials. Assume that:
1) The outcome of each trial can be classified into one of k classes.
2) The probability of a trial resulting in each of the k outcomes is constant, denoted as p1, p2, …, pk.
3) The trials are independent.
• The random variables X1, X2, …, Xk denote the number of outcomes in each class and have a multinomial distribution with probability mass function:

$$P(X_1 = x_1, X_2 = x_2, \ldots, X_k = x_k) = \frac{n!}{x_1!\,x_2!\cdots x_k!}\;p_1^{x_1} p_2^{x_2}\cdots p_k^{x_k} \quad (5\text{-}18)$$

for $x_1 + x_2 + \cdots + x_k = n$ and $p_1 + p_2 + \cdots + p_k = 1$.
Sec 5-3.1 Multinomial Probability Distribution
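Equation 5-18 is straightforward to evaluate with standard-library Python. The helper below is our sketch (the function name is ours, not the book's); it computes the multinomial PMF for any vector of counts and class probabilities.

```python
from math import factorial

def multinomial_pmf(counts, probs):
    """Multinomial PMF (Equation 5-18):
    n!/(x1!...xk!) * p1^x1 * ... * pk^xk."""
    n = sum(counts)
    coef = factorial(n)
    for x in counts:
        coef //= factorial(x)  # multinomial coefficient
    p = 1.0
    for x, pi in zip(counts, probs):
        p *= pi ** x
    return coef * p

# Digital-channel counts from Example 5-24: 14 E, 3 G, 2 F, 1 P
prob = multinomial_pmf([14, 3, 2, 1], [0.60, 0.30, 0.08, 0.02])  # ~ 0.0063
```

The same call with counts [12, 6, 2, 0] reproduces the 0.0358 of Example 5-25.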
Example 5-24: Digital Channel
Of the 20 bits received over a digital channel, 14 are of excellent (E) quality, 3 are good (G), 2 are fair (F), and 1 is poor (P). Suppose the sequence received was EEEEEEEEEEEEEEGGGFFP.

x      E     G     F     P
P(x)   0.60  0.30  0.08  0.02

The probability of that particular sequence is 0.6^14 · 0.3^3 · 0.08^2 · 0.02^1 = 2.708 × 10⁻⁹.

However, the number of different orderings of those bits is a lot:

20! / (14!·3!·2!·1!) = 2,325,600

The combined result is a multinomial probability:

$$P(X_1 = 14, X_2 = 3, X_3 = 2, X_4 = 1) = \frac{20!}{14!\,3!\,2!\,1!}\;0.6^{14}\,0.3^{3}\,0.08^{2}\,0.02^{1} = 0.0063$$
Example 5-25: Digital Channel
Refer again to the prior Example 5-24.
What is the probability that 12 bits are E, 6 bits
are G, 2 are F, and 0 are P?
P  x1  12, x 2  6, x 3  2, x 4  0  
20 !
0.6 0.3 0.08 0.02  0.0358
12
6
2
0
12 !6 !2 !0 !
Using Excel
0.03582 = (FACT(20)/(FACT(12)*FACT(6)*FACT(2))) * 0.6^12*0.3^6*0.08^2
Multinomial Means and Variances
The marginal distributions of the multinomial
are binomial.
If X1, X2, …, Xk have a multinomial distribution, the marginal probability distribution of Xi is binomial with

E(Xi) = npi and V(Xi) = npi(1 − pi)    (5-19)
Example 5-26: Marginal Probability Distributions
Refer again to the prior Example 5-25.
The classes are now {G}, {F}, and {E, P}.
Now the multinomial changes to:
$$P(X_2 = x_2, X_3 = x_3) = \frac{n!}{x_2!\,x_3!\,(n - x_2 - x_3)!}\;p_2^{x_2}\,p_3^{x_3}\,(1 - p_2 - p_3)^{\,n - x_2 - x_3}$$

for $x_2 = 0$ to $n - x_3$ and $x_3 = 0$ to $n - x_2$.
Example 5-27: Bivariate Normal Distribution
Earlier, we discussed the two dimensions of an
injection-molded part as two random
variables (X and Y). Let each dimension be
modeled as a normal random variable. Since
the dimensions are from the same part, they
are typically not independent and hence
correlated.
Now we have five parameters to describe the
bivariate normal distribution:
μX, σX, μY, σY, ρXY
Sec 5-3.2 Bivariate Normal Distribution
Bivariate Normal Distribution Defined
f XY  x , y ;  X ,  X ,  Y ,  Y ,   
u 
1
2 1  
2

x  
X

2


X

2

1
2  X  Y 1  
2  x  X
e
u
2
  y  Y   y  Y 

 X Y
Y
2
2



for    x   and    y   .
  x  0,
P aram eter lim its: 
  y  0,
   x   ,
   y   ,
1    1
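The density transcribes directly into code. A minimal sketch (the function name and the standard-case defaults are ours):

```python
from math import pi, sqrt, exp

def bivariate_normal_pdf(x, y, mu_x=0.0, sd_x=1.0, mu_y=0.0, sd_y=1.0, rho=0.0):
    """Bivariate normal density as defined above."""
    zx = (x - mu_x) / sd_x
    zy = (y - mu_y) / sd_y
    # the exponent term u from the definition
    u = (zx * zx - 2 * rho * zx * zy + zy * zy) / (1 - rho ** 2)
    return exp(-u / 2) / (2 * pi * sd_x * sd_y * sqrt(1 - rho ** 2))
```

With ρ = 0 the density factors into the product of two univariate normal densities, which the code reproduces exactly.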
Role of Correlation
Figure 5-17 These illustrations show the shapes and contour lines of two
bivariate normal distributions. The left distribution has independent X, Y
random variables (ρ = 0). The right distribution has dependent X, Y random
variables with positive correlation (ρ > 0, actually 0.9). The center of the
contour ellipses is the point (μX, μY).
Example 5-28: Standard Bivariate Normal Distribution
Figure 5-18 This is a standard bivariate normal because its means
are zero, its standard deviations are one, and its correlation is zero
since X and Y are independent. The density function is:
$$f_{XY}(x, y) = \frac{1}{2\pi}\, e^{-0.5\,(x^2 + y^2)}$$
Marginal Distributions of the Bivariate Normal
If X and Y have a bivariate normal distribution with joint probability density function fXY(x, y; σX, σY, μX, μY, ρ), the marginal probability distributions of X and Y are normal with means μX and μY and standard deviations σX and σY, respectively.    (5-21)
Figure 5-19 The marginal probability density functions of a bivariate
normal distribution are simply projections of the joint onto each of
the axis planes. Note that the correlation (ρ) has no effect on the
marginal distributions.
Conditional Distributions of the Joint Normal
If X and Y have a bivariate normal distribution with
joint probability density fXY(x,y;σX,σY,μX,μY,ρ), the
conditional probability distribution of Y given X =
x is normal with mean and variance as follows:
$$\mu_{Y|x} = \mu_Y + \rho\,\frac{\sigma_Y}{\sigma_X}\,(x - \mu_X)$$

$$\sigma^2_{Y|x} = \sigma_Y^2\,(1 - \rho^2)$$
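The two formulas above wrap into a small helper. The sketch below is ours; the illustrative call uses the injection-molded-part parameters from this chapter (μX = 3, σX = 0.04, μY = 7.7, σY = 0.08, ρ = 0.8).

```python
def conditional_normal(x, mu_x, sd_x, mu_y, sd_y, rho):
    """Conditional distribution of Y given X = x for a bivariate normal:
    mean = mu_Y + rho*(sd_Y/sd_X)*(x - mu_X), var = sd_Y^2 * (1 - rho^2)."""
    mean = mu_y + rho * (sd_y / sd_x) * (x - mu_x)  # mu_{Y|x}
    var = sd_y ** 2 * (1 - rho ** 2)                # sigma^2_{Y|x}
    return mean, var

# Injection-molded part: observing a length of x = 3.04
m, v = conditional_normal(3.04, 3, 0.04, 7.7, 0.08, 0.8)
```

Note that the conditional variance does not depend on the observed x, only on ρ.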
Correlation of Bivariate Normal Random Variables
If X and Y have a bivariate normal distribution with joint probability density function fXY(x, y; σX, σY, μX, μY, ρ), the correlation between X and Y is ρ.    (5-22)
Bivariate Normal Correlation & Independence
• In general, zero correlation does not imply
independence.
• But in the special case that X and Y have a
bivariate normal distribution, if ρ = 0, then X
and Y are independent.
(5-23)
Example 5-29: Injection-Molded Part
The injection-molded part dimensions have the parameters shown in the table and are graphed as shown (Figure 5-3).

Bivariate     X     Y
Mean          3     7.7
Std Dev       0.04  0.08
Correlation   0.8
Upper Limit   3.05  7.80
Lower Limit   2.95  7.60

The probability that X and Y are within limits is the volume under the PDF between the limit values.
This volume is determined by numerical integration, which is beyond the scope of this text.
Linear Functions of Random Variables
• A function of random variables is itself a
random variable.
• A function of random variables can be formed
by either linear or nonlinear relationships. We
limit our discussion here to linear functions.
• Given random variables X1, X2,…,Xp and
constants c1, c2, …, cp
Y= c1X1 + c2X2 + … + cpXp
(5-24)
is a linear combination of X1, X2,…,Xp.
Sec 5-4 Linear Functions of Random Variables
Mean & Variance of a Linear Function
Let Y= c1X1 + c2X2 + … + cpXp and use Equation 5-10:
E  Y   c1 E  X 1   c 2 E  X 2   ...  c p E  X
V  Y   c1 V
2
p

(5-25)
 X 1   c 22V  X 2   ...  c 2pV  X p   2   c i c j cov  X i X j 
(5-26)
i j
If X 1 , X 2 , ..., X
V  Y   c1 V
2
p
are independent , then cov  X i X
j
  0,
 X 1   c 22V  X 2   .. .  c 2pV  X p 
Sec 5-4 Linear Functions of Random Variables
© John Wiley & Sons, Inc. Applied Statistics and Probability for Engineers, by Montgomery and Runger.
(5-27)
69
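Equations 5-25 through 5-27 translate directly into a helper function. The sketch below is ours (the name and the covariance-dictionary convention are assumptions, not from the text).

```python
def linear_combination_stats(c, means, variances, cov=None):
    """E(Y) and V(Y) for Y = sum_i c_i * X_i (Equations 5-25 and 5-26).
    cov is an optional dict {(i, j): cov(X_i, X_j)} for i < j; omitted
    pairs are treated as uncorrelated, reducing to Equation 5-27."""
    ey = sum(ci * mi for ci, mi in zip(c, means))
    vy = sum(ci * ci * vi for ci, vi in zip(c, variances))
    if cov:
        for (i, j), cij in cov.items():
            vy += 2 * c[i] * c[j] * cij  # cross terms of Equation 5-26
    return ey, vy

# e.g. Y = 2*X1 + 2*X2 with means (2, 5) and variances (0.01, 0.04)
e_y, v_y = linear_combination_stats([2, 2], [2, 5], [0.01, 0.04])
```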
Example 5-30: Negative Binomial Distribution
Let Xi be a geometric random variable with
parameter p with μ = 1/p and σ2 = (1-p)/p2
Let Y = X1 + X2 +…+Xr, a linear combination of r
independent geometric random variables.
Then Y is a negative binomial random variable with μ = r/p and σ² = r(1−p)/p² by Equations 5-25 and 5-27.
Thus, a negative binomial random variable is a
sum of r identically distributed and
independent geometric random variables.
Example 5-31: Error Propagation
A semiconductor product consists of three layers. The variances of the thicknesses of the layers are 25, 40, and 30 nm². What is the variance of the thickness of the finished product?

X = X₁ + X₂ + X₃

$$V(X) = \sum_{i=1}^{3} V(X_i) = 25 + 40 + 30 = 95 \text{ nm}^2$$

$$SD(X) = \sqrt{95} = 9.747 \text{ nm}$$
Mean & Variance of an Average
If $\bar{X} = \dfrac{X_1 + X_2 + \cdots + X_p}{p}$ and $E(X_i) = \mu$, then

$$E(\bar{X}) = \frac{p\mu}{p} = \mu \quad (5\text{-}28a)$$

If the $X_i$ are independent with $V(X_i) = \sigma^2$, then

$$V(\bar{X}) = \frac{p\sigma^2}{p^2} = \frac{\sigma^2}{p} \quad (5\text{-}28b)$$
Reproductive Property of the Normal Distribution
If $X_1, X_2, \ldots, X_p$ are independent normal random variables with $E(X_i) = \mu_i$ and $V(X_i) = \sigma_i^2$ for $i = 1, 2, \ldots, p$, then

$$Y = c_1 X_1 + c_2 X_2 + \cdots + c_p X_p$$

is a normal random variable with

$$E(Y) = c_1\mu_1 + c_2\mu_2 + \cdots + c_p\mu_p$$

and

$$V(Y) = c_1^2\sigma_1^2 + c_2^2\sigma_2^2 + \cdots + c_p^2\sigma_p^2 \quad (5\text{-}29)$$
Example 5-32: Linear Function of Independent Normals
Let the random variables X1 and X2 denote
the independent length and width of a
rectangular manufactured part. Their
parameters are shown in the table.
What is the probability that the perimeter
exceeds 14.5 cm?
Parameters:
         X₁    X₂
Mean     2     5
Std Dev  0.1   0.2

Let Y = 2X₁ + 2X₂ = perimeter

E(Y) = 2E(X₁) + 2E(X₂) = 2(2) + 2(5) = 14 cm

V(Y) = 2²V(X₁) + 2²V(X₂) = 4(0.1)² + 4(0.2)² = 0.04 + 0.16 = 0.20

SD(Y) = √0.20 = 0.4472 cm

P(Y > 14.5) = 1 − Φ((14.5 − 14)/0.4472) = 1 − Φ(1.1180) = 0.1318

Using Excel
0.1318 = 1 - NORMDIST(14.5, 14, SQRT(0.2), TRUE)
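The final probability can be checked without Excel: Φ is expressible with the standard-library error function. A quick sketch (the helper name is ours):

```python
from math import erf, sqrt

def normal_cdf(x, mu=0.0, sd=1.0):
    """Phi((x - mu)/sd) via the error function."""
    return 0.5 * (1 + erf((x - mu) / (sd * sqrt(2))))

# Perimeter Y ~ N(14, 0.20): probability it exceeds 14.5 cm
p = 1 - normal_cdf(14.5, mu=14, sd=sqrt(0.20))  # ~ 0.1318
```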
Example 5-33: Beverage Volume
Soft drink cans are filled by an automated filling machine. The
mean fill volume is 12.1 fluid ounces, and the standard
deviation is 0.1 fl oz. Assume that the fill volumes are
independent, normal random variables. What is the probability
that the average volume of 10 cans is less than 12 fl oz?
Let Xᵢ denote the fill volume of the i-th can, and let $\bar{X} = \sum_{i=1}^{10} X_i / 10$.

$$E(\bar{X}) = \frac{\sum_{i=1}^{10} E(X_i)}{10} = \frac{10(12.1)}{10} = 12.1 \text{ fl oz}$$

$$V(\bar{X}) = \frac{\sum_{i=1}^{10} V(X_i)}{10^2} = \frac{10(0.1)^2}{100} = 0.001 \text{ fl oz}^2$$

$$P(\bar{X} < 12) = \Phi\!\left(\frac{12 - 12.1}{\sqrt{0.001}}\right) = \Phi(-3.16) = 0.00079$$

Using Excel
0.000783 = NORMDIST(12, 12.1, SQRT(0.001), TRUE)
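The same erf-based Φ verifies this example too; a sketch under the slide's assumptions (10 independent fills, each N(12.1, 0.1²)):

```python
from math import erf, sqrt

def normal_cdf(x, mu=0.0, sd=1.0):
    """Phi((x - mu)/sd) via the error function."""
    return 0.5 * (1 + erf((x - mu) / (sd * sqrt(2))))

# Average of n = 10 fills: Xbar ~ N(12.1, 0.1**2 / 10)
p = normal_cdf(12, mu=12.1, sd=sqrt(0.1 ** 2 / 10))  # ~ 0.00078
```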
General Functions of Random Variables
• A linear combination is not the only way to form a new random variable (Y) from one or more random variables (Xi) that we already know. We focus here on a single variable, i.e., Y = h(X), the transform function.
• The transform function must be monotone:
– Each value of x produces exactly one value of y.
– Each value of y translates to only one value of x.
• This methodology produces the probability mass
or density function of Y from the function of X.
Sec 5-5 General Functions of Random Variables
General Function of a Discrete Random Variable
Suppose that X is a discrete random variable
with probability distribution fX(x). Let Y = h(X)
define a one-to-one transformation between
the values of X and Y so that the equation y =
h(x) can be solved uniquely for x in terms of y.
Let this solution be x = u(y), the inverse
transform function. Then the probability mass
function of the random variable Y is
fY(y) = fX[u(y)]
(5-30)
Example 5-34: Function of a Discrete Random Variable
Let X be a geometric random variable with PMF:
fX(x) = p(1 − p)^(x−1) for x = 1, 2, …
Find the probability distribution of Y = X2.
Solution:
– Since X > 0, the transformation Y = X² is one-to-one.
– The inverse transform function is x = √y.
– fY(y) = p(1 − p)^(√y − 1) for y = 1, 4, 9, 16, …
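Equation 5-30 applied to Example 5-34 can be coded directly. The sketch below is ours, with an arbitrary illustrative value p = 0.25:

```python
from math import isqrt

def geometric_pmf(x, p=0.25):
    """f_X(x) = p * (1 - p)^(x - 1) for x = 1, 2, ..."""
    return p * (1 - p) ** (x - 1)

def f_y(y, p=0.25):
    """PMF of Y = X^2 via Equation 5-30: f_Y(y) = f_X(sqrt(y)),
    nonzero only when y is the square of a positive integer."""
    r = isqrt(y)
    if r < 1 or r * r != y:
        return 0.0
    return geometric_pmf(r, p)
```

Since the transform only relabels outcomes, the probabilities over y = 1, 4, 9, … still sum to 1.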
General Function of a Continuous Random Variable
Suppose that X is a continuous random variable
with probability distribution fX(x). Let Y = h(X)
define a one-to-one transformation between the
values of X and Y so that the equation y = h(x) can
be solved uniquely for x in terms of y. Let this
solution be x = u(y), the inverse transform
function. Then the probability density function of
the random variable Y is
fY(y) = fX[u(y)]∙|J|
(5-31)
where J = u’(y) is called the Jacobian of the
transformation and the absolute value is used.
Sec 5-5 General Functions of Random Variables
© John Wiley & Sons, Inc. Applied Statistics and Probability for Engineers, by Montgomery and Runger.
79
Example 5-35: Function of a Continuous Random Variable
Let X be a continuous random variable with probability
distribution:
$$f_X(x) = \frac{x}{8} \quad \text{for } 0 \le x \le 4$$

Find the probability distribution of Y = h(X) = 2X + 4.
Note that Y has a one-to-one relationship to X.

$$x = u(y) = \frac{y - 4}{2} \quad \text{and the Jacobian is } J = u'(y) = \frac{1}{2}$$

$$f_Y(y) = f_X\!\left(\frac{y - 4}{2}\right)\cdot\left|\frac{1}{2}\right| = \frac{(y - 4)/2}{8}\cdot\frac{1}{2} = \frac{y - 4}{32} \quad \text{for } 4 \le y \le 12.$$
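As a numeric check on Example 5-35 (our sketch): the transformed density should match (y − 4)/32 pointwise and integrate to 1 over [4, 12].

```python
def f_x(x):
    """Density of X: x/8 on [0, 4], zero elsewhere."""
    return x / 8.0 if 0 <= x <= 4 else 0.0

def f_y(y):
    """Equation 5-31: f_Y(y) = f_X(u(y)) * |J| with u(y) = (y-4)/2, J = 1/2."""
    u = (y - 4) / 2.0
    return f_x(u) * 0.5

# Midpoint-rule integral of f_Y over [4, 12]; should be ~ 1.
n = 10000
h = 8.0 / n
area = sum(f_y(4 + (i + 0.5) * h) for i in range(n)) * h
```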
Important Terms & Concepts for Chapter 5
• Bivariate distribution
• Bivariate normal distribution
• Conditional mean
• Conditional probability density function
• Conditional probability mass function
• Conditional variance
• Contour plots
• Correlation
• Covariance
• Error propagation
• General functions of random variables
• Independence
• Joint probability density function
• Joint probability mass function
• Linear functions of random variables
• Marginal probability distribution
• Multinomial distribution
• Reproductive property of the normal distribution
Chapter 5 Summary
```