### Demand Estimation

*From* Managerial Economics, *12th Edition, by Mark Hirschey, Chapter 5: Demand Estimation.*

#### Overview

- Interview and experimental methods
- Simple demand curve estimation
- Simple market demand curve estimation
- The identification problem
- Regression analysis
- Measuring regression model significance
- Measures of individual variable significance

#### Interview and Experimental Methods

**Consumer interviews**

- Interviews can solicit useful information when market data are scarce.
- Consumer opinions can differ from actual behavior.

**Market experiments**

- Controlled experiments can generate useful insight.
- Experiments can be expensive.

#### Simple Demand Curve Estimation

- The best estimation method balances marginal costs and marginal benefits.
- Simple linear relations are often useful for demand estimation; straight-line relations can give useful approximations.

Consider a linear demand curve $P = a + bQ$, and suppose two price-quantity points are observed: $Q = 3{,}200$ at $P = \$12$, and $Q = 4{,}000$ at $P = \$10$. Then

$$12 = a + b(3{,}200)$$
$$10 = a + b(4{,}000)$$

Subtracting the second equation from the first gives $2 = -800b$, so $b = -0.0025$. Substituting back into the first equation,

$$12 = a - 0.0025(3{,}200) = a - 8 \quad\Rightarrow\quad a = 20$$

so the estimated demand curve is $P = \$20 - \$0.0025Q$.

**Demand curve and total revenue maximization**

$$TR = P \cdot Q = (\$20 - \$0.0025Q)\,Q = \$20Q - \$0.0025Q^2$$

$$MR = \frac{dTR}{dQ} = \$20 - \$0.005Q$$

Total revenue is maximized where $MR = 0$:

$$\$20 - \$0.005Q = 0 \quad\Rightarrow\quad Q = 4{,}000, \quad P = \$20 - \$0.0025(4{,}000) = \$10$$

$$\max TR = P \cdot Q = 4{,}000 \cdot \$10 = \$40{,}000$$

#### Market Demand Curve Estimation

- The market demand curve shows the total quantity customers are willing to buy at various prices under current market conditions.
- Graphically, market demand is the horizontal sum of the individual demand quantities: $Q_{1+2} = Q_1 + Q_2$.

Example. Suppose domestic demand, foreign demand, and total cost are

$$P = \$100 - \$0.001Q_D \quad\text{(domestic demand)}$$
$$P = \$80 - \$0.004Q_F \quad\text{(foreign demand)}$$
$$TC = \$1{,}200{,}000 + \$24Q \quad\text{(total cost)}$$

Inverting each demand curve to express quantity as a function of price:

$$0.001Q_D = 100 - P \quad\Rightarrow\quad Q_D = 100{,}000 - 1{,}000P$$
$$0.004Q_F = 80 - P \quad\Rightarrow\quad Q_F = 20{,}000 - 250P$$

Summing gives the market demand curve:

$$Q = Q_D + Q_F = 120{,}000 - 1{,}250P \quad\Rightarrow\quad P = \$96 - \$0.0008Q$$

Then

$$TR = P \cdot Q = \$96Q - \$0.0008Q^2, \qquad MR = \frac{dTR}{dQ} = \$96 - \$0.0016Q$$

$$MC = \frac{dTC}{dQ} = \$24$$

At profit maximization, $MR = MC$:

$$\$96 - \$0.0016Q = \$24 \quad\Rightarrow\quad \$0.0016Q = \$72 \quad\Rightarrow\quad Q = 45{,}000$$
$$P = \$96 - \$0.0008(45{,}000) = \$60$$

Profit is

$$\pi = TR - TC = \$96Q - \$0.0008Q^2 - (\$1{,}200{,}000 + \$24Q) = -\$0.0008Q^2 + \$72Q - \$1{,}200{,}000$$
$$\pi = -\$0.0008(45{,}000)^2 + \$72(45{,}000) - \$1{,}200{,}000 = \$420{,}000$$

*[Figure: domestic, foreign, and market demand curves, price ($0 to $120) against quantity (0 to 140,000).]*

#### The Identification Problem

- **Changing nature of demand relations.** Demand relations are dynamic; curve shifts can be estimated.
- **Interplay of demand and supply.** Economic conditions affect both demand and supply.
- **Simultaneous relations.** Quantity and price are jointly determined.

*[Figure: shifting supply curves S1, S2, S3 intersect shifting demand curves D1, D2, D3 (each holding all else equal with respect to demand determinants) at observed points (Q1, P1), (Q2, P2), (Q3, P3). A curve "D" fitted through these points is not a demand curve: the data come from three shifting demand curves, and the fitted "D" has a higher elasticity than the true curve D1.]*

#### Regression Analysis

- **What is a statistical relation?** A statistical relation exists when two variables are related on average, but not exactly; a deterministic relation is true by definition.
- **Specifying the regression model.** The dependent variable $Y$ is assumed to be caused by $X$; the $X$ variables are determined independently of $Y$.
- **Least squares method.** Minimize the sum of squared residuals.

*[Figure: scatter diagrams of unit sales illustrating a direct relation (unit sales vs. X), an inverse relation (unit sales vs. price), and no relation (unit sales vs. the orbit of Jupiter).]*

Regression analysis fits a least-squares line $\hat{Y} = \hat{a} + \hat{b}X$ to the data, choosing $\hat{a}$ and $\hat{b}$ to minimize the sum of squared errors $\sum (Y_i - \hat{Y}_i)^2$. Each observation $Y_i$ decomposes into an estimate $\hat{Y}_i$ and an error $e_i = Y_i - \hat{Y}_i$. The least-squares estimates are

$$\hat{b} = \frac{n\sum XY - \sum X \sum Y}{n\sum X^2 - \left(\sum X\right)^2}, \qquad \hat{a} = \frac{\sum Y - \hat{b}\sum X}{n}$$

(In the slide's scatter-plot illustration, the fitted line is $\hat{Y} = 0.136047 + 0.005962X$.)

Worked example with $n = 10$ observations:

| $X$ | $Y$ | $XY$ | $X^2$ | $\hat{Y}$ | $Y - \hat{Y}$ |
|----:|----:|-----:|------:|----------:|--------------:|
| 24 | 78 | 1,872 | 576 | 79.50 | -1.50 |
| 43 | 100 | 4,300 | 1,849 | 93.67 | 6.33 |
| 24 | 86 | 2,064 | 576 | 79.50 | 6.50 |
| 34 | 82 | 2,788 | 1,156 | 86.96 | -4.96 |
| 36 | 86 | 3,096 | 1,296 | 88.45 | -2.45 |
| 38 | 84 | 3,192 | 1,444 | 89.94 | -5.94 |
| 22 | 75 | 1,650 | 484 | 78.01 | -3.01 |
| 23 | 80 | 1,840 | 529 | 78.76 | 1.24 |
| 30 | 83 | 2,490 | 900 | 83.98 | -0.98 |
| 33 | 91 | 3,003 | 1,089 | 86.21 | 4.79 |

With $\sum X = 307$, $\sum Y = 845$, $\sum XY = 26{,}295$, and $\sum X^2 = 9{,}899$:

$$\hat{b} = \frac{10(26{,}295) - 307(845)}{10(9{,}899) - 307^2} = 0.745623$$
$$\hat{a} = \frac{845 - 0.745623 \cdot 307}{10} = 61.60937$$
$$\hat{Y} = 61.60937 + 0.745623X$$

**Functional form.** Regression equations can take on any functional form, for example the linear form $Y = a + b_1X_1 + b_2X_2 + b_3X_3$ or a multiplicative form such as

$$Q = b_0\,P^{b_P}\,X_2^{b_2}\,X_3^{b_3}$$

The multiplicative form is popular among economists because each exponent, such as $b_P$, is the constant elasticity of the dependent variable with respect to that variable: $\epsilon_P = \frac{\partial Q}{\partial P}\cdot\frac{P}{Q} = b_P$. For example,

$$Q = 4P^{-0.4}X_2^{0.2}X_3^{0.003}$$

has a constant price elasticity of $-0.4$.

#### Measuring Regression Model Significance

The **standard error of the estimate (SEE)** reflects the degree of scatter about the regression line.

*[Figure: regression line $\hat{Y} = \hat{a} + \hat{b}X$ (with $\hat{b}$ the slope of the line) plus upper and lower 95% confidence bounds at $\hat{Y} \pm 1.96$ standard errors of the estimate.]*

**Goodness of fit**

- Correlation $r$ shows the degree of concurrence: $r = 1$ means perfect correlation; $r = 0$ means no correlation.
- The coefficient of determination $R^2$: $R^2 = 100\%$ means a perfect fit; $R^2 = 0\%$ means no relation.
- The corrected (adjusted) coefficient of determination adjusts $R^2$ downward for small samples.

$R^2$ decomposes the total variation of $Y$ into explained and unexplained parts:

$$R^2 = \frac{\text{explained variation}}{\text{total variation}} = \frac{\sum(\hat{Y} - \bar{Y})^2}{\sum(Y - \bar{Y})^2}$$

where the unexplained variation is $\sum(Y - \hat{Y})^2$. The corrected coefficient of determination is

$$\bar{R}^2 = 1 - (1 - R^2)\,\frac{n - 1}{n - k - 1}$$

with $n$ observations and $k$ independent variables.

**F statistic.** The F statistic tells whether $R^2$ is statistically significant.

*[Figure: goodness-of-fit test showing the "do not reject H₀" and "reject H₀" regions of the F distribution, with critical values F = 2.69 at α = 0.05 and F = 2.14 at α = 0.10.]*

#### Judging Variable Significance

- t statistics compare sample characteristics to the standard deviation of that characteristic.
- **Two-tail t tests** are tests of effect: $|t| > 1.645$ implies a strong effect of $X$ on $Y$ (90% confidence); $|t| > 1.96$ implies an even stronger effect (95% confidence).
- **One-tail t tests** are tests of magnitude or direction.

*[Figure: two-tail t test of H₀: b = 0 against Hₐ: b ≠ 0. With α/2 = 0.025 in each tail, the critical value is ±1.96 (95% confidence level); with α/2 = 0.05 in each tail, it is ±1.645 (90% confidence level).]*

#### Multiple Regression Example

Excel regression output for quantity demanded $Q$ regressed on own price $P$, a related price $P_s$ (positive coefficient, consistent with a substitute), a related price $P_c$ (negative coefficient, consistent with a complement), and income $I$:

**Regression statistics**

| Statistic | Value |
|---|---:|
| Multiple R | 0.939811 |
| R Square | 0.883244 |
| Adjusted R Square | 0.836542 |
| Standard Error | 2.929296 |
| Observations | 15 |

**ANOVA**

| Source | df | SS | MS | F | Significance F |
|---|--:|--:|--:|--:|--:|
| Regression | 4 | 649.1256 | 162.2814 | 18.91221 | 0.000118 |
| Residual | 10 | 85.80775 | 8.580775 | | |
| Total | 14 | 734.9333 | | | |

**Coefficients**

| Variable | Coefficient | Standard Error | t Stat | P-value | Lower 95% | Upper 95% |
|---|--:|--:|--:|--:|--:|--:|
| Intercept | 23.30213 | 17.55563 | 1.327331 | 0.213904 | -15.8142 | 62.41851 |
| P | -5.96115 | 2.928719 | -2.03541 | 0.069176 | -12.4867 | 0.564444 |
| Ps | 6.50636 | 3.888925 | 1.673049 | 0.125263 | -2.1587 | 15.17142 |
| Pc | -1.09766 | 3.416276 | -0.3213 | 0.754595 | -8.7096 | 6.514275 |
| I | -1.3E-05 | 0.000116 | -0.10919 | 0.915211 | -0.00027 | 0.000245 |

**Data**

| Q | P | Ps | Pc | I |
|--:|--:|--:|--:|--:|
| 20 | 2.0 | 1.5 | 3.0 | 38,000 |
| 30 | 1.5 | 2.5 | 2.3 | 22,000 |
| 10 | 2.3 | 1.7 | 3.2 | 40,000 |
| 15 | 2.5 | 1.9 | 2.8 | 25,000 |
| 12 | 2.8 | 1.4 | 3.5 | 28,000 |
| 28 | 1.7 | 2.7 | 2.2 | 17,000 |
| 17 | 2.4 | 1.8 | 3.1 | 35,000 |
| 14 | 3.0 | 1.7 | 3.8 | 40,000 |
| 20 | 2.0 | 2.1 | 2.9 | 20,000 |
| 22 | 2.0 | 2.3 | 2.3 | 34,000 |
| 32 | 1.4 | 2.8 | 2.4 | 20,000 |
| 14 | 3.0 | 1.4 | 3.1 | 40,000 |
| 25 | 1.8 | 2.4 | 2.4 | 36,000 |
| 28 | 1.6 | 2.6 | 2.2 | 36,000 |
| 12 | 3.0 | 1.7 | 3.1 | 30,000 |
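The simple demand-curve derivation and total-revenue maximization above can be checked numerically. A minimal Python sketch (function and variable names are my own, not the textbook's):

```python
# Fit a linear inverse demand curve P = a + b*Q through two observed
# (price, quantity) points, then find the total-revenue-maximizing quantity.

def fit_linear_demand(p1, q1, p2, q2):
    """Solve P = a + b*Q from two (price, quantity) observations."""
    b = (p1 - p2) / (q1 - q2)   # slope: 2 / -800 = -0.0025
    a = p1 - b * q1             # intercept: 12 + 0.0025 * 3200 = 20
    return a, b

a, b = fit_linear_demand(12, 3200, 10, 4000)

# TR = a*Q + b*Q^2, so MR = a + 2*b*Q; setting MR = 0 gives Q = -a / (2*b).
q_star = -a / (2 * b)
p_star = a + b * q_star
tr_max = p_star * q_star

print(a, b)
print(q_star, p_star, tr_max)
```

With the chapter's numbers this recovers P = 20 - 0.0025Q, a revenue-maximizing output of 4,000 units at a $10 price, and maximum total revenue of $40,000.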
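The market demand aggregation and MR = MC profit maximization can be sketched the same way, using the domestic/foreign example's figures (function names are my own):

```python
# Horizontally sum two linear demand segments and find the profit-maximizing
# output where MR = MC, using the domestic/foreign demand example.

def q_domestic(p):
    # Inverted from P = 100 - 0.001*Qd
    return 100_000 - 1_000 * p

def q_foreign(p):
    # Inverted from P = 80 - 0.004*Qf
    return 20_000 - 250 * p

def q_market(p):
    # Horizontal sum of individual quantities (valid where both are positive)
    return q_domestic(p) + q_foreign(p)

# Market demand Q = 120,000 - 1,250*P, i.e. P = 96 - 0.0008*Q.
# TR = 96*Q - 0.0008*Q^2 gives MR = 96 - 0.0016*Q; TC = 1,200,000 + 24*Q
# gives MC = 24.
mc = 24.0
q_star = (96 - mc) / 0.0016            # solve MR = MC
p_star = 96 - 0.0008 * q_star
profit = p_star * q_star - (1_200_000 + 24 * q_star)

print(q_star, p_star, profit)
```

This reproduces Q = 45,000, P = $60, and a profit of $420,000; note that at P = $60 the individual segments sell 40,000 (domestic) and 5,000 (foreign) units, which sum to the market quantity.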
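The ten-observation least-squares example can be verified directly from the summation formulas; a short sketch that also computes the coefficient of determination (the slides do not report an R² for this example, so that value is my own calculation):

```python
# Least-squares slope and intercept from the summation formulas, applied to
# the ten-observation example, plus R-squared from the variation decomposition.

X = [24, 43, 24, 34, 36, 38, 22, 23, 30, 33]
Y = [78, 100, 86, 82, 86, 84, 75, 80, 83, 91]

n = len(X)
sx = sum(X)                                # 307
sy = sum(Y)                                # 845
sxy = sum(x * y for x, y in zip(X, Y))     # 26,295
sxx = sum(x * x for x in X)                # 9,899

b = (n * sxy - sx * sy) / (n * sxx - sx ** 2)   # about 0.745623
a = (sy - b * sx) / n                           # about 61.60937

# R^2 = explained variation / total variation.
y_bar = sy / n
fitted = [a + b * x for x in X]
r2 = (sum((f - y_bar) ** 2 for f in fitted)
      / sum((y - y_bar) ** 2 for y in Y))

print(round(b, 6), round(a, 5), round(r2, 4))
```

The slope and intercept match the slide's values, confirming the fitted line Ŷ = 61.60937 + 0.745623X.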
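If the 15-observation data table is transcribed faithfully, solving the least-squares normal equations should reproduce the Excel coefficients (Intercept ≈ 23.30, P ≈ -5.96, and so on) and R² ≈ 0.883 up to rounding. A pure-Python sketch, with no external libraries (the solver is a generic textbook method, not anything from the chapter):

```python
# Reproduce the multiple-regression coefficients by solving the normal
# equations (X'X) beta = X'y with Gaussian elimination (partial pivoting).

Q = [20, 30, 10, 15, 12, 28, 17, 14, 20, 22, 32, 14, 25, 28, 12]
P = [2, 1.5, 2.3, 2.5, 2.8, 1.7, 2.4, 3, 2, 2, 1.4, 3, 1.8, 1.6, 3]
Ps = [1.5, 2.5, 1.7, 1.9, 1.4, 2.7, 1.8, 1.7, 2.1, 2.3, 2.8, 1.4, 2.4, 2.6, 1.7]
Pc = [3, 2.3, 3.2, 2.8, 3.5, 2.2, 3.1, 3.8, 2.9, 2.3, 2.4, 3.1, 2.4, 2.2, 3.1]
I = [38000, 22000, 40000, 25000, 28000, 17000, 35000, 40000, 20000, 34000,
     20000, 40000, 36000, 36000, 30000]

# Design matrix with an intercept column.
X = [[1.0, p, ps, pc, i] for p, ps, pc, i in zip(P, Ps, Pc, I)]
y = [float(q) for q in Q]
k = len(X[0])

# Normal equations: A = X'X, v = X'y.
A = [[sum(row[r] * row[c] for row in X) for c in range(k)] for r in range(k)]
v = [sum(row[r] * yi for row, yi in zip(X, y)) for r in range(k)]

# Forward elimination with partial pivoting, then back substitution.
for col in range(k):
    piv = max(range(col, k), key=lambda r: abs(A[r][col]))
    A[col], A[piv] = A[piv], A[col]
    v[col], v[piv] = v[piv], v[col]
    for r in range(col + 1, k):
        f = A[r][col] / A[col][col]
        for c in range(col, k):
            A[r][c] -= f * A[col][c]
        v[r] -= f * v[col]
beta = [0.0] * k
for r in range(k - 1, -1, -1):
    beta[r] = (v[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]

# Goodness of fit: R^2 = 1 - SSE / SST.
fitted = [sum(bi * xi for bi, xi in zip(beta, row)) for row in X]
y_bar = sum(y) / len(y)
sse = sum((yi - f) ** 2 for yi, f in zip(y, fitted))
sst = sum((yi - y_bar) ** 2 for yi in y)
r2 = 1 - sse / sst

print([round(bi, 5) for bi in beta], round(r2, 6))
```

As a consistency check, the total sum of squares computed from the data (≈ 734.93) matches the ANOVA table's Total SS, which suggests the flattened spreadsheet was recovered correctly.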