Lecture_20_ASEN_5070_2014F_Post - CCAR

ASEN 5070: Statistical Orbit Determination I
Fall 2014
Professor Brandon A. Jones
Lecture 20: Project Discussion and the Kalman Filter
Boulder

Homework 6 Due Friday
Project/Homework Discussion


Satellite state estimated and propagated in the inertial frame:
Dynamics solve-for parameters are (fundamentally) not tied to a coordinate system:
Ground-station locations are in the Earth-fixed frame:


Since the ground stations are in the Earth-fixed frame, we assume their Earth-fixed coordinates are constant in time. Hence, their inertial coordinates follow from the Earth-fixed-to-inertial rotation:
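The equations on the slide were not captured in this extraction; a sketch of the likely content, where T(t) denotes the Earth-fixed-to-inertial rotation matrix (a notation assumed here):

```latex
\dot{\mathbf{r}}_s^{\,\mathrm{ECEF}} = \mathbf{0}
\quad\Longrightarrow\quad
\mathbf{r}_s^{\,\mathrm{ECI}}(t) = T(t)\,\mathbf{r}_s^{\,\mathrm{ECEF}}
```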


The only portion of the reference state requiring integration is the spacecraft position and velocity; the solve-for constants and station coordinates do not change. Since the spacecraft dynamics also do not depend on the station coordinates, strictly speaking, we only need to propagate a 6 × 9 block of the STM!
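A sketch of why, assuming (as in the typical project setup) an 18-element state: position, velocity, three dynamical constants c, and three station position vectors s. The STM then partitions as

```latex
\Phi(t,t_0) =
\begin{bmatrix}
\Phi_d(t,t_0) & 0_{9\times 9} \\
0_{9\times 9} & I_{9}
\end{bmatrix},
\qquad
\Phi_d(t,t_0) =
\begin{bmatrix}
\Psi(t,t_0) \\
0_{3\times 6}\;\; I_{3}
\end{bmatrix}
```

where only the 6 × 9 block Ψ = ∂[r(t); v(t)]/∂[r₀; v₀; c] requires numerical integration.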
All of these need to be in the same reference frame! We recommend including the Earth-fixed-to-inertial transformation in the measurement model:
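For example, the range measurement (with T(t) again an assumed notation for the Earth-fixed-to-inertial rotation) would likely be computed as:

```latex
\rho(t) = \left\| \mathbf{r}(t) - T(t)\,\mathbf{r}_s^{\,\mathrm{ECEF}} \right\|
```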


How can we estimate the filter solve-for parameters when the observations do not seem to depend on them? How/why can we estimate these values (conceptually and mathematically)? The key: the STM is a function of these values.

Compare to solution online

Results available as .txt and .mat
◦ Results generated for the .txt files did not use ode45()!
◦ Results in the .mat file appear to have used Rel/Abs tolerances of 1e-11

Note: some elements of the project website
need to be updated (suggestions and rubric)


We ask for relative differences to quickly identify discrepancies between your result and the one online.
Example:
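For instance, a minimal sketch of one common convention (the exact definition the graders use may differ):

```python
import numpy as np

# Hypothetical sketch of an element-wise relative difference; the
# project may define the comparison slightly differently.
def relative_difference(x_mine, x_ref):
    """Relative difference between your result and the posted one."""
    x_mine = np.asarray(x_mine, dtype=float)
    x_ref = np.asarray(x_ref, dtype=float)
    return np.abs(x_mine - x_ref) / np.abs(x_ref)
```

A difference near machine precision (~1e-12 or smaller, relative) suggests agreement; larger values point to an integrator-tolerance or modeling discrepancy.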
Conventional Kalman Filter (CKF)

Given from a previous filter:

We have a new observation and mapping matrix:

We can update the solution via:
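The slide's equations were lost in extraction; given the surrounding discussion (and the later use of the Schur identity), they are likely the minimum-variance/normal-equation update in standard notation, with a priori x̄ₖ, P̄ₖ from the previous filter and new observation yₖ, mapping matrix H̃ₖ, noise covariance Rₖ:

```latex
\hat{\mathbf{x}}_k =
\left(\bar{P}_k^{-1} + \tilde{H}_k^{T} R_k^{-1} \tilde{H}_k\right)^{-1}
\left(\bar{P}_k^{-1}\bar{\mathbf{x}}_k + \tilde{H}_k^{T} R_k^{-1} \mathbf{y}_k\right),
\qquad
P_k = \left(\bar{P}_k^{-1} + \tilde{H}_k^{T} R_k^{-1} \tilde{H}_k\right)^{-1}
```

Note the n × n inverses here; the Kalman form below replaces them with a p × p inverse.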

Is there a better sequential processing algorithm?
◦ YES! The equations above may be manipulated to yield the Kalman filter

Today – Outline derivation from minimum variance estimator
◦ Demonstrates mathematical equivalence of CKF and Batch

Wednesday – Derivation as a solution to Bayes' theorem
◦ Demonstrates strengths of Kalman filter in context of probability/statistics
◦ Also helps to understand impacts of assumptions

Schur Identity (Appendix B, Theorem 4):
(Yes, it will simplify things…)
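The identity itself was not captured; applied to the quantities used here, it is likely the familiar matrix-inversion form:

```latex
\left(\bar{P}^{-1} + \tilde{H}^{T} R^{-1} \tilde{H}\right)^{-1}
= \bar{P} - \bar{P}\tilde{H}^{T}
\left(\tilde{H}\bar{P}\tilde{H}^{T} + R\right)^{-1}\tilde{H}\bar{P}
```

This converts the n × n inverse of the normal-equation form into a p × p inverse.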
Kalman Gain
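The derivation on these slides was not reproduced in this extraction; applying the identity above to the minimum-variance solution yields the standard measurement-update equations, sketched here in common notation:

```latex
K_k = \bar{P}_k \tilde{H}_k^{T}
\left(\tilde{H}_k \bar{P}_k \tilde{H}_k^{T} + R_k\right)^{-1}
\qquad \text{(Kalman gain)}
```

```latex
\hat{\mathbf{x}}_k = \bar{\mathbf{x}}_k
+ K_k\left(\mathbf{y}_k - \tilde{H}_k \bar{\mathbf{x}}_k\right),
\qquad
P_k = \left(I - K_k \tilde{H}_k\right)\bar{P}_k
```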


Mathematically equivalent to the batch least squares

Also provides a solution to the least squares minimization problem

Yields a new set of problems in filtering (to be covered later)
Unlike the batch processor, the CKF does not map observations back to the epoch time! Note the use of H̃ (the measurement partials evaluated at the observation time) rather than H.


Reinitialize the integrator after each observation:
Alternatively, we can use already-generated integrator output:



We have to invert a p × p matrix, which is likely more efficient and stable than an n × n matrix inversion.
Can we further reduce the computational cost?
Yes – under certain conditions…
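One such condition is uncorrelated measurement errors (diagonal R): the p observations at an epoch can then be processed one at a time, and each "inversion" collapses to a scalar division. A hedged sketch (function name is mine, not the course's):

```python
import numpy as np

# Sketch of scalar sequential processing, assuming R is diagonal, so
# each scalar observation y[i] is processed with variance r_diag[i].
def scalar_updates(x, P, y, H, r_diag):
    for i in range(len(y)):
        h = H[i]                    # 1 x n row of the mapping matrix
        s = h @ P @ h + r_diag[i]   # innovation variance (a scalar)
        k = P @ h / s               # Kalman gain (an n-vector)
        x = x + k * (y[i] - h @ x)  # state update
        P = P - np.outer(k, h @ P)  # covariance update, (I - k h) P
    return x, P
```

When R is diagonal this is algebraically equivalent to the vector update with the p × p inverse.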

Whitening Transformation
Use new values in Kalman filter
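A minimal sketch of the transformation, assuming the standard Cholesky-based approach (the function name is mine, not the course's): factor R = V Vᵀ and premultiply by V⁻¹ so the transformed noise has identity covariance.

```python
import numpy as np

# Sketch: "whiten" correlated measurement noise.  With R = V V^T
# (Cholesky), premultiplying y and H by V^{-1} gives uncorrelated,
# unit-variance errors, so the Kalman filter can use R = I.
def whiten(y, H, R):
    V = np.linalg.cholesky(R)
    y_w = np.linalg.solve(V, y)   # V^{-1} y
    H_w = np.linalg.solve(V, H)   # V^{-1} H
    return y_w, H_w               # whitened noise covariance is I
```

The whitened y_w and H_w then feed the filter in place of y and H, enabling the scalar processing above even when R is not diagonal.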

The Kalman Filter – Prediction Residuals


Previously, we have discussed the pre-fit and post-fit residuals:
How can this change in the context of the CKF?


At each measurement time in the CKF, we can take a look at the prediction residual:
Covariance of the prediction residual:
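The equations were not captured in this extraction; in common notation, the prediction (pre-fit) residual and its covariance are likely:

```latex
\boldsymbol{\epsilon}_k = \mathbf{y}_k - \tilde{H}_k \bar{\mathbf{x}}_k,
\qquad
E\!\left[\boldsymbol{\epsilon}_k \boldsymbol{\epsilon}_k^{T}\right]
= \tilde{H}_k \bar{P}_k \tilde{H}_k^{T} + R_k
```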