Odometry Error Modeling
Three notable methods for modeling
random odometry error
Two types of odometer errors
• Systematic error
– Can be avoided by calibrating the wheel encoder
– Can be accounted for in software
• Random error
– Difficult to calibrate or capture in a closed-form formula
– Must be modeled mathematically
– A significant source of confidence-level degradation
Systematic error
• UMBmark test
– Let the robot follow a defined path several times
– Compare the deviation of the robot in CW/CCW
– Calculate the weighted Cartesian offsets
– Correct error in software/hardware
Example: UMBmark with a square of side D
The robot covers a square of side D, n times in both the CW and CCW
sense. At the end of each lap, the errors in the x and y directions are recorded.
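The averaging step above can be sketched as follows (the function name and sample error values are illustrative, not from the UMBmark paper):

```python
import math

def umbmark_offsets(errors_cw, errors_ccw):
    """Average the return-position errors of the CW and CCW runs
    (their 'centers of gravity') and report the larger of the two
    radii as the overall measure of systematic odometry error."""
    def center(errs):
        n = len(errs)
        return (sum(dx for dx, _ in errs) / n,
                sum(dy for _, dy in errs) / n)

    cg_cw = center(errors_cw)
    cg_ccw = center(errors_ccw)
    e_max = max(math.hypot(*cg_cw), math.hypot(*cg_ccw))
    return cg_cw, cg_ccw, e_max

# Hypothetical errors (mm) from n = 3 laps around the square in each sense
cg_cw, cg_ccw, e_max = umbmark_offsets(
    [(30.0, -40.0), (32.0, -38.0), (28.0, -42.0)],
    [(-29.0, -41.0), (-31.0, -39.0), (-30.0, -40.0)])
```

The single scalar `e_max` is what the test compares before and after the software/hardware correction.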
Random error modeling 1
• This model expresses the covariance matrix in
terms of four parameters: E_T, E_R, K_theta, K_rho
• The relations are:
– V_theta = K_theta × (translational speed)
– V_rho = K_rho × (translational speed)
– Note: V stands for variance
• The other two parameters (E_T, E_R) describe
systematic errors, so they are not of interest here
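In code, the two random-error relations amount to the following (the parameter values are made up for illustration; real ones come from calibration):

```python
def motion_variances(k_theta, k_rho, v_trans):
    """Variance of the heading (theta) and travelled-distance (rho)
    errors, each growing linearly with the translational speed."""
    return k_theta * v_trans, k_rho * v_trans

# Hypothetical parameter values and a translational speed of 0.5 m/s
v_theta, v_rho = motion_variances(k_theta=1e-4, k_rho=5e-4, v_trans=0.5)
```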
How to determine parameters
• Perform simple robot movements, such as
moving back and forth a certain number of times
• Measure several “observables”: quantities
derived from the robot’s movement
• Calculate the four parameters from the
measured observables
• Certain observables give a better estimate of
the parameters; the paper lists the best observable
for estimating each model parameter.
Model 2
• Any robot path can be decomposed into three
primitive movements: circular motion, turn on
spot, straight motion.
• This model finds a closed-form formula for error
propagation during circular motion.
• Using this model, the robot updates its odometry
covariance matrix only when it changes its
type of motion, making computation faster and
independent of the sequence of actions.
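A minimal sketch of a per-primitive covariance update, using the straight-line case (the Jacobian F and the diagonal process noise below are illustrative; the model's closed form also covers circular arcs):

```python
import math

def mat_mul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def propagate_straight(sigma, theta, d, q_diag):
    """One covariance update for a straight segment of length d at
    heading theta: sigma' = F sigma F^T + Q, applied once per motion
    primitive rather than at every control step."""
    f = [[1.0, 0.0, -d * math.sin(theta)],
         [0.0, 1.0,  d * math.cos(theta)],
         [0.0, 0.0,  1.0]]
    ft = [list(row) for row in zip(*f)]
    s = mat_mul(mat_mul(f, sigma), ft)
    return [[s[i][j] + (q_diag[i] if i == j else 0.0)
             for j in range(3)] for i in range(3)]

# Starting from perfect knowledge, one 1 m straight segment adds only Q
sigma1 = propagate_straight([[0.0] * 3 for _ in range(3)],
                            theta=0.0, d=1.0, q_diag=[0.01, 0.01, 0.001])
```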
Model 3
• This model takes into account the variation in
the wheel diameter and wheel separation.
• A model that incorporates the random wheel
variation into the covariance matrix was derived.
• The result shows that the final error propagation is
the sum of the covariance propagation with no wheel
variation and a new error term.
• The new error term updates the wheel-variation
covariance matrix (Q)
Result of model 3 (figure)
Odometry Error Correction
• Nonsystematic Error Correction
– OmniMate Design
– Sensor Fusion
• Systematic Odometry Error Correction
– Systematic Error Compensation
Systematic and Nonsystematic Errors
• Systematic errors: errors in the design
– Unequal wheel diameters
– Misalignment of wheels
– Limited sampling resolution
– Uncertainty about the effective wheelbase
• Nonsystematic errors: errors due to the environment
– Travel over an uneven floor
– Unexpected objects on the floor
– Wheel slippage
Nonsystematic Error Correction:
OmniMate Design
• Omnidirectional mobile robot
• Built from two differential-drive
TRC LabMate platforms
• Front and rear platforms can
rotate around points A and B
OmniMate (continued)
• IPEC: Internal Position Error Correction
(the OmniMate control system)
• Automatically detects and corrects
nonsystematic odometry errors
Nonsystematic Error Correction:
Sensor Fusion
• Fuse data from the gyroscope and odometry
systems during long-duration travel
– If |dataG – dataE| > a preset threshold, dataG is used
to compute the orientation angle; otherwise the
odometry data is used
– The threshold is adapted to different environments
– Kalman filters reduce noise in the measured
angular position
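The threshold rule can be sketched as follows (names and values are illustrative):

```python
def fused_heading(theta_gyro, theta_odo, threshold):
    """Use the gyroscope heading when it disagrees with odometry by
    more than a preset threshold (tuned per environment); otherwise
    trust the odometry heading. Angles in radians."""
    if abs(theta_gyro - theta_odo) > threshold:
        return theta_gyro
    return theta_odo

# Large disagreement (e.g., after wheel slippage): the gyro wins
heading = fused_heading(theta_gyro=0.30, theta_odo=0.10, threshold=0.15)
```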
Systematic Error Correction
• Wheel-diameter error: affects straight-line
motion
– E_d = D_R/D_L (ratio of the actual wheel diameters)
• Wheelbase error: affects turning
– E_b: the actual wheelbase expressed as a fraction of
the nominal value
Systematic Odometry Errors
• R = (L/2)/sin(β/2)
• E_d = (R + b/2)/(R − b/2)
• D_a = (D_R + D_L)/2
• D_L = 2·D_a/(E_d + 1)
• D_R = 2·D_a/((1/E_d) + 1)
• Correction factors:
– C_L = 2/(E_d + 1)
– C_R = 2/((1/E_d) + 1)
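Numerically, the correction factors behave like this (the 2% diameter mismatch is a made-up example):

```python
def correction_factors(e_d):
    """Correction factors derived from E_d = D_R / D_L; applying
    c_l and c_r to the left and right wheel-distance readings
    compensates for the unequal wheel diameters."""
    c_l = 2.0 / (e_d + 1.0)
    c_r = 2.0 / ((1.0 / e_d) + 1.0)
    return c_l, c_r

# E.g., right wheel 2% larger in diameter than the left
c_l, c_r = correction_factors(1.02)
```

Note the built-in identity C_R = E_d · C_L, so the corrected distances keep their ratio consistent with the true wheel diameters.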
Simultaneous Localization and
Mapping (SLAM)
• Replaces the need to manually create and
update maps
– A manually built map becomes invalid if an object
moves or the environment is rearranged
– Allows the robot to be truly autonomous
Primary Goal
• Be able to start from an arbitrary point
• autonomously explore the local environment
using its on-board sensors,
• analyze the location and map the area
• determine its actual location on the self-constructed map
Popular Implementations
• Visual SLAM based on the extended
Kalman filter
• Graph-based SLAM
• Particle-filter SLAM
Visual SLAM Based on the
Extended Kalman Filter
• Combines the extended Kalman filter estimates
of the robot pose and the feature locations
• The full state is updated each iteration, which
leads to high computational cost when the map
contains many features
• Implementations have progressively allowed
maps to be broken down into more
manageable datasets
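The cost claim can be made concrete with a quick count (a sketch, assuming the common 2-D point-feature state layout):

```python
def ekf_state_size(n_features):
    """EKF-SLAM keeps one joint state: the robot pose (x, y, theta)
    plus (x, y) for every mapped feature. The covariance matrix is
    square in that dimension, so each update touches O(n^2) entries."""
    dim = 3 + 2 * n_features
    return dim, dim * dim  # state dimension, covariance entries

# With just 100 features, each update already touches >40k entries
dim, cov_entries = ekf_state_size(100)
```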
Graph-based SLAM
• Constraints between robot poses and features are
treated as “soft” constraints
• Creates a grid map of robot locations and
observed features
• Solves the SLAM problem by finding the
configuration of least required energy
on the grid
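A toy one-dimensional sketch of the "least energy" idea: each soft constraint contributes a squared penalty, and the best configuration is the one minimizing the total (the formulation here is illustrative, not the paper's):

```python
def graph_energy(poses, constraints):
    """Energy of a 1-D pose graph: sum of squared violations of the
    soft constraints. Each constraint (i, j, z) says pose j should sit
    z ahead of pose i; Graph-SLAM seeks the poses minimizing this."""
    return sum((poses[j] - poses[i] - z) ** 2
               for i, j, z in constraints)

# Poses consistent with the first constraint but 0.5 off on the second
energy = graph_energy([0.0, 1.0, 2.5], [(0, 1, 1.0), (1, 2, 1.0)])
```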
Particle-Filter SLAM
• Represents the object-location certainty with a set
of samples instead of a Gaussian probability
• Works well for nonlinear transformations and any
type of non-Gaussian distribution
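A minimal sketch of the sample-based belief and its resampling step (simple multinomial resampling; names are illustrative):

```python
import random

def resample(particles, weights):
    """Draw a new particle set in proportion to the weights -- this is
    how a particle filter keeps its belief as samples rather than a
    single Gaussian."""
    total = sum(weights)
    cumulative, acc = [], 0.0
    for w in weights:
        acc += w / total
        cumulative.append(acc)
    new_set = []
    for _ in particles:
        r = random.random()
        for p, c in zip(particles, cumulative):
            if r < c:
                new_set.append(p)
                break
    return new_set

# All the weight on one hypothesis: every survivor is that particle
survivors = resample(["a", "b", "c"], [0.0, 1.0, 0.0])
```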
Handling Errors
• Any error in one iteration will continue to
grow as more iterations are performed
• A common approach to correcting this error is
to return to a feature whose location and
characteristics are well known
• This also minimizes uncertainty across the rest
of the map
Room for Improvement
• Dynamic objects pose a large problem for SLAM
– The robot quickly finds itself working within an invalid map
– One attempted solution is to treat a dynamic object
differently from a static feature on the map
– Any successful method for detecting moving
objects and predicting their destinations would
drastically improve the SLAM process
Room for Improvement
• SLAM is very sensitive to errors
• Such errors are especially detrimental when the
robot attempts to minimize uncertainty by
returning to a known object
• Commonly used laser rangefinders are prone
to missing some of a feature’s details
– An on-board camera is useful for capturing
these details, e.g., via SIFT feature extraction
