Final Presentation Slides - 3D Mapping with Microsoft Kinect

Report
3D Mapping with Microsoft Kinect
Derek Allen, Ambrose Bordelon, David Cambas, Jared Desoto, John Paul Nguyen
Background
Computer Vision:
The act of getting computers to ‘see’.
Machine Vision:
A branch of computer vision, focused on
physical robots ‘seeing’ and reacting to what
they ‘see’.
Project Overview
Create a 3D model of an unexplored room utilizing the
Microsoft Kinect.
• Model of the room should be generated as
quickly as possible.
• Model generated should be as accurate as
possible.
• Platform should be cheap and easy to operate.
Project Management
• Break up coding into 4 parts:
– Program the interface between Kinect and
computer
– Program the Raspberry Pi and Zigbee to allow for
wireless communication
– Develop a SLAM algorithm to stitch together the
sequence of Kinect images
– Develop a visualization application to view the
final reconstructed point cloud as a 3-D model
The Kinect
• Cameras
  – 30fps
  – 640 x 480
• IR Emitter
  – Reconstruction detail
• Tilt Motor
  – 27° tilt range
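Since the project's first tool writes the Kinect depth stream in an image format, here is a minimal sketch of that conversion, assuming 11-bit raw depth values (0–2047, with 2047 treated as a "no reading" sentinel); the sentinel handling and scaling choice are our own illustrative assumptions, not part of the Kinect API:

```python
# Sketch: convert raw Kinect depth readings (assumed 11-bit, 0-2047) to
# 8-bit grayscale pixels so each frame can be saved in a standard image
# format. The 2047 "no reading" sentinel and scaling are assumptions.

def depth_to_gray(depth_values, max_depth=2047):
    """Map raw depth values to 0-255 grayscale; max_depth means no reading."""
    pixels = []
    for d in depth_values:
        if d >= max_depth:      # sensor returned no valid depth
            pixels.append(0)
        else:
            pixels.append(255 - (d * 255) // max_depth)  # nearer = brighter
    return pixels
```

A real capture tool would apply this per pixel to each 640 x 480 frame before writing it out.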
iRobot Create
• Programmable in C or C++
• Cargo Bay
• Wheel Clips
• Weight Capacity
• Travel Speed
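The Create is driven over a serial link using iRobot's Open Interface, where the Drive command is opcode 137 followed by a 16-bit two's-complement velocity (mm/s) and turn radius (mm). A sketch of packing that packet (exact limits and special radius values should be checked against the Open Interface manual):

```python
# Sketch of building an iRobot Create "Drive" packet for its serial Open
# Interface: opcode 137, then signed 16-bit big-endian velocity (mm/s)
# and turn radius (mm). Verify ranges against the OI manual before use.
import struct

def drive_packet(velocity_mm_s, radius_mm):
    """Return the 5 bytes for a Drive command (opcode 137)."""
    # '>hh' packs two signed 16-bit big-endian values, as the OI expects.
    return bytes([137]) + struct.pack(">hh", velocity_mm_s, radius_mm)
```

The resulting bytes would be written to the robot's serial port by the control program.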
Raspberry Pi B+
• 85 x 56 x 17mm
• Programmable in C++
• 4 USB ports
Zigbee XBee
• Wireless communication
• Passive protocol
  – Initiator starts communication
• 72 Mbps
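Sending raster images over the radio link means splitting each image into small frames and reassembling them on the other side. A sketch of one possible framing scheme; the header layout (sequence number plus payload length) is our own assumption for illustration, not part of the XBee protocol:

```python
# Sketch of a chunking/framing scheme for sending a raster image over
# the wireless serial link. The 3-byte header (16-bit sequence number +
# 8-bit payload length) is an assumed format, not an XBee standard.
import struct

CHUNK = 64  # payload bytes per frame; radios favor small packets

def make_frames(data):
    """Split data into frames of (seq, length, payload)."""
    frames = []
    for seq, start in enumerate(range(0, len(data), CHUNK)):
        payload = data[start:start + CHUNK]
        frames.append(struct.pack(">HB", seq, len(payload)) + payload)
    return frames

def reassemble(frames):
    """Invert make_frames, trusting frames to arrive in sequence order."""
    return b"".join(f[3:] for f in frames)
```

A production version would also need acknowledgements and retransmission, which this sketch omits.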
SLAM
• RGBD SLAM
• Scale-Invariant Feature Transform (SIFT)
  – Object Recognition
  – Image Stitching
• Random Sample Consensus (RANSAC)
  – Inliers and Outliers
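RANSAC's inlier/outlier split can be shown with a minimal sketch that fits a 2D line to points, the same idea used to reject bad feature matches during stitching; the threshold and iteration count here are illustrative choices:

```python
# Minimal RANSAC sketch: repeatedly fit a candidate 2D line to two
# random points and keep the line that explains the most inliers.
# Tolerance and iteration count are illustrative, not tuned values.
import random

def ransac_line(points, iters=200, tol=1.0, seed=0):
    """Return (slope, intercept, inliers) for the best sampled line."""
    rng = random.Random(seed)
    best = (None, None, [])
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # skip vertical candidate lines in this sketch
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        inliers = [p for p in points if abs(p[1] - (m * p[0] + b)) < tol]
        if len(inliers) > len(best[2]):
            best = (m, b, inliers)
    return best
```

Points far from the best line (the outliers) are simply left out of the returned inlier set, which is how mismatched features get discarded.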
Programming
Our Reconstruction Application
• Tool to write Kinect depth stream as image format
• Tool to transmit raster images wirelessly
• Tool to perform SLAM on image sequence
• Tool to visualize final model
Programming
• Depth Stream Tool candidates:
  – openni2
  – Kinect Fusion
  – libfreenect
• Transmit Tool candidates:
  – python programming language
  – bash shell
  – cpp language
• SLAM Tool candidates:
  – Point Cloud Library
  – OpenCV
  – RGBDSLAM
• Visualization Tool
The four tools chain together as a pipeline, from the low-level depth
stream and transmit tools up to the high-level SLAM and visualization
tools.
Our Reconstruction Application
• Tool to write Kinect depth stream as image format: libfreenect application (C++)
• SLAM Tool: RGBDSLAM
• Visualization Tool: openframeworks application
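The visualization stage needs the reconstructed point cloud in a file format it can load. A sketch of writing points as ASCII PLY, a standard point-cloud format that common visualization tooling can read (the minimal header follows the standard PLY layout for xyz vertices):

```python
# Sketch: save a reconstructed point cloud as an ASCII PLY file so a
# visualization tool can load it. Header follows the standard minimal
# PLY layout for xyz float vertices.

def write_ply(path, points):
    """Save an iterable of (x, y, z) tuples as an ASCII PLY point cloud."""
    points = list(points)
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write("element vertex %d\n" % len(points))
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        for x, y, z in points:
            f.write("%f %f %f\n" % (x, y, z))
```

The real application would stream the SLAM output into this writer once per finished model.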
Testing
Hardware and software will each be broken into units.
These units will be tested independently in ‘unit tests’,
then tested working together in ‘integration tests’.
Hardware Units Include:
• Kinect
• iRobot Create
• Microcontroller with Wireless adaptor
Testing
Software units will be independent functions; each
function will have set inputs from which specific
outputs are expected.
Software Units Include:
• Input Data Management
• Model Generation
• Blind Spot Detection
• Obstacle Detection
• Path Generation
• Robot Movement Control
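The unit-test pattern above (fixed inputs, expected outputs) can be sketched for one software unit; the obstacle detector below is a hypothetical stand-in, not the project's actual function:

```python
# Sketch of the unit-test pattern: a software unit (here a hypothetical
# obstacle detector over one row of depth readings, in mm) gets fixed
# inputs and is checked against expected outputs.

def detect_obstacle(depth_row, threshold=500):
    """Return True if any reading is closer than threshold mm (0 = no reading)."""
    return any(0 < d < threshold for d in depth_row)

def test_detect_obstacle():
    assert detect_obstacle([800, 900, 450]) is True   # one close reading
    assert detect_obstacle([800, 900, 700]) is False  # all far enough
    assert detect_obstacle([0, 0, 0]) is False        # 0 means no reading
```

Each software unit would get a test function like this, run after every change.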
Notes on Testing
• Software units are subject to change as the project
develops.
• Testing will be done as soon as possible, as often
as possible.
• Extensive testing will be time consuming, but the
feedback will allow fixing errors in an isolated
environment, allowing speedier development in the
long run.
Deadlines and Timing
• 2/12/14 – Finish image stitching algorithm
• 2/25/14 – Test and debug code
• 3/07/14 – Finish constructing robot
• 3/20/14 – Complete testing of basic robot
• 4/18/14 – Complete wireless integration
• 4/24/14 – Complete testing of wireless integration
Executive Summary
• 3-dimensional model using a Kinect sensor
• Budget: iRobot Create, Zigbee XBee, Raspberry Pi
• Divide the code into 4 subdivisions
• Devised tests and deadlines for our robot
