Chris Mangold
GEOG-596A : Capstone Project Proposal – Peer Review
Fall 2013
Advisor: Dr. Peter Guth
 Background and motivations for project.
 Application concept.
 Methodology and approach.
 Anticipated results.
 Project timeline.
 GIS services moving to the Cloud.
 Large-scale hosting of DEM data in the Cloud.
 Mobile platform trends: growth and increased capabilities.
 Rapid adoption of augmented reality.
Example service outputs: 2 km RF propagation; 13 km cumulative viewshed (SRTM 90 m); 3 km observer viewshed.
 High-performance, scalable geo-processing web services.
 Output available via REST APIs.
 Near real-time response using GPU-implemented algorithms.
 Support for terrain spatial analysis operations.
 Slope, aspect, hillshade and observer viewshed outputs derived from raster elevation data.
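As a sketch of the kind of per-cell terrain product listed above, slope and aspect can be derived from a raster elevation grid with a central-difference gradient. The cell size, toy DEM and function name below are illustrative assumptions, not the project's service code:

```python
# Sketch: slope/aspect from a 2-D elevation grid via numpy's
# central-difference gradient. Toy data, not project data.
import numpy as np

def slope_aspect(dem, cell_size=1.0):
    """Return (slope_deg, aspect_deg) arrays for a 2-D elevation grid."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)   # rise per metre along rows, cols
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    # Aspect = compass direction of the downslope face, 0..360 deg.
    aspect = (np.degrees(np.arctan2(-dz_dx, dz_dy)) + 360.0) % 360.0
    return slope, aspect

# A plane rising 1 m per cell toward the east has a 45-degree slope
# and faces west (aspect 270).
dem = np.tile(np.arange(5, dtype=float), (5, 1))
slope, aspect = slope_aspect(dem)
```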
 Geographically global and localized DEM datasets.
 Resolutions from 90 meter to 1 meter.
 Formats that can be easily ingested into Cloud GIS systems.
 LIDAR - bare earth and first return results.
 Location via GPS, cellular and IP networks; 3 meter accuracy standard.
 Improved device orientation sensors (motion, position).
 Dual-core CPUs common, with quad-cores on high-end devices.
 Graphics Processing Units (GPUs) increasingly being integrated into devices.
 1 to 3 GB internal memory, 16 to 32 GB secondary storage.
 Larger screens, higher pixel densities (Retina displays).
Location and object recognition both play a role in AR context.
The central task in AR is geo-registering the real world with the synthetic information.
Using mobile sensors to determine pose (position and orientation) is key.
Adoption by Advertising, Tourism, Entertainment and Medical industries.
Augmented view from Fai della Paganella (Trento, Italy) [1]
 The mainstay appears to be mobile GIS AR apps based on point vector (POI) data.
 POI-database AR frameworks: Layar, Junaio, Wikitude.
 AR visualization that drapes GIS vector and raster data over the camera view will depend on scaling 3-D terrain modeling to mobile platforms.
 A mobile AR visualization layer indicating a navigation path to avoid being detected from one or more known observer locations.
 Derived from Cloud-generated observer viewsheds using hosted DEM data.
 Rendered relative to the user's location and the mobile device's orientation.
Cloud hosted GIS
 Cloud GIS generates and returns requested observer viewshed results as PNG images with bounding box info.
 Device geo-registers viewshed images with the user's location, merges viewshed images with the evaluation AOI, and calculates observed intersection points.
 Device derives the augmented curtain layer and renders it over the live camera feed.
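The geo-registration step depends on mapping each viewshed's geographic bounding box onto the device's AOI grid. A minimal sketch, assuming the 1000 x 1000 px AOI at 1 px = 1 m centred on the user and a flat-earth (equirectangular) approximation; the function name and constants are illustrative, not the project's implementation:

```python
# Sketch: map a viewshed PNG's (lon/lat) bounding box into pixel
# offsets on the AOI grid. Equirectangular approximation only.
import math

M_PER_DEG_LAT = 111_320.0  # rough metres per degree of latitude

def bbox_to_aoi_pixels(bbox, user_lat, user_lon, aoi_size=1000):
    """bbox = (min_lon, min_lat, max_lon, max_lat) -> (left, top, right, bottom) px."""
    min_lon, min_lat, max_lon, max_lat = bbox
    m_per_deg_lon = M_PER_DEG_LAT * math.cos(math.radians(user_lat))
    cx = cy = aoi_size / 2                       # user sits at the AOI centre
    left   = cx + (min_lon - user_lon) * m_per_deg_lon
    right  = cx + (max_lon - user_lon) * m_per_deg_lon
    top    = cy - (max_lat - user_lat) * M_PER_DEG_LAT   # north = up = smaller row
    bottom = cy - (min_lat - user_lat) * M_PER_DEG_LAT
    return round(left), round(top), round(right), round(bottom)
```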
 AOI base map generated using a GeoPDF data source and TileMill (MapBox).
 Map tiles served using the OpenStreetMap (OSM) Android map platform.
 User drops pins to identify observer points for evaluation.
 When the device is vertical, the camera dashboard view and AR curtain layer are displayed.
 Augmented curtain and compass data are refreshed based on the device's orientation.
 Augmented curtain is recalculated as the user's location changes.
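The tiled base map above implies the standard OSM "slippy map" addressing scheme; the lat/lon-to-tile-index conversion below is the standard Web Mercator formula, not code from the proposal:

```python
# Sketch: standard OSM slippy-map tile indexing (Web Mercator).
import math

def deg2tile(lat, lon, zoom):
    """Return the (x, y) tile indices containing a WGS84 point."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_r = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_r)) / math.pi) / 2.0 * n)
    return x, y
```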
DEM comparison: NED 10 m; LIDAR 10 m; LIDAR 3 m aggregate generalization; LIDAR 1.0 m.
 Test application with multiple DEM data sets (NED 10 m, 3 m and LIDAR 10 m, 3 m, 1 m).
 Measure AR curtain accuracy and CPU rendering performance.
 Envision different use case scenarios where lower resolution data will suffice.
 LIDAR data processed to GeoTIFF format (LAStools and ArcGIS for Desktop).
AOI: 1000 pixels x 1000 pixels
Scale: 1 pixel = 1 meter
Base bitmap with neutral pixel color
Scale received viewshed PNG images
Geo-register and merge images
 Read PNG header to get pixel height/width; scale using the viewshed bounding box.
 Maintain color for observed pixels during scaling.
 Geo-register images using OSM utilities.
 The image-merge process clips out parts of viewshed images not in the AOI work area.
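The scale-and-clip steps above can be sketched with a toy 0/1 grid (1 = "observed" pixel). Nearest-neighbour resampling preserves the observed-pixel colour exactly instead of blending it away; the grid sizes and clip window are illustrative assumptions:

```python
# Sketch: nearest-neighbour scaling plus AOI clipping of a received
# viewshed raster, using a toy 0/1 grid in place of PNG pixel data.

def scale_nearest(grid, out_w, out_h):
    """Nearest-neighbour resample of a row-major 2-D grid."""
    in_h, in_w = len(grid), len(grid[0])
    return [[grid[r * in_h // out_h][c * in_w // out_w]
             for c in range(out_w)]
            for r in range(out_h)]

def clip(grid, left, top, right, bottom):
    """Keep only the part of the grid inside the AOI work area."""
    return [row[max(left, 0):right] for row in grid[max(top, 0):bottom]]

viewshed = [[0, 1],
            [1, 1]]
scaled = scale_nearest(viewshed, 4, 4)   # each source pixel becomes 2 x 2
```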
 Scan pixel values within the user evaluation AOI.
 If a pixel's RGB value is green, calculate the azimuth to the center point.
 Maintain a counter for each azimuth compass point in an array.
 Create the AR curtain base (circle) where edge pixels represent the visibility value.
 A red pixel indicates the user will be visible if they move in that direction.
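The scan-and-bin steps above can be sketched as follows: walk the AOI grid, compute the azimuth from the centre (the user) to each observed pixel, and tally hits into one-degree compass bins that form the curtain base; any non-zero bin would render red. The toy grid and bin count are illustrative assumptions:

```python
# Sketch: build the AR curtain base by binning observed pixels into
# per-azimuth counters, with the user at the grid centre.
import math

def curtain_base(observed, bins=360):
    """observed: row-major 0/1 grid; returns per-azimuth hit counters."""
    h, w = len(observed), len(observed[0])
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0     # user at the grid centre
    counts = [0] * bins
    for r in range(h):
        for c in range(w):
            if observed[r][c]:
                # Azimuth: 0 deg = north (up, -row), clockwise positive.
                az = math.degrees(math.atan2(c - cx, cy - r)) % 360.0
                counts[int(az * bins / 360.0) % bins] += 1
    return counts

g = [[0] * 11 for _ in range(11)]
g[0][5] = 1    # observed pixel due north of the user
g[5][10] = 1   # observed pixel due east
counts = curtain_base(g)
```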
 Use the Android orientation sensor API to read the device's current pose.
 Sensor pose values and graphics libraries are used to render the compass view.
 Pose values, graphics libraries and the curtain base array are used to render the AR curtain.
 Android's SurfaceView API provides the live camera feed for the dashboard background.
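The render step has to decide which slice of the curtain base falls inside the live camera frame. A minimal sketch, given the device azimuth from the orientation sensors and an assumed 60-degree horizontal field of view (the real app would draw these bins over the SurfaceView feed):

```python
# Sketch: select the curtain-base bins inside the camera's current
# horizontal field of view, centred on the device heading.

def visible_bins(heading, fov=60, bins=360):
    """Return bin indices inside [heading - fov/2, heading + fov/2)."""
    half = fov // 2
    return [(int(heading) + d) % bins for d in range(-half, half)]
```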
 Select evaluation sites:
 1 urban site.
 1 rural site with variable terrain.
 Field tests:
 Validate AR curtain layer results with 2-D reference maps.
 Verify that the AR layer updates and renders accurately as the user moves.
 Record CPU usage when generating the AR layer for different DEM data sets.
 Record application response times to render the AR layer with different DEM data sets.
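Recording the response-time metric could be sketched as a small timing wrapper around the AR-layer computation, tagged with the DEM data set it was run against; `perf_counter` and the label scheme are generic choices, not the project's tooling:

```python
# Sketch: time a callable and tag the result with a DEM label so
# per-data-set response times can be compared later.
import time

def timed(label, fn, *args):
    """Run fn(*args); return (label, result, elapsed_seconds)."""
    t0 = time.perf_counter()
    result = fn(*args)
    elapsed = time.perf_counter() - t0
    return label, result, elapsed

# Stand-in workload; the app would time its AR-layer render instead.
label, result, elapsed = timed("NED 10M", sum, range(1000))
```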
Generate a comparison report using the collected metrics.
DEM data sets with better resolutions will result in more accurate LOP AR curtain results.
LIDAR first-return data vs. bare-earth DEM data will provide more accurate LOP results.
CPU utilization will increase when rendering the AR curtain using higher resolution DEM data.
Application response time is anticipated to lag as the user's ground speed increases.
In the short term, we will likely optimize algorithms for calculating the AR layer and managing onboard sensors.
In the long term, we will use pose prediction filters to manage sensor inputs [9] and consider using onboard GPUs for AR curtain base calculations.
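A pose-smoothing filter of the kind contemplated above is often approximated with a simple exponential low-pass over the sensor stream; the alpha value is an illustrative assumption, and real compass headings would additionally need wrap-around handling at 0/360 degrees:

```python
# Sketch: exponential low-pass filter over a stream of sensor
# readings, to damp jitter before rendering the AR curtain.

def low_pass(samples, alpha=0.25):
    """Exponentially smooth a sequence of sensor readings."""
    out, est = [], None
    for s in samples:
        est = s if est is None else est + alpha * (s - est)
        out.append(est)
    return out
```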
12/20/13 - 1/10/14: Complete LOP App Features
1. Add tiled base map for test sites.
2. Import DEM data sets to Cloud.
3. Implement Cloud GIS REST calls from mobile application.
4. Modify AR layer to show weighted gradient.
5. Add thread to re-calculate curtain base when the device's location changes.
6. Add drop pin for observer point.
7. Improve sensor pose filter.
1/11/14 - 1/15/14 and 1/22/14 - 2/17/14: Execute LOP App Field Tests
1. Collect field metrics.
2. Generate field results.
3/23/14 - 3/28/14: Develop and Submit
1. M. Dalla Mura, M. Zanin, C. Andreatta and P. Chippendale, "Augmented Reality: Fusing the Real and Synthetic Worlds," in IEEE International Geoscience and Remote Sensing Symposium (IGARSS 2012), July 2012, pp. 170-173.
2. S. Kim, S. DiVerdi, J.S. Chang, T. Kang, R. Iltis and T. Hollerer, "Implicit 3D Modeling and Tracking for Anywhere Augmentation," in Proceedings of the ACM Symposium on Virtual Reality Software and Technology (VRST 2007), Nov 2007, pp. 19-27.
3. S. Dong and V.R. Kamat, "Robust Mobile Computing Framework for Visualization of Simulated Processes in Augmented Reality," in Proceedings of the 2010 Winter Simulation Conference (WSC '10), 2010, pp. 3111-3122.
4. L. Baboud, M. Cadik, E. Eisemann and H.P. Seidel, "Automatic Photo-to-Terrain Alignment for the Annotation of Mountain Pictures," in 2011 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2011, pp. 41-48.
5. M. Dalla Mura and P. Chippendale, "Real-World DEM Harmonization Through Photo Re-Projection," in IEEE International Geoscience and Remote Sensing Symposium (IGARSS 2012), July 2012, pp. 428-430.
6. P. Chippendale, M. Zanin and C. Andreatta, "Spatial and Temporal Attractiveness Analysis Through Georeferenced Photo Alignment," in IEEE International Geoscience and Remote Sensing Symposium (IGARSS 2008), Vol. 2, pp. 1116-1119.
7. A. Beutel, T. Mølhave and P.K. Agarwal, "Natural Neighbor Interpolation Based Grid DEM Construction Using a GPU," in Proceedings of the 18th SIGSPATIAL International Conference on Advances in Geographic Information Systems, 2010, pp. 172-181.
8. J.M. Noguera, R. Segura, C. Ogayar and R. Joan-Arinyo, "A scalable architecture for 3D map navigation on mobile devices," Personal and Ubiquitous Computing, Vol. 17, Issue 7, pp. 1487-1502, Oct 2013.
9. L. Porzi, E. Ricci, T.A. Ciarfuglia and M. Zanin, "Visual-inertial tracking on Android for Augmented Reality applications," in IEEE Workshop on Environmental Energy and Structural Monitoring Systems (EESMS 2012), Sept 2012, pp. 35-41.
10. "Bringing the Networked Society to Life," Ericsson Annual Report 2012, 11 March 2013. Web. 07 Dec. 2013.
11. L. Vogel, "Android Camera API - Tutorial," vogella, 01 February 2013. Web. 10 Oct. 2013.
12. S. Conder and L. Darcey, "Augmented Reality: Getting Started on Android," Mobiletuts+, 18 January 2011. Web. 09 Nov.
13. Loudoun County Acquisition and Classification for FEMA Region 3 FY12 VA LIDAR, United States Geological Survey & Federal Emergency Management Agency. Available via William and Mary Center for Geospatial Analysis.
Rothera Point, Adelaide Island, Antarctica. ASTER (v2) Global DEM overlay. [5]
"Geospatial Power in Our Pockets" - ASPRS aptly named conference theme.
The future for visualizing more complex GIS data sets on the mobile platform is promising.
That power will come from harnessing the Cloud, the mobile device's improving computing power, and Augmented Reality technologies.
