3D Reconstruction Optimization Using Imagery Captured By Unmanned Aerial Vehicles

Abby Bassie, Sean Meacham, David Young, Gray Turnage, and Robert J. Moorhead
Geosystems Research Institute, Mississippi State University, Mississippi State, MS 39762-9627

Geosystems Research Institute Report 5068
March 2016

Introduction

Traditionally, piloted aircraft and satellites have been the primary platforms for obtaining remote images. While they have produced satisfactory imagery on regional and global scales, these platforms have struggled to provide adequate spatial and temporal resolutions for data acquisition at a local scale (Torres-Sánchez et al. 2013). Unmanned aerial vehicles (UAVs) are playing an increasingly important role in remote imagery acquisition. Because these vehicles can fly at low altitudes, UAVs can collect ultra-high spatial resolution imagery and can observe small individual plants and patches (Xiang and Tian 2011).

One way that UAV imagery is used to differentiate between small objects of similar size is three-dimensional (3-D) reconstruction. When a collection of images is taken over a test area, matching algorithms compare pixels between overlapping images. As matching pixels are found in multiple images, they are inserted into a three-dimensional point cloud. By correctly scaling these point clouds, accurate portrayals of objects can be digitally recreated in silico, and objects within these 3-D reconstructions can then be dimensionally measured. Because this technique is rising in popularity as a surveying tool, it is vitally important that researchers understand how to optimize the performance of UAVs and their associated camera payloads for 3-D reconstruction.
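To make the image-matching step concrete, the short Python sketch below detects and matches features between two overlapping aerial frames using OpenCV's ORB detector and a brute-force matcher. It is a minimal illustration only: the image file names are hypothetical, and the reconstruction in this study was performed with AgiSoft PhotoScan rather than with custom code.

    # Minimal sketch of matching points between two overlapping aerial images,
    # the first step in building a photogrammetric point cloud.
    # Requires OpenCV (opencv-python); the image file names are hypothetical.
    import cv2

    img1 = cv2.imread("frame_045.jpg", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("frame_046.jpg", cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=5000)          # detect up to 5000 keypoints per image
    kp1, des1 = orb.detectAndCompute(img1, None)  # keypoints and binary descriptors
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Brute-force Hamming matcher with cross-checking keeps only mutual best matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    print(len(matches), "candidate tie points between the two frames")
    # In a full pipeline these matched points are triangulated, after camera pose
    # estimation, into the 3-D point cloud that PhotoScan produces automatically.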

Methods

In this study, a Nikon RGB camera with a 10 mm lens was attached to a Precision Hawk Lancaster (version 4). A series of six flights was flown over a test plot at the R.R. Foil Plant Research Center (North Farm) at Mississippi State University (Figure 1), using no ground control points in order to simulate conditions in an inaccessible area. The test plot encompassed an area of 8.86 ha (21.89 ac) and contained nine predetermined reference objects (Figure 2). Selected dimensions of these reference objects were measured with a tape measure prior to the UAV flights. The first flight was conducted at an altitude of 45.72 m (150 ft.), and each subsequent flight altitude was increased by 15.24 m (50 ft.). Flight plans were developed to provide 70% frontal and lateral overlap between images; overlap greater than 70% can cause triggering issues in the shutter mechanism of the Nikon payload.

After each flight, the images were processed using AgiSoft PhotoScan, a stand-alone software product that uses photogrammetry to create 3-D point clouds from digital images. Point clouds from each flight scenario were then exported from AgiSoft as .ply files and imported into MeshLab, an open-source image processing software package. MeshLab was used for point cloud analysis in this study because the software contains a readily accessible measuring tool. To use this tool, a known dimension of an object in the point cloud must be used to set the point cloud's scale. In each flight scenario, the width of Reference Object 1 (x-direction) was used to set the scale of the point cloud for MeshLab measurement, and the other known dimensions of the reference objects were used to analyze the accuracy of MeshLab measurements (Table 1). After Reference Object 1 was used to set the scale of the point cloud, MeshLab's measuring tool was used to measure dimensions of the remaining reference objects.
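The scaling step performed in MeshLab amounts to multiplying every point in the cloud by the ratio of a known dimension to the same dimension measured in the unscaled model. The sketch below shows that arithmetic with NumPy; the point array and the measured width are hypothetical placeholders, since the actual scaling in this study was done interactively in MeshLab.

    # Minimal sketch of the scaling step carried out in MeshLab: every coordinate
    # is multiplied by (known dimension) / (dimension measured in the unscaled cloud).
    # The point array and measured width below are hypothetical illustration values.
    import numpy as np

    points = np.random.rand(100_000, 3)   # stand-in for the x, y, z columns of a .ply export

    KNOWN_WIDTH_CM = 149.86        # Reference Object 1 width (x-direction), Table 1
    measured_width_units = 2.31    # same width measured in the unscaled cloud (hypothetical)

    scale = KNOWN_WIDTH_CM / measured_width_units
    points_cm = points * scale     # uniform scaling; later measurements are now in cm

    # Example: distance between two picked points, as MeshLab's measuring tool reports.
    p, q = points_cm[0], points_cm[1]
    print(round(float(np.linalg.norm(p - q)), 2), "cm between the two picked points")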

These measurements were then compared to the actual dimensions of the reference objects, and the deviations are recorded as Δx, Δy, and Δz in Appendix A. The Δx, Δy, and Δz values at each altitude were then averaged to show the average deviation in each coordinate direction for each of the six flight scenarios (Table 2).

Results

While over 900 photos were required to obtain 70% overlap in the 45.72 m (150 ft.) scenario, only 156 photos were required to obtain the same image overlap at 121.92 m (400 ft.). Additionally, high-altitude flights significantly reduced AgiSoft computation time in comparison to lower-altitude flights: processing for the 45.72 m flight took around 30 hours, while processing for the 121.92 m flight took only approximately 5 hours (Table 2). Although flying at higher altitudes reduced processing time and energy costs relative to lower-altitude flights (Table 2), there was a substantial loss in spatial resolution at high altitudes. Only two measurements could be made in the point cloud generated at 121.92 m; most of the reference objects in that point cloud were too pixelated to clearly identify and measure. The reference objects in the point cloud generated from an altitude of 76.2 m (250 ft.), however, were much easier to identify and measure, and seven of the nine reference objects had measurable dimensions in this scenario (Appendix A). When the point clouds from the 60.96 m (200 ft.) and 45.72 m (150 ft.) flights were imported into MeshLab, the software crashed, which commonly happens when the size of the data set exceeds MeshLab's computational capabilities. In the future, images from the 60.96 m and 45.72 m flights will be processed using a reduced number of images.
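The sharp drop in photo count with altitude follows from simple footprint geometry: for a fixed overlap fraction and survey area, the number of images required scales roughly with the inverse square of flying height. The sketch below works through that estimate; the sensor dimensions are assumed values that are not given in this report, and flight-line turns and margins are ignored, so only the relative trend is meaningful.

    # Back-of-the-envelope estimate of image count versus altitude for a fixed
    # survey area and overlap fraction. Sensor dimensions are ASSUMED values,
    # not reported in this study; only the inverse-square trend is the point.
    FOCAL_LENGTH_M = 0.010                     # 10 mm lens (Methods)
    SENSOR_W_M, SENSOR_H_M = 0.0236, 0.0157    # assumed sensor size
    AREA_M2 = 8.86 * 10_000                    # 8.86 ha test plot
    OVERLAP = 0.70                             # 70% frontal and lateral overlap

    def estimated_images(altitude_m: float) -> float:
        # Pinhole model: ground footprint = altitude * sensor dimension / focal length.
        footprint_w = altitude_m * SENSOR_W_M / FOCAL_LENGTH_M
        footprint_h = altitude_m * SENSOR_H_M / FOCAL_LENGTH_M
        # With 70% overlap, each photo contributes only 30% new ground in each
        # direction, so the new area per photo shrinks by (1 - OVERLAP) squared.
        new_area_per_photo = footprint_w * footprint_h * (1 - OVERLAP) ** 2
        return AREA_M2 / new_area_per_photo

    counts = {alt: estimated_images(alt) for alt in (45.72, 60.96, 76.2, 91.44, 106.68, 121.92)}
    for alt, n in counts.items():
        print(f"{alt:6.2f} m needs about {n / counts[121.92]:.1f}x the images of the 121.92 m flight")
    # Table 2 shows the observed counts falling from 927 photos at 45.72 m to 156 at
    # 121.92 m (a factor of about 5.9), broadly consistent with this inverse-square trend.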

Discussion and Conclusions

While the 121.92 m (400 ft.) flight yielded the smallest average Δx and Δz, the spatial resolution of the point cloud generated at this altitude must be taken into account. Spatial resolution at this altitude was so poor that only two measurements could be obtained from the reference objects. This problem was resolved at the lower-altitude flights, where increased spatial resolution increased the number of measurable reference objects (Appendix A). This study suggests that the lower the altitude, the higher the spatial resolution of the resulting point clouds; however, MeshLab appears unable to handle datasets larger than approximately 360 to 400 images. Additionally, because the operator makes measurements on point clouds in MeshLab manually, a certain amount of operator error will always factor into the accuracy of measurements made with this tool. Careful refinement of the point cloud scale and precise measurements help keep operator error to a minimum.

In three of the scenarios flown in this study, MeshLab was able to measure object dimensions of 50.8 to 76.2 cm (20 to 30 in.) with greater than 93% accuracy (Appendix A). The largest average deviation from actual measurements in any flight scenario was 14.773 cm (5.816 in.). These results are satisfactory for a wide variety of research applications focused on differentiating objects using remote imagery. It is possible that MeshLab measurements could be further optimized by investigating variables outside the scope of this study; for example, MeshLab's measuring tool may be more accurate when measuring objects at a larger scale or when used with point clouds constructed with different image overlap settings. Because these refinements are important to optimizing UAVs as a platform for 3-D reconstruction, research investigations focused on these topics are currently underway.
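The accuracy figures quoted above can be checked as simple relative errors. The sketch below applies one plausible definition, accuracy = 1 - |deviation| / actual dimension, to a few values taken from Table 1 and Appendix A; the report does not state its exact accuracy formula, so this definition is an assumption made for illustration.

    # Relative-error check of MeshLab measurements, assuming
    # accuracy = 1 - |deviation| / actual dimension (an assumed definition).
    # Dimensions and deviations below are taken from Table 1 and Appendix A.
    samples = [
        # (object, dimension, actual_cm, deviation_cm, altitude_ft)
        ("Reference Object 1", "x", 149.86, +0.596, 400),
        ("Reference Object 2", "x", 78.74, +0.338, 250),
        ("Reference Object 6", "y", 66.04, -3.384, 250),
        ("Reference Object 7", "z", 80.01, -5.816, 350),
    ]

    for name, dim, actual, delta, alt in samples:
        accuracy = 1.0 - abs(delta) / actual
        print(f"{name} ({dim}), {alt} ft flight: accuracy {accuracy:.1%}")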

Literature Cited

Torres-Sánchez, J., F. López-Granados, A.I. De Castro, and J.M. Peña-Barragán. 2013. Configuration and Specifications of an Unmanned Aerial Vehicle (UAV) for Early Site Specific Weed Management. PLoS ONE 8(3): e58210. doi:10.1371/journal.pone.0058210. Accessed March 2016.

Xiang, H. and L. Tian. 2011. Development of a low-cost agricultural remote sensing system based on an autonomous unmanned aerial vehicle (UAV). Biosyst Eng 108(2): 174-190. http://www.sciencedirect.com/science/article/pii/s1537511010002436. Accessed March 2016.

Tables and Figures

Table 1. Reference objects and their actual dimensions. A dash (-) indicates a dimension that was not measured.

Reference Object #   x (cm)     y (cm)    z (cm)
1                    149.86     -         69.85
2                     78.74     -         54.61
3                     64.77     -         16.7132
4                     78.74     -         -
5                     57.4675   -         -
6                     66.04     66.04     -
7                     97.155    78.74     80.01
8                     78.74     -         -
9                     -         -         38.1
** The cap of Reference Object 1 was also used; its x dimension = 66.04 cm.

Table 2. Average deviation from actual measurements for each flight scenario.

Altitude (ft)   Δx (cm)   Δy (cm)   Δz (cm)   Area (ha)   Image Count   Processing Time (hours)
400              0.596     N/A       0.625     8.86         156            5
350              2        -0.872    -5.816     8.86         190            7
300              3.916     0.851    -0.733     8.86         225           10
250             -1.191    -3.384     1.971     8.86         360           13
200              N/A       N/A       N/A       8.86         533           20
150              N/A       N/A       N/A       8.86         927           30

Figure 1. Flight plan over the R.R. Foil Plant Research Center at Mississippi State University.

Figure 2. Reference objects at the R.R. Foil Plant Research Center at Mississippi State University.

Appendices

Appendix A. Deviation of MeshLab measurements from actual measurements of reference object dimensions. Values are positive (+) when the MeshLab measurement was larger than the actual measurement and negative (-) when it was smaller. All values are given in cm; measurements that could not be performed in MeshLab are marked with --. Column headings give flight altitude in feet.

Reference Object #   Dimension   150    200    250       300       350       400
1                    Δx          --     --     -1.204    +0.762    -1.27     +0.596
1                    Δz          --     --     +4.422    --        --        --
2                    Δx          --     --     +0.338    +2.91     -3.973    --
2                    Δz          --     --     -0.016    --        --        --
3                    Δx          --     --     -0.686    +4.62     +2.16     --
3                    Δy          --     --     --        +1.97     --        --
4                    Δx          --     --     -1.185    +7.612    +11.48    --
5                    Δx          --     --     -1.667    +0.523    +1.602    --
6                    Δx          --     --     --        --        --        --
6                    Δy          --     --     -3.384    -0.166    -0.872    --
7                    Δx          --     --     -2.391    --        --        +0.625
7                    Δy          --     --     --        +0.749    --        --
7                    Δz          --     --     +1.508    -0.733    -5.816    --
8                    Δx          --     --     --        --        --        --
9                    Δx          --     --     --        --        --        --