Assessing Drone Return-to-Home Landing Accuracy in a Woodland Landscape

Author Information
Arthur Temple College of Forestry and Agriculture, Stephen F. Austin State University, Nacogdoches, TX 75962, USA
* Authors to whom correspondence should be addressed.
Drones and Autonomous Vehicles 2024, 1 (3), 10005;  https://doi.org/10.35534/dav.2024.10005

Received: 15 March 2024 Accepted: 06 May 2024 Published: 08 May 2024


© 2024 by the authors; licensee SCIEPublish, SCISCAN co. Ltd. This article is an open access article distributed under the CC BY license (https://creativecommons.org/licenses/by/4.0/).

ABSTRACT: While aerial photography continues to play an integral role in forest management, the imagery can now be acquired with an unmanned aerial vehicle (UAV), commonly referred to as a drone, instead of a conventional manned aircraft. A drone can be programmed to take off, fly over an area along predefined paths while taking images, and then return to its home position automatically. Flying over forests requires an open space where a vertical-takeoff drone can take off and return safely. Hence, the automatic return-to-home feature of the drone is crucial when operating in a woodland landscape. In this project, we assessed return-to-home landing accuracy based on a permanently marked launch pad nested in a wooded area on the campus of Stephen F. Austin State University in Nacogdoches, Texas. We compared four models in the DJI drone line, each flown on 30 missions over multiple days under different weather conditions. When each drone returned to the home launch spot and landed, the distance and direction from the launch spot to the landing position were measured. Results showed that both the Phantom 4 Advanced and the Spark had superior landing accuracy, whereas the Phantom 3 Advanced was the least accurate, trailing the Phantom 4 Pro.
Keywords: UAV; Drones; Positional accuracy; Return-to-home


1. Introduction

Photogrammetry and aerial photo interpretation have been used in forest management since aerial photography became available to foresters. A single aerial photo frame was used for forest type classification and land area estimation, while a pair of aerial photos viewed through a stereoscope added the third dimension for tree height measurement [1]. This made timber volume estimation for an entire forest stand possible when photogrammetry was combined with ground sample measurements. Traditionally, aerial photos were captured with a camera mounted on a manned aircraft. Each mission was planned beforehand to cover a large area and could produce hundreds or thousands of photos. The photos used in this process have evolved from films and prints, either black-and-white, color, or color infrared, to digital images whose sensors can be specialized for particular ranges of the electromagnetic spectrum, such as blue, green, red, infrared, and thermal [2]. When the photos captured in a flight mission are stitched together, an orthorectified aerial photo covering the entire forest stand is produced for forest management purposes, including timber harvest planning, timber volume estimation, site preparation, seedling survival calculation, and land cover classification.
Advances in unmanned aerial vehicles (UAVs), commonly referred to as drones, have revolutionized aerial photography in forestry. Fardusi et al. [3] reviewed the practices of geospatial information tools, including UAVs, for forestry and concluded that modern technologies may contribute to higher precision in forest management and decision-making processes. Toth and Jóźków [4] reviewed the advancement in remote sensing platforms and sensors, including UAVs, and identified that platform georeferencing became dominant in remote sensing after the completion of the full GPS constellation. In their review of forestry remote sensing from UAVs, Guimarães et al. [5] stated that UAVs are a good alternative to traditional remote sensing platforms due to their high spatial and temporal resolutions, flexibility, and lower operational costs. Torresan et al. [6] identified that most UAV-based applications for forest research in Europe aimed to inventory resources, map diseases, classify species, monitor fire and its effects, quantify spatial gaps, and estimate post-harvest soil displacement. In higher education, drones have been integrated into natural-resource curricula [7]. Krause [8] studied UAV applications for intensive forest monitoring and concluded that UAV remote sensing will likely become a key component of future forest monitoring, but that coordinated efforts and standardization are essential for its effective integration. More recently, Fassnacht et al. [9] found that in some regions airborne light detection and ranging (LiDAR) as well as digital aerial photogrammetry (DAP) have emerged as indispensable technologies for mapping forest structure. Compared to manned aircraft missions, flying a drone over a forest stand for aerial photography is more cost and time effective [10]. In addition, each digital photo taken with a drone comes with GPS coordinates along with flight and camera/sensor information, which makes image processing possible with in-house software applications.
Through the photo mosaicking process, not only a georeferenced orthophoto mosaic is generated, but also a point cloud in 3D space, a digital surface model, and a digital terrain model in which the above-ground features are filtered out [11]. However, for drone operations in forestry, take-off and landing are critical because open space in a woodland landscape is often limited, and this open space, known as the drone's "home" position, should not be too far from the planned flight area in order to maintain communication between the drone and its controller. Hence, drone flight operations in forestry rely heavily on the precision and accuracy of a drone's built-in return-to-home function, which automatically flies the drone back to and lands it at its take-off position at the end of a planned mission, or whenever the return-to-home function is invoked in an emergency [12]. Several factors are involved in a drone's return-to-home process. One is the GNSS coordinates recorded at take-off [13]. These home coordinates are only as accurate as the GNSS receiver onboard the drone itself, unless a stationary base station communicating with the drone in real time is set up to increase its positional accuracy [14]. Given the same coordinates, a drone could position itself at varying locations within its normal accuracy range [15]. In addition, the drone can take photos of the take-off spot with its downward camera; through its image recognition algorithm, these photos serve as target images during the drone's final descent stage for fine-tuning its position. When the return-to-home routine is initiated, the drone climbs vertically to a height above ground set by the remote pilot beforehand, usually no lower than the planned mission's flight altitude, to avoid any obstacle in the return-to-home path. The drone then flies horizontally to the home position and begins its vertical descent. Some drone models are also equipped with sensors for collision prevention that play a role in the return-to-home process.
The objective of this research was to assess the landing accuracy of drones when a drone is called to return to its home position through its return-to-home feature. We evaluated the drones frequently used within the Arthur Temple College of Forestry and Agriculture at Stephen F. Austin State University (SFASU) in Nacogdoches, Texas. Four different models of DJI drones [13] were tested on the SFASU campus, located in the Pineywoods region of East Texas. The SFASU campus is recognized as a Tree Campus Higher Education by the Arbor Day Foundation [16]. Its urban forest setting resembles the environment one would encounter when operating drones in a woodland landscape, where drone take-off and landing must be carefully controlled.
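As a schematic illustration of the return-to-home sequence described above (our own sketch, not DJI's flight firmware; the function and field names and the 300 ft default are assumptions made only for this example), the three stages can be summarized as follows:

```python
from dataclasses import dataclass

# Schematic sketch of a generic return-to-home sequence: climb to the preset
# altitude, fly horizontally to the recorded home coordinates, then descend.
# This is an illustration only, not DJI's actual flight control logic.

@dataclass
class DroneState:
    east_ft: float      # horizontal offset from the home position (east)
    north_ft: float     # horizontal offset from the home position (north)
    altitude_ft: float  # height above the take-off point

def return_to_home(state: DroneState, rth_altitude_ft: float = 300.0) -> DroneState:
    # 1. Climb vertically to the return-to-home altitude, which should be set
    #    no lower than the mission flight altitude to clear obstacles.
    state.altitude_ft = max(state.altitude_ft, rth_altitude_ft)
    # 2. Fly horizontally back to the recorded home coordinates.
    state.east_ft, state.north_ft = 0.0, 0.0
    # 3. Descend vertically; a downward camera and image recognition may
    #    fine-tune the final position over the take-off spot.
    state.altitude_ft = 0.0
    return state
```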

2. Materials and Methods

The study area is located on the campus of Stephen F. Austin State University in Nacogdoches, Texas. A permanent launch pad was painted on a concrete surface as the home location for drones to take off and land. The launch pad was located at latitude 31.62413309°, longitude −94.64709953°, within a wooded area on the campus (Figure 1). The launch pad sat in an open space surrounded by pine and hardwood trees as tall as 110 ft. Four DJI drones were used for the project: a Phantom 3 Advanced, a Phantom 4 Advanced, a Phantom 4 Pro, and a Spark. For each flight, a drone took off and flew away under manual control. The return-to-home routine was then initiated for the drone to return to its take-off (home) position automatically. Since the maximum flight height was set at 300 ft, the drone would ascend vertically to this ceiling height when return-to-home was called, fly horizontally to the home coordinates, rotate to the same heading as at take-off, and descend vertically back to the launch pad. Figure 2 shows a sequence of downward views from a drone during the descent. Once the drone landed, the distance and direction from the center of the landed drone to the center of the launch pad were measured, and the weather conditions at the time of landing were recorded for analysis (Figure 1). This take-off and landing process was repeated 30 times for each of the four drones tested, spanning a 16-day period. The geographic coordinates of the launch pad were obtained with a sub-meter GNSS receiver. The coordinates of each drone landing position were determined through coordinate geometry (COGO), where each landing position was referenced to the launch pad position based on the measured distance and direction. All positions were referenced to UTM Zone 15, WGS 1984, and plotted on a map to show the spatial distribution of all the landing positions for each drone tested, with the trees in the surrounding area also shown.
Figure 1. The painted launch pad with a submeter GNSS receiver recording its geographic coordinates (left) and the launch pad with a weather meter in the foreground (right).
Figure 2. A sequence of downward views from a drone during the return-to-home descent process.
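The COGO step can be illustrated with a minimal Python sketch (not the authors' actual workflow; the pad coordinates and the distance/bearing values below are hypothetical), converting a field-measured distance and bearing from the launch pad into UTM coordinates for a landing position:

```python
import math

FT_TO_M = 0.3048  # field distances were measured in feet; UTM units are meters

def cogo_landing_position(pad_easting_m, pad_northing_m, distance_ft, bearing_deg):
    """Offset the launch pad coordinates by a distance and a bearing
    (degrees clockwise from grid north) to obtain the landing position."""
    d_m = distance_ft * FT_TO_M
    b = math.radians(bearing_deg)
    easting = pad_easting_m + d_m * math.sin(b)    # east component
    northing = pad_northing_m + d_m * math.cos(b)  # north component
    return easting, northing

# Hypothetical example: a landing 4.3 ft from the pad on a bearing of 45 degrees.
e, n = cogo_landing_position(343500.0, 3500200.0, 4.3, 45.0)
print(f"Landing position: {e:.2f} m E, {n:.2f} m N (UTM Zone 15N)")
```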

3. Data Analysis

To analyze the geographic distribution of the 30 landing positions of each drone, the Mean Center of each landing cluster was first calculated. The mean center is plotted from the mean x coordinate and the mean y coordinate of all 30 landing positions of the same drone and represents the geographic center of each cluster (Equation 1). The farther a mean center lies from the launch pad, the lower the return-to-home accuracy. Then, the Standard Deviational Ellipse was calculated to depict the dispersion of the landing positions of the same drone around its mean center, as well as the direction in which precision is most diluted (Equation 2). The larger the ellipse, the lower the landing precision. Finally, the Linear Directional Mean was calculated for each group of landing positions to depict the magnitude by which each drone was off from the launch pad, in both distance and direction (Equation 3). For each drone tested, we calculated the mean direction and mean distance of the group of vectors from the launch pad to the landing positions. A low mean distance indicates a high accuracy of the return-to-home process, and the associated test indicates whether the group of directions is uniformly distributed. All geographic distribution data were plotted on the map.

Mean Center

$$\bar{X} = \frac{\sum_{i=1}^{n} x_i}{n}, \qquad \bar{Y} = \frac{\sum_{i=1}^{n} y_i}{n} \tag{1}$$

where $x_i$ and $y_i$ are the coordinates for landing position $i$ and $n$ is the total number of landings for each drone tested.

Standard Deviational Ellipse

$$SDE_x = \sqrt{\frac{\sum_{i=1}^{n}\left(\tilde{x}_i\cos\theta - \tilde{y}_i\sin\theta\right)^2}{n}}, \qquad SDE_y = \sqrt{\frac{\sum_{i=1}^{n}\left(\tilde{x}_i\sin\theta + \tilde{y}_i\cos\theta\right)^2}{n}} \tag{2}$$

where the rotation angle $\theta$ of the ellipse satisfies

$$\tan\theta = \frac{\left(\sum_{i=1}^{n}\tilde{x}_i^2 - \sum_{i=1}^{n}\tilde{y}_i^2\right) + \sqrt{\left(\sum_{i=1}^{n}\tilde{x}_i^2 - \sum_{i=1}^{n}\tilde{y}_i^2\right)^2 + 4\left(\sum_{i=1}^{n}\tilde{x}_i\tilde{y}_i\right)^2}}{2\sum_{i=1}^{n}\tilde{x}_i\tilde{y}_i}$$

where $x_i$ and $y_i$ are the coordinates for each landing position $i$, $(\bar{X}, \bar{Y})$ represents the Mean Center for the positions of the same drone, $\tilde{x}_i = x_i - \bar{X}$, $\tilde{y}_i = y_i - \bar{Y}$, and $n$ is the total number of landings for each drone.

Linear Directional Mean

$$\bar{\theta} = \arctan\frac{\sum_{i=1}^{n}\sin\theta_i}{\sum_{i=1}^{n}\cos\theta_i} \tag{3}$$

where $\theta_i$ are the directions of the vectors from the launch pad to the landing positions of the same drone. To test whether any drone achieved significantly higher accuracy than the others, an analysis of variance (ANOVA) was conducted on the distance errors, the distances from a landing position to the launch pad. The level of significance was set at 0.05. A lower mean distance from the landing position to the launch pad indicates a higher return-to-home accuracy.
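A minimal Python sketch of Equations 1–3 is given below (an illustrative implementation under the assumption of planar coordinates, such as UTM easting/northing in common units, and bearings measured clockwise from north; it is not the GIS software actually used for the analysis):

```python
import numpy as np

def mean_center(x, y):
    """Equation 1: geographic center of one drone's landing positions."""
    return x.mean(), y.mean()

def standard_deviational_ellipse(x, y):
    """Equation 2: axis standard deviations and rotation angle of the ellipse."""
    xc, yc = mean_center(x, y)
    dx, dy = x - xc, y - yc
    # Rotation angle of the ellipse axes (quadrant handled by arctan2).
    a = (dx**2).sum() - (dy**2).sum()
    b = np.sqrt(a**2 + 4.0 * (dx * dy).sum()**2)
    c = 2.0 * (dx * dy).sum()
    theta = np.arctan2(a + b, c)
    # Standard deviations along the rotated axes.
    n = len(x)
    sde_x = np.sqrt(((dx * np.cos(theta) - dy * np.sin(theta))**2).sum() / n)
    sde_y = np.sqrt(((dx * np.sin(theta) + dy * np.cos(theta))**2).sum() / n)
    return sde_x, sde_y, np.degrees(theta)

def linear_directional_mean(bearings_deg):
    """Equation 3: mean direction of the pad-to-landing vectors (degrees)."""
    t = np.radians(bearings_deg)
    return np.degrees(np.arctan2(np.sin(t).sum(), np.cos(t).sum())) % 360.0

# Hypothetical example: five landing positions given as ft offsets from the pad.
x = np.array([0.5, 1.2, -0.3, 2.0, 0.8])   # east offsets
y = np.array([0.4, 0.9, 0.2, 1.5, -0.1])   # north offsets
print(mean_center(x, y))
print(standard_deviational_ellipse(x, y))
print(linear_directional_mean(np.degrees(np.arctan2(x, y)) % 360.0))
```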

4. Results and Discussion

4.1. General Landing Positions

When all the landing positions were plotted, the Phantom 4 Advanced and Spark had their landing positions clustered at the launch pad, while those of the Phantom 3 Advanced and Phantom 4 Pro were more widely spread. However, the Phantom 3 Advanced and Phantom 4 Pro appeared to avoid obstacles such as trees when descending, as their landing positions were located more to the northeast of the launch pad, where the surrounding trees are farther from the launch pad (Figure 3). This implies that a collision-prevention function was in action, even though these drones' landing precision and accuracy were not as good as those of the other drones tested.
Figure 3. All drone landing positions centered on the launch pad in relation to the surrounding trees.
4.2. Mean Center and Directional Ellipse

The mean center represents the geographic center of a group of landing positions, while the directional ellipse measures the spread and orientation of those positions. The landing positions of the Phantom 4 Advanced are the most clustered (ellipse area = 1.60 sq ft), with its mean center located very close to the launch pad (<1.0 ft), while those of the Phantom 3 Advanced are the most spread (ellipse area = 376.50 sq ft), with its mean center 8.1 ft northeast of the launch pad (Table 1 and Figure 4). The two more widely spread drones, the Phantom 3 Advanced and Phantom 4 Pro, had their landing positions oriented in a northwest-southeast direction, while the Phantom 4 Advanced and Spark, which had more clustered landing positions, were oriented northeast-southwest (Figure 5). The standard distances and ellipse areas quantify the landing precision and accuracy of each drone, showing that the Phantom 4 Advanced and Spark performed better than the Phantom 3 Advanced and Phantom 4 Pro. These statistics echo what was observed when the landing positions were viewed visually.
Table 1. Mean center and directional ellipse statistics of the landing coordinates by the four drones tested.
Figure 4. Landing positions of the four drones with the directional ellipse of each drone tested.
Figure 5. Landing positions of the drones and directional ellipses, focusing on the Phantom 4 Advanced and Spark.
4.3. Linear Directional Mean

When both the distance and the direction of the errors were considered as a vector off from the home position, it was reconfirmed that the Phantom 4 Advanced was the most accurate, with a mean distance error of 0.62 ft, while the Phantom 3 Advanced was the least accurate, with a mean distance error of 12.92 ft. Both the Phantom 4 Advanced and the Phantom 4 Pro were found to be uniform in their landing directions from the launch pad, while the other two drones were directional, with both the Phantom 3 Advanced (p < 0.0001) and the Spark (p = 0.0434) concentrated in the northeast direction (Table 2, Figure 6, and Figure 7). Although the Phantom 4 Advanced and Phantom 4 Pro had very different landing accuracy, with mean distance errors of 0.62 ft vs. 4.27 ft, their landing positions were evenly distributed around the launch pad in terms of direction. For the Phantom 3 Advanced, with the largest mean distance error and a mean error direction concentrated to the northeast, the result reflects the earlier observation that there are fewer trees in that direction from the launch pad (Figure 3).
Table 2. Linear directional mean statistics of the landing positions by the four drones tested.
Figure 6. The linear directional mean and vector for each of the four drones tested.
Figure 7. The linear directional mean and vector, focusing on the Phantom 4 Advanced and Spark.
4.4. Analysis of Variance

The analysis of variance (ANOVA) tested whether any drone achieved significantly higher accuracy than the others based on the distances from the landing positions to the launch pad. The resulting p-value (<0.001) showed a statistically significant difference. The subsequent Tukey test ranked the Phantom 4 Advanced and Spark as the most accurate, followed by the Phantom 4 Pro, with the Phantom 3 Advanced the least accurate (Table 3). The form factor of a drone does not seem to have played an important role, as the Spark is smaller than the three Phantom drones yet its landing accuracy is on par with that of the Phantom 4 Advanced. Since the Phantom 3 Advanced is an older model in the DJI Phantom series, it is reasonable that it had a lower landing accuracy than the Phantom 4 drones, which came with more advanced technology. However, it is surprising that the Phantom 4 Pro did not achieve the same level of accuracy as the Phantom 4 Advanced. This could be attributed to the fact that, on average, this drone spent more time in the air during the test before the return-to-home routine was initiated. This prolonged airborne time might have diluted its return-to-home precision and accuracy.
Table 3. Summary statistics, ANOVA, and Tukey test on the distances (ft) off from the launch pad categorized by the four drones tested.
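For reference, the following sketch reproduces the type of analysis summarized in Table 3 using SciPy and statsmodels; the error distances below are randomly generated placeholders, not the study data.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# One-way ANOVA on distance errors followed by a Tukey HSD test.
# The values are hypothetical placeholders (ft), not the measured data.
rng = np.random.default_rng(0)
errors = {
    "Phantom 4 Advanced": rng.gamma(2.0, 0.3, 30),
    "Spark":              rng.gamma(2.0, 0.4, 30),
    "Phantom 4 Pro":      rng.gamma(2.0, 2.0, 30),
    "Phantom 3 Advanced": rng.gamma(2.0, 6.0, 30),
}

f_stat, p_value = stats.f_oneway(*errors.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

distances = np.concatenate(list(errors.values()))
groups = np.repeat(list(errors.keys()), 30)
print(pairwise_tukeyhsd(distances, groups, alpha=0.05))
```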
4.5. Influence of Weather Conditions

The overall weather condition (clear/partly cloudy/cloudy) did not appear to play a role in the return-to-home accuracy (Figure 8). Higher wind speed measured at the time of landing did reduce the accuracy of the Phantom 3 Advanced and Phantom 4 Advanced, but the effect was inconsistent for the Phantom 4 Pro and Spark (Figure 9). Lower air temperature was also observed to reduce accuracy for all tested drones except the Phantom 4 Pro (Figure 10). No extreme weather conditions were observed during the test period, and drone flights are typically not conducted under extreme weather. However, a wind gust at the time of landing may be beyond the pilot's control. Although wind speed was not found to be significant in this study, caution should be exercised regarding weather conditions when planning a mission over a forested area.
Figure 8. Average error distance from the launch pad by each drone tested under different weather conditions.
Figure 9. Average error distance from the launch pad by each drone tested under different wind speed categories.
Figure 10. Average error distance from the launch pad by each drone tested under different temperature categories.

5. Conclusions

The Phantom 4 Advanced and the Spark had significantly higher return-to-home landing accuracy than the Phantom 4 Pro and the Phantom 3 Advanced, owing to the better technology used in the newer models of the DJI product line. While having the lowest landing accuracy, the Phantom 3 Advanced had its landing positions concentrated to the northeast of the launch pad, which is also the area with the lowest tree density. Wind speed at the time of landing did not play a role in landing accuracy, whereas lower temperatures were observed to reduce accuracy. With user-friendly flight control interfaces, versatile mission planning applications, and affordable image processing packages (some even free, such as WebODM [17]), capturing aerial photos with a drone has become commonplace in many fields. However, when operating a drone in a woodland landscape, investing in a newer drone model assures higher return-to-home accuracy, allowing the drone to return to the launch position safely. For precision landing in unstructured environments, Pluckter and Scherer [18] proposed adding a fisheye lens to the drone. Automated landing systems using lights have achieved 4 cm landing accuracy, increasing landing precision and enabling automated charging [19], and demonstrating that landing accuracy can be enhanced by vision-based autonomous landing [20]. To further increase landing safety, AI-based image recognition can be used with the onboard cameras, which identify landing points similar to those in Figure 2 and use three-dimensional coordinates to avoid obstacles [21]. Using a drone's visible-light camera sensor, a LightDenseYOLO algorithm was developed for autonomous drone landing [22], and ground-based markers have been proposed to improve the accuracy of drone landings [23]. The best practice is to maintain visual line of sight to the drone, particularly during the return-to-home phase. If something unexpected happens, the drone pilot should take over manual control to bring the drone home safely. The relative accuracy of the landings in our study reflects the inherent technology of the drones investigated, and adoption of the proposed landing methods would enhance the accuracy of the return-to-home feature. Future research could include additional factors such as RTK (real-time kinematic) positioning capability for the drone.

Acknowledgments

The authors thank the GIS Laboratory in the Arthur Temple College of Forestry and Agriculture at Stephen F. Austin State University for their technical support.

Author Contributions

Conceptualization, I.-K.H., D.K. and D.U.; Methodology, I.-K.H. and D.U.; Validation, I.-K.H., D.U. and Y.Z.; Formal Analysis, I.-K.H.; Investigation, I.-K.H.; Resources, I.-K.H.; Data Curation, I.-K.H., D.U. and Y.Z.; Writing—Original Draft Preparation, I.-K.H.; Writing—Review & Editing, I.-K.H., D.K., D.U. and Y.Z.; Visualization, I.-K.H.; Supervision, D.K.; Project Administration, D.K.; Funding Acquisition, I.-K.H., D.K., D.U. and Y.Z.

Ethics Statement

Not applicable.

Informed Consent Statement

Not applicable.

Funding

This research was supported by McIntire Stennis Capacity grant no. NI18MSCFRXXXG012/project accession nos. 1004705, 1004707, 1011115, 1011116 from the USDA National Institute of Food and Agriculture.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

1. Campbell JB, Wynne RH. Introduction to Remote Sensing, 5th ed.; Guilford Press: New York, NY, USA, 2011.
2. Paine DP, Kiser JD. Aerial Photography and Image Interpretation, 3rd ed.; Wiley: Hoboken, NJ, USA, 2012.
3. Fardusi M, Chianucci F, Barbati A. Concept to practice of geospatial-information tools to assist forest management and planning under precision forestry framework: A review. Ann. Silvic. Res. 2017, 41, 3–14.
4. Toth C, Jóźków G. Remote sensing platforms and sensors: A survey. ISPRS J. Photogramm. Remote Sens. 2016, 115, 22–36.
5. Guimarães N, Pádua L, Marques P, Silva N, Peres E, Sousa JJ. Forestry remote sensing from unmanned aerial vehicles: A review focusing on the data, processing and potentialities. Remote Sens. 2020, 12, 1046.
6. Torresan C, Berton A, Carotenuto F, Di Gennaro SF, Gioli B, Matese A, et al. Forestry applications of UAVs in Europe: A review. Int. J. Remote Sens. 2017, 38, 2427–2447.
7. Unger DR, Kulhavy DL, Hung I, Zhang Y, Stephens Williams P. Integrating drones into a natural-resource curriculum at Stephen F. Austin State University. J. For. 2019, 117, 398–405.
8. Krause S. UAV Applications for Intensive Forest Monitoring. PhD Dissertation, Rheinische Friedrich-Wilhelms-Universität Bonn, Bonn, Germany, 2024.
9. Fassnacht FE, White JC, Wulder MA, Næsset E. Remote sensing in forestry: Current challenges, considerations and directions. For. Int. J. Forest Res. 2024, 97, 11–37.
10. Pritt MD. Fast orthorectified mosaics of thousands of aerial photographs from small UAVs. In Proceedings of the IEEE Applied Pattern Recognition Workshop (AIPR), Washington, DC, USA, October 2014.
11. Wolf P, DeWitt B, Wilkinson B. Elements of Photogrammetry with Application in GIS, 4th ed.; McGraw Hill: New York, NY, USA, 2014.
12. Warren M, Greeff M, Patel B, Collier J, Schoellig AP, Barfoot TD. There's no place like home: Visual teach and repeat for emergency return of multirotor UAVs during GPS failure. IEEE Robot. Autom. Lett. 2018, 4, 161–168.
13. DJI. Available online: https://www.dji.com/ (accessed on 14 March 2024).
14. Magiera W, Vārna I, Mitrofanovs I, Silabrieds G, Krawczyk A, Skorupa B, et al. Accuracy of code GNSS receivers under various conditions. Remote Sens. 2022, 14, 2615.
15. Hung I, Unger D, Kulhavy D, Zhang Y. Positional precision analysis of orthomosaics derived from drone captured aerial imagery. Drones 2019, 3, 46.
16. Arbor Day Foundation. Available online: https://www.arborday.org/programs/tree-campus-higher-education/#recognizedSection (accessed on 12 March 2024).
17. OpenDroneMap. Available online: https://www.opendronemap.org/webodm/ (accessed on 12 March 2024).
18. Pluckter K, Scherer S. Precision UAV landing in unstructured environments. In Proceedings of the 2018 International Symposium on Experimental Robotics, Buenos Aires, Argentina, 5–8 November 2018; SPAR, volume 11, pp. 177–187.
19. Alshbatat AIN, Moath A. Vision-based autonomous landing and charging systems for a hexacopter drone. J. Eur. Syst. Autom. 2024, 57, 225–237.
20. Kumar A. Real-time performance comparison of vision-based autonomous landing of quadcopter on a ground moving target. IETE J. Res. 2023, 8, 5455–5472.
21. Lee S, Jo D, Kwon Y. Camera-based automatic landing of drones using artificial intelligence image recognition. Int. J. Mech. Eng. Rob. Res. 2022, 11, 357–364.
22. Nguyen PH, Arsalan M, Koo JH, Naqvi RA, Truong NQ, Park KR. LightDenseYOLO: A fast and accurate marker tracker for autonomous UAV landing by visible light camera sensor on drone. Sensors 2018, 18, 1703.
23. Wubben J, Fabra F, Calafate CT, Krzeszowski T, Marquez-Barja J, Cano J-C, et al. Accurate landing of unmanned aerial vehicles using ground pattern recognition. Electronics 2019, 8, 1532.