Review

Recent Advances in Unmanned Aerial Vehicle Forest Remote Sensing—A Systematic Review. Part I: A General Framework

by Riccardo Dainelli, Piero Toscano, Salvatore Filippo Di Gennaro * and Alessandro Matese
Institute of BioEconomy, National Research Council, Via Caproni 8, 50145 Florence, Italy
* Author to whom correspondence should be addressed.
The two authors contributed equally to this work.
Forests 2021, 12(3), 327; https://doi.org/10.3390/f12030327
Submission received: 12 January 2021 / Revised: 6 February 2021 / Accepted: 8 March 2021 / Published: 11 March 2021
(This article belongs to the Special Issue Forestry Applications of Unmanned Aerial Vehicles (UAVs) 2020)

Abstract
Natural, semi-natural, and planted forests are a key asset worldwide, providing a broad range of positive externalities. For sustainable forest planning and management, remote sensing (RS) platforms are rapidly going mainstream. In a framework where scientific production is growing exponentially, a systematic analysis of unmanned aerial vehicle (UAV)-based forestry research papers is of paramount importance to understand trends, overlaps, and gaps. The present review is organized into two parts (Part I and Part II). Part II inspects specific technical issues regarding the application of UAV-RS in forestry, together with the pros and cons of different UAV solutions and activities where additional effort is needed, such as technology transfer. Part I systematically analyzes and discusses general aspects of applying UAVs in natural, semi-natural, and artificial forestry ecosystems in the recent peer-reviewed literature (2018–mid-2020). The specific goals are threefold: (i) create a carefully selected bibliographic dataset that other researchers can draw on for their scientific works; (ii) analyze general and recent trends in RS forest monitoring; and (iii) reveal gaps in the general research framework where additional activity is needed. Through double-step filtering of research items found in the Web of Science search engine, the study gathers and analyzes a comprehensive dataset (226 articles). Papers have been categorized into six main topics, and the relevant information has been subsequently extracted. The strong points emerging from this study concern the wide range of topics addressed in the forestry sector and, in particular, the retrieval of tree inventory parameters, often through Digital Aerial Photogrammetry (DAP), RGB sensors, and machine learning techniques. Nevertheless, challenges still exist regarding the promotion of UAV-RS in specific parts of the world, mostly in tropical and equatorial forests. Much additional research is required for the full exploitation of hyperspectral sensors and for planning long-term monitoring.

1. Introduction

1.1. A First Look at the Use of Drones in the Forestry Sector

Natural, semi-natural, and planted forests are a key asset worldwide, providing a broad range of positive externalities. These benefits fall into three main categories: goods (timber, food, fuel, and bioproducts), ecosystem services (carbon storage, nutrient cycling, water, air quality, and wildlife habitat), and social and cultural features (recreation, traditional resource uses, and well-being) [1]. In this context, sustainable forest planning and management require understanding both short- and long-term woodland dynamics [2]; furthermore, a modernization of forest inventory frameworks is needed, driven by the ongoing uncertainty about the future condition of forests under a changing climate [3]. Ordinary inventory operations require the collection of field data with labor-intensive, time-consuming, and, no less important, increasingly expensive acquisition procedures. Moreover, field campaigns are restricted to small areas, so the number of field inventories that can reasonably be completed is drastically limited [2]. For the adoption of precision forestry practices, promptness is a key requirement, especially when forest structure is changing in ways that are hard to predict due to pressure from biotic or abiotic factors [3].
Remote sensing (RS) platforms, such as unmanned aircraft systems (UAS), satellites, and airplanes fitted with dedicated sensors, are rapidly going mainstream. They are still being refined for the full optimization of forest management, among other sectors, and their relevance for decision support is becoming crucial for forestry managers, entrepreneurs, and researchers [4]. RS provides data at different resolutions in terms of space, spectral band, and time, allowing forest modeling under different conditions and for various management purposes (economic, monitoring, conservation, restoration). Unlike traditional field-based inventories, the full coverage often guaranteed by RS platforms provides data on many primary forestry parameters [1]. Nevertheless, RS applications for forestry often require images with a high temporal resolution [5]. Considering the traditional airborne and spaceborne RS platforms, the spatial and temporal resolutions provided by satellite-based data are usually not suited to achieving regional or local forestry objectives, while aircraft, even if their products have a more appropriate spatial scale, are expensive when regular time-series monitoring is desired [6]. Moreover, data from manned aircraft and satellite platforms are vulnerable to cloudy sky conditions, which attenuate electromagnetic waves and cause information loss and data degradation [7]. Drones (hereafter called unmanned aerial vehicles—UAVs), equipped with GPS and digital cameras, are suitable for real-time applications, inasmuch as they combine high spatial resolution and quick turnaround times with lower operational costs [8]. Thanks to their flexibility of use, UAVs are becoming one of the emerging technological tools, with broad prospects and increasing applicability [9]; therefore, for precision forestry applications, especially at the local scale, they surpass traditional RS platforms. It is also important to note that recent UAV advances, along with computer vision and other related research topics, have created many opportunities for practical forestry by facilitating and improving field data collection in terms of temporal and spatial accuracy, with the possibility of creating customized datasets according to specific needs [10].
The major drawbacks in using UAVs rather than other RS platforms are represented by generic technical issues that are not related to the inner features of forests. To the best of the authors’ knowledge, only Surový and Kuželka [10] report that the effective extent of detailed UAV data is limited to several forest stands, because high-resolution, high-frequency data cannot be efficiently acquired over the whole extent of a very large forest. In general, the main disadvantages of UAV flights are imposed by battery duration and therefore small area coverage, payload weight [11], and sensitivity to some bad weather conditions (i.e., wind, precipitation, and sudden, sharp variations in light conditions) [4,12]. In the post-flight workflow, UAV imagery products require massive data processing capability [1], often with a combination of robust image processing software and sophisticated machine learning systems; all this results in substantial computational requirements and therefore high expense in terms of money and time [13]. Current limitations on UAV activity are also imposed by policy and regulations (restrictions on airspace use). This is one of the major factors that prevent researchers from testing all of the possibilities for UAV civil applications [7]. Despite the critical issues listed, the advantages of using UAVs instead of other RS platforms far outweigh the drawbacks. If used appropriately and combined with ground surveys and local knowledge, UAVs can constitute a valuable tool in monitoring and mapping forests, especially over small areas, responding to the growing need for more accurate data [14]. In recent years, UAVs have been recognized as an effective complement to traditional vehicles due to their economy, safety, maneuverability, positioning accuracy, high spatial resolution, and data acquisition on demand [5,7]. UAV imagery, due to its extremely high achievable spatial resolution (fixed-wing: up to 2 cm/pixel; rotary: sub-millimeter), is a cost-effective data source for providing detailed reference information [15], especially for research projects or service-based businesses with a tight budget. UAVs can carry a wide range of task-oriented sensors [16,17] whose operation is not affected by clouds, thanks to the low flight altitude [4]. UAV missions can be planned flexibly, avoiding poor weather conditions, providing data availability on demand, and enhancing temporal resolution [6]. The availability of UAV imagery in near real-time (NRT) is another feature that can help agroforestry operations, due to the possibility of identifying problems faster and, consequently, reacting quickly, reducing losses and, in the case of professional foresters, economic outlays [4]. UAVs can thus be used in real-time operations, for example in wildfire detection using thermal sensors [6]. Regarding academia, the use of UAVs allows researchers to acquire complex imagery (i.e., hyperspectral) themselves and with higher frequency than in the past, when specialized companies provided all the airborne imagery [10]. Furthermore, thanks also to constant technological development, UAV costs in terms of material and operational charges are diminishing, while processing capabilities and dedicated artificial intelligence (AI) algorithms, i.e., machine learning, improve [13]. Finally, UAVs can save time, manpower, and financial resources for practitioners, public authorities, and researchers [6].
Given all the aforementioned advantages, interest in UAVs has been increasing and this technology has become a focus of research [5].

1.2. UAV in Forest Remote Sensing: Research Topics, Vehicle Type, and Sensors

The use of smart and low-cost tools such as UAVs in precision forestry has increased exponentially in recent years, as demonstrated by the large number of papers published from 2018 until mid-2020; more than 600 references were found when searching for “UAV” + “forest” and considering articles, systematic reviews, conference proceedings, and books [18]. Summarizing the results of some recent review papers [4,5,13,19], the range of UAV academic topics investigated by researchers is very wide and also involves the use of UAVs together with other RS platforms. Two main clusters of research topics can be outlined: (i) estimation of dendrometric parameters and (ii) monitoring and conservation activities. Regarding the estimation of basic physiological features, UAVs have been used with varying levels of success in 2D and 3D mapping applications such as individual tree detection or estimation of tree height, leaf area index (LAI), chlorophyll content, and tree crown dimension and location. Within this cluster, a subgroup of topics regarding the estimation of derived dendrometric parameters can also be connected directly with practical forestry activities: monitoring growth status, measuring plant density, merchantable biomass estimation, species identification, inspection of forestry operations, disease detection and management, and post-harvest data (i.e., forest stockpile measurement, truckload soil structure degradation). The second cluster includes the use of UAVs for the monitoring, conservation, and restoration activities required by the impact of climate change on forests and by changes in the level of woodland biodiversity. In particular, from an ecological point of view, UAVs are used for the mapping and control of weed vegetation and invasive alien species; for deforestation monitoring and estimation of deforestation rates through the identification of gaps; and for forest wildfire management and detection, especially for prevention and post-fire monitoring, by identifying and constructing risk maps and supporting suppression operations. UAVs and their sensors are also applied to quantify aboveground biomass (AGB) and monitor it over time to assess the impacts of climate and land-use changes on the global carbon cycle of forest ecosystems and to understand their effects on woodland resilience and health. Furthermore, UAVs can track wildlife, detect woodland land-use change, monitor legal restrictions, and, in general, monitor habitats that are difficult to reach (wetlands, rock faces, coastal ecosystems) or where trespassing is undesirable [4,5,13,19].
UAVs employed for forestry applications and reviewed in this study are classified according to their size as small, mini, and micro vehicles and can also be categorized on the basis of wing type. Generally, with regard to civilian usage, UAVs can be rotor-based, fixed-wing, or hybrid solutions (Figure 1). They are user-friendly platforms with a take-off mass ranging from a few tens of grams up to 25 kg and over [5] and with flight autonomy ranging from a couple of minutes to a few hours [4]. Fixed-wing platforms are adequate for monitoring larger areas with a pre-defined flight plan but need space for landing [1]. On the other hand, multi-rotor UAVs are more maneuverable, with easier take-off and landing, and are preferred by researchers because they are usually cheaper and more flexible for demonstrative/scientific studies [4].
Progress in sensor implementation has particularly concerned miniaturization, to lessen the payload to be transported. This big step forward has made it possible to equip UAVs with a broad range of sensors and tools: high-resolution digital cameras, infrared/thermal cameras, multispectral cameras (passive sensors), lidar—Laser Imaging, Detection, and Ranging (an active sensor)—as well as chemical instrumentation for sensing Volatile Organic Compounds (VOCs) [20,21,22] and tools for distributing agro-forestry products [23]. The possibility of equipping UAVs with a wide choice of sensors, in association with georeferencing systems, has supported forestry scientific research by monitoring anomalies and trends which, if observed by traditional means from the ground, would have been less evident [24].
RS applications very often work with the individual red, green, and blue (RGB) channels, which provide valuable visual information for foresters [25]. For assessing vegetation properties, however, information obtained in the NIR region (from 750 nm to approximately 1400 nm), where the high reflectance of vegetation occurs, should be used. Indeed, the NIR band is crucial for most forestry applications precisely because healthy vegetation that is actively growing and producing energy from photosynthesis reflects more in the NIR region [4]. Thermal cameras operate at wavelengths from approximately 8000 to 15,000 nm and detect electromagnetic radiation that can be experienced as heat. Thanks to these sensors, the pixel’s digital number can be transformed into a temperature measurement [26]. Multispectral sensors sense broad bands, usually 4–12, and are extensively used for vegetation analysis, given that they often include NIR together with multiple other bands (e.g., R, G, B, red edge) [27]. The hyperspectral sensor is gradually becoming more common in UAV forestry applications. It has a very high spectral resolution, sensing hundreds of bands as narrow as 2 nm in wavelength. Hyperspectral sensors produce images in which each pixel contains the whole spectrum of the sensed wavelengths. In this way, they provide richer image information than all the other passive devices [28], but a few drawbacks, such as high operating costs and complex equipment, are inevitable with today’s technology [13]. Lidar is an active laser-based remote-sensing technology that transmits focused laser light pulses to the surface at a fast repeat rate. It measures the time taken for the reflections to be detected by the sensor (transmitter–target–receptor) to determine the distance to targets (objects, surface). By repeating this process in a fast sequence, lidar generates a 3D point cloud of the surface [4,13].
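The time-of-flight principle can be pictured with a minimal sketch, shown below in Python; the pulse timing value and constant names are illustrative assumptions, not parameters of any particular lidar unit.

```python
# Minimal sketch of the lidar time-of-flight principle described above:
# the sensor measures the round-trip time of a laser pulse and converts it
# to range as distance = c * t / 2. The numbers are illustrative only.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def pulse_range(round_trip_time_s: float) -> float:
    """Distance to the target from the measured round-trip pulse time."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A return detected 0.67 microseconds after emission corresponds to a target
# roughly 100 m away, e.g., the ground seen from a low-flying UAV.
print(f"{pulse_range(0.67e-6):.1f} m")  # ~100.4 m
```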
Regarding the use of UAV sensors in forestry applications, RGB sensors are suitable for estimating Fractional Vegetation Cover (FVC) [29], finding features within a certain area (e.g., tree crown size estimation) [30], and detecting invasive species [31]. Through reflectance analysis with wavebands outside the visible spectrum, multispectral, infrared, and hyperspectral sensors are instead most suitable for estimating LAI [32] and for identifying the presence/absence of certain components or materials (e.g., disease [33,34] and water stress detection [35]). Thermal sensors are used effectively to detect water stress [36], in forest fire monitoring [37], and in wildlife detection [38], while multispectral sensors can be applied to determine burned areas in a post-fire scenario [39]. Lidar sensors can provide accurate measurements targeting land objects for an effective forest inventory, being capable of gathering data below the canopy as well [40]. By penetrating the forest canopy, lidar is a powerful tool for the direct 3D measurement of various tree attributes, even at a fine-grained scale [41]. It can be used for individual tree detection and crown delineation [42,43,44], for retrieving inventory parameters such as diameter [45], height [46], and biomass [47], and for predicting carbon dynamics [48]. Lidar metrics are both ecologically meaningful and management-relevant, as demonstrated by their applications in overstory characterization [49], forest restoration [50], wildfire prevention [51], and post-fire monitoring [52]. Nevertheless, a careful comparison must always be made with optical sensors considering the relatively high cost of lidar technology [53,54]. For instance, to determine vegetation height, optical sensors are a plausible choice, because they offer the possibility of 3D image reconstruction using Structure from Motion (SfM) algorithms and, at the same time, they are more cost-effective than lidar [4].
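Whether the 3D data come from lidar or from SfM photogrammetry, vegetation height is commonly obtained by differencing two gridded products; the following minimal numpy sketch assumes the point cloud has already been interpolated into a DSM and a DTM, and the tiny rasters and their values are hypothetical.

```python
import numpy as np

# Minimal sketch of deriving vegetation height from lidar or SfM products:
# the canopy height model (CHM) is the cell-wise difference between a
# digital surface model (DSM) and a digital terrain model (DTM). The 3x3
# rasters below are hypothetical elevations in metres, not real UAV data.

dsm = np.array([[212.4, 215.1, 218.9],
                [213.0, 219.6, 221.2],
                [211.8, 214.3, 216.0]])  # top of canopy / exposed ground

dtm = np.array([[210.1, 210.4, 210.9],
                [210.2, 210.6, 211.0],
                [210.0, 210.3, 210.7]])  # bare-earth terrain

chm = np.clip(dsm - dtm, 0.0, None)  # clip negative noise to zero

print(chm)        # per-cell canopy height (m)
print(chm.max())  # tallest canopy cell, ~10.2 m in this toy grid
```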

1.3. Systematic Review Goals

In a framework where scientific production is exponentially growing, a systematic analysis of UAV-based forestry research papers is of paramount importance to understand trends, overlaps and gaps, since forestry applications include a wide range of scientific topics.
The present review is organized into two parts (Part I and Part II) and aims to accomplish a systematic analysis of the recent peer-reviewed literature (2018–mid-2020) on the use of UAVs in forestry RS applications. Scientific papers regarding environmental chemical sensing (i.e., VOCs, particulate matter) and canopy sampling have been excluded. It is important to mention that the main focus of this study is on small, mini, and micro UAVs with their related imaging sensors, since these types of vehicles are simple to use, affordable, and easy to carry.
This paper (Part I) reviews the state-of-the-art research studies on remote sensing of forest ecosystems. In a broad context, the authors wish to increase the understanding of the current research topic framework. In particular, the specific goals are threefold: (1) create a carefully selected bibliographic dataset that other researchers can draw on for their scientific works; (2) analyze general and recent trends in RS forest monitoring (publishing source, keywords, research place, forest type, sensors); and (3) reveal gaps in the general research framework where additional activity is needed and suggest possible solutions. To provide a general framework of current research, the targeted audience is mainly represented by forest stakeholders (entrepreneurs, technicians, public authorities) and young researchers who want to approach UAV-RS in forestry. Furthermore, even skilled researchers could benefit from this Part I, which can provide a large bibliographic database as underpinning for their activities.
Section 2 presents the workflow for dataset creation and the categorization of forestry research topics. Section 3 presents the results of the global analysis, concerning aspects that cut across the totality of the works selected in the first and second filtering steps. Then, in Section 4, general discussions are reported; they encompass overall issues including the geographic distribution of the research, the use of sensors, and the presence of multi-temporal analysis. Finally, in Section 5 the main conclusions derived from Part I of the present review are drawn, outlining both strengths and challenges from a global point of view and providing some suggestions for filling the identified research gaps.
To give a clear picture of the entire review, the aims and the audience of Part II [55] are briefly described below. This part identifies the most popular technical issues and shows the pros and cons of different UAV solutions. The six research topics are fully analyzed and discussed with a particular focus on: hyperspectral sensors, comparison with other RS platforms, machine learning techniques, and use of field data. The authors seek to reveal critical points (not only at the technical level) where additional effort is needed, such as the technology transfer of UAV-RS in a real management context. Bearing this in mind, Part II of the present review is primarily aimed at expert researchers, technicians and consultants.

2. Materials and Method

2.1. Dataset Creation: Scientific Paper Search, Filtering, and Selection

The authors performed a comprehensive literature search using the Web of Science (WoS—Clarivate Analytics) search engine to extract peer-reviewed studies related to the use of UAV platforms for forestry remote sensing applications and published between 2018 and mid-2020. The search descriptors were a combination of keywords related to UAV platforms, namely “UAV”, “unmanned aerial vehicle”, “UAS”, “unmanned aerial system”, “drone” and terms related to forest science and management, namely “forest” and “forestry”. At this point in the database creation workflow, double-step filtering was conducted. The first step consisted of exploiting the exclusion criteria directly available in the WoS search engine, that is publication year, document type, and language (Figure 2). In particular, for each keyword combination entered in the research field “Topic”, the investigation was limited to 2018, 2019, and mid-2020 (30 June) to encompass only the most recent applications of UAV in forest remote sensing. In this way, the present systematic review differs from other bibliographic analyses by making readers aware of the latest research findings in this sector. Only original articles, conference papers, and book chapters published in the English language were screened.
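As an illustration of how these descriptor combinations could be assembled into topic queries, the short sketch below pairs each platform-related keyword with each forest-related term; the TS=(...) topic-field formatting is an assumption for illustration only, since the exact query syntax used for the review is not reported.

```python
from itertools import product

# Platform-related and forest-related descriptors listed in Section 2.1.
uav_terms = ["UAV", "unmanned aerial vehicle", "UAS",
             "unmanned aerial system", "drone"]
forest_terms = ["forest", "forestry"]

# One topic query per keyword pair. The TS=(...) advanced-search style is
# only illustrative; the exact query format used for the review is not
# reported in the paper.
queries = [f'TS=("{u}" AND "{f}")' for u, f in product(uav_terms, forest_terms)]

for query in queries:
    print(query)
# 5 platform terms x 2 forest terms = 10 keyword combinations
```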
At the end of this phase, the bibliometric metadata of the screened research papers were exported from the WoS search engine. The metadata text files were then processed with the scientific mapping software VOSviewer [56]. The resulting bibliometric networks of the title and abstract keywords most used in UAV-based forestry research (2018–mid-2020) are presented later, at the beginning of Section 3.
To assess the eligibility of the screened research papers, a second level of filtering was performed based on the following exclusion criteria: (i) adequacy of the paper topic to the aims of the present review, (ii) a mixed criterion accounting for the relevance of the publishing source and paper citations, and (iii) scientific paper availability in the research institution’s database. For doubtful cases, the authors re-read the full text of the article and excluded those that did not satisfy the aforementioned criteria. Finally, a total of 226 papers were found to be relevant to the study and are therefore included in the systematic analysis. The overall workflow, with the single steps and the related numbers of selected or excluded papers, is reported in Figure 2.
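This eligibility check can be pictured as a simple predicate applied to each screened record, as in the minimal sketch below; the field names and the boolean encoding of the three criteria are assumptions, since in practice the criteria were applied qualitatively by reading each paper.

```python
# Illustrative sketch of the second filtering step described above. The
# field names and boolean encoding of the criteria are assumptions: in the
# review the criteria were applied qualitatively, by reading each paper.

def is_eligible(record: dict) -> bool:
    """Retain a screened record only if all three criteria are satisfied."""
    on_topic = record.get("topic_matches_review", False)       # criterion (i)
    relevant = record.get("source_relevant_or_cited", False)   # criterion (ii)
    available = record.get("full_text_available", False)       # criterion (iii)
    return on_topic and relevant and available

screened = [
    {"title": "Hypothetical UAV forest inventory study",
     "topic_matches_review": True, "source_relevant_or_cited": True,
     "full_text_available": True},
    {"title": "Hypothetical VOC canopy sampling study",
     "topic_matches_review": False, "source_relevant_or_cited": True,
     "full_text_available": True},
]

selected = [r for r in screened if is_eligible(r)]
print(len(selected), "of", len(screened), "hypothetical records retained")
```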

2.2. Topic Categorization and Other Classification Criteria

Once all the papers had been read, they were categorized into six main topics, and subsequently, the information needed to address the research questions (reported in Part II) was collected. The selected topics with a brief description are listed below.

2.2.1. Setting and Accuracy of Imagery Products

This topic covers the acquisition, pre-processing, and processing of UAV data for forest structure characterization. In particular, it encompasses UAV and sensor settings (e.g., UAV altitude, overlap, sensor resolution) and the accuracy of the generated imagery products, such as the digital terrain model (DTM), digital elevation model (DEM), canopy height model (CHM), and point cloud.

2.2.2. Tree Detection and Inventory Parameters

The topic includes tree detection and the estimation of dendrometric parameters for forest inventory purposes and, in general, the measurement of metrics and spatial properties at the stand/tree level. The focus is on assessing the height of the forest population and other inventory parameters such as diameter at breast height (DBH), crown area, etc. Individual tree detection with crown delineation is also a hot issue. Tree biomass and volume are excluded from this section.

2.2.3. Aboveground Biomass/Volume Estimation

Considering the scientific relevance and the number of articles gathered, the authors decided to devote a separate section to biomass estimation. This topic includes the estimation of aboveground biomass, overall volume, growing stock volume, basal area, and carbon content at the stand/tree level, as well as woody debris and fallen logs. Understandably, many of the selected papers start by estimating tree dendrometric parameters as basic key features and then assess the biomass.

2.2.4. Pest and Disease Detection

Forest health monitoring is covered in this section; in particular, research papers assessing forest status and mortality induced by biotic factors (diseases and insect pests) are collected.

2.2.5. Species Recognition and Invasive Plant Detection

This topic deals with the assessment of dominant species in forest stands and the spatial analysis of weed/alien plant invasions, regarding species classification and invasive plant detection, respectively.

2.2.6. Conservation, Restoration, and Fire Monitoring

A broad range of sub-topics is included in this section. Regarding the conservation of natural (and semi-natural) forest ecosystems, the main issue is forest biodiversity monitoring at spatial and temporal scales, also through the quantification of canopy spatial structures and gap patterns. In this framework, research papers dealing with land use are also included in the conservation sub-topic. Restoration encompasses the study and implementation of all those interventions that aim to restore a forest ecosystem destroyed by various natural and human factors (e.g., wildfires/arson, post-mining sites). The forest fire monitoring sub-topic investigated in this review can be divided into two main issues: monitoring (before fires), i.e., prevention, e.g., by creating fire risk maps of vegetation; and post-fire monitoring (after fires), i.e., mapping burned areas and assessing fire effects.
In addition to topic categorization, the information for each selected study is assembled in a tabular format according to the following criteria: publication year, author, title, publication name, publication type, DOI, total times cited, sensor type, study location (continent and country), forest type, plant group, species, machine learning techniques (object detection methods/crown segmentation algorithms), other RS platform use or comparison, and ground truth data type. This synthesis is then used to answer the research questions (Part II).
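A minimal sketch of how such a tabulation could be laid out is given below; the column names paraphrase the criteria listed above, and the example row and file name are hypothetical rather than taken from the authors’ actual dataset.

```python
import csv

# Illustrative record layout for the tabulation described above. The column
# names paraphrase the classification criteria listed in the text; the
# example row and output file name are hypothetical, not real review data.
FIELDS = [
    "publication_year", "authors", "title", "publication_name",
    "publication_type", "doi", "times_cited", "sensor_type",
    "continent", "country", "forest_type", "plant_group", "species",
    "ml_techniques", "other_rs_platforms", "ground_truth_data", "topic",
]

example = {
    "publication_year": 2019, "authors": "Doe et al.",
    "title": "Hypothetical UAV forest study", "publication_name": "Remote Sens.",
    "publication_type": "article", "doi": "10.0000/example",
    "times_cited": 0, "sensor_type": "RGB", "continent": "Europe",
    "country": "Italy", "forest_type": "natural", "plant_group": "broadleaf",
    "species": "Fagus sylvatica", "ml_techniques": "random forest",
    "other_rs_platforms": "Sentinel-2", "ground_truth_data": "field plots",
    "topic": "Tree detection and inventory parameters",
}

with open("review_dataset.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow(example)
```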

3. Results

Global Analysis Results

This section reports the results of the global analysis without going into the specifics of each topic cluster (see Part II). In particular, the authors analyzed the scientific production of the period 2018–mid-2020 on the use of UAVs in forestry research in terms of number of papers, journals, researcher countries, and keywords used. Figure 3 shows the top 20 source titles (journals and proceedings) of research papers filtered only through the exclusion criteria directly available in the WoS search engine (first-step filtering), namely publication year, document type, and language. The journal Remote Sensing (MDPI) turns out to be by far the most popular source title, accounting for 23.4% of the filtered publications; it is followed by Forests (4.8%—MDPI), the International Journal of Remote Sensing (3.2%—Taylor and Francis), and the International Geoscience and Remote Sensing Symposium Proceedings (2.7%—IEEE Xplore) together with Remote Sensing of Environment (2.7%—Elsevier).
Regarding the top 20 authors’ countries, the bar graph (Figure 4) shows that China and, at some distance, the United States of America are the countries from which the largest shares of researchers publishing on UAVs and forests come, with 26.5% and 18.4%, respectively. They are followed by scientists from Canada (8.8%), Germany (6.7%), and England (5.4%).
In addition, using the abovementioned first-step filtered database, a scientific mapping of title and abstract keyword occurrences was produced with the VOSviewer software [56]. The setting options were tuned as follows: title and abstract fields, full counting of occurrences, a minimum occurrence threshold equal to 10, and the default relevance score (60%). Moreover, to ensure that specific terms strictly related to the use of UAVs for forestry applications were highlighted, some generic words (i.e., work, range, task, etc.) were manually excluded. The network mapping (Figure 5) shows the keywords clustered into four groups, with colored lines (maximum number set equal to 200) indicating co-occurrence links between terms. The four groups can be labeled as the “classification” (red), “biomass estimation” (yellow), “inventory” (blue), and “UAV imagery” (green) clusters. In the red cluster, the main keywords are all related to image classification, both in terms of parameters (“vegetation index”, “texture”, “spatial resolution”) and techniques (“random forest”, “SVM”—support vector machine, “CNN”—convolutional neural network); there is also a sub-group linked to forest fire monitoring where “UAVs” is the most influential term. The biomass estimation cluster is characterized by the keyword “AGB” together with “Sentinel” and “mangrove forest”; this analysis confirmed the high number of papers dealing with biomass assessment in mangrove ecosystems, also with the use of other RS platforms, such as satellites. “Height”, “structure”, “individual tree”, and “plot” are the most frequent occurrences in the inventory cluster, which is strictly related to remotely sensed dendrometric parameters, as is also confirmed by other terms like “DBH”—diameter at breast height—“basal area”, “field measurement”, and “forest inventory”. The green cluster revolves around the keyword “point cloud”, surrounded by other UAV imagery products such as “DTM”—digital terrain model—and “DEM”—digital elevation model—and by terms related to image processing like “photogrammetry” and “DAP”—Digital Aerial Photogrammetry.
Regarding the co-occurrence between keywords of different clusters, an in-depth analysis reveals that “point cloud” could reasonably be considered the key element of UAV-RS in forestry, being strongly tied (thick lines) to all the main keywords of other clusters, such as “structure”, “AGB”, and “classification”. It is worth noting that interconnections between different clusters also occur in the case of “AGB” and “vegetation index”, “AGB” and “plot”, and “height” and “random forest”. No terms regarding hyperspectral sensors, flight regulations, or multi-temporal analysis stand out above the minimum occurrence threshold. This scientific mapping could reveal research gaps to explore further.
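A minimal sketch of the full-counting co-occurrence analysis underlying such a keyword map is given below; the per-paper term sets and the reduced occurrence threshold are hypothetical, and VOSviewer itself additionally performs term extraction, relevance scoring, and network clustering.

```python
from collections import Counter
from itertools import combinations

# Minimal sketch of full-counting keyword occurrence and co-occurrence, as
# performed by VOSviewer on title/abstract terms. The per-paper term sets
# are hypothetical; the real tool also extracts noun phrases, scores their
# relevance, and clusters the resulting network.

papers_terms = [
    {"point cloud", "DTM", "photogrammetry"},
    {"point cloud", "AGB", "mangrove forest"},
    {"random forest", "classification", "point cloud"},
]

occurrences = Counter(t for terms in papers_terms for t in terms)
cooccurrences = Counter(
    frozenset(pair)
    for terms in papers_terms
    for pair in combinations(sorted(terms), 2)
)

MIN_OCCURRENCE = 2  # the review used a threshold of 10 on the full corpus
kept = {t for t, n in occurrences.items() if n >= MIN_OCCURRENCE}
print(kept)  # {'point cloud'}
for pair, n in cooccurrences.most_common(3):
    print(sorted(pair), n)
```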
Applying the second level of filtering according to the adequacy, relevance, and availability exclusion criteria (Section 2), a total of 226 papers were selected to form the final dataset of the present systematic review (Table 1). The eligible research studies are tabulated according to forest type, plant group, and topic to highlight the large amount of data from which this analysis started and to outline some statements based on the abovementioned clustering criteria. Forest type distinguishes planted woodland, which may have one or more characteristics such as species purity, even age, and regular spacing, from natural/irregular woodland—mixed, uneven-aged, and irregularly spaced forests—where the spatial variability of vegetation is very high and the use of UAVs is more challenging. The nomenclature of plant groups reflects the plant types found in the articles together with a practical classification into conifer (a group of gymnosperms), broadleaved trees (angiosperms, both deciduous and evergreen), mixed plants (conifer + broadleaf), and other (Ginkgo biloba L. and trees for which no taxonomic hint is specified in the reviewed articles). Overall, the researchers focused mainly on natural forests (63%) and, among plant groups, on broadleaf woodland (84 research papers). Focusing on plant groups, the majority of tree species are classified as conifer (50%) in planted forests, whereas they are mixed in natural ones (42%). Tree detection and inventory parameters is the most discussed topic (95 articles in total), also confirming that it is a key issue in the forestry sector for all stakeholders (researchers, practitioners, and public authorities). On the contrary, pest and disease detection is addressed in only 17 papers, meaning that the topic has not yet fully entered UAV forestry applications. The setting and accuracy of imagery products and AGB estimation are particularly discussed for natural forests, while UAVs are utilized in a few tasks of conservation, restoration, and fire monitoring across forest types and plant groups. Finally, as could be expected, species recognition and invasive plant detection are not tackled in planted forests.
Regarding publication years, the selected research papers numbered 72 in 2018, 99 in 2019, and 55 in the first half of 2020. Hence, the number of scientific publications in UAV-based forestry research increased continuously over the reference period, also considering the projection for 2020 (six more research items than by mid-2019). This trend is in line with the exponential growth of the discipline since 2012 [5] and also with the fast growth of general UAV research [269].
Figure 6 shows a geographic analysis of the dataset. The pie chart size represents the number of research works conducted on the five continents, with the six topics highlighted in different colors. Each pie chart is centered on the capital city of the top-ranked country of the continent. Asia and Europe (including Russia) are the most prolific continents, with 76 and 70 papers, respectively; however, while in Asia the applications are more focused on tree detection and inventory parameters, in Europe the scientific works cover a wider range of topics and are split among several countries. In South America, the species recognition topic is well represented (7 papers) thanks to invasive plant detection in the Amazon rainforest. Among the continents, North America (which also includes Central America) has the highest incidence (59%) of the tree detection and inventory parameters topic, while 23% of the papers from Oceania deal with pest and disease detection. Finally, in Oceania and Africa, UAVs are used in only 13 and 5 studies, respectively. Regarding the nation ranking within each continent, the countries that have hosted the highest number of studies are China (40—Asia), USA (14—North America), Brazil (11—South America), Finland, Spain, and the Czech Republic (10—Europe), Australia (8—Oceania), and Ghana (2—Africa).
An analysis of the trend in the use of the main UAV-mounted sensors in the reviewed papers is presented in Figure 7. In the process of label creation, for simplicity, cameras sensing separately in the RGB and NIR spectral ranges are classified as a single multispectral sensor. When thermal and hyperspectral sensors are used in the same study in combination with others, the authors prioritize the abovementioned sensors. Finally, sensor combination expresses the use of both active and passive sensors. RGB, multispectral, and lidar sensors emerge as the most utilized technologies (Figure 7a): they account for about 80% of the entire dataset, with RGB far ahead of the other types at 114 uses, followed by multispectral and lidar. Breaking the analysis down by forestry topic (Figure 7b), species recognition and invasive plant detection is mainly based on the hyperspectral sensor, while lidar is UAV-mounted in studies exploring tree detection and inventory parameter issues, even if this topic appears to cross all categories of sensors. AGB estimation is tackled especially with RGB, lidar, and sensor combinations (where lidar is also present). For conservation, restoration, and fire monitoring purposes, multispectral and RGB sensors (and their combination) are the most used, representing 92% of the studies on this topic.
The authors also conducted a brief analysis of the use of multi-temporal images for assessing forest characteristics and woodland ecosystem state over time. It turns out that fewer than ten studies base their results on time-series imagery products. Some of them take advantage of satellite images [114,254,264], while others [135,144,261] perform more than fifteen UAV flights for monitoring tree phenology, but within one growing season. It is important to note that only two reviewed papers [76,256] collect UAV-acquired images over a timespan longer than one year.

4. Discussion

Forests, like other natural resources, need monitoring, management, and preservation, which increasingly profit from UAV remote sensing. As a rule, the capabilities of UAS are evaluated in terms of spectral, spatial, and temporal resolution, as well as processing time, area coverage, and cost-efficiency. When the monitored woodland covers a rather small area, the use of drones is clearly advantageous for the amplitude and precision of the acquired spectral range, the centimetric resolution, and the daily monitoring frequency. Conversely, the main drawbacks of UAVs compared to other RS platforms include processing time (about ten times greater than that of a satellite) [136], the inability to survey large areas due to battery and regulatory limitations, and the lack of consistent data collection and processing workflows for multi-temporal analysis [270].
The results presented in this Part I show how forestry RS is expanding exponentially. The growth is witnessed by the increased number of scientific papers (even in the short timespan analyzed) and by the several journals, belonging to different scientific publishers, that host them. The scientific mapping shows that research is mainly heading towards the estimation of inventory parameters, among which AGB stands out for its importance; at a methodological level, researchers look expectantly at the use of UAV-DAP and machine learning techniques for tree identification and classification purposes. The screened papers cover both natural and planted forests composed of coniferous, broadleaved, or mixed species. Researchers make use of a wide range of UAV-mounted sensors (passive and active); in this context, the RGB camera with centimetric resolution now seems a consolidated and mature technology.
Despite all this, there are several critical issues that the academic world must address to further improve the general quality of research in UAV forest remote sensing, as also revealed by the scientific mapping of the most frequent keywords. This review highlights that the number of papers in some continents is often inversely proportional to their huge forest heritage: the case of Africa is striking (only five studies published), but South America also has a rather small number of studies. In parallel, tropical and equatorial forests (representing about 30% of emerged land areas) are under-represented biomes, with fewer than 14% of studies. Remote sensing methods represent a powerful tool for investigating spatial and temporal patterns of vegetation in these challenging environments; this is especially true for low-flying UAV imagery, which is not affected by the atmospheric artifacts caused by highly seasonal cloud cover and aerosol loads [271]. The resulting lack of knowledge about tropical and equatorial woodland could perhaps be ascribed to the geographic distribution of research centers, which are often located in developed countries, but also to the UAV’s limited flight range combined with the inaccessibility of tropical forests, which makes these ecosystems technically hard to monitor, at least until BVLOS (Beyond Visual Line of Sight) flight strategies are authorized. Moreover, in this context, the availability of ready-to-fly UAV platforms at affordable prices seems to be too recent, and it might not yet allow end-users from developing countries to perform image acquisition with a simpler tool than customized ones. In the case of UAVs, it is difficult to suggest a possible solution through open data accessibility for potential end-users, as in the case of some satellite platforms. As stated by Huylenbroeck et al. [272], using RS tools to monitor natural resources is not neutral: these methods could exclude stakeholders who do not have access to the technology.
Hyperspectral sensors certainly represent a promising tool for future RS progress but, in this review, fewer than 10% of the papers were found to utilize the technology. This could be due to the difficulties in calibrating and setting up prototype/customized solutions, in addition to the high market prices of commercially available packages. In this sense, the ongoing cost reduction of off-the-shelf hyperspectral cameras could certainly boost the use of the sensor. By exploiting the ability to choose numerous bands in a precise spectral range, UAV-mounted hyperspectral cameras can provide the tree spectral signature that, in turn, could be used for species recognition or disease detection.
The capability of acquiring UAV images with high frequency provides new opportunities to study forest evolution. Although UAVs can easily acquire a dense time series, only a few research papers deal with the central issue of multi-temporality to describe vegetation dynamics. Multi-temporal analysis can sometimes compensate for a limited spectral range, as in some studies where UAV time series are mainly exploited for intra-annual applications such as phenology monitoring. Other papers compare single-season UAV imagery products with relatively coarse resolution satellite images. Fewer than 1% of the research works gather multi-seasonal UAV data; in particular, a three-year UAV image time series is used both for species identification in a Brazilian Atlantic forest [256] and for chestnut health monitoring [76]. The poor use of UAV-acquired image time series is an issue that must certainly be addressed by future research. In doing so, it is of pivotal importance to find a balance between the size of the remotely sensed area and the spatial and temporal resolution, without running into the drawbacks of field surveys again (i.e., high cost and labor).
Finally, although the legal situation of UAV flying has been analyzed quite recently [273], none of the selected papers tackle this issue. Many countries still lack legislation that regulates the use of UAVs for both research and commercial purposes. Some authors [4,274] highlight the importance of integrating UAVs into the airspace as soon as possible through a specific regulatory regime. The legislative framework should be agreed upon by all stakeholders to realize the full potential of UAS and increase their usage while ensuring individual citizens’ safety and privacy rights. A debate on the conditions under which drones can be operated is required where this is lacking (i.e., within the European Union) or in specific situations where human artifacts could raise some concerns (i.e., agroforestry, riparian ecosystems in developed countries). Finally, UAV regulations should be improved to address not only forestry monitoring but also innovative uses such as management operations including, for instance, chemical distribution.

5. Conclusions

An accurate review of recent papers is essential for further progress in UAV forest remote sensing, both for researchers preparing to enter the research topic and for those wishing to explore a single scientific issue in depth.
To the best of the authors’ knowledge, this study gathers and analyzes one of the most comprehensive datasets (226 articles) among recent systematic reviews dealing with UAV forest remote sensing. To date, this set of technologies can doubtless be considered popular and well-established, especially within academia, as is shown by the increasing number of papers published in the 2018–mid-2020 timespan.
This Part I of the review analyzed and discussed general aspects of applying UAS (aerial vehicle + sensors) in natural, semi-natural, and artificial forestry ecosystems. Certainly, the final assessment presents more positives than negatives, as evidenced by the strong points emerging from this study. Among these is the broad range of topics discussed in the reviewed studies, from which emerges the importance of retrieving tree inventory parameters, often with DAP and machine learning techniques. RGB is the most used sensor technology across different forestry research purposes. Not least, the scientific dynamism and growing interest in forestry UAV-RS are proven by the many research groups, journals, and publishers involved.
Nevertheless, challenges still exist and therefore much additional research is required. UAV-RS should be tested and promoted in both natural and semi-natural forests, such as tropical, equatorial, and riparian ones. Some parts of the world, in particular Africa and South America, may benefit from greater investment in UAV-RS research so as to have an additional tool to better manage their forest ecosystems. Regarding technical issues, the hyperspectral sensor has much potential that is not yet fully exploited, and long-term monitoring over multi-year timespans should be extended across forestry topics where relevant. Finally, considering the general framework of aviation, proper UAV flight regulations for research purposes should be harmonized among different countries, also for the forestry sector. Future research directions should consider the outlined weak points.
Specific technical issues regarding the application of UAVs in forest remote sensing are analyzed and further discussed in Part II [55] of this systematic review. The pros and cons of different UAV solutions across the six research topics are shown, together with some activities where additional effort is needed, such as technology transfer. Unlike Part I, Part II of the present review is addressed to a more skilled audience.

Author Contributions

Conceptualization, R.D., P.T., S.F.D.G. and A.M.; methodology, R.D., P.T., S.F.D.G. and A.M.; software, R.D.; formal analysis, R.D.; investigation, R.D.; data curation, R.D.; writing—original draft preparation, R.D.; writing—review and editing, R.D., P.T., S.F.D.G. and A.M.; visualization, R.D.; supervision, P.T., S.F.D.G. and A.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data presented in this study are available in the research articles displayed in Table 1.

Acknowledgments

The authors thank all anonymous reviewers and journal editorial staff for their efforts in improving the quality of this manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Gómez, C.; Alejandro, P.; Hermosilla, T.; Montes, F.; Pascual, C.; Ruiz, L.Á.; Álvarez-Taboada, F.; Tanase, M.A.; Valbuena, R. Remote sensing for the Spanish forests in the 21st century: A review of advances, needs, and opportunities. For. Syst. 2019, 28, 1–33. [Google Scholar] [CrossRef]
  2. Goodbody, T.R.H.; Coops, N.C.; White, J.C. Digital Aerial Photogrammetry for Updating Area-Based Forest Inventories: A Review of Opportunities, Challenges, and Future Directions. Curr. For. Rep. 2019, 55–75. [Google Scholar] [CrossRef] [Green Version]
  3. Iglhaut, J.; Cabo, C.; Puliti, S.; Piermattei, L.; O’Connor, J.; Rosette, J. Structure from Motion Photogrammetry in Forestry: A Review. Curr. For. Rep. 2019, 5, 155–168. [Google Scholar] [CrossRef] [Green Version]
  4. Pádua, L.; Vanko, J.; Hruška, J.; Adão, T.; Sousa, J.J.; Peres, E.; Morais, R. UAS, sensors, and data processing in agroforestry: A review towards practical applications. Int. J. Remote Sens. 2017, 38, 2349–2391. [Google Scholar] [CrossRef]
  5. Raparelli, E.; Bajocco, S. A bibliometric analysis on the use of unmanned aerial vehicles in agricultural and forestry studies. Int. J. Remote Sens. 2019, 40, 9070–9083. [Google Scholar] [CrossRef]
  6. Guimarães, N.; Pádua, L.; Marques, P.; Silva, N.; Peres, E.; Sousa, J.J. Forestry remote sensing from unmanned aerial vehicles: A review focusing on the data, processing and potentialities. Remote Sens. 2020, 12, 1046. [Google Scholar] [CrossRef] [Green Version]
  7. Xiang, T.Z.; Xia, G.S.; Zhang, L. Mini-Unmanned Aerial Vehicle-Based Remote Sensing: Techniques, applications, and prospects. IEEE Geosci. Remote Sens. Mag. 2019, 7, 29–63. [Google Scholar] [CrossRef] [Green Version]
  8. Chianucci, F.; Disperati, L.; Guzzi, D.; Bianchini, D.; Nardino, V.; Lastri, C.; Rindinella, A.; Corona, P. Estimation of canopy attributes in beech forests using true colour digital images from a small fixed-wing UAV. Int. J. Appl. Earth Obs. Geoinf. 2016, 47, 60–68. [Google Scholar] [CrossRef] [Green Version]
  9. De Rango, F.; Palmieri, N.; Santamaria, A.F.; Potrino, G. A simulator for UAVs management in agriculture domain. Simul. Ser. 2017, 49, 149–156. [Google Scholar] [CrossRef]
  10. Surovỳ, P.; Kuželka, K. Acquisition of forest attributes for decision support at the forest enterprise level using remote-sensing techniques—A review. Forests 2019, 10, 273. [Google Scholar] [CrossRef] [Green Version]
  11. Manfreda, S.; McCabe, M.F.; Miller, P.E.; Lucas, R.; Madrigal, V.P.; Mallinis, G.; Dor, E.B.; Helman, D.; Estes, L.; Ciraolo, G.; et al. On the use of unmanned aerial systems for environmental monitoring. Remote Sens. 2018, 10, 641. [Google Scholar] [CrossRef] [Green Version]
  12. Hakala, T.; Honkavaara, E.; Saari, H.; Mäkynen, J.; Kaivosoja, J.; Pesonen, L.; Pölönen, I. Spectral Imaging From Uavs Under Varying Illumination Conditions. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, XL-1/W2, 189–194. [Google Scholar] [CrossRef] [Green Version]
  13. Zimudzi, E.; Sanders, I.; Rollings, N.; Omlin, C.W. Remote sensing of mangroves using unmanned aerial vehicles: Current state and future directions. J. Spat. Sci. 2019, 1–18. [Google Scholar] [CrossRef]
  14. Senthilnath, J.; Kandukuri, M.; Dokania, A.; Ramesh, K.N. Application of UAV imaging platform for vegetation analysis based on spectral-spatial methods. Comput. Electron. Agric. 2017, 140, 8–24. [Google Scholar] [CrossRef]
  15. Chen, Q.; Kutser, T.; Collin, A.; Warner, T.A. Fine resolution remote sensing of species in terrestrial and coastal ecosystems. Int. J. Remote Sens. 2018, 39, 5597–5599. [Google Scholar] [CrossRef] [Green Version]
  16. Shakhatreh, H.; Sawalmeh, A.; Al-Fuqaha, A.; Dou, Z.; Almaita, E.; Khalil, I.; Othman, N.S.; Khreishah, A.; Guizani, M. Unmanned aerial vehicles: A survey on civil applications and key research challenges. IEEE Access 2019, 7, 48572–48634. [Google Scholar] [CrossRef]
  17. Dunford, R.; Michel, K.; Gagnage, M.; Piégay, H.; Trémelo, M.-L. Potential and constraints of Unmanned Aerial Vehicle technology for the characterization of Mediterranean riparian forest. Int. J. Remote Sens. 2009, 30, 4915–4935. [Google Scholar] [CrossRef]
  18. Web of Science [v.5.35]—Web of Science Core Collection Basic Search. Available online: https://login.webofknowledge.com/error/Error?Error=IPError&PathInfo=%2F&RouterURL=https%3A%2F%2Fwww.webofknowledge.com%2F&Domain=.webofknowledge.com&Src=IP&Alias=WOK5 (accessed on 8 January 2021).
  19. Poley, L.G.; McDermid, G.J. A systematic review of the factors influencing the estimation of vegetation aboveground biomass using unmanned aerial systems. Remote Sens. 2020, 12, 1052. [Google Scholar] [CrossRef] [Green Version]
  20. Batista, C.E.; Ye, J.; Ribeiro, I.O.; Guimarães, P.C.; Medeiros, A.S.S.; Barbosa, R.G.; Oliveira, R.L.; Duvoisin, S.; Jardine, K.J.; Gu, D.; et al. Intermediate-scale horizontal isoprene concentrations in the near-canopy forest atmosphere and implications for emission heterogeneity. Proc. Natl. Acad. Sci. USA 2019, 116, 19318–19323. [Google Scholar] [CrossRef] [Green Version]
  21. Mckinney, K.A.; Wang, D.; Ye, J.; Fouchier, J.B.D.; Guimarães, P.C.; Batista, C.E.; Souza, R.A.F.; Alves, E.G.; Gu, D.; Guenther, A.B.; et al. A sampler for atmospheric volatile organic compounds by copter unmanned aerial vehicles. Atmos. Meas. Tech. 2019, 12, 3123–3135. [Google Scholar] [CrossRef] [Green Version]
  22. Chen, J.; Scircle, A.; Black, O.; Cizdziel, J.V.; Watson, N.; Wevill, D.; Zhou, Y.; Batista, C.E.; Ye, J.; Ribeiro, I.O.; et al. On the use of multicopters for sampling and analysis of volatile organic compounds in the air by adsorption/thermal desorption GC-MS. Air Qual. Atmos. Health 2019, 116, 835–842. [Google Scholar] [CrossRef]
  23. Katsigiannis, P.; Misopolinos, L.; Liakopoulos, V.; Alexandridis, T.K.; Zalidis, G. An Autonomous Multi-Sensor UAV System for Reduced-Input Precision Agriculture Applications. In Proceedings of the 2016 24th Mediterranean Conference on Control and Automation (MED), Athens, Greece, 21–24 June 2016; pp. 60–64. [Google Scholar] [CrossRef]
  24. Nebiker, S.; Lack, N. Multispectral and thermal sensors on UAVs. GIM Int. 2016, 30, 19–21. [Google Scholar]
  25. Nebiker, S.; Annen, A.; Scherrer, M.; Oesch, D. A light-weight multispectral sensor for micro uav-opportunities for very high resolution airborne remote sensing. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. ISPRS Arch. 2008, 37, 1193–1199. [Google Scholar]
  26. Mejias, L.; Lai, J.; Bruggemann, T. Sensors for Missions BT—Handbook of Unmanned Aerial Vehicles; Valavanis, K.P., Vachtsevanos, G.J., Eds.; Springer: Dordrecht, The Netherlands, 2015; pp. 385–399. [Google Scholar] [CrossRef] [Green Version]
  27. Colomina, I.; Molina, P. Unmanned Aerial Systems for Photogrammetry and Remote Sensing: A Review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef] [Green Version]
  28. Bendig, J.V. Unmanned Aerial Vehicles (UAVs) for Multi-Temporal Crop Surface Modelling—A New Method for Plant Height and Biomass Estimation Based on RGB-Imaging. Doctoral Thesis, Universität zu Köln, Cologne, Germany, 2015. [Google Scholar]
  29. Riihimäki, H.; Luoto, M.; Heiskanen, J. Estimating fractional cover of tundra vegetation at multiple scales using unmanned aerial systems and optical satellite data. Remote Sens. Environ. 2019, 224, 119–132. [Google Scholar] [CrossRef]
  30. Shashkov, M.; Ivanova, N.; Shanin, V.; Grabarnik, P. Ground Surveys Versus UAV Photography: The Comparison of Two Tree Crown Mapping Techniques. In Proceedings of the Information Technologies in the Research of Biodiversity; Bychkov, I., Voronin, V., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 48–56. [Google Scholar]
  31. Casapia, X.T.; Falen, L.; Bartholomeus, H.; Cárdenas, R.; Flores, G.; Herold, M.; Coronado, E.N.H.; Baker, T.R. Identifying and quantifying the abundance of economically important palms in tropical moist forest using UAV imagery. Remote Sens. 2020, 12, 9. [Google Scholar] [CrossRef] [Green Version]
  32. Guo, X.; Wang, L.; Tian, J.; Yin, D.; Shi, C.; Nie, S. Vegetation horizontal occlusion index (VHOI) from TLS and UAV image to better measure mangrove LAI. Remote Sens. 2018, 10, 1739. [Google Scholar] [CrossRef] [Green Version]
  33. Zhang, N.; Zhang, X.; Yang, G.; Zhu, C.; Huo, L.; Feng, H. Assessment of defoliation during the Dendrolimus tabulaeformis Tsai et Liu disaster outbreak using UAV-based hyperspectral images. Remote Sens. Environ. 2018, 217, 323–339. [Google Scholar] [CrossRef]
  34. Cardil, A.; Otsu, K.; Pla, M.; Silva, C.A.; Brotons, L. Quantifying pine processionary moth defoliation in a pine-oak mixed forest using unmanned aerial systems and multispectral imagery. PLoS ONE 2019, 14, e0213027. [Google Scholar] [CrossRef]
  35. Zhao, T.; Stark, B.; Chen, Y.Q.; Ray, A.L.; Doll, D. Challenges in Water Stress Quantification Using Small Unmanned Aerial System (sUAS): Lessons from a Growing Season of Almond. J. Intell. Robot. Syst. Theory Appl. 2017, 88, 721–735. [Google Scholar] [CrossRef]
  36. Zarco-Tejada, P.J.; González-Dugo, V.; Berni, J.A.J. Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sens. Environ. 2012, 117, 322–337. [Google Scholar] [CrossRef]
  37. Kinaneva, D.; Hristov, G.; Raychev, J.; Zahariev, P. Early forest fire detection using drones and artificial intelligence. In Proceedings of the 2019 42nd International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), Opatija, Croatia, 20–24 May 2019; pp. 1060–1065. [Google Scholar] [CrossRef]
  38. Witczuk, J.; Pagacz, S.; Zmarz, A.; Cypel, M. Exploring the feasibility of unmanned aerial vehicles and thermal imaging for ungulate surveys in forests—Preliminary results. Int. J. Remote Sens. 2018, 39, 5504–5521. [Google Scholar] [CrossRef]
  39. Shin, J.I.; Seo, W.W.; Kim, T.; Park, J.; Woo, C.S. Using UAV multispectral images for classification of forest burn severity—A case study of the 2019 Gangneung forest fire. Forests 2019, 10, 1025. [Google Scholar] [CrossRef] [Green Version]
  40. Hyyppä, E.; Hyyppä, J.; Hakala, T.; Kukko, A.; Wulder, M.A.; White, J.C.; Pyörälä, J.; Yu, X.; Wang, Y.; Virtanen, J.P.; et al. Under-canopy UAV laser scanning for accurate forest field measurements. ISPRS J. Photogramm. Remote Sens. 2020, 164, 41–60. [Google Scholar] [CrossRef]
  41. Lefsky, M.A.; Cohen, W.B.; Parker, G.G.; Harding, D.J. Lidar remote sensing for ecosystem studies: Lidar, an emerging remote sensing technology that directly measures the three-dimensional distribution of plant canopies, can accurately estimate vegetation structural attributes and should be of particular interest to forest, landscape, and global ecologists. BioScience 2002, 52, 19–30. [Google Scholar] [CrossRef]
  42. Hu, B.; Li, J.; Jing, L.; Judah, A. Improving the efficiency and accuracy of individual tree crown delineation from high-density LiDAR data. Int. J. Appl. Earth Obs. Geoinf. 2014, 26, 145–155. [Google Scholar] [CrossRef]
  43. Wu, B.; Yu, B.; Wu, Q.; Huang, Y.; Chen, Z.; Wu, J. Individual tree crown delineation using localized contour tree method and airborne LiDAR data in coniferous forests. Int. J. Appl. Earth Obs. Geoinf. 2016, 52, 82–94. [Google Scholar] [CrossRef]
  44. Jeronimo, S.M.A.; Kane, V.R.; Churchill, D.J.; McGaughey, R.J.; Franklin, J.F. Applying LiDAR Individual Tree Detection to Management of Structurally Diverse Forest Landscapes. J. For. 2018, 116, 336–346. [Google Scholar] [CrossRef] [Green Version]
  45. Kuželka, K.; Slavík, M.; Surový, P. Very high density point clouds from UAV laser scanning for automatic tree stem detection and direct diameter measurement. Remote Sens. 2020, 12, 1236. [Google Scholar] [CrossRef] [Green Version]
  46. Yin, D.; Wang, L. Individual mangrove tree measurement using UAV-based LiDAR data: Possibilities and challenges. Remote Sens. Environ. 2019, 223, 34–49. [Google Scholar] [CrossRef]
  47. D’Oliveira, M.V.N.; Broadbent, E.N.; Oliveira, L.C.; Almeida, D.R.A.; Papa, D.A.; Ferreira, M.E.; Zambrano, A.M.A.; Silva, C.A.; Avino, F.S.; Prata, G.A.; et al. Aboveground biomass estimation in Amazonian tropical forests: A comparison of aircraft- and GatorEye UAV-borne LiDAR data in the Chico Mendes Extractive Reserve in Acre, Brazil. Remote Sens. 2020, 12, 1754. [Google Scholar] [CrossRef]
  48. Stark, S.C.; Leitold, V.; Wu, J.L.; Hunter, M.O.; de Castilho, C.V.; Costa, F.R.C.; McMahon, S.M.; Parker, G.G.; Shimabukuro, M.T.; Lefsky, M.A.; et al. Amazon forest carbon dynamics predicted by profiles of canopy leaf area and light environment. Ecol. Lett. 2012, 15, 1406–1414. [Google Scholar] [CrossRef] [Green Version]
  49. Wiggins, H.L.; Nelson, C.R.; Larson, A.J.; Safford, H.D. Using LiDAR to develop high-resolution reference models of forest structure and spatial pattern. For. Ecol. Manag. 2019, 434, 318–330. [Google Scholar] [CrossRef]
  50. Almeida, D.R.A.; Broadbent, E.N.; Zambrano, A.M.A.; Wilkinson, B.E.; Ferreira, M.E.; Chazdon, R.; Meli, P.; Gorgens, E.B.; Silva, C.A.; Stark, S.C.; et al. Monitoring the structure of forest restoration plantations with a drone-lidar system. Int. J. Appl. Earth Obs. Geoinf. 2019, 79, 192–198. [Google Scholar] [CrossRef]
  51. Fernández-Álvarez, M.; Armesto, J.; Picos, J. LiDAR-based wildfire prevention in WUI: The automatic detection, measurement and evaluation of forest fuels. Forests 2019, 10, 148. [Google Scholar] [CrossRef] [Green Version]
  52. Viedma, O.; Almeida, D.R.A.; Moreno, J.M. Postfire Tree Structure from High-Resolution LiDAR and RBR Sentinel 2A Fire Severity Metrics in a Pinus halepensis-Dominated Burned Stand. Remote Sens. 2020, 12, 3554. [Google Scholar] [CrossRef]
  53. Cao, L.; Liu, H.; Fu, X.; Zhang, Z.; Shen, X.; Ruan, H. Comparison of UAV LiDAR and digital aerial photogrammetry point clouds for estimating forest structural attributes in subtropical planted forests. Forests 2019, 10, 145. [Google Scholar] [CrossRef] [Green Version]
  54. Ganz, S.; Käber, Y.; Adler, P. Measuring tree height with remote sensing—A comparison of photogrammetric and LiDAR data with different field measurements. Forests 2019, 10, 694. [Google Scholar] [CrossRef] [Green Version]
  55. Dainelli, R.; Toscano, P.; Di Gennaro, S.F.; Matese, A. Recent advances in Unmanned Aerial Vehicles forest remote sensing—A systematic review. Part II: Research applications. Forests. Submitted.
  56. VOSviewer—Visualizing Scientific Landscapes. Available online: https://www.vosviewer.com/ (accessed on 8 January 2021).
  57. Aguilar, F.J.; Rivas, J.R.; Nemmaoui, A.; Peñalver, A.; Aguilar, M.A. UAV-based digital terrain model generation under leaf-off conditions to support teak plantations inventories in tropical dry forests. A case of the coastal region of Ecuador. Sensors 2019, 19, 1934. [Google Scholar] [CrossRef] [Green Version]
  58. Balsi, M.; Esposito, S.; Fallavollita, P.; Nardinocchi, C. Single-tree detection in high-density LiDAR data from UAV-based survey. Eur. J. Remote Sens. 2018, 51, 679–692. [Google Scholar] [CrossRef] [Green Version]
  59. Guerra-Hernández, J.; Cosenza, D.N.; Rodriguez, L.C.E.; Silva, M.; Tomé, M.; Díaz-Varela, R.A.; González-Ferreiro, E. Comparison of ALS- and UAV(SfM)-derived high-density point clouds for individual tree detection in Eucalyptus plantations. Int. J. Remote Sens. 2018, 39, 5211–5235. [Google Scholar] [CrossRef]
  60. Iizuka, K.; Watanabe, K.; Kato, T.; Putri, N.A.; Silsigia, S.; Kameoka, T.; Kozan, O. Visualizing the spatiotemporal trends of thermal characteristics in a peatland plantation forest in Indonesia: Pilot test using unmanned aerial systems (UASs). Remote Sens. 2018, 10, 1345. [Google Scholar] [CrossRef] [Green Version]
  61. Medauar, C.C.; Silva, S.D.A.; Carvalho, L.C.C.; Tibúrcio, R.A.S.; Lima, J.S.D.S.; Medauar, P.A.S. Monitoring of eucalyptus sprouts control using digital images obtained by unmanned aerial vehicle. J. Sustain. For. 2018, 37, 739–752. [Google Scholar] [CrossRef]
  62. Mokroš, M.; Liang, X.; Surový, P.; Valent, P.; Čerňava, J.; Chudý, F.; Tunák, D.; Saloň, Š.; Merganič, J. Evaluation of Close-Range Photogrammetry Image Collection Methods for Estimating Tree Diameters. ISPRS Int. J. Geo-Inf. 2018, 7, 93. [Google Scholar] [CrossRef] [Green Version]
  63. Qiu, Z.; Feng, Z.K.; Wang, M.; Li, Z.; Lu, C. Application of UAV photogrammetric system for monitoring ancient tree communities in Beijing. Forests 2018, 9, 735. [Google Scholar] [CrossRef] [Green Version]
  64. Blonder, B.; Graae, B.J.; Greer, B.; Haagsma, M.; Helsen, K.; Kapás, R.E.; Pai, H.; Rieksta, J.; Sapena, D.; Still, C.J.; et al. Remote sensing of ploidy level in quaking aspen (Populus tremuloides Michx.). J. Ecol. 2020, 108, 175–188. [Google Scholar] [CrossRef] [Green Version]
  65. Carl, C.; Lehmann, J.R.K.; Landgraf, D.; Pretzsch, H. Robinia pseudoacacia L. in short rotation coppice: Seed and stump shoot reproduction as well as UAS-based spreading analysis. Forests 2019, 10, 235. [Google Scholar] [CrossRef] [Green Version]
  66. Fawcett, D.; Azlan, B.; Hill, T.C.; Kho, L.K.; Bennie, J.; Anderson, K. Unmanned aerial vehicle (UAV) derived structure-from-motion photogrammetry point clouds for oil palm (Elaeis guineensis) canopy segmentation and height estimation. Int. J. Remote Sens. 2019, 40, 7538–7560. [Google Scholar] [CrossRef] [Green Version]
  67. Wang, Y.; Zhu, X.; Wu, B. Automatic detection of individual oil palm trees from UAV images using HOG features and an SVM classifier. Int. J. Remote Sens. 2019, 40, 7356–7370. [Google Scholar] [CrossRef]
  68. Zeng, K.; Zheng, G.; Ma, L.; Ju, W.; Pang, Y. Modelling three-dimensional spatiotemporal distributions of forest photosynthetically active radiation using UAV-Based lidar data. Remote Sens. 2019, 11, 2806. [Google Scholar] [CrossRef] [Green Version]
  69. Dalla Corte, A.P.; Rex, F.E.; de Almeida, D.R.A.; Sanquetta, C.R.; Silva, C.A.; Moura, M.M.; Wilkinson, B.; Zambrano, A.M.A.; da Cunha Neto, E.M.; Veras, H.F.P.; et al. Measuring individual tree diameter and height using GatorEye high-density UAV-LiDAR in an integrated crop-livestock-forest system. Remote Sens. 2020, 12, 863. [Google Scholar] [CrossRef] [Green Version]
  70. Picos, J.; Bastos, G.; Míguez, D.; Alonso, L.; Armesto, J. Individual tree detection in a eucalyptus plantation using unmanned aerial vehicle (UAV)-LiDAR. Remote Sens. 2020, 12, 885. [Google Scholar] [CrossRef] [Green Version]
  71. Peña, J.M.; de Castro, A.I.; Torres-Sánchez, J.; Andújar, D.; Martín, C.S.; Dorado, J.; Fernández-Quintanilla, C.; López-Granados, F. Estimating tree height and biomass of a poplar plantation with image-based UAV technology. AIMS Agric. Food 2018, 3, 313–323. [Google Scholar] [CrossRef]
  72. Guerra-Hernández, J.; Cosenza, D.N.; Cardil, A.; Silva, C.A.; Botequim, B.; Soares, P.; Silva, M.; González-Ferreiro, E.; Díaz-Varela, R.A. Predicting growing stock volume of eucalyptus plantations using 3-D point clouds derived from UAV imagery and ALS data. Forests 2019, 10, 905. [Google Scholar] [CrossRef] [Green Version]
  73. Navarro, J.A.; Algeet, N.; Fernández-Landa, A.; Esteban, J.; Rodríguez-Noriega, P.; Guillén-Climent, M.L. Integration of UAV, Sentinel-1, and Sentinel-2 data for mangrove plantation aboveground biomass monitoring in Senegal. Remote Sens. 2019, 11, 77. [Google Scholar] [CrossRef] [Green Version]
  74. Lu, J.; Wang, H.; Qin, S.; Cao, L.; Pu, R.; Li, G.; Sun, J. Estimation of aboveground biomass of Robinia pseudoacacia forest in the Yellow River Delta based on UAV and Backpack LiDAR point clouds. Int. J. Appl. Earth Obs. Geoinf. 2020, 86, 102014. [Google Scholar] [CrossRef]
  75. Maes, W.H.; Huete, A.R.; Avino, M.; Boer, M.M.; Dehaan, R.; Pendall, E.; Griebel, A.; Steppe, K. Can UAV-based infrared thermography be used to study plant-parasite interactions between mistletoe and Eucalypt trees? Remote Sens. 2018, 10, 2062. [Google Scholar] [CrossRef] [Green Version]
  76. Pádua, L.; Hruška, J.; Bessa, J.; Adão, T.; Martins, L.M.; Gonçalves, J.A.; Peres, E.; Sousa, A.M.R.; Castro, J.P.; Sousa, J.J. Multi-temporal analysis of forestry and coastal environments using UASs. Remote Sens. 2018, 10, 24. [Google Scholar] [CrossRef] [Green Version]
  77. Sandino, J.; Pegg, G.; Gonzalez, F.; Smith, G. Aerial mapping of forests affected by pathogens using UAVs, hyperspectral sensors, and artificial intelligence. Sensors 2018, 18, 944. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  78. Dell, M.; Stone, C.; Osborn, J.; Glen, M.; McCoull, C.; Rimbawanto, A.; Tjahyono, B.; Mohammed, C. Detection of necrotic foliage in a young Eucalyptus pellita plantation using unmanned aerial vehicle RGB photography—A demonstration of concept. Aust. For. 2019, 82, 79–88. [Google Scholar] [CrossRef] [Green Version]
  79. Iizuka, K.; Kato, T.; Silsigia, S.; Soufiningrum, A.Y.; Kozan, O. Estimating and examining the sensitivity of different vegetation indices to fractions of vegetation cover at different scaling grids for early stage Acacia plantation forests using a fixed-wing UAS. Remote Sens. 2019, 11, 1816. [Google Scholar] [CrossRef] [Green Version]
  80. Paolinelli Reis, B.; Martins, S.V.; Fernandes Filho, E.I.; Sarcinelli, T.S.; Gleriani, J.M.; Leite, H.G.; Halassy, M. Forest restoration monitoring through digital processing of high resolution images. Ecol. Eng. 2019, 127, 178–186. [Google Scholar] [CrossRef] [Green Version]
  81. Sealey, L.L.; Van Rees, K.C.J. Influence of skidder traffic on soil bulk density, aspen regeneration, and vegetation indices following winter harvesting in the Duck Mountain Provincial Park, SK. For. Ecol. Manag. 2019, 437, 59–69. [Google Scholar] [CrossRef]
  82. Sealey, L.L.; Van Rees, K.C.J. Assessment of residual slash coverage using UAVs and implications for aspen regeneration. J. Unmanned Veh. Syst. 2020, 8, 19–29. [Google Scholar] [CrossRef]
  83. Díaz, G.M.; Mohr-Bell, D.; Garrett, M.; Muñoz, L.; Lencinas, J.D. Customizing unmanned aircraft systems to reduce forest inventory costs: Can oblique images substantially improve the 3D reconstruction of the canopy? Int. J. Remote Sens. 2020, 41, 3480–3510. [Google Scholar] [CrossRef]
  84. Guan, H.; Zhang, J.; Ma, Q.; Liu, M.; Wu, F.; Guo, Q.; Su, Y.; Hu, T.; Wang, R.; Ma, Q.; et al. A Novel Framework to Automatically Fuse Multiplatform LiDAR Data in Forest Environments Based on Tree Locations. IEEE Trans. Geosci. Remote Sens. 2020, 58, 2165–2177. [Google Scholar] [CrossRef]
  85. Abdollahnejad, A.; Panagiotidis, D.; Surový, P. Estimation and extrapolation of tree parameters using spectral correlation between UAV and Pléiades data. Forests 2018, 9, 85. [Google Scholar] [CrossRef] [Green Version]
  86. Demir, N. Using UAVs for detection of trees from digital surface models. J. For. Res. 2018, 29, 813–821. [Google Scholar] [CrossRef]
  87. Feduck, C.; McDermid, G.J.; Castilla, G. Detection of coniferous seedlings in UAV imagery. Forests 2018, 9, 432. [Google Scholar] [CrossRef] [Green Version]
  88. Goodbody, T.R.H.; Coops, N.C.; Hermosilla, T.; Tompalski, P.; Crawford, P. Assessing the status of forest regeneration using digital aerial photogrammetry and unmanned aerial systems. Int. J. Remote Sens. 2018, 39, 5246–5264. [Google Scholar] [CrossRef]
  89. Iizuka, K.; Yonehara, T.; Itoh, M.; Kosugi, Y. Estimating tree height and diameter at breast height (DBH) from digital surface models and orthophotos obtained with an unmanned aerial system for a Japanese cypress (Chamaecyparis obtusa) forest. Remote Sens. 2018, 10, 13. [Google Scholar] [CrossRef] [Green Version]
  90. Shin, P.; Sankey, T.; Moore, M.M.; Thode, A.E. Evaluating unmanned aerial vehicle images for estimating forest canopy fuels in a ponderosa pine stand. Remote Sens. 2018, 10, 1266. [Google Scholar] [CrossRef] [Green Version]
  91. Webster, C.; Westoby, M.; Rutter, N.; Jonas, T. Three-dimensional thermal characterization of forest canopies using UAV photogrammetry. Remote Sens. Environ. 2018, 209, 835–847. [Google Scholar] [CrossRef] [Green Version]
  92. Durfee, N.; Ochoa, C.G.; Mata-Gonzalez, R. The use of low-altitude UAV imagery to assess western juniper density and canopy cover in treated and untreated stands. Forests 2019, 10, 296. [Google Scholar] [CrossRef] [Green Version]
  93. Gülci, S. The determination of some stand parameters using SfM-based spatial 3D point cloud in forestry studies: An analysis of data production in pure coniferous young forest stands. Environ. Monit. Assess. 2019, 191. [Google Scholar] [CrossRef]
  94. He, H.; Yan, Y.; Chen, T.; Cheng, P. Tree height estimation of forest plantation in mountainous terrain from bare-earth points using a DoG-coupled radial basis function neural network. Remote Sens. 2019, 11, 1271. [Google Scholar] [CrossRef] [Green Version]
  95. Imangholiloo, M.; Saarinen, N.; Markelin, L.; Rosnell, T.; Näsi, R.; Hakala, T.; Honkavaara, E.; Holopainen, M.; Hyyppä, J.; Vastaranta, M. Characterizing seedling stands using leaf-off and leaf-on photogrammetric point clouds and hyperspectral imagery acquired from unmanned aerial vehicle. Forests 2019, 10, 415. [Google Scholar] [CrossRef] [Green Version]
  96. Krause, S.; Sanders, T.G.M.; Mund, J.P.; Greve, K. UAV-based photogrammetric tree height measurement for intensive forest monitoring. Remote Sens. 2019, 11, 758. [Google Scholar] [CrossRef] [Green Version]
  97. Lendzioch, T.; Langhammer, J.; Jenicek, M. Estimating snow depth and leaf area index based on UAV digital photogrammetry. Sensors 2019, 19, 1027. [Google Scholar] [CrossRef] [Green Version]
  98. Maturbongs, B.Y.L.; Wing, M.G.; Strîmbu, B.M.; Burnett, J. Forest inventory sensitivity to UAS-based image processing algorithms. Ann. For. Res. 2019, 62, 87–108. [Google Scholar] [CrossRef]
  99. Puliti, S.; Solberg, S.; Granhus, A. Use of UAV photogrammetric data for estimation of biophysical properties in forest stands under regeneration. Remote Sens. 2019, 11, 233. [Google Scholar] [CrossRef] [Green Version]
  100. Santini, F.; Kefauver, S.C.; Resco de Dios, V.; Araus, J.L.; Voltas, J. Using unmanned aerial vehicle-based multispectral, RGB and thermal imagery for phenotyping of forest genetic trials: A case study in Pinus halepensis. Ann. Appl. Biol. 2019, 174, 262–276. [Google Scholar] [CrossRef] [Green Version]
  101. Santini, F.; Serrano, L.; Kefauver, S.C.; Abdullah-Al, M.; Aguilera, M.; Sin, E.; Voltas, J. Morpho-physiological variability of Pinus nigra populations reveals climate-driven local adaptation but weak water use differentiation. Environ. Exp. Bot. 2019, 166, 103828. [Google Scholar] [CrossRef]
  102. Tian, J.; Dai, T.; Li, H.; Liao, C.; Teng, W.; Hu, Q.; Ma, W.; Xu, Y. A novel tree height extraction approach for individual trees by combining TLS and UAV image-based point cloud integration. Forests 2019, 10, 537. [Google Scholar] [CrossRef] [Green Version]
  103. D’Odorico, P.; Besik, A.; Wong, C.Y.S.; Isabel, N.; Ensminger, I. High-throughput drone-based remote sensing reliably tracks phenology in thousands of conifer seedlings. New Phytol. 2020, 226, 1667–1681. [Google Scholar] [CrossRef] [PubMed]
  104. Du Toit, F.; Coops, N.C.; Tompalski, P.; Goodbody, T.R.H.; El-Kassaby, Y.A.; Stoehr, M.; Turner, D.; Lucieer, A. Characterizing variations in growth characteristics between Douglas-fir with different genetic gain levels using airborne laser scanning. Trees Struct. Funct. 2020, 34, 649–664. [Google Scholar] [CrossRef]
  105. Hu, X.; Li, D. Research on a Single-Tree Point Cloud Segmentation Method Based on UAV Tilt Photography and Deep Learning Algorithm. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 4111–4120. [Google Scholar] [CrossRef]
  106. Li, L.; Chen, J.; Mu, X.; Li, W.; Yan, G.; Xie, D.; Zhang, W. Quantifying understory and overstory vegetation cover using UAV-based RGB imagery in forest plantation. Remote Sens. 2020, 12, 298. [Google Scholar] [CrossRef] [Green Version]
  107. Lin, J.; Wang, M.; Ma, M.; Lin, Y. Aboveground tree biomass estimation of sparse subalpine coniferous forest with UAV oblique photography. Remote Sens. 2018, 10, 849. [Google Scholar] [CrossRef] [Green Version]
  108. Windrim, L.; Bryson, M.; McLean, M.; Randle, J.; Stone, C. Automated mapping of woody debris over harvested forest plantations using UAVs, high-resolution imagery, and machine learning. Remote Sens. 2019, 11, 733. [Google Scholar] [CrossRef] [Green Version]
  109. Zou, X.; Liang, A.; Wu, B.; Su, J.; Zheng, R.; Li, J. UAV-based high-throughput approach for fast growing Cunninghamia lanceolata (Lamb.) cultivar screening by machine learning. Forests 2019, 10, 815. [Google Scholar] [CrossRef] [Green Version]
  110. Iizuka, K.; Hayakawa, Y.S.; Ogura, T.; Nakata, Y.; Kosugi, Y.; Yonehara, T. Integration of multi-sensor data to estimate plot-level stem volume using machine learning algorithms—Case study of evergreen conifer planted forests in Japan. Remote Sens. 2020, 12, 1649. [Google Scholar] [CrossRef]
  111. Puliti, S.; Dash, J.P.; Watt, M.S.; Breidenbach, J.; Pearse, G.D. A comparison of UAV laser scanning, photogrammetry and airborne laser scanning for precision inventory of small-forest properties. Forestry 2020, 93, 150–162. [Google Scholar] [CrossRef]
  112. Yrttimaa, T.; Saarinen, N.; Kankare, V.; Viljanen, N.; Hynynen, J.; Huuskonen, S.; Holopainen, M.; Hyyppä, J.; Honkavaara, E.; Vastaranta, M. Multisensorial close-range sensing generates benefits for characterization of managed Scots pine (Pinus sylvestris L.) stands. ISPRS Int. J. Geo-Inf. 2020, 9, 309. [Google Scholar] [CrossRef]
  113. Brovkina, O.; Cienciala, E.; Surový, P.; Janata, P. Unmanned aerial vehicles (UAV) for assessment of qualitative classification of Norway spruce in temperate forest stands. Geo-Spatial Inf. Sci. 2018, 21, 12–20. [Google Scholar] [CrossRef] [Green Version]
  114. Dash, J.P.; Pearse, G.D.; Watt, M.S. UAV multispectral imagery can complement satellite data for monitoring forest health. Remote Sens. 2018, 10, 1216. [Google Scholar] [CrossRef] [Green Version]
  115. Näsi, R.; Honkavaara, E.; Blomqvist, M.; Lyytikäinen-Saarenmaa, P.; Hakala, T.; Viljanen, N.; Kantola, T.; Holopainen, M. Remote sensing of bark beetle damage in urban forests at individual tree level using a novel hyperspectral camera from UAV and aircraft. Urban For. Urban Green. 2018, 30, 72–83. [Google Scholar] [CrossRef]
  116. Jung, K.Y.; Park, J.K. Analysis of vegetation infection information using unmanned aerial vehicle with optical sensor. Sens. Mater. 2019, 31, 3319–3326. [Google Scholar] [CrossRef]
  117. Smigaj, M.; Gaulton, R.; Suárez, J.C.; Barr, S.L. Canopy temperature from an Unmanned Aerial Vehicle as an indicator of tree stress associated with red band needle blight severity. For. Ecol. Manag. 2019, 433, 699–708. [Google Scholar] [CrossRef]
  118. Fernández-Guisuraga, J.M.; Sanz-Ablanedo, E.; Suárez-Seoane, S.; Calvo, L. Using unmanned aerial vehicles in postfire vegetation survey campaigns through large and heterogeneous areas: Opportunities and challenges. Sensors 2018, 18, 586. [Google Scholar] [CrossRef] [Green Version]
  119. Nagai, S.; Saitoh, T.M.; Kajiwara, K.; Yoshitake, S.; Honda, Y. Investigation of the potential of drone observations for detection of forest disturbance caused by heavy snow damage in a Japanese cedar (Cryptomeria japonica) forest. J. Agric. Meteorol. 2018, 74, 123–127. [Google Scholar] [CrossRef] [Green Version]
  120. Belmonte, A.; Sankey, T.; Biederman, J.A.; Bradford, J.; Goetz, S.J.; Kolb, T.; Woolley, T. UAV-derived estimates of forest structure to inform ponderosa pine forest restoration. Remote Sens. Ecol. Conserv. 2019, 6, 181–197. [Google Scholar] [CrossRef]
  121. Polewski, P.; Yao, W.; Cao, L.; Gao, S. Marker-free coregistration of UAV and backpack LiDAR point clouds in forested areas. ISPRS J. Photogramm. Remote Sens. 2019, 147, 307–318. [Google Scholar] [CrossRef]
  122. Hentz, Â.M.K.; Silva, C.A.; Dalla Corte, A.P.; Netto, S.P.; Strager, M.P.; Klauberg, C. Estimating forest uniformity in Eucalyptus spp. and Pinus taeda L. stands using field measurements and structure from motion point clouds generated from unmanned aerial vehicle (UAV) data collection. For. Syst. 2018, 27, 1. [Google Scholar] [CrossRef]
  123. Huang, H.; Li, X.; Chen, C. Individual tree crown detection and delineation from very-high-resolution UAV images based on bias field and marker-controlled watershed segmentation algorithms. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 2253–2262. [Google Scholar] [CrossRef]
  124. Kuželka, K.; Surový, P. Mapping forest structure using UAS inside flight capabilities. Sensors 2018, 18, 2245. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  125. Yan, W.; Guan, H.; Cao, L.; Yu, Y.; Gao, S.; Lu, J.Y. An automated hierarchical approach for three-dimensional segmentation of single trees using UAV LiDAR data. Remote Sens. 2018, 10, 1999. [Google Scholar] [CrossRef] [Green Version]
  126. Li, J.; Yang, B.; Cong, Y.; Cao, L.; Fu, X.; Dong, Z. 3D forest mapping using a low-cost UAV laser scanning system: Investigation and comparison. Remote Sens. 2019, 11, 717. [Google Scholar] [CrossRef] [Green Version]
  127. Yan, W.; Guan, H.; Cao, L.; Yu, Y.; Li, C.; Lu, J.Y. A self-adaptive mean shift tree-segmentation method using UAV LiDAR data. Remote Sens. 2020, 12, 515. [Google Scholar] [CrossRef] [Green Version]
  128. Khokthong, W.; Zemp, D.C.; Irawan, B.; Sundawati, L.; Kreft, H.; Hölscher, D. Drone-Based Assessment of Canopy Cover for Analyzing Tree Mortality in an Oil Palm Agroforest. Front. For. Glob. Chang. 2019, 2, 12. [Google Scholar] [CrossRef] [Green Version]
  129. Puliti, S.; Talbot, B.; Astrup, R. Tree-stump detection, segmentation, classification, and measurement using Unmanned aerial vehicle (UAV) imagery. Forests 2018, 9, 102. [Google Scholar] [CrossRef] [Green Version]
  130. Wu, X.; Shen, X.; Cao, L.; Wang, G.; Cao, F. Assessment of individual tree detection and canopy cover estimation using unmanned aerial vehicle based light detection and ranging (UAV-LiDAR) data in planted forests. Remote Sens. 2019, 11, 908. [Google Scholar] [CrossRef] [Green Version]
  131. Chen, S.W.; Nardari, G.V.; Lee, E.S.; Qu, C.; Liu, X.; Romero, R.A.F.; Kumar, V. SLOAM: Semantic lidar odometry and mapping for forest inventory. IEEE Robot. Autom. Lett. 2019, 5, 612–619. [Google Scholar] [CrossRef] [Green Version]
  132. Liu, K.; Shen, X.; Cao, L.; Wang, G.; Cao, F. Estimating forest structural attributes using UAV-LiDAR data in Ginkgo plantations. ISPRS J. Photogramm. Remote Sens. 2018, 146, 465–482. [Google Scholar] [CrossRef]
  133. Shen, X.; Cao, L.; Yang, B.; Xu, Z.; Wang, G. Estimation of forest structural attributes using spectral indices and point clouds from UAS-based multispectral and RGB imageries. Remote Sens. 2019, 11, 800. [Google Scholar] [CrossRef] [Green Version]
  134. Whiteside, T.G.; Bartolo, R.E. A robust object-based woody cover extraction technique for monitoring mine site revegetation at scale in the monsoonal tropics using multispectral RPAS imagery from different sensors. Int. J. Appl. Earth Obs. Geoinf. 2018, 73, 300–312. [Google Scholar] [CrossRef]
  135. Goodbody, T.R.H.; Coops, N.C.; Hermosilla, T.; Tompalski, P.; Pelletier, G. Vegetation phenology driving error variation in digital aerial photogrammetrically derived Terrain Models. Remote Sens. 2018, 10, 1554. [Google Scholar] [CrossRef] [Green Version]
  136. Ruwaimana, M.; Satyanarayana, B.; Otero, V.; Muslim, A.M.; Muhammad Syafiq, A.; Ibrahim, S.; Raymaekers, D.; Koedam, N.; Dahdouh-Guebas, F. The advantages of using drones over space-borne imagery in the mapping of mangrove forests. PLoS ONE 2018, 13, e0200288. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  137. Oliveira, R.A.; Tommaselli, A.M.G.; Honkavaara, E. Generating a hyperspectral digital surface model using a hyperspectral 2D frame camera. ISPRS J. Photogramm. Remote Sens. 2019, 147, 345–360. [Google Scholar] [CrossRef]
  138. Fletcher, A.; Mather, R. Hypertemporal imaging capability of UAS improves photogrammetric tree canopy models. Remote Sens. 2020, 12, 1238. [Google Scholar] [CrossRef] [Green Version]
  139. Jurjević, L.; Gašparović, M.; Milas, A.S.; Balenović, I. Impact of UAS image orientation on accuracy of forest inventory attributes. Remote Sens. 2020, 12, 404. [Google Scholar] [CrossRef] [Green Version]
  140. Alexander, C.; Korstjens, A.H.; Hankinson, E.; Usher, G.; Harrison, N.; Nowak, M.G.; Abdullah, A.; Wich, S.A.; Hill, R.A. Locating emergent trees in a tropical rainforest using data from an Unmanned Aerial Vehicle (UAV). Int. J. Appl. Earth Obs. Geoinf. 2018, 72, 86–90. [Google Scholar] [CrossRef]
  141. Bagaram, M.B.; Giuliarelli, D.; Chirici, G.; Giannetti, F.; Barbati, A. UAV remote sensing for biodiversity monitoring: Are forest canopy gaps good covariates? Remote Sens. 2018, 10, 1397. [Google Scholar] [CrossRef]
  142. Chen, S.Y.; Lin, C.; Tai, C.H.; Chuang, S.J. Adaptive window-based constrained energy minimization for detection of newly grown tree leaves. Remote Sens. 2018, 10, 96. [Google Scholar] [CrossRef] [Green Version]
  143. Kattenborn, T.; Hernández, J.; Lopatin, J.; Kattenborn, G.; Fassnacht, F.E. Pilot study on the retrieval of DBH and diameter distribution of deciduous forest stands using cast shadows in UAV-based orthomosaics. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 4, 93–99. [Google Scholar] [CrossRef] [Green Version]
  144. Klosterman, S.; Melaas, E.; Wang, J.; Martinez, A.; Frederick, S.; O’Keefe, J.; Orwig, D.A.; Wang, Z.; Sun, Q.; Schaaf, C.; et al. Fine-scale perspectives on landscape phenology from unmanned aerial vehicle (UAV) photography. Agric. For. Meteorol. 2018, 248, 397–407. [Google Scholar] [CrossRef]
  145. Lin, C.; Chen, S.Y.; Chen, C.C.; Tai, C.H. Detecting newly grown tree leaves from unmanned-aerial-vehicle images using hyperspectral target detection techniques. ISPRS J. Photogramm. Remote Sens. 2018, 142, 174–189. [Google Scholar] [CrossRef]
  146. Mayr, M.J.; Malß, S.; Ofner, E.; Samimi, C. Disturbance feedbacks on the height of woody vegetation in a savannah: A multi-plot assessment using an unmanned aerial vehicle (UAV). Int. J. Remote Sens. 2018, 39, 4761–4785. [Google Scholar] [CrossRef]
  147. Morales, G.; Kemper, G.; Sevillano, G.; Arteaga, D.; Ortega, I.; Telles, J. Automatic segmentation of Mauritia flexuosa in unmanned aerial vehicle (UAV) imagery using deep learning. Forests 2018, 9, 736. [Google Scholar] [CrossRef] [Green Version]
  148. Roşca, S.; Suomalainen, J.; Bartholomeus, H.; Herold, M. Comparing terrestrial laser scanning and unmanned aerial vehicle structure from motion to assess top of canopy structure in tropical forests. Interface Focus 2018, 8. [Google Scholar] [CrossRef]
  149. Thomson, E.R.; Malhi, Y.; Bartholomeus, H.; Oliveras, I.; Gvozdevaite, A.; Peprah, T.; Suomalainen, J.; Quansah, J.; Seidu, J.; Adonteng, C.; et al. Mapping the leaf economic spectrum across West African tropical forests using UAV-Acquired hyperspectral imagery. Remote Sens. 2018, 10, 1532. [Google Scholar] [CrossRef] [Green Version]
  150. Dos Santos, A.A.; Marcato Junior, J.; Araújo, M.S.; Di Martini, D.R.; Tetila, E.C.; Siqueira, H.L.; Aoki, C.; Eltner, A.; Matsubara, E.T.; Pistori, H.; et al. Assessment of CNN-based methods for individual tree detection on images captured by RGB cameras attached to UAVs. Sensors 2019, 19, 3595. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  151. Park, J.Y.; Muller-Landau, H.C.; Lichstein, J.W.; Rifai, S.W.; Dandois, J.P.; Bohlman, S.A. Quantifying leaf phenology of individual trees and species in a tropical forest using unmanned aerial vehicle (UAV) images. Remote Sens. 2019, 11, 1534. [Google Scholar] [CrossRef] [Green Version]
  152. Schneider, F.D.; Kükenbrink, D.; Schaepman, M.E.; Schimel, D.S.; Morsdorf, F. Quantifying 3D structure and occlusion in dense tropical and temperate forests using close-range LiDAR. Agric. For. Meteorol. 2019, 268, 249–257. [Google Scholar] [CrossRef]
  153. Xu, J.; Gu, H.; Meng, Q.; Cheng, J.; Liu, Y.; Jiang, P.; Sheng, J.; Deng, J.; Bai, X. Spatial pattern analysis of Haloxylon ammodendron using UAV imagery—A case study in the Gurbantunggut Desert. Int. J. Appl. Earth Obs. Geoinf. 2019, 83, 101891. [Google Scholar] [CrossRef]
  154. Dong, T.; Zhang, X.; Ding, Z.; Fan, J. Multi-layered tree crown extraction from LiDAR data using graph-based segmentation. Comput. Electron. Agric. 2020, 170, 105213. [Google Scholar] [CrossRef]
  155. Krisanski, S.; Taskhiri, M.S.; Turner, P. Enhancing methods for under-canopy unmanned aircraft system based photogrammetry in complex forests for tree diameter measurement. Remote Sens. 2020, 12, 1652. [Google Scholar] [CrossRef]
  156. Moe, K.T.; Owari, T.; Furuya, N.; Hiroshima, T. Comparing individual tree height information derived from field surveys, LiDAR and UAV-DAP for high-value timber species in Northern Japan. Forests 2020, 11, 223. [Google Scholar] [CrossRef] [Green Version]
  157. Otero, V.; Van De Kerchove, R.; Satyanarayana, B.; Martínez-Espinosa, C.; Fisol, M.A.B.; Ibrahim, M.R.B.; Sulong, I.; Mohd-Lokman, H.; Lucas, R.; Dahdouh-Guebas, F. Managing mangrove forests from the sky: Forest inventory using field data and Unmanned Aerial Vehicle (UAV) imagery in the Matang Mangrove Forest Reserve, peninsular Malaysia. For. Ecol. Manag. 2018, 411, 35–45. [Google Scholar] [CrossRef]
  158. Domingo, D.; Ørka, H.O.; Næsset, E.; Kachamba, D.; Gobakken, T. Effects of UAV image resolution, camera type, and image overlap on accuracy of biomass predictions in a tropical woodland. Remote Sens. 2019, 11, 948. [Google Scholar] [CrossRef] [Green Version]
  159. González-Jaramillo, V.; Fries, A.; Bendix, J. AGB estimation in a tropical mountain forest (TMF) by means of RGB and multispectral images using an unmanned aerial vehicle (UAV). Remote Sens. 2019, 11, 1413. [Google Scholar] [CrossRef] [Green Version]
  160. Li, Z.; Zan, Q.; Yang, Q.; Zhu, D.; Chen, Y.; Yu, S. Remote estimation of mangrove aboveground carbon stock at the species level using a low-cost unmanned aerial vehicle system. Remote Sens. 2019, 11, 1018. [Google Scholar] [CrossRef] [Green Version]
  161. Ota, T.; Ahmed, O.S.; Minn, S.T.; Khai, T.C.; Mizoue, N.; Yoshida, S. Estimating selective logging impacts on aboveground biomass in tropical forests using digital aerial photography obtained before and after a logging event from an unmanned aerial vehicle. For. Ecol. Manag. 2019, 433, 162–169. [Google Scholar] [CrossRef]
  162. Qiu, P.; Wang, D.; Zou, X.; Yang, X.; Xie, G.; Xu, S.; Zhong, Z. Finer resolution estimation and mapping of mangrove biomass using UAV LiDAR and worldview-2 data. Forests 2019, 10, 871. [Google Scholar] [CrossRef] [Green Version]
  163. Tian, J.; Wang, L.; Li, X.; Yin, D.; Gong, H.; Nie, S.; Shi, C.; Zhong, R.; Liu, X.; Xu, R. Canopy height layering biomass estimation model (CHL-BEM) with full-waveform LiDAR. Remote Sens. 2019, 11, 1446. [Google Scholar] [CrossRef] [Green Version]
  164. Swinfield, T.; Lindsell, J.A.; Williams, J.V.; Harrison, R.D.; Gemita, E.; Schönlieb, C.B.; Coomes, D.A. Accurate Measurement of Tropical Forest Canopy Heights and Aboveground Carbon Using Structure From Motion. Remote Sens. 2019, 11, 928. [Google Scholar] [CrossRef] [Green Version]
  165. Vaglio Laurin, G.; Ding, J.; Disney, M.; Bartholomeus, H.; Herold, M.; Papale, D.; Valentini, R. Tree height in tropical forest as measured by different ground, proximal, and remote sensing instruments, and impacts on above ground biomass estimates. Int. J. Appl. Earth Obs. Geoinf. 2019, 82, 101899. [Google Scholar] [CrossRef]
  166. Xu, Z.; Li, W.; Li, Y.; Shen, X.; Ruan, H. Estimation of secondary forest parameters by integrating image and point cloud-based metrics acquired from unmanned aerial vehicle. J. Appl. Remote Sens. 2019, 14, 1. [Google Scholar] [CrossRef] [Green Version]
  167. Wang, D.; Wan, B.; Qiu, P.; Zuo, Z.; Wang, R.; Wu, X. Mapping height and aboveground biomass of mangrove forests on Hainan Island using UAV-LiDAR sampling. Remote Sens. 2019, 11, 2156. [Google Scholar] [CrossRef] [Green Version]
  168. Di Gennaro, S.F.; Nati, C.; Dainelli, R.; Pastonchi, L.; Berton, A.; Toscano, P.; Matese, A. An automatic UAV based segmentation approach for pruning biomass estimation in irregularly spaced chestnut orchards. Forests 2020, 11, 308. [Google Scholar] [CrossRef] [Green Version]
  169. Jones, A.R.; Raja Segaran, R.; Clarke, K.D.; Waycott, M.; Goh, W.S.H.; Gillanders, B.M. Estimating Mangrove Tree Biomass and Carbon Content: A Comparison of Forest Inventory Techniques and Drone Imagery. Front. Mar. Sci. 2020, 6, 784. [Google Scholar] [CrossRef] [Green Version]
  170. Navarro, A.; Young, M.; Allan, B.; Carnell, P.; Macreadie, P.; Ierodiaconou, D. The application of Unmanned Aerial Vehicles (UAVs) to estimate above-ground biomass of mangrove ecosystems. Remote Sens. Environ. 2020, 242, 111747. [Google Scholar] [CrossRef]
  171. Wang, D.; Wan, B.; Liu, J.; Su, Y.; Guo, Q.; Qiu, P.; Wu, X. Estimating aboveground biomass of the mangrove forests on northeast Hainan Island in China using an upscaling method from field plots, UAV-LiDAR data and Sentinel-2 imagery. Int. J. Appl. Earth Obs. Geoinf. 2020, 85, 101986. [Google Scholar] [CrossRef]
  172. Zhu, Y.; Liu, K.; Myint, S.W.; Du, Z.; Li, Y.; Cao, J.; Liu, L.; Wu, Z. Integration of GF2 optical, GF3 SAR, and UAV data for estimating aboveground biomass of China’s largest artificially planted mangroves. Remote Sens. 2020, 12, 2039. [Google Scholar] [CrossRef]
  173. Cao, J.; Leng, W.; Liu, K.; Liu, L.; He, Z.; Zhu, Y. Object-Based mangrove species classification using unmanned aerial vehicle hyperspectral images and digital surface models. Remote Sens. 2018, 10, 89. [Google Scholar] [CrossRef] [Green Version]
  174. De Sá, N.C.; Castro, P.; Carvalho, S.; Marchante, E.; López-Núñez, F.A.; Marchante, H. Mapping the flowering of an invasive plant using unmanned aerial vehicles: Is there potential for biocontrol monitoring? Front. Plant Sci. 2018, 9, 293. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  175. Franklin, S.E.; Ahmed, O.S. Deciduous tree species classification using object-based analysis and machine learning with unmanned aerial vehicle multispectral data. Int. J. Remote Sens. 2018, 39, 5236–5245. [Google Scholar] [CrossRef]
  176. Liu, C.; Ai, M.; Chen, Z.; Zhou, Y.; Wu, H. Detection of Firmiana danxiaensis Canopies by a Customized Imaging System Mounted on an UAV Platform. J. Sens. 2018, 2018. [Google Scholar] [CrossRef] [Green Version]
  177. Sothe, C.; Dalponte, M.; de Almeida, C.M.; Schimalski, M.B.; Lima, C.L.; Liesenberg, V.; Miyoshi, G.T.; Tommaselli, A.M.G. Tree species classification in a highly diverse subtropical forest integrating UAV-based photogrammetric point cloud and hyperspectral data. Remote Sens. 2019, 11, 1338. [Google Scholar] [CrossRef] [Green Version]
  178. Waite, C.E.; van der Heijden, G.M.F.; Field, R.; Boyd, D.S. A view from above: Unmanned aerial vehicles (UAVs) provide a new tool for assessing liana infestation in tropical forest canopies. J. Appl. Ecol. 2019, 56, 902–912. [Google Scholar] [CrossRef]
  179. Wu, Z.; Ni, M.; Hu, Z.; Wang, J.; Li, Q.; Wu, G. Mapping invasive plant with UAV-derived 3D mesh model in mountain area—A case study in Shenzhen Coast, China. Int. J. Appl. Earth Obs. Geoinf. 2019, 77, 129–139. [Google Scholar] [CrossRef]
  180. Yaney-Keller, A.; Tomillo, P.S.; Marshall, J.M.; Paladino, F.V. Using unmanned aerial systems (UAS) to assay mangrove estuaries on the Pacific coast of Costa Rica. PLoS ONE 2019, 14, e0217310. [Google Scholar] [CrossRef]
  181. Yuan, X.; Laakso, K.; Marzahn, P.; Sanchez-Azofeifa, G.A. Canopy Temperature Differences between Liana-Infested and Non-Liana Infested Areas in a Neotropical Dry Forest. Forests 2019, 10, 890. [Google Scholar] [CrossRef] [Green Version]
  182. Kentsch, S.; Caceres, M.L.L.; Serrano, D.; Roure, F.; Diez, Y. Computer vision and deep learning techniques for the analysis of drone-acquired forest images, a transfer learning study. Remote Sens. 2020, 12, 1287. [Google Scholar] [CrossRef] [Green Version]
  183. Miyoshi, G.T.; Arruda, M.D.S.; Osco, L.P.; Junior, J.M.; Gonçalves, D.N.; Imai, N.N.; Tommaselli, A.M.G.; Honkavaara, E.; Gonçalves, W.N. A novel deep learning method to identify single tree species in UAV-based hyperspectral images. Remote Sens. 2020, 12, 1294. [Google Scholar] [CrossRef] [Green Version]
  184. Rupasinghe, P.A.; Simic Milas, A.; Arend, K.; Simonson, M.A.; Mayer, C.; Mackey, S. Classification of shoreline vegetation in the Western Basin of Lake Erie using airborne hyperspectral imager HSI2, Pleiades and UAV data. Int. J. Remote Sens. 2019, 40, 3008–3028. [Google Scholar] [CrossRef]
  185. De Luca, G.; Silva, J.M.N.; Cerasoli, S.; Araújo, J.; Campos, J.; Di Fazio, S.; Modica, G. Object-based land cover classification of cork oak woodlands using UAV imagery and Orfeo Toolbox. Remote Sens. 2019, 11, 1238. [Google Scholar] [CrossRef] [Green Version]
  186. Rossi, F.; Becker, G. Creating forest management units with Hot Spot Analysis (Getis-Ord Gi*) over a forest affected by mixed-severity fires. Aust. For. 2019, 82, 166–175. [Google Scholar] [CrossRef]
  187. Frey, J.; Kovach, K.; Stemmler, S.; Koch, B. UAV photogrammetry of forests as a vulnerable process. A sensitivity analysis for a structure from motion RGB-image pipeline. Remote Sens. 2018, 10, 912. [Google Scholar] [CrossRef] [Green Version]
  188. Jayathunga, S.; Owari, T.; Tsuyuki, S. Evaluating the performance of photogrammetric products using fixed-wing UAV imagery over a mixed conifer-broadleaf forest: Comparison with airborne laser scanning. Remote Sens. 2018, 10, 187. [Google Scholar] [CrossRef] [Green Version]
  189. Ni, W.; Sun, G.; Pang, Y.; Zhang, Z.; Liu, J.; Yang, A.; Wang, Y.; Zhang, D. Mapping Three-Dimensional Structures of Forest Canopy Using UAV Stereo Imagery: Evaluating Impacts of Forward Overlaps and Image Resolutions With LiDAR Data as Reference. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 3578–3589. [Google Scholar] [CrossRef]
  190. Graham, A.; Coops, N.C.; Wilcox, M.; Plowright, A. Evaluation of ground surface models derived from unmanned aerial systems with digital aerial photogrammetry in a disturbed conifer forest. Remote Sens. 2019, 11, 84. [Google Scholar] [CrossRef] [Green Version]
  191. Graham, A.N.V.; Coops, N.C.; Tompalski, P.; Plowright, A.; Wilcox, M. Effect of ground surface interpolation methods on the accuracy of forest attribute modelling using unmanned aerial systems-based digital aerial photogrammetry. Int. J. Remote Sens. 2020, 41, 3287–3306. [Google Scholar] [CrossRef]
  192. Fankhauser, K.E.; Strigul, N.S.; Gatziolis, D. Augmentation of traditional forest inventory and Airborne laser scanning with unmanned aerial systems and photogrammetry for forest monitoring. Remote Sens. 2018, 10, 1562. [Google Scholar] [CrossRef] [Green Version]
  193. Brieger, F.; Herzschuh, U.; Pestryakova, L.A.; Bookhagen, B.; Zakharov, E.S.; Kruse, S. Advances in the derivation of Northeast Siberian forest metrics using high-resolution UAV-based photogrammetric point clouds. Remote Sens. 2019, 11, 1447. [Google Scholar] [CrossRef] [Green Version]
  194. Panagiotidis, D.; Abdollahnejad, A.; Surový, P.; Kuželka, K. Detection of fallen logs from high-resolution UAV images. New Zeal. J. For. Sci. 2019, 49. [Google Scholar] [CrossRef]
  195. St-Onge, B.; Grandin, S. Estimating the height and basal area at individual tree and plot levels in Canadian subarctic lichen woodlands using stereo worldview-3 images. Remote Sens. 2019, 11, 248. [Google Scholar] [CrossRef] [Green Version]
  196. Xu, N.; Tian, J.; Tian, Q.; Xu, K.; Tang, S. Analysis of vegetation red edge with different illuminated/shaded canopy proportions and to construct normalized difference canopy shadow index. Remote Sens. 2019, 11, 1192. [Google Scholar] [CrossRef] [Green Version]
  197. Yancho, J.M.M.; Coops, N.C.; Tompalski, P.; Goodbody, T.R.H.; Plowright, A. Fine-Scale Spatial and Spectral Clustering of UAV-Acquired Digital Aerial Photogrammetric (DAP) Point Clouds for Individual Tree Crown Detection and Segmentation. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 2811. [Google Scholar] [CrossRef]
  198. Yilmaz, V.; Güngör, O. Estimating crown diameters in urban forests with Unmanned Aerial System-based photogrammetric point clouds. Int. J. Remote Sens. 2019, 40, 468–505. [Google Scholar] [CrossRef]
  199. Jin, C.; Oh, C.Y.; Shin, S.; Njungwi, N.W.; Choi, C. A comparative study to evaluate accuracy on canopy height and density using UAV, ALS, and fieldwork. Forests 2020, 11, 241. [Google Scholar] [CrossRef] [Green Version]
  200. Fujimoto, A.; Haga, C.; Matsui, T.; Machimura, T.; Hayashi, K.; Sugita, S.; Takagi, H. An end to end process development for UAV-SfM based forest monitoring: Individual tree detection, species classification and carbon dynamics simulation. Forests 2019, 10, 680. [Google Scholar] [CrossRef] [Green Version]
  201. Zhou, X.; Zhang, X. Individual tree parameters estimation for plantation forests based on UAV oblique photography. IEEE Access 2020, 8, 96184–96198. [Google Scholar] [CrossRef]
  202. Ganthaler, A.; Losso, A.; Mayr, S. Using image analysis for quantitative assessment of needle bladder rust disease of Norway spruce. Plant Pathol. 2018, 67, 1122–1130. [Google Scholar] [CrossRef] [Green Version]
  203. Otsu, K.; Pla, M.; Vayreda, J.; Brotons, L. Calibrating the severity of forest defoliation by pine processionary moth with Landsat and UAV imagery. Sensors 2018, 18, 3278. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  204. Barmpoutis, P.; Stathaki, T.; Kamperidou, V. Monitoring of Trees’ Health Condition Using a UAV Equipped with Low-cost Digital Camera. In Proceedings of the 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Brighton, UK, 12–17 May 2019; pp. 8291–8295. [Google Scholar] [CrossRef]
  205. Lee, K.-W.; Park, J.-K. Economic Evaluation of Unmanned Aerial Vehicle for Forest Pest Monitoring. J. Korea Acad. Coop. Soc. 2019, 20, 440–446. [Google Scholar] [CrossRef]
  206. Röder, M.; Latifi, H.; Hill, S.; Wild, J.; Svoboda, M.; Brůna, J.; Macek, M.; Nováková, M.H.; Gülch, E.; Heurich, M. Application of optical unmanned aerial vehicle-based imagery for the inventory of natural regeneration and standing deadwood in post-disturbed spruce forests. Int. J. Remote Sens. 2018, 39, 5288–5309. [Google Scholar] [CrossRef]
  207. Fromm, M.; Schubert, M.; Castilla, G.; Linke, J.; McDermid, G. Automated detection of conifer seedlings in drone imagery using convolutional neural networks. Remote Sens. 2019, 11, 2585. [Google Scholar] [CrossRef] [Green Version]
  208. Fraser, B.T.; Congalton, R.G. Issues in Unmanned Aerial Systems (UAS) data collection of complex forest environments. Remote Sens. 2018, 10, 908. [Google Scholar] [CrossRef] [Green Version]
  209. Kellner, J.R.; Armston, J.; Birrer, M.; Cushman, K.C.; Duncanson, L.; Eck, C.; Falleger, C.; Imbach, B.; Král, K.; Krůček, M.; et al. New opportunities for forest remote sensing through ultra-high-density drone lidar. Surv. Geophys. 2019, 40, 959–977. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  210. Seifert, E.; Seifert, S.; Vogt, H.; Drew, D.; van Aardt, J.; Kunneke, A.; Seifert, T. Influence of drone altitude, image overlap, and optical sensor resolution on multi-view reconstruction of forest images. Remote Sens. 2019, 11, 1252. [Google Scholar] [CrossRef] [Green Version]
  211. Tomaštík, J.; Mokroš, M.; Surový, P.; Grznárová, A.; Merganič, J. UAV RTK/PPK method-An optimal solution for mapping inaccessible forested areas? Remote Sens. 2019, 11, 721. [Google Scholar] [CrossRef] [Green Version]
  212. Wallace, L.; Bellman, C.; Hally, B.; Hernandez, J.; Jones, S.; Hillman, S. Assessing the ability of image based point clouds captured from a UAV to measure the terrain in the presence of canopy cover. Forests 2019, 10, 284. [Google Scholar] [CrossRef] [Green Version]
  213. Yu, R.; Lyu, M.; Lu, J.; Yang, Y.; Shen, G.; Li, F. Spatial coordinates correction based on multi-sensor low-altitude remote sensing image registration for monitoring forest dynamics. IEEE Access 2020, 8, 18483–18496. [Google Scholar] [CrossRef]
  214. Carr, J.C.; Slyder, J.B. Individual tree segmentation from a leaf-off photogrammetric point cloud. Int. J. Remote Sens. 2018, 39, 5195–5210. [Google Scholar] [CrossRef]
  215. Chung, C.H.; Wang, C.H.; Hsieh, H.C.; Huang, C.Y. Comparison of forest canopy height profiles in a mountainous region of Taiwan derived from airborne lidar and unmanned aerial vehicle imagery. GIScience Remote Sens. 2019, 56, 1289–1304. [Google Scholar] [CrossRef]
  216. Huang, H.; He, S.; Chen, C. Leaf abundance affects tree height estimation derived from UAV images. Forests 2019, 10, 931. [Google Scholar] [CrossRef] [Green Version]
  217. Jayathunga, S.; Owari, T.; Tsuyuki, S.; Hirata, Y. Potential of UAV photogrammetry for characterization of forest canopy structure in uneven-aged mixed conifer–broadleaf forests. Int. J. Remote Sens. 2020, 41, 53–73. [Google Scholar] [CrossRef]
  218. Liang, X.; Wang, Y.; Pyörälä, J.; Lehtomäki, M.; Yu, X.; Kaartinen, H.; Kukko, A.; Honkavaara, E.; Issaoui, A.E.I.; Nevalainen, O.; et al. Forest in situ observations using unmanned aerial vehicle as an alternative of terrestrial measurements. For. Ecosyst. 2019, 6. [Google Scholar] [CrossRef] [Green Version]
  219. Nuijten, R.J.G.; Coops, N.C.; Goodbody, T.R.H.; Pelletier, G. Examining the multi-seasonal consistency of individual tree segmentation on deciduous stands using Digital Aerial Photogrammetry (DAP) and unmanned aerial systems (UAS). Remote Sens. 2019, 11, 739. [Google Scholar] [CrossRef] [Green Version]
  220. Rissanen, K.; Martin-Guay, M.O.; Riopel-Bouvier, A.S.; Paquette, A. Light interception in experimental forests affected by tree diversity and structural complexity of dominant canopy. Agric. For. Meteorol. 2019, 278, 107655. [Google Scholar] [CrossRef]
  221. Yurtseven, H.; Akgul, M.; Coban, S.; Gulci, S. Determination and accuracy analysis of individual tree crown parameters using UAV based imagery and OBIA techniques. Meas. J. Int. Meas. Confed. 2019, 145, 651–664. [Google Scholar] [CrossRef]
  222. Zhang, D.; Liu, J.; Ni, W.; Sun, G.; Zhang, Z.; Liu, Q.; Wang, Q. Estimation of Forest Leaf Area Index Using Height and Canopy Cover Information Extracted from Unmanned Aerial Vehicle Stereo Imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 471–481. [Google Scholar] [CrossRef]
  223. Apostol, B.; Petrila, M.; Lorenţ, A.; Ciceu, A.; Gancz, V.; Badea, O. Species discrimination and individual tree detection for predicting main dendrometric characteristics in mixed temperate forests by use of airborne laser scanning and ultra-high-resolution imagery. Sci. Total Environ. 2020, 698. [Google Scholar] [CrossRef]
  224. Balková, M.; Bajer, A.; Patočka, Z.; Mikita, T. Visual exposure of rock outcrops in the context of a forest disease outbreak simulation based on a canopy height model and spectral information acquired by an unmanned aerial vehicle. ISPRS Int. J. Geo-Inf. 2020, 9, 325. [Google Scholar] [CrossRef]
  225. Brüllhardt, M.; Rotach, P.; Schleppi, P.; Bugmann, H. Vertical light transmission profiles in structured mixed deciduous forest canopies assessed by UAV-based hemispherical photography and photogrammetric vegetation height models. Agric. For. Meteorol. 2020, 281, 107843. [Google Scholar] [CrossRef]
  226. Gil-Docampo, M.L.; Ortiz-Sanz, J.; Martínez-Rodríguez, S.; Marcos-Robles, J.L.; Arza-García, M.; Sánchez-Sastre, L.F. Plant survival monitoring with UAVs and multispectral data in difficult access afforested areas. Geocarto Int. 2020, 35, 128–140. [Google Scholar] [CrossRef]
  227. Gu, J.; Grybas, H.; Congalton, R.G. A comparison of forest tree crown delineation from unmanned aerial imagery using canopy height models vs. spectral lightness. Forests 2020, 11, 605. [Google Scholar] [CrossRef]
  228. Hastings, J.H.; Ollinger, S.V.; Ouimette, A.P.; Sanders-DeMott, R.; Palace, M.W.; Ducey, M.J.; Sullivan, F.B.; Basler, D.; Orwig, D.A. Tree species traits determine the success of LiDAR-based crown mapping in a mixed temperate forest. Remote Sens. 2020, 12, 309. [Google Scholar] [CrossRef] [Green Version]
  229. Isibue, E.W.; Pingel, T.J. Unmanned aerial vehicle based measurement of urban forests. Urban For. Urban Green. 2020, 48, 126574. [Google Scholar] [CrossRef]
  230. Jurado, J.M.; Ramos, M.I.; Enríquez, C.; Feito, F.R. The impact of canopy reflectance on the 3D structure of individual trees in a Mediterranean Forest. Remote Sens. 2020, 12, 1430. [Google Scholar] [CrossRef]
  231. Marzahn, P.; Flade, L.; Sanchez-Azofeifa, A. Spatial estimation of the latent heat flux in a tropical dry forest by using unmanned aerial vehicles. Forests 2020, 11, 604. [Google Scholar] [CrossRef]
  232. Vanderwel, M.C.; Lopez, E.L.; Sprott, A.H.; Khayyatkhoshnevis, P.; Shovon, T.A. Using aerial canopy data from UAVs to measure the effects of neighbourhood competition on individual tree growth. For. Ecol. Manag. 2020, 461, 117949. [Google Scholar] [CrossRef]
  233. Alonzo, M.; Andersen, H.E.; Morton, D.C.; Cook, B.D. Quantifying boreal forest structure and composition using UAV structure from motion. Forests 2018, 9, 119. [Google Scholar] [CrossRef] [Green Version]
  234. Giannetti, F.; Chirici, G.; Gobakken, T.; Næsset, E.; Travaglini, D.; Puliti, S. A new approach with DTM-independent metrics for forest growing stock prediction using UAV photogrammetric data. Remote Sens. Environ. 2018, 213, 195–205. [Google Scholar] [CrossRef]
  235. Jayathunga, S.; Owari, T.; Tsuyuki, S. The use of fixed–wing UAV photogrammetry with LiDAR DTM to estimate merchantable volume and carbon stock in living biomass over a mixed conifer–broadleaf forest. Int. J. Appl. Earth Obs. Geoinf. 2018, 73, 767–777. [Google Scholar] [CrossRef]
  236. Puliti, S.; Saarela, S.; Gobakken, T.; Ståhl, G.; Næsset, E. Combining UAV and Sentinel-2 auxiliary data for forest growing stock volume estimation through hierarchical model-based inference. Remote Sens. Environ. 2018, 204, 485–497. [Google Scholar] [CrossRef]
  237. Brede, B.; Calders, K.; Lau, A.; Raumonen, P.; Bartholomeus, H.M.; Herold, M.; Kooistra, L. Non-destructive tree volume estimation through quantitative structure modelling: Comparing UAV laser scanning with terrestrial LIDAR. Remote Sens. Environ. 2019, 233, 111355. [Google Scholar] [CrossRef]
  238. Jayathunga, S.; Owari, T.; Tsuyuki, S. Digital Aerial Photogrammetry for Uneven-Aged Forest Management: Assessing the Potential to Reconstruct Canopy Structure and Estimate Living Biomass. Remote Sens. 2019, 11, 338. [Google Scholar] [CrossRef] [Green Version]
  239. McClelland, M.P.; van Aardt, J.; Hale, D. Manned aircraft versus small unmanned aerial system—Forestry remote sensing comparison utilizing lidar and structure-from-motion for forest carbon modeling and disturbance detection. J. Appl. Remote Sens. 2019, 14, 1. [Google Scholar] [CrossRef]
  240. Ni, W.; Dong, J.; Sun, G.; Zhang, Z.; Pang, Y.; Tian, X.; Li, Z.; Chen, E. Synthesis of leaf-on and leaf-off unmanned aerial vehicle (UAV) stereo imagery for the inventory of aboveground biomass of deciduous forests. Remote Sens. 2019, 11, 889. [Google Scholar] [CrossRef] [Green Version]
  241. Wang, Y.; Pyörälä, J.; Liang, X.; Lehtomäki, M.; Kukko, A.; Yu, X.; Kaartinen, H.; Hyyppä, J. In situ biomass estimation at tree and plot levels: What did data record and what did algorithms derive from terrestrial and aerial point clouds in boreal forest. Remote Sens. Environ. 2019, 232, 111309. [Google Scholar] [CrossRef]
  242. Fernandes, M.R.; Aguiar, F.C.; Martins, M.J.; Rico, N.; Ferreira, M.T.; Correia, A.C. Carbon stock estimations in a mediterranean riparian forest: A case study combining field data and UAV imagery. Forests 2020, 11, 376. [Google Scholar] [CrossRef] [Green Version]
  243. Kotivuori, E.; Kukkonen, M.; Mehtätalo, L.; Maltamo, M.; Korhonen, L.; Packalen, P. Forest inventories for small areas using drone imagery without in-situ field measurements. Remote Sens. Environ. 2020, 237, 111404. [Google Scholar] [CrossRef]
  244. Puliti, S.; Breidenbach, J.; Astrup, R. Estimation of forest growing stock volume with UAV laser scanning data: Can it be done without field data? Remote Sens. 2020, 12, 1245. [Google Scholar] [CrossRef] [Green Version]
  245. Klouček, T.; Komárek, J.; Surový, P.; Hrach, K.; Janata, P.; Vašíček, B. The use of UAV mounted sensors for precise detection of bark beetle infestation. Remote Sens. 2019, 11, 1561. [Google Scholar] [CrossRef] [Green Version]
  246. Safonova, A.; Tabik, S.; Alcaraz-Segura, D.; Rubtsov, A.; Maglinets, Y.; Herrera, F. Detection of Fir Trees (Abies sibirica) Damaged by the Bark Beetle in Unmanned Aerial Vehicle Images with Deep Learning. Remote Sens. 2019, 11, 643. [Google Scholar] [CrossRef] [Green Version]
  247. Gini, R.; Sona, G.; Ronchetti, G.; Passoni, D.; Pinto, L. Improving tree species classification using UAS multispectral images and texture measures. ISPRS Int. J. Geo-Inf. 2018, 7, 315. [Google Scholar] [CrossRef] [Green Version]
  248. Komárek, J.; Klouček, T.; Prošek, J. The potential of Unmanned Aerial Systems: A tool towards precision classification of hard-to-distinguish vegetation types? Int. J. Appl. Earth Obs. Geoinf. 2018, 71, 9–19. [Google Scholar] [CrossRef]
  249. Mishra, N.B.; Mainali, K.P.; Shrestha, B.B.; Radenz, J.; Karki, D. Species-level vegetation mapping in a Himalayan treeline ecotone using unmanned aerial system (UAS) imagery. ISPRS Int. J. Geo-Inf. 2018, 7, 445. [Google Scholar] [CrossRef] [Green Version]
  250. Rivas-Torres, G.F.; Benítez, F.L.; Rueda, D.; Sevilla, C.; Mena, C.F. A methodology for mapping native and invasive vegetation coverage in archipelagos: An example from the Galápagos Islands. Prog. Phys. Geogr. 2018, 42, 83–111. [Google Scholar] [CrossRef] [Green Version]
  251. Saarinen, N.; Vastaranta, M.; Näsi, R.; Rosnell, T.; Hakala, T.; Honkavaara, E.; Wulder, M.A.; Luoma, V.; Tommaselli, A.M.G.; Imai, N.N.; et al. Assessing biodiversity in boreal forests with UAV-based photogrammetric point clouds and hyperspectral imaging. Remote Sens. 2018, 10, 338. [Google Scholar] [CrossRef] [Green Version]
  252. Tuominen, S.; Näsi, R.; Honkavaara, E.; Balazs, A.; Hakala, T.; Viljanen, N.; Pölönen, I.; Saari, H.; Ojanen, H. Assessment of Classifiers and Remote Sensing Features of Hyperspectral Imagery and Stereo-Photogrammetric Point Clouds for Recognition of Tree Species in a Forest Area of High Species Diversity. Remote Sens. 2018, 10, 714. [Google Scholar] [CrossRef] [Green Version]
  253. Dash, J.P.; Watt, M.S.; Paul, T.S.H.; Morgenroth, J.; Pearse, G.D. Early detection of invasive exotic trees using UAV and manned aircraft multispectral and LiDAR Data. Remote Sens. 2019, 11, 1812. [Google Scholar] [CrossRef] [Green Version]
  254. Kattenborn, T.; Lopatin, J.; Förster, M.; Braun, A.C.; Fassnacht, F.E. UAV data as alternative to field sampling to map woody invasive species based on combined Sentinel-1 and Sentinel-2 data. Remote Sens. Environ. 2019, 227, 61–73. [Google Scholar] [CrossRef]
  255. Kattenborn, T.; Eichel, J.; Wiser, S.; Burrows, L.; Fassnacht, F.E.; Schmidtlein, S. Convolutional Neural Networks accurately predict cover fractions of plant species and communities in Unmanned Aerial Vehicle imagery. Remote Sens. Ecol. Conserv. 2020. [Google Scholar] [CrossRef] [Green Version]
  256. Miyoshi, G.T.; Imai, N.N.; Tommaselli, A.M.G.; de Moraes, M.V.A.; Honkavaara, E. Evaluation of hyperspectral multitemporal information to improve tree species identification in the highly diverse Atlantic Forest. Remote Sens. 2020, 12, 244. [Google Scholar] [CrossRef] [Green Version]
  257. Nezami, S.; Khoramshahi, E.; Nevalainen, O.; Pölönen, I.; Honkavaara, E. Tree species classification of drone hyperspectral and RGB imagery with deep learning convolutional neural networks. Remote Sens. 2020, 12, 1070. [Google Scholar] [CrossRef] [Green Version]
  258. Sothe, C.; De Almeida, C.M.; Schimalski, M.B.; La Rosa, L.E.C.; Castro, J.D.B.; Feitosa, R.Q.; Dalponte, M.; Lima, C.L.; Liesenberg, V.; Miyoshi, G.T.; et al. Comparative performance of convolutional neural network, weighted and conventional support vector machine and random forest for classifying tree species using hyperspectral and photogrammetric data. GIScience Remote Sens. 2020, 57, 369–394. [Google Scholar] [CrossRef]
  259. Baena, S.; Boyd, D.S.; Moat, J. UAVs in pursuit of plant conservation—Real world experiences. Ecol. Inform. 2018, 47, 2–9. [Google Scholar] [CrossRef]
  260. Rossi, F.C.; Fritz, A.; Becker, G. Combining satellite and UAV imagery to delineate forest cover and basal area after mixed-severity fires. Sustainability 2018, 10, 2227. [Google Scholar] [CrossRef] [Green Version]
  261. Berra, E.F.; Gaulton, R.; Barr, S. Assessing spring phenology of a temperate woodland: A multiscale comparison of ground, unmanned aerial vehicle and Landsat satellite observations. Remote Sens. Environ. 2019, 223, 229–242. [Google Scholar] [CrossRef]
  262. Fraser, B.T.; Congalton, R.G. Evaluating the effectiveness of Unmanned Aerial Systems (UAS) for collecting thematic map accuracy assessment reference data in New England forests. Forests 2019, 10, 24. [Google Scholar] [CrossRef] [Green Version]
  263. Frey, J.; Asbeck, T.; Bauhus, J. Predicting tree-related microhabitats by multisensor close-range remote sensing structural parameters for the selection of retention elements. Remote Sens. 2020, 12, 867. [Google Scholar] [CrossRef] [Green Version]
  264. Pádua, L.; Guimarães, N.; Adão, T.; Sousa, A.; Peres, E.; Sousa, J.J. Effectiveness of sentinel-2 in multi-temporal post-fire monitoring when compared with UAV Imagery. ISPRS Int. J. Geo-Inf. 2020, 9, 225. [Google Scholar] [CrossRef] [Green Version]
  265. Hakala, T.; Markelin, L.; Honkavaara, E.; Scott, B.; Theocharous, T.; Nevalainen, O.; Näsi, R.; Suomalainen, J.; Viljanen, N.; Greenwell, C.; et al. Direct Reflectance Measurements from Drones: Sensor Absolute Radiometric Calibration and System Tests for Forest Reflectance Characterization. Sensors 2018, 18, 1417. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  266. Brach, M.; Chan, J.C.W.; Szymański, P. Accuracy assessment of different photogrammetric software for processing data from low-cost UAV platforms in forest conditions. iForest 2019, 12, 435–441. [Google Scholar] [CrossRef]
  267. Chakraborty, K.; Saikom, V.; Borah, S.B.; Kalita, M.; Gupta, C.; Meitei, L.R.; Sarma, K.K.; Raju, P.L.N. Forest biometric parameter extraction using unmanned aerial vehicle to aid in forest inventory data collection. Curr. Sci. 2019, 117, 1194–1199. [Google Scholar] [CrossRef]
  268. Yeom, J.; Han, Y.; Kim, T.; Kim, Y. Forest fire damage assessment using UAV images: A case study on the Goseong-Sokcho forest fire in 2019. J. Korean Soc. Surv. Geod. Photogramm. Cartogr. 2019, 37, 351–357. [Google Scholar] [CrossRef]
  269. Chabot, D. Trends in drone research and applications as the Journal of Unmanned Vehicle Systems turns five. J. Unmanned Veh. Syst. 2018, 6, vi–xv. [Google Scholar] [CrossRef] [Green Version]
  270. Matese, A.; Toscano, P.; Di Gennaro, S.F.; Genesio, L.; Vaccari, F.P.; Primicerio, J.; Belli, C.; Zaldei, A.; Bianconi, R.; Gioli, B. Intercomparison of UAV, aircraft and satellite remote sensing platforms for precision viticulture. Remote Sens. 2015, 7, 2971–2990. [Google Scholar] [CrossRef] [Green Version]
  271. Huete, A.R.; Saleska, S.R. Remote sensing of tropical forest phenology: Issues and controversies. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. ISPRS Arch. 2010, 38, 539–541. [Google Scholar]
  272. Huylenbroeck, L.; Laslier, M.; Dufour, S.; Georges, B.; Lejeune, P.; Michez, A. Using remote sensing to characterize riparian vegetation: A review of available tools and perspectives for managers. J. Environ. Manag. 2020, 267, 110652. [Google Scholar] [CrossRef] [PubMed]
  273. Cracknell, A.P. UAVs: Regulations and law enforcement. Int. J. Remote Sens. 2017, 38, 3054–3067. [Google Scholar] [CrossRef]
  274. Greenwood, F. Drones on the horizon: New frontier in agricultural innovation. ICT Update 2016, 82, 2–4. [Google Scholar]
Figure 1. Unmanned aerial vehicle (UAV) types for civilian remote sensing use: rotary-wing (a,b), fixed-wing (c,d), unmanned helicopter (e), and hybrid solution (f).
Figure 2. Workflow (PRISMA diagram) showing the phases of the database creation (n = number of records).
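The database creation summarized in Figure 2 rests on a double-step filtering of the records returned by the search engine: an automated keyword screen followed by a manual eligibility check of the remaining items. The snippet below is a minimal illustrative sketch of that general workflow, not the authors' actual pipeline; the file name, column names, and keywords are hypothetical, and the script only shows how the record counts (n) at each phase could be tracked.

```python
import csv

# Hypothetical UAV-related keywords for the first (automated) screening step.
KEYWORDS = ("uav", "unmanned aerial vehicle", "unmanned aerial system", "drone")


def first_step(records):
    """Keep records whose title or abstract mentions at least one keyword."""
    return [r for r in records
            if any(k in f"{r['title']} {r['abstract']}".lower() for k in KEYWORDS)]


def second_step(records):
    """Keep only records flagged as eligible after full-text screening."""
    return [r for r in records if r.get("eligible", "").strip().lower() == "yes"]


if __name__ == "__main__":
    # Hypothetical export with the columns: title, abstract, eligible.
    with open("wos_export.csv", newline="", encoding="utf-8") as fh:
        identified = list(csv.DictReader(fh))
    screened = first_step(identified)
    included = second_step(screened)
    print(f"identified: n = {len(identified)}; "
          f"after first-step filtering: n = {len(screened)}; "
          f"included: n = {len(included)}")
```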
Figure 3. Treemap of the top 20 source titles for the first-step filtering records.
Figure 4. Bar graph of the top 20 authors’ countries for the first-step filtering records.
Figure 5. Scientific mapping of keyword networks for UAV in forestry applications based on the cluster method (2018–mid-2020). Abbreviations: GCP = Ground Control Point; NDVI = Normalized Difference Vegetation Index; OBIA = Object-Based Image Analysis; TLS = Terrestrial Laser Scanning; VIS = Visible Spectrum; other abbreviations are reported in the text.
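A keyword map such as the one in Figure 5 is obtained by clustering a keyword co-occurrence network. As a minimal sketch of that general technique, assuming hypothetical per-paper keyword sets and using networkx community detection rather than the dedicated bibliometric software typically used to draw such maps:

```python
from itertools import combinations

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical author-keyword sets, one per reviewed paper.
papers = [
    {"uav", "forestry", "photogrammetry", "canopy height model"},
    {"uav", "lidar", "forest inventory", "biomass"},
    {"uav", "multispectral", "tree species", "machine learning"},
    {"lidar", "biomass", "machine learning"},
    {"photogrammetry", "canopy height model", "forest inventory"},
]

# Build the co-occurrence network: nodes are keywords and edge weights count
# how many papers mention both keywords of a pair.
G = nx.Graph()
for keywords in papers:
    for a, b in combinations(sorted(keywords), 2):
        weight = G.get_edge_data(a, b, default={}).get("weight", 0) + 1
        G.add_edge(a, b, weight=weight)

# Group keywords into clusters (the colored groups of a keyword map).
for i, cluster in enumerate(greedy_modularity_communities(G, weight="weight"), 1):
    print(f"cluster {i}: {sorted(cluster)}")
```

Dedicated bibliometric tools implement the same idea, adding association-strength normalization and layout algorithms for visualization.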
Figure 6. Geographic distribution of the research works according to the study location. The pie chart size represents the number of papers, while topics are shown in different colors.
Figure 7. (a) Distribution of different types of sensors in the reviewed dataset; (b) percentage of studies that used a given sensor, by forestry topic.
Table 1. Overall research paper dataset tabulated according to forest type, plant group, and topic.
Topic columns: Setting and accuracy of imagery products; Tree detection and inventory parameters; AGB/volume estimation; Pest and disease detection; Species recognition and invasive plant detection; Conservation, restoration, and fire monitoring. Each row lists, across these six topics, the papers for a given forest type and plant group.
Planted, Broadleaf: Aguilar et al. [57], Balsi et al. [58], Guerra-Hernandez et al. [59], Iizuka et al. [60], Medauar et al. [61], Mokros et al. [62], Qiu et al. [63], Blonder et al. [64], Carl et al. [65], Fawcett et al. [66], Wang et al. [67], Zeng [68], Dalla Corte et al. [69], Picos et al. [70], Pena et al. [71], Guerra-Hernandez et al. [72], Navarro et al. [73], Lu et al. [74], Maes et al. [75], Padua et al. [76], Sandino et al. [77], Dell et al. [78], Almeida et al. [50], Iizuka et al. [79], Paolinelli Reis et al. [80], Sealey and Van Rees [81], Sealey and Van Rees [82].
Planted, Conifer: Diaz et al. [83], Guan et al. [84], Abdollahnejad et al. [85], Demir [86], Feduck et al. [87], Goodbody et al. [88], Iizuka et al. [89], Shin et al. [90], Webster et al. [91], Durfee et al. [92], Ganz et al. [54], Gulci [93], He et al. [94], Imangholiloo et al. [95], Krause et al. [96], Lendzioch et al. [97], Maturbongs et al. [98], Puliti et al. [99], Santini et al. [100], Santini et al. [101], Tian et al. [102], D’Odorico et al. [103], du Toit et al. [104], Hu et al. [105], Kuzelka et al. [45], Li et al. [106], Lin et al. [107], Windrim et al. [108], Zou et al. [109], Hyyppa et al. [40], Iizuka et al. [110], Puliti et al. [111], Yrttimaa et al. [112], Brovkina et al. [113], Dash et al. [114], Nasi et al. [115], Jung and Park [116], Smigaj et al. [117], Fernandez-Guisuraga et al. [118], Nagai et al. [119], Belmonte et al. [120], Shin et al. [39].
Planted, Mixed: Polewski et al. [121], Hentz et al. [122], Huang et al. [123], Kuzelka and Surovy [124], Yan et al. [125], Cao et al. [53], Li et al. [126], Yan et al. [127], Khokthong et al. [128].
Planted, Other: Puliti et al. [129], Wu et al. [130], Chen et al. [131], Liu et al. [132], Shen et al. [133], Whiteside et al. [134].
Natural/not regular, Broadleaf: Goodbody et al. [135], Ruwaimana et al. [136], Oliveira et al. [137], Fletcher and Mather [138], Jurjevic et al. [139], Alexander et al. [140], Bagaram et al. [141], Chen et al. [142], Guo et al. [32], Kattenborn et al. [143], Klosterman et al. [144], Lin et al. [145], Mayr et al. [146], Morales et al. [147], Rosca et al. [148], Thomson et al. [149], dos Santos et al. [150], Park et al. [151], Schneider et al. [152], Xu et al. [153], Yin and Wang [46], Dong et al. [154], Krisanski et al. [155], Moe et al. [156], Otero et al. [157], Domingo et al. [158], Gonzalez-Jaramillo et al. [159], Li et al. [160], Ota et al. [161], Qiu et al. [162], Tian [163], Swinfield et al. [164], Vaglio Laurin et al. [165], Xu [166], Wang et al. [167], Di Gennaro et al. [168], d’Oliveira et al. [47], Jones et al. [169], Navarro et al. [170], Wang et al. [171], Zhu et al. [172], Cao et al. [173], de Sa et al. [174], Franklin and Ahmed [175], Liu et al. [176], Sothe et al. [177], Waite et al. [178], Wu et al. [179], Yaney-Keller et al. [180], Yuan et al. [181], Casapia et al. [31], Kentsch et al. [182], Miyoshi et al. [183], Rupasinghe et al. [184], De Luca et al. [185], Fernandez-Alvarez et al. [51], Rossi and Becker [186].
Natural/not regular, Conifer: Frey et al. [187], Jayathunga et al. [188], Ni et al. [189], Graham et al. [190], Graham et al. [191], Fankhauser et al. [192], Brieger et al. [193], Panagiotidis et al. [194], St-Onge and Grandin [195], Xu et al. [196], Yancho et al. [197], Yilmaz and Gungor [198], Jin et al. [199], Fujimoto et al. [200], Zhou et al. [201], Ganthaler et al. [202], Otsu et al. [203], Zhang et al. [33], Barmpoutis et al. [204], Lee and Park [205], Roder et al. [206], Fromm et al. [207].
Natural/not regular, Mixed: Fraser and Congalton [208], Kellner et al. [209], Seifert et al. [210], Tomastik et al. [211], Wallace et al. [212], Yu et al. [213], Carr and Slyder [214], Chung et al. [215], Huang et al. [216], Jayathunga et al. [217], Liang et al. [218], Nuijten et al. [219], Rissanen et al. [220], Shashkov et al. [30], Yurtseven et al. [221], Zhang et al. [222], Apostol et al. [223], Balkova et al. [224], Brullhardt et al. [225], Gil-Docampo et al. [226], Gu et al. [227], Hastings et al. [228], Isibue and Pingel [229], Jurado et al. [230], Marzahn et al. [231], Vanderwel et al. [232], Alonzo et al. [233], Giannetti et al. [234], Jayathunga et al. [235], Puliti et al. [236], Brede et al. [237], Jayathunga et al. [238], McClelland et al. [239], Ni et al. [240], Wang et al. [241], Fernandes et al. [242], Kotivuori et al. [243], Puliti et al. [244], Cardil et al. [34], Kloucek et al. [245], Safonova et al. [246], Gini et al. [247], Komarek et al. [248], Mishra et al. [249], Rivas-Torres et al. [250], Saarinen et al. [251], Tuominen et al. [252], Dash et al. [253], Kattenborn et al. [254], Kattenborn et al. [255], Miyoshi et al. [256], Nezami et al. [257], Sothe et al. [258], Baena et al. [259], Rossi et al. [260], Berra et al. [261], Fraser and Congalton [262], Frey et al. [263], Padua et al. [264].
Natural/not regular, Other: Hakala et al. [265], Brach et al. [266], Chakraborty et al. [267], Yeom et al. [268].
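Table 1 is essentially a cross-tabulation of the reviewed papers by forest type, plant group, and topic. A minimal sketch of how such a tabulation could be reproduced from a flat per-paper dataset is given below; the column names are hypothetical and the four example records, assigned to topics according to the subjects of the cited papers, are for illustration only.

```python
import pandas as pd

# Flat dataset: one row per reviewed paper (illustrative records only).
papers = pd.DataFrame([
    {"ref": 234, "forest_type": "Natural/not regular", "plant_group": "Mixed",
     "topic": "AGB/volume estimation"},
    {"ref": 245, "forest_type": "Natural/not regular", "plant_group": "Mixed",
     "topic": "Pest and disease detection"},
    {"ref": 247, "forest_type": "Natural/not regular", "plant_group": "Mixed",
     "topic": "Species recognition and invasive plant detection"},
    {"ref": 265, "forest_type": "Natural/not regular", "plant_group": "Other",
     "topic": "Setting and accuracy of imagery products"},
])

# Paper counts per cell, mirroring the row/column layout of Table 1.
counts = pd.crosstab([papers["forest_type"], papers["plant_group"]], papers["topic"])
print(counts)

# Reference numbers falling in each cell.
cells = papers.groupby(["forest_type", "plant_group", "topic"])["ref"].apply(list)
print(cells)
```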
