Article

Estimating Above-Ground Biomass of Maize Using Features Derived from UAV-Based RGB Imagery

1 College of Mechanical and Electronic Engineering, Northwest A&F University, Yangling 712100, Shaanxi, China
2 Key Laboratory of Agricultural Internet of Things, Ministry of Agriculture, Yangling 712100, Shaanxi, China
3 Water Management and Systems Research Unit, USDA-ARS, 2150 Centre Avenue, Bldg. D., Fort Collins, CO 80526, USA
4 Institute of Soil and Water Conservation, Northwest A&F University, Yangling 712100, Shaanxi, China
* Author to whom correspondence should be addressed.
Remote Sens. 2019, 11(11), 1261; https://doi.org/10.3390/rs11111261
Submission received: 30 March 2019 / Revised: 22 May 2019 / Accepted: 24 May 2019 / Published: 28 May 2019
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)

Abstract

The rapid, accurate, and economical estimation of crop above-ground biomass at the farm scale is crucial for precision agricultural management. Unmanned aerial vehicle (UAV) remote-sensing systems have great application potential because they can acquire imagery with high temporal-spatial resolution. To verify the potential of consumer-grade UAV RGB imagery for estimating maize above-ground biomass, vegetation indices and plant height derived from UAV RGB imagery were adopted. To obtain a more accurate observation, plant height was derived directly from UAV RGB point clouds. To identify the optimal estimation method, the performances of models based on vegetation indices alone, on plant height alone, and on both vegetation indices and plant height were compared. The results showed that plant height derived directly from UAV RGB point clouds was highly correlated with ground-truth data, with an R2 value of 0.90 and an RMSE value of 0.12 m. The above-ground biomass exponential regression models based on plant height alone had higher correlations for both fresh and dry above-ground biomass, with R2 values of 0.77 and 0.76, respectively, than the linear regression models (both R2 values were 0.59). The vegetation indices derived from UAV RGB imagery showed great potential for estimating maize above-ground biomass, with R2 values ranging from 0.63 to 0.73. When maize above-ground biomass was estimated with multivariable linear regression based on vegetation indices, a higher correlation was obtained, with an R2 value of 0.82. Adding plant height derived from UAV RGB imagery to the multivariable linear regression model based on vegetation indices did not significantly improve the estimation performance. When estimating crop above-ground biomass based on a UAV RGB remote-sensing system alone, searching for optimized vegetation indices and building high-performance estimation models with advanced algorithms (e.g., machine learning) may be a better way forward.

1. Introduction

Agriculture faces a great challenge: meeting the increasing demand for agricultural production with limited soil and water resources under a growing population and a rapidly expanding economy [1]. By 2050, projected population and socio-economic growth will double current food demand. To meet this challenge in developing countries, grain production needs to increase by 40% and net irrigation water requirements by 40–50% [2]. Maize is one of the most important food crops and, together with rice and wheat, provides at least 30% of the food calories of more than 4.5 billion people in developing countries [3,4]. Therefore, predicting maize production accurately before harvest is of great significance to national food security and personal living standards [5].
Above-ground biomass (AGB), an important indicator of agro-ecosystems, is usually used as a key factor in predicting crop production and estimating water use efficiency [6,7,8]. The rapid, accurate, and economical estimation of AGB is of great importance. AGB remains one of the basic indicators to assess the performance of agricultural practices [9,10], to research agro-ecosystem processes [11], and to estimate global market risk [12].
Currently, remote-sensing has become the main data source for large-area AGB estimation [13]. Satellite remote-sensing, as an important data source, has been widely used to estimate plant AGB at the landscape scale. Two main kinds of satellite remote-sensing data are used to estimate AGB, namely optical imagery [14,15,16] and synthetic aperture radar data [17,18,19], which provide different spatial resolutions, such as SPOT (20 m) and MODIS (1 km), and Sentinel-1A (10 m) and ALOS-2 PALSAR-2 (6 m). However, crop-monitoring applications at the farm scale are highly demanding with respect to temporal frequency (1–3 day revisit), while spatial resolution requirements are driven by the minimum management unit, which can be in the range of 1–10 m [20,21]. Satellite remote-sensing is therefore often unable to meet the needs of AGB monitoring at the farm scale because of its low spatial-temporal resolution.
Recently, rapid advances in unmanned aerial vehicle (UAV) remote-sensing platforms have boosted the use of near-earth aerial imagery for estimating crop AGB [4,22,23,24]. UAV remote-sensing systems can obtain crop information at the farm scale more easily under adequate weather conditions and with high temporal-spatial resolution. For example, Jayathunga et al. [25] examined the potential of a digital aerial photogrammetry UAV remote-sensing system for estimating forest AGB in Furano City, northern Japan, and provided AGB maps with a ground resolution of 14 cm. As one of the most important indicators of crop growing status, plant height (PH) has been widely used to estimate AGB [26,27]. Zhang et al. [28] estimated grassland AGB using PH at three study sites in Gansu, Inner Mongolia, and Jiangsu, China, and reported a high correlation between PH and AGB, with coefficient of determination (R2) values greater than 0.66. In addition to PH, vegetation indices (VIs), which provide reliable information about crop growing status [26,29,30], such as green canopy cover and PH, have also been investigated as a reliable source for estimating AGB [31,32,33]. Sankaran et al. [34] estimated dry bean biomass at the flowering and mid-pod-fill stages using the average green normalized difference vegetation index (GNDVI) in Othello, WA, USA, and reported a high correlation between the average GNDVI and biomass, with Pearson correlation coefficient values greater than 0.52.
In general, VIs derived from UAV imagery reflect the spectral characteristics of the top of the canopy, while PH reflects the vertical structure of the entire canopy. To improve the estimation accuracy of AGB, researchers have started to estimate crop AGB using VIs and PH derived from UAV imagery simultaneously [35,36,37]. For example, Han et al. [38] estimated maize AGB in Changping District of Beijing, China, and Li et al. [39] estimated sorghum AGB in Central City, Nebraska, USA, by simultaneously using VIs derived from UAV multispectral imagery and PH derived from UAV RGB imagery. However, many of these studies used VIs and PH derived from different UAV platforms. Research that simultaneously uses VIs and PH both derived from a consumer-grade UAV RGB remote-sensing system remains relatively scarce in biomass estimation.
Therefore, the main goal of this study was to estimate AGB of maize by using VIs and PH both derived from consumer-grade UAV RGB imagery. The estimation performances of models based on VIs alone, based on PH alone, and based on both VIs and PH were also analyzed. More specifically, our study aimed to:
(1) directly derive maize PH from consumer-grade UAV RGB point clouds and comparatively analyze its estimation performance against ground-truth PH;
(2) establish maize AGB estimation models based on PH alone using linear and exponential regression analyses, based on VIs alone using single and multivariable linear regression analyses, and based on both VIs and PH using multivariable linear regression analysis;
(3) comparatively analyze the performance of the maize AGB estimation models and map the distribution of maize AGB using the optimal estimation model.

2. Materials and Methods

2.1. Research Field

The experiment was carried out on a field in Zhaojun town, located in the southwest of Inner Mongolia, China (40°26′0.29″ N, 109°36′25.99″ E, elevation 1010 m). The research field, approximately 1.13 ha and planted with maize, was divided into 5 large regions. Three 6 × 6 m2 areas within each region were chosen as sampling plots (yellow squares in Figure 1b), and data were collected from three sampling sites (green squares in Figure 2a, 1 × 1 m2) within each sampling plot. A mixture of soil samples from depths of 30, 60, and 90 cm was used to estimate soil characteristics. The soil type was loamy sand (80.7% sand, 13.7% silt, and 5.6% clay) according to the United States Department of Agriculture soil taxonomy. The soil pH, organic matter, and C content were 9.27, 47.17 g/kg, and 27.35 g/kg, respectively. Maize (Junkai 918) was planted on 11 May 2018 (DOY, day of year, 131), with 0.58 m row spacing and 0.25 m plant spacing, and the row direction was East–West. The maize emerged on 18 May, headed on 21 July, and was harvested for silage on 10 September 2018, a 115-day lifespan.

2.2. Field Measurements

At the maize vegetative stage (V5–V7, DOY 171–185), plant height (PH) and above-ground biomass (AGB) were measured on DOY 171, 177, and 185. A total of 15 sets of samples (PH and AGB) were obtained each day, giving 45 biomass observations in total, scaled to kg/m2. The heights of maize plants were measured manually within each sampling site (green square in Figure 2a) using a measuring tape. As shown in Figure 2b, the distance between the ground and the highest leaf of the maize plant was measured as the PH within each sampling site during the vegetative stage [40]. Five PH measurements were taken within each sampling site, and the average of the fifteen measurements from the three sampling sites was used as the reference plant height (PHref) for the sampling plot (yellow square in Figure 1b).
For measuring AGB, maize samples were harvested from a 0.5 × 0.5 m2 subplot adjacent to the sampling plot and immediately placed into sealed bags to avoid water loss. An electronic balance (DJ-2000A, Suzhou Jiangdong Precision Instrument Co., Ltd., Suzhou, China) with a measurement accuracy of 0.01 g was used to weigh the fresh and dry maize AGB. For dry AGB, the maize samples were placed in a DHG-9053A oven (HASUC, Shanghai, China) at 105 °C for two hours to de-enzyme and then dried to a constant weight at 80 °C; the time required varied from 36 to 72 h depending on the mass of the maize samples on each sampling day. Each subplot's measurement was taken as the reference biomass of the corresponding sampling plot.

2.3. Acquisition and Pretreatment of UAV RGB Imagery

Figure 3 shows the main steps in the acquisition and pretreatment of UAV RGB imagery. A consumer-grade quadrotor UAV RGB remote-sensing system, the DJI Phantom 4 Pro (Shenzhen Dajiang Baiwang Technology Co., Ltd., Shenzhen, China), was adopted to collect RGB imagery of maize. This UAV has an integrated camera with a 1-inch complementary metal-oxide semiconductor sensor that captures red-green-blue spectral information. The camera has an 84° field-of-view lens with an f/2.8 aperture and a resolution of 4864 × 3648 pixels, and the lens is specially designed to minimize image distortion. Detailed information about the digital camera and the UAV system is given in Table 1. On DOY 171, 177, and 185 (sunny days), between 11:00 and 13:00 (Chinese standard time), RGB imagery of maize was captured by the DJI Phantom 4 Pro. The ISO and shutter speed were set to 400 and 1/1250 s according to the weather conditions, and the white balance was set to sunny. Flights were controlled by Altizure software (Everest Innovation Technology Ltd., Hong Kong, China), which directed the UAV along a serpentine image-acquisition route at a height of 30 m and a speed of 1.3 m/s with the camera looking downwards. The forward and side overlap of the imagery was 90%, and the ground sample distance was 0.8 cm. To improve the geo-location accuracy, the geo-referencing of the point clouds was done using a combination of direct geo-referencing and five ground control points (Figure 1b, green circles). The coordinates of the ground control points were measured with a KOLIDA RTK differential GNSS device (KOLIDA Instrument Co., Ltd., Guangzhou, China).
After acquisition, image mosaicking was performed with Pix4DMapper software (Pix4D Inc., Switzerland, https://www.pix4d.com/), which is specifically designed to process UAV imagery using techniques rooted in both computer vision and photogrammetry [41]. The processing workflow was as follows: first, all geo-located imagery captured in each flight was initialized by automatically searching for feature tie-points in matched image pairs and correcting them based on the camera model; next, the five ground control points were incorporated to correct the geographic coordinates of the imagery; then, the densified point clouds were generated with a 7 × 7 pixel matching window; finally, the digital surface model and orthophoto with a spatial resolution of 0.8 cm were generated using the inverse distance weighting method. More detailed information about the image mosaicking process based on Pix4DMapper can be found in Duan et al. [42].

2.4. The Extraction of Maize Plant Height

At the vegetative stage (V5–V7, DOY 171–185), the maize canopy had not reached full coverage. Therefore, the point clouds included both soil and maize plants (Figure 4b). PH could be calculated by subtracting the ground level from the estimated plant level derived from the point clouds. As shown in Figure 4a, the 1st percentile of the cumulative probability distribution of elevation derived from the mixed point clouds was chosen to represent the ground level. To search for an optimal PH feature, the 90th, 95th, 97th, 98th, and 99th percentiles and the maximum value of point-cloud altitude within each sampling plot were selected to represent the plant level, yielding six PH features (PH90, PH95, PH97, PH98, PH99, and PHmax) for each sampling plot. Regression analyses between the six PH features and PHref were used to identify the optimal PH feature. There were 45 PHref values in total, 15 for each day. The estimation precision of the six PH features was evaluated by the coefficient of determination (R2) and root mean square error (RMSE).
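
As an illustration of this percentile-based extraction, the following minimal R sketch (not the authors' code) derives the six PH features for a single sampling plot from a vector of point-cloud elevations; the elevations here are synthetic and purely for demonstration.

```r
# Minimal sketch (synthetic data): derive the six plant-height features for one
# sampling plot from a numeric vector of point-cloud elevations in metres.
set.seed(1)
z <- c(runif(500, 1009.95, 1010.05),   # hypothetical ground returns
       runif(500, 1010.30, 1010.90))   # hypothetical canopy returns

ground_level <- quantile(z, probs = 0.01)     # 1st percentile taken as the ground level
plant_level  <- c(quantile(z, probs = c(0.90, 0.95, 0.97, 0.98, 0.99)), max(z))
ph_features  <- plant_level - ground_level    # PH90, PH95, PH97, PH98, PH99, PHmax
names(ph_features) <- c("PH90", "PH95", "PH97", "PH98", "PH99", "PHmax")
print(round(ph_features, 2))
```

In the study, each of these features was then regressed against PHref to select the best-performing one.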

2.5. The Calculation of UAV RGB Vegetation Index

Many previous studies have used different vegetation indices (VIs) derived from RGB imagery to estimate the AGB of crops [40,43,44,45,46,47]. In this study, five VIs and one VI combination were calculated using visible bands, including the normalized green-red difference index (NGRDI) [44], excess green index (ExG) [48], excess green minus excess red (ExGR), color index of vegetation (CIVE) [49], vegetation index (VEG) [50], and the combination of ExG, ExGR, CIVE and VEG (COM) [51]. Their calculation formulas are as follows:
$$\mathrm{NGRDI} = \frac{R_G - R_R}{R_G + R_R} \tag{1}$$
$$\mathrm{ExG} = 2 \times R_G - R_R - R_B \tag{2}$$
$$\mathrm{ExGR} = \mathrm{ExG} - 1.4 \times R_R - R_G \tag{3}$$
$$\mathrm{CIVE} = 0.441 \times R_R - 0.881 \times R_G + 0.385 \times R_B + 18.78745 \tag{4}$$
$$\mathrm{VEG} = \frac{R_G}{R_R^{a} \times R_B^{1-a}}, \quad a = 0.667 \tag{5}$$
$$\mathrm{COM} = 0.25 \times \mathrm{ExG} + 0.3 \times \mathrm{ExGR} + 0.33 \times \mathrm{CIVE} + 0.12 \times \mathrm{VEG} \tag{6}$$
where $R_R$, $R_G$, and $R_B$ represent the digital numbers of the red, green, and blue bands, normalized by dividing by 255 (8-bit imagery in this study) [52,53].
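
As an illustration, the six indices can be computed directly from normalized digital numbers, as in the minimal R sketch below; the band values are hypothetical plot averages, not values from this study.

```r
# Minimal sketch (hypothetical values): compute the five VIs and the COM combination
# from plot-averaged 8-bit digital numbers normalized by 255.
rr <- 70 / 255; rg <- 115 / 255; rb <- 60 / 255   # hypothetical red, green, blue means

ngrdi <- (rg - rr) / (rg + rr)                               # Eq. (1)
exg   <- 2 * rg - rr - rb                                    # Eq. (2)
exgr  <- exg - 1.4 * rr - rg                                 # Eq. (3)
cive  <- 0.441 * rr - 0.881 * rg + 0.385 * rb + 18.78745     # Eq. (4)
veg   <- rg / (rr^0.667 * rb^(1 - 0.667))                    # Eq. (5)
com   <- 0.25 * exg + 0.3 * exgr + 0.33 * cive + 0.12 * veg  # Eq. (6)
round(c(NGRDI = ngrdi, ExG = exg, ExGR = exgr, CIVE = cive, VEG = veg, COM = com), 3)
```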

2.6. Estimation Models of Plant Height and Biomass

Figure 5 shows the main procedure for obtaining the optimal estimation model of maize AGB based on maize features derived from UAV RGB imagery. Three kinds of maize AGB estimation models were adopted in this study, namely models based on (1) PH alone; (2) VIs alone; and (3) both PH and VIs. The PH feature derived from the point clouds with the best performance and the VIs with the best correlations were adopted. Linear regression analysis was used for the VIs, and both linear and exponential regression analyses were used for PH. For the third kind of estimation model, multivariable linear regression analysis was adopted.
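
The three model families can be sketched in R as follows. This is a minimal sketch on synthetic data, not the study's measurements: the data frame `d` and the coefficients used to simulate it are invented, and its columns (AGB, PH99, NGRDI, ExGR, VEG) simply stand in for the per-plot values described above.

```r
# Minimal sketch (synthetic data): the three model families compared in this study.
set.seed(2)
d <- data.frame(PH99 = runif(45, 0.3, 1.0))                  # 45 plot-level observations
d$NGRDI <- 0.10 + 0.20 * d$PH99 + rnorm(45, 0, 0.02)
d$ExGR  <- 0.05 + 0.30 * d$PH99 + rnorm(45, 0, 0.03)
d$VEG   <- 1.00 + 0.80 * d$PH99 + rnorm(45, 0, 0.05)
d$AGB   <- 0.20 * exp(2.0 * d$PH99) + rnorm(45, 0, 0.10)     # hypothetical fresh AGB, kg/m2

m_ph_lin <- lm(AGB ~ PH99, data = d)                         # (1) PH alone, linear
m_ph_exp <- nls(AGB ~ a * exp(b * PH99), data = d,
                start = list(a = 0.2, b = 2))                # (1) PH alone, exponential
m_vi     <- lm(AGB ~ NGRDI + ExGR + VEG, data = d)           # (2) VIs, multivariable linear
m_vi_ph  <- lm(AGB ~ NGRDI + ExGR + VEG + PH99, data = d)    # (3) VIs + PH
```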

2.7. Statistical Analysis

For the statistical analysis, the R programming language (R-3.4.3, https://www.r-project.org/) was adopted. More specifically, the lm() function was used to perform the regression analyses, and the AIC() function was used to calculate the Akaike Information Criterion (AIC). The coefficient of determination (R2), the root mean square error (RMSE), and the AIC [54,55] were calculated for the comparison. Because R2 is not suitable for comparing the predictive and fitting performance of models with different forms and different numbers of parameters, the AIC was additionally used for model comparison, with lower values indicating better estimation performance.
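
A minimal sketch of this comparison, reusing the hypothetical models fitted in the Section 2.6 sketch, is shown below; the r2 helper computes the usual 1 − SSE/SST, which is only a pseudo-R2 for the non-linear model.

```r
# Minimal sketch: compare the fitted models from the previous sketch by R2, RMSE, and AIC.
rmse <- function(model, obs) sqrt(mean((obs - predict(model))^2))
r2   <- function(model, obs) 1 - sum((obs - predict(model))^2) / sum((obs - mean(obs))^2)

models <- list(PH_linear = m_ph_lin, PH_exponential = m_ph_exp,
               VIs = m_vi, VIs_plus_PH = m_vi_ph)
comparison <- data.frame(R2   = sapply(models, r2,   obs = d$AGB),
                         RMSE = sapply(models, rmse, obs = d$AGB),
                         AIC  = sapply(models, AIC))   # lower AIC indicates the better model
print(round(comparison, 3))
```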

3. Results

3.1. Comparison between Maize Plant Height Derived from UAV Point Clouds and Ground-Truth Values

The correlations between the six PH features (PH90, PH95, PH97, PH98, PH99, and PHmax) and PHref were analyzed (Figure 6). PH90 and PH95 had the lowest correlations with PHref, with R2 values of 0.20 and 0.51, respectively. The other four PH features (PH97, PH98, PH99, and PHmax) all correlated well with PHref, with R2 values of no less than 0.70. PH99 had the highest correlation with PHref, with an R2 value of 0.90, and also had the lowest extraction error, with an RMSE value of 0.12 m. When the 90th, 95th, 97th, 98th, and 99th percentiles of point-cloud altitude were used to represent the maize plant level, the estimates were relatively low compared with the ground-truth PH, with the fitted lines below the 1:1 line (Figure 6). As the percentile of UAV RGB point-cloud altitude increased, the difference between the PH derived from the point clouds and PHref decreased. When the maximum value of point-cloud altitude was used to represent the maize plant level, the estimates were relatively high, with the fitted line above the 1:1 line.

3.2. Estimation Models of Maize Biomass Based on Plant Height Derived from UAV RGB Point Clouds

Both linear and exponential regression analyses were used to estimate maize fresh and dry AGB based on PH99, which had the best estimation performance. As shown in Figure 7, there were high positive correlations between PH99 and both fresh and dry AGB, with R2 values of no less than 0.59 (p < 0.01). For both fresh and dry AGB, the exponential regression models had higher correlations, with R2 values of 0.77 and 0.76, respectively, than the linear regression models (both R2 values were 0.59). However, when RMSE and AIC were considered, the linear regression models had relatively lower values. For fresh AGB, the RMSE and AIC values of the linear and exponential regression models were 0.41 and 50.18, and 0.49 and 57.17, respectively. For dry AGB, the corresponding values were 0.05 and −134.89, and 0.07 and −128.90, respectively.

3.3. Estimation Models of Maize Biomass Based on UAV RGB Vegetation Indices

In this study, five VIs (NGRDI, ExG, ExGR, CIVE, and VEG) and one VI combination (COM) derived from UAV RGB imagery were used to establish linear regression models for both maize fresh and dry AGB. As shown in Figure A1, Figure A2, and Table 2, except for CIVE, the other four VIs and COM all had significant positive correlations (p < 0.01) with both maize fresh and dry AGB. For both fresh and dry AGB, ExGR had the highest correlations, with R2 values of 0.73 in both cases, followed by NGRDI and VEG with R2 values of 0.70 and 0.68, and 0.65 and 0.63, respectively. The lowest correlations were observed when using ExG to estimate maize fresh and dry AGB, with R2 values of 0.34 and 0.33, respectively. COM also correlated well with both maize fresh and dry AGB, with R2 values of 0.73 and 0.72, respectively, owing to the fact that COM is a combination of ExG, ExGR, CIVE, and VEG. A similar pattern was observed for RMSE. For maize fresh AGB, ExGR also had the lowest RMSE, with a value of 0.32, followed by NGRDI and VEG with RMSE values of 0.34 and 0.37, while CIVE had the largest RMSE, with a value of 0.62. Similar observations were found for maize dry AGB.
After estimating maize fresh and dry AGB using single vegetation indices, the three VIs with the highest correlations (NGRDI, ExGR, and VEG) were chosen to estimate maize fresh and dry AGB using multivariable linear regression analysis. With multivariable linear regression, the estimation performance improved (Figure 8). For both fresh and dry AGB, the R2 values were 0.82, an increase of 0.09 over the highest single-index R2 values of 0.73 (ExGR). The RMSE values for fresh and dry AGB decreased by 0.06 and 0.01 kg/m2, respectively, compared with the lowest single-index RMSE values of 0.32 and 0.04 kg/m2 (ExGR). Maize fresh and dry AGB could be calculated from NGRDI, ExGR, and VEG using Equations (7) and (8), respectively.
$$\mathrm{AGB}_f = 53.22 \times \mathrm{NGRDI} + 2.90 \times \mathrm{ExGR} - 24.01 \times \mathrm{VEG} + 29.10 \tag{7}$$
$$\mathrm{AGB}_d = 6.88 \times \mathrm{NGRDI} + 0.40 \times \mathrm{ExGR} - 3.23 \times \mathrm{VEG} + 3.93 \tag{8}$$
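
As a worked example, the sketch below applies Equations (7) and (8), as reconstructed above, to hypothetical plot-averaged vegetation-index values; the inputs are invented purely to illustrate the arithmetic and do not come from the study's plots.

```r
# Minimal sketch: apply Equations (7) and (8) to hypothetical plot-averaged VI values.
ngrdi <- 0.15; exgr <- 0.10; veg <- 1.50                          # hypothetical inputs

agb_fresh <- 53.22 * ngrdi + 2.90 * exgr - 24.01 * veg + 29.10    # Eq. (7), kg/m2
agb_dry   <-  6.88 * ngrdi + 0.40 * exgr -  3.23 * veg +  3.93    # Eq. (8), kg/m2
round(c(fresh = agb_fresh, dry = agb_dry), 2)
```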

3.4. Estimation Models of Maize Biomass Based on Both Plant Height and Vegetation Indices

After estimating maize fresh and dry AGB using PH or VIs alone, multivariable linear regression analyses were carried out based on both PH and the three VIs (NGRDI, ExGR, and VEG). As shown in Figure 9, when PH was added, the correlations with maize fresh and dry AGB and the estimation accuracy improved only slightly, with an increase of 0.03 in R2 for both fresh and dry AGB and a decrease of 0.02 kg/m2 in RMSE for fresh AGB only. At the same time, the AIC decreased by 6.23 and 7.76 for maize fresh and dry AGB, respectively.

3.5. Mapping Maize Above-Ground Biomass Based on UAV RGB Remote-Sensing

Because adding PH to the multivariable linear regression models did not significantly improve the correlations or estimation accuracy, the distributions of maize fresh and dry AGB at the farm scale were mapped using VIs alone (Equations (7) and (8)). Both maize fresh and dry AGB increased markedly as the maize grew (Figure 10). For maize fresh AGB, the mean values for DOY 171, 177, and 185 were 0.85, 2.00, and 2.25 kg/m2, respectively; for maize dry AGB, they were 0.11, 0.25, and 0.29 kg/m2, respectively.

4. Discussion

The unmanned aerial vehicle (UAV) remote-sensing platform, as one of the most important farm-scale data acquisition tools, has great application potential for the rapid, accurate, and economical estimation of crop above-ground biomass (AGB). It has been used successfully to estimate the AGB of many crops, such as barley [32], sunflower [40], maize [38,56], and wheat [36,37,57]. As the two main data sources describing crop growing status, plant height (PH) and vegetation indices (VIs) have been widely used to estimate crop AGB [23,35,58]. The UAV-based light detection and ranging (LiDAR) system provides a way to estimate plant height accurately [59,60,61,62]. For example, Liu et al. [63] assessed the effectiveness of point clouds obtained from UAV-based LiDAR for estimating the mean height of trees in the northern plains of Jiangsu Province, China, and showed a high prediction accuracy for Lorey's mean height, with a cross-validation coefficient of determination of 0.97 and a relative root mean squared error of 2.83% using a partial least squares model.
However, the high cost of UAV-based LiDAR systems is the main limitation to their further application. To reduce the cost of estimating crop PH, researchers have started to use low-cost UAV RGB remote-sensing systems [64,65,66,67]. For example, Ziliani et al. [68] compared PH derived from high-resolution UAV RGB imagery against PH derived from LiDAR and showed a strong correlation between structure-from-motion-derived heights and LiDAR scan data, with the coefficient of determination (R2) ranging from 0.77 to 0.99. Grüner et al. [69] estimated grass PH in northern Hesse, Germany, from UAV RGB imagery using photogrammetric structure-from-motion processing and showed a correlation between PH derived from UAV RGB imagery and ground-truth data, with an R2 value of 0.56 and a root mean squared error (RMSE) value of 0.13 m. In many of these studies, a crop surface model derived from the point clouds was adopted to estimate PH. However, relatively low PH estimates were often obtained because information below the crop canopy is also included in the crop surface model [70]. Compared with the crop surface model approach, choosing an appropriate PH feature directly from the point clouds to represent the plant level can avoid the estimation error caused by information below the crop canopy. Therefore, in this study, point clouds derived from consumer-grade UAV RGB imagery were used directly to estimate PH. More specifically, the 1st percentile of point-cloud altitude was defined as the ground level, and six PH features (the 90th, 95th, 97th, 98th, and 99th percentiles and the maximum value of point-cloud altitude) were tested to find an optimal PH feature. The highest correlation with ground-truth PH (0.36–0.92 m) was obtained when the 99th percentile was defined as the plant level, with an R2 value of 0.90 and an RMSE value of 0.12 m. Compared with the ground-truth PH, the point-cloud-derived estimates were relatively low, with the fitted line below the 1:1 line (Figure 6). The reason may be that, during the study period, the highest point of the maize plant was the tip of a leaf (Figure 2b), which may be difficult to capture given the limited accuracy of the point clouds.
Both linear and exponential regression analyses were adopted to estimate maize fresh and dry AGB based on PH99, which had the best plant-height estimation performance. Compared with the linear regression models (both R2 values were 0.59), the exponential regression models had higher correlations for both fresh and dry AGB, with R2 values of 0.77 and 0.76, respectively. However, with respect to RMSE, the exponential regression models had higher values of 0.49 and 0.07 kg/m2 for maize fresh and dry AGB, compared with the linear models' values of 0.41 and 0.07 kg/m2, respectively. Similar results have been found for maize [71] and barley [23,72]. For example, when LiDAR-derived metrics (mean height and the coefficient of variation of normalized digital terrain model point elevations) were used to estimate maize AGB in the Heihe River Basin, Gansu Province, China, the exponential regression models had higher R2 values of 0.73 and 0.80, compared with R2 values of 0.68 and 0.67 for the linear regression models. The RMSE values of the linear and exponential regression models were similar when the mean height was used, whereas the exponential regression model had a higher RMSE value of 0.66 kg/m2 than the linear regression model (0.60 kg/m2) when the coefficient of variation of normalized digital terrain model point elevations was used. However, Tilly et al. [73] found the opposite result when PH derived from terrestrial laser scanning was used to estimate rice AGB in Jiansanjiang, Heilongjiang Province, China: for the rice varieties of village 69 and village 36, the exponential regression models had lower R2 values of 0.85 and 0.56, compared with R2 values of 0.90 and 0.60 for the linear regression models. The range of PH used to establish the AGB estimation model may explain these opposite observations. The ranges of mean PH per data-collection campaign for this study, Bendig et al. [23], Tilly et al. [72], and Tilly et al. [73] were 0.36–0.92 m, 0.14–1.00 m, 0.17–0.81 m, and 0.25–0.57 m, respectively. When the range of PH was relatively small, the linear regression model performed better, whereas when the range of PH was relatively large, the exponential regression model performed better. As shown in Figure 11, the slopes of the linear regression models between manually measured maize PH and biomass (both fresh and dry) increased rapidly, indicating that the exponential regression model was more suitable for describing the relationship between maize PH and biomass when all the data (a relatively large range of PH) were used.
For estimating crop AGB with VIs derived from UAV remote-sensing systems, there have traditionally been two main data sources, namely multispectral [38,40] and hyperspectral [32,74] imagery. For example, van der Meij et al. [75] estimated fresh oat AGB using a simple difference vegetation index derived from a UAV-based hyperspectral remote-sensing system and reported an R2 value of 0.56. Vega et al. [40] estimated sunflower AGB using the normalized difference vegetation index derived from a UAV multispectral remote-sensing system and reported R2 values ranging from 0.09 to 0.98. Recently, because of the high cost and low spatial resolution of UAV hyperspectral and multispectral remote-sensing systems, researchers have started to estimate crop AGB using vegetation indices derived from UAV RGB remote-sensing systems, which offer low cost and high spatial resolution [48,76], and have reported similar estimation performance. For example, Yue et al. [36] reported the same R2 value of 0.93 when random forest regression was used to estimate wheat AGB in Changping District, Beijing, China based on vegetation indices derived from UAV hyperspectral and RGB remote-sensing systems, respectively. Therefore, in this study, five VIs and one VI combination (the normalized green-red difference index (NGRDI), excess green index (ExG), excess green minus excess red (ExGR), color index of vegetation (CIVE), vegetation index (VEG), and the combination of ExG, ExGR, CIVE, and VEG (COM)) derived from consumer-grade UAV RGB imagery were adopted to estimate maize fresh and dry AGB using simple and multivariable linear regression analyses. The results showed good correlations when ExGR, NGRDI, and VEG were used to estimate maize AGB, with R2 values ranging from 0.63 to 0.73. When ExGR, NGRDI, and VEG were used together in the multivariable linear regression model, a higher R2 value of 0.82 was obtained for both maize fresh and dry AGB. Similar results have been found for wheat [36,37]. For example, Lu et al. [37] reported that when VIs derived from a UAV RGB remote-sensing system, with R2 values ranging from 0.04 to 0.62, were put into a random forest regression model (an ensemble learning method that combines a large number of decision trees to improve the accuracy of classification and regression trees), a higher correlation with wheat AGB was obtained, with an R2 value of 0.70. The different sensitivities of the VIs to different aspects of the crop may explain the improvement in estimation performance when multiple VIs were adopted. For example, NGRDI is sensitive to AGB before canopy closure [44], and VEG is invariant over the range of natural daylight illumination [50]. Adding VEG to the AGB estimation model based on NGRDI may therefore enhance the model's adaptability to varying natural daylight illumination, resulting in better estimation performance. Further studies are still needed.
To further improve estimation performance, researchers have started to estimate crop AGB using crop VIs and PH simultaneously. In this study, however, adding the optimal maize PH observation derived directly from the UAV RGB point clouds to the AGB multivariable linear regression model based on ExGR, NGRDI, and VEG resulted in only a slight increase of 0.03 in the R2 value for both fresh and dry AGB, and a decrease of only 0.01 kg/m2 in RMSE for fresh AGB. Similar results have been found for wheat [36,37] and barley [35]. For example, Lu et al. [37] reported an increase of 0.06 in the R2 value when wheat height was added to a random forest regression model based on UAV RGB VIs, and Näsi et al. [35] reported a decrease of 0.01 in the R2 value when barley height information derived from UAV RGB imagery was added to a random forest regression model based on UAV RGB VIs. There may be two reasons for the slight improvement when PH derived from UAV RGB imagery was added to the AGB estimation model based on UAV RGB VIs. On the one hand, the UAV RGB remote-sensing system may be less able to obtain an accurate ground level when estimating PH. Ota et al. [31] reported that the ground level derived from UAV RGB imagery was not accurate enough to estimate forest AGB in the tropics and that, for accurate AGB estimation, at least a single acquisition of airborne LiDAR was required to obtain the ground level. On the other hand, the VIs themselves have a significant relationship with PH. For example, Schirrmann et al. [48] reported a highly significant negative relationship between the blue-green ratio index and winter wheat PH (correlation of −0.93). As shown in Figure 12, highly significant relationships were also found in this study between the VIs and maize PH, with R2 values of 0.77, 0.85, and 0.78 for NGRDI, ExGR, and VEG, respectively. Therefore, when estimating crop AGB based on a UAV RGB remote-sensing system alone, exploring optimized VIs and establishing high-performance estimation models based on advanced algorithms (e.g., machine learning) may be a better way [35,36].

5. Conclusions

The unmanned aerial vehicle (UAV) RGB remote-sensing system, as one of the most important farm-scale data acquisition tools, has great application potential for the rapid, accurate, and economical estimation of above-ground biomass. Our results confirmed a high correlation between plant height derived directly from UAV RGB point clouds and ground-truth plant height, with a coefficient of determination (R2) value of 0.90 and an RMSE value of 0.12 m. Compared with the linear regression models (both R2 values were 0.59), the exponential regression models based on plant height alone had higher correlations for both fresh and dry above-ground biomass, with R2 values of 0.77 and 0.76, respectively. The range of plant height used to establish the estimation model may affect the relative performance of the linear and exponential regression models. The vegetation indices derived from UAV RGB imagery showed great potential for estimating maize above-ground biomass, with R2 values ranging from 0.63 to 0.73. When multiple vegetation indices were adopted, a higher correlation was obtained, with an R2 value of 0.82; the different sensitivities of the vegetation indices to different aspects of the crop may explain this improvement. It was also confirmed that adding plant height derived from UAV RGB imagery to the above-ground biomass estimation model based on vegetation indices did not significantly enhance the estimation performance. When estimating crop above-ground biomass based on a UAV RGB remote-sensing system alone, exploring optimized vegetation indices and establishing high-performance estimation models based on advanced algorithms (e.g., machine learning) may be a better way.

Author Contributions

Data curation, Y.N., L.Z. and X.P.; Funding acquisition, W.H.; Investigation, Y.N.; Methodology, Y.N. and L.Z.; Project administration, W.H.; Validation, X.P.; Writing–original draft, Y.N. and L.Z.; Writing–review & editing, H.Z. and W.H.

Funding

This study was supported by the 13th Five-Year Plan for Chinese National Key R&D Project (2017YFC0403203), Major Project of Industry-Education-Research Cooperative Innovation in Yangling Demonstration Zone in China (2018CXY-23), and the 111 Project (No. B12007).

Acknowledgments

We are grateful to Shuangfei Cheng, Chenyu Ge, Chengxuan Tan, and Chonghao Xu for data collection.

Conflicts of Interest

There are no conflicts of interest.

Appendix A

Figure A1. Correlations between five vegetation indices and one VI combination derived from UAV RGB imagery and fresh above-ground biomass (AGB), respectively. (a) Normalized green-red difference index (NGRDI); (b) excess green index (ExG); (c) excess green minus excess red (ExGR); (d) color index of vegetation (CIVE); (e) vegetation index (VEG); and (f) the combination of ExG, ExGR, CIVE and VEG (COM). R2 represents the coefficient of determination and RMSE represents the root mean square error.
Figure A2. Correlations between five vegetation indices and one VI combination derived from UAV RGB imagery and dry above-ground biomass (AGB), respectively. (a) normalized green-red difference index (NGRDI); (b) excess green index (ExG); (c) excess green minus excess red (ExGR); (d) color index of vegetation (CIVE); (e) vegetation index (VEG); and (f) the combination of ExG, ExGR, CIVE and VEG (COM). R2 represents the coefficient of determination and RMSE represents the root mean square error.

References

  1. Atkinson, J.A.; Jackson, R.J.; Bentley, A.R.; Ober, E.; Wells, D.M. Field Phenotyping for the Future. Annu. Plant Rev. Online 2018. [Google Scholar] [CrossRef]
  2. Tubiello, F. Climate Change Adaption and Mitigation: Challenges and Opportunities in the Food Sector; Natural Resources Management and Environment Department: Rome, Italy, 2012. [Google Scholar]
  3. Shiferaw, B.; Prasanna, B.M.; Hellin, J.; Bänziger, M. Crops that feed the world 6. Past successes and future challenges to the role played by maize in global food security. Food Secur. 2011, 3, 307. [Google Scholar] [CrossRef]
  4. Michez, A.; Bauwens, S.; Brostaux, Y.; Hiel, M.-P.; Garré, S.; Lejeune, P.; Dumont, B. How Far Can Consumer-Grade UAV RGB Imagery Describe Crop Production? A 3D and Multitemporal Modeling Approach Applied to Zea mays. Remote Sens. 2018, 10. [Google Scholar] [CrossRef]
  5. Wang, L.; Tian, Y.; Xia, Y.; Yan, Z.; Cao, W. Predicting grain yield and protein content in wheat by fusing multi-sensor and multi-temporal remote-sensing images. Field Crops Res. 2014, 164, 178–188. [Google Scholar] [CrossRef]
  6. Jin, X.; Yang, G.; Li, Z.; Xu, X.; Wang, J.; Lan, Y. Estimation of water productivity in winter wheat using the AquaCrop model with field hyperspectral data. Precis. Agric. 2016, 19, 1–17. [Google Scholar] [CrossRef]
  7. Nair, S.; Johnson, J.; Wang, C. Efficiency of Irrigation Water Use: A Review from the Perspectives of Multiple Disciplines. Agron. J. 2013, 105, 351–363. [Google Scholar] [CrossRef]
  8. Walter, J.; Edwards, J.; McDonald, G.; Kuchel, H. Photogrammetry for the estimation of wheat biomass and harvest index. Field Crops Res. 2018, 216, 165–174. [Google Scholar] [CrossRef]
  9. Basso, B.; Dumont, B.; Cammarano, D.; Pezzuolo, A.; Marinello, F.; Sartori, L. Environmental and economic benefits of variable rate nitrogen fertilization in a nitrate vulnerable zone. Sci. Total Environ. 2016, 545–546, 227–235. [Google Scholar] [CrossRef]
  10. Hiel, M.P.; Barbieux, S.; Pierreux, J.; Olivier, C.; Lobet, G.; Roisin, C.; Garré, S.; Colinet, G.; Bodson, B.; Dumont, B. Impact of crop residue management on crop production and soil chemistry after seven years of crop rotation in temperate climate, loamy soils. PeerJ 2018, 6, e4836. [Google Scholar] [CrossRef] [Green Version]
  11. Li, W.; Niu, Z.; Huang, N.; Wang, C.; Gao, S.; Wu, C. Airborne LiDAR technique for estimating biomass components of maize: A case study in Zhangye City, Northwest China. Ecol. Indic. 2015, 57, 486–496. [Google Scholar] [CrossRef]
  12. Becker-Reshef, I.; Vermote, E.; Lindeman, M.; Justice, C. A generalized regression-based model for forecasting winter wheat yields in Kansas and Ukraine using MODIS data. Remote Sens. Environ. 2010, 114, 1312–1323. [Google Scholar] [CrossRef]
  13. Lumbierres, M.; Méndez, P.; Bustamante, J.; Soriguer, R.; Santamaría, L. Modeling Biomass Production in Seasonal Wetlands Using MODIS NDVI Land Surface Phenology. Remote Sens. 2017, 9. [Google Scholar] [CrossRef]
  14. Li, M.; Wu, J.; Song, C.; He, Y.; Niu, B.; Fu, G.; Tarolli, P.; Tietjen, B.; Zhang, X. Temporal Variability of Precipitation and Biomass of Alpine Grasslands on the Northern Tibetan Plateau. Remote Sens. 2019, 11. [Google Scholar] [CrossRef]
  15. Chen, L.; Wang, Y.; Ren, C.; Zhang, B.; Wang, Z. Optimal Combination of Predictors and Algorithms for Forest Above-Ground Biomass Mapping from Sentinel and SRTM Data. Remote Sens. 2019, 11. [Google Scholar] [CrossRef]
  16. Pandit, S.; Tsuyuki, S.; Dube, T. Landscape-Scale Aboveground Biomass Estimation in Buffer Zone Community Forests of Central Nepal: Coupling In Situ Measurements with Landsat 8 Satellite Data. Remote Sens. 2018, 10. [Google Scholar] [CrossRef]
  17. Pham, T.D.; Yoshino, K. Aboveground biomass estimation of mangrove species using ALOS-2 PALSAR imagery in Hai Phong City, Vietnam. J. Appl. Remote Sens. 2017, 11. [Google Scholar] [CrossRef]
  18. Castillo, J.A.A.; Apan, A.A.; Maraseni, T.N.; Salmo, S.G. Estimation and mapping of above-ground biomass of mangrove forests and their replacement land uses in the Philippines using Sentinel imagery. ISPRS J. Photogramm. Remote Sens. 2017, 134, 70–85. [Google Scholar] [CrossRef]
  19. Cougo, M.; Souza-Filho, P.; Silva, A.; Fernandes, M.; Santos, J.; Abreu, M.; Nascimento, W.; Simard, M. Radarsat-2 Backscattering for the Modeling of Biophysical Parameters of Regenerating Mangrove Forests. Remote Sens. 2015, 7, 17097–17112. [Google Scholar] [CrossRef] [Green Version]
  20. Zhang, L.; Zhang, H.; Niu, Y.; Han, W. Mapping Maize Water Stress Based on UAV Multispectral Remote Sensing. Remote Sens. 2019, 11. [Google Scholar] [CrossRef]
  21. Herwitz, S.R.; Johnson, L.F.; Dunagan, S.E.; Higgins, R.G.; Sullivan, D.V.; Zheng, J.; Lobitz, B.M.; Leung, J.G.; Gallmeyer, B.A.; Aoyagi, M.; et al. Imaging from an unmanned aerial vehicle: agricultural surveillance and decision support. Comput. Electron. Agric. 2004, 44, 49–61. [Google Scholar] [CrossRef]
  22. Michez, A.; Lejeune, P.; Bauwens, S.; Herinaina, A.; Blaise, Y.; Castro Muñoz, E.; Lebeau, F.; Bindelle, J. Mapping and Monitoring of Biomass and Grazing in Pasture with an Unmanned Aerial System. Remote Sens. 2019, 11. [Google Scholar] [CrossRef]
  23. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  24. Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S.; et al. Unmanned Aerial System (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine. ISPRS J. Photogramm. Remote Sens. 2017, 134, 43–58. [Google Scholar] [CrossRef]
  25. Jayathunga, S.; Owari, T.; Tsuyuki, S. Digital Aerial Photogrammetry for Uneven-Aged Forest Management: Assessing the Potential to Reconstruct Canopy Structure and Estimate Living Biomass. Remote Sens. 2019, 11. [Google Scholar] [CrossRef]
  26. Ballesteros, R.; Ortega, J.F.; Hernandez, D.; Moreno, M.A. Onion biomass monitoring using UAV-based RGB imaging. Precis. Agric. 2018, 19, 840–857. [Google Scholar] [CrossRef]
  27. Brocks, S.; Bareth, G. Estimating Barley Biomass with Crop Surface Models from Oblique RGB Imagery. Remote Sens. 2018, 10. [Google Scholar] [CrossRef]
  28. Zhang, H.F.; Sun, Y.; Chang, L.; Qin, Y.; Chen, J.J.; Qin, Y.; Du, J.X.; Yi, S.H.; Wang, Y.L. Estimation of Grassland Canopy Height and Aboveground Biomass at the Quadrat Scale Using Unmanned Aerial Vehicle. Remote Sens. 2018, 10. [Google Scholar] [CrossRef]
  29. Silleos, N.G.; Alexandridis, T.K.; Gitas, I.Z.; Perakis, K. Vegetation Indices: Advances Made in Biomass Estimation and Vegetation Monitoring in the Last 30 Years. Geocarto Int. 2006, 21, 21–28. [Google Scholar] [CrossRef]
  30. Ballesteros, R.; Fernando Ortega, J.; Hernandez, D.; Angel Moreno, M. Characterization of Vitis vinifera L. Canopy Using Unmanned Aerial Vehicle-Based Remote Sensing and Photogrammetry Techniques. Am. J. Enol. Vitic. 2015, 66, 120–129. [Google Scholar] [CrossRef]
  31. Ota, T.; Ahmed, O.S.; Minn, S.T.; Khai, T.C.; Mizoue, N.; Yoshida, S. Estimating selective logging impacts on aboveground biomass in tropical forests using digital aerial photography obtained before and after a logging event from an unmanned aerial vehicle. For. Ecol. Manag. 2019, 433, 162–169. [Google Scholar] [CrossRef]
  32. Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J. Photogramm. Remote Sens. 2015, 108, 245–259. [Google Scholar] [CrossRef]
  33. Kachamba, D.J.; Orka, H.O.; Gobakken, T.; Eid, T.; Mwase, W. Biomass Estimation Using 3D Data from Unmanned Aerial Vehicle Imagery in a Tropical Woodland. Remote Sens. 2016, 8. [Google Scholar] [CrossRef]
  34. Sankaran, S.; Zhou, J.F.; Khot, L.R.; Trapp, J.J.; Mndolwa, E.; Miklas, P.N. High-throughput field phenotyping in dry bean using small unmanned aerial vehicle based multispectral imagery. Comput. Electron. Agric. 2018, 151, 84–92. [Google Scholar] [CrossRef]
  35. Näsi, R.; Viljanen, N.; Kaivosoja, J.; Alhonoja, K.; Hakala, T.; Markelin, L.; Honkavaara, E. Estimating Biomass and Nitrogen Amount of Barley and Grass Using UAV and Aircraft Based Spectral and Photogrammetric 3D Features. Remote Sens. 2018, 10. [Google Scholar] [CrossRef]
  36. Yue, J.B.; Feng, H.K.; Jin, X.L.; Yuan, H.H.; Li, Z.H.; Zhou, C.Q.; Yang, G.J.; Tian, Q.J. A Comparison of Crop Parameters Estimation Using Images from UAV-Mounted Snapshot Hyperspectral Sensor and High-Definition Digital Camera. Remote Sens. 2018, 10. [Google Scholar] [CrossRef]
  37. Lu, N.; Zhou, J.; Han, Z.X.; Li, D.; Cao, Q.; Yao, X.; Tian, Y.C.; Zhu, Y.; Cao, W.X.; Cheng, T. Improved estimation of aboveground biomass in wheat from RGB imagery and point cloud data acquired with a low-cost unmanned aerial vehicle system. Plant Methods 2019, 15. [Google Scholar] [CrossRef]
  38. Han, L.; Yang, G.J.; Dai, H.Y.; Xu, B.; Yang, H.; Feng, H.K.; Li, Z.H.; Yang, X.D. Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data. Plant Methods 2019, 15. [Google Scholar] [CrossRef]
  39. Li, J.T.; Shi, Y.Y.; Veeranampalayam-Sivakumar, A.N.; Schachtman, D.P. Elucidating Sorghum Biomass, Nitrogen and Chlorophyll Contents With Spectral and Morphological Traits Derived From Unmanned Aircraft System. Front. Plant Sci. 2018, 9. [Google Scholar] [CrossRef]
  40. Vega, F.A.; Ramírez, F.C.; Saiz, M.P.; Rosúa, F.O. Multi-temporal imaging using an unmanned aerial vehicle for monitoring a sunflower crop. Biosyst. Eng. 2015, 132, 19–27. [Google Scholar] [CrossRef]
  41. Turner, D.; Lucieer, A.; Watson, C. An Automated Technique for Generating Georectified Mosaics from Ultra-High Resolution Unmanned Aerial Vehicle (UAV) Imagery, Based on Structure from Motion (SfM) Point Clouds. Remote Sens. 2012, 4, 1392–1410. [Google Scholar] [CrossRef] [Green Version]
  42. Duan, T.; Zheng, B.; Guo, W.; Ninomiya, S.; Guo, Y.; Chapman, S.C. Comparison of ground cover estimates from experiment plots in cotton, sorghum and sugarcane based on images and ortho-mosaics captured by UAV. Funct. Plant Biol. 2017, 44. [Google Scholar] [CrossRef]
  43. Jannoura, R.; Brinkmann, K.; Uteau, D.; Bruns, C.; Joergensen, R.G. Monitoring of crop biomass using true colour aerial photographs taken from a remote controlled hexacopter. Biosyst. Eng. 2015, 129, 341–351. [Google Scholar] [CrossRef]
  44. Hunt, E.R., Jr.; Cavigelli, M.; Daughtry, C.S.T.; Mcmurtrey, J.E., III; Walthall, C.L. Evaluation of Digital Photography from Model Aircraft for Remote Sensing of Crop Biomass and Nitrogen Status. Precis. Agric. 2005, 6, 359–378. [Google Scholar] [CrossRef]
  45. Zheng, Y.; Zhang, M.; Zhang, X.; Zeng, H.; Wu, B. Mapping Winter Wheat Biomass and Yield Using Time Series Data Blended from PROBA-V 100- and 300-m S1 Products. Remote Sens. 2016, 8. [Google Scholar] [CrossRef]
  46. Kross, A.; McNairn, H.; Lapen, D.; Sunohara, M.; Champagne, C. Assessment of RapidEye vegetation indices for estimation of leaf area index and biomass in corn and soybean crops. Int. J. Appl. Earth Obs. Geoinf. 2015, 34, 235–248. [Google Scholar] [CrossRef] [Green Version]
  47. Ambrus, A.; Burai, P.; Lénárt, C.; Enyedi, P.; Kovács, Z. Estimating biomass of winter wheat using narrowband vegetation. J. Cent. Eur. Green Innov. 2015, 3, 13. [Google Scholar]
  48. Schirrmann, M.; Giebel, A.; Gleiniger, F.; Pflanz, M.; Lentschke, J.; Dammer, K.-H. Monitoring Agronomic Parameters of Winter Wheat Crops with Low-Cost UAV Imagery. Remote Sens. 2016, 8. [Google Scholar] [CrossRef]
  49. Kataoka, T.; Kaneko, T.; Okamoto, H.; Hata, S. Crop growth estimation system using machine vision. In Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Kobe, Japan, 20–24 July 2003. [Google Scholar]
  50. Hague, T.; Tillett, N.D.; Wheeler, H. Automated Crop and Weed Monitoring in Widely Spaced Cereals. Precis. Agric. 2006, 7, 21–32. [Google Scholar] [CrossRef]
  51. Guijarro, M.; Pajares, G.; Riomoros, I.; Herrera, P.J.; Burgos-Artizzu, X.P.; Ribeiro, A. Automatic segmentation of relevant textures in agricultural images. Comput. Electron. Agric. 2011, 75, 75–83. [Google Scholar] [CrossRef] [Green Version]
  52. Hamuda, E.; Glavin, M.; Jones, E. A survey of image processing techniques for plant extraction and segmentation in the field. Comput. Electron. Agric. 2016, 125, 184–199. [Google Scholar] [CrossRef]
  53. Verhoeven, G.J.J. It’s all about the format—Unleashing the power of RAW aerial photography. Int. J. Remote Sens. 2010, 31, 2009–2042. [Google Scholar] [CrossRef]
  54. Li, S.; Ding, X.; Kuang, Q.; Ata-Ui-Karim, S.T.; Cheng, T.; Liu, X.; Tian, Y.; Zhu, Y.; Cao, W.; Cao, Q. Potential of UAV-Based Active Sensing for Monitoring Rice Leaf Nitrogen Status. Front. Plant Sci. 2018, 9, 1834. [Google Scholar] [CrossRef] [Green Version]
  55. Akaike, H. A new look at the statistical model identification. IEEE Trans. Autom. Control 1974, 19, 716–723. [Google Scholar] [CrossRef]
  56. Li, W.; Niu, Z.; Chen, H.Y.; Li, D.; Wu, M.Q.; Zhao, W. Remote estimation of canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system. Ecol. Indic. 2016, 67, 637–648. [Google Scholar] [CrossRef]
  57. Matese, A.; Di Gennaro, S.F.; Berton, A. Assessment of a canopy height model (CHM) in a vineyard using UAV-based multispectral imaging. Int. J. Remote Sens. 2017, 38, 2150–2160. [Google Scholar] [CrossRef]
  58. Luo, S.; Wang, C.; Xi, X.; Pan, F.; Peng, D.; Zou, J.; Nie, S.; Qin, H. Fusion of airborne LiDAR data and hyperspectral imagery for aboveground and belowground forest biomass estimation. Ecol. Indic. 2017, 73, 378–387. [Google Scholar] [CrossRef]
  59. Fernandez-Alvarez, M.; Armesto, J.; Picos, J. LiDAR-Based Wildfire Prevention in WUI: The Automatic Detection, Measurement and Evaluation of Forest Fuels. Forests 2019, 10. [Google Scholar] [CrossRef]
  60. Guo, Q.; Su, Y.; Hu, T.; Zhao, X.; Wu, F.; Li, Y.; Liu, J.; Chen, L.; Xu, G.; Lin, G.; et al. An integrated UAV-borne lidar system for 3D habitat mapping in three forest ecosystems across China. Int. J. Remote Sens. 2017, 38, 2954–2972. [Google Scholar] [CrossRef]
  61. Madec, S.; Baret, F.; de Solan, B.; Thomas, S.; Dutartre, D.; Jezequel, S.; Hemmerle, M.; Colombeau, G.; Comar, A. High-Throughput Phenotyping of Plant Height: Comparing Unmanned Aerial Vehicles and Ground LiDAR Estimates. Front. Plant Sci. 2017, 8. [Google Scholar] [CrossRef]
  62. Wang, D.L.; Xin, X.P.; Shao, Q.Q.; Brolly, M.; Zhu, Z.L.; Chen, J. Modeling Aboveground Biomass in Hulunber Grassland Ecosystem by Using Unmanned Aerial Vehicle Discrete Lidar. Sensors 2017, 17. [Google Scholar] [CrossRef]
  63. Liu, K.; Shen, X.; Cao, L.; Wang, G.B.; Cao, F.L. Estimating forest structural attributes using UAV-LiDAR data in Ginkgo plantations. ISPRS J. Photogramm. Remote Sens. 2018, 146, 465–482. [Google Scholar] [CrossRef]
  64. Iizuka, K.; Yonehara, T.; Itoh, M.; Kosugi, Y. Estimating Tree Height and Diameter at Breast Height (DBH) from Digital Surface Models and Orthophotos Obtained with an Unmanned Aerial System for a Japanese Cypress (Chamaecyparis obtusa) Forest. Remote Sens. 2018, 10. [Google Scholar] [CrossRef]
  65. Bareth, G.; Schellberg, J. Replacing Manual Rising Plate Meter Measurements with Low-cost UAV-Derived Sward Height Data in Grasslands for Spatial Monitoring. PFG-J. Photogramm. Remote Sens. Geoinf. Sci. 2018, 86, 157–168. [Google Scholar] [CrossRef]
  66. Wang, X.Q.; Zhang, R.Y.; Song, W.; Han, L.; Liu, X.L.; Sun, X.; Luo, M.J.; Chen, K.; Zhang, Y.X.; Yang, H.; et al. Dynamic plant height QTL revealed in maize through remote sensing phenotyping using a high-throughput unmanned aerial vehicle (UAV). Sci. Rep. 2019, 9. [Google Scholar] [CrossRef]
  67. Schirrmann, M.; Hamdorf, A.; Giebel, A.; Gleiniger, F.; Pflanz, M.; Dammer, K.H. Regression Kriging for Improving Crop Height Models Fusing Ultra-Sonic Sensing with UAV Imagery. Remote Sens. 2017, 9. [Google Scholar] [CrossRef]
  68. Ziliani, M.G.; Parkes, S.D.; Hoteit, I.; McCabe, M.F. Intra-Season Crop Height Variability at Commercial Farm Scales Using a Fixed-Wing UAV. Remote Sens. 2018, 10. [Google Scholar] [CrossRef]
  69. Grüner, E.; Astor, T.; Wachendorf, M. Biomass Prediction of Heterogeneous Temperate Grasslands Using an SfM Approach Based on UAV Imaging. Agronomy 2019, 9. [Google Scholar] [CrossRef]
  70. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens. 2014, 6, 10395–10412. [Google Scholar] [CrossRef] [Green Version]
  71. Wang, C.; Nie, S.; Xi, X.; Luo, S.; Sun, X. Estimating the Biomass of Maize with Hyperspectral and LiDAR Data. Remote Sens. 2016, 9. [Google Scholar] [CrossRef]
  72. Tilly, N.; Aasen, H.; Bareth, G. Fusion of Plant Height and Vegetation Indices for the Estimation of Barley Biomass. Remote Sens. 2015, 7, 11449–11480. [Google Scholar] [CrossRef] [Green Version]
  73. Tilly, N.; Hoffmeister, D.; Cao, Q.; Lenz-Wiedemann, V.; Miao, Y.; Bareth, G. Transferability of Models for Estimating Paddy Rice Biomass from Spatial Plant Height Data. Agriculture 2015, 5, 538–560. [Google Scholar] [CrossRef] [Green Version]
  74. Yue, J.B.; Yang, G.J.; Li, C.C.; Li, Z.H.; Wang, Y.J.; Feng, H.K.; Xu, B. Estimation of Winter Wheat Above-Ground Biomass Using Unmanned Aerial Vehicle-Based Snapshot Hyperspectral Sensor and Crop Height Improved Models. Remote Sens. 2017, 9. [Google Scholar] [CrossRef]
  75. van der Meij, B.; Kooistra, L.; Suomalainen, J.; Barel, J.M.; De Deyn, G.B. Remote sensing of plant trait responses to field-based plant-soil feedback using UAV-based optical sensors. Biogeosciences 2017, 14, 733–749. [Google Scholar] [CrossRef]
  76. Ballesteros, R.; Ortega, J.F.; Hernandez, D.; del Campo, A.; Moreno, M.A. Combined use of agro-climatic and very high-resolution remote sensing information for crop monitoring. Int. J. Appl. Earth Obs. Geoinf. 2018, 72, 66–75. [Google Scholar] [CrossRef]
Figure 1. Location and region division of the research field: (a) location of the research field in China; (b) aerial view of the research field indicating region division, the locations of the sampling plots, and the ground control points (green circles). RGN is the abbreviation for region. The aerial image was taken on DOY 185.
Figure 2. The locations of the sampling sites within each sampling plot (a) and the method used to measure maize plant height within each sampling site (b). PHref represents the reference plant height.
Figure 3. The acquisition and pretreatment of unmanned aerial vehicle (UAV) RGB imagery, including camera parameter setting, flight route design, UAV flights, and image mosaicking and correction. GCP is the abbreviation for ground control point.
Figure 4. Schematic indicating how the six maize plant height features (PH90, PH95, PH97, PH98, PH99, and PHmax) were selected for each sampling plot. (a) The cumulative probability distribution curve of the point cloud within each sampling plot; (b) a cross-sectional view of the point cloud within a sampling plot illustrating the calculation of plant height using height percentiles.
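As a concrete illustration of the percentile-based selection shown in Figure 4, the following Python sketch (an assumed workflow, not the authors' code) derives the six plant height features from one plot's normalized point cloud, where point heights are already expressed above the ground surface (e.g., a crop surface model minus the terrain model).

```python
# Minimal sketch (assumed workflow): per-plot plant height features from
# a normalized point cloud whose heights are in meters above ground.
import numpy as np

def plot_height_features(heights):
    """Return PH90, PH95, PH97, PH98, PH99, and PHmax for one sampling plot."""
    percentiles = [90, 95, 97, 98, 99]
    features = {f"PH{p}": float(np.percentile(heights, p)) for p in percentiles}
    features["PHmax"] = float(heights.max())
    return features

# Example with synthetic canopy heights, for illustration only
rng = np.random.default_rng(0)
demo_heights = np.clip(rng.normal(1.6, 0.3, 5000), 0, None)  # hypothetical points
print(plot_height_features(demo_heights))
```

Using a high percentile instead of the absolute maximum limits the influence of isolated noisy points above the canopy, which is why several percentile candidates are compared against the reference heights.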
Figure 5. Schematic indicating the main procedures to obtain the optimal estimation model of maize above-ground biomass (AGB) based on features derived from UAV RGB imagery.
Figure 6. Correlations between maize plant heights derived from UAV RGB point clouds and ground-truth plant heights (PHref). Maize plant height estimates were obtained by using the 90th (a), 95th (b), 97th (c), 98th (d), and 99th (e) percentiles and the maximum value (f) of point cloud altitude to represent the plant level, respectively. R2 represents the coefficient of determination and RMSE represents the root mean square error.
Figure 7. Correlations between maize plant heights derived from UAV RGB point clouds and ground-truth fresh or dry above-ground biomass (AGB). Maize plant height (PH99) observations were obtained by using the 99th percentile of point cloud altitude to represent the plant level. AIC represents the Akaike Information Criterion; (a,c) are the linear and exponential regression models for fresh above-ground biomass, respectively; (b,d) are the linear and exponential regression models for dry above-ground biomass, respectively.
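To make the model comparison in Figure 7 concrete, the hedged sketch below fits a linear and an exponential AGB-PH99 model and ranks them with a Gaussian-error AIC (n·ln(RSS/n) + 2k). The sample arrays are hypothetical placeholders, and the exact AIC variant used in the study may differ.

```python
# Hedged sketch: linear vs. exponential AGB-PH99 models compared by AIC.
import numpy as np
from scipy.optimize import curve_fit

def aic(y, y_hat, n_params):
    """Gaussian-error AIC: n * ln(RSS / n) + 2 * k."""
    rss = np.sum((y - y_hat) ** 2)
    n = len(y)
    return n * np.log(rss / n) + 2 * n_params

ph99 = np.array([1.1, 1.4, 1.6, 1.9, 2.1, 2.3])   # hypothetical PH99 (m)
agb  = np.array([0.4, 0.7, 0.9, 1.4, 1.9, 2.6])   # hypothetical fresh AGB (kg/m2)

# Linear model: AGB = a * PH + b
(a_lin, b_lin), _ = curve_fit(lambda x, a, b: a * x + b, ph99, agb)
aic_lin = aic(agb, a_lin * ph99 + b_lin, 2)

# Exponential model: AGB = a * exp(b * PH)
(a_exp, b_exp), _ = curve_fit(lambda x, a, b: a * np.exp(b * x), ph99, agb, p0=(0.1, 1.0))
aic_exp = aic(agb, a_exp * np.exp(b_exp * ph99), 2)

print(f"AIC linear: {aic_lin:.2f}, AIC exponential: {aic_exp:.2f}")
```

A lower AIC indicates the preferred model; because both candidate models here have two parameters, the comparison effectively reduces to which form leaves the smaller residual sum of squares.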
Figure 8. Correlations between maize above-ground biomass estimated from NGRDI, ExGR, and VEG by multivariable linear regression and ground-truth above-ground biomass; (a,b) are maize fresh and dry above-ground biomass, respectively.
Figure 9. Correlations between maize above-ground biomass estimated from NGRDI, ExGR, VEG, and plant height by multivariable linear regression and ground-truth above-ground biomass; (a,b) are maize fresh and dry above-ground biomass, respectively.
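The multivariable regressions behind Figures 8 and 9 amount to ordinary least squares on the selected plot-level predictors. The sketch below (using hypothetical values, not the study data) fits AGB against NGRDI, ExGR, and VEG and reports R2 and RMSE; appending PH99 as a fourth predictor column would reproduce the Figure 9 setup.

```python
# Hedged sketch: multivariable linear regression of AGB on RGB vegetation indices.
import numpy as np

def ols_fit(X, y):
    """Ordinary least squares with an intercept column; returns coefficients."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

def r2_rmse(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot, np.sqrt(ss_res / len(y))

# Hypothetical per-plot predictors (columns: NGRDI, ExGR, VEG) and fresh AGB (kg/m2)
X_vi = np.array([[0.10, 0.15, 1.2],
                 [0.14, 0.22, 1.5],
                 [0.18, 0.30, 1.9],
                 [0.21, 0.36, 2.2],
                 [0.25, 0.44, 2.6]])
y = np.array([0.5, 0.9, 1.4, 1.8, 2.3])

beta = ols_fit(X_vi, y)
y_hat = np.column_stack([np.ones(len(y)), X_vi]) @ beta
print("R2 = %.2f, RMSE = %.2f kg/m2" % r2_rmse(y, y_hat))
```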
Figure 10. Maps of maize fresh and dry above-ground biomass based on vegetation indices derived from UAV RGB imagery for DOY 171, 177, and 185 in 2018. AGB_f and AGB_d are the abbreviations for maize fresh and dry above-ground biomass, respectively.
Figure 11. Changes in the slopes of the linear regression models between manually measured maize plant height and biomass during DOY 170–185. (a) Fresh maize biomass; (b) dry maize biomass. k1, k2, and k3 represent the slopes of the linear regression models based on data collected on DOY 170, 177, and 185, respectively.
Figure 12. Correlations between three vegetation indices derived from UAV RGB imagery and manually measured maize plant height. (a) Normalized green-red difference index (NGRDI); (b) excess green minus excess red (ExGR); (c) vegetation index (VEG). p < 0.01 represents the significance level.
Table 1. Main parameters of the UAV RGB imagery acquisition system.

Parameter                Value
Wheelbase                350 mm
Weight                   1388 g
Flight time              30 min
Communication radius     5 km
Speed                    <72 km/h
Image sensor             1-inch CMOS
RGB color space          sRGB
Camera resolution        4864 × 3648 pixels
Lens focal length        8.8 mm/24 mm
Lens field of view       84°
ISO range                100–12,800
Shutter speed            8–1/8000 s
Image format             JPEG; DNG
Table 2. Correlations between vegetation indices derived from UAV RGB imagery and ground-truth fresh or dry above-ground biomass. Five vegetation indices and one vegetation index combination, the normalized green-red difference index (NGRDI), excess green index (ExG), excess green minus excess red (ExGR), color index of vegetation (CIVE), vegetation index (VEG), and the combination of ExG, ExGR, CIVE, and VEG (COM), were used to establish linear regression estimation models for both maize ground-truth fresh and dry above-ground biomass.

Vegetation Index and Combination    Fresh Above-Ground Biomass         Dry Above-Ground Biomass
                                    R2        RMSE (kg/m2)             R2        RMSE (kg/m2)
NGRDI                               0.70      0.34                     0.68      0.04
ExG                                 0.34      0.50                     0.33      0.06
ExGR                                0.73      0.32                     0.73      0.04
CIVE                                0.02      0.62                     0.02      0.09
VEG                                 0.65      0.37                     0.63      0.05
COM                                 0.73      0.32                     0.72      0.05
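For reference, the RGB indices listed in Table 2 can be computed per pixel from the orthomosaic bands. The sketch below uses the definitions commonly reported in the literature (ExG and ExR on chromatic coordinates, CIVE with Kataoka-style coefficients, VEG with a = 0.667, and a frequently cited COM weighting); the exact normalization and weights used in this study may differ.

```python
# Hedged sketch: common literature definitions of the RGB vegetation indices
# in Table 2; the paper's exact normalization or COM weighting may differ.
import numpy as np

def rgb_indices(R, G, B, eps=1e-6):
    """Compute NGRDI, ExG, ExGR, CIVE, VEG, and COM from float RGB band arrays."""
    total = R + G + B + eps
    r, g, b = R / total, G / total, B / total        # chromatic coordinates

    ngrdi = (G - R) / (G + R + eps)
    exg = 2 * g - r - b
    exr = 1.4 * r - g
    exgr = exg - exr
    cive = 0.441 * r - 0.811 * g + 0.385 * b + 18.78745
    veg = g / (np.power(r, 0.667) * np.power(b, 1 - 0.667) + eps)
    com = 0.25 * exg + 0.30 * exgr + 0.33 * cive + 0.12 * veg
    return dict(NGRDI=ngrdi, ExG=exg, ExGR=exgr, CIVE=cive, VEG=veg, COM=com)

# Example on a tiny synthetic patch (values are illustrative only)
R = np.array([[0.30, 0.28]])
G = np.array([[0.45, 0.50]])
B = np.array([[0.25, 0.22]])
print({k: np.round(v, 3) for k, v in rgb_indices(R, G, B).items()})
```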
