Review

A State-of-the-Art Analysis of Obstacle Avoidance Methods from the Perspective of an Agricultural Sprayer UAV’s Operation Scenario

1 School of Agricultural Engineering, Jiangsu University, Zhenjiang 212013, China
2 Key Laboratory of Modern Agricultural Equipment and Technology, Ministry of Education, Jiangsu University, Zhenjiang 212013, China
3 Department of Agricultural Engineering, Bahauddin Zakariya University, Multan, Punjab 60800, Pakistan
4 School of Aerospace Engineering, University of Michigan, 1320 Beal Avenue, Ann Arbor, MI 48109, USA
* Author to whom correspondence should be addressed.
Agronomy 2021, 11(6), 1069; https://doi.org/10.3390/agronomy11061069
Submission received: 14 April 2021 / Revised: 11 May 2021 / Accepted: 17 May 2021 / Published: 26 May 2021

Abstract

Over the last decade, Unmanned Aerial Vehicles (UAVs), also known as drones, have been broadly utilized in various agricultural fields, such as crop management, crop monitoring, seed sowing, and pesticide spraying. Nonetheless, autonomy remains a crucial limitation of Internet of Things (IoT) UAV systems, especially when they are used as sprayer UAVs, where data need to be captured and preprocessed for robust real-time obstacle detection and collision avoidance. Moreover, because of the differences in objectives and operation between general UAVs and sprayer UAVs, not every obstacle detection and collision avoidance method is suitable for sprayer UAVs. In this regard, this article reviews the most relevant developments in all correlated branches of the obstacle avoidance scenario for agricultural sprayer UAVs, including a sprayer UAV's structural details. Furthermore, the most relevant open challenges for current sprayer UAV solutions are enumerated, thus paving the way for future researchers to define a roadmap toward new-generation, affordable autonomous sprayer UAV solutions. Agricultural sprayer UAVs require data-intensive algorithms to process the images they acquire, and expertise in autonomous flight is usually needed. The present study concludes that sprayer UAVs still face obstacle detection challenges due to their dynamic operating and loading conditions.

1. Introduction

The integration of Unmanned Aerial Vehicles (UAVs) with IoT (Internet of Things) devices, such as embedded sensors and communication elements, for agricultural operations is growing at a significantly faster pace than expected [1,2]. These IoT devices greatly enhance the capabilities of UAVs and enable UAVs to be used in a wide range of agricultural crop management operations, including field mapping [3,4], plant-stress detection [5,6], biomass estimation [7,8], weed management [9,10], inventory counting [11], etc.
Moreover, disease and pest control are mostly achieved by applying different chemical agents using different distribution systems [12,13]. Among these distribution systems, manual air-pressure and battery-powered backpack sprayers constitute the leading spraying equipment [14,15,16]. Because of the toxicity of manual spraying to human health, sprayer UAVs integrated with IoT devices have been deployed to replace crude and ineffective manual spraying methods [17,18,19]. These IoT sprayer UAVs are expected to revolutionize agronomy by finishing tasks in hours instead of days, reducing human exposure during pest outbreaks, balancing pesticide deposition on crops, being environmentally friendly, promising significant cost reductions, and increasing crop yield [20,21]. However, for IoT sprayer UAVs to perform these tasks, they require certain characteristics that differ significantly from those of other applications [22,23,24,25,26,27,28,29,30,31,32,33,34,35]. For instance, pesticide spraying with a manually piloted UAV, especially at night or in the dark, relies mainly on the observation and judgement of the human pilot. Since the judgement of a human pilot is prone to a larger error margin, the fundamental objective of the operation, namely precise spraying, may not be achieved, and the procedure may at the same time be hazardous. To mitigate this limitation, autonomous sprayer UAVs have been widely employed.
Nonetheless, when a sprayer UAV maneuvers in such unfavorable environments, the presence of obstacles, such as heavy-duty tools, mobile vehicles, and robots, has to be considered in order to avoid collisions [36], while still maintaining "precise spraying" and the "safety" of the UAV. The fundamental task of the sprayer UAV thus becomes one of "intelligent technology operation", whereby the UAV identifies obstacles autonomously and completes the specified collision avoidance actions with a smooth maneuver. In other words, the sprayer UAV system should autonomously and efficiently avoid all types of obstacles in the flight path, ensure spray coverage, prevent accidents caused by heavy load oscillation, avoid autonomous flight catastrophes, and efficiently reduce the loss of property and human fatalities.
How to effectively perform the abovementioned task is mainly dependent on the obstacle detection and collision avoidance algorithms incorporated into the sprayer UAV system. Hence, this article seeks to analyze the latest developments on all correlated branches of the obstacle avoidance scenarios of agricultural sprayer UAVs and includes the following main contributions.
  • The most relevant obstacle detection and collision avoidance techniques are reviewed and discussed, as well as their application perspective with agricultural sprayer UAVs.
  • The latest path planning algorithms that are used in agricultural UAVs and the structural challenge of sprayer UAVs are described.
  • The operational pattern, detection sensors, obstacles in agricultural farmlands, and control architecture for collision avoidance are thoroughly highlighted to pave the way for future researchers to design their own agricultural sprayer UAV systems.
  • The core open challenges and recent technical limitations associated with agricultural sprayer UAVs are enumerated.
The rest of this article is organized as follows. Section 1.1 introduces the general background and related work on obstacle detection and collision avoidance of UAVs. Section 2 analyzes the constraints of agricultural sprayer UAVs. A review of the operational pattern and the general spraying architecture of sprayer UAVs, including farmland obstacles, is also presented in this section. In Section 3, details of the various obstacle detection and collision avoidance algorithms, including the various sensing techniques, are presented. Finally, Section 4 highlights the currently open challenges, and concluding remarks are presented in Section 5.

1.1. General Background and Related Work

According to Sedighi et al. [37], obstacle avoidance and navigation problems can be separated into two main parts: global path planning and local collision avoidance. Global path planning creates a set of waypoints, from an initial starting point to a goal point, that maneuvers around obstacles in a working space, while local collision avoidance creates waypoint tasks and uses local sensing as a local goal for obstacle avoidance [38]. These obstacle avoidance techniques employ sophisticated algorithms, including the graph technique [14], the heuristic search technique [39], and the potential fields technique [40]. For this review, autonomy-level techniques specific to agricultural UAVs are analyzed, in which the sprayer UAV should be able to perform both path planning and obstacle detection in order to avoid collisions during navigation.
Early works on collision avoidance focused on “static obstacles”, in which the autonomous system senses its environment to avoid stationary obstacles [41,42] and difficult situations using path planning techniques [43] and various other algorithms, such as decision trees [44]. In view of this, static obstacles can further be categorized as either “single static obstacle” or “multiple static obstacles”. For the “single static obstacle”, the collision cone technique by Chakravarthy and Ghose [45], velocity obstacle approach by Fiorini and Shiller [46], radar-assisted collision guidance strategy (RACAGS) by Ajith Kumar and Ghose [47], and cross-entropy by Olivares-Mendez et al. [48] are examples of the collision avoidance methods. On the other hand, the Mixed-Integer Linear Program (MILP) approach by Richards and How [49], visibility graph method by Lozano-Pérez and Wesley [42], modified Grossberg Neural Network (GNN) by Wang, et al. [50], and ellipsoidal bounding box method by Park and Baek [51] are some examples of “multiple static obstacles” avoidance techniques.
In the case of obstacle avoidance methods for UAVs, numerous reviews and surveys have been conducted, either for general UAVs or for specific fixed-wing or multirotor UAVs. Mukhtar et al. [52] presented a systematic review of sensors and vision-based methods for vehicle detection and tracking in collision avoidance systems for on-road driving. Rybus [53] presented an interesting survey focused on space robotics and the major challenges of collision-free trajectory planning for manipulators mounted on large orbital structures, small satellites, and space robots navigating in proximity to large orbital structures. Lu et al. [54] presented a review of vision-based methods for positioning and mapping, including obstacle avoidance and path planning. Regarding the use of deep learning (DL) approaches for robotic solutions, Shabbir and Anwer [55] presented a survey on the use of DL techniques for robotic perception, including robotic control and exploration, as well as robotic navigation and autonomous driving.
However, there are only a few articles on agricultural sprayer UAVs. For instance, Faiçal et al. [56] analyzed the effect of the number of communication messages exchanged between UAVs and the wireless sensor network (WSN). They concluded that the wastage of pesticides and fertilizers could be significantly minimized if feedback information from the sensors is utilized to adjust the routes. Bae and Koo [57] analyzed the effectiveness of a roll-balanced helicopter and showed how the balancing behavior of the designed helicopter can result in left–right balanced spray patterns. Giles and Billing [58] evaluated the economic and technical feasibility of deploying the RMAX helicopter (Yamaha Motor Co. Ltd., Iwata, Shizuoka, Japan) for spraying commercial vineyards.
Nonetheless, because of the complexity of manual flying, autonomous flying has to be taken into consideration for obstacle detection and collision avoidance in the operations of sprayer UAVs, where autonomous navigation is a must. This is the focus of this review.

2. Constraints and Challenges of Agricultural Sprayer UAVs

The main objective of a sprayer UAV is to spray the maximum farming area with adequate spray coverage and good droplet deposition. However, when spraying, some constraints within the agricultural field have to be taken into consideration. These include (a) field shapes; (b) different weather conditions; and (c) obstacles in the field [59]. Other spraying performance constraints are inherent to the sprayer UAV itself, and these include (a) the liquid load, which will eventually decrease; (b) liquid capacity or vehicle weight; (c) battery/fuel capacity; (d) vehicle type; and (e) spray pressure [24,60,61]. For brevity, the basic diagram of the agricultural sprayer UAV is shown in Figure 1.
For the sprayer UAV to be able to perform within the abovementioned constraints, some significant farmland spraying issues have to be considered, and these “spraying issues” are highlighted in the subsequent sections.

2.1. Challenges

A manually flown UAV requires an expert pilot; limited piloting expertise may therefore decrease spraying efficiency. Because of human error, autonomous sprayer UAVs are becoming more popular. Similar to agricultural autonomous vehicles, the autonomous sprayer UAV also follows a coverage route plan. Numerous studies have been conducted on route planning for agricultural field coverage. Some route planning studies focus on different geometrical field shapes [62,63,64,65], while others have been conducted without concern for the geometry of the field [66,67,68,69]. A typical path planning structure is shown in Figure 2.
Torres et al. [70] proposed a path planning algorithm based on a 2D image that optimizes battery consumption. Other path-planning methods have been constructed using satellite imagery, focusing on coverage plans that account for larger obstacles visible from satellite images [71,72,73]. Zhang et al. [74] also developed a path-planning method for crop protection UAVs that excludes obstacle circles identified from satellite imagery. Figure 3 shows the filtered path plan excluding the obstacles visible from satellite imagery.
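As a simple illustration of coverage route planning, the sketch below generates boustrophedon (back-and-forth) waypoints over a rectangular field; the field dimensions and swath width are illustrative values, not taken from the cited studies.

```python
# Minimal sketch: boustrophedon (back-and-forth) coverage waypoints for a
# rectangular field. Field size and swath width are illustrative values.

def coverage_waypoints(field_width, field_length, swath):
    """Return a list of (x, y) waypoints that sweep the field in parallel passes."""
    waypoints = []
    x = swath / 2.0                      # center the first pass on half a swath
    going_up = True
    while x <= field_width:
        y_start, y_end = (0.0, field_length) if going_up else (field_length, 0.0)
        waypoints.append((x, y_start))
        waypoints.append((x, y_end))
        x += swath                       # shift to the next spray pass
        going_up = not going_up          # alternate direction each pass
    return waypoints

if __name__ == "__main__":
    for wp in coverage_waypoints(field_width=40.0, field_length=100.0, swath=5.0):
        print(wp)
```

A real coverage planner would additionally account for field shape, wind direction, and no-spray zones, as discussed above.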

2.2. Liquid Load and Sloshing

As is well known, the agricultural sprayer UAV carries a liquid tank, and this tank presents three situations of concern: (a) the liquid level continuously decreases during flight; (b) vehicle flight movements change the angle of the vehicle as well as of the tank; and (c) the tank liquid sloshes when the direction changes. The first situation is that the decreasing level of liquid continuously shifts the tank's center of gravity.
Second, the flight's activity changes the tank's angle, which also changes the center of gravity of the liquid tank. Khorsandi, Ayers, Freeland, and Wang [75] have shown how a tilted tank changes the center of gravity, as shown in Figure 4. The third situation, liquid sloshing, is related to the liquid level, tank tilt angle, and velocity [76,77,78].
In Figure 5, the sloshing impact inside a tank with a 30% filling rate is shown, as investigated experimentally by Li Xi [79]. The experiment was conducted specifically on sprayer UAV liquid sloshing and showed that inner horizontal and vertical grilles can effectively reduce tank liquid sloshing and the oscillation of the sprayer UAV's tank. In the literature, several works have addressed the reduction of sloshing in large liquid tankers [80,81,82,83,84].

2.3. Obstacles on Farmland

Due to geographical position and seasonal changes, the agricultural sector faces uncontrollable weather conditions such as strong wind, drought, freezing wind, and fog. Thus, even at the same geographical position, an agricultural UAV faces different situations and may need to change its flight parameters, such as motor speeds and PID gains, in order to maintain stability and position [85,86]. Moreover, sprayer UAVs are relatively heavier than other UAVs due to the liquid load [87,88]. During spraying operations, droplet deposition is the primary concern for an agricultural UAV, and it is directly related to flight parameters such as flying velocity and altitude [89]. These parameter settings are selected by operators depending on plant growth and type, terrain, topography of the farmland, etc. [90,91,92].
With regard to these spraying issues, the safety of the sprayer UAV should always be ensured. Because of low-altitude flying, the obstacles encountered on farmland are quite specific, such as ladders, pump houses, electrical substations, power lines, telephone towers, lighting towers, groups of trees, scattered trees, flying birds or bats, etc. Example obstacle images (satellite images) are shown in Figure 6. To avoid these kinds of obstacles successfully and intelligently, it is essential to analyze the features of the obstacles on farmland. We therefore categorized all the possible obstacles on farmland; their details are given in Table 1.
After categorizing all possible obstacles on farmland, it can be seen that some of these obstacles, the larger ones, can be detected by a global detection system, while others need to be detected by a local detection system [70]. Local detection and obstacle avoidance operations need real-time analysis, intelligent identification, potential area detection, a suitable path, and so on [97]. Thus, the sprayer UAV needs a suitable detection sensor or sensor fusion; obstacle detection sensors are therefore discussed in the next section.

3. Obstacle Avoidance Scenario

The primary motivation for obstacle avoidance, specifically for sprayer UAVs, comes from the fast-growing number of commercial sprayer UAVs and their push toward full autonomy. Unlike other types of UAVs, the avoidance scenarios of sprayer UAVs are different for three reasons: (a) the UAV carries a liquid load, which is comparatively heavy; (b) the UAV must maximize its spray coverage; and (c) the UAV must make full use of its battery capacity so that it can spray more area. Various techniques have been proposed for obstacle avoidance. The basic idea behind the obstacle avoidance scenario for a sprayer UAV is to detect the obstacle precisely and create a suitable avoidance trajectory with respect to spraying and safety. This section describes the most important obstacle avoidance scenarios.
Several technologies have been developed for obstacle detection, and numerous works have been performed using sonar, radar, laser, vision, and fusion methods. For mission-following UAVs, three control architectures have been developed, namely, the reactive control architecture, the deliberative planning architecture, and the hybrid control architecture. Thus, by combining different obstacle detection methods and control architectures, several techniques have been developed for obstacle avoidance. We categorize the detection methods and describe some obstacle avoidance approaches with the requirements of sprayer UAVs in mind. Finally, we summarize the scenario using existing works on control, detection sensors, detection methods, and avoidance approaches, as shown in Figure 7.

3.1. Avoidance Plan and Control Architecture

The control architecture for obstacle avoidance in a UAV describes how avoidance actions are organized from the perceived environmental data. Three types of control architectures meet the obstacle avoidance requirement: reactive control, deliberative planning, and hybrid integration control [36]. The functional diagram is shown in Figure 8. Typically, the flying mission and the obstacle avoidance operation are separate tasks. In the reactive control architecture, the UAV starts flying according to the mission plan and, when it senses an obstacle locally during the mission, it avoids the obstacle by following real-time sensor data.
In the deliberative planning architecture, the UAV first starts flying according to the flight plan; then, during operation, it senses the working environment and updates the map model. After the initial flight, the final map is determined, and the optimal sequence of the collision-free path is generated. However, this method takes a longer time to determine the definitive plan, and without accurate positioning data, it will not be sufficient.
Combining reactive control and deliberative planning, Nakhaeinia et al. [98] built a hybrid architecture with three layers, namely, the deliberative layer, the control execution layer, and the reactive control layer, which are also shown in Figure 8c. Here, the deliberative layer generates an optimal collision-free plan, which is then transferred to the reactive layer to generate the UAV's control action. The execution layer connects the other two layers.
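The sketch below illustrates this hybrid idea in a highly simplified form, assuming a precomputed waypoint plan (standing in for the deliberative layer) and a single range reading per waypoint (standing in for the reactive layer); the waypoints, safety distance, and dodge maneuver are hypothetical, not taken from the cited architecture.

```python
# Minimal sketch of a hybrid control loop: a deliberative layer supplies a
# precomputed waypoint plan, while a reactive layer overrides it whenever a
# range sensor reports an obstacle closer than a safety distance.
# All values are hypothetical.

SAFE_DISTANCE = 5.0  # meters; illustrative threshold

def deliberative_plan():
    """Global, collision-free plan computed before/while flying (placeholder)."""
    return [(0, 0), (0, 50), (5, 50), (5, 0)]

def reactive_command(range_reading, planned_target):
    """Local override: sidestep when an obstacle is inside the safety distance."""
    if range_reading < SAFE_DISTANCE:
        x, y = planned_target
        return (x + SAFE_DISTANCE, y)   # simple lateral dodge, for illustration
    return planned_target

def execution_layer(sensor_ranges):
    """Connects the two layers: feeds plan targets and sensor data to the reactive layer."""
    plan = deliberative_plan()
    for target, reading in zip(plan, sensor_ranges):
        command = reactive_command(reading, target)
        print(f"target {target} -> command {command} (range {reading} m)")

if __name__ == "__main__":
    execution_layer(sensor_ranges=[20.0, 3.2, 15.0, 18.0])
```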

3.2. Detection Sensors

Due to atmospheric conditions, such as lighting differences, temperature differences, spray drift or spray fog, water vaporization, etc., selecting the type of sensors for obstacle detection is an essential factor for an agricultural sprayer UAV [99,100,101,102]. Different kinds of sensors generate the information about the obstacle and its surrounding environment that the UAV needs to calculate the obstacle distance and to generate a safe path around the obstacle; this information includes the size, shape, and location of the obstacles [103,104]. According to the task requirements, different systems use various sensor setups [105].
Sensors can be classified along two functional axes [106]: proprioceptive/exteroceptive and passive/active. Proprioceptive sensors measure a vehicle's internal state, such as position, orientation, and speed, using sensors such as velocity sensors [107], tilt sensors [108], position sensors [109], heading sensors [110], accelerometers [111], etc. Exteroceptive sensors collect information and features from the surrounding environment, including data on the obstacles around the vehicle, using sensors such as ToF sensors [112], lidar [113], lasers [114], cameras [115], sonar [116], microwave radar [117], etc. These sensors provide information collected from the surroundings and help the vehicle in its decision-making and interaction with the environment.
From the viewpoint of how they obtain information, the other classification is passive or active; these are different sensing approaches used to detect obstacles and record information about an obstacle's presence on the path. Passive sensors use available environmental energy sources to gather data about the surroundings [106]. The sun provides energy in the visible spectrum that is reflected off objects; these visible wavelengths and reflections can be detected by light- or wavelength-detecting sensors and different camera types, such as CCD, CMOS, and thermal cameras [118,119]. That is why these kinds of sensors are often described as vision-based sensors, which generate information based on images [120,121,122,123,124,125,126,127,128,129]. On the other hand, active sensors emit their own energy, such as light or sound waves, and detect the reflected energy using sonar, microwave radar, laser, ToF, etc. From the reflected energy, the sensor reconstructs the surrounding environment. Because active sensors generate information through their own controlled interactions, their performance is sometimes better. However, the emitted energy can be influenced by other energy sources, which may cause erroneous readings [106,130,131,132,133,134,135,136].
For agricultural UAVs operating in different environments, a single sensor may not fulfill the task requirements. In that case, a multi-sensor combination, or sensor fusion, can be used for better performance [137,138]. Hrabar et al. [139], McGuire et al. [140], and Santos et al. [141] proposed fusing optical flow and stereo vision sensors to improve the accuracy of obstacle avoidance. Gageik et al. [142] combined infrared and ultrasonic sensors for obstacle avoidance at low cost. On farmland, the obstacles and situations are different and more specific. Kragh et al. [143] compiled a datasheet of the performance of different sensor types using a ground robot on farmland. After studying the Chinese agricultural UAV industry and the different types of detection sensors for obstacle avoidance, Wang, Lan, Zhang, Zhang, Tahir, Ou, Liu, and Chen [97] compared and summarized the features, advantages, and disadvantages of the sensors from the perspective of agricultural field operation. The different types of real-time obstacle detection sensors, and their advantages and disadvantages, are outlined in Table 2.

3.3. Obstacle Detection Technologies

A system can obtain various kinds of information about its surrounding obstacles from light or sound wave reflections via different sensors. From the previous section, we know that these non-contact sensors have different information-sensing capacities. This information includes the obstacle's size, distance, shape, color, and direction. After a farmland is selected for spraying, farmland obstacles essentially become no-fly or no-spray zones for the UAV. Thus, the primary concerns for the UAV are the shape and position of the obstacle. To achieve obstacle-detection capacity, the sprayer UAV can use a single detection method or a fusion of multiple methods. Due to the emerging development and future demand of image processing and 3D mapping, obstacle detection using image processing has attracted the attention of several researchers [161,162,163,164,165,166,167,168,169,170,171,172]. Moreover, because of their rapid response and low computational requirements, many detection-and-ranging-based or active-sensor-based studies have also addressed the detection of surrounding obstacles' positions, coordinates, and mapping [125,144,173,174,175,176,177,178,179,180]. Other studies on such detection systems use the fusion of active and passive sensing [181,182,183]. For autonomous navigation, the obstacle detection method is an important part. From analyzing previous studies, we identified some of the major obstacle detection methods, which are given in the next section.

3.3.1. Sonar Mapping

The sonar (sound navigation and ranging) mapping system uses the acoustic wave reflection time measured from different angles to build an image or diagram of the surrounding environment. Sonar represents the oldest obstacle detection technology; it was first used to measure the depth of the underwater floor in 1912 [184]. Later on, this technology was broadly used in modern warfare to detect obstacles [185], and eventually it found its way into robotics. Elfes [186] used sonar-based surround mapping for obstacle detection, where a sonar sensor measures the obstacle's range from different points of view to create a 2D map. Flynn [187] used multiple sonar sensors, viewing from different angles for greater accuracy, to create a 2D map on a ground robot. Kleeman and Kuc [188] used a sonar array for targeting, localizing, classifying, and 2D mapping. Akbarally and Kleeman [189] used a sonar sensor for accurate 3D target localization and classification. Later, simultaneous localization and mapping (SLAM) using sonar sensors was applied to 3D underwater imaging by Ribas et al. [190]. Steckel and Peremans [191] used a biomimetic SLAM approach called BatSLAM, a mapping module using sonar, to create a map for mobile robots. Steckel and Peremans [192] then used a 3D sonar sensor together with the BatSLAM module to improve localization and mapping. Two fully embedded real-time 3D imaging sonar architectures, also known as RTIS, were demonstrated by Kerstens et al. [193]. A series of sonar sensors, set up at different angles, was also used on a UAV by bin Misnan et al. [194] for low-altitude mapping. Gageik, Muller, and Montenegro [144], Gageik, Benz, and Montenegro [142], Gupta et al. [195], and Becker et al. [196] used a circular array of ultrasonic sensors on a small quadrotor UAV for obstacle mapping. Further studies on mapping obstacles using sonar sensors have been conducted by various other researchers [197,198,199,200,201].
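As a minimal illustration of the time-of-flight principle behind sonar mapping, the sketch below converts hypothetical echo times at several headings into obstacle points around the sensor; the speed of sound and the readings are illustrative.

```python
# Minimal sketch of sonar (time-of-flight) ranging: distance is half the echo
# round-trip time multiplied by the speed of sound. Echo times below are
# hypothetical readings at different sensor headings.
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def echo_to_range(echo_time_s):
    return SPEED_OF_SOUND * echo_time_s / 2.0   # divide by 2: out and back

def polar_to_cartesian(angle_deg, range_m):
    a = math.radians(angle_deg)
    return (range_m * math.cos(a), range_m * math.sin(a))

if __name__ == "__main__":
    # (sensor heading in degrees, measured echo time in seconds)
    readings = [(0, 0.012), (45, 0.020), (90, 0.006), (135, 0.030)]
    for angle, t in readings:
        x, y = polar_to_cartesian(angle, echo_to_range(t))
        print(f"obstacle point at x={x:.2f} m, y={y:.2f} m")
```

Accumulating such points over many headings and vehicle poses is, in essence, how the 2D sonar maps described above are built.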
Sonar mapping and obstacle detection sensors are mostly used in underwater vehicles because of the shortage of light in the deep sea. Because the human body is mostly made of water, sonar sensors are also used in medical science to image internal body structures in humans [202,203,204,205]. Besides that, sonar mapping has been used in ground robots and small quadcopters in some research as a proof of concept for algorithms [142,189,191,195,196].

3.3.2. Radar Mapping

Radar (radio detection and ranging) imaging uses the same ranging principle as sonar; both use time-of-flight, or echo ranging, to calculate distance. Radar uses radio wave signals from different parts of the radio spectrum [206]. Radar imaging requires heavy mathematical computation but has the powerful capability of very long range through space, and therefore has extensive use in aerospace technologies [207,208]. This is why many works have been published on radar in rocket science and military technology [209,210,211,212,213]. However, our concern is obstacle detection for smaller vehicles with lower computational resources. A simple, short-range obstacle localization system was proposed by Giubbolini [214], where multiple 13–24 GHz radars were set up around a vehicle with a central digital signal processor (DSP). Compressive sensing (CS) technology, addressed by Baraniuk and Steeghs [215], is appropriate for monostatic, bistatic, and multistatic scenarios. Viquerat, Blackhall, Reid, Sukkarieh, and Brooker [174] used four microwave Doppler radars on a fixed-wing UAV to illuminate four forward field quadrants and determine an obstacle's position. A rotating radar was used for localization and mapping of the surroundings by Vivet et al. [216]. Zhu et al. [217] proposed a 60 GHz imaging algorithm that can detect a nearby object and its location, orientation, curvature, and surface boundaries. Iyer et al. [218] experimented with 77–81 GHz radar for obstacle detection. Guo et al. [219] used range-angle mapping with a single radar pulse to create a map, while Feger et al. [220] used radar integrated with a MIMO array to determine an object's location. The frequency-modulated continuous wave (FMCW) radar sensor is another popular radar sensor for obstacle imaging [221,222,223,224].
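As a minimal illustration of the FMCW ranging principle mentioned above, the sketch below computes target range from the beat frequency of a linear chirp, R = c · f_beat · T_chirp / (2 · B); the radar parameters are illustrative, not taken from the cited systems.

```python
# Minimal sketch of FMCW ranging: for a linear chirp of bandwidth B and
# duration T_chirp, target range is proportional to the measured beat
# frequency. All parameter values are illustrative.

C = 3.0e8  # speed of light, m/s

def fmcw_range(beat_frequency_hz, chirp_duration_s, bandwidth_hz):
    return C * beat_frequency_hz * chirp_duration_s / (2.0 * bandwidth_hz)

if __name__ == "__main__":
    # Assumed 77 GHz-class radar with a 1 GHz sweep over 50 microseconds
    r = fmcw_range(beat_frequency_hz=2.0e6, chirp_duration_s=50e-6, bandwidth_hz=1.0e9)
    print(f"estimated target range: {r:.1f} m")   # -> 15.0 m for these values
```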
The rapid growth of autonomous car manufacturing has increased the usage of miniature radar [225,226]. Autonomous UAV usage is also growing, and several studies have used radar for obstacle detection [47,227,228,229]. The benefit of a radar detection system is that it can work under various hazardous conditions, such as rain, dust, and direct sunlight [218].

3.3.3. Laser Ranging

Laser imaging uses single or multiple laser sensors to generate an image or map of the surroundings. For laser-imaging techniques, the most widely used technology is light detection and ranging (lidar) [230]. Lidar works in the same way as radar, except that it uses a laser for ranging. A single-point laser performs one-point distance measurements [231]. Sweeping a single laser beam, or using multiple lasers, to form a line or a 2D scan of an area or obstacle creates an array of point clouds [232,233]. Using a four-layer laser scanner, Yu and Zhang [234] proposed an obstacle detection algorithm that can be used on autonomous land vehicles. Lidar technology uses laser point clouds and continuous scanning to create a 3D map [235,236]. Demantké et al. [237] presented a multi-scale method to compute geometric structures on lidar point clouds and retrieve the optimal neighborhood size for each point. Li et al. [238] designed a bounding box encoding that uses a fully convolutional network with lidar to detect vehicles. Kim et al. [239] used 2D lidar scanning on an agricultural helicopter to scan obstacles. Peng et al. [240] also used 2D lidar scanning with a ground robot to detect obstacles while effectively filtering noise from the raw laser data. Zheng et al. [241] used clustering based on relative distance and density (CBRDD) together with a point-cloud correction method to detect surrounding obstacles from airborne UAVs. Basically, the 2D lidar scanning method is used for avoidance operations [242,243,244]. The use of 3D lidar scanning is frequent in airborne geographical monitoring, such as agricultural plant monitoring, forest canopy monitoring, and urban area monitoring [230,245,246,247,248,249].
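The sketch below gives a minimal illustration of processing a 2D lidar scan: polar readings are converted to Cartesian points and grouped into obstacle clusters by relative distance, loosely in the spirit of the distance-based clustering mentioned above; the scan values and the distance threshold are illustrative, not taken from the cited methods.

```python
# Minimal sketch: convert a 2D lidar scan (angle, range) into Cartesian points
# and group consecutive nearby points into obstacle clusters.
import math

def scan_to_points(angles_deg, ranges_m):
    return [(r * math.cos(math.radians(a)), r * math.sin(math.radians(a)))
            for a, r in zip(angles_deg, ranges_m)]

def cluster_points(points, max_gap=0.5):
    """Group consecutive scan points whose mutual distance is below max_gap."""
    clusters, current = [], [points[0]]
    for prev, cur in zip(points, points[1:]):
        if math.dist(prev, cur) <= max_gap:
            current.append(cur)
        else:
            clusters.append(current)
            current = [cur]
    clusters.append(current)
    return clusters

if __name__ == "__main__":
    angles = [0, 1, 2, 3, 40, 41, 42]                 # degrees (illustrative scan)
    ranges = [4.0, 4.1, 4.0, 4.1, 7.5, 7.4, 7.6]      # meters
    pts = scan_to_points(angles, ranges)
    for i, c in enumerate(cluster_points(pts)):
        print(f"obstacle cluster {i}: {len(c)} points")
```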
Imaging with lidar is very precise, but it is relatively costly. Laser-ranging systems use highly intense light, which can be disrupted by fog, rain, or dust. Nevertheless, thanks to advances in precision engineering, laser imaging is very widely used.

3.3.4. Computer Vision

Computer vision is a popular way to detect obstacles, and many studies have been conducted in different domains [250,251,252,253]. Computer vision, or image processing, methods use different types of cameras for obstacle detection and mapping, such as monocular vision, stereo vision, binocular vision, and infrared cameras [254,255,256,257]. Computer vision can perform obstacle detection and recognition and can measure distance using different methods. When vision is used as the only exteroceptive sensor for localization and mapping, the approach is known as visual simultaneous localization and mapping (SLAM) [258]. The first use of computer vision for navigation was based on a binocular stereo configuration [259,260]. However, because binocular or multi-camera systems are expensive, monocular vision became more popular [261]. Vision-based obstacle detection is often used in complex environments, where obtaining the surrounding data is complicated for active sensors. Using computer vision for obstacle detection requires higher computational capacity, but nowadays the rapid development of microcomputers is filling this gap. Various methods are used for computer vision obstacle detection, and some essential techniques are discussed in the subsequent subsections.

Target Based

The target-based obstacle detection method in computer vision uses known obstacle features to find obstacles in a known environmental situation. Target-based obstacle detection using computer vision on various platforms has a long and extensive research history [262,263]. The use of target-based methods on UAVs is also rising. The classical morphological filtering method has been used to detect and avoid obstacles from UAVs by Carnie et al. [264], and the same method has been employed in finding landmine-like objects using UAVs by Rodriguez et al. [265]. Aoude, Luders, Levine, and How [165] used an Ecological Recognizer architecture, which uses a pattern matcher trained offline, to make path estimations. A color-based, robust tracking method, in which the target was tracked through particle filtering, was developed by Teuliere et al. [266]. Mori and Scherer [161] designed a scale expansion detector to detect obstacles with a monocular camera and used SURF (Speeded-Up Robust Features) to match obstacle details with given data. Targeting forward motion, Barry, Florence, and Tedrake [166] proposed an integrated method using push-broom stereo perception and control. Wang and Li [124] proposed a local object-based subtraction method to obtain the object's outline.
One advantage of target-based obstacle detection is that it usually requires only a single camera, and another is that it is faster than other methods because of its low computational cost. However, even though it can correctly detect the position of the obstacle in the image, it cannot determine the actual distance. Thus, it requires fusion with an additional distance sensor to solve this problem [267,268].
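A minimal sketch of target-based detection via feature matching is given below. It uses ORB features rather than SURF (SURF requires the opencv-contrib package), and the synthetic images stand in for a real obstacle template and camera frame; it illustrates the general idea only and is not a reimplementation of the cited methods.

```python
# Minimal sketch: decide whether a known obstacle (template) appears in a
# camera frame by counting good ORB feature matches. Thresholds are illustrative.
import cv2
import numpy as np

def known_obstacle_present(template, frame, min_matches=10, max_distance=50):
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(template, None)
    kp2, des2 = orb.detectAndCompute(frame, None)
    if des1 is None or des2 is None:
        return False                                   # no features found
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    good = [m for m in matches if m.distance < max_distance]   # crude filter
    return len(good) >= min_matches

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    template = rng.integers(0, 255, (120, 120), dtype=np.uint8)   # "known obstacle"
    frame = rng.integers(0, 255, (480, 640), dtype=np.uint8)      # "camera frame"
    frame[200:320, 300:420] = template                            # embed the obstacle
    print("known obstacle detected" if known_obstacle_present(template, frame)
          else "no match")
```

As noted above, such a match gives only the image position of the obstacle; an additional range sensor is needed for the actual distance.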

Optical Flow Based

The optical-flow-based method processes camera images frame by frame and monitors pixel-level movement in each frame to find the motion and temporal variation in the grayscale version of each image. Braillon et al. [269] used the optical flow method with two-frame pixel matching to detect obstacles in real time. Using optical flow information, Souhila and Karim [167] developed an algorithm that can locate any obstacle by detecting changes in the data. Naito et al. [270] developed an algorithm to find obstacle edges and their changes from optical flow images. Gharani and Karimi [271] proposed an algorithm that combined optical flow with point-tracking algorithms to detect stationary and moving obstacles and indicate a safe path. Agrawal et al. [272] used optical flow images from two sides to generate the turn rate of a UAV flying through an urban canyon. The kernelized correlation filter (KCF) framework proposed by Bharati et al. [273] uses an adaptive obstacle detection and tracking approach. Capito et al. [274] used a sequence of images with sparse optical flow to generate an artificial potential field, or visual potential field. Even though this method can be applied to any unknown environment, it cannot identify an obstacle's specific characteristics. In addition, it performs poorly when applied to stationary and relatively slow-moving obstacles.
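A minimal sketch of optical-flow-based obstacle cueing is given below: dense Farneback flow is computed between two consecutive grayscale frames, and pixels with large flow magnitude are flagged. The frames are synthetic (a moving patch stands in for an approaching obstacle) and the magnitude threshold is illustrative.

```python
# Minimal sketch: flag pixels with large frame-to-frame optical flow as
# potential obstacle motion. Threshold and synthetic frames are illustrative.
import cv2
import numpy as np

def flow_obstacle_mask(prev, curr, mag_threshold=2.0):
    # Dense Farneback optical flow between two consecutive grayscale frames
    flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)     # per-pixel flow magnitude
    return magnitude > mag_threshold             # True where apparent motion is large

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prev = rng.integers(0, 255, (240, 320), dtype=np.uint8)   # synthetic frame at t
    curr = prev.copy()                                         # frame at t+1
    curr[100:150, 104:154] = prev[100:150, 100:150]            # one patch shifts 4 px right
    mask = flow_obstacle_mask(prev, curr)
    print(f"{int(mask.sum())} pixels flagged as potential obstacle motion")
```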

Stereo-Vision Based

Stereo vision, also known as binocular stereo vision, uses multiple camera feeds from different angles to generate depth and detect obstacles [275]. Because it can provide more accurate data than other vision-based detection systems, several research studies have been conducted on stereo-vision-based object tracking. Using a simplified stereo-vision algorithm, Bertozzi et al. [276] performed vehicle identification and distance estimation. Nedevschi et al. [277] presented a stereo-vision-based obstacle detection method that reconstructs and works on 3D point matching of object edges. Moore, Thurrowgood, Bland, Soccol, and Srinivasan [168] used two rigidly mounted cameras in a coaxial stereo configuration to capture stereo images of the environment and process them for navigation. Gao, Ai, Wang, Rarity, and Dahnoun [169] used a 3D camera to produce depth information and converted it into the UV-disparity domain, which represents obstacles and ground surfaces as lines. Kramm and Bensrhair [170] proposed an algorithm using stereo vision and data clustering to localize obstacles. Iacono and Sgorbissa [171] used an RGB-D camera to generate the 3D surface of the surroundings, generating a radial function for every obstacle, and then creating a safe UAV path. Ma et al. [278] proposed a novel insulator detection approach based on RGB-D saliency detection and structural feature searching for aerial images captured by a UAV power transmission line inspection system.
Stereo vision works like the human eyes, which capture views of an object and estimate its distance. The binocular stereo-vision detection system needs a massive amount of data to process the scene, and real-time processing requires a powerful system [54,275,279,280].
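A minimal sketch of stereo depth estimation is given below: a block-matching disparity map is computed from a rectified left/right pair and converted to depth via depth = focal_length × baseline / disparity. The calibration values are assumed, and the image pair is synthetic (the right view is a shifted copy of the left), so the numbers are purely illustrative.

```python
# Minimal sketch: disparity from block matching, then depth = f * B / disparity.
# Focal length, baseline, and the synthetic image pair are illustrative.
import cv2
import numpy as np

FOCAL_PX = 700.0    # focal length in pixels (assumed calibration)
BASELINE_M = 0.12   # camera baseline in meters (assumed)

def disparity_to_depth(disparity):
    depth = np.full(disparity.shape, np.inf, dtype=np.float32)
    valid = disparity > 0
    depth[valid] = FOCAL_PX * BASELINE_M / disparity[valid]
    return depth

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    left = rng.integers(0, 255, (240, 320), dtype=np.uint8)   # synthetic left image
    right = np.roll(left, -8, axis=1)                         # right view shifted ~8 px
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point to px
    depth = disparity_to_depth(disparity)
    finite = depth[np.isfinite(depth)]
    if finite.size:
        print(f"median estimated depth: {np.median(finite):.2f} m")
```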

3.3.5. Fusion

From the review of the different obstacle detection methods, it can be observed that each detection method has limitations. Optical sensing cannot generate accurate ranging data, while active-sensor-based methods can measure distances more accurately. In view of this, researchers use fusion methods depending on the working objective, and several fusion combinations can be found in the literature [126,176,267,281,282,283,284,285,286,287].

3.4. Obstacle Avoidance Techniques

As presented in the previous sections, the flying complexity of a sprayer UAV is quite different from that of a general UAV. In addition, the obstacles on farmland are static and separated, as described in Section 2.3. This is why the obstacle avoidance techniques need to differ from those used elsewhere. Although there are some research papers on obstacle avoidance and path planning using satellite images (Section 2.1), to the best of our knowledge, no research work specifically on local obstacle avoidance for sprayer UAVs has been conducted.
Relying only on satellite images for obstacle avoidance and path planning on farmland may be harmful. Because satellite data are updated on a fixed schedule, an obstacle may appear after the latest image was captured, which may cause accidents. Conversely, an obstacle present in the latest image may have since disappeared, which may cause unnecessary path deviations. Sometimes even a small or narrow obstacle cannot be seen in the satellite image at all. Thus, avoiding obstacles locally is very important for sprayer UAVs. Several works have addressed local obstacle avoidance for mobile robots as well as UAVs. The following subsections present some of these obstacle avoidance approaches from different studies.

3.4.1. Bug Algorithm

The simplest obstacle avoidance method of all is the bug method. Lumelsky and Stepanov [288] proposed this method, which mimics the movement of a bug. They made two versions of the bug algorithm, Bug1 and Bug2, which are shown in Figure 9. Here, the robot starts the operation from "s" and moves toward target "t". In Bug1, the robot moves all the way around the obstacle to inspect it and then calculates the boundary point closest to the target. In Bug2, the robot creates a line from start to target; if it finds an obstacle, it moves along the obstacle boundary, and when it re-encounters the line, it keeps moving along it. The Bug1 algorithm travels a long path to reach the goal point, while Bug2 uses a shorter route. To reach the goal by an even shorter path, several improvements have been made to the bug algorithm, such as TangentBug [289], I-Bug [290], the improved bug [291], the splitting bug [292], etc. [293]. These bug algorithms are not very reliable in more complex environments, and in some tricky conditions one version works better than another. In general, however, bug algorithms work well for single obstacle avoidance [294].
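A highly simplified sketch in the spirit of Bug-style boundary following around a single circular obstacle is given below: the vehicle pursues the goal directly and slides along the obstacle boundary whenever the next step would enter it. It omits the explicit m-line bookkeeping of a full Bug2 implementation, and all geometry and step sizes are illustrative.

```python
# Minimal Bug-style sketch for one circular obstacle: head straight for the
# goal; if the next step would enter the obstacle, slide along its boundary.
import math

def bug_path(start, goal, obstacle_center, obstacle_radius, step=0.5):
    path, pos = [start], start
    while math.dist(pos, goal) > step:
        dx, dy = goal[0] - pos[0], goal[1] - pos[1]
        norm = math.hypot(dx, dy)
        nxt = (pos[0] + step * dx / norm, pos[1] + step * dy / norm)
        if math.dist(nxt, obstacle_center) <= obstacle_radius:
            # boundary following: move along the circle instead of entering it
            angle = math.atan2(pos[1] - obstacle_center[1], pos[0] - obstacle_center[0])
            angle += step / obstacle_radius          # advance along the boundary
            nxt = (obstacle_center[0] + obstacle_radius * math.cos(angle),
                   obstacle_center[1] + obstacle_radius * math.sin(angle))
        pos = nxt
        path.append(pos)
    path.append(goal)
    return path

if __name__ == "__main__":
    p = bug_path(start=(0.0, 0.0), goal=(20.0, 0.0),
                 obstacle_center=(10.0, 0.0), obstacle_radius=3.0)
    print(f"path has {len(p)} waypoints, ends at {p[-1]}")
```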

3.4.2. Artificial Potential Field Algorithm

The artificial potential field (APF) algorithm was proposed by Khatib [295] as a unique real-time obstacle avoidance approach for mobile robots. This algorithm assigns an artificial potential to every point of the known area, and the vehicle moves toward regions of lower potential, with the target point having the lowest potential. The vehicle is always attracted toward the lowest-potential region and thus eventually reaches the target point. In Figure 10, a robot avoids an obstacle using the potential field. Cetin et al. [296] used the APF algorithm to avoid obstacles, including those formed by other vehicles; in addition, the authors used APF with multiple connected vehicles to generate a suitable path.
The APF algorithm has some limitations, such as the local minimum problem and the dead point problem. Chen et al. [297] reformulated the APF as a constrained optimization to solve the traditional dead point problem. Since the traditional algorithm only works for a single-vehicle trajectory, Sun et al. [298] proposed an optimized APF algorithm for multi-UAV operation in a 3D environment. Fan et al. [299] improved the APF algorithm by solving some of its inherent problems, such as local minima and the inaccessibility of the target.
Figure 10. Artificial potential field [298].
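A minimal sketch of a single APF update step is given below: an attractive force pulls the vehicle toward the goal, and a repulsive force, active only inside an influence radius, pushes it away from each obstacle; the gains, radii, and positions are illustrative, not taken from the cited works.

```python
# Minimal sketch of one artificial potential field step: attractive pull
# toward the goal plus repulsive push from nearby obstacles. Gains, the
# influence radius, and all positions are illustrative.
import math

K_ATT, K_REP, INFLUENCE = 1.0, 100.0, 5.0

def apf_step(pos, goal, obstacles, step=0.2):
    # attractive component (proportional to the vector toward the goal)
    fx = K_ATT * (goal[0] - pos[0])
    fy = K_ATT * (goal[1] - pos[1])
    # repulsive components from every obstacle inside the influence radius
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < INFLUENCE:
            rep = K_REP * (1.0 / d - 1.0 / INFLUENCE) / (d ** 2)
            fx += rep * dx / d
            fy += rep * dy / d
    norm = math.hypot(fx, fy) or 1.0
    return (pos[0] + step * fx / norm, pos[1] + step * fy / norm)

if __name__ == "__main__":
    pos, goal, obstacles = (0.0, 0.0), (10.0, 10.0), [(5.0, 5.2)]
    for _ in range(80):
        pos = apf_step(pos, goal, obstacles)
    print(f"position after 80 steps: ({pos[0]:.2f}, {pos[1]:.2f})")
```

The local minimum and dead point problems mentioned above arise when the attractive and repulsive terms cancel before the goal is reached.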

3.4.3. Collision Cone Method

The collision cone concept was first proposed by Chakravarthy and Ghose [45] for a 2D movement scenario. The authors modeled the obstacle as a circular area, and the possibility of collision between the UAV's position and the obstacle area is evaluated within a collision cone constructed using the UAV's velocity vector. This concept works for any irregularly shaped unknown obstacle and prevents collision between two irregularly shaped objects or vehicles. Following the same strategy, Ajith Kumar and Ghose [47] calculated a radar detection cone to find a possible collision-free path. To solve the collision problem between two aircraft in 3D space, Goss et al. [300] used the collision cone method with mixed geometry. Watanabe et al. [301] used a 2D passive vision sensor and the collision cone approach in order to examine obstacles from a critical distance. In Figure 11, the vehicle's position is X_v, the obstacle position X_obs lies inside the safe boundary, and X_ap is the vehicle's aiming point.
Later on, Chakravarthy and Ghose [302] extended the collision cone approach to detect moving obstacles in 3D space. Sunkara et al. [303] used this method to avoid shape-shifting targets such as snake robots, vehicle swarms, and oil spills. They first developed the collision cone between a point object and a deformable object and subsequently extended it to the case of an engagement between a circular object and a deforming object. A real-time collision avoidance algorithm addressing multiple obstacles, called Tangent Plan Coordinate, was proposed by Park and Baek [51]. They used a stereo-vision sensor with a limited field of view to estimate the unknown obstacle. The collision cone was calculated from straight lines, obtained through an affine transformation, that are tangent to the ellipsoid and pass through the position of the quadrotor.
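A minimal sketch of the collision cone test for a circular safety boundary is given below: a collision is predicted when the relative velocity vector lies inside the cone of half-angle arcsin(R_safe / d) around the line of sight to the obstacle. All positions, velocities, and the safety radius are illustrative.

```python
# Minimal sketch: predict collision when the relative velocity points inside
# the collision cone toward a circular safety boundary of radius safe_radius.
import math

def inside_collision_cone(uav_pos, uav_vel, obs_pos, obs_vel, safe_radius):
    rx, ry = obs_pos[0] - uav_pos[0], obs_pos[1] - uav_pos[1]   # line of sight
    vx, vy = uav_vel[0] - obs_vel[0], uav_vel[1] - obs_vel[1]   # relative velocity
    d = math.hypot(rx, ry)
    speed = math.hypot(vx, vy)
    if d <= safe_radius:
        return True                       # already inside the safety boundary
    if speed == 0:
        return False                      # no relative motion, no collision
    half_angle = math.asin(safe_radius / d)
    # angle between the relative velocity and the line of sight
    cos_a = (rx * vx + ry * vy) / (d * speed)
    angle = math.acos(max(-1.0, min(1.0, cos_a)))
    return angle < half_angle

if __name__ == "__main__":
    print(inside_collision_cone(uav_pos=(0, 0), uav_vel=(5, 0.2),
                                obs_pos=(30, 1), obs_vel=(0, 0),
                                safe_radius=4.0))   # True: heading into the cone
```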

3.4.4. Fuzzy Logic Algorithm

Fuzzy logic was proposed by Zadeh [304] and is implemented through a fuzzy controller. To use fuzzy logic in any system, the operator needs to assign a set of data or knowledge to create the fuzzy sets that will be used to avoid obstacles or navigate the mobile robot. This process of assigning fuzzy input sets is called fuzzification. The set values usually lie anywhere between two traditional logic values, such as (0, 1), (Low, High), or (Cold, Hot). This is why vehicles that use fuzzy logic for navigation and avoidance usually rely on multiple sensors of one kind or on sensor fusion. Lian [305] used a fuzzy controller to control an obstacle-avoiding mobile robot. The usage procedure is given in Figure 12.
This classic method is used in many vehicle navigation systems [306]. Reignier [307] used fuzzy logic techniques to build a reactive navigation system and avoid obstacles. Several research works have created fuzzy logic controllers using fuzzy sets to avoid obstacles in real time. Dong et al. [308] used a fuzzy-based approach to track paths and avoid obstacles. Jin [309] proposed a navigation algorithm using a fuzzy controller and sensor fusion (camera and sonar) on a mobile robot to avoid obstacles and generate trajectories. Using a fuzzy logic system and a three-way ultrasonic sensor, Li and Choi [310] proposed an avoidance algorithm for a mobile robot. Pandey et al. [311] designed a fuzzy logic controller to improve the vehicle's movement according to the obstacle's position.
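A minimal sketch of a two-rule fuzzy avoidance controller is given below: obstacle distance is fuzzified into "near" and "far" memberships, the rules map these to turn rates, and a weighted average defuzzifies the output. The membership breakpoints and turn rates are illustrative, not taken from the cited controllers.

```python
# Minimal sketch of fuzzification, a two-rule base, and weighted-average
# defuzzification. Breakpoints and turn rates are illustrative values.

def mu_near(distance, full=2.0, zero=8.0):
    """Membership in 'near': 1 below `full`, ramping down to 0 at `zero`."""
    if distance <= full:
        return 1.0
    if distance >= zero:
        return 0.0
    return (zero - distance) / (zero - full)

def mu_far(distance, full=2.0, zero=8.0):
    return 1.0 - mu_near(distance, full, zero)

def fuzzy_turn_rate(distance):
    # Rule 1: IF obstacle is near THEN turn sharply (40 deg/s)
    # Rule 2: IF obstacle is far  THEN keep heading (0 deg/s)
    w_near, w_far = mu_near(distance), mu_far(distance)
    return (w_near * 40.0 + w_far * 0.0) / (w_near + w_far)

if __name__ == "__main__":
    for d in [1.0, 4.0, 7.0, 12.0]:
        print(f"distance {d:4.1f} m -> turn rate {fuzzy_turn_rate(d):5.1f} deg/s")
```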

3.4.5. Vector Field Histogram Method

The vector field histogram (VFH) is a real-time obstacle avoidance method for mobile robots developed by Borenstein and Koren [41]. This method performs obstacle avoidance in three steps. In the first step, the robot generates a two-dimensional sensory histogram around its body, or within a limited angle, and keeps updating the histogram data at every stage. In the second step, the two-dimensional histogram data are reduced to a one-dimensional polar histogram. Finally, the robot selects the sector with the lowest polar density and steers in the calculated direction. The 2D and 1D histograms are presented in Figure 13.
The VFH algorithm was later improved into VFH+ [312] and VFH* [313]. VFH+ reduces the parameter tuning of the original VFH, and the VFH* method verifies that a particular candidate direction guides the robot around an obstacle. Lidar is a suitable sensor for applying VFH methods, since it can take high-resolution multiple ranging data in two dimensions. For example, Sary et al. [314] used VFH+ with lidar to avoid obstacles with a hexacopter, and Bolbhat et al. [315] used the original VFH with lidar for obstacle avoidance of automated guided vehicles.
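A minimal sketch of the VFH idea is given below: range readings are binned into angular sectors to form a polar obstacle-density histogram, and the steering direction is chosen as the free sector closest to the target heading. The sector width, readings, and density threshold are illustrative.

```python
# Minimal sketch: build a polar obstacle-density histogram from range readings
# and steer toward the free sector nearest the target heading.

def polar_histogram(readings, sector_deg=30):
    """readings: list of (bearing_deg, range_m); closer obstacles add more density."""
    sectors = [0.0] * (360 // sector_deg)
    for bearing, rng in readings:
        density = max(0.0, 10.0 - rng)              # nearer -> higher density
        sectors[int(bearing) % 360 // sector_deg] += density
    return sectors

def choose_direction(sectors, target_deg, sector_deg=30, threshold=3.0):
    free = [i for i, d in enumerate(sectors) if d < threshold]   # low-density sectors
    centers = [i * sector_deg + sector_deg / 2 for i in free]
    return min(centers, key=lambda c: abs(c - target_deg))

if __name__ == "__main__":
    scan = [(0, 2.0), (5, 2.5), (90, 9.0), (180, 8.5), (270, 9.5)]
    hist = polar_histogram(scan)
    print("steer towards", choose_direction(hist, target_deg=10), "degrees")
```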

3.4.6. Neural Network

Neural network algorithms are inspired by the human brain. A neural network takes in data, trains itself to recognize patterns in the data, and then predicts the outputs for a new set of similar data. The computational model, analogous to a biological neural network, repeats training until the best result is obtained. A dynamic neural network is capable of automatically adjusting its structure according to the complexity of the vehicle's environment, learning the mapping between the vehicle's state and its obstacle avoidance decision in real time, and efficiently decreasing the vehicle's computational load [316].
Glasius et al. [317] designed a Hopfield-type neural network with nonlinear analog neurons for path planning and obstacle avoidance. An obstacle avoidance approach using a reinforcement learning neural network was proposed by Huang et al. [318]. Yadav et al. [319] designed a controller, based on a vision-based Grossberg Neural Network, to find the obstacle-free shortest trajectory in 3D space for a UAV. Later, using a modified Grossberg neural network, Wang, Yadav, and Balakrishnan [50] proposed an algorithm to avoid dynamic obstacles in 3D space. Chi and Lee [320] proposed a neural network control system to guide mobile robots around arbitrary obstacles in a maze. Kim and Chwa [321] used a fuzzy neural network, in which fuzzy sets form members of the neural network layers, to avoid obstacles with a wheeled mobile robot. Back et al. [322] proposed a vision-based trail-following UAV that avoids obstacles along the route using a Convolutional Neural Network (CNN). Dai et al. [323] also used a CNN to learn obstacle avoidance schemes in an unknown environment for a quadrotor UAV. Using neural network algorithms for obstacle avoidance requires a large amount of training data, but they are suitable for real-time obstacle avoidance. Example training data for a complex environment are shown in Figure 14, and the resulting performance, which is better than that of several other obstacle avoidance methods, is shown in Figure 15.
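As a minimal illustration of a learning-based avoidance policy, the sketch below defines a tiny convolutional network in PyTorch that maps a grayscale camera frame to three steering classes (left / straight / right). The architecture, input size, and random input are purely illustrative and untrained; as noted above, a real system would need a large labeled dataset.

```python
# Minimal sketch: an untrained CNN mapping one grayscale frame to a steering
# class. Architecture and input size are illustrative, not from the cited works.
import torch
import torch.nn as nn

class AvoidanceCNN(nn.Module):
    def __init__(self, num_actions=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(8, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),          # fixed-size feature map
        )
        self.classifier = nn.Linear(16 * 4 * 4, num_actions)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

if __name__ == "__main__":
    model = AvoidanceCNN()
    frame = torch.rand(1, 1, 96, 96)               # one fake 96x96 grayscale frame
    action = model(frame).argmax(dim=1).item()
    print(["turn left", "go straight", "turn right"][action])
```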

4. Challenges, Technical Limitations, and Analytical Comparison for Sprayer UAVs

4.1. Obstacle Detection and Collision Avoidance Challenges

If precise and effective spraying is to be achieved, autonomous sprayer UAVs must be capable of making coherent decisions about which scenarios involve hazards, in order to avoid obstacles and prevent actions that are not only wrong but harmful to other farming tools and farm workers. This section highlights the major open challenges of obstacle detection and collision avoidance algorithms.
  • A major obstacle detection challenge arises from severe weather conditions and environments with changing illumination. Wind and fog can obscure the detection sensors or cameras, rendering the processed data inadequate for obstacle avoidance. Even though this challenge can be mitigated with radar, which has no problem detecting in fog, rain, or heavy snow, a major drawback of radar is its limited lateral vision; the radar only covers a relatively small angular section of about 15 degrees. To enhance lateral vision, multiple sensors have to be employed, which complicates the system. Thus, in order to achieve an accurate reconstruction of the environment under such unfavorable conditions, robust algorithms have to be developed to effectively detect and classify the obstacles.
  • Obstacle avoidance techniques, such as Neural Network and Fuzzy Logic, require the extraction of hierarchical abstractions from preprocessed data used throughout the training or learning stages, and the ability of generalization relies on the availability of a large dataset. As a drawback, the large computational cost is a major challenge and has to be highlighted.
  • Realistically, most of the obstacle detection algorithms have to be trained offline through simulation analysis before they are integrated into the sprayer UAV system. The huge gap between real and virtually simulated environments limits the applicability of offline simulation policies to the real world. The development of a realistic virtual dataset is still an open challenge.
  • Accuracy of temporal and spatial alignments among different sensors used in the sprayer UAV system also impacts the quality of the collected data.

4.2. Other Challenges

In addition to the abovementioned obstacle detection and collision avoidance challenges, the following open challenges exist.
  • For agricultural sprayer UAVs, spray deposition and coverage are a primary concern, and these parameters are directly related to the drone weight and payload. Moreover, there is always a trade-off between payload and cost, and reliability should be maintained when selecting the type of agricultural sprayer UAV. In most instances, the selection is between single-rotor and multi-rotor. However, quadrotors, which have several spraying limitations, are the preferred choice for agricultural sprayer UAVs.
  • Taking into consideration the current stage of sprayer UAV technology, the high cost of the intelligent sensors and that of the UAV system is a major issue. Improvements in this area will enable farmers to reap more from the use of sprayer UAVs for remote sensing in precision spraying.
  • Even though the use of UAVs for agricultural spraying is increasing, several limitations that prevent wider usage exist. Among these limitations is the absence of a standardized workflow, which leads to the use of ad-hoc procedures for deploying agricultural sprayer UAVs, a fact that discourages stakeholders.
  • As agricultural sprayer UAVs require data-intensive algorithms for processing the images acquired, expertise in the field of autonomous flight is usually needed. This means that the average farmer will require training or may be compelled to hire experts to assist with image processing, which may be costly. This may hinder the adoption of agricultural sprayer UAVs by farmers with less technical expertise.
  • Most agricultural sprayer UAVs have a short flight time, usually from 10 min to barely half an hour. The sprayer UAVs that can offer a longer flight time are relatively expensive. Moreover, the effective usage of a sprayer UAV is mostly prone to climatic conditions. For instance, during windy or rainy days, the flight operation has to be postponed.

4.3. Avoidance Technique Comparison

The proper obstacle avoidance techniques for sprayer UAVs on farmland during a planned mission, based on some special parameters, were discussed in the previous sections. Based on those studies, we summarize and compare the obstacle avoidance techniques for sprayer UAVs in Table 3.

5. Conclusions

This article reviewed the current advances in obstacle detection and collision avoidance scenarios for sprayer UAVs. In doing so, the most relevant obstacle detection and collision avoidance techniques were reviewed and discussed, including their application to agricultural sprayer UAVs. In addition, the recent obstacle detection algorithms used in agricultural UAVs and the structural challenges of sprayer UAVs were described. After analyzing the main issues of autonomous sprayer UAVs on farmland, a thorough survey of recent articles on obstacle detection and collision avoidance techniques was presented. The various constraints of agricultural sprayer UAVs were also detailed, together with their operational pattern. The detection sensors and control architectures for collision avoidance were thoroughly highlighted to pave the way for future researchers to design their own agricultural sprayer UAV systems. Specific to the physical structure of the sprayer UAV, the liquid load and sloshing, as well as the most widely used detection sensors, were described. For autonomous navigation, the obstacle detection method is an important component. A major obstacle detection challenge arises from severe weather conditions and environments with changing illumination conditions. As agricultural sprayer UAVs require data-intensive algorithms for processing the images acquired, expertise in the field of autonomous flight is usually needed. The present study provides a comprehensive review of obstacle detection methods under UAV spraying conditions and concludes that UAV sprayers still face obstacle detection challenges due to their dynamic operating and loading conditions. Moreover, and most importantly, the relevant obstacle detection and collision avoidance algorithms were also presented, and a comparative analysis of the obstacles on farmland, including obstacle detection technologies, was tabulated. The Bug2 algorithm is suitable for rotorcraft UAVs; it can ensure spray coverage because it follows the exact border of the obstacle, and modifying the algorithm by fixing the heading direction may reduce the operation time. The other algorithms also have their merits under various working conditions. Furthermore, the most relevant open challenges concerning agricultural sprayer UAVs were highlighted. Among these challenges are spray deposition and coverage, which are directly related to the drone weight and payload. Another open challenge, the high cost of the intelligent sensors and of the UAV system itself, was also highlighted. The short UAV flight times and the high cost of operation, which are major concerns for farmers, were also discussed. Finally, the gap between real and virtually simulated environments, which limits the applicability of offline simulation policies to the real world, was stated. In all, this review work defines a clear roadmap for future research.

Author Contributions

Conceptualization, S.A. and B.Q.; methodology, S.A., B.Q. and F.A.; validation, S.A. and B.Q.; formal analysis, S.A. and F.A.; investigation, S.A., C.-W.K. and H.X.; resources, S.A. and B.Q.; data curation, S.A., C.-W.K. and H.X.; writing—original draft preparation, S.A., F.A., C.-W.K. and B.Q.; writing—review and editing, S.A. and B.Q.; supervision, B.Q.; project administration, B.Q.; funding acquisition, B.Q. All authors have read and agreed to the published version of the manuscript.

Funding

The authors are grateful for the support of the National Key Research and Development Plan (No. 2017YFD0701005).

Data Availability Statement

Data are available in the manuscript.

Acknowledgments

The authors acknowledge the support of the School of Agricultural Engineering, Jiangsu University, China.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A Review on UAV-Based Applications for Precision Agriculture. Information 2019, 10, 349. [Google Scholar] [CrossRef] [Green Version]
  2. Mogili, U.M.R.; Deepak, B.B.V.L. Review on Application of Drone Systems in Precision Agriculture. Procedia Comput. Sci. 2018, 133, 502–509. [Google Scholar] [CrossRef]
  3. Marino, S.; Alvino, A. Detection of Spatial and Temporal Variability of Wheat Cultivars by High-Resolution Vegetation Indices. Agronomy 2019, 9, 226. [Google Scholar] [CrossRef] [Green Version]
  4. Surový, P.; Almeida Ribeiro, N.; Panagiotidis, D. Estimation of positions and heights from UAV-sensed imagery in tree plantations in agrosilvopastoral systems. Int. J. Remote Sens. 2018, 39, 4786–4800. [Google Scholar] [CrossRef]
  5. Cilia, C.; Panigada, C.; Rossini, M.; Meroni, M.; Busetto, L.; Amaducci, S.; Boschetti, M.; Picchi, V.; Colombo, R. Nitrogen Status Assessment for Variable Rate Fertilization in Maize through Hyperspectral Imagery. Remote Sens. 2014, 6, 6549–6565. [Google Scholar] [CrossRef] [Green Version]
  6. Zaman-Allah, M.; Vergara, O.; Araus, J.L.; Tarekegne, A.; Magorokosho, C.; Zarco-Tejada, P.J.; Hornero, A.; Alba, A.H.; Das, B.; Craufurd, P.; et al. Unmanned aerial platform-based multi-spectral imaging for field phenotyping of maize. Plant Methods 2015, 11, 35. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  7. Chang, A.; Jung, J.; Maeda, M.M.; Landivar, J. Crop height monitoring with digital imagery from Unmanned Aerial System (UAS). Comput. Electron. Agric. 2017, 141, 232–237. [Google Scholar] [CrossRef]
  8. Honkavaara, E.; Kaivosoja, J.; Mäkynen, J.; Pellikka, I.; Pesonen, L.; Saari, H.; Salo, H.; Hakala, T.; Marklelin, L.; Rosnell, T. Hyperspectral reflectance signatures and point clouds for precision agriculture by light weight UAV imaging system. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 7, 353–358. [Google Scholar] [CrossRef] [Green Version]
  9. Pflanz, M.; Nordmeyer, H.; Schirrmann, M. Weed Mapping with UAS Imagery and a Bag of Visual Words Based Image Classifier. Remote Sens. 2018, 10, 1530. [Google Scholar] [CrossRef] [Green Version]
  10. Rasmussen, J.; Nielsen, J.; Garcia-Ruiz, F.; Christensen, S.; Streibig, J.C.; Lotz, B. Potential uses of small unmanned aircraft systems (UAS) in weed research. Weed Res. 2013, 53, 242–248. [Google Scholar] [CrossRef]
  11. Rahnemoonfar, M.; Sheppard, C. Deep Count: Fruit Counting Based on Deep Simulated Learning. Sensors 2017, 17, 905. [Google Scholar] [CrossRef] [Green Version]
  12. Sarwar, M. The killer chemicals as controller of agriculture insect pests: The conventional insecticides. Int. J. Chem. Biomol. Sci. 2015, 1, 141–147. [Google Scholar]
  13. Bhattacharyya, A.; Duraisamy, P.; Govindarajan, M.; Buhroo, A.A.; Prasad, R. Nano-biofungicides: Emerging trend in insect pest control. In Advances and Applications through Fungal Nanobiotechnology; Springer: Berlin/Heidelberg, Germany, 2016; pp. 307–319. [Google Scholar] [CrossRef]
  14. Lozano-Pérez, T. Spatial planning: A configuration space approach. IEEE Trans. Comput. 1983, 32. [Google Scholar] [CrossRef]
  15. Schrum, P.B.; Verosky, M.A.; Krygowski, D.J. Portable Pressurized Sprayer. U.S. Patent 8,985,482, 24 March 2015. [Google Scholar]
  16. Baker, W.L. Portable Battery Powered Sprayer. U.S. Patent 4,801,088, 31 January 1989. [Google Scholar]
  17. Abubakar, Y.; Tijjani, H.; Egbuna, C.; Adetunji, C.O.; Kala, S.; Kryeziu, T.L.; Ifemeje, J.C.; Patrick-Iwuanyanwu, K.C. Pesticides, History, and Classification. In Natural Remedies for Pest, Disease and Weed Control; Elsevier: Amsterdam, The Netherlands, 2020; pp. 29–42. [Google Scholar] [CrossRef]
  18. Bencko, V.; Yan Li Foong, F. The history of arsenical pesticides and health risks related to the use of Agent Blue. Ann. Agric. Environ. Med. 2017, 24, 312–316. [Google Scholar] [CrossRef]
  19. Alavanja, M.C.; Samanic, C.; Dosemeci, M.; Lubin, J.; Tarone, R.; Lynch, C.F.; Knott, C.; Thomas, K.; Hoppin, J.A.; Barker, J.; et al. Use of agricultural pesticides and prostate cancer risk in the Agricultural Health Study cohort. Am. J. Epidemiol. 2003, 157, 800–814. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  20. Maddikunta, P.K.R.; Hakak, S.; Alazab, M.; Bhattacharya, S.; Gadekallu, T.R.; Khan, W.Z.; Pham, Q.V. Unmanned aerial vehicles in smart agriculture: Applications, requirements, and challenges. IEEE Sens. J. 2021. [Google Scholar] [CrossRef]
  21. Pederi, Y.; Cheporniuk, H. Unmanned Aerial Vehicles and New Technological Methods of Monitoring and Crop Protection in Precision Agriculture. In Proceedings of the 2015 IEEE International Conference Actual Problems of Unmanned Aerial Vehicles Developments (APUAVD), Kyiv, Ukraine, 13–15 October 2015; IEEE: Kyiv, Ukraine, 2015; pp. 298–301. [Google Scholar] [CrossRef]
  22. Lan, Y.; Shengde, C.; Fritz, B.K. Current status and future trends of precision agricultural aviation technologies. Int. J. Agric. Biol. Eng. 2017, 10, 1–17. [Google Scholar] [CrossRef]
  23. Kulbacki, M.; Segen, J.; Knieć, W.; Klempous, R.; Kluwak, K.; Nikodem, J.; Kulbacka, J.; Serester, A. Survey of Drones for Agriculture Automation from Planting to Harvest. In Proceedings of the 2018 IEEE 22nd International Conference on Intelligent Engineering Systems (INES), Las Palmas de Gran Canaria, Spain, 21–23 June 2018; IEEE: Las Palmas de Gran Canaria, Spain, 2018; pp. 000353–000358. [Google Scholar] [CrossRef]
  24. Durham, K.; Giles, R.C.B. Deployment and Performance of a UAV for Crop Spraying. Chem. Eng. Trans. 2015, 44, 307–312. [Google Scholar] [CrossRef]
  25. Faiçal, B.S.; Freitas, H.; Gomes, P.H.; Mano, L.Y.; Pessin, G.; de Carvalho, A.C.P.L.F.; Krishnamachari, B.; Ueyama, J. An adaptive approach for UAV-based pesticide spraying in dynamic environments. Comput. Electron. Agric. 2017, 138, 210–223. [Google Scholar] [CrossRef]
  26. Kim, J.; Kim, S.; Ju, C.; Son, H.I. Unmanned Aerial Vehicles in Agriculture: A Review of Perspective of Platform, Control, and Applications. IEEE Access 2019, 7, 105100–105115. [Google Scholar] [CrossRef]
  27. Shilin, W.; Jianli, S.; Xiongkui, H.; Le, S.; Xiaonan, W.; Changling, W.; Zhichong, W.; Yun, L. Performances evaluation of four typical unmanned aerial vehicles used for pesticide application in China. Int. J. Agric. Biol. Eng. 2017, 10, 22–31. [Google Scholar] [CrossRef] [Green Version]
  28. Yanliang, Z.; Qi, L.; Wei, Z. Design and test of a six-rotor unmanned aerial vehicle (UAV) electrostatic spraying system for crop protection. Int. J. Agric. Biol. Eng. 2017, 10, 68–76. [Google Scholar] [CrossRef]
  29. Lou, Z.; Xin, F.; Han, X.; Lan, Y.; Duan, T.; Fu, W. Effect of Unmanned Aerial Vehicle Flight Height on Droplet Distribution, Drift and Control of Cotton Aphids and Spider Mites. Agronomy 2018, 8, 187. [Google Scholar] [CrossRef] [Green Version]
  30. Qin, W.; Xue, X.; Zhang, S.; Gu, W.; Wang, B. Droplet deposition and efficiency of fungicides sprayed with small UAV against wheat powdery mildew. Int. J. Agric. Biol. Eng. 2018, 11, 27–32. [Google Scholar] [CrossRef] [Green Version]
  31. Wen, S.; Zhang, Q.; Deng, J.; Lan, Y.; Yin, X.; Shan, J. Design and Experiment of a Variable Spray System for Unmanned Aerial Vehicles Based on PID and PWM Control. Appl. Sci. 2018, 8, 2482. [Google Scholar] [CrossRef] [Green Version]
  32. Yallappa, D.; Veerangouda, M.; Maski, D.; Palled, V.; Bheemanna, M. Development and Evaluation of Drone Mounted Sprayer for Pesticide Applications to Crops. In Proceedings of the 2017 IEEE Global Humanitarian Technology Conference (GHTC), San Jose, CA, USA, 19–23 October 2017; IEEE: San Jose, CA, USA, 2017; pp. 1–7. [Google Scholar] [CrossRef]
  33. Hentschke, M.; Pignaton de Freitas, E.; Hennig, C.; Girardi da Veiga, I. Evaluation of Altitude Sensors for a Crop Spraying Drone. Drones 2018, 2, 25. [Google Scholar] [CrossRef] [Green Version]
  34. Chen, P.; Lan, Y.; Huang, X.; Qi, H.; Wang, G.; Wang, J.; Wang, L.; Xiao, H. Droplet deposition and control of planthoppers of different nozzles in two-stage rice with a quadrotor unmanned aerial vehicle. Agronomy 2020, 10, 303. [Google Scholar] [CrossRef] [Green Version]
  35. Basso, M.; Stocchero, D.; Ventura Bayan Henriques, R.; Vian, A.L.; Bredemeier, C.; Konzen, A.A.; Pignaton de Freitas, E. Proposal for an Embedded System Architecture Using a GNDVI Algorithm to Support UAV-Based Agrochemical Spraying. Sensors 2019, 19, 5397. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  36. Huang, S.; Teo, R.S.H.; Tan, K.K. Collision avoidance of multi unmanned aerial vehicles: A review. Annu. Rev. Control 2019, 48, 147–164. [Google Scholar] [CrossRef]
  37. Sedighi, K.H.; Ashenayi, K.; Manikas, T.W.; Wainwright, R.L.; Tai, H.-M. Autonomous Local Path Planning for a Mobile Robot Using a Genetic Algorithm. In Proceedings of the 2004 Congress on Evolutionary Computation (IEEE Cat. No. 04TH8753), Portland, OR, USA, 19–23 June 2004; IEEE: Portland, OR, USA, 2004; pp. 1338–1345. [Google Scholar] [CrossRef] [Green Version]
  38. Marin-Plaza, P.; Hussein, A.; Martin, D.; Escalera, A.d.l. Global and Local Path Planning Study in a ROS-Based Research Platform for Autonomous Vehicles. J. Adv. Transp. 2018, 2018, 6392697. [Google Scholar] [CrossRef]
  39. Warren, C.W. Fast Path Planning Using Modified A* Method. In Proceedings of the IEEE International Conference on Robotics and Automation, Atlanta, GA, USA, 2–6 May 1993; IEEE: Atlanta, GA, USA, 1993; pp. 662–667. [Google Scholar] [CrossRef]
  40. Cui, J.; Zhang, Y.; Ma, S.; Yi, Y.; Xin, J.; Liu, D. Path planning algorithms for power transmission line inspection using unmanned aerial vehicles. In Proceedings of the 2017 29th Chinese Control And Decision Conference (CCDC), Chongqing, China, 28–30 May 2017; IEEE: Chongqing, China, 2017; pp. 2304–2309. [Google Scholar] [CrossRef]
  41. Borenstein, J.; Koren, Y. The vector field histogram-fast obstacle avoidance for mobile robots. IEEE Trans. Robot. Autom. 1991, 7, 278–288. [Google Scholar] [CrossRef] [Green Version]
  42. Lozano-Pérez, T.; Wesley, M.A. An algorithm for planning collision-free paths among polyhedral obstacles. Commun. ACM 1979, 22, 560–570. [Google Scholar] [CrossRef]
  43. Bellingham, J.; Tillerson, M.; Richards, A.; How, J.P. Multi-task allocation and path planning for cooperating UAVs. In Cooperative Control: Models, Applications and Algorithms; Springer: Berlin/Heidelberg, Germany, 2003; pp. 23–41. [Google Scholar] [CrossRef]
  44. Minguez, J.; Montano, L. Nearness Diagram (ND) Navigation: Collision Avoidance in Troublesome Scenarios. IEEE Trans. Robot. Autom. 2004, 20, 45–59. [Google Scholar] [CrossRef] [Green Version]
  45. Chakravarthy, A.; Ghose, D. Obstacle avoidance in a dynamic environment: A collision cone approach. IEEE Trans. Syst. ManCybern. Part A Syst. Hum. 1998, 28, 562–574. [Google Scholar] [CrossRef] [Green Version]
  46. Fiorini, P.; Shiller, Z. Motion Planning in Dynamic Environments Using Velocity Obstacles. Int. J. Robot. Res. 2016, 17, 760–772. [Google Scholar] [CrossRef]
  47. Ajith Kumar, B.; Ghose, D. Radar-assisted collision avoidance/guidance strategy for planar flight. IEEE Trans. Aerosp. Electron. Syst. 2001, 37, 77–90. [Google Scholar] [CrossRef] [Green Version]
  48. Olivares-Mendez, M.A.; Mejias, L.; Campoy, P.; Mellado-Bataller, I. Cross-Entropy Optimization for Scaling Factors of a Fuzzy Controller: A See-and-Avoid Approach for Unmanned Aerial Systems. J. Intell. Robot. Syst. 2012, 69, 189–205. [Google Scholar] [CrossRef] [Green Version]
  49. Richards, A.; How, J.P. Aircraft Trajectory Planning with Collision Avoidance Using Mixed Integer Linear Programming. In Proceedings of the 2002 American Control Conference (IEEE Cat. No. CH37301), Anchorage, AK, USA, 8–10 May 2002; IEEE: Anchorage, AK, USA, 2002; pp. 1936–1941. [Google Scholar] [CrossRef]
  50. Wang, X.; Yadav, V.; Balakrishnan, S.N. Cooperative UAV Formation Flying With Obstacle/Collision Avoidance. IEEE Trans. Control Syst. Technol. 2007, 15, 672–679. [Google Scholar] [CrossRef]
  51. Park, J.; Baek, H. Stereo vision based obstacle collision avoidance for a quadrotor using ellipsoidal bounding box and hierarchical clustering. Aerosp. Sci. Technol. 2020, 103, 105882. [Google Scholar] [CrossRef]
  52. Mukhtar, A.; Xia, L.; Tang, T.B. Vehicle detection techniques for collision avoidance systems: A review. IEEE Trans. Intell. Transp. Syst. 2015, 16, 2318–2338. [Google Scholar] [CrossRef]
  53. Rybus, T. Obstacle avoidance in space robotics: Review of major challenges and proposed solutions. Prog. Aerosp. Sci. 2018, 101, 31–48. [Google Scholar] [CrossRef]
  54. Lu, Y.; Xue, Z.; Xia, G.-S.; Zhang, L. A survey on vision-based UAV navigation. GEO Spat. Inf. Sci. 2018, 21, 21–32. [Google Scholar] [CrossRef] [Green Version]
  55. Shabbir, J.; Anwer, T. A survey of deep learning techniques for mobile robot applications. arXiv 2018, arXiv:07608. [Google Scholar]
  56. Faiçal, B.S.; Costa, F.G.; Pessin, G.; Ueyama, J.; Freitas, H.; Colombo, A.; Fini, P.H.; Villas, L.; Osório, F.S.; Vargas, P.A.; et al. The use of unmanned aerial vehicles and wireless sensor networks for spraying pesticides. J. Syst. Archit. 2014, 60, 393–404. [Google Scholar] [CrossRef]
  57. Bae, Y.; Koo, Y.M. Flight attitudes and spray patterns of a roll-balanced agricultural unmanned helicopter. Appl. Eng. Agric. 2013, 29, 675–682. [Google Scholar] [CrossRef]
  58. Giles, D.; Billing, R. Deployment and Performance of an Unmanned Aerial Vehicle for Spraying of Specialty Crops. In Proceedings of the International Conference of Agricultural Engineering, Zurich, Switzerland, 6 July 2014; p. C0589. [Google Scholar]
  59. Oksanen, T.; Visala, A. Path planning algorithms for agricultural machines. Agric. Eng. Int. CIGR J. 2007. [Google Scholar]
  60. Wang, C.; Song, J.; He, X.; Wang, Z.; Wang, S.; Meng, Y. Effect of flight parameters on distribution characteristics of pesticide spraying droplets deposition of plant-protection unmanned aerial vehicle. Trans. Chin. Soc. Agric. Eng. 2017, 33, 109–116. [Google Scholar]
  61. Fritz, B.K.; Czaczyk, Z.; Hoffmann, W.C. Model based decision support system of operating settings for MMAT nozzles. J. Plant Prot. Res. 2016, 56, 178–185. [Google Scholar] [CrossRef]
  62. De Bruin, S.; Lerink, P.; Klompe, A.; van der Wal, T.; Heijting, S. Spatial optimisation of cropped swaths and field margins using GIS. Comput. Electron. Agric. 2009, 68, 185–190. [Google Scholar] [CrossRef]
  63. Oksanen, T.; Visala, A. Coverage path planning algorithms for agricultural field machines. J. Field Robot. 2009, 26, 651–668. [Google Scholar] [CrossRef]
  64. Hofstee, J.; Spätjens, L.; Ijken, H. Optimal Path Planning for Field Operations. In Proceedings of the Joint International Agricultural Conference, (JIAC2009), Wageningen, The Netherlands, 6–8 July 2009; pp. 511–519. [Google Scholar]
  65. Hameed, I.A.; Bochtis, D.D.; Sørensen, C.G.; Nøremark, M. Automated generation of guidance lines for operational field planning. Biosyst. Eng. 2010, 107, 294–306. [Google Scholar] [CrossRef]
  66. Bochtis, D.D.; Sørensen, C.G. The vehicle routing problem in field logistics part I. Biosyst. Eng. 2009, 104, 447–457. [Google Scholar] [CrossRef]
  67. Bochtis, D.D.; Sørensen, C.G.; Busato, P.; Berruto, R. Benefits from optimal route planning based on B-patterns. Biosyst. Eng. 2013, 115, 389–395. [Google Scholar] [CrossRef]
  68. Scheuren, S.; Stiene, S.; Hartanto, R.; Hertzberg, J.; Reinecke, M. Spatio-temporally constrained planning for cooperative vehicles in a harvesting scenario. Ki-Künstliche Intell. 2013, 27, 341–346. [Google Scholar] [CrossRef]
  69. Vasquez Gomez, J.I.; Melchor, M.M.; Herrera Lozada, J.C. Optimal Coverage Path Planning Based on the Rotating Calipers Algorithm. In Proceedings of the 2017 International Conference on Mechatronics, Electronics and Automotive Engineering (ICMEAE), Cuernavaca, Mexico, 21–24 November 2017; pp. 140–144. [Google Scholar] [CrossRef]
  70. Torres, M.; Pelta, D.A.; Verdegay, J.L.; Torres, J.C. Coverage path planning with unmanned aerial vehicles for 3D terrain reconstruction. Expert Syst. Appl. 2016, 55, 441–451. [Google Scholar] [CrossRef]
  71. Zhou, K.; Leck Jensen, A.; Sørensen, C.G.; Busato, P.; Bothtis, D.D. Agricultural operations planning in fields with multiple obstacle areas. Comput. Electron. Agric. 2014, 109, 12–22. [Google Scholar] [CrossRef]
  72. Moon, S.-W.; Shim, D.H.-C. Study on Path Planning Algorithms for Unmanned Agricultural Helicopters in Complex Environment. Int. J. Aeronaut. Space Sci. 2009, 10, 1–11. [Google Scholar] [CrossRef] [Green Version]
  73. Wang, K.; Meng, Z.; Wang, L.; Wu, Z.; Wu, Z. Practical Obstacle Avoidance Path Planning for Agriculture UAVs. In Advances and Trends in Artificial Intelligence, Proceedings of the International Conference on Industrial, Engineering and Other Applications of Applied Intelligent Systems, Graz, Austria, 9–11 July 2019; Springer: Cham, Switzerland, 2019; pp. 196–203. [Google Scholar] [CrossRef]
  74. Zhang, X.; Fan, C.; Cao, Z.; Fang, J.; Jia, Y. Novel obstacle-avoiding path planning for crop protection UAV using optimized Dubins curve. Int. J. Agric. Biol. Eng. 2020, 13, 172–177. [Google Scholar] [CrossRef]
  75. Khorsandi, F.; Ayers, P.D.; Freeland, R.S.; Wang, X. Modeling the effect of liquid movement on the center of gravity calculation of agricultural vehicles. J. Terramech. 2018, 75, 37–48. [Google Scholar] [CrossRef]
  76. Monaghan, J.J.; Kos, A.; Issa, N. Fluid Motion Generated by Impact. J. Waterw. Port Coast. Ocean Eng. 2003, 129, 250–259. [Google Scholar] [CrossRef]
  77. Frosina, E.; Senatore, A.; Andreozzi, A.; Fortunato, F.; Giliberti, P. Experimental and Numerical Analyses of the Sloshing in a Fuel Tank. Energies 2018, 11, 682. [Google Scholar] [CrossRef] [Green Version]
  78. Zang, Y.; Zang, Y.; Zhou, Z.; Gu, X.; Jiang, R.; Kong, L.; He, X.; Luo, X.; Lan, Y. Design and anti-sway performance testing of pesticide tanks in spraying UAVs. Int. J. Agric. Biol. Eng. 2019, 12, 10–16. [Google Scholar] [CrossRef] [Green Version]
  79. Li, X.Z.J.; Qu, F.; Zhang, W.; Wang, D.; Li, W. Optimal design of anti sway inner cavity structure of agricultural UAV pesticide tank. Trans. Chin. Soc. Agric. Eng. 2017, 33, 72–79. [Google Scholar]
  80. Yan, G.R.; Rakheja, S.; Siddiqui, K. Baffle Design Analysis for a Road Tanker: Transient Fluid Slosh Approach. SAE Int. J. Commer. Veh. 2008, 1, 397–405. [Google Scholar] [CrossRef]
  81. Zheng, X.-L.; Li, X.-S.; Ren, Y.-Y.; Wang, Y.-N.; Ma, J. Effects of Transverse Baffle Design on Reducing Liquid Sloshing in Partially Filled Tank Vehicles. Math. Probl. Eng. 2013, 2013, 130570. [Google Scholar] [CrossRef]
  82. Kandasamy, T. An Analysis of Baffles Designs for Limiting Fluid Slosh in Partly Filled Tank Trucks. Open Transp. J. 2010, 4, 23–32. [Google Scholar] [CrossRef]
  83. Spickelmire, J. Liquid Stabilizing Baffle System. U.S. Patent 5,890,618, 6 April 1999. [Google Scholar]
  84. Taylor, G.L. Anti-Slosh Devices for Damping Oscillation of Liquids in Tanks. U.S. Patent 7,648,749, 19 January 2010. [Google Scholar]
  85. Lun, S.M.L.J.; Sakulthong, S.; Srigrarom, S. Wind Disturbance Control for V-Tail Y-Shape Quadcopter. In Proceedings of the 2019 First International Symposium on Instrumentation, Control, Artificial Intelligence, and Robotics (ICA-SYMP), Bangkok, Thailand, 16–18 January 2019; IEEE: Bangkok, Thailand, 2019; pp. 195–202. [Google Scholar] [CrossRef]
  86. Le Nhu Ngoc Thanh, H.; Hong, S.K. Quadcopter Robust Adaptive Second Order Sliding Mode Control Based on PID Sliding Surface. IEEE Access 2018, 6, 66850–66860. [Google Scholar] [CrossRef]
  87. Freeman, P.K.; Freeland, R.S. Agricultural UAVs in the U.S.: Potential, policy, and hype. Remote Sens. Appl. Soc. Environ. 2015, 2, 35–43. [Google Scholar] [CrossRef]
  88. Lan, Y.; Chen, S. Current status and trends of plant protection UAV and its spraying technology in China. Int. J. Precis. Agric. Aviat. 2018, 1, 1–9. [Google Scholar] [CrossRef]
  89. Lan, W.G.B. Overview and development prospects of China’s plant protection drone industry. Agric. Eng. Technol. 2018, 38, 17–27. [Google Scholar] [CrossRef]
  90. Chen, S.; Lan, Y.; Li, J.; Xu, X.; Wang, Z.; Peng, B. Evaluation and test of effective spraying width of aerial spraying on plant protection UAV. Trans. Chin. Soc. Agric. Eng. 2017, 33, 82–90. [Google Scholar]
  91. Wang, C.; He, X.; Wang, X.; Wang, Z.; Pan, H.; He, Z. Testing method of spatial pesticide spraying deposition quality balance for unmanned aerial vehicle. Trans. Chin. Soc. Agric. Eng. 2016, 32, 54–61. [Google Scholar] [CrossRef] [Green Version]
  92. Wang, D.; Zhang, J.; Li, W.; Xiong, B.; Zhang, S.; Zhang, W. Design and test of dynamic variable spraying system of plant protection UAV. Trans. Chin. Soc. Agric. Mach 2017, 5, 86–93. [Google Scholar]
  93. Apple Maps. Satellites Pro. Available online: https://satellites.pro/China_map#31.928614,119.487323,19 (accessed on 3 January 2021).
  94. Apple Maps. Satellites Pro. Available online: https://satellites.pro/China_map#31.878564,119.454724,19 (accessed on 3 January 2021).
  95. Apple Maps. Satellites Pro. Available online: https://satellites.pro/China_map#31.756452,119.511074,19 (accessed on 3 January 2021).
  96. Apple Maps. Satellites Pro. Available online: https://satellites.pro/China_map#32.466044,120.242939,19 (accessed on 3 January 2021).
  97. Wang, L.; Lan, Y.; Zhang, Y.; Zhang, H.; Tahir, M.N.; Ou, S.; Liu, X.; Chen, P. Applications and Prospects of Agricultural Unmanned Aerial Vehicle Obstacle Avoidance Technology in China. Sensors 2019, 19, 642. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  98. Nakhaeinia, D.; Tang, S.H.; Noor, S.M.; Motlagh, O. A review of control architectures for autonomous navigation of mobile robots. Int. J. Phys. Sci. 2011, 6, 169–174. [Google Scholar] [CrossRef]
  99. Wang, G.; Han, Y.; Li, X.; Andaloro, J.; Chen, P.; Hoffmann, W.C.; Han, X.; Chen, S.; Lan, Y. Field evaluation of spray drift and environmental impact using an agricultural unmanned aerial vehicle (UAV) sprayer. Sci. Total Environ. 2020, 737, 139793. [Google Scholar] [CrossRef] [PubMed]
  100. Liu, Z.; He, Y.; Wang, C.; Song, R. Analysis of the Influence of Foggy Weather Environment on the Detection Effect of Machine Vision Obstacles. Sensors 2020, 20, 349. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  101. Richard, P.-L.; Pouliot, N.; Montambault, S. Introduction of a LIDAR-Based Obstacle Detection System on the LineScout Power Line Robot. In Proceedings of the 2014 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Besacon, France, 8–11 July 2014; IEEE: Besacon, France, 2014; pp. 1734–1740. [Google Scholar] [CrossRef]
  102. Zhu, Y.; Yi, B.; Guo, T. A Simple Outdoor Environment Obstacle Detection Method Based on Information Fusion of Depth and Infrared. J. Robot. 2016, 2016, 2379685. [Google Scholar] [CrossRef] [Green Version]
  103. White, B.A.; Shin, H.-S.; Tsourdos, A. UAV Obstacle Avoidance using Differential Geometry Concepts. IFAC Proc. Vol. 2011, 44, 6325–6330. [Google Scholar] [CrossRef]
  104. Aswini, N.; Krishna Kumar, E.; Uma, S.V. UAV and obstacle sensing techniques—A perspective. Int. J. Intell. Unmanned Syst. 2018, 6, 32–46. [Google Scholar] [CrossRef]
  105. Discant, A.; Rogozan, A.; Rusu, C.; Bensrhair, A. Sensors for Obstacle Detection—A Survey. In Proceedings of the 2007 30th International Spring Seminar on Electronics Technology (ISSE), Cluj-Napoca, Romania, 9–13 May 2007; IEEE: Cluj-Napoca, Romania, 2007; pp. 100–105. [Google Scholar] [CrossRef]
  106. Siegwart, R.; Nourbakhsh, I.R.; Scaramuzza, D. Introduction to Autonomous Mobile Robots; MIT Press: Boston, MA, USA, 2011. [Google Scholar]
  107. Wen-Hong, Z.; Lamarche, T. Velocity Estimation by Using Position and Acceleration Sensors. IEEE Trans. Ind. Electron. 2007, 54, 2706–2715. [Google Scholar] [CrossRef]
  108. Dai, R.; Stein, R.B.; Andrews, B.J.; James, K.B.; Wieler, M. Application of tilt sensors in functional electrical stimulation. IEEE Trans. Rehabil. Eng 1996, 4, 63–72. [Google Scholar] [CrossRef] [PubMed]
  109. Chao, H.; Gu, Y.; Gross, J.; Guo, G.; Fravolini, M.L.; Napolitano, M.R. A Comparative Study of Optical Flow and Traditional Sensors in Uav Navigation. In Proceedings of the 2013 American Control Conference, Washington, DC, USA, 17–19 June 2013; IEEE: Washington, DC, USA, 2013; pp. 3858–3863. [Google Scholar] [CrossRef]
  110. Racz, R.; Schott, C.; Huber, S. Electronic Compass Sensor; SENSORS; IEEE: Vienna, Austria, 2004; pp. 1446–1449. [Google Scholar] [CrossRef]
  111. Beliveau, A.; Spencer, G.T.; Thomas, K.A.; Roberson, S.L. Evaluation of MEMS capacitive accelerometers. IEEE Des. Test Comput. 1999, 16, 48–56. [Google Scholar] [CrossRef]
  112. Foix, S.; Alenya, G.; Torras, C. Lock-in Time-of-Flight (ToF) Cameras: A Survey. IEEE Sens. J. 2011, 11, 1917–1926. [Google Scholar] [CrossRef] [Green Version]
  113. Yan, W.Y.; Shaker, A.; El-Ashmawy, N. Urban land cover classification using airborne LiDAR data: A review. Remote Sens. Environ. 2015, 158, 295–310. [Google Scholar] [CrossRef]
  114. Suh, Y.S. Laser Sensors for Displacement, Distance and Position. Sensors 2019, 19, 1924. [Google Scholar] [CrossRef] [Green Version]
  115. Bernini, N.; Bertozzi, M.; Castangia, L.; Patander, M.; Sabbatelli, M. Real-Time Obstacle Detection Using Stereo Vision for Autonomous Ground Vehicles: A Survey. In Proceedings of the 17th International IEEE Conference on Intelligent Transportation Systems (ITSC), Qingdao, China, 24–26 September 2014; IEEE: Qingdao, China, 2014; pp. 873–878. [Google Scholar] [CrossRef]
  116. Choi, J.; Ahn, S.; Chung, W.K. Robust Sonar Feature Detection for the SLAM of Mobile Robot. In Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Edmonton, AB, Canada, 2–6 August 2005; IEEE: Edmonton, AB, Canada, 2005; pp. 3415–3420. [Google Scholar] [CrossRef]
  117. Li, C.; Peng, Z.; Huang, T.-Y.; Fan, T.; Wang, F.-K.; Horng, T.-S.; Munoz-Ferreras, J.-M.; Gomez-Garcia, R.; Ran, L.; Lin, J. A Review on Recent Progress of Portable Short-Range Noncontact Microwave Radar Systems. IEEE Trans. Microw. Theory Tech. 2017, 65, 1692–1706. [Google Scholar] [CrossRef]
  118. Akagawa, K. Thermal Camera for Infrared Imaging. U.S. Patent 5,994,699, 30 November 1999. [Google Scholar]
  119. Fossum, E.R.; Hondongwa, D.B. A review of the pinned photodiode for CCD and CMOS image sensors. IEEE J. Electron. Devices Soc. 2014. [Google Scholar] [CrossRef]
  120. Yamaguchi, K.; Kato, T.; Ninomiya, Y. Moving Obstacle Detection Using Monocular Vision. In Proceedings of the 2006 IEEE Intelligent Vehicles Symposium, Meguro-Ku, Japan, 13–15 June 2006; IEEE: Meguro-Ku, Japan, 2006; pp. 288–293. [Google Scholar] [CrossRef]
  121. Han, Y.-X.; Zhang, Z.-S.; Dai, M. Monocular vision system for distance measurement based on feature points. Guangxue Jingmi Gongcheng 2011, 19, 1110–1117. [Google Scholar]
  122. Zhao, H.; Chen, X.C.; Wang, J.L.; Zeng, R.F. Obstacle avoidance algorithm based on monocular vision for quad-rotor helicopter. Opt. Precis. Eng. 2014, 22, 2232–2241. [Google Scholar] [CrossRef]
  123. Rui, Z.; Jingyi, L.; Hengyu, L.; Qixing, C. Real-Time Obstacle Detection Based on Monocular Vision for Unmanned Surface Vehicles. In Proceedings of the International Conference on Bio-inspired Information and Communication Technologies, Singapore, 1–2 August 2020; Springer: Singapore, 2020; pp. 166–180. [Google Scholar] [CrossRef]
  124. Wang, S.-H.; Li, X.-X. A Real-Time Monocular Vision-Based Obstacle Detection. In Proceedings of the 2020 6th International Conference on Control, Automation and Robotics (ICCAR), Singapore, 20–23 April 2020; IEEE: Singapore, 2020; pp. 695–699. [Google Scholar] [CrossRef]
  125. Cho, M.-g. A Study on the Obstacle Recognition for Autonomous Driving RC Car Using Lidar and Thermal Infrared Camera. In Proceedings of the 2019 Eleventh International Conference on Ubiquitous and Future Networks (ICUFN), Zagreb, Croatia, 2–5 June 2019; IEEE: Zagreb, Croatia, 2019; pp. 544–546. [Google Scholar] [CrossRef]
  126. Carrio, A.; Lin, Y.; Saripalli, S.; Campoy, P. Obstacle Detection System for Small UAVs using ADS-B and Thermal Imaging. J. Intell. Robot. Syst. 2017, 88, 583–595. [Google Scholar] [CrossRef]
  127. Huang, S.-K.; Xia, T.; Zhang, T.-X. Passive ranging method based on infrared images. Infrared Laser Eng. 2007, 36, 109. [Google Scholar]
  128. Lu, Y.; Feng, Y.-S.; Ling, Y.-S.; Qiao, Y. Infrared three-color passive ranging by colorimetric method. Guangxue Jingmi Gongcheng 2012, 20, 2680–2685. [Google Scholar] [CrossRef]
  129. Wang, H.; Wang, J.; Shen, Z. Helicopter Pods-based Obstacle Avoidance Technology Using Infrared Imaging and Radar. Sci. Technol. Innov. Her 2014, 29, 56–59. [Google Scholar]
  130. Cheng, H.; Li, J.; Jin, B. Research of Small Blind Zone Ultrasonic Ranging Method Based on Natural Vibration Restraining. J. Vib. Meas. Diagn 2015, 2, 369–374. [Google Scholar]
  131. Wang, M. Localization and Obstacle Avoidance Control of Agricultural Robot Based on DSP and Ultrasonic Distance Measurement. Agric. Mech. Res 2017, 8, 207–211. [Google Scholar]
  132. Zhao, H.; Liu, Y.; Zhu, X.; Zhao, Y.; Zha, H. Scene Understanding in a Large Dynamic Environment through a Laser-Based Sensing. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 4–8 May 2010; IEEE: Anchorage, AK, USA, 2010; pp. 127–133. [Google Scholar] [CrossRef]
  133. Wang, Y.; Liu, J.; Zeng, Q. 3D environment restructure method with structured light for indoor vision/inertial navigation. J. Chin. Inert. Technol. 2016, 1, 51–58. [Google Scholar]
  134. Houshiar, H.; Elseberg, J.; Borrmann, D.; Nüchter, A. A study of projections for key point based registration of panoramic terrestrial 3D laser scan. GEO Spat. Inf. Sci. 2015, 18, 11–31. [Google Scholar] [CrossRef]
  135. Thorbjornsen, B.; White, N.; Brown, A.; Reeve, J. Radio frequency (RF) time-of-flight ranging for wireless sensor networks. Meas. Sci. Technol. 2010, 21, 035202. [Google Scholar] [CrossRef] [Green Version]
  136. Rankin, G.; Tirkel, A.; Leukhin, A. Millimeter Wave Array for UAV Imaging MIMO Radar. In Proceedings of the 2015 16th International Radar Symposium (IRS), Dresden, Germany, 24–26 June 2015; IEEE: Dresden, Germany, 2015; pp. 499–504. [Google Scholar] [CrossRef]
  137. Zhang, W.; Ning, Y.; Suo, C. A Method Based on Multi-Sensor Data Fusion for UAV Safety Distance Diagnosis. Electronics 2019, 12, 1467. [Google Scholar] [CrossRef] [Green Version]
  138. Lyu, H. Detect and Avoid System Based on Multi Sensor Fusion for UAV. In Proceedings of the 2018 International Conference on Information and Communication Technology Convergence (ICTC), Jeju, Korea, 17–19 October 2018; IEEE: Jeju, Korea, 2018; pp. 1107–1109. [Google Scholar] [CrossRef]
  139. Hrabar, S.; Sukhatme, G.S.; Corke, P.; Usher, K.; Roberts, J. Combined Optic-Flow and Stereo-Based Navigation of Urban Canyons for a UAV. In Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Edmonton, AB, Canada, 2–6 August 2005; IEEE: Edmonton, AB, Canada, 2005; pp. 3309–3316. [Google Scholar] [CrossRef] [Green Version]
  140. McGuire, K.; De Croon, G.; De Wagter, C.; Tuyls, K.; Kappen, H. Efficient Optical Flow and Stereo Vision for Velocity Estimation and Obstacle Avoidance on an Autonomous Pocket Drone. IEEE Robot. Autom. Lett. 2017, 2, 1070–1076. [Google Scholar] [CrossRef] [Green Version]
  141. Santos, M.C.; Santana, L.V.; Brandao, A.S.; Sarcinelli-Filho, M. UAV Obstacle Avoidance Using RGB-D System. In Proceedings of the 2015 International Conference On Unmanned Aircraft Systems (ICUAS), Denver, CO, USA, 9–12 June 2015; IEEE: Denver, CO, USA, 2015; pp. 312–319. [Google Scholar] [CrossRef]
  142. Gageik, N.; Benz, P.; Montenegro, S. Obstacle Detection and Collision Avoidance for a UAV With Complementary Low-Cost Sensors. IEEE Access 2015, 3, 599–609. [Google Scholar] [CrossRef]
  143. Kragh, M.F.; Christiansen, P.; Laursen, M.S.; Larsen, M.; Steen, K.A.; Green, O.; Karstoft, H.; Jorgensen, R.N. FieldSAFE: Dataset for Obstacle Detection in Agriculture. Sensors 2017, 17, 2579. [Google Scholar] [CrossRef] [Green Version]
  144. Gageik, N.; Muller, T.; Montenergo, S. Obstacle detection and collision avoidance using ultrasonic distance sensors for an autonomous quadrocopter. In Proceedings of the 1st microdrones International ResearchWorkshop UAVWeek 2012, Siegen, Germany, 20–21 November 2012; pp. 3–23. [Google Scholar]
  145. Zhmud, V.; Kondratiev, N.; Kuznetsov, K.; Trubin, V.; Dimitrov, L. Application of Ultrasonic Sensor for Measuring Distances in Robotics; Conference Series; IOP Publishing: Tomsk, Russia, 2018; p. 032189. [Google Scholar] [CrossRef] [Green Version]
  146. Kelemen, M.; Virgala, I.; Kelemenová, T.; Mikova, L.; Frankovský, P.; Lipták, T.; Lörinc, M. Distance measurement via using of ultrasonic sensor. J. Autom. Control 2015, 3, 71–74. [Google Scholar]
  147. Kilian, J.; Haala, N.; Englich, M. Capture and evaluation of airborne laser scanner data. Int. Arch. Photogramm. Remote Sens. 1996, 31, 383–388. [Google Scholar]
  148. Donges, A.; Noll, R. Laser Measurement Technology; Springer: Berlin/Heidelberg, Germany, 2016. [Google Scholar]
  149. Di, L.; Chao, H.; Chen, Y. A Two-Stage Calibration Method for Low-Cost UAV Attitude Estimation Using Infrared Sensor. In Proceedings of the 2010 IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications, QingDao, China, 15–17 July 2010; IEEE: QingDao, China, 2010; pp. 137–142. [Google Scholar] [CrossRef]
  150. Silberman, N.; Fergus, R. Indoor Scene Segmentation Using a Structured LIGHT Sensor. In Proceedings of the 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops), Barcelona, Spain, 6–13 November 2011; IEEE: Barcelona, Spain, 2011; pp. 601–608. [Google Scholar] [CrossRef]
  151. Nejad, S.M.; Olyaee, S. Low-noise high-accuracy TOF laser range finder. Am. J. Appl. Sci. 2008, 5, 755–762. [Google Scholar] [CrossRef] [Green Version]
  152. Fujimoto, D.; Hayashi, Y.-I. Study on Estimation of Sensing Timing Based on Observation of EM Radiation from ToF Range Finder. In Proceedings of the 2019 Joint International Symposium on Electromagnetic Compatibility, Sapporo and Asia-Pacific International Symposium on Electromagnetic Compatibility (EMC Sapporo/APEMC), Sapporo, Japan, 3–7 June 2019; IEEE: Sapporo, Japan, 2019; pp. 1–4. [Google Scholar] [CrossRef]
  153. Xiang, J.; Zhang, M. Millimeter-Wave Radar and Its Applications; National Defense Industry Press: Beijing, China, 2015. [Google Scholar]
  154. Johnston, S.L. Millimeter Wave Radar; Artech House: Dedham, MA, USA, 1980. [Google Scholar]
  155. Chen, H.-C. Monocular Vision-Based Obstacle Detection and Avoidance for a Multicopter. IEEE Access 2019, 7, 167869–167883. [Google Scholar] [CrossRef]
  156. Levkovits-Scherer, D.S.; Cruz-Vega, I.; Martinez-Carranza, J. Real-Time Monocular Vision-Based UAV Obstacle Detection and Collision Avoidance in GPS-Denied Outdoor Environments Using CNN MobileNet-SSD. In Proceedings of the Mexican International Conference on Artificial Intelligence, Veracruz, Mexico, 27 October–2 November 2019; Springer: Berlin/Heidelberg, Germany, 2019; pp. 613–621. [Google Scholar] [CrossRef]
  157. Zhang, L.; Xu, J.; Xia, Q. Pose estimation algorithm and verification based on binocular stereo vision for unmanned aerial vehicle. J. Harbin Inst. Technol. 2014, 46, 66–72. [Google Scholar]
  158. Zhu, P.; Zhen, Z.-Y.; Qin, H.-Q.; Jiang, J. Stereo vision and optical flow based obstacle avoidance algorithm for UAVs. Electron. Opt. Control 2017, 24, 31–35. [Google Scholar]
  159. Wang, Q.; Meng, Z.; Liu, H. Review on Application of Binocular Vision Technology in Field Obstacle Detection. In IOP Conference Series: Materials Science and Engineering, Proceedings of the International Conference on AI and Big Data Application (AIBDA 2019), Guangzhou, China, 20–22 December 2019; IOP Publishing: Bristol, UK, 2020; p. 012025. [Google Scholar] [CrossRef]
  160. Lei, Z.; Shumao, W.; Bingqi, C.; Zhigang, L. Detection of obstacles in farmland based on binocular vision. J. China Agric. Univ. 2007, 12, 70. [Google Scholar]
  161. Mori, T.; Scherer, S. First Results in Detecting and Avoiding Frontal Obstacles from a Monocular Camera for Micro Unmanned Aerial Vehicles. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; IEEE: Karlsruhe, Germany, 2013; pp. 1750–1757. [Google Scholar] [CrossRef] [Green Version]
  162. Lee, J.O.; Lee, K.H.; Park, S.H.; Im, S.G.; Park, J. Obstacle avoidance for small UAVs using monocular vision. Aircr. Eng. Aerosp. Technol. 2011, 83, 397–406. [Google Scholar] [CrossRef] [Green Version]
  163. Magree, D.; Mooney, J.G.; Johnson, E.N. Monocular visual mapping for obstacle avoidance on UAVs. In Proceedings of the 2013 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, 28–31 May 2013; IEEE: Atlanta, GA, USA, 2013; pp. 471–479. [Google Scholar] [CrossRef]
  164. Pal, S.K.; King, R.A.; Hashim, A.A. Image description and primitive extraction using fuzzy sets. IEEE Trans. Syst. ManCybern. 1983, SMC-13, 94–100. [Google Scholar] [CrossRef]
  165. Aoude, G.S.; Luders, B.D.; Levine, D.S.; How, J.P. Threat-Aware Path Planning in Uncertain Urban Environments. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010; IEEE: Taipei, Taiwan, 2010; pp. 6058–6063. [Google Scholar] [CrossRef]
  166. Barry, A.J.; Florence, P.R.; Tedrake, R. High-speed autonomous obstacle avoidance with pushbroom stereo. J. Field Robot. 2018, 35, 52–68. [Google Scholar] [CrossRef] [Green Version]
  167. Souhila, K.; Karim, A. Optical Flow Based Robot Obstacle Avoidance. Int. J. Adv. Robot. Syst. 2007, 4, 2. [Google Scholar] [CrossRef] [Green Version]
  168. Moore, R.J.; Thurrowgood, S.; Bland, D.; Soccol, D.; Srinivasan, M.V. A Stereo Vision System for Uav Guidance. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 11–15 October 2009; IEEE: St. Louis, MO, USA, 2009; pp. 3386–3391. [Google Scholar] [CrossRef]
  169. Gao, Y.; Ai, X.; Wang, Y.; Rarity, J.; Dahnoun, N. UV-Disparity Based Obstacle Detection with 3D Camera and Steerable Filter. In Proceedings of the 2011 IEEE Intelligent Vehicles Symposium (IV), Baden-Baden, Germany, 5–9 June 2011; IEEE: Baden-Baden, Germany, 2011; pp. 957–962. [Google Scholar] [CrossRef]
  170. Kramm, S.; Bensrhair, A. Obstacle Detection Using Sparse Stereovision and Clustering Techniques. In Proceedings of the 2012 IEEE Intelligent Vehicles Symposium, Madrid, Spain, 3–7 June 2012; IEEE: Madrid, Spain, 2012; pp. 760–765. [Google Scholar] [CrossRef]
  171. Iacono, M.; Sgorbissa, A. Path following and obstacle avoidance for an autonomous UAV using a depth camera. Robot. Auton. Syst. 2018, 106, 38–46. [Google Scholar] [CrossRef]
  172. Kato, T.; Ninomiya, Y.; Masaki, I. An obstacle detection method by fusion of radar and motion stereo. IEEE Trans. Intell. Transp. Syst. 2002, 3, 182–188. [Google Scholar] [CrossRef]
  173. Vidhya, D.; Rebelo, D.P.; D’Silva, C.; Fernandes, L.W.; Costa, C. Obstacle detection using ultrasonic sensors. Int. J. Innov. Res. Sci. Technol. 2016, 2, 316–320. [Google Scholar]
  174. Viquerat, A.; Blackhall, L.; Reid, A.; Sukkarieh, S.; Brooker, G. Reactive Collision Avoidance for Unmanned Aerial Vehicles Using Doppler Radar. In Field and Service Robotics; Springer: Berlin/Heidelberg, Germany, 2008; pp. 245–254. [Google Scholar] [CrossRef] [Green Version]
  175. Blanc, C.; Aufrère, R.; Malaterre, L.; Gallice, J.; Alizon, J. Obstacle detection and tracking by millimeter wave RADAR. IFAC Proc. Vol. 2004, 37, 322–327. [Google Scholar] [CrossRef]
  176. Sugimoto, S.; Tateda, H.; Takahashi, H.; Okutomi, M. Obstacle Detection Using Millimeter-Wave Radar and Its Visualization on Image Sequence. In Proceedings of the 17th International Conference on Pattern Recognition, ICPR 2004, Cambridge, UK, 26 August 2004; IEEE: Cambridge, UK, 2004; pp. 342–345. [Google Scholar] [CrossRef]
  177. Han, J.; Kim, D.; Lee, M.; Sunwoo, M. Enhanced road boundary and obstacle detection using a downward-looking LIDAR sensor. IEEE Trans. Veh. Technol. 2012, 61, 971–985. [Google Scholar] [CrossRef]
  178. Catapang, A.N.; Ramos, M. Obstacle Detection Using a 2D LIDAR System for an Autonomous Vehicle. In Proceedings of the 2016 6th IEEE International Conference on Control System, Computing and Engineering (ICCSCE), Batu Ferringhi, Malaysia, 2 August 2016; IEEE: Batu Ferringhi, Malaysia, 2016; pp. 441–445. [Google Scholar] [CrossRef]
  179. Kuthirummal, S.; Das, A.; Samarasekera, S. A Graph Traversal Based Algorithm for Obstacle Detection Using Lidar or Stereo. In Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, CA, USA, 25–30 September 2011; IEEE: San Francisco, CA, USA, 2011; pp. 3874–3880. [Google Scholar] [CrossRef]
  180. Thi Phuoc Van, N.; Tang, L.; Demir, V.; Hasan, S.F.; Duc Minh, N.; Mukhopadhyay, S. Review-Microwave Radar Sensing Systems for Search and Rescue Purposes. Sensors 2019, 19, 2879. [Google Scholar] [CrossRef] [Green Version]
  181. Zeng, S.; Zhang, W.; Litkouhi, B.B. Fusion of Obstacle Detection Using Radar and Camera. U.S. Patent 9,429,650, 30 August 2016. [Google Scholar]
  182. Jha, H.; Lodhi, V.; Chakravarty, D. Object Detection and Identification Using Vision and Radar Data Fusion System for Ground-Based Navigation. In Proceedings of the 2019 6th International Conference on Signal Processing and Integrated Networks (SPIN), Noida, India, 7–8 March 2019; IEEE: Noida, India, 2019; pp. 590–593. [Google Scholar] [CrossRef]
  183. Bertozzi, M.; Bombini, L.; Cerri, P.; Medici, P.; Antonello, P.C.; Miglietta, M. Obstacle Detection and Classification Fusing Radar and Vision. In Proceedings of the 2008 IEEE Intelligent Vehicles Symposium, Eindhoven, The Netherlands, 4–6 June 2008; IEEE: Eindhoven, The Netherlands, 2008; pp. 608–613. [Google Scholar] [CrossRef]
  184. Hill, M.N. Physical Oceanography; Harvard University Press: Boston, MA, USA, 2005; Volume 1. [Google Scholar]
  185. D’amico, A.; Pittenger, R. A Brief History of Active Sonar; Space and Naval Warfare Systems Center: San Diego, CA, USA, 2009. [Google Scholar]
  186. Elfes, A. Sonar-based real-world mapping and navigation. IEEE J. Robot. Autom. 1987, 3, 249–265. [Google Scholar] [CrossRef]
  187. Flynn, A.M. Combining Sonar and Infrared Sensors for Mobile Robot Navigation. Int. J. Robot. Res. 2016, 7, 5–14. [Google Scholar] [CrossRef]
  188. Kleeman, L.; Kuc, R. An Optimal Sonar Array for Target Localization and Classification. In Proceedings of the 1994 IEEE International Conference on Robotics and Automation, San Diego, CA, USA, 8–13 May 1994; IEEE: San Diego, CA, USA, 1994; pp. 3130–3135. [Google Scholar] [CrossRef]
  189. Akbarally, H.; Kleeman, L. A Sonar Sensor for Accurate 3D Target Localisation and Classification. In Proceedings of the 1995 IEEE International Conference on Robotics and Automation, Nagoya, Japan, 21–27 May 1995; IEEE: Nagoya, Japan, 1995; pp. 3003–3008. [Google Scholar] [CrossRef]
  190. Ribas, D.; Ridao, P.; Neira, J.; Tardos, J.D. SLAM Using an Imaging Sonar for Partially Structured Underwater Environments. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; IEEE: Beijing, China, 2006; pp. 5040–5045. [Google Scholar] [CrossRef] [Green Version]
  191. Steckel, J.; Peremans, H. BatSLAM: Simultaneous localization and mapping using biomimetic sonar. PLoS ONE 2013, 8, e54076. [Google Scholar] [CrossRef] [PubMed]
  192. Steckel, J.; Peremans, H. Spatial Sampling Strategy for a 3D Sonar Sensor Supporting BatSLAM. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–3 October 2015; IEEE: Hamburg, Germany, 2015; pp. 723–728. [Google Scholar] [CrossRef]
  193. Kerstens, R.; Laurijssen, D.; Steckel, J. ERTIS: A Fully Embedded Real Time 3D Imaging Sonar Sensor for Robotic Applications. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; IEEE: Montreal, QC, Canada, 2019; pp. 1438–1443. [Google Scholar] [CrossRef]
  194. Bin Misnan, M.F.; Arshad, N.M.; Abd Razak, N. Construction Sonar Sensor Model of Low Altitude Field Mapping Sensors for Application on a UAV. In Proceedings of the 2012 IEEE 8th International Colloquium on Signal Processing and its Applications, Malacca, Malaysia, 23–25 March 2012; IEEE: Malacca, Malaysia, 2012; pp. 446–450. [Google Scholar] [CrossRef]
  195. Gupta, N.; Makkar, J.S.; Pandey, P. Obstacle Detection and Collision Avoidance Using Ultrasonic Sensors for Rc Multirotors. In Proceedings of the 2015 International Conference on Signal Processing and Communication (ICSC), Noida, India, 16–18 March 2015; IEEE: Noida, India, 2015; pp. 419–423. [Google Scholar] [CrossRef]
  196. Becker, M.; Sampaio, R.C.B.; Bouabdallah, S.; Perrot, V.; Siegwart, R. In flight collision avoidance for a Mini-UAV robot based on onboard sensors. J. Braz. Soc. Mech. Sci. Eng. 2012, 2. Available online: https://www.researchgate.net/profile/Rafael-Sampaio-8/publication/261635073_In_flight_collision_avoidance_for_a_Mini-UAV_robot_based_on_onboard_sensors/links/00b7d534e01ac6c752000000/In-flight-collision-avoidance-for-a-Mini-UAV-robot-based-on-onboard-sensors.pdf (accessed on 1 April 2021).
  197. Li, J.; Kaess, M.; Eustice, R.M.; Johnson-Roberson, M. Pose-Graph SLAM Using Forward-Looking Sonar. IEEE Robot. Autom. Lett. 2018, 3, 2330–2337. [Google Scholar] [CrossRef]
  198. Rahman, S.; Li, A.Q.; Rekleitis, I. Sonar Visual Inertial SLAM of Underwater Structures. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; IEEE: Brisbane, QLD, Australia, 2018; pp. 5190–5196. [Google Scholar] [CrossRef]
  199. Teixeira, P.V.; Kaess, M.; Hover, F.S.; Leonard, J.J. Underwater Inspection Using Sonar-Based Volumetric Submaps. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; IEEE: Daejeon, Korea, 2016; pp. 4288–4295. [Google Scholar] [CrossRef] [Green Version]
  200. Huang, T.A.; Kaess, M. Towards Acoustic Structure from Motion for Imaging Sonar. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–3 October 2015; IEEE: Hamburg, Germany, 2015; pp. 758–765. [Google Scholar] [CrossRef]
  201. Wang, X.; Zhang, G.; Sun, Y.; Wan, L.; Cao, J. Research on autonomous underwater vehicle wall following based on reinforcement learning and multi-sonar weighted round robin mode. Int. J. Adv. Robot. Syst. 2020, 17, 1729881420925311. [Google Scholar] [CrossRef]
  202. Chutia, S.; Kakoty, N.M.; Deka, D. A review of underwater robotics, navigation, sensing techniques and applications. Proc. Adv. Robot. 2017, 1–6. [Google Scholar] [CrossRef]
  203. Sahoo, A.; Dwivedy, S.K.; Robi, P.S. Advancements in the field of autonomous underwater vehicle. Ocean Eng. 2019, 181, 145–160. [Google Scholar] [CrossRef]
  204. Christ, R.D.; Wernli, R.L., Sr. The ROV Manual: A User Guide for Remotely Operated Vehicles; Butterworth-Heinemann: Oxford, UK, 2013. [Google Scholar]
  205. Nguyen, H.T.; Lee, E.H.; Lee, S. Study on the Classification Performance of Underwater Sonar Image Classification Based on Convolutional Neural Networks for Detecting a Submerged Human Body. Sensors 2019, 20, 94. [Google Scholar] [CrossRef] [Green Version]
  206. Levanon, N. Radar Principles. John Wiley & Sons: New York, NY, USA, 1988. [Google Scholar]
  207. Özer, I.E.; Leijen, F.J.; Jonkman, S.N.; Hanssen, R.F. Applicability of satellite radar imaging to monitor the conditions of levees. J. Flood Risk Manag. 2018, 12 (Suppl. S2), e12509. [Google Scholar] [CrossRef] [Green Version]
  208. Lee, J.-S.; Pottier, E. Polarimetric Radar Imaging: From Basics to Applications; CRC Press: Boca Raton, FL, USA, 2017. [Google Scholar]
  209. Kanevsky, M.B. Radar Imaging of the Ocean Waves; Elsevier: Amsterdam, The Netherlands, 2008. [Google Scholar]
  210. Brisken, S.; Moscadelli, M.; Seidel, V.; Schwark, C. Passive Radar Imaging Using DVB-S2. In Proceedings of the 2017 IEEE Radar Conference (RadarConf), Seattle, WA, USA, 8–12 May 2017; IEEE: Seattle, WA, USA, 2017; pp. 0552–0556. [Google Scholar] [CrossRef]
  211. Ergun, S.; Sonmez, S. Terahertz technology for military applications. J. Manag. Inf. Sci. 2015, 3, 13–16. [Google Scholar] [CrossRef]
  212. Pisciottano, I.; Pastina, D.; Cristallini, D. DVB-S based passive radar imaging of ship targets. In Proceedings of the 2019 20th International Radar Symposium (IRS), Ulm, Germany, 26–28 June 2019; IEEE: Ulm, Germany, 2019; pp. 1–7. [Google Scholar] [CrossRef]
  213. Cristallini, D.; Pisciottano, I.; Kuschel, H. Multi-Band Passive Radar Imaging Using Satellite Illumination. In Proceedings of the 2018 International Conference on Radar (RADAR), Brisbane, QLD, Australia, 27–30 August 2018; IEEE: Brisbane, QLD, Australia, 2018; pp. 1–6. [Google Scholar] [CrossRef]
  214. Giubbolini, L. A multistatic microwave radar sensor for short range anticollision warning. IEEE Trans. Veh. Technol. 2000, 49, 2270–2275. [Google Scholar] [CrossRef]
  215. Baraniuk, R.; Steeghs, P. In Compressive Radar Imaging. In Proceedings of the 2007 IEEE radar conference, Waltham, MA, USA, 17–20 April 2007; IEEE: Waltham, MA, USA, 2007; pp. 128–133. [Google Scholar] [CrossRef]
  216. Vivet, D.; Checchin, P.; Chapuis, R. Localization and mapping using only a rotating FMCW radar sensor. Sensors 2013, 13, 4527–4552. [Google Scholar] [CrossRef] [PubMed]
  217. Zhu, Y.; Zhu, Y.; Zhao, B.Y.; Zheng, H. Reusing 60ghz Radios for Mobile Radar Imaging. In Proceedings of the 21st Annual International Conference on Mobile Computing and Networking, Paris, France, 7–11 September 2015; pp. 103–116. [Google Scholar] [CrossRef] [Green Version]
  218. Iyer, N.C.; Pillai, P.; Bhagyashree, K.; Mane, V.; Shet, R.M.; Nissimagoudar, P.; Krishna, G.; Nakul, V. Millimeter-wave AWR1642 RADAR for Obstacle Detection: Autonomous Vehicles. In Innovations in Electronics and Communication Engineering; Springer: Berlin/Heidelberg, Germany, 2020; pp. 87–94. [Google Scholar] [CrossRef]
  219. Guo, L.; Antoniou, M.; Baker, C.J. Memory-augmented cognitive radar for obstacle avoidance using nearest steering vector search. IET Radar. Sonar. Navig. 2020, 15, 51–61. [Google Scholar] [CrossRef]
  220. Feger, R.; Wagner, C.; Schuster, S.; Scheiblhofer, S.; Jager, H.; Stelzer, A. A 77-GHz FMCW MIMO Radar Based on an SiGe Single-Chip Transceiver. IEEE Trans. Microw. Theory Tech. 2009, 57, 1020–1035. [Google Scholar] [CrossRef]
  221. Zhang, Z.; Tian, Z.; Zhou, M. Latern: Dynamic Continuous Hand Gesture Recognition Using FMCW Radar Sensor. IEEE Sens. J. 2018, 18, 3278–3289. [Google Scholar] [CrossRef]
  222. Peng, Z.; Li, C.; Muñoz-Ferreras, J.-M.; Gómez-García, R. An FMCW Radar Sensor for Human Gesture Recognition in the Presence of Multiple Targets. In Proceedings of the 2017 First IEEE MTT-S International Microwave Bio Conference (IMBIOC), Gothenburg, Sweden, 15–17 May 2017; IEEE: Gothenburg, Sweden, 2017; pp. 1–3. [Google Scholar] [CrossRef]
  223. Folster, F.; Rohling, H.; Lubbert, U. An Automotive Radar Network Based on 77 GHz FMCW Sensors. In Proceedings of the IEEE International Radar Conference, Arlington, VA, USA, 9–12 May 2005; IEEE: Arlington, VA, USA, 2005; pp. 871–876. [Google Scholar] [CrossRef]
  224. Jardak, S.; Alouini, M.-S.; Kiuru, T.; Metso, M.; Ahmed, S. Compact mmWave FMCW radar: Implementation and performance analysis. IEEE Aerosp. Electron. Syst. Mag. 2019, 34, 36–44. [Google Scholar] [CrossRef]
  225. Hussain, R.; Zeadally, S. Autonomous Cars: Research Results, Issues, and Future Challenges. IEEE Commun. Surv. Tutor. 2019, 21, 1275–1313. [Google Scholar] [CrossRef]
  226. Jianmin, D.; Kaihua, Z.; Lixiao, S. Road and Obstacle Detection Based on Multi-Layer Laser Radar in Driverless Car. In Proceedings of the 2015 34th Chinese Control Conference (CCC), Hangzhou, China, 28–30 July 2015; IEEE: Hangzhou, China, 2015; pp. 8003–8008. [Google Scholar] [CrossRef] [Green Version]
  227. Kwag, Y.K.; Chung, C.H. UAV Based Collision Avoidance Radar Sensor. In Proceedings of the 2007 IEEE International Geoscience and Remote Sensing Symposium, Barcelona, Spain, 23–27 July 2007; IEEE: Barcelona, Spain, 2007; pp. 639–642. [Google Scholar] [CrossRef]
  228. Hugler, P.; Roos, F.; Schartel, M.; Geiger, M.; Waldschmidt, C. Radar Taking Off: New Capabilities for UAVs. IEEE Microw. Mag. 2018, 19, 43–53. [Google Scholar] [CrossRef] [Green Version]
  229. Dogru, S.; Marques, L. Pursuing Drones With Drones Using Millimeter Wave Radar. IEEE Robot. Autom. Lett. 2020, 5, 4156–4163. [Google Scholar] [CrossRef]
  230. Reutebuch, S.E.; Andersen, H.-E.; McGaughey, R.J. Light detection and ranging (LIDAR): An emerging tool for multiple resource inventory. J. For. 2005, 103, 286–292. [Google Scholar] [CrossRef]
  231. Kikuta, H.; Iwata, K.; Nagata, R. Distance measurement by the wavelength shift of laser diode light. Appl. Opt. 1986, 25, 2976. [Google Scholar] [CrossRef] [PubMed]
  232. Dalgleish, F.R.; Vuorenkoski, A.K.; Ouyang, B. Extended-Range Undersea Laser Imaging: Current Research Status and a Glimpse at Future Technologies. Mar. Technol. Soc. J. 2013, 47, 128–147. [Google Scholar] [CrossRef]
  233. Ye, C.; Borenstein, J. Characterization of a 2D Laser Scanner for Mobile Robot Obstacle Negotiation. In Proceedings of the 2002 IEEE International Conference on Robotics and Automation (Cat. No. 02CH37292), Washington, DC, USA, 11–15 May 2002; IEEE: Washington, DC, USA, 2002; pp. 2512–2518. [Google Scholar] [CrossRef] [Green Version]
  234. Yu, C.; Zhang, D. Obstacle Detection Based on a Four-Layer Laser Radar. In Proceedings of the 2007 IEEE International Conference on Robotics and Biomimetics (ROBIO), Sanya, China, 15–18 December 2007; IEEE: Sanya, China, 2007; pp. 218–221. [Google Scholar] [CrossRef]
  235. Himmelsbach, M.; Mueller, A.; Lüttel, T.; Wünsche, H.-J. LIDAR-Based 3D Object Perception. In Proceedings of the 1st International Workshop on Cognition for Technical Systems, Munich, Germany, 6–8 October 2008. [Google Scholar]
  236. Douillard, B.; Underwood, J.; Kuntz, N.; Vlaskine, V.; Quadros, A.; Morton, P.; Frenkel, A. On the Segmentation of 3D LIDAR Point Clouds. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; IEEE: Shanghai, China, 2011; pp. 2798–2805. [Google Scholar] [CrossRef]
  237. Demantké, J.; Mallet, C.; David, N.; Vallet, B. Dimensionality Based Scale Selection in 3D Lidar Point Clouds. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 38, 97–102. [Google Scholar] [CrossRef] [Green Version]
  238. Li, B.; Zhang, T.; Xia, T. Vehicle detection from 3d lidar using fully convolutional network. arXiv 2016, arXiv:07608. [Google Scholar]
  239. Kim, J.; Song, S.; Kim, S.; Suk, J. Collision Avoidance System for Agricultural Unmanned Helicopter using LIDAR Sensor. Asia-Pacific Int. Symp. Aerosp. Technol. 2014. Available online: https://www.researchgate.net/profile/Seungkeun-Kim/publication/273135419_Collision_Avoidance_System_for_Agricultural_Unmanned_Helicopter_using_LIDAR_Sensor/links/54f91ec90cf210398e976276/Collision-Avoidance-System-for-Agricultural-Unmanned-Helicopter-using-LIDAR-Sensor.pdf (accessed on 1 April 2021).
  240. Peng, Y.; Qu, D.; Zhong, Y.; Xie, S.; Luo, J.; Gu, J. The Obstacle Detection and Obstacle Avoidance Algorithm Based on 2-d Lidar. In Proceedings of the 2015 IEEE International Conference on Information and Automation, Lijiang, China, 8–10 August 2015; IEEE: Lijiang, China, 2015; pp. 1648–1653. [Google Scholar] [CrossRef]
  241. Zheng, L.; Zhang, P.; Tan, J.; Li, F. The Obstacle Detection Method of UAV Based on 2D Lidar. IEEE Access 2019, 7, 163437–163448. [Google Scholar] [CrossRef]
  242. Song, K.-T.; Chiu, Y.-H.; Kang, L.-R.; Song, S.-H.; Yang, C.-A.; Lu, P.-C.; Ou, S.-Q. Navigation Control Design of a Mobile Robot by Integrating Obstacle Avoidance and LiDAR SLAM. In Proceedings of the 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Miyazaki, Japan, 7–10 October 2018; IEEE: Miyazaki, Japan, 2018; pp. 1833–1838. [Google Scholar] [CrossRef]
  243. Baras, N.; Nantzios, G.; Ziouzios, D.; Dasygenis, M. Autonomous Obstacle Avoidance Vehicle Using Lidar and an Embedded System. In Proceedings of the 2019 8th International Conference on Modern Circuits and Systems Technologies (MOCAST), Thessaloniki, Greece, 13–15 May 2019; IEEE: Thessaloniki, Greece, 2019; pp. 1–4. [Google Scholar] [CrossRef]
  244. Miyakawa, A.S. Autonomous Ground Vehicle Low-Profile Obstacle Avoidance Using 2D LIDAR; Naval Postgraduate School: Monterey, CA, USA, 2019. [Google Scholar]
  245. Gallay, M.; Eck, C.; Zgraggen, C.; Kaňuk, J.; Dvorný, E. High resolution airborne laser scanning and hyperspectral imaging with a small UAV platform. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 823. [Google Scholar] [CrossRef] [Green Version]
  246. Omasa, K.; Hosoi, F.; Konishi, A. 3D lidar imaging for detecting and understanding plant responses and canopy structure. J. Exp. Bot. 2007, 58, 881–898. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  247. Lovell, J.L.; Jupp, D.L.B.; Culvenor, D.S.; Coops, N.C. Using airborne and ground-based ranging lidar to measure canopy structure in Australian forests. Can. J. Remote Sens. 2014, 29, 607–622. [Google Scholar] [CrossRef]
  248. Omasa, K.; Hosoi, F.; Uenishi, T.M.; Shimizu, Y.; Akiyama, Y. Three-Dimensional Modeling of an Urban Park and Trees by Combined Airborne and Portable On-Ground Scanning LIDAR Remote Sensing. Environ. Model. Assess 2007, 13, 473–481. [Google Scholar] [CrossRef]
  249. Hopkinson, C.; Lovell, J.; Chasmer, L.; Jupp, D.; Kljun, N.; van Gorsel, E. Integrating terrestrial and airborne lidar to calibrate a 3D canopy model of effective leaf area index. Remote Sens. Environ. 2013, 136, 301–314. [Google Scholar] [CrossRef]
  250. Fraga-Lamas, P.; Ramos, L.; Mondéjar-Guerra, V.; Fernández-Caramés, T.M. A Review on IoT Deep Learning UAV Systems for Autonomous Obstacle Detection and Collision Avoidance. Remote Sens. 2019, 11, 2144. [Google Scholar] [CrossRef] [Green Version]
  251. Sun, Z.; Bebis, G.; Miller, R. On-Road Vehicle Detection Using Optical Sensors: A Review. In Proceedings of the 7th International IEEE Conference on Intelligent Transportation Systems (IEEE Cat. No. 04TH8749), Washington, DC, USA, 3–6 October 2004; IEEE: Washington, DC, USA, 2004; pp. 585–590. [Google Scholar] [CrossRef]
  252. Chavan, Y.; Chavan, P.; Nyayanit, A.; Waydande, V. Obstacle detection and avoidance for automated vehicle: A review. J. Opt. 2021, 50, 46–54. [Google Scholar] [CrossRef]
  253. Islam, M.M.; Sheikh Sadi, M.; Zamli, K.Z.; Ahmed, M.M. Developing Walking Assistants for Visually Impaired People: A Review. IEEE Sens. J. 2019, 19, 2814–2828. [Google Scholar] [CrossRef]
  254. Zhao, X.; Pu, F.; Wang, Z.; Chen, H.; Xu, Z. Detection, Tracking, and Geolocation of Moving Vehicle From UAV Using Monocular Camera. IEEE Access 2019, 7, 101160–101170. [Google Scholar] [CrossRef]
  255. Zaarane, A.; Slimani, I.; Al Okaishi, W.; Atouf, I.; Hamdoun, A. Distance measurement system for autonomous vehicles using stereo camera. Array 2020, 5, 100016. [Google Scholar] [CrossRef]
  256. Griffiths, E.; Assana, S.; Whitehouse, K. Privacy-preserving Image Processing with Binocular Thermal Cameras. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2018, 1, 1–25. [Google Scholar] [CrossRef]
  257. Chrysochoos, A.; Louche, H. An infrared image processing to analyse the calorific effects accompanying strain localisation. Int. J. Eng. Sci. 2000, 38, 1759–1788. [Google Scholar] [CrossRef]
  258. Fuentes-Pacheco, J.; Ruiz-Ascencio, J.; Rendón-Mancha, J.M. Visual simultaneous localization and mapping: A survey. Artif. Intell. Rev. 2015, 43, 55–81. [Google Scholar] [CrossRef]
  259. Se, S.; Lowe, D.; Little, J. Mobile robot localization and mapping with uncertainty using scale-invariant visual landmarks. Int. J. Robot. Res. 2002, 21, 735–758. [Google Scholar] [CrossRef]
  260. Olson, C.F.; Matthies, L.H.; Schoppers, M.; Maimone, M.W. Rover navigation using stereo ego-motion. Robot. Auton. Syst. 2003, 43, 215–229. [Google Scholar] [CrossRef]
  261. Davison, A.J. Real-Time Simultaneous Localisation and Mapping with a Single Camera. In Proceedings of the Ninth IEEE International Conference on Computer Vision, Nice, France, 13–16 October 2003; p. 1403. [Google Scholar] [CrossRef]
  262. Zou, Z.; Shi, Z.; Guo, Y.; Ye, J.J. Object detection in 20 years: A survey. arXiv 2019, arXiv:05055. [Google Scholar]
  263. Kanellakis, C.; Nikolakopoulos, G. Survey on Computer Vision for UAVs: Current Developments and Trends. J. Intell. Robot. Syst. 2017, 87, 141–168. [Google Scholar] [CrossRef] [Green Version]
  264. Carnie, R.; Walker, R.; Corke, P. Image Processing Algorithms for UAV “Sense and Avoid”. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation, ICRA 2006, Orlando, FL, USA, 15–19 May 2006; IEEE: Orlando, FL, USA, 2006; pp. 2848–2853. [Google Scholar] [CrossRef] [Green Version]
  265. Rodriguez, J.; Castiblanco, C.; Mondragon, I.; Colorado, J. Low-Cost Quadrotor Applied for Visual Detection of Landmine-Like Objects. In Proceedings of the 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, USA, 27–30 May 2014; IEEE: Orlando, FL, USA, 2014; pp. 83–88. [Google Scholar] [CrossRef]
  266. Teuliere, C.; Eck, L.; Marchand, E. Chasing a Moving Target from a Flying UAV. In Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, CA, USA, 25–30 September 2011; IEEE: San Francisco, CA, USA, 2011; pp. 4929–4934. [Google Scholar] [CrossRef] [Green Version]
  267. Lin, K.-H.; Chang, C.-H.; Dopfer, A.; Wang, C.-C. Mapping and Localization in 3D Environments Using a 2D Laser Scanner and a Stereo Camera. J. Inf. Sci. Eng. 2012, 28, 131–144. [Google Scholar]
  268. Yankun, Z.; Hong, C.; Weyrich, N. A Single Camera Based Rear Obstacle Detection System. In Proceedings of the 2011 IEEE Intelligent Vehicles Symposium (IV), Baden-Baden, Germany, 5–9 June 2011; IEEE: Baden-Baden, Germany, 2011; pp. 485–490. [Google Scholar] [CrossRef]
  269. Braillon, C.; Pradalier, C.; Crowley, J.L.; Laugier, C. Real-Time Moving Obstacle Detection Using Optical Flow Models. In Proceedings of the 2006 IEEE Intelligent Vehicles Symposium, Meguro-Ku, Japan, 13–15 June 2006; IEEE: Meguro-Ku, Japan, 2006; pp. 466–471. [Google Scholar] [CrossRef] [Green Version]
  270. Naito, T.; Ito, T.; Kaneda, Y. The Obstacle Detection Method Using Optical Flow Estimation at the Edge Image. In Proceedings of the 2007 IEEE Intelligent Vehicles Symposium, Istanbul, Turkey, 13–15 June 2007; IEEE: Istanbul, Turkey, 2007; pp. 817–822. [Google Scholar] [CrossRef]
  271. Gharani, P.; Karimi, H.A. Context-aware obstacle detection for navigation by visually impaired. Image Vis. Comput. 2017, 64, 103–115. [Google Scholar] [CrossRef]
  272. Agrawal, P.; Ratnoo, A.; Ghose, D. Inverse optical flow based guidance for UAV navigation through urban canyons. Aerosp. Sci. Technol. 2017, 68, 163–178. [Google Scholar] [CrossRef]
  273. Bharati, S.P.; Wu, Y.; Sui, Y.; Padgett, C.; Wang, G. Real-Time Obstacle Detection and Tracking for Sense-and-Avoid Mechanism in UAVs. IEEE Trans. Intell. Veh. 2018, 3, 185–197. [Google Scholar] [CrossRef]
  274. Capito, L.; Ozguner, U.; Redmill, K. Optical Flow Based Visual Potential Field for Autonomous Driving. In Proceedings of the 2020 IEEE Intelligent Vehicles Symposium (IV), Las Vegas, NV, USA, 20–23 October 2020; IEEE: Las Vegas, NV, USA, 2020; pp. 885–891. [Google Scholar] [CrossRef]
  275. Seunghun, J.; Junguk, C.; Xuan Dai, P.; Kyoung Mu, L.; Sung-Kee, P.; Munsang, K.; Jae Wook, J. FPGA Design and Implementation of a Real-Time Stereo Vision System. IEEE Trans. Circuits Syst. Video Technol. 2010, 20, 15–26. [Google Scholar] [CrossRef]
  276. Bertozzi, M.; Broggi, A.; Fascioli, A.; Nichele, S. Stereo Vision-Based Vehicle Detection. In Proceedings of the IEEE Intelligent Vehicles Symposium 2000 (Cat. No. 00TH8511), Dearborn, MI, USA, 5 October 2000; IEEE: Dearborn, MI, USA, 2000; pp. 39–44. [Google Scholar] [CrossRef] [Green Version]
  277. Nedevschi, S.; Danescu, R.; Frentiu, D.; Marita, T.; Oniga, F.; Pocol, C.; Schmidt, R.; Graf, T. High Accuracy Stereo Vision System for Far Distance Obstacle Detection. In Proceedings of the IEEE Intelligent Vehicles Symposium, Parma, Italy, 14–17 June 2004; IEEE: Parma, Italy, 2004; pp. 292–297. [Google Scholar] [CrossRef]
  278. Ma, Y.; Li, Q.; Chu, L.; Zhou, Y.; Xu, C. Real-Time Detection and Spatial Localization of Insulators for UAV Inspection Based on Binocular Stereo Vision. Remote Sens. 2021, 13, 230. [Google Scholar] [CrossRef]
  279. Huh, K.; Park, J.; Hwang, J.; Hong, D. A stereo vision-based obstacle detection system in vehicles. Opt. Lasers Eng. 2008, 46, 168–178. [Google Scholar] [CrossRef]
  280. García Carrillo, L.R.; Dzul López, A.E.; Lozano, R.; Pégard, C. Combining Stereo Vision and Inertial Navigation System for a Quad-Rotor UAV. J. Intell. Robot. Syst. 2011, 65, 373–387. [Google Scholar] [CrossRef]
  281. Arnay, R.; Hernandez-Aceituno, J.; Toledo, J.; Acosta, L. Laser and Optical Flow Fusion for a Non-Intrusive Obstacle Detection System on an Intelligent Wheelchair. IEEE Sens. J. 2018, 18, 3799–3805. [Google Scholar] [CrossRef]
  282. Chang, S.; Zhang, Y.; Zhang, F.; Zhao, X.; Huang, S.; Feng, Z.; Wei, Z. Spatial Attention Fusion for Obstacle Detection Using MmWave Radar and Vision Sensor. Sensors 2020, 20, 956. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  283. Long, N.; Wang, K.; Cheng, R.; Hu, W.; Yang, K. Unifying obstacle detection, recognition, and fusion based on millimeter wave radar and RGB-depth sensors for the visually impaired. Rev. Sci. Instrum. 2019, 90, 044102. [Google Scholar] [CrossRef] [PubMed]
  284. Meichen, L.; Jun, C.; Xiang, Z.; Lu, W.; Yongpeng, T. Dynamic obstacle detection based on multi-sensor information fusion. IFAC-PapersOnLine 2018, 51, 861–865. [Google Scholar] [CrossRef]
  285. Zhang, X.; Zhou, M.; Qiu, P.; Huang, Y.; Li, J. Radar and vision fusion for the real-time obstacle detection and identification. Ind. Robot. Int. J. Robot. Res. Appl. 2019. [Google Scholar] [CrossRef]
  286. Zhang, J.; Han, J.; Wang, S.; Liao, Y.; Li, P. Real Time Obstacle Detection Method Based on Lidar and Wireless Sensor. In Proceedings of the 2017 Chinese Automation Congress (CAC), Jinan, China, 20–22 October 2017; IEEE: Jinan, China, 2017; pp. 5951–5955. [Google Scholar] [CrossRef]
  287. John, V.; Mita, S. RVNet: Deep Sensor Fusion of Monocular Camera and Radar for Image-Based Obstacle Detection in Challenging Environments. In Pacific-Rim Symposium on Image and Video Technology; Springer: Berlin/Heidelberg, Germany, 2019; pp. 351–364. [Google Scholar] [CrossRef]
  288. Lumelsky, V.J.; Stepanov, A.A. Path-planning strategies for a point mobile automaton moving amidst unknown obstacles of arbitrary shape. Algorithmica 1987, 2, 403–430. [Google Scholar] [CrossRef] [Green Version]
  289. Kamon, I.; Rimon, E.; Rivlin, E. TangentBug: A Range-Sensor-Based Navigation Algorithm. Int. J. Robot. Res. 1998, 17, 934–953. [Google Scholar] [CrossRef]
  290. Taylor, K.; LaValle, S.M. I-Bug: An Intensity-Based Bug Algorithm. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; IEEE: Kobe, Japan, 2009; pp. 3981–3986. [Google Scholar] [CrossRef] [Green Version]
  291. Zohaib, M.; Pasha, S.M.; Javaid, N.; Iqbal, J. IBA: Intelligent Bug Algorithm—A Novel Strategy to Navigate Mobile Robots Autonomously. In Communication Technologies, Information Security and Sustainable Development; Springer: Cham, Switzerland, 2014; pp. 291–299. [Google Scholar] [CrossRef]
  292. Sharma, N.; Pinto, J.; Sujit, P. BugFlood: A bug inspired algorithm for efficient path planning in an obstacle rich environment. In Proceedings of the AIAA Infotech@ Aerospace, San Diego, CA, USA, 4–8 January 2016; p. 0254. [Google Scholar] [CrossRef]
  293. Ng, J.; Bräunl, T. Performance Comparison of Bug Navigation Algorithms. J. Intell. Robot. Syst. 2007, 50, 73–84. [Google Scholar] [CrossRef]
  294. Yufka, A.; Parlaktuna, O. Performance Comparison of Bug Algorithms for Mobile Robots. In Proceedings of the 5th International Advanced Technologies Symposium, Karabuk, Turkey, 13–15 May 2009; pp. 13–15. [Google Scholar]
  295. Khatib, O. Real-time obstacle avoidance for manipulators and mobile robots. In Autonomous Robot Vehicles; Springer: Berlin/Heidelberg, Germany, 1986; pp. 396–404. [Google Scholar] [CrossRef]
  296. Cetin, O.; Zagli, I.; Yilmaz, G. Establishing Obstacle and Collision Free Communication Relay for UAVs with Artificial Potential Fields. J. Intell. Robot. Syst. 2012, 69, 361–372. [Google Scholar] [CrossRef]
  297. Chen, Y.-B.; Luo, G.-C.; Mei, Y.-S.; Yu, J.-Q.; Su, X.-L. UAV path planning using artificial potential field method updated by optimal control theory. Int. J. Syst. Sci. 2014, 47, 1407–1420. [Google Scholar] [CrossRef]
  298. Sun, J.; Tang, J.; Lao, S. Collision Avoidance for Cooperative UAVs With Optimized Artificial Potential Field Algorithm. IEEE Access 2017, 5, 18382–18390. [Google Scholar] [CrossRef]
  299. Fan, X.; Guo, Y.; Liu, H.; Wei, B.; Lyu, W. Improved Artificial Potential Field Method Applied for AUV Path Planning. Math. Probl. Eng. 2020, 2020, 6523158. [Google Scholar] [CrossRef]
  300. Goss, J.; Rajvanshi, R.; Subbarao, K. Aircraft Conflict Detection and Resolution Using Mixed Geometric and Collision Cone Approaches. In Proceedings of the AIAA Guidance, Navigation, and Control Conference and Exhibit, Rhode Island, USA, 16–19 August 2004; p. 4879. [Google Scholar] [CrossRef]
  301. Watanabe, Y.; Calise, A.; Johnson, E. Vision-based obstacle avoidance for UAVs. In Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Hilton Head, SC, USA, 20–23 August 2007; p. 6829. [Google Scholar]
  302. Chakravarthy, A.; Ghose, D. Generalization of the collision cone approach for motion safety in 3-D environments. Auton. Robot. 2011, 32, 243–266. [Google Scholar] [CrossRef]
  303. Sunkara, V.; Chakravarthy, A.; Ghose, D. Collision Avoidance of Arbitrarily Shaped Deforming Objects Using Collision Cones. IEEE Robot. Autom. Lett. 2019, 4, 2156–2163. [Google Scholar] [CrossRef]
  304. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353. [Google Scholar]
  305. Lian, S.H. Fuzzy Logic Control of an Obstacle Avoidance Robot. In Proceedings of the IEEE 5th International Fuzzy Systems Conference, New Orleans, LA, USA, 8–11 September 1996; IEEE: New Orleans, LA, USA, 1996; pp. 26–30. [Google Scholar] [CrossRef]
  306. Driankov, D.; Saffiotti, A. Fuzzy Logic Techniques for Autonomous Vehicle Navigation; Physica: Amsterdam, The Netherlands, 2013; Volume 61. [Google Scholar]
  307. Reignier, P. Fuzzy logic techniques for mobile robot obstacle avoidance. Robot. Auton. Syst. 1994, 12, 143–153. [Google Scholar] [CrossRef]
  308. Dong, T.; Liao, X.; Zhang, R.; Sun, Z.; Song, Y. Path Tracking and Obstacles Avoidance of UAVs-Fuzzy Logic Approach. In Proceedings of the 14th IEEE International Conference on Fuzzy Systems (FUZZ '05), Reno, NV, USA, 11–14 July 2005; IEEE: Reno, NV, USA, 2005; pp. 43–48. [Google Scholar] [CrossRef]
  309. Jin, T.-S. Obstacle Avoidance of Mobile Robot Based on Behavior Hierarchy by Fuzzy Logic. Int. J. Fuzzy Log. Intell. Syst. 2012, 12, 245–249. [Google Scholar] [CrossRef] [Green Version]
  310. Li, X.; Choi, B.-J. Design of obstacle avoidance system for mobile robot using fuzzy logic systems. Int. J. Smart Home 2013, 7, 321–328. [Google Scholar]
  311. Pandey, A.; Sonkar, R.K.; Pandey, K.K.; Parhi, D. Path Planning Navigation of Mobile Robot with Obstacles Avoidance Using Fuzzy Logic Controller. In Proceedings of the 2014 IEEE 8th International Conference on Intelligent Systems and Control (ISCO), Coimbatore, India, 10–11 January 2014; IEEE: Coimbatore, India, 2014; pp. 39–41. [Google Scholar] [CrossRef]
  312. Ulrich, I.; Borenstein, J. VFH+: Reliable Obstacle Avoidance for Fast Mobile Robots. In Proceedings of the 1998 IEEE International Conference on Robotics and Automation (Cat. No. 98CH36146), Leuven, Belgium, 20 May 1998; IEEE: Leuven, Belgium, 1998; pp. 1572–1577. [Google Scholar] [CrossRef]
  313. Ulrich, I.; Borenstein, J. VFH*: Local Obstacle Avoidance with Look-Ahead Verification. In Proceedings of the 2000 ICRA Millennium Conference, IEEE International Conference on Robotics and Automation, Symposia Proceedings (Cat. No. 00CH37065), San Francisco, CA, USA, 24–28 April 2000; pp. 2505–2511. [Google Scholar] [CrossRef]
  314. Sary, I.P.; Nugraha, Y.P.; Megayanti, M.; Hidayat, E.; Trilaksono, B.R. Design of Obstacle Avoidance System on Hexacopter Using Vector Field Histogram-Plus. In Proceedings of the 2018 IEEE 8th International Conference on System Engineering and Technology (ICSET), Bandung, Indonesia, 15–16 October 2018; IEEE: Bandung, Indonesia, 2018; pp. 18–23. [Google Scholar] [CrossRef]
  315. Bolbhat, S.; Bhosale, A.; Sakthivel, G.; Saravanakumar, D.; Sivakumar, R.; Lakshmipathi, J. Intelligent Obstacle Avoiding AGV Using Vector Field Histogram and Supervisory Control; Journal of Physics: Conference Series; IOP Publishing: Chennai, India, 2020; p. 012030. [Google Scholar] [CrossRef]
  316. Gupta, M.; Jin, L.; Homma, N. Static and Dynamic Neural Networks: From Fundamentals to Advanced Theory; John Wiley & Sons: Hoboken, NJ, USA, 2004. [Google Scholar]
  317. Glasius, R.; Komoda, A.; Gielen, S.C.A.M. Neural Network Dynamics for Path Planning and Obstacle Avoidance. Neural. Netw. 1995, 8, 125–133. [Google Scholar] [CrossRef]
  318. Huang, B.-Q.; Cao, G.-Y.; Guo, M. Reinforcement Learning Neural Network to the Problem of Autonomous Mobile Robot Obstacle Avoidance. In Proceedings of the 2005 International Conference on Machine Learning and Cybernetics, Guangzhou, China, 18–21 August 2005; IEEE: Guangzhou, China, 2005; pp. 85–89. [Google Scholar] [CrossRef]
  319. Yadav, V.; Wang, X.; Balakrishnan, S. Neural Network Approach for Obstacle Avoidance in 3-D Environments for UAVs. In Proceedings of the 2006 American Control Conference, Minneapolis, MN, USA, 14–16 June 2006; IEEE: Minneapolis, MN, USA, 2006; p. 6. [Google Scholar] [CrossRef]
  320. Chi, K.-H.; Lee, M.-F.R. Obstacle Avoidance in Mobile Robot Using Neural Network. In Proceedings of the 2011 International Conference on Consumer Electronics, Communications and Networks (CECNet), Xianning, China, 16–18 April 2011; IEEE: Xianning, China, 2011; pp. 5082–5085. [Google Scholar] [CrossRef]
  321. Kim, C.-J.; Chwa, D. Obstacle Avoidance Method for Wheeled Mobile Robots Using Interval Type-2 Fuzzy Neural Network. IEEE Trans. Fuzzy Syst. 2015, 23, 677–687. [Google Scholar] [CrossRef]
  322. Back, S.; Cho, G.; Oh, J.; Tran, X.-T.; Oh, H. Autonomous UAV Trail Navigation with Obstacle Avoidance Using Deep Neural Networks. J. Intell. Robot. Syst. 2020, 100, 1195–1211. [Google Scholar] [CrossRef]
  323. Dai, X.; Mao, Y.; Huang, T.; Qin, N.; Huang, D.; Li, Y. Automatic obstacle avoidance of quadrotor UAV via CNN-based learning. Neurocomputing 2020, 402, 346–358. [Google Scholar] [CrossRef]
  324. He, R.; Wei, R.; Zhang, Q. UAV autonomous collision avoidance approach. Automatika 2017, 58, 195–204. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Agricultural sprayer UAV (Quadrotor).
Figure 2. (a) Path plan without considering geometry [67]; (b) path plan considering geometry [63].
Figure 3. (a) Selected field from satellite data; (b) the generated plan for the field [70].
Figure 4. Liquid shift and center of gravity (CG1, CG2) position [75].
Figure 5. Slosh impact comparison: (a) freely moving liquid vs. (b) a tank fitted with vertical and horizontal grills, at 30% liquid fill [79].
Figure 6. Farmland obstacles, satellite view—(a) house [93]; (b) electric tower [94]; (c) big bush [95]; (d) trees [96].
Figure 7. Obstacle avoidance scenario group.
Figure 8. (a) Reactive control architecture; (b) deliberative planning architecture; (c) hybrid control architecture [36].
Figure 9. Trajectories of the Bug1 (a) and Bug2 algorithms (b) [288].
Figure 11. Collision cone and aiming point [301].
Figure 12. Fuzzy logic control: (a) multi-sensor setup; (b) structure of the fuzzy controller; (c) fuzzy control rules for steering away from front obstacles [305].
Figure 13. (a) 2D histogram grid map; (b) converted 1D polar histogram [41].
Figure 14. Neural network training data. (a–h) Paths generated for single and multiple obstacles at different placements during the training phase [324].
Figure 15. Collision avoidance performance using the training data, compared with Particle Swarm Optimization (PSO), the dynamic A* algorithm, and the Artificial Potential Field (APF) method [324].
Table 1. Possible obstacles on farmland.
| Obstacle | In Flying Zone | Removable/Replaceable | Solution | Comment |
|---|---|---|---|---|
| Local electric wire for pump or other farm-related use | Yes | Yes | Rewiring | - |
| Gridline electric wire | No | No | - | Not a concern |
| Plant branch | Yes | Yes | Trimming | - |
| Protruding plant | Yes | Yes | Replace or trim | - |
| Test pole | - | - | - | Used for research; mostly as high as the plants |
| Nylon net | - | - | - | Covers the plants to protect them from birds |
| Metal net | - | - | - | Used for the field boundary |
| Group of trees | Yes | Yes | If not cut down or removed, avoid by UAV | Can be detected from satellite imagery and filtered out in the mission planner [70] (see the sketch below the table) |
| Single tree | Yes | Yes | If not cut down or removed, avoid by UAV | - |
| Telegraph pole, wooden pole, or small electrical pole | Yes | No | Avoid by UAV | - |
| Windmill | Yes | No | Avoid by UAV | - |
| Pergola | Yes | No | Avoid by UAV | - |
| Small house/pump house | Yes | No | Avoid by UAV | - |
| Big house | Yes | No | Avoid by UAV | Can be detected from satellite imagery and filtered out in the mission planner [70] |
| High-pressure tower | Yes | No | Avoid by UAV | - |
| Meteorological tower | Yes | No | Avoid by UAV | - |
| Birds | Yes | No | - | The most unpredictable and dangerous moving obstacle for all kinds of aircraft |
| Other drones | Yes | Yes | - | - |
| Human | Yes | Yes | - | - |
| Other moving non-living things | Yes | Yes | - | - |
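As noted in Table 1, several large static obstacles (groups of trees, big houses) can reportedly be identified from satellite imagery and excluded in the mission planner [70]. The minimal sketch below illustrates one way such pre-filtering could work, dropping planned spraying waypoints that fall inside known obstacle footprints. The coordinate convention, data structures, and function names are illustrative assumptions, not the implementation of the cited planner.

```python
# Illustrative sketch: pre-filtering waypoints that fall inside known obstacle
# footprints (e.g., houses or tree groups digitised from satellite imagery),
# in the spirit of the mission-planner filtering cited in [70]. All names and
# data structures are hypothetical and for illustration only.

from typing import List, Tuple

Point = Tuple[float, float]  # field coordinates in metres (assumed)

def point_in_polygon(p: Point, poly: List[Point]) -> bool:
    """Ray-casting point-in-polygon test."""
    x, y = p
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where the polygon edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def filter_waypoints(waypoints: List[Point],
                     obstacles: List[List[Point]]) -> List[Point]:
    """Drop waypoints that lie inside any known obstacle footprint."""
    return [wp for wp in waypoints
            if not any(point_in_polygon(wp, poly) for poly in obstacles)]

if __name__ == "__main__":
    house = [(20.0, 20.0), (30.0, 20.0), (30.0, 30.0), (20.0, 30.0)]
    plan = [(float(x), 25.0) for x in range(0, 50, 5)]  # one spraying pass
    print(filter_waypoints(plan, [house]))              # waypoints over the house removed
```

In practice such offline filtering only handles obstacles that are visible in the imagery; the remaining entries of Table 1 still require onboard detection and avoidance.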
Table 2. Obstacle detection sensors for UAVs.
| Sensor Type | Ultrasonic | Laser/Active Infrared | Structured Light | ToF | Millimeter-Wave Radar | Monocular Vision | Binocular Vision |
|---|---|---|---|---|---|---|---|
| Range (m) | <10 | <50 | <10 | <10 | <250 | <10 | <100 |
| Cost | Low | Medium | Very high | Medium | High | Medium | High |
| Precision | Short range only | Very high | Very high | Medium | Low | Low | Medium |
| Resolution | Low | Very high | Very high | Low | Low | Medium | Medium |
| Reliability | Low | High | High | High | High | Medium | Medium |
| Liquid droplet influence | Yes | Yes | Yes | Yes | No | Yes | Yes |
| Sound influence | High-pitched sound interference | No | No | No | No | No | No |
| Light influence | No | Direct sunlight may affect infrared | Very high | Very high | No | No | No |
| Temperature influence | Yes | No | No | No | No | No | No |
| Needs ambient light | No | No | No | No | No | Yes | Yes |
| Single-point measurement reliability | No | - | Yes | - | No | Suitable for static obstacles | - |
| Processing | Fast | Fast | Medium | Fast | Fast | Large data volume; needs a faster processor | Large data volume; needs a faster processor |
| References | [130,131,144,145,146] | [132,147,148,149] | [133,150] | [135,151,152] | [136,153,154] | [121,122,155,156] | [157,158,159,160] |
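Table 2 suggests that sensor choice depends strongly on operating conditions: only millimeter-wave radar is listed as unaffected by spray droplets, and passive vision needs ambient light. The minimal sketch below shows how such qualitative characteristics might drive a run-time sensor-gating check on a sprayer UAV. The numeric ranges are transcribed from the table (a simplified subset of columns); the field names, thresholds, and selection logic are illustrative assumptions.

```python
# Minimal sketch of using the qualitative comparison in Table 2 for run-time
# sensor gating. The dictionary is a hand-transcribed, simplified subset of the
# table; field names and selection rules are assumptions for illustration only.

SENSORS = {
    "ultrasonic":    {"range_m": 10,  "droplet_sensitive": True,  "needs_light": False},
    "laser_ir":      {"range_m": 50,  "droplet_sensitive": True,  "needs_light": False},
    "mmwave_radar":  {"range_m": 250, "droplet_sensitive": False, "needs_light": False},
    "monocular_cam": {"range_m": 10,  "droplet_sensitive": True,  "needs_light": True},
    "binocular_cam": {"range_m": 100, "droplet_sensitive": True,  "needs_light": True},
}

def usable_sensors(required_range_m: float, spraying: bool, daylight: bool):
    """Return sensors whose Table 2 characteristics fit the current conditions."""
    picks = []
    for name, s in SENSORS.items():
        if s["range_m"] < required_range_m:
            continue                      # cannot see far enough ahead
        if spraying and s["droplet_sensitive"]:
            continue                      # degraded by the spray cloud
        if s["needs_light"] and not daylight:
            continue                      # passive vision needs illumination
        picks.append(name)
    return picks

# Example: spraying at night with ~30 m of look-ahead -> only mmWave radar survives.
print(usable_sensors(required_range_m=30, spraying=True, daylight=False))
```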
Table 3. Algorithm comparison for sprayer UAVs.
| Avoidance Technique | Features | Sprayer UAV's Insights |
|---|---|---|
| Bug2 algorithm | Easy and convenient; follows the obstacle's exact outline; changes heading while bypassing the obstacle; does not detect the obstacle's edges, so it may sometimes take a longer path | May be suitable for rotorcraft UAVs: following the exact border of the obstacle helps preserve spray coverage, but changing direction at every edge may increase avoidance time. Modifying the algorithm to keep a fixed heading may reduce this duration. |
| Artificial Potential Field | Simple to implement; easily finds the shorter side of the obstacle; the local-minima problem can cause failure | Can be used to find the shorter avoidance direction around an obstacle's edge. However, because sprayer UAVs fly waypoint-based missions, the avoidance routine may lock onto the next waypoint as the target and shift the coverage path line (a minimal sketch follows this table). |
| Collision Cone | Produces a simpler avoidance path; uses vehicle dynamics to build the path; minimal guidance-control effort; does not consider the obstacle's shape | Because a sprayer UAV carries a liquid load that decreases continuously, this method may shorten avoidance time by exploiting vehicle-state information; for larger obstacles, spray coverage must still be considered. |
| Fuzzy Logic | Robust and suitable for dynamic environments; requires a multi-sensor system; polyhedral obstacle shapes may increase the computational load | The multi-sensor setup mostly uses low-cost sensors, which may lower the UAV's retail price and improve affordability for smallholder farmers, but it still suffers from the heading-change issue. |
| Vector Field Histogram | Better at identifying an obstacle's shape; requires a longer time to build a 2D map of the obstacle; high computational requirement; does not consider the vehicle's dynamics | High-end UAVs use LiDAR for obstacle detection, which suits this method; extending it to include vehicle details may improve avoidance performance. |
| Neural Network | Good for known obstacle environments; better real-time avoidance performance; requires a large amount of training data beforehand | On farmland, most obstacles are known and categorized, so a well-curated training dataset may yield near-ideal avoidance performance for particular models. |
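As a concrete illustration of one entry in Table 3, the sketch below implements a single 2D artificial-potential-field control step: an attractive pull toward the next spraying waypoint plus repulsive pushes from nearby detected obstacles. The gains, influence radius, and speed cap are arbitrary illustrative values, not tuned sprayer-UAV parameters, and the sketch ignores vehicle dynamics and the local-minima issue noted in the table.

```python
# A minimal 2D artificial-potential-field step, illustrating the APF method
# listed in Table 3. All parameters are illustrative assumptions.

import math

def apf_velocity(pos, goal, obstacles,
                 k_att=1.0, k_rep=25.0, influence_m=8.0, v_max=3.0):
    """Return a (vx, vy) command for one control step."""
    # Attractive component: proportional to the vector toward the waypoint.
    vx = k_att * (goal[0] - pos[0])
    vy = k_att * (goal[1] - pos[1])

    # Repulsive component from each obstacle closer than the influence radius.
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 1e-6 < d < influence_m:
            gain = k_rep * (1.0 / d - 1.0 / influence_m) / d**2
            vx += gain * dx / d
            vy += gain * dy / d

    # Cap the commanded speed so the loaded airframe is not over-driven.
    speed = math.hypot(vx, vy)
    if speed > v_max:
        vx, vy = vx * v_max / speed, vy * v_max / speed
    return vx, vy

# Example: UAV between a waypoint and a pole-like obstacle slightly off the path.
print(apf_velocity(pos=(0.0, 0.0), goal=(20.0, 0.0), obstacles=[(6.0, 0.5)]))
```

In a waypoint-based spraying mission, such a step would typically be applied only while an obstacle lies inside the influence radius, so that the coverage path line is disturbed as little as possible.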
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
