
Deep Learning-based Monocular Obstacle Avoidance for Unmanned Aerial Vehicle Navigation in Tree Plantations

Faster Region-based Convolutional Neural Network Approach


Abstract

In recent years, Unmanned Aerial Vehicles (UAVs) have been widely used in precision agriculture, including tree plantations. Because of their limited onboard intelligence, these UAVs can only operate at high altitudes, which forces the use of expensive and heavy sensors to obtain important plant health information. To fly at low altitudes, a UAV must be capable of obstacle avoidance to prevent crashes. However, most current obstacle avoidance systems based on active sensors are not applicable to small aerial vehicles because of cost, weight, and power consumption constraints. To this end, this paper presents a novel approach to the autonomous navigation of a small UAV in tree plantations using only a single camera. Since monocular vision does not provide depth information, a machine learning model, the Faster Region-based Convolutional Neural Network (Faster R-CNN), was trained for tree trunk detection, and a control strategy was implemented to avoid collisions with trees. The approach uses the image heights of detected trunks to indicate their distances from the UAV and the image widths between trunks to find the widest obstacle-free space. The control strategy lets the UAV fly forward until an approaching obstacle is detected and then turn towards the safest area before continuing its flight. The feasibility and performance of the proposed algorithms are demonstrated through 11 flight tests in real tree plantation environments at two different locations, one of which was previously unvisited. All flights were successful, indicating that the proposed method is accurate and robust for autonomous navigation in tree plantations.
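To make the avoidance heuristic described in the abstract concrete, the minimal sketch below illustrates how bounding-box height can serve as a proximity cue and how the widest horizontal gap between detected trunks can be chosen as the new heading. This is not the authors' implementation: the function names (near_trunks, widest_gap_centre), the frame width, and the height threshold are illustrative assumptions, and the detector is assumed to return axis-aligned boxes in pixel coordinates.

```python
# Illustrative sketch of the monocular avoidance heuristic (assumed values,
# not the authors' code): tall boxes are treated as near trunks, and the UAV
# steers towards the centre of the widest obstacle-free horizontal gap.

def near_trunks(trunk_boxes, height_threshold=300):
    """Boxes whose apparent height suggests the trunk is close to the UAV.

    trunk_boxes: iterable of (x_min, y_min, x_max, y_max) in pixels.
    """
    return sorted((b for b in trunk_boxes if b[3] - b[1] > height_threshold),
                  key=lambda b: b[0])

def widest_gap_centre(boxes, image_width=640):
    """Centre (in pixels) of the widest horizontal gap left free by the boxes,
    or None if the boxes cover the whole frame width."""
    gaps, left = [], 0
    for x_min, _, x_max, _ in boxes:
        if x_min > left:
            gaps.append((left, x_min))   # free interval before this trunk
        left = max(left, x_max)
    if left < image_width:
        gaps.append((left, image_width)) # free interval after the last trunk
    if not gaps:
        return None                      # no safe gap: stop and yaw in place
    lo, hi = max(gaps, key=lambda g: g[1] - g[0])
    return (lo + hi) / 2.0

# Example: two detected trunks, only the right one tall enough to be "near".
detections = [(50, 100, 120, 350), (400, 80, 500, 460)]
close = near_trunks(detections)
if close:
    heading = widest_gap_centre(close)   # 200.0: steer towards the open left side
```

In practice the height threshold and the mapping from gap centre to a yaw command would be tuned to the camera's field of view and the expected trunk spacing.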



Acknowledgements

The corresponding author would like to thank Universiti Sains Malaysia (USM) for providing the Short Term Research Grant Scheme (304/PAERO/6315113).

Author information

Corresponding author

Correspondence to H. W. Ho.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Lee, H.Y., Ho, H.W. & Zhou, Y. Deep Learning-based Monocular Obstacle Avoidance for Unmanned Aerial Vehicle Navigation in Tree Plantations. J Intell Robot Syst 101, 5 (2021). https://doi.org/10.1007/s10846-020-01284-z


Keywords

Navigation