Open access peer-reviewed chapter

Review of Agricultural Unmanned Aerial Vehicles (UAV) Obstacle Avoidance System

Written By

Uche Emmanuel

Submitted: 24 January 2022 Reviewed: 03 February 2022 Published: 21 December 2022

DOI: 10.5772/intechopen.103037

From the Edited Volume

Aeronautics - New Advances

Edited by Zain Anwar Ali and Dragan Cvetković

Abstract

Unmanned aerial vehicles (UAVs) are being used for commercial, scientific, agricultural and infrastructural enhancement to alleviate maladies. The objective of this chapter is to review existing capabilities and ongoing studies to overcome difficulties associated with the deployment of agricultural unmanned aerial vehicles in obstacle-rich farms for pesticide and fertilizer application. A review of the literature shows that the potential for real-time and near real-time obstacle avoidance exists, but technology for quality imagery and rapid processing leading to real-time response still needs to be developed. Infrared, time-of-flight and millimeter-wavelength radar sensors appear promising for detecting farm and flight environment obstacles. The autonomous mental development algorithm and the simultaneous localization and mapping technology are, however, ahead of others in achieving autonomous identification of obstacles and real-time obstacle avoidance. They are, therefore, found fit for further study and development for deployment on agricultural unmanned aerial vehicles in obstacle-rich farms.

Keywords

  • unmanned aerial vehicle
  • obstacle avoidance
  • sensors
  • spraying
  • obstacle-rich farm

1. Introduction

Topical in Nigeria today is the intractable problem of sustainable economic growth, whose precursors are people and capital formation. The proximate cause of economic growth is an increase in knowledge and its application in increasing the amount of capital or other resources per head. It is axiomatic that the motivation for agricultural mechanization is to reduce drudgery, increase productivity and return on investment, and ultimately enhance capital formation for economic growth. But in a labour-surplus country like Nigeria, agricultural mechanization is considered toxic to employment, since the sector is responsible for the livelihood of over 70% of the population, who are mostly engaged in subsistence agriculture. Today agriculture has become a poverty incubator of the country, wherein over 40% of the population are within the poverty bracket, earning less than one dollar a day [1]. This is not because farming is not a profitable vocation, but because the native approach to the business of farming falls below global best practice, creating a poverty trap for 70% of the population, with the concomitant avalanche of societal problems such as high crime rates, insurgency, hyper-inequality in income distribution and an unstable national structure. As Lewis [2] noted, the traditional agricultural sector is characterized by zero marginal labour productivity.

Farmers in more developed climes use sophisticated technologies such as robots, temperature and moisture sensors, aerial imaging, drones and GPS technology in the area of precision agriculture to ensure high productivity, more profit, efficiency, and safe and environmentally friendly farming [3]. As a result, only 12–15% of the gainfully occupied population suffices to feed the people in developed countries [2]. In contrast, at its low level of productivity, 60–70% of the gainfully occupied population is needed in agriculture for the same purpose in Nigeria. For instance, a significant share of Ukraine's GDP comes from an agricultural sector that employs just around 20% of its population [4]. It, therefore, becomes imperative that all hands, especially those entrusted with the responsibility for defending and building the nation, must be on deck to salvage the nation.

A sure step toward this is to unleash the power of modern agricultural technology on this critical sector, freeing this trapped population into the various value chains of agriculture expected to be induced by the higher production and productivity of mechanized farming. More important is that the released population will create a huge market for farm produce, as they will no longer depend on below-optimal subsistence farming to feed themselves and cater for their daily needs. Today this tepid population adds nothing to the off-take of agricultural products and services, as they can neither afford to buy them nor have felt the need for them, choosing instead to live on their meager production or go into crime.

This debilitating state of affairs demands massive attacks on both fronts: tractorization on the ground and the agricultural UAV in the air. The development of advanced electronics, global positioning systems (GPS) and remote sensing has enabled advancements in the practice of precision agriculture, whereby agronomic practices are based on variations in soil, nutrition and crop stress [5]. Fortunately, Nigeria is endowed with enormous land resources relative to its population. Mechanized farming will create not only jobs and markets up front but also provide adequate raw material for the various high-capacity processing plants dotting the country, which currently operate far below installed capacity due to lack of raw materials and a limited market. Typical examples abound, such as the tomato, sugar cane, rubber, oil palm, brewery and rice mill plants, which cannot adequately source the local content of their raw material needs.

The proper implementation of artificial intelligence (AI) in agriculture will help the cultivation process and create ambiance for the agricultural value chain market. Such development in Nigeria’s agriculture sector will boost rural development and rural transformation and eventually result in structural transformation of the national economy.

2. Unmanned aerial vehicles (UAVs)

UAVs come in different forms and serve different purposes, including military and civil applications such as firefighting, reconnaissance for natural disasters, border security, traffic surveillance etc. UAVs are deployed in agriculture in many areas, such as spraying and fertilizer application, seed planting and weed recognition, diagnosis of insect pests and artificial pollination, irrigation assessment, mapping and crop forecasting [6]. The pedestrian classification of UAVs is by use: photography, aerial mapping, surveillance, cinematography, agriculture etc. UAVs are better categorized by type: multi-rotor, fixed-wing, single-rotor and fixed-wing multi-rotor hybrid, with varying capabilities in altitude, control range, flight endurance and airspeed. UAVs are further classified by their drive to include electric, where an electric battery is used; solar, where solar cells are the source of energy; and internal combustion, where they are driven by gasoline- or methanol-fueled combustion engines [7]. For agricultural purposes, Vroegindeweij et al. [8] categorized UAVs into three types: fixed-wing, vertical take-off and landing (VTOL) and bird/insect. According to Banjo and Ajayi [9], hybridization resulted in higher flying altitude, wider control range, increased speed and longer flight time; the first two factors are at variance with the Ag UAV demands of low altitude and low speed, but the latter agree with the Ag UAV need for longer flight time and wider control range. Hence, for low-altitude remote sensing (LARS), most Ag UAVs are of low cost, low speed, light weight, low payload weight capability and short duration. UAVs for pesticide spraying and fertilizer application, however, need a higher payload weight and must support longer flight endurance [5].

2.1 Agricultural unmanned aerial vehicles (Ag UAV)

The use of an aerial system is necessary not only to ensure precision in agriculture, for optimum utilization of inputs, efficacy and mitigation of the environmental impact of excess application of inorganic agrichemicals, but it is also at times the only mechanized means to overcome obstacles limiting ground operations, such as terrain, soil compaction, swamp etc. Moreover, with an increasingly knowledge-based population inclined more to mental white-collar jobs than to physical blue-collar jobs, sourcing farmhands for even routine farm operations such as pest and disease control (PDC) is becoming arduous, necessitating deployment of agricultural technology to achieve the desired outcome. The innovative UAV platform for farming may lure the youth to rural areas, having afforded a comfortable working environment and reduced farm drudgery, thereby generating employment opportunities in the rural sector, which may address the social balance. Piloted aircraft are used to carry out spraying and aerial imaging on large fields in a short time, but these are not readily available in all areas, and so UAVs are used instead, especially on smaller fields. UAVs are considered to have high efficiency, low labour intensity and low comprehensive cost [10].

The deployment of UAV platforms for agriculture started in 1983, when the first remote-controlled aerial spraying system (RCASS) was built, followed in 1990 by an R50 helicopter with a payload limit of 20 kg and a laser system for height determination. Currently, low-volume UAV helicopters with fully autonomous unmanned vertical take-off and landing (VTOL) and a spraying capacity of up to 7.7 kg, integrated with the flight control system of the UAV, show high potential for vector control in areas not easily accessible by ground services. Some agricultural spraying unmanned helicopters now have the plant protection parameters shown in Table 1 [7, 11].

Items                           Index
Length of spraying rod          140 m
Spraying height (above crop)    1–3 m
Nozzle number                   6 pcs
Spraying flow rate              3.0–4.4 liter/min
Agrochemical tank volume        18.0 liter
Spraying time per flight        4–9 min
Spraying width                  6–8 m
Covering area of one flight     1.0–1.2 hectare

Table 1.

Plant protection parameter (QF170-18L AgriSpraying Helicopter).

Agricultural unmanned aerial vehicles (Ag UAVs) are used to optimize agricultural operations, increase crop production, monitor crop growth, sense and produce digital imagery and give the farmer a clear picture of the status of the farm, especially on large farms. Ag UAVs have, therefore, become a prerequisite for precision agriculture toward the rapid industrialization of agriculture.

The limited-scale adoption of aerial services relative to other farming service tools is not unconnected with the high cost of the service, short flying time, unreliability of the equipment, uncertainty about the quality of operations, the risk of liquid spray drifting to unintended targets and the fact that the flying of UAVs falls under aviation laws [7]. Moreover, most Ag UAV operations are done through manual remote control, rendering the outcome susceptible to the skill level of the operator [5]. It is therefore imperative, as a solution, to realize real-time autonomous OA technology to assuage the fears of many farmers about the safety and quality of UAV services.

The deposition of pesticides on plants with the use of UAVs is the combined effect of the jet of liquid being sprayed and the stream of air generated by the rotors [7]. Hence, the efficacy of aerial spraying depends on the speed, droplet drift, flight altitude, weather, type of pesticide, temperature and terrain, the goal being to achieve blanket spraying or spot spraying of targets. The efficacy of the spraying is therefore subject to the behavior of the UAV airframe. The rush of air from the rotors changes with the varying operating load of the UAV as the spraying mix is discharged from the tank, inducing a difference in the concentration of droplets in the air stream between the start, middle and end of the spraying operation. Hence, the quality of UAV spraying may not meet that of manned aerial and ground systems [7].

2.2 Agricultural unmanned aerial vehicle (Ag UAV) obstacle avoidance technology

The objective of this chapter is to examine the literature on existing capabilities and ongoing studies to overcome difficulties associated with the deployment of agricultural UAVs on obstacle-rich farms for pesticides and fertilizer application.

Small size, fragmentation, multiple farmlands, undulating terrain, fallow land, animals, humans and meandering boundaries are all characteristics of a typical Nigerian farm holding. Forested areas, tall trees, electric poles and wires, farm structures, birds, reflecting objects, wireless networks and hot and stormy weather abound in the flying environment. Furthermore, when operating on farmland, Ag UAVs typically fly 1–1.5 m above the ground, posing ground operation challenges such as small trees in the middle of the farm, stacking poles, ropes, molehills, undulating terrain, out-growths and so on. Flying dust and liquid smudge the spraying farm environment, making it impossible to use visual obstacle avoidance systems. As a result, an autonomous system is required to manage these complex and constantly changing aerial farm environments and spraying variables.

Agricultural unmanned aerial vehicle obstacle avoidance (Ag UAV OA) technology, according to Wang et al. [12], is the core intelligent technology that allows an agricultural drone to autonomously identify farm obstacles and complete the specified avoidance action. It is an inbuilt capacity for sensing and avoidance (S&A) of threats [13]. In sensing depth, especially of frontal obstacles, studies have focused on mimicking biological mechanisms such as motion parallax, monocular cues and stereo vision [14]. The functions of an Ag UAV OA system, therefore, include real-time perception, rapid image analysis, intelligent identification, potential-area detection and decision making on obstacle avoidance. To this end, radar ranging, laser ranging, ultrasonic ranging and monocular and binocular vision are deployed as tools for sensing or detecting obstacles.

Sensor fusion is a process by which data from a multi-sensor UAV are fused for computation, as in multispectral remote imaging in precision agriculture, to capture both visible and invisible images of crops and vegetation. The sensors feed data back to the flight controller, which runs obstacle avoidance algorithms to process the image data of the scanned surroundings [15]. Many Ag UAV systems use on-site suspension, planned travel routes and autonomous obstacle avoidance as obstacle avoidance methods after detecting an obstacle. However, Leonetang [16] pointed out that autonomous actions that require the UAV to evade the obstacle and regenerate the route come at the cost of battery life, which may be insufficient to tackle additional obstacles during route regeneration. In agriculture, as noted earlier, UAVs have been applied primarily in remote sensing, delivery of crop production and protection materials, precision seeding, vegetation testing (NDVI) etc., with various types of obstacle avoidance (OA) systems under remote or programmed flight control [5]. Initially, there were two main sense and avoid (S&A) technologies: radar, which sends out radio waves and measures their reflections from obstacles, and the light detection and ranging (LiDAR) optical sensor, which uses laser beams instead of radio waves to provide detailed images of nearby features [17]. UAV control modes are categorized as linear, non-linear and learning-based by Kim et al. [6]. They opined that linear and non-linear control systems based on the linear-quadratic (LQ) approach are used to control the UAV and handle wind and weather, but not storm and snow. Learning-based controls use a type of fuzzy logic that learns from data obtained in flight and does not require models.

Many variants of controls and obstacle detection and avoidance devices adorn the UAV shelves and aircraft market today, largely solving earlier sensors' problems of bulkiness, weight, low energy efficiency and high cost. These include RTK sensors, ultrasonic sensors, laser sensors, infrared sensing technology, structured light, TOF ranging, millimeter-wavelength radar, monocular visual ranging and binocular stereo vision [7]. The sensors are further supported by a variety of algorithms that enable real-time obstacle perception, rapid analysis and actionable image interpretation. These are broken down into three categories [13]: geometric relationship (relative distance, speed, acceleration, angle etc. between the drone and the obstacle), real-time planning (artificial potential field; the ANN algorithm, a software-based approach to replicating biological neurons; artificial heuristics; path planning etc.) and decision making (Markov and Bayesian decision theory). Among emerging OA techniques is reinforcement learning, whereby the UAV selects actions on the basis of its past experiences (exploitation) and also by new choices (exploration), such as the autonomous mental development (AMD) algorithm [13], which simulates the mental development process of a human being using a neural network algorithm [18]; the online free path generation and navigation system [19]; the multi-UAV genetic algorithm [15]; and the simultaneous localization and mapping (SLAM) technology, which maps in real-time, recognizing its own position and identifying obstacles while autonomously traveling or performing tasks [6]. The technique of UAV swarm control is also evolving, using linear and nonlinear controls with strong resistance to external influence, based on the K-means algorithm (K-means clustering) to prevent collisions and another algorithm to map allocated areas.
However, as Corrigan pointed out in his paper [15], the challenge of these technologies is accuracy, as measurements must be taken continuously as the UAV moves through its environment and assimilated to update the models, accounting for noise introduced by both the device's movement and the measurement method's inaccuracy. This task is accomplished by updating the model state variables with measured values. The state variables are updated sequentially, i.e., each time an observation is available. A Kalman filter is deployed to estimate the states of the system from the sensor data, to estimate variables that are not directly observable and to minimize the noise [20].
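The sequential update just described can be illustrated with a minimal one-dimensional sketch: each time a noisy range observation arrives, the estimate and its uncertainty are updated. The noise variances, initial state and readings below are illustrative assumptions, not parameters from any Ag UAV system.

```python
# Minimal sketch of a scalar (1-D) Kalman filter smoothing noisy range
# readings from an obstacle sensor. All numeric values are illustrative.

def kalman_1d(measurements, q=0.01, r=0.5, x0=0.0, p0=1.0):
    """Update the state estimate sequentially, once per observation.
    q: process noise variance, r: measurement noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: the state is assumed constant; only uncertainty grows.
        p = p + q
        # Update: blend prediction and measurement via the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

# Noisy readings scattered around a true distance of 5 m.
readings = [5.3, 4.8, 5.1, 4.9, 5.2, 5.0]
smoothed = kalman_1d(readings, x0=readings[0])
print(round(smoothed[-1], 2))
```

The gain `k` shifts weight between the prediction and the new measurement: a large measurement variance `r` makes the filter trust its running estimate more, which is how the sensor noise mentioned above gets suppressed.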

The plethora of demands on an OA system, according to Wang et al. [12], to ensure the safety of agricultural UAVs during continuous operations at low altitude and ultralow-volume spraying, includes the following capabilities: fast action response time and implementation efficiency; autonomous adjustment of the flying speed, height and attitude of the drone; re-planning of the flight path after obstacle avoidance; deployment of signal-loss prevention and anti-magnetic-field interference; and decision making on single, multiple, static or dynamic obstacles.

To achieve the above objectives of all-weather autonomous operation of Ag UAVs, multi-sensor obstacle avoidance technology is evolving, while the development of auxiliary classification (indirect identification) of farmland obstacles and standardization of UAV processes is ongoing. In July 2015, the Republic of South Africa became the first country to implement and enforce a comprehensive set of legally binding rules governing unmanned aerial vehicles. By 2016, 15 countries had published dedicated drone regulations. Nigeria is one of the countries with legally binding rules on the use of unmanned aerial vehicles (UAVs) [21].

2.3 Applicability of the various UAV obstacle sensing and avoidance technologies to obstacle-rich farmland

2.3.1 Real-time kinematic (RTK)

RTK is more suitable for building obstacle maps of farmland than for real-time obstacle avoidance. This positioning technology has not been fully applied to Ag UAVs because of its high cost and its difficult, time-consuming and labour-intensive deployment [12]. However, a Global Navigation Satellite System (GNSS) with RTK allows for centimeter-level position accuracy [6].

2.3.2 Ultrasonic sensors

These are used as auxiliary safety devices to obtain flight altitude parameters, achieve autonomous take-off and landing, or fly in complex terrain at very low altitudes. Image processing is used to recognize the position of an obstacle while the ultrasonic sensor determines the distance [22]. The obstacle avoidance principle is based on the sound wave reflected by the obstacle: the echo time difference is measured to determine the distance [16]. On an Ag quadrotor UAV, the ultrasonic sensors are used to detect objects and calculate the distance between the obstacle and the UAV; an effective method to avoid the obstacle is subsequently devised [23]. They are not affected by light intensity or color variation, and are of simple structure, low cost and easy operation. However, their performance is limited by their small range and by sound absorption, ambient temperature, humidity, atmospheric pressure and the ground effect of grass, in addition to losses caused by ultrasonic reflection and crosstalk between sound waves [6]. Further, acoustically soft materials like cloth may be difficult to detect [9]. Moreover, Gibbs et al. [24] argued that their angular resolution can be as low as 60 degrees, so they cannot identify the angular location of an obstacle in their view. Their study using two sonar receivers and a scalar Kalman filter to reduce the effect of signal noise was, however, salutary, especially with the energy and power spectral density (PSD) metrics, out of the four signal metrics tested, the others being maximum (peak) frequency and cross-correlation of raw data. Further work by Davies et al. [25] using ultrasonic transit-time flow meters reveals that the autonomous performance of ultrasonic sensors in the flight instrumentation of UAVs can be improved by the optimal design of two variables: the mounting configuration and the angle of incidence of the transducer mounting.
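The echo-time principle above reduces to simple arithmetic: halve the round-trip time and multiply by the speed of sound. The sketch below also shows why ambient temperature matters, since the speed of sound varies with it; the temperature model and timing value are illustrative assumptions.

```python
# Sketch of ultrasonic ranging from the echo time difference. The
# speed of sound in air depends on temperature, one of the limits
# noted in the text.

def sound_speed(temp_c):
    # Common linear approximation for the speed of sound in air, m/s.
    return 331.3 + 0.606 * temp_c

def echo_distance(echo_time_s, temp_c=20.0):
    # The pulse travels to the obstacle and back, hence the halving.
    return sound_speed(temp_c) * echo_time_s / 2.0

# A 29 ms round trip at 20 degrees C corresponds to roughly 5 m.
d = echo_distance(0.029)
print(round(d, 2))
```

At 0 degrees C the same echo time would read about 4.8 m, illustrating why uncompensated temperature drift degrades the range estimate.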

2.3.3 LiDAR

LiDAR calculates distances and detects objects by measuring the time it takes for a short laser pulse to travel from the sensor to an object and back, using the known speed of light [9]. Laser sensors, mainly used in autonomous navigation, require optical systems, making them unsuitable for liquid-laden, dusty and smoky farming environments, in addition to their weight and production cost, especially for small (sUAV) and micro unmanned aerial vehicles (MAVs). Acquiring 3D scanning information takes so much time with laser scanners that it makes them unsatisfactory for real-time obstacle avoidance [26]. However, Huang et al. [5] proffered that the light detection and ranging (LiDAR) optical sensor could be configured in a multi-sensor platform for agricultural field survey and crop height profiling.

2.3.4 Infrared (IR)

This sensing technology is based on the principle of triangulation, whereby the infrared emitter emits an infrared beam at a certain angle and the light is reflected back on encountering an object. On detection, the object distance is calculated [18]. The output analogue voltage corresponds to the distance to the reflecting object [15]. It works under all weather conditions, including at night, to measure distances and describe contours. It must, however, avoid direct sunlight and reflections to evade interference with or failure of the OA system [6]. Corrigan [15], however, noted that IR obstacle avoidance sensors work with a specific frequency of infrared produced by the emitter to prevent them from being confused by visible light. Their good concealment and all-time service confer a unique ability to observe animals in their natural habitats without causing disturbance [9]. For their part, Wang et al. [12] argued that the detection distance is small and that the light emitted by the system is easily disturbed by the external environment.
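The triangulation principle can be sketched numerically: the emitter and detector sit a known baseline apart, the beam leaves at a known angle, and the return angle measured at the detector fixes the triangle. The geometry and values below are illustrative assumptions, not the optics of any particular IR module.

```python
import math

# Toy sketch of triangulation ranging: emitter at one end of the
# baseline, detector at the other, obstacle at the triangle's apex.

def triangulate_distance(baseline_m, emit_angle_deg, return_angle_deg):
    a = math.radians(emit_angle_deg)    # angle at the emitter
    b = math.radians(return_angle_deg)  # angle at the detector
    apex = math.pi - a - b              # angle at the reflecting point
    # Law of sines gives the emitter-to-obstacle side length.
    side = baseline_m * math.sin(b) / math.sin(apex)
    # Perpendicular distance from the baseline to the obstacle.
    return side * math.sin(a)

# Symmetric 80-degree angles over a 5 cm baseline put the obstacle
# roughly 14 cm away.
d = triangulate_distance(0.05, 80.0, 80.0)
print(round(d, 3))
```

Note how quickly the apex angle shrinks as the obstacle recedes; this is one geometric reason the detection distance of such sensors is small, as Wang et al. observe.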

2.3.5 Line structured light

Line structured light is emitted from a laser and converged into light bands of different shapes after passing through different lens structures [6]. Through image acquisition, processing and calculation, the distance, azimuth, width and other parameter information of frontal obstacles is extracted [24]. Single-camera-based obstacle avoidance systems use structured light to map their environment in 3D without being weighed down by traditional bulky LiDAR [25]. Wang et al. [6], however, observed that there is mutual interference between adjacent structured light sensors, and natural light nullifies structured light in the outdoor environment; hence it is not commonly used in agricultural UAVs.

2.3.6 Millimeter-wavelength radar

This is a detection radar working in a microwave band. It transmits signals with a wavelength in the millimeter range, a short wavelength, and can therefore detect movements as small as a fraction of a millimeter [26]. In the complex farmland operating environment, ultrasonic and other sensors based on optical principles are easily affected by climatic conditions, whereas the millimeter-wavelength radar, a non-cooperative sensor, can work in all weather conditions with strong penetrating ability, large operating distance, reliable detection and anti-electromagnetic interference [6]. Compared with other ranging sensors, however, the millimeter-wavelength radar has low resolution and is implemented with discrete components that increase power consumption and overall system cost. It can only detect parallel distances and can hardly describe the outline of OA objects or their angle in the field of view. It has, therefore, been limited to terrain-imitation flight systems for agricultural UAVs. Lovescu and Rao [26] noted that Texas Instruments has addressed these challenges with complementary metal-oxide semiconductor (CMOS)-based mmWave radar devices implementing frequency-modulated continuous wave (FMCW) operation, which measures range as well as angle and velocity. This differs, they noted, from traditional pulse-radar systems, which transmit short pulses periodically.
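As a rough illustration of the FMCW principle described above: the radar sweeps a chirp of bandwidth B over duration T, and an echo delayed by the round trip appears as a beat frequency proportional to range. The chirp parameters below are illustrative assumptions, not the specification of any TI device.

```python
# Sketch of FMCW range estimation: the echo's round-trip delay shows
# up as a frequency offset (beat frequency) against the ongoing chirp.

C = 3.0e8  # speed of light, m/s

def fmcw_range(beat_freq_hz, bandwidth_hz, chirp_time_s):
    slope = bandwidth_hz / chirp_time_s  # chirp slope, Hz/s
    delay = beat_freq_hz / slope         # round-trip delay, s
    return C * delay / 2.0               # one-way range, m

# An assumed 4 GHz sweep over 40 microseconds with a measured
# 1 MHz beat frequency gives a 1.5 m range.
r = fmcw_range(1.0e6, 4.0e9, 40e-6)
print(r)
```

A wider sweep bandwidth raises the slope, so the same range produces a higher, more easily resolved beat frequency; this is the lever behind the improved range resolution of modern mmWave chips.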

2.3.7 Time of flight (TOF) ranging

This is one of the most widely used ranging methods. It works on round-trip flight time: the camera illuminates the whole scene, including the objects, using a pulsed or continuous-wave light source and then measures the time of the reflected light, using the speed of light to and from the object to obtain the distance and hence a 3D depth range map. It is the quickest technology to capture 3D information, created in a single shot of an area or scene [9]. Compared with other ranging methods, it has the features of low energy consumption and easy deployment and is suitable for applications where high ranging accuracy is required. However, due to the transmission features of the light wave signal, non-linear propagation factors such as reflection, refraction and diffraction can all cause measurement time deviations, which lead to huge distance calculation errors. When applied to agricultural UAV safety, an auxiliary device derived from the combination of the TOF ranging principle and other sensors is more suitable for obstacle avoidance [26]. AMS TOF obstacle detection and avoidance sensors, for example, are based on a proprietary single-photon avalanche photodiode (SPAD) pixel design and time-to-digital converters (TDCs) with extremely narrow pulse width and can measure the time of flight of a laser-emitted infrared ray reflected from an object in real-time [7].
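The round-trip arithmetic above, and its sensitivity to timing deviation, can be sketched directly; at the speed of light, a single nanosecond of timing error already shifts the range by roughly 15 cm. The timing values are illustrative assumptions, not specifications of any SPAD/TDC device.

```python
# Sketch of optical time-of-flight ranging: a light pulse's travel time
# is converted to distance using the speed of light.

C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s):
    # Halve the round trip to get the one-way distance.
    return C * round_trip_s / 2.0

# A 20 ns round trip corresponds to about 3 m.
d3 = tof_distance(20e-9)
print(round(d3, 3))

# A timing deviation of just 1 ns (e.g. from reflection or refraction,
# as noted in the text) shifts the estimate by about 15 cm.
err = tof_distance(21e-9) - d3
print(round(err, 3))
```

This is why the narrow-pulse TDCs mentioned above matter: the ranging accuracy is limited almost entirely by how precisely the arrival time can be stamped.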

2.3.8 Monocular visual ranging

Monocular visual ranging captures images through a single-lens camera. The payload constraints from the size, weight and power (SWaP) of both small and micro UAVs favor the use of monocular cameras. It performs 3D depth reconstruction from a single still image. It is simple in structure, mature in technology and fast in computing speed [26], but it cannot directly obtain the depth information of obstacles, because both optical flow and perspective cues cannot handle frontal obstacles well [21]. The algorithm deployed to interpret the image data using various cues in the image is what makes monocular vision cameras able to create 3D images, determine distances between objects and detect obstacles [9]. Hence, monocular ranging can hardly meet the real-time performance and accuracy requirements of obstacle avoidance for UAVs, given the complexity of the farmland operating environment. However, a relative-size cue for detecting frontal collisions, which works on the knowledge that the apparent size of approaching objects increases with nearness, appears promising for real-time frontal obstacle detection using a single camera. According to Mori and Scherer [14], the time to collision can be obtained by measuring the expansion of the obstacle.
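The relative-size cue lends itself to a short numeric sketch: the time to collision (TTC) follows from how fast an obstacle's apparent size grows between frames, with no need for its true distance. The frame interval and pixel sizes below are illustrative assumptions, not values from Mori and Scherer's system.

```python
# Sketch of TTC estimation from apparent expansion: for an object
# approaching at constant speed, TTC is approximately its current
# apparent size divided by the rate at which that size grows.

def time_to_collision(size_prev_px, size_curr_px, dt_s):
    expansion_rate = (size_curr_px - size_prev_px) / dt_s  # px/s
    if expansion_rate <= 0:
        return float("inf")  # obstacle not approaching
    return size_curr_px / expansion_rate  # seconds

# An obstacle growing from 100 px to 105 px over a 0.1 s frame
# interval implies about 2.1 s to collision.
ttc = time_to_collision(100.0, 105.0, 0.1)
print(round(ttc, 1))
```

Because only pixel measurements enter the formula, the estimate works for frontal obstacles where optical flow is weakest, which is exactly the gap noted in the text.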

2.3.9 Binocular stereo vision sensors

Beginning with identifying an image pixel that corresponds to the same point in a physical scene observed by multiple cameras, stereoscopic vision is the calculation of depth information by combining two-dimensional images from two cameras at slightly different viewpoints. Triangulation using a ray from each camera can then be used to determine the 3D position of a point [15]. Stereo vision technology can recognize obstacles and measure the distance between the fuselage and the obstacles, thanks to its good concealment and its ability to obtain comprehensive information, including the color and texture of obstacles as well as 3D depth information. The most serious issue with binocular vision, however, is stereo matching. The effects of lighting changes, scene rotation, object occlusion, low image resolution, interference and even overwhelming of target features all lead to target feature instability and decreased object detection accuracy [6]. It is, therefore, used more in consumer and professional UAVs than in agriculture.
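Once the matching pixel has been found in both views (the stereo-matching problem just noted), the triangulation step itself is simple: for a rectified pair, depth is focal length times baseline divided by disparity. The camera parameters below are illustrative assumptions, not those of any particular stereo module.

```python
# Sketch of the depth-from-disparity step in binocular stereo vision
# for a rectified camera pair: Z = f * B / d.

def stereo_depth(focal_px, baseline_m, disparity_px):
    if disparity_px <= 0:
        raise ValueError("zero disparity: point at infinity or bad match")
    return focal_px * baseline_m / disparity_px

# Assumed 700 px focal length and 10 cm baseline; a 20 px disparity
# corresponds to 3.5 m depth.
z = stereo_depth(700.0, 0.10, 20.0)
print(z)
```

Since depth varies inversely with disparity, a one-pixel matching error matters far more for distant obstacles than for near ones, which is one reason the matching instabilities listed above degrade detection accuracy so sharply.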

3. Conclusion

An attempt was made to situate the topic and review within the milieu of the national discourse on economic development, to isolate it from the pool of academic exercises which has become the bane of research in Nigerian Institutions of Learning (NIL). The imperatives of enhanced mechanization of the nation's farm systems as a cardinal strategy to fight poverty and drive capital formation and economic growth were highlighted. The role of unmanned aircraft in precision agriculture was x-rayed to include soil and water analysis, planting, crop and spot spraying, crop monitoring, irrigation, farm health assessment and livestock production systems. Based on the three functional compartments of an unmanned aircraft system (guidance, navigation and control, GNC), the application of UAVs to agriculture was examined to include: the two main platforms, fixed-wing and rotor airframes; the navigation sensors, such as real-time kinematic (RTK), ultrasonic sensors, laser sensors, infrared sensing technology, structured light, time of flight (TOF) ranging, millimeter-wavelength radar, monocular visual ranging and binocular stereo vision; and the controls, namely geometric relationship, real-time planning and decision-making algorithms, the autonomous mental development (AMD) algorithm, the online free path generation and navigation system, the multi-UAV genetic algorithm, the simultaneous localization and mapping (SLAM) technology, UAV swarm control and sensor fusion systems. The obstacle avoidance methods of many Ag UAV systems were found to be on-site suspension, planned travel routes and/or autonomous obstacle avoidance.
The merits and demerits of the available Ag UAV technologies were highlighted, and the gaps to bridge were noted: in airframe technology, cost, payload and flight endurance; in sensors, direct imaging sensors of lower cost, size and weight; in reliability, mechanical and electronic robustness and resistance to interference; in off-the-shelf devices, rapid remediation; in operation, autonomous take-off and landing, automated computation of flight paths, and integrated spray and remote sensing algorithms to direct spraying; and in power, higher-capacity, lightweight solar cells.

The objective of the study was to examine the literature pertaining to the use of UAVs for pesticide and fertilizer application on obstacle-rich farms. The study shows that a UAV platform for agricultural pesticide and fertilizer application needs a higher payload weight, which demands an enhanced mechanical structure (especially the landing gear) and an integrated electrical structure. Enhanced flight endurance, to ensure continuous spraying operation over a unit task size, is important. It was further observed that rotor aircraft with the ability for vertical take-off and landing (VTOL) are better suited for agricultural operations.

It was also observed that the farm environment is replete with flight obstacles for Ag UAVs, and the various types of obstacles were outlined. Nine sensors that are primary to obstacle sensing and avoidance technology were therefore reviewed for their applicability to such an environment, and their merits and demerits highlighted. None was found to achieve real-time, all-weather autonomous operation completely independently, without an assist; the fusion of multi-sensor data is therefore in vogue, with each sensor compensating for the shortfalls of the others. The potential for real-time and near-real-time operation exists, but technology for quality imagery and rapid processing leading to real-time response still needs development. The infrared, time-of-flight and millimetre-wavelength radar sensors appear most promising for detecting farm and flight-environment obstacles, especially in their modern versions for Ag UAVs.
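The simplest form of the multi-sensor fusion described above is inverse-variance weighting of range readings from dissimilar sensors, so that a noisy ultrasonic return is down-weighted against tighter TOF or mmWave radar returns. The sketch below is a minimal illustration; the noise figures assigned to each sensor are assumptions, not characterized values.

```python
# Inverse-variance fusion of range readings from dissimilar obstacle
# sensors (e.g. ultrasonic, TOF, mmWave radar). Noise figures are
# illustrative assumptions.

def fuse_ranges(readings):
    """readings: list of (distance_m, std_dev_m). Returns (fused_m, fused_std_m)."""
    weights = [1.0 / (s * s) for _, s in readings]      # weight = 1 / variance
    total = sum(weights)
    fused = sum(w * d for (d, _), w in zip(readings, weights)) / total
    return fused, (1.0 / total) ** 0.5                  # fused std shrinks

# Assumed: ultrasonic is noisy at range; TOF and radar are tighter.
fused_d, fused_s = fuse_ranges([(4.9, 0.30), (5.1, 0.05), (5.05, 0.10)])
print(f"fused distance ~ {fused_d:.2f} m (sigma ~ {fused_s:.3f} m)")
```

The fused standard deviation is always smaller than that of the best single sensor, which is precisely why fusion lets each sensor's shortfall be covered by the others.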

The autonomous mental development (AMD) algorithm and the simultaneous localization and mapping (SLAM) technology appear to be ahead of others in achieving autonomous identification of obstacles and real-time obstacle avoidance for agricultural UAVs. They are, therefore, found fit for further study and development for deployment on Ag UAVs for pesticide and fertilizer application in obstacle-rich farmlands.
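At the core of the mapping half of SLAM is the occupancy grid: each range return marks the cells along the sensor beam as free and the terminal cell as occupied, building up a map of farm obstacles over repeated scans. The toy sketch below illustrates only this update step, with an invented grid, pose and reading; a real SLAM stack would simultaneously estimate the vehicle pose itself.

```python
# Toy occupancy-grid update, the mapping half of SLAM in miniature.
# Grid size, pose and range reading are invented for illustration.

def update_grid(grid, x0, y0, heading_cells, rng_cells):
    """Mark cells along a straight beam from (x0, y0); heading is a unit cell step."""
    dx, dy = heading_cells
    for step in range(1, rng_cells):
        grid[y0 + dy * step][x0 + dx * step] = 0        # free space along the beam
    grid[y0 + dy * rng_cells][x0 + dx * rng_cells] = 1  # obstacle at the return

size = 7
grid = [[-1] * size for _ in range(size)]               # -1 = unknown
update_grid(grid, x0=3, y0=3, heading_cells=(1, 0), rng_cells=3)  # hit 3 cells east
print(grid[3])  # row through the UAV: unknown, free, free, obstacle, unknown
```

Once such a map exists, real-time obstacle avoidance reduces to planning paths through cells known to be free, which is what makes SLAM attractive for obstacle-rich farmlands.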

References

  1. National Bureau of Statistics. Statistic Bulletin. 2020;3:70806. Available from: https://www.aljazeera.com-news [Accessed: May 04, 2020]
  2. Lewis A. Theory of Economic Development. London: George Allen and Unwin Ltd; 1977
  3. Agbo J. Precision Agriculture and National Development in The Nation Online. 11th June 2020
  4. Yun G, Mazul M, Pederii Y. Role of unmanned aerial vehicles in precision farming. Proceedings of the National Aviation University. 2017;70(1):106-112
  5. Huang Y et al. Development and prospect of unmanned aerial vehicle technologies for agricultural production management. International Journal of Agricultural and Biological Engineering. 2013;6:3. Available from: http://www.ijabe.org [Accessed: June 14, 2020]
  6. Kim J et al. Unmanned aerial vehicle in agriculture: a review of perspective of platforms, control and applications. IEEE Access. 2019;10:1109. Available from: https://creativecommons.org/licences/by/4.0/ [Accessed: June 14, 2020]
  7. Uche E, Audu S. UAV for agrochemical application: a review. Nigerian Journal of Technology (NIJOTECH). 2021;40(5):795-809
  8. Vroegindeweij B, Henten E. Autonomous unmanned aerial vehicles for agricultural applications. In: Proceedings. International Conference of Agricultural Engineering. Zurich; 2014. p. 8
  9. Banjo C, Ajayi O. Sky-Farmers: Applications of Unmanned Aerial Vehicles (UAV) in Agriculture. IntechOpen; 2019. Available from: http://creativecommons.org/licences/by/3.0 [Accessed: June 12, 2020]
  10. He L et al. Optimization of pesticide spraying tasks via multi-UAVs using genetic algorithm. Hindawi Mathematical Problems in Engineering. 2017;2017:7139157. DOI: 10.1155/2017/7139157
  11. Chufang H. QF170-18L AgriSpraying Helicopter. 2020 [Accessed: June 14, 2020]
  12. Wang L et al. Applications and prospects of agricultural unmanned aerial vehicle obstacle avoidance technology in China. MDPI. 2019;19(3):642. Available from: http://creativecommon.org/licences/by/4.0/ [Accessed: June 14, 2020]
  13. Renke H et al. UAV autonomous collision avoidance approach. Automatika. 2017;58(2):195-204. DOI: 10.1080/00051144.2017.1388646
  14. Mori T, Scherer S. First results in detecting and avoiding frontal obstacles from a monocular camera for micro aerial vehicles. In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA). Karlsruhe, Germany; 2019
  15. Corrigan F. 12 Top Collision Avoidance Drones and Obstacle Detection Explained. DroneZon [Online]. 2020. pp. 1-31. Available from: https://www.dronezon.com/learn-about-drones-quadcopters/top-drones [Accessed: June 10, 2020]
  16. Leonetang W. The Obstacle Avoidance System of Agricultural UAV. Welkinuav [Online]. 2018. pp. 921-922. Available from: https://www.welkinauv.com [Accessed: June 10, 2020]
  17. Braasch M. Obstacle Avoidance: The Challenge for Drone Package Delivery. The Conversation [Online]. 2016. pp. 1-5. Available from: https://theconversation.com/amp/obstacle-avoidance-the-challenge-for-drone-package-delivery-70241 [Accessed: June 09, 2020]
  18. De Simone M et al. Obstacle avoidance system for unmanned ground vehicles by using ultrasonic sensors. Machines. 2018;6:18. Available from: www.mdpi.com/journal/machines [Accessed: June 12, 2020]
  19. Adnan A, Majd A, Troubitsyna E. Online path generation and navigation for swarm of UAVs. Scientific Programming. 2020;2020:1-15. Available from: http://creativecommon.org/licences/by/4.0/ [Accessed: June 12, 2020]
  20. Gerstberger LJ. State Estimation: Kalman Filter. In: Programmierung & Softwaretechnik. München: Ludwig-Maximilians-Universität; 2012. p. 6
  21. Sylvester G. Unmanned Aerial Systems (UAS) in Agriculture: Regulations & Good Practices. E-Agriculture in Action, Drones for Agriculture, Food and Agricultural Organization of the United Nations (FAO). 2016. p. 20
  22. Anis H et al. Automatic quadcopter control avoiding obstacle using camera with integrated ultrasonic sensor. Journal of Physics: Conference Series. 2018;1011:012046
  23. Guanglei M, Haibing P. The application of ultrasonic sensor in obstacle avoidance of quadrotor UAV [Online]. In: 2016 IEEE Chinese Guidance, Navigation and Control Conference (CGNCC). 2016. Available from: https://ieeexplore.ieee.org.document [Accessed: June 12, 2020]
  24. Gibbs G, Jia H, Madani I. Obstacle detection ultrasonic sensors and signal analysis metrics. In: International Conference on Air Transport—INAIR 2017 [Online]. 2017. pp. 173-182. Available from: www.sciencdirect.com [Accessed: June 12, 2020]
  25. Davies D, Bolam R, Vagapov Y, Excell P. Ultrasonic sensor for UAV flight navigation. In: 25th International Workshop on Electric Drives: Optimization in Control of Electric Drives (IWED). Available from: http://doi.org/10.1109/IWED.2018.8321389 [Accessed: June 14, 2020]
  26. Lovescu C, Rao S. The Fundamentals of Millimetre Wave Sensors. Texas Instruments [Online]. 2019. Available from: http://www.ti.com/sc/docs/sampterms.htm [Accessed: June 12, 2020]
