Open access peer-reviewed chapter

Robots in Agriculture: State of Art and Practical Experiences

Written By

Juan Jesús Roldán, Jaime del Cerro, David Garzón‐Ramos, Pablo Garcia‐Aunon, Mario Garzón, Jorge de León and Antonio Barrientos

Submitted: 21 October 2016 Reviewed: 23 May 2017 Published: 20 December 2017

DOI: 10.5772/intechopen.69874

From the Edited Volume

Service Robots

Edited by Antonio J. R. Neves

Abstract

The presence of robots in agriculture has grown significantly in recent years, overcoming some of the challenges and complications of this field. This chapter aims to provide a complete and up-to-date state of the art of the application of robots in agriculture. The work addresses this topic from two perspectives. On the one hand, it covers the disciplines that lead the automation of agriculture, such as precision agriculture and greenhouse farming, and collects the proposals for automating tasks like planting and harvesting, environmental monitoring, and crop inspection and treatment. On the other hand, it compiles and analyses the robots that are proposed to accomplish these tasks: e.g. manipulators, ground vehicles and aerial robots. Additionally, the chapter reports in more detail some practical experiences with the application of robot teams to crop inspection and treatment in outdoor agriculture, as well as to environmental monitoring in greenhouse farming.

Keywords

  • robotics
  • agriculture
  • greenhouse
  • UGV
  • UAV
  • multi‐robot

1. Introduction

Agriculture can be a field as favourable as industry for the application of automation, but the challenges for robots in agriculture are diverse. On the one hand, agricultural environments, in contrast to industrial facilities, are neither structured nor controlled. On the other hand, industrial processes can be designed in modules so that specific robots are applied to specific operations, whereas the complex tasks of agriculture sometimes cannot be split into simple actions. For these reasons, agricultural applications require more versatile and robust robots.

In recent years, multiple groups around the world have applied different automation solutions (e.g. sensor networks, manipulators, ground vehicles and aerial robots) to diverse agricultural tasks (e.g. planting and harvesting, environmental monitoring, supply of water and nutrients, and detection and treatment of pests and diseases). This chapter aims to collect the state of the art on robotics applied to agriculture, as well as to describe some of our practical experiences more exhaustively.

Section 2 addresses the agricultural tasks where robots can be applied, highlighting precision agriculture and greenhouse farming but also covering tasks such as automatic planting and harvesting. Section 3 describes the robots applied to agricultural tasks in the state of the art, covering ground robots, aerial robots and multi‐robot systems. Section 4 summarizes our main experiences in robotics applied to agriculture, which are related to precision agriculture in open fields and environmental monitoring of greenhouses. Finally, Section 5 summarizes the main conclusions drawn from the review of the state of the art and our experience in the research projects.

2. Automation in agriculture

This section reports the state of the art about automation in agriculture. For this purpose, it is organized as follows: Section 2.1 is focused on precision agriculture, Section 2.2 addresses the application of new technologies to greenhouse farming, and Section 2.3 analyzes the proposals for automatic planting and harvesting.

2.1. Precision agriculture

Precision agriculture (PA), also known as precision farming, is a concept of farm management based on the application of different technologies, in order to manage the spatial and temporal variability associated with all aspects of agricultural production. Its main goal is the improvement of both crop performance and environmental quality.

Several authors have confirmed the economic and environmental benefits that are achieved when precision agriculture methodologies are applied. Nonetheless, academic surveys and professional reports show that the rate of adoption of these technologies is still low [1].

Moreover, instead of using precision agriculture as a complete concept, most of the reported deployments use these techniques to solve specific needs or to fill important gaps in the knowledge of farmers [2]. Additionally, even though agronomists are playing the leading role in PA development, engineers have worked diligently to provide the technologies needed to implement PA practices. Engineering innovations for PA involve the development of sensors, controls and remote‐sensing technologies.

Autonomous mobile robots can be used in a variety of field operations. They can be applied to facilitate capturing and processing high quantities of data, and they can provide the capabilities required to operate not only at individual plant level but also at complete field level. Blackmore and Griepentrog [3] study the autonomous platforms that may be available in the future, which would be used for cultivation and seeding, weeding, scouting, application of fertilizers, irrigation and harvesting.

The most widely used robotic technology in precision agriculture is vehicle guidance and auto‐steer systems. The reason is that the economic benefits are easily achievable without requiring the integration of additional components or decision support systems [2]. However, other technologies, especially those related to remote sensing and the development of sensors and controls, are also used by teams combining agronomists and engineers. Table 1 summarizes some of the developments that use mobile robotics and remote sensing for precision agriculture; a minimal sketch of such a guidance loop is given after the table.

Publication | Operation | Technique
[4] | Weeding | Automatic computer vision method for detecting weeds in cereal crops, and differential spraying to control the weeds.
[5] | Field mapping | Creation of 3D terrain maps by combining the information captured with a stereo camera, a location sensor and an inertial measurement unit, all installed on a mobile equipment platform.
[6] | Field mapping and coverage | An unmanned car‐like mobile robot uses a SLAM algorithm to navigate in the agricultural environment while creating a map of it.
[7] | Multi‐purpose | Design and construction of a multi‐purpose mobile ground platform for PA tasks.
[8] | Coverage path planning | Harmony Search (HS) algorithm for finding complex coverage trajectories with a fleet of aerial robots.
[9] | Weed and pest control | A fleet of heterogeneous ground and aerial robots is developed and equipped with innovative sensors, enhanced end‐effectors and improved decision control algorithms to cover a large variety of agricultural situations.

Table 1.

Precision agriculture: main operations and techniques.
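
To illustrate the vehicle guidance and auto‐steer systems mentioned above, the following is a minimal pure-pursuit steering sketch for a row-following vehicle. The look-ahead distance, wheelbase and row geometry are illustrative assumptions, not parameters of any cited system.

```python
import math

def pure_pursuit_steering(pose, path, lookahead=2.0, wheelbase=1.5):
    """Minimal pure-pursuit steering for a row-following vehicle.

    pose: (x, y, heading) of the rear axle in metres/radians.
    path: (x, y) waypoints along the crop row.
    Returns the front-wheel steering angle in radians.
    """
    x, y, heading = pose
    # Pick the first waypoint at least one look-ahead distance away.
    goal = path[-1]
    for wx, wy in path:
        if math.hypot(wx - x, wy - y) >= lookahead:
            goal = (wx, wy)
            break
    # Lateral offset of the goal in the vehicle frame.
    dx, dy = goal[0] - x, goal[1] - y
    lateral = -math.sin(heading) * dx + math.cos(heading) * dy
    # Pure-pursuit curvature and the corresponding steering angle.
    curvature = 2.0 * lateral / lookahead ** 2
    return math.atan(wheelbase * curvature)

# Example: follow a straight row offset 0.5 m to the left of the vehicle.
row = [(i * 0.5, 0.5) for i in range(40)]
print(pure_pursuit_steering((0.0, 0.0, 0.0), row))  # positive = steer left
```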

2.2. Greenhouse farming

Greenhouse farming is often a suitable field for applying the technologies of automation, computing and robotics. Some examples of technologies implemented in productive greenhouses are the control of temperature and humidity, the preparation of soil and the supply of water and nutrients [10]. Robots can also perform tasks that humans cannot do due to the harsh conditions of greenhouses, such as environmental monitoring and control, crop monitoring, supply and treatment, and pest and disease detection.

The environmental monitoring of greenhouses is interesting not only to control the growth of crops but also to determine the traceability of products. Nowadays, most of the systems used for environmental monitoring of greenhouses are based on wireless sensor networks (WSNs) [11-13]. Nevertheless, robots are starting to be applied as mobile platforms for sensors [14-16].

Greenhouses can be considered complex multiple‐input multiple‐output systems [17]. The literature collects multiple proposals for modelling and controlling the conditions of greenhouses [18]. Some of them obtain the models of greenhouses by applying analytical equations (e.g. mass and energy balances) [19], whereas others identify process models (e.g. neural networks or fuzzy sets) [20]. A review of these models determined the input, output and disturbance variables described below.

The input variables allow actuating on the greenhouse and changing its environmental conditions. The most relevant ones considered in the literature are the ventilation [21], heating [22], fogging [23], shading and CO2 injection [24] systems. The ventilation systems control the exchange of air between the greenhouse and the environment, which has an impact on the air temperature, humidity and composition. The heating systems are used to compensate for the heat losses and keep the temperatures in the adequate range. The fogging systems spray water into the air to increase the humidity and reduce the temperature. The shading systems control the irradiation through the covers to avoid the overheating of the greenhouse. Finally, the CO2 injection systems are used to promote the photosynthesis of the plants.

The output variables define the state of the greenhouse, can be measured by appropriate sensors and are the targets of climate control. The most relevant ones collected in the literature are the air temperature, air humidity, solar radiation and CO2 concentration. In addition, there are some variables that influence the state of the greenhouse and should be measured, although they cannot be controlled. These disturbances are the external temperature, external humidity, wind speed, wind direction, external CO2 concentration, cover temperature, crop temperature and ground temperature.
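
As an illustration of the analytical (energy balance) models mentioned above, the following toy simulation integrates a single-state balance of the greenhouse air temperature with heating and ventilation as inputs. All coefficients are illustrative assumptions, not values identified from a real greenhouse.

```python
def simulate_air_temperature(t_out, heating_w, vent_rate, hours=24, dt=60.0):
    """Toy lumped energy balance for the greenhouse air temperature.

    t_out:     external temperature [deg C] (constant, for simplicity)
    heating_w: heating power [W]
    vent_rate: air exchange rate [1/s] set by the ventilation system
    The coefficients below are assumed, not identified from a real house.
    """
    c_air = 150000.0  # heat capacity of the enclosed air [J/K] (assumed)
    u_cover = 250.0   # conductive loss coefficient of the cover [W/K] (assumed)
    t_in = t_out      # start in equilibrium with the outside
    for _ in range(int(hours * 3600 / dt)):
        conduction = u_cover * (t_in - t_out)             # losses through the cover
        ventilation = vent_rate * c_air * (t_in - t_out)  # air exchange losses
        t_in += dt * (heating_w - conduction - ventilation) / c_air
    return t_in

# Heating against a cold night with the vents almost closed.
print(simulate_air_temperature(t_out=5.0, heating_w=3000.0, vent_rate=1e-5))
```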

Table 2 collects the relevant variables for the environmental monitoring of greenhouses [25], as well as the appropriate sensors to measure them and their possible application in robots.

Variable | Sensor | Application
Radiation (absorbed) | Net radiometer | AR, GR, FS
Radiation (solar) | Pyranometer | AR, GR, FS
Air temperature | Resistance temperature device | AR, GR, FS
Air temperature | Thermocouple | AR, GR, FS
Air temperature | Thermistor | AR, GR, FS
Surface temperature | Infrared sensor | AR, GR, FS
Surface temperature | Thermocouple | GR, FS
Substrate temperature | Resistance temperature device | GR, FS
Substrate temperature | Thermocouple | GR, FS
Substrate temperature | Thermistor | GR, FS
Air humidity | Capacitance hygrometer | AR, GR, FS
Air humidity | Condensation hygrometer | AR, GR, FS
Air humidity | Psychrometer | AR, GR, FS
Ground humidity | Electrical conductivity meter | GR, FS
Carbon dioxide in air | Non‐dispersive infrared sensor | AR, GR, FS
pH | pH probe | GR, FS

Table 2.

Environmental monitoring of greenhouses: variables, sensors and robots based on Ref. [16].

AR, air robot; GR, ground robot; FS, fixed sensor.

Another task of greenhouse farming where robots can play an important role is crop inspection and treatment. The detection of weeds [26], pests [27] and diseases [28] is possible through direct and indirect methods. The direct methods are based on acquiring RGB [29] and 3D [30] images and applying computer vision techniques. The indirect ones require taking samples in the greenhouse and analyzing them in the laboratory. Ground robots can be used to apply treatments and fertilizers to the crops [31-33], in order to improve the precision and rationalize the use of products.
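
A common first step of the direct, image-based methods is to segment the vegetation from the soil background. The sketch below uses the excess-green index with Otsu thresholding, a generic choice in the weed-detection literature rather than the method of the cited works.

```python
import cv2
import numpy as np

def vegetation_mask(bgr):
    """Segment green vegetation with the excess-green index ExG = 2g - r - b.

    The index and the Otsu threshold are generic choices; real systems add
    morphology, row models and classifiers on top of this mask.
    """
    img = bgr.astype(np.float32) / 255.0
    b, g, r = cv2.split(img)
    total = b + g + r + 1e-6                       # avoid division by zero
    exg = (2.0 * g - r - b) / total                # chromatic excess green
    exg8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(exg8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask

# Usage: mask = vegetation_mask(cv2.imread("crop_row.jpg"))
```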

Planting and harvesting are seasonal tasks that require a considerable amount of work. The literature contains some proposals to automate these tasks [34-36]. These proposals consider different types of robots (mobile ground robots and rail-guided robots), sensors (mainly RGB and 3D cameras and laser scanners) and effectors (manipulators and grippers).

2.3. Seeding and harvesting

From the point of view of service robotics, Onwude et al. [37] evaluate the application of agricultural mechanization, as well as its present technologies and limitations, for large‐scale purposes. The study shows an increasing level of technological advancement of agricultural robots in field and crop mapping, soil sampling, mechanical seeding and harvesting. Sharing the same approach, Kester et al. [38] show the future trends and the likely adoption of automated farming machinery. The results of this study point out a growing interest in autonomous and semi‐autonomous systems for reducing the operations with the highest workload: tillage, seeding and harvesting.

In order to support the growing development of seeding and harvesting robots, new strategies have been proposed for driving autonomous mobile robots in the associated scenarios. Ko et al. [33] present a brief review of common techniques for robot navigation in greenhouses and propose a methodology based on machine learning. Due to the importance of manipulation in seeding, transplanting and harvesting processes, the manipulator motion strategy is also discussed by several authors. However, they usually do not specify path planning algorithms; the most common approach is the direct displacement towards the desired end‐effector position by position‐based control [39] and visual feedback control [40]. Task planning strategies are studied by only a few researchers. Commonly, the harvesting task is limited to picking one fruit, while the planning required for picking the rest is avoided. Nevertheless, the problem can be studied from two perspectives: coverage path planning [41] for picking all the fruits in a scene or time minimization [42] for moving from one fruit to another (see the sketch below). Obstacle detection and avoidance are studied by an equally low number of authors, since obstacle recognition adds great complexity on top of the path planning algorithms. A few approaches are based on obstacle detection with collision sensors in the end‐effector [43], obstacle recognition by Light Detection and Ranging (LIDAR) [44] and vision techniques [45].
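
The time-minimization perspective can be pictured with a greedy nearest-neighbour ordering of the detected fruits, sketched below. This is a generic heuristic for illustration, not the planner of any of the cited works.

```python
import math

def picking_order(start, fruits):
    """Greedy nearest-neighbour ordering of fruit positions (x, y, z) [m].

    Illustrates minimizing the move from one fruit to the next; a simple
    heuristic, not an optimal tour.
    """
    order, current, remaining = [], start, list(fruits)
    while remaining:
        nxt = min(remaining, key=lambda f: math.dist(current, f))
        remaining.remove(nxt)
        order.append(nxt)
        current = nxt
    return order

fruits = [(0.8, 0.2, 1.1), (0.3, 0.4, 0.9), (0.9, 0.1, 1.0)]
print(picking_order((0.0, 0.0, 1.0), fruits))
```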

A review of the use of technologies for automated activities in greenhouses is presented in Ref. [46], showing an increasing implementation of wireless systems for environmental measurements. Additionally, the study shows that a considerable number of research works aim to develop robotic systems for fruit picking and extraction. Furthermore, the research community has put much effort into developing techniques for robust fruit recognition; nevertheless, there is a high necessity for improving the picking capabilities of transplanting and harvesting robots in order to move towards commercial application. A review of vision control techniques and their potential applications in fruit and vegetable harvesting robots is presented in Ref. [47]. Fruit identification and localization are the problems most commonly studied by the authors. As with fruit ripeness identification [48, 49], a great number of approaches are based on RGB cameras [50] and on colour and shape recognition [51]. Multi‐spectral lighting is less studied, even though more information can be acquired with this kind of technology [52]. The next level of complexity involves the implementation of stereo vision systems and LIDAR for calculating the fruit position in 3D space [53].
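
For a rectified stereo pair, the 3D localization step reduces to triangulation: the depth is Z = f·B/d, with f the focal length in pixels, B the baseline and d the disparity of the matched fruit centre. A minimal sketch with assumed camera parameters:

```python
def fruit_position(u, v, disparity, fx, fy, cx, cy, baseline):
    """Back-project a matched fruit centre from a rectified stereo pair.

    (u, v): pixel of the fruit centre in the left image; disparity in pixels.
    The intrinsics (fx, fy, cx, cy) and the baseline below are assumptions.
    """
    z = fx * baseline / disparity   # depth from disparity
    x = (u - cx) * z / fx           # lateral offset
    y = (v - cy) * z / fy           # vertical offset
    return x, y, z

# Example: 800 px focal length, 12 cm baseline -> fruit at 4 m depth.
print(fruit_position(700, 420, disparity=24.0,
                     fx=800.0, fy=800.0, cx=640.0, cy=360.0, baseline=0.12))
```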

Moving towards commercial application can only be accomplished by the diversification and specialization of the robotic systems. With the aim of getting better results in harvesting tasks, newer and more precise sensors are needed. In Ref. [36], the authors review the modern sensor systems used in semi- or fully automated robotic harvesting. Their research shows how the integration of several kinds of technologies and sensor fusion can improve the precision of fruit recognition and localization activities.

Some interesting publications to be considered for a further review of seeding, transplanting and harvesting robotics can be found in Refs. [34, 54].

3. Agricultural robots

This section reviews the different types of robots and payloads that are applied to agricultural tasks. For this purpose, it is organized as follows: Section 3.1 analyzes the aerial robots, Section 3.2 the ground robots, Section 3.3 some special robots that are not conventional ground or aerial vehicles and Section 3.4 the multi‐robot systems.

3.1. Aerial robots

The Association for Unmanned Vehicle Systems International (AUVSI) published an economic report in 2013 [55], which emphasized the future impact of the civil use of unmanned aerial vehicles (UAVs) on the US economy. This document highlights two markets over multiple areas: public safety and precision agriculture. It concludes that the latter will be by far the largest market in the next decade in terms of economy and jobs.

In the last decades, collecting data from agricultural holdings has been carried out mainly by manned aerial vehicles, satellites or specialized experts directly at ground level [56]. These methods have some limitations, such as the presence of clouds, the long data delivery times, the need for special permissions and the prices of some products. In contrast, UAVs can be deployed efficiently, can carry multiple types of sensors, do not require very restrictive permissions and are becoming a cost‐effective alternative. These advantages have been reinforced with the rise of vertical take‐off and landing (VTOL) vehicles and, more specifically, quadcopters (a good example can be seen in Ref. [57]).

One of the first applications of UAVs in precision agriculture was the measurement of water stress in agricultural holdings. Nowadays, the UAVs are equipped with thermal and hyperspectral cameras, as well as fluorescence sensors [58]. An interesting experiment is reported in Ref. [59], where the authors produced a controlled deficit of irrigation to generate a gradient of water stress in a citrus orchard. They compared the data obtained by the micro‐thermal and hyperspectral cameras carried on board a fixed‐wing unmanned aircraft with the measurements on the leaves, validating the aerial methods to measure water stress. Similar works can be found in Refs. [60-62], which show the feasibility and benefits of aerial thermal imagery to improve irrigation.

Another use of aerial vehicles in precision agriculture is the monitoring of crops to predict yields, properly calculate the amounts of fungicides and fertilizers and detect pathogens. In Ref. [63], an RGB camera was used to estimate the biomass of a barley crop under two different nitrogen treatments. The results were cross‐validated with five different crop surface models based on the height of the plants. Additionally, in Ref. [64], clustering techniques were applied to estimate the biomass of a wheat crop through RGB images. The method was enhanced by measuring the height of the plants directly on the ground. These studies demonstrate the feasibility, applicability and precision of this tool.
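
The crop-surface-model idea amounts to subtracting a bare-ground elevation model from the canopy surface and regressing biomass on the resulting heights. The sketch below uses synthetic rasters and an assumed linear calibration, not the models of Refs. [63, 64].

```python
import numpy as np

# Synthetic stand-ins for photogrammetric rasters (elevations in metres).
ground = np.full((100, 100), 102.0)                           # bare-soil terrain model
canopy = ground + np.random.uniform(0.2, 0.6, ground.shape)   # crop surface model

plant_height = canopy - ground          # per-cell crop height
mean_height = plant_height.mean()

# Illustrative linear height-to-biomass calibration (assumed coefficients).
a, b = 3.5, 0.1                         # slope [t/ha per m] and offset [t/ha]
biomass_t_ha = a * mean_height + b
print(f"mean height {mean_height:.2f} m -> biomass {biomass_t_ha:.2f} t/ha")
```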

Another interesting parameter for monitoring is the leaf area index (LAI), defined as the ratio of leaf surface to unit ground area. In Ref. [65], the authors estimate the LAI of maize, potato and sunflower fields by using a hyperspectral camera and inverting the solar model of the canopies. In Ref. [66], the authors used a visual 3D modelling technique to estimate the LAI in a vineyard, obtaining around 57% less precision. Although this method seems to be less accurate nowadays, it is quick, inexpensive and practical compared to other methods. In Ref. [67], four cameras are used and the corresponding images are overlapped to measure the LAI over wheat and rapeseed crops. The authors achieve a very good correlation with the measurements at ground level, although they highlight the dependency of the method on the light conditions.

Additionally, the UAVs are a practical tool for the prevention of diseases in agricultural holdings. The work reported in Ref. [68] uses a UAV and a manned aircraft equipped with multispectral cameras. It applies four classification algorithms to detect the traces of the diseases and compares the results with the ground truth. Similarly, the work described in Ref. [69] proposes a new airborne visual sensor to detect diseases on the leaves of citrus trees, and the experiments show an accuracy of 93%. These methods have been used not only for citrus trees but also for olives [70], avocado trees [71], potatoes [72] and grapevines [73], among others.

Although this use is not widespread around the world due to the restrictions in many countries, there have been some trials and investigations with UAVs locally spraying fertilizers and pesticides [74]. For instance, in Ref. [75], a network of pesticide sensors is deployed over a virtual field, whereas a UAV sprays the chemicals where they are needed. The sensors provide feedback to the UAV, which adjusts the optimal route and minimizes the waste of pesticides. A more practical and exhaustive study was carried out in Ref. [76], where different spraying parameters were tested in a field with rice crops, such as operation height and velocity, and the deposited volume at different heights of the plants.

As we have seen, the UAVs equipped with hyperspectral and thermal cameras are an affordable and practical system to improve the performance of agricultural holdings, reducing costs and increasing productivity. With the development of new sensors and the establishment of new legal frameworks, their expansion will continue in the next decades.

3.2. Ground robots

Systematic design methods assist researchers in design choices, whereas economic analysis establishes the allowable cost of a system. Only a few authors report design processes based on requirements engineering. The decisions about the hardware have influence on the robot performance, the complexity of the algorithms and, eventually, the system costs. Some approaches for ground robots are reviewed below, with special interest in the platform, manipulator and end‐effector implemented in several agricultural applications.

Regarding the platform, custom mobile robots are the most common choice [77]. The robustness of crawler and caterpillar platforms [78, 79] is described by several authors, but wheeled robots [80] are also shown to be good choices due to their simplicity. All these platforms can integrate approaches such as GPS [43], odometry [50], line guidance [81], path plans [82] and manned approaches [83] as navigation strategies. Other robots take advantage of irrigation pipes or rails in the field [84, 85] to move.

The most common configuration of the manipulators is around 3 degrees of freedom (DoF) in custom‐made developments [86, 87]. This situation can be associated with the high cost of implementations involving commercial manipulators [39]. Although the manipulators are commonly custom made, the authors usually detail neither an analysis of nor an explanation for the number of DoF selected. The manipulator movements are simply simplified, and high numbers of DoF are avoided.

Finally, most end‐effectors are designed to operate with two fingers [44, 79], since most of the grasps can be performed by them and they are the simplest suitable mechanical architecture for grasping devices. In addition, grasping is commonly achieved by suction grippers [88]; for that reason, the end‐effectors are mainly actuated by electrical and pneumatic systems. A great number of the end‐effectors are custom made [89, 90]; this design preference can be associated with the huge diversity of tasks to be performed, as well as the several kinds of fruits and vegetables to be handled. Additionally, it is common practice to try to solve the problems originated by the mobile platform and the manipulator through the end‐effector behaviour; this situation leads to greater customization of the grippers.

For comprehensive reviews of the agricultural robotics literature and further examples of ground robot applications, we refer the reader to Refs. [91, 92].

3.3. Special robots

Traditional ground robots present several limitations related to the constraints of wheeled and caterpillar motion. Additionally, the use of aerial robots is not always possible, especially when the task should be performed on the ground or indoors. This section reports some cases in the literature where robots with alternative locomotion systems are applied to agricultural tasks.

Spherical robots are systems whose movement is produced by inducing instability (e.g. by shifting an internal mass). This type of robot is used in several applications and scenarios, such as exploration [93] and surveillance [94]. ROSPHERE is described by its authors as a new low‐cost spherical robot for measuring soil temperature and moisture in precision agriculture [95]. In comparison to wheeled robots with similar size and capabilities, ROSPHERE is much lighter and more robust on irregular terrain.

Several robots are designed by bioinspiration, which tries to replicate solutions found by biological evolution. For instance, engineers have noticed that hexapod insects are able to walk on all terrains, so they replicate their morphologies and walking patterns. This is the case of Prospero, a prototype of a hexapod robot that can plant, tend and harvest autonomously [96]. Another example is the RoboBees of Harvard University, which are micro‐aerial robots inspired by bees [97]. These robots are being applied to distributed environmental monitoring and assistance to crop pollination.

3.4. Multi‐robot systems

Sometimes, single robots are not able to perform some complex tasks (e.g. those that require coordinated actions in multiple locations) or to perform simple tasks in the required time (e.g. when the tasks must be performed in large areas). In these cases, robot teams might provide some advantages over single robots, such as their effectiveness, efficiency, flexibility and fault tolerance.

Most of the robot teams for agricultural tasks reported in the literature are homogeneous. Using a fleet of UAVs instead of a single UAV for collecting data in large areas is common, and there are multiple techniques for area distribution and path planning [98], such as the strip partition sketched below. For instance, a team of small UAVs with low‐cost cameras can be applied to control the exploitation and management of water [99], obtaining the same results as a single UAV equipped with a better camera while providing the operation with more robustness.
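
In its simplest form, the area distribution can be a partition of the field into equal-width strips, one per UAV, with each vehicle then sweeping its own strip. The sketch below shows this deliberately simple partition; it is not the allocation algorithm of Refs. [98, 99].

```python
def split_field(x_min, x_max, n_uavs):
    """Split a rectangular field into equal-width strips, one per UAV."""
    width = (x_max - x_min) / n_uavs
    return [(x_min + i * width, x_min + (i + 1) * width) for i in range(n_uavs)]

print(split_field(0.0, 300.0, n_uavs=3))
# [(0.0, 100.0), (100.0, 200.0), (200.0, 300.0)]
```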

Nevertheless, some agricultural applications require heterogeneous robot teams. The most common situation is when the task consists of operations that are more effective from the air and others that are more effective from the ground. For instance, aerial robots are able to efficiently cover large fields taking pictures and collecting data (e.g. distribution of water or location of weeds), whereas ground robots can actuate on the crops with more robustness and precision (e.g. watering or applying treatments) [9].

Therefore, heterogeneous multi‐robot systems often combine the advantages and compensate for the drawbacks of different robots. Section 4 describes in more detail two different multi‐robot systems applied to two different agricultural scenarios: outdoor agriculture and greenhouse farming.

4. Practical experience

This section reports some of the experiences of the Robotics and Cybernetics Group (RobCib) in the context of robotics applied to agriculture. It is organized as follows: Section 4.1 summarizes the participation in the Robot Fleets for Highly Effective Agricultural and Forestry Management (RHEA) Project, whereas Section 4.2 describes the use of multiple robots for the environmental monitoring of greenhouses.

4.1. RHEA project

A good example of precision agriculture is the RHEA (Robot Fleets for Highly Effective Agricultural and Forestry Management) Project. It was carried out under the Seventh Framework Programme of the European Commission and identified as NMP‐CP‐IP 245986‐2. RHEA activities finished on 31 July 2014. The project focused on the design, development and testing of a new generation of robotic systems for effective chemical and physical (mechanical and thermal) weed management in the context of agriculture and forestry.

The use of pesticides in agriculture helps to improve yields and to prevent crop losses. Nevertheless, pesticides include active ingredients that have adverse impacts on the environment and habitats. According to the “Agriculture, forestry and fishery statistics” [100] online publication of Eurostat, the sales of pesticides in European Union member states amounted to close to 400,000 tonnes in 2014. Due to their potential toxicity, the application of pesticides has been strictly controlled by EU legislation since 1991 and previously by national regulations. Although the return of pesticides (the crops saved from pests and diseases) is approximately four times the investment [101], the indirect costs (the impact on human health and the environment) are estimated to total approximately $10,000 million per year in the United States [101].

Farmers usually apply these pesticides using traditional sprayers that distribute them uniformly over the complete field. Therefore, the aim of the RHEA Project was to support the farmers by reducing the amount of pesticides applied without reducing the effectiveness of the treatments. This objective was reached by applying the pesticides with high precision only where they are required. This solution not only prevents the pernicious effects of the pesticides but also drastically reduces the economic cost of the treatment.

The scope of RHEA covers a large variety of European crops, such as wide‐row crops (processing tomato and maize, among others), close‐row crops (winter wheat and winter barley) and forestry woody perennials (walnut trees, almond trees, olive groves and multi‐purpose open woodland).

The project is based on the cooperation between aerial and ground vehicles to perform precision agriculture tasks, namely weed removal and tree fumigation. A complete description of the system, as well as the results of the project, can be found in Ref. [9].

The ground units are based on small Case New Holland Industrial tractors with some modifications. In order to provide the system with the required autonomy, several kinds of sensors and actuators were integrated into the vehicles.

A high‐precision Global Navigation Satellite System (GNSS) for autonomous outdoor navigation was used to allow the control system to accurately steer the robots to work on wide‐row crops (with 0.75 m‐spaced rows). Additionally, a ground perception system was used to discriminate weeds from crops while travelling along the crop rows, as well as a real‐time tree canopy detection system.

The tractors were also endowed with three kinds of actuators, namely a patch sprayer aiming at reducing herbicide use by approximately 75%, a canopy sprayer to reduce the use of pesticide in canopy spraying by approximately 50% and a mechanical/thermal tool to destroy 90% of the detected weeds.

Additionally, the tractors were provided with communication equipment, a sustainable energy system and a safety system for human and animal detection. Figure 1 shows the ground units with the mentioned actuators.

Figure 1.

Ground units of RHEA project.

In order to plan the activities of the ground units, the RHEA concept includes the support of aerial imagery obtained by drones. Thus, the system used a fleet of last‐generation hexacopters provided by Air Robot, which offer high payload capacity and extraordinary stability (shown in Figure 2). These features allow taking steady pictures with high‐quality cameras in large open fields. As a result, high‐resolution images of the open field are obtained, in order to provide the ground units with the locations of the weeds to remove.

Figure 2.

Aerial units of RHEA project.

Usually, weeds are not uniformly spread over the fields but are located in patches. For this reason, the first step of the mission is to obtain high‐precision aerial images to locate the patches in the field. The locations of the patches allow defining optimal paths for the tractors. This task requires a thorough planning of the flights, which was our main task in this project. Several area coverage planning techniques were applied, taking into account spatial and temporal requirements, in order to generate optimal and safe flight plans for the drones. A complete description of the aerial mission can be found in Refs. [102, 103].

The drones were equipped with two high‐resolution cameras (4704 × 3136 pixels) mounted on a gimbal system. A multi‐spectral device including visible and near‐infrared (NIR) channels was selected to maximize the robustness under different light conditions. Moreover, in order to preserve the complete colour information, the solution couples two commercial still cameras, one of them modified to provide the NIR channel. The flights were performed at an altitude of 60 m to obtain a resolution of 1 cm per pixel, with each image covering approximately 40 × 30 m and overlapping 60% with the consecutive ones.
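
These figures are mutually consistent, as the following quick check shows: a 4704‐pixel‐wide image covering about 40 m on the ground yields roughly the quoted 1 cm per pixel, and a 60% forward overlap spaces the shots 12 m apart (assuming the 30 m side of the footprint lies along the flight track).

```python
image_px = (4704, 3136)     # camera resolution (width, height)
footprint_m = (40.0, 30.0)  # ground footprint at 60 m altitude (from the text)

gsd_cm = 100.0 * footprint_m[0] / image_px[0]      # ground sample distance
overlap = 0.60
shot_spacing_m = footprint_m[1] * (1.0 - overlap)  # distance between triggers

print(f"GSD = {gsd_cm:.2f} cm/pixel")                        # 0.85 cm, i.e. ~1 cm
print(f"one shot every {shot_spacing_m:.0f} m along track")  # 12 m
```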

RHEA provides the farmer with a ground station with a graphical user interface (GUI). This interface allows the farmer to create and launch the mission. Thus, the operator has to define the area by entering the points that limit the field.

Once the limits of the field have been established, the system creates a flight plan for each robot of the fleet. After testing several optimization techniques, back‐and‐forth motion patterns were applied for planning the fleet trajectories, in order to increase the situational awareness of the operator. Although the drones are able to autonomously perform take‐off and landing operations, the pilot needs to provide altitude commands to ensure a safe operation.

Later, a mosaicking procedure is performed: the colour and NIR images from the two cameras are joined in a unique four‐channel picture. An approach based on the Fourier‐Mellin (FM) transform was successfully developed and tested. This approach identifies rotation, translation and scale changes between images by means of Fourier spectrum analysis. To cope with large‐sized camera images, which imply non‐linear transformations, the original images are partitioned into a set of small image portions, where the FM identification process is executed iteratively. Then, a global homographic transformation model, including lens radial distortion, is computed. A registration accuracy of 0.3 pixels is obtained [17]. After obtaining a global image of the fields, the patch detection is performed.
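
The principle of the FM registration can be sketched compactly: rotation and scale become translations in the log-polar transform of the Fourier magnitude spectrum, so both steps reduce to phase correlation. The snippet below illustrates the principle only; it is not the RHEA implementation, and the sign conventions should be verified on known transforms.

```python
import cv2
import numpy as np

def fourier_mellin_rotation_scale(img_a, img_b):
    """Estimate rotation and scale between two grey images (Fourier-Mellin)."""
    h, w = img_a.shape
    win = cv2.createHanningWindow((w, h), cv2.CV_32F)  # reduce spectral leakage

    def log_polar_spectrum(img):
        f = np.fft.fftshift(np.fft.fft2(img * win))
        mag = np.log1p(np.abs(f)).astype(np.float32)
        return cv2.warpPolar(mag, (w, h), (w / 2, h / 2), w / 2,
                             cv2.WARP_POLAR_LOG)

    lp_a, lp_b = log_polar_spectrum(img_a), log_polar_spectrum(img_b)
    (dx, dy), _ = cv2.phaseCorrelate(lp_a, lp_b)
    rotation_deg = 360.0 * dy / h            # the angle axis spans 0..360 deg
    scale = np.exp(dx * np.log(w / 2) / w)   # the radial axis is logarithmic
    return rotation_deg, scale

# After undoing rotation and scale, the remaining translation follows from
# a second cv2.phaseCorrelate on the re-warped images.
```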

Detecting weeds when crop and weed plants are at early phenological stages is a challenge. The proposal of RHEA overcomes it by using high‐spatial‐resolution imagery and object‐based image analysis (OBIA), taking into account the relative position of the weeds to the crop lines, so that every plant that is not located on the crop row is considered a weed [104]. As a result, a weed patch map is created using a grid of 0.5 m. The size of this grid is customizable according to the requirements of the herbicide‐spraying machinery.
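
The row-based rule can be expressed directly: any vegetation lying farther from the nearest crop-row line than some margin is accumulated into the 0.5 m grid. In the sketch below, the plant coordinates, row spacing and margin are illustrative assumptions.

```python
def weed_patch_map(plants, row_spacing=0.75, row_margin=0.10, cell=0.5):
    """Mark 0.5 m grid cells containing off-row vegetation as weed patches.

    plants: (x, y) positions in field coordinates [m], with crop rows running
    along y at x = 0, row_spacing, 2 * row_spacing, ... (assumed geometry).
    """
    patches = set()
    for x, y in plants:
        dist_to_row = abs(x - round(x / row_spacing) * row_spacing)
        if dist_to_row > row_margin:           # not on a crop row -> weed
            patches.add((int(x // cell), int(y // cell)))
    return patches

plants = [(0.02, 3.1), (0.74, 5.0), (0.38, 4.2)]  # the last lies between rows
print(weed_patch_map(plants))                     # {(0, 8)}
```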

Later, the Ground Mission Planner is executed by the operator in the base station. It determines the configuration of the ground vehicles (number and type of vehicles), as well as the plan for each one to efficiently apply the treatment. Simulated annealing and basic genetic algorithms are used to find the optimal solution that minimizes either the task cost or the required time, whereas a non‐dominated sorting genetic algorithm (NSGA‐II) is employed as a proper approach for simultaneously minimizing both criteria [105].
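
A minimal simulated-annealing sketch over the order in which a vehicle visits the crop tracks is shown below; the cost function and cooling schedule are generic placeholders, not those of Ref. [105].

```python
import math
import random

def route_cost(order, tracks):
    """Total travel distance between consecutive track entry points."""
    return sum(math.dist(tracks[a], tracks[b]) for a, b in zip(order, order[1:]))

def anneal(tracks, iters=20000, t0=10.0, cooling=0.9995):
    """Generic simulated annealing over the track visiting order."""
    order = list(range(len(tracks)))
    cost, temp = route_cost(order, tracks), t0
    for _ in range(iters):
        i, j = sorted(random.sample(range(len(order)), 2))
        cand = order[:i] + order[i:j + 1][::-1] + order[j + 1:]  # 2-opt move
        c = route_cost(cand, tracks)
        # Accept improvements always, worse moves with decaying probability.
        if c < cost or random.random() < math.exp((cost - c) / temp):
            order, cost = cand, c
        temp *= cooling
    return order, cost

tracks = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(15)]
print(anneal(tracks))
```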

Once the optimal paths are defined, the ground mission starts. During the mission, the ground perception system detects weeds, both inter- and intra-row. This system is based on an SVS‐VISTEK camera connected to the high‐level decision‐making computers, which acquire the images and run the relevant vision algorithms [106]. The operation speed of the Unmanned Ground Vehicles (UGVs) was fixed at 0.83 m/s, and the region of interest (ROI) of the ground perception system was defined to be 3 m wide and 2 m long, located in front of the UGV. Weed detection relies on the spatial identification of crop rows. Thus, the determination of the crop‐row positions with respect to the UGV becomes a key task for weed patch detection and UGV guidance. The weed detection system also generates the orders for the sprayers, so that they activate precisely when the nozzle passes over the weeds.
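
The detection-to-actuation timing follows from the geometry: at 0.83 m/s, a weed detected at the far edge of the 2 m-long ROI reaches the sprayer about two seconds later. In the sketch below, the offset between the camera reference and the spray bar is an illustrative assumption.

```python
def spray_delay(detection_distance_m, speed_m_s=0.83, nozzle_offset_m=0.3):
    """Seconds between detecting a weed and opening the nozzle over it.

    Speed and ROI length come from the text; the nozzle offset is assumed.
    """
    return (detection_distance_m - nozzle_offset_m) / speed_m_s

# A weed at the far edge of the 2 m-long ROI:
print(f"{spray_delay(2.0):.2f} s lead time")  # about 2 s at 0.83 m/s
```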

4.2. Environmental monitoring in greenhouses

As mentioned above, one of the agricultural applications where robots can work is the environmental monitoring of greenhouses. This task cannot be performed by humans, because it requires continuous work under the harsh conditions of greenhouses. The alternatives to robots for this task are fixed sensors, which are not able to capture the spatial variability of the environmental conditions, and sensor networks, which cannot be moved to the points of interest during the operation.

The proposed solution is a multi‐robot system that measures the environmental variables and collects their spatial and temporal variability. This information is the key to controlling the conditions of the crops, which determine both the productivity of the greenhouse and the quality of its products. As shown in Figure 3, the multi‐robot system is split into small teams with ground and aerial units that work in specific areas. A base station controls the mission, coordinating the actions of the robots, as well as collecting and storing their measurements.

Figure 3.

Scheme of multi‐robot system for environmental monitoring of greenhouses.

The first work used a mini‐UAV to measure the air temperature, humidity, luminosity and carbon dioxide concentration [14]. The sensors shown in Table 3 were selected to measure these variables considering their size, weight, range, resolution and cost. These sensors were integrated by means of a Raspberry Pi computer, which collects the measurements and sends them to the base station via a Wi‐Fi network.

Variable | Sensor | Robot | Controller
Air temperature | RHT03 | UAV/UGV | Raspberry Pi
Ground temperature | MLX90614 | UGV | Arduino
Air humidity | RHT03 | UAV/UGV | Raspberry Pi
Ground humidity | SEN92355P | UGV | Arduino
Luminosity | TSL2561 | UAV/UGV | Raspberry Pi
CO2 concentration | MG811 | UAV/UGV | Raspberry Pi

Table 3.

Experience in environmental monitoring in greenhouses.
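
The integration described above can be pictured as a short telemetry loop on the onboard computer. The sketch below reads the RHT03 (a DHT22-class sensor) with the legacy Adafruit_DHT library and streams JSON over UDP; the wiring, addresses and message format are assumptions, not the project's actual protocol.

```python
import json
import socket
import time

import Adafruit_DHT  # legacy library for DHT22/RHT03-class sensors

SENSOR, PIN = Adafruit_DHT.DHT22, 4    # RHT03 on GPIO4 (assumed wiring)
BASE_STATION = ("192.168.1.10", 5005)  # hypothetical base-station address

def telemetry_loop(period_s=5.0):
    """Read air temperature/humidity and stream them to the base station."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        humidity, temperature = Adafruit_DHT.read_retry(SENSOR, PIN)
        if humidity is not None:  # the read can fail and return None
            msg = json.dumps({"t": time.time(),
                              "temp_c": temperature, "rh": humidity})
            sock.sendto(msg.encode(), BASE_STATION)
        time.sleep(period_s)

# telemetry_loop()  # run on the robot's onboard Raspberry Pi
```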

The main contributions of this work were the aerodynamic study of the quadcopter and the real experiments with the sensory system. The results validated the use of the quadcopter as a sensory platform and determined the optimal location of the sensors: the centre and top of the quadrotor frame. A subsequent work developed a chamber to measure the concentration of gases, minimizing the influence of the propellers and obtaining fewer errors and fluctuations [107].

The second work introduced a medium‐size UGV to measure the ground temperature and humidity [15]. Specifically, we used the non‐contact infrared temperature sensor and the contact conductivity humidity sensor listed in Table 3. Additionally, the work studied the path planning and path following strategies to cover the greenhouse in the minimum time.

The aerial and ground robots have different strengths and weaknesses. The aerial robots can move quickly and with agility through the corridors and reach any point in the 3D space. On the other hand, the ground robots have more autonomy to cover the greenhouse and more robustness to avoid accidents. Therefore, a heterogeneous team can take advantage of the potential of both types of robots [16].

The team strategy of the multi‐robot system is the following: the UGV carries the UAV on a platform while performing its own tasks and, when required, the UAV takes off, performs some tasks and lands back on the UGV. In this manner, the multi‐robot system can avoid the obstacles in the corridors, as well as find the sources of anomalous measurements.

The main future challenges of this research line are related to the navigation of the UAVs and the autonomy of both the UGV and the UAV. The navigation of UAVs in the greenhouse is a challenge because the scenario is enclosed and highly cluttered. The autonomy of both robots is required for continuous operation and needs better batteries and charging systems.

5. Conclusions

In recent years, robots have found their own place in agriculture. This chapter addresses the main fields of application (precision agriculture, greenhouse farming, and seeding and harvesting), analyzes the aerial, ground and special robots used in these applications, and describes two research projects related to precision agriculture and greenhouse farming. The main conclusions of these sections are summarized below.

Precision agriculture seeks to apply multiple technologies to acquire knowledge about the spatial and temporal variability of crops. Among other technologies, the use of aerial robots to build maps of the fields and detect weeds or irrigation deficits, and the application of ground robots to apply accurate treatments to the plants are worth highlighting. In addition, greenhouse agriculture has adopted robots for multiple tasks, such as the monitoring of environmental variables, which is important for the control of the conditions of the crops, and the watering and spraying of plants. Finally, although the use of robots for seeding and harvesting is at an earlier stage of development, a series of techniques of perception, positioning and grasping have been developed.

The most widely used robots in agriculture are UAVs and UGVs. Aerial robots are usually applied to acquire information about the fields by taking advantage of the altitude. Although the first agricultural aerial robots were fixed‐wing UAVs, nowadays the multi‐rotors are more popular due to their flexibility. On the other hand, ground robots are usually used to act on the crops. The most common configurations are wheeled and caterpillar robots, and some of the most relevant issues are the localization and navigation algorithms. Nevertheless, other designs such as spherical or bio‐inspired robots are gaining interest, as well as the use of multi‐robot systems, which can go further than single‐robot ones.

Finally, the chapter summarizes two different projects that address the application of robots in agriculture. The RHEA project uses aerial robots to locate weeds within the fields and ground robots to apply localized treatments on them. The other project introduces ground and aerial robots in greenhouses to take measurements of environmental variables. A series of lessons have been learned from these experiences, such as the potential of robots as moving sensory and actuation systems, the difficulties of navigation in unstructured scenarios, the power of cooperation with heterogeneous fleets and the limitations imposed by the autonomy of robots.

Acknowledgments

The research leading to these results has received funding from the RoboCity2030‐III‐CM project (Robótica aplicada a la mejora de la calidad de vida de los ciudadanos. Fase III; S2013/MIT‐2748), funded by Programas de Actividades I+D en la Comunidad de Madrid and the Structural Funds of the EU, from the DPI2014‐56985‐R project (Protección robotizada de infraestructuras críticas) funded by the Ministerio de Economía y Competitividad of Gobierno de España, and from the NMPCP‐IP 245986‐2 RHEA project (Robot Fleets for Highly Effective Agriculture and Forestry Management) sponsored by the European Commission’s Seventh Framework Programme.

References

  1. 1. Pierpaoli E, Carli G, Pignatti E, Canavari M. Drivers of precision agriculture technologies adoption: A literature review. Procedia Technology. 2013;8:61-69
  2. 2. McBratney A, Whelan B, Ancev T, Bouma J. Future directions of precision agriculture. Precision Agriculture. 2005;6(1):7-23
  3. 3. Blackmore S, Griepentrog HW. A future view of precision farming. In: Berger D, et al., editor. Proceedings of the PreAgro Precision Agriculture Conference. Muncheberg, Germany: Center for Agricultural Landscape and Land Use Research (ZALF); 2002. pp. 131-145
  4. 4. Tellaeche A, BurgosArtizzu XP, Pajares G, Ribeiro A, Fernández‐Quintanilla C. A new vision‐based approach to differential spraying in precision agriculture. Computers and Electronics in Agriculture. 2008;60(2):144-155
  5. 5. Rovira‐Más F, Zhang Q, Reid JF. Stereo vision three‐dimensional terrain maps for precision agriculture. Computers and Electronics in Agriculture. 2008;60(2):133-143
  6. 6. Cheein FA, Steiner G, Paina GP, Carelli R. Optimized EIF‐SLAM algorithm for precision agriculture mapping based on stems detection. Computers and Electronics in Agriculture. 2011;78(2):195-207
  7. 7. Nielsen SH, Jensen K, Bøgild A, Jørgensen OJ, Jacobsen NJ, Jæger‐Hansen CL, Jørgensen RN. A low cost, modular robotics tool carrier for precision agriculture research. In 11th International Conference on Precision Agriculture; 15‐18 July 2012; Indianapolis, United States. International Society of Precision Agriculture; 2012
  8. 8. Valente J, Del Cerro J, Barrientos A, Sanz D. Aerial coverage optimization in precision agriculture management: A musical harmony inspired approach. Computers and Electronics in Agriculture. 2013;99:153-159
  9. 9. Gonzalez‐de‐Santos P, Ribeiro A, Fernandez‐Quintanilla C, Lopez‐Granados F, Brandstoetter M, Tomic S, et al. Fleets of robots for environmentally‐safe pest control in agriculture. Precision Agriculture. 2016;1‐41
  10. 10. Martínez M, Blasco X, Herrero JM, Ramos C, Sanchis J. Monitorización y control de procesos. una visión teórico‐práctica aplicada a invernaderos. RIAII. 2005;2(4):5-24
  11. 11. Pawlowski A, Guzman JL, Rodríguez F, Berenguel M, Sánchez J, Dormido S. Simulation of greenhouse climate monitoring and control with wireless sensor network and event‐based control. Sensors. 2009;9(1):232-252
  12. 12. Pahuja R, Verma HK, Uddin M. A wireless sensor network for greenhouse climate control. IEEE Pervasive Computing. 2013;12(2):49-58
  13. 13. Cama‐Pinto A, Gil‐Montoya F, Gómez‐López J, García‐Cruz A, Manzano‐Agugliaro F. Wireless surveillance system for greenhouse crops. Dyna. 2014;81(184):164-170
  14. 14. Roldán JJ, Joossen G, Sanz D, del Cerro J, Barrientos A. Mini‐UAV based sensory system for measuring environmental variables in greenhouses. Sensors. 2015;15(2):3334-3350
  15. 15. Ruiz‐Larrea A, Roldán JJ, Garzón M, del Cerro J, Barrientos A. A UGV approach to measure the ground properties of greenhouses. In: Robot 2015: Second Iberian Robotics Conference. Springer International Publishing; 2016. pp. 3-13
  16. 16. Roldán JJ, Garcia‐Aunon P, Garzón M, de León J, del Cerro J, Barrientos A. Heterogeneous Multi‐Robot system for mapping environmental variables of greenhouses. Sensors. 2016;16(7):1018
  17. 17. Zeng S, Hu H, Xu L, Li G. Nonlinear adaptive PID control for greenhouse environment based on RBF network. Sensors. 2012;12(5):5328-5348
  18. 18. Van Henten EJ. Greenhouse climate management: An optimal control approach (No. 631.34 H4). 1994
  19. 19. Fourati F, Chtourou M. A greenhouse control with feed‐forward and recurrent neural networks. Simulation Modelling Practice and Theory. 2007;15(8):1016-1028
  20. 20. Rodríguez F, Berenguel M, Guzmán JL, Ramírez‐Arias A. Modeling and Control of Greenhouse Crop Growth. London, UK: Springer; 2015
  21. 21. Sethi VP, Sharma SK. Survey of cooling technologies for worldwide agricultural greenhouse applications. Solar Energy. 2007;81(12):1447-1459
  22. 22. Sethi VP, Sharma SK. Survey and evaluation of heating technologies for worldwide agricultural greenhouse applications. Solar Energy. 2008;82(9):832-859
  23. 23. Arbel A, Barak M, Shklyar A. Combination of forced ventilation and fogging systems for cooling greenhouses. Biosystems Engineering. 2003;84(1):45-55
  24. 24. Zhang Z, Liu L, Zhang M, Zhang Y, Wang Q. Effect of carbon dioxide enrichment on health‐promoting compounds and organoleptic properties of tomato fruits grown in greenhouse. Food Chemistry. 2014;153:157-163
  25. 25. Both AJ, Benjamin L, Franklin J, Holroyd G, Incoll LD, Lefsrud MG, Pitkin G. Guidelines for measuring and reporting environmental parameters for experiments in greenhouses. Plant Methods. 2015;11(1):1
  26. 26. Slaughter DC, Giles DK, Downey D. Autonomous robotic weed control systems: A review. Computers and Electronics in Agriculture. 2008;61(1):63-78
  27. 27. Chung BK, Xia C, Song YH, Lee JM, Li Y, Kim H, Chon TS. Sampling of Bemisia tabaci adults using a pre‐programmed autonomous pest control robot. Journal of Asia‐Pacific Entomology. 2014;17(4):737-743
  28. 28. Fang Y, Ramasamy RP. Current and prospective methods for plant disease detection. Biosensors. 2015;5(3):537-561
  29. 29. Li Y, Xia C, Lee J. Detection of small‐sized insect pest in greenhouses based on multifractal analysis. Optik‐International Journal for Light and Electron Optics. 2015;126(19):2138-2143
  30. 30. Xia C, Wang L, Chung BK, Lee JM. In situ 3d segmentation of individual plant leaves using a rgb‐d camera for agricultural automation. Sensors. 2015;15(8):20463-20479
  31. 31. Sammons PJ, Furukawa T, Bulgin A. Autonomous pesticide spraying robot for use in a greenhouse. In: Proceedings of the Australian Conference on Robotics and Automation; 5 December 2005; Sydney, Australia. 2005. pp. 1‐9
  32. 32. Belforte G, Deboli R, Gay P, Piccarolo P, Aimonino DR. Robot design and testing for greenhouse applications. Biosystems Engineering. 2006;95:309-321
  33. 33. Ko MH, Ryuh BS, Kim KC, Suprem A, Mahalik NP. Autonomous greenhouse mobile robot driving strategies from system integration perspective: Review and application. IEEE/ASME Transactions on Mechatronics. 2015;20:1705-1716
  34. 34. Bac CW, Henten EJ, Hemming J, Edan Y. Harvesting robots for High-value crops: State-of-the-art review and challenges ahead. Journal of Field Robotics. 2014;31(6):888-911
  35. 35. Bachche S. Deliberation on design strategies of automatic harvesting systems: A survey. Robotics. 2015;4(2):194-222
  36. 36. Zujevs A, Osadcuks V, Ahrendt P. Trends in robotic sensor technologies for fruit harvesting: 2010‐2015. Procedia Computer Science. 2015;77:227-233
  37. 37. Onwude DI, Abdulstter R, Gomes C, Hashim N. Mechanisation of large-scale agricultural fields in developing countries—A review. Journal of the Science of Food and Agriculture, 2016, vol. 96, no 12, p. 3969-3976
  38. 38. Kester C, Griepentrog HW, Hörner R, Tuncer Z. A survey of future farm automation—A descriptive analysis of survey responses. In: Precision Agriculture’13. Wageningen Academic Publisher, Wageningen, Netherlands, 2013
  39. 39. Hayashi S, Shigematsu K, Yamamoto S, Kobayashi K, Kohno Y, Kamata J, Kurita M. Evaluation of a strawberry‐harvesting robot in a field test. Biosystems Engineering. 2010;105(2):160-171
  40. 40. Foglia MM, Reina G. Agricultural robot for radicchio harvesting. Journal of Field Robotics. 2006;23(6-7):363-377
  41. 41. Baeten J, Donné K, Boedrij S, Beckers W, Claesen E. Autonomous fruit picking machine: A robotic apple harvester. In: Field and Service Robotics. Berlin Heidelberg: Springer, Berlin, Germany, 2008, pp. 531-539
  42. 42. Reed JN, Miles SJ, Butler J, Baldwin M, Noble R. AE—Automation and emerging technologies: Automatic mushroom harvester development. Journal of Agricultural Engineering Research. 2001;78(1):15-23
  43. 43. De‐An Z, Jidong L, Wei J, Ying Z, Yu C. Design and control of an apple harvesting robot. Biosystems Engineering. 2011;110(2):112-122
  44. 44. Tanigaki K, Fujiura T, Akase A, Imagawa J. Cherry‐harvesting robot. Computers and Electronics in Agriculture. 2008;63(1):65-72
  45. 45. Van Henten EJ, Van Tuijl BV, Hemming J, Kornet JG, Bontsema J, Van Os EA. Field test of an autonomous cucumber picking robot. Biosystems Engineering. 2003;86(3):305-313
  46. 46. Zanlorensi LA, Araújo VM, Guimarães AM. Automatic control and robotics for greenhouses: A review on heating technologies. Ibero|American Journal of Applied Computing. 2016;4(3):21-28
  47. 47. Zhao Y, Gong L, Huang Y, Liu C. A review of key techniques of vision‐based control for harvesting robot. Computers and Electronics in Agriculture. 2016;127:311-323
  48. 48. Edan Y, Rogozin D, Flash T, Miles GE. Robotic melon harvesting. IEEE Transactions on Robotics and Automation. 2000;16(6):831-835
  49. 49. Feng G, Qixin C, Masateru N. Fruit detachment and classification method for strawberry harvesting robot. International Journal of Advanced Robotic Systems. 2008;5(1):41-48
  50. 50. Muscato G, Prestifilippo M, Abbate N, Rizzuto I. A prototype of an orange picking robot: Past history, the new robot and experimental results. Industrial Robot: An International Journal. 2005;32(2):128-138
  51. 51. Plebe A, Grasso G. Localization of spherical fruits for robotic harvesting. Machine Vision and Applications. 2001;13(2):70-79
  52. 52. Rath T, Kawollek M. Robotic harvesting of Gerbera Jamesonii based on detection and three‐dimensional modeling of cut flower pedicels. Computers and Electronics in Agriculture. 2009;66(1):85-92
  53. 53. Han KS, Kim SC, Lee YB, Kim SC, Im DH, Choi HK, Hwang H. Strawberry harvesting robot for bench‐type cultivation. Journal of Biosystems Engineering. 2012;37(1):65-74
  54. 54. Bechar A, Vigneault C. Agricultural robots for field operations. Part 2: Operations and systems. Biosystems Engineering. 2017;153:110-128
  55. 55. Vasigh DJB. The Economic Impact of Unmanned Aircraft Systems Integration in the United States. Arlington, VA: Association for Unmanned Vehicle Systems International; 2013
  56. 56. Mulla DJ. Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosystems Engineering. 2013;114(4):358-371
  57. 57. Torres‐Sánchez J, López‐Granados F, Serrano N, Arquero O, Peña JM. High‐throughput 3‐D monitoring of agricultural‐tree plantations with unmanned aerial vehicle (UAV) technology. PLoS One. 2015;10(6):e0130479
  58. 58. Gago J, Douthe C, Coopman RE, Gallego PP, Ribas‐Carbo M, Flexas J, Escalona J, Medrano H. UAVs challenge to assess water stress for sustainable agriculture. Agricultural Water Management. 2015;153:9-19
  59. 59. Zarco‐Tejada PJ, González‐Dugo V, Berni JAJ. Fluorescence, temperature and narrow‐band indices acquired from a UAV platform for water stress detection using a micro‐hyperspectral imager and a thermal camera. Remote Sensing of Environment. 2012;117:322-337
  60. 60. Bellvert J, Zarco‐Tejada PJ, Girona J, Fereres E. Mapping crop water stress index in a pinot‐noir vineyard: Comparing ground measurements with thermal remote sensing imagery from an unmanned aerial vehicle. Precision Agriculture. 2014;15(4):361-376
  61. 61. Baluja J, Diago MP, Balda P, Zorer R, Meggio F, Morales F, Tardaguila J. Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV). Irrigation Science. 2012;30(6):511-522
  62. 62. Suárez L, Zarco‐Tejada PJ, González‐Dugo V, Berni JAJ, Sagardoy R, Morales F, Fereres E. Detecting water stress effects on fruit quality in orchards with time‐series pri airborne imagery. Remote Sensing of Environment. 2010;114(2):286-298
  63. 63. Bendig J, Bolten A, Bennertz S, Broscheit J, Eichfuss S, Bareth G. Estimating biomass of barley using crop surface models (csms) derived from uav‐based rgb imaging. Remote Sensing. 2014;6(11):10395-10412
  64. 64. Schirrmann M, Hamdorf A, Garz A, Ustyuzhanin A, Dammer KH. Estimating wheat biomass by combining image clustering with crop height. Computers and Electronics in Agriculture. 2016;121:374-384
  65. 65. Duan S‐B, Li Z‐L, Wu H, Tang B‐H, Ma L, Zhao E, Li C. Inversion of the prosail model to estimate leaf area index of maize, potato, and sun flower yields from unmanned aerial vehicle hyperspectral data. International Journal of Applied Earth Observation and Geoinformation. 2014;26:12-20
66. Mathews AJ, Jensen JLR. Visualizing and quantifying vineyard canopy LAI using an unmanned aerial vehicle (UAV) collected high density structure from motion point cloud. Remote Sensing. 2013;5(5):2164-2183
67. Verger A, Vigneau N, Chéron C, Gilliot JM, Comar A, Baret F. Green area index from an unmanned aerial system over wheat and rapeseed crops. Remote Sensing of Environment. 2014;152:654-664
68. Garcia-Ruiz F, Sankaran S, Maja JM, Lee JS, Rasmussen J, Ehsani R. Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Computers and Electronics in Agriculture. 2013;91:106-115
69. Sarkar SK, Das I, Ehsani R, Kumar V. Towards autonomous phytopathology: Outcomes and challenges of citrus greening disease detection through close-range remote sensing. In: 2016 IEEE International Conference on Robotics and Automation (ICRA); 16-21 May 2016; Stockholm, Sweden. IEEE; 2016. pp. 5143-5148
70. Calderón R, Navas-Cortés JA, Lucena C, Zarco-Tejada PJ. High-resolution airborne hyperspectral and thermal imagery for early detection of Verticillium wilt of olive using fluorescence, temperature and narrow-band spectral indices. Remote Sensing of Environment. 2013;139:231-245
71. De Castro AI, Ehsani R, Ploetz RC, Crane JH, Buchanon S. Detection of laurel wilt disease in avocado using low altitude aerial imaging. PLoS One. 2015;10(4):e0124642
72. Aylor DE, Schmale DG, Shields EJ, Newcomb M, Nappo CJ. Tracking the potato late blight pathogen in the atmosphere using unmanned aerial vehicles and Lagrangian modeling. Agricultural and Forest Meteorology. 2011;151(2):251-260
73. Di Gennaro SF, Battiston E, Di Marco S, Facini O, Matese A, Nocentini M, Palliotti A, Mugnai L. Unmanned aerial vehicle (UAV)-based remote sensing to monitor grapevine leaf stripe disease within a vineyard affected by esca complex. Phytopathologia Mediterranea. 2016;55(2):262
74. Giles DK. Use of remotely piloted aircraft for pesticide applications: Issues and outlook. Outlooks on Pest Management. 2016;27(5):213-216
75. Faiçal BS, Costa FG, Pessin G, Ueyama J, Freitas H, Colombo A, Fini PH, Villas L, Osório FS, Vargas PA, et al. The use of unmanned aerial vehicles and wireless sensor networks for spraying pesticides. Journal of Systems Architecture. 2014;60(4):393-404
76. Qin W-C, Qiu B-J, Xue X-Y, Chen C, Xu Z-F, Zhou Q-Q. Droplet deposition and control effect of insecticides sprayed with an unmanned aerial vehicle against plant hoppers. Crop Protection. 2016;85:79-88
77. Antonelli MG, Auriti L, Beomonte Zobel P, Raparelli T. Development of a new harvesting module for saffron flower detachment. Romanian Review Precision Mechanics, Optics and Mechatronics. 2011;39:163-168
78. Chatzimichali AP, Georgilas IP, Tourassis VD. Design of an advanced prototype robot for white asparagus harvesting. In: 2009 IEEE/ASME International Conference on Advanced Intelligent Mechatronics; 14-17 July 2009; Singapore. IEEE; 2009. pp. 887-892
79. Aljanobi AA, Al-Hamed SA, Al-Suhaibani SA. A setup of mobile robotic unit for fruit harvesting. In: 19th International Workshop on Robotics in Alpe-Adria-Danube Region (RAAD 2010); 24-26 June 2010; Budapest, Hungary. IEEE; 2010
80. Kitamura S, Oka K. Recognition and cutting system of sweet pepper for picking robot in greenhouse horticulture. In: IEEE International Conference on Mechatronics and Automation; 29 July-1 August 2005; Niagara Falls, ON, Canada. IEEE; 2005. Vol. 4. pp. 1807-1812
81. Qingchun F, Wengang Z, Quan Q, Kai J, Rui G. Study on strawberry robotic harvesting system. In: 2012 IEEE International Conference on Computer Science and Automation Engineering (CSAE); 25-27 May 2012; Zhangjiajie, China. IEEE; 2012. Vol. 1. pp. 320-324
82. Sakai S, Iida M, Osuka K, Umeda M. Design and control of a heavy material handling manipulator for agricultural robots. Autonomous Robots. 2008;25(3):189-204
83. Ceres R, Pons JL, Jimenez AR, Martin JM, Calderon L. Design and implementation of an aided fruit-harvesting robot (Agribot). Industrial Robot: An International Journal. 1998;25(5):337-346
84. Hwang H, Kim SC. Development of multi-functional tele-operative modular robotic system for greenhouse watermelon. In: Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003); 20-24 July 2003; Kobe, Japan. IEEE; 2003. Vol. 2. pp. 1344-1349
85. Irie N, Taguchi N, Horie T, Ishimatsu T. Asparagus harvesting robot coordinated with 3-D vision sensor. In: 2009 IEEE International Conference on Industrial Technology (ICIT); 10-13 February 2009; Gippsland, VIC, Australia. IEEE; 2009. pp. 1-6
86. Kondo N, Yamamoto K, Yata K, Kurita M. A machine vision for tomato cluster harvesting robot. In: 2008 ASABE Annual International Meeting; 29 June-2 July 2008; Providence, Rhode Island. American Society of Agricultural and Biological Engineers; 2008. p. 1
87. Hayashi S, Ganno K, Ishii Y, Tanaka I. Robotic harvesting system for eggplants. Japan Agricultural Research Quarterly. 2002;36(3):163-168
88. Liu TH, Zeng XR, Ke ZH. Design and prototyping a harvester for litchi picking. In: 2011 International Conference on Intelligent Computation Technology and Automation (ICICTA); 28-29 March 2011; Shenzhen, China. IEEE; 2011. Vol. 2. pp. 39-42
89. Ismail W, Ishak W, Kit WH, Awal M. Design and development of eggplant harvester for gantry system. Pertanika Journal of Science & Technology. 2010;18(2):231-242
90. Arima S, Kondo N. Cucumber harvesting robot and plant training system. Journal of Robotics and Mechatronics. 1999;11:208-212
91. Bonadies S, Lefcourt A, Gadsden SA. A survey of unmanned ground vehicles with applications to agricultural and environmental sensing. In: SPIE Commercial + Scientific Sensing and Imaging. International Society for Optics and Photonics; 2016. p. 98660Q
92. Bechar A, Vigneault C. Agricultural robots for field operations: Concepts and components. Biosystems Engineering. 2016;149:94-111
93. Zhan Q, Cai Y, Liu Z. Near-optimal trajectory planning of a spherical mobile robot for environment exploration. In: IEEE Conference on Robotics, Automation and Mechatronics; 21-24 September 2008; Chengdu, China. IEEE; 2008. pp. 84-89
94. Hernández JD, Sanz D, Rodríguez-Canosa GR, Barrientos J, del Cerro J, Barrientos A. Sensorized robotic sphere for large exterior critical infrastructures supervision. Journal of Applied Remote Sensing. 2013;7(1):073522
95. Hernández JD, Barrientos J, del Cerro J, Barrientos A, Sanz D. Moisture measurement in crops using spherical robots. Industrial Robot: An International Journal. 2013;40(1):59-66
96. Dorhout D. Prospero, The Robot Farmer. Available from: http://www.dorhoutrd.com/prospero_robot_farmer [Accessed: January 2017]
97. Wood R. RoboBees Project. Cambridge, MA: Harvard University; 2015
98. Avellar GS, Pereira GA, Pimenta LC, Iscold P. Multi-UAV routing for area coverage and remote sensing with minimum time. Sensors. 2015;15(11):27783-27803
99. Chao H, Baumann M, Jensen A, Chen Y, Cao Y, Ren W, McKee M. Band-reconfigurable multi-UAV-based cooperative remote sensing for real-time water management and distributed irrigation control. IFAC Proceedings Volumes. 2008;41(2):11744-11749
100. Pesticide Sales Statistics, by Major Groups in 2014. Eurostat Statistics Explained. Available from: http://ec.europa.eu/eurostat/statistics-explained/index.php/Pesticide_sales_statistics [Accessed: December 2016]
101. Pimentel D. Environmental and economic costs of the application of pesticides primarily in the United States. Environment, Development and Sustainability. 2005;7:229-252
102. Del Cerro J, Barrientos A, Sanz D, Valente J. Aerial fleet in RHEA project: A high vantage point contributions to ROBOT2013. In: ROBOT2013: First Iberian Robotics Conference. Springer International Publishing; 2014. pp. 457-468
103. Rabatel G, Labbé S. Registration of visible and near infrared unmanned aerial vehicle images based on Fourier-Mellin transform. Precision Agriculture. 2016;17(5):564-587
104. López-Granados F. Weed detection for site-specific weed management: Mapping and real-time approaches. Weed Research. 2011;51(1):1-11
105. Conesa-Muñoz J, Ribeiro A, Andujar D, Fernandez-Quintanilla C, Dorado J. Multi-path planning based on a NSGA-II for a fleet of robots to work on agricultural tasks. In: Abbass H, Essam D, Sarker R, editors. 2012 IEEE World Congress on Computational Intelligence, Congress on Evolutionary Computation (CEC). Red Hook, NY: Curran Associates Inc; 2012. pp. 2236-2243
106. Romeo J, Guerrero JM, Montalvo M, Emmi L, Guijarro M, Gonzalez-de-Santos P, et al. Camera sensor arrangement for crop/weeds detection accuracy in agronomic images. Sensors. 2013;13:4348-4366
107. Xiaoyuan Y, Jiwei D, Tianjie Y, Qingfu Q. A method for improving detection of gas concentrations using quadrotor. In: IEEE Information Technology, Networking, Electronic and Automation Control Conference; 20-22 May 2016; Chongqing, China. IEEE; 2016. pp. 971-975
