Open access peer-reviewed chapter

Cloud Robotics and Autonomous Vehicles

Written By

Khuram Shahzad

Submitted: 15 November 2015 Reviewed: 04 May 2016 Published: 07 September 2016

DOI: 10.5772/64064

From the Edited Volume

Autonomous Vehicle

Edited by Andrzej Zak

Abstract

Recently, a considerable amount of research has focused on the development of autonomous vehicles. Autonomous vehicles possess great potential in numerous challenging applications, for example, autonomous armoured fighting vehicles and automated highway systems. To enable the use of autonomous vehicles in such challenging applications, it is important to ensure the safety, efficiency, reliability and robustness of the system. Most existing implementations of autonomous vehicles operate as standalone systems limited to onboard capabilities (computation, memory, data, etc.), which limits their potential and performance in real-world applications. The advent of the Internet and emerging advances in cloud infrastructure suggest new methodologies in which vehicles are not limited to onboard capabilities: processing is also performed remotely on the cloud to support different operations and to increase the proficiency of decision-making. This chapter surveys the research to date on the evolution of autonomous vehicles, the cloud and cloud-enabled autonomous vehicles, along with the limitations of existing systems, research challenges and possible future directions. The chapter can help new researchers in the field to understand and evaluate different approaches for the design of autonomous vehicular systems.

Keywords

  • cloud robotics
  • cloud computing
  • big data
  • crowdsourcing
  • open-source
  • open-access
  • vehicular cloud

1. Introduction

Advancements in the vehicular industry have immensely benefited many related industries and served humanity by increasing the efficiency of routine activities. Take the example of the agriculture industry: one can cultivate a piece of land far more quickly (using tractors and other equipment) than was possible 100 years ago. The same applies to the transportation industry: nowadays, one can travel from one point to another far more quickly than a few decades ago. However, the vehicular industry still requires further advancements to reduce or eliminate human error and involvement. In the transportation industry, in the USA alone, motor-vehicle-traffic-related injuries result in around 34,000 deaths every year, and they are the leading cause of death for people aged between 4 and 34 [1]. Worldwide, over 1.20 million people die every year as a result of road traffic accidents, and between 20 and 50 million more suffer non-fatal injuries, including physical disabilities [2]. More than 90% of these accidents are caused by human error [2]. Hence, a need arises for technology that always pays attention, never gets distracted and requires no or minimal human involvement. Such goals can be achieved through autonomous vehicles.

Figure 1.

Different types of vehicles used in real-life applications for travelling in different mediums such as land, water, air and space.

Vehicles can be broadly categorized into on-road, off-road, water, aerial, space and amphibious vehicles. Figure 1(a)–(f) shows examples of these different types of vehicles from real life. On-road vehicles require a paved or gravel surface for driving; they include sports cars, passenger cars, pickup trucks and school/passenger buses. Off-road vehicles are capable of driving both on and off the paved surface. These vehicles are generally characterized by large deep-tread tires or caterpillar tracks, flexible suspension and open treads; tractors, bulldozers, tanks and 4WD army trucks are examples of this type. Both on-road and off-road vehicles require land as their medium of travel. Water vehicles are capable of travelling on water, under water, or both; ships, boats and submarines are examples of such vehicles. Aerial vehicles are capable of flying by gaining support from the air; aeroplanes, helicopters and drones are examples of aerial vehicles. Space vehicles are capable of travelling in outer space and are used to carry payloads such as humans or satellites between the Earth and outer space. These are rocket-powered vehicles, which also require an oxidizer to operate in the vacuum of space; spacecraft and rockets are examples of space vehicles. Amphibious vehicles inherit the characteristics of multiple mediums of travel (land, water, air or space) and can travel in those mediums efficiently. The AeroMobil 3.0 and the LARC-V (Lighter, Amphibious Resupply, Cargo, 5 ton) are examples of amphibious vehicles.

Automation of these different types of vehicles can increase the safety, reliability, robustness and efficiency of the systems through standardization of procedural operations with minimal human intervention. With advancements in technology, autonomous vehicles have recently moved to the forefront of public interest and active discussion. Based on a recent survey conducted in the USA, the UK and Australia, 56.8% of people had a positive opinion, 29.4% had a neutral opinion and only 13.8% had a negative opinion about autonomous or self-driving vehicles [3]. These statistics give a good picture of the general public’s interest in autonomous vehicles; however, respondents did express high levels of concern regarding safety, privacy and performance. There are generally five levels of autonomous or self-driving vehicles, ranging from Level 0 to Level 4 [4]. Brief descriptions of these levels are as follows (taken from [4]):

  • Level 0 means no automation.

  • Level 1 achieves critical function-specific automation.

  • Level 2 achieves combined functions automation by coordinating two or more Level 1 functions.

  • Level 3 provides limited self-driven automation.

  • Level 4 provides full self-driving automation; the vehicle controls all its operations by itself, with no driver required.

These levels of automation were drafted for on-road vehicles; however, they can be extended to the other types of vehicles. Based on the same survey conducted in the USA, the UK and Australia, 29.9% of people were very concerned, 30.5% were moderately concerned, 27.5% were slightly concerned and only 12.1% were not at all concerned about riding in a Level-4 autonomous vehicle [3]. These statistics show that major efforts are still required before autonomous vehicles are usable and acceptable in real-world practical applications.

The autonomous vehicle is an active area of research with numerous challenging applications. The earlier implementations of autonomous vehicles and other autonomous systems were standalone; indeed, most existing implementations still operate independently [5]. In such standalone implementations, the system is limited to onboard capabilities such as memory, computation, data and programs, and the vehicles can neither interact with each other nor access each other’s information or information about their surroundings [5]. To achieve Level-4 autonomous vehicles and self-driven automation in other robotic systems, it is important to overcome these limitations and go beyond the onboard capabilities of such systems. With the advent of the Internet and emerging advances in the cloud robotics paradigm, new approaches have been enabled in which systems are not limited to onboard capabilities and processing is also performed remotely on the cloud to support different operations. The cloud-based implementation of different types of vehicles is illustrated in Figure 2. Cloud-based infrastructure has the potential to enable a wide range of applications and new paradigms in robotics and automation systems. The autonomous vehicle is one example of a system that can benefit greatly from cloud infrastructure and overcome the limitations posed by standalone implementations. Cloud infrastructure enables ubiquitous, convenient and on-demand network access to a shared pool of configurable computing resources [6]. These computing resources can include services, storage, servers, networks and applications, and they can be rapidly provisioned and released with minimal service provider interaction or management effort [6]. The cloud model is characterized by five essential characteristics (on-demand self-service, broad network access, resource pooling, rapid elasticity and measured service), three service models (Software as a Service, Platform as a Service and Infrastructure as a Service, abbreviated as SaaS, PaaS and IaaS, respectively) and four deployment models (private, community, public and hybrid clouds); see Ref. [6] for more detail.

Figure 2.

Example of a vehicular cloud where vehicles act as nodes to access a shared pool of computing resources, including services, storage, servers, networks and applications.

An example of a cloud-based implementation is the online document-processing facility offered by Microsoft through Office 365 and OneDrive. One can create, edit and share MS Office documents (Word, Excel, PowerPoint, etc.) online without installing MS Office locally. The documents’ data and the software installations reside on remote cloud servers, which can be accessed through the Internet. These servers share their computing capabilities, such as processors, storage and memory. Cloud-based implementations provide economies of scale and take care of software and hardware updates. The infrastructure also facilitates backup of data as well as sharing of resources across different applications and users.

Cloud-enabled robots and autonomous systems are not limited to onboard capabilities and rely on data from a cloud network to support their different operations. In 2010, James Kuffner explained the potential benefits of cloud-enabled robots and coined the term “Cloud Robotics” [7]. The vehicular industry is rapidly evolving; nowadays, vehicles are equipped with a range of sensors and cloud connections, which provide drivers with desired information, for example, weather forecasts, GPS location, traffic and road conditions, and directions and travel times for different alternative paths and speeds. Through cloud robotics, we can develop a network of autonomous vehicles (such as a vehicular cloud or Internet of vehicles) in which autonomous vehicles collaborate with each other and perform computing activities that they cannot perform locally, in order to achieve their well-defined utility functions (e.g. timely delivery of passengers or payload, safety, environmental friendliness, etc.). Google’s self-driving car demonstrates a cloud-robotics-based implementation of the autonomous vehicle; it uses cloud services for accurate localization and manoeuvring. Google has tested its autonomous vehicle deployment on different types of cars, including the Audi TT, Lexus RX450h, Toyota Prius and its own custom vehicle [8, 9]. As of March 2016, Google had tested its autonomous self-driven vehicles over a total of 1,498,214 miles (2,411,142 km) [10]. The project is limited to an on-road implementation of the autonomous vehicle, and it has many limitations that need to be addressed before it can be released to the public. These limitations include driving in heavy rain or snowy weather, driving on unmapped intersections or routes, unnecessary veering due to difficulty in object identification, and the inability of the LIDAR technology to recognize certain signals (e.g. a police officer signalling the car to stop) [11, 12].

The rest of the chapter is organized as follows. In Section 2, we present the historical background of the evolution of autonomous vehicles, cloud computing and cloud-enabled autonomous vehicles; we also present different high-level architectures of autonomous vehicles and cloud-enabled autonomous vehicles proposed in the literature. In Section 3, we discuss five potential benefits of cloud-enabled autonomous vehicles, namely cloud computing, big data, open-source/open-access, system learning and crowdsourcing. Section 4 describes active research challenges and possible future directions in the field, and the conclusion appears in Section 5.

2. Historical backgrounds

The autonomous vehicle is an active area of research with a rich history. Research on autonomous driving performed in the early 1980s and 1990s demonstrated the possibility of developing vehicles that can control their movements in complex environments [13, 14]. Initial prototypes of autonomous vehicles were limited to indoor use [15–17]. The first study of a vision-based road vehicle was performed at the Mechanical Engineering Laboratory, Tsukuba, Japan, using vertical stereo cameras [18]. In the early 1980s, a completely onboard autonomous vision system for vehicles was developed and deployed with digital microprocessors [13, 19]. The first milestone towards road vehicles with machine vision was accomplished in the late 1980s, with fully autonomous longitudinal and lateral demonstrations of UBM’s (Universität der Bundeswehr München) computer-vision-equipped test vehicle VaMoRs on a free stretch of Autobahn over 20 km at 96 km/h [13]. These encouraging results led to the inclusion of computer vision for vehicles in the European EUREKA project “Prometheus” (1987–1994) and spurred European car manufacturers and universities to conduct research in the field [13]. The major focus of these developments was to demonstrate which functions of autonomous vehicles could be automated through computer vision [13, 14]. The promising results of such demonstrations kicked off many initiatives [13, 20].

The Intermodal Surface Transportation Efficiency Act (ISTEA), a transportation authorization bill passed in 1991, instructed the United States Department of Transportation (USDOT) to demonstrate an autonomous vehicle and highway system by 1997 [20, 21]. The USDOT built partnerships with academia, state, local and private sectors in conducting the program, and made extraordinary progress with revolutionary developments in vehicle safety and information systems [21]. The USDOT program also motivated the US Federal Highway Administration (FHWA) to start the National Automated Highway System Consortium (NAHSC) program with different partners, including California PATH, General Motors, Carnegie Mellon University, Hughes, Caltrans, Lockheed Martin, Bechtel, Parsons Brinckerhoff and Delco Electronics [20]. The Intelligent Vehicle Initiative (IVI) program was announced in 1997 and was enacted in the 1998 Transportation Equity Act for the 21st Century (TEA-21), with the purpose of speeding up the development of driver-assistance systems by focusing on crash prevention rather than mitigation and on vehicle-based rather than highway-based applications [22]. The IVI program resulted in a number of successful developments and deployments in field operational tests, including commercial applications such as lane-change assist, lane departure/merge warning, adaptive cruise control, adaptive forward crash warning and vehicle stability systems [22]. Commercial versions of these systems were manufactured shortly thereafter, and their evolution and industrial penetration have been increasing ever since [22].

Increasing advancements in computational and sensing technologies have further spurred interest in the field of autonomous vehicles and in developing cost-effective systems [5]. Present state-of-the-art autonomous vehicles can sense their local environment, identify different objects, reason about the evolution of their environment and plan complex motions while obeying different rules. Many advancements have been made in the last decade and a half, as evidenced by various successful demonstrations and competitions. The most prominent historical series of such competitions was organized by the US Department of Defense under the Defense Advanced Research Projects Agency (DARPA). The competition was initially launched as the DARPA Grand Challenge (DGC), and five such competitions have been held so far: the first in March 2004, the second in October 2005, the third in November 2007, the fourth from October 2012 to June 2015 and the fifth from January to April 2013 [5, 23–25]. The fourth challenge, named the DARPA Robotics Challenge (DRC), was aimed at the development of semi-autonomous emergency-maintenance ground robots [23, 25], and the fifth, named the Fast Adaptable Next-Generation Ground Vehicle (FANG GV) challenge, was aimed at adaptive designs of vehicles [24]. Neither FANG nor DRC targeted self-driven/autonomous/robotic vehicles; hence, we do not discuss them in this chapter. In the first three challenges, hundreds of autonomous vehicles from the USA and around the world participated and exhibited different levels of versatility. The first and second challenges examined vehicles’ abilities in an off-road environment: autonomous vehicles had to navigate up to 240 km of desert at speeds of up to 80 km/h [5]. In the first competition, only five vehicles were able to travel more than a mile, and of those five, the furthest-travelling vehicle covered only 7.32 miles (11.78 km) [5]. None of the vehicles completed the route; hence, there was no winner, and the second challenge was scheduled for October 2005. In the second challenge, five vehicles successfully completed the route, and Stanley from Stanford University, Palo Alto, California, secured first place. The third challenge, named the DARPA Urban Challenge (DUC), shifted to an urban setting: the route involved 60 miles of travel to be completed within 6 h, and vehicles had to obey all traffic regulations during their autonomous driving. Of the 11 final teams, only 6 successfully completed the route, and Boss from Carnegie Mellon University, Pittsburgh, Pennsylvania, secured first place [5].

The teams participating in the DGC competitions adopted different types of system architectures, all with standalone implementations. At a higher level, however, they decomposed the system architecture into four basic subsystems, namely sensing, perception, planning and control (see Figure 3 for a pictorial representation) [5]. The sensing unit takes raw measurements from different on-/off-board sensors (e.g. GPS, radar, LIDAR, odometer, vision, inertial measurement unit, etc.) for perceiving the static and dynamic environment. The sensing unit passes the raw data to the perception unit, which generates usable information about the vehicle (e.g. pose, map-relative estimates, etc.) and its environment (e.g. lanes, other vehicles, obstacles, etc.) from the provided data. The planner unit takes this information from the perception unit, reasons about it and plans the vehicle’s actuation in the environment (path, behavioural, escalation and map planning, etc.) to maximize a well-defined utility function. Finally, the planner unit passes the resulting commands to the control unit, which is responsible for actuating the vehicle.

Figure 3.

High-level system architecture of the autonomous vehicles.
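
To make the decomposition of Figure 3 concrete, the following minimal Python sketch wires the four subsystems into a single control cycle. All class names, data fields and the placeholder logic are illustrative assumptions for exposition, not code from any DGC team; a real system would run these stages concurrently on far richer data types.

```python
# Minimal sketch of the sensing -> perception -> planning -> control loop of
# Figure 3. All names and the placeholder logic are illustrative only.

from dataclasses import dataclass, field
from typing import List


@dataclass
class RawSensorData:            # output of the sensing unit
    gps: tuple
    lidar_points: List[tuple]


@dataclass
class WorldEstimate:            # output of the perception unit
    pose: tuple
    obstacles: List[tuple] = field(default_factory=list)


class Perception:
    def estimate(self, raw: RawSensorData) -> WorldEstimate:
        # Fuse raw measurements into pose and obstacle estimates.
        return WorldEstimate(pose=raw.gps, obstacles=raw.lidar_points)


class Planner:
    def plan(self, world: WorldEstimate) -> List[tuple]:
        # Choose a short path; a real planner would optimise against obstacles.
        return [world.pose, (world.pose[0] + 1.0, world.pose[1])]


class Controller:
    def actuate(self, path: List[tuple]) -> None:
        # Convert the planned path into steering/throttle commands.
        print(f"steering towards {path[-1]}")


def control_cycle(raw: RawSensorData) -> None:
    world = Perception().estimate(raw)
    path = Planner().plan(world)
    Controller().actuate(path)


if __name__ == "__main__":
    control_cycle(RawSensorData(gps=(0.0, 0.0), lidar_points=[(5.0, 2.0)]))
```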

One of the main lessons learned from the DARPA challenges was the need for autonomous vehicles to be connected, that is, to be able to interact with each other and to have access to each other’s information or information about their surroundings [5]. This also indicates the importance of cloud infrastructure in realizing the dream of autonomous vehicles. Cloud computing itself emerged and evolved during the 2000s. Amazon introduced its Elastic Compute Cloud (EC2) as a web service in 2006, aiming to provide resizable computing capacity on cloud servers [26]. In 2008, OpenNebula became the first open-source software to provide private and hybrid clouds [27]. In the same year, Microsoft announced Azure, aiming to provide cloud computing services; it was released in early 2010 [28]. In mid-2010, the OpenStack project was jointly launched by NASA and Rackspace Hosting, with the intention of helping organizations set up cloud computing services (mostly IaaS) on their standard hardware [29]. Oracle Cloud was announced by Oracle in 2012, with the aim of providing access to an integrated set of IT solutions, including SaaS, PaaS and IaaS [30].

The importance of connecting machines in manufacturing automation systems through networking was realized three and a half decades ago, when General Motors developed the Manufacturing Automation Protocol in the 1980s [31]. Before the advent of the World Wide Web (WWW), different vendors adopted different, incompatible protocols. In the early 1990s, the emergence of the WWW promoted the Hypertext Transfer Protocol (HTTP) over the Internet Protocol (IP) [32]. In 1994, the first industrial robot was connected to the WWW so that it could be teleoperated by different users through a graphical user interface [33]. In the mid- and late 1990s, different types of robots were connected to the web to explore robustness and interface issues, initiating study in a new field named “Networked Robotics” [34, 35]. In 1997, Inaba investigated the benefits of remote computing to control remote-brained robots [36]. The IEEE Robotics and Automation Society established its Technical Committee on Networked Robots in 2001 [37]. The initial focus of the committee was on Internet-based teleoperated robots, which was later extended to a wider range of applications [37]. In 2006, the MobEyes system was proposed, which exploits vehicular sensor networks to record the surrounding environment and events for the purpose of urban monitoring. The RoboEarth project was announced in 2009, with the purpose of providing a WWW for robots, such that they can share their data and learn from each other [38]. In 2010, James Kuffner explained the concept of “remote-brained” robots (i.e. the physical separation of robotic hardware and software) and introduced the term “Cloud Robotics”, along with potential applications and benefits of cloud-enabled robots [7]. Around the same time, the term “Internet of Things (IoT)” gained prominence, describing the network of physical things (e.g. vehicles, household appliances, buildings, etc.) equipped with sensors, software and network connectivity for exchanging information with other objects [39]. Different vehicular ad hoc networks (VANETs) were proposed in 2011, with the purpose of providing cloud services for next-generation automotive systems [40, 41]. The term “Industry 4.0” was introduced in the same year to denote a fourth industrial revolution, built on networking, following the first three revolutions [42].

In 2012, M. Gerla discussed different design principles, issues and potential applications of Vehicular Cloud Computing (VCC) [43]. In the same year, S. Kumar et al. proposed an Octree-based, cloud-assisted design for autonomous driving to assist vehicles in planning their trajectories [44]. The corresponding high-level system architecture is presented in Figure 4. The purposes of the sensing, planner and controller units are the same as explained for Figure 3, with the modifications that the perception unit is merged into the planner unit and that the planner is divided into two sub-units, namely the onboard planner and the planner over the cloud. The two planner units communicate with each other to exchange desired information and to enable vehicle-to-vehicle (V2V) communication. The cloud planner can request sensor data from various autonomous vehicles, which it aggregates to generate information about obstacles, path planning, localization, emergency control, etc. The onboard planner communicates with the cloud planner to plan the optimal trajectory and passes the final commands to the controller unit, which then actuates the vehicle as required.

Figure 4.

High-level system architecture of the cloud-assisted design of the autonomous vehicles.
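
To illustrate the split between the onboard planner and the planner over the cloud, the following Python sketch simulates both in a single process. The class and method names are illustrative assumptions; in a real deployment the cloud planner would run on remote servers and be reached over the network, with V2V exchange mediated through it.

```python
# Minimal sketch of the onboard/cloud planner split of Figure 4. The
# CloudPlanner is simulated in-process; all names are illustrative only.

from typing import Dict, List, Optional


class CloudPlanner:
    """Aggregates sensor reports from many vehicles into a shared obstacle map."""

    def __init__(self) -> None:
        self._obstacles: Dict[str, List[tuple]] = {}

    def report(self, vehicle_id: str, obstacles: List[tuple]) -> None:
        self._obstacles[vehicle_id] = obstacles

    def aggregated_obstacles(self) -> List[tuple]:
        return [o for obs in self._obstacles.values() for o in obs]


class OnboardPlanner:
    def __init__(self, vehicle_id: str, cloud: Optional[CloudPlanner]) -> None:
        self.vehicle_id = vehicle_id
        self.cloud = cloud

    def plan(self, local_obstacles: List[tuple]) -> List[tuple]:
        obstacles = list(local_obstacles)
        if self.cloud is not None:           # enrich with cloud/V2V knowledge
            self.cloud.report(self.vehicle_id, local_obstacles)
            obstacles += self.cloud.aggregated_obstacles()
        # Placeholder trajectory; a real planner would optimise around obstacles.
        return [(0.0, 0.0), (10.0, 0.0)] if obstacles else [(0.0, 0.0), (20.0, 0.0)]


cloud = CloudPlanner()
car_a = OnboardPlanner("car_a", cloud)
car_b = OnboardPlanner("car_b", cloud)
car_a.plan([(3.0, 1.0)])                     # car_a shares an obstacle report
print(car_b.plan([]))                        # car_b benefits from car_a's report
```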

Also in 2012, the term “Industrial Internet” was introduced by General Electric, with the purpose of connecting industrial equipment over networks for exchanging data [45]. In 2014, Gerla et al. investigated the vehicular cloud and concluded that it will be a core system for autonomous vehicles, making further advancements possible [46]. In the same year, Ashutosh Saxena announced the “RoboBrain” project, with the aim of building a massive online brain for all the robots of the world from publicly available Internet data [47, 48]. In early 2016, HERE announced the launch of its cloud-based mapping service for autonomous vehicles, aiming to enhance vehicles’ automated driving features [49]. In February 2016, Maglaras et al. investigated the concept of the Social Internet of Vehicles (SIoV) and discussed its design principles, potential applications and research issues [50].

3. Potential benefits

As discussed in the previous section, cloud-based automation has gained massive interest from researchers around the world and sparked many initiatives, such as RoboEarth, remote-brained robots, IoT, Industry 4.0, VCC, the Industrial Internet, RoboBrain and SIoV. In this section, we discuss the potential of the cloud to enhance the automation of vehicles (and of robotic automation systems in general) by improving performance through five potential benefits, as follows:

3.1. Cloud computing

Autonomous vehicles require intensive parallel computation cycles to process sensor data and to plan efficient paths in real-world environments [5]. It is not practical to deploy massive onboard computing power in each autonomous vehicle; such deployments would be cost-intensive and may have limitations in parallel processing. The cloud provides massively parallel, on-demand computation, up to the computing power of supercomputers [51], which was previously not possible in standalone onboard implementations. Nowadays, a wide range of commercial services (including Amazon’s EC2 [26], Microsoft’s Azure [28] and Google’s Compute Engine [52]) provide cloud computing, with the aim of offering access to tens of thousands of processors for on-demand computing tasks [53]. Initially, web/mobile app developers used such services; however, they are increasingly being used in technical high-performance applications. Cloud computing can be used for computationally intensive tasks, such as quantifying uncertainties in models, sensing and control, analysing videos and images, generating rapidly growing graphs (e.g. RRT*) and mapping [53]. Many applications require real-time processing of computational tasks; in such applications, the cloud can be prone to varying network latency and quality of service (QoS), which is an active research area [51, 53].
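
As a rough illustration of the programming pattern (not of any particular cloud service’s API), the Python sketch below farms an embarrassingly parallel perception workload out to a pool of workers. In a cloud deployment the local process pool would be replaced by remote instances or a batch/serverless service, which is exactly where the latency and QoS concerns mentioned above arise; the function and its workload are placeholders.

```python
# Sketch of parallelising a computationally intensive, per-frame task across
# a pool of workers. The local ProcessPoolExecutor stands in for a pool of
# cloud instances; the workload below is a dummy placeholder.

from concurrent.futures import ProcessPoolExecutor


def analyse_frame(frame_id: int) -> int:
    # Placeholder for expensive work such as object detection on one frame.
    return sum(i * i for i in range(10_000)) % (frame_id + 7)


if __name__ == "__main__":
    frames = list(range(32))
    with ProcessPoolExecutor() as pool:      # swap in a cloud-backed executor
        results = list(pool.map(analyse_frame, frames))
    print(f"analysed {len(results)} frames")
```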

3.2. Big data

Big data refers to extremely large collections of datasets that cannot be handled with conventional database systems and that require analysis to discover patterns, associations, trends, etc. [54]. Autonomous vehicles require access to vast amounts of data, for example, sensor network data, maps, images, videos, weather forecasts, programs and algorithms, which cannot be maintained onboard and which surpass the processing capabilities of conventional database systems. Cloud infrastructure offers access to practically unlimited, on-demand, elastic storage capacity on cloud servers that can store large collections of big data as well as facilitate intensive computations on them [54–56]. Shared access to big datasets can also facilitate more accurate machine learning for autonomous vehicles, which can help planners make optimal decisions. It is essential to recognize that big datasets may require high-performance IaaS tools for performing intensive computations on the gigantic amounts of data; these include Amazon’s EC2 [26], Microsoft’s Azure [28] and Google’s Compute Engine [52], as described in the previous section. Active research challenges in cloud-based big data storage include defining cross-platform formats, working with sparse representations for efficient processing and developing new approaches that are more robust to dirty data [55, 56].
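
A minimal sketch of the data-collection side is shown below: a vehicle ships a batch of sensor records to elastic cloud object storage for later fleet-wide analysis. The bucket name, key layout and record fields are hypothetical; the snippet assumes the boto3 library and valid credentials for an S3-compatible store.

```python
# Sketch of uploading a batch of onboard sensor records to cloud object
# storage so that fleet-wide datasets can be analysed offline. Bucket name,
# key layout and record format are illustrative assumptions.

import json
import time

import boto3

records = [
    {"t": time.time(), "speed_mps": 12.4, "obstacle_count": 3},
    {"t": time.time(), "speed_mps": 11.9, "obstacle_count": 2},
]

s3 = boto3.client("s3")
key = f"vehicle-42/sensor-log-{int(time.time())}.jsonl"
s3.put_object(
    Bucket="fleet-sensor-logs",              # hypothetical bucket
    Key=key,
    Body="\n".join(json.dumps(r) for r in records).encode("utf-8"),
)
print(f"uploaded {len(records)} records to s3://fleet-sensor-logs/{key}")
```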

3.3. Open-source/open-access

Open-source refers to free access to the original source code of software (or to the models, in the case of hardware), which can be modified and redistributed without any discrimination [56]. Open-access refers to free access to algorithms, publications, libraries, designs, models, maps, datasets, standards, competitions, etc. [57]. In an open set-up, different organizations and researchers contribute and share such resources to facilitate their development, adoption and distribution. Standalone autonomous vehicles cannot maintain such open-source software and resources or take maximum advantage of these facilities. Cloud infrastructure helps by providing well-organized access to such pools of resources [6]. A prominent example of the success of open resources in the scientific community is the Robot Operating System (ROS), which provides access to robotics tools and libraries to facilitate the development of robotics applications [58]. Furthermore, many simulation tools and libraries (e.g. GraspIt, Bullet, Gazebo, OpenRAVE, etc.) are available open-source and can be customized to application requirements, which can significantly speed up research and development activities.
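
As a small taste of such open-source tooling, the following node uses the standard ROS 1 rospy publisher pattern (adapted from the ROS tutorials) to broadcast a status message. The topic and node names are arbitrary examples, and a ROS installation with a running roscore is assumed.

```python
# Minimal ROS 1 (rospy) publisher node, following the standard ROS tutorial
# pattern. Topic and node names are arbitrary examples.

import rospy
from std_msgs.msg import String


def talker() -> None:
    pub = rospy.Publisher("vehicle_status", String, queue_size=10)
    rospy.init_node("status_talker", anonymous=True)
    rate = rospy.Rate(1)                     # publish at 1 Hz
    while not rospy.is_shutdown():
        pub.publish(String(data="autonomy level 4: nominal"))
        rate.sleep()


if __name__ == "__main__":
    try:
        talker()
    except rospy.ROSInterruptException:
        pass
```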

3.4. System learning

System learning refers to the collective learning of all the agents (e.g. autonomous vehicles) in the system. Autonomous vehicles need to learn from each other’s experiences; for example, if a vehicle identifies a new situation that was not part of the initial system, the learning outcome of that instance needs to be reflected in all the vehicles in the system. Accomplishing such goals is not possible with standalone implementations of autonomous vehicles [5]. Cloud infrastructure enables shared access to data [6], so instances of physical trials and new experiences can be stored in a shared pool for the collective learning of all the vehicles. Instances can hold initial and anticipated conditions, boundary conditions and the outcome of the execution. A good example of collective learning is the “Lightning” framework, which indexes the paths of different robots in the system over several tasks and then uses cloud computing for path planning and adaptation in new situations [59].
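
The sketch below illustrates the idea of a shared experience pool in a highly simplified form. It is loosely inspired by path-reuse frameworks such as Lightning but is not an implementation of it; the in-memory pool, the situation encoding and the exact-match similarity test are all illustrative assumptions, and in practice the pool would reside on cloud storage and be queried over the network.

```python
# Sketch of a fleet-wide shared experience pool for collective learning.
# All data structures and the matching rule are illustrative only.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Experience:
    situation: tuple      # coarse features of initial/boundary conditions
    action: str           # what the vehicle did
    outcome: float        # utility achieved (higher is better)


class SharedExperiencePool:
    def __init__(self) -> None:
        self._pool: List[Experience] = []

    def record(self, exp: Experience) -> None:
        self._pool.append(exp)

    def best_match(self, situation: tuple) -> Optional[Experience]:
        # Return the highest-utility experience from a matching situation.
        candidates = [e for e in self._pool if e.situation == situation]
        return max(candidates, key=lambda e: e.outcome, default=None)


pool = SharedExperiencePool()
pool.record(Experience(("wet_road", "sharp_turn"), "slow_to_20kmh", outcome=0.9))
# A different vehicle facing the same situation reuses the fleet's experience:
print(pool.best_match(("wet_road", "sharp_turn")))
```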

3.5. Crowdsourcing

Crowdsourcing can be defined as a process of obtaining desired information, services, input or ideas on a specific task (one that surpasses computer capabilities) from humans, typically over the Internet [60, 61]. In the case of autonomous vehicles, crowdsourcing can be used to solve a number of problems; for example, during operation, vehicles may identify new obstacles or routes that were not previously labelled and that require human input. Standalone implementations limit vehicles’ ability to accomplish such objectives [5]. Cloud-enabled systems facilitate crowdsourcing activities with a specific or open crowd [61]. Cloud-based crowdsourcing has captured much attention from researchers and enterprises seeking to automate different processes [61]. A prominent example of cloud-based crowdsourcing is Amazon’s Mechanical Turk (MTurk), which provides a marketplace for tasks that surpass computer capabilities and require human intelligence [62].
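
A hedged sketch of how a vehicle might escalate an unrecognised object for human labelling is given below. The endpoint URL, request fields and response format are hypothetical placeholders rather than the API of MTurk or any real service; the snippet only illustrates the request/response shape of a crowdsourcing call.

```python
# Hypothetical sketch of escalating an unrecognised object to a crowdsourcing
# service for human labelling. Endpoint, payload and response are placeholders.

import requests

CROWD_ENDPOINT = "https://example.com/api/v1/label-tasks"   # hypothetical


def request_human_label(image_url: str, timeout_s: float = 5.0) -> str:
    task = {
        "instructions": "What is the object in this image? One or two words.",
        "image_url": image_url,
        "reward_usd": 0.05,
    }
    resp = requests.post(CROWD_ENDPOINT, json=task, timeout=timeout_s)
    resp.raise_for_status()
    return resp.json().get("label", "unknown")


# The vehicle keeps driving conservatively while the answer is pending, e.g.:
# label = request_human_label("https://example.com/frames/000123.jpg")
```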

4. Research challenges and future directions

In this section, we summarize potential research challenges and future directions for cloud-enabled autonomous vehicles.

  • Effective load balancing: New algorithms and policies are required for balancing computations between a vehicle’s onboard computers and the cloud (see the sketch after this list).

  • Scalable parallelization: Advancements in cloud infrastructure are required so that cloud computing parallelization schemes scale with the size of the autonomous vehicular system.

  • Effective sampling and scaling of data: New algorithms and approaches are required that scale to the size of big data and are more robust to dirty data [56].

  • Ensure privacy and security: The data collected by different autonomous vehicles (using sensors, cameras, route maps, etc.) can include potential secrets (e.g. private home data, corporate business plans, etc.), and over the cloud it can be prone to theft or criminal use. Hence, the privacy and security of the data over the cloud need to be ensured.

  • Ensure control and safety: The control of autonomous vehicles over the cloud can be exposed to potential hacking threats. A hacker could remotely control a vehicle and use it for unethical purposes or to cause damage. Hence, the control and safety of the vehicle need to be ensured.

  • Cope with varying network latency and QoS: For real-time applications, new algorithms and approaches are needed to handle varying network latency and QoS.

  • Fault-tolerant control: In an autonomous vehicular system, failures can lead to undesirable hazardous situations and hence are not acceptable. New approaches are required for onboard and cloud-based fault-tolerant control.

  • Verification and validation of the system: A primary problem for the autonomous vehicular system is the ability to substantiate that the system can operate safely, effectively and robustly, with safety being the major concern [5]. New methods are required for verifying and validating the desired functioning of the autonomous vehicular system [5].

  • Standards and protocols: Research into new standards and protocols is required, for example, to define cross-platform data formats and to work with sparse representations for efficient processing.

  • Crowdsourcing quality control: Crowdsourcing is typically prone to generating noisy or erroneous data [60]. Hence, new mechanisms are required for improving and ensuring the quality of data collected through crowdsourcing.
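
As referenced in the “Effective load balancing” item above, the following minimal sketch shows one possible latency-aware rule for deciding whether to run a task onboard or offload it to the cloud. The cost model and thresholds are illustrative assumptions; a real policy would also account for energy, QoS history and safety constraints.

```python
# Minimal latency-aware offloading rule: run onboard unless the cloud path is
# both faster and within the deadline. All estimates are illustrative.

from dataclasses import dataclass


@dataclass
class TaskEstimate:
    onboard_runtime_s: float      # predicted runtime on the vehicle's computer
    cloud_runtime_s: float        # predicted runtime on cloud resources
    payload_mb: float             # data that must be uploaded to offload


def should_offload(task: TaskEstimate, rtt_s: float, uplink_mbps: float,
                   deadline_s: float) -> bool:
    transfer_s = (task.payload_mb * 8.0) / max(uplink_mbps, 1e-6)
    cloud_total = rtt_s + transfer_s + task.cloud_runtime_s
    # Offload only if the cloud path is faster AND still meets the deadline.
    return cloud_total < task.onboard_runtime_s and cloud_total <= deadline_s


task = TaskEstimate(onboard_runtime_s=0.8, cloud_runtime_s=0.1, payload_mb=2.0)
print(should_offload(task, rtt_s=0.05, uplink_mbps=50.0, deadline_s=0.5))
```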

5. Conclusion

In this chapter, we have provided an overview of the evolution of cloud robotics and autonomous vehicles through different phases of history. We discussed how the automation of different types of vehicles (e.g. on-road, off-road, water, aerial, space and amphibious vehicles) can increase the safety, reliability, robustness and efficiency of the system. We noted that the autonomous vehicle is an active area of research with a rich history and that it possesses great potential in numerous challenging applications. We observed that the cloud robotics paradigm has enabled new approaches in which autonomous vehicles are not limited to onboard capabilities and rely on data from a cloud network to support their different operations. The cloud provides economies of scale and facilitates backup and sharing of data across different agents. We also discussed the potential of the cloud to enhance the automation of vehicles by improving performance through different potential benefits, including cloud computing, big data, open-source/open-access, system learning and crowdsourcing. Finally, we analysed active research challenges and possible future directions in the field.

The chapter can help new researchers in the field to get an overview of the current state-of-the-art systems and start research activities in possible future directions. We believe that a comprehensive design of autonomous vehicular systems based on cloud infrastructure can significantly increase the reliability, robustness and safety of autonomous vehicles that can be exploited for different potential applications.

References

  1. J. Xu, S.L. Murphy, K.D. Kochanek, B.A. Bastian. Deaths: Final Data for 2013. National Vital Statistics Reports. 2016;64(2). Available from: http://www.cdc.gov/nchs/data/nvsr/nvsr64/nvsr64_02.pdf [Accessed: 2016-02-21].
  2. M. Peden, R. Scurfield, D. Sleet, D. Mohan, A.A. Hyder, E. Jarawan, C. Mathers, editors. World Report on Road Traffic Injury Prevention. Geneva: World Health Organization; 2004. Available from: http://whqlibdoc.who.int/publications/2004/9241562609.pdf?ua=1 [Accessed: 2016-02-21].
  3. B. Schoettle, M. Sivak. A Survey of Public Opinion about Autonomous and Self-driving Vehicles in the US, the UK, and Australia. University of Michigan, Ann Arbor, Transportation Research Institute; 2014. Report No. UMTRI-2014-21. Available from: https://deepblue.lib.umich.edu/bitstream/handle/2027.42/108384/103024.pdf [Accessed: 2016-03-21].
  4. U.S. National Highway Traffic Safety Administration. Preliminary Statement of Policy Concerning Automated Vehicles [Internet]. 2013. Available from: http://www.nhtsa.gov/staticfiles/rulemaking/pdf/Automated_Vehicles_Policy.pdf [Accessed: 2016-03-21].
  5. M. Campbell, M. Egerstedt, J.P. How, R.M. Murray. Autonomous driving in urban environments: approaches, lessons and challenges. Philosophical Transactions of the Royal Society of London A: Mathematical, Physical and Engineering Sciences. 2010;368:4649–4672. DOI: 10.1098/rsta.2010.0110
  6. P. Mell, T. Grance. The NIST Definition of Cloud Computing. Computer Security Division, Information Technology Laboratory, National Institute of Standards and Technology, Gaithersburg; 2011. Available from: http://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-145.pdf [Accessed: 2016-03-21].
  7. J.J. Kuffner. Cloud-enabled Robots. IEEE-RAS International Conference on Humanoid Robotics, Nashville, TN; 2010. Available from: http://www.scribd.com/doc/47486324/Cloud-Enabled-Robots [Accessed: 2016-03-21].
  8. D. Lavrinc. Exclusive: Google Expands Its Autonomous Fleet with Hybrid Lexus RX450h [Internet]. 2012-04-16. Available from: http://www.wired.com/2012/04/google-autonomous-lexus-rx450h/ [Accessed: 2016-03-21].
  9. G. Nelson. Automotive News. Google in Talks with OEMs, Suppliers to Build Self-driving Cars [Internet]. 2015-01-14. Available from: http://www.autonews.com/article/20150114/OEM09/150119815/google-in-talks-with-oems-suppliers-to-build-self-driving-cars [Accessed: 2016-03-21].
  10. Google. Google Self-Driving Car Project Monthly Report March 2016 [Internet]. 2016-03-31. Available from: https://static.googleusercontent.com/selfdrivingcar/files/reports/report-0316.pdf [Accessed: 2016-04-01].
  11. J. Muller. No Hands, No Feet: My Unnerving Ride in Google’s Driverless Car [Internet]. 2013-03-21. Available from: http://www.forbes.com/sites/joannmuller/2013/03/21/no-hands-no-feet-my-unnerving-ride-in-googles-driverless-car/#5528586261b6 [Accessed: 2016-03-21].
  12. L. Gomes. Hidden Obstacles for Google’s Self-Driving Cars [Internet]. 2014-08-28. Available from: https://www.technologyreview.com/s/530276/hidden-obstacles-for-googles-self-driving-cars/ [Accessed: 2016-03-21].
  13. E.D. Dickmanns. The Development of Machine Vision for Road Vehicles in the Last Decade. Intelligent Vehicle Symposium. IEEE. 2002;1:268–281. DOI: 10.1109/IVS.2002.1187962
  14. C. Thorpe, M.H. Hebert, T. Kanade, S.A. Shafer. Vision and Navigation for the Carnegie–Mellon Navlab. IEEE Transactions on Pattern Analysis and Machine Intelligence. 1988;10(3):362–373. DOI: 10.1109/34.3900
  15. N.J. Nilsson. A Mobile Automaton: An Application of Artificial Intelligence Techniques. Proceedings of the 1st International Joint Conference on Artificial Intelligence; 1969. pp. 509–521. Available from: http://dl.acm.org/citation.cfm?id=1624562.1624607 [Accessed: 2016-03-21].
  16. D.B. Gennery. A Stereo Vision System for an Autonomous Vehicle. Proceedings of the 5th International Joint Conference on Artificial Intelligence, Cambridge, USA. 1977;2:576–582. Available from: http://dl.acm.org/citation.cfm?id=1622943.1622945 [Accessed: 2016-03-21].
  17. H.P. Moravec. The Stanford Cart and the CMU Rover. Proceedings of the IEEE. 1983;71(7):872–884. DOI: 10.1109/PROC.1983.12684
  18. S. Tsugawa, T. Yatabe, T. Hirose, S. Matsumoto. An Automobile with Artificial Intelligence. Proceedings of the 6th International Joint Conference on Artificial Intelligence, Tokyo, Japan. 1979;2:893–895. ISBN: 0-934613-47-8. Available from: http://dl.acm.org/citation.cfm?id=1623050.1623117 [Accessed: 2016-03-21].
  19. H.G. Meissner. Steuerung dynamischer Systeme aufgrund bildhafter Informationen [thesis]. Doctoral Dissertation (in German), Aerospace Department, German Armed Forces University Munich, Germany, LRT; 1982. 175 p.
  20. Transportation Research Board. National Automated Highway System Research Program: A Review. Transportation Research Board. Special Report 253; 1998. 87 p. ISBN: 030906452X.
  21. C.M. Johnson. The National ITS Program: Where We’ve Been and Where We’re Going [Internet]. Sept/Oct 1997. Available from: https://www.fhwa.dot.gov/publications/publicroads/97septoct/p97sept6.cfm [Accessed: 2016-03-21].
  22. K. Hartman, J. Strasser. Saving Lives through Advanced Vehicle Safety Technology: Intelligent Vehicle Initiative Final Report. Federal Highway Administration, FHWA-JPO-05-057; Sep 2005. Available from: http://ntl.bts.gov/lib/jpodocs/repts_pr/14153_files/ivi.pdf [Accessed: 2016-03-21].
  23. DARPA. DARPA Robotics Challenge (Broad Agency Announcement). DARPA-BAA-12-39. 2012-04-10. Available from: http://www.androidworld.com/DARPA_Robotics_Challenge.pdf [Accessed: 2016-03-23].
  24. DARPA. DARPA Announces Winner of the First FANG Challenge [Internet]. 2013-04-22. Available from: http://www.darpa.mil/news-events/2013-04-22 [Accessed: 2016-03-23].
  25. E. Ackerman, E. Guizzo. DARPA Robotics Challenge: Amazing Moments, Lessons Learned, and What’s Next [Internet]. 2015-06-11. Available from: http://spectrum.ieee.org/automaton/robotics/humanoids/darpa-robotics-challenge-amazing-moments-lessons-learned-whats-next [Accessed: 2016-03-23].
  26. Amazon. Announcing Amazon Elastic Compute Cloud (Amazon EC2) - Beta [Internet]. 2006-08-24. Available from: https://aws.amazon.com/about-aws/whats-new/2006/08/24/announcing-amazon-elastic-compute-cloud-amazon-ec2---beta/ [Accessed: 2016-04-02].
  27. B. Rochwerger, D. Breitgand, E. Levy, A. Galis, K. Nagin, I.M. Llorente, R. Montero, Y. Wolfsthal, E. Elmroth, J. Caceres, et al. The Reservoir Model and Architecture for Open Federated Cloud Computing. IBM Journal of Research and Development. 2009;53(4):535–545. DOI: 10.1147/JRD.2009.5429058
  28. Microsoft. Windows Azure General Availability [Internet]. 2010-02-01. Available from: http://blogs.microsoft.com/blog/2010/02/01/windows-azure-general-availability/#sm.000zygf6310ttetpyqo2hls18kvmn [Accessed: 2016-04-02].
  29. Business Wire. OpenStack Launches as Independent Foundation, Begins Work Protecting, Empowering and Promoting OpenStack [Internet]. 2012-09-19. Available from: http://www.businesswire.com/news/home/20120919005997/en/OpenStack-Launches-Independent-Foundation-Begins-Work-Protecting [Accessed: 2016-04-02].
  30. T.P. Morgan. Oracle’s Big Cloud Announcement, Again [Internet]. 2012-06-07. Available from: http://www.theregister.co.uk/2012/06/07/oracle_cloud_rehash_platinum_services/ [Accessed: 2016-04-02].
  31. J.D. Irwin. The Industrial Electronics Handbook. 2nd ed. CRC Press LLC, 2000 Corporate Blvd NW, Boca Raton, Florida 33431, USA; 1997. 1728 p. ISBN: 9780849383434
  32. M. Narita, S. Okabe, Y. Kato, Y. Murakwa, K. Okabayashi, S. Kanda. Reliable Cloud-based Robot Services. In: Conference of the IEEE Industrial Electronics Society. IEEE; 2013. pp. 8317–8322. DOI: 10.1109/IECON.2013.6700526
  33. K. Goldberg. Beyond the Web: Excavating the Real World via Mosaic. In: Second International WWW Conference; 1994. DOI: 10.1.1.299.6262. Available from: http://www.ieor.berkeley.edu/∼goldberg/pubs/Beyond-the-Web-Excavating-Oct-1994.pdf [Accessed: 2016-04-01].
  34. K. Goldberg, B. Chen. Collaborative Control of Robot Motion: Robustness to Error. In: IEEE/RSJ International Conference; 2001. pp. 655–660. DOI: 10.1109/IROS.2001.976244
  35. G. McKee. What Is Networked Robotics? Springer, Berlin, Heidelberg; 2008. pp. 35–45. DOI: 10.1007/978-3-540-79142-3_4. Available from: http://dx.doi.org/10.1007/978-3-540-79142-3_4 [Accessed: 2016-04-01].
  36. M. Inaba. Remote-brained Robots. International Joint Conference on Artificial Intelligence; 1997. pp. 1593–1606. Available from: http://www.ijcai.org/Proceedings/97-2/Papers/118.pdf [Accessed: 2016-04-01].
  37. IEEE. IEEE Society of Robotics and Automation’s Technical Committee on Networked Robots [Internet]. 2001. Available from: http://www-users.cs.umn.edu/∼isler/tc/ [Accessed: 2016-04-02].
  38. M. Waibel. RoboEarth: A World Wide Web for Robots [Internet]. 2011-02-05. Available from: http://spectrum.ieee.org/automaton/robotics/artificial-intelligence/roboearth-a-world-wide-web-for-robots [Accessed: 2016-04-02].
  39. L. Atzori, A. Iera, G. Morabito. The Internet of Things: A Survey. Computer Networks. 2010;54(15):2787–2805. DOI: 10.1016/j.comnet.2010.05.010
  40. A. Iwai, M. Aoyama. Automotive Cloud Service Systems Based on Service-Oriented Architecture and Its Evaluation. In: IEEE International Conference on Cloud Computing (CLOUD); 4–9 July 2011; Washington, DC. pp. 638–645. DOI: 10.1109/CLOUD.2011.119
  41. J. Wang, J. Cho, S. Lee, T. Ma. Real Time Services for Future Cloud Computing Enabled Vehicle Networks. In: IEEE International Conference on Wireless Communications and Signal Processing (WCSP); 9–11 Nov. 2011; Nanjing. pp. 1–5. DOI: 10.1109/WCSP.2011.6096957
  42. R. Drath, A. Horch. Industrie 4.0: Hit or Hype? [Industry Forum]. IEEE Industrial Electronics Magazine. 2014;8(2):56–58. DOI: 10.1109/MIE.2014.2312079
  43. M. Gerla. Vehicular Cloud Computing. In: The 11th Annual Mediterranean Ad Hoc Networking Workshop (Med-Hoc-Net); 19–22 June 2012; Ayia Napa. pp. 152–155. DOI: 10.1109/MedHocNet.2012.6257116
  44. S. Kumar, S. Gollakota, D. Katabi. A Cloud-assisted Design for Autonomous Driving. In: Proceedings of the First Edition of the MCC Workshop on Mobile Cloud Computing; Helsinki, Finland; ACM; 2012. pp. 41–46. DOI: 10.1145/2342509.2342519
  45. P.C. Evans, M. Annunziata. Industrial Internet: Pushing the Boundaries of Minds and Machines. General Electric. 2012. Available from: http://www.ge.com/docs/chapters/Industrial_Internet.pdf [Accessed: 2016-04-03].
  46. M. Gerla, E.K. Lee, G. Pau, U. Lee. Internet of Vehicles: From Intelligent Grid to Autonomous Cars and Vehicular Clouds. In: IEEE World Forum on Internet of Things (WF-IoT); 6–8 March 2014; Seoul. pp. 241–246. DOI: 10.1109/WF-IoT.2014.6803166
  47. A. Saxena, A. Jain, O. Sener, A. Jami, D.K. Misra, H.S. Koppula. RoboBrain: Large-scale Knowledge Engine for Robots. arXiv:1412.0691. 2014. Available from: http://arxiv.org/pdf/1412.0691.pdf [Accessed: 2016-04-04].
  48. D. Hernandez. The Plan to Build a Massive Online Brain for All the World’s Robots [Internet]. 2014-08-25. Available from: http://www.wired.com/2014/08/robobrain/ [Accessed: 2016-04-04].
  49. P. Pajarillo. HERE Launches Cloud-Based Mapping Service For Autonomous Vehicles [Internet]. 2016-01-05. Available from: http://www.itechpost.com/articles/17137/20160105/here-launches-cloud-based-mapping-service-for-autonomous-vehicles.htm [Accessed: 2016-04-04].
  50. L.A. Maglaras, A.H. Al-Bayatti, Y. He, I. Wagner, H. Janicke. Social Internet of Vehicles for Smart Cities. Journal of Sensor and Actuator Networks. 2016;5(1):1–22. DOI: 10.3390/jsan5010003
  51. M. Armbrust, A. Fox, R. Griffith, A.D. Joseph, R. Katz, A. Konwinski, G. Lee, D. Patterson, A. Rabkin, I. Stoica, M. Zaharia. A View of Cloud Computing. Communications of the ACM. 2010;53(4):50–58. DOI: 10.1145/1721654.1721672
  52. O. Malik. Google Launches Amazon Rival, Compute Engine [Internet]. 2012-06-28. Available from: https://gigaom.com/2012/06/28/taking-on-amazon-google-launches-compute-on-demand-rival-to-ec2/ [Accessed: 2016-04-05].
  53. Z. Li, H. Zhang, L. O’Brien, R. Cai, S. Flint. On Evaluating Commercial Cloud Services: A Systematic Review. Journal of Systems and Software. 2013;86(9):2371–2393. DOI: 10.1016/j.jss.2013.04.021
  54. N. Marz, J. Warren. Big Data: Principles and Best Practices of Scalable Realtime Data Systems. 1st ed. Greenwich, CT, USA: Manning Publications Co.; 2015. 425 p. ISBN: 978-1617290343
  55. H. Elazhary. Cloud Computing for Big Data. MAGNT Research Report. 2014;2(4):135–144. Available from: https://www.researchgate.net/profile/Hanan_Elazhary/publication/285692839_Cloud_Computing_for_Big_Data/links/56629d4e08ae192bbf8e241f.pdf [Accessed: 2016-04-06].
  56. Robotics Virtual Organization. A Roadmap for U.S. Robotics: From Internet to Robotics 2013 Edition. Presented to Congressional Robotics Caucus; 2013. Available from: https://robotics-vo.us/sites/default/files/2013%20Robotics%20Roadmap-rs.pdf [Accessed: 2016-04-10].
  57. E.M. Corrado. The Importance of Open Access, Open Source, and Open Standards for Libraries. Issues in Science and Technology Librarianship. 2005;42:1092–1206. DOI: 10.5062/F42F7KD8
  58. M. Quigley, K. Conley, B. Gerkey, J. Faust, T. Foote, J. Leibs, R. Wheeler, A.Y. Ng. ROS: An Open-source Robot Operating System. In: ICRA Workshop on Open Source Software; 2009. Available from: http://pub1.willowgarage.com/∼konolige/cs225B/docs/quigley-icra2009-ros.pdf [Accessed: 2016-04-06].
  59. D. Berenson, P. Abbeel, K. Goldberg. A Robot Path Planning Framework That Learns from Experience. In: IEEE International Conference on Robotics and Automation (ICRA); 14–18 May 2012; Saint Paul, MN. pp. 3671–3678. DOI: 10.1109/ICRA.2012.6224742
  60. M. Lease. On Quality Control and Machine Learning in Crowdsourcing. In: Workshops at the Twenty-Fifth AAAI Conference on Artificial Intelligence; 2011. Available from: http://www.aaai.org/ocs/index.php/WS/AAAIW11/paper/viewFile/3906/4255 [Accessed: 2016-04-07].
  61. M. Vukovic. Crowdsourcing for Enterprises. In: Congress on Services – I; 2009. DOI: 10.1109/SERVICES-I.2009.56
  62. Amazon. Introduction to Amazon Mechanical Turk [Internet]. 2014-08-15. Available from: http://docs.aws.amazon.com/AWSMechTurk/latest/AWSMechanicalTurkGettingStartedGuide/SvcIntro.html [Accessed: 2016-04-10].
