Open access peer-reviewed chapter

Driver Assistance Technologies

Written By

Pradip Kumar Sarkar

Submitted: 20 May 2020 Reviewed: 06 October 2020 Published: 30 October 2020

DOI: 10.5772/intechopen.94354

From the Edited Volume

Models and Technologies for Smart, Sustainable and Safe Transportation Systems

Edited by Stefano de Luca, Roberta Di Pace and Chiara Fiori


Abstract

Driver assistance technology, popularly known as ADAS (advanced driver assistance systems), is emerging as a new driving technology. It encompasses adaptive cruise control, automatic emergency braking, blind spot monitoring, lane change assistance, forward collision warning, and more. ADAS is an important platform that integrates these multiple applications, using data from multifunction sensors such as cameras, radars, and lidars and sending commands to plural actuators such as the engine, brakes, and steering. ADAS technology can detect objects, perform basic classification, alert the driver to hazardous road conditions, and, in some cases, slow or stop the vehicle. The architecture of the electronic control units (ECUs) responsible for executing advanced driver assistance systems in a vehicle changes according to the demands placed on it during driving. The automotive system architecture integrates multiple applications into ADAS ECUs that serve multiple sensors. The hardware architecture of ADAS and autonomous driving includes automotive Ethernet, TSN (time-sensitive networking), Ethernet switches and gateways, and domain controllers, while the software architecture includes AUTOSAR Classic and Adaptive, ROS 2.0, and QNX. This chapter explains the functioning of driver assistance technology with the help of its architecture and the various types of sensors involved.

Keywords

  • sensors
  • ADAS architecture
  • levels
  • technologies

1. Introduction

In order to enhance road safety, as well as to satisfy increasingly stringent government regulations in western countries, automobile makers are incorporating a range of diverse driver assistance technologies into their new models. These technologies help drivers avoid accidents, both at high speed and when reversing into parking spaces. Such systems fall into the category of advanced driver assistance systems (ADAS). Besides increasing safety, ADAS [1] applications aim to enhance comfort, convenience, and energy efficiency. ADAS is emerging as a new driving technology comprising adaptive cruise control, automatic emergency braking, blind spot monitoring, lane change assistance, forward collision warning, and more. It is an important platform that integrates these multiple applications using data from radar, lidar, and ultrasound sensors. Vehicle hardware such as the actuators, engine, brakes, and steering receives commands derived from these sensors, enabling the ADAS to take the desired actions: alerting the driver to a detected hazardous object or location, or stopping the vehicle if necessary. For example, blind spot warning, lane change assistance, and forward collision warning are becoming extremely useful ADAS functions.

With the gradual emergence of connected and automated vehicles (CAVs), driver behavior modeling (DBM) coupled with simulation system modeling appears to be instrumental in predicting driving maneuvers, driver intent, vehicle and driver state, and environmental factors, in order to improve transportation safety and the driving experience as a whole. These models can play an effective role by feeding their safety-proven output into advanced driver assistance systems (ADAS). To cite an example, the information generated from all the sensors of an ADAS-equipped vehicle, combined with accurate lane-changing prediction models, could prevent road accidents by alerting the driver to potential danger ahead of time. It is increasingly felt that DBM that incorporates personal driving incentives and preferences, together with contextual factors such as weather and lighting, still needs to be refined, calibrated, and validated to make it robust, so that it yields better personalized as well as generic models. With regard to modeling personalized navigation and travel systems, earlier studies have mainly assumed ideal knowledge of the road network and environment, which is not very realistic. More research is required to address these real-life challenges and make ADAS more acceptable to society.

There is increasing evidence in the literature that DBM has mostly focused on a single vehicle making inferences from sensed measurements of the driver, the vehicle, and its environment, while hardly any attempt has been made to develop DBM for traffic environments with vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication. It would be interesting to develop DBM for connected and automated vehicles (CAVs) that leverages information from multiple vehicles, so that more global behavioral models can be developed. The output of such CAV modeling could then be applied in the design of ADAS-equipped vehicles to create safety-proven driving scenarios for diverse applications.


2. Architecture of ADAS

A number of sensors are increasingly being used, namely cameras, medium- and long-range radar, ultrasonic sensors, and LIDAR. Data generated by these sensors go through a fusion process that validates them, enabling the computer software to perform the tasks needed for the driver assistance system to take correct decisions. These decisions relate to parking assistance, automatic emergency braking, pedestrian detection, surround view, and even driver drowsiness. The functional components, that is, the various types of sensors collecting data from the immediate surrounding environment, belong to the ADAS architecture that performs these tasks, as shown in Figure 1. The forward collision avoidance ECU module is located in the windshield, while the supporting blind spot ultrasonic sensors and the related ADAS processor may be located in the side mirrors or elsewhere.

Figure 1.

Functional components and various types of sensors. Source: http://www.hitachi-automotive.us/Products/oem/DCS/ADAS/index.htm

The architecture [2, 3, 4] of the electronic control units (ECUs) responsible for executing advanced driver assistance systems (ADAS) in vehicles changes according to the demands placed on it during driving. The automotive system architecture integrates multiple applications into ADAS ECUs that serve multiple functions of the ITS architecture, as shown in Figure 2. Figures 3 and 4 show the architectures of two further functions, forward collision avoidance and parking assistance, respectively.

Figure 2.

Architecture of ADAS. Source: Ref. [3].

Figure 3.

Architecture of forward collision avoidance and blind spot avoidance. Source: Ian Riches, Strategy Analytics.

Figure 4.

Architecture of ADAS for parking assistance and blind spot detection. Source: http://www.techdesignforums.com/practice/technique/managing-the-evolving-architecture-of-integrated-adas-controllers/.

The hardware architecture of ADAS and autonomous driving includes automotive Ethernet, TSN (time-sensitive networking), Ethernet switches and gateways, and domain controllers, while the software architecture includes AUTOSAR Classic and Adaptive, ROS 2.0, and QNX.


3. Functioning of ADAS

Advanced driver assistance systems (ADAS) need a number of integrated sensors for accurate situational assessment and action implementation. ADAS technologies [5, 6, 7] increasingly utilize sensors such as video, radar, LIDAR, ultrasonic, and infrared (IR) sensors. Sensor fusion, with advanced algorithms and computing power, connectivity and data transmission, contextual awareness and processing, and virtual sensors, is extremely important for the success of ADAS.

There are six levels of vehicle automation, shown in Figure 5, defined by the Society of Automotive Engineers (SAE) [8], spanning from Level 0, which has no automation, to Level 5, which involves fully autonomous vehicles. As automation expands, driver assistance and ADAS play an increasingly important role.

Figure 5.

Various levels of ADAS, source: https://www.sae.org/news/press-room/2018/12/sae-international-releases-updated-visual-chart-for-its-%E2%80%9Clevels-of-driving- automation%E2%80%9D-standard-for-self-driving-vehicles.

Level 0: Driver only: driving is controlled entirely by the human driver, who independently operates the steering, throttle, brakes, etc.

Level 1: Assisted driving: the system assists the driver with a single aspect of vehicle operation, such as cruise control or ACC.

Level 2: Partial automation: the system is monitored by the driver while at least one function, such as cruise control with lane centering, is fully automated.

Level 3: Conditional automation: the system is monitored by the operator, who can intervene when necessary. Safety-critical functions, under certain circumstances, are shifted to the vehicle.

Level 4: High automation: no monitoring by the driver is required. Vehicles are designed to operate safety-critical functions and monitor road conditions for an entire trip. However, the functions do not cover every driving scenario and are limited to the operational design domain of the vehicle.

Level 5: Full automation: it ensures operator-free driving without any intervention.

As of today, no car manufacturer has achieved Level 3 or higher in production, although several have produced demonstration vehicles. Legislators in some countries are working on a possible admission of Level 3 vehicles, which are expected to become available in 2020/21. Driver assistance systems enabling autonomous driving from Level 3 onwards will require at least three types of sensor systems: camera, radar, and LIDAR. As can be seen in Figure 5, several sensors of each type operate at various locations on the vehicle. The development of the LIDAR system still poses the biggest and most dynamic challenge, in both technical and commercial terms.


4. Data fusion

4.1 Fusion of data at the ECU

There are a number of subsystems involved in performing the various tasks of ADAS. The vehicle with the driver inside forms the main system whose movement is detected by the ADAS, and this system interacts with the environment. The different functions of the system can be clearly distinguished in Figure 6. The fusion process has the following distinctive stages:

  1. Information has to be gathered;

  2. Information needs to be evaluated;

  3. A safety measure needs to be taken.

Figure 6.

Fusion of data at the ECU received from various types of sensors housed in ADAS. Source: Ref. [3].

These functions correspond to Sense (1), Think (2), and Act (3). Here, only the Sense stage is reviewed, and only systems in which the driver remains in the loop. Figure 6 shows the process of multi-sensor processing, starting with sensor data acquisition. Next, the sensor processing, divided into several tasks such as calibration, feature extraction, and object detection, analyzes the sensor data and, in the end, serves the application with a more or less detailed model of the environment [4].

Fusion of data received from complementary and independent sources places the data into a single description of the environment. Data association and data assimilation are the two key components of data fusion: the process must match sensor data with the description of the environment, which requires synchronization of the sensor data with the associated object state (e.g., position and velocity).
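As an illustration of data association and state synchronization, the following minimal Python sketch matches incoming sensor reports to tracked objects by nearest-neighbour gating after propagating each track to the measurement time under a constant-velocity assumption. It is a sketch only, not the chapter's algorithm; the track structure and the 2 m gating radius are illustrative assumptions.

```python
import math

# A minimal sketch of nearest-neighbour data association with time
# synchronization, assuming 2D positions in a common vehicle frame.
# The track structure and the 2 m gating radius are illustrative
# assumptions, not taken from the chapter.

def predict_position(track, dt):
    """Propagate a track to the measurement time under a
    constant-velocity assumption (state synchronization)."""
    x, y = track["pos"]
    vx, vy = track["vel"]
    return (x + vx * dt, y + vy * dt)

def associate(measurements, tracks, dt, gate_radius_m=2.0):
    """Match each measurement to the nearest predicted track inside
    the gating radius; unmatched measurements start new objects."""
    matches = []
    for m in measurements:
        best, best_d = None, gate_radius_m
        for t in tracks:
            px, py = predict_position(t, dt)
            d = math.hypot(m[0] - px, m[1] - py)
            if d < best_d:
                best, best_d = t, d
        matches.append((m, best))  # best is None -> new object candidate
    return matches

tracks = [{"id": 1, "pos": (20.0, 0.0), "vel": (-2.0, 0.0)}]
radar_hits = [(19.1, 0.2), (35.0, 4.0)]  # second hit matches no track
print(associate(radar_hits, tracks, dt=0.5))
```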


5. Various sensors of ADAS

It is extremely important to know which sensors are required for autonomous driving at Levels 1 to 5. As already mentioned, there are three main groups of sensor systems: camera-, radar-, and LIDAR-based systems. Although ultrasonic sensors for parking are available and widespread today, they are of minor importance for autonomous driving. Camera and radar systems are in Level 1 and 2 vehicles today and are a prerequisite for all further levels of automation.

5.1 Sensor camera for ADAS

Advanced digital HDR CMOS cameras with a large dynamic range are well suited to poor lighting conditions, where the primary differences in the scene are differences in brightness.

A large number of digital interfaces are available for automotive cameras, along with a digital signal processor and internal memory. The camera generates processed video images for evaluation by software algorithms. It also helps transform images into signals that can be merged with other sensor signals, such as those from radar and lidar. Owing to the camera's inherent intelligence, all the signals are processed in fusion mode, enabling the ADAS to take correct decisions. Cameras used as sensors [9] must pass the quality management standard of the automobile industry (ISO/TS 16949) and are suited to quick and flexible adaptation. A current digital camera system continuously captures raw data that is then processed and forwarded to the display unit for image display. This procedure is shown in Figure 7.

Figure 7.

Video data transfer to the camera head unit through Ethernet. Source: https://www.fierceelectronics.com/components/three-sensor-types-drive-autonomous-vehicles.

Besides this, the infrared (IR) camera consists of several components. It is important to distinguish two different versions of the IR camera:

  1. Near Infrared (NIR);

  2. Far Infrared (FIR).

In both systems a camera plays an important role in detecting the radiation of objects. NIR technology provides extra illumination through IR headlights, while FIR systems require no special headlights. The primary difference is that NIR systems pick up the extra radiation reflected from objects, while FIR picks up only the regular (thermal) radiation of objects.

Table 1 presents the data rates required to transmit raw sensor data [10]. Figure 8 shows the functioning of LIDAR.

Sensor | Data rate required to transmit raw data
Camera | 1 Gb/s to 24 Gb/s
Radar | 5 Gb/s to 120 Gb/s
Lidar | 2 Mb/s to 10 Gb/s

Table 1.

Data rate required for transmission of data.
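To make these figures concrete, the minimal Python sketch below adds up worst-case raw data rates for a hypothetical sensor suite using the upper bounds from Table 1; the sensor counts and the 10 Gb/s backbone link are illustrative assumptions, not values from the chapter. It illustrates why raw-data architectures put heavy demands on automotive Ethernet and why pre-processing at the sensor is attractive.

```python
# A rough network-sizing sketch using the upper-bound raw data rates
# from Table 1. The sensor counts and the 10 Gb/s backbone figure are
# illustrative assumptions, not values from the chapter.

SENSOR_PEAK_GBPS = {"camera": 24.0, "radar": 120.0, "lidar": 10.0}

suite = {"camera": 6, "radar": 4, "lidar": 1}  # hypothetical sensor suite

total_gbps = sum(SENSOR_PEAK_GBPS[s] * n for s, n in suite.items())
backbone_gbps = 10.0  # assumed automotive Ethernet backbone link

print(f"Worst-case raw throughput: {total_gbps:.0f} Gb/s")
print(f"Equivalent {backbone_gbps:.0f} Gb/s links without sensor-side "
      f"pre-processing: {total_gbps / backbone_gbps:.0f}")
```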

Figure 8.

Principle of the functioning of LIDAR. Source: https://www.fierceelectronics.com/components/three-sensor-types-drive-autonomous-vehicles.

5.2 LIDAR systems

For the purpose of measuring distance and creating three-dimensional images of the environment, LIDAR systems [11] are being fitted and integrated ever more frequently into vehicles and mobile machines. A pulsed laser beam is emitted and the signal's transit time from the object back to the detector is assessed, as shown in Figure 8. A highly sensitive technique using avalanche photodiodes (APDs) with internal amplification measures the light pulses in the nanosecond range across wide bandwidths. LIDAR optics require high spatial resolution; therefore APD arrays comprising multiple sensor elements have been developed. The APD arrays address the temperature dependence of the required high bias voltage, and their highly accurate amplification offers excellent APD signal quality. The modules can be adapted to the specific application, and development boards are interfaced through a digital output signal using Low-Voltage Differential Signaling (LVDS). With the help of LIDAR and radar systems, objects on the road can easily be identified, but a camera is additionally necessary to classify and detect objects correctly. From the point cloud built up from radar and LIDAR reflections, the distance and closing speed of an object can easily be measured. However, because these sensors have lower resolution than a camera, object detection is not as easy. To optimize detection at varying ranges despite the lower resolution, a number of units are installed, from a medium-range unit for emergency brake assist to long-range radar for adaptive cruise control, although LIDAR and radar function in a similar way at longer ranges with lower point density.
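The core of the LIDAR measurement described above is a simple time-of-flight relation: distance equals the speed of light times the round-trip transit time, divided by two. A minimal sketch, with an illustrative 667 ns round trip:

```python
# Time-of-flight sketch: a pulsed LIDAR measures the round-trip transit
# time of a laser pulse; distance follows from the speed of light.

C = 299_792_458.0  # speed of light, m/s

def lidar_distance_m(round_trip_time_s):
    """Distance to target from the pulse's round-trip transit time."""
    return C * round_trip_time_s / 2.0

# An illustrative 667 ns round trip corresponds to a target ~100 m away.
print(f"{lidar_distance_m(667e-9):.1f} m")
```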

5.3 Radar

RADAR stands for "Radio Detection and Ranging." This sensor detects and localizes objects using radio waves with frequencies ranging from 24 to 77 GHz. It is noteworthy that higher measurement accuracy with respect to distance and speed, along with more precise angular resolution, depends on higher radio frequencies. Frequencies above 24 GHz are generally used because they allow a smaller antenna and suffer fewer interference problems. Examples of the various frequency bands [12] used by different sensors are given below:

Short-range radio applications include:

  • Blind Spot Detection (Blind Spot Monitoring)

  • Lane and lane-change assistance

  • Rear-end radar for collision warning or collision avoidance

  • Park Assist

  • Brake Assist

  • Emergency braking

  • Automatic distance control

Radar configurations can be broadly divided into three categories: short-range radar (SRR) with a maximum distance of about 30 meters, medium-range radar (MRR) with about 60 meters, and long-range radar (LRR) with about 250 meters. Short-range radar is increasingly used for blind spot detection, rear and forward collision mitigation, parking assistance, etc. On the other hand, a number of detection systems, namely forward collision warning, cross-traffic alert, stop & go, etc., are operated by medium-range radar. So far the industry has made no specific distinction between SRR and MRR. Nowadays, ultrasonic sensors are gradually being replaced by SRR in highly automated driving. As for the placement of sensors on the vehicle, the forward-looking sensor for long-range detection is generally placed at the front of the vehicle.
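Since the text above ties radar accuracy to frequency and bandwidth, a worked sketch of the basic FMCW relations may help: the beat frequency between transmitted and received chirps is proportional to range, and the achievable range resolution is c/2B, so a wider bandwidth shrinks it proportionally. The chirp parameters below are illustrative assumptions, not values from the chapter.

```python
# Sketch of the basic FMCW radar relations. The radar sweeps its carrier
# over a bandwidth B during a chirp of duration T; the beat frequency
# between transmitted and received signals is proportional to range.
# The chirp parameters are illustrative assumptions.

C = 299_792_458.0  # speed of light, m/s

B = 1.0e9   # sweep bandwidth: 1 GHz (the allocation at 77 GHz)
T = 50e-6   # chirp duration: 50 microseconds (assumed)

def range_from_beat(f_beat_hz):
    """Target range from the measured beat frequency."""
    return C * T * f_beat_hz / (2.0 * B)

def range_resolution_m(bandwidth_hz):
    """Two targets are separable if further apart than c / (2B)."""
    return C / (2.0 * bandwidth_hz)

print(f"Beat of 1 MHz -> range {range_from_beat(1e6):.1f} m")
print(f"Resolution at 1 GHz: {range_resolution_m(1.0e9)*100:.0f} cm, "
      f"at 4 GHz: {range_resolution_m(4.0e9)*100:.1f} cm")
```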

For a ‘cocoon’ radar system, extra sensors are placed on each side mid-body. Ideally, these radar sensors work in the 79-GHz frequency band with a 4-GHz bandwidth; global frequency regulations, however, so far allow only 1 GHz of bandwidth at 77 GHz. Nowadays a radar MMIC (monolithic microwave integrated circuit) monolithically integrates three transmit channels (TX) and four receive channels (RX). Whether it makes sense to integrate baseband processing in the MMIC, or whether it is better to concentrate on a raw-data radar sensor, is a matter of debate.

The difference is that the output of the baseband processor provides so-called pre-targets. In this case the data is pre-processed, i.e., unverified information on speed, distance, signal strength, horizontal angle, and vertical angle for each detected object. The raw-data radar sensor, by contrast, presents unfiltered raw data to the ECU for processing. Figure 9 demonstrates the architecture of such a raw-data radar sensor. Partitioning the radar sensor in this way simplifies the fusion of the video, radar, and LIDAR data, since the same communication interface can be used. A prerequisite for the development of millimeter-wave integrated circuits is dedicated high-frequency (HF) technologies that can realize the required frequencies (24 GHz or 77 GHz) and the corresponding output power. Table 2 summarizes the properties of radar sensors in certain ADAS.

Figure 9.

Radar architecture for processing of raw data. Source: https://www.sensorsmag.com/components/three-sensor-types-drive-autonomous-vehicles.

Property | Present in systems | Comment
Frequency: 76–77 GHz; Range: 1 to 200 m; Search area: 12°; Speed measurement precision: < 0.2 km/h; Angular precision: < 0.3° | Adaptive Cruise Control; Lane Change Assistant | Long range; Pulse Doppler; active sensor
Frequency: 24.125 GHz; Distance range: 10 m; Velocity range: 60 m/s; Field of view: ±50° horizontal; Accuracy: ±0.05 m, ±0.5 m/s; Smallest object: metal bar 10 mm dia. (vertically placed at 1.5 m); Dimensions: 90 x 40 x 15 mm | Adaptive Cruise Control; ACC/Stop & Go; ACC/Stop & Go + Foresight; Lane Change Assistant; Automatic Parking | Short range
Frequency: 24 GHz | Forward Collision Warning | Forward looking; long range
Frequency: 24 GHz; Frequency: 5.8 GHz | Side Obstacle Detection | Side looking; short range
F = 76.5 GHz; Resolution = 100 cm; Bandwidth = 100–500 MHz; Range = 7–150 m | Near-field Collision Warning; Lane Keeping Assistant; Obstacle & Collision Warning; Rural Drive Assistance; Obstacle and Collision Avoidance | Long range; Pulse Doppler; active sensor
Frequency: 24 GHz UWB (Ultra-Wide Band); Resolution = 3 cm; Bandwidth = 5 GHz; Range: 0.3–30 m; Transmission power = −41.3 dBm/MHz | Pre-Crash Collision and Mitigation System | Short range; active sensor
Infrared radar | Pre-Crash Collision and Mitigation System | Near Infrared (NIR); Far Infrared (FIR)

Table 2.

Summary table of the properties of radar sensors in certain ADAS. Source: Ref. [12].

Multiple transmitters and receivers are generally built in to determine the range, angle, and velocity of objects in the field of view. The sensors involved comprise ultra-short-range radar (USRR), short-range radar (SRR), medium-range radar (MRR), and long-range radar (LRR) sensors or systems.

5.4 Ultrasonic sensing system

The primary principle of ultrasonic technology is to transmit short bursts of sound waves that bounce back after hitting objects; the distance is derived from the time the echo takes to return at the speed of sound, approximately 346 m/s. Ultrasonic sensors are increasingly used in the automotive industry for detecting obstacles at short range. They are generally characterized by an ultrasonic operating frequency in the tens of kHz (see Table 3) and a detection range of one to three meters, with a horizontal beam width of up to 100° and a vertical beam width of up to 60°. The ultrasonic and radar technologies complement each other to achieve a higher degree of accuracy.

Ultrasonic sensing is generally meant for short-distance applications at low speeds, such as park assist, self-parking, and blind spot detection. For maximum coverage, an automotive ultrasonic system typically uses multiple sensors placed in the wing mirrors and the front and rear bumpers. Ultrasonic sensing is a more cost-effective approach than cameras, which have poor close-distance detection. Though infrared sensing is cheaper than ultrasonic, it is less accurate and cannot function properly in direct sunlight. Objects closer to the transmitter generate a stronger echo than more distant ones, so, to avoid false positives, the system rejects all inputs below the noise threshold. The important parameters in an ultrasonic sensor's specification are frequency, sensitivity, and directivity. The system is further characterized by the tunable transformer required to excite the transducer.
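A minimal echo-ranging sketch follows, assuming the chapter's round figure of 346 m/s for the speed of sound and an illustrative noise floor for rejecting weak echoes:

```python
# Echo-ranging sketch using the chapter's figure of roughly 346 m/s for
# the speed of sound. The noise floor value is an illustrative assumption.

SPEED_OF_SOUND = 346.0  # m/s (varies with temperature and humidity)

def echo_distance_m(echo_delay_s):
    """Object distance from the time between burst emission and echo."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

def detect(echoes, noise_floor=0.05):
    """Reject echoes below the noise floor (false-positive suppression),
    then convert the remaining delays to distances."""
    return [echo_distance_m(t) for (t, amp) in echoes if amp > noise_floor]

# (delay in seconds, normalized amplitude); an 11.6 ms round trip is ~2 m.
echoes = [(0.0116, 0.60), (0.0170, 0.02)]  # second echo is below the noise
print(detect(echoes))  # -> [~2.0]
```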

A tuning capacitor built into the system matches the resonant frequency between the transducer and the transformer. The speed of sound in air is affected by air temperature, humidity, and wind. If multiple sensors are used, they must be spaced sufficiently far apart so that their signals do not interfere. Figure 10 shows the features of an ultrasonic system (Table 3).

Figure 10.

This ultrasonic system features a PGA450 analog front end (source: Author/PGA450-Q1 PDF).

Property | Present in systems | Comment
F = 40 kHz; Distance range: 0 to 3 m; Distance accuracy: 10 cm; Angular range: 120°; Angular accuracy: ±5°; Response time: 60 ms | Pre-Crash Collision and Mitigation System; Obstacle & Collision Warning | In adverse weather conditions

Table 3.

Summary table of the properties of an ultrasonic sensor in certain ADAS. Source: https://www.embedded.com/how-smart-sensors-enhance-adas-designs/.


6. Understanding the design of ADAS

To make ADAS commercially viable, three aspects, namely design, testing, and validation, are of great importance and pose challenges to researchers, scientists, and manufacturers. The processing and sharing of information within the fusion system in real time requires huge computational effort and is a complex and difficult task in view of the computational load and the time constraints placed on the system.

Inertial navigation systems provide position, orientation, and velocity measurements. The RT-Range sensor [13] creates a real-time network capable of tracking multiple targets and calculating distance, time to collision, and other relative measurements. Targets primarily include road vehicles, vulnerable road users (VRUs) such as cyclists or pedestrians, Euro NCAP targets, traffic assets, and more. Euro NCAP (the European New Car Assessment Programme) is a European car safety performance assessment programme. Data is available in real time on a software dashboard and is captured to verify test outcomes. Vehicle-to-vehicle measurements can be made over a 1 km range. Many similar systems, under slightly different names, are increasingly seen in different parts of the world.

The various ADAS functions and their associated sensors are presented in Table 4. A number of sensors developed during the evolution of ADAS are briefly discussed below.

6.1 Night vision

Night vision is concerned primarily with proper visibility, for which the camera plays an important role. The camera for this purpose therefore uses near- or far-infrared imaging to improve the driver's perception in dark conditions. The improved night view created by the near- or far-infrared camera is displayed on the vehicle's monitors. The human-machine interface, though it poses the challenge of showing the roadside picture correctly for timely intervention, plays an important role in enhancing safety without distracting the driver. Table 5 presents the available sensors and their properties for night vision.

Table 4.

Various sensors related to their applications. Source: Automotive ADAS Systems, ST Developers Conference, September 12, 2019, Santa Clara Convention Center, Mission City Ballroom, Santa Clara, CA.

Sensor | Property | Comment
Infrared camera (CMOS) | λ = 800 nm | Near Infrared (NIR)
Infrared camera (CMOS) | λ = 7–14 μm | Far Infrared (FIR)
Both systems are mono-camera.

Table 5.

Available sensors and their properties in night vision systems. Source: Ref. [12].

6.2 Lane departure warning

The lane departure warning mechanism works on the principle of thresholds with respect to distance and time to lane crossing. A decision derived from data fusion analysis, supported by software algorithms, warns the driver that he or she is about to depart from the traffic lane. Sensors (acoustic or optical) continuously generate and analyze data, which, together with the video image processing data from the vehicle's cameras, results in a warning to the driver. For the warning system to be effective, the carriageway must have clearly visible lane markings; these influence the complexity of the roadside part of the system. The system aims to prevent involuntary lane departure, which constitutes a relevant cause of road accidents. With real-time measurement and positional accuracy generally better than 2 cm, the system captures the data with which the sensor performs the lane departure task, as shown in Figure 11. The lane departure warning system warns the driver if the vehicle suddenly changes lane without proper indication. The camera used for lane detection is low cost and generally mounted on the windscreen near the rear-view mirror; this position helps it continuously capture the image of the solid lane markings on the road ahead. It works along with the front sensors (adaptive cruise control and forward collision warning), side sensors (lane departure warning), and rear sensors (blind spot detection).
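As a sketch of the threshold logic described above, the following Python fragment computes the time to lane crossing from an assumed lateral gap and drift speed; the 1 s warning threshold and the turn-signal check are illustrative assumptions, not values from the chapter.

```python
# Threshold sketch for a lane departure warning. Inputs are assumed to
# come from the camera pipeline: lateral gap from tire to marking and
# lateral drift speed. The 1 s threshold is an illustrative assumption.

def time_to_lane_crossing_s(lateral_gap_m, lateral_speed_ms):
    """Seconds until the wheel reaches the marking; infinite when the
    vehicle is drifting away from it."""
    if lateral_speed_ms <= 0:
        return float("inf")
    return lateral_gap_m / lateral_speed_ms

def should_warn(lateral_gap_m, lateral_speed_ms, turn_signal_on,
                tlc_threshold_s=1.0):
    """Warn only on unindicated drifts that will cross the line soon."""
    if turn_signal_on:  # an indicated lane change is intentional
        return False
    return time_to_lane_crossing_s(lateral_gap_m,
                                   lateral_speed_ms) < tlc_threshold_s

print(should_warn(0.3, 0.5, turn_signal_on=False))  # True: crossing in 0.6 s
```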

Figure 11.

How it works: Windshield camera tracks lane markings. Source:https://www.extremetech.com/extreme/165320-what-is-lane-departure-warning-and-how-does-it-work.

6.3 Near field collision warning

There are multiple collision warning systems [12], listed in Table 6.

Sensor | Property | Comment
Infrared camera (CMOS) | λ = 800 nm | Near Infrared (NIR)
Infrared camera (CMOS) | λ = 7–14 μm | Far Infrared (FIR)
Both systems are mono-camera.

Table 6.

Available sensors and their properties in night vision systems.

The finest example of near-field collision warning is blind spot detection, which involves detecting vehicles in very close proximity. Lidar-, radar-, or vision-based sensors are generally used; the warning itself may be acoustic, haptic, or optical. In many cases the operating frequency of this kind of sensor is 24 GHz. To test and develop blind-spot detection systems, it is necessary to accurately measure the position and trajectory of targets relative to the vehicle under test (VUT). The system may require the following protocol accuracy:

  • Relative accuracy 2 cm

  • Heading accuracy 0.1°

  • Free post-processing software

  • Ability to track multiple objects in real-time

  • Perfectly suited to open-road testing

To evaluate blind-spot detection systems, an RT inertial navigation system and an RT-Range S [7] are installed in the vehicle under test. This powerful system is designed to work in conjunction with GNSS-aided inertial navigation products. Automobiles can be equipped with GNSS receivers, which display moving maps and information about location, speed, direction, nearby streets, and points of interest. The sensor works by measuring the real-time distance between the sensor and the identified object, which may be any type of vehicle, the blind corner of a junction, a pedestrian, a bicycle, etc.

For real-time testing, range measurements from the RT-Range S Hunter can be output via Ethernet or CAN (Controller Area Network), a communication standard that allows parts of a system to communicate without the intermediary of a central computer. Alternatively, data can be logged internally and analyzed back at base, where it can be post-processed and exported in CSV ("Comma-Separated Values") format, which is often used to exchange data between dissimilar applications.

6.4 Forward collision warning

The warning system developed by EATON-VORAD in the USA for trucks and buses [13] can be considered the first step towards advanced driver assistance systems (ADAS) in the form of a forward collision warning system. Forward collision warning with a frequency of 24 GHz first appeared on the US market in 1995. It detected objects and signaled the driver, optically or acoustically, when an object was close to the collision path.
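The decision logic of such a warning can be sketched with the classic time-to-collision (TTC) ratio of range to closing speed; the 2.5 s threshold below is an illustrative assumption, not a value from the chapter.

```python
# Time-to-collision (TTC) sketch for a forward collision warning,
# assuming the radar supplies range and closing speed to the lead
# object. The 2.5 s warning threshold is an illustrative assumption.

def time_to_collision_s(range_m, closing_speed_ms):
    """TTC = range / closing speed; no collision course if not closing."""
    if closing_speed_ms <= 0:
        return float("inf")
    return range_m / closing_speed_ms

def forward_collision_warning(range_m, closing_speed_ms,
                              ttc_threshold_s=2.5):
    return time_to_collision_s(range_m, closing_speed_ms) < ttc_threshold_s

# Lead vehicle 40 m ahead, closing at 20 m/s -> TTC = 2 s -> warn.
print(forward_collision_warning(40.0, 20.0))  # True
```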

6.5 Side obstacle detection

This system uses a side-looking short-range radar that operates at 24 GHz. The sensor identifies and detects side obstacles, which are signaled on a suitable display. As a further option, the system can also be linked to the engine control with a view to controlling speed; this function is called "Smart Cruise". More recently, a side obstacle detection system based on a camera sensor and image processing has also been introduced on Volvo cars.

6.6 Curve & speed limit information

This system informs the driver about speed limits and recommends speeds at curves. The relevant information is generated from digital maps, image processing, or communication between vehicles and the road infrastructure. Updated real-time data from these sources helps the driver recognize the speed limit of the road on which the vehicle is traveling. Details of road features, such as the exact location of traffic markings and the position of street lights, are available in the ADAS digital map, which helps in identifying and recognizing the speed limit.

6.7 Adaptive cruise control (ACC)

This system was introduced first in Japan, and then in Europe, for the car market. ACC systems are based on a front-looking sensor using laser radar (LIDAR) or microwave radar with a maximum detection range of around 100 m. The microwave radar sensor operates in the 76–77 GHz band that has been reserved for automotive obstacle detection. Based on front-vehicle information, mainly distance and speed, the ACC system regulates its own vehicle's speed by acting on the engine control and braking system. ACC is an extension of the standard cruise control system, with the extra capability of adapting the speed of the vehicle to the speed of the preceding one. The function was first introduced in Japan in 1995 based on LIDAR technology.

Europe saw the emergence of lidar and microwave technology in the following years, which led to the introduction of these technologies in Mercedes cars in 1999. It is noteworthy that the ACC system was also fitted to trucks manufactured by Mercedes. Presently around twenty automobile manufacturers produce cars and trucks of this type.
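The control idea, regulating toward the set speed but falling back to gap keeping when a slower lead vehicle is detected, can be sketched as follows; the gains, the constant time-gap policy, and the comfort limits are illustrative assumptions, not values from the chapter.

```python
# Sketch of the ACC idea described above: regulate toward the driver's
# set speed, but switch to gap keeping when a slower lead vehicle is
# detected. Gains, the constant time-gap policy, and the comfort limits
# are illustrative assumptions, not values from the chapter.

def acc_command(set_speed, own_speed, lead_range=None, lead_speed=None,
                time_gap_s=1.8, kp_speed=0.4, kp_gap=0.2):
    """Longitudinal acceleration command in m/s^2
    (positive = throttle, negative = brake)."""
    # Default: plain cruise control toward the set speed.
    accel = kp_speed * (set_speed - own_speed)
    if lead_range is not None:
        desired_gap = time_gap_s * own_speed  # constant time-gap policy
        gap_accel = (kp_gap * (lead_range - desired_gap)
                     + kp_speed * (lead_speed - own_speed))
        accel = min(accel, gap_accel)  # never accelerate into the lead car
    return max(-3.0, min(1.5, accel))  # comfort/actuator limits

# Set speed 30 m/s, lead car 35 m ahead at 22 m/s -> braking command.
print(f"{acc_command(30.0, 25.0, lead_range=35.0, lead_speed=22.0):+.2f} m/s^2")
```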

Testing is based on a high-performance GNSS/INS for dynamic applications, built on the convenience of a conventional cruise control system that automatically changes speed to match the vehicular flow in front. It is important to determine precisely when and how the system intervenes, how well it acquires and then tracks targets, and how it performs in a number of different real-world scenarios [6]. Measurements such as target bearing, distance, relative velocity, and time-to-collision are key to the evaluation of these systems. Sensors with RT and RT-Range for ACC offer the following characteristics:

  • Relative accuracy 2 cm

  • Heading accuracy 0.1°

  • Real-time bird's-eye view showing measurements

  • Ability to track multiple objects in real-time

  • Perfectly suited to open-road testing

To obtain accurate vehicle-to-vehicle measurements, an RT inertial navigation system and RT-Range S [7] are installed in the vehicle under test (VUT) and in any target vehicles. The RT inertial navigation system takes a number of parameters into account, including position (latitude, longitude, and altitude) in its coordinate frame, as well as velocity, acceleration, orientation, angular rates, and slip angle. RT-XLAN Wi-Fi radios then send real-time information from the target vehicles back to the VUT, where the RT-Range S calculates, logs, and outputs real-time measurements of the relative position of the target vehicles. The output measurements include the position, orientation, and velocity of both the Hunter and the target vehicles. The current status of the ACC hardware can also be logged with the data via a CAN bus interface, a robust vehicle bus standard designed to allow microcontrollers and devices to communicate with each other without a host computer, or synchronized later with the measurements via a GPS time stamp. Moreover, some manufacturers offer ACC in combination with a lane warning system.

Frequency allocation for 24 GHz sensors is required. The properties of the various sensors associated with the functioning of ACC are presented in Table 7:

Sensor | Property
LIDAR | Wavelength λ: 850 nm
Radar | Frequency: 76–77 GHz; Range: 1 to 200 m; Resolution: 100 cm; Search area: 12°; Speed measurement precision: < 0.2 km/h; Angular precision: < 0.3°
Radar | Frequency: 24.125 GHz; Distance range: 10 m; Velocity range: 60 m/s; Field of view: ±50° horizontal; Accuracy: ±0.05 m, ±0.5 m/s; Smallest object: metal bar 10 mm diameter (vertically placed at 1.5 m); Dimensions: 90 x 40 x 15 mm

Table 7.

Available sensors with their properties in ACC. Source: Ref. [12].

6.8 ACC/stop & go

Adaptive cruise control (ACC) permits a driver to travel with the flow of traffic. A radar sensor monitors the situation in front of the vehicle: as long as the road ahead is clear, ACC maintains the desired speed, and if the radar sensor detects a slower vehicle ahead, ACC automatically adjusts the speed to maintain a preset distance. In the Stop & Go version, the system slows the car down in a traffic jam, or even brings it to a complete halt. If the car has an automatic transmission, Stop & Go also restarts the engine once traffic gets moving again after a brief pause.

6.9 Stop & go

In this system the driver continues to receive support from the sensor for longitudinal control in queuing situations. During the stop & go of the vehicle in front, longitudinal control is carried out by the system, which detects nearby objects.

6.10 Lane keeping assistant

The function of a lane keeping assistant includes lane detection and feedback to the driver if he is leaving a defined trajectory within the lane. Lane departure warning systems merely alert the driver when the car is leaving its lane, while lane keeping assist actively works to keep the car from moving out of the lane. An active steering wheel can help the driver with force feedback to stay on this trajectory. The lane is detected by a video image processing system. In addition to the lane departure warning aspects, especially those regarding the infrastructure, the HMI becomes more important here.

The driver receives assistance through the steering wheel and the other controls used when making decisions about the vehicle's movement; these link to the controller, which also helps the lane keeping assistance adhere to lane driving.
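A minimal sketch of the force-feedback idea follows: a corrective steering torque proportional to lateral offset and heading error, capped so that the driver can always override it. The gains and the torque cap are illustrative assumptions.

```python
# Force-feedback sketch for lane keeping assist: a corrective steering
# torque proportional to lateral offset and heading error, capped so the
# driver can always override it. Gains and cap are illustrative assumptions.

def lka_torque_nm(lateral_offset_m, heading_error_rad,
                  k_offset=2.0, k_heading=4.0, max_torque_nm=3.0):
    """Corrective torque toward the lane-center trajectory."""
    torque = -(k_offset * lateral_offset_m + k_heading * heading_error_rad)
    return max(-max_torque_nm, min(max_torque_nm, torque))

# Drifting 0.4 m to the right of center with a 2-degree heading error.
print(f"{lka_torque_nm(0.4, 0.035):+.2f} Nm")  # gentle corrective torque
```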

The protocol accuracy requirements [12] for this are as follows:

  • Axes to be in ISO 8855:1991 orientation

  • Longitudinal speed to 0.1 km/h

  • Update rate at least 100 Hz

  • Time synchronization via DGPS (Differential GPS)

  • Position to 0.03 m

  • Yaw velocity to 0.1°/s

  • Acceleration to 0.1 m/s²

  • Vehicle edge to lane edge measurements

For the LSS (Lane Support System) LKA tests, the key measurement is the distance between the outer-edge bulge of the front tires and the inside edge of the lane markings when an intervention is triggered.

6.11 Local hazard warning

If a hazard occurs far ahead of the vehicle, where the driver cannot see it, this system will warn the driver. By means of communication it is possible to transfer this information over long distances; a usable frequency has to be allocated. Local Hazard Warning [14, 15] is a system that uses short-range communication between cars, and between a car and its surroundings, to give drivers early warning of safety hazards. For example, a car equipped with Local Hazard Warning might issue a warning to other vehicles if it had broken down in the middle of a carriageway or had been involved in a collision. Similarly, emergency vehicles equipped with such a system might send a signal to nearby vehicles to warn them of their presence, or temporary roadwork barriers could issue such warnings. As well as transmitting such warnings, cars equipped with Local Hazard Warning can also receive these signals and use them to alert the driver to the danger [16].

6.12 Automatic parking

Automatic parking is a function that helps the driver enter a parking slot in a parallel maneuver by automatically acting on the steering wheel and engine control. The sensors measure [12] objects with the following accuracy:

  • Relative accuracy 2 cm

  • Heading accuracy 0.1°

  • Real-time bird's-eye view showing measurements

  • Ability to track multiple objects in real-time

The vehicle is fitted with a GNSS-aided inertial navigation system (GNSS/INS). In most cases (because of the low speeds involved), a dual-antenna model is fitted to maintain the best heading accuracy at all times. The properties of the various sensors [12] are presented in Table 8.

Sensor | Property | Comment
Laser | Beam deflection: horizontal; Range: 0–80 m (0–35 m @ Rr = 5%); Resolution: 20 mm; Accuracy: ±50 mm; Frequency: 10–40 Hz; Cycle time: 25–100 ms; Vertical opening angle: ~3.5°; Horizontal angular field: ±120°; Lateral resolution: 0.25°–1° |
Radar | Frequency: 24.125 GHz; Distance range: 10 m; Velocity range: 60 m/s; Field of view: ±50° horizontal; Accuracy: ±0.05 m, ±0.5 m/s; Smallest object: metal bar 10 mm diameter (vertically placed at 1.5 m); Dimensions: 90 x 40 x 15 mm | Short range

Table 8.

Various sensors available for automatic parking and their properties.

6.13 Pre-crash collision and mitigation system

Pre-crash safety systems identify an imminent crash and deploy safety devices such as seat belt pretensioners. To reduce the damage of an accident, the system is designed to apply the brakes automatically after identifying an imminent collision. As discussed earlier, various sensors such as lidar and cameras play an important role in identifying the obstacle involved in an imminent collision. The feature is primarily designed for safety and integrates seat belt sensing: an occupant wearing a seat belt during a road accident has a far lower chance of being injured. Most seat belts now available in cars are sensor-monitored, and the vehicle will not move if the occupants do not wear them.

6.14 Obstacle & collision warning

The driver is warned if a potential collision, for example with another car or an obstacle, is detected. This warning can be, for example, acoustic or visual. The functional limits of these systems have to be clearly pointed out.

In city environments, collisions between vehicles and pedestrians or cyclists often result in serious injuries, as there is little time for either party to react. The protocol accuracy requirements [12] for this kind of collision are the following:

  • Update rate at least 100 Hz

  • Lateral path error

  • Time synchronization via DGPS

  • Position to 0.03 m

  • VUT (Vehicle under test) Speed to 0.1 km/h

  • Yaw velocity to 0.1°/s

  • Acceleration to 0.1 m/s²

  • Polygon perimeter shapes

6.15 Intersection safety

In an intersection situation, especially in cities, a driver has to perform several tasks in parallel. To assist the driver in such situations, it is necessary to support certain tasks, such as approaching a stop sign or traffic light, or yielding the right of way to crossing traffic. The complexity of possible intersection scenarios leads to a high probability of accidents. Because intersections must accommodate numerous turning movements of motorized traffic together with non-motorized and pedestrian traffic, detection and recognition are not as simple as on a straight section of road. Due to these complexities, intersection safety must take all possible scenarios into account to create a hazard-free zone.


7. Autonomous driving

In autonomous driving, the vehicle is controlled by a computer algorithm in every situation. It is presently recognized that fully autonomous vehicles cannot be deployed on the actual road network immediately. True Level 5 full autonomy is expected to be about ten-plus years away, while geo-fenced applications of autonomous vehicles (AVs) are expected within the next three to five years. Progress on hardware as well as software has been very significant: the cost of LIDARs (light detection and ranging sensors), for example, has dropped by a factor of ten over the last five years, and the computational capacity of GPUs (graphics processing units) has increased similarly. ISO 39003 is now being developed as guidance on safety and ethical considerations for autonomous vehicles, in order to ensure that such vehicles are safe and smooth from an operational point of view.


8. Conclusions

Although there have been many demonstrations of advanced vehicles at Level 3 or above, automobile manufacturers have so far not been able to commercialize highly automated vehicles, which require detailed and comprehensive legislation in each country. The International Organization for Standardization is presently working on standards for automated vehicles. A number of fundamental aspects of ADAS, each part of the complex process of the system, have been discussed. Level 2 ADAS is becoming increasingly available on the market in western countries, with a corresponding increase in cost. The manufacturers of ADAS-equipped vehicles have not yet made a significant impact on sales of this type of vehicle, though no significant negative effects have been experienced so far. R&D into ADAS is being accelerated to enhance safety.

Though ADAS-equipped vehicles are yet to find their place in the market, despite the apprehensions raised by many about safety-related issues, their real importance will be appreciated when they evolve into a cooperative road-vehicle-highway system, reducing the probability of accidents to almost zero.

The European Community [16] is leading by investing significantly in R&D into ADAS in Europe. Many countries, such as France, the Netherlands, and the UK, are increasingly taking an active role by participating in research activities and promoting successful implementation. The most important issues for ADAS involve two key factors: i) a high level of usability and ii) a low financial risk to the manufacturer. For the time being, ADAS user benefits are not yet clear and financial risks still exist.

As far as legal aspects are concerned, the relation between ADAS and product liability is very important. Product liability for ADA systems will have to address specific additional requirements, in particular taking into account the interaction of drivers/users with the product within the current legal framework. A Code of Practice addressing these requirements is being developed by the European car manufacturers. The ADASE II technology roadmap for ADAS confirmed the expectation that ADAS will have potential benefits for safety, throughput, and comfort, ranging from positive to very positive. Further technology development and R&D are still required to improve the performance of ADAS, to cover wider ranges of traffic scenarios, and to bring down costs. Political motivation and intervention may be needed to advise the various decision makers in order to accelerate and facilitate (or regulate) the market introduction of ADAS.

Therefore, governments should come forward, along with the concerned stakeholders such as road operators, car manufacturers, and users, to jointly set up an environment conducive to promoting the advance of ADAS. ADAS-equipped vehicles should be made commercially viable in the market by addressing the relevant legal issues in society.

References

  1. https://www.acea.be/uploads/publications/20090831_Code_of_Practice_ADAS.pdf
  2. http://www.hitachi-automotive.us/Products/oem/DCS/ADAS/index.htm
  3. Mahdi Rezaei Ghahroudi and Reza Sabzevari, "Multisensor Data Fusion Strategies for Advanced Driver Assistance Systems", https://www.researchgate.net/publication/221787813
  4. Madhu B K, Karthik Koti, K Surabhi, Nikhil U, Yashwanth M, "Vehicle Collision Avoidance System", International Research Journal of Engineering and Technology (IRJET), e-ISSN: 2395-0056, Volume 07, Issue 06, June 2020
  5. https://www.kpit.com/insights/architectural-concepts-for-autonomous-driving-applications/ (Sensor_Architecture)
  6. https://pdfs.semanticscholar.org/4bce/e9bcdfae49c3654429f03be03fb8c5a7836e.pdf
  7. https://www.oxts.com/industry/adas-testing/
  8. https://www.sae.org/news/press-room/2018/12/sae-international-releases-updated-visual-chart-for-its-%E2%80%9Clevels-of-driving-automation%E2%80%9D-standard-for-self-driving-vehicles
  9. https://www.first-sensor.com/en/applications/mobility/
  10. https://www.embedded.com/how-smart-sensors-enhance-adas-designs/
  11. Motaz Khader and Samir Cherian, An Introduction to Automotive LIDAR, Texas Instruments
  12. Martijn Pijpers, Capita Selecta: Virtual Reality, Sensors in ADAS, University of Twente, 8 January 2007
  13. https://support.oxts.com/hc/en-us/articles/115002772345-RT-Range-Measurements
  14. www.sae.org/publications/technical-papers/content/2009-01-2911/
  15. https://www.autocarpro.in/news-international/volkswagen-local-hazard-warning-car2x-safety-tech-wins-euro-ncap-award-55905
  16. Information and communication technologies for safe and intelligent vehicles, Communication from the Commission to the Council and the European Parliament, SEC(2003) 963
