Open access peer-reviewed chapter

Tactical Communications for Cooperative SAR Robot Missions

Written By

José Manuel Sanchez, José Cordero, Hafeez M. Chaudhary, Bart Sheers and Yudani Riobó

Reviewed: 27 April 2017 Published: 23 August 2017

DOI: 10.5772/intechopen.69494

From the Monograph

Search and Rescue Robotics - From Theory to Practice


Abstract

This chapter describes how the ICARUS communications (COM) team defined, developed and implemented an integrated wireless communication system to ensure an interoperable and dependable networking capability for both human and robotic search and rescue field teams and crisis managers. It starts explaining the analysis of the requirements and the context of the project, the existing solutions and the design of the ICARUS communication system to fulfil all the project needs. Next, it addresses the implementation process of the required networking capabilities, and finally, it explains how the ICARUS communication system and associated tools have been integrated in the overall mission systems and have been validated to provide reliable communications for real‐time information sharing during search and rescue operations in hostile conditions.

Keywords

  • communications
  • mesh
  • contention
  • optimisation
  • middleware
  • propagation

1. Introduction

First responders’ communications (COM) have become a key concern in large crisis events which involve numerous organisations, human responders and an increasing amount of unmanned systems which offer precious but bandwidth‐hungry situational awareness capabilities.

The ICARUS team in charge of developing the COM system, led by INTEGRASYS with contributions from RMA and QUOBIS, has designed, implemented and tested in real-life conditions an integrated multi-radio tactical network able to fulfil the new demands of cooperating high-tech search and rescue teams acting at incident spots. The ICARUS network offers interoperable and reliable communications, with particular consideration of cooperative unmanned air, sea and land vehicles.

In this chapter, we describe the different phases of this work. Starting from the requirements collected from high-level mission managers and specific platform operators, we present the key design decisions taken by the COM team, follow with implementation details, and conclude with the COM system results obtained during the different trials conducted by the project.


2. Communication scenarios and requirements

Proper communication systems are needed to ensure the networking capability that allows SAR team members (robots and humans) and operations managers to share real-time information under the hostile operating conditions characterising disaster-relief operations [1–3]. These conditions mandate the use of wireless communication technologies to support the inherently mobile nature of operations [4, 5].

Figure 1 depicts the general information exchanges occurring in typical disaster-relief operations where multiple SAR teams are acting. An entity named the on-site operations coordination centre (OSOCC) acts as the central coordination centre for all operations and is placed close to the disaster zone. First, area reduction and sectorisation tasks are performed by the OSOCC to quickly identify and analyse priority actuation areas so as to allocate specific sectors to available SAR teams. These initial planning activities are likely done by the OSOCC with the support of unmanned assets of SAR teams temporarily collocated with the OSOCC.

Figure 1.

High‐level communications in ICARUS. (Source: ICARUS).

The SAR teams are groups of first responders equipped with unmanned vehicles that perform SAR operations in an allocated area. They use team-internal communications (labelled Field Team Communications in the figure) to perform their activities, in particular sharing sensor information captured by human or robotic responders and commanding unmanned vehicles from control stations. The SAR team activities are supervised and coordinated by the OSOCC using field mission communications, which serve, for example, to report rescued victims, current team members' locations, new actuation areas, etc. Both the OSOCC and the SAR teams may make use of external communications with distant entities, such as agency headquarters for logistic coordination, or data servers providing background or newly acquired information about the disaster area.

Building upon the reference ICARUS communication scenario described above, the ICARUS COM team worked in close cooperation with other project teams to gather a list of relevant requirements to guide the COM system design and subsequent implementation. Feedback obtained from end users (SAR organisations) participating in the project, either as partners or as end user board (EUB) experts, was used to compile the list of essential high-level requirements shown in Table 1. From this list, we highlight in particular the need to use non-reserved spectrum for the operations, due to the likely impossibility of using pre-existing local communication infrastructures and of coordinating with the national spectrum regulator in the early phases of a crisis event.

End users COM requirements

Description                                 Level
Affordable solution                         Mandatory
Support sectorisation and SAR operations    Mandatory
QoS support                                 Mandatory
Over-the-air security                       Mandatory
Ad hoc capability                           Mandatory
Unlicensed spectrum operation               Mandatory
Easy and uniform management and control     Mandatory
High temporal and spatial availability      Mandatory
Interoperability with existing networks     Desirable

Table 1.

ICARUS communication requirements stated from SAR end users.

Furthermore, in collaboration with the different project teams in charge of defining overall user requirements, providing unmanned platforms and developing the interoperable Command and Control (C2) tools, an extended view of the communication architecture was elaborated, together with a list of quantitative performance targets for the ICARUS COM system based on the expected equipment sizing of future SAR teams.

Figure 2 shows the refined view of the communications architecture, where the field team communications within a SAR team operation area are populated with different entities and networking segments. This architecture constitutes the reference ICARUS communication model and reflects the typical command and control architecture of future SAR missions making use of ICARUS tools. Each SAR team has a base of operations (BoO) entity which coordinates different squads, namely groups of human and robot responders working in a specific spot within the assigned SAR team area. Making use of a squad coordination network, each squad operates its unmanned assets through a robot command and control (RC2) station, which additionally serves as a base station for human communications, either voice-based or message-based. The BoO receives mission guidance from and reports mission status to the OSOCC through the team coordination network segment, and at the same time executes the assigned team mission in coordination with the different squads through the squad coordination network segments. Figure 2 also shows several COM management entities residing on the different system entities; these form a hierarchical structure that cooperatively performs all management and control functions on the underlying COM resources, allowing first responders and their tools to be smoothly interconnected during operations. The network segmentation shown in Figure 2 does not assume a corresponding physical segmentation in terms of frequency channels, link-level networks or IP-level networks; it is rather a logical organisation resembling the working structure of teams.

Figure 2.

High‐level communication segments in ICARUS. (Source: ICARUS).

Table 2 gathers the list of key performance targets for the ICARUS COM system, elaborated in cooperation with end user organisations, unmanned platform providers and C2 system providers. The reference team networking scenario consists of several interconnected squads operating in cell areas with a maximum radius of 1500 m and five nodes, including an RC2 station, which should be able to transition across squads within a limited time. A mix of synchronous/asynchronous application traffic is transferred within squads, between the squads and with the OSOCC. The estimated peak capacities include typical video, voice and telemetry/telecommand (TM/TC) feeds (a capacity-budget sketch follows Table 2).

Quantitative requirements

Description                Value                    Scenario        Level
Maximum range              10 km                    Sectorisation   Mandatory
                           1.5 km                   Outdoor SAR
                           500 m                    Indoor SAR
                           100 m                    Rubble SAR
Max. squad nodes           5                        All             Mandatory
Max. squads                3                        All             Mandatory
Critical payload           Video feed (500 kbps)    SAR             Mandatory
                           Exoskeleton (250 kbps)
Peak capacity              100 kbps @ backhaul      Sectorisation   Desirable
                           670 kbps @ spot
                           100 kbps @ backhaul      SAR
                           670 kbps @ spot
Maximum platform mobility  100 km/h                 Sectorisation   Desirable
Squad handover time        30 s                     SAR             Desirable

Table 2.

ICARUS communication performance targets.
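
To make the spot-capacity targets of Table 2 concrete, the following minimal Python sketch (illustrative only, not project code) checks whether a new application flow fits within the 670 kbps spot budget before it is admitted; the flow sizes reuse the critical payload figures from Table 2.

    # Minimal sketch: admission check against the Table 2 spot-capacity target.
    SPOT_CAPACITY_KBPS = 670                    # peak capacity per spot (Table 2)

    def admit(active_kbps, new_flow_kbps, capacity=SPOT_CAPACITY_KBPS):
        """Return True if the new flow fits within the remaining spot capacity."""
        return sum(active_kbps) + new_flow_kbps <= capacity

    active = [500]                              # one critical video feed (500 kbps)
    print(admit(active, 100))                   # True: 600 kbps fits the 670 kbps budget
    print(admit(active, 250))                   # False: video plus exoskeleton (750 kbps)
                                                # exceeds the budget, so one flow must be
                                                # rejected or downgraded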


3. Pre‐existing solutions and design decisions

Providing reliable wireless connectivity during disaster relief presents a significant challenge. For robust and effective disaster response, wireless mesh networking technology offers a way to create adaptive networks in emergency scenarios in which supporting infrastructure is either scarce or non-existent [5–11]. A flexible mesh network architecture that provides a common networking platform for heterogeneous multi-operator networks, for operation in case of emergencies, is proposed in Ref. [5]. In Ref. [12], the authors propose an approach to establish a wireless access network on the fly in a disaster-hit area, relying on surviving access points or base stations and end-user mobile devices. Similar works also appear in Refs. [13, 14]. An ad hoc networking solution is proposed in Ref. [15] to aid emergency response, relying on WiFi-Direct-enabled consumer electronic devices such as smartphones, tablets and laptops. An integrated communication system is proposed in Ref. [16], comprising heterogeneous wireless networks to facilitate communication and information collection on the disaster site. Based on WiMAX technology, without fixed access points, an ad hoc networking solution is proposed in Ref. [17] using UAV relays to realise a backbone network during emergency situations. A similar concept is proposed in Ref. [18] using IEEE 802.11s. A recent work in Ref. [19] employs dual wireless access technologies for robot-assisted SAR operations: one technology providing a long-range, single-hop, low-bandwidth network for coordination and control of the robotic devices, and a second technology providing a short-range, multi-hop, high-bandwidth network for sensor data collection. Ref. [20] proposes a framework for modelling and simulating the communication networks and examining the ways in which availability, quality of the communication links and user engagement affect the overall delays in disaster management and relief. Leveraging the latest advances in wireless networking and unmanned robotic devices, Ref. [21] proposes a framework and network architecture for effective disaster prediction, assessment and relief.

As we have seen in the previous sections, the ICARUS SAR scenario demands QoS-enforced wireless communications for different types of nodes (robots and stations) spread over a relatively large area, in order to provide proper throughput, latency and reliability for the different applications needed to support the missions. Furthermore, future robotic C2 systems enabling higher autonomy, for example, those supported by the JAUS framework selected in ICARUS, will dynamically use centralised and decentralised algorithms [22], demanding from the communications layer the ability to flexibly balance the uplink (transmission) and downlink (reception) capacity of network nodes.

Previous research or demonstration activities dealing with cooperative robotic scenarios similar to ICARUS have commonly deployed different technologies, either standards-based, such as PMR (Professional Mobile Radio, e.g. TETRA), WLAN (802.11 family of standards), WPAN (802.15.4 family) and WMAN (802.16 family), or proprietary solutions in licensed or unlicensed spectrum, complemented with public services such as 3G/4G or WiMAX where these were available in the operations area. As no single communication technology is able to satisfy the varied set of requirements usually demanded by the users, a combination of several datalinks is recurrently used to provide the communication service.

In order to facilitate the selection of the most appropriate datalink technologies for ICARUS, the COM team defined, in cooperation with end users, a reduced set of operational and technology challenges that must be solved to provide a proper, real-world communication solution for the posed scenario. These challenges are shown on the left side of Table 3, followed on the right by the corresponding approaches taken by the COM team to address them building upon existing datalinks.

Cross-cutting, operations and management

  Challenge: Heterogeneity of robotic platforms and operation environments
  Response:  Variety of COM options (HW, radio bands, datalink options) offered in a uniform way

  Challenge: Minimal configuration and integration effort for robot platforms and C2I system providers
  Response:  Custom application MW traffic processing in COM; single interface for COM management collocated with robot fleet management

  Challenge: Guarantee robustness and real-time performance with affordable hardware
  Response:  Reliability enforcement via software

  Challenge: Dynamic allocation of robots to C2I stations (teams)
  Response:  Change of robot-to-RC2 allocations via expedited software reconfiguration

Datalink technology

  Challenge: Maintain reliable connectivity in unlicensed spectrum
  Response:  Cognitive radio, reduced bandwidths, fast channel switching, channel/band aggregation

  Challenge: Achieve long ranges in unlicensed bands
  Response:  Relays, proper bands/channels and transceivers

  Challenge: Maintain shared link/flow status in harsh, highly changing network conditions
  Response:  Network timing, synchronisation and recovery mechanisms

  Challenge: Avoid network congestion
  Response:  Application adaptation, local safeguards, global admission control with pervasive performance monitoring

Table 3.

Key communication challenges in robotic SAR scenarios and the ICARUS responses.

While the various datalink technologies surveyed present rather different features and capabilities, the COM team focused on the specific set of characteristics that best served to solve the challenges identified. As an example, in the following, we list some of the key desired features at the datalink level.

  • Dynamic channel selection and frequency hopping to improve reliability in unlicensed spectrum where multiple competing networks may exist.

  • Multi-hop-capable datalinks, or use of the lowest available spectrum bands (e.g. 433 and 868 MHz) with their favourable propagation conditions, to achieve long ranges in unlicensed spectrum. Both approaches come at the expense of reduced bandwidth.

  • Modulations resilient to non‐line‐of‐sight conditions, link diversity solutions (e.g. link meshing or MIMO antennas), and rate and transmission power control to cope with variable link conditions experienced by mobile nodes, subject, for example, to blocking obstacles.

  • Proper QoS techniques to avoid network congestion while guaranteeing performance for the individual flows generated at the different nodes. QoS can be guaranteed on a deterministic basis with a channel access scheme based (at least partially) on time-slot allocation, which requires time synchronisation between network nodes and may add significant control traffic overhead if frequent reallocation of capacities is needed (see the sketch after this list). QoS performance highly depends on network topology, and some datalink technologies (e.g. those used in sensor networks) are designed for specific application cases (e.g. cluster-tree topologies), which limits their usability in the ICARUS scenarios.
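
As a toy illustration of the slot-based access scheme mentioned in the last bullet, the sketch below builds a frame schedule proportional to per-node throughput demand; the node names, demands and frame size are invented for the example, and rebuilding such a schedule whenever demands change is precisely the control overhead noted above.

    # Illustrative sketch of deterministic, slot-based channel access (assumed values).
    def build_schedule(demands_kbps, slots_per_frame=10):
        """Allocate frame slots proportionally to per-node throughput demands."""
        total = sum(demands_kbps.values())
        schedule = []
        for node, demand in sorted(demands_kbps.items()):
            n_slots = max(1, round(slots_per_frame * demand / total))
            schedule.extend([node] * n_slots)
        return schedule[:slots_per_frame]       # truncate if rounding over-allocates

    print(build_schedule({"RC2": 100, "UGV-1": 500, "UAV-1": 250}))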

These considerations on datalink technologies must be traded off against the desired high-level system features and overall non-functional requirements stated in Table 3, observing at all times the need for an affordable solution.

From a system-level perspective, ICARUS C2 applications operate upon the JAUS middleware, assuming transparent IP connectivity between the different end nodes. Therefore, solutions are needed to integrate the different datalink technologies and link-layer subnetworks into an interoperable IP addressing space and to properly propagate QoS settings for different exchanges from the middleware level down to the datalink layers. Some datalink technologies are not IP-capable due to resource constraints of the node platforms (e.g. sensors), which adds further difficulty and leads to the implementation of IP gateways that must properly translate all needed IP protocols to the link layer. On the other hand, a specific requirement is the ability to transfer the control of robots between different stations potentially operating in different areas, so roaming over different network segments is required. There are generic solutions at the IP level which provide multi-homing and mobility support, but they are rarely applied in ICARUS-like scenarios due to the effort needed to synchronise mechanisms at the IP level with those needed at the underlying link level for the several datalink technologies used.

With all of the above considerations in mind, a detailed comparative study of available solutions was made, resulting in the final selection of the following technologies:

  • ETSI digital mobile radio (DMR) datalink [23] for long-range, low-rate communications between control stations and robots. Aiming at an open and affordable hardware implementation using commercial components, a Tier-2 direct-mode operation was selected, with multiple coding options to trade capacity against range. This is extended with software-based functions providing valuable services such as node discovery and capacity management. The latter makes it possible to accommodate different traffic arrival patterns and latency requirements while procuring maximum network utilisation.

  • IEEE 802.11n network [24] with meshed multi-hop support to interconnect the different squads, teams and the OSOCC. Building upon commercial transceivers, extended management and control functions based on open, Linux-based software were developed to achieve high performance in ICARUS environments, based on the smart handling of channel, power/rate, CSMA and EDCA parameters. Spectrum-level functions such as channel selection and power control are supported by cognitive radio techniques [25], aiming at operation with minimum interference and maximum spatial reusability. The use of such cognitive radio features in disaster response networks offers opportunities to adapt communication links to the various changes in the operating environment and thereby enhance the performance of the communication network [26].

The proper integration, extension and smart utilisation of the two types of datalink selected are expected to provide the concrete responses to the ICARUS COM challenges found on the right side of Table 3, which form the key design aspects of the ICARUS COM solution.


4. The implemented ICARUS COM system

4.1. Interoperability, performance and manageability functions

The ICARUS COM team's approach to providing the required networking capability for SAR missions is to implement key software-based functions upon well-established, commercial datalink technologies offering managed performance levels with sufficient predictability. The combined set of functions ensures instant interoperability among the variety of unmanned vehicles, personal devices and control stations, and enables performance optimisation by adapting to changing conditions due, for example, to node mobility, the propagation environment, external interference or evolving mission needs.

The implemented ICARUS COM functions are grouped in three different areas: (a) radio resources management, (b) IP protocol addressing and routing management and (c) overall management and control (M&C).

At the radio resources level, ICARUS implements a distributed cognitive radio capability to allow dynamic channel selection (frequency and width) over different unlicensed spectrum bands (433 MHz, 870 MHz, 2.4 GHz and 5 GHz) for the whole set of datalinks and network segments used in the system. An innovative combination of raw spectrum monitoring with physical- and link-layer measurements from network devices provides a global view for channel selection, as well as a per-link view to quickly detect problems and take proper corrective actions, while also enforcing the regulatory rules required to access the given spectrum bands.
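
The following sketch illustrates the kind of channel-scoring logic described above, combining raw spectrum occupancy with per-link retry statistics; the weights, field names and the regulatory channel whitelist are assumptions for illustration, not the ICARUS implementation.

    # Hedged sketch of cognitive channel selection over a set of measurements.
    ALLOWED_CHANNELS = {36, 40, 44, 48}         # assumed regulatory whitelist for one band

    def score(m):
        """Lower is better: weighted mix of spectrum occupancy and link retries."""
        return 0.6 * m["occupancy"] + 0.4 * m["retries"]

    def select_channel(measurements):
        candidates = [m for m in measurements if m["chan"] in ALLOWED_CHANNELS]
        return min(candidates, key=score)       # global view plus per-link view

    obs = [{"chan": 36, "occupancy": 0.70, "retries": 0.10},
           {"chan": 44, "occupancy": 0.15, "retries": 0.05}]
    print(select_channel(obs)["chan"])          # -> 44, the least contended channel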

At the IP protocol layer, a single virtual IP network is offered to applications, built upon native operating system tools. Rather than providing a single IP address to each system platform (robot or control station), an IP subnet sized for six different addresses is allocated, so that the different physical nodes belonging to the same platform (e.g. the main computer and standalone cameras on board the same vehicle) can access the ICARUS communication capability available on a dedicated COM computer hosting the COM software and datalinks. Proper routing functions ensure that unicast and multicast application traffic running over the virtual IP network smoothly traverses multiple wired and wireless link-layer segments.
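
A subnet sized for six usable addresses corresponds to a /29 prefix, so per-platform allocation can be sketched with Python's standard ipaddress module as below; the 10.0.0.0/16 base prefix is an assumption for illustration, not the project's addressing plan.

    # Sketch: carve one /29 (six usable host addresses) per platform.
    import ipaddress

    BASE = ipaddress.ip_network("10.0.0.0/16")  # assumed mission-wide prefix

    platform_subnets = BASE.subnets(new_prefix=29)
    first = next(platform_subnets)              # e.g. main computer, cameras, COM node
    print(first, list(first.hosts()))           # 10.0.0.0/29 -> 10.0.0.1 ... 10.0.0.6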

All IP traffic handled in ICARUS is QoS-marked so that proper processing can be done, first within the IP stacks of the system nodes and further within the operating datalink layer. In SAR communications, it is imperative to be able to handle different application flows with different QoS, giving priority to certain types of data. Based on the requirements defined in Section 2, a number of traffic classes have been defined in the ICARUS COM system; they are shown in Table 4, detailing the differentiating characteristics and the typical application flows making use of them.

QoS classes

Critical: access priority first; delay enforcement high; throughput enforcement high; reliability enforcement yes; pre-emption yes.
  Flow examples: network M&C, vehicle TM/TC, exoskeleton TM/TC, robotic MW signalling

Real time: access priority second; delay enforcement medium; throughput enforcement medium/high; reliability enforcement yes; pre-emption yes.
  Flow examples: primary and secondary real-time imaging

Best effort: no access priority; no delay or throughput enforcement; reliability enforcement yes; no pre-emption.
  Flow examples: sensor data downloading, secondary real-time imaging

Table 4.

ICARUS communication QoS classes.

At the overall M&C level, a coordinated set of manager and controller modules is designed to handle the traffic generated by the JAUS application middleware so that it is properly transferred through the underlying datalinks. To that end, the COM layer implements automated JAUS traffic identification and subsequent QoS allocation based on a set of predefined and run-time reconfigurable rules, so that no change is needed to existing applications for them to benefit from the managed communication capacity of ICARUS. For custom application MW traffic processing in COM, an easy-to-use software interfacing mechanism is provided within the middleware itself. In addition to passing data units, applications use this interface to select applicable QoS parameters, while the COM layer provides relevant information about connectivity (e.g. reachability of other nodes, capacity limits, etc.) using the same naming rules used by the middleware. In this way, control algorithms can conveniently include communication status information to take better decisions.
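
The QoS marking can be pictured as a mapping from the Table 4 classes to IP-level code points applied at the sending socket, as in the sketch below; the particular DSCP values are illustrative assumptions, not the values chosen by the project.

    # Sketch: mark outgoing UDP traffic with a DSCP derived from its QoS class.
    import socket

    DSCP = {"critical": 46, "real_time": 34, "best_effort": 0}   # EF, AF41, BE (assumed)

    def open_marked_udp_socket(qos_class):
        """Create a UDP socket whose packets carry the DSCP for the given class."""
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP[qos_class] << 2)
        return s

    tmtc = open_marked_udp_socket("critical")   # e.g. a vehicle TM/TC feed
    tmtc.sendto(b"heartbeat", ("192.0.2.10", 5000))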

4.2. The architecture of the ICARUS COM nodes

The set of COM functions briefly introduced in the previous section is implemented in the form of software modules residing in computing nodes associated with the different system entities, namely unmanned vehicles and corresponding control stations, personal devices and mission coordination stations. The various software modules need to interface efficiently with each other, either within the same platform node or across different ones, to undertake the different control, data and management functions. In order to facilitate the implementation of the ICARUS COM system, as well as to allow for future extensibility, well-structured and formal mechanisms were defined to model, develop and deploy the different ICARUS software modules. The set of core modules supporting this mechanism and implementing essential system functions is known as the ICARUS COM middleware (COMMW). The COMMW enables the implementation of cooperative and specialised management and control functions and has therefore been a key piece enabling interoperable and resilient tactical communications in the ICARUS scenario of crisis response operations covering air/sea/land portable and mobile nodes.

Figure 3 represents the key COM modules residing in the four different nodes forming a single robot control setup. Two of them (APPNODEs) represent the main computers aboard a robot and at the RC2 station, hosting all the software needed for controlling and supervising the platform and its payload sensors. The other two (COMNODEs) are small computers linked through an Ethernet connection to their corresponding application nodes, acting as data routers providing access to the ICARUS wireless network. In the case of the RC2 station, management and control interfaces are also established between given entities at the communication and application levels for overall monitoring and control of mission communications during operations. The different layers constituting the ICARUS COMMW can be easily identified in the figure.

Figure 3.

Key COM modules in a single robot control setup. (Source: ICARUS).

The COMMW has been implemented on open, Linux-based embedded computing platforms with proper kernel and user-space extensions enabling an overall optimisation of the network stack, including the queuing components present in the system data path, which may largely affect the throughput and latency of applications. Figure 4 shows the assembled COM computer mounted aboard the so-called LUGV (Large Unmanned Ground Vehicle) ICARUS robot.

Figure 4.

SUGV COM box and set of antennas used in various missions. (Source: ICARUS).

The COMMW framework seamlessly integrates and jointly manages both WLAN and DMR datalinks according to dynamic mission conditions and evolving requirements. In the following sections, we describe the key datalink‐specific functions implemented.

4.3. DMR datalink implementation

The DMR datalink technology standardised by ETSI provides long-range coverage (typically beyond 5 km in open areas) and can handle both voice and low-rate data. The so-called soft-DMR modem implemented in ICARUS [27] enables adaptation of key transmission parameters (coding rate, delivery mode, channel access mode and transmission power) on a per-destination basis, according to the QoS requirements (Table 4) of the currently handled application data. As ICARUS extensions to the DMR Tier-2 technology, a node discovery service and a capacity management protocol (allowing allocation of throughput levels per node) were implemented to strengthen the networking aspects of DMR. All these characteristics make the soft-DMR well suited for networked tactical and mission-critical applications.
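
Per-destination adaptation of the kind described can be sketched as a simple policy function mapping QoS class and observed link quality to transmission parameters; the thresholds and option names below are illustrative assumptions, not the soft-DMR implementation.

    # Sketch: choose DMR coding option and transmit power per destination.
    def dmr_params(qos_class, link_snr_db):
        """Return (coding_option, tx_power_dbm) for one destination."""
        if qos_class == "critical" or link_snr_db < 10:
            return ("robust_low_rate", 30)      # favour range and reliability
        if link_snr_db < 20:
            return ("medium_rate", 27)
        return ("high_rate", 24)                # good link: favour capacity

    print(dmr_params("critical", 25))           # ('robust_low_rate', 30)
    print(dmr_params("best_effort", 25))        # ('high_rate', 24)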

Figure 5 shows the final DMR modem board implemented, pictured with a ballpoint pen for size comparison.

Figure 5.

ICARUS DMR hardware transceiver. (Source: ICARUS).

4.4. WLAN datalink implementation

ICARUS WLAN datalinks are based on 802.11n commercial transceivers with a 2 × 2 MIMO antenna configuration, which was assessed as a fair setup to operate in the variety of radio propagation conditions existing in ICARUS missions. All transceivers used are equipped with an Atheros dual-band chipset supported by the Ath9k Linux driver, which is the common basis for developing low-level ICARUS extensions. Full-mesh capability spanning multiple frequency channels is provided through the 802.11s Linux implementation, properly configured to allow a smooth behaviour of the mesh peering and routing algorithms given the particular mobility and radio link conditions expected for ICARUS nodes.

Specific functions deployed in kernel space for performance reasons allow fine control of key system parameters affecting the overall network performance, particularly range and throughput, which are optimised in real time according to predefined and reconfigurable operator policies. These parameters refer to three distinct areas (a control-loop sketch follows the list):

  • At the radio link level, the controlled parameters are: radio bands, channel frequencies and widths; transmit power, rate control policy, frame retry policy and waveform mode (e.g. 11b, 11g or 11n). Legacy waveforms are eventually used for nodes under particularly disadvantaged radio conditions, for example, located at long distances or indoors.

  • At the channel access level, the controlled parameters are the per-class EDCA contention parameters and the CSMA carrier-sense level.

  • At the mesh protocol level, the controlled parameters are the timers and counters associated with the path and peer discovery and association protocols, and the configuration of root and gateway nodes.
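
The sketch below caricatures such a control loop: per-node conditions are mapped to settings in the areas listed above and pushed through an apply hook. The thresholds and the apply_settings callback are placeholders for illustration, not the actual Ath9k interface.

    # Sketch: derive per-node radio settings from observed conditions.
    def wlan_policy(node):
        settings = {"waveform": "11n", "tx_power_dbm": 17, "frame_retries": 4}
        if node["rssi_dbm"] < -80 or node["indoor"]:
            # disadvantaged node: legacy waveform, more power, more retries
            settings.update(waveform="11g", tx_power_dbm=20, frame_retries=7)
        return settings

    def control_loop(nodes, apply_settings):
        for node in nodes:
            apply_settings(node["id"], wlan_policy(node))

    control_loop([{"id": "UAV-1", "rssi_dbm": -84, "indoor": False}],
                 lambda nid, s: print(nid, s))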

4.5. Operational management

In parallel to the implementation of the COM manager and controller modules, the ICARUS COM team developed a convenient set of tools to ease the tasks of the operators responsible for communications during the different mission phases (planning, deployment, operation). The aim is to make manual interventions simple and fast, while giving operators proper information and tools at all times to fine-tune the key parameters affecting the performance of the overall network and of specific links.

There are two different toolsets offered to network operators. The first is a configuration tool based on a structured data model, which allows operators to set up the overall node configuration based on capacity allocation targets for both locally generated and relayed traffic, differentiating among individual application flows and supporting latency, reliability and security requirements in addition to throughput. Operators are provided with a set of utilities for guidance on setting the different configuration parameters (a data-model sketch is given below). Some of the settings are subject to dynamic changes during mission execution.
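
A minimal sketch of such a structured data model, with invented field names, could look as follows; the real tool's schema is assumed to be considerably richer.

    # Sketch of a node configuration model (field names are assumptions).
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class FlowConfig:
        name: str
        qos_class: str                  # "critical" | "real_time" | "best_effort"
        throughput_kbps: int
        max_latency_ms: Optional[int] = None

    @dataclass
    class NodeConfig:
        node_id: str
        local_capacity_kbps: int        # budget for locally generated traffic
        relay_capacity_kbps: int        # budget for traffic relayed for other nodes
        flows: List[FlowConfig] = field(default_factory=list)

    cfg = NodeConfig("RC2-1", local_capacity_kbps=400, relay_capacity_kbps=270,
                     flows=[FlowConfig("video", "real_time", 500, max_latency_ms=150)])
    print(cfg)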

The second is a rich graphical environment named the COM console (COMCON), conceived to support planning, supervision and optimisation of the integrated multi-radio ICARUS network, combining simulation features with real-time monitoring and control capabilities. In both simulation and real-time modes, the COMCON tool acts as a visualisation and control frontend for the COMMW modules. The COMCON tool is able to represent with high fidelity the time behaviour of the ICARUS network, with a fine-grained view and control of a number of interrelated physical and system factors which influence the performance of specific links and of the overall network.

In the planning phase, the COMCON tool accurately characterises COM components, propagation environments, RF interference and vehicle platforms in order to assess global network performance over wide operation areas, as well as the performance of individual terminals along given mission routes. This allows, in particular, informed decisions on radio bands and channels, antenna pattern/polarisation and transceiver features for every node in the network. Furthermore, the eventual need for and location of network relays can be assessed. The tool includes propagation models for indoor, rubble and sea environments in the UHF/2.4 GHz/5 GHz bands, as well as protocol models of 802.11 mesh networks enabling informed planning of CSMA-related parameters and reliable estimation of throughput performance. Figure 6 exemplifies a mission modelled in the COMCON tool, where the different links and antenna coverages of networking nodes are calculated and verified during mission planning in an interactive 3D Earth globe visualisation interface.

Figure 6.

ICARUS COM console used in mission planning. (Source: ICARUS).
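
As a toy counterpart to the planning computations described above, the sketch below evaluates a free-space link budget at the 1.5 km outdoor SAR range from Table 2; the transmit power, antenna gains and receiver sensitivity are illustrative values, and the real tool uses dedicated indoor, rubble and sea propagation models rather than simple free-space loss.

    # Sketch: free-space link budget at 2.4 GHz over 1.5 km (assumed radio values).
    import math

    def fspl_db(distance_km, freq_mhz):
        """Free-space path loss in dB."""
        return 32.44 + 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz)

    tx_dbm, gains_dbi, sensitivity_dbm = 20, 4, -85
    loss = fspl_db(1.5, 2412)                   # 802.11 channel 1 centre frequency
    rx_dbm = tx_dbm + gains_dbi - loss
    print(f"loss {loss:.1f} dB, rx {rx_dbm:.1f} dBm, "
          f"margin {rx_dbm - sensitivity_dbm:.1f} dB")   # ~5 dB margin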

In the operations phase, the COMCON features centralised monitoring of all key parameters affecting network performance, allowing operators to mitigate coverage and throughput problems by timely reconfiguration and eventual reallocation of nodes. Figure 7 shows an example of a real mission monitoring display offering connectivity as well as link performance information to the operator.

Figure 7.

ICARUS COM console in mission operations. (Source: ICARUS).

Some optimisation actions of limited impact are performed automatically by the COMMW stacks, while others of wider scope require human operator intervention to decide the best solution given the current mission conditions. Of special relevance to network operators is the ability of the combined COMMW software and COMCON tool to determine the likely reason for detected traffic losses, leading to different corrections. The traffic losses are classified into four different groups (a simplified classification sketch follows the list):

  • Collisions, which can be solved by forcing RTS/CTS, changing paths or moving nodes

  • External interference, which can be solved by selecting new channels or changing the channel bandwidth

  • Propagation conditions, which can be solved by relocating nodes or moving to basic transmission modes

  • Queuing, reflecting packet drops in the different system queues, which can be solved by limiting application demand
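
A simplified sketch of this four-way classification is given below; the metric names and thresholds are illustrative assumptions, not the actual COMMW heuristics.

    # Sketch: map observed link metrics to one of the four loss causes.
    def classify_loss(m):
        if m["queue_drops"] > 0:
            return "queuing"                    # -> limit application demand
        if m["retry_rate"] > 0.3 and m["channel_busy"] > 0.5:
            return "collisions"                 # -> force RTS/CTS, change paths/nodes
        if m["noise_floor_dbm"] > -90:
            return "external_interference"      # -> new channel or narrower bandwidth
        return "propagation"                    # -> relocate nodes, basic tx modes

    print(classify_loss({"queue_drops": 0, "retry_rate": 0.4,
                         "channel_busy": 0.7, "noise_floor_dbm": -95}))   # collisions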


5. Field validation and conclusions

During the final project demonstrations conducted at the Almada Camp of the Portuguese Navy and the Roi Albert Camp of the Belgian Army, the ICARUS COM system and associated tools proved to offer significant value for mission commanders across the different mission phases, as illustrated in Figures 8–10: first, as a powerful deployment planning tool, and second, as a network management and optimisation tool able to seamlessly connect all robots' telemetry and tele-control capabilities to the ICARUS C2I stations, mitigating eventual coverage and throughput shortcomings arising during operations.

Figure 8.

ICARUS COM tools communicating with aerial robotic systems (acting as communication relays). (Source: ICARUS).

Figure 9.

ICARUS COM tools communicating with rescue workers operating inside a rubble field. (Source: ICARUS).

Figure 10.

ICARUS COM tools installed on a small unmanned ground vehicle. (Source: ICARUS).

The ICARUS communication system makes use of HW/SW mass-market technologies thoroughly engineered for professional performance, exploiting unlicensed spectrum in the UHF, 2.4 GHz and 5 GHz bands. The unlicensed-spectrum approach provided acceptable performance during the set of trials executed during the project life under limited interference conditions. Nevertheless, in real-life safety-critical SAR operations, it is highly desirable to have guaranteed access to radio spectrum with proper EIRP limits to ensure the required throughput and operation at long ranges or in harsh propagation scenarios such as rubble or indoor environments [28–31]. The ICARUS communication system includes by design specific provisions to ease the integration of new datalink technologies and to extend operation to new frequency bands, by adapting the cognitive radio functions to implement any required spectrum access rules. Existing 802.11 COTS professional transceivers that can be tuned to operate in any band up to 6 GHz will allow all of the COMMW/COMCON 802.11 capabilities to be readily reused in low-frequency spectrum particularly suitable, and eventually protected, for public protection and disaster relief (PPDR) applications. In the migration phase towards commercialisation, the team is also working on the integration of LTE services, either commercial (if available at the crisis location) or PPDR-specific (e.g. operating in the 700 MHz band), to be used as a complementary incident-spot capacity or as an interconnection means between distant incident spots. While low-layer LTE functions would be out of the control of ICARUS COM, reducing optimisation possibilities, the framework is already able to evaluate in real time the throughput and latency offered by external networks, which would be used to manage the available capacity as a whole.

The research leading to these results has received funding from the European Community’s Seventh Framework Programme (FP7/2007–2013) under grant agreement number 285417.

References

  1. Stopforth R, van de Groenendaal H. Search and rescue robot: Lessons from 9/11. EngineerIT: Electronics, Computer, Information & Communication Technology in Engineering, SAIEE Journal. Jan. 2010
  2. Baldini G, Karanasios S, Allen D, Vergari F. Survey of wireless communication technologies for public safety. IEEE Communications Surveys & Tutorials. 2014;16(2):619-641
  3. Kumbhar A, Koohifar F, Guvenc I, Mueller B. A survey on legacy and emerging technologies for public safety communications. IEEE Communications Surveys & Tutorials. 2017;19(1):97-124
  4. Malone BL. Wireless search and rescue: Concepts for improved capabilities. Bell Labs Technical Journal. Apr. 2004;9(2):37-49
  5. Fragkiadakis AG, Askoxylakis IG, Tragos EZ, Verikoukis CV. Ubiquitous robust communications for emergency response using multi-operator heterogeneous networks. EURASIP Journal on Wireless Communications and Networking. 2011;13:1-16. DOI: 10.1186/1687-1499-2011-13
  6. Fujiwara T, Watanabe T. An ad hoc networking scheme in hybrid networks for emergency communications. Elsevier Ad Hoc Networks. Sep. 2005;3(5):607-620
  7. George SB, Whou W, Chenji H, Won Y, Lee O, Pazarlogou A, Stoleru R, Barooah P. DistressNet: A wireless ad hoc and sensor network architecture for situation management in disaster response. IEEE Communications Magazine. Mar. 2010;48(3):128-136
  8. Nelson CB, Steckler BD, Stamberger JA. The evolution of hastily formed networks for disaster response: Technologies, case studies, and future trends. In: IEEE Global Humanitarian Technology Conference (GHTC), Seattle, USA. Nov. 2011
  9. Felice MD, Trotta A, Bedogni L, Chowdhury KR, Bononi L. Self-organizing aerial mesh networks for emergency communication. In: IEEE 25th International Symposium on Personal, Indoor, and Mobile Radio Communication (PIMRC), Washington DC, USA. Sep. 2014
  10. Reina DG, Askalani M, Toral SL, Barrero F, Asimakopoulo E, Bessis N. A survey on multi-hop ad hoc networks for disaster response scenarios. International Journal of Distributed Sensor Networks. Jan. 2015
  11. Salamanca MDP, Camargo J. A survey on IEEE 802.11-based MANETs and DTNs for survivor communication in disaster scenarios. In: IEEE Global Humanitarian Technology Conference (GHTC), Seattle, USA. Oct. 2016
  12. Minh QT, Nguyen K, Borcea C, Yamada S. On-the-fly establishment of multihop wireless access networks for disaster recovery. IEEE Communications Magazine. Oct. 2014;52(10):60-66
  13. Aloi G, Bedogni L, Bononi L, Briante O, Di Felice M, Loscrì V, Pace P, Panzieri F, Ruggeri G, Trotta A. STEM-NET: How to deploy a self-organizing network of mobile end-user devices for emergency communication. Elsevier Computer Communications. Apr. 2015;60:12-27
  14. Ray SK, Sinha R, Ray SK. A smartphone-based post-disaster management mechanism using WiFi tethering. In: IEEE 10th Conference on Industrial Electronics and Applications (ICIEA), Auckland, New Zealand. June 2015
  15. Koumidis K, Kolios P, Panayiotou C, Ellinas G. ProximAid: Proximal adhoc networking to aid emergency response. In: International Conference on Information and Communication Technologies for Disaster Management (ICT-DM), Rennes, France. Dec. 2015
  16. Bai Y, Du W, Ma Z, Shen C, Whou Y, Chen B. Emergency communication system by heterogeneous wireless networking. In: IEEE International Conference on Wireless Communications, Networking and Information Security (WCNIS), Beijing, China. June 2010
  17. Dalmasso I, Galletti I, Giuliano R, Mazzenga F. WiMAX networks for emergency management based on UAVs. In: IEEE 1st AESS European Conference on Satellite Telecommunications (ESTEL), Rome, Italy. Oct. 2012
  18. Morgenthaler S, Braun T, Zhao Z, Staub T, Andwander M. UAVNet: A mobile wireless mesh network using unmanned aerial vehicles. In: IEEE Globecom Workshops (GC Wkshps), Anaheim, California. Dec. 2012
  19. Flushing EF, Gambardella LM, Di Caro GA. On using mobile robotic relays for adaptive communication in search and rescue missions. In: IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Lausanne, Switzerland. Oct. 2016
  20. Singh A, Adams R, Dookie I, Kissoon S. A simulation tool for examining the effect of communications on disaster response in the oil and gas industry. IEEE Transactions on Systems, Man, and Cybernetics. Aug. 2016;46(8):1036-1046
  21. Erdelj M, Natalizio E, Chowdhury KR, Akyildiz IF. Help from the sky: Leveraging UAVs for disaster management. IEEE Pervasive Computing. Jan.-Mar. 2017;16(1):24-32
  22. Gonzales D, Harting S. Designing Unmanned Systems with Greater Autonomy. RAND Corporation; 2014
  23. DMR Association. Benefits and features of DMR. White paper. Available from: http://dmrassociation.org [Retrieved: Dec. 2016]
  24. IEEE 802.11-2012. Standard for Information Technology: Telecommunications and information exchange between systems, local and metropolitan area networks, specific requirements. Part 11: Wireless LAN medium access control (MAC) and physical layer (PHY) specifications. Available from: https://standards.ieee.org
  25. Chaudhary MH, Scheers B. Dynamic channel and transmit-power adaptation of WiFi network in search and rescue operations. In: International Conference on Military Communications and Information Systems (ICMCIS), Brussels, Belgium. May 2016
  26. Ghafoor S, Sutton PD, Sreenan CJ, Brown KN. Cognitive radio for disaster response networks: Survey, potential, and challenges. IEEE Wireless Communications. Oct. 2014;21(5):70-80
  27. Chaudhary MH, Scheers B. DMR implementation on SDR platform for control of unmanned devices and cognitive management of WiFi network in search and rescue missions. In: NATO IST/RSY Symposium on Cognitive Radio & Future Networks, The Hague, The Netherlands. May 2015
  28. Holloway CL, Koepke G, Camell D, Young WF, Ramley KA. Radio propagation measurements during a building collapse: Applications for first responders. In: International Symposium on Advanced Radio Technology (ISART), Boulder, USA. Mar. 2005
  29. Oestges C. Radio channel models for search-and-rescue missions into collapsed structures. In: URSI International Symposium on Electromagnetic Theory (EMTS), Hiroshima, Japan. May 2013
  30. Anusas-amornkul T. A victim and rescuer communication model in collapsed buildings/structures. In: IEEE International Conference on Parallel and Distributed Systems (ICPADS), Hsinchu, Taiwan. Dec. 2014
  31. ITU-R. Propagation data and prediction methods for the planning of indoor radiocommunication systems and radio local area networks in the frequency range 900 MHz to 100 GHz. ITU-R Recommendations, Geneva; 2001
