Navigation systems provide an optimized route from one location to another. They are mainly assisted by external technologies such as the Global Positioning System (GPS), a satellite-based radio navigation system. GPS has many advantages, such as high accuracy, global availability, reliability, and self-calibration. However, GPS is limited to outdoor operation. The practice of combining different sources of data to improve the overall outcome is common in many domains. GIS is already integrated with GPS to provide the visualization and realization aspects of a given location. The Internet of Things (IoT) is a growing domain in which embedded sensors are connected to the Internet; IoT therefore improves existing navigation systems and expands their capabilities. This chapter proposes a framework based on the integration of GPS, GIS, IoT, and mobile communications to provide a comprehensive and accurate navigation solution. In the next section, we outline the limitations of GPS, and then we describe the integration of GIS, smartphones, and GPS that enables its use in mobile applications. In the rest of this chapter, we introduce various navigation implementations using alternative technologies, either integrated with GPS or operated as standalone systems.
- public transport
- smartphone application
- indoor/outdoor navigation
- vision-based navigation
- obstacle detection
- control system
- obstacle avoidance
- pedestrian navigation system
- modified RHKF filter
- indoor assistive navigation
- semantic map
- tango device
- near-field communication
- indoor navigation
- indoor positioning
- NFC internal
- sensor fusion
- map matching
- hidden Markov models
- Kalman filter
1. The limitations of GPS
Some of the downsides of GPS are listed in . Among these are several limitations relevant to this chapter. The weak signal intensity makes GPS less suitable for cases where stable navigation is mandatory or where navigation takes place in indoor or covered areas. The coarse accuracy of the signal makes navigation ineffective in crowded cities, where landmarks are so close together that GPS cannot differentiate among them. Furthermore, the GPS signal may disperse and change direction due to interruptions caused by skyscrapers, trees, geomagnetic storms, etc. The impact of unreliable GPS is large, especially given the constantly growing use of navigation applications such as Google Maps and Waze, which rely heavily on the GPS signal. The impact may include more car accidents when required information is missing exactly at the time it is critical for continuing to drive. The GPS signal alone cannot cover all navigation instances. Local and timely knowledge is required for updated and accurate information, so the system can react instantly when obstacles appear in the road ahead, for example, a deep pit, a flooded road, or a road closure. To reduce the dependency on GPS, several methods and technologies have been proposed, such as detailed map information, data from sensors, vision-based measurements, stop lines, and GPS-fused SLAM technologies.
2. GPS and GIS integration
A geographic information system (GIS) is a system designed to capture, store, manipulate, analyze, manage, and present all types of geographical data. Many electronic navigation systems deliver their road-guidance instructions using only verbal commands referring to the associated electronic map displayed to the user. This approach assumes the user is familiar with street maps and road networks, which is not always the case. In addition, there are places where street maps are not commonly used; instead, landmarks allow intuitive navigation by recognizable and memorable views along the route. The introduction of buildings as landmarks, together with corresponding spoken instructions, is a step towards more natural navigation. The integration of GPS and GIS provides this capability. The main problem lies in identifying suitable landmarks and evaluating their usefulness for navigation instructions. Existing databases can help tackle this problem and can be an integrated part of most navigation applications. For example, Brondeel et al.  used GPS, GIS, and accelerometer data to collect trip data and proposed a prediction model for transportation modes with a high correct-classification rate. ResZexue et al.  developed a logistics distribution manager (LDM) software and a smart machine (SM) system. It is based on fusing GPS, GIS, Big Data, Internet+, and other technologies to effectively apply their attributes and benefits, achieving a robust information management system for the logistics industry. The resulting logistics facility has shorter distribution times, improved operational competitiveness, an optimized logistics distribution workflow, and reduced costs. These examples demonstrate the level of improvement we can expect by integrating GPS and GIS, as well as the IoT, mobile phones, and other current technologies.
3. GPS and mobile phone integration
GPS positions provided via a phone are generated using multiple different methods, resulting in highly variable performance. Performance depends on the smartphone's attributes, the cell network, the availability of GPS satellites, and the line of sight to these satellites. The time from turning on the smartphone to obtaining GPS coordinates is relatively long; to shorten it, a variety of techniques are used. Some phones have incomplete GPS hardware and require a cell network to function. The quality of the GPS antenna determines how quickly the device obtains a lock. For example, the S3 Mini device has relatively good GPS hardware, including GLONASS and A-GPS support.
4. Urban vehicles navigation
Urban canyons, sky blockage, and multipath errors affect the quality and accuracy of GNSS/GPS. Public transportation in modern cities may have hundreds of routes and thousands of bus stops, exchange points, and buses. These two factors make urban bus systems hard to follow and complex to navigate. Mobile applications provide passengers with transport planning tools and find the optimal route, the next bus number, the arrival time, and the ride duration. More advanced applications also provide micro-navigation-based decisions, such as the current position and bus number, the number of stops left until arrival, and exchanges to a better route. Micro-navigation decisions are highly contextual and depend not just on time and location but also on the user's current transport mode, whether waiting for a bus or riding on one. In emerging applications such as autonomous driving, accuracy and robustness are critical requirements for safe guidance and stable control. GNSS accuracy can be significantly improved using several techniques, such as differential GNSS (DGNSS), augmented GNSS, and precise positioning services (PPS), although these techniques add complexity and cost. Multi-constellation GNSS also enhances accuracy by increasing the number of visible satellites. In dense urban areas, where high buildings are common, the geometry of visible satellites often results in high uncertainty in the vehicle's GNSS position estimate, so performance in such areas remains challenging.
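To make the differential idea concrete, the following sketch shows how a DGNSS-style correction could be applied in local planar coordinates. This is a simplified illustration, not the actual protocol: real DGNSS corrects pseudo-ranges per satellite, whereas here the base station's position error is simply subtracted from the rover's fix. All names and values are hypothetical.

```python
def dgnss_correct(rover_fix, base_fix, base_truth):
    """Apply a simplified differential GNSS correction from a fixed base station.

    The base station's known surveyed position (base_truth) minus its current
    GNSS fix (base_fix) gives the local error, which is assumed to affect a
    nearby rover equally and is therefore subtracted from the rover's fix.
    All positions are (east, north) tuples in meters in a local frame.
    """
    correction = tuple(t - m for t, m in zip(base_truth, base_fix))
    return tuple(r + c for r, c in zip(rover_fix, correction))
```

For example, if the base station's fix is offset by (+0.5, -0.3) m from its surveyed position, the same offset is removed from the rover's fix.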
4.1 Bus navigation using embedded Wi-Fi and a smartphone application
Urban Bus Navigator (UBN) is a system infrastructure that connects passengers' mobile smartphones with Wi-Fi-enabled buses, gaining real-time information about passengers' journeys and transport situations . A key feature of UBN is semantic bus ride detection, which identifies the concrete bus and route the passenger is riding on, providing continuous, just-in-time dynamic rerouting and end-to-end guidance for bus passengers. Technical tests indicate the feasibility of semantic bus ride detection, while user tests yielded recommendations for effective user support with micro-navigation. The system elements include semantic bus ride detection using a Wi-Fi-based recognition system and dynamic trip tracking. Semantic bus ride detection, combined with the phone's GPS, is used to monitor the passenger's trip progress. Deviations are immediately recognized and trigger trip replanning, resulting in a new set of navigation instructions for the passenger. The architecture is composed of Wi-Fi for proximity detection of buses by the passenger's mobile phone, a smartphone application for trip planning using macro-navigation, context-aware trip hints using micro-navigation, context sensing, bus ride recognition, and trip tracking.
4.2 GNSS/IMU sensor fusion scheme
This urban navigation approach is based on detecting and mitigating GNSS errors caused by dense high buildings interfering with signal propagation . It uses a map-aided adaptive fusion scheme. The method estimates the currently active map segment using dead-reckoning and robust map-matching algorithms, modeling the vehicle state history, road geometry, and map topology in a hidden Markov model (HMM). The Viterbi algorithm decodes the HMM and selects the most likely map segment. The projection of the vehicle state onto the map segment is used as a supplementary position update to the integration filter. The solution framework has been developed and tested on a land-based vehicular platform. The results show reliable mitigation of biased GNSS positions and accurate map-segment selection in complex intersections, forks, and joins. In contrast to common existing adaptive Kalman filter methods, this solution does not depend on redundant pseudo-ranges and residuals, which makes it suitable for arbitrary noise characteristics and varied integration schemes.
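The HMM decoding step can be illustrated with a minimal Viterbi map matcher. This sketch makes strong simplifying assumptions that are not in the source: each map segment is reduced to a single representative point, the emission likelihood is Gaussian in the distance from the GPS fix to that point, and road topology is given as an adjacency dictionary (self-transitions included).

```python
import math

def viterbi_map_match(observations, segments, transitions, sigma=10.0):
    """Pick the most likely road-segment sequence for a series of GPS fixes.

    observations: list of (x, y) GPS positions
    segments:     dict segment_id -> (x, y) representative point (simplified)
    transitions:  dict segment_id -> set of reachable segment_ids
    Emission log-likelihood is Gaussian in the fix-to-segment distance.
    """
    def log_emission(obs, seg):
        d = math.dist(obs, segments[seg])
        return -(d * d) / (2 * sigma * sigma)

    # Initialise with the first observation.
    prob = {s: log_emission(observations[0], s) for s in segments}
    back = [{}]

    for obs in observations[1:]:
        new_prob, ptr = {}, {}
        for s in segments:
            # Best predecessor that can transition into s.
            cands = [(prob[p], p) for p in segments if s in transitions.get(p, ())]
            if not cands:
                cands = [(prob[s], s)]  # if nothing reaches s, assume it stayed
            best_p, best_prev = max(cands)
            new_prob[s] = best_p + log_emission(obs, s)
            ptr[s] = best_prev
        prob, back = new_prob, back + [ptr]

    # Backtrack the most likely segment sequence.
    last = max(prob, key=prob.get)
    path = [last]
    for ptr in reversed(back[1:]):
        path.append(ptr[path[-1]])
    return list(reversed(path))
```

On a toy three-segment road with fixes drifting from segment A to C, the decoder recovers the sequence A, B, C.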
4.3 Navigation based on compass-based navigation control law
Urban environments offer a challenging scenario for autonomous driving . The proposed solution allows a vehicle to autonomously navigate urban roadways with minimal a priori map data or GPS. Localization is achieved by an extended Kalman filter with odometry, compass, and sparse landmark measurement updates. Navigation is accomplished by a compass-based navigation control law. Experiments validate simulated results and demonstrate that, for given conditions, an expected range can be found for a given success rate.
The architecture contains steering and speed controllers, an object tracker, a path generator, a pose estimator, and a navigation algorithm using sensors allowing real-time control. High-level localization is provided by the pose estimator, which utilizes only odometry measurements, compass measurements, and sparse map-based measurements. The sparse map-based measurements generated from computer vision methods compare raw camera images to landmark images contained within a sparse map. The roadway scene includes lane line markings, road signs, traffic lights, and other sensor measurements. The scene information and the inertial pose estimate are fed into a navigation algorithm to determine the best route required to reach the target. This navigation scheme is provided by a compass-based navigation control law.
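As a toy illustration of the odometry-plus-compass fusion described above, here is a one-state Kalman filter for heading. The published system uses an extended Kalman filter over the full vehicle pose; this reduced scalar version, with made-up noise values, only shows the predict/update pattern.

```python
class HeadingKalman:
    """1-state Kalman filter: odometry yaw-rate prediction, compass updates."""

    def __init__(self, heading=0.0, var=1.0, q=0.01, r=0.25):
        self.x = heading   # heading estimate (radians)
        self.p = var       # estimate variance
        self.q = q         # process noise added per prediction (odometry drift)
        self.r = r         # compass measurement noise variance

    def predict(self, yaw_rate, dt):
        """Propagate the heading using the odometry-derived yaw rate."""
        self.x += yaw_rate * dt
        self.p += self.q

    def update(self, compass_heading):
        """Correct the estimate with a compass measurement."""
        k = self.p / (self.p + self.r)        # Kalman gain
        self.x += k * (compass_heading - self.x)
        self.p *= (1 - k)
        return self.x
```

After one predict/update cycle, the fused heading lies between the odometry prediction and the compass reading, and the variance shrinks.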
5. Space navigation systems
Common navigation technologies assume navigation on a two-dimensional (2D) surface, i.e., a flat land area. Navigation in three dimensions (3D) is much more complicated, requiring new technologies to complement the existing 2D navigation technologies.
5.1 Autonomous navigation of micro aerial vehicles
In this section we present a low-computation state estimator enabling autonomous flight of micro aerial vehicles . All estimation and control tasks are solved on board, in real time, on a simple computational unit. The state estimator fuses observations from an inertial measurement unit, an optical-flow smart camera, and a time-of-flight range sensor. The smart camera provides optical-flow measurements and odometry estimation, avoiding the need for onboard image processing, and is usable during flight times of several minutes. Based on the estimated vehicle state, a nonlinear controller operating in the special Euclidean group SE(3) drives a quadrotor platform in 3D space, guaranteeing the asymptotic stability of 3D position and heading. The approach is validated through simulations and experimental results.
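The chapter does not give the estimator's equations, but the flavor of fusing an IMU with a time-of-flight range sensor can be sketched with a complementary filter for altitude. The blending coefficient and sensor model here are assumptions for illustration, not the published method.

```python
def complementary_altitude(accel_z, range_z, dt, state, alpha=0.98):
    """One step of a complementary filter for MAV altitude.

    accel_z: vertical acceleration from the IMU, gravity removed (m/s^2)
    range_z: altitude from the time-of-flight sensor (m)
    state:   (altitude, vertical_velocity) from the previous step
    alpha:   trust placed in the IMU integration vs. the range sensor
    """
    alt, vel = state
    vel += accel_z * dt                              # integrate acceleration
    alt_pred = alt + vel * dt                        # dead-reckoned altitude
    alt = alpha * alt_pred + (1 - alpha) * range_z   # blend with range sensor
    return alt, vel
```

In a hover at 1 m (IMU reading ~0, range sensor reading 1.0 m), the estimate converges toward the range measurement while the IMU keeps it smooth between samples.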
5.2 Vision-based navigation for micro helicopters
Weiss  developed a vision-based navigation system for micro helicopters operating in large and unknown environments. It is based on vision methods and a sensor fusion approach for state estimation and self-calibration of sensors whose availability varies during flight. This is enabled by an onboard camera, a real-time motion sensor, and vision algorithms. The camera and an onboard multi-sensor fusion framework are thus capable of simultaneously estimating the vehicle's pose and the inter-sensor calibration for continuous operation. It runs in time linear in the number of key frames captured in a previously visited area. To maintain constant computational complexity, improve performance, and increase scalability and reliability, the computationally expensive vision part is replaced by the final calculated camera pose.
5.3 Space navigation using formation flying tiny satellites
Traditional space positioning and navigation are based on large satellites flying in semi-fixed orbits and so are costly and inflexible . Recent developments in low-mass, low-power navigation sensors and the popularity of smaller satellites have enabled a new approach in which many tiny spacecraft fly in clusters under controlled configurations, utilizing their cumulative power to perform the necessary assignments. To keep the configurations stable yet changeable, positioning, attitude, and intersatellite navigation are used. For determining the relative position and attitude between the formation-flying satellites, carrier-phase differential GPS (CDGPS) is used, where range coefficients, GPS differential corrections, and other data are exchanged among the spacecraft, enhancing the precision of the ranging and navigation functions. The CDGPS communicates with the NAVSTAR GPS constellation to provide precise measures of the relative attitude and the positions between vehicles in the formation.
6. Pedestrian navigation systems
Pedestrian navigation services enable people to retrieve precise instructions to reach a specific location. As the spatial behavior of people on foot differs in many ways from that of drivers, common concepts for car navigation services are not suitable for pedestrian navigation. Cars use paved roads with clear borderlines and road signs, so keeping the car on track is the main role, neglecting obstacles and hazards unless the system is integrated with a social network. Pedestrians, unlike cars, may not follow a defined road. This makes personal navigation more complicated and requires adding special features for safe navigation. Pedestrian navigation requires very accurate, high-resolution, real-time responses . GPS alone does not support last-moment route changes caused by road detours, significant obstacles, or safety requirements. However, integrating the IoT and GPS via an application yields a solution providing accurate and safe navigation. To enable it, a two-stage personal navigation system is used. In the first stage, the trail is photographed by a navigated drone, and the resulting video is saved in a cloud database. In the second stage, a mobile application is loaded onto the pedestrian's mobile phone. When the pedestrian is about to walk, he or she activates the mobile application, which synchronizes itself with the cloud navigation database; instructions from the mobile phone then guide the pedestrian along the trail walk. A more advanced system contains both stages within the mobile application: the mobile video camera is activated, captures the trail images in front of the pedestrian, processes them, and guides the pedestrian accordingly. In case of an upcoming obstacle, the application proposes the safest and most effective detour and guides the pedestrian along it.
Personal navigation systems are very accurate and safe, operate indoors and outdoors, and are available as long as the mobile phone is connected and its internal storage is big enough. They provide spatial information for climbing, wandering, or tramping users. They are used for locating casualties, as well as for the self-orientation of rescue teams in areas with low visibility. In military and security operations, localization and information technologies are used by soldiers to self-locate and to collect and collate information. A similar implementation with the same functionality is a walking stick with embedded micro devices and software as described above, together with a wearable Bluetooth headset with an embedded camera in front of it.
6.1 Landmark-based pedestrian navigation systems
Navigation in cities is commonly done by the target address: zip code, street, and house number. However, in some places people do not use street and house numbers as an address but rather use landmarks to identify the route to the target as well as the target location . By combining GIS and GPS, the desired landmark coordinates are loaded into the cloud database, and the corresponding navigation application is modified to identify the landmarks on the ground.
A landmark-based navigation system is composed of a video camera to obtain and analyze pedestrian paths, selected reliable landmarks along the main routes, a routing table containing all relevant origins and destinations within the site, viewing positions and orientations ensuring maximal coverage of interesting spots, thousands of partial routes for the entire recording period, and the detected stops over a whole day for different definitions of a stop. Based on the defined sections, landmarks, and decision points, a routing table is created to define navigational instructions from each origin in the station to each possible destination. Table columns correspond to the origin landmarks and decision points; rows correspond to destination landmarks. The identified landmarks and the defined route instructions are used to develop an audio guiding system using speech recognition and text-to-speech software. The audio guiding system employs verbal descriptions that are as distinct and clearly recognizable as the visual landmarks, so that users can intuitively match the description with what they see.
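The routing table described above is essentially a lookup from (current landmark or decision point, destination) to the next instruction. A minimal sketch, with hypothetical landmark names and instruction strings:

```python
# Routing table: (current landmark, destination) -> next spoken instruction.
ROUTING_TABLE = {
    ("main_entrance", "platform_3"): "Walk past the ticket office, then turn left.",
    ("ticket_office", "platform_3"): "Take the stairs on your right down one level.",
    ("stairs_bottom", "platform_3"): "Platform 3 is straight ahead.",
}

def next_instruction(current_landmark, destination):
    """Look up the instruction for the passenger's current position."""
    return ROUTING_TABLE.get(
        (current_landmark, destination),
        "You have arrived." if current_landmark == destination else "Route unknown.",
    )
```

In the full system the table is generated from the recorded pedestrian paths, and the instruction strings are fed to the text-to-speech component.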
6.2 Shoe navigation based on micro electrical mechanical system
A micro electrical mechanical system (MEMS)  is a family of thumbnail-sized technologies enabling a wide variety of advanced and innovative applications. When such a device is mounted on a shoe, it collects the number of steps, the average step width, and the walking directions. This data is constantly collected and processed, and the resulting signals guide the person wearing the shoe. Magnetic-field disturbances may cause navigation errors, which a dedicated filter offsets. Experiments show that this approach is applicable and efficient.
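A shoe-mounted MEMS step counter is typically built around threshold-crossing or peak detection on the accelerometer magnitude. The following is an illustrative sketch; the threshold and refractory period are invented values, and the referenced system's actual filter is not reproduced here.

```python
def count_steps(accel_magnitudes, threshold=10.8, refractory=5):
    """Count steps from accelerometer magnitude samples (m/s^2).

    A step is counted when the signal crosses `threshold` upward; a
    `refractory` number of samples then suppresses double counts within
    one stride.
    """
    steps, cooldown = 0, 0
    prev = accel_magnitudes[0] if accel_magnitudes else 0.0
    for a in accel_magnitudes[1:]:
        if cooldown > 0:
            cooldown -= 1
        elif prev < threshold <= a:   # upward threshold crossing
            steps += 1
            cooldown = refractory
        prev = a
    return steps
```

A synthetic trace with two acceleration spikes above gravity yields two counted steps.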
7. Indoor navigation technologies
Indoor navigation systems became popular due to the lack of GPS signals indoors and the increase in navigation needs, especially in confined areas such as parking garages and large building complexes. Several indoor navigation systems have already been implemented. Each of them is based on a different technology that complies with the specific requirements and constraints of the location it is expected to navigate in. We assume that each solution has technical and usability limitations. Objects are tracked using wireless concepts, optical tracking, ultrasound techniques, sensors, infrared (IR), ultra-wide band (UWB), wireless local area networks (WLANs), Wi-Fi, Bluetooth, radio frequency identification (RFID), assisted GPS (A-GPS), and more. Most solutions suffer from limited capabilities, low accuracy, unreliability, design complexity, weak security, and high configuration costs.
7.1 NFC-based indoor navigation system
NFC technology allows communication under short-range, mobile, and wireless conditions. NFC communication happens when two NFC-capable devices are close to each other. Users use their NFC mobiles to interact with an NFC tag or another NFC mobile. An NFC-based indoor navigation system enables users to navigate through a complex of buildings by touching NFC tags spread around the site, orienting users toward the destination. NFC Internal has considerable advantages for indoor navigation systems in terms of security, privacy, cost, performance, robustness, complexity, and commercial availability. The application receives the destination name and, as the user touches the mobile device to the NFC tags, orients the user toward the desired destination.
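The tag-touch interaction can be sketched as a lookup: each NFC tag identifies a node in the building graph, and the application returns the next instruction along a precomputed path to the destination. Tag IDs, node names, and direction strings below are hypothetical.

```python
# Hypothetical tag database: tag_id -> node in the building graph.
TAGS = {"tag-017": "corridor_b", "tag-018": "stairwell_2"}

# Adjacency with human-readable directions between nodes.
GRAPH = {
    "corridor_b": {"stairwell_2": "Continue straight to the stairwell."},
    "stairwell_2": {"lab_204": "Go up one floor; Lab 204 is on the left."},
}

def direction_after_touch(tag_id, destination, path):
    """After the user touches a tag, return the next direction to speak.

    path: precomputed node sequence to the destination (e.g. from a
    shortest-path search over GRAPH).
    """
    here = TAGS[tag_id]
    if here == destination:
        return "You have arrived."
    idx = path.index(here)
    return GRAPH[here][path[idx + 1]]
```

Each touch re-anchors the user's position, so the guidance stays correct even if the user wanders between tags.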
7.2 Indoor garage navigation based on car-to-infrastructure communication
Indoor micro-navigation systems for enclosed parking garages  are based on car-to-infrastructure communication providing the layout information of the car park and the coordinates of the destination parking lot. The system uses unique signal rates. When a car is detected, the system calculates its position and transmits the data to the vehicle as a substitute for its internal positioning system. With this information the vehicle is guided. Integration with the outdoor navigation system is available to allow a smooth transition between outdoor and indoor navigation.
7.3 Autonomous vision-based micro air vehicle (MAV) for indoor and outdoor navigation
In this section we introduce a quadrotor that performs autonomous navigation in complex indoor and outdoor environments . An operator selects target positions in the onboard map, and the system autonomously plans flights to these locations. An onboard stereo camera and an inertial measurement unit (IMU) are the only sensors. The system is independent of external navigation aids like GPS. All navigation tasks are implemented on board. The system is based on FPGA-based dense stereo matching using semi-global matching, locally drift-free visual odometry with key frames, and sensor data fusion, utilizing the depth images available from stereo matching. To save processing time and make large movements, or rather low frame rates, possible, the system works only on features. A wireless connection is used for sending images and a 3D map to the operator and for receiving target locations. The results of a complex, autonomous indoor/outdoor flight support this approach. The position is controlled by the estimated motion of the sensor. To enable this, a state machine controller, a position tracking system, and a reference generator are implemented. The reference generator creates smooth position, velocity, and acceleration references for a tracking controller based on a list of waypoints. The flown path is composed of straight line segments between consecutive waypoints.
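A reference generator of the kind described, producing position references along straight segments between waypoints, could be sketched as constant-speed linear interpolation. The published system also generates velocity and acceleration references; this sketch covers position only, and the speed and sampling values are illustrative.

```python
import math

def reference_trajectory(waypoints, speed, dt):
    """Sample position references along straight segments between waypoints.

    Returns positions spaced `speed * dt` apart, which a tracking controller
    can follow. waypoints: list of (x, y, z) tuples; speed in m/s; dt in s.
    """
    refs = [waypoints[0]]
    for a, b in zip(waypoints, waypoints[1:]):
        dist = math.dist(a, b)
        steps = max(1, round(dist / (speed * dt)))
        for i in range(1, steps + 1):
            t = i / steps
            refs.append(tuple(a[k] + t * (b[k] - a[k]) for k in range(3)))
    return refs
```

A 1 m segment at 0.5 m/s sampled every 0.5 s yields four intermediate references 0.25 m apart.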
8. Obstacle avoidance navigation systems
A comprehensive automated navigation system must incorporate effective tools for detecting road obstacles and instantly proposing the optimal alternate route bypassing the detected obstacle. It combines optimal route finding, real-time route inspection, and route adjustments to ensure safe navigation. The following are three examples utilizing advanced technologies such as computer vision, fuzzy logic, and context awareness. More examples can be found in .
8.1 Image processing obstacle avoidance navigation
Unmanned aerial vehicles (UAVs) use vision as the principal source of information through a monocular onboard camera . The system compares the obtained image against the obstacles to be avoided, enabling a micro aerial vehicle (MAV) to detect and avoid obstacles in an unknown but controlled environment. Only feature points with the same type of contrast are compared, achieving a lower computational cost without reducing descriptor performance. After detecting an obstacle, the vehicle must recover its path. The algorithm starts when the vehicle is closer to the obstacle than the allowed distance. The limit area value is obtained experimentally by defining the dimensions of obstacles in pixels at a specific distance. The output of the control law moves the vehicle away from the center of the obstacle, avoiding it; if the error is less than zero, the vehicle moves to the right side. For detouring permanent obstacles, a preliminary process scans the route and corrects it so that the corrected route already accounts for all known obstacles and bypasses them.
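The control law described, pushing the vehicle away from the obstacle's center in the image, can be sketched as a proportional law on the pixel error. The gain, image width, and sign convention below are assumptions for illustration, not values from the source.

```python
def avoidance_command(obstacle_center_x, image_width, gain=0.004):
    """Lateral velocity command steering the vehicle away from an obstacle.

    obstacle_center_x: horizontal pixel position of the detected obstacle.
    The error is the obstacle's offset from the image centre; the command
    pushes the vehicle to the opposite side (error < 0 -> steer right,
    taking positive commands as rightward motion).
    """
    error = obstacle_center_x - image_width / 2
    return -gain * error
```

An obstacle in the left half of a 640-pixel-wide image (error < 0) produces a positive, rightward command, and vice versa; a centered obstacle gives no lateral command, which in a real system would need an explicit tie-breaking rule.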
8.2 Fuzzy logic technique for mobile robot obstacle avoidance navigation
Mobile robots perform tasks such as rescue and patrolling. They can navigate intelligently by using sensor control techniques . Several techniques have been applied for robot navigation and obstacle avoidance. The fuzzy logic technique is inspired by human perception-based reasoning. It has been applied to behavior-based robot navigation and obstacle avoidance in unknown environments. It trains the robot to navigate by receiving the obstacle distance from a group of sensors. A reinforcement learning method and a genetic algorithm optimize the fuzzy controller, improving its performance while the robot moves. Comparing the performance of different membership functions, such as triangular, trapezoidal, and Gaussian, for mobile robot navigation shows that the Gaussian membership function is the most efficient for navigation.
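A minimal sketch of Gaussian-membership fuzzy control for obstacle distance, with made-up fuzzy sets and output speeds; a real controller would take several sensor inputs and use a full rule base.

```python
import math

def gaussian_mf(x, center, sigma):
    """Gaussian membership function used for the distance fuzzy sets."""
    return math.exp(-((x - center) ** 2) / (2 * sigma ** 2))

def fuzzy_speed(obstacle_distance):
    """Toy rule base: NEAR obstacle -> slow, FAR obstacle -> fast.

    The crisp output is a weighted average of singleton consequents
    (a common simplification of centroid defuzzification).
    """
    near = gaussian_mf(obstacle_distance, center=0.0, sigma=1.0)
    far = gaussian_mf(obstacle_distance, center=5.0, sigma=2.0)
    slow, fast = 0.1, 1.0   # singleton output speeds (m/s), invented values
    return (near * slow + far * fast) / (near + far)
```

The Gaussian sets give a smooth transition: the commanded speed rises continuously as the obstacle distance grows, with no hard switching point.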
A similar concept uses a neural network learning method to construct collision-free path planning for robots. Real-time collision-free path planning is more difficult when the robot is moving in a dynamic and unstructured environment.
8.3 Context-aware mobile wearable system with obstacle avoidance
The system is composed of three embedded components: a map manager, a motion tracker, and a hindrance-dodging component . The map manager generates semantic maps from a given building model. The hindrance-dodging component detects visible objects lying in the path and suggests a safe bypass route to the target location. A developed prototype performed very well, showing that this navigation system is effective and efficient.
This chapter introduced various complementary navigation concepts and implementations, integrating advanced technologies and improving and expanding existing traditional navigation solutions. The outcome is a wide variety of solutions for cases where standard navigation technologies such as GPS are less effective or not applicable. We presented several areas where various technologies have been tailored to specific problems. For each problem we described different cases with unique technologies and implementations.