Open access peer-reviewed chapter

IoT-Based Route Guidance Technology for the Visually Impaired in Indoor Area

Written By

Jong-Gyu Hwang, Tae-Ki An, Kyeong-Hee Kim and Chung-Gi Yu

Submitted: 11 January 2022 Reviewed: 25 May 2022 Published: 13 July 2022

DOI: 10.5772/intechopen.105549

From the Edited Volume

Smart Mobility - Recent Advances, New Perspectives and Applications

Edited by Arif I. Sarwat, Asadullah Khalid and Ahmed Hasnain Jalal

Abstract

The mobility handicapped, and especially the visually impaired, experience many difficulties and inconveniences when moving through underground spaces such as subway stations because of behavioral constraints and a lack of guidance information. Route guidance for the visually impaired is supported by various mobile apps using the global positioning system (GPS), but these apps cannot be used in areas where GPS signals are not received. To solve this problem, this chapter presents an Internet of things (IoT) sensor-based route guidance technology to improve the mobility of the visually impaired in indoor areas such as railway stations, together with a mobile app built around an IoT-sensor-based user positioning algorithm and designed for user convenience. In addition, to evaluate the applicability of the developed system, user satisfaction was measured through a test with visually impaired participants using a virtual model of Busan City Hall metro station. The route guidance technology presented in this chapter is expected to contribute greatly to improving the mobility of the visually impaired in indoor areas, including railway stations.

Keywords

  • the visually impaired
  • route guidance
  • IoT-based technology
  • satisfaction ratio
  • positioning algorithm in indoor area

1. Introduction

The mobility handicapped refers to people who have temporary or continuous movement restrictions or inconveniences when using public transportation facilities because of physical or behavioral limitations. As of 2017, the mobility handicapped in Korea were expected to reach 28.9% of the total population, with an annual growth rate of 2% over the following 5 years. Accordingly, improving the convenience of public transportation facilities such as railway stations for the mobility handicapped has become an urgent problem. Among the mobility handicapped, the visually impaired in particular face many inconveniences in using public transportation such as railways because of their physical characteristics. In addition, it is difficult for the visually impaired to escape safely using the various emergency evacuation information and escape systems provided for emergency situations such as fires in railway stations, which poses potential safety problems. Compared with other groups of the mobility handicapped, the visually impaired find indoor activities more difficult and inconvenient because of the lack of guidance information available to them.

Recently, various walking-support technologies for the visually impaired based on global positioning system (GPS) location information have been introduced, helping them in their outdoor activities. However, since GPS signals cannot be received in underground or indoor areas such as railway stations, these recently introduced walking-support systems cannot be used there. According to a survey report by the Ministry of Land, Infrastructure and Transport [1], as shown in Table 1, the user satisfaction of the mobility handicapped, including the visually impaired, is 10–20% lower than that of the general public at railway stations and bus terminals [1, 2, 3, 4].

Division | Sum | Very satisfied | Satisfied | Normal | Unsatisfied | Very unsatisfied | No use | User satisfaction
The physically disabled | 156 | 3 | 7 | 91 | 35 | 20 | 22 | 52
The visually disabled | 69 | 0 | 7 | 42 | 13 | 7 | 5 | 54
The hearing disabled | 63 | 3 | 34 | 21 | 2 | 3 | 9 | 57
The complex disabled | 68 | 3 | 4 | 38 | 17 | 3 | 8 | 56
The pregnant | 93 | 6 | 24 | 43 | 20 | 0 | 17 | 63
The older | 374 | 45 | 105 | 165 | 50 | 9 | 83 | 67
The ordinary | 430 | 68 | 188 | 160 | 11 | 3 | 63 | 72

Table 1.

Usability evaluation results for each group of the mobility handicapped [1].

In order to increase the satisfaction ratio for the physically disabled, large-scale hardware investment, such as the installation of elevators for interfloor movement and facility improvements such as barrier-free (BF) design, is essential. The visually impaired, on the other hand, need route and risk information rather than large-scale facility investment. Various mobility convenience facilities to support the movement of the visually impaired in indoor areas are continuously being installed, but the user satisfaction ratio is not improving as the facility installation rate increases. Accordingly, although installing hardware-based mobility convenience facilities is important, measures to improve user satisfaction from a software point of view are also required [3, 4, 5].

To solve these problems, various technologies to support independent walking for the visually impaired are being introduced and developed in many countries and institutes, as shown in Table 2 [6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26]. Many of these technologies have difficulty reaching practical use in terms of usability, such as electronic canes like NAVIWALK [14, 25], or are still in the early stages of development. In particular, most of them are designed for outdoor application, and with a few exceptions, indoor technology is still in the early stages of development.

Nation | Project for the visually impaired | Institute
Korea | Bus information terminal (BIT) | Rosisy Co., Ltd.
Korea | Accessory technology for the visually impaired (NAVIWALK) | Naviworks Co., Ltd.
Korea | Accessory technology for the visually impaired | Freemfor Co. & Isonic Co., Ltd.
Korea | Accessory technology for the visually impaired (destination guidance sticks) | Daegu Univ.
Korea | Accessory technology for the visually impaired (Smart-walk) | Daegu Univ.
Korea | Voice guidance system for the visually impaired | Nowon-gu District Office
Korea | Accessory technology for the visually impaired (visually impaired navigation) | HIMS International Co., Ltd.
Korea | Sound signal for the blind using GPS | Korea Road Traffic Authority
USA | Wireless pedestrian navigation system (Drishti) | Univ. of Florida
USA | Trinetra project (the third eye) | CMU
USA | Seeing AI | MS
USA | Access program | Go-Metro
USA | Drone system to help blind people exercise | Univ. of Nevada
USA | Accessible pedestrian signals (APS) | USA Government
EU | GuideCane | World Int'l Sensory Aid
EU | Brumel navigation system | Brunel Univ.
EU | Walking guidance technology using Bluetooth-based beacons | Wayfindr
EU | OnTheBus System Project | Autonoma Tech. Univ.
EU | GPS-based visually impaired navigation app (Blindsquare) | Scandinavia

Table 2.

Technology development research and practical application examples.

Recently, smartphone use has become common among visually impaired people in Korea for their outside activities. Accordingly, this chapter presents a smartphone-based technology for supporting the indoor mobility of visually impaired people that does not require large-scale facilities, and verifies its applicability [7, 8, 13]. That is, an Internet of things (IoT) sensor-based route guidance technology that can improve the satisfaction of the visually impaired in indoor spaces is presented, together with user satisfaction evaluation results.


2. Technology trend analysis of support system for the mobility handicapped

In this section, a patent analysis of support system technologies for the mobility handicapped, centered on the visually impaired, is conducted together with a patent trend analysis for predicting future technology directions. In general, since patent information is disclosed to the public only 18 months after the application is filed, quantitative figures for patents filed in 2019–2020 are not yet meaningful, so the quantitative analysis is limited to the end of 2018.

Looking at the overall patent trend by year for support system technologies for the mobility handicapped, from a macroscopic point of view, except for a brief decrease around 2010, the overall number of applications has steadily increased, and it has risen rapidly since 2010 (Figure 1). Korea has 433 cases, accounting for 17% of the total valid patents; China has 1078 cases (41%); Japan has 151 cases (6%); the USA has 720 cases (27%); and Europe has 237 cases (9%). Looking at the patent trend by year in Korea, the number of patents increased from 2010, and after a sharp rise and a peak in 2014, it decreased in 2016 and then increased again. In the USA, patents started to increase from 2006; 102 applications were filed in 2015, the most among the major market countries that year, and the number appears to be declining as of 2018. Patents in China started to increase from 2011; 202 applications were filed in 2018, the most among the major market countries, and the number is still increasing steadily. In Japan, patents increased sharply in the early 2000s and have since alternated between increases and decreases with an overall upward trend, but they have been decreasing since 2018. In Europe, the number of patents started to increase from 2011; 34 applications were filed in 2015, its largest annual number, and the count has been declining since 2016 [8, 27, 28].

Figure 1.

Patent application trends by year in major market countries [7].

The technology market generally goes through the stages of “Birth ⇒ Growth ⇒ Maturity ⇒ Decline ⇒ Recovery” (Figure 2). In the field of interactive support system technology for the mobility handicapped, both the number of applicants and the number of applications continued to increase from the first period (2003–2006) to the fourth period (2015–2018), so the field is currently judged to be in the growth stage. The same holds for each country's technology market: for Korean patents (KIPO), US patents (USPTO), Chinese patents (SIPO), Japanese patents (JPO), and European patents (EPO), both the number of applications and the number of applicants increased from the first period (2003–2006) to the fourth period (2015–2018), indicating that each has entered the growth stage.

Figure 2.

Growth stage of technology market.

The main examples of development related to the mobility support system for the visually impaired are shown in Table 3, and only a few of them are introduced in this section.

Class | Nation | Project | Remark
The visually impaired | Korea | Bus Information Terminal (BIT) | LOGISYS
The visually impaired | Korea | Passage assistance technology for the visually impaired (NAVIWALK) | NAVIWORKS
The visually impaired | Korea | Passage assistance technology for the visually impaired | PRIMPO, ISONIC
The visually impaired | Korea | Passage assistance technology for the visually impaired (Destination Guide Cane) | Daegu University
The visually impaired | Korea | Smart-walk technology for the visually impaired | Daegu University
The visually impaired | Korea | Voice guidance system for the visually impaired | Nowon-gu Office
The visually impaired | Korea | Passage assistance technology for the visually impaired (navigation only for the visually impaired) | HIMS International
The visually impaired | Korea | A sound signal device for the visually impaired using GPS | Road Traffic Authority
The visually impaired | USA | Wireless Pedestrian Navigation System (Drishti) | University of Florida
The visually impaired | USA | Trinetra Project (the third eye) | Carnegie Mellon University
The visually impaired | USA | Seeing AI | Microsoft
The visually impaired | USA | Access program | Go-Metro
The visually impaired | USA | Drone system that helps visually impaired people to exercise | University of Nevada
The visually impaired | USA | Accessible Pedestrian Signals (APS) | USA Government
The visually impaired | Europe | GuideCane | Wormald International Sensory Aids
The visually impaired | Europe | Brumel navigation system | Brunel University
The visually impaired | Europe | Walking guidance technology using Bluetooth-based beacon | Wayfindr
The visually impaired | Europe | OnTheBus System Project | Universitat Autonoma
The visually impaired | Europe | GPS-based navigation app for the visually impaired (Blindsquare) | Scandinavia
Handicapped, elderly, infant | Korea | Bus boarding reservation service for the handicapped | Jeonju City
Handicapped, elderly, infant | USA | Senior Pedestrian Focus Areas for senior pedestrians | New York City
Handicapped, elderly, infant | USA | Travel Assistance Device | University of North Florida
Handicapped, elderly, infant | Europe | SafeWalk and C-Walk sensors | Traficon
Handicapped, elderly, infant | Europe | SMART-WAY | Germany
Handicapped, elderly, infant | Europe | WAY4ALL | Austria
Handicapped, elderly, infant | Japan | Intelligent Wheelchair Robot (TAO Aicle) | AISIN SEIKI
Handicapped, elderly, infant | Japan | Current status of pedestrian facilities for the elderly | Japan Government
Other | Korea | Development of customized public transportation service technology for the mobility handicapped | Ministry of Land, Infrastructure and Transport
Other | USA | Pedestrian sanctuary | Department of Transportation
Other | USA | School zone system | Department of Transportation
Other | USA | Car swivel seat | Department of Transportation
Other | USA | Policies and legal systems related to the mobility handicapped | Department of Transportation
Other | Europe | Mobility project to improve public transport accessibility | CIVITAS
Other | Europe | Intelligent bus stop system of the ACCESS2ALL project | European Commission
Other | Europe | HaptiMap project | Lund University
Other | Europe | PocketNavigator | European Commission
Other | Japan | Accessible Japan for the mobility handicapped | Japan Government
Other | Japan | Acoustic signal system technology development | Japan Government
Other | Japan | Pedestrian protection zone | Japan Government
Other | Japan | WHILL autonomous driving system | Japan Government
Other | China | Wheelchair Accessible Tour Guide | China Government
Other | China | Policies to increase accessibility for the mobility handicapped | China Government

Table 3.

Examples of technology development research and commercialization [9, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25].

2.1 Passage assistance technology for the visually impaired of NAVIWORKS

NAVIWALK, developed in Korea, is a cane product that, when it comes into contact with a braille block containing a radio frequency identification (RFID) tag, reads the current location and guidance information stored in the tag and provides it by voice.

For this purpose, the RFID tag embedded in the braille block is detected through an antenna at the tip of the cane (Figure 3). Because NAVIWALK is an offline system, it is easy to build and has low construction and operation costs; location information and guidance messages can be added, changed, and modified wirelessly without physical changes; and there are no particular restrictions on RFID tag installation, so it is generally suitable for commercialization.

Figure 3.

Smart cane configuration diagram of NAVIWALK model [14, 25].

2.2 Passage assistance technology for the visually impaired of Primpo Co. (Isonic-Primpo)

To overcome the limitation that existing canes for the visually impaired cannot detect obstacles above waist height, Isonic-Primpo, as shown in Figure 4, attaches an ultrasonic sensor to support a wider detection range [15]. It can detect obstacles within 2 m of the user, including thin obstacles as slender as 3 cm, and covers an angle of up to 25° to the left and right. It can announce the color of obstacles and the brightness level by voice. User-centered voice support is provided while obstacle location information is delivered with a vibration that grows stronger as the obstacle gets closer. In particular, as an obstacle-detecting electronic cane, its strong vibrotactile feedback and voice recognition support are great advantages for overcoming visual limitations.

Figure 4.

Isonic-Primpo voice recognition support for the visually impaired [15].

2.3 Drone system of the University of Nevada to help the visually impaired exercise

Visually impaired people have limitations in some forms of exercise, such as running outside without a guide, but it has been confirmed that they have higher spatial localization skills than the general public. This finding became the basis for having visually impaired people follow drones in a running track environment. The University of Nevada, Reno (UNR), developed a drone system (Figure 5) that helps visually impaired people exercise using a low-cost flying drone [16]. Equipped with two cameras, a downward-facing camera that follows the track's line and a separate camera that focuses on a marker on the runner's shirt, the drone flies about 10 feet ahead of the runner at eye level and provides sound guidance. As the runner speeds up or slows down, the drone adjusts its own speed to guide the movement of the visually impaired runner. The study was conducted with two visually impaired persons; the results confirmed that they could accurately identify and follow the drone, and the qualitative results showed that the participants became accustomed to following the drone and that the system was highly effective for following and locating it.

Figure 5.

Conceptual diagram of UNR’s drone system that helps exercise of the visually impaired [16].

2.4 Google’s HearHere project

HearHere is a navigation system for the visually impaired that proceeds in two steps [17]. Figure 6 shows an overview of the project process. First, sensor-equipped hardware mounted on a pair of glasses measures the direction the user is facing, and the measured information is transmitted to a smartphone through a Bluetooth module; the software installed on the smartphone then creates small intermediate destinations at regular intervals along the route to the destination, based on the transmitted location information.

Figure 6.

Google’s HearHere project process [17].

When the destination is set, the walking route from the current location to the destination is calculated, and virtual waypoints that each generate a sound are created at 10 m intervals. The visually impaired person perceives the sound as coming from the nearest waypoint, and when that waypoint is reached, the sound moves on to the next waypoint.
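As a rough illustration of this waypoint mechanism (not code from the HearHere project; the function name, the flat two-dimensional coordinates, and the example route are assumptions for illustration), the following Python sketch splits a computed walking route into virtual waypoints spaced about 10 m apart:

```python
import math

def make_waypoints(route, spacing=10.0):
    """Split a walking route (list of (x, y) points in meters) into virtual
    waypoints spaced roughly `spacing` meters apart along the path."""
    waypoints = [route[0]]
    carried = 0.0  # distance walked since the last waypoint
    for (x0, y0), (x1, y1) in zip(route, route[1:]):
        seg_len = math.hypot(x1 - x0, y1 - y0)
        walked = 0.0
        while carried + (seg_len - walked) >= spacing:
            step = spacing - carried
            walked += step
            t = walked / seg_len
            waypoints.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            carried = 0.0
        carried += seg_len - walked
    waypoints.append(route[-1])  # the destination itself is the final waypoint
    return waypoints

# Example: a 35 m L-shaped route produces waypoints about every 10 m.
print(make_waypoints([(0, 0), (20, 0), (20, 15)]))
```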

2.5 Wormald International Sensory Aids' robot GuideCane for the visually impaired

Initially, NavBelt, a combination of navigation and belt, was developed to search for obstacles in the path of a visually impaired walker; however, since obstacles near the feet must also be identified while walking, the broader concept of the GuideCane was developed [9]. The GuideCane and its functional components, shown in Figure 7, are very similar to a white cane: the user holds the GuideCane in front of them while walking, but the details differ. Although the GuideCane is considerably heavier than a regular cane, it rolls on wheels that support its weight during operation, so it feels like a cane of normal weight. A servo motor operating under the control of a built-in computer can steer the wheels left and right relative to the cane, and both wheels are equipped with encoders that determine their relative motion. For obstacle detection, the GuideCane is equipped with 10 ultrasonic sensors, so it can detect dangerous obstacles. To specify the desired direction of motion, the user manipulates a mini joystick on the handle, and based on the user input and the sensor data from the encoders, the computer determines where to turn next.

Figure 7.

Configuration of GuideCane [9].


3. Smart braille-block-based route guidance technology for the visually impaired

Visually impaired people have many difficulties due to their visual limitations when moving outdoors. Recently, various route guidance support systems using GPS signals have been developed and introduced to help them find their destination, but there are still many difficulties in moving through underground and indoor areas such as railway stations, where GPS signals cannot be used. Various technologies that can provide location information in these indoor areas are being developed, but most of them require the construction of extensive infrastructure, and at the same time users must carry a dedicated terminal or additional device to use the services [6, 7, 8, 9]; therefore, they are difficult to put into practical use. In this chapter, to increase practicality based on an analysis of these existing studies, an IoT-sensor-based route guidance technology using indoor positioning was designed so that the user only has to install a smartphone app and the construction of infrastructure facilities is minimized [7, 13].

3.1 Overview of IoT-sensor-based route guidance technology

Braille blocks for the visually impaired are installed on the floor of most indoor areas, including railway stations: linear blocks are laid along the walking path, and rounded (dot) blocks are installed at junctions or end points to help the visually impaired. In this chapter, an IoT sensor is embedded in the braille blocks installed on the floor, and the mobile app determines the user's location and calculates the route to the destination based on the signals from these sensors. Based on the confirmed current location and the desired route, route information is then provided through the voice and screen of the mobile terminal. Figure 8 shows the outline of the indoor route guidance technology proposed in this chapter and the application screen of the mobile terminal.

Figure 8.

Configuration of IoT-based route guidance technology for the visually impaired.

Because the app is used by the visually impaired rather than the general public, its screens must be designed in accordance with the national app accessibility standard and must also be certified by an authorized agency; the app developed in this chapter was designed and certified according to this standard. Once the user's location in the indoor area is confirmed, a route guidance service to the desired place becomes possible, and information on major facilities around the route and risk information can also be provided. In other words, until now it was impossible to provide such mobility information in indoor areas because GPS signals were not available; however, with the location information obtained from the IoT sensors proposed in this chapter, various services supporting the movement of the visually impaired in indoor spaces can be applied. Figure 8 shows an overview of this route guidance technology for the visually impaired.

3.2 IoT sensor data structure

As described in Section 3.1, the IoT sensor installed on the floor to identify the user's location in the indoor area is a beacon based on Bluetooth Low Energy (hereinafter referred to as BLE in this chapter), and the sensors are arranged according to the map of the indoor area in which the route guidance service is provided. Accordingly, IoT sensor mapping was carried out through the following zone design for each sensor:

  • Zoning so that travel routes do not overlap

  • Zoning by equalizing the installation interval of the sensor

  • Mapping of direction information for each zone to provide user movement direction information

  • Mapping with points of interest (POIs) management information by establishing standard identification code of sensor

After the map of the indoor area was divided into zones for location-based service in this way, the IoT sensors were mapped to each zone, and an identifier code system was then designed for each mapped sensor. In this chapter, the standard BLE sensor data structure was adopted in consideration of service scalability and terminal compatibility with both platforms (Android and iOS). The universally unique identifier (UUID) field of the data structure carries the identifier classifying the route guidance service for the visually impaired, the Major field carries the regional identifier of the area where the indoor space is located, and the Minor field carries the facility identifier within the indoor space. The information in these two fields is configured differently depending on the characteristics of the indoor area, such as railway stations, underground shopping malls, and buildings. In Table 4, the allocation range refers to the information for each zone defined in the sensor mapping process. Each BLE sensor carrying such a standard identification code is allocated to a zone on the indoor area map and emits a radio frequency (RF) signal carrying physical location information. Table 4 shows an example of the identifier code design for the Major and Minor fields when the serviced indoor area is a metro station. If the serviced area is not a railway station but a different kind of space, such as an underground shopping mall, the structure of the Major and Minor fields is adjusted according to the characteristics of the target area.

Field | Major | Minor
Structure | [J1] [J2] [J3] [J4] [J5] | [M1] [M2] [M3] [M4] [M5]
Allocation range | [J1]: 0–5, [J2]: 0–9, [J3]: 0–9, [J4]: 0–9, [J5]: 0–9 | [M1]: 0–5, [M2]: 0–9, [M3]: 0–9, [M4]: 0–9, [M5]: 0–9
Code allocation | [J1][J2][J3]: Station code (000–599); [J4][J5]: Region/line classification (00–99). *Region/line classification: 00–29: Seoul area, 30–39: spare, 40–49: Busan, 50–59: Daegu, … | [M1][M2]: Consecutive number (00–59); [M3][M4]: Use classification (00–99); [M5]: Floor classification. *Floor classification: 0: top fourth floor, 1: top third floor, 2: top second floor, 3: top first floor, 5: bottom fourth floor, …
Examples | [09801]: Seoul area/line 1/Seoul station; [02906]: Seoul area/line 6/Bugok station, … | [01014]: Platform up/bottom first floor; [01353]: Transfer parking/top first floor, …

Table 4.

Example of Beacon identifier codes in the case of a metro station.
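To make the identifier scheme of Table 4 concrete, the following Python sketch packs and unpacks the Major and Minor fields according to the digit layout shown above. The helper names and the fixed decimal digit widths are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the Major/Minor identifier layout of Table 4.

def encode_major(station_code: int, region_line: int) -> int:
    """[J1][J2][J3] = station code (000-599), [J4][J5] = region/line (00-99)."""
    assert 0 <= station_code <= 599 and 0 <= region_line <= 99
    return station_code * 100 + region_line

def encode_minor(seq_no: int, use_class: int, floor_class: int) -> int:
    """[M1][M2] = consecutive number (00-59), [M3][M4] = use class, [M5] = floor."""
    assert 0 <= seq_no <= 59 and 0 <= use_class <= 99 and 0 <= floor_class <= 9
    return seq_no * 1000 + use_class * 10 + floor_class

def decode_major(major: int):
    return divmod(major, 100)            # -> (station_code, region_line)

def decode_minor(minor: int):
    seq_no, rest = divmod(minor, 1000)
    use_class, floor_class = divmod(rest, 10)
    return seq_no, use_class, floor_class

# First example of Table 4: station code 098, Seoul area / line 1 -> Major "09801".
print(f"{encode_major(98, 1):05d}")      # 09801
# Minor "01014" decodes to consecutive number 01, use class 01, floor code 4.
print(decode_minor(1014))                # (1, 1, 4)
```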

3.3 IoT-based positioning algorithm in indoor area

In this chapter, the user’s location in the indoor area is confirmed based on the smart braille block with the built-in BLE sensor with the data structure presented in Section 3.2. Although the user’s location is identified based on a receiver signal strength indicator (RSSI) signal from a sensor installed on the floor, sensor signals of adjacent sections can be received at the same time, so a method of determining in which section the user is actually located is required. In addition, in order to increase the accuracy of the route guidance information, even if the area of the sensor where the user is located is determined, it is necessary to monitor how far away from the sensor and whether the user deviates from the set route while moving.

To measure the user’s moving direction and distance from the sensor, a hybrid positioning algorithm is applied through pedestrian dead reckoning (PDR) technology, which corrects the position through various sensors built into the mobile terminal. PDR is a technique for estimating the relative position change from the previous position through the detection of a pedestrian’s steps, estimation of the stride length to determine the distance traveled, and estimation of the direction to determine the direction of walking by using the measurement values of three sensors in the inertial measurement unit (IMU) built into the smartphone. For positioning error correction, Kalman filter (KF) was applied to remove the error included in the RSSI value measured by the inertial sensor of the mobile terminal, and an algorithm for correcting the accumulated error of the inertial sensor of the smartphone was applied. During positioning, error correction and algorithms are applied according to the situation such as the position of terminal, stride length, and speed. The user’s current location, movement direction, and movement distance are determined using map information based on the link information between nodes of the BLE sensors mapped to the indoor area map and a hybrid positioning algorithm. Route guidance is provided to the user through the app based on the user’s current location information determined by this algorithm.

An overview of the user tracking algorithm based on the BLE signals is shown in Figure 9. It illustrates the tracking information and zone determination when a user enters zone A and then moves to zone C via zone B. Multiple BLE signals are received simultaneously at the user's current location; considering the strength of these signals and the strength of each signal at the previous position, the user's position is estimated and the sensor zone the user is currently in is determined. As shown in Figure 10, when the user enters zone A and exits toward zone C through zone B, multiple BLE signals may be received by the user's terminal, and some signals may lie within the error range of one another. The current user location is estimated by considering the strength of the received BLE signals, the mapped link information between the sensors, and the zone information from the previous location.

Figure 9.

Concept of BLE-based tracking information.

Figure 10.

Calibration concept when ambient signals are measured higher.

As shown in Figure 10, based on the received sensor information, the algorithm estimates which zone the user is in now, or from which zone to which zone the user is moving. In the figure, “A → B” means that, although the user is estimated to be moving from zone A to zone B, the user is currently still in zone A. The case shown in Figure 10 is one in which the signals of sensor No. 2 of zone A and sensor No. 3 of zone B, and the signals of sensor No. 1 of zone A and sensor No. 4 of zone B, are received within the error range of each other. Although it is ambiguous from these received signals alone whether the user is in zone A or zone B, since the previous position was in zone A, the algorithm determines that the user is still in zone A while moving toward zone B.
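The following Python sketch illustrates this zone-disambiguation rule under simplified assumptions: each zone is represented only by its strongest received RSSI, and a fixed margin (here 3 dB, a hypothetical value) decides when two zones are considered within the error range.

```python
def estimate_zone(zone_rssi, prev_zone, margin_db=3.0):
    """Pick the current zone from the strongest RSSI observed per zone.

    If the two strongest zones lie within margin_db of each other, the
    previous zone is kept and the transition is reported (e.g. 'A -> B'),
    mirroring the ambiguous case described for Figure 10."""
    ranked = sorted(zone_rssi.items(), key=lambda kv: kv[1], reverse=True)
    best_zone, best_rssi = ranked[0]
    if len(ranked) > 1 and prev_zone in zone_rssi:
        _, second_rssi = ranked[1]
        if best_rssi - second_rssi <= margin_db and best_zone != prev_zone:
            return prev_zone, f"{prev_zone} -> {best_zone}"   # still in prev_zone
    return best_zone, best_zone

# Zones A and B are received at nearly equal strength; the user was in A,
# so the estimate stays in A but flags the movement toward B.
print(estimate_zone({"A": -64.0, "B": -62.5, "C": -80.0}, prev_zone="A"))
# ('A', 'A -> B')
```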

As described above, after first estimating the user's location at the zone level, the specific sensor at which the user is located is estimated in detail by the method shown in Figure 11. The algorithm checks whether the sensor signal received by the terminal comes from a valid sensor; if it is a valid signal, it is ranked as the first-priority signal for the user's current location based on the RSSI value corrected by the location correction algorithm described above. The sensor selected after ranking correction is then compared with the previous tracking information to check whether anything has changed, and finally the user's tracking information is updated.

Figure 11.

Node links of sensors when the desired route is a straight line.

Figure 11 shows an example of a case in which the BLE signal of zone C is measured strongly while the user is in zone A. In this case, since the strongest signal combination comes from BLE sensors No. 1 and No. 2 of zone A, the surrounding BLE information is searched; the nearest BLE sensor after No. 1 and No. 2 is determined to be No. 3, and the signal from No. 5 is ignored. Figure 12 shows the flowchart for checking the user's tracking information based on the algorithm described so far: the app checks whether the BLE signal received by the mobile phone is valid, and if so, determines the first-priority signal for the user's current location from the corrected RSSI value; the BLE sensor selected after ranking correction is compared with the previous tracking information to check for changes, and finally the user's tracking information is updated. In short, the BLE signals are processed for user tracking in this order.

Figure 12.

Flowchart of user tracking information checking.
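The checking order of Figure 12 can be summarized in a short sketch such as the following; the data structures, the validity test, and the plug-in RSSI correction are simplified assumptions rather than the implemented app logic.

```python
def update_tracking(readings, valid_sensors, prev_best, smooth):
    """One pass of the tracking-information check sketched in Figure 12.

    readings      : dict of sensor id -> raw RSSI (dBm) received by the phone
    valid_sensors : set of sensor ids registered on the indoor map
    prev_best     : first-priority sensor from the previous pass (or None)
    smooth        : callable applying the RSSI correction (e.g. a Kalman step)
    Returns (best_sensor, changed)."""
    # 1. keep only signals coming from sensors registered on the map
    valid = {sid: rssi for sid, rssi in readings.items() if sid in valid_sensors}
    if not valid:
        return prev_best, False                 # nothing usable: keep previous info
    # 2. apply the RSSI correction and rank; the strongest corrected signal
    #    becomes the first-priority candidate for the current location
    corrected = {sid: smooth(rssi) for sid, rssi in valid.items()}
    best = max(corrected, key=corrected.get)
    # 3. compare with the previous tracking information and update
    return best, best != prev_best

# Sensor 'X9' is not registered on the map and is dropped before ranking.
best, changed = update_tracking({"A1": -70, "A2": -66, "X9": -50},
                                {"A1", "A2", "A3"}, prev_best="A1",
                                smooth=lambda rssi: rssi)
print(best, changed)                            # A2 True
```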

For route guidance in an indoor area, the route is divided into guidance units based on the smart braille blocks with built-in IoT sensors, and separate map node information is stored on the server for each unit. When the user arrives at a location where route guidance is possible, the map node information of the corresponding unit is downloaded from the server to the mobile terminal. When the user's mobile terminal detects the sensor of a smart braille block, the current location is announced to the user by voice, along with basic information and brief instructions for using the app. A list of destination facilities reachable from the current location is then provided, and when the user selects the facility corresponding to the desired destination, the route to that facility is set up and route guidance information is provided by image and voice as the user moves.
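As one way to picture how a route can be set up over the stored node links, the sketch below runs a plain breadth-first search over a hypothetical link table for a small guidance unit; the chapter does not specify the search method or node naming, so both are assumptions here.

```python
from collections import deque

def route_to_destination(links, start, destination):
    """Breadth-first search over the mapped sensor node links, returning the
    ordered node sequence used for step-by-step guidance.

    `links` maps each node to the list of directly linked nodes, i.e. the
    node link information stored on the server for one guidance unit."""
    parent = {start: None}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == destination:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nxt in links.get(node, []):
            if nxt not in parent:
                parent[nxt] = node
                queue.append(nxt)
    return None   # destination not reachable within this guidance unit

# Hypothetical node links for a small waiting room: exit gate -> ticket gates -> platform.
links = {
    "exit_gate_4": ["ticket_gate_1"],
    "ticket_gate_1": ["exit_gate_4", "ticket_gate_2", "toilet"],
    "ticket_gate_2": ["ticket_gate_1", "platform"],
    "toilet": ["ticket_gate_1"],
    "platform": ["ticket_gate_2"],
}
print(route_to_destination(links, "exit_gate_4", "platform"))
# ['exit_gate_4', 'ticket_gate_1', 'ticket_gate_2', 'platform']
```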


4. Simulation test and user satisfaction evaluation

To evaluate the system and algorithm presented in this chapter, a mobile app was produced based on the design presented in Section 3, and user satisfaction was evaluated through a survey of visually impaired people before and after applying the system. For the user satisfaction survey, the waiting room of Busan City Hall Station was selected as the application target; the nodes of the smart braille blocks were coded through a field survey, and route guidance to a destination for the visually impaired could be provided through the connections between the coded nodes. Figure 13 shows the mapping of the IoT sensors and their connection status along the smart braille blocks in the waiting room of Busan City Hall Station. The city hall, two ticket gates, the toilets, and the preferential ticketing machines, which are the destinations reachable from point ① at station exit gate 4, are displayed, and it can be confirmed that they are linked to each other. When a destination is selected from the location of one of these main facilities, including exit gate point ①, the nodes are linked toward the destination and route guidance is provided along the linked route.

Figure 13.

Mapping of IoT sensors in the case of Busan City Hall station.

Figure 14 shows part of the screens of the produced mobile app. The first screen on the left is the initial screen displayed when the user runs the app after arriving at the location of a major facility in the station; through voice recognition, visually impaired people can easily select the destination they want to go to. A user interface (UI) that allows users to select and set destinations by screen touch rather than voice recognition was also produced. When a destination is selected by voice recognition or screen touch, the route is set by linking the sensor nodes toward the destination as shown in Figure 13, and the route to the destination is guided step by step as shown in the two middle screens of Figure 14. Route information is provided to the visually impaired sequentially by voice (shown in red in the figure) as well as by the guidance image, and when the user finally arrives at the destination, the voice guidance is terminated.

Figure 14.

Developed mobile app windows (in Korean).

Based on the produced mobile app, a simulation test was conducted to evaluate the developed system with 23 visually impaired people in Busan, recruited through the Busan Blind Union. Although the developed system should ideally be evaluated through use in an actual station, due to the COVID-19 situation the evaluation was conducted as a satisfaction survey based on a simulation test. In the simulation test, an environment was established in which the visually impaired could experience the voice route guidance system through the mobile app rather than at the actual Busan City Hall station site. In other words, the node information of the sensors installed at the actual station was built on the server, and when the user selects a destination, the sensor nodes along the route to the destination are linked just as in reality. However, for location confirmation according to the user's movement, the movement was simulated within the app taking the average movement speed of the visually impaired into account, and the link with the server was constructed so that route information could be provided according to the same node links as in the actual station.
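A simplified version of such a movement simulation is sketched below; the assumed walking speed of 0.8 m/s, the node names, and the distances are illustrative values, not the parameters used in the actual test.

```python
def simulate_walk(route, distances_m, speed_mps=0.8):
    """Replay a guided walk along `route` (ordered node ids) assuming the user
    moves at `speed_mps`; `distances_m[i]` is the distance between route[i]
    and route[i+1]. Yields (elapsed_seconds, node) guidance events, which is
    roughly how arrival at each sensor node can be emulated without the
    physical beacons."""
    elapsed = 0.0
    yield elapsed, route[0]
    for node, dist in zip(route[1:], distances_m):
        elapsed += dist / speed_mps
        yield round(elapsed, 1), node

# Example: emit a guidance announcement as each node on the route is "reached".
for t, node in simulate_walk(
        ["exit_gate_4", "ticket_gate_1", "ticket_gate_2", "platform"],
        distances_m=[12.0, 8.0, 25.0]):
    print(f"t={t:>6}s  announce: {node}")
```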

The design of the questionnaire is important in a user satisfaction survey on the use of a developed system. In this chapter, the basic survey items were adapted from the “2017 Transportation Convenience Survey Study” conducted annually by the Ministry of Land, Infrastructure and Transport. To understand user satisfaction and the effect of the system on route movement, NASA-TLX survey items were also included in the satisfaction survey after review by experts in related fields [4, 6, 23]. Figure 15(a) shows a photograph of the user satisfaction survey conducted with the 23 visually impaired persons through the Busan Blind Union, and Figure 15(b) shows the results of the survey before and after applying the developed system. As shown in the figure, the user satisfaction ratio before applying the developed system was 6.81 out of 10, but after using it, it rose to 8.81, an improvement of about 19.4%, confirming that the effect of applying the proposed system was very good. In addition, although the visually impaired tend to be reluctant to use railway stations because of the difficulty of finding routes, the majority opinion was that, if the system of this chapter were applied in the field, walking through a simulation with this app before going to the station would be very useful and helpful when moving through the actual station.

Figure 15.

Results of the satisfaction ratio survey.


5. Conclusion

To improve the mobility of the visually impaired in indoor areas, an IoT-sensor-based route guidance technology was designed and presented in this chapter. To this end, a system was developed comprising an IoT-sensor-based user positioning algorithm and a mobile app whose UI follows the app accessibility guidelines and reflects user convenience. To evaluate the developed technology, the IoT sensor map was created for an urban railway station, one of the representative indoor areas, an app for the simulation test was additionally produced, and the satisfaction of visually impaired users with the developed system was surveyed. The survey results confirmed that user satisfaction improved significantly compared to before the system was applied. In addition, with the simulation app alone, the visually impaired participants could check the station and the route to their destination in advance and experience the route to the place they wanted to go before leaving home, and many opinions were offered that the technology could be usefully applied in this way, confirming the utility of the approach proposed in this chapter. Moreover, if the voice recognition rate is further improved for specific indoor areas such as railway stations, it is expected that the mobility support and satisfaction of the visually impaired can be dramatically improved through minimal hardware installation in the indoor area combined with software technology.

References

  1. Ministry of Land, Infrastructure and Transport. In Year 2017 – A Study on the Actual Condition of Movement of the Traffic Handicapped. 2018 (in Korean)
  2. Jung SB, Lee SO. A regression analysis study on customer satisfaction considering the mediation effect of safety of operation – Focusing on Subway line 9. Journal of the Korean Society for Railway. 2017;20(6):853-865. DOI: 10.7782/JKSR.17.6.853
  3. Kim HC. Customer satisfaction analysis for urban railway service quality by IPA analysis. Journal of the Korean Society for Railway. 2015;18(5):502-511. DOI: 10.7782/JKSR.18.5.502
  4. Jeong EB, Yu SY. Assessment of route guidance for mobility-handicapped passengers in railway stations based on user satisfaction survey. Journal of Korean Society of Transportation. 2020;38(4):309-323. DOI: 10.7470/jkst.2020.38.4.309
  5. Kim HH. Study on the Usability Testing of User Interface Design Using NASA-TLX [Thesis]. Korea: Korea Tech University; 2003
  6. Ge T. Indoor Positioning System Based on BLE for Blind or Visually Impaired [Thesis]. Stockholm, Sweden: KTH Royal Institute of Technology; 2015
  7. KRRI Research Report. Development in Interactive Route Guidance and Supporting System Technology for Mobility Handicapped in Railway Station. Korea: Korea Railroad Research Institute; 2021
  8. Yu JW et al. Identification of vacant and emerging technologies in smart mobility through the STM-based patent map development. Sustainability. 2020;12:1-12. DOI: 10.3390/su14010476
  9. Shoval S, Ulrich I, Borenstein J. NavBelt and the Guide-Cane [obstacle-avoidance systems for the blind and visually impaired]. IEEE Robotics & Automation Magazine. 2003;10(1):9-20. DOI: 10.1109/MRA.2003.1191706
  10. Rahman MA et al. Design and development of navigation guide for visually impaired people. In: Proceedings of the IEEE International Conference on Biomedical Engineering, Computer and Information Technology for Health. IEEE; 2019. pp. 28-30. DOI: 10.1109/BECITHCON48839.2019.9063201
  11. Okamoto T, Shimono T, Tsuboi Y, et al. Braille block recognition using convolutional neural network and guide for visually impaired people. In: Proceedings of the IEEE 29th International Symposium on Industrial Electronics (ISIE). 2020. DOI: 10.1109/ISIE45063.2020.9152576
  12. Kasthuri R, Nivetha B, et al. Smart device for visually impaired people. In: Proceedings of the International Conference on Science Technology Engineering & Management (ICONSTEM). IEEE; 2017. DOI: 10.1109/ICONSTEM.2017.8261257
  13. Hwang JG et al. Design of supporting system to improve the mobility handicapped satisfaction in railway station. The Transactions of the KIEE. 2019;68P(1):17-24. DOI: 10.5370/KIEEP.2019.68.1.017
  14. Biconix. Gangnam Stick for the Blind [Internet]. 2017. Available from: http://play.google.com/store/apps/details?id=jh.com.beaconyx.gangnamsticks&hl=es_419 [Accessed: December 20, 2020]
  15. iSONIC [Internet]. 2010. Available from: http://www.primpo.com/kr/index.html [Accessed: May 2, 2022]
  16. Al Zayer M et al. Exploring the use of a drone to guide blind runners. In: Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility. ACM; 2016. pp. 263-264. DOI: 10.1145/2982142.2982204
  17. Google's Hear Here Project [Internet]. 2016. Available from: https://folio.openknowl.com/project/7950 [Accessed: May 2, 2022]
  18. Aipoly. Vision through artificial intelligence [Internet]. 2016. Available from: http://aipoly.com/ [Accessed: January 4, 2022]
  19. Microsoft. Seeing AI [Internet]. 2017. Available from: https://www.microsoft.com/en-us/ai/seeing-ai [Accessed: January 4, 2022]
  20. Google. Lookout [Internet]. 2018. Available from: https://korea.googleblog.com/2018/05/lookout.html [Accessed: January 4, 2022]
  21. Be My Eyes [Internet]. 2015. Available from: https://www.bemyeyes.com/ [Accessed: January 4, 2022]
  22. Right-Hear [Internet]. 2015. Available from: https://right-hear.com/ [Accessed: January 4, 2022]
  23. Blind Pad [Internet]. 2017. Available from: https://www.blindpad.eu/ [Accessed: January 4, 2022]
  24. Soundplex [Internet]. 2018. Available from: http://soundplex.co.kr/ [Accessed: January 4, 2022]
  25. Beaconyx. Gangnam Stick [Internet]. 2017. Available from: https://play.google.com/store/apps/details?id=jh.com.beaconyx.gangnamsticks&hl=es_419 [Accessed: January 4, 2022]
  26. Elmannai W, Elleithy K. Sensor-based assistive devices for visually-impaired people – current status, challenges and future directions. Sensors. 2017;17(3):565-573. DOI: 10.3390/s1730565
  27. Smart Transportation Market – Global Industry Analysis, Size, Share, Growth, Trends and Forecast 2015–2021. Transparency Market Research. US: Markets and Markets; 2015
  28. Visual Networking Index: Global Mobile Data Traffic Forecast Update 2016–2021. US: Cisco; 2017
