Open access peer-reviewed chapter

Wearable Devices and their Implementation in Various Domains

Written By

Menachem Domb

Submitted: 04 November 2018 Reviewed: 27 March 2019 Published: 25 April 2019

DOI: 10.5772/intechopen.86066

From the Edited Volume

Wearable Devices - the Big Wave of Innovation

Edited by Noushin Nasiri

Abstract

Wearable technologies are networked devices that collect data, track activities and customize experiences to users' needs and desires. They are equipped with microchips, sensors and wireless communication, all mounted in consumer electronics, accessories and clothes. They use sensors to measure temperature, humidity, motion, heartbeat and more. Wearables are embedded in various domains, such as healthcare, sports, agriculture and navigation systems. Each wearable device is equipped with sensors, network ports, a data processor, a camera and more. To allow monitoring and synchronizing multiple parameters, typical wearables have multi-sensor capabilities and are configurable for the application purpose. For the wearer's convenience, wearables are lightweight, of modest shape and multifunctional. Wearables perform the following tasks: sense, analyze, store, transmit and apply. The processing may occur on the wearer or at a remote location. For example, if dangerous gases are detected, the data are processed and an alert is issued; the data may also be transmitted to a remote location for testing and the results communicated in real time to the user. Each scenario requires personalized mobile information processing, which transforms the sensory data into information and then into knowledge that will be of value to the individual responding to the situation.

Keywords

  • wearable devices
  • device architecture
  • healthcare
  • visually impaired
  • automatic navigation

1. Introduction

Wearable devices have embedded sensors which acquire the data for which they were built. These data are then pushed to the device's integrated processor. The processor analyzes the data and accordingly issues commands, drives actuators or activates other sensors to collect more data or execute tasks according to predefined scenarios and processes. To promote device standardization and quick adaptation to a wide variety of goals and purposes, we propose a three-layer architecture: the common layer, the domain layer and the special-purpose layer. The common layer contains the elements required for any wearable device: motherboard, power supply, processor, operating system, communication ports, a grid of sockets and adapters for plugging in sensors, and software applications.
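
To illustrate this layering, the following minimal Python sketch (with hypothetical class and socket names, not a reference implementation) shows how a common base layer could expose a sensor-plugin grid that a domain layer and a special-purpose layer extend:

```python
from abc import ABC, abstractmethod


class Sensor(ABC):
    """Minimal sensor interface for the plug-in grid of the common layer."""

    @abstractmethod
    def read(self) -> float:
        ...


class CommonLayer:
    """Common layer: power, processor, communication ports and a sensor grid."""

    def __init__(self):
        self.sensor_grid = {}          # socket name -> plugged sensor

    def plug(self, socket: str, sensor: Sensor) -> None:
        self.sensor_grid[socket] = sensor

    def sample_all(self) -> dict:
        return {name: s.read() for name, s in self.sensor_grid.items()}


class HealthcareLayer(CommonLayer):
    """Domain layer: adds domain rules on top of the common layer."""

    def alert_if(self, socket: str, threshold: float) -> bool:
        value = self.sample_all().get(socket)
        return value is not None and value > threshold


class FallDetectionDevice(HealthcareLayer):
    """Special-purpose layer: a concrete device configured for one task."""
    pass
```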

With the rapid growth of various technologies, it is predicted that about 50 billion new devices will soon be added worldwide. This raises two major concerns: a huge amount of data, and heterogeneous devices with severe integration issues. These concerns also apply to wearable technology. Typical wearable body sensor networks consist of tiny, smart, low-power and self-organized sensors that observe the physiological signals of a human body. Standardization, compliance, effective coexistence and interoperability among multiple technologies are required to ensure end-to-end network routing and connectivity among wearables and external devices. M. Alam et al. [4] review multi-standard, multi-technology wearable wireless networks for inter-device communication. Coexistence and interoperability are discussed as challenges, along with the utilization of candidate technologies for on-body, body-to-body and off-body communications. The review explores several schemes to ensure effective coexistence among multiple technologies and addresses issues related to interoperability.

In this chapter, we describe the architecture and its operation in several domains, with one implementation per domain. Kolodzey et al. [1] reviewed 614 articles aiming to provide an objective overview of the literature on the use of wearable technology in clinical and simulated surgery. They found that applications of wearable technology mainly focused on improving the safety and efficiency of intraoperative processes. The associated applications were wide-ranging and designed for use by a variety of care providers, reflecting the interconnected relationship between intraoperative safety and the entire healthcare team. The review suggests that wearable devices resolve certain human factors that negatively influence performance and safety in the operating room, for example, by displaying patient variables to mitigate conflicts between patient care tasks and the distracting operative environment. It recommends a variety of wearable devices, such as smart glasses for their lightweight construction, user-friendly interface and potential for hands-free control, and specialized cameras for capturing precise anatomical details.

The rest of this chapter is organized as follows. In Section 2, we describe the platform and technology components used for developing and implementing wearable-based systems. In Section 3, we outline wearables in the healthcare domain, which is the most advanced domain with the highest number of production implementations. In Section 4, we review several wearable implementations in other domains, such as agriculture, construction and more. In Section 5, we describe in detail our original implementation of a wearable-based system assisting visually impaired people in walking safely and avoiding obstacles. In Section 6, we conclude and outline potential directions for further advancement of this subject.

2. Technology enablers

S. Park et al. [2, 3] explore advanced wearables and accordingly recommend guidelines for the successful development and deployment of comprehensive wearable systems. Among these is the use of a variety of sensors; each sensor should be flexible, adaptive, effective and reliable.

2.1 Variety of sensor types and flexible, effective and practical sensors

Sensors are needed to capture various aspects and parameters simultaneously [4]. For example, several vital-sign sensor types may work at the same time: heart rate, body temperature, pulse oximetry and blood glucose of different types, and the number of sensors may change in order to capture the signals required to compute a single parameter. In some cases, the sensors must be placed in specific locations; for example, electrocardiography, which records the electrical activity of the heart, places sensors in three locations on the body. In addition, sensors should be easy to attach and remove, or to plug and play, as sensors may be used at different times and under changing requirements. In most cases, parallel processing is required. For example, a pilot during a flight simulation may want to analyze his overall body reaction during the simulated action; this requires placing several sensor types and changing locations and types during the simulation. In practice, sensors should be low cost, lightweight and adaptable to the wearer's body, with distributed power supply and data communication among the sensors and processors in the wearable network.
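
As a minimal sketch of such multi-sensor, plug-and-play sampling (the sensor names and read functions below are hypothetical stand-ins for real drivers), the attached sensors can be read concurrently and swapped between rounds:

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical read functions standing in for real sensor drivers.
def read_heart_rate() -> float:
    time.sleep(0.01)                      # simulate sensor latency
    return random.uniform(55, 100)        # beats per minute

def read_body_temperature() -> float:
    time.sleep(0.01)
    return random.uniform(36.0, 37.5)     # degrees Celsius

attached = {"heart_rate": read_heart_rate,
            "body_temperature": read_body_temperature}

def sample_attached(sensors: dict) -> dict:
    """Sample all currently attached sensors in parallel."""
    with ThreadPoolExecutor(max_workers=len(sensors)) as pool:
        futures = {name: pool.submit(fn) for name, fn in sensors.items()}
        return {name: f.result() for name, f in futures.items()}

print(sample_attached(attached))
# Sensors can be plugged in or removed between sampling rounds:
attached["pulse_oximetry"] = lambda: random.uniform(94, 100)
print(sample_attached(attached))
```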

Packaging and fabrication technologies are widely used and keep improving with various new materials [5]. These developments enable embedding sensors, such as gyroscopes, accelerometers, cameras, motion sensing, and physiological and biochemical sensing, into rigid and flexible platforms, adding capabilities to wearable devices. Mobile devices have been integrated with wireless communication technology. The constant growth of broadband wireless networks opens a new era for wearable devices and sensors to continuously monitor the health of patients remotely.

2.2 The generic paradigm for connecting wearables

M. Alam and Ben Hamida [6] propose a generic paradigm which can serve as a platform for many existing and future applications, such as healthcare, disaster recovery, people safety and more. Its key advantage is its wearable wireless body area network (WBAN) capabilities, enabling remote and ad hoc deployment of networks. Envisioned applications in this context range from the popular medical field to entertainment, lifestyle, gaming and ambient intelligence. In applications such as disaster recovery, rescue and safety, wearable technology can also play a role in protecting critical and valuable assets. The network is designed so that the coordinating device communicates with implanted and on-body sensors and transmits the collected information to a remote monitoring station (a minimal sketch of this coordinator role is given after Figure 1). Figure 1 depicts the advantages of wearable communications in enhancing the readiness and alertness of wearers, devices and vehicles so that they act as an integrated unit, regardless of their physical location and distance from the occurrence. Enhancing this composition, we may embed cameras in the wearables and provide real-time information to all who are involved in a given situation. We may regard this architecture as a suitable infrastructure for any wearable-based function.

Figure 1.

The generic paradigm.
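
A minimal sketch of the coordinator role in such a paradigm is given below; the sensor readings and the transmit function are placeholders rather than any specific WBAN stack:

```python
import json
import time

def collect_on_body_readings() -> dict:
    """Placeholder for readings gathered from implanted and on-body sensors."""
    return {"heart_rate": 72, "body_temperature": 36.8, "spo2": 98}

def transmit_to_monitoring_station(payload: str) -> None:
    """Placeholder for the off-body uplink (e.g., cellular or Wi-Fi)."""
    print("uplink:", payload)

def coordinator_loop(period_s: float = 0.5, rounds: int = 3) -> None:
    """The coordinating device aggregates sensor data and forwards it."""
    for _ in range(rounds):
        readings = collect_on_body_readings()
        packet = {"timestamp": time.time(), "readings": readings}
        transmit_to_monitoring_station(json.dumps(packet))
        time.sleep(period_s)

coordinator_loop()
```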

3. Wearables for health

A dominant area of wearables is health, where the aim is to predict and treat common conditions by acquiring and processing physiological and environmental data. Wearable technologies allow consumers to convert personal, biological and environmental data into valuable insights. Wearables can transmit the data to and from the consumer at the appropriate time, creating new consumption experiences that can improve the landscape of health and fitness. These insights may turn into holistic decisions and goal-directed actions, especially if patients allow access to their physiological data collected from wearables. A new generation of wearable sensors enables physicians to capture patients' long-term activity levels and exercise compliance, facilitates effective dispensing of medications for chronic patients, and provides tools to assess their ability to perform specific motor activities and to propose rehabilitation solutions.

Wearables enable remote health monitoring of patients [7, 8]. The data are sent from the wearable to the physician's office, avoiding the need for office visits. The ability to continuously track patients' health helps identify potential problems through preventive interventions, thereby enhancing the quality of care and saving money, since the cost of prevention is in most cases lower than the cost of treatment. The resulting higher quality of care at lower cost would also contribute to better operating efficiencies and lower overhead costs for insurance companies, as resources can be spent on providing care rather than on measures to ensure that high-quality care is being provided. This is where wearables have a critical role to play, serving as the core of an ecosystem essential for the seamless transformation of data into delivered value.

A healthy lifestyle improves employees' productivity and lowers absence rates [9]. Insurance companies can collect activity and sleep data and leverage it for personalized insurance plans, rewarding employees for good health scores. An American insurance company piloted a wearables-based health program which continuously collects invasive and noninvasive data, such as vital signs. Artificial intelligence adds value to healthcare with a focus on diagnosis, treatment, patient monitoring and prevention.

3.1 Personalization

Wearables support personalized care in several ways:

  • Personalization: the doctor, with the help of a software expert, can quickly create a program based on the needs of the patient.
  • Early diagnosis: precise medical parameters allow early detection of symptoms.
  • Remote patient monitoring: healthcare professionals can monitor patients remotely and in real time using wearable devices.
  • Adherence to medication: wearables help patients take medications on time and inform medical professionals if the patient fails to adhere.
  • Information registry: the data are stored in real time, allowing an exhaustive analysis of the information. The result is a complete and precise report of the patient's medical history, which can be shared with other specialists.
  • Optimum decisions: the doctor can analyze the data to make better clinical decisions and enhance the patient's quality of life.
  • Saving healthcare costs: remote healthcare using wearable devices saves time and travel.

The recent emergence of new materials accelerates the development of non-invasive, skin-based wearable devices [10], which are expected to be compatible with human skin: flexible, stretchable and less irritating, while providing sensitivity to changes in body temperature and other bodily changes with an adequate detection limit. Several examples of skin-based devices in healthcare applications follow: predicting a sudden attack and providing the means to cope with it; detecting genetic cancer syndromes or rapid changes in heart rate; early evidence of vascular events; detecting an abnormal respiration rate; monitoring body temperature; and biosensing clothing. Wearable strain sensors are used for detecting and monitoring movement-based signals, such as heart rate and respiration rate. They are lightweight, reliable, flexible and stretchable, and align with diverse healthcare applications.

3.2 Diseases

Several researchers have proposed wearable-based solutions for specific diseases [11], to assist in curing or relieving the symptoms of the following conditions:

  1. Sleep apnea: interruptions or a decrease in breathing lasting from a few seconds up to a minute. Treatments depend on the severity of the case and range from weight loss to surgical operations. DT is a wearable oral device for following the prescribed therapy for sleep apnea. It measures the temperature, movement and head position of patients by determining the spatial orientation of the device in the mouth.

  2. Chronic obstructive pulmonary disease: a common lung disease that leads to shortness of breath. An ear-worn wearable monitors physical activity, allowing patients to continuously evaluate their condition at home. It reduces healthcare costs for patients who can be treated at home.

  3. Diabetes mellitus: a chronic disease whereby the body cannot produce enough insulin, so the control of blood glucose levels is essential for diabetic patients. A wearable artificial pancreas monitors the glucose level; it is composed of a flexible core system as a brain and three wedges for insulin delivery, glucose sensing and glucagon delivery. Another wearable for measuring blood sugar levels in diabetics is the smart contact lens owned by Google/Verily Life Sciences.

  4. Cardiovascular diseases: diseases related to the heart and blood vessels, including venous thrombosis, heart failure and cardiac dysrhythmia. Various wearable sensors provide real-time heart rate measurements, such as the wireless blood pressure wrist monitor, which monitors blood pressure in connection with a smartphone. The accuracy of its measurements was shown to be in good agreement with reference clinical measurements.

  5. Personal safety and epilepsy: the Vega GPS bracelet is a wearable sensor for ensuring the safety of people by monitoring their location using GPS and global system for mobile communications (GSM) positioning. Embrace is a wristband for monitoring physiological signals in epileptic people in real time, to alert family members.

  6. Mosquito-borne diseases: these cause a wide range of deadly illnesses, such as malaria, chikungunya, yellow fever, the Zika virus and the Ebola virus. The Kite Patch is a patch-type wearable that disperses volatile compounds and is worn on a shirt to repel mosquitoes.

  7. Renal failure: kidney failure and chronic kidney disease. In the treatment of renal failure, dialysis is commonly used, in which kidney function is replaced by a machine. To replace dialysis, a wearable artificial kidney has been developed.

  8. Skeletal system diseases: joint disorders, osteoporosis and poor posture. Using three-dimensional gyroscopes, accelerometers and magnetometers embedded in wearable sensors, the chronic pain resulting from most skeletal diseases can be treated with transcutaneous electrical nerve stimulation and with therapeutic exercises. Another wearable monitors postural variation and warns users through vibrations when they deviate from normal posture, reminding them to return to it.

  9. Sunburn prevention: the ultraviolet (UV) radiation in sunlight causes wrinkles, burns, aging and even skin cancer. Wearable UV sensors, worn on the arm as a bracelet, armband or wristband, monitor UV exposure levels, alert about potential skin damage and safety precautions, and estimate vitamin D production levels.

  10. Vein finding: a wearable smart glass termed Eyes-On technology enables nurses to rapidly see the veins of patients through the skin by incorporating multispectral 3D imaging and wireless connectivity.

  11. Detection of stress/depression levels: wearables are used to determine the state of mind of their users. One such product is a wristband that monitors heart rate variability, aiming to warn the user about a rise in personal stress levels.

3.3 Nutrition and dietetics

Real-time, effective and affordable nutrition and dietetics wearable technology and sensors are an emerging field with immense opportunities and benefits for the global nutrition challenge [12, 13]. Rapid, accurate and cost-effective self-detection and diagnosis, in real time and at home, work or hospital, of the direct or indirect causes of diet deficiency or excess is much needed for generating evidence-based information and knowledge that supports nutritional and dietary mitigation and lifestyle adaptation for individuals and vulnerable groups. Such wearable sensors and technology can enhance evidence-based, coherent and coordinated nutrition and dietary programs and strategies for a targeted group or illness, which is vital in addressing the public health burden of malnutrition and under-nutrition. They can also build consumers' health and fitness prognoses, prospective digital nutrition and dietetic data, databases and nutrition informatics platforms. These provide a paradigm shift in engaging participatory communication among public consumers, dietetic and nutritionist professionals to improve the quality of interventions, management and outcomes, and they support the assessment and understanding of nutritional and dietary needs, potential functional health benefits and the personalized accessibility and availability of the resources needed to encourage positive behavior, diet and nutrition changes.

Diet-related deficiencies are estimated to cause 3.5 million deaths annually, a burden compounded by rapid urbanization and changing food consumption patterns that demand nutrition safety. The public health nutrition approach of wearable and implantable sensors provides a new perspective on human and animal nutrition and diet. It leads to reliable and effective interdisciplinary nutrition and health approaches and tackles the ever-growing local and global nutrition challenges. Modern, convenient and cost-effective wearable sensors can be used to educate, track and predict energy levels and to advise on the interventions or activities required to correct excess or deficiency, supporting adaptation towards balanced choices and quantities of plant-derived sources that meet unique fruit and vegetable phytochemical and micronutrient needs.

The effectiveness of wearable devices, fitness trackers and mobile applications on healthy living and care delivery outcomes, such as weight loss and maintenance, has been documented in developed countries. Nutritional and dietary wearable technology has a critical role in driving a paradigm shift in nutritional and food challenges in Africa. It provides the much-needed real-time, home-, work- and hospital-based rapid, accurate and cost-effective detection and diagnosis of nutrition, energy or diet deficiency or excess. It supports the generation of quality information and knowledge for individuals, vulnerable groups and national decision-making on nutrition policy, guidelines, programs and interventions towards a healthier lifestyle, increased life expectancy, higher productivity and wellness. Real-time, flexible applications of smart wearable and implantable sensors are needed to provide clues into effective fitness and feeding best practices.

3.4 Body dietary and energy balance

Daily total energy expenditure (TEE) can be estimated using a physical activity monitor combined with a dietary assessment of energy intake, to assess the relationship between daily energy expenditure, patterns of activity and energy intake [14]. An activity monitor has been used to determine total energy expenditure, sleep duration and physical activity. The armband was placed around the left upper triceps. Energy intake was determined by evaluating all food and drink items. TEE was correlated with BMI and body weight but inversely related to sleep duration and time lying down. Multiple linear regression analysis revealed that after taking BMI, sleep duration and time spent lying down into account, TEE was no longer correlated with energy intake. The results show the extent to which body mass, variable activity and sleep patterns may contribute to TEE; together with reduced energy intake, energy requirements were not satisfied. Hence, wearable technology has the potential to offer real-time monitoring that enables more person-centered nutrition management to prevent weight loss.
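
As a minimal illustration of this kind of analysis (using synthetic numbers, not the data of [14]), an ordinary least-squares fit of energy intake on TEE, BMI, sleep duration and time lying down could be set up as follows:

```python
import numpy as np

# Synthetic example values: one row per participant.
tee = np.array([2400.0, 2100.0, 2700.0, 2300.0, 2550.0, 2450.0, 2650.0, 2200.0])  # kcal/day
bmi = np.array([24.0, 21.5, 27.0, 23.0, 25.5, 24.5, 26.0, 22.0])
sleep_h = np.array([7.0, 8.0, 6.0, 7.5, 6.5, 7.0, 6.0, 8.0])
lying_h = np.array([8.0, 9.5, 7.0, 8.0, 7.5, 8.5, 7.0, 9.0])
intake = np.array([2200.0, 2000.0, 2500.0, 2150.0, 2350.0, 2250.0, 2450.0, 2050.0])  # kcal/day

# Design matrix with an intercept column.
X = np.column_stack([np.ones_like(tee), tee, bmi, sleep_h, lying_h])
coef, *_ = np.linalg.lstsq(X, intake, rcond=None)

print(dict(zip(["intercept", "TEE", "BMI", "sleep", "lying"], coef.round(3))))
# A TEE coefficient near zero after adjustment would mirror the reported
# finding that TEE and energy intake are no longer correlated.
```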

4. Wearables for other domains

4.1 Construction

The known high percentage of accidents occurring in the construction industry calls for developing safety strategies. In this section, we describe personalized construction safety-monitoring applications incorporating wearable technology. Devices that predict safety performance and management practices are identified and analyzed. Awolusi et al. [15] present a variety of solutions.

Environmental sensors are small silicon sensors with embedded communication technologies, such as Bluetooth and Wi-Fi. They increase the volume and precision of environmental data, such as air quality, barometric pressure, carbon monoxide, capacitance, color, gas leaks, humidity, hydrogen sulfide, temperature, light and volatile organic compounds (VOCs), and enable intelligent RFID tags. Some sensors support a broad range of emerging high-performance applications, such as navigation and the sensing of barometric air pressure, humidity and ambient air temperature, and some are designed specifically for wearable technologies. Workers can be monitored while doing their normal work while seeing highly localized, real-time data on parameters such as temperature. Other sensors that can be used in wearable devices include gyroscopes, light sensors, noise sensors, humidity sensors, temperature sensors and gas sensors.
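
A minimal sketch of how such localized readings could be turned into worker alerts is shown below; the threshold values and the sample reading are illustrative assumptions, not regulatory limits:

```python
# Illustrative safety thresholds; real limits depend on regulations and site rules.
THRESHOLDS = {
    "carbon_monoxide_ppm": 35.0,
    "temperature_c": 38.0,
    "voc_ppb": 500.0,
    "noise_db": 85.0,
}

def check_environment(reading: dict) -> list:
    """Return a list of alert strings for any value exceeding its threshold."""
    alerts = []
    for key, limit in THRESHOLDS.items():
        value = reading.get(key)
        if value is not None and value > limit:
            alerts.append(f"{key} = {value} exceeds limit {limit}")
    return alerts

sample = {"carbon_monoxide_ppm": 48.2, "temperature_c": 31.0, "noise_db": 92.5}
for alert in check_environment(sample):
    print("ALERT:", alert)
```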

Wearable devices have the potential to protect workers in hazardous conditions: smart headsets for monitoring truck drivers' performance to reduce accidents, augmented reality headsets to guide workers through complex production processes, and wearable devices to predict injuries and machine downtime. According to Gartner, most companies with 500+ employees already use wearables in the workplace.

4.2 Quality of life

J. Lee et al. [16] focus on the value of sustainability in human-oriented wearables and services that seek to improve the quality of life, which involves social impact and public interest. Here, wearables refer to the technology and its applications whose sustainability value has a positive impact on the improvement of quality of life, social impact and the public interest. The aim is to discuss how continuously evolving wearables positively influence human life and the environment through the lens of sustainability.

A variety of wearable devices have been launched in the market to achieve various purposes, enabled by the development of sensing technologies. One typical example is an application that constantly measures users' movement distance and movement conditions over time through motion sensors included in wrist-worn devices and displays the measured results. Measuring the intake and consumption of calories, tracking sleep, postural correction, blood pressure and heart rate are the most fundamental applications of the current wearables field. As such, wearable applications started by quantifying various human activities (conscious or unconscious) numerically in daily life. Over the past few years, more wearable devices have been introduced according to their purpose, with increasing performance. As a result, the demand to quantify individual daily lives has increased. Along with this demand, more studies of methods to improve the quality of life by analyzing individual conditions have been conducted for application in real life, an approach called the quantified self. Tracked targets include various types of personal information, such as the physical activities performed and environmental information.

4.3 Monitoring social interactions

Wrist-worn wearables enable monitoring, detecting and recording features of interpersonal social interaction [17]. These devices embed motion sensors, accelerometers, heart rate monitors, optical sensors, skin conductivity and skin temperature sensors, and other physiological sensors. Increased synchrony of physiological measures has been shown to lead to increased perceived empathy and positive outcomes. Data from wearables can therefore be leveraged for social sensing based on interpersonal synchrony. Preliminary results show that wearable data are suitable for analyzing and quantifying social dynamics, and indicate measurable differences in wearable sensing data during a social interaction between two people.
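
One simple way to quantify such interpersonal synchrony is the correlation between two wearers' physiological time series; the sketch below uses synthetic heart-rate data to illustrate the idea:

```python
import numpy as np

# Synthetic heart-rate series (beats per minute) for two interacting people.
rng = np.random.default_rng(0)
base = 70 + 5 * np.sin(np.linspace(0, 6, 120))
person_a = base + rng.normal(0, 1.0, 120)
person_b = base + rng.normal(0, 1.0, 120)          # partly synchronized with A

def synchrony(x: np.ndarray, y: np.ndarray) -> float:
    """Pearson correlation as a simple synchrony score in [-1, 1]."""
    return float(np.corrcoef(x, y)[0, 1])

print(f"synchrony score: {synchrony(person_a, person_b):.2f}")
```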

4.4 Agriculture

According to Afzal et al. [18], water is a vital component in plants. They measured leaf moisture using special sensors. Results showed that the variation curve of the capacitance followed an exponential function, y = a·e^(bx), where y is the capacitance, x is the leaf moisture content, a is the linear coefficient and b is the exponential coefficient. A new adhesive sensor, sensitive to water vapor, measures leaf surface humidity and how much water is transpired by crop plants. It exhibits different levels of conductivity depending on the humidity and provides farmers with practical information on the real-time water absorption habits of their crops. The sensor is connected to a Wi-Fi device that transmits the data to the data analyzer, which then recommends how many gallons of water to apply to which parts of the field. The sensor is used for water management and to accelerate the breeding of drought-tolerant varieties of any crop.
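
A minimal sketch of recovering the coefficients a and b from calibration measurements is given below (the sample values are synthetic, not data from [18]); it uses the log-linear form ln y = ln a + b·x:

```python
import numpy as np

# Synthetic calibration points: leaf moisture content (%) vs. capacitance (pF).
moisture = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
capacitance = np.array([1.2, 1.9, 3.1, 5.0, 8.1])

# Fit ln(y) = ln(a) + b*x, then recover a and b of y = a * exp(b * x).
b, ln_a = np.polyfit(moisture, np.log(capacitance), 1)
a = np.exp(ln_a)
print(f"a = {a:.3f}, b = {b:.4f}")

# Invert the model to estimate moisture from a new capacitance reading.
new_capacitance = 4.0
estimated_moisture = (np.log(new_capacitance) - np.log(a)) / b
print(f"estimated moisture: {estimated_moisture:.1f}%")
```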

5. Wearables for navigation and safety systems

Automatic navigation in an unknown environment raises various challenges, as many cues about orientation are difficult to perceive without the use of vision. Although assistive aids such as the global positioning system (GPS), a satellite-based radio-navigation system, help in route finding, they still fail to fulfill safety requirements. This section proposes a framework that provides accurate guidance and information on the route traversal and the topography of the road ahead. The framework combines technologies such as Lumigrids, drones, GPS, mobile applications and cloud storage, which are used to map the road surface and generate proper navigation guidance for the end user. This is done in three stages: (1) off-line mapping of the road surface and storing this information in the cloud; (2) wearable technology used for obtaining real-time surface information and comparing it to the data in the cloud, facilitating accurate and safer navigation and (3) updating the cloud information with information collected by the pedestrian.

There are many technological navigation aids, but none of them focus on pedestrian paths. Banovic et al. [19] claim that travelers require detailed information about the terrain and its challenges: size, curves, hurdles, fences and changes in elevation. We therefore propose a three-phase safe navigation system that provides surface information on pedestrian paths and uses this information to suggest routes to the visually impaired in real time.

Most applications use location-sensing technology, such as GPS combined with a map, to locate and guide pedestrians. Sendero [20] uses the smartphone's location-sensing capability. Trekker Breeze [21] supports orientation using a commercial GPS receiver. Another work [22] combines crowdsourcing with computer vision techniques to provide additional information about traffic intersections, sidewalks or arbitrary images. A few open-source software systems [23] provide similar navigation instructions about points of interest, such as restaurants and buildings, using speech or Braille output. Studies indicate that pedestrians are positive about using technology-assisted aids to guide their navigation [24].

5.1 The process phases

The proposed navigation system consists of the following three phases: (1) terrain mapping phase, (2) pedestrian guidance phase and (3) re-mapping of the terrain based on a comparative walk-through and the terrain database. In the terrain mapping phase, an unmanned aerial vehicle flies over the pedestrian path. This vehicle records the GPS coordinates of the mapped region and accurately identifies the actual terrain of the underlying pedestrian path. These data are versioned and stored in the cloud. This referential database is centrally shared for the visually impaired. The terrain mapping phase is essential to initially map all the pedestrian paths and populate the cloud with data. In the pedestrian guidance phase, the terrain-related information stored in the cloud is combined with regular GPS-based route finding and used in real time to guide a pedestrian. A shirt-mounted device assists the visually impaired in achieving this. During the walk-through, the device mounted on the visually impaired pedestrian obtains the real-time terrain information of the path ahead and compares it to the existing information in the cloud to alert of new challenges or hazards that may have cropped up.

5.2 The navigation system

The terrain mapping phase consists of the following components: a quadcopter, an unmanned aerial vehicle, and a Raspberry Pi, a microcomputer that runs the required image processing algorithms and saves the information to the cloud. Figure 2 depicts the components used in the terrain mapping phase and their interconnection.

Figure 2.

Components used in terrain mapping phase.

Lumigrids is an LED projector that projects light in the shape of a grid, as presented in Figure 3.

Figure 3.

Light grids projected on ground by lumigrids projector.

The lumigrids projector is mounted on the quadcopter, facing the ground. These light grids can accurately extract the terrain information of the pedestrian path, as the regular arrangement of the light grid gets distorted by the terrain. Reference [24] shows how lumigrids can help cyclists understand the terrain ahead at night and keep them safe. A camera is placed facing the ground where the lumigrids are projected; it constantly captures images of the patterns formed by the grid and sends them for image processing. A GPS sensor is used to obtain the GPS location of the quadcopter drone. The Raspberry Pi serves as the central computing unit for all the attached sensors. It processes the captured images of the light grids formed on the ground and obtains the required terrain information.

An interesting complementary approach obtains terrain-related information from the accelerometer data of the smartphones of sighted pedestrians who use these pedestrian paths. The accelerometer of their mobile devices detects the vibration along the X, Y and Z axes. The magnitude m of the acceleration is calculated as m = √(X² + Y² + Z²). This is used to predict the terrain information of the pedestrian paths.
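
A minimal sketch of that computation over a stream of accelerometer samples is shown below; the sample values are synthetic, and a real application would read them from the phone's accelerometer API:

```python
import math

def vibration_magnitude(x: float, y: float, z: float) -> float:
    """Magnitude m = sqrt(x^2 + y^2 + z^2) of one accelerometer sample."""
    return math.sqrt(x * x + y * y + z * z)

# Synthetic samples in m/s^2; larger spikes hint at rougher terrain underfoot.
samples = [(0.1, 0.2, 9.8), (0.4, 0.1, 10.3), (2.1, 1.8, 12.9), (0.2, 0.1, 9.9)]
magnitudes = [vibration_magnitude(*s) for s in samples]
roughness = max(magnitudes) - min(magnitudes)   # crude roughness indicator
print(magnitudes, roughness)
```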

5.3 Capturing of the terrain topography in two phases

The terrain mapping system consists of a lumigrids projector and a GPS sensor mounted on a quadcopter which flies along the pedestrian path at height "h" above the ground, as depicted in Figure 4. The captured data are associated with their exact GPS location, which allows comparison between images taken from the same location. As mentioned, the process is divided into two phases. In the first phase, the terrain image and data are captured and stored in the cloud storage. To ensure accurate terrain data, in the second phase we recapture the same image from the same location while the pedestrian walks.

Figure 4.

The basic set-up of capturing the terrain image and its data in phase 1.

Figure 5 describes the process of obtaining the terrain topography using lumigrids projection. The first picture on the top-left is the image of the sidewalk we refer to in this section. The picture on the top-right presents the projection of the lumigrids projector on the sidewalk. A completely flat terrain will produce a perfect grid image. However, due to some bumps in the sidewalk, as presented in the bottom-right image, some of the projected squares are distorted, indicating the bump locations. The resulting grid is sent to the cloud application for analysis and storage.

Figure 5.

The terrain mapping phase and its transmission to the cloud storage.

Figure 6 depicts the data collection and processing from the impaired person's guidance perspective. It assumes the use of a smartphone application, which continuously transmits the person's current location and orientation to the cloud and obtains the data about the terrain of the path ahead. The top-left image presents the shirt with a mounted unit, which the pedestrian wears. The unit consists of a lumigrids projector, a camera and a communication unit. The projector flashes on the ground. The camera captures the grid image formed on the ground and continuously transmits it to the smartphone application, which then transmits it to the cloud application. The application compares the received image to the already stored image and generates the most accurate image representing the terrain situation at that moment. Accordingly, the application generates the proper instruction set and sends it back to the smartphone, which guides the pedestrian accordingly. In parallel, the discrepancy between the data stored in the cloud and the data received from the pedestrian is analyzed, and if the cloud data need to be updated, the cloud application does so.

Figure 6.

The process of the pedestrian guidance phase.

The steps of the terrain mapping phase:

  1. The entire pedestrian path is divided into squares of equal area, called sub-squares: let "k" be the area of each sub-square with side "x"; the sub-squares are named (1, 1), (1, 2) and so on.

  2. The height “h” is adjusted to generate the lumigrids of area “k” just enough to cover each sub square.

  3. The midpoint M of the sub-square is calculated as M = (lat1 + x/2, long1 + x/2).

  4. The quadcopter, flying at height "h" above the ground, flies to the calculated M, from where it flashes the lumigrids of area "k", equal to the area of the sub-square, on the ground. The lumigrids projector creates a light grid of dimension n × n on the ground below.

  5. The image formed on the ground shows an undistorted lumigrid of area "k" when the surface is ideally flat and perpendicular to the quadcopter flying at height "h" above the ground.

  6. This image is captured by the mounted camera, and thresholding of the input image separates the lumigrids image data from the rest of the image, as explained in step 10. Camera coordinates can be mapped to real-world coordinates by the transformation [Xc, Yc, Zc]ᵀ = Tcm · [Xa, Ya, Za]ᵀ, where (Xc, Yc, Zc) are the coordinates of the object in the camera frame, (Xa, Ya, Za) are the coordinates of the same object in the real world and Tcm is the transformation matrix, which can be calibrated for a given camera.

  7. The dimensions and inclinations of each line segment of the n × n segmented sub-square are the parameters used to represent an ideally flat terrain: length = breadth of each side = x/n; inclination of each side = 90°.

  8. Shortening of the length (to less than x/n) of any line segment (even if skewed) of the formed lumigrids square mesh indicates that the terrain beneath the formed lumigrids is not flat. It is either concave or convex in nature along the Z axis.

  9. If the angle between the line segments (or between the tangents of the line segments at the point of intersection, if they are skewed) is not a right angle, there is an inclination in the XY plane of the terrain beneath the formed lumigrids, towards either the first or the fourth quadrant. Let a′ be the inclination of the line segments of the lumigrids; the corresponding inclination a on the ground is given by a = ±d1·a′, where d1 is the ratio of the inclination on the ground to the corresponding inclination caused by the lumigrids. A plus sign indicates that the inclination is towards the first quadrant and a minus sign that it is towards the fourth quadrant.

  10. After applying the image thresholding algorithm to the obtained image, the lumigrids are clearly visible, as in Figure 7.

    In the above image, the required lengths between the skewed line segments are calculated.

  11. Let a line segment of the generated lumigrids, of ideal expected length x/n, be shortened by y% due to skewed terrain. Let d2 be the ratio of the absolute vertical height on the ground indicated by the corresponding lumigrids to the length of the corresponding line segment generated by the lumigrids. Then the absolute height h with reference to an ideally flat ground surface is given by h = ±d2 · (x/n) · (100 − y)/100. Axiom 5 decides whether h is positive or negative: h is positive for concave terrain and negative for convex terrain. If y = 100%, theoretically there could be a narrow pit or hill in the ground, as indicated by the non-visibility of the lumigrids.

  12. To identify exactly whether the terrain at a given position is concave or convex in nature, we observe the inter-line-segment distance i of the terrain: if i = x/n, the surface is flat; if i > x/n, the surface is concave; if i < x/n, the surface is convex (a sketch of this computation is given after Figure 7).

  13. After calculating the terrain information of the given sub-square, the process is repeated for all the sub-squares, so that the entire pedestrian path is scanned for its terrain details and mapped. The data thus obtained are pushed to the cloud.

Figure 7.

Lumigrids formed over a pit.
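
The following sketch ties steps 8-12 together; the calibration ratio d2, the grid dimensions and the sample measurements are hypothetical, and the height formula follows the reconstruction in step 11:

```python
def estimate_height(x: float, n: int, measured_len: float, d2: float,
                    concave: bool) -> float:
    """Height relative to flat ground from the shortening of one grid segment.

    The ideal segment length is x/n; the shortening percentage y follows from
    the measured length, and h = +/- d2 * (x/n) * (100 - y) / 100 (sign:
    concave positive, convex negative), as in step 11.
    """
    ideal = x / n
    y = 100.0 * (ideal - measured_len) / ideal          # shortening in percent
    h = d2 * ideal * (100.0 - y) / 100.0
    return h if concave else -h

def classify_surface(i: float, x: float, n: int, tol: float = 1e-3) -> str:
    """Step 12: compare the inter-segment distance i with the ideal x/n."""
    ideal = x / n
    if abs(i - ideal) <= tol:
        return "flat"
    return "concave" if i > ideal else "convex"

# Hypothetical sub-square of side x = 2 m with an n = 10 grid and d2 = 0.5.
surface = classify_surface(i=0.23, x=2.0, n=10)
height = estimate_height(x=2.0, n=10, measured_len=0.17, d2=0.5,
                         concave=(surface == "concave"))
print(surface, round(height, 3))
```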

The cloud now has precise information of the terrain. The pedestrian guidance phase consists of the following steps:

  1. When the pedestrian wishes to navigate, the pedestrian's smartphone requests a route from source to destination. A GIS map is consulted to obtain the various routes from the source to the destination. The cloud data contain precise information about the terrain of each of the pedestrian paths along all these routes. An optimum route is selected based on the terrain variations along the route, the pedestrian traffic density, the availability of help in case of danger or need, and various other parameters that govern the safety of the pedestrian.

  2. The smartphone guides the pedestrian along this route in the pedestrian path. The pedestrian is alerted to all major terrain variations in the path.

  3. The shirt mounted unit on the pedestrian flashes the lumigrids on the path ahead and the camera embedded on the unit captures the image of the lumigrids formed and transmits this image to the smart phone of the pedestrian (Figure 8).

  4. The terrain information obtained from the lumigrids is cross-checked in real time with the terrain information available in the cloud to recognize and handle temporary terrain changes, like a dog sitting on the pedestrian's path or a random stone in the way, or sudden permanent terrain changes like a road block (a minimal sketch of this check is given after Figure 8).

  5. If considerable discrepancies are found in the terrain, the person is alerted and offered a possible alternative, such as "Stop and move 3 feet to your right", and a match against a known pattern in the cloud is checked for. If a match is found, the pedestrian is guided along that path.

  6. If some permanent blocks are identified by the shirt-mounted device, the cloud is notified so that it can flag the terrain data of that pedestrian path as obsolete and schedule a re-mapping of the terrain. An alternate route is found for the pedestrian and the pedestrian is guided accordingly.

Figure 8.

Lumigrids formed by the shirt mounted unit of a pedestrian in the guidance phase.
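
A minimal sketch of that real-time cross-check is given below; the grid values, tolerance and alert text are illustrative assumptions rather than the chapter's exact protocol:

```python
def compare_with_reference(live: list, reference: list, tol: float = 0.05) -> list:
    """Return grid cells where the live lumigrid reading deviates from the cloud."""
    discrepancies = []
    for row, (live_row, ref_row) in enumerate(zip(live, reference)):
        for col, (lv, rv) in enumerate(zip(live_row, ref_row)):
            if abs(lv - rv) > tol:
                discrepancies.append((row, col, lv, rv))
    return discrepancies

def guidance(discrepancies: list) -> str:
    """Turn discrepancies into a simple spoken instruction for the pedestrian."""
    if not discrepancies:
        return "Path clear, continue straight."
    return "Stop and move 3 feet to your right."   # then re-check for a match

reference_grid = [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]     # heights from the cloud
live_grid = [[0.0, 0.4, 0.0], [0.0, 0.0, 0.0]]          # obstacle detected live
print(guidance(compare_with_reference(live_grid, reference_grid)))
```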

The re-mapping phase, based on a comparative walk-through and the terrain database, re-maps a pedestrian path when the current data are flagged as obsolete by the pedestrian guidance phase, as part of a scheduled re-mapping process or on an as-needed basis.

The data on the cloud contains the terrain information of the pedestrian path capable of generating a terrain grid along with its GPS coordinates.

The visualization of the data represented as a terrain grid available on the cloud for a pedestrian path looks like Figure 9.

  1. Sample data from the cloud are as follows:

    GPS Ver. h a Dirty bit
    (20, 30) 1 +20 −3 0
    (20, 31) 1 +20 −7 0
    (20, 32) 1 +20 −10 0
    (20, 33) 1 +20 −10 0
    (21, 30) 1 +2 0 0
    (21, 31) 1 +2 +1 0
    (21, 32) 1 +3 0 0
    (21, 33) 1 0 −2 0
    (20, 30) 1 −8 0 0
    (20, 31) 1 −8 0 0
    (20, 32) 1 +2 0 0
    (20, 33) 1 +2 0 0

    GPS, the coordinates of the GPS location; Ver., the version number of the data; h, the height of the terrain; a, the inclination of the terrain; dirty bit, specifies if the data is obsolete

  2. When the pedestrian wants to navigate, he first initiates a session with the cloud server, which is a one-time activity for every navigation session.

  3. The smartphone application now starts streaming the terrain data from the cloud shown above, which is the reference data of the pedestrian path.

  4. The system guides the person along the route and alerts on any terrain-related danger. For instance, when the pedestrian is at (20, 33), the interface alerts the pedestrian that there is a pit right in front of him ((20, 30), (20, 31), as indicated by a high negative value), identifies nearby terrain that is tolerable to walk on and guides the pedestrian accordingly.

  5. The lumigrid unit on the shirt scans the terrain ahead of the person and checks whether there is an acceptable match with the reference data on the cloud. If there is any discrepancy between the data obtained by the shirt and the cloud, the person is requested to take some alternative, such as a slight lateral movement, and a match is checked again. If the person cannot get any help or no match is found, the server looks for alternative routes and guides the person. For instance, let the person be at (21, 30). According to the cloud data, there should be a high wall in front of him, but the shirt-mounted unit scans and finds that there is no wall now and the terrain is suitable to walk on. It flags all these data in the cloud as dirty by setting the dirty bit as follows:

Figure 9.

Visualization of the terrain grid of a pedestrian path formed by the data.

GPS Ver. h a Dirty bit
(20, 30) 1 +20 −3 1
(20, 31) 1 +20 −7 1
(20, 32) 1 +20 −10 1
(20, 33) 1 +20 −10 1

Accordingly, the cloud decides whether it needs to schedule a re-mapping phase for that terrain or to accept the information shared by the pedestrian's shirt (a minimal sketch of this record structure and dirty-bit update follows the re-mapped data below).

After the re-map, the data in the cloud are as follows:

GPS Ver. h a Dirty bit
(20, 30) 2 +0 0 0
(20, 31) 2 +2 0 0
(20, 32) 2 +0 0 0
(20, 33) 2 +2 0 0
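
A minimal sketch of such a record store, with versioning and dirty-bit flagging, is shown below; the field names follow the tables above, and an in-memory dictionary stands in for the cloud:

```python
from dataclasses import dataclass

@dataclass
class TerrainRecord:
    version: int          # Ver.
    height: float         # h, the height of the terrain
    incline: float        # a, the inclination of the terrain
    dirty: bool = False   # dirty bit: data flagged as obsolete

cloud = {(20, 30): TerrainRecord(1, +20, -3),
         (20, 31): TerrainRecord(1, +20, -7)}

def flag_dirty(gps: tuple) -> None:
    """Mark a cell obsolete when the shirt-mounted unit disagrees with the cloud."""
    cloud[gps].dirty = True

def remap(gps: tuple, height: float, incline: float) -> None:
    """Store re-mapped data with a new version number and clear the dirty bit."""
    old = cloud[gps]
    cloud[gps] = TerrainRecord(old.version + 1, height, incline, dirty=False)

flag_dirty((20, 30))
remap((20, 30), height=0.0, incline=0.0)
print(cloud[(20, 30)])
```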

5.4 Summary

This section proposed a conceptual framework that fills major gaps in the design of technological navigation aids. It explained the software architecture, the hardware and wearable-device requirements and the theoretical models necessary for building an infrastructure that seamlessly gathers terrain-related information about the pedestrian path and uses it to guide pedestrians to navigate safely.

6. Conclusions

In this chapter, we outlined various aspects of wearable technology and its implementation in a wide range of applications, starting with healthcare, continuing with other domains and concluding with the integration of wearables into navigation and safety systems. Wearable technology is still in its development and growth stage. We expect wearables to continue their fast growth and to be implemented in many more domains, making our lives more convenient, safe and automated.

References

  1. Kolodzey L, Grantcharov PD, Rivas H, Schijven MP, Grantcharov TP, Wearable Technology in Healthcare Society. Wearable technology in the operating room: A systematic review. BMJ Innovations. 2017;3(1):55
  2. Park S, Chung K, Jayaraman S. Wearables: Fundamentals, advancements, and a roadmap for the future. In: Wearable Sensors. Chapter 1.1. Academic Press, an imprint of Elsevier; 2014. pp. 1-23
  3. Liu Y, Wang H, Zhao W, Zhang M, Qin H, Xie Y. Flexible, stretchable sensors for wearable health monitoring: Sensing mechanisms, materials, fabrication strategies and features. Sensors. 2018;18:645. DOI: 10.3390/s18020645
  4. Alam MM, Hamida EB. Surveying wearable human assistive technology for life and safety critical applications: Standards, challenges and opportunities. Sensors. 2014;14(5):9153-9209. DOI: 10.3390/s140509153
  5. Armagan E, Papkovsky DB, Toncelli C. Chapter 2: New polymer-based sensor materials and fabrication technologies for large-scale applications. In: Quenched-Phosphorescence Detection of Molecular Oxygen: Applications in Life Sciences. Royal Society of Chemistry (RSC) Publishing; 2018. pp. 19-49. ISBN: 978-1-78801-175-4
  6. Awolusi I, Marks E, Hallowell M. Wearable technology for personalized construction safety monitoring and trending: Review of applicable devices. Automation in Construction. 2018;85:96-106
  7. Rodgers MM, Pai VM, Conroy RS. Recent advances in wearable sensors for health monitoring. IEEE Sensors Journal. 2015;15(6):3119-3126
  8. Majumder S, Mondal T, Deen MJ. Wearable sensors for remote health monitoring. Sensors. 2017;17:130. DOI: 10.3390/s17010130
  9. Mitchell R, Ozminkowski R, Serxner S. Improving employee productivity through improved health. Journal of Occupational and Environmental Medicine. 2013;55(10):1142-1148. DOI: 10.1097/JOM.0b013e3182a50037
  10. Jin H, Jin Q, Jian J. Smart materials for wearable healthcare devices. In: Ortiz JH, editor. Wearable Technologies. Rijeka, Croatia: IntechOpen; 2018. DOI: 10.5772/intechopen.76604. Available from: https://www.intechopen.com/books/wearable-technologies/smart-materials-for-wearable-healthcare-devices
  11. Aliverti A. Wearable technology: Role in respiratory health and disease. Breathe. 2017;13(2):e27-e36
  12. Tambo E, Ngogang JY. Wearable nutrition and dietetics technology on health nutrition paradigm shift in low- and middle-income countries. International Journal of Nutrition and Metabolism. 2018;10(5):31-36. DOI: 10.5897/IJNAM2016.0207
  13. Javadi B, Calheiros RN, Matawie KM, Ginige A, Cook A. Smart Nutrition Monitoring System Using Heterogeneous Internet of Things Platform; 2017. Available from: staff.scem.uws.edu.au/∼bjavadi/papers/javadi_IDCS2017.pdf
  14. Murakami H, Kawakami R, Nakae S, Miyachi M. Accuracy of wearable devices for estimating total energy expenditure: Comparison with metabolic chamber and doubly labeled water method. JAMA Internal Medicine. 2016;176:702-703. DOI: 10.1001/jamainternmed.2016.0152
  15. Lee J, Kim D, Ryoo H-Y, Shin B-S. Sustainable wearables: Wearable technology for enhancing the quality of human life. Sustainability. 2016;8(5):466. DOI: 10.3390/su8050466
  16. Afzal A, Mousavi S-F, Khademi M. Estimation of leaf moisture content by measuring the capacitance. Journal of Agricultural Science and Technology. 2010;12:339-346
  17. Boukhechba M, Daros AR, Philip KF, Bethany C, Barnes TL. Demonic salmon: Monitoring mental health and social interactions of college students using smartphones. Smart Health. 2018;9-10:192-203
  18. Alam MM, Arbia DB, Hamida EB. Research trends in multi-standard device-to-device communication in wearable wireless networks. In: Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering (LNICST); 2015. p. 156
  19. Banovic N, Franz RL, Truong KN, Mankoff J, Dey AK. Uncovering information needs for independent spatial learning for users who are visually impaired. In: Proceedings of ASSETS'13; Article 24
  20. Sendero GPS for the Blind. Available from: www.senderogroup.com
  21. Trekker Breeze. Available from: www.humanware.com
  22. Lumigrids. Available from: http://www.gizmag.com/lumigrids-led-projector/27691/
  23. Kane SK, Jayant C, Wobbrock JO, Ladner RE. Freedom to roam: A study of mobile device adoption and accessibility for people with visual and motor disabilities. In: Assets'09; 2009. pp. 115-122
  24. Shoval S, Borenstein J, Koren Y. Mobile robot obstacle avoidance in a computerized travel aid for the blind. In: IEEE International Conference on Robotics and Automation; San Diego, CA; 1994
