Real-Time Capable Sensor Data Analysis-Framework for Intelligent Assistance Systems

Written By

Ulrich H.P. Fischer, Sabrina Hoppstock, Peter Kußmann and Isabell Steuding

Submitted: 13 February 2020 Reviewed: 25 August 2020 Published: 17 March 2021

DOI: 10.5772/intechopen.93735

From the Edited Volume

Data Acquisition - Recent Advances and Applications in Biomedical Engineering

Edited by Bartłomiej Płaczek


Abstract

In industrialized countries, the oldest segment of the population has been growing rapidly for many years. In the coming years in particular, the age cohort over 65 will increase significantly. This goes hand in hand with illnesses and other physical and cognitive limitations. To enable these people to remain in their own homes for as long as possible despite such restrictions, technologies are being used to create ambient assisted living applications. However, most of these systems are neither medically validated nor are their latencies short enough, for example, to prevent falls. A promising approach to overcoming these problems is the new 5G network technology. Combined with a suitable sensor data analysis framework, the fast care project showed that a real-time situation picture of the patient in the form of an Avatar can be generated. The sensor structure records the heart rate and the breathing rate, analyzes the gait, and measures the temperature, the VOC content of the room air, and its humidity. An emergency button has also been integrated. In a laboratory demonstrator, it was shown that the infrastructure realizes a real-time visualization of the sensor data over a heterogeneous network.

Keywords

  • ambient assisted living technologies
  • eHealth
  • eCare
  • tele-care
  • real-time networks
  • vital data acquisition
  • fast care project

1. Introduction

Assistance systems in ambient assisted living and in medical care have to recognize relevant situations that require fast assistive intervention. Earlier projects in this field, such as tecla [1, 2, 3] or PAUL [4], focused on the application of new AAL technologies in AAL test beds to obtain information about the acceptance level [5, 6] of the technologies and the different new applications for the patients. Additionally, business models [7, 8] were drafted to establish a successful AAL business area in the future.

The clinically established measurement technology for diagnostics, monitoring, and risk stratification does not translate directly to the outpatient setting (ambulatory or domestic environment). The key challenge is that many relevant situations only become noticeable when various sensor modalities are merged, such as when discriminating between pathological, emotional [9], or stress-induced increases of the heart rate [10]. This is only possible by combining multiple different sensors [11]. The same applies to the analysis of joint kinematics in everyday activities, which requires more inertial sensors with higher accuracy.

The next generation of radio networks (5G) [12] opens up the possibility of real-time communication in all areas of life with very low latency and high data rates; one speaks of a so-called tactile Internet. People interact with their surroundings through their senses, which involve several different reaction times. Here, muscular, audio-visual, and tactile response times are of particular importance: the typical muscular response time is around 1 second, that of hearing around 100 ms, while the visual response time is in the range of 10 ms [13].

In the case of active control of an object, such as a car or a machine, the information must be recorded while a reaction is carried out at the same time. The familiar use of a touch screen requires moving a finger in a controlled manner across the screen; the touch screen must therefore achieve a response time of less than 1 ms in order not to produce any noticeable delay in the visual impression. In the case of an active prosthesis, as applied in this study, the response time must be below 10 ms to provide a practical basis for use in daily life. Therefore, fast sensor data frameworks are needed to analyze the situation under real-time conditions and subsequently provide medically valid assistance [12, 14].

The aim of the fast care project was to develop a real-time sensor data analysis framework [9] for intelligent assistance systems in the areas of ambient assisted living (AAL), eHealth, mHealth, tele-rehabilitation, and tele-care. It provides a medically valid, integrated real-time situation picture based on a distributed, ad hoc networked, everyday-usable, and energy-efficient sensor infrastructure with a latency of a few milliseconds. The integrated situation picture, which includes physiological, cognitive, and kinematic information about the patient, is generated by the intelligent fusion of sensor data [15, 16]. It can serve as a basis both for the rapid detection of risks and dangerous situations and for everyday medical assistance systems that intervene autonomously in real time [17, 18] and allow active telemedical feedback [10].

In this chapter, after the introduction, the technical goals and implementation options of a fast sensor network with real-time data analysis are presented, followed by the structure of the overall system. In Section 2, the details of the technological concept, such as data fusion and telemetry, are presented, and all interfaces relevant for real-time applications are discussed in detail. The hardware, sensors/actuators, and the specific installation of the demonstrator in laboratory operation are then described, together with details of the individual sensor systems and the corresponding visualization of the sensor data by an Avatar. In Section 3, the acceptance tests for the sensor components of the demonstrator are analyzed and discussed. Finally, a summary with an outlook on upcoming developments is given.

2. Technical goals and solutions

2.1 System setup

The basis of a medically valid, integrated real-time picture of the situation is an ad hoc interconnected sensor infrastructure. Its latency must be low enough to fulfill the constraints of a haptically responsive network. Physiological, cognitive, and kinematic information about the patient is captured with the help of intelligent sensor data fusion, and these data are combined to provide an integrated picture of the patient's physical and mental situation. In this way, it is ensured that the framework can be used for applications in which feedback has to be embedded synchronously into the visual, auditory, tactile, or proprioceptive string of perception, such as in the support of motor function and kinematics for rehabilitation and for active prosthetics and orthotics.

Figure 1 shows an overview of the system concept of the project approach for an integrated sensor infrastructure in the home of an elderly person. It combines GPS data, air pressure and temperature data, vital parameters, cameras, optical sensors, and so-called inertial measurement units (IMUs).

Figure 1.

Integrated system concept.

These sensor data are merged in real time and buffered in a database system. From this database, an integrated real-time situation analysis is generated that touches on three areas of human life: first, the kinematic data such as localization, movement, and posture; second, the cognitive sub-area with awareness, emotionality, and mental clarity; and third, the physiological sub-area, in which cardiovascular, metabolic, and neurological data can be recorded and analyzed.

The entirety of the data collected in the resident's home can be evaluated integratively and accordingly provides a precise analysis of their health. In this project, apart from the emotional and neurological aspects, all of the addressed areas were recorded and evaluated. Based on the situation analysis, actuators are implemented for rehabilitation, in the special case of an active foot prosthesis that can adjust to different heel heights, adapt automatically to different floor conditions, and support rapid walking. Furthermore, the client is provided with a real-time display of his vital parameters by a so-called Smart Home Assistant, which can give helpful health support.

For a real-time application, it is necessary that the latency between sensor detection and actuator actuation is no more than a few milliseconds. This ensures a so-called haptic functionality of the system and can be achieved with the help of new radio technologies and fast network technologies such as FTTH and the fifth generation of mobile radio networks (5G). To ensure the privacy of the data, all data is stored and evaluated on a so-called home server situated in the client's apartment. Further intervention options are possible via a secure cloud connection to medical services or to the system administrators for updates of the sensor and actuator components.

The challenge of distributed, real-time medical sensor technology and signal processing is addressed by means of sensor-based data processing and sensor hubs, optical sensors, hardware system optimization, the development of distributed systems, and networked sensor interfaces. The focus of the project was on the intelligent fusion of sensor and actuator data as well as their evaluation and delivery in real time. To meet this objective, the following developments took place in the Ambient Assisted Living (AAL) Lab of the Harz University of Applied Sciences in Wernigerode (Figure 2):

  • Analysis of requirements

  • Data acquisition

  • Data analysis

  • Data fusion

  • Acceptance analysis

  • Situation detection and assistance in real-time

Figure 2.

Application of the fast care real-time sensor system.

The objective of distributed, real-time medical sensor technology and signal processing is to obtain an evaluation of the patient's situation from the available data in real time. The main application focus is in the area of orthopedic devices. For example, the optimization of a leg prosthesis' damping members and active foot positioning points shall be executed online. Currently, these adjustments are performed offline and by hand by orthopedic technicians, with variable quality. This often leads to suboptimally adapted orthopedic devices, whose functionality and efficacy are correspondingly limited, and therefore to an unsatisfactory rehabilitation outcome. This approach of integrating the sensors into an active foot prosthesis is called the real-time active prosthetics/orthotics controller. Another project section covers the online estimation of the cognitive condition, motion analysis for rehabilitation, and cardiopulmonary performance.

2.2 Technological concept

Based on the project goals, the technical and content requirements of the technological topics to be worked on were specified, categorized, and summarized by the individual partners. The basic requirements fall into the following areas:

  1. Hardware/sensors,

  2. Network,

  3. Data analysis,

  4. Actuators/intervention/feedback

The system diagram of the research approach of the fast care framework is shown in Figure 3. The fast care framework is the technical basis for the realization of the fast care project and implements the fusion of heterogeneous sensors via heterogeneous networks. Its basic idea is to derive a condition from the past and current states of the sensory data using several newly developed sensor applications, including the following areas and interfaces (see Figure 3). From the network-topological representation, a breakdown of the network interfaces used was made, as specified by the project partners. Based on this, a suitable communication protocol was selected for the individual implementations: communication via MQTT forms the basis of the exchange between the sensor applications and the real-time controller depicted in Figure 3. On the left side of the figure, the sensor applications are situated: a Kinect system for motion data, inertial measurement units (IMUs) for detecting movements of body and objects in a fixed sequence for the analysis of a workout in a kitchen, motion sensors/actuators in an active intelligent prosthesis, a camera-based heart rate and breathing sensor, and finally a special sensor for volatile organic compounds in the room air. Prosthesis, body, and object sensors are connected via Bluetooth Low Energy to a smartphone, which transfers the data to the real-time controller.
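As an illustration of this communication path, the following minimal sketch shows how a sensor application might publish a single JSON reading to the MQTT broker on the real-time controller. It uses the Python paho-mqtt client (1.x API); the broker address, topic name, and payload fields are illustrative assumptions, since the actual topics and data formats are specified in the final design plan [21].

```python
import json
import time

import paho.mqtt.client as mqtt  # Python MQTT client (paho-mqtt 1.x API assumed)

BROKER_HOST = "realtime-controller.local"  # hypothetical address of the real-time controller
TOPIC = "fastcare/vital/heartrate"         # hypothetical topic (message channel)

client = mqtt.Client()
client.connect(BROKER_HOST, 1883, keepalive=60)
client.loop_start()

# One heart-rate reading as a JSON object, as the sensor applications
# send it toward the real-time controller.
payload = json.dumps({
    "sensor": "camera_vital",  # hypothetical sensor identifier
    "heart_rate_bpm": 72,
    "timestamp": time.time(),
})
client.publish(TOPIC, payload, qos=1)

client.loop_stop()
client.disconnect()
```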

Figure 3.

Network topology.

In total, seven sensor components are listed there on the left: the active prosthesis, the heart rate measurement, the respiratory rate measurement, the detection of VOC components in the breathing air, the detection of movement in the room, the measurement of room temperature and humidity, and the emergency button. Each uses the corresponding network structure according to the blocks shown in the sketch.

After the individual implementations of the interfaces, a suitable software communication server was selected. The MQTT protocol [19] was implemented using a real-time capable Linux variant. Suitable hardware was procured by the project partner Harz University of Applied Sciences, a suitable operating system was installed, and the MQTT broker "mosquitto" [20] was installed and configured. The definition of topics (message channels) and the specification of the data formats were necessary for smooth communication of the individual partner realizations in themselves and with each other. A detailed description of the communication formats between the sensors built by the partners and the MQTT server can be found in the final design plan of the fast care project [21].
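To give a feel for what such topic and data format definitions can look like, the sketch below pairs hypothetical topics with example JSON payloads; the project's actual definitions are documented in the final design plan [21].

```python
# Hypothetical topic scheme and example payloads; the actual topics and
# data formats of fast care are specified in the final design plan [21].
EXAMPLE_TOPICS = {
    "fastcare/vital/heartrate":   {"heart_rate_bpm": 72},
    "fastcare/vital/respiration": {"breath_rate_bpm": 15},
    "fastcare/gait/knee_angle":   {"angle_deg": 23.4},
    "fastcare/air/voc":           {"co2_ppm": 650.0},
    "fastcare/room/climate":      {"temperature_c": 21.5, "humidity_pct": 45.0},
    "fastcare/emergency/button":  {"pressed": True},
}
```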

At the beginning of the project, the communication protocols to be used between the individual project partners for data exchange were discussed and clearly defined (see Table 1). The network interfaces used in the project are essentially Bluetooth LE, Wi-Fi, and wired transmission via Ethernet 802.3; furthermore, wireless transmission via LTE/4G+ was used by several partners. This resulted in a very broad range of transmission scenarios. An overview of the transmission technology from the sensor infrastructure to the real-time controller and of the forwarding to the real-time visualization is depicted in Figure 4.

Table 1.

Overview of network interface parts used in fast care.

Figure 4.

Network infrastructure [22].

After the data has been transferred to the real-time controller, it is available in the form of JSON objects stored on the server's Linux system. At the same time, an integrative situation analysis of the sensor data is carried out, and the corresponding information is transferred via the public network to a cloud server for real-time visualization, which generates a website with the evaluated real-time data in the form of an Avatar.
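A minimal sketch of this data path on the real-time controller, assuming the hypothetical topic scheme above: incoming messages are parsed as JSON, the latest object per topic is buffered, and a condensed record is forwarded to the cloud visualization (the forwarding call is a placeholder, not the project's actual interface).

```python
import json

import paho.mqtt.client as mqtt  # paho-mqtt 1.x API assumed

latest = {}  # latest JSON object per topic, as buffered on the Linux server

def forward_to_visualization(state):
    """Placeholder: push the fused situation record to the cloud server
    that renders the Avatar web page (e.g., via HTTPS; not shown here)."""
    pass

def on_message(client, userdata, msg):
    latest[msg.topic] = json.loads(msg.payload)  # store the incoming JSON object
    forward_to_visualization(latest)

client = mqtt.Client()
client.on_message = on_message
client.connect("localhost", 1883)      # broker runs on the controller itself
client.subscribe("fastcare/#", qos=1)  # hypothetical wildcard over all sensor topics
client.loop_forever()
```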

2.3 Hardware, sensors, actors

In this part, all of the hardware components developed in the project are described. On the one hand, this includes sensors whose task is to capture a physical measured variable such as motion, VOC gas, heart rate, etc. Furthermore, sensor modules with combined sensors have been developed that form a functional unit with actuators, e.g., the electronically controllable lower leg prosthesis. For a better overview of the components used by the individual partners, a matrix of all partners and their network interfaces was created (see Table 2).

[Table 2 is a usage matrix: rows list the cooperation partners (HSH, TUD, OvGU, URO, EX, BST, OBO, HO); columns list the components Kinect, IMUs (body), IMUs (object), prosthesis, camera, VOC sensor, smartphone, real-time controller, cloud, and terminal; a "+" marks each component a partner used.]

Table 2.

Types of hardware components used by the cooperation partners.

In the following subsections, all of the hardware and all sensors/actuators used are collected and described.

2.3.1 AAL lab installation

Rapid and intelligent sensors and actuators, improved motion pattern recognition, and intelligent algorithms for real-time network integration in three demonstrators of the AAL Lab serve as solution approaches. Within the fast care project, a real-time network integration with demonstrators was carried out at the AAL Lab of the Harz University. The various partial results of the project partners were collected and integrated in the AAL Lab, with a focus on user friendliness and interaction with the user by means of a show flat. Figure 5 illustrates the realized structure of the AAL Lab with various elements for monitoring and evaluating the measured vital data. The lab includes the following parts: sensors on the walls (pulse, blood pressure, breathing frequency, motion/position, VOC breath analysis), the e-rehabilitation workout, and the real-time controller PC.

Figure 5.

AAL Lab of the Harz University; sketch of installations: (a) sensors on the walls (pulse, blood pressure, breathing frequency, skin resistance, motion/position, VOC breath analysis), (b) e-rehabilitation, (c) real-time controller.

Figure 6 shows the laboratory, including a sofa, several armchairs, a bed, and all the sensor components attached in the room, as shown in Figure 5. The room was deliberately furnished like a traditional living room to create a pleasant atmosphere for the examinations. After the technology was installed, the acceptance tests were carried out in this environment.

Figure 6.

Photograph of AAL lab.

2.3.2 E-rehabilitation system

The Kinect sensor used by the Otto von Guericke University in fast care is a physical device with depth sensing technology, an integrated color camera, an infrared transmitter, and a microphone array that detects the position and movement of people as well as voices. Table 3 shows the data of the KINECT depth sensor, while Figure 7 shows the workout scene. The application provides a therapeutic workout with the patient and gives real-time information and helpful feedback to guide the patient's movements. Additionally, a gait analysis [23, 24] can be performed using IMUs positioned at the feet, as shown in Figure 7. More detailed information can be found in Stoutz et al. [25] (Table 3).
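The joint angles that such a system feeds back can be derived from three tracked joint positions with elementary vector geometry. The following sketch shows this generic computation for a knee angle from hip, knee, and ankle coordinates (e.g., from Kinect body tracking); it illustrates the principle only, not the actual pipeline of [23, 24, 25].

```python
import numpy as np

def knee_angle_deg(hip, knee, ankle):
    """Angle at the knee between the thigh and shank segments, in degrees."""
    thigh = np.asarray(hip) - np.asarray(knee)    # vector knee -> hip
    shank = np.asarray(ankle) - np.asarray(knee)  # vector knee -> ankle
    cosang = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Example with joint positions in meters (camera coordinates);
# a nearly straight leg yields an angle close to 180 degrees.
print(knee_angle_deg([0.0, 1.0, 2.0], [0.0, 0.5, 2.0], [0.0, 0.0, 2.05]))  # ~174
```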

Figure 7.

Setup of the gait measurements for e-rehabilitation at Otto von Guericke University; above left: IMU application at the feet; above right: therapeutic movements with Avatar; lower middle: presentation of the gait analysis measurement.

| Feature | Description |
|---|---|
| Depth sensor: 512 × 424, 30 Hz; FOV: 70° × 60°; one mode: 0.5–4.5 m | Optimized 3D visualization, detection of smaller objects in particular, and stable body tracking |
| 1080p color camera: 30 Hz (15 Hz in poor lighting conditions) | Camera with 1080p resolution |
| New active infrared functions: 512 × 424, 30 Hz | IR functions for lighting-independent observations |
| Multi-array microphone | Four microphones, e.g., to find the sound source and the direction of the audio wave |
| Interfaces | KinectAUX (USB), Kinect2AUX (USB) |

Table 3.

Data of the KINECT sensor system used for e-rehabilitation.

2.3.3 Inertial measurement unit (IMU)

The IMU used by the project partners "Otto Bock HealthCare GmbH", "Otto von Guericke University", and "University of Rostock" is an inertial measurement unit. It is a self-contained measuring system that continuously records, analyzes, and, if necessary, pre-processes defined physical parameters (e.g., movement, acceleration, pressure) and forwards them to downstream communication and network protocols (see Figure 8). A distinction is made between two application modes. On the one hand, an IMU can be installed on an object, e.g., a kitchen appliance [26], which describes the "IMU on object" use case and provides measurement data for further analysis. The other area of application is wearing an IMU in suitable holders on a person's body, the "inertial sensor on body" use case, which also provides measurement data for further analysis [27, 28]. The project partner "Bosch Sensortec GmbH" [29, 30] developed and produced the IMUs used in the fast care project [31].
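The "record, analyze, pre-process, forward" chain of such an IMU node can be pictured as in the following sketch, where raw acceleration samples are smoothed with a moving-average window before being handed to the communication layer. The window size and the forwarding hook are assumptions for illustration, not the partners' actual signal processing.

```python
from collections import deque

WINDOW = 8                       # assumed smoothing window (samples)
buffer = deque(maxlen=WINDOW)    # ring buffer of recent (ax, ay, az) samples

def on_imu_sample(accel_xyz):
    """Called for each raw acceleration sample from the IMU."""
    buffer.append(accel_xyz)
    if len(buffer) == WINDOW:
        # Per-axis moving average over the window.
        mean = [sum(axis) / WINDOW for axis in zip(*buffer)]
        forward(mean)  # hypothetical handoff to the BLE/network protocol layer

def forward(sample):
    pass  # e.g., enqueue for Bluetooth LE transmission (not shown)
```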

Figure 8.

Structure of the inertial measurement unit network.

2.3.4 Camera-based vital parameter sensor

The camera-based vital sensors [32, 33] used by the project partner "Technische Universität Dresden" [34, 35, 36] are based on one or more camera systems with an associated, spectrally controllable lighting system and generate a spatial image of the surroundings as a database for further evaluations. Camera-based photoplethysmography (cbPPG) remotely detects the volume pulse of cardiac ejection in the peripheral circulation. The system measures the heart rate and the breathing rate contactlessly with a camera system in real time. More detailed information is given in the work of the Institute of Biomedical Engineering of the Technische Universität Dresden by Zaunseder et al. [37, 38]. The camera-based system records changes in the appearance of the facial surface with fast data acquisition (see Figure 9).
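A minimal sketch of the cbPPG principle: the mean intensity of a skin region (typically the green channel) varies with the cardiac volume pulse, and the dominant frequency of that signal within a plausible pulse band yields the heart rate. This is the textbook approach under simplified assumptions, not the TU Dresden implementation described in [37, 38].

```python
import numpy as np

def heart_rate_bpm(frames, fps):
    """frames: array (n_frames, height, width, 3), an RGB ROI on the face."""
    signal = frames[..., 1].mean(axis=(1, 2))  # mean green channel per frame
    signal = signal - signal.mean()            # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.5)     # plausible pulse band: 42-210 bpm
    return 60.0 * freqs[band][np.argmax(spectrum[band])]
```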

Figure 9.

Camera-based vital sensors. 1: measurement unit; 2: camera and lighting system 1; 3: central display of the real-time measurement; 4: measurement system 1 in application; 5: measurement system 2 in application; 6: camera and lighting system 2.

Exposure with an LED light source in a special spectral range is necessary to obtain particularly good contrast. The raw image data are sent directly to a controller and evaluated there. The evaluated data (heart rate, respiratory rate) are transferred as a JSON object to the real-time controller via 1 Gb/s Ethernet cabling and stored there in the MQTT server. The respiratory rate and the heart rate are then visualized in real time in the Avatar (see Section 2.4, Sensor Data Visualization).

2.3.5 VOC air sensor

As part of the BMBF-funded fast care project, HarzOptics GmbH [39] has developed components for a distributed sensor network for the spectroscopic analysis of air. The sensor system analyzes the room air by measuring the optical spectral absorption of volatile organic compounds (VOC) [39, 40, 41, 42]. Specific absorptions of VOC gases are analyzed that indicate the onset of certain clinical pictures. In addition to assessing the quality of indoor air for AAL applications, the system is also to be used for the detection of VOCs in breathing gas. Since the presence of certain VOCs in exhaled air allows conclusions to be drawn about diseases such as lung cancer or metabolic disorders, the integration of non-invasive permanent gas analysis into real-time medical care is becoming possible, also in view of increasing bandwidths and decreasing latency times [39].

The air sensor is part of a more complex system whose basic mode of operation is shown in Figure 10. Data recorded by a sensor (e.g., CO2 concentration) are transferred as (voltage) values to an Arduino board, which converts them into volume concentrations, formats the resulting data into an MQTT-compliant message, and transmits it to the real-time server. The data is displayed in a special real-time Avatar sketch, which is presented in Section 2.4 in more detail. If limits are exceeded, a warning or recommendation is issued (e.g., "Please open window and ventilate" or "Please consult a doctor"). In addition to the data from this sensor, the MQTT server also receives data from the other sensors developed by the project partners; these are also visualized in the Avatar figure.
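The following sketch reproduces this conversion chain in Python (the actual implementation runs on the Arduino board): an ADC reading is converted to a voltage, then to a volume concentration, wrapped in an MQTT-compliant JSON message, and a recommendation is attached when a limit is exceeded. The calibration constants and the threshold are illustrative assumptions, not the calibrated values of the fast care sensor.

```python
import json

V_REF = 5.0           # ADC reference voltage (assumed)
ADC_MAX = 1023        # 10-bit Arduino ADC
PPM_PER_VOLT = 400.0  # hypothetical linear calibration factor

def reading_to_message(adc_value):
    voltage = adc_value * V_REF / ADC_MAX        # raw reading -> sensor voltage
    co2_ppm = voltage * PPM_PER_VOLT             # voltage -> volume concentration
    msg = {"sensor": "voc_air", "co2_ppm": round(co2_ppm, 1)}
    if co2_ppm > 1000.0:                         # assumed limit for stale indoor air
        msg["recommendation"] = "Please open window and ventilate"
    return json.dumps(msg)

print(reading_to_message(614))  # ~1200 ppm, exceeds the assumed limit
```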

Figure 10.

VOC sensor setup.

Since the spectrum could not be recorded with an optical spectrometer due to a lack of sensitivity, an alternative setup with laser sources was implemented. The wavelengths used correspond to the previously determined absorption lines of the relevant substances and are detected by a broadband optical sensor. If the target substances are present in the air, the light of the laser source is attenuated in accordance with the concentration, which reduces the voltage at the sensor output so that the volume concentration can be determined. The temperature sensitivity of the sensor and amplifier is still causing problems.
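The underlying relation is the Beer-Lambert law; assuming a single absorbing substance and a detector voltage proportional to the received intensity, the concentration follows as

\[ I = I_0 \, e^{-\varepsilon c L} \quad\Longrightarrow\quad c = \frac{1}{\varepsilon L} \, \ln\frac{I_0}{I} \]

where \(I_0\) is the intensity received through clean air, \(I\) the attenuated intensity, \(\varepsilon\) the absorption coefficient of the target substance at the laser wavelength, and \(L\) the optical path length.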

2.3.6 Active prosthesis

Under the catchphrase "active prosthesis", "Otto Bock HealthCare GmbH" combines its body-worn IMUs, an associated analysis and evaluation unit, and the control of an active prosthetic foot. The aim is to achieve automatic adjustment of an active prosthetic foot using a long-term gait analysis based on the foot, knee, and joint angles. The realization of the complete measurement system is described in more detail by Albrecht-Laatsch in [43]. The current status quo of prosthesis adaptation is that clients rarely come in to have their prostheses adapted for rehabilitation and check-ups; therefore, the prosthesis is usually only adapted for one type of gait. In addition, developers rarely speak to users, so few everyday problems flow into development.

The goal of developing the active prosthesis in the fast care project was to get a better picture of real prosthesis usage, as well as to make adaptation to the real needs of the user easier and faster. This was achieved with a remote connection to the active prosthetic foot, used for remote diagnosis and automatic adaptation to the conditions of use.

Implementation was achieved with the help of motion sensors (IMUs), whose measured values were used both locally and remotely. This eliminates the need for regular visits to the gait laboratory, and long-term recording takes place in a relaxed environment. In addition, incorrect movement patterns can be recognized and corrected early. The adaptation takes place automatically and can be initiated from a remote location. With the active prosthetic foot, the heel height and the active gait support could be adjusted automatically by the software. This reduces fatigue, as the motor supports the push-off of the leg; the support is regulated depending on the walking speed. For experts in the laboratory, the gait diagram is displayed remotely in real time, and further parameters of the prosthesis can be adjusted remotely by the experts in fine-tuning mode. The test of the automatic adaptation of the prosthesis was performed in the laboratory, as depicted in the working scene of Figure 11.

Figure 11.

Active prosthesis motion sensor with feedback for gait optimization.

2.3.7 Bluetooth beacons

The University of Rostock uses BLE beacons to locate its IMUs in the room [27, 28]. These beacons are distributed at fixed positions in the room and, via field strength measurements, allow the IMUs to make statements about the movements and accelerations of people in the space. The sensors provide information for a kitchen task assessment dataset. This dataset contains normal behavior as well as erroneous behavior due to dementia, recorded with wearable sensors as well as with sensors attached to objects. The scene of the kitchen task workout is depicted in Figure 12.
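Such field strength measurements are commonly turned into rough distance estimates with a log-distance path-loss model, as in the sketch below; the reference power and path-loss exponent are assumptions, and the actual localization method of the University of Rostock is described in [27, 28].

```python
RSSI_AT_1M = -59.0   # assumed received power at 1 m reference distance (dBm)
PATH_LOSS_EXP = 2.0  # assumed path-loss exponent (free-space-like indoor value)

def estimate_distance_m(rssi_dbm):
    """Log-distance path-loss model: distance from a beacon's RSSI."""
    return 10 ** ((RSSI_AT_1M - rssi_dbm) / (10.0 * PATH_LOSS_EXP))

print(estimate_distance_m(-59.0))  # 1.0 m at the reference power
print(estimate_distance_m(-79.0))  # 10.0 m with exponent 2.0
```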

Figure 12.

Motion analysis of a cooking process with IMUs and inference methods at the University of Rostock.

In this workout, a test client prepares a pudding meal that is clearly defined in a few simple steps. The process goes from compiling the ingredients, through the cooking itself, to completion and decanting the pudding into several cups. All sub-processes are analyzed in detail, and appropriate help is provided if the wrong ingredients or the wrong wooden spoon are used, while all objects in the environment in which the person is working are equipped with IMU sensors.

The kitchen task is annotated using a semantic annotation scheme. This scheme gives information about the observed motions and the errors made while performing the workout. The data is split into sensor and video data. The video data are collected by several cameras, while the sensor data comprise accelerations recorded in parallel to the video, both from the IMU sensors worn on the body and from the used objects. The complete data set consists of several normal and faulty runs; to obtain information about the faulty runs, the clients deliberately made errors in the workout. The data contains the actions as well as the object being manipulated and the client working with it. More information about the sensor application for analyzing the erroneous behavior can be found in Hein et al. [44].

2.3.8 Emergency button and temperature/humidity sensors

As an additional sensor system, the Exelonix company implemented an NB-IoT push-button sensor, which transmits its sensor data in JSON format to the real-time server via the existing public 4G+ radio network (see Figure 13). The emergency is displayed in real time on the visualization server; in a real deployment, it could then be transmitted to the 24/7 service of a nursing provider. A second sensor that also works via NB-IoT transmission is a motion-sensitive sensor. It was installed to register movements in the room and additionally transmits the room temperature and air pressure to the real-time server via the public radio network, likewise in JSON format. Further information on the exact key data of the sensors can be found in the publications by Stege et al. [45, 46, 47, 48].
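For illustration, the following sketch shows what such JSON messages might look like when they arrive at the real-time server; the field names are assumptions, and the actual key data of the Exelonix sensors are given in [45, 46, 47, 48].

```python
import json

# Hypothetical emergency-button event (field names are assumptions).
button_event = json.dumps({
    "device": "nbiot_button_01",
    "event": "emergency_pressed",
    "timestamp": "2020-06-15T10:23:41Z",
})

# Hypothetical motion/climate report of the second NB-IoT sensor.
climate_event = json.dumps({
    "device": "nbiot_climate_01",
    "motion": True,
    "temperature_c": 21.8,
    "pressure_hpa": 1013.2,
})
```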

Figure 13.

Sensor modules of Exelonix, left: IoT emergency button via 4G+; right: IoT temperature, air pressure and motion sensor via 4G+.

2.3.9 Real-time controller

Within the fast care project, the Harz University of Applied Sciences developed a real-time platform for the sensor data fusion of the partners' partial realizations. For this purpose, a Linux-based application server was configured based on the communication protocol (MQTT) selected for the project. This "real-time controller", on which all information converges, forms the central "sensor data fusion". The device is a rack-mounted server PC with an Intel i7 CPU and 16 GB of 1600 MHz DDR3 memory, depicted in Figure 14. The Linux version is "Red Hat Enterprise Linux Server release 7.7 (Maipo)". The network interfaces are two 1 Gb IEEE 802.3 ports and a "Realtek Semiconductor Co., Ltd. RTL8192EE PCIe Wireless Network Adapter". More detailed information can be found in the final design plan of the fast care project [21].

Figure 14.

Real-time controller with MQTT server.

2.4 Sensor data visualization

The project partners agreed on the technical implementation of the data fusion on the planned real-time server and on the development of a user interface. After data collection from all partners, these data are evaluated centrally on the real-time controller. The user receives feedback about the obtained information; this feedback is based on the visualization of the situation analysis. The main view of the real-time visualization is shown in Figure 15. With its end customer platform, Exelonix GmbH provides the technological basis for the visualization in the fast care project: all sensor data collected in the MQTT server of the Harz University of Applied Sciences are evaluated with this platform and visualized on a web page to which only the project partners had access. The "technical information and data packets" received on the real-time controller were transformed and prepared into a form that can be interpreted by those in need of care, relatives, and experts. Among other things, time courses and histories are added.

Figure 15.

Real-time visualization of the measured sensor data.

The visualization is shown in Figure 15. An Avatar appears on the left, in which both the heart rate and the breathing rate are shown visually as movements of the heart and chest. On the right side of the picture there is a heart icon with the heart rate and a lung icon with the respiratory rate. Furthermore, the data of the Exelonix sensors are shown: the emergency button status, the room temperature, and the room humidity. An indication of the condition of the indoor air is shown directly below these displays; in this case, the icon of a green cloud shows that the indoor air is in good condition.

Additional sensor data are depicted on the Avatar sketch. In the hip, knee, and ankle areas of the legs, information about the battery levels of the IMUs recording the posture and knee angle is shown. The knee angle measured on the leg with the prosthesis is shown online in the graphic on the right, where it is plotted in degrees over time while walking.

The patient's gait parameters, also recorded by the IMUs on the hips, knees, and ankles (see Section 2.3.3), can be seen online to the right of the two icons showing the gait width and the lifting height of the foot. This allows the gait to be assessed and improved in situ for rehabilitation purposes.

In addition to this main page of the real-time display, a sub-page was created for each partner application, in which the details of the individual sensor elements and their operation are condensed. The details of the partners' real-time visualizations can be found in the final design plan, published by Kußmann et al. [21].

3. User acceptance studies

In addition to the technical development activities, an acceptance analysis was carried out at the AAL Lab of the Harz University. As a result of the project, fast care aims to develop feasible products and create the medical fundamentals for interaction (feedback) in real time.

This analysis was carried out in addition to the workload of integrating all technical components and the planned example application described in Section 2.

For the acceptance analysis of the system, a small sample of 20 subjects from different age groups was interviewed. Figure 16 shows the distribution by gender and age. Although this study is not representative, it gives a first insight into the evaluation of the developed technology.

Figure 16.

Age and gender distribution of the testing persons.

During the survey, the subjects had to assess both the individual systems of the project partners and the overall system. The survey results for the entire system were very positive: 60% of the respondents stated that they would like to use the technology privately, 70% would like to have access to the technology, 35% would be willing to buy the presented technology, and 95% see a great benefit for themselves and others in the tested technology (see Figure 17).

Figure 17.

Use of the presented technologies.

In another part of the test, the sample's affinity for technology was queried. On average, confidence in one's own skills when dealing with new technology was rated 3.33 out of 5 points, the willingness to use new and unknown technology 4 out of 5 points, and the degree of technical overload only 2.13 out of 5 points. As a result, the test subjects showed a great willingness to use new technologies and did not feel overwhelmed by the technology used (see Figure 18).

Figure 18.

Technical affinity of the test persons.

Figure 19 illustrates that the subsystem of the project partner Otto Bock was rated positively by the test subjects. The success of the measurement was rated on average 4.35 out of 5 points, the success of the calibration 3.97 out of 5 points, and the intelligibility of the display 3.27 out of 5 points. The women rated the manageability of the system, at 4.08 out of 5 points, slightly better than the men at 3.44 out of 5 points.

Figure 19.

Evaluation of the application of the active prosthetic foot.

The gait analysis of the project partner Otto von Guericke University was rated very positively by the subjects with 4.27 out of 5 points, and the technology of the OvGU Kinect system with 3.9 out of 5 points. The more the test subjects were overwhelmed by the technology, the more negatively the system was rated (see Figure 20).

Figure 20.

Evaluation of the applications of the demonstrators of OvGU and TU Dresden.

For the system of the TU Dresden, the success of the measurement was rated 4.05 out of 5 points, and the comprehensibility of the instructions likewise 4.05 out of 5 points. The instructions were less comprehensible for test subjects who felt overwhelmed by the technology. The intelligibility of the display and the results was rated 3.58 out of 5 points (see Figure 20).

4. Conclusions

In the fast care project, a real-time capable sensor data analysis framework for ambient assisted living was developed. The project realized a medically valid, integrated real-time picture of the patient's situation by using several interconnected sensor-actuator infrastructures with a latency of less than 10 ms. The implemented sensor structure records the heart rate, the breathing rate, and the VOC content of the room air, analyzes the gait for rehabilitation, and measures the temperature and humidity in the room. An emergency button has also been integrated.

An active prosthetic foot was used as a special application of the sensor-actuator system. Its running parameters can be measured online, and the prosthesis can automatically adapt to the floor covering and the running demands via the network. This means that users have an intelligent active prosthesis at their disposal that helps them cope with everyday life more easily.

It was shown that even with a heterogeneous network consisting of WiFi, Bluetooth LE, Gigabit LAN, and 4G+ components, real-time operation of the AAL components was possible. Even the display of the measured data, transferred to a website via the cloud, only added latencies of a few milliseconds. This made it possible to create a real-time image in the form of an Avatar for all vital parameters and for the automatic setting of the active prosthetic foot, enabling clients to observe their physical condition in situ.

In addition to the technical development activities, an acceptance analysis was carried out on the demonstrator in the AAL laboratory. The survey results for the entire system were very positive: 60% of the respondents stated that they would like to use the technology privately, 70% would like to have access to the technology, 35% would be willing to buy the presented technology, and 95% see a great benefit for themselves and others in the tested technology.

Unfortunately, some slower network technologies such as Bluetooth LE had to be used to carry out the project. It is to be expected that with the full expansion of the networks to the fifth generation (5G), there will be a further significant leap in transmission speed and quality. It is therefore to be expected that eHealth applications in the home can be implemented in real time in the near future. Beyond the data fusion, further processing with the help of artificial intelligence will bring additional benefits to clients for the prevention of physical and mental health problems.

Acknowledgments

The fast care project was supported by the German Federal Ministry of Education and Research in the program "Zwanzig20 – Partnerschaft für Innovation", contract no. 03ZZ0519I. It was carried out in the form of a joint project with eight partners and a project coordinator. We thank all fast care project partners for their contributions to this work, personally listed in the following: Thomas Kirste, Christian Haubelt, Albert Hein, Florian Grützmacher from University of Rostock; Ernst Albrecht-Laatsch, Bernhard Graimann, Martin Schmidt, and Katharina Olze from Ottobock; Alexander Trumpp, Daniel Wedekind, Martin Schmidt, Sebastian Zaunseder, Hagen Malberg from Technische Universität Dresden; Christian Reinboth and Jens-Uwe Just from HarzOptics; Matthias Stege, Frank Schäfer, Tristan Heinig, and Sascha Huth from Exelonix; Rainer Dorsch from Bosch Sensortec; Lutz Schega, Sebastian Stoutz, and Kim-Charline Broscheid from Otto-von-Guericke-Universität Magdeburg.

References

  1. Fischer-Hirchert UHP. Anwendung von technikgestützten Pflegeassistenzsystemen in der Harzregion. Geriatric and Gerontology Congress. 2014;2014:1
  2. Rost K, Siegmund S, Fischer UHP. Technische Pflegeassistenzsysteme für ein längeres selbstbestimmtes Leben. In: Tagungsband AAL-Konferenz. Berlin: VDE Verlag; 2012. p. 335
  3. Haupt M, Just J-U, Fischer-Hirchert UH. Vitalparametererfassung in technikgestützten Pflegeassistenzsystemen. In: BMC-Kongress. Berlin: Bundesverband Managed Care e.V. (BMC); 2019. p. 5. Available from: http://www.bmckongress.de
  4. Schelisch L. Wer nutzt eigentlich PAUL? Erfahrungen aus dem Praxiseinsatz. In: VDE (Hrsg.): Wohnen - Pflege - Teilhabe "Besser leben durch Technik". 7. Deutscher AAL-Kongress mit Ausstellung. Elektronische Ressource. Berlin: VDE-Verlag; 21-22 January 2014
  5. Meyer S, Mollenkopf H. In: Coors M, Kumlehn M, editors. Ambient Assisted Living (AAL): Komponenten, Projekte, Services. Eine Bestandsaufnahme. Kohlhammer Verlag; 2013. p. 220
  6. Kung A, Jean-Bart B. Making AAL platforms a reality. In: Proceedings of the AmI 10 Workshop (AmI-10). Ambient Intelligence. Lecture Notes in Computer Science. Vol. 6439. 2010. pp. 187-196
  7. Fischer UHP, Rost K. Businessmodell zur Applikation von AAL-Userportalen zur Verbesserung der sozialen Teilhabe älterer Menschen in der Harzregion. In: AAL-Kongress 2014 Berlin, Wohnen – Pflege – Teilhabe - Besser leben durch Technik. Berlin: VDE; 2014. p. 5
  8. Bauer J, Kettschau A-K, Franke J. Optimierung der Datenvisualisierung von AAL-Serviceplattformen durch Usability-Tests. In: VDE, BMBF, Sozialverband VdK, Fraunhofer-AAL, editors. Wohnen - Pflege - Teilhabe - Besser leben durch Technik. Berlin: VDE Verlag GmbH; 2014. pp. 1-5
  9. Wagner J, André E, Jung F. Smart sensor integration: A framework for multimodal emotion recognition in real-time. In: Proceedings of the 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops (ACII 2009); 2009
  10. Albert M, Görs M, Schilling K. Telemedical applications with rule-based decision- and information-systems (TARDIS). IFAC-PapersOnLine. 2015;48(10):7-11. ISSN 2405-8963
  11. Gao W, Emaminejad S, Nyein HYY, Challa S, Chen K, Peck A, et al. Fully integrated wearable sensor arrays for multiplexed in situ perspiration analysis. Nature. 2016;529(7587):509-514. DOI: 10.1038/nature16521. PMID: 26819044; PMCID: PMC4996079
  12. Simsek M, Aijaz A, Dohler M, Sachs J, Fettweis G. 5G-enabled tactile internet. IEEE Journal on Selected Areas in Communications. 2016;34(3):460-473
  13. Fettweis GP. The tactile internet: Applications and challenges. IEEE Vehicular Technology Magazine. 2014;9(1):64-70
  14. Parvez I, Rahmati A, Guvenc I, Sarwat AI, Dai H. A survey on low latency towards 5G: RAN, core network and caching solutions. IEEE Communication Surveys and Tutorials. 2018;20(4):3098-3130
  15. Yao S, Hu S, Zhao Y, Zhang A, Abdelzaher T. DeepSense: A unified deep learning framework for time-series mobile sensing data processing. In: Proceedings of the 26th International World Wide Web Conference (WWW 2017); 2017
  16. Ruhm KH. Sensor fusion and data fusion - mapping and reconstruction. Measurement. 2007;40(2):145-157. ISSN 0263-2241. DOI: 10.1016/j.measurement.2006.07.012
  17. Kasetty S, Stafford C, Walker GP, Wang X, Keogh E. Real-time classification of streaming sensor data. In: Proceedings - International Conference on Tools with Artificial Intelligence (ICTAI); 2008
  18. Van Den Bogert AJ, Geijtenbeek T, Even-Zohar O, Steenbrink F, Hardin EC. A real-time system for biomechanical analysis of human movement and muscle function. Medical & Biological Engineering & Computing. 2013;51(10):1069-1077. DOI: 10.1007/s11517-013-1076-z. PMID: 23884905; PMCID: PMC3751375
  19. mqtt.org. MQTT homepage [Internet]. 2020. Available from: https://mqtt.org
  20. mosquitto.org. MOSQUITTO homepage [Internet]. 2020. Available from: https://www.mosquitto.org
  21. Kußmann P, Hoppstock S, Fischer-Hirchert U. Fast Care Final Design Plan. Wernigerode: Harz University; 2020
  22. Designed by fullvector/Freepik. Server picture; 2020
  23. Stoutz S, Chen CH, Broscheid KC, Schega L. User acceptance and usability of a home based gait analysis system. In: Smart SysTech 2019 - European Conference on Smart Objects, Systems and Technologies; 2019
  24. Broscheid K-C, Stoutz S, Chien-Hsi C, Schega L. The potential of a home-based gait evaluation system with a new low-cost IMU: A pilot study. In: HEALTH ACROSS LIFESPAN (HAL) - International Conference on Healthiness and Fitness across the Lifespan. Magdeburg: Otto von Guericke University Magdeburg; 12-15 September 2018
  25. Stoutz S, Schega L. Presentation of a concept to support rehabilitation through realtime feedback/monitoring in the home environment. In: Tagung DGBMT. Dresden: VDE; 2017. p. 86. Available from: https://www.vde.com/resource/blob/1645606/36a6dc49966d0b0196c7ddca0c52de8f/bmt2017-dgbmt-jahrestagung-programm-data.pdf
  26. Grützmacher F, Beichler B, Hein A, Kirste T, Haubelt C. Time and memory efficient online piecewise linear approximation of sensor signals. Sensors. 2019;19(23):5206
  27. Grützmacher F, Hein A, Kirste T, Haubelt C. Model-based design of energy-efficient human activity recognition systems with wearable sensors. Technologies. 2018;6(4):89
  28. Hein A, Kirste T. Activity recognition for ambient assisted living: Potential and challenges. In: Ambient Assisted Living (AAL), Deutscher AAL-Kongress mit Ausstellung / Technologien - Anwendungen - Management. Berlin, Germany; 30 January-1 February 2008. pp. 36-41
  29. Grützmacher F, Wolff JP, Hein A, Lepidis P, Dorsch R, Kirste T, et al. Towards energy efficient sensor nodes for online activity recognition. In: Proceedings IECON 2017 - 43rd Annual Conference of the IEEE Industrial Electronics Society; 2017
  30. Grützmacher F, Beichler B, Haubelt C. Model-based real time analysis of distributed human activity recognition stages in wireless sensor networks. In: UbiComp/ISWC 2019 - Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers; 2019
  31. Hein A, Grützmacher F, Haubelt C, Kirste T. Fast care – Real-time sensor data analysis framework for intelligent assistance systems. Current Directions in Biomedical Engineering. 2017;3(2):743-747
  32. Lempe G, Zaunseder S, Wirthgen T, Zipser S, Malberg H. Kamerabasierte Erfassung kardiorespiratorischer Signale. Technisches Messen. 2013;80(5):179-184. DOI: 10.1524/teme.2013.0029
  33. Zaunseder S, Heinke A, Trumpp A, Malberg H. Heart beat detection and analysis from videos. In: 2014 IEEE 34th International Scientific Conference on Electronics and Nanotechnology (ELNANO 2014) - Conference Proceedings; 2014
  34. Lempe G, Zaunseder S, Wirthgen T, Zipser S, Malberg H. ROI selection for remote photoplethysmography. In: Meinzer HP, Deserno T, Handels H, Tolxdorff T, editors. Bildverarbeitung für die Medizin. Informatik aktuell. Berlin, Heidelberg: Springer; 2013. DOI: 10.1007/978-3-642-36480-8_19
  35. Takano C, Ohta Y. Heart rate measurement based on a time-lapse image. Medical Engineering & Physics. 2007;29(8):853-857
  36. Zaunseder S, Trumpp A, Ernst H, Förster M, Malberg H. Cardiovascular assessment by imaging photoplethysmography – a review. Biomedical Engineering / Biomedizinische Technik. 2018;63(5):617-634. DOI: 10.1515/bmt-2017-0119
  37. Zaunseder S, Trumpp A, Wedekind D, Malberg H. Cardiovascular assessment by imaging photoplethysmography - a review. Biomedizinische Technik. 25 October 2018;63(5):617-634. DOI: 10.1515/bmt-2017-0119. PMID: 29897880
  38. Trumpp A, Bauer PL, Rasche S, Malberg H, Zaunseder S. The value of polarization in camera-based photoplethysmography. Biomedical Optics Express. 2017;8:2822-2834
  39. Fischer-Hirchert UHP, Reinboth C, Just J-U. Entwicklung von Komponenten für ein verteiltes Sensorsystem zur Echtzeit-Analyse von Atemgas. In: BMC Kongress 2019. Berlin: BMC; 2019. Available from: https://www.bmcev.de/bmc-kongress-posterausstellung/
  40. Khan MRR, Kang B-H, Lee S-W, Kim S-H, Yeom S-H, Lee S-H, et al. Fiber-optic multi-sensor array for detection of low concentration volatile organic compounds. Optics Express. 2013;21(17):20119-20130. Available from: http://www.ncbi.nlm.nih.gov/pubmed/24105558
  41. Just J-U, Reinboth C, Kußmann P, Müller A. Realisierung eines Demonstrators zur spektroskopischen Analyse von Raumluft und Atemgasen. In: Anhalt H, editor. Nachwuchswissenschaftlerkonferenz. Bernburg: Hochschule Anhalt; 2018. p. 1. Available from: http://nwk2018.de/fileadmin/Dateien/NWK/nwk2018_programmuebersicht.pdf
  42. Khan M, Kang S-W. A high sensitivity and wide dynamic range fiber-optic sensor for low-concentration VOC gas detection. Sensors. 2014;14(12):23321-23336. Available from: http://www.mdpi.com/1424-8220/14/12/23321/
  43. Albrecht-Laatsch E, Szufnarowski F. Optimization of dynamic properties of exo-prostheses using a distributed inertial measurement system. In: Jahrestagung der Deutschen Gesellschaft für Biomedizinische Technik (DGBMT). Dresden; 2017. p. 86. Available from: https://www.vde.com/resource/blob/1645606/36a6dc49966d0b0196c7ddca0c52de8f/bmt2017-dgbmt-jahrestagung-programm-data.pdf
  44. Yordanova K, Hein A, Kirste T. Kitchen task assessment dataset for measuring errors due to cognitive impairments. In: 2020 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), Austin, TX, USA; 2020. pp. 1-6. DOI: 10.1109/PerComWorkshops48775.2020.9156115
  45. Stege M. Requirements of low latency sensor/actuator networks for e-health applications. In: Jahrestagung der Biomedizinischen Technik und Dreiländertagung der Medizinischen Physik. Dresden: DGBMT; 2017. p. FS89. Available from: https://www.vde.com/resource/blob/1645606/36a6dc49966d0b0196c7ddca0c52de8f/bmt2017-dgbmt-jahrestagung-programm-data.pdf
  46. Matz AP, Fernandez-Prieto J-A, Cañada-Bago J, Birkel UA. A systematic analysis of narrowband IoT quality of service. Sensors. 2020;20:1636
  47. Sunyaev A. The internet of things. In: Internet Computing. Heidelberg: Springer; 2020. DOI: 10.1007/978-3-030-34957-8_10
  48. Exelonix. IoT – Services & Applications, E-Health Applications [Internet]. 2020. Available from: https://www.exelonix.com/services_englisch/
