Open access

Development of a Virtual Group Walking Support System

Written By

Masashi Okubo

Published: 01 February 2010

DOI: 10.5772/8134

From the Edited Volume

Human-Robot Interaction

Edited by Daisuke Chugo

1. Introduction

In Japan, people are often seen walking with friends for their health in the morning and evening, both in towns and in the countryside. Most of them walk in groups, partly because walking with others makes it easier to keep up the exercise. However, weather such as rain or snow sometimes prevents them from exercising outdoors, so many people keep exercise machines such as steppers and treadmills at home. It is hard, though, to stay motivated when exercising alone at home. From this viewpoint, a cycling machine with a display, which offers a virtual space and an avatar that supports the user's exercise, has been developed (IJsselsteijn et al., 2004).

In this research, we propose a walking system for keeping fit with partners in a virtual space shared over the Internet. The system consists of a computer connected to the Internet and a sensor that captures the user's motion. The proposed system presents moving images and footstep sounds to the user in time with the user's steps, and in the case of paired use, the users' voices, motions and footsteps are sent to each other.

2. System configuration

The configuration of the proposed system is shown in Fig. 1. It consists of a stepper for exercise, a personal computer connected to the Internet, and a sensor for measuring the user's motion. The user can walk along a street in the virtual space by using the stepper equipped with the sensor. The positional data from the sensor attached to the stepper are sent to the PC through a serial interface. When more than one person uses the system, the positional data are exchanged between the users' PCs to realize walking with partners.

2.1. Hardware configuration

In this research, a stepper sold as an exercise tool is used. A sensor is attached to the heel position of the stepper, and the positional data x, y and z are sent to the PC. The stepper used in this research is shown in Fig. 1; however, any kind of walking machine can be used instead, as long as the user's motion can be measured. A magnetic sensor (POLHEMUS FASTRAK) is used to measure the user's motion. It measures position along the x, y and z axes shown in Fig. 2, as well as azimuth, elevation and roll. The PC (DELL XPS) used in this research has a Pentium D 3.46 GHz CPU, 2.0 GB of memory, two GeForce 7800 GTX graphics cards in SLI, and the Windows XP OS. It renders the virtual space and moves the user's viewpoint in the virtual space based on the positional data from the magnetic sensor.

Figure 1.

System Configuration.

Figure 2.

Example of positional data obtained from magnetic sensor.
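
As a concrete illustration of the data flow described in Section 2.1, the following is a minimal sketch, not the author's original driver code, of how one positional record per sample might be parsed once it has been read from the serial port. The assumed record format, a whitespace-separated text line "x y z azimuth elevation roll", and the units are illustrative assumptions.

#include <iostream>
#include <sstream>
#include <string>

// One sample from the magnetic sensor (assumed field order and units).
struct SensorRecord {
    double x, y, z;                   // position
    double azimuth, elevation, roll;  // orientation
};

// Reads one whitespace-separated record from an input stream; in the real
// system the stream would be backed by the serial port connection.
bool readRecord(std::istream& in, SensorRecord& r) {
    std::string line;
    if (!std::getline(in, line)) return false;
    std::istringstream fields(line);
    return static_cast<bool>(fields >> r.x >> r.y >> r.z
                                    >> r.azimuth >> r.elevation >> r.roll);
}

int main() {
    SensorRecord r;
    while (readRecord(std::cin, r))
        std::cout << "heel position z = " << r.z << "\n";
    return 0;
}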

2.2. Software configuration

VC++ 2005 was used to write the program, which extracts the user's motion from the positional data, constructs the virtual space, and moves the viewpoint in the virtual space according to the user's motion. DirectX loads .x files, such as the walkway model, and renders the image. When more than one person uses the system, each user's motion data are sent to the partner's PC through socket communication. Fig. 3 shows examples of the image presented to the user; (a) is for alone use and (b) is for paired use.

Figure 3.

Examples of user’s viewpoint in case of alone use (a) and paired use (b).
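
The chapter states only that motion data are exchanged by socket communication; the following minimal sketch shows one possible way to send a motion sample to the partner's PC using UDP over Winsock. The packet layout, port number and address are hypothetical and not taken from the original implementation.

#include <winsock2.h>
#include <ws2tcpip.h>
#pragma comment(lib, "ws2_32.lib")

// Hypothetical packet layout: one motion sample plus a footstep flag.
struct MotionPacket {
    float x, y, z;   // heel position from the magnetic sensor
    int   footstep;  // 1 when a footstep sound was triggered this frame
};

int main() {
    WSADATA wsa;
    if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0) return 1;

    SOCKET s = socket(AF_INET, SOCK_DGRAM, IPPROTO_UDP);
    if (s == INVALID_SOCKET) { WSACleanup(); return 1; }

    sockaddr_in partner = {};
    partner.sin_family = AF_INET;
    partner.sin_port = htons(50000);                       // example port
    inet_pton(AF_INET, "192.168.0.2", &partner.sin_addr);  // example address

    MotionPacket pkt = { 1.2f, 0.0f, 8.5f, 1 };
    sendto(s, reinterpret_cast<const char*>(&pkt), sizeof(pkt), 0,
           reinterpret_cast<const sockaddr*>(&partner), sizeof(partner));

    closesocket(s);
    WSACleanup();
    return 0;
}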

In addition, the proposed system can present footstep sounds to the user based on the user's walking rhythm (Kobayashi et al., 2006; Miwa et al., 2001). A footstep sound is played from the speaker when the user switches steps. The footstep sounds are generated by the system from the positional data obtained from the magnetic sensor attached to the stepper. Fig. 4 shows an example of the user's motion and how the timing of the footstep sounds is determined. The system watches for the position to cross threshold values set slightly below the maximum position and slightly above the minimum position. It also does not sound a footstep immediately after the previous one; therefore, in Fig. 4, footsteps are sounded in cases 1 to 8 but not in case 9.

Figure 4.

Timing to sound the footsteps.
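
The timing rule of Fig. 4 can be expressed compactly in code. The sketch below is illustrative rather than the system's actual source: it assumes the vertical heel position arrives at a fixed sampling rate, that two thresholds are set slightly inside the minimum and maximum positions, and that a short refractory period suppresses a footstep that would follow immediately after the previous one (case 9 in Fig. 4). The threshold values and refractory length in the example are assumptions.

#include <cstdio>

// Detects the moments at which a footstep sound should be played, following
// the rule of Fig. 4: fire when the heel position crosses a threshold near
// the top or the bottom of its stroke, but not immediately after the
// previous footstep.
class FootstepDetector {
public:
    FootstepDetector(double upper, double lower, int refractorySamples)
        : upper_(upper), lower_(lower),
          refractory_(refractorySamples), cooldown_(0), state_(Middle) {}

    // Call once per sensor sample; returns true when a footstep should sound.
    bool update(double z) {
        if (cooldown_ > 0) --cooldown_;   // refractory period still running
        bool fire = false;
        if (z > upper_ && state_ != High) {
            state_ = High;
            fire = (cooldown_ == 0);
        } else if (z < lower_ && state_ != Low) {
            state_ = Low;
            fire = (cooldown_ == 0);
        }
        if (fire) cooldown_ = refractory_;
        return fire;
    }

private:
    enum State { Low, Middle, High };
    double upper_, lower_;       // thresholds slightly inside the max/min positions
    int refractory_, cooldown_;  // refractory period, in samples
    State state_;
};

// Example run with made-up heel heights (roughly a 0-10 cm stroke).
int main() {
    FootstepDetector detector(9.0, 1.0, 2);
    const double samples[] = {0.5, 3.0, 7.0, 9.5, 9.8, 7.0, 2.0, 0.4, 0.3, 5.0, 9.6};
    for (double z : samples)
        if (detector.update(z)) std::printf("footstep at z = %.1f\n", z);
    return 0;
}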

Two or more people can use this system at the same time through the network. They can share the virtual space and communicate with their partners while doing the walking exercise. The proposed system sends the users' footstep rhythms to each other, so each user can feel the partner's presence and motion.

3. System evaluation

We performed a sensory evaluation of the proposed system's usability in both alone use and paired use. The subjects were asked to compare conditions with moving images and/or footstep sounds in the virtual space against conditions without them. In the case of paired use, the effect of the partner's footsteps on the feeling of the partner's presence was also evaluated.

3.1. Experimental method

Twenty subjects in their twenties performed the exercise under the following experimental conditions:

Alone walking with

(a-1) TV

(a-2) Only moving image

(a-3) Footsteps and moving image

Paired walking with

(p-1) Only voice chat with remote partner

(p-2) Voice chat and moving image

(p-3) Voice chat, moving image and footsteps

Fig. 5 shows an example of the experimental scene.

Figure 5.

Example of experimental scene.

In the case of alone use, the subjects first walked with the experimental system for one minute to familiarize themselves with it. Each subject was then asked to exercise alone under two of the conditions (a-1) to (a-3) and to answer which condition he or she preferred. This was repeated for all three pairings of the three experimental conditions. After the experiment, the subjects answered a questionnaire.

In the case of paired use, two subjects in different rooms used the system together. Each subject was asked to exercise with his or her remote partner under two of the conditions (p-1) to (p-3) and to answer which condition he or she preferred. This was repeated for all three pairings of the three experimental conditions.

3.2. Experimental results

Tables 1 and 2 show the results of the paired comparisons. Each number in the tables is the number of subjects who preferred the row condition to the column condition. In the case of alone use, most subjects preferred the exercise while watching TV (a-1).

The Bradley-Terry model was applied to evaluate the preference for each condition quantitatively. It is defined as follows (Okubo & Watanabe, 1999):

$$P_{ij} = \frac{\pi_i}{\pi_i + \pi_j}, \qquad \sum_i \pi_i = \mathrm{const.}\;(=100) \tag{1}$$

where $\pi_i$ is the intensity of condition $i$ and $P_{ij}$ is the probability that condition $i$ is judged better than condition $j$.

Here, $\pi_i$ represents the strength of preference for each experimental condition. The model makes it possible to estimate the preferences from the paired-comparison results (see Figs. 6 and 7).

The Bradley-Terry model was fitted using the results of the paired comparisons. To verify the fit of the model, a goodness-of-fit test and a likelihood-ratio test were applied, and the fitted model was found to be consistent with the data.

          (a-1)   (a-2)   (a-3)   Total
(a-1)       -      14      15       29
(a-2)       6       -       5       11
(a-3)       5      15       -       20

Table 1.

Result of paired comparison in case of alone use.

          (p-1)   (p-2)   (p-3)   Total
(p-1)       -       7       6       13
(p-2)      13       -      13       26
(p-3)      14       7       -       21

Table 2.

Result of paired comparison in case of paired use.
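
For reference, the intensities $\pi_i$ in Eq. (1) can be estimated from a paired-comparison table such as Table 1 with the standard minorization-maximization update for the Bradley-Terry model. The sketch below is illustrative and is not the author's original analysis code; it only uses the counts of Table 1 and the normalization $\sum_i \pi_i = 100$.

#include <cstdio>
#include <vector>

int main() {
    // w[i][j] = number of subjects who preferred condition i to condition j,
    // taken from Table 1 for conditions (a-1), (a-2), (a-3).
    const double w[3][3] = { {0, 14, 15},
                             {6,  0,  5},
                             {5, 15,  0} };
    std::vector<double> pi(3, 1.0);  // intensities, initialized uniformly

    for (int iter = 0; iter < 100; ++iter) {
        std::vector<double> next(3);
        for (int i = 0; i < 3; ++i) {
            double wins = 0.0, denom = 0.0;
            for (int j = 0; j < 3; ++j) {
                if (i == j) continue;
                wins  += w[i][j];
                denom += (w[i][j] + w[j][i]) / (pi[i] + pi[j]);
            }
            next[i] = wins / denom;  // MM update for the Bradley-Terry model
        }
        pi = next;
        // Rescale so that the intensities sum to 100, as in Eq. (1).
        double sum = 0.0;
        for (double v : pi) sum += v;
        for (double& v : pi) v *= 100.0 / sum;
    }

    for (int i = 0; i < 3; ++i)
        std::printf("pi[%d] = %.1f\n", i, pi[i]);
    return 0;
}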

Figure 6.

Bradley-Terry model for paired comparison in case of alone use.

Figure 7.

Bradley-Terry model for paired comparison in case of paired use.

3.3. Results of questionnaires

After the experiments, the subjects answered questionnaires about the system's usability and the experimental conditions. In the questionnaires, the subjects were asked which they preferred, (a-1): alone walking with TV, or (p-1): paired walking with only voice chat. The result is shown in Table 3.

(a-1)   <---   even   --->   (p-1)
  4       1      1      4      10

Table 3.

Comparison (p-1) with (a-1).

Although condition (a-1) was the most preferred condition for alone walking and (p-1) was the least preferred condition for paired walking, the subjects still tended to prefer the paired walking. These results indicate that paired walking tends to be preferred to alone walking even in the virtual space.

Moreover, 14 of the 20 subjects answered that they preferred paired walking to alone walking. This shows the importance of partners in maintaining motivation for exercise.

4. Future works

The experimental results indicate that the diversity of the virtual space is important, especially in the case of alone exercise. Therefore, we have tried to create a more varied virtual space. Fig. 8 shows an example of a virtual space in which cars cross the road and unknown pedestrians walk along the street at random. In addition, to encourage communication with the partner, the speech-driven avatar InterActor will be applied (Watanabe et al., 2004).

Figure 8.

Example of virtual space with diversity.

Moreover, there is a limit to the diversity that can be achieved with computer graphics, so we are also considering the use of video movies in place of computer graphics (Fig. 9).

Figure 9.

Utilization of video movies.

5. Conclusions

In this paper, we have proposed a group walking system for keeping fit with partners in a virtual space shared over the Internet, with the aim of maintaining the motivation to exercise. The effectiveness of moving images, footstep sounds and conversation with partners in relieving the dullness of the exercise was examined using the proposed system. In the case of alone use, the subjects tended to prefer exercising while watching TV, followed by the system with moving images and footstep sounds synchronized with their steps, over exercising with nothing. In the case of paired use, the subjects tended to prefer the condition with voice chat and moving images. From the questionnaire results, most subjects preferred paired walking to alone walking, even when they had something to watch. The sensory evaluation and questionnaires thus demonstrate the effectiveness of the proposed system.

References

  1. IJsselsteijn, W.; Kort, Y.; Westerink, J.; Jager, M. & Bonants, R. (2004). Fun and Sports: Enhancing the Home Fitness Experience, Entertainment Computing - ICEC 2004, pp. 46-56.
  2. Kobayashi, T.; Miyake, Y.; Wada, Y. & Matsubara, M. (2006). Kinematic Analysis System of Walking by Acceleration Sensor: An Estimation of Walk-Mate in Post-Operation Rehabilitation of Hip-Joint Disease, Journal of the Society of Instrument and Control Engineers, Vol. 42, No. 5, pp. 567-576 (in Japanese).
  3. Miwa, Y.; Wesugi, S.; Ishibiki, C. & Itai, S. (2001). Embodied Interface for Emergence and Co-share of 'Ba', Usability Evaluation and Interface Design.
  4. Okubo, M. & Watanabe, T. (1999). Visual, Tactile and Gazing Line-Action Linkage System for 3D Shape Evaluation in Virtual Space, Proc. of the 8th IEEE International Workshop on Robot and Human Communication (RO-MAN'99), pp. 72-75.
  5. Watanabe, T.; Okubo, M.; Nakashige, M. & Danbara, R. (2004). InterActor: Speech-Driven Embodied Interactive Actor, International Journal of Human-Computer Interaction, pp. 43-60.
