A Novel Haptic Texture Display Based on Image Processing

Written By

Juan Wu, Aiguo Song and Chuiguo Zou

Published: 01 December 2009

DOI: 10.5772/7051

From the Edited Volume

Image Processing

Edited by Yung-Sheng Chen

1. Introduction

Surface texture is among the most salient haptic characteristics of objects; it helps in object identification and enhances the feel of the device being manipulated. Although textures are widely used in computer graphics, only recently have they been explored in haptics. Haptic textures are used to improve the realism of haptic interaction and to give cues associated with many tasks in tele-manipulation or in designing machines in virtual space. A haptic texture display can help the visually impaired experience 3D art at virtual museums and perceive the features of the artworks (G. Jansson et al. 2003). It can also be used by home shoppers in internet shopping.

Researchers have developed many sophisticated haptic texture display methods. The texture display methods so far involve three types of construction: 1) Real surface patch presentation (S. Tachi et al. 1994) (K. Hirota & M. Hirose 1995). These methods use a real contact surface, arranged in an arbitrary position in 3D space, to simulate a partial model of the virtual object. A typical system was developed by Minsky and was called the Sandpaper system. 2) Multiple-pin vibratory presentation. This approach requires a specialized pin-array device which can dynamically or statically exert pressure on the skin (Ikei et al. 1997) (Ikei 1998) (Masami 1998). 3) Single-point sensing and contact-tool-driven presentation. In recent years, researchers have tried to combine kinaesthetic and tactile devices, so as to enrich the haptic presentation and enhance the performance of texture display (Ikei 2002). However, such developments mean more complicated and more expensive devices.

Under category 3), when the operator explores the virtual sculpture's surface by performing natural movements of his/her hand, an artificial force is generated by the interaction with the virtual surface through a haptic interface. Since interacting with the virtual textured surface generates complex force information, the key issue for this class of methods is computing the contact forces in response to interactions with the virtual textured surface and applying them to the operator through the force-reflecting haptic interface. This paper first presents relevant previous work on texture force modelling, then introduces the principle of image-based feature extraction and contact force modelling. An implementation of the algorithm is then presented, and finally the experimental results are discussed.

2. Previous work

Hari presents a method to record perturbations while dragging the tip of a PHANToM on a real surface and to play them back using the same device (Lederman et al. 2004) (Vasudevan & Manivannan 2006). Ho et al. present a modification of 'bump maps' borrowed from computer graphics to create texture in haptic environments (C.-H. Ho et al. 1999). This method involves the perturbation of surface normals to create the illusion of texture. Dominguez-Ramirez proposed that the texture could be modelled as a periodic function. Siira and Pai present a stochastic approach to haptic textures aimed at reducing the computational complexity of texturing methods (Siira & Pai 2006). Fritz and Barner follow this up by presenting two stochastic models to generate haptic textures (Fritz & Barner 1996). S. Choi and H. Z. Tan study the perceived instabilities arising from current haptic texture rendering algorithms while interacting with textured models (Choi & Tan 2004). Miguel et al. propose a force model based on a geometric model (Miguel et al. 2004). All these contact force models can be classified into four kinds:

  • A mechanical sensor-based approach.

  • A geometry model-based approach.

  • Stochastic and deterministic models.

  • An image data-based approach.

The sensor-based approach, using devices such as a scanning electron microscope, can potentially reproduce tactile impressions most precisely, but a sensor equivalent to human skin is not commonly available and is too expensive (Tan 2006). The geometry-based approach involves intricate microscopic modelling of the object surface, requiring computation time to resolve the contact state between a finger and the surface. Stochastic and deterministic models, such as sinusoidal and random stochastic models, are simple to implement and can generate contact force samples that are perceptually different from each other, but the produced contact force does not map to the real textures. As for the deterministic models, the variable parameters are amplitude, frequency and texture coordinate. The controllable parameters are limited, and these features cannot fully describe the characteristics of a texture, such as its trend and direction.

To model the texture force during contact with a texture, its geometrical shape and material properties must be known. However, the precise measurement of minute shape or bumpiness is not easy, since it requires special measurement apparatus. The method of using a photograph to obtain the geometrical data of a texture has been adopted by Ikei, in which a histogram transformation was used to obtain the intensity distribution of the image.

3. Realization of haptic texture display based on delta haptic device

Although the height profile of a surface is not itself the intensity of the tactile sensation perceived, it is among the data most closely related to the real stimulus. The material of the object should be in a category that produces an image reflecting its height map. Based on this hypothesis, a novel image data-based texture force model is proposed. We propose that a photographic image is equivalent to geometrical data as long as the photo is properly taken. The image data are processed with Gauss filters and the micro-geometric features are acquired. Based on the height map, the constraint forces in the tangential and normal directions are modelled. Then the texture force is applied to the operator by the DELTA haptic device. The principle diagram of the haptic display system is shown in Figure 1. During exploration of the surface of a virtual object, the user perceives force stimuli on the hand from the DELTA haptic device.

Figure 1.

A schematic representation of the haptic texture display system

3.1. Graphic to haptic

A virtual texture is presented by a 3-DOF texture force. In order to simulate the contact force as if touching a real textured surface, it is proposed to represent the surface texture by using a photograph. To ensure precise discrimination and high similarity to actual object textures, the image features are processed in the following steps.

  1. Image sampling and pre-processing. The original images were taken from a digital camera. The colour image was first transformed into a grey-scale image. When acquiring the contour of the textured surface from 2D image data, the prominent problem is that the image's brightness intensity must roughly match the height map of the texture protrusions. Therefore homomorphic filtering was adopted to eliminate the effect of non-uniform illumination.

  2. In image processing, the Gauss filter is a common low-pass filter in the frequency domain, which attenuates high frequencies and leaves low frequencies unchanged; its effect is to smooth the edges of the image. For a texture image, the low-frequency components usually correspond to large continuous spatial regions, and the high-frequency components usually correspond to the edges of the texture image, which coincide with the alternation of geometrical height in space. Here we use a uniform Gauss filter to smooth the texture image and then subtract the filtered image from the original image; the remaining 'noise' represents the texture model.

The original image is f(x,y).

  1. Transform the original image f into the Fourier domain, F(k,l).

  2. The transformed image is then multiplied by the Gauss filter H in a pixel-by-pixel fashion, i.e. G(k,l) = H(k,l) × F(k,l), where H is the filter function and G(k,l) is the filtered image in the Fourier domain.

  3. To obtain the resulting image in real space, G(k,l) is re-transformed to g(x,y) using the inverse Fourier transform.

  4. The sharp intensity changes are obtained through f(x,y) − g(x,y), which reflects the height map of the texture surface.

All the image processing is realized with MATLAB functions.
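The chapter states only that the processing is realized with MATLAB functions. As an illustration, the following is a minimal NumPy sketch of the same four-step pipeline (Fourier transform, Gaussian low-pass, inverse transform, subtraction). The function name texture_height_map and the cutoff parameter sigma_px are assumptions made for this sketch, not values given in the chapter.

```python
# Minimal sketch of the height-map extraction described above (assumed names/values).
import numpy as np

def texture_height_map(gray, sigma_px=8.0):
    """Return f(x,y) - g(x,y): the high-frequency residue used as the height map.

    gray     : 2-D array, grey-scale image f(x,y)
    sigma_px : spatial width (in pixels) of the Gaussian low-pass filter (assumed)
    """
    rows, cols = gray.shape

    # 1) Transform the image into the Fourier domain, F(k,l).
    F = np.fft.fftshift(np.fft.fft2(gray))

    # 2) Build a Gaussian low-pass filter H(k,l) centred on zero frequency.
    k = np.arange(rows) - rows / 2.0
    l = np.arange(cols) - cols / 2.0
    K, L = np.meshgrid(k, l, indexing="ij")
    # frequency-domain sigma (in DFT index units) equivalent to a spatial sigma of sigma_px
    sig_r = rows / (2.0 * np.pi * sigma_px)
    sig_c = cols / (2.0 * np.pi * sigma_px)
    H = np.exp(-(K**2 / (2.0 * sig_r**2) + L**2 / (2.0 * sig_c**2)))

    # 3) Multiply pixel by pixel and transform back: g(x,y) is the smoothed image.
    g = np.real(np.fft.ifft2(np.fft.ifftshift(H * F)))

    # 4) The sharp intensity changes f - g approximate the texture height map.
    return gray - g
```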

Once the texture height map is acquired, the virtual texture is generated by binding the height map to the virtual object surface.

3.2. Haptic texture generation

Figure 2.

Interaction force components

Figure 3.

The spatial coordinates of the intersection between the haptic probe (or rendering device) and the virtual object being probed

The constraint force when a virtual agent (virtual finger, probe, etc.) contacts the virtual texture surface is modelled as the resultant of the texture force Ft and the friction force Ff, as shown in Fig. 2 and Fig. 3. The texture force is calculated as

F_t = k · d(x, y)    (1)

where k is a constant of proportionality and d(x,y) is the height map of the texture; (x,y) is a texture coordinate within the surface. The direction of the force is normal to the reference surface that forms the contour of the whole object.

The friction Ff is modelled as

F_f = k_1 · Δw · n    (2)

where Δw is the depth of penetration, n is a unit normal vector and k_1 is a friction coefficient which varies with different surfaces.

F_c = F_t + F_f    (3)

To constrain the avatar on the virtual surface, a maximum penetration depth H is given. If the penetration depth exceeds H, the constraint force in the normal direction is set to a constant value within the output range of the haptic device.
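A minimal sketch of this per-frame force computation, following Eqs. (1)-(3) as written (including the friction term of Eq. (2) applied along the normal n) together with the penetration cap described above, is given below. The function name contact_force and the values of k, k1, H and F_max are illustrative assumptions, not parameters reported in the chapter.

```python
# Sketch of the contact-force model of Eqs. (1)-(3) with the penetration cap H (assumed values).
import numpy as np

def contact_force(height_map, x, y, penetration, surface_normal,
                  k=0.8, k1=0.3, H=0.005, F_max=5.0):
    """Resultant force F_c = F_t + F_f at integer texture coordinate (x, y).

    height_map     : 2-D array d(x, y) from the image-processing step
    penetration    : depth of penetration Δw (>= 0)
    surface_normal : normal n of the reference surface (3-vector)
    """
    n = np.asarray(surface_normal, dtype=float)
    n /= np.linalg.norm(n)                      # make n a unit vector

    # Eq. (1): texture force, directed along the normal of the reference surface.
    F_t = k * height_map[y, x] * n

    # Cap the penetration at H so the normal constraint force saturates at a
    # constant value, as described in the text.
    dw = min(penetration, H)

    # Eq. (2): friction force as modelled in the chapter.
    F_f = k1 * dw * n

    # Eq. (3): resultant constraint force, clamped to the device output range (assumed 5 N).
    F_c = F_t + F_f
    norm = np.linalg.norm(F_c)
    if norm > F_max:
        F_c *= F_max / norm
    return F_c
```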

3.3. Implementation and experiments

Using the techniques discussed in this paper, the texture force display system shown in Figure 4 was composed of a 3-DOF DELTA haptic device, which produces the force stimulation. It was connected to a 2.66 GHz Pentium PC, with graphical display on an LCD screen. During exploration of the virtual textured surface, the user perceives the variation of the reactive force between the virtual probe and the virtual textured surface. The screen shows the virtual space in which a red dot, an avatar representing the fingertip of the physical hand on the right, interacts with a sculpture. The DELTA haptic device held by the operator reflects forces to the user whose sensations approximate the effect of exploring a real object of the same shape.

Figure 4.

The texture presentation based on the DELTA haptic device

To evaluate the presentation quality of the haptic texture display system, it was compared with two other commonly used texture force models: the sinusoid model (Tan, 2006) and the random stochastic model (Siira, 2006). The commonly used sinusoid model defines the height map as

h(x) = A · sin(2πx / L) + A    (4)

where L is the wavelength and A is the amplitude of the height. The restoring force is calculated as

F = k · [h(x) − p_z]  when p_z ≤ h(x);  F = 0  otherwise    (5)

where the restoring force F always points upward and p_z is the z-position of the stylus tip.
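For comparison, a small sketch of this sinusoidal reference model, Eqs. (4)-(5), is shown below; the values of L, A and k are illustrative assumptions, not parameters reported in the chapter.

```python
# Sketch of the sinusoidal comparison model of Eqs. (4)-(5) (assumed parameter values).
import math

def sinusoid_restoring_force(x, p_z, L=0.004, A=0.0005, k=800.0):
    """Upward restoring force for the sinusoidal height map h(x)."""
    h = A * math.sin(2.0 * math.pi * x / L) + A        # Eq. (4): h(x) in [0, 2A]
    return k * (h - p_z) if p_z <= h else 0.0          # Eq. (5): zero when the tip is above the surface
```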

The random stochastic model is

F = k · rand(μ, σ²)    (6)

Figure 5.

Sample textures used for the discrimination experiment and corresponding height maps

* Note: the unit of the x and y axes is pixels; z is the contour of the rough surface

where rand() is white noise produced by the computer program. The adjustable parameters are the mean μ and variance σ², which originate from the statistical properties of the texture image sample. A higher variance produces a rougher texture. To ensure that no texture force is applied when the user is not moving, F is set to zero below a small velocity threshold.
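A sketch of this stochastic comparison model, Eq. (6), is given below. Taking the mean and variance from the grey-scale image statistics follows the description above; the function name, gain k and velocity threshold are illustrative assumptions.

```python
# Sketch of the random stochastic comparison model of Eq. (6) (assumed names/values).
import numpy as np

def stochastic_texture_force(image, speed, k=1.0, v_threshold=1e-3, rng=None):
    """White-noise texture force whose statistics come from the texture image sample."""
    if speed < v_threshold:              # no texture force when the user is not moving
        return 0.0
    rng = rng or np.random.default_rng()
    mu = float(image.mean())             # mean of the grey-scale sample
    sigma = float(image.std())           # higher variance -> rougher feel
    return k * rng.normal(mu, sigma)     # Eq. (6): F = k * rand(mu, sigma^2)
```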

Twenty subjects, 10 male and 10 female students at Southeast University, participated in the psychophysical experiments. The average age was 24 years. The texture samples were metal, wood, brick and sandpaper. Typical sample images and the corresponding profiles acquired by the image processing method are shown in Fig. 5.

Ⅰ. Selection Experiment

This experiment compares the effectiveness of the texture force models mentioned above. The texture forces generated by these models were presented to the subjects in a random sequence. While controlling the proxy to slide over the virtual texture surface with the DELTA device, the subjects felt the texture force and were required to select which method was the most realistic. The trials were repeated three times for different texture samples, and Table 1 shows the mean selection results.

The results indicate that the proposed model is superior to the others. One explanation may be that the height map originates from the real image texture, which contains more information, such as the orientation of the texture, than the other two models.

Table 1.

Comparing which method is more realistic

Ⅱ. Matching Experiment

This experiment evaluated the proposed model's legibility. In the experiment, subjects were asked to wear an eye mask so that the procedure was carried out without visual observation. Four different texture samples were rendered to the subject, and then the four corresponding texture images were shown. The subject was asked to match the textures they saw to those they had felt previously. Human haptic memory span is limited: in one continuous haptic perception trial, the number of samples a person can remember and identify is 3 to 6, so in this experiment each rendered group included only four samples. The four texture samples were metal, brick, wood and sandpaper (a, b, c, d in Fig. 5). Each subject performed the task under the same conditions three times. The matching correct rates are given in Table 2.

Table 2.

Matching the haptically rendered samples with the image samples

Ⅲ. Recognition Experiment

This experiment also estimates the presentation quality of the haptic texture model. In the experiment, a haptic texture was rendered to the subjects with our method; the subjects were then shown a group of texture images, among which the rendered one was included, and asked to tell which image was the one just rendered. The number of samples in each group was 6.

The correct rate of the matching experiment (Ⅱ) is higher than that of the recognition experiment (Ⅲ). It is supposed that, in experiment Ⅱ, the matching pairs are limited and the subject's force perception can be referenced against the other samples.

Table 3.

Selecting the haptically rendered sample from the image samples

4. Results and discussion

The experiments imply that our model is distinctly superior to the other two haptic texture models. The haptic texture rendering system exhibits noticeably different reflected forces to the users. One important reason is that the texture force model is derived from the height profile of the texture image and a Gauss filter is utilized. But we also have to admit that what we feel is still not what we see. Due to the complex nature of the haptic rendering pipeline and the human somatosensory system, it remains a difficult problem to expose all the factors contributing to such perceptual artefacts. The haptic interface research laboratory at Purdue University has investigated the unrealistic behaviour of haptic textures. In our system, one reason for the limited matching rate is the haptic device. As the haptic texture is implemented through the haptic device, the performance is closely related to the device. To generate the stimuli of small texture forces, the output force is controlled within a range of 5 N. However, the DELTA device has a relatively large damping force compared with the magnitude of the texture force, which affects the actual output of the system. On the other hand, a real stainless steel surface has an almost infinite stiffness and cannot be penetrated by the fingertip or a probe, whereas an impedance implementation of a virtual surface has a limited stiffness due to the output limits of the haptic device. Another factor that may affect haptic perception in our system is the mapping between the pixel coordinates of the image and the world coordinates of the haptic device. As the resolution of the image is fixed, a larger workspace of movement means that an interpolation or approximation approach must be used, as in the sketch below.
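The chapter does not specify how this mapping is implemented; the following is a minimal sketch of one common option, bilinear interpolation of the height map at a world-space probe position. The function name sample_height and the workspace extents are assumptions for illustration only.

```python
# Sketch of a pixel-to-world mapping via bilinear interpolation (assumed names/values).
import numpy as np

def sample_height(height_map, wx, wy, workspace=(0.12, 0.12)):
    """Bilinearly interpolate d(x, y) at a world-space position (wx, wy) in metres."""
    rows, cols = height_map.shape
    # Scale world coordinates into continuous pixel coordinates, clamped to the image.
    px = np.clip(wx / workspace[0], 0.0, 1.0) * (cols - 1)
    py = np.clip(wy / workspace[1], 0.0, 1.0) * (rows - 1)
    x0, y0 = int(np.floor(px)), int(np.floor(py))
    x1, y1 = min(x0 + 1, cols - 1), min(y0 + 1, rows - 1)
    tx, ty = px - x0, py - y0
    # Blend the four surrounding pixels.
    top = (1 - tx) * height_map[y0, x0] + tx * height_map[y0, x1]
    bottom = (1 - tx) * height_map[y1, x0] + tx * height_map[y1, x1]
    return (1 - ty) * top + ty * bottom
```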

5. Conclusion and future work

To rapidly and effectively model the texture force and simulate the haptic stimuli while the user touches the textured surface of a 3D object, the height map of the texture is acquired with a Gauss filter in the frequency domain, and the texture force and friction force are modelled based on this height map. As the height map is obtained from image data, the processing is simple and no specialized 3D geometrical scanning device is required. In this system, the textured objects are regular, for example cuboids and columns. In the future, the haptic texture model could be improved by combining it with a random signal based on SR (stochastic resonance), and further psychophysical experiments would be carried out to adjust the parameters. The system will also be applied to objects with irregular shapes for wider use.

References

  1. Ho, C.-H.; Basdogan, C. & Srinivasan, M. A. (1999). "Efficient point-based rendering techniques for haptic display of virtual objects." Presence, Vol. 8, No. 5, October 1999, pp. 477-491, ISSN 1054-7460
  2. Choi, S. & Tan, H. Z. (2004). "Toward realistic haptic rendering of surface textures." IEEE Computer Graphics and Applications, Vol. 24, No. 2, March 2004, pp. 40-47, ISSN 0272-1716
  3. Fritz, J. P. & Barner, K. E. (1996). "Stochastic models for haptic texture." Proceedings of SPIE - The International Society for Optical Engineering, pp. 34-44, ISSN 0277-786X, Boston, November 1996, SPIE, Bellingham
  4. Tan, H. Z.; Adelstein, B. D. & Traylor, R. (2006). "Discrimination of real and virtual high-definition textured surfaces." Proceedings of the Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pp. 3-9, ISBN 978-1-42440-226-7, Arlington, March 2006, IEEE, Piscataway
  5. Hirota, K. & Hirose, M. (1995). "Simulation and presentation of curved surface in virtual reality environment through surface display." Proceedings of the IEEE Virtual Reality Annual International Symposium, pp. 211-216, ISBN 0-81867-084-3, Los Alamitos, March 1995
  6. Ikei, Y. & Shiratori, M. (2002). "TextureExplorer: A tactile and force display for virtual textures." Proceedings of the 10th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pp. 327-334, ISBN 0-76951-489-8, March 2002, IEEE, Los Alamitos
  7. Ikei, Y.; Wakamatsu, K. & Fukuda, S. (1997). "Texture presentation by vibratory tactile display." Proceedings of the IEEE Annual Virtual Reality International Symposium, pp. 199-205, ISBN 0-81867-843-7, Mexico, March 1997, IEEE, Los Alamitos
  8. Ikei, Y.; Wakamatsu, K. & Fukuda, S. (1998). "Image data transformation for tactile texture display." Proceedings of the Virtual Reality Annual International Symposium, pp. 51-58, ISBN 0-81868-362-7, March 1998, IEEE, Los Alamitos
  9. Jansson, G.; Bergamasco, M. & Frisoli, A. (2003). "A new option for the visually impaired to experience 3D art at museums: Manual exploration of virtual copies." Visual Impairment Research, Vol. 5, No. 1, April 2003, pp. 1-12, ISSN 0138-8235X
  10. Siira, J. & Pai, D. K. (2006). "Haptic texturing - a stochastic approach." Proceedings of the IEEE International Conference on Robotics and Automation, pp. 557-562, ISBN 0780395050, Orlando, May 2006, IEEE, Piscataway
  11. Lederman, S. J. et al. (2004). "Force variability during surface contact with bare finger or rigid probe." Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pp. 154-160, ISBN 0-76952-112-6, March 2004, IEEE, Los Alamitos
  12. Shinohara, M.; Shimizu, Y. & Mochizuki, A. (1998). "Three-dimensional tactile display for the blind." IEEE Transactions on Rehabilitation Engineering, Vol. 6, No. 3, September 1998, pp. 249-256, ISSN 1063-6528
  13. Miguel, A. et al. (2004). "Haptic display of interaction between textured models." Proceedings of the IEEE Visualization Conference, pp. 297-304, ISBN 0-78038-788-0, October 2004, IEEE, Piscataway
  14. Tachi, S. et al. (1994). "A construction method of virtual haptic space." Proceedings of the Fourth International Conference on Artificial Reality and Tele-Existence, pp. 131-138, Tokyo, July 1994
  15. Vasudevan, H. & Manivannan, M. (2006). "Recordable haptic textures." Proceedings of the IEEE International Workshop on Haptic Audio Visual Environments and their Applications, pp. 130-133, ISBN 1-42440-761-3, November 2006, IEEE, Piscataway
