Open access peer-reviewed chapter

Electrostatic Friction Displays to Enhance Touchscreen Experience

Written By

Reza Haghighi Osgouei

Submitted: 20 November 2018 Reviewed: 09 January 2020 Published: 12 February 2020

DOI: 10.5772/intechopen.91056

From the Edited Volume

Modern Applications of Electrostatics and Dielectrics

Edited by Dengming Xiao and Krishnaswamy Sankaran

Abstract

Touchscreens are versatile devices that can display visual content and receive touch input, but they lack the ability to provide programmable tactile feedback. This limitation has been addressed by a family of approaches collectively called surface haptics technology. This technology modulates the friction between a user’s fingertip and a touchscreen surface to create different tactile sensations as the finger explores the touchscreen. This functionality enables the user to see and feel digital content simultaneously, leading to improved usability and user experience. One major approach in surface haptics relies on the electrostatic force induced between the finger and an insulating surface on the touchscreen by supplying a high AC voltage. The use of AC also induces a vibrational sensation, called electrovibration, in the user. Electrostatic friction displays require only electrical components and provide uniform friction over the screen. This tactile feedback technology not only allows easy and lightweight integration into touchscreen devices but also provides dynamic, rich, and satisfying user interfaces. In this chapter, we review the fundamental operation of electrovibration technology as well as the applications that have been built upon it.

Keywords

  • electrostatic display
  • variable friction display
  • electrovibration technology
  • surface haptics
  • tactile rendering
  • texture rendering

1. Introduction

Among the five senses, touch is the most fundamental one we are equipped with from the moment we enter this world. Even newborn babies know how to use their sense of touch to interact with their surrounding environment. Many of the typical tasks around us require touch; without it, even a very basic task would be challenging to accomplish. Just imagine how difficult it would be to grab any object if you could not feel its shape and weight or determine the amount of force you need to apply to hold it. Touch is very important to human beings, and we rely on our sense of touch more than we think we do [1].

Modern technologies in this digital era have added new interactive agents around us that require our touch input. Touchscreen consumer electronics such as smartphones and tablet devices are among them. They are versatile devices that display visual content and take touch input simultaneously. Smartphones, in particular, have become an indispensable part of our daily lives. Users spend a significant amount of time interacting with the digital content on their mobile phones, so equipping such devices with some form of touch feedback seemed a natural course of technological development. However, despite technological advances, these devices lack the ability to provide programmed tactile feedback, which can be essential for more natural and intuitive interaction. At best, they provide simple monotonic vibration patterns in response to the user’s touch input. This is neither appealing nor satisfactory given the expectations users have of such modern devices [1].

With the introduction of variable friction displays, this limitation has been addressed by technologies collectively called surface haptics. These technologies modulate the friction between a user’s fingertip and a touchscreen surface in order to create a variety of tactile sensations as the finger explores the touchscreen. This functionality allows the user to see and feel digital content simultaneously with richer haptic information, leading to improved user experience and/or usability. There exist two major approaches in surface haptics: electrovibration and ultrasonic vibration. Whereas the former increases the surface friction by modulating an attractive electrostatic force, the latter decreases the friction by vibrating the surface at an ultrasonic frequency and creating an air gap. Electrovibration displays have the advantages that they require only electrical components and that the friction can be controlled uniformly over the screen, which is particularly attractive for mobile devices provided that adequate amplifiers are available [2].

The rest of the chapter is organized as follows. In the next section, a brief overview of the fundamental operation of electrovibration technology is given. Next, the literature is reviewed for the studies and applications that have been built upon it. In the final section, conclusions and future remarks are provided.

2. Electrovibration technology

The earliest known observation of electrical attraction between human skin and a charged surface was made by Gray in 1875 [3, 4]. Forgotten for a while, a similar phenomenon was rediscovered and named electroadhesion by Johnsen and Rahbek in 1923 [4, 5]. In 1953, Mallinckrodt et al. again reported a rubber-like sensation when a coated metallic surface connected to a 110-V power line was touched by a grounded finger [4, 6]. This phenomenon was named electrovibration by Grimnes in 1983, who explained its principle of operation based on Coulomb’s electrostatic force [7]. Electrovibration is due to the electrostatic attraction force between two conductive plates separated by a dielectric. When the finger scans an insulated electrode, a capacitor is formed between the electrode and the conductive tissue under the skin [7] (Figure 1). Exciting the electrode with a periodic voltage induces an electrostatic attraction, and this increases the friction force between the surface and the moving finger.

Figure 1.

Interaction between the finger, the insulating part of the skin (stratum corneum), and the conductive plate.

The induced friction is perceived by the mechanoreceptors in the fingertip skin. In general, mechanoreceptors are responsible for perceiving sensations such as pressure, vibration, and texture. There are four types of them in hairless skin: Merkel discs, Meissner’s corpuscles, Ruffini corpuscles, and Pacinian corpuscles, as shown in Figure 2. They are categorized into fast-adapting (Pacinian and Meissner) and slow-adapting (Merkel and Ruffini) receptors. The former detect small and fast changes such as surface roughness, while the latter detect static stimuli such as pressure. It has been shown that electrovibration is primarily perceived through the Pacinian channel [9].

Figure 2.

Touch mechanoreceptors in the hairless (glabrous) skin of the human fingertip [8].

More precisely, when a potential is applied, the electrostatic force, Fe = ϵAV²/(2d²), compresses the stratum corneum, where A is the contact area, d is the thickness of the stratum corneum, V is the instantaneous potential difference, and ϵ is the dielectric constant. Because there are no nerve endings in the stratum corneum, this compression is not sensed directly. When the skin moves along the metal electrode, another force, perpendicular to the compressive force, arises. This frictional tangential force is given by Ft = μ(Fe + Fn), where μ is the coefficient of friction and Fn is the contact pressure (normal force) exerted by the human body. The coefficient μ thus determines how much of the perpendicular compressive force is transferred into the tangential frictional force.
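
To make the relationship above concrete, the following short Python sketch evaluates both forces for one set of illustrative parameters. The contact area, stratum corneum thickness, relative permittivity, applied voltage, pressing force, and friction coefficient are all assumed values chosen for demonstration only, not measured fingertip properties.

# A minimal numerical sketch of the two forces described above. All parameter
# values are illustrative assumptions, not measured fingertip properties.
EPS0 = 8.854e-12  # vacuum permittivity (F/m)

def electrostatic_force(voltage, area, thickness, eps_r):
    """Fe = eps * A * V^2 / (2 * d^2): attraction across the stratum corneum."""
    return eps_r * EPS0 * area * voltage**2 / (2.0 * thickness**2)

def tangential_friction(f_e, f_n, mu):
    """Ft = mu * (Fe + Fn): friction felt by the laterally moving finger."""
    return mu * (f_e + f_n)

# Assumed example: 1 cm^2 contact, 200 um stratum corneum, relative
# permittivity 100, 200 V peak, 0.5 N pressing force, friction coefficient 0.8.
f_e = electrostatic_force(voltage=200.0, area=1e-4, thickness=200e-6, eps_r=100.0)
f_t = tangential_friction(f_e, f_n=0.5, mu=0.8)
print(f"electrostatic force: {f_e * 1e3:.1f} mN, tangential friction: {f_t:.2f} N")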

This electrostatic stimulation was introduced into a tactile display by Strong and Troxel [10]. They developed the first electrostatic display using a stimulator array consisting of a large number of small electrodes. They reported that the intensity of the perceived vibration was mainly determined by the peak applied voltage. Later, a polyimide-on-silicon electrostatic fingertip tactile display was fabricated with 49 electrodes arranged in a square array [11]. The authors conducted experiments to assess the intensity and spatial resolution of the tactile percepts. In a following study, its application to presenting various spatial tactile patterns, such as lines, triangles, squares, and circles, to visually impaired users was investigated [12]. In all these works, the dryness of the fingertip is emphasized as the key factor in maintaining the percept; a small amount of sweat could cause the percept to fade or disappear. Because of finger perspiration, this direct method has difficulty providing stable stimulation, and indirect stimulation was suggested as a solution. Yamamoto et al. built a display with a thin slider film between electrostatic stator electrodes and the fingertip for presenting surface roughness [13]. In another work, multiple contact pads were used for multi-finger interaction with a large electrostatic display [14]. This was mainly to address finger perspiration during direct interaction and also to create a larger force by applying a higher voltage.

Electrovibration regained attention in 2010 after a collaboration between Disney Research and Carnegie Mellon University yielded a system for rendering 3D textures on an electrovibration touchscreen. Called TeslaTouch [15], the system could deliver variable friction to the user’s sliding finger by modifying the amplitude and frequency of the excitation signal. Implemented on top of a tablet computer, it allowed a user to perceive real-time tactile feedback corresponding to the displayed digital content. Different tactile effects could be generated, mimicking surface geometry such as bumps and ridges or surface texture such as frictional patterns, to enhance the user experience of interacting with the objects in the scene.

The core of TeslaTouch is a transparent capacitive touch panel (MicroTouch, 3M, USA) driven by a high-voltage signal to modulate the friction on a sliding finger. The panel is made of a thick glass layer on the bottom, a transparent electrode (indium tin oxide; ITO) in the middle, and a thin insulator layer on the top. In the usual setup, the electrode is excited by a high AC voltage, and the human body is electrically grounded. A major advantage of TeslaTouch is that the capacitive panel is a commercial off-the-shelf product that requires only an additional high-voltage amplifier for proper operation. The same panel has been used in electrovibration displays by other groups [16, 17, 18, 19, 20, 21, 22]. Radivojevic et al. at Nokia introduced a flexible and bendable version by replacing the ITO with graphene [23].

While TeslaTouch was mainly designed for desktop applications, Senseg, a company in Finland, developed Tixel [24], a transparent electrostatic film targeting handheld devices. The touch panel is made of transparent electrodes on a glass plate coated with an insulating layer. By applying a periodic voltage to the electrodes via the connections used for sensing a finger’s position on the screen, the researchers were able to induce a charge in a finger dragged along the surface. By changing the amplitude and frequency of the applied voltage, the surface can be made to feel as though it is bumpy, rough, sticky, or vibrating. The major difference from TeslaTouch is the specially designed control circuit that produces the sensations.

The tactile experience comes from two components: a coating layered atop the touchscreen and the electronics that modulate the electrostatic field and produce textures. The Tixel is the means by which Senseg’s technology transmits the electrovibration stimulus. It is an ultrathin, durable coating on the touch interface that outputs tactile effects. The hardware inside a device modulates the signal for varied intensities and types of tactile effects and provides accurate spatial resolution over the entire Tixel surface area.

Senseg later introduced a short-lived commercial product called Feelscreen, a 7″ Android tablet overlaid with Tixel, which was on the market between 2014 and 2016. Feelscreen has been used in several projects, such as 3D shape rendering [25], texture gradients [26], and visual and haptic latency [27]. At the moment, Tanvas [28], a startup company in the USA, is commercializing similar products on a larger 10″ tablet with some improvements, such as generating stronger friction forces and not requiring an external power supply.

Other researchers developed their own electrovibration displays without using the 3M capacitive touch panel. Pyo et al. built a tactile display that provides both electrovibration and mechanical vibration on a large surface [29]. They fabricated an insulated ITO electrode on top of an electrostatic parallel-plate actuator, both operating based on the electrostatic principle. A nontransparent electrostatic friction display was also developed in [30, 31] using an aluminum plate covered with a thin plastic insulator film.

These displays do not support multi-touch or localized friction modulation; all fingers in contact with the surface experience the same sensation. This issue was addressed by several prototypes presenting local stimulation. For example, a display panel was developed with multiple horizontal and vertical ITO electrodes in a grid, enabling localized stimulation at the region where the vertical and horizontal electrodes cross [32]. In [14], a multi-finger electrostatic display was developed consisting of a transparent electrode and multiple contact pads on which users place their fingers. Applying different voltages to the pads while electrically grounding the transparent electrode induces different frictional stimuli on the multiple fingers.

The relationship between the input signal and the output friction in electrostatic friction displays is not clearly understood, and a number of studies have shown great interest in defining such a relationship. Researchers have worked on this topic either by measuring friction forces using a tribometer [16, 31] or by estimating perceived intensities in psychophysical experiments [17, 33]. For instance, Meyer et al. [16] developed a tribometer to make precise measurements of finger friction and confirmed the expected square-law dependence of frictional force on driving voltage. They also showed a linear mapping between friction and normal force, confirming the Coulombic model of dry friction. Using a six-level subjective rating of effect strength, Wijekoon et al. showed a significant correlation (0.8) between signal amplitude and perceived intensity but no correlation between frequency and perceived intensity [33]. In [17], participants assigned a number between 0 and 100 to the subjective friction intensity. A linear fit in log-log scale was observed in the normalized results relating applied voltage amplitude to perceived friction force intensity.
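
As a simple illustration of how such a power-law (log-log) relation can be obtained, the sketch below fits a straight line in log-log coordinates to a set of ratings. The voltage levels and ratings are synthetic placeholders, not data from [17] or [33].

import numpy as np

# Fit a straight line in log-log space relating voltage amplitude to perceived
# friction intensity. The ratings are synthetic placeholders for illustration.
voltage = np.array([40.0, 60.0, 80.0, 100.0, 120.0])   # V, assumed test levels
rating = np.array([12.0, 25.0, 41.0, 60.0, 83.0])      # 0-100 subjective scale

slope, intercept = np.polyfit(np.log10(voltage), np.log10(rating), 1)
# Equivalent power law: intensity ~ (10**intercept) * voltage**slope
print(f"power-law exponent: {slope:.2f}, scale factor: {10**intercept:.3g}")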

In addition to fabrication, various properties of electrovibration have been investigated. The polarity effect of the actuation signal was studied in [34], reporting that the tactile sensation is more sensitive to negative than to positive pulses. Meyer et al. showed the expected square-law dependence of frictional force, measured by a tribometer, on actuation voltage [16]. A similar approach was taken by Vezzoli et al. to develop a model of the electrovibration effect that considers frequency dependence [31]. Kim et al. proposed a current control method to provide a more uniform perceived intensity of electrovibration [19]. In another work comparing two actuation signals, it was reported that square waves are more detectable than sine waves at frequencies lower than 60 Hz, while the two are equally detectable at higher frequencies [35]. Testing three methods (amplitude modulation, adding a DC offset, and their combination), Kang et al. investigated low-voltage operation of an electrovibration display [22]. They showed that all methods increased the dynamic friction force, while only the DC offset increased the static friction force.

3. Applications

To perceive the friction force generated on an electrovibration display, one needs to drag or slide a finger over the surface. While this type of interaction is natural and intuitive for most handheld touchscreen devices, it limits the range of applications that can benefit from this functionality. It is worth recalling that the two key attributes of real and simulated objects are shape (surface geometry) and texture (simply put, surface frictional properties) [36]. Addressing these two attributes separately, in this section we review the relevant work in the literature.

3.1 Rendering surface geometry

Rendering 3D objects on a flat surface, either using a haptic interface or a variable friction display, has not been addressed much in the literature. In an early work on the haptic perception of curvature, Gordon and Morison showed that the gradient is an effective stimulus for curvature perception and that humans rely on local curvature when perceiving a surface [37]. Later, Minsky et al. demonstrated that tangential force alone can be sufficient for rendering surface texture, assuming the texture is made of little bumps [38]. They introduced the gradient technique to create the illusion of bumps and valleys using a 2D force-feedback joystick. As the user moves the joystick up a bump, the motion is opposed by a spring force proportional to the height of the bump. This gives the sense that it is difficult to move to the top of the bump (springs resist being stretched) and easy to fall off the bump back into a lower region of the simulated surface (springs like to revert to a short length). For fine-grained surfaces, the joystick spring forces can be computed from a local gradient: as the user moves the joystick over the virtual surface, the change in height in the direction of motion is noted, and virtual springs oppose the motion “up” the sides of each tiny bump. Thus, the spring forces applied to the hand are computed from the local gradients of the surface height.
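
A minimal sketch of this gradient technique is given below: a one-dimensional Gaussian bump stands in for the height field, and the lateral force opposing uphill motion is proportional to the local slope. The bump shape and the gain are hypothetical choices, not parameters from [38].

import numpy as np

# Gradient technique sketch: the lateral force opposes motion "up" the bump and
# is proportional to the local slope of the height field. Shape and gain are
# hypothetical.
x = np.linspace(0.0, 1.0, 500)                          # lateral position
height = np.exp(-((x - 0.5) ** 2) / (2 * 0.05 ** 2))    # single Gaussian bump
slope = np.gradient(height, x)                          # local gradient dh/dx
K = 0.1                                                 # assumed force gain

def lateral_force(position):
    """F = -K * dh/dx: resistive on the uphill side, assistive on the downhill side."""
    return -K * np.interp(position, x, slope)

print(f"force at x=0.40 (uphill):   {lateral_force(0.40):+.2f} N")
print(f"force at x=0.60 (downhill): {lateral_force(0.60):+.2f} N")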

Based on the gradient technique, an early attempt to create the haptic illusion of a non-flat shape on a nominally flat surface was introduced in [39] using a force-shading algorithm. Later, continuing their earlier work [40], Robles-De-La-Torre and Hayward demonstrated that in the active exploration of a physical shape, the lateral force applied to the sliding finger plays the main role in the perception of shape [41]. They investigated the accuracy of physical shape recognition using a one-degree-of-freedom (DoF) force-feedback device without visual cues. Different combinations of physical and virtual geometries (bump, hole, and flat surface), e.g., a virtual bump laid on a physically flat surface, were presented to participants. The virtual shapes were rendered using lateral force only. Participants could accurately identify the virtual shapes in all conditions.

This study was foundational to the gradient-based algorithm of Kim et al. [17] for rendering 3D features on a touchscreen using electrovibration. In their work, a psychophysical perceptual model, subjectively relating the perceived friction to the applied voltage, was formulated. The model was a straight line in log-log scale, fitted over the average users’ ratings of the perceived friction intensity on a scale of 0-100. The model was then utilized to modulate friction and render three lateral force profiles: height, slope, and rectangular. They compared users’ preference for the three types of force profile for a visual bump displayed on the screen. Results indicated that the slope profile best matched the visual bump. They generalized this finding to a 2D gradient-based rendering algorithm for 3D features and applied the algorithm to many user interface examples.
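
The sketch below illustrates how such a perceptual model can be inverted to choose a drive amplitude for a slope-proportional profile: the local slope is mapped to a target friction intensity, and the power-law model is solved for the voltage. The power-law parameters, gain, and voltage limit are assumptions for illustration, not the values reported in [17].

import numpy as np

# Invert an assumed power-law perceptual model I = a * V**b to pick the drive
# amplitude that realizes a slope-proportional friction intensity. The values
# of a, b, the gain, and the voltage limit are illustrative assumptions.
a, b = 0.02, 2.0        # assumed perceptual model parameters
V_MAX = 120.0           # assumed amplifier limit (volts)

def voltage_for_slope(local_slope, gain=50.0):
    """Map |slope| to a target intensity, then solve I = a * V**b for V."""
    target = np.clip(gain * abs(local_slope), 0.0, a * V_MAX**b)
    return (target / a) ** (1.0 / b)

for s in (0.0, 0.5, 2.0):
    print(f"slope {s:4.1f} -> drive amplitude {voltage_for_slope(s):6.1f} V")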

In Ref. [25], the authors presented an effective rendering method for improving the recognition of 3D features rendered on a touchscreen using an electrostatic friction display. First, a formative user study was carried out using a basic gradient-based algorithm adapted from [41] in order to assess users’ ability to recognize primitive 3D shapes based on lateral force feedback provided by an electrostatic tablet and a force-feedback interface. Experimental results demonstrated that users are not able to associate electrovibration patterns with geometric shapes in an absolute manner without contextual information. However, when such guidance was given, participants achieved moderate recognition. They then extended the basic algorithm to support general 3D mesh objects. The generalized algorithm computes the frictional rendering force by estimating the gradient at the touch point and also emphasizes sharp edges on the surface by rendering perceptually salient friction effects. Lastly, they conducted a summative user study to evaluate the effectiveness of the proposed shape rendering algorithm in reducing the visual uncertainty in 3D shape perception. They found that when frictional feedback was provided, correct recognition performance notably increased compared with when only visual rendering was presented.

3.2 Rendering surface texture

Compared to the problem of rendering 3D geometries on a flat electrostatic display, rendering surface textures seems more feasible and intuitive on such displays. As mentioned earlier, depending on the actuation signal, an electrovibration display generates different textural patterns. A simple illustration is given in Figure 3. On one hand, a sinusoidal actuation signal creates a smooth bumpiness underneath the sliding finger. On the other hand, a square-wave signal generates a rough and edgy feeling. More complicated textures can be re-created using a suitably complex signal.

Figure 3.

How the input actuation signal makes the perceived friction different.
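
The two waveforms contrasted in Figure 3 can be generated as below; the frequency, amplitude, and sample rate are illustrative choices only.

import numpy as np

# Two actuation signals as in Figure 3: a sinusoid for a smooth, bumpy feel and
# a square wave for a sharper, edgier feel. All numbers are illustrative.
fs, duration = 10_000, 0.05               # sample rate (Hz) and length (s)
t = np.arange(0.0, duration, 1.0 / fs)
amplitude = 100.0                         # volts, before the high-voltage amplifier

sine_drive = amplitude * np.sin(2 * np.pi * 80.0 * t)            # smooth texture
square_drive = amplitude * np.sign(np.sin(2 * np.pi * 80.0 * t)) # edgy texture
# Since friction grows roughly with the squared voltage, a pure sinusoid is
# perceived at about twice its drive frequency.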

Therefore, the type of input signal, its waveform, its amplitude, and its frequency content play a significant role in the generated textural patterns. Looking at the problem from a systems standpoint, knowing the input-output relationship of the display is therefore vital. As stated earlier, several efforts have been made to model the display and to draw a relationship between the input actuation voltage and the output friction force. However, aside from the fact that the output force is roughly proportional to the squared input voltage, there exists no reliable general model covering all types of input signals across a wide range of frequencies. This suggests an alternative method, so-called data-driven texture rendering. Data-driven, or measurement-based, haptic rendering is a general approach that uses recordings from real objects to generate realistic haptic feedback in virtual environments [42, 43]. It can be either parametric and physics-based, optimizing the parameters of a predefined model, or nonparametric and generic. It is usually accompanied by a generic interpolation scheme to handle conditions that were not measured. It provides a unified framework for capturing and displaying a diverse range of physical phenomena without requiring simulations of complex contact dynamics. This data-driven approach enables researchers to bypass the complex step of hand-tuning a dynamic simulation of the target interaction to try to match a haptic sensation. Instead, the goal of the modeling process is to capture the output response of the system (e.g., force and acceleration) given some set of user inputs (e.g., position, velocity, and force). Such methods shift the focus from reproducing the physics of the interaction to reproducing the real sensations felt by the user, and thus they have been largely successful at realistic haptic simulation [44].
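
As a minimal nonparametric example of this idea, the sketch below stores a recorded response feature on a grid of interaction conditions (normal force and scanning velocity) and interpolates it at an unmeasured condition. The grid points and values are placeholders, not measurements.

import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Nonparametric data-driven sketch: recorded responses indexed by the user's
# interaction conditions, with interpolation for unmeasured conditions.
forces = np.array([0.2, 0.5, 1.0])        # N, recording conditions (placeholder)
speeds = np.array([0.05, 0.10, 0.20])     # m/s, recording conditions (placeholder)
recorded = np.array([[0.8, 1.1, 1.6],     # placeholder response feature
                     [1.0, 1.4, 2.0],     # rows: normal force, cols: speed
                     [1.3, 1.8, 2.6]])

lookup = RegularGridInterpolator((forces, speeds), recorded)
print(lookup([[0.35, 0.12]]))             # interpolated response at a new condition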

While the problem of data-driven haptic texture rendering has been fairly addressed in the literature using conventional or customized haptic interfaces [45, 46, 47, 48, 49, 50, 51], little work has been done on variable friction displays, particularly those based on electrostatic attraction.

An electrostatic friction display creates clearly perceptible stimuli when the surface is laterally scanned, but not when the finger is stationary. This fundamental limitation has confined the application of electrostatic friction displays mostly to texture rendering. In one of the few relevant works [18], Ilkhani et al. proposed a data-driven texture rendering method by recording accelerations from three real materials and playing them back on an electrovibration display. Their automated data collection was done under a single constrained condition (contact force 0.35 N and scanning velocity 0.74 m/s) using a servomotor controlled by an Arduino Uno. They conducted a user study to compare the perceived surface roughness generated with their data-driven signals and with square-wave signals, where the frequency of each square wave was set to the main frequency of the corresponding acceleration. Using a visual indicator, they had the user keep a constant scanning velocity, but one not equal to the data collection velocity and presumably much slower than it. In addition, there is no mention of the contact force during experimentation. Nevertheless, they reported a higher perceived similarity between the data-driven textures and the real ones than for the square-wave patterns. In their extended work [52], they applied the same approach to data from the Penn Haptic Texture Toolkit [53] and performed an MDS analysis to create a perceptual space and to extract the underlying dimensions of the textures. Their results showed roughness and stickiness as the primary dimensions of texture perception.

In Ref. [54], a data-driven neural network for realistic texture rendering on an electrovibration display is proposed. First, a motorized linear tribometer was developed to collect the lateral frictional forces from textured surfaces under various scanning velocities and normal forces. Then, an inverse dynamics model of the display was created to describe its output-input relationship using nonlinear autoregressive with external input (NARX) neural networks. Forces resulting from applying a full-band pseudorandom binary signal (PRBS) to the display were used to train each network under the given experimental condition. A comparison between the real and virtual forces in the frequency domain shows promising results and reveals the capabilities and limitations of the proposed technique.
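
As a rough, self-contained stand-in for this inverse-modeling idea, the sketch below fits a linear ARX regressor (lagged desired forces and lagged past voltages as inputs) with least squares on synthetic signals. It is not the NARX neural network of [54]; the toy forward dynamics and all signals are invented for illustration.

import numpy as np

# Simplified linear ARX stand-in for the inverse dynamics model: regress the
# drive voltage on lagged forces and lagged past voltages. Synthetic data only.
rng = np.random.default_rng(0)
N, LAGS = 2000, 4
voltage = rng.uniform(-1.0, 1.0, N)                            # pseudorandom drive
force = np.convolve(voltage**2, np.ones(5) / 5, mode="same")   # toy forward dynamics

def lagged(signal, lags):
    """Stack the signal delayed by 1..lags samples as regressor columns."""
    return np.column_stack([np.roll(signal, k) for k in range(1, lags + 1)])

X = np.hstack([lagged(force, LAGS), lagged(voltage, LAGS)])[LAGS:]
y = voltage[LAGS:]
weights, *_ = np.linalg.lstsq(X, y, rcond=None)                # inverse-model fit
# At render time the fitted weights would map a desired force history (and past
# commanded voltages) to the next voltage sample.
print("fitted weights:", np.round(weights, 3))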

4. Conclusions

In this chapter, we have introduced the concept behind electrostatic friction displays (also called electrovibration displays) and their potential applications for shape and texture rendering. The potential uses of the technique are exciting. Electrovibration could make interactive textbooks more engaging on tablets, allowing students to explore the three-dimensional features of an object directly on each page. Software for iOS or Android could be augmented with unique haptic feedback for button presses and swipe gestures. Games could incorporate electrovibration to add a new layer of interactivity to touch controls. With some smart design, it could also improve the functionality of touchscreens used in other fields. For instance, the use of touchscreens in automobiles to navigate a map or control music playback encourages drivers to avert their eyes from the road. With an appropriate design, the same control functionality could possibly be delivered using variable touch-based feedback without the need to take our eyes off the road. Given how common capacitive touchscreens are, the addition of richer tactile feedback through electrovibration promises to enhance almost all of our interactions with digital content.

While the technique has a lot of potential, the form factor remains a primary barrier to adoption. Generating the required high alternating voltage results in a much bulkier device than an ordinary capacitive touchscreen. As the technology sees more frequent use, however, there may be technological developments that allow more smartphone and tablet manufacturers to feature electrovibration without sacrificing the compactness of their designs.

Acknowledgments

I would like to thank Prof. Seungmoon Choi for his extensive personal and professional guidance teaching me a great deal about both scientific research and life in general. As my teacher and mentor, he has taught me more than I could ever give him credit for here. He has shown me, by his example, what a good scientist (and person) should be. I also would like to thank Dr. Jin Ryung Kim for his constant support and encouragement.

Conflict of interest

I certify that I have no affiliations with or involvement in any organization or entity with any financial interest (such as honoraria; educational grants; participation in speakers’ bureaus; membership, employment, consultancies, stock ownership, or other equity interest; and expert testimony or patent-licensing arrangements) or nonfinancial interest (such as personal or professional relationships, affiliations, knowledge, or beliefs) in the subject matter or materials discussed in this manuscript.

References

  1. 1. Haghighi Osgouei R. Haptic Rendering of Surface 3D Curvature and Texture on Electrostatic Friction Display. 2017
  2. 2. Haghighi Osgouei R, Kim JR, Choi S. Identification of primitive geometrical shapes rendered using electrostatic friction display. In: IEEE Haptics Symposium (HAPTICS). 2016. pp. 198-204
  3. 3. Gray E, inventor. Improvement in electric telegraphs for transmitting musical tones. United States patent US 166,096; 1875
  4. 4. Haghighi Osgouei R, Kim JR, Choi S. Data-driven texture modeling and rendering on electrovibration display. IEEE Transactions on Haptics. 2019. DOI: 10.1109/TOH.2019.2932990
  5. 5. Johnsen A, Rahbek K. A physical phenomenon and its applications to telegraphy, telephony, etc. Journal of the Institution of Electrical Engineers. 1923;61(320):713-725
  6. 6. Mallinckrodt E, Hughes AL, Sleator W Jr. Perception by the skin of electrically induced vibrations. Science. 1953;118:277-278. Available from: https://doi.org/10.1126/science.118.3062.277
  7. 7. Grimnes S. Electrovibration, cutaneous sensation of microampere current. Acta Physiologica Scandinavica. 1983;118(1):19-25
  8. 8. Darian-Smith I. The sense of touch: Performance and peripheral neural processes. In: Comprehensive Physiology. Wiley [Online Library]. 2011. pp. 739-788
  9. 9. Vardar Y, Güçlü B, Basdogan C. Effect of waveform on tactile perception by electrovibration displayed on touch screens. IEEE Transactions on Haptics. 2017;10(4):488-499
  10. 10. Strong RM, Troxel DE. An electrotactile display. IEEE Transactions on Man-Machine Systems. 1970;11(1):72-79
  11. 11. Beebe DJ, Hymel CM, Kaczmarek KA, Tyler ME. A polyimide-on-silicon electrostatic fingertip tactile display. In: Engineering in Medicine and Biology Society, 1995, IEEE 17th Annual Conference. Vol. 2. 1995. pp. 1545-1546
  12. 12. Tang H, Beebe DJ. A microfabricated electrostatic haptic display for persons with visual impairments. IEEE Transactions on Rehabilitation Engineering. 1998;6(3):241-248
  13. 13. Yamamoto A, Ishii T, Higuchi T. Electrostatic tactile display for presenting surface roughness sensation. In: IEEE International Conference on Industrial Technology. Vol. 2. 2003. pp. 680-684
  14. 14. Nakamura T, Yamamoto A. Multi-finger electrostatic passive haptic feedback on a visual display. In: World Haptics Conference (WHC). 2013. pp. 37-42
  15. 15. Bau O, Poupyrev I, Israr A, Harrison C. TeslaTouch: Electrovibration for touch surfaces. In: Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology. 2010. pp. 283-292
  16. 16. Meyer DJ, Peshkin MA, Colgate JE. Fingertip friction modulation due to electrostatic attraction. In: World Haptics Conference (WHC). 2013. pp. 43-48
  17. 17. Kim SC, Israr A, Poupyrev I. Tactile rendering of 3D features on touch surfaces. In: Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology. 2013. pp. 531-538
  18. 18. Ilkhani G, Aziziaghdam M, Samur E. Data-driven texture rendering with electrostatic attraction. In: International Conference on Human Haptic Sensing and Touch Enabled Computer Applications. 2014. pp. 496-504
  19. 19. Kim H, Kang J, Kim KD, Lim KM, Ryu J. Method for providing electrovibration with uniform intensity. IEEE Transactions on Haptics. 2015;8(4):492-496
  20. 20. Zhang Y, Harrison C. Quantifying the targeting performance benefit of electrostatic haptic feedback on touchscreens. In: Proceedings of the International Conference on Interactive Tabletops & Surfaces. 2015. pp. 43-46
  21. 21. Wang Q, Ren X, Sarcar S, Sun X. EV-pen: Leveraging electrovibration haptic feedback in pen interaction. In: Proceedings of the ACM on Interactive Surfaces and Spaces. 2016. pp. 57-66
  22. 22. Kang J, Kim H, Choi S, Kim KD, Ryu J. Investigation on low voltage operation of electrovibration display. IEEE Transactions on Haptics. 2017;10(3):371-381
  23. 23. Radivojevic Z, Beecher P, Bower C, Haque S, Andrew P, Hasan T, et al. Electrotactile touch surface by using transparent graphene. In: Proceedings of the Virtual Reality International Conference. 2012. p. 16
  24. 24. Linjama J, Mäkinen V. E-sense screen: Novel haptic display with capacitive electrosensory interface. In: HAID, 4th Workshop for Haptic and Audio Interaction Design. 2009
  25. 25. Haghighi Osgouei R, Kim JR, Choi S. Improving 3D shape recognition with electrostatic friction display. IEEE Transactions on Haptics. 2017;10(4):533-544
  26. 26. Klatzky RL, Adkins S, Bodas P, Haghighi Osgouei R, Choi S, Tan HZ. Perceiving texture gradients on an electrostatic friction display. In: World Haptics Conference (WHC). 2017. pp. 154-158
  27. 27. Kim JR, Haghighi Osgouei R, Choi S. Effects of visual and haptic latency on touchscreen interaction: A case study using painting task. In: World Haptics Conference (WHC). 2017. pp. 159-164
  28. 28. TanvasTouch®. Available from: https://tanvas.co [Accessed: 02 September 2020]
  29. 29. Pyo D, Ryu S, Kim SC, Kwon DS. A new surface display for 3D haptic rendering. In: International Conference on Human Haptic Sensing and Touch Enabled Computer Applications. 2014. pp. 487-495
  30. 30. Giraud F, Amberg M, Lemaire-Semail B. Merging two tactile stimulation principles: Electrovibration and squeeze film effect. In: World Haptics Conference (WHC). 2013. pp. 199-203
  31. 31. Vezzoli E, Amberg M, Giraud F, Lemaire-Semail B. Electrovibration modeling analysis. In: International Conference on Human Haptic Sensing and Touch Enabled Computer Applications. 2014. pp. 369-376
  32. 32. Haga H, Yoshinaga K, Yanase J, Sugimoto D, Takatori K, Asada H. Electrostatic tactile display using beat phenomenon of voltage waveforms. In: SID Symposium Digest of Technical Papers. Vol. 45. No. 1. 2014. pp. 623-626
  33. 33. Wijekoon D, Cecchinato ME, Hoggan E, Linjama J. Electrostatic modulated friction as tactile feedback: Intensity perception. In: International Conference on Human Haptic Sensing and Touch Enabled Computer Applications. 2012. pp. 613-624
  34. 34. Kaczmarek KA, Nammi K, Agarwal AK, Tyler ME, Haase SJ, Beebe DJ. Polarity effect in electrovibration for tactile display. IEEE Transactions on Biomedical Engineering. 2006;53(10):2047-2054
  35. 35. Vardar Y, Güçlü B, Basdogan C. Effect of waveform in haptic perception of electrovibration on touchscreens. In: International Conference on Human Haptic Sensing and Touch Enabled Computer Applications. 2016. pp. 190-203
  36. 36. Campion G, Hayward V. Fundamental limits in the rendering of virtual haptic textures. In: First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. World Haptics Conference. Pisa, Italy; 2005. pp. 263-270
  37. 37. Gordon IE, Morison V. The haptic perception of curvature. Perception & Psychophysics. 1982;31(5):446-450
  38. 38. Minsky M, Ming OY, Steele O, Brooks Jr FP, Behensky M. Feeling and seeing: Issues in force display. In: Proceedings of the 1990 Symposium on Interactive 3D Graphics (I3D ’90). Association for Computing Machinery. New York, NY, USA; 1990:235-241. Available from: https://doi.org/10.1145/91385.91451
  39. 39. Morganbesser HB, Srinivasan MA. Force shading for shape perception in haptic virtual environments. In: Proceedings of the 5th Annual Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. Vol. 58. Atlanta, GA, DSC: ASME/IMECE; 1996
  40. 40. Robles-De-La-Torre G, Hayward V. Virtual surfaces and haptic shape perception. In: Proceedings ASME IMECE Symposium on Haptic Interfaces for Virtual Environments and Teleoperator Systems. Vol. 69. 2000. p. 2
  41. 41. Robles-De-La-Torre G, Hayward V. Force can overcome object geometry in the perception of shape through active touch. Nature. 2001;412(6845):445
  42. 42. Hover R, Kósa G, Szekly G, Harders M. Data-driven haptic rendering—From viscous fluids to visco-elastic solids. IEEE Transactions on Haptics. 2009;2(1):15-27
  43. 43. Yim S, Jeon S, Choi S. Data-driven haptic modeling and rendering of viscoelastic and frictional responses of deformable objects. IEEE Transactions on Haptics. 2016;9(4):548-559
  44. 44. Romano JM, Yoshioka T, Kuchenbecker KJ. Automatic filter design for synthesis of haptic textures from recorded acceleration data. In: IEEE International Conference on Robotics and Automation (ICRA). 2010. pp. 1815-1821
  45. 45. Choi S, Tan HZ. Toward realistic haptic rendering of surface textures. IEEE Computer Graphics and Applications. 2004;24(2):40-47
  46. 46. Andrews S, Lang J. Haptic texturing based on real-world samples. In: IEEE International Workshop on Haptic, Audio and Visual Environments and Games (HAVE). 2007. pp. 142-147
  47. 47. Campion G. On the synthesis of haptic textures. In: The Synthesis of Three Dimensional Haptic Textures: Geometry, Control, and Psychophysics. London: Springer; 2008. pp. 73-97
  48. 48. Lang J, Andrews S. Measurement-based modeling of contact forces and textures for haptic rendering. IEEE Transactions on Visualization and Computer Graphics. 2011;17(3):380-391
  49. 49. Culbertson H, Unwin J, Kuchenbecker KJ. Modeling and rendering realistic textures from unconstrained tool-surface interactions. IEEE Transactions on Haptics. 2014;3:1
  50. 50. Shin S, Haghighi Osgouei R, Kim KD, Choi S. Data-driven modeling of isotropic haptic textures using frequency-decomposed neural networks. In: World Haptics Conference (WHC). 2015. pp. 131-138
  51. 51. Abdulali A, Jeon S. Data-driven rendering of anisotropic haptic textures. In: International Asia Haptics Conference. 2016. pp. 401-407
  52. 52. Ilkhani G, Aziziaghdam M, Samur E. Data-driven texture rendering on an electrostatic tactile display. International Journal of Human–Computer Interaction. 2017;33(9):756-770
  53. 53. Culbertson H, Lopez Delgado JJ, Kuchenbecker KJ. The Penn Haptic Texture Toolkit for modeling, rendering, and evaluating haptic virtual textures, Departmental Papers (MEAM). 299; 2014
  54. 54. Haghighi Osgouei R, Shin S, Kim JR, Choi S. An inverse neural network model for data-driven texture rendering on electrovibration display. In: IEEE Haptics Symposium (HAPTICS). 2018. pp. 270-277
