
Object-Handling Tasks Based on Active Tactile and Slippage Sensations

By Masahiro Ohka, Hanafiah Bin Yussof and Sukarnur Che Abdullah

Submitted: October 15th 2010. Reviewed: February 14th 2011. Published: June 9th 2011.

DOI: 10.5772/16781


1. Introduction

Many tactile sensors have been developed to enhance robotic manufacturing tasks, such as assembly, disassembly, inspection, and materials handling, as described in several survey papers (Harmon, 1982; Nicholls & Lee, 1989; Ohka, 2009a). In the last decade, progress has been made in tactile sensors by focusing on limited uses, and many practical tactile sensors have gradually appeared. Using micro electro mechanical systems (MEMS) technology, tactile sensors have been developed that incorporate pressure-sensing elements and piezoelectric ceramic actuators into a silicon tip for detecting not only pressure distribution but also the hardness of a target object (Hasegawa et al., 2004). A polyvinylidene difluoride (PVDF) film-based tactile sensor has been developed to measure the hardness of tumors based on a comparison between the obtained sensor output and the input oscillation (Tanaka et al., 2003). A wireless tactile sensor using two-dimensional signal transmission has been developed to be stretched over a large sensing area (Chigusa et al., 2007). An advanced conductive rubber-type tactile sensor has been developed to be mounted on robotic fingers (Shimojo et al., 2004). Furthermore, image-based tactile sensors have been developed using charge-coupled device (CCD) and complementary metal oxide semiconductor (CMOS) cameras together with image data processing, which are mature techniques (Ohka, 1995, 2004, 2005a, 2005b; Kamiyama et al., 2005).

In particular, the three-axis tactile sensor, which is categorized as an image-based tactile sensor, holds the greatest promise for improving manipulation because a robot must detect the distribution not only of normal force but also of slippage force applied to its finger surfaces (Ohka, 1995, 2004, 2005a, 2005b, 2008). In addition to our three-axis tactile sensors, there are several designs of multi-axis force cells based on such physical phenomena as magnetic effects (Hackwood et al., 1986), variations in electrical capacity (Novak, 1989; Hakozaki & Shinoda, 2002), PVDF film (Yamada & Cutkosky, 1994), and a photointerrupter (Borovac et al., 1996).

Our three-axis tactile sensor is based on the principle of an optical waveguide-type tactile sensor (Mott et al., 1984; Tanie et al., 1986; Nicholls et al., 1990; Kaneko et al., 1992; Maekawa et al., 1992) and is composed of an acrylic hemispherical dome, a light source, an array of rubber sensing elements, and a CCD camera (Ohka, 1995, 2004a, 2005a, 2005b, 2008). The silicone rubber sensing element comprises one columnar feeler and eight conical feelers. The contact areas of the conical feelers, which maintain contact with the acrylic dome, detect the three-axis force applied to the tip of the sensing element. Normal and shearing forces are then calculated from the integration and centroid displacement of the grayscale values derived from the conical feelers' contacts.

The tactile sensor is evaluated with a series of experiments using an x-z stage, a rotational stage, and a force gauge. Although we discovered that the relationship between the integrated grayscale value and normal force depends on the sensing element's latitude on the hemispherical surface, the sensitivity is easily corrected according to latitude, and the centroid displacement of the grayscale value is proportional to the shearing force.

To demonstrate the effectiveness of the three-axis tactile sensor, we designed a hand system composed of articulated robotic fingers sensorized with the three-axis tactile sensor (Ohka, 2009b, 2009c). Not only the tri-axial force distribution directly obtained from the tactile sensor but also the time derivative of the shearing force distribution is used in the hand control algorithm: the time derivative of tangential force is defined as slippage; if slippage arises, the grasping force is enhanced to prevent fatal slippage between the finger and the object. In the verification test, the robotic hand twists a bottle cap on completely.

In the following sections, after the optical three-axis tactile sensor is explained, the robotic hand sensorized with the tactile sensors is described. The above cap-twisting task is discussed to show the effectiveness of tri-axial tactile data for robotic control.

2. Optical three-axis tactile sensor

2.1. Sensing principle

2.1.1. Structure of optical tactile sensors

Figure 1 shows a schematic view of the present tactile processing system to explain the sensing principle. The present tactile sensor is composed of a CCD camera, an acrylic dome, a light source, and a computer. The light emitted from the light source is directed into the optical waveguide dome. Contact phenomena are observed as image data, acquired by the CCD camera, and transmitted to the computer to calculate the three-axis force distribution.

Figure 1.

Principle of the three-axis tactile sensor system

In this chapter, we adopt a sensing element comprising a columnar feeler and eight conical feelers, as shown in Fig. 2, because this element showed a wide measuring range and good linearity in a previous study (Ohka, 2004b). Since a single sensing element of the present tactile sensor should carry a heavier load than that of a flat-type tactile sensor, the height of the columnar feeler is reduced from the 5 mm of the flat-type sensor to 3 mm. The sensing elements are made of silicone rubber (KE119, Shinetsu) and are designed so that the conical feelers maintain contact with the acrylic dome while the columnar feelers touch the object. Each columnar feeler features a flange that fits into a counterbore portion in the fixing dome to protect the columnar feeler from horizontal displacement caused by shearing force.

2.1.2. Expressions for sensing element located on vertex

Dome brightness is inhomogeneous because the edge of the dome is illuminated and light converges on its parietal region. Since the optical axis coincides with the center line of the vertex, the apparent image of the contact area changes based on the sensing element’s latitude. Although we must consider the above problems to formulate a series of equations for the three components of force, the most basic sensing element located on the vertex will be considered first.

Figure 2.

Sensing element

Figure 3.

Relationship between spherical and Cartesian coordinates

The coordinate system O-xyz is adopted, as shown in Fig. 3. Based on previous studies (Ohka, 2005), since the grayscale value g(x, y) obtained from the image data is proportional to the pressure p(x, y) caused by contact between the acrylic dome and the conical feelers, the normal force is calculated from the integrated grayscale value G. Additionally, the shearing force is proportional to the centroid displacement of the grayscale value. Therefore, the values F_x, F_y, and F_z are calculated from the integrated grayscale value G and the horizontal displacement of the centroid of the grayscale distribution u = u_x i + u_y j as follows:

$F_x = f_x(u_x)$ (1)

$F_y = f_y(u_y)$ (2)

$F_z = g(G)$ (3)

where i and j are the orthogonal base vectors of the x- and y-axes of the Cartesian coordinate system, respectively, and f_x(x), f_y(x), and g(x) are approximate curves estimated in calibration experiments.
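To make the mapping concrete, the following Python sketch evaluates Eqs. (1)-(3) for one sensing element from its grayscale subimage. The calibration curves fx, fy, and fz (the chapter's f_x, f_y, and g) and the no-load centroid position are assumptions introduced here for illustration; in practice they come from the calibration experiments of Section 2.2.

```python
import numpy as np

def force_components(sub_img, fx, fy, fz, rest_centroid):
    """Evaluate Eqs. (1)-(3) for one sensing element.

    sub_img       : 2-D array of grayscale values g(x, y) for the element
    fx, fy, fz    : calibration curves fitted experimentally (hypothetical
                    callables here)
    rest_centroid : (x, y) centroid of the contact image under no load
    """
    G = float(sub_img.sum())                      # integrated grayscale value
    ys, xs = np.indices(sub_img.shape)            # pixel coordinate grids
    cx = (xs * sub_img).sum() / max(G, 1e-12)     # centroid of grayscale dist.
    cy = (ys * sub_img).sum() / max(G, 1e-12)
    ux = cx - rest_centroid[0]                    # centroid displacement
    uy = cy - rest_centroid[1]
    return fx(ux), fy(uy), fz(G)                  # (Fx, Fy, Fz)
```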

2.1.3. Expressions for sensing elements other than those located on vertex

For sensing elements other than the one located on the vertex, a local coordinate system O_i-x_iy_iz_i is attached to the root of each element, where suffix i denotes the element number. The z_i-axis is taken along the center line of sensing element i, so that its origin lies at the intersection of the center line and the acrylic dome's surface and its direction coincides with the normal direction of the acrylic dome. If the vertex is likened to the North Pole, the directions of the x_i- and y_i-axes are north to south and west to east, respectively. Since the optical axis of the CCD camera coincides with the direction of the z-axis, the information of every tactile element is obtained as an image projected onto the O-xy plane. The obtained image data g(x, y) should therefore be transformed into the modified image g(x_i, y_i), which is assumed to be taken in the negative direction of the z_i-axis attached to each sensing element. The transform expression is derived from the transformation of spherical coordinates to Cartesian coordinates as follows:

$g(x_i, y_i) = g(x, y)/\sin\varphi_i$ (4)

The centroid displacements included in Eqs. (1) and (2), u_x(x, y) and u_y(x, y), should be transformed into u_x(x_i, y_i) and u_y(x_i, y_i) as well. In the same way as Eq. (4), the transform expressions are derived from the transformation of spherical coordinates to Cartesian coordinates as follows:

$u_x(x_i, y_i) = \{u_x(x, y)\cos\phi_i + u_y(x, y)\sin\phi_i\}/\sin\varphi_i$ (5)

$u_y(x_i, y_i) = -u_x(x, y)\sin\phi_i + u_y(x, y)\cos\phi_i$ (6)
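A direct transcription of Eqs. (4)-(6), with the azimuth phi_i and latitude varphi_i of each element taken as known from the dome layout, might look as follows (a sketch under the angle conventions above; the minus sign in Eq. (6) follows the standard rotation reconstructed there):

```python
import numpy as np

def to_local(g_val, ux, uy, phi_i, varphi_i):
    """Transform camera-frame (O-xy) data into the local frame of sensing
    element i via Eqs. (4)-(6).

    phi_i    : azimuth angle of the element
    varphi_i : latitude angle of the element (pi/2 at the vertex)
    """
    s, c = np.sin(phi_i), np.cos(phi_i)
    g_i = g_val / np.sin(varphi_i)                # Eq. (4)
    ux_i = (ux * c + uy * s) / np.sin(varphi_i)   # Eq. (5)
    uy_i = -ux * s + uy * c                       # Eq. (6)
    return g_i, ux_i, uy_i
```

At the vertex itself, varphi_i = pi/2 and phi_i = 0, so the transform reduces to the identity, consistent with Section 2.1.2.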

Figure 4.

Fingertip including three-axis tactile sensor

2.1.4. Design of optical three-axis tactile sensor

Since the tactile sensor essentially requires a lens system, it is difficult to make it thinner; thus, it is designed as a fingertip integrated with the hemispherical three-axis tactile sensor, as shown in Fig. 4 (Ohka et al., 2008). Forty-one sensing elements are concentrically arranged on the acrylic dome, which is illuminated along its edge by optical fibers connected to a light source (ELI-100S, Mitsubishi Rayon Co.). Image data consisting of bright spots caused by the feelers' collapse are retrieved by an optical fiber scope (MSGS-1350-III, Moritex Co.) connected to the CCD camera (C5985, Hamamatsu Photonics Co.).

2.2. Procedure of evaluation tests

2.2.1. Experimental apparatus

We developed the loading machine shown in Fig. 5, which includes an x-stage, a z-stage, rotary stages, and a force gauge (FGC-0.2B, NIDEC-SIMPO Co.), to evaluate the sensing characteristics for normal and shearing forces. The force gauge has a probe for measuring force and can detect forces ranging from 0 to 2 N with a resolution of 0.001 N. The positioning precisions of the x-stage, the z-stage, and the rotary stages are 0.001 mm, 0.1 mm, and 0.1°, respectively.

Figure 5.

Loading machine

Figure 6.

Tactile data processing system

Output of the present tactile sensor is processed by the data processing system shown in Fig. 6. The system is composed of the tactile sensor, the loading machine, an image processing board (Himawari PCI/S, Library Co.), and a computer. Image data acquired by the image processing board are processed by in-house software. The image data acquired by the CCD camera are divided into 41 subregions, as shown in Fig. 7. The subregion division, digital filtering, integration of the grayscale value, and centroid-displacement calculation are performed on the image processing board. Since the image warps due to projection from the hemispherical surface, as shown in Fig. 7, software installed on the computer corrects the obtained data. The motorized stages and the force gauge are controlled by the same software.
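As a rough sketch of this pipeline, each camera frame can be divided into per-element subregions, filtered, and reduced to an integrated grayscale value and centroid per element. The subregion table and filter threshold below are hypothetical placeholders, not values from the chapter:

```python
import numpy as np

# Hypothetical table of subregion bounding boxes for the 41 elements,
# determined once from the projected layout of Fig. 7:
# SUBREGIONS[i] = (row0, row1, col0, col1)
SUBREGIONS = {0: (100, 140, 220, 260)}  # ... entries for all 41 elements

def process_frame(frame, threshold=10):
    """Per-element integrated grayscale value and centroid for one CCD
    frame (a sketch of the processing done on the image board)."""
    results = {}
    for i, (r0, r1, c0, c1) in SUBREGIONS.items():
        sub = frame[r0:r1, c0:c1].astype(float)
        sub[sub < threshold] = 0.0               # simple digital filtering
        G = sub.sum()                            # integrated grayscale value
        ys, xs = np.indices(sub.shape)
        cx = (xs * sub).sum() / max(G, 1e-12)    # centroid coordinates
        cy = (ys * sub).sum() / max(G, 1e-12)
        results[i] = (G, cx, cy)
    return results
```

The warp correction of Eqs. (4)-(6) would then be applied to each element's result using its latitude and azimuth.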

2.2.2. Procedure of sensing normal force test

Because the present tactile sensor can detect not only normal force but also shearing force, we must confirm its sensing capability for both. In normal-force testing, the attitude of the tactile sensor is first set with the rotary stage, and a normal force is then applied to the tip of the specified sensing element using the z-stage. Since the rotary stage's center of rotation coincides with the center of the sensor's hemispherical dome, any sensing element aligned along the hemisphere's meridian can easily be tested.

2.2.3. Procedure of sensing shearing force test

Figure 7.

Addresses of sensing elements

When generating the shearing-force component, both the rotary and x-stages are adjusted to specify the force direction and the sensing element. First, the rotary stage is operated to set the force direction θ, as shown in Fig. 8. The x-stage is then adjusted so that the tilted force is applied at the tip of the specified sensing element; Fig. 8 shows how the sensing element located on the parietal region is targeted by this procedure. After that, a force is loaded onto the tip of the sensing element using the z-stage.

Figure 8.

Generation of shearing force component

Regarding the manner of loading, since the force direction does not coincide with the axis of the sensing element, slippage occurs between the probe and the tip of the sensing element. To eliminate this problem, a spherical concave portion is formed on the probe surface so that it mates with the hemispherical tip of the tactile element. When force F is applied to the tip of the tactile element, the normal force F_N and shearing force F_S applied to the sensing element are calculated with the following formulas:

$F_N = F\cos\theta$ (7)

$F_S = F\sin\theta$ (8)
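In code, the decomposition of Eqs. (7) and (8) used to generate reference values during calibration is a one-line helper (a minimal sketch; the function name is ours):

```python
import math

def applied_components(F, theta_deg):
    """Resolve applied force F at inclination theta into the normal and
    shearing components of Eqs. (7) and (8)."""
    t = math.radians(theta_deg)
    return F * math.cos(t), F * math.sin(t)   # (FN, FS)
```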

2.3. Sensing ability of optical three-axis tactile sensor

2.3.1. Sensing ability of normal force

To evaluate the sensing characteristics of the sensing elements distributed on the hemispherical dome, we need to measure the variation in the integrated grayscale values generated by the sensing elements. Figure 9 shows examples of the variation in the integrated grayscale value caused by increases in the normal force for elements #00, #01, #05, #09, #17, #25, and #33. In these experiments, normal force is applied to the tip of each tactile element. As the figure indicates, the gradient of the relationship between the integrated grayscale value and the applied force increases with an increase in φ; that is, sensitivity depends upon the latitude on the hemisphere. Dome brightness is inhomogeneous because the edge of the dome is illuminated and light converges on its parietal region. Brightness is represented as a function of latitude φ, and since sensitivity is uniquely determined by latitude, it is easy to modify the sensitivity according to φ.

Figure 9.

Relationship between applied force and grayscale value

However, sensing elements located at the same latitude φ show different sensing characteristics. For example, the sensitivities of #09 and #17 should coincide since they have identical latitude; however, as Fig. 10 clearly indicates, they do not. The difference reflects the inhomogeneous brightness of the acrylic dome. Therefore, we need to obtain the sensitivity of every sensing element.

Figure 10.

Approximation using bi-linearity

As shown in Fig. 9, the relationship between the integrated grayscale value and the applied normal force is not completely linear. Therefore, we adopt the bi-linear lines shown in Fig. 10 as the function g(x) in Eq. (3) to approximate these curves. Linear approximation represents the relationship between force and integrated grayscale value with high accuracy for two tactile elements (#12 and #29); for the other elements, bi-linear approximation represents the relationship with a rather high correlation factor ranging from 0.911 to 0.997.
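The chapter does not detail the fitting procedure; one simple way to obtain such a bi-linear curve is to scan candidate breakpoints and keep the two-segment least-squares fit with the smallest residual, as sketched below (an assumption on our part, not the authors' method):

```python
import numpy as np

def fit_bilinear(force, grayscale):
    """Fit a bi-linear curve (two line segments joined at a breakpoint)
    mapping integrated grayscale value to force, as used for g(x) in
    Eq. (3).  The breakpoint is chosen by exhaustive scan."""
    best = None
    for k in range(2, len(force) - 2):           # candidate breakpoints
        p1 = np.polyfit(grayscale[:k + 1], force[:k + 1], 1)
        p2 = np.polyfit(grayscale[k:], force[k:], 1)
        resid = (np.abs(np.polyval(p1, grayscale[:k + 1]) - force[:k + 1]).sum()
                 + np.abs(np.polyval(p2, grayscale[k:]) - force[k:]).sum())
        if best is None or resid < best[0]:
            best = (resid, grayscale[k], p1, p2)
    _, g_break, p1, p2 = best
    return lambda G: np.polyval(p1 if G < g_break else p2, G)
```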

To show that the normal force component is obtained independently by Eq. (3) under combined loading, we applied an inclined force to the tip of the tactile element and examined the relationship between the normal component of the applied force and the integrated grayscale value. Figure 11 displays this relationship for #00. Even when the inclination is varied from −30° to 30°, the relationships coincide within a deviation of 3.7%. Therefore, the relationship between the normal component of the applied force and the integrated grayscale value is independent of the inclination θ.

2.3.2. Sensing ability of shearing force

When force is applied at several values of θ to the tip of the sensing element located in the parietal region, the relationships between the centroid displacement calculated by Eq. (5) and the shearing-force component are obtained, as shown in Fig. 12. Although the inclination of the applied force is varied in a range from 15° to 60°, the curves converge into a single one. Therefore, the applied shearing force is obtained independently from the centroid displacement.

Figure 11.

Relationship between integrated grayscale value and applied normal force at several inclinations

Figure 12.

Relationship between centroid displacement and applied shearing

When the tactile element accepts forces directed at 45°, 135°, 225°, and 315°, the centroid trajectories shown in Fig. 13 are obtained; these examine shearing-force detection in directions other than the x- and y-directions. When the experimental results are compared to the desired trajectories shown in Fig. 13, they trace almost identical paths. Thus, the present tactile sensor can detect applied forces in various directions.

Figure 13.

Trajectory of centroid

3. Object handling based on tri-axial tactile data

3.1. Hand robot equipped with optical three-axis tactile sensors

We designed the two-fingered robotic hand shown in Fig. 14 for general-purpose use in robotics (Ohka et al., 2009b, 2009c). The robotic hand includes links, fingertips equipped with the three-axis tactile sensor, and micro actuators (YR-KA01-A000, Yasukawa). Each micro actuator, which consists of an AC servo-motor, a harmonic drive, and an incremental encoder, was developed specifically for application to multi-fingered hands. Since the tactile sensors must be fitted to a multi-fingered hand, we are developing a fingertip that includes the hemispherical three-axis tactile sensor shown in Fig. 4.

Figure 14.

Robotic hand equipped with three-axis tactile sensors

3.2. Kinematics of hand robot

As shown in Fig. 14, each robotic finger has three movable joints. The kinematics of the present hand is derived according to the Denavit-Hartenberg notation shown in Fig. 14. The frame of the workspace, set on the bottom of the z-stage, is defined as O-xyz. Frames O_i-x_iy_iz_i (in the following, O-xyz is used instead of O_0-x_0y_0z_0) are attached to each joint, the base of the z-stage, and the fingertip, as shown in Fig. 14. The velocities of the micro actuators, $\dot{\boldsymbol{\theta}} = (\dot{\theta}_1\ \dot{\theta}_2\ \dot{\theta}_3)^T$, are calculated with

$\dot{\boldsymbol{\theta}} = J^{-1}(\boldsymbol{\theta})\,\dot{\boldsymbol{r}}$ (9)

to satisfy the specified velocity vector $\dot{\boldsymbol{r}} = (\dot{x}\ \dot{y}\ \dot{z})^T$, which is calculated from the planned trajectory. The Jacobian $J(\boldsymbol{\theta})$ is obtained from the kinematics of the robotic hand as follows:

$$J(\boldsymbol{\theta}) = \begin{bmatrix} R_{13}(l_2 + l_3 c_2 + l_4 c_{23}) & l_3(R_{11} s_3 + R_{12} c_3) + l_4 R_{12} & R_{12} l_4 \\ R_{23}(l_2 + l_3 c_2 + l_4 c_{23}) & l_3(R_{21} s_3 + R_{22} c_3) + l_4 R_{22} & R_{22} l_4 \\ R_{33}(l_2 + l_3 c_2 + l_4 c_{23}) & l_3(R_{31} s_3 + R_{32} c_3) + l_4 R_{32} & R_{32} l_4 \end{bmatrix},$$ (10)

where

$$\begin{bmatrix} R_{11} & R_{12} & R_{13} \\ R_{21} & R_{22} & R_{23} \\ R_{31} & R_{32} & R_{33} \end{bmatrix} = \begin{bmatrix} a_{11}c_{23}+a_{13}s_{23} & -a_{11}s_{23}+a_{13}c_{23} & a_{12} \\ a_{21}c_{23}+a_{23}s_{23} & -a_{21}s_{23}+a_{23}c_{23} & a_{22} \\ a_{31}c_{23}+a_{33}s_{23} & -a_{31}s_{23}+a_{33}c_{23} & a_{32} \end{bmatrix},$$

$$\begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} = \begin{bmatrix} c_{\phi_1}c_1 + s_{\phi_1}s_{\phi_2}s_1 & -c_{\phi_1}s_1 + s_{\phi_1}s_{\phi_2}c_1 & s_{\phi_1}c_{\phi_2} \\ c_{\phi_2}s_1 & c_{\phi_2}c_1 & -s_{\phi_2} \\ -s_{\phi_1}c_1 + c_{\phi_1}s_{\phi_2}s_1 & s_{\phi_1}s_1 + c_{\phi_1}s_{\phi_2}c_1 & c_{\phi_1}c_{\phi_2} \end{bmatrix},$$ (11)

with $c_i \equiv \cos\theta_i$, $s_i \equiv \sin\theta_i$, $c_{\phi_i} \equiv \cos\phi_i$, $s_{\phi_i} \equiv \sin\phi_i$, $c_{ij} \equiv \cos(\theta_i+\theta_j)$, $s_{ij} \equiv \sin(\theta_i+\theta_j)$ $(i, j = 1, 2, 3)$.

In the above equations, the rotations of the first frame around the x_0- and y_0-axes are denoted as φ_1 and φ_2, respectively. The distance between the origins of the m-th and (m+1)-th frames is denoted as l_m. The joint angles of the micro actuators on O_2-x_2y_2z_2, O_3-x_3y_3z_3, and O_4-x_4y_4z_4 are θ_1, θ_2, and θ_3, respectively.

Position control of the fingertip is performed based on resolved motion rate control. In this control method, joint angles are assumed at the first step, and the displacement vector r_0 is calculated with the kinematics. An adjustment of the joint angles is obtained from Eq. (9) and the difference between r_0 and the objective vector r_d, yielding the modified joint angle θ_{k+1} for the next step. The modified joint angle is designated as the current angle in the next step, and the above procedure is repeated until the displacement vector at the k-th step, r_k, coincides with the objective vector r_d within a specified error. That is, the following Eqs. (12) and (13) are calculated until |r_d − r_k| becomes small enough:

$\dot{\boldsymbol{r}}_k = J\dot{\boldsymbol{\theta}}_k$ (12)

$\boldsymbol{\theta}_{k+1} = \boldsymbol{\theta}_k + J^{-1}(\boldsymbol{r}_d - \boldsymbol{r}_k)$ (13)
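The iteration of Eqs. (12) and (13) is a Newton-style loop. The sketch below assumes a forward-kinematics function is available and estimates the Jacobian by finite differences rather than the closed form of Eq. (10); it is an illustrative implementation, not the authors' code:

```python
import numpy as np

def resolved_motion_rate(theta0, r_d, forward_kin, tol=1e-4, max_iter=100):
    """Iterate Eqs. (12)-(13) until the fingertip position reaches the
    objective vector r_d.  forward_kin(theta) -> fingertip position r."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        r = forward_kin(theta)
        err = r_d - r
        if np.linalg.norm(err) < tol:             # |r_d - r_k| small enough
            break
        J = np.empty((3, 3))
        h = 1e-6
        for j in range(3):                        # numerical Jacobian column j
            dth = np.zeros(3)
            dth[j] = h
            J[:, j] = (forward_kin(theta + dth) - forward_kin(theta)) / h
        theta = theta + np.linalg.solve(J, err)   # Eq. (13)
    return theta
```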

3.3. Control algorithm

Our objective is to show that the robotic hand adapts its finger trajectory to the environment according to tri-axial tactile data. Hence, we built a simple control algorithm for the hand. The algorithm assumes that the finger trajectory provided to the hand beforehand is as simple as possible; according to the tri-axial tactile data, this trajectory is then modified to prevent the normal force from exceeding a threshold and to suppress slippage arising in the contact area.

Figure 15.

Algorithm of flag analyzer

Figure 16.

Algorithm of finger speed estimator

The hand is controlled by velocity control. First, the hand enters "search mode" to make the fingers approach an object with finger speed v = v_0. After the fingers touch the object, the hand enters "move mode" to manipulate the object with finger speed v = v_m. During both search and move modes, when the absolute time derivative of the shearing force of a sensing element exceeds a threshold dr, the system regards that sensing element as slipping. To prevent the hand from dropping the object, a re-compressive velocity is defined that moves the fingertip in the direction opposite to the applied force.

However, if the normal force of a sensing element exceeds a threshold F_2, the re-compressive velocity is cancelled to prevent the sensing element from breaking. The hand is driven by a control module that applies the total velocity obtained by adding the re-compressive velocity to the current velocity.

In our system, the sensor control program and the hand control program are executed on different computers so that CPU time is consumed efficiently in a multi-task scheme. These programs are synchronized with the following five flags.

SEARCH: Fingers search for an object with initial finger velocity v_0 until the normal force of a sensing element exceeds a threshold F_1 or the SLIP flag is raised.

MOVE: This flag is raised whenever the robotic hand manipulates an object.

TOUCH: This flag is raised whenever one of the fingers touches an object.

SLIP: This flag is raised whenever the time derivative of the shearing force exceeds a threshold dr.

OVER: This flag is raised when the normal force of a sensing element exceeds a threshold F_2.

These flags are decided according to tri-axial tactile data and finger motions. Since two modules, the flag analyzer and the finger speed estimator, play the main role in object handling, they are shown in Figs. 15 and 16, respectively.

In the flag analyzer, the TOUCH, SLIP, and OVER flags are decided. The flag analyzer regards the finger as touching an object when the normal force of a sensing element exceeds F_1 or when the absolute time derivative of the shearing force exceeds dr (that is, when the SLIP flag is raised). Whenever it regards the finger as touching an object, the TOUCH flag is raised. The OVER flag is raised when the normal force of a sensing element exceeds F_2, to prohibit re-compressive motion.

In the finger speed estimator, the velocity of the fingertip is determined based on the five flag values and is conserved as long as the contact status does not change. Since the cap-twisting problem requires touch-and-release motion, the MOVE and SEARCH flags are controlled according to the TOUCH flag and the elapsed time. Whenever the SLIP flag is raised, the sensing element with the largest normal force is identified, and the re-compressive velocity of the finger is set along the inward normal of that sensing element. The re-compressive velocity is added to the current velocity, and the resultant velocity is passed to the control module, as sketched below.
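The two modules can be summarized in Python; the thresholds and the re-compressive gain below are hypothetical values, and each sensing element is reduced to its normal force, the time derivative of its shearing force, and its inward unit normal:

```python
import numpy as np

F1, F2, DR = 0.5, 1.8, 0.1    # force/slippage thresholds (hypothetical, N and N/s)
K_RECOMP = 1.0                # re-compressive speed gain (hypothetical, mm/s)

def analyze_flags(elements):
    """Flag analyzer (Fig. 15).  Each element is a dict with 'Fn' (normal
    force), 'dFs' (time derivative of shearing force), and 'normal'
    (inward unit normal of the element in the workspace frame)."""
    slip = any(abs(e['dFs']) > DR for e in elements)
    touch = slip or any(e['Fn'] > F1 for e in elements)
    over = any(e['Fn'] > F2 for e in elements)
    return {'TOUCH': touch, 'SLIP': slip, 'OVER': over}

def finger_velocity(elements, v_current, flags):
    """Finger speed estimator (Fig. 16): the current velocity is conserved
    while contact status is unchanged; on slippage, a re-compressive
    velocity along the inward normal of the element bearing the largest
    normal force is added, unless OVER prohibits it."""
    v = np.asarray(v_current, dtype=float)
    if flags['SLIP'] and not flags['OVER']:
        e = max(elements, key=lambda e: e['Fn'])
        v = v + K_RECOMP * np.asarray(e['normal'])
    return v
```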

3.4. Evaluation experiment of object handling

3.4.1. Experimental apparatus and procedure

To examine the above algorithm, the robotic hand performed a bottle-cap-closing task, because this task requires a curved trajectory along the cap contour. The evaluation of cap closing is performed using an apparatus that includes a torque sensor. Figure 17 shows the apparatus, composed of the two-fingered hand, the torque sensor (TCF-0.2N, Nippon Tokushu Sokki Co., Ltd.), and a PET bottle holder. A PET bottle is clamped in the holder, and its cap is turned by the robotic hand. The torque sensor measures torque with four strain gauges: variation in gauge resistance is measured as a voltage through a bridge circuit and sent to a computer via an A/D converter to obtain the relationship between finger configuration and generated torque.

At first, the two fingers approach the cap, and the moving direction is changed to the tangential direction of the cap surface after the grasping force exceeds 1 N. After the fingers move 10 mm while keeping this direction, they are withdrawn from the cap surface and returned to their home positions, from which they start the next cycle. Consequently, the trajectory of the fingers is designed as shown in Fig. 17. During the cap-closing task, the variation in torque is monitored through the torque sensor to evaluate the task. Even though the trajectory is simple, we will show in the following section that it adapts to the cap contour.

Figure 17.

Experimental apparatus for cap-twisting task

3.4.2. Relationship between grasping force and torque

The relationship between grasping force and torque while twisting the bottle cap is shown in Fig. 18 as an overview of the experiment. Since the touch-and-release motion is repeated four times, four groups appear in Fig. 18. As shown, compared to the first twisting motion, both grasping force and torque decrease considerably in the second twisting, while in the third and fourth twistings they increase compared to the former two. Since the third and fourth twistings show almost the same variations in grasping force and torque, the twisting seems to have become constant; therefore, after the third twisting, the cap appears to be closed. In the first twisting, we can observe the transition from light twisting to forceful twisting, because torque increases in spite of constant grasping force; this shows that the cap is at first turned without resisting torque. The reduction in grasping force and torque in the second twisting is attributed to the variation in contact position and status between the first and second twistings. The cap was thus successfully twisted on, as mentioned above.

Figure 18.

Relationship between grasping force and torque

Figure 19.

Relationship between variations in time derivative of shearing force and torque

3.4.3. Relationship between time derivative of shearing force and torque

When the cap is twisted on completely, slippage occurs between the robotic finger and the cap. To examine this phenomenon, the relationship between the time derivative of the shearing force and torque is shown in Fig. 19. As can be seen, the time derivative of the shearing force shows a periodic bumpy variation that synchronizes with the variation in torque. This means that a large tangential force induces a large time derivative of the shearing force, caused by the trembling of the slipping sensing element.

To examine the cap-twisting in detail, the results of the first and fourth twistings are compared in Figs. 20 and 21. In the first twisting, since the cap is loose, no marked time derivative of the shearing force occurs (Fig. 20). In the fourth twisting, on the other hand, a marked time derivative of the shearing force does occur because the cap has become tight (Fig. 21). Therefore, the robotic hand can twist the bottle cap on completely. Additionally, the time derivative of the shearing force can be adopted as a measure of how tightly the cap is closed.

Figure 20.

Detailed relationship between variations in time derivative of shearing force and torque at first twisting

Figure 21.

Detailed relationship between variations in time derivative of shearing force and torque at fourth twisting

3.4.4. Trajectory of fingertip modified according to tri-axial tactile data

The trajectories of the sensing-element tips are shown in Figs. 22 and 23. If the result in Fig. 22 is compared with that in Fig. 23, the trajectories in Fig. 23 are closer to the cap contour; the modification of the trajectory saturates after the cap is closed. Although the input finger trajectory was a rectangle roughly decided to touch and turn the cap, as described in the previous section, a segment of the rectangle was changed from a straight line to a curved line to fit the cap contour.

Figure 22.

Trajectories of sensor element before closing the cap

Figure 23.

Trajectories of sensor element after closing the cap

4. Conclusion

We developed a new three-axis tactile sensor to be mounted on multi-fingered hands, based on the principle of an optical waveguide-type tactile sensor comprising an acrylic hemispherical dome, a light source, an array of rubber sensing elements, and a CCD camera. The sensing element of the present tactile sensor includes one columnar feeler and eight conical feelers. A three-axis force applied to the tip of the sensing element is detected by the contact areas of the conical feelers, which maintain contact with the acrylic dome. Normal and shearing forces are calculated from the integrated grayscale value and the centroid displacement derived from the conical feelers' contacts.

To evaluate the present tactile sensor, we conducted a series of experiments using an x-z stage, rotary stages, and a force gauge. Although the relationship between the integrated grayscale value and normal force depended on the sensing element's latitude on the hemispherical surface, it was easy to modify the sensitivity based on the latitude. Sensitivity to normal and shearing forces was approximated with bi-linear curves. The results revealed that the relationship between the integrated grayscale value and normal force converges into a single curve regardless of the inclination of the applied force. This was also true for the relationship between centroid displacement and shearing force. Therefore, applied normal and shearing forces can be obtained independently from the integrated grayscale value and the centroid displacement, respectively. The results also showed enough repeatability to confirm that the sensor is sufficiently sensitive to both normal and shearing forces.

Next, a robotic hand was composed of two robotic fingers to show that tri-axial tactile data can generate the trajectory of the robotic fingers. Since the three-axis tactile sensor detects higher-order information than other tactile sensors, the robotic hand's behavior is determined on the basis of tri-axial tactile data. Not only the tri-axial force distribution directly obtained from the tactile sensor but also the time derivative of the shearing force distribution is used in the hand-control program. If the grasping force measured from the normal force distribution is lower than a threshold, the grasping force is increased. The time derivative of the shearing force is defined as slippage; if slippage arises, the grasping force is enhanced to prevent fatal slippage between the finger and the object. In the verification test, the robotic hand twisted a bottle cap on completely. Although the input finger trajectory was a rectangle roughly decided to touch and turn the cap, a segment of the rectangle was changed from a straight line to a curved line to fit the cap contour. Therefore, higher-order tactile information can reduce the complexity of the control program.

We are continuing to develop the optical three-axis tactile sensor to enhance capabilities such as its sensing area, precision, and measurable load range. Furthermore, we will apply the hand to more practical tasks, such as assembly-and-disassembly and peg-in-hole tasks, in future work.

© 2011 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License, which permits use, distribution and reproduction for non-commercial purposes, provided the original is properly cited and derivative works building on this content are distributed under the same license.
