
Procedia Engineering 41 (2012) 1379-1388

2012 International Symposium on Robotics and Intelligent Sensors

Geometrical Data Extraction Using Interaction Between Objects and Robotic Fingers Equipped with Three-axis Tactile Sensors

Sukarnur Che Abdullah (a), Masahiro Ohka (b), Yusuke Dosho (b), Takuya Ikai (b) and Hanafiah Yussof (a)

(a) Faculty of Mechanical Engineering, Universiti Teknologi MARA, Shah Alam, 40450, Selangor, Malaysia
(b) Graduate School of Information Science, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8601, Japan

Abstract

If humans want to know geometrical information about certain objects, we can pick them up, push them, pull them or turn them to learn the distance and relationship between the objects and us. Although this tactic is very effective for robots working in unknown environments, it has not always been applied to them because it requires sophisticated tactile sensors to obtain data from the interaction between robotic hands and their environment. We have developed a robotic finger equipped with optical three-axis tactile sensors, whose sensing cells can separately detect normal and shearing forces. A robotic hand equipped with such fingers can explore geometrical data through pushing-pulling, turning and picking up-placing motions as follows. It can acquire the contour of a corrugated object by scanning the surface with two fingers; it can learn a curved line that fits a cap contour by turning the cap, even if the input finger trajectories are rectangular; and it can place an object on a specified location of unknown height (z-coordinate) after measuring the x-y position of the location, because it can detect the upward slippage that occurs between the object and the fingers when the object bottom touches the location.

© 2012 Published by Elsevier Ltd. Selection and/or peer-review under responsibility of the 2012 International Symposium on Robotics and Intelligent Sensors

Keywords: Geometrical data; three-axis tactile sensor; edge extraction; cap screwing; height data.

1. Introduction

For a robot, it is difficult to recognize the geometrical data of an object buried in a complicated background using vision. Even for human beings, it is sometimes difficult to find a specified object on a messy desk. In particular, the relationship between two parts is not easily recognized through vision alone.


We recognize such relationships through trial handling, such as pulling and twisting the object. Although this manner is natural for humans and very effective for robots as well, it has not always been applied to robots because it requires sophisticated tactile sensors to obtain data from the interaction between robotic hands and the environment.

On the other hand, the optical three-axis tactile sensors [1]-[4] produced by the authors are superior to other tactile sensors [5]-[9] because they can simultaneously detect normal and shearing forces, and because they can sense the contact phenomena occurring between the sensor and an object. Although one of the present authors has already reported another type of three-axis tactile sensor [10]-[11], it has a very hard sensing surface and fragile sensing elements because they are made of single-crystal silicon.

Regarding a flexible, impact-resistant surface as the most important requirement, we improved conventional optical waveguide-type tactile sensors [12]-[16] to develop the optical three-axis tactile sensor. In a previous paper [17], we developed a hemispherical tactile sensor for general-purpose use based on our three-axis tactile sensor, and mounted it on a robotic finger with three degrees of freedom to evaluate the sensor for dexterous hands. Three kinds of experiments were performed: first, the robotic finger touched and scanned flat specimens to evaluate their friction coefficients; second, it detected the contours of parallelepiped and cylindrical objects; finally, it manipulated a parallelepiped plastic case on a table by sliding it.

In the preceding paper [18], the robotic hand was composed of two robotic fingers equipped with three-axis tactile sensors. Using this robotic hand, we found that tri-axial tactile data generated the trajectory of the robotic fingers, even when only a simple initial trajectory was provided to the control program. In the verification test, the robotic hand screwed a bottle cap closed. The experimental apparatus was composed of the two-fingered hand, a bottle holder and a torque sensor to monitor the torque generated during the experiment. The variation in torque and the generated trajectory were examined to evaluate the robotic hand and the cap-closing algorithm.

In this paper, the hand is mounted on the end of a robotic arm to search a wider area. Using this hand-arm robot, we perform three kinds of object exploration tests to demonstrate the effectiveness of trial handling, such as pulling and twisting, for acquiring an object's geometrical data. The first test is object contour extraction, which provides basic information for recognizing the object. In this experiment, the robot hand grasps the object with two fingers and slides them along the object surface. Second, an advanced cap-screwing task is performed to show the effectiveness of trial turning for acquiring the cap contour. Although we pointed out this effectiveness in the previous paper, we take up the issue again here because we can advance it with the introduction of a screwing-termination condition. Third, the height for placing an object is measured from the reaction force generated by contact between the object bottom and the floor when the hand-arm robot tries to put it on the floor. Since stereo vision usually cannot obtain sufficient precision in height detection, the planar coordinates and the height coordinate are acquired with stereo vision and tactile information, respectively.

2. Three-Axis Tactile Sensor

Since our three-axis tactile sensor has been explained in previous papers [1]-[3], [16], its structure and functions are described only briefly here. The tactile sensor is composed of a CCD camera, an acrylic dome, a light source, and a computer, as shown in Fig. 1. The light emitted from the light source is directed into the acrylic dome. Contact phenomena are observed as image data, which are acquired by the CCD camera and transmitted to the computer to calculate the three-axis force distribution. The sensing element presented in this paper comprises a columnar feeler and eight conical feelers. The sensing elements, which are made of silicone rubber, are designed to keep the conical feelers in contact with the acrylic dome and to make the columnar feeler touch an object.

When the three components of the force vector, Fx, Fy, and Fz, are applied to the tip of the columnar feeler, the contact between the acrylic dome and the conical feelers is measured as a distribution of gray-scale values, which is transmitted to the computer. The Fx, Fy, and Fz values are calculated from the integrated gray-scale value G and the horizontal displacement of the centroid of the gray-scale distribution.

We are currently designing a multi-fingered robotic hand for general-purpose use in robotics. The robotic hand includes links, fingertips equipped with the three-axis tactile sensor, and micro actuators (YR-KA01-A000, Yaskawa). Each micro actuator, which consists of an AC servo-motor, a harmonic drive, and an incremental encoder, was developed specifically for application to a multi-fingered hand. Since the tactile sensors must be fitted to a multi-fingered hand, we are developing a fingertip that includes a hemispherical three-axis tactile sensor [17]-[18].

Sensing elements are arranged on the acrylic dome in a concentric configuration. The acrylic dome is illuminated along its edge by optical fibers connected to a light source. Image data consisting of bright spots caused by the feelers' collapse are retrieved by an optical fiber scope connected to the CCD camera.

Fig. 1 Three-axis tactile sensor system

Fig. 2 Two-hand-arm robot equipped with optical three-axis tactile sensors

Image data acquired by the CCD camera are divided into sub-images, one per sensing element, and processed in the steps shown in Fig. 1: the dividing procedure, digital filtering, calculation of the integrated gray-scale value, and calculation of the centroid. The image data obtained by the CCD camera and the addresses of sensing-element displacements are processed on an image-processing board.

Since the image warps due to projection from a hemispherical surface, software installed on the computer corrects the warped image data and calculates G, ux, and uy to obtain the three-axis force applied to the tip of each sensing element.
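To make the sensing principle concrete, the following minimal Python sketch shows how (Fx, Fy, Fz) could be estimated for one sensing element from its gray-scale sub-image: Fz from the integrated gray-scale value G, and Fx, Fy from the horizontal displacement (ux, uy) of the centroid of the distribution. The calibration constants k_z and k_xy and the per-element square sub-image are assumptions for illustration; the paper does not specify them.

```python
import numpy as np

def three_axis_force(patch, k_z=1.0, k_xy=1.0):
    """Estimate (Fx, Fy, Fz) for one sensing element from its gray-scale
    sub-image. k_z and k_xy are hypothetical calibration constants."""
    G = float(patch.sum())                 # integrated gray-scale value G
    if G == 0.0:                           # no bright spot: no contact
        return 0.0, 0.0, 0.0
    ys, xs = np.indices(patch.shape)
    cx = (xs * patch).sum() / G            # centroid of the distribution
    cy = (ys * patch).sum() / G
    x0 = (patch.shape[1] - 1) / 2.0        # rest (no-load) centroid position
    y0 = (patch.shape[0] - 1) / 2.0
    ux, uy = cx - x0, cy - y0              # horizontal centroid displacement
    return k_xy * ux, k_xy * uy, k_z * G   # (Fx, Fy, Fz)
```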

3. Hand-Arm Robot

Figure 2 shows a two-hand-arm robot, which was produced to perform several assembly tasks and is an advancement over the one-hand-arm robot presented in a previous article [19]. Figure 3 shows the structure of the present robot; the arm system has 5 DOF, and each finger has 3 DOF. To compensate for the lack of arm DOF, the robot uses each finger's root joint as a wrist DOF. Although the robot has two hand-arms, we use only the right hand-arm in this paper.

Each fingertip is equipped with the three-axis tactile sensor described in the previous section. We use the sensor information as an effective key to induce specified behaviors. A local coordinate system is defined on the sensing elements of the three-axis sensor attached to the fingertip. Figure 4 shows the relationship between the locations of the sensing elements and the local coordinate system. Although the three-axis tactile sensor has 41 sensing elements, experimental results for specified sensing elements referenced to the local coordinate system are demonstrated in the next section. Using coordinate transformation, slippage vector components are calculated with respect to the global coordinate system O-XYZ shown in Fig. 4.

Position control of the fingertip is performed based on resolved motion rate control. In this method, joint angles are assumed at the first step, and the current displacement vector is calculated with kinematics. The joint angles are adjusted using the current joint angles and the difference between the current displacement vector and the objective displacement vector. The modified joint angles are designated as the current angles of the next, (k+1)-th, step, and the above procedure is repeated until the displacement vector r_k at the k-th step coincides with the objective position vector r_d within a specified error. That is, the following Eqs. (1) and (2) are calculated until |r_d - r_k| becomes small enough:

ṙ_k = J q̇_k,    (1)

q_{k+1} = q_k + J^{-1} (r_d - r_k),    (2)

where J is the Jacobian matrix obtained from the kinematics of the hand-arm robot.
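A minimal sketch of this iteration follows, assuming hypothetical fk (forward kinematics) and jacobian functions standing in for the hand-arm kinematics; the pseudo-inverse is used in place of J^{-1} for numerical robustness.

```python
import numpy as np

def resolved_motion_rate_control(q, r_d, fk, jacobian, tol=1e-3, max_steps=100):
    """Iterate Eq. (2) until the displacement vector r_k = fk(q_k)
    coincides with the objective position vector r_d within tol.
    fk and jacobian are hypothetical stand-ins for the robot kinematics."""
    for _ in range(max_steps):
        r_k = fk(q)                               # current displacement vector
        e = r_d - r_k
        if np.linalg.norm(e) < tol:               # |r_d - r_k| small enough
            break
        q = q + np.linalg.pinv(jacobian(q)) @ e   # Eq. (2)
    return q
```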

4. Control Algorithm

The hand is controlled by velocity control. First, the hand enters "search mode," in which the fingers approach an object with finger speed v = v0. After the fingers touch the object, the hand status changes to "move mode," in which the object is manipulated with finger speed v = vm. During both search and move modes, when the absolute time derivative of the shearing force on a sensing element exceeds a threshold dr, the system regards the sensing element as slipping. To prevent the hand from dropping the object, a re-push velocity is defined that moves the fingertip against the direction of the applied force.

Table 1. Main constants and thresholds

                                          Value
Constants and thresholds        Edge extraction     Screwing and pick-and-place
Sampling interval for sensor    100 msec            100 msec
Sampling interval for finger    25 msec             25 msec
F1                              0.008 N             0.5 N
F2                              0.2 N               2 N
dr                              0.04 N/sec          0.04 N/sec
Initial velocity, v0            1 mm/sec            1 mm/sec
Manipulation velocity, vm       1 mm/sec            1 mm/sec
Re-push velocity, vp            1 mm/sec            2 mm/sec

However, if the normal force on a sensing element exceeds a threshold F2, the re-push velocity is canceled to prevent the sensing element from breaking. The hand is controlled by a control module that applies the total velocity obtained by adding the re-push velocity to the current velocity.

In our system, the sensor control program and the hand control program are executed on separate computers so that CPU time is consumed efficiently in a multi-task scheme. The programs are synchronized with the following five flags.

SEARCH: Fingers search for an object with initial finger velocity v0 until the normal force on a sensing element exceeds a threshold F1 or the SLIP flag is raised.

MOVE: This flag is raised whenever the robotic hand manipulates an object.

TOUCH: This flag is raised whenever one of the fingers touches an object.

SLIP: This flag is raised whenever the absolute time derivative of the shearing force exceeds a threshold dr.

OVER: This flag is raised when normal force of a sensing element exceeds a threshold F2 or a specific direction of the force component exceeds a threshold.

These flags are determined according to the tri-axial tactile data and the finger motions. Two modules, the flag analyzer and the finger speed estimator, play the main roles in object handling. The algorithm of the finger speed estimator is shown in Fig. 5.

In the flag analyzer, the TOUCH, SLIP and OVER flags are determined. The flag analyzer regards the finger as touching an object when the normal force on a sensing element exceeds F1 or when the absolute time derivative of the shearing force exceeds dr (the SLIP flag is raised); whenever it does, the TOUCH flag is raised. The OVER flag is raised when the normal force on a sensing element exceeds F2, to prohibit the re-push motion; in that case, the finger velocity is set opposite to the direction of the last step.

In the finger speed estimator, the velocity of the fingertip is determined based on the five flag values and is conserved whenever the contact status is unchanged. Since the screw-cap problem requires touch-and-release motion, the MOVE and SEARCH flags are controlled according to the TOUCH flag and the elapsed time. Whenever the SLIP flag is raised, the sensing element with the largest normal force is identified, and the re-push velocity of the finger is set along the inward normal of that sensing element. The re-push velocity is added to the current velocity, and the resultant velocity is passed to the control module.
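The sketch below illustrates how the flag analyzer and finger speed estimator described above could fit together; it is written under the assumption that tactile quantities arrive as one normal force and one shearing-force derivative per step, and does not reproduce the full Fig. 5 algorithm.

```python
import numpy as np

def flag_analyzer(Fn, dFs, F1, F2, dr):
    """Decide the TOUCH, SLIP and OVER flags from tri-axial tactile data
    (Fn: normal force, dFs: time derivative of shearing force)."""
    slip = abs(dFs) > dr          # slippage: |dFs/dt| exceeds threshold dr
    touch = (Fn > F1) or slip     # touching: normal force over F1, or slippage
    over = Fn > F2                # overload: prohibit the re-push motion
    return touch, slip, over

def finger_speed_estimator(v, flags, v0, vp, n_repush):
    """Determine the fingertip velocity (numpy vectors): conserve the
    current velocity while contact status is unchanged, add the re-push
    velocity along the inward normal n_repush of the element with the
    largest normal force on slip, and back off when OVER is raised."""
    touch, slip, over = flags
    if over:
        return -v                 # reverse the last step's direction
    if not touch:
        return v0                 # SEARCH: approach with initial velocity
    if slip:
        return v + vp * n_repush  # re-push added to the current velocity
    return v                      # contact status unchanged: conserve v
```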

On the basis of the resultant velocity, the revolution velocity of each joint is calculated to control the hand-arm robot's motion. The main constants for the control are shown in Table 1. For the edge-extraction experiment discussed in the next section, the difference between thresholds F1 and F2 should be small enough to have no influence on the other experiments. After several trials, we selected 0.008 N and 0.2 N for F1 and F2, respectively.

5. Experimental Results and Discussion

5.1. Edge Extraction

The edge of an object is basic information for recognizing it. Although visual information is effective for acquiring the whole geometrical data of an object, several kinds of image-data processing must be employed, such as excluding the background, adjusting lighting and transforming coordinates.

To illustrate the above problems with robotic vision, an edge-extraction result is shown in Fig. 6. The original image is shown in Fig. 7, a scene of edge-line extraction of a wood model with two fingers of the hand-arm robot. Since background data are included in the figure, they must be excluded.

Fig. 8 Edge extracted by tactile scan

Fig. 9 Finger trajectory in cap-screwing task (showing the cap shape and base)

On the other hand, we conducted an edge-exploring experiment using the two robotic fingers on real objects to compensate for the above problems with vision. In this experiment, after the robot touches both sides of the object with two fingers, the hand moves linearly in the z-direction, as shown in Fig. 7. According to the algorithm shown in Fig. 5, the hand-arm robot maintains contact with the sides of the object because it keeps the maximum normal force generated in a tactile element within the range specified by the two thresholds, F1 and F2.

Figure 8 shows the result of edge extraction with tactile data processing. To evaluate the precision of edge extraction with tactile sensing, the trajectories of the contact positions are shown as thick black lines overlapped on a photograph of the specimen. As shown in Fig. 8, the hand-arm robot successfully obtains the object contour without picking up unneeded edges.
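A minimal sketch of this two-finger contour scan follows. The hand/finger interface (hand.fingers, finger.normal_forces(), finger.push_inward(), and so on) is hypothetical; the actual controller is the Fig. 5 algorithm, of which this only captures the force-band idea.

```python
def edge_scan(hand, F1=0.008, F2=0.2, dz=0.05):
    """Step the hand linearly in the z-direction while servoing each
    finger so that the largest normal force among its tactile elements
    stays within [F1, F2] N, recording contact points as the contour."""
    contour = []
    while hand.z < hand.z_end:
        for finger in hand.fingers:
            Fn = max(finger.normal_forces())        # largest normal force
            if Fn < F1:
                finger.push_inward()                # regain contact with the side
            elif Fn > F2:
                finger.retract()                    # protect the sensing element
            contour.append(finger.contact_position())  # record contact point
        hand.move_z(dz)                             # linear scan in z
    return contour
```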

5.2. Cap Screwing

In the cap-screwing task, an initial finger trajectory is given to the robot to start cap screwing, and the termination condition of the task must be determined. For this purpose, we added the following two modules.

The first module provides the initial cap-screwing motion, in which the fingers follow the square trajectories shown in the inset of Fig. 8. Since the finger trajectories intersect the cap's contour, the fingertips slip on its surface. When slippage occurs, the increment in shearing force is observed and the SLIP flag is raised. At this moment, finger velocity v is applied as the re-push velocity to enhance the grasping force.

In the previous paper [18], the hand-arm robot did not judge the termination of cap screwing. The second module decides when the hand-arm robot terminates the screwing task. Empirically, we know that considerable slippage occurs when the cap is tightened. If the increment of shearing force exceeds 1.4 times the maximum observed during the first touch, the finger motion is terminated.
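Assuming the controller tracks the maximum shearing-force increment seen during the first touch (the bookkeeping is not detailed in the text), the termination test could be as simple as:

```python
def screwing_terminated(dFs, first_touch_max, ratio=1.4):
    """Terminate the cap-screwing task when the increment of shearing
    force exceeds 1.4 times the maximum observed during the first touch,
    which empirically indicates that the cap is tight."""
    return abs(dFs) > ratio * first_touch_max
```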

The fingertip trajectory is shown in Fig. 9. Modification of the initial trajectory saturates after the cap is closed. Although the initial finger trajectory is a rough rectangle determined only so that the fingers touch and turn the cap, a segment of it changes from a straight line to a curved line that fits the cap contour. The curved trajectory is not provided in advance; it is obtained through the task itself.

5.3. Height Data

The picking-up task is accomplished by the object grasping explained in Fig. 10. After the hand-arm robot grasps an object, it transfers the object to a point above the goal and then moves it down to the goal.

In this experiment, we tried to use stereo vision to obtain the 3D coordinates of the destination point, as shown in Fig. 11, but we could not obtain sufficient precision, especially for the z-coordinate. However, the planar coordinates calculated from the image data were close to the actual values, with an estimation error of around 1.1 mm. Therefore, while the planar coordinates of the goal point can be provided by vision, the height is not provided and is treated as an unknown value.

Fig. 10 Picking up-and-placing task

Fig. 11 Trial experiment for stereo vision

When the object bottom touches the goal, upward slippage occurs between the object and the fingers. If the hand did not release the object, the object would be crushed between the hand and the goal. To prevent this, the OVER flag is raised when the upward force exceeds a threshold. In this experiment, once the hand is opened, the SEARCH flag is never raised again. Since the hand-arm robot releases the object at the moment it puts the object on the goal, it can obtain the height data of the goal.
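A minimal sketch of this placing procedure follows, assuming a hypothetical arm/hand interface and an upward-force threshold F_up whose numerical value the paper does not give.

```python
def lower_until_contact(arm, hand, z_start, dz=0.5, F_up=0.5):
    """Lower the grasped object until the upward force between object and
    fingers exceeds the threshold (raising the OVER flag), then release
    and record z as the goal height. arm, hand and F_up are hypothetical."""
    z = z_start
    while hand.upward_force() < F_up:   # bottom has not yet touched the goal
        z -= dz                         # step the object downward
        arm.move_to_height(z)
    hand.open()                         # release; SEARCH is never re-raised
    return z                            # height (z-coordinate) of the goal
```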

To evaluate the above algorithm, the right hand takes the object from the left hand and then tries to put it on the goal. As shown in the sequential pictures of this task in Fig. 12, the hand-arm robot completes the task.

Fig. 12 Sequential pictures in object-transfer task

6. Conclusion

Geometrical data are basic and necessary for manipulating an object. Although image-data processing is very effective for obtaining geometrical data, it involves several problems, such as excluding the background, adjusting lighting and transforming coordinates. On the other hand, human beings recognize such relationships through trial handling such as pulling and twisting. We therefore integrated the above merits of vision with the advantages of tactile sensing to obtain geometrical data.

In this paper, we performed three kinds of object exploration tests to show the effectiveness of trial handling, such as pulling and twisting, for acquiring an object's geometrical data. The first test was object-contour extraction, which provides basic information for recognizing an object; the robot hand grasped the object with two fingers and slid them along the object surface. Second, an advanced cap-screwing task was performed to show the effectiveness of trial turning for acquiring the cap contour. Third, the height for placing an object was measured from the reaction force generated by contact between the object bottom and the floor when the hand-arm robot tried to put it on the floor.

In future work, we want to incorporate a pan-tilt system equipped with stereo vision into our hand-arm robot to achieve more advanced fusion of vision and tactile sensing.

References

[1] Ohka, M., Mitsuya, Y., Matsunaga, Y., and Takeuchi, S., "Sensing Characteristics of an Optical Three-axis Tactile Sensor Under Combined Loading," Robotica, vol. 22, 2004, pp. 213-221.

[2] Ohka, M., Kawamura, T., Itahashi, T., Miyaoka, T., and Mitsuya, Y., "A Tactile Recognition System Mimicking Human Mechanism for Recognizing Surface Roughness," JSME International Journal, Series C, Vol. 48, No. 2, 2005, pp. 278-285.

[3] Ohka, M., Mitsuya, Y., Higashioka, I., Kabeshita, H., "An Experimental Optical Three-axis Tactile Sensor for Micro-Robots," Robotica, vol. 23-4, 2005, pp. 457-465.

[4] Ohka, M., Kobayashi, H., Takata, J., Mitsuya, Y., "An Experimental Optical Three-axis Tactile Sensor Featured with Hemispherical Surface," Journal of Advanced Mechanical Design, Systems, and Manufacturing, Vol. 2, No. 5, 2008, pp. 860-873.

[5] Raibert, M. H. and Tanner, J. E., "Design and Implementation of a VLSI Tactile Sensing Computer," Int. J. Robotics Res., Vol. 1, No. 3, 1982, pp. 3-18.

[6] Hackwood, S., Beni, G., Hornak, L. A., Wolfe, R. and Nelson, T. J., "Torque-Sensitive Tactile Array for Robotics," Int. J. Robotics Research, Vol. 2-2, 1983, pp. 46-50.

[7] Dario, P., De Rossi, D., Domenici, C. and Francesconi, R., "Ferroelectric Polymer Tactile Sensors with Anthropomorphic Features," Proc. 1984 IEEE Int. Conf. on Robotics and Automation, 1984, pp. 332-340.

[8] Novak, J. L., "Initial Design and Analysis of a Capacitive Sensor for Shear and Normal Force Measurement," Proc. of 1989 IEEE Int. Conf. on Robotics and Automation, 1989, pp. 137-145.

[9] Nicholls, H. R. & Lee, M. H., "A Survey of Robot Tactile Sensing Technology," Int. J. Robotics Res., Vol. 8-3, 1989, pp. 3-30.

[10] Ohka, M. et al., "Tactile Expert System Using a Parallel-fingered Hand Fitted with Three-axis Tactile Sensors," JSME Int. J., Series C, Vol. 37, No. 1, 1994, pp. 138-146.

[11] Takeuchi, S., Ohka, M., and Mitsuya, Y., "Tactile Recognition Using Fuzzy Production Rules and Fuzzy Relations for Processing Image Data from Three-dimensional Tactile Sensors Mounted on a Robot Hand," Proc. of the Asian Control Conf., Vol. 3, 1994, pp. 631-634.

[12] Mott, H., Lee, M. H. and Nicholls, H. R., "An Experimental Very-High-Resolution Tactile Sensor Array," Proc. 4th Int. Conf. on Robot Vision and Sensory Control, 1984, pp. 241-250.

[13] Tanie, K., Komoriya, K., Kaneko M., Tachi, S., and Fujiwara, A., "A High-Resolution Tactile Sensor Array," Robot Sensors Vol. 2: Tactile and Non-Vision, Kempston, UK: IFS (Pubs), 1986, pp. 189-198.

[14] Nicholls, H. R., "Tactile Sensing Using an Optical Transduction Method," Traditional and Non-traditional Robot Sensors (Edited by T. C. Henderson), Springer-Verlag, 1990, pp. 83-99.

[15] Kaneko, M., Maekawa, H., and Tanie, K., "Active Tactile Sensing by Robotic Fingers Based on Minimum-External-Sensor-Realization," Proc. of the IEEE Int. Conf. on Robotics and Automation, 1992, pp. 1289-1294.

[16] Maekawa, H., Tanie, K., Komoriya, K., Kaneko, M., Horiguchi, C., and Sugawara, T., "Development of a Finger-shaped Tactile Sensor and Its Evaluation by Active Touch," Proc. of the 1992 IEEE Int. Conf. on Robotics and Automation, 1992, pp. 1327-1334.

[17] Ohka, M., Morisawa, N., Suzuki, H., Takada, J., Kobayashi, H., Yussof, H., "A Robotic Finger Equipped with an Optical Three-axis Tactile Sensor," Proc. of the IEEE Int. Conf. on Robotics and Automation, 2008, pp. 3425-3430.

[18] Ohka, M., Morisawa, N. and Yussof, H. B., "Trajectory Generation of Robotic Fingers Based on Tri-axial Tactile Data for Cap Screwing Task," 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, May 12-17, 2009, pp. 883-888.

[19] Yussof, H. B., Wada, J. and Ohka, M., "Object Handling Tasks Based on Active Tactile and Slippage Sensations in Multi-Fingered Humanoid Robot Arm", 2009 IEEE International Conference on Robotics and Automation, 2009, pp. 502-507.