
Available online at www.sciencedirect.com

SciVerse ScienceDirect

Procedia Engineering 41 (2012) 1573 - 1579

www.elsevier.com/locate/procedia

International Symposium on Robotics and Intelligent Sensors 2012 (IRIS 2012)

Grasping Strategy and Control Algorithm of Two Robotic Fingers Equipped with Optical Three-Axis Tactile Sensors

Hanafiah Yussof^a,*, Masahiro Ohka^b

^a Center of Humanoid Robot and Bio-Sensors (HuRoBs), Faculty of Mechanical Engineering, Universiti Teknologi MARA, Shah Alam 40450 Selangor, Malaysia

^b Graduate School of Information Science, Nagoya University, Furo-cho Chikusa-ku, Nagoya, Japan

Abstract

This paper presents a grasping strategy for robot fingers based on tactile sensing information acquired by an optical three-axis tactile sensor. We developed a novel optical three-axis tactile sensor system based on an optical waveguide transduction method capable of acquiring normal and shearing forces. The sensors are mounted on the fingertips of two robotic fingers. To enhance the ability to recognize and manipulate objects, we designed a robot control system architecture comprising a connection module, thinking routines, and a hand/finger control module. We propose a tactile-sensing-based control algorithm in the robot finger control system that controls fingertip movements by defining an optimum grasp pressure and performing a re-push movement when slippage is detected. Verification experiments revealed that the finger system managed to recognize the stiffness of unknown objects and complied with sudden changes in the object's weight during object manipulation tasks.

© 2012 The Authors. Published by Elsevier Ltd. Selection and/or peer-review under responsibility of the Centre of Humanoid Robots and Bio-Sensor (HuRoBs), Faculty of Mechanical Engineering, Universiti Teknologi MARA.

Keywords: Tactile sensor, robot finger, grasping, control algorithm

1. Introduction

The improvement of industrial robots has motivated studies on various grippers, robot hands, and grasping functions. Furthermore, recent advancements in service robots such as humanoid robots have inspired the development of intelligent sensor devices such as tactile sensors [1][2]. Research on tactile sensors is basically motivated by the tactile sensing system of human skin. It is well known that the human tactile sense is very accurate and sensitive. The human skin structure provides a mechanism to sense static and dynamic pressure simultaneously with extremely high accuracy. Most tactile sensors developed nowadays aim to detect both static and dynamic pressure [3]. Tactile sensors have been developed using measurements of strain produced in sensing materials, detected through physical quantities such as electric resistance and capacitance, magnetic intensity, voltage, and light intensity [4][5].

In this research, to realize precision object grasping and manipulation tasks with robotic fingers, we developed an optical three-axis tactile sensor capable of acquiring normal and shearing forces, mounted on the fingertips of robot fingers [6][7]. It uses an optical waveguide transduction method and applies image processing techniques. Such a sensing principle is expected to provide better sensing accuracy in characterizing contact phenomena by acquiring the forces in three axial directions.


* Corresponding author. Tel.: +603-55436455; fax: +603-55435161. E-mail address: hurobs@gmail.com

1877-7058 © 2012 Published by Elsevier Ltd. doi:10.1016/j.proeng.2012.07.352

Fig. 1 Optical three-axis tactile sensor mounted on two robot fingers

In this research, we analyzed grasp synthesis in robot manipulation by acquisition of tactile sensing information and slippage sensation using two robot fingers and the tactile sensor systems as shown in Fig. 1. We developed a tactile sensing-based control algorithm embedded in the robot finger control system to realize optimum grasp pressure and control the slippage of object during manipulation tasks. Performance evaluation of the proposed algorithm is conducted in experiments with real objects.

2. Optical Three-Axis Tactile Sensor

The optical three-axis tactile sensor developed in this research is designed in a hemispherical dome shape consisting of an array of sensing elements. This shape mimics the structure of human fingertips for easy compliance with various shapes of objects. The tactile sensor measurement devices are placed outside the sensor.

The hardware consists of an acrylic hemispherical dome, an array of 41 sensing elements made from silicone rubber, a light source, an optical fiber-scope, and a CCD camera. The optical fiber-scope is connected to the CCD camera to acquire an image of the sensing elements touching the acrylic dome inside the tactile sensor. Light emitted from the light source is directed toward the edge of the hemispherical acrylic dome through optical fibers. A total of 24 optical fibers are used, 12 each on the left and right sides of the sensor, transmitting halogen light from the light source. The light directed into the acrylic dome remains within it due to total internal reflection, since the acrylic dome is surrounded by air, which has a lower refractive index than the acrylic. This makes the image acquired by the CCD camera clear even though the sensor has a hemispherical shape.
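The total internal reflection condition follows from Snell's law: light striking the dome's boundary at an angle of incidence greater than the critical angle cannot escape into the air. A minimal Python sketch, assuming a typical refractive index of about 1.49 for acrylic (the paper does not state the exact value):

```python
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Angle of incidence (in degrees) above which light is totally
    internally reflected at the boundary between a denser medium
    (n_core) and a rarer one (n_clad), from Snell's law."""
    return math.degrees(math.asin(n_clad / n_core))

# Acrylic dome (n ~ 1.49, an assumed typical value) surrounded by air (n = 1.0):
theta_c = critical_angle_deg(1.49, 1.0)  # roughly 42 degrees
```

Rays guided into the dome edge at grazing angles therefore stay trapped, which is why the dome stays dark until a feeler contact scatters light toward the camera.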

Fig. 2 shows the structure of the optical three-axis tactile sensor. The silicone rubber sensing element comprises one columnar feeler and eight conical feelers that remain in contact with the acrylic surface while the tip of the columnar feeler touches an object. The sensing elements are arranged on the hemispherical acrylic dome in a concentric configuration with 41 sub-regions. Such an arrangement is expected to provide a good indication of contact pressure during object manipulation.


Fig. 2 Structure of optical three-axis tactile sensor

3. Control System Structure

Fig. 3 shows the layout of the tactile sensor controller, which consists of sensing and measurement devices. Based on the image data captured by the CCD camera, an image processing board Himawari PCI/S (Library Corp.) in the tactile sensor controller functions as a PCI bus, selects the image, and sends it to an internal buffer created inside the PC main memory. The sampling time for this process is 1/30 seconds. We used a PC running the Windows XP OS with Microsoft Visual C++ installed. The image data are then sent to the image analysis module, where the Cosmos32 software controls the dividing procedure, digital filtering, calculation of integrated grayscale values, and centroid displacement.
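The two quantities the image analysis produces for each sensing element, the integrated grayscale value and the centroid of the bright contact spot, can be sketched as follows. This is an illustrative Python sketch of the standard intensity-weighted centroid computation, not the Cosmos32 implementation itself:

```python
def integrated_grayscale_and_centroid(pixels):
    """Compute the integrated grayscale value G and the
    intensity-weighted centroid (xG, yG) of one sensing-element
    sub-image, given as a 2-D list of grayscale values."""
    G = 0.0
    sx = sy = 0.0
    for y, row in enumerate(pixels):
        for x, v in enumerate(row):
            G += v          # integrated grayscale value
            sx += x * v     # intensity-weighted x sum
            sy += y * v     # intensity-weighted y sum
    if G == 0:
        return 0.0, 0.0, 0.0
    return G, sx / G, sy / G
```

The integrated grayscale value grows with contact pressure (the feeler flattens against the dome), while shifts of the centroid between frames indicate shearing force and slippage.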

The control system architecture of the robot finger is shown in Fig. 4. It is comprised of three modules: a Connection Module, Thinking Routines, and a Hand Control Module. The architecture is connected to the tactile sensor controller by the connection module using TCP/IP protocols.

The Thinking Routines Module's function is to define what kind of information is to be acquired from the tactile sensor, how to translate and utilize this information, and how to send commands to the robot finger to properly control the velocity of the finger motion. As shown in Fig. 4, inside the Thinking Routines Module there is a Thinking Routine Chooser that consists of a Pin Status Analyzer and a Velocity Generator. Moreover, there is a Motion Information Structure that connects to both the Pin Status Analyzer and the Velocity Generator. The Pin Status Analyzer receives information from the tactile sensor about the condition of the sensing elements and uses it to determine a suitable motion mode. It then sends a list of sensing elements to the Connection Module to acquire tactile sensing information. Meanwhile, the Velocity Generator determines the finger velocity based on the Finger Information and Motion Information Structures. The Motion Information Structure consists of the initial velocity and motion flag modes, which are used to control the finger movement, while the Finger Information Structure provides connections to all modules so that they can share finger orientation, joint angles, and tactile sensing data from each sensor element. A User Interface was designed so that the operator can provide commands to the finger control system. The Hand Control Module controls finger motions by calculating joint velocities and angles.
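The two shared structures described above can be sketched as plain data containers. This is a hypothetical Python sketch of their shape (field names and flag values are illustrative assumptions, not the paper's actual definitions, which were written in Visual C++):

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class MotionFlag(Enum):
    """Hypothetical motion flag modes used by the Velocity Generator."""
    GRASP = auto()
    MOVE = auto()
    RE_PUSH = auto()
    STOP = auto()

@dataclass
class MotionInformation:
    """Shared by the Pin Status Analyzer and the Velocity Generator."""
    initial_velocity: float
    flag: MotionFlag

@dataclass
class FingerInformation:
    """Shared across all modules: finger state and tactile data."""
    joint_angles: list = field(default_factory=list)
    fingertip_position: tuple = (0.0, 0.0, 0.0)
    sensor_forces: dict = field(default_factory=dict)  # element id -> (Fn, Fx, Fy)
```

The Pin Status Analyzer would write a new flag into MotionInformation when the sensing-element status changes, and the Velocity Generator would read it together with FingerInformation to compute the commanded joint velocities.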

Fig. 3 Control system structure of optical three-axis tactile sensor

Fig. 4 Control system structure of robot finger

4. Control Strategy and Algorithm

With the aim of optimizing the grasping motion for various types of objects, we propose a control algorithm as shown in Fig. 5. This control algorithm of the robot hand system is based on tactile sensing information acquired from the optical three-axis tactile sensor. To optimize the grasping strategy for unknown objects, grasping pressure was divided into three classifications of object stiffness: soft, medium, and hard. These classifications are used to select the velocity ratio for the re-push motion velocity vp, and the normal-force thresholds F1 and F2 that control finger movement during grasping and object manipulation.
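The mapping from stiffness class to control parameters can be sketched as a lookup table. The numerical values below are placeholders for illustration only; the paper does not report the actual thresholds:

```python
# Hypothetical parameter table: stiffness class -> re-push velocity vp
# and normal-force thresholds F1, F2 (placeholder values, in arbitrary units).
STIFFNESS_PARAMS = {
    "soft":   {"vp": 0.2, "F1": 0.3, "F2": 0.6},
    "medium": {"vp": 0.5, "F1": 0.8, "F2": 1.5},
    "hard":   {"vp": 1.0, "F1": 1.5, "F2": 3.0},
}

def grasp_parameters(stiffness: str) -> dict:
    """Select the re-push velocity vp and the normal-force thresholds
    F1 and F2 for the given stiffness classification."""
    return STIFFNESS_PARAMS[stiffness]
```

A softer class maps to a gentler re-push velocity and lower force thresholds, so a misclassification toward "soft" errs on the safe side.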

At first, when the finger system touches the object, the control system records the time when the slippage sensation exceeds the centroid-change threshold dr. The slippage sensation is defined from the displacement of the centroid position along the x- and y-axes, |dxG| and |dyG|.
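This slip test can be sketched directly from the definition above. A minimal Python sketch, assuming (as the text states) that the SLIP flag is raised when either axis displacement exceeds dr:

```python
def slip_detected(dxG: float, dyG: float, dr: float) -> bool:
    """Raise the SLIP flag when the magnitude of the centroid
    displacement along either the x- or y-axis exceeds the
    centroid-change threshold dr."""
    return abs(dxG) > dr or abs(dyG) > dr
```

In the running system this check would be evaluated every image frame (every 1/30 s) for each sensing element in contact.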

Fig. 5 Control algorithm flowchart

At this moment, the SLIP flag is raised and the reference maximum normal force Fref is recorded. If the displacement of the centroid position continues to exceed the centroid-change threshold dr as time increases by Δt, the control system proceeds with the stiffness distinction process.

During the stiffness distinction process, if the difference between the maximum normal force Fmax and the reference maximum normal force Fref within Δt exceeds the threshold ΔF, the object is classified as a hard object. If the difference is lower than ΔF, the object is classified as a soft object. This process is conducted every time the slippage sensation exceeds the centroid-change threshold dr (the SLIP flag is raised). For safety, once the object has been classified as soft, the system no longer checks for the hard or medium class (this restriction is released when the system is reset). Furthermore, the stiffness distinction process was conducted separately for each finger.
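The distinction rule and the safety latch described above can be sketched as follows. This is an illustrative Python sketch of the stated logic (class names and the two-way hard/soft outcome follow the text; the actual ΔF value is not given in the paper):

```python
class StiffnessLatch:
    """Stiffness distinction with the paper's safety rule: once an
    object is classified as soft, later slip events cannot
    reclassify it harder until the system is reset."""

    def __init__(self):
        self.result = None

    def update(self, F_max: float, F_ref: float, dF_threshold: float) -> str:
        """Run one distinction: a normal-force rise (F_max - F_ref)
        within dt above dF_threshold indicates a hard object."""
        if self.result == "soft":
            return "soft"  # latched for safety
        self.result = "hard" if (F_max - F_ref) > dF_threshold else "soft"
        return self.result

    def reset(self):
        """Release the soft-object restriction."""
        self.result = None
```

Each finger would own its own StiffnessLatch instance, since the distinction is run separately per finger.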

If the final stiffness distinction results for the two fingers do not indicate the same object classification, the control system classifies the object as soft. This conservative selection of object stiffness prevents the robot hand control system from damaging the object or the sensor elements through over-pushing or an incorrect choice of velocity ratio for the re-push motion.
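The two-finger consensus rule is a one-liner. A minimal sketch of the fallback described above:

```python
def combined_classification(finger_a: str, finger_b: str) -> str:
    """Combine the per-finger stiffness results: if the two fingers
    disagree, fall back to the safest classification, soft."""
    return finger_a if finger_a == finger_b else "soft"
```

Because "soft" selects the gentlest re-push velocity and the lowest force thresholds, disagreement never pushes harder than either finger's own estimate would allow.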

Next, the control system proceeds to re-push the object. During the re-push motion, the normal-force thresholds F1 and F2 are selected according to the object stiffness classification. Based on these thresholds, the velocity ratio for re-pushing toward the object and the decision to stop the re-push motion are defined. The re-push velocity vp is set according to the stiffness classification: soft, medium, or hard. The finger stops the re-push motion when the detected normal force Fn exceeds the threshold values for the respective stiffness classification.
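The re-push motion can be sketched as a simple closed loop: advance the fingertip at the class-dependent velocity until the measured normal force reaches the stopping threshold. This is an illustrative Python sketch under an assumed sensor model, not the paper's controller; `Fn_read` stands in for a tactile-sensor reading as a function of fingertip position:

```python
def re_push(Fn_read, F_stop: float, vp: float, max_steps: int = 1000) -> float:
    """Simulate the re-push motion: move the fingertip forward at
    velocity vp per control step until the measured normal force
    Fn_read(position) reaches the stopping threshold F_stop.
    Returns the final fingertip position."""
    position = 0.0
    for _ in range(max_steps):
        if Fn_read(position) >= F_stop:
            break  # optimum grasp pressure reached: stop re-pushing
        position += vp
    return position
```

For example, with a linear spring-like contact model Fn = k * position, a stiffer object (larger k) stops the motion after a shorter travel, which matches the observation that hard objects produce high normal forces at small deformations.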

5. Experiment

We conducted a series of experiments to evaluate the performance of the control algorithm and grasping strategy. The first experiment was to grasp and lift hard and soft objects. The second experiment was to grasp an empty paper cup and then pour water into the cup.

5.1 Experiment I: Grasp and lift hard and soft object

The purpose of this experiment is to evaluate the proposed algorithm's stiffness recognition. The hard object was an aluminum block, and the soft object was a paper box. The experimental conditions are shown in Fig. 6. In this experiment, both fingers move along the x-axis to softly grip the object and define the optimum grasp pressure in the grasping mode. Then both fingers lift the object along the z-axis in the moving mode.

In the experiment with the hard object, the reaction force applied to the tactile sensor elements was large because the elasticity coefficient of the hard object is high. Therefore, the detected normal force becomes high. Meanwhile, the object's weight caused a large slippage.

For the soft object, a small reaction force is applied to the sensing elements because the elasticity coefficient of soft objects is low. Accordingly, the detected normal force becomes low. Therefore, to correlate the stiffness distinction of both hard and soft objects, we utilized the increment of normal force ΔF, calculated within a specified progress time, as the stiffness distinction parameter.

From this experiment, to comply with the slippage that occurred particularly for the hard object, we considered the amount of centroid change dr in the x-direction (dxG) and y-direction (dyG) of the fingertip coordinate frame, by means of the shearing force distribution. If the slippage exceeds the dr value, the finger re-pushes toward the object to prevent it from slipping. However, if the detected ΔF is lower than a specified value (i.e., a soft object), the finger system uses the dr value to control the finger's re-push velocity so that the grasping motion becomes gentler and finally stops when the centroid change exceeds a specified dr value.

Fig. 6 Experiment I: Grasp and lift hard and soft objects

Fig. 7 Experiment II: Grasp paper cup and sudden weight change

5.2 Experiment II: Grasp empty cup and pour water

Based on improvements from Experiment I, we conducted an experiment to verify the performance of stiffness recognition under changes of object weight. The object was an empty paper cup weighing about 4 grams. The motion was planned so that both fingers move along the x-axis to grip the cup, then lift it along the y-axis within 80 sec of progress time. As shown in Fig. 7, at 40 sec we poured 60 ml of water into the cup, after 55 sec we poured another 30 ml, and finally 20 ml after about 70 sec. This was done to analyze the control system's performance against sudden changes in the object's weight.

In this experiment, both fingers first softly touch the cup to recognize its stiffness and define the optimum grasping pressure. Based on the proposed control algorithm, the control system recognized the stiffness of the paper cup. When the optimum gripping pressure is satisfied, the fingers manipulate the paper cup by lifting it without crushing it. At this moment, the re-push movement increases the detected normal force, while the parameters F1 and F2 are used to control the grip force so that the fingers do not crush the cup. When water is poured into the cup, slippage was detected by the tactile sensors. The finger system responds by adjusting the fingertip positions to tighten the grip and prevent the cup from slipping out and dropping. The fingertip movements and the detected normal force are compiled in graphs for analysis, as shown in Fig. 8.


Fig. 8 Normal force and x-directional fingertip position

6. Conclusion

In this research we presented a grasping strategy and control algorithm for robotic fingers equipped with optical three-axis tactile sensors. The performance of the proposed strategy and algorithm was evaluated in experiments with real objects. The results from both experiments revealed that the robot hand control system managed to recognize the stiffness of the object. The system also responded effectively to sudden changes in the object's weight by moving both fingers to re-push the cup, tightening its grip and preventing the cup from slipping. The optical three-axis tactile sensor provided valuable tactile information for the robot control system to recognize contact interaction. It is anticipated that this novel control algorithm with tactile sensing technology will help advance real-time object manipulation.

Acknowledgement

This research was partially supported by a Grant-in-Research from the Japan Society for the Promotion of Science (JSPS) for fiscal years 2008-2010. The authors gratefully acknowledge the Ministry of Higher Education Malaysia (MOHE) and Universiti Teknologi MARA (UiTM) through the Fundamental Research Grant Scheme (FRGS) (600-RMI/FRGS 5/3/Fst/ (31/2011)).

References

[1] O. Kerpa, K. Weiss and H. Worn, "Development of a flexible tactile sensor system for a humanoid robot", in Proc. IROS2003, vol. 1, pp. 1-6, 2003.

[2] Y. Kuniyoshi, Y. Ohmura, and K. Terada, "Embodied basis of invariant features in execution and perception of whole-body dynamic actions -knacks and focuses of roll-and-rise motion," J. Robotics and Autonomous Systems, vol. 48, pp. 189-201, 2004.

[3] M.H. Lee and H.R. Nicholls, "Tactile sensing for mechatronics: a state of the art survey," Mechatronics, 9(1), pp. 1-31, 1999.

[4] H. R. Nicholls, "Tactile sensing using an optical transduction method, Traditional and Non-Traditional Robot Sensors" (Edited by T. C. Henderson), Springer-Verlag, pp. 83-99, 1990.

[5] S. Omata, Y. Murayama and C.E. Constantinou, "Real time robotic tactile sensor system for determination of the physical properties of biomaterials," J. Sensors and Actuators A, 112(2-3), pp 278-285, 2004.

[6] M. Ohka, H. Kobayashi and Y. Mitsuya, "Sensing precision of an optical three-axis tactile sensor for a robotic finger", in Proc. 15th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2006), pp. 220-225, 2006.

[7] H. Yussof, M. Ohka, et al., "Sensing performance of an optical three-axis tactile sensor system with application in multi-fingered humanoid robot arm", in Proc. of Int. Conf. on Intelligent Automation and Robotics (ICIAR07), pp. 504-509, 2007.