

Procedia Computer Science 104 (2017) 485-492

ICTE 2016, December 2016, Riga, Latvia

Use of Delta Robot as an Active Touch Device in Immersive Case Scenarios


Damian Grajewski, Filip Gorski, Dominik Rybarczyk, Piotr Owczarek, Andrzej Milecki, Pawel Bun

Poznan University of Technology, Faculty of Mechanical Engineering and Management, Piotrowo Str. 3, PL-60-965 Poznan, Poland


Abstract

The paper presents a new approach to creating interactive simulations for testing tactile interaction with the user, which involves the use of a low-cost device. The role of the haptic (touch) device and its functionality is fulfilled by a manipulator with a parallel kinematic structure (Delta robot). This required the development of methods of integrating the robot with selected Virtual Reality (VR) systems. Therefore, a new element was introduced into the virtual environment: a device to simulate real objects. In the Virtual Environment (VE), the user interacted with a simulated object, and the task of the Delta robot was to move into a position which enabled the interaction. In this way, the VE became more responsive because it corresponded to the behavior of the user, who can move around, interact with digital objects and directly experience their physical properties (e.g. size, shape). According to the authors, this type of application will be able in the future to support the effectiveness of virtual training, with particular emphasis on educational simulation (e.g. performing procedural tasks).

© 2017 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

Peer-review under responsibility of organizing committee of the scientific committee of the international conference ICTE 2016.

Keywords: Virtual reality; Delta robot; Immersive simulations; Virtual training

1. Introduction

Nowadays, virtual and haptic technologies are used in a supportive role in many branches of industry: from the military industry, through the e-entertainment industry, medicine and education, to engineering. One can observe the use of Virtual Reality (VR) as a set of techniques for prototyping new devices (military vehicles, weapons). Virtual Environments (VE) are also used to build interactive simulations, including simulations of combat conditions1. In the e-entertainment industry, it is mainly interactive devices for computer games that have raised the level of interaction in virtual game worlds, from the user's perspective, to a previously unseen level. In the medical industry, Virtual Reality solutions have found application in medical engineering (prototyping of medical devices, computer-aided design of medical infrastructure)2. For several years, virtual technologies have been used to create interactive educational applications for medical environments3,4,5,6, to build specialized surgical simulators and virtual simulations of treatment procedures7,8,9,10,11, as well as to create applications for rehabilitation of patients12,13.

* Corresponding author. Tel.: +48 61 665 2718; fax: +48 61 665 2774.

doi:10.1016/j.procs.2017.01.163

In the field of mechanical engineering and production engineering, VR solutions effectively support the primary processes of designing products and processes. Virtual environments enable real-sized, realistic visualization of the CAD data of a product, among other things to optimize the entire design before manufacture, and enable diagnosis of the environments in which the products will be used. When it comes to support of process design, virtual reality provides tools for simulation, analysis and optimization of manufacturing and assembly processes. These technologies are also increasingly used to conduct research in the field of design of ergonomic workplaces14. This is possible thanks to the broad possibilities of intuitive, realistic interaction with VR environments, which can contribute to broadening the scope of research on the one hand, and to reducing the costs associated with the production of physical prototypes of workplaces on the other.

One of the fastest-developing areas of the use of Virtual Reality in production engineering is interactive systems for industrial training. Until recently, a major constraint on the development of virtual training applications was the cost of the required devices. A significant increase of interest in VR solutions (mainly from the e-entertainment industry) led to the development of many new low-cost VR devices (for example, low-cost Head Mounted Displays - a trend started by the Oculus Rift device). As a result, VR technology became more widely available, which also has its impact on solutions for virtual training applications15,16.

Virtual simulation as a training tool is used more and more in cases where traditional training may compromise the health and life of the trainee, or in situations where implementation of the training requires large investments. Modern IT tools can help generate virtual environments which are complex both visually and logically. Such interactive VR systems allow the user to interact with elements of the created three-dimensional world.

VR solutions can support traditional methods of training procedural tasks17,18,19,20,21,22,23,24. VR systems can be successfully used to improve the efficiency, quality and productivity of a trainee. The user can interact with a virtual object, feel collisions and, most importantly, consistently practice tasks and "learn by doing". In this way, it is possible to obtain a ready-made solution with which a trainee can practice until a satisfactory level of proficiency at a given task is reached.

Analysis of the potential of VR systems as environments dedicated to interactive simulations, with particular emphasis on user interaction with the virtual environment, leads to the identification of different approaches to the creation of virtual applications and training systems. There are two main approaches: an immersive one, where the user is free to walk around and uses immersive gear (e.g. a Head Mounted Display), and a haptic one, where the user remains stationary while manipulating a force-feedback device. These approaches are presented in detail in the next part of the paper.

This paper presents a new approach to building training VR applications, focused on touch-based interactions in an immersive environment. Usually, building solutions that integrate the haptic and immersive approaches requires very high investments, especially in hardware, and is very rarely practiced. The authors present a new approach, with the use of a low-cost robot as an active haptic device.

2. Basic approaches to VR training simulations

2.1. Immersive approach

The immersive approach involves the use of advanced systems for stereoscopic visualization, integrated with tracking systems. In the case of interactive training for industry, it is an immersive simulation of a selected workplace, prepared on the basis of its CAD model. The scenario of such a simulation takes into account the characteristics of an actual process (assembly, welding, etc.) which occurs at the real workplace; usually it is a manual process. An example of such a solution is the "Visionary Render" environment from Virtalis25, in which the user can review the process of assembling machines (including large-scale ones) and then practice the corresponding sequence of assembly operations. Stereoscopic visualization of this process is displayed with the support of a large-area projection system. User movements are tracked by a magnetic tracking system. The practical exercise consists of grasping a part (using a peripheral device, a 3D mouse) and placing it in a designated location. Another example is the simulation of the process of clamping screws to a car body26. This time, the user simulates the execution of the assembly process using special markers.

The immersive approach provides a high level of immersion, which undoubtedly affects the quality of the perceived simulation and the received training. The interaction, although effective and intuitive, is somewhat limited, mainly due to the lack of a tactile signal (force feedback). The absence of tactile (haptic) interaction makes the manipulation of objects somewhat symbolic for the user, as the so-called "ghost effect" occurs (the ability to move or lift heavy parts which in reality would be impossible to pick up, and the lack of touch feedback in the event of a collision).

The limited realism of educational simulations which use only non-tactile interaction techniques became the impetus for the creation of a hybrid approach. For this purpose, low-cost Rapid Prototyping (RP) devices are commonly used. In this approach, physical models of real objects are introduced into the virtual environment. These models reproduce the studied objects in the virtual environment and are necessary for the proper conduct of the simulation (e.g. models of hand tools or control panels). Physical models made using RP techniques must be associated with markers of the tracking systems, to enable tracking their position for the immersive simulation. This kind of work is already known in the literature3,14. An example is the immersive simulation of a workplace for assembly (screwing bolts into the body of an engine) using a hand tool (a wrench). The user, whose movements and gestures are recorded by the tracking system, is equipped with an HMD and a physical model of the wrench made by 3D printing. The use of a physical prototype extends the realism of the simulation, but there is still a considerable lack of a tactile signal, which decreases the immersion level.

2.2. Haptic approach

This approach assumes the use of a haptic manipulator with force feedback, in order to enable effective tactile interaction with a virtual environment. In this approach, HMDs are rarely used, mainly due to limitations of the haptic devices (lack of mobility, small working area). Therefore, the level of visual and kinetic immersion in this approach is low, as the user remains physically bound to the same place and cannot freely walk or move his hand. However, the user is presented with a stream of tactile data, which in certain cases may be more useful than visual immersion - especially in training of precise surgery or robot tele-operation, where the level of applied force is a very important factor.

The haptic approach is also used in hybrid solutions. Physical prototypes (frequently manufactured with the assistance of 3D printing technologies) are mounted on the end effector of the haptic manipulator. For example, with the support of 3D printing techniques, a physical model of a welding gun is attached to the end effector of the manipulator, in order to enhance the realism of the simulated process. This kind of work is also already known in the literature14.

On the basis of an analysis of the currently dominating approaches in the creation of interactive training applications, it can be noted that tactile interaction is limited to user contact with the end effector of the haptic manipulator, whose main task is to provide tactile stimuli, mainly in the form of force feedback. There is also no freedom of movement for the user (trainee). It is also worth noting that the cost of both purchase and further development of commercial solutions for interactive educational simulation is very significant. Therefore, the authors propose a new approach to creating interactive simulations of this type.

3. The new approach to VR training

The authors have developed a new approach to the creation of interactive simulations for studies of haptic interactions with a user. A new element was introduced into the virtual environment: an active device simulating real objects. The role of this device is played by the Delta robot. A user (participant of a virtual scene) obtained the possibility to interact with the simulated object in a tactile manner, while the task of the Delta robot was to move itself to a position allowing such an interaction. This approach required the development of a method of integration of the Delta robot with appropriate VR systems (tracking systems, Head-Mounted Display) and physical, real objects (props). An environment prepared on the basis of this approach is a ready solution: a low-cost system for the study of haptic interactions in a virtual environment. The concept of the solution is presented in Fig. 1.

Fig. 1. Concept of a system for studying haptic interactions in VR environment.

The initial studies carried out with the system concerned haptic interaction between the test person and the end effector of the robotic manipulator (on which a physical object was mounted) paired with its virtual representation. For example, the user can see a virtual model of a table and is able to touch it: the hand touches the flat surface on the haptic manipulator, which follows the hand. Appropriate programming of the interaction ensures the effect of touching a virtual object.

4. Integration of an active haptic device (Delta robot) with VR systems - methods and results

The concept of building a system for studies of haptic interaction using a manipulator of parallel kinematic structure required the development of a method of integration of the active device (the Delta robot) with VR software, VR systems and physical objects (props). The method assumes activities aimed at hardware integration of the robot with the virtual work environment. It required the development of:

• Robot control algorithm

• Communication interfaces with VR environment

• Algorithm of positioning the end effector of the haptic device

In the first stage of work, the available communication interfaces were analyzed. The types of information which should be transferred to and from the robot, as well as to and from the virtual environment, were determined. Then, work was undertaken on the physical connection between the Delta robot and the immersive VR devices. Parameters of the transmission were checked, including its speed and reliability. Simultaneously, communication software for the VR system was created. After each stage of the work, launch and verification tests were conducted. It was assumed that, for the preliminary tests, the system would only display a simple rectangular table of variable dimensions. The task of the manipulator was to position its end effector in such a place as to represent the upper edge of the table in front of the operator's tracked hand. It is noteworthy that it was particularly difficult to obtain (determine) information about the current position of the virtual edge in 3D space. An additional difficulty was the determination of the current position and movement direction of the operator. This part of the task required writing a software algorithm cooperating with a tracking system used to measure the position of the operator's hand in real time.

Verification of the integration between the Delta robot, the VR systems and real physical objects consisted of tests verifying the operation of the system (simulation of touching the table). During these tests, the positioning accuracy of the end effector was tested, as well as the reaction time of the system to movements of the user of the virtual scene. For the design needs of the system for haptic interaction study, a digital kinematic model of the robot was developed, to find the angles at the robot joints depending on the position and orientation of the end effector (inverse kinematics). As an input to the system, the XYZ coordinates of the end effector were entered.
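The inverse kinematics of a delta manipulator can be solved arm by arm, intersecting the circle traced by each upper arm with the sphere reachable by its forearm. The sketch below is a minimal Python illustration of this standard geometric derivation, not the authors' actual controller code; the link lengths are placeholder values for a robot of roughly the described workspace.

```python
import math

def delta_ik_arm(x0, y0, z0, f, e, rf, re):
    """Joint angle (rad) for one arm of a delta robot, solved in the arm's YZ plane.
    f: base triangle side, e: effector triangle side,
    rf: upper arm length, re: forearm (parallelogram) length.
    Assumes the effector is below the base plane (z0 < 0)."""
    if z0 == 0.0:
        return None
    t = 0.5 * math.tan(math.radians(30))   # offset factor for the triangle sides
    y1 = -t * f                            # y-coordinate of the shoulder joint
    y0 -= t * e                            # shift the effector joint into the arm frame
    # Intersect the shoulder circle with the sphere reachable by the forearm
    a = (x0 * x0 + y0 * y0 + z0 * z0 + rf * rf - re * re - y1 * y1) / (2.0 * z0)
    b = (y1 - y0) / z0
    d = -(a + b * y1) ** 2 + rf * (b * b * rf + rf)  # discriminant
    if d < 0:
        return None                        # target outside the workspace
    yj = (y1 - a * b - math.sqrt(d)) / (b * b + 1.0)
    zj = a + b * yj
    theta = math.atan(-zj / (y1 - yj))
    return theta + math.pi if yj > y1 else theta

def delta_ik(x, y, z, f=0.4, e=0.1, rf=0.3, re=0.5):
    """Angles of the three arms; the frame is rotated by 120 degrees per arm."""
    c, s = -0.5, math.sqrt(3) / 2.0        # cos(120deg), sin(120deg)
    return (
        delta_ik_arm(x, y, z, f, e, rf, re),
        delta_ik_arm(x * c + y * s, y * c - x * s, z, f, e, rf, re),
        delta_ik_arm(x * c - y * s, y * c + x * s, z, f, e, rf, re),
    )
```

A convenient sanity check is a target on the robot's vertical axis, for which the three joint angles must coincide by symmetry.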

Moreover, general assumptions regarding the robot structure were made: the number of its degrees of freedom, workspace limits, forces, velocities and positioning accuracy. A base structure was proposed on the basis of the kinematics of a universal delta manipulator. Initially, it was assumed that the zone of operation of the manipulator would be approx. 1 m x 1 m x 0.7 m. For the operator's safety, velocities were kept lower than 100 mm/s, while the maximal generated force did not exceed 20 N. On the basis of these assumptions, drives were selected, as well as measurement and control elements. Direct current motors with reducing transmissions were selected as the drives. Rotary encoders were used as measurement elements. Subsequent work required the development of a concept of the control system, to select specific drive, measurement and control units for the manipulator elements. As the main control unit of the active haptic device, a ready solution was selected: an industrial computer adapted to controlling drives (3 axes minimum) and to communication with other devices using network interfaces.

On the basis of the conducted identification and initial assumptions, the mechanical parts of the manipulator were designed. Standard elements were selected and 3D design documentation of the whole manipulator was made. The end effector was also designed at this stage, along with the control system. Then, the mechanical and control parts of the manipulator were built. The arms of the robot were made of lightweight, flexible aluminum tubes. Thanks to this, the device is safe in operation and can work near human operators without causing any danger in case of improper operation or programming. As the drives, modern permanent magnet synchronous motors (PMSM) were used, with a nominal torque of M = 0.64 Nm. The motors were coupled with encoders for angular position measurement, with a resolution of 4200 imp/rev. Each motor was coupled with a transmission of 1:30 ratio. The whole robot is presented in Fig. 2.

Fig. 2. The Delta robot.

The control system of the Delta robot was based on an industrial computer. The controller was connected to inverters. Each inverter was equipped with a communication and encoder card, while each drive was regulated by an independent PID (proportional-integral-derivative) controller. The controller parameters were selected experimentally. In a further stage, controlling software was created for the manipulator. The program was written in the Automation Studio environment, in the Structured Text language. The program was divided into subprograms, according to the particular tasks performed by the device. The main thread of the controlling program was responsible for the inverse kinematics calculation.
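The per-drive regulation described above follows the generic discrete-time PID law. The sketch below illustrates it in plain Python rather than the Structured Text actually used on the controller; the gains, the sample time and the pure-integrator plant model are illustrative assumptions, not the experimentally tuned values from the paper.

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def simulate(setpoint=1.0, steps=3000, dt=0.01):
    """Close the loop around a toy drive modeled as a velocity integrator."""
    pid = PID(kp=2.0, ki=0.5, kd=0.05)
    position = 0.0
    for _ in range(steps):
        velocity = pid.update(setpoint - position, dt)
        position += velocity * dt   # plant: position integrates commanded velocity
    return position
```

On the real manipulator, each of the three drives runs its own independent loop of this form, with gains chosen experimentally as the paper notes.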

The determined values of the x, y and z variables were directed into a so-called TaskClass, which was responsible for servicing the drives. The class was also used to define the values of velocities and accelerations of the particular robot arms. Another subprogram was responsible for the realization of communication between a remote computer (i.e. the VR environment) and the haptic manipulator. The whole software runs on the real-time operating system Automation Runtime.

Communication between the virtual environment and the Delta-based haptic device was realized using the Ethernet interface and the User Datagram Protocol (UDP). The desired position of the end effector (XYZ), as calculated in the VR environment, was used as input data to the robot software. The data was sent in string format (a single string with dividing characters between the numbers) and then split and converted to the float format. Transmission was realized with a 20 Hz frequency. The basic communication algorithm described above allows remote control of the robot end effector by sending the desired position from a remote computer, through UDP, in an appropriate format. On top of this communication scheme, a framework virtual environment was built, as presented in Fig. 3. Using the EON Studio software, a digital model of the Delta robot was created and its inverse kinematics were programmed. A virtual hand model was created and implemented in the application. Its real-time tracking was ensured by connecting the virtual model to position data from the large-area tracking system. This allows a user who wears an optical marker (see Fig. 4) to control the position of the virtual hand in real time. To enable immersive interaction, a stereoscopic camera set was implemented in the application. The position of the user's head was also tracked, using the same optical tracking system (WorldViz PPT X with 4 cameras). Rotational tracking was realized by the professional inertial tracker InertiaCube2+. Basic algorithms of communication between these devices and the virtual environment were also prepared inside the EON Studio environment.
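The string framing described above, coordinates joined into one string with dividing characters, then split and converted back to floats on the robot side, can be sketched in a few lines of Python. The semicolon delimiter, the host/port values and the sleep-based 20 Hz pacing are assumptions for illustration; the paper does not specify them.

```python
import socket
import time

def encode_position(x, y, z):
    """Pack XYZ as a single delimited string, as in the described protocol."""
    return "{:.4f};{:.4f};{:.4f}".format(x, y, z).encode("ascii")

def decode_position(datagram):
    """Split on the dividing character and convert each field to float."""
    return tuple(float(v) for v in datagram.decode("ascii").split(";"))

def stream_positions(positions, host="192.168.0.10", port=5000, rate_hz=20.0):
    """Send a sequence of end-effector targets over UDP at a fixed rate."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    period = 1.0 / rate_hz
    for x, y, z in positions:
        sock.sendto(encode_position(x, y, z), (host, port))
        time.sleep(period)   # crude fixed-rate pacing (~20 Hz)
    sock.close()
```

UDP fits this use well: a lost or late datagram is simply superseded by the next 50 ms sample, so no retransmission logic is needed.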

To ensure the proper relative position of the virtual robot towards the user's hand, calibration coefficients were introduced programmatically. Each time the robot base is physically moved, its position must be measured using the tracking system marker (to obtain position values in the same coordinate system as the user's) and introduced into the framework calibration algorithm. The accuracy of the PPT X tracking system stays within a sub-millimeter range, so it is an acceptable tool for performing the calibration, as its accuracy is better than the positioning of the robot itself.
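In the simplest case, assuming the robot axes are kept aligned with the tracking-system axes, the calibration reduces to a translational offset between the two coordinate frames, measured by placing the marker on the robot base. A minimal sketch of that assumed scheme:

```python
def calibrate_base(marker_pos_world):
    """Offset of the robot base in the tracking (world) frame,
    obtained by measuring the base with a tracking marker."""
    return tuple(marker_pos_world)

def world_to_robot(p_world, base_offset):
    """Express a tracked point (e.g. the user's hand) in robot coordinates."""
    return tuple(pw - bo for pw, bo in zip(p_world, base_offset))

def robot_to_world(p_robot, base_offset):
    """Inverse mapping, used to place the virtual robot model in the scene."""
    return tuple(pr + bo for pr, bo in zip(p_robot, base_offset))
```

If the base can also rotate relative to the tracking frame, a full rigid transform (rotation plus translation) would be needed instead of a pure offset.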

The main function of the framework application (which will be used to develop more advanced touch interaction scenarios in the future) is the simulation of touching simple geometrical objects, such as cuboids. To simulate a cuboid, the positions of its two opposite vertices must be defined in a dedicated script, which limits the possible positions of the virtual robot's end effector only to positions within that cuboid. Then, after enabling the simulation option (which is done programmatically, with the possibility to switch it off at any time for the user's safety), the end effector is bound to the user's hand position and always follows it, staying inside the defined spatial shape. As a result, the user is able to touch and feel the shape in space without having to hold onto the end effector, and with the full possibility of moving around the shape, which is also displayed in front of his eyes.
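The constraint described above, the end effector following the hand but confined to the cuboid spanned by two opposite vertices, amounts to a per-axis clamp. A minimal sketch (with the vertex ordering handled explicitly, since the two defined corners need not be sorted axis-wise):

```python
def clamp_to_cuboid(hand_pos, vertex_a, vertex_b):
    """Target for the end effector: the tracked hand position clamped,
    axis by axis, into the axis-aligned cuboid spanned by two opposite vertices."""
    target = []
    for p, a, b in zip(hand_pos, vertex_a, vertex_b):
        lo, hi = (a, b) if a <= b else (b, a)   # order the corners per axis
        target.append(min(max(p, lo), hi))      # clamp into [lo, hi]
    return tuple(target)
```

When the hand is outside the cuboid, the clamped target lies on the nearest face, so the user's hand meets the physical end effector exactly at the virtual surface.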

Fig. 3. Framework application for simple haptic interaction study in the EON Studio software.

Before each experiment, the robot position must be determined and entered into the calibration algorithm. Wrong calibration could pose a potential danger to the user, even considering the low stiffness and torque of the manipulator. Full compatibility must be reached before conducting any haptic interaction tests.

Calibration, as described above, was the first step in the performed tests. Then, first simple touch interaction tests were performed with users experienced both with VR and robots. After initial tests, small corrections were introduced in the control system of the Delta manipulator, to optimize fluency of its operation in interactive, virtual simulations. Then, touch interaction tests were performed with participation of five users in total and with different sizes of virtual objects (see Fig. 4). To compare the feeling of immersion, users were first introduced to the simulation without the active haptic device. Then, after getting familiar with the environment, they were presented with the haptic Delta robot and were asked to evaluate the feeling on a comparative, 3-point scale (worse, same, better).

Fig. 4. Haptic interaction with a Delta robot using the prepared application framework.

The following preliminary results were noted:

• All the users found the immersion to be considerably higher with the active touch device

• The users found it natural to move around and touch the object from different locations in space; the possibility of a virtual walk around large-sized objects was rated highly

• Some inaccuracies were noted because of the lack of rotational tracking of the user's hand; this can be improved by implementing an additional device in the form of a glove or a wristband, to make the virtual hand fully compatible with the real hand

• Some problems were observed with the fluency of robot movement during rapid movements of the user's hand in space; this can be avoided by further improvement of the control software. However, the manipulator was built in a low-cost approach, so such problems can also be caused by hardware issues, and upgrading to better components could help

5. Conclusion

The proposed approach to the development of interactive simulations for the study of haptic interaction with a user is complementary to the current state of knowledge in the area of the use of Virtual Reality and haptics in interactive training simulations. Because of significant hardware (small working area) and economic (very high price) limitations, haptic interaction in existing solutions was limited to continuous contact between the user and the end effector of the haptic manipulator to obtain certain force-feedback effects, with no free movement of the user in an immersive space.

Thanks to the development of an appropriate method of integration with VR systems, the function of a haptic device can be fulfilled by a low-cost manipulator of parallel kinematic structure (Delta robot). It can be successfully used as a new type of interaction device in a virtual environment: an active touch device for the simulation of real objects. The user (participant of a virtual scene) is able to interact with a simulated object, and the Delta robot is responsible for allowing the user to feel the resistance caused by the presence of this object. It is possible to build such a system in a completely low-cost approach, using HMD devices such as the Oculus Rift or HTC Vive, which is planned in the further stage of development (the created framework is compatible with any type of HMD, as long as the user's hand position can be directly accessed by the algorithm).

Preliminary analysis of the results of the performed experimental studies shows that this type of application will be able to increase the effectiveness of virtual training, with particular consideration of industrial training scenarios. There are still many problems to be solved, especially regarding changing between different shapes and materials in the same simulation. However, the authors plan further development of the proposed solution. Besides hardware improvements in the system structure, a software tool will be prepared for learning more complex procedural tasks.


References

1. Pajak E et al. Techniki przyrostowe i wirtualna rzeczywistosc w procesach przygotowania produkcji [Additive techniques and virtual reality in production preparation processes]; 2011.

2. Pezzementi Z, Ursu D, Misra S, Okamura A. Modeling realistic tool-tissue interactions with haptic feedback: A learning-based method. Symposium on haptic interfaces for virtual environment and teleoperator systems; 2008. p. 209-215.

3. Bun P, Gorski F et al. Immersive educational simulation of medical ultrasound examination. Procedia Computer Science. Vol. 75; 2015. p. 186-194.

4. Coles TR. Investigating Augmented Reality Visio-Haptic Techniques for Medical Training. Bangor University; 2011.

5. Hamrol A et al. Virtual 3D atlas of a human body - development of an educational medical software application. Procedia Computer Science. Vol. 25; 2013. p. 302-314.

6. Zhang L et al. The added value of virtual reality technology and force feedback for surgical training simulators. J of Prevention, Assessm. and Rehab. 41; 2012. p. 2288-2292.

7. de Bruin ED, Schoene D, Pichierri G, Smith ST. Use of virtual reality technique for the training of motor control in the elderly. Zeitschrift für Gerontologie und Geriatrie. 43; 2010. p. 229-234.

8. Farber M et al. VR simulator for the training of lumbar punctures. Methods Inf. Med. 48(5); 2009. p. 493-501.

9. Mafi R, Sirouspour S, Mahdavikhah B, Moody B, Elizeh K, Kinsman A, Nicolici N. A Parallel computing platform for real-time haptic interaction with deformable bodies. IEEE Transactions on Haptics. 3(3); 2010. p. 211-223.

10. Panait L et al. The role of haptic feedback in laparoscopic simulation training. J of Surgical Research. 156; 2009. p. 312-316.

11. Ullrich S et al. Virtual needle simulation with haptics for regional anaesthesia. IEEE Virtual Reality; 2010. p. 3-5.

12. Cameirao MS, Bermudez BS, Duarte E, Frisoli A, Verschure PFMJ. The combined impact of virtual reality neurorehabilitation and its interfaces on upper. Stroke. 43; 2012. p. 2720-2728.

13. Girone M et al. Orthopedic rehabilitation using the "Rutgers ankle" interface. Stud Health Technol Inform. 70; 2000. p. 89-95.

14. Grajewski D, Gorski F, Zawadzki P, Hamrol A. Application of virtual reality techniques in design of ergonomic manufacturing workplaces. Procedia Computer Science. 25; 2013. p. 289-301.

15. Bun P, Gorski F, Wichniarek R, Kuczko W, Hamrol A, Zawadzki P. Application of professional and low-cost head mounted devices in immersive educational application. Procedia Computer Science. 75; 2015. p. 173-181.

16. Bun P, Gorski F, Wichniarek R, Kuczko W, Hamrol A, Zawadzki P. Application of low-cost tracking systems in educational training applications. Procedia Computer Science. 75; 2015. p. 398-407.

17. Bhatti A, Khoo Y, Bing C, Douglas AJ, Nahavandi S, Zhou, M. Haptically enabled interactive virtual reality prototype for general assembly. Proceedings of the World Automation Congress WAC; 2008. p. 1-6.

18. Gupta R, Whitney DE, Zeltzer D. Prototyping and design for assembly analysis using multimodal virtual environments. Computer-Aided Design. 29(8); 1997. p. 585-597.

19. Jayram S, Connacher HL et al. Virtual assembly using virtual reality techniques. Computer Aided Design. 29(8); 1997. p. 575-584.

20. McDermott SD, Bras B. Development of a haptically enabled dis/re-assembly simulation environment. Proc of DETC'99; 1999.

21. Ritchie JM, Lim T, Medellin H, Sung RS. A haptic based virtual assembly system for the generation of assembly process plans. Memorias del XV Congresso Internacional Anual de la Somim; 2009. p. 585-593.

22. Rodriguez J, Gutierrez T, Sanchez EJ, Casado S, Aguinaga I. Training of procedural tasks through the use of virtual reality and direct aids. Virtual Reality and Environments; 2012.

23. Seth A, Su HJ, Vance JM. SHARP: A system for haptic assembly & realistic prototyping. Proceedings of the DETC'06; 2006. p. 1-8.

24. Wan H, Gao S et al. Mivas: A multi-modal immersive virtual assembly system. Proceedings of DETC. 4; 2004. p. 113-122.

25. VR addresses engineering's demographic time bomb; 2013. Available:

26. Haption products; 2016. Available:

Damian Grajewski, research assistant at the Chair of Management and Production Engineering. Currently he is working on the PhD thesis "Study of tactile interaction in Virtual Reality using Delta robot". His research interests include Virtual Reality, haptic technology, and product and process design. He is author and co-author of more than 20 publications.