
Available online at www.sciencedirect.com

ScienceDirect

Procedia Technology 20 (2015) 149-154

The International Design Technology Conference, DesTech2015, 29th of June - 1st of July 2015, Geelong, Australia

Taking the LEAP with the Oculus HMD and CAD - plucking at thin air?

Nathan Beattie*, Ben Horan, Sophie McKenzie

School of Engineering, Deakin University, 75 Pigdons Road, Waurn Ponds 3216, Australia

Abstract

Achieving adequate visualisation of designs within CAD packages remains a challenge for designers, with current methods of 3D CAD visualisation requiring either a high level of technical ability or expensive hardware and software. The recent re-emergence of consumer VR has lowered the barrier for everyday developers wanting to visualise their designs in true 3D. This paper presents the CAD Interaction Lab (CIL), which employs the Oculus Rift Head Mounted Display (HMD) and Leap Motion Controller (LMC) to provide a low-cost method enabling users to use their hands to dissect a mechanical model and to manipulate and inspect individual components in realistic 3D. Qualitative observations of user interactions with the CIL show that users were able to intuitively manipulate the CAD model using natural hand movements with only minimal instruction.

© 2015 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

Peer-review under responsibility of School of Engineering, Faculty of Science Engineering & Built Environment, Deakin University

Keywords: Oculus Rift; LEAP Motion; CAD interaction; HCI

1. Introduction

Virtual Reality (VR) for Computer Aided Design (CAD) and visualisation has been a popular topic in recent years [1-5]. Differing levels of VR immersion are discussed by [3], ranging from technology such as 2D screens for little to no immersion, projection walls for moderate immersion, and Head-Mounted Displays (HMDs) for full immersion. VR allows users to view and navigate three dimensional (3D) virtual environments in ways not possible with conventional two dimensional (2D) displays. Combining VR and CAD makes it possible for users to view

* Corresponding author. E-mail address: nbeattie@deakin.edu.au

doi:10.1016/j.protcy.2015.07.025

their designed model in front of them in realistic 3D, in a similar fashion to viewing the physical realised artefact. Through the integration of hand-tracking technology, designers can interact with the model using natural hand movements and gestures. Researchers have proposed the use of CAVE Automated Virtual Environments (CAVEs) for 3D CAD [5, 6], in which the user views a 3D representation of a virtual environment on each wall, an approach shown to provide a highly immersive experience. Despite the high level of immersion achievable, CAVEs can be prohibitively expensive for some applications and are in-situ, only allowing visualisations to occur in one location. In [1] a system was designed employing HMDs, providing arguably lower cost and higher portability than a CAVE or other stereoscopic VR technologies. Recently, the Oculus Rift has gained popularity as a consumer-grade VR headset. The Rift contains a suite of sensors to ensure accurate tracking of the orientation of the user's head at all times [7]. The Rift provides high resolution (960 x 1080 pixels per eye) stereoscopic images with a 100° field of view (FOV). While applications of the Rift are diverse, ranging from teleoperation [8] to historical reconstruction [9], to the best of the authors' knowledge it has not yet been used for CAD visualisation. There have, however, been research studies in the medical imaging field which have shown promising results with these technologies [10, 11]. The success of these studies, combined with the authors' own experience with the Oculus Rift, contributed to the decision to use the Oculus Rift in the development of the CAD Interaction Lab (CIL) introduced in this paper.

Aside from visual interaction, another important aspect of Human Computer Interaction (HCI) relevant to both CAD and VR in general is the method by which the designer can interact with the CAD model to manipulate and inspect it. Given the 3D nature of CAD models and 3D visual imagery, an appropriate method for this interaction is required. Common methods include the use of 'wands' [1], which provide six degrees of freedom, or data gloves [5], which track hand movements. Currently, however, both of these methods require complex and expensive tracking systems comprising several cameras and passive markers. Research has also investigated the use of consumer gaming hardware, such as the Microsoft Kinect™ [2]. While this approach provides full body tracking, it is limited in detecting fine movement and is more suited to gesture-based controls. The Leap Motion Controller (LMC), by contrast, is consumer hardware designed to track hand movement inside a small workspace and is capable of fine movement tracking. The LMC employs two IR (infra-red) cameras, three IR emitters, and software to identify and determine the position of a person's hand [3]. At this time there are only a few works using the LMC [10, 12-14]. One such work evaluated the use of the LMC as a 2D pointing device and, based on comparison with a standard mouse, found that a mouse was up to three times more accurate for pointing than the LMC [12]. In contrast, the accuracy of the tracker itself was evaluated in [14] and found to be highly accurate. This suggests that while the LMC may be inaccurate for interaction with a 2D display, its high accuracy may lead to high effectiveness in a 3D environment, as shown in [13] and [10]. Due to the LMC's effectiveness in 3D environments, it has been chosen as the primary interaction method in the CIL.

This paper presents the design and implementation of the CIL, which was developed to provide intuitive 3D CAD visualisation using consumer hardware and novel interaction techniques, and which can be easily used by designers of all ages and skill levels. Application of the CIL will be discussed via presentation of the system architecture and interaction design.

2. System Architecture

Given its ease of use and familiarity to the research community, the Unity game engine [15] was employed in the work presented. The system comprises two primary components: visualisation using the Oculus Rift, which handles the viewing of the environment, and hand tracking using the LMC, which provides interaction within the 3D space. The system's architecture is shown in Fig. 1.

HAND TRACKING

Fig. 1. System Architecture Diagram

The CIL requires a reasonably powerful computer for smooth operation. For this reason, a Gigabyte P35 gaming laptop was used. The laptop contained a 2.5 GHz Intel i7 CPU, a GTX 980M graphics card, and 16 GB RAM. This setup allowed the CIL to run at 60-75 frames per second, providing smooth camera movement when looking around the scene.

2.1. Visualisation

To provide a 3D experience suited to exploration of CAD models, the CIL is situated in a laboratory environment. The environment was also chosen to support future experiments.

After importing the laboratory environment into Unity, the position of the CAD model in the scene needed to be determined. A horizontal surface in a reasonably central location was deemed appropriate. To designate the area of interaction, the workspace material was altered to show a checked pattern. In addition, the CAD model and menu system floated above this surface, with the camera positioned in front, facing the model.

In order to place users 'inside' the virtual environment, the Oculus Rift provided a visualisation of two in-environment cameras that work together with the lenses to provide stereoscopic vision. Fig. 2 shows the 3D representation of the user's hands interacting with the CAD model. As further shown in Fig. 2, the Oculus Rift introduced a warping effect that was corrected in software.

Fig. 2. Rendering of the CIL for the Oculus Rift

2.2. Hand Tracking

Given the natural ability of humans to use their hands to interact with the physical environment, hand interaction was seen as an accurate and intuitive method for interaction in the CIL. The LMC provides an effective method for bringing a designer's hands into a virtual environment, providing highly accurate tracking in a conical space above the device extending upwards by 30-40 cm and approximately 45 cm across. The optimal interaction height is 20 cm, and the device works best with palms facing down.

Leap Motion provides an asset package for Unity which allows simple integration of their controller. In the scene, the position and orientation of this asset need to match the setup of the real-world LMC. Due to the optimal interaction height, the device in the CIL must be placed some distance below the surface. It is also necessary to scale up the size of the hand, and by extension the sensitivity of the tracker, in order to provide a suitable interaction area inside the environment. Initial testing found that a scale of 1.5 times the regular tracking gave a larger tracking area without making the hand too large, and without the virtual hand moving too fast compared to the real hand.
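As a concrete illustration of this offset-and-scale mapping, the sketch below (Python pseudocode, not the authors' Unity code; the function name and the exact offset placement are assumptions based on the description above) shows how a real-world LMC hand position could be shifted so the optimal 20 cm interaction height sits at the virtual surface, then scaled by the 1.5 factor:

```python
# Illustrative sketch only: mapping a real LMC hand position (metres,
# device origin) into scene coordinates. All names are hypothetical.

SCALE = 1.5             # enlarges the tracking area (and the hand model)
DEVICE_OFFSET_Y = -0.2  # device assumed mounted ~20 cm below the surface

def to_virtual(real_pos):
    """Map a real hand position to virtual scene coordinates."""
    x, y, z = real_pos
    # Shift so the optimal 20 cm interaction height lands at the surface
    # (y = 0), then scale all axes to widen the usable interaction area.
    return (x * SCALE, (y + DEVICE_OFFSET_Y) * SCALE, z * SCALE)

# A hand held 20 cm above the device maps onto the virtual surface:
print(to_virtual((0.0, 0.2, 0.0)))  # (0.0, 0.0, 0.0)
```

The trade-off noted above falls out of the single scale constant: a larger value widens the reachable area but makes the virtual hand move faster relative to the real one.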

3. Interaction

Allowing users to interact with CAD visualisations is a core feature of the CIL. By leveraging the power of the LMC, users are able to use their own hands to manipulate a CAD model, allowing them to move it around the interaction area and visualise it from all angles.

The CAD model was broken up into twelve major components, as shown in Fig. 3a and 3b. Each of these components could be manipulated by the user with a two-fingered pinch gesture. Upon pinching the fingers together, the software performs a test to determine whether the pinch position is within the bounding box of any of the components. If it is, the user has 'picked up' the component and can drag it anywhere in the interaction area. Both translation and rotation of objects are supported. Users can then release objects by opening their fingers.
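The pick-up logic described above can be sketched as follows (an illustrative Python sketch, not the authors' Unity implementation; the component names and sizes are hypothetical). On a pinch, the pinch position is tested against each component's axis-aligned bounding box:

```python
# Illustrative sketch of pinch-to-grab hit testing against component
# bounding boxes. Component names and extents are hypothetical.

from dataclasses import dataclass

@dataclass
class Component:
    name: str
    pos: tuple           # centre of the component in scene coordinates
    half_extents: tuple  # half-size of its axis-aligned bounding box

def hit_test(pinch_pos, components):
    """Return the first component whose bounding box contains pinch_pos."""
    for c in components:
        # Inside the box iff the pinch point is within the half-extent
        # of the centre on every axis.
        if all(abs(p - cp) <= h
               for p, cp, h in zip(pinch_pos, c.pos, c.half_extents)):
            return c
    return None  # pinch in empty space: nothing picked up

parts = [Component("gear",  (0.0, 1.0, 0.0), (0.1,  0.1, 0.1)),
         Component("shaft", (0.5, 1.0, 0.0), (0.05, 0.3, 0.05))]

grabbed = hit_test((0.05, 1.05, 0.0), parts)
print(grabbed.name)  # gear
```

While a component is held, its transform would simply follow the tracked pinch position each frame until the fingers open.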

Fig. 3. (a) CAD Model components in scene hierarchy; (b) Diagram of all components (in no particular order)

Due to the limited interaction area, a trade-off between realism and usability had to be made, with usability being the primary focus. First, physics was disabled on all components of the CAD model, leaving them as static objects when not being held. With physics enabled, pieces would tend to float outside of the tracking area, leaving the user with no way to pick them up again. With gravity, objects would fall to the bottom of the tracking area, also making them difficult to pick up. Collisions were also removed, as some objects take up a considerable percentage of the interaction area and would bump other objects outside of the area when moved.

The menu system of the CIL consisted of three buttons that performed different functions: Explode, Reset and Credits. These buttons could be activated by touching them with the virtual hand. 'Explode' would set each component of the CAD model to a pre-defined position, defined in the 'ExplodePositions' object shown in Fig. 3(a). This function provides a simple method of opening up the CAD model and demonstrating the individual interactive components. Pressing the 'Reset' button returns all objects to their starting co-ordinates. The 'Credits' button simply displays the names of those responsible for the CIL's development.
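The Explode and Reset behaviour can be sketched as follows (illustrative Python, not the authors' Unity code; the part names and coordinates are hypothetical): each component carries both a starting position and a pre-defined explode position, and the two buttons simply switch every component between the two sets.

```python
# Illustrative sketch of the Explode/Reset button logic. The part
# names and coordinates below are invented for the example.

start_positions   = {"gear": (0.0, 1.0, 0.0),  "shaft": (0.5, 1.0, 0.0)}
explode_positions = {"gear": (-0.4, 1.2, 0.0), "shaft": (0.9, 1.2, 0.0)}

positions = dict(start_positions)  # current position of each component

def explode():
    # Move every component to its pre-defined exploded position.
    positions.update(explode_positions)

def reset():
    # Return every component to its starting co-ordinates.
    positions.update(start_positions)

explode()
print(positions["gear"])  # (-0.4, 1.2, 0.0)
reset()
print(positions["gear"])  # (0.0, 1.0, 0.0)
```

Keeping both position sets as plain per-component data makes the buttons stateless: either press always produces the same layout regardless of where the user has dragged the parts.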

4. Results

The CIL was demonstrated at the 2015 Avalon Air Show. The setup comprised an Oculus Rift and a table-mounted LMC, connected to a laptop underneath the demonstration area. Fig. 4 shows the CIL in use, with a visitor plucking/picking the virtual CAD components.

Fig. 4. CAD Interaction Laboratory demonstration

The demonstration of the CIL ran over six consecutive days during which we estimate that five to ten thousand visitors interacted with the display. Results from this demonstration consist of qualitative data stemming from the researchers' observations of the CIL visitors.

The CIL received overwhelmingly positive comments from visitors, with the vast majority learning how to interact with the system quickly. An average training time of 15-20 seconds was required to fully learn CIL interaction. Due to the high volume of visitors, an individual's total use time of the Lab was usually limited to one minute.

While manipulation of the CAD model was designed to be performed with a pinch gesture, it was found that the most attempted gesture was a full-fisted grab. Upon instruction in the correct technique, described as a 'two-fingered pinch with the thumb and forefinger', visitors were very quickly able to correctly manipulate the model. Skill in interacting with the virtual environment appeared constant between age groups, with children as young as 7 or 8 being just as capable as adults at manipulating the model.

Some issues were encountered with user interaction stemming from the limited tracking area of the LMC. After mounting the Oculus Rift to the head of the user, visitors would frequently place their hands too high or out to the side of the tracking area due to their inhibited vision of the real world. It quickly became common practice for an assistant to either guide a visitor's hand to the centre of the tracking area, or to instruct them to place their hand in the correct location before wearing the Oculus Rift.

The vast majority of visitors did not interact with the menu system, which has been attributed to its location in the environment: the menu was situated on the periphery of the Oculus Rift's viewing angle when looking directly at the CAD model. After being told of the menu's presence, visitors would often attempt to press the buttons. In this case it was commonplace for a one-fingered press gesture to be used, similar to a tap gesture on a smartphone.

5. Conclusion

This paper discussed the development and qualitative evaluation of an interactive virtual reality environment developed for the purpose of giving users a simple and natural method for visualising a CAD model. In the CIL, the Oculus Rift and LMC provided the interface to the virtual world. The significant volume of positive user feedback received from demonstrating the CIL suggests that virtual reality can be a highly engaging medium for visualisation in CAD. With the incorporation of hand tracking and simple gestures for interaction, visualisation and evaluation of CAD becomes accessible to designers of all ages and skill levels.

Future work could apply this approach directly to education, particularly in fields where maintenance and assembly of large and/or expensive objects is required. This method of object manipulation could also influence fields such as data visualisation and medical imaging.

6. Acknowledgements

The authors would like to acknowledge and thank Deakin University's Centre for Advanced Design in Engineering Training (CADET) and School of Engineering for financially supporting this project.

7. Notes

The source code for the CIL is downloadable here: https://github.com/Zaeran/CAD-Demo

References

1. Alvarez, J. and H.-J. Su, VRMDS: an intuitive virtual environment for supporting the conceptual design of mechanisms. Virtual Reality, 2012. 16(1): p. 57-68.

2. Dave, D., A. Chowriappa, and T. Kesavadas, Gesture Interface for 3D CAD Modeling using Kinect. Computer-Aided Design & Applications (Computer-Aided Design & Applications), 2013. 10(4): p. 663-669.

3. Falcao, C. and M. Soares, Application of Virtual Reality Technologies in Consumer Product Usability, in Design, User Experience, and Usability. Web, Mobile, and Product Design, A. Marcus, Editor. 2013, Springer Berlin Heidelberg. p. 342-351.

4. Girbacia, F., et al., Visual Depth Perception of 3D CAD Models in Desktop and Immersive Virtual Environments. International Journal of Computers, Communications & Control, 2012. 7(5): p. 840-848.

5. Mahdjoub, M., et al., A collaborative Design for Usability approach supported by Virtual Reality and a Multi-Agent System embedded in a PLM environment. Computer-Aided Design, 2010. 42(5): p. 402-413.

6. Duarte Filho, N., et al., An immersive and collaborative visualization system for digital manufacturing. International Journal of Advanced Manufacturing Technology, 2010. 50(9-12): p. 1253-1261.

7. LaValle, S.M., et al., Head tracking for the Oculus Rift. 2014 IEEE International Conference on Robotics & Automation (ICRA), 2014: p. 187.

8. Kot, T. and P. Novak, Utilization of the Oculus Rift HMD in Mobile Robot Teleoperation. Applied Mechanics & Materials, 2014(555): p. 199-208.

9. Webel, S., et al. Immersive experience of current and ancient reconstructed cultural attractions. in Digital Heritage International Congress (DigitalHeritage), 2013. 2013.

10. Karolczak, K. and A. Klepaczko, A stereoscopic viewer of the results of vessel segmentation in 3D magnetic resonance angiography images.

11. Livatino, S., et al., Stereoscopic Visualization and 3-D Technologies in Medical Endoscopic Teleoperation. IEEE Transactions on Industrial Electronics, 2015. 62(1): p. 525-535.

12. Bachmann, D., F. Weichert, and G. Rinkenauer, Evaluation of the Leap Motion Controller as a New Contact-Free Pointing Device. Sensors, 2015. 15(1): p. 214-233.

13. Bizzotto, N., et al., Leap motion gesture control with OsiriX in the operating room to control imaging: first experiences during live surgery. Surgical Innovation, 2014. 21(6): p. 655-656.

14. Guna, J., et al., An Analysis of the Precision and Reliability of the Leap Motion Sensor and Its Suitability for Static and Dynamic Tracking. Sensors, 2014. 14(2): p. 3702-3720.

15. Unity. Unity - Game Engine. Available from: http://unity3d.com.