3DGAM: using 3D gesture and CAD models for training on mixed reality remote collaboration



Peng Wang 1 · Xiaoliang Bai 1 · Mark Billinghurst 1,2 · Shusheng Zhang 1 · Sili Wei 1 · Guangyao Xu 1 · Weiping He 1 · Xiangyu Zhang · Jie Zhang 1

Received: 25 March 2020 / Revised: 15 July 2020 / Accepted: 25 August 2020
© Springer Science+Business Media, LLC, part of Springer Nature 2020

Abstract

As Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) technologies become more accessible, it is important to explore how they can be used for remote collaboration on physical tasks. Previous research has shown that gesture-based interaction is intuitive and expressive for remote collaboration, and that using 3D CAD models can provide clear instructions for assembly tasks. In this paper, therefore, we describe a new MR remote collaboration system that combines the use of gestures and CAD models in a complementary manner. The prototype system enables a remote expert in VR to provide instructions based on 3D gestures and CAD models (3DGAM) to a local worker who uses AR to see these instructions. Using this interface, we conducted a formal user study to explore the effect of sharing 3D gestures and CAD models in an assembly training task. We found that the combination of 3D gestures and CAD models can improve remote collaboration on an assembly task with respect to performance time and user experience. Finally, we provide some conclusions and directions for future research.

Keywords: Augmented reality · Mixed reality · Remote collaboration · Physical tasks · Sharing gesture · 3D CAD models

Peng Wang, Xiaoliang Bai and Mark Billinghurst contributed equally to this work.

Electronic supplementary material: The online version of this article (https://doi.org/10.1007/s11042-020-09731-7) contains supplementary material, which is available to authorized users.

* Peng Wang [email protected] * Xiaoliang Bai [email protected] * Mark Billinghurst [email protected]

1 Cyber-Physical Interaction Lab, Northwestern Polytechnical University, Xi'an, China

2 Empathic Computing Lab, University of South Australia, Mawson Lakes, Australia

Multimedia Tools and Applications

1 Introduction

In this paper we describe a novel system that uses Virtual Reality (VR) and Augmented Reality (AR) to provide remote training on a real-world assembly task. In many industrial scenarios, such as assembly/disassembly, training, and maintenance of mechanical equipment, a remote expert can provide real-time assistance to a local worker performing a physical task [4, 9, 21, 30]. With the increasing power of VR and AR Head-Mounted Displays (HMDs), such as the HTC Vive and HoloLens, these devices can provide improved VR/AR/Mixed Reality (MR) experiences for remote collaboration. Many industries hope that workers can be trained effectively by leveraging the advantages and capabilities of both VR and AR/MR to improve performance and user experience [2, 8, 32]. There has been significant previous research focused on sharing non-verbal communication c