Augmented assembly technologies based on 3D bare-hand interaction

https://doi.org/10.1016/j.cirp.2011.03.001

Abstract

Augmented reality (AR) has been applied to develop augmented assembly systems. However, most reported studies used pre-defined assembly information; AR is predominantly used to display information, and interaction between users and the augmented environment is limited. This paper presents 3D bare-hand interaction in an augmented assembly environment for manipulating and assembling virtual components. A hybrid method based on constraint analysis is presented, which interprets users’ manual assembly intents robustly without the need for auxiliary CAD information. Algorithms for assembly constraint recognition, assembly tool operations and assembly sequence evaluation have been formulated. An augmented assembly system has been developed in this research.

Introduction

In recent years, virtual reality (VR) and virtual prototyping (VP) techniques [1], [2] have been widely used to simulate and evaluate assembly in the early design stage. However, the assembly planning experience is limited in a pure virtual environment (VE) due to a lack of real spatial feeling and suitable sensory feedback. Considerable computational resources are also needed to simulate a complicated assembly process in a pure VE.

Augmented assembly (AA) is an application of augmented reality (AR) in assembly where an augmented environment (AE) is created, in which virtual objects are combined with the real environment to enhance the assembly design and planning process. An AE for product assembly design combines physical parts, real feedback and virtual contents to analyze the behaviors and properties of future products. Thus, the benefits of physical prototyping and VP can be combined. AR has been applied in manual assembly station planning [3], assembly guidance during product assembly [4], [5], augmenting digital virtual prototypes with physical products [6], dataglove-based virtual assembly [7] and replacing physical manuals with augmented virtual contents [8].

In many reported AA systems, assembly information, e.g., assembly features, predefined assembly constraints, etc., has to be extracted from CAD data, and this limits their impact as high preparation time is required for each application [9]. There is limited interaction between the users and the systems.

This research presents an AA system that interprets users’ manual assembly intents, supports on-line constraint recognition, and provides a robust 3D bare-hand interface to allow realistic visual feedback during assembly (Fig. 1). A method to realize interactive constraint-based AA using natural bare-hand interactions has been developed to increase interaction between users and virtual components, so that users participate actively in the AA process. A bare-hand interaction augmented assembly (BHAA) system has been developed (Fig. 2).

3D bare-hand interaction method

Many tasks in AA systems are manipulation tasks with many degrees-of-freedom (DOFs), leading to greater difficulties in manipulating the virtual objects using traditional interaction methods, e.g., a mouse. To achieve natural and intuitive human-computer interaction (HCI), human hands can be used as interaction devices in AEs. Compared with traditional HCI devices, bare-hand interaction is less intrusive and more convenient for the users to interact with virtual contents and explore the 3D AEs. Natural HCI …
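The snippet above does not specify how a bare-hand grasp is detected. As a minimal illustrative sketch (the function names, the pinch test and the 2 cm threshold are assumptions, not details from the paper), a grasp of a virtual part can be inferred from tracked fingertip positions:

```python
import numpy as np

PINCH_THRESHOLD = 0.02  # metres; illustrative value, not from the paper


def detect_pinch(thumb_tip, index_tip, threshold=PINCH_THRESHOLD):
    """Treat the hand as grasping when thumb and index fingertips are close enough."""
    return np.linalg.norm(np.asarray(thumb_tip, float) - np.asarray(index_tip, float)) < threshold


def grasp_point(thumb_tip, index_tip):
    """Midpoint between the two fingertips, used to drive the grasped virtual part."""
    return (np.asarray(thumb_tip, float) + np.asarray(index_tip, float)) / 2.0
```

In a loop over camera frames, `detect_pinch` would gate whether a nearby virtual component is attached to the hand, and `grasp_point` would supply its new position.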

Assembly data management

A tri-layer assembly data structure (TADS) is used for assembly data management in BHAA. The first layer consists of geometric information. In the BHAA system, the environment does not need to be prepared a priori. During initialization, CAD models of the assembly parts and components are loaded into the BHAA system. The surfaces of these CAD models are enumerated and discretized into triangle tessellations to reconstruct continuous geometric entities, which are stored for display using OpenGL …
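Only the geometric layer of TADS is described in this snippet. The sketch below shows one plausible way to organize such a tri-layer structure in code; the contents of the second and third layers (features and recognized constraints) are assumptions, not details from the paper:

```python
from dataclasses import dataclass, field


@dataclass
class GeometricLayer:
    """First TADS layer: triangle tessellations of the CAD surfaces, kept for OpenGL display."""
    vertices: list   # [(x, y, z), ...]
    triangles: list  # index triples into `vertices`


@dataclass
class Part:
    """One assembly part; the feature and constraint layers here are assumed, not from the paper."""
    name: str
    geometry: GeometricLayer
    features: list = field(default_factory=list)     # assumed second layer: surface/feature data
    constraints: list = field(default_factory=list)  # assumed third layer: recognized constraints
```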

Augmented assembly process

BHAA consists of an HCI interface and an AE, where virtual models are superimposed over the real scene. Users can perform assembly operations via the HCI interface with these virtual models in the AE. With the 3DNBHI interface, users can manipulate and assemble two different parts more intuitively and realistically. When these two parts are sufficiently close to each other, the user can adjust the positions and orientations of these parts easily and efficiently to trigger the assembly intent …
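The exact constraint-recognition rules triggered by this proximity are not given in the snippet. A minimal sketch of one such rule, detecting a coaxial (axis-alignment) constraint between two cylindrical features, could look as follows; the tolerance values are illustrative assumptions:

```python
import numpy as np

ANGLE_TOL = np.radians(10.0)  # illustrative tolerances, not from the paper
DIST_TOL = 0.01               # metres


def coaxial(axis_a, point_a, axis_b, point_b):
    """Detect a coaxial constraint between two cylindrical features.

    axis_*: unit direction vectors of the cylinder axes; point_*: a point on each axis.
    Returns True when the axes are nearly parallel and nearly collinear.
    """
    a = np.asarray(axis_a, float)
    b = np.asarray(axis_b, float)
    # Orientation-insensitive angle between the axes.
    angle = np.arccos(np.clip(abs(a @ b), -1.0, 1.0))
    # Perpendicular distance of point_b from axis a.
    d = np.asarray(point_b, float) - np.asarray(point_a, float)
    perp = d - (d @ a) * a
    return angle < ANGLE_TOL and np.linalg.norm(perp) < DIST_TOL
```

When such a test fires, the system can snap the grasped part onto the recognized constraint rather than requiring the user to align it exactly.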

Assembly sequence evaluation and feedback

Using bare hands in the BHAA system, the user can detect possible design faults easily. For example, when a part interferes with another part, this can be detected easily and feedback can be provided to product designers for design improvement. In traditional assembly planning, it is often hard to evaluate precisely the motions and complex manipulations needed. In the BHAA system, this can be achieved by assembling a set of virtual components using bare hands to identify the difficulties and …
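The interference test used by BHAA is not detailed in this snippet; a full system would presumably test the triangle tessellations themselves. As a cheap first-pass sketch, part interference can be screened with axis-aligned bounding boxes:

```python
def aabb_overlap(min_a, max_a, min_b, max_b):
    """Axis-aligned bounding-box test: a cheap first pass for part interference.

    Each argument is an (x, y, z) corner; boxes overlap iff their intervals
    overlap on every axis.
    """
    return all(lo_a <= hi_b and lo_b <= hi_a
               for lo_a, hi_a, lo_b, hi_b in zip(min_a, max_a, min_b, max_b))
```

Only part pairs whose boxes overlap would then be passed to an exact triangle-level collision check.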

Implementation and case study

The BHAA system works consistently at about 15 frames per second at a 512 × 384 frame resolution. The accuracy of the system is determined by the accuracy of the fingertip detection method, which has an RMS error of 1–2 mm in all axes.
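For completeness, the per-axis RMS error quoted above can be computed from detected and ground-truth fingertip positions as follows (a generic formula, not the paper's evaluation code):

```python
import math


def rms_error(estimates, ground_truth):
    """Per-axis root-mean-square error between detected and true 3D fingertip positions."""
    n = len(estimates)
    return [math.sqrt(sum((e[k] - g[k]) ** 2
                          for e, g in zip(estimates, ground_truth)) / n)
            for k in range(3)]
```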

The exploded view of the roller bracket case study is shown in Fig. 4. In Fig. 4(a), the user grasps the roller with his right hand and the left bush with his left hand and assembles these components. When the parts are colliding, the system analyzes the …

Conclusion and future work

A robust and efficient system integrating natural bare-hand interaction with an AR-assisted methodology for constraint-based augmented assembly is presented. A 3D dual-handed interaction interface is provided to facilitate AA. A contribution of this research is the robust feature-based automatic constraint recognition algorithm, which can interpret a user's assembly intent robustly without the need for any a priori auxiliary CAD information. Assembly tool operation algorithms …

