Abstract
Interaction in augmented reality (AR) environments can be complex, depending on the degrees of freedom (DOFs) required by the task. In this work we present a 3D user interface for collaborative manipulation of virtual objects in AR environments. It maps the position of a handheld device, acquired with its camera and fiducial markers, together with its touchscreen input, into gestures to select, move, rotate, and scale virtual objects. Because these transformations require the control of multiple DOFs, collaboration is proposed as a way to coordinate the modification of all the available DOFs. Users are free to decide their own manipulation roles. All virtual elements are displayed directly on the mobile device as an overlay of the camera capture, giving each user an individual point of view of the AR environment.
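The abstract describes mapping device position (from a camera and fiducial markers) and touchscreen gestures onto the translation, rotation, and scale DOFs of a selected virtual object. A minimal Python/NumPy sketch of such a gesture-to-DOF mapping is given below; the function name `apply_touch_gesture`, the gesture parameters, and the pixel-to-world factor are illustrative assumptions, not the interface implemented in the paper.

```python
# Illustrative sketch of a gesture-to-DOF mapping; names and constants
# are assumptions for demonstration, not the authors' implementation.
import numpy as np


def apply_touch_gesture(model, drag_px=None, pinch_ratio=None, twist_rad=None,
                        px_to_world=0.001):
    """Update the 4x4 model matrix of the selected virtual object.

    drag_px:     (dx, dy) one-finger drag in pixels, mapped to translation
                 in the camera's image plane.
    pinch_ratio: two-finger pinch factor, mapped to uniform scale.
    twist_rad:   two-finger twist angle, mapped to rotation about the view axis.
    """
    m = model.copy()
    if drag_px is not None:
        dx, dy = drag_px
        # Screen-space drag becomes a translation in the view plane.
        m[:3, 3] += np.array([dx, -dy, 0.0]) * px_to_world
    if pinch_ratio is not None:
        # Uniform scale folded into the upper-left 3x3 block.
        m[:3, :3] *= pinch_ratio
    if twist_rad is not None:
        # Rotation about the camera's view (z) axis.
        c, s = np.cos(twist_rad), np.sin(twist_rad)
        rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        m[:3, :3] = rz @ m[:3, :3]
    return m


# Example: drag 40 px to the right, pinch to 1.2x, twist 15 degrees.
pose = np.eye(4)
pose = apply_touch_gesture(pose, drag_px=(40, 0), pinch_ratio=1.2,
                           twist_rad=np.radians(15))
```

In a collaborative setting such as the one the abstract outlines, different users could invoke different parameters of a mapping like this one (e.g. one user dragging, another pinching), each seeing the result from their own camera viewpoint.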
| Original language | English |
| --- | --- |
| Title | 2017 IEEE Symposium on 3D User Interfaces, 3DUI 2017 - Proceedings |
| Number of pages | 2 |
| Publisher | IEEE Signal Processing Society |
| Publication date | 5 Apr 2017 |
| Pages | 264-265 |
| Article number | 7893373 |
| ISBN (Electronic) | 9781509067169 |
| DOI | |
| Status | Published - 5 Apr 2017 |
| Event | 2017 IEEE Symposium on 3D User Interfaces, 3DUI 2017 - Los Angeles, USA. Duration: 18 Mar 2017 → 19 Mar 2017 |
Conference
| Conference | 2017 IEEE Symposium on 3D User Interfaces, 3DUI 2017 |
| --- | --- |
| Country/Territory | USA |
| City | Los Angeles |
| Period | 18/03/2017 → 19/03/2017 |
| Sponsor | IEEE Computer Society, IEEE Visualization and Graphics Technical Committee (VGTC) |
| Name | 2017 IEEE Symposium on 3D User Interfaces, 3DUI 2017 - Proceedings |
| --- | --- |
Bibliographical note
Publisher Copyright: © 2017 IEEE.