Augmented Reality Views for Occluded Interaction

Klemen Lilija, Henning Pohl, Sebastian Boring, Kasper Hornbæk

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review

Abstract

We rely on our sight when manipulating objects. When objects are occluded, manipulation becomes difficult. Such occluded objects can be shown via augmented reality to re-enable visual guidance. However, it is unclear how best to present them to support object manipulation. We compare four views of occluded objects and their effect on performance and satisfaction across a set of everyday manipulation tasks of varying complexity. The best performing views were a see-through view and a displaced 3D view. The former enabled participants to observe the manipulated object through the occluder, while the latter showed a 3D view of the manipulated object offset from the object's real location. The worst performing view showed remote imagery from a simulated hand-mounted camera. Our results suggest that alignment of virtual objects with their real-world location is less important than an appropriate point of view and view stability.
Original language: Undefined/Unknown
Title of host publication: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI '19
Place of publication: New York, New York, USA
Publisher: Association for Computing Machinery
Publication date: 2019
DOIs
Publication status: Published - 2019