Augmented Reality Views for Occluded Interaction

Klemen Lilija, Henning Pohl, Sebastian Boring, Kasper Anders Søren Hornbæk

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceedings › Research › peer-review

9 Citations (Scopus)


We rely on our sight when manipulating objects. When objects are occluded, manipulation becomes difficult. Such occluded objects can be shown via augmented reality to re-enable visual guidance. However, it is unclear how to do so to best support object manipulation. We compare four views of occluded objects and their effect on performance and satisfaction across a set of everyday manipulation tasks of varying complexity. The best performing views were a see-through view and a displaced 3D view. The former enabled participants to observe the manipulated object through the occluder, while the latter showed the 3D view of the manipulated object offset from the object's real location. The worst performing view showed remote imagery from a simulated hand-mounted camera. Our results suggest that alignment of virtual objects with their real-world location is less important than an appropriate point-of-view and view stability.
Original language: English
Title of host publication: CHI 2019 - Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems
Number of pages: 12
Publication date: 6 May 2019
Article number: 446
ISBN (Print): 9781450359702
ISBN (Electronic): 9781450359702
Publication status: Published - 6 May 2019
Externally published: Yes
Event: Conference on Human Factors in Computing Systems - Glasgow, Scotland, United Kingdom
Duration: 4 May 2019 - 9 May 2019




  • Augmented reality
  • Finger-camera
  • Manipulation task


