Description
Social interaction takes place predominantly in a world that has spatial volume, yet planocentric 2D video (and stereo audio) can inhibit our efforts to record and document tangible volumetric and haptic phenomena. My presentation explores how immersive technologies can exploit, enhance, enfold, sublimate, splinter and bypass ‘the flat screen’ in video-based qualitative research. The presentation will be illustrated with examples of the use of virtual reality software tools, such as CAVA360VR and SQUIVE, designed to support Immersive Qualitative Analytics within the Big Video paradigm.

Period | 2 Jul 2020
---|---
Event title | Interaction Multimodales Par ECran
Event type | Conference
Location | Lyon, France
Degree of Recognition | International
Keywords
- video analysis
- Screening
- Big Video
- Immersive Qualitative Analytics
- Video ethnography
- EMCA
- Conversation Analysis
- Volumetric video
Related content

Projects
- Big Video (Project: Research)
- Staging Qualitative Immersive Virtualisation Engine (Project: Research)
- Staging & Inhabiting Mixed Video for Immersive Qualitative Analytics (Project: Research)
- Annotate, Visualise, Analyse 360 Video in Virtual Reality (Project: Research)
Publications
- Inhabiting Spatial Video and Audio Data: Towards a Scenographic Turn in the Analysis of Social Interaction (Journal article, peer-reviewed)
- The Future of ‘Video’ in Video-Based Qualitative Research Is Not ‘Dumb’ Flat Pixels! Exploring Volumetric Performance Capture and Immersive Performative Replay (Journal article, peer-reviewed)
- A Big Video Manifesto: Re-sensing Video and Audio (Journal article, peer-reviewed)