Multimodal Desktop Interaction: The Face–Object–Gesture–Voice Example

Nikolas Vidakis, Anastasios Vlasopoulos, Tsampikos Kounalakis, Petros Varchalamas, Michalis Dimitriou, Gregory Kalliatakis, Efthimios Syntychakis, John Christofakis, Georgios Triantafyllidis

Publication: Contribution to book/anthology/report/conference proceeding › Conference article in proceedings › Research › peer review

3 Citations (Scopus)
676 Downloads (Pure)

Abstract

This paper presents a natural user interface system based on multimodal human-computer interaction, which operates as an intermediate module between the user and the operating system. The aim of this work is to demonstrate a multimodal system that lets users interact with desktop applications through face, objects, voice, and gestures. These human behaviors constitute the input qualifiers to the system. The Microsoft Kinect multi-sensor was used as the input device to achieve natural user interaction, mainly because of the multimodal capabilities it offers. We demonstrate scenarios that cover all the functions and capabilities of our system from the perspective of natural user interaction.
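
The abstract describes an intermediate module that maps multimodal input qualifiers (face, object, gesture, voice) to desktop actions. Below is a minimal sketch of such a dispatcher under stated assumptions: the modality labels, the action mapping, and the example events are illustrative and are not taken from the paper or from the Kinect SDK.

# Minimal multimodal-to-desktop dispatcher (illustrative sketch; modality
# labels, actions, and example events are assumptions, not the authors' code).
from dataclasses import dataclass
from typing import Callable, Dict, Iterable, Tuple

@dataclass
class ModalEvent:
    modality: str   # "face", "object", "gesture", or "voice"
    label: str      # recognized qualifier, e.g. "swipe_left" or "open"

# Hypothetical mapping from (modality, label) pairs to desktop actions.
ACTIONS: Dict[Tuple[str, str], Callable[[], None]] = {
    ("gesture", "swipe_left"): lambda: print("switch to previous window"),
    ("voice", "open"):         lambda: print("launch application"),
    ("face", "user_known"):    lambda: print("unlock session"),
}

def dispatch(events: Iterable[ModalEvent]) -> None:
    # Route each recognized multimodal event to its desktop action, if any.
    for event in events:
        action = ACTIONS.get((event.modality, event.label))
        if action is not None:
            action()

# Example: events as they might arrive from a Kinect-based recognizer.
dispatch([ModalEvent("face", "user_known"),
          ModalEvent("voice", "open"),
          ModalEvent("gesture", "swipe_left")])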
Original language: English
Title: 18th International Conference on Digital Signal Processing (DSP)
Editors: Athanasios Skodras
Number of pages: 8
Publisher: Wiley-IEEE Press
Publication date: 2013
ISBN (Print): 978-1-4673-5807-1
DOI
Status: Published - 2013
Event: International Conference on Digital Signal Processing - Fira, Greece
Duration: 1 Jul 2013 – 3 Jul 2013
Conference number: 18
http://dsp2013.dspconferences.org/

Conference

Conference: International Conference on Digital Signal Processing
Number: 18
Country: Greece
City: Fira
Period: 01/07/2013 – 03/07/2013
Internet address
Name: International Conference on Digital Signal Processing proceedings
ISSN: 1546-1874

