A Multimodal Interaction Framework for Blended Learning

Nikolaos Vidakis, Konstantinos Kalafatis, Georgios Triantafyllidis

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review

Abstract

Humans interact with each other by utilizing the five basic senses as input modalities, whereas sounds, gestures, facial expressions, etc. are utilized as output modalities. Multimodal interaction is also used between humans and their surrounding environment, although enhanced with further senses such as equilibrioception (the sense of balance). Computer interfaces, which can be regarded as yet another environment that humans interact with, lack this amalgamation of input and output and therefore fall short of natural interaction. Multimodal human-computer interaction has sought to provide alternative means of communicating with an application that are more natural than the traditional “windows, icons, menus, pointer” (WIMP) style. Despite the great number of devices in existence, most applications make use of a very limited set of modalities, most notably speech and touch. This paper describes a multimodal framework enabling the deployment of a wide variety of modalities, tailored appropriately for use in a blended learning environment.
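
This record contains no code from the framework itself. Purely as an illustrative sketch of the idea the abstract describes (routing events from many input modalities through a single dispatching layer, so that new modalities can be added without rewriting the application), the fragment below registers handlers for hypothetical speech and gesture events and forwards incoming events to them. All names here (ModalityEvent, MultimodalDispatcher, register, dispatch) are assumptions made for this example, not the authors' API.

```python
# Illustrative sketch only: hypothetical names, not the framework described in the paper.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class ModalityEvent:
    """A single input event produced by one modality (e.g. speech, touch, gesture)."""
    modality: str   # e.g. "speech", "touch", "gesture"
    payload: dict   # modality-specific data (recognized text, coordinates, ...)


class MultimodalDispatcher:
    """Routes events from any registered input modality to application-level handlers."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[ModalityEvent], None]]] = {}

    def register(self, modality: str, handler: Callable[[ModalityEvent], None]) -> None:
        """Attach a handler to a modality; new modalities are added without changing the app."""
        self._handlers.setdefault(modality, []).append(handler)

    def dispatch(self, event: ModalityEvent) -> None:
        """Forward an incoming event to every handler registered for its modality."""
        for handler in self._handlers.get(event.modality, []):
            handler(event)


if __name__ == "__main__":
    dispatcher = MultimodalDispatcher()
    dispatcher.register("speech", lambda e: print("speech command:", e.payload["text"]))
    dispatcher.register("gesture", lambda e: print("gesture detected:", e.payload["name"]))

    # Simulated events from two different input modalities.
    dispatcher.dispatch(ModalityEvent("speech", {"text": "open exercise 3"}))
    dispatcher.dispatch(ModalityEvent("gesture", {"name": "swipe-left"}))
```

The sketch uses a simple observer-style registration; the framework presented in the paper may organize modality handling and fusion quite differently.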
Original language: English
Title of host publication: Interactivity, Game Creation, Design, Learning, and Innovation: 5th International Conference, ArtsIT 2016, and First International Conference, DLI 2016, Esbjerg, Denmark, May 2-3, 2016, Proceedings
Editors: Anthony L. Brooks, Eva Brooks
Publisher: Springer
Publication date: 2017
Pages: 205-211
ISBN (Print): 978-3-319-55833-2
ISBN (Electronic): 978-3-319-55834-9
DOI: 10.1007/978-3-319-55834-9_24
Publication status: Published - 2017
Event: 1st EAI International Conference on Design, Learning & Innovation - Esbjerg campus of Aalborg University, Esbjerg, Denmark
Duration: 2 May 2016 - 4 May 2016
http://designlearninginnovation.org/

Conference

Conference: 1st EAI International Conference on Design, Learning & Innovation
Location: Esbjerg campus of Aalborg University
Country: Denmark
City: Esbjerg
Period: 02/05/2016 - 04/05/2016
Internet address: http://designlearninginnovation.org/
Series: Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering, LNICST
Volume: 196
ISSN: 1867-8211

Fingerprint

Human computer interaction
Interfaces (computer)
Acoustic waves
Communication

Cite this

Vidakis, N., Kalafatis, K., & Triantafyllidis, G. (2017). A Multimodal Interaction Framework for Blended Learning. In A. L. Brooks & E. Brooks (Eds.), Interactivity, Game Creation, Design, Learning, and Innovation: 5th International Conference, ArtsIT 2016, and First International Conference, DLI 2016, Esbjerg, Denmark, May 2-3, 2016, Proceedings (pp. 205-211). Springer. (Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering, LNICST, Vol. 196). https://doi.org/10.1007/978-3-319-55834-9_24