Prototyping of Multimodal Interactions for Smart Environments based on Task Models

Authors: Sebastian Feuerstack, Marco Blumendorf, Sahin Albayrak
Source: European Conference on Ambient Intelligence: Workshop on Model Driven Software Engineering for Ambient Intelligence Applications, Darmstadt, Germany

Smart environments offer interconnected sensors, devices, and appliances that can be used for interaction, substantially extending the available modality mix. This promises more natural and situation-aware human-computer interaction. However, technical challenges and differing interaction principles across modalities restrict multimodal systems to specialized solutions that support only specific situations. To overcome these limitations and ease the integration of new modalities into smart environments, we propose a task-based notation that can be interpreted at runtime. The notation supports evolutionary prototyping of new interaction styles for existing interactive systems. It eliminates the gap between design time and runtime, since support for additional modalities can be prototyped at runtime and added to an already running interactive system.