Multimodal Interaction Recognition Mechanism by Using Midas Featured By Data-Level and Decision-Level Fusion

  • Muhammad Habib, Lahore Garrison University
  • Noor ul Qamar, Lahore Garrison University
Keywords: degradation-based sensor data, multimodal interaction patterns, MIML (Multimodal Interaction Markup Language), temporal combinations

Abstract

Natural User Interfaces (NUIs) based on gestures are an alternative to traditional input devices on multi-touch panels. The rapid growth of sensor technology has increased the use of multiple sensors to deal with various monitoring and compatibility issues of machines. Research on data-level fusion models requires more focus on the fusion of multiple streams of degradation-based sensor data. Midas, a novel declarative language for expressing multimodal interaction patterns, lets developers describe the patterns they require through a multimodal interaction mechanism. As a base interface, the language minimizes complexity issues such as inversion of control and intermediary states by providing high-level programming abstractions for data fusion, data processing, and data selection.
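The two fusion levels named in the title can be sketched generically as follows. This is an illustrative sketch only, not Midas or MIML syntax; the function names and the simple agreement rule are hypothetical choices for the example:

```python
# Hypothetical sketch contrasting data-level and decision-level fusion
# of two sensor modalities (not actual Midas/MIML syntax).

def data_level_fusion(touch_samples, motion_samples):
    """Fuse raw sensor streams before recognition: concatenate each
    time-aligned pair of readings into one feature vector."""
    return [t + m for t, m in zip(touch_samples, motion_samples)]

def decision_level_fusion(touch_decision, motion_decision):
    """Fuse after each modality has been classified independently:
    here, a simple agreement rule over per-modality labels."""
    if touch_decision == motion_decision:
        return touch_decision
    return "uncertain"

# Data level: raw (x, y) touch readings fused with accelerometer readings.
touch = [[0.1, 0.2], [0.3, 0.4]]
motion = [[9.8], [9.7]]
fused = data_level_fusion(touch, motion)
# fused == [[0.1, 0.2, 9.8], [0.3, 0.4, 9.7]]

# Decision level: each modality's recognizer votes, then votes are combined.
print(decision_level_fusion("swipe", "swipe"))  # prints "swipe"
print(decision_level_fusion("swipe", "tap"))    # prints "uncertain"
```

The design point the paper's abstract alludes to is that a declarative pattern language can hide this plumbing: the developer states which modalities combine and at which level, rather than wiring recognizers together by hand.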

Published
2017-06-30
How to Cite
Muhammad Habib, & Noor ul Qamar. (2017). Multimodal Interaction Recognition Mechanism by Using Midas Featured By Data-Level and Decision-Level Fusion. Lahore Garrison University Research Journal of Computer Science and Information Technology, 1(2), 41-51. https://doi.org/10.54692/lgurjcsit.2017.010227
Section
Articles