Recognizing multi-modal sensor signals using evolutionary learning of dynamic Bayesian networks

Young Seol Lee, Sung Bae Cho

Research output: Contribution to journal › Article › peer-review


Multi-modal context-aware systems can provide user-adaptive services, but they require complicated recognition models and large computational resources. The difficulty of building optimal models and inferring context efficiently makes it hard to develop practical context-aware systems. We developed a multi-modal context-aware system with various wearable sensors, including accelerometers, gyroscopes, physiological sensors, and data gloves. The system used probabilistic models to handle the uncertain and noisy time-series sensor data. To construct efficient probabilistic models, this paper uses an evolutionary algorithm to learn the model structure and the EM algorithm to determine the parameters. The trained models are selectively inferred based on a semantic network that describes the semantic relations of the contexts and sensors. Experiments with real collected data show the usefulness of the proposed method.
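The abstract's core idea, evolving a dynamic Bayesian network structure with an evolutionary algorithm, can be sketched in miniature. The snippet below is an illustrative toy, not the paper's implementation: it evolves a binary inter-slice adjacency matrix with elitist selection and bit-flip mutation. The fitness function here (agreement with a hypothetical target structure minus a sparsity penalty) is a stand-in for the data-driven score the paper would obtain by fitting parameters with EM; all names and constants are assumptions.

```python
import random

random.seed(0)

N = 4  # number of variables in one slice of the toy DBN
# Hypothetical "ideal" inter-slice adjacency, standing in for a
# likelihood-based score computed from data via EM in the real system.
TARGET = [[1 if (i + j) % 2 == 0 else 0 for j in range(N)] for i in range(N)]

def fitness(adj):
    # Reward agreement with the target structure, penalize extra edges
    # (a crude proxy for a penalized log-likelihood such as BIC).
    match = sum(adj[i][j] == TARGET[i][j] for i in range(N) for j in range(N))
    edges = sum(sum(row) for row in adj)
    return match - 0.1 * edges

def random_adj():
    return [[random.randint(0, 1) for _ in range(N)] for _ in range(N)]

def mutate(adj, rate=0.1):
    # Flip each edge independently with probability `rate`.
    return [[1 - adj[i][j] if random.random() < rate else adj[i][j]
             for j in range(N)] for i in range(N)]

def evolve(pop_size=20, generations=50):
    pop = [random_adj() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # keep the better half
        pop = elite + [mutate(random.choice(elite))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

best = evolve()
```

In the paper's setting the candidate structures would be scored against the wearable-sensor time series, with EM estimating conditional probability tables for each candidate before evaluation.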

Original language: English
Pages (from-to): 695-707
Number of pages: 13
Journal: Pattern Analysis and Applications
Issue number: 4
Publication status: Published - 2014 Oct 16

Bibliographical note

Funding Information:
The authors would like to thank Dr. J.-K. Min, Mr. S.-I. Yang and Dr. J.-H. Hong for their help in implementing the system used in this paper. This research was supported by the Ministry of Culture, Sports and Tourism (MCST) and the Korea Creative Content Agency (KOCCA) in the Culture Technology (CT) Research & Development Program.

Publisher Copyright:
© 2012, Springer-Verlag London.

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
  • Artificial Intelligence

