Activity recognition with Android phone using mixture-of-experts co-trained with labeled and unlabeled data

Young Seol Lee, Sung Bae Cho

Research output: Contribution to journal › Article › peer-review

64 Citations (Scopus)

Abstract

As the number of smartphone users has grown, many context-aware services have been studied and launched. Activity recognition has become one of the important issues for user-adaptive services on mobile phones. Even though many researchers have attempted to recognize a user's activities on a mobile device, it is still difficult to infer human activities from uncertain, incomplete, and insufficient mobile sensor data. We present a method to recognize a person's activities from the sensors in a mobile phone using a mixture-of-experts (ME) model. In order to train the ME model, we apply the global-local co-training (GLCT) algorithm with both labeled and unlabeled data to improve performance. GLCT is a variation of co-training that uses a global model and a local model together. To evaluate the usefulness of the proposed method, we conducted experiments using real datasets collected from Google Android smartphones. This paper is a revised and extended version of a paper presented at HAIS 2011.
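For readers who want the general shape of the training procedure, the following is a minimal sketch of a global-local co-training loop over labeled and unlabeled data, assuming NumPy arrays and scikit-learn-style classifiers. The particular models (an MLP standing in for the global model, a logistic regression standing in for the local mixture-of-experts), the confidence threshold, and the number of rounds are illustrative assumptions, not the configuration reported in the paper.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

def glct_cotrain(X_lab, y_lab, X_unl, rounds=5, threshold=0.9):
    """Hypothetical GLCT-style loop: a global model and a local model each
    pseudo-label the unlabeled samples they are confident about, and those
    samples are added to the *other* model's training pool."""
    global_clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)  # global stand-in
    local_clf = LogisticRegression(max_iter=1000)                       # local (ME) stand-in

    pool_g = (X_lab.copy(), y_lab.copy())  # training pool for the global model
    pool_l = (X_lab.copy(), y_lab.copy())  # training pool for the local model
    remaining = X_unl.copy()               # still-unlabeled samples

    for _ in range(rounds):
        if len(remaining) == 0:
            break
        global_clf.fit(*pool_g)
        local_clf.fit(*pool_l)

        # Each model pseudo-labels its confident samples for the other model.
        for src, dst in ((global_clf, "l"), (local_clf, "g")):
            if len(remaining) == 0:
                break
            proba = src.predict_proba(remaining)
            confident = proba.max(axis=1) >= threshold
            if not confident.any():
                continue
            X_new = remaining[confident]
            # Map argmax indices back to the original class labels.
            y_new = src.classes_[proba[confident].argmax(axis=1)]
            if dst == "g":
                pool_g = (np.vstack([pool_g[0], X_new]),
                          np.concatenate([pool_g[1], y_new]))
            else:
                pool_l = (np.vstack([pool_l[0], X_new]),
                          np.concatenate([pool_l[1], y_new]))
            remaining = remaining[~confident]

    return global_clf, local_clf

The key design point this sketch illustrates is mutual pseudo-labeling: each model's confident predictions expand the other model's training pool, so the global and local views regularize one another as the labeled pool grows.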

Original language: English
Pages (from-to): 106-115
Number of pages: 10
Journal: Neurocomputing
Volume: 126
Publication status: Published - 2014 Feb 27

Bibliographical note

Funding Information:
This research was supported by the Industrial Strategic Technology Development Program (10044828) funded by the Ministry of Trade, Industry and Energy (MI, Korea), and the ICT R&D Program 2013 funded by the MSIP (Ministry of Science, ICT & Future Planning, Korea).

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence
