Boosted multifeature learning for cross-domain transfer

Xiaoshan Yang, Tianzhu Zhang, Changsheng Xu, Ming Hsuan Yang

Research output: Contribution to journal › Article

15 Citations (Scopus)

Abstract

Conventional learning algorithms assume that the training data and test data share a common distribution. However, this assumption greatly hinders the practical application of learned models to cross-domain data analysis in multimedia. To deal with this issue, transfer learning techniques should be adopted. As a typical form of transfer learning, domain adaptation has been extensively studied recently due to its theoretical value and practical interest. In this article, we propose a boosted multifeature learning (BMFL) approach that iteratively learns multiple representations within a boosting procedure for unsupervised domain adaptation. The proposed BMFL method has a number of properties. (1) It reuses all instances with different weights assigned by the previous boosting iteration, avoiding the discarding of labeled instances common in conventional methods. (2) It models the instance weight distribution effectively by considering both the classification error and the domain similarity, which facilitates learning a new feature representation to correct previously misclassified instances. (3) It learns multiple different feature representations to effectively bridge the source and target domains. We evaluate BMFL on three applications: image classification, sentiment classification, and spam filtering. Extensive experimental results demonstrate that the proposed BMFL algorithm performs favorably against state-of-the-art domain adaptation methods.
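The sketch below is a minimal, hypothetical illustration of the general idea the abstract describes: a boosting loop over labeled source instances whose weights are updated using both the classification error and a domain-similarity term, with a new feature representation learned in each round. It is not the authors' BMFL algorithm; the random-projection "new representation", the cosine-similarity-to-target-mean proxy for domain similarity, and the specific weight-update rule are all assumptions made for illustration.

```python
# Hypothetical sketch of a boosting-style reweighting loop for unsupervised
# domain adaptation, loosely following the properties listed in the abstract.
# Not the authors' BMFL implementation.
import numpy as np
from sklearn.random_projection import GaussianRandomProjection
from sklearn.tree import DecisionTreeClassifier


def domain_similarity(X_src, X_tgt):
    """Cosine similarity of each source instance to the target-domain mean,
    rescaled to [0, 1]. This is an assumed proxy for domain similarity."""
    mu = X_tgt.mean(axis=0)
    num = X_src @ mu
    den = np.linalg.norm(X_src, axis=1) * np.linalg.norm(mu) + 1e-12
    return (num / den + 1.0) / 2.0


def boosted_multifeature_sketch(X_src, y_src, X_tgt, n_rounds=10, seed=0):
    """Assumes binary labels y_src in {0, 1}."""
    rng = np.random.RandomState(seed)
    n = len(X_src)
    w = np.full(n, 1.0 / n)          # all instances are kept, only reweighted
    learners, projections, alphas = [], [], []

    for t in range(n_rounds):
        # Learn a new feature representation each round (here: a random
        # projection fitted on both domains, as a stand-in).
        proj = GaussianRandomProjection(
            n_components=min(64, X_src.shape[1]), random_state=rng)
        proj.fit(np.vstack([X_src, X_tgt]))
        Z_src, Z_tgt = proj.transform(X_src), proj.transform(X_tgt)

        # Weak learner trained on the reweighted source instances.
        clf = DecisionTreeClassifier(max_depth=3, random_state=rng)
        clf.fit(Z_src, y_src, sample_weight=w)

        # Weighted classification error on the source domain.
        miss = (clf.predict(Z_src) != y_src).astype(float)
        err = np.clip(np.dot(w, miss), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)

        # Reweight: upweight misclassified instances, modulated by how
        # similar each instance is to the target domain.
        sim = domain_similarity(Z_src, Z_tgt)
        w *= np.exp(alpha * miss) * (0.5 + 0.5 * sim)
        w /= w.sum()

        learners.append(clf)
        projections.append(proj)
        alphas.append(alpha)

    def predict(X):
        # Weighted vote over the per-round learners in their own feature spaces.
        votes = sum(a * (2 * clf.predict(p.transform(X)) - 1)
                    for a, clf, p in zip(alphas, learners, projections))
        return (votes > 0).astype(int)

    return predict
```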

Original language: English
Article number: 35
Journal: ACM Transactions on Multimedia Computing, Communications and Applications
Volume: 11
Issue number: 3
DOIs
Publication status: Published - 2015 Jan 1

All Science Journal Classification (ASJC) codes

  • Hardware and Architecture
  • Computer Networks and Communications
