Visual Tracking via Sparse and Local Linear Coding

Guofeng Wang, Xueying Qin, Fan Zhong, Yue Liu, Hongbo Li, Qunsheng Peng, Ming-Hsuan Yang

Research output: Contribution to journal › Article › peer-review

21 Citations (Scopus)


The state search is an important component of any object tracking algorithm. Numerous algorithms have been proposed, but stochastic sampling methods (e.g., particle filters) are arguably among the most effective approaches. However, the discretization of the state space complicates the search for the precise object location. In this paper, we propose a novel tracking algorithm that extends the state space of particle observations from discrete to continuous. The solution is determined accurately via iterative linear coding between two convex hulls. The algorithm is formulated as an optimization problem, which can be efficiently solved by either convex sparse coding or locality-constrained linear coding. The algorithm is also very flexible and can be combined with many generic object representations. Thus, we first use sparse representation to achieve an efficient search mechanism for the algorithm and demonstrate its accuracy. Next, two other object representation models, i.e., least soft-threshold squares and adaptive structural local sparse appearance, are implemented with improved accuracy to demonstrate the flexibility of our algorithm. Qualitative and quantitative experimental results demonstrate that the proposed tracking algorithm performs favorably against state-of-the-art methods in dynamic scenes.
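To make the locality-constrained linear coding (LLC) step mentioned in the abstract concrete, the sketch below implements the standard closed-form LLC solution (shift the codebook to the query point, add a locality-weighted ridge penalty, solve a small linear system, and normalize to satisfy the sum-to-one constraint). This is a minimal illustration of the generic LLC technique, not the authors' tracking pipeline; the function name, `lam`, and `sigma` are illustrative choices.

```python
import numpy as np

def llc_code(x, B, lam=1e-4, sigma=1.0):
    """Locality-constrained linear coding of a feature vector.

    x : (D,) query feature vector.
    B : (M, D) codebook of M basis vectors.
    Returns coefficients c of shape (M,) with sum(c) == 1, where nearby
    basis vectors receive larger weights than distant ones.
    """
    M = B.shape[0]
    # Locality adaptor: distant basis vectors are penalized more heavily.
    d = np.exp(np.linalg.norm(B - x, axis=1) / sigma)
    d = d / d.max()                       # normalize penalties to (0, 1]
    Z = B - x                             # shift codebook to the query point
    C = Z @ Z.T                           # local "data covariance" (M, M)
    C = C + lam * np.diag(d ** 2)         # locality-weighted regularization
    c = np.linalg.solve(C, np.ones(M))    # analytic minimizer of the LLC objective
    return c / c.sum()                    # enforce the sum-to-one constraint
```

With a small regularizer the solution concentrates weight on codebook entries closest to `x`, which is what makes the coding "local" rather than globally sparse.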

Original language: English
Article number: 7140796
Pages (from-to): 3796-3809
Number of pages: 14
Journal: IEEE Transactions on Image Processing
Issue number: 11
Publication status: Published - 2015 Nov 1

Bibliographical note

Publisher Copyright:
© 2015 IEEE.

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Graphics and Computer-Aided Design
