Robust Structural Sparse Tracking

Tianzhu Zhang, Changsheng Xu, Ming-Hsuan Yang

Research output: Contribution to journal › Article

18 Citations (Scopus)

Abstract

Sparse representations have been applied to visual tracking by finding the best candidate region with minimal reconstruction error based on a set of target templates. However, most existing sparse trackers only consider holistic or local representations and do not make full use of the intrinsic structure among and inside target candidate regions, thereby making them less effective when similar objects appear at close proximity or under occlusion. In this paper, we propose a novel structural sparse representation, which not only exploits the intrinsic relationships among target candidate regions and local patches to learn their representations jointly, but also preserves the spatial structure among the local patches inside each target candidate region. For robust visual tracking, we take outliers resulting from occlusion and noise into account when searching for the best target region. Constructed within a Bayesian filtering framework, we show that the proposed algorithm accommodates most existing sparse trackers with respective merits. The formulated problem can be efficiently solved using an accelerated proximal gradient method that yields a sequence of closed form updates. Qualitative and quantitative evaluations on challenging benchmark datasets demonstrate that the proposed tracking algorithm performs favorably against several state-of-the-art methods.
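The abstract states that the tracking objective is solved with an accelerated proximal gradient method whose iterations reduce to closed-form updates. As a rough illustration of that solver family (not the paper's structured formulation, which couples candidates and patches), the sketch below applies a FISTA-style accelerated proximal gradient to the basic sparse-coding problem min_x ½‖Dx − y‖² + λ‖x‖₁, where `D` plays the role of the template dictionary and `y` a candidate observation; the proximal step is the closed-form soft-thresholding operator. All names here are illustrative.

```python
import numpy as np

def soft_threshold(v, tau):
    # Closed-form proximal operator of the l1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def apg_sparse_code(D, y, lam=0.01, n_iter=200):
    """FISTA-style accelerated proximal gradient for
    min_x 0.5*||D x - y||^2 + lam*||x||_1."""
    n = D.shape[1]
    x = np.zeros(n)
    z = x.copy()          # extrapolated (momentum) point
    t = 1.0
    L = np.linalg.norm(D, 2) ** 2   # Lipschitz constant of the smooth part
    for _ in range(n_iter):
        grad = D.T @ (D @ z - y)
        x_new = soft_threshold(z - grad / L, lam / L)  # closed-form update
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Toy check: recover a 2-sparse code over 50 normalized "templates".
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)
true_x = np.zeros(50)
true_x[3], true_x[17] = 1.0, -0.5
y = D @ true_x
x_hat = apg_sparse_code(D, y)
err = np.linalg.norm(D @ x_hat - y)   # reconstruction error of the candidate
```

In a tracker of this kind, each candidate region would be scored by such a reconstruction error and the minimizer chosen as the new target state; the paper's contribution lies in replacing the plain ℓ₁ penalty with structured regularizers that tie candidates and their local patches together.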

Original language: English
Article number: 8267329
Pages (from-to): 473-486
Number of pages: 14
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 41
Issue number: 2
DOI: 10.1109/TPAMI.2018.2797082
Publication status: Published - 2019 Feb 1


All Science Journal Classification (ASJC) codes

  • Software
  • Computer Vision and Pattern Recognition
  • Computational Theory and Mathematics
  • Artificial Intelligence
  • Applied Mathematics

Cite this

Zhang, Tianzhu ; Xu, Changsheng ; Yang, Ming-Hsuan. / Robust Structural Sparse Tracking. In: IEEE Transactions on Pattern Analysis and Machine Intelligence. 2019 ; Vol. 41, No. 2. pp. 473-486.
@article{4efbb7d6d38348cea22bccdfe2287ff3,
title = "Robust Structural Sparse Tracking",
abstract = "Sparse representations have been applied to visual tracking by finding the best candidate region with minimal reconstruction error based on a set of target templates. However, most existing sparse trackers only consider holistic or local representations and do not make full use of the intrinsic structure among and inside target candidate regions, thereby making them less effective when similar objects appear at close proximity or under occlusion. In this paper, we propose a novel structural sparse representation, which not only exploits the intrinsic relationships among target candidate regions and local patches to learn their representations jointly, but also preserves the spatial structure among the local patches inside each target candidate region. For robust visual tracking, we take outliers resulting from occlusion and noise into account when searching for the best target region. Constructed within a Bayesian filtering framework, we show that the proposed algorithm accommodates most existing sparse trackers with respective merits. The formulated problem can be efficiently solved using an accelerated proximal gradient method that yields a sequence of closed form updates. Qualitative and quantitative evaluations on challenging benchmark datasets demonstrate that the proposed tracking algorithm performs favorably against several state-of-the-art methods.",
author = "Tianzhu Zhang and Changsheng Xu and Yang, {Ming-Hsuan}",
year = "2019",
month = "2",
day = "1",
doi = "10.1109/TPAMI.2018.2797082",
language = "English",
volume = "41",
pages = "473--486",
journal = "IEEE Transactions on Pattern Analysis and Machine Intelligence",
issn = "0162-8828",
publisher = "IEEE Computer Society",
number = "2",

}

Robust Structural Sparse Tracking. / Zhang, Tianzhu; Xu, Changsheng; Yang, Ming-Hsuan.

In: IEEE transactions on pattern analysis and machine intelligence, Vol. 41, No. 2, 8267329, 01.02.2019, p. 473-486.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Robust Structural Sparse Tracking

AU - Zhang, Tianzhu

AU - Xu, Changsheng

AU - Yang, Ming-Hsuan

PY - 2019/2/1

Y1 - 2019/2/1

N2 - Sparse representations have been applied to visual tracking by finding the best candidate region with minimal reconstruction error based on a set of target templates. However, most existing sparse trackers only consider holistic or local representations and do not make full use of the intrinsic structure among and inside target candidate regions, thereby making them less effective when similar objects appear at close proximity or under occlusion. In this paper, we propose a novel structural sparse representation, which not only exploits the intrinsic relationships among target candidate regions and local patches to learn their representations jointly, but also preserves the spatial structure among the local patches inside each target candidate region. For robust visual tracking, we take outliers resulting from occlusion and noise into account when searching for the best target region. Constructed within a Bayesian filtering framework, we show that the proposed algorithm accommodates most existing sparse trackers with respective merits. The formulated problem can be efficiently solved using an accelerated proximal gradient method that yields a sequence of closed form updates. Qualitative and quantitative evaluations on challenging benchmark datasets demonstrate that the proposed tracking algorithm performs favorably against several state-of-the-art methods.

AB - Sparse representations have been applied to visual tracking by finding the best candidate region with minimal reconstruction error based on a set of target templates. However, most existing sparse trackers only consider holistic or local representations and do not make full use of the intrinsic structure among and inside target candidate regions, thereby making them less effective when similar objects appear at close proximity or under occlusion. In this paper, we propose a novel structural sparse representation, which not only exploits the intrinsic relationships among target candidate regions and local patches to learn their representations jointly, but also preserves the spatial structure among the local patches inside each target candidate region. For robust visual tracking, we take outliers resulting from occlusion and noise into account when searching for the best target region. Constructed within a Bayesian filtering framework, we show that the proposed algorithm accommodates most existing sparse trackers with respective merits. The formulated problem can be efficiently solved using an accelerated proximal gradient method that yields a sequence of closed form updates. Qualitative and quantitative evaluations on challenging benchmark datasets demonstrate that the proposed tracking algorithm performs favorably against several state-of-the-art methods.

UR - http://www.scopus.com/inward/record.url?scp=85041012100&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85041012100&partnerID=8YFLogxK

U2 - 10.1109/TPAMI.2018.2797082

DO - 10.1109/TPAMI.2018.2797082

M3 - Article

C2 - 29994599

AN - SCOPUS:85041012100

VL - 41

SP - 473

EP - 486

JO - IEEE Transactions on Pattern Analysis and Machine Intelligence

JF - IEEE Transactions on Pattern Analysis and Machine Intelligence

SN - 0162-8828

IS - 2

M1 - 8267329

ER -