Visual Tracking under Motion Blur

Bo Ma, Lianghua Huang, Jianbing Shen, Ling Shao, Ming-Hsuan Yang, Fatih Porikli

Research output: Contribution to journal › Article › peer-review

59 Citations (Scopus)


Most existing tracking algorithms do not explicitly account for the motion blur in video sequences, which degrades their performance in real-world applications where motion blur frequently occurs. In this paper, we propose to address the motion blur problem in visual tracking within a unified framework. Specifically, a joint blur state estimation and multi-task reverse sparse learning framework is presented, in which a closed-form solution for the blur kernel and the sparse code matrix is obtained simultaneously. The reverse process treats the blurry candidates as dictionary elements and sparsely represents the blurred templates over these candidates. By exploiting the information in the sparse code matrix, an efficient likelihood model is further developed that quickly excludes irrelevant candidates and reduces the number of particles. Experimental results on challenging benchmarks show that our method performs well against state-of-the-art trackers.
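To make the "reverse" representation concrete, the following is a minimal sketch, not the authors' exact formulation: blurred templates are coded over a dictionary built from the tracking candidates, with a row-sparse (ℓ2,1) penalty coupling the templates as a multi-task problem, solved here by plain proximal gradient descent (ISTA). All names and parameters are illustrative assumptions.

```python
import numpy as np

def reverse_sparse_code(templates, candidates, lam=0.1, n_iter=200):
    """Sparsely represent blurred templates over a dictionary of
    candidates (the 'reverse' direction described in the abstract):
        min_X 0.5 * ||T - C X||_F^2 + lam * sum_i ||X[i, :]||_2
    The l2,1 row-sparsity couples the templates (multi-task): a
    candidate is either shared by all templates or used by none.
    Illustrative sketch; the paper derives a closed-form solution
    jointly with the blur kernel, which is not reproduced here.
    """
    C = candidates  # (d, n): each column is a candidate patch
    T = templates   # (d, m): each column is a blurred template
    X = np.zeros((C.shape[1], T.shape[1]))
    # Step size from the Lipschitz constant of the smooth term.
    L = np.linalg.norm(C, 2) ** 2
    for _ in range(n_iter):
        grad = C.T @ (C @ X - T)
        Y = X - grad / L
        # Row-wise soft thresholding (proximal operator of l2,1).
        norms = np.linalg.norm(Y, axis=1, keepdims=True)
        shrink = np.maximum(1.0 - (lam / L) / np.maximum(norms, 1e-12), 0.0)
        X = shrink * Y
    return X
```

Candidates whose rows carry large total weight in `X` reconstruct the templates well and are plausible target locations; rows that shrink to zero correspond to candidates the likelihood model can discard cheaply, which is how the sparse code matrix supports fast candidate pruning.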

Original language: English
Article number: 7585089
Pages (from-to): 5867-5876
Number of pages: 10
Journal: IEEE Transactions on Image Processing
Issue number: 12
Publication status: Published - Dec 2016

Bibliographical note

Funding Information:
This work was supported in part by the National Natural Science Foundation of China under Grant 61472036 and Grant 61272359, in part by the National Basic Research Program of China (973 Program) under Grant 2013CB328805, in part by the Australian Research Council's Discovery Projects Funding Scheme under Grant DP150104645, and in part by the Specialized Fund for Joint Building Program of Beijing Municipal Education Commission.

Publisher Copyright:
© 2016 IEEE.

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Graphics and Computer-Aided Design


