We propose an algorithm for accurate tracking of (articulated) objects using online updates of appearance and shape. The challenge here is to model foreground appearance with histograms in a way that is both efficient and accurate. In this algorithm, the constantly changing foreground shape is modeled as a small number of rectangular blocks, whose positions within the tracking window are adaptively determined. Under the general assumption of stationary foreground appearance, we show that robust object tracking is possible by adaptively adjusting the locations of these blocks. Implemented in MATLAB without substantial optimization, our tracker already runs at 3.7 frames per second on a 3GHz machine. Experimental results demonstrate that the algorithm is able to efficiently track articulated objects undergoing large variation in appearance and shape.
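As a rough illustration of the block-based idea described in the abstract, the sketch below scores candidate window offsets by comparing per-block intensity histograms against reference histograms. This is a minimal toy version, not the paper's actual optimization: the function names, the Bhattacharyya similarity, the exhaustive offset search, and the fixed block layout are all assumptions introduced here for illustration.

```python
import numpy as np

def block_histogram(image, block, n_bins=16):
    """Normalized intensity histogram of a rectangular block (x, y, w, h).
    (Hypothetical helper; the paper's appearance model may differ.)"""
    x, y, w, h = block
    patch = image[y:y + h, x:x + w]
    hist, _ = np.histogram(patch, bins=n_bins, range=(0, 256))
    hist = hist.astype(float)
    return hist / (hist.sum() + 1e-12)

def bhattacharyya(p, q):
    """Similarity between two normalized histograms (1.0 = identical)."""
    return float(np.sum(np.sqrt(p * q)))

def score_window(image, offset, blocks, ref_hists):
    """Total block-histogram similarity with the window shifted by offset."""
    dx, dy = offset
    return sum(
        bhattacharyya(block_histogram(image, (x + dx, y + dy, w, h)), ref)
        for (x, y, w, h), ref in zip(blocks, ref_hists)
    )

# Toy usage: two blocks inside a random frame, reference histograms taken
# from the same frame, then a brute-force search over small offsets.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(120, 160)).astype(np.uint8)
blocks = [(10, 10, 20, 30), (35, 15, 20, 25)]
refs = [block_histogram(frame, b) for b in blocks]
best = max(
    [(dx, dy) for dx in range(-5, 6) for dy in range(-5, 6)],
    key=lambda o: score_window(frame, o, blocks, refs),
)
# Since the references come from this very frame, the zero offset attains
# the maximal possible score (1.0 per block).
```

The paper's contribution lies in also adapting the block positions online as the articulated shape changes; the fixed layout above omits that step for brevity.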
Title of host publication: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR
Publication status: Published - 2008
Event: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR - Anchorage, AK, United States
Duration: 2008 Jun 23 → 2008 Jun 28
Bibliographical note (funding information):
M.-H. Yang is supported in part by a UC Merced faculty start-up fund and a gift from Google.
All Science Journal Classification (ASJC) codes
- Computer Vision and Pattern Recognition
- Control and Systems Engineering