IJMLC 2016 Vol.6(3): 196-204 ISSN: 2010-3700
DOI: 10.18178/ijmlc.2016.6.3.598

Robust and Persistent Visual Tracking-by-Detection for Robotic Vision Systems

Yao Yeboah, Zhuliang Yu, and Wei Wu
Abstract—Visual object tracking in robotics applications is riddled with challenges arising from nearly incessant object motion, robot motion, or in certain cases both, which may not necessarily be correlated. The problem is further compounded by appearance variations resulting from scale and pose changes, making the establishment of robust online tracking schemes a challenging task. This paper presents an extension of the CSK tracker that effectively incorporates depth features and an improved model update scheme into a single tracking framework. To realize this framework, feed-forward and feedback strategies are introduced into the CSK tracking scheme, allowing the seamless incorporation of depth features extracted on a per-frame basis. Additionally, coherency is achieved between the depth and RGB feature spaces via a coupling scheme that warps the two spaces frame by frame. An intermediary stage between object detection and model update is further introduced to enable intelligent and adaptive updates of the model and classifier parameters. The contributions of the paper are threefold. First, depth features are efficiently incorporated into the CSK tracker to robustify its classifier. Second, intelligent and adaptive classifier and model parameter update strategies are achieved by means of the feedback and feed-forward strategies. Finally, a coupling scheme allows warping between the RGB and depth spaces, improving tracker robustness without introducing additional computational overhead or significantly sacrificing classifier speed. Experimental results suggest that the proposed scheme achieves robust tracking under partial occlusion, offering a means by which the CSK tracker can be made feasible for robotics applications. The scheme also allows the circulant tracker to operate at speeds suitable for online robotic applications.
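The CSK family of trackers exploits the circulant structure of cyclically shifted image patches to train a ridge-regression classifier in closed form in the Fourier domain. As a rough illustration of that core training/detection step (a minimal linear, single-channel sketch, not the authors' kernelized, depth-augmented implementation; the patch size, Gaussian label, and regularization value are illustrative assumptions):

```python
import numpy as np

def gaussian_label(shape, center, sigma=2.0):
    """Desired response map: a Gaussian peaked at the target location."""
    rows, cols = np.ogrid[:shape[0], :shape[1]]
    return np.exp(-((rows - center[0]) ** 2 + (cols - center[1]) ** 2)
                  / (2.0 * sigma ** 2))

def train_filter(x, y, lam=1e-4):
    """Ridge regression over all cyclic shifts of patch x, solved in
    closed form in the Fourier domain (the circulant trick)."""
    xf = np.fft.fft2(x)
    yf = np.fft.fft2(y)
    return (np.conj(xf) * yf) / (xf * np.conj(xf) + lam)

def detect(w_hat, z):
    """Correlation response of the learned filter over candidate patch z;
    the argmax of the response gives the estimated target position."""
    zf = np.fft.fft2(z)
    return np.real(np.fft.ifft2(zf * w_hat))

rng = np.random.default_rng(0)
x = rng.random((32, 32))               # training patch (grayscale)
y = gaussian_label(x.shape, (10, 20))  # label peaked at the target
w_hat = train_filter(x, y)
response = detect(w_hat, x)            # detect on the training patch
peak = np.unravel_index(response.argmax(), response.shape)
print(peak)  # the response peak recovers the target location, (10, 20)
```

Because training and detection reduce to element-wise operations between FFTs, the per-frame cost is dominated by a few 2D FFTs, which is what makes frame rates suitable for online robotic use attainable; the kernelized formulation and the depth/RGB coupling described above build on this same structure.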

Index Terms—Robust tracking, tracking-by-detection, tracking for robotics, visual tracking.

Yao Yeboah is with the Department of Electrical and Computer Engineering, South China University of Technology, Tianhe, Guangzhou, P.R. China (e-mail: mail@yaoyeboah.com).
Zhuliang Yu and Wei Wu are with the School of Automation Technology, Tianhe, Guangzhou, P.R. China (e-mail: zlyu@scut.edu.cn, auweiwu@scut.edu.cn).

[PDF]

Cite: Yao Yeboah, Zhuliang Yu, and Wei Wu, "Robust and Persistent Visual Tracking-by-Detection for Robotic Vision Systems," International Journal of Machine Learning and Computing vol. 6, no. 3, pp. 196-204, 2016.

Copyright © 2008-2015. International Journal of Machine Learning and Computing. All rights reserved.
E-mail: ijmlc@ejournal.net