Abstract—Visual object tracking in robotics applications is riddled with challenges arising from near-incessant object motion, robot motion, or in certain cases both object and robot motion that are not necessarily correlated. The problem is further compounded by appearance variations resulting from scale and pose changes, rendering the establishment of robust online tracking schemes a challenging task. This paper presents an extension of the CSK tracker that effectively incorporates depth features and an improved model update scheme into a single tracking framework. In realizing this framework, feed-forward and feedback strategies are introduced into the CSK tracking scheme, allowing for the seamless incorporation of depth features extracted on a per-frame basis. Additionally, coherency is achieved between the depth and RGB feature spaces via a coupling scheme applied to warp the depth and RGB spaces on a per-frame basis. An intermediary stage between object detection and model update is further introduced to enable intelligent and adaptive model and classifier parameter update schemes. The contributions of the paper are manifold. Firstly, the paper achieves an efficient incorporation of depth features into the CSK tracker towards classifier robustification. Secondly, intelligent and adaptive classifier and model parameter update strategies are achieved towards robust tracking by means of feedback and feed-forward strategies. Finally, a coupling scheme allows warping to be achieved between the RGB and depth spaces, thereby facilitating tracker robustness without introducing additional computational overhead or significantly trading off classifier speed. Experimental results suggest that the proposed scheme achieves tracking robustness under partial occlusion, offering a means by which the CSK tracker can be robustified towards feasibility in robotics applications.
The scheme also allows the Circulant Tracker to operate at speeds feasible for online robotic applications.
Index Terms—Robust tracking, tracking-by-detection, tracking for robotics, visual tracking.
Yao Yeboah is with the Department of Electrical and Computer Engineering, South China University of Technology, Tianhe, Guangzhou, P.R. China (e-mail: mail@yaoyeboah.com).
Zhuliang Yu and Wei Wu are with the School of Automation Technology, Tianhe, Guangzhou, P.R. China (e-mail: firstname.lastname@example.org, email@example.com).
Cite: Yao Yeboah, Zhuliang Yu, and Wei Wu, "Robust and Persistent Visual Tracking-by-Detection for Robotic Vision Systems," International Journal of Machine Learning and Computing vol. 6, no. 3, pp. 196-204, 2016.