Abstract—Tracking people as they move through a camera network with non-overlapping fields of view is challenging due to the significant changes in their appearance across cameras. The varying appearance of a pedestrian results from illumination changes, camera parameters, viewing angle, and the deformable geometry of the human body. In addition, observations of individuals in such systems are often widely separated in time and space, so common proximity techniques cannot be used to constrain possible correspondences. In this paper, a new feature is proposed to represent the appearance of people that is capable of handling the typical illumination changes occurring in indoor environments. To construct the proposed feature, the co-occurrence matrix of the image is computed in the YCbCr color space; the diagonal vectors of the co-occurrence matrix are then extracted and normalized. Experimental results on a real surveillance scene show the efficiency of the proposed method.
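The feature construction described above can be illustrated with a minimal sketch, not the authors' exact pipeline: one YCbCr channel is quantized, a co-occurrence matrix over horizontal neighbour pairs is accumulated, and its normalized main diagonal is taken as the appearance feature. The bin count, the neighbour offset, and the function names here are illustrative assumptions.

```python
def rgb_to_ycbcr(r, g, b):
    """ITU-R BT.601 RGB -> YCbCr conversion for 8-bit values."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def cooccurrence_diagonal(channel, bins=16):
    """Co-occurrence matrix of a 2-D channel (horizontal neighbour pairs),
    reduced to its main diagonal and normalized to sum to 1.
    `bins` is an illustrative quantization choice, not from the paper."""
    # Quantize 8-bit values into `bins` gray levels.
    q = [[min(int(v * bins / 256), bins - 1) for v in row] for row in channel]
    mat = [[0] * bins for _ in range(bins)]
    for row in q:
        for a, b in zip(row, row[1:]):
            mat[a][b] += 1  # count each horizontal neighbour pair
    diag = [mat[k][k] for k in range(bins)]
    total = sum(diag) or 1
    return [d / total for d in diag]
```

A per-pedestrian feature would then be the concatenation of such diagonal vectors over the chosen channels; comparing features across cameras reduces to comparing these normalized vectors.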
Index Terms—Tracking, disjoint camera views, co-occurrence matrix, YCbCr color space, Jensen function.
The authors are with the Department of Computer and IT Engineering, Shahrood University of Technology, Shahrood, Iran (e-mail: firstname.lastname@example.org, email@example.com).
Cite: Rahman Yousefzadeh and Hamid Hassanpour, "Pedestrian Tracking through Camera Network with Disjoint Views," International Journal of Machine Learning and Computing, vol. 3, no. 3, pp. 271-273, 2013.