Multi-Person Tracking for a Mobile Robot using Overlapping Silhouette Templates

Junji Satake and Jun Miura
Toyohashi University of Technology

This paper describes a stereo-based person tracking method for a person-following robot. Many previous works on person tracking use laser range finders, which can provide very accurate range measurements. Stereo-based systems have also been popular, but most of them are not used for controlling a real robot. In this paper, we propose an accurate, stable tracking method using overlapping silhouette templates, which consider how persons overlap in the image. Experimental results show the effectiveness of the proposed method. In addition, we examine the possibility of extending the method to multi-camera cases.

1. Introduction

Person detection and tracking has been studied extensively in computer vision 1)-4). For tracking from a mobile robot, laser range finders are widely used because of their accurate range measurements 5),6), and omnidirectional vision has also been employed 7),8). Stereo-based person tracking has been studied as well 9)-11), and Ess et al. developed mobile vision systems for multi-person tracking 12),13). For tracking multiple people through mutual occlusions 14)-16), MacCormick and Blake proposed the probabilistic exclusion principle 14), Khan et al. an MCMC-based particle filter for interacting targets 15), and Tweed and Calway subordinated Condensation 16). In this paper, we propose overlapping silhouette templates, which explicitly model how two persons overlap in the image, and use them for stable stereo-based multi-person tracking on a person-following robot.

c 2010 Information Processing Society of Japan
2. Person Tracking Using Depth Templates

2.1 Depth Templates
Following our previous work 17), we detect persons in the depth image with three depth templates, prepared for a person facing Left, Front, and Right (Fig. 1). The templates are generated from silhouettes of a person standing about 2 m in front of the camera; in our earlier experiments this detector found persons in roughly 90% of the frames.

Fig. 1 Depth templates (Left, Front, Right)
Fig. 2 Detection examples using depth templates

2.2 Tracking with a Particle Filter
Let T be an H x W depth template and I_D(x, y) the depth image. The dissimilarity d at image position (x, y) is the mean squared difference

    d = (1/HW) Σ_p Σ_q [T(p, q) - I_D(x + p, y + q)]^2    (1)

The state of a tracked person at time t consists of the 3-D position (X_t, Y_t, Z_t) and the velocity (Ẋ_t, Ẏ_t) on the X-Y (floor) plane:

    x_t = [X_t  Y_t  Z_t  Ẋ_t  Ẏ_t]^T    (2)

Assuming a constant-velocity motion model, the state evolves as

    x_{t+1} = F_t x_t + w_t    (3)

where F_t is the state transition matrix and w_t is process noise. The likelihood of a particle is computed from the dissimilarity d of Eq. (1) as

    L = (1 / (√(2π) σ)) exp(-d^2 / (2σ^2))    (4)
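The template matching and motion model of Eqs. (1)-(4) can be sketched as follows. This is a minimal illustration assuming NumPy; the function names and window-indexing convention are ours, not the authors' implementation.

```python
import numpy as np

def dissimilarity(template, depth_image, x, y):
    """Eq. (1): mean squared difference between the H x W depth
    template T and the depth image I_D at window position (x, y)."""
    H, W = template.shape
    window = depth_image[y:y + H, x:x + W]
    return np.mean((template - window) ** 2)

def likelihood(d, sigma=1.0):
    """Eq. (4): Gaussian likelihood of a particle from dissimilarity d."""
    return np.exp(-d ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

def transition_matrix(dt):
    """Eqs. (2)-(3): constant-velocity model for the state
    x_t = [X, Y, Z, Xdot, Ydot]^T; only X and Y have velocity terms."""
    F = np.eye(5)
    F[0, 3] = dt  # X advances by Xdot * dt
    F[1, 4] = dt  # Y advances by Ydot * dt
    return F
```

A perfectly matching window gives d = 0 and hence the maximum likelihood 1/(√(2π) σ), so smaller dissimilarities dominate after particle weighting.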
2.3 Tracking with Overlapping Silhouette Templates
When two persons A and B overlap in the image, their silhouettes occlude each other and individual tracking easily fails. We therefore combine the two silhouettes into an overlapping silhouette template (Fig. 3 defines the coordinate systems of persons A and B; Fig. 4 shows the tracking procedure) and track the pair with a single joint state, the concatenation of the two individual states:

    x^{AB}_t = [ (x^A_t)^T  (x^B_t)^T ]^T    (5)

Fig. 3 Definition of coordinate systems (Person A, Person B)
Fig. 4 Procedure of tracking using an overlapping silhouette template

Each single person is tracked with N = 100 particles, while the joint tracker uses N_AB = 25 x 25 particles. The joint likelihood L_AB is the product of the individual likelihoods L_A and L_B evaluated with the overlapping silhouette template:

    L_AB = L_A · L_B

The joint state evolves with a block-diagonal transition matrix:

    x^{AB}_{t+1} = [ F_t  0 ; 0  F_t ] x^{AB}_t + [ w_t ; w_t ]    (6)

Fig. 5 Comparison of tracking results with overlapping silhouette templates (drawn by white frames) and with individual templates (drawn by red or blue frames)

When the two persons separate again, the joint state x^{AB}_t is split back into x^A_t and x^B_t, and each person is once more tracked individually with N particles as in Sec. 2.2.
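A minimal sketch of the joint-state machinery of Eqs. (5)-(6), assuming NumPy. Building the N_AB joint particles by pairing the two individual particle sets, and all helper names, are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def pair_particles(particles_a, particles_b):
    """Eq. (5): build joint states x^AB = [x^A; x^B] by pairing the
    two persons' particle sets (e.g. 25 x 25 pairs -> N_AB = 625)."""
    n_a, n_b = len(particles_a), len(particles_b)
    a = np.repeat(particles_a, n_b, axis=0)  # each x^A repeated n_b times
    b = np.tile(particles_b, (n_a, 1))       # set B cycled n_a times
    return np.hstack([a, b])

def predict_joint(joint_particles, F, noise_std, rng):
    """Eq. (6): apply the block-diagonal transition [[F, 0], [0, F]]
    to every joint particle and add process noise."""
    F_joint = np.block([[F, np.zeros_like(F)],
                        [np.zeros_like(F), F]])
    noise = rng.normal(0.0, noise_std, joint_particles.shape)
    return joint_particles @ F_joint.T + noise

def joint_likelihood(l_a, l_b):
    """L_AB = L_A * L_B under the overlapping silhouette template."""
    return l_a * l_b
```

Splitting the pair after the overlap ends is then just slicing each joint particle back into its first and second halves.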
Figs. 5(a) and (b) compare the two approaches: with individual templates the two trackers are confused while the persons overlap, whereas the overlapping silhouette template keeps both persons tracked correctly through the overlap.

3. Extension to Multiple Cameras
We next extend the particle-filter tracker 18) to multiple cameras. The system consists of the camera on the robot and ceiling cameras, each connected to its own PC (Fig. 6). Fig. 7 illustrates the two integration schemes we consider.

3.1 Integration of Estimated Person Positions (Fig. 7(a))
Each camera runs an independent tracker that holds its own N particles and evaluates each particle locally; the integration module then integrates the person positions estimated by the individual cameras.

3.2 Integration of Particle Evaluations (Fig. 7(b))
The integration module holds a single set of N particles. The 3-D position of each particle is sent to every camera, each camera evaluates each particle against its own image, and the evaluation results are returned to the integration module, which combines them into the particle weights.

Fig. 6 Tracking using multiple cameras
Fig. 7 Integration of tracking with multiple cameras: (a) integration of person positions estimated by individual cameras, (b) integration of particle evaluations
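The integration of particle evaluations can be sketched as below, assuming NumPy. The multiplicative combination of per-camera likelihoods and all names are our illustrative assumptions; the combination rule is not specified here in the paper.

```python
import numpy as np

def fuse_particle_weights(positions, camera_evaluators):
    """Sketch of Fig. 7(b): the integration module holds one particle
    set; every camera evaluates the 3-D position of every particle,
    and the module combines the returned likelihoods (here by taking
    their product) into normalized particle weights."""
    weights = np.ones(len(positions))
    for evaluate in camera_evaluators:
        weights *= np.array([evaluate(p) for p in positions])
    total = weights.sum()
    if total == 0.0:
        # No camera supports any particle: fall back to uniform weights.
        return np.full(len(positions), 1.0 / len(positions))
    return weights / total
```

Because every camera scores the same shared particles, a person occluded in one view can still be weighted highly by the other views.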
4. Experiments

4.1 Tracking and Following with One Stereo Camera
The robot is a MobileRobots PeopleBot equipped with a Point Grey Research Bumblebee2 stereo camera (XGA, 20 fps, f = 2.5 mm, 12 cm baseline) and one PC (Core2Duo, 3.06 GHz). Images are processed at 512 x 384 pixels at about 10 fps. Fig. 8 shows a tracking result on off-line images with N = 100 individual particles and N_AB = 25 x 25 joint particles; persons A and B overlap around frames #160-#202 and #310-#324, and both overlaps are tracked correctly.

Table 1 Comparison of tracking results with respect to the number of particles
(a) Using only individual templates
  number of particles   success rate   positional error   processing time
  N = 100               73.3 %         9.56 pixel         248 ms
  N = 200               75.0 %         9.95 pixel         383 ms
(b) Using overlapping silhouette templates
  number of particles   success rate   positional error   processing time
  N_AB = 15 x 15        86.7 %         7.09 pixel         226 ms
  N_AB = 20 x 20        90.0 %         6.74 pixel         314 ms
  N_AB = 25 x 25        93.3 %         6.64 pixel         436 ms
  N_AB = 30 x 30        90.0 %         6.53 pixel         568 ms
  N_AB = 100 x 100      91.7 %         6.43 pixel         4993 ms

With overlapping silhouette templates, N_AB = 25 x 25 gives the best success rate (93.3%) at a practical processing time, clearly outperforming tracking with individual templates alone. Fig. 9 shows a person-following experiment with N = 100 and N_AB = 20 x 20, in which the robot keeps following person A while persons A and B cross.

4.2 Tracking with Multiple Cameras
We use the robot camera and two ceiling cameras (Fig. 6), each connected to its own PC; the camera PCs and the integration PC communicate over TCP/IP.
Using the integration scheme of Sec. 3.2, Fig. 10 shows the resulting multi-camera tracking.

5. Conclusion
This paper proposed a stereo-based multi-person tracking method for a person-following robot using overlapping silhouette templates, which model how two persons overlap in the image, and experimentally confirmed its accuracy and stability. We also examined the possibility of extending the method to multiple cameras.

References
1) P. Viola, M. J. Jones, and D. Snow: Detecting pedestrians using patterns of motion and appearance, Int. J. Computer Vision, Vol.63, No.2, pp.153-161 (2005).
2) N. Dalal and B. Triggs: Histograms of oriented gradients for human detection, IEEE Conf. Computer Vision and Pattern Recognition, pp.886-893 (2005).
3) B. Han, S. W. Joo, and L. S. Davis: Probabilistic fusion tracking using mixture kernel-based Bayesian filtering, Int. Conf. Computer Vision (2007).
4) S. Munder, C. Schnorr, and D. M. Gavrila: Pedestrian detection and tracking using a mixture of view-based shape-texture models, IEEE Trans. Intelligent Transportation Systems, Vol.9, No.2, pp.333-343 (2008).
5) D. Schulz, W. Burgard, D. Fox, and A. B. Cremers: People tracking with a mobile robot using sample-based joint probabilistic data association filters, Int. J. Robotics Research, Vol.22, No.2, pp.99-116 (2003).
6) N. Bellotto and H. Hu: Multisensor data fusion for joint people tracking and identification with a service robot, IEEE Int. Conf. Robotics and Biomimetics, pp.1494-1499 (2007).
7) H. Koyasu, J. Miura, and Y. Shirai: Realtime omnidirectional stereo for obstacle detection and tracking in dynamic environments, IEEE/RSJ Int. Conf. Intelligent Robots and Systems, pp.31-36 (2001).
8) M. Kobilarov, G. Sukhatme, J. Hyams, and P. Batavia: People tracking and following with mobile robot using omnidirectional camera and a laser, IEEE Int. Conf. Robotics and Automation, pp.557-562 (2006).
9) D. Beymer and K. Konolige: Real-time tracking of multiple people using continuous detection, Int. Conf. Computer Vision (1999).
10) A. Howard, L. H. Matthies, A. Huertas, M. Bajracharya, and A. Rankin: Detecting pedestrians with stereo vision: safe operation of autonomous ground vehicles in dynamic environments, Int. Symp. Robotics Research (2007).
11) D. Calisi, L. Iocchi, and R. Leone: Person following through appearance models and stereo vision using a mobile robot, VISAPP 2007 Workshop on Robot Vision, pp.46-56 (2007).
12) A. Ess, B. Leibe, and L. Van Gool: Depth and appearance for mobile scene analysis, Int. Conf. Computer Vision (2007).
13) A. Ess, B. Leibe, K. Schindler, and L. Van Gool: A mobile vision system for robust multi-person tracking, IEEE Conf. Computer Vision and Pattern Recognition (2008).
14) J. MacCormick and A. Blake: A probabilistic exclusion principle for tracking multiple objects, Int. Conf. Computer Vision, pp.572-578 (1999).
15) Z. Khan, T. Balch, and F. Dellaert: An MCMC-based particle filter for tracking multiple interacting targets, European Conf. Computer Vision, pp.279-290 (2004).
16) D. Tweed and A. Calway: Tracking many objects using subordinated Condensation, British Machine Vision Conf., pp.283-292 (2002).
17) J. Satake and J. Miura: Robust stereo-based person detection and tracking for a person following robot, IEEE ICRA 2009 Workshop on People Detection and Tracking (2009).
18) SIR/MCMC, Vol.28, No.1, pp.65-76 (2008) (in Japanese).
IPSJ SIG Technical Report
Fig. 8 Experimental result of tracking using off-line images
Fig. 9 Result of person-following control
Fig. 10 Tracking result using multiple cameras: (a) robot camera, (b) ceiling camera 1, (c) ceiling camera 2, (d) integration result