Journal of the Robotics Society of Japan (JRSJ), Vol. 33, No. 4, pp. 63–74, May 2015

Automatic Planning of Laser Measurements for a Large-Scale Environment Using the CPS-SLAM System

Souichiro Oshima, Syogo Nagakura, Jeong Yongjin, Yumi Iwashita and Ryo Kurazume

In recent years, several low-priced 3D laser scanners have been brought to market, and 3D laser scanning has come into wide use in many applications. For example, 3D modeling of architectural structures and digital preservation of cultural heritage are typical applications of 3D laser scanning. Despite the development of lightweight, high-speed laser scanners, complicated measurement procedures and long measurement times remain a heavy burden that limits the widespread use of laser scanning. We have proposed a robotic 3D scanning system using multiple robots, named CPS-SLAM, which consists of a parent robot with a 3D laser scanner and child robots with corner cubes. This system performs 3D laser scanning without complicated post-processing procedures such as ICP, and large-scale 3D models are acquired by the on-board 3D laser scanner from several positions determined precisely by the localization technique named the Cooperative Positioning System (CPS). This paper proposes an automatic planning technique for efficient laser measurement with the CPS-SLAM system. By planning a proper scanning strategy for a given target structure, laser scanning can be performed efficiently and accurately even in a large-scale environment. The proposed technique plans an optimal scanning strategy automatically by taking several criteria into consideration, such as visibility between robots, error accumulation, and efficiency of travel. We conducted computer simulations and outdoor experiments to verify the performance of the proposed technique.

Key Words: Laser measurement, Multiple robots, 3D modeling, Automatic sensing planning.
1. Introduction

[Japanese body text not recovered in extraction. The surviving fragments mention: low-priced commercial 3D laser scanners such as the FARO Focus 3D, Leica ScanStation, and TOPCON GLS series; scan registration by the Iterative Closest Point (ICP) algorithm; the authors' CPS-SLAM system for large-scale 3D modeling [1][2][3]; SLAM with and without GPS; and accuracy figures reported for CPS-SLAM in [3] (an error of 98 mm; the travel-distance digits were lost).]

* Kyushu University
[Japanese body text not recovered. The closing fragments of the introduction mention ICP-based alignment, SLAM, RTK-GPS and VRS-GPS, and outline the rest of the paper.]

2. Related Work

[Japanese body text not recovered. The surviving citations cover: surveys of sensor planning and automated visual inspection [4][5] and of coverage in camera networks [6]; the art gallery problem [7][8]; interactive sensor planning by Stamos and Allen [9]; sensor positioning on 3-D terrain by Topcuoglu et al. [10]; automatic sensor placement by Chen and Li [11] and view planning with a registration constraint by Scott et al. [12]; CAD-based range-sensor placement by Prieto et al. [13] and related scan-planning work [14][15][16][17]; information-entropy-based viewpoint planning by Li and Liu [18], i.e., the Next Best View (NBV) problem; active perception [19][20] and the active perception system of Okamoto et al. [21]; and environment exploration [22].]

3. CPS-SLAM

[Japanese body text not recovered. This section reviews the Cooperative Positioning System (CPS), the localization principle behind CPS-SLAM, illustrated in Fig. 2.]
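In CPS, each localization step converts a total-station reading of a corner cube (range r, horizontal angle φ, vertical angle ψ, cf. Fig. 2) into the measured robot's world-frame position. A minimal sketch of that conversion, assuming a planar parent pose (x, y, z, yaw) and standard spherical-to-Cartesian conventions; the function names and frame conventions are illustrative, not taken from the paper:

```python
import math

def spherical_to_cartesian(r, phi, psi):
    """Total-station reading (range r, horizontal angle phi, vertical
    angle psi, both in radians) as a Cartesian offset in the instrument frame."""
    x = r * math.cos(psi) * math.cos(phi)
    y = r * math.cos(psi) * math.sin(phi)
    z = r * math.sin(psi)
    return x, y, z

def locate_child(parent_pose, r, phi, psi):
    """World-frame position of a measured corner cube, given the parent
    pose (x, y, z, yaw) -- an illustrative planar-pose convention."""
    px, py, pz, yaw = parent_pose
    dx, dy, dz = spherical_to_cartesian(r, phi, psi)
    # rotate the horizontal offset by the parent's yaw into the world frame
    wx = px + dx * math.cos(yaw) - dy * math.sin(yaw)
    wy = py + dx * math.sin(yaw) + dy * math.cos(yaw)
    return wx, wy, pz + dz
```

Step (4) of Fig. 2 inverts the same relation: the parent re-measures the two stationary corner cubes and solves for its own new pose.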
Fig. 1 CPS-SLAM machine model, CPS-VII

Fig. 2 Cooperative Positioning System (CPS): (1) robots 1 and 2 move; (2) robot 3 measures the position of robot 1 (range r, vertical angle ψ, horizontal angle φ); (3) robot 3 measures the position of robot 2; (4) robot 3 moves and measures the positions of robots 1 and 2.

Fig. 3 3D model of a large building

[Japanese body text not recovered. The fragments state that by repeating steps (1)–(4) of CPS the robot group localizes itself precisely, so that the 3D scans taken by the parent robot can be merged directly, without ICP (Iterative Closest Point) post-processing. The current platform is CPS-VII (Fig. 1), with a parent robot carrying a total station and a laser scanner and child robots carrying corner cubes; Fig. 3 shows a resulting large-scale model.]

4. Automatic Planning of Laser Measurement

[Japanese body text not recovered. The planner follows the flowchart of Fig. 4: transform the 3D model into a 2D grid map; calculate candidate positions of the parent robot; determine the optimum position of the parent robot; calculate candidate positions of the child robots; if no such candidates exist, set subgoals for the parent robot using a Visibility Graph [23][24] and re-plan; otherwise determine the optimum positions of the child robots; plan the trajectories to the target positions; and move the parent and child robots to their targets.]

Fig. 4 Flowchart of automatic planning algorithm
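Several stages of the Fig. 4 flowchart reduce to visibility tests on the 2D grid map: which cells a candidate scan position can see, and whether parent and child robots can see each other. The paper does not give the implementation; a common choice, sketched here under that assumption, is a line-of-sight test using Bresenham's line algorithm on an occupancy grid:

```python
def line_cells(x0, y0, x1, y1):
    """Integer grid cells on the segment (x0,y0)-(x1,y1), Bresenham's algorithm."""
    cells = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx - dy
    x, y = x0, y0
    while True:
        cells.append((x, y))
        if (x, y) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dx
            y += sy
    return cells

def visible(grid, a, b):
    """True if no occupied cell (value 1) lies strictly between cells a and b."""
    return all(grid[y][x] == 0 for (x, y) in line_cells(*a, *b)[1:-1])
```

The same test, swept over all cells around a candidate position, yields the visible area used by the scoring steps of Secs. 4.1 and 4.2.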
4.1 Candidate positions of the parent robot

[Japanese body text not recovered. Per Figs. 5 and 6, candidate positions are generated as follows: sample points are scattered randomly over the boundary areas of the unknown region of the 2D grid map, the samples are clustered by K-means, and the centroids of the clusters become the candidate positions of the parent robot.]

Fig. 5 Problem definition (measured wall; unknown wall; boundary areas of the unknown region; potential field; current positions of the parent robot and of child robots 1 and 2; region visible from the current position of the parent robot)

Fig. 6 Candidate positions for the parent robot determined by K-means clustering ((a)–(d): scatter sample points randomly, cluster them, and take the centroids of the clusters as the candidates of parent positions)

Each candidate is evaluated by the score

    G = R (P + α L + β S)    (1)

where R is an indicator (= 1 when the candidate satisfies the visibility condition, = 0 otherwise; its printed definition lost its digits), P is a potential-field term [1/m?], L [m] is a travel-distance term, S [m²] is the expected newly measurable area, and α, β are weights. The candidate with the maximum score becomes the target position of the parent robot (Fig. 7).

4.2 Candidate positions of the child robots

[Japanese body text not recovered. The child-robot candidates are taken from the AND region, the intersection of the areas visible from the current and the target positions of the parent robot, so that CPS measurements remain possible before and after the parent's move.]

Fig. 7 Optimum position of the parent robot (maximum score); (a)(b) region visible from the target position of the parent robot; (c)(d) AND region and candidate positions for the child robots
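The candidate-generation and scoring steps of Sec. 4.1 can be sketched as follows. The K-means step follows the procedure named in the text; the score follows Eq. (1), but since the printed definitions of R, P, L, and S lost their digits, the sign convention for L (negated distance, so nearer candidates score higher) and the constant P are assumptions:

```python
import math
import random

def kmeans(points, k, iters=20):
    """Plain k-means on 2-D points; returns k centroids."""
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda j: (p[0] - centroids[j][0]) ** 2
                                + (p[1] - centroids[j][1]) ** 2)
            clusters[j].append(p)
        centroids = [(sum(p[0] for p in c) / len(c),
                      sum(p[1] for p in c) / len(c)) if c else centroids[j]
                     for j, c in enumerate(clusters)]
    return centroids

def score(cand, parent, visible, new_area, alpha, beta):
    """Eq. (1): G = R (P + alpha L + beta S).  R is 1 when the candidate
    is visible/reachable, else 0.  L is the negated travel distance
    (assumed sign: nearer is better); P is a placeholder constant."""
    R = 1.0 if visible else 0.0
    P = 1.0                       # potential-field term (assumed constant)
    L = -math.dist(cand, parent)  # travel-distance term
    S = new_area                  # newly measurable area
    return R * (P + alpha * L + beta * S)
```

The parent target is then `max(candidates, key=...)` over this score; setting β = 0 or α = 0 reproduces the two extreme strategies compared in Sec. 5.1.1.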
[Japanese body text not recovered. The fragments enumerate selection conditions for the child candidates within the AND region, reflecting the CPS measurement constraints (measurable range and inter-robot angles) [26].]

Each candidate pair of child positions is evaluated by the score

    G_c = P + α_c (θ / θ_t) + β_c (D / D_t)    (2)

where P is a positioning term [1/m?], θ is the angle subtended at the parent robot by the two child robots, D [m] is a distance term, θ_t and D_t are their target values (θ_t = 90°), and α_c, β_c are weights. [The precise definitions were in the Japanese text and did not survive extraction.]

4.3 Subgoal generation with a Visibility Graph

[Japanese body text not recovered. When no child candidate exists in the AND region, subgoals for the parent robot are set on a Visibility Graph [23][24] (Fig. 8) so that the CPS measurements remain feasible along the way; the fragments list a four-step procedure over the AND region whose details were lost.]

Fig. 8 Visibility Graph

4.4 Trajectory planning with a Voronoi diagram

[Japanese body text not recovered. The fragments indicate that trajectories between the planned positions combine the Visibility Graph with a Voronoi diagram [27], apparently to keep clearance from obstacles while preserving the mutual visibility needed for CPS.]

5. Simulation

5.1 Simulation settings

[Japanese body text not recovered. Three simulated maps, MAP-A, B, and C (Fig. 9), are used; the grid dimensions and cell sizes were printed as "8 6 cm cm m" and their digits were lost.]

5.1.1 Selection of the parent target position

[Japanese body text not recovered. The weights α, β of Eq. (1) are compared for two extreme settings: a) minimizing the travel distance L (β = 0) and b) maximizing the newly measurable area S (α = 0).]
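A sketch of the child-position score of Eq. (2). The roles assumed here, θ as the angle subtended at the parent by the two child robots and D as a representative parent-child distance, both normalized by targets θ_t and D_t, follow the surviving fragments but are not certain; weights and targets are passed in explicitly because their printed values lost digits:

```python
import math

def angle_between(parent, c1, c2):
    """Angle (degrees) subtended at the parent by two child positions."""
    a1 = math.atan2(c1[1] - parent[1], c1[0] - parent[0])
    a2 = math.atan2(c2[1] - parent[1], c2[0] - parent[0])
    d = abs(a1 - a2) % (2 * math.pi)
    return math.degrees(min(d, 2 * math.pi - d))

def child_pair_score(parent, c1, c2, theta_t, d_t,
                     alpha_c=1.0, beta_c=1.0, p_term=0.0):
    """Eq. (2): G_c = P + alpha_c (theta/theta_t) + beta_c (D/D_t).
    D is taken as the larger parent-child distance (an assumption)."""
    theta = angle_between(parent, c1, c2)
    dist = max(math.dist(parent, c1), math.dist(parent, c2))
    return p_term + alpha_c * theta / theta_t + beta_c * dist / d_t
```

Reweighting α_c and β_c reproduces the settings b)–d) compared in Sec. 5.1.2 (angle and distance, mainly angle, mainly distance).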
Fig. 9 MAP A, B, and C ((a)–(c))

a) Selecting the closest target position (minimizing L, β = 0) and b) selecting the target that maximizes the measurable area S (α = 0) were compared on MAP-C (Fig. 9(c)); the resulting maps and trajectories appear in Figs. 11–15, and Fig. 10 shows the planned trajectories for MAP-A. [Japanese body text not recovered. The fragments report newly measured area per scan (e.g., 5.46 [%/scan] for one setting) and per meter traveled (e.g., 0.43 [%/m]), the numbers of scans required, and final cover ratios on the order of 98%; setting b) invokes the Visibility Graph subgoals (Figs. 14, 15). Most digits were lost in extraction.]

Fig. 10 Planned trajectories of parent and child robots for MAP A

5.1.2 Selection of target positions for the child robots

[Japanese body text not recovered. The weights α_c, β_c of Eq. (2) are compared under four settings (see Table 1).]
Fig. 11 Measured map and trajectories of the parent robot in the case that the closest target position is selected (β = 0) for MAP C (panels: 1st–23rd measurements)

Fig. 12 Trajectories of robots in the case that the closest target position is selected (β = 0) for MAP C

[Japanese body text not recovered. Four settings of the child-position score, Eq. (2), are compared: a) Nagakura's method [25]; b) angle and distance considered equally; c) mainly the angle (θ_t = 90°); d) mainly the distance (D_t = 3? [m]; a digit may be missing). The numerical weights α_c, β_c for b)–d) did not survive extraction. P in Eq. (2) is based on the CPS error analysis of [26]. The simulated field was printed as "8 6 (= 8 m × 6 m)"; its digits were also lost.]
Fig. 13 Measured map and trajectories of the parent robot in the case that a target position maximizing the measurable area is selected (α = 0) for MAP C (panels: 1st–23rd measurements)

Fig. 14 Trajectories of robots in the case that a target position maximizing the measurable area is selected (α = 0) for MAP C (1st–3rd measurements; subgoals are inserted where no direct CPS-visible path exists)

[Japanese body text not recovered. The fragments compare the four child-position settings of Sec. 5.1.2, a) Nagakura's method [25] against b)–d), referring to Fig. 17 and to Table 1 (positioning error variances after nine parent moves).]
Fig. 15 Trajectories of robots in the case that a target position maximizing the measurable area is selected (α = 0) for MAP C (4th–8th measurements)

Fig. 16 Comparison of travel distance and measured area (cover ratio) of the parent robot: (a) travel distance [mm] vs. number of movements; (b) completeness [%] vs. number of measurements, for a) the closest-target setting (β = 0), b) the maximum-area setting (α = 0), and manual planning. [Plot data lost in extraction.]

Fig. 17 Comparison of positioning errors: error variance [mm²] vs. number of measurements by the parent robot, for Maps A, B, and C, under a) Nagakura's method [25], b) angle and distance, c) mainly angle, d) mainly distance. [Plot data lost in extraction.]

Table 1 Comparison of positioning error variances for MAP A, B, and C after the parent robot moves nine times [mm²] (several digits lost in extraction; the MAP A row survives only as "5..3 94. 59.3")

        (a) [25]   (b) angle/distance   (c) angle   (d) distance
MAP A   5?.?       ??.3                 94.?        59.3
MAP B   46.5       43.6                 43.7        65.?
MAP C   54.?       45.3                 47.?        45.8

5.2 Outdoor experiment

[Japanese body text not recovered.]

Fig. 18 Experimental environment ((a), (b))

Fig. 19 Outdoor experiment
Fig. 20 Trajectories of parent and child robots (case 1)

Fig. 21 Obtained 3D model (case 1)

Fig. 22 Trajectories of parent and child robots (case 2)

Fig. 23 Obtained 3D model (case 2)

Fig. 24 3D models and trajectories of parent and child robots (case 3): (a) 3D model; (b) 3D model (top view); (c) trajectories (the environment includes a slope; its printed height lost a digit)

[Japanese body text not recovered. The fragments describe the outdoor runs, including the initial positions and trajectories of the parent and child robots (Figs. 20, 22, 24(c)), the obtained models (Figs. 21, 23, 24(a)(b)), and the total travel distance (digits lost).]

6. Conclusion

[Japanese body text not recovered. The fragments summarize the proposed planner: K-means generation of parent candidates, child-position selection in the AND region, subgoal generation on a Visibility Graph, and trajectory planning with a Voronoi diagram, verified on three simulated maps and in outdoor experiments. The work was supported in part by a Grant-in-Aid for Scientific Research (A) (grant number partially lost: "6499").]

References

(Entries [1], [3], [22], [25], and [26] are in Japanese; their authors, titles, and some digits did not survive extraction. Dropped digits elsewhere have been restored where the cited work could be identified with confidence.)

[1] (In Japanese.) CPS-SLAM, vol. 25(?), no. 8, 2007(?).
[2] Y. Tobata, R. Kurazume, Y. Noda, K. Lingemann, Y. Iwashita and T. Hasegawa: "Laser-based geometrical modeling of large-scale architectural structures using co-operative multiple robots," Autonomous Robots, vol. 32, no. 1, pp. 49–62, 2012.
[3] (In Japanese.) CPS-SLAM, vol. 3?, pp. 8?–87(?).
[4] K.A. Tarabanis, P.K. Allen and R.Y. Tsai: "A survey of sensor planning in computer vision," IEEE Trans. on Robotics and Automation, vol. 11, no. 1, pp. 86–104, 1995.
[5] T.S. Newman and A.K. Jain: "A survey of automated visual inspection," Computer Vision and Image Understanding, vol. 61, no. 2, pp. 231–262, 1995.
[6] A. Mavrinac and X. Chen: "Modeling coverage in camera networks: A survey," International Journal of Computer Vision, vol. 101, no. 1, pp. 205–226, 2013.
[7] A. Aggarwal: "The art gallery theorem: Its variations, applications, and algorithmic aspects," Ph.D. thesis, Johns Hopkins University, 1984.
[8] J. O'Rourke: Art Gallery Theorems and Algorithms. Oxford University Press, 1987.
[9] I. Stamos and P.K. Allen: "Interactive sensor planning," Proc. IEEE Conf. on Computer Vision and Pattern Recognition, pp. 489–494, 1998.
[10] H.R. Topcuoglu, M. Ermis and M. Sifyan: "Positioning and utilizing sensors on a 3-D terrain, Part I: Theory and modeling," IEEE Trans. on Systems, Man, and Cybernetics, Part C, vol. 40, no. 3, 2010.
[11] S.Y. Chen and Y.F. Li: "Automatic sensor placement for model-based robot vision," IEEE Trans. on Systems, Man, and Cybernetics, Part B, vol. 34, no. 1, pp. 393–408, 2004.
[12] W.R. Scott, G. Roth and J.-F. Rivest: "View planning with a registration constraint," Proc. 3rd Int. Conf. on 3-D Digital Imaging and Modeling, pp. 127–134, 2001.
[13] F. Prieto, T. Redarce, et al.: "CAD-based range sensor placement for optimum 3D data acquisition," Proc. 2nd Int. Conf. on 3-D Digital Imaging and Modeling, pp. 128–137, 1999.
[14] D. Papadopoulos-Orfanos and F. Schmitt: "Automatic 3-D digitization using a laser rangefinder with a small field of view," Int. Conf. on Recent Advances in 3-D Digital Imaging and Modeling, pp. 60–67, 1997.
[15] H. Zha, K. Morooka, T. Hasegawa and T. Nagata: "Active modeling of 3-D objects: Planning on the next best pose (NBP) for acquiring range images," Proc. Int. Conf. on Recent Advances in 3-D Digital Imaging and Modeling, pp. 68–75, 1997.
[16] E. Marchand and F. Chaumette: "Active sensor placement for complete scene reconstruction and exploration," Proc. IEEE Int. Conf. on Robotics and Automation, pp. 743–750, 1997.
[17] S.Y. Chen and Y.F. Li: "Vision sensor planning for 3-D model acquisition," IEEE Trans. on Systems, Man, and Cybernetics, Part B, vol. 35, no. 5, pp. 894–904, 2005.
[18] Y.F. Li and Z.G. Liu: "Information entropy-based viewpoint planning for 3-D object reconstruction," IEEE Trans. on Robotics, vol. 21, no. 3, pp. 324–337, 2005.
[19] R. Bajcsy: "Active perception," Proc. of the IEEE, vol. 76, no. 8, pp. 966–1005, 1988.
[20] Y. Aloimonos (ed.): Active Perception. Psychology Press, 2013.
[21] J. Okamoto, M. Milanova and U. Bueker: "Active perception system for recognition of 3D objects in image sequences," Proc. 5th Int. Workshop on Advanced Motion Control, pp. 7?–75(?), 1998.
[22] (In Japanese.) Vol. 14(?), no. 8, 1996.
[23] R. Chatila: "Path planning and environment learning in a mobile robot system," Proc. European Conf. on Artificial Intelligence, Orsay, France, 1982.
[24] M. de Berg, et al.: Computational Geometry: Algorithms and Applications. Springer, 1997.
[25] S. Nagakura, et al. (In Japanese; bibliographic details lost; printed fragment: "3, B-3".)
[26] (In Japanese.) CPS-II, vol. 15(?), no. 5, pp. 773–78?, 1997.
[27] F. Aurenhammer: "Voronoi diagrams: A survey of a fundamental geometric data structure," ACM Computing Surveys, vol. 23, no. 3, pp. 345–405, 1991.
[28] J.C. Latombe: Robot Motion Planning. Kluwer Academic Publishers, 1991.

Appendix A. Comparison with manual planning

[Japanese body text not recovered. Fig. 25 compares measurement paths planned manually and by the proposed method for the environment of Fig. 25(a), with its 2D map (b), the measured 3D model (c), the automatic plan (d), and a manual plan (e); the fragments compare the numbers of scans required, but the digits were lost.]
Fig. 25 Example of measurement paths by manual and automatic planning: (a) environment; (b) 2D map; (c) measured 3D model; (d) automatic planning; (e) manual planning

[Author biographies (in Japanese) not recovered: Souichiro Oshima, Shingo (Syogo) Nagakura, Jeong Yongjin, Yumi Iwashita (fragments mention Imperial College London and NASA), Ryo Kurazume.]