Journal of the Robotics Society of Japan, Vol. 33, No. 4, pp. 63-74, 2015

Automatic Planning of Laser Measurements for a Large-Scale Environment Using the CPS-SLAM System

Souichiro Oshima, Syogo Nagakura, Jeong Yongjin, Yumi Iwashita and Ryo Kurazume

In recent years, several low-priced 3D laser scanners have been brought to market, and 3D laser scanning has come to be used widely in many applications; 3D modeling of architectural structures and digital preservation of cultural heritage are typical examples. Despite the development of lightweight, high-speed laser scanners, however, the complicated measurement procedure and long measurement time remain a heavy burden that hinders the widespread use of laser scanning. We have proposed a robotic 3D scanning system named CPS-SLAM, which uses multiple robots: a parent robot with a 3D laser scanner and child robots with corner cubes. This system performs 3D laser scanning without complicated post-processing such as ICP registration; a large-scale 3D model is acquired by the on-board 3D laser scanner from several positions determined precisely by the localization technique named the Cooperative Positioning System (CPS). This paper proposes an automatic planning technique for efficient laser measurement with the CPS-SLAM system. By planning a proper scanning strategy for a given target structure, laser scanning can be performed efficiently and accurately even in a large-scale environment. The proposed technique plans an optimal scanning strategy automatically by taking several criteria into consideration, such as visibility between robots, error accumulation, and travel efficiency. We conducted computer simulations and outdoor experiments to verify the performance of the proposed technique.

Key Words: Laser measurement, Multiple robots, 3D modeling, Automatic sensing planning.
*Kyushu University

[p. 63, introduction (Japanese body text lost in extraction). The surviving fragments mention commercial 3D laser scanners (FARO Focus 3D, Leica ScanStation, TOPCON GLS series), registration of multiple scans by the Iterative Closest Point (ICP) algorithm, GPS-based localization, and the authors' previously proposed CPS-SLAM system for large-scale 3D measurement [1]-[3], with example model dimensions and accuracy figures whose digits are lost.]

(JRSJ Vol. 33, No. 4, May 2015)

[p. 64 (Japanese body text lost in extraction). The surviving fragments discuss: limitations of ICP-based registration and of RTK-GPS/VRS-GPS for large-scale measurement, motivating CPS-SLAM; Section 2, a survey of related sensor-planning work, citing surveys [4]-[6], the art-gallery problem [7][8], interactive sensor planning by Stamos and Allen [9], sensor positioning on 3D terrain by Topcuoglu et al. [10], automatic sensor placement by Chen and Li [11], view planning with a registration constraint by Scott et al. [12], CAD-based sensor placement by Prieto et al. [13], next-best-view (NBV) approaches [14]-[18] including entropy-based viewpoint planning by Li and Liu [18], and active perception [19]-[21], e.g. Okamoto et al. [21]; and Section 3, which introduces CPS-SLAM and the underlying Cooperative Positioning System (CPS).]

[p. 65 (Japanese body text lost in extraction). Section 3 reviews the Cooperative Positioning System (CPS), in which the robots localize one another by alternately moving and measuring relative positions (range r, elevation ψ, azimuth φ) of corner cubes with the parent robot's total station. The surviving figure labels give the procedure (robot indices restored from the standard CPS sequence):
(1) Robots 1 and 2 move. (2) Robot 3 measures the position of robot 1. (3) Robot 3 measures the position of robot 2. (4) Robot 3 moves and measures the positions of robots 1 and 2.
Large-scale 3D models are then built by scanning from each precisely localized parent position, without ICP post-processing.

Fig. 1: Cooperative Positioning System (CPS).
Fig. 2: CPS-SLAM machine model (CPS-VII): parent robot with total station and laser scanner; child robots with corner cubes.
Fig. 3: 3D model of a large building.
Fig. 4: Flowchart of the automatic planning algorithm: transform the 3D model into a 2D grid map; calculate candidate positions of the parent robot; determine the optimum position of the parent robot; calculate candidate positions of the child robots; if no candidate child positions exist, set subgoals for the parent robot using a visibility graph, otherwise determine the optimum positions of the child robots; plan the trajectories to the target positions; move the parent and child robots to their targets.]
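The CPS measurement step above (range r, elevation ψ, azimuth φ read by the parent's total station) amounts to a spherical-to-Cartesian conversion plus a change of frame. The sketch below is illustrative only: the angle conventions, the pose representation, and the function name are assumptions, not the paper's implementation.

```python
import numpy as np

def child_position(parent_pose, r, phi, psi):
    """Convert one total-station reading (range r, azimuth phi, elevation psi),
    taken in the parent robot's frame, into a global child-robot position.
    parent_pose = (x, y, z, yaw): parent pose in the global frame.
    The angle conventions and pose representation are illustrative assumptions."""
    x, y, z, yaw = parent_pose
    # Spherical-to-Cartesian conversion in the parent's local frame.
    lx = r * np.cos(psi) * np.cos(phi)
    ly = r * np.cos(psi) * np.sin(phi)
    lz = r * np.sin(psi)
    # Rotate by the parent's heading and translate into the global frame.
    gx = x + lx * np.cos(yaw) - ly * np.sin(yaw)
    gy = y + lx * np.sin(yaw) + ly * np.cos(yaw)
    gz = z + lz
    return np.array([gx, gy, gz])

# Parent at the origin facing +x; a corner cube 10 m straight ahead.
p = child_position((0.0, 0.0, 0.0, 0.0), r=10.0, phi=0.0, psi=0.0)  # -> [10, 0, 0]
```

In CPS, alternating such measurements while at least one robot stays still propagates a precise global pose through the whole group without GPS.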

[p. 66 (Japanese body text lost in extraction). Section 4.1: candidate positions for the parent robot are generated by scattering sample points randomly over the boundary between the measured region and the unknown region, clustering the samples by k-means, and taking the cluster centroids as candidates (Figs. 5, 6). The optimum parent target position is the candidate that maximizes the score

G = R (P + α L + β S),

where, as far as the fragments allow, L is a travel-distance term, S is a newly-measurable-area term, α and β are their weights, and the definitions of R and P are lost in extraction. Section 4.2: because CPS requires line of sight between the robots, candidate positions for the child robots are restricted to the AND region, i.e. the area visible both from the parent robot's current position and from its target position (Fig. 7).

Fig. 5: Problem definition: measured wall, unknown wall, boundary areas of the unknown region, potential field, and current positions of the parent and child robots.
Fig. 6: Candidate positions for the parent robot determined by k-means clustering: sample points scattered randomly; centroids of clusters become the candidates.
Fig. 7: Optimum position of the parent robot (maximum score): regions visible from the current and target positions of the parent robot, their AND region, and the candidate positions of the child robots.]
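The candidate-generation and selection steps of Section 4.1 can be sketched as follows. This is a minimal reconstruction under stated assumptions: the k-means variant (farthest-point seeding), the synthetic boundary samples, and the way the score terms R, P, L, S are supplied are not from the paper, whose definitions are not recoverable from the extracted text.

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Plain k-means on 2D points with deterministic farthest-point seeding."""
    centroids = [points[0]]
    for _ in range(k - 1):
        # Seed the next centroid at the sample farthest from all current seeds.
        d = np.min([np.linalg.norm(points - c, axis=1) for c in centroids], axis=0)
        centroids.append(points[int(np.argmax(d))])
    centroids = np.array(centroids)
    for _ in range(iters):
        # Assign each sample to its nearest centroid, then recompute centroids.
        labels = np.argmin(
            np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids

def best_parent_position(candidates, score_terms, alpha=1.0, beta=1.0):
    """Pick the candidate maximizing G = R * (P + alpha*L + beta*S).
    The per-candidate terms (R, P, L, S) are supplied by the caller here,
    since the paper's definitions are lost."""
    scores = [R * (P + alpha * L + beta * S) for (R, P, L, S) in score_terms]
    return candidates[int(np.argmax(scores))]

# Synthetic sample points near three stretches of the measured/unknown
# boundary; k-means reduces them to k = 3 candidate parent positions.
rng = np.random.default_rng(1)
samples = np.vstack([rng.normal(loc=c, scale=0.2, size=(40, 2))
                     for c in [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]])
cands = kmeans(samples, k=3)
```

A multiplicative R acting as a feasibility gate (R = 0 discards a candidate regardless of the other terms) would be consistent with the CPS visibility constraint, but that reading is an assumption.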

[p. 67 (Japanese body text lost in extraction). Section 4.2 (continued): within the AND region, the target positions of the child robots are chosen to maximize the score

G_c = P + α_c (θ / θ_t) + β_c (D / D_t),

where θ is the angle formed by the robots during CPS measurement (with target value θ_t = 90°, the configuration favorable for CPS positioning accuracy [26]), D is the inter-robot distance with target value D_t, α_c and β_c are weights, and the definition of P is lost in extraction. Section 4.3: when no candidate child positions exist in the AND region, subgoals for the parent robot are generated with a visibility graph [23][24] (Fig. 8) so that the parent can reach its target while keeping line of sight for CPS. Section 4.4: the visibility-graph path, which grazes obstacle corners, is combined with a Voronoi diagram [27] to keep clearance from obstacles (details lost in extraction).

Fig. 8: Subgoal generation using the visibility graph.

Section 5.1 (simulation): three maps, MAP-A, B and C (Fig. 9), represented as 2D grid maps (10 cm cells; grid dimensions partly lost), are used. Section 5.1.1 evaluates the parent-robot score of Eq. (1) for different weights α, β.]
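A minimal sketch of the child-robot score follows, under stated assumptions: θ is taken here as the angle subtended at the parent robot by the two children, D as the mean parent-child distance, and P, the weights, and the target values are placeholders rather than the paper's definitions.

```python
import numpy as np

def child_score(parent, child1, child2, P=0.0, alpha_c=1.0, beta_c=1.0,
                theta_t=np.deg2rad(90.0), D_t=3.0):
    """G_c = P + alpha_c*(theta/theta_t) + beta_c*(D/D_t).
    theta: angle subtended at the parent by the two child robots (an assumed
    reading of the lost definition); D: mean parent-child distance."""
    v1 = np.asarray(child1, float) - np.asarray(parent, float)
    v2 = np.asarray(child2, float) - np.asarray(parent, float)
    cos_t = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    theta = np.arccos(np.clip(cos_t, -1.0, 1.0))
    D = 0.5 * (np.linalg.norm(v1) + np.linalg.norm(v2))
    return P + alpha_c * (theta / theta_t) + beta_c * (D / D_t)

# Children at right angles to each other and at the target distance:
# both ratio terms equal 1, so G_c = 2 with P = 0 and unit weights.
g = child_score((0.0, 0.0), (3.0, 0.0), (0.0, 3.0))
```

Evaluating this score over the grid cells of the AND region and taking the argmax would realize the selection step described above.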

[p. 68 (Japanese body text lost in extraction). Section 5.1.1 compares two weight settings of Eq. (1): a) travel-distance priority (β = 0), which always selects the closest candidate position, and b) measured-area priority (α = 0), which selects the candidate maximizing the newly measurable area. For MAP-C (Fig. 9(c)), the resulting maps and trajectories are shown in Figs. 11-15; the two settings trade the number of measurements against travel distance and visibility-graph subgoal insertions. Coverage efficiency is reported per measurement [%/measurement] and per travel distance [%/m]; most of the reported digits are lost in extraction. Section 5.1.2 compares four weight settings α_c, β_c of the child-robot score, Eq. (2).

Fig. 9: (a) Map A, (b) Map B, (c) Map C.
Fig. 10: Planned trajectories of the parent and child robots for MAP A.]
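The two efficiency figures used in this comparison, coverage gained per measurement and per meter traveled, can be computed from a scan log as below; the metric definitions and the numbers are illustrative, not the paper's.

```python
def efficiency(coverage, distances):
    """Coverage efficiency of a scan sequence.
    coverage[i]: cumulative covered area [%] after scan i.
    distances[i]: distance traveled [m] to reach scan i's position.
    Returns (% gained per measurement, % gained per meter traveled)."""
    gained = coverage[-1] - coverage[0]
    per_measurement = gained / (len(coverage) - 1)
    per_meter = gained / sum(distances)
    return per_measurement, per_meter

# Example: five scans take coverage from 0% to 90% over 150 m of travel.
pm, pl = efficiency([0, 40, 65, 80, 90], [0, 50, 40, 35, 25])
# pm = 22.5 [%/measurement], pl = 0.6 [%/m]
```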

[p. 69 (Japanese body text lost in extraction).

Fig. 11: Measured map and trajectories of the parent robot when the closest target position is selected (β = 0) for MAP C (1st-23rd measurements).
Fig. 12: Trajectories of the robots when the closest target position is selected (β = 0) for MAP C.

Section 5.1.2 compares four settings of the child-robot score, Eq. (2): a) Nagakura's method [25]; b) considering angle and distance; c) considering mainly the angle (θ_t = 90°); d) considering mainly the distance (D_t = 3 [m] as extracted; digits may be missing). The weight values α_c, β_c of each setting are lost. The simulation environment is a 180 × 160 grid (= 18 m × 16 m; leading digits partly lost).]

[p. 70 (Japanese body text lost in extraction).

Fig. 13: Measured map and trajectories of the parent robot when the target position maximizing the measurable area is selected (α = 0) for MAP C.
Fig. 14: Trajectories of the robots when the target position maximizing the measurable area is selected (α = 0) for MAP C (1st-3rd measurements, including the visibility-graph subgoals inserted between measurements).

The remaining fragments on this page compare the distance-priority (β = 0) and area-priority (α = 0) settings and reference MAP-A, Table 1 and Fig. 9.]

[p. 71 (Japanese body text lost in extraction).

Fig. 15: Trajectories of the robots when the target position maximizing the measurable area is selected (α = 0) for MAP C (4th-8th measurements, including subgoal panels).
Fig. 16: Comparison of (a) travel distance and (b) measured area (cover ratio) of the parent robot, for settings a) and b) of Eq. (1) and for manual operation (weight values lost in extraction).
Fig. 17: Comparison of positioning errors: error variance [mm²] versus number of measurements by the parent robot, for Maps A, B and C, comparing a) Nagakura's method [25], b) considering angle and distance, c) considering mainly angle, and d) considering mainly distance.
Fig. 18: Experimental environment: (a), (b).
Fig. 19: Outdoor experiment.

Table 1: Comparison of positioning error variances for MAP A, B and C after the parent robot moves nine times [mm²] (some digits lost in extraction):

          (a) [25]   (b) angle/distance   (c) angle   (d) distance
MAP A     5.         .3                   94.         59.3
MAP B     46.5       43.6                 43.7        65.
MAP C     54.        45.3                 47.         45.8]

[pp. 72-73 (Japanese body text lost in extraction).

Fig. 20: Trajectories of the parent and child robots (case 1).
Fig. 21: Obtained 3D model (case 1).
Fig. 22: Trajectories of the parent and child robots (case 2).
Fig. 23: Obtained 3D model (case 2).
Fig. 24: 3D models and trajectories of the parent and child robots (case 3): (a) 3D model, (b) 3D model (top view), (c) trajectories; the case-3 environment includes a slope (height digits lost).

Section 6, conclusion (text lost): the surviving keywords indicate a summary of the proposed automatic planning method for CPS-SLAM, combining k-means candidate generation, score-based selection of parent and child target positions in the AND region, visibility-graph subgoal generation, and Voronoi-based trajectory smoothing, verified on three simulated maps and in outdoor experiments.

Acknowledgment: supported in part by a Grant-in-Aid for Scientific Research (A) (grant number 6499; leading digits lost in extraction).]

References (Japanese-language entries and some digits were lost in extraction and are left as extracted):
[ 1 ] (in Japanese): "Study on CPS SLAM: 3D laser measurement system for large scale architectures," Journal of the Robotics Society of Japan, vol. 25, no. 8, 2007.
[ 2 ] Y. Tobata, R. Kurazume, Y. Noda, K. Lingemann, Y. Iwashita and T. Hasegawa: "Laser-based geometrical modeling of large-scale architectural structures using cooperative multiple robots," Autonomous Robots, vol. 32, no. 1, pp. 49-62, 2012.
[ 3 ] (in Japanese): CPS-SLAM, Journal of the Robotics Society of Japan, vol. 3_, pp. 8-87.
[ 4 ] K.A. Tarabanis, P.K. Allen and R.Y. Tsai: "A survey of sensor planning in computer vision," IEEE Trans. on Robotics and Automation, vol. 11, no. 1, pp. 86-104, 1995.
[ 5 ] T.S. Newman and A.K. Jain: "A survey of automated visual inspection," Computer Vision and Image Understanding, vol. 61, no. 2, pp. 231-262, 1995.
[ 6 ] A. Mavrinac and X. Chen: "Modeling coverage in camera networks: A survey," International Journal of Computer Vision, vol. 101, pp. 205-226, 2013.
[ 7 ] A. Aggarwal: "The art gallery theorem: Its variations, applications, and algorithmic aspects," Ph.D. thesis, Johns Hopkins University, 1984.
[ 8 ] J. O'Rourke: Art Gallery Theorems and Algorithms. Oxford University Press, London, U.K., 1987.
[ 9 ] I. Stamos and P.K. Allen: "Interactive sensor planning," Proc. IEEE Conf. on Computer Vision and Pattern Recognition, pp. 489-494, 1998.
[10] H.R. Topcuoglu, M. Ermis and M. Sifyan: "Positioning and utilizing sensors on a 3-D terrain, Part I: Theory and modeling," IEEE Trans. on Systems, Man and Cybernetics, Part C, vol. 40, no. 3, 2010.
[11] S.Y. Chen and Y.F. Li: "Automatic sensor placement for model-based robot vision," IEEE Trans. on Systems, Man, and Cybernetics, Part B, vol. 34, no. 1, pp. 393-408, 2004.
[12] W.R. Scott, G. Roth and J.-F. Rivest: "View planning with a registration constraint," Proc. Third Int. Conf. on 3-D Digital Imaging and Modeling, pp. 127-134, 2001.
[13] F. Prieto, T. Redarce, et al.: "CAD-based range sensor placement for optimum 3D data acquisition," Proc. Second Int. Conf. on 3-D Digital Imaging and Modeling, pp. 128-137, 1999.
[14] D. Papadopoulos-Orfanos and F. Schmitt: "Automatic 3-D digitization using a laser rangefinder with a small field of view," Proc. Int. Conf. on Recent Advances in 3-D Digital Imaging and Modeling, pp. 6-67, 1997.
[15] H. Zha, K. Morooka, T. Hasegawa and T. Nagata: "Active modeling of 3-D objects: Planning on the next best pose (NBP) for acquiring range images," Proc. Int. Conf. on Recent Advances in 3-D Digital Imaging and Modeling, pp. 68-75, 1997.
[16] E. Marchand and F. Chaumette: "Active sensor placement for complete scene reconstruction and exploration," Proc. IEEE Int. Conf. on Robotics and Automation, pp. 743-75, 1997.
[17] S.Y. Chen and Y.F. Li: "Vision sensor planning for 3-D model acquisition," IEEE Trans. on Systems, Man, and Cybernetics, Part B, vol. 35, no. 5, pp. 894-904, 2005.
[18] Y.F. Li and Z.G. Liu: "Information entropy-based viewpoint planning for 3-D object reconstruction," IEEE Trans. on Robotics, vol. 21, no. 3, pp. 324-337, 2005.
[19] R. Bajcsy: "Active perception," Proc. of the IEEE, vol. 76, no. 8, pp. 966-1005, 1988.
[20] Y. Aloimonos (ed.): Active Perception. Psychology Press, 2013.
[21] J. Okamoto, M. Milanova and U. Bueker: "Active perception system for recognition of 3D objects in image sequences," 5th Int. Workshop on Advanced Motion Control, pp. 7-75, 1998.
[22] (in Japanese): Journal of the Robotics Society of Japan, vol. 14, no. 8, pp. 9-36, 1996.
[23] R. Chatila: "Path planning and environment learning in a mobile robot system," Proc. European Conf. on Artificial Intelligence, Orsay, France, 1982.
[24] M. de Berg et al.: Computational Geometry: Algorithms and Applications. Springer, 1997.
[25] (in Japanese; details lost in extraction).
[26] (in Japanese): CPS-II, Journal of the Robotics Society of Japan, vol. 15, no. 5, pp. 773-78, 1997.
[27] F. Aurenhammer: "Voronoi diagrams: A survey of a fundamental geometric data structure," ACM Computing Surveys, vol. 23, no. 3, pp. 345-405, 1991.
[28] J.C. Latombe: Robot Motion Planning. Kluwer Academic Publishers, 1991.

Appendix A (Japanese text lost): a worked example comparing measurement paths obtained by manual and by automatic planning (Fig. 25).

[p. 74: author biographies (Japanese text lost): Souichiro Oshima, Shingo Nagakura, Jeong Yongjin, Yumi Iwashita (the fragments mention Imperial College London and NASA) and Ryo Kurazume.

Fig. 25: Example of measurement paths by manual and automatic planning: (a) environment, (b) 2D map, (c) measured 3D model, (d) automatic planning, (e) manual planning.]