219 (2004.11.05) 219-4

Development of a 3D Range Sensor Based on the Equiphase Light-Section Method

KUMAGAI Masaaki* (*Tohoku Gakuin University)

Keywords: vision sensor, 3-D range sensor, light-section method, phase shift method, equiphase plane

Contact: 985-8537, Tel.: 022-368-7358, Fax: 022-368-7070, E-mail: kumagai@tjcc.tohoku-gakuin.ac.jp

1. Introduction
2. Principle

2.1 Equiphase light-section method

The traditional light-section method 7) is illustrated in Fig. 1. In the proposed equiphase light-section method 10), shown in Fig. 2, the slit light is replaced by illumination whose intensity is modulated sinusoidally, with a phase that differs from one projection plane to the next.
Fig. 1: Traditional light-section method. A slit light is projected onto the target, and bright segmented and/or curved lines are observed on the target.

Fig. 2: Equiphase light-section method. Lights whose intensities are modulated by sinusoidal functions with different phases are projected onto the target instead of the slit light of the light-section method.

Fig. 3: Detection of the target by detecting the lighting phase. The phase observed at each pixel in the camera image depends on which equiphase plane crosses the object (L denotes the light source; A, B and C mark equiphase planes crossing the detection line).

Fig. 4: An example of a sequence of modulated images. The same narrow horizontal part of 8 images is arranged in order; several sections whose phases differ from each other can be found.

Fig. 5: Lighting system. A masking cylinder with a striped pattern following a sinusoidal function (two periods per revolution in this case) is driven by a motor synchronized to the video signal. A bright light source such as a halogen lamp is inserted into the cylinder.
The transmittance α of the cylinder pattern at rotation angle ψ is

  α(ψ) = 0.5 + 0.5 sin(2ψ),    (1)

so one sinusoidal period corresponds to 180 [deg] of rotation and the projected phase covers [0, 2π].

2.2 Phase detection

The intensity of the projected light is modulated as

  I = I_0 (a + b sin(2πft + φ)),    (2)

where I is the projected intensity, I_0 the intensity of the light source, a the offset and b the modulation depth (a + b = 1, a ≥ b), t is time and f is the modulation frequency. The luminance Y_xy[i] of pixel (x, y) in the i-th of n frames per modulation period is then

  Y_xy[i] = Y_0,xy (a + b sin(2πi/n + φ)) + Y_B,xy,    (3)

where Y_0,xy is the component reflected from the modulated light, Y_B,xy is the background component, and φ is the phase of the equiphase plane crossing the point imaged at (x, y). Correlating Y_xy[i] with sin(2πi/n) and cos(2πi/n) over the n frames gives

  Y_C,xy = (1/n) Σ {sin(2πi/n) Y_xy[i]}
         = (1/n) Σ {sin(2πi/n)(a Y_0,xy + Y_B,xy)}
           + (b Y_0,xy / n) Σ {sin(2πi/n) sin(2πi/n + φ)}
         = {(a Y_0,xy + Y_B,xy)/n} Σ sin(2πi/n)
           + (b Y_0,xy / 2n) Σ {cos(φ) − cos(4πi/n + φ)}
         = 0 + (b Y_0,xy / 2) cos φ
         = (b Y_0,xy / 2) cos φ,    (4)

  Y_S,xy = (1/n) Σ {cos(2πi/n) Y_xy[i]} = (b Y_0,xy / 2) sin φ,    (5)

because the sums of sin(2πi/n) and of cos(4πi/n + φ) over a full period vanish. The phase φ at each pixel is then recovered from Y_C,xy and Y_S,xy 8). The average luminance

  Ȳ_xy = (1/n) Σ Y_xy[i]
       = (1/n) Σ (a Y_0,xy + Y_B,xy) + (b Y_0,xy / n) Σ sin(2πi/n + φ)
       = a Y_0,xy + Y_B,xy    (6)

gives an ordinary, unmodulated picture of the scene. For a color camera, the same processing can be applied to a luminance signal Y formed from R, G and B (e.g. Y = R + G + B).

2.3 Multiple stripes

When the pattern has s sinusoidal stripes instead of one, the luminance becomes

  Y_xy[i] = Y_0,xy (a + b sin(2πsi/n + φ)) + Y_B,xy,    (7)

and correlation with sin(2πsi/n) and cos(2πsi/n) gives, in the same way,

  Y_C,xy = (1/n) Σ {sin(2πsi/n) Y_xy[i]}
         = {(a Y_0,xy + Y_B,xy)/n} Σ sin(2πsi/n)
           + (b Y_0,xy / 2n) Σ {cos(φ) − cos(4πsi/n + φ)}
         = (b Y_0,xy / 2) cos φ,    (8)

  Y_S,xy = (1/n) Σ {cos(2πsi/n) Y_xy[i]} = (b Y_0,xy / 2) sin φ.    (9)

Since each stripe covers only 1/s of the field, the positional resolution of the phase improves by a factor of s; however, a phase that originally spans [0, 2sπ] is folded into [0, 2π] by equations (8) and (9), so the correspondence between the detected phase and the equiphase plane becomes ambiguous.

2.4 Superposition of two stripe frequencies

To resolve this ambiguity, patterns with two stripe frequencies s_1 and s_2 (s_1 ≠ s_2) can be superimposed:

  Y_xy[i] = Y_0,xy (a + b_1 sin(2πs_1 i/n + φ_1) + b_2 sin(2πs_2 i/n + φ_2)),    (10)

where b_1, b_2 are the modulation depths and φ_1, φ_2 the phases of the two components. Correlating with sin(2πs_1 i/n),
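The correlation sums of equations (4)-(6) can be sketched in NumPy. This is an illustrative reimplementation, not the author's original code; the function name and array layout are assumptions.

```python
import numpy as np


def detect_phase(frames):
    """Per-pixel phase detection from n modulated frames (eqs. (3)-(6)).

    frames: array of shape (n, H, W), luminance Y_xy[i] of each frame.
    Returns (phase, mean): the phase in [0, 2*pi) at each pixel, and the
    average image a*Y0 + YB of eq. (6).
    """
    n = frames.shape[0]
    i = np.arange(n)
    s = np.sin(2 * np.pi * i / n)[:, None, None]
    c = np.cos(2 * np.pi * i / n)[:, None, None]
    yc = (s * frames).mean(axis=0)   # (b*Y0/2) cos(phi), eq. (4)
    ys = (c * frames).mean(axis=0)   # (b*Y0/2) sin(phi), eq. (5)
    phase = np.arctan2(ys, yc) % (2 * np.pi)
    mean = frames.mean(axis=0)       # a*Y0 + YB, eq. (6)
    return phase, mean
```

Note that the background term Y_B,xy drops out of the phase automatically, which is what makes the method robust against ambient light.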
  Y_C1,xy = (1/n) Σ {sin(2πs_1 i/n) Y_xy[i]}
          = (b_1 Y_0,xy / 2n) Σ {cos(φ_1) − cos(4πs_1 i/n + φ_1)}
            + (b_2 Y_0,xy / 2n) Σ {cos(φ_2 + 2πs_2 i/n − 2πs_1 i/n) − cos(φ_2 + 2πs_2 i/n + 2πs_1 i/n)}
          = (b_1 Y_0,xy / 2) cos φ_1.    (11)

Y_S1,xy, Y_C2,xy and Y_S2,xy are obtained in the same way, so both φ_1 and φ_2 can be recovered from a single sequence of n frames. This holds as long as n, s_1 and s_2 are chosen so that none of 2s_1, 2s_2, s_2 − s_1 and s_2 + s_1 is a multiple of n, since each cross-term sum above then vanishes over a full period.

2.5 Suppression of visible flicker

Light modulated at a low frequency is perceived as flicker, which the human eye notices below roughly 50 [Hz]. Using the multiple-stripe scheme of Sec. 2.3 with s = n + 1 stripes,

  sin(2πsi/n + φ) = sin(2π(n+1)i/n + φ) = sin(2πi + 2πi/n + φ) = sin(2πi/n + φ),    (12)

so the camera samples exactly the same sequence as it would from a single-stripe pattern (a stroboscopic effect). With the 60 [Hz] field rate of NTSC, a single-stripe pattern modulates the light at only 60/n [Hz], which flickers visibly; with s = n + 1 stripes the light is modulated at 60(1 + 1/n) [Hz], which is above 60 [Hz] and is therefore not perceived as flicker, while the detected phase is unchanged.
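The two-frequency recovery of equations (10)-(11) is the same correlation applied once per stripe frequency. The sketch below is my own illustration (function name and shapes assumed); it relies on the condition that none of 2s_1, 2s_2, s_2 − s_1, s_2 + s_1 is a multiple of n, so that the cross terms cancel exactly.

```python
import numpy as np


def detect_two_phases(frames, s1, s2):
    """Recover (phi_1, phi_2) from a two-frequency sequence (eqs. (10)-(11)).

    frames: array of shape (n, H, W). Assumes n, s1, s2 satisfy the
    non-aliasing condition so that all cross-correlation terms vanish.
    """
    n = frames.shape[0]
    i = np.arange(n)
    phases = []
    for s in (s1, s2):
        sc = np.sin(2 * np.pi * s * i / n)[:, None, None]
        cc = np.cos(2 * np.pi * s * i / n)[:, None, None]
        yc = (sc * frames).mean(axis=0)  # (b_k*Y0/2) cos(phi_k)
        ys = (cc * frames).mean(axis=0)  # (b_k*Y0/2) sin(phi_k)
        phases.append(np.arctan2(ys, yc) % (2 * np.pi))
    return phases
```

The coarse phase (small s) selects the stripe, and the fine phase (large s) refines the position within it, which is how the folding ambiguity of Sec. 2.3 is resolved.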
(a) Outline  (b) Lighting unit (lamp and cylinder)  (c) Sinusoidal pattern on cylinder
Fig. 6: Experimental system.

3. Experiments

The experimental system is shown in Fig. 6. The striped pattern (Fig. 6(c)) was printed on an OHP transparency and wrapped around the masking cylinder; stripe frequencies of 1, 5 and 25 per revolution were combined, each step a ratio of 1/5 (72 [deg] per stripe at s = 5). One measurement used 80 fields ((4/3) [s]). Images were captured with an NTSC CCD camera and processed on a Pentium 2.6 [GHz] PC running Linux (KNOPPIX EduTG 0.9); the processing program was written in C++ using Video for Linux. The CPU load was about 20%, and processing took about 50 [ms]. Some results are shown in Fig. 7.

Fig. 7: Some of the experimental results. Each image consists of four parts. Top-left: brief range image (a brighter pixel means a farther point); top-right: normal picture obtained with equation (6); bottom-left: contours of the range image; bottom-right: the top-right image rotated horizontally in 3D using the range image.

A cross section of one range image is shown in Fig. 8.
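The real-time loop described above (implemented by the author in C++ with Video for Linux) can be organized around running correlation sums that are updated as each field arrives. The NumPy class below is only an illustrative sketch of that structure; the class name and interface are hypothetical.

```python
import numpy as np


class PhaseAccumulator:
    """Running correlation sums for field-by-field phase detection.

    Each incoming field updates the sums of eqs. (4)-(6); after n fields
    the phase image can be produced without revisiting stored frames,
    which keeps the per-field cost small.
    """

    def __init__(self, n, shape):
        self.n = n
        self.i = 0                    # index of the next field
        self.yc = np.zeros(shape)     # accumulates sin-weighted sum
        self.ys = np.zeros(shape)     # accumulates cos-weighted sum
        self.ym = np.zeros(shape)     # accumulates plain sum (eq. (6))

    def add_field(self, field):
        w = 2 * np.pi * self.i / self.n
        self.yc += np.sin(w) * field
        self.ys += np.cos(w) * field
        self.ym += field
        self.i += 1

    def phase(self):
        """Phase image in [0, 2*pi); valid once n fields were added."""
        return np.arctan2(self.ys, self.yc) % (2 * np.pi)
```

In a real pipeline the accumulators would be reset (or maintained as a sliding window) after each measurement cycle of n fields.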
Fig. 8: Cross section of the range image. Three faces were found: the top plane of the table (distorted a little), the frontal outer surface and the back inner surface of a cup.

A new system in which the cylinder is driven by a DC motor is under development. With this system, one measurement uses 120 fields (2 [s]) with the cylinder rotating at 12 [rps] (720 [rpm]). Some results obtained with it are shown in Fig. 9.

Fig. 9: Some of the experimental results with the newly developed system.
4. Conclusion

References

1) (authors and title in Japanese), 37-11, 1040/1047 (2001)
2) (authors and title in Japanese), ROBOMEC '02, 1A1-H04 (2002)
3) (authors and title in Japanese), ROBOMEC '02, 2P1-E10 (2002)
4) Takashi Emura, Masaaki Kumagai and Lei Wang: A Next-Generation Intelligent Car for Safe Drive, Journal of Robotics and Mechatronics, 12-5, 545/551 (2000)
5) Hans-Gerd Maas: Robust Automatic Surface Reconstruction with Structured Light, International Archives of Photogrammetry and Remote Sensing, 29-B5, 709/713 (1992)
6) Peter Albrecht and Bernd Michaelis: Improvement of the Spatial Resolution of an Optical 3-D Measurement Procedure, IEEE Trans. on Instrumentation and Measurement, 47-1, 158/162 (1998)
7) (authors and title in Japanese), 57-9, 1149/1151 (2003)
8) V. Srinivasan, H. C. Liu and M. Halioua: Automated phase-measuring profilometry of 3-D diffuse objects, Applied Optics, 23-18, 3105/3108 (1984)
9) (authors and title in Japanese), no. 78, 10/15 (2002)
10) KUMAGAI Masaaki: Development of a 3D Vision Range Sensor using Equiphase Light-Section Method, Proc. of IEEE/RSJ IROS 2004, 2161/2166 (2004)