Subspace-Constrained 3-D Scene Flow Estimation in Projective Grid Space (PGS)

(Title page, March 2006.)

Abstract — recovered keywords: Subspace constraint, 3-D scene flow, Projective Grid Space (PGS), RANSAC.

Contents (chapter topics, from the body): 1. Introduction; 2. Multiple-view geometry; 3. Optical flow and 3-D scene flow; 4. Shape reconstruction in Projective Grid Space; 5. Subspace-constrained 3-D scene flow estimation; 6. Rigid-motion estimation with RANSAC; Conclusion; Appendix A: Linear algebra (p. 73); Appendix B: Publications (p. 83).

(Lists of figures and tables omitted.)

1  Introduction

Human-computer interaction has so far been built around the Graphical User Interface (GUI). The Perceptual User Interface (PUI) [16] goes a step further and lets the computer perceive the user directly, for instance through cameras; such perception is also central to Virtual Reality applications [7] and to gesture-based interfaces [22, 15].

Recovering the 3-D shape and 3-D motion of objects from multiple cameras has therefore been studied widely [30, 25, 13]. Vedula et al. [20] introduced the 3-D scene flow, the dense 3-D motion field of a scene, estimated from the 2-D optical flows observed in the individual views. This thesis deals with 3-D scene flow estimation for such multi-camera, PUI-oriented capture.

Two ideas are combined. First, instead of a fully (Euclidean) calibrated setup, the Projective Grid Space (PGS) [21] is used: a projective 3-D coordinate frame defined by the epipolar geometry of two basis cameras, so that only weak calibration through fundamental matrices is required. Second, a Subspace constraint is imposed on the estimated scene flow: when the flows of N points over M frames are collected into a 3M x N matrix, rigid motion confines that matrix to a low-dimensional subspace, in the spirit of the Tomasi-Kanade factorization for shape from motion [6] and of Irani's subspace constraints for multi-frame optical flow estimation [18]. The constraint is enforced through the singular value decomposition, and the rigid-motion parameters are finally estimated robustly with RANSAC. The linear-algebra background used throughout is summarized in Appendix A.

The thesis is organized as follows. Chapter 2 reviews multiple-view geometry and the Projective Grid Space, Chapter 3 reviews optical flow and scene flow, Chapter 4 describes shape reconstruction in the PGS, Chapter 5 presents the Subspace-constrained scene flow estimation, and Chapter 6 the RANSAC-based rigid-motion estimation.

2  Multiple-view geometry

2.1  Camera model

Perspective projection. Let the camera have focal length f and let the optical axis coincide with the Z axis of the camera coordinate system, with the focal point at the origin. A 3-D point (X, Y, Z) projects onto the image plane at

    x = f X / Z,   y = f Y / Z.                                   (2.1)

(Fig. 2.1: perspective projection of a 3-D point onto the image plane at distance f.)

With homogeneous coordinates m = [x, y, 1]^T for the image point and M = [X, Y, Z, 1]^T for the 3-D point, (2.1) becomes linear up to scale,

    m ~ P M,   P = [ f 0 0 0 ; 0 f 0 0 ; 0 0 1 0 ],               (2.2)(2.3)

where ~ denotes equality up to a non-zero scale factor.

World and camera coordinates. Points are usually given in a world coordinate system. Let R be a rotation matrix (R R^T = R^T R = I, det R = 1) and t a translation vector, so that a world point M_w maps to camera coordinates M_c by

    M_c = R M_w + t.                                              (2.4)

In homogeneous coordinates,

    M_c = D M_w,   D = [ R  t ; 0_3^T  1 ],  0_3 = [0, 0, 0]^T,   (2.5)

and combining with (2.3),

    m ~ P M_c = P D M_w.                                          (2.6)

Intrinsic parameters. The pixel coordinate system (o, u, v) differs from the normalized camera coordinate system (c, x, y) by (i) the scale factors k_u, k_v of the pixel grid along u and v, (ii) the skew angle θ between the u and v axes, and (iii) the principal point [u_0, v_0]^T (Fig. 2.2). Writing m_p = [u, v]^T and m_s = [x, y]^T,

    m_p ~ A m_s,   A = [ k_u  -k_u cot θ  u_0 ; 0  k_v / sin θ  v_0 ; 0  0  1 ],   (2.7)

so the full projection matrix combines intrinsics, normalized projection and extrinsics:

    P = A P_N D,                                                  (2.8)
    P_N = [ f 0 0 0 ; 0 f 0 0 ; 0 0 1 0 ],                        (2.9)

and an image point m and a world point M are related by

    m ~ P M.                                                      (2.10)
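As a concrete illustration of (2.8)-(2.10), the following sketch (assuming numpy; all numeric values are illustrative and not taken from the thesis) builds P = A P_N D and projects a world point:

import numpy as np

def intrinsic_matrix(ku, kv, theta, u0, v0):
    # Eq. (2.7): pixel scales k_u, k_v, skew angle θ and principal point (u0, v0).
    return np.array([[ku, -ku * np.cos(theta) / np.sin(theta), u0],
                     [0.0, kv / np.sin(theta), v0],
                     [0.0, 0.0, 1.0]])

def projection_matrix(A, f, R, t):
    # Eq. (2.8)-(2.9): P = A P_N D with D = [R t; 0 0 0 1].
    P_N = np.array([[f, 0, 0, 0], [0, f, 0, 0], [0, 0, 1, 0]], dtype=float)
    D = np.eye(4)
    D[:3, :3] = R
    D[:3, 3] = t
    return A @ P_N @ D

def project(P, M_w):
    # Eq. (2.10): m ~ P M, then divide by the third (homogeneous) component.
    m = P @ np.append(M_w, 1.0)
    return m[:2] / m[2]

# Illustrative values only (not taken from the thesis).
A = intrinsic_matrix(ku=800.0, kv=800.0, theta=np.pi / 2, u0=320.0, v0=240.0)
P = projection_matrix(A, f=1.0, R=np.eye(3), t=np.array([0.0, 0.0, 2.0]))
print(project(P, np.array([0.1, -0.2, 3.0])))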

2.2  Homography induced by a plane

Let a plane be observed by two cameras C1 and C2. In the coordinate frame of C1 the plane has unit normal n and distance d, so a point M_1 on it satisfies

    n^T M_1 = d.                                                  (2.11)

If (R, t) relate the two camera frames, the same point expressed in C2 is M_2 = R M_1 + t (2.12); using n^T M_1 / d = 1,

    M_2 = R M_1 + t (n^T M_1 / d) = [ R + t n^T / d ] M_1.        (2.13)

Dividing by the depths z_1, z_2 (m_1 = M_1 / z_1, m_2 = M_2 / z_2) gives, up to scale,

    m_2 ~ [ R + t n^T / d ] m_1,                                  (2.14)

so corresponding image points (u_1, v_1) and (u_2, v_2) of points on the plane are related by a 3x3 homography H:

    α [u_2, v_2, 1]^T = H [u_1, v_1, 1]^T.                        (2.15)

Estimating H from correspondences. For n correspondences (u_1i, v_1i) <-> (u_2i, v_2i), each pair satisfies (2.15) with its own unknown scale α_i (2.16). Writing the nine entries of H as h = (H_11, H_12, H_13, H_21, H_22, H_23, H_31, H_32, H_33)^T, each correspondence contributes three homogeneous equations in (h, α_i) (2.17), and stacking all correspondences gives a sparse linear system in h and α_1, ..., α_n (2.18). Eliminating α_i = u_1i H_31 + v_1i H_32 + H_33 from (2.19) leaves two linear equations per correspondence (2.20):

    u_1i H_11 + v_1i H_12 + H_13 - u_2i u_1i H_31 - u_2i v_1i H_32 - u_2i H_33 = 0,
    u_1i H_21 + v_1i H_22 + H_23 - v_2i u_1i H_31 - v_2i v_1i H_32 - v_2i H_33 = 0,

which stack into A h = 0 with a 2n x 9 coefficient matrix A (2.21). Since h is defined only up to scale, it is obtained (2.22) as the eigenvector of A^T A associated with the smallest eigenvalue, i.e. the right singular vector of A for the smallest singular value.
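The following sketch illustrates the estimation (2.19)-(2.22), assuming numpy: it stacks the two equations per correspondence into the 2n x 9 matrix and takes the right singular vector of the smallest singular value (equivalent to the smallest-eigenvalue eigenvector of A^T A). The synthetic check at the end is illustrative only.

import numpy as np

def estimate_homography(pts1, pts2):
    # pts1, pts2: (n, 2) arrays of corresponding points, n >= 4.
    rows = []
    for (u1, v1), (u2, v2) in zip(pts1, pts2):
        # Two equations per correspondence after eliminating the scale α_i, eq. (2.20).
        rows.append([u1, v1, 1, 0, 0, 0, -u2 * u1, -u2 * v1, -u2])
        rows.append([0, 0, 0, u1, v1, 1, -v2 * u1, -v2 * v1, -v2])
    A = np.asarray(rows, dtype=float)
    # h minimizing ||A h|| subject to ||h|| = 1: last right singular vector.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 3)

# Illustrative check: points related by a known homography are recovered up to scale.
H_true = np.array([[1.1, 0.02, 5.0], [-0.01, 0.95, -3.0], [1e-4, 2e-4, 1.0]])
pts1 = np.random.rand(8, 2) * 100
ph = np.c_[pts1, np.ones(8)] @ H_true.T
pts2 = ph[:, :2] / ph[:, 2:]
print(estimate_homography(pts1, pts2) / estimate_homography(pts1, pts2)[2, 2])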

2.3  Epipolar geometry

When two cameras observe the same scene point, the two camera centres and the point define the epipolar plane; its intersections with the image planes are the epipolar lines, and the images of the other camera's centre are the epipoles. A point m in the first image constrains its correspondence in the second image to lie on the corresponding epipolar line.

Essential matrix. With relative motion (R, t) between the normalized cameras, the coplanarity of the two viewing rays and the baseline gives

    x'^T ( t x (R x) ) = 0.                                       (2.23)

Writing the cross product with a vector a as the skew-symmetric matrix

    [a]_x = [ 0 -a_3 a_2 ; a_3 0 -a_1 ; -a_2 a_1 0 ],             (2.24)

this becomes

    x'^T [t]_x R x = x'^T E x = 0,    E = [t]_x R,                (2.25)

where E is the essential matrix.

Fundamental matrix. For pixel coordinates m, m' (related to the normalized coordinates through the intrinsic matrices), the constraint becomes

    m'^T F m = 0,                                                 (2.26)(2.28)
    F = A'^{-T} E A^{-1},                                         (2.27)

and F is called the fundamental matrix. For a point m in the first image, the corresponding epipolar line in the second image is

    l' = F m,                                                     (2.29)

and (2.28) is simply m'^T l' = 0; conversely l = F^T m' is the epipolar line in the first image. All epipolar lines pass through the epipoles e, e', which are the null vectors of F:

    F e = 0,    F^T e' = 0.                                       (2.30)

Rank of F. Writing the rows of F as f_1, f_2, f_3,

    F = [ f_1 ; f_2 ; f_3 ],                                      (2.31)

(2.30) states f_1 · e = f_2 · e = f_3 · e = 0 (2.32); since e is non-zero, the three rows are linearly dependent and rank F = 2. In particular the epipole can be computed from any two rows, e.g. e ~ f_1 x f_2. Figure 2.5 restricts attention to a pair of corresponding epipolar lines and parameterizes points on them as m = [u, 1, 0]^T and m' = [u', 1, 0]^T (2.33); this 1-D parameterization is used next.
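A small sketch of (2.29)-(2.30), assuming numpy: the epipolar line of a point is a matrix-vector product, and the epipoles are the null vectors of the rank-2 matrix F, obtained here from the SVD.

import numpy as np

def epipolar_line(F, m):
    # Eq. (2.29): line l' = F m in the second image (homogeneous line coefficients).
    return F @ m

def epipoles(F):
    # Eq. (2.30): F e = 0 and F^T e' = 0; take the right null vectors via SVD.
    _, _, Vt = np.linalg.svd(F)
    e = Vt[-1]                       # epipole in image 1
    _, _, Vt = np.linalg.svd(F.T)
    e_prime = Vt[-1]                 # epipole in image 2
    return e / e[2], e_prime / e_prime[2]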

With the parameterization (2.33), the epipolar constraint m'^T F m = 0 (2.34) between the two corresponding lines reduces to

    f_11 u u' + f_12 u' + f_21 u + f_22 = 0,                      (2.35)

i.e. positions along the two epipolar lines are related by the 1-D homography

    λ [u', 1]^T = [ -f_21 -f_22 ; f_11 f_12 ] [u, 1]^T,           (2.36)
    H~ = [ -f_21 -f_22 ; f_11 f_12 ],                             (2.37)

and conversely F can be parameterized by this 2x2 homography together with the two epipoles [10, 11, 14, 27] (2.38).

Linear (eight-point) estimation of F. Given n >= 8 correspondences m_i = [u_i, v_i, 1]^T <-> m'_i = [u'_i, v'_i, 1]^T, each satisfies

    m'_i^T F m_i = 0,                                             (2.39)

which is linear in the entries of F:

    u_i^T f = 0,                                                  (2.40)
    u_i = [u'_i u_i, u'_i v_i, u'_i, v'_i u_i, v'_i v_i, v'_i, u_i, v_i, 1]^T,
    f = [F_11, F_12, F_13, F_21, F_22, F_23, F_31, F_32, F_33]^T.

Stacking the rows u_i^T into B gives B f = 0. Minimizing the algebraic error

    min_F Σ_i (m'_i^T F m_i)^2                                    (2.41)

is equivalent to

    min_f f^T B^T B f   subject to  ||f|| = 1,                    (2.42)

whose solution is the eigenvector of B^T B with the smallest eigenvalue. The resulting F generally has full rank; to enforce the rank-2 property, take its singular value decomposition

    F = U Σ V^T,   Σ = diag(σ_1, σ_2, σ_3),  σ_1 > σ_2 > σ_3,     (2.43)

and zero the smallest singular value:

    F' = U Σ^ V^T,   Σ^ = diag(σ_1, σ_2, 0).                      (2.44)
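A sketch of the linear (eight-point) estimation (2.39)-(2.43) together with the rank-2 correction (2.44), assuming numpy; the coordinate normalization that a practical implementation would add is omitted for brevity.

import numpy as np

def estimate_fundamental(pts1, pts2):
    # Build B from eq. (2.40): one row [u'u, u'v, u', v'u, v'v, v', u, v, 1] per match.
    u, v = pts1[:, 0], pts1[:, 1]
    up, vp = pts2[:, 0], pts2[:, 1]
    B = np.column_stack([up * u, up * v, up, vp * u, vp * v, vp,
                         u, v, np.ones(len(u))])
    # f minimizing ||B f|| with ||f|| = 1, eq. (2.42): last right singular vector.
    _, _, Vt = np.linalg.svd(B)
    F = Vt[-1].reshape(3, 3)
    # Enforce rank 2, eq. (2.44): zero the smallest singular value.
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0
    return U @ np.diag(S) @ Vt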

2.4  Projective reconstruction and the Projective Grid Space

From F alone, a pair of camera matrices consistent with the epipolar geometry can be chosen in a projective coordinate frame [36, 27, 29]:

    P  = [ I | 0 ],                                               (2.45)
    P' = [ [e']_x F | e' ],                                       (2.46)

where e' is the epipole in the second image; the reconstruction is then determined only up to a 3-D projective transformation. The Projective Grid Space (PGS) [21] builds directly on this weak calibration: two cameras are chosen as Base View 1 and Base View 2, and they define a projective 3-D coordinate frame (Fig. 2.6). A grid point is assigned coordinates (p, q, r), where (p, q) is its image position in Base View 1 and r is one image coordinate of its position in Base View 2.

The epipolar line in Base View 2 of the point (p, q) of Base View 1 is

    l = F_21 [p, q, 1]^T = (l_x, l_y, l_z)^T,                     (2.48)

and the grid point (p, q, r) is located in Base View 2 at (r, s), the point of that line whose first coordinate is r:

    s = -(l_x r + l_z) / l_y.                                     (2.47)

To relate any other camera i to the PGS, only its fundamental matrices F_i1 and F_i2 with respect to the two base views are needed: the grid point (p, q, r) projects into view i at the intersection of the epipolar line of (p, q) obtained through F_i1 and the epipolar line of (r, s) obtained through F_i2 (Figs. 2.6-2.7). The whole multi-camera system is thus registered to a common 3-D frame by weak calibration alone, without Euclidean camera calibration.
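A sketch of how a grid point (p, q, r) of the PGS is mapped into an arbitrary view i, assuming numpy and fundamental matrices F21 (Base View 1 to Base View 2), F_i1 and F_i2 following the convention used above (F m gives the epipolar line in the other view); the intersection of two lines in homogeneous coordinates is their cross product.

import numpy as np

def pgs_second_basis_point(F21, p, q, r):
    # Eq. (2.47)-(2.48): the epipolar line of (p, q) in Base View 2 is l = F21 (p, q, 1)^T;
    # the grid point lies on that line at first coordinate r.
    l = F21 @ np.array([p, q, 1.0])
    s = -(l[0] * r + l[2]) / l[1]
    return np.array([r, s, 1.0])

def project_pgs_point(F_i1, F_i2, F21, p, q, r):
    # Intersect the two epipolar lines induced by the grid point in view i.
    m1 = np.array([p, q, 1.0])            # position in Base View 1
    m2 = pgs_second_basis_point(F21, p, q, r)
    l1 = F_i1 @ m1                        # epipolar line of m1 in view i
    l2 = F_i2 @ m2                        # epipolar line of m2 in view i
    x = np.cross(l1, l2)                  # homogeneous intersection point
    return x[:2] / x[2]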

2.5  Camera calibration

Full calibration estimates the 3x4 projection matrix

    P = [ p_11 p_12 p_13 p_14 ; p_21 p_22 p_23 p_24 ; p_31 p_32 p_33 p_34 ],

which maps a world point [X_w, Y_w, Z_w] to the image point [u, v]:

    u = (p_11 X_w + p_12 Y_w + p_13 Z_w + p_14) / (p_31 X_w + p_32 Y_w + p_33 Z_w + p_34),
    v = (p_21 X_w + p_22 Y_w + p_23 Z_w + p_24) / (p_31 X_w + p_32 Y_w + p_33 Z_w + p_34).

Since P is defined only up to scale, p_34 is set to 1, and each known 3-D/2-D correspondence then yields two linear equations in the remaining 11 entries:

    X_w p_11 + Y_w p_12 + Z_w p_13 + p_14 - u X_w p_31 - u Y_w p_32 - u Z_w p_33 = u,
    X_w p_21 + Y_w p_22 + Z_w p_23 + p_24 - v X_w p_31 - v Y_w p_32 - v Z_w p_33 = v,

so P is obtained by least squares from six or more reference points; Tsai's method [4, 5] is a representative refinement of this scheme.

Homography-based calibration [24, 31]. If the reference points lie on the plane Z = 0, the projection (2.3) reduces to

    [x, y, 1]^T ~ P [X, Y, 0, 1]^T = P^ [X, Y, 1]^T ~ H [X, Y, 1]^T,   (2.49)

where P^ collects the first, second and fourth columns of P: the plane-to-image map is a homography H, which can be estimated as in Section 2.2. Given the intrinsic matrix

    A = [ f 0 c_x ; 0 f c_y ; 0 0 1 ],                            (2.50)

with focal length f and principal point (c_x, c_y), the extrinsic parameters follow from

    A^{-1} H ~ [ r_1  r_2  t ],                                   (2.51)

where r_1, r_2 are the first two columns of the rotation R and t is the translation. The focal length itself can be estimated from the orthogonality of r_1 and r_2, which gives a closed-form expression in the entries h_ij of H and (c_x, c_y) (2.52). Finally the third column is recovered as r_3 = r_1 x r_2, so that

    R = [ r_1  r_2  r_1 x r_2 ].                                  (2.53)
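A sketch of the pose recovery (2.51)-(2.53), assuming numpy and a known intrinsic matrix A; the common scale is fixed by normalizing the first column, and the result is optionally projected onto the nearest rotation matrix (a step not spelled out in the text).

import numpy as np

def pose_from_plane_homography(H, A):
    # Eq. (2.51): A^{-1} H = [r1 r2 t] up to a common scale.
    B = np.linalg.inv(A) @ H
    lam = 1.0 / np.linalg.norm(B[:, 0])   # scale so that ||r1|| = 1
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    r3 = np.cross(r1, r2)                 # eq. (2.53): complete the rotation
    R = np.column_stack([r1, r2, r3])
    U, _, Vt = np.linalg.svd(R)           # optional: nearest true rotation matrix
    return U @ Vt, t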

3  Optical flow and 3-D scene flow

3.1  Scene flow and its projection

Let x = (x, y, z) be a point on the surface of a moving object and let camera i, with projection matrix P_i, image it at u_i = (u_i, v_i):

    u_i = [P_i]_1 (x, y, z, 1)^T / [P_i]_3 (x, y, z, 1)^T,        (3.1)
    v_i = [P_i]_2 (x, y, z, 1)^T / [P_i]_3 (x, y, z, 1)^T,        (3.2)

where [P_i]_j denotes the j-th row of P_i. The 3-D motion field dx/dt of the surface is called the scene flow; its projection into image i is related to the 2-D motion of the image point through the 2x3 Jacobian of the projection,

    du_i/dt = (∂u_i/∂x) dx/dt.                                    (3.3)

(Fig. 3.1: the scene flow of a surface point and its projections into the cameras.) Equations (3.4)-(3.5) relate this Jacobian to the inverse map from image coordinates back onto the object surface.

3.2  Image brightness of a moving surface point

Assume a Lambertian surface. With E(m; x; t) the illumination arriving at x from direction m, the total irradiance of the point is

    s(x; t) = ∫_{S(n)} E(m; x; t) m dm,   S(n) = { m : ||m|| = 1, m · n >= 0 },   (3.6)

and the intensity observed by camera i at u_i is

    I(u_i; t) = C ρ(x; t) [ n(x; t) · s(x; t) ],                  (3.7)

with ρ the albedo and C a camera constant. If the albedo of the tracked point does not change, dρ/dt = 0 (3.8), and differentiating (3.7) along the trajectory x(t) gives

    dI_i/dt = ∇I_i · du_i/dt + ∂I_i/∂t = C ρ(x; t) d[n · s]/dt.   (3.9)

If the illumination term n · s (3.10) is also constant along the motion, d[n · s]/dt = 0, so the observed image motion du_i/dt carries the information about the scene flow. Where the map between the surface and image i is invertible,

    du_i/dt = (∂u_i/∂x) dx/dt,                                    (3.11)
    dx/dt = (∂x/∂u_i) du_i/dt + ∂x/∂t |_{u_i}.                    (3.12)

A single camera supplies only two equations for the three unknowns of dx/dt, so Chapter 5 combines the optical flows of several cameras; the optical flow itself is computed as described in the next section.
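A sketch of (3.1)-(3.3), assuming numpy: it projects a 3-D point with a 3x4 matrix P and returns the 2x3 Jacobian ∂u_i/∂x obtained from the quotient rule.

import numpy as np

def project_and_jacobian(P, x):
    # P: 3x4 projection matrix, x: 3-D point. Returns (u, v) and d(u, v)/dx (2x3).
    a, b, c = P @ np.append(x, 1.0)
    u, v = a / c, b / c
    P1, P2, P3 = P[0, :3], P[1, :3], P[2, :3]   # rows of P restricted to the 3-D part
    J = np.vstack([(P1 - u * P3) / c,           # du/dx by the quotient rule
                   (P2 - v * P3) / c])          # dv/dx
    return np.array([u, v]), J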

3.3  Computing optical flow

Optical flow is the apparent 2-D motion field of the image brightness pattern [2].

3.3.1  Block matching

Block matching estimates the displacement of a point by comparing an N x N template block around it in one frame with candidate blocks in the next frame (Fig. 3.2: template block in frame A and search region in frame B). For a template T and image I, a matching score at position (x, y) is

    R(x, y) = Σ_{n=0}^{N-1} Σ_{m=0}^{N-1} I(x + m, y + n) T(m, n),   (3.13)

and the displacement is taken where the score over the search region is best.
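A sketch of the block-matching score (3.13), assuming numpy; it searches a square neighbourhood for the displacement with the highest correlation (boundary checks and score normalization, which a practical matcher would add, are omitted).

import numpy as np

def block_match(image, template, x0, y0, search):
    # Search a (2*search+1)^2 neighbourhood around (x0, y0); the window is assumed
    # to stay inside the image.
    N = template.shape[0]
    best, best_dxy = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            patch = image[y0 + dy:y0 + dy + N, x0 + dx:x0 + dx + N]
            score = np.sum(patch * template)      # correlation score of eq. (3.13)
            if score > best:
                best, best_dxy = score, (dx, dy)
    return best_dxy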

Gradient-based methods and the Lucas-Kanade solution. Assume the brightness of a moving image point is conserved between frames:

    E(x, y, t) = E(x + Δx, y + Δy, t + Δt).                       (3.14)

Expanding the right-hand side to first order (3.15) and letting Δt -> 0 gives the gradient constraint

    (Δx/Δt) E_x + (Δy/Δt) E_y + E_t = 0,                          (3.16)(3.17)

or, writing the flow at (x, y) as u = dx/dt, v = dy/dt and the spatial and temporal derivatives as E_x, E_y, E_t,

    E_x u + E_y v + E_t = 0.                                      (3.18)

A single pixel gives one equation in the two unknowns (u, v), so an additional assumption is required. The Lucas-Kanade method [1] assumes that (u, v) is constant over a small window; for a 3x3 window this gives nine equations,

    E_x1 u + E_y1 v = -E_t1
    E_x2 u + E_y2 v = -E_t2
      ...
    E_x9 u + E_y9 v = -E_t9,                                      (3.19)

i.e. E P = t with

    E = [ E_x1 E_y1 ; E_x2 E_y2 ; ... ; E_x9 E_y9 ],  P = (u, v)^T,  t = -(E_t1, ..., E_t9)^T,   (3.20)

which is solved in the least-squares sense through the normal equations

    E^T E P = E^T t,                                              (3.21)
    [ ΣE_x E_x  ΣE_x E_y ; ΣE_x E_y  ΣE_y E_y ] [u ; v] = -[ ΣE_x E_t ; ΣE_y E_t ],   (3.22)
    [u ; v] = -[ ΣE_x E_x  ΣE_x E_y ; ΣE_x E_y  ΣE_y E_y ]^{-1} [ ΣE_x E_t ; ΣE_y E_t ].   (3.23)
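A sketch of the Lucas-Kanade solve (3.19)-(3.23) for a single pixel, assuming numpy and simple finite-difference gradients; a practical tracker would add the pyramidal scheme of [23] and a conditioning check on the 2x2 matrix.

import numpy as np

def lucas_kanade_at(I0, I1, x, y, half=1):
    # Spatial gradients from frame I0 and temporal difference I1 - I0 over the window.
    win = np.s_[y - half:y + half + 1, x - half:x + half + 1]
    Ey, Ex = np.gradient(I0.astype(float))        # d/dy, d/dx
    Et = I1.astype(float) - I0.astype(float)
    ex, ey, et = Ex[win].ravel(), Ey[win].ravel(), Et[win].ravel()
    E = np.column_stack([ex, ey])                 # eq. (3.20)
    uv = np.linalg.solve(E.T @ E, -E.T @ et)      # eq. (3.21)-(3.23)
    return uv                                     # (u, v)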

4  Shape reconstruction in Projective Grid Space

A 3-D point M is handled in the PGS through its projections m_1 and m_2 in the two base views; for any other camera i, the fundamental matrices F_i1 and F_i2 obtained by weak calibration give the epipolar lines l_i1 = F_i1 m_1 and l_i2 = F_i2 m_2, whose intersection is the projection of M into view i (Fig. 4.2). Shape is reconstructed frame by frame (t, t+1, ...) in five steps that sweep the grid points of the PGS defined by Base Views 1 and 2, project each grid point into all cameras, evaluate a consistency measure C_i across the views, and keep the consistent grid points as the object, in the spirit of Seitz's Voxel Coloring [19]; Steps 2-4 are repeated for each frame.

Experiments. The method was tested on a synthetic scene observed by cameras 1-3 arranged around the object (Fig. 4.3) and on a real capture setup built from Point Grey Research Dragonfly IEEE-1394 cameras (Figs. 4.4-4.5). Figure 4.6 and Table 4.1 report the error of the reconstructed grid coordinates (p, q, r) in percent, and Figs. 4.7-4.8 show reconstructed 3-D shapes.

5  Subspace-constrained 3-D scene flow estimation

5.1  Scene flow from multi-view optical flow

Vedula et al. [20] estimate the 3-D scene flow from the 2-D optical flows observed by multiple cameras. The optical flow of camera k is the projection of the scene flow through the 2x3 Jacobian of that camera,

    du_k/dt = (∂u_k/∂x) dx/dt,                                    (5.1)

with u_k = [u_k, v_k]^T and x = [X_x, X_y, X_z]^T. A single camera cannot determine dx/dt: inverting (5.1) leaves one degree of freedom along the viewing ray,

    dx/dt = (∂u_k/∂x)^+ du_k/dt + μ r_k(u_k),                     (5.2)

where (∂u_k/∂x)^+ is a pseudo-inverse of the Jacobian, r_k(u_k) is the direction of the ray through u_k, and μ is an arbitrary scalar. Each camera k therefore yields, at a candidate position x, a vector m_k(x) derived from (5.2)-(5.3) (Fig. 5.1); collecting the normalized vectors of all cameras into

    M(x) = Σ_k m̂_k m̂_k^T,                                        (5.4)

the agreement of the m_k, judged from M(x), indicates the positions x at which scene flow actually exists, in the manner of Seitz's Voxel Coloring [19].

5.1.2  Linear computation of the scene flow

Stacking (5.1) over the n cameras that see the point gives a 2n x 3 linear system: the measured image-flow components [du_1/dt, dv_1/dt, ..., du_n/dt, dv_n/dt]^T on the left, and the stacked rows ∂u_k/∂X_x, ∂u_k/∂X_y, ∂u_k/∂X_z and ∂v_k/∂X_x, ∂v_k/∂X_y, ∂v_k/∂X_z of the Jacobians multiplying dx/dt on the right (5.5). The scene flow dx/dt of each point is obtained from it by least squares whenever n >= 2.
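A sketch of solving (5.5) for one point, assuming numpy: the 2x3 Jacobians of the n cameras (computed, for example, as in the Chapter 3 sketch) are stacked into a 2n x 3 system and solved by least squares.

import numpy as np

def scene_flow_at_point(jacobians, flows):
    # jacobians: list of n 2x3 matrices ∂u_k/∂x (cf. eq. (3.11));
    # flows: (n, 2) measured optical flows du_k/dt. Solves the stacked system (5.5).
    A = np.vstack(jacobians)                        # (2n, 3)
    b = np.asarray(flows, dtype=float).ravel()      # (2n,)
    dxdt, *_ = np.linalg.lstsq(A, b, rcond=None)
    return dxdt                                     # 3-D scene flow dx/dt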

5.2  Subspace constraint on the scene flow

Assume the scene points x_i move rigidly; at frame j the motion has translational velocity t_j and angular velocity ω_j (Fig. 5.2), so the velocity of point i at frame j is

    v_ij = dx_ij/dt = t_j + ω_j x x_i,                            (5.6)

with v_ij = [v_xij, v_yij, v_zij]^T, x_i = [X_xi, X_yi, X_zi]^T, t_j = [t_xj, t_yj, t_zj]^T and ω_j = [ω_xj, ω_yj, ω_zj]^T. Written componentwise (5.7), this factors into a motion term and a point term,

    [v_xij, v_yij, v_zij]^T = [ s_xj ; s_yj ; s_zj ] q_i,         (5.8)
    s_xj = [t_xj, 0, 0, 0, -ω_zj, ω_yj],
    s_yj = [0, t_yj, 0, ω_zj, 0, -ω_xj],
    s_zj = [0, 0, t_zj, -ω_yj, ω_xj, 0],
    q_i = [1, 1, 1, X_xi, X_yi, X_zi]^T.

Collecting all N points into Q_(6xN) = [q_1, ..., q_N] gives, for frame j,

    [ v_xj ; v_yj ; v_zj ]_(3xN) = [ s_xj ; s_yj ; s_zj ]_(3x6) Q_(6xN),   (5.9)

and stacking the M frames, with S_x = [s_x1; ...; s_xM] and similarly S_y, S_z (5.10),

    [ V_x ; V_y ; V_z ]_(3MxN) = [ S_x ; S_y ; S_z ]_(3Mx6) Q_(6xN).

The 3M x N matrix collecting the scene-flow components therefore has low rank, independently of how large M and N are: this is the Subspace constraint.

5.2.2  Enforcing the constraint by SVD

Scene flows estimated independently per point (Section 5.1.2) do not satisfy the constraint exactly because of noise in the optical flow. Let the measured flows be collected into W = [V_x; V_y; V_z] (3M x N) and take its singular value decomposition

    W = U_1 D U_2^T,                                              (5.11)

with U_1 (3M x 3M), U_2 (N x N) and D diagonal with entries d = [d_1, d_2, ..., d_3M]^T in decreasing order. The constraint is imposed by keeping only the largest singular values (three are kept here), d' = [d_1, d_2, d_3, 0, ..., 0]^T, and recomputing

    W' = [ V'_x ; V'_y ; V'_z ] = U_1 D' U_2^T.                   (5.12)

The corrected flows W' are consistent, in the least-squares sense, with the rigid-motion model (5.6) and replace the raw per-point estimates.
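A sketch of (5.11)-(5.12), assuming numpy: the 3M x N matrix stacking the x, y, z flow components is replaced by its best low-rank approximation (rank 3, following the truncation used in the text).

import numpy as np

def enforce_rank(W, r=3):
    # Eq. (5.11)-(5.12): best rank-r approximation of the 3M x N matrix [Vx; Vy; Vz]
    # obtained by truncating its singular value decomposition.
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    s[r:] = 0.0
    return (U * s) @ Vt

# Usage with Vx, Vy, Vz given as M x N arrays of flow components:
#   W_corrected = enforce_rank(np.vstack([Vx, Vy, Vz]))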

5.3  Experiments

5.3.1  Synthetic data

The method was first evaluated on synthetic scenes with M = 2 frames. Two error measures compare each estimated scene-flow vector v with its ground truth v_t: the direction error through

    cos θ = (v · v_t) / (||v|| ||v_t||),                          (5.13)

and a magnitude error comparing ||v|| with ||v_t|| (5.14).

(Figs. 5.3-5.11: estimated 3-D scene flow for the synthetic sequences, without and with the Subspace constraint and with added noise. Figs. 5.12-5.21: distributions of the direction and magnitude errors under the same conditions. Recovered result fragments: 94.2%, 46.3% and 58.3% at the 10% noise level.)

With the Subspace constraint the scattered flow vectors are pulled toward the common rigid motion, and the proportion of correctly estimated vectors increases substantially.

5.3.2  Experiments with real images

A motion-recovery experiment is carried out on real images. As shown in Fig. 5.22, the experimental environment uses five cameras placed in 3-D space, and the parameters of each camera are assumed known. The shutter timing of the cameras is synchronized. The target object, shown in Fig. 5.23, is a human arm holding a Rubik's cube and moving in space. Optical flow is computed by block matching. Two sequences are used: one in which the arm translates about 15 cm per second along the y axis, and one in which it rotates about 180 degrees around the x axis. The input consists of 15 images in total (3 frames x 5 cameras): the reference frame plus one frame before and after it. Figure 5.24 shows the scene flow recovered from the input optical flow of the translation sequence of Fig. 5.23, and Fig. 5.25 shows the scene flow recovered for the rotational motion. In the method of Vedula et al., the locations where scene flow exists are determined from the consistency of the vectors m_k computed from the input cameras; as a result, in panels (a) of these figures only the scene flow of the arm, the sole moving object in the scene, is recovered. In Fig. 5.24(a) the scattered scene-flow directions are corrected to a common direction by applying the Subspace constraint, and Fig. 5.25(b) likewise shows that the scattered scene-flow directions are corrected.

(Fig. 5.22: experimental environment. Fig. 5.23: input images and optical flow for the real-image sequence (translation). Fig. 5.24: example of the 3-D scene flow for the translation. Fig. 5.25: example of the 3-D scene flow for the rotation.)

6  Rigid-motion estimation with RANSAC

6.1  Linear estimation of the motion parameters

Given scene-flow vectors v_ij of points x_i that follow the rigid-motion model (5.6), the motion parameters can be recovered linearly. Rewriting (5.6) with the skew-symmetric matrix of x_i,

    v_ij = ω_j x x_i + t_j = [ -[x_i]_x  I_(3x3) ] [ ω_j ; t_j ],   (6.1)

and stacking N points,

    [ v_1j ; ... ; v_Nj ] = [ -[x_1]_x  I ; ... ; -[x_N]_x  I ] [ ω_j ; t_j ],   (6.2)

a 3N x 6 linear system in the six unknowns ω_j = [ω_xj, ω_yj, ω_zj]^T and t_j = [t_xj, t_yj, t_zj]^T. Because each [x_i]_x has rank 2, a single point does not determine the motion; N >= 2 points suffice, and with more points (6.2) is solved by least squares from the estimated scene-flow vectors v_ij and point positions x_i.
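A sketch of the least-squares solution of (6.2), assuming numpy; points and their scene-flow vectors are stacked into a 3N x 6 system in (ω, t).

import numpy as np

def skew(x):
    return np.array([[0.0, -x[2], x[1]],
                     [x[2], 0.0, -x[0]],
                     [-x[1], x[0], 0.0]])

def estimate_rigid_motion(points, flows):
    # points: (N, 3) positions x_i; flows: (N, 3) scene-flow vectors v_i.
    # Model of eq. (6.1): v_i = ω x x_i + t = [-[x_i]_x  I] [ω; t].
    A = np.vstack([np.hstack([-skew(x), np.eye(3)]) for x in points])
    b = np.asarray(flows, dtype=float).ravel()
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)    # eq. (6.2)
    return sol[:3], sol[3:]                        # (ω, t)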

6.1.1  Robust estimation with RANSAC

Even after the Subspace correction of Section 5.2, some scene-flow vectors remain grossly wrong, and a direct least-squares solution of (6.2) is biased by these outliers (Fig. 6.1). The motion parameters are therefore estimated with RANSAC [3]: random minimal samples of scene-flow vectors generate hypotheses of (ω, t) through (6.2), each hypothesis is scored by the number of scene-flow vectors it explains within a threshold, the steps are iterated, and the motion is finally re-estimated from the inliers of the best hypothesis (Steps 1-6 in the original text).
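A sketch of the RANSAC step, assuming numpy: a generic RANSAC loop around a motion estimator such as the one sketched above; the sample size, iteration count and inlier threshold are illustrative values, not the settings used in the thesis.

import numpy as np

def ransac_rigid_motion(points, flows, estimator, n_iter=500, thresh=1.0, seed=0):
    # points, flows: (N, 3) numpy arrays; estimator(points, flows) returns (ω, t),
    # e.g. the least-squares sketch shown after eq. (6.2).
    rng = np.random.default_rng(seed)
    N = len(points)
    best = np.zeros(N, dtype=bool)
    for _ in range(n_iter):
        idx = rng.choice(N, size=2, replace=False)         # minimal sample (2 points)
        w, t = estimator(points[idx], flows[idx])
        pred = np.cross(w, points) + t                     # predicted flow ω x x_i + t
        inliers = np.linalg.norm(pred - flows, axis=1) < thresh
        if inliers.sum() > best.sum():
            best = inliers
    return estimator(points[best], flows[best]), best      # refit on the inlier set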

6.2  Experiments and discussion

Tables 6.1-6.9 list the estimated rigid-motion parameters (ω_x, ω_y, ω_z in deg/frame and t_x, t_y, t_z in mm/frame) for the different sequences (translation, rotation, and translation plus rotation), each estimated with and without the RANSAC step; Table 6.10 summarizes the comparison.

For the real translation sequence the true motion is a translation along the y axis of roughly 15 cm per second, i.e. on the order of 4.5-5 mm per frame at 1/30 s per frame, and the rotation sequence corresponds to about 6 deg per frame around the x axis. The estimates obtained with the Subspace constraint and RANSAC are closer to these values, most visibly in the dominant component t_y, than those obtained without the robust step.

7  Conclusion

This thesis addressed 3-D scene flow estimation in the weakly calibrated Projective Grid Space. Shape and motion are represented directly in the PGS defined by two base views; the scene flow recovered from multi-view optical flow is regularized by the Subspace constraint of Chapter 5; and the rigid-motion parameters are estimated robustly with RANSAC. Experiments on synthetic and real sequences showed that the Subspace constraint corrects scattered flow vectors toward the true common motion and that the RANSAC step further improves the estimated motion parameters.

References

[1] B. D. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision," Proc. Imaging Understanding Workshop, 1981.
[2] B. K. P. Horn and B. G. Schunck, "Determining optical flow," Artificial Intelligence, Vol. 17, pp. 185-203, 1981.
[3] M. Fischler and R. Bolles, "Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography," Communications of the ACM, Vol. 24, No. 6, pp. 381-395, 1981.
[4] R. Y. Tsai, "An efficient and accurate camera calibration technique for 3D machine vision," Proc. CVPR, 1986.
[5] R. Y. Tsai, "A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses," IEEE Journal of Robotics and Automation, Vol. 3, 1987.
[6] C. Tomasi and T. Kanade, "Shape and motion from image streams under orthography: a factorization method," International Journal of Computer Vision, Vol. 9, No. 2, pp. 137-154, 1992.
[7] T. Baudel and M. Beaudouin-Lafon, "CHARADE: Remote control of objects using free-hand gestures," Communications of the ACM, Vol. 36, No. 2, 1993.
[8] Q.-T. Luong and T. Vieville, "Canonic representations for the geometries of multiple projective views," Proc. 3rd European Conference on Computer Vision, Vol. 1, 1994.
[9] J. L. Barron, D. J. Fleet, and S. S. Beauchemin, "Performance of optical flow techniques," International Journal of Computer Vision, Vol. 12, No. 1, pp. 43-77, 1994.
[10] (in Japanese), Vol. 35, No. 2.
[11] (in Japanese), Vol. 36, No. 8.
[12] R. T. Collins, "A space-sweep approach to true multi-image matching," Proc. IEEE Computer Vision and Pattern Recognition, 1996.
[13] (in Japanese), IEICE Trans. D-II, Vol. J79, No. 8.
[14] (in Japanese), Vol. 37, No. 3.
[15] (in Japanese), Vol. 51, No. 12.
[16] M. Turk, "Moving from GUIs to PUIs," Proc. Fourth Symposium on Intelligent Information Media, Tokyo, Japan, 1998.
[17] S. M. Seitz and K. N. Kutulakos, "Plenoptic image editing," Proc. 6th Int. Conf. on Computer Vision, 1998.
[18] M. Irani, "Multi-frame optical flow estimation using subspace constraints," Proc. International Conference on Computer Vision, Vol. 1, 1999.
[19] S. M. Seitz and C. R. Dyer, "Photorealistic scene reconstruction by Voxel Coloring," International Journal of Computer Vision, Vol. 35, No. 2, pp. 151-173, 1999.
[20] S. Vedula, S. Baker, P. Rander, R. Collins, and T. Kanade, "Three-dimensional scene flow," Proc. 7th International Conference on Computer Vision, Vol. 2, pp. 722-729, September 1999.
[21] H. Saito and T. Kanade, "Shape reconstruction in projective grid space from large number of images," Proc. IEEE Conference on Computer Vision and Pattern Recognition (CVPR '99), June 1999.
[22] (in Japanese), Vol. 40, No. 2.
[23] J.-Y. Bouguet, "Pyramidal implementation of the Lucas-Kanade feature tracker," OpenCV Documentation, Microprocessor Research Labs, Intel Corp.
[24] G. Simon, A. Fitzgibbon, and A. Zisserman, "Markerless tracking using planar structures in the scene," Proc. ISAR, 2000.
[25] M. Hirose, T. Miyasaka, K. Kuroda, and K. Araki, "Integration of successive range images for robot vision," Proc. XIXth Congress of the International Society for Photogrammetry and Remote Sensing (ISPRS), Amsterdam, July 2000.
[26] (in Japanese), Vol. 42 (SIG 6), pp. 33-43.
[27] (in Japanese), IEICE Trans. J85-D-II, No. 3.
[28] S. M. Seitz and K. N. Kutulakos, "Plenoptic image editing," International Journal of Computer Vision, Vol. 48, No. 2, 2002.
[29] (in Japanese), Meeting on Image Recognition and Understanding (MIRU 2004).
[30] (in Japanese), 2005.
[31] (in Japanese), Meeting on Image Recognition and Understanding (MIRU 2005), 2005.
[32] (in Japanese).
[33] (in Japanese).
[34] (in Japanese, on 3-D CG).
[35] (in Japanese).
[36] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press.


Appendix A  Linear algebra

A.1  Vector spaces

A.1.1  Definition. A set V is a (real) vector space if it is closed under (i) addition and (ii) scalar multiplication — for a, b in V and a scalar k, both a + b and ka again belong to V — and the following eight rules hold:

    a + b = b + a                 (k + l)a = ka + la
    (a + b) + c = a + (b + c)     k(a + b) = ka + kb
    a + 0 = a                     (kl)a = k(la)
    a + (-a) = 0                  1a = a

A.1.2  Linear independence, basis and dimension. Vectors a_1, a_2, ..., a_n are linearly independent if

    c_1 a_1 + c_2 a_2 + ... + c_n a_n = 0                          (A.1)

holds only for c_1 = c_2 = ... = c_n = 0; if (A.1) has a non-trivial solution they are linearly dependent. Equivalently, the matrix A = [a_1, a_2, ..., a_n] has rank n exactly when the n vectors are independent. If every v in V can be written as

    v = v_1 a_1 + v_2 a_2 + ... + v_n a_n,                         (A.2)

and sums and scalar multiples of such expansions remain in V ((A.3)-(A.6)), then {a_1, ..., a_n} is a basis of V and n = dim V is its dimension. For example, 3-D space with coordinates x, y, z has the standard basis e_x = [1, 0, 0]^T, e_y = [0, 1, 0]^T, e_z = [0, 0, 1]^T, and [v_x, v_y, v_z]^T = v_x e_x + v_y e_y + v_z e_z.

A.1.3  Inner product. For a = [a_1, ..., a_n]^T and b = [b_1, ..., b_n]^T in R^n the inner product is

    (a, b) = Σ_{i=1}^{n} a_i b_i = a_1 b_1 + a_2 b_2 + ... + a_n b_n,   (A.7)

with the properties (1) (a, b) = (b, a), (2) (a, b + c) = (a, b) + (a, c), (3) (ka, b) = k(a, b), (4) (a, a) >= 0. A basis e_1, ..., e_n is orthonormal when

    (e_i, e_j) = δ_ij = { 1 (i = j), 0 (i ≠ j) },                  (A.8)

in which case (a, b) expanded in that basis again reduces to Σ a_i b_i.

A.2  Linear maps

A.2.1  A map from R^n to R^m is linear if it can be written y = A x for an m x n matrix A = (a_ij), i.e. y_i = a_i1 x_1 + ... + a_in x_n, which is equivalent to A(x + x') = A x + A x' and A(c x) = c(A x) for any scalar c.

A.2.2  Change of basis. Let a new basis of R^n be given by [e'_1, ..., e'_n] = [e_1, ..., e_n] P with a regular matrix P (A.9). Expressing f(e'_i) in the basis of R^m gives f(e'_i) = Σ_j (A P)_ji ẽ_j (A.10). If the basis of R^m is also changed by [ẽ'_1, ..., ẽ'_m] = [ẽ_1, ..., ẽ_m] Q, then f(e'_i) = Σ_l (Q^{-1} A P)_li ẽ'_l (A.11), so the matrix of f in the new bases is A' = Q^{-1} A P.

A.2.3  Subspaces. A subset W of R^n is a subspace if (1) a, b in W implies a + b in W, and (2) a in W implies c a in W for any scalar c.

A.2.4  Kernel and image. For a linear map f : R^n -> R^m,

    Ker f = { x in R^n : f(x) = 0 },    Im f = { f(x) : x in R^n };

Ker f is a subspace of R^n, Im f is a subspace of R^m, and

    dim(Ker f) + dim(Im f) = n.

A.2.5  Rank. Since f(x) = f(Σ x_i e_i) = Σ x_i f(e_i) (A.12), Im f is spanned by f(e_1), ..., f(e_n); its dimension is the rank of f, and with A = [f(e_1), ..., f(e_n)] one has rank f = rank A.

90 x 1 x 2 [ ] [ ] a 11 a 12 x 1 = λ 1 x 1, a 11 a 12 x 2 = λ 2 x 2 a 21 a 22 a 21 a 22 λ 1 λ 2 A x 1, x 2 A n n A R n [e 1 e 2... e n] = [e 1 e 2... e n ]P A = P 1 AP A A 3 A T A (A T A) T = A T A A T A A T A T 1 T 1 1 A T AT 1 = γ γ γ 3 γ 1, γ 2, γ 3 A T A u i γ i T 1 T 1 = [u 1, u 2, u 3 ] (Au i, Au i ) = (u i, A T Au i ) = γ i (u i, u i ) = γ i (Au i, Au i ) 0 γ i 0 γ 1 = λ 2 1 0, γ 2 = λ 2 2 0, γ 3 = λ v 1 = Au 1 λ 1, v 2 = Au 2 λ 2 v 1 v 2 80

91 v 3 2 T 2 = [v 1, v 2, v 3 ] (v i, v j ) = δ ij T T T = I(I = ) T 2 T T 2 AT 1 = (v 1, Au 1 ) (v 1, Au 2 ) (v 1, Au 3 ) (v 2, Au 1 ) (v 2, Au 2 ) (v 2, Au 3 ) (v 3, Au 1 ) (v 3, Au 2 ) (v 3, Au 3 ) = λ λ (A.13) i = 1, 2 j = 1, 2, 3 (v i, Au j ) = (v 3, Au j ) = 0 ( ) Aui, Au j = 1 λ (u i, A T Au j ) = λ j δ i j λ i n A 2 T 1, T 2 λ λ T T 2 AT 1 = λ n λ 2 1, λ 2 2,..., λ 2 n AT A λ 1, λ 2,..., λ n A 81
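A small numerical check of the construction above, assuming numpy: the squared singular values of A coincide with the eigenvalues of A^T A, and A = T_2 diag(λ) T_1^T (here U, s, Vt play the roles of T_2, λ and T_1^T).

import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, -1.0],
              [1.0, 1.0, 0.0]])
U, s, Vt = np.linalg.svd(A)
eigvals = np.linalg.eigvalsh(A.T @ A)          # eigenvalues of A^T A (ascending)
print(np.allclose(np.sort(s ** 2), eigvals))   # True: λ_i^2 are the eigenvalues of A^T A
print(np.allclose(U @ np.diag(s) @ Vt, A))     # A = T_2 diag(λ) T_1^T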

Appendix B  Publications

[1] (in Japanese) 3-D ..., Meeting on Image Recognition and Understanding (MIRU 2006).
[2] (in Japanese) 3-D ..., IPSJ SIG Technical Report CVIM 149.
[3] (in Japanese) Subspace-constrained 3-D ..., 18th annual conference, September.
[4] (in Japanese) 3-D ... in Projective Grid Space, no. 0-324.
