Proceedings of the Institute of Statistical Mathematics, Vol. 49, No. 1 (2001)

3D Reconstruction from Sequences of 2D Images under Point Correspondences:
A Mathematical Analysis of the Factorization Method

Jun Fujiki (National Institute of Advanced Industrial Science and Technology)

1. Introduction

Recovering the camera motion and the object shape from multiple images (the structure from motion problem) is a fundamental problem in computer vision, underlying applications such as Image-Based Rendering, Virtualized Reality, and Augmented/Mixed Reality. This paper gives a mathematical analysis of solving the structure from motion problem under point correspondences by the factorization method, which is based on affine approximations of the perspective projection.
Point correspondences over the frames are assumed to be given; they can be obtained by feature tracking such as the Kanade-Lucas-Tomasi tracker (Shi and Tomasi (1994)) or by matching under the epipolar constraint (Xu and Zhang (1996)). We first fix how a 3-D point is mapped, through the camera center, onto pixel coordinates in the image plane.
1.1 Perspective projection

Take the camera coordinate system XYZ with its origin at the camera center, the Z axis along the optical axis, and the X and Y axes parallel to the image coordinate axes x and y. A 3-D point (X, Y, Z)^T is mapped to the image point (x, y)^T by the perspective projection

(1.1)  x = l X / Z,   y = l Y / Z,

where l is the focal length. All the affine camera models below are approximations of (1.1).
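The pinhole relation (1.1) is a one-line computation; a minimal sketch (the point data are made up for illustration):

```python
import numpy as np

def project_perspective(X, l=1.0):
    """Perspective projection (1.1): x = l*X/Z, y = l*Y/Z.
    X: (N, 3) array of camera-frame points with Z > 0."""
    X = np.asarray(X, dtype=float)
    return l * X[:, :2] / X[:, 2:3]

# Points at depth Z = 2 with focal length l = 1 are halved in image scale.
pts = np.array([[1.0, 0.5, 2.0], [0.0, -1.0, 4.0]])
print(project_perspective(pts))  # [[0.5, 0.25], [0.0, -0.25]]
```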
The factorization method of Tomasi and Kanade (1992) collects the image coordinates of P points tracked over F frames into a 2F x P measurement matrix. In the noise-free case this matrix is the product of a 2F x 3 motion matrix and a 3 x P shape matrix, so its rank is at most 3, and the two factors can be separated stably by the singular value decomposition (SVD).
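The rank observation can be checked numerically; a sketch with synthetic data (a random motion matrix and shape stand in for real cameras and points):

```python
import numpy as np

rng = np.random.default_rng(0)
F, P = 6, 20                           # hypothetical frame and point counts
S = rng.standard_normal((3, P))        # 3-D shape, one column per point
M = rng.standard_normal((2 * F, 3))    # stacked 2x3 affine projection blocks
W = M @ S                              # measurement matrix (2F x P)
# W is a (2F x 3)(3 x P) product, so its rank is at most 3: the key
# observation behind the Tomasi-Kanade factorization.
print(np.linalg.matrix_rank(W))        # 3
```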
2. Affine approximations of the perspective camera

An affine camera cannot distinguish a scene from its mirror image with respect to a plane parallel to the image plane; this two-fold ambiguity is called the Necker reversal (cf. Christy and Horaud (1996)). To describe the approximations, let X_fp = (X_fp, Y_fp, Z_fp)^T be the position of point p in the camera coordinate system of frame f and x_fp = (x_fp, y_fp)^T its image, so that the perspective projection (1.1) reads

(2.1)  x_fp = (l / Z_fp) (X_fp, Y_fp)^T.
2.2 Paraperspective projection

The paraperspective camera (Poelman and Kanade (1997)) is a first-order approximation of the perspective camera around the centroid of the object. Let X_f = (X_f, Y_f, Z_f)^T be the centroid of the points in frame f, x_f its image, and put

(2.2)  X'_fp = X_fp - X_f,   x'_fp = x_fp - x_f.

Expanding (2.1) to first order in X'_fp,

x_fp = (l / Z_f) ( I_2 | -(X_f, Y_f)^T / Z_f ) X'_fp + (l / Z_f) (X_f, Y_f)^T + O(||X'_fp||^2),

so that, neglecting the second-order term,

(2.3)  x_fp = A_f^para X'_fp + x_f,   A_f^para = (l / Z_f) ( 1  0  -X_f/Z_f ;  0  1  -Y_f/Z_f ).
Geometrically, the paraperspective camera projects each point onto the plane Z = Z_f along the direction of the line joining the camera center and the centroid, then scales that plane by l / Z_f; unlike the scaled orthographic camera it therefore models the position effect of an off-axis object.

2.3 Scaled orthographic projection

When the object lies near the optical axis, X_f / Z_f ≈ 0 and Y_f / Z_f ≈ 0, and (2.3) reduces to

(2.4)  x_fp = (l / Z_f) (X_fp, Y_fp)^T = A_f^scaled X_fp.

This is the scaled orthographic (weak perspective) camera: points are first projected orthographically onto the plane Z = Z_f parallel to the image plane, which is then scaled by l / Z_f.

2.4 Orthographic projection

When in addition the average depth is common to all frames, Z_1 = Z_2 = ... = Z_F = Z, the scale l / Z in (2.4) is constant and

(2.5)  x_fp = (l / Z) (X_fp, Y_fp)^T = A_f^ortho X_fp,

which is the orthographic camera up to a fixed magnification.
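The hierarchy of approximations (2.1), (2.3), (2.4) can be compared numerically. A sketch with a hypothetical off-axis point cloud; paraperspective should track perspective more closely than weak perspective because it keeps the first-order position term:

```python
import numpy as np

def perspective(X, l=1.0):
    return l * X[:, :2] / X[:, 2:3]                  # eq. (2.1)

def weak_perspective(X, l=1.0):
    return l * X[:, :2] / X[:, 2].mean()             # eq. (2.4), Z_f = mean depth

def paraperspective(X, l=1.0):
    c = X.mean(axis=0)                               # centroid (X_f, Y_f, Z_f)
    Xr = X - c                                       # relative coordinates X'_fp
    Zc = c[2]
    # eq. (2.3): x = (l/Z_f)(X' - (Z'/Z_f)(X_f, Y_f)) + l(X_f, Y_f)/Z_f
    return (l / Zc) * (Xr[:, :2] - np.outer(Xr[:, 2] / Zc, c[:2])) + l * c[:2] / Zc

rng = np.random.default_rng(1)
X = np.array([4.0, 3.0, 10.0]) + 0.5 * rng.standard_normal((50, 3))  # off-axis cloud
err_weak = np.abs(weak_perspective(X) - perspective(X)).max()
err_para = np.abs(paraperspective(X) - perspective(X)).max()
print(err_para < err_weak)  # True
```

For this off-axis cloud the weak-perspective error is first order in the depth spread, while the paraperspective error is second order.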
2.5 Generalized affine and metric affine projection

The cameras above are special cases of the generalized affine camera (Mundy and Zisserman (1992)),

(2.8)  x_fp = A_f X_fp + u_f,

with A_f an arbitrary 2 x 3 matrix and u_f a 2-vector. Without constraints on A_f the scene is recovered only up to an affine transformation; a Euclidean (metric) reconstruction requires metric constraints on A_f. There is also an inherent depth-scale ambiguity: for any t > 0,

(2.9)  x_fp = (l / Z_f) (X_fp, Y_fp)^T = (l / (t Z_f)) (t X_fp, t Y_fp)^T,

so a scene scaled by t viewed at depth t Z_f produces the same images; only the product is determined. We therefore write λ_f = t Z_f for the depth of frame f with this scale absorbed.
2.6 The MAP camera and its decomposition

Separating the depth from A_f, write

(2.10)  A_f = (1 / λ_f) B_f,

so that the metric affine projection (MAP) camera model is

(2.11)  x'_fp = A_f X'_fp = (1 / λ_f) B_f X'_fp,   u_f = x_f = (l / λ_f) (X_f, Y_f)^T.

Decompose B_f by the SVD as

(2.12)  B_f = R_f Σ_f D_f,   R_f^T R_f = D_f D_f^T = I_2,   Σ_f = diag{p_f, q_f},

with R_f a 2 x 2 orthogonal matrix and D_f a 2 x 3 matrix with orthonormal rows, so that

(2.13)  x'_fp = (1 / λ_f) R_f Σ_f D_f X'_fp.

Writing D_f = (c_f, d_f)^T, the 2-vector

(2.14)  χ_fp = D_f X'_fp

is the orthogonal projection of X'_fp onto the plane span{c_f, d_f}, and (2.13) becomes

(2.15)  x'_fp = (1 / λ_f) R_f Σ_f χ_fp,   equivalently   (2.16)  χ_fp = λ_f Σ_f^{-1} R_f^T x'_fp.

Thus the MAP camera first projects the scene orthographically onto the virtual image plane span{c_f, d_f}, then applies the axis scalings p_f / λ_f, q_f / λ_f and the rotation R_f.
2.7 Properties of the MAP camera

The scaled orthographic and paraperspective cameras are MAP cameras with particular B_f; conversely, by (2.15), every MAP camera is a scaled orthographic projection onto its virtual image plane span{c_f, d_f} followed by a transformation of that plane. The frame depths λ_f enter (2.11) only through their ratios: multiplying all λ_f and the scene by a common factor leaves the images unchanged, so only the relative depths {λ_f} are recoverable (cf. Xu and Sugimoto (1998); Xu and Zhang (1996)). This virtual image plane interpretation is what lets the factorization method (Tomasi and Kanade (1992)) extend uniformly to the whole MAP family.
3. The factorization method

3.1 The measurement matrix

Let t_f be the camera position in frame f, {i_f, j_f} the orthonormal basis of its image plane, k_f the viewing direction, and put C_f = (i_f, j_f, k_f)^T. A point s_p in world coordinates has camera coordinates X_fp with

(3.1)  s_p = t_f + C_f^T X_fp,   i.e.   X_fp = C_f (s_p - t_f).

Let s be the centroid of the points and put s'_p = s_p - s, t'_f = t_f - s; then

(3.2)  s'_p = t'_f + C_f^T X_fp,   X'_fp = C_f s'_p.

By the MAP model (2.11), the centered image coordinates satisfy

(3.3)  x'_fp = A_f X'_fp = A_f C_f s'_p,

the translation t'_f being absorbed into the image centroid x_f. Collecting all points and frames,

(3.4)  W = (W_1^T, ..., W_F^T)^T,   M = (M_1^T, ..., M_F^T)^T,

(3.5)  W = M S,   W_f = (x'_f1, ..., x'_fP),   M_f = A_f C_f,   S = (s'_1, ..., s'_P),

a (2F x 3)(3 x P) product, so rank W <= 3. The factorization problem is: given W, recover {C_f}, {s'_p} and {λ_f} from the FP equations (3.3). Because C_f is orthogonal, M_f satisfies the metric constraint

(3.6)  M_f M_f^T = A_f A_f^T = (1 / λ_f^2) B_f B_f^T.
3.2 Metric constraints and the matrix Q

By the SVD, the rank-3 matrix W factors as

(3.7)  W = M̂ Ŝ,   (2F x 3)(3 x P)

(cf. Yokoya et al. (1998)). Any factorization W = M S is related to it by a nonsingular 3 x 3 matrix A:

(3.8)  M = M̂ A,   S = A^{-1} Ŝ.

Putting Q = A A^T, the metric constraint (3.6) becomes the frame-wise condition

(3.9)  M̂_f Q M̂_f^T = A_f A_f^T = (1 / λ_f^2) B_f B_f^T.

With the decomposition (2.12), write

(3.10)  P̂_f = (p̂_f, q̂_f)^T = R_f^T M̂_f,   P_f = R_f^T M_f;

here R_f and Σ_f are determined by the camera model (for the scaled orthographic camera R_f = I_2 and Σ_f = l I_2; for the paraperspective camera they follow from the image centroid x_f), so p̂_f and q̂_f are known. Then (3.9) reads

(3.11)  P̂_f Q P̂_f^T = (1 / λ_f^2) Σ_f^2,

that is,

(3.12)  p̂_f^T Q p̂_f = p_f^2 / λ_f^2,   q̂_f^T Q q̂_f = q_f^2 / λ_f^2,   p̂_f^T Q q̂_f = 0.

Eliminating the unknown depth λ_f leaves two linear constraints on the symmetric 3 x 3 matrix Q per frame:

(3.13)  q_f^2 p̂_f^T Q p̂_f - p_f^2 q̂_f^T Q q̂_f = 0,   p̂_f^T Q q̂_f = 0.
3.3 Linear determination of Q

From (3.12), once Q is known the depths follow, up to a single overall scale s fixed for example through the average depth:

(3.14)  λ_f = s p_f / (p̂_f^T Q p̂_f)^{1/2} = s q_f / (q̂_f^T Q q̂_f)^{1/2}.

To solve for Q, write Q = (Q_ij) and q = (Q_11, Q_12, Q_13, Q_22, Q_23, Q_33)^T, and for vectors a, b define the 6-vector ω(a, b) by

(3.15)  ω(a, b)^T q = a^T Q b.

The constraints (3.13) for frame f become ω̂_f^T q = 0_2 with

(3.16)  ω̂_f = ( q_f^2 ω(p̂_f, p̂_f) - p_f^2 ω(q̂_f, q̂_f),  ω(p̂_f, q̂_f) ),

and stacking all frames,

(3.17)  Ω q = 0_2F,   Ω = (ω̂_1, ..., ω̂_F)^T   (2F x 6).

This homogeneous system determines q up to scale; fixing the scale by the normalization ω(p̂_1, p̂_1)^T q = 1 and solving in the least-squares sense by the pseudoinverse,

(3.18)-(3.19)  q = ( ω(p̂_1, p̂_1)^T ; Ω )^+ ( 1 ; 0_2F ).

When the data are consistent the recovered Q is positive definite (cf. Quan (1996)), and its Cholesky decomposition Q = L L^T gives A = L U with U in O(3), whence

(3.21)  M = M̂ L U,   S = U^T L^{-1} Ŝ.

The orthogonal freedom U is the choice of the world coordinate frame; the two signs det U = ±1 give two reconstructions that are mirror images of each other.
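The linear Q step of (3.15)-(3.21) can be sketched in the orthographic special case (p_f = q_f = λ_f = 1, R_f = I_2), where (3.12) reads M̂_f Q M̂_f^T = I_2; synthetic random rotations stand in for real cameras:

```python
import numpy as np

def omega(a, b):
    # coefficient row of a^T Q b on q = (Q11, Q12, Q13, Q22, Q23, Q33), eq. (3.15)
    return np.array([a[0]*b[0], a[0]*b[1] + a[1]*b[0], a[0]*b[2] + a[2]*b[0],
                     a[1]*b[1], a[1]*b[2] + a[2]*b[1], a[2]*b[2]])

def metric_upgrade(W):
    """Orthographic factorization sketch: centered W (2F x P) -> M, S with each
    frame's two rows of M orthonormal (eqs. (3.7)-(3.21) with p_f = q_f = 1)."""
    U, d, Vt = np.linalg.svd(W, full_matrices=False)
    Mh = U[:, :3] * np.sqrt(d[:3])               # affine motion \hat{M}
    Sh = np.sqrt(d[:3])[:, None] * Vt[:3]        # affine shape \hat{S}
    rows, rhs = [], []
    for f in range(W.shape[0] // 2):             # constraints (3.12) per frame
        p, q = Mh[2*f], Mh[2*f + 1]
        rows += [omega(p, p), omega(q, q), omega(p, q)]
        rhs += [1.0, 1.0, 0.0]
    qv, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    Q = np.array([[qv[0], qv[1], qv[2]], [qv[1], qv[3], qv[4]], [qv[2], qv[4], qv[5]]])
    L = np.linalg.cholesky(Q)                    # Q = L L^T, take A = L (U = I)
    return Mh @ L, np.linalg.solve(L, Sh)        # M = \hat{M} A, S = A^{-1} \hat{S}

rng = np.random.default_rng(2)
S_true = rng.standard_normal((3, 20))
M_true = np.vstack([np.linalg.qr(rng.standard_normal((3, 3)))[0][:2] for _ in range(8)])
M, S = metric_upgrade(M_true @ S_true)
print(np.allclose(M @ S, M_true @ S_true))       # reproduces the measurements
print(np.allclose(M[:2] @ M[:2].T, np.eye(2)))   # metric constraint per frame
```

The recovered M and S agree with the true ones only up to the rotation/reflection U, exactly as in (3.21).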
3.4 The mirror solution (Necker reversal)

Once M is fixed, the camera parameters follow: by (3.10) and (2.16), V_f = λ_f Σ_f^{-1} P_f recovers the rows of D_f C_f (eq. (3.22)), and t_f follows from (3.3). The two solutions det U = ±1, however, cannot be distinguished by any MAP camera: replacing the shape S by its mirror image diag{1, 1, -1} S and each camera orientation C_f by H_f C_f for a suitable rotation-reflection H_f leaves every image unchanged. H_f is determined by the condition that the reflected configuration reproduce the same products A_f C_f and D_f C_f (eqs. (3.23)-(3.24)), and admits closed forms for the scaled orthographic and paraperspective cameras (eq. (3.25)); the two-fold ambiguity is therefore intrinsic to the affine approximation and must be resolved by other information.
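The Necker reversal can be verified directly for the orthographic camera: reflecting the shape in the plane Z = 0 and reflecting the camera correspondingly leaves every image unchanged while preserving the metric constraint. A minimal sketch (random camera and shape, made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
R, _ = np.linalg.qr(rng.standard_normal((3, 3)))
Mf = R[:2]                                   # orthographic camera rows (i_f, j_f)
S = rng.standard_normal((3, 10))             # shape, one column per point
J = np.diag([1.0, 1.0, -1.0])                # reflection in the plane Z = 0
Mf2, S2 = Mf @ J, J @ S                      # mirrored interpretation
print(np.allclose(Mf @ S, Mf2 @ S2))         # identical images (J @ J = I)
print(np.allclose(Mf2 @ Mf2.T, np.eye(2)))   # still a valid metric camera
```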
The same images are also produced by the shape diag{1, 1, -1} S together with the cameras K_f C_f diag{1, 1, -1}, where K_f = H_f diag{1, 1, -1} (eqs. (3.26)-(3.28)); these coincide with one of the two solutions above, so exactly two interpretations remain under any MAP camera. The argument assumes the generic case rank S = 3 (non-coplanar points); each M_f always has rank 2, and the degenerate cases are discussed next.
3.5 Degenerate cases

The metric reconstruction requires rank M = rank S = 3 in W (2F x P) = M (2F x 3) S (3 x P). When rank M = 2, that is, when the virtual image planes span{p̂_f, q̂_f} of all frames coincide, the constraints (3.11) are mutually dependent: each frame constrains only the restriction of Q to a 2-dimensional subspace, and whenever span{p̂_j, q̂_j} equals span{p̂_i, q̂_i}, frame j adds nothing to frame i, so the six parameters of Q cannot be determined. In such cases only an affine reconstruction in the sense of Koenderink and van Doorn (1991) is obtained (Fujiki (1997a)). How few frames with distinct virtual image planes suffice is the subject of the next section.
18 i, j 2 span{ˆp j, ˆq j } span{ˆp i, ˆq i } 1 span{ˆp j, ˆq j } span{ˆp i, ˆq i } λ i λ j ĉ ij span{ˆp i, ˆq i } span{ˆp j, ˆq j } ĉ ij 0 3 ˆP i T ˆP j T 2 ĥi, ĥj (4.1) ĉ ij = ˆP T i ĥi = ˆP T j ĥj (3.11) (4.2) λ i : λ j = Σ j ĥ j : Σ i ĥ i {λ f } F f=1 {λ f } F f=1 Σ f (3.11) (4.3) ˆVf =(ˆv f, ŵ f ) T = λ f Σ 1 f ˆP f, V f = λ f Σ 1 f P f (4.4) ˆVf Q ˆV T f = I 2 ( ˆV f ˆV f )cs Q =csi 2 cs (4.4) 2 MAP MAP (4.5) ˆV = ( ˆV 1 ˆV 1 ) T,..., ( ˆV F ˆV F ) T T, I = (cs I2,..., cs I 2 ) T cs Q = ˆV + I Q ĉ ij 2 2 1,2 ĉ 12 ĉ ĉ Q =1 â, ˆb (4.6) â = ĉ, ŵ 1 Qˆv 1 ĉ, ˆv 1 Q ŵ 1, (4.7) ˆb = ĉ, ŵ2 Qˆv 2 ĉ, ˆv 2 Q ŵ a 1999 â Q = ˆb Q = ĉ Q =1, â, ĉ Q = ˆb, ĉ Q =0, (4.8) â span{ˆv 1, ŵ 1 }, ˆb span{ˆv2, ŵ 2 }, ĉ span{ˆv 1, ŵ 1 } span{ˆv 2, ŵ 2 } span{ˆv i, ŵ i } = span{ˆp i, ˆq i }(i =1, 2) â {ˆv 1, ŵ 1, ˆv 1 ŵ 1 } {a, c, ˆv 1 ŵ 1 } ˆb 8 (4.8) 5 (4.4) f=1,2 5 â, ˆb Q Q â, ˆb 2 Q â 2 Q ˆb 2 Q =1 (4.9) â, ˆb Q = cos 2θ, 0 < 2θ <π
The angle θ cannot be recovered from the two images and parameterizes a one-parameter family of metric reconstructions. With P = (â, b̂, ĉ),

(4.10)  Q = P^{-T} K diag{2 cos^2 θ, 2 sin^2 θ, 1} K^T P^{-1},

where K is a fixed orthogonal change of basis mixing the first two axes, and hence

(4.11)  A = P^{-T} K diag{√2 cos θ, √2 sin θ, 1} U_3,   U_3 in O(3),

the factor U_3 being again the choice of world frame. The corresponding family of reconstructions is

(4.12)  M(θ)^T = diag{√2 cos θ, √2 sin θ, 1} K^T P^{-1} M̂^T,   S(θ) = diag{1/(√2 cos θ), 1/(√2 sin θ), 1} K^{-1} P^T Ŝ,

so that

(4.13)  M(θ)^T = diag{√2 cos θ, √2 sin θ, 1} M(π/4)^T,   S(θ) = diag{1/(√2 cos θ), 1/(√2 sin θ), 1} S(π/4):

every member of the family is obtained from the θ = π/4 member by reciprocal scalings of the motion and the shape along the first two axes.
4.2 The solution θ = π/4

In the coordinate frame of the θ = π/4 solution the geometry is canonical: the common line of the two virtual image planes is

(4.14)  c = (0, 0, 1)^T,

and the planes contain the directions

(4.15)  a = (-cos θ, sin θ, 0)^T,   b = (cos θ, sin θ, 0)^T,

with |a| = |b|, a ⊥ c, b ⊥ c, so the two planes make the angle 2θ about the common axis c, symmetrically about the bisecting directions

(4.16)  u = (1, 0, 0)^T,   v = (0, 1, 0)^T.

By (4.13), moving through the family scales the u and v components of M(θ) by √2 cos θ and √2 sin θ and those of S(θ) reciprocally, while the c components are unchanged: the two-image ambiguity is a bas-relief-type trade-off about the common axis of the two virtual image planes.

4.3 Three images

A third image whose virtual image plane is in general position with respect to the first two supplies the constraints that fix θ, and Q (normalized by, say, r^T Q r = 1) is then determined; the reconstruction is unique up to the overall scale and the Necker reversal.
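The counting in Sections 4.1-4.3 can be checked numerically for the scaled orthographic case (p_f = q_f): stacking the frame constraints (3.13) with one normalization row as in (3.18), two images leave a rank-5 system on the six parameters of Q (a one-parameter family), while a third image in general position raises the rank to 6. A sketch with random rotations standing in for cameras:

```python
import numpy as np

def omega(a, b):
    # coefficient row of a^T Q b on q = (Q11, Q12, Q13, Q22, Q23, Q33)
    return np.array([a[0]*b[0], a[0]*b[1] + a[1]*b[0], a[0]*b[2] + a[2]*b[0],
                     a[1]*b[1], a[1]*b[2] + a[2]*b[1], a[2]*b[2]])

def constraint_matrix(frames):
    # per frame (p_f = q_f): p^T Q p - q^T Q q = 0 and p^T Q q = 0, cf. (3.13);
    # first row is the scale normalization of (3.18)
    rows = [omega(frames[0][0], frames[0][0])]
    for p, q in frames:
        rows += [omega(p, p) - omega(q, q), omega(p, q)]
    return np.array(rows)

rng = np.random.default_rng(6)
frames = [np.linalg.qr(rng.standard_normal((3, 3)))[0][:2] for _ in range(3)]
print(np.linalg.matrix_rank(constraint_matrix(frames[:2])))  # 5: theta undetermined
print(np.linalg.matrix_rank(constraint_matrix(frames)))      # 6: Q determined
```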
The sufficiency of three images can also be seen geometrically, in the spirit of Ullman (1979). Let π̂_i (i = 1, 2, 3) be the three virtual image planes. (a) If the planes are in general position, the pairwise intersection directions ê_1 in π̂_2 ∩ π̂_3, ê_2 in π̂_3 ∩ π̂_1, ê_3 in π̂_1 ∩ π̂_2 (ê_i ≠ 0) are linearly independent, and the six quantities |ê_1|_Q, |ê_2|_Q, |ê_3|_Q, <ê_1, ê_2>_Q, <ê_2, ê_3>_Q, <ê_3, ê_1>_Q determine the six parameters of Q. (b) If the three planes share a common direction ê, choose ê_i in π̂_i independent of ê; writing ê_3 = e_1 ê_1 + e_2 ê_2 with e_1, e_2 ≠ 0, the identity |ê_3|_Q^2 = e_1^2 |ê_1|_Q^2 + 2 e_1 e_2 <ê_1, ê_2>_Q + e_2^2 |ê_2|_Q^2 supplies <ê_1, ê_2>_Q, and the quantities |ê|_Q, |ê_1|_Q, |ê_2|_Q, <ê, ê_1>_Q, <ê, ê_2>_Q, <ê_1, ê_2>_Q again determine Q.

5. Behavior under noise

With measurement noise W is no longer of rank 3; the factorization then uses the best rank-3 approximation W ≈ M̂ Ŝ, (2F x 3)(3 x P), in the least-squares sense, obtained from the truncated SVD, and the analysis above is applied to this approximation.
The quality of the rank-3 approximation, and of the separation of W into M and S, is governed by eigenvalue spectra. Since the nonzero eigenvalues of AB and BA coincide, the relevant spectrum of W^T W = S^T (M^T M) S is controlled by the two factors; the separation is governed by the roots of det(M^T M - μ (S' S'^T)^{-1}) = 0. The shape factor is, up to the factor 1/P, the sample covariance of the points,

(5.1)  Cov(s'_p) = (1/P) S' S'^T,

and the motion factor is

(5.2)  M^T M = Σ_{f=1}^{F} (1/λ_f^2) { p_f^2 v_f v_f^T + q_f^2 w_f w_f^T },

where {v_f, w_f} are the directions spanning the virtual image planes and p_f / λ_f, q_f / λ_f the frame scales. A well-conditioned reconstruction therefore requires both a point cloud spread in all three directions and viewing directions spread over the frames.
Outlying correspondences can be removed by robust estimation such as LMedS before the factorization (Fujiki et al. (1999b)).

6. Toward the perspective camera

The MAP-based reconstruction can be refined to the true perspective model. Existing approaches are of two kinds: iterative refinement of an affine factorization toward perspective (Christy and Horaud (1996)), and projective factorization with iterative estimation of the projective depths (Sturm and Triggs (1996); Ueshiba and Tomita (1998)). This section reviews the former, which builds directly on the MAP factorization.
Christy and Horaud (1996) exploit the fact that the paraperspective image of a point can be computed from its perspective image once the point's relative depth is known. Writing x̃_fp = x_fp - x_f for centered image coordinates, the perspective and paraperspective images of point p in frame f are related by

(6.1)  x̃^para_fp = μ_fp x̃^per_fp,   μ_fp = λ_fp / λ_f,

where λ_fp is the perspective depth of the point and λ_f that of the centroid of frame f (cf. the projective depth of Ueshiba and Tomita (1998)). If the μ_fp were known, the corrected measurement matrix

(6.2)  W = ( μ_11 x̃_11 ... μ_1P x̃_1P ; ... ; μ_F1 x̃_F1 ... μ_FP x̃_FP ),   x̃_fp = x_fp - x_f,

built from the perspective measurements would obey the paraperspective model, and the MAP factorization would apply; since the μ_fp are unknown, they are estimated iteratively.

Fig. 12. Computing the paraperspective image points from the perspective ones.
The iteration of Christy and Horaud (1996) tracks both Necker-reversal branches M^(±), S^(±):

(0) Initialize μ_fp = 1 (the plain paraperspective approximation).
(1) Build the corrected measurement matrices W^(±) from the current μ^(±)_fp as in (6.2).
(2) Factor W^(±) by the paraperspective MAP factorization into the two solutions M^(±), S^(±).
(3) Recompute the depths from M^(±), S^(±) and update the μ^(±)_fp by (6.1).
(4) If the change in the μ^(±)_fp exceeds a tolerance ε, return to (1).
(5) Between the converged branches, keep the one whose reprojections x̂_fp are closest to the measurements x_fp over all f and p.

A sequential variant of the factorization itself was given by Morita and Kanade (1997), and sequential and recursive formulations for the MAP camera and for the perspective iteration above have followed (Fujiki (1998); Fujiki and Kurata (2000)).

7. Recursive factorization

7.1 A sufficient statistic for the motion

In a long image stream the measurement matrix W grows: each new frame appends two rows, while P and the rank 3 stay fixed, so batch refactorization at every frame is wasteful.
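The depth correction behind steps (1)-(3) can be isolated in the scaled orthographic case, where it is exact for uncentered image coordinates: multiplying each perspective image point by its depth ratio yields precisely the scaled orthographic image (2.4). A sketch with a made-up point cloud (l = 1):

```python
import numpy as np

rng = np.random.default_rng(4)
S = rng.standard_normal((3, 15))                # made-up shape (3 x P)
X = S.copy()
X[2] += 10.0 - S[2].mean()                      # push the cloud to mean depth 10
x_per = X[:2] / X[2]                            # perspective images, (2.1) with l = 1
mu = X[2] / X[2].mean()                         # depth ratios, cf. mu_fp in (6.1)
x_cor = mu * x_per                              # depth-corrected points, cf. (6.2)
# The corrected points equal the scaled orthographic projection (2.4) exactly,
# so one affine factorization applies once the depths are known; the real
# algorithm estimates the unknown depths iteratively.
print(np.allclose(x_cor, X[:2] / X[2].mean()))  # True
```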
The shape S is common to all frames, so everything the stream reveals about it enters W = M S through the motion matrix M. Writing M = (l_1, l_2, l_3) by columns, the metric computations use only the Gram matrix M^T M = ( l_i^T l_j )_{i,j}; any M' with

(7.1)  M'^T M' = M^T M

is equivalent for this purpose (and differs from M only by left multiplication by a column-orthonormal matrix). The 3 x 3 matrix M^T M is thus a sufficient statistic for the motion, and the full 2F x 3 matrix M need not be stored.
Concretely, take the SVD of the motion matrix,

(7.2)  M = F Λ E,

with F (2F x 3) column-orthonormal, Λ the diagonal matrix of singular values, and E a 3 x 3 orthogonal matrix. Then the 3 x 3 matrix

(7.3)  M' = Λ E

satisfies (7.1), and every solution X of X^T X = M^T M is X = U M' with U column-orthonormal, so M is determined from M' up to the irrelevant factor. Replacing M by M' compresses W to the 3 x P matrix

(7.4)  W' = M' S,   M' M'^T = Λ^2,

and when frame F+1 arrives, the (2F+2) x P stacked matrix (W^T, W^T_{F+1})^T can be replaced by the 5 x P matrix (W'^T, W^T_{F+1})^T. This loses nothing relevant:

(7.5)  W'^T W' = S^T M'^T M' S = S^T M^T M S = W^T W,

so all computations through W^T W (in particular the shape recovery) are unchanged. The recursive factorization therefore carries only Λ and E from frame to frame.
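The compression (7.2)-(7.5) can be sketched directly: the 3 x 3 matrix ΛE carries the same Gram matrix as the full motion matrix, and the per-frame update needs only it plus the two new rows.

```python
import numpy as np

rng = np.random.default_rng(5)
M = rng.standard_normal((12, 3))                 # motion matrix after F = 6 frames
# (7.2)-(7.3): M = F Lam E with F column-orthonormal, so Mc = Lam E (3 x 3)
# carries the same Gram matrix M^T M.
_, lam, Et = np.linalg.svd(M, full_matrices=False)
Mc = lam[:, None] * Et
print(np.allclose(Mc.T @ Mc, M.T @ M))           # sufficient statistic preserved
# Frame F+1 appends two rows; the update needs only Mc and the new rows:
M_new = rng.standard_normal((2, 3))
full = np.vstack([M, M_new])
compact = np.vstack([Mc, M_new])
print(np.allclose(compact.T @ compact, full.T @ full))
```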
One further ingredient is the alignment of successive shape estimates: two reconstructions S_ref and S_ob of the same points differ by an orthogonal transformation, estimated from the SVD S_ref S_ob^T = U D V^T as

(7.6)  E' = U V^T,   S_ref ≈ E' S_ob.

The recursive MAP factorization (Fujiki (1998)) then runs:

(0) Factor the first k (>= 3) frames in batch.
(1) At frame f+1 (f >= k), append the new rows to the compressed measurement matrix, W_[f+1] = (W_[f]^T, W^T_{f+1})^T, where W_[f] is represented through M'_[f] with M'_[f] M'^T_[f] = Λ^2_[f] = diag{λ_[f],1^2, λ_[f],2^2, λ_[f],3^2}.
(2) Update the metric constraints: the compressed past contributes M̂_[f] Q M̂_[f]^T = Λ^2_[f], and the new frame contributes V̂_{f+1} Q V̂^T_{f+1} = c_s I_2 (common scale c_s) as in (4.4). With M̂_[f] = (l̂_[f],1, l̂_[f],2, l̂_[f],3)^T, stack the coefficient vectors into

Ω_[f+1] = (ω_[f],11, ω_[f],22, ω_[f],33, ω_[f],12, ω_[f],13, ω_[f],23, ω̂_{f+1})^T,   ω_[f],ij = ω(l̂_[f],i, l̂_[f],j),

with right-hand side L_[f+1] = (λ_[f],1^2, λ_[f],2^2, λ_[f],3^2, 0_3^T, 1, 0, 1)^T.
Solving q = Ω^+_[f+1] L_[f+1] in the least-squares sense updates Q, the reflection being fixed by keeping det S_[f] S^T_[f+1] > 0 so that the orientation agrees with the previous frame, and the process returns to (1). The recursion also extends to the perspective iteration of Section 6 (Fujiki and Kurata (2000)).

8. Concluding remarks

The factorization approach extends in several further directions, among them multibody factorization for scenes with independently moving objects (Costeira and Kanade (1998); Gear (1998)) and factorization from line segments and planes with uncertainty models (Quan and Kanade (1997); Morris and Kanade (1998)).

References

Christy, S. and Horaud, R. (1996). Euclidean reconstruction: From paraperspective to perspective, Proceedings of 4th European Conference on Computer Vision, 2.
Costeira, J. P. and Kanade, T. (1998). A multibody factorization method for independently moving objects, International Journal of Computer Vision, 29(3).
Fujiki, J. (1997a). IEICE Technical Report, PRMU97-22 (in Japanese).
Fujiki, J. (1997b). IEICE Technical Report, PRMU (in Japanese).
Fujiki, J. (1998). IEICE Technical Report, PRMU (in Japanese).
Fujiki, J. and Kurata, T. (2000). Recursive factorization method for the paraperspective model based on the perspective projection, Proceedings of 15th International Conference on Pattern Recognition, 1.
Gear, C. W. (1998). Multibody grouping from motion images, International Journal of Computer Vision, 29(2).
Koenderink, J. J. and van Doorn, A. J. (1991). Affine structure from motion, Journal of the Optical Society of America A, 8(2).
Fujiki, J. et al. (1999a). Transactions of Information Processing Society of Japan, 40(8) (in Japanese).
Fujiki, J. et al. (1999b). IEICE Technical Report, PRMU (in Japanese).
Morita, T. and Kanade, T. (1997). A sequential factorization method for recovering shape and motion from image streams, IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(8).
Morris, D. D. and Kanade, T. (1998). A unified factorization algorithm for points, line segments and planes with uncertainty models, Proceedings of International Conference on Computer Vision 98.
Mundy, J. L. and Zisserman, A. (eds.) (1992). Geometric Invariance in Computer Vision, MIT Press, Cambridge, Massachusetts.
Poelman, C. J. and Kanade, T. (1997). A paraperspective factorization method for shape and motion recovery, IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(3).
Quan, L. (1996). Self-calibration of an affine camera from multiple views, International Journal of Computer Vision, 19(1).
Quan, L. and Kanade, T. (1997). Affine structure from line correspondences with uncalibrated affine cameras, IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(8).
Shi, J. and Tomasi, C. (1994). Good features to track, Proceedings of Computer Vision and Pattern Recognition 94.
Sturm, P. and Triggs, B. (1996). A factorization based algorithm for multi-image projective structure and motion, Proceedings of 4th European Conference on Computer Vision, 2.
Tomasi, C. and Kanade, T. (1992). Shape and motion from image streams under orthography: A factorization method, International Journal of Computer Vision, 9(2).
Ueshiba, T. and Tomita, F. (1998). A factorization method for projective and Euclidean reconstruction from multiple perspective views via iterative depth estimation, Proceedings of 5th European Conference on Computer Vision.
Ullman, S. (1979). The Interpretation of Visual Motion, MIT Press, Cambridge, Massachusetts.
Xu, G. and Sugimoto, N. (1998). A linear algorithm for motion from three weak perspective images using Euler angles, Proceedings of Asian Conference on Computer Vision 98.
Xu, G. and Zhang, Z. (1996). Epipolar Geometry in Stereo, Motion and Object Recognition: A Unified Approach, Kluwer, Dordrecht.
Yokoya, N., Takemura, H. and Hwang, K. (1998). A factorization method using 3-d linear combination for shape and motion recovery, Proceedings of International Conference on Pattern Recognition 98.
Proceedings of the Institute of Statistical Mathematics Vol. 49, No. 1, 107 (2001)

3D Reconstruction from Sequences of 2D Images under Point Correspondences:
A Mathematical Analysis of the Factorization Method

Jun Fujiki (National Institute of Advanced Industrial Science and Technology)

Recovering the camera motion and the object shape from multiple images is a fundamental and important problem in the field of computer vision, and its version under point correspondences is the most fundamental of all. Among the many methods proposed for it, the factorization method stands out: it is numerically stable and yields good reconstructions even though it is based on affine approximations of the perspective projection. The factorization method is useful not only for solving the problem but also for understanding its mathematical structure under affine approximated projection. This paper gives a mathematical analysis of recovering the camera motion and the object shape from multiple affine approximated projection images under point correspondences by the factorization method. It also considers how to recover the camera motion and the object shape from perspective images by estimating the affine approximated projection images from them, as well as the recursive factorization method.

Key words: Structure from motion, factorization method, recursive factorization, Metric Affine Projection, perspective projection.
Vol1-CVIM-172 No.7 21/5/27 1 Proposal on Ringing Detector for Image Restoration Chika Inoshita, Yasuhiro Mukaigawa and Yasushi Yagi 1 A lot of methods have been proposed for restoring blurred images due
More information2D-RCWA 1 two dimensional rigorous coupled wave analysis [1, 2] 1 ε(x, y) = 1 ε(x, y) = ϵ mn exp [+j(mk x x + nk y y)] (1) m,n= m,n= ξ mn exp [+j(mk x
2D-RCWA two dimensional rigoros copled wave analsis, 2] εx, εx, ϵ mn exp +jmk x x + nk ] m,n m,n ξ mn exp +jmk x x + nk ] 2 K x K x Λ x Λ ϵ mn ξ mn K x 2π Λ x K 2π Λ ϵ mn ξ mn Λ x Λ x Λ x Λ x Λx Λ Λx Λ
More informationS04S006 S04S023
S04S006 S04S023 0 2 1 3 2 7 3 16 4 19 5 Poincaré-Hopf 22 6 25 A 1 27 1 0 Poincaré-Hopf X V X = c 1 c n p p V i(v ; p) V i(v ) c i V c i s(v ; c i ) n i(v ) = χ(x) (s(v ; c i ) 1) i=1 χ(x) X 1 2 3 3 Poincaré-Hopf
More information2016 1. 2. 3. 4. 5. 6. 7. 8. 9. 10. 1 16 2 1 () X O 3 (O1) X O, O (O2) O O (O3) O O O X (X, O) O X X (O1), (O2), (O3) (O2) (O3) n (O2) U 1,..., U n O U k O k=1 (O3) U λ O( λ Λ) λ Λ U λ O 0 X 0 (O2) n =
More information1 M = (M, g) m Riemann N = (N, h) n Riemann M N C f : M N f df : T M T N M T M f N T N M f 1 T N T M f 1 T N C X, Y Γ(T M) M C T M f 1 T N M Levi-Civi
1 Surveys in Geometry 1980 2 6, 7 Harmonic Map Plateau Eells-Sampson [5] Siu [19, 20] Kähler 6 Reports on Global Analysis [15] Sacks- Uhlenbeck [18] Siu-Yau [21] Frankel Siu Yau Frankel [13] 1 Surveys
More information2 1 κ c(t) = (x(t), y(t)) ( ) det(c (t), c x (t)) = det (t) x (t) y (t) y = x (t)y (t) x (t)y (t), (t) c (t) = (x (t)) 2 + (y (t)) 2. c (t) =
1 1 1.1 I R 1.1.1 c : I R 2 (i) c C (ii) t I c (t) (0, 0) c (t) c(i) c c(t) 1.1.2 (1) (2) (3) (1) r > 0 c : R R 2 : t (r cos t, r sin t) (2) C f : I R c : I R 2 : t (t, f(t)) (3) y = x c : R R 2 : t (t,
More information006 11 8 0 3 1 5 1.1..................... 5 1......................... 6 1.3.................... 6 1.4.................. 8 1.5................... 8 1.6................... 10 1.6.1......................
More information1 Table 1: Identification by color of voxel Voxel Mode of expression Nothing Other 1 Orange 2 Blue 3 Yellow 4 SSL Humanoid SSL-Vision 3 3 [, 21] 8 325
社団法人人工知能学会 Japanese Society for Artificial Intelligence 人工知能学会研究会資料 JSAI Technical Report SIG-Challenge-B3 (5/5) RoboCup SSL Humanoid A Proposal and its Application of Color Voxel Server for RoboCup SSL
More informationmeiji_resume_1.PDF
β β β (q 1,q,..., q n ; p 1, p,..., p n ) H(q 1,q,..., q n ; p 1, p,..., p n ) Hψ = εψ ε k = k +1/ ε k = k(k 1) (x, y, z; p x, p y, p z ) (r; p r ), (θ; p θ ), (ϕ; p ϕ ) ε k = 1/ k p i dq i E total = E
More information: , 2.0, 3.0, 2.0, (%) ( 2.
2017 1 2 1.1...................................... 2 1.2......................................... 4 1.3........................................... 10 1.4................................. 14 1.5..........................................
More informationIPSJ SIG Technical Report iphone iphone,,., OpenGl ES 2.0 GLSL(OpenGL Shading Language), iphone GPGPU(General-Purpose Computing on Graphics Proc
iphone 1 1 1 iphone,,., OpenGl ES 2.0 GLSL(OpenGL Shading Language), iphone GPGPU(General-Purpose Computing on Graphics Processing Unit)., AR Realtime Natural Feature Tracking Library for iphone Makoto
More information11) 13) 11),12) 13) Y c Z c Image plane Y m iy O m Z m Marker coordinate system T, d X m f O c X c Camera coordinate system 1 Coordinates and problem
1 1 1 Posture Esimation by Using 2-D Fourier Transform Yuya Ono, 1 Yoshio Iwai 1 and Hiroshi Ishiguro 1 Recently, research fields of augmented reality and robot navigation are actively investigated. Estimating
More information,. Black-Scholes u t t, x c u 0 t, x x u t t, x c u t, x x u t t, x + σ x u t, x + rx ut, x rux, t 0 x x,,.,. Step 3, 7,,, Step 6., Step 4,. Step 5,,.
9 α ν β Ξ ξ Γ γ o δ Π π ε ρ ζ Σ σ η τ Θ θ Υ υ ι Φ φ κ χ Λ λ Ψ ψ µ Ω ω Def, Prop, Th, Lem, Note, Remark, Ex,, Proof, R, N, Q, C [a, b {x R : a x b} : a, b {x R : a < x < b} : [a, b {x R : a x < b} : a,
More informationばらつき抑制のための確率最適制御
( ) http://wwwhayanuemnagoya-uacjp/ fujimoto/ 2011 3 9 11 ( ) 2011/03/09-11 1 / 46 Outline 1 2 3 4 5 ( ) 2011/03/09-11 2 / 46 Outline 1 2 3 4 5 ( ) 2011/03/09-11 3 / 46 (1/2) r + Controller - u Plant y
More information1 3DCG [2] 3DCG CG 3DCG [3] 3DCG 3 3 API 2 3DCG 3 (1) Saito [4] (a) 1920x1080 (b) 1280x720 (c) 640x360 (d) 320x G-Buffer Decaudin[5] G-Buffer D
3DCG 1) ( ) 2) 2) 1) 2) Real-Time Line Drawing Using Image Processing and Deforming Process Together in 3DCG Takeshi Okuya 1) Katsuaki Tanaka 2) Shigekazu Sakai 2) 1) Department of Intermedia Art and Science,
More informationagora04.dvi
Workbook E-mail: kawahira@math.nagoya-u.ac.jp 2004 8 9, 10, 11 1 2 1 2 a n+1 = pa n + q x = px + q a n better 2 a n+1 = aan+b ca n+d 1 (a, b, c, d) =(p, q, 0, 1) 1 = 0 3 2 2 2 f(z) =z 2 + c a n+1 = a 2
More informationN cos s s cos ψ e e e e 3 3 e e 3 e 3 e
3 3 5 5 5 3 3 7 5 33 5 33 9 5 8 > e > f U f U u u > u ue u e u ue u ue u e u e u u e u u e u N cos s s cos ψ e e e e 3 3 e e 3 e 3 e 3 > A A > A E A f A A f A [ ] f A A e > > A e[ ] > f A E A < < f ; >
More informationNote.tex 2008/09/19( )
1 20 9 19 2 1 5 1.1........................ 5 1.2............................. 8 2 9 2.1............................. 9 2.2.............................. 10 3 13 3.1.............................. 13 3.2..................................
More informationI A A441 : April 15, 2013 Version : 1.1 I Kawahira, Tomoki TA (Shigehiro, Yoshida )
I013 00-1 : April 15, 013 Version : 1.1 I Kawahira, Tomoki TA (Shigehiro, Yoshida) http://www.math.nagoya-u.ac.jp/~kawahira/courses/13s-tenbou.html pdf * 4 15 4 5 13 e πi = 1 5 0 5 7 3 4 6 3 6 10 6 17
More informationI
I 6 4 10 1 1 1.1............... 1 1................ 1 1.3.................... 1.4............... 1.4.1.............. 1.4................. 1.4.3........... 3 1.4.4.. 3 1.5.......... 3 1.5.1..............
More informationproc.dvi
M. D. Wheler Cyra Technologies, Inc. 3 3 CAD albedo Mapping textures on 3D geometric model using reflectance image Ryo Kurazume M. D. Wheler Katsushi Ikeuchi The University oftokyo Cyra Technologies, Inc.
More informationohpr.dvi
2003/12/04 TASK PAF A. Fukuyama et al., Comp. Phys. Rep. 4(1986) 137 A. Fukuyama et al., Nucl. Fusion 26(1986) 151 TASK/WM MHD ψ θ ϕ ψ θ e 1 = ψ, e 2 = θ, e 3 = ϕ ϕ E = E 1 e 1 + E 2 e 2 + E 3 e 3 J :
More informationII A A441 : October 02, 2014 Version : Kawahira, Tomoki TA (Kondo, Hirotaka )
II 214-1 : October 2, 214 Version : 1.1 Kawahira, Tomoki TA (Kondo, Hirotaka ) http://www.math.nagoya-u.ac.jp/~kawahira/courses/14w-biseki.html pdf 1 2 1 9 1 16 1 23 1 3 11 6 11 13 11 2 11 27 12 4 12 11
More information& Vol.5 No (Oct. 2015) TV 1,2,a) , Augmented TV TV AR Augmented Reality 3DCG TV Estimation of TV Screen Position and Ro
TV 1,2,a) 1 2 2015 1 26, 2015 5 21 Augmented TV TV AR Augmented Reality 3DCG TV Estimation of TV Screen Position and Rotation Using Mobile Device Hiroyuki Kawakita 1,2,a) Toshio Nakagawa 1 Makoto Sato
More information重力方向に基づくコントローラの向き決定方法
( ) 2/Sep 09 1 ( ) ( ) 3 2 X w, Y w, Z w +X w = +Y w = +Z w = 1 X c, Y c, Z c X c, Y c, Z c X w, Y w, Z w Y c Z c X c 1: X c, Y c, Z c Kentaro Yamaguchi@bandainamcogames.co.jp 1 M M v 0, v 1, v 2 v 0 v
More information光学
Range Image Sensors Using Active Stereo Methods Kazunori UMEDA and Kenji TERABAYASHI Active stereo methods, which include the traditional light-section method and the talked-about Kinect sensor, are typical
More informationIPSJ SIG Technical Report Vol.2015-CVIM-196 No /3/6 1,a) 1,b) 1,c) U,,,, The Camera Position Alignment on a Gimbal Head for Fixed Viewpoint Swi
1,a) 1,b) 1,c) U,,,, The Camera Position Alignment on a Gimbal Head for Fixed Viewpoint Swiveling using a Misalignment Model Abstract: When the camera sets on a gimbal head as a fixed-view-point, it is
More information1).1-5) - 9 -
- 8 - 1).1-5) - 9 - ε = ε xx 0 0 0 ε xx 0 0 0 ε xx (.1 ) z z 1 z ε = ε xx ε x y 0 - ε x y ε xx 0 0 0 ε zz (. ) 3 xy ) ε xx, ε zz» ε x y (.3 ) ε ij = ε ij ^ (.4 ) 6) xx, xy ε xx = ε xx + i ε xx ε xy = ε
More informationyasi10.dvi
2002 50 2 259 278 c 2002 1 2 2002 2 14 2002 6 17 73 PML 1. 1997 1998 Swiss Re 2001 Canabarro et al. 1998 2001 1 : 651 0073 1 5 1 IHD 3 2 110 0015 3 3 3 260 50 2 2002, 2. 1 1 2 10 1 1. 261 1. 3. 3.1 2 1
More informationPart () () Γ Part ,
Contents a 6 6 6 6 6 6 6 7 7. 8.. 8.. 8.3. 8 Part. 9. 9.. 9.. 3. 3.. 3.. 3 4. 5 4.. 5 4.. 9 4.3. 3 Part. 6 5. () 6 5.. () 7 5.. 9 5.3. Γ 3 6. 3 6.. 3 6.. 3 6.3. 33 Part 3. 34 7. 34 7.. 34 7.. 34 8. 35
More information(3.6 ) (4.6 ) 2. [3], [6], [12] [7] [2], [5], [11] [14] [9] [8] [10] (1) Voodoo 3 : 3 Voodoo[1] 3 ( 3D ) (2) : Voodoo 3D (3) : 3D (Welc
1,a) 1,b) Obstacle Detection from Monocular On-Vehicle Camera in units of Delaunay Triangles Abstract: An algorithm to detect obstacles by using a monocular on-vehicle video camera is developed. Since
More informationReal AdaBoost HOG 2009 3 A Graduation Thesis of College of Engineering, Chubu University Efficient Reducing Method of HOG Features for Human Detection based on Real AdaBoost Chika Matsushima ITS Graphics
More information1 Web [2] Web [3] [4] [5], [6] [7] [8] S.W. [9] 3. MeetingShelf Web MeetingShelf MeetingShelf (1) (2) (3) (4) (5) Web MeetingShelf
1,a) 2,b) 4,c) 3,d) 4,e) Web A Review Supporting System for Whiteboard Logging Movies Based on Notes Timeline Taniguchi Yoshihide 1,a) Horiguchi Satoshi 2,b) Inoue Akifumi 4,c) Igaki Hiroshi 3,d) Hoshi
More informationA Feasibility Study of Direct-Mapping-Type Parallel Processing Method to Solve Linear Equations in Load Flow Calculations Hiroaki Inayoshi, Non-member
A Feasibility Study of Direct-Mapping-Type Parallel Processing Method to Solve Linear Equations in Load Flow Calculations Hiroaki Inayoshi, Non-member (University of Tsukuba), Yasuharu Ohsawa, Member (Kobe
More informationTitle 混合体モデルに基づく圧縮性流体と移動する固体の熱連成計算手法 Author(s) 鳥生, 大祐 ; 牛島, 省 Citation 土木学会論文集 A2( 応用力学 ) = Journal of Japan Civil Engineers, Ser. A2 (2017), 73 Issue
Title 混合体モデルに基づく圧縮性流体と移動する固体の熱連成計算手法 Author(s) 鳥生, 大祐 ; 牛島, 省 Citation 土木学会論文集 A2( 応用力学 ) = Journal of Japan Civil Engineers, Ser. A2 (2017), 73 Issue Date 2017 URL http://hdl.handle.net/2433/229150 Right
More informationnote4.dvi
10 016 6 0 4 (quantum wire) 4.1 4.1.1.6.1, 4.1(a) V Q N dep ( ) 4.1(b) w σ E z (d) E z (d) = σ [ ( ) ( )] x w/ x+w/ π+arctan arctan πǫǫ 0 d d (4.1) à ƒq [ƒg w ó R w d V( x) QŽŸŒ³ džq x (a) (b) 4.1 (a)
More information20 Method for Recognizing Expression Considering Fuzzy Based on Optical Flow
20 Method for Recognizing Expression Considering Fuzzy Based on Optical Flow 1115084 2009 3 5 3.,,,.., HCI(Human Computer Interaction),.,,.,,.,.,,..,. i Abstract Method for Recognizing Expression Considering
More informationJIS Z803: (substitution method) 3 LCR LCR GPIB
LCR NMIJ 003 Agilent 8A 500 ppm JIS Z803:000 50 (substitution method) 3 LCR LCR GPIB Taylor 5 LCR LCR meter (Agilent 8A: Basic accuracy 500 ppm) V D z o I V DUT Z 3 V 3 I A Z V = I V = 0 3 6 V, A LCR meter
More informationIPSJ SIG Technical Report 1,a) 1,b) 1,c) 1,d) 2,e) 2,f) 2,g) 1. [1] [2] 2 [3] Osaka Prefecture University 1 1, Gakuencho, Naka, Sakai,
1,a) 1,b) 1,c) 1,d) 2,e) 2,f) 2,g) 1. [1] [2] 2 [3] 1 599 8531 1 1 Osaka Prefecture University 1 1, Gakuencho, Naka, Sakai, Osaka 599 8531, Japan 2 565 0871 Osaka University 1 1, Yamadaoka, Suita, Osaka
More information1 (Berry,1975) 2-6 p (S πr 2 )p πr 2 p 2πRγ p p = 2γ R (2.5).1-1 : : : : ( ).2 α, β α, β () X S = X X α X β (.1) 1 2
2005 9/8-11 2 2.2 ( 2-5) γ ( ) γ cos θ 2πr πρhr 2 g h = 2γ cos θ ρgr (2.1) γ = ρgrh (2.2) 2 cos θ θ cos θ = 1 (2.2) γ = 1 ρgrh (2.) 2 2. p p ρgh p ( ) p p = p ρgh (2.) h p p = 2γ r 1 1 (Berry,1975) 2-6
More information第5章 偏微分方程式の境界値問題
October 5, 2018 1 / 113 4 ( ) 2 / 113 Poisson 5.1 Poisson ( A.7.1) Poisson Poisson 1 (A.6 ) Γ p p N u D Γ D b 5.1.1: = Γ D Γ N 3 / 113 Poisson 5.1.1 d {2, 3} Lipschitz (A.5 ) Γ D Γ N = \ Γ D Γ p Γ N Γ
More information