Deepening and Expansion of Large-Scale Random Fields and Probabilistic Image Processing

Kazuyuki Tanaka
Graduate School of Information Sciences, Tohoku University

The mathematical frameworks of probabilistic image processing are formulated by means of Markov random fields and Bayesian inference. Some practical algorithms are constructed by applying belief propagation methods. Markov random fields and belief propagation methods can be regarded as one of the statistical-mechanical approaches to probabilistic information processing. In the present paper, we review some fundamental frameworks and statistical-mechanical approaches of probabilistic image processing.

1. Introduction

In probabilistic image processing, the process by which an image arises is expressed as a probabilistic model, a generative model, and tasks such as noise reduction are treated as statistical inference on that model. The inference is organized around the Bayes formula: a prior probability is assigned to the original image, the degradation process connects the original image to the observed one, and the posterior probability of the original image given the observation is derived from the two; drawing conclusions from the posterior in this way is called Bayesian inference. Carrying the inference out exactly raises a problem of computational complexity, because the number of candidate images grows exponentially with the number of pixels. Belief propagation supplies practical approximate algorithms for this computation. Markov random fields and belief propagation can in turn be viewed as part of the statistical-mechanical approach to probabilistic information processing, a field often called statistical-mechanical informatics 1)-3). In the present paper we review the fundamental framework of probabilistic image processing from this standpoint.

2. Prior Probability of Images

A digital image is a collection of pixels, and the light intensity at each pixel is treated as a random variable, so that the image as a whole is described by a joint probability. Let V denote the set of pixels, V = {1, 2, ..., |V|}, and let x_i denote the intensity at pixel i, collected into the vector x = (x_1, x_2, ..., x_|V|)^T. We call x_i the state variable of pixel i and x the state vector. The random variable for pixel i is written X_i, and X = (X_1, X_2, ..., X_|V|)^T is the corresponding random vector. The probability that the image takes the configuration x is written Pr{X = x}.

Natural images tend to vary smoothly: neighboring pixels usually take similar intensities. As a prior probability expressing this tendency, consider

Pr{X = x} ∝ ∏_{{i,j}∈E} exp( −α(x_i − x_j)² ),   (1)

where E is the set of nearest-neighbor pixel pairs {i, j} and α > 0. For example, for an image of |V| = 12 pixels, V = {1, 2, ..., 12}, arranged in 3 rows and 4 columns,

E = { {1,2}, {2,3}, {3,4}, {5,6}, {6,7}, {7,8}, {9,10}, {10,11}, {11,12}, {1,5}, {2,6}, {3,7}, {4,8}, {5,9}, {6,10}, {7,11}, {8,12} }.   (2)

The pair (V, E) is a graph whose nodes are the pixels and whose edges connect nearest neighbors; the state vector is x = (x_1, x_2, ..., x_12)^T, and the probability distribution Pr{X = x} of Eq. (1) is defined on this graph.

Fig. 1 The graph (V, E) for the 12-pixel image: the node set V = {1, 2, ..., 12} and the nearest-neighbor edge set E of Eq. (2).

In what follows we mainly consider binary images (Q = 2), in which each x_i takes the value 0 or 1. As the simplest example, take V = {1, 2} and E = { {1,2} }, so that x ranges over the four state vectors (0,0)^T, (0,1)^T, (1,0)^T, (1,1)^T on the graph (V, E). Under Eq. (1),

Pr{X = (0,0)^T} = Pr{X = (1,1)^T} > Pr{X = (0,1)^T} = Pr{X = (1,0)^T},   (3)

that is, the configurations in which the two pixels agree are the more probable ones (Fig. 2).

Fig. 2 For V = {1, 2} and E = { {1,2} }, the four state vectors x = (0,0)^T, (0,1)^T, (1,0)^T, (1,1)^T and their probabilities Pr{X = x} under Eq. (1).
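To make Eqs. (1)-(3) concrete, here is a minimal sketch in plain Python with NumPy (0-based pixel indices and α = 1.0 are choices made here, not values from the text above). It builds the edge set of Eq. (2), normalizes the prior exactly by enumerating all 2^12 binary images, and checks the ordering (3) on the two-pixel example.

```python
import itertools
import numpy as np

def grid_edges(rows, cols):
    """Nearest-neighbor pairs {i, j} (0-based) of a rows x cols pixel grid."""
    edges = []
    for r in range(rows):
        for c in range(cols):
            i = r * cols + c
            if c + 1 < cols:
                edges.append((i, i + 1))      # horizontal neighbor
            if r + 1 < rows:
                edges.append((i, i + cols))   # vertical neighbor
    return edges

def prior_weight(x, edges, alpha):
    """Unnormalized Pr{X = x} of Eq. (1)."""
    return np.exp(-alpha * sum((x[i] - x[j]) ** 2 for i, j in edges))

alpha = 1.0
edges = grid_edges(3, 4)
assert len(edges) == 17          # the 17 pairs listed in Eq. (2)

# Exact normalization by enumerating all 2^12 binary images (Q = 2).
states = list(itertools.product([0, 1], repeat=12))
weights = np.array([prior_weight(x, edges, alpha) for x in states])
probs = weights / weights.sum()
print("most probable:", states[int(np.argmax(probs))])  # a uniform image

# Two-pixel example of Eq. (3): agreeing configurations are more probable.
w2 = {x: prior_weight(x, [(0, 1)], alpha)
      for x in itertools.product([0, 1], repeat=2)}
z2 = sum(w2.values())
print({x: round(w2[x] / z2, 3) for x in w2})
```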

As a second example, consider a 3 × 3 image with

V = {1, 2, 3, 4, 5, 6, 7, 8, 9}   (4)

and

E = { {1,2}, {2,3}, {4,5}, {5,6}, {7,8}, {8,9}, {1,4}, {2,5}, {3,6}, {4,7}, {5,8}, {6,9} }.   (5)

Pixel 5 sits at the center of the graph (V, E). Fix the intensities of all the other pixels to x_i = 0 (i ∈ V\{5}) and compare the value of Pr{X = x} for x_5 = 0 with that for x_5 = 1: the configuration with x_5 = 0 has the larger probability (Fig. 3).

Fig. 3 For V = {1, 2, ..., 9} and the edge set E of Eq. (5), the probabilities Pr{X = x} of Eq. (1) with x_i = 0 (i ∈ V\{5}) and with x_5 = 0 or x_5 = 1; the probability is larger for x_5 = 0.

Next return to the 12-pixel image with V = {1, 2, ..., 12} and the edge set E of Eq. (2), and consider pixels 6 and 7 (Fig. 4). Fix x_i = 0 for i ∈ {1, 2, 5, 9, 10} and x_i = 1 for i ∈ {3, 4, 8, 11, 12}, and compare Pr{X = x} over the four choices (x_6, x_7) = (0,0), (0,1), (1,0), (1,1): the probability is largest for x_6 = 0 and x_7 = 1, the choice that continues the pattern of the fixed pixels, with the left half of the image 0 and the right half 1.

Fig. 4 For V = {1, 2, ..., 12} and the edge set E of Eq. (2), the probabilities Pr{X = x} with x_i = 0 (i ∈ {1, 2, 5, 9, 10}), x_i = 1 (i ∈ {3, 4, 8, 11, 12}) and the four choices of (x_6, x_7); x_6 = 0, x_7 = 1 is favored.

In general, under the distribution Pr{X = x} of Eq. (1), the most probable configurations are the uniform ones x_1 = x_2 = ... = x_|V|, and, more broadly, the fewer the nearest-neighbor pairs {i, j} ∈ E with x_i ≠ x_j, the higher the probability of x.

A state vector x = (x_1, x_2, ..., x_|V|)^T distributed according to Pr{X = x} of Eq. (1) can be generated by the Markov chain Monte Carlo (MCMC) method 4),5). Figure 5 shows images generated in this way for several values of α.

Fig. 5 Images x = (x_1, x_2, ..., x_|V|)^T generated from the distribution Pr{X = x} of Eq. (1) by the Markov chain Monte Carlo method for several values of α.
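As one concrete MCMC scheme (the text above cites MCMC only generally, refs. 4),5); Gibbs sampling, the sweep count, and the random seed are assumptions made here), the following sketch resamples one pixel at a time from its local conditional distribution, which by the Markov property of Eq. (26) below depends only on that pixel's nearest neighbors.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sample_prior(rows, cols, alpha, q=2, sweeps=200):
    """Draw an approximate sample x from the prior of Eq. (1) on a
    rows x cols grid by Gibbs sampling, one standard MCMC scheme."""
    x = rng.integers(0, q, size=(rows, cols))
    levels = np.arange(q)
    for _ in range(sweeps):
        for r in range(rows):
            for c in range(cols):
                nbrs = []
                if r > 0: nbrs.append(x[r - 1, c])
                if r < rows - 1: nbrs.append(x[r + 1, c])
                if c > 0: nbrs.append(x[r, c - 1])
                if c < cols - 1: nbrs.append(x[r, c + 1])
                # Unnormalized local conditional: prod_k exp(-alpha (z - x_k)^2)
                w = np.exp(-alpha * sum((levels - v) ** 2 for v in nbrs))
                x[r, c] = rng.choice(levels, p=w / w.sum())
    return x

# Larger alpha favors smoother samples (fewer disagreeing neighbor pairs),
# mirroring the progression shown in Fig. 5.
for a in (0.1, 0.5, 1.5):
    print(a, gibbs_sample_prior(16, 16, a), sep="\n")
```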

To quantify how smooth a configuration x is, count the nearest-neighbor pairs whose intensities differ:

K(x) = ∑_{{i,j}∈E} (x_i − x_j)² = ∑_{{i,j}∈E} (1 − δ_{x_i, x_j}),   (6)

where the second equality holds for binary images, so that K(x) is the number of pairs {i, j} ∈ E with x_i ≠ x_j, and K(x)/|E| is the fraction of disagreeing pairs. The smaller K(x) is, the larger Pr{X = x} is. Let A = {x : K(x)/|E| ≤ c} be the set of configurations in which at most a small fraction c of the pairs disagree, let Ā = {x : K(x)/|E| > c} be its complement, and let Pr{A} = ∑_{x∈A} Pr{X = x} and Pr{Ā} be the total probabilities of the two sets. Although every individual x ∈ A has a larger probability than any x ∈ Ā, the set Ā contains vastly more configurations, so which set dominates depends on α. When α is sufficiently large, configurations with x_i = x_j for almost all {i, j} ∈ E dominate and

Pr{A} > Pr{Ā},   (7)

whereas when α is small,

Pr{A} < Pr{Ā}.   (8)

The generated images in Fig. 5 change character accordingly: as α grows they turn from noise-like into nearly uniform, and the crossover between the regimes (8) and (7) occurs at an intermediate value of α.

This change can be captured quantitatively by the covariance

Cov[X_i, X_j] = ∑_{x_1=0}^{Q−1} ∑_{x_2=0}^{Q−1} ... ∑_{x_|V|=0}^{Q−1} (x_i − μ_i)(x_j − μ_j) Pr{X = x}   ({i, j} ∈ E),   (9)

where μ_i is the expectation of X_i,

μ_i = ∑_{x_1=0}^{Q−1} ∑_{x_2=0}^{Q−1} ... ∑_{x_|V|=0}^{Q−1} x_i Pr{X = x}.   (10)

In the limit |V| → +∞ the covariance (9), plotted as a function of α in Fig. 6, loses its analyticity at α = 0.767...; in the terminology of statistical mechanics this singularity is a phase transition.

Fig. 6 The covariance Cov[X_i, X_j] of Eq. (9) in the limit |V| → +∞ as a function of α.

A binarized version of an ordinary natural image has a character similar to the images generated from Eq. (1) in Fig. 5 (Fig. 7), and Fig. 8 shows an image x generated in the same way with Q = 256 intensity levels.
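The quantities in Eqs. (6)-(10) can be evaluated exactly on the 12-pixel image by exhaustive enumeration, as in the sketch below (the threshold c = 0.1 and the α values are illustrative choices; at |V| = 12 the crossover between (7) and (8) is gradual, and the sharp singularity at α = 0.767... emerges only in the limit |V| → +∞).

```python
import itertools
import numpy as np

def grid_edges(rows, cols):
    edges = []
    for r in range(rows):
        for c in range(cols):
            i = r * cols + c
            if c + 1 < cols:
                edges.append((i, i + 1))
            if r + 1 < rows:
                edges.append((i, i + cols))
    return edges

edges = grid_edges(3, 4)                       # the 12-pixel image of Eq. (2)
states = np.array(list(itertools.product([0, 1], repeat=12)))

# K(x) of Eq. (6): the number of disagreeing nearest-neighbor pairs.
K = sum((states[:, i] != states[:, j]).astype(int) for i, j in edges)

c = 0.1                                        # illustrative threshold on K(x)/|E|
for alpha in (0.1, 0.5, 1.0, 2.0):
    probs = np.exp(-alpha * K)                 # binary case: (x_i - x_j)^2 = 1 - delta
    probs = probs / probs.sum()
    in_A = K / len(edges) <= c
    mu = states.T @ probs                      # mu_i of Eq. (10)
    i, j = edges[0]
    cov = ((states[:, i] - mu[i]) * (states[:, j] - mu[j])) @ probs
    print(f"alpha={alpha}: Pr{{A}}={probs[in_A].sum():.3f}",
          f"Pr{{A-bar}}={probs[~in_A].sum():.3f}  Cov={cov:.4f}")
```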

Now fix, in the 3 × 3 example, the intensities of the pixels in V\{5} and ask for the probability of the intensity of pixel 5. By the definition of conditional probability,

Pr{X_5 = x_5 | X_1 = 0, X_2 = 0, X_3 = 0, X_4 = 0, X_6 = 0, X_7 = 0, X_8 = 0, X_9 = 0}
  = Pr{X_1 = 0, X_2 = 0, X_3 = 0, X_4 = 0, X_5 = x_5, X_6 = 0, X_7 = 0, X_8 = 0, X_9 = 0} / Pr{X_1 = 0, X_2 = 0, X_3 = 0, X_4 = 0, X_6 = 0, X_7 = 0, X_8 = 0, X_9 = 0},   (11)

where the denominator is obtained from the numerator by summing over the values of x_5:

Pr{X_1 = 0, X_2 = 0, X_3 = 0, X_4 = 0, X_6 = 0, X_7 = 0, X_8 = 0, X_9 = 0} = ∑_{x_5=0}^{1} Pr{X_1 = 0, X_2 = 0, X_3 = 0, X_4 = 0, X_5 = x_5, X_6 = 0, X_7 = 0, X_8 = 0, X_9 = 0}.   (12)

Fig. 7 Similarity between a binarized ordinary natural image and an image generated from Eq. (1) in Fig. 5.

Since expressions such as Eqs. (11)-(12) become long, we introduce the notation

X_V = (X_1, X_2, X_3, X_4, X_5, X_6, X_7, X_8, X_9)^T,
X_{V\{5}} = (X_1, X_2, X_3, X_4, X_6, X_7, X_8, X_9)^T,
x_V = (x_1, x_2, x_3, x_4, x_5, x_6, x_7, x_8, x_9)^T,
x_{V\{5}} = (x_1, x_2, x_3, x_4, x_6, x_7, x_8, x_9)^T.   (13)

With this notation, Eqs. (11)-(12) say that the probability of the intensity of pixel 5, when the pixels other than 5 have the intensities x_i (i ∈ V\{5}), is given by

Pr{X_5 = x_5 | X_{V\{5}} = x_{V\{5}}} = Pr{X_V = x_V} / Pr{X_{V\{5}} = x_{V\{5}}}   (14)
  = Pr{X_V = x_V} / ∑_{x_5=0}^{1} Pr{X_V = x_V}.   (15)

Fig. 8 An image x generated from the probability distribution of Eq. (1) by the Markov chain Monte Carlo method.

Substituting Eq. (1) into Eqs. (14)-(15) yields

Pr{X_5 = x_5 | X_{V\{5}} = x_{V\{5}}} ∝ ∏_{j∈∂5} exp( −α(x_5 − x_j)² ),   (16)

where

∂5 = {2, 4, 6, 8}   (17)

denotes the set of all nearest-neighbor pixels of pixel 5. Equation (16) shows that

Pr{X_5 = x_5 | X_{V\{5}} = x_{V\{5}}} = Pr{X_5 = x_5 | X_k = x_k, k ∈ ∂5}   (18)

holds.
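The identity (18) can be checked numerically. The sketch below (0-based pixel indices, an arbitrarily chosen surrounding configuration, and α = 0.9 are choices made here) compares the full conditional of Eqs. (14)-(15) with the nearest-neighbor formula of Eq. (16) on the 3 × 3 image.

```python
import numpy as np

# 3x3 image; pixels 1..9 of Eq. (4) stored 0-based, edges E of Eq. (5).
edges = [(0,1),(1,2),(3,4),(4,5),(6,7),(7,8),(0,3),(1,4),(2,5),(3,6),(4,7),(5,8)]
alpha = 0.9

def weight(x):
    """Unnormalized Pr{X = x} of Eq. (1)."""
    return np.exp(-alpha * sum((x[i] - x[j]) ** 2 for i, j in edges))

# Arbitrary fixed intensities for the pixels other than pixel 5 (0-based index 4).
others = dict(zip([0, 1, 2, 3, 5, 6, 7, 8], [0, 1, 1, 0, 1, 0, 0, 1]))

def full_conditional(x5):
    """Conditional of pixel 5 given all other pixels, Eqs. (14)-(15)."""
    num = weight([others.get(i, x5) for i in range(9)])
    den = sum(weight([others.get(i, z) for i in range(9)]) for z in (0, 1))
    return num / den

def local_conditional(x5):
    """Eq. (16): pixel 5's neighbors {2,4,6,8} are 0-based {1,3,5,7}."""
    nbrs = [others[k] for k in (1, 3, 5, 7)]
    w = [np.exp(-alpha * sum((z - v) ** 2 for v in nbrs)) for z in (0, 1)]
    return w[x5] / sum(w)

for x5 in (0, 1):
    print(x5, full_conditional(x5), local_conditional(x5))  # the two agree, Eq. (18)
```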

In the same way, consider the 12-pixel example of Fig. 1 when the states of the pixels in V\{6, 7} are fixed. For the pair of pixels 6 and 7 we similarly write

X_V = (X_1, X_2, ..., X_12)^T,
X_{V\{6,7}} = (X_1, X_2, X_3, X_4, X_5, X_8, X_9, X_10, X_11, X_12)^T,
x_V = (x_1, x_2, ..., x_12)^T,
x_{V\{6,7}} = (x_1, x_2, x_3, x_4, x_5, x_8, x_9, x_10, x_11, x_12)^T,   (19)

and consider the joint probability of the intensities of pixels 6 and 7 given the remaining pixels:

Pr{X_6 = x_6, X_7 = x_7 | X_{V\{6,7}} = x_{V\{6,7}}} = Pr{X_V = x_V} / Pr{X_{V\{6,7}} = x_{V\{6,7}}}   (20)
  = Pr{X_V = x_V} / ∑_{x_6=0}^{1} ∑_{x_7=0}^{1} Pr{X_V = x_V}.   (21)

Substituting Eq. (1) into Eqs. (20)-(21) yields

Pr{X_6 = x_6, X_7 = x_7 | X_{V\{6,7}} = x_{V\{6,7}}}
  ∝ ( ∏_{k∈∂6\{7}} exp( −α(x_6 − x_k)² ) ) exp( −α(x_6 − x_7)² ) ( ∏_{k∈∂7\{6}} exp( −α(x_7 − x_k)² ) ),   (22)

where

∂6 = {2, 5, 7, 10},  ∂7 = {3, 6, 8, 11}.   (23)

Equation (22) depends on x_{V\{6,7}} only through the pixels adjacent to 6 or 7, so that

Pr{X_6 = x_6, X_7 = x_7 | X_{V\{6,7}} = x_{V\{6,7}}} = Pr{X_6 = x_6, X_7 = x_7 | X_k = x_k, k ∈ ∂{6, 7}},   (24)

where ∂{6, 7} = (∂6 ∪ ∂7)\{6, 7}.

Properties (18) and (24) generalize as follows. A random field X_V whose conditional probabilities satisfy

Pr{X_i = x_i | X_{V\{i}} = x_{V\{i}}} = Pr{X_i = x_i | X_k = x_k, k ∈ ∂i}   (∀ i ∈ V)   (25)

is called a Markov random field (MRF), where ∂i denotes the set of nearest-neighbor pixels of pixel i. In particular, the probability distribution (1) on the graph (V, E) satisfies

Pr{X_i = x_i | X_{V\{i}} = x_{V\{i}}} ∝ ∏_{k∈∂i} exp( −α(x_i − x_k)² )   (26)

and

Pr{X_i = x_i, X_j = x_j | X_{V\{i,j}} = x_{V\{i,j}}}
  ∝ ( ∏_{k∈∂i\{j}} exp( −α(x_i − x_k)² ) ) exp( −α(x_i − x_j)² ) ( ∏_{k∈∂j\{i}} exp( −α(x_j − x_k)² ) ).   (27)

3. Degradation Process and the Bayes Formula

Next consider how an image is degraded by noise and how the original image can be inferred from the degraded one. Let x be the original image, generated according to the prior probability (1), and suppose each pixel is corrupted by additive white Gaussian noise: with independent noise variables n_i (i ∈ V) collected into n = (n_1, n_2, ..., n_|V|)^T, the degraded image y = (y_1, y_2, ..., y_|V|)^T is

y = x + n.   (28)

Regarding y as a realization of the random vector Y = (Y_1, Y_2, ..., Y_|V|)^T (Fig. 9), the degradation process is expressed by the conditional probability

Pr{Y = y | X = x} ∝ ∏_{i∈V} exp( −(1/(2σ²)) (y_i − x_i)² ),   (29)

where σ² is the variance of the noise. Given the degradation process Pr{Y = y | X = x} from x to y, restoring x from the observed y amounts to computing the posterior probability Pr{X = x | Y = y}, which is obtained from Eqs. (1) and (29) via the Bayes formula.
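A short sketch of the degradation step (28)-(29) follows; the test image, the value of σ, and the random seed are arbitrary choices made here.

```python
import numpy as np

rng = np.random.default_rng(1)

def degrade(x, sigma):
    """Additive white Gaussian noise channel of Eqs. (28)-(29):
    y = x + n with n_i ~ N(0, sigma^2) independently for each pixel."""
    return x + rng.normal(0.0, sigma, size=x.shape)

# Example: a binary "original image" degraded with sigma = 0.4.
x = np.zeros((8, 8))
x[:, 4:] = 1.0                  # left half 0, right half 1
y = degrade(x, sigma=0.4)
print(y.round(2))
```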

Fig. 9 The degradation process: the original image x is corrupted by the additive white Gaussian noise of Eq. (29) to produce the degraded image y.

By the Bayes formula,

Pr{X = x | Y = y} = Pr{Y = y | X = x} Pr{X = x} / Pr{Y = y}   (30)

  ∝ ( ∏_{i∈V} exp( −(1/(2σ²)) (y_i − x_i)² ) ) ( ∏_{{i,j}∈E} exp( −α(x_i − x_j)² ) ).   (31)

Pr{X = x | Y = y} is called the posterior probability, while Pr{X = x} is the prior probability; the prior (1) and the posterior (31) are linked by the degraded image y of Fig. 9. Given the posterior, the restored intensity of pixel i is taken to be the expectation of X_i,

E[X_i] = ∑_{x_1=0}^{Q−1} ∑_{x_2=0}^{Q−1} ... ∑_{x_|V|=0}^{Q−1} x_i Pr{X = x | Y = y},

but evaluating this sum directly requires visiting all Q^|V| configurations, that is, O(exp(|V|)) operations, which is intractable for images of realistic size.
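On a toy image the posterior mean can still be computed exactly, which makes the exponential cost visible. The sketch below (α, σ, the seed, and the 3 × 3 test image are illustrative choices) enumerates all 2^9 binary images to evaluate E[X_i | Y = y] under the posterior (31).

```python
import itertools
import numpy as np

# Exact posterior mean by brute-force enumeration (Q = 2) on a tiny 3x3
# image; the cost grows as Q^|V|, which is why belief propagation is
# needed for realistic image sizes.
edges = [(0,1),(1,2),(3,4),(4,5),(6,7),(7,8),(0,3),(1,4),(2,5),(3,6),(4,7),(5,8)]
alpha, sigma = 0.9, 0.6
rng = np.random.default_rng(2)

x_true = np.array([0,0,1, 0,0,1, 0,1,1], dtype=float)
y = x_true + rng.normal(0, sigma, size=9)        # degraded image, Eq. (28)

def log_posterior(x):
    """Log of the unnormalized posterior of Eq. (31)."""
    like = -np.sum((y - x) ** 2) / (2 * sigma ** 2)
    prior = -alpha * sum((x[i] - x[j]) ** 2 for i, j in edges)
    return like + prior

states = np.array(list(itertools.product([0, 1], repeat=9)), dtype=float)
logw = np.array([log_posterior(x) for x in states])
w = np.exp(logw - logw.max())                    # stabilized weights
probs = w / w.sum()
post_mean = states.T @ probs                     # E[X_i | Y = y] per pixel
print(np.round(post_mean, 3), x_true, sep="\n")
```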

Belief propagation gives a practical approximate algorithm for this computation. For each nearest-neighbor pair {i, j} ∈ E, introduce the quantities M_{i→j}(x_j) and M_{j→i}(x_i), called messages. From the posterior (31) one obtains the message propagation rule

M_{j→i}(x_i) = Constant × ∑_{z=0}^{Q−1} exp( −α(x_i − z)² − (1/(2σ²)) (z − y_j)² ) ∏_{k∈∂j\{i}} M_{k→j}(z)   ({i, j} ∈ E)   (32)

and, symmetrically,

M_{i→j}(x_j) = Constant × ∑_{z=0}^{Q−1} exp( −α(x_j − z)² − (1/(2σ²)) (z − y_i)² ) ∏_{k∈∂i\{j}} M_{k→i}(z)   ({i, j} ∈ E),   (33)

where ∂j\{i} denotes the set of nearest-neighbor pixels of pixel j excluding pixel i. Each update (32)-(33) refers only to the messages arriving from ∂j\{i}, so one message is recomputed in O(1) operations and a sweep over all messages costs only O(|V|) operations. Equations (32)-(33) form simultaneous fixed-point equations for the messages {M_{i→j}(x), M_{j→i}(x) | {i, j} ∈ E}; they are solved numerically by iterating the updates until convergence, and from the converged messages one obtains approximate marginal posterior probabilities and hence the restored image (a sketch of these updates appears at the end of this section).

4. Statistical-Mechanical Informatics

Since the late 1990s, the same framework has spread far beyond image processing: belief propagation and related statistical-mechanical methods are applied to low-density parity-check (LDPC) codes, code-division multiple access (CDMA) multiuser detection, and satisfiability (SAT) problems 3),6)-8), and their relation to mean-field methods has been studied extensively 3),6),9).

In evaluating the performance of such probabilistic models, an important role is played by averages over samples (the sample average). In the image-restoration setting this works as follows: generate an original image x from the prior (1), degrade it through the channel (29) to obtain y, and compute a restored image x̂ = x̂(y) from the posterior Pr{X = x | Y = y}. The discrepancy between x and x̂ is measured by

d(x, x̂) = (1/|V|) ∑_{i∈V} (x_i − x̂_i)²,

and averaging this error over both the noise and the original images gives

∑_{x_1=0}^{Q−1} ∑_{x_2=0}^{Q−1} ... ∑_{x_|V|=0}^{Q−1} ( ∫ ... ∫ d(x, x̂(y)) Pr{Y = y | X = x} dy_1 dy_2 ... dy_|V| ) Pr{X = x}.   (34)

Averages of the form (34) are what the spin glass theory of statistical mechanics calls configuration averages, and the analytical machinery of spin glass theory can be used to evaluate them; such analyses have been carried out for image processing, error-correcting codes, CDMA and related problems 3),11),14).
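Returning to the message-passing rules (32)-(33), here is a minimal sketch, specialized for clarity to a one-row image (a chain), where every ∂j\{i} has at most one element and the fixed point of the iteration is exact; uniform message initialization, the sweep count, and the test signal are assumptions made here, not specifics from the text above.

```python
import numpy as np

def bp_denoise_chain(y, alpha, sigma, Q=2, iters=100):
    """Belief propagation for the posterior (31) on a chain of N pixels.
    right[i] is the message i -> i+1 and left[i] the message i -> i-1;
    'Constant' in Eqs. (32)-(33) is fixed by normalizing each message."""
    N = len(y)
    z = np.arange(Q, dtype=float)
    pair = np.exp(-alpha * (z[:, None] - z[None, :]) ** 2)            # exp(-a(x-z)^2)
    evid = np.exp(-(z[None, :] - y[:, None]) ** 2 / (2 * sigma ** 2))  # likelihood term
    right = np.ones((N, Q)) / Q
    left = np.ones((N, Q)) / Q
    for _ in range(iters):
        for i in range(N):
            inc_from_left = right[i - 1] if i > 0 else np.ones(Q)
            inc_from_right = left[i + 1] if i < N - 1 else np.ones(Q)
            if i < N - 1:   # Eq. (32)/(33): sum_z pair[x,z] * evidence * other message
                m = pair @ (evid[i] * inc_from_left)
                right[i] = m / m.sum()
            if i > 0:
                m = pair @ (evid[i] * inc_from_right)
                left[i] = m / m.sum()
    # Beliefs: approximate marginal posteriors, then posterior-mean restoration.
    beliefs = evid.copy()
    for i in range(N):
        if i > 0:
            beliefs[i] *= right[i - 1]
        if i < N - 1:
            beliefs[i] *= left[i + 1]
    beliefs /= beliefs.sum(axis=1, keepdims=True)
    return beliefs @ z           # E[X_i | Y = y] for each pixel

rng = np.random.default_rng(3)
x_true = np.repeat([0.0, 1.0, 0.0], 10)          # piecewise-constant original
y = x_true + rng.normal(0, 0.5, size=x_true.size)
print(np.round(bp_denoise_chain(y, alpha=1.0, sigma=0.5), 2))
```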

5. Concluding Remarks

In this paper we have reviewed the fundamental framework of probabilistic image processing: the prior probability (1) built on a Markov random field, the degradation process (29), the posterior probability (31) obtained through the Bayes formula, and the belief propagation algorithm (32)-(33) that makes the inference computationally practical, together with the statistical-mechanical viewpoint that connects these methods to phase transitions and spin glass theory. Further deepening and expansion of large-scale probabilistic models, including multiresolution Markov models, are discussed in Refs. 1), 12) and 15).

Acknowledgments: This work was supported in part by Grants-in-Aid for Scientific Research (No. 879, No. 8798) and by the COE program.

References

1) K. Tanaka: Statistical-Mechanical Approach to Image Processing (Topical Review), Journal of Physics A: Mathematical and General, Vol.35, No.37, pp.R81-R150, 2002.
2) (In Japanese), 2006.
3) (Ed.): SGC Library (in Japanese), 2006.
4) (In Japanese), 2003.
5) II (in Japanese), 2005.
6) I (in Japanese), 2003.
7) C. M. Bishop: Pattern Recognition and Machine Learning, Springer, 2006.
8) M. J. Wainwright and M. I. Jordan: Graphical Models, Exponential Families, and Variational Inference, now Publishers Inc., 2008.
9) M. Opper and D. Saad (Eds.): Advanced Mean Field Methods: Theory and Practice, MIT Press, 2001.
10) (In Japanese), 1999.
11) H. Nishimori: Statistical Physics of Spin Glasses and Information Processing: An Introduction, Oxford University Press, 2001.
12) (In Japanese).
13) Y. Kabashima and D. Saad: Statistical Mechanics of Low-Density Parity-Check Codes (Topical Review), Journal of Physics A: Mathematical and General, Vol.37, No.6, pp.R1-R43, 2004.
14) M. Mézard and A. Montanari: Information, Physics, and Computation, Oxford University Press, 2009.
15) A. S. Willsky: Multiresolution Markov models for signal and image processing, Proceedings of the IEEE, Vol.90, No.8, pp.1396-1458, 2002.