
IPSJ SIG Technical Report

Deepening and Expansion of Large-Scale Random Fields and Probabilistic Image Processing

Kazuyuki Tanaka
Graduate School of Information Sciences, Tohoku University

Abstract: The mathematical frameworks of probabilistic image processing are formulated by means of Markov random fields and Bayesian inference. Some practical algorithms are constructed by applying belief propagation methods. Markov random fields and belief propagation methods can be regarded as one of the statistical-mechanical approaches to probabilistic information processing. In the present paper, we review some fundamental frameworks and statistical-mechanical approaches of probabilistic image processing.

1. Introduction

In probabilistic image processing, an image is described by a generative model: prior knowledge about natural images and a model of the degradation process are combined through the Bayes formula, and the original image is inferred from the observed, degraded one by Bayesian inference. The prior knowledge enters as a prior probability, and the inference is carried out under the resulting posterior probability. Exact computation under the posterior, however, entails a computational complexity that grows exponentially with the number of pixels. Belief propagation, which goes back to the 1980s, provides practical approximate algorithms, and together with Markov random fields it forms one of the statistical-mechanical approaches to probabilistic information processing (statistical-mechanical informatics) 1)-3). In the following sections we review this fundamental framework of probabilistic image processing.

2. Prior Probability of Images and Markov Random Fields

An image is a collection of pixels, and the light intensity at each pixel is treated as a random variable; an image as a whole is then characterized by a joint probability over all pixels. Let V denote the set of pixels, V = {1, 2, ..., |V|}. The intensity at pixel i is written x_i, and the whole image is the vector x = (x_1, x_2, ..., x_{|V|})^T; x_i is called a state variable and x a state vector. The random variable attached to pixel i is denoted X_i, and X = (X_1, X_2, ..., X_{|V|})^T is the corresponding random vector. The probability that the image takes the value x is written Pr{X = x}.

As prior knowledge we assume that neighboring pixels tend to take similar intensities. A simple probability distribution expressing this assumption is

    Pr{X = x} \propto \prod_{\{i,j\} \in E} \exp( -(\alpha/2)(x_i - x_j)^2 ),    (1)

where E is the set of nearest-neighbor pixel pairs {i, j} and α > 0 controls the strength of the smoothness assumption. For example, for 12 pixels arranged in 3 rows of 4, V = {1, 2, ..., 12} and

    E = { {1,2}, {2,3}, {3,4}, {5,6}, {6,7}, {7,8}, {9,10}, {10,11}, {11,12},
          {1,5}, {2,6}, {3,7}, {4,8}, {5,9}, {6,10}, {7,11}, {8,12} }.    (2)

The pixel set V and the pair set E define a graph (V, E), and x = (x_1, x_2, ..., x_{12})^T; Eq. (1) assigns a probability Pr{X = x} to every configuration x on this graph.

[Figure 1: The graph (V, E) defined by the pixel set V = {1, 2, ..., 12} and the nearest-neighbor pair set E of Eq. (2), together with the probability Pr{X = x} of Eq. (1) assigned to a configuration x (Q = 2 gray levels).]

In the simplest binary case (Q = 2, x_i in {0, 1}), consider V = {1, 2} and E = { {1,2} }. The possible configurations on (V, E) are x = (0,0)^T, (0,1)^T, (1,0)^T, (1,1)^T, and Eq. (1) gives

    Pr{X = (0,0)^T} = Pr{X = (1,1)^T} > Pr{X = (0,1)^T} = Pr{X = (1,0)^T},    (3)

that is, configurations in which the two pixels agree are more probable than those in which they differ (Figure 2).

[Figure 2: Pr{X = x} of Eq. (1) for the four configurations x = (0,0)^T, (0,1)^T, (1,0)^T, (1,1)^T in the case V = {1,2}, E = { {1,2} }.]
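As a concrete check of the ordering in Eq. (3), the two-pixel example is small enough to enumerate directly. The following sketch is an illustration added here (the value α = 1.0 and the helper name are arbitrary choices, not taken from the report); it normalizes Eq. (1) over the four binary configurations.

```python
import numpy as np
from itertools import product

alpha = 1.0          # interaction strength (arbitrary illustrative value)
E = [(0, 1)]         # the single edge {1, 2}, 0-indexed here

def unnormalized_prob(x, alpha, edges):
    """Right-hand side of Eq. (1): product over edges of exp(-(alpha/2)(x_i - x_j)^2)."""
    return np.exp(-0.5 * alpha * sum((x[i] - x[j]) ** 2 for i, j in edges))

configs = list(product([0, 1], repeat=2))           # (0,0), (0,1), (1,0), (1,1)
weights = np.array([unnormalized_prob(x, alpha, E) for x in configs])
probs = weights / weights.sum()                     # divide by the normalization constant

for x, p in zip(configs, probs):
    print(x, round(p, 4))
# The uniform configurations (0,0) and (1,1) receive the larger probability,
# in agreement with Eq. (3).
```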

Next consider nine pixels on a 3 x 3 grid,

    V = {1, 2, 3, 4, 5, 6, 7, 8, 9},    (4)
    E = { {1,2}, {2,3}, {4,5}, {5,6}, {7,8}, {8,9}, {1,4}, {2,5}, {3,6}, {4,7}, {5,8}, {6,9} }.    (5)

Fix the intensities x_i of all pixels i in V\{5} on this graph (V, E) and compare the values of Pr{X = x} in Eq. (1) obtained for x_5 = 0 and for x_5 = 1; the value of x_5 that agrees with more of its neighbors yields the larger probability (Figure 3).

[Figure 3: For V = {1, ..., 9} and the E of Eq. (5), the probability Pr{X = x} of Eq. (1) for x_5 = 0 and x_5 = 1 when the intensities x_i (i in V\{5}) are fixed.]

Similarly, for V = {1, ..., 12} with the E of Eq. (2), fix the pixels in V\{6,7}, say x_i = 0 for i in {1, 2, 5, 9, 10} and x_i = 1 for i in {3, 4, 8, 11, 12}, and compare Pr{X = x} for the four combinations (x_6, x_7) = (0,0), (0,1), (1,0), (1,1) (Figure 4).

[Figure 4: For V = {1, ..., 12} and the E of Eq. (2), the probability Pr{X = x} of Eq. (1) for the four combinations of (x_6, x_7) when the remaining pixels are fixed.]

In general, Eq. (1) assigns the largest probability to the uniform configurations x_1 = x_2 = ... = x_{|V|}; the more pairs {i, j} in E with x_i = x_j a configuration x contains, the larger its probability, and the more pairs with x_i ≠ x_j, the smaller.

Sample images can be generated from the distribution (1) by Markov chain Monte Carlo (MCMC) methods 4),5). Figure 5 shows binary images x = (x_1, x_2, ..., x_{|V|})^T generated in this way for several values of α.

[Figure 5: Binary images x generated from Pr{X = x} of Eq. (1) by MCMC for several values of α.]
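The report cites MCMC methods 4),5) without fixing a particular algorithm; the sketch below shows one standard choice, a single-site Gibbs sampler for binary images on an L x L grid, and is only an illustration under that assumption. The grid size, the value of α, and the number of sweeps are arbitrary.

```python
import numpy as np

def neighbors(i, j, L):
    """4-nearest neighbors of pixel (i, j) on an L x L grid (the neighbor sets used in the text)."""
    nbrs = []
    if i > 0: nbrs.append((i - 1, j))
    if i < L - 1: nbrs.append((i + 1, j))
    if j > 0: nbrs.append((i, j - 1))
    if j < L - 1: nbrs.append((i, j + 1))
    return nbrs

def gibbs_sample_prior(L=32, alpha=1.0, n_sweeps=100, rng=None):
    """Draw one binary image from the prior of Eq. (1) by single-site Gibbs sampling.

    Each pixel is resampled in turn from its conditional distribution given the rest,
    which by the form of Eq. (1) depends only on its nearest neighbors.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = rng.integers(0, 2, size=(L, L))
    for _ in range(n_sweeps):
        for i in range(L):
            for j in range(L):
                # unnormalized conditional weights for x_ij = 0 and x_ij = 1
                w = np.array([
                    np.exp(-0.5 * alpha * sum((v - x[k]) ** 2 for k in neighbors(i, j, L)))
                    for v in (0, 1)
                ])
                x[i, j] = rng.choice(2, p=w / w.sum())
    return x

# x = gibbs_sample_prior(L=32, alpha=1.0)   # one generated sample, as in Figure 5
```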

The smoothness of a binary image x can be quantified by the number of nearest-neighbor pairs whose intensities differ,

    K(x) = \sum_{\{i,j\} \in E} (x_i - x_j)^2 = \sum_{\{i,j\} \in E} ( 1 - \delta_{x_i, x_j} ).    (6)

The smaller K(x) is, the smoother the image x and the larger Pr{X = x} in Eq. (1). Let A be the set of configurations x for which the fraction K(x)/|E| of disagreeing pairs does not exceed a small threshold, and let Ā be the set of configurations for which it does. For sufficiently large α the smooth configurations dominate,

    Pr{A} > Pr{Ā},    (7)

whereas for sufficiently small α the opposite holds,

    Pr{A} < Pr{Ā},    (8)

because the overwhelming majority of the 2^{|V|} configurations belong to Ā and the distribution approaches the uniform one as α tends to 0. The crossover between the two regimes can be seen in the generated images of Figure 5: for small α the samples look like random noise, while for large α large uniform regions appear.

The correlation between neighboring pixels is measured by the covariance

    Cov[X_i, X_j] = \sum_{x_1=0}^{1} \sum_{x_2=0}^{1} \cdots \sum_{x_{|V|}=0}^{1} (x_i - \mu_i)(x_j - \mu_j) Pr{X = x}    ({i,j} \in E),    (9)

where μ_i is the expectation of X_i,

    \mu_i = \sum_{x_1=0}^{1} \sum_{x_2=0}^{1} \cdots \sum_{x_{|V|}=0}^{1} x_i Pr{X = x}.    (10)

Figure 6 shows Cov[X_i, X_j] of Eq. (9) in the limit |V| → +∞ as a function of α; its behavior changes qualitatively around α = 1.7627.

[Figure 6: The covariance Cov[X_i, X_j] of Eq. (9) between nearest-neighbor pixels as a function of α.]

A binarized natural image and an image generated in Figure 5 for a suitable value of α show a clear similarity (Figure 7). Figure 8 shows an image generated by MCMC from Eq. (1) in the multilevel case with Q = 256 gray levels.

[Figure 7: Similarity between a binarized natural image and the image generated in Figure 5.]
[Figure 8: Image x generated from the probability distribution of Eq. (1) with Q = 256 by the Markov chain Monte Carlo method.]
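The sums in Eqs. (9)-(10) run over all 2^{|V|} configurations, so for realistic image sizes such quantities are themselves estimated from MCMC samples rather than computed exactly. A possible sketch, reusing the hypothetical gibbs_sample_prior helper from the previous example:

```python
import numpy as np

def estimate_covariance(sample_fn, i, j, n_samples=500):
    """Monte Carlo estimate of Cov[X_i, X_j] in Eq. (9) from prior samples.

    sample_fn() must return one image drawn (approximately) from Eq. (1);
    (i, j) are the two pixels, given as (row, col) index pairs.
    """
    xs_i, xs_j = [], []
    for _ in range(n_samples):
        x = sample_fn()
        xs_i.append(x[i])
        xs_j.append(x[j])
    xs_i, xs_j = np.array(xs_i, float), np.array(xs_j, float)
    mu_i, mu_j = xs_i.mean(), xs_j.mean()          # estimates of Eq. (10)
    return np.mean((xs_i - mu_i) * (xs_j - mu_j))  # estimate of Eq. (9)

# cov = estimate_covariance(lambda: gibbs_sample_prior(L=16, alpha=1.0), (8, 8), (8, 9))
```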

We now return to the example of Figure 3, V = {1, ..., 9} with the E of Eq. (5), and consider the probability of the intensity of pixel 5 when the intensities of all the other pixels are fixed at x_i (i in V\{5}):

    Pr{X_5 = x_5 | X_1 = x_1, X_2 = x_2, X_3 = x_3, X_4 = x_4, X_6 = x_6, X_7 = x_7, X_8 = x_8, X_9 = x_9}
        = Pr{X_1 = x_1, ..., X_4 = x_4, X_5 = x_5, X_6 = x_6, ..., X_9 = x_9}
          / Pr{X_1 = x_1, ..., X_4 = x_4, X_6 = x_6, ..., X_9 = x_9},    (11)

    Pr{X_1 = x_1, ..., X_4 = x_4, X_6 = x_6, ..., X_9 = x_9}
        = \sum_{x_5=0}^{1} Pr{X_1 = x_1, ..., X_4 = x_4, X_5 = x_5, X_6 = x_6, ..., X_9 = x_9}.    (12)

Because these expressions are long, we introduce the notation

    X_V = (X_1, X_2, ..., X_9)^T,  X_{V\{5}} = (X_1, X_2, X_3, X_4, X_6, X_7, X_8, X_9)^T,
    x_V = (x_1, x_2, ..., x_9)^T,  x_{V\{5}} = (x_1, x_2, x_3, x_4, x_6, x_7, x_8, x_9)^T.    (13)

With this notation, Eqs. (11)-(12) for the conditional probability of the intensity of pixel 5 become

    Pr{X_5 = x_5 | X_{V\{5}} = x_{V\{5}}} = Pr{X_V = x_V} / Pr{X_{V\{5}} = x_{V\{5}}},    (14)
    Pr{X_{V\{5}} = x_{V\{5}}} = \sum_{x_5=0}^{1} Pr{X_V = x_V}.    (15)

Substituting Eq. (1) into Eqs. (14)-(15), every factor that does not contain x_5 cancels between the numerator and the denominator, and we obtain

    Pr{X_5 = x_5 | X_{V\{5}} = x_{V\{5}}} \propto \prod_{j \in \partial 5} \exp( -(\alpha/2)(x_5 - x_j)^2 ),    (16)

where

    \partial 5 \equiv \{2, 4, 6, 8\}    (17)

denotes the set of all nearest-neighbor pixels of pixel 5. Equation (16) shows that

    Pr{X_5 = x_5 | X_{V\{5}} = x_{V\{5}}} = Pr{X_5 = x_5 | X_k = x_k, k \in \partial 5}    (18)

holds: the conditional probability of pixel 5 given all the other pixels depends only on its nearest neighbors.
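The locality property (18) can also be checked numerically: with all pixels in V\{5} fixed, the full conditional of Eqs. (14)-(15) reduces to a ratio of two joint weights (the normalization constant of Eq. (1) cancels), which can then be compared with the neighbor-only formula of Eq. (16). The sketch below is an added illustration with an arbitrary fixed assignment of the surrounding pixels.

```python
import numpy as np

alpha = 1.0
# 3x3 grid of Figure 3; pixels 1..9 are indexed 0..8 here, row by row
E = [(0, 1), (1, 2), (3, 4), (4, 5), (6, 7), (7, 8),
     (0, 3), (1, 4), (2, 5), (3, 6), (4, 7), (5, 8)]

def joint_weight(x):
    """Unnormalized Pr{X = x} of Eq. (1); the normalization constant cancels below."""
    return np.exp(-0.5 * alpha * sum((x[i] - x[j]) ** 2 for i, j in E))

center, nbrs = 4, [1, 3, 5, 7]                              # pixel 5 and ∂5 = {2, 4, 6, 8}
fixed = {0: 0, 1: 1, 2: 0, 3: 0, 5: 0, 6: 1, 7: 0, 8: 1}    # arbitrary fixed values on V\{5}

# full conditional Pr{X_5 = 1 | X_k = x_k, k in V\{5}} via Eqs. (14)-(15)
w_full = [joint_weight([fixed.get(k, v) for k in range(9)]) for v in (0, 1)]
p_full = w_full[1] / sum(w_full)

# neighbor-only conditional Pr{X_5 = 1 | X_k = x_k, k in ∂5} via Eq. (16)
w_nbr = [np.exp(-0.5 * alpha * sum((v - fixed[k]) ** 2 for k in nbrs)) for v in (0, 1)]
p_nbr = w_nbr[1] / sum(w_nbr)

print(p_full, p_nbr)   # identical values, as stated by Eq. (18)
```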

Similarly, in the example of Figure 1, V = {1, ..., 12} with the E of Eq. (2), fix the states of the pixels in V\{6,7} and introduce

    X_V = (X_1, X_2, ..., X_{12})^T,  X_{V\{6,7}} = (X_1, X_2, X_3, X_4, X_5, X_8, X_9, X_{10}, X_{11}, X_{12})^T,
    x_V = (x_1, x_2, ..., x_{12})^T,  x_{V\{6,7}} = (x_1, x_2, x_3, x_4, x_5, x_8, x_9, x_{10}, x_{11}, x_{12})^T.    (19)

The joint conditional probability of pixels 6 and 7 is then

    Pr{X_6 = x_6, X_7 = x_7 | X_{V\{6,7}} = x_{V\{6,7}}} = Pr{X_V = x_V} / Pr{X_{V\{6,7}} = x_{V\{6,7}}},    (20)
    Pr{X_{V\{6,7}} = x_{V\{6,7}}} = \sum_{x_6=0}^{1} \sum_{x_7=0}^{1} Pr{X_V = x_V}.    (21)

Substituting Eq. (1) into Eqs. (20)-(21) gives

    Pr{X_6 = x_6, X_7 = x_7 | X_{V\{6,7}} = x_{V\{6,7}}}
        \propto ( \prod_{k \in \partial 6 \setminus \{7\}} \exp( -(\alpha/2)(x_6 - x_k)^2 ) )
                \exp( -(\alpha/2)(x_6 - x_7)^2 )
                ( \prod_{k \in \partial 7 \setminus \{6\}} \exp( -(\alpha/2)(x_7 - x_k)^2 ) ),    (22)

where

    \partial 6 \equiv \{2, 5, 7, 10\},  \partial 7 \equiv \{3, 6, 8, 11\}.    (23)

Hence

    Pr{X_6 = x_6, X_7 = x_7 | X_{V\{6,7}} = x_{V\{6,7}}} = Pr{X_6 = x_6, X_7 = x_7 | X_k = x_k, k \in \partial\{6,7\}}    (24)

holds, where ∂{6,7} denotes the set of nearest neighbors of the pair {6, 7}, that is, (∂6 ∪ ∂7)\{6,7}.

A random field X_V whose conditional probabilities have this locality property at every pixel,

    Pr{X_i = x_i | X_{V\{i}} = x_{V\{i}}} = Pr{X_i = x_i | X_k = x_k, k \in \partial i}    (for all i in V),    (25)

where ∂i is the set of nearest-neighbor pixels of pixel i, is called a Markov random field (MRF). In particular, the probability distribution (1) on the graph (V, E) satisfies

    Pr{X_i = x_i | X_{V\{i}} = x_{V\{i}}} \propto \prod_{k \in \partial i} \exp( -(\alpha/2)(x_i - x_k)^2 ),    (26)

    Pr{X_i = x_i, X_j = x_j | X_{V\{i,j}} = x_{V\{i,j}}}
        \propto ( \prod_{k \in \partial i \setminus \{j\}} \exp( -(\alpha/2)(x_i - x_k)^2 ) )
                \exp( -(\alpha/2)(x_i - x_j)^2 )
                ( \prod_{k \in \partial j \setminus \{i\}} \exp( -(\alpha/2)(x_j - x_k)^2 ) ).    (27)

3. Degradation Process and Bayesian Image Restoration

We next model how an observed image arises as a degraded version of an original image x. The degradation considered here is additive white Gaussian noise: independent Gaussian noise n_i (i in V), collected in the vector n = (n_1, n_2, ..., n_{|V|})^T, is added to x, and the degraded image y = (y_1, y_2, ..., y_{|V|})^T is

    y = x + n.    (28)

The degraded image y is regarded as a realization of the random vector Y = (Y_1, Y_2, ..., Y_{|V|})^T (Figure 9), and the conditional probability of observing y given the original image x is

    Pr{Y = y | X = x} \propto \prod_{i \in V} \exp( -(y_i - x_i)^2 / (2\sigma^2) ),    (29)

where σ^2 is the variance of the noise.

[Figure 9: The degradation process: the original image x, the additive white Gaussian noise n, and the degraded image y.]
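Simulating the degradation process of Eqs. (28)-(29) only requires adding independent Gaussian noise to the original image; a minimal sketch (the noise level σ in the usage comment is an arbitrary illustrative value):

```python
import numpy as np

def degrade(x, sigma, rng=None):
    """Additive white Gaussian noise channel of Eqs. (28)-(29): y = x + n,
    with each n_i drawn independently from N(0, sigma^2)."""
    rng = np.random.default_rng() if rng is None else rng
    return x + rng.normal(0.0, sigma, size=x.shape)

# Example: degrade a binary sample drawn from the prior (see the Gibbs sketch above).
# y = degrade(gibbs_sample_prior(L=32, alpha=1.0).astype(float), sigma=0.4)
```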

Equation (29) describes how the original image x is corrupted into the observed image y. Image restoration is the inverse problem: given the degraded image y, estimate the original image x, for which we need the conditional probability Pr{X = x | Y = y}. It is obtained from the prior (1) and the degradation model (29) through the Bayes formula,

    Pr{X = x | Y = y} = Pr{Y = y | X = x} Pr{X = x} / Pr{Y = y}.    (30)

Substituting Eqs. (1) and (29) yields

    Pr{X = x | Y = y} \propto ( \prod_{i \in V} \exp( -(y_i - x_i)^2 / (2\sigma^2) ) ) ( \prod_{\{i,j\} \in E} \exp( -(\alpha/2)(x_i - x_j)^2 ) ).    (31)

Pr{X = x | Y = y} is called the posterior probability, and Pr{X = x} of Eq. (1) is the prior probability. As the restored image we take, at each pixel i, the expectation of X_i under the posterior,

    E[X_i] = \sum_{x_1=0}^{Q-1} \sum_{x_2=0}^{Q-1} \cdots \sum_{x_{|V|}=0}^{Q-1} x_i Pr{X = x | Y = y}.    (32)

Evaluating Eq. (32) exactly requires a sum over all Q^{|V|} configurations, that is, a computational complexity of O(exp(|V|)). Belief propagation provides a practical approximate algorithm. A pair of messages M_{i→j}(x_j) and M_{j→i}(x_i) is assigned to every nearest-neighbor pair {i, j} in E, and the messages are updated according to the message propagation rule

    M_{j \to i}(x_i) \leftarrow \mathrm{Constant} \times \sum_{z=0}^{Q-1} \exp( -(\alpha/2)(x_i - z)^2 - (z - y_j)^2/(2\sigma^2) ) \prod_{k \in \partial j \setminus \{i\}} M_{k \to j}(z)    ({i,j} \in E),    (33)

    M_{i \to j}(x_j) \leftarrow \mathrm{Constant} \times \sum_{z=0}^{Q-1} \exp( -(\alpha/2)(x_j - z)^2 - (z - y_i)^2/(2\sigma^2) ) \prod_{k \in \partial i \setminus \{j\}} M_{k \to i}(z)    ({i,j} \in E),    (34)

where ∂j is the set of nearest-neighbor pixels of pixel j and ∂j with i removed is written ∂j\{i}. Equations (33)-(34) are iterated over all pairs {i, j} in E. Each update involves only the product over the nearest neighbors ∂j\{i}, so one sweep over all messages costs only O(|V|) operations. The converged messages {M_{i→j}(x), M_{j→i}(x) | {i,j} in E} are combined with the data factors of Eq. (31) to give approximate marginal probabilities, from which the estimate of Eq. (32) is computed approximately.
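The update rules (33)-(34) translate directly into an iterative algorithm. The sketch below is one possible vectorized implementation for an H x W image with Q gray levels on a 4-neighbor grid; the function name, the synchronous update schedule, the treatment of the image border, and the final posterior-mean step that combines the data factors of Eq. (31) with all incoming messages are implementation choices made here for illustration, not details fixed by the report.

```python
import numpy as np

def loopy_bp_restore(y, alpha, sigma, Q=2, n_iter=50):
    """Loopy belief propagation for the posterior of Eq. (31) on a 4-neighbor grid.

    M[d, i, j, :] holds the message sent to pixel (i, j) by its neighbor in
    direction d, updated as in Eqs. (33)-(34).  After the iterations, each
    pixel's restored value is the mean of its approximate marginal.
    """
    H, W = y.shape
    z = np.arange(Q, dtype=float)                                   # gray levels 0 .. Q-1
    psi = np.exp(-0.5 * alpha * (z[:, None] - z[None, :]) ** 2)     # pairwise factor of Eq. (1)
    phi = np.exp(-0.5 * (z[None, None, :] - y[:, :, None]) ** 2 / sigma ** 2)  # data term of Eq. (29)

    dirs = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    M = np.ones((4, H, W, Q)) / Q                                   # uniform initial messages

    def shift(a, di, dj):
        """shifted[i, j] = a[i + di, j + dj]; entries outside the grid stay uniform."""
        out = np.ones_like(a) / Q
        out[max(-di, 0):H + min(-di, 0), max(-dj, 0):W + min(-dj, 0)] = \
            a[max(di, 0):H + min(di, 0), max(dj, 0):W + min(dj, 0)]
        return out

    for _ in range(n_iter):
        new_M = np.empty_like(M)
        for d, (di, dj) in enumerate(dirs):
            # the sender is the neighbor at (i+di, j+dj); it combines its own data term
            # with every message it receives except the one coming back from (i, j)
            opp = dirs.index((-di, -dj))
            prod = phi.copy()
            for e in range(4):
                if e != opp:
                    prod *= M[e]
            prod = shift(prod, di, dj)                 # evaluate that product at the sender
            msg = prod @ psi.T                         # sum over z of psi[x_i, z] * prod[z], Eq. (33)
            msg /= msg.sum(axis=-1, keepdims=True)     # the "Constant" normalization
            # border pixels have no neighbor in this direction: keep a uniform message
            if di == -1: msg[0, :, :] = 1.0 / Q
            if di == 1:  msg[-1, :, :] = 1.0 / Q
            if dj == -1: msg[:, 0, :] = 1.0 / Q
            if dj == 1:  msg[:, -1, :] = 1.0 / Q
            new_M[d] = msg
        M = new_M

    belief = phi * M.prod(axis=0)                      # data term times all incoming messages
    belief /= belief.sum(axis=-1, keepdims=True)
    return belief @ z                                  # posterior-mean estimate, cf. Eq. (32)

# x_hat = loopy_bp_restore(y, alpha=1.0, sigma=0.4, Q=2)   # restored image from the degraded y
```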

4. Statistical-Mechanical Approaches and Performance Evaluation

Since around 1998, the same combination of probabilistic modeling and belief propagation has been applied not only to image processing but also to low-density parity-check (LDPC) codes, code-division multiple access (CDMA) multiuser detection, and satisfiability (SAT) problems 3),6)-8); reviews of these developments can be found in 3),6),9).

For such methods one is interested not only in the behavior on a single instance but in the average performance over an ensemble of problems, which leads to the notion of the sample average. An original image x is generated from the prior (1), degraded through Eq. (29) into y, and restored from the posterior Pr{X = x | Y = y} to give the estimate x̂ = x̂(y). The discrepancy between the original and restored images is measured by

    d(x, \hat{x}) = (1/|V|) \sum_{i \in V} (x_i - \hat{x}_i)^2,

and its average over both the noise realizations and the original images is

    \sum_{x_1=0}^{Q-1} \sum_{x_2=0}^{Q-1} \cdots \sum_{x_{|V|}=0}^{Q-1} ( \int \cdots \int d( x, \hat{x}(y) ) Pr{Y = y | X = x} \, dy_1 dy_2 \cdots dy_{|V|} ) Pr{X = x}.    (35)

In the language of spin glass theory, Eq. (35) is a configuration average, and the analytical techniques developed for spin glasses can be used to evaluate such averages; analyses of this kind have been carried out for error-correcting codes, CDMA multiuser detection, and related problems 3),11),14).
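The configuration average of Eq. (35) can be approximated by a plain Monte Carlo loop over original images drawn from the prior and over noise realizations. The sketch below reuses the hypothetical helpers gibbs_sample_prior, degrade, and loopy_bp_restore introduced above; the trial count and parameter values are arbitrary.

```python
import numpy as np

def average_distance(n_trials=20, L=32, alpha=1.0, sigma=0.4, rng=None):
    """Monte Carlo approximation of the configuration average in Eq. (35):
    draw x from the prior (1), degrade it by Eqs. (28)-(29), restore it with BP,
    and average d(x, x_hat) = (1/|V|) * sum_i (x_i - x_hat_i)^2."""
    rng = np.random.default_rng() if rng is None else rng
    total = 0.0
    for _ in range(n_trials):
        x = gibbs_sample_prior(L=L, alpha=alpha, rng=rng).astype(float)   # sample from Eq. (1)
        y = degrade(x, sigma, rng=rng)                                    # Eqs. (28)-(29)
        x_hat = loopy_bp_restore(y, alpha=alpha, sigma=sigma, Q=2)        # BP estimate
        total += np.mean((x - x_hat) ** 2)                                # d(x, x_hat)
    return total / n_trials
```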

5. Concluding Remarks

We have reviewed the fundamental framework of probabilistic image processing: the prior probability expressed as a Markov random field, the degradation process and the posterior probability obtained through the Bayes formula, practical algorithms based on belief propagation, and the statistical-mechanical viewpoint on their performance evaluation. Further details and related developments can be found in the references below, in particular 5).

Acknowledgments: This work was supported in part by grants (No.879, No.8798) and by a COE program.

References
1) K. Tanaka: Statistical-Mechanical Approach to Image Processing (Topical Review), Journal of Physics A: Mathematical and General, Vol.35, No.37, pp.R81-R150, 2002.
2) (in Japanese), 2006.
3) (edited volume, in Japanese), SGC Library, 2006.
4) (in Japanese), 2003.
5) (in Japanese), 2005.
6) (in Japanese), 2003.
7) C. M. Bishop: Pattern Recognition and Machine Learning, Springer, 2006.
8) M. J. Wainwright and M. I. Jordan: Graphical Models, Exponential Families, and Variational Inference, Now Publishers, 2008.
9) M. Opper and D. Saad (eds.): Advanced Mean Field Methods: Theory and Practice, MIT Press, 2001.
10) (in Japanese), 1999.
11) H. Nishimori: Statistical Physics of Spin Glasses and Information Processing: An Introduction, Oxford University Press, 2001.
12) (in Japanese).
13) Y. Kabashima and D. Saad: Statistical Mechanics of Low-Density Parity-Check Codes (Topical Review), Journal of Physics A: Mathematical and General, Vol.37, No.6, pp.R1-R43, 2004.
14) M. Mézard and A. Montanari: Information, Physics, and Computation, Oxford University Press, 2009.
15) A. S. Willsky: Multiresolution Markov models for signal and image processing, Proceedings of the IEEE, Vol.90, No.8, pp.1396-1458, 2002.

(c) 2009 Information Processing Society of Japan