March 2016

Contents

1 Introduction — 3
  1.1 … 3
  1.2 … 4
  1.3 … 6
2 Self-Organizing Map (SOM) — 7
  2.1 SOM … 7
  2.2 SOM … 8
    2.2.1 SOM algorithm … 12
  2.3 SOM … 13
    2.3.1 … 13
  2.4 Spherical SOM (S-SOM) … 15
    2.4.1 S-SOM … 16
    2.4.2 S-SOM algorithm … 18
    2.4.3 S-SOM U-matrix … 21
3 Hidden Markov Models — 24
  3.1 Hidden Markov Models (HMM) … 24
  3.2 HMM algorithms … 24
    3.2.1 Forward algorithm … 25
    3.2.2 Backward algorithm … 30
    3.2.3 Viterbi algorithm … 32
    3.2.4 Baum-Welch algorithm … 33
  3.3 HMM … 43
4 Spherical Hidden Markov Self-Organizing Map (HMM-S-SOM) — 51
  4.1 HMM-S-SOM … 52
  4.2 … 54
    4.2.1 HMM-S-SOM … 58
5 Frequency Integrated Spherical Hidden Markov Self-Organizing Map (F-HMM-S-SOM) — 61
  5.1 F-HMM-S-SOM … 67
    5.1.1 F-HMM-S-SOM … 67
  5.2 F-HMM-S-SOM … 68
    5.2.1 DNA … 68
    5.2.2 … 71
6 … — 83
  6.1 … 87
    6.1.1 … 89
    6.1.2 … 91
    6.1.3 … 93
7 … — 96
1 Introduction

1.1
… k-means …

1.2
… Siri … [1][2] … [3] … [4] … [5] …

1.3
… [6][7] …
2 Self-Organizing Map (SOM)

2.1 SOM
The Self-Organizing Map (SOM), proposed by T. Kohonen, is an unsupervised learning model that maps input data onto a grid of nodes while preserving similarity: similar inputs such as data1 and data2 in Fig. 1 are mapped to nearby nodes, while a dissimilar input such as data3 is mapped to a distant node.

Figure 1: SOM

2.2 SOM
Figure 2 illustrates learning on a SOM of 6 × 6 = 36 nodes with three inputs data1, data2 and data3, where data2 resembles data1 and data3 differs from both. Each node holds a weight vector of the same dimension as the inputs. When data1 is presented, the node whose weight vector is closest to data1 becomes the winner node (Fig. 3), and the weight vectors of the winner and of its neighboring nodes are pulled toward data1 (Fig. 4). Presenting data2 then yields a winner close to that of data1, since their vectors are similar (Fig. 5), while data3 wins a node far from both (Fig. 6). Repeating this procedure for all inputs arranges the data on the map so that the similarity between inputs is reflected by the distance between their winner nodes.

Figure 2: SOM learning example
Figure 3: Winner node for data1
Figure 4: Update toward data1
Figure 5: Winner node for data2
Figure 6: Winner node for data3
The learning procedure of the SOM is summarized below.

2.2.1 SOM algorithm

STEP 1: Present an input vector v_j to the SOM.
STEP 2: Find the winner node N_j^k for v_j, i.e. the node whose weight vector w_k is closest to v_j:

  N_j^k = min_k f_j(k)                        (1)
  f_j(k) = Σ_{t=1}^{n} (a_jt − w_kt)^2        (2)

STEP 3: Update the weight vector w_k of N_j^k and of its neighboring nodes toward v_j:

  h(P_jx, P'_jx) = exp(−|P_jx − P'_jx|^2 / σ^2)   (3)
  Δw_k = β · h(P_jx, P'_jx) · f_j(k)              (4)
  w_k^new = w_k + Δw_k                            (5)

where P_jx is the position of the winner node N_j^k, P'_jx is the position of the node being updated, σ controls the width of the neighborhood, and β is the learning rate. Figure 7 illustrates the update of the weight vectors w of the 24 nodes around the winner.

Figure 7: Neighborhood update

2.3 SOM

2.3.1
Figure 8 shows an example in which five color vectors Data1 to Data5, whose R, G, B components take values in 0–255, are mapped by a SOM; similar colors (e.g. Data2 and Data5) are placed close to each other on the map.
A planar SOM (Plane-SOM) suffers from a boundary problem: nodes on the edges of the map have fewer neighbors than interior nodes (Fig. 9), so data mapped near the edges is learned under different conditions from data mapped near the center.

Figure 8: SOM mapping example

2.4 Spherical SOM (S-SOM)
The spherical SOM (S-SOM) removes the boundary of the planar SOM by arranging the nodes uniformly on a sphere [8][9][10][11]. Reference [9] renders a spherical SOM with OpenGL; this work implements the S-SOM on Windows using the DirectX API [12][13].

Figure 9: Boundary of a planar SOM

2.4.1 S-SOM
The S-SOM was implemented with DirectX as a 3D visualization; Fig. 10 shows the arrangement of the nodes on the sphere, and Fig. 11 the resulting spherical map.

Figure 10: Node arrangement on the sphere
Figure 11: S-SOM
Apart from the arrangement of the nodes, the S-SOM learns in the same way as the planar SOM; Fig. 12 shows its overall structure.

Figure 12: S-SOM

2.4.2 S-SOM algorithm
Let the input be m n-dimensional vectors v_j = {a_j1, a_j2, a_j3, ..., a_jn} (j = 1, 2, 3, ..., m), and let the S-SOM have l nodes N_k (k = 1, 2, 3, ..., l), each holding a weight vector w_k = {w_k1, w_k2, w_k3, ..., w_kn}. The S-SOM is trained as follows.

STEP 1: Initialize the weight vectors w_k at random.
STEP 2: Present an input vector v_j to the S-SOM.
STEP 3: Find the winner node N_j^k for v_j:

  N_j^k = min_k f_j(k)                        (6)
  f_j(k) = Σ_{t=1}^{n} (a_jt − w_kt)^2        (7)

STEP 4: Update the weight vector w_k of N_j^k and of its neighboring nodes toward v_j:

  h(P_jx, P'_jx) = exp(−|P_jx − P'_jx|^2 / θ^2)   (8)
  Δw_k = β · h(P_jx, P'_jx) · f_j(k)              (9)
  w_k^new = w_k + Δw_k                            (10)

where P_jx is the position of the winner node N_j^k on the sphere, P'_jx is the position of the node being updated, θ controls the width of the neighborhood, and β is the learning rate.
Figure 13 illustrates the influence of the neighborhood width θ and the learning rate β on the learning.

Figure 13: S-SOM parameters
2.4.3 S-SOM U-matrix
The U-matrix visualizes the cluster structure of a SOM by coloring the map according to the distances between the weight vectors of adjacent nodes. Figure 14 shows a U-matrix on the S-SOM, with the distance values normalized between 0 and 1.

Figure 14: S-SOM U-matrix

On the S-SOM the U-matrix is computed as follows (Figs. 15, 16). For each pair of adjacent nodes N_a and N_b, with weight vectors w_a = {w_a1, ..., w_an} and w_b = {w_b1, ..., w_bn}, the edge distance

  U_ab = Σ_{j=1}^{n} (w_aj − w_bj)^2   (11)

is computed (U_ab = U_ba). Each triangle T_g of the spherical surface is then colored by a gray value V_g, and each node N_a by a value V_a, obtained by rescaling the edge distances U to M gray levels:

  V_g = L − (α U_g + (max_h(U_h) − M · min_h(U_h)) / (max_h(U_h) − min_h(U_h)))              (12)
  V_a = L − (1/6) Σ_j (α U_aj + (max_h(U_h) − M · min_h(U_h)) / (max_h(U_h) − min_h(U_h)))   (13)
  α = (M − 1) / (max_h(U_h) − min_h(U_h))                                                    (14)

so that min_h(U_h) and max_h(U_h) are mapped to the two ends of the M gray levels, L being the base gray value; V_a averages the rescaled values of the six edges incident to node N_a.

Figure 15: Computation of the S-SOM U-matrix
Figure 16: S-SOM U-matrix
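The edge-distance computation (11), followed by a linear rescaling to gray levels, can be sketched as follows; the tiny node and edge lists are hypothetical, and the rescaling is shown in the simple 0..levels−1 form rather than the exact offsets of (12)-(14).

```python
def umatrix_values(weights, edges, levels=256):
    """Squared edge distance (11), rescaled linearly to gray levels 0..levels-1."""
    u = {e: sum((wa - wb) ** 2 for wa, wb in zip(weights[e[0]], weights[e[1]]))
         for e in edges}
    lo, hi = min(u.values()), max(u.values())
    span = (hi - lo) or 1.0               # avoid division by zero
    return {e: int((levels - 1) * (v - lo) / span) for e, v in u.items()}

weights = [[0.0, 0.0], [0.1, 0.0], [1.0, 1.0]]
edges = [(0, 1), (1, 2), (0, 2)]          # hypothetical adjacency on the sphere
g = umatrix_values(weights, edges)        # edge (0, 1) gets the darkest value 0
```

A node value would then average the gray levels of its (six) incident edges, as in (13).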
3 Hidden Markov Models

3.1 Hidden Markov Models (HMM)
A hidden Markov model (HMM) is a probabilistic model of symbol sequences; its parameters are estimated with the Baum-Welch algorithm, and the most likely state sequence is found with the Viterbi algorithm.

3.2 HMM algorithms
The following notation is used for an HMM.

  Q = {q_1, q_2, q_3, ..., q_k} : the set of states
  a_{i,j} : the transition probability from q_i to q_j, with Σ_j a_{i,j} = 1
  s_i(x) : the probability that state q_i outputs symbol x, with Σ_x s_i(x) = 1
  w[t] : the symbol at position t of the input sequence
  Θ : the parameter set of the HMM

3.2.1 Forward algorithm
The Forward algorithm computes the probability L_w that an HMM generates a sequence w. Let w have length N, and let the M possible state paths of length N be

  R_m(W) = {r_m[0], r_m[1], r_m[2], ..., r_m[N]}  (m = 1, 2, 3, ..., M).   (15)

With g(m, t) = r_m[t], the likelihood of w is the sum over all paths:

  L_w = Σ_{m=1}^{M} Π_{t=1}^{N} s_{g(m,t−1)}(w[t]) · a_{g(m,t−1), g(m,t)}.   (16)

Figure 17 shows an example HMM over the symbols A, B, C; for the input sequence w = AABC it admits the three state paths listed in Table 1.
Figure 17: Example HMM
Table 1: The three paths R_1 to R_3 of w = AABC

The probability of each path is computed as follows.

R_1:
  a_{0,0} s_0(w[1]) · a_{0,1} s_0(w[2]) · a_{1,2} s_1(w[3]) · a_{2,3} s_2(w[4])
  = a_{0,0} s_0(A) · a_{0,1} s_0(A) · a_{1,2} s_1(B) · a_{2,3} s_2(C)
  = 0.2 · 0.3 · 0.8 · 0.3 · 0.5 · 0.1 · 0.3 · 0.2 = 0.0000432   (17)

R_2:
  a_{0,1} s_0(w[1]) · a_{1,1} s_1(w[2]) · a_{1,2} s_1(w[3]) · a_{2,3} s_2(w[4])
  = a_{0,1} s_0(A) · a_{1,1} s_1(A) · a_{1,2} s_1(B) · a_{2,3} s_2(C)
  = 0.8 · 0.3 · 0.5 · 0.7 · 0.5 · 0.1 · 0.3 · 0.2 = 0.0002520   (18)

R_3:
  a_{0,1} s_0(w[1]) · a_{1,2} s_1(w[2]) · a_{2,2} s_2(w[3]) · a_{2,3} s_2(w[4])
  = a_{0,1} s_0(A) · a_{1,2} s_1(A) · a_{2,2} s_2(B) · a_{2,3} s_2(C)
  = 0.8 · 0.3 · 0.5 · 0.7 · 0.7 · 0.3 · 0.3 · 0.2 = 0.0010584   (19)

L_w is the sum of (17) to (19):

  L_w = 0.0000432 + 0.0002520 + 0.0010584 = 0.0013536.   (20)

Enumerating all paths as in (17)-(19) quickly becomes intractable; the Forward algorithm instead computes L_w with the recursion

  f_j(t) = Σ_{q_i ∈ Q} f_i(t−1) · a_{i,j} · s_i(w[t])   (t ≥ 1),   (21)

with f_0(0) = 1, where f_j(t) is the probability of emitting the prefix w = {w[1], w[2], ..., w[t]} and arriving at state q_j.
Applying the Forward algorithm (21) to w = AABC:

  f_0(0) = 1.0
  f_0(1) = f_0(0) · a_{0,0} · s_0(A) = 1.0 · 0.2 · 0.3 = 0.06
  f_1(1) = f_0(0) · a_{0,1} · s_0(A) = 1.0 · 0.8 · 0.3 = 0.24
  f_1(2) = f_0(1) · a_{0,1} · s_0(A) + f_1(1) · a_{1,1} · s_1(A)
         = 0.06 · 0.8 · 0.3 + 0.24 · 0.5 · 0.7 = 0.0984
  f_2(2) = f_1(1) · a_{1,2} · s_1(A) = 0.24 · 0.5 · 0.7 = 0.084
  f_2(3) = f_1(2) · a_{1,2} · s_1(B) + f_2(2) · a_{2,2} · s_2(B)
         = 0.0984 · 0.5 · 0.1 + 0.084 · 0.7 · 0.3 = 0.02256
  f_3(4) = f_2(3) · a_{2,3} · s_2(C) = 0.02256 · 0.3 · 0.2 = 0.0013536

  L_w = f_3(4) = 0.0013536,

which agrees with (20).
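The recursion (21) can be sketched in Python as follows; the transition and emission tables are read off the example HMM of Fig. 17, with q_3 as the final state.

```python
# Transition probabilities a[i, j] and emission probabilities s[i, symbol]
# of the example HMM of Fig. 17 (states q0..q3, q3 final).
a = {(0, 0): 0.2, (0, 1): 0.8, (1, 1): 0.5,
     (1, 2): 0.5, (2, 2): 0.7, (2, 3): 0.3}
s = {(0, 'A'): 0.3, (1, 'A'): 0.7, (1, 'B'): 0.1,
     (2, 'B'): 0.3, (2, 'C'): 0.2}

def forward(word):
    """f[t][j]: probability of emitting word[:t] and reaching state j (eq. 21)."""
    f = [{0: 1.0}]                       # f_0(0) = 1
    for x in word:
        cur = {}
        for (i, j), aij in a.items():
            p = f[-1].get(i, 0.0) * aij * s.get((i, x), 0.0)
            if p:
                cur[j] = cur.get(j, 0.0) + p
        f.append(cur)
    return f

L_w = forward("AABC")[-1].get(3, 0.0)    # ≈ 0.0013536, as in eq. (20)
```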
3.2.2 Backward algorithm
While the Forward algorithm accumulates probabilities from the beginning of w, the Backward algorithm works from the end: b_i(t) is the probability of emitting the suffix {w[t+1], ..., w[N]} starting from state q_i, computed by

  b_i(t) = Σ_{q_j ∈ Q} b_j(t+1) · a_{i,j} · s_i(w[t+1]).   (22)

Applying it to the sequence AABC of Fig. 17 again yields L_w.
  b_3(4) = 1.0
  b_2(3) = b_3(4) · a_{2,3} · s_2(C) = 1.0 · 0.3 · 0.2 = 0.06
  b_2(2) = b_2(3) · a_{2,2} · s_2(B) = 0.06 · 0.7 · 0.3 = 0.0126
  b_1(2) = b_2(3) · a_{1,2} · s_1(B) = 0.06 · 0.5 · 0.1 = 0.003
  b_1(1) = b_2(2) · a_{1,2} · s_1(A) + b_1(2) · a_{1,1} · s_1(A)
         = 0.0126 · 0.5 · 0.7 + 0.003 · 0.5 · 0.7 = 0.00546
  b_0(1) = b_1(2) · a_{0,1} · s_0(A) = 0.003 · 0.8 · 0.3 = 0.00072
  b_0(0) = b_1(1) · a_{0,1} · s_0(A) + b_0(1) · a_{0,0} · s_0(A)
         = 0.00546 · 0.8 · 0.3 + 0.00072 · 0.2 · 0.3 = 0.0013536

  L_w = b_0(0) = 0.0013536,

and b_0(0) computed by the Backward algorithm equals f_3(4) computed by the Forward algorithm: b_0(0) = f_3(4).
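The recursion (22) can be sketched in the same style, again with the parameters of Fig. 17:

```python
a = {(0, 0): 0.2, (0, 1): 0.8, (1, 1): 0.5,
     (1, 2): 0.5, (2, 2): 0.7, (2, 3): 0.3}
s = {(0, 'A'): 0.3, (1, 'A'): 0.7, (1, 'B'): 0.1,
     (2, 'B'): 0.3, (2, 'C'): 0.2}

def backward(word, final_state=3):
    """b[t][i]: probability of emitting word[t:] starting from state i (eq. 22)."""
    n = len(word)
    b = [dict() for _ in range(n)] + [{final_state: 1.0}]
    for t in range(n - 1, -1, -1):
        x = word[t]                      # w[t+1] in the 1-based notation
        for (i, j), aij in a.items():
            p = b[t + 1].get(j, 0.0) * aij * s.get((i, x), 0.0)
            if p:
                b[t][i] = b[t].get(i, 0.0) + p
    return b

L_w = backward("AABC")[0].get(0, 0.0)    # b_0(0) ≈ 0.0013536 = f_3(4)
```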
3.2.3 Viterbi algorithm
Replacing the sum in the Forward recursion f_j(t) = Σ_{q_i} f_i(t−1) a_{i,j} s_i(w[t]) by a maximum gives the Viterbi algorithm, which finds the probability of the most likely state path of w:

  L̂_w = max_{R_m(w)} Π_{t=1}^{N} s_{g(m,t−1)}(w[t]) · a_{g(m,t−1), g(m,t)},  g(m, t) = r_m[t],   (23)

computed by the recursion

  f̂_j(t) = max_{q_i ∈ Q} f̂_i(t−1) · a_{i,j} · s_i(w[t]);   (24)

the maximizing path itself is recovered by backtracking the argmax choices.
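A sketch of (24) with backtracking, again for the HMM of Fig. 17; for AABC the most likely path is R_3 of Table 1.

```python
a = {(0, 0): 0.2, (0, 1): 0.8, (1, 1): 0.5,
     (1, 2): 0.5, (2, 2): 0.7, (2, 3): 0.3}
s = {(0, 'A'): 0.3, (1, 'A'): 0.7, (1, 'B'): 0.1,
     (2, 'B'): 0.3, (2, 'C'): 0.2}

def viterbi(word, final_state=3):
    """Probability of the most likely path (eq. 24) and the path itself."""
    v = [{0: (1.0, None)}]              # state -> (best probability, predecessor)
    for x in word:
        cur = {}
        for (i, j), aij in a.items():
            prev = v[-1].get(i)
            if prev is None:
                continue
            p = prev[0] * aij * s.get((i, x), 0.0)
            if p > cur.get(j, (0.0, None))[0]:
                cur[j] = (p, i)
        v.append(cur)
    path, j = [final_state], final_state
    for t in range(len(word), 0, -1):   # backtrack the argmax choices
        j = v[t][j][1]
        path.append(j)
    return v[-1][final_state][0], path[::-1]

p, path = viterbi("AABC")               # p ≈ 0.0010584, path = [0, 1, 2, 2, 3]
```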
3.2.4 Baum-Welch algorithm
The Viterbi algorithm finds the most likely state path of a sequence w under fixed parameters Θ; the Baum-Welch algorithm instead estimates Θ itself from training sequences, iteratively updating Θ so that the likelihood L_w computed by the Forward algorithm increases. Figure 18 shows the left-to-right HMM used in the following explanation.

Figure 18: A left-to-right HMM

The Baum-Welch algorithm uses the following quantities, for a training set W = {w_1, w_2, ..., w_l}:

  O_{i,j} : the expected number of transitions from q_i to q_j over W
  E_i(x) : the expected number of times symbol x is emitted from q_i over W
  P(w_k | Θ) : the probability of w_k under the current parameters Θ, i.e. P(w_k | Θ) = L_{w_k}; for the whole set, P(W | Θ) = Π_{k=1}^{l} L_k

Using the Forward and Backward variables,

  O_{i,j} = Σ_{k=1}^{l} (1 / P(w_k | Θ)) Σ_t f_i^k(t−1) · a_{i,j} · s_i(w_k[t]) · b_j^k(t)   (25)
  E_i(x) = Σ_{k=1}^{l} (1 / P(w_k | Θ)) Σ_{t : w_k[t] = x} f_i^k(t−1) · b_i^k(t−1)           (26)

and the parameters are re-estimated from O_{i,j} and E_i(x) as

  a_{i,j} = O_{i,j} / Σ_j O_{i,j}            (27)
  s_i(x) = E_i(x) / Σ_{x'} E_i(x')           (28)
For the single training sequence w_1 = AABC and the HMM of Fig. 17, P(w_1 | Θ) = L_w, so

  O_{i,j} = Σ_{k=1}^{1} (1 / P(w_k | Θ)) Σ_t f_i(t−1) · a_{i,j} · s_i(w_1[t]) · b_j(t)
          = Σ_t f_i(t−1) · a_{i,j} · s_i(w_1[t]) · b_j(t) / L_w.

Defining ξ_t(i, j), the probability of transiting from q_i to q_j at position t,

  ξ_t(i, j) = f_i(t−1) · a_{i,j} · s_i(w_k[t]) · b_j(t) / L_w,

we have O_{i,j} = Σ_t ξ_t(i, j). The re-estimation proceeds as PROCESS 1 to PROCESS 4 below.
PROCESS 1: compute ξ_t(i, j).

  ξ_1(0,0) = f_0(0)·a_{0,0}·s_0(A)·b_0(1)/L_w = 1.0·0.2·0.3·0.00072/0.00135360 = 0.03191489
  ξ_1(0,1) = f_0(0)·a_{0,1}·s_0(A)·b_1(1)/L_w = 1.0·0.8·0.3·0.00546/0.00135360 = 0.96808510
  ξ_2(0,1) = f_0(1)·a_{0,1}·s_0(A)·b_1(2)/L_w = 0.06·0.8·0.3·0.003/0.00135360 = 0.03191489
  ξ_2(1,1) = f_1(1)·a_{1,1}·s_1(A)·b_1(2)/L_w = 0.24·0.5·0.7·0.003/0.00135360 = 0.18617021
  ξ_2(1,2) = f_1(1)·a_{1,2}·s_1(A)·b_2(2)/L_w = 0.24·0.5·0.7·0.0126/0.00135360 = 0.78191489
  ξ_3(1,2) = f_1(2)·a_{1,2}·s_1(B)·b_2(3)/L_w = 0.0984·0.5·0.1·0.06/0.00135360 = 0.21808510
  ξ_3(2,2) = f_2(2)·a_{2,2}·s_2(B)·b_2(3)/L_w = 0.084·0.7·0.3·0.06/0.00135360 = 0.78191489
  ξ_4(2,3) = f_2(3)·a_{2,3}·s_2(C)·b_3(4)/L_w = 0.02256·0.3·0.2·1.0/0.00135360 = 1.0
At each position the ξ values sum to one:

  ξ_1(0,0) + ξ_1(0,1) = 1.0
  ξ_2(0,1) + ξ_2(1,1) + ξ_2(1,2) = 1.0
  ξ_3(1,2) + ξ_3(2,2) = 1.0
  ξ_4(2,3) = 1.0

PROCESS 2: re-estimate a_{i,j} from the ξ_t(i, j) of PROCESS 1. Since a_{i,j} = O_{i,j} / Σ_j O_{i,j} and O_{i,j} = Σ_t ξ_t(i, j), substituting gives

  a_{i,j} = Σ_t ξ_t(i, j) / Σ_j Σ_t ξ_t(i, j).   (29)

The re-estimated a_{i,j} are computed as follows.
  a_{0,0} = ξ_1(0,0) / (ξ_1(0,0) + ξ_1(0,1) + ξ_2(0,1))
          = 0.03191489 / (0.03191489 + 0.96808510 + 0.03191489) ≈ 0.03092783
  a_{0,1} = (ξ_1(0,1) + ξ_2(0,1)) / (ξ_1(0,0) + ξ_1(0,1) + ξ_2(0,1))
          = (0.96808510 + 0.03191489) / (0.03191489 + 0.96808510 + 0.03191489) ≈ 0.96907217
  a_{1,1} = ξ_2(1,1) / (ξ_2(1,1) + ξ_2(1,2) + ξ_3(1,2))
          = 0.18617021 / (0.18617021 + 0.78191489 + 0.21808510) ≈ 0.15695067
  a_{1,2} = (ξ_2(1,2) + ξ_3(1,2)) / (ξ_2(1,1) + ξ_2(1,2) + ξ_3(1,2))
          = (0.78191489 + 0.21808510) / (0.18617021 + 0.78191489 + 0.21808510) ≈ 0.84304933
  a_{2,2} = ξ_3(2,2) / (ξ_3(2,2) + ξ_4(2,3)) = 0.78191489 / (0.78191489 + 1.0) ≈ 0.43880597
  a_{2,3} = ξ_4(2,3) / (ξ_3(2,2) + ξ_4(2,3)) = 1.0 / (0.78191489 + 1.0) ≈ 0.56119403

Each row of the re-estimated transition matrix again sums to one.
PROCESS 3: re-estimate s_i(x). For a single training sequence (l = 1),

  Σ_{k=1}^{l} 1 / P(w_k | Θ) = 1 / P(w_1 | Θ) = 1 / L_w,

so (26) becomes

  E_i(x) = (1 / L_w) Σ_{t : w_k[t] = x} f_i(t−1) · b_i(t−1).   (30)

Expanding b_i(t−1) with (22),

  E_i(x) = (1 / L_w) Σ_{t : w_k[t] = x} f_i(t−1) Σ_{q_j ∈ Q} b_j(t) · a_{i,j} · s_i(w[t])
         = Σ_{t : w_k[t] = x} Σ_{q_j ∈ Q} f_i(t−1) · b_j(t) · a_{i,j} · s_i(w[t]) / L_w,   (31)

and since ξ_t(i, j) = f_i(t−1) · a_{i,j} · s_i(w_k[t]) · b_j(t) / L_w, (31) becomes

  E_i(x) = Σ_{t : w_k[t] = x} Σ_{q_j ∈ Q} ξ_t(i, j).   (32)

Substituting into (28),

  s_i(x) = E_i(x) / Σ_{x'} E_i(x')
         = Σ_{t : w_k[t] = x} Σ_{q_j} ξ_t(i, j) / Σ_{x'} Σ_{t : w_k[t] = x'} Σ_{q_j} ξ_t(i, j)
         = Σ_{t : w_k[t] = x} Σ_{q_j} ξ_t(i, j) / Σ_t Σ_{q_j} ξ_t(i, j).   (33)

Applying (33), the re-estimated s_i(x) are:

  s_0(A) = (ξ_1(0,0) + ξ_1(0,1) + ξ_2(0,1)) / (ξ_1(0,0) + ξ_1(0,1) + ξ_2(0,1)) = 1.0
  s_0(B) = 0 / (ξ_1(0,0) + ξ_1(0,1) + ξ_2(0,1)) = 0.0
  s_0(C) = 0 / (ξ_1(0,0) + ξ_1(0,1) + ξ_2(0,1)) = 0.0
  s_1(A) = (ξ_2(1,1) + ξ_2(1,2)) / (ξ_2(1,1) + ξ_2(1,2) + ξ_3(1,2))
         = (0.18617021 + 0.78191489) / (0.18617021 + 0.78191489 + 0.21808510) ≈ 0.81614350
  s_1(B) = ξ_3(1,2) / (ξ_2(1,1) + ξ_2(1,2) + ξ_3(1,2))
         = 0.21808510 / (0.18617021 + 0.78191489 + 0.21808510) ≈ 0.18385650
  s_1(C) = 0 / (ξ_2(1,1) + ξ_2(1,2) + ξ_3(1,2)) = 0.0
  s_2(A) = 0 / (ξ_3(2,2) + ξ_4(2,3)) = 0.0
  s_2(B) = ξ_3(2,2) / (ξ_3(2,2) + ξ_4(2,3)) = 0.78191489 / (0.78191489 + 1.0) ≈ 0.43880597
  s_2(C) = ξ_4(2,3) / (ξ_3(2,2) + ξ_4(2,3)) = 1.0 / (0.78191489 + 1.0) ≈ 0.56119403
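One full re-estimation sweep (ξ of PROCESS 1, then (29) and (33)) can be sketched as follows for the example HMM of Fig. 17 and the single training sequence AABC.

```python
a = {(0, 0): 0.2, (0, 1): 0.8, (1, 1): 0.5,
     (1, 2): 0.5, (2, 2): 0.7, (2, 3): 0.3}
s = {(0, 'A'): 0.3, (1, 'A'): 0.7, (1, 'B'): 0.1,
     (2, 'B'): 0.3, (2, 'C'): 0.2}

def forward(word):
    f = [{0: 1.0}]
    for x in word:
        cur = {}
        for (i, j), aij in a.items():
            p = f[-1].get(i, 0.0) * aij * s.get((i, x), 0.0)
            if p:
                cur[j] = cur.get(j, 0.0) + p
        f.append(cur)
    return f

def backward(word, final_state=3):
    n = len(word)
    b = [dict() for _ in range(n)] + [{final_state: 1.0}]
    for t in range(n - 1, -1, -1):
        for (i, j), aij in a.items():
            p = b[t + 1].get(j, 0.0) * aij * s.get((i, word[t]), 0.0)
            if p:
                b[t][i] = b[t].get(i, 0.0) + p
    return b

def reestimate(word):
    """One Baum-Welch sweep: xi (PROCESS 1), new a (eq. 29), new s (eq. 33)."""
    f, b = forward(word), backward(word)
    L = f[-1][3]
    xi = {}                              # (t, i, j) -> xi_t(i, j)
    for t in range(1, len(word) + 1):
        for (i, j), aij in a.items():
            p = f[t - 1].get(i, 0.0) * aij * s.get((i, word[t - 1]), 0.0) * b[t].get(j, 0.0)
            if p:
                xi[(t, i, j)] = p / L
    out = {}                             # total xi leaving each state i
    for (t, i, j), v in xi.items():
        out[i] = out.get(i, 0.0) + v
    new_a = {(i, j): sum(v for (t2, i2, j2), v in xi.items()
                         if (i2, j2) == (i, j)) / out[i]
             for (i, j) in a}
    new_s = {(i, x): sum(v for (t2, i2, j2), v in xi.items()
                         if i2 == i and word[t2 - 1] == x) / out[i]
             for (i, x) in s}
    return new_a, new_s

new_a, new_s = reestimate("AABC")
# new_a[(0, 1)] ≈ 0.96907217, new_s[(1, 'A')] ≈ 0.81614350
```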
3.3 HMM
The Forward, Backward and Baum-Welch algorithms require knowing which state transitions can actually occur at each position of the input. For the HMM of Fig. 17 and the sequence AABC of Table 1, a k × k matrix G_t is defined for every position t:

  G = {G_1, G_2, ..., G_t, ..., G_l},

        | Γ_t(0,0)     Γ_t(0,1)     ...  Γ_t(0,k−1)   |
  G_t = | Γ_t(1,0)     Γ_t(1,1)     ...  Γ_t(1,k−1)   |
        | ...          ...               ...          |
        | Γ_t(k−1,0)   Γ_t(k−1,1)   ...  Γ_t(k−1,k−1) |

here with k = 4 states. The entry Γ_t(i, j) is 0 if the transition from state i to state j can emit w[t], i.e. if s_i(w[t]) · a_{i,j} ≠ 0, and 1 otherwise. At t = 1 the model must start in state 0, so Γ_1(i, j) = 1 for every i ≠ 0; since w[1] = A,

  Γ_1(0, 0) = 0  (because s_0(A) · a_{0,0} ≠ 0)
  Γ_1(0, 1) = 0  (because s_0(A) · a_{0,1} ≠ 0)
  Γ_1(0, 2) = 1  (because s_0(A) · a_{0,2} = 0)
  Γ_1(0, 3) = 1  (because s_0(A) · a_{0,3} = 0)

  Γ_1(1, j) = Γ_1(2, j) = Γ_1(3, j) = 1  (j = 0, ..., 3).

Similarly, at the last position t = 4 only transitions into the final state 3 (= k − 1 = 4 − 1) are allowed, so every entry with j ≠ 3, and every entry with a_{i,3} = 0, is set to 1. Figure 19 shows the resulting matrices G_t for t = 1 to t = 4.

Figure 19: The matrices G_t

The matrices are then pruned so that only transitions lying on a complete path from the initial state at t = 1 to the final state at t = l remain.

STEP 1: If column j of G_t contains no 0 (state j cannot be reached at position t), set every entry of row j of G_{t+1} to 1; apply this from t = 1 up to t = l − 1 (Fig. 20).

STEP 2: If row i of G_t contains no 0 (no transition leaves state i at position t), set every entry of column i of G_{t−1} to 1; apply this from t = l down to t = 2 (Fig. 21).

Figure 20: Forward pruning of G
Figure 21: Backward pruning of G

The two steps are repeated until no entry Γ_t(i, j) changes between 0 and 1. A transition from state i to state j is then possible at position t exactly when Γ_t(i, j) = 0. Figure 22 shows the pruned matrices for AABC, in which only the three paths of Table 1 survive.

Figure 22: The pruned matrices G for AABC
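The construction of the G_t matrices and the two pruning passes can be sketched as follows, again with the parameters of Fig. 17; the while-loop repeats the forward and backward steps until nothing changes.

```python
a = {(0, 0): 0.2, (0, 1): 0.8, (1, 1): 0.5,
     (1, 2): 0.5, (2, 2): 0.7, (2, 3): 0.3}
s = {(0, 'A'): 0.3, (1, 'A'): 0.7, (1, 'B'): 0.1,
     (2, 'B'): 0.3, (2, 'C'): 0.2}
K = 4                                    # number of states

def build_g(word):
    """G[t][i][j] = 0 iff transition i -> j can emit word[t]."""
    g = [[[0 if s.get((i, x), 0.0) * a.get((i, j), 0.0) > 0 else 1
           for j in range(K)] for i in range(K)] for x in word]
    for i in range(1, K):                # position 1 must start from state 0
        g[0][i] = [1] * K
    for i in range(K):                   # last position must enter the final state
        for j in range(K - 1):
            g[-1][i][j] = 1
    return g

def prune(g):
    """Keep only transitions lying on a complete start-to-final path."""
    changed = True
    while changed:
        changed = False
        for t in range(len(g) - 1):      # STEP 1: unreachable columns
            for j in range(K):
                if all(g[t][i][j] for i in range(K)) and any(v == 0 for v in g[t + 1][j]):
                    g[t + 1][j] = [1] * K
                    changed = True
        for t in range(len(g) - 1, 0, -1):  # STEP 2: dead-end rows
            for i in range(K):
                if all(g[t][i]) and any(g[t - 1][r][i] == 0 for r in range(K)):
                    for r in range(K):
                        g[t - 1][r][i] = 1
                    changed = True
    return g

g = prune(build_g("AABC"))               # zeros left: exactly the paths of Table 1
```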
4 Spherical Hidden Markov Self-Organizing Map (HMM-S-SOM)

The spherical hidden Markov self-organizing map is an S-SOM whose nodes contain hidden Markov models (Fig. 23); its input is not a set of vectors but a set of symbol sequences. The HMMs held by the nodes all share the same structure. Rather than classifying the data directly, this learning model clusters, on the sphere, the probabilistic models behind the data, and presents the resulting classification in a form that is also easy to grasp visually. Table 2 summarizes the differences between the S-SOM and the HMM-S-SOM: besides the different forms of the nodes and of the input data, the way the winner node is determined and the way the nodes are updated differ. The details are given in the description of the HMM-S-SOM learning algorithm below.

Figure 23: Spherical hidden Markov self-organizing map
Table 2: Differences between the S-SOM and the HMM-S-SOM

4.1 HMM-S-SOM learning algorithm

STEP 0: Initialize the HMM Θ_k of every node at random.
STEP 1: Present an input sequence to the map.
STEP 2: Evaluate the input with the Baum-Welch algorithm on the HMM of every node, take as winner the node whose HMM gives the input the highest likelihood, and go to STEP 3.
STEP 3: Retrain the HMM of the winner node with the Baum-Welch algorithm, and propagate the update to the HMMs of the neighboring nodes, weighted by the SOM neighborhood function.

As in the ordinary SOM, STEP 1 to STEP 3 are repeated for all input sequences. In STEP 2, as illustrated in Fig. 24, each Baum-Welch step raises the likelihood that the winner's HMM assigns to the presented sequence, so that sequences generated by similar models come to share neighboring winner nodes, which are then updated together in STEP 3.

Figure 24: Learning of the HMM-S-SOM
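The likelihood-based winner selection of STEP 2 can be sketched as follows; the two node HMMs below are hypothetical examples sharing the structure of Fig. 17, and the winner is the node whose HMM scores the input highest.

```python
def likelihood(word, a, s, final_state=3):
    """Forward likelihood of word under an HMM given as (a, s) tables."""
    f = {0: 1.0}
    for x in word:
        cur = {}
        for (i, j), aij in a.items():
            p = f.get(i, 0.0) * aij * s.get((i, x), 0.0)
            if p:
                cur[j] = cur.get(j, 0.0) + p
        f = cur
    return f.get(final_state, 0.0)

# two hypothetical node HMMs: same structure, different parameters
a1 = {(0, 0): 0.2, (0, 1): 0.8, (1, 1): 0.5,
      (1, 2): 0.5, (2, 2): 0.7, (2, 3): 0.3}
s1 = {(0, 'A'): 0.3, (1, 'A'): 0.7, (1, 'B'): 0.1,
      (2, 'B'): 0.3, (2, 'C'): 0.2}
a2 = dict(a1)
s2 = {(0, 'A'): 0.1, (1, 'A'): 0.2, (1, 'B'): 0.6,
      (2, 'B'): 0.1, (2, 'C'): 0.8}

nodes = [(a1, s1), (a2, s2)]
win = max(range(len(nodes)), key=lambda k: likelihood("AABC", *nodes[k]))
```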
4.2
The experiment uses HMMs over the three symbols A, B, C; both left-to-right and ergodic structures can be used, and the resulting maps are visualized with the U-matrix. Ten generating models M_1, M_2, ..., M_10 were prepared; Tables 3 to 12 list their transition probabilities and Tables 13 to 22 their emission probabilities (Fig. 25).

Table 3: M_1    Table 4: M_2    Table 5: M_3    Table 6: M_4
Table 7: M_5    Table 8: M_6    Table 9: M_7    Table 10: M_8
Table 11: M_9   Table 12: M_10
Table 13: M_1   Table 14: M_2   Table 15: M_3   Table 16: M_4
Table 17: M_5   Table 18: M_6   Table 19: M_7   Table 20: M_8
Table 21: M_9   Table 22: M_10

Figure 25: The training models

4.2.1 HMM-S-SOM
Figure 26 shows the mapping result for sequences generated by M_1 to M_10. Around the winner node of the M_1 data (Data1), the data of M_3, M_8, M_9, M_7, M_2, M_5, M_6 and M_4 are arranged according to their similarity to M_1, and the U-matrix reflects the distances between the HMMs acquired by the nodes.

Figure 26: Mapping result of the HMM-S-SOM

A further experiment with the models model 1 to model 4 was carried out, and the HMMs acquired by the nodes were again compared through the U-matrix (Fig. 27).

Figure 27: Mapping result
5 Frequency Integrated Spherical Hidden Markov Self-Organizing Map (F-HMM-S-SOM)

The HMM-S-SOM runs the Baum-Welch algorithm on the node HMMs at every presentation, which is computationally expensive. By treating the HMM parameters themselves as the node vectors of the SOM, the SOM update and the HMM estimation can be combined; the resulting model is the frequency integrated spherical hidden Markov self-organizing map (F-HMM-S-SOM) (Fig. 28), whose input combines the symbol sequences with frequency information about their adjacent symbols.

Figure 28: F-HMM-S-SOM
In the F-HMM-S-SOM, input sequences over the symbols A, B, C are converted into frequency vectors of adjacent symbol pairs, each pair consisting of the n-th and (n+1)-th symbols. A sequence of length l contains l − 1 adjacent pairs; with m symbols there are m^2 ordered pairs and m single symbols, so with a maximum sequence length l_max every input is encoded as a bit vector of length (m^2 + m)(l_max − 1), here 12(l − 1) bits for m = 3. Each pair is represented by a 12-bit one-hot code whose positions correspond to A, B, C, AA, AB, AC, BA, BB, BC, CA, CB, CC. For example (Fig. 29), ACCB is decomposed into the pairs AC, CC, CB, which are encoded as

  AC → 000001000000
  CC → 000000000001
  CB → 000000000010

so that ACCB becomes 000001000000 000000000001 000000000010, and BCC becomes 000000001000 000000000001.

Figure 29: Pair encoding

Sequences shorter than l_max are padded with 0 (Fig. 30).

Figure 30: Input vector

The following notation is used.

  Θ_k : the HMM of node k, Θ_k = {A_k, S_k}

        | a_{01}^k  a_{02}^k  ...  a_{0j}^k |          | s_{01}^k  s_{02}^k  ...  s_{0h}^k |
  A_k = | a_{11}^k  a_{12}^k  ...  a_{1j}^k |    S_k = | s_{11}^k  s_{12}^k  ...  s_{1h}^k |
        | ...                               |          | ...                               |
        | a_{i1}^k  a_{i2}^k  ...  a_{ij}^k |          | s_{i1}^k  s_{i2}^k  ...  s_{ih}^k |

where a_{i,j} is the transition probability from state i to state j, and s_{i,h} is the probability that state i outputs symbol x_h of the symbol set X = {x_1, x_2, x_3, ..., x_h}, with V = {v_1, v_2, v_3, ..., v_h}.

The F-HMM-S-SOM is trained as follows.

STEP 1: Initialize the HMM Θ_k of every node at random.
STEP 2: For the input W_n, apply the Baum-Welch algorithm to the HMM of each node N_i to obtain the likelihood L_i and the value E_i.
STEP 3: Determine the winner node as the node maximizing

  (1 − E_i / E_max) · L_i,

where E_max is the maximum of the E_i.
STEP 4: Retrain the HMM of the winner node with the Baum-Welch algorithm.
STEP 5: Update the neighboring nodes toward the winner. Let Θ_k^n = {A_k^n, S_k^n} be the HMM obtained for the winner node N_k^n in STEP 4. Each node updates its parameters by

  ΔA = β · h(P_nx, P'_nx) · (A_k^n − A_k)
  A_k^new = A_k + ΔA
  ΔS = β · h(P_nx, P'_nx) · (S_k^n − S_k)
  S_k^new = S_k + ΔS
  Θ_k^new = {A_k^new, S_k^new}

with the neighborhood function

  h(P_nx, P'_nx) = exp(−|P_nx − P'_nx|^2 / θ^2),

where P_nx is the position of the winner node, P'_nx the position of the node being updated, θ the neighborhood width and β the learning rate; this replaces the per-node Baum-Welch training of the HMM-S-SOM.
STEP 6: As in the SOM, STEP 2 to STEP 5 are repeated for all inputs, and the result is visualized with the U-matrix.
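The pair encoding described above can be sketched as follows; the position table (the three single symbols followed by the nine ordered pairs) is inferred from the bit patterns given for AC, CC, CB and BC.

```python
SYMS = "ABC"
# positions: A, B, C, then the ordered pairs AA, AB, AC, BA, ..., CC
CODES = list(SYMS) + [p + q for p in SYMS for q in SYMS]

def encode(seq, l_max=None):
    """One-hot encode the adjacent pairs of seq (12 bits per pair for m = 3)."""
    bits = ""
    for p, q in zip(seq, seq[1:]):
        unit = ["0"] * len(CODES)
        unit[CODES.index(p + q)] = "1"
        bits += "".join(unit)
    if l_max is not None:                # zero-pad to (m^2 + m)(l_max - 1) bits
        bits = bits.ljust(len(CODES) * (l_max - 1), "0")
    return bits

print(encode("ACCB"))   # 000001000000000000000001000000000010
```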
5.1 F-HMM-S-SOM
The F-HMM-S-SOM was first compared with the HMM-S-SOM on the same data.

5.1.1 F-HMM-S-SOM
Figure 31 shows the U-matrix of the HMMs learned by the F-HMM-S-SOM.

Figure 31: U-matrix of the learned HMMs

The corresponding map of the input data is shown in Fig. 32.

Figure 32: U-matrix

5.2 F-HMM-S-SOM
The F-HMM-S-SOM was then applied to real data, DNA sequences and stock-market data, and compared with the HMM-S-SOM.

5.2.1 DNA sequences
DNA sequences over the four bases A, T, G, C were used [14]: 1024-base sequences of six species (Hsall, Mmuall, Cfaall, Ecoall, Dmaall, Osaall). They were first mapped with the HMM-S-SOM; Fig. 33 shows the resulting U-matrix.

Figure 33: HMM-S-SOM result

The same data mapped with the F-HMM-S-SOM yields the U-matrix of the learned HMMs in Fig. 34 and the map of the data in Fig. 35; the species are separated into clearer clusters than with the HMM-S-SOM.

Figure 34: F-HMM-S-SOM: U-matrix
Figure 35: F-HMM-S-SOM: U-matrix
5.2.2 Stock-market data
Daily stock price data from 2010 were used (Fig. 36). Each day's movement was converted into one of three symbols: U (up), D (down) and S (stay), with U and D assigned according to the sign of the price change, and the resulting symbol sequences were presented to the F-HMM-S-SOM. The node HMMs have 4 states over the 3 symbols U, S, D (Fig. 37), and the results are visualized with the U-matrix.

Figure 36: Stock price data

Figure 37: HMM for the stock data

The learned maps were used to predict the movement at day t + 1 from the sequence up to day t, comparing the HMM-S-SOM and the F-HMM-S-SOM. Figures 38 and 39 show the predictions for the first (days 0-22) and second (days 23-46) halves of the period. On the U-matrix of the HMM-S-SOM the patterns are less clearly separated than on that of the F-HMM-S-SOM.
Figure 38: Prediction results (first half)
Figure 39: Prediction results (second half)
The mean prediction error was 1.377 for the F-HMM-S-SOM against 2.623 for the HMM-S-SOM. Figure 40 compares the predictions of the two models with the actual price movements.

Figure 40: Comparison of the predictions
The predictions were also compared with moving-average lines (Fig. 41); Figure 42 covers days 0-23 and Figure 43 days 24-46.

Figure 41: Comparison with moving averages
Figure 42: Comparison with moving-average lines
Figure 43: Comparison with moving-average lines
Table 23:
6

This chapter proposes a two-layer SOM for analyzing which input attributes are responsible for the observed clustering. As shown in Fig. 44, the model consists of a first and a second layer; each input vector x_i of the first layer is paired with a mask vector w_i whose elements take values between 0 and 1: an element close to 1 means the corresponding attribute is used, an element close to 0 that it is masked.

Figure 44: Two-layer SOM

Figure 45 shows the structure: the first layer is a SOM for the input vectors, while the nodes h_a of the second layer learn the mask vectors.

Figure 45: Structure of the two-layer SOM

The learning proceeds as follows.

STEP 1: Present an input x_i = {x_i0, x_i1, ..., x_in} of the first layer together with a node vector h_a = {h_a0, h_a1, ..., h_an} of the second layer; each input has a mask vector w_i = {w_i0, w_i1, ..., w_in}.
STEP 2: Determine the winner of the first layer by the masked distance

  argmin_i { Σ_j w_ij (h_aj v_aj − w_ij x_ij) } / L^2,  L = sqrt(Σ_j w_ij^2),   (34)

where L normalizes by the magnitude of the mask vector w_i.
STEP 3: Update the mask vector w_i. Sort its elements μ_a in decreasing order of contribution ω (Fig. 46), and find the cut position m by

  m = argmin_m { (1/m) Σ_{k=2}^{m} (ω_k − ω_m)^2 } ≤ φ,   (35)

with threshold φ = 0.5. For an element μ_t whose rank is f(t) = k,

  w_it^new = w_it^old + γ(1 − w_it^old)   if f(t) ≤ m     (36)
  w_it^new = w_it^old + γ(0 − w_it^old)   if m < f(t)     (37)

where γ (0 < γ < 1) is a learning rate; the elements of w_i are thus driven toward 1 or 0 and copied back to h_a.

Figure 46: Sorting of the mask elements

STEP 4: Update the first-layer node vectors:

  x_j^new = x_j^old + α(v_a − x_j^old)    (38)
  w_j^new = w_j^old + β(w_i − w_j^old)    (39)

with learning rates 0 < α < 1 and 0 < β < 1.
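The updates (36)-(39) are plain interpolation steps; a minimal sketch, in which the rank list, the cut position m and the rate values are hypothetical inputs supplied by the caller:

```python
def update_mask(w, m, ranks, gamma=0.2):
    """Drive mask elements toward 1 (rank <= m) or 0 (rank > m), eqs. 36-37."""
    return [wt + gamma * ((1.0 if ranks[t] <= m else 0.0) - wt)
            for t, wt in enumerate(w)]

def update_node(x, v, alpha=0.3):
    """Move a first-layer node vector toward the input, eq. 38 (eq. 39 is analogous)."""
    return [xj + alpha * (vj - xj) for xj, vj in zip(x, v)]

w = update_mask([0.5, 0.5, 0.5], m=1, ranks=[1, 2, 3])   # first element grows, rest shrink
x = update_node([0.0, 1.0], [1.0, 0.0])
```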
STEP 2 to STEP 4 are repeated for all inputs and for each second-layer node h_a.

6.1
The model was evaluated on three artificial data sets. The noise terms are uniform random numbers in [0, 1] generated with the Mersenne Twister [15]. Data set 1 consists of points (x, y, z) generated as

  y = 3x + noise,  z = noise   (x ≤ 0.5)
  y = 2x + noise,  z = noise   (x > 0.5)

so that the relation between x and y changes at x = 0.5 while z is pure noise (Fig. 47).

Figure 47: Data set 1

Data set 2 (Fig. 48) is generated as

  y = 3x + 2·noise,  z = noise    (x ≤ 0.5)
  y = 2z + noise + 2,  x = noise  (x > 0.5)

so that y depends on x in one region and on z in the other.

Figure 48: Data set 2

Figure 49: Data set 3

6.1.1
Figure 50 shows the result for data set 1: the mask vectors concentrate on x and y (about 0.819) and suppress the noise dimension z. Figure 51 shows, for comparison, the mapping of an ordinary SOM.

Figure 50: Result for data set 1

Figure 51: SOM result for data set 1

6.1.2
Figure 52 shows the result for data set 2: the mask separates the inputs whose y depends on x from those whose y depends on z, the corresponding weight being about 0.335. Figure 53 shows the ordinary SOM result for x, y, z.

Figure 52: Result for data set 2

Figure 53: SOM result for data set 2

6.1.3
Data set 3 is shown in Figs. 54 and 55; the dependence switches between the xy- and yz-planes at x = 0.5. Figure 56 shows the result: the map separates the region in which y depends on x from the region in which y depends on z, whereas the ordinary SOM (Fig. 57) does not make this separation visible.

Figure 54: Data set 3
Figure 55: Data set 3
Figure 56: Result for data set 3
Figure 57: SOM result for data set 3
7

… SOM … F-HMM-S-SOM … DNA … k-means …
[1] …, 2012.
[2] …: …, DEIM Forum 2010 F4-2, 2010.
[3] Teuvo Kohonen: ….
[4] Chie Morita and Hiroshi Tsukimoto: Knowledge discovery from numerical data, Knowledge-based Systems, Vol.10, No.7, pp.413-419, 1998.
[5] …, 2009.
[6] …: … HMM-SOM ….
[7] …: … HMM-SOM … MIDI ….
[8] …: A spherical SOM and its examples, Technical Report of IEICE.
[9] …: S-SOM …, pp.153-156, 2004.
[10] …: … SOM …, Vol.2010-MPS-77, No.29, 2010.
[11] Ying Xin WU: … SOM …, Vol.19, No.6, pp.611-617, 2007.
[12] N2Factory: … DirectX ….
[13] Jeffrey Richter, Christophe Nasarre: Advanced Windows, 5th edition, … BP, 2008.
[14] Hiroshi Dozono et al.: A Design Method of DNA Chips for Sequence Analysis Using Self Organizing Maps, Proceedings of WSOM 2003, pp.116-121, 2003.
[15] M. Matsumoto and T. Nishimura: Mersenne Twister: A 623-dimensionally equidistributed uniform pseudorandom number generator, ACM Transactions on Modeling and Computer Simulation, Vol.8, No.1, pp.3-30, 1998.
Journal papers

1. Gen Niina, Kazuhiro Muramatsu, Hiroshi Dozono: Basic Study on the Classification of Time Series Data Using a Frequency Integrated Spherical Hidden Markov Self Organizing Map, Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol.19, No.2, pp.212-216, 2015.

International conferences

1. Gen Niina, Kazuhiro Muramatsu, Hiroshi Dozono: Causal analysis of data using 2-layerd Spherical Self-Organizing Map, The 2015 International Conference on Computational Science and Computational Intelligence (CSCI'15), Las Vegas, USA, pp.198-202, 2015.
2. Gen Niina, Kazuhiro Muramatsu, Hiroshi Dozono, Tatsuya Chuuto: The data analysis of stock market using a Frequency Integrated Spherical Hidden Markov Self Organizing Map, ICSIIT 2015, 4th International Conference on Soft Computing, Intelligent System and Information Technology, March 11-14, 2015, Bali, Indonesia, pp.195-204, 2015.
3. Gen Niina, Kazuhiro Muramatsu, Hiroshi Dozono: The Frequency Integrated Spherical Hidden Markov Self Organizing Map for Learning Time Series Data, ISIS 2013, The International Symposium on Advanced Intelligent Systems, Daejeon, Korea, pp.362-370, 2013.
4. Hiroshi Dozono, Gen Niina and Kazuhiro Muramatsu: Mapping of DNA sequences using Hidden Markov Model Self-Organizing Maps, Proceedings of The 10th annual IEEE Symposium on Computational Intelligence in Bioinformatics and Computational Biology, IEEE Press, Singapore, 2013.
5. Hiroshi Dozono, Gen Niina and Yutaro Kaneko: Mapping the Groups of DNA Sequences using Hidden Markov Model Self Organizing Maps, The First BMIRC International Symposium on Frontiers in Computational Systems Biology and Bioengineering, Fukuoka, Japan, 2013.
6. Gen Niina, Hiroshi Dozono: The Spherical Hidden Markov Self Organizing Map for Learning Time Series Data, 22nd International Conference on Artificial Neural Networks, Lausanne, Switzerland, Part I, pp.563-570, 2012.

Domestic presentations

1. …: …, 2014.
2. …: DirectX … SOM …, 2011.
3. …: …, 2011.