
1 … 3
1.1 … 3
1.2 … 4
1.3 … 6
2 Self-Organizing Map (SOM) 7
2.1 SOM … 7
2.2 SOM … 8
2.2.1 SOM … 12
2.3 SOM … 13
2.3.1 … 13
2.4 Spherical SOM (S-SOM) 15
2.4.1 S-SOM … 16
2.4.2 S-SOM … 18
2.4.3 S-SOM U-matrix 21
3 Hidden Markov Models 24
3.1 Hidden Markov Model (HMM) 24
3.2 HMM … 24
3.2.1 Forward algorithm 25
3.2.2 Backward algorithm 30
3.2.3 Viterbi algorithm 32
3.2.4 Baum-Welch algorithm 33
3.3 HMM … 43
4 Hidden Markov Spherical Self-Organizing Map (HMM-S-SOM) 51

1

4.1 HMM-S-SOM … 52
4.2 … 54
4.2.1 HMM-S-SOM … 58
5 Frequency Integrated HMM-S-SOM (F-HMM-S-SOM) 61
5.1 F-HMM-S-SOM … 67
5.1.1 F-HMM-S-SOM … 67
5.2 F-HMM-S-SOM … 68
5.2.1 DNA … 68
5.2.2 … 71
6 … 83
6.1 … 87
6.1.1 … 89
6.1.2 … 91
6.1.3 … 93
7 … 96
… 98
… 99
… 101

2

1

1.1

… k-means …

3

1.2

… Siri … [1][2] … [3] …

4

… [4] … [5] …

5

1.3

… [6][7] …

6

2 Self-Organizing Map (SOM)

2.1 SOM

The SOM was proposed by T. Kohonen. … data1, data2, data3 … (data1, data2) …

Figure 1: SOM

7

2.2 SOM

… (Figure 2) … data2 and data3 … data1 …

Figure 2: SOM

… data1 … data3 … a 6 × 6 = 36-node map … (Figure 3)

8

Figure 3: SOM

… data1 … the node closest to data1 becomes the winner node … (Figure 4) … data1 … data2 …

9

Figure 4: data1

… data2 … data1 … (Figure 5) … data2 … data3 … (Figure 6) … data3 … data2 … data1 …

10

Figure 5: data2

Figure 6: data3

11

2.2.1 SOM learning algorithm

STEP 1: An input vector v_j is presented to the SOM.

STEP 2: The winner node N_k^j for v_j is the node whose weight vector w_k is closest to v_j:

    N_k^j = min_k f_j(k)   (1)

    f_j(k) = Σ_{t=1}^{n} (a_{jt} − w_{kt})^2   (2)

STEP 3: The weight vector w_k of the winner node N_k^j and of its neighbors is updated toward v_j:

    h(P_{jx}, P'_{jx}) = exp(−|P_{jx} − P'_{jx}|^2 / σ^2)   (3)

    Δw_k = β h(P_{jx}, P'_{jx}) f_j(k)   (4)

    w_k^new = w_k + Δw_k   (5)

12

Here P_{jx} is the position of the winner node N_k^j, P'_{jx} is the position of the node being updated, σ controls the neighborhood width, and β is the learning rate (Figure 7).

Figure 7: The winner node and its 24 surrounding weights w

2.3 SOM

2.3.1 …

Figure 8 shows the input data Data1–Data5, vectors of R, G, B values in 0–255 given to the SOM. Data2–Data5 …

13
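The steps of Sec. 2.2.1 can be sketched as follows. This is only an illustration, not the thesis code: I assume a flat 6 × 6 grid of RGB-like 3-vectors, and, as in common SOM practice, each weight moves toward the input (Eq. (4) as printed would scale the raw distance f_j(k) instead); `som_step` is an invented name.

```python
import math, random

def som_step(weights, coords, v, beta, sigma):
    """One SOM update: find the winner for input v (Eqs. 1-2), then pull
    every node toward v with a Gaussian neighborhood around it (Eqs. 3-5)."""
    # f_j(k): squared Euclidean distance between input and each weight
    d2 = [sum((wk - vk) ** 2 for wk, vk in zip(w, v)) for w in weights]
    win = d2.index(min(d2))                      # winner node (Eq. 1)
    px, py = coords[win]
    for k, w in enumerate(weights):
        # h: Gaussian neighborhood on node positions (Eq. 3)
        h = math.exp(-((coords[k][0] - px) ** 2 + (coords[k][1] - py) ** 2)
                     / sigma ** 2)
        # update (Eqs. 4-5); the weight moves toward the input
        weights[k] = [wk + beta * h * (vk - wk) for wk, vk in zip(w, v)]
    return win

# 6x6 grid of 3-dimensional (RGB-like) weights, as in the Data1-Data5 example
random.seed(0)
coords = [(i, j) for i in range(6) for j in range(6)]
weights = [[random.random() for _ in range(3)] for _ in range(36)]
for _ in range(100):
    som_step(weights, coords, [random.random() for _ in range(3)], 0.1, 1.5)
```

Repeating the step over many inputs organizes similar inputs onto nearby nodes, which is all the later chapters rely on.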

The planar SOM (Plane-SOM) … (Figure 9) … SOM … (Figure 9) …

14

Figure 8: SOM

2.4 Spherical SOM (S-SOM)

… the spherical SOM (S-SOM) [8][9][10][11] … [9] … OpenGL … on Windows, the Windows API and DirectX are used [12][13].

15

Figure 9: SOM

2.4.1 S-SOM

… DirectX … 3D … (Figure 10)

16

Figure 10: …

… (Figure 11)

Figure 11: S-SOM

… SOM …

17

… SOM … S-SOM … (Figure 12)

Figure 12: S-SOM

2.4.2 S-SOM learning algorithm

The input is a set of n-dimensional vectors v_j = {a_{j1}, a_{j2}, a_{j3}, ..., a_{jn}} (j = 1, 2, 3, ..., m). The S-SOM has l nodes N_k (k = 1, 2, 3, ..., l), each holding a weight vector w_k = {w_{k1}, w_{k2}, w_{k3}, ..., w_{kn}}.

18

The S-SOM is trained as follows.

STEP 1: Initialize every weight vector w_k.

STEP 2: Present an input vector v_j to the SOM.

STEP 3: Determine the winner node N_k^j for v_j:

    N_k^j = min_k f_j(k)   (6)

    f_j(k) = Σ_{t=1}^{n} (a_{jt} − w_{kt})^2   (7)

STEP 4: Update the weight w_k of the winner N_k^j and its neighbors toward v_j:

    h(P_{jx}, P'_{jx}) = exp(−|P_{jx} − P'_{jx}|^2 / θ^2)   (8)

    Δw_k = β h(P_{jx}, P'_{jx}) f_j(k)   (9)

    w_k^new = w_k + Δw_k   (10)

Here P_{jx} is the position of the winner node N_k^j, P'_{jx} is the position of the node being updated, θ controls the neighborhood width, and β is the learning rate.

19

Figure 13 illustrates the effect of the parameters θ and β.

Figure 13: S-SOM

20

2.4.3 U-matrix on the S-SOM

… SOM … S-SOM … (Figure 14) … values scaled between 1 and 0 …

Figure 14: S-SOM U-matrix

21

The U-matrix of an S-SOM with n nodes … n − 1 … S-SOM … (Figures 15 and 16). For adjacent nodes N_a and N_b an edge value U^g_{ab} is defined, with U^g_{ab} = U^g_{ba}. With node weights w_k = {w_{k1}, w_{k2}, w_{k3}, ..., w_{kn}} (k = 1, 2, 3, ..., l), the dissimilarity is, as in the SOM U-matrix,

    U^g_{ab} = Σ_{j=1}^{n} (w_{aj} − w_{bj})^2   (11)

The value V^g of an edge, and the value V_a of a node N_a, rescale the U values linearly:

    V^g = L (α U^g + (max_h(U^h) − M min_h(U^h)) / (max_h(U^h) − min_h(U^h)))   (12)

    V_a = L (1/6) Σ_j (α U^g_{aj} + (max_h(U^h) − M min_h(U^h)) / (max_h(U^h) − min_h(U^h)))   (13)

    α = (M − 1) / (max_h(U^h) − min_h(U^h))   (14)

where min_h(U^h) and max_h(U^h) are the smallest and largest U values over all edges.

22

Here M is the number of gray levels and L a scaling constant; V_a averages over the six edges U_a around node N_a.

Figure 15: S-SOM U-matrix

Figure 16: S-SOM U-matrix

23
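The U-matrix computation above can be sketched briefly. This is my reading of the partly garbled Eqs. (11)–(14): the edge dissimilarity of Eq. (11) followed by a plain linear rescaling onto M gray levels; `u_value` and `to_gray` are illustrative names, and the rescaling is simplified.

```python
def u_value(w_a, w_b):
    # U_ab: dissimilarity between the weights of adjacent nodes (Eq. 11)
    return sum((x - y) ** 2 for x, y in zip(w_a, w_b))

def to_gray(u_values, M=255):
    # Rescale all edge values linearly onto gray levels 0..M,
    # a simplified stand-in for the scaling of Eqs. (12)-(14)
    lo, hi = min(u_values), max(u_values)
    if hi == lo:
        return [0 for _ in u_values]
    return [M * (u - lo) / (hi - lo) for u in u_values]

edges = [u_value([0.1, 0.2], [0.1, 0.2]),   # identical neighbors
         u_value([0.0, 0.0], [1.0, 1.0]),   # distant neighbors
         u_value([0.0, 0.0], [0.5, 0.5])]
gray = to_gray(edges)   # similar neighbors map dark, distant ones bright
```

Bright ridges in the resulting image then mark cluster boundaries on the sphere, which is how the later experiments are read.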

3 Hidden Markov Models

3.1 Hidden Markov Model (HMM)

… the Baum-Welch algorithm and the Viterbi algorithm …

3.2 HMM

An HMM is specified by:

Q = {q_1, q_2, q_3, ..., q_k} : the set of states
a_{i,j} : the probability of a transition from state q_i to state q_j, with Σ_j a_{i,j} = 1
s_i(x) : the probability that state i outputs symbol x, with Σ_x s_i(x) = 1
w[t] : the symbol observed at time t

24

Θ : the parameter set of the HMM

3.2.1 Forward algorithm

The forward algorithm computes the likelihood L_w that the HMM generates a word w of length N. Let the M possible state paths that generate w be

    R_m(W) = {r_m[0], r_m[1], r_m[2], ..., r_m[N]}   (m = 1, 2, 3, ..., M)   (15)

Then the likelihood of w is the sum over all paths:

    L_w = Σ_{m=1}^{M} Π_{t=1}^{N} s_{g(m,t−1)}(w[t]) a_{g(m,t−1),g(m,t)},   g(m, t) = r_m[t]   (16)

Figure 17 shows an HMM over the symbols A, B, C; consider the word w = AABC.

25

Figure 17: Example HMM

26

Table 1 lists the paths R_1 to R_3 by which the HMM can generate w.

R_1 : a_{0,0} s_0(w[1]) · a_{0,1} s_0(w[2]) · a_{1,2} s_1(w[3]) · a_{2,3} s_2(w[4])
    = a_{0,0} s_0(A) · a_{0,1} s_0(A) · a_{1,2} s_1(B) · a_{2,3} s_2(C)
    = 0.2·0.3 · 0.8·0.3 · 0.5·0.1 · 0.3·0.2
    = 0.0000432   (17)

R_2 : a_{0,1} s_0(w[1]) · a_{1,1} s_1(w[2]) · a_{1,2} s_1(w[3]) · a_{2,3} s_2(w[4])
    = a_{0,1} s_0(A) · a_{1,1} s_1(A) · a_{1,2} s_1(B) · a_{2,3} s_2(C)
    = 0.8·0.3 · 0.5·0.7 · 0.5·0.1 · 0.3·0.2
    = 0.0002520   (18)

R_3 : a_{0,1} s_0(w[1]) · a_{1,2} s_1(w[2]) · a_{2,2} s_2(w[3]) · a_{2,3} s_2(w[4])
    = a_{0,1} s_0(A) · a_{1,2} s_1(A) · a_{2,2} s_2(B) · a_{2,3} s_2(C)
    = 0.8·0.3 · 0.5·0.7 · 0.7·0.3 · 0.3·0.2
    = 0.0010584   (19)

27

The likelihood L_w of w is the sum of (17) to (19):

    L_w = 0.0000432 + 0.0002520 + 0.0010584 = 0.0013536   (20)

Enumerating every path as in (17) to (19) repeats work on shared prefixes; the forward algorithm shares them through the recurrence

    f_j(t) = Σ_{q_i ∈ Q} f_i(t−1) a_{i,j} s_i(w[t])   (t ≥ 1),   f_0(0) = 1   (21)

where f_j(t) is the probability that the HMM has generated w = {w[1], w[2], ..., w[t]} and is in state q_j at time t.

28

With the forward algorithm, L_w is obtained as follows:

f_0(0) = 1.0
f_0(1) = f_0(0) a_{0,0} s_0(A) = 1.0 · 0.2 · 0.3 = 0.06
f_1(1) = f_0(0) a_{0,1} s_0(A) = 1.0 · 0.8 · 0.3 = 0.24
f_1(2) = f_0(1) a_{0,1} s_0(A) + f_1(1) a_{1,1} s_1(A) = 0.06·0.8·0.3 + 0.24·0.5·0.7 = 0.0984
f_2(2) = f_1(1) a_{1,2} s_1(A) = 0.24 · 0.5 · 0.7 = 0.084
f_2(3) = f_1(2) a_{1,2} s_1(B) + f_2(2) a_{2,2} s_2(B) = 0.0984·0.5·0.1 + 0.084·0.7·0.3 = 0.02256
f_3(4) = f_2(3) a_{2,3} s_2(C) = 0.02256 · 0.3 · 0.2 = 0.0013536

L_w = f_3(4) = 0.0013536, which agrees with (20).

29
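The recursion can be written down directly. Below is a sketch using the Fig. 17 parameters as read off Eqs. (17)–(19); note that in this HMM the symbol is emitted by the source state of each transition, and the dictionaries `A` and `S` are my encoding of the figure.

```python
# Transition probabilities a[i,j] and output probabilities s_i(x)
# of the example HMM (state 3 is the end state).
A = {(0, 0): 0.2, (0, 1): 0.8, (1, 1): 0.5,
     (1, 2): 0.5, (2, 2): 0.7, (2, 3): 0.3}
S = {0: {'A': 0.3}, 1: {'A': 0.7, 'B': 0.1}, 2: {'B': 0.3, 'C': 0.2}}

def forward(word, n_states=4):
    # f_j(t) = sum_i f_i(t-1) * a[i][j] * s_i(w[t])   (Eq. 21)
    f = [1.0] + [0.0] * (n_states - 1)       # f_0(0) = 1
    for sym in word:
        g = [0.0] * n_states
        for (i, j), a in A.items():
            g[j] += f[i] * a * S[i].get(sym, 0.0)
        f = g
    return f

L_w = forward("AABC")[3]   # ~0.0013536, matching Eq. (20)
```

The loop touches each transition once per symbol, so the cost is linear in the word length instead of exponential in the number of paths.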

3.2.2 Backward algorithm

Where the forward algorithm scans w from the front, the backward algorithm scans it from the back: b_j(t) is the probability of generating the remaining symbols w = {w[N], w[N−1], ..., w[t+1]} from state q_j at time t, with the recurrence

    b_i(t) = Σ_{q_j ∈ Q} a_{i,j} s_i(w[t+1]) b_j(t+1)   (22)

For the HMM of Figure 17 and w = AABC, the backward algorithm computes L_w as follows.

30

b_3(4) = 1.0
b_2(3) = b_3(4) a_{2,3} s_2(C) = 1.0 · 0.3 · 0.2 = 0.06
b_2(2) = b_2(3) a_{2,2} s_2(B) = 0.06 · 0.7 · 0.3 = 0.0126
b_1(2) = b_2(3) a_{1,2} s_1(B) = 0.06 · 0.5 · 0.1 = 0.003
b_1(1) = b_2(2) a_{1,2} s_1(A) + b_1(2) a_{1,1} s_1(A) = 0.0126·0.5·0.7 + 0.003·0.5·0.7 = 0.00546
b_0(1) = b_1(2) a_{0,1} s_0(A) = 0.003 · 0.8 · 0.3 = 0.00072
b_0(0) = b_1(1) a_{0,1} s_0(A) + b_0(1) a_{0,0} s_0(A) = 0.00546·0.8·0.3 + 0.00072·0.2·0.3 = 0.0013536

L_w = b_0(0) = 0.0013536: the backward value b_0(0) equals the forward value f_3(4), i.e. b_0(0) = f_3(4).

31
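The backward pass of Eq. (22) is symmetric. A sketch with the same example parameters, repeated here so the fragment is self-contained (`A`, `S` and `backward` are my encoding of Fig. 17, not thesis code):

```python
A = {(0, 0): 0.2, (0, 1): 0.8, (1, 1): 0.5,
     (1, 2): 0.5, (2, 2): 0.7, (2, 3): 0.3}
S = {0: {'A': 0.3}, 1: {'A': 0.7, 'B': 0.1}, 2: {'B': 0.3, 'C': 0.2}}

def backward(word, n_states=4):
    # b_i(t) = sum_j a[i][j] * s_i(w[t+1]) * b_j(t+1)   (Eq. 22)
    b = [0.0] * (n_states - 1) + [1.0]       # at time N only the end state
    for sym in reversed(word):
        g = [0.0] * n_states
        for (i, j), a in A.items():
            g[i] += a * S[i].get(sym, 0.0) * b[j]
        b = g
    return b

L_w = backward("AABC")[0]   # b_0(0), the same 0.0013536 as the forward pass
```

Having both tables is what makes the Baum-Welch counts of Sec. 3.2.4 cheap to evaluate.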

3.2.3 Viterbi algorithm

The forward algorithm sums f_j(t) = Σ_{q_i ∈ Q} f_i(t−1) a_{i,j} s_i(w[t]) over all paths that generate w. The Viterbi algorithm instead keeps only the most probable path R_m(w):

    L_w = max_m Π_t s_{g(m,t)}(w[t]) a_{g(m,t),g(m,t+1)},   g(m, t) = r_m[t]   (23)

which is computed by replacing the sum in the forward recurrence with a maximum:

    f_j(t) = max_{q_i ∈ Q} f_i(t−1) a_{i,j} s_i(w[t])   (24)

32
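Replacing the sum by a max and keeping back-pointers recovers the best path. A sketch with the same example HMM (again my own encoding of Fig. 17; `viterbi` is an illustrative name):

```python
A = {(0, 0): 0.2, (0, 1): 0.8, (1, 1): 0.5,
     (1, 2): 0.5, (2, 2): 0.7, (2, 3): 0.3}
S = {0: {'A': 0.3}, 1: {'A': 0.7, 'B': 0.1}, 2: {'B': 0.3, 'C': 0.2}}

def viterbi(word, n_states=4):
    # Eq. (24): the forward sum becomes a max; back-pointers record
    # which source state achieved it.
    f = [1.0] + [0.0] * (n_states - 1)
    back = []
    for sym in word:
        g = [0.0] * n_states
        ptr = [0] * n_states
        for (i, j), a in A.items():
            p = f[i] * a * S[i].get(sym, 0.0)
            if p > g[j]:
                g[j], ptr[j] = p, i
        f, back = g, back + [ptr]
    path = [n_states - 1]              # trace back from the end state
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return f[n_states - 1], path[::-1]

best_p, best_path = viterbi("AABC")   # path R_3 of Eq. (19): 0-1-2-2-3
```

For AABC the maximizing path is R_3, whose probability 0.0010584 was already the dominant term in the sum (20).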

3.2.4 Baum-Welch algorithm

The preceding algorithms assume the HMM parameters Θ are known; the Viterbi algorithm, for instance, finds the most probable path through a given HMM Θ for a word w. The Baum-Welch algorithm does the opposite: given training words w, it re-estimates the parameters Θ of an HMM so that the likelihood L_w computed by the forward algorithm increases. Figure 18 shows a left-to-right HMM.

33

Figure 18: Left-to-right HMM

The Baum-Welch algorithm uses the quantities:

O_{i,j} : the expected number of transitions from q_i to q_j over the training set W = {w_1, w_2, ..., w_l}
E_i(x) : the expected number of times state q_i outputs x over W = {w_1, w_2, ..., w_l}
P(w_k | Θ) : the likelihood L_w of the word w_k ∈ W under the model Θ

with P(W | Θ) = Π_{k=1}^{l} L_k the likelihood of the whole set of l training words.

34

The expected counts O_{i,j} and E_i(x) are computed from the forward and backward values:

    O_{i,j} = Σ_{k=1}^{l} (1 / P(w_k | Θ)) Σ_t f_i^k(t−1) a_{i,j} s_i(w_k[t]) b_j^k(t)   (25)

    E_i(x) = Σ_{k=1}^{l} (1 / P(w_k | Θ)) Σ_{t: w_k[t]=x} f_i^k(t−1) b_i^k(t−1)   (26)

From O_{i,j} and E_i(x), the parameters are re-estimated as

    a_{i,j} = O_{i,j} / Σ_j O_{i,j}   (27)

    s_i(x) = E_i(x) / Σ_{x'} E_i(x')   (28)

35

For the single training word AABC of the HMM in Figure 17, P(w_1 | Θ) = L_w, so (25) reduces to

    O_{i,j} = (1 / L_w) Σ_t f_i(t−1) a_{i,j} s_i(w_1[t]) b_j(t) = Σ_t ξ_t(i, j)

where

    ξ_t(i, j) = f_i(t−1) a_{i,j} s_i(w_k[t]) b_j(t) / L_w

is the probability of using the transition q_i → q_j at time t. The re-estimation then proceeds through PROCESS 1 to PROCESS 4.

36

PROCESS 1: compute every nonzero ξ_t(i, j).

ξ_1(0,0) = f_0(0) a_{0,0} s_0(A) b_0(1) / L_w = 1.0·0.2·0.3·0.00072 / 0.00135360 = 0.03191489
ξ_1(0,1) = f_0(0) a_{0,1} s_0(A) b_1(1) / L_w = 1.0·0.8·0.3·0.00546 / 0.00135360 = 0.96808510
ξ_2(0,1) = f_0(1) a_{0,1} s_0(A) b_1(2) / L_w = 0.06·0.8·0.3·0.003 / 0.00135360 = 0.03191489
ξ_2(1,1) = f_1(1) a_{1,1} s_1(A) b_1(2) / L_w = 0.24·0.5·0.7·0.003 / 0.00135360 = 0.18617021
ξ_2(1,2) = f_1(1) a_{1,2} s_1(A) b_2(2) / L_w = 0.24·0.5·0.7·0.0126 / 0.00135360 = 0.78191489
ξ_3(1,2) = f_1(2) a_{1,2} s_1(B) b_2(3) / L_w = 0.0984·0.5·0.1·0.06 / 0.00135360 = 0.21808510
ξ_3(2,2) = f_2(2) a_{2,2} s_2(B) b_2(3) / L_w = 0.084·0.7·0.3·0.06 / 0.00135360 = 0.78191489
ξ_4(2,3) = f_2(3) a_{2,3} s_2(C) b_3(4) / L_w = 0.02256·0.3·0.2·1.0 / 0.00135360 = 1.0

37

At every time step the ξ_t sum to one:

ξ_1(0,0) + ξ_1(0,1) = 1.0
ξ_2(0,1) + ξ_2(1,1) + ξ_2(1,2) = 1.0
ξ_3(1,2) + ξ_3(2,2) = 1.0
ξ_4(2,3) = 1.0

PROCESS 2: re-estimate a_{i,j} from the ξ_t(i, j) of PROCESS 1. Using (27) with O_{i,j} = Σ_t ξ_t(i, j),

    a_{i,j} = Σ_t ξ_t(i, j) / Σ_j Σ_t ξ_t(i, j)   (29)

The re-estimated a_{i,j} are:

38

a_{0,0} = ξ_1(0,0) / (ξ_1(0,0) + ξ_1(0,1) + ξ_2(0,1))
        = 0.03191489 / (0.03191489 + 0.96808510 + 0.03191489) ≈ 0.03092783
a_{0,1} = (ξ_1(0,1) + ξ_2(0,1)) / (ξ_1(0,0) + ξ_1(0,1) + ξ_2(0,1))
        = (0.96808510 + 0.03191489) / (0.03191489 + 0.96808510 + 0.03191489) ≈ 0.96907217
a_{1,1} = ξ_2(1,1) / (ξ_2(1,1) + ξ_2(1,2) + ξ_3(1,2))
        = 0.18617021 / (0.18617021 + 0.78191489 + 0.21808510) ≈ 0.15695067
a_{1,2} = (ξ_2(1,2) + ξ_3(1,2)) / (ξ_2(1,1) + ξ_2(1,2) + ξ_3(1,2))
        = (0.78191489 + 0.21808510) / (0.18617021 + 0.78191489 + 0.21808510) ≈ 0.84304933
a_{2,2} = ξ_3(2,2) / (ξ_3(2,2) + ξ_4(2,3)) = 0.78191489 / (0.78191489 + 1.0) ≈ 0.43880597
a_{2,3} = ξ_4(2,3) / (ξ_3(2,2) + ξ_4(2,3)) = 1.0 / (0.78191489 + 1.0) ≈ 0.56119403

The new a_{i,j} replace the old transition probabilities.

39
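Once the forward and backward tables exist, PROCESS 1 and PROCESS 2 are mechanical. A self-contained sketch for the example (my own encoding of Fig. 17; it reproduces the re-estimated a_{i,j} above):

```python
A = {(0, 0): 0.2, (0, 1): 0.8, (1, 1): 0.5,
     (1, 2): 0.5, (2, 2): 0.7, (2, 3): 0.3}
S = {0: {'A': 0.3}, 1: {'A': 0.7, 'B': 0.1}, 2: {'B': 0.3, 'C': 0.2}}
word, N = "AABC", 4

# forward table f[t][j] and backward table b[t][i]
f = [[0.0] * N for _ in range(len(word) + 1)]
f[0][0] = 1.0
for t in range(1, len(word) + 1):
    for (i, j), a in A.items():
        f[t][j] += f[t - 1][i] * a * S[i].get(word[t - 1], 0.0)
b = [[0.0] * N for _ in range(len(word) + 1)]
b[len(word)][N - 1] = 1.0
for t in range(len(word) - 1, -1, -1):
    for (i, j), a in A.items():
        b[t][i] += a * S[i].get(word[t], 0.0) * b[t + 1][j]
L_w = f[len(word)][N - 1]

# PROCESS 1: O_ij = sum_t xi_t(i,j), with
# xi_t(i,j) = f_i(t-1) a_ij s_i(w[t]) b_j(t) / L_w
O = {k: 0.0 for k in A}
for t in range(1, len(word) + 1):
    for (i, j), a in A.items():
        O[(i, j)] += f[t - 1][i] * a * S[i].get(word[t - 1], 0.0) * b[t][j] / L_w

# PROCESS 2: re-estimate a_ij by Eq. (29)
new_a = {(i, j): o / sum(v for (p, q), v in O.items() if p == i)
         for (i, j), o in O.items()}
```

Iterating this re-estimation (together with the s_i(x) update of PROCESS 3) never decreases L_w, which is the property the HMM-S-SOM training in Chapter 4 relies on.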

PROCESS 3: re-estimate s_i(x). For the single training word,

    Σ_{k=1}^{l} 1 / P(w_k | Θ) = 1 / P(w_1 | Θ) = 1 / L_w

so (26) reduces to

    E_i(x) = (1 / L_w) Σ_{t: w_k[t]=x} f_i(t−1) b_i(t−1)   (30)

Expanding b_i(t−1) with the backward recurrence (22),

    E_i(x) = (1 / L_w) Σ_{t: w_k[t]=x} f_i(t−1) Σ_{q_j ∈ Q} b_j(t) a_{i,j} s_i(w[t])
           = Σ_{t: w_k[t]=x} Σ_{q_j ∈ Q} f_i(t−1) a_{i,j} s_i(w[t]) b_j(t) / L_w   (31)

and since ξ_t(i, j) = f_i(t−1) a_{i,j} s_i(w_k[t]) b_j(t) / L_w, (31) becomes

    E_i(x) = Σ_{t: w_k[t]=x} Σ_{q_j ∈ Q} ξ_t(i, j)   (32)

40

Substituting (32) into (28),

    s_i(x) = E_i(x) / Σ_{x'} E_i(x')
           = Σ_{t: w_k[t]=x} Σ_{q_j ∈ Q} ξ_t(i, j) / Σ_t Σ_{q_j ∈ Q} ξ_t(i, j)   (33)

41

By (33), the re-estimated s_i(x) are:

s_0(A) = (ξ_1(0,0) + ξ_1(0,1) + ξ_2(0,1)) / (ξ_1(0,0) + ξ_1(0,1) + ξ_2(0,1)) = 1.0
s_0(B) = 0 / (ξ_1(0,0) + ξ_1(0,1) + ξ_2(0,1)) = 0.0
s_0(C) = 0 / (ξ_1(0,0) + ξ_1(0,1) + ξ_2(0,1)) = 0.0
s_1(A) = (ξ_2(1,1) + ξ_2(1,2)) / (ξ_2(1,1) + ξ_2(1,2) + ξ_3(1,2))
       = (0.18617021 + 0.78191489) / (0.18617021 + 0.78191489 + 0.21808510) ≈ 0.81614350
s_1(B) = ξ_3(1,2) / (ξ_2(1,1) + ξ_2(1,2) + ξ_3(1,2))
       = 0.21808510 / (0.18617021 + 0.78191489 + 0.21808510) ≈ 0.18385650
s_1(C) = 0 / (ξ_2(1,1) + ξ_2(1,2) + ξ_3(1,2)) = 0.0
s_2(A) = 0 / (ξ_3(2,2) + ξ_4(2,3)) = 0.0
s_2(B) = ξ_3(2,2) / (ξ_3(2,2) + ξ_4(2,3)) = 0.78191489 / (0.78191489 + 1.0) ≈ 0.43880597
s_2(C) = ξ_4(2,3) / (ξ_3(2,2) + ξ_4(2,3)) = 1.0 / (0.78191489 + 1.0) ≈ 0.56119403

42

3.3 HMM

The forward, backward and Baum-Welch computations above visit transitions that can never contribute. For the word AABC and the HMM of Figure 17, a sequence of masks G = {G_1, G_2, ..., G_t, ..., G_l} is prepared, one k × k matrix per time step t:

    G_t = | Γ_t(0,0)    Γ_t(0,1)    ...  Γ_t(0,k−1)   |
          | Γ_t(1,0)    Γ_t(1,1)    ...  Γ_t(1,k−1)   |
          | ...                                       |
          | Γ_t(k−1,0)  Γ_t(k−1,1)  ...  Γ_t(k−1,k−1) |

Here k = 4. The element Γ_t(i, j) of G_t is 0 when the transition from state i to state j can be used at time t, that is, when s_i(w[t]) and a_{i,j} are both nonzero, and 1 when it cannot. At t = 1 the symbol is A, so for every state i that cannot emit A, Γ_1(i, j) = 1.

43

For each state i that cannot emit A, Γ_1(i, j) = 1 for all j. The masks from t = 1 to t = 4 are then built as follows.

44

At t = 1 only state 0 (i = 0) can be occupied, so only row i = 0 of G_1 can contain zeros. At t = 4, the final step, only transitions into state 3 (= k − 1 = 4 − 1) remain, so any i with a_{i,3} = 0 is pruned. For i = 0 at t = 1:

Γ_1(0,0) = 0   (∵ s_0(A) a_{0,0} ≠ 0)
Γ_1(0,1) = 0   (∵ s_0(A) a_{0,1} ≠ 0)
Γ_1(0,2) = 1   (∵ a_{0,2} = 0)
Γ_1(0,3) = 1   (∵ a_{0,3} = 0)

45

All other rows of G_1 are pruned:

Γ_1(1,0) = 1   Γ_1(1,1) = 1   Γ_1(1,2) = 1   Γ_1(1,3) = 1
Γ_1(2,0) = 1   Γ_1(2,1) = 1   Γ_1(2,2) = 1   Γ_1(2,3) = 1
Γ_1(3,0) = 1   Γ_1(3,1) = 1   Γ_1(3,2) = 1   Γ_1(3,3) = 1

The masks G_t for every t are then refined as in Figure 19.

46

Figure 19: Construction of the masks G_t

The masks G_t are refined in two sweeps.

STEP 1: If column j of G_1 contains no zero, state j cannot be reached at t = 1, so row j of G_2 is set to all ones (Figure 20 shows this for G_1 through G_4). The sweep runs forward from t = 1 to t = l − 1: whenever column j of G_t is all ones, row j of G_{t+1} becomes all ones.

47

Figure 20: Forward sweep over G

STEP 2: Symmetrically, the sweep runs backward from t = l: whenever row i of G_t is all ones, state i cannot be left at time t, so column i of G_{t−1} is set to all ones, down to t = 2 (Figure 21).

48

Figure 21: Backward sweep over G

STEP 1 and STEP 2 are repeated until the masks no longer change; the surviving zeros Γ_t(i, j) = 0 mark the transitions that can contribute at time t. For AABC, Figure 22 shows the resulting mask sequence.

49

Figure 22: Resulting masks G

50
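My reading of the pruning above can be put compactly: a transition (i, j) survives in G_t exactly when i is reachable from the start in t − 1 steps and j can still reach the end state in the remaining steps. The sketch below checks only a_{i,j} ≠ 0 and reachability (not the s_i(w[t]) = 0 condition), and `keep`, `reach`, `coreach` are invented names.

```python
A = {(0, 0): 0.2, (0, 1): 0.8, (1, 1): 0.5,
     (1, 2): 0.5, (2, 2): 0.7, (2, 3): 0.3}
N, T = 4, 4                    # states, length of "AABC"

reach = [{0}]                  # states reachable after t symbols (STEP 1)
for _ in range(T):
    reach.append({j for (i, j) in A if i in reach[-1]})

coreach = [None] * (T + 1)     # states that can still reach the end (STEP 2)
coreach[T] = {N - 1}
for t in range(T - 1, -1, -1):
    coreach[t] = {i for (i, j) in A if j in coreach[t + 1]}

def keep(t, i, j):
    # True  -> Gamma_t(i, j) = 0 (transition survives at time t)
    # False -> Gamma_t(i, j) = 1 (pruned)
    return (i, j) in A and i in reach[t - 1] and j in coreach[t]
```

For t = 1 only row i = 0 survives, and for t = 4 only the transition into the end state, matching the worked example.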

4 Hidden Markov Spherical Self-Organizing Map (HMM-S-SOM)

The hidden Markov spherical self-organizing map places a hidden Markov model on each node of the S-SOM (Figure 23), and its input is a set of strings rather than vectors. The HMMs held by the nodes all share the same structure. Rather than classifying the data directly, this learning model clusters, on the sphere, the probabilistic models behind the data, and presents the classification result in a visually accessible form. The differences between the S-SOM and the HMM-S-SOM are summarized in Table 2: besides the different node contents and input format, the winner node is determined differently and the nodes are updated differently. The details are given in the description of the HMM-S-SOM learning algorithm below.

Figure 23: Hidden Markov spherical self-organizing map

51

Table 2: Differences between the S-SOM and the HMM-S-SOM

4.1 HMM-S-SOM learning algorithm

STEP 0: Initialize the HMM parameters Θ_k of the k nodes.
STEP 1: Present an input string set.
STEP 2: Evaluate each node's HMM on the input with the Baum-Welch algorithm, determine the winner node, and go to STEP 3.
STEP 3: Re-estimate the winner's HMM with the Baum-Welch algorithm and pull the HMMs of its neighbors toward it, as in the SOM.

52

As in the SOM, STEP 1 to STEP 3 are repeated. … In STEP 2 … (Figure 24) … the Baum-Welch algorithm … HMM … the Baum-Welch algorithm … HMM … STEP 3 …

Figure 24: Winner selection and update

53

4.2 Experiments

… left-to-right … Ergodic … the U-matrix … Ten HMMs M_1, M_2, ..., M_10 over the three symbols A, B, C were prepared … Figure 25 shows the models; Tables 3 to 12 give their transition probabilities and Tables 13 to 22 their output probabilities.

54

Table 3: M_1   Table 4: M_2
Table 5: M_3   Table 6: M_4
Table 7: M_5   Table 8: M_6

55

Table 9: M_7    Table 10: M_8
Table 11: M_9   Table 12: M_10
Table 13: M_1   Table 14: M_2

56

Table 15: M_3   Table 16: M_4
Table 17: M_5   Table 18: M_6
Table 19: M_7   Table 20: M_8

57

Table 21: M_9   Table 22: M_10

Figure 25: The ten HMMs

4.2.1 HMM-S-SOM results

The result of the HMM-S-SOM is shown in Figure 26. In Figure 26, M_1 … Data1 … M_1 … M_10 … around M_1 the models M_3, M_8, M_9, M_7, M_2, M_5, M_6, M_4 … M_1 … the U-matrix of the HMMs … the U-matrix … HMM …

58

26:, HMM,, 27,,,,model 1 model 4,HMM -matrix,, 59

Figure 27: …

60

5 Frequency Integrated HMM-S-SOM (F-HMM-S-SOM)

… the Baum-Welch algorithm … SOM … HMM … combining the SOM and the HMM, the frequency integrated HMM-S-SOM (F-HMM-S-SOM) … (Figure 28)

Figure 28: F-HMM-S-SOM

… HMM …

61

In the F-HMM-S-SOM, input strings over an alphabet such as A, B, C are encoded by their adjacent symbol pairs: the symbols at positions n and n + 1 form one pair, so a string of length l yields l − 1 pairs. With m symbols, each pair is encoded as a 12-dimensional one-hot vector, and a string of maximum length l_max becomes a vector of (m^2 + m)(l_max − 1) dimensions. For example, ACCB yields the pairs AC, CC, CB (Figure 29): AC is encoded as 000001000000, CC as 000000000001, and CB as 000000000010, so ACCB becomes

000001000000000000000001000000000010

and BCC becomes 000000001000000000000001.

62

Figure 29: Pair encoding of ACCB

… (Figure 30) … 0 …

Θ_k : the HMM held by node k of the S-SOM, Θ_k = {A_k, S_k}

63
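The pair encoding can be checked against the example strings. Below is a sketch under my reading of the bit layout: 12 = m^2 + m slots, with pair (x, y) occupying slot m + m·ix + iy (the first m slots would then hold single symbols, which the example strings never use); `encode_pair` and `encode_string` are illustrative names.

```python
ALPHA = "ABC"
M = len(ALPHA)          # m = 3 symbols -> m^2 + m = 12 slots per pair

def encode_pair(x, y):
    # one-hot vector for the adjacent pair (x, y)
    v = [0] * (M * M + M)
    v[M + M * ALPHA.index(x) + ALPHA.index(y)] = 1
    return v

def encode_string(s):
    # a string of length l yields l - 1 concatenated pair codes
    out = []
    for x, y in zip(s, s[1:]):
        out += encode_pair(x, y)
    return out

code = "".join(map(str, encode_string("ACCB")))
# reproduces the 36-bit code for ACCB given in the text
```

Under this layout AC lands on slot 5, CC on slot 11 and CB on slot 10, exactly the three codewords quoted above.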

Figure 30: Node structure

    A_k = | a^k_{01} a^k_{02} ... a^k_{0j} |      S_k = | s^k_{01} s^k_{02} ... s^k_{0h} |
          | a^k_{11} a^k_{12} ... a^k_{1j} |            | s^k_{11} s^k_{12} ... s^k_{1h} |
          | ...                            |            | ...                            |
          | a^k_{i1} a^k_{i2} ... a^k_{ij} |            | s^k_{i1} s^k_{i2} ... s^k_{ih} |

Here a_{i,j} is the probability of a transition from state i to state j, and s_{i,h} is the probability that state i outputs symbol x_h, where X = {x_1, x_2, x_3, ..., x_h} is the alphabet and V = {v_1, v_2, v_3, ..., v_h} …

STEP 1: Initialize the HMM parameters Θ_k of every node.
STEP 2: For an input string set W_n, run the Baum-Welch algorithm on the HMM of each node N_i to obtain its likelihood L_i and … E_i.
STEP 3: Determine the winner node by

64

    (1 − E_i / E_max) L_i

where E_max is the largest E_i. …

STEP 4: Re-estimate the winner's HMM with the Baum-Welch algorithm.

STEP 5: Pull the HMM parameters Θ_k of each node toward the winner N_k^n's re-estimated HMM Θ_k^n = {A_k^n, S_k^n}, as in the SOM:

    ΔA = β h(P_{nx}, P'_{nx}) (A_k^n − A_k)
    A_k^new = A_k + ΔA
    ΔS = β h(P_{nx}, P'_{nx}) (S_k^n − S_k)
    S_k^new = S_k + ΔS
    Θ_k^new = {A_k^new, S_k^new}

    h(P_{nx}, P'_{nx}) = exp(−|P_{nx} − P'_{nx}|^2 / θ^2)

where P_{jx} is the position of the winner node N_k^j,

65

P'_{jx} is the position of the node being updated, and β is the learning rate. … after the Baum-Welch re-estimation …

STEP 6: Normalize the parameters so that each node again holds a valid HMM, as in the SOM.

STEP 2 to STEP 5 are repeated … and the result is visualized with the U-matrix.

66

5.1 F-HMM-S-SOM

… the HMM-S-SOM …

5.1.1 F-HMM-S-SOM

… the HMM U-matrix … (Figure 31) …

Figure 31: HMM U-matrix

… (Figure 32) …

67

Figure 32: U-matrix

5.2 F-HMM-S-SOM

… DNA … (HMM-S-SOM) …

5.2.1 DNA

… DNA sequences over the alphabet ATGC [14] … 1024 … 6 … 1024 … (Hsall), Mmuall, (Cfaall), Ecoall, (Dmaall), (Osaall) … HMM-S-SOM … the U-matrix is shown in Figure 33.

68

Figure 33: HMM-S-SOM

… F-HMM-S-SOM … the HMM U-matrix … (Figure 34) … (Figure 35) …

69

Figure 34: F-HMM-S-SOM: U-matrix

Figure 35: F-HMM-S-SOM: U-matrix

70

5.2.2, 2010 36,,,,,,, 36,,,, U, D, S, U D,, F-HMM-S-SOM,F- HMM-S-SOM HMM, 37 4, U,S,D 3,Umatrix, 71

36: 72

37:,,. t t+1, HMM-S-SOM F-HMM-S-SOM 38 39 0 38, 22 46 39, S-HMM-SOM, F-S-HMM-SOM,, S-HMM-SOM,U-matrix,,F-S-HMM-SOM, 73

38: 74

39: 75

… the U-matrix … 1.377 for the F-HMM-S-SOM against 2.623 for the HMM-S-SOM … (Figure 40) … With the HMM-S-SOM (Figure 40) … with the F-HMM-S-SOM (Figure 40) …

76

40: 77

… (Figure 41) … Figures 42 and 43 compare the result with moving average lines: Figure 42 covers days 0 to 23 and Figure 43 days 24 to 46 … 5 … 4 …

78

41: 79

Figure 42: Comparison with moving average lines

80

Figure 43: Comparison with moving average lines

81

230 36 23 23: 82

6

… Figure 44 shows the model, which has two layers, layer 1 and layer 2 … the SOM input v_j … each layer-1 node x_i holds a mask vector (mask-vector) w_i … the elements of w_i take values between 0 and 1, approaching 1 or 0 …

83

44:, 1, 0 1,1 2, 30 5,,,, 45: SOM,, 45, h a, 84

STEP 1: Layer 1 receives the vector x_i = {x_{i0}, x_{i1}, ..., x_{in}} and layer 2 the vector h_a = {h_{a0}, h_{a1}, ..., h_{an}}; each layer-1 node i holds the mask vector w_i = {w_{i0}, w_{i1}, ..., w_{in}}.

STEP 2: The winner node i of layer 1 is determined by

    argmin_i { Σ_j w_{ij} (h_{aj} v_{aj} − w_{ij} x_{ij})^2 / L^2 },   L = Σ_j w^2_{ij}   (34)

where L normalizes by the mask w_i.

STEP 3: The mask w_i of the winner is updated … μ_a … μ_{ai} … ω (Figure 46).

Figure 46: Mask update

85

The boundary index m is chosen by (35):

    argmin_m { (1/m) Σ_{k=2}^{m} (ω_k − ω_m)^2 } ≤ φ   (35)

where φ is set to 0.5. With f(t) = k the rank of component t under the ordering ω, the mask is updated as

    w_{it}^new = w_{it}^old + γ (1 − w_{it}^old)   if f(t) ≤ m   (36)
    w_{it}^new = w_{it}^old + γ (0 − w_{it}^old)   if m < f(t)   (37)

where 0 < γ < 1, so each element of w_i approaches 1 or 0 … h_a …

STEP 4: The node vectors are updated:

    x_j^new = x_j^old + α (v_a − x_j^old)   (38)
    w_j^new = w_j^old + β (w_i − w_j^old)   (39)

where 0 < α < 1 and 0 < β < 1.

86
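STEP 2 and the mask update of Eqs. (36)–(37) can be sketched as follows. This is my reading of the partly garbled Eq. (34): the squared component differences count only as far as the mask keeps them, normalized by L; `masked_winner` and `update_mask` are illustrative names.

```python
def masked_winner(nodes, masks, h):
    # Eq. (34), roughly: weight each squared component difference by the
    # node's mask w_i and normalize by L = sum_j w_ij^2
    best, best_d = 0, float("inf")
    for i, (x, w) in enumerate(zip(nodes, masks)):
        L = sum(wj * wj for wj in w) or 1.0
        d = sum(wj * (hj - xj) ** 2 for wj, hj, xj in zip(w, h, x)) / L
        if d < best_d:
            best, best_d = i, d
    return best

def update_mask(w, keep, gamma=0.3):
    # Eqs. (36)-(37): kept components are pulled toward 1, the rest toward 0
    return [wt + gamma * ((1.0 if k else 0.0) - wt) for wt, k in zip(w, keep)]

# the mask converges toward a 0/1 selection of relevant components
m = [0.5, 0.5, 0.5]
for _ in range(20):
    m = update_mask(m, [True, False, True])
```

Because irrelevant components fade out of both the distance and the update, the layer eventually classifies on the attributes that actually separate the data, which is the point of the mask vector.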

STEP 2 to STEP 4 are repeated … h_a …

6.1 …

… h_a … Three artificial data sets were prepared … noise is a pseudorandom value in 0–1 generated with the Mersenne Twister [15]. Data set 1 consists of points (x, y, z) with

    y = 3x + noise,  z = noise

… 2 … x ≤ 0.5 … x, y, z …

87

Figure 47: y = 3x + noise, z = noise (x ≤ 0.5); y = 2x + noise, z = noise (x > 0.5)

… 3x … x ≤ 0.5 … x, z … x ≤ 0.5 … y, z …

    y = 3x + 2·noise,  z = noise   (x ≤ 0.5)

88

Figure 48: Data set 2: y = 2z + noise + 2, x = noise (x > 0.5)

6.1.1 …

… (Figure 50) … x and y … model 1 … 100 … z … x and y … 0.819 … the SOM result (Figure 51) … SOM … 8 …

89

Figure 49: Data set 3

Figure 50: …

90

Figure 51: SOM

… x and y …

6.1.2 … 2

… (Figure 52) … x and y … x, y, z … 95 … 0.335 … the SOM result (Figure 53) … x, y, z … x, y … 2 … model …

91

Figure 52: …

Figure 53: SOM

92

6.1.3 … 3

… 3 … (Figures 54 and 55) … Figure 54 … xy … x and y … y … x … y … y and z … x ≤ 0.5 … 0.5 … 78 …

Figure 54: …

… yz … xy … (Figure 56)

93

Figure 55: …

Figure 56: …

94

… Figure 56 … Figure 54 … yz … yz … yz … xy … x … yz … x … xy … the SOM result (Figure 57) …

Figure 57: SOM

… x and z … SOM …

95

7

… DNA …

96

… k-means … 2 … SOM … F-HMM-S-SOM …

97

…

98

References

[1] …, 2012.
[2] …, DEIM Forum 2010 F4-2, 2010.
[3] Teuvo Kohonen: … (Self-Organizing Maps) …
[4] Chie Morita and Hiroshi Tsukimoto: Knowledge discovery from numerical data, Knowledge-based Systems, Vol.10, No.7, pp.413-419, 1998.
[5] …, 2009.
[6] … : … HMM-SOM …
[7] … : HMM-SOM … MIDI …
[8] … : A spherical SOM and its examples, Technical Report of IEICE.
[9] … : S-SOM …, pp.153-156, 2004.

99

[10] … : … SOM …, Vol.2010-MPS-77, No.29, 2010.
[11] Ying Xin WU … : … SOM …, Vol.19, No.6, pp.611-617, 2007.
[12] N2Factory: … DirectX …
[13] Jeffrey Richter, Christophe Nasarre: Advanced Windows, 5th ed., BP, 2008.
[14] Hiroshi Dozono, et al.: A Design Method of DNA Chips for Sequence Analysis Using Self Organizing Maps, Proceedings of WSOM 2003, pp.116-121, 2003.
[15] M. Matsumoto and T. Nishimura: A 623-dimensionally equidistributed uniform pseudorandom number generator, ACM Trans. on Modeling and Computer Simulation, Vol.8, No.1, pp.3-30, January 1998.

100

1. Gen Niina, Kazuhiro Muramatsu, Hiroshi Dozono: Basic Study on the Classification of Time Series Data Using a Frequency Integrated Spherical Hidden Markov Self Organizing Map, Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol.19, No.2, pp.212-216, 2015.

1. Gen Niina, Kazuhiro Muramatsu, Hiroshi Dozono: Causal analysis of data using 2-layered Spherical Self-Organizing Map, The 2015 International Conference on Computational Science and Computational Intelligence (CSCI'15), Las Vegas, USA, pp.198-202, 2015.

2. Gen Niina, Kazuhiro Muramatsu, Hiroshi Dozono, Tatsuya Chuuto: The data analysis of stock market using a Frequency Integrated Spherical Hidden Markov Self Organizing Map, ICSIIT 2015, 4th International Conference on Soft Computing, Intelligent System and Information Technology, March 11-14, 2015, Bali, Indonesia, pp.195-204, 2015.

3. Gen Niina, Kazuhiro Muramatsu, Hiroshi Dozono: The Frequency Integrated Spherical Hidden Markov Self Organizing Map for Learning Time Series Data, ISIS2013, The International Symposium on Advanced Intelligent Systems, Daejeon, Korea, pp.362-370, 2013.

101

4. Hiroshi Dozono, Gen Niina and Kazuhiro Muramatsu,: Mapping of DNA sequences using Hidden Markov Model Self-Organizing Maps, Proceedings of The 10th annual IEEE Symposium on Computational Intelligence in Bioinformatics and Computational Biology, IEEE Press, Singapore, 2013. 5. Hiroshi Dozono, Gen Niina and Yutaro Kaneko: Mapping the Groups of DNA Sequences using Hidden Markov Model Self Organizing Maps, The First BMIRC International Symposium on Frontiers in Computational Systems Biology and Bioengineering, Fukuoka, Japan, 2013. 6. Gen Niina, Hiroshi Dozono: The Spherical Hidden Markov Self Organizing Map for Learning Time Series Data, 22nd International Conference on Artificial Neural Networks, Lausanne, Switzerland, pp.563-570, Part I, 2012. 1.,, :, 15,, 2014. 2., : DirectX SOM 19, 2011. 3.,, : 64, 2011. 102