知的学習認識システム特論9.key (Advanced Topics in Intelligent Learning and Recognition Systems, Lecture 9)

9 Perceptron (Rosenblatt '57), Linear separability (Minsky & Papert '68), SVM (Vapnik '95), Neocognitron (Fukushima '80), Back prop. (Rumelhart+ '86), Conv. net (LeCun+ '89), Deep learning (Hinton+ '06)

10 What is Deep Learning (深層学習): an artificial-intelligence technique built on neural network (神経回路) models, whose structure and learning scheme imitate the workings of the brain; its defining feature is a deep, layered architecture. [Figure: an image of the digit 5 is given as Input, the network performs Recognition and answers It's "5"]

16 The formal neuron (McCulloch & Pitts '43). [Figure: inputs x_1, x_2 with weights w_1, w_2 are summed into u, and the unit outputs f(u)]

17 What the McCulloch-Pitts model can do: by changing the model parameters {w, θ}, the same unit realizes various logic gates. [Figure: x_1 w_1 + x_2 w_2 → Σ → u, thresholded at θ; a table gives the (w_1, w_2, θ) settings for AND, OR and NAND]
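
For concreteness, here is a minimal Python sketch of such a unit (not taken from the slides); the particular (w_1, w_2, θ) values chosen for AND, OR and NAND are one common assignment and are assumptions of this example.

    # A McCulloch-Pitts style unit: output 1 when the weighted sum exceeds the threshold.
    def mp_unit(x1, x2, w1, w2, theta):
        u = w1 * x1 + w2 * x2
        return 1 if u > theta else 0

    # Illustrative parameter settings (w1, w2, theta) for three logic gates.
    gates = {
        "AND":  (1.0, 1.0, 1.5),
        "OR":   (1.0, 1.0, 0.5),
        "NAND": (-1.0, -1.0, -1.5),
    }

    for name, (w1, w2, theta) in gates.items():
        outputs = [mp_unit(x1, x2, w1, w2, theta) for x1 in (0, 1) for x2 in (0, 1)]
        print(name, outputs)   # AND: [0, 0, 0, 1], OR: [0, 1, 1, 1], NAND: [1, 1, 1, 0]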

18 A single unit with three inputs: u_j = \sum_{i=1}^{3} w_{ji} x_i + b_j, and the output is y_j = f(u_j).

19 The same unit with a probabilistic output: u_j = \sum_{i=1}^{3} w_{ji} x_i + b_j, and the output is interpreted through the logistic function, p(y_j | u_j) = 1 / (1 + e^{-u_j}).
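
A short numeric sketch of such a unit, assuming the logistic output form above; the input, weight and bias values are arbitrary illustrations.

    import numpy as np

    def unit_forward(x, w, b):
        """Weighted sum u = w.x + b followed by the logistic function."""
        u = np.dot(w, x) + b
        p = 1.0 / (1.0 + np.exp(-u))   # p(y | u), the sigmoid of u
        return u, p

    x = np.array([0.2, 0.7, 0.1])      # inputs x_1, x_2, x_3
    w = np.array([1.5, -0.8, 0.3])     # weights w_{j1}, w_{j2}, w_{j3} (illustrative)
    b = 0.1                            # bias b_j (illustrative)
    print(unit_forward(x, w, b))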

21 [Figure: a network diagram, labeled Input at one end and Output at the other]

23 What learning means for a NN: the unit's behaviour changes with the model parameters {w, θ}, and learning is the process of determining those parameters from data. [Figure: the weighted-sum-and-threshold unit again, with the (w_1, w_2, θ) table for AND, OR and NAND]

24 Given input-target pairs (x, t), determine {w, θ} so that the unit reproduces the targets. [Figure: the four points (0,0), (0,1), (1,0), (1,1) in the x_1-x_2 plane, and a two-input unit with weights w_1, w_2 whose output is compared with the target t]

25 [Figure: the same four points in the x_1-x_2 plane, with the separating line implemented by a two-input unit with weights w_1, w_2; points on one side give output 1, the other side 0]

26 Perceptron (Rosenblatt '57), Linear separability (Minsky & Papert '68), Boltzmann machine (Hinton+ '85), SVM (Vapnik '95), Neocognitron (Fukushima '80), Back prop. (Rumelhart+ '86), Conv. net (LeCun+ '89), Deep learning (Hinton+ '06), Simple / complex cells (Hubel & Wiesel '59), Population coding (Desimone+ '84, Tanaka+ '84)

29 [Figure: a single-layer network; the input x^n = (x_1^n, x_2^n, x_3^n) plus a bias unit feeds the outputs y_1(x^n), y_2(x^n), y_3(x^n), and the network answers It's "1"]

30 A linear discriminant on features φ_1, φ_2: y = sgn( \sum_j w_j φ_j ) = sgn( w_0 + w_1 φ_1 + w_2 φ_2 ), whose decision boundary is the line w_0 + w_1 φ_1 + w_2 φ_2 = 0 in the φ_1-φ_2 plane.

31 Perceptron learning: given training pairs {x^n, t^n}, find weights w such that y(x^n) = t^n for every n, updating w with the samples that are still misclassified.
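
A minimal sketch of the perceptron learning rule in Python; the data set, learning rate and stopping condition are illustrative assumptions, not taken from the slides.

    import numpy as np

    def perceptron_train(X, t, eta=1.0, epochs=100):
        """Rosenblatt's rule: for each misclassified sample, w <- w + eta * t_n * x_n.
        X carries a leading column of ones so that w[0] acts as w_0 (the bias),
        targets t are +1/-1, and the prediction is y = sgn(w . x)."""
        w = np.zeros(X.shape[1])
        for _ in range(epochs):
            errors = 0
            for x_n, t_n in zip(X, t):
                if np.sign(w @ x_n) != t_n:       # misclassified (sign 0 counts as wrong)
                    w += eta * t_n * x_n
                    errors += 1
            if errors == 0:                        # all samples correct: stop
                break
        return w

    # Illustrative, linearly separable data: the OR function with +1/-1 targets.
    X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
    t = np.array([-1, 1, 1, 1])
    w = perceptron_train(X, t)
    print(w, np.sign(X @ w))   # the learned w reproduces the targets [-1, 1, 1, 1]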

32 Limits of the single-layer perceptron (Minsky & Papert '68). [Figure: a configuration of points in the φ_1-φ_2 plane that no single straight line can separate]

33 [Figure: the inputs {x^n} in the x_1-x_2 plane are transformed into new features {z^n} in the z_1-z_2 plane (x_0 and z_0 are bias units); in the transformed space the classes become linearly separable]

34 [Figure: the single-layer network again; the input x^n = (x_1^n, x_2^n, x_3^n) plus a bias unit feeds the outputs y_1(x^n), y_2(x^n), y_3(x^n), and the network answers It's "5"]

35 [Figure: the same network with the weights w_20, w_21, w_22, w_23 into output y_2(x^n) drawn explicitly; the network answers It's "1"]

36 Perceptron (Rosenblatt '57), Linear separability (Minsky & Papert '68), Boltzmann machine (Hinton+ '85), SVM (Vapnik '95), Neocognitron (Fukushima '80), Back prop. (Rumelhart+ '86), Conv. net (LeCun+ '89), Deep learning (Hinton+ '06), Simple / complex cells (Hubel & Wiesel '59), Population coding (Desimone+ '84, Tanaka+ '84)

38 [Figure: a two-layer feed-forward network; inputs x_0, ..., x_D (x_0 a bias), hidden units z_0, ..., z_M (z_0 a bias) reached through first-layer weights w^{(1)}_{MD}, and outputs y_1, ..., y_K reached through second-layer weights w^{(2)}_{KM}, e.g. w^{(2)}_{10} connecting z_0 to the first output]
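
A minimal Python sketch of the forward pass through such a two-layer network; the layer sizes, the tanh hidden nonlinearity, the logistic outputs and the random weights are all illustrative assumptions.

    import numpy as np

    def forward(x, W1, b1, W2, b2):
        """Two-layer network: hidden z = h(W1 x + b1), output y = f(W2 z + b2)."""
        u1 = W1 @ x + b1                   # u_j = sum_i w^(1)_ji x_i + b_j
        z = np.tanh(u1)                    # hidden units z_j = h(u_j)
        u2 = W2 @ z + b2                   # u_k = sum_j w^(2)_kj z_j + b_k
        y = 1.0 / (1.0 + np.exp(-u2))      # output units y_k (logistic)
        return y, z

    rng = np.random.default_rng(0)
    D, M, K = 3, 4, 2                      # number of inputs, hidden units, outputs
    W1, b1 = rng.normal(size=(M, D)), np.zeros(M)
    W2, b2 = rng.normal(size=(K, M)), np.zeros(K)
    y, z = forward(rng.normal(size=D), W1, b1, W2, b2)
    print(y)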

39 Approximation ability of three-layer networks (Irie '88, Funahashi '89). [Figure: hidden units z_1, z_2, z_3 built from the input {x^n}; with enough hidden units the network can approximate an arbitrary continuous mapping]

41 [Figure: a small 2-2-1 network; inputs x_1, x_2 pass through first-layer weights w^{(1)}_{11}, w^{(1)}_{12}, w^{(1)}_{21}, w^{(1)}_{22} and thresholds θ to give hidden units z_1, z_2, which pass through second-layer weights w^{(2)}_1, w^{(2)}_2 to give the output, compared with the target t]

42 [Figure: the two-layer network of slide 38 repeated: inputs x_0, ..., x_D, hidden units z_0, ..., z_M, outputs y_1, ..., y_K, with weights w^{(1)}_{MD}, w^{(2)}_{KM} and w^{(2)}_{10}]

43 The Widrow-Hoff rule (Widrow & Hoff '60). [Figure: a single unit with inputs x_1, x_2, weights w_1, w_2 and activation u, trained so that its output matches the target t]

44 The error function: E(w) = \frac{1}{2} \sum_n \| t^n - y(x^n) \|^2. [Figure: the network with weights w^{(1)}, w^{(2)} maps each input x^n = (x_1^n, x_2^n, x_3^n) to outputs y_1(x^n), y_2(x^n), y_3(x^n), which are compared with the targets t_1^n, t_2^n, t_3^n; learning minimizes E(w) with respect to w over the training set {x^n}, {t^n}]
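
A small numeric illustration of this error; the outputs and targets below are random placeholders standing in for y(x^n) and t^n.

    import numpy as np

    def sum_of_squares_error(Y, T):
        """E(w) = 1/2 * sum_n || t^n - y(x^n) ||^2, with one row per sample."""
        return 0.5 * np.sum((T - Y) ** 2)

    rng = np.random.default_rng(1)
    Y = rng.random((5, 3))   # network outputs y(x^n) for 5 samples, 3 output units
    T = rng.random((5, 3))   # corresponding targets t^n
    print(sum_of_squares_error(Y, T))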

45 [Figure: the 2-2-1 network of slide 41 again, with first-layer weights w^{(1)}_{11}, w^{(1)}_{12}, w^{(1)}_{21}, w^{(1)}_{22}, hidden units z_1, z_2, second-layer weights w^{(2)}_1, w^{(2)}_2, and the output compared with the target t]

46 [Figure: the same 2-2-1 network repeated]

47 The setting for back-propagation: from the training data {x^n, t^n}, determine all the weights {w^{(1)}_{ji}, w^{(2)}_{kj}}. [Figure: input units indexed by i (x), hidden units by j (z), output units by k, with weights w_{ji} and w_{kj} and targets t_k]

48 For each training sample n the network computes y_k^n = y_k(x^n; w), and the per-sample error is E_n(w) = \frac{1}{2} \sum_k ( t_k^n - y_k^n )^2. [Figure: the same layered network with indices i, j, k and weights w_{ji}, w_{kj}]

49 [Figure: the layered network with output units k, hidden unit j, input units i, and weights w_{kj}, w_{ji}]

50 [Figure: the same network, now with an error signal δ attached to the hidden and output units]

51 Back-propagating the error: at an output unit, δ_k = f'(u_k) ( y_k - t_k ); at a hidden unit, the chain rule gives δ_j = f'(u_j) \sum_k w_{kj} δ_k. [Figure: u_j, z_j, u_k computed forward from x_i, and the deltas δ_k, δ_j passed backward]
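
A minimal sketch of these equations for a two-layer network with tanh hidden units and logistic outputs (the sizes, data and learning rate are illustrative; this is the standard textbook rule, not code from the lecture).

    import numpy as np

    def backprop_step(x, t, W1, b1, W2, b2, eta=0.1):
        """One gradient step: delta_k = f'(u_k)(y_k - t_k), delta_j = h'(u_j) sum_k w_kj delta_k."""
        # forward pass
        z = np.tanh(W1 @ x + b1)
        y = 1.0 / (1.0 + np.exp(-(W2 @ z + b2)))
        # backward pass: output and hidden deltas
        delta_k = y * (1.0 - y) * (y - t)            # f'(u_k) = y_k (1 - y_k) for the logistic
        delta_j = (1.0 - z ** 2) * (W2.T @ delta_k)  # h'(u_j) = 1 - z_j^2 for tanh
        # gradients of E_n(w) and the update w <- w - eta * dE_n/dw
        W2 -= eta * np.outer(delta_k, z)
        b2 -= eta * delta_k
        W1 -= eta * np.outer(delta_j, x)
        b1 -= eta * delta_j
        return 0.5 * np.sum((t - y) ** 2)            # E_n(w) before this update

    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(scale=0.5, size=(4, 3)), np.zeros(4)
    W2, b2 = rng.normal(scale=0.5, size=(2, 4)), np.zeros(2)
    x, t = rng.normal(size=3), np.array([1.0, 0.0])
    for _ in range(200):
        e = backprop_step(x, t, W1, b1, W2, b2)
    print(e)   # the per-sample error shrinks as the updates proceed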

52 Stochastic gradient descent: update the weights with the gradient of the per-sample error E_n(w), one sample (or a small batch) at a time (Amari '67, Bottou+ '11, '12).
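
A sketch of the idea with a generic per-sample gradient; the toy model, learning rate and schedule below are illustrative assumptions.

    import numpy as np

    def sgd(w, data, grad_En, eta=0.01, epochs=10, seed=0):
        """Visit the samples in random order and step against the per-sample gradient dE_n/dw."""
        rng = np.random.default_rng(seed)
        for _ in range(epochs):
            for n in rng.permutation(len(data)):
                x_n, t_n = data[n]
                w = w - eta * grad_En(w, x_n, t_n)   # w <- w - eta * dE_n/dw
        return w

    # Toy usage: fit y = w * x to noisy data, with E_n = 1/2 (t_n - w x_n)^2.
    rng = np.random.default_rng(1)
    xs = rng.normal(size=100)
    data = list(zip(xs, 3.0 * xs + 0.1 * rng.normal(size=100)))
    grad_En = lambda w, x, t: -(t - w * x) * x       # dE_n/dw for the toy model
    print(sgd(0.0, data, grad_En, eta=0.05, epochs=20))   # approaches the true slope 3.0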

53 Optimization methods beyond plain SGD: (Le+ '11), AdaDelta (Zeiler '12), AdaGrad (Duchi+ '11), Adam (Kingma+ '15).

54 (Rumelhart+ '86) (Ackley+ '85) (Cottrell+ '87) (Sejnowski & Rosenberg '87) (Gorman & Sejnowski '88) (LeCun+ '89)

55 Summary: the component technologies of deep learning are relatively mature. What a single-layer perceptron does is solve linearly separable problems; making the hierarchy deeper can be expected to improve linear-separation ability by transforming the input representation, but such networks are hard to tune, which is why SVMs became mainstream. Learning itself is the bigger problem: when the target is given, the cost-function-plus-gradient-method approach can be used.
