Safe Screening
Transcription
1 I. Takeuchi, Nagoya Institute of Technology 1/38
2 Two model classes: a linear model f(x) = w_1 x_1 + w_2 x_2 + ... + w_d x_d, parameterized by the weights {w_j}_{j=1}^d, and a kernel model f(x) = α_1 K(x, x_1) + α_2 K(x, x_2) + ... + α_n K(x, x_n), parameterized by the coefficients {α_i}_{i=1}^n.
3 LASSO (L1-regularized regression): w* := argmin_{w ∈ R^d} λ||w||_1 + Σ_{i=1}^n (y_i − f(x_i))^2, where λ > 0 is the regularization parameter. SVM (in terms of the dual coefficients): α* := argmin_{α ∈ R^n} (1/2) α^T K α + C Σ_{i=1}^n max{0, 1 − y_i f(x_i)}, where C > 0 is the cost parameter and K is the kernel matrix.
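The two objectives can be written down directly; a minimal NumPy sketch on synthetic data (the values of `lam` and `C` are illustrative, not from the slides):

```python
import numpy as np

# Synthetic data for illustration only.
rng = np.random.default_rng(0)
n, d = 50, 10
X = rng.standard_normal((n, d))                    # feature matrix
y = rng.standard_normal(n)                         # regression targets (LASSO)
y_cls = np.where(rng.random(n) < 0.5, -1.0, 1.0)   # +/-1 labels (SVM)
K = X @ X.T                                        # linear kernel matrix

def lasso_objective(w, X, y, lam):
    """lam * ||w||_1 + sum_i (y_i - w^T x_i)^2, with lam > 0."""
    res = y - X @ w
    return lam * np.abs(w).sum() + res @ res

def svm_objective(alpha, K, y, C):
    """(1/2) alpha^T K alpha + C * sum_i max(0, 1 - y_i f(x_i)),
    where f(x_i) = (K alpha)_i and C > 0."""
    f = K @ alpha
    return 0.5 * alpha @ K @ alpha + C * np.maximum(0.0, 1.0 - y * f).sum()
```

As a sanity check, at w = 0 the LASSO objective reduces to Σ y_i^2, and at α = 0 the SVM objective reduces to C·n.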
4 Sparsity: among the d weights w_j (or the n coefficients α_i), many satisfy w_j = 0 (resp. α_i = 0), so the corresponding features/instances contribute nothing to f.
5 If we could know in advance which w_j = 0 or α_i = 0, those variables could be removed before optimization. Heuristic approaches such as sure independence screening (Fan et al., 2007) and the shrinking option in libsvm (Fan et al., 2005) may discard variables incorrectly; safe screening (El Ghaoui et al., 2012; Ogawa et al., 2013) identifies w_j = 0 or α_i = 0 with a guarantee.
6 Outline. Part 1 (SVM): Ogawa, Suzuki, and Takeuchi. Safe screening of non-support vectors in pathwise SVM computation. ICML 2013. Part 2: Nakagawa, Suzumura, Karasuyama, Tsuda, and Takeuchi. Safe feature pruning for sparse high-order interaction models. arXiv preprint. Part 3: Okumura, Suzuki, and Takeuchi. Quick sensitivity analysis for incremental data modification. KDD 2015; Shibagaki, Suzuki, Karasuyama, and Takeuchi. Regularization path of cross-validation error lower bounds. NIPS 2015.
7 Part 1: SVM
8-12 [Figure: 2-D toy SVM example (n = 1000, d = 2), shown before and after safe screening; the screened non-support vectors are removed from the plot.]
13 SVM classification: ŷ = −1 if f(x) < 0, +1 if f(x) ≥ 0, where f(x) = w^T x = Σ_{i=1}^n α_i* y_i K(x, x_i) is trained on {(x_i, y_i)}_{i=1}^n. In SVM many α_i* = 0, and such instances have no influence on f; only the support vectors (SVs) do.
14 The optimality conditions split the training set into three groups: support vectors (SVs) inside the margin, y_i f(x_i) < 1, with α_i* = C; instances on the margin, y_i f(x_i) = 1, with α_i* ∈ [0, C]; and non-support vectors (non-SVs), y_i f(x_i) > 1, with α_i* = 0. [Figure: instances with y_i f(x_i) = (y_i x_i)^T w* > 1 lie strictly outside the margin.]
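These three cases are easy to check numerically once the margins y_i f(x_i) are available; a small sketch (the tolerance `tol` is an assumption for floating-point comparison, not from the slides):

```python
import numpy as np

def categorize(margins, tol=1e-9):
    """Split instance indices by the SVM optimality conditions:
    y_i f(x_i) < 1  -> SV with alpha_i = C
    y_i f(x_i) = 1  -> on the margin, alpha_i in [0, C]
    y_i f(x_i) > 1  -> non-SV with alpha_i = 0
    """
    margins = np.asarray(margins, dtype=float)
    sv = np.where(margins < 1 - tol)[0]
    on_margin = np.where(np.abs(margins - 1) <= tol)[0]
    non_sv = np.where(margins > 1 + tol)[0]
    return sv, on_margin, non_sv

# Hand-made margin values for illustration.
margins = [0.3, 1.0, 2.5, 0.9, 1.7]
sv, on_margin, non_sv = categorize(margins)
```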
15-17 Key idea: even though the optimal w* ∈ R^d is unknown, suppose we can construct a ball B := {w : ||w − m|| ≤ r} with known center m and radius r that is guaranteed to contain w*.
18-20 For every w ∈ B, the margin y_i f(x_i) = (y_i x_i)^T w is sandwiched: (y_i x_i)^T w ≥ min_{w ∈ B} (y_i x_i)^T w = (y_i x_i)^T m − ||y_i x_i|| r, and (y_i x_i)^T w ≤ max_{w ∈ B} (y_i x_i)^T w = (y_i x_i)^T m + ||y_i x_i|| r.
21 Safe screening rule: since w* ∈ B, if min_{w ∈ B} (y_i x_i)^T w > 1, then (y_i x_i)^T w* > 1 as well, hence α_i* = 0 and instance i is guaranteed to be a non-SV; it can be removed before training without changing the solution.
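The bound and the rule combine into a few lines of NumPy; in this sketch the ball center `m` and radius `r` are simply taken as given (how to construct them is the subject of the following slides):

```python
import numpy as np

def screen_non_svs(X, y, m, r):
    """Boolean mask of instances guaranteed to be non-SVs.

    For any ball B = {w : ||w - m|| <= r} containing w*, the lower bound
    min_{w in B} (y_i x_i)^T w = (y_i x_i)^T m - ||y_i x_i|| r
    exceeding 1 implies y_i f(x_i) > 1 and hence alpha_i = 0."""
    Z = y[:, None] * X                              # row i is y_i x_i
    lower = Z @ m - np.linalg.norm(Z, axis=1) * r
    return lower > 1.0

# Tiny illustration: the first instance is certifiably a non-SV.
X = np.array([[2.0, 0.0], [0.1, 0.0]])
y = np.array([1.0, 1.0])
mask = screen_non_svs(X, y, m=np.array([1.0, 0.0]), r=0.1)
```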
22-24 Constructing the ball for SVM. The primal problem is w* := argmin_{w ∈ R^d} J(w) := (1/2)||w||^2 + C Σ_{i=1}^n l_i(w), with instance-wise loss l_i(w) := l(y_i, x_i^T w). Given any reference solution w̃ ∈ R^d, a ball B := {w : ||w − m|| ≤ r} containing w* can be computed in closed form; its center m and radius r are built from w̃, the losses l_i(w̃), and their (sub)gradients at w̃ (the exact expressions are given in the ICML 2013 paper).
25 Given the ball B around a rough solution w̃ ∈ R^d, non-SVs can be screened out before the exact optimization; the smaller the radius r, the more non-SVs are detected. (MATLAB demo.)
26 [Figure: the training set before and after safe screening.]
27 In pathwise computation over the regularization parameter C, the solution w* at the current C serves as the reference solution w̃ for the next C; sensitivity to changes in C is revisited in Part 3.
28 [Table: computation time of pathwise linear SVM on large datasets (acoustic, covtype, yahoo, url, kdd-a, kdd-b; sample sizes n from roughly 78 thousand up to roughly 19 million), comparing plain LIBLINEAR against screening (Sc.Rule: rule evaluation, Sc.SVM: solver after screening, Sc.Total: their sum); the numeric entries did not survive transcription. The path starts at C_0 and C is increased by a factor of 1/0.8 per step.]
29 [Table: computation time along the path C_1 < C_2 < ... on benchmark datasets dna (n = 2,000, d = 180), DIGIT1 (n = 1,500, d = 241), satimage (n = 4,435, d = 36), gisette (n = 6,000, d = 5,000), mushrooms (n = 8,124, d = 112), news20 (n = 19,996, d = 1,355,191), and shuttle (n = 43,500, d = 9), each with a linear kernel and RBF kernels (γ = 0.1/d, 1/d, 10/d), comparing LIBSVM/LIBLINEAR against Sc.Rule, Sc.SVM, Sc.Total; the numeric entries did not survive transcription.]
30 Part 2
31 Safe feature screening for the L1-penalized LASSO: w* := argmin_{w ∈ R^d} λ||w||_1 + Σ_{i=1}^n (y_i − w^T x_i)^2. Its dual is γ* := argmin_{γ ∈ R^n} (1/2)||γ − y/λ||^2 s.t. |Σ_{i=1}^n x_ij γ_i| ≤ 1 for all j, and optimality gives the screening property: |Σ_{i=1}^n x_ij γ_i*| < 1 ⇒ w_j* = 0.
32 Since γ* is itself unknown, safe screening methods construct a region that provably contains γ* (El Ghaoui et al., 2012; Liu et al., 2014; Fercoq et al., 2015). If the maximum of |Σ_{i=1}^n x_ij γ_i| over that region is < 1, then |Σ_{i=1}^n x_ij γ_i*| < 1, hence w_j* = 0 and feature j can be discarded without changing the solution.
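As a sketch, assume the region is a sphere {γ : ||γ − c|| ≤ ρ} with known center and radius, in the spirit of the sphere-type regions of the cited works; then the maximum of |x_j^T γ| over the region has the closed form |x_j^T c| + ρ||x_j||, and the test becomes:

```python
import numpy as np

def screen_features(X, c, rho):
    """Boolean mask of features j with w_j* = 0 guaranteed.

    Over the sphere {gamma : ||gamma - c|| <= rho},
    max |x_j^T gamma| = |x_j^T c| + rho * ||x_j||; if this is < 1,
    then |x_j^T gamma*| < 1 and feature j can be safely dropped."""
    scores = np.abs(X.T @ c) + rho * np.linalg.norm(X, axis=0)
    return scores < 1.0

# Tiny illustration: columns of X are features; the first is screened out.
X = np.array([[0.1, 2.0],
              [0.1, 0.0]])
mask = screen_features(X, c=np.array([1.0, 1.0]), rho=0.1)
```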
33 High-order interaction model. Given d-dimensional data {(z_i, y_i)}_{i=1}^n with z_i ∈ [0, 1]^d and y_i ∈ R, include every interaction up to order r, D = Σ_{ρ=1}^r (d choose ρ) terms in total: f(z) = w_1 z_1 + w_2 z_2 + ... + w_d z_d + w_{1,2} z_1 z_2 + w_{1,3} z_1 z_3 + ... + w_{d−1,d} z_{d−1} z_d + w_{1,2,3} z_1 z_2 z_3 + ... + w_{d−2,d−1,d} z_{d−2} z_{d−1} z_d + .... This defines an expanded design matrix X ∈ R^{n×D} whose column blocks are the main effects, the 2nd-order interactions, ..., the r-th-order interactions, and LASSO is applied to X.
34 Already for d = 5000 and r = 5, D is so large that merely evaluating the screening quantities max over the region of |Σ_{i=1}^n γ_i x_ij| for j = 1, ..., D is computationally hopeless.
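The combinatorial blow-up is easy to verify; a quick sketch:

```python
from math import comb

def num_interaction_features(d, r):
    """D = sum_{rho=1}^{r} C(d, rho): all interaction terms up to order r."""
    return sum(comb(d, rho) for rho in range(1, r + 1))
```

For d = 3 and r = 2 this gives 3 main effects plus 3 pairs, i.e. 6 features; for d = 5000 and r = 5 the count exceeds 10^16, far beyond anything that can be enumerated explicitly.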
35 Safe Pruning Rule: organize the D interaction features as a tree in which a child extends its parent by one more variable. For node j there is a test spr(j) such that: if spr(j) is true, then |Σ_{i=1}^n x_ij' γ_i*| < 1, and hence w_j'* = 0, for all j' ∈ Des(j), where Des(j) is the set of descendants of node j. A single test therefore prunes a whole subtree.
36-44 Worked traversal: spr(z_1) = false, A = {z_1}; spr(z_1 z_2) = true (subtree pruned), A = {z_1}; spr(z_1 z_3) = false, A = {z_1, z_1 z_3}; spr(z_1 z_3 z_4) = false, A = {z_1, z_1 z_3, z_1 z_3 z_4}; spr(z_1 z_4) = true, spr(z_2) = true, spr(z_3) = true (all pruned); spr(z_4) = false, A = {z_1, z_1 z_3, z_1 z_3 z_4, z_4}. Final active set: A = {z_1, z_1 z_3, z_1 z_3 z_4, z_4}.
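The traversal above can be sketched as a depth-first search; here `spr` is a stand-in predicate hard-coded to reproduce the example, not the real bound-based test:

```python
class Node:
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

def prune_traverse(node, spr, active):
    """Depth-first search over the feature tree. If spr(node) is true,
    the whole subtree rooted at node is skipped (all of its features are
    guaranteed inactive); otherwise node joins the active set and its
    children are visited."""
    if spr(node):
        return active                      # safe pruning: skip Des(node)
    active.append(node.name)
    for child in node.children:
        prune_traverse(child, spr, active)
    return active

# The tree of the worked example: z1 has children z1z2, z1z3 (with child
# z1z3z4), and z1z4; z2, z3, z4 are the remaining roots.
tree = [
    Node("z1", [Node("z1z2"), Node("z1z3", [Node("z1z3z4")]), Node("z1z4")]),
    Node("z2"), Node("z3"), Node("z4"),
]
pruned = {"z1z2", "z1z4", "z2", "z3"}      # nodes whose spr() was true
spr = lambda node: node.name in pruned

active = []
for root in tree:
    prune_traverse(root, spr, active)
# active == ["z1", "z1z3", "z1z3z4", "z4"], matching the final set A
```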
45 [Figure: along the regularization path λ_1 > λ_2 > ..., computation time, number of tree nodes traversed, and number of active features per interaction depth (1, 2, 3) plotted against log(λ/λ_max) on the protein and mnist datasets, comparing the proposed SPR against Itemset Boosting (IB; Saigo et al., 2006).]
46 Part 3
47 The ball idea also certifies predicted labels. For ŷ = sign(f(x)) = sign(x^T w*), i.e. +1 if x^T w* > 0 and −1 if x^T w* < 0, with w* ∈ B: if min_{w ∈ B} x^T w > 0 then ŷ = +1, and if max_{w ∈ B} x^T w < 0 then ŷ = −1, without ever computing w* itself.
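A minimal sketch of this certification test, where `m` and `r` stand for any valid ball rather than a specific construction from the talk:

```python
import numpy as np

def certify_label(x, m, r):
    """Return +1 or -1 when sign(x^T w) is the same for every w in
    B = {w : ||w - m|| <= r}, else None (the label cannot be certified)."""
    span = np.linalg.norm(x) * r
    lo, hi = x @ m - span, x @ m + span     # min / max of x^T w over B
    if lo > 0:
        return 1
    if hi < 0:
        return -1
    return None

m = np.array([1.0, 0.0])                    # illustrative ball center
```

With r = 0.1, a point like x = (2, 0) is certified +1 and x = (−2, 0) is certified −1, while a point near the decision boundary returns None and would need the exact solution.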
48-51 Quick sensitivity analysis (KDD 2015): when the training set is modified incrementally (a set A of instances added, a set R removed), bounds built around the old solution certify most test labels of the new model without retraining. The computational cost depends only on |A| + |R|.
52 [Table: moving from the old solution w̃_old toward w*_new on kdd2010 (D_old > 8 million and n_test > 0.5 million), with 0.01%, 0.1%, or 1% of the instances updated, reporting the percentage of test labels identified without retraining and the resulting speed-up; the numeric entries did not survive transcription, though the slide highlights identifying roughly 99% of the labels at roughly 1/1000 of the cost.]
53-57 Regularization path of cross-validation error lower bounds (NIPS 2015). [Figure: validation error versus regularization parameter C, with an ε-band around the path.] Model selection can be done with an approximation guarantee: a value of C whose validation error is within ε of the best on the path can be certified.
58 [Figure: lower and upper bounding of x^T w*_C as C varies between evaluated points on the path.]
59 A lower bound LB(w*_C^T x_i) and an upper bound UB(w*_C^T x_i) are written in terms of a reference solution ŵ_C̃ at another parameter value C̃ and the ratio C̃/C, using four quantities: α(ŵ_C̃, x_i) := (1/2)(ŵ_C̃^T x_i + |ŵ_C̃^T x_i|) ≥ 0 and β(ŵ_C̃, x_i) := (1/2)(ŵ_C̃^T x_i − |ŵ_C̃^T x_i|) ≤ 0, the positive and negative parts of ŵ_C̃^T x_i, together with the analogous parts γ(g(ŵ_C̃), x_i) ≥ 0 and δ(g(ŵ_C̃), x_i) ≤ 0 of g(ŵ_C̃)^T x_i, where g(ŵ_C̃) is the gradient of the objective function at w = ŵ_C̃.
62 I. Takeuchi, Nagoya Institute of Technology 36/38
63 References:
L. El Ghaoui, V. Viallon and T. Rabbani. Safe feature elimination in sparse supervised learning. Pacific Journal of Optimization.
J. Liu, Z. Zhao, J. Wang and J. Ye. Safe screening with variational inequalities and its application to Lasso. ICML 2014.
J. Fan and J. Lv. Sure independence screening for ultrahigh dimensional feature space. Journal of the Royal Statistical Society B, 70:849-911.
R. Fan, K. Chang, C. Hsieh, X. Wang and C. Lin. LIBLINEAR: a library for large linear classification. Journal of Machine Learning Research, 9.
H. Saigo, T. Uno and K. Tsuda. Mining complex genotypic features for predicting HIV-1 drug resistance. Bioinformatics, 24.
K. Ogawa, Y. Suzuki and I. Takeuchi. Safe screening of non-support vectors in pathwise SVM computation. ICML 2013.
K. Nakagawa, S. Suzumura, M. Karasuyama, K. Tsuda and I. Takeuchi. Safe feature pruning for sparse high-order interaction models. arXiv preprint.
S. Okumura, Y. Suzuki and I. Takeuchi. Quick sensitivity analysis for incremental data modification and its application to leave-one-out CV in linear classification problems. KDD 2015.
A. Shibagaki, Y. Suzuki, M. Karasuyama and I. Takeuchi. Regularization path of cross-validation error lower bounds. NIPS 2015.
x T = (x 1,, x M ) x T x M K C 1,, C K 22 x w y 1: 2 2
More informationII Karel Švadlenka * [1] 1.1* 5 23 m d2 x dt 2 = cdx kx + mg dt. c, g, k, m 1.2* u = au + bv v = cu + dv v u a, b, c, d R
II Karel Švadlenka 2018 5 26 * [1] 1.1* 5 23 m d2 x dt 2 = cdx kx + mg dt. c, g, k, m 1.2* 5 23 1 u = au + bv v = cu + dv v u a, b, c, d R 1.3 14 14 60% 1.4 5 23 a, b R a 2 4b < 0 λ 2 + aλ + b = 0 λ =
More information2 G(k) e ikx = (ik) n x n n! n=0 (k ) ( ) X n = ( i) n n k n G(k) k=0 F (k) ln G(k) = ln e ikx n κ n F (k) = F (k) (ik) n n= n! κ n κ n = ( i) n n k n
. X {x, x 2, x 3,... x n } X X {, 2, 3, 4, 5, 6} X x i P i. 0 P i 2. n P i = 3. P (i ω) = i ω P i P 3 {x, x 2, x 3,... x n } ω P i = 6 X f(x) f(x) X n n f(x i )P i n x n i P i X n 2 G(k) e ikx = (ik) n
More information29 jjencode JavaScript
Kochi University of Technology Aca Title jjencode で難読化された JavaScript の検知 Author(s) 中村, 弘亮 Citation Date of 2018-03 issue URL http://hdl.handle.net/10173/1975 Rights Text version author Kochi, JAPAN http://kutarr.lib.kochi-tech.ac.jp/dspa
More information2019 1 5 0 3 1 4 1.1.................... 4 1.1.1......................... 4 1.1.2........................ 5 1.1.3................... 5 1.1.4........................ 6 1.1.5......................... 6 1.2..........................
More informationCOE-RES Discussion Paper Series Center of Excellence Project The Normative Evaluation and Social Choice of Contemporary Economic Systems Graduate Scho
COE-RES Discussion Paper Series Center of Excellence Project The Normative Evaluation and Social Choice of Contemporary Economic Systems Graduate School of Economics and Institute of Economic Research
More information2008 : 80725872 1 2 2 3 2.1.......................................... 3 2.2....................................... 3 2.3......................................... 4 2.4 ()..................................
More information29 Short-time prediction of time series data for binary option trade
29 Short-time prediction of time series data for binary option trade 1180365 2018 2 28 RSI(Relative Strength Index) 3 USD/JPY 1 2001 1 2 4 10 2017 12 29 17 00 1 high low i Abstract Short-time prediction
More information[1] SBS [2] SBS Random Forests[3] Random Forests ii
Random Forests 2013 3 A Graduation Thesis of College of Engineering, Chubu University Proposal of an efficient feature selection using the contribution rate of Random Forests Katsuya Shimazaki [1] SBS
More informationT rank A max{rank Q[R Q, J] t-rank T [R T, C \ J] J C} 2 ([1, p.138, Theorem 4.2.5]) A = ( ) Q rank A = min{ρ(j) γ(j) J J C} C, (5) ρ(j) = rank Q[R Q,
(ver. 4:. 2005-07-27) 1 1.1 (mixed matrix) (layered mixed matrix, LM-matrix) m n A = Q T (2m) (m n) ( ) ( ) Q I m Q à = = (1) T diag [t 1,, t m ] T rank à = m rank A (2) 1.2 [ ] B rank [B C] rank B rank
More informationIPSJ SIG Technical Report Vol.2015-MUS-107 No /5/23 HARK-Binaural Raspberry Pi 2 1,a) ( ) HARK 2 HARK-Binaural A/D Raspberry Pi 2 1.
HARK-Binaural Raspberry Pi 2 1,a) 1 1 1 2 3 () HARK 2 HARK-Binaural A/D Raspberry Pi 2 1. [1,2] [2 5] () HARK (Honda Research Institute Japan audition for robots with Kyoto University) *1 GUI ( 1) Python
More informationOverview (Gaussian Process) GPLVM GPDM 2 / 59
daichi@ism.ac.jp 2015-3-3( ) 1 / 59 Overview (Gaussian Process) GPLVM GPDM 2 / 59 (Gaussian Process) y 2 1 0 1 2 3 8 6 4 2 0 2 4 6 8 x x y (regressor) D = { (x (n), y (n) ) } N, n=1 x (n+1) y (n+1), (
More informationdvi
2017 65 2 185 200 2017 1 2 2016 12 28 2017 5 17 5 24 PITCHf/x PITCHf/x PITCHf/x MLB 2014 PITCHf/x 1. 1 223 8522 3 14 1 2 223 8522 3 14 1 186 65 2 2017 PITCHf/x 1.1 PITCHf/x PITCHf/x SPORTVISION MLB 30
More informationfiš„v8.dvi
(2001) 49 2 333 343 Java Jasp 1 2 3 4 2001 4 13 2001 9 17 Java Jasp (JAva based Statistical Processor) Jasp Jasp. Java. 1. Jasp CPU 1 106 8569 4 6 7; fuji@ism.ac.jp 2 106 8569 4 6 7; nakanoj@ism.ac.jp
More information1 4 1 ( ) ( ) ( ) ( ) () 1 4 2
7 1995, 2017 7 21 1 2 2 3 3 4 4 6 (1).................................... 6 (2)..................................... 6 (3) t................. 9 5 11 (1)......................................... 11 (2)
More informationmain.dvi
FDTD S A Study on FDTD Analysis based on S-Parameter 18 2 7 04GD168 FDTD FDTD S S FDTD S S S S FDTD FDTD i 1 1 1.1 FDTD.................................... 1 1.2 FDTD..................... 3 2 S 5 2.1 FDTD
More informationIBM-Mode1 Q: A: cash money It is fine today 2
8. IBM Model-1 @NICT mutiyama@nict.go.jp 1 IBM-Mode1 Q: A: cash money It is fine today 2 e f a P (f, a e) â : â = arg max a P (f, a e) â P (f, a e) 3 θ P (f e, θ) θ f d = { f, e } L(θ d) = log f,e d P
More information