
2008 : 80725872

1 ........................... 2
2 ........................... 3
  2.1 ....................... 3
  2.2 ....................... 3
  2.3 ....................... 4
  2.4 () .................... 4
3 ........................... 5
  3.1 ....................... 5
  3.2 ....................... 6
4 ........................... 8
  4.1 ....................... 8
  4.2 ....................... 8
  4.3 ....................... 9
  4.4 ....................... 10
5 ........................... 11
  5.1 ....................... 11
6 ........................... 12
  6.1 ....................... 12
  6.2 ....................... 12
  6.3 ....................... 13
  6.4 ....................... 13
7 ........................... 14
  7.1 ....................... 14
  7.2 ....................... 17
8 ........................... 23
9 ........................... 24

1/24

1

1997 [1] *1  1  [5]

*1 ()

2

2

2.1

The network has three layers, with l(x), ψ_{a,t}(x), and f(x) as the activation functions of the input, hidden, and output layers; a is the dilation and t the translation parameter.

2.2

With input x, output y, weight matrices W^(1) and W^(2), and layer sizes I, J, K:

x = {x_i | 1 ≤ i ≤ I}                                 (1)
x'_i = l(x_i)                                         (2)
h = W^(1) x'                                          (3)
h'_j = ψ_{a,t}(h_j) = ψ( (h_j − t_j) / a_j )          (4)
h' = {h'_j | 1 ≤ j ≤ J}                               (5)

3

y = W^(2) h'                                          (6)
y'_k = f(y_k)                                         (7)
y' = {y'_k | 1 ≤ k ≤ K}                               (8)

2.3

3  1.  2.  3.  1 2 3  *2  3 1

2.4

()  (1) (2)  a, t  2  (1)  *2  0

*2 The Mexican Hat wavelet: ψ_{a,t}(x) = (1 − 2X²) e^{−X²},  X = (x − t) / a  (a = 1)

4
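As a concrete reading of Eqs. (1)-(8) and footnote *2, the following C++ sketch (written for this text, not the thesis implementation; taking l and f as identity functions is an assumption) runs one forward pass of the three-layer wavelet network with Mexican Hat hidden units.

#include <cmath>
#include <cstddef>
#include <vector>

// Mexican Hat mother wavelet of footnote *2:
//   psi_{a,t}(x) = (1 - 2 X^2) exp(-X^2),  X = (x - t) / a
double mexican_hat(double x, double a, double t) {
    const double X = (x - t) / a;
    return (1.0 - 2.0 * X * X) * std::exp(-X * X);
}

// One forward pass of the three-layer wavelet network, Eqs. (1)-(8).
// W1 is J x I, W2 is K x J; a[j] and t[j] are the dilation and
// translation of hidden unit j.  l() and f() are identity here.
std::vector<double> forward(const std::vector<double>& x,
                            const std::vector<std::vector<double>>& W1,
                            const std::vector<std::vector<double>>& W2,
                            const std::vector<double>& a,
                            const std::vector<double>& t) {
    const std::size_t I = x.size(), J = W1.size(), K = W2.size();
    std::vector<double> h(J, 0.0), hp(J, 0.0), y(K, 0.0);
    for (std::size_t j = 0; j < J; ++j) {
        for (std::size_t i = 0; i < I; ++i)
            h[j] += W1[j][i] * x[i];                 // Eq. (3)
        hp[j] = mexican_hat(h[j], a[j], t[j]);       // Eq. (4)
    }
    for (std::size_t k = 0; k < K; ++k)
        for (std::size_t j = 0; j < J; ++j)
            y[k] += W2[k][j] * hp[j];                // Eq. (6)
    return y;                                        // Eqs. (7)-(8) with f = identity
}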

3

3.1

2-2-1 networks *3, XOR and 10-bit tight encoders *4; (−1, 1); back-propagation (BP); (0.0, 5.0); 5000 epochs; 0.07; 100 ()

Microsoft Windows XP on an Intel Mac with Boot Camp, Intel Core 2 Duo CPU T7700 @ 2.40 GHz, 2 GB RAM, Microsoft Visual Studio 2008, C++
Table 1

             ( s)        (%)
Sigmoid      3.7   371   1042   100
Haar         3.4    19     45    98
MexicanHat   0.9   174    442   100
Table 2: XOR

*3 A 2-2-1 network has 2 input, 2 hidden, and 1 output unit; i-j-k denotes a three-layer network in the same way.
*4 Encoders with N inputs and log2(N) hidden units (); called N-bit tight encoders.

5

             ( s)        (%)
Sigmoid      4.2   110   3258   100
Haar         3.0    36   1044    11
MexicanHat   0.7   308   9717    13
Table 3: 10-bit tight encoders

*5  ([1][3] *6 )  *7  ()

3.2

3.2.1

1  ([1][3] )  1  41-20-3  2-2-1  XOR  *5

*5 1 1 3 1 /
*6 1 1 1 / ()
*7

6
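For reference, the three hidden-unit functions compared in Tables 2 and 3 can be written as below, together with the derivatives needed by back-propagation. The sigmoid and Mexican Hat forms are standard; the Haar form shown is the usual mother wavelet and is only an assumption, since the exact definition used in the thesis was lost.

#include <cmath>

// Hidden-unit functions compared in Tables 2-3, with the derivatives
// used by back-propagation.  The Haar definition is assumed (standard
// mother wavelet); note its derivative is zero almost everywhere.
double sigmoid(double x)       { return 1.0 / (1.0 + std::exp(-x)); }
double sigmoid_deriv(double x) { const double s = sigmoid(x); return s * (1.0 - s); }

double haar(double x) {
    if (x >= 0.0 && x < 0.5) return  1.0;
    if (x >= 0.5 && x < 1.0) return -1.0;
    return 0.0;
}

double mexican_hat(double X)       { return (1.0 - 2.0 * X * X) * std::exp(-X * X); }
double mexican_hat_deriv(double X) { return 2.0 * X * (2.0 * X * X - 3.0) * std::exp(-X * X); }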

3.2.2

*8  ( )  (−1, 1)  1  ()  (encoders )  BP  ( *9 )  ( *10 )

3.2.3

()

*8
*9 The a_j in Eq. (4).
*10 The t_j in Eq. (4).

7
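Footnotes *9 and *10 indicate that the a_j and t_j of Eq. (4) are themselves updated by BP. The chain-rule factors needed for those updates, worked out here for the Mexican Hat of footnote *2 (a calculation added for reference, not taken from the thesis), are as follows. With $X = (h_j - t_j)/a_j$ and $\psi(X) = (1 - 2X^2)e^{-X^2}$,

$$
\frac{\partial \psi}{\partial X} = 2X(2X^2 - 3)\,e^{-X^2}, \qquad
\frac{\partial X}{\partial t_j} = -\frac{1}{a_j}, \qquad
\frac{\partial X}{\partial a_j} = -\frac{h_j - t_j}{a_j^2} = -\frac{X}{a_j},
$$

so that

$$
\frac{\partial \psi_{a,t}(h_j)}{\partial t_j} = -\frac{2X(2X^2 - 3)\,e^{-X^2}}{a_j}, \qquad
\frac{\partial \psi_{a,t}(h_j)}{\partial a_j} = -\frac{2X^2(2X^2 - 3)\,e^{-X^2}}{a_j}.
$$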

4

4.1

Starting from the networks of Sec. 3.2.1, the procedure has four steps:
1. ( W^(1) )
2. ( a_j, t_j )
3. ( W^(2) )
4.

4.2

4.2.1

W^(1)  W^(1)  W^(1)

4.2.2

K-means is used to determine the a_j and t_j. For example, given the four values 0, 1, 2, 3, the set {0,1,2,3} is first split into {0,1}, {2,3},

8

and at the second split into {0}, {1}, {2}, {3}, giving four clusters.

4.2.3

W^(2)  0.5  0.0006  (−0.5, 0.5)  W^(2)  W^(1)

4.2.4

W^(2)

4.3

XOR and 10-bit tight encoders; 0.1; 5.0; 100; 100 ()

             ( s)   ( s)        (%)
Sigmoid      3.7   371    20   1042   100
Haar         3.4    19    17     45    98
MexicanHat   0.9   174    19    442   100
Table 4: XOR

             ( s)   ( s)        (%)
Haar         2.1     5   181     29   100
MexicanHat   0.2   286   163   1189    89
Table 5: XOR

9
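The K-means step of Sec. 4.2.2 determines the a_j and t_j. Since the exact mapping from clusters to (a_j, t_j) did not survive, the C++ sketch below is one plausible reading in which each cluster centre becomes a translation t_j and the cluster's spread becomes the dilation a_j (both of these choices are assumptions, not the thesis's stated rule).

#include <algorithm>
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

// One-dimensional K-means over the values v (assumed to have at least
// k elements), used to propose initial translations t[j] and
// dilations a[j] for the k hidden units.
void kmeans_init(const std::vector<double>& v, int k,
                 std::vector<double>& t, std::vector<double>& a) {
    t.assign(k, 0.0);
    for (int j = 0; j < k; ++j)                       // spread initial centres over v
        t[j] = v[j * v.size() / k];
    std::vector<int> label(v.size(), 0);
    for (int iter = 0; iter < 100; ++iter) {
        for (std::size_t n = 0; n < v.size(); ++n) {  // assignment step
            double best = std::numeric_limits<double>::max();
            for (int j = 0; j < k; ++j) {
                const double d = std::fabs(v[n] - t[j]);
                if (d < best) { best = d; label[n] = j; }
            }
        }
        for (int j = 0; j < k; ++j) {                 // update step
            double sum = 0.0; int cnt = 0;
            for (std::size_t n = 0; n < v.size(); ++n)
                if (label[n] == j) { sum += v[n]; ++cnt; }
            if (cnt > 0) t[j] = sum / cnt;
        }
    }
    a.assign(k, 1.0);
    for (int j = 0; j < k; ++j) {                     // dilation from cluster half-width
        double maxd = 0.0;
        for (std::size_t n = 0; n < v.size(); ++n)
            if (label[n] == j) maxd = std::max(maxd, std::fabs(v[n] - t[j]));
        if (maxd > 0.0) a[j] = maxd;
    }
}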

             ( s)   ( s)        (%)
Sigmoid      4.2   110    21   3258   100
Haar         3.0    36    26   1044    11
MexicanHat   0.7   308    21   9717    13
Table 6: 10-bit tight encoders

             ( s)   ( s)        (%)
Haar         2.7    59   880   1702    27
MexicanHat     -     -     -      -     0
Table 7: encoders

4.4

XOR  Haar  7  K-means  XOR  MexicanHat  encoders  BP  W^(1)  K-means  K-means  XOR  encoders  [0.0, 1.0]  W^(1)  [0.0, 1.0]  K-means

10

(1) W^(1)  (2)  (1) ( )  (2) K-means

5

5.1

tanh, Gabor, MexicanHat  3

5.1.1

tanh  [ , ]  0  [S_0, S_1]  0

5.1.2

tanh  (9) *11

∫ ψ(x) dx = 0                                         (9)

*11 ∫ |ψ̂(ω)|² / ω dω < ∞  (ψ̂: the Fourier transform of ψ)  (9)

11
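As a check that the Mexican Hat of footnote *2 satisfies condition (9) (a short calculation added here for reference, using the standard Gaussian integrals $\int e^{-X^2}dX = \sqrt{\pi}$ and $\int X^2 e^{-X^2}dX = \sqrt{\pi}/2$):

$$
\int_{-\infty}^{\infty} (1 - 2X^2)\,e^{-X^2}\,dX
  = \int_{-\infty}^{\infty} e^{-X^2}\,dX \;-\; 2\int_{-\infty}^{\infty} X^2 e^{-X^2}\,dX
  = \sqrt{\pi} - 2\cdot\frac{\sqrt{\pi}}{2} = 0 .
$$

Since $\int \psi\big(\tfrac{x-t}{a}\big)\,dx = a\int\psi(X)\,dX$, the dilated and translated $\psi_{a,t}$ of footnote *2 integrates to zero as well.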

5.1.3

W_f^ψ(t, a) = (1/√a) Σ_i ψ( (i − t) / a ) f(i)        (10)

where a is the dilation, t the translation, and i the sample index.

6

1  [5]-[7]

6.1

1 [8]

6.2

With inputs x, outputs y, and weights w, v, the parameters are

θ = (w_1, ..., w_m, v_1, ..., v_n)                    (11)

y = f(x, θ) = Σ_{i,j} v_j ψ(w_i x_i)                  (12)

and with noise ε,

y = f(x, θ) + ε                                       (13)

12
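For reference, Eq. (10) of Sec. 5.1.3 can be evaluated directly for a sampled signal; the C++ sketch below does so (written for this text; the 1/√a normalisation follows the usual convention, since the exact factor in the thesis was garbled).

#include <cmath>
#include <cstddef>
#include <vector>

// Discrete evaluation of Eq. (10):
//   W_f^psi(t, a) = (1 / sqrt(a)) * sum_i psi((i - t) / a) * f(i)
// for a sampled signal f[0..N-1] and a mother wavelet psi.
double wavelet_transform(const std::vector<double>& f,
                         double t, double a,
                         double (*psi)(double)) {
    double sum = 0.0;
    for (std::size_t i = 0; i < f.size(); ++i)
        sum += psi((static_cast<double>(i) - t) / a) * f[i];
    return sum / std::sqrt(a);
}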

2 [8]

With noise ε, input x, parameters θ, and input distribution q(x), the joint density of (y, x) is

p(y, x, θ) = q(x) (1/√(2π)) exp( −(1/2) (y − f(x, θ))² )        (14)

θ

6.3

3  4

6.4

When θ is estimated from N samples,

E[ (θ̂ − θ)(θ̂ − θ)^T ] ≥ (1/N) G^{−1}(θ)              (15)

G(θ) = E[ (∂l(y, x, θ)/∂θ) (∂l(y, x, θ)/∂θ)^T ]        (16)

13

= plateau

3 [8]
4 [8]

With learning rate η, loss l, and G(θ) as in Eq. (16), the natural gradient update is

θ_{t+1} = θ_t − η G^{−1}(θ_t) ∂l(y_t, x_t, θ_t)/∂θ_t              (17)

G^{−1}  (1)  0  (2)  *12  G^{−1}

7

(1)  (2)  2

7.1

7.1.1

3  Haar  5  MexicanHat  6

*12 tanh

14
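As a concrete illustration of Eqs. (16)-(17) (a minimal C++ sketch written for this text, not the thesis implementation; estimating G empirically from per-sample gradients and adding a small ridge term for invertibility are both assumptions), one natural-gradient step can be written as:

#include <cmath>
#include <cstddef>
#include <utility>
#include <vector>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;

// Solve the linear system A d = b by Gaussian elimination with
// partial pivoting (used below for d = G^{-1} * grad).
Vec solve(Mat A, Vec b) {
    const int n = static_cast<int>(b.size());
    for (int c = 0; c < n; ++c) {
        int p = c;
        for (int r = c + 1; r < n; ++r)
            if (std::fabs(A[r][c]) > std::fabs(A[p][c])) p = r;
        std::swap(A[c], A[p]);
        std::swap(b[c], b[p]);
        for (int r = c + 1; r < n; ++r) {
            const double m = A[r][c] / A[c][c];
            for (int k = c; k < n; ++k) A[r][k] -= m * A[c][k];
            b[r] -= m * b[c];
        }
    }
    Vec x(n, 0.0);
    for (int c = n - 1; c >= 0; --c) {
        x[c] = b[c];
        for (int k = c + 1; k < n; ++k) x[c] -= A[c][k] * x[k];
        x[c] /= A[c][c];
    }
    return x;
}

// One step of Eq. (17).  G is estimated as the empirical mean of
// grad * grad^T over the per-sample gradients in 'grads' (Eq. (16));
// a small ridge term keeps it invertible.
void natural_gradient_step(Vec& theta, const std::vector<Vec>& grads,
                           const Vec& grad_t, double eta) {
    const int n = static_cast<int>(theta.size());
    Mat G(n, Vec(n, 0.0));
    for (std::size_t s = 0; s < grads.size(); ++s)
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < n; ++j)
                G[i][j] += grads[s][i] * grads[s][j] / grads.size();
    for (int i = 0; i < n; ++i) G[i][i] += 1e-6;      // ridge for invertibility
    const Vec d = solve(G, grad_t);                   // d = G^{-1} * grad_t
    for (int i = 0; i < n; ++i) theta[i] -= eta * d[i];   // Eq. (17)
}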

1  0  Haar  0  *13  *14

Fig. 5: Haar
Fig. 6: MexicanHat

7.1.2

xor *15  2 *16  xor  2  1  1  7-8  3  1  1  2  9  2  9  1

7.1.3

[0.0, 3.0]  30  *17  10  30

*13
*14
*15 z_{t+1} = f(x_t, y_t, z_t),  z_{t+1} ∈ [0, 1]
*16 y = ( (x_1² − x_2²) sin(10 x_1 x_2) + 1 ) / 2,  x_1, x_2 ∈ [−1, 1]
*17 5000

15

Fig. 7
Fig. 8
Fig. 9

2  Amari [8]  (1) 1  0  (2) 1  2  (1)  (2)  Haar  MexicanHat

16

Haar  (2)  MexicanHat  (2)

7.2

7.2.1

xor

 /    2.9    86.7% (26/30)    603.3    33.4
XOR

1030  11  4  500  10  11  12  1 *18  1  1  0  2

*18 2  0-1  0  1

17

1  (1)  0  (1)  12

Haar /    2.5    3.3% (1/30)    -    -
Haar XOR

5000  29  14  Haar  [0, 1]  *19  13  30  1

*19 1

18

Fig. 13
Fig. 14

MexicanHat    1.6    100% (30/30)    54.6    48.8
MexicanHat XOR

15  15  140

7.2.2

0.3    87.6% (26/30)    516.1    300.1

17  +  180  18  0  0/2  (2)  0  1

19

Fig. 15
Fig. 16
Fig. 17

MexicanHat    0.3    90% (27/30)    215.3    166.8
MexicanHat

20

Fig. 18
Fig. 19
Fig. 20

21

Fig. 21

7.2.3

2    -    0% (0/30)    -    -

2  30  22  23  23  5  1  0

MexicanHat    1.1    100% (30/30)    297.2    95.4
2 MexicanHat

7.2.3  50

22

Fig. 23
Fig. 22
Fig. 24

8

95%

23

[9]

[1] Qinghua Zhang, "Using Wavelet Network in Nonparametric Estimation," IEEE Transactions on Neural Networks, vol. 8, no. 2, 1997, pp. 227-236.
[2] Kang Li, Jian-Xun Peng, "Neural input selection - A fast model-based approach," Neurocomputing, vol. 70, 2007, pp. 762-769.
[3] Cheng-Jian Lin, "Wavelet Neural Networks with a Hybrid Learning Approach," Journal of Information Science and Engineering, vol. 22, 2006, pp. 1367-1387.
[4] Rui Xu, "Survey of Clustering Algorithms," IEEE Transactions on Neural Networks, vol. 16, no. 3, 2005, pp. 645-678.
[5] Shun-ichi Amari, "Natural Gradient Works Efficiently in Learning," Neural Computation, vol. 10, 1998.
[6] Shun-ichi Amari, Hyeyoung Park, Tomoko Ozeki, "Singularities Affect Dynamics of Learning in Neuromanifolds," Neural Computation, vol. 18, 2006.
[7] Florent Cousseau, Tomoko Ozeki, Shun-ichi Amari, "Dynamics of Learning in Multilayer Perceptrons Near Singularities," IEEE Transactions on Neural Networks, vol. 19, no. 8, 2008.
[8] , I, vol. 49, no. 8, 2005 (in Japanese).
[9] A. Waibel, T. Hanazawa, G. Hinton, K. Shikano, and K. Lang, "Phoneme Recognition Using Time-Delay Neural Networks," IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 37, pp. 328-339, 1989.

9

24