p.1/70

p.2/70

p.3/70

(1) 20th century, up to around 1970: cf. Box and Jenkins (1970)
(2) From the 1980s on
p.4/70

Kolmogorov (1940), (1941); Granger (1966); Hurst (1951): the Hurst effect
p.5/70

Beveridge Wheat Price Index
p.6/70

Nile River Water Level p.7/70

p.8/70

(Weak stationarity) \{X_t \mid t = 0, \pm 1, \pm 2, \ldots\}:
(i) E(X_t) = \mu (here we take \mu = 0)
(ii) \mathrm{Cov}(X_t, X_s) depends only on t - s
\gamma(h) = \mathrm{Cov}(X_{t+h}, X_t) : Autocovariance Function
\rho(h) = \gamma(h)/\gamma(0) : Autocorrelation Function
p.9/70

(Spectral representation)
\gamma(h) = \int_{-\pi}^{\pi} \exp(ih\lambda)\, dF(\lambda)
\gamma(h) = \int_{-\pi}^{\pi} \exp(ih\lambda) f(\lambda)\, d\lambda : when F is Absolutely Continuous
p.10/70

(Short memory) \sum_{h=0}^{\infty} |\rho(h)| < \infty
(Long memory) \sum_{h=0}^{\infty} |\rho(h)| = \infty
p.11/70
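This dichotomy is easy to see numerically: an exponentially decaying correlation sequence has convergent partial sums, while a hyperbolically decaying one (\rho(h) \sim h^{-\alpha} with \alpha < 1, as on p.19 below) does not. An illustrative sketch, not from the talk (the rates 0.9^h and h^{-1/2} are arbitrary choices):

```python
import numpy as np

h = np.arange(1, 100_001)

# Short memory: |rho(h)| = 0.9^h decays geometrically, so the partial
# sums converge (here to 0.9 / (1 - 0.9) = 9).
short_partial = np.cumsum(0.9 ** h)

# Long memory: |rho(h)| = h^(-1/2) decays hyperbolically with exponent
# < 1, so the partial sums diverge (they grow roughly like 2*sqrt(N)).
long_partial = np.cumsum(h ** -0.5)
```

The short-memory sums settle near their limit almost immediately, while the long-memory sums keep growing however far out one looks.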

X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = U_t - \theta_1 U_{t-1} - \cdots - \theta_q U_{t-q} : ARMA(p, q)
\{U_t \mid t = 0, \pm 1, \pm 2, \ldots\} : White Noise
\mathrm{Var}(U_t) = \sigma^2, \mathrm{Cov}(U_t, U_s) = 0, t \neq s
p.12/70

\phi(B) X_t = \theta(B) U_t : ARMA
B X_t = X_{t-1} : Backward Shift Operator
\phi(B) = 1 - \phi_1 B - \cdots - \phi_p B^p
\theta(B) = 1 - \theta_1 B - \cdots - \theta_q B^q
p.13/70

ARMA Autocorrelation Function: |\rho(h)| \leq K a^{|h|} (0 < a < 1) \forall h : Exponential Decay
Spectral Density Function: f(\lambda) = \frac{\sigma^2}{2\pi} \frac{|\theta(e^{-i\lambda})|^2}{|\phi(e^{-i\lambda})|^2} : Analytic Function
p.14/70

ARIMA(p, d, q): \phi(B) \nabla^d X_t = \theta(B) U_t
\nabla X_t = (1 - B) X_t = X_t - X_{t-1} : Difference Operator
\nabla^d = (1 - B)^d = \sum_{j=0}^{d} \binom{d}{j} (-B)^j (d = 2, 3, \ldots)
\{\nabla^d X_t\} : Stationary ARMA(p, q) Model
p.15/70

ARFIMA(p, d, q): \phi(B) \nabla^d X_t = \theta(B) U_t, d : Any Real Number
\nabla^d = (1 - B)^d = \sum_{j=0}^{\infty} \binom{d}{j} (-B)^j : Fractional Difference Operator
\binom{d}{j} = \frac{d(d-1)\cdots(d-j+1)}{j(j-1)\cdots 1}
p.16/70
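The fractional-difference coefficients \binom{d}{j}(-1)^j satisfy the recurrence \pi_0 = 1, \pi_j = \pi_{j-1}(j-1-d)/j, which makes the operator easy to apply in practice. A minimal sketch (function names are mine, not from the talk; the truncation treats X_t = 0 before the sample start):

```python
import numpy as np

def frac_diff_coeffs(d, n):
    """Coefficients pi_j of (1 - B)^d = sum_j pi_j B^j via the
    recurrence pi_0 = 1, pi_j = pi_{j-1} * (j - 1 - d) / j."""
    pi = np.empty(n)
    pi[0] = 1.0
    for j in range(1, n):
        pi[j] = pi[j - 1] * (j - 1 - d) / j
    return pi

def frac_diff(x, d):
    """Apply the truncated fractional difference (1 - B)^d to a
    series x, assuming x_t = 0 before the first observation."""
    x = np.asarray(x, dtype=float)
    pi = frac_diff_coeffs(d, len(x))
    # y_t = sum_{j=0}^{t} pi_j x_{t-j}
    return np.array([pi[:t + 1] @ x[t::-1] for t in range(len(x))])
```

For integer d this reproduces ordinary differencing, and applying d then -d returns the original series, since the two coefficient series convolve to the identity.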

d < 1/2 \Rightarrow ARFIMA(p, d, q) : Stationary Process
Spectral Density Function: f(\lambda) = \frac{\sigma^2}{2\pi} \frac{|\theta(e^{-i\lambda})(1 - e^{-i\lambda})^{-d}|^2}{|\phi(e^{-i\lambda})|^2}
f(\lambda) \to \infty as \lambda \to 0 (0 < d < 1/2)
p.17/70

2. Fractional Brownian Motion \{B_H(t) \mid 0 \leq t < \infty\}:
E(B_H(t)) = 0 \forall t
E(|B_H(t) - B_H(s)|^2) = \sigma^2 |t - s|^{2H} (0 < H < 1)
H = 1/2 \Rightarrow Brownian Motion
Fractional Gaussian Noise \{X_t \mid t = 1, 2, \ldots\}: X_t = B_H(t) - B_H(t-1)
p.18/70
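Expanding E(X_t X_{t+h}) for X_t = B_H(t) - B_H(t-1) with E|B_H(t) - B_H(s)|^2 = \sigma^2 |t-s|^{2H} yields the standard fGn autocovariance \gamma(h) = (\sigma^2/2)(|h+1|^{2H} - 2|h|^{2H} + |h-1|^{2H}); for H = 1/2 every lag h \geq 1 vanishes, recovering white noise. A quick sketch (the function name is mine):

```python
import numpy as np

def fgn_autocov(h, H, sigma2=1.0):
    """Autocovariance of fractional Gaussian noise
    X_t = B_H(t) - B_H(t-1), derived from
    E|B_H(t) - B_H(s)|^2 = sigma^2 |t - s|^(2H):
        gamma(h) = (sigma^2/2)(|h+1|^2H - 2|h|^2H + |h-1|^2H)."""
    h = np.abs(np.asarray(h, dtype=float))
    return 0.5 * sigma2 * ((h + 1.0) ** (2 * H)
                           - 2.0 * h ** (2 * H)
                           + np.abs(h - 1.0) ** (2 * H))
```

For H > 1/2 the second difference of |h|^{2H} decays like h^{2H-2}, the hyperbolic rate on the next slide.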

f(\lambda) \sim |\lambda|^{\alpha - 1} (\lambda \to 0)
\rho(h) \sim h^{-\alpha} (h \to \infty)
\alpha = 1 - 2d = 2 - 2H
p.19/70

AR Model p.20/70

ARFIMA Model p.21/70

ARFIMA Model p.22/70

(Estimation; ARFIMA as the example)
X_n = (X_1, X_2, \ldots, X_n)' : Observations
\beta = (d, \phi_1, \ldots, \phi_p, \theta_1, \ldots, \theta_q), \sigma^2 : Parameters
1. (Exact Maximum Likelihood)
L(\beta, \sigma^2) = \frac{1}{(2\pi)^{n/2} |\Sigma_n(\beta, \sigma^2)|^{1/2}} \exp\left(-\tfrac{1}{2} X_n' \Sigma_n^{-1}(\beta, \sigma^2) X_n\right)
\Sigma_n(\beta, \sigma^2) : n \times n Covariance Matrix
(\hat\beta, \hat\sigma^2) = \arg\max_{\beta, \sigma^2} L(\beta, \sigma^2)
p.23/70

2. Whittle (Approximate) Likelihood
g(\lambda; \beta) = \frac{2\pi}{\sigma^2} f(\lambda; \beta, \sigma^2)
U_n(\beta, \sigma^2) = \frac{1}{2} \log \sigma^2 + \frac{1}{2\sigma^2} \sum_{\lambda_j \in (-\pi, \pi)} \frac{2\pi I_n(\lambda_j)}{n\, g(\lambda_j; \beta)}
I_n(\lambda) = \frac{|\sum_{t=1}^{n} X_t e^{-it\lambda}|^2}{2\pi n} : Periodogram
\lambda_j = 2\pi j / n : Fourier Frequency
p.24/70
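The periodogram I_n(\lambda_j) at the Fourier frequencies is the raw ingredient of the Whittle likelihood and of all the semiparametric estimators below, and it is conveniently computed with an FFT. A sketch for a real-valued series (the function name is mine):

```python
import numpy as np

def periodogram(x):
    """I_n(lambda_j) = |sum_{t=1}^n x_t e^{-i t lambda_j}|^2 / (2 pi n)
    at the Fourier frequencies lambda_j = 2 pi j / n, j = 1..[n/2]."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    j = np.arange(1, n // 2 + 1)
    lam = 2.0 * np.pi * j / n
    # np.fft.fft computes sum_t x_t e^{-2 pi i j t / n}, which matches
    # the definition up to an overall phase removed by |.|^2.
    I = np.abs(np.fft.fft(x)[j]) ** 2 / (2.0 * np.pi * n)
    return lam, I
```

As a sanity check, a cosine at a Fourier frequency concentrates all its periodogram mass at that frequency.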

\frac{1}{2} X_n' \Sigma_n^{-1}(\beta, \sigma^2) X_n \approx \frac{1}{2\sigma^2} \sum_{\lambda_j \in (-\pi,\pi)} \frac{2\pi I_n(\lambda_j)}{n\, g(\lambda_j; \beta)}
\frac{1}{2n} \log |\Sigma_n(\beta, \sigma^2)| \approx \frac{1}{2} \log \sigma^2
Whittle Estimator: (\tilde\beta, \tilde\sigma^2) = \arg\min_{\beta, \sigma^2} U_n(\beta, \sigma^2)
p.25/70

(i) n \to \infty : consistency
(ii) \sqrt{n}(\hat\beta - \beta, \hat\sigma^2 - \sigma^2) : asymptotic normality
The Whittle estimator shares the same asymptotic properties.
p.26/70

(Parametric vs Semiparametric)
(1) Parametric: Efficient!
(2) Semiparametric: Robust, but not Efficient
p.27/70

(Semiparametric setting)
(1) f(\lambda) = C \lambda^{-2d} + o(\lambda^{-2d}), \lambda \to 0, d < 1/2, C : Positive Constant
(i) Near \lambda = 0, work with f(\lambda) = C \lambda^{-2d} only.
(ii) Estimate f(\lambda_j) by I_j (= I(\lambda_j)), 0 < j \leq m.
(iii) Asymptotics: n \to \infty, m/n \to 0 (only the I_j near \lambda = 0 are used).
p.28/70

(i) Averaged periodogram:
F(\lambda) = \int_0^{\lambda} f(\omega)\, d\omega \sim \frac{C}{1 - 2d} \lambda^{1-2d}, \lambda \to 0
\log(F(q\lambda)/F(\lambda)) \sim (1 - 2d) \log q, q (> 0)
\Rightarrow d = \frac{1}{2} - \frac{\log(F(q\lambda)/F(\lambda))}{2 \log q}
p.29/70

\hat{d}_{AVE} = \frac{1}{2} - \frac{\log(\hat{F}(q\lambda_m)/\hat{F}(\lambda_m))}{2 \log q}
\hat{F}(\lambda) = \frac{2\pi}{n} \sum_{j=1}^{[n\lambda/2\pi]} I_j
p.30/70
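The averaged-periodogram estimate is \hat{d}_{AVE} = 1/2 - \log(\hat{F}(q\lambda_m)/\hat{F}(\lambda_m))/(2\log q) with \hat{F}(\lambda_k) = (2\pi/n)\sum_{j \leq k} I_j. A sketch taking the periodogram ordinates directly as input (the default q = 2, chosen so that q\lambda_m is again a Fourier frequency, is my arbitrary choice):

```python
import numpy as np

def d_ave(I, m, n, q=2):
    """d_AVE = 1/2 - log(F_hat(q lam_m) / F_hat(lam_m)) / (2 log q),
    where F_hat(lam_k) = (2 pi / n) * sum_{j=1}^k I_j and the array I
    holds the periodogram ordinates I_1, I_2, ... (needs len(I) >= q*m)."""
    I = np.asarray(I, dtype=float)
    F_m = (2.0 * np.pi / n) * I[:m].sum()
    F_qm = (2.0 * np.pi / n) * I[:q * m].sum()
    return 0.5 - np.log(F_qm / F_m) / (2.0 * np.log(q))
```

For a flat (white-noise) spectrum \hat{F} grows linearly, the ratio equals q, and the estimate is exactly 0.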

(ii) Log-periodogram:
\log I_j = \log f(\lambda_j) + \log \frac{I_j}{f(\lambda_j)} \approx (\log C - C_E) - 2d \log \lambda_j + U_j, \lambda_j = \frac{2\pi j}{n}
U_j = \log \frac{I_j}{f(\lambda_j)} + C_E : Error Terms
C_E = 0.577216\ldots (Euler's Constant)
p.31/70

\hat{d}_{LOG} = -\frac{1}{2} \frac{\sum_{j=l}^{m} \left(\log j - \frac{1}{m-l+1}\sum_{i=l}^{m} \log i\right) \log I_j}{\sum_{j=l}^{m} \left(\log j - \frac{1}{m-l+1}\sum_{i=l}^{m} \log i\right)^2}
l : Trimming Number (< m), discarding the lowest frequencies
p.32/70
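\hat{d}_{LOG} is just ordinary least squares of \log I_j on the centered \log j, with the slope scaled by -1/2. A sketch taking the ordinates I_1, \ldots, I_m as input (the function name is mine):

```python
import numpy as np

def d_log(I, l=1):
    """Log-periodogram regression estimate of d: -(1/2) times the OLS
    slope of log I_j on log j over j = l, ..., m, where the array I
    holds I_1, ..., I_m."""
    I = np.asarray(I, dtype=float)
    m = len(I)
    j = np.arange(1, m + 1)
    logj = np.log(j[l - 1:])
    y = np.log(I[l - 1:])
    xc = logj - logj.mean()          # centered regressor
    return -0.5 * (xc @ (y - y.mean())) / (xc @ xc)
```

On an exact power law I_j = C j^{-2d} the regression recovers d exactly, with or without trimming.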

(iii) Local Whittle (Gaussian semiparametric):
R(d) = \log\left(\frac{1}{m}\sum_{j=1}^{m} \lambda_j^{2d} I_j\right) - \frac{2d}{m}\sum_{j=1}^{m} \log \lambda_j \qquad (1)
\hat{d}_{GAU} = \arg\min_d R(d)
p.33/70
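R(d) is a smooth one-dimensional function of d, so for a sketch a grid search over d \in (-1/2, 1/2) suffices (the grid bounds and spacing are my choices; a real implementation would use a proper scalar optimizer):

```python
import numpy as np

def d_gau(lam, I, grid=None):
    """Local Whittle estimate: minimize
    R(d) = log((1/m) sum_j lam_j^{2d} I_j) - (2d/m) sum_j log lam_j
    over a grid of d values, given frequencies lam and ordinates I."""
    lam = np.asarray(lam, dtype=float)
    I = np.asarray(I, dtype=float)
    if grid is None:
        grid = np.linspace(-0.49, 0.49, 981)
    R = np.array([np.log(np.mean(lam ** (2 * d) * I))
                  - 2 * d * np.mean(np.log(lam)) for d in grid])
    return grid[int(np.argmin(R))]
```

On an exact local power law I_j = \lambda_j^{-2d_0} the objective R(d) - R(d_0) is a convex, nonnegative function of d (by Jensen's inequality), so the grid minimum sits at the grid point nearest d_0.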

(iv) Weighted estimator:
d = h_w^{-1}\left(\frac{1}{\bar\lambda}\int_0^{\bar\lambda} w(\lambda/\bar\lambda) \log f(\lambda)\, d\lambda - \log f(\bar\lambda) \cdot \frac{1}{\bar\lambda}\int_0^{\bar\lambda} w(\lambda/\bar\lambda)\, d\lambda\right)
h_w = -2\int_0^1 w(u) \log u\, du
w(u) (0 \leq u \leq 1) : Positive Weight Function
p.34/70

\tilde{d}_h = h_w^{-1}\left(\frac{1}{k}\sum_{p=1}^{k} w_p \log \hat{f}_p - \log \hat{f}_{k+1} \cdot \frac{1}{k}\sum_{p=1}^{k} w_p\right), w_p = w(p/k), taking \bar\lambda = \lambda_k
\hat{f}_p = \hat{f}(\lambda_p) = \begin{cases} \frac{1}{m+1}\sum_{j=-m/2}^{m/2} I_{j+p} & \text{if } m < 2p \\ \frac{2}{m}\sum_{j=1}^{m/2} I_{j+p} & \text{if } 0 < 2p \leq m \end{cases}
p.35/70

\hat{d}_{WIN} = \bar{v}^{-1} \frac{1}{m}\sum_{p=1}^{m} v_p \tilde{d}_p
\tilde{d}_p = h_w^{-1}\left(\frac{1}{p}\sum_{l=1}^{p} w_l \log \hat{f}_l - \log \hat{f}_{p+1} \cdot \frac{1}{p}\sum_{l=1}^{p} w_l\right)
w_l = w(l/p), v_p = v(p/m), \bar{v} = \frac{1}{m}\sum_{p=1}^{m} v_p
v(u) (0 \leq u \leq 1) : Positive Weight Function
p.36/70

(Simulation) Hidalgo and Yajima (2002, Ann. Inst. Statist. Math.)
Table 1: Bias of the Estimators (d = 0.4)

Sample Size   n = 128   n = 256
Bandwidth     m = 16    m = 32
AVE           -.125     -.100
LOG            .019      .017
GAU           -.030     -.007
WIN           -.084     -.029
p.37/70

(Simulation) Hidalgo and Yajima (2002, Ann. Inst. Statist. Math.)
Table 2: Standard Deviation of the Estimators (d = 0.4)

Sample Size   n = 128   n = 256
Bandwidth     m = 16    m = 32
AVE           .109      .065
LOG           .216      .137
GAU           .138      .094
WIN           .112      .084
p.38/70

(Simulation) Hidalgo and Yajima (2002, Ann. Inst. Statist. Math.)
Table 3: MSE of the Estimators (d = 0.4)

Sample Size   n = 128   n = 256
Bandwidth     m = 16    m = 32
AVE           .028      .014
LOG           .047      .019
GAU           .020      .009
WIN           .020      .008
p.39/70

(Summary) Hidalgo and Yajima (2002, Ann. Inst. Statist. Math.)
Bias: \hat{d}_{LOG} is smallest; MSE: \hat{d}_{GAU} and \hat{d}_{WIN} do best.
p.40/70

(Nonparametric estimation of f^*)
f(\lambda) = |1 - \exp(-i\lambda)|^{-2d} f^*(\lambda)
f^*(\lambda) : Positive Continuous Function
\log f(\lambda) = (-2d) \log|1 - \exp(-i\lambda)| + \log f^*(\lambda)
\log f^*(\lambda) = \sum_{j=0}^{\infty} \beta_j h_j(\lambda) : Fourier Expansion
h_0(\lambda) = 1/\sqrt{2\pi}, h_j(\lambda) = \cos(j\lambda)/\sqrt{\pi} (j = 1, 2, \ldots)
p.41/70

(Estimation of the Fourier coefficients, with the number of terms growing with n)
p.42/70

(Test for d = 0; long memory is suspected)
H_0 : d = 0 vs H_1 : d = 1
H_0 : d = 0 vs H_1 : d \neq 0
LM = m \frac{\left(\partial R(d)/\partial d \,\big|_{d=0}\right)^2}{\partial^2 R(d)/\partial d^2 \,\big|_{d=0}}, R(d) : the (local) Whittle objective
LM \to \chi^2_1 in distribution.
p.43/70

(Lobato and Robinson) Exchange rates: British Pound (BP), Deutsche Mark (DM), Japanese Yen (JY) vs the US Dollar (1974.1-1985.6)
Table 4: Unit Root Test (* : 5% Significant), Log Difference Data

m     BP       DM      JY
20    2.039    0.164   0.330
40    3.283    0.662   0.042
60    4.372*   0.036   0.029
80    9.987*   0.201   2.833
100   8.185*   1.719   2.429
p.44/70

Table 5: Unit Root Test (* : 5% Significant), Squared Log Difference Data

m     BP        DM         JY
20    3.332     3.135
40    12.830*   10.005*    10.710*
60    19.029*   32.333*    26.771*
80    24.045*   62.196*    24.153*
100   32.199*   100.356*   13.852*
p.45/70

2. Prediction
p.46/70

(Prediction)
Wiener-Kolmogorov prediction
ARFIMA \to ARMA approximation; model selection, ex. AIC, BIC, etc.; ex. AR vs
p.47/70

Wiener-Kolmogorov Prediction: predict X_{n+h} (0 < h) from X_n, X_{n-1}, \ldots
MA(\infty): X_t = \sum_{j=0}^{\infty} \psi_j U_{t-j}
AR(\infty): \sum_{j=0}^{\infty} \pi_j X_{t-j} = U_t
p.48/70

\hat{X}_n(h) = -\sum_{j=1}^{h-1} \pi_j \hat{X}_n(h-j) - \sum_{j=h}^{\infty} \pi_j X_{n+h-j} : Recursive Formula
E[(X_{n+h} - \hat{X}_n(h))^2] = \sigma^2 \sum_{j=0}^{h-1} \psi_j^2
p.49/70

Prediction of X_{n+h} (0 < h) from the finite past X_n, X_{n-1}, \ldots, X_1
Wiener-Kolmogorov (finite past)
(1) Truncation (set X_t = 0, t < 0):
X^*_n(h) = -\sum_{j=1}^{h-1} \pi_j \hat{X}_n(h-j) - \sum_{j=h}^{n+h-1} \pi_j X_{n+h-j}
p.50/70

Wiener-Kolmogorov (finite past)
(2) Exact finite-past predictor:
\hat{X}_n(h) = \sum_{j=1}^{n} \pi_j(h) X_{n+1-j}
\Pi_n P = p_n, \Pi_n = [\rho(i-j)] : n \times n Autocorrelation Matrix
P = (\pi_1(h), \ldots, \pi_n(h))', p_n = (\rho(h), \ldots, \rho(n+h-1))'
p.51/70
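The exact finite-past coefficients come from a single linear solve of \Pi_n P = p_n. A sketch (the function name is mine; `rho` is any function returning the autocorrelation at an integer lag):

```python
import numpy as np

def finite_past_coeffs(rho, h, n):
    """Solve Pi_n P = p_n for P = (pi_1(h), ..., pi_n(h))', where
    Pi_n = [rho(i - j)] (n x n) and p_n = (rho(h), ..., rho(n+h-1))'.
    The h-step predictor is X_hat_n(h) = sum_j pi_j(h) X_{n+1-j}."""
    idx = np.arange(n)
    Pi = np.array([[rho(abs(i - k)) for k in idx] for i in idx])
    p = np.array([rho(h + k) for k in range(n)])
    return np.linalg.solve(Pi, p)
```

For an AR(1) with \rho(k) = \phi^{|k|} the solution collapses to the familiar predictor \hat{X}_n(h) = \phi^h X_n, so only the first coefficient is nonzero.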

(Computing the coefficients; ARFIMA)
Type 1 (d ):
(a) Fit an AR (MA) model; for each h, compute the \pi_j^{(h)}.
(b) Plug-in: AR (MA); h = 1 (Innovation Algorithm), then h = 2, 3, \ldots, compute the \pi_j^{(h)}.
Type 2 (d ):
(a) Fit an ARFIMA model; for each h, compute the \pi_j^{(h)}.
(b) Plug-in: ARFIMA; h = 1, then h = 2, 3, \ldots, compute the \pi_j^{(h)}.
p.52/70

(Comparison of predictors) (i) N. Crato and B. K. Ray (1996)
MSE(h_1, h_2) = \sum_{i=h_1}^{h_2} \left(\hat{X}^{ARMA}_n(i) - X_{n+i}\right)^2 \Big/ \sum_{i=h_1}^{h_2} \left(\hat{X}^{ARFIMA}_n(i) - X_{n+i}\right)^2
True model: ARFIMA(p, d, 0); compare \hat{X}^{ARMA}_n(i) with \hat{X}^{ARFIMA}_n(i).
p.53/70

(2) J. Brodsky and C. M. Hurvich (1999)
\left(\hat{X}^{ARFIMA}_n(h) - X_{n+h}\right)^2 / \mathrm{MMSE} vs \left(\hat{X}^{ARMA}_n(h) - X_{n+h}\right)^2 / \mathrm{MMSE}
MMSE : minimum mean squared error under the true model (ARFIMA)
p.54/70

(Simulation) Table 6: Crato and Ray (p = 1, d = 0.3, \phi = 0.65; %)

Sample Size   Lead Time   AIC     AICc    SIC
n = 120       1-6         16.61   16.43   13.26
              13-24       21.07   21.07   20.27
              25-36       11.23   11.30   11.27
n = 360       1-6         6.70    6.43    6.52
              13-24       10.87   10.99   10.56
              25-36       12.88   12.93   13.36
p.55/70

(Simulation) Table 7: Brodsky and Hurvich (p = 0, d = 0.45, n = 100)

Lead Time   ARFIMA   ARMA(1,1)
2           0.98     1.03
4           1.04     1.07
10          1.04     1.10
15          0.97     1.09
20          1.02     1.10
40          1.09     1.25
p.56/70

(Summary) ARMA approximation vs estimating d directly (ARFIMA)
p.57/70

(Extension) A common d across the components of a vector process X_t
p.58/70

X_t = (X_{1t}, X_{2t}, \ldots, X_{pt})' (p-vector)
X_{it} : I(d), Integrated process of order d (i = 1, \ldots, p)
\exists \beta = (\beta_1, \ldots, \beta_p)' (\neq 0) such that \beta' X_t : I(d - b) (b > 0)
(d = b gives cointegration in the usual sense)
\beta : cointegrating vector
p.59/70
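For the classical case d = b = 1 the definition is easy to illustrate: two I(1) series driven by a common random-walk trend have a linear combination \beta' X_t that is I(0). A simulated sketch, not from the talk (the coefficients and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
trend = np.cumsum(rng.standard_normal(n))    # common I(1) component
x1 = trend + rng.standard_normal(n)          # I(1)
x2 = 0.5 * trend + rng.standard_normal(n)    # I(1)

# beta = (1, -2)' eliminates the common trend, so beta' X_t is I(0):
resid = x1 - 2.0 * x2
```

Here resid has bounded variance while x1 and x2 wander: in the notation above, X_t = (x1_t, x2_t)' is I(1) and \beta' X_t is I(0), i.e. d = b = 1.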

(Example: Purchasing Power Parity) \nu_t = E_t P_{0t} / P_{1t}
E_t : Exchange Rate, P_{it} (i = 0, 1) : Price Indices (domestic / foreign)
\log \nu_t = \log E_t + \log P_{0t} - \log P_{1t}
p = 3, X_t = (\log E_t, \log P_{0t}, \log P_{1t})', \beta = (1, 1, -1)', d = b = 1?
p.60/70

1st Step: estimate d_i for X_{it} (i = 1, \ldots, p).
2nd Step: using the common d, estimate G and examine \mathrm{rank}(G).
Normalize the spectral density matrix so that f(0) = G; G : p \times p; cointegration \Leftrightarrow \mathrm{rank}(G) < p.
p.61/70

p.62/70

(Application) Crude oil prices: WTI, Brent, Dubai (1986.1-1998.1)
1st Step (\hat{d}_{GAU}): \hat{d}^W_{GAU} = 0.47, \hat{d}^B_{GAU} = 0.31, \hat{d}^D_{GAU} = 0.31
(i) d_B = d_D : not rejected at the 5% level
(ii) d_W = d_B = d_D : not rejected at the 1% level
2nd Step:
(i) Under d_B = d_D, G (2 \times 2) has eigenvalues 1.51 \times 10^{-2}, 0.03 \times 10^{-2} \Rightarrow \mathrm{rank} = 2 - 1 = 1?
(ii) Under d_W = d_B = d_D, G (3 \times 3) has eigenvalues 1.81 \times 10^{-2}, 0.03 \times 10^{-2}, 0.01 \times 10^{-2} \Rightarrow \mathrm{rank} = 3 - 1 = 2?
p.63/70

p.64/70

(Random fields) \{X_t \mid t = (t_1, \ldots, t_d)\}
\gamma(h) = E(X_t X_{t+h}) = \int_{(-\pi,\pi]^d} \exp(ih'\lambda) f(\lambda)\, d\lambda
h = (h_1, \ldots, h_d)', h'\lambda = \sum_{i=1}^{d} h_i \lambda_i
f(\lambda) : Spectral Density Function
p.65/70

Long Memory Random Fields (d = 2)
(i) Separable Long Memory Random Field: f(\lambda) \sim |\lambda_1|^{-d_1} |\lambda_2|^{-d_2} (0 < d_1, d_2 < 1)
(ii) Isotropic Long Memory Random Field: f(\lambda) \sim (\lambda_1^2 + \lambda_2^2)^{-d} (0 < d < 2)
p.66/70

p.67/70

Figure 8: Parametric Estimation
p.68/70

Figure 9: Nonparametric Estimation
p.69/70

Announcement: ( (A)) November 29 ( ) to December 1 ( ) (JR )
p.70/70