Overview: (1) time series analysis through the 1970s (cf. Box and Jenkins (1970)); (2) long-memory analysis since the 1980s.
Early references: Kolmogorov (1940, 1941); Hurst (1951) (the Hurst effect); Granger (1966).
Example: the Beveridge Wheat Price Index.
Example: Nile River water levels.
A process {X_t | t = 0, ±1, ±2, ...} is (weakly) stationary if
(i) E(X_t) = μ (we assume μ = 0),
(ii) Cov(X_t, X_s) depends only on t − s.
γ(h) = Cov(X_{t+h}, X_t): autocovariance function
ρ(h) = γ(h)/γ(0): autocorrelation function
Spectral representation:
γ(h) = ∫_{−π}^{π} e^{ihλ} dF(λ).
If F is absolutely continuous,
γ(h) = ∫_{−π}^{π} e^{ihλ} f(λ) dλ,
where f(λ) is the spectral density function.
Short memory vs long memory:
Σ_{h=0}^{∞} |ρ(h)| < ∞ (short memory) vs Σ_{h=0}^{∞} |ρ(h)| = ∞ (long memory).
ARMA(p, q) model:
X_t − φ_1 X_{t−1} − ... − φ_p X_{t−p} = U_t − θ_1 U_{t−1} − ... − θ_q U_{t−q},
{U_t | t = 0, ±1, ±2, ...}: white noise, Var(U_t) = σ², Cov(U_t, U_s) = 0 (t ≠ s).
B X_t = X_{t−1}: backward shift operator,
φ(B) = 1 − φ_1 B − ... − φ_p B^p, θ(B) = 1 − θ_1 B − ... − θ_q B^q.
ARMA: φ(B) X_t = θ(B) U_t.
ARMA autocorrelation function:
|ρ(h)| ≤ k a^{|h|} (0 < a < 1) for all h: exponential decay.
Spectral density function:
f(λ) = (σ²/2π) |θ(e^{iλ})|² / |φ(e^{iλ})|²: an analytic function.
ARIMA(p, d, q) model:
φ(B) ∇^d X_t = θ(B) U_t,
∇X_t = (1 − B) X_t = X_t − X_{t−1}: difference operator,
∇^d = (1 − B)^d = Σ_{j=0}^{d} (d choose j) (−B)^j (d = 1, 2, 3, ...),
{∇^d X_t}: stationary ARMA(p, q) model.
ARFIMA(p, d, q) model:
φ(B) ∇^d X_t = θ(B) U_t, d: any real number,
∇^d = (1 − B)^d = Σ_{j=0}^{∞} (d choose j) (−B)^j: fractional difference operator,
(d choose j) = d(d − 1)⋯(d − j + 1) / j! = Γ(d + 1) / (Γ(j + 1) Γ(d − j + 1)).
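As an illustration, the coefficients c_j of (1 − B)^d = Σ_j c_j B^j satisfy the recursion c_0 = 1, c_j = c_{j−1}(j − 1 − d)/j, which follows from the binomial formula above. A minimal sketch in NumPy (function names are my own):

```python
import numpy as np

def fracdiff_coeffs(d, n):
    """First n coefficients c_j of (1 - B)^d = sum_j c_j B^j.

    Uses the recursion c_0 = 1, c_j = c_{j-1} * (j - 1 - d) / j,
    which is equivalent to c_j = (-1)^j * binom(d, j)."""
    c = np.empty(n)
    c[0] = 1.0
    for j in range(1, n):
        c[j] = c[j - 1] * (j - 1 - d) / j
    return c

def fracdiff(x, d):
    """Truncated fractional difference (1 - B)^d applied to a series x."""
    x = np.asarray(x, dtype=float)
    c = fracdiff_coeffs(d, len(x))
    # sum_{j=0}^{t} c_j X_{t-j}, truncated at the start of the sample
    return np.array([np.dot(c[:t + 1], x[t::-1]) for t in range(len(x))])
```

For integer d the recursion terminates (c_j = 0 for j > d), recovering the ordinary difference operator.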
−1/2 < d < 1/2 ⇒ ARFIMA(p, d, q) is a stationary process.
Spectral density function:
f(λ) = (σ²/2π) |θ(e^{iλ}) (1 − e^{iλ})^{−d}|² / |φ(e^{iλ})|²,
f(λ) → ∞ as λ → 0 (0 < d < 1/2).
Fractional Brownian motion {B_H(t) | 0 ≤ t < ∞}:
E(B_H(t)) = 0 for all t,
E(|B_H(t) − B_H(s)|²) = σ² |t − s|^{2H} (0 < H < 1),
H = 1/2 ⇒ Brownian motion.
Fractional Gaussian noise {X_t | t = 1, 2, ...}:
X_t = B_H(t) − B_H(t − 1).
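From the variance formula above, expanding E(X_{t+h} X_t) gives the fractional Gaussian noise autocovariance γ(h) = (σ²/2)(|h+1|^{2H} − 2|h|^{2H} + |h−1|^{2H}). A small sketch (function name my own); for H = 1/2 it reduces to white noise:

```python
import numpy as np

def fgn_autocov(h, H, sigma2=1.0):
    """Autocovariance of fractional Gaussian noise X_t = B_H(t) - B_H(t-1):
    gamma(h) = (sigma2 / 2) * (|h+1|^{2H} - 2|h|^{2H} + |h-1|^{2H})."""
    h = np.abs(np.asarray(h, dtype=float))
    return 0.5 * sigma2 * ((h + 1) ** (2 * H)
                           - 2 * h ** (2 * H)
                           + np.abs(h - 1) ** (2 * H))
```

For H > 1/2 the covariances are positive and decay like h^{2H−2}, so they are not summable: the long-memory case.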
Common asymptotic behavior:
f(λ) ∼ |λ|^{α−1} (λ → 0), ρ(h) ∼ h^{−α} (h → ∞), with α = 1 − 2d = 2 − 2H.
Figures: sample behavior of an AR model and of ARFIMA models.
Estimation (e.g. of an ARFIMA model). 1. (Exact) Gaussian maximum likelihood:
X_n = (X_1, X_2, ..., X_n)′: observations,
β = (d, φ_1, ..., φ_p, θ_1, ..., θ_q), σ²: parameters,
L(β, σ²) = (2π)^{−n/2} |Σ_n(β, σ²)|^{−1/2} exp(−(1/2) X_n′ Σ_n(β, σ²)^{−1} X_n),
Σ_n(β, σ²): n × n covariance matrix,
(β̂, σ̂²) = argmax_{β,σ²} L(β, σ²).
2. Whittle (approximate) likelihood. Write f(λ; β, σ²) = (σ²/2π) g(λ; β). Then
U_n(β, σ²) = (1/2) log σ² + (1/(2σ²)) Σ_{λ_j ∈ (−π,π)} 2π I_n(λ_j) / (n g(λ_j; β)),
I_n(λ) = |Σ_{t=1}^{n} X_t e^{itλ}|² / (2πn): periodogram,
λ_j = 2πj/n: Fourier frequency.
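The periodogram at all Fourier frequencies can be computed with a single FFT. A sketch (helper name my own); it satisfies the Parseval identity (2π/n) Σ_j I_n(λ_j) = n^{−1} Σ_t X_t²:

```python
import numpy as np

def periodogram(x):
    """Periodogram I_n(lambda_j) = |sum_t X_t e^{i t lambda_j}|^2 / (2 pi n)
    at the Fourier frequencies lambda_j = 2 pi j / n, j = 0, ..., n-1.
    The sign of the exponent does not affect the modulus, so np.fft.fft works."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    lam = 2.0 * np.pi * np.arange(n) / n
    I = np.abs(np.fft.fft(x)) ** 2 / (2.0 * np.pi * n)
    return lam, I
```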
Approximations behind U_n:
(1/2) X_n′ Σ_n(β, σ²)^{−1} X_n ≈ (1/(2σ²)) Σ_{λ_j ∈ (−π,π)} 2π I_n(λ_j) / (n g(λ_j; β)),
(1/2) log |Σ_n(β, σ²)| / n ≈ (1/2) log σ².
Whittle estimator: (β̃, σ̃²) = argmin_{β,σ²} U_n(β, σ²).
Asymptotic properties: (i) as n → ∞, √n(β̂ − β, σ̂² − σ²) is asymptotically normal; (ii) the maximum likelihood and Whittle estimators are asymptotically equivalent.
Parametric vs semiparametric approaches: (1) if the model is correctly specified, the parametric estimator is efficient; (2) the semiparametric approach is robust to misspecification, at the cost of efficiency.
Semiparametric model (1):
f(λ) = C λ^{−2d} + o(λ^{−2d}) as λ → 0, d < 1/2, C: positive constant.
(i) f(λ) = C λ^{−2d} near λ = 0.
(ii) f(λ_j) is estimated by I_j (= I(λ_j)), 0 < j ≤ m.
(iii) n → ∞ with m/n → 0 (only periodogram ordinates I_j near λ = 0 are used).
(i) Averaged periodogram. Let
F(λ) = ∫_0^λ f(ω) dω ∼ C λ^{1−2d} / (1 − 2d) (λ → 0).
Then for q > 0,
log(F(qλ) / F(λ)) = (1 − 2d) log q,
so that
d = 1/2 − log(F(qλ)/F(λ)) / (2 log q).
Averaged periodogram estimator:
F̂(λ) = (2π/n) Σ_{j=1}^{[nλ/2π]} I_j,
d̂_AVE = 1/2 − log(F̂(qλ_m) / F̂(λ_m)) / (2 log q).
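Given the periodogram ordinates, d̂_AVE is a two-line computation; a sketch under the notation above (function name my own). Note that the factor 2π/n cancels in the ratio, and if the periodogram is exactly flat (the d = 0 case), the estimator returns 0 exactly when qm is an integer:

```python
import numpy as np

def d_ave(I, m, q=2):
    """Averaged periodogram estimator.

    I : array with I[j-1] = I(lambda_j) for j = 1, 2, ...
    Fhat(lambda_k) is proportional to the cumulative sum of the first k
    ordinates, so the 2*pi/n factor cancels in the ratio."""
    F_m = I[:m].sum()            # proportional to Fhat(lambda_m)
    F_qm = I[:int(q * m)].sum()  # proportional to Fhat(q * lambda_m)
    return 0.5 - np.log(F_qm / F_m) / (2.0 * np.log(q))
```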
(ii) Log-periodogram regression. Since
log I_j = log f(λ_j) + log(I_j / f(λ_j)) ≈ log C − 2d log λ_j + log(I_j / f(λ_j)),
with λ_j = 2πj/n we obtain the regression form
log I_j = (log C − 2d log(2π/n) − γ_E) − 2d log j + U_j,
U_j = log(I_j / f(λ_j)) + γ_E (error terms), γ_E = 0.577216... (Euler's constant).
Log-periodogram estimator:
d̂_LOG = −(1/2) Σ_{j=l}^{m} (log j − (m−l+1)^{−1} Σ_{i=l}^{m} log i) log I_j / Σ_{j=l}^{m} (log j − (m−l+1)^{−1} Σ_{i=l}^{m} log i)²,
where l (< m) is a trimming number (the lowest frequencies are discarded).
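The estimator above is an ordinary least-squares slope; a sketch (function name my own). On an exact power law I_j = C j^{−2d} the regression recovers d without error, which gives a simple sanity check:

```python
import numpy as np

def d_log(I, m, l=1):
    """Log-periodogram estimator using ordinates j = l, ..., m.

    Regresses log I_j on centered log j and returns -slope / 2.
    I : array with I[j-1] = I(lambda_j)."""
    j = np.arange(l, m + 1)
    a = np.log(j)
    a = a - a.mean()            # centered regressor
    y = np.log(I[l - 1:m])
    return -0.5 * np.dot(a, y) / np.dot(a, a)
```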
(iii) Gaussian semiparametric (local Whittle) estimator:
d̂_GAU = argmin_d R(d),
R(d) = log((1/m) Σ_{j=1}^{m} λ_j^{2d} I_j) − (2d/m) Σ_{j=1}^{m} log λ_j.   (1)
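R(d) is a smooth function of a single parameter, so a simple grid search suffices for a sketch (names and the grid range are my own choices). On an exact power-law periodogram I_j = λ_j^{−2d₀} the criterion is minimized at d₀:

```python
import numpy as np

def d_gau(I, lam, m, grid=None):
    """Local Whittle estimator: minimizes
    R(d) = log(mean(lam_j^{2d} I_j)) - (2d/m) sum(log lam_j)
    over a grid of candidate d values."""
    if grid is None:
        grid = np.linspace(-0.49, 0.99, 1481)  # step 0.001
    lj, Ij = lam[:m], I[:m]
    mean_loglam = np.mean(np.log(lj))
    R = np.array([np.log(np.mean(lj ** (2 * d) * Ij)) - 2 * d * mean_loglam
                  for d in grid])
    return grid[np.argmin(R)]
```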
(iv) Weighted average of the log spectral density:
d = h_w (μ^{−1} ∫_0^μ w(μ^{−1}λ) log f(λ) dλ − log f(μ) · μ^{−1} ∫_0^μ w(μ^{−1}λ) dλ),
h_w = (−2 ∫_0^1 w(u) log u du)^{−1},
w(u) (0 ≤ u ≤ 1): positive weight function.
Sample version: with μ = λ_k and w_p = w(p/k),
d̃ = h_w ((1/k) Σ_{p=1}^{k} w_p log f̂_p − ((1/k) Σ_{p=1}^{k} w_p) log f̂_{k+1}),
f̂_p = f̂(λ_p) = (1/(m+1)) Σ_{j=−m/2}^{m/2} I_{j+p} if m < 2p,
f̂_p = (2/m) Σ_{j=1}^{m/2} I_{j+p} if 0 < 2p ≤ m
(a smoothed periodogram; the one-sided form is used near λ = 0).
d̂_WIN = v̄^{−1} (1/m) Σ_{p=1}^{m} v_p d̃_p,
d̃_p = h_w ((1/p) Σ_{l=1}^{p} w_l log f̂_l − ((1/p) Σ_{l=1}^{p} w_l) log f̂_{p+1}),
w_l = w(l/p), v_p = v(p/m), v̄ = m^{−1} Σ_{p=1}^{m} v_p,
v(u) (0 ≤ u ≤ 1): positive weight function.
Simulation results: Hidalgo and Yajima (2002, Ann. Inst. Statist. Math.)

Table 1: Bias of the estimators (d = 0.4)
Sample size   n = 128   n = 256
Bandwidth     m = 16    m = 32
AVE           -.125     -.100
LOG            .019      .017
GAU           -.030     -.007
WIN           -.084     -.029
Table 2: Standard deviation of the estimators (d = 0.4)
Sample size   n = 128   n = 256
Bandwidth     m = 16    m = 32
AVE           .109      .065
LOG           .216      .137
GAU           .138      .094
WIN           .112      .084
Table 3: MSE of the estimators (d = 0.4)
Sample size   n = 128   n = 256
Bandwidth     m = 16    m = 32
AVE           .028      .014
LOG           .047      .019
GAU           .020      .009
WIN           .020      .008
Summary: d̂_LOG has the smallest bias, while d̂_GAU and d̂_WIN attain the smallest MSE.
Nonparametric model:
f(λ) = |1 − exp(iλ)|^{−2d} f*(λ),
f*(λ): positive continuous function,
log f(λ) = (−2d) log |1 − exp(iλ)| + log f*(λ),
log f*(λ) = Σ_{j=0}^{∞} θ_j h_j(λ): Fourier expansion,
h_0(λ) = 1/√π, h_j(λ) = cos(jλ)/√π (j = 1, 2, ...).
In practice the Fourier expansion is truncated at a finite order, which is allowed to grow with the sample size n.
Testing d = 0 (LM test based on the Whittle-type objective R(d)):
H_0: d = 0 vs H_1: d > 0, or H_0: d = 0 vs H_1: d ≠ 0.
LM = m (∂R(d)/∂d |_{d=0})² / (∂²R(d)/∂d² |_{d=0}),
which is asymptotically χ²_1 under H_0.
Application (Lobato and Robinson): exchange rates of the British Pound (BP), Deutsche Mark (DM), and Japanese Yen (JY) vs the US dollar (1974.1-1985.6).

Table 4: Unit root test (*: significant at 5%), log-difference data
m     BP       DM      JY
20    2.039    0.164   0.330
40    3.283    0.662   0.042
60    4.372*   0.036   0.029
80    9.987*   0.201   2.833
100   8.185*   1.719   2.429
Table 5: Unit root test (*: significant at 5%), squared log-difference data
m     BP        DM         JY
20    3.332     3.135
40    12.830*   10.005*    10.710*
60    19.029*   32.333*    26.771*
80    24.045*   62.196*    24.153*
100   32.199*   100.356*   13.852*
Prediction: Wiener-Kolmogorov prediction theory; approximating an ARFIMA model by an ARMA model with model selection (e.g. AIC, BIC, etc.; e.g. AR vs ARFIMA).
Prediction of X_{n+h} (0 < h) given X_n, X_{n−1}, ... .
MA(∞): X_t = Σ_{j=0}^{∞} ψ_j U_{t−j},
AR(∞): Σ_{j=0}^{∞} π_j X_{t−j} = U_t.
Wiener-Kolmogorov predictor (recursive formula):
X̂_n(h) = −Σ_{j=1}^{h−1} π_j X̂_n(h−j) − Σ_{j=h}^{∞} π_j X_{n+h−j},
E[(X_{n+h} − X̂_n(h))²] = σ² Σ_{j=0}^{h−1} ψ_j².
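The recursion above is easy to code once the AR(∞) weights π_j are available (for an ARFIMA(0, d, 0) model they are the fractional-difference coefficients). A sketch (function name my own) that truncates the infinite sum at the observed sample; for an AR(1), where π = (1, −φ), it reproduces X̂_n(h) = φ^h X_n:

```python
import numpy as np

def wk_forecast(x, pi, h):
    """Recursive Wiener-Kolmogorov forecasts X̂_n(1), ..., X̂_n(h)
    from AR(inf) weights pi (with pi[0] = 1), truncating the
    infinite sum at the observed sample."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    past = x[::-1]  # past[j] = X_{n-j}
    fc = []
    for step in range(1, h + 1):
        s = 0.0
        # contribution of already-computed forecasts X̂_n(step - j)
        for j in range(1, step):
            if j < len(pi):
                s -= pi[j] * fc[step - j - 1]
        # contribution of the observations X_{n+step-j}, j >= step
        for j in range(step, min(len(pi), step + n)):
            s -= pi[j] * past[j - step]
        fc.append(s)
    return np.array(fc)
```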
Prediction of X_{n+h} (0 < h) given the finite past X_n, X_{n−1}, ..., X_1.
(1) Truncation (set X_t = 0 for t ≤ 0):
X*_n(h) = −Σ_{j=1}^{h−1} π_j X*_n(h−j) − Σ_{j=h}^{n+h−1} π_j X_{n+h−j}.
(2) Best linear predictor:
X̂_n(h) = Σ_{j=1}^{n} π_j(h) X_{n+1−j},
Γ_n Π = p_n,
Γ_n = [ρ(i − j)]: n × n autocorrelation matrix,
Π = (π_1(h), ..., π_n(h))′, p_n = (ρ(h), ..., ρ(n + h − 1))′.
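The system Γ_n Π = p_n is a standard linear solve; a sketch (function names my own). For an AR(1) autocorrelation ρ(h) = φ^h and h = 1, the solution puts all weight on the most recent observation, Π = (φ, 0, ..., 0)′:

```python
import numpy as np

def blp_weights(rho, n, h):
    """Weights pi_j(h) of the best linear predictor
    Xhat_n(h) = sum_j pi_j(h) X_{n+1-j}, from Gamma_n Pi = p_n.

    rho : array with rho[k] = rho(k) for k = 0, 1, ..., n + h - 1."""
    rho = np.asarray(rho, dtype=float)
    idx = np.arange(n)
    Gamma = rho[np.abs(idx[:, None] - idx[None, :])]  # n x n Toeplitz matrix
    p = rho[h:h + n]                                  # (rho(h), ..., rho(n+h-1))'
    return np.linalg.solve(Gamma, p)

def blp_forecast(x, rho, h):
    """h-step best linear predictor from the last n = len(x) observations."""
    pi = blp_weights(rho, len(x), h)
    return float(pi @ np.asarray(x, dtype=float)[::-1])  # most recent first
```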
Prediction of an ARFIMA process in practice:
Type 1 (d is not used): (a) fit an AR (or MA) model and compute the weights π_j^{(h)} for each lead time h; (b) plug-in: compute the one-step predictor (h = 1, e.g. by the innovations algorithm) and obtain h = 2, 3, ... recursively.
Type 2 (d is estimated): (a) fit an ARFIMA model and compute π_j^{(h)} for each h; (b) plug-in: h = 1, then h = 2, 3, ... recursively.
Comparison (i): N. Crato and B. K. Ray (1996). Data are generated from an ARFIMA(p, d, 0) model and the criterion is
MSE(h_1, h_2) = Σ_{i=h_1}^{h_2} (X̂^{ARMA}_n(i) − X_{n+i})² / Σ_{i=h_1}^{h_2} (X̂^{ARFIMA}_n(i) − X_{n+i})²,
where X̂^{ARMA}_n(i) (X̂^{ARFIMA}_n(i)) is the ARMA (ARFIMA) predictor.
Comparison (ii): J. Brodsky and C. M. Hurvich (1999). Compare
(X̂^{ARFIMA}_n(h) − X_{n+h})² / MMSE vs (X̂^{ARMA}_n(h) − X_{n+h})² / MMSE,
where MMSE is the minimum mean squared prediction error under the true (ARFIMA) model.
Table 6: Crato and Ray (p = 1, d = 0.3, φ = 0.65, %)
Sample size   Lead time   AIC     AICc    SIC
n = 120       1-6         16.61   16.43   13.26
              13-24       21.07   21.07   20.27
              25-36       11.23   11.30   11.27
n = 360       1-6         6.70    6.43    6.52
              13-24       10.87   10.99   10.56
              25-36       12.88   12.93   13.36
Table 7: Brodsky and Hurvich (p = 0, d = 0.45, n = 100)
Lead time   ARFIMA   ARMA(1,1)
2           0.98     1.03
4           1.04     1.07
10          1.04     1.10
15          0.97     1.09
20          1.02     1.10
40          1.09     1.25
Summary: an ARMA approximation can predict well at short lead times, but at long lead times the long-memory parameter d matters and the ARFIMA predictor is preferable.
Multivariate extension: a common memory parameter d and fractional cointegration (vector process X_t).
X_t = (X_{1t}, X_{2t}, ..., X_{pt})′ (p-dimensional),
X_{it} is I(d) (integrated process of order d) (i = 1, ..., p).
If there exists β = (β_1, ..., β_p)′ (≠ 0) such that β′X_t is I(d − b) (b > 0), then X_t is (fractionally) cointegrated, with cointegrating vector β. (The case d = b = 1 is the classical one.)
Example (purchasing power parity):
ν_t = log E_t + log P_{0t} − log P_{1t},
E_t: exchange rate, P_{it} (i = 0, 1): price indices (domestic/foreign).
With p = 3 and X_t = (log E_t, log P_{0t}, log P_{1t})′, is X_t cointegrated with β = (1, 1, −1)′ and d = b = 1?
Estimation procedure:
1st step: estimate d_i for each X_{it} (i = 1, ..., p) semiparametrically, and test equality of the d_i.
2nd step: given a common d, estimate the p × p matrix G in the local approximation f(λ) ∼ λ^{−2d} G (λ → 0); X_t is cointegrated if and only if rank(G) < p.
Application: crude oil prices, WTI, Brent, Dubai (1986.1-1998.1).
1st step (d̂_GAU): d̂_W = 0.47, d̂_B = 0.31, d̂_D = 0.31.
(i) d_B = d_D is not rejected at the 5% level.
(ii) d_W = d_B = d_D is not rejected at the 1% level.
2nd step:
(i) Under d_B = d_D, G is 2 × 2, with estimated eigenvalues 1.51 × 10⁻², 0.03 × 10⁻² → rank(G) = 2 − 1 = 1?
(ii) Under d_W = d_B = d_D, G is 3 × 3, with estimated eigenvalues 1.81 × 10⁻², 0.03 × 10⁻², 0.01 × 10⁻² → rank(G) = 3 − 1 = 2?
Random fields: {X_t | t = (t_1, ..., t_d)},
γ(h) = E(X_t X_{t+h}) = ∫_{(−π,π]^d} exp(i h′λ) f(λ) dλ,
h = (h_1, ..., h_d)′, h′λ = Σ_{i=1}^{d} h_i λ_i,
f(λ): spectral density function.
Long-memory random fields (d = 2):
(i) Separable long-memory random field: f(λ) ∼ |λ_1|^{−d_1} |λ_2|^{−d_2} (0 < d_1, d_2 < 1).
(ii) Isotropic long-memory random field: f(λ) ∼ (λ_1² + λ_2²)^{−d} (0 < d < 2).
Figure 8: Parametric estimation.
Figure 9: Nonparametric estimation.
Announcement: a research meeting (supported by a Grant-in-Aid (A)) will be held from November 29 to December 1 (venue near a JR station).