DEIM Forum 2009 E8-4

Forecasting Time-Series on Data Stream using Incremental Hidden Markov Models

Kei WAKABAYASHI and Takao MIURA
Dept. of Elect. & Elect. Engr., HOSEI University
3-7-2 KajinoCho, Koganei, Tokyo, 184-8584 Japan
E-mail: kei.wakabayashi.bq@gs-eng.hosei.ac.jp, miurat@k.hosei.ac.jp

Abstract  In this paper, we propose a sophisticated technique for time-series prediction. We regard each observation as the output of an underlying event, and we estimate future data based on the transitions of the events. We formalize an event as a state of a Hidden Markov Model (HMM) and an observation as an output from a state. However, the HMM approach is not easy to apply to data-stream prediction, because the Baum-Welch algorithm, the traditional unsupervised training algorithm for HMMs, scans all the historical data many times. Here we apply the incremental Baum-Welch algorithm, an on-line training method for HMMs, to data-stream prediction so that the model can adapt to new patterns. We show some experimental results to examine the validity of the method.

Key words  Forecasting, Data Stream, Hidden Markov Model, Incremental Learning

1. Introduction

Time-series forecasting has long been studied, and a wide variety of statistical approaches are surveyed in [4]. A classical family of methods is exponential smoothing, which forecasts the value at time t from an exponentially weighted combination of the observations up to time t − 1. Holt-Winters smoothing extends this with a trend component [3]. Let y_t be the observation at time t, ỹ_t the smoothed value, and F_t the estimated trend; ỹ_t and F_t are updated as

    ỹ_t = λ1 y_t + (1 − λ1)(ỹ_{t−1} + F_{t−1})
    F_t = λ2 (ỹ_t − ỹ_{t−1}) + (1 − λ2) F_{t−1}

where λ1 and λ2 are smoothing parameters ranging over 0.0 to 1.0.
For an h-step-ahead forecast at time t, Holt-Winters smoothing predicts ỹ_{t+h|t} = ỹ_t + h F_t. Such smoothing methods are simple and fast, but they capture only the level and trend of the series, not recurring patterns that arise from the hidden situations behind the observations.

Hassan and Nath [5] proposed forecasting stock markets with an HMM, regarding the market situation as a hidden state and training the model with the EM algorithm. However, their approach assumes a fixed training set, whereas in data-stream environments [6], [8] observations arrive continuously and the underlying patterns may change over time. For on-line HMM training, Stenger et al. [9] proposed the incremental Baum-Welch algorithm for background modeling: unlike the batch Baum-Welch algorithm, it updates the model parameters each time a new observation arrives, without re-scanning the history. In this paper we apply the incremental Baum-Welch algorithm to time-series forecasting on data streams, so that the model keeps adapting to new patterns at low cost.

2. Motivating Example

Fig. 1  An example: daily sales of four items (nutritional drinks, beer, turmeric drink, juice) at times t, t + 1, t + 2, ..., together with the hidden situations behind them: a busy period at nearby companies, a drinking party before a day off, hangovers, the day before a day off, a high-school students' holiday, and a change of clientele.

Consider a store whose daily sales of each item are observed, as in Fig. 1. The sales on each day are governed by a hidden situation: for example, during a busy period at nearby companies nutritional drinks sell well; at a drinking party on the day before a day off, beer sells well; and on the following day, turmeric drink sells well as a hangover remedy. The situations themselves cannot be observed, but they follow characteristic transitions, such as "the day before a day off" tending to be followed by "hangover". If we can estimate such situations and their transitions, we can forecast future sales from the estimated current situation. We therefore formalize a situation as a hidden state of an HMM and the daily observations as outputs of the states. Estimating hidden situations from observations is exactly what unsupervised HMM training does; however, the standard EM-based training procedure
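The exponential-smoothing recursions of Section 1 can be sketched as follows. This is a minimal illustration with our own function names and made-up data, not the paper's implementation; λ1 = λ2 = 0.5 is an arbitrary choice.

```python
# Double exponential smoothing (Holt-Winters without seasonality), Section 1.

def holt_update(y_t, level, trend, lam1=0.5, lam2=0.5):
    """One smoothing step: returns the updated (level, trend) pair,
    i.e. (y~_t, F_t) from (y~_{t-1}, F_{t-1}) and the new observation y_t."""
    new_level = lam1 * y_t + (1 - lam1) * (level + trend)
    new_trend = lam2 * (new_level - level) + (1 - lam2) * trend
    return new_level, new_trend

def holt_forecast(level, trend, h):
    """h-step-ahead forecast: y~_{t+h|t} = y~_t + h * F_t."""
    return level + h * trend

# Usage: smooth a short series and forecast one step ahead.
series = [10.0, 12.0, 13.0, 15.0]
level, trend = series[0], 0.0
for y in series[1:]:
    level, trend = holt_update(y, level, trend)
print(holt_forecast(level, trend, h=1))  # prints 15.40625
```

Each update touches only the previous (level, trend) pair, which is why smoothing methods are a natural baseline for streams.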
for HMMs, the Baum-Welch algorithm, must re-scan the entire history many times whenever the model is updated, which is impractical on a data stream. In the rest of this paper we therefore train the HMM with the incremental Baum-Welch algorithm.

3. Hidden Markov Models

3.1 Definition

A Hidden Markov Model (HMM) is a probabilistic model in which an unobservable state sequence generates the observed outputs [7]. An HMM consists of:

(1) a set of states Q = {q_1, ..., q_N};
(2) a set of outputs Σ = {o_1, ..., o_M};
(3) transition probabilities A = {a_ij}, where a_ij is the probability of a transition from state q_i to state q_j;
(4) output probabilities B = {b_i(o_t)}, where b_i(o_t) is the probability that state q_i emits o_t;
(5) initial probabilities π = {π_i}, where π_i is the probability that the initial state is q_i.

Since our observations are continuous, we model b_i(o_t) as a Gaussian distribution, adding:

(6) means µ_ik, the mean of the k-th attribute in state q_i;
(7) variances σ_ik, the variance of the k-th attribute in state q_i.

Writing o_tk for the k-th attribute of observation o_t, the output probability in state q_i is

    b_i(o_tk) = (1 / √(2π σ_ik)) exp( −(o_tk − µ_ik)² / (2 σ_ik) )

and for a D-dimensional observation o_t = (o_t1, o_t2, ..., o_tD), assuming the attributes independent,

    b_i(o_t1, o_t2, ..., o_tD) = Π_{k=1}^{D} b_i(o_tk)

The parameters A, µ, and σ are usually estimated by the Baum-Welch algorithm, an EM procedure that repeatedly scans the entire training sequence until convergence.

3.2 Incremental Baum-Welch Algorithm

Instead of the batch Baum-Welch algorithm, we use the incremental Baum-Welch algorithm [9], which updates the parameters every time a new observation arrives. Let γ_T(i) denote the probability of being in state q_i at time T, and γ_T(i, j) the probability of a transition from q_i at time T − 1 to q_j at time T. Given γ_{T−1}(i) and the running sums Σ_{t=1}^{T−1} γ_t(i), the new observation o_T yields

    γ_T(i, j) = γ_{T−1}(i) · a_ij b_j(o_T) / Σ_j a_ij b_j(o_T)
    γ_T(j) = Σ_i γ_T(i, j)

and the parameters are re-estimated incrementally [9]:

    a_ij = Σ_{t=1}^{T} γ_t(i, j) / Σ_{t=1}^{T} γ_t(i)
         = ( Σ_{t=1}^{T−1} γ_t(i) · a_ij + γ_T(i, j) ) / Σ_{t=1}^{T} γ_t(i)

    µ_i = Σ_{t=1}^{T} γ_t(i) o_t / Σ_{t=1}^{T} γ_t(i)
        = ( Σ_{t=1}^{T−1} γ_t(i) · µ_i + γ_T(i) o_T ) / Σ_{t=1}^{T} γ_t(i)

    σ_i = Σ_{t=1}^{T} γ_t(i) (o_t − µ_i)² / Σ_{t=1}^{T} γ_t(i)
        = ( Σ_{t=1}^{T−1} γ_t(i) · σ_i + γ_T(i) (o_T − µ_i)² ) / Σ_{t=1}^{T} γ_t(i)

In each right-hand side the previous estimate appears weighted by the old sum of occupancies, so only the running sums and the current estimates have to be kept; no pass over the history is needed.

3.3 Forecasting Using HMM

We forecast future observations with the trained HMM. Since the batch Baum-Welch algorithm is an EM procedure over the entire sequence, re-training the model at every new observation is too expensive on a stream, while the incremental algorithm updates the model in constant time per observation. We construct an HMM with N states and initialize the state probabilities uniformly so that they sum to 1.0.
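The incremental Baum-Welch update of Section 3.2 can be sketched as follows; this is a minimal NumPy sketch under our own naming, assuming diagonal-Gaussian emissions with σ treated as the per-attribute variance, and it is not the paper's code.

```python
import numpy as np

def gaussian_b(o, mu, var):
    # b_i(o) = prod_k N(o_k; mu_ik, sigma_ik), one value per state (Sec. 3.1).
    return np.prod(np.exp(-(o - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var),
                   axis=1)

def incremental_step(o, gamma_prev, gamma_sum, a, mu, var):
    """One incremental Baum-Welch update for a new observation o (shape (D,)).

    gamma_prev : gamma_{T-1}(i), shape (N,)
    gamma_sum  : running sums  sum_{t<=T-1} gamma_t(i), shape (N,)
    a, mu, var : current parameter estimates, shapes (N,N), (N,D), (N,D)
    """
    b = gaussian_b(o, mu, var)                 # b_j(o_T) for each state j
    w = a * b[None, :]
    w /= w.sum(axis=1, keepdims=True)          # a_ij b_j / sum_j a_ij b_j
    joint = gamma_prev[:, None] * w            # gamma_T(i, j)
    gamma = joint.sum(axis=0)                  # gamma_T(j) = sum_i gamma_T(i, j)
    new_sum = gamma_sum + gamma                # sums now run up to time T
    # Convex combinations of the old estimates and the new statistics (Sec. 3.2):
    a_new = (gamma_sum[:, None] * a + joint) / new_sum[:, None]
    mu_new = (gamma_sum[:, None] * mu + gamma[:, None] * o) / new_sum[:, None]
    var_new = (gamma_sum[:, None] * var
               + gamma[:, None] * (o - mu_new) ** 2) / new_sum[:, None]
    return gamma, new_sum, a_new, mu_new, var_new
```

One call per arriving observation keeps the model current; the cost per step depends only on N and D, never on the length of the history.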
In the incremental computation we maintain, for each state q_i, the occupancy γ_T(i) and the running sum Σ_{t=1}^{T} γ_t(i). To forecast the observation at time T + 1, we compute the predictive state probability

    γ_{T+1}(i) = Σ_{j=1}^{N} γ_T(j) a_ji

Each state q_i carries the mean vector µ_i = (µ_i1, µ_i2, ..., µ_iD), and the forecast for time T + 1 is the weighted sum Σ_{i=1}^{N} γ_{T+1}(i) µ_i. When the actual observation o_{T+1} arrives, the model is updated by the incremental Baum-Welch step, in which the occupancy is refined with the emission probability, γ_{T+1}(i) ∝ Σ_{j=1}^{N} γ_T(j) a_ji b_i(o_{T+1}).

4. Experiments

4.1 Setup

We use daily weather data over the five years 1996–2000 (1,828 days) with three attributes: wind speed (m/s), sunshine duration (hours), and precipitation (mm). Table 1 shows a sample of the data.

Table 1  Weather data (sample)
Date        Wind speed (m/s)  Sunshine (h)  Precipitation (mm)
1996-06-29  16.1              4.8           0.0
1996-06-30  11.0              1.4           14.0
1996-07-01  8.4               2.6           0.5
1996-07-02  7.6               0.8           0.0
1996-07-03  7.9               0.9           6.5
1996-07-04  10.5              0.1           0.0
1996-07-05  16.8              0.7           3.5

We compare an HMM trained by the batch Baum-Welch algorithm (HMM), an HMM trained by the incremental Baum-Welch algorithm (Inc HMM), and Holt-Winters exponential smoothing (Holt-Winters). The first 90 days are used for initial training. Prediction accuracy is measured by the mean square error (MSE); with p_t the prediction and x_t the actual value at time t,

    MSE = (1/T) Σ_{t=1}^{T} (p_t − x_t)²

4.2 Results

Table 2 shows the MSE for each attribute and its average over the three attributes.

Table 2  MSE for each attribute
              Wind speed  Sunshine  Precipitation  Average
HMM           13.29       18.79     196.82         76.30
Inc HMM       13.14       15.89     190.99         73.34
Holt-Winters  21.33       23.35     316.43         120.37

Tables 3, 4, and 5 show the average MSE under varied experimental conditions (10% to 50%).

Table 3  MSE
      HMM    Inc HMM  Holt-Winters
10%   70.79  69.10    118.37
20%   57.79  56.81    96.80
30%   60.39  59.52    98.79
40%   60.75  61.09    101.15
50%   65.18  64.50    106.12

Table 4  MSE
      HMM    Inc HMM  Holt-Winters
10%   49.51  47.91    79.54
20%   49.05  47.43    78.44
30%   51.51  51.62    83.00
40%   51.65  49.99    79.35
50%   52.79  45.46    73.39

In every setting the HMM-based methods give clearly smaller MSE than Holt-Winters, and the incremental HMM is comparable to, and in most settings better than, the batch-trained HMM.
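The one-step forecast of Section 3.3 and the MSE of Section 4.1 can be sketched as follows; a minimal NumPy illustration with our own function names, not the paper's code.

```python
import numpy as np

def predict_next(gamma, a, mu):
    # gamma_{T+1}(i) = sum_j gamma_T(j) a_ji, then the forecast is the
    # occupancy-weighted sum of the state mean vectors (Sec. 3.3).
    gamma_next = gamma @ a          # predictive occupancy, shape (N,)
    return gamma_next @ mu          # forecast vector, shape (D,)

def mse(pred, actual):
    # MSE = (1/T) sum_t (p_t - x_t)^2, computed per attribute (Sec. 4.1).
    pred, actual = np.asarray(pred, float), np.asarray(actual, float)
    return ((pred - actual) ** 2).mean(axis=0)
```

With two states whose means are (0, 0) and (2, 2) and an equal occupancy of 0.5 under an identity transition matrix, `predict_next` returns (1, 1), the midpoint of the state means.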
Table 5  MSE
      HMM    Inc HMM  Holt-Winters
10%   71.30  68.65    115.24
20%   76.30  73.34    120.37
30%   75.85  73.71    123.15
40%   61.01  60.39    102.09
50%   54.88  55.70    92.55

Table 6  Running time over 1,646 prediction steps
         Total (ms)  Per step (ms)
HMM      623.3       0.38
Inc HMM  892.1       0.54

For comparison, Holt-Winters smoothing takes about 0.16 ms per prediction step.

4.3 Discussion

In Tables 3, 4, and 5, Holt-Winters smoothing gives considerably larger MSE than both HMM-based methods under every condition, which suggests that modeling the hidden situations behind the observations pays off for this data. The incremental HMM attains accuracy comparable to, and in most settings better than, the batch-trained HMM, even though it never re-scans the historical data. As Table 6 shows, its per-step cost (0.54 ms) is small and, unlike batch re-training, independent of the length of the history, so the incremental Baum-Welch HMM is well suited to forecasting on data streams.

5. Conclusion

In this paper we proposed a time-series forecasting technique for data streams in which events are modeled as hidden states of an HMM and the model is trained on-line by the incremental Baum-Welch algorithm. Experiments on weather data showed that the approach is more accurate than Holt-Winters smoothing and that incremental training matches the accuracy of batch training at a small, constant cost per observation.

References

[1] P.R. Cavalin, R. Sabourin, C.Y. Suen and A.S. Britto Jr.: Evaluation of Incremental Learning Algorithms for an HMM-Based Handwritten Isolated Digits Recognizer, Proc. 11th International Conference on Frontiers in Handwriting Recognition (ICFHR 2008), Montreal, August 19–21, 2008.
[2] S. Duan and S. Babu: Processing Forecasting Queries, Proc. 2007 Intl. Conf. on Very Large Data Bases (VLDB), 2007.
[3] S. Gelper, R. Fried and C. Croux: Robust Forecasting with Exponential and Holt-Winters Smoothing, June 2007.
[4] J.G. de Gooijer and R.J. Hyndman: 25 Years of IIF Time Series Forecasting: A Selective Review, Tinbergen Institute Discussion Papers 05-068/4, 2005.
[5] Md. Rafiul Hassan and B. Nath: Stock Market Forecasting Using Hidden Markov Model: A New Approach, Proc. 5th International Conference on Intelligent Systems Design and Applications (ISDA), 2005.
[6] N. Jian and L. Gruenwald: Research Issues in Data Stream Association Rule Mining, SIGMOD Record 35-1, 2006, pp. 14–19.
[7] (In Japanese), 1999.
[8] S. Muthukrishnan: Data Streams: Algorithms and Applications, Proc. 14th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), 2003.
[9] B. Stenger, V. Ramesh and N. Paragios: Topology Free Hidden Markov Models: Application to Background Modeling, Proc. IEEE International Conference on Computer Vision (ICCV), 2001.
[10] K. Wakabayashi and T. Miura: Topics Identification Based on Event Sequence Using Co-occurrence Words, Proc. 13th Intl. Conf. on Applications of Natural Language to Information Systems (NLDB), 2008.