自然言語処理 (Journal of Natural Language Processing) 22, p. 289


Recurrent Neural Networks for Word Alignment

Akihiro Tamura, Taro Watanabe and Eiichiro Sumita

This paper proposes a novel word alignment model based on a recurrent neural network (RNN), in which an unlimited alignment history is represented by recurrently connected hidden layers. In addition, we perform unsupervised learning inspired by (Dyer, Clark, Lavie, and Smith 2011), which utilizes artificially generated negative samples. Our alignment model is directional, like the generative IBM models (Brown, Pietra, Pietra, and Mercer 1993). To overcome this limitation, we encourage agreement between the two directional models by introducing a penalty function, which ensures word embedding consistency across the two directional models during training. The RNN-based model outperforms both the feed-forward NN-based model (Yang, Liu, Li, Zhou, and Yu 2013) and IBM Model 4 on Japanese-English and French-English word alignment tasks, and achieves translation performance comparable to those baselines on Japanese-English and Chinese-English translation tasks.

National Institute of Information and Communications Technology / Google

Vol. 22 No. 4 December 2015

Key Words: Word Alignment, Recurrent Neural Network, Unsupervised Learning, Agreement Constraint

1 Introduction

Word alignment, which identifies word-level correspondences in a bilingual sentence pair, has traditionally been performed with the generative IBM Models 1-5 (Brown et al. 1993) and the HMM alignment model (Vogel, Ney, and Tillmann 1996), which can be trained without manually annotated data (Och and Ney 2003; Berg-Kirkpatrick, Bouchard-Côté, DeNero, and Klein 2010). Yang et al. proposed an alignment model based on a feed-forward neural network (FFNN), adapting the Context-Dependent Deep Neural Network for HMM (CD-DNN-HMM) (Dahl, Yu, Deng, and Acero 2012) to the HMM alignment model, and showed that it outperforms both IBM Model 4 and the HMM model (Yang et al. 2013). The FFNN-based model, however, conditions each alignment decision on only the immediately preceding one. This paper proposes an alignment model based on a recurrent neural network (RNN), whose recurrently connected hidden layer can represent an unlimited alignment history. RNNs have been reported to outperform FFNNs both in language modeling (Mikolov, Karafiát, Burget, Cernocký, and Khudanpur 2010; Mikolov and Zweig 2012; Sundermeyer, Oparin, Gauvain, Freiberg, Schlüter, and Ney 2013) and in translation modeling (Auli, Galley, Quirk, and Zweig 2013; Kalchbrenner and Blunsom 2013), which motivates applying an RNN to word alignment as well.

Supervised training of NN-based alignment models (Yang et al. 2013) requires gold alignments, which are costly to obtain. Following Dyer et al. (Dyer et al. 2011), we therefore also train the RNN-based model in an unsupervised manner, using artificially generated negative samples. In addition, the proposed model is directional, modeling f given e or e given f, and each direction makes different errors; encouraging agreement between the two directions is known to improve alignment quality (Matusov, Zens, and Ney 2004; Liang, Taskar, and Klein 2006; Graça, Ganchev, and Taskar 2008; Ganchev, Graça, and Taskar 2008). We introduce an agreement constraint into training: a penalty term that keeps the word embeddings of the two directional RNN models consistent with each other. In experiments, the RNN-based model outperforms both the FFNN-based model and IBM Model 4 in F1 on Japanese-English and French-English word alignment tasks, and on Japanese-English and Chinese-English translation tasks it yields gains of up to 0.74% in BLEU over the FFNN-based model. The remainder of this paper is organized as follows. Section 2 reviews prior alignment models, Section 3 describes the proposed RNN-based model, Section 4 presents its training procedures, Sections 5 and 6 report and analyze experiments comparing the NN-based models and IBM Model 4, and Section 7 concludes.

2 Prior Word Alignment Models

Word alignment has been addressed with generative models (Brown et al. 1993; Vogel et al. 1996; Och and Ney 2003), discriminative models (Taskar, Lacoste-Julien, and Klein 2005; Moore 2005; Blunsom and Cohn 2006), and the FFNN-based model (Yang et al. 2013). This section introduces the notation and reviews the models our work builds on.

2.1 Generative Alignment Models

Let f_1^J = f_1, ..., f_J be a source sentence and e_1^I = e_1, ..., e_I a target sentence. An alignment a_1^J = a_1, ..., a_J maps each source word f_j to a target word e_{a_j}; a_j = 0 means that f_j is aligned to the special null word (e_0). The generative models define the translation probability by marginalizing over alignments:

p(f_1^J | e_1^I) = Σ_{a_1^J} p(f_1^J, a_1^J | e_1^I). (1)

IBM Models 1 and 2 and the HMM model decompose p(f_1^J, a_1^J | e_1^I) into an alignment probability p_a and a lexical translation probability p_t:

p(f_1^J, a_1^J | e_1^I) = Π_{j=1}^J p_a(a_j | a_{j-1}, j) p_t(f_j | e_{a_j}). (2)

In the HMM model, p_a(a_j | a_{j-1}, j) depends only on the jump width a_j − a_{j-1}. IBM Models 3-5 additionally model fertility and distortion. All of these models are trained with the EM algorithm (Dempster, Laird, and Rubin 1977). (In (2), a_0 = 0 is assumed.)
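The decomposition in (2) can be made concrete with a small numeric sketch. All probability tables below are made-up illustrative values, not trained ones:

```python
# Toy illustration of the first-order decomposition (equation (2)):
# p(f, a | e) = prod_j p_a(a_j | a_{j-1}) * p_t(f_j | e_{a_j}).
# The probability tables are assumed toy values for a two-word pair.

p_t = {("le", "the"): 0.8, ("le", "cat"): 0.1,
       ("chat", "cat"): 0.9, ("chat", "the"): 0.05}

def p_a(a_j, a_prev):
    # HMM-style alignment probability depending only on the jump width (toy values)
    return {0: 0.2, 1: 0.6, -1: 0.1}.get(a_j - a_prev, 0.05)

def score_alignment(f, e, a):
    """p(f, a | e); a[j-1] is the 1-based target index for f_j, with a_0 = 0."""
    prob, a_prev = 1.0, 0
    for j, a_j in enumerate(a, start=1):
        prob *= p_a(a_j, a_prev) * p_t[(f[j - 1], e[a_j - 1])]
        a_prev = a_j
    return prob

f, e = ["le", "chat"], ["the", "cat"]
print(score_alignment(f, e, [1, 2]))  # the monotone alignment scores highest here
```

Under these toy tables, the monotone alignment [1, 2] scores far above the swapped alignment [2, 1], since both its jumps and its lexical pairs are likely.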

Given a sentence pair (f_1^J, e_1^I), the Viterbi alignment â_1^J is the alignment that maximizes the joint probability, as in (3):

â_1^J = argmax_{a_1^J} p(f_1^J, a_1^J | e_1^I). (3)

For the HMM model, (3) can be solved efficiently by dynamic programming with the Viterbi algorithm (Viterbi 1967).

2.2 FFNN-based Alignment Model

Neural networks have been applied successfully to speech recognition (Dahl et al. 2012), translation modeling (Le, Allauzen, and Yvon 2012; Vaswani, Zhao, Fossum, and Chiang 2013), and a variety of NLP tasks (Collobert and Weston 2008; Collobert, Weston, Bottou, Karlen, Kavukcuoglu, and Kuksa 2011). Yang et al. proposed an FFNN-based word alignment model by adapting CD-DNN-HMM (Dahl et al. 2012) to the HMM alignment model (Yang et al. 2013). The FFNN-based model replaces p_a and p_t in (2) with neural network scores t_a and t_t:

s_NN(a_1^J | f_1^J, e_1^I) = Π_{j=1}^J t_a(a_j − a_{j-1} | c(e_{a_{j-1}})) t_t(f_j, e_{a_j} | c(f_j), c(e_{a_j})), (4)

where s_NN is a score rather than a probability, and c(w) denotes the context words around w. As in the HMM model, each alignment decision depends only on the preceding alignment a_{j-1}. The lexical translation score t_t(f_j, e_{a_j} | c(f_j), c(e_{a_j})) is computed by a network consisting of a lookup layer (weights L), one hidden layer (weights {H, B_H}), and an output layer (weights {O, B_O}). L is a word embedding matrix (Bengio, Ducharme, Vincent, and Janvin 2003) of size M × (|V_f| + |V_e|), where V_f and V_e are the source and target vocabularies, including the special tokens unk (for rare words) and null, and M is the word embedding length.
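The Viterbi search of (3) under a first-order model can be sketched with dynamic programming. The probability tables here are illustrative assumptions, and indices are 0-based for brevity:

```python
import numpy as np

# Viterbi search for the best alignment (equation (3)) under a first-order model:
# delta[j][i] = max_{i'} delta[j-1][i'] * p_a(i | i') * p_t(f_j | e_i).

def viterbi_align(J, I, p_t, p_a):
    """p_t is (J x I), p_a is (I x I) with p_a[i_prev][i]; returns the best 0-based alignment."""
    delta = np.zeros((J, I))
    back = np.zeros((J, I), dtype=int)
    delta[0] = p_t[0] / I                     # uniform initial alignment probability
    for j in range(1, J):
        scores = delta[j - 1][:, None] * p_a  # scores[i_prev, i]
        back[j] = scores.argmax(axis=0)
        delta[j] = scores.max(axis=0) * p_t[j]
    a = [int(delta[J - 1].argmax())]
    for j in range(J - 1, 0, -1):
        a.append(int(back[j][a[-1]]))
    return a[::-1]

p_t = np.array([[0.8, 0.1], [0.05, 0.9]])     # toy translation scores for f_j vs e_i
p_a = np.array([[0.3, 0.7], [0.6, 0.4]])      # toy jump scores p_a[i_prev][i]
print(viterbi_align(2, 2, p_t, p_a))          # → [0, 1]
```

The backtrace recovers the argmax alignment in O(J·I²) time, which is what makes exact search tractable for the first-order models.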

Figure 1: FFNN-based network computing t_t(f_j, e_{a_j} | c(f_j), c(e_{a_j})).

The words in a window around f_j and e_{a_j} (window width 3 in Figure 1) are mapped by the lookup layer L to their word embeddings, which are concatenated into z_0. The hidden and output layers then compute

z_1 = f(H z_0 + B_H), (5)
t_t = O z_1 + B_O, (6)

where H, B_H, O, and B_O are the hidden and output layer weights and biases, z_1 is the hidden layer output, and f(x) is an activation function, htanh(x) following (Yang et al. 2013). The alignment score t_a(a_j − a_{j-1} | c(e_{a_{j-1}})) is computed by a network of the same form. All parameters are trained by stochastic gradient descent (SGD) with back-propagation (Rumelhart, Hinton, and Williams 1986), minimizing the ranking loss (7):

loss(θ) = Σ_{(f,e)∈T} max{0, 1 − s_θ(a+ | f, e) + s_θ(a− | f, e)}. (7)

(Footnotes: with additional hidden layers, the l-th layer computes z_l = f(H_l z_{l-1} + B_{H_l}); htanh(x) = −1 for x < −1, htanh(x) = 1 for x > 1, and htanh(x) = x otherwise; SGD uses mini-batches of size D.)
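The forward pass of (5)-(6) and the hinge loss of (7) can be sketched in a few lines of numpy. The vocabulary size, layer widths, and random weights are illustrative assumptions, and the context window is omitted for brevity:

```python
import numpy as np

# Minimal sketch of the FFNN lexical translation score t_t (equations (5)-(6))
# and the ranking loss (equation (7)). All shapes and weights are toy assumptions.

rng = np.random.default_rng(0)
M, hidden = 4, 8                      # embedding length and hidden width (assumed)

def htanh(x):
    # hard tanh used by Yang et al.: identity on [-1, 1], clipped outside
    return np.clip(x, -1.0, 1.0)

L = rng.normal(size=(10, M))          # toy lookup table for a 10-word joint vocabulary
H = rng.normal(size=(hidden, 2 * M))  # hidden weights (context window omitted here)
B_H = np.zeros(hidden)
O = rng.normal(size=(1, hidden))      # output weights
B_O = np.zeros(1)

def t_t(f_id, e_id):
    z0 = np.concatenate([L[f_id], L[e_id]])  # lookup layer output
    z1 = htanh(H @ z0 + B_H)                 # equation (5)
    return float(O @ z1 + B_O)               # equation (6)

def ranking_loss(score_gold, score_neg):
    # equation (7): push the gold alignment's score above the best wrong one by a margin of 1
    return max(0.0, 1.0 - score_gold + score_neg)

print(t_t(1, 7), ranking_loss(t_t(1, 7), t_t(1, 3)))
```

The loss is zero once the gold alignment outscores the best incorrect alignment by the margin, so training focuses only on violated pairs.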

Here θ denotes the model parameters, T the training data, s_θ the score of an alignment under θ computed by (4), a+ the gold alignment, and a− the incorrect alignment with the highest score under θ.

3 RNN-based Alignment Model

The proposed model computes the score of an alignment a_1^J with an RNN:

s_NN(a_1^J | f_1^J, e_1^I) = Π_{j=1}^J t_RNN(a_j | a_1^{j-1}, f_j, e_{a_j}). (8)

Unlike the FFNN-based model, in which the score of a_j depends only on a_{j-1}, t_RNN is conditioned on the entire alignment history a_1^{j-1}. Figure 2 shows the network, which consists of a lookup layer L, a hidden layer with weights {H_d, R_d, B_H^d}, and an output layer with weights {O, B_O}. The hidden layer weights H_d, R_d, and B_H^d are selected according to the jump width d = a_j − a_{j-1}; jumps beyond ±8 share weights, so the weight sets are {H^{≤-8}, H^{-7}, ..., H^7, H^{≥8}}, {R^{≤-8}, R^{-7}, ..., R^7, R^{≥8}}, and {B_H^{≤-8}, B_H^{-7}, ..., B_H^7, B_H^{≥8}}. The hidden state y_j plays the role that the hidden layer output plays in the FFNN-based model.

Figure 2: RNN-based alignment model.

The input words f_j and e_{a_j} are mapped by the lookup layer to their word embeddings, which are concatenated into x_j; unlike the FFNN-based model, no surrounding context words are used. The hidden state y_j is computed from x_j and the previous state y_{j-1} with the weights H_d, R_d, and B_H^d selected by the jump width d = a_j − a_{j-1}, and the score t_RNN(a_j | a_1^{j-1}, f_j, e_{a_j}) is computed from y_j:

y_j = f(H_d x_j + R_d y_{j-1} + B_H^d), (9)
t_RNN = O y_j + B_O, (10)

where O and B_O are the output layer weights and bias. Because y_{j-1} is in turn computed from y_{j-2}, and so on back to y_1, y_j compactly encodes the unbounded history a_1^{j-1}. The activation f(x) is htanh(x), as in the FFNN-based model (Yang et al. 2013). In contrast to the FFNN-based model, the RNN-based model has no separate alignment score t_a: the jump width is captured by the d-dependent weights. The parameters L, H_d, R_d, B_H^d, O, and B_O are trained by SGD with mini-batch size D = 1, using back-propagation through time (Rumelhart et al. 1986), which unfolds the recurrence over j. (For j = 1, a_0 = 0 and y_0 is a zero vector, so y_1 = f(H_d x_1 + B_H^d).)
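The recurrence of (9)-(10) can be sketched as follows. The embedding table, layer sizes, and random weights are illustrative assumptions; only the jump-dependent weight selection and the carried hidden state mirror the model:

```python
import numpy as np

# Sketch of the RNN alignment score t_RNN (equations (9)-(10)). The hidden state y_j
# summarizes the whole alignment history; weights H^d, R^d, B_H^d are selected by the
# clipped jump d = a_j - a_{j-1}. All sizes and weights are toy assumptions.

rng = np.random.default_rng(1)
M, hid, window = 3, 5, 8              # embedding length, hidden size, jump clip (|d| <= 8)

def htanh(x):
    return np.clip(x, -1.0, 1.0)

H = {d: rng.normal(size=(hid, 2 * M)) for d in range(-window, window + 1)}
R = {d: rng.normal(size=(hid, hid)) for d in range(-window, window + 1)}
B = {d: np.zeros(hid) for d in range(-window, window + 1)}
O, B_O = rng.normal(size=(1, hid)), np.zeros(1)
L = rng.normal(size=(20, M))          # toy joint embedding table

def score_alignment(f_ids, e_ids, a):
    """Sum of t_RNN scores over j; a is 1-based with a_0 = 0 and y_0 = 0."""
    y, a_prev, total = np.zeros(hid), 0, 0.0
    for j, a_j in enumerate(a, start=1):
        d = max(-window, min(window, a_j - a_prev))     # bucket the jump width
        x = np.concatenate([L[f_ids[j - 1]], L[e_ids[a_j - 1]]])
        y = htanh(H[d] @ x + R[d] @ y + B[d])           # equation (9)
        total += float(O @ y + B_O)                     # equation (10)
        a_prev = a_j
    return total

print(score_alignment([2, 5], [7, 9], [1, 2]))
```

Note that each step feeds the previous hidden state back in, so the score at position j depends on every earlier alignment decision, not just a_{j-1}.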

4 Training

4.1 Unsupervised Learning

The FFNN-based model is trained in a supervised manner with the ranking loss (7), which requires gold alignments. We instead train our model without supervision, following Dyer et al.'s approach (Dyer et al. 2011), which is based on contrastive estimation (CE) (Smith and Eisner 2005). CE treats a neighborhood of the observed data as pseudo-negative examples; Dyer et al. take the training data T as the positive examples and the set of target sentences built over the vocabulary V_e as their neighborhood. Applying this idea to the ranking loss gives (11):

loss(θ) = max{0, 1 − Σ_{(f+,e+)∈T} E_Φ[s_θ(a | f+, e+)] + Σ_{(f+,e−)∈Ω} E_Φ[s_θ(a | f+, e−)]}. (11)

Here Φ is the set of all possible alignments, E_Φ[s_θ] is the expectation of s_θ over Φ, and Ω is the neighborhood of T: for each positive pair (f+, e+), the pseudo-negative pairs (f+, e−), where e− is a target sentence over V_e of the same length as e+ with e− ≠ e+. Computing (11) over the full neighborhood is intractable. Inspired by noise-contrastive estimation (Gutmann and Hyvärinen 2010; Mnih and Teh 2012) and negative sampling (Mikolov, Sutskever, Chen, Corrado, and Dean 2013), we instead sample N pseudo-negative sentences e− per f+:

loss(θ) = Σ_{f+∈T} max{0, 1 − E_GEN[s_θ(a | f+, e+)] + (1/N) Σ_{e−} E_GEN[s_θ(a | f+, e−)]}, (12)

where e+ is the target sentence paired with f+ in the training data ((f+, e+) ∈ T) and each e− is a pseudo-negative sentence generated from e+ (e− ≠ e+).
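The move from (11) to (12) replaces a sum over the whole neighborhood with an average over N samples. A minimal sketch of the sampled loss, with scores supplied as plain numbers rather than expectations over alignments:

```python
# Sketch of the negative-sampling loss (equation (12)) for one positive pair.
# The score values passed in stand in for the expectations E_GEN[s_theta]; they
# are assumed inputs here, not computed by a real model.

def sampled_loss(score_pos, scores_neg):
    """max{0, 1 - s(e+) + (1/N) * sum_i s(e-_i)} for N sampled negatives."""
    n = len(scores_neg)
    return max(0.0, 1.0 - score_pos + sum(scores_neg) / n)

# the loss vanishes once the positive outscores the negatives' average by the margin
print(sampled_loss(3.0, [0.5, 1.0, 0.0]))  # → 0.0
print(sampled_loss(0.5, [0.5, 1.0, 0.0]))  # → 1.0
```

Averaging over the N negatives keeps the gradient scale independent of N, so N trades off estimation variance against cost per update.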

Here N pseudo-negative samples are generated for each f+, and GEN ⊂ Φ denotes the subset of alignments produced by beam search with width W, over which the expectation is taken. Each e− is generated by replacing the words of e+ with words sampled from the target vocabulary V_e according to lexical translation probabilities of the words in f+, estimated with the l_0-regularized IBM Model 1 (Vaswani, Huang, and Chiang 2012); for efficiency, sampling for each word of f+ is restricted to its top C candidates under the l_0-regularized IBM Model 1.

4.2 Agreement Constraint

The FFNN-based and RNN-based models are directional: like the HMM model, they model f given e (or e given f), and the two directions make different errors. Yang et al. train each direction independently (Yang et al. 2013). Matusov et al. and Liang et al. instead combine or jointly train the two directional models (Matusov et al. 2004; Liang et al. 2006), and Ganchev et al. and Graça et al. impose agreement through posterior constraints in the E-step of EM (Ganchev et al. 2008; Graça et al. 2008). We encourage agreement through the word embeddings: the two directional models are trained jointly under a penalty that keeps their word embeddings close to each other:

argmin_{θ_FE, θ_EF} { loss(θ_FE) + loss(θ_EF) + α ||θ_L^{EF} − θ_L^{FE}|| }. (13)

Here θ_FE and θ_EF are the parameters of the f→e and e→f models, θ_L denotes the word embedding part (the lookup layer L) of each model, α is the weight of the agreement penalty, and the norm is the 2-norm. The loss(θ) terms in (13) are either the supervised loss (7) or the unsupervised loss (12).
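The joint objective of (13) can be sketched directly. The embedding shapes, the loss values, and alpha below are illustrative assumptions:

```python
import numpy as np

# Sketch of the agreement objective (equation (13)): the two directional losses plus
# an L2 penalty tying the word-embedding (lookup) parameters of the two models together.

def agreement_objective(loss_fe, loss_ef, L_fe, L_ef, alpha):
    """loss(theta_FE) + loss(theta_EF) + alpha * ||L_EF - L_FE|| (Frobenius norm)."""
    return loss_fe + loss_ef + alpha * np.linalg.norm(L_ef - L_fe)

L_fe = np.ones((4, 3))          # toy embedding tables of the two directions
L_ef = np.ones((4, 3)) * 2.0
print(agreement_objective(1.0, 2.0, L_fe, L_ef, alpha=0.1))
```

When the two embedding tables coincide, the penalty vanishes and the objective reduces to the sum of the two directional losses; α controls how strongly the directions are pulled together.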

Algorithm 1: Training with the agreement constraint
Input: θ_FE^1, θ_EF^1, T, MaxIter, D, N, C, IBM1, W, α
1: for all t such that 1 ≤ t ≤ MaxIter do
2:   {(f+, e+)_D} ← sample(D, T)
3-1: {(f+, {e−}_N)_D} ← neg_e({(f+, e+)_D}, N, C, IBM1)
3-2: {(e+, {f−}_N)_D} ← neg_f({(f+, e+)_D}, N, C, IBM1)
4-1: θ_FE^{t+1} ← update({(f+, e+, {e−}_N)_D}, θ_FE^t, θ_EF^t, W, α)
4-2: θ_EF^{t+1} ← update({(e+, f+, {f−}_N)_D}, θ_EF^t, θ_FE^t, W, α)
5: end for
Output: θ_FE^{MaxIter+1}, θ_EF^{MaxIter+1}

Algorithm 1 trains the two directional models jointly. In each iteration, D sentence pairs (f+, e+) are sampled from the training data T (line 2), N pseudo-negative samples ({e−}_N and {f−}_N) are generated for each pair using the l_0-regularized IBM Model 1 (IBM1) (lines 3-1 and 3-2), and the parameters of each direction are updated by SGD (lines 4-1 and 4-2), with the other direction's current word embeddings referenced through the agreement penalty: θ_FE is updated against θ_EF, and θ_EF against θ_FE.

5 Experiments

5.1 Data

We evaluate on French-English data from the NAACL 2003 shared task (Mihalcea and Pedersen 2003), built from the Hansards corpus (Hansards); Japanese-English data from the Basic Travel Expression Corpus (BTEC) (Takezawa, Sumita, Sugaya, Yamamoto, and Yamamoto 2002) (IWSLT_a) and from the IWSLT 2007 evaluation campaign (Fordyce 2007) (IWSLT); Chinese-English data from the FBIS corpus (FBIS); and Japanese-English patent data from the NTCIR-9 and NTCIR-10 patent translation tasks (Goto, Lu, Chow, Sumita, and Tsou 2011; Goto, Chow, Lu, Sumita, and Tsou 2013) (NTCIR-9, NTCIR-10). Table 1 summarizes the training (Train), development (Dev), and test (Test) sets.
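The loop of Algorithm 1 can be rendered as the following skeleton, where sample, neg_e, neg_f, and update are placeholders for the operations the algorithm names, passed in as functions rather than implemented here:

```python
# Skeleton of Algorithm 1: alternating updates of the two directional models, each
# update seeing the other direction's current parameters through the agreement penalty.
# All four function arguments are stand-ins for the paper's steps.

def train(theta_fe, theta_ef, T, max_iter, D, N, C, ibm1, W, alpha,
          sample, neg_e, neg_f, update):
    for _ in range(max_iter):
        batch = sample(T, D)                  # line 2: mini-batch of (f+, e+) pairs
        negs_e = neg_e(batch, N, C, ibm1)     # line 3-1: pseudo-negative target sides
        negs_f = neg_f(batch, N, C, ibm1)     # line 3-2: pseudo-negative source sides
        theta_fe = update(batch, negs_e, theta_fe, theta_ef, W, alpha)  # line 4-1
        theta_ef = update(batch, negs_f, theta_ef, theta_fe, W, alpha)  # line 4-2
    return theta_fe, theta_ef
```

Each direction's update takes the other direction's parameters as a second argument, which is exactly where the embedding-agreement penalty of (13) enters.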

Table 1: Training, development, and test set sizes (sentence pairs): Hansards (Train: 100 K), IWSLT_a (Train: 9 K), IWSLT (Train: 40 K), FBIS (Dev: NIST02; Test: NIST03, NIST04), NTCIR-9 and NTCIR-10 (Train: M-scale; Dev: 2,000; Test: 2,000).

The IWSLT_a data consists of the 9,960 manually aligned BTEC sentence pairs of (Goh, Watanabe, Yamamoto, and Sumita 2010). The gold alignments for Hansards come from the NAACL 2003 shared task data. For FBIS, NIST02 is used for development and NIST03 and NIST04 for testing.

5.2 Compared Methods

The RNN-based model is compared with IBM Model 4, the FFNN-based model, and two HMM models: the HMM of Vogel et al. (Vogel et al. 1996) trained independently in each direction (HMM_indep), and the HMM of Liang et al. trained jointly over the two directions (Liang et al. 2006) (HMM_joint). IBM Model 4 is trained with GIZA++ using the standard bootstrapping scheme over IBM Models 1-4 and the HMM model (Och and Ney 2003) (IBM4); HMM_indep is also trained with GIZA++.

HMM_joint is trained with the BerkeleyAligner, which jointly trains IBM Model 1 and the HMM model in the two directions (Liang et al. 2006). For the FFNN-based model, the word embedding length M is 30 and the context window width is 5, following (Yang et al. 2013) (Section 2.2); its four variants are FFNN_s, FFNN_s+c, FFNN_u, and FFNN_u+c, where the subscripts s and u denote supervised and unsupervised training and +c the agreement constraint. For the RNN-based model, M is 30, the hidden state y_j has 100 units, and the input x_j has 60 (= 30 × 2) units; the corresponding variants are RNN_s, RNN_s+c, RNN_u, and RNN_u+c. For both NN-based models, all weights except the lookup layer L are initialized uniformly at random in [−0.1, 0.1], the word embeddings in L are initialized with embeddings trained by the RNNLM toolkit (Mikolov et al. 2010), and low-frequency words are mapped to unk. The weight α of the agreement penalty is set to 0.1. For the translation experiments, we use the Moses phrase-based SMT system (Koehn, Hoang, Birch, Callison-Burch, Federico, Bertoldi, Cowan, Shen, Moran, Zens, Dyer, Bojar, Constantin, and Herbst 2007), with ChaSen for Japanese segmentation and the Stanford Chinese segmenter for Chinese. 5-gram language models are trained with SRILM (Stolcke 2002) using modified Kneser-Ney smoothing (Kneser and Ney 1995; Chen and Goodman 1996): on the target side of the training data for IWSLT, NTCIR-9, and NTCIR-10, and on the Xinhua portion of English Gigaword for FBIS.

The SMT feature weights are tuned by MERT (Och 2003). Word alignment quality is evaluated by F1. The supervised NN-based models are trained either from gold alignments (REF) or from alignments produced by IBM Model 4 (IBM4); for Hansards, directional training alignments for the f→e and e→f models are derived via the grow-diag-final-and heuristic (Koehn, Och, and Marcu 2003). Statistical significance is tested at the 5% level. In the tables, FFNN_s (REF/IBM4) denotes FFNN_s trained from REF or IBM4 alignments, and likewise for the other models.

Table 2: Word alignment performance (F1) on IWSLT_a and Hansards for HMM_indep, HMM_joint, IBM4, FFNN_s (IBM4), FFNN_s+c (IBM4), RNN_s (IBM4), RNN_s+c (IBM4), FFNN_u, FFNN_u+c, RNN_u, RNN_u+c, FFNN_s (REF), FFNN_s+c (REF), RNN_s (REF), and RNN_s+c (REF).

Table 2 shows the word alignment performance on IWSLT_a and Hansards. On both tasks the RNN-based models outperform the corresponding FFNN-based models: RNN_s/s+c/u/u+c (IBM4) outperform FFNN_s/s+c/u/u+c (IBM4), and RNN_s/s+c (REF) outperform FFNN_s/s+c (REF). On IWSLT_a in particular, RNN_s (REF) and RNN_s (IBM4) outperform FFNN_s (REF) and FFNN_s (IBM4), indicating that the alignment history captured by the RNN benefits alignment quality; on Hansards the gain from the RNN is smaller (Section 6.1 discusses this difference). The agreement constraint is effective on both tasks: RNN_s+c (REF/IBM4) outperforms RNN_s (REF/IBM4), RNN_u+c outperforms RNN_u, and the same holds for the FFNN-based models. This is consistent with HMM_joint outperforming HMM_indep, as reported by Liang et al. Furthermore, on IWSLT_a the unsupervised RNN_u and RNN_u+c perform comparably to the supervised RNN_s (IBM4) and RNN_s+c (IBM4); on Hansards, the FFNN-based and RNN-based models trained from IBM4 alignments are limited by the quality of those IBM4 alignments.

Translation quality is evaluated by BLEU4 (Papineni, Roukos, Ward, and Zhu 2002), computed with mteval-v13a.pl. To reduce the instability of MERT, MERT is run three times and the scores are averaged (Utiyama, Yamamoto, and Sumita 2009). Translation is evaluated on IWSLT, NTCIR-9, NTCIR-10, and FBIS. In addition to the models above, IBM4_all denotes IBM Model 4 trained on all of the training data. Statistical significance is tested at the 5% level (Koehn 2004); in Table 3, * marks a significant difference from IBM4. Table 3 shows the translation performance (BLEU4, %) of SMT systems built from the alignments of IBM4, IBM4_all, FFNN_s (IBM4), RNN_s (IBM4), RNN_s+c (IBM4), RNN_u, and RNN_u+c, for IWSLT, NTCIR-9, NTCIR-10, and FBIS (NIST03 and NIST04). The RNN-based models achieve translation performance comparable to the baselines: in particular, RNN_u and RNN_u+c are comparable to or better than FFNN_s (IBM4) and IBM4, and on NTCIR-9 and FBIS they are comparable to IBM4_all despite being trained on only a subset of the data.

Table 3: Translation performance (BLEU4, %) of IBM4, IBM4_all, FFNN_s (IBM4), RNN_s (IBM4), RNN_s+c (IBM4), RNN_u, and RNN_u+c.

6 Discussion

6.1 Effect of the Alignment History

Figure 3 compares alignments produced by FFNN_s and RNN_s. In the example of Figure 3(a), FFNN_s misaligns the words around the English phrase "have you been", while RNN_s aligns them correctly; the long alignment history available to the RNN helps it resolve such cases consistently. As noted in Section 5.3, the benefit of the history differs between language pairs: it is larger for Japanese-English (IWSLT_a), where alignments involve long-distance reordering, than for French-English (Hansards), whose alignments are largely monotone, as in the example of Figure 3(b).

Figure 3: Example alignments produced by FFNN_s and RNN_s.

6.2 Effect of Training Data Size

We examine the effect of training data size on BTEC Japanese-English data, with training sets of 1 K, 9 K (corresponding to IWSLT_a), and 40 K (corresponding to IWSLT) sentence pairs; gold alignments (REF) are available for the IWSLT_a portion only. Table 4 shows the alignment performance (F1). RNN_s+c (REF) trained on 9 K pairs and RNN_u+c trained on 40 K pairs both outperform IBM4 trained on 40 K pairs; that is, the supervised RNN-based model reaches the performance of IBM4 with only 22.5% (9,000/40,000) of the training data, and the unsupervised RNN_u+c continues to improve as the amount of unannotated data grows. As Table 3 suggests, these alignments also translate into SMT performance competitive with IBM4_all. This behavior contrasts with the HMM models, where the jointly trained model of Liang et al. still depends on large amounts of data.

Table 4: Alignment performance (F1) on BTEC with 1 K, 9 K, and 40 K training pairs for HMM_indep, HMM_joint, IBM4, FFNN_s (IBM4), FFNN_s+c (IBM4), RNN_s (IBM4), RNN_s+c (IBM4), FFNN_u, FFNN_u+c, RNN_u, RNN_u+c, FFNN_s (REF), FFNN_s+c (REF), RNN_s (REF), and RNN_s+c (REF).

In particular, RNN_u+c trained on the larger unannotated data reaches the level of RNN_s+c (REF), showing that the unsupervised RNN-based model can compensate for the lack of gold alignments with additional unlabeled data.

7 Conclusion

This paper proposed a word alignment model based on an RNN, in which the recurrently connected hidden layer captures an unlimited alignment history. The model is trained either in a supervised manner or, following Dyer et al. (Dyer et al. 2011), in an unsupervised manner with artificially generated negative samples, and an agreement constraint on the word embeddings ties the two directional models together during training. Experiments showed that the RNN-based model outperforms the FFNN-based model (Yang et al. 2013) in alignment quality and achieves comparable translation quality. Future work includes incorporating context into the RNN-based model (the hidden state y_j currently receives no surrounding words, unlike the c(f_j) and c(e_{a_j}) windows of the FFNN-based model of Yang et al. (Yang et al. 2013)) and integrating the alignment model into SMT, as in Watanabe et al. (Watanabe, Suzuki, Tsukada, and Isozaki 2006).

An earlier version of this work was presented at The 52nd Annual Meeting of the Association for Computational Linguistics (Tamura, Watanabe, and Sumita 2014).

References

Auli, M., Galley, M., Quirk, C., and Zweig, G. (2013). Joint Language and Translation Modeling with Recurrent Neural Networks. In Proceedings of EMNLP 2013, pp Bengio, Y., Ducharme, R., Vincent, P., and Janvin, C. (2003). A Neural Probabilistic Language Model. Journal of Machine Learning Research, 3, pp Berg-Kirkpatrick, T., Bouchard-Côté, A., DeNero, J., and Klein, D. (2010). Painless Unsupervised Learning with Features. In Proceedings of HLT:NAACL 2010, pp Blunsom, P. and Cohn, T. (2006). Discriminative Word Alignment with Conditional Random Fields. In Proceedings of Coling/ACL 2006, pp Brown, P. F., Pietra, S. A. D., Pietra, V. J. D., and Mercer, R. L. (1993). The Mathematics of Statistical Machine Translation: Parameter Estimation. Computational Linguistics, 19 (2), pp Chen, S. F. and Goodman, J. (1996). An Empirical Study of Smoothing Techniques for Language Modeling. In Proceedings of ACL 1996, pp Collobert, R. and Weston, J. (2008). A Unified Architecture for Natural Language Processing: Deep Neural Networks with Multitask Learning. In Proceedings of ICML 2008, pp Collobert, R., Weston, J., Bottou, L., Karlen, M., Kavukcuoglu, K., and Kuksa, P. (2011). Natural Language Processing (Almost) from Scratch. Journal of Machine Learning Research, 12, pp Dahl, G. E., Yu, D., Deng, L., and Acero, A. (2012). Context-Dependent Pre-trained Deep Neural Networks for Large Vocabulary Speech Recognition. IEEE Transactions on Audio, Speech, and Language Processing, 20 (1), pp Dempster, A. P., Laird, N. M., and Rubin, D. B. (1977). Maximum Likelihood from Incomplete Data via the EM Algorithm. Journal of the Royal Statistical Society, Series B, 39 (1), pp Dyer, C., Clark, J., Lavie, A., and Smith, N. A. (2011). Unsupervised Word Alignment with Arbitrary Features. In Proceedings of ACL/HLT 2011, pp Fordyce, C. S. (2007). Overview of the IWSLT 2007 Evaluation Campaign. In Proceedings of IWSLT 2007, pp Ganchev, K., Graça, J. V., and Taskar, B.
(2008). Better Alignments = Better Translations? In Proceedings of ACL/HLT 2008, pp Goh, C.-L., Watanabe, T., Yamamoto, H., and Sumita, E. (2010). Constraining a Generative Word Alignment Model with Discriminative Output. IEICE Transactions, 93-D (7), 308

pp Goto, I., Chow, K. P., Lu, B., Sumita, E., and Tsou, B. K. (2013). Overview of the Patent Machine Translation Task at the NTCIR-10 Workshop. In Proceedings of 10th NTCIR Conference, pp Goto, I., Lu, B., Chow, K. P., Sumita, E., and Tsou, B. K. (2011). Overview of the Patent Machine Translation Task at the NTCIR-9 Workshop. In Proceedings of 9th NTCIR Conference, pp Graça, J. V., Ganchev, K., and Taskar, B. (2008). Expectation Maximization and Posterior Constraints. In Proceedings of NIPS 2008, pp Gutmann, M. and Hyvärinen, A. (2010). Noise-Contrastive Estimation: A New Estimation Principle for Unnormalized Statistical Models. In Proceedings of AISTATS 2010, pp Kalchbrenner, N. and Blunsom, P. (2013). Recurrent Continuous Translation Models. In Proceedings of EMNLP 2013, pp Kneser, R. and Ney, H. (1995). Improved Backing-off for M-gram Language Modeling. In Proceedings of ICASSP 1995, pp Koehn, P. (2004). Statistical Significance Tests for Machine Translation Evaluation. In Proceedings of EMNLP 2004, pp Koehn, P., Hoang, H., Birch, A., Callison-Burch, C., Federico, M., Bertoldi, N., Cowan, B., Shen, W., Moran, C., Zens, R., Dyer, C., Bojar, O., Constantin, A., and Herbst, E. (2007). Moses: Open Source Toolkit for Statistical Machine Translation. In Proceedings of ACL 2007, pp Koehn, P., Och, F. J., and Marcu, D. (2003). Statistical Phrase-Based Translation. In Proceedings of HLT/NAACL 2003, pp Le, H.-S., Allauzen, A., and Yvon, F. (2012). Continuous Space Translation Models with Neural Networks. In Proceedings of NAACL/HLT 2012, pp Liang, P., Taskar, B., and Klein, D. (2006). Alignment by Agreement. In Proceedings of HLT/NAACL 2006, pp Matusov, E., Zens, R., and Ney, H. (2004). Symmetric Word Alignments for Statistical Machine Translation. In Proceedings of Coling 2004, pp Mihalcea, R. and Pedersen, T. (2003). An Evaluation Exercise for Word Alignment.
In Proceedings of the HLT-NAACL 2003 Workshop on Building and Using Parallel Texts: Data Driven Machine Translation and Beyond, pp Mikolov, T., Karafiát, M., Burget, L., Cernocký, J., and Khudanpur, S. (2010). Recurrent Neural Network based Language Model. In Proceedings of INTERSPEECH 2010, pp

Mikolov, T., Sutskever, I., Chen, K., Corrado, G., and Dean, J. (2013). Distributed Representations of Words and Phrases and their Compositionality. In Proceedings of NIPS 2013, pp Mikolov, T. and Zweig, G. (2012). Context Dependent Recurrent Neural Network Language Model. In Proceedings of SLT 2012, pp Mnih, A. and Teh, Y. W. (2012). A Fast and Simple Algorithm for Training Neural Probabilistic Language Models. In Proceedings of ICML 2012, pp Moore, R. C. (2005). A Discriminative Framework for Bilingual Word Alignment. In Proceedings of HLT/EMNLP 2005, pp Och, F. J. (2003). Minimum Error Rate Training in Statistical Machine Translation. In Proceedings of ACL 2003, pp Och, F. J. and Ney, H. (2003). A Systematic Comparison of Various Statistical Alignment Models. Computational Linguistics, 29, pp Papineni, K., Roukos, S., Ward, T., and Zhu, W.-J. (2002). BLEU: a Method for Automatic Evaluation of Machine Translation. In Proceedings of ACL 2002, pp Rumelhart, D. E., Hinton, G. E., and Williams, R. J. (1986). Learning Internal Representations by Error Propagation. In Rumelhart, D. E. and McClelland, J. L. (Eds.), Parallel Distributed Processing, pp MIT Press. Smith, N. A. and Eisner, J. (2005). Contrastive Estimation: Training Log-Linear Models on Unlabeled Data. In Proceedings of ACL 2005, pp Stolcke, A. (2002). SRILM - An Extensible Language Modeling Toolkit. In Proceedings of ICSLP 2002, pp Sundermeyer, M., Oparin, I., Gauvain, J.-L., Freiberg, B., Schlüter, R., and Ney, H. (2013). Comparison of Feedforward and Recurrent Neural Network Language Models. In Proceedings of ICASSP 2013, pp Takezawa, T., Sumita, E., Sugaya, F., Yamamoto, H., and Yamamoto, S. (2002). Toward a Broad-coverage Bilingual Corpus for Speech Translation of Travel Conversations in the Real World. In Proceedings of LREC 2002, pp Tamura, A., Watanabe, T., and Sumita, E. (2014). Recurrent Neural Networks for Word Alignment Model.
In Proceedings of ACL 2014, pp Taskar, B., Lacoste-Julien, S., and Klein, D. (2005). A Discriminative Matching Approach to Word Alignment. In Proceedings of HLT/EMNLP 2005, pp Utiyama, M., Yamamoto, H., and Sumita, E. (2009). Two Methods for Stabilizing MERT: NICT at IWSLT 2009. In Proceedings of IWSLT 2009, pp

Vaswani, A., Huang, L., and Chiang, D. (2012). Smaller Alignment Models for Better Translations: Unsupervised Word Alignment with the l_0-norm. In Proceedings of ACL 2012, pp Vaswani, A., Zhao, Y., Fossum, V., and Chiang, D. (2013). Decoding with Large-Scale Neural Language Models Improves Translation. In Proceedings of EMNLP 2013, pp Viterbi, A. J. (1967). Error Bounds for Convolutional Codes and an Asymptotically Optimum Decoding Algorithm. IEEE Transactions on Information Theory, 13 (2), pp Vogel, S., Ney, H., and Tillmann, C. (1996). HMM-based Word Alignment in Statistical Translation. In Proceedings of Coling 1996, pp Watanabe, T., Suzuki, J., Tsukada, H., and Isozaki, H. (2006). NTT Statistical Machine Translation for IWSLT 2006. In Proceedings of IWSLT 2006, pp Yang, N., Liu, S., Li, M., Zhou, M., and Yu, N. (2013). Word Alignment Modeling with Context Dependent Deep Neural Network. In Proceedings of ACL 2013, pp

Language and Information Technologies, School of Computer Science, Carnegie Mellon University, Master of Science 2004 ATR NTT NICT



More information

fiš„v8.dvi

fiš„v8.dvi (2001) 49 2 333 343 Java Jasp 1 2 3 4 2001 4 13 2001 9 17 Java Jasp (JAva based Statistical Processor) Jasp Jasp. Java. 1. Jasp CPU 1 106 8569 4 6 7; fuji@ism.ac.jp 2 106 8569 4 6 7; nakanoj@ism.ac.jp

More information

IPSJ SIG Technical Report Vol.2013-CVIM-187 No /5/30 1,a) 1,b), 1,,,,,,, (DNN),,,, 2 (CNN),, 1.,,,,,,,,,,,,,,,,,, [1], [6], [7], [12], [13]., [

IPSJ SIG Technical Report Vol.2013-CVIM-187 No /5/30 1,a) 1,b), 1,,,,,,, (DNN),,,, 2 (CNN),, 1.,,,,,,,,,,,,,,,,,, [1], [6], [7], [12], [13]., [ ,a),b),,,,,,,, (DNN),,,, (CNN),,.,,,,,,,,,,,,,,,,,, [], [6], [7], [], [3]., [8], [0], [7],,,, Tohoku University a) omokawa@vision.is.tohoku.ac.jp b) okatani@vision.is.tohoku.ac.jp, [3],, (DNN), DNN, [3],

More information

( )

( ) NAIST-IS-MT1051071 2012 3 16 ( ) Pustejovsky 2 2,,,,,,, NAIST-IS- MT1051071, 2012 3 16. i Automatic Acquisition of Qualia Structure of Generative Lexicon in Japanese Using Learning to Rank Takahiro Tsuneyoshi

More information

ニュラールネットに基づく機械翻訳 ニューラルネットに 基づく機械翻訳 Graham Neubig 奈良先端科学技術大学院大学 (NAIST)

ニュラールネットに基づく機械翻訳 ニューラルネットに 基づく機械翻訳 Graham Neubig 奈良先端科学技術大学院大学 (NAIST) ニューラルネットに 基づく機械翻訳 Graham Neubig 奈良先端科学技術大学院大学 (NAIST) 205-9-5 I am giving a talk at Kyoto University 私 は 京都 大学 で 講演 を しています ( 終 ) 2 次の単語確率を推測 F = I am giving a talk P(e= 私 F) = 0.8 P(e= 僕 F) = 0.03 P(e=

More information

自然言語処理21_249

自然言語処理21_249 1,327 Annotation of Focus for Negation in Japanese Text Suguru Matsuyoshi This paper proposes an annotation scheme for the focus of negation in Japanese text. Negation has a scope, and its focus falls

More information

Outline ACL 2017 ACL ACL 2017 Chairs/Presidents

Outline ACL 2017 ACL ACL 2017 Chairs/Presidents ACL 2017, 2017/9/7 Outline ACL 2017 ACL ACL 2017 Chairs/Presidents ACL ACL he annual meeting of the Association for Computational Linguistics (Computational Linguistics) (Natural Language Processing) /

More information

10_08.dvi

10_08.dvi 476 67 10 2011 pp. 476 481 * 43.72.+q 1. MOS Mean Opinion Score ITU-T P.835 [1] [2] [3] Subjective and objective quality evaluation of noisereduced speech. Takeshi Yamada, Shoji Makino and Nobuhiko Kitawaki

More information

(MIRU2008) HOG Histograms of Oriented Gradients (HOG)

(MIRU2008) HOG Histograms of Oriented Gradients (HOG) (MIRU2008) 2008 7 HOG - - E-mail: katsu0920@me.cs.scitec.kobe-u.ac.jp, {takigu,ariki}@kobe-u.ac.jp Histograms of Oriented Gradients (HOG) HOG Shape Contexts HOG 5.5 Histograms of Oriented Gradients D Human

More information

130 Oct Radial Basis Function RBF Efficient Market Hypothesis Fama ) 4) 1 Fig. 1 Utility function. 2 Fig. 2 Value function. (1) (2)

130 Oct Radial Basis Function RBF Efficient Market Hypothesis Fama ) 4) 1 Fig. 1 Utility function. 2 Fig. 2 Value function. (1) (2) Vol. 47 No. SIG 14(TOM 15) Oct. 2006 RBF 2 Effect of Stock Investor Agent According to Framing Effect to Stock Exchange in Artificial Stock Market Zhai Fei, Shen Kan, Yusuke Namikawa and Eisuke Kita Several

More information

2797 4 5 6 7 2. 2.1 COM COM 4) 5) COM COM 3 4) 5) 2 2.2 COM COM 6) 7) 10) COM Bonanza 6) Bonanza 6 10 20 Hearts COM 7) 10) 52 4 3 Hearts 3 2,000 4,000

2797 4 5 6 7 2. 2.1 COM COM 4) 5) COM COM 3 4) 5) 2 2.2 COM COM 6) 7) 10) COM Bonanza 6) Bonanza 6 10 20 Hearts COM 7) 10) 52 4 3 Hearts 3 2,000 4,000 Vol. 50 No. 12 2796 2806 (Dec. 2009) 1 1, 2 COM TCG COM TCG COM TCG Strategy-acquisition System for Video Trading Card Game Nobuto Fujii 1 and Haruhiro Katayose 1, 2 Behavior and strategy of computers

More information

Machine Learning for NLP

Machine Learning for NLP 自然言語処理におけるディープラーニングの発展 Yuta Tsuboi IBM Research Tokyo yutat@jp.ibm.com 2015-03-16 出版予定のサーベイ論文の内容を元にお話します 坪井祐太, 自然言語処理におけるディープラーニングの発展, オペレーションズ リサーチ, Vol.60, No.4 (In press) 自然言語処理 (Natural Language Processing;

More information

IPSJ SIG Technical Report Vol.2014-CG-155 No /6/28 1,a) 1,2,3 1 3,4 CG An Interpolation Method of Different Flow Fields using Polar Inter

IPSJ SIG Technical Report Vol.2014-CG-155 No /6/28 1,a) 1,2,3 1 3,4 CG An Interpolation Method of Different Flow Fields using Polar Inter ,a),2,3 3,4 CG 2 2 2 An Interpolation Method of Different Flow Fields using Polar Interpolation Syuhei Sato,a) Yoshinori Dobashi,2,3 Tsuyoshi Yamamoto Tomoyuki Nishita 3,4 Abstract: Recently, realistic

More information

yasi10.dvi

yasi10.dvi 2002 50 2 259 278 c 2002 1 2 2002 2 14 2002 6 17 73 PML 1. 1997 1998 Swiss Re 2001 Canabarro et al. 1998 2001 1 : 651 0073 1 5 1 IHD 3 2 110 0015 3 3 3 260 50 2 2002, 2. 1 1 2 10 1 1. 261 1. 3. 3.1 2 1

More information

‰gficŒõ/’ÓŠ¹

‰gficŒõ/’ÓŠ¹ The relationship between creativity of Haiku and idea search space YOSHIDA Yasushi This research examined the relationship between experts' ranking of creative Haiku (a Japanese character poem including

More information

フレーズベース機械翻訳システムの構築 フレーズベース機械翻訳システムの構築 Graham Neubig & Kevin Duh 奈良先端科学技術大学院大学 (NAIST) 5/10/2012 1

フレーズベース機械翻訳システムの構築 フレーズベース機械翻訳システムの構築 Graham Neubig & Kevin Duh 奈良先端科学技術大学院大学 (NAIST) 5/10/2012 1 Graham Neubig & Kevin Duh 奈良先端科学技術大学院大学 (NAIST) 5/10/2012 1 フレーズベース統計的機械翻訳 ( SMT ) 文を翻訳可能な小さい塊に分けて 並べ替える Today I will give a lecture on machine translation. Today 今日は I will give を行います a lecture on の講義

More information

IPSJ SIG Technical Report 1, Instrument Separation in Reverberant Environments Using Crystal Microphone Arrays Nobutaka ITO, 1, 2 Yu KITANO, 1

IPSJ SIG Technical Report 1, Instrument Separation in Reverberant Environments Using Crystal Microphone Arrays Nobutaka ITO, 1, 2 Yu KITANO, 1 1, 2 1 1 1 Instrument Separation in Reverberant Environments Using Crystal Microphone Arrays Nobutaka ITO, 1, 2 Yu KITANO, 1 Nobutaka ONO 1 and Shigeki SAGAYAMA 1 This paper deals with instrument separation

More information

IPSJ SIG Technical Report Vol.2009-CVIM-167 No /6/10 Real AdaBoost HOG 1 1 1, 2 1 Real AdaBoost HOG HOG Real AdaBoost HOG A Method for Reducing

IPSJ SIG Technical Report Vol.2009-CVIM-167 No /6/10 Real AdaBoost HOG 1 1 1, 2 1 Real AdaBoost HOG HOG Real AdaBoost HOG A Method for Reducing Real AdaBoost HOG 1 1 1, 2 1 Real AdaBoost HOG HOG Real AdaBoost HOG A Method for Reducing number of HOG Features based on Real AdaBoost Chika Matsushima, 1 Yuji Yamauchi, 1 Takayoshi Yamashita 1, 2 and

More information

カルマンフィルターによるベータ推定( )

カルマンフィルターによるベータ推定( ) β TOPIX 1 22 β β smoothness priors (the Capital Asset Pricing Model, CAPM) CAPM 1 β β β β smoothness priors :,,. E-mail: koiti@ism.ac.jp., 104 1 TOPIX β Z i = β i Z m + α i (1) Z i Z m α i α i β i (the

More information

89-95.indd

89-95.indd 解 説 機械翻訳最新事情 : ( 上 ) 統計的機械翻訳入門 永田昌明渡辺太郎塚田元 NTT 科学基礎研究所 統計的機械翻訳 (statistical machin translation) は, 互いに翻訳になっている 2 つの言語の文の対から翻訳規則や対訳辞書を自動的に学習し, 言語翻訳を実現する技術である. この技術は過去 0 年間に大きく進歩し, アラビア語と英語のような語順が比較的近い言語対では,

More information

& Vol.5 No (Oct. 2015) TV 1,2,a) , Augmented TV TV AR Augmented Reality 3DCG TV Estimation of TV Screen Position and Ro

& Vol.5 No (Oct. 2015) TV 1,2,a) , Augmented TV TV AR Augmented Reality 3DCG TV Estimation of TV Screen Position and Ro TV 1,2,a) 1 2 2015 1 26, 2015 5 21 Augmented TV TV AR Augmented Reality 3DCG TV Estimation of TV Screen Position and Rotation Using Mobile Device Hiroyuki Kawakita 1,2,a) Toshio Nakagawa 1 Makoto Sato

More information

DEIM Forum 2009 C8-4 QA NTT QA QA QA 2 QA Abstract Questions Recomme

DEIM Forum 2009 C8-4 QA NTT QA QA QA 2 QA Abstract Questions Recomme DEIM Forum 2009 C8-4 QA NTT 239 0847 1 1 E-mail: {kabutoya.yutaka,kawashima.harumi,fujimura.ko}@lab.ntt.co.jp QA QA QA 2 QA Abstract Questions Recommendation Based on Evolution Patterns of a QA Community

More information

3. ( 1 ) Linear Congruential Generator:LCG 6) (Mersenne Twister:MT ), L 1 ( 2 ) 4 4 G (i,j) < G > < G 2 > < G > 2 g (ij) i= L j= N

3. ( 1 ) Linear Congruential Generator:LCG 6) (Mersenne Twister:MT ), L 1 ( 2 ) 4 4 G (i,j) < G > < G 2 > < G > 2 g (ij) i= L j= N RMT 1 1 1 N L Q=L/N (RMT), RMT,,,., Box-Muller, 3.,. Testing Randomness by Means of RMT Formula Xin Yang, 1 Ryota Itoi 1 and Mieko Tanaka-Yamawaki 1 Random matrix theory derives, at the limit of both dimension

More information

新製品開発プロジェクトの評価手法

新製品開発プロジェクトの評価手法 CIRJE-J-60 2001 8 A note on new product project selection model: Empirical analysis in chemical industry Kenichi KuwashimaUniversity of Tokyo Junichi TomitaUniversity of Tokyo August, 2001 Abstract By

More information

IPSJ SIG Technical Report Vol.2009-BIO-17 No /5/26 DNA 1 1 DNA DNA DNA DNA Correcting read errors on DNA sequences determined by Pyrosequencing

IPSJ SIG Technical Report Vol.2009-BIO-17 No /5/26 DNA 1 1 DNA DNA DNA DNA Correcting read errors on DNA sequences determined by Pyrosequencing DNA 1 1 DNA DNA DNA DNA Correcting read errors on DNA sequences determined by Pyrosequencing Youhei Namiki 1 and Yutaka Akiyama 1 Pyrosequencing, one of the DNA sequencing technologies, allows us to determine

More information

jpaper : 2017/4/17(17:52),,.,,,.,.,.,, Improvement in Domain Specific Word Segmentation by Symbol Grounding suzushi tomori, hirotaka kameko, takashi n

jpaper : 2017/4/17(17:52),,.,,,.,.,.,, Improvement in Domain Specific Word Segmentation by Symbol Grounding suzushi tomori, hirotaka kameko, takashi n ,,.,,,.,.,.,, Improvement in Domain Specific Word Segmentation by Symbol Grounding suzushi tomori, hirotaka kameko, takashi ninomiya, shinsuke mori and yoshimasa tsuruoka We propose a novel framework for

More information

A Study on Throw Simulation for Baseball Pitching Machine with Rollers and Its Optimization Shinobu SAKAI*5, Yuichiro KITAGAWA, Ryo KANAI and Juhachi

A Study on Throw Simulation for Baseball Pitching Machine with Rollers and Its Optimization Shinobu SAKAI*5, Yuichiro KITAGAWA, Ryo KANAI and Juhachi A Study on Throw Simulation for Baseball Pitching Machine with Rollers and Its Optimization Shinobu SAKAI*5, Yuichiro KITAGAWA, Ryo KANAI and Juhachi ODA Department of Human and Mechanical Systems Engineering,

More information

a) b) c) Speech Recognition of Short Time Utterance Based on Speaker Clustering Hiroshi SEKI a), Daisuke ENAMI, Faqiang ZHU, Kazumasa YAMAMOTO b), and

a) b) c) Speech Recognition of Short Time Utterance Based on Speaker Clustering Hiroshi SEKI a), Daisuke ENAMI, Faqiang ZHU, Kazumasa YAMAMOTO b), and a) b) c) Speech Recognition of Short Time Utterance Based on Speaker Clustering Hiroshi SEKI a), Daisuke ENAMI, Faqiang ZHU, Kazumasa YAMAMOTO b), and Seiichi NAKAGAWA c) 0.5 DNN (Deep Neural Network)

More information

kut-paper-template.dvi

kut-paper-template.dvi 14 Application of Automatic Text Summarization for Question Answering System 1030260 2003 2 12 Prassie Posum Prassie Prassie i Abstract Application of Automatic Text Summarization for Question Answering

More information

HASC2012corpus HASC Challenge 2010,2011 HASC2011corpus( 116, 4898), HASC2012corpus( 136, 7668) HASC2012corpus HASC2012corpus

HASC2012corpus HASC Challenge 2010,2011 HASC2011corpus( 116, 4898), HASC2012corpus( 136, 7668) HASC2012corpus HASC2012corpus HASC2012corpus 1 1 1 1 1 1 2 2 3 4 5 6 7 HASC Challenge 2010,2011 HASC2011corpus( 116, 4898), HASC2012corpus( 136, 7668) HASC2012corpus HASC2012corpus: Human Activity Corpus and Its Application Nobuo KAWAGUCHI,

More information

¥ì¥·¥Ô¤Î¸À¸ì½èÍý¤Î¸½¾õ

¥ì¥·¥Ô¤Î¸À¸ì½èÍý¤Î¸½¾õ 2013 8 18 Table of Contents = + 1. 2. 3. 4. 5. etc. 1. ( + + ( )) 2. :,,,,,, (MUC 1 ) 3. 4. (subj: person, i-obj: org. ) 1 Message Understanding Conference ( ) UGC 2 ( ) : : 2 User-Generated Content [

More information

IPSJ SIG Technical Report Vol.2010-CVIM-170 No /1/ Visual Recognition of Wire Harnesses for Automated Wiring Masaki Yoneda, 1 Ta

IPSJ SIG Technical Report Vol.2010-CVIM-170 No /1/ Visual Recognition of Wire Harnesses for Automated Wiring Masaki Yoneda, 1 Ta 1 1 1 1 2 1. Visual Recognition of Wire Harnesses for Automated Wiring Masaki Yoneda, 1 Takayuki Okatani 1 and Koichiro Deguchi 1 This paper presents a method for recognizing the pose of a wire harness

More information

11 22 33 12 23 1 2 3, 1 2, U2 3 U 1 U b 1 (o t ) b 2 (o t ) b 3 (o t ), 3 b (o t ) MULTI-SPEAKER SPEECH DATABASE Training Speech Analysis Mel-Cepstrum, logf0 /context1/ /context2/... Context Dependent

More information

一般社団法人 電子情報通信学会 THE INSTITUTE OF ELECTRONICS, 社団法人 電子情報通信学会 INFORMATION AND COMMUNICATION ENGINEERS 信学技報 IEICE Technical Report NLC ( ) 信学

一般社団法人 電子情報通信学会 THE INSTITUTE OF ELECTRONICS, 社団法人 電子情報通信学会 INFORMATION AND COMMUNICATION ENGINEERS 信学技報 IEICE Technical Report NLC ( ) 信学 一般社団法人 電子情報通信学会 THE INSTITUTE OF ELECTRONICS, 社団法人 電子情報通信学会 INFORMATION AND COMMUNICATION ENGINEERS 信学技報 IEICE Technical Report NLC2017-17(2017-09 信学技報 TECHNICAL REPORT OF IEICE. THE INSTITUTE OF ELECTRONICS,

More information

2008 : 80725872 1 2 2 3 2.1.......................................... 3 2.2....................................... 3 2.3......................................... 4 2.4 ()..................................

More information

統計的機械翻訳モデルの構築 各モデルを対訳文から学習 対訳文 太郎が花子を訪問した Taro visited Hanako. 花子にプレセントを渡した He gave Hanako a present.... モデル翻訳モデル並べ替えモデル言語モデル 2

統計的機械翻訳モデルの構築 各モデルを対訳文から学習 対訳文 太郎が花子を訪問した Taro visited Hanako. 花子にプレセントを渡した He gave Hanako a present.... モデル翻訳モデル並べ替えモデル言語モデル 2 ALAGIN 機械翻訳セミナー単語アライメント Graham Neubig 奈良先端科学技術大学院大学 (NAIST) 2014 年 3 月 5 日 https://sites.google.com/site/alaginmt2014/ 1 統計的機械翻訳モデルの構築 各モデルを対訳文から学習 対訳文 太郎が花子を訪問した Taro visited Hanako. 花子にプレセントを渡した He gave

More information

& 3 3 ' ' (., (Pixel), (Light Intensity) (Random Variable). (Joint Probability). V., V = {,,, V }. i x i x = (x, x,, x V ) T. x i i (State Variable),

& 3 3 ' ' (., (Pixel), (Light Intensity) (Random Variable). (Joint Probability). V., V = {,,, V }. i x i x = (x, x,, x V ) T. x i i (State Variable), .... Deeping and Expansion of Large-Scale Random Fields and Probabilistic Image Processing Kazuyuki Tanaka The mathematical frameworks of probabilistic image processing are formulated by means of Markov

More information

,.,. NP,., ,.,,.,.,,, (PCA)...,,. Tipping and Bishop (1999) PCA. (PPCA)., (Ilin and Raiko, 2010). PPCA EM., , tatsukaw

,.,. NP,., ,.,,.,.,,, (PCA)...,,. Tipping and Bishop (1999) PCA. (PPCA)., (Ilin and Raiko, 2010). PPCA EM., , tatsukaw ,.,. NP,.,. 1 1.1.,.,,.,.,,,. 2. 1.1.1 (PCA)...,,. Tipping and Bishop (1999) PCA. (PPCA)., (Ilin and Raiko, 2010). PPCA EM., 152-8552 2-12-1, tatsukawa.m.aa@m.titech.ac.jp, 190-8562 10-3, mirai@ism.ac.jp

More information

7) 8) 9),10) 11) 18) 11),16) 18) 19) 20) Vocaloid 6) Vocaloid 1 VocaListener1 2 VocaListener1 3 VocaListener VocaListener1 VocaListener1 Voca

7) 8) 9),10) 11) 18) 11),16) 18) 19) 20) Vocaloid 6) Vocaloid 1 VocaListener1 2 VocaListener1 3 VocaListener VocaListener1 VocaListener1 Voca VocaListener2: 1 1 VocaListener2 VocaListener VocaListener2 VocaListener2 VocaListener VocaListener2 VocaListener2: A Singing Synthesis System Mimicking Voice Timbre Changes in Addition to Pitch and Dynamics

More information

1., 1 COOKPAD 2, Web.,,,,,,.,, [1]., 5.,, [2].,,.,.,, 5, [3].,,,.,, [4], 33,.,,.,,.. 2.,, 3.., 4., 5., ,. 1.,,., 2.,. 1,,

1., 1 COOKPAD 2, Web.,,,,,,.,, [1]., 5.,, [2].,,.,.,, 5, [3].,,,.,, [4], 33,.,,.,,.. 2.,, 3.., 4., 5., ,. 1.,,., 2.,. 1,, THE INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS TECHNICAL REPORT OF IEICE.,, 464 8601 470 0393 101 464 8601 E-mail: matsunagah@murase.m.is.nagoya-u.ac.jp, {ide,murase,hirayama}@is.nagoya-u.ac.jp,

More information

1. HNS [1] HNS HNS HNS [2] HNS [3] [4] [5] HNS 16ch SNR [6] 1 16ch 1 3 SNR [4] [5] 2. 2 HNS API HNS CS27-HNS [1] (SOA) [7] API Web 2

1. HNS [1] HNS HNS HNS [2] HNS [3] [4] [5] HNS 16ch SNR [6] 1 16ch 1 3 SNR [4] [5] 2. 2 HNS API HNS CS27-HNS [1] (SOA) [7] API Web 2 THE INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS TECHNICAL REPORT OF IEICE. 657 8531 1 1 E-mail: {soda,matsubara}@ws.cs.kobe-u.ac.jp, {masa-n,shinsuke,shin,yosimoto}@cs.kobe-u.ac.jp,

More information

概要 単語の分散表現に基づく統計的機械翻訳の素性を提案 既存手法の FFNNLM に CNN と Gate を追加 dependency- to- string デコーダにおいて既存手法を上回る翻訳精度を達成

概要 単語の分散表現に基づく統計的機械翻訳の素性を提案 既存手法の FFNNLM に CNN と Gate を追加 dependency- to- string デコーダにおいて既存手法を上回る翻訳精度を達成 Encoding Source Language with Convolu5onal Neural Network for Machine Transla5on Fandong Meng, Zhengdong Lu, Mingxuan Wang, Hang Li, Wenbin Jiang, Qun Liu, ACL- IJCNLP 2015 すずかけ読み会奥村 高村研究室博士二年上垣外英剛 概要

More information

dvi

dvi 2017 65 2 185 200 2017 1 2 2016 12 28 2017 5 17 5 24 PITCHf/x PITCHf/x PITCHf/x MLB 2014 PITCHf/x 1. 1 223 8522 3 14 1 2 223 8522 3 14 1 186 65 2 2017 PITCHf/x 1.1 PITCHf/x PITCHf/x SPORTVISION MLB 30

More information

4. C i k = 2 k-means C 1 i, C 2 i 5. C i x i p [ f(θ i ; x) = (2π) p 2 Vi 1 2 exp (x µ ] i) t V 1 i (x µ i ) 2 BIC BIC = 2 log L( ˆθ i ; x i C i ) + q

4. C i k = 2 k-means C 1 i, C 2 i 5. C i x i p [ f(θ i ; x) = (2π) p 2 Vi 1 2 exp (x µ ] i) t V 1 i (x µ i ) 2 BIC BIC = 2 log L( ˆθ i ; x i C i ) + q x-means 1 2 2 x-means, x-means k-means Bayesian Information Criterion BIC Watershed x-means Moving Object Extraction Using the Number of Clusters Determined by X-means Clustering Naoki Kubo, 1 Kousuke

More information

(i) 1 (ii) ,, 第 5 回音声ドキュメント処理ワークショップ講演論文集 (2011 年 3 月 7 日 ) 1) 1 2) Lamel 2) Roy 3) 4) w 1 w 2 w n 2 2-g

(i) 1 (ii) ,, 第 5 回音声ドキュメント処理ワークショップ講演論文集 (2011 年 3 月 7 日 ) 1) 1  2) Lamel 2) Roy 3) 4) w 1 w 2 w n 2 2-g 1 2 1 closed Automatic Detection of Edited Parts in Inexact Transcribed Corpora Using Alignment between Edited Transcription and Corresponding Utterance Kengo Ohta, 1 Masatoshi Tsuchiya 2 and Seiichi Nakagawa

More information

第62巻 第1号 平成24年4月/石こうを用いた木材ペレット

第62巻 第1号 平成24年4月/石こうを用いた木材ペレット Bulletin of Japan Association for Fire Science and Engineering Vol. 62. No. 1 (2012) Development of Two-Dimensional Simple Simulation Model and Evaluation of Discharge Ability for Water Discharge of Firefighting

More information

IPSJ SIG Technical Report Vol.2011-EC-19 No /3/ ,.,., Peg-Scope Viewer,,.,,,,. Utilization of Watching Logs for Support of Multi-

IPSJ SIG Technical Report Vol.2011-EC-19 No /3/ ,.,., Peg-Scope Viewer,,.,,,,. Utilization of Watching Logs for Support of Multi- 1 3 5 4 1 2 1,.,., Peg-Scope Viewer,,.,,,,. Utilization of Watching Logs for Support of Multi-View Video Contents Kosuke Niwa, 1 Shogo Tokai, 3 Tetsuya Kawamoto, 5 Toshiaki Fujii, 4 Marutani Takafumi,

More information

IPSJ SIG Technical Report Vol.2017-SLP-115 No /2/18 1,a) 1 1,2 Sakriani Sakti [1][2] [3][4] [5][6][7] [8] [9] 1 Nara Institute of Scie

IPSJ SIG Technical Report Vol.2017-SLP-115 No /2/18 1,a) 1 1,2 Sakriani Sakti [1][2] [3][4] [5][6][7] [8] [9] 1 Nara Institute of Scie 1,a) 1 1,2 Sakriani Sakti 1 1 1 1. [1][2] [3][4] [5][6][7] [8] [9] 1 Nara Institute of Science and Technology 2 Japan Science and Technology Agency a) ishikawa.yoko.io5@is.naist.jp 2. 1 Belief-Desire theory

More information

DEIM Forum 2009 B4-6, Str

DEIM Forum 2009 B4-6, Str DEIM Forum 2009 B4-6, 305 8573 1 1 1 152 8550 2 12 1 E-mail: tttakuro@kde.cs.tsukuba.ac.jp, watanabe@de.cs.titech.ac.jp, kitagawa@cs.tsukuba.ac.jp StreamSpinner PC PC StreamSpinner Development of Data

More information

Input image Initialize variables Loop for period of oscillation Update height map Make shade image Change property of image Output image Change time L

Input image Initialize variables Loop for period of oscillation Update height map Make shade image Change property of image Output image Change time L 1,a) 1,b) 1/f β Generation Method of Animation from Pictures with Natural Flicker Abstract: Some methods to create animation automatically from one picture have been proposed. There is a method that gives

More information

DEIM Forum 2019 H Web 1 Tripadvisor

DEIM Forum 2019 H Web 1 Tripadvisor DEIM Forum 2019 H7-2 163 8677 1 24 2 E-mail: em18011@ns.kogakuin.ac.jp, kitayama@cc.kogakuin.ac.jp Web 1 Tripadvisor 1 2 1 1https://www.tripadvisor.com/ 2https://www.jalan.net/kankou/ 1 2 3 4 5 6 7 2 2.

More information

Vol.55 No (Jan. 2014) saccess 6 saccess 7 saccess 2. [3] p.33 * B (A) (B) (C) (D) (E) (F) *1 [3], [4] Web PDF a m

Vol.55 No (Jan. 2014) saccess 6 saccess 7 saccess 2. [3] p.33 * B (A) (B) (C) (D) (E) (F) *1 [3], [4] Web PDF   a m Vol.55 No.1 2 15 (Jan. 2014) 1,a) 2,3,b) 4,3,c) 3,d) 2013 3 18, 2013 10 9 saccess 1 1 saccess saccess Design and Implementation of an Online Tool for Database Education Hiroyuki Nagataki 1,a) Yoshiaki

More information

独立行政法人情報通信研究機構 Development of the Information Analysis System WISDOM KIDAWARA Yutaka NICT Knowledge Clustered Group researched and developed the infor

独立行政法人情報通信研究機構 Development of the Information Analysis System WISDOM KIDAWARA Yutaka NICT Knowledge Clustered Group researched and developed the infor 独立行政法人情報通信研究機構 KIDAWARA Yutaka NICT Knowledge Clustered Group researched and developed the information analysis system WISDOM as a research result of the second medium-term plan. WISDOM has functions that

More information

Trial for Value Quantification from Exceptional Utterances 37-066593 1 5 1.1.................................. 5 1.2................................ 8 2 9 2.1.............................. 9 2.1.1.........................

More information

2 251 Barrera, 1986; Barrera, e.g., Gottlieb, 1985 Wethington & Kessler 1986 r Cohen & Wills,

2 251 Barrera, 1986; Barrera, e.g., Gottlieb, 1985 Wethington & Kessler 1986 r Cohen & Wills, 2014 25 1 1 11 1 3,085 100 1 1988 e.g., 2000 3 e.g., 2005; 1999 100 1960 100 2012 2 6 23 1 98.2 1999 1999 3 65.3 1999 1996 1 21 e.g., 1999 3 1 2 251 Barrera, 1986; 1993 1 2 2001 3 2001 Barrera, 1981 1993

More information

28 Horizontal angle correction using straight line detection in an equirectangular image

28 Horizontal angle correction using straight line detection in an equirectangular image 28 Horizontal angle correction using straight line detection in an equirectangular image 1170283 2017 3 1 2 i Abstract Horizontal angle correction using straight line detection in an equirectangular image

More information

The Japanese Journal of Health Psychology, 29(S): (2017)

The Japanese Journal of Health Psychology, 29(S): (2017) Journal of Health Psychology Research 2017, Vol. 29, Special issue, 139 149Journal of Health Psychology Research 2016, J-STAGE Vol. Advance 29, Special publication issue, 139 149 date : 5 December, 2016

More information

11_寄稿論文_李_再校.mcd

11_寄稿論文_李_再校.mcd 148 2011.4 1 4 Alderson 1996, Chapelle 2001, Huston 2002, Barker 2004, Rimmer 2006, Chodorow et al. 2010 He & Dai 2006 2 3 4 2 5 4 1. 2. 3. 1 2 (1) 3 90 (2) 80 1964 Brown 80 90 British National Corpus

More information

Study on Throw Accuracy for Baseball Pitching Machine with Roller (Study of Seam of Ball and Roller) Shinobu SAKAI*5, Juhachi ODA, Kengo KAWATA and Yu

Study on Throw Accuracy for Baseball Pitching Machine with Roller (Study of Seam of Ball and Roller) Shinobu SAKAI*5, Juhachi ODA, Kengo KAWATA and Yu Study on Throw Accuracy for Baseball Pitching Machine with Roller (Study of Seam of Ball and Roller) Shinobu SAKAI*5, Juhachi ODA, Kengo KAWATA and Yuichiro KITAGAWA Department of Human and Mechanical

More information

Fig. 1 Relative delay coding.

Fig. 1 Relative delay coding. An Architecture of Small-scaled Neuro-hardware Using Probabilistically-coded Pulse Neurons Takeshi Kawashima, Non-member (DENSO CORPORATION), Akio Ishiguro, Member (Nagoya University), Shigeru Okuma, Member

More information

@08470030ヨコ/篠塚・窪田 221号

@08470030ヨコ/篠塚・窪田 221号 Abstract Among three distinctive types of Japanese writing systems Kanji, Hiragana and Katakana, a behavioral experiment using 97 university students as subjects implies that Katakana is regarded as most

More information

Vol. 43 No. 2 Feb. 2002,, MIDI A Probabilistic-model-based Quantization Method for Estimating the Position of Onset Time in a Score Masatoshi Hamanaka

Vol. 43 No. 2 Feb. 2002,, MIDI A Probabilistic-model-based Quantization Method for Estimating the Position of Onset Time in a Score Masatoshi Hamanaka Vol. 43 No. 2 Feb. 2002,, MIDI A Probabilistic-model-based Quantization Method for Estimating the Position of Onset Time in a Score Masatoshi Hamanaka, Masataka Goto,, Hideki Asoh and Nobuyuki Otsu, This

More information