
1 ( :51 ) 1/ 71 GCOE WinBUGS [email protected]

2 12/1 ( ) GLM, R MCMC, WinBUGS 12/2 ( ) WinBUGS WinBUGS 12/2 ( ) : 12/3 ( ) :? ( :51 ) 2/ 71

3 ( :51 ) 3/ 71 ( ) random effects BUGS

4 ( :51 ) 4/ 71

5 GLM (1): which distribution fits which kind of response? Count data y = 0, 1, 2, 3, ... with no upper bound: Poisson (family = poisson). Binary data y = {0, 1} or bounded counts y = {0, 1, 2, ..., N}: binomial (family = binomial). Continuous positive data: Gamma (family = Gamma). Continuous data: normal (family = gaussian).

6 Random-number generators in R and the matching glm() calls: rbinom() -- glm(family = binomial) (binary response); rbinom() -- glm(family = binomial) (N-trial binomial); rpois() -- glm(family = poisson); rnbinom() -- glm.nb() (negative binomial); rgamma() -- glm(family = Gamma); rnorm() -- glm(family = gaussian). glm() is built into R; glm.nb() is in the MASS library. These are the GLMs used below.
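A minimal R sketch of this correspondence for the Poisson pair; the covariate x and the coefficient values are invented for illustration:

set.seed(1)
x <- runif(100, -1, 1)                  # hypothetical covariate
lambda <- exp(0.5 + 1.2 * x)            # log link: log(lambda) = 0.5 + 1.2 x
y <- rpois(100, lambda)                 # simulate counts with rpois()
fit <- glm(y ~ x, family = poisson)     # fit the matching GLM
summary(fit)                            # estimates should be close to 0.5 and 1.2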

7 ( :51 ) 7/ 71 GLM (2)!! GLMM! GLMM/ random effects GLM

8 ( :51 ) 8/ 71 (Poisson distribution) (Binomial distribution) ( ) (Normal distribution, Gaussian ) (Gamma distribution)

9 ( :51 ) 9/ 71

10 ( :51 ) 10/ 71

11 ( :51 ) 11/ 71 : + WinBUGS 1. : GLMM 2.

12 1. : GLMM

13 Example data: each plant i has N_i = 8 ovules, of which y_i survive (e.g. y_i = 3 for the plant shown).

14 ( :51 ) 14/ 71 :?! ! y i

15 ( :51 ) 15/ 71 (overdispersion) y i 0.5 : overdispersion :?

16 ? : 1. fixed effects 2. random effects ( :51 ) 16/ 71

17 ( :51 ) 17/ 71 fixed random? fixed/random effects : 1. fixed effects : ( ) fixed effects 2. random effects : fixed effects ( ) random effects

18 Statistical model: for individual i with N_i ovules and y_i survivors, a binomial distribution

p(y_i \mid q_i) = \binom{N_i}{y_i} q_i^{y_i} (1 - q_i)^{N_i - y_i},

where q_i is the survival probability of individual i.

19 The survival probability is linked to a linear predictor through the logistic function: q_i = q(z_i), with q(z) = 1 / {1 + exp(-z)}, and z_i = a + b_i, where a is the overall mean (intercept) and b_i is the effect of individual i (the individual difference).
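A minimal R sketch of this link and the binomial likelihood; the values of a and b_i here are invented for illustration:

q <- function(z) 1 / (1 + exp(-z))     # logistic function q(z)
a <- 0.5                               # hypothetical overall mean
b1 <- -1.0                             # hypothetical individual effect b_i
q1 <- q(a + b1)                        # survival probability q_i = q(a + b_i)
dbinom(3, size = 8, prob = q1)         # p(y_i = 3 | q_i) with N_i = 8 ovules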

20 What about the individual effects b_i? Estimating a and {b_1, b_2, ..., b_100} freely, one b_i per individual, is not workable -- there are as many individual effects as data points -- so something else is needed.

21 Prior for the individual effects: each b_i is given a normal prior with standard deviation s,

p(b_i \mid s) = \frac{1}{\sqrt{2 \pi s^2}} \exp\!\left( -\frac{b_i^2}{2 s^2} \right),

shown for s = 1, 1.5, 3. The same prior is shared by all of {b_1, b_2, ..., b_100}.
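A minimal R sketch drawing this prior for the three values of s mentioned on the slide:

curve(dnorm(x, mean = 0, sd = 1), from = -6, to = 6, ylab = "p(b | s)")   # s = 1
curve(dnorm(x, mean = 0, sd = 1.5), add = TRUE, lty = 2)                  # s = 1.5
curve(dnorm(x, mean = 0, sd = 3), add = TRUE, lty = 3)                    # s = 3
legend("topright", legend = c("s = 1", "s = 1.5", "s = 3"), lty = 1:3)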

22 ( :51 ) 22/ 71 b i s = 1 = 1.5 = 3 b i s b i s b i y i

23 (C) : b i s s ( :51 ) 23/ 71 b i? (A) (B) (C)!? s (A) : b i (B) : b i ( -5 5 )

24 For an individual with y_i = 2, the posterior distribution of its effect b_i combines the likelihood and the prior: p(b_i \mid y_i = 2, s) \propto p(y_i = 2 \mid b_i) \, p(b_i \mid s), so the posterior of b_i depends on both the data and the value of s.

25 y i {2, 3, 5} b i s s s b i ( :51 ) 25/ 71

26 (diagram of the hierarchical model) Data: Y[i] survivors out of N[i] ovules, following a binomial distribution with survival probability q[i]; a is the overall mean, with a non-informative prior; b[i] is the individual difference of plant i, with a prior whose spread is controlled by tau (the variation among individuals); tau is a hyperparameter and gets a non-informative hyperprior.

27 (the same diagram as slide 26: data N[i], Y[i] with binomial survival probability q[i]; overall mean a with a non-informative prior; plant-level difference b[i] whose prior is governed by the hyperparameter tau; non-informative hyperprior on tau)

28 ? b i s = 0.1? s = 0.1 s individual posterior prior small large ( ) s ( :51 ) 28/ 71

29 The precision τ = 1/s^2 is used instead of s itself, and τ is given a vague (non-informative) gamma prior,

p(\tau) = \frac{\tau^{\alpha - 1} e^{-\tau/\beta}}{\Gamma(\alpha)\, \beta^{\alpha}}, \qquad \alpha = \beta = 10^{-4},

which is written as tau ~ dgamma(1.0E-4, 1.0E-4) in the BUGS model below.
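A minimal R sketch of how weak this prior is, using R's shape/rate parameterization as in the BUGS dgamma; the tau values checked are arbitrary:

tau <- c(0.01, 0.1, 1, 10, 100)
dgamma(tau, shape = 1e-4, rate = 1e-4)        # density of the vague gamma prior
dgamma(tau, shape = 1e-4, rate = 1e-4) * tau  # nearly constant: roughly uniform on log(tau)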

30 ( :51 ) 30/ 71 (1) τ ( 1; 1) ( 1; 100) τ τ

31 (2) a ( 0; 1) ( 0; 100) a (logit) a ( :51 ) 31/ 71

32 The joint posterior distribution of all unknowns is

p(a, \{b_i\}, \tau \mid Y) = \frac{\prod_{i=1}^{100} p(y_i \mid q(a+b_i)) \, p(a) \, p(b_i \mid \tau) \, h(\tau)}{\int\!\!\int\!\!\int \prod_{i=1}^{100} p(y_i \mid q(a+b_i)) \, p(a) \, p(b_i \mid \tau) \, h(\tau) \; db_i \, d\tau \, da},

that is,

p(a, \{b_i\}, \tau \mid Y) \propto \prod_{i=1}^{100} p(y_i \mid q(a+b_i)) \, p(a) \, p(b_i \mid \tau) \, h(\tau).

Likelihood: \prod_{i=1}^{100} p(y_i \mid q(a+b_i)). Priors and hyperprior: p(a), p(b_i \mid \tau), h(\tau).

33 ( :51 ) 33/ 71 b i s individual small hyperprior posterior prior (posterior) large hyperparameter s MCMC

34 How can we obtain the posterior p(a, \{b_i\}, \tau \mid Y) \propto \prod_{i=1}^{100} p(y_i \mid q(a+b_i)) \, p(a) \, p(b_i \mid \tau) \, h(\tau)? It cannot be computed analytically, but samples can be drawn from it with Markov chain Monte Carlo (MCMC) -- which is what WinBUGS does.

35 Gibbs sampling: each parameter is drawn in turn from its full conditional distribution,

p(a \mid \ldots) \propto \prod_{i=1}^{100} p(y_i \mid q(a + b_i)) \, p(a)

p(\tau \mid \ldots) \propto \prod_{i=1}^{100} p(b_i \mid \tau) \, h(\tau)

p(b_1 \mid \ldots) \propto p(y_1 \mid q(a + b_1)) \, p(b_1 \mid \tau)

p(b_2 \mid \ldots) \propto p(y_2 \mid q(a + b_2)) \, p(b_2 \mid \tau)

\vdots

p(b_{100} \mid \ldots) \propto p(y_{100} \mid q(a + b_{100})) \, p(b_{100} \mid \tau)

36 ( :51 ) 36/

37 ( :51 ) 37/ 71 :

38 (the hierarchical-model diagram once more: data N[i], Y[i]; binomial survival probability q[i]; overall mean a with a non-informative prior; individual difference b[i] with a prior controlled by the hyperparameter tau; non-informative hyperprior on tau) The joint posterior of all of these quantities is obtained by sampling: Markov Chain Monte Carlo (MCMC).

39 ( :51 ) 39/ 71

40 ( :51 ) 40/ 71 WinBUGS

41 Steps for fitting the model with WinBUGS: write the model in the BUGS language (model.bug.txt); write an R script that uses R2WBwrapper (runbugs.r); run it from R with source("runbugs.r"); then examine the returned bugs object.

42 How is this model written for WinBUGS? (diagram: data N[i], Y[i]; binomial survival probability q[i]; overall mean a with a non-informative prior; individual difference b[i] with prior governed by the hyperparameter tau; non-informative hyperprior) The distribution to express is p(a, \{b_i\}, \tau \mid Y) \propto \prod_{i=1}^{100} p(y_i \mid q(a+b_i)) \, p(a) \, p(b_i \mid \tau) \, h(\tau).

43 The model written in the BUGS language (model.bug.txt):

model {
  for (i in 1:N.sample) {
    Y[i] ~ dbin(q[i], N[i])       # binomial: Y[i] survivors out of N[i] ovules
    logit(q[i]) <- a + b[i]       # logit link for the survival probability q[i]
  }
  a ~ dnorm(0, 1.0E-4)            # non-informative prior for the overall mean a
  for (i in 1:N.sample) {
    b[i] ~ dnorm(0, tau)          # individual effect, normal with precision tau
  }
  tau ~ dgamma(1.0E-4, 1.0E-4)    # non-informative hyperprior for tau
  sigma <- sqrt(1 / tau)          # convert the precision tau to an SD
}

44 ( :51 ) 44/ 71 (hierarchical) random effects (non-informative) fixed effects (subjective) ( )

45 In BUGS, each quantity in the model is a node, defined in one of two ways: 1. with ~, a stochastic node (drawn from a distribution); 2. with <-, a deterministic node (computed from other nodes).

46 The R script using R2WBwrapper (runbugs.r), part 1:

source("R2WBwrapper.R")         # load the R2WBwrapper helper functions
d <- read.csv("data.csv")       # read the data
clear.data.param()              # clear previously set data and parameters
set.data("N.sample", nrow(d))   # number of individuals
set.data("N", d$N)              # number of ovules per individual
set.data("Y", d$Y)              # number of survivors per individual

47 The R script using R2WBwrapper (runbugs.r), part 2:

set.param("a", 0)                    # parameter a, initial value 0
set.param("sigma", NA)               # monitor sigma (derived node, no initial value)
set.param("b", rep(0, N.sample))     # individual effects, initial values 0
set.param("tau", 1, save = FALSE)    # tau: initial value 1, samples not saved
set.param("p", NA)                   # additional monitored node
post.bugs <- call.bugs(              # run WinBUGS
  file = "model.bug.txt",
  n.iter = 2000, n.burnin = 1000, n.thin = 5
)

48 What call.bugs() asks WinBUGS to do: post.bugs <- call.bugs(file = "model.bug.txt", n.iter = 2000, n.burnin = 1000, n.thin = 5). By default three MCMC chains are run (n.chains = 3); each chain runs for 2000 steps (n.iter = 2000), the first 1000 steps are discarded as burn-in (n.burnin = 1000), and after that only every 5th step is kept (n.thin = 5).
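A quick check of how many posterior samples these settings keep (the arithmetic follows directly from the settings above):

n.chains <- 3; n.iter <- 2000; n.burnin <- 1000; n.thin <- 5
per.chain <- (n.iter - n.burnin) / n.thin   # 200 samples kept per chain
per.chain * n.chains                        # 600 posterior samples in total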

49 What happens when the script runs? Typing source("runbugs.r") in R launches WinBUGS, which carries out the MCMC sampling; when WinBUGS finishes, the results are returned to R as the post.bugs object.

50 ( :51 ) 50/ 71 R a a?

51 Examining the bugs object post.bugs (1): plot(post.bugs) displays, for each parameter, the R-hat convergence diagnostic (Gelman-Rubin statistic),

\hat{R} = \sqrt{\frac{\widehat{\mathrm{var}}^{+}(\psi \mid y)}{W}}, \qquad \widehat{\mathrm{var}}^{+}(\psi \mid y) = \frac{n-1}{n} W + \frac{1}{n} B,

where W is the within-chain variance and B is the between-chain variance (Gelman et al., Bayesian Data Analysis, Chapman & Hall/CRC).
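R-hat can also be computed directly with the coda package; a minimal sketch, assuming post.bugs from the call.bugs() run above and the to.list() converter shown on a later slide:

library(coda)
post.list <- to.list(post.bugs)               # mcmc.list, one element per chain
gelman.diag(post.list, multivariate = FALSE)  # Gelman-Rubin R-hat for each monitored parameter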

52 (output of plot(post.bugs) for the model fit with WinBUGS, 3 chains: for each monitored parameter -- a, sigma, b[1], b[2], ..., tau, q[1], q[2], ..., and deviance -- the plot shows the 80% interval for each chain together with its R-hat value, plus medians and 80% intervals.)

53 Examining the bugs object post.bugs (2): print(post.bugs, digits.summary = 3) prints a table with one row per parameter (a, sigma, b[1], b[2], ..., b[10], ...) and columns mean, sd, 2.5%, 25%, 50%, 75%, 97.5%, Rhat, and n.eff; the 2.5% and 97.5% columns give a 95% credible interval.
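The same table can be accessed programmatically; a sketch assuming post.bugs is a bugs object of the kind returned by R2WinBUGS, which stores the table in its summary component:

post.bugs$summary["a", ]                         # summary row for the overall mean a
post.bugs$summary["sigma", c("2.5%", "97.5%")]   # 95% credible interval for sigma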

54 Converting to an mcmc.list: post.list <- to.list(post.bugs); then plot(post.list[, 1:4, ], smooth = F) draws trace plots and density plots for the first few monitored parameters.

55 ( :51 ) 55/ 71

56 Converting to an mcmc object: post.mcmc <- to.mcmc(post.bugs). The result can be handled like a matrix of posterior samples (rows are MCMC samples, columns are parameters):
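A minimal sketch of working with this matrix; the parameter names "a" and "sigma" follow the BUGS model above:

post.mcmc <- to.mcmc(post.bugs)                        # matrix-like object of posterior samples
mean(post.mcmc[, "a"])                                  # posterior mean of the overall mean a
quantile(post.mcmc[, "sigma"], c(0.025, 0.5, 0.975))    # median and 95% interval for sigma
hist(post.mcmc[, "a"])                                  # posterior distribution of a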

57 2.

58 ( :51 ) 58/ 71 : abundance ( ) location :

59 ( :51 ) 59/ 71 : abundance location ( 95% )

60 Model for the spatial count data: for site i with mean abundance \lambda_i, y_i \sim \mathrm{Poisson}(\lambda_i), with \log \lambda_i = \beta + r_i, i.e. (overall mean) + (local effect of site i). Prior for \beta: \beta \sim \mathrm{Normal}(0, 10^2).

61 The local effects r_i are given a Conditional Autoregressive (CAR) prior. Writing J_i for the set of neighbours of site i and N_i for their number,

r_i \mid \{r_j : j \in J_i\} \sim \mathrm{Normal}\!\left( \frac{1}{N_i}\sum_{j \in J_i} r_j, \; \frac{\sigma^2}{N_i} \right),

so each local effect is pulled toward the average of its neighbours, with variance \sigma^2 / N_i. The precision \tau = 1/\sigma^2 has a Gamma(1.0E-2, 1.0E-2) hyperprior (the dgamma in the BUGS code below). The posterior is

p(\beta, \{r_i\}, \tau \mid \{y_i\}) = \frac{p(\{y_i\} \mid \beta, \{r_i\}, \tau) \times (\text{priors})}{\int \cdots \int (\text{numerator}) \; d\beta \, dr_1 \cdots dr_{50} \, d\tau}.

62 (figure: estimated abundance plotted against location for three different values of tau, i.e. of σ, showing how tau controls how smoothly the local effects vary across neighbouring sites)

63 The spatial model in the BUGS language:

model {
  for (i in 1:N.site) {
    Y[i] ~ dpois(mean[i])          # Poisson likelihood for the count at site i
    log(mean[i]) <- beta + re[i]   # (overall mean) + (local effect)
  }
  # CAR prior for the local effects re[i]
  re[1:N.site] ~ car.normal(adj[], Weights[], Num[], tau)
  beta ~ dnorm(0, 1.0E-2)          # prior for the overall mean
  tau ~ dgamma(1.0E-2, 1.0E-2)     # hyperprior for the precision tau
}
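car.normal() needs the adjacency structure as data. A minimal R sketch of building adj, Weights, and Num, assuming the 50 sites lie along a one-dimensional transect so that each site's neighbours are simply the adjacent sites (this layout is an assumption for illustration):

N.site <- 50
# neighbours of site i: the sites immediately before and after it on the transect
neighbours <- lapply(1:N.site, function(i) intersect(c(i - 1, i + 1), 1:N.site))
Num <- sapply(neighbours, length)     # number of neighbours of each site
adj <- unlist(neighbours)             # neighbour indices, concatenated site by site
Weights <- rep(1, length(adj))        # weight 1 for every neighbour pair
# these vectors are then passed to WinBUGS as data, e.g. via set.data() as in runbugs.r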

64 ( :51 ) 64/ 71 abundance β beta τ location tau ( 95% )

65 ( :51 ) 65/ 71 abundance location GLMM OK?

66 ( :51 ) 66/ 71 vs abundance abundance location location?

67 ( :51 ) 67/ 71 ( ):?!! abundance location (

68 ( :51 ) 68/ 71 abundance abundance location location!

69 ( :51 ) 69/ 71 abundance abundance location location CAR

70 ( :51 ) 70/ 71 : abundance abundance location location

71 ( :51 ) 71/ 71 : 1. : GLMM 2.
