Mathematical Statistics I: Lecture Notes


Mathematical Statistics I, ver. 1.0 (Apr. 2018)

These notes treat inferential statistics, as opposed to the descriptive statistics of a first course.

References
[1] …, 2004.
[2] …, 1973.
[3] … (Wonderful R series), 2016.
[4] …, 1991.
[5] …, 1992.
[6] …, 1989.
[7] …, 1996.

The lecture materials are distributed as web PDF files. Statistical software includes SAS, SPSS, and R; for R, see [3].

Notation
- Aᶜ : the complement of the event A (some texts write Ā)
- |A|, n(A) : the number of elements of A
- P(A) : the probability of the event A
- p_X(x) : the probability (density) function of X (Definition 4.7)
- P(A|B) : the conditional probability of A given B (Section 2.1; some texts write P_B(A))
- E[X], V[X] : the expectation and variance of X (Definitions 4.11, 4.15)
- µ = E[X], σ = √V[X], σ² = V[X]
- N(µ, σ²) : the normal distribution with mean µ and variance σ² (Section A.3); note that in N(1, 4) the 4 is the variance, not the standard deviation
- i.i.d. : independent and identically distributed (5.8)
- X̄_n : the sample mean of X₁, X₂, ..., X_n
- u_n² : the unbiased sample variance

Contents

1 Probability 5
 1.1 … 5
 1.2 … 6
 1.3 … 7
2 Conditional probability 18
 2.1 Conditional probability 18
 2.2 Independence 20
 2.3 Conditional independence 24
 2.4 The Borel-Cantelli lemma 25
3 Bayes' theorem 27
 3.1 Bayes' formula 27
 3.2 Prior and posterior probabilities 31
 3.3 Bayesian spam filtering 32
4 Random variables 35
 4.1 Random variables 35
 4.2 Events defined by random variables 36
 4.3 Distribution functions 36
 4.4 Expectation and variance 39
5 Independent random variables 48
 5.1 Independence 48
 5.2 … 50
 5.3 … 51
 5.4 … 52
6 Estimation 56
 6.1 … 56
 6.2 i.i.d. 58
 6.3 Parameters 58
 6.4 Estimators 59
 6.5 Unbiasedness 60
 6.6 Consistency 61
 6.7 Efficiency 62
 6.8 … 63
7 … 66
 7.1 … 67
 7.2 … 67
 7.3 … 69
 7.4 … 70
 7.5 … 71
 7.6 … 72
8 Hypothesis testing 73
 8.0 … 73
 8.1 … 73
 8.2 … 75
 8.3 … 81
 8.4 … 86
 8.5 χ² tests 88
 8.6 … 93
 8.7 … 93
 8.8 … 93
 8.9 … 93
A Common distributions 94
 A.1 Bernoulli distribution 94
 A.2 Binomial distribution 94
 A.3 Normal distribution 97
 A.4 Poisson distribution 102
 A.5 Geometric distribution 107
 A.6 Exponential distribution 110
 A.7 Hypergeometric distribution 111
B The χ², t, and F distributions 113
 B.1 The χ² distribution 113
 B.2 The t distribution 115
 B.3 The F distribution 117
 B.4 … 118
C … 120

5 *6. 6 2 6 6 (Laplace). ( ). P (A) = A Ω () 6 6 *7 (2) = 0 [0, ] = {x R 0 x } a [0, ] a *6 I A B *7

6 + = 0 [0, ] [0, 2 ] 0 [0, 2 ] a 0 0 0 [0, ] [0, 2 ] 2?.2 (Kolmogorov).2 ( (sample space)). Ω (sample space).3. Ω = {, } 0 Ω = {0, } Ω = {,, }, Ω = {,,, } 2 2 Ω = {(0, 0), (0, ), (, 0), (, )} Ω = {, 2, 3, 4, 5, 6} Ω = {, } 4 2 20 Ω = {, 2, 3, 4}, Ω = {, 2,, 2}, Ω = {, 2,, 20} n Ω = {0,,..., n} Ω = {0,, 2,... } Ω = [0, ) = {x R 0 x < + }. Ω = {(a, a 2, a 3,, a n, ) a n = 0 or } 0, 4 Ω Ω

.2 7.4 ( (event)). Ω A Ω (event) ω Ω {ω} Ω Ω Ω Ω 0 A = A, A = 0.5. Ω = {0, } {} Ω {0} Ω {0, } = Ω Ω = 2 {(0, )} 2 {(, 0), (, )} A.6. Ω = {, 2, 3, 4, 5, 6} () (2) 3 (3) () {2, 4, 6} (2) {, 2, 3, 5} (3) {2, 3, 5}.7. Ω A, B Ω A B A B (intersection) A B A B A B A B (union) A B A B A B A B A A (complement) A A.8. A B = A B (disjoint / mutually exclusive) A B

For events A, B, the intersection A ∩ B is the event "both A and B occur" and the union A ∪ B is the event "A or B occurs (possibly both)". Note the inclusions A ∩ B ⊂ A ⊂ A ∪ B and A ∩ B ⊂ B ⊂ A ∪ B.

Definition 1.9 (countable unions and intersections): For a sequence of events A₁, A₂, ..., the event that at least one of the A_i occurs is ∪_{i=1}^∞ A_i, and the event that all of them occur is ∩_{i=1}^∞ A_i.

Example 1.10: In Example 1.6(2), "odd" = {1, 3, 5} and "at most 3" = {1, 2, 3}, so "odd or at most 3" = {1, 3, 5} ∪ {1, 2, 3} = {1, 2, 3, 5}.

A complement is always taken inside the sample space Ω: for one roll of a six-sided die (Ω = {1, 2, 3, 4, 5, 6}) the complement of A = {1} is Aᶜ = {2, 3, 4, 5, 6}, i.e. "at least 2", while on Ω = {1, 2, 3, 4} the complement of {1} is {2, 3, 4}.

Theorem 1.11 (de Morgan's laws):
(1) (A ∪ B)ᶜ = Aᶜ ∩ Bᶜ: "neither A nor B occurs" means "not A and not B".
(2) (A ∩ B)ᶜ = Aᶜ ∪ Bᶜ: "not both A and B" means "not A or not B".

Remark 1.12: The same holds for sequences {A_i}: (∪_i A_i)ᶜ = ∩_i A_iᶜ and (∩_i A_i)ᶜ = ∪_i A_iᶜ.

.2 9.3 ( ). A i i i A i ( i A i) = i A i i *9.4 ( (probability measure)). A Ω P (A) P Ω (probability measure) P (A) A Ω P (Ω, P ) (probability space) () A Ω 0 P (A). (2) P (Ω) =. (3) A, A 2, (i j = A i A j = ) (.) ( ) P A n = P (A n ) n= n= (), (2) (3) (3) (3) A, B Ω P (A B) = P (A) + P (B) (3) A, B, C Ω P (A B C) = P (A) + P (B) + P (C) (3), (3) * 0 * (3).5 ( ). () P (A) = P (A). P ( ) = 0. (2) A B = P (B A) = P (B) P (A). (3) A B = P (A) P (B). A 0 P (A). (4) P (A B) = P (A)+P (B) P (A B), P (A B) = P (A)+P (B) P (A B)..23 *9 *0.3 (3) * P (Ω) =

0. A B = A B A B () A A =, Ω = A A = P (Ω) = P (A A) = P (A) + P (A) (2),(3) A A =, Ω = A A A (B A) =. A B B = A (B A) P (B) = P (A (B A)) = P (A) + P (B A) P (A) ( 0 P (B A)). A Ω P (A) P (Ω) =. (4) P (A B) = P (A (B (A B))) = P (A) + P (B (A B)) = P (A) + P (B) P (A B). (A B) B (3).6. P (A B) = P (A) + P (B) P (A B) P (A) + P (B) A, B P (A B) = P (A) + P (B) (3) (.) (subadditivity) (.2) ( ) P A n P (A n ) n= n=..7. A A Ω ( Ω < ) A Ω P (A) = A Ω.4 A B = = (A B) = A + B Ω.8. 2 () Ω (2) (3) 6 (4) (2) (3) () Ω = {(, ), (, 2), (, 3), (, 4), (, 5), (, 6), (2, ), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6), (3, ), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6), (4, ), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6), (5, ), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6), (6, ), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6)} 36 Ω = {(i, j) i, j 6}

.2 i, j 6 i 6, j 6 (2), (3) A, 6 B A = {(, ), (2, 2), (3, 3), (4, 4), (5, 5), (6, 6)} B = {(, ), (, 2), (2, ), (, 3), (2, 2), (3, ), (, 4), (2, 3), (3, 2), (4, ) (, 5), (2, 4), (3, 3), (4, 2), (5, )} A = 6, B = 5 P (A) = 6 36 = 5, P (B) = 6 36 = 5 2. (4) A B = {(, ), (2, 2), (3, 3)} P (A B) = P (A)+P (B) P (A B) = 6 36 + 5 36 3 36 = 2..9. A, B A B A B A B P (A), P (B), P (A B).20. P (A) = 2, P (B) = 3, P (A B) = 4 () A B (2) A B (3) A B (4) A B (3), (4) () P (A B) = P (A B) = P (A B) = 4 = 3 4 (2) P (A B) = P (A B) = P (A B) = {P (A)+P (B) P (A B)} = 2 3 + 4 = 5 2 (3) P (A B) = P (B (A B)) = P (B) P (A B) = 2 (4) P (A B) = P (A (A B)) = P (A) + P (A B) = 3 4.2. N = N 365 N 365 P N 365 P N 365 N. 365 P N. N N N = 23 365N 50% N = 35 80% N = 57 99% ( ) N 364 365
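The shared-birthday probability in the remark above follows from P(no shared birthday) = ∏_{k=0}^{N−1} (365 − k)/365. A minimal sketch in plain Python (the function name `p_shared` is mine), reproducing the thresholds quoted in the notes:

```python
def p_shared(n: int) -> float:
    """Probability that at least two of n people share a birthday
    (365 equally likely birthdays, ignoring leap days)."""
    p_distinct = 1.0
    for k in range(n):
        p_distinct *= (365 - k) / 365
    return 1 - p_distinct

# N = 23 -> just over 50%, N = 35 -> over 80%, N = 57 -> over 99%,
# and N = 41 is the smallest N exceeding 90%.
for n in (23, 35, 41, 57):
    print(n, round(p_shared(n), 4))
```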

[Figure: the probability P(A) that at least two of N people share a birthday, plotted against N.]

Exercise 1.22: How many people are needed for the probability of a shared birthday to exceed 90%? (Answer: N = 41.)

By 1.15(4), the union formula extends to three events.

Proposition 1.23: For events A, B, C,
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(C ∩ A) + P(A ∩ B ∩ C).

Proof: Apply P(A ∪ B) = P(A) + P(B) − P(A ∩ B) with B replaced by B ∪ C:
P(A ∪ B ∪ C) = P(A) + P(B ∪ C) − P(A ∩ (B ∪ C))
= P(A) + P(B) + P(C) − P(B ∩ C) − P(A ∩ (B ∪ C)).
Since A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C) and (A ∩ B) ∩ (A ∩ C) = A ∩ B ∩ C,
P(A ∩ (B ∪ C)) = P(A ∩ B) + P(A ∩ C) − P(A ∩ B ∩ C).
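The three-event inclusion-exclusion formula above can be checked mechanically on any finite sample space using Python sets and exact rational arithmetic. A small sketch (two dice with uniform probability; the three events are my own illustrative choices):

```python
from fractions import Fraction
from itertools import product

omega = set(product(range(1, 7), repeat=2))      # two dice, 36 outcomes

def P(event):
    """Uniform probability on omega, as an exact fraction."""
    return Fraction(len(event & omega), len(omega))

A = {w for w in omega if w[0] == w[1]}           # doubles
B = {w for w in omega if sum(w) <= 6}            # total at most 6
C = {w for w in omega if w[0] == 1}              # first die shows 1

lhs = P(A | B | C)
rhs = (P(A) + P(B) + P(C)
       - P(A & B) - P(B & C) - P(C & A)
       + P(A & B & C))
print(lhs == rhs)  # True
```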

.2 3 n = k = i n = i = P (A ) = ( ) P (A ) = P (A ) n = 2 k = i n = 2 i =, 2 ( ) 0 (P (A ) + P (A 2 )). k = 2 i < i 2 n = 2 (i, i 2 ) = (, 2) ( ) 2 P (A A 2 ) = P (A A 2 ) n = 3 k = 2 i < i 2 n = 3 (i, i 2 ) = (, 2), (, 3), (2, 3) 3 ( ) 2 (P (A A 2 ) + P (A A 3 ) + P (A 2 A 3 )) k = 3 P (A A 2 A 3 ) n = 4 P (A A 2 A 3 A 4) =P (A ) + P (A 2) + P (A 3) + P (A 4) (P (A A 2) + P (A A 3) + P (A A 4) + P (A 2 A 3) + P (A 2 A 4) + P (A 3 A 4)) + (P (A A 2 A 3) + P (A A 2 A 4) + P (A A 3 A 4) + P (A 2 A 3 A 4)) P (A A 2 A 3 A 4).25 (H27 ). 4 4 A, 2 3 B, 3 4 C A, B, C P (A B C) = P (A)+P (B)+P (C) P (A B) P (B C) P (C A)+P (A B C) P (A) = 6 6 6 6 6 6 = 36, P (B) = 3 6 3 6 6 6 6 6 = 4, P (C) = 6 6 6 6 3 6 3 6 = 4 P (A B) = 6 3 6 6 6 6 =, P (B C) = 72 3 6 3 6 3 6 3 6 = 6, P (C A) = 6 6 6 3 6 6 = 72, P (A B C) = 6 3 6 3 6 6 = 44 P (A B C) = 4 9..26 (H24 ). 20 6 2 2 3 3 4 4 5 5 6 5 2 2 3 6 5 2 k A k k 3 A 3, A 6, A 9, A 2 P (A 3 A 6 A 9 A 2 ). A 3 = {(, 2), (2, )}, A 6 = {(, 5), (2, 4), (3, 3), (4, 2), (5, )}, A 9 = {(3, 6), (4, 5), (5, 4), (6, 3)}, A 2 = {(6, 6)} P (A 3 ) = 4 ( 2 + 2 ) = 400 400 P (A 6 ) = 35 ( 5 + 2 4 + 3 3 + 4 2 + 5 ) = 400 400 P (A 9 ) = 70 (3 5 + 4 5 + 5 4 + 5 3) = 400 400 P (A 2 ) = 25 (5 5) = 400 400

4 ( 4 P (A 3 A 6 A 9 A 2 ) = 400 + 35 400 + 70 400 + 25 ) 400 = 33 200.27 (H26 ). A, B, C 3 (ABCABC ) 2 2 7 B 7 (, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, ) 6 6 36 = 6. B 5 6 ( ) 4 5 6, 2 n 6 6 ( ) 3(n )+ 5 6 6 n= 6 ( ) 3(n )+ 5 = 5 6 6 6 n= ( ) 3(n ) 5 6 = 5 6 6 (5/6) 3 = 30 9 * 2.28 ([2] ). K 0.5, 0.2 n p n lim n n n p n n k= p n = 0.5( p n ) + }{{} 0.2p n }{{} = 0.5 0.3p n p n 5/3 = 0.3(p n 5/3) p n = p k ( 3 ) n ( p 5 ) + 5 0 3 3 5 3 (n ) *2 (Markov)
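The recursion in the example above (as I read the garbled text: p_n = 0.5(1 − p_{n−1}) + 0.2 p_{n−1} = 0.5 − 0.3 p_{n−1}, with fixed point 5/13) can be iterated numerically; the Cesàro average (1/n) Σ p_k converges to the same limit:

```python
def iterate(p0: float, n: int) -> list:
    """Iterate p_k = 0.5*(1 - p_{k-1}) + 0.2*p_{k-1} = 0.5 - 0.3*p_{k-1}."""
    ps = [p0]
    for _ in range(n):
        ps.append(0.5 - 0.3 * ps[-1])
    return ps

ps = iterate(p0=0.0, n=50)
limit = 5 / 13                      # fixed point: p = 0.5 - 0.3*p
cesaro = sum(ps) / len(ps)          # running average, same limit
print(ps[-1], limit, cesaro)
```

The error p_n − 5/13 shrinks by a factor −0.3 per step, so convergence is geometric.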

.2 5 lim n n n k= p k = 5. 5 3.29. {a n } lim n a n = α lim n n n a k = α k=. a n = α (n =, 2,... ) n n a k = nα = α n α ε-δ ε-δ ε lim n a n n a n α N n > N = a n α n k= N n n a k = n k= N k= n a k }{{} + n + n n k=n+ a k } {{ } a k α n k=n+ α }{{} N n = n k= + n N α α (n ) n.30. A 0.7 0.3 0.2 0.8 a n, b n { a n = 0.7a n + 0.8b n ( ) b n = 0.3a n + 0.2b n lim a n, lim b n n n ( ) ( ) ( ) an 0.7 0.8 an = b n 0.3 0.2 b n }{{}}{{} p n A

6 p n = Ap n A p n = Ap n = A 2 p n 2 = = A n p 0 lim p n = lim n n An p 0 lim n An A n λ = λ < 0 A n = ( ) 8 + 3λ n 8 8λ n 3 3λ n 3 + 8λ n ( ) 8 8 (n ) 3 3 (n = 0) a 0 + b 0 = lim p n = n ( 8 8 3 3 ) ( ) a0 = b 0 ( ) 8 3 25% 4 ( ) a n = a n = α, b n = b n = β α + β = α = 8/, β = 3/ 3.3. ABC n A, B, C a n, b n, c n n A, B, C a 0, b 0, c 0 ( a 0 +b 0 +c 0 =.) n A 2 3 B C 3 B 2 A 2 C C 2 3 A 3 B a n 0 /2 2/3 a n p n = b n = 2/3 0 /3 b n = Ap n. c n /3 /2 0 c n A n A, 3 ± i 6 3 ± i 6 < n 5/4 5/4 5/4 lim n An = 4/4 4/4 4/4 2/4 2/4 2/4 a 0 + b 0 + c 0 = 5/4 5/4 5/4 lim n = lim n n An p 0 = 4/4 4/4 4/4 2/4 2/4 2/4 a 0 b 0 c 0 = 4 5 4 2
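The limit of Aⁿ in the two-state example above can be computed by repeated matrix multiplication. A minimal pure-Python sketch, assuming the transition matrix as displayed ([[0.7, 0.8], [0.3, 0.2]]); both columns converge to the stationary vector (8/11, 3/11) quoted in the text:

```python
def matmul(X, Y):
    """2x2 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Transition matrix of the example.
A = [[0.7, 0.8],
     [0.3, 0.2]]

An = [[1.0, 0.0], [0.0, 1.0]]       # A^0 = identity
for _ in range(60):
    An = matmul(An, A)

print(An)  # both columns close to (8/11, 3/11)
```

The second eigenvalue is −0.1, so Aⁿ converges at rate 0.1ⁿ.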

.3 7.3 (3).32 ( ). ( () A A 2 A 3 lim P (A n) = P A n ). n ( n= (2) A A 2 A 3 lim P (A n) = P A n ). n ( n= ( )) (3) A = lim A n lim P (A n) = P (A) = P lim A n. n n n A, A 2,... lim n A n lim sup A n = n n= k=n A k, lim inf n A n = (limit inferior) n= k=n A k (limit superior) lim sup A n = lim inf A n A n lim A n n n n = n A n : B n = B n = A n, A n+, A n+2,... lim sup A n = n = n A n A k k=n B n B n n=.33. A n = n ( P ) lim sup A n =, P n 2.26 ( ) lim inf A n = 0 n

2 Conditional probability

Throughout this chapter, (Ω, P) is a probability space.

2.1 Conditional probability

Definition 2.1 (conditional probability): For events A, B ⊂ Ω with P(B) ≠ 0, the conditional probability of A given B is

P(A|B) = P(A ∩ B) / P(B).  (2.1)

P(A|B) is the probability of A reassessed under the information that B has occurred.

Proposition 2.2: If P(E) > 0, then P_E(A) := P(A|E) defines a probability measure P_E; that is, P_E satisfies (1)–(3) of Definition 1.14.
Proof: (1) P_E(A) = P(A ∩ E)/P(E) ≥ 0, and P(A ∩ E) ≤ P(E) gives P_E(A) ≤ 1. (2) P_E(Ω) = P(Ω ∩ E)/P(E) = P(E)/P(E) = 1. (3) If A ∩ B = ∅, then (A ∩ E) ∩ (B ∩ E) = ∅ and (A ∪ B) ∩ E = (A ∩ E) ∪ (B ∩ E), so
P_E(A ∪ B) = P((A ∪ B) ∩ E)/P(E) = (P(A ∩ E) + P(B ∩ E))/P(E) = P_E(A) + P_E(B).
Note also that P_Ω(A) = P(A ∩ Ω)/P(Ω) = P(A): conditioning on Ω changes nothing.

Example 2.3: Toss a coin twice; Ω = {(0, 0), (0, 1), (1, 0), (1, 1)} with P(A) = |A|/|Ω|. Let A = "both tosses are heads" = {(1, 1)} and B = "at least one toss is heads" = {(0, 1), (1, 0), (1, 1)}. Then
P(A|B) = P(A ∩ B)/P(B) = (1/4)/(3/4) = 1/3.

Example 2.4: In the same experiment, the conditional probability that both tosses are heads, given that the first toss is heads, is 1/2.
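The two-coin computations of Examples 2.3 and 2.4 can be verified by enumeration; a minimal sketch (the helper names `P` and `P_cond` are mine):

```python
from fractions import Fraction
from itertools import product

# Two coin tosses, 1 = heads, 0 = tails, all four outcomes equally likely.
omega = set(product((0, 1), repeat=2))

def P(event):
    return Fraction(len(event & omega), len(omega))

def P_cond(A, B):
    """P(A | B) = P(A & B) / P(B)."""
    return P(A & B) / P(B)

both     = {w for w in omega if w == (1, 1)}
at_least = {w for w in omega if 1 in w}
first    = {w for w in omega if w[0] == 1}

print(P_cond(both, at_least))  # 1/3
print(P_cond(both, first))     # 1/2
```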

2. (conditional probability) 9 2.5. 3 3 3 X ( ), Y ( ), Z ( ) Ω = {XR, XR2, Y W, Y W 2, ZR, ZW } 6 A = {XR, XR2, ZR}, P (A B) B = {XR, XR2} P (B A) = = 2/6 P (A) 3/6 = 2 3..20 2.6. P (A) = 2, P (B) = 3, P (A B) = 4 () P (A B) (2) P (A B) (3) P (B A) (4) P (A B) () P (A B) = P (A) + P (B) P (A B) = 7 2 /4 /3 = 3 P (A B) (3) P (B A) = = /4 4 P (A) /2 = 2 4 (2) P (A B) = P (A B) P (B) = (4) P (A B) = P (A B) = (2.) P (A B) = P (A B)P (B) A B A, B (A B = B A) P (A B) = P (B A) A, B P (B A) = P (B A)P (A) 2.7 ( ). P (A), P (B) 0 P (A B) = P (A B)P (B) = P (B A)P (A) P (A B) A B (joint probability) B A P (A B)P (B), A B P (B A)P (A) A, B A = (A B) (A B), (A B) (A B) = P (A) = P (A B) + P (A B) = P (A B)P (B) + P (A B)P (B) A B A B A

20 2 n 2.8. S, S 2,..., S n Ω = S i {S i } n i= Ω P (A) = i= n P (A S i )P (S i ) i= 2.9 ( ). N r () (2) A =, B = () P (A) = P (B) = r N. (2) P (A) = r N r P (A) = P (A) = N N. P (B) = P (B A)P (A) + P (B A)P (A) = r r N N + r N r r(n ) = N N N(N ) = r N P (A) = P (B) = r N (2) 2.5 2.0 ( ). 30%, 70% % θ, /2 P ( ) = θ, P ( ) = θ, P ( ) = P ( ) = /2 P ( ) = P ( )P ( ) + P ( )P ( ) = θ 2 + 2 2 = θ 2 + 4. θ = 2P ( ) /2 = /0 0%. θ < /2 θ < P ( ), θ > /2 P ( ) < θ 2.2 (independence)

2.2 (independence) 2 2. ( ). A, B Ω (independent) P (A B) = P (A)P (B) 2.2 ( ). A, B Ω () P (A B) = P (A). (2) P (B A) = P (B). (3) P (A B) = P (A B). (4) A, B () B A (2) A B (3) B A A B (4). (),(2): P (A B) = P (A)P (B) P (B) P (A B) = P (A B) = P (A). P (A B) = P (A) P (A B) = P (A B)P (B) = P (B) P (A)P (B) A B () A B (2) () (3): () P (A B) = P (A) P (A B) = P (A) = P (A B)P (B) + P (A B)P (B) P (A B)( P (B)) = P (A B)P (B) = P (A B)P (B) P (B) (3) (3) P (A) = P (A B)P (B) + P (A B)P (B) = P (A B)P (B) + P (A B)P (B) = P (A B)(P (B) + P (B)) = P (A B) () () (3) (4): P (A B) = P (A)P (B) P (A B) = P (A B) = P (A B) = (P (A) + P (B) P (A B)), P (A)P (B) = ( P (A))( P (B)) = P (A) P (B)+P (A)P (B) P (A B) = P (A)P (B) P (A B) = P (A)P (B) 2.3. Ω = {S( )A, S2, S3,, SK,, C( )K}, P (A) = A/ Ω A = {SA, HA, DA, CA} = H = {HA, H2,, HK} =

22 2 P (A H) = 52, P (A)P (H) = 4 52 3 52 = 52 P (A H) = P (A)P (H) A H 2.4. Ω = 53 P (A) = 4 3, P (H) =, P (A H) = 53 53 53 P (A H) P (A)P (H) A, H 2.5. 2.9 () A B (2) P (B A) = r N r N = P (B) 2.9 N r N r P (B) = P (B A) N A B i.i.d. 6. 2.6. a A, B P (A B) = P ( ) = 0 P (A)P (B) 0 2.3 A H a 2.7. P (A) = 3, P (A B) = 2 P (B) () A, B (2) A, B (3) P (A B) = 4 (4) P (B A) = 5 () P (A B) = P (A) + P (B) P (A B) = P (A) + P (B) P (A)P (B) P (B) P (B) = 4. (2) P (A B) = 0 P (B) = P (A B) P (A) = 6. (3) P (A B) = P (A B)P (B) = P (B). P (A B) = 4 P (A) + P (B) P (A B). P (B) P (B) = 2 9. (4) (3) P (B) = 7 30.
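The card examples above ("ace" and "heart" are independent in a 52-card deck, but not once a joker is added) can be checked by enumerating both decks; a sketch with exact fractions:

```python
from fractions import Fraction
from itertools import product

suits = "SHDC"
ranks = ["A"] + [str(n) for n in range(2, 11)] + ["J", "Q", "K"]
deck52 = [(s, r) for s, r in product(suits, ranks)]
deck53 = deck52 + [("JOKER", "JOKER")]          # add one joker

def independent(deck, A, H):
    """Check P(A and H) == P(A) * P(H) exactly on the given deck."""
    P = lambda E: Fraction(len([c for c in deck if E(c)]), len(deck))
    both = lambda c: A(c) and H(c)
    return P(both) == P(A) * P(H)

is_ace   = lambda c: c[1] == "A"
is_heart = lambda c: c[0] == "H"

print(independent(deck52, is_ace, is_heart))  # True:  1/52 == (4/52)(13/52)
print(independent(deck53, is_ace, is_heart))  # False: 1/53 != (4/53)(13/53)
```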

2.2 (independence) 23 2.8 ( ). n A, A 2,..., A n :, 2,..., n i < i 2 < < i k n P (A i A i2 A ik ) = P (A i )P (A i2 ) P (A ik ) 2.9. Ω = {, 2, 3, 4}, P (A) = A/ Ω A = {, 2}, B = {, 3}, C = {2, 3} P (A) = P (B) = P (C) = 2. P (A B) = P ({}) = 4 = 2 = P (A)P (B) 2 P (A C) = P ({2}) = 4 = 2 = P (A)P (C) 2 P (B C) = P ({3}) = 4 = 2 = P (B)P (C) 2 A B A C B C (independent pairwisely) P (A B C) = P ( ) = 0 P (A)P (B)P (C) = 2 2 2 = 8 P (A B C) P (A)P (B)P (C) A, B, C 2.20. P (A B) = P (A)P (B) P (A B C) = P (A)P (B)P (C) P (A B C) = P (A)P (B)P (C) A, B, C Ω = {, 2, 3, 4}, P ({}) = 2 4, P ({2}) = 3 4 2, P ({3}) = P ({4}) = 4 A = {, 2}, B = {2, 3}, C = 4 {2, 4} P (A) =, P (B) = P (C) =. P (A B C) = P ({2}) = 3 2 2 4, 2 P (A)P (B)P (C) = ( ) 2 = 3 2 2 4 P (A B C) = P (A)P (B)P (C). 2 P (A B) P (A)P (B), P (A C) P (A)P (C), P (B C) P (B)P (C) 2.2. P ({}) = P ({2}) = P ({3}) = P ({4}) = 4

24 2 P ({}) = 4 α, P ({2}) = + α P (A B C) = 4 P ({2}) = 4 + α = P (A)P (B)P (C) = ( ) 2 2 2 + α α = ± 2 α = 2 2 2 2.3 * 3 2.22 ( ). A, B C P (A B C) = P (A C)P (B C) A, B C (conditionally independent) C P C 2.2 P C (A B) = P C (A)P C (B) P (A B C) = P (A B C) P (C) = P (A B C) P (B C) P (B C) P (C) = P (A B C)P (B C) P (A C) = P (A B C) C A, B C B A A, B C C 2.23. Ω = {(0, 0), (0, ), (, 0), (, )}, P (A) = A/ Ω A = {(0, 0), (0, )}, B = {(0, 0), (, 0)}, C = {(0, ), (, 0)} 2 P (A) = P (B) = P (C) = 2 P (A B) = P (A C) = P (B C) = 4 A, B, C C P (A C) = P (B C) =, P (A B C) = 0 2 2.24. L =, H = 6, D = 2 5 L H D Ω = {(i, j) i, j 6}, P (A) = A/ Ω L = {(, ), (, 2), (, 3), (, 4), (, 5), (, 6), (2, ), (3, ), (4, ), (5, ), (6, )} H = {(6, ), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6), (, 6), (2, 6), (3, 6), (4, 6), (5, 6)} D = {(, 5), (, 6), (2, 5), (2, 6), (5, ), (6, ), (5, 2), (6, 2)} *3 3.3
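The example of events that are pairwise but not mutually independent (Ω = {1, 2, 3, 4} uniform, A = {1, 2}, B = {1, 3}, C = {2, 3}) can also be checked by enumeration:

```python
from fractions import Fraction
from itertools import combinations

omega = {1, 2, 3, 4}
P = lambda E: Fraction(len(E & omega), len(omega))   # uniform probability

A, B, C = {1, 2}, {1, 3}, {2, 3}

# Pairwise independence: every pair factorizes.
pairwise = all(P(X & Y) == P(X) * P(Y) for X, Y in combinations([A, B, C], 2))
# Mutual independence would additionally require the triple to factorize,
# but A & B & C is empty while P(A)P(B)P(C) = 1/8.
triple = P(A & B & C) == P(A) * P(B) * P(C)

print(pairwise, triple)  # True False
```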

2.4 25 P (L) = P (H) = 36, P (D) = 8 36 = 2 9. L H = {(, 6), (6, )} P (L H) = 2 P (L)P (H) = 36 L H ( ) 2 36 L D = {(, 5), (, 6), (5, ), (6, )}, H D = {(6, ), (6, 2), (, 6), (2, 6)}, L H D = {(, 6), (6, )} P (L D) = 4/36 8/36 = 2 P (H D) = 4/36 8/36 = 2 P (L H D) = 2/36 8/36 = 4 = ( ) 2 2 P (L H D) = P (L D)P (H D), L H D 2.4 2.25 ( (Borel-Cantelli) 0 ). A, A 2,... ( ) () P (A n ) < P lim sup A n = 0 n n= ( ) (2) A, A 2,... P (A n ) = P lim sup A n = n. () lim n k=n n= P (A k ) = 0 B n = k=n A k lim sup A n = n B B 2.32 (2) (.2) ( ) P lim sup A n = lim P (B n) lim n n n ( ( ) ) (2) P lim sup A n = 0 C n = n A k = n= k=n n= P (A k ) = 0. k=n ( A k k=n C n C C 2 () P ( n= C n ) = lim n P (C n). ) lim sup A n = n B n n= n= k=n A k = lim P (C n) = 0 n n P (A k ) = + x exp(x) k=n

26 2 exp(x) = e x P (A k ) lim m k=n m ( P (A k )) lim m k=n = lim m exp m exp ( P (A k )) ( ) ( ) m P (A k ) = exp P (A k ) = 0. k=n D n,m = m=n k=n m A k = m=n k=n m A k C n = k=n D n,m D n,m D n,m+.32(2) P (C n ) = lim P (D n,m) = lim m m k=n m P (A k ) = lim m k=n m ( P (A k )) = 0. A k = k=n ( ( ) ) P lim sup A n = lim P (C n) = 0 n n ( ) P lim sup A n = n lim sup.32 2.26. A n = n P (A n ) = 6 ( ) P (A n ) =. P lim sup A n =. n n= A n ( ) ( ( ) ) ( ) P lim sup A n = 0 = P lim sup A n = P lim inf A n. n n n

27 3 Bayes Bayes Bayes Bayes Bayes 8 763 Laplace Bayes 20 Fisher Neyman, Pearson Bayes Bayes Bayes Bayes Bayes 00 Bayes Bayes Bayes 3. P (A) 0 P (B A)P (A) = P (A B)P (B) P (A) 3. (Bayes (I)). P (B A) = P (A B)P (B) P (A) Ω 3.2 (Bayes (II)). S, S 2,..., S n Ω P (S i A) = P (A S i)p (S i ) P (A) = P (A S i)p (S i ) n P (A S i )P (S i ) i= 3.3. A, B, C 20, 35, 45% 5, 7, 4% A, B, C Ω Ω = A B C P (A) = 0.2, P (B) = 0.35, P (C) = 0.45 E

28 3 BAYES P (E A) = 0.05, P (E B) = 0.07, P (E C) = 0.04 Bayes P (E A)P (A) P (A E) = P (E A)P (A) + P (E B)P (B) + P (E C)P (C) 0.05 0.20 = 0.05 0.20 + 0.07 0.35 + 0.04 0.45 = 00 525 0.90 B P (B E) = 245 80 0.467, P (C E) = 525 525 0.343 3.4. L, U L (lucky) 4 U (unlucky) 4 () (2) U (3) U L, U P (L) = P (U) = /2 G =, B = P (G L) = 4/5, P (B L) = /5, P (G U) = /5, P (B U) = 4/5. () P (B) = P (B L)P (L) + P (B U)P (U) (2) Bayes P (U B) = = 5 2 + 4 5 2 = 2 P (B U)P (U) P (B) = 4/5 /2 /2 = 4 5 (3) P (BB L) = 5 5, P (BB U) = 4 5 4 5 Bayes P (BB U)P (U) P (U BB) = P (BB L)P (L) + P (BB U)P (U) 6/25 /2 = /25 /2 + 6/25 /2 = 6 7 L, U Bayes /2 U 2 Bayes Bayes 3.5 (3 ). A, B, C 3

3. 29 A B C B A C /3 /2 : B, C A = A, B = B, C = C, S A = A, S B = B, S C = C P (A S B ) P (A) = P (B) = P (C) = 3 P (S B B) = 0, P (S B C) =, P (S B A) = (S C A) = 2 Bayes P (A S B ) = = P (S B A)P (A) P (S B A)P (A) + P (S B B)P (B) + P (S B C)P (C) 2 3 2 3 + 0 3 + 3 = 3. A 3 A 3.6 ( ). TV 3 3 A =, B =, C = P (A) P (A) = P (B) = P (C) = M = B 3 P (A M) P (C M) A B, C P (M A) = B B 2 P (M B) = 0 C C B P (M C) =

30 3 BAYES M Bayes P (A M) = = P (C M) = P (M A)P (A) P (M A)P (A) + P (M B)P (B) + P (M C)P (C) 2 3 2 3 + 0 3 + 3 = 3 P (M C)P (C) P (M A)P (A) + P (M B)P (B) + P (M C)P (C) 3 = 2 3 + 0 3 + = 2 3 3 Bayes 3.7 ( ). 000 99% % 99% A =, S = P (S) = 0.00, P (S) = P (S) = 0.999, P (A S) = 0.99, P (A S) = 0.0 Bayes P (A S)P (S) P (S A) = P (A S)P (S) + P (A S)P (S) 0.99 0.00 = 0.99 0.00 + 0.0 0.999 = 22 0.09 9% 3.8. 99% 9% 9% 0.% 90 0.% P (A) = P (A S)P (S) + P (A S)P (S) 000 99% 999 %(0 ) 0% 000 P (A S)P (S) P (S A) = P (A S)P (S) + P (A S)P (S) 0.99 0. = 0.99 0. + 0.0 0.9 = 99 08 = 2 0.92

3.2 3 3.9 ( ). (Todo) 3.2 3.7 P (S), P (S) 000 P (S A), P (S A) Bayes P (S), P (S) (prior probability) P (S A), P (S A) (posterior probability) Bayes 3.0 ( ). T 2000 Rh AB Mr. X Mr. X A =, B = P (A B) =. P (A B) = 2000 P (B A) Bayes P (B) A P (B) = P (B) = 2 Bayes P (B A) = 99.9% P (A B)P (B) P (A B)P (B) + P (A B)P (B) = 2 2 + 2000 2 = 2000 200 0.9995 B T 30

32 3 BAYES P (B) = P (B A) = 300000 P (B) = 299999 300000 300000 300000 + 2000 299999 300000 % Bayes = 2000 30999 0.0066 C A B 3500 P (B) = 35000000 P (B A) 0.0027 ICPO 2002 0 P (B) = P (B A) 0.5 00000 3.. Wikipedia DNA () (207) DNA DNA 77 (2) 3 DNA 000 DNA DNA DNA Bayes 3.3 Bayes Bayes S: (spam), N: (non spam) K(w): w P (S), P (N) w, w 2,... P (K(w ) S), P (K(w 2 ) S),... w P (S K(w)) P (S) = 0.6, P (N) = 0.4

3.3 33 (w ) 0. 0.0 (w 2 ) 0.2 0.0 (w 3 ) 0. 0.02 (w 4 ) 0.2 0.02 : (w ) Bayes (3.) P (K(w ) S)P (S) P (S K(w )) = P (K(w ) S)P (S) + P (K(w ) N)P (N) 0. 0.6 = 0. 0.6 + 0.0 0.4 0.9429 6% 0.2 0.6 P (S K(w 2 )) = 0.2 0.6 + 0.0 0.4 0.9474, 0. 0.6 P (S K(w 3 )) = 0. 0.6 + 0.02 0.4 0.899, 0.2 0.6 P (S K(w 4 )) = 0.2 0.6 + 0.02 0.4 = 0.9 P (K(w ) K(w 2 ) S) A, B (P (A B) = P (A)P (B)) (P (A B S) = P (A S)P (B S)) P (S A B) Bayes = = Bayes = P (A B S)P (S) P (A B) P (B S) P (B) P (B S) P (B) B P (S B) = = P (A S)P (S) P (A) P (B S)P (A S)P (S) P (B)P (A) P (S A) P (B S)P (S) P (B) P (S) P (S A) A P (S A) B P (S B) Bayes (Bayesian updating) (naive Bayes classifier)

34 3 BAYES (3.) Bayes 0.2 0.9429 P (S K(w )) 0.9429 w2 0.2 0.9429 + 0.0 0.057 0.9950 w 3 0. 0.9950 0. 0.9950 + 0.02 0.0050 0.999 w 4 0.2 0.9950 0.2 0.9950 + 0.02 0.0050 0.9998 Bayes P (S) = 0.6 P (S) = 0.0 P (S) = 0.8 2 Bayes P (S) 0.0 0.5 0.9 P (S K(w )) 0. 0.967 0.99 w 2 0.574 0.9925 0.999 w 3 0.8800 0.9986 0.9998 w 4 0.9778 0.9998.0000 2: Bayes * 4 *4
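The sequential (naive Bayes) updating above can be reproduced in a few lines. The word likelihoods below are the table's values as I read them from the damaged text (0.11 and 0.12 in spam, 0.01 and 0.02 in non-spam); with prior P(S) = 0.6 they reproduce the posterior sequence 0.9429, 0.9950, 0.9991, 0.9998:

```python
def update(p_spam: float, lik_spam: float, lik_ham: float) -> float:
    """One naive-Bayes step: the current posterior becomes the next prior."""
    num = lik_spam * p_spam
    return num / (num + lik_ham * (1 - p_spam))

# (P(word | spam), P(word | non-spam)) for words w1..w4, as read from the table.
words = [(0.11, 0.01), (0.12, 0.01), (0.11, 0.02), (0.12, 0.02)]

p = 0.6                                   # prior P(S)
for lik_s, lik_n in words:
    p = update(p, lik_s, lik_n)
    print(round(p, 4))
```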

35 4 * 5 2 4.2 4.3 B * 6 4. (Ω, P ) 4. ( ). Ω X(ω) (random variable) ω Ω X 4.2. () Ω = {, 2, 3, 4, 5, 6} = {i i 6}. X(i) = i Y (i) = 2i, Z(i) = i 2 (2) 2 Ω = {ω = (i, j) i, j 6}. X(ω) = i, Y (ω) = j, Z(ω) = i + j 2 (3) 3 Ω = {ω = (i, j, k) i, j, k = 0 or }. X(ω) = i, Y (ω) = j, Z(ω) = k, T (ω) = i + j + k, 2, 3 () Y = 2X, Z = X 2 (2) Z = X + Y, (3) T = X + Y + Z 00% 4.3. (2) Ω = {2, 3,...,, 2} Ω = {, 2,..., 30, 36} Ω = {0,, 2, 3, 4, 5} 2 *5 *6

36 4 4.2 X X = {ω Ω X(ω) = } X(ω) = ω Ω Ω P ({ω X(ω) = }) P (X = ) def = P ({ω X(ω) = }) X = X < a X b P (X < a) = P ({ω X(ω) < a}) P (X b) = P ({ω X(ω) b}) X, Y P (X = a, Y = b) = P ({ω X(ω) = a, Y (ω) = b}) = P ({ω X(ω) = a} {ω Y (ω) = b}) P (X = a or Y = b) = P ({ω X(ω) = a} {ω Y (ω) = b}) 4.4. 4.2 P (A) = A/ Ω () P (X = 5) = P ({ω X(ω) = 5}) = P ({5}) =, P (X 3) = P ({ω X(ω) 6 3}) = P ({, 2, 3}) = 2, P (Y 7) = P ({4, 5, 6}) =, P (Y ) = P (Ω) =, 2 P (Z = 4) = P ({2}) =, P (Z = 3) = P ( ) = 0 6 (2) P (X = ) = P ({(, ), (, 2), (, 3), (, 4), (, 5), (, 6)}) = 6 36 = 6 P (Z = 8) = P {(2, 6), (3, 5), (4, 4), (5, 3), (6, 2)} = 5 8 36 (3) P (Y = ) = P ({(0,, 0), (,, 0), (0,, ), (,, )}) = 4 8 = 2 2 P (T = 2) = P ({(0,, ), (, 0, ), (,, 0)}) = 3 8 2 4.3 4.5 ( ). X F X (x) = P (X x) X (distribution function) (c.d.f., cumulative distribution function)

Every distribution function F(x) has properties (1)–(5) of the next theorem; in particular it is nondecreasing by (3) and right-continuous by (5).

Theorem 4.6 (properties of distribution functions):
(1) P(X > x) = 1 − P(X ≤ x) = 1 − F(x).
(2) For a < b, P(a < X ≤ b) = F(b) − F(a).
(3) a < b implies F(a) ≤ F(b).
(4) lim_{x→−∞} F(x) = 0 and lim_{x→+∞} F(x) = 1; we write F(−∞) = 0, F(+∞) = 1.
(5) lim_{x→a+0} F(x) = F(a).

Proof: (1) follows from P(Aᶜ) = 1 − P(A) with A = {X ≤ x}. For (2) and (3), recall that A ⊂ B implies P(B ∖ A) = P(B) − P(A); take A = {X ≤ a} and B = {X ≤ b}, so that a < b gives A ⊂ B. (4) and (5) follow from the continuity of probability (Theorem 1.32).

[Figure 1: examples of distribution functions: N(0, 1), Exp(1), t(2), χ²₄, B(10, 0.5), and Poi(3). The binomial and Poisson distribution functions are step functions.]

Definition 4.7 (probability (density) function): (1) Suppose the distribution function of a continuous random variable X can be written as

38 4 F X (x) = x p X (t) dt p X (x) X (density function) (2) x 0, x, x 2,... X p X (k) := P (X = x k ) (k = 0,, 2,... ) F X (x) = p X (k) x k x x k x k p X (k) X p(x) 2 * 7 0 F (x) = x p(t) dt x p(t) F (x) 0 x t 2: 4.8. counting measure Radon- Nikodym 2 A B *7

4.4 39 4.9 ( ). () p(x) p(x) 0 p(x) dx = p(k) = k=0 (2) X d dx F X (x) = p X (x). P (a < X b) = F (b) F (a) = b a p(x) dx or a<x k b p(k) a a p(x) dx = 0 P (X = a) 0 * 8 (4.) P (a < X < b) = P (a x < b) = P (a x b) = P (a < X b) = b a p(x) dx (4.) p(x) p(k) = P (X = x k ) p(x) p(k) 4.4 4.0. X () P [X m] 2 P [X m] 2 m X (median) X P [X m] = P [X m] = F (m) = 2 (2) p(x) x = c c X (mode) 4. ( ). X E[X] = xp(x) dx E[X] = x k p(k) k=0 (expectation / expected value) (mean / average) *8

[Figure 3: densities and probability functions with their means and variances: the normal density N(µ, σ²) (E[X] = µ, V[X] = σ²); the Poisson distribution Poi(λ) for λ = 3 and λ = 7 (E[X] = V[X] = λ); the exponential density Exp(λ) (E[X] = 1/λ, V[X] = 1/λ²); and the arcsine density p(x) = 1/(π√(a² − x²)) on (−a, a) (E[X] = 0, V[X] = a²/2).]

The integral defining E[X] need not converge: although |∫ x p(x) dx| ≤ ∫ |x| p(x) dx, the right-hand side may be infinite.

Definition 4.12 (integrability): The expectation of Definition 4.11 is defined only when the sum or integral converges absolutely:
∫_{−∞}^{+∞} |x| p(x) dx < +∞,  or  Σ_{k=0}^∞ |x_k| p(k) < +∞.

4.4 4 4.26, 4.27 4.3 ( ). (4.2) E[X] = X(ω)P (dω) Ω X(ω) = x, P (dω) = p(x) dx E[X + Y ] = E[X] + E[Y ] X 2X X 2 f(x) E[f(X)] = f(x)p(x) dx or f(x k )p(k) k=0 4.4 ( ). () X f(x), g(x) E[f(X) + g(x)] = E[f(X)] + E[g(X)]. a, b E[aX + b] = ae[x] + b. b E[b] = b. (2) X, Y E[X + Y ] = E[X] + E[Y ]. () 40 20 60 (2) 70 80 50 (2) 4.3 * 9. () E[f(X) + g(x)] = = (f(x) + g(x))p(x) dx f(x)p(x) dx + g(x)p(x) dx = E[f(X)] + E[g(X)] (2) (4.2) E[X + Y ] = (X(ω) + Y (ω))p (dω) Ω = X(ω)P (dω) + Y (ω)p (dω) = E[X] + E[Y ] Ω Ω *9

42 4 4.5 ( ). V [X] = E[(X E[X]) 2 ] X (variance) V [X] (standard deviation) σ s σ 2 v µ m µ = E[X], σ = V [X], σ 2 = V [X] V [X] = E[(X E[X]) 2 ] V [X] = (x µ) 2 p(x) dx or k=0 (x k µ) 2 p(k) = = 3 4.7 V [X] = E[(X E[X]) 2 ] µ = E[X] V [X] = E[(X µ) 2 ] = E[X 2 2µX + µ 2 ] = E[X 2 ] 2µE[X] + µ 2 = E[X 2 ] µ 2 4.6 ( ). () V [X] = E[X 2 ] (E[X]) 2. (2) a, b V [ax + b] = a 2 V [X]. E[aX + b] = ae[x] + b (3) V [X] 0 V [X] = 0 X (4) X, Y V [X ± Y ] = V [X] + V [Y ]. ± + 5.4 (3) V [X] < 0 V [X] < 0 V [X] = 0 V [X] > 0. (4) 5.4 () (2) E[aX + b] = ae[x] + b V [ax + b] = E[(aX + b E[aX + b]) 2 ] = E[(aX + b (ae[x] + b)) 2 ] = E[a 2 (X E[X]) 2 ] = a 2 E[(X E[X]) 2 ] = a 2 V [X].

4.4 43 (3) 4.3 V [X] = (x k µ) 2 p(k) (x k µ) 2 0 p(k) 0 V [X] 0. V [X] = (x k µ) 2 p(k) = 0 x k µ k p(k) = 0 k=0 k=0 P (X µ) = x k µ 0. X p(k) = 0 X µ 4.7 ( (Chebyshev) (Chebyshev s inequality)). µ = E[X], σ = V [X] α > 0 P ( X µ ασ) α 2 α > α = 2 X I = [µ 2σ, µ + 2σ] 50% 5 I α = 3 9 [µ 3σ, µ + 3σ] µ ± σ µ ± 3σ * 20. σ 2 = = (x µ) 2 p(x) dx (x µ) 2 p(x) dx + x µ ασ x µ <ασ (x µ) 2 p(x) dx 2 (x µ) 2 p(x) dx (x µ) 2 (ασ) 2 x µ ασ x µ ασ (ασ) 2 p(x) dx = α 2 σ 2 p(x) dx = α 2 σ 2 P [ X µ ασ]. x µ ασ α 2 σ 2 4.8 4.9 4.8 ( ). X µ = E[X], σ = V [X] Z = X µ E[Z] = 0, V [Z] = X Z σ (standardization) *20 [µ 3σ, µ + 3σ] 99.7%
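Chebyshev's inequality (4.17) and the standardization of 4.18 can both be checked exactly for a fair die (µ = 7/2, σ² = 35/12); a small sketch with exact fractions where possible:

```python
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 6)                              # fair die

mu  = sum(k * p for k in faces)                 # 7/2
var = sum((k - mu) ** 2 * p for k in faces)     # 35/12
sigma = float(var) ** 0.5

def tail(alpha: float) -> float:
    """P(|X - mu| >= alpha*sigma), to compare with Chebyshev's bound 1/alpha^2."""
    return float(sum(p for k in faces if abs(k - float(mu)) >= alpha * sigma))

for alpha in (1.0, 1.5, 2.0):
    print(alpha, tail(alpha), 1 / alpha ** 2)   # the tail never exceeds the bound

# Standardization: Z = (X - mu)/sigma has mean 0 and variance 1.
E_Z = sum((k - float(mu)) / sigma * (1 / 6) for k in faces)
V_Z = sum(((k - float(mu)) / sigma) ** 2 * (1 / 6) for k in faces)
print(round(E_Z, 9), round(V_Z, 9))
```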

44 4 [ ] X µ E[Z] = E = σ σ E[X µ] = (E[X] µ) = 0. σ [ ] ( ) 2 ( ) 2 X µ V [Z] = V = V [X] = σ 2 =. σ σ σ 4.9 ( (Bernoulli) ). 0 θ 0, X P (X = ) = θ, P (X = 0) = θ X (Bernoulli distribution) E[X], V [X] θ θ : E[X] = θ + 0 ( θ) = θ. : E[X 2 ] = 2 θ+0 2 ( θ) = θ V [X] = E[X 2 ] (E[X]) 2 = θ( θ). θ = 0 (θ = ) V [X] = 0 V [X] θ = /2 4.20 ( ). X p(k) = P (X = k) F (k) = P (X k) k 2 3 4 5 6 p(k) 6 F (k) 6 6 2 6 6 3 6 6 4 6 6 6 5 6 E[X] = 6 + 2 6 + 3 6 + 4 6 + 5 6 + 6 6 = 7 2 E[X 2 ] = 2 6 + 22 6 + 32 6 + 42 6 + 52 6 + 62 6 = 9 6 V [X] = E[X 2 ] (E[X]) 2 = 35 2. 4.2. 4.2 X () X E[X] = 7, E[Y ] = E[2X] = 2E[X] = 7, E[Z] = 2 E[X 2 ] = 9 6. (2) 4.4 p X (k) = P (X = k) = 6 () p Y (k) = p X (k) X Y E[X] = E[Y ] = 7, E[Z] = E[X + Y ] = 2 E[X] + E[Y ] = 7.

(3) Toss a coin three times and let X, Y, Z be 1 for heads and 0 for tails on the respective tosses, and T = X + Y + Z. Here p_Y(1) = P(Y = 1) = 1/2 and p_Y(0) = 1/2, so E[Y] = 1·(1/2) + 0·(1/2) = 1/2. Since p_X(k) = p_Y(k) = p_Z(k), also E[X] = E[Z] = 1/2, and by linearity E[T] = E[X + Y + Z] = 3/2.

Problem 4.22. Let X have mean µ = E[X] and variance v = V[X]. Express the following in terms of µ and v:
(1) E[2X] (2) E[3X + 4] (3) E[X^2] (4) E[X^2 + 2X + 3] (5) V[4X] (6) V[X + 2] (7) V[3X − 4] (8) E[X^2 + 3X]

Solution. For a constant a, E[a] = a. Hence:
(1) E[2X] = 2E[X] = 2µ.
(2) E[3X + 4] = 3E[X] + E[4] = 3µ + 4.
(3) E[X^2] = v + µ^2.
(4) E[X^2 + 2X + 3] = E[X^2] + 2E[X] + E[3] = v + µ^2 + 2µ + 3.
(5) V[4X] = 4^2 V[X] = 16v.
(6) V[X + 2] = V[X] = v.
(7) V[3X − 4] = V[3X] = 3^2 V[X] = 9v.
(8) E[X^2 + 3X] = E[X^2] + 3E[X] = µ^2 + 3µ + v.

Problem 4.23. For a < b, let X have the density

    p(x) = c  (a ≤ x ≤ b),   p(x) = 0  (otherwise).

(1) Find c. (2) Find E[X] and V[X].

Solution. (1) 1 = ∫_a^b c dx = c(b − a), so c = 1/(b − a).
(2) µ = E[X] = c ∫_a^b x dx = (1/(b − a))·(b^2 − a^2)/2 = (a + b)/2,
    E[X^2] = c ∫_a^b x^2 dx = (1/(b − a))·(b^3 − a^3)/3 = (a^2 + ab + b^2)/3,
    V[X] = E[X^2] − µ^2 = (a^2 + ab + b^2)/3 − (a + b)^2/4 = (b − a)^2/12.

The distribution with density p(x) = 1/(b − a) for a ≤ x ≤ b (and 0 otherwise) is called the uniform distribution on [a, b].

Problem 4.24. Let X have the density

    p(x) = c x(1 − x)  (0 ≤ x ≤ 1),   p(x) = 0  (otherwise).

(1) Find c. (2) Find P[0 ≤ X ≤ 1/2]. (3) Find E[X] and V[X]. (4) Find the distribution function F(x).

Solution. (1) 1 = ∫_0^1 c x(1 − x) dx = c/6, so c = 6.
(2) P[0 ≤ X ≤ 1/2] = ∫_0^{1/2} 6x(1 − x) dx = 1/2.

(3) µ = E[X] = ∫_0^1 x·6x(1 − x) dx = 1/2, and V[X] = E[X^2] − µ^2 = ∫_0^1 x^2·6x(1 − x) dx − (1/2)^2 = 3/10 − 1/4 = 1/20.
(4) F(x) = P[X ≤ x] = ∫_{−∞}^x p(t) dt, so

    F(x) = 0 (x < 0),   F(x) = 3x^2 − 2x^3 (0 ≤ x ≤ 1),   F(x) = 1 (1 < x).

Problem 4.25. Let X have the normal density (σ > 0)

    p_X(x) = (1/√(2πσ^2)) exp(−(x − µ)^2/(2σ^2)),

and put Y = e^X. For y > 0 the distribution function of Y is

    F_Y(y) = P[Y ≤ y] = P[e^X ≤ y] = P[X ≤ log y] = ∫_{−∞}^{log y} p_X(x) dx,

so the density of Y is

    p_Y(y) = (d/dy) F_Y(y) = p_X(log y)·(d/dy)(log y) = (1/(√(2πσ^2)·y)) exp(−(log y − µ)^2/(2σ^2)).

When X follows N(µ, σ^2), so that log Y = X is normal, Y is said to follow the log-normal distribution.

Example 4.26 (St. Petersburg paradox). Toss a fair coin until heads first appears; if this happens on the k-th toss, the prize is 2^k yen. The probability of that outcome is p(k) = 1/2^k, so the expected prize is

    Σ_{k=1}^∞ 2^k p(k) = 2·(1/2) + 2^2·(1/2^2) + ⋯ + 2^k·(1/2^k) + ⋯ = 1 + 1 + ⋯ = ∞:

the expectation diverges, so no finite entry fee matches the "fair value" of this game.

Example 4.27 (Cauchy distribution). The density

    p(x) = 1/(π(1 + x^2))

defines the Cauchy distribution (it coincides with the t distribution with 1 degree of freedom; see B.2).

[Figure: the Cauchy density p(x) = 1/(π(1 + x^2)) compared with the standard normal density.]

That p is a probability density:

    ∫_{−∞}^∞ 1/(π(1 + x^2)) dx = (1/π)[arctan x]_{−∞}^∞ = (1/π)(π/2 − (−π/2)) = 1.

The expectation, however, does not exist:

    ∫_{−∞}^∞ |x|/(π(1 + x^2)) dx = 2 ∫_0^∞ x/(π(1 + x^2)) dx = 2·[(1/(2π)) log(1 + x^2)]_0^∞ = +∞.

Indeed the improper integral of x p(x) depends on how the limits are taken:

    ∫_{−L}^R x/(π(1 + x^2)) dx = [(1/(2π)) log(1 + x^2)]_{−L}^R = (1/(2π)) log((1 + R^2)/(1 + L^2)),

which tends to 0 as L, R → ∞ with R = L, but to (log 2)/π with R = 2L.

The Cauchy distribution is also known as the Lorentz distribution or the Breit-Wigner distribution.

5  The law of large numbers and the central limit theorem

5.1  Independence

Definition 5.1 (independence). Random variables X_1, X_2, …, X_n are independent if for all a_1, a_2, …, a_n,

    P(X_1 ≤ a_1, X_2 ≤ a_2, …, X_n ≤ a_n) = P(X_1 ≤ a_1) P(X_2 ≤ a_2) ⋯ P(X_n ≤ a_n).

For two variables X_1, X_2: P(X_1 ≤ a_1) is the probability of the event A_1 = {ω ∈ Ω | X_1(ω) ≤ a_1}, that is P(A_1); likewise P(X_2 ≤ a_2) = P(A_2) with A_2 = {ω ∈ Ω | X_2(ω) ≤ a_2}; and P(X_1 ≤ a_1, X_2 ≤ a_2) is the probability of A_12 = {ω ∈ Ω | X_1(ω) ≤ a_1 and X_2(ω) ≤ a_2} = A_1 ∩ A_2. Independence thus demands

    P(A_1 ∩ A_2) = P(A_1) P(A_2)  for every choice of a_1, a_2:

any event described by X_1 and any event described by X_2 are independent events.

Theorem 5.2. If X_1, X_2, …, X_n are independent, then for functions f_1(x), f_2(x), …, f_n(x),

    E[f_1(X_1) f_2(X_2) ⋯ f_n(X_n)] = E[f_1(X_1)] E[f_2(X_2)] ⋯ E[f_n(X_n)].

Example 5.3. For independent X, Y, taking f(x) = x and g(y) = y gives E[f(X)g(Y)] = E[XY] on the left and E[f(X)]E[g(Y)] = E[X]E[Y] on the right, so for independent X, Y,

    E[XY] = E[X] E[Y].

Theorem 5.4. If X and Y are independent, then

    V[X ± Y] = V[X] + V[Y].

Proof. Write µ_X = E[X], µ_Y = E[Y]. By linearity of expectation,

    E[X ± Y] = E[X] ± E[Y] = µ_X ± µ_Y.   (5.1)

Hence

    V[X ± Y] = E[(X ± Y − E[X ± Y])^2]
             = E[(X ± Y − (µ_X ± µ_Y))^2]
             = E[((X − µ_X) ± (Y − µ_Y))^2]
             = E[(X − µ_X)^2 + (Y − µ_Y)^2 ± 2(X − µ_X)(Y − µ_Y)]
             = E[(X − µ_X)^2] + E[(Y − µ_Y)^2] ± 2E[(X − µ_X)(Y − µ_Y)]
             = V[X] + V[Y] ± 2E[(X − µ_X)(Y − µ_Y)].

For the third term, apply Theorem 5.2 with f(x) = x − µ_X, g(y) = y − µ_Y:

    E[(X − µ_X)(Y − µ_Y)] = E[X − µ_X] E[Y − µ_Y] = (E[X] − µ_X)(E[Y] − µ_Y) = 0.   (5.2)

Remark 5.5. On the right-hand side the sign is + for both X + Y and X − Y. From E[X − Y] = E[X] − E[Y] one might guess V[X − Y] = V[X] − V[Y], but for independent X, Y the correct identity is V[X − Y] = V[X] + V[Y]. A mnemonic: in (a ± b)^2 = a^2 ± 2ab + b^2 the coefficient of b^2 is (±)^2 = +.

Without independence, E[(X − µ_X)(Y − µ_Y)] ≠ 0 in general and V[X + Y] ≠ V[X] + V[Y]. For example, Y = −X is as far from independent of X as possible, and V[X + Y] = V[0] = 0; likewise Y = X gives V[X + Y] = V[2X] = 4V[X] > V[X] + V[Y] (when V[X] > 0).

The quantity

    Cov(X, Y) := E[(X − E[X])(Y − E[Y])]

is called the covariance of X and Y. If X, Y are independent then Cov(X, Y) = 0 by (5.2), and in general the third-to-last line of the proof above gives

    V[X ± Y] = V[X] + V[Y] ± 2 Cov(X, Y).

Problem 5.6. For X_1, …, X_n show

    V[X_1 + ⋯ + X_n] = Σ_{k=1}^n V[X_k] + Σ_{k=1}^n Σ_{l≠k} Cov(X_k, X_l)
                     = Σ_{k=1}^n V[X_k] + 2 Σ_k Σ_{l>k} Cov(X_k, X_l).

Hint: (Σ_k a_k)^2 = Σ_k a_k^2 + Σ_k Σ_{l≠k} a_k a_l.

Problem 5.7. Show that Cov(X, Y) = E[XY] − E[X]E[Y].

5.2  Independent, identically distributed sequences

Definition 5.8 (i.i.d.). A sequence X_1, X_2, … of random variables is called independent and identically distributed (i.i.d.) if the X_k are independent and all follow the same distribution. Since the mean and variance then do not depend on k, we write µ = E[X_k], σ^2 = V[X_k] (k = 1, 2, …).

Example 5.9 (repeated die rolls). Roll a die repeatedly and let X_k be the k-th outcome. Then X_1, X_2, … is i.i.d., with µ = E[X_1] and σ = √V[X_1] as computed in Example 4.20.

Example 5.10 (coin tossing). Fix θ (0 ≤ θ ≤ 1) and repeatedly toss a coin that shows heads with probability θ; let X_k = 1 if the k-th toss is heads and X_k = 0 otherwise, so that

    P(X_k = 1) = θ,   P(X_k = 0) = 1 − θ.

Then X_1, X_2, … is i.i.d. with E[X_1] = θ.

Example 5.11 (Bernoulli process). More generally, any i.i.d. sequence X_1, X_2, … with P(X_k = 1) = θ, P(X_k = 0) = 1 − θ is called a Bernoulli process.

Throughout, let X_1, X_2, … be i.i.d. with E[X_1] = µ, V[X_1] = σ^2.

Definition 5.12 (sample mean). X̄_n := (1/n) Σ_{k=1}^n X_k is called the sample mean.

In Example 5.10 each X_k is 0 or 1, so X̄_n is the proportion of heads among the first n tosses. The sample mean is itself a random variable; its expectation is E[X̄_n] = µ, and the theme of what follows is the sense in which X̄_n → µ as n → ∞.

5.3  The law of large numbers

Proposition 5.13. E[X̄_n] = µ, V[X̄_n] = σ^2/n.

Proof.

    E[X̄_n] = E[(1/n) Σ_{k=1}^n X_k] = (1/n) E[Σ_{k=1}^n X_k] = (1/n) Σ_{k=1}^n E[X_k] = (1/n)·nµ = µ,

    V[X̄_n] = V[(1/n) Σ_{k=1}^n X_k] = (1/n^2) V[Σ_{k=1}^n X_k] = (1/n^2) Σ_{k=1}^n V[X_k] = (1/n^2)·nσ^2 = σ^2/n,

where the variance of the sum splits into the sum of variances because the X_k are independent (Theorem 5.4, Problem 5.6).

By 5.13, for an i.i.d. sequence V[X̄_n] = σ^2/n → 0 as n → ∞: the distribution of X̄_n concentrates ever more tightly around µ. This is the content of the law of large numbers.

Theorem 5.14 (law of large numbers). Let X_1, X_2, … be i.i.d. with E[|X_1|] < ∞ (cf. 5.16 for what can happen without this hypothesis). Then:
(1) (weak law) for every ε > 0, lim_{n→∞} P(|X̄_n − µ| ≥ ε) = 0;
(2) (strong law) P(lim_{n→∞} X̄_n = µ) = 1.

Proof of (1), assuming additionally σ^2 = V[X_1] < ∞. Since E[X̄_n] = µ and V[X̄_n] = σ^2/n, Chebyshev's inequality (4.17) gives

    0 ≤ P(|X̄_n − µ| ≥ ε) ≤ (1/ε^2) V[X̄_n] = σ^2/(nε^2) → 0  (n → ∞).

Example 5.15 (coin tossing). Let X_1, X_2, … be i.i.d. with P(X_k = 1) = θ, P(X_k = 0) = 1 − θ (0 ≤ θ ≤ 1). Then X̄_n, the proportion of heads in n tosses, satisfies

    lim_{n→∞} X̄_n = µ = E[X_1] = θ,  i.e.  lim_{n→∞} |X̄_n − θ| = 0.

As an experiment, a coin (θ = 0.5) was tossed in three sets of 15 tosses, with the results below.

[Table: three sets of 15 coin tosses each (1 = heads, 0 = tails); the resulting sample means were X̄_15 = 8/15, 7/15, and 9/15.]

Even for a fair coin the sample mean of 15 tosses scatters: here 8/15, 7/15, 9/15 rather than exactly 1/2. The law of large numbers only promises convergence as n grows; Figure 4 shows simulated paths of X̄_n for n up to 1000, and the fluctuations keep shrinking as n increases further (n = 10000, n = 100000, …).

[Figure 4: X̄_n plotted against n (up to 1000) for coin tosses with E[X_k] = θ, for θ = 0.2, 0.5, 0.8, with 50 simulated sequences each; the paths settle toward θ as n grows.]

Example 5.16 (a case where the law of large numbers fails). The Cauchy distribution of 4.27 has no expectation, and for an i.i.d. Cauchy sequence X_1, X_2, … the sample mean X̄_n does not converge (Figure 5): although the density is symmetric about 0, X̄_n keeps fluctuating no matter how large n becomes.

The law of large numbers (5.14) says only that X̄_n → µ. The central limit theorem, next, describes precisely how X̄_n fluctuates around µ for large but finite n, on the scale 1/√n.
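The drift of X̄_n toward θ seen in Figure 4 can be reproduced in a few lines. A sketch in R; the seed, θ, and the number of tosses are arbitrary illustrative choices, not values from the notes:

```r
set.seed(1)
theta <- 0.5
x <- rbinom(100000, size = 1, prob = theta)  # Bernoulli(theta) coin tosses
xbar <- cumsum(x) / seq_along(x)             # running sample means Xbar_n
xbar[c(10, 1000, 100000)]                    # settles toward theta as n grows
```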

[Figure 5: sample means X̄_n of i.i.d. Cauchy random variables, 20 simulated sequences, n up to 30000; the paths never settle down.]

5.4  The central limit theorem

Theorem 5.17 (central limit theorem). Let X_1, X_2, … be i.i.d. with µ = E[X_1] and σ^2 = V[X_1] finite. As n → ∞, the standardized sample mean

    Z_n = (X̄_n − µ)/(σ/√n)

converges in distribution to the standard normal distribution N(0, 1):

    lim_{n→∞} P(a ≤ (X̄_n − µ)/(σ/√n) ≤ b) = ∫_a^b (1/√(2π)) e^{−x^2/2} dx.

Z_n is exactly the standardization of X̄_n: since E[X̄_n] = µ and V[X̄_n] = σ^2/n, Definition 4.18 gives E[Z_n] = 0, V[Z_n] = 1. (The normal distribution is reviewed in A.3.) Without the magnification by √n the limit is degenerate, since X̄_n − µ → 0; the factor √n is precisely the scale at which a nontrivial limit appears, and remarkably that limit, the standard normal distribution, does not depend on the distribution of the X_k. This is the central limit theorem.

Example 5.18 (die rolls). For i.i.d. die rolls (Example 4.20), Figure 6 shows the simulated distribution of X̄_n for n = 1 (where X̄_1 = X_1), n = 5, and n = 100: as n grows, the histogram approaches a normal curve.
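Theorem 5.17 can be watched numerically for die rolls, as in Example 5.18. A sketch in R (the seed and the simulation sizes are arbitrary choices):

```r
set.seed(1)
mu <- 7/2; sigma <- sqrt(35/12)   # mean and sd of one die roll (Example 4.20)
n <- 100
z <- replicate(20000, {
  xbar <- mean(sample(1:6, n, replace = TRUE))
  (xbar - mu) / (sigma / sqrt(n))  # Z_n = (Xbar_n - mu) / (sigma / sqrt(n))
})
c(mean(z), sd(z))        # should be close to 0 and 1
mean(abs(z) <= 1.96)     # should be close to 0.95
```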

[Figure 6: histograms of the sample mean X̄_n of i.i.d. die rolls X_1, X_2, …, over 20000 simulations, for n = 1, 5, 100.]

Example 5.19 (opinion polls and TV ratings). As in 5.10, let X_k = 1 if the k-th sampled person supports (or watches) and X_k = 0 otherwise, with P(X_k = 1) = θ, P(X_k = 0) = 1 − θ and X_1, X_2, … i.i.d., so µ = E[X_k] = θ. For large n the central limit theorem gives

    P(−a ≤ (X̄_n − µ)/(σ/√n) ≤ a) ≈ ∫_{−a}^a (1/√(2π)) e^{−x^2/2} dx = Φ(a) − Φ(−a) = 2Φ(a) − 1,

where Φ(z) is the standard normal distribution function ((A.4), A.3). For a = 1.96, Φ(1.96) ≈ 0.9750, so

    P(−1.96 ≤ (X̄_n − µ)/(σ/√n) ≤ 1.96) ≈ 2Φ(1.96) − 1 ≈ 0.95:

with probability about 95 %, the unknown µ = θ lies in

    X̄_n − 1.96 σ/√n ≤ µ ≤ X̄_n + 1.96 σ/√n.

Here X̄_n is known from the poll but σ is not: θ = µ is the unknown, and σ^2 = θ(1 − θ). Substituting the estimate X̄_n for θ gives σ ≈ √(X̄_n(1 − X̄_n)). With n = 600 respondents, 1.96/√600 ≈ 0.08, so with probability about 0.95,

    X̄_n − 0.08 √(X̄_n(1 − X̄_n)) ≤ θ ≤ X̄_n + 0.08 √(X̄_n(1 − X̄_n)),

that is, θ ≈ X̄_n ± 0.08 √(X̄_n(1 − X̄_n)). Numerically:

    X̄_n   0.05   0.1    0.2    0.3    0.4    0.5
    ±     0.017  0.024  0.032  0.037  0.039  0.040

(On the precision of actual TV ratings, see https://www.videor.co.jp/tvrating/attention/index.html.)

Problem 5.20. How large must n be for the estimate to have confidence 0.99 and margin of error ±0.05 percentage points (±0.0005), in the worst case θ = 0.5? Since Φ(2.576) ≈ 0.995,

    P(−2.576 ≤ (X̄_n − µ)/(σ/√n) ≤ 2.576) ≈ 0.99,

so the margin is ±2.576 σ/√n. Requiring 2.576 √(θ(1 − θ))/√n ≤ 0.0005 with θ = 0.5 gives

    n ≥ θ(1 − θ)(2.576/0.0005)^2 = 0.25 × 5152^2 = 6635776,

about 6.6 million respondents.
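The arithmetic of Problem 5.20 in R:

```r
# Sample size for a +/- 0.0005 margin at 99% confidence, worst case theta = 0.5
z <- 2.576            # upper 0.5% point of N(0, 1): Phi(2.576) ~ 0.995
m <- 0.0005           # required margin of error (0.05 percentage points)
theta <- 0.5
n <- theta * (1 - theta) * (z / m)^2
n                     # 6635776
```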

56 6 6 sample size ( number of samples) parameter (estimator) (estimate) X n = n u 2 n = n n k= X k n (X k X n ) 2 k= n 6. (population) * 23 I * 24 (sample survey) (statistical inference) (inferential statistics) (descriptive statistics) (sampling) (sample) (sample size) *23 *24 27 670

6. 57 N n = n (sampling with replacement) (sampling without replacement) N n (N = ) 2.5, 6.2 N n/n 6.8 (simple random sampling) n N 6.. 6.2 ( ). N n N! N(N ) (N n + ) N C n = = (N n)! n! n! (6.) n! = N C n N(N ) (N n + ) n N N N n n! (6.2) n! N n n n N N n (6.) (6.2) 6.3. 70, 7, 72, 73, 74 5 3

58 6 6.2 i.i.d. (population distribution) n X,..., X n (N = ) X,..., X n 2.5 N n (sample size) n X,..., X n i.i.d. 6.3 (parameter) (population parameter) N(µ, σ 2 ) Poisson Poi(λ) µ, σ 2, λ (parametric) parameter (non-parametric) (population mean) (population variance) Poisson N(µ, σ 2 ) 6.4. parameter

6.4 (estimator) 59 X,..., X n i.i.d. 6.5. µ, σ 2 k n E[X k ] = E[X ] = µ, V [X k ] = V [X ] = σ 2. 6.4 (estimator) µ X n = n X,..., X n X,..., X n S(X,..., X n ) (statistic) * 25 (sample distribution) B (estimator) * 26 θ ˆθ = ˆθ n = ˆθ(X,..., X n ) n k= X k ˆ * 27 (estimate) X = x,..., X n = x n ˆθ(x,..., x n ) (point estimation) 7 6.6. 5.5 5 X,..., X 5 3 X 5 3 x = 8/5, 7/5, 9/5 3 (number of samples) 3 (sample size) 77 *25 statistic statistic statistics *26 (test statistic) *27 ˆµ = n X k n k=

60 6 6.5 (unbiasedness) (bias) 6.7 ( ). θ ˆθ E[ˆθ] = θ (unbiased estimator) 6.8. i.i.d. X n E[X n ] = µ 5.3 µ σ 2 (sample variance) (6.3) s 2 n = n n (X k X n ) 2 k= 6.9 ( ). u 2 n = n n s2 n (unbiased variance) (6.4) u 2 n = n n (X k X n ) 2 k= 6.0. σ 2 s 2 s 2 (6.3) (6.4) u 2 n s2 n (6.3) u 2 n 6.. PC Excel s 2 n (VAR.P) u2 n (VAR.S) R (var()) R Excel R u 2 n s 2 n = n n u2 n s 2 n
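The correspondence noted above (Excel's VAR.P vs VAR.S, R's var()) is easy to check on a small vector: var() returns the unbiased variance u_n^2, and multiplying by (n − 1)/n recovers the sample variance s_n^2. A minimal sketch in R:

```r
x <- c(1, 2, 4, 7)
n <- length(x)
u2 <- var(x)             # unbiased variance u_n^2 (denominator n - 1), like VAR.S
s2 <- (n - 1) / n * u2   # sample variance s_n^2 (denominator n), like VAR.P
c(u2, s2)                # 7 and 5.25 for this vector
all.equal(s2, mean((x - mean(x))^2))  # s_n^2 agrees with its definition
```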

6.6 (consistency) 6 6.2. u 2 n σ2 E[u 2 n] = σ 2.. E[s 2 n] = n n σ2 E[u 2 n] = n n n n σ2 = σ 2 s 2 n = n = n n { (Xk µ) (X n µ) } 2 k= n (X k µ) 2 2 n k= n (X k µ)(x n µ) + n k= n (X n µ) 2. k 2, 3 (X n µ) 3 n n (X n µ) 2 = (X n µ) 2, k= k= 2 2 n { n (X k µ)(x n µ) = 2(X n µ) n k= = 2(X n µ) 2 n X k n k= } n µ k= (6.5) sn 2 = n (X k µ) 2 (X n µ) 2 n k= 2 5.3 E[s 2 n] = n n E[(X k µ) 2 ] E[(X n µ) 2 ] = }{{}}{{} n nσ2 σ2 n = n n =V [X k ]=σ 2 k= =V [X n]=σ 2 /n σ2 6.3. σ u 2 n 6.6 [ u ] E 2 n σ [ u ] [ u ] E 2 n = σ E 2 n = E[u 2 n] = σ 2 = σ x2 dx x 2 dx 6.6 (consistency) X n n µ 5.4 6.4. θ ˆθ ( ) (consistent estimator) : ε > 0 lim P ˆθn θ > ε = 0 n

62 6 ( P lim n ) ˆθ n = θ = 6.5. ( ) ( ) n n ( ). lim n u2 n = lim n n s2 n = lim lim n n n s2 n = lim n s2 n s2 n (6.5) lim n s2 n = lim n n n (X k µ) 2 lim (X n µ) 2 n k= ( ) 2 P lim (X n µ) 2 = 0 = Y k = (X k µ) 2 n ( ) E[Y k ] = V [X k ] = σ 2 n P lim (X k µ) 2 = σ 2 =. n n ( k= P lim n s2 n = σ 2) =, u 2 n σ2 6.6. 6.3 E[ u 2 n] σ u 2 n n u 2 n σ 6.7 (efficiency) T = X + X 2 X 3, T 2 = X 3 = X + X 2 + X 3 3 E[T ] = E[X + X 2 X 3 ] = E[X ] + E[X 2 ] E[X 3 ] = µ + µ µ = µ E[T 2 ] = E[X 3 ] = µ µ T, T 2 4.6, 5.4 V [T ] = V [X ] + V [X 2 ] + V [X 3 ] = 3σ 2 V [T 2 ] = V [X 3 ] = σ2 3 T 2 T 2 T (efficient estimator) (maximum likelihood estimator) * 28 6.7. 2 X, X 2 µ T = αx + βx 2 α, β *28

6.8 63 T E[T ] = E[αX + βx 2 ] = αe[x ] + βe[x 2 ] = (α + β)µ E[T ] = µ α + β = V [T ] = V [αx + βx 2 ] = α 2 V [X ] + β 2 V [X 2 ] = (α 2 + β 2 )σ 2 α = β = 2 6.8. c,..., c n T = c X + + c n X n (best linear unbiased estimator, BLUE) 6.8 {x, x 2,..., x N } µ σ 2 µ = N σ 2 = N N i= x k N (x i µ) 2 = N i= N x 2 i µ 2 i= n X,..., X n 6.9. X k ( k n) (6.6) P [X k = x i ] = N ( i N) E[X k ] = µ, V [X k ] = σ 2.. (6.6) N n N(N ) (N n + ) X k = x i k x i N n (N )(N 2) (N (n ) + ) (6.6) E[X k ] = N x i = µ N V [X k ] = N N (x i µ) 2 = σ 2 i= i= 6.20. X n s 2 n (6.7) (6.8) (6.9) E[X n ] = µ V [X n ] = σ2 n N n N E[s 2 n] = n n σ2 N N

64 6 (6.9) s 2 n N N n n s2 n = N N n N (6.8) V [X n ] = σ2 n N n N C F = (finite population correction factor, FPC) u2 n N n N V [X n ] C 2 F C F N N n 00 40 00 40 00 0.6 0.6 0.77 2 7.2 Z n = X n µ N(0, ) σ2 /n X n µ C 2 F σ 2 /n X n ± z(α/2) σ C F n t X n ± t n (α/2) u2 n C F 2 0 Todo: {x,..., x N } 6.20 (6.7) E[X n ] = n k= n E[X k] = nµ = µ n (6.8) (6.9) (6.5) s 2 n (6.5) (6.8) E[s 2 n] = n n E[(X k µ) 2 ] E[(X n µ) 2 ] = }{{}}{{} n nσ2 V [X n ] =V [X k ]=σ 2 k= (6.8) 5.6 (6.0) =V [X n] = σ 2 σ2 n N n (n )N = N n(n ) σ2. [ ] Σk X k V [X n ] = V = n n 2 V [Σ kx k ] = n n n 2 V [X k ] + Cov(X k, X l ) k= k= l k (X k, X l ) N(N )

6.8 65 Cov(X k, X l ) = = N(N ) (x i µ)(x j µ) i j N (x i µ) N(N ) i= }{{} =0 = σ2 N k, l (B.3) 2 N (x i µ) 2 i= }{{} =Nσ 2 V [X n ] = ( ) n 2 nσ 2 + n(n ) σ2 = σ2 N n N n N

7  Interval estimation

Throughout this chapter, X_1, …, X_n is an i.i.d. sample with population mean µ = E[X_k] and population variance σ^2 = V[X_k], and

    X̄_n = (1/n) Σ_{k=1}^n X_k,   u_n^2 = (1/(n − 1)) Σ_{k=1}^n (X_k − X̄_n)^2.

A point estimate by itself, say the sample means 7/15 and 9/15 of Example 5.15, both estimating θ = 1/2, carries no indication of its own accuracy. Interval estimation supplements the estimate with a range and a guarantee.

Definition 7.1 (confidence interval). Fix 0 < α < 1. If statistics L, U computed from the sample satisfy, for the unknown parameter θ,

    P(L ≤ θ ≤ U) = 1 − α,   (7.1)

then [L, U] is called a 100(1 − α) % confidence interval (CI) for θ; 1 − α is the confidence coefficient and 100(1 − α) % the confidence level (CL). L and U are random variables: (7.1) says the random interval fails to capture θ with probability P(θ ∉ [L, U]) = α. Typical confidence levels are 95, 90, and 99 %; a 50 % level would mean missing the parameter half the time.

In constructing the interval, the failure probability α is usually split equally, α/2 in each tail.

For X ∼ N(0, 1) (see A.3) and 0 < α < 1, let z(α) denote the upper 100α % point, that is, the number with

    P(X > z(α)) = α.

In terms of the distribution function Φ(z) = P[X ≤ z]: α = P[X > z] = 1 − P[X ≤ z] = 1 − Φ(z), so z(α) = Φ^{−1}(1 − α). Values are tabulated or computed by PC; Table 3 lists the common ones.

    α      0.005   0.01    0.025   0.05    0.1
    z(α)   2.5758  2.3263  1.9600  1.6449  1.2816

    Table 3: upper 100α % points z(α) of N(0, 1).

Definition 7.2 (upper α point in general). For a distribution with distribution function F(x), the upper 100α % point is F^{−1}(1 − α).

7.2  Confidence interval for the mean (variance known)

Let X_1, …, X_n be i.i.d. N(µ, σ^2) with σ^2 known. Then (A.24) the standardized sample mean

    Z_n = (X̄_n − µ)/(σ/√n)

follows N(0, 1) exactly, so splitting α between the two tails,

    P[|Z_n| ≤ z(α/2)] = 1 − α.

With probability 1 − α, then,

    −z(α/2) ≤ (X̄_n − µ)/(σ/√n) ≤ z(α/2)
    ⟺ −z(α/2)·σ/√n ≤ X̄_n − µ ≤ z(α/2)·σ/√n
    ⟺ X̄_n − z(α/2)·σ/√n ≤ µ ≤ X̄_n + z(α/2)·σ/√n,

that is,

    P[X̄_n − z(α/2)·σ/√n ≤ µ ≤ X̄_n + z(α/2)·σ/√n] = 1 − α.

Thus, when the population is normal with known variance σ^2, the interval

    [X̄_n − z(α/2)·σ/√n, X̄_n + z(α/2)·σ/√n],  in short X̄_n ± z(α/2)·σ/√n,

is a confidence interval for µ with confidence coefficient 1 − α.

Example 7.3. With σ^2 = 10^2 known, a sample of n = 10 gave

    79, 112, 118, 90, 98, 112, 92, 110, 96, 92.

Find 95 % and 99 % confidence intervals for µ.

Solution. σ = 10, n = 10, and

    x̄_10 = (79 + 112 + ⋯ + 92)/10 = 99.9.

For 95 %, α = 0.05 and z(α/2) = z(0.025) = 1.96, so the interval is

    99.9 − 1.96·10/√10 ≤ µ ≤ 99.9 + 1.96·10/√10,  i.e.  93.7 ≤ µ ≤ 106.1.

For 99 %, z(0.005) = 2.576:

    99.9 − 2.576·10/√10 ≤ µ ≤ 99.9 + 2.576·10/√10,  i.e.  91.75 ≤ µ ≤ 108.05.
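A quick check of Example 7.3 in R:

```r
x <- c(79, 112, 118, 90, 98, 112, 92, 110, 96, 92)
sigma <- 10
n <- length(x)
xbar <- mean(x)                              # 99.9
xbar + c(-1, 1) * 1.96  * sigma / sqrt(n)    # 95% CI: about [93.7, 106.1]
xbar + c(-1, 1) * 2.576 * sigma / sqrt(n)    # 99% CI: about [91.75, 108.05]
```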

7.3 69 93.7 µ 06. 95%! N(00, 0 2 ) 0 µ = 00 95% [93.7, 06.] 00 03, 05, 80, 82, 3, 78, 90, 99, 03, 92 95% x 0 = 94.5 [88.3, 00.7] 79, 90, 02, 84, 97, 85, 09, 09, 75, 94 95% [86.2, 98.6] 00 L, U = [L, U] 00( α) % 7 95% CI on the mean (µ = 00) 0 00 90 0 50 00 samples 7: N(00, 0 2 ) 0 95% 00 00 95% 95% 00 94 00 6 7.3 * 32 t B.8 N(m, σ 2 ) n T = X n µ u 2 n /n *32

70 7 n t t n * 33 µ N(0, ) t T t n α t n (α) P [T > t n (α)] = α t N(0, ) P ( T t n (α/2)) = α. ( ) u 2 P X n t n (α/2) n u 2 n µ X n + t n (α/2) n = α n µ α [ ] u 2 X n t n (α/2) n u 2 n, X n + t n (α/2) n n 7.5. n = 6 x 6 = 32.6, u 2 6 = 9.28 µ 95% PC t t 5 (0.05/2) = 2.34 u 2 x 6 ± t 5 (0.025) 5 = 32.6 ± 2.35 6 95% [30.9767, 34.2233] 9.28 6 = 32.6 ±.6233 u 2 7.6. x 6 ±z(0.025) 6 32.6±.4927 6 [3.073, 34.0926] 7.4 n X n µ σ2 /n N(0, ) 6.5 σ 2 u 2 n µ α [ ] u 2 X n z(α/2) n u 2 n, X n + z(α/2) n n *33 T = Xn µ u 2 n /n t n t T t T t

7.5 7 5.9 5.9 σ 2 u 2 n σ2 X n ( X n ) 7.5 σ 2 σ 2 B.5 N(m, σ 2 ) n χ 2 = (n )u2 n σ 2 = ns2 n n σ 2 = ( Xk X n σ k= ) 2 n χ 2 χ 2 n σ2 t χ 2 α/2 α/2 X χ 2 n α χ2 n(α) P (X > χ 2 n(α)) = α P (X χ 2 n(α)) = α α α χ 2 0 P (X < χ 2 n( α)) = α χ n α χ 2 n( α) χ 2 ( P χ 2 < χ 2 n ( α 2 )) = α 2, P ( χ 2 n ( α ) < χ 2) = α 2 2 P ( ( χ 2 n α ) (n ( )u2 n α ) ) 2 σ 2 χ 2 n = α 2 σ 2 σ 2 a (n )u 2 n ( α ) σ 2 (n )u2 n ( χ 2 n χ 2 n α ) 2 2 7.7. 0 79, 2, 8, 90, 98, 2, 92, 0, 96, 92 95% x 0 = 99.9, u 2 0 = 55.7. 95% α = 0.05, n = 0 = 9 χ 2 9(α/2) χ 2 9( α/2) PC χ 2 9(0.025) = 9.023, χ 2 9(0.975) = 2.700 95% 9 55.7 9.023 = 73.7 σ 2 9 55.7 2.700 = 59 7.3 7 8

72 7 95% CI on the variance (σ 2 = 00) 800 400 200 00 95% CI on the SD (σ = 0) 800 400 200 00 0 0 50 00 samples 0 0 50 00 samples 8: N(00, 0 2 ) 0 95% 00 00 93 00 6 7.8. X n ± u 2 n ± ± (73.7 + 59)/2 = 296.35 296.35 ± 222.65 296.35 7.6 (Todo) Poisson

73 8 8.0 χ 2 t F Z Welch Kolmogorov Smirnov Mann Whitney χ 2, t, F Welch, Kolmogorov Smirnov, Mann Whitney * 34 8. 2 0 6 00 60 000 600 (hypothesis testing) /2 n S n S n B(n, /2) n = 0, 00, 000 S n E[S 0 ] = 5, E[S 00 ] = 50, E[S 000 ] = 500 6, 60, 600 6 60 600 *34 χ 2

74 8 PC P (S 0 6) = P (S 00 60) = P (S 000 600) = 0 k=6 00 k=60 000 k=600 ) k ( ) 0 k 0.38 2 ) k ( ) 00 k 0.028 2 ( ) k ( ) 000 k 000C k.4 0 0 2 2 0C k ( 2 00C k ( 2 0 6 40% 0 6 000 600 0 00 0 /2 * 35 00 60 60 3% 5% 3% /2 % 0.% 3% () /2 (2) (3) () () (3) * 36 8. ( ). P (S 0 6) = 93 52 P (S 00 60) = 45072645608352292345325 5845632502852867587087900672 P (S 000 600) = 73089320690598842797865573005225577742928794709757746870889648667442944329096966823243962042856663435520388875593650878367399544530755309588859269309524888525602648649655088227598396409695358906066488393929743590757990835830035567704568666485763945080246997889 535754303593336604742252453000090528070240585276680372875948575525562468062465998940784792906379733645877657342593572642846570279922887873492874096728388742549270537302538557093897709076523237497909706336993837795827797303853457285598238843270838302495826329348602834034688 *35 *36

8.2 75 30 29 PC PDF % 8.2 8.2. 8.2.6 ( ) () (null hypothesis) H 0 (2) X (3) (significance level) 0 < α < α = 0.05, 0.0 (4) H 0 P H0 (X A) = α A P H0 (X A) α 8.4 A (critical region) (5) X x x A H 0 (reject) x / A H 0 () (4) (5) 8.2.

76 8 8.3 ( ). M 99% 3 5 95% a M 3.5 M H 0 M X H 0 P (X 3) = 0.99, P (X 5) = 0.95 P (X < 3) = 0.0, P (X < 5) = 0.05, 5% A 5 = {x < 5} % A = {x < 3} x = 3.5 A 5 5% H 0 A % H 0 a 3 8.4 ( ). H 0 θ = /2 n S n B(n, θ) S n n = 00 P (S 00 58) 0.044, P (S 00 57) 0.067 5% S 00 58 P (X A) = α (4) 60 5% 8.5 ( ). 65 db 5 64 db σ 2 = 5 5% µ H 0 : µ = 65 Z = X n µ σ2 /n H 0 Z N(0, ) α = 0.05 P H0 ( Z >.96) = 0.05 Z >.96 64 65 z = = 3.73 5% H 0 5/5 0% P H0 ( Z >.64) = 0..73 H 0 8.8 db * 37 *37. 64 db 65 db

8.2 77 8.6. 65 db 5 64 db σ 2 = 5 5% H 0 : µ = 65 Z = X n µ σ2 /n H 0 Z N(0, ). α = 0.05 P (Z <.645) = 0.05 Z <.645 64 65 z = = 3.73 5% H 0 5/5 8.2.2 null hypothesis hypothesis null zero nothing

78 8 8.7. (accept) A Statistical methods in psychology journals: Guidelines and explanations Never use the unfortunate expression accept the null hypothesis. Fisher... the null hypothesis is never proved or established, but is possibly disproved,... Fisher 8.2.3 2 0 8.4 H 0 : θ = /2 S 00 = 00 0 H 0 * 38 (error of the first kind / type I error) * 39 2 (error of the second kind / type II error) 0 α, 2 β H 0 H 0 H 0 OK ( α) Type I error (α) H 0 Type II error (β) OK ( β) P H0 (A) = α α α α * 40 A B = P (A) P (B) *38 *39 *40

8.2 79 8.8. α = 0 P H0 (A) = α = 0 A = 00% 7.4 8.2.4 8.5 8.6 P H0 (A) = α A 9 2 p(x) A C D B x 9: P H0 (A) = P H0 (B) = P H0 (C) = P H0 (D) = α 2 β β 2 (power) (alternative hypothesis) H = H X A H 0 H A P H (X A) A = β = P H (X A) A 8.2.5 * 4 N(µ, ) H 0 : µ = 0 0 A = [z(α), ), B = (, z(α)], C = (, z(α/2)] [z(α/2), ) *4 likelihood Neyman Pearson

80 8 H 0 : N(0, ) A B C z(α/2) z(α) 0 z(α) z(α/2) 0: α H : µ > 0 P H0 (I) = α I P H (I) P H (A) A c [c, ) (one-sided test) H 0 H A B C z(α/2) z(α) 0 z(α) z(α/2) : µ > 0 P H (A) > P H (C) > P H (B) A H : µ < 0 P H (B) B H H 0 A B C z(α/2) z(α) 0 z(α) z(α/2) 2: µ < 0 P H (B) > P H (C) > P H (A) B H : µ 0 A B C c, c (, c] [c, ) (two-sided test)

8.3 8 α P H0 ((, c]) = P H0 ([c, )) = α/2 H? H 0 H? A B C z(α/2) z(α) 0 z(α) z(α/2) 3: µ 0 S n N * 42 8.9 8.2.6 () H 0, H (2) α (3) P H0 (X A) α P H (X A) A α P H0 (X > u(α)) = α α P H0 (X < l(α)) = α A = [u(α), ) A = (, l(α)] A = (, l(α/2)] [u(α/2), ) (4) X x x A H 0 x / A 8.3 B χ 2, t, F 3 *42 χ 2

82 8 * 43 t χ 2 8.9 (?). 4 205 27 A 36 0.06 0.04 0.02 0 0 4 8 2 6 20 24 28 32 36 4: 205 A 36 8.3. Z 8.5, 8.6 N(µ, σ 2 ) σ 2 H 0 : µ = 65 Z = X n µ σ2 /n µ σ2 Z 8.5 H : µ 65 8.6 H : µ < 65 8.5 8.6 *43

8.3 83 5% 64 8.5 H 0 8.6 H 0 8.3.2 t 8.5, 8.6 Z = X n µ σ2 /n σ2 µ Z Z N(0, ) σ 2 Z = X n µ σ2 /n σ 2 u 2 n T = X n µ u 2 n /n T µ µ T T t B.8 t t 8.0. T Gosset T t T t 8. ( t [6] ). 00m 58.8 0 57.6, 58.2, 56.2, 57.3, 58.7, 58.8, 56.3, 57., 57.3, 57. 57.46 0% () µ H 0 : µ = 58.8 H : µ < 58.8. (2) T = X n µ u 2 n /n (n = 0) H 0 T t n = t 9. (3) H P t9 (T <.383) = 0.. t 9 (0.) =.383. (4) u 2 57.46 58.8 0 = 0.794 T t = = 4.755 0.794/0 H 0 0% 8.2. R > t.test(c(57.6, 58.2, 56.2, 57.3, 58.7, 58.8, 56.3, 57., 57.3, 57.),

         alternative="less", mu=58.8)

        One Sample t-test

data:  c(57.6, 58.2, 56.2, 57.3, 58.7, 58.8, 56.3, 57.1, 57.3, 57.1)
t = -4.756, df = 9, p-value = 0.000576
alternative hypothesis: true mean is less than 58.8
95 percent confidence interval:
     -Inf 57.97646
sample estimates:
mean of x
    57.46

R reports the t statistic, the p-value (0.000576, about 0.06 %, far below 10 %), and a one-sided 95 % confidence interval (−∞, 57.97646]. As in 8.6 the test is one-sided, so t.test is given alternative="less".

Remark 8.13 (p-values). Instead of fixing a rejection region in advance, one can report the p-value: the probability, under H_0, of a result at least as extreme as the one observed. In Example 8.11, with T ∼ t_9 and observed t = −4.756, the p-value is P(T < −4.756) = 0.000576. The test rejects H_0 at level α exactly when p < α, so the p-value is the smallest significance level at which the observed data would reject H_0: the smaller the p-value, the stronger the evidence against H_0.

8.3 85 8.3.3 n n = 00 n = 20 * 44 8.4 ( [6] ). 95 0.5 (ppm) 0.03 (ppm 2 ) 0.3 (ppm) % () µ H 0 : µ = 0.3 H : µ > 0.3. (2) n = 95 Z = X n µ N(0, ) u 2 n /n (3) H Z N(0, ) P (Z > 2.3263) = 0.0. 0.5 0.3 (4) z = =.25 H 0 % 0.03/ 95 8.5 ( ). Z = X n µ σ2 /n N(0, ) or T = X n µ u 2 n /n t n t T = X n µ N(0, ) u 2 n /n t n t n N(0, ) 0.05 N(0, ) z(0.05) =.644854 t 20 (0.05) =.72478, t 94 (0.05) =.66226 5%, % n 00 T t n t t 8.3.4 χ 2 8.6 ( [6] ). (ml/sec) 0 2 20 5 2 5% *44 B(n, θ) θ /2 n = 20

86 8 () σ 2 H 0 : σ 2 = 0 2. H : σ 2 < 0 2. (2) χ 2 = (n )u2 n σ 2 (n = 20). H 0 χ 2 χ 2 9. (3) H P χ 2 9 (χ 2 < 0.2) = 0.05. 5% α 95% (4) χ 2 9 52 = 0 2 = 4.75 H 0 5% 8.4 2 m, n X = m Y = n X, X 2,, X m N(µ x, σ 2 x) Y, Y 2,, Y n N(µ y, σ 2 y) m k= n k= X k, u 2 x = m Y k, u 2 y = n m (X k X) 2, k= n (Y k Y ) 2 k= 2 2 8.4. 2 t Welch X N(µ x, σ 2 x/m), Y N(µ y, σ 2 y/n) X Y N(µ x µ y, σ 2 x/m + σ 2 y/n) (8.) X Y (µ x µ y ) σ 2 x m + σ2 y n N(0, ) σx 2 = σy 2 = σ 2 (8.) σ 2 pooled variance ˆσ 2 = (m )u2 x + (n )u 2 y m + n 2 X Y (µ x µ y ) ( ˆσ 2 m + ) n m + n 2 t

When σ_x^2 and σ_y^2 are unknown and cannot be assumed equal, replace them in (8.1) by u_x^2 and u_y^2:

    (X̄ − Ȳ − (µ_x − µ_y)) / √(u_x^2/m + u_y^2/n)

is approximately t distributed, with degrees of freedom ν given by

    ν = (u_x^2/m + u_y^2/n)^2 / [ (u_x^2/m)^2/(m − 1) + (u_y^2/n)^2/(n − 1) ].

The two-sample test based on this approximation is Welch's test.

Remark 8.17. It was long taught to first test equality of variances (the F test of 8.4.3) and then choose between the two-sample t test and Welch's test, but the now-prevailing view is to use Welch's test from the start [7].

Example 8.18. Two groups of 10 measurements:

    x : 203, 203, 197, 198, 206, 216, 195, 200, 202, 199
    y : 204, 195, 210, 229, 221, 211, 204, 193, 215, 220

Test H_0 : µ_x = µ_y at the 5 % level with Welch's test. In R:

> x <- c(203, 203, 197, 198, 206, 216, 195, 200, 202, 199)
> y <- c(204, 195, 210, 229, 221, 211, 204, 193, 215, 220)
> t.test(x, y)

        Welch Two Sample t-test

data:  x and y
t = -2.026, df = 13.464, p-value = 0.06305
alternative hypothesis: true difference in means is not equal to 0
95 percent confidence interval:
 -17.1195587   0.5195587
sample estimates:

mean of x mean of y
    201.9     210.2

Since p = 0.06305 > 0.05, the two-sided test does not reject H_0: no significant difference between the means is detected. If, before seeing the data, there was reason to suspect that x tends to be smaller, a one-sided test is appropriate:

> t.test(x, y, alternative="less")

        Welch Two Sample t-test

data:  x and y
t = -2.026, df = 13.464, p-value = 0.03153
alternative hypothesis: true difference in means is less than 0
95 percent confidence interval:
     -Inf -1.06408
sample estimates:
mean of x mean of y
    201.9     210.2

Now p = 0.03153 < 0.05, so H_0 is rejected in favor of µ_x < µ_y: the mean of x is significantly smaller.

8.4.2  Paired samples

When the two samples are paired, n subjects each measured twice as X_1, …, X_n and Y_1, …, Y_n, test the differences d_k = X_k − Y_k with a one-sample t test of mean 0.

8.4.3  Comparing variances: the F test

Since (m − 1)u_x^2/σ_x^2 ∼ χ^2_{m−1} and (n − 1)u_y^2/σ_y^2 ∼ χ^2_{n−1}, the ratio

    (u_x^2/σ_x^2) / (u_y^2/σ_y^2) ∼ F_{m−1, n−1}

(the F distribution), which under H_0 : σ_x^2 = σ_y^2 reduces to u_x^2/u_y^2 ∼ F_{m−1, n−1}: the F test for equality of variances.

8.4.4  Comparing three or more groups

8.5  χ^2 tests

The χ^2 tests below are all based on statistics of the form

    χ^2 = Σ {(Observed) − (Expected)}^2 / (Expected) = Σ (O − E)^2 / E.

8.5.1  The χ^2 goodness-of-fit test

Suppose each observation falls into one of K categories A_1, A_2, …, A_K, and consider

    H_0 : P(A_1) = p_1, P(A_2) = p_2, …, P(A_K) = p_K.

Under H_0, n observations are expected to produce m_1 = np_1, m_2 = np_2, …, m_K = np_K cases of the respective categories. Let f_1, …, f_K be the observed frequencies (Σ_{i=1}^K f_i = n):

    category   A_1  A_2  …  A_K  | total
    observed   f_1  f_2  …  f_K  | n
    expected   m_1  m_2  …  m_K  | n

Fact 8.20. If n is large and no expected frequency m_i is too small (say, each m_i ≥ 10), then under H_0

    χ^2 = Σ (O − E)^2/E = Σ_{i=1}^K (f_i − m_i)^2/m_i

is approximately distributed as χ^2 with K − 1 degrees of freedom, χ^2_{K−1}.

Sanity check for K = 2: here f_2 = n − f_1 and p_2 = 1 − p_1, so

    χ^2 = (f_1 − m_1)^2/m_1 + (f_2 − m_2)^2/m_2
        = (f_1 − np_1)^2/(np_1) + (n − f_1 − (n − np_1))^2/(n(1 − p_1))
        = (f_1 − np_1)^2/(np_1(1 − p_1))
        = { (f_1 − np_1)/√(np_1(1 − p_1)) }^2.

Since f_1 ∼ B(n, p_1), the quantity in braces is the standardized binomial, approximately N(0, 1) for large n, so its square is approximately χ^2 with 1 = K − 1 degree of freedom, as claimed.

If f_i = m_i for every i then χ^2 = 0, and the more the f_i deviate from the m_i, the larger χ^2 becomes. The test therefore rejects H_0 at level α when χ^2 exceeds the upper point χ^2_{K−1}(α), where P(χ^2 > χ^2_{K−1}(α)) = α.

Example 8.21. A die was rolled 120 times with the results below. Is the die fair, at the 5 % level?

    face       1   2   3   4   5   6   | total
    observed  18  25  17  20  22  18   | 120

H_0 : P(A_1) = P(A_2) = ⋯ = P(A_6) = 1/6, so the expected frequencies are m_1 = ⋯ = m_6 =

90 8 20 /6 = 20 χ 2 = (8 20)2 20 + (25 20)2 20 + + (8 20)2 20 = 2.3 6 = 5 χ 2 P [χ 2 >.07] = 0.05 χ 2 = 2.3 5% 5% 8.22 ( ). / / χ 2 4 = 3 χ 2 9/6 3/6 3/6 /6 32.75 04.25 04.25 34.75 556 35 0 08 32 556 χ 2 (35 32.75)2 (0 04.25)2 = + + 32.75 04.25 (32 34.75)2 + = 0.470 34.75 P [χ 2 > 7.85] = 0.05 (08 04.25)2 04.25 5% 8.23. 3 χ 2 P [χ 2 < 0.470] = 0.075 8.24. A.36 300 2.07 2.04 Poisson 5% 0 2 3 4 5 6 38 75 89 54 20 9 5 300 λ λk λ = 2.07 Poisson p(k) = e k! 300 p(k) 0 2 3 4 5 6 38 75 89 54 20 9 5 300 38 78 8 56 29 2 6 300

8.5 χ 2 9 7 = 6 P χ 2 6 (χ 2 > 2.59) = 0.05 χ 2 (38 39)2 (5 3)2 = + + = 39 3 8.02 Poisson 8.25. 7 = 6 5 χ 2 P χ 2 5 (χ 2 >.07) = 0.05 λ = 2 6 χ 2 8.5.2 χ 2 n A, B r s B B 2 B s A n n 2 n s n A 2 n 2 n 22 n 2s n 2........ A r n r n r2 n rs n r n n 2 n s n n i = s n ij, n j = j= A B P (A i ) = n i n, P (B j) = n j n A B P (A i B j ) = P (A i )P (B j ) (A i, B j ) m ij r i= n ij m ij = n P (A i B j ) = n ni n n j n = n i n j n 8.26. n m ij 5 A B χ 2 (O E) 2 E r s (n ij m ij ) 2 = i= j= m ij m ij = n i n j n (r )(s ) χ 2 A B (r )(s ) 8.25 r s rs (r ) (s ) = (r )(s )

92 8 χ 2 = 0 χ 2 α P (χ 2 > χ 2 (r )(s ) (α)) = α 8.27. χ 2 χ 2 2 2 χ 2 = 2 2 i= j= ( n ij m ij ) 2 2 (Yates correction) R m ij 8.28. 60 9 3 2 8 30 48 27 33 60 27 2/60 = 5.4 33 2/60 = 6.6 27 48/60 = 2.6 33 48/60 = 26.4 (2 )(2 ) = P χ 2 (χ 2 > 3.84) = 0.05 χ 2 χ 2 = 5.4545 χ 2 = 4.0446 5% 8.29. A, B 5% A 5555 256 B 602 32 A 5299 256 B 569 32 5% P χ 2 (χ 2 > 3.84) = 0.05 χ 2 χ 2 = 3.06

8.6 93 8.30. A/B A/B χ 2 A/B B(N A, p A ), B(N B, p B ) H 0 : p A = p B 8.6 8.7 (Todo) Brunner-Munzel 8.8 (Todo) 8.9 (Todo)
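The 2×2 independence test discussed above (Example 8.28; reading the observed counts as 9, 18 in the first row and 3, 30 in the second) runs directly in chisq.test: correct=FALSE gives the uncorrected χ^2 = 5.4545, while the default applies the Yates continuity correction (4.0446).

```r
tab <- matrix(c(9, 3, 18, 30), nrow = 2)    # observed 2x2 table (Example 8.28)
chisq.test(tab, correct = FALSE)$statistic  # uncorrected chi-squared: 5.4545
chisq.test(tab)$statistic                   # with Yates correction:   4.0446
```

Note that R applies the Yates correction to 2×2 tables by default, matching the remark about continuity correction above.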

94 A A WWW A. (Bernoulli) A.. 0 θ 0, X P (X = ) = θ, P (X = 0) = θ X (Bernoulli distribution) P (X = i) = θ i ( θ) i (i = 0, ) X 0 θ A.2. θ, θ( θ). E[X] = θ+0 ( θ) = θ, E[X 2 ] = 2 θ+0 2 ( θ) = θ V [X] = E[X 2 ] (E[X]) 2 = θ( θ). A.3. θ = 0, 0 θ = 2 4 θ = 0, 0 θ = 2 A.6 X, X 2,... i.i.d. (Bernoulli process) θ A.2 (binomial distribution) X, X 2,..., X k,... (A.) S n = n k= X k n

A.2 (binomial distribution) 95 A.4 ( ). S n (binomial distribution) B(n, θ) X B(n, θ) X B(n, θ) B(, θ) A.5. B(n, θ) (A.2) n C k n C k = { nc k θ k ( θ) n k k = 0,,..., n, n p(k) = 0 n! k! (n k)! n k. 0 X k 0 S n n k < 0, n + < k k P (S n = k) = 0 P (S n = k) = n C k θ k ( θ) n k k = 0,,..., n, n X k 0 S n = k X, X 2,..., X n k i k = 0, k X = i, X 2 = i 2,..., X n = i n i + + i n = k * 45 P (S n = k) = P (X = i,..., X n = i n ) i + +i n=k i + + i n = k = P (X = i ) P (X n = i n ) = = = i + +i n=k i + +i n=k i + +i n=k i + +i n=k θ i ( θ) i θ in ( θ) in θ i+ +in ( θ) n (i+ +in) θ k ( θ) n k i + + i n = k = n C k θ k ( θ) n k i + + i n = k n C k *45 n = 3, k = i, i 2, i 3 k = 2