Notes on Mathematical Statistics I


1 I ver. 0/Apr/208 * (inferential statistics) *2 A, B *3 5.9 *4 *5 [6] [],.., [2].., 973. [3]. R (Wonderful R )., [4]. ( )., [5]. ( )., [6],.., 989. [7] , [4] [5] [2] * *2 (descriptive statistics) *3 4 A 5 B *4 *5

2 2 [] [6] web PDF PC SAS, SPSS, R R R [3] A A complement A c A A n(a) P P (A) A p p X (x) X 4.7 P (A B) B A 2. P B(A) E[X], V [X] X 4., 4.5 µ = E[X], σ = V [X], σ 2 = V [X] N(µ, σ 2 ) µ, σ 2 σ A.3 N(, 4) 4. N(, 4) 4 X X A.4 i.i.d. 5.8 X, X 2,..., X n X n 5.2 u 2 n (conditional probability) (independence) Bayes

3 i.i.d (parameter) (estimator) (unbiasedness) (consistency) (efficiency) χ A 94 A. (Bernoulli) A.2 (binomial distribution) A.3 (normal distribution) A.4 Poisson A.5 (geometric distribution) A.6 (exponential distribution) A.7 (hypergeometric distribution) B χ 2 t F 3 B. (χ 2 )

4 4 B.2 t B.3 F B C 20

1 Probability

1.1 Classical probability (Laplace)

Definition 1.1 (classical definition of probability). When an experiment has finitely many equally likely outcomes, the probability of an event A is

    P(A) = |A| / |Ω|,

the number of outcomes in A divided by the total number of possible outcomes.

This definition has two difficulties: (1) "equally likely" already presupposes a notion of probability, so the definition is circular; (2) it cannot handle continuous experiments. If a point a is drawn uniformly from the interval [0, 1] = {x ∈ R : 0 ≤ x ≤ 1}, the probability of obtaining any particular point a ∈ [0, 1] is 0, and counting outcomes is meaningless.

Even though each single point of [0, 1] has probability 0, the probability that a uniformly chosen point a ∈ [0, 1] lands in the subinterval [0, 1/2] should be 1/2; no sum of zeros produces 1/2, so probability cannot be built up point by point.

1.2 Axiomatic probability (Kolmogorov)

Definition 1.2 (sample space). The set Ω of all possible outcomes of an experiment is called the sample space.

Example 1.3. A coin toss: Ω = {heads, tails}, or, coding heads as 1 and tails as 0, Ω = {0, 1}. Two coin tosses: Ω = {(0, 0), (0, 1), (1, 0), (1, 1)}. One roll of a die: Ω = {1, 2, 3, 4, 5, 6}. The number of heads in n coin tosses: Ω = {0, 1, ..., n}. A count with no a priori upper bound: Ω = {0, 1, 2, ...}. A waiting time: Ω = [0, ∞) = {x ∈ R : 0 ≤ x < +∞}. An infinite sequence of coin tosses: Ω = {(a_1, a_2, a_3, ..., a_n, ...) : each a_n = 0 or 1}. A sample space may thus be finite, countably infinite, or uncountable.

Definition 1.4 (event). A subset A ⊂ Ω of the sample space is called an event. A one-point set {ω} (ω ∈ Ω) is an elementary event, Ω itself is the sure event, and the empty set ∅ is the impossible event.

Example 1.5. For one coin toss with Ω = {0, 1}: {1} (heads), {0} (tails), and {0, 1} = Ω are events. For two tosses, {(0, 1)} ("tails then heads") and {(1, 0), (1, 1)} ("the first toss is heads") are events.

Example 1.6. For one die, Ω = {1, 2, 3, 4, 5, 6}: (1) "an even number" is the event {2, 4, 6}; (2) "odd, or at most 3" is the event {1, 2, 3, 5}; (3) "a prime number" is the event {2, 3, 5}.

Definition 1.7. For events A, B ⊂ Ω: the event that both A and B occur is the intersection A ∩ B; the event that A or B (or both) occurs is the union A ∪ B; the event that A does not occur is the complement A^c.

Definition 1.8. If A ∩ B = ∅, the events A and B are disjoint (mutually exclusive).

The intersection A ∩ B corresponds to "A and B", the union A ∪ B to "A or B"; note A ∩ B ⊂ A ⊂ A ∪ B and A ∩ B ⊂ B ⊂ A ∪ B.

Definition 1.9. For a sequence of events A_1, A_2, ..., the event that all of them occur is the intersection ∩_{i=1}^∞ A_i, and the event that at least one of them occurs is the union ∪_{i=1}^∞ A_i.

Example 1.10. In Example 1.6, "odd" is {1, 3, 5} and "at most 3" is {1, 2, 3}, so "odd or at most 3" is the union {1, 3, 5} ∪ {1, 2, 3} = {1, 2, 3, 5}, which is Example 1.6 (2). The complement depends on the sample space: for A = {1} in Ω = {1, 2, 3, 4, 5, 6}, A^c = {2, 3, 4, 5, 6}, while for A = {1} in Ω = {1, 2, 3, 4}, A^c = {2, 3, 4}.

Proposition 1.11 (de Morgan's laws). (1) (A ∪ B)^c = A^c ∩ B^c: "not (A or B)" means "neither A nor B". (2) (A ∩ B)^c = A^c ∪ B^c: "not (A and B)" means "not A, or not B".

Proposition 1.12. For a family of events {A_i}_i:

    (∪_i A_i)^c = ∩_i A_i^c,    (∩_i A_i)^c = ∪_i A_i^c.

The same laws hold for arbitrary (finite or infinite) families: the complement of "at least one A_i occurs" is "no A_i occurs", i.e. (∪_i A_i)^c = ∩_i A_i^c, and symmetrically for intersections.

Definition 1.14 (probability measure). An assignment P(A) of a real number to each event A ⊂ Ω is called a probability measure on Ω, and the pair (Ω, P) a probability space, if:

(1) 0 ≤ P(A) ≤ 1 for every event A ⊂ Ω;
(2) P(Ω) = 1;
(3) (countable additivity) for events A_1, A_2, ... that are pairwise disjoint (i ≠ j implies A_i ∩ A_j = ∅),

    P(∪_{n=1}^∞ A_n) = Σ_{n=1}^∞ P(A_n).    (1.1)

In particular, (3) contains finite additivity: for disjoint A, B ⊂ Ω, P(A ∪ B) = P(A) + P(B), and for pairwise disjoint A, B, C, P(A ∪ B ∪ C) = P(A) + P(B) + P(C) (take the remaining events in (1.1) to be ∅).

Proposition 1.15 (basic properties). (1) P(A^c) = 1 − P(A); in particular P(∅) = 0. (2) If A ⊂ B, then P(B \ A) = P(B) − P(A). (3) If A ⊂ B, then P(A) ≤ P(B); together with P(∅) = 0 and P(Ω) = 1 this gives 0 ≤ P(A) ≤ 1 for every A. (4) P(A ∪ B) = P(A) + P(B) − P(A ∩ B); equivalently P(A ∩ B) = P(A) + P(B) − P(A ∪ B).
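The additivity rules of Proposition 1.15 can be checked mechanically on a small equally-likely sample space. A minimal Python sketch (the notes' computing examples use R and Excel; Python is used here purely for illustration, and the helper `P` is ad hoc):

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}  # one fair die

def P(event):
    """Classical probability P(A) = |A| / |Omega| on an equally likely space."""
    return Fraction(len(event & omega), len(omega))

A = {2, 4, 6}  # "even"
B = {2, 3, 5}  # "prime"

# Proposition 1.15 (4): P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert P(A | B) == P(A) + P(B) - P(A & B)
# Proposition 1.15 (1): P(A^c) = 1 - P(A)
assert P(omega - A) == 1 - P(A)
print(P(A | B))  # 5/6
```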

Proof of Proposition 1.15. (1) A and A^c are disjoint with A ∪ A^c = Ω, so 1 = P(Ω) = P(A) + P(A^c); taking A = Ω gives P(∅) = 0. (2) If A ⊂ B, then B = A ∪ (B \ A) with A ∩ (B \ A) = ∅, so P(B) = P(A) + P(B \ A), i.e. P(B \ A) = P(B) − P(A). (3) follows from (2) since P(B \ A) ≥ 0. (4) A ∪ B = A ∪ (B \ (A ∩ B)) is a disjoint union, so, using (2) with A ∩ B ⊂ B,

    P(A ∪ B) = P(A) + P(B \ (A ∩ B)) = P(A) + P(B) − P(A ∩ B).

Remark 1.16. Since P(A ∩ B) ≥ 0, always P(A ∪ B) = P(A) + P(B) − P(A ∩ B) ≤ P(A) + P(B), with equality exactly when A and B are disjoint, as in axiom (3). More generally, countable subadditivity holds:

    P(∪_{n=1}^∞ A_n) ≤ Σ_{n=1}^∞ P(A_n).    (1.2)

Example 1.17 (classical probability as a probability measure). If Ω is finite and all outcomes are equally likely, then P(A) = |A| / |Ω| satisfies Definition 1.14: for disjoint A, B, |A ∪ B| = |A| + |B| gives additivity.

Example 1.18 (two dice). Roll two distinguishable dice. (1) Describe the sample space. (2) Find the probability that both dice show the same number, and (3) that the sum is at most six. (4) Find the probability that (2) or (3) occurs. For (1): Ω = {(1, 1), (1, 2), ..., (1, 6), (2, 1), ..., (6, 6)}, i.e. Ω = {(i, j) : 1 ≤ i, j ≤ 6}, with |Ω| = 36 equally likely outcomes.

(Here 1 ≤ i, j ≤ 6 abbreviates 1 ≤ i ≤ 6 and 1 ≤ j ≤ 6.) For (2) and (3), the events are

    A = {(1, 1), (2, 2), (3, 3), (4, 4), (5, 5), (6, 6)},
    B = {(1, 1), (1, 2), (2, 1), (1, 3), (2, 2), (3, 1), (1, 4), (2, 3), (3, 2), (4, 1), (1, 5), (2, 4), (3, 3), (4, 2), (5, 1)},

so |A| = 6, |B| = 15 and P(A) = 6/36 = 1/6, P(B) = 15/36 = 5/12. (4) Since A ∩ B = {(1, 1), (2, 2), (3, 3)},

    P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 6/36 + 15/36 − 3/36 = 1/2.

In general, to find the probability of "A or B" one computes the three quantities P(A), P(B), P(A ∩ B).

Exercise 1.20. Suppose P(A) = 1/2, P(B) = 1/3, P(A ∩ B) = 1/4. Find (1) P(A^c ∪ B^c), (2) P(A^c ∩ B^c), (3) P(A^c ∩ B), (4) P(A ∩ B^c).

Solution. (1) P(A^c ∪ B^c) = P((A ∩ B)^c) = 1 − P(A ∩ B) = 1 − 1/4 = 3/4. (2) P(A^c ∩ B^c) = P((A ∪ B)^c) = 1 − {P(A) + P(B) − P(A ∩ B)} = 1 − 7/12 = 5/12. (3) P(A^c ∩ B) = P(B \ (A ∩ B)) = P(B) − P(A ∩ B) = 1/3 − 1/4 = 1/12. (4) P(A ∩ B^c) = P(A \ (A ∩ B)) = P(A) − P(A ∩ B) = 1/2 − 1/4 = 1/4.

Example (the birthday problem). In a group of N people, what is the probability that at least two share a birthday? Ignoring leap years and assuming all 365 days equally likely, the probability that all N birthdays are distinct is

    (365/365) · (364/365) · ... · ((365 − N + 1)/365),

and the desired probability is 1 minus this product. It exceeds 50% already at N = 23, is about 80% at N = 35, and about 99% at N = 57 — a surprisingly small group each time.
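The birthday computation is a one-line product of the complement. A sketch (the function name `birthday_collision_prob` is my own):

```python
from math import prod

def birthday_collision_prob(n, days=365):
    """P(at least two of n people share a birthday), assuming uniform independent birthdays."""
    p_all_distinct = prod((days - k) / days for k in range(n))
    return 1.0 - p_all_distinct

# N = 23 is the smallest group size for which the probability exceeds 1/2.
print(round(birthday_collision_prob(23), 4))  # 0.5073
```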

In such problems it is often easier to compute the complement: P(A) = 1 − P(A^c), where A^c ("no two people share a birthday") is the easier event to count.

Proposition (inclusion–exclusion for three events). Extending Proposition 1.15 (4), for events A, B, C:

    P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(C ∩ A) + P(A ∩ B ∩ C).

Proof. Apply P(A ∪ B) = P(A) + P(B) − P(A ∩ B) twice, first with B replaced by B ∪ C:

    P(A ∪ B ∪ C) = P(A) + P(B ∪ C) − P(A ∩ (B ∪ C))
                 = P(A) + P(B) + P(C) − P(B ∩ C) − P(A ∩ (B ∪ C)).

Since A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C) and (A ∩ B) ∩ (A ∩ C) = A ∩ B ∩ C,

    P(A ∩ (B ∪ C)) = P(A ∩ B) + P(A ∩ C) − P(A ∩ B ∩ C),

which gives the formula.

In general (the inclusion–exclusion principle),

    P(A_1 ∪ A_2 ∪ ... ∪ A_n) = Σ_{k=1}^n (−1)^{k−1} Σ_{1 ≤ i_1 < i_2 < ... < i_k ≤ n} P(A_{i_1} ∩ A_{i_2} ∩ ... ∩ A_{i_k}),

whose case n = 4 is expanded explicitly in the next example.

13 .2 3 n = k = i n = i = P (A ) = ( ) P (A ) = P (A ) n = 2 k = i n = 2 i =, 2 ( ) 0 (P (A ) + P (A 2 )). k = 2 i < i 2 n = 2 (i, i 2 ) = (, 2) ( ) 2 P (A A 2 ) = P (A A 2 ) n = 3 k = 2 i < i 2 n = 3 (i, i 2 ) = (, 2), (, 3), (2, 3) 3 ( ) 2 (P (A A 2 ) + P (A A 3 ) + P (A 2 A 3 )) k = 3 P (A A 2 A 3 ) n = 4 P (A A 2 A 3 A 4) =P (A ) + P (A 2) + P (A 3) + P (A 4) (P (A A 2) + P (A A 3) + P (A A 4) + P (A 2 A 3) + P (A 2 A 4) + P (A 3 A 4)) + (P (A A 2 A 3) + P (A A 2 A 4) + P (A A 3 A 4) + P (A 2 A 3 A 4)) P (A A 2 A 3 A 4).25 (H27 ). 4 4 A, 2 3 B, 3 4 C A, B, C P (A B C) = P (A)+P (B)+P (C) P (A B) P (B C) P (C A)+P (A B C) P (A) = = 36, P (B) = = 4, P (C) = = 4 P (A B) = =, P (B C) = = 6, P (C A) = = 72, P (A B C) = = 44 P (A B C) = (H24 ) k A k k 3 A 3, A 6, A 9, A 2 P (A 3 A 6 A 9 A 2 ). A 3 = {(, 2), (2, )}, A 6 = {(, 5), (2, 4), (3, 3), (4, 2), (5, )}, A 9 = {(3, 6), (4, 5), (5, 4), (6, 3)}, A 2 = {(6, 6)} P (A 3 ) = 4 ( ) = P (A 6 ) = 35 ( ) = P (A 9 ) = 70 ( ) = P (A 2 ) = 25 (5 5) =

14 4 ( 4 P (A 3 A 6 A 9 A 2 ) = ) 400 = (H26 ). A, B, C 3 (ABCABC ) B 7 (, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, ) = 6. B 5 6 ( ) 4 5 6, 2 n 6 6 ( ) 3(n ) n= 6 ( ) 3(n )+ 5 = n= ( ) 3(n ) 5 6 = (5/6) 3 = 30 9 * 2.28 ([2] ). K 0.5, 0.2 n p n lim n n n p n n k= p n = 0.5( p n ) + }{{} 0.2p n }{{} = p n p n 5/3 = 0.3(p n 5/3) p n = p k ( 3 ) n ( p 5 ) (n ) *2 (Markov)

15 .2 5 lim n n n k= p k = {a n } lim n a n = α lim n n n a k = α k=. a n = α (n =, 2,... ) n n a k = nα = α n α ε-δ ε-δ ε lim n a n n a n α N n > N = a n α n k= N n n a k = n k= N k= n a k }{{} + n + n n k=n+ a k } {{ } a k α n k=n+ α }{{} N n = n k= + n N α α (n ) n.30. A a n, b n { a n = 0.7a n + 0.8b n ( ) b n = 0.3a n + 0.2b n lim a n, lim b n n n ( ) ( ) ( ) an an = b n b n }{{}}{{} p n A

16 6 p n = Ap n A p n = Ap n = A 2 p n 2 = = A n p 0 lim p n = lim n n An p 0 lim n An A n λ = λ < 0 A n = ( ) 8 + 3λ n 8 8λ n 3 3λ n 3 + 8λ n ( ) 8 8 (n ) 3 3 (n = 0) a 0 + b 0 = lim p n = n ( ) ( ) a0 = b 0 ( ) % 4 ( ) a n = a n = α, b n = b n = β α + β = α = 8/, β = 3/ 3.3. ABC n A, B, C a n, b n, c n n A, B, C a 0, b 0, c 0 ( a 0 +b 0 +c 0 =.) n A 2 3 B C 3 B 2 A 2 C C 2 3 A 3 B a n 0 /2 2/3 a n p n = b n = 2/3 0 /3 b n = Ap n. c n /3 /2 0 c n A n A, 3 ± i 6 3 ± i 6 < n 5/4 5/4 5/4 lim n An = 4/4 4/4 4/4 2/4 2/4 2/4 a 0 + b 0 + c 0 = 5/4 5/4 5/4 lim n = lim n n An p 0 = 4/4 4/4 4/4 2/4 2/4 2/4 a 0 b 0 c 0 =

Proposition 1.32 (continuity of probability). (1) If A_1 ⊂ A_2 ⊂ A_3 ⊂ ..., then lim_{n→∞} P(A_n) = P(∪_{n=1}^∞ A_n). (2) If A_1 ⊃ A_2 ⊃ A_3 ⊃ ..., then lim_{n→∞} P(A_n) = P(∩_{n=1}^∞ A_n). (3) More generally, if A = lim_{n→∞} A_n exists, then

    lim_{n→∞} P(A_n) = P(A) = P(lim_{n→∞} A_n).

For a general sequence of events A_1, A_2, ..., the limit lim_n A_n need not exist, but one can always form

    limsup_{n→∞} A_n = ∩_{n=1}^∞ ∪_{k=n}^∞ A_k    (limit superior),
    liminf_{n→∞} A_n = ∪_{n=1}^∞ ∩_{k=n}^∞ A_k    (limit inferior).

When limsup_n A_n = liminf_n A_n, the common event is lim_n A_n. Writing B_n = ∪_{k=n}^∞ A_k, the event B_n means "some A_k with k ≥ n occurs", so limsup_n A_n = ∩_{n=1}^∞ B_n is the event "A_n occurs for infinitely many n". Dually, liminf_n A_n is the event "A_n occurs for all sufficiently large n".

2 Conditional probability and independence

2.1 Conditional probability

Throughout, work on a probability space (Ω, P).

Definition 2.1 (conditional probability). For events A, B ⊂ Ω with P(B) ≠ 0, the conditional probability of A given B is

    P(A | B) = P(A ∩ B) / P(B).    (2.1)

For an event E with P(E) > 0, set P_E(A) = P(A | E). Then P_E is itself a probability measure, by checking Definition 1.14: (1) P_E(A) = P(A ∩ E)/P(E) ≥ 0; (2) P_E(Ω) = P(Ω ∩ E)/P(E) = P(E)/P(E) = 1; (3) if A ∩ B = ∅, then (A ∩ E) ∩ (B ∩ E) = ∅ and (A ∪ B) ∩ E = (A ∩ E) ∪ (B ∩ E), so

    P_E(A ∪ B) = (P(A ∩ E) + P(B ∩ E)) / P(E) = P_E(A) + P_E(B).

Conditioning on the whole space changes nothing: P_Ω(A) = P(A ∩ Ω)/P(Ω) = P(A).

Example. Two coin tosses, Ω = {(0, 0), (0, 1), (1, 0), (1, 1)}, P(A) = |A|/|Ω|. Let A = {(1, 1)} ("both heads") and B = {(0, 1), (1, 0), (1, 1)} ("at least one head"). Then

    P(A | B) = P(A ∩ B)/P(B) = (1/4)/(3/4) = 1/3.
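Definition 2.1 can be checked by brute-force enumeration on a finite sample space, e.g. the two-dice space of Example 1.18. A small Python sketch (the helper names are my own):

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))  # two dice: 36 equally likely outcomes

def P(pred):
    """P(A) = |A| / |Omega| for the event A = {w : pred(w)}."""
    return Fraction(sum(1 for w in omega if pred(w)), len(omega))

def P_cond(pred_a, pred_b):
    """Conditional probability P(A | B) = P(A ∩ B) / P(B), defined when P(B) > 0."""
    return P(lambda w: pred_a(w) and pred_b(w)) / P(pred_b)

doubles = lambda w: w[0] == w[1]
sum_le_6 = lambda w: w[0] + w[1] <= 6
print(P_cond(doubles, sum_le_6))  # 3/15 = 1/5
```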

19 2. (conditional probability) X ( ), Y ( ), Z ( ) Ω = {XR, XR2, Y W, Y W 2, ZR, ZW } 6 A = {XR, XR2, ZR}, P (A B) B = {XR, XR2} P (B A) = = 2/6 P (A) 3/6 = P (A) = 2, P (B) = 3, P (A B) = 4 () P (A B) (2) P (A B) (3) P (B A) (4) P (A B) () P (A B) = P (A) + P (B) P (A B) = 7 2 /4 /3 = 3 P (A B) (3) P (B A) = = /4 4 P (A) /2 = 2 4 (2) P (A B) = P (A B) P (B) = (4) P (A B) = P (A B) = (2.) P (A B) = P (A B)P (B) A B A, B (A B = B A) P (A B) = P (B A) A, B P (B A) = P (B A)P (A) 2.7 ( ). P (A), P (B) 0 P (A B) = P (A B)P (B) = P (B A)P (A) P (A B) A B (joint probability) B A P (A B)P (B), A B P (B A)P (A) A, B A = (A B) (A B), (A B) (A B) = P (A) = P (A B) + P (A B) = P (A B)P (B) + P (A B)P (B) A B A B A

Theorem 2.8 (law of total probability). If S_1, S_2, ..., S_n partition Ω (that is, Ω = ∪_i S_i and the S_i are pairwise disjoint), then

    P(A) = Σ_{i=1}^n P(A | S_i) P(S_i).

Example 2.9 (drawing lots). An urn contains N lots, of which r are winners. Two lots are drawn in order, without replacement; let A be the event that the first lot wins and B the event that the second lot wins. (1) Then P(A) = P(B) = r/N. (2) Indeed P(A) = r/N and P(A^c) = (N − r)/N, and by the law of total probability

    P(B) = P(B | A)P(A) + P(B | A^c)P(A^c)
         = ((r − 1)/(N − 1)) · (r/N) + (r/(N − 1)) · ((N − r)/N)
         = r(N − 1)/(N(N − 1)) = r/N.

So P(A) = P(B) = r/N: drawing earlier or later does not change the chance of winning.

2.2 Independence

Definition 2.11 (independence of events). Events A, B ⊂ Ω are independent if

    P(A ∩ B) = P(A) P(B).

Proposition 2.12. For events A, B with P(A), P(B) ≠ 0, each of the following is equivalent to independence:

(1) P(A | B) = P(A) (knowing B occurred does not change the probability of A);
(2) P(B | A) = P(B);
(3) P(A | B) = P(A | B^c);
(4) A and B^c are independent.

Proof sketch. (1), (2): if P(A ∩ B) = P(A)P(B), dividing by P(B) gives P(A | B) = P(A); conversely P(A ∩ B) = P(A | B)P(B) = P(A)P(B); (2) is symmetric. (1) ⇔ (3): by the law of total probability, P(A) = P(A | B)P(B) + P(A | B^c)P(B^c); if P(A | B) = P(A | B^c), this common value equals P(A), giving (1); conversely (1) forces P(A | B^c)(1 − P(B)) = P(A) − P(A)P(B), i.e. P(A | B^c) = P(A) = P(A | B). (1) ⇔ (4): P(A ∩ B^c) = P(A) − P(A ∩ B) = P(A) − P(A)P(B) = P(A)P(B^c).

Example 2.13. Draw one card from a 52-card deck, with Ω the 52 cards and P(A) = |A|/|Ω|. Let A be "an ace" (4 cards) and H "a heart" (13 cards). Then

    P(A ∩ H) = 1/52,  P(A)P(H) = (4/52)(13/52) = 1/52,

so P(A ∩ H) = P(A)P(H): A and H are independent.

Example 2.14. Add a joker, so |Ω| = 53. Now P(A) = 4/53, P(H) = 13/53, P(A ∩ H) = 1/53 ≠ P(A)P(H): A and H are no longer independent.

Remark. Disjointness and independence are different notions: if A ∩ B = ∅ and P(A), P(B) > 0, then P(A ∩ B) = 0 ≠ P(A)P(B), so disjoint events of positive probability are never independent.

Exercise 2.17. Let P(A) = 1/3 and P(A ∪ B) = 1/2. Find P(B) when (1) A and B are independent; (2) A and B are disjoint; (3) P(A | B) = 1/4; (4) P(B | A) = 1/5.

Solution. (1) P(A ∪ B) = P(A) + P(B) − P(A)P(B), so 1/2 = 1/3 + (2/3)P(B), giving P(B) = 1/4. (2) P(A ∩ B) = 0, so P(B) = P(A ∪ B) − P(A) = 1/6. (3) P(A ∩ B) = P(A | B)P(B) = P(B)/4, and 1/2 = 1/3 + P(B) − P(B)/4 gives P(B) = 2/9. (4) P(A ∩ B) = P(B | A)P(A) = 1/15, and P(B) = 1/2 − 1/3 + 1/15 = 7/30.

22 22 2 P (A H) = 52, P (A)P (H) = = 52 P (A H) = P (A)P (H) A H 2.4. Ω = 53 P (A) = 4 3, P (H) =, P (A H) = P (A H) P (A)P (H) A, H () A B (2) P (B A) = r N r N = P (B) 2.9 N r N r P (B) = P (B A) N A B i.i.d a A, B P (A B) = P ( ) = 0 P (A)P (B) A H a 2.7. P (A) = 3, P (A B) = 2 P (B) () A, B (2) A, B (3) P (A B) = 4 (4) P (B A) = 5 () P (A B) = P (A) + P (B) P (A B) = P (A) + P (B) P (A)P (B) P (B) P (B) = 4. (2) P (A B) = 0 P (B) = P (A B) P (A) = 6. (3) P (A B) = P (A B)P (B) = P (B). P (A B) = 4 P (A) + P (B) P (A B). P (B) P (B) = 2 9. (4) (3) P (B) = 7 30.

23 2.2 (independence) ( ). n A, A 2,..., A n :, 2,..., n i < i 2 < < i k n P (A i A i2 A ik ) = P (A i )P (A i2 ) P (A ik ) 2.9. Ω = {, 2, 3, 4}, P (A) = A/ Ω A = {, 2}, B = {, 3}, C = {2, 3} P (A) = P (B) = P (C) = 2. P (A B) = P ({}) = 4 = 2 = P (A)P (B) 2 P (A C) = P ({2}) = 4 = 2 = P (A)P (C) 2 P (B C) = P ({3}) = 4 = 2 = P (B)P (C) 2 A B A C B C (independent pairwisely) P (A B C) = P ( ) = 0 P (A)P (B)P (C) = = 8 P (A B C) P (A)P (B)P (C) A, B, C P (A B) = P (A)P (B) P (A B C) = P (A)P (B)P (C) P (A B C) = P (A)P (B)P (C) A, B, C Ω = {, 2, 3, 4}, P ({}) = 2 4, P ({2}) = 3 4 2, P ({3}) = P ({4}) = 4 A = {, 2}, B = {2, 3}, C = 4 {2, 4} P (A) =, P (B) = P (C) =. P (A B C) = P ({2}) = , 2 P (A)P (B)P (C) = ( ) 2 = P (A B C) = P (A)P (B)P (C). 2 P (A B) P (A)P (B), P (A C) P (A)P (C), P (B C) P (B)P (C) 2.2. P ({}) = P ({2}) = P ({3}) = P ({4}) = 4

24 24 2 P ({}) = 4 α, P ({2}) = + α P (A B C) = 4 P ({2}) = 4 + α = P (A)P (B)P (C) = ( ) α α = ± 2 α = * ( ). A, B C P (A B C) = P (A C)P (B C) A, B C (conditionally independent) C P C 2.2 P C (A B) = P C (A)P C (B) P (A B C) = P (A B C) P (C) = P (A B C) P (B C) P (B C) P (C) = P (A B C)P (B C) P (A C) = P (A B C) C A, B C B A A, B C C Ω = {(0, 0), (0, ), (, 0), (, )}, P (A) = A/ Ω A = {(0, 0), (0, )}, B = {(0, 0), (, 0)}, C = {(0, ), (, 0)} 2 P (A) = P (B) = P (C) = 2 P (A B) = P (A C) = P (B C) = 4 A, B, C C P (A C) = P (B C) =, P (A B C) = L =, H = 6, D = 2 5 L H D Ω = {(i, j) i, j 6}, P (A) = A/ Ω L = {(, ), (, 2), (, 3), (, 4), (, 5), (, 6), (2, ), (3, ), (4, ), (5, ), (6, )} H = {(6, ), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6), (, 6), (2, 6), (3, 6), (4, 6), (5, 6)} D = {(, 5), (, 6), (2, 5), (2, 6), (5, ), (6, ), (5, 2), (6, 2)} *3 3.3

Here P(L) = P(H) = 11/36 and P(D) = 8/36 = 2/9. Since L ∩ H = {(1, 6), (6, 1)},

    P(L ∩ H) = 2/36 ≠ P(L)P(H) = (11/36)²,

so L and H are not independent. However, with L ∩ D = {(1, 5), (1, 6), (5, 1), (6, 1)}, H ∩ D = {(6, 1), (6, 2), (1, 6), (2, 6)} and L ∩ H ∩ D = {(1, 6), (6, 1)},

    P(L | D) = (4/36)/(8/36) = 1/2,  P(H | D) = (4/36)/(8/36) = 1/2,
    P(L ∩ H | D) = (2/36)/(8/36) = 1/4 = (1/2)²,

so P(L ∩ H | D) = P(L | D) P(H | D): L and H are conditionally independent given D.

Theorem (Borel–Cantelli lemma, "zero–one law"). For events A_1, A_2, ...:

(1) If Σ_{n=1}^∞ P(A_n) < ∞, then P(limsup_n A_n) = 0: with probability 1, only finitely many A_n occur.
(2) If A_1, A_2, ... are independent and Σ_{n=1}^∞ P(A_n) = ∞, then P(limsup_n A_n) = 1: with probability 1, infinitely many A_n occur.

Proof. (1) Put B_n = ∪_{k=n}^∞ A_k, so that limsup_n A_n = ∩_n B_n and B_1 ⊃ B_2 ⊃ .... By continuity (Proposition 1.32 (2)) and subadditivity (1.2),

    P(limsup_n A_n) = lim_{n→∞} P(B_n) ≤ lim_{n→∞} Σ_{k=n}^∞ P(A_k) = 0,

since the tail of a convergent series tends to 0.

(2) It suffices to show P((limsup_n A_n)^c) = 0. By de Morgan, (limsup_n A_n)^c = ∪_{n=1}^∞ C_n with C_n = ∩_{k=n}^∞ A_k^c, and P(∪_n C_n) = lim_n P(C_n) by continuity, so it suffices to show P(C_n) = 0 for each n, using the elementary inequality 1 + x ≤ exp(x).

Here exp(x) = e^x. By the independence of the A_k^c (which follows from that of the A_k) and 1 − P(A_k) ≤ exp(−P(A_k)),

    P(C_n) = lim_{m→∞} Π_{k=n}^m (1 − P(A_k)) ≤ lim_{m→∞} exp(−Σ_{k=n}^m P(A_k)) = 0,

since Σ_k P(A_k) = +∞. (Formally, with D_{n,m} = ∩_{k=n}^m A_k^c we have D_{n,m} ⊃ D_{n,m+1} and C_n = ∩_{m≥n} D_{n,m}, so P(C_n) = lim_{m→∞} P(D_{n,m}) by Proposition 1.32 (2).) Hence P((limsup_n A_n)^c) = lim_n P(C_n) = 0 and P(limsup_n A_n) = 1.

Example. Roll a die repeatedly and let A_n be the event that the n-th roll is a one. The rolls are independent and P(A_n) = 1/6, so Σ_n P(A_n) = ∞, and the Borel–Cantelli lemma gives P(limsup_n A_n) = 1: with probability 1, a one appears infinitely often. Note also that the two parts are dual: P(limsup_n A_n) = 0 means P((limsup_n A_n)^c) = P(liminf_n A_n^c) = 1.

3 Bayes' theorem

Bayes' theorem itself is a simple consequence of the definition of conditional probability. Its use as a framework for inference — Bayesian statistics, going back to Bayes and Laplace, later eclipsed by the frequentist school of Fisher and of Neyman and Pearson in the 20th century, and revived in recent decades with the growth of computing power — is what makes it important.

3.1 Bayes' theorem

For P(A) ≠ 0, rewriting P(B | A)P(A) = P(A ∩ B) = P(A | B)P(B) gives:

Theorem 3.1 (Bayes' theorem (I)).

    P(B | A) = P(A | B) P(B) / P(A).

Combining with the law of total probability:

Theorem 3.2 (Bayes' theorem (II)). If S_1, S_2, ..., S_n partition Ω, then

    P(S_i | A) = P(A | S_i) P(S_i) / P(A) = P(A | S_i) P(S_i) / Σ_{j=1}^n P(A | S_j) P(S_j).

Example 3.3. A factory's machines A, B, C produce 20%, 35%, 45% of its output, with defect rates 5%, 7%, 4% respectively. A randomly chosen item turns out to be defective; which machine most likely produced it? The events A, B, C partition Ω with P(A) = 0.20, P(B) = 0.35, P(C) = 0.45; let E be the event that the item is defective.

Then P(E | A) = 0.05, P(E | B) = 0.07, P(E | C) = 0.04, and by Bayes' theorem

    P(A | E) = P(E | A)P(A) / (P(E | A)P(A) + P(E | B)P(B) + P(E | C)P(C))
             = 0.010 / 0.0525 ≈ 0.190.

Similarly P(B | E) = 0.0245/0.0525 ≈ 0.467 and P(C | E) = 0.018/0.0525 ≈ 0.343, so the defective item most likely came from machine B.

Example 3.4. Of two urns, L ("lucky") contains 4 winning tickets and 1 losing ticket, and U ("unlucky") contains 1 winning and 4 losing. An urn is chosen at random and tickets are drawn from it with replacement. With P(L) = P(U) = 1/2 and G = "win", B = "lose":

    P(G | L) = 4/5, P(B | L) = 1/5, P(G | U) = 1/5, P(B | U) = 4/5.

(1) The probability that the first draw loses is P(B) = P(B | L)P(L) + P(B | U)P(U) = (1/5)(1/2) + (4/5)(1/2) = 1/2.

(2) Given that the first draw lost, the probability that the chosen urn is U is, by Bayes' theorem,

    P(U | B) = P(B | U)P(U) / P(B) = ((4/5)(1/2)) / (1/2) = 4/5.

(3) Given that the first two draws both lost, P(BB | L) = (1/5)² = 1/25 and P(BB | U) = (4/5)² = 16/25, so

    P(U | BB) = P(BB | U)P(U) / (P(BB | L)P(L) + P(BB | U)P(U))
              = ((16/25)(1/2)) / ((1/25)(1/2) + (16/25)(1/2)) = 16/17.

Starting from the prior P(U) = 1/2, each losing draw shifts the probability toward U: beliefs are updated by observations via Bayes' theorem.

Example 3.5 (the three prisoners problem).
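Bayes' theorem (II) is a normalize-the-products computation, so the factory example can be sketched in a few lines (the helper name `posterior` is my own):

```python
def posterior(priors, likelihoods):
    """Bayes (II): P(S_i | A) is proportional to P(A | S_i) P(S_i), normalized over the partition."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# Machines A, B, C make 20%, 35%, 45% of output, with defect rates 5%, 7%, 4%.
post = posterior([0.20, 0.35, 0.45], [0.05, 0.07, 0.04])
print([round(p, 3) for p in post])  # [0.19, 0.467, 0.343]: B is the most likely source
```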

29 3. 29 A B C B A C /3 /2 : B, C A = A, B = B, C = C, S A = A, S B = B, S C = C P (A S B ) P (A) = P (B) = P (C) = 3 P (S B B) = 0, P (S B C) =, P (S B A) = (S C A) = 2 Bayes P (A S B ) = = P (S B A)P (A) P (S B A)P (A) + P (S B B)P (B) + P (S B C)P (C) = 3. A 3 A 3.6 ( ). TV 3 3 A =, B =, C = P (A) P (A) = P (B) = P (C) = M = B 3 P (A M) P (C M) A B, C P (M A) = B B 2 P (M B) = 0 C C B P (M C) =

By Bayes' theorem, with M the observed event ("the host opens door B"),

    P(A | M) = P(M | A)P(A) / (P(M | A)P(A) + P(M | B)P(B) + P(M | C)P(C))
             = ((1/2)(1/3)) / ((1/2)(1/3) + 0 · (1/3) + 1 · (1/3)) = 1/3,
    P(C | M) = P(M | C)P(C) / (P(M | A)P(A) + P(M | B)P(B) + P(M | C)P(C)) = (1 · (1/3)) / (1/2) = 2/3.

So staying with door A wins with probability 1/3, while switching to C wins with probability 2/3: switching doubles the chance of winning.

Example 3.7 (testing for a rare disease). A disease affects 0.1% of the population. A test detects it with probability 99% in an affected person, but also gives a (false) positive with probability 1% in an unaffected person. If a randomly chosen person tests positive, what is the probability that they actually have the disease?

Let S = "has the disease" and A = "tests positive", so P(S) = 0.001, P(S^c) = 0.999, P(A | S) = 0.99, P(A | S^c) = 0.01. By Bayes' theorem,

    P(S | A) = P(A | S)P(S) / (P(A | S)P(S) + P(A | S^c)P(S^c))
             = 0.00099 / (0.00099 + 0.00999) ≈ 0.090.

Despite the test's 99% accuracy, a positive person has only about a 9% chance of having the disease. Intuitively: in a population of 100,000 people, about 100 have the disease and roughly 99 of them test positive, but about 1% of the remaining 99,900 — roughly 999 people — also test positive; the true positives are a small minority of all positives.
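The rare-disease computation can be packaged as a tiny function (the name `positive_predictive_value` is mine; it is the standard term for P(disease | positive)):

```python
def positive_predictive_value(prevalence, sensitivity, false_positive_rate):
    """P(disease | positive test) via Bayes' theorem."""
    tp = sensitivity * prevalence                 # P(positive and diseased)
    fp = false_positive_rate * (1.0 - prevalence) # P(positive and healthy)
    return tp / (tp + fp)

# Prevalence 0.1%, sensitivity 99%, false positive rate 1%, as in Example 3.7.
ppv = positive_predictive_value(0.001, 0.99, 0.01)
print(round(ppv, 3))  # 0.09: a positive result means only about a 9% chance of disease
```

Varying `prevalence` shows how strongly the base rate drives the answer, which is the point of the example.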

31 ( ). (Todo) P (S), P (S) 000 P (S A), P (S A) Bayes P (S), P (S) (prior probability) P (S A), P (S A) (posterior probability) Bayes 3.0 ( ). T 2000 Rh AB Mr. X Mr. X A =, B = P (A B) =. P (A B) = 2000 P (B A) Bayes P (B) A P (B) = P (B) = 2 Bayes P (B A) = 99.9% P (A B)P (B) P (A B)P (B) + P (A B)P (B) = = B T 30

32 32 3 BAYES P (B) = P (B A) = P (B) = % Bayes = C A B 3500 P (B) = P (B A) ICPO P (B) = P (B A) Wikipedia DNA () (207) DNA DNA 77 (2) 3 DNA 000 DNA DNA DNA Bayes 3.3 Bayes Bayes S: (spam), N: (non spam) K(w): w P (S), P (N) w, w 2,... P (K(w ) S), P (K(w 2 ) S),... w P (S K(w)) P (S) = 0.6, P (N) = 0.4

33 (w ) (w 2 ) (w 3 ) (w 4 ) : (w ) Bayes (3.) P (K(w ) S)P (S) P (S K(w )) = P (K(w ) S)P (S) + P (K(w ) N)P (N) = % P (S K(w 2 )) = , P (S K(w 3 )) = , P (S K(w 4 )) = = 0.9 P (K(w ) K(w 2 ) S) A, B (P (A B) = P (A)P (B)) (P (A B S) = P (A S)P (B S)) P (S A B) Bayes = = Bayes = P (A B S)P (S) P (A B) P (B S) P (B) P (B S) P (B) B P (S B) = = P (A S)P (S) P (A) P (B S)P (A S)P (S) P (B)P (A) P (S A) P (B S)P (S) P (B) P (S) P (S A) A P (S A) B P (S B) Bayes (Bayesian updating) (naive Bayes classifier)

34 34 3 BAYES (3.) Bayes P (S K(w )) w w w Bayes P (S) = 0.6 P (S) = 0.0 P (S) = Bayes P (S) P (S K(w )) w w w : Bayes * 4 *4

35 35 4 * B * 6 4. (Ω, P ) 4. ( ). Ω X(ω) (random variable) ω Ω X 4.2. () Ω = {, 2, 3, 4, 5, 6} = {i i 6}. X(i) = i Y (i) = 2i, Z(i) = i 2 (2) 2 Ω = {ω = (i, j) i, j 6}. X(ω) = i, Y (ω) = j, Z(ω) = i + j 2 (3) 3 Ω = {ω = (i, j, k) i, j, k = 0 or }. X(ω) = i, Y (ω) = j, Z(ω) = k, T (ω) = i + j + k, 2, 3 () Y = 2X, Z = X 2 (2) Z = X + Y, (3) T = X + Y + Z 00% 4.3. (2) Ω = {2, 3,...,, 2} Ω = {, 2,..., 30, 36} Ω = {0,, 2, 3, 4, 5} 2 *5 *6

For a random variable X, an expression such as "X = 1" abbreviates the event {ω ∈ Ω : X(ω) = 1}, and we define

    P(X = 1) := P({ω : X(ω) = 1}),

and similarly P(X < a) = P({ω : X(ω) < a}), P(X ≤ b) = P({ω : X(ω) ≤ b}). For two random variables X, Y,

    P(X = a, Y = b) = P({ω : X(ω) = a} ∩ {ω : Y(ω) = b}),
    P(X = a or Y = b) = P({ω : X(ω) = a} ∪ {ω : Y(ω) = b}).

Example 4.4 (continuing Example 4.2, with P(A) = |A|/|Ω|). (1) One die, X(i) = i, Y = 2X, Z = X²: P(X = 5) = P({5}) = 1/6; P(X ≤ 3) = P({1, 2, 3}) = 1/2; P(Y ≥ 7) = P({4, 5, 6}) = 1/2; P(Y ≥ 1) = P(Ω) = 1; P(Z = 4) = P({2}) = 1/6; P(Z = 3) = P(∅) = 0. (2) Two dice, X and Y the two faces, Z = X + Y: P(X = 1) = 6/36 = 1/6; P(Z = 8) = P({(2, 6), (3, 5), (4, 4), (5, 3), (6, 2)}) = 5/36. (3) Three coins, Y the second coin, T the total number of heads: P(Y = 1) = 4/8 = 1/2; P(T = 2) = P({(0, 1, 1), (1, 0, 1), (1, 1, 0)}) = 3/8.

Definition 4.5 (distribution function). For a random variable X,

    F_X(x) = P(X ≤ x)

is called the (cumulative) distribution function of X (c.d.f.).

Proposition 4.6 (properties of the distribution function). (1) P(X > x) = 1 − P(X ≤ x) = 1 − F(x). (2) For a < b, P(a < X ≤ b) = F(b) − F(a). (3) a < b implies F(a) ≤ F(b): F is non-decreasing. (4) lim_{x→−∞} F(x) = 0 and lim_{x→+∞} F(x) = 1; briefly, F(−∞) = 0, F(+∞) = 1. (5) F is right-continuous: lim_{x→a+0} F(x) = F(a).

Proof. (1) is P(A^c) = 1 − P(A) with A = {X ≤ x}. (2) and (3) follow from P(B \ A) = P(B) − P(A) for A = {X ≤ a} ⊂ B = {X ≤ b}. (4) and (5) use the continuity of probability (Proposition 1.32).

[Figure 1: distribution functions of continuous distributions (N(0, 1), Exp(1), t, χ²) and of discrete distributions (B(10, 0.5), Poi(3)); a continuous c.d.f. rises continuously, while a discrete one is a right-continuous step function.]

Definition 4.7 (density function, probability function). (1) X is a continuous random variable with density function p_X if

    F_X(x) = ∫_{−∞}^x p_X(t) dt.

(2) X is a discrete random variable taking values x_0, x_1, x_2, ... with probability function

    p_X(k) := P(X = x_k)  (k = 0, 1, 2, ...),  so that  F_X(x) = Σ_{k : x_k ≤ x} p_X(k).

[Figure 2: the density p(x); F(x) is the area under the graph of p from −∞ to x.]

Remark 4.8. The two cases can be treated uniformly (as densities with respect to Lebesgue measure or counting measure, via the Radon–Nikodym theorem), but these notes keep them separate; the distributions in Appendix A are discrete, those in Appendix B continuous.

Proposition (properties). (1) A density satisfies p(x) ≥ 0 and ∫_{−∞}^∞ p(x) dx = 1; a probability function satisfies p(k) ≥ 0 and Σ_{k=0}^∞ p(k) = 1. (2) Where p is continuous, (d/dx) F_X(x) = p_X(x), and

    P(a < X ≤ b) = F(b) − F(a) = ∫_a^b p(x) dx  (continuous case),  = Σ_{a < x_k ≤ b} p(k)  (discrete case).

For a continuous random variable, P(X = a) = ∫_a^a p(x) dx = 0 for every single point a, so

    P(a < X < b) = P(a ≤ X < b) = P(a ≤ X ≤ b) = P(a < X ≤ b) = ∫_a^b p(x) dx:    (4.1)

endpoints do not matter. (For a discrete variable they do: there p(k) = P(X = x_k) can be positive.)

Definition (median, mode). (1) A number m with P[X ≤ m] ≥ 1/2 and P[X ≥ m] ≥ 1/2 is a median of X; in the continuous case it solves F(m) = 1/2. (2) A point c at which p(x) attains its maximum is a mode of X.

Definition 4.11 (expectation). The expectation (expected value, mean, average) of X is

    E[X] = ∫_{−∞}^∞ x p(x) dx  (continuous),    E[X] = Σ_{k=0}^∞ x_k p(k)  (discrete).

[Figure 3: densities and probability functions of standard distributions. For the normal N(µ, σ²), E[X] = µ and V[X] = σ²: the density is centered at µ with spread σ. For the Poisson distribution Poi(λ), shown for λ = 3 and λ = 7, E[X] = V[X] = λ. For the exponential distribution Exp(λ), E[X] = 1/λ and V[X] = 1/λ²; a larger λ concentrates the mass near 0. For the arcsine density p(x) = 1/(π√(a² − x²)) on (−a, a), E[X] = 0 and V[X] = a²/2.]

The sum or integral defining E[X] need not converge, so the expectation is defined only under absolute convergence:

Definition 4.12 (well-definedness of the mean). The expectation of Definition 4.11 is defined when

    ∫_{−∞}^∞ |x| p(x) dx < +∞,  respectively  Σ_{k=0}^∞ |x_k| p(k) < +∞.

41 , ( ). (4.2) E[X] = X(ω)P (dω) Ω X(ω) = x, P (dω) = p(x) dx E[X + Y ] = E[X] + E[Y ] X 2X X 2 f(x) E[f(X)] = f(x)p(x) dx or f(x k )p(k) k=0 4.4 ( ). () X f(x), g(x) E[f(X) + g(x)] = E[f(X)] + E[g(X)]. a, b E[aX + b] = ae[x] + b. b E[b] = b. (2) X, Y E[X + Y ] = E[X] + E[Y ]. () (2) (2) 4.3 * 9. () E[f(X) + g(x)] = = (f(x) + g(x))p(x) dx f(x)p(x) dx + g(x)p(x) dx = E[f(X)] + E[g(X)] (2) (4.2) E[X + Y ] = (X(ω) + Y (ω))p (dω) Ω = X(ω)P (dω) + Y (ω)p (dω) = E[X] + E[Y ] Ω Ω *9

Definition 4.15 (variance). The variance of X is

    V[X] = E[(X − E[X])²],

and √V[X] is the standard deviation. Common symbols: σ or s for the standard deviation, σ² or v for the variance, µ or m for the mean; in these notes µ = E[X], σ = √V[X], σ² = V[X]. Concretely,

    V[X] = ∫_{−∞}^∞ (x − µ)² p(x) dx  or  Σ_{k=0}^∞ (x_k − µ)² p(k).

Expanding, with µ = E[X]:

    V[X] = E[(X − µ)²] = E[X² − 2µX + µ²] = E[X²] − 2µE[X] + µ² = E[X²] − µ².

Proposition 4.16 (properties of the variance). (1) V[X] = E[X²] − (E[X])². (2) V[aX + b] = a²V[X] for constants a, b (compare E[aX + b] = aE[X] + b). (3) V[X] ≥ 0, with V[X] = 0 exactly when X is constant (with probability 1). (4) If X, Y are independent, then V[X ± Y] = V[X] + V[Y] — note the + on the right in both cases (Theorem 5.4).

Proof of (2): using E[aX + b] = aE[X] + b,

    V[aX + b] = E[(aX + b − (aE[X] + b))²] = E[a²(X − E[X])²] = a²V[X].

(3) In the discrete case, V[X] = Σ_k (x_k − µ)² p(k) with every term (x_k − µ)² p(k) ≥ 0, so V[X] ≥ 0; and V[X] = 0 forces p(k) = 0 whenever x_k ≠ µ, i.e. P(X = µ) = 1.

Theorem 4.17 (Chebyshev's inequality). With µ = E[X] and σ = √V[X], for every α > 0,

    P(|X − µ| ≥ ασ) ≤ 1/α².

For example, with α = 2, X falls outside the interval I = [µ − 2σ, µ + 2σ] with probability at most 1/4, and with α = 3 outside [µ − 3σ, µ + 3σ] with probability at most 1/9 — whatever the distribution of X. For specific distributions the bound is usually far from tight: for the normal distribution the probability of leaving [µ − 3σ, µ + 3σ] is only about 0.3%.

Proof (continuous case).

    σ² = ∫ (x − µ)² p(x) dx ≥ ∫_{|x−µ| ≥ ασ} (x − µ)² p(x) dx
       ≥ (ασ)² ∫_{|x−µ| ≥ ασ} p(x) dx = α²σ² P(|X − µ| ≥ ασ),

and dividing by α²σ² gives the claim.

Definition 4.18 (standardization). For X with µ = E[X] and σ = √V[X] > 0, the standardization of X is

    Z = (X − µ)/σ,

which satisfies E[Z] = 0 and V[Z] = 1.
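Chebyshev's inequality is easy to check empirically. A Monte Carlo sketch with Uniform(0, 1) draws (µ = 1/2, σ² = 1/12), using only the standard library:

```python
import random

random.seed(0)

# Empirical check of P(|X - mu| >= a*sigma) <= 1/a^2 for X ~ Uniform(0, 1).
mu, sigma = 0.5, (1 / 12) ** 0.5
n = 100_000
xs = [random.random() for _ in range(n)]
for a in (1.5, 2.0, 3.0):
    freq = sum(1 for x in xs if abs(x - mu) >= a * sigma) / n
    print(a, freq, "<=", 1 / a**2)
    assert freq <= 1 / a**2  # the bound holds, and is usually far from tight
```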

44 44 4 [ ] X µ E[Z] = E = σ σ E[X µ] = (E[X] µ) = 0. σ [ ] ( ) 2 ( ) 2 X µ V [Z] = V = V [X] = σ 2 =. σ σ σ 4.9 ( (Bernoulli) ). 0 θ 0, X P (X = ) = θ, P (X = 0) = θ X (Bernoulli distribution) E[X], V [X] θ θ : E[X] = θ + 0 ( θ) = θ. : E[X 2 ] = 2 θ+0 2 ( θ) = θ V [X] = E[X 2 ] (E[X]) 2 = θ( θ). θ = 0 (θ = ) V [X] = 0 V [X] θ = / ( ). X p(k) = P (X = k) F (k) = P (X k) k p(k) 6 F (k) E[X] = = 7 2 E[X 2 ] = = 9 6 V [X] = E[X 2 ] (E[X]) 2 = X () X E[X] = 7, E[Y ] = E[2X] = 2E[X] = 7, E[Z] = 2 E[X 2 ] = 9 6. (2) 4.4 p X (k) = P (X = k) = 6 () p Y (k) = p X (k) X Y E[X] = E[Y ] = 7, E[Z] = E[X + Y ] = 2 E[X] + E[Y ] = 7.

45 (3) 4.4 p Y () = P (Y = ) = 2, p Y (0) = 2 E[Y ] = = 2. p X(k) = p Y (k) = p Z (k) E[X] = E[Z] = 2, E[T ] = E[X + Y + Z] = X µ = E[X], v = V [X] µ, v () E[2X] (2) E[3X + 4] (3) E[X 2 ] (4) E[X 2 + 2X + 3] (5) V [4X] (6) V [X + 2] (7) V [3X 4] (8) E[X 2 + 3X] a E[a] = a () E[2X] = 2E[X] = 2µ (2) E[3X + 4] = 3E[X] + E[4] = 3µ + 4 (3) E[X 2 ] = v + µ 2 (4) E[X 2 + 2X + 3] = E[X 2 ] + 2E[X] + E[3] = v + µ 2 + 2µ + 3 = µ 2 + 2µ + v + 3 (5) V [4X] = 4 2 V [X] = 6v (6) V [X + 2] = V [X] = v (7) V [3X + 4] = V [3X] = 3 2 V [X] = 9v (8) E[X 2 + 3X] = E[X 2 ] + 3E[X] = µ 2 + 4µ + v a < b X { c (a x b) p(x) = 0 ( ) () c (2) E[X], V [X] b () c dx = c(b a) = c =. (2) µ = E[X] = c x dx = a b a a b 2 a 2 = a + b b b a 2 2. E[X2 ] = c x 2 dx = b 3 a 3 = a2 + ab + b 2 a b a 3 3 V [X] = E[X 2 ] µ 2 = a2 + ab + b 2 (a + b)2 (b a)2 = (a x b) p(x) = b a 0 ( ) [a, b] (uniform distribution) a, b b X p(x) = { cx( x) (0 x ) 0 ( ) () c (2) P [0 X /2] (3) E[X], V [X] (4) F (x) () 2 (2) P [0 X /2] = 0 cx( x) dx = c = 6. /2 0 6x( x) dx = 2

46 46 4 (3) µ = E[X] = (4) F (x) = P [X x] = 0 x 6x( x) dx = 2, V [X] = E[X2 ] µ 2 = x p(t) dt 0 (x < 0) F (x) = 3x 2 2x 3 (0 x ) ( < x) 2 0 x 2 6x( x) dx = ( σ > 0 ) µ X p X (x) = exp (x µ)2 2πσ 2 2σ 2 Y = e X Y F Y (y) = P [Y y] = P [e X y] = P [X log y] = log y p Y (y) = d dy F Y (y) = p X d (log y) (log y) dy ( ) (log y µ) 2 = 2πσ2 y exp 2σ 2 p X (x) dx X N(µ, σ) Y log Y Y (log normal distribution) ( (St. Petersburg paradox)). k 2 k k ( ) 2 k p(k) p(k) = 2 k 2 kp(k) = k 2 k + = + + =. k=

47 ( (Cauchy) ). p(x) = π( + x 2 ) (Cauchy distribution) a t B.2 p(x) 0.5 Cauchy p(x) x 0 x p(x) π( + x 2 ) dx = π [arctan x] = ( π ( π 2 π )) = 2 x [ ] π( + x 2 ) dx = 2 x 0 π( + x 2 ) dx = 2 2π log( + x2 ) = + 0 [ x π( + x 2 dx = lim ) L,R 2π log( + x2 ) = lim L,R 2π log ] R ( + R 2 L + ( L) 2 R = L 0 R = 2L log 2 π 0 a (Lorentz distribution) (Breit-Wigner distribution) )

48 mo ( ). X, X 2,..., X n a, a 2,..., a n P (X a, X 2 a 2,..., X n a n ) = P (X a )P (X 2 a 2 ) P (X n a n ) X, X 2 P (X a ) A = {ω Ω X (ω) a } P (A ) P (X 2 a 2 ) A 2 = {ω Ω X 2 (ω) a 2 } P (A 2 ) P (X a, X 2 a 2 ) A 2 = {ω Ω X (ω) a X 2 (ω) a 2 } = A A 2 P (A A 2 ) = P (A )P (A 2 ), A A 2 a, a 2 X X 2 X X ( ). X, X 2,..., X n f (x), f 2 (x),..., f n (x) E[f (X )f 2 (X 2 ) f n (X n )] = E[f (X )]E[f 2 (X 2 )] E[f n (X n )] 5.3. X, Y f(x) = x, g(y) = y = E[f(X)g(Y )] = E[XY ], = E[f(X)]E[g(Y )] = E[X]E[Y ] X, Y E[XY ] = E[X]E[Y ]

49 ( ). X, Y V [X ± Y ] = V [X] + V [Y ].. µ X = E[X], µ Y = E[Y ] 4.4 E[X ± Y ] = E[X] ± E[Y ] = µ X ± µ Y (5.) V [X ± Y ] = E[(X ± Y E[X ± Y ]) 2 ] = E[(X ± Y (µ X ± µ Y )) 2 ] = E[((X µ X ) ± (Y µ Y )) 2 ] = E[(X µ X ) 2 + (Y µ Y ) 2 ± 2(X µ X )(Y µ Y )] = E[(X µ X ) 2 ] + E[(Y µ Y ) 2 ] ± 2E[(X µ X )(Y µ Y )] = V [X] + V [Y ] ± 2E[(X µ X )(Y µ Y )] f(x) = x µ X, g(y) = y µ Y (5.2) E[(X µ X )(Y µ Y )] = E[X µ X ]E[Y µ Y ] = (E[X] µ X )(E[Y ] µ Y ) = ± + E[X Y ] = E[X] E[Y ] V [X Y ] = V [X] V [Y ] X, Y V [X Y ] = V [X] + V [Y ] (a ± b) 2 = a 2 ± 2ab + b 2 b 2 (±) 2 = E[(X µ X )(Y µ Y )] 0 V [X +Y ] V [X]+V [Y ] Y = X X Y V [X + Y ] = V [0] = 0 Y = X X Y V [X + Y ] = V [2X] = 4V [X] > V [X] + V [Y ] Cov(X, Y ) def = E[(X E[X])(Y E[Y ])] X Y (covariance) X, Y Cov(X, Y ) = 0 X Y (5.2) (5.) V [X ± Y ] = V [X] + V [Y ] ± 2 Cov(X, Y ) X,..., X n V [X + + X n ] = = n V [X k ] + k= k= n Cov(X k, X l ) k= l k n n V [X k ] + 2 Cov(X k, X l ) k= l>k ( ) 2 a k = k k a 2 k + k a k a l (5.) l k

50 Cov(X, Y ) = E[XY ] E[X]E[Y ] ( ). X, X 2,... (independent and identically distributed, i.i.d.) µ = E[X k ], σ 2 = V [X k ] (k =, 2,... ) 5.9 ( ). X k k X, X 2,... i.i.d. µ = E[X ] σ = V [X ] 5.0 ( ). θ (0 θ ) k X k 0 X k P (X k = ) = θ, P (X k = 0) = θ, X, X 2,... i.i.d. E[X ] = θ 5. ( ). θ k X k X k P (X k = ) = θ, P (X k = 0) = θ i.i.d. (Bernoulli process) X, X 2,... i.i.d. E[X ] = µ, V [X ] = σ X n := n (sample mean) n k= X k 5. X 0 0 X n E[X n ] = µ. X n µ (n ).

Proposition 5.12. E[X̄_n] = µ and V[X̄_n] = σ²/n.

Proof.

    E[X̄_n] = E[(1/n) Σ_{k=1}^n X_k] = (1/n) Σ_{k=1}^n E[X_k] = (1/n) · nµ = µ,
    V[X̄_n] = V[(1/n) Σ_{k=1}^n X_k] = (1/n²) Σ_{k=1}^n V[X_k] = (1/n²) · nσ² = σ²/n,

using independence for the variance (Theorem 5.4).

Remark 5.13. For an i.i.d. sequence, V[X̄_n] = σ²/n → 0 as n → ∞: the sample mean fluctuates less and less and concentrates at µ. This is the content of the law of large numbers.

Theorem 5.14 (law of large numbers). Let X_1, X_2, ... be i.i.d. with E[|X_1|] < ∞ and µ = E[X_1]. Then

(1) (weak law) for every ε > 0, lim_{n→∞} P(|X̄_n − µ| ≥ ε) = 0;
(2) (strong law) P(lim_{n→∞} X̄_n = µ) = 1.

Proof of (1), assuming additionally σ² = V[X_1] < ∞. Since E[X̄_n] = µ and V[X̄_n] = σ²/n, Chebyshev's inequality gives

    P(|X̄_n − µ| ≥ ε) ≤ V[X̄_n]/ε² = σ²/(nε²) → 0  (n → ∞).

Example 5.15 (coin tossing). Let X_1, X_2, ... be i.i.d. with P(X_k = 1) = θ and P(X_k = 0) = 1 − θ (heads coded as 1). Then X̄_n is the proportion of heads among the first n tosses, and by the law of large numbers

    lim_{n→∞} X̄_n = θ  with probability 1:

the empirical frequency of heads converges to the probability of heads.
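The weak law is easy to watch numerically. A sketch of Example 5.15's coin tosses (the choice θ = 0.3 here is arbitrary, for illustration only):

```python
import random

random.seed(1)

def sample_mean(n, theta=0.3):
    """Mean of n i.i.d. Bernoulli(theta) draws: the proportion of heads."""
    return sum(random.random() < theta for _ in range(n)) / n

# The sample mean approaches mu = theta as n grows (law of large numbers).
for n in (10, 1000, 100_000):
    print(n, sample_mean(n))
```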

52 52 5 set X X 2 X 3 X 4 X 5 X 6 X 7 X 8 X 9 X 0 X X 2 X 3 X 4 X 5 X 5 st /5 2nd /5 3rd /5 X k X n /2 n = 5 X 5 = 3/5, 2/5 n /2 n = n X n θ n = 000 n n = 0000 n = X n θ = 0.8 θ = 0.5 θ = n 4: X n n E[X k ] = θ θ = 0.2, 0.5, ( ). Cauchy 4.27 i.i.d. X, X 2,... 5 Cauchy 0 X n 0 a a 5.4 lim n X n = µ n 5.5 n X n µ n (central limit theorem)

[Figure 5: sample means X̄_n of i.i.d. Cauchy variables. X̄_n keeps fluctuating and does not converge: the Cauchy distribution has no mean (Example 4.27), so the law of large numbers does not apply.]

The law of large numbers says X̄_n − µ → 0; magnifying this difference by √n reveals a universal limiting shape, the central limit theorem:

Theorem 5.17 (central limit theorem). Let X_1, X_2, ... be i.i.d. with µ = E[X_1] and σ² = V[X_1] < ∞. Then the standardized sample mean

    Z_n = (X̄_n − µ) / (σ/√n)

converges in distribution to the standard normal N(0, 1): for all a < b,

    lim_{n→∞} P(a ≤ (X̄_n − µ)/(σ/√n) ≤ b) = ∫_a^b (1/√(2π)) e^{−x²/2} dx.

Note that Z_n is exactly the standardization of X̄_n: by Proposition 5.12 and Definition 4.18, E[Z_n] = 0 and V[Z_n] = 1 for every n. The theorem says that, in addition, the distribution of Z_n approaches N(0, 1) whatever the distribution of the X_k (see Appendix A.3 for the normal distribution).
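A quick Monte Carlo illustration of Theorem 5.17 with Uniform(0, 1) summands (the quantile 1.96 anticipates the confidence intervals of Section 7):

```python
import random

random.seed(2)

def standardized_mean(n, mu=0.5, sigma=(1 / 12) ** 0.5):
    """Z_n = (sample mean - mu) / (sigma / sqrt(n)) for n i.i.d. Uniform(0,1) draws."""
    xbar = sum(random.random() for _ in range(n)) / n
    return (xbar - mu) / (sigma / n**0.5)

# By the CLT, Z_n is approximately N(0,1): about 95% of draws land in [-1.96, 1.96].
zs = [standardized_mean(50) for _ in range(10_000)]
share = sum(1 for z in zs if abs(z) <= 1.96) / len(zs)
print(round(share, 3))  # close to 0.95
```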

54 n = n = 5 n = : i.i.d. X, X 2,... X n X k n = n = 5 n = ( ). 5.0 k X k = X k = 0 θ P (X k = ) = θ, P (X k = 0) = θ X, X 2,... i.i.d. µ = E[X k ] = θ n P ( a X ) n µ a σ/ n a e x2 2 dx a 2π = Φ(a) Φ( a) = 2Φ(a) Φ(z) (A.4) A.3 a =.96 Φ(.96) P 0.95 (.96 X ) n µ σ/ n.96 2Φ(.96) 0.95, X n.96σ n µ = θ X n +.96σ n

55 X n θ = µ X n σ 2 = θ( θ) = θ 2 σ X n ( X n ) n = X n 0 08 X n ( X n ) θ X n X n ( X n ) θ X n ±0.08 X n ( X n ) X n ± a a %.%.% θ = 0.5 ±0.05% n Φ(2.576) P ( X ) n µ σ/ n ±2.576 σ n = ±2.576 θ = 0.5 ± n θ( θ). n n

56 sample size ( number of samples) parameter (estimator) (estimate) X n = n u 2 n = n n k= X k n (X k X n ) 2 k= n 6. (population) * 23 I * 24 (sample survey) (statistical inference) (inferential statistics) (descriptive statistics) (sampling) (sample) (sample size) *23 *

57 6. 57 N n = n (sampling with replacement) (sampling without replacement) N n (N = ) 2.5, 6.2 N n/n 6.8 (simple random sampling) n N ( ). N n N! N(N ) (N n + ) N C n = = (N n)! n! n! (6.) n! = N C n N(N ) (N n + ) n N N N n n! (6.2) n! N n n n N N n (6.) (6.2) , 7, 72, 73,

58 i.i.d. (population distribution) n X,..., X n (N = ) X,..., X n 2.5 N n (sample size) n X,..., X n i.i.d. 6.3 (parameter) (population parameter) N(µ, σ 2 ) Poisson Poi(λ) µ, σ 2, λ (parametric) parameter (non-parametric) (population mean) (population variance) Poisson N(µ, σ 2 ) 6.4. parameter

59 6.4 (estimator) 59 X,..., X n i.i.d µ, σ 2 k n E[X k ] = E[X ] = µ, V [X k ] = V [X ] = σ (estimator) µ X n = n X,..., X n X,..., X n S(X,..., X n ) (statistic) * 25 (sample distribution) B (estimator) * 26 θ ˆθ = ˆθ n = ˆθ(X,..., X n ) n k= X k ˆ * 27 (estimate) X = x,..., X n = x n ˆθ(x,..., x n ) (point estimation) X,..., X 5 3 X 5 3 x = 8/5, 7/5, 9/5 3 (number of samples) 3 (sample size) 77 *25 statistic statistic statistics *26 (test statistic) *27 ˆµ = n X k n k=

6.5 Unbiasedness

An estimator should not systematically over- or under-shoot the parameter it estimates; that is, it should have no bias:

Definition 6.7 (unbiased estimator). An estimator θ̂ of a parameter θ is unbiased if E[θ̂] = θ.

Example 6.8. For an i.i.d. sample, E[X̄_n] = µ (Proposition 5.12), so the sample mean X̄_n is an unbiased estimator of the population mean µ.

For the population variance σ², the natural candidate is the sample variance

    s_n² = (1/n) Σ_{k=1}^n (X_k − X̄_n)²,    (6.3)

but this turns out to be biased.

Definition 6.9 (unbiased variance). The unbiased variance is

    u_n² = (n/(n − 1)) s_n² = (1/(n − 1)) Σ_{k=1}^n (X_k − X̄_n)².    (6.4)

Remark 6.10. Many textbooks write s² for either (6.3) or (6.4); when reading, always check which divisor, n or n − 1, is meant.

Remark 6.11. In Excel, s_n² is VAR.P and u_n² is VAR.S; in R, var() computes u_n². The two are related by s_n² = ((n − 1)/n) u_n², a small difference when n is large.
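The bias of s_n² versus the unbiasedness of u_n² shows up clearly in simulation. A sketch with divisors n and n − 1, mirroring VAR.P and VAR.S (helper names are mine):

```python
import random

random.seed(3)

def s2(xs):
    """Sample variance with divisor n (biased; Excel's VAR.P)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def u2(xs):
    """Unbiased variance with divisor n - 1 (Excel's VAR.S, R's var())."""
    return s2(xs) * len(xs) / (len(xs) - 1)

# Average both estimators over many small samples from Uniform(0,1), true sigma^2 = 1/12.
n, trials = 5, 20_000
samples = [[random.random() for _ in range(n)] for _ in range(trials)]
mean_s2 = sum(s2(s) for s in samples) / trials
mean_u2 = sum(u2(s) for s in samples) / trials
print(round(mean_s2, 4), round(mean_u2, 4))  # s2 underestimates 1/12 ≈ 0.0833; u2 does not
```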

61 6.6 (consistency) u 2 n σ2 E[u 2 n] = σ 2.. E[s 2 n] = n n σ2 E[u 2 n] = n n n n σ2 = σ 2 s 2 n = n = n n { (Xk µ) (X n µ) } 2 k= n (X k µ) 2 2 n k= n (X k µ)(x n µ) + n k= n (X n µ) 2. k 2, 3 (X n µ) 3 n n (X n µ) 2 = (X n µ) 2, k= k= 2 2 n { n (X k µ)(x n µ) = 2(X n µ) n k= = 2(X n µ) 2 n X k n k= } n µ k= (6.5) sn 2 = n (X k µ) 2 (X n µ) 2 n k= E[s 2 n] = n n E[(X k µ) 2 ] E[(X n µ) 2 ] = }{{}}{{} n nσ2 σ2 n = n n =V [X k ]=σ 2 k= =V [X n]=σ 2 /n σ σ u 2 n 6.6 [ u ] E 2 n σ [ u ] [ u ] E 2 n = σ E 2 n = E[u 2 n] = σ 2 = σ x2 dx x 2 dx 6.6 (consistency) X n n µ θ ˆθ ( ) (consistent estimator) : ε > 0 lim P ˆθn θ > ε = 0 n

62 62 6 ( P lim n ) ˆθ n = θ = 6.5. ( ) ( ) n n ( ). lim n u2 n = lim n n s2 n = lim lim n n n s2 n = lim n s2 n s2 n (6.5) lim n s2 n = lim n n n (X k µ) 2 lim (X n µ) 2 n k= ( ) 2 P lim (X n µ) 2 = 0 = Y k = (X k µ) 2 n ( ) E[Y k ] = V [X k ] = σ 2 n P lim (X k µ) 2 = σ 2 =. n n ( k= P lim n s2 n = σ 2) =, u 2 n σ E[ u 2 n] σ u 2 n n u 2 n σ 6.7 (efficiency) T = X + X 2 X 3, T 2 = X 3 = X + X 2 + X 3 3 E[T ] = E[X + X 2 X 3 ] = E[X ] + E[X 2 ] E[X 3 ] = µ + µ µ = µ E[T 2 ] = E[X 3 ] = µ µ T, T 2 4.6, 5.4 V [T ] = V [X ] + V [X 2 ] + V [X 3 ] = 3σ 2 V [T 2 ] = V [X 3 ] = σ2 3 T 2 T 2 T (efficient estimator) (maximum likelihood estimator) * X, X 2 µ T = αx + βx 2 α, β *28

63 T E[T ] = E[αX + βx 2 ] = αe[x ] + βe[x 2 ] = (α + β)µ E[T ] = µ α + β = V [T ] = V [αx + βx 2 ] = α 2 V [X ] + β 2 V [X 2 ] = (α 2 + β 2 )σ 2 α = β = c,..., c n T = c X + + c n X n (best linear unbiased estimator, BLUE) 6.8 {x, x 2,..., x N } µ σ 2 µ = N σ 2 = N N i= x k N (x i µ) 2 = N i= N x 2 i µ 2 i= n X,..., X n 6.9. X k ( k n) (6.6) P [X k = x i ] = N ( i N) E[X k ] = µ, V [X k ] = σ 2.. (6.6) N n N(N ) (N n + ) X k = x i k x i N n (N )(N 2) (N (n ) + ) (6.6) E[X k ] = N x i = µ N V [X k ] = N N (x i µ) 2 = σ 2 i= i= X n s 2 n (6.7) (6.8) (6.9) E[X n ] = µ V [X n ] = σ2 n N n N E[s 2 n] = n n σ2 N N

64 64 6 (6.9) s 2 n N N n n s2 n = N N n N (6.8) V [X n ] = σ2 n N n N C F = (finite population correction factor, FPC) u2 n N n N V [X n ] C 2 F C F N N n Z n = X n µ N(0, ) σ2 /n X n µ C 2 F σ 2 /n X n ± z(α/2) σ C F n t X n ± t n (α/2) u2 n C F 2 0 Todo: {x,..., x N } 6.20 (6.7) E[X n ] = n k= n E[X k] = nµ = µ n (6.8) (6.9) (6.5) s 2 n (6.5) (6.8) E[s 2 n] = n n E[(X k µ) 2 ] E[(X n µ) 2 ] = }{{}}{{} n nσ2 V [X n ] =V [X k ]=σ 2 k= (6.8) 5.6 (6.0) =V [X n] = σ 2 σ2 n N n (n )N = N n(n ) σ2. [ ] Σk X k V [X n ] = V = n n 2 V [Σ kx k ] = n n n 2 V [X k ] + Cov(X k, X l ) k= k= l k (X k, X l ) N(N )

65 Cov(X k, X l ) = = N(N ) (x i µ)(x j µ) i j N (x i µ) N(N ) i= }{{} =0 = σ2 N k, l (B.3) 2 N (x i µ) 2 i= }{{} =Nσ 2 V [X n ] = ( ) n 2 nσ 2 + n(n ) σ2 = σ2 N n N n N

Summary of Chapter 6: for an i.i.d. sample X₁, ..., Xₙ with μ = E[Xₖ] and σ² = V[Xₖ], the basic statistics are

    X̄ₙ = (1/n) Σₖ₌₁ⁿ Xₖ,  u²ₙ = (1/(n-1)) Σₖ₌₁ⁿ (Xₖ - X̄ₙ)².

7 Interval estimation

A point estimate alone does not say how far it may be from the truth; an interval estimate does.

7.1 (confidence interval). Let 0 < α < 1, and suppose statistics L, U computed from the sample satisfy

    (7.1) P( L ≤ θ ≤ U ) = 1 - α.

Then [L, U] is a 100(1-α)% confidence interval (CI) for θ; 1-α is the confidence coefficient and 100(1-α)% the confidence level (CL). Note that L and U are random while θ is a fixed unknown: it is the interval that varies from sample to sample. Condition (7.1) is equivalent to P( θ ∉ [L, U] ) = α. Typical confidence levels are 95, 90, and 99%.*29

7.1 The upper α point. For X ~ N(0,1) and 0 < α < 1, z(α) denotes the number with P(X > z(α)) = α (A.3). Writing Φ(z) = P[X ≤ z] for the N(0,1) distribution function,

    1 - α = P[X ≤ z(α)] = Φ(z(α)),  i.e.  z(α) = Φ⁻¹(1-α);

z(α) is the upper 100α% point of N(0,1), read off from a table or a PC.*30

(Figure: the upper α point z(α) of the standard normal density.)

7.2 (aside). Upper α points of other distributions are defined the same way: F⁻¹(1-α) for the distribution function F.

Construction of the interval. Let X₁, ..., Xₙ be i.i.d. N(μ, σ²) with σ² known. By A.24 and A.15,

    Zₙ = (X̄ₙ - μ) / (σ/√n) ~ N(0,1)

exactly, whatever μ is. Hence P[ |Zₙ| ≤ z(α/2) ] = 1 - α, i.e. with probability 1-α,

    -z(α/2) ≤ (X̄ₙ - μ)/(σ/√n) ≤ z(α/2)
    ⟺ -z(α/2)·σ/√n ≤ X̄ₙ - μ ≤ z(α/2)·σ/√n
    ⟺ X̄ₙ - z(α/2)·σ/√n ≤ μ ≤ X̄ₙ + z(α/2)·σ/√n,

so that

    P[ X̄ₙ - z(α/2)·σ/√n ≤ μ ≤ X̄ₙ + z(α/2)·σ/√n ] = 1 - α.

Thus, when σ² is known, X̄ₙ ± z(α/2)·σ/√n is a 100(1-α)% confidence interval for μ.*31

7.3 (example). Assume σ² = 10² is known and n = 10 measurements have sample mean x̄₁₀ = 99.9. Find 95% and 99% confidence intervals for μ. For α = 0.05, z(α/2) = z(0.025) = 1.96, so the 95% CI is

    99.9 ± 1.96 · 10/√10 = 99.9 ± 6.2,  i.e. [93.7, 106.1].

For the 99% CI use z(0.005) = 2.576:

    99.9 ± 2.576 · 10/√10 = 99.9 ± 8.1,  i.e. [91.8, 108.0].

A higher confidence level gives a wider interval.

7.4 (extreme levels). Taking α = 0 would give 100% confidence, but z(0) = +∞ and the interval is (-∞, +∞): certain to contain μ, and useless. Taking α = 1 gives z(1/2) = 0 and the degenerate "interval" [X̄ₙ, X̄ₙ], the single point X̄ₙ, which contains μ with probability 0.
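The 95% interval of Example 7.3 can be reproduced with the standard library alone; `statistics.NormalDist` supplies the quantile z(α/2):

```python
from math import sqrt
from statistics import NormalDist

def mean_ci_known_sigma(xbar, sigma, n, alpha=0.05):
    """CI for mu with sigma known: xbar +/- z(alpha/2) * sigma / sqrt(n)."""
    z = NormalDist().inv_cdf(1 - alpha / 2)   # z(alpha/2); 1.96 for alpha = 0.05
    half = z * sigma / sqrt(n)
    return xbar - half, xbar + half

lo, hi = mean_ci_known_sigma(99.9, 10.0, 10)
print(round(lo, 1), round(hi, 1))  # 93.7 106.1
```

Passing alpha=0.01 instead reproduces the wider 99% interval.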

What does "95% confidence" mean? It is a statement about the procedure, not about any one interval. To see this, simulate with the truth known to us: let μ = 100 and draw 10 values from N(100, 10²). One such sample gives the 95% CI [93.7, 106.1], which contains μ = 100. A second sample,

    103, 105, 80, 82, 113, 78, 90, 99, 103, 92,

has x̄₁₀ = 94.5 and 95% CI [88.3, 100.7], again containing 100. A third,

    79, 90, 102, 84, 97, 85, 109, 109, 75, 94,

gives the 95% CI [86.2, 98.6], which does not contain 100. Repeating this many times, about 95% of the intervals [L, U] so computed cover the true mean; that long-run coverage is what the confidence level asserts.

(Figure 7: 95% CI on the mean (μ = 100); 100 samples of size 10 from N(100, 10²); roughly 95% of the plotted intervals contain μ.)

When σ² is unknown the interval above cannot be computed. Instead, estimate σ² by u²ₙ and use the t distribution:*32 by B.8, for an i.i.d. sample from N(μ, σ²) the statistic

    T = (X̄ₙ - μ) / √(u²ₙ/n)
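The coverage experiment behind Figure 7 is easy to re-run; a sketch, with the seed and the number of trials chosen arbitrarily:

```python
import random
from math import sqrt

# Repeatedly draw n = 10 observations from N(100, 10^2), form the 95%
# z-interval, and count how often it covers the true mean mu = 100.
random.seed(1)
mu, sigma, n, z = 100.0, 10.0, 10, 1.96
covered, trials = 0, 2000
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    half = z * sigma / sqrt(n)
    if xbar - half <= mu <= xbar + half:
        covered += 1
rate = covered / trials
print(rate)  # close to 0.95
```

Individual intervals either do or do not contain μ; only the long-run rate is 95%.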

follows the t distribution with n-1 degrees of freedom, t_{n-1}, whatever μ and σ² are.*33 Like N(0,1), t_{n-1} is symmetric about 0; its upper α point t_{n-1}(α) is defined by P[T > t_{n-1}(α)] = α, so that P( |T| ≤ t_{n-1}(α/2) ) = 1 - α. The same manipulation as before gives

    P( X̄ₙ - t_{n-1}(α/2)·√(u²ₙ/n) ≤ μ ≤ X̄ₙ + t_{n-1}(α/2)·√(u²ₙ/n) ) = 1 - α,

so with σ² unknown a 100(1-α)% CI for μ is

    [ X̄ₙ - t_{n-1}(α/2)·√(u²ₙ/n),  X̄ₙ + t_{n-1}(α/2)·√(u²ₙ/n) ].

7.5 (example). From n = 16 measurements, x̄₁₆ = 32.6 and u²₁₆ = 9.28; find a 95% CI for μ. A table or PC gives t₁₅(0.05/2) = t₁₅(0.025) = 2.131, so

    x̄₁₆ ± t₁₅(0.025)·√(u²₁₆/16) = 32.6 ± 2.131·√(9.28/16) = 32.6 ± 1.623,

i.e. [30.98, 34.22]. Using the normal point instead, x̄₁₆ ± z(0.025)·√(u²₁₆/16) = 32.6 ± 1.493 ≈ [31.11, 34.09], slightly too narrow: the t point compensates for also having estimated σ².

Large samples. If X₁, ..., Xₙ is a large i.i.d. sample from any population, then (X̄ₙ - μ)/√(σ²/n) ≈ N(0,1) by the central limit theorem, and σ² may be replaced by the consistent estimator u²ₙ (6.15). Hence an approximate 100(1-α)% CI for μ is

    [ X̄ₙ - z(α/2)·√(u²ₙ/n),  X̄ₙ + z(α/2)·√(u²ₙ/n) ].

(For a 0/1 population one may further replace u²ₙ by X̄ₙ(1 - X̄ₙ), giving the familiar approximate interval for a proportion.)

7.5 Interval estimation of σ². For an i.i.d. sample from N(μ, σ²), by B.5 the statistic

    χ² = (n-1)u²ₙ/σ² = n s²ₙ/σ² = Σₖ₌₁ⁿ ( (Xₖ - X̄ₙ)/σ )²

follows the chi-square distribution with n-1 degrees of freedom, χ²_{n-1}, whatever σ² is. Unlike N(0,1) and t, the χ² distributions are not symmetric, so both tails must be handled separately. For X ~ χ²ₙ the upper α point χ²ₙ(α) satisfies P(X > χ²ₙ(α)) = α, i.e. P(X ≤ χ²ₙ(α)) = 1 - α; since χ² ≥ 0, the lower tail is written with the upper 1-α point: P(X < χ²ₙ(1-α)) = α. Splitting α between the tails,

    P( χ² < χ²_{n-1}(1 - α/2) ) = α/2,  P( χ²_{n-1}(α/2) < χ² ) = α/2,

so

    P( χ²_{n-1}(1 - α/2) ≤ (n-1)u²ₙ/σ² ≤ χ²_{n-1}(α/2) ) = 1 - α.

Solving the inequalities for σ² gives the 100(1-α)% confidence interval

    (n-1)u²ₙ / χ²_{n-1}(α/2) ≤ σ² ≤ (n-1)u²ₙ / χ²_{n-1}(1 - α/2).

Example. For the 10 measurements of Example 7.3 (x̄₁₀ = 99.9), now treating σ² as unknown, find a 95% CI for σ². With α = 0.05 and n-1 = 9, a PC gives χ²₉(0.025) = 19.023 and χ²₉(0.975) = 2.700, so the interval is

    [ 9u²₁₀/19.023, 9u²₁₀/2.700 ],

whose lower endpoint works out to 73.7; taking square roots of both endpoints gives a 95% CI for σ itself.

(Figure 8: 95% CI on the variance (σ² = 100) and 95% CI on the SD (σ = 10); 100 samples of size 10 from N(100, 10²). The chi-square intervals are not symmetric about the point estimate.)

(Todo: interval estimation for other parameters, e.g. a proportion or a Poisson mean.)

8 Hypothesis testing

This chapter treats the χ², t, and F tests, together with the Z test and Welch's test, and mentions the Kolmogorov-Smirnov and Mann-Whitney tests.*34

8.1 An introductory example. Toss a coin n times and let Sₙ be the number of heads. If the coin is fair, Sₙ ~ B(n, 1/2), so for n = 10, 100, 1000 the expected counts are E[S₁₀] = 5, E[S₁₀₀] = 50, E[S₁₀₀₀] = 500. Suppose the observed counts are 6, 60, 600: each is 20% above expectation. Which of these outcomes is evidence that the coin is biased? This question is the prototype of hypothesis testing.

A PC gives

    P(S₁₀ ≥ 6)     = Σₖ₌₆^10   C(10,k) (1/2)ᵏ (1/2)^(10-k)   = 0.377,
    P(S₁₀₀ ≥ 60)   = Σₖ₌₆₀^100  C(100,k) (1/2)ᵏ (1/2)^(100-k)  = 0.028,
    P(S₁₀₀₀ ≥ 600) = Σₖ₌₆₀₀^1000 C(1000,k) (1/2)ᵏ (1/2)^(1000-k) ≈ 0.

So a fair coin shows 6 or more heads in 10 tosses about 38% of the time, nothing remarkable, while 60 or more heads in 100 tosses happens with probability below 3%, and 600 or more in 1000 tosses essentially never. Faced with "60 heads in 100 tosses" one can (1) keep the assumption θ = 1/2 and accept that an event of probability about 3% just happened, (2) suspend judgment, or (3) conclude that the assumption θ = 1/2 was wrong. Hypothesis testing formalizes attitude (3): when the observed outcome would be too rare under the assumed hypothesis, reject the hypothesis. How rare counts as "too rare" is fixed in advance by a conventional threshold such as 5%, 1%, or 0.1%.
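The three tail probabilities can be computed exactly with `math.comb`:

```python
from math import comb

def upper_tail_fair_coin(n, k):
    """P(S_n >= k) for S_n ~ B(n, 1/2), computed exactly as a ratio of integers."""
    return sum(comb(n, j) for j in range(k, n + 1)) / 2 ** n

p10 = upper_tail_fair_coin(10, 6)
p100 = upper_tail_fair_coin(100, 60)
p1000 = upper_tail_fair_coin(1000, 600)
print(round(p10, 3), round(p100, 4))  # 0.377 0.0284  (p1000 is of order 1e-10)
```

No normal approximation is needed here; the sums are over at most a few hundred binomial coefficients.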

The general recipe:

(1) Set up a null hypothesis H₀ about the population.
(2) Choose a test statistic X whose distribution under H₀ is known.
(3) Fix a significance level 0 < α < 1 (commonly α = 0.05 or 0.01).
(4) Find a critical region A with P_{H₀}(X ∈ A) = α, or, when exact equality is impossible, with P_{H₀}(X ∈ A) ≤ α (see 8.4).
(5) Observe the realized value x. If x ∈ A, reject H₀; if x ∉ A, do not reject H₀.

8.2 (remark). Steps (1)-(4) are fixed before the data are examined; only step (5) looks at the observation.

8.3 (example). A maker claims that the lifetime X of a product M satisfies P(X ≥ 3) = 0.99 and P(X ≥ 5) = 0.95, i.e. P(X < 3) = 0.01 and P(X < 5) = 0.05. Take H₀: the claim holds. At the 5% level the critical region is A₅ = {x < 5}; at the 1% level it is A₁ = {x < 3}. If the observed lifetime is x = 3.5, then x ∈ A₅, so H₀ is rejected at the 5% level; but x ∉ A₁, so H₀ is not rejected at the 1% level.

8.4 (example: the coin). H₀: θ = 1/2, with statistic Sₙ ~ B(n, θ) and n = 100. Since P(S₁₀₀ ≥ 59) ≈ 0.044 ≤ 0.05 < 0.067 ≈ P(S₁₀₀ ≥ 58), the one-sided critical region at the 5% level is A = {S₁₀₀ ≥ 59}; for a discrete statistic one settles for P_{H₀}(X ∈ A) ≤ α in step (4). The observed count 60 lies in A, so H₀ is rejected at the 5% level.

8.5 (example). A machine is specified to produce noise of 65 dB.*37 In n = 15 measurements the sample mean is 64 dB; suppose σ² = 15 is known. Test H₀: μ = 65 at the 5% level with

    Z = (X̄ₙ - μ) / √(σ²/n),

which follows N(0,1) under H₀. With the two-sided region, α = 0.05 gives P_{H₀}( |Z| > 1.96 ) = 0.05. Here

    z = (64 - 65) / √(15/15) = -1,

and |z| = 1 ≤ 1.96, so H₀ is not rejected at the 5% level. Even at the 10% level, P_{H₀}( |Z| > 1.645 ) = 0.10 and still |z| < 1.645: not rejected.

8.6 (example, one-sided). Suppose instead there is prior reason to suspect that the noise is lower than the specified 65 dB. With the same data (n = 15, x̄ = 64 dB, σ² = 15 known), test H₀: μ = 65 against the one-sided alternative μ < 65 at the 5% level. Under H₀, Z ~ N(0,1) and P(Z < -1.645) = 0.05, so the region is {z < -1.645}. Here z = -1 > -1.645, so H₀ is again not rejected at the 5% level.

On terminology: "null hypothesis" uses "null" in the sense of zero or nothing; H₀ is the hypothesis of "no effect", set up in the hope that the data will discredit it.

On "accepting" H₀. When x ∉ A some books say H₀ is "accepted", but failure to reject is not evidence that H₀ is true. The APA guidelines (Wilkinson et al., "Statistical methods in psychology journals: Guidelines and explanations") warn: "Never use the unfortunate expression 'accept the null hypothesis'." Fisher put it thus: "... the null hypothesis is never proved or established, but is possibly disproved, ..." Not rejecting H₀: θ = 1/2 merely means the data are compatible with it.

Two kinds of error.*38 Rejecting H₀ when it is true is an error of the first kind (type I error); failing to reject H₀ when it is false is an error of the second kind (type II error).*39 Writing α and β for their probabilities:

                    H₀ true                 H₀ false
    not reject      OK (prob. 1-α)          type II error (prob. β)
    reject          type I error (prob. α)  OK (prob. 1-β)

Since P_{H₀}(A) = α, the significance level is exactly the type I error probability. Shrinking α shrinks the critical region (A ⊂ B implies P(A) ≤ P(B)), so a smaller α means rejecting less readily.*40

Taking α = 0 gives P_{H₀}(A) = 0, e.g. A = ∅: type I errors never occur because nothing is ever rejected, a 100% useless test. So α > 0, but then many regions have the same level: Figure 9 shows regions A, B, C, D with

    P_{H₀}(A) = P_{H₀}(B) = P_{H₀}(C) = P_{H₀}(D) = α.

Which should be used?

(Figure 9: four critical regions of equal probability α under the null density p(x).)

The choice is governed by the type II error β, or equivalently by the power 1-β. Specify an alternative hypothesis H₁, the case taken to hold when H₀ fails; then

    β = P_{H₁}(X ∉ A),  power = P_{H₁}(X ∈ A) = 1 - β,

and among regions with P_{H₀}(X ∈ A) = α one prefers the region maximizing the power against H₁.*41

Example. X ~ N(μ, 1), H₀: μ = 0, with candidate regions

    A = [z(α), ∞),  B = (-∞, -z(α)],  C = (-∞, -z(α/2)] ∪ [z(α/2), ∞).

*41 Formally this is the business of the likelihood ratio and the Neyman-Pearson lemma.

(Figure 10: the statistic's density under H₀ is N(0,1); cut points -z(α/2), -z(α), z(α), z(α/2) delimit the candidate regions A, B, C.)

If H₁: μ > 0, the alternative shifts the density to the right, so among level-α regions I the power P_{H₁}(I) is largest for A = [z(α), ∞): reject only for large values. A test whose region is a single tail [c, ∞) (or (-∞, c]) is a one-sided test.

(Figure 11: under H₁: μ > 0, P_{H₁}(A) > P_{H₁}(C) > P_{H₁}(B); choose A.)

If H₁: μ < 0, the same reasoning selects B = (-∞, -z(α)].

(Figure 12: under H₁: μ < 0, P_{H₁}(B) > P_{H₁}(C) > P_{H₁}(A); choose B.)

If H₁: μ ≠ 0, neither single tail dominates, and one uses the two-tailed region C = (-∞, -c] ∪ [c, ∞) (two-sided test),

with c chosen so that P_{H₀}((-∞, -c]) = P_{H₀}([c, ∞)) = α/2. When the direction of a possible departure is not known in advance ("would it be H₁: μ > 0 or μ < 0?"), the two-sided test is the appropriate default.

(Figure 13: under H₁: μ ≠ 0, the two-sided region C.)

Summary of the procedure:

(1) Set up H₀ and H₁.
(2) Fix the significance level α.
(3) Among regions with P_{H₀}(X ∈ A) ≤ α choose the one maximizing P_{H₁}(X ∈ A): with the upper point u(α) defined by P_{H₀}(X > u(α)) = α and the lower point l(α) by P_{H₀}(X < l(α)) = α,

    A = [u(α), ∞)                        (upper one-sided),
    A = (-∞, l(α)]                       (lower one-sided),
    A = (-∞, l(α/2)] ∪ [u(α/2), ∞)       (two-sided).

(4) Reject H₀ exactly when the observed x ∈ A.

8.3 Tests from normal samples. The distribution theory of Appendix B (the χ², t, and F statistics) yields the standard tests below.*42

The Z test (mean, known variance).*43 This is the test already used in 8.5 and 8.6: for an i.i.d. sample from N(μ, σ²) with σ² known, H₀: μ = μ₀ (there μ₀ = 65) is tested with

    Z = (X̄ₙ - μ₀) / √(σ²/n),

which follows N(0,1) under H₀. The critical region is two-sided or one-sided according to H₁: in 8.5, H₁: μ ≠ 65; in 8.6, H₁: μ < 65.

The t test (mean, unknown variance). The statistic Z = (X̄ₙ - μ₀)/√(σ²/n) requires σ², which in practice is unknown. Substituting the unbiased variance u²ₙ gives

    T = (X̄ₙ - μ₀) / √(u²ₙ/n),

which under H₀: μ = μ₀ follows the t distribution t_{n-1} (B.8). Tests based on T are t tests.

8.10 (remark). T is Student's t statistic: Gosset, who found the t distribution, published under the pen name "Student".

8.11 (t test, from [6]). A sprinter's 100 m times used to have mean 58.8. Ten recent runs give

    57.6, 58.2, 56.2, 57.3, 58.7, 58.8, 56.3, 57.1, 57.3, 57.1.

Have the times improved, i.e. has the mean decreased? Test at the 10% level.
(1) H₀: μ = 58.8 against H₁: μ < 58.8.
(2) Statistic T = (X̄ₙ - 58.8)/√(u²ₙ/n) with n = 10; under H₀, T follows t₉.
(3) Lower one-sided region: since t₉(0.1) = 1.383, P_{t₉}(T < -1.383) = 0.1.
(4) The data give t = -4.756 < -1.383, so H₀ is rejected at the 10% level.

8.12 (the same test in R).

> t.test(c(57.6, 58.2, 56.2, 57.3, 58.7, 58.8, 56.3, 57.1, 57.3, 57.1),

+        alternative="less", mu=58.8)

        One Sample t-test

data:  c(57.6, 58.2, 56.2, 57.3, 58.7, 58.8, 56.3, 57.1, 57.3, 57.1)
t = -4.7561, df = 9, p-value = 0.0005
alternative hypothesis: true mean is less than 58.8
95 percent confidence interval:
     -Inf 57.98
sample estimates:
mean of x 
    57.46 

The reported p-value is far below 0.05, and the one-sided 95% confidence interval (-∞, 57.98] does not contain 58.8, consistent with rejecting H₀. The one-sided alternative is requested with alternative="less" (cf. 8.6).

8.13 (p-value). Instead of fixing α and reporting only reject/not reject, one reports the p-value: the probability, under H₀, of a value of the statistic at least as extreme (in the direction of H₁) as the one observed. In 8.11, p = P(T < -4.756) = 0.0005; H₀ is rejected at level α exactly when p < α, so the p-value conveys the outcome of the test at every level at once. A small p means the data are hard to reconcile with H₀; it is not the probability that H₀ is true, and the usual cautions about over-interpreting p-values apply.

For large samples the t distribution is unnecessary, say for n around 100; for n around 20 the difference from t still matters.*44

8.14 (large-sample test, from [6]). A pollutant was measured at n = 195 sites; the sample variance is 0.03 (ppm²) and the environmental standard is 0.3 (ppm). Does the mean concentration exceed the standard? Test at the 1% level.
(1) H₀: μ = 0.3 against H₁: μ > 0.3.
(2) Since n = 195 is large, Z = (X̄ₙ - 0.3)/√(u²ₙ/n) is approximately N(0,1) under H₀, by the central limit theorem and the consistency of u²ₙ.
(3) Upper one-sided region: P(Z > 2.326) = 0.01.
(4) The data give z = 1.25 < 2.326, so H₀ is not rejected at the 1% level.

8.15 (remark). Z above is just the t statistic referred to N(0,1) instead of t_{n-1}; for large n the two reference distributions nearly coincide:

    z(0.05) = 1.645, while t₂₀(0.05) = 1.725 and t₉₄(0.05) = 1.661.

With n in the hundreds either choice gives practically the same test. The same large-sample reasoning underlies the χ² tests below.

8.16 (test of a variance, from [6]). A flow rate (ml/sec) was measured n = 20 times. Is its variability below the standard σ = 10? Test at the 5% level.
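The one-sided p-value and the 1% cutoff in the large-sample z-test above come straight from the standard normal tail; a stdlib sketch:

```python
from statistics import NormalDist

def one_sided_p_upper(z):
    """p-value P(Z > z) for an upper-tailed large-sample z-test."""
    return 1 - NormalDist().cdf(z)

p = one_sided_p_upper(1.25)               # observed z from the example
z_crit = NormalDist().inv_cdf(0.99)       # upper 1% point z(0.01)
print(round(p, 4), round(z_crit, 3))  # 0.1056 2.326
```

Since p is about 0.11, well above 0.01 (equivalently z = 1.25 < 2.326), the null hypothesis survives, matching the decision in the worked example.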

(1) H₀: σ² = 10² against H₁: σ² < 10².
(2) Statistic χ² = (n-1)u²ₙ/σ² with n = 20; under H₀ it follows χ²₁₉.
(3) Lower one-sided region: P_{χ²₁₉}(χ² < 10.12) = 0.05, i.e. the cutoff is the lower 5% point χ²₁₉(0.95).
(4) The data give χ² = 19u²₂₀/10² = 14.75 > 10.12, so H₀ is not rejected at the 5% level.

Two-sample problems. Let X₁, X₂, ..., X_m be i.i.d. N(μₓ, σₓ²) and, independently, Y₁, Y₂, ..., Yₙ i.i.d. N(μ_y, σ_y²), and write

    X̄ = (1/m) Σₖ₌₁ᵐ Xₖ,  Ȳ = (1/n) Σₖ₌₁ⁿ Yₖ,
    u²ₓ = (1/(m-1)) Σₖ₌₁ᵐ (Xₖ - X̄)²,  u²_y = (1/(n-1)) Σₖ₌₁ⁿ (Yₖ - Ȳ)².

To compare the means (the two-sample t test and Welch's test), start from X̄ ~ N(μₓ, σₓ²/m) and Ȳ ~ N(μ_y, σ_y²/n), independent, so that

    (8.1) X̄ - Ȳ ~ N( μₓ - μ_y, σₓ²/m + σ_y²/n ),

i.e.

    ( X̄ - Ȳ - (μₓ - μ_y) ) / √( σₓ²/m + σ_y²/n ) ~ N(0,1).

If the variances are equal, σₓ² = σ_y² = σ², estimate σ² by the pooled variance

    σ̂² = ( (m-1)u²ₓ + (n-1)u²_y ) / (m + n - 2);

then

    ( X̄ - Ȳ - (μₓ - μ_y) ) / √( σ̂²(1/m + 1/n) )

follows the t distribution with m + n - 2 degrees of freedom.

Without assuming σₓ² = σ_y², replace the unknown variances in (8.1) by their estimators:

    T = ( X̄ - Ȳ - (μₓ - μ_y) ) / √( u²ₓ/m + u²_y/n )

is approximately t-distributed with degrees of freedom

    ν = ( u²ₓ/m + u²_y/n )² / ( (u²ₓ/m)²/(m-1) + (u²_y/n)²/(n-1) )

(generally not an integer). The test of H₀: μₓ = μ_y based on T and t_ν is Welch's test.

8.17 (remark). Should one use the pooled two-sample t test or Welch's test? One school first tests equality of variances (with the F test below) and chooses accordingly; another recommends simply using Welch's test from the start, see [7].

8.18 (example). Two sets of measurements:

    x : 203, 203, 197, 198, 206, 216, 195, 200, 202, 199
    y : 204, 195, 210, 229, 221, 211, 204, 193, 215, 220

Test H₀: μₓ = μ_y at the 5% level with Welch's test in R:

> x <- c(203, 203, 197, 198, 206, 216, 195, 200, 202, 199)
> y <- c(204, 195, 210, 229, 221, 211, 204, 193, 215, 220)
> t.test(x, y)

        Welch Two Sample t-test

data:  x and y
t = -2.026, df = 13.464, p-value = 0.063
alternative hypothesis: true difference in means is not equal to 0
95 percent confidence interval:
 -17.12   0.52
sample estimates:

mean of x mean of y 
    201.9     210.2 

The two-sided p-value exceeds 0.05, so H₀ is not rejected at the 5% level. The one-sided version:

> t.test(x, y, alternative="less")

        Welch Two Sample t-test

data:  x and y
t = -2.026, df = 13.464, p-value = 0.031
alternative hypothesis: true difference in means is less than 0
95 percent confidence interval:
     -Inf -1.06
sample estimates:
mean of x mean of y 
    201.9     210.2 

Now the one-sided p-value is below 0.05 and H₀ is rejected: whether a one- or two-sided test is appropriate must be settled before looking at the data.

Paired data. If the two samples are paired, the same n individuals measured twice as X₁, ..., Xₙ and Y₁, ..., Yₙ, do not use a two-sample test: form the differences dₖ = Xₖ - Yₖ and apply the one-sample t test to them.

Comparing variances. Since (m-1)u²ₓ/σₓ² ~ χ²_{m-1} and (n-1)u²_y/σ_y² ~ χ²_{n-1} independently, the ratio

    (u²ₓ/σₓ²) / (u²_y/σ_y²) ~ F_{m-1, n-1},

so under H₀: σₓ² = σ_y² the statistic u²ₓ/u²_y follows F_{m-1, n-1}: the F test.

8.5 χ² tests. Pearson's χ² statistic compares observed and expected frequencies:

    χ² = Σ { (Observed) - (Expected) }² / (Expected) = Σ (O - E)²/E.

89 8.5 χ χ 2 (χ 2 goodness of fit test) K A, A 2,..., A K H 0 : P (A ) = p, P (A 2 ) = p 2,, P (A K ) = p K n H 0 m = np, m 2 = np 2,, m K = np K f,, f K ( K i= f i = n) A A 2 A K f f 2 f K n m m 2 m K n n m i m i 0 H 0 χ 2 = (O E) 2 E k (f i m i ) 2 = i= m i k χ 2 χ 2 k. K = 2?? f 2 = n f, p 2 = p χ 2 = (f m ) 2 + (f 2 m 2 ) 2 m m 2 = (f np ) 2 + (n f (n np )) 2 np n( p ) { = (f np ) 2 np ( p ) = f np np ( p ) f B(n, p ) n χ 2 } 2 f i = m i χ 2 = 0 f i m i χ 2 α P (χ 2 > χ 2 k (α)) = α % H 0 : P (A ) = P (A 2 ) = = P (A 6 ) = /6 m = = m 6 =

Example (a die). A die is thrown 120 times; test H₀: P(A₁) = ⋯ = P(A₆) = 1/6 at the 5% level. The expected frequencies are m₁ = ⋯ = m₆ = 120 · (1/6) = 20. With observed frequencies 18, 25, ..., 18 (total 120),

    χ² = (18-20)²/20 + (25-20)²/20 + ⋯ + (18-20)²/20 = 2.3.

Since K-1 = 5 and χ²₅(0.05) = 11.07, and 2.3 ≤ 11.07, H₀ is not rejected at the 5% level: the data are consistent with a fair die.

8.22 (Mendel's peas). The genetic model predicts phenotype ratios 9 : 3 : 3 : 1, i.e.

    H₀: p₁ = 9/16, p₂ = 3/16, p₃ = 3/16, p₄ = 1/16,

with K = 4 and K-1 = 3 degrees of freedom. Mendel's observed counts 315, 101, 108, 32 (n = 556) give expected counts 556·9/16 = 312.75, 104.25, 104.25, 34.75 and

    χ² = (315-312.75)²/312.75 + (101-104.25)²/104.25 + (108-104.25)²/104.25 + (32-34.75)²/34.75 = 0.470.

Since χ²₃(0.05) = 7.815 and 0.470 ≤ 7.815, H₀ is not rejected: the data fit the 9:3:3:1 law. In fact they fit almost too well: P[χ² < 0.470] is small, which has fueled the long debate over whether Mendel's data are "too good to be true".

A Poisson fit. In another data set, n = 300 counts are compared with the Poisson distribution

    p(k) = e^{-λ} λᵏ/k!,

with λ estimated from the data as λ̂ = 2.07, by tabulating the observed frequencies of k = 0, 1, 2, ... against 300·p(k):
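The Mendel calculation in 8.22 as code, with observed counts 315, 101, 108, 32 against the 9:3:3:1 ratio:

```python
def chi2_gof(observed, probs):
    """Pearson chi-square statistic sum (f_i - m_i)^2 / m_i for given
    observed counts and hypothesized category probabilities."""
    n = sum(observed)
    expected = [n * p for p in probs]
    return sum((f - m) ** 2 / m for f, m in zip(observed, expected))

chi2 = chi2_gof([315, 101, 108, 32], [9/16, 3/16, 3/16, 1/16])
print(round(chi2, 2))  # 0.47
```

Compared with χ²₃(0.05) = 7.815, this value is tiny, so the 9:3:3:1 hypothesis is nowhere near rejection.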

Grouping the counts into 7 classes would suggest 7-1 = 6 degrees of freedom and the 5% point χ²₆(0.05) = 12.59, and the statistic is computed as usual,

    χ² = (38-39)²/39 + ⋯ + (5-3)²/3.

But λ was estimated from the same data, and each parameter estimated from the data reduces the degrees of freedom by one: here the correct reference distribution is χ² with 7-1-1 = 5 degrees of freedom, with 5% point χ²₅(0.05) = 11.07.

Contingency tables (test of independence). n individuals are cross-classified by two attributes, A with r levels and B with s levels:

            B₁    B₂    ...  B_s   total
    A₁      n₁₁   n₁₂   ...  n₁ₛ   n₁.
    A₂      n₂₁   n₂₂   ...  n₂ₛ   n₂.
    ...
    A_r     n_r₁  n_r₂  ...  n_rₛ  n_r.
    total   n.₁   n.₂   ...  n.ₛ   n

where nᵢ. = Σⱼ₌₁ˢ nᵢⱼ and n.ⱼ = Σᵢ₌₁ʳ nᵢⱼ. The marginal probabilities are estimated by P̂(Aᵢ) = nᵢ./n and P̂(Bⱼ) = n.ⱼ/n. Under the null hypothesis that A and B are independent, P(Aᵢ ∩ Bⱼ) = P(Aᵢ)P(Bⱼ), so the expected count of cell (Aᵢ, Bⱼ) is

    mᵢⱼ = n · P̂(Aᵢ)P̂(Bⱼ) = n · (nᵢ./n)(n.ⱼ/n) = nᵢ. n.ⱼ / n.

If each mᵢⱼ is at least about 5, then under independence

    χ² = Σᵢ₌₁ʳ Σⱼ₌₁ˢ (nᵢⱼ - mᵢⱼ)²/mᵢⱼ,  mᵢⱼ = nᵢ. n.ⱼ / n,

is approximately χ² with (r-1)(s-1) degrees of freedom: the χ² test of independence of A and B.

8.25 (degrees of freedom). The r×s cell probabilities carry rs - 1 free parameters; estimating the marginals consumes (r-1) + (s-1) of them, leaving rs - 1 - (r-1) - (s-1) = (r-1)(s-1).
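The expected counts mᵢⱼ = nᵢ.n.ⱼ/n and the statistic for an r×s table can be sketched as follows; the 2×2 counts are hypothetical:

```python
def chi2_independence(table):
    """Chi-square statistic and df for an r x s contingency table,
    expected counts m_ij = (row total)(column total)/n, no correction."""
    r, s = len(table), len(table[0])
    row = [sum(vals) for vals in table]
    col = [sum(table[i][j] for i in range(r)) for j in range(s)]
    n = sum(row)
    stat = sum((table[i][j] - row[i] * col[j] / n) ** 2 / (row[i] * col[j] / n)
               for i in range(r) for j in range(s))
    return stat, (r - 1) * (s - 1)

stat, df = chi2_independence([[30, 10], [20, 40]])  # hypothetical 2x2 counts
print(round(stat, 2), df)  # 16.67 1
```

For these made-up counts the statistic far exceeds χ²₁(0.05) = 3.84, so independence would be rejected; R's chisq.test additionally applies the Yates correction to 2×2 tables.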

Again χ² = 0 when nᵢⱼ = mᵢⱼ in every cell, and χ² grows with the departure from independence, so the critical region at level α is {χ² > χ²_{(r-1)(s-1)}(α)}.

Yates continuity correction. For 2×2 tables the χ² approximation is improved by

    χ² = Σᵢ₌₁² Σⱼ₌₁² ( |nᵢⱼ - mᵢⱼ| - 1/2 )² / mᵢⱼ;

R's chisq.test applies this correction to 2×2 tables by default.

Example (2×2). For a 2×2 table the degrees of freedom are (2-1)(2-1) = 1 and P_{χ²₁}(χ² > 3.84) = 0.05. In the worked example, one of the expected counts mᵢⱼ works out to 26.4, and the corrected statistic is χ² = 3.06 ≤ 3.84, so the independence of A and B is not rejected at the 5% level.

A/B testing. Comparing two versions A and B of a page or product (an A/B test) fits this framework: with conversion counts distributed B(N_A, p_A) and B(N_B, p_B), the hypothesis of interest is H₀: p_A = p_B, which can be tested with the 2×2 χ² test above. (Todo)

The Brunner-Munzel test. (Todo)

8.8 (Todo)

8.9 (Todo)

A Catalog of distributions

This appendix collects the basic distributions used in the text; see also the references and the WWW.

A.1 Bernoulli distribution

A.11. For 0 ≤ θ ≤ 1, a random variable X with

    P(X = 1) = θ,  P(X = 0) = 1 - θ

follows the Bernoulli distribution with parameter θ; compactly,

    P(X = i) = θⁱ (1-θ)^{1-i}  (i = 0, 1).

X models a single yes/no trial with success probability θ.

A.12. The mean and variance are θ and θ(1-θ):

    E[X] = 1·θ + 0·(1-θ) = θ,  E[X²] = 1²·θ + 0²·(1-θ) = θ,
    V[X] = E[X²] - (E[X])² = θ - θ² = θ(1-θ).

A.13. Examples: any observation recorded as success/failure, e.g. a coin toss or whether a die shows a particular outcome. An i.i.d. sequence X₁, X₂, ... of such variables is a sequence of Bernoulli trials (a Bernoulli process) with success probability θ (cf. A.6 and i.i.d., 5.18).

A.2 Binomial distribution

From a Bernoulli process X₁, X₂, ..., Xₖ, ... form the partial sums

    (A.1) Sₙ = Σₖ₌₁ⁿ Xₖ,

the number of successes in the first n trials.

A.14 (binomial distribution). The distribution of Sₙ is called the binomial distribution B(n, θ); X ~ B(n, θ) means X follows it. In particular B(1, θ) is the Bernoulli distribution.

A.15. The probability mass function of B(n, θ) is

    (A.2) p(k) = nCk θᵏ (1-θ)^{n-k} for k = 0, 1, ..., n, and p(k) = 0 otherwise,

where nCk = n! / (k!(n-k)!) is the binomial coefficient.

Proof. Each Xₖ takes only the values 0, 1, so Sₙ ranges over 0, 1, ..., n, and P(Sₙ = k) = 0 for k < 0 or k > n. For 0 ≤ k ≤ n, the event {Sₙ = k} is the disjoint union of the events {X₁ = i₁, ..., Xₙ = iₙ} over all (i₁, ..., iₙ) ∈ {0,1}ⁿ with i₁ + ⋯ + iₙ = k.*45 By independence,

    P(Sₙ = k) = Σ_{i₁+⋯+iₙ=k} P(X₁ = i₁) ⋯ P(Xₙ = iₙ)
              = Σ_{i₁+⋯+iₙ=k} θ^{i₁}(1-θ)^{1-i₁} ⋯ θ^{iₙ}(1-θ)^{1-iₙ}
              = Σ_{i₁+⋯+iₙ=k} θᵏ (1-θ)^{n-k}
              = nCk θᵏ (1-θ)^{n-k},

since the number of 0-1 sequences of length n with exactly k ones is nCk.

*45 For example, for n = 3 and k = 2 the sequences (i₁, i₂, i₃) are (1,1,0), (1,0,1), (0,1,1).
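The pmf (A.2) and its basic properties can be checked numerically; a minimal sketch (n and θ chosen arbitrarily):

```python
from math import comb

def binom_pmf(n, theta, k):
    """P(S_n = k) = C(n,k) * theta^k * (1-theta)^(n-k) for S_n ~ B(n, theta)."""
    if k < 0 or k > n:
        return 0.0
    return comb(n, k) * theta ** k * (1 - theta) ** (n - k)

n, theta = 10, 0.3
pmf = [binom_pmf(n, theta, k) for k in range(n + 1)]
total = sum(pmf)                              # should be 1
mean = sum(k * p for k, p in enumerate(pmf))  # should be n * theta
print(round(total, 12), round(mean, 12))  # 1.0 3.0
```

The mean n·θ checked here anticipates the moment formulas E[Sₙ] = nθ, V[Sₙ] = nθ(1-θ) that follow from summing the Bernoulli moments of A.12.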


More information

1 1.1 ( ). z = a + bi, a, b R 0 a, b 0 a 2 + b 2 0 z = a + bi = ( ) a 2 + b 2 a a 2 + b + b 2 a 2 + b i 2 r = a 2 + b 2 θ cos θ = a a 2 + b 2, sin θ =

1 1.1 ( ). z = a + bi, a, b R 0 a, b 0 a 2 + b 2 0 z = a + bi = ( ) a 2 + b 2 a a 2 + b + b 2 a 2 + b i 2 r = a 2 + b 2 θ cos θ = a a 2 + b 2, sin θ = 1 1.1 ( ). z = + bi,, b R 0, b 0 2 + b 2 0 z = + bi = ( ) 2 + b 2 2 + b + b 2 2 + b i 2 r = 2 + b 2 θ cos θ = 2 + b 2, sin θ = b 2 + b 2 2π z = r(cos θ + i sin θ) 1.2 (, ). 1. < 2. > 3. ±,, 1.3 ( ). A

More information

18 ( ) I II III A B C(100 ) 1, 2, 3, 5 I II A B (100 ) 1, 2, 3 I II A B (80 ) 6 8 I II III A B C(80 ) 1 n (1 + x) n (1) n C 1 + n C

18 ( ) I II III A B C(100 ) 1, 2, 3, 5 I II A B (100 ) 1, 2, 3 I II A B (80 ) 6 8 I II III A B C(80 ) 1 n (1 + x) n (1) n C 1 + n C 8 ( ) 8 5 4 I II III A B C( ),,, 5 I II A B ( ),, I II A B (8 ) 6 8 I II III A B C(8 ) n ( + x) n () n C + n C + + n C n = 7 n () 7 9 C : y = x x A(, 6) () A C () C P AP Q () () () 4 A(,, ) B(,, ) C(,,

More information

newmain.dvi

newmain.dvi 数論 サンプルページ この本の定価 判型などは, 以下の URL からご覧いただけます. http://www.morikita.co.jp/books/mid/008142 このサンプルページの内容は, 第 2 版 1 刷発行当時のものです. Daniel DUVERNEY: THÉORIE DES NOMBRES c Dunod, Paris, 1998, This book is published

More information

(note-02) Rademacher 1/57

(note-02) Rademacher 1/57 (note-02) Rademacher 1/57 (x 1, y 1 ),..., (x n, y n ) X Y f : X Y Y = R f Y = {+1, 1}, {1, 2,..., G} f x y 1. (x 1, y 1 ),..., (x n, y n ) f(x i ) ( ) 2. x f(x) Y 2/57 (x, y) f(x) f(x) y (, loss) l(f(x),

More information

25 7 18 1 1 1.1 v.s............................. 1 1.1.1.................................. 1 1.1.2................................. 1 1.1.3.................................. 3 1.2................... 3

More information

20 9 19 1 3 11 1 3 111 3 112 1 4 12 6 121 6 122 7 13 7 131 8 132 10 133 10 134 12 14 13 141 13 142 13 143 15 144 16 145 17 15 19 151 1 19 152 20 2 21 21 21 211 21 212 1 23 213 1 23 214 25 215 31 22 33

More information

医系の統計入門第 2 版 サンプルページ この本の定価 判型などは, 以下の URL からご覧いただけます. このサンプルページの内容は, 第 2 版 1 刷発行時のものです.

医系の統計入門第 2 版 サンプルページ この本の定価 判型などは, 以下の URL からご覧いただけます.   このサンプルページの内容は, 第 2 版 1 刷発行時のものです. 医系の統計入門第 2 版 サンプルページ この本の定価 判型などは, 以下の URL からご覧いただけます. http://www.morikita.co.jp/books/mid/009192 このサンプルページの内容は, 第 2 版 1 刷発行時のものです. i 2 t 1. 2. 3 2 3. 6 4. 7 5. n 2 ν 6. 2 7. 2003 ii 2 2013 10 iii 1987

More information

,. Black-Scholes u t t, x c u 0 t, x x u t t, x c u t, x x u t t, x + σ x u t, x + rx ut, x rux, t 0 x x,,.,. Step 3, 7,,, Step 6., Step 4,. Step 5,,.

,. Black-Scholes u t t, x c u 0 t, x x u t t, x c u t, x x u t t, x + σ x u t, x + rx ut, x rux, t 0 x x,,.,. Step 3, 7,,, Step 6., Step 4,. Step 5,,. 9 α ν β Ξ ξ Γ γ o δ Π π ε ρ ζ Σ σ η τ Θ θ Υ υ ι Φ φ κ χ Λ λ Ψ ψ µ Ω ω Def, Prop, Th, Lem, Note, Remark, Ex,, Proof, R, N, Q, C [a, b {x R : a x b} : a, b {x R : a < x < b} : [a, b {x R : a x < b} : a,

More information

i

i i 1 1 1.1..................................... 1 1.2........................................ 3 1.3.................................. 4 1.4..................................... 4 1.5......................................

More information

A

A A 2563 15 4 21 1 3 1.1................................................ 3 1.2............................................. 3 2 3 2.1......................................... 3 2.2............................................

More information

y i OLS [0, 1] OLS x i = (1, x 1,i,, x k,i ) β = (β 0, β 1,, β k ) G ( x i β) 1 G i 1 π i π i P {y i = 1 x i } = G (

y i OLS [0, 1] OLS x i = (1, x 1,i,, x k,i ) β = (β 0, β 1,, β k ) G ( x i β) 1 G i 1 π i π i P {y i = 1 x i } = G ( 7 2 2008 7 10 1 2 2 1.1 2............................................. 2 1.2 2.......................................... 2 1.3 2........................................ 3 1.4................................................

More information

2 1,, x = 1 a i f i = i i a i f i. media ( ): x 1, x 2,..., x,. mode ( ): x 1, x 2,..., x,., ( ). 2., : box plot ( ): x variace ( ): σ 2 = 1 (x k x) 2

2 1,, x = 1 a i f i = i i a i f i. media ( ): x 1, x 2,..., x,. mode ( ): x 1, x 2,..., x,., ( ). 2., : box plot ( ): x variace ( ): σ 2 = 1 (x k x) 2 1 1 Lambert Adolphe Jacques Quetelet (1796 1874) 1.1 1 1 (1 ) x 1, x 2,..., x ( ) x a 1 a i a m f f 1 f i f m 1.1 ( ( )) 155 160 160 165 165 170 170 175 175 180 180 185 x 157.5 162.5 167.5 172.5 177.5

More information

untitled

untitled WinLD R (16) WinLD https://www.biostat.wisc.edu/content/lan-demets-method-statistical-programs-clinical-trials WinLD.zip 2 2 1 α = 5% Type I error rate 1 5.0 % 2 9.8 % 3 14.3 % 5 22.6 % 10 40.1 % 3 Type

More information

i

i 009 I 1 8 5 i 0 1 0.1..................................... 1 0.................................................. 1 0.3................................. 0.4........................................... 3

More information

2011 ( ) ( ) ( ),,.,,.,, ,.. (. ), 1. ( ). ( ) ( ). : obata/,.,. ( )

2011 ( ) ( ) ( ),,.,,.,, ,.. (. ), 1. ( ). ( ) ( ). :   obata/,.,. ( ) 2011 () () (),,.,,.,,. 1. 2. 3. 4. 5. 6. 7. 8. 9. 10.,.. (. ), 1. ( ). ()(). : www.math.is.tohoku.ac.jp/ obata/,.,. () obata@math.is.tohoku.ac.jp http://www.dais.is.tohoku.ac.jp/ amf/, (! 22 10.6; 23 10.20;

More information

2000年度『数学展望 I』講義録

2000年度『数学展望 I』講義録 2000 I I IV I II 2000 I I IV I-IV. i ii 3.10 (http://www.math.nagoya-u.ac.jp/ kanai/) 2000 A....1 B....4 C....10 D....13 E....17 Brouwer A....21 B....26 C....33 D....39 E. Sperner...45 F....48 A....53

More information

20 6 4 1 4 1.1 1.................................... 4 1.1.1.................................... 4 1.1.2 1................................ 5 1.2................................... 7 1.2.1....................................

More information

AR(1) y t = φy t 1 + ɛ t, ɛ t N(0, σ 2 ) 1. Mean of y t given y t 1, y t 2, E(y t y t 1, y t 2, ) = φy t 1 2. Variance of y t given y t 1, y t

AR(1) y t = φy t 1 + ɛ t, ɛ t N(0, σ 2 ) 1. Mean of y t given y t 1, y t 2, E(y t y t 1, y t 2, ) = φy t 1 2. Variance of y t given y t 1, y t 87 6.1 AR(1) y t = φy t 1 + ɛ t, ɛ t N(0, σ 2 ) 1. Mean of y t given y t 1, y t 2, E(y t y t 1, y t 2, ) = φy t 1 2. Variance of y t given y t 1, y t 2, V(y t y t 1, y t 2, ) = σ 2 3. Thus, y t y t 1,

More information

JMP V4 による生存時間分析

JMP V4 による生存時間分析 V4 1 SAS 2000.11.18 4 ( ) (Survival Time) 1 (Event) Start of Study Start of Observation Died Died Died Lost End Time Censor Died Died Censor Died Time Start of Study End Start of Observation Censor

More information

x () g(x) = f(t) dt f(x), F (x) 3x () g(x) g (x) f(x), F (x) (3) h(x) = x 3x tf(t) dt.9 = {(x, y) ; x, y, x + y } f(x, y) = xy( x y). h (x) f(x), F (x

x () g(x) = f(t) dt f(x), F (x) 3x () g(x) g (x) f(x), F (x) (3) h(x) = x 3x tf(t) dt.9 = {(x, y) ; x, y, x + y } f(x, y) = xy( x y). h (x) f(x), F (x [ ] IC. f(x) = e x () f(x) f (x) () lim f(x) lim f(x) x + x (3) lim f(x) lim f(x) x + x (4) y = f(x) ( ) ( s46). < a < () a () lim a log xdx a log xdx ( ) n (3) lim log k log n n n k=.3 z = log(x + y ),

More information

kubostat2018d p.2 :? bod size x and fertilization f change seed number? : a statistical model for this example? i response variable seed number : { i

kubostat2018d p.2 :? bod size x and fertilization f change seed number? : a statistical model for this example? i response variable seed number : { i kubostat2018d p.1 I 2018 (d) model selection and kubo@ees.hokudai.ac.jp http://goo.gl/76c4i 2018 06 25 : 2018 06 21 17:45 1 2 3 4 :? AIC : deviance model selection misunderstanding kubostat2018d (http://goo.gl/76c4i)

More information

30

30 3 ............................................2 2...........................................2....................................2.2...................................2.3..............................

More information

ii

ii i 2013 5 143 5.1...................................... 143 5.2.................................. 144 5.3....................................... 148 5.4.................................. 153 5.5...................................

More information

X G P G (X) G BG [X, BG] S 2 2 2 S 2 2 S 2 = { (x 1, x 2, x 3 ) R 3 x 2 1 + x 2 2 + x 2 3 = 1 } R 3 S 2 S 2 v x S 2 x x v(x) T x S 2 T x S 2 S 2 x T x S 2 = { ξ R 3 x ξ } R 3 T x S 2 S 2 x x T x S 2

More information

講義のーと : データ解析のための統計モデリング. 第5回

講義のーと :  データ解析のための統計モデリング. 第5回 Title 講義のーと : データ解析のための統計モデリング Author(s) 久保, 拓弥 Issue Date 2008 Doc URL http://hdl.handle.net/2115/49477 Type learningobject Note この講義資料は, 著者のホームページ http://hosho.ees.hokudai.ac.jp/~kub ードできます Note(URL)http://hosho.ees.hokudai.ac.jp/~kubo/ce/EesLecture20

More information

( )

( ) 18 10 01 ( ) 1 2018 4 1.1 2018............................... 4 1.2 2018......................... 5 2 2017 7 2.1 2017............................... 7 2.2 2017......................... 8 3 2016 9 3.1 2016...............................

More information

熊本県数学問題正解

熊本県数学問題正解 00 y O x Typed by L A TEX ε ( ) (00 ) 5 4 4 ( ) http://www.ocn.ne.jp/ oboetene/plan/. ( ) (009 ) ( ).. http://www.ocn.ne.jp/ oboetene/plan/eng.html 8 i i..................................... ( )0... (

More information

講義のーと : データ解析のための統計モデリング. 第2回

講義のーと :  データ解析のための統計モデリング. 第2回 Title 講義のーと : データ解析のための統計モデリング Author(s) 久保, 拓弥 Issue Date 2008 Doc URL http://hdl.handle.net/2115/49477 Type learningobject Note この講義資料は, 著者のホームページ http://hosho.ees.hokudai.ac.jp/~kub ードできます Note(URL)http://hosho.ees.hokudai.ac.jp/~kubo/ce/EesLecture20

More information

2012 A, N, Z, Q, R, C

2012 A, N, Z, Q, R, C 2012 A, N, Z, Q, R, C 1 2009 9 2 2011 2 3 2012 9 1 2 2 5 3 11 4 16 5 22 6 25 7 29 8 32 1 1 1.1 3 1 1 1 1 1 1? 3 3 3 3 3 3 3 1 1, 1 1 + 1 1 1+1 2 2 1 2+1 3 2 N 1.2 N (i) 2 a b a 1 b a < b a b b a a b (ii)

More information

Z[i] Z[i] π 4,1 (x) π 4,3 (x) 1 x (x ) 2 log x π m,a (x) 1 x ϕ(m) log x 1.1 ( ). π(x) x (a, m) = 1 π m,a (x) x modm a 1 π m,a (x) 1 ϕ(m) π(x)

Z[i] Z[i] π 4,1 (x) π 4,3 (x) 1 x (x ) 2 log x π m,a (x) 1 x ϕ(m) log x 1.1 ( ). π(x) x (a, m) = 1 π m,a (x) x modm a 1 π m,a (x) 1 ϕ(m) π(x) 3 3 22 Z[i] Z[i] π 4, (x) π 4,3 (x) x (x ) 2 log x π m,a (x) x ϕ(m) log x. ( ). π(x) x (a, m) = π m,a (x) x modm a π m,a (x) ϕ(m) π(x) ϕ(m) x log x ϕ(m) m f(x) g(x) (x α) lim f(x)/g(x) = x α mod m (a,

More information

() n C + n C + n C + + n C n n (3) n C + n C + n C 4 + n C + n C 3 + n C 5 + (5) (6 ) n C + nc + 3 nc n nc n (7 ) n C + nc + 3 nc n nc n (

() n C + n C + n C + + n C n n (3) n C + n C + n C 4 + n C + n C 3 + n C 5 + (5) (6 ) n C + nc + 3 nc n nc n (7 ) n C + nc + 3 nc n nc n ( 3 n nc k+ k + 3 () n C r n C n r nc r C r + C r ( r n ) () n C + n C + n C + + n C n n (3) n C + n C + n C 4 + n C + n C 3 + n C 5 + (4) n C n n C + n C + n C + + n C n (5) k k n C k n C k (6) n C + nc

More information

() x + y + y + x dy dx = 0 () dy + xy = x dx y + x y ( 5) ( s55906) 0.7. (). 5 (). ( 6) ( s6590) 0.8 m n. 0.9 n n A. ( 6) ( s6590) f A (λ) = det(a λi)

() x + y + y + x dy dx = 0 () dy + xy = x dx y + x y ( 5) ( s55906) 0.7. (). 5 (). ( 6) ( s6590) 0.8 m n. 0.9 n n A. ( 6) ( s6590) f A (λ) = det(a λi) 0. A A = 4 IC () det A () A () x + y + z = x y z X Y Z = A x y z ( 5) ( s5590) 0. a + b + c b c () a a + b + c c a b a + b + c 0 a b c () a 0 c b b c 0 a c b a 0 0. A A = 7 5 4 5 0 ( 5) ( s5590) () A ()

More information

( ) 2002 1 1 1 1.1....................................... 1 1.1.1................................. 1 1.1.2................................. 1 1.1.3................... 3 1.1.4......................................

More information

( ) sin 1 x, cos 1 x, tan 1 x sin x, cos x, tan x, arcsin x, arccos x, arctan x. π 2 sin 1 x π 2, 0 cos 1 x π, π 2 < tan 1 x < π 2 1 (1) (

( ) sin 1 x, cos 1 x, tan 1 x sin x, cos x, tan x, arcsin x, arccos x, arctan x. π 2 sin 1 x π 2, 0 cos 1 x π, π 2 < tan 1 x < π 2 1 (1) ( 6 20 ( ) sin, cos, tan sin, cos, tan, arcsin, arccos, arctan. π 2 sin π 2, 0 cos π, π 2 < tan < π 2 () ( 2 2 lim 2 ( 2 ) ) 2 = 3 sin (2) lim 5 0 = 2 2 0 0 2 2 3 3 4 5 5 2 5 6 3 5 7 4 5 8 4 9 3 4 a 3 b

More information