1 Introduction

toyoizumi@waseda.jp
1 Introduction

Methods & Evaluation; Problem Requirements.

Text book: Business Statistics (Barron's Business Review Series), Douglas Downing and Jeff Clark, Barron's Educational Series Inc.; 4th edition (2003/09).

Figure 1.1: Business Statistics (Barron's Business Review Series) [1]
Example 1.1. Problem 1.1.
Definition 2.1 (average). Given n samples of data, the quantity below is called the average:
x̄ = (x_1 + x_2 + ... + x_n)/n = (1/n) Σ_{i=1}^n x_i.  (2.1)

Definition 2.2 (median). Given n samples of data, the middle value of the ordered samples is called the median.

Definition 2.3 (mode). Given n samples of data, the value which appears most often in the data is called the mode.

Problem 2.1 (average, median, mode). Problem 2.2.
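Definitions 2.1-2.3 can be checked with the Python standard library; the sample below is hypothetical.

```python
# Mean, median, and mode (Definitions 2.1-2.3) on a hypothetical sample,
# using Python's standard-library statistics module.
import statistics

data = [2, 3, 3, 5, 7, 10]          # hypothetical sample

mean = sum(data) / len(data)        # x-bar = (1/n) * sum of x_i
median = statistics.median(data)    # middle value of the ordered sample
mode = statistics.mode(data)        # most frequent value

print(mean, median, mode)           # 5.0 4.0 3
```

For an even number of samples, `statistics.median` averages the two middle values, which is the usual convention for Definition 2.2.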
Definition 2.4 (variance). Given n samples of data, the quantity below is called the variance of the data:
Var(x) = σ² = (1/n) Σ_{i=1}^n (x_i − x̄)²  (2.2)
            = mean(x²) − (x̄)²,  (2.3)
that is, the mean of the squares minus the square of the mean. The sample variance is also defined by
s² = Σ_{i=1}^n (x_i − x̄)² / (n − 1).  (2.4)

Problem 2.3. Derive (2.3) from (2.2) by expanding the square.

Definition 2.5 (standard deviation). Given n samples of data, the square root of the variance Var(x) is called the standard deviation:
σ = √Var(x)  (2.5)
  = √(mean(x²) − (x̄)²).  (2.6)

Example 2.1. Problem 2.4. (Excel)
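The two variance formulas (2.2) and (2.3), and the n − 1 sample variance (2.4), agree on any data set; a sketch on hypothetical numbers:

```python
# Variance three ways (Definitions 2.4-2.5) on hypothetical data:
# the population form (2.2), the shortcut (2.3), and the n-1 form (2.4).
import math

data = [2, 4, 4, 4, 5, 5, 7, 9]
n = len(data)
xbar = sum(data) / n

var_pop = sum((x - xbar) ** 2 for x in data) / n           # (2.2)
var_alt = sum(x * x for x in data) / n - xbar ** 2         # (2.3)
var_sample = sum((x - xbar) ** 2 for x in data) / (n - 1)  # (2.4)

sigma = math.sqrt(var_pop)                                 # standard deviation (2.5)
print(var_pop, var_alt, sigma)                             # 4.0 4.0 2.0
```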
Remark 3.1. Definition 3.1. Let Ω be the sample space and A, B events. Then
P{A or B} = P{A ∪ B}  (3.1)
          = P(A) + P(B) − P(A ∩ B),  (3.2)
P{A and B} = P{A ∩ B},  (3.3)
P{not A} = P{A^c} = 1 − P{A}.  (3.4)
Problem. The number of permutations of j items taken from n is
nPj = n!/(n − j)!,  (3.5)
and the number of combinations is
nCj = (n choose j) = n!/((n − j)! j!).  (3.6)
Problem.
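The factorial formulas (3.5) and (3.6) can be checked directly against Python's built-in counting functions (`math.perm` and `math.comb`, available since Python 3.8):

```python
# Permutations (3.5) and combinations (3.6) via factorials, checked
# against math.perm / math.comb on a small example.
import math

n, j = 5, 2
n_P_j = math.factorial(n) // math.factorial(n - j)                       # n!/(n-j)!
n_C_j = math.factorial(n) // (math.factorial(n - j) * math.factorial(j)) # n!/((n-j)! j!)

print(n_P_j, n_C_j)        # 20 10
assert n_P_j == math.perm(n, j) and n_C_j == math.comb(n, j)
```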
The probability of getting h heads in n tosses of a fair coin is
P{h heads in n tosses} = (n choose h) (1/2)^h (1/2)^{n−h}  (4.1)
                       = n!/(h!(n − h)!) · (1/2)^n.  (4.2)
Problem 4.1. Problem 4.2. (Null hypothesis, alternative hypothesis, false positive.)
Problem 4.3. Remark. (Significance levels, e.g. 10%.) Example 4.1. Null hypothesis: p = 1/2. Alternative hypothesis: p ≠ 1/2. Problem 4.4. (False positive, error.)
P{…} = …  (5.1)
(An MRI example: a test with 5% false positives.) This calls for the conditional probability P{A|B}. The example gives
P{…} = 1/20,  P{…} = …  (5.2)
(Modified from [3, p.207].)
The conditional probability is
P{A|B} = P{A ∩ B}/P{B}.  (5.3)
P{illness | positive test}.  (5.4)
Compare (5.2) and (5.4).

Problem 5.1 (False positives²). Answer the following:
1. Suppose there are illegal acts in one in … companies on average. You, as an accountant, audit companies. The auditing contains some uncertainty: there is a 1% chance that a normal company is declared to have some problem. Find the probability that a company declared to have a problem is actually illegal.
2. Suppose you are tested for a disease that strikes 1/1000 of the population. This test has 5% false positives, which means that even if you are not affected by this disease, you have a 5% chance of being diagnosed with it. A medical operation will cure the disease, but of course there is a chance of misoperation. Given that your result is positive, what can you say about your situation?

² Modified from [3, p.207].
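Part 2 of Problem 5.1 can be worked numerically with Bayes' rule. The problem does not state the test's sensitivity, so the sketch below assumes, hypothetically, that every true case is detected:

```python
# Problem 5.1(2) via Bayes' rule: prevalence 1/1000, 5% false positives.
# Sensitivity is not given in the problem, so we assume (hypothetically)
# the test detects every true case.
p_disease = 1 / 1000
p_pos_given_disease = 1.0       # assumed sensitivity
p_pos_given_healthy = 0.05      # 5% false-positive rate

p_positive = (p_pos_given_disease * p_disease
              + p_pos_given_healthy * (1 - p_disease))
p_disease_given_pos = p_pos_given_disease * p_disease / p_positive

print(round(p_disease_given_pos, 4))   # 0.0196
```

Even after a positive result, the chance of actually having the disease is below 2%, because healthy people vastly outnumber the sick.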
Definition 5.1. Given an event B with P{B} > 0, the conditional probability of A given B is
P{A|B} = P{A ∩ B}/P{B}.  (5.5)

Example 5.1. (A dice calculation using (5.5); the answer works out to 1/6.)

Problem 5.2. (Cards: A, K.) Problem 5.3. (A.)
Problem 5.4. Definition 5.2. Events A and B are independent if
P{A|B} = P{A},  (5.6)
or, equivalently (Theorem 5.1),
P{A ∩ B} = P{A}P{B}.  (5.7)
Problem 5.5. Prove Theorem 5.1.

5.4 Bayes

Theorem 5.2 (Bayes).
P{B|A} = P{A|B}P{B} / (P{A|B}P{B} + P{A|B^c}P{B^c}).  (5.8)
Problem 5.6. Prove Theorem 5.2.
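Formula (5.8) is easy to package as a function; the probabilities below are hypothetical inputs chosen only to exercise it.

```python
# Bayes' theorem (Theorem 5.2) as a function. The denominator is the
# total probability P{A} = P{A|B}P{B} + P{A|B^c}P{B^c}.
def bayes(p_a_given_b, p_b, p_a_given_bc):
    """Return P{B|A} from P{A|B}, P{B}, and P{A|B^c} via (5.8)."""
    p_a = p_a_given_b * p_b + p_a_given_bc * (1 - p_b)
    return p_a_given_b * p_b / p_a

# hypothetical numbers: P{A|B} = 0.9, P{B} = 0.3, P{A|B^c} = 0.2
posterior = bayes(0.9, 0.3, 0.2)
print(round(posterior, 4))    # 0.6585
```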
Definition 6.1. Example 6.1. (W: …)

6.2

Definition 6.2. The probability mass function of X is
f(a) = P{X = a}.  (6.1)
Problem 6.1.

Definition 6.3. The cumulative distribution function of X is
F(x) = P{X ≤ x}.  (6.2)

Theorem 6.1. The density f(x) satisfies
f(x) = dF(x)/dx = dP(X ≤ x)/dx,  (6.3)
lim_{x→∞} F(x) = 1,  (6.4)
lim_{x→−∞} F(x) = 0.  (6.5)
Definition 6.4 (expectation).
E[X] = ∫ x dP{X ≤ x}.  (6.6)

Remark 6.1. For a discrete X,
E[X] = ∫ x dP{X ≤ x} = Σ_a a P{X = a}.  (6.7)

6.4

Example 6.2. Let U and V be random variables with
U = 1/2 (a constant),  (6.8)
V = { 1 with probability 1/2, 0 with probability 1/2 }.  (6.9)
Problem 6.2. Show E[U] = E[V] = 1/2.

Definition 6.5 (variance).
Var[X] = E[(X − E[X])²].  (6.10)
Theorem 6.2.
Var[X] = E[X²] − (E[X])²,  (6.11)
Var[aX + b] = a² Var[X].  (6.12)
Remark 6.2.

Definition 6.6 (standard deviation).
σ = √Var[X].  (6.13)
Problem 6.3. Compute the variances in Example 6.2.

Theorem 6.3. If X and Y are independent, then
E[XY] = E[X]E[Y],  (6.14)
Var[X + Y] = Var[X] + Var[Y].  (6.15)

6.5

Definition 6.7 (Bernoulli random variable).
A = { 1 with probability p, 0 with probability 1 − p }.  (6.16)

Theorem 6.4.
E[A] = p,  (6.17)
Var[A] = p(1 − p).  (6.18)
Definition 6.8. Random variables
X_1, X_2, ..., X_n  (6.19)
with
P{X_i ≤ x} = F(x)  (6.20)
for all i are called i.i.d. (independent and identically distributed) copies of X.

Theorem 6.5. For i.i.d. samples
X_1, X_2, ..., X_n,  (6.21)
let
X̄ = (1/n) Σ_{i=1}^n X_i.  (6.22)
Then
E[X̄] = E[X],  (6.23)
Var[X̄] = Var[X]/n.  (6.24)
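Identity (6.24) can be verified exactly, without simulation, by enumerating all outcomes for i.i.d. fair-coin variables (each with variance 1/4):

```python
# Theorem 6.5 checked exactly by enumeration: for n i.i.d. fair-coin
# variables (values 0/1, Var = 1/4), Var[X-bar] should equal (1/4)/n.
from itertools import product

def var_of_sample_mean(n):
    outcomes = list(product([0, 1], repeat=n))   # all 2^n equally likely samples
    means = [sum(w) / n for w in outcomes]
    mu = sum(means) / len(means)
    return sum((m - mu) ** 2 for m in means) / len(means)

for n in (1, 2, 4, 8):
    assert abs(var_of_sample_mean(n) - 0.25 / n) < 1e-12
print("Var[X-bar] = Var[X]/n verified for n = 1, 2, 4, 8")
```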
7 Poisson

7.1

Definition 7.1 (geometric distribution). Let p be the success probability and X the number of trials until the first success.

Theorem 7.1.
P{X = i} = (1 − p)^{i−1} p,  (7.1)
E[X] = 1/p,  (7.2)
Var[X] = (1 − p)/p².  (7.3)

Proof. Conditioning on the first trial,
X = { 1 with probability p, 1 + X with probability 1 − p },  (7.4)
where the X on the right is an independent copy of X. Hence
E[X] = p · 1 + (1 − p)E[1 + X],  (7.5)
which gives E[X] = 1/p.
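Theorem 7.1 can be checked numerically by summing the pmf (7.1) directly, here with a hypothetical p = 0.2:

```python
# Geometric distribution (Theorem 7.1): partial sums of i*(1-p)^(i-1)*p
# converge to E[X] = 1/p = 5 and Var[X] = (1-p)/p^2 = 20 for p = 0.2.
p = 0.2
expectation = sum(i * (1 - p) ** (i - 1) * p for i in range(1, 2000))
variance = (sum(i * i * (1 - p) ** (i - 1) * p for i in range(1, 2000))
            - expectation ** 2)

assert abs(expectation - 1 / p) < 1e-6            # E[X] = 5
assert abs(variance - (1 - p) / p ** 2) < 1e-4    # Var[X] = 20
print(expectation, variance)
```

The truncation at 2000 terms is harmless because the tail (1 − p)^i decays geometrically.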
Problem 7.1. Using
E[X²] = p · 1 + (1 − p)E[(1 + X)²],  (7.6)
find Var[X]. Problem 7.2.

7.2

Definition 7.2 (binomial distribution). Let p be the success probability and X the number of successes in n trials. Then
P{X = i} = (n choose i) p^i (1 − p)^{n−i}.  (7.7)

Theorem 7.2. X is the sum of independent Bernoulli random variables A_i with P{A_i = 1} = p:
X = Σ_{i=1}^n A_i.  (7.8)

Theorem 7.3.
E[X] = np,  (7.9)
Var[X] = np(1 − p).  (7.10)
7.3

Problem 7.3. (Figure 7.1: P_n versus n.) Problem 7.4. (7% … 5% …)

7.4 Poisson

Definition 7.3 (Poisson distribution). Let N be the number of events occurring at average rate λ; N follows the Poisson distribution.
(Figures 7.2-7.4: P_n and risk versus n.)
P{N = n} = (λ^n / n!) e^{−λ}.  (7.11)

Theorem 7.4. For a Poisson random variable N,
E[N] = λ,  (7.12)
Var[N] = λ.  (7.13)

Theorem 7.5. The binomial distribution with large n and small p is approximated by the Poisson distribution with λ = np.

7.5 Poisson

Problem 7.5. (Figure 7.5: P_n versus n.) Problem 7.6.
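Theorem 7.5 can be illustrated by comparing the two pmfs pointwise for a large n and a small p (the values below are hypothetical):

```python
# Theorem 7.5: for large n and small p, Binomial(n, p) is close to
# Poisson(lambda = np). Compare the two pmfs at the first few points.
import math

n, p = 1000, 0.003
lam = n * p          # lambda = 3

def binom_pmf(i):
    return math.comb(n, i) * p ** i * (1 - p) ** (n - i)

def poisson_pmf(i):
    return lam ** i * math.exp(-lam) / math.factorial(i)

for i in range(8):
    assert abs(binom_pmf(i) - poisson_pmf(i)) < 1e-3
print("binomial and Poisson pmfs agree to within 0.001 for i = 0..7")
```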
Definition 7.4 (hypergeometric distribution). Suppose there are M items of type A and N − M of type B. Draw n items without replacement and let X be the number of type-A items drawn. Then
P{X = i} = (M choose i)(N − M choose n − i) / (N choose n).  (7.14)

Theorem 7.6.
E[X] = nM/N,  (7.15)
Var[X] = n (M/N)(1 − M/N)((N − n)/(N − 1)).  (7.16)
Definition 8.1 (uniform distribution). Let X be uniformly distributed on [a, b]. For a ≤ c ≤ d ≤ b,
P{c ≤ X ≤ d} = (d − c)/(b − a).  (8.1)

Remark 8.1. For a continuous X,
P{X = x} = 0.  (8.2)

8.2

Definition 8.2. Example 8.1. (H … H …)

Definition 8.3 (cumulative distribution function). The CDF of X is
F(x) = P{X ≤ x}.  (8.3)

Theorem 8.1.
P{X > a} = 1 − F(a),  (8.4)
P{b < X < c} = F(c) − F(b).  (8.5)
Problem 8.1. (Uniform on [a, b].)

Definition 8.4 (probability density function). Given F(x), the probability density function (pdf) is
f(x) = dF(x)/dx.  (8.6)

Problem 8.2. (Uniform on [a, b].)

Theorem 8.2.
P{a < X ≤ b} = ∫_a^b f(x) dx = F(b) − F(a).  (8.7)

Theorem 8.3.
E[X] = ∫ x f(x) dx,  (8.8)
E[X²] = ∫ x² f(x) dx,  (8.9)
Var[X] = E[X²] − E[X]².  (8.10)

Problem 8.3. (Uniform X on [a, b].)

8.3

Definition 8.5 (normal distribution). X with mean µ and variance σ² is normal if its density is
f(x) = (1/(√(2π)σ)) e^{−[(x−µ)/σ]²/2}.  (8.11)
We write X ∼ N(µ, σ²).

Definition 8.6 (standard normal distribution). Z with µ = 0 and σ² = 1 is standard normal, Z ∼ N(0, 1), with CDF
Φ(x) = P{Z ≤ x}.  (8.12)
Theorem 8.4. About 95% of the probability lies within two standard deviations:
P{µ − 2σ ≤ X ≤ µ + 2σ} ≈ 0.95.  (8.13)

Theorem 8.5. If Z is standard normal and Y has mean µ and standard deviation σ, then
Y = µ + σZ,  (8.14)
Z = (Y − µ)/σ.  (8.15)

Theorem 8.6.
P{X ≤ a} = Φ((a − µ)/σ).  (8.16)

Proof.
P{X ≤ a} = P{(X − µ)/σ ≤ (a − µ)/σ}
         = P{Z ≤ (a − µ)/σ}
         = Φ((a − µ)/σ).

Remark 8.2. (Excel can evaluate Φ.)

Example 8.2. (X: POS data; 2.5% …) By Theorem 8.5,
Z = (X − µ)/σ  (8.17)
follows N[0,1]. Consider
P{−2 ≤ Z ≤ 2},  (8.18)
and note that
P{Z ≤ −2} + P{−2 ≤ Z ≤ 2} + P{Z ≥ 2} = 1.  (8.19)
By symmetry P{Z ≤ −2} = P{Z ≥ 2}, so
P{Z ≥ 2} = (1 − P{−2 ≤ Z ≤ 2})/2.  (8.20)
Then
P{Z ≥ 2} = P{(X − µ)/σ ≥ 2} = P{X ≥ µ + 2σ} = P{X ≥ 440} ≈ 2.5%.

Problem 8.4. (Sony; use Excel.)

Theorem 8.7. If the X_i are independent normal with means µ_i and variances σ_i², then
X = Σ_{i=1}^n X_i ∼ N(Σ_{i=1}^n µ_i, Σ_{i=1}^n σ_i²).  (8.21)
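The standardization in Theorem 8.6 and the two-sigma rule of Theorem 8.4 can be computed with `math.erf`; the N(100, 15²) parameters below are hypothetical.

```python
# Theorem 8.6 numerically: P{X <= a} = Phi((a - mu)/sigma), with the
# standard normal CDF written via math.erf. Also checks the two-sigma
# rule of Theorem 8.4.
import math

def phi(x):
    """Standard normal CDF Phi(x) = (1 + erf(x/sqrt(2)))/2."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

mu, sigma = 100, 15                     # hypothetical N(100, 15^2)
a = 130
print(round(phi((a - mu) / sigma), 4))  # P{X <= 130} = Phi(2) = 0.9772

two_sigma = phi(2) - phi(-2)            # P{mu - 2s <= X <= mu + 2s}
assert abs(two_sigma - 0.9545) < 1e-3   # roughly 95%, as in (8.13)
```

The exact two-sigma probability is about 0.9545, which is why the text rounds it to 95%.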
Definition 8.7 (lognormal distribution). Y is lognormal if log(Y) is normal, i.e.
Y = e^X  (8.22)
for a normal X.

Theorem 8.8 (lognormal moments). If X ∼ N[µ, σ²] and Y = e^X, then
E[Y] = e^{µ+σ²/2},  (8.23)
Var[Y] = e^{2µ+2σ²} − e^{2µ+σ²}.  (8.24)

Problem 8.5. Compare E[Y] with e^µ.

8.5

Theorem 8.9 (central limit theorem). (See Figure 8.1.)

Remark 8.3.

8.6 χ²

Definition 8.8 (χ² distribution). For a standard normal Z, the χ² random variable with one degree of freedom is
χ = Z².  (8.25)
Figure 8.1: The detailed histogram of the sample average A = (1/n) Σ_{i=1}^n X_i when n = 10, where X_i is a Bernoulli random variable with E[X_i] = 1/2. The solid line is the corresponding normal distribution.

Theorem.
E[χ] = 1,  (8.26)
Var[χ] = 2.  (8.27)
Proof.
E[χ] = E[Z²] = 1.  (8.28)

Definition 8.9 (χ² distribution with n degrees of freedom). For independent standard normal Z_i,
χ_n = Σ_{i=1}^n Z_i².  (8.29)

Remark 8.4. The n variables Z_i must be independent.
Remark 8.5.

Theorem.
E[χ_n] = n,  (8.30)
Var[χ_n] = 2n.  (8.31)
8.7 t

Definition 8.10 (t distribution). For a standard normal Z and an independent χ² variable Y with m degrees of freedom,
T = Z/√(Y/m)  (8.32)
follows the t distribution with m degrees of freedom (Student's t).

Remark 8.6. (On the name "Student's t".)

8.8 F

Definition 8.11 (F distribution). For independent χ² variables X and Y with m and n degrees of freedom,
F = (X/m)/(Y/n)  (8.33)
follows the F distribution.
Definition 9.1 (joint distribution). For random variables X and Y,
F(x, y) = P{X ≤ x, Y ≤ y}.  (9.1)

Theorem 9.1.
f(x, y) = d²F(x, y)/dxdy,  (9.2)
E[XY] = ∫∫ xy f(x, y) dxdy.  (9.3)

Problem 9.1. (X, Y)

9.2

Definition 9.2 (marginal distribution).
P{X ≤ x} = F_X(x) = F(x, ∞),  (9.4)
f_X(x) = ∫_{y=−∞}^{∞} f(x, y) dy.  (9.5)

Problem 9.2. (X, Y)
Definition 9.3 (conditional distribution).
P{X ≤ x | Y = y},  (9.6)
with conditional density
f(x | Y = y) = f(x, y)/f_Y(y).  (9.7)

Problem 9.3. For X and Y above, find
f(x | Y = 2).  (9.8)

9.4

Theorem 9.2. X and Y are independent if
f(x | Y = y) = f_X(x),  (9.9)
equivalently
f(x, y) = f_X(x) f_Y(y),  (9.10)
and then
E[XY] = E[X]E[Y].  (9.11)
Definition 9.4 (covariance). For random variables X and Y,
Cov(X, Y) = E[(X − E[X])(Y − E[Y])]  (9.12)
          = E[XY] − E[X]E[Y].  (9.13)

Problem 9.4. Derive (9.13) for Cov(X, Y). Problem 9.5. (X, Y)

Definition 9.5 (correlation coefficient).
ρ(X, Y) = Cov(X, Y)/√(Var(X)Var(Y)).  (9.14)

Problem 9.6. Show that −1 ≤ ρ(X, Y) ≤ 1. Problem 9.7. (X, Y)

9.6

Theorem 9.3.
E[X + Y] = E[X] + E[Y],  (9.15)
Var[X + Y] = Var[X] + Var[Y] + 2Cov[X, Y].  (9.16)

Remark 9.1.

Problem 9.8. Prove
Var[X + Y] = Var[X] + Var[Y] + 2Cov[X, Y].  (9.17)
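Both covariance forms (9.12)-(9.13), the correlation (9.14), and the variance-of-a-sum identity (9.16) can be checked on a small hypothetical paired sample:

```python
# Covariance two ways, correlation, and Var[X+Y] (9.12)-(9.16) on
# hypothetical paired data, treating the data as the full population.
import math

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n   # (9.12)
cov_alt = sum(x * y for x, y in zip(xs, ys)) / n - mx * my   # (9.13)
var_x = sum((x - mx) ** 2 for x in xs) / n
var_y = sum((y - my) ** 2 for y in ys) / n
rho = cov / math.sqrt(var_x * var_y)                         # (9.14)
assert abs(cov - cov_alt) < 1e-12

# Var[X+Y] = Var[X] + Var[Y] + 2 Cov[X,Y]  (9.16)
sums = [x + y for x, y in zip(xs, ys)]
ms = sum(sums) / n
var_sum = sum((s - ms) ** 2 for s in sums) / n
assert abs(var_sum - (var_x + var_y + 2 * cov)) < 1e-12

print(cov, round(rho, 3))   # 1.2 0.775
```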
Example 9.1. Worldwide Fastburgers, Inc.: W with
E[W] = 1000,  (9.18)
Var[W] = 400.  (9.19)
HaveItYourWay Burgers, Inc.: H with
E[H] = 1000,  (9.20)
Var[H] = 400.  (9.21)
FunGoodTimes Pizza, Inc.: F with
E[F] = 1000,  (9.22)
Var[F] = 400.  (9.23)

Problem 9.9. Problem 9.10.

9.7

By Theorem 9.3, suppose
Cov(W, H) = 380,  (9.24)
Cov(W, F) = 200.  (9.25)
Problem. Combining with HaveItYourWay Burgers, Inc.:
Var[W + H] = Var[W] + Var[H] + 2Cov(W, H)  (9.26)
           = 400 + 400 + 2 × 380  (9.27)
           = 1,560.  (9.28)

Problem. (FunGoodTimes Pizza, Inc.) Problem 9.13.
To estimate the mean µ of X from samples X_1, X_2, ..., X_n, use
µ̂ = x̄ = (1/n) Σ_{i=1}^n X_i.  (10.1)

Definition 10.1.

Example 10.1. Given the data
…, 7, 4, 10, 12,  (10.2)
the estimate of µ = E[X] is
µ̂ = x̄ = (1/5)(…) = 7.  (10.3)

Remark 10.1. (µ̂)
Definition 10.2 (likelihood). The likelihood of µ is
P{X_1 = x_1, X_2 = x_2, ..., X_n = x_n | µ}.  (10.4)

Remark 10.2.

Example 10.2 (Bernoulli data). Consider the data
{1, 0, 0, 0, 1, 0, 1, 0, 0, 0}.  (10.5)
If p = 1/2,
P{X_1 = 1, X_2 = 0, ..., X_10 = 0} = 1/2^10,  (10.6)
and if p = 1/3,
P{X_1 = 1, X_2 = 0, ..., X_10 = 0} = 2^7/3^10.  (10.7)

Problem 10.1. For general p,
P{X_1 = 1, X_2 = 0, ..., X_10 = 0} = p³(1 − p)⁷.  (10.8)
Taking the log,
log P{X_1 = 1, X_2 = 0, ..., X_10 = 0} = 3 log p + 7 log(1 − p).  (10.9)
Setting the derivative with respect to p to zero,
3/p − 7/(1 − p) = 0,  (10.10)
which gives p = 3/10. In general, the maximum-likelihood estimate is
p̂ = (1/n) Σ_{i=1}^n x_i  (10.11)
for data (x_1, x_2, ..., x_n).

Example. The maximum-likelihood estimate of µ = E[X] is
µ̂ = x̄ = (1/n) Σ_{i=1}^n X_i.  (10.12)

Example. The maximum-likelihood estimate of σ² = Var[X] is
σ̂² = (1/n) Σ_{i=1}^n (X_i − x̄)².  (10.13)

Example 10.5 (binomial). As in Definition 7.2,
P{X = i} = (n choose i) p^i (1 − p)^{n−i},  (10.14)
and the likelihood (10.15) is maximized over p.
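The maximum of the likelihood (10.8) can be confirmed numerically: a grid search over p peaks exactly where the derivative condition (10.10) says it should.

```python
# Example 10.2 / Problem 10.1 numerically: the Bernoulli likelihood
# p^3 (1-p)^7 for 3 ones in 10 trials is maximized at p = 3/10,
# matching the solution of 3/p - 7/(1-p) = 0.
def likelihood(p):
    return p ** 3 * (1 - p) ** 7

# evaluate on a fine grid and take the argmax
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=likelihood)

assert abs(p_hat - 0.3) < 1e-9               # MLE is exactly 3/10
assert likelihood(0.3) > likelihood(0.5)     # beats the p = 1/2 guess
assert likelihood(0.3) > likelihood(1 / 3)   # and the p = 1/3 guess
print(p_hat)   # 0.3
```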
Definition (consistent estimator). Remark. Problem.

Example. x̄ is a consistent estimator of µ:
Var[x̄] = Var[(1/n) Σ_i X_i] = (1/n²) n Var[X] = Var[X]/n → 0 as n → ∞.

10.4

Definition. Problem. (x̄)

Example. Consider σ̂² when
E[X] = 0,  (10.16)
σ² = Var[X] = E[X²],  (10.17)
σ̂² = (1/n) Σ_{i=1}^n (X_i − x̄)².  (10.18)
E[σ̂²] = (1/n) E[Σ_{i=1}^n (X_i − x̄)²]  (10.19)
      = (1/n) Σ_{i=1}^n E[X_i² − 2x̄X_i + x̄²]  (10.20)
      = (n − 1)σ²/n,  (10.21)
using
E[X_i²] = σ²,  (10.22)
E[x̄X_i] = σ²/n,  (10.23)
E[x̄²] = σ²/n,  (10.24)
so that
E[σ̂²] = (n − 1)σ²/n.  (10.25)
Thus σ̂² is biased; the unbiased sample variance is
ŝ² = (1/(n − 1)) Σ_{i=1}^n (X_i − x̄)²,  (10.26)
with
E[ŝ²] = σ².  (10.27)

Example.
Example. To estimate µ from (X_1, X_2), one may use
µ̂_2 = (X_1 + X_2)/2,  (10.28)
or, from 1000 samples,
µ̂_1000 = (1/n) Σ_i X_i.  (10.29)
Problem. Compare µ̂_2 and µ̂_1000. Problem 10.5.
(Based on Chapter 10.)

Problem 11.1.
… = 1.  (11.1)
Problem 11.2.
… = 1.  (11.2)

11.1

From samples X_1, X_2, ..., X_n with µ = E[X], form the sample mean
x̄ = (1/n) Σ_{i=1}^n X_i.  (11.3)
Theorem 11.1. x̄ is approximately normal.
Proof. By Theorem 8.7.

Given x̄, consider an interval [x̄ − c, x̄ + c] that contains µ with 95% probability, and find c.

Definition 11.1 (confidence interval). For samples X_1, X_2, ..., X_n with mean µ, the interval [x̄ − c, x̄ + c] is a confidence interval with confidence level CL if
P{x̄ − c < µ < x̄ + c} = CL.  (11.4)

Theorem 11.2 (95% confidence interval). For samples X_1, X_2, ..., X_n, the constant c with
P{x̄ − c < µ < x̄ + c} = 0.95  (11.5)
is
c = 1.96σ/√n,  (11.7)
and the 95% confidence interval is
[x̄ − c, x̄ + c] = [x̄ − 1.96σ/√n, x̄ + 1.96σ/√n].  (11.8)
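Theorem 11.2 can be applied directly. The data below are the sample that appears later in the notes (Example 13.3), where the variance σ² = 16.16 is treated as known:

```python
# Theorem 11.2 on a sample with known sigma: the 95% confidence
# interval for mu is x-bar +/- 1.96*sigma/sqrt(n).
import math

data = [9, 11, 6, 10, 7, 4, 0, 7, 8, 6, 8, 2, 18]
sigma = math.sqrt(16.16)        # variance assumed known, as in the notes
n = len(data)
xbar = sum(data) / n

c = 1.96 * sigma / math.sqrt(n)
lo, hi = xbar - c, xbar + c
print(round(xbar, 2), round(lo, 2), round(hi, 2))   # 7.38 5.2 9.57
```

The interval comfortably contains 7, consistent with the hypothesis test on the same data later in the notes.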
Lemma 11.1. The statistic
Z = (x̄ − µ)/√(σ²/n) = √n (x̄ − µ)/σ  (11.9)
follows N[0,1].
Proof. x̄ has mean µ and variance σ²/n; apply Theorem 8.5.

Proof of Theorem 11.2. By Lemma 11.1,
P{x̄ − c < µ < x̄ + c} = P{−c < x̄ − µ < c}
                      = P{−c√n/σ < √n(x̄ − µ)/σ < c√n/σ}
                      = P{−c√n/σ < Z < c√n/σ}.
Since
P{−1.96 < Z < 1.96} = 0.95,  (11.10)
we get
c = 1.96σ/√n.  (11.11)
Problem 11.3.

11.2 t

Section 11.1 used
c = 1.96σ/√n,  (11.12)
which requires the variance σ² of X to be known. When σ² is unknown:
1. Estimate σ² by σ̂².
2. Use σ̂² to find c.
Problem. Problem. (Cf. Lemma 11.1.)

Lemma 11.2. Let
ŝ² = (1/(n − 1)) Σ_{i=1}^n (X_i − x̄)².  (11.13)
Then
T = √n (x̄ − µ)/ŝ  (11.14)
follows the t distribution of Section 8.7.

Proof. As in Section 8.7, for a standard normal Z and a χ² variable Y with m degrees of freedom,
T = Z/√(Y/m).  (11.15)
Rewrite (11.14) as
T = [√n (x̄ − µ)/σ] / (ŝ/σ)  (11.16)
  = [√n (x̄ − µ)/σ] / √(Σ_{i=1}^n (X_i − x̄)² / [(n − 1)σ²]).  (11.17)
By Lemma 11.1, the numerator is standard normal, and
Σ_{i=1}^n (X_i − x̄)²/σ²  (11.18)
follows the χ² distribution with n − 1 degrees of freedom (see [2], pp. 84-87), which proves Lemma 11.2.

Theorem 11.3 (t confidence interval). For samples X_1, X_2, ..., X_n of size n, the constant c with
P{x̄ − c < µ < x̄ + c} = 0.95  (11.19)
is
c = a σ̂/√n,  (11.21)
where a is taken from the t distribution so that
P{−a < T < a} = 0.95.  (11.22)
(For example, n = 7 gives a = … .) The 95% confidence interval is
[x̄ − c, x̄ + c] = [x̄ − a σ̂/√n, x̄ + a σ̂/√n].  (11.23)

Proof.
P{x̄ − c < µ < x̄ + c} = P{−c < x̄ − µ < c} = P{−c√n/σ̂ < T < c√n/σ̂}.
Evaluate
P{−c√n/σ̂ < T < c√n/σ̂}  (11.24)
with the t distribution.

Remark 11.1. (On the t distribution.)
(Table: comparison of the normal and t distributions for various n; from [1].)
52 Problem Example 12.1 ( ) :
Problem. In a population of N individuals, M of whom have the attribute, the proportion is
p = M/N.  (12.1)
From a sample of size n in which X have the attribute, estimate p by
p̂ = X/n.  (12.2)
Problem. Is p̂ a good estimator of p?

Lemma 12.1. For large n, X follows the binomial distribution B(n, p), approximated by the normal distribution with mean np and variance np(1 − p).
Proof. (By the normal approximation to the binomial.)

Theorem 12.1. For large n,
p̂ = X/n  (12.3)
approximately follows N(p, p(1 − p)/n).

Proof. By Lemma 12.1, X ∼ N(np, np(1 − p)) approximately. Since
p̂ = X/n,  (12.4)
E[p̂] = E[X/n] = np/n = p,  (12.5)
Var[p̂] = Var[X/n] = (1/n²) np(1 − p) = p(1 − p)/n.  (12.6)

Remark. p̂ is unbiased:
E[p̂] = p.  (12.7)
As n grows,
Var[p̂] = p(1 − p)/n → 0,  (12.8)
so p̂ is consistent.
Theorem. For large n, the 95% confidence interval of p is
[p̂ − 1.96√(p̂(1 − p̂)/n), p̂ + 1.96√(p̂(1 − p̂)/n)].  (12.9)

Proof. By Theorem 12.1, the variance of p̂ is estimated by
σ̂² = Var[p̂] = p̂(1 − p̂)/n.  (12.10)
Apply Theorem 11.3.

Remark. (See [1], p. 259.)

Example. With n = 2018 and p̂ = …, the 95% confidence interval is
[p̂ − 1.96√(p̂(1 − p̂)/n), p̂ + 1.96√(p̂(1 − p̂)/n)] = [… , …].  (12.11)

12.3

Example. (The Literary Digest poll and the Roosevelt election.) Problem. (The Literary Digest poll.) Example 12.4. (Table 12.2.)
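The interval (12.9) can be evaluated for the sample size n = 2018 used in the example. The observed p̂ is missing from the text, so 0.5 is used as a hypothetical stand-in (it also gives the widest possible interval):

```python
# Proportion interval (12.9) with the notes' n = 2018. The observed
# p-hat is not stated, so p_hat = 0.5 is a hypothetical stand-in that
# maximizes p(1-p) and hence the interval width.
import math

n = 2018
p_hat = 0.5                                   # hypothetical observed proportion
half_width = 1.96 * math.sqrt(p_hat * (1 - p_hat) / n)
lo, hi = p_hat - half_width, p_hat + half_width

print(round(half_width, 4), round(lo, 4), round(hi, 4))  # 0.0218 0.4782 0.5218
```

A poll of about 2000 people therefore carries a margin of error of roughly plus or minus 2 percentage points, whatever the observed proportion.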
Problem 12.5. Problem 12.6.
Problem. (YES or NO.)
Recall Section 4.2: null hypothesis, alternative hypothesis, errors (false positive, false negative), significance level (e.g. 5%).
Example. Problem. (… 20% … 5% …)

13.2

Definition 13.1. Example. Let X be the number of heads in n = 100 tosses of a coin with p = 1/2.
Suppose X = 90 with n = 100 and p = 1/2. Problem. (See the example above.)

13.3

Definition. For samples X_1, X_2, ..., X_n with µ = E[X], the null hypothesis about µ is
H_0: µ = µ_0.  (13.1)

Example 13.3 (test with known variance). Samples from N[µ, σ²] with σ² = 16.16:
Consider the data
x = {9, 11, 6, 10, 7, 4, 0, 7, 8, 6, 8, 2, 18},  (13.2)
with
x̄ = 7.38.  (13.3)
Problem. Given x̄ = 7.38, test the hypothesis µ = 7:
H_0: µ = 7.  (13.4)
Under H_0, X_1, X_2, ..., X_13 follow N[7, 16.16], and by Lemma 11.1, x̄ ∼ N[7, 16.16/n]. The statistic
Z = √n (x̄ − µ)/σ = √13 (7.38 − 7)/√16.16 = 0.341
follows N[0,1]. Since
P{−1.96 < Z < 1.96} = 0.95,  (13.5)
and Z = 0.341 lies inside this range, the hypothesis µ = 7 is not rejected at the 5% level.
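Example 13.3 can be reproduced in a few lines:

```python
# Example 13.3 recomputed: test H0: mu = 7 with sigma^2 = 16.16 known,
# using Z = sqrt(n)*(x-bar - mu)/sigma from Lemma 11.1.
import math

data = [9, 11, 6, 10, 7, 4, 0, 7, 8, 6, 8, 2, 18]
n = len(data)
xbar = sum(data) / n
z = math.sqrt(n) * (xbar - 7) / math.sqrt(16.16)

print(round(xbar, 2), round(z, 3))   # 7.38 0.345
assert abs(z) < 1.96                 # inside the 95% region: do not reject H0
```

The unrounded statistic is about 0.345, slightly above the 0.341 in the notes, which rounded x̄ to 7.38 before computing Z; the conclusion is the same either way.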
Problem. Remark 13.1 (on the 95% / 5% convention).

13.4

Example 13.4. Consider the data
{7, 16, 19, 12, 15, 9, 6, 16, 14, 7, 2, 15, 23, 15, 12, 18, 9}.  (13.6)
Problem.
In Section 13.3 the null hypothesis was
H_0: µ = µ_0.  (13.7)
A one-sided test instead uses
H_0: µ < µ_0.  (13.8)
Definition.

Example 13.5 (test with unknown variance). With σ unknown, use the statistic of Section 11.2,
T = √n (x̄ − µ)/σ̂ = … = 1.26,
which follows the t distribution with n − 1 degrees of freedom. Since
P{T < 1.75} = 0.95  (13.9)
for the t distribution with n − 1 degrees of freedom, and T = 1.26 < 1.75, the hypothesis is not rejected at the 5% level.

Problem. Test with the data
{3, 2, 7, 5, 4, 8, 7, 8}.  (13.10)
Problem. (… %) Problem 13.8.
[1] Douglas Downing and Jeffrey Clark. Business Statistics. Barron's Educational Series, Inc., 2003.
[2] Shingo Shirahata. Toukei Kaiseki Nyumon. Kyouritu.
[3] Nassim Nicholas Taleb. Fooled by Randomness: The Hidden Role of Chance in the Markets and in Life. Random House Trade Paperbacks, 2005.
More informationi
009 I 1 8 5 i 0 1 0.1..................................... 1 0.................................................. 1 0.3................................. 0.4........................................... 3
More informationZ[i] Z[i] π 4,1 (x) π 4,3 (x) 1 x (x ) 2 log x π m,a (x) 1 x ϕ(m) log x 1.1 ( ). π(x) x (a, m) = 1 π m,a (x) x modm a 1 π m,a (x) 1 ϕ(m) π(x)
3 3 22 Z[i] Z[i] π 4, (x) π 4,3 (x) x (x ) 2 log x π m,a (x) x ϕ(m) log x. ( ). π(x) x (a, m) = π m,a (x) x modm a π m,a (x) ϕ(m) π(x) ϕ(m) x log x ϕ(m) m f(x) g(x) (x α) lim f(x)/g(x) = x α mod m (a,
More information151021slide.dvi
: Mac I 1 ( 5 Windows (Mac Excel : Excel 2007 9 10 1 4 http://asakura.co.jp/ books/isbn/978-4-254-12172-8/ (1 1 9 1/29 (,,... (,,,... (,,, (3 3/29 (, (F7, Ctrl + i, (Shift +, Shift + Ctrl (, a i (, Enter,
More information外国語科 ( 英語 Ⅱ) 学習指導案 A TOUR OF THE BRAIN ( 高等学校第 2 学年 ) 神奈川県立総合教育センター 平成 20 年度研究指定校共同研究事業 ( 高等学校 ) 授業改善の組織的な取組に向けて 平成 21 年 3 月 平成 20 年度研究指定校である光陵高等学校において 授業改善に向けた組織的な取組として授業実践を行った学習指導案です 生徒主体の活動を多く取り入れ 生徒の学習活動に変化をもたせるとともに
More informationi
i 1 1 1.1..................................... 1 1.2........................................ 3 1.3.................................. 4 1.4..................................... 4 1.5......................................
More informationk2 ( :35 ) ( k2) (GLM) web web 1 :
2012 11 01 k2 (2012-10-26 16:35 ) 1 6 2 (2012 11 01 k2) (GLM) kubo@ees.hokudai.ac.jp web http://goo.gl/wijx2 web http://goo.gl/ufq2 1 : 2 2 4 3 7 4 9 5 : 11 5.1................... 13 6 14 6.1......................
More informationseminar0220a.dvi
1 Hi-Stat 2 16 2 20 16:30-18:00 2 2 217 1 COE 4 COE RA E-MAIL: ged0104@srv.cc.hit-u.ac.jp 2004 2 25 S-PLUS S-PLUS S-PLUS S-code 2 [8] [8] [8] 1 2 ARFIMA(p, d, q) FI(d) φ(l)(1 L) d x t = θ(l)ε t ({ε t }
More information* n x 11,, x 1n N(µ 1, σ 2 ) x 21,, x 2n N(µ 2, σ 2 ) H 0 µ 1 = µ 2 (= µ ) H 1 µ 1 µ 2 H 0, H 1 *2 σ 2 σ 2 0, σ 2 1 *1 *2 H 0 H
1 1 1.1 *1 1. 1.3.1 n x 11,, x 1n Nµ 1, σ x 1,, x n Nµ, σ H 0 µ 1 = µ = µ H 1 µ 1 µ H 0, H 1 * σ σ 0, σ 1 *1 * H 0 H 0, H 1 H 1 1 H 0 µ, σ 0 H 1 µ 1, µ, σ 1 L 0 µ, σ x L 1 µ 1, µ, σ x x H 0 L 0 µ, σ 0
More information〈論文〉興行データベースから「古典芸能」の定義を考える
Abstract The long performance database of rakugo and kabuki was totaled, and it is found that few programs are repeated in both genres both have the frequency differential of performance. It is a question
More information現代日本論演習/比較現代日本論研究演習I「統計分析の基礎」
URL: http://tsigeto.info/statg/ I () 3 2016 2 ( 7F) 1 : (1); (2) 1998 (70 20% 6 9 ) (30%) ( 2) ( 2) 2 1. (4/14) 2. SPSS (4/21) 3. (4/28) [] 4. (5/126/2) [1, 4] 5. (6/9) 6. (6/166/30) [2, 5] 7. (7/78/4)
More information1 1 [1] ( 2,625 [2] ( 2, ( ) /
[] (,65 [] (,3 ( ) 67 84 76 7 8 6 7 65 68 7 75 73 68 7 73 7 7 59 67 68 65 75 56 6 58 /=45 /=45 6 65 63 3 4 3/=36 4/=8 66 7 68 7 7/=38 /=5 7 75 73 8 9 8/=364 9/=864 76 8 78 /=45 /=99 8 85 83 /=9 /= ( )
More informationADM-Hamiltonian Cheeger-Gromov 3. Penrose
ADM-Hamiltonian 1. 2. Cheeger-Gromov 3. Penrose 0. ADM-Hamiltonian (M 4, h) Einstein-Hilbert M 4 R h hdx L h = R h h δl h = 0 (Ric h ) αβ 1 2 R hg αβ = 0 (Σ 3, g ij ) (M 4, h ij ) g ij, k ij Σ π ij = g(k
More information基礎数学I
I & II ii ii........... 22................. 25 12............... 28.................. 28.................... 31............. 32.................. 34 3 1 9.................... 1....................... 1............
More information現代日本論演習/比較現代日本論研究演習I「統計分析の基礎」
URL: http://tsigeto.info/statg/ I ( ) 3 2017 2 ( 7F) 1 : (1) ; (2) 1998 (70 20% 6 8 ) (30%) ( 2) ( 2) 2 1. (4/13) 2. SPSS (4/20) 3. (4/27) [ ] 4. (5/11 6/1) [1, 4 ] 5. (6/8) 6. (6/15 6/29) [2, 5 ] 7. (7/6
More informationOutline ( ) / 10
1 2014 ( ) 1 2014 1 / 10 Outline 1 1.1 ( ) 1 2014 2 / 10 Outline 1 1.1 2 1.2 ( ) 1 2014 2 / 10 Outline 1 1.1 2 1.2 3 1.3 ( ) 1 2014 2 / 10 Outline 1 1.1 2 1.2 3 1.3 4 1.4 ( ) 1 2014 2 / 10 Outline 1 1.1
More information.3. (x, x = (, u = = 4 (, x x = 4 x, x 0 x = 0 x = 4 x.4. ( z + z = 8 z, z 0 (z, z = (0, 8, (,, (8, 0 3 (0, 8, (,, (8, 0 z = z 4 z (g f(x = g(
06 5.. ( y = x x y 5 y 5 = (x y = x + ( y = x + y = x y.. ( Y = C + I = 50 + 0.5Y + 50 r r = 00 0.5Y ( L = M Y r = 00 r = 0.5Y 50 (3 00 0.5Y = 0.5Y 50 Y = 50, r = 5 .3. (x, x = (, u = = 4 (, x x = 4 x,
More informationClustering in Time and Periodicity of Strong Earthquakes in Tokyo Masami OKADA Kobe Marine Observatory (Received on March 30, 1977) The clustering in time and periodicity of earthquake occurrence are investigated
More informationsolutionJIS.dvi
May 0, 006 6 morimune@econ.kyoto-u.ac.jp /9/005 (7 0/5/006 1 1.1 (a) (b) (c) c + c + + c = nc (x 1 x)+(x x)+ +(x n x) =(x 1 + x + + x n ) nx = nx nx =0 c(x 1 x)+c(x x)+ + c(x n x) =c (x i x) =0 y i (x
More informationy i OLS [0, 1] OLS x i = (1, x 1,i,, x k,i ) β = (β 0, β 1,, β k ) G ( x i β) 1 G i 1 π i π i P {y i = 1 x i } = G (
7 2 2008 7 10 1 2 2 1.1 2............................................. 2 1.2 2.......................................... 2 1.3 2........................................ 3 1.4................................................
More informationII (Percolation) ( 3-4 ) 1. [ ],,,,,,,. 2. [ ],.. 3. [ ],. 4. [ ] [ ] G. Grimmett Percolation Springer-Verlag New-York [ ] 3
II (Percolation) 12 9 27 ( 3-4 ) 1 [ ] 2 [ ] 3 [ ] 4 [ ] 1992 5 [ ] G Grimmett Percolation Springer-Verlag New-York 1989 6 [ ] 3 1 3 p H 2 3 2 FKG BK Russo 2 p H = p T (=: p c ) 3 2 Kesten p c =1/2 ( )
More information149 (Newell [5]) Newell [5], [1], [1], [11] Li,Ryu, and Song [2], [11] Li,Ryu, and Song [2], [1] 1) 2) ( ) ( ) 3) T : 2 a : 3 a 1 :
Transactions of the Operations Research Society of Japan Vol. 58, 215, pp. 148 165 c ( 215 1 2 ; 215 9 3 ) 1) 2) :,,,,, 1. [9] 3 12 Darroch,Newell, and Morris [1] Mcneil [3] Miller [4] Newell [5, 6], [1]
More information2
2011 8 6 2011 5 7 [1] 1 2 i ii iii i 3 [2] 4 5 ii 6 7 iii 8 [3] 9 10 11 cf. Abstracts in English In terms of democracy, the patience and the kindness Tohoku people have shown will be dealt with as an exception.
More information(2 X Poisso P (λ ϕ X (t = E[e itx ] = k= itk λk e k! e λ = (e it λ k e λ = e eitλ e λ = e λ(eit 1. k! k= 6.7 X N(, 1 ϕ X (t = e 1 2 t2 : Cauchy ϕ X (t
6 6.1 6.1 (1 Z ( X = e Z, Y = Im Z ( Z = X + iy, i = 1 (2 Z E[ e Z ] < E[ Im Z ] < Z E[Z] = E[e Z] + ie[im Z] 6.2 Z E[Z] E[ Z ] : E[ Z ] < e Z Z, Im Z Z E[Z] α = E[Z], Z = Z Z 1 {Z } E[Z] = α = α [ α ]
More information10 2000 11 11 48 ( ) ( ) ( ) ( ) ( ) ( ) ( ) ( ) ( ) CU-SeeMe NetMeeting Phoenix mini SeeMe Integrated Services Digital Network 64kbps 16kbps 128kbps 384kbps
More information<4D F736F F D B B83578B6594BB2D834A836F815B82D082C88C60202E646F63>
確率的手法による構造安全性の解析 サンプルページ この本の定価 判型などは, 以下の URL からご覧いただけます. http://www.morikita.co.jp/books/mid/55271 このサンプルページの内容は, 初版 1 刷発行当時のものです. i 25 7 ii Benjamin &Cornell Ang & Tang Schuëller 1973 1974 Ang Mathematica
More informationきずなプロジェクト-表紙.indd
P6 P7 P12 P13 P20 P28 P76 P78 P80 P81 P88 P98 P138 P139 P140 P142 P144 P146 P148 #1 SHORT-TERM INVITATION GROUPS 2012 6 10 6 23 2012 7 17 14 2012 7 17 14 2012 7 8 7 21 2012 7 8 7 21 2012 8 7 8 18
More information2 1,, x = 1 a i f i = i i a i f i. media ( ): x 1, x 2,..., x,. mode ( ): x 1, x 2,..., x,., ( ). 2., : box plot ( ): x variace ( ): σ 2 = 1 (x k x) 2
1 1 Lambert Adolphe Jacques Quetelet (1796 1874) 1.1 1 1 (1 ) x 1, x 2,..., x ( ) x a 1 a i a m f f 1 f i f m 1.1 ( ( )) 155 160 160 165 165 170 170 175 175 180 180 185 x 157.5 162.5 167.5 172.5 177.5
More informationJOURNAL OF THE JAPANESE ASSOCIATION FOR PETROLEUM TECHNOLOGY VOL. 66, NO. 6 (Nov., 2001) (Received August 10, 2001; accepted November 9, 2001) Alterna
JOURNAL OF THE JAPANESE ASSOCIATION FOR PETROLEUM TECHNOLOGY VOL. 66, NO. 6 (Nov., 2001) (Received August 10, 2001; accepted November 9, 2001) Alternative approach using the Monte Carlo simulation to evaluate
More information