I


1 Ver

2 I (MCAR, MAR, MNAR) Little and Rubin (2002) MCAR, MAR, MNAR NRC(2010) MCAR MAR MNAR MCAR MAR MNAR MNAR MCAR MAR MNAR MAR

3 Estimand Estimand Estimand Estimand Efficacy effectiveness Mallinckrodt (2013) 6 estimand Estimand Estimand Estimand Estimand 4, Estimand Mallinckrodt et al. (2014) 3 estimand estimand Complete case LOCF BOCF Complete case LOCF BOCF Estimand LOCF BOCF Selection Model SM SM SM estimand MAR SM MAR SM 1 SM MAR SM 2 MMRM MNAR (outcome-dependent dropout) SM SM MNAR SM MNAR SM

4 Pattern-Mixture Model PMM CCMV NCMV ACMV PMM CCMV NCMV ACMV CCMV NCMV ACMV CCMV NCMV ACMV PMM (CCMV, NCMV, ACMV) NFMV PMM (NFMV) NFMV PMM SM PMM ( %PATTERNMIXTURE) Multiple Imuptation MI Multiple Imuptation Multiple Imuptation Rubin Multiple Imuptation Multiple Imputation Marginal treatment effect ( ) PMM pattern imputation Controlled imputation pattern imputation SAS pattern imputation Delta adjustment pattern imputation AE pattern imputation Shared Parameter Model SPM SPM %shared_parameter

5 Inverse Probability Weighted Complete-Case (IPWCC) Estimator µ IPWCC IPWCC estimators for Estimating Equations IPWCC weighted Generalized Estimating Equation (wgee) Doubly Robust (DR) II MMRM 98 9 MMRM MMRM SAS PROC MIXED CLASS MODEL LSMEANS REPEATED RANDOM MMRM MMRM Type (i) Type (ii) SM PMM SPM Estimand

6 11 Estimand Estimand Estimand Estimand Estimand Estimand Estimand Estimand Analytic Road Map

7 Ver Appendix A C Appendix 1, Appendix 2) Appendix A C (Ver1.0 Appendix) Appendix 1 Appendix Appendix 1 Appendix Appendix 1 Appendix 2 Web

8 I

9 1 1.1 III LOCF (Last Observation Carried Forward) LOCF ANCOVA (Tanaka et al., 2014) EMA "Guideline on Missing Data in Confirmatory Clinical Trials" National Research Council "The prevention and treatment of missing data in clinical trials" NRC (2010) EMA FDA JPMA, 2014b JPMA, 2014a; 2014b NRC (2010) estimand estimand estimand NRC (2010) estimand ICH E9 (R1) Final Concept Paper (ICH Steering Committee, 2014) PMDA FDA Lisa LaVange Web, 2015; LaVange, PMDA estimand

10 1. I estimand (1 8 ) 2. II LOCF ANCOVA MMRM (Mixed Model for Repeated Measures) (9 12 ) 3. (Appendix) (LMM) (Appendix A C) LOCF ANCOVA LOCF ANCOVA MMRM Estimand (Linear Model, LM) ANCOVA Linear Mixed Model, LMM LMM Appendix A NRC (2010) JPMA (2014a) Hughes et al. (2012) (2015) Missing At Random "Missing Not At Random (MNAR)" "Not Missing At Random (NMAR)" sensitivity analysis "MNAR" 1 1 MMRM 9 "Mixed" MMRM 1 9

11 1.3.4 (SM) SeM

1.1:
ANCOVA  ANalysis of COVAriance
AR(1)  AutoRegressive(1)
CS  Compound Symmetry
iid  independent and identically distributed
LM  Linear Model
LMM  Linear Mixed Model
LOCF  Last Observation Carried Forward
MAR  Missing At Random
MI  Multiple Imputation
MMRM  Mixed Model for Repeated Measures
MNAR  Missing Not At Random
PMM  Pattern-Mixture Model
pmi  placebo Multiple Imputation
REML  REstricted Maximum Likelihood
SM  Selection Model
SPM  Shared Parameter Model

i j j j j = 1, 2, j = 1 1 j = 2 2 j = 3 4 i = 1, ..., N; j = 1, ..., n 1 n i j M ij 1 0 R ij 10

12 i M i, R i 2 i +1 D i n D i = R ij + 1 j=1 D i n n = SAS SAS Proc MI pmi NRC (2010) NAS NRC MNAR Missingdata.org.uk ( SAS ( ( ) Appendix C [1]. (2015). PMDA. NAS M i1 0 R i1 M i = M i2 M i3 = 0 0, R i = R i2 R i3 = M i4 1 R i

13 EMA estimand ( [2] European Medicines Agency (2010). Guideline on missing data in confirmatory clinical trials. ( [3] Hughes, S., Harris, J., Flack, N., and Cuffe, R. L. (2012). The statistician s role in the prevention of missing data. Pharmaceutical statistics, 11(5), [4] ICH Steering Committee (2014). Final Concept Paper E9(R1): Addendum to Statistical Principles for Clinical Trials dated 22 October [5] JPMA (2014a). - NAS EMA estimand -.. ( [6] JPMA (2014b). - NAS EMA estimand -.. ( [7] LaVange, L. (2015). Missing Data Issues in Regulatory Clinical Trials. NAS EMA estimand ( [8] (2015). 2. NAS EMA estimand ( [9] National Research Council. (2010). The prevention and treatment of missing data in clinical trials. National Academy Press. [10] (2015).. NAS EMA estimand ( [11] Tanaka, S., Fukinbara, S., Tsuchiya, S., Suganami, H., & Ito, Y. M. (2014). Current Practice in Japan for the Prevention and Treatment of Missing Data in Confirmatory Clinical Trials A Survey of Japanese and Foreign Pharmaceutical Manufacturers. Therapeutic Innovation & Regulatory Science, 48(6),

14 NRC (2010) 1 (p8) (missing data) QOL (missing outcome) R ij i j M ij 0 1 R ij

15 i (i = 1, ..., N) j (j = 1, ..., n) Y_ij i j Y_i i Y_i^o i Y_i^m i R_i i (Y_i^o, R_i) 4 3 i

Y_i = (Y_i1, Y_i2, Y_i3, Y_i4)',  Y_i^o = (Y_i1, Y_i2, Y_i3)',  Y_i^m = Y_i4,  R_i = (1, 1, 1, 0)'

(Y_i, R_i) (full data 4 ) (Y_i^o, R_i) (observed data)

4 Y_i4 R_i4 Y_i4 0 R_i Little and Rubin (2002) R_i 0, 1 R_i = (1, 1, 1, 0) R_i = (1, 0, 0, 1) "complete data"

14

16 (dropout 5 ) 1 (monotone missing) R_i = (1, 1, 1, 0), (1, 1, 0, 0), (1, 0, 0, 0) R_ij (non-monotone missing) R_i = (1, 1, 0, 1), (1, 0, 1, 0) R_ij estimand

(MCAR, MAR, MNAR)

(Y_i, R_i)

f(Y_i, R_i | θ, ψ) = f(Y_i | θ) f(R_i | Y_i, ψ)

f(R_i | Y_i, ψ) R_i Y_i θ ψ Little and Rubin (2002) MCAR, MAR, MNAR MCAR (Missing Completely At Random) MAR (Missing At Random) MNAR (Missing Not At Random) 3 Little and Rubin (2002) 6

(MCAR): f(R_i | Y_i, ψ) = f(R_i | ψ)
(MAR): f(R_i | Y_i, ψ) = f(R_i | Y_i^o, ψ)   (2.1)
(MNAR): f(R_i | Y_i, ψ) ≠ f(R_i | Y_i^o, ψ)

MAR MCAR MNAR MAR 7 6 Pattern-Mixture Model MNAR MAR 2

5 withdrawal attrition
6 Little and Rubin (2002) MNAR NMAR (Not Missing At Random) R_i M_i
7 (2.1) MNAR MAR

15

17 Little (2008) X i b i (MCAR) : f(r i X i, Y i, b i, ψ) = f(r i X i, ψ) (MAR) : f(r i X i, Y i, b i, ψ) = f(r i X i, Y o i, ψ) (2.2) (MNAR) : f(r i X i, Y i, b i, ψ) f(r i X i, Y o i, ψ) NRC(2010) MCAR MAR MNAR NRC (2010) 2 X 8 V MCAR f(r i X i, V i, Y i ) = f(r i ) (2.3) 9 MAR f(r i X i, V i, Y i ) = f(r i X i, Vi o, Yi o ) Vi o V i MNAR MAR MCAR MAR MNAR (2.1) 10 (MCAR) : f(r i Y i, ψ) = f(r i ψ) (MAR) : f(r i Y i, ψ) = f(r i Yi o, ψ) (MNAR) : f(r i Y i, ψ) f(r i Yi o, ψ) 3 Y i = (Y i1, Y i2, Y i3 ) R i = (R i1, R i2, R i3 ) MCAR MAR MNAR Appendix B 8 9 (2.2) MCAR (2.2) (2.3) 10 Seaman et al. (2013) everywhere MCAR everywhere MAR 16

18 MCAR MCAR 1(j = 1) Y i1 1 Y i 0.1 f(r i1 = 0 Y i ) = 0 (2.4) f(r i2 = 0 Y i, R i1 = 0) = 1 (2.5) f(r i2 = 0 Y i, R i1 = 1) = 0.1 f(r i3 = 0 Y i, R i1 = 0, R i2 = 0) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 0) = 1 (2.6) f(r i3 = 0 Y i, R i1 = 1, R i2 = 1) = 0.1 i j R i1 = = R ij = 0 R i,j+1 = f(r i3 = 0 Y i, R i1 = 0, R i2 = 1) = 1 f(r i3 = 1 Y i, R i1 = 0, R i2 = 1) = MAR MAR j Y ij (j + 1) Y i,j+1 Y ij Y i,j 1 Y i,j+1 f(r i1 = 0 Y i ) = 0 (2.7) f(r i2 = 0 Y i, R i1 = 0) = 1 logitf(r i2 = 0 Y i, R i1 = 1) = Y i1 (2.8) f(r i3 = 0 Y i, R i1 = 0, R i2 = 0) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 0) = 1 (2.9) logitf(r i3 = 0 Y i, R i1 = 1, R i2 = 1) = Y i2 ( ) f(x) logitf(x) = log exp (logitf(x)) = f(x) 1 f(x) 1 f(x) f(x) = exp(logitf(x)) 1 + exp(logitf(x)) (2.8) (2.9) 1 logitf(r i2 = 0 Y i, R i1 = 1) = Y i1 f(r i2 = 0 Y i1, R i1 = 1) = exp( Y i1) 1 + exp ( Y i1 ) 17

19 logitf(r i3 = 0 Y i, R i1 = 1, R i2 = 1) = Y i2 f(r i3 = 0 Y i1, R i1 = 1, R i2 = 1) = exp( Y i2) 1 + exp ( Y i2 ) f(r i3 = 0 Y i, R i1 = 0, R i2 = 1) = 1 f(r i3 = 1 Y i, R i1 = 0, R i2 = 1) = 0 MAR MNAR 1 MNAR 2 1 f(r i1 = 0 Y i ) = 0 (2.10) f(r i2 = 0 Y i, R i1 = 0) = 1 (2.11) logitf(r i2 = 0 Y i, R i1 = 1) = Y i Y i2 f(r i3 = 0 Y i, R i1 = 0, R i2 = 0) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 0) = 1 (2.12) logitf(r i3 = 0 Y i, R i1 = 1, R i2 = 1) = Y i Y i3 f(r i1 = 0 Y i ) = 0 f(r i2 = 0 Y i, R i1 = 0) = 1 f(r i2 = 0 Y i, R i1 = 1) = exp( Y i Y i2 ) 1 + exp( Y i Y i2 ) f(r i3 = 0 Y i, R i1 = 0, R i2 = 0) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 0) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 1) = exp( Y i Y i3 ) 1 + exp( Y i Y i3 ) f(r i3 = 0 Y i, R i1 = 0, R i2 = 1) = 1 f(r i3 = 1 Y i, R i1 = 0, R i2 = 1) = 0 MNAR MNAR 2 MNAR 2 MAR 11 f(r i1 = 0 Y i ) =

20 logitf(r i2 = 0 Y i, R i1 = 0) = Y i1 logitf(r i2 = 0 Y i, R i1 = 1) = Y i1 logitf(r i3 = 0 Y i, R i1 = 0, R i2 = 0) = Y i2 logitf(r i3 = 0 Y i, R i1 = 1, R i2 = 0) logitf(r i3 = 0 Y i, R i1 = 0, R i2 = 1) logitf(r i3 = 0 Y i, R i1 = 1, R i2 = 1) = Y i2 = Y i2 = Y i2 f(r i1 = 0 Y i ) = 0 logitf(r i2 = 0 R i1 Y i ) = Y i1 logitf(r i3 = 0 R i1, R i2, Y i ) = Y i2 f(r i1 = 0 Y i ) = 0 f(r i2 = 0 R i1, Y i ) = f(r i3 = 0 R i1, R i2, Y i ) = exp( Y i1 ) 1 + exp( Y i1 ) exp( Y i2 ) 1 + exp( Y i2 ) MNAR MCAR MAR MNAR MCAR MAR MNAR 1 MCAR MAR 12 MNAR MCAR MAR MNAR 13 MCAR MCAR 1 MAR MCAR MAR 1 MNAR MNAR 2 MAR MNAR Y m (Y o, X, R) MAR MNAR 12 MAR MAR 13 19

21 MAR MNAR MAR MAR MAR (Mallinckrodt et al., 2008) MAR MNAR MNAR Verbeke and Molenberghs(2000) Local Influence MNAR

22 2 7 全データ ( 欠測データ含む ) 応答変数 時点 2.1: MCAR 観測データ ( M C A R ) 応答変数 時点 2.2: MCAR 2.3, 完了例 ( M C A R ) 2 7 中止例 ( M C A R ) 応答変数 応答変数 時点 時点 2.3: MCAR 2.4: MCAR MCAR MCAR

23 MCAR MCAR MCAR 2.1 MAR 観測データ ( M A R ) 応答変数 時点 2.5: MAR 完了例 ( M A R ) 2 7 中止例 ( M A R ) 応答変数 応答変数 時点 時点 2.6: MAR 2.7: MAR 22

24 MNAR 観測データ ( M N A R ) 応答変数 時点 2.8: MNAR 完了例 ( M N A R ) 2 7 中止例 ( M N A R ) 応答変数 応答変数 時点 時点 2.9: MNAR 2.10: MNAR MAR MNAR MAR MAR MAR MAR 16 MAR MNAR 1 MAR MNAR MAR 23

25 2.4.1 Molenberghs and Kenward (2007) Y i = (Yi o, Ym i ) 17 R i R i Y i 18 R i Y i Y i R i (full-data 19 )(Y i, R i ) (Y i, R i ) (full data likelihood) N N L (θ, ψ Y i ) f(y i, R i X i, θ, ψ) = f(yi o, Yi m, R i X i, θ, ψ) i=1 i=1 Yi m Yi m θ, ψ Y i m 20 observed data likelihood) L(θ, ψ Yi o ) L (θ, ψ Y i )dyi m i=1 N i=1 f(y i, R i X i, θ, ψ)dy m i N = f(yi o, Yi m, R i X i, θ, ψ)dyi m 21 (Yi o, R i) θ, ψ Little and Rubin (2002) 2 (missing-data mechanism is ignorable) (i) MAR (ii) θ ψ θ ψ (θ, ψ) Ω θ,ψ θ Ω θ ψ Ω ψ θ ψ f(y i, R i X i, θ, ψ) = f(y i X i, θ) f(r i X i, ψ) 17 Y i Y i = ((Yi o), (Yi m) ) 18 MCAR MCAR 19 "complete-data" 20 (Yi o, R i) 21 Little and Rubin (2002) "full likelihood" 24

26 θ ψ f(y i X i, θ) = ( 1 (2πσ2 ) exp 1 ) 3 2σ 2 (Y i µ) (Y i µ) (µ = (µ 1, µ 2, µ 3 ) ) θ = (µ 1, µ 2, µ 3, σ 2 ) f(r i, Y i, ψ) MAR f(r i1 = 0 Y i, ψ) = 0 f(r i2 = 0 Y i, R i1 = 0, ψ) = 1 f(r i2 = 0 Y i, R i1 = 1, ψ) = exp(ψ 1 + ψ 2 Y i1 ) 1 + exp (ψ 1 + ψ 2 Y i1 ) f(r i3 = 0 Y i, R i1 = 0, R i2 = 0, ψ) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 0, ψ) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 1, ψ) = exp(ψ 1 + ψ 2 Y i2 ) 1 + exp (ψ 1 + ψ 2 Y i2 ) ψ = (ψ 1, ψ 2 ) θ ψ 22 Y ij Y ij 23 θ = (µ 1, µ 2, µ 3, σ 2 ) ψ f(r i Y i, ψ) f(r i1 = 0 Y i, ψ) = 0 f(r i2 = 0 Y i, R i1 = 0, ψ) = 1 ( ) exp ψ 1 + ψ 2 Yi1 µ 1 f(r i2 = 0 Y i, R i1 = 1, ψ) σ = ( ) 1 + exp ψ 1 + ψ 2 Yi1 µ 1 σ 22 µ 1 Ω θ = µ 2 µ 3 R4 µ 1, µ 2, µ 3 R, σ 2 > 0 = R 3 R >0 σ 2 { ( ) } ψ1 Ω ψ = R 2 ψ 1, ψ 2 R = R 2 ψ 2 µ 1 µ 2 µ 3 Ω (θ,ψ) = σ 2 R 4 µ 1, µ 2, µ 3 R, σ 2 > 0, ψ 1, ψ 2 R = (R 3 R >0 ) R 2 ψ 1 ψ 2 Ω (θ,ψ) = Ω θ Ω ψ 23 25

27 f(r i3 = 0 Y i, R i1 = 0, R i2 = 0, ψ) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 0, ψ) = 1 ( ) exp ψ 1 + ψ 2 Yi2 µ 2 f(r i3 = 0 Y i, R i1 = 1, R i2 = 1, ψ) σ = ( ) 1 + exp ψ 1 + ψ 2 Yi2 µ 2 σ ψ = (ψ 1, ψ 2, µ 1, µ 2, σ) ψ θ = (µ 1, µ 2, µ 3, σ 2 ) (µ 1, µ 2 ) θ (σ 2 ) σ θ ψ 24 θ f(r i Y i ) f(r i Y i ) θ f(r i Y i ) Selection Model f(yi o, Ym i θ) Yi m L(θ Yi o ) = f(yi o, Yi m θ)dyi m θ L(θ Yi o) L(θ, ψ Yo i ) R i R i L(θ, ψ Yi o) L(θ Yi o) 5 Little and Rubin (2002) (2002) 2.5 ( PGB) PGB (PLB ) Ω θ = R 3 R >0, Ω ψ = R 4 R >0 ( θ ψ µ 1 µ 2 µ 3 ) σ 2 = ψ 1 ψ 2 µ 1 µ 2 σ Ω (θ, = ψ) x 1 x 2 x 3 x 4 x 5 x 6 x 7 x 8 x 9 R 9 x 1 = x 7, x 2 = x 8, x 4 = x 2 9, x 4, x 9 > 0 Ω (θ, ψ) Ω θ Ω ψ 26

28 2.1: Visit PLB (N = 67) PGB (N = 69) SD SD BL PLB PGB 2.11: 12 PGB 46 (66.7%) PLB 34 (50.7%) 2.2 PGB PLB ( LOCF ) PLB [95%CI] (95%CI: [ 2.150, 0.916]) ( 2.3) 27

29 2.2: PLB PGB (55.2%) 49 (70.0%) (ITT) 67 (100.0%) 69 (98.6%) 30 (44.8%) 21 (30.0%) 9 (13.4%) 15 (21.4%) 20 (29.9%) 5 (7.1%) 1 (1.5%) 1 (1.4%) 2.3: (ITT) [PGB PLB ] a) PLB PGB 95%CI (SE) p N (SD) (0.146) (1.253) N (SE) (0.235) (0.231) [ 2.150, 0.916] < (0.312) a) a) MAR MAR MNAR [1]. (2002)... [2] Little, R. J. (2008). Selection and Pattern-Mixture modes. Cphaiter 18 in Advances in longitudinal data analaysis

30 [3] Little, R. J. A. and Rubin, D. B. (2002). Statistical analysis with missing data. 2nd edition. John Wiley & Sons. New York. [4] Mallinckrodt, C. H., Lane, P. W., Schnell, D., Peng, Y. and Maucuso, J. P. (2008). Recommendations for the primary analysis of continuous endpoints in longitudinal clinical trials. Drug Information Journal, 42, [5] Molenberghs, G., and Kenward, M. (2007). Missing Data in Clinical Studies. Wiley. [6] National Research Council. (2010). The prevention and treatment of missing data in clinical trials. National Academy Press. [7]. ( [8] Seaman, S., Galati, J., Jackson, D., and Carlin, J. (2013). What Is Meant by Missing at Random?. Statistical Science, 28(2), [9] 1991)... [10] Verbeke, G, and Molenberghs, G. (2000). Linear mixed models for longitudinal data. Springer. 29

31 3 Estimand 3.1 NRC (2010) 2 Mallinckrodt (2013) 3 11 Mallinckrodt et al. (2014) ICH Steering Committee (2014) estimand ICH E9 (R1) Estimand Estimand NRC (2010) 2 Recommendation 1 (p26) (a) (b) (c) (d) (the measures of intervention effects) causal estimand Mallicnkrodt (2013) 3 (estimand ) estimand estimand estimand Estimand Mallinckrodt (2013) NRC (2010) estimand NRC (2010) 2 (p22) estimand Mallinckrodt (2013) 3 "What is being estimated" and/or rescue medication ICH E9 (R1) final concept paper (ICH Steering Committee, 2014) "Statement of the Perceived Problem" (p1) (property) rescue medication Estimand Estimand ICH E9 (R1) final concept paper "Issues to be Resolved" (p2) Estimand estimand 30

32 estimand NRC (2010) 2 estimand NRC (2010) 2 Mallinckrodt (2013) 3 estimand estimand 3.3 Efficacy effectiveness estimand 2 Mallinckrodt (2013) 3 efficacy effectiveness 2 Efficacy per-protocol estimand (PPS estimand) "de jure estimand" effectiveness intention-to-treat estimand (ITT estimand) "de facto estimand" rescue medication Mallinckrodt (2013) estimand 1 estimand 6 Mallinckrodt et al. (2013) POC study efficacy effectiveness O Neill and Temple (2012) ITT outcome study 1 studies of symptom Mallinckrodt (2013) 6 estimand Mallinckrodt (2013) (Major Depressive Disorder, MDD) 6 estimand 6 5 NRC (2010) NRC (2010) 2 Mallinckrodt (2013) Estimand 1 Estimand 1 1 OS PFS 2 31

33 estimand effectiveness rescue medication Mallinckrodt et al. (2012) estimand 1 estimand Estimand 2 Estimand 2 estimand run-in phase randomized withdrawal design efficacy NRC (2010) wash-out (2 p24) run-in phase (2 p33) (2 p33) Estimand 3 Estimand 3 estimand efficacy estimand estimand NRC (2010) estimand (available and effective mechanism for ensuring adherence) (2 p25) (rarely used) (2 p24) Mallinckrodt (2013) (14 ) estimand estimand estimand estimand 6 Mallinckrodt et al. (2012) estimand 3 Hypothetical estimand effectiveness Estimand 4, 5 Estimand 4 Area Under the Curve (AUC) Estimand 5 32

34 Estimand 4, 5 effectiveness estimand 4 estimand 5 estimand NRC (2010) 2 responder rate non-responder estimand 5 (p32) Mallinckrodt (2013) 3 estimand (p20) Estimand 6 Estimand 6 estimand effectiveness estimand 1 estimand 6 Estimand 6 rescue medication rescue medication placebo Multiple Imputaion (pmi) 3 BOCF NRC (2010) Reccomendataion 10 (p77) (single imputation) 4 Estimand 3 estimand 3 estimand Mallinckrodt et al. (2014) 3 estimand Mallinckrodt et al. (2014) 3 estimand 6 3 Estimand A Estimand 1 Estimand B Estimand 3 Estimand C Estimand estimand Leuchs et al. (2015) estimand effectiveness efficacy 3 pmi 6 33

35 3.1: Mallinckrodt (2013) MDD 6 estimand Estimand rescue medication 1 effectiveness 2 efficacy 3 efficacy 4 effectiveness 5 effectiveness 6 effectiveness 3.7 estimand Lehmann and Casella (1998) estimand NRC (2010) Rubin (1996) "scientific estimand" NRC (2010) ICH Steering Committee (2014) Mallinckrodt et al. (2012) Mallinckrodt (2013) Mallinckrodt et al. (2014) FDA O Neill and Temple (2012) Soon (2009) PMDA FDA CDER (Center for Drug Evaluation and Research) Lisa LaVange (, 2015; LaVange, 2015) Leuchs et al. (2015) BfArM (Federal Institute for Drugs and Medical Devices) Holzhauer et al. (2015) rescue medication estimand [1]. (2015). PMDA. NAS EMA estimand ( 34

36 [2] Holzhauer, B., Akacha, M., & Bermann, G. (2015). Choice of estimand and analysis methods in diabetes trials with rescue medication. Pharmaceutical statistics. [3] ICH Steering Committee (2014). Final Concept Paper E9(R1): Addendum to Statistical Principles for Clinical Trials dated 22 October [4] LaVange, L. (2015). Missing Data Issues in Regulatory Clinical Trials. NAS EMA estimand ( [5] Lehmann, E. L., & Casella, G. (1998). Theory of point estimation. Springer Science & Business Media. [6] Leuchs, A. K., Zinserling, J., Brandt, A., Wirtz, D., & Benda, N. (2015). Choosing Appropriate Estimands in Clinical Trials. Therapeutic Innovation & Regulatory Science, [7] Mallinckrodt, C. H. (2013). Preventing and treating missing data in longitudinal clinical trials. Cambridge University Press. [8] Mallinckrodt, C. H., Lin, Q., Lipkovich, I., & Molenberghs, G. (2012). A structured approach to choosing estimands and estimators in longitudinal clinical trials. Pharmaceutical statistics, 11(6), [9] Mallinckrodt, C., Roger, J., Chuang-Stein, C., Molenberghs, G., Lane, P. W., O Kelly, M., Ratitch, B., Xu, L., Gilbert, S., Mehrotra, D. V., Wolfinger, R., & Thijs, H. (2013). Missing data: turning guidance into action. Statistics in Biopharmaceutical Research, 5(4), [10] Mallinckrodt, C. H, Roger, J., Chuang-Stein, G., Molenberghs, G., O Kelly, M., Ratitch, B., Janssens, M., and Bunouf, P. (2014). Recent Developments in the Prevention and Treatment of Missing Data. Therapeutic Innovation & Regulatory Science [11] National Research Council. (2010). The prevention and treatment of missing data in clinical trials. National Academy Press. [12] O Neill, R. T., & Temple, R. (2012). The prevention and treatment of missing data in clinical trials: an FDA perspective on the importance of dealing with it. Clinical Pharmacology & Therapeutics, 91(3), [13] Soon, G. G. (2009). Minimizing missing data in clinical trials: Design, operation, and regulatory considerations. Drug Information Journal, 43(4), [14] Rubin, D. B. (1996). Multiple imputation after 18+ years. Journal of the American statistical Association, 91(434),

37 4 Complete case LOCF BOCF 4.1 Complete case LOCF BOCF 3 1. ( ) complete case 2. LOCF (Last Observation Carried Forward) BOCF (Baseline Observation Carried Forward) 3. (GEE ) 9 MMRM (Mixed-effect Models for Repeated Measures) complete case NRC (2010) EMA (2010) Kenward and Molenberghs (2009) LOCF BOCF 4.2 Complete case complete case Complete case MCAR 2 complete case complete case 36

38 4.3 complete case complete case LOCF BOCF NRC (2010) single imputation method ( ) NRC (2010, p77) LOCF BOCF LOCF LOCF NRC (2010, p65-66) LOCF LOCF LOCF complete case ( ) LOCF Molenberghs et al. (2004) LOCF Kenward and Molenberghs (2009) ( ) LOCF LOCF ( ) LOCF LOCF 12 NRC (2010, p65-66) LOCF MCAR MAR LOCF MNAR n n MAR i P (Y n X, Y o, R = 1) = P (Y n X, Y o, R = 0) (4.1) 37

39 R n R = 0 R = 1 (4.1) X Y o = (Y 1,..., Y n 1 ) Y n LOCF Y n 1 Y n 1 P (Y n = Y n 1 X, Y o ) = 1 P (Y n X, Y o, R = 0) P (Y n X, Y o, R = 1) LOCF MCAR MAR NRC (2010) Kenward and Molenberghs (2009) EMA (2010) ( ) LOCF BOCF LOCF BOCF 0 BOCF BOCF EMA (2010) LOCF BOCF BOCF LOCF BOCF LOCF BOCF Estimand LOCF BOCF LOCF BOCF estimand NRC (2010) Ayele et al. (2014) LOCF BOCF efficacy effectiveness estimand LOCF efficacy BOCF effectiveness Ayele et al. (2014) estimand 3 LOCF estimand 6 BOCF estimand LOCF BOCF efficacy effectiveness complete case LOCF BOCF 38
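As a concrete illustration of the LOCF single imputation discussed above, here is a minimal SAS sketch; the dataset LONG and the variables subjid, visit and y are assumptions for the example, and the data are assumed to be sorted by subjid and visit.

data locf;
  set long;
  by subjid visit;
  retain y_locf;
  if first.subjid then y_locf = .;       /* reset the carried value for each subject */
  if not missing(y) then y_locf = y;     /* carry the last observed value forward */
run;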

40 Multiple Imputation (MI) MI (M ) LOCF BOCF MAR M M M 1 Pattern Mixture Model (PMM) LOCF BOCF 2 1 MAR MCAR 2 ( ) NRC (2010) EMA (2010) MI MI 6 [1] Ayele, B. T., Lipkovich, I., Molenberghs, G., and Mallinckrodt, C. H. (2014). A Multiple-Imputation-Based Approach to Sensitivity Analyses and Effectiveness Assessments in Longitudinal Clinical Trials. Journal of biopharmaceutical statistics, 24(2), [2] European Medicines Agency. Committee for Medicinal Products for Human Use (CHMP). Guideline on missing data in confirmatory clinical trials. EMA/CPMP/EWP/1776/99 Rev Published July 2, [3] Kenward, M. G., and Molenberghs, G. (2009). Last observation carried forward: a crystal ball?. Journal of biopharmaceutical statistics, 19(5), [4] Molenberghs, G., Thijs, H., Jansen, I., Beunckens, C., Kenward, M. G., Mallinckrodt, C., and Carroll, R. J. (2004). Analyzing incomplete longitudinal clinical trial data. Biostatistics, 5(3), [5] National Research Council. The Prevention and Treatment of Missing Data in Clinical Trials. Panel on Handling Missing Data in Clinical Trials. Committee on National Statistics, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press;

41 5 Selection Model 5.1 Selection Model (SM) SM 1 MMRM MAR SM 1 MAR SM MMRM MMRM MMRM MAR SM MMRM MNAR SM 5.2 SM i (i = 1,, N) j (j = 1,, n) Y ij i j Y i i Yi o i Yi m i R i i (full data 1 ) (Y i, R i ) (observed data) (Y o i, R i) SM SM (Little and Rubin, 2002) f(y i, R i θ, ψ) = f(y i θ) f(r i Y i, ψ) Y i f(y i θ) R i R i Y i θ ψ θ ψ 2 1 "complete data"

42 X i b i θ ) θ = ( θ1 b i X i Y i f(y i X i, b i, θ 1 ) X i b i f(b i X i, θ 2 ) f(y i X i, θ) = f(y i, X i, b i, θ 1 ) f(b i X i, θ 2 )db i SM f(r i, Y i, b i X i, θ, ψ) = f(y i X i, b i, θ 1 ) f(r i X i, Y i, b i, ψ) f(b i X i, θ 2 ) (5.1) 2 3 θ SM estimand SM estimand efficacy 3 Mallinckrodt 6 estimand 2, MAR SM SM MAR MNAR MAR MAR SM 1 SM MAR θ R i f(r i X i, Y i, b i, ψ) = f(r i X i, Yi o, ψ) 3 ( θ1 θ = θ 2 41 )

43 ( f(y i, R i, b i X i, θ, ψ) = f(y i X i, b i, θ 1 ) f(r i X i, Y o i, ψ) f(b i X i, θ 2 ) (5.2) f(yi o, R i X i, θ, ψ) = f(y i, R i, b i X i, θ, ψ)db i dyi m = f(y i X i, b i, θ 1 ) f(r i X i, Yi o, ψ) f(b i X i, θ 2 )db i dyi m ( (5.2)) = f(r i X i, Yi o, ψ) f(y i X i, b i, θ 1 ) f(b i X i, θ 2 )db i dyi m = f(r i X i, Yi o, ψ) f(yi o, Yi m X i, θ 1, θ 2 )dyi m = f(r i X i, Y o i, ψ) f(y o i X i, θ) log {f(y o i, R i X i, θ, ψ)} = log{f(r i X i, Y o i, ψ) f(y o i X i, θ)} = log{f(r i X i, Y o i, ψ)} + log{f(y o i X i, θ)} θ f(r i X i, Yi o, ψ) θ 0 θ Y i Y i 1 Y i SAS PROC MIXED Y i SM f(y i X i, b i ) = f(yi o, Ym i X i, b i ) 4 IPW GEE wgee Doubly Robust MAR SM 2 MMRM MAR SM MMRM (Mixed Model for Repeated Measures) MMRM 9 "MMRM" Mallinckrodt (2013) 7.2 "MMRM" Direct Likelihood (DL) 4 Type (i) 5 NRC (2010) p73 5 SM 42

44 MMRM SM Appendix A MAR SM MMRM MAR SM MAR SM 6 MMRM 3 σ 2 Y i = X i β + b i ϵ i (b i iid N(0, ν 2 ), ϵ i iid N(0, σ 2 I 3 ), b i ϵ i ) V [Y i ] = V [b i ϵ i ] = 1 3 V [b i ]1 3 + V [ϵ i ] = 1 3 ν σ 2 I = ν σ ν 2 + σ 2 ν 2 ν 2 = ν 2 ν 2 + σ 2 ν 2 ν 2 ν 2 ν 2 + σ 2 (unstructured) MMRM MMRM MMRM (unstructured) 3 unstructured 0 σ1 2 σ 12 σ 13 iid Y i = X i β + b i ϵ i b i N(0, ν 2 iid ), ϵ i N 0, σ 12 σ2 2 σ 23 0 σ 13 σ 23 σ3 2 V [Y i ] = V [b i ϵ i ] (5.3) = 1 3 V [b i ]1 3 + V [ϵ i ] (5.4) σ1 2 σ 12 σ 13 = 1 3 ν σ 12 σ2 2 σ 23 σ 13 σ 23 σ

45 1 1 1 σ1 2 σ 12 σ 13 = ν σ 12 σ 2 2 σ σ 13 σ 23 σ3 2 ν 2 + σ1 2 ν 2 + σ 12 ν 2 + σ 13 = ν 2 + σ 12 ν 2 + σ2 2 ν 2 + σ 23 (5.5) ν 2 + σ 13 ν 2 + σ 23 ν 2 + σ 2 3 ν 2 ν 2, σ 2 1, σ 12, 1 7 unstructured unstructured MMRM Mallinckrodt et al. (2001) MMRM unstructured 1 unstructured ) ν 2 + σ 2 1 ν 2 + σ 12 1 Mallinckrodt (2013) 7.4 unstructured Mallinckrodt et al. (2008) Mallinckrodt et al. (2004) α MMRM Compound Symmetry (parsimonious) unstructured unstructured 1 7 (identifiablity) ν2 + σ 2 1 ν 2 + σ 12 ν 2 + σ 13 ν 2 + σ 12 ν 2 + σ 2 2 ν 2 + σ 23 ν 2 + σ 13 ν 2 + σ 23 ν 2 + σ ν 2 = 0.8 σ 2 1 = σ2 2 = σ2 3 = 0.2, σ 12 = σ 23 = 0.1, σ 13 = 0 2. ν 2 = 0.5 σ 2 1 = σ2 2 = σ2 3 = 0.5, σ 12 = σ 23 = 0.4, σ 13 = 0.3 = 0 < ν 2 <
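A minimal sketch of the MMRM described in this section using SAS PROC MIXED with an unstructured covariance matrix; the dataset and variable names (adeff, subjid, trt, visit, base, chg) are illustrative assumptions, and the Kenward-Roger degrees of freedom are one common choice rather than something prescribed by the text.

proc mixed data=adeff method=reml;
  class subjid trt visit;
  /* fixed effects: baseline, treatment, visit, and their interactions with visit */
  model chg = base trt visit trt*visit base*visit / ddfm=kr;
  /* unstructured within-subject covariance of the repeated measurements */
  repeated visit / subject=subjid type=un;
  /* visit-wise treatment contrasts */
  lsmeans trt*visit / diff cl;
run;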

46 5.3.3 MNAR (outcome-dependent dropout) SM MNAR Little (2008) outcome-dependent dropout 9 f(r i X i, Y i, b i, ψ) = f(r i X i, Y i, ψ) f(y i, R i, b i X i, θ, ψ) = f(y i X i, b i, θ 1 ) f(r i X i, Y i, ψ) f(b i X i, θ 2 ) b i f(y i, R i X i, θ, ψ) = f(y i X i, b i, θ 1 ) f(r i X i, Y i, ψ) f(b i X i, θ 2 )db i = f(y i X i, θ) f(r i X i, Y i, ψ) (5.6) (5.6) b i f(r i X i, Y i, ψ) 10 f(yi o, R i X i, θ, ψ) = f(yi o, Yi m X i, θ) f(r i X i, Yi o, Yi m, ψ)dyi m (5.7) MAR Diggle and Kenward (1994) R i R ij = 1 R ij = 0 logit{p r(r ij = 0 R i1 = 1,, R i,j 1 = 1, Y i, X i, ψ)} = ψ 1 + ψ 2 Y i,j 1 + ψ 3 Y ij (j = 2,, n) (5.8) P r(r ij = 0 R i,j 1 = 0, Y i, X i, ψ) = 1 (j = 2,, n) (5.9) P r(r i1 = 1 Y i, X i, ψ) = 1 (5.10) 11 ψ = (ψ 1, ψ 2, ψ 3 ) (5.8) i j Y ij 1 Y i,j 1 Y ij 2 ψ 3 = 0 Y i,j 1 MAR (5.9) (5.10) (j = 1) 9 NRC (2010) 5 IPW 10 f(y i X i, b i, θ 1 ) f(y i X i, θ) Y i = X i β + Z i b i + ϵ i, b i iid N(0, D), ϵ i iid N(0, Σ) {b i } {ϵ i } f(y i X i, b i, θ 1 ) = Y i b i N(X i β + Z i b i, Σ) f(y i X i, θ) = Y i N(X i β, Z i DZ i + Σ) θ 2 b i D Y i b i Y i 11 X i X i 45

47 1. logistic 2. j j, (j 1) 2 Y 3. j 1 Y 2 1 Y i1 4 3 Y i3 MNAR MNAR NFD (Non Future dependence) NFD 6 Appendix B Diggle and Kenward (1994) Nelder-Mead Nelder and Mead Nelder and Mead (1965a, 1965b) 5.4 SM II MMRM 10 NRC (2010) 5 Type (i) untestable Type (ii) testable 2 Type (ii) Type (i) Type (i) MAR MMRM MAR Type (i), MNAR SM MNAR SM MNAR SM MNAR SM MMRM Diggle and Kenward (1994) logit{p r(r ij = 0 R i1 = 1,, R i,j 1 = 1, Y i, X i, ψ)} = ψ 1 + ψ 2 Y i,j 1 + ψ 3 Y ij (j = 2,, n) (5.11) P r(r ij = 0 R i,j 1 = 0, R i,j 2,, R i1, Y i, X i, ψ) = 1 (j = 2,, n) P r(r i1 = 1 Y i, X i, ψ) = 1 R i,j 2, R i1 ψ 1, ψ 2, ψ 3 ψ 3 = 0 MAR ψ 3 0 MNAR ψ 3 MAR MNAR ψ 3 MAR Verbeke and Molenberghs (2000) H 0 : ψ 3 = 0, H 1 : ψ 3 0 MAR MNAR 12 MAR 12 46

48 MNAR SM 2 2 Type (i) 1 MAR Diggle and Kenward (1994) logit{p r(r ij = 0 R i1 = 1,, R i,j 1 = 1, Y i, X i, ψ)} = ψ 1 + ψ 2 Y i,j 1 + ψ 3 Y ij (j = 2,, n) (5.12) P r(r ij = 0 R i,j 1 = 0, R i,j 2,, R i1, Y i, X i, ψ) = 1 (j = 2,, n) P r(r i1 = 1 Y i, X i, ψ) = 1 R i,j 2, R i1 MAR MNAR ψ 3 ψ 3 MAR ψ 3 MAR ψ 3 MAR NRC (2010) 5 (sensitivity parameter) Delta adjustment 13 (i) ψ (ii) (i) ψ 3 1 MNAR SM (iii) (i) ψ 3 (ii) logistic MNAR MNAR 14 ψ 3 ψ 3 SM MMRM SM MMRM 5.5 Missingdata.org.uk SM SM 1 2 Appendix C 2 1 logit{p r(r ij = 0 R i1 = 1,, R i,j 1 = 1, Y i, X i, ψ)} = ψ 1 + ψ 3 Y i,j 1 + ψ 5 Y ij 2 logit{p r(r ij = 0 R i1 = 1,, R i,j 1 = 1, Y i, X i, ψ)} = ψ 2 + ψ 4 Y i,j 1 + ψ 6 Y ij (5.13) (5.11) (5.12) ψ 1, ψ 2, ψ 3 13 Pattern-Mixture Model SM ψ i 14 47

49 "%Selection_Model2" (5.13) ψ 1 ψ 6 inputds covtype UN TOEP TOEPH ARH AR CSH CS response model clasvar mech MCAR MAR MNAR const 3 8 derivative (0, 1) method NR Newton-Raphoson ridge QN Newton out1 out2 LSMEAN out3 LSMEAN debug "%SM_GridSearch" (5.13) ψ 1, ψ 2, ψ 3, ψ 4 ψ 5, ψ 6 inputds psi5grid 1 ψ 5 (-1 1) psi6grid 2 ψ 6 (-1 1) inputds covtype UN TOEP TOEPH ARH AR CSH CS response model clasvar mech MCAR MAR MNAR const 3 8 derivative (0, 1) method NR Newton-Raphoson ridge QN Newton out1 out2 LSMEAN out3 LSMEAN debug 0 [1] Diggle, P. and Kenward, M. G. (1994). Informative drop-out in longitudinal data analysis. Journal of Applied Statistics, 43,

50

[2] Little, R. J. (2008). Selection and Pattern-Mixture models. Chapter 18 in Advances in Longitudinal Data Analysis.

[3] Little, R. J. A. and Rubin, D. B. (2002). Statistical analysis with missing data. 2nd edition. John Wiley & Sons. New York.

[4] Mallinckrodt, C. H. (2013). Preventing and treating missing data in longitudinal clinical trials. Cambridge University Press.

[5] Mallinckrodt, C. H., Clark, W. S. and David, S. R. (2001). Accounting for dropout bias using mixed-effects models. Journal of Biopharmaceutical Statistics, 11,

[6] Mallinckrodt, C. H., Kaiser, C. J., Watkin, J. G., Molenberghs, G., & Carroll, R. J. (2004). The effect of correlation structure on treatment contrasts estimated from incomplete clinical trial data with likelihood-based repeated measures compared with last observation carried forward ANOVA. Clinical Trials, 1(6),

[7] Mallinckrodt, C. H., Lane, P. W., Schnell, D., Peng, Y., & Mancuso, J. P. (2008). Recommendations for the primary analysis of continuous endpoints in longitudinal clinical trials. Drug Information Journal, 42(4),

[8] National Research Council. (2010). The prevention and treatment of missing data in clinical trials. National Academy Press.

[9] Nelder, J. A., and Mead, R. (1965a). A simplex method for function minimization. The Computer Journal, 7(4),

[10] Nelder, J. A., and Mead, R. (1965b). A simplex method for function minimization - errata. The Computer Journal, 8(1), 27.

[11] Verbeke, G., and Molenberghs, G. (2000). Linear mixed models for longitudinal data. Springer.

49

51 6 Pattern-Mixture Model 6.1 Pattern-Mixture Model PMM PMM PMM Identifying Restrictions PMM Controlled imputation PMM Ratitch et al. (2013) Introduction Mallinckrodt et al. (2013) Jump to reference (J2R) Copy reference (CR), Copy increment reference (CIR) Controlled imputation PMM PMM (Patterm-Mixture Model, PMM) 6.1 PMM PMM estimand PMM efficacy effectiveness Mallinckrodt et al. (2013) 12 Analytic road map PMM estimand efficacy (estimand 3) contorolled imputation placebo multiple imputation effectiveness (estimand 6) Ayele et al. (2014) placebo multiple imputation effectiveness effectiveness 50

52 efficacy efficacy MNAR estimand 6.1: 6.2 PMM 1 R j Y 1,.., Y t 1 Y t f j (Y t Y 1,.., Y t 1 ) := f (Y t Y 1,.., Y t 1, R = j) R 1 R j Y 1,.., Y j Y j+1,.., Y t ( ) NFMV PMM R j (j + 1) j (j + 1) (j + 2) j = 1 Y 3 1 f 1 (Y 3 Y 1, Y 2 ) = f (Y 3 Y 1, Y 2, R = 1) R i i R Molenberghs et al. (1998), Kenward et al. (2003) 51

53 f 1 (3 12) = f (Y 3 Y 1, Y 2, R = 1) 2 1 R 1 Y 1 Y 2 Y 3 Y 3 2 f 2 (Y 3 Y 1, Y 2 ) = f (Y 3 Y 1, Y 2, R = 2) f 2 (3 12) = f (Y 3 Y 1, Y 2, R = 2) R 2 Y 1, Y 2 Y 3 Y 3 3 f 3 (Y 3 Y 1, Y 2 ) = f (Y 3 Y 1, Y 2, R = 3) f 3 (3 12) = f (Y 3 Y 1, Y 2, R = 3) R 3 Y 1, Y 2, Y 3 Y 3 2 R t Y 1,..., Y T f ( t) (Y 1,.., Y T ) = f (Y 1,.., Y T R t) 6.3 PMM PMM Little (1993, 1994, 1995) Selection Model SM PMM ( ) ( f Yi o, Yi m, R i X i, θ, ψ = f R i X i, ψ ) ( f Yi o, Yi m R i, X i, θ ) Y o i : i Y m i : i R i : i X i : i ψ : θ : ψ, θ SM PMM SM Y i R i R i 52

54 Yi o, Ym i X i R i PMM X i Yi m PMM "under-identified" Little (1993, 1994) "Identifying restriction" ( ) 6.4 f Yi o, Ym i, R i X i, θ, ψ ( f Yi o, Yi m, R i X i, ( = f R i X i, ψ ) f ) θ, ψ ( Y o i R i, X i, θ ) f ( Yi m Yi o, R i, X i, θ ) ( f R i X i, ψ ) ( f Yi o R i, X i, θ ) ( f Yi m Yo i, R i, X i, θ ) ( R i 0 1 R i R i R i R i R i PMM (CCMV) (NCMV) (ACMV) : 53

55 NRC , p85 Type (i) Little (1993, 1994) Complete Case Missing Values (CCMV) Neighboring Case Missing Values (NCMV) Available Case Missing Values (ACMV) Thijs et al. (2002) 3 Kenward et al. (2003) Interior family (6.2) PMM (1) (2) (3) (4) (5) PMM (3) PMM 6.5 CCMV NCMV ACMV PMM R = 1,, T R = t t t ( ) f Y 1,..., Y T, R = t X, θ, ψ = f t (Y 1,.., Y t X, θ ) f t (Y t+1, Y t+2,..., Y T Y 1,.., Y t, X, θ ) ( f R = t X, ψ ) X, θ ψ 1 f t (Y 1,.., Y t X, θ ) 2 f t (Y t+1, Y t+2,..., Y T Y 1,.., Y t, X, θ ) X θ ψ (6.1) CCMV, NCMV, ACMV Thijs et al. (2002) (s 1) Y 1,..., Y s 1 54

56 (j = s,..., T ) Y s CCMV NCMV ACMV f t (Y s Y 1,..., Y s 1 ) }{{} = T ω sj f j (Y s Y 1,..., Y s 1 ) j=s, s = t + 1,.., T (6.2) } {{ } = ω ss f s (Y s Y 1,..., Y s 1 ) + + ω st f T (Y s Y 1,..., Y s 1 ) (6.2) s t f s (Y s Y 1,..., Y s 1 ),, f T (Y s Y 1,..., Y s 1 ) R s Y s interior family interior family (6.2) ω sj 6.1: A B C D 6.1 CCMV, NCMV, ACMV ω sj CCMV CCMV (Little, 1993) ω T,T = ω T 1,T = ω T 2,T =... = ω t+1,t = 1 ω sj = 0, j T. Y s f t (Y s Y 1,..., Y s 1 ) = f T (Y s Y 1,..., Y s 1 ), s = t + 1,..., T (6.3) 4 CCMV f 3 (4 123) }{{} = f 4 (4 123) ( ) }{{} f 2 (3 12) = f 4 (3 12), f 2 (4 123) = f 4 (4 123) 55

57 f 1 (2 1) = f 4 (2 1), f 1 (3 12) = f 4 (3 12), f 1 (4 123) = f 4 (4 123) CCMV A CCMV : CCMV ( ) A f 4 (1) f 4 (2 1) f 4 (3 12) f 4 (4 123) B f 3 (1) f 3 (2 1) f 3 (3 12) f 4 (4 123) C f 2 (1) f 2 (2 1) f 4 (3 12) f 4 (4 123) D f 1 (1) f 4 (2 1) f 4 (3 12) f 4 (4 123) NCMV NCMV ω T,T = ω T 1,T 1 = ω T 2,T 2 =... = ω t+1,t+1 = 1 ω sj = 0, j s. Y s f t (Y s Y 1,..., Y s 1 ) = f s (Y s Y 1,..., Y s 1 ), s = t + 1,..., T (6.4) 4 NCMV f 3 (4 123) = f 4 (4 123) f 2 (3 12) = f 3 (3 12), f 2 (4 123) = f 4 (4 123) f 1 (2 1) = f 2 (2 1), f 1 (3 12) = f 3 (3 12), f 1 (4 123) = f 4 (4 123) NCMV 6.1 C D 3 B 3 NCMV : NCMV ( ) A f 4 (1) f 4 (2 1) f 4 (3 12) f 4 (4 123) B f 3 (1) f 3 (2 1) f 3 (3 12) f 4 (4 123) C f 2 (1) f 2 (2 1) f 3 (3 12) f 4 (4 123) D f 1 (1) f 2 (2 1) f 3 (3 12) f 4 (4 123) 56

58 6.5.4 ACMV ACMV(Molenberghs et al., 1998) f t (Y s Y 1,..., Y s 1 ) = f s (Y s Y 1,..., Y s 1 ) = f(y s Y 1,..., Y s 1, R s) = f(y 1,..., Y s 1, Y s, R s) f(y 1,..., Y s 1, R s) = f(y 1,..., Y s 1, Y s R = s)f(r = s) f(y 1,..., Y s 1, Y s R = T )f(r = T ) f(y 1,..., Y s 1 R = s)f(r = s) f(y 1,..., Y s 1 R = T )f(r = T ) = α sf s (Y 1,..., Y s 1, Y s ) α s f T (Y 1,..., Y s 1, Y s ) ( f(r = j) = α j ) α s f s (Y 1,..., Y s 1 ) α T f T (Y 1,..., Y s 1 ) T α j f j (Y 1,..., Y s ) = = = j=s T α j f j (Y 1,..., Y s 1 ) j=s T j=s α j f j (Y 1,..., Y s 1 ) f j (Y s Y 1,.., Y s 1 ) T α l f l (Y 1,..., Y s 1 ) l=s T ω sj f j (Y s Y 1,.., Y s 1 ), s = t + 1,..., T. (6.5) j=s ω sj = α jf j (Y 1,..., Y s 1 ) T α l f l (Y 1,, Y s 1 ) l=s α j j Molenberghs et al., ACMV f 3 (4 123) = f 4 (4 123) f 2 (3 12) = f 3 (3 12), f 2 (4 123) = f 4 (4 123) f 1 (2 1) = f 2 (2 1), f 1 (3 12) = f 3 (3 12), f 1 (4 123) = f 4 (4 123) ACMV 6.1 D 2 2 A B C 2 ACMV : ACMV ( ) A f 4 (1) f 4 (2 1) f 4 (3 12) f 4 (4 123) B f 3 (1) f 3 (2 1) f 3 (3 12) f 4 (4 123) C f 2 (1) f 2 (2 1) f 3 (3 12) f 4 (4 123) D f 1 (1) f 2 (2 1) f 3 (3 12) f 4 (4 123) Molenberghs et al. (1998) 1 Appendix B 1: MAR ACMV 57

59 6.5.5 CCMV NCMV ACMV (6.2) (6.1) t T t 1 f t (Y 1,..., Y T ) = f t (Y 1,.., Y t ) T ω T s,j f j (Y T s Y 1,..., Y T s 1 ) (6.6) s=0 j=t s 4 CCMV NCMV ACMV 4 ω 33 + ω 34 = 1, ω 44 = 1 ω := ω 33 ω 34 = 1 ω f 4 (1234) = f 4 (1234) f 3 (1234) = f 3 (123) f 3 (4 123) = f 3 (123) ω 44 f 4 (4 123) ( CCMV, NCMV, ACMV ) = f 3 (123) f 4 (4 123) ( ω 44 = 1) f 2 (1234) = f 2 (12) f 2 (3 12) f 2 (4 123) = f 2 (12) [ω 33 f 3 (3 12) + ω 34 f 4 (3 12)] ω 44 f 4 (4 123) ( (6.6)) = f 2 (12) [ω f 3 (3 12) + (1 ω) f 4 (3 12)] f 4 (4 123) ( ω 33 = ω, ω 34 = 1 ω, ω 44 = 1) f 1 (1234) = f 1 (1) f 1 (2 1) f 1 (3 12) f 1 (4 123) = f 1 (1) [ω 22 f 2 (2 1) + ω 23 f 3 (2 1) + ω 24 f 4 (2 1)] [ω 33 f 3 (3 12) + ω 34 f 4 (3 12)] ω 44 f 4 (4 123) ( (6.6)) = f 1 (1) [ω 22 f 2 (2 1) + ω 23 f 3 (2 1) + ω 24 f 4 (2 1)] [ω f 3 (3 12) + (1 ω)f 4 (3 12)] f 4 (4 123) ( ω 33 = ω, ω 34 = 1 ω, ω 44 = 1) (6.7) ω Case X Case 1: CCMV ω = ω 33 = ω 22 = ω 23 = 0, ω 44 = ω 34 = ω 24 = 1 (6.7) CCMV 4 f 4 (1234) = f 4 (1234) f 3 (1234) = f 3 (123) f 4 (4 123) f 2 (1234) = f 2 (12) f 4 (3 12) f 4 (4 123) f 1 (1234) = f 1 (1) f 4 (2 1) f 4 (3 12) f 4 (4 123) Case 2: NCMV ω = ω 33 = ω 22 = 1, ω 34 = ω 23 (1) = ω 24 (1) = 0 58

60 CCMV 4 f 4 (1234) = f 4 (1234) f 3 (1234) = f 3 (123) f 4 (4 123) f 2 (1234) = f 2 (12) f 3 (3 12) f 4 (4 123) f 1 (1234) = f 1 (1) f 2 (2 1) f 3 (3 12) f 4 (4 123) Case 3: ACMV α 3 f 3 (12) ω = ω 33 = α 3 f 3 (12) + α 4 f 4 (12), 1 ω = ω α 4 f 4 (12) 34 = α 3 f 3 (12) + α 4 f 4 (12), α 2 f 2 (1) ω 22 = α 2 f 2 (1) + α 3 f 3 (1) + α 4 f 4 (1), ω α 3 f 3 (1) 23 = α 2 f 2 (1) + α 3 f 3 (1) + α 4 f 4 (1), ω 24 = α 4 f 4 (1) α 2 f 2 (1) + α 3 f 3 (1) + α 4 f 4 (1) α j = f(r = j) ACMV 4 Case 3 ω (6.7) MAR Case 3 O Kelly and Ratitch 2014 CCMV NCMV ACMV MAR MAR MAR MMRM O Kelly and Ratitch 2014 MAR MI ACMV PMM, ACMV ACMV NCMV O Kelly and Ratitch CCMV ACMV NCMV NCMV CCMV NCMV ACMV PMM Step1 t (= 1,..., T ) f t (Y 1,.., Y t ) Step2 CCMV NCMV ACMV 59

61 Step3 Step2 f t (Y t+1,..., Y T Y 1,..., Y t ) ω 2 Step Step3-1 (6.2) 1 (6.2) (T s + 1) ω sj (s = t + 1,..., T ; j = s,..., T ) f j (Y s Y 1,..., Y s 1 ) U s1 1 s (T t) (T t) k (= s,..., T ) k 1 k ω sj U ω sj, (k = s,, T, k 2) j=s j=s 0 U ω 11, (k = 1) Step3-2 k Y s Step4 Step3-2 Step5 MMRM Step6 Rubin MAR ACMV 2. MI (CCMV, NCMV, ACMV) 3 CCMV, NCMV, ACMV interior family CCMV NCMV MNAR ACMV MAR CCMV NCMV MNAR MAR O Kelly and Ratitch

62 Interior family Non Future Missing Values (NFMV) Interior family NFMV MNAR 6.7 NFMV PMM CCMV, NCMV MNAR MAR MNAR MNFD (Missing Non Future Dependence) MNFD NFMV PMM MNFD NFD Appendix B NFD MNFD MAR PMM MNAR Kenward et al. (2003) NFMV PMM Interior family CCMV, ACMV, NCMV NFMV MNAR SM R = 1,, T R = t t. SM ( ) ( f Y 1,.., Y T, R = t X, θ, ψ = f Y 1,..., Y T X, θ ) f ( ) R = t Y 1,..., Y T, X. ψ (6.8) PMM ( ) ( f Y 1,..., Y T, R = t X, θ, ψ = f Y 1,..., Y T R = t, X, θ ) ( f R = t X, ψ ) = f t (Y 1,..., Y T X, θ ) ( f R = t X, ψ ) = f t (Y 1,..., Y t X, θ ) ( f t Y t+1 Y 1,..., Y t, X, θ ) f t (Y t+2,..., Y T Y 1,..., Y t+1, X, θ ) ( f R = t X, ψ ) (6.9) (6.9) 2 3 t f t (Y 1,..., Y t X, θ) t + 1 f t (Y t+1 Y 1,..., Y t, X, θ) t + 2 f t (Y t+2,...,( Y T Y 1,..., Y t+1, X, θ) 1 4 (f R = t X, ψ ) ) X θ ψ NFMV SM MNFD Missing Non-Future Dependent MNFD 2 (6.1) (6.1) t (t + 1) (6.9) t (t + 1) (t + 2) (t + 2) NFMV (6.1) 61

63 R = t t Y o = (Y 1,, Y t ) t + 1 Y m = (Y t+1,, Y T ) MNFD (Kenward et al., 2003) f (R = t Y 1,..., Y T ) = f (R = t Y 1,..., Y t+1 ) (6.10) (t + 1) Y t+1 Y t+2,..., Y T Little and Rubin (2002) MNAR MAR Molenberghs et al. (2007) MNAR MAR 6.5 MNAR SM Diggle and Kenward (1994) 3 logit{p r(r i,t+1 = 0 R i1 = 1,, R i,t = 1, Y i, X i, ψ)} = ψ 1 + ψ 2 Y i,t + ψ 3 Y i,t+1 MNFD PMM MNFD Non- Future Missing Value restrictions NFMV s t + 2, f (Y s Y 1,..., Y s 1, R = t) = f (Y s Y 1,..., Y s 1, R s 1) (6.11) Kenward et al. (2003) MNFD NFMV Appendix B t Y t Y t+1, Y t+2,..., Y s 1 Y s (s 1) Y s Y s NFMV s = 5, t = 3, 2, 1 f (Y 5 Y 1, Y 2, Y 3, Y 4, R = 3) = f (Y 5 Y 1, Y 2, Y 3, Y 4, R 4) Y 3 Y 4 f (Y 5 Y 1, Y 2, Y 3, Y 4, R = 2) = f (Y 5 Y 1, Y 2, Y 3, Y 4, R 4) Y 2 Y 3 f (Y 5 Y 1, Y 2, Y 3, Y 4, R = 1) = f (Y 5 Y 1, Y 2, Y 3, Y 4, R 4) Y 1 Y 2 R = 4 Y 5 NFMV D 3 NFMV Missingdata.org.uk 62

64 6.5: NFMV f A f 4 (1) f 4 (2 1) f 4 (3 12) f 4 (4 123) B f 3 (1) f 3 (2 1) f 3 (3 12) C f 2 (1) f 2 (2 1) f 3 (4 123) D f 1 (1) f 2 (3 12) f 3 (4 123) (6.11) f (Y s Y 1,..., Y s 1, R s 1) R = s 1 R s f (Y s Y 1,..., Y s 1, R s 1) = f (Y s Y 1,..., Y s 1, R = s 1) f (R = s 1 R s 1) }{{} + f (Y s Y 1,..., Y s 1, R s) f (R s R s 1) }{{} R = s 1 Y s R s Y s f (Y s Y 1, Y s 1, R = s 1) (6.12) s = 5 f (Y 5 Y 1, Y 2, Y 3, Y 4, R = 4) NFMV NFMV NFMV CCMV NCMV NFMV 1 ACMV ACMV Appendix B ACMV MAR NFMV Kenward et al. (2003) Appendix B 2: MNFD NFMV Molenberghs et al. (1998) MAR ACMV 1 NFMV (6.12) CCMV NCMV MNAR Mallinckrodt et al., 2013 ACMV MAR ACMV PMM NFMV CCMV NFMV NCMV PMM MAR Mallinckrodt et al. (2013) (NFMV) NFMV t f t (Y 1,..., Y T ) = f t (Y 1,..., Y t ) f t (Y t+1 Y 1,..., Y t ) f t (Y t+2,..., Y T Y 1,..., Y t+1 ) T = f t (Y 1,..., Y t ) f t (Y t+1 Y 1,..., Y t ) f t (Y s Y 1,..., Y s 1 ) (6.13) 63 s=t+2

65 (6.11) T = 4 R = 1 f 1 (Y 1, Y 2, Y 3, Y 4 ) = f 1 (Y 1 ) f 1 (Y 2 Y 1 ) 4 f 1 (Y s Y 1, Y 2 ) s=3 = f 1 (Y 1 ) f 1 (Y 2 Y 1 ) f 1 (Y 3 Y 1, Y 2 ) f 1 (Y 4 Y 1, Y 2, Y 3 ) }{{}}{{}}{{}}{{} NFMV 3 NFMV NFMV (6.11) s = t + 2,, T f t (Y s Y 1,..., Y s 1 ) = f (Y s Y 1,..., Y s 1, R s 1) = f s 1 (Y s Y 1,..., Y s 1 ) (6.14) s = t + 2,..., T f s 1 (Y s Y 1,..., Y s 1 ) = = = T j=s 1 T j=s 1 T j=s 1 T j=s 1 α j f j (Y 1,..., Y s ) α j f j (Y 1,..., Y s 1 ) α j f j (Y 1,..., Y s 1 ) f j (Y s Y 1,.., Y s 1 ) T α l f l (Y 1,, Y s 1 ) l=s 1 ω sj f j (Y s Y 1,..., Y s 1 ) (6.15) α j f j (Y 1,..., Y s 1 ) ω sj =, T α l f l (Y 1,, Y s 1 ) l=s 1 α j : j MNAR NFMV (6.12) 4 NFMV 4 (6.15) (6.11) (6.12) g f 2 (1234) ω 21 + ω 22 + ω 23 + ω 23 = 1, ω 32 + ω 33 + ω 34, ω 43 + ω 44 = 1 δ := ω 43 ω 44 = 1 δ f 4 (1234) = f 4 (1234) f 3 (1234) = f 3 (123) f 3 (4 123) = f 3 (123) g 3 (4 123) f 2 (1234) = f 2 (12) f 2 (3 12) f 2 (4 123) 64

66 = f 2 (12) f 2 (3 12) f 3 (4 123) ( NFMV) = f 2 (12) g 2 (3 12)f 3 (4 123) (NFMV g 2 = f 2 (12) g 2 (3 12) [ω 43 f 3 (4 123) + ω 44 f 4 (4 123)] ( (6.15)) [ ] = f 2 (12) g 2 (3 12) δ g 3 (4 123) + (1 δ) f 4 (4 123) ( ω 43 = δ, ω 44 = 1 δ) NFMV g 3 f 1 (1234) = f 1 (1) f 1 (2 1) f 1 (34 12) = f 1 (1) g 1 (2 1) f 1 (3 12) f 1 (4 123) = f 1 (1) g 1 (2 1)f 2 (3 12) f 3 (4 123) ( NFMV) = f 1 (1) g 1 (2 1) [ω 32 f 2 (3 12) + ω 33 f 3 (3 12) + ω 34 f 4 (3 12)] [ ] ω 43 g 3 (4 123) + ω 44 f 4 (4 123) ( (6.15)) [ ] = f 1 (1) g 1 (2 1) ω 32 g 2 (3 12) + ω 33 f 3 (3 12) + ω 34 f 4 (3 12) [ ] δ g 3 (4 123) + (1 δ) f 4 (4 123) ( ω 43 = δ, ω 44 = 1 δ) δ ω 43, ω 44 δ α 3 f 3 (123) δ = ω 43 = α 3 f 3 (123) + α 4 f 4 (123), 1 δ = ω α 4 f 4 (123) 44 = α 3 f 3 (123) + α 4 f 4 (123), α 2 f 2 (12) ω 32 = α 2 f 2 (12) + α 3 f 3 (12) + α 4 f 4 (12), ω α 3 f 3 (12) 33 = α 2 f 2 (12) + α 3 f 3 (12) + α 4 f 4 (12), ω 34 = α 4 f 4 (12) α 2 f 2 (12) + α 3 f 3 (12) + α 4 f 4 (12) NFMV g 1, g 2, g 3. Kenward et al. (2003) NFD1 Case 4 NFD2 Case 5 NFD1 NFD2 interior family CCMV NCMV NFD1 NFD2 6.5 NFMV CCMV NCMV : NFD1 (NFMV + CCMV) A f 4 (1) f 4 (2 1) f 4 (3 12) f 4 (4 123) B f 3 (1) f 3 (2 1) f 3 (3 12) f 4 (4 123) C f 2 (1) f 2 (2 1) f 4 (3 12) f 4 (4 123) D f 1 (1) f 4 (2 1) (1 ω 33 (12)) f 4 (3 12) + ω 33 (12) f 3 (3 12) f 4 (4 123) 6.7: NFD2 (NFMV + NCMV) A f 4 (1) f 4 (2 1) f 4 (3 12) f 4 (4 123) B f 3 (1) f 3 (2 1) f 3 (3 12) f 4 (4 123) C f 2 (1) f 2 (2 1) f 3 (3 12) f 4 (4 123) D f 1 (1) f 2 (2 1) (1 ω 34 (12)) f 3 (3 12) + ω 34 (12) f 4 (3 12) f 4 (4 123) NFMV 4 Case 4: NFMV+CCMV (NFD1) g 1 (2 1) = f 4 (2 1), g 2 (3 12) = f 4 (3 12), g 3 (4 123) = f 4 (4 123) 65

67 4 f 4 (1234) = f 4 (1234) f 3 (1234) = f 3 (123) f 3 (4 123) = f 3 (123) f 4 (4 123) f 2 (1234) = f 2 (12) f 2 (3 12) f 2 (4 123) = f 2 (12) f 4 (3 12)f 3 (4 123) ( CCMV, NFMV) = f 2 (12) f 4 (3 12) [ω 43 f 3 (4 123) + ω 44 f 4 (4 123)] ( (6.15)) [ ] = f 2 (12) f 4 (3 12) δ f 4 (4 123) + (1 δ) f 4 (4 123) ( ω 43 = δ, ω 44 = 1 δ, CCMV) = f 2 (12) f 4 (3 12)f 4 (4 123) f 1 (1234) = f 1 (1) f 1 (2 1) f 1 (34 12) = f 1 (1) f 4 (2 1) f 1 (3 12) f 1 (4 123) = f 1 (1) f 4 (2 1)f 2 (3 12) f 3 (4 123) ( NFMV) = f 1 (1) f 1 (2 1) [ω 32 f 2 (3 12) + ω 33 f 3 (3 12) + ω 34 f 4 (3 12)] [ω 43 f 3 (4 123) + ω 44 f 4 (4 123)] [ ( (6.15)) ] = f 1 (1) f 4 (2 1) ω 32 f 4 (3 12) + ω 33 f 3 (3 12) + ω 34 f 4 (3 12) [ ] δ f 4 (4 123) + (1 δ) f 4 (4 123) ( ω 43 = δ, ω 44 = 1 δ, CCMV) [ ] = f 1 (1) f 4 (2 1) (1 ω 33 ) f 4 (3 12) + ω 33 f 3 (3 12) f 4 (4 123) ( ω 32 + ω 33 + ω 34 = 1) 4 Case 1: CCMV R = 4 Case 4: NFD1 f 1 (3 12) R = 3 f 3 (3 12) R = 4 f 4 (3 12) NFD1 CCMV Case 5: NFMV+NCMV (NFD2) g 1 (2 1) = f 2 (2 1), g 2 (3 12) = f 3 (3 12), g 3 (4 123) = f 4 (4 123) 4 f 4 (1234) = f 4 (1234) f 3 (1234) = f 3 (123) f 3 (4 123) = f 3 (123) f 4 (4 123) f 2 (1234) = f 2 (12) f 2 (3 12) f 2 (4 123) = f 2 (12) f 3 (3 12)f 3 (4 123) ( NFMV) = f 2 (12) f 3 (3 12) [ω 43 f 3 (4 123) + ω 44 f 4 (4 123)] ( (6.15)) [ ] = f 2 (12) f 3 (3 12) δ f 4 (4 123) + (1 δ) f 4 (4 123) ( ω 43 = δ, ω 44 = 1 δ, NCMV) = f 2 (12) f 3 (3 12)f 4 (4 123) f 1 (1234) = f 1 (1) f 1 (2 1) f 1 (34 12) = f 1 (1) f 4 (2 1) f 1 (3 12) f 1 (4 123) = f 1 (1) f 2 (2 1)f 2 (3 12) f 3 (4 123) ( NFMV) = f 1 (1) f 2 (2 1) [ω 32 f 2 (3 12) + ω 33 f 3 (3 12) + ω 34 f 4 (3 12)] [ω 43 f 4 (4 123) + ω 44 f 4 (4 123)] ( (6.15)) 66

68 = f 1 (1) f 2 (2 1) [ω 32 f 2 (3 12) + ω 33 f 3 (3 12) + ω 34 f 4 (3 12)] [ ] δ f 4 (4 123) + (1 δ) f 4 (4 123) ( NCMV, ω 43 = δ, ω 44 = 1 δ) [ ] = f 1 (1) f 2 (2 1) ω 32 f 3 (3 12) + ω 33 f 3 (3 12) + ω 34 f 4 (3 12) f 4 (4 123) ( NCMV) [ ] = f 1 (1) f 2 (2 1) (1 ω 34 ) f 3 (3 12) + ω 34 f 4 (3 12) f 4 (4 123) ( ω 32 + ω 33 + ω 34 = 1) 4 Case 2: NCMV Case 5: NFD2 f 1 (3 12) R = 3 f 3 (3 12) R = 4 f 4 (3 12) NFD2 NCMV 4 Case : Case 4 67

69 6.4: Case NFMV PMM Step1 f t (Y 1,..., Y t ) Step2 CCMV NCMV Y t+1 f t (Y t+1 Y 1,..., Y t ) NFMV f (Y s Y 1, Y s 1, R = s 1) g Case 4 (NFMV+CCMV) Case 5 (NFMV+ NCMV) CCMV NCMV Step3 NFMV f t (Y t+2,.., Y T Y 1,.., Y t+1 ) 68

70 Step4 (Multiple Imputaiton) f t (Y t+2,.., Y T Y 1,.., Y t+1 ), f t (Y t+1 Y 1,.., Y t ) 6.8 Step5 MMRM Step6 Rubin (1987). Marginal analysis 6.8 Verbeke and Molenberghs (2009) PMM NFMV 4 g ACMV 10 ( ) PMM Step Step 2 Y t+1 f t (Y t+1 Y 1,..., Y t ) g ACMV Case 6 (NFMV+ACMV) Case 7 (NFMV+ACMV+ ) Case 6: NFMV+ACMV ACMV MAR 2 s T 4 f 4 (1234) = f 4 (1234) g s 1 (Y s Y 1,..., Y s 1 ) = f s (Y s Y 1,..., Y s 1 ) T = ω sj f j (Y s Y 1,.., Y s 1 ) f 3 (1234) = f 3 (123) g 3 (4 123) [ ] f 2 (1234) = f 2 (12) g 2 (3 12) δ g 3 (4 123) + (1 δ) f 4 (4 123) [ ] f 1 (1234) = f 1 (1) g 1 (2 1) ω 32 g 2 (3 12) + ω 33 f 3 (3 12) + ω 34 f 4 (3 12) [ ] δ g 3 (4 123) + (1 δ) f 4 (4 123) g g 1 (2 1) = f 2 (2 1) = ω 22 f 2 (2 1) + ω 23 f 3 (2 1) + ω 24 f 4 (2 1) j=s g 2 (3 12) = f 3 (3 12) = ω 33 f 3 (3 12) + ω 34 f 4 (3 12) g 3 (4 123) = f 4 (4 123) = f 4 (4 123) 69

71 Case 7: NFMV+ACMV+ g s 1 (Y s Y 1,..., Y s 1 ) = f s (Y s Y 1,..., Y s 1 ) T = ω sj f j (Y s Y 1,.., Y s 1 ) Case 6 g g 1 (2 1) = f 2 (2 1) = ω 22 f 2 (2 1) + ω 23 f 3 (2 1) + ω 24 f 4 (2 1) g 2 (3 12) = f 3 (3 12) = ω 33 f 3 (3 12) + ω 34 f 4 (3 12) g 3 (4 123) = f 4 (4 123) = f 4 (4 123) j=s ω sj = α j f j (Y 1,..., Y s 1 ) T α l f l (f 1,..., f s 1 ) l=s ω 22 = ω 24 = ω 34 = α 2 f 2 (1) α 2 f 2 (1) + α 3 f 3 (1) + α 4 f 4 (1), ω α 3 f 3 (1) 23 = α 2 f 2 (1) + α 3 f 3 (1) + α 4 f 4 (1), α 4 f 4 (1) α 2 f 2 (1) + α 3 f 3 (1) + α 4 f 4 (1), ω α 3 f 3 (12) 33 = α 3 f 3 (12) + α 4 f 4 (12), α 4 f 4 (12) α 3 f 3 (12) + α 4 f 4 (12) Case 7 = 0 MAR MAR NRC (2010) SM PMM Molenberghs et al. (1998) ACMV SM Missing At Random (MAR) SM PMM CCMV NCMV Missing Not At Random (MNAR) Kenward et al. (2003) MNAR SM Missing Non-Future Dependent (MNFD NFD) MNAR SM 4 Kenward et al. (2003) MNFD PMM Non-Future Missing Value (NFMV) restriction ACMV Interior family Kenward et al. (2003) MCAR, MAR, MNAR MNAR Non-future dependent Future dependent 6.5 (Molenberghs et al., 2007, p37) 4 Shared parameter models Non-Future Dependent (Wu and Bailey, 1989; Little, 1995) 70

72 6.5: SM PMM 6.5 MAR MNAR PMM ( %PATTERNMIXTURE) %PATTERNMIXTURE 6 %F IRST ANL(constraint =, type =, yvar =, classvars =) %GEN COV (incovmean =, incovvar =, outcovset =, visitval =) MIXED %GENP ARMS(inparm =, incov =, outparm =) %GENV ALS(dataset =, firstvis =, basevis =, yvar =) %MEGACALL(numimp =, Itype =, Iconstraint =, yvar =, firstv =, lastvis =) %GENCOV %GENPARM %GENVALS numimp Itype visit constraint NFMV ALL 2 Iconstraint 2 CCMV NCMV ACMV %ANALY ZE(numimp =, yvar =, model2 =, calssvars2 =, othervars =) MIXED LSMEAN 71
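For illustration only, a call of the driver macro listed above might look like the following; every argument value is a hypothetical placeholder, only the parameter names (numimp, Itype, Iconstraint, yvar, firstv, lastvis) are taken from the list above, and the actual calling conventions should be checked against the macro's own documentation.

%MEGACALL(numimp = 100, Itype = ALL, Iconstraint = CCMV, yvar = chg, firstv = 1, lastvis = 3);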

73

6.8 Multiple Imputation

MI Multiple Imputation MI Rubin (1978, 1987) MAR MAR MI Mallinckrodt MI MI Dmitrienko et al. MI PMM Multiple Imputation Rubin MI 2 θ_1, θ_2 Y f(θ_1, θ_2 | Y) θ_1 θ_2

f(θ_1, θ_2 | Y) = f(θ_1 | Y) f(θ_2 | θ_1, Y)

θ_2

f(θ_2 | Y) = E_{θ_1}[f(θ_2 | θ_1, Y)] = ∫ f(θ_1 | Y) f(θ_2 | θ_1, Y) dθ_1

θ_2

E[θ_2 | Y] = E_{θ_1}[E_{θ_2}[θ_2 | θ_1, Y]]
Var[θ_2 | Y] = E_{θ_1}[Var_{θ_2}[θ_2 | θ_1, Y]] + Var_{θ_1}[E_{θ_2}[θ_2 | θ_1, Y]]

θ_1 θ_1^(m), m = 1, ..., M

E[θ_2 | Y] = (1/M) Σ_{m=1}^{M} E_{θ_2}[θ_2 | θ_1^(m), Y] = θ̄_2
Var[θ_2 | Y] = (1/M) Σ_{m=1}^{M} Var_{θ_2}[θ_2 | θ_1^(m), Y] + (1/(M-1)) Σ_{m=1}^{M} (E_{θ_2}[θ_2 | θ_1^(m), Y] - θ̄_2)^2

θ_2 θ_1 θ_2 MI Molenberghs et al. (2007) p 72

74 6.8.3 Multiple Imuptation PMM ACMV MI MAR MI efficacy estimand 3 ACMV MAR MI Yuan 2011 R = t j = t + 1, t + 2,..., T Y j Y 1,..., Y j 1 Y j f (Y j Y 1,..., Y j 1 ) 1. Y j = β 0 +β 1 Y β j 1 Y j 1 Y 1,..., Y j β = ( β0, β 1,..., β ) j 1 σ 2 j V j = σ j 2 (X X) 1 (X X) Y 1,..., Y j σ 2 j, β σ 2 j = σ 2 j (n j j) /G, G χ 2 n j j β = β + σ j V hjz n j Y j Z j V hj V j = V hj V hj β = (β 0, β 1,..., β j 1 ), σ 2 j Y j Ŷj Ŷ j = β 0 + β 1 Y β j 1 Y j 1 + z i σ j M SAS PROC MI MONOTONE METHOD = REG Multiple Imputation k θ M M M θ (1), θ (2),..., θ (M) θ (m), m = 1, 2,..., M ( ) θ (m) θ N (0, U (m)) θ = 1 M M m=1 Ŵ = θ (m) θ ) ( θ θ N(0, V) ( ) M + 1 V = Ŵ + B M M Û (m) m=1 M 73

75 B = M m=1 ( θ (m) θ ) ( θ(m) θ ) M 1 Ŵ B ( θ θ ) V 1 ( θ θ ) χ 2 k χ 2 k N M Li et al. (1991) F H 0 : θ = θ 0, H 1 : θ θ 0 F p. ( θ θ0 ) V 1 ( θ θ0 ) F = k (R + 1) p = P r ( Fw k > F ) w = 4 + (τ 4) (1 + R = 1 ( k M τ = k (M 1) F k w ) τ (τ > 4 ) R ) tr(bw 1 ) M F F k, = χ 2 /k SAS M PROC MIANALYZE PROC MIANALYZE τ 4 F 2 (Rubin, 1987, p137) w = 1 ( 2 + (k + 1) (M 1) ) R θ θ Lθ L : ( L θ ) Lθ N ( 0, LVL ) ( ) M + 1 LVL = LŴL + L M BL M Ŵ = U (m) m=1 M M ( θ (m) θ ) ( θ(m) ) θ B = m=1 M 1 MI Rubin (1987, p114) Efficiency = ( 1 + γ ) 1 M 74
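To make the combining rules above concrete, here is a small worked example with purely illustrative numbers: M = 3 imputations of a scalar treatment effect give estimates 1.0, 1.2, 0.8 with within-imputation variances 0.04 each.

\[
\bar\theta = \tfrac{1}{3}(1.0 + 1.2 + 0.8) = 1.0, \qquad
\bar W = 0.04, \qquad
B = \frac{(1.0-1.0)^2 + (1.2-1.0)^2 + (0.8-1.0)^2}{3-1} = 0.04,
\]
\[
V = \bar W + \frac{M+1}{M}\,B = 0.04 + \tfrac{4}{3} \times 0.04 \approx 0.093 .
\]

For the efficiency formula quoted above, a fraction of missing information of γ = 0.3 with M = 5 imputations gives (1 + 0.3/5)^{-1} ≈ 0.94.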

76 γ γ M γ 6.6 Molenberghs et al, 2007, p : Rubin (1987) 3 5 B k B Graham et al. (2007) 10% 30% 1% 20 Siddiqui (2011) 40% 10 White et al Royston and White ,000 (2015) MI Rubin (1987, p114) % p p 6.9 Marginal treatment effect ( ) Kenward et al PMM t = 1,..., T l = 1,..., g β lt t = 1,..., T π t β l T β l (marginal treatment effect) = β lt π t, l = 1,..., g t=1 V ar ( β1,..., β ) g = AVA V = ) ( βlt Var O O ) (β 1,.., β g ), A = Var ( βt (β 11,.., β T g, π 1,.., π T ) 75

77 H 0 : β 1 =... = β g = 0 Wald β ( β1,..., β ) (AVA ) 1 1 g. χ 2 g β g 6.10 PMM Ratitch et al. (2013) Mallinckrodt et al. (2013) 1. Controlled/Refernece pattern imputation 2. Delta adjustment pattern imputation 3. AE pattern imputation Ratitch et al. (2013), Mallinckrodt et al. (2013) pattern imputation Controlled imputation MI (Controlled imputation) placebo Multiple Imputation (pmi) 5 2 Y 0, Y 1 Y 0 Y 1 Y 1 R = 0( ), 1 T rt = 0( ), 1 θ X MNAR ( f Y 1 T rt = 0, Y 0, R = 0, X, θ ) f or f ( Y 1 T rt = 0, Y 0, R = 1, X, θ ) ( Y 1 T rt = 1, Y 0, R = 0, X, θ ) ( ) f Y 1 T rt = 1, Y 0, R = 1, X. θ MNAR ( 1) Y 1 f ( Y 1 T rt = 1, Y 0, R = 0, X, θ ) ( = f Y 1 T rt = 0, Y 0, R = 1, X, θ ) Mallinkckrodt (2013) 10.5 (p98-99) t = 1,..., T SAS PROC MI Step 1. t = 0 Step 2. t = t + 1 X 1,, t 1 t Step 3. 1 (t 1) t SAS PROC MI t t 5 pmi 76

78 Step 4. t t < T Step 2 t = T Step 5 Step 5. Step 1 Step 4 M M Step 6. M MMRM Step 7. Rubin (1987) p SAS PROC MIANALYZE Little and Yau (1996) Controlled imputation MNAR MAR controlled imputation 3 estimand 6 "effectiveness" LOCF BOCF reasonable Mallinckrodt (2013) LOCF BOCF 2 / / LOCF BOCF efficacy

pattern imputation SAS

pattern imputation SAS Ratitch et al. (2011) Yuan (2014) PROC MI (SAS Ver 9.4) PMM MNAR MNAR 2 MODEL ADJUST MODEL controlled imputation SAS code 1

PROC MI DATA = XXXX SEED = 14821 NIMPUTE = 100 OUT = result; /* */
  CLASS trt;
  MONOTONE REG(y1); /* */
  MNAR MODEL(y1 / MODELOBS = (trt = 0)); /* trt=0( ) y1 */
  VAR x0 y1; /* x0 */
RUN;

pattern imputation Mallinckrodt et al. (2013) controlled imputation reference-based imputation Jump to reference (J2R) Copy reference (CR) Copy increment method (CIR) Missingdata.org.uk 77
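For Step 7 above, the per-imputation results are pooled with Rubin's rules; a minimal PROC MIANALYZE sketch is the following, assuming (as an illustration) a dataset DIFFS with one row per imputation and variables named Estimate and StdErr holding the treatment contrast and its standard error from each imputed analysis.

proc mianalyze data=diffs;
  modeleffects Estimate;   /* per-imputation point estimates */
  stderr StdErr;           /* corresponding standard errors */
run;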

79 PMM five macro MissingmacroDoc20.docx pattern imputation Kenward estimand MissingmacroDoc20.docx i j (= 1, 2, ..., q) k (= 1, ..., t) Y_ijk r p (= 0, ..., t) p MissingmacroDoc20.docx

E(Y_ijk) = A_jk: A_jk j k
E_p(Y_ijk) = B_pjk: B_pjk p j k

A_jk B_pjk J2R CR CIR

J2R (Jump to Reference)
B_pjk = A_jk (for k ≤ p)
B_pjk = A_rk (for k > p)
r MNAR de facto (effectiveness)

CR (Copy Reference)
B_pjk = A_rk (for all p, j, k)
r

CIR (Copy Increment Reference)
B_pjk = A_jk (for k ≤ p)
B_pjk = A_jp + A_rk - A_rp (for k > p)
r A_rk A_rp r p k p j A_jp r J2R de facto

Delta adjustment pattern imputation

Delta adjustment pattern imputation (, 10 ) estimand efficacy 78

80 effectiveness MNAR MAR tipping point MAR tipping point analysis Y_0, Y_1 MNAR ( 2) Y_1 Y_1

f(Y_1 | Trt = 1, Y_0, R = 0, X, θ) = f(Y_1 + δ | Trt = 1, Y_0, R = 1, X, θ)

MNAR 1 2 SAS ADJUST Delta adjustment pattern imputation SAS code 2

PROC MI DATA = XXXX SEED = 14821 NIMPUTE = 100 OUT = result; /* */
  CLASS trt;
  MONOTONE REG(y1); /* */
  MNAR ADJUST(y1 / SHIFT = ADJUSTOBS = (trt = 1)); /* trt = 1( ) y1 ( -2-1 ) */
  VAR trt x0 y1; /* x0 */
RUN;

AE pattern imputation (Ratitch et al., 2013) estimand effectiveness / (CCMV) BOCF BOCF BOCF MAR pattern imputation 79

81 [1] Ayele, B. T., Lipkovich, I., Molenberghs, G. & Mallinckrodt, C. H. (2014). A multiple-imputation-based approach to sensitivity analysis and effectiveness assesments in longitudinal clinical trials.journal of Biopharmaceutical Statistics. 24, [2] Diggle, P., & Kenward, M. G. (1994). Informative drop-out in longitudinal data analysis. Applied statistics, [3] Dmitrienko, A., Molenberghs, G., Chuang-Stein, C., & Offen, W. W. (2005). Analysis of clinical trials using SAS: A practical guide. SAS Institute. [4] Graham, J. W., Olchowski, A. E. & Gilreath, T. D. (2007). How many imputations are really needed? Some practical clarifications of multiple imputation theory. Prevention Science, 8, [5] Kenward, M. G., Molenberghs, G., & Thijs, H. (2003). Pattern mixture models with proper time dependence. Biometrika, 90(1), [6] Li, K. H., Raghunathan, T. E., & Rubin, D. B. (1991). Large-sample significance levels from multiply imputed data using moment-based statistics and an F reference distribution. Journal of the American Statistical Association, 86(416), [7] Little, R. J. (1993). Pattern-mixture models for multivariate incomplete data. Journal of the American Statistical Association, 88(421), [8] Little, R. J. (1994). A class of pattern-mixture models for normal incomplete data. Biometrika, 81(3), [9] Little, R. J. (1995). Modeling the drop-out mechanism in repeated-measures studies. Journal of the American Statistical Association, 90(431), [10] Little, R. J. & Yau, L. (1996). Intent-to-treat analysis for longitudinal studies with drop-outs. Biometrics, [11] Little, R. J. A. and Rubin, D. B. (2002).Statistical analysis with missing data. 2nd edition. John Wiley & Sons. New York. [12] Mallinckrodt, C. H. (2013).Preventing and treating missing data in longitudinal clinical trials: a practical guide. Cambridge University Press. [13] Mallinckrodt, C., Roger, J., Chuang-Stein, C., Molenberghs, G., O Kelly, M., Ratitch, B.,... & Bunouf, P. (2013). Recent Developments in the Prevention and Treatment of Missing Data. Therapeutic Innovation & Regulatory Science, [14] Molenberghs, G., Michiels, B., Kenward, M. G., & Diggle, P. J. (1998). Monotone missing data and pattern mixture models. Statistica Neerlandica, 52(2), [15] Molenberghs, G., & Kenward, M. (2007). Missing data in clinical studies (Vol. 61). John Wiley & Sons. [16] National Research Council (2010). The Prevention and Treatment of Missing Data in Clinical Trials. Washington, DC: The National Academies Press. [17] O Kelly, M., & Ratitch, B. (2014). Clinical Trials with Missing Data: A Guide for Practitioners. John Wiley & Sons. [18] Ratitch, B., & O Kelly, M. (2011). Implementation of pattern-mixture models using standard SAS/STAT procedures. PharmaSUG, Paper-SP04. 80

82 [19] Ratitch, B., O Kelly, M., & Tosiello, R. (2013). Missing data in clinical trials: from clinical assumptions to statistical analysis using pattern mixture models.pharmaceutical statistics, 12(6), [20] Royston, P., & White, I. R. (2011). Multiple imputation by chained equations (MICE): implementation in Stata.J Stat Softw, 45(4), [21] Rubin, D.B. (1978), Multiple Imputations in Sample Surveys - A Phenomenological Bayesian Approach to Nonresponse, Proceedings of the Survey Research Methods Section of the American Statistical Association, [22] Rubin, D. B. (1987). Multiple imputation for nonresponse in surveys. John Wiley & Sons. [23] Siddiqui, O. (2011). MMRM versus MI in dealing with missing data a comparison based on 25 NDA data sets. Journal of Biopharmaceutical Statistics. 21, [24] Thijs, H., Molenberghs, G., Michiels, B., Verbeke, G., & Curran, D. (2002). Strategies to fit pattern mixture models. Biostatistics, 3(2), [25] Verbeke, G., & Molenberghs, G. (2009). Linear mixed models for longitudinal data. Springer Science & Business Media. [26] Wu, M. C., & Bailey, K. R. (1989). Estimation and comparison of changes in the presence of informative right censoring: conditional linear model. Biometrics, [27] White, I. R., Royston, P., & Wood, A. M. (2011). Multiple imputation using chained equations: issues and guidance for practice. Statistics in medicine, 30(4), [28] Yuan, Y. (2011). Multiple imputation using SAS software. Journal of Statistical Software, 45(6), [29] Yuan, Y. (2014). Sensitivity Analysis in Multiple Imputation for Missing Data. In Proceedings of the SAS Global Forum 2014 Conference. ( [30]. (2015). (2) (Multiple Imputation MI). SAS. 81

83 7 Shared Parameter Model 7.1 Selection Model (SM) Pattern Mixture Model (PMM) 5 6 MAR MNAR MNAR Wu and Carroll (1988) Wu and Bailey (1989) Follmann and Wu (1995) Shared Parameter Model ( SPM) SPM ( ) SPM (Y ij, R ij, X ij ) i(i = 1,, N) j(j = 1,, n) Y ij i j i Y i o Y i m Y i = (Y o, Y m ) R ij 1 j 0 j f(y i, R i, b i X i, θ, ψ) (7.1) X i Y i R i i θ ψ D i D i = j R ij + 1 D i = n + 1 X i θ ψ i q b i = (b i1, b i2,..., b iq ) 0 G i f(y i, R i, b i ) SPM f(y i, R i, b i ) = f(y i R i, b i )f(r i b i )f(b i ) (7.2) SPM Y ij R ij b i f(y i R i, b i ) = f(y i b i ) (7.3) Y i Y i o Y i m b i (7.2) f(y i b i ) = f(y i o b i )f(y i m b i ) (7.4) f(y i, R i, b i ) = f(y i o b i )f(y i m b i )f(r i b i )f(b i ) (7.5) 82

84 SPM share Fitzmaurice et al. (2008) SPM MNAR Y i = (Yi o,ym i ) R ij f(r ij Y o i, Y m b i ) = i f(r ij b i )f(y o i, Y m i b i )f(b i )db i b i f(y o i, Y m i b i )f(b i )db i = f(r ij b i )f(b i Y o i, Y m i )db i (7.6) b i f(b i Y i o, Y i m ) Y i m Y i m SPM MNAR SPM Selection Model (SM) Pattern Mixture Model (PMM) MNAR 7.2 SPM f(y o i,r i ) f(y o i, R i ) = f(y o i, Y m m i, R i, b i )db i dy i y m b = f(y o i, Y m m i R i, b i )f(r i b i )f(b i )db i dy i y m b = f(y o i, Y m m i b i )f(r i b i )f(b i )db i dy i ( (7.3)) y m b = f(y o i b i )f(y m m i b i )f(r i b i )f(b i )db i dy i ( (7.4)) = = y m b b b ( ) f(y o i b i )f(r i b i )f(b i ) f(y m m i b i )dy i db i y m f(y i o b i )f(r i b i )f(b i )db i (7.7) SPM f(y o i b i ) f(r i b i ) f(y o i b i ) y ij (Group i ) (T ime ij ) (T ime ij Group i ) SPM ( ) 2 (a) (b) Creemers et al. (2010) 83

85 b i0, b i1 (a) f(y o i b i ) (7.8) (b) (7.9) Y ij = β 0 + β 1 T ime ij + β 2 Group i + β 3 T ime ij Group i + b i0 + e ij (7.8) Y ij = β 0 + β 1 T ime ij + β 2 Group i + β 3 T ime ij Group i + b i0 + b i1 T ime ij + e ij (7.9) (b i0, b i1 ) N(0, G) (7.8) b i1 G e ij f(r i b i ) Complementary log-log link (c) (d) Creemers et al. (2010) (c) (7.10) (d) (7.11) α 0j, γ 1, γ 2 P r(d i j) = 1 exp( exp(α 0j + γ 1 b i0 )) (7.10) P r(d i j) = 1 exp( exp(α 0j + γ 1 b i0 + γ 2 b i1 )) (7.11) (7.10) 1 (7.12) P r(d i 1) = 1 exp( exp(α 01 + γ 1 b i0 )) (7.12) 1 2 P r(d i 2) P r(d i 1) = 1 exp( exp(α 02 + γ 1 b i0 )) {1 exp( exp(α 01 + γ 1 b i0 ))} logitp r(d i j) = α 0j + γ 1 b i0 (7.13) logitp r(d i j) = α 0j + γ 1 b i0 + γ 2 b i1 (7.14) 7.3 L = f(y o i, R i ) Monte Carlo EM (McCulloch, 1997) Laplace (Gao, 2004) Gauss Gauss SPM SAS PROC NLMIXED PROC NLMIXED SPM Appendix C PROC NLMIXED 84
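As a concrete illustration of such a fit, the following PROC NLMIXED sketch estimates a simplified shared parameter model; it is not the %shared_parameter macro of Section 7.4. To keep the program short, the discrete-time hazard (7.10) is replaced by a single subject-level dropout indicator that shares the random intercept b_i0 with the outcome model (7.8), so the coefficient gamma below plays the role of gamma_1. The input data set inputds (long format with variables id, trt, x0, time, val as in Section 10.2, trt coded 0/1), the assumption of four planned post-baseline visits, and all starting values are assumptions of this sketch.

* 1. Subject-level dropout indicator: 1 = fewer than 4 observed post-baseline values ;
proc sql;
  create table subj as
  select id, trt, x0, (count(val) < 4) as dropflag
  from inputds
  group by id, trt, x0;
quit;

* 2. Stack outcome rows (dv = 1) and one dropout-indicator row per subject (dv = 2) ;
data stacked;
  set inputds(in=a) subj(in=b);
  if a then do; dv = 1; resp = val;      end;
  else       do; dv = 2; resp = dropflag; time = 0; end;
  if not missing(resp);
run;

proc sort data=stacked; by id; run;   * NLMIXED needs records clustered by subject ;

* 3. Joint likelihood: normal outcome model as in (7.8) plus a Bernoulli dropout
     model with complementary log-log link, tied together by the shared intercept u ;
proc nlmixed data=stacked qpoints=10;
  parms b0=0 bx=0 b1=0 b2=0 b3=0 s2=25 a0=-1 gamma=0 su2=10;   * rough starting values ;
  bounds s2 su2 > 0;
  mu  = b0 + bx*x0 + b1*time + b2*trt + b3*trt*time + u;   * outcome mean ;
  eta = a0 + gamma*u;                                      * dropout linear predictor ;
  p   = 1 - exp(-exp(eta));                                * complementary log-log link ;
  if dv = 1 then ll = -0.5*log(2*constant('pi')*s2) - (resp - mu)**2 / (2*s2);
  else           ll = resp*log(p) + (1 - resp)*log(1 - p);
  model resp ~ general(ll);
  random u ~ normal(0, su2) subject=id;
run;

When gamma = 0 the outcome and dropout parts separate, so the magnitude of the estimated gamma gives one informal indication of how strongly dropout is associated with the subject-level random effect.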

86 Gauss Gauss PROC NLMIXED SAS Pinheiro et al. (1995) Unweighted Analysis Approximate Conditional Model Appendix C Fitzmaurice et al. (2008) 7.4 SPM %shared_parameter SPM Little (1995) SPM SPM SM PMM estimand %shared_parameter %shared_parameter %shared_parameter Appendix C MODL, LINK, RANDOM_SLOP E MODL LINK CLOGLOG Complementary log-log link LOGIT RANDOM_SLOP E NONE LINEAR NONE LINEAR %shared_parameter 7.2 SPM %shared_parameter 2 [1] Creemers, A., Hens, N., Aerts, M., Molenberghs, G., Verbeke, G., and Kenward, M. G. (2010). A Sensitivity Analysis for Shared Parameter Models for Incomplete Longitudinal Outcomes. Biometrical Journal, 52(1), [2] Fitzmaurice, G., Davidian, M., Verbeke, G., and Molenberghs, G. (2008). Longitudinal data analysis. CRC Press. [3] Follmann, D., and Wu, M. (1995). An approximate generalized linear model with random effects for informative missing data. Biometrics, 51(1),

87 [4] Gao, S. (2004). A shared random effect parameter approach for longitudinal dementia data with non ignorable missing data. Statistics in Medicine, 23(2), [5] Little, R. J. (1995). Modeling the drop-out mechanism in repeated-measures studies. Journal of the American Statistical Association, 90(431), [6] McCulloch, C. E. (1997). Maximum likelihood algorithms for generalized linear mixed models. Journal of the American statistical Association, 92(437), [7] Pinheiro, J. C., and Bates, D. M. (1995). Approximations to the log-likelihood function in the nonlinear mixedeffects model. Journal of computational and Graphical Statistics, 4(1), [8] Wu, M. C., and Carroll, R. J. (1988). Estimation and comparison of changes in the presence of informative right censoring by modeling the censoring process. Biometrics, 44(1), [9] Wu, M. C., and Bailey, K. R. (1989). Estimation and comparison of changes in the presence of informative right censoring: conditional linear model. Biometrics, 45(3),

88 8 8.1 IPWCC wgee Doubly Robust estimand estimand 3 MAR SM PMM SPM MAR µ µ i Y i R i 1 0 n µ µ = n Y i i=1 n (8.1) µ µ Y i 1 µ µ µ = n R i Y i i=1 n (8.2) R i i=1 8.2 Y i R i = 0 Y i R i Y i = 0 µ MCAR µ µ MNAR 1 Y i R i MCAR MNAR MCAR : P Ri Y i (R i = 1, Y i = y i θ, ψ) = P Ri (R i = 1 ψ) P Yi (Y i = y i θ) 1 MAR Y i 1 87

89 MNAR : P Ri Y i (R i = 1, Y i = y i θ, ψ) = P Ri Y i (R i = 1 Y i = y i, ψ) P Yi (Y i = y i θ) P Ri Y i (R i = 1 Y i = y i, ψ) R i π(y i ) = P Ri Y i (R i = 1 Y i = y i, ψ) (8.3) R i π(y i ) Bernoulli E[R i Y i ] = 1 P (R i = 1 Y i ) + 0 P (R i = 0 Y i ) = P (R i = 1 Y i ) = π(y i ) (8.4) E[R i ] = E[E[R i Y i ]] = E[π(Y i )] (8.5) n 8.2 MCAR : n R i Y i i=1 µ = n = R i i=1 n 1 n i=1 n 1 n R i i=1 R i Y i n E[RY ] E[R] E[Y ] = = E[Y ] = µ (8.6) E[R] E[R] MNAR : (8.4) (8.5) n R i Y i i=1 µ = n = R i i=1 n 1 n i=1 n 1 n R i i=1 R i Y i n E[RY ] E[R] = = E[E[RY Y ]] E[R] E[Y π(y )] E[π(Y )] = E[Y E[R Y ]] E[E[R Y ]] E[Y ] = µ (8.7) µ MCAR µ MNAR µ 8.2 Inverse Probability Weighted Complete-Case (IPWCC) Estimator µ IPWCC (8.2) µ MNAR µ X i W i MAR 88

90 MAR (Tsiatis, 2006) P (R i = 1 Y i = y i, W i = w i, X i = x i ) = P (R i = 1 W i = w i, X i = x i, ψ) (8.8) W i auxiliary variable 2 3 Mallinckrodt, π(w i, X i, ψ) π(w i, X i, ψ) = P (R i = 1 W i = w i, X i = x i, ψ) (8.9) Y i 1/π µ µ IP W = 1 n n R i Y i i=1 π i (W i, X i, ψ) (8.10) ψ µ IP W = 1 n [ ] R i Y i n R i Y i n π i (W i, X i, ψ) E π i (W i, X i, ψ) i=1 [ [ RY ]] = E E Y, W, X π(w, X, ψ) [ ] Y = E E [R Y, W, X] π(w, X, ψ) [ ] Y = E π(w, X, ψ) π(w, X, ψ) = E [Y ] = µ (8.11) π Inverse Probability Weighted Complete-Case (IPWCC) Estimator 8.10 (Robins et al., 1994) IPWCC π MAR IPWCC 0 IPWCC Williamson et al., 2012 P (Y, R) = P (Y )P (R Y ) Selection Model Selection Model IPWCC 2 Collins et al. (2001) a) b) c)

91 1. ψ π(w, X, ψ) = P (R = 1 W, X, ψ) logit {P (R = 1 W, X)} = ψ 0 + Wψ 1 + Xψ 2 2. Y IPWCC µ IP W (Robins et al., 1994) µ IP W = 1 n n R i Y i i=1 π i (W i, X i, ψ) IPWCC estimators for Estimating Equations µ θ n U(Y i, X i, θ) = 0 (8.12) i=1 4 n X i (Y i X iβ) = 0 (8.13) i= Y i U IPW n i=1 n i=1 β IP W R i π i U(Y i, X i, θ) = 0 (8.14) R i π i X i (Y i X iβ) = 0 (8.15) µ IP W i = X i β IP W IPWCC IPWCC 4 Y = X β + ϵ E[ϵ] = 0 ϵ 90
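A bare-bones SAS version of this two-step calculation can be written as follows. The one-record-per-subject data set onepoint and its variables (response y, observed-data indicator r, auxiliary variable w, baseline covariate x) are hypothetical stand-ins for Y, R, W and X above; in a trial they would typically be the final-visit response and its availability indicator.

* Step 1: logistic model for the probability of being observed, pi(W, X, psi-hat) ;
proc logistic data=onepoint;
  model r(event='1') = w x;
  output out=withpi p=pi_hat;   * pi_hat = estimated P(R = 1 given W and X) ;
run;

* Step 2: IPWCC estimate (8.10), averaging R*Y/pi over all n subjects ;
proc sql;
  select sum(case when r = 1 then y / pi_hat else 0 end) / count(*) as mu_ipwcc
  from withpi;
quit;

A standard error for mu_ipwcc is usually obtained either from a sandwich-type variance that accounts for the estimation of psi or from the bootstrap described next, resampling subjects and repeating both steps in each replicate.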

92 ( ) Y1 1. n, Yn n π 1 π n B B 1000 Barker, θ (b) 3. B θ (1), θ (B) θ (2008) Carpenter and Bithell(2000) 8.3 weighted Generalized Estimating Equation (wgee) IPWCC 1 Generalized Estimating Equation GEE GEE IPWCC GEE β Liang and Zeger, 1986; Pan et al., 2000; Pepe and Anderson, 1994 MAR MNAR GEE MAR MNAR "non-ignorable" Hogan et al., 2004; Touloumi et al., 2001 GEE MAR weighted Generalized Estimating Equation (wgee) Robins et al., 1995 observation-specific subject-specific 2 Robins et al., 1995; Preisser et al., 2002; Fitzmaurice et al., 1995; Hogan et al., %WGEE subject-specific Proc GEE observation-specific, 2015) W ij 5 MNAR MAR wgee Mallinckrodt, 2013, 12.4 (2015) wgee wgee wgee observation-specific subject-specific ) Observation-specific (OS) weighted GEE Y i X i β S os (β) = N i=1 µ i β V i(ρ) 1 i (Y i µ i ) = 0 (8.16) i i µ i = E [Y i ] i 5 Inclusive Model 91

93 η i = g(µ i ) = X i V i (ρ) i ρ i Subject-specific (SS) weighted GEE Y o i X i β S ss (β) = N i=1 i (ss) (µo i ) β (Vo i (ρ)) 1 (Y o i µ o i ) = 0 (8.17) i i µ o i = E [Yo i ] i η i = g(µ o i ) = X i V o i (ρ) i (ss) ρ i Observation-specific weighted GEE Observation-specific weighted GEE (8.18) (8.19) i j ϕ ij (W ij, X i, ψ) ψ (j 1) j ϕ ij (W ij, X i, ψ) = P (R ij = 1 R i,j 1 = 1, W ij, X i, ψ) (8.18) logit {P (R ij = 1 R i,j 1 = 1, W ij = w ij, X i = x i, ψ)} = W ij ψ 1 + X i ψ 2 (8.19) i j π ij j = 1 1 j π ij (W i, X i, ψ) = ϕ ik (W ik, X i, ψ) (8.20) k=1 Observation-specific weighted GEE ij = (π ij (W i, X i, ψ)) 1 Y ij i n i n i n i i (j, j) ij 0 MAR (Robins et al., 1995) Subject-specific weighted GEE Subject-specific i observation-specific i subject-specific i (ss) Fitzmaurice et al. (1995) i D i J, i D i = J k=1 R ik + 1 D i k R ik = 1 R ik = 0 D 1 (J + 1) (J + 1) D i d i P (D i = d i W i, X i, ψ) (8.18) (8.19) 92

94 (d i = J + 1) i (ss) = P (D = d i W i, X i, ψ) = P (R ij = 1 W i, X i, ψ) 1 = (π ij (W i, X i, ψ)) 1 (8.21) (d i J) i (ss) = P (D = d i W i, X i, ψ) = P (R idi = 0, R idi 1 = 1 W i, X i, ψ) 1 = ((π idi 1(W i, X i, ψ)) (1 ϕ idi (W idi, X i, ψ))) 1 (8.22) Subject-specific weighted GEE (8.21) (8.22) D i d i P (D = d i W i, X i, ψ) observation-specific weighted GEE MAR (Fitzmaurice et al., 1995) β Subject-specific wgee observation-specific wgee Robin, 1995; Preisser et al., 2002; Fitzmaurice et al., 1995; Hogan et al., 2004; Seaman, 2009 SAS Ver 9.3 PROC GENMOD SS OS Ver 9.4 PROC GEE SS OS (2015) 12 %WGEE subject-specific weighted GEE β 1. (8.18) (8.19) logistic (8.20) (8.21) (8.22) i 2. β β 0 V i (ρ) 4. (8.23) β β r+1 = β r + [ n i=1 µ i β V i(ρ) 1 µ i β ] 1 [ n 5. 3., 4. β i=1 µ i β V i(ρ) 1 i (Y i µ i ) ] (8.23) 6. observation-specific wgee independent PROC GENMOD PROC GEE (Rodriguez, 2014) PROC GENMOD 6 Robin, (1995) Preisser et al., (2002) 8.4 Doubly Robust (DR) IPWCC Y i Y i 6 Proc GENMOD 93

95 Augmented IPW(AIPW) (Rotnitzky et al., 1998) IPWCC MAR µ 1 µ 0 µ 1, µ 0 AIPW (Tsiatis, 2006) µ 1 = 1 n µ 0 = 1 n n i=1 n i=1 R i1 Y i1 π(w i1, X i1, ψ) 1 n R i0 Y i0 π(w i0, X i0, ψ) 1 n { } n R i1 i=1 π(w i1, X i1, ψ) 1 g(w i1, X i1, β) { } n R i0 π(w i0, X i0, ψ) 1 g(w i0, X i0, β) (8.24) i=1 Y 1 IPWCC g(w ij, X ij, β) β Y g(w ij, X ij, β) = E[Y i W ij, X ij ] 8.24 π(w ij, X ij, ψ) Y g(w ij, X ij, β) β Tsiatis, 2006; Cao et al., 2009; Carpenter et al., 2006 Doubly Robust µ 1 µ 0 µ 1 µ j = 0, 1 µ j = 1 n [ n Y ij + R ij π(w ij, X ij, ψ) i=1 π(w ij, X ij, ψ) { Y ij g(w ij, X ij, β) } ] E[ µ 1 ] E[ µ 0 ] β ψ β ψ 1. π(w ij, X ij, ψ) ψ j = 0, 1 [ Y ij + R ij π(w ij, X ij, ψ) π(w ij, X ij, ψ) ] {Y ij g(w ij, X ij, β )} E[ µ j ] = E [ Rij P (R ij = 1 W ij, X ij, ψ) = E[Y ij ] + E P (R ij = 1 W ij, X ij, ψ) [ E[P (Rij = 1 W ij, X ij, ψ)] P (R ij = 1 W ij, X ij, ψ) = E[Y ij ] + E P (R ij = 1 W ij, X ij, ψ) = E[Y ij ] + E [0 {Y ij g(w ij, X ij, β )}] = E[Y ij ] = µ j ] {Y ij g(w ij, X ij, β )} ( (8.5)) ] {Y ij g(w ij, X ij, β )} 2. Y g(w ij, X ij, β) j = 0, 1 [ Y ij + R ij π(w ij, X ij, ψ ) π(w ij, X ij, ψ ) E[ µ j ] = E = [ Rij π(w ij, X ij, ψ ) E[Y ij ] + E π(w ij, X ij, ψ ) 94 ] {Y ij g(w ij, X ij, β)} ] E [{Y ij g(w ij, X ij, β) W ij, X ij }]

96 [ Rij π(w ij, X ij, ψ ) = E[Y ij ] + E π(w ij, X ij, ψ ) = E[Y ij ] + E = E[Y ij ] = µ j [ Rij π(w ij, X ij, ψ ) π(w ij, X ij, ψ ) ] {E [Y ij W ij, X ij ] g(w ij, X ij, β)} ] 0 µ j IPWCC IPWCC (Seaman and Copas, 2009; Belinda, 2013) Doubly Robust Doubly Robust Tsiatis (2006) (2014) µ = n Y i i=1 n µ = n R i Y i n i=1 R i i=1 MAR MNAR IPWCC µ IP W = 1 n n R i Y i i=1 π(w i, X i, ψ) MAR wgee MAR IPWCC 95
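As a concrete illustration of the weighted GEE analysis of Section 8.3, the sketch below uses the MISSMODEL statement of PROC GEE (SAS/STAT 13.2 or later), which, as noted above, provides observation-level weights; an independent working correlation is therefore used, and the REPEATED options follow the PROC GENMOD conventions that PROC GEE shares. The long data set is assumed to contain one row per planned visit for every subject, sorted by id and time, with val missing after dropout, and the dropout-model covariates (the lagged response plus baseline and treatment) are an illustrative choice.

data geeds;
  set inputds;                     * id, trt, x0, time, val for every planned visit ;
  by id;
  prevval = lag(val);              * response at the previous visit ;
  if first.id then prevval = x0;   * use the baseline value before the first visit ;
run;

proc gee data=geeds;
  class id trt time;
  missmodel prevval x0 trt time;   * logistic model for remaining on study, used to build the weights ;
  model val = x0 trt time trt*time / dist=normal;
  repeated subject=id / within=time type=ind;   * independent working correlation ;
run;

The subject-level weights of (8.21) and (8.22) are not produced this way; as stated in Section 8.3, they are available through the %WGEE macro used again in Chapter 12.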

97 Doubly Robust MAR Y Doubly-Robust Double Robust [1] Barker, N. (2005). A practical introduction to the bootstrap using the SAS system. In SAS Conference Proceedings: Phuse 2005: October ; Heidelberg, Germany SAS. [2] Cao, W., Tsiatis, A. A., and Davidian, M. (2009). Improving efficiency and robustness of the doubly robust estimator for a population mean with incomplete data. Biometrika, asp033. [3] Carpenter, J., and Bithell, J. (2000). Bootstrap confidence intervals: when, which, what? A practical guide for medical statisticians. Statistics in medicine, 19(9), [4] Carpenter, J. R., Kenward, M. G., and Vansteelandt, S. (2006). A comparison of multiple imputation and doubly robust estimation for analyses with missing data. Journal of the Royal Statistical Society: Series A (Statistics in Society), 169(3), [5] Collins, L. M., Schafer, J. L., and Kam, C. M. (2001). A comparison of inclusive and restrictive strategies in modern missing data procedures. Psychological methods,6(4), 330. [6] Fitzmaurice, G. M., Molenberghs, G., and Lipsitz, S. R. (1995). Regression models for longitudinal binary responses with informative drop-outs. Journal of the Royal Statistical Society. Series B (Methodological), [7]. (2014).., 62(1), [8] Hogan, J. W., Roy, J., and Korkontzelou, C. (2004). Handling drop out in longitudinal studies. Statistics in medicine, 23(9), [9], (2008), EM MCMC., [10]. (2015). (3)Proc GEE wgee SAS [11] Liang, K. Y., and Zeger, S. L. (1986). Longitudinal data analysis using generalized linear models. Biometrika,73(1), [12] Mallinckrodt, C. H. (2013). Preventing and Treating Missing Data in Longitudinal Clinical Trials. Cambridge Press. [13] Pan, W., Louis, T. A., and Connett, J. E. (2000). A note on marginal linear regression with correlated response data. The American Statistician, 54(3), [14] Pepe, S, M., and Anderson, G. L. (1994). A cautionary note on inference for marginal regression models with longitudinal data and general correlated response data. Communications in Statistics-Simulation and Computation, 23(4),

98 [15] Preisser, J. S., Lohman, K. K., and Rathouz, P. J. (2002). Performance of weighted estimating equations for longitudinal binary data with drop outs missing at random. Statistics in medicine, 21(20), [16] Robins, J. M., Rotnitzky, A., and Zhao, L. P. (1994). Estimation of regression coefficients when some regressors are not always observed. Journal of the American Statistical Association, 89(427), [17] Robins, J. M., Rotnitzky, A., and Zhao, L. P. (1995). Analysis of semiparametric regression models for repeated outcomes in the presence of missing data. Journal of the American Statistical Association, 90(429), [18] Rodriguez, R. N., and Stokes, M. (2014). SAS/STATR 13.1 Round-Up. [19] Rotnitzky, A., Robins, J. M., and Scharfstein, D. O. (1998). Semiparametric regression for repeated outcomes with nonignorable nonresponse. Journal of the American Statistical Association, 93(444), [20] Seaman, S., and Copas, A. (2009). Doubly robust generalized estimating equations for longitudinal data. Statistics in medicine, 28(6), [21] Touloumi, G., Babiker, A. G., Pocock, S. J., and Darbyshire, J. H. (2001). Impact of missing data due to drop outs on estimators for rates of change in longitudinal studies: a simulation study. Statistics in medicine, 20(24), [22] Tsiatis, A. (2006). Semiparametric theory and missing data. Springer. [23] Williamson, E. J., Morley, R., Lucas, A., and Carpenter, J. R. (2012). Variance estimation for stratified propensity score estimators. Statistics in medicine, 31(15),

99 II MMRM

100 9 MMRM 9.1 ( ) LOCF ( ) LOCF complete case (Mallinckrodt et al., 2008) 4 LOCF MMRM(Mixed Model for Repeated Measures) LOCF 2 Rubin(1976) Little & Rubin(2002) 3 MCAR MAR MNAR 2 (Ignorable) (Non-ignorable) Laird(1988) MCAR MAR MNAR Laird & Ware (1982) ( ) ( ) ( Direct Likelifood DL ) MAR( MCAR) MAR Mallinckrodt et al. (2001a; 2001b) "MMRM" MMRM MAR DL MMRM MAR SM 5 99

101 9.2 MMRM MMRM MMRM (Linear Mixed Model LMM) Laird & Ware (1982) LMM 1 Y i = X i β + Z i b i + ϵ i, i = 1,..., N b i N(0, D), ϵ i N(0, Σ i ), (9.1) b 1,..., b N, ϵ 1,..., ϵ N. Y i i n i 2 β p b i q ( ) X i Z i (n i p) (n i q) ϵ i n i D (i, j) d ij d ij = d ji (q q) Σ i (n i n i ) ( 3 ) LMM (General Linear Model LM) ( ) (9.1) Y i N(X i β, V i ), V i = Z i DZ i + Σ i. (9.1) V i Mallinckrodt et al. (2001a; 2001b) MMRM MMRM LMM 4 "MMRM" LMM (Compound Symmetory CS) CS CS CS 1 (1-AutoRegression AR(1) ) Toeplitz ( ) (Unstructured UN ) 2 (Akaike Information Criteria AIC) Bayes (Bayessian Information Criteria BIC) Lu & Mehrotra (2009) Mallinckrodt et al. (2001a;2001b) 1 Laird & Ware (1982) 2 n n i (n i n, i = 1,..., N) 3 4 MMRM 100

102 9.1: MMRM Visit Baseline * * * * * * * Siddiqui (2011) III UN MMRM MMRM (Mallinckrodt et al., 2001b) Laird & Ware (1982) Cnaan et al. (1997) Verbeke & Molenberghs(1997) 9.3 SAS MMRM LMM SAS PROC MIXED MMRM 9.2 MMRM PROC MIXED DATA=datasetName; CLASS TREATMENT TIME_POINT SUBJECT_ID; MODEL RESPONSE = BASELINE TREATMENT TIME_POINT TREATMENT*TIME_POINT / DDFM=KR; LSMEANS TREATMENT*TIME_POINT / ALPHA=0.05 CL DIFF=CONTROL( PLACEBO, tn ); REPEATED TIME_POINT / SUBJECT=SUBJECT_ID TYPE=UN; RUN; 5 UN 101

103 9.2: MMRM SUBJECT_ID TREATMENT BASELINE TIME_POINT RESPONSE A-01 Drug 1234 t A-01 Drug 1234 t A-02 Placebo 1234 t A-03 Placebo 1234 t PROC MIXED SAS PROC MIXED PROC MIXED MMRM EMPIRICAL SCORING EMPRICAL ( ) (Huber, 1967; Liang & Zeger, 1986) UN (CS ) EMPRICAL 6 PROC MIXED (Restricted Maximum Likelihood Method REML) Newton-Raphson UN (Lu & Mehrotra, 2009) PROC MIXED Newton-Raphson Fisher s scoring SCORING SCORING= Fisher s Scoring Newton-Raphson Fisher s Scoring SCORING= CLASS CLASS 6 MCAR (Mallinckrodt, 2013) 102

104 9.3.3 MODEL MODEL ( ) Dinh & Yang (2011) MODEL DDFM PROC MIXED 7 Satterthwaite Kenward-Roger (KR) (Kenward & Roger, 1997) LSMEANS LSMEANS TREATMENT TIME_POINT (ALPHA ) (CL ) (DIFF ) ADJUST REPEATED REPEATED ϵ i Σ i SUBJECT TYPE 8 ( ) REPEATED Σ i R Σ i RCORR RANDOM MMRM RANDOM MMRM MMRM UN UN ( ) 10 REML REPEATED RANDOM 7 8 CS AR(1) Toeplitz UN SAS HELP 9 i Σ i "R=i" "RCORR=i" 10 (Identifiability) 103
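Section 9.2 recommends choosing among candidate covariance structures by information criteria such as AIC and BIC when an unstructured matrix cannot be supported. One convenient way to organize that comparison is sketched below, reusing the data set and variable names of Table 9.2; the macro is only a looping device and the candidate list is illustrative.

%macro fit_cov(type, tag);
  ods output FitStatistics=fit_&tag;
  proc mixed data=datasetName;
    class TREATMENT TIME_POINT SUBJECT_ID;
    model RESPONSE = BASELINE TREATMENT TIME_POINT TREATMENT*TIME_POINT / ddfm=kr;
    repeated TIME_POINT / subject=SUBJECT_ID type=&type;
  run;
%mend fit_cov;

%fit_cov(un,          un);    * unstructured ;
%fit_cov(toep,        toep);  * Toeplitz ;
%fit_cov(cs,          cs);    * compound symmetry ;
%fit_cov(%str(ar(1)), ar1);   * first-order autoregressive ;

data fit_all;                 * stack the fit statistics for a side-by-side comparison ;
  length structure $8;
  set fit_un(in=a) fit_toep(in=b) fit_cs(in=c) fit_ar1(in=d);
  if a then structure = 'UN';
  else if b then structure = 'TOEP';
  else if c then structure = 'CS';
  else structure = 'AR(1)';
run;

proc print data=fit_all noobs;
  where index(Descr, 'AIC') > 0 or index(Descr, 'BIC') > 0;
run;

Because the fixed-effects part of the model is identical in every fit, the REML-based AIC and BIC values are directly comparable; heterogeneous variants such as those listed in Section 10.3 can be added to the candidate list in the same way.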

105 Hessian Appendix A A MMRM Mallinckrodt et al. (2008) MMRM MAR ( MNAR ) MAR MMRM UN REML MAR (LaVange, 2014) "MMRM" Mallinckrodt et al. (2008) MMRM (REML) ( ) Kenward-Roger α = 0.05 ( ) 2 ( ) MMRM MAR Collins et al. (2001) Mallinckrodt et al. (2013) "Restrictive Model" "Inclusive Model" 10 MAR (Multiple Imputaion MI) (weighted Generalized Estimating Equation wgee) MMRM MMRM Lu et al. (2008) 104

106 J n 1, n 2,..., n J n 1 n 2 n J n J+1 = 0 i Y i = (y i1,..., y ij ) µ = (µ 1,..., µ J ) Σ Σ (j, k) σ jk (j, k = 1,..., J) r j = n j /n 1 (j = 1,..., J) j ( ) r = (r 1,..., r J ) MMRM µ µ µ I 1 ( ) J Σ 1 j 0 j (J j) I = (n j n j+1 ) 0 (J j) j 0 (J j) (J j) j=1 Σ j Σ (j j) 0 kl 0 (k l) (l = 0 ) j j j ( ) J I R 1 j 0 j (J j) = (r j r j+1 ) 0 (J j) j 0 (J j) (J j) j=1 R j R (j j) R (j, k) σ jk /(σ j σ k ) (j, k = 1,..., J) σ j = σ 1/2 jj (9.2) Var( µ J ) = ϕ(σj 2/n 1) ϕ = [I ] 1 JJ µ J ϕ Lu et al. (2008) ϕ "Inflation factor" (9.3) ϕ r R Lu et al. (2008) (9.2) (9.3) 9.6 MAR SM MMRM MNAR MAR Mallinckrodt et al. (2001b) MNAR MMRM (Yuan & Little, 2009) MNAR MMRM MAR MNAR (Mallinckrodt et al., 2004; Shen et al., 2006; Mallinckrodt et al., 2008) [1] Collins, L. M., Schafer, J. L. & Kam, C. M. (2001). A comparison of inclusive and restrictive strategies in modern missing data process. Psychological Methods, 6(4), [2] Cnaan, A., Laird, N. M. & Slasor, P. (1997). Using the general linear mixed model to analysis unbalanced repeated measures and longitudinal data. Statistics in Medicine., 16,

107 [3] Dinh, P. & Yang, P. (2011). Handling baselines in repeated measures analysis with missing data at random. Journal of Biopharmaceutical Statistics, 21, [4] Huber, P. J. (1967). The behavior of maximum likelihood estimation under nonstandard conditions. Proceeding of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, 1, LeCam, L. M. and Neyman, J. editors. University of California Press, [5] Kenward, M. G. & Roger, J. H. (1997). Small sample inference for mixed effects from restricted maximum likelihood. Biometrics, 53(3), [6] Laird N. M. (1988). Missing Data in Longitudinal Studies. Stat. Med., [7] Laird, N. M. & Ware, J. H. (1982). Random-effects models for longitudinal data. Biometrics, 38(4), [8] LaVange, L. M. (2014). The role of statistics in regulatory decision making. Therapeutic Innovation & Regulatory Science, 48(1), [9] Liang, Kung-Yee, & Zeger, S. L. (1986). Longitudinal data analysis using generalized linear models. Biometrika, 73(1), [10] Little, R. J. A. & Rubin, D. B. (2002). Statistical analysis with missing data, 2nded. New York: Wiley. [11] Lu, K., Luo, X. & Chen, P. Y. (2008). Sample size estimation for repeated measures analysis in randomized clinical trials with missing data. The International Journal of Biostatistics, 4(1), [12] Lu, K. & Mehrotra D. V. (2009). Specification of covariance structure in longitudinal data analysis for randomized clinical trials. Statistics in Medicine, 29, [13] Mallinckrodt, C. H. (2013). Preventing and Treating Missing Data in Longitudinal Clinical Trials. Cambridge University press. [14] Mallinckrodt, C. H., Clark, W. S. & David, S. R. (2001a). Accounting for dropout bias using mixed-effects models. Journal Biopharmaceutical Statistics, 11, [15] Mallinckrodt, C. H., Clark, W. S. & David, S. R. (2001b). Type I error rates from mixed effects model repeated measures versus fixed effects anova with missing values imputed via last observation carried forward. Drug Information Journal, 35, [16] Mallinckrodt, C. H., Kaiser, C. J., Watkin, J. G., Molenberghs, G. & Carroll R. J. (2004). The effect of correlation structure on treatment contrasts estimated from incomplete clinical trial data with likelihood-based repeated measures compared with last observation carried forward ANOVA. Clinical trials (London, England), 1, [17] Mallinckrodt, C. H., Lane, P. W., Schnell, D., Peng, Y. & Maucuso, J. P. (2008). Recommendations for the primary analysis of continuous endpoints in longitudinal clinical trials. Drug Information Journal, 42, [18] Mallinckrodt, C. H., Roger, J., Chuang-Stein, C., Molenberghs, G., O Kelly, M., Ratitch, B., Janssens, M. & Bunouf, P. (2013) Recent developments in the prevention and treatment of missing data. Therapeutic Innovation & Regulatory Science, published online. [19] Rubin, D. B. (1976). Inference and missing data. Biometrika, 63, [20] Shen, S., Beunckens, C., Mallinckrodt, C. & Molenberghs, G. (2006). A local influence sensitivity analysis for incomplete longitudinal depression data. J. Biopharm. Stat., 16, [21] Siddiqui, O. (2011). MMRM versus MI in dealing with missing data - A comparison based on 25 NDA data sets. Journal of Biopharmaceutical Statistics, 21,

[22] Verbeke, G. & Molenberghs, G. (1997). Linear Mixed Models in Practice: A SAS-Oriented Approach. New York: Springer-Verlag.
[23] Yuan, Y. & Little, R. J. A. (2009). Mixed-effect hybrid models for longitudinal data with nonignorable dropout. Biometrics, 65,
[24] (2001). ( ).

109 NRC(2010) 3 Mallinckrodt (2013) estimand Analytic Road Map 10.1 Mallinckrodt(2013) Analytic Road Map 2 (1) 9.2 MMRM (2) MAR (3)estimand estimand 3 efficacy (1) (3) (1) (2) MAR (3) estimand effectiveness Analytic Road Map Mallinckrodt (2013) 14 NRC(2010) 5 (1) (2) (3) estimand Mallinckrodt (2013) 10.1: Analytic Road Map 1 ICH E9 (R1) 2 MMRM DL Diagnostics: residuals, influence, correlation, time 108

110 10.2 III HAM-D (12 ) 20% 4 12 estimand estimand : HAM-D SD 10.1: HAM-D SD (3 ) (6 ) (9 ) (12 ) (SD) (SD) (SD) (SD) (SD) (4.1) (4.2) (5.4) (6.3) (6.8) SAS (4.2) (4.2) (4.6) (6.3) (6.1) 10.3: 3 109

111 id : trt : x0 : time : val : 9 MMRM SAS HAM-D Unstructured REML (1) Fisher s scoring (2) Toeplitz, Heterogeneous CS AR(1) CS VC Kenward Roger 4 0 5% ODS OUTPUT DIFFS = diffs; PROC MIXED DATA = inputds; RUN; CLASS trt time id ; * ; MODEL val = x0 trt time trt*time / DDFM = kr; * = ; REPEATED time / TYPE = un SUBJECT = id; LSMEANS trt*time / DIFF; % 10.1 Analytic Road Map : 4 SE p (SE) MMRM (0.69) (0.70)
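The primary comparison in Table 10.1 is the between-group difference at the final time point, and it can be read directly from the diffs data set created by the ODS OUTPUT statement above. The column names below are those of the PROC MIXED Diffs table; only the rows that compare the two groups at the same visit are kept.

proc print data=diffs noobs;
  where time = _time;   * same-visit treatment contrasts only ;
  var trt _trt time Estimate StdErr DF tValue Probt;
run;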

112 10.3: estimand efficacy 3 (primary) MAR MMRM 1 efficacy MAR MMRM ( ) MNAR SM ( ) efficacy PMM MNAR MMRM 3 pmi (estimand) effectiveness 6 (secondary) MNAR + MMRM Mallinckrodt (2013) 12.3 " " " " " " " " NRC (2010) 4 (p49) Mallinckrodt (2013) (auxiliary variables) MNAR MAR Mallinckrodt (2013) Collins et al. (2001) "Restrictive Model" "Inclusive Model" Restrictive Model Inclusive Model MI wgee MMRM (Unstructured) Unstructured Toeplitz Heterogeneous CS AR(1) CS VC Unstructured AIC 9 111

113 MMRM SAS MIXED PROCEDURE RESIDUAL (1) (2) Mallinckrodt (2013) p134 Student 2 Appendix A Student Student 2 31 MMRM 10.4 SE MMRM Cook D MIXED PROCEDURE INFLUENCE Cook D Cook D 0.03 Cook D 10.4 Cook D MMRM 10.4 local influence Verbeke and Molenberghs (2000) Molenberghs and Kenward (2007) 10.4: Student Cook D 112
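The residual and influence summaries shown in Figure 10.4 and Table 10.4 can be requested directly on the Section 10.2 model through the RESIDUAL and INFLUENCE options of the MODEL statement, as in Appendix A; the subject effect and the ITER= setting below mirror the Appendix A example and are otherwise illustrative.

ods graphics on;
proc mixed data=inputds;
  class trt time id;
  model val = x0 trt time trt*time / ddfm=kr
        residual                        /* panels of raw and studentized residuals */
        influence(effect=id iter=5);    /* subject-level influence statistics such as Cook D */
  repeated time / type=un subject=id;
  lsmeans trt*time / diff;
run;
ods graphics off;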

114 10.4: SE Student Cook D MCAR MAR MNAR MCAR (Mallinckrodt, 2013) MNAR MAR (Little, 1995) 9 (MMRM) MNAR MAR MAR (Mallinckrodt, 2013) Type (i) Type (ii) Type (i) (ii) (NRC, 2010, p85) Type (ii) (testable) Type (i) (untestable) MAR MNAR Type (i) MAR MNAR NRC(2010) 5 MAR MNAR MAR (Recommendation 15) MNAR PMM SM SPM SAS NRC(2010) Mallinckrodt (2013) SM PMM SPM SM SM (10.1) (10.2) ψ 5 ψ 6 Y ij Y ij ψ 5 ψ 6 0 MAR 0 MNAR ψ 5 ψ 6 Mallinckrodt et al.,

115 SM Type (i) f(y i, R i θ, ψ) = f(y i θ) f(r i Y i, ψ) (10.1) 1 logit{p r(r ij = 0 R i1 = 1,, R i,j 1 = 1, Y i, X i, ψ)} = ψ 1 + ψ 3 Y i,j 1 + ψ 5 Y ij 2 logit{p r(r ij = 0 R i1 = 1,, R i,j 1 = 1, Y i, X i, ψ)} = ψ 2 + ψ 4 Y i,j 1 + ψ 6 Y ij (10.2) SM %SM_GridSearch %SM_GridSearch( psi5grid = , psi6grid = , const = 4, inputds = inputds, * ; covtype response modl clasvar mech = un, = val, derivative = 1, method = x0 trt time trt*time, = trt time, = MNARS, = %str(qn), out1 = &h._sp_estimates, * ; out2 = &h._sp_diffs, * ; out3 = &h._sp_lsmeans, * lsmeans ; debug = 0 ); 10.5: (SM) 4 (SE) SE (0.72) -9.91(0.72) (0.71) -9.84(0.71) (0.70) -9.74(0.71) (0.70) -9.58(0.70) (0.69) -9.31(0.70) SM (0.69) -8.96(0.70) (0.71) -8.60(0.72) (0.73) -8.29(0.74) (0.75) -8.09(0.75) (0.76) -8.00(0.76) (0.76) -7.94(0.76)

116 10.5: SE SM 10.6: SE (SM) PMM PMM Y 1, Y 2, Y 3, Y NRC(2010) 5 PMM (6 ) Case 7 NFMV+ACMV+ g s 1 (Y s Y 1,..., Y s 1 ) = f s (Y s Y 1,..., Y s 1 ) T = ω sj f j (Y s Y 1,.., Y s 1 ) (10.3) j=s +4 NFMV ω 43 + ω 44 = 1, ω 32 + ω 33 + ω 34 = 1 δ = ω 43 ω 44 = 1 δ f 4 (1234) = f 4 (1234) f 3 (1234) = f 3 (123) f 3 (4 123) = f 3 (123) g 3 (4 123) f 2 (1234) = f 2 (12) f 2 (3 12) f 2 (4 123) = f 2 (12) g 2 (3 12)f 3 (4 123) ( NFMV) = f 2 (12) g 2 (3 12) [ω 43 f 3 (4 123) + ω 44 f 4 (4 123)] [ ] = f 2 (12) g 2 (3 12) δ g 3 (4 123) + (1 δ) f 4 (4 123) f 1 (1234) = f 1 (1) f 1 (2 1) f 1 (34 12) = f 1 (1) g 1 (2 1) f 1 (3 12) f 1 (4 123) = f 1 (1) g 1 (2 1)f 2 (3 12) f 3 (4 123) ( NFMV) = f 1 (1) g 1 (2 1) [ω 32 f 2 (3 12) + ω 33 f 3 (3 12) + ω 34 f 4 (3 12)] [ω 43 f 3 (4 123) + ω 44 f 4 (4 123)] = f 1 (1) g 1 (2 1) [ω 32 f 2 (3 12) + ω 33 f 3 (3 12) + ω 34 f 4 (3 12)] 115 ( ω 43 = δ, ω 44 = 1 δ)

117 [ ] δ g 3 (4 123) + (1 δ) f 4 (4 123) ( ω 43 = δ, ω 44 = 1 δ) [ ] = f 1 (1) g 1 (2 1) ω 32 g 2 (3 12) + ω 33 f 3 (3 12) + ω 34 f 4 (3 12) [ ] δ g 3 (4 123) + (1 δ) f 4 (4 123) f 0 (1234) = f 0 (1) f 0 (2 1) f 0 (34 12) = g 0 (1)f 0 (2 1) f 0 (3 12) f 0 (4 123) = g 0 (1)f 1 (2 1)f 2 (3 12)f 3 (4 123) = g 0 (1) [ω 21 f 1 (2 1) + ω 22 f 2 (2 1) + ω 23 f 3 (2 1) + ω 24 f 4 (2 1)] [ ] ω 32 g 2 (3 12) + ω 33 f 3 (3 12) + ω 34 f 4 (3 12) [ ] ω 43 g 3 (4 123) + (1 ω 44 ) f 4 (4 123) ( NFMV) = g 0 (1) [ω 21 f 1 (2 1) + ω 22 f 2 (2 1) + ω 23 f 3 (2 1) + ω 24 f 4 (2 1)] [ ] ω 32 g 2 (3 12) + ω 33 f 3 (3 12) + ω 34 f 4 (3 12) [ ] δ g 3 (4 123) + (1 δ) f 4 (4 123) ( ω 43 = δ, ω 44 = 1 δ) [ ] = g 0 (1) ω 21 g 1 (2 1) + ω 22 f 2 (2 1) + ω 23 f 3 (2 1) + ω 24 f 4 (2 1) [ ] ω 32 g 2 (3 12) + ω 33 f 3 (3 12) + ω 34 f 4 (3 12) [ ] δ g 3 (4 123) + (1 δ) f 4 (4 123) (10.4) α 3 f 3 (123) δ = ω 43 = α 3 f 3 (123) + α 4 f 4 (123), 1 δ = ω α 4 f 4 (123) 44 = α 3 f 3 (123) + α 4 f 4 (123) α 2 f 2 (12) ω 32 = α 2 f 2 (12) + α 3 f 3 (12) + α 4 f 4 (12), ω α 3 f 3 (12) 33 = α 2 f 2 (12) + α 3 f 3 (12) + α 4 f 4 (12), α 4 f 4 (12) ω 34 = α 2 f 2 (12) + α 3 f 3 (12) + α 4 f 4 (12), α 1 f 1 (1) ω 21 = α 1 f 1 (1) + α 2 f 2 (1) + α 3 f 3 (1) + α 4 f 4 (1), ω α 2 f 2 (1) 22 = α 1 f 1 (1) + α 2 f 2 (1) + α 3 f 3 (1) + α 4 f 4 (1), α 3 f 3 (1) ω 23 = α 1 f 1 (1) + α 2 f 2 (1) + α 3 f 3 (1) + α 4 f 4 (1), ω α 4 f 4 (1) 24 = α 1 f 1 (1) + α 2 f 2 (1) + α 3 f 3 (1) + α 4 f 4 (1) g 1 (2 1) g 2 (3 12) g 3 (4 123) ACMV g s 1 (Y s Y 1,..., Y s 1 ) = f s (Y s Y 1,..., Y s 1 ) (10.5) g 0 (1) = f 1 (1 ) g 1 (2 1) = f 2 (2 1) g 2 (3 12) = f 3 (3 12) (10.6) g 3 (4 123) = f 4 (4 123) 116

118 MAR efficacy estimand 3 4 NRC(2010) PMM 5 E[Y 1 R 1] f 1 (1) E[Y 2 Y 1, R 2] f 2 (2 1) R E[Y 1 R 1] = µ 1 E[Y 2 Y 1, R 2] = µ 2 + Y 1 β 2 E[Y 3 Y 2, R 3] = µ 3 + Y 2β 3 (10.7) E[Y 4 Y 3, R = 4] = µ 4 + Y 3β 4 NFMV E[Y 2 Y 1, R = 0] = E[Y 2 Y 1, R 1] E[Y 3 Y 2, R = 0] = E[Y 3 Y 2, R = 1] = E[Y 3 Y 2, R 2] E[Y 4 Y 3, R = 0] = E[Y 4 Y 3, R = 1] = E[Y 4 Y 3, R = 2] = E[Y 4 Y 3, R 3] g (10.6) E[Y 1 R = 0] = µ 1 + E[Y 2 Y 1, R = 1] = µ Y 1 β 2 E[Y 3 Y 2, R = 2] = µ Y 2β 3 (10.8) E[Y 4 Y 3, R = 3] = µ Y 3β 4 E[Y 1 R = 0] g 0 (1) = f 1 (1 ) 6. PMM SM SM PMM SM MAR PMM MAR SM PMM PMM (%delta_pmm) ACMV effectiveness 5 NRC(2010) 6 (10.6) (10.8) ( ) Y2 f 2 Y2 Y 1 dy2 = µ E[Y 2 Y 1, R = 1] g 1 (2 1) E[Y 2 Y 1, R = 1] = = = Y 2 g 1 (2 1) dy ( Y 2 f 2 Y2 ) Y1 dy2 ( (Y 2 )f 2 Y2 ) Y 1 dy2 + f 2 ( Y2 Y 1 ) dy2 = µ

119 %delta_pmm(datain = inputds, trtname = trt, subjname = id, visname = time, basecont = %str(v0), postcont = %str(val), seed = , nimp = 10, deltavis = %str(first), deltacont = %str(&delta), deltacontarm = %str(0,1), deltacontmethod = %str(meanabs), favorcont = %str(&fav), primaryname = val, analcovarcont = %str(v0), analmethod = mmrm, repstr = %str(un), trtref = 0, dataout = data0imputed1, resout = pmmresults1 ); &deltavis 1 ("FIRST") ("ALL") &delta &fav 6 (&delta &fav = (3, high, -3.0) (2, high, -2.0) (1, high, -1.0) (0, high, 0.0) (1, low, 1.0) (2, low, 2.0) (3, low, 3.0) 10.7: SE PMM 10.8: SE (PMM) 118
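The %delta_pmm macro packages the imputation, the delta adjustment, the analysis and the combination; for reference, the core delta-adjusted imputation step can also be written directly with the MNAR statement of PROC MI (SAS/STAT 13.1 or later), in the style of Yuan (2014) cited in Chapter 6. The sketch below assumes the data have been transposed to one record per subject (baseline x0 and visit-wise values val1-val4) with monotone missingness, that trt is a two-level class variable with the test group coded '1', and that the shift is applied only to the final visit; the seed is arbitrary. The macro generalizes this to the first or to every missing visit (deltavis=FIRST/ALL) and over a grid of delta values.

proc mi data=widedat out=delta_imp nimpute=10 seed=71562;
  class trt;
  var trt x0 val1 val2 val3 val4;
  monotone reg;                                      * sequential regression imputation ;
  mnar adjust(val4 / shift=3 adjustobs=(trt='1'));   * add delta to imputed values in the test group ;
run;

Each completed data set is then analyzed and the results combined by Rubin's rules exactly as in Chapter 6, and the whole step is repeated over the range of delta values to produce a display such as Figure 10.7. The sign and size of the shift should be chosen so that the adjustment is unfavorable to the test group for the endpoint in question.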

120 10.6: (PMM) 4 (SE) SE (0.69) -9.18(0.71) (0.69) -9.09(0.71) (0.69) -9.00(0.71) PMM NFV (NCMV) (0.69) -8.91(0.72) (0.69) -8.83(0.72) (0.69) -8.74(0.72) (0.70) -8.65(0.73) SPM 7 SPM (10.9) MNAR SPM MAR f(y i, R i, b i ) = f(y i o b i )f(y i m b i )f(r i b i )f(b i ) (10.9) 7 (10.10) Complementary log-log link Complementary log-log link (10.11 ) Y ij = β 0 + β 1 T ime ij + β 2 Group i + β 3 T ime ij Group i + b i0 + e ij (10.10) P r(d i j) = 1 exp( exp(α 0j + γ 1 b i0 )) (10.11) SPM SE %SHARED_PARAMETER(INPUTDS = inputds, SUBJVAR = id, TRTVAR = trt, TIME = time, MODL = %STR(val = x0 trt time trt*time), LINK = CLOGLOG, RANDOM_SLOPE = %STR(NONE), * ; DEBUG = 0 ); &LINK "CLOGLOG" Complementary log-log link "LOGIT" &RANDOM_SLOPE "NONE" "LINEAR" 119

121 : SPM 2 4 SE (SE) SPM Estimand 3 3 estimand efficacy effectiveness estimand efficacy estimand effectiveness estimand estimand Mallinckrodt (2013) 14 pmi MMRM %cbi_pmm %cbi_pmm(datain = inputds, trtname = trt, subjname = id, visname = time, basecont = %str(v0), baseclass =, postcont = %str(val), postclass =, seed = , nimp = 100, primaryname = val, analcovarcont = %str(v0), trtref = 0, analmethod = mmrm, repstr = un, dataout = data0imputed2, resout = pmmresults2 ); 120
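The %cbi_pmm macro implements this placebo-based multiple imputation (pmi); a stripped-down version of the imputation step itself can also be written with the MODEL option of the PROC MI MNAR statement, which builds the imputation model for the listed variables from a chosen reference group only. As before, the wide-format data set widedat, the variable names, the coding trt='0' for the reference (placebo) group and the seed are assumptions of the sketch; nimpute=100 matches the nimp=100 setting of the macro call above.

proc mi data=widedat out=pmi_imp nimpute=100 seed=30103;
  class trt;
  var trt x0 val1 val2 val3 val4;
  monotone reg;
  mnar model(val1 val2 val3 val4 / modelobs=(trt='0'));   * impute all groups from the placebo-group model ;
run;

The completed data sets are then analyzed with the same MMRM (or ANCOVA) model and combined by Rubin's rules, which is what the macro automates through its analmethod and related arguments.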

122 10.8: pmi MMRM 4 SE (SE) pmi (0.70) (0.71) MMRM (1) (2) (3)estimand : MMRM estimand MMRM SM PMM+MMRM SPM pmi+mmrm estimand (NRC,2010; Mallinckrodt, 2013) [1] Collins, L. M., Schafer, J. L., and Kam, C. M. (2001). A comparison of inclusive and restrictive strategies in modern missing data procedures. Psychological methods, 6(4), 330. [2] Little, R. J. A. (1995). Modeling the drop-out mechanism in repeated measure studies. Journal of American statistical Association, 90(431): [3] Mallinckrodt, C. H. (2013). Preventing and Treating Missing Data in Longitudinal Clinical Trials. Cambridge Press. [4] Mallinckrodt, C., Roger, J., Chuang-Stein, C., Molenberghs, G., O Kelly, M., Ratitch, B., and Bunouf, P. (2013). Recent developments in the prevention and treatment of missing data. Therapeutic Innovation and Regulatory Science, [5] Molenberghs, G., and Kenward, M. G. (2007). Missing Data in Clinical Studies. Wiley. 121

123 [6] National Research Council. (2010). The Prevention and Treatment of Missing Data in Clinical Trials. The National Academies Press. [7] Verbeke, G., and Molenberghs, G. (2000). Linear Mixed Models for Longitudinal Data. Springer. 122

124 11 Estimand 11.1 Analytic Road Map Estimand 3 Mallinckrodt (2013) 6 estimand 1 estimand estimand estimand Major Depression Disorder, MDD 123

125 11.1: Mallinckrodt (2013) MDD 6 estimand Estimand rescue medication 1 effectiveness MMRM ANCOVA 2 efficacy (MAR) MMRM, PMM (ACMV), MI-ANCOVA, MI-MMRM, wgee (MNAR) SM, PMM (ACMV ), SPM 3 efficacy (MAR) MMRM, PMM (ACMV), MI-ANCOVA, MI-MMRM, wgee (MNAR) SM, PMM (ACMV ), SPM 4 effectiveness t, ANOVA, ANCOVA, LMM 5 effectiveness t, ANOVA, ANCOVA, LMM 6 effectiveness pmi ANCOVA, pmi MMRM BOCF ANCOVA, BOCF MMRM 124

126 Estimand 3 10 primary estimand estimand 3 MAR MMRM 9 PMM (ACMV) 6 MI ANCOVA MI MMRM (6 ) wgee 8 12 MNAR SM 5 PMM ACMV ) 6 SPM 7 LOCF ANCOVA Estimand 2 active run-in estimand 2 estimand 3 estimand 3 LOCF ANCOVA estimand Estimand 1 Estimand 1 MMRM 9 ANCOVA 3 Rescue medication controlled imputation Mallinckrodt (2013) 11 2 LOCF ANCOVA LOCF 3 LOCF LOCF 125

127 Estimand 6 Estimand 6 effectiveness estimand rescue medication rescue medication BOCF (4, 6 ) controlled imputation (pmi) (6 ) Estimand 4 1 AUC 1 2 t ANOVA ANCOVA LMM Appendix A Estimand 5 estimand t ANOVA ANCOVA LMM Appendix A 11.3 Analytic Road Map 10 Analytic Road Map 126

11.1: Analytic Road Map (decision flow: estimand, efficacy / effectiveness, MAR / MNAR)

[1] Mallinckrodt, C. H. (2013). Preventing and treating missing data in longitudinal clinical trials. Cambridge University Press.

129 III III estimand estimand 3 efficacy MAR 12.1 Analytic Road Map Mallinckrodt, 2013 MMRM MI wgee 3 α 2 LOCF ANCOVA OC ANCOVA 12.1: Analytic Road Map 1 III MAR MNAR MAR MNAR MNAR 2 III MCAR MAR MCAR MNAR 2 128

130 MAR MNAR 10,000 α III HAM-D 2 g g = 1 g = / 4 4 (y 0 y i i i = 1, 2, 3, 4) SD 12.1: SD SD (4.0) 18.0 (5.0) 15.0 (5.0) 12.0 (6.0) 9.0 (6.0) 20.0 (4.0) 18.0 (5.0) 16.0 (5.0) 14.0 (6.0) 12.0 (6.0) 12.2: 129

131 12.2: : % 5% 10% 13% 15% α SD H 0 : µ A4 = µ P 4 µ A4 4 µ P 4 4 α 12.4: α SD : α 130

132 SD : 12.5: SD SD SD SD SD SD (4.1) (4.4) (5.7) (6.6) (6.9) (4.2) (4.5) (4.7) (6.5) (6.2) Bernoulli MAR MNAR y 0 y i i, i = 1, 2, 3, 4 MAR p i = 1 MNAR p i = exp( y i 1 ) exp( y i y i ) 131
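To make the set-up concrete, one replication can be generated with a DATA step of the following form. The visit means and standard deviations are those of Table 12.1, but the within-subject correlation (induced here by a subject-level random intercept), the number of subjects per group, and the coefficients of the logistic dropout model are illustrative placeholders rather than the settings used for the reported results.

data sim1;
  call streaminit(20151);
  array mu{2,5} _temporary_ (20 18 15 12  9      /* group 1 visit means, Table 12.1 */
                             20 18 16 14 12);    /* group 2 visit means, Table 12.1 */
  array sd{5}   _temporary_ ( 4  5  5  6  6);    /* visit standard deviations       */
  do g = 1 to 2;                                 /* treatment group                 */
    do id = 1 to 100;                            /* subjects per group: illustrative */
      b = rand('normal', 0, 3);                  /* random intercept, induces within-subject correlation */
      dropped = 0;
      do visit = 1 to 5;                         /* visit 1 = baseline y0 */
        e = rand('normal', 0, sqrt(max(sd{visit}**2 - 9, 1)));
        y = mu{g, visit} + b + e;                /* marginal mean and SD match Table 12.1 */
        if visit > 1 and dropped = 0 then do;
          p_drop = 1 / (1 + exp(3 - 0.1*yprev)); /* MAR: depends on the previous observed value */
          if rand('uniform') < p_drop then dropped = 1;
        end;
        if visit = 1 or dropped = 0 then output; /* values from the dropout visit onward are unobserved */
        yprev = y;
      end;
    end;
  end;
  keep g id visit y;
run;

Replacing yprev in the dropout probability by the current value y gives the corresponding MNAR mechanism, in which missingness depends on the value that would have been observed.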

133 12.5: MAR MNAR 12.6: MAR MNAR MAR SD SD SD SD SD (4.1) (4.3) (5.5) (6.4) (6.8) (4.2) (4.5) (4.7) (6.7) (6.2) MNAR SD SD SD SD SD (4.1) (4.2) (5.4) (6.3) (6.8) (4.2) (4.2) (4.6) (6.3) (6.1) 10,000 MAR MNAR 10,000 α 12.4 MAR MNAR 10,

134 MAR MNAR 10, α 5% MMRM MI ANCOVA wgee LOCF ANCOVA OC ANCOVA 12.7: Unstructured REML Kenward Roger 5 4 ANCOVA logistic subject-specific EXCH Missingdata.org.uk %WGEE LOCF 4 ANCOVA 4 4 ANCOVA 133
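For reference, the "MI ANCOVA" row of Table 12.7 corresponds to a three-step program of the following shape: monotone regression imputation, ANCOVA of the final visit within each completed data set, and combination by Rubin's rules. The wide-format data set sim1w (one record per subject, for example obtained by transposing the long simulated data, with treatment indicator g, baseline y0 and visit values y1-y4), the number of imputations and the seed are illustrative and need not match the settings actually used for Table 12.7.

* 1. Multiple imputation under monotone missingness (treatment and baseline enter the imputation model) ;
proc mi data=sim1w out=sim1_imp nimpute=100 seed=4989;
  var g y0 y1 y2 y3 y4;
  monotone reg;
run;

* 2. ANCOVA of the final visit on baseline and treatment, per imputed data set ;
proc reg data=sim1_imp outest=est covout noprint;
  by _imputation_;
  model y4 = y0 g;
run;
quit;

* 3. Combine the treatment-effect estimates across imputations ;
proc mianalyze data=est;
  modeleffects g;
run;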

135 4 MAR 12.8: 10,000 MAR % 5.9% 10.5% 13.4% 15.4% 0.0% 6.0% 10.5% 13.8% 16.5% 12.9: MAR α % % 10,000 10,000 MSE SD MMRM MI ANCOVA wgee LOCF ANCOVA OC ANCOVA : MAR 134

136 MNAR 12.10: 10,000 MNAR % 6.0% 10.0% 12.6% 14.3% 0.0% 6.0% 10.4% 13.7% 16.1% 12.11: MNAR α % % 10,000 10,000 MSE SD MMRM MI ANCOVA wgee LOCF ANCOVA OC ANCOVA : MNAR α MAR MNAR wgee 5% wgee MAR MNAR MMRM OC ANCOVA MI ANCOVA LOCF ANCOVA wgee MMRM MI ANCOVA wgee LOCF ANCOVA SD wgee MMRM MI ANCOVA OC ANCOVA MMRM MI ANCOVA MAR MNAR α wgee α 5%

137 wgee (2015) α subject-specific Fitzmaurice et al. (1995) subject-specific (2015) α wgee LOCF ANCOVA α MMRM MI ANCOVA OC ANCOVA MMRM MCAR MAR MCAR MNAR 1 α III g g = 1 g = / 4 4 (y 0 y i i i = 1, 2, 3, 4) SD 12.12: SD SD (1.40) 6.00 (1.60) 5.00 (1.80) 5.00 (2.00) 5.00 (2.00) 7.00 (1.40) 6.75 (1.60) 6.50 (1.80) 6.25 (2.00) 6.00 (2.00) 12.8: 136

138 12.13: : % 20% 26% 30% 34% 0% 18% 30% 40% 48% α SD H 0 : µ A4 = µ P 4 µ A4 4 µ P 4 4 α 12.15: α SD : α 4 estimand α 1 2 estimand α α estimand estimand SD 1 137

139 12.10: 12.16: SD SD SD SD SD SD (1.4) (1.4) (2.0) (2.2) (2.3) (1.5) (1.4) (1.7) (2.2) (2.1) Bernoulli MCAR MAR MCAR+MAR MCAR MNAR MCAR+MNAR MCAR+MAR 100 / 50 MCAR 50 MAR 1 : 1 MCAR+MNAR 1 : 1 MCAR MAR MNAR MCAR g i p gi = exp(γ gi ) γ gi g = 1 g = 2 1 i = i = i = i = MAR i p i = 1 MNAR i p i = exp( y i 1 ) exp( y i y i ) 138

140 12.11: MCAR+MAR MCAR+MNAR 12.17: MCAR+MAR MCAR+MNAR MCAR+MAR SD SD SD SD SD (1.4) (1.4) (2.1) (2.2) (2.2) (1.5) (1.5) (1.7) (2.1) (2.2) MCAR+MNAR SD SD SD SD SD (1.4) (1.4) (2.1) (2.2) (2.1) (1.5) (1.5) (1.7) (2.1) (2.2) 10,000 MCAR+MAR MCAR+MNAR 10,000 α MCAR+MAR MCAR+MNAR 10,

141 MCAR+MAR MCAR+MNAR 10, α 5% MMRM MI ANCOVA wgee LOCF ANCOVA OC ANCOVA 12.18: Unstructured REML Kenward Roger ANCOVA logistic subject-specific EXCH Missingdata.org.uk %WGEE LOCF 4 ANCOVA 4 4 ANCOVA 140

142 4 MCAR+MAR 12.19: 10,000 MCAR+MAR % 20.0% 26.3% 30.3% 34.3% 0.0% 17.6% 30.7% 40.7% 48.9% 12.20: MCAR+MAR α % % 10,000 10,000 MSE SD MMRM MI ANCOVA wgee LOCF ANCOVA OC ANCOVA : MCAR+MAR 141

143 MCAR+MNAR 12.21: 10,000 MCAR+MNAR % 19.1% 24.8% 29.1% 33.2% 0.0% 18.8% 31.9% 41.9% 49.7% 12.22: MCAR+MNAR α % % 10,000 10,000 MSE SD MMRM MI ANCOVA wgee LOCF ANCOVA OC ANCOVA : MCAR+MNAR α MCAR+MAR MCAR+MNAR MMRM MI ANCOVA 5% LOCF ANCOVA MCAR+MAR MCAR+MNAR LOCF ANCOVA MMRM MI ANCOVA OC ANCOVA wgee MMRM MI ANCOVA wgee LOCF ANCOVA SD LOCF ANCOVA wgee MMRM MI ANCOVA MMRM MI ANCOVA MAR MNAR α wgee α 5%

144 LOCF ANCOVA α SD MMRM MI ANCOVA wgee LOCF ANCOVA OC ANCOVA 2 MMRM estimand 3 MAR Mallinckrodt (2013) LOCF ANCOVA α 1 5% 2 α LOCF ANCOVA α 100 / 12.14: 1 α 12.15: 2 α 143

145 [1] Fitzmaurice, G. M., Molenberghs, G., and Lipsitz, S. R. (1995). Regression models for longitudinal binary responses with informative drop-outs. Journal of the Royal Statistical Society. Series B (Methodological), [2],,,,. (2015). (3)Proc GEE wgee. SAS. [3] Mallinckrodt, C. H. (2013). Preventing and treating missing data in longitudinal clinical trials. Cambridge University Press. 144

146 SAS 1, 2 ( ), 3, 5, 11 Appendix A, B (2013 ) 2 (2.5), Appendix C (2014 ) 4, 7 (2014 ) 12 Appendix C Appendix C NRC EMA ( ) RI ( ) FDA, EMA, PMDA ( ) Meiji Seika


148 Appendix

149 Appendix A 4 A A A A A A A A A AR(1) A A Compound Symmetry A A A A.7 SAS A A A A A A A A A A A A A A A A A A.9.3 (REML) A A A

150 Appendix B 47 B B B B B MCAR B MAR B MNAR B MNAR B.3 5 Selection Model B.3.1 MNAR B i B j (j 3) i B i B.4 6 NFMV NFD ACMV MAR B B B B B B B B.4.2 NFMV ACMV B NFMV B NFMV B NFMV B ACMV B ACMV B ACMV B NFMV ACMV B B B.4.3 NRC (2010) PMM B.4.4 ACMV MAR B B B B B B.4.5 NFMV NFD B B B B B

151 Appendix C SAS 77 C C.2 Selection_Model C C C C C C.3 cbi_pmm C C C C C C.4 delta_pmm C C C C C C.5 Shared_Parameter C C C C

152 Appendix A A SAS MMRM MCAR A.2 i 1 n i 2 Y i = (Y i1,, Y ini ), (i = 1,, N) i (Linear Model, LM) Y i = X i β + ϵ i β X i ϵ i E[ϵ i ] = 0 ϵ i N(0, σ 2 I ni ) (Linear Mixed Model, LMM) b i Y i = X i β + Z i b i + ϵ i (A.1) Z i β b i 1 2 4

153 1 3 i j (j = 1, 2, 3) t j b i Y i1 = µ + t 1 + b i + ϵ i1 Y i2 = µ + t 2 + b i + ϵ i2 (A.2) Y i3 = µ + t 3 + b i + ϵ i3 µ Y i1 Y i = Y i2, β = t 1, ϵ t i = 2 Y i3 t 3 ϵ i1 ϵ i2 ϵ i3 X i = , Z i = (A.2) Y i = X i β + Z i b i + ϵ i Y i, X i, Z i β, b i, ϵ i β (estimation) b i 3 4 (i) (ii) 2 5 (i) 1 6 (ii) Y 1 = µ + b 1 + ϵ 1, Y 2 = µ + b 1 + ϵ 2 b 1, ϵ 1, ϵ 2 V [b 1 ] = σb 2, V [ϵ 1] = V [ϵ 2 ] = σ 2 Y 1, Y 2 Y 1 Y 2 Cov[Y 1, Y 2 ] = Cov[µ + b 1 + ϵ 1, µ + b 1 + ϵ 2 ] = Cov[b 1, b 1 ] = σ 2 b V [Y 1 ] = V [b 1 ] + V [ϵ 1 ] = σ 2 b + σ2, V [Y 2 ] = V [b 1 ] + V [ϵ 2 ] = σ 2 b + σ2 Corr[Y 1, Y 2 ] = Cov[Y 1, Y 2 ] V [Y1 ] V [Y 2 ] = σb 2 σb 2 + σ2 5

154 b i N(0, τ 2 I m ) b i τ 2 b i b i (estimation) (estimator) b i (prediction) (predictor) 7 1 b i = E[b i Y i ] ϵ i b i ϵ j i, j i j b i b j ϵ i ϵ j A (A.1) b i Y i (A.1) Y i = X i β + Z i b i + ϵ i b i N(0, D), ϵ i N(0, Σ i ) b i ϵ i Y i b i N(X i β, Σ i ) Y i N(X i β, Z i DZ i + Σ i) f(y i b i ) f(y i ) g(b i ) f(y i ) = f(y i b i ) g(b i )db i A.4 (i) 7 6

155 (ii) (iii) (i) i t 1, t 2, t 3 3 Y ij t i 1 Y ij = β 0 + β 1 t j + ϵ i 2 Y ij = β 0 + β 1 t j + β 2 t 2 j + ϵ i t 1, t 2, t 3 β 0, β 1, β 2 Y ij = β 0 + t j + ϵ ij t 1, t 2, t 3 t 1, t 2, t 3 8 (ii) 1 MMRM 9 A A.1: 1 15 A.2 A

156 A.2: A.3: A.4: 1 A.5: A.4, A.5 A.1 A.6: A.7:

157 2 3 4 A.6, A.7 A A.8: A.9: 5 6 A.8, A A (iii) LOCF ANCOVA MMRM 0 (Liang and Zeger, 2000) A.5 (A.1) Y i = X i β + Z i b i + ϵ i A β 1, β 2 (t 1, t 2, t 3 ) b i 2 2 i j Y ij i = 1, 2 i = 3, 4 1 Y 11 = µ + β 1 + γt 1 + b 1 + ϵ 11 Y 12 = µ + β 1 + γt 2 + b 1 + ϵ 12 Y 13 = µ + β 1 + γt 3 + b 1 + ϵ 13 Y 21 = µ + β 1 + γt 1 + b 2 + ϵ 21 9

158 Y 22 = µ + β 1 + γt 2 + b 2 + ϵ 22 Y 23 = µ + β 1 + γt 3 + b 2 + ϵ 23 Y 31 = µ + β 2 + γt 1 + b 3 + ϵ 31 Y 32 = µ + β 2 + γt 2 + b 3 + ϵ 32 Y 33 = µ + β 2 + γt 3 + b 3 + ϵ 33 Y 41 = µ + β 2 + γt 1 + b 4 + ϵ 41 Y 42 = µ + β 2 + γt 2 + b 4 + ϵ 42 Y 43 = µ + β 2 + γt 3 + b 4 + ϵ i µ Y i1 Y i = Y i2, β = β 1, ϵ β i = 2 Y i3 γ ϵ i1 ϵ i2 ϵ i3 X 1 = X 2 = t t 2, X 3 = X 4 = t t 2, t 3 1 Z 1 = Z 2 = Z 3 = Z 4 = 1 1 = t 3 11 Y 1 = X 1 β + Z 1 b 1 + ϵ 1 Y 2 = X 2 β + Z 2 b 2 + ϵ 2 Y 3 = X 3 β + Z 3 b 3 + ϵ 3 Y 4 = X 4 β + Z 4 b 4 + ϵ 4 Y i = X i β + Z i b i + ϵ i (i = 1, 2, 3, 4) (A.3) X 1 X 2 X 3 X 4 10

159 Y = Y 1 Y 2 Y 3 Y 4 = Y 11 Y 12 Y 13 Y 21 Y 22 Y 23 Y 31 Y 32 Y 33 Y 41 Y 42 Y 43, b = b 1 b 2 b 3 b 4, ϵ = ϵ 1 ϵ 2 ϵ 3 ϵ 4 = ϵ 11 ϵ 12 ϵ 13 ϵ 21 ϵ 22 ϵ 23 ϵ 31 ϵ 32 ϵ 33 ϵ 41 ϵ 42 ϵ 43 X = X 1 X 2 X 3 X 4 = t t t t t t t t t t t t 3, Z = Z Z Z Z 4 = (A.3) i = 1, 2, 3, 4 Y = Xβ + Zb + ϵ b i ϵ i b i iid N(0, ν 2 ), ϵ i iid N(0, σ 2 I 3 ) b i ϵ i (A.3) Y i V [Y i ] = V [Z i b i + ϵ i ] = Z i V [b i ]Z i + V [ϵ i ] = 1 3 V [b i ]1 3 + σ 2 I 3 11

160 = ν σ ν 2 + σ 2 ν 2 ν 2 = ν 2 ν 2 + σ 2 ν 2 ν 2 ν 2 ν 2 + σ = (A.4) 12 Y i Compound Symmetry ν 2 = 2, σ 2 = V [Y i ] = = j k Y ij Y ik Cov[Y ij, Y ik ] = 2 Cor[Y ij, Y ik ] = Cov[Y ij, Y ik ] V [Yij ] V [Y ij ] = = 0.4 Y = Xβ + Zb + ϵ V [Y] = V [Zb] + V [ϵ] = ZV [b]z + V [ϵ] Z Z σ 2 I Z ( = ν 2 ) 0 Z σ 2 I I 0 0 Z Z σ 2 I Z Z σ 2 I 3 Z 1 Z σ 2 I = ν 2 0 Z 2 Z σ 2 I Z 3 Z σ 2 I Z 4 Z σ 2 I 3 ν 2 Z 1 Z 1 + σ 2 I ν 2 Z 2 Z 2 + σ 2 I = 0 0 ν 2 Z 3 Z 3 + σ 2 I ν 2 Z 4 Z 4 + σ 2 I

161 i = 1, 2, 3, 4 1 ( ) Z i Z i = = = ν 2 Z i Z i + σ 2 I 3 = ν σ = ν 2 + σ 2 ν 2 ν 2 ν 2 ν 2 + σ 2 ν 2 ν 2 ν 2 ν 2 + σ 2 τ 2 = ν 2 + σ 2 τ 2 ν 2 ν ν 2 τ 2 ν ν 2 ν 2 τ τ 2 ν 2 ν ν 2 τ 2 ν ν 2 ν 2 τ V [Y] = τ 2 ν 2 ν ν 2 τ 2 ν ν 2 ν 2 τ τ 2 ν 2 ν ν 2 τ 2 ν ν 2 ν 2 τ 2 A b i ϵ i ϵ i Y i Compound Symmetry (ϵ i ) Compound Symmetry

162 A i = 1, 2 i = 3, 4 k i = 1, 2 k = 1 i = 3, 4 k = 2 14 Y ij = µ + β k + γt j + b i1 + b i2 t j + ϵ ij i b i1, b 2i b 11, b 12, b 21, b 22, b 31, b 32, b 41, b 42 8 i b i = (b i1, b i2 ) b i1 iid N 0, ν2 1 0 b i2 0 0 ν2 2 ϵ iid N(0, σ 2 I 3 ) b 1i b 2i PROC MIXED Variance Components 15 D = ν ν2 2 Y i µ Y i1 Y i = Y i2, β = β 1, b β i = b ϵ i1 i1, ϵ i = ϵ 2 b i2 i2 Y i3 ϵ i3 γ t 1 X 1 = X 2 = t 2, X 3 = X 4 = t 3 1 t 1 Z 1 = Z 2 = Z 3 = Z 4 = 1 t 2 1 t 3 i t t t 3 Y i = X i β + Z i b i + ϵ i (i = 1, 2, 3, 4) E[Y i ] = X i β V [Y i ] = V [Z i b i ] + V [ϵ i ] = Z i V [b i ]Z i + σ 2 I 3 = Z i DZ i + σ 2 I 3 14 β k k = 1, 2 15 ( ) ν 2 ν 12 D = 1 ν 12 ν 12 ν2 2 SAS RANDOM " / TYPE = UN; " 14

163 = 1 t 1 1 t 2 1 t 3 ν ν t 1 t 2 t 3 + σ σ σ 2 = ν t 2 1ν σ 2 ν t 1 t 2 ν 2 2 ν t 1 t 3 ν 2 2 ν t 1 t 2 ν 2 2 ν t 2 2ν σ 2 ν t 2 t 3 ν 2 2 ν t 1 t 3 ν 2 2 ν t 2 t 3 ν 2 2 ν t 2 3ν σ 2 Y = Y 1 Y 2 Y 3 Y 4 = Y 11 Y 12 Y 13 Y 21 Y 22 Y 23 Y 31 Y 32 Y 33 Y 41 Y 42 Y 43, b = b 1 b 2 b 3 b 4 = b 11 b 12 b 21 b 22 b 31 b 32 b 41 b 42, ϵ = ϵ 1 ϵ 2 ϵ 3 ϵ 4 = ϵ 11 ϵ 12 ϵ 13 ϵ 21 ϵ 22 ϵ 23 ϵ 31 ϵ 32 ϵ 33 ϵ 41 ϵ 42 ϵ 43 X = X 1 X 2 X 3 X 4 = t t t t t t t t t t t t 3, Z = Z Z Z Z 4 = 1 t t t t t t t t t t t t 3 Y = Xβ + Zb + ϵ 15

164 E[Y] = Xβ = X 1 X 2 X 3 X 4 β = X 1 β X 2 β X 3 β X 4 β V [Y] = ZV [b]z + V [ϵ] Z D Z Z D Z = 0 0 Z D Z Z D Z 4 σ 2 I σ 2 I σ 2 I σ 2 I 3 Z 1 DZ 1 + σ 2 I Z 2 DZ 2 + σ 2 I = 0 0 Z 3 DZ 3 + σ 2 I Z 4 DZ 4 + σ 2 I 3 A AR(1)

165 AR(1) 1 < ρ < 1 1 ρ ρ 2 iid b i N(0, ν 2 iid ), ϵ ij N 0, σ2 ρ 1 ρ ρ 2 ρ 1 18 Y i Z i = 1 3, b i N(0, σ 2 ) Y i = X i β + Z i b i + ϵ i E[Y i ] = X i β V [Y i ] = 1 3 V [b i ]1 3 + V [ϵ i ] ρ ρ 2 = ν σ2 ρ 1 ρ ρ 2 ρ 1 ν 2 + σ 2 ν 2 + σ 2 ρ ν 2 + σ 2 ρ 2 = ν 2 + σ 2 ρ ν 2 + σ 2 ν 2 + σ 2 ρ ν 2 + σ 2 ρ 2 ν 2 + σ 2 ρ ν 2 + σ 2 (A.5) (A.4) 1 < ρ < 1 AR(1) ν 2 = 1, σ 2 = 2, ρ = V [Y i ] = = j k Cov[Y ij, Y ik ] = j k Corr[Y ij, Y ik ] = Cov[Y ij, Y ik ] V [Yij ] V [Y ik ] = j k = j k Corr[Y i1, Y i2 ] = Corr[Y i2, Y i3 ] = Corr[Y i1, Y i3 ] = = A AR(1) 1 ρ ρ 2 V [ϵ i ] = σ 2 ρ 1 ρ ρ 2 ρ 1 18 ϵ ij (i, j) σ 2 ρ i j 17

166 AR(1) AR(1) k l k, l k l t k t l (t k t l ) ( 1 exp t ) ( 1 t 2 exp t ) 1 t 3 θ θ ( V [ϵ i ] = σ 2 exp t ) ( 1 t 2 1 exp t ) 2 t 3 ( θ θ exp t ) ( 1 t 3 exp t ) 2 t 3 1 θ θ 19 t 1 = 1, t 2 = 2, t 3 = 4, θ = 0.5 ( 1 exp 1 ) ( exp 2 ) ( V [ϵ i ] = σ 2 exp 1 ) ( 1 exp 1 ) ( exp 2 ) ( exp 1 ) exp ( 2) exp ( 4) = σ 2 exp ( 2) 1 exp ( 2) = σ exp ( 4) exp ( 2) SAS PROC MIXED REPEATED Spatial Covariance Structure A Compound Symmetry V [Y i ] 1 Y i = X i β + ϵ i ϵ i iid N 0 0 0, ν 2 + σ 2 ν 2 ν 2 ν 2 ν 2 + σ 2 ν 2 ν 2 ν 2 ν 2 + σ 2 E[Y i ] = X i β (LMM) (LM) MMRM MMRM(Mixed Model for Repeated Measures) 18

167 V [Y i ] = ν 2 + σ 2 ν 2 ν 2 ν 2 ν 2 + σ 2 ν 2 ν 2 ν 2 ν 2 + σ 2 Y i ϵ i 1 Z i b i + ϵ i A.5.7 (A.1) Y i = X i β + Z i b i + ϵ i b i iid N(0, D), ϵ i iid N(0, Σ i ) b i ϵ i E[Y i ] = X i β, V [Y i ] = Z i DZ i + Σ i 22 SAS PROC MIXED b i RANDOM ϵ i REPEATED A i (i) r m,i = Y i X i β (ii) r c,i = Y i X i β Zi bi β b i (i) (ii) X i β Xi β + Zi bi Student (studentized residual) Σ i i D i 23 r m,i, r c,i 24 19

168 r m,i (i) Student V [rm,i] (ii) Student r c,i V [rc,i] V i = Z i DZ i + Σ i, Q i = X i (X i V i X i) X i, K i = I Z i DZ i V 1 i, V [r m,i ] = V i Q i V [r c,i ] = K i ( V i Q i )K i SAS PROC MIXED RANDOM residual ods graphics on html A.7.9 A.10 (i) (ii) (i) (ii) 6 A.10: 2 (a)0 (b)student Y i = X i β + Z i b i + ϵ i E[Z i b i ] = 0, E[ϵ i ] = 0 ϵ i Z i b i + ϵ i 0 Student Student A

169 Cook D (Cook, 1977) Cook D β i β (i) i β 25 i β Cook D D i = 1 rank(x) ( β β (i) ) V [ β] ( β β (i) ) V [ β] β SAS PROC MIXED influence 4 A.7.10 A.11 2 A.11: 3 A.7 SAS 1 5 SAS A.8, A.9 A i j b i N(0, 4 2 ) ϵ ij N(0, 4 2 ) 25 i β β (i) β β (i) β β (i) 21

170 i i = 1 i = 1 j (1, 2, 3) d i 1 2 t i 1, 2, 3 Y i1 = 20 + b i 3t i + ϵ i1 (i = 1,, 10) Y ij = 20 + b i 3d i 3t i + ϵ i1 (i = 1,, 10, j = 2, 3) 3 30% 3 MCAR 3 30% 3 * ; DATA D1; CALL STREAMINIT(16747); DO DRUG = 1 TO 2; DO PATNO = 1 TO 10; PAT = RAND( NORM, 0, 4); * ; TIME = 1; * 1; TIMECLASS = TIME; E = RAND( NORM, 0, 4); * ; Y = 20 + PAT - 3*TIME + E ; * ; OUTPUT; DO TIME = 2, 3; * 2, 3; TIMECLASS = TIME; E = RAND( NORM, 0, 4); * ; Y = 20 + PAT - 3* DRUG - 3*TIME + E ; * 2,3 ; OUTPUT; END; END; END; RUN; 22

171 * ; DATA D2; SET D1; CALL STREAMINIT(51885); IF TIME = 3 THEN DO; IF RAND( UNIF ) < 0.3 THEN DELETE; * 3 30% ; END; RUN; D2 S A S システム O B S D R U G P A T N O T I M E T I M E C L A S S Y A.12: D2 3 DRUG PATNO TIME TIMECLASS Y 23

172 A.13: A.14: 1 5 Y ij = µ + b i + β di + γt i + ϵ ij β di d i = 1 β di = β 1 d i = 2 β di = β 2 A PROC MIXED DATA = D2; CLASS DRUG PATNO; MODEL Y = DRUG TIME / S ; RANDOM INT / SUB = PATNO(DRUG) ; RUN; SE WORK.D2 Y Variance Components PATNO(DRUG) REML Profile Model Based Containment 24

173 DRUG 2 PATNO PATNO 10 PATNO(DRUG) 10 2 X 4 Z 1 20 Obs MMRM 25

174 Intercept PATNO(DRUG) Residual b i N(0, ν 2 ), ϵ i N(0, σ 2 I 3 ) ν 2 = , σ 2 = Y i V [Y i ] = ν 2 + σ 2 ν 2 ν 2 ν 2 ν 2 + σ 2 ν 2 ν 2 ν 2 ν 2 + σ 2 = Ĉorr[Y ij, Y ik ] = ν 2 ν2 + σ 2 ν 2 + σ 2 = (j k) AIC ( ) AICC ( ) BIC ( ) t Pr > t Intercept <.0001 DRUG DRUG TIME < V [Y i ] V [Y i ] Ĉorr ŜE 26

175 β = µ β 1 β 2 γ = Containment µ : 18, β 1 : 18, γ : 32 ŜE[ µ] = , ŜE[ β 1 ] = , ŜE[ γ] = Type 3 Type 3 F Pr > F DRUG TIME <.0001 F = ( β1 ŜE[ β 1 ] ) 2 = ( ) p % A Kenward-Roger PROC MIXED DATA = D2 ; CLASS DRUG PATNO; MODEL Y = DRUG TIME / S DDFM = KR; RANDOM INT / SUB = PATNO(DRUG) ; RUN; 27

176 Intercept PATNO(DRUG) Residual β 27 t Pr > t Intercept <.0001 DRUG DRUG TIME <.0001 µ : 44.5, β 1 : 16.9, γ : ŜE[ µ] = , ŜE[ β 1 ] = , ŜE[ γ] = Containment 1-1 A A

177 PROC MIXED DATA = D2 EMPIRICAL; CLASS DRUG PATNO; MODEL Y = DRUG TIME / S; RANDOM INT / SUB = PATNO(DRUG) ; RUN; t Pr > t Intercept <.0001 DRUG DRUG TIME <.0001 Containment 1-1 µ : 18, β 1 : 18, γ : 32 ŜE[ µ] = , ŜE[ β 1 ] = , ŜE[ γ] = A Containment 29

178 PROC MIXED DATA = D2 EMPIRICAL; CLASS DRUG PATNO; MODEL Y = DRUG TIME / S ; RANDOM INT TIME / SUB = PATNO(DRUG) ; RUN; Intercept PATNO(DRUG) TIME PATNO(DRUG) Residual D = ν ν2 2 b i = b i1 b i2 iid N(0, D), ϵ i N(0, σ 2 I 3 ) ν 2 1 = , ν 2 2 = , σ 2 = V [Y i ] = Z i DZ i + σ 2 I 3 ν t 2 1 ν 2 2 ν t 1 t 2 ν 2 2 ν t 1 t 3 ν = ν t 1 t 2 ν 2 2 ν t 2 2 ν 2 2 ν t 2 t 3 ν σ ν t 1 t 3 ν 2 2 ν t 2 t 3 ν 2 2 ν t 2 3 ν =

179 = t Pr > t Intercept <.0001 DRUG DRUG TIME <.0001 β = µ β 1 β 2 γ = Containment 1-1 ŜE[ µ] = , ŜE[ β 1 ] = , ŜE[ γ] = A AR(1) REPEATED TIME TIME TIMECLASS TIMECLASS PROC MIXED DATA = D2 EMPIRICAL; CLASS DRUG PATNO TIMECLASS; MODEL Y = DRUG TIME / S; RANDOM INT / SUB = PATNO(DRUG); REPEATED TIMECLASS / SUB = PATNO(DRUG) TYPE = AR(1) ; RUN; 31

180 Intercept PATNO(DRUG) AR(1) PATNO(DRUG) Residual b i N(0, ν 2 ), ϵ i N 0, σ2 1 ρ ρ 2 ρ 1 ρ ρ 2 ρ 1 ν 2 = , ρ = , σ 2 = ρ ρ 2 V [Y i ] = ν σ2 ρ 1 ρ ρ 2 ρ 1 ν 2 + σ 2 ν 2 + σ 2 ρ ν 2 + σ 2 ρ 2 = ν + σ 2 ρ ν 2 + σ 2 ν 2 + σ 2 ρ ν 2 + σ 2 ρ 2 ν 2 + σ 2 ρ ν 2 + σ = =

181 t Pr > t Intercept <.0001 DRUG DRUG TIME <.0001 β = µ β 1 β 2 γ = ŜE[ µ] = , ŜE[ β 1 ] = , ŜE[ γ] = A PROC MIXED DATA = D2 EMPIRICAL; CLASS DRUG PATNO TIMECLASS; MODEL Y = DRUG TIME / S ; RANDOM INT / SUB = PATNO(DRUG); REPEATED TIMECLASS / SUB = PATNO(DRUG) TYPE = SP(EXP)(TIME); RUN;

182 Intercept PATNO(DRUG) SP(EXP) PATNO(DRUG) Residual b i N(0, ν 2 ), ϵ i N 0, σ 2 ( exp exp ( 1 exp t ) 1 t 2 θ t ) 1 t 2 1 exp ( θ t ) ( 1 t 3 exp t ) 2 t 3 θ θ ( exp t ) 1 t 3 ( θ t ) 2 t 3 θ 1 ν 2 = , θ = , σ 2 = V [Y i ] = ν σ exp = ( exp ν 2 + σ 2 ( ν 2 + σ 2 exp t ) 1 t 2 ( θ ν 2 + σ 2 exp t ) 1 t 3 θ ( 1 exp t ) 1 t 2 θ t ) 1 t 2 1 exp ( θ t ) ( 1 t 3 exp t ) 2 t 3 θ θ ( ν 2 + σ 2 exp t ) 1 t 2 θ ν 2 + σ 2 ( ν 2 + σ 2 exp t ) 2 t 3 θ ( exp t ) 1 t 3 ( θ t ) 2 t 3 θ 1 ( ν 2 + σ 2 exp ν 2 + σ 2 exp ( exp ( ) = exp exp ( ) = t ) 1 t 3 ( θ t ) 2 t 3 θ ν 2 + σ 2 ) ( ) exp ( ) exp ( ) exp

183 t Pr > t Intercept <.0001 DRUG DRUG time <.0001 β = µ β 1 β 2 γ = ŜE[ µ] = , ŜE[ β 1 ] = , ŜE[ γ] = A CS PROC MIXED DATA = D2 EMPIRICAL; CLASS DRUG PATNO TIMECLASS; MODEL Y = DRUG TIME / S; REPEATED TIMECLASS / SUB = PATNO(DRUG) TYPE = CS; RUN;

184 CS PATNO(DRUG) Residual ϵ i N 0, ν 2 + σ 2 ν 2 ν 2 ν 2 ν 2 + σ 2 ν 2 ν 2 ν 2 ν 2 + σ 2 ν 2 = , σ 2 = t Pr > t Intercept <.0001 DRUG DRUG TIME <.0001 β = µ β 1 β 2 γ = ŜE[ µ] = , ŜE[ β 1 ] = , ŜE[ γ] = A Model Residual 36

185 PROC MIXED DATA = D2 ; CLASS DRUG PATNO; MODEL Y = DRUG TIME / S RESIDUAL; RANDOM INT / SUB = PATNO(DRUG) ; RUN; A.10 A.7.10 PROC MIXED DATA = D2 ; CLASS DRUG PATNO; MODEL Y = DRUG TIME / S INFLUENCE (EFFECT = PATNO(DRUG) ITER = 5); RANDOM INT / SUB = PATNO(DRUG) ; RUN; β 1 β (i) iter i β (i) iter = 5 5 β A.11 A CS 5 Y i 0 σ 2 + η 2 η 2 η 2 iid b i N(0, ν 2 iid ), ϵ i N 0, η 2 σ 2 + η 2 η 2 0 η 2 η 2 σ 2 + η 2 V [Y i ] = 1 3 V [b i ]1 3 + V [ϵ i ] σ 2 + η 2 η 2 η 2 = ν η 2 σ 2 + η 2 η η 2 η 2 σ 2 + η 2 σ 2 + (ν 2 + η 2 ) (ν 2 + η 2 ) (ν 2 + η 2 ) = (ν 2 + η 2 ) σ 2 + (ν 2 + η 2 ) (ν 2 + η 2 ) (ν 2 + η 2 ) (ν 2 + η 2 ) σ 2 + (ν 2 + η 2 ) ν 2 η 2 (ν 2 + η 2 ) ν 2 + η 2 = 3 ν 2 η 2 ν 2 = 0 η 2 37

186 SAS PROC MIXED PROC MIXED DATA = D2 EMPIRICAL; CLASS TIMECLASS DRUG PATNO; MODEL Y = DRUG TIME / S; RANDOM INT /SUB = PATNO(DRUG) ; REPEATED TIMECLASS / SUB = PATNO(DRUG) TYPE = CS; RUN; Hessian 2 y ij = µ + α i + ϵ ij (i = 1, 2, j = 1,, n i ) SAS 0 28 β 1, β 2 µ γt i A β 2 =

187 Box and Draper (1976) "Esentially, all models are wrong, but some are useful" (sensitivity analysis) 30 MMRM MMRM LOCF ANCOVA unstructured 31 III 32 A.8.1 SAS PROC MIXED RANDOM Containment REPEATED RANDOM Between-within Satterthwaite Kenward-Roger RANDOM REPEATED Type = UN SAS Kenward-Roger χ 2 Kenward-Roger SAS/STAT 14.1 KenwardRoger II 39

188 A β β V m [ β] = (X V 1 X) 1 V r [ β] = (X V ( α) 1 X) 1 ( n i=1 ) 1 1 X i V i (Y i X i β)(yi X i β) V i X i (X V ( α) 1 X) 1 SAS PROC MIXED PROC MIXED EMPIRICAL SAS Ver 9.3 Satterthwaite Kenward-Roger A.9 Y i = X i β + Z i b i + ϵ i (b i N(0, D), ϵ i N(0, Σ)) β D Σ D = ν2 1 0, Σ = σ ν (ν1, 2 ν2, 2 σ 2 ) ρ ρ 2 ( ) D = ν 2, Σ = σ 2 ρ 1 ρ ρ 2 ρ 1 (ν 2, σ 2, ρ) 3 α β D, Σ D, Σ α α β D, Σ 40

189 α Σ, D 33 (ML) (REML) 2 2 A.9.1 (ML) (REML) i (i = 1,, N) n i 34 Y i = X i β + Z i b i + ϵ i b i iid N(0, D), ϵ i iid N (0, Σ i ) b i ϵ i β Y i iid N(X i β, Z i DZ i + Σ i ) i 1 f(y i β, α) = ( (2π) n i Zi DZ i + Σ i exp 1 ) 2 (Y i X i β) (Z i DZ i + Σ i ) 1 (Y i X i β) (A.6) N L(β, α) = f(y i β, α) i=1 N l(β, α) = log f(y i β, α) i=1 β, α Newton-Raphson Fisher s Scoring SAS PROC MIXED ridge-stabilized Newton-Raphson Fisher s scoring PROC MIXED scoring 35 A.9.2 i b i Y i f(y i β, b i, Σ i ) = 1 ( (2π) n i Σi exp 1 ) 2 (Y i X i β Z i b i ) Σ 1 i (Y i X i β Z i b i ) 33 Σ, D α = (α 1, α 2 ) 34 n i (A.7)

190 Y i, X i, Z i β, α b i V i, X i V ix i V i 36 b i b i b i N(0, D) b i β 37 b i D b i D b i f(y i β, b i, Σ i ) : Y i f(b i D) : b i b i (integrate out) f(y i β, α) = f(y i β, b i, Σ i ) f(b i D)db i (A.8) b i D f(y i β, α) (A.8) 1 f(y i β, α) = ( (2π) n i Zi DZ i + Σ i exp 1 ) 2 (Y i X i β) (Z i DZ i + Σ i ) 1 (Y i X i β) 38 V i = Z i DZ i + Σ i V i α V i (α) l(α, β) = N i=1 { 1 ( ) 2 log (2π) n i V i + 1 } 2 (Y i X i β) V 1 i (Y i X i β) (A.9) β β = N i=1 ( X i V 1 i ) 1 X i X V 1 i i Y i β V i β α α β β (A.9) l ML (α) = N i=1 α { 1 2 log ( (2π) 3 V i ) (Y i X i β) V 1 i (Y i X i β) } Lee and Nelder (1996) 38 42

191 A.9.3 (REML) (ML) (REML) α β α ML REML β N(µ, σ 2 ) x 1,, x N µ, σ 2 µ = 1 N N x i, σ 2 = 1 N i=1 N (x i µ) 2 i=1 E[ σ 2 ] = N 1 N σ2 σ 2 = 1 N 1 N (x i µ) 2 σ 2 (REML) (N 1) (N 1) E[ σ 2 ] (N 1) i=1 1 Ñ rank(x) = p X Ñ X Ñ (Ñ p) a 1,, añ p Ñ (Ñ p) A 39 ) A = (a 1,, añ p Y A W = A Y Ñ = N n i W (Ñ p) i=1 E[W] = E[A Y] = A E[Y] = A Xβ = 0 ( A X ) V [W] = V [A Y] = A V [Y]A = A VA W 0 α W α REML REML rank(x) = p, dim(y i ) = n i, Ñ = N l REML (α) = Ñ p N i=1 log(2π) + 1 N 2 log X i X i 1 2 log N ( Y i X i β ) V 1 i i=1 ( ) Y i X i β 39 A X 43 i=1 X i V 1 i X i 1 2 N log V i i=1 i=1 n i

192 ML l REML (α) = 1 N 2 log X i V 1 i X i + l ML(α) + C i=1 (C α, β ) A.9.4 α β α β SAS PROC MIXED REML l REML (α) = 1 2 log N i=1 X i V 1 i X i + l ML(α) + C (C α, β ) A.9.5 b i b i b i D b i β, α b i prediction (estimatior) (estimated value) (predictor) (predicted value) EBLUP(Empirical Best Linear Unbiased Predictor) b = E[b Y] b Mixed Model Equation X Σ 1 X Z Σ 1 X b X Σ 1 Z Z Σ 1 Z + D β b b = DZ V 1 (Y X β) = X V 1 Y Z V 1 Y PROC MIXED DATA = D2 ; CLASS DRUG PATNO; MODEL Y = DRUG TIME / S ; RANDOM INT / SUB = PATNO(DRUG) S; RUN; 44

193 PATNO t Pr > t Intercept Intercept Intercept Intercept Intercept Intercept Intercept Intercept Intercept Intercept Intercept Intercept Intercept Intercept Intercept Intercept Intercept Intercept Intercept Intercept A.10 SAS (2001) (2009) (2011) Verbeke and Molenberghs (2000) Molenberghs Web SAS SAS Little et al. (2006) [1] Box,G.E.P. and Draper,N.R. (1987). Empirical Model-Building and Response Surfaces. Wiley. [2] Cook, R. D. (1977), Detection of Influential Observations in Linear Regression, Technometrics, 19,

[3]. (2009)...
[4] Liang, K. Y., & Zeger, S. L. (2000). Longitudinal data analysis of continuous and discrete responses for pre-post designs. Sankhyā: The Indian Journal of Statistics, Series B,
[5] McCulloch, C. E., Searle, S. R. and Neuhaus, J. M. (2008). Generalized, Linear, and Mixed Models (2nd Ed.), Wiley ( (2011). CAC).
[6] Littell, R. C., Milliken, G. A., Stroup, W. W., Wolfinger, R. D., and Schabenberger, O. (2006). SAS for Mixed Models. SAS Institute.
[7] Verbeke, G., and Molenberghs, G. (1997). Linear Mixed Models in Practice: A SAS-Oriented Approach. Springer. ( (2001) -SAS -.)
[8] Verbeke, G., and Molenberghs, G. (2000). Linear Mixed Models for Longitudinal Data. Springer.

195 Appendix B B.1 2 MCAR MAR MNAR 5 MNAR SM 6 (i) NFMV ACMV ACMV (ii) ACMV MAR (iii) NFMV NFD B.2 2 B.2.1 Little and Rubin (2002) MCAR MAR MNAR (MCAR) : f(r i Y i, ψ) = f(r i ψ) (MAR) : f(r i Y i, ψ) = f(r i Y o i, ψ) (B.1) (MNAR) : f(r i Y i, ψ) f(r i Y o i, ψ) 2 MCAR MAR MNAR MCAR MAR MNAR B i j Y ij Y i = (Y i1, Y i2, Y i3 ) R i = (R i1, R i2, R i3 ) R ij = 1 R ij = 0 B MCAR 2 MCAR f(r i1 = 0 Y i ) = 0 (B.2) 1 Seaman et al. (2013) everywhere MCAR MAR 47

196 f(r i2 = 0 Y i, R i1 = 0) = 1 f(r i2 = 0 Y i, R i1 = 1) = 0.1 f(r i3 = 0 Y i, R i1 = 0, R i2 = 0) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 0) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 1) = 0.1 (B.3) (B.4) i j R ij = 0 R i,j+1 = 0 1 f(r i3 = 0 Y i, R i1 = 0, R i2 = 1) = 1 f(r i3 = 1 Y i, R i1 = 0, R i2 = 1) = 0 1 R ij (B.1) R i = (R i1, R i2, R i3 ) (B.2) (B.5) (B.1) 2 R i MCAR (B.2) (B.5) f(r i1 = 1 Y i ) = 1 f(r i2 = 1 Y i, R i1 = 0) = 0 f(r i2 = 1 Y i, R i1 = 1) = 0.9 f(r i3 = 1 Y i, R i1 = 0, R i2 = 0) = 0 f(r i3 = 1 Y i, R i1 = 1, R i2 = 0) = 0 f(r i3 = 1 Y i, R i1 = 1, R i2 = 1) = 0.9 R i = (R i1, R i2, R i3 ) 3 f(r i1 = 1, R i2 = 1, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 1, R i2 = 1) f(r i2 = 1, R i1 = 1 Y i ) (B.5) = f(r i3 = 1 Y i, R i1 = 1, R i2 = 1) f(r i2 = 1 Y i, R i1 = 1) f(r i1 = 1 Y i ) = = 0.81 f(r i1 = 0, R i2 = 1, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 0, R i2 = 1) f(r i2 = 1 Y i, R i1 = 0) f(r i1 = 0 Y i ) = = 0 f(r i1 = 1, R i2 = 0, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 1, R i2 = 0) f(r i2 = 0 Y i, R i1 = 1) f(r i1 = 1 Y i ) = = 0 f(r i1 = 0, R i2 = 0, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 0, R i2 = 0) f(r i2 = 0 Y i, R i1 = 0) f(r i1 = 0 Y i ) = = 0 f(r i1 = 1, R i2 = 1, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 1, R i2 = 1) f(r i2 = 1 Y i, R i1 = 1) f(r i1 = 1 Y i ) 2 Y i R i 3 R i1 = 0, 1, R i2 = 0, 1, R i3 = 0, = 8 48

197 = = 0.09 f(r i1 = 0, R i2 = 1, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 0, R i2 = 1) f(r i2 = 1 Y i, R i1 = 0) f(r i1 = 0 Y i ) = = 0 f(r i1 = 1, R i2 = 0, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 1, R i2 = 0) f(r i2 = 0 Y i, R i1 = 1) f(r i1 = 1 Y i ) = = 0.1 f(r i1 = 0, R i2 = 0, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 0, R i2 = 0) f(r i2 = 0 Y i, R i1 = 0) f(r i1 = 0 Y i ) = = 0 Y i f(r i Y i ) = f(r i ) (B.1) MCAR B MAR MAR f(r i1 = 0 Y i ) = 0 f(r i2 = 0 Y i, R i1 = 0) = 1 (B.6) logitf(r i2 = 0 Y i, R i1 = 1) = Y i1 (B.7) f(r i3 = 0 Y i, R i1 = 0, R i2 = 0) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 0) = 1 logitf(r i3 = 0 Y i, R i1 = 1, R i2 = 1) = Y i2 (B.8) ( ) f(x) logitf(x) = log exp (logitf(x)) = f(x) 1 f(x) 1 f(x) f(x) = exp(logitf(x)) 1 + exp(logitf(x)) (B.7) (B.8) 1 logitf(r i2 = 0 Y i, R i1 = 1) = Y i1 f(r i2 = 0 Y i1, R i1 = 1) = exp( Y i1) 1 + exp ( Y i1 ) logitf(r i3 = 0 Y i, R i1 = 1, R i2 = 1) = Y i2 f(r i3 = 0 Y i1, R i1 = 1, R i2 = 1) = exp( Y i2) 1 + exp ( Y i2 ) (B.6) (B.8) f(r i1 = 0 Y i ) = 0 49

198 f(r i2 = 0 Y i, R i1 = 0) = 1 f(r i2 = 0 Y i, R i1 = 1) = exp( Y i1) 1 + exp ( Y i1 ) f(r i3 = 0 Y i, R i1 = 0, R i2 = 0) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 0) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 1) = exp( Y i2) 1 + exp ( Y i2 ) f(r i3 = 0 Y i, R i1 = 0, R i2 = 1) = 1 f(r i3 = 1 Y i, R i1 = 0, R i2 = 1) = 0 f(r i1 = 1 Y i ) = 1 f(r i2 = 1 Y i, R i1 = 0) = 0 1 f(r i2 = 1 Y i, R i1 = 1) = 1 + exp ( Y i1 ) f(r i3 = 1 Y i, R i1 = 0, R i2 = 0) = 0 f(r i3 = 1 Y i, R i1 = 1, R i2 = 0) = 0 1 f(r i3 = 1 Y i, R i1 = 1, R i2 = 1) = 1 + exp ( Y i2 ) MAR Y o i ϕ f(r i1 = 1, R i2 = 1, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 1, R i2 = 1) f(r i2 = 1 Y i, R i1 = 1) f(r i1 = 1 Y i ) 1 = 1 + exp ( Y i2 ) exp ( Y i1 ) 1 1 = {1 + exp ( Y i2 )}{1 + exp ( Y i1 )} (Y o i = (Y i1, Y i2, Y i3 ) ) f(r i1 = 0, R i2 = 1, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 0, R i2 = 1) f(r i2 = 1 Y i, R i1 = 0) f(r i1 = 0 Y i ) = = 0 (Y o i = (Y i2, Y i3 ) ) f(r i1 = 1, R i2 = 0, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 1, R i2 = 0) f(r i2 = 0 Y i, R i1 = 1) f(r i1 = 1 Y i ) = 0 exp( Y i1 ) 1 + exp ( Y i1 ) 1 = 0 (Yo i = (Y i1, Y i3 )) f(r i1 = 0, R i2 = 0, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 0, R i2 = 0) f(r i2 = 0 Y i, R i1 = 0) f(r i1 = 0 Y i ) = = 0 (Y o i = (Y i3 )) f(r i1 = 1, R i2 = 1, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 1, R i2 = 1) f(r i2 = 1 Y i, R i1 = 1) f(r i1 = 1 Y i ) = exp( Y i2) 1 + exp ( Y i2 ) exp ( Y i1 ) 1 exp( Y i2 ) = {1 + exp ( Y i2 )}{1 + exp ( Y i1 )} (Y o i = (Y i1, Y i2 ) ) f(r i1 = 0, R i2 = 1, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 0, R i2 = 1) f(r i2 = 1 Y i, R i1 = 0) f(r i1 = 0 Y i ) = = 0 (Y o i = (Y i2 )) 50

199 f(r i1 = 1, R i2 = 0, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 1, R i2 = 0) f(r i2 = 0 Y i, R i1 = 1) f(r i1 = 1 Y i ) = 1 exp( Y i1 ) 1 + exp ( Y i1 ) 1 = exp( Y i1) 1 + exp ( Y i1 ) (Y o i = (Y i1 )) f(r i1 = 0, R i2 = 0, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 0, R i2 = 0) f(r i2 = 0 Y i, R i1 = 0) f(r i1 = 0 Y i ) = = 0 (Y o i = ϕ) Y ij Y o i f(r i Y i ) = f(r i Y o ) (B.1) MAR B MNAR 1 2 MNAR 1 f(r i1 = 0 Y i ) = 0 f(r i2 = 0 Y i, R i1 = 0) = 1 (B.9) logitf(r i2 = 0 Y i, R i1 = 1) = Y i Y i2 (B.10) f(r i3 = 0 Y i, R i1 = 0, R i2 = 0) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 0) = 1 logitf(r i3 = 0 Y i, R i1 = 1, R i2 = 1) f(r i1 = 0 Y i ) = 0 f(r i2 = 0 Y i, R i1 = 0) = 1 = Y i Y i3 f(r i2 = 0 Y i, R i1 = 1) = exp( Y i Y i2 ) 1 + exp( Y i Y i2 ) f(r i3 = 0 Y i, R i1 = 0, R i2 = 0) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 0) = 1 f(r i3 = 0 Y i, R i1 = 1, R i2 = 1) = exp( Y i Y i3 ) 1 + exp( Y i Y i3 ) f(r i3 = 0 Y i, R i1 = 0, R i2 = 1) = 1 f(r i3 = 1 Y i, R i1 = 0, R i2 = 1) = 0 f(r i1 = 1 Y i ) = 1 f(r i2 = 1 Y i, R i1 = 0) = 0 1 f(r i2 = 1 Y i, R i1 = 1) = 1 + exp ( Y i Y i2 ) f(r i3 = 1 Y i, R i1 = 0, R i2 = 0) = 0 f(r i3 = 1 Y i, R i1 = 1, R i2 = 0) = 0 1 f(r i3 = 1 Y i, R i1 = 1, R i2 = 1) = 1 + exp ( Y i Y i3 ) 51 (B.11)

200 MNAR Y o i ϕ f(r i1 = 1, R i2 = 1, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 1, R i2 = 1) f(r i2 = 1 Y i, R i1 = 1) f(r i1 = 1 Y i ) 1 = 1 + exp ( Y i Y i3 ) exp ( Y i Y i2 ) 1 1 = {1 + exp ( Y i Y i3 )}{1 + exp ( Y i Y i2 )} (Y o i = (Y i1, Y i2, Y i3 ) ) f(r i1 = 0, R i2 = 1, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 0, R i2 = 1) f(r i2 = 1 Y i, R i1 = 0) f(r i1 = 0 Y i ) = = 0 (Y o i = (Y i2, Y i3 ) ) f(r i1 = 1, R i2 = 0, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 1, R i2 = 0) f(r i2 = 0 Y i, R i1 = 1) f(r i1 = 1 Y i ) = 0 exp( Y i Y i2 ) 1 + exp ( Y i Y i2 ) 1 = 0 (Yo i = (Y i1, Y i3 )) f(r i1 = 0, R i2 = 0, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 0, R i2 = 0) f(r i2 = 0 Y i, R i1 = 0) f(r i1 = 0 Y i ) = = 0 (Y o i = (Y i3 )) f(r i1 = 1, R i2 = 1, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 1, R i2 = 1) f(r i2 = 1 Y i, R i1 = 1) f(r i1 = 1 Y i ) = exp( Y i Y i3 ) 1 + exp ( Y i Y i3 ) exp ( Y i Y i2 ) 1 exp( Y i Y i3 ) = {1 + exp ( Y i Y i3 )}{1 + exp ( Y i Y i2 )} (Y o i = (Y i1, Y i2 ) ) f(r i1 = 0, R i2 = 1, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 0, R i2 = 1) f(r i2 = 1 Y i, R i1 = 0) f(r i1 = 0 Y i ) = = 0 (Y o i = (Y i2 )) f(r i1 = 1, R i2 = 0, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 1, R i2 = 0) f(r i2 = 0 Y i, R i1 = 1) f(r i1 = 1 Y i ) = 1 exp( Y i Y i2 ) 1 + exp ( Y i Y i2 ) 1 = exp( Y i Y i2 ) 1 + exp ( Y i Y i2 ) (Y o i = (Y i1 )) f(r i1 = 0, R i2 = 0, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 0, R i2 = 0) f(r i2 = 0 Y i, R i1 = 0) f(r i1 = 0 Y i ) = = 0 (Y o i = ϕ) f(r i1 = 1, R i2 = 1, R i3 = 0) Y o i Y ij MNAR B MNAR 2 MNAR 2 MAR R ij f(r i1 = 0 Y i ) = 0 logitf(r i2 = 0 Y i, R i1 = 0) logitf(r i2 = 0 Y i, R i1 = 1) = Y i1 = Y i1 52

201 logitf(r i3 = 0 Y i, R i1 = 0, R i2 = 0) logitf(r i3 = 0 Y i, R i1 = 1, R i2 = 0) = Y i2 = Y i2 logitf(r i3 = 0 Y i, R i1 = 0, R i2 = 1) logitf(r i3 = 0 Y i, R i1 = 1, R i2 = 1) = Y i2 = Y i2 f(r i1 = 0 Y i ) = 0 logitf(r i2 = 0 Y i, R i1 ) = Y i1 logitf(r i3 = 0 Y i, R i1, R i2 ) = Y i2 f(r i1 = 0 Y i ) = 0 f(r i2 = 0 Y i, R i1 ) = f(r i3 = 0 Y i, R i1, R i2 ) = exp( Y i1 ) 1 + exp( Y i1 ) exp( Y i2 ) 1 + exp( Y i2 ) f(r i1 = 1 Y i ) = 1 f(r i2 = 1 Y i, R i1 ) = f(r i3 = 1 Y i, R i1, R i2 ) = exp( Y i1 ) exp( Y i2 ) f(r i1 = 1, R i2 = 1, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 1, R i2 = 1) f(r i2 = 1 Y i, R i1 = 1) f(r i1 = 1 Y i ) 1 = 1 + exp( Y i2 ) exp( Y i1 ) 1 1 = (Yi o = (Y i1, Y i2, Y i3 ) ) {1 + exp( Y i2 }{1 + exp( Y i1 )} f(r i1 = 0, R i2 = 1, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 0, R i2 = 1) f(r i2 = 1 Y i, R i1 = 0) f(r i1 = 0 Y i ) = exp( Y i2 ) exp( Y i1 ) 0 = 0 (Y o i = (Y i2, Y i3 ) ) f(r i1 = 1, R i2 = 0, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 1, R i2 = 0) f(r i2 = 0 Y i, R i1 = 1) f(r i1 = 1 Y i ) 1 = 1 + exp( Y i2 ) exp( Y i1 ) 1 + exp( Y i1 ) 1 exp( Y i1 ) = {1 + exp( Y i2 )}{1 + exp( Y i1 )} (Y o i = (Y i1, Y i3 ) ) f(r i1 = 0, R i2 = 0, R i3 = 1 Y i ) = f(r i3 = 1 Y i, R i1 = 0, R i2 = 0) f(r i2 = 0 Y i, R i1 = 0) f(r i1 = 0 Y i ) = exp( Y i2 ) exp( Y i1 ) 1 + exp( Y i1 ) 0 = 0 (Y o i = (Y i3 )) f(r i1 = 1, R i2 = 1, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 1, R i2 = 1) f(r i2 = 1 Y i, R i1 = 1) f(r i1 = 1 Y i ) = exp( Y i2) 1 + exp( Y i2 ) exp( Y i1 ) 1 exp( Y i2 ) = {1 + exp( Y i2 )}{1 + exp( Y i1 )} 53 (Y o i = (Y i1, Y i2 ) )

202 f(r i1 = 0, R i2 = 1, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 0, R i2 = 1) f(r i2 = 1 Y i, R i1 = 0) f(r i1 = 0 Y i ) = exp( Y i2) 1 + exp( Y i2 ) exp( Y i1 ) 0 = 0 (Y o i = (Y i2 )) f(r i1 = 1, R i2 = 0, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 1, R i2 = 0) f(r i2 = 0 Y i, R i1 = 1) f(r i1 = 1 Y i ) = exp( Y i2) 1 + exp( Y i2 ) exp( Y i1 ) 1 + exp( Y i1 ) 1 exp( Y i1 ) exp( Y i2 ) = {1 + exp( Y i2 )}{1 + exp( Y i1 )} (Y o i = (Y i1 )) f(r i1 = 0, R i2 = 0, R i3 = 0 Y i ) = f(r i3 = 0 Y i, R i1 = 0, R i2 = 0) f(r i2 = 0 Y i, R i1 = 0) f(r i1 = 0 Y i ) = exp( Y i2) 1 + exp( Y i2 ) exp( Y i1 ) 1 + exp( Y i1 ) 0 = 0 (Y o i = ϕ) f(r i1 = 1, R i2 = 0, R i3 = 1, Y i ) Y o i Y ij MNAR B.3 5 Selection Model 5 MNAR B.3.1 MNAR MNAR Selection Model logit{p r(r ij = 0 R i1 = 1,, R i,j 1 = 1, Y i, X i, ψ)} = ψ 1 + ψ 2 Y i,j 1 + ψ 3 Y ij (j = 2,, n) P r(r ij = 0 R i,j 1 = 0, R i,j 2, R i1, Y i, X i, ψ) = 1 (j = 2,, n) P r(r i1 = 1 Y i, X i, ψ) = 1 (B.12) R i,j 2, R i1 MNAR 1 6 NFD (Non-Future Dependence) 3 B i Yi o = Y i, R i1 = R i2 = R in = 1 f(yi o, R i X i, θ, ψ) = f(y, R i X i, θ, ψ) = f(y i X i, θ) f(r i Y i, X i, ψ) = f(y i X i, θ) P r(r i1 = 1,, R in = 1 Y i, ψ) = f(y i X i, θ) P r(r i1 = 1 X i, Y i, ψ) 54

203 P r(r i2 = 1 R i1 = 1, X i, Y i, ψ) P r(r in = 1 R i1 = = R i,n 1 = 1, Y i, X i, ψ) (B.13) f(y i X i, θ, ψ) (B.12) θ, ψ 4 (B.12) P r(r ij = 0 R i1 = = R i,j 1 = 1, Y i, X i, ψ) = exp(ψ 1 + ψ 2 Y i,j 1 + ψ 3 Y ij ) 1 + exp(ψ 1 + ψ 2 Y i,j 1 + ψ 3 Y ij ) P r(r ij = 1 R i1 = = R i,j 1 = 1, Y i, X i, ψ) =1 P r(r ij = 0 R i1 = = R i,j 1 = 1, Y i, X i, ψ) 1 = 1 + exp(ψ 1 + ψ 2 Y i,j 1 + ψ 3 Y ij ) 1 f(y i X i, θ) = ( (2π)n Σ exp 1 ) 2 (Y i µ i ) Σ 1 (Y i µ i ) 5 Y i = Yi o (B.13) f(yi o 1, R i X i, θ, ψ) = ( (2π)n Σ exp 1 ) 2 (Y i µ i ) Σ 1 (Y i µ i ) 1 i exp(ψ 1 + ψ 2 Y i1 + ψ 3 Y i2 ) exp(ψ 1 + ψ 2 Y i,n 1 + ψ 3 Y in ) B j (j 3) i R i1 = = R i,j 1 = 1, R ij = = R in = 0 Yi o = (Y i1,, Y i,j 1 ), Yi m = (Y ij,, Y in ) f(yi o, R i X i, θ, ψ) = f(yi o, Yi m, R i X i, θ, ψ)dyi m = f(yi o, Yi m X i, θ) f(r i X i, Yi o, Yi m, ψ)dyi m = f(y m i Y o i, X i, θ) f(y o i X i, θ) P r(r ij = = R in = 0 R i1 = = R i,j 1 = 1, X i, Y i, ψ) P r(r i1 = = R i,j 1 = 1 X i, Y o i, ψ)dy m i = f(yi o X i, θ) P r(r i1 = = R i,j 1 = 1 X i, Yi o, ψ) f(yi m Yi o, X i, θ) P r(r ij = = R in = 0 R i1 = = R i,j 1 = 1, X i, Y i,j 1,, Y in, ψ)dy ij dy in ( Y m i = (Y ij,, Y in ) ) µ i i µ i = X i β X i 55

204 = f(yi o X i, θ) P r(r i1 = = R i,j 1 = 1 X i, Yi o, ψ) f(y ij,, Y in Y i1,, Y i,j 1, X i, θ) P r(r ij = 0 R i1 = = R i,j 1 = 1, X i, Y i,j 1, Y ij, ψ) P r(r i,j+1 = 0 R i1 = = R i,j 1 = 1, R ij = 0, X i, Y ij, Y i,j+1, ψ) P r(r in = 0 R i1 = = R i,j 1 = 1, R ij = = R i,n 1 = 0, X i, Y i,n 1, Y in, ψ)dy ij dy in ( NFD) = f(yi o X i, θ) P r(r i1 = 1 X i, Y 1, ψ) P r(r i2 = 1 R i1 = 1, X i, Y 1, Y 2, ψ) P r(r i,j 1 = 1 R i1 = = R i,j 2 = 1, Y i,j 2, Y i,j 1, X i, ψ) f(y ij,, Y in Y i1,, Y i,j 1, X i, θ) P r(r ij = 0 R i1 = = R i,j 1 = 1, X i, Y i,j 1, Y ij, ψ) 1 1 dy ij dy i,j+1 dy in = f(yi o X i, θ) P r(r i2 = 1 R i1 = 1, X i, Y i1, Y i2, ψ) P r(r i,j 1 = 1 R i1 = = R i,j 2 = 1, X i, Y i,j 2, Y i,j 1, ψ) f(y ij Yi o, X i, θ) P r(r ij = 0 R i1 = = R i,j 1 = 1, X i, Y i,j 1, Y ij, ψ)dy ij (B.14) NFD 1 Diggle and Kenward (1994) Nelder-Mead 6 %Selection_Model2, %SM_GridSearch, Newton-Raphson ridge Newton ( ) f(y ij Yi o 1, θ) = exp (Y ij µ ij ) 2 2πσij 2 2σij 2 µ ij, σij 2 Yo i = (Y i1,, Y i,j 1 ) Y ij Y ij (B.12) P r(r ij = 0 R i1 = = R i,j 1 = 1, X i, Y i, ψ) = exp(ψ 1 + ψ 2 Y i,j 1 + ψ 3 Y ij ) 1 + exp(ψ 1 + ψ 2 Y i,j 1 + ψ 3 Y ij ) (B.14) ( ) 1 exp (Y ij µ ij ) 2 exp(ψ 1 + ψ 2 Y i,j 1 + ψ 3 Y ij ) 2πσij 2 2σij exp(ψ 1 + ψ 2 Y i,j 1 + ψ 3 Y ij ) dy ij Y i,j 1 i B i 3 Yi o f(yo i X i, θ) P r(r i2 = 1 R i1 = 1, X i, Y i1, Y i2, ψ) Nelder-Mead SAS Proc IML NLPNRR Nelder-Mead (1965a, 1965b) 56

For a subject with only the first measurement observed (R_{i1} = 1, R_{i2} = ... = R_{in} = 0, so Y^o_i = (Y_{i1}) and Y^m_i = (Y_{i2}, ..., Y_{in})), the same argument gives

  f(Y^o_i, R_i | X_i, theta, psi) = f(Y^o_i | X_i, theta) Int f(Y_{i2} | Y_{i1}, X_i, theta) Pr(R_{i2} = 0 | R_{i1} = 1, X_i, Y_{i1}, Y_{i2}, psi) dY_{i2},

which is (B.14) with j = 2.

B.4 Identifying restrictions for pattern-mixture models: NFMV, NFD, ACMV and MAR (supplement to Chapter 6)

This section collects the technical results behind Chapter 6: (1) under the NFMV restriction the only conditional distributions left unspecified are those of the form f(Y_t | Y_{t-1}, R = t-1), and fixing them as in ACMV reduces NFMV to ACMV; (2) ACMV is equivalent to MAR for monotone missingness (Section B.4.4); (3) NFMV is equivalent to non-future dependence (NFD) of the dropout process (Section B.4.5). These results also connect the pattern-mixture models of NRC (2010, Chapter 5) to the MAR assumption (Section B.4.3).

B.4.1 Notation and the basic mixture identity

Throughout Section B.4 the subject index is suppressed, missingness is monotone, and there are T scheduled measurements. Y_t denotes the history (Y_1, ..., Y_t), and R = j means that exactly Y_1, ..., Y_j are observed (R = T for a completer). f_j(Y_t | Y_{t-1}) is shorthand for the pattern-specific conditional f(Y_t | Y_{t-1}, R = j); for example, f(Y_3 | Y_2, R = 1) is the conditional distribution of Y_3 given (Y_1, Y_2) in the pattern with only Y_1 observed, and f(Y_3 | Y_2, R >= 3) the corresponding conditional in the patterns with Y_3 observed. It is assumed that all relevant probabilities and densities are positive, in particular P(R = t-1 | Y_{t-1}, R >= t-1) > 0 for t >= 2 and f(Y_t) > 0.

The basic mixture identity is

  f(Y_t | Y_{t-1}, R >= j) = f(Y_t | Y_{t-1}, R = j) P(R = j | Y_{t-1}, R >= j) + f(Y_t | Y_{t-1}, R >= j+1) P(R >= j+1 | Y_{t-1}, R >= j),   (B.15)

obtained by writing f(Y_t, Y_{t-1}, R >= j) = f(Y_t, Y_{t-1}, R = j) + f(Y_t, Y_{t-1}, R >= j+1) and dividing each term by f(Y_{t-1}, R >= j).

The observed data identify f(Y_t | Y_{t-1}, R = j) only when j >= t (Y_t is observed in that pattern) and, through (B.15), mixtures over patterns with R >= t; they carry no information on f(Y_t | Y_{t-1}, R = j) for j <= t-1. With T = 4, the identified conditionals are f(Y_4 | Y_3, R = 4), f(Y_3 | Y_2, R = 4), f(Y_3 | Y_2, R = 3) (and hence f(Y_3 | Y_2, R >= 3)), f(Y_2 | Y_1, R = 4), f(Y_2 | Y_1, R = 3), f(Y_2 | Y_1, R = 2) (and hence f(Y_2 | Y_1, R >= 2)), together with f(Y_1 | R = j) for every j. The unidentified conditionals are f(Y_4 | Y_3, R = 3), f(Y_4 | Y_3, R = 2), f(Y_4 | Y_3, R = 1), f(Y_3 | Y_2, R = 2), f(Y_3 | Y_2, R = 1) and f(Y_2 | Y_1, R = 1).

Figure B.1: f(Y_3 | Y_2, R >= 3) as a mixture of the pattern-specific distributions f(Y_3 | Y_2, R = 3) and f(Y_3 | Y_2, R = 4) (the "interior family" of distributions lying between them).

Table B.2: conditionals identified by the observed data (T = 4; a dash marks an unidentified conditional)
  Pattern R = 4:  f(Y_1 | R=4)   f(Y_2 | Y_1, R=4)   f(Y_3 | Y_2, R=4)   f(Y_4 | Y_3, R=4)
  Pattern R = 3:  f(Y_1 | R=3)   f(Y_2 | Y_1, R=3)   f(Y_3 | Y_2, R=3)   -
  Pattern R = 2:  f(Y_1 | R=2)   f(Y_2 | Y_1, R=2)   -                   -
  Pattern R = 1:  f(Y_1 | R=1)   -                   -                   -

B.4.2 The NFMV and ACMV restrictions

Among the identifying restrictions for pattern-mixture models introduced in Chapter 6 (CCMV, NCMV, ACMV, NFMV), this section focuses on NFMV and ACMV.

NFMV (non-future missing value; Kenward et al., 2003) imposes, for all t >= 3 and j <= t - 2,

  f(Y_t | Y_{t-1}, R = j) = f(Y_t | Y_{t-1}, R >= t - 1),   (B.16)

that is, unidentified conditionals are borrowed from the patterns in which dropout occurs at visit t or later, except that f(Y_t | Y_{t-1}, R = t-1), the distribution of the first missing value in each pattern, is left unspecified. With T = 4, NFMV imposes

  f(Y_4 | Y_3, R = 1) = f(Y_4 | Y_3, R >= 3)   (B.17)
  f(Y_4 | Y_3, R = 2) = f(Y_4 | Y_3, R >= 3)   (B.18)
  f(Y_3 | Y_2, R = 1) = f(Y_3 | Y_2, R >= 2)   (B.19)

so that f(Y_4 | Y_3, R = 1) = f(Y_4 | Y_3, R = 2) = f(Y_4 | Y_3, R >= 3) and f(Y_3 | Y_2, R = 1) = f(Y_3 | Y_2, R >= 2). Note that f(Y_4 | Y_3, R >= 3) is itself the mixture f(Y_4 | Y_3, R = 3) P(R = 3 | Y_3, R >= 3) + f(Y_4 | Y_3, R = 4) P(R = 4 | Y_3, R >= 3), which involves the still-unspecified f(Y_4 | Y_3, R = 3). After applying NFMV, the conditionals that remain unspecified are exactly f(Y_4 | Y_3, R = 3), f(Y_3 | Y_2, R = 2) and f(Y_2 | Y_1, R = 1).

Table B.4: specification of the conditionals under NFMV (T = 4; "unspecified" marks the free components)
  Pattern R = 4:  f(Y_1 | R=4)   f(Y_2 | Y_1, R=4)   f(Y_3 | Y_2, R=4)   f(Y_4 | Y_3, R=4)
  Pattern R = 3:  f(Y_1 | R=3)   f(Y_2 | Y_1, R=3)   f(Y_3 | Y_2, R=3)   unspecified
  Pattern R = 2:  f(Y_1 | R=2)   f(Y_2 | Y_1, R=2)   unspecified         f(Y_4 | Y_3, R>=3)
  Pattern R = 1:  f(Y_1 | R=1)   unspecified         f(Y_3 | Y_2, R>=2)  f(Y_4 | Y_3, R>=3)

ACMV (available-case missing value) imposes, for all t >= 2 and j <= t - 1,

  f(Y_t | Y_{t-1}, R = j) = f(Y_t | Y_{t-1}, R >= t).   (B.20)

With T = 4 this gives

  f(Y_4 | Y_3, R = 1) = f(Y_4 | Y_3, R = 4)   (B.21)
  f(Y_4 | Y_3, R = 2) = f(Y_4 | Y_3, R = 4)   (B.22)
  f(Y_4 | Y_3, R = 3) = f(Y_4 | Y_3, R = 4)   (B.23)
  f(Y_3 | Y_2, R = 1) = f(Y_3 | Y_2, R >= 3)  (B.24)
  f(Y_3 | Y_2, R = 2) = f(Y_3 | Y_2, R >= 3)  (B.25)
  f(Y_2 | Y_1, R = 1) = f(Y_2 | Y_1, R >= 2)  (B.26)

(for t = 4 the set {R >= 4} consists of the completers only, so R >= 4 is written R = 4; f(Y_3 | Y_2, R >= 3) is the identified mixture of f(Y_3 | Y_2, R = 3) and f(Y_3 | Y_2, R = 4)). Under ACMV every conditional is fully specified:

Table B.6: specification of the conditionals under ACMV (T = 4)
  Pattern R = 4:  f(Y_1 | R=4)   f(Y_2 | Y_1, R=4)   f(Y_3 | Y_2, R=4)   f(Y_4 | Y_3, R=4)
  Pattern R = 3:  f(Y_1 | R=3)   f(Y_2 | Y_1, R=3)   f(Y_3 | Y_2, R=3)   f(Y_4 | Y_3, R=4)
  Pattern R = 2:  f(Y_1 | R=2)   f(Y_2 | Y_1, R=2)   f(Y_3 | Y_2, R>=3)  f(Y_4 | Y_3, R=4)
  Pattern R = 1:  f(Y_1 | R=1)   f(Y_2 | Y_1, R>=2)  f(Y_3 | Y_2, R>=3)  f(Y_4 | Y_3, R=4)

The difference between NFMV and ACMV

ACMV implies NFMV. Under ACMV, (B.23) makes the mixture f(Y_4 | Y_3, R >= 3) collapse to the completer conditional:

  f(Y_4 | Y_3, R >= 3) = f(Y_4 | Y_3, R = 3) P(R = 3 | Y_3, R >= 3) + f(Y_4 | Y_3, R = 4) P(R = 4 | Y_3, R >= 3) = f(Y_4 | Y_3, R = 4),   (B.27)

so (B.21) and (B.22) are exactly the NFMV conditions (B.17) and (B.18); similarly (B.25) implies f(Y_3 | Y_2, R >= 2) = f(Y_3 | Y_2, R >= 3), so (B.24) is (B.19). Conversely, NFMV together with the three additional conditions (B.23), (B.25) and (B.26), which concern exactly the conditionals f(Y_t | Y_{t-1}, R = t-1) that NFMV leaves unspecified, implies (B.21) through (B.26), i.e. ACMV. Table B.7 (identical in layout to Table B.4) highlights that the cells left unspecified by NFMV are precisely f(Y_4 | Y_3, R = 3), f(Y_3 | Y_2, R = 2) and f(Y_2 | Y_1, R = 1).

In general, NFMV restricts f(Y_t | Y_{t-1}, R = j) for t >= 3 and j <= t - 2, whereas ACMV restricts these conditionals for t >= 2 and j <= t - 1; the difference between the two is exactly the set of conditions

  f(Y_t | Y_{t-1}, R = t - 1) = f(Y_t | Y_{t-1}, R >= t)   for all t >= 2.   (B.28)

NFMV combined with (B.28) is equivalent to ACMV:
- ACMV implies NFMV and (B.28): (B.28) is the j = t-1 case of (B.20); and for t >= 3, j <= t-2, ACMV gives f(Y_t | Y_{t-1}, R = j) = f(Y_t | Y_{t-1}, R >= t) = f(Y_t | Y_{t-1}, R >= t-1), the last equality holding because, by (B.28) and the mixture identity (B.15), f(Y_t | Y_{t-1}, R >= t-1) = f(Y_t | Y_{t-1}, R = t-1) P(R = t-1 | Y_{t-1}, R >= t-1) + f(Y_t | Y_{t-1}, R >= t) P(R >= t | Y_{t-1}, R >= t-1) = f(Y_t | Y_{t-1}, R >= t). This is (B.16).
- NFMV and (B.28) imply ACMV: for j = t-1, (B.28) is (B.20) directly; for t >= 3 and j <= t-2, (B.16) and (B.28) give f(Y_t | Y_{t-1}, R = j) = f(Y_t | Y_{t-1}, R >= t-1) = f(Y_t | Y_{t-1}, R >= t) by the same mixture argument.

Table B.8: NFMV combined with (B.28) (equivalent to ACMV; the mixtures f(Y_4 | Y_3, R >= 3) and f(Y_3 | Y_2, R >= 2) coincide with the ACMV choices in Table B.6 under (B.28))
  Pattern R = 4:  f(Y_1 | R=4)   f(Y_2 | Y_1, R=4)   f(Y_3 | Y_2, R=4)   f(Y_4 | Y_3, R=4)
  Pattern R = 3:  f(Y_1 | R=3)   f(Y_2 | Y_1, R=3)   f(Y_3 | Y_2, R=3)   f(Y_4 | Y_3, R=4)
  Pattern R = 2:  f(Y_1 | R=2)   f(Y_2 | Y_1, R=2)   f(Y_3 | Y_2, R>=3)  f(Y_4 | Y_3, R>=3)
  Pattern R = 1:  f(Y_1 | R=1)   f(Y_2 | Y_1, R>=2)  f(Y_3 | Y_2, R>=2)  f(Y_4 | Y_3, R>=3)

Since ACMV is equivalent to MAR for monotone missingness (Section B.4.4), condition (B.28) characterizes the departure from MAR within the NFMV family: under NFMV the analyst is free to choose f(Y_t | Y_{t-1}, R = t-1) for each t; choosing it equal to f(Y_t | Y_{t-1}, R >= t), i.e. imposing (B.28), recovers MAR, and any other choice defines an MNAR model. This is the structure exploited by the pattern-mixture approach of NRC (2010, Chapter 5).

B.4.3 The pattern-mixture models of NRC (2010), Chapter 5

NRC (2010, p. 98) adopts the NFMV restriction and, on p. 99, specifies the distribution of the first missing value in each pattern, f(Y_t | Y_{t-1}, R = t-1), by reference to the identified distribution f(Y_t | Y_{t-1}, R >= t) (NRC equations (22), (32) and (33)). With T = 4 and linear conditional mean models, the identified parts are

  E[Y_1] = mu_1
  E[Y_2 | Y_1, R >= 2] = mu_2 + Y_1 beta_2
  E[Y_3 | Y_2, R >= 3] = mu_3 + Y_2' beta_3            (B.29)
  E[Y_4 | Y_3, R = 4]  = mu_4 + Y_3' beta_4

and the unidentified parts, i.e. the first missing value in each dropout pattern, are written as

  E[Y_2 | Y_1, R = 1] = mu*_2 + Y_1 beta*_2
  E[Y_3 | Y_2, R = 2] = mu*_3 + Y_2' beta*_3           (B.30)
  E[Y_4 | Y_3, R = 3] = mu*_4 + Y_3' beta*_4

If beta*_t = beta_t and mu*_t = mu_t for every t, (B.30) coincides with (B.29), which is condition (B.28) at the level of the conditional means, i.e. ACMV and hence MAR.

A sensitivity analysis is obtained by shifting the unidentified means away from their MAR values. Writing

  mu*_2 = mu_2 + Delta_2,   mu*_3 = mu_3 + Delta_3,   mu*_4 = mu_4 + Delta_4,

and keeping beta*_t = beta_t, (B.30) becomes

  E[Y_2 | Y_1, R = 1] = mu_2 + Delta_2 + Y_1 beta_2  = E[Y_2 | Y_1, R >= 2] + Delta_2
  E[Y_3 | Y_2, R = 2] = mu_3 + Delta_3 + Y_2' beta_3 = E[Y_3 | Y_2, R >= 3] + Delta_3
  E[Y_4 | Y_3, R = 3] = mu_4 + Delta_4 + Y_3' beta_4 = E[Y_4 | Y_3, R = 4] + Delta_4

Setting Delta_2 = Delta_3 = Delta_4 = 0 gives ACMV, i.e. MAR; non-zero Delta values define MNAR models, and varying them yields a delta-adjustment (tipping-point) sensitivity analysis around MAR. In practice these pattern-mixture models are fitted by multiple imputation; the macros published at missingdata.org.uk and the %delta_pmm and %cbi_pmm macros described in Appendix C implement this type of analysis. A minimal PROC MI sketch of a delta adjustment is given below.
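The sketch below uses the MNAR statement of PROC MI (available in SAS/STAT 13.1 and later). The data set, variable names, treatment level and shift value are hypothetical, and the exact option spelling should be confirmed against the documentation of the installed SAS version; %delta_pmm wraps a more general version of the same idea.

  /* Sketch: MAR imputation with a fixed delta applied to the imputed      */
  /* values of the active arm only (change scores, lower = better, so a    */
  /* positive shift makes the imputed values worse).                       */
  proc mi data=trial nimpute=100 seed=20150703 out=imp_delta;
    class trt;
    var trt basval y1 y2 y3;
    monotone reg;                           /* sequential regression imputation */
    mnar adjust(y2 / shift=2 adjustobs(trt='ACTIVE'))
         adjust(y3 / shift=2 adjustobs(trt='ACTIVE'));
  run;

  /* Each imputed data set is then analyzed (e.g., ANCOVA at the last      */
  /* visit) and the results are combined with PROC MIANALYZE.              */

Repeating the imputation and analysis over a grid of shift values, and recording where the treatment comparison loses significance, gives the tipping-point analysis described above.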

B.4.4 Equivalence of ACMV and MAR (monotone missingness)

This subsection proves that, for monotone missingness, ACMV is equivalent to MAR (Molenberghs et al., 1998; Verbeke and Molenberghs, 2000). Write Y_t = (Y_1, ..., Y_t) for the history up to time t (Y_T is the complete vector) and Y^j_i = (Y_i, ..., Y_j); R = j means that exactly Y_1, ..., Y_j are observed. The two conditions are

  MAR:   P(R = t | Y_T) = P(R = t | Y_t)  for every t >= 1;
  ACMV:  f(Y_t | Y_{t-1}, R = j) = f(Y_t | Y_{t-1}, R >= t)  for every t >= 2 and j <= t - 1,

under the positivity conditions P(R = j | Y_t) > 0 and f(Y_j) > 0 for all relevant j and t.

Lemma B.4.1. ACMV holds if and only if, for every t >= 2 and j <= t - 1,

  f(Y_t | Y_{t-1}, R = j) = f(Y_t | Y_{t-1}).   (B.31)

Proof sketch. Decompose the marginal conditional as the mixture

  f(Y_t | Y_{t-1}) = Sum_{i=1}^{t-1} f(Y_t | Y_{t-1}, R = i) P(R = i | Y_{t-1}) + f(Y_t | Y_{t-1}, R >= t) P(R >= t | Y_{t-1}).   (B.32)

Under ACMV every component equals f(Y_t | Y_{t-1}, R = j) and the weights sum to one, which gives (B.31). Conversely, substituting (B.31) into (B.32) and using P(R >= t | Y_{t-1}) > 0 gives f(Y_t | Y_{t-1}, R = j) = f(Y_t | Y_{t-1}, R >= t), i.e. ACMV.

Lemma B.4.2. Under monotone MAR, P(R = t | Y_u) = P(R = t | Y_t) for every 1 <= t <= u <= T.
Proof sketch. For t < u < T, write P(R = t | Y_u) as the integral of P(R = t | Y_T) f(Y_T) over Y^T_{u+1} divided by f(Y_u), use MAR to replace P(R = t | Y_T) by P(R = t | Y_t) and pull it outside the integral. The cases u = t and u = T are immediate.

Lemma B.4.3. For 1 <= i < j <= T, f(Y^j_{i+1} | Y_i) = Prod_{t=i}^{j-1} f(Y_{t+1} | Y_t).
Proof sketch. Repeated application of f(X, Y | Z) = f(X | Y, Z) f(Y | Z), noting that conditioning on Y^{j-1}_{i+1} and Y_i together is the same as conditioning on the history Y_{j-1}.

Theorem B.4.1. For monotone missingness, ACMV and MAR are equivalent.
Proof sketch. By Lemma B.4.1 it suffices to show that MAR is equivalent to (B.31).
(MAR implies (B.31)) For j <= t <= T - 1,

  f(Y_T, R = j) = P(R = j | Y_T) f(Y_T) = P(R = j | Y_j) f(Y_T)                                   (B.33)
  f(Y_T, R = j) = f(Y^T_{t+1} | Y_t, R = j) P(R = j | Y_t) f(Y_t) = f(Y^T_{t+1} | Y_t, R = j) P(R = j | Y_j) f(Y_t)   (B.34)

using MAR in (B.33) and Lemma B.4.2 in (B.34). Equating them and dividing by P(R = j | Y_j) f(Y_t) > 0 gives f(Y^T_{t+1} | Y_t, R = j) = f(Y^T_{t+1} | Y_t); integrating out Y^T_{t+2} yields f(Y_{t+1} | Y_t, R = j) = f(Y_{t+1} | Y_t) for every t >= j, which is (B.31) (the case t = T - 1 is obtained directly without the final integration).
((B.31) implies MAR) For j <= T - 1, using Lemma B.4.3 and then (B.31),

  f(Y_T, R = j) = f(Y_j, R = j) f(Y^T_{j+1} | Y_j, R = j)
               = f(Y_j, R = j) Prod_{t=j+1}^{T} f(Y_t | Y_{t-1}, R = j)
               = f(Y_j, R = j) Prod_{t=j+1}^{T} f(Y_t | Y_{t-1})
               = P(R = j | Y_j) f(Y_j) f(Y^T_{j+1} | Y_j) = P(R = j | Y_j) f(Y_T),                 (B.35)

while by definition f(Y_T, R = j) = P(R = j | Y_T) f(Y_T) (B.36). Since f(Y_T) > 0, P(R = j | Y_T) = P(R = j | Y_j), i.e. MAR (the case j = T is trivial).

B.4.5 Equivalence of NFMV and NFD

The same argument shows that NFMV is equivalent to non-future dependence of the dropout process (Kenward et al., 2003). The two conditions are

  NFD:   P(R = t | Y_T) = P(R = t | Y_{t+1})  for every t >= 1;
  NFMV:  f(Y_t | Y_{t-1}, R = j) = f(Y_t | Y_{t-1}, R >= t - 1)  for every t >= 3 and j <= t - 2,

again under the positivity conditions P(R = j | Y_t) > 0 and f(Y_j) > 0.

Lemma B.4.4. NFMV holds if and only if, for every t >= 3 and j <= t - 2,

  f(Y_t | Y_{t-1}, R = j) = f(Y_t | Y_{t-1}).   (B.37)

Proof sketch. Analogous to Lemma B.4.1, using the decomposition

  f(Y_t | Y_{t-1}) = Sum_{i=1}^{t-2} f(Y_t | Y_{t-1}, R = i) P(R = i | Y_{t-1}) + f(Y_t | Y_{t-1}, R >= t-1) P(R >= t-1 | Y_{t-1})   (B.38)

and the positivity of P(R >= t-1 | Y_{t-1}).

Lemma B.4.5. Under NFD, P(R = t | Y_u) = P(R = t | Y_{t+1}) for every 1 <= t < u <= T.
Proof sketch. As in Lemma B.4.2, integrating P(R = t | Y_T) f(Y_T) over Y^T_{u+1} and using NFD.

Theorem B.4.2. For monotone missingness, NFMV and NFD are equivalent.
Proof sketch. By Lemma B.4.4 it suffices to show that NFD is equivalent to (B.37); the argument parallels Theorem B.4.1 with Y_j replaced by Y_{j+1}.
(NFD implies (B.37)) f(Y_T, R = j) = P(R = j | Y_T) f(Y_T) = P(R = j | Y_{j+1}) f(Y_T) (B.39), and also, for t > j and using Lemma B.4.5, f(Y_T, R = j) = f(Y^T_{t+1} | Y_t, R = j) P(R = j | Y_{j+1}) f(Y_t) (B.40). Equating (B.39) and (B.40), dividing by P(R = j | Y_{j+1}) f(Y_t) > 0 and marginalizing gives f(Y_{t+1} | Y_t, R = j) = f(Y_{t+1} | Y_t) for every t >= j + 1, which is (B.37).
((B.37) implies NFD) For j <= T - 2, using Lemma B.4.3 and (B.37),

  f(Y_T, R = j) = f(Y_{j+1}, R = j) Prod_{t=j+2}^{T} f(Y_t | Y_{t-1}, R = j)
               = f(Y_{j+1}, R = j) f(Y^T_{j+2} | Y_{j+1}) = P(R = j | Y_{j+1}) f(Y_T),             (B.41)

while f(Y_T, R = j) = P(R = j | Y_T) f(Y_T) (B.42); hence P(R = j | Y_T) = P(R = j | Y_{j+1}), i.e. NFD (the cases j = T - 1 and j = T are immediate).

References (Appendix B)

[1] Diggle, P. and Kenward, M. G. (1994). Informative drop-out in longitudinal data analysis. Applied Statistics, 43.
[2] Kenward, M. G., Molenberghs, G., and Thijs, H. (2003). Pattern-mixture models with proper time dependence. Biometrika, 90(1).
[3] Mallinckrodt, C. H. (2013). Preventing and Treating Missing Data in Longitudinal Clinical Trials. Cambridge University Press.
[4] Molenberghs, G., Michiels, B., Kenward, M. G., and Diggle, P. J. (1998). Monotone missing data and pattern-mixture models. Statistica Neerlandica, 52(2).
[5] Nelder, J. A., and Mead, R. (1965a). A simplex method for function minimization. The Computer Journal, 7(4).
[6] Nelder, J. A., and Mead, R. (1965b). A simplex method for function minimization - errata. The Computer Journal, 8(1), 27.
[7] National Research Council (2010). The Prevention and Treatment of Missing Data in Clinical Trials. National Academies Press.
[8] Seaman, S., Galati, J., Jackson, D., and Carlin, J. (2013). What is meant by "Missing at Random"? Statistical Science, 28(2).
[9] Verbeke, G., and Molenberghs, G. (2000). Linear Mixed Models for Longitudinal Data. Springer.

Appendix C: SAS macros

C.1 The DIA working group macros

The DIA (Drug Information Association) Scientific Working Group on Missing Data has published a set of SAS macros implementing the analyses discussed in Mallinckrodt et al. (2013); Table C.1 lists the macro groups. The macros described in this appendix are based on, or call, these DIA working group programs (available from missingdata.org.uk).

Table C.1: SAS macros published by the DIA working group
- Descriptive Summaries: SAS Code for Descriptive Summaries and Plot (descriptive statistics).
- Inclusive Modeling Approaches: Vansteelandt's Doubly Robust Estimation Method (Dbrobust); Direct Likelihood Approach Including Influence and Residual Diagnosis (direct likelihood / MMRM); Inclusive Multiple Imputation (MI) and Weighted Generalized Estimating Equation (wGEE).
- Missing Not at Random (MNAR) Methods: Pattern Mixture Model with identifiability constraints CCMV and NCMV (PMM); Shared Parameter Model (SPM); Selection Model.
- Control-Based Multiple Imputation: reference-based MI via Multivariate Normal Repeated Measures (MNRM), five macros (CIR, J2R, CR, ALMCF, OLMCF).
- Placebo MI including Tipping Point Analysis with Delta-Adjusting Imputation: PMM delta / tipping point and control-based imputation (placebo multiple imputation).
- Example Datasets.

Reference: Mallinckrodt, C. H., Roger, J., Chuang-Stein, C., Molenberghs, G., Lane, P. W., O'Kelly, M., Ratitch, B., Xu, L., Gilbert, S., Mehrotra, D. V., Wolfinger, R., and Thijs, H. (2013). Missing data: turning guidance into action. Statistics in Biopharmaceutical Research, 5.

C.2 %Selection_Model2

C.2.1 Overview

%Selection_Model2 fits the selection model of Diggle and Kenward (1994) described in Chapter 5 and Appendix B.3, under MCAR, MAR or MNAR dropout mechanisms. It is based on the DIA working group "Missing Not at Random (MNAR) Methods / Selection Model" program and requires Base SAS, SAS/STAT and SAS/IML. A typical call:

  %Selection_Model2(
    INPUTDS    = HMGR,
    COVTYPE    = UN,
    response   = change,
    MODL       = seq therapy seq*therapy basval basval*seq,
    CLASVAR    = seq therapy,
    mech       = MNAR,
    psi5       = -.02,
    psi6       = -.02,
    const      = 8,
    derivative = 1,
    out1       = HMGR_estimates,
    out2       = HMGR_lsmeandifferences,
    out3       = HMGR_lsmeans
  );

out1 receives the parameter estimates, out2 the least-squares mean differences between treatments, and out3 the least-squares means.

C.2.2 Parameters

- INPUTDS (e.g. INPUTDS=HMGR): input data set.
- COVTYPE (e.g. COVTYPE=UN): covariance structure; one of UN, TOEP, TOEPH, ARH, AR, CSH, CS, as in PROC MIXED.
- response (e.g. response=change): response variable.
- MODL (e.g. MODL=seq therapy seq*therapy basval basval*seq): fixed-effects terms, as in the MODEL statement of PROC MIXED.
- CLASVAR (e.g. CLASVAR=seq therapy): classification variables.
- mech (e.g. mech=MNAR): assumed missingness mechanism; MCAR, MAR, MNAR (psi5 and psi6 estimated) or MNARS (MNAR with psi5 and psi6 fixed at the supplied values).
- psi5 (e.g. psi5=-.02): value of psi5, used when mech=MNARS.
- psi6 (e.g. psi6=-.02): value of psi6, used when mech=MNARS.
- const (e.g. const=8): half-width, in standard deviations, of the numerical integration range (mu - const*sigma, mu + const*sigma).
- derivative (e.g. derivative=1): 1 = use the analytic log-likelihood derivative module, 0 = numerical derivatives.
- out1, out2, out3: names of the three output data sets described above.

C.2.3 Implementation notes

%Selection_Model2 runs in two steps. The "initial" step obtains starting values: PROC LOGISTIC for the dropout-model parameters and PROC MIXED (MMRM) for the measurement-model parameters, assembled visit by visit. The "SeM" step evaluates the log-likelihood (module loglik) and its derivative (module loglikderivative) in PROC IML and maximizes the likelihood with the NLPNRR (Newton-Raphson ridge) optimizer; when derivative=0 the derivatives are obtained numerically with NLPFDD. Results are printed with PROC PRINT. %SM_GridSearch repeats the fit over grids of fixed psi5 and psi6 values supplied through the psi5grid and psi6grid parameters, the remaining parameters being those of %Selection_Model2.

C.2.4 Data requirements

The input data set is assumed to contain, among other things, a visit variable named "seq", a treatment variable named "therapy" and a subject identifier named "patient". An illustrative grid-search call is sketched below.

Reference: Diggle, P. & Kenward, M. G. (1994). Informative drop-out in longitudinal data analysis. Applied Statistics, 43(1).
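As an illustration, a sensitivity grid over the MNAR parameters might be invoked as follows. This is a sketch only: psi5grid and psi6grid are named in the text above, but the remaining parameter list of %SM_GridSearch is assumed to mirror %Selection_Model2 and should be checked against the macro header; the grid values themselves are arbitrary.

  /* Sketch of a sensitivity grid over the fixed MNAR parameters.          */
  %SM_GridSearch(
    INPUTDS   = HMGR,
    COVTYPE   = UN,
    response  = change,
    MODL      = seq therapy seq*therapy basval basval*seq,
    CLASVAR   = seq therapy,
    psi5grid  = -0.04 -0.02 0 0.02 0.04,
    psi6grid  = -0.04 -0.02 0 0.02 0.04,
    const     = 8,
    out1      = HMGR_grid_estimates,
    out2      = HMGR_grid_lsmeandifferences,
    out3      = HMGR_grid_lsmeans
  );

Plotting the estimated treatment difference from out2 against the (psi5, psi6) grid shows how sensitive the conclusion is to departures from MAR (psi5 = psi6 = 0).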

C.3 %cbi_pmm

C.3.1 Overview

%cbi_pmm performs control-based imputation under the pattern-mixture models of Chapter 6 (and the analyses of Chapter 10). It is based on the DIA working group "Control-Based Multiple Imputation" and "Placebo MI including Tipping Point Analysis with Delta-Adjusting Imputation" programs and requires Base SAS and SAS/STAT. A typical call:

  %cbi_pmm(
    datain=data0s,
    trtname=trt,
    subjname=id,
    visname=visit,
    basecont=%str(v0),
    baseclass=,
    postcont=%str(val),
    postclass=,
    seed= ,
    nimp=100,
    primaryname=val,
    analcovarcont=%str(v0),
    trtref=0,
    analmethod=mmrm,
    repstr=un,
    dataout=data0imputed,
    resout=pmmresults
  );

The imputed data sets are written to dataout and the combined analysis results to resout (and printed as SAS output).

C.3.2 Parameters

- datain: input data set.
- trtname (e.g. trtname=trt): treatment variable.
- subjname (e.g. subjname=subjid): subject ID variable.
- visname (e.g. visname=timeptn): visit variable, coded 0, 1, 2, ... within subject.
- poolsite (e.g. poolsite=invid): pooled site/investigator ID variable.
- basecont / baseclass: continuous / categorical baseline covariates.
- postcont (e.g. postcont=%str(scorepr SCORESC)): continuous post-baseline variables.
- postclass (e.g. postclass=%str(scoretr SCOREOT)): categorical post-baseline variables.
- seed: random-number seed.
- nimp: number of imputations.
- primaryname: primary (analysis) variable.
- analcovarcont / analcovarclass: continuous / categorical covariates of the analysis model.
- trtref (e.g. trtref=0): code of the reference (control) arm from which imputations are drawn.
- analmethod (e.g. analmethod=%str(mmrm)): analysis applied to each imputed data set; ancova or mmrm.
- repstr (e.g. repstr=un): covariance structure when analmethod=mmrm; with type=UN the Kenward-Roger denominator degrees of freedom (DDFM=KR) are used.
- dataout: output data set with the imputed data.
- resout: output data set with the combined results.

C.3.3 and C.3.4 Notes

Intermittent (non-monotone) missing values are first imputed under MAR with PROC MI (MCMC method) before the control-based step is applied to the monotone part.

Reference: Ratitch, B. and O'Kelly, M. Implementation of Pattern-Mixture Models Using Standard SAS/STAT Procedures. PharmaSUG.

C.4 %delta_pmm

C.4.1 Overview

%delta_pmm performs delta-adjusted multiple imputation under a pattern-mixture model (Chapter 6 and Section B.4.3). It is based on the DIA working group "Missing Not at Random (MNAR) Methods / Placebo MI including Tipping Point Analysis with Delta-Adjusting Imputation" program and requires Base SAS and SAS/STAT. A typical call:

  %delta_pmm(
    datain=data0,
    trtname=trt,
    subjname=patient,
    visname=visit,
    poolsite=poolinv,
    basecont=%str(basval),
    baseclass=%str(gendern),
    postcont=%str(hamdtl17 hamatotl),
    postclass=%str(pgiimp),
    seed= ,
    nimp=100,
    deltavis=all,
    deltacont=%str(1 0.2),
    deltacontarm=%str(1 1),
    deltacontmethod=%str(meanabs meanper),
    favorcont=%str(low low),
    deltacontlimit=%str(),
    deltaclass=%str(0.5),
    deltaclassarm=%str(1),
    favorclass=%str(low),
    deltaclasslimit=%str(7),
    primaryname=hamdtl17,
    analcovarcont=%str(basval basval*visit),
    analcovarclass=%str(poolinv),
    trtref=0,
    analmethod=mmrm,
    repstr=un,
    dataout=data0imputed2,
    resout=pmmresults2
  );

The combined results are written to resout (and printed as SAS output); the imputed data go to dataout.

C.4.2 Parameters

Parameters shared with %cbi_pmm (datain, trtname, subjname, visname, poolsite, basecont, baseclass, postcont, postclass, seed, nimp, primaryname, analcovarcont, analcovarclass, trtref, analmethod, repstr, dataout, resout) have the same meaning as in Section C.3.2. The delta-adjustment parameters are:

- deltavis (e.g. deltavis=%str(all)): visits at which the delta adjustment is applied; "first" applies it only at the first visit after dropout (as in NRC, 2010), "all" applies it at every imputed visit, or a list of visit numbers can be given.
- deltacont (e.g. deltacont=%str(1 0.1)): delta value for each continuous post-baseline variable in postcont; a value of 0 corresponds to MAR for that variable.
- deltacontarm (e.g. deltacontarm=%str(1 0,1)): arm(s) to which the continuous delta adjustment is applied, one entry per postcont variable (a single treatment code, or a comma-separated list such as 0,1 for both arms).
- deltacontmethod (e.g. deltacontmethod=%str(meanabs slopeper)): how the delta is applied to each postcont variable; meanabs (absolute shift of the mean), meanper (percentage shift of the mean) or slopeper (percentage shift of the visit-to-visit slope).
- favorcont (e.g. favorcont=%str(low high)): the direction (low or high) regarded as favourable for each postcont variable.
- deltacontlimit (e.g. deltacontlimit=%str(0 10)): limits applied to the adjusted values of the postcont variables.
- deltaclass, deltaclassarm, favorclass, deltaclasslimit (e.g. deltaclasslimit=%str(0 5)): the corresponding settings for the categorical post-baseline variables in postclass.

C.4.3 and C.4.4 Notes

(1) Intermittent (non-monotone) missing values are first imputed under MAR; (2) the delta adjustment is then applied visit by visit to the monotone part. A sketch of a tipping-point search built on %delta_pmm follows the references below.

References
- National Research Council (2010). The Prevention and Treatment of Missing Data in Clinical Trials. National Academies Press, Washington, DC.
- Van Buuren, S., Boshuizen, H. C., and Knook, D. L. (1999). Multiple imputation of missing blood pressure covariates in survival analysis. Statistics in Medicine, 18.
- Ratitch, B., O'Kelly, M., and Tosiello, R. (2013). Missing data in clinical trials: from clinical assumptions to statistical analysis using pattern mixture models. Pharmaceutical Statistics, 12.
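A tipping-point search can be built around %delta_pmm by rerunning it over increasing delta values. The wrapper below is only a sketch: the macro parameters are those documented in the example call of C.4.1 (with hypothetical data set and variable names), while the loop itself, the seed and the list of delta values are illustrative choices.

  /* Sketch of a tipping-point search over deltacont values.               */
  %macro tipping_point(deltalist=0 1 2 3 4 5);
    %local k d;
    %do k = 1 %to %sysfunc(countw(&deltalist, %str( )));
      %let d = %scan(&deltalist, &k, %str( ));
      %delta_pmm(
        datain=data0, trtname=trt, subjname=patient, visname=visit,
        poolsite=poolinv, basecont=%str(basval), baseclass=%str(gendern),
        postcont=%str(hamdtl17 hamatotl), postclass=%str(pgiimp),
        seed=12345, nimp=100,
        deltavis=all,
        deltacont=%str(&d &d),            /* same delta for both postcont variables */
        deltacontarm=%str(1 1),
        deltacontmethod=%str(meanabs meanabs),
        favorcont=%str(low low),
        deltacontlimit=%str(),
        deltaclass=%str(0), deltaclassarm=%str(1),
        favorclass=%str(low), deltaclasslimit=%str(7),
        primaryname=hamdtl17,
        analcovarcont=%str(basval basval*visit),
        analcovarclass=%str(poolinv),
        trtref=0, analmethod=mmrm, repstr=un,
        dataout=imp_d&k, resout=res_d&k);
    %end;
  %mend tipping_point;

  %tipping_point(deltalist=0 1 2 3 4 5)

Inspecting the treatment comparison in res_d1, res_d2, ... identifies the smallest delta at which significance is lost, i.e. the tipping point.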

C.5 %Shared_Parameter1 (%shared_parameter)

C.5.1 Overview

The macro fits the shared parameter model of Chapter 7, based on the DIA working group "Missing Not at Random (MNAR) Methods / Shared Parameter Model" program; it requires Base SAS, SAS/STAT and SAS/IML. A typical call:

  %shared_parameter(
    INPUTDS=TEMP,
    SUBJVAR=PATIENT,
    TRTVAR=DRUG,
    TIME=SWK,
    Additional_Class=%STR(POOLINV GENDER),
    MODL=%STR(CHANGE=POOLINV BASE DRUG SWK SWK*DRUG),
    LINK=%STR(CLOGLOG),
    RANDOM_SLOPE=%STR(LINEAR),
    DEBUG=0
  );

The results are written to the data set _SP_OUT_FINAL.

C.5.2 Parameters

- INPUTDS (e.g. INPUTDS=TEMP): input data set.
- SUBJVAR (e.g. SUBJVAR=PATIENT): subject ID variable.
- TRTVAR (e.g. TRTVAR=DRUG): treatment variable, coded 0/1.
- TIME (e.g. TIME=SWK): time (visit) variable.
- QUADTIME (e.g. QUADTIME=QUADWEEK): quadratic time variable, used with a quadratic random slope.
- Additional_Class (e.g. Additional_Class=%STR(POOLINV GENDER)): additional classification variables, added to the CLASS statement together with SUBJVAR and TRTVAR.
- MODL (e.g. MODL=%STR(CHANGE=POOLINV BASE DRUG SWK SWK*DRUG)): model specification in the form of a PROC MIXED MODEL statement.
- LINK (e.g. LINK=%STR(CLOGLOG)): link for the dropout model; CLOGLOG (complementary log-log) or LOGIT.
- RANDOM_SLOPE (e.g. RANDOM_SLOPE=%STR(LINEAR)): random effects shared with the dropout model; NONE (random intercept only), LINEAR (random intercept and linear slope) or QUAD (adds a quadratic slope).
- CRIT: convergence criterion.
- DEBUG (e.g. DEBUG=0): 0 deletes intermediate data sets, 1 keeps them for debugging.

C.5.3 Model

The measurement model is the linear mixed model

  Y_i = X_i beta + Z_i U_i + epsilon_i,   (C.1)

and the dropout time D_i is modeled with a discrete-time survival model that shares the same random effects U_i. With LINK=CLOGLOG the dropout hazard is modeled on the complementary log-log scale,

  log( -log( 1 - Pr(D_i = j | D_i >= j) ) ) = W_i alpha + Gamma_i U_i,   (C.2)

(equations (C.3) and (C.4) give the corresponding cumulative form), and with LINK=LOGIT the logit scale is used instead (C.5). In both cases the linear predictor W_i alpha + Gamma_i U_i contains the random effects U_i of (C.1), which is what links the measurement and dropout processes. According to RANDOM_SLOPE, U_i consists of a random intercept (NONE), a random intercept and linear slope (LINEAR), or a random intercept, linear and quadratic slope (QUAD); the joint model is fitted with PROC NLMIXED. A minimal PROC NLMIXED sketch of this type of model is given below.
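For orientation, the following PROC NLMIXED step is a minimal sketch of a shared parameter model of the form (C.1)-(C.2). It is not the %shared_parameter macro itself: the long-format data layout, the variable names (patient, week, drug, y, dropnext), the association parameters gam0/gam1 and the starting values are all illustrative assumptions.

  /* Assumed input: one record per subject per OBSERVED visit, with        */
  /*   y        observed outcome                                            */
  /*   dropnext 1 if the subject drops out right after this visit, else 0  */
  proc nlmixed data=spm_long qpoints=5;
    parms b0=0 b_trt=0 b_t=0 b_tt=0 s2e=1
          a0=-3 a_trt=0 gam0=0.5 gam1=0.5
          v0=1 v1=0.2 c01=0;
    bounds s2e v0 v1 > 0;

    /* measurement model (C.1): random intercept u0 and slope u1           */
    mu  = b0 + b_trt*drug + b_t*week + b_tt*drug*week + u0 + u1*week;
    lly = -0.5*log(2*3.141592653589793*s2e) - (y-mu)**2/(2*s2e);

    /* discrete-time dropout hazard with complementary log-log link (C.2), */
    /* sharing the same random effects u0, u1                              */
    eta = a0 + a_trt*drug + gam0*u0 + gam1*u1;
    p   = 1 - exp(-exp(eta));           /* Pr(drop out after this visit)   */
    llr = dropnext*log(p) + (1-dropnext)*log(1-p);

    model y ~ general(lly + llr);
    random u0 u1 ~ normal([0,0], [v0, c01, v1]) subject=patient;
  run;

When gam0 = gam1 = 0 the dropout process no longer depends on the random effects and the outcome model reduces to an ordinary (MAR-type) likelihood analysis, so these association parameters play the role of sensitivity parameters in this class of models.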

(Contributor list: task force members with their assigned chapters and appendices, and affiliations (Meiji Seika and other member companies).)


Appendix 2: Case studies of sensitivity analyses for data with missing values

Japan Pharmaceutical Manufacturers Association, Drug Evaluation Committee, Data Science Expert Committee
Fiscal year 2015 Task Force 4, Missing Data Analysis Team

Contents
1. Introduction
2. Method of the case survey
3. Case studies
   3.1 Brexpiprazole (REXULTI): Background; Treatment of schizophrenia; Adjunctive therapy for major depressive disorder; References
   3.2 Nintedanib (OFEV): Background; INPULSIS trials; References
   3.3 Riociguat (ADEMPAS): Background; PATENT-1 trial; Sensitivity analyses; Reviewer's view; References
   3.4 Vedolizumab (Entyvio): Background; Trial design; Trial results; References

1. Introduction

The Drug Evaluation Committee Data Science Expert Committee's fiscal year 2013 Task Force 2 surveyed cases in which the impact of missing data was discussed during regulatory review, in order to understand the problems that missing data cause in drug development. That survey covered the FDA, EMA and PMDA review reports of products approved between January 2010 and June 2013 (for the detailed scope, see "Cases in which missing data were discussed in review documents", published in July 2014). Because the NRC report appeared in 2010, most of the Phase 3 trials covered by that survey are thought to have been planned and conducted before, or shortly after, its publication, and no case was found in which a sensitivity analysis based on a pattern-mixture model assuming MNAR had been performed. As a follow-up, the present survey therefore focused on trials for which the handling of missing data was discussed in the FDA review reports (in particular the statistical reviews) and for which sensitivity analyses using pattern-mixture models or similar methods under an MNAR assumption had been performed. The cases for the products highlighted here were presented orally or as published slides at the 2015 Biometrics Seminar held on December 22, 2015; this Appendix 2 reorganizes that material into report form.

The survey method is described in detail in Chapter 2. The cases are described in Chapter 3 on the basis of the FDA review reports and related publications. Note that the descriptions in Chapter 3 were prepared by task force members by translating and restructuring the content of those documents, so the wording and terminology are not necessarily verbatim translations of the original sources.

2. Method of the case survey

From the list of approved products published on the FDA web site (New Molecular Entity and New Therapeutic Biological Product Approvals), products approved between July 1, 2013 and July 31, 2015 were identified as the survey population (57 products, after excluding diagnostic agents and antineoplastic drugs). The review reports of these products, in particular the statistical reviews, were then examined, and four products for which sensitivity analyses using pattern-mixture models or similar methods under an MNAR assumption had been performed were selected as the cases of interest: Brexpiprazole (REXULTI), Nintedanib (OFEV), Riociguat (ADEMPAS) and Vedolizumab (ENTYVIO). ANORO ELLIPTA also met the criteria, with a jump-to-reference analysis performed as a sensitivity analysis, but it is not taken up in this Appendix 2 because that case was introduced at the JPMA Data Science Expert Committee symposium on recent developments and future issues in the handling of missing data in clinical trials (overview of the NAS report, the EMA guideline, estimands and analysis methods) held on February 13, 2015.

3. Case studies

3.1 Brexpiprazole (REXULTI)

3.1.1 Background

Brexpiprazole (trade name REXULTI) was approved in the United States in July 2015 for the treatment of schizophrenia and as adjunctive therapy for major depressive disorder. The submission data package included the results of two Phase 3 trials for each indication:
- Treatment of schizophrenia: Studies 230 and 231
- Adjunctive therapy for major depressive disorder: Trials 227 and 228
The discussions of the handling of missing data found in the statistical review are summarized below for each indication.

3.1.2 Treatment of schizophrenia

Trial design

Two Phase 3 trials were conducted for the treatment of schizophrenia (Studies 230 and 231). Both were randomized, double-blind, placebo-controlled, parallel-group trials, and their designs were almost identical apart from the treatment arms and the allocation ratio (Table 3.1.1).

Table 3.1.1 Treatment arms and allocation ratios of Studies 230 and 231
                   Study 230                        Study 231
  Arms             placebo, 1 mg, 2 mg, 4 mg        placebo, 0.25 mg, 2 mg, 4 mg
  Allocation       3:2:3:3                          2:1:2:2

The treatment period was 6 weeks, the primary endpoint was the change from baseline in PANSS total score at Week 6, the primary analysis was an MMRM, and a pattern-mixture approach using multiple imputation was specified as its sensitivity analysis, described next.

Details of the sensitivity analysis

As the sensitivity analysis for the primary MMRM analysis, a pattern-mixture approach using multiple imputation was performed. The applicant examined the two scenarios below.
- Scenario 1: MNAR is assumed for subjects in the brexpiprazole arms who discontinued due to lack of efficacy; MAR is assumed for discontinuations for other reasons in the brexpiprazole arms and for all discontinuations in the placebo arm.
- Scenario 2: MNAR is assumed for subjects in the brexpiprazole arms who discontinued due to lack of efficacy or an adverse event; MAR is assumed for discontinuations for other reasons in the brexpiprazole arms and for all discontinuations in the placebo arm.

For the subjects assumed to be MNAR, imputation proceeds as follows. First, the missing data are imputed by multiple imputation under MAR. Next, k% of the treatment difference estimated by the primary MMRM analysis is subtracted from the imputed values. k is increased over 0, 10, 20, ..., 100, ..., and the value of k at which the conclusion of the primary analysis is overturned, i.e. the tipping point, is sought (tipping point analysis). A schematic SAS sketch of this adjustment is given below.
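The following data step is a schematic sketch of the k% adjustment described above, not the actual review program. IMP is a hypothetical long-format data set of MAR-imputed values with variables _imputation_, subjid, visit, trt, mnarflg (1 if the value was imputed for a subject handled as MNAR) and y; DIFFHAT is the signed treatment difference (drug minus placebo) estimated by the primary MMRM, here an arbitrary illustrative value.

  %let diffhat = -6.5;     /* illustrative MMRM treatment difference        */
  %let k       = 100;      /* percentage of the treatment effect removed    */

  data imp_shifted;
    set imp;
    /* subtracting a negative difference moves the imputed values toward    */
    /* the placebo mean, i.e. removes k% of the assumed treatment benefit   */
    if mnarflg = 1 then y = y - (&k/100) * &diffhat;
  run;

  /* Each shifted imputed data set is analyzed (e.g., ANCOVA at Week 6) and */
  /* the results combined with PROC MIANALYZE; repeating over k = 0, 10,    */
  /* 20, ... traces out the tipping point.                                  */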

Trial results

Discontinuation rates: the subject dispositions are shown in Tables 3.1.2 and 3.1.3. In both trials roughly 30 to 40% of subjects discontinued.

Table 3.1.2 Subject disposition in Study 230 (adapted from Brexpiprazole FDA statistical review (2015), p. 23, Table 11); placebo / brexpiprazole 1 mg / 2 mg / 4 mg
  Randomized:                         120 (100%) / 186 (100%) / 184 (100%) / 184 (100%)
  Efficacy analysis set:              117 (97.5%) / 179 (96.2%) / 181 (98.4%) / 180 (97.8%)
  Completed:                          81 (67.5%) / 129 (69.4%) / 130 (70.7%) / 118 (64.1%)
  Discontinued:                       39 (32.5%) / 57 (30.6%) / 54 (29.3%) / 66 (35.9%)
    Lost to follow-up:                0 (0%) / 0 (0%) / 0 (0%) / 0 (0%)
    Adverse event:                    11 (9.2%) / 11 (5.9%) / 13 (7.1%) / 22 (12.0%)
    Met withdrawal criteria:          0 (0%) / 0 (0%) / 2 (1.1%) / 1 (0.5%)
    Withdrawn by investigator:        2 (1.7%) / 0 (0%) / 0 (0%) / 1 (0.5%)
    Withdrawal of consent by subject: 15 (12.5%) / 25 (13.4%) / 23 (12.5%) / 21 (11.4%)
    Protocol deviation:               2 (1.7%) / 1 (0.5%) / 0 (0%) / 0 (0%)
    Lack of efficacy:                 9 (7.5%) / 20 (10.8%) / 16 (8.7%) / 21 (11.4%)

Table 3.1.3 Subject disposition in Study 231 (adapted from Brexpiprazole FDA statistical review (2015), p. 23, Table 12); placebo / brexpiprazole 0.25 mg / 2 mg / 4 mg
  Randomized:                         90 (100%) / 182 (100%) / 180 (100%) / 184 (100%)
  Efficacy analysis set:              87 (96.7%) / 180 (98.9%) / 178 (98.9%) / 178 (96.7%)
  Completed:                          56 (62.2%) / 124 (68.1%) / 121 (67.2%) / 109 (59.2%)
  Discontinued:                       34 (37.8%) / 58 (31.9%) / 59 (32.8%) / 75 (40.8%)
    Lost to follow-up:                0 (0%) / 0 (0%) / 0 (0%) / 1 (0.5%)
    Adverse event:                    12 (13.3%) / 15 (8.2%) / 17 (9.4%) / 32 (17.4%)
    Met withdrawal criteria:          1 (1.1%) / 0 (0%) / 1 (0.6%) / 0 (0%)
    Withdrawn by investigator:        0 (0%) / 1 (0.5%) / 1 (0.6%) / 3 (1.6%)
    Withdrawal of consent by subject: 13 (14.4%) / 24 (13.2%) / 31 (17.2%) / 21 (11.4%)
    Protocol deviation:               1 (1.1%) / 1 (0.5%) / 2 (1.1%) / 0 (0%)
    Lack of efficacy:                 7 (7.8%) / 17 (9.3%) / 7 (3.9%) / 18 (9.8%)

Primary analysis results: in Study 230 a significant difference was seen for 4 mg, and in Study 231 for both 2 mg and 4 mg (Table 3.1.4).

Table 3.1.4 Primary analysis results of Studies 230 and 231 (adapted from Brexpiprazole FDA statistical review (2015), p. 13, Table 3); N = number of patients, SD = standard deviation, SE = standard error
  Study 230 (Placebo / Brex. 1 mg / Brex. 2 mg / Brex. 4 mg)
    Number of patients:                      N=117 / N=179 / N=181 / N=180
    Baseline mean (SD):                      93.2 (12.7) / 96.3 (12.9) / 95.0 (12.4) / 94.6 (12.8)
    Mean change at Week 6 (SE):              (1.9) / (1.5) / (1.5) / (1.5)
    Treatment difference vs placebo, 95% CI: - / (-8.1, 1.3) / (-7.2, 1.1) / (-10.6, -2.3)
    Average effect of 2 mg and 4 mg vs placebo: LS mean difference = -4.78
  Study 231 (Placebo / Brex. 0.25 mg / Brex. 2 mg / Brex. 4 mg)
    Number of patients:                      N=87 / N=180 / N=178 / N=178
    Baseline mean (SD):                      93.6 (11.5) / 95.9 (13.7) / 94.7 (12.1) / 95.7 (11.5)
    Mean change at Week 6 (SE):              (2.2) / (1.5) / (1.5) / (1.6)
    Treatment difference vs placebo, 95% CI: - / (-8.3, 2.5) / (-13.1, -4.4) / (-12.0, -3.3)
    p-value vs placebo:                      - / 0.29 / / 
    Average effect of 2 mg and 4 mg vs placebo: LS mean difference = -8.18

Sensitivity analysis results: the sensitivity analysis results are shown in Table 3.1.5 (Study 230) and Table 3.1.6 (Study 231). In addition to the two scenarios described above, a Scenario 3, in which MNAR was assumed for all subjects who discontinued in the brexpiprazole arms, was carried out by the reviewer. In Study 230 the tipping points for Scenarios 1, 2 and 3 were 260%, 130% and 80%, respectively. In Study 231 the tipping points for Scenarios 2 and 3 were 300% and 160%, respectively, while for Scenario 1 the conclusion of the primary analysis was not overturned even when k was increased to 500%.

Table 3.1.5 Sensitivity analysis results for Study 230 (adapted from Brexpiprazole FDA statistical review (2015), p. 15, Table 4); entries are treatment difference, 95% CI and (where legible) p-value, for the stated percentage k of the treatment effect subtracted from the MAR-imputed missing data
  Scenario 1 (lack of efficacy in the brexpiprazole arms treated as MNAR)
    k = 100%:                 2 mg vs placebo  -2.8  (-7.3, 1.7)  p = 0.22;   4 mg vs placebo  -5.2  (-9.7, -0.7)
    k = 260% (tipping point): 2 mg vs placebo  -2.3  (-6.7, 2.1)  p = 0.31;   4 mg vs placebo  -4.3  (-8.8, 0.1)
  Scenario 2 (lack of efficacy and adverse events treated as MNAR)
    k = 100%:                 2 mg vs placebo  -2.7  (-7.2, 1.9)  p = 0.25;   4 mg vs placebo  -4.8  (-9.3, -0.2)
    k = 130% (tipping point): 2 mg vs placebo  -2.7  (-7.2, 1.8)  p = 0.25;   4 mg vs placebo  -4.4  (-8.9, 0.1)
  Scenario 3 (all dropouts in the brexpiprazole arms treated as MNAR)
    k = 80% (tipping point):  2 mg vs placebo  -2.8  (-7.6, 2.0)  p = 0.25;   4 mg vs placebo  -4.4  (-8.9, 0.1)

Table 3.1.6 Sensitivity analysis results for Study 231 (adapted from Brexpiprazole FDA statistical review (2015), p. 16, Table 5)
  Scenario 1 (lack of efficacy treated as MNAR)
    k = 100%:                             2 mg vs placebo  -9.1  (-14.0, -4.3);   4 mg vs placebo  -8.4  (-13.0, -3.9)
    k = 500% (tipping point not reached): 2 mg vs placebo  -5.9  (-11.1, -0.8);   4 mg vs placebo  -7.2  (-12.4, -2.0)
  Scenario 2 (lack of efficacy and adverse events treated as MNAR)
    k = 100%:                 2 mg vs placebo  -8.6  (-13.4, -3.9);   4 mg vs placebo  -7.9  (-12.5, -3.2)
    k = 300% (tipping point): 2 mg vs placebo  -5.0  (-10.1, 0.2);    4 mg vs placebo  -5.3  (-10.7, 0.2)
  Scenario 3 (all dropouts treated as MNAR)
    k = 100%:                 2 mg vs placebo  -7.2  (-12.3, -2.1);   4 mg vs placebo  -6.2  (-11.0, -1.4)
    k = 160% (tipping point): 2 mg vs placebo  -5.8  (-10.8, -0.9);   4 mg vs placebo  -5.1  (-10.4, 0.2)

Given that the tipping points exceeded 100% for the applicant's Scenarios 1 and 2 and reached 80% even for the reviewer's Scenario 3, the statistical review concludes that the results of the sensitivity analyses were consistent with those of the primary analysis.

3.1.3 Adjunctive therapy for major depressive disorder

Trial design

Two Phase 3 trials were also conducted for adjunctive therapy of major depressive disorder (Trials 227 and 228). Both were randomized, double-blind, placebo-controlled, parallel-group trials with almost identical designs apart from the treatment arms: Trial 227 had three arms (placebo, brexpiprazole 1 mg and 3 mg) and Trial 228 two arms (placebo and brexpiprazole 2 mg).

The observation period consisted of an 8-week single-blind placebo phase (Phase A), used to select patients with an inadequate response to their existing antidepressant treatment, followed by a 6-week double-blind phase (Phase B) for the evaluation of efficacy and safety. The primary endpoint was the change in MADRS total score from the end of Phase A to the end of Phase B (6 weeks). The primary analysis was an MMRM; the planned sensitivity analyses included, in addition to a pattern-mixture approach using multiple imputation, a shared parameter model, a random coefficient pattern-mixture model, an ANCOVA on observed cases, and an ANCOVA with missing data imputed by LOCF.

Trial results

Discontinuation rates: in both trials fewer than 10% of subjects discontinued.

Primary analysis results: the results of Trials 227 and 228 are shown in Tables 3.1.7 and 3.1.8.

Table 3.1.7 Primary analysis results of Trial 227 (MADRS total score, MMRM; adapted from Brexpiprazole FDA statistical review (2015), p. 41, Table 5); ADT = antidepressant treatment
  Arms:                                     1 mg Brex + ADT (N=225) / 3 mg Brex + ADT (N=226) / Placebo + ADT (N=218)
  Mean (SD) at end of Phase A:              (5.61) / (5.24) / (5.27)
  LS mean (SE) change at end of Phase B:    (0.50) / (0.51) / (0.51)
  LS mean difference vs placebo (95% CI):   (-2.58, 0.20) / (-2.92, -0.13) / -

Table 3.1.8 Primary analysis results of Trial 228 (MADRS total score, MMRM by randomized treatment group, efficacy sample; adapted from Brexpiprazole FDA statistical review (2015), p. 48, Table 12); ADT = antidepressant treatment
  Arms:                                                   2 mg/day Brex + ADT (N=187) / Placebo + ADT (N=191)
  Mean (SD) at end of Phase A (Week 8):                   (5.79) / (5.60)
  LS mean change (SE) at end of Phase B (Week 14):        (0.61) / (0.60)
  LS mean difference vs placebo (95% CI):                 (-4.70, -1.54) / -

In Trial 227, neither dose demonstrated superiority to placebo (for brexpiprazole 3 mg the difference was nominally significant without multiplicity adjustment, but not significant after the prespecified Hochberg procedure). In Trial 228, the superiority of brexpiprazole 2 mg over placebo was demonstrated.

Sensitivity analysis results: the statistical review states only that the sensitivity analysis results were similar to those of the primary analysis, without describing the methods in detail; it notes that, because the discontinuation rates were small, the problems caused by missing values were considered minor.

References

[1] Brexpiprazole FDA statistical review (2015).

3.2 Nintedanib (OFEV)

3.2.1 Background

Nintedanib (trade name OFEV) was approved in the United States in October 2014 as a treatment for idiopathic pulmonary fibrosis (IPF) and in Japan in July 2015. The submission data package included the results of two Phase 3 trials, INPULSIS 1 and INPULSIS 2, both evaluating the efficacy and safety of nintedanib in patients with IPF. The two trials were conducted as multi-regional Phase 3 sister trials with similar designs; because of this similarity they are described here together as the INPULSIS trials, based on the FDA statistical review.

3.2.2 INPULSIS trials

Trial design

The INPULSIS trials were double-blind, randomized, parallel-group, placebo-controlled trials. The primary endpoint was the annual rate of decline in FVC (forced vital capacity, mL/year); the measurement time points are those circled in the trial design figure below. The protocol called for subjects who discontinued treatment to be asked to continue attending visits so that data could still be collected.

Figure: INPULSIS trial design (adapted from Nintedanib statistical review (2014), p. 7, Figure 1).


More information

Brennan, G. and Lomasky, L., Democracy and Decision : The Pure Theory of Electoral Preference, Cambridge: Cambridge U. P., 1993. Campbell, A., Converse, P. E., Miller W. E., and Stokes, D. E., Elections

More information

Stepwise Chow Test * Chow Test Chow Test Stepwise Chow Test Stepwise Chow Test Stepwise Chow Test Riddell Riddell first step second step sub-step Step

Stepwise Chow Test * Chow Test Chow Test Stepwise Chow Test Stepwise Chow Test Stepwise Chow Test Riddell Riddell first step second step sub-step Step Stepwise Chow Test * Chow Test Chow Test Stepwise Chow Test Stepwise Chow Test Stepwise Chow Test Riddell Riddell first step second step sub-step Stepwise Chow Test a Stepwise Chow Test Takeuchi 1991Nomura

More information

JFE.dvi

JFE.dvi ,, Department of Civil Engineering, Chuo University Kasuga 1-13-27, Bunkyo-ku, Tokyo 112 8551, JAPAN E-mail : atsu1005@kc.chuo-u.ac.jp E-mail : kawa@civil.chuo-u.ac.jp SATO KOGYO CO., LTD. 12-20, Nihonbashi-Honcho

More information

Hi-Stat Discussion Paper Series No.248 東京圏における 1990 年代以降の住み替え行動 住宅需要実態調査 を用いた Mixed Logit 分析 小林庸平行武憲史 March 2008 Hitotsubashi University Research Unit

Hi-Stat Discussion Paper Series No.248 東京圏における 1990 年代以降の住み替え行動 住宅需要実態調査 を用いた Mixed Logit 分析 小林庸平行武憲史 March 2008 Hitotsubashi University Research Unit Hi-Stat Discussion Paper Series No.248 東京圏における 1990 年代以降の住み替え行動 住宅需要実態調査 を用いた Logit 分析 小林庸平行武憲史 March 2008 Hitotsubashi University Research Unit for Statistical Analysis in Social Sciences A 21st-Century

More information

GLM PROC GLM y = Xβ + ε y X β ε ε σ 2 E[ε] = 0 var[ε] = σ 2 I σ 2 0 σ 2 =... 0 σ 2 σ 2 I ε σ 2 y E[y] =Xβ var[y] =σ 2 I PROC GLM

GLM PROC GLM y = Xβ + ε y X β ε ε σ 2 E[ε] = 0 var[ε] = σ 2 I σ 2 0 σ 2 =... 0 σ 2 σ 2 I ε σ 2 y E[y] =Xβ var[y] =σ 2 I PROC GLM PROC MIXED ( ) An Introdunction to PROC MIXED Junji Kishimoto SAS Institute Japan / Keio Univ. SFC / Univ. of Tokyo e-mail address: jpnjak@jpn.sas.com PROC MIXED PROC GLM PROC MIXED,,,, 1 1.1 PROC MIXED

More information

Special IssueDiagnoses and therapeutic agents for age-related diseases Reviews Original Case reports β Medicinal drugs affecting on clinical laboratory blood test results and adverse effects of them

More information

66-1 田中健吾・松浦紗織.pwd

66-1 田中健吾・松浦紗織.pwd Abstract The aim of this study was to investigate the characteristics of a psychological stress reaction scale for home caregivers, using Item Response Theory IRT. Participants consisted of 337 home caregivers

More information

y = x 4 y = x 8 3 y = x 4 y = x 3. 4 f(x) = x y = f(x) 4 x =,, 3, 4, 5 5 f(x) f() = f() = 3 f(3) = 3 4 f(4) = 4 *3 S S = f() + f() + f(3) + f(4) () *4

y = x 4 y = x 8 3 y = x 4 y = x 3. 4 f(x) = x y = f(x) 4 x =,, 3, 4, 5 5 f(x) f() = f() = 3 f(3) = 3 4 f(4) = 4 *3 S S = f() + f() + f(3) + f(4) () *4 Simpson H4 BioS. Simpson 3 3 0 x. β α (β α)3 (x α)(x β)dx = () * * x * * ɛ δ y = x 4 y = x 8 3 y = x 4 y = x 3. 4 f(x) = x y = f(x) 4 x =,, 3, 4, 5 5 f(x) f() = f() = 3 f(3) = 3 4 f(4) = 4 *3 S S = f()

More information

Design of highly accurate formulas for numerical integration in weighted Hardy spaces with the aid of potential theory 1 Ken ichiro Tanaka 1 Ω R m F I = F (t) dt (1.1) Ω m m 1 m = 1 1 Newton-Cotes Gauss

More information

商品流動性リスクの計量化に関する一考察(その2)―内生的流動性リスクを考慮したストレス・テスト―

商品流動性リスクの計量化に関する一考察(その2)―内生的流動性リスクを考慮したストレス・テスト― E-mail: shigeru_yoshifuji@btm.co.jp E-mail: fuminobu_otake@btm.co.jp Bangia et al. G Bangia et al. exogenous liquidity risk endogenous liquidity risk et al LTCMLong Term Capital Management Fed G G T

More information

p *2 DSGEDynamic Stochastic General Equilibrium New Keynesian *2 2

p *2 DSGEDynamic Stochastic General Equilibrium New Keynesian *2 2 2013 1 nabe@ier.hit-u.ac.jp 2013 4 11 Jorgenson Tobin q : Hayashi s Theorem : Jordan : 1 investment 1 2 3 4 5 6 7 8 *1 *1 93SNA 1 p.180 1936 100 1970 *2 DSGEDynamic Stochastic General Equilibrium New Keynesian

More information

k2 ( :35 ) ( k2) (GLM) web web 1 :

k2 ( :35 ) ( k2) (GLM) web   web   1 : 2012 11 01 k2 (2012-10-26 16:35 ) 1 6 2 (2012 11 01 k2) (GLM) kubo@ees.hokudai.ac.jp web http://goo.gl/wijx2 web http://goo.gl/ufq2 1 : 2 2 4 3 7 4 9 5 : 11 5.1................... 13 6 14 6.1......................

More information

日本製薬工業協会シンポジウム 生存時間解析の評価指標に関する最近の展開ー RMST (restricted mean survival time) を理解するー 2. RMST の定義と統計的推測 2018 年 6 月 13 日医薬品評価委員会データサイエンス部会タスクフォース 4 生存時間解析チー

日本製薬工業協会シンポジウム 生存時間解析の評価指標に関する最近の展開ー RMST (restricted mean survival time) を理解するー 2. RMST の定義と統計的推測 2018 年 6 月 13 日医薬品評価委員会データサイエンス部会タスクフォース 4 生存時間解析チー 日本製薬工業協会シンポジウム 生存時間解析の評価指標に関する最近の展開ー RMST (restricted mean survival time) を理解するー 2. RMST の定義と統計的推測 2018 年 6 月 13 日医薬品評価委員会データサイエンス部会タスクフォース 4 生存時間解析チーム 日本新薬 ( 株 ) 田中慎一 留意点 本発表は, 先日公開された 生存時間型応答の評価指標 -RMST(restricted

More information

* 1 1 (i) (ii) Brückner-Hartree-Fock (iii) (HF, BCS, HFB) (iv) (TDHF,TDHFB) (RPA) (QRPA) (v) (vi) *

* 1 1 (i) (ii) Brückner-Hartree-Fock (iii) (HF, BCS, HFB) (iv) (TDHF,TDHFB) (RPA) (QRPA) (v) (vi) * * 1 1 (i) (ii) Brückner-Hartree-Fock (iii) (HF, BCS, HFB) (iv) (TDHF,TDHFB) (RPA) (QRPA) (v) (vi) *1 2004 1 1 ( ) ( ) 1.1 140 MeV 1.2 ( ) ( ) 1.3 2.6 10 8 s 7.6 10 17 s? Λ 2.5 10 10 s 6 10 24 s 1.4 ( m

More information

untitled

untitled 2010 58 1 39 59 c 2010 20 2009 11 30 2010 6 24 6 25 1 1953 12 2008 III 1. 5, 1961, 1970, 1975, 1982, 1992 12 2008 2008 226 0015 32 40 58 1 2010 III 2., 2009 3 #3.xx #3.1 #3.2 1 1953 2 1958 12 2008 1 2

More information

不安障害研究, 9(1), 17-32, 2017

不安障害研究, 9(1), 17-32, 2017 9(1), 17 32, 2017 Panic and Agoraphobia Scale 1 2 3 2 2 2 2 4 4 1 2 3 4 Panic and Agoraphobia Scale PAS DSM-IV-TR APA, 2000 3 4 PAS PAS PAS Patient Global Impression Improvement PAS 1980 Diagnostic and

More information

表紙1

表紙1 LIPER HERA: Higher Education Role Analysis 17 18 19 1.1 Christopher Hood Martin Lodge capacity 1 1980 34 David C. McClelland Testing for Competence Rather than for Intelligence 2 traits 3 Looking to the

More information

Stata 11 Stata ROC whitepaper mwp anova/oneway 3 mwp-042 kwallis Kruskal Wallis 28 mwp-045 ranksum/median / 31 mwp-047 roctab/roccomp ROC 34 mwp-050 s

Stata 11 Stata ROC whitepaper mwp anova/oneway 3 mwp-042 kwallis Kruskal Wallis 28 mwp-045 ranksum/median / 31 mwp-047 roctab/roccomp ROC 34 mwp-050 s BR003 Stata 11 Stata ROC whitepaper mwp anova/oneway 3 mwp-042 kwallis Kruskal Wallis 28 mwp-045 ranksum/median / 31 mwp-047 roctab/roccomp ROC 34 mwp-050 sampsi 47 mwp-044 sdtest 54 mwp-043 signrank/signtest

More information

自殺の経済社会的要因に関する調査研究報告書

自殺の経済社会的要因に関する調査研究報告書 17 1 2 3 4 5 11 16 30,247 17 18 21,024 +2.0 6 12 13 WHO 100 14 7 15 2 5 8 16 9 10 17 11 12 13 14 15 16 17 II I 18 Durkheim(1897) Hamermesh&Soss(1974)Dixit&Pindyck(1994) Becker&Posner(2004) Rosenthal(1993)

More information

2007/8 Vol. J90 D No. 8 Stauffer [7] 2 2 I 1 I 2 2 (I 1(x),I 2(x)) 2 [13] I 2 = CI 1 (C >0) (I 1,I 2) (I 1,I 2) Field Monitoring Server

2007/8 Vol. J90 D No. 8 Stauffer [7] 2 2 I 1 I 2 2 (I 1(x),I 2(x)) 2 [13] I 2 = CI 1 (C >0) (I 1,I 2) (I 1,I 2) Field Monitoring Server a) Change Detection Using Joint Intensity Histogram Yasuyo KITA a) 2 (0 255) (I 1 (x),i 2 (x)) I 2 = CI 1 (C>0) (I 1,I 2 ) (I 1,I 2 ) 2 1. [1] 2 [2] [3] [5] [6] [8] Intelligent Systems Research Institute,

More information

The Japanese Journal of Health Psychology, 29(S): (2017)

The Japanese Journal of Health Psychology, 29(S): (2017) Journal of Health Psychology Research 2017, Vol. 29, Special issue, 139 149Journal of Health Psychology Research 2016, J-STAGE Vol. Advance 29, Special publication issue, 139 149 date : 5 December, 2016

More information

H22 BioS t (i) treat1 treat2 data d1; input patno treat1 treat2; cards; ; run; 1 (i) treat = 1 treat =

H22 BioS t (i) treat1 treat2 data d1; input patno treat1 treat2; cards; ; run; 1 (i) treat = 1 treat = H BioS t (i) treat treat data d; input patno treat treat; cards; 3 8 7 4 8 8 5 5 6 3 ; run; (i) treat treat data d; input group patno period treat y; label group patno period ; cards; 3 8 3 7 4 8 4 8 5

More information

RTM RTM Risk terrain terrain RTM RTM 48

RTM RTM Risk terrain terrain RTM RTM 48 Risk Terrain Model I Risk Terrain Model RTM,,, 47 RTM RTM Risk terrain terrain RTM RTM 48 II, RTM CSV,,, RTM Caplan and Kennedy RTM Risk Terrain Modeling Diagnostics RTMDx RTMDx RTMDx III 49 - SNS 50 0

More information

untitled

untitled 11-19 2012 1 2 3 30 2 Key words acupuncture insulated needle cervical sympathetick trunk thermography blood flow of the nasal skin Received September 12, 2011; Accepted November 1, 2011 I 1 2 1954 3 564-0034

More information

ヒストリカル法によるバリュー・アット・リスクの計測:市場価格変動の非定常性への実務的対応

ヒストリカル法によるバリュー・アット・リスクの計測:市場価格変動の非定常性への実務的対応 VaR VaR VaR VaR GARCH E-mail : yoshitaka.andou@boj.or.jp VaR VaR LTCM VaR VaR VaR VaR VaR VaR VaR VaR t P(t) P(= P() P(t)) Pr[ P X] =, X t100 (1 )VaR VaR P100 P X X (1 ) VaR VaR VaR VaR VaR VaR VaR VaR

More information

seminar0220a.dvi

seminar0220a.dvi 1 Hi-Stat 2 16 2 20 16:30-18:00 2 2 217 1 COE 4 COE RA E-MAIL: ged0104@srv.cc.hit-u.ac.jp 2004 2 25 S-PLUS S-PLUS S-PLUS S-code 2 [8] [8] [8] 1 2 ARFIMA(p, d, q) FI(d) φ(l)(1 L) d x t = θ(l)ε t ({ε t }

More information

一般化線形 (混合) モデル (2) - ロジスティック回帰と GLMM

一般化線形 (混合) モデル (2) - ロジスティック回帰と GLMM .. ( ) (2) GLMM kubo@ees.hokudai.ac.jp I http://goo.gl/rrhzey 2013 08 27 : 2013 08 27 08:29 kubostat2013ou2 (http://goo.gl/rrhzey) ( ) (2) 2013 08 27 1 / 74 I.1 N k.2 binomial distribution logit link function.3.4!

More information

main.dvi

main.dvi SGC - 70 2, 3 23 ɛ-δ 2.12.8 3 2.92.13 4 2 3 1 2.1 2.102.12 [8][14] [1],[2] [4][7] 2 [4] 1 2009 8 1 1 1.1... 1 1.2... 4 1.3 1... 8 1.4 2... 9 1.5... 12 1.6 1... 16 1.7... 18 1.8... 21 1.9... 23 2 27 2.1

More information

untitled

untitled 2 : n =1, 2,, 10000 0.5125 0.51 0.5075 0.505 0.5025 0.5 0.4975 0.495 0 2000 4000 6000 8000 10000 2 weak law of large numbers 1. X 1,X 2,,X n 2. µ = E(X i ),i=1, 2,,n 3. σi 2 = V (X i ) σ 2,i=1, 2,,n ɛ>0

More information

スケーリング理論とはなにか? - --尺度を変えて見えること--

スケーリング理論とはなにか?  - --尺度を変えて見えること-- ? URL: http://maildbs.c.u-tokyo.ac.jp/ fukushima mailto:hukusima@phys.c.u-tokyo.ac.jp DEX-SMI @ 2006 12 17 ( ) What is scaling theory? DEX-SMI 1 / 40 Outline Outline 1 2 3 4 ( ) What is scaling theory?

More information

08医療情報学22_1_水流final.PDF

08医療情報学22_1_水流final.PDF 22 (1), 702002: 59 59- The Problem of Nursing Common Language for the Information Sharing in Clinical Practice The fact-finding in regard to the correspondence between name and content of nursing action

More information

56 56 The Development of Preschool Children s Views About Conflict Resolution With Peers : Diversity of changes from five-year-olds to six-year-olds Y

56 56 The Development of Preschool Children s Views About Conflict Resolution With Peers : Diversity of changes from five-year-olds to six-year-olds Y 56 56 The Development of Preschool Children s Views About Conflict Resolution With Peers : Diversity of changes from five-year-olds to six-year-olds Yukari KUBO 311615 1 1 1 97 43-2 2005 Rubin, Bukowski,

More information

L Y L( ) Y0.15Y 0.03L 0.01L 6% L=(10.15)Y 108.5Y 6%1 Y y p L ( 19 ) [1990] [1988] 1

L Y L( ) Y0.15Y 0.03L 0.01L 6% L=(10.15)Y 108.5Y 6%1 Y y p L ( 19 ) [1990] [1988] 1 1. 1-1 00 001 9 J-REIT 1- MM CAPM 1-3 [001] [1997] [003] [001] [1999] [003] 1-4 0 . -1 18 1-1873 6 1896 L Y L( ) Y0.15Y 0.03L 0.01L 6% L=(10.15)Y 108.5Y 6%1 Y y p L 6 1986 ( 19 ) -3 17 3 18 44 1 [1990]

More information

tokei01.dvi

tokei01.dvi 2. :,,,. :.... Apr. - Jul., 26FY Dept. of Mechanical Engineering, Saga Univ., JAPAN 4 3. (probability),, 1. : : n, α A, A a/n. :, p, p Apr. - Jul., 26FY Dept. of Mechanical Engineering, Saga Univ., JAPAN

More information

「スウェーデン企業におけるワーク・ライフ・バランス調査 」報告書

「スウェーデン企業におけるワーク・ライフ・バランス調査 」報告書 1 2004 12 2005 4 5 100 25 3 1 76 2 Demoskop 2 2004 11 24 30 7 2 10 1 2005 1 31 2 4 5 2 3-1-1 3-1-1 Micromediabanken 2005 1 507 1000 55.0 2 77 50 50 /CEO 36.3 37.4 18.1 3-2-1 43.0 34.4 / 17.6 3-2-2 78 79.4

More information

ON STRENGTH AND DEFORMATION OF REINFORCED CONCRETE SHEAR WALLS By Shigeru Mochizuki Concrete Journal, Vol. 18, No. 4, April 1980, pp. 1 `13 Synopsis A

ON STRENGTH AND DEFORMATION OF REINFORCED CONCRETE SHEAR WALLS By Shigeru Mochizuki Concrete Journal, Vol. 18, No. 4, April 1980, pp. 1 `13 Synopsis A ON STRENGTH AND DEFORMATION OF REINFORCED CONCRETE SHEAR WALLS By Shigeru Mochizuki Concrete Journal, Vol. 18, No. 4, April 1980, pp. 1 `13 Synopsis After Tokachioki Earthquake of 1968, the importance

More information

Input image Initialize variables Loop for period of oscillation Update height map Make shade image Change property of image Output image Change time L

Input image Initialize variables Loop for period of oscillation Update height map Make shade image Change property of image Output image Change time L 1,a) 1,b) 1/f β Generation Method of Animation from Pictures with Natural Flicker Abstract: Some methods to create animation automatically from one picture have been proposed. There is a method that gives

More information

1 (1997) (1997) 1974:Q3 1994:Q3 (i) (ii) ( ) ( ) 1 (iii) ( ( 1999 ) ( ) ( ) 1 ( ) ( 1995,pp ) 1

1 (1997) (1997) 1974:Q3 1994:Q3 (i) (ii) ( ) ( ) 1 (iii) ( ( 1999 ) ( ) ( ) 1 ( ) ( 1995,pp ) 1 1 (1997) (1997) 1974:Q3 1994:Q3 (i) (ii) ( ) ( ) 1 (iii) ( ( 1999 ) ( ) ( ) 1 ( ) ( 1995,pp.218 223 ) 1 2 ) (i) (ii) / (iii) ( ) (i ii) 1 2 1 ( ) 3 ( ) 2, 3 Dunning(1979) ( ) 1 2 ( ) ( ) ( ) (,p.218) (

More information

食道がん化学放射線療法後のsalvage手術

食道がん化学放射線療法後のsalvage手術 2006 2 17 52 Daly JM, et al. J Am Coll Surg 2000;190:562-573 Esophageal Cancer: Results of an American College of Surgeons Patient Care Evaluation Study Daly JM, et al. J Am Coll Surg 2000;190:562-573

More information

Jorgenson F, L : L: Inada lim F =, lim F L = k L lim F =, lim F L = 2 L F >, F L > 3 F <, F LL < 4 λ >, λf, L = F λ, λl 5 Y = Const a L a < α < CES? C

Jorgenson F, L : L: Inada lim F =, lim F L = k L lim F =, lim F L = 2 L F >, F L > 3 F <, F LL < 4 λ >, λf, L = F λ, λl 5 Y = Const a L a < α < CES? C 27 nabe@ier.hit-u.ac.jp 27 4 3 Jorgenson Tobin q : Hayashi s Theorem Jordan Saddle Path. GDP % GDP 2. 3. 4.. Tobin q 2 2. Jorgenson F, L : L: Inada lim F =, lim F L = k L lim F =, lim F L = 2 L F >, F

More information

<4D F736F F D B B83578B6594BB2D834A836F815B82D082C88C60202E646F63>

<4D F736F F D B B83578B6594BB2D834A836F815B82D082C88C60202E646F63> 確率的手法による構造安全性の解析 サンプルページ この本の定価 判型などは, 以下の URL からご覧いただけます. http://www.morikita.co.jp/books/mid/55271 このサンプルページの内容は, 初版 1 刷発行当時のものです. i 25 7 ii Benjamin &Cornell Ang & Tang Schuëller 1973 1974 Ang Mathematica

More information

42 3 u = (37) MeV/c 2 (3.4) [1] u amu m p m n [1] m H [2] m p = (4) MeV/c 2 = (13) u m n = (4) MeV/c 2 =

42 3 u = (37) MeV/c 2 (3.4) [1] u amu m p m n [1] m H [2] m p = (4) MeV/c 2 = (13) u m n = (4) MeV/c 2 = 3 3.1 3.1.1 kg m s J = kg m 2 s 2 MeV MeV [1] 1MeV=1 6 ev = 1.62 176 462 (63) 1 13 J (3.1) [1] 1MeV/c 2 =1.782 661 731 (7) 1 3 kg (3.2) c =1 MeV (atomic mass unit) 12 C u = 1 12 M(12 C) (3.3) 41 42 3 u

More information

(a) (b) (c) Canny (d) 1 ( x α, y α ) 3 (x α, y α ) (a) A 2 + B 2 + C 2 + D 2 + E 2 + F 2 = 1 (3) u ξ α u (A, B, C, D, E, F ) (4) ξ α (x 2 α, 2x α y α,

(a) (b) (c) Canny (d) 1 ( x α, y α ) 3 (x α, y α ) (a) A 2 + B 2 + C 2 + D 2 + E 2 + F 2 = 1 (3) u ξ α u (A, B, C, D, E, F ) (4) ξ α (x 2 α, 2x α y α, [II] Optimization Computation for 3-D Understanding of Images [II]: Ellipse Fitting 1. (1) 2. (2) (edge detection) (edge) (zero-crossing) Canny (Canny operator) (3) 1(a) [I] [II] [III] [IV ] E-mail sugaya@iim.ics.tut.ac.jp

More information

untitled

untitled ... 4...5...6...7...10...11... 12...12...12...13...14...15...15...16...16...17...17...18...18...19...19...19 Hill...20...20...21...21...22...23...24...25... 34...34...35...38-2 - ...41...49...51...51...51...52...53...56...56...57...60...60-3

More information

e.g., Mahoney, Vandell, Simpkins, & Zarrett, Bohnert, Fredricks, & Randall2010 breadth intensitydurationengagement e.g., Mahone

e.g., Mahoney, Vandell, Simpkins, & Zarrett, Bohnert, Fredricks, & Randall2010 breadth intensitydurationengagement e.g., Mahone 2014 25 4 453465 1 670 6 1 2 2008, 2009 2002 e.g., 2002; 1999 100 1 2 2013; 2002; 1999 1 2007 2009 2006; 20082006 2013 454 254 e.g., Mahoney, Vandell, Simpkins, & Zarrett, 2009 1 23 2013 Bohnert, Fredricks,

More information

研究論集Vol.16-No.2.indb

研究論集Vol.16-No.2.indb Vol. No. pp. - SSTSST SST Eriko HARADA This study was aimed at students with hearing impairments to improve their social skills and self-esteem by putting social skills training SSTinto practice and discussing

More information

(Jackson model) Ziman) (fluidity) (viscosity) (Free v

(Jackson model) Ziman) (fluidity) (viscosity) (Free v 1) 16 6 10 1) e-mail: nishitani@ksc.kwansei.ac.jp 0. 1 2 0. 1. 1 2 0. 1. 2 3 0. 1. 3 4 0. 1. 4 5 0. 1. 5 6 0. 1. 6 (Jackson model) 8 0. 1. 7 10. 1 10 0. 1 0. 1. 1 Ziman) (fluidity) (viscosity) (Free volume)(

More information

160_cov.indd

160_cov.indd 3 1 2 3 1. 2. 2.1 7 3 1996 welfare to work 2. 1 195 196 197 198 99 Autumn 7 No. 16 4. 3.5 3. 2.5 2. 1.5 1..5. 195 1952 1954 1956 1958 196 1962 1964 1966 1968 197 1972 1974 1976 1978 198 1982 1984 1986

More information

第 4 回生物統計情報学シンポジウム Estimand に関する議論の事例と今後の展望 1 Estimand が医薬品開発に 与えるインパクト 松岡伸篤ファイザー株式会社 2018 年 7 月 27 日 ( 金 )

第 4 回生物統計情報学シンポジウム Estimand に関する議論の事例と今後の展望 1 Estimand が医薬品開発に 与えるインパクト 松岡伸篤ファイザー株式会社 2018 年 7 月 27 日 ( 金 ) 第 4 回生物統計情報学シンポジウム Estimand に関する議論の事例と今後の展望 1 Estimand が医薬品開発に 与えるインパクト 松岡伸篤ファイザー株式会社 2018 年 7 月 27 日 ( 金 ) 発表内容 2 承認審査過程での Estimand に関する議論の事例 エルツグリフロジン (SGLT2 阻害薬 ) 医薬品開発に与えるインパクト 試験デザイン, オペレーション ラベリングなど

More information

% 95% 2002, 2004, Dunkel 1986, p.100 1

% 95% 2002, 2004, Dunkel 1986, p.100 1 Blended Learning 要 旨 / Moodle Blended Learning Moodle キーワード:Blended Learning Moodle 1 2008 Moodle e Blended Learning 2009.. 1994 2005 1 2 93% 95% 2002, 2004, 2011 2011 1 Dunkel 1986, p.100 1 Blended Learning

More information

SEJulyMs更新V7

SEJulyMs更新V7 1 2 ( ) Quantitative Characteristics of Software Process (Is There any Myth, Mystery or Anomaly? No Silver Bullet?) Zenya Koono and Hui Chen A process creates a product. This paper reviews various samples

More information

1 はじめに 85

1 はじめに 85 1 はじめに 85 2 ジョイント スペースによるブランド選択の分析 2.1 ジョイント スペース マップ 86 2.2 ジョイント スペースとマーケティング変数を組み込んだブランド選択モデル hjt exp hjt exp hit h jt hjt hjt hjt hjt hk hjkt hjt k k hk h k hj hmt jm. m m hmt h m t jm m j hjt jm hmt.

More information

2 10 The Bulletin of Meiji University of Integrative Medicine 1,2 II 1 Web PubMed elbow pain baseball elbow little leaguer s elbow acupun

2 10 The Bulletin of Meiji University of Integrative Medicine 1,2 II 1 Web PubMed elbow pain baseball elbow little leaguer s elbow acupun 10 1-14 2014 1 2 3 4 2 1 2 3 4 Web PubMed elbow pain baseball elbow little leaguer s elbow acupuncture electric acupuncture 2003 2012 10 39 32 Web PubMed Key words growth stage elbow pain baseball elbow

More information

4. C i k = 2 k-means C 1 i, C 2 i 5. C i x i p [ f(θ i ; x) = (2π) p 2 Vi 1 2 exp (x µ ] i) t V 1 i (x µ i ) 2 BIC BIC = 2 log L( ˆθ i ; x i C i ) + q

4. C i k = 2 k-means C 1 i, C 2 i 5. C i x i p [ f(θ i ; x) = (2π) p 2 Vi 1 2 exp (x µ ] i) t V 1 i (x µ i ) 2 BIC BIC = 2 log L( ˆθ i ; x i C i ) + q x-means 1 2 2 x-means, x-means k-means Bayesian Information Criterion BIC Watershed x-means Moving Object Extraction Using the Number of Clusters Determined by X-means Clustering Naoki Kubo, 1 Kousuke

More information

On the Limited Sample Effect of the Optimum Classifier by Bayesian Approach he Case of Independent Sample Size for Each Class Xuexian HA, etsushi WAKA

On the Limited Sample Effect of the Optimum Classifier by Bayesian Approach he Case of Independent Sample Size for Each Class Xuexian HA, etsushi WAKA Journal Article / 学術雑誌論文 ベイズアプローチによる最適識別系の有限 標本効果に関する考察 : 学習標本の大きさ がクラス間で異なる場合 (< 論文小特集 > パ ターン認識のための学習 : 基礎と応用 On the limited sample effect of bayesian approach : the case of each class 韓, 雪仙 ; 若林, 哲史

More information

Unknown

Unknown Journal of Breast and Thyroid Sonology Journal of Breast and Thyroid Sonology Vol.2, No.3 July 2013 Report The 30 th Meeting of Japan Association of Breast and Thyroid Sonology... 1 Department of Organ

More information

1 Nelson-Siegel Nelson and Siegel(1987) 3 Nelson-Siegel 3 Nelson-Siegel 2 3 Nelson-Siegel 2 Nelson-Siegel Litterman and Scheinkman(199

1 Nelson-Siegel Nelson and Siegel(1987) 3 Nelson-Siegel 3 Nelson-Siegel 2 3 Nelson-Siegel 2 Nelson-Siegel Litterman and Scheinkman(199 Nelson-Siegel Nelson-Siegel 1992 2007 15 1 Nelson and Siegel(1987) 2 FF VAR 1996 FF B) 1 Nelson-Siegel 15 90 1 Nelson and Siegel(1987) 3 Nelson-Siegel 3 Nelson-Siegel 2 3 Nelson-Siegel 2 Nelson-Siegel

More information