Estimating duration distribution aided by auxiliary longitudinal measures in presence of missing time origin
Abstract: Understanding the distribution of an event duration time is essential in many studies. The exact time to the event is often unavailable, and thus so is the full event duration. By linking relevant longitudinal measures to the event duration, we propose to estimate the duration distribution via the first-hitting-time model (e.g. Lee and Whitmore in Stat Sci 21(4):501–513, 2006). The longitudinal measures are assumed to follow a Wiener process with random drift. We apply a variant of the MCEM algorithm to compute likelihood-based estimators of the parameters in the longitudinal process model. This allows us to ada...
Source: Lifetime Data Analysis - April 5, 2021 Category: Statistics Source Type: research
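The first-hitting-time construction referenced above has a convenient closed form when the latent process is a Wiener process: the time for a process started at \(x_0 > 0\) with drift \(\mu\) and volatility \(\sigma\) to first reach the boundary 0 is inverse Gaussian distributed. A minimal Python sketch of that density, with all parameter names (x0, mu, sigma) illustrative rather than taken from the paper:

    import numpy as np

    def fht_density(t, x0, mu, sigma):
        """Density of the first time the Wiener process x0 + mu*t + sigma*W(t)
        hits the boundary 0 (inverse Gaussian form); requires t > 0, x0 > 0."""
        t = np.asarray(t, dtype=float)
        return (x0 / (sigma * np.sqrt(2.0 * np.pi * t ** 3))
                * np.exp(-((x0 + mu * t) ** 2) / (2.0 * sigma ** 2 * t)))

    # e.g. a latent health process starting at 5 and drifting down at rate 1
    times = np.linspace(0.1, 20.0, 200)
    dens = fht_density(times, x0=5.0, mu=-1.0, sigma=1.0)

With \(\mu \le 0\) the boundary is hit with probability one; otherwise the distribution is defective, which is one way cure fractions arise in FHT models.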

The semiparametric accelerated trend-renewal process for recurrent event data
Abstract: Recurrent event data arise in many biomedical longitudinal studies when health-related events can occur repeatedly for each subject during the follow-up time. In this article, we examine the gap times between recurrent events. We propose a new semiparametric accelerated gap time model based on the trend-renewal process, which contains trend and renewal components that allow the intensity function to vary between successive events. We use the Buckley–James imputation approach to deal with censored transformed gap times. The proposed estimators are shown to be consistent and asymptotically normal. Model diagnos...
Source: Lifetime Data Analysis - March 25, 2021 Category: Statistics Source Type: research
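For orientation, the trend-renewal process (TRP) is defined by transforming the event times with the integrated trend function and requiring the result to be an ordinary renewal process; in the notation used here (not necessarily the paper's),

\[
\Lambda(t) = \int_0^t \lambda(u)\,du, \qquad \Lambda(T_1), \Lambda(T_2), \ldots \ \text{form a renewal process},
\]

so the transformed gap times \(\Lambda(T_i) - \Lambda(T_{i-1})\) are i.i.d. with some distribution \(F\); the trend component \(\lambda\) lets the intensity vary between successive events while the renewal distribution \(F\) governs the gaps.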

Information measures and design issues in the study of mortality deceleration: findings for the gamma-Gompertz model
Abstract: Mortality deceleration, or the slowing down of death rates at old ages, has been repeatedly investigated, but empirical studies of this phenomenon have produced mixed results. The scarcity of observations at the oldest ages complicates the statistical assessment of mortality deceleration, even in the parsimonious parametric framework of the gamma-Gompertz model considered here. The need for thorough verification of the ages at death can further limit the available data. As logistical constraints may only allow validation of survivors beyond a certain (high) age, samples may be restricted to a certain age range. If we...
Source: Lifetime Data Analysis - February 25, 2021 Category: Statistics Source Type: research
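For context, the gamma-Gompertz model combines a Gompertz individual hazard \(a e^{bx}\) with a gamma frailty of unit mean and variance \(\gamma\); integrating out the frailty gives the population hazard

\[
\mu(x) \;=\; \frac{a e^{bx}}{1 + \frac{a\gamma}{b}\left(e^{bx} - 1\right)}, \qquad a, b, \gamma > 0,
\]

which rises roughly exponentially at adult ages and then decelerates toward the plateau \(b/\gamma\), the behavior whose detectability the paper studies.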

Testing equivalence of survival before but not after end of follow-up
Abstract: For equivalence trials with survival outcomes, a popular testing approach is the elegant test for equivalence of two survival functions suggested by Wellek (Biometrics 49:877–881, 1993). This test evaluates whether or not the difference between the true survival curves is practically irrelevant by specifying an equivalence margin on the hazard ratio under the proportional hazards assumption. However, this approach is based on extrapolating the behavior of the survival curves to the whole time axis, whereas in practice survival times are only observed until the end of follow-up. We propose a modification of Well...
Source: Lifetime Data Analysis - January 30, 2021 Category: Statistics Source Type: research
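One standard way to operationalize an equivalence margin on the hazard ratio is two one-sided tests (TOST) applied to the estimated log hazard ratio from a Cox fit; the sketch below is a generic normal-approximation TOST, not Wellek's specific statistic or the paper's modification, and the margin delta is an illustrative choice:

    import numpy as np
    from scipy.stats import norm

    def tost_log_hr(log_hr_hat, se, delta=1.5, alpha=0.05):
        """TOST for equivalence 1/delta < HR < delta, using a normal
        approximation to the estimated log hazard ratio."""
        eps = np.log(delta)                                # margin on the log scale
        p_lower = 1.0 - norm.cdf((log_hr_hat + eps) / se)  # H0: log HR <= -eps
        p_upper = norm.cdf((log_hr_hat - eps) / se)        # H0: log HR >= +eps
        p = max(p_lower, p_upper)                          # reject non-equivalence if p < alpha
        return p, p < alpha

    # e.g. estimated log HR of 0.05 with standard error 0.12
    p_value, equivalent = tost_log_hr(0.05, 0.12)

Restricting such a test to the observed follow-up window, rather than extrapolating the proportional hazards model to the whole time axis, is the modification the abstract announces.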

The ROC of Cox proportional hazards cure models with application in cancer studies
Abstract: With recent advances in cancer screening and treatment, many patients with cancer are identified at an early stage and clinically cured. Importantly, uncured patients should be treated in a timely manner before the cancer progresses to advanced stages for which therapeutic options are rather limited. It is also crucial to identify uncured subjects among patients with early-stage cancers for clinical trials to develop effective adjuvant therapies. Thus, it is of interest to develop statistical predictive models with as high accuracy as possible in predicting the latent cure status. The receiver operating characteristic curve (...
Source: Lifetime Data Analysis - January 28, 2021 Category: Statistics Source Type: research
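The cure-status prediction described here typically starts from the mixture cure model, in which a logistic component gives the probability of being uncured and the ROC analysis measures how well that component discriminates the latent cure status; a generic sketch in our notation:

\[
S(t \mid \mathbf{x}, \mathbf{z}) \;=\; 1 - \pi(\mathbf{z}) + \pi(\mathbf{z})\,S_u(t \mid \mathbf{x}), \qquad \pi(\mathbf{z}) \;=\; \frac{\exp(\boldsymbol{\gamma}^{\top}\mathbf{z})}{1 + \exp(\boldsymbol{\gamma}^{\top}\mathbf{z})},
\]

where \(\pi(\mathbf{z})\) is the probability of being uncured and \(S_u\) is the survival function of the uncured, modeled here via Cox proportional hazards.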

An additive hazards cure model with informative interval censoring
Abstract: The existence of a cured subgroup arises quite often in survival studies, and many authors have considered it under various situations (Farewell in Biometrics 38:1041–1046, 1982; Kuk and Chen in Biometrika 79:531–541, 1992; Lam and Xue in Biometrika 92:573–586, 2005; Zhou et al. in J Comput Graph Stat 27:48–58, 2018). In this paper, we discuss the situation where only interval-censored data are available and, furthermore, the censoring may be informative, for which there does not seem to exist an established estimation procedure. For the analysis, we present a three-component model consisting of a logistic mod...
Source: Lifetime Data Analysis - January 22, 2021 Category: Statistics Source Type: research
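The abstract is truncated before listing all three components, but the first two are standard in this setting: a logistic model for the susceptibility (non-cure) probability and an additive hazards model for susceptible subjects; sketched generically,

\[
P(\text{susceptible} \mid \mathbf{z}) = \frac{\exp(\boldsymbol{\gamma}^{\top}\mathbf{z})}{1 + \exp(\boldsymbol{\gamma}^{\top}\mathbf{z})}, \qquad \lambda(t \mid \mathbf{x}) = \lambda_0(t) + \boldsymbol{\beta}^{\top}\mathbf{x},
\]

with the third component, not recoverable from the truncated text, describing the informative interval-censoring mechanism.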

Semi-parametric survival analysis via Dirichlet process mixtures of the First Hitting Time model
Abstract: Time-to-event data often violate the proportional hazards assumption inherent in the popular Cox regression model. Such violations are especially common in biological and medical data, where latent heterogeneity due to unmeasured covariates or time-varying effects is common. A variety of parametric survival models have been proposed in the literature which make more appropriate assumptions on the hazard function, at least for certain applications. One such model is derived from the First Hitting Time (FHT) paradigm, which assumes that a subject's event time is determined by a latent stochastic proce...
Source: Lifetime Data Analysis - January 8, 2021 Category: Statistics Source Type: research
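A Dirichlet process mixture over FHT parameters has the usual hierarchical form; a generic sketch (the paper's specific kernel and base measure are not visible in the truncated abstract):

\[
T_i \mid \theta_i \sim \mathrm{FHT}(\theta_i), \qquad \theta_i \mid G \overset{\text{iid}}{\sim} G, \qquad G \sim \mathrm{DP}(\alpha, G_0),
\]

where the FHT kernel would typically be the inverse Gaussian first-passage law of a drifted Wiener process; mixing over \(\theta_i\) is what absorbs the latent heterogeneity that breaks proportional hazards.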

Semiparametric regression based on quadratic inference function for multivariate failure time data with auxiliary information
Abstract: This paper deals with statistical inference procedures for multivariate failure time data when the primary covariate can be measured only on a subset of the full cohort but auxiliary information is available. To improve the efficiency of statistical inference, we use the quadratic inference function approach to incorporate the intra-cluster correlation and a kernel smoothing technique to further utilize the auxiliary information. The proposed method is shown to be more efficient than those ignoring the intra-cluster correlation and auxiliary information, and it is easy to implement. In addition, we develop a chi-squared te...
Source: Lifetime Data Analysis - January 8, 2021 Category: Statistics Source Type: research
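For orientation, the quadratic inference function approach (Qu, Lindsay and Li, Biometrika, 2000) expands the inverse of the working correlation matrix over known basis matrices and combines the resulting extended scores GMM-style, so no correlation parameters need to be estimated; schematically,

\[
R^{-1} \approx \sum_{j=1}^{m} a_j M_j, \qquad \bar{g}_N(\boldsymbol{\beta}) = \frac{1}{N}\sum_{i=1}^{N} g_i(\boldsymbol{\beta}), \qquad \hat{\boldsymbol{\beta}} = \arg\min_{\boldsymbol{\beta}}\, N\,\bar{g}_N^{\top} C_N^{-1} \bar{g}_N,
\]

with \(C_N\) the empirical covariance of the \(g_i\); the kernel-smoothing step then substitutes auxiliary-information-based estimates into these scores.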

Optimal designs for discrete-time survival models with random effects
Abstract: This paper considers the optimal design for the frailty model with discrete-time survival endpoints in longitudinal studies. We introduce random effects into the discrete hazard models to account for heterogeneity between experimental subjects, which causes observations of the same subject at sequential time points to be correlated. We propose a general design method to collect the survival endpoints as inexpensively and efficiently as possible. A cost-based generalized D (\(D_s\))-optimal design criterion is proposed to derive the optimal designs for estimating the fixed effects with cost constraint...
Source: Lifetime Data Analysis - January 8, 2021 Category: Statistics Source Type: research
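The \(D_s\)-criterion singles out the \(s\) parameters of interest (here the fixed effects) and treats the remainder as nuisance: partitioning the information matrix \(M(\xi)\) of a design \(\xi\) accordingly, a \(D_s\)-optimal design maximizes

\[
\Psi_s(\xi) \;=\; \frac{|M(\xi)|}{|M_{22}(\xi)|}, \qquad M(\xi) = \begin{pmatrix} M_{11}(\xi) & M_{12}(\xi) \\ M_{21}(\xi) & M_{22}(\xi) \end{pmatrix},
\]

where \(M_{22}\) is the block for the nuisance parameters; the cost-based version of the paper maximizes this criterion subject to a budget constraint.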

Joint modeling of longitudinal continuous, longitudinal ordinal, and time-to-event outcomes
Abstract: In this paper, we propose an innovative method for jointly analyzing survival data and longitudinally measured continuous and ordinal data. We use a random effects accelerated failure time model for survival outcomes, a linear mixed model for continuous longitudinal outcomes, and a proportional odds mixed model for ordinal longitudinal outcomes, where these outcome processes are linked through a set of association parameters. A primary objective of this study is to examine the effects of association parameters on the estimators of joint models. The model parameters are estimated by the method of maximum likelihood. ...
Source: Lifetime Data Analysis - November 24, 2020 Category: Statistics Source Type: research
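One common way to link the three submodels is through shared random effects scaled by association parameters; the following generic formulation is a sketch of that idea, not necessarily the paper's exact parameterization:

\[
y_{1i}(t) = \mathbf{x}_{1i}^{\top}\boldsymbol{\beta}_1 + b_{1i} + \varepsilon_{1i}(t), \qquad \operatorname{logit} P\{y_{2i}(t) \le k\} = \alpha_k - \mathbf{x}_{2i}^{\top}\boldsymbol{\beta}_2 - b_{2i},
\]
\[
\log T_i = \mathbf{x}_{3i}^{\top}\boldsymbol{\beta}_3 + \lambda_1 b_{1i} + \lambda_2 b_{2i} + \varepsilon_{3i},
\]

where \(\lambda_1\) and \(\lambda_2\) are the association parameters; setting them to zero decouples the survival process from the longitudinal ones, which is why their effect on the joint estimators is the stated focus.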

Firth adjusted score function for monotone likelihood in the mixture cure fraction model
Abstract: Models for situations where some individuals are long-term survivors, immune or non-susceptible to the event of interest, are extensively studied in biomedical research. Fitting a regression model can be problematic in situations involving small sample sizes with high censoring rates, since the maximum likelihood estimates of some coefficients may be infinite. This phenomenon is called monotone likelihood, and it occurs in the presence of many categorical covariates, especially when one covariate level is not associated with any failure (in survival analysis) or when a categorical covariate perfectly predicts a binary res...
Source: Lifetime Data Analysis - November 13, 2020 Category: Statistics Source Type: research
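Firth's adjustment removes the leading bias term of the MLE by penalizing the likelihood with Jeffreys' invariant prior, which as a by-product keeps estimates finite under monotone likelihood; in generic form,

\[
\ell^{*}(\boldsymbol{\beta}) = \ell(\boldsymbol{\beta}) + \tfrac{1}{2}\log |I(\boldsymbol{\beta})|, \qquad U_r^{*}(\boldsymbol{\beta}) = U_r(\boldsymbol{\beta}) + \tfrac{1}{2}\operatorname{tr}\!\left\{ I(\boldsymbol{\beta})^{-1} \frac{\partial I(\boldsymbol{\beta})}{\partial \beta_r} \right\},
\]

where \(I(\boldsymbol{\beta})\) is the Fisher information; the paper carries this score correction over to the mixture cure fraction model.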

The added value of new covariates to the Brier score in Cox survival models
Abstract: Calibration is an important measure of the predictive accuracy of a prognostic risk model. A widely used measure of calibration when the outcome is survival time is the expected Brier score. In this paper, methodology is developed to accurately estimate the difference in expected Brier scores derived from nested survival models and to compute an accompanying variance estimate of this difference. The methodology is applicable to time-invariant and time-varying coefficient Cox survival models. The nested survival model approach is often applied to the scenario where the full model consists of conventional and new co...
Source: Lifetime Data Analysis - October 22, 2020 Category: Statistics Source Type: research
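The expected Brier score at a horizon \(t\) is commonly estimated with inverse-probability-of-censoring weighting (Graf et al. 1999); below is a minimal numpy sketch in which G, an estimate of the censoring survival function, is supplied by the caller, and all names are illustrative rather than the paper's:

    import numpy as np

    def ipcw_brier(t, time, event, surv_pred, G):
        """IPCW Brier score at horizon t.
        time: observed times; event: 1 = event, 0 = censored;
        surv_pred: predicted S(t | x_i) for each subject;
        G: vectorized callable estimating the censoring survival function."""
        time = np.asarray(time, dtype=float)
        event = np.asarray(event, dtype=int)
        s = np.asarray(surv_pred, dtype=float)
        died = (time <= t) & (event == 1)   # event observed before horizon
        alive = time > t                    # still under observation at horizon
        contrib = np.zeros_like(s)          # censored-before-t subjects drop out
        contrib[died] = s[died] ** 2 / G(time[died])
        contrib[alive] = (1.0 - s[alive]) ** 2 / G(t)
        return contrib.mean()

The paper's contribution is the next step: estimating the difference between two such scores from nested Cox models, together with a variance for that difference.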

A dependent Dirichlet process model for survival data with competing risks
Abstract: In this paper, we first propose a dependent Dirichlet process (DDP) model using a mixture of Weibull models, with each mixture component resembling a Cox model for survival data. We then build a Dirichlet process mixture model for competing risks data without regression covariates. Next we extend this model to a DDP model for competing risks regression data by using a multiplicative covariate effect on subdistribution hazards in the mixture components. Though built on proportional hazards (or subdistribution hazards) models, the proposed nonparametric Bayesian regression models do not require the assumption of const...
Source: Lifetime Data Analysis - October 12, 2020 Category: Statistics Source Type: research
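Within each mixture component the covariates act multiplicatively on the subdistribution hazard, in the style of Fine and Gray; writing \(F_k(t \mid \mathbf{x})\) for the cumulative incidence function of cause \(k\), a sketch in our notation:

\[
\lambda_k(t \mid \mathbf{x}) \;=\; -\frac{\partial}{\partial t}\log\{1 - F_k(t \mid \mathbf{x})\} \;=\; \lambda_{k0}(t)\exp(\boldsymbol{\beta}_k^{\top}\mathbf{x}),
\]

and mixing such components over a Dirichlet process prior is what frees the marginal model from the constant-effect assumption the truncated final sentence alludes to.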

Accelerated failure time model for data from outcome-dependent sampling
Abstract: Outcome-dependent sampling designs such as the case–control or case–cohort design are widely used in epidemiological studies for their outstanding cost-effectiveness. In this article, we propose and develop a smoothed weighted Gehan estimating equation approach for inference in an accelerated failure time model under a general failure-time outcome-dependent sampling scheme. The proposed estimating equation is continuously differentiable and can be solved by standard numerical methods. In addition to developing the asymptotic properties of the proposed estimator, we also propose and investigate a new optimal p...
Source: Lifetime Data Analysis - October 12, 2020 Category: Statistics Source Type: research
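For the AFT model \(\log T_i = \boldsymbol{\beta}^{\top}\mathbf{Z}_i + \varepsilon_i\) with residuals \(e_i(\boldsymbol{\beta}) = \log \tilde{T}_i - \boldsymbol{\beta}^{\top}\mathbf{Z}_i\), the classical (unweighted, unsmoothed) Gehan estimating function is the pairwise sum

\[
U_G(\boldsymbol{\beta}) \;=\; \frac{1}{n^2}\sum_{i=1}^{n}\sum_{j=1}^{n}\delta_i\,(\mathbf{Z}_i - \mathbf{Z}_j)\,I\{e_i(\boldsymbol{\beta}) \le e_j(\boldsymbol{\beta})\};
\]

smoothing replaces the indicator with a smooth surrogate such as \(\Phi\{(e_j - e_i)/h\}\) so the equation becomes continuously differentiable, and sampling weights are added to account for the outcome-dependent design. The notation here is generic, not necessarily the paper's.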

EM algorithm for the additive risk mixture cure model with interval-censored data
Abstract: Interval-censored failure time data arise in a number of fields, and many authors have recently paid more attention to their analysis. However, regression analysis of interval-censored data under the additive risk model can be challenging because the likelihood is complex to maximize, especially when there exists a non-ignorable cure fraction in the population. For this problem, we develop a sieve maximum likelihood estimation approach based on Bernstein polynomials. To relieve the computational burden, an expectation–maximization algorithm exploiting a Poisson data augmentation is proposed. Under some mild conditions, ...
Source: Lifetime Data Analysis - September 30, 2020 Category: Statistics Source Type: research
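The sieve here approximates the unknown cumulative baseline hazard by a Bernstein polynomial on the data range \([l, u]\), with monotonicity imposed through ordered coefficients; in generic notation,

\[
\Lambda_0(t) \;\approx\; \sum_{k=0}^{m}\phi_k B_k(t, m), \qquad B_k(t, m) = \binom{m}{k}\left(\frac{t-l}{u-l}\right)^{k}\left(1 - \frac{t-l}{u-l}\right)^{m-k},
\]

with \(0 \le \phi_0 \le \cdots \le \phi_m\) so that \(\Lambda_0\) is nondecreasing; the Poisson augmentation then gives the E-step closed-form expectations of latent Poisson counts, which is what relieves the computational burden.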