Attentive State-Space Modeling of Disease Progression
Models of disease progression are instrumental for predicting patient outcomes and understanding disease dynamics. Existing models provide the patient with pragmatic (supervised) predictions of risk, but do not provide the clinician with intelligible (unsupervised) representations of disease pathophysiology. In this paper, we develop the attentive state-space model, a deep probabilistic model that learns accurate and interpretable structured representations for disease trajectories. Unlike Markovian state-space models, in which the dynamics are memoryless, our model uses an attention mechanism to create "memoryful" dynamics, whereby attention weights determine the dependence of future disease states on past medical history. To learn the model parameters from medical records, we develop an inference algorithm that simultaneously learns a compiled inference network and the model parameters, leveraging the attentive state-space representation to construct a Rao-Blackwellized variational approximation of the posterior state distribution. Experiments on data from the UK Cystic Fibrosis registry show that our model demonstrates superior predictive accuracy and provides insights into the progression of chronic disease.
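To make the "memoryful" dynamics concrete, the following is a minimal sketch of an attentive transition: the distribution over the next disease state is a convex combination of per-state transition rows, mixed by attention weights over the entire past trajectory. The transition matrix, the recency-based attention, and all names here are illustrative assumptions, not the paper's actual parameterization (in the model, the weights come from a learned attention network conditioned on the observed medical history).

```python
import numpy as np

K = 3  # number of discrete disease states (e.g. mild/moderate/severe); illustrative
A = np.array([[0.8, 0.15, 0.05],   # hypothetical per-state baseline transition rows
              [0.1, 0.7,  0.2 ],
              [0.0, 0.2,  0.8 ]])

def attention_weights(history):
    """Toy attention: softmax over recency, so recent states weigh more.
    In the actual model these weights are produced by a learned network."""
    scores = np.arange(1, len(history) + 1, dtype=float)
    w = np.exp(scores - scores.max())
    return w / w.sum()

def next_state_dist(history):
    """Memoryful transition: attention-weighted mixture of the transition
    rows indexed by every past state, so p(z_{t+1}) depends on all of z_{1:t}."""
    w = attention_weights(history)
    return sum(wi * A[z] for wi, z in zip(w, history))

history = [0, 0, 1]           # past disease states z_1..z_t
p = next_state_dist(history)  # distribution over z_{t+1}
print(p, p.sum())
```

Because the attention weights and the rows of A each sum to one, the mixture is itself a valid probability distribution, which is what keeps the state space discrete and interpretable while the dynamics are non-Markovian.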
Reviews: Attentive State-Space Modeling of Disease Progression
The key idea in this paper is to maintain this property of discrete state-space models while relaxing the stationary Markov assumption on the transition probabilities that we typically use to simplify inference. Although this idea is not new, the variational inference algorithm for this model does seem to be new. In practice, we can relax the "strict" Markov assumption (i.e. the state in year t+1 is conditionally independent of the past given the state at year t) by augmenting the state with the past h−1 years. This keeps the inference exact and relatively easy to implement.
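The state-augmentation baseline the reviewer describes can be sketched as follows: a second-order chain over K base states is lifted into a first-order chain over K² augmented states (the last two years jointly), after which standard exact HMM machinery applies unchanged. The specific "sticky" second-order dynamics and all function names below are hypothetical, chosen only to make the lifting runnable.

```python
from itertools import product
import numpy as np

K, h = 3, 2  # K base disease states; remember the past h years jointly

# Augmented state = tuple of the last h base states; there are K**h of them.
aug_states = list(product(range(K), repeat=h))
index = {s: i for i, s in enumerate(aug_states)}

def lift(order2_transition):
    """Lift a second-order transition p(z_{t+1} | z_t, z_{t-1}) into a
    first-order matrix over the augmented state (z_{t-1}, z_t), so exact
    forward-backward inference applies without modification."""
    T = np.zeros((K**h, K**h))
    for (z_prev, z_cur) in aug_states:
        for z_next in range(K):
            T[index[(z_prev, z_cur)], index[(z_cur, z_next)]] = \
                order2_transition(z_next, z_cur, z_prev)
    return T

def p2(z_next, z_cur, z_prev):
    """Hypothetical second-order dynamics: extra-sticky when the last two
    states agree, uniform otherwise."""
    base = np.full(K, 1.0 / K)
    if z_cur == z_prev:
        base = 0.5 * base
        base[z_cur] += 0.5
    return base[z_next]

T = lift(p2)
print(T.shape, T.sum(axis=1))  # each row sums to 1: a valid first-order chain
```

The cost of this trick is that the augmented state space grows as K^h, which is why it is only practical for small fixed memory lengths, whereas the attentive model lets the effective memory depend on the patient's whole history.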
The authors propose a generative discrete state-space model and a novel variational inference algorithm for modeling disease trajectories. Overall, reviewers found the paper well-written and convincing. However, the authors are encouraged to strongly consider the feedback received. Specifically, in preparing the camera-ready version, please incorporate experiments comparing to common extensions of simple, well-understood methods (e.g., higher-order HMMs).
Attentive State-Space Modeling of Disease Progression
Alaa, Ahmed M.; van der Schaar, Mihaela