This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Learning conditional variational autoencoders with missing covariates
Citations: 24
Authors: 5
Year: 2023
Abstract
Conditional variational autoencoders (CVAEs) are versatile deep latent variable models that extend the standard VAE framework by conditioning the generative model with auxiliary covariates. The original CVAE model assumes that the data samples are independent, whereas more recent conditional VAE models, such as the Gaussian process (GP) prior VAEs, can account for complex correlation structures across all data samples. While several methods have been proposed to learn standard VAEs from partially observed datasets, these methods fall short for conditional VAEs. In this work, we propose a method to learn conditional VAEs from datasets in which auxiliary covariates can contain missing values as well. The proposed method augments the conditional VAEs with a prior distribution for the missing covariates and estimates their posterior using amortised variational inference. At training time, our method accounts for the uncertainty associated with the missing covariates while simultaneously maximising the evidence lower bound. We develop computationally efficient methods to learn CVAEs and GP prior VAEs that are compatible with mini-batching. Our experiments on simulated datasets as well as on real-world biomedical datasets show that the proposed method outperforms previous methods in learning conditional VAEs from non-temporal, temporal, and longitudinal datasets.
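The abstract's central idea, augmenting the model with a prior distribution over missing covariates and estimating their posterior with amortised variational inference, can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the standard-normal prior, the diagonal-Gaussian posterior parameters `mu_q`/`log_var_q` (in practice produced by an inference network), and all variable names are assumptions made for the sake of the example.

```python
import numpy as np

def gaussian_kl_per_dim(mu, log_var):
    # Per-dimension KL( N(mu, exp(log_var)) || N(0, 1) ),
    # the extra ELBO penalty contributed by each missing covariate.
    return 0.5 * (np.exp(log_var) + mu**2 - 1.0 - log_var)

def impute_missing_covariates(c, mask, mu_q, log_var_q, rng):
    # Observed entries (mask == 1) are kept as-is; missing entries
    # (mask == 0) are replaced by a reparameterised draw from the
    # amortised posterior q(c_miss | x) = N(mu_q, exp(log_var_q)).
    eps = rng.standard_normal(c.shape)
    c_sample = mu_q + np.exp(0.5 * log_var_q) * eps
    return np.where(mask == 1, c, c_sample)

rng = np.random.default_rng(0)

# Toy batch: 4 samples, 3 auxiliary covariates; NaN marks a missing value.
c = np.array([[0.5, np.nan, 1.2],
              [np.nan, 0.3, -0.7],
              [1.1, 0.9, np.nan],
              [0.0, -1.5, 2.2]])
mask = (~np.isnan(c)).astype(float)
c_filled = np.nan_to_num(c)  # placeholders; missing entries get overwritten

# Hypothetical inference-network outputs for q(c_miss | x).
mu_q = rng.standard_normal(c.shape) * 0.1
log_var_q = np.full(c.shape, -1.0)

# Covariates with missing entries filled by posterior samples; these would
# then condition the CVAE decoder during training.
c_hat = impute_missing_covariates(c_filled, mask, mu_q, log_var_q, rng)

# KL term added to the ELBO: only the missing entries contribute.
kl_missing = (gaussian_kl_per_dim(mu_q, log_var_q) * (1.0 - mask)).sum()
```

Because the missing entries are sampled rather than fixed to a point estimate, each training step sees a different plausible completion, which is how the method accounts for the uncertainty associated with the missing covariates while maximising the evidence lower bound.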
Related works
Maximum Likelihood from Incomplete Data Via the EM Algorithm
1977 · 49,488 citations
Inference from Iterative Simulation Using Multiple Sequences
1992 · 16,580 citations
Auto-Encoding Variational Bayes
2013 · 15,586 citations
Understanding the difficulty of training deep feedforward neural networks
2010 · 12,676 citations