A Unified Approach to Variational Autoencoders and Stochastic Normalizing Flows via Markov Chains
Normalizing flows, diffusion normalizing flows and variational autoencoders
are powerful generative models. In this paper, we provide a unified framework
to handle these approaches via Markov chains. More precisely, we consider stochastic
normalizing flows as a pair of Markov chains fulfilling certain properties, and show
that many state-of-the-art models for data generation fit into this framework.
The Markov chain point of view enables us to couple deterministic layers, such
as invertible neural networks, with stochastic layers, such as Metropolis-Hastings
layers, Langevin layers and variational autoencoders, in a mathematically sound
way. Besides layers possessing densities, such as Langevin layers, diffusion layers
or variational autoencoders, our framework also handles layers without densities,
such as deterministic layers or Metropolis-Hastings layers. Hence it
establishes a useful mathematical tool to combine the various approaches.
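The composition of deterministic and stochastic layers described above can be sketched as a forward Markov chain that pushes latent samples through a sequence of kernels. The following is a minimal illustrative sketch, not the paper's method: the affine map, Langevin step sizes, the standard-Gaussian target and all function names are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target: standard 2D Gaussian, so grad log p(x) = -x.
def grad_log_target(x):
    return -x

def log_target(x):
    return -0.5 * np.sum(x**2, axis=-1)

def deterministic_layer(x, scale=1.5, shift=0.5):
    # Invertible affine map (stand-in for an invertible neural network layer);
    # this kernel is deterministic, hence has no transition density.
    return scale * x + shift

def langevin_layer(x, step=0.1):
    # One unadjusted Langevin step towards the target density;
    # this kernel has a Gaussian transition density.
    noise = rng.standard_normal(x.shape)
    return x + step * grad_log_target(x) + np.sqrt(2 * step) * noise

def metropolis_hastings_layer(x, prop_std=0.5):
    # Random-walk proposal with accept/reject; because rejected samples
    # stay put, this kernel also has no transition density.
    prop = x + prop_std * rng.standard_normal(x.shape)
    log_alpha = log_target(prop) - log_target(x)
    accept = np.log(rng.uniform(size=x.shape[0])) < log_alpha
    return np.where(accept[:, None], prop, x)

# Compose the layers into a forward Markov chain: latent -> data-like samples.
x = rng.standard_normal((1000, 2))  # samples from the latent distribution
for layer in [deterministic_layer, langevin_layer, metropolis_hastings_layer]:
    x = layer(x)
print(x.shape)
```

In the paper's framework, each such layer is one transition kernel of the forward chain, and training additionally requires a matching reverse chain; the sketch above only illustrates the forward sampling direction.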