Bayesian Flow Networks: A New Deep Generative Modeling Approach

Bayesian Flow Networks (BFNs) are a new deep generative modeling technique, recently introduced in a paper by Alex Graves et al. from NNAISENSE. BFNs combine Bayesian inference with neural networks, iteratively refining a distribution over the data through a sequence of noisy observations.

The key idea behind BFNs is to maintain two connected distributions: an input distribution that receives information about each data variable independently via Bayesian updates, and an output distribution produced by a neural network that can exploit dependencies between variables. The input distribution starts as a simple prior such as a standard normal or uniform distribution. At each step it is updated with a sample from a “sender” distribution centered on the data, bringing it closer to the data. The parameters of the input distribution are fed into the neural network, which produces the output distribution meant to predict the data. The loss is the KL divergence between the sender distribution and a “receiver” distribution, obtained by integrating the sender’s likelihood over the output distribution.
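To make the loop concrete, here is a minimal NumPy sketch of the continuous-data case. The conjugate Gaussian update follows the paper's setup; everything else is an illustrative assumption: `network` is an identity stand-in rather than the paper's architecture, and a constant accuracy `alpha` replaces the derived accuracy schedule.

```python
import numpy as np

rng = np.random.default_rng(0)

def bayesian_update(mu, rho, y, alpha):
    """Conjugate update of the Gaussian input distribution N(mu, 1/rho)
    after observing a sender sample y drawn with precision alpha."""
    rho_new = rho + alpha
    mu_new = (rho * mu + alpha * y) / rho_new
    return mu_new, rho_new

def network(mu, rho, t):
    """Placeholder for the neural network that maps the input
    distribution's parameters (plus the time t) to the output
    distribution. A real BFN would use e.g. a U-Net or transformer."""
    return mu  # identity stand-in

x = rng.normal(size=4)               # toy "data": 4 continuous variables
mu, rho = np.zeros(4), np.ones(4)    # standard-normal prior

n_steps = 10
alpha = 1.0                          # fixed accuracy; the paper derives a schedule
for i in range(1, n_steps + 1):
    x_hat = network(mu, rho, i / n_steps)   # output distribution (its mean here)
    # Training: the sender is centered on the real data x.
    # Sampling would instead draw y around the network's prediction x_hat.
    y = x + rng.normal(size=4) / np.sqrt(alpha)
    mu, rho = bayesian_update(mu, rho, y, alpha)

print(mu)  # the input distribution's mean ends up close to x
```

The loss, omitted here, would compare the sender distribution N(x, 1/alpha) against the receiver distribution implied by `x_hat` at each step.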

BFNs were tested on image, text and binary data. On CIFAR-10, they achieved 2.66 bits/dim, close to the state of the art. On dynamically binarized MNIST, they reached 1.41 bits/dim, beating other discrete diffusion models. On character-level modeling of the text8 dataset, BFNs also outperformed prior discrete diffusion methods with 1.41 bits/char.

The results show BFNs can effectively model both continuous and discrete data distributions. Being able to handle discrete data with a fully continuous process is a notable advantage over regular diffusion models, since the network inputs remain continuous and differentiable even when the data themselves are discrete. The framework is also flexible enough to handle different data types with minimal modification.
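To illustrate that point, here is a hedged sketch of the discrete-data update along the lines described in the paper: the data are class indices, yet the input distribution's parameters are continuous probabilities on the simplex, updated multiplicatively by a Gaussian sender sample. The alphabet size `K`, the accuracy `alpha` and the loop length are arbitrary toy choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 27                                   # e.g. a text8-like alphabet size

def discrete_bayesian_update(theta, y):
    """Multiply the categorical input distribution by e^y and renormalize;
    theta stays a continuous point on the probability simplex."""
    theta = theta * np.exp(y)
    return theta / theta.sum()

x = 3                                    # true class index
theta = np.full(K, 1.0 / K)              # uniform categorical prior
alpha = 0.2                              # sender accuracy (fixed for the toy)

for _ in range(50):
    e_x = np.eye(K)[x]                   # one-hot encoding of the data
    # Sender: Gaussian noise around a scaled one-hot of the data,
    # with mean alpha * (K * e_x - 1) and variance alpha * K.
    y = rng.normal(alpha * (K * e_x - 1), np.sqrt(alpha * K))
    theta = discrete_bayesian_update(theta, y)

print(theta.argmax(), theta.max())       # concentrates on the true class
```

The point of the sketch is that nothing in the update is discrete: the network only ever sees the continuous vector `theta`, which is what lets gradients flow through the whole process.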

The authors hypothesize that BFNs could learn faster on very large datasets than regular diffusion models, since the network inputs are less noisy. They also propose conditional BFNs for tasks like super-resolution. Future work may focus on improving the accuracy schedule heuristics and applying BFNs to other data domains like graphs, audio or video.

Overall, BFNs present a compelling new approach for deep generative modeling. Their strong results across different data types highlight the benefits of combining Bayesian inference directly with neural networks. If their advantages for large datasets and discrete modeling hold up, BFNs could become a new go-to technique for tasks like image generation and language modeling.
