8-Part Technical Series

Diffusion & Flow Matching Internals

A deep technical series demystifying how diffusion models and flow matching actually work — from probability foundations to state-of-the-art generation. Built for engineers who want to understand, not just use.

8 Articles
~290 min total read
30+ Interactive demos
01
Foundations

Generative Modeling & Probability Foundations

The landscape of generative models, probability distributions, KL divergence, evidence lower bounds, and why learning to reverse noise is the key insight behind diffusion.
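As a taste of the probability tooling this article builds on, here is a minimal sketch of the closed-form KL divergence between two univariate Gaussians, the workhorse term inside diffusion's evidence lower bound (the function name and values are illustrative, not from the article):

```python
import numpy as np

def kl_gauss(mu1, var1, mu2, var2):
    """KL( N(mu1, var1) || N(mu2, var2) ) in closed form:
    0.5 * [ log(var2/var1) + (var1 + (mu1 - mu2)^2)/var2 - 1 ]."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

# KL is zero iff the two Gaussians coincide, and grows as they separate
same = kl_gauss(0.0, 1.0, 0.0, 1.0)      # identical distributions -> 0
shifted = kl_gauss(1.0, 1.0, 0.0, 1.0)   # unit mean shift -> 0.5
```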

02
Core Theory

Denoising Diffusion Probabilistic Models

The forward noising process, reverse denoising, the variational bound, noise prediction networks, and the surprisingly simple DDPM training objective.
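The "surprisingly simple" objective can be previewed in a few lines: sample a timestep, noise the data with the closed-form forward marginal, and regress the noise. The schedule below and the zero-predicting stand-in "network" are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear beta schedule (illustrative values in the spirit of DDPM's T=1000)
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas_bar = np.cumprod(1.0 - betas)

def q_sample(x0, t, eps):
    """Forward noising in closed form:
    x_t = sqrt(a_bar_t) * x0 + sqrt(1 - a_bar_t) * eps."""
    return np.sqrt(alphas_bar[t]) * x0 + np.sqrt(1.0 - alphas_bar[t]) * eps

def ddpm_loss(eps_model, x0):
    """The simple DDPM objective: MSE between true and predicted noise."""
    t = rng.integers(0, T)
    eps = rng.standard_normal(x0.shape)
    x_t = q_sample(x0, t, eps)
    return np.mean((eps - eps_model(x_t, t)) ** 2)

# Hypothetical stand-in "network" that always predicts zero noise
x0 = rng.standard_normal(8)
loss = ddpm_loss(lambda x_t, t: np.zeros_like(x_t), x0)
```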

03
Theory

Score Functions & Langevin Dynamics

Score functions as gradients of log-density, score matching, denoising score matching, annealed Langevin dynamics, and noise-conditional score networks.
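A minimal sketch of the Langevin dynamics idea, using the analytic score of a Gaussian rather than a learned network (step size and target are illustrative): repeatedly nudge samples along the score plus injected noise, and the chain's samples approach the target distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

def score(x, mu=3.0, sigma=1.0):
    """Analytic score of N(mu, sigma^2): grad_x log p(x) = (mu - x) / sigma^2."""
    return (mu - x) / sigma**2

# Unadjusted Langevin dynamics, run as 5000 independent chains:
#   x <- x + (step / 2) * score(x) + sqrt(step) * z,   z ~ N(0, I)
x = np.zeros(5000)
step = 0.1
for _ in range(2000):
    x = x + 0.5 * step * score(x) + np.sqrt(step) * rng.standard_normal(x.size)
```

With a finite step size the chain has a small discretization bias, which is part of why annealed schedules and careful step sizes matter in practice.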

04
Theory

SDEs & Continuous-Time Diffusion

Continuous-time diffusion as stochastic differential equations, forward and reverse SDEs, the probability flow ODE, and variance-preserving vs variance-exploding formulations.
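The variance-preserving vs. variance-exploding distinction can be seen numerically: with unit-variance data, a VP forward process keeps the marginal variance at one, while a VE process lets it grow without bound. The schedules below are illustrative, not the article's:

```python
import numpy as np

# Forward-marginal variances for unit-variance data, at times t in [0, 1]
t = np.linspace(0.0, 1.0, 5)

# Variance-preserving (DDPM-style): x_t = sqrt(a_bar) x0 + sqrt(1 - a_bar) eps,
# so var(x_t) = a_bar * 1 + (1 - a_bar) = 1 for all t
a_bar = np.exp(-5.0 * t**2)           # illustrative decay schedule
vp_var = a_bar * 1.0 + (1.0 - a_bar)  # signal variance + noise variance

# Variance-exploding (SMLD-style): x_t = x0 + sigma(t) eps,
# so var(x_t) = 1 + sigma(t)^2 grows with t
sigma = 0.01 * 100.0**t               # illustrative geometric noise schedule
ve_var = 1.0 + sigma**2
```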

05
Core Theory

Flow Matching & Continuous Normalizing Flows

Neural ODEs, continuous normalizing flows, the flow matching objective, conditional flow matching, rectified flows, and optimal transport paths.
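As a preview, the conditional flow matching objective with the linear (rectified-flow) path fits in a few lines: interpolate between a noise sample and a data sample, and regress the model's velocity onto the straight-line target. The stand-in "network" is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def cfm_loss(v_model, x0, x1):
    """Conditional flow matching with the linear (rectified-flow) path:
    x_t = (1 - t) * x0 + t * x1, target velocity u_t = x1 - x0."""
    t = rng.uniform(size=(x0.shape[0], 1))
    x_t = (1.0 - t) * x0 + t * x1
    return np.mean((v_model(x_t, t) - (x1 - x0)) ** 2)

# Hypothetical stand-in "network" that predicts zero velocity everywhere
x0 = rng.standard_normal((16, 2))         # noise samples
x1 = rng.standard_normal((16, 2)) + 4.0   # "data" samples, shifted mean
loss = cfm_loss(lambda x, t: np.zeros_like(x), x0, x1)
```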

06
Architecture

Architectures & Conditioning

U-Net design for diffusion, Diffusion Transformers (DiT), classifier guidance, classifier-free guidance, text conditioning via cross-attention, and latent diffusion.
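Classifier-free guidance, covered in this article, reduces at sampling time to one line: extrapolate from the unconditional noise prediction toward the conditional one. A minimal sketch (function name and guidance scale are illustrative):

```python
import numpy as np

def cfg_eps(eps_uncond, eps_cond, w):
    """Classifier-free guidance: extrapolate from the unconditional toward
    the conditional prediction; w = 0 ignores the condition, w = 1 matches
    plain conditional sampling, w > 1 amplifies the condition."""
    return eps_uncond + w * (eps_cond - eps_uncond)

# Toy predictions to show the extrapolation
eps_u = np.zeros(3)
eps_c = np.ones(3)
guided = cfg_eps(eps_u, eps_c, 7.5)  # a commonly used guidance-scale magnitude
```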

07
Efficiency

Samplers & Acceleration

DDIM, DPM-Solver, Euler and Heun methods, progressive distillation, consistency models, and how to generate in 1–4 steps instead of 1000.
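The key trick behind few-step sampling can be previewed with the deterministic DDIM update: predict x0 from the current sample, then jump directly to an earlier noise level. The sketch below uses an illustrative schedule and checks that, with the true noise plugged in, an arbitrarily large jump lands exactly on the forward marginal:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative cumulative signal levels for a toy linear schedule
T = 1000
betas = np.linspace(1e-4, 0.02, T)
a_bar = np.cumprod(1.0 - betas)

def ddim_step(x_t, eps_pred, t, t_prev):
    """Deterministic DDIM update (eta = 0): predict x0, then jump t -> t_prev."""
    x0_pred = (x_t - np.sqrt(1 - a_bar[t]) * eps_pred) / np.sqrt(a_bar[t])
    return np.sqrt(a_bar[t_prev]) * x0_pred + np.sqrt(1 - a_bar[t_prev]) * eps_pred

# With the *true* noise as eps_pred, a jump of any size is exact,
# which is why coarse timestep subsets can still sample well
x0 = rng.standard_normal(4)
eps = rng.standard_normal(4)
x_999 = np.sqrt(a_bar[999]) * x0 + np.sqrt(1 - a_bar[999]) * eps
x_250 = ddim_step(x_999, eps, 999, 250)
```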

08
Applications

Applications & Frontiers

Image, video, 3D, and audio synthesis; inpainting and editing; discrete diffusion for text; molecular generation; and the convergence of diffusion and flow matching.