
On Closed-Form Couplings

MCML Authors

Abstract

Few-step generative modelling is an open challenge for flow models. Rectified flows tackle it by distilling a pre-trained “teacher” into a few-step “student”, using strong noise–data couplings supplied by the teacher. For a finite dataset and a Gaussian probability path, the probability-flow vector field induced by the empirical distribution is available in closed form, which would allow us to skip training a teacher model. Surprisingly, these couplings turn out to be poor teachers and significantly reduce the performance of the student. We analyse this phenomenon empirically and theoretically, arguing that it stems from intrinsic ambiguity in the induced couplings caused by the strong sensitivity of terminal states to small initialisation perturbations. Under symmetry assumptions, we further prove that the closed-form probability-flow vector field preserves dataset symmetries and induces invariant Voronoi partitions.
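The closed-form vector field referred to above can be illustrated concretely. The sketch below is ours, not the paper's code: it assumes the linear (rectified-flow) Gaussian path p_t(x | x_i) = N(t·x_i, (1−t)²·I) over a finite dataset {x_i}, for which the marginal probability-flow velocity is a posterior-weighted average of the conditional velocities (x_i − x)/(1−t), with softmax weights over the data points.

```python
import numpy as np

def empirical_velocity(x, t, data):
    """Closed-form probability-flow velocity induced by the empirical
    distribution of `data` under the linear Gaussian path
    p_t(x | x_i) = N(t * x_i, (1 - t)^2 I)."""
    diffs = x - t * data                             # (N, d) residuals to each mean
    logw = -np.sum(diffs ** 2, axis=1) / (2.0 * (1.0 - t) ** 2)
    w = np.exp(logw - logw.max())
    w /= w.sum()                                     # posterior over data points
    cond_v = (data - x) / (1.0 - t)                  # conditional velocities u_t(x | x_i)
    return w @ cond_v                                # marginal velocity

# Euler integration of the probability-flow ODE from pure noise toward the data
rng = np.random.default_rng(0)
data = rng.normal(size=(8, 2))                       # toy finite dataset
x = rng.normal(size=2)                               # x_0 ~ N(0, I)
h = 0.01
for k in range(99):                                  # t runs over 0.00, 0.01, ..., 0.98
    x = x + h * empirical_velocity(x, k * h, data)
# x should now lie near one of the data points: the coupling the field induces
```

No teacher network appears anywhere: the field is available in closed form. The terminal point, however, depends sharply on the initial noise sample near Voronoi-like boundaries between data points, which is the sensitivity the abstract identifies as the source of the poor teaching signal.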

GRaM @ICLR 2026

Workshop on Geometry-grounded Representation Learning and Generative Modeling at the 14th International Conference on Learning Representations. Rio de Janeiro, Brazil, Apr 23-27, 2026. To be published. Preprint available.

Authors

T. Höppe • S. Bauer • Q. Liu • A. Dittadi • K. Neklyudov

Research Area

A1 | Statistical Foundations & Explainability

BibTeXKey: HPL+26
