BézierFlow: Learning Bézier Stochastic Interpolant Schedulers for Few-Step Generation
Overview
Overall Novelty Assessment
BézierFlow proposes learning optimal stochastic interpolant schedulers via a Bézier-based parameterization to accelerate few-step generation with pretrained diffusion and flow models. The paper resides in the 'Trajectory Transformation Learning' leaf under 'Sampling Trajectory Optimization', a sparse subcategory containing only this work among the 50 papers surveyed. This positioning reflects a relatively unexplored direction: rather than learning discrete timesteps or distilling multi-step teachers, the method parameterizes continuous trajectory transformations with differentiable Bézier functions, occupying a niche within the broader few-step generation landscape.
The taxonomy reveals that most acceleration efforts concentrate on distillation-based methods (21 papers across six subcategories) or domain-specific applications (8 papers). Within 'Sampling Trajectory Optimization', the sibling leaf 'Scheduler and Timestep Learning' contains two papers focused on discrete timestep selection, while 'Guidance Optimization Techniques' addresses guidance signal scheduling. BézierFlow diverges by targeting continuous trajectory transformations rather than discrete schedules or guidance modulation. The taxonomy narrative mentions related geometric approaches like Flow Map Matching and Self Corrected Flow, which explore alternative trajectory structures but are not classified in the same leaf, suggesting methodological distinctions in how trajectory optimization is formulated.
Among the 29 candidates examined, the contribution-level analysis shows varied novelty signals. For the core idea of learning stochastic interpolant schedulers, 9 candidates were examined with no clear refutations, suggesting limited prior work on this specific formulation. For the Bézier parameterization, 10 candidates were examined, again with no refutations, indicating the representation choice appears distinctive. For the BézierFlow training framework, however, 10 candidates were examined and 1 refutable match was found, suggesting some overlap with existing lightweight training approaches. The limited search scope (29 papers, not exhaustive) means these findings reflect top-K semantic matches rather than comprehensive coverage of related work.
Given the sparse taxonomy leaf and limited refutations across most contributions, BézierFlow appears to introduce a relatively novel trajectory transformation approach within the examined literature. The single refutation for the training framework likely reflects overlap with general lightweight training paradigms rather than the specific Bézier-based scheduler parameterization. However, the analysis is constrained by the 29-candidate search scope and may not capture all relevant prior work in trajectory optimization or scheduler learning. The taxonomy structure suggests the method occupies a distinct but narrow niche within the broader few-step generation field.
Taxonomy
Research Landscape Overview
Claimed Contributions
The authors propose optimizing the sampling trajectories themselves by learning stochastic interpolant schedulers, which govern the geometry of the sampling path. This broadens the scope beyond learning discrete ODE timesteps to learning continuous trajectory transformations that preserve endpoint distributions.
The authors introduce a Bézier-based parameterization for stochastic interpolant schedulers that naturally satisfies boundary conditions, differentiability, and monotonicity of the signal-to-noise ratio. This reduces the problem to learning an ordered set of control points while ensuring the scheduler functions remain smooth and well-defined.
The authors develop BézierFlow, a complete lightweight training framework that combines the optimization of sampling trajectories with Bézier-based continuous parameterization. The method achieves 2–3× performance improvement for sampling with ≤10 NFEs while requiring only 15 minutes of training.
Core Task Comparisons
Comparisons with papers in the same taxonomy category
Contribution Analysis
Detailed comparisons for each claimed contribution
Learning optimal stochastic interpolant schedulers for few-step generation
The authors propose optimizing the sampling trajectories themselves by learning stochastic interpolant schedulers, which govern the geometry of the sampling path. This broadens the scope beyond learning discrete ODE timesteps to learning continuous trajectory transformations that preserve endpoint distributions.
[32] Flow map matching with stochastic interpolants: A mathematical framework for consistency models
[51] Building normalizing flows with stochastic interpolants
[52] Lipschitz-guided design of interpolation schedules in generative models
[53] Dynamic-TreeRPO: Breaking the Independent Trajectory Bottleneck with Structured Sampling
[54] Consistent3D: Towards Consistent High-Fidelity Text-to-3D Generation with Deterministic Sampling Prior
[55] Generative AI for Molecular Simulations
[56] Likely Interpolants of Generative Models
[57] Probabilistic Forecasting with Stochastic Interpolants and Föllmer Processes
[58] Stochastic Interpolants via Conditional Dependent Coupling
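To make the scheduler-learning idea concrete, the sketch below shows a minimal stochastic interpolant x_t = α(t)·x1 + β(t)·x0 together with a monotone time reparameterization that changes the path geometry while leaving both endpoint distributions fixed. The linear scheduler and the t² warp are illustrative choices, not the paper's learned schedulers.

```python
import numpy as np

def interpolant(x0, x1, alpha, beta, t):
    """Stochastic interpolant x_t = alpha(t)*x1 + beta(t)*x0."""
    return alpha(t) * x1 + beta(t) * x0

# Linear scheduler (rectified-flow style): alpha(t) = t, beta(t) = 1 - t.
alpha = lambda t: t
beta = lambda t: 1.0 - t

# A monotone reparameterization tau: [0,1] -> [0,1] reshapes the path,
# but tau(0) = 0 and tau(1) = 1 keep the endpoint distributions intact.
tau = lambda t: t ** 2

x0 = np.array([1.0, -2.0])   # source (noise) sample
x1 = np.array([0.5, 0.5])    # target (data) sample

# Endpoints are preserved under the warp.
assert np.allclose(interpolant(x0, x1, alpha, beta, tau(0.0)), x0)
assert np.allclose(interpolant(x0, x1, alpha, beta, tau(1.0)), x1)
```

This is the sense in which a learned scheduler "preserves endpoint distributions": only the intermediate geometry of the path changes.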
Bézier-based parameterization of SI schedulers
The authors introduce a Bézier-based parameterization for stochastic interpolant schedulers that naturally satisfies boundary conditions, differentiability, and monotonicity of the signal-to-noise ratio. This reduces the problem to learning an ordered set of control points while ensuring the scheduler functions remain smooth and well-defined.
[59] Draw Step by Step: Reconstructing CAD Construction Sequences from Point Clouds via Multimodal Diffusion
[60] Research on Ship Automatic Berthing Algorithm Based on Flow Matching and Velocity Matching
[61] SketchRefiner: Text-Guided Sketch Refinement Through Latent Diffusion Models
[62] An adaptive compressor characteristic map method based on the Bézier curve
[63] Implicit Bézier Motion Model for Precise Spatial and Temporal Control
[64] Bezier Distillation
[65] CLDM-Palm: A controllable latent diffusion model for high-fidelity palmprint generation based on Bézier curves
[66] Modelling Additive Manufacturing Processes via Graph-Conditioned Diffusion Models
[67] Is Your Diffusion Model Actually Denoising?
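A minimal sketch of the kind of Bézier parameterization described above: Bernstein-basis evaluation with control points produced as normalized cumulative positive increments, which pins the endpoints at 0 and 1 (boundary conditions) and makes the curve monotone (ordered control points). The mapping from unconstrained parameters to ordered points is an illustrative assumption, not the paper's exact construction.

```python
import numpy as np
from math import comb

def bezier(control_points, t):
    """Evaluate a 1-D Bezier curve with the given control points at times t."""
    n = len(control_points) - 1
    t = np.asarray(t, dtype=float)
    return sum(comb(n, k) * (1 - t) ** (n - k) * t ** k * p
               for k, p in enumerate(control_points))

def ordered_control_points(raw):
    """Map unconstrained parameters to strictly ordered points in [0, 1]
    with fixed endpoints 0 and 1, via normalized cumulative positive steps."""
    steps = np.exp(raw)                        # positive increments
    cum = np.concatenate([[0.0], np.cumsum(steps)])
    return cum / cum[-1]                       # endpoints at 0 and 1

raw = np.array([0.3, -1.0, 0.5, 0.2])         # 4 learnable increments -> 5 points
cps = ordered_control_points(raw)

t = np.linspace(0.0, 1.0, 101)
curve = bezier(cps, t)
assert np.isclose(curve[0], 0.0) and np.isclose(curve[-1], 1.0)
assert np.all(np.diff(curve) >= -1e-12)        # monotone non-decreasing
```

Because a Bézier curve with non-decreasing control points has a non-negative derivative, monotonicity of the scheduler (and hence of the signal-to-noise ratio it induces) comes for free from the ordering constraint.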
BézierFlow lightweight training framework
The authors develop BézierFlow, a complete lightweight training framework that combines the optimization of sampling trajectories with Bézier-based continuous parameterization. The method achieves 2–3× performance improvement for sampling with ≤10 NFEs while requiring only 15 minutes of training.
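A toy illustration of the lightweight-training idea, under loud assumptions: in place of Bézier control points, a single warp exponent p reshapes the time grid of an 8-step Euler integration of a toy ODE, and finite-difference descent on p reduces the few-step endpoint error relative to a uniform grid. The ODE, objective, and optimizer are stand-ins chosen for self-containment, not the paper's method.

```python
import numpy as np

TRUE = 1.0 - np.exp(-5.0)   # exact endpoint of the toy ODE below

def few_step_error(p, n_steps=8):
    """Integrate dx/dt = 5*exp(5*(t-1)) with forward Euler on a time grid
    warped by t -> t**p, and return the squared endpoint error."""
    ts = np.linspace(0.0, 1.0, n_steps + 1) ** p
    x = 0.0
    for t0, t1 in zip(ts[:-1], ts[1:]):
        x += (t1 - t0) * 5.0 * np.exp(5.0 * (t0 - 1.0))
    return (x - TRUE) ** 2

# "Lightweight training": finite-difference gradient descent on the single
# warp parameter p, a scalar stand-in for learned Bezier control points.
p, lr, eps = 1.0, 0.5, 1e-4
for _ in range(300):
    g = (few_step_error(p + eps) - few_step_error(p - eps)) / (2 * eps)
    p = max(p - lr * g, 0.05)   # keep the warp exponent positive

assert few_step_error(p) < few_step_error(1.0)  # learned grid beats uniform
```

The toy objective is cheap to evaluate, so the whole "training run" takes milliseconds; the paper's reported 15-minute budget reflects the same principle at scale, where only a small set of scheduler parameters is optimized against a frozen pretrained model.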