BézierFlow: Learning Bézier Stochastic Interpolant Schedulers for Few-Step Generation

ICLR 2026 Conference Submission · Anonymous Authors
Keywords: stochastic interpolants, Bézier functions, diffusion models, flow models
Abstract:

We introduce BézierFlow, a lightweight training approach for few-step generation with pretrained diffusion and flow models. BézierFlow achieves a 2–3× performance improvement for sampling with ≤ 10 NFEs while requiring only 15 minutes of training. Recent lightweight training approaches have shown promise by learning optimal timesteps, but their scope remains restricted to ODE discretizations. To broaden this scope, we propose learning the optimal transformation of the sampling trajectory by parameterizing stochastic interpolant (SI) schedulers. The main challenge lies in designing a parameterization that satisfies critical desiderata, including boundary conditions, differentiability, and monotonicity of the SNR. To effectively meet these requirements, we represent scheduler functions as Bézier functions, where control points naturally enforce these properties. This reduces the problem to learning an ordered set of points in the time range, while the interpretation of the points changes from ODE timesteps to Bézier control points. Across a range of pretrained diffusion and flow models, BézierFlow consistently outperforms prior timestep-learning methods, demonstrating the effectiveness of expanding the search space from discrete timesteps to Bézier-based trajectory transformations.
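As a rough sketch of the core idea (the function and control-point values below are illustrative assumptions, not the paper's implementation), a 1-D Bézier function whose first and last control points are pinned to the boundary values, and whose interior points are kept in sorted order, automatically satisfies the boundary, differentiability, and monotonicity requirements described above:

```python
import math

def bezier(points, t):
    """Evaluate a 1-D Bézier function at t in [0, 1] via the Bernstein basis."""
    n = len(points) - 1
    return sum(
        p * math.comb(n, i) * t**i * (1 - t)**(n - i)
        for i, p in enumerate(points)
    )

# Hypothetical scheduler: pinning the first and last control points to 0 and 1
# enforces the boundary conditions f(0) = 0 and f(1) = 1; sorting the interior
# points is a simple sufficient condition for the curve to be monotone, and the
# polynomial form makes f differentiable everywhere.
learned_interior = [0.7, 0.2, 0.9]                # unconstrained learned values
points = [0.0] + sorted(learned_interior) + [1.0]

assert bezier(points, 0.0) == 0.0                 # boundary condition at t = 0
assert bezier(points, 1.0) == 1.0                 # boundary condition at t = 1
values = [bezier(points, i / 50) for i in range(51)]
assert all(a <= b for a, b in zip(values, values[1:]))  # monotone in t
```

This is why the abstract can say the problem "reduces to learning an ordered set of points in the time range": the ordering constraint alone is enough to keep the scheduler well-behaved.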

Disclaimer
This report is AI-generated using large language models and WisPaper (a scholar search engine). It analyzes an academic paper's tasks and contributions against retrieved prior work. While this system identifies potential overlaps and novel directions, its coverage is not exhaustive and its judgments are approximate. These results are intended to assist human reviewers and should not be relied upon as a definitive verdict on novelty.
Note that some papers exist in multiple, slightly different versions (e.g., with different titles or URLs). The system may retrieve several versions of the same underlying work. The current automated pipeline does not reliably align or distinguish these cases, so human reviewers will need to disambiguate them manually.
If you have any questions, please contact: mingzhang23@m.fudan.edu.cn

Overview

Overall Novelty Assessment

BézierFlow proposes learning optimal stochastic interpolant schedulers via Bézier-based parameterization to accelerate few-step generation with pretrained diffusion and flow models. The paper resides in the 'Trajectory Transformation Learning' leaf under 'Sampling Trajectory Optimization', a sparse subcategory containing only this work among the 50 papers surveyed. This positioning reflects a relatively unexplored research direction: rather than learning discrete timesteps or distilling multi-step teachers, the method parameterizes continuous trajectory transformations using differentiable Bézier functions. The sparsity of this leaf suggests the approach occupies a niche within the broader few-step generation landscape.

The taxonomy reveals that most acceleration efforts concentrate on distillation-based methods (21 papers across six subcategories) or domain-specific applications (8 papers). Within 'Sampling Trajectory Optimization', the sibling leaf 'Scheduler and Timestep Learning' contains two papers focused on discrete timestep selection, while 'Guidance Optimization Techniques' addresses guidance signal scheduling. BézierFlow diverges by targeting continuous trajectory transformations rather than discrete schedules or guidance modulation. The taxonomy narrative mentions related geometric approaches like Flow Map Matching and Self Corrected Flow, which explore alternative trajectory structures but are not classified in the same leaf, suggesting methodological distinctions in how trajectory optimization is formulated.

Among the 29 candidates examined, the contribution-level analysis shows varied novelty signals. For the core idea of learning stochastic interpolant schedulers, 9 candidates were examined with no clear refutations, suggesting limited prior work on this specific formulation. For the Bézier parameterization, 10 candidates were examined, again with no refutations, indicating the representation choice appears distinctive. For the BézierFlow training framework, however, 10 candidates were examined and 1 refutable match was found, suggesting some overlap with existing lightweight training approaches. The limited search scope (29 papers, not exhaustive) means these findings reflect top-K semantic matches rather than comprehensive coverage of all related work.

Given the sparse taxonomy leaf and limited refutations across most contributions, BézierFlow appears to introduce a relatively novel trajectory transformation approach within the examined literature. The single refutation for the training framework likely reflects overlap with general lightweight training paradigms rather than the specific Bézier-based scheduler parameterization. However, the analysis is constrained by the 29-candidate search scope and may not capture all relevant prior work in trajectory optimization or scheduler learning. The taxonomy structure suggests the method occupies a distinct but narrow niche within the broader few-step generation field.

Taxonomy

Core-task Taxonomy Papers: 50
Claimed Contributions: 3
Contribution Candidate Papers Compared: 28
Refutable Papers: 1

Research Landscape Overview

Core task: few-step generation with pretrained diffusion and flow models. The field has organized itself around several complementary strategies for accelerating sampling without retraining large models from scratch. Distillation-Based Acceleration Methods compress multi-step trajectories into fewer function evaluations by training student models to mimic teacher outputs, often leveraging adversarial losses or consistency objectives, as in Adversarial Diffusion Distillation[4] and Latent Consistency Models[10]. Sampling Trajectory Optimization refines the paths taken through latent space, whether by learning better couplings, adjusting time schedules, or transforming entire trajectories, while Architectural and Training Innovations explore modifications to network design and loss formulations that inherently support faster inference. Parallel and Non-Autoregressive Sampling investigates methods that generate multiple timesteps or tokens simultaneously, Domain-Specific Applications tailor acceleration techniques to modalities such as video or protein design, and Conditional Generation and Inverse Problems address guidance and task-specific constraints during few-step synthesis.

Within Sampling Trajectory Optimization, a particularly active line of work focuses on trajectory transformation learning: rather than distilling individual steps, these methods learn to map or rectify entire sampling paths to achieve straighter, more efficient flows. BézierFlow[0] exemplifies this approach by parameterizing trajectories with Bézier curves, enabling smooth interpolation and direct optimization of the end-to-end path. This contrasts with approaches like Shortcut Models[3], which construct piecewise shortcuts between intermediate states, and Directly Denoising[5], which emphasizes single-step jumps from noise to data. Meanwhile, works such as Flow Map Matching[32] and Self Corrected Flow[35] explore alternative geometric structures and iterative refinement strategies for trajectory design. BézierFlow[0] sits naturally among these trajectory-centric methods, sharing their emphasis on global path structure while offering a distinct parametric framework that balances expressiveness with computational tractability.

Claimed Contributions

Learning optimal stochastic interpolant schedulers for few-step generation

The authors propose optimizing the sampling trajectories themselves by learning stochastic interpolant schedulers, which govern the geometry of the sampling path. This broadens the scope beyond learning discrete ODE timesteps to learning continuous trajectory transformations that preserve endpoint distributions.
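To illustrate the endpoint-preservation constraint mentioned above, here is a generic stochastic-interpolant sketch with made-up schedulers (not anything learned by the paper): as long as the schedulers hit the required boundary values, different trajectory shapes still connect the same noise and data endpoints.

```python
def interpolant(x_noise, x_data, alpha, sigma, t):
    """A generic stochastic-interpolant point x_t = alpha(t)*x_data + sigma(t)*x_noise."""
    return alpha(t) * x_data + sigma(t) * x_noise

# Two hypothetical scheduler pairs with the same boundary values,
# (alpha(0), sigma(0)) = (0, 1) and (alpha(1), sigma(1)) = (1, 0),
# but different trajectories in between.
linear = (lambda t: t, lambda t: 1.0 - t)
curved = (lambda t: t ** 2, lambda t: (1.0 - t) ** 2)  # illustrative only

x_noise, x_data = -1.3, 0.5
for alpha, sigma in (linear, curved):
    assert interpolant(x_noise, x_data, alpha, sigma, 0.0) == x_noise  # starts at noise
    assert interpolant(x_noise, x_data, alpha, sigma, 1.0) == x_data   # ends at data
```

The freedom to reshape the path between these fixed endpoints is exactly the search space the contribution claims to exploit.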

9 retrieved papers
Bézier-based parameterization of SI schedulers

The authors introduce a Bézier-based parameterization for stochastic interpolant schedulers that naturally satisfies boundary conditions, differentiability, and monotonicity of the signal-to-noise ratio. This reduces the problem to learning an ordered set of control points while ensuring the scheduler functions remain smooth and well-defined.
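As a small numerical check of the SNR property, under the common convention SNR(t) = α(t)² / σ(t)² (the scheduler here is a hypothetical monotone Bézier α with σ = 1 − α, chosen for illustration rather than taken from the paper):

```python
from math import comb

def bezier(points, t):
    """Evaluate a 1-D Bézier function at t in [0, 1] via the Bernstein basis."""
    n = len(points) - 1
    return sum(p * comb(n, i) * t**i * (1 - t)**(n - i) for i, p in enumerate(points))

# Nondecreasing control points give a monotone alpha; with sigma = 1 - alpha,
# SNR = alpha^2 / sigma^2 is then monotone in t as well.
alpha_pts = [0.0, 0.1, 0.4, 1.0]                 # hypothetical, already sorted
ts = [i / 100 for i in range(1, 100)]            # avoid sigma = 0 at t = 1
snrs = [bezier(alpha_pts, t) ** 2 / (1 - bezier(alpha_pts, t)) ** 2 for t in ts]
assert all(a <= b for a, b in zip(snrs, snrs[1:]))   # SNR never decreases
```

This shows why ordering the control points is enough: monotonicity of the scheduler transfers directly to monotonicity of the SNR.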

9 retrieved papers
BézierFlow lightweight training framework

The authors develop BézierFlow, a complete lightweight training framework that combines the optimization of sampling trajectories with Bézier-based continuous parameterization. The method achieves 2–3× performance improvement for sampling with ≤10 NFEs while requiring only 15 minutes of training.

10 retrieved papers
Can Refute

Core Task Comparisons

Comparisons with papers in the same taxonomy category

Within the taxonomy built over the current top-K core-task papers, the original paper is assigned to a leaf with no direct siblings and no cousin branches under the same grandparent topic. In this retrieved landscape it appears structurally isolated, a partial signal of novelty that remains constrained by search coverage and taxonomy granularity.

Contribution Analysis

Detailed comparisons for each claimed contribution

Contribution

Learning optimal stochastic interpolant schedulers for few-step generation

The authors propose optimizing the sampling trajectories themselves by learning stochastic interpolant schedulers, which govern the geometry of the sampling path. This broadens the scope beyond learning discrete ODE timesteps to learning continuous trajectory transformations that preserve endpoint distributions.

Contribution

Bézier-based parameterization of SI schedulers

The authors introduce a Bézier-based parameterization for stochastic interpolant schedulers that naturally satisfies boundary conditions, differentiability, and monotonicity of the signal-to-noise ratio. This reduces the problem to learning an ordered set of control points while ensuring the scheduler functions remain smooth and well-defined.

Contribution

BézierFlow lightweight training framework

The authors develop BézierFlow, a complete lightweight training framework that combines the optimization of sampling trajectories with Bézier-based continuous parameterization. The method achieves 2–3× performance improvement for sampling with ≤10 NFEs while requiring only 15 minutes of training.