Transition Models: Rethinking the Generative Learning Objective

Zidong Wang, Yiyuan Zhang, Xiaoyu Yue, Xiangyu Yue, Yangguang Li, Wanli Ouyang, Lei Bai

Abstract

A fundamental dilemma in generative modeling persists: iterative diffusion models achieve outstanding fidelity, but at a significant computational cost, while efficient few-step alternatives are constrained by a hard quality ceiling. This conflict between generation steps and output quality arises from restrictive training objectives that focus exclusively on either infinitesimal dynamics (PF-ODEs) or direct endpoint prediction. We address this challenge by introducing an exact, continuous-time dynamics equation that analytically defines state transitions across any finite time interval. This leads to a novel generative paradigm, Transition Models (TiM), which adapt to arbitrary-step transitions, seamlessly traversing the generative trajectory from single leaps to fine-grained refinement with more steps. Despite having only 865M parameters, TiM achieves state-of-the-art performance, surpassing leading models such as SD3.5 (8B parameters) and FLUX.1 (12B parameters) across all evaluated step counts. Importantly, unlike previous few-step generators, TiM demonstrates monotonic quality improvement as the sampling budget increases. Additionally, when employing our native-resolution strategy, TiM delivers exceptional fidelity at resolutions up to 4096x4096.
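
To make the arbitrary-step idea concrete, below is a minimal, hedged sampling sketch. It is not the paper's implementation; it only assumes a hypothetical transition network f_theta(x, t, s) that maps a noisy state at time t directly to the state at an earlier time s, which is how a model trained on finite-interval transitions could serve both single-leap and many-step generation.

```python
# Illustrative sketch only (assumed interface, not the authors' code):
# f_theta(x, t, s) is presumed to predict the state at time s given the
# state x at time t, with t = 1 being pure noise and t = 0 being data.
import torch

@torch.no_grad()
def transition_sample(f_theta, shape, num_steps, device="cpu"):
    """Draw a sample using `num_steps` finite-interval transitions."""
    x = torch.randn(shape, device=device)                      # start from noise at t = 1
    times = torch.linspace(1.0, 0.0, num_steps + 1, device=device)
    for t, s in zip(times[:-1], times[1:]):
        x = f_theta(x, t, s)                                    # one arbitrary-length jump t -> s
    return x

# The same network can be queried with num_steps=1 (a single leap to data)
# or with a larger budget for progressively finer refinement.
```

In this reading, increasing `num_steps` simply splits the trajectory into more transitions, which is consistent with the abstract's claim that quality improves monotonically with the sampling budget.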
