HyperAI


MotionLCM: Real-time Controllable Motion Generation via Latent Consistency Model

Wenxun Dai Ling-Hao Chen Jingbo Wang Jinpeng Liu Bo Dai Yansong Tang


Abstract

This work introduces MotionLCM, extending controllable motion generation to a real-time level. Existing methods for spatial control in text-conditioned motion generation suffer from significant runtime inefficiency. To address this issue, we first propose the motion latent consistency model (MotionLCM) for motion generation, building upon the latent diffusion model (MLD). By employing one-step (or few-step) inference, we further improve the runtime efficiency of the motion latent diffusion model for motion generation. To ensure effective controllability, we incorporate a motion ControlNet within the latent space of MotionLCM and enable explicit control signals (e.g., pelvis trajectory) in the vanilla motion space to control the generation process directly, similar to controlling other latent-free diffusion models for motion generation. By employing these techniques, our approach can generate human motions with text and control signals in real time. Experimental results demonstrate the remarkable generation and controlling capabilities of MotionLCM while maintaining real-time runtime efficiency.
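The key efficiency idea in the abstract is that a consistency model maps a noisy latent directly to an estimate of the clean latent in a single network evaluation, so sampling needs only one or a few steps instead of a long denoising chain. The sketch below illustrates that few-step loop in plain NumPy. It is not code from the MotionLCM repository: the function names, the toy noise schedule, and the stand-in consistency function are all assumptions for illustration.

```python
import numpy as np

def few_step_consistency_sample(f, text_emb, latent_dim,
                                steps=(999, 749, 499, 249), rng=None):
    """Few-step consistency sampling in latent space (illustrative sketch).

    f(z_t, t, text_emb) plays the role of the consistency function: it maps
    a noisy latent z_t at timestep t directly to an estimate of the clean
    latent z_0, using one network evaluation per step.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    z = rng.standard_normal(latent_dim)   # start from pure noise at the largest t
    for i, t in enumerate(steps):
        z0_hat = f(z, t, text_emb)        # direct prediction of the clean latent
        if i + 1 < len(steps):
            # Re-noise the estimate down to the next timestep; the linear
            # schedule here is a placeholder, not the paper's schedule.
            sigma = steps[i + 1] / steps[0]
            z = z0_hat + sigma * rng.standard_normal(latent_dim)
        else:
            z = z0_hat                    # final step: keep the clean estimate
    return z  # a VAE-style motion decoder would turn this into a motion sequence

# Toy stand-in for the trained consistency model (illustration only).
toy_f = lambda z, t, c: 0.9 * z + 0.1 * c
latent = few_step_consistency_sample(toy_f, text_emb=np.ones(8), latent_dim=8)
print(latent.shape)  # (8,)
```

With `steps=(999,)` the same loop performs the one-step inference the abstract mentions; adding steps trades latency for quality.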

Code Repositories

Dai-Wenxun/MotionLCM (Official, PyTorch)

Benchmarks

Benchmark: motion-synthesis-on-humanml3d
Methodology: MotionLCM (4-step)
Metrics:
Diversity: 9.607
FID: 0.304
Multimodality: 2.259
R Precision Top3: 0.798

