EDGE: Editable Dance Generation From Music

Jonathan Tseng, Rodrigo Castellon, C. Karen Liu

Abstract
Abstract

Dance is an important human art form, but creating new dances can be difficult and time-consuming. In this work, we introduce Editable Dance GEneration (EDGE), a state-of-the-art method for editable dance generation that is capable of creating realistic, physically plausible dances while remaining faithful to the input music. EDGE uses a transformer-based diffusion model paired with Jukebox, a strong music feature extractor, and confers powerful editing capabilities well-suited to dance, including joint-wise conditioning and in-betweening. We introduce a new metric for physical plausibility and evaluate the dance quality generated by our method extensively through (1) multiple quantitative metrics on physical plausibility, beat alignment, and diversity benchmarks, and, more importantly, (2) a large-scale user study, demonstrating a significant improvement over previous state-of-the-art methods. Qualitative samples from our model can be found at our website.
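The in-betweening capability described in the abstract follows the standard diffusion-inpainting recipe: at each denoising step, the frames the user has fixed are overwritten with an appropriately noised copy of the keyframes, so the model only generates the unconstrained frames. A minimal toy sketch of that idea is below; the function names, schedule, and the stub denoiser are illustrative assumptions, not EDGE's actual code.

```python
import numpy as np

def inbetween_sample(denoise_fn, known, mask, T=50, seed=0):
    """Toy sketch of diffusion-based motion in-betweening (hypothetical names).

    known: (frames, dims) array holding the user's keyframed poses
    mask:  same shape, 1.0 where a pose is fixed, 0.0 where it is generated
    denoise_fn(x, t): model returning an estimate of the clean motion x0
    """
    rng = np.random.default_rng(seed)
    betas = np.linspace(1e-4, 0.02, T)          # linear noise schedule (assumed)
    alphas = 1.0 - betas
    abar = np.cumprod(alphas)

    x = rng.standard_normal(known.shape)        # start from pure noise
    for t in reversed(range(T)):
        x0_hat = denoise_fn(x, t)               # model's clean-motion estimate
        if t > 0:
            # DDPM posterior mean given x_t and the x0 estimate
            coef1 = np.sqrt(abar[t - 1]) * betas[t] / (1 - abar[t])
            coef2 = np.sqrt(alphas[t]) * (1 - abar[t - 1]) / (1 - abar[t])
            x = coef1 * x0_hat + coef2 * x
            x = x + np.sqrt(betas[t]) * rng.standard_normal(x.shape)
            # inpainting step: overwrite fixed frames with noised keyframes
            noised = (np.sqrt(abar[t - 1]) * known
                      + np.sqrt(1 - abar[t - 1]) * rng.standard_normal(known.shape))
            x = mask * noised + (1 - mask) * x
        else:
            # final step: constrained frames match the keyframes exactly
            x = mask * known + (1 - mask) * x0_hat
    return x
```

Joint-wise conditioning works the same way, except the mask selects joints (columns) instead of frames (rows).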

Code Repositories

Stanford-TML/EDGE (official, PyTorch)

Benchmarks

Benchmark                     | Methodology | Metrics
motion-synthesis-on-aist      | EDGE (w=2)  | Beat alignment score: 0.26
motion-synthesis-on-aist      | EDGE (w=1)  | Beat alignment score: 0.27
motion-synthesis-on-finedance | EDGE        | BAS: 0.2116, fid_k: 94.34
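The Beat Alignment Score (BAS) reported above measures how well kinematic beats in the generated motion line up with beats in the music. A common formulation (used on AIST++; the exact benchmark code may differ) averages a Gaussian reward over the nearest-beat distances:

```python
import numpy as np

def beat_alignment_score(motion_beats, music_beats, sigma=3.0):
    """Sketch of the Beat Alignment Score, assuming the common AIST++-style
    formulation: for each music beat, reward the nearest kinematic beat with
    exp(-d^2 / (2*sigma^2)) and average. Beat times are in frames; sigma is
    a tolerance in the same units (value here is an assumption)."""
    motion_beats = np.asarray(motion_beats, dtype=float)
    scores = []
    for b in music_beats:
        d = np.min(np.abs(motion_beats - b))     # distance to nearest motion beat
        scores.append(np.exp(-d ** 2 / (2 * sigma ** 2)))
    return float(np.mean(scores))
```

A score of 1.0 means every music beat coincides with a kinematic beat; misalignment decays the score toward 0.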