Dancing to Music

Hsin-Ying Lee, Xiaodong Yang, Ming-Yu Liu, Ting-Chun Wang, Yu-Ding Lu, Ming-Hsuan Yang, Jan Kautz

Abstract

Dancing to music is an instinctive human behavior, yet learning to model the music-to-dance generation process is a challenging problem. Measuring the correlation between music and dance requires significant effort, as one needs to simultaneously consider multiple aspects, such as the style and beat of both the music and the dance. Additionally, dance is inherently multimodal: at any moment, many follow-up movements of a pose are equally plausible. In this paper, we propose a synthesis-by-analysis learning framework to generate dance from music. In the analysis phase, we decompose a dance into a series of basic dance units, through which the model learns how to move. In the synthesis phase, the model learns how to compose a dance by seamlessly organizing multiple basic dancing movements according to the input music. Qualitative and quantitative experimental results demonstrate that the proposed method can synthesize realistic, diverse, style-consistent, and beat-matching dances from music.
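The two-phase framework in the abstract can be sketched as follows. This is a toy illustration only: the function names, the fixed unit length, and the index-based selection rule are assumptions for exposition, not the paper's actual method, which learns the decomposition and composition with generative networks.

```python
# Toy sketch of the synthesis-by-analysis idea: analysis splits a pose
# sequence into basic dance units; synthesis stitches units together
# according to (here, fake integer) music features. Illustrative only.

def decompose(pose_sequence, unit_len=8):
    """Analysis phase: split a pose sequence into fixed-length dance units."""
    return [pose_sequence[i:i + unit_len]
            for i in range(0, len(pose_sequence) - unit_len + 1, unit_len)]

def compose(units, music_features):
    """Synthesis phase: pick one unit per music segment and concatenate."""
    dance = []
    for feat in music_features:
        # Toy selection rule: index the unit bank by a music feature value.
        unit = units[feat % len(units)]
        dance.extend(unit)
    return dance
```

For example, a 32-frame sequence decomposes into four 8-frame units, and three music segments then compose a 24-frame dance.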

Code Repositories

NVlabs/Dance2Music (official) — PyTorch, mentioned in GitHub
nvlabs/dancing2music — PyTorch, mentioned in GitHub

Benchmarks

Benchmark: motion-synthesis-on-brace
Methodology: Dancing 2 Music
Metrics:
  Beat DTW cost: 11.60
  Beat alignment score: 0.129
  Footwork average: 50.09
  Fréchet Inception Distance: 0.5884
  Powermove average: 33.87
  Toprock average: 16.04
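As rough intuition for the beat-matching metrics above, a beat alignment score can be computed as the fraction of kinematic (dance) beats that fall within a small tolerance of some music beat. The sketch below is one common formulation and may differ from the exact metric used on this benchmark:

```python
def beat_alignment_score(dance_beats, music_beats, tol=0.1):
    """Fraction of dance beats (in seconds) that have a music beat
    within `tol` seconds. Illustrative formulation only."""
    if not dance_beats:
        return 0.0
    hits = sum(1 for d in dance_beats
               if any(abs(d - m) <= tol for m in music_beats))
    return hits / len(dance_beats)
```

For instance, with dance beats at 0.5 s, 1.0 s, and 1.6 s against music beats at 0.5 s, 1.0 s, and 2.0 s, two of the three dance beats align, giving a score of 2/3.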
