
Music2Dance: DanceNet for Music-driven Dance Generation

Wenlin Zhuang Congyi Wang Siyu Xia Jinxiang Chai Yangang Wang

Abstract

Synthesizing human motion from music, i.e., music to dance, is appealing and has attracted considerable research interest in recent years. It is challenging not only because dance requires realistic and complex human motions, but, more importantly, because the synthesized motions must be consistent with the style, rhythm, and melody of the music. In this paper, we propose a novel autoregressive generative model, DanceNet, which takes the style, rhythm, and melody of the music as control signals to generate 3D dance motions with high realism and diversity. To boost the performance of the proposed model, we capture several synchronized music-dance pairs performed by professional dancers and build a high-quality music-dance pair dataset. Experiments demonstrate that the proposed method achieves state-of-the-art results.
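
The page does not include any code, so the sketch below is only a rough illustration of what a music-conditioned autoregressive motion generator of this kind can look like in PyTorch. The GRU backbone, all layer names, and all dimensions are assumptions for illustration; the paper's actual DanceNet architecture may differ substantially.

```python
# Hypothetical sketch of a music-conditioned autoregressive motion model.
# Architecture, names, and dimensions are illustrative assumptions, not the
# paper's actual DanceNet.
import torch
import torch.nn as nn

class MusicConditionedMotionModel(nn.Module):
    def __init__(self, music_dim=35, pose_dim=63, style_dim=8, hidden_dim=512):
        super().__init__()
        # Fuse the previous pose with per-frame music features (rhythm/melody)
        # and a style code, then update a recurrent state.
        self.input_proj = nn.Linear(pose_dim + music_dim + style_dim, hidden_dim)
        self.rnn = nn.GRU(hidden_dim, hidden_dim, num_layers=2, batch_first=True)
        self.output_proj = nn.Linear(hidden_dim, pose_dim)

    def forward(self, music_feats, style, init_pose):
        """Autoregressively generate one pose per music frame.

        music_feats: (B, T, music_dim) per-frame acoustic features
        style:       (B, style_dim)    embedded or one-hot dance style
        init_pose:   (B, pose_dim)     seed pose
        """
        B, T, _ = music_feats.shape
        poses, pose, hidden = [], init_pose, None
        for t in range(T):
            x = torch.cat([pose, music_feats[:, t], style], dim=-1)
            h, hidden = self.rnn(self.input_proj(x).unsqueeze(1), hidden)
            # Predict a residual to the previous pose for smoother motion.
            pose = pose + self.output_proj(h.squeeze(1))
            poses.append(pose)
        return torch.stack(poses, dim=1)  # (B, T, pose_dim)

# Toy usage: generate 120 frames of motion from random features.
model = MusicConditionedMotionModel()
music = torch.randn(1, 120, 35)
style = torch.zeros(1, 8); style[0, 0] = 1.0  # select one style
seed = torch.zeros(1, 63)
motion = model(music, style, seed)
print(motion.shape)  # torch.Size([1, 120, 63])
```

Feeding the previous pose back in at every step is what makes the model autoregressive, while the concatenated music and style features act as the control signals the abstract describes.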

Benchmarks

Benchmark                  Methodology   Metrics
motion-synthesis-on-aist   DanceNet      Beat alignment score: 0.143; FID: 69.13
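
The beat alignment score measures how well kinematic beats in the generated motion line up with the beats of the music. The sketch below is a hedged, illustrative NumPy formulation in the spirit of the metric used on AIST-style benchmarks: kinematic beats are taken as local minima of joint speed, and each music beat is scored by exp(-d^2 / 2*sigma^2) against its nearest kinematic beat. The exact definition, beat extraction, and sigma behind the reported 0.143 may differ.

```python
# Illustrative beat alignment sketch; the benchmark's exact definition may differ.
import numpy as np

def kinematic_beats(motion):
    """motion: (T, J, 3) joint positions; returns frame indices of speed minima."""
    # Per-frame total joint speed.
    vel = np.linalg.norm(np.diff(motion, axis=0), axis=2).sum(axis=1)  # (T-1,)
    # Local minima of speed are treated as kinematic beats.
    return np.where((vel[1:-1] < vel[:-2]) & (vel[1:-1] < vel[2:]))[0] + 1

def beat_alignment(music_beats, kin_beats, sigma=3.0):
    """Average exp(-d^2 / 2 sigma^2) over music beats, with d in frames."""
    if len(kin_beats) == 0:
        return 0.0
    scores = [np.exp(-(np.min(np.abs(kin_beats - b)) ** 2) / (2 * sigma ** 2))
              for b in music_beats]
    return float(np.mean(scores))

# Toy usage with synthetic data at an assumed 60 fps.
rng = np.random.default_rng(0)
motion = rng.standard_normal((300, 24, 3)).cumsum(axis=0) * 0.01
music_beats = np.arange(0, 300, 30)  # one beat every half second
print(beat_alignment(music_beats, kinematic_beats(motion)))
```

A score near 1 means every music beat has a kinematic beat close by; the reported 0.143 indicates the motion and music beats coincide only loosely under this kind of measure.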
