HyperAI

Dynamic Mixture of Curriculum LoRA Experts (D-MoLE)


Dynamic Mixture of Curriculum LoRA Experts (D-MoLE) is a method for continual multimodal instruction tuning, proposed on June 13, 2025 by the Interactive Content Security Team of the Alibaba Group Security Department together with Tsinghua University. It aims to let a multimodal large language model (MLLM) continually adapt to new tasks while effectively retaining existing knowledge under a limited parameter budget. The method is described in the paper "Dynamic Mixture of Curriculum LoRA Experts for Continual Multimodal Instruction Tuning", which was accepted at ICML 2025.

D-MoLE combines LoRA with a Mixture of Experts (MoE) architecture and introduces a curriculum learning mechanism. It dynamically selects and combines different LoRA modules to adapt to new tasks while minimizing interference with existing knowledge. Extensive experiments show that D-MoLE significantly outperforms state-of-the-art baselines, with an average improvement of 15% over the best baseline. According to the authors, this is the first study of continual learning in MLLMs from an architectural perspective.
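To make the core idea concrete, the following is a minimal sketch of a mixture of LoRA experts: a frozen base weight plus several low-rank (B·A) adapters, where a gating score selects and softly combines the top-k experts per input. All names, shapes, and the routing scheme here are illustrative assumptions; D-MoLE's actual expert allocation and curriculum scheduling are considerably more involved.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank, n_experts = 16, 16, 4, 3

# Frozen base weight of the layer (stands in for a pretrained MLLM weight).
W = rng.standard_normal((d_out, d_in)) * 0.02

# Each expert is a LoRA-style low-rank update B @ A; B starts at zero,
# so untrained experts contribute nothing (as in standard LoRA init).
experts = [
    {"A": rng.standard_normal((rank, d_in)) * 0.02,
     "B": np.zeros((d_out, rank))}
    for _ in range(n_experts)
]

def forward(x, gate_logits, top_k=2):
    """Base output plus a gated sum of the top-k LoRA expert updates."""
    y = x @ W.T
    top = np.argsort(gate_logits)[-top_k:]            # pick top-k experts
    w = np.exp(gate_logits[top])
    w = w / w.sum()                                   # softmax over chosen
    for weight, i in zip(w, top):
        delta = x @ experts[i]["A"].T @ experts[i]["B"].T
        y = y + weight * delta
    return y

x = rng.standard_normal((2, d_in))
gate_logits = rng.standard_normal(n_experts)          # e.g. from a router
y = forward(x, gate_logits)
print(y.shape)  # (2, 16)
```

Because every `B` is zero-initialized, the mixture initially reproduces the frozen base layer exactly; only training the selected experts' `A`/`B` pairs changes behavior, which is what limits interference with previously learned tasks.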

