
Mixture-of-Subspaces in Low-Rank Adaptation

Taiqiang Wu Jiahao Wang Zhe Zhao Ngai Wong

Abstract

In this paper, we introduce a subspace-inspired Low-Rank Adaptation (LoRA) method, which is computationally efficient, easy to implement, and readily applicable to large language, multimodal, and diffusion models. Initially, we equivalently decompose the weights of LoRA into two subspaces, and find that simply mixing them can enhance performance. To study such a phenomenon, we revisit it through a fine-grained subspace lens, showing that such modification is equivalent to employing a fixed mixer to fuse the subspaces. To be more flexible, we jointly learn the mixer with the original LoRA weights, and term the method Mixture-of-Subspaces LoRA (MoSLoRA). MoSLoRA consistently outperforms LoRA on tasks in different modalities, including commonsense reasoning, visual instruction tuning, and subject-driven text-to-image generation, demonstrating its effectiveness and robustness. Codes are available at https://github.com/wutaiqiang/MoSLoRA.
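The core idea described in the abstract can be sketched as follows: standard LoRA parameterizes the weight update as a product of two low-rank factors, and MoSLoRA inserts a small learnable "mixer" matrix between them, with the identity mixer recovering vanilla LoRA. The shapes, initialization, and variable names below are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

# Standard LoRA update: delta_W = A @ B, with A (d_in x r) and B (r x d_out).
# MoSLoRA inserts a learnable r x r mixer W between the factors:
#     delta_W = A @ W @ B
# A fixed (non-learned) mixer corresponds to the "simply mixing" variant
# described in the abstract; learning W jointly with A and B gives MoSLoRA.

rng = np.random.default_rng(0)
d_in, d_out, r = 8, 6, 4

A = rng.normal(size=(d_in, r))  # down-projection factor
B = rng.normal(size=(r, d_out)) # up-projection factor
W = np.eye(r)                   # mixer; identity recovers vanilla LoRA

def moslora_delta(A, W, B):
    """Low-rank weight update with a subspace mixer (illustrative sketch)."""
    return A @ W @ B

# With the identity mixer, MoSLoRA reduces exactly to LoRA:
assert np.allclose(moslora_delta(A, np.eye(r), B), A @ B)
```

Since the mixer is only r x r (with r typically small, e.g. 8 to 64), the extra parameter and compute cost over LoRA is negligible, which is consistent with the abstract's claim of computational efficiency.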

