SingLoRA: Low Rank Adaptation Using a Single Matrix

David Bensaïd, Noam Rotstein, Roy Velich, Daniel Bensaïd, Ron Kimmel


Abstract

Low-Rank Adaptation (LoRA) has significantly advanced parameter-efficient fine-tuning of large pretrained models. LoRA augments the pre-trained weights of a model by adding the product of two smaller matrices that together form a low-rank matrix update. Recent research has shown that scale disparities between these two matrices often cause unstable training dynamics, leading to suboptimal performance. In this paper, we propose SingLoRA, which reformulates low-rank adaptation by learning the weight update as a decomposition of a single low-rank matrix multiplied by its transpose. This simple design inherently removes inter-matrix scale conflicts, ensuring stable optimization, and roughly halves the parameter count. We analyze SingLoRA within the infinite-width neural network framework, showing that it guarantees stable feature learning by construction. Extensive experiments on multiple tasks validate these benefits. In common sense reasoning, fine-tuning LLaMA 7B on MNLI with SingLoRA achieves 91.3% accuracy - surpassing LoRA (89.1%) and LoRA+ (90.2%) - while using only 60% of their parameter budget. In image generation, fine-tuning Stable Diffusion with SingLoRA significantly improves image fidelity on DreamBooth, achieving a DINO similarity score of 0.151, compared to scores of 0.148 and 0.143 for DoRA and LoRA, respectively.
