Preethi Srinivasan

@3pi

Fine-Tuning Large Models with Fewer Parameters: LoRA and Intrinsic Dimension Explained

Submitted Mar 24, 2025

In this talk, we will explore the growing need to fine-tune large pre-trained models for specialized tasks, and the limitations of conventional fine-tuning methods—especially their high computational and storage costs. We begin with Parameter-Efficient Fine-Tuning (PEFT) techniques, focusing on LoRA (Low-Rank Adaptation), an adapter-based approach that enables efficient model adaptation by introducing a small number of trainable parameters.
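To make the parameter savings concrete, here is a back-of-the-envelope comparison for a single weight matrix (the sizes are illustrative, not from the talk): full fine-tuning updates every entry of a d x k matrix, while LoRA trains only the two low-rank factors B (d x r) and A (r x k).

```python
# Hypothetical sizes for one adapted weight matrix; r is the LoRA rank.
d, k, r = 4096, 4096, 8

full = d * k            # parameters updated by full fine-tuning: 16,777,216
lora = r * (d + k)      # parameters in the low-rank factors: 65,536

print(f"full: {full:,}  lora: {lora:,}  ratio: {full // lora}x")  # 256x fewer
```

Even at this modest rank, the adapter holds roughly 0.4% of the original matrix's parameters, which is why LoRA checkpoints are cheap to store and share.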

Through a hands-on implementation of LoRA in a multilayer perceptron (MLP) for a binary classification task, we’ll cover adapter insertion, parameter configuration, and the evaluation of parameter efficiency. We’ll also discuss real-world workflows—like sharing models via Hugging Face Hub—and explore practical extensions such as Quantized-LoRA for reducing inference-time memory usage.
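The core of the adapter insertion step can be sketched as follows. This is a minimal NumPy version of a LoRA-adapted linear layer, not the talk's actual implementation; all names and sizes are illustrative. Note the standard LoRA initialization: B starts at zero so the adapted layer initially matches the frozen one.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, r, alpha = 16, 8, 4, 8
W = rng.normal(size=(d_in, d_out))      # frozen pre-trained weight (not trained)
A = rng.normal(size=(d_in, r)) * 0.01   # trainable low-rank factor
B = np.zeros((r, d_out))                # zero-init: the adapter starts as a no-op

def lora_forward(x):
    # Frozen path plus low-rank update, scaled by alpha / r as in the LoRA paper.
    return x @ W + (x @ A) @ B * (alpha / r)

x = rng.normal(size=(2, d_in))
# With B = 0 the adapted layer reproduces the frozen layer exactly.
assert np.allclose(lora_forward(x), x @ W)
```

During fine-tuning only A and B receive gradients; W stays fixed, so the trainable parameter count is r * (d_in + d_out) instead of d_in * d_out.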

Building on this foundation, the talk transitions into the theory of Intrinsic Dimension (ID)—the hypothesis that neural networks, despite their large size, may require only a few effective directions to learn. Using random subspace training, we measure ID and analyze how models behave when learning is constrained to low-dimensional subspaces. This leads to a key insight: LoRA’s efficiency aligns closely with the principles of intrinsic dimension, offering a deeper theoretical understanding of why PEFT methods work.
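The random subspace reparameterization behind intrinsic dimension measurement can be sketched in a few lines. Assuming the standard setup (Li et al.'s "Measuring the Intrinsic Dimension of Objective Landscapes"), the full parameter vector is expressed as a frozen initialization plus a fixed random projection of a small trainable vector; the variable names and sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

D, d_int = 1000, 10                 # full parameter count vs. subspace dimension
theta0 = rng.normal(size=D)         # frozen initialization of the network
P = rng.normal(size=(D, d_int))
P /= np.linalg.norm(P, axis=0)      # normalize the random basis columns

z = np.zeros(d_int)                 # the ONLY trainable vector
theta = theta0 + P @ z              # full parameters reconstructed from d_int coords

# At z = 0 the reparameterized model coincides with the original initialization.
assert np.allclose(theta, theta0)
```

Training then optimizes z alone; the smallest d_int at which the model still reaches (say) 90% of full-training performance is taken as the intrinsic dimension of the task. LoRA's low-rank update plays an analogous role: it constrains learning to a small, structured subspace of the weight space.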

This talk bridges the ideas explored in my four-part blog series, which aims to demystify PEFT, LoRA, and intrinsic dimension for a broader audience. The posts gained good visibility, receiving positive traction on r/MachineLearning and ranking 7th on Hacker News, where they stayed on the front page for a day. The first two posts on LoRA became the basis of a talk I gave at PyCon India 2024.

