MaCP: Minimal yet Mighty Adaptation via Hierarchical Cosine Projection
| Authors | |
|---|---|
| Publication date | 2025 |
| Host editors | |
| Book title | The 63rd Annual Meeting of the Association for Computational Linguistics (ACL 2025) : proceedings of the conference |
| Book subtitle | ACL 2025 : July 27-August 1, 2025 |
| ISBN (electronic) | |
| Event | 63rd Annual Meeting of the Association for Computational Linguistics, ACL 2025 |
| Volume | |
| Issue number | 1 |
| Pages (from-to) | 20602–20618 |
| Publisher | Kerrville, TX: Association for Computational Linguistics |
| Organisations | |
| Abstract | We present a new adaptation method, MaCP (Minimal yet Mighty adaptive Cosine Projection), that achieves exceptional performance while requiring minimal parameters and memory for fine-tuning large foundation models. The general idea is to exploit the superior energy-compaction and decorrelation properties of the cosine projection to improve both model efficiency and accuracy. Specifically, MaCP projects the weight change from low-rank adaptation into the discrete cosine space; the weight change is then partitioned over different levels of the discrete cosine spectrum, and the most critical frequency components of each partition are selected. Extensive experiments demonstrate the effectiveness of MaCP across a wide range of single-modality tasks, including natural language understanding, natural language generation, and text summarization, as well as multi-modality tasks such as image classification and video understanding. MaCP consistently delivers superior accuracy, significantly reduced computational complexity, and lower memory requirements compared to existing alternatives. |
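The pipeline sketched in the abstract (project the low-rank weight change with a 2-D DCT, partition the spectrum into levels, keep each partition's largest components) can be illustrated in NumPy. This is a minimal sketch, not the paper's implementation: the diagonal band partitioning and the `num_levels` / `keep_per_level` parameters are illustrative assumptions.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix D (so D @ D.T == I)."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    d = np.sqrt(2.0 / n) * np.cos(np.pi * (i + 0.5) * k / n)
    d[0] /= np.sqrt(2.0)  # rescale the DC row for orthonormality
    return d

def macp_sketch(delta_w, num_levels=3, keep_per_level=8):
    """Illustrative MaCP-style sparsification of a weight update delta_w."""
    h, w = delta_w.shape
    dh, dw = dct_matrix(h), dct_matrix(w)
    spectrum = dh @ delta_w @ dw.T  # 2-D DCT of the weight change
    # Assumed partitioning: diagonal frequency bands, low to high frequency.
    level = np.add.outer(np.arange(h), np.arange(w))
    edges = np.linspace(0, level.max() + 1, num_levels + 1)
    mask = np.zeros_like(spectrum, dtype=bool)
    for lo, hi in zip(edges[:-1], edges[1:]):
        band = (level >= lo) & (level < hi)
        mags = np.abs(spectrum[band])
        if mags.size == 0:
            continue
        k = min(keep_per_level, mags.size)
        thresh = np.sort(mags)[-k]  # magnitude of the k-th largest coefficient
        mask |= band & (np.abs(spectrum) >= thresh)  # keep top-k in this band
    sparse = np.where(mask, spectrum, 0.0)
    return dh.T @ sparse @ dw  # inverse DCT of the selected components
```

With `num_levels * keep_per_level` coefficients retained, the reconstructed update approximates `delta_w` while storing far fewer values; keeping every coefficient recovers it exactly, since the DCT matrices are orthonormal.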
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.18653/v1/2025.acl-long.1006 |
| Downloads | 2025.acl-long.1006 (Final published version) |
| Permalink to this page | |
