Language and Task Arithmetic with Parameter-Efficient Layers for Zero-Shot Summarization
The result's identifiers
Result code in IS VaVaI
RIV/00216208:11320/23:DLF75ZZZ (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F23%3ADLF75ZZZ)
Result on the web
http://arxiv.org/abs/2311.09344
DOI - Digital Object Identifier
—
Alternative languages
Result language
Swedish
Original language name
Language and Task Arithmetic with Parameter-Efficient Layers for Zero-Shot Summarization
Original language description
"Parameter-efficient fine-tuning (PEFT) using labeled task data can significantly improve the performance of large language models (LLMs) on the downstream task. However, there are 7000 languages in the world and many of these languages lack labeled data for real-world language generation tasks. In this paper, we propose to improve zero-shot cross-lingual transfer by composing language or task specialized parameters. Our method composes language and task PEFT modules via element-wise arithmetic operations to leverage unlabeled data and English labeled data. We extend our approach to cases where labeled data from more languages is available and propose to arithmetically compose PEFT modules trained on languages related to the target. Empirical results on summarization demonstrate that our method is an effective strategy that obtains consistent gains using minimal training of PEFT modules."
Czech name
—
Czech description
—
Classification
Type
O - Miscellaneous
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
—
Others
Publication year
2023
Confidentiality
S - Complete and accurate data on the project are not subject to protection under special legal regulations