beeFormer: Bridging the Gap Between Semantic and Interaction Similarity in Recommender Systems
Result identifiers
Result code in IS VaVaI
RIV/00216208:11320/24:10492912 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F24%3A10492912)
Alternative codes found
RIV/68407700:21240/24:00377673
Result on the web
https://doi.org/10.1145/3640457.3691707
DOI - Digital Object Identifier
10.1145/3640457.3691707 (http://dx.doi.org/10.1145/3640457.3691707)
Alternative languages
Result language
English
Title in the original language
beeFormer: Bridging the Gap Between Semantic and Interaction Similarity in Recommender Systems
Description in the original language
Recommender systems often use text-side information to improve their predictions, especially in cold-start or zero-shot recommendation scenarios, where traditional collaborative filtering approaches cannot be used. Many approaches to text-mining side information for recommender systems have been proposed over recent years, with sentence Transformers being the most prominent one. However, these models are trained to predict semantic similarity without utilizing interaction data with hidden patterns specific to recommender systems. In this paper, we propose beeFormer, a framework for training sentence Transformer models with interaction data. We demonstrate that our models trained with beeFormer can transfer knowledge between datasets while outperforming not only semantic similarity sentence Transformers but also traditional collaborative filtering methods. We also show that training on multiple datasets from different domains accumulates knowledge in a single model, unlocking the possibility of training universal, domain-agnostic sentence Transformer models to mine text representations for recommender systems.
Title in English
beeFormer: Bridging the Gap Between Semantic and Interaction Similarity in Recommender Systems
Description in English
Recommender systems often use text-side information to improve their predictions, especially in cold-start or zero-shot recommendation scenarios, where traditional collaborative filtering approaches cannot be used. Many approaches to text-mining side information for recommender systems have been proposed over recent years, with sentence Transformers being the most prominent one. However, these models are trained to predict semantic similarity without utilizing interaction data with hidden patterns specific to recommender systems. In this paper, we propose beeFormer, a framework for training sentence Transformer models with interaction data. We demonstrate that our models trained with beeFormer can transfer knowledge between datasets while outperforming not only semantic similarity sentence Transformers but also traditional collaborative filtering methods. We also show that training on multiple datasets from different domains accumulates knowledge in a single model, unlocking the possibility of training universal, domain-agnostic sentence Transformer models to mine text representations for recommender systems.
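To make the abstract's central idea concrete, the following is a minimal, hypothetical sketch of what "training sentence Transformer models with interaction data" could look like: item-description embeddings are optimized so that their pairwise similarities track item-item co-occurrence derived from user-item interactions rather than purely semantic similarity. The base model name, the dense interaction matrix, and the plain MSE objective are illustrative assumptions, not the published beeFormer implementation.

```python
# Illustrative sketch only (not the authors' released code): fine-tune a sentence
# Transformer so that text-embedding similarities match interaction-based similarities.
import torch
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")  # assumed base model
model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)


def training_step(item_texts, interactions):
    """One hypothetical optimization step.

    item_texts:   list of item descriptions, length n_items
    interactions: binary user-item tensor of shape (n_users, n_items)
    """
    # encode() runs without gradients, so tokenize and call the model directly
    # to keep the computation graph for backpropagation.
    features = model.tokenize(item_texts)
    features = {k: v.to(model.device) for k, v in features.items()}
    item_emb = model(features)["sentence_embedding"]        # (n_items, d)

    # Item-item similarity implied by the text encoder.
    pred = item_emb @ item_emb.T                            # (n_items, n_items)

    # Item-item co-occurrence implied by the interaction data.
    X = interactions.to(item_emb.device).float()
    target = X.T @ X                                        # (n_items, n_items)

    # Pull the encoder's similarities toward the interaction signal
    # (simple MSE used here as a stand-in objective).
    loss = torch.nn.functional.mse_loss(pred, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


# Example call with hypothetical data:
# loss = training_step(["A sci-fi novel about space travel.", "A vegetarian cookbook."],
#                      torch.randint(0, 2, (100, 2)))
```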
Classification
Type
D - Proceedings paper
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result linkages
Project
GX20-16819X: Language Understanding: from Syntax to Discourse
Linkages
P - Research and development project financed from public funds (with a link to CEP)
Others
Year of implementation
2024
Data confidentiality code
S - Complete and true data about the project are not subject to protection under special legal regulations
Data specific to the result type
Proceedings title
Proceedings of the 18th ACM Conference on Recommender Systems
ISBN
979-8-4007-0505-2
ISSN
—
e-ISSN
—
Number of pages
6
Pages from-to
1102-1107
Publisher name
Association for Computing Machinery
Place of publication
New York, NY, United States
Event venue
Bari, Italy
Event date
14. 9. 2024
Event type by nationality
WRD - Worldwide event
UT WoS code of the article
—