Low-rank and global-representation-key-based attention for graph transformer
The result's identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F61989100%3A27240%2F23%3A10253783" target="_blank" >RIV/61989100:27240/23:10253783 - isvavai.cz</a>
Result on the web
<a href="https://www.sciencedirect.com/science/article/pii/S002002552300693X?via%3Dihub" target="_blank" >https://www.sciencedirect.com/science/article/pii/S002002552300693X?via%3Dihub</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1016/j.ins.2023.119108" target="_blank" >10.1016/j.ins.2023.119108</a>
Alternative languages
Result language
English
Original language name
Low-rank and global-representation-key-based attention for graph transformer
Original language description
Transformer architectures have been applied to graph-specific data such as protein structures and shopper lists, and they perform accurately on graph/node classification and prediction tasks. Researchers have proved that the attention matrix in Transformers has low-rank properties and that self-attention plays a scoring role in the aggregation function of Transformers. However, it cannot resolve issues such as heterophily and over-smoothing. These low-rank properties and the limitations of Transformers inspire this work to propose a Global Representation (GR) based attention mechanism that alleviates the two issues of heterophily and over-smoothing. First, this GR-based model integrates geometric information about the nodes of interest, which conveys the structural properties of the graph. Unlike a typical Transformer, where a node feature forms a Key, we propose to use the GR to construct the Key, which captures the relation between the nodes and the structural representation of the graph. Next, we present various compositions of the GR emanating from the nodes of interest and their alpha-hop neighbors. Then, we explore this attention property through extensive experiments to assess performance and identify possible directions for improvement in future work. Additionally, we provide a mathematical proof showing the efficient feature update in our proposed method. Finally, we verify and validate the performance of the model on eight benchmark datasets, which demonstrates the effectiveness of the proposed method.
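The core idea in the description, constructing the attention Key from a global structural representation of the graph rather than from node features, can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the specific GR composition used here (normalized alpha-hop feature propagation), the function names global_representation and gr_attention, and the single-head, dense-adjacency setup are illustrative assumptions only; the paper presents several GR compositions.

```python
# Minimal sketch (assumed, not the authors' code) of attention where the Key
# is built from a global representation (GR) of each node's alpha-hop
# neighborhood instead of from the node feature itself.
import torch
import torch.nn.functional as F

def global_representation(x, adj, alpha=2):
    """One plausible GR composition: propagate node features over
    alpha-hop neighborhoods to summarize local graph structure."""
    a_hat = adj + torch.eye(adj.size(0))            # add self-loops
    a_hat = a_hat / a_hat.sum(dim=1, keepdim=True)  # row-normalize
    gr = x
    for _ in range(alpha):                          # alpha propagation steps
        gr = a_hat @ gr
    return gr

def gr_attention(x, adj, wq, wk, wv, alpha=2):
    """Self-attention whose Key comes from the GR, per the paper's idea;
    Query and Value still come from the node features."""
    gr = global_representation(x, adj, alpha)
    q = x @ wq
    k = gr @ wk                                     # Key built from the GR
    v = x @ wv
    scores = (q @ k.T) / k.size(-1) ** 0.5          # scaled dot-product scores
    return F.softmax(scores, dim=-1) @ v

# Toy usage: 5 nodes with 8-dim features on a random undirected graph.
torch.manual_seed(0)
n, d = 5, 8
x = torch.randn(n, d)
adj = (torch.rand(n, n) > 0.5).float()
adj = ((adj + adj.T) > 0).float()                   # symmetrize
wq, wk, wv = (torch.randn(d, d) for _ in range(3))
out = gr_attention(x, adj, wq, wk, wv)
print(out.shape)  # torch.Size([5, 8])
```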
Czech name
—
Czech description
—
Classification
Type
J<sub>imp</sub> - Article in a specialist periodical, which is included in the Web of Science database
CEP classification
—
OECD FORD branch
10200 - Computer and information sciences
Result continuities
Project
—
Continuities
S - Specific research at universities
Others
Publication year
2023
Confidentiality
S - Complete and true data about the project are not subject to protection under special legal regulations
Data specific for result type
Name of the periodical
Information Sciences
ISSN
0020-0255
e-ISSN
1872-6291
Volume of the periodical
642
Issue of the periodical within the volume
September 2023
Country of publishing house
US - UNITED STATES
Number of pages
17
Pages from-to
—
UT code for WoS article
000998393300001
EID of the result in the Scopus database
—