
Pretrained Quantum-Inspired Deep Neural Network for Natural Language Processing

The result's identifiers

  • Result code in IS VaVaI

    RIV/00216208:11320/25:SKZ27EEG - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F25%3ASKZ27EEG)

  • Result on the web

    https://www.scopus.com/inward/record.uri?eid=2-s2.0-85194833525&doi=10.1109%2fTCYB.2024.3398692&partnerID=40&md5=85c6fb05e8f73a481954c683835c4efb

  • DOI - Digital Object Identifier

    10.1109/TCYB.2024.3398692 (http://dx.doi.org/10.1109/TCYB.2024.3398692)

Alternative languages

  • Result language

    English

  • Original language name

    Pretrained Quantum-Inspired Deep Neural Network for Natural Language Processing

  • Original language description

    Natural language processing (NLP) may face the inexplicable 'black-box' problem of parameters and unreasonable modeling due to the lack of embedding of some characteristics of natural language, while quantum-inspired models based on quantum theory may provide a potential solution. However, essential prior knowledge and pretrained text features were often ignored at the early stage of the development of quantum-inspired models. To attack these challenges, a pretrained quantum-inspired deep neural network is proposed in this work, constructed on quantum theory to deliver strong performance and great interpretability in related NLP fields. Concretely, a quantum-inspired pretrained feature embedding (QPFE) method is first developed to model superposition states for words in order to embed more textual features. Then, a QPFE-ERNIE model is designed by merging the semantic features learned by the prevalent pretrained model ERNIE, and it is verified on two NLP downstream tasks: 1) sentiment classification and 2) word sense disambiguation (WSD). In addition, schematic quantum circuit diagrams are provided, which may give impetus to the future realization of quantum NLP on quantum devices. Finally, the experimental results demonstrate that QPFE-ERNIE is significantly better for sentiment classification than gated recurrent unit (GRU), BiLSTM, and TextCNN on five datasets in all metrics, achieves better results than ERNIE in accuracy, F1-score, and precision on two datasets (CR and SST), and also has an advantage for WSD over the classical models, including BERT (improving F1-score by 5.2 on average) and ERNIE (improving F1-score by 4.2 on average), and improves the F1-score by 8.7 on average compared with a previous quantum-inspired model, QWSD. QPFE-ERNIE provides a novel pretrained quantum-inspired model for solving NLP problems and lays a foundation for exploring more quantum-inspired models in the future. © 2024 The Authors.
This work is licensed under a Creative Commons Attribution 4.0 License.

  • Czech name

  • Czech description
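The core idea in the description above, modeling a word as a superposition state, can be sketched in a few lines. This is a minimal illustrative toy, not the paper's actual QPFE implementation: the function names, the 3-dimensional toy embedding, and the Born-rule readout are assumptions for exposition only.

```python
import numpy as np

def to_superposition_state(embedding):
    """L2-normalize a real embedding so it behaves like a quantum state vector."""
    v = np.asarray(embedding, dtype=float)
    norm = np.linalg.norm(v)
    if norm == 0:
        raise ValueError("zero embedding cannot be normalized")
    return v / norm

def measurement_probabilities(state):
    """Born rule: the probability of basis outcome i is |amplitude_i|^2."""
    return np.abs(state) ** 2

word_vec = [0.5, 1.0, -0.5]            # toy 3-dim word embedding (assumed)
state = to_superposition_state(word_vec)
probs = measurement_probabilities(state)
print(np.isclose(probs.sum(), 1.0))    # prints True: probabilities sum to 1
```

Normalizing the embedding makes the squared amplitudes a valid probability distribution over basis "features", which is the sense in which such models treat a word vector as a superposition of feature states.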

Classification

  • Type

    JSC - Article in a specialist periodical, which is included in the SCOPUS database

  • CEP classification

  • OECD FORD branch

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

Result continuities

  • Project

  • Continuities

Others

  • Publication year

    2024

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    IEEE Transactions on Cybernetics

  • ISSN

    2168-2267

  • e-ISSN

  • Volume of the periodical

    54

  • Issue of the periodical within the volume

    10

  • Country of publishing house

    US - UNITED STATES

  • Number of pages

    13

  • Pages from-to

    5973-5985

  • UT code for WoS article

  • EID of the result in the Scopus database

    2-s2.0-85194833525