
A Computational Perspective on Neural-Symbolic Integration

The result's identifiers

  • Result code in IS VaVaI

    RIV/68407700:21230/24:00375262 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F24%3A00375262)

  • Result on the web

    https://doi.org/10.3233/NAI-240672

  • DOI - Digital Object Identifier

    10.3233/NAI-240672 (http://dx.doi.org/10.3233/NAI-240672)

Alternative languages

  • Result language

    English

  • Original language name

    A Computational Perspective on Neural-Symbolic Integration

  • Original language description

    Neural-Symbolic Integration (NSI) aims to marry the principles of symbolic AI techniques, such as logical reasoning, with the learning capabilities of neural networks. In recent years, many systems have been proposed to address this integration in a seemingly efficient manner. However, from the computational perspective, this is in principle impossible to do. Specifically, some of the core symbolic problems are provably hard, hence a general NSI system necessarily needs to adopt this computational complexity, too. Many NSI methods try to circumvent this downside by inconspicuously dropping parts of the symbolic capabilities while mapping the problems into static tensor representations in exchange for efficient deep learning acceleration. In this paper, we argue that the aim for a general NSI system, properly covering both the neural and symbolic paradigms, has important computational implications on the learning representations, the structure of the resulting computation graphs, and the underlying hardware and software stacks. Particularly, we explain how the currently prominent, tensor-based deep learning with static computation graphs is conceptually insufficient as a foundation for such general NSI, which we discuss in a wider context of established (statistical) relational and structured deep learning methods. Finally, we delve into the underlying hardware acceleration aspects and outline some promising computational directions toward fully expressive and efficient NSI.

  • Czech name

  • Czech description
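
The description above argues that many NSI methods map relational problems into static tensor representations, silently dropping parts of the symbolic structure. A minimal Python sketch of that trade-off (hypothetical and illustrative only; the function, shapes, and example graph are not from the paper): encoding a variable-size set of relational facts into a fixed tensor shape forces padding or truncation, so the original structure is no longer fully recoverable.

```python
def to_fixed_tensor(facts, max_facts=3, arity=2):
    """Flatten a variable-size set of relational facts into a fixed-shape
    table. Facts beyond `max_facts` are silently dropped and missing slots
    are zero-padded -- mirroring the loss the abstract describes."""
    table = [[0] * arity for _ in range(max_facts)]
    for i, fact in enumerate(facts[:max_facts]):
        table[i] = list(fact[:arity]) + [0] * (arity - len(fact))
    return table

# A small relational structure: a directed graph with 4 edges.
graph = [(1, 2), (2, 3), (3, 1), (1, 3)]

fixed = to_fixed_tensor(graph)          # static shape: 3 rows, arity 2
assert len(fixed) == 3                  # the shape cannot grow with the input
assert (1, 3) not in {tuple(r) for r in fixed}  # the 4th edge was truncated away
```

A dynamic, per-example computation graph (as used in relational and structured deep learning) would instead adapt its shape to each input, which is part of the computational argument the paper develops.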

Classification

  • Type

    Jost - Miscellaneous article in a specialist periodical

  • CEP classification

  • OECD FORD branch

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

Result continuities

  • Project

    GA24-11664S: Relational Reinforcement Learning for Science Acceleration

  • Continuities

    P - R&D project financed from public funds (with a link to CEP)

Others

  • Publication year

    2024

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    Neurosymbolic Artificial Intelligence

  • ISSN

    2949-8732

  • e-ISSN

    2949-8732

  • Volume of the periodical

    1

  • Issue of the periodical within the volume

    1

  • Country of publishing house

    NL - The Kingdom of the Netherlands

  • Number of pages

    12

  • Pages from-to

    1-12

  • UT code for WoS article

  • EID of the result in the Scopus database