Enhancing Domain Modeling with Pre-trained Large Language Models: An Automated Assistant for Domain Modelers
The result's identifiers
Result code in IS VaVaI
RIV/00216208:11320/24:10488754 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F24%3A10488754)
Result on the web
https://doi.org/10.1007/978-3-031-75872-0_13
DOI - Digital Object Identifier
10.1007/978-3-031-75872-0_13
Alternative languages
Result language
English
Original language name
Enhancing Domain Modeling with Pre-trained Large Language Models: An Automated Assistant for Domain Modelers
Original language description
Domain modeling involves creating abstract representations of information within a specific domain using techniques such as conceptual modeling and ontology engineering. Manual creation and maintenance of domain models is traditionally labor-intensive and requires modeling expertise. This paper explores the automation of domain modeling using pre-trained large language models (LLMs), presenting an experimental LLM-based conceptual modeling assistant that collaborates with a human expert. The assistant provides modeling suggestions based on a given textual description of the domain of interest, aiding in the design of classes, attributes, and associations. We present a generic framework for domain modeling assistants that consists of class, attribute, and association generators, and show how they can be implemented using an LLM. We demonstrate a concrete configuration of this framework and its prototype implementation, and we evaluate the effectiveness of this configuration across various domains. Our findings indicate that the assistant significantly enhances modeling efficiency while maintaining reasonable quality of the outputs.
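For illustration only, the following minimal Python sketch shows how the class, attribute, and association generators described in the abstract could be wired together in a human-in-the-loop workflow. The llm callable, prompt wording, and helper names are assumptions made for this sketch, not the authors' actual implementation.

# Illustrative sketch, not the paper's implementation. `llm` is assumed to be
# any callable that maps a prompt string to a text completion.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class DomainModelDraft:
    classes: List[str] = field(default_factory=list)
    attributes: List[str] = field(default_factory=list)
    associations: List[str] = field(default_factory=list)

def _lines(text: str) -> List[str]:
    # Split an LLM reply into non-empty, stripped lines.
    return [line.strip() for line in text.splitlines() if line.strip()]

def suggest_classes(llm: Callable[[str], str], description: str) -> List[str]:
    prompt = ("Given the following domain description, list candidate class names "
              f"for a conceptual model, one per line:\n\n{description}")
    return _lines(llm(prompt))

def suggest_attributes(llm: Callable[[str], str], description: str, cls: str) -> List[str]:
    prompt = (f"For the class '{cls}' in the domain described below, list plausible "
              f"attribute names, one per line:\n\n{description}")
    return _lines(llm(prompt))

def suggest_associations(llm: Callable[[str], str], description: str,
                         classes: List[str]) -> List[str]:
    prompt = ("Given the classes " + ", ".join(classes) +
              " and the domain description below, list plausible associations as "
              f"'ClassA - verb - ClassB', one per line:\n\n{description}")
    return _lines(llm(prompt))

def assist(llm: Callable[[str], str], description: str) -> DomainModelDraft:
    # Each suggestion is only a draft; a human modeler is expected to accept,
    # edit, or reject it before it enters the final model.
    draft = DomainModelDraft()
    draft.classes = suggest_classes(llm, description)
    for cls in draft.classes:
        draft.attributes += [f"{cls}.{attr}" for attr in suggest_attributes(llm, description, cls)]
    draft.associations = suggest_associations(llm, description, draft.classes)
    return draft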
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspects to be 5.8)
Result continuities
Project
—
Continuities
S - Specific research at universities
Others
Publication year
2024
Confidentiality
S - Complete and accurate data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
ISBN
978-3-031-75872-0
ISSN
—
e-ISSN
1611-3349
Number of pages
19
Pages from-to
235-253
Publisher name
Springer
Place of publication
Cham
Event location
Pittsburgh, Pennsylvania, USA
Event date
Oct 28, 2024
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
—