
enunlg: a Python library for reproducible neural data-to-text experimentation

Howcroft, David M; Gkatzia, Dimitra

Abstract

Over the past decade, a variety of neural architectures for data-to-text generation (NLG) have been proposed. However, each system typically has its own approach to pre- and post-processing and other implementation details. Diversity in implementations is desirable, but it also confounds attempts to compare model performance: are the differences due to the proposed architectures, or are they a byproduct of the libraries used or of pre- and post-processing decisions? To improve reproducibility, we re-implement several pre-Transformer neural models for data-to-text NLG within a single framework to facilitate direct comparisons of the models themselves and to better understand the contributions of other design choices. We release our library at https://github.com/NapierNLP/enunlg to serve as a baseline for ongoing work in this area, including research on NLG for low-resource languages where transformers might not be optimal.

Citation

Howcroft, D. M., & Gkatzia, D. (2023). enunlg: a Python library for reproducible neural data-to-text experimentation. In Proceedings of the 16th International Natural Language Generation Conference: System Demonstrations (pp. 4-5). Association for Computational Linguistics.

Conference Name: 16th International Natural Language Generation Conference
Conference Location: Prague, Czechia
Start Date: Sep 13, 2023
End Date: Sep 15, 2023
Acceptance Date: Jul 12, 2023
Online Publication Date: Sep 11, 2023
Publication Date: 2023
Deposit Date: Nov 15, 2023
Publicly Available Date: Nov 15, 2023
Publisher: Association for Computational Linguistics (ACL)
Pages: 4-5
Book Title: Proceedings of the 16th International Natural Language Generation Conference: System Demonstrations
ISBN: 979-8-89176-002-8
Public URL: http://researchrepository.napier.ac.uk/Output/3385911
Publisher URL: https://aclanthology.org/2023.inlg-demos.2/
