Article

DecoStrat: Leveraging the Capabilities of Language Models in D2T Generation via Decoding Framework

Elias Lemuye Jimale, Wenyu Chen, Mugahed A. Al-antari, Yeong Hyeon Gu, Victor Kwaku Agbesi and Wasif Feroze
1 School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
2 School of Electrical Engineering and Computing, Adama Science and Technology University, Adama 1888, Ethiopia
3 Department of Artificial Intelligence and Data Science, College of AI Convergence, Daeyang AI Center, Sejong University, Seoul 05006, Republic of Korea
* Authors to whom correspondence should be addressed.
Mathematics 2024, 12(22), 3596; https://doi.org/10.3390/math12223596
Submission received: 18 October 2024 / Revised: 14 November 2024 / Accepted: 15 November 2024 / Published: 17 November 2024

Abstract

Current language models have achieved remarkable success in NLP tasks. Nonetheless, individual decoding methods struggle to realize the immense potential of these models, primarily because no decoding framework exists that can integrate language models and decoding methods. We introduce DecoStrat, which bridges the gap between language modeling and the decoding process in D2T generation. By leveraging language models, DecoStrat facilitates the exploration of alternative decoding methods tailored to specific tasks. We fine-tuned the model on the MultiWOZ dataset to meet task-specific requirements and employed it to generate outputs through the framework's interacting modules. The Director module orchestrates the decoding process, engaging the Generator to produce output text based on the selected decoding method and input data. The Manager module enforces a selection strategy, integrating the Ranker and Selector to identify the optimal result. Evaluations on this dataset show that DecoStrat effectively produces diverse and accurate outputs, with minimum Bayes risk (MBR) variants consistently outperforming other methods. DecoStrat with the T5-small model surpasses some baseline frameworks. Overall, the findings highlight DecoStrat's potential for optimizing decoding methods in diverse real-world applications.
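To make the described module interplay concrete, below is a minimal, self-contained Python sketch of how a Director/Generator/Manager/Ranker/Selector pipeline of this shape could fit together. All class names, the toy decoding stub, and the Jaccard-overlap utility standing in for an MBR utility function are illustrative assumptions based on the abstract, not the authors' actual implementation.

```python
# Hypothetical sketch of the DecoStrat-style module flow described in the
# abstract. Names and logic are illustrative, not the authors' API.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Candidate:
    text: str
    score: float = 0.0

class Generator:
    """Wraps a language model plus a decoding method to produce candidates."""
    def __init__(self, decode: Callable[[str, int], List[str]]):
        self.decode = decode

    def generate(self, source: str, num_candidates: int) -> List[Candidate]:
        return [Candidate(t) for t in self.decode(source, num_candidates)]

class Ranker:
    """Scores candidates; here, a toy MBR-style scorer that rewards
    agreement with the other candidates (a stand-in for a utility
    such as BLEU)."""
    @staticmethod
    def overlap(a: str, b: str) -> float:
        ta, tb = set(a.split()), set(b.split())
        return len(ta & tb) / max(len(ta | tb), 1)

    def rank(self, candidates: List[Candidate]) -> List[Candidate]:
        for c in candidates:
            c.score = sum(self.overlap(c.text, o.text)
                          for o in candidates if o is not c)
        return sorted(candidates, key=lambda c: c.score, reverse=True)

class Selector:
    """Applies the selection strategy to the ranked list (here: take the top)."""
    def select(self, ranked: List[Candidate]) -> Candidate:
        return ranked[0]

class Manager:
    """Enforces the selection strategy by chaining Ranker and Selector."""
    def __init__(self):
        self.ranker, self.selector = Ranker(), Selector()

    def best(self, candidates: List[Candidate]) -> Candidate:
        return self.selector.select(self.ranker.rank(candidates))

class Director:
    """Orchestrates decoding: asks the Generator for candidates and the
    Manager for the optimal one."""
    def __init__(self, generator: Generator, manager: Manager):
        self.generator, self.manager = generator, manager

    def run(self, source: str, num_candidates: int = 5) -> str:
        candidates = self.generator.generate(source, num_candidates)
        return self.manager.best(candidates).text

# Toy decoding stub standing in for a fine-tuned model with sampling.
def fake_decode(source: str, n: int) -> List[str]:
    return [f"{source} -> realization {i}" for i in range(n)]

if __name__ == "__main__":
    director = Director(Generator(fake_decode), Manager())
    print(director.run("name[Blue Spice] food[Italian]"))
```

The MBR-style Ranker above scores each candidate by its total agreement with the rest of the pool, mirroring the consensus idea behind minimum Bayes risk decoding; a real system would substitute a task-appropriate utility (e.g., BLEU) and a fine-tuned model such as T5-small for the fake_decode stub.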
Keywords: decoding methods; data-to-text generation (D2T); language models (LMs); natural language generation (NLG); natural language processing (NLP)

