Open Access

ARTICLE


Prompt-Guided Dialogue State Tracking with GPT-2 and Graph Attention

Muhammad Asif Khan1, Dildar Hussain2, Bhuyan Kaibalya Prasad3, Irfan Ullah4, Inayat Khan5, Jawad Khan6,*, Yeong Hyeon Gu2,*, Pavlos Kefalas7

1 School of Computer Science and Engineering, Southeast University, Nanjing, 211189, China
2 Department of AI and Data Science, Sejong University, Seoul, 05006, Republic of Korea
3 Department of Electronics and Communication Engineering, National Institute of Technology, Rourkela, 769008, India
4 Department of Computer Science, Shaheed Benazir Bhutto University, Sheringal, 18050, Pakistan
5 Department of Computer Science, University of Engineering and Technology, Mardan, 23200, Pakistan
6 School of Computing, Gachon University, Seongnam, 13120, Republic of Korea
7 Department of Informatics, Aristotle University of Thessaloniki, Thessaloniki, 54124, Greece

* Corresponding Authors: Jawad Khan; Yeong Hyeon Gu

Computers, Materials & Continua 2025, 85(3), 5451-5468. https://doi.org/10.32604/cmc.2025.069134

Abstract

Dialogue State Tracking (DST) is a critical component of task-oriented spoken dialogue systems (SDS), tasked with maintaining an accurate representation of the conversational state by predicting slots and their corresponding values. Recent advances leverage Large Language Models (LLMs) with prompt-based tuning to improve tracking accuracy and efficiency. However, these approaches often incur substantial computational and memory overheads and typically address slot extraction implicitly within prompts, without explicitly modeling the complex dependencies between slots and values. In this work, we propose PUGG, a novel DST framework that constructs schema-driven prompts to fine-tune GPT-2 and utilizes its tokenizer to implement a memory encoder. PUGG explicitly extracts slot values via GPT-2 and employs Graph Attention Networks (GATs) to model and reason over the intricate relationships between slots and their associated values. We evaluate PUGG on four publicly available datasets, where it achieves state-of-the-art performance across multiple evaluation metrics, highlighting its robustness and generalizability in diverse conversational scenarios. Our results indicate that the integration of GPT-2 substantially reduces model complexity and memory consumption by streamlining key processes. Moreover, prompt tuning enhances the model’s flexibility and precision in extracting relevant slot-value pairs, while the incorporation of GATs facilitates effective relational reasoning, leading to improved dialogue state representations.
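The abstract describes the GAT component only at a high level. As a rough, hypothetical illustration (not the authors' implementation), a single-head graph attention layer of the kind PUGG could use to weigh relations between slot and value nodes can be sketched as follows; all names and shapes here are assumptions for exposition:

```python
import numpy as np

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

def gat_layer(h, adj, W, a):
    """One single-head graph attention layer (illustrative sketch).

    h   : (N, F)   node features, e.g. embeddings of slot and value nodes
    adj : (N, N)   binary adjacency with self-loops
    W   : (F, Fp)  shared linear projection
    a   : (2*Fp,)  attention vector, split into source/target halves
    """
    z = h @ W                                      # project node features
    fp = z.shape[1]
    left = z @ a[:fp]                              # per-node source score
    right = z @ a[fp:]                             # per-node target score
    e = leaky_relu(left[:, None] + right[None, :]) # (N, N) attention logits
    e = np.where(adj > 0, e, -1e9)                 # mask non-neighbours
    e = e - e.max(axis=1, keepdims=True)           # numerically stable softmax
    att = np.exp(e)
    att = att / att.sum(axis=1, keepdims=True)     # row-wise attention weights
    return att @ z, att                            # aggregated features, weights
```

In a full model, multiple attention heads and a nonlinearity would typically follow; this sketch only shows how attention over slot-value edges is formed and used to aggregate neighbour features.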

Keywords

Spoken dialogue systems; dialogue state tracking; prompt tuning; GPT-2; graph attention networks

Cite This Article

APA Style
Khan, M.A., Hussain, D., Prasad, B.K., Ullah, I., Khan, I. et al. (2025). Prompt-Guided Dialogue State Tracking with GPT-2 and Graph Attention. Computers, Materials & Continua, 85(3), 5451–5468. https://doi.org/10.32604/cmc.2025.069134
Vancouver Style
Khan MA, Hussain D, Prasad BK, Ullah I, Khan I, Khan J, et al. Prompt-Guided Dialogue State Tracking with GPT-2 and Graph Attention. Comput Mater Contin. 2025;85(3):5451–5468. https://doi.org/10.32604/cmc.2025.069134
IEEE Style
M. A. Khan et al., “Prompt-Guided Dialogue State Tracking with GPT-2 and Graph Attention,” Comput. Mater. Contin., vol. 85, no. 3, pp. 5451–5468, 2025. https://doi.org/10.32604/cmc.2025.069134



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.