Injecting Linguistic Knowledge Into BERT for Dialogue State Tracking
By: Xiaohan Feng, Xixin Wu, Helen Meng
Format: Article
Published: IEEE, 2024-01-01
Description
Dialogue State Tracking (DST) models often employ intricate neural network architectures, necessitating substantial training data, and their inference process lacks transparency. This paper proposes a method that extracts linguistic knowledge via an unsupervised framework and subsequently utilizes this knowledge to augment BERT’s performance and interpretability in DST tasks. The knowledge extraction procedure is computationally economical and does not require annotations or additional training data. The injection of the extracted knowledge can be achieved by the addition of simple neural modules. We employ the Convex Polytopic Model (CPM) as a feature extraction tool for DST tasks and illustrate that the acquired features correlate with syntactic and semantic patterns in the dialogues. This correlation facilitates a comprehensive understanding of the linguistic features influencing the DST model’s decision-making process. We benchmark this framework on various DST tasks and observe a notable improvement in accuracy.
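The injection step described above (adding extracted features to BERT through "simple neural modules") can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the toy dimensions, the single linear projection `W`, and the residual-addition fusion are hypothetical stand-ins, not the paper's actual module design.

```python
import numpy as np

# Hypothetical sketch: fuse CPM-derived features into BERT hidden states
# via a small linear "injection" module with a residual connection.
# All names and sizes are illustrative assumptions, not the paper's design.
rng = np.random.default_rng(0)

hidden_dim, cpm_dim = 8, 3                             # toy sizes; BERT-base uses 768
W = rng.normal(scale=0.1, size=(cpm_dim, hidden_dim))  # learned projection (assumed)

def inject(bert_hidden, cpm_features):
    """Project per-token CPM features and add them to the hidden states."""
    return bert_hidden + cpm_features @ W

tokens = 5
h = rng.normal(size=(tokens, hidden_dim))  # stand-in for BERT token states
f = rng.normal(size=(tokens, cpm_dim))     # stand-in for CPM features
h_aug = inject(h, f)
print(h_aug.shape)  # (5, 8)
```

Because the fused output keeps the original hidden-state shape, such a module can be inserted without altering downstream BERT layers, which is consistent with the abstract's claim that injection needs only simple added modules.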