Few-Shot Query Intent Detection via Relation-Aware Prompt Learning

Liang Zhang, Yuan Li, Shijie Zhang, Zheng Zhang, Xitong Li

arXiv.org Artificial Intelligence 

Abstract--Intent detection is a crucial component of modern conversational systems, since accurately identifying user intent at the beginning of a conversation is essential for generating effective responses. Currently, most intent detectors work effectively only under the assumption that high-quality labeled data is available (i.e., the collected data is labeled by domain experts). To ease this process, recent efforts have focused on studying this problem under a more challenging few-shot scenario. These approaches primarily leverage large-scale unlabeled dialogue text corpora to pretrain language models through various pretext tasks, followed by fine-tuning for intent detection with very limited annotations. Despite the improvements achieved, existing methods have predominantly focused on textual data, neglecting to effectively capture the crucial structural information inherent in conversational systems, such as the query-query relation and query-answer relation. Specifically, the query-query relation captures the semantic relevance between two queries within the same session, reflecting the user's refinement of her request, while the query-answer relation represents the conversational agent's clarification and response to a user query. To address this gap, we propose SAID, a novel framework that, for the first time, integrates both textual and relational structure information in a unified manner for model pretraining. Firstly, we introduce a relation-aware prompt module, which employs learnable relation tokens as soft prompts, enabling the model to learn shared knowledge across multiple relations and become explicitly aware of how to interpret query text within the context of these relations.
Secondly, we reformulate the few-shot intent detection problem using prompt learning by creating a new intent-specific relation-aware prompt, which incorporates intent-specific relation tokens alongside the semantic information embedded in intent names, helping the model effectively transfer knowledge acquired from related relational perspectives during pretraining. Building on this framework, we further propose a novel mechanism, the query-adaptive attention network (QueryAdapt), which operates at the relation token level by explicitly generating intent-specific relation tokens from well-learned query-query and query-answer relations, enabling more fine-grained knowledge transfer. Extensive experimental results on two real-world datasets demonstrate that SAID significantly outperforms state-of-the-art methods, achieving improvements of up to 27% in the 3-shot setting. When equipped with the relation token-level QueryAdapt module, it yields additional performance gains of up to 21% in the same setting.
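To make the soft-prompt idea concrete, the following is a minimal sketch (not the authors' implementation) of how learnable relation tokens could be prepended to a query's token embeddings, so that the encoder processes the query in the context of a given relation. All names, dimensions, and the random stand-in encoder are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

EMB_DIM = 16  # hypothetical embedding size
RELATIONS = ["query-query", "query-answer"]  # relations described in the abstract

# One learnable soft-prompt vector per relation; in a real model these would
# be nn.Parameter tensors updated during pretraining.
relation_tokens = {r: rng.normal(size=(1, EMB_DIM)) for r in RELATIONS}

def embed_query(tokens):
    """Stand-in for a pretrained encoder's token-embedding lookup (random here)."""
    return rng.normal(size=(len(tokens), EMB_DIM))

def build_relation_aware_input(query_tokens, relation):
    """Prepend the relation's soft-prompt token to the query embeddings,
    signaling to the encoder which relational context to interpret the query in."""
    prompt = relation_tokens[relation]      # shape (1, EMB_DIM)
    query_emb = embed_query(query_tokens)   # shape (len(query_tokens), EMB_DIM)
    return np.concatenate([prompt, query_emb], axis=0)

x = build_relation_aware_input(["book", "a", "flight"], "query-query")
print(x.shape)  # (4, 16): one relation token followed by three query tokens
```

At fine-tuning time, the same pattern would extend to an intent-specific prompt: relation tokens combined with the embedded intent name, letting the model reuse relation knowledge learned during pretraining.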
