EduAgentQG: A Multi-Agent Workflow Framework for Personalized Question Generation
Rui Jia, Min Zhang, Fengrui Liu, Bo Jiang, Kun Kuang, Zhongxiang Dai
arXiv.org Artificial Intelligence
Abstract: High-quality personalized question banks are crucial for supporting adaptive learning and individualized assessment. Manually designing questions is time-consuming and often fails to meet diverse learning needs, making automated question generation a crucial approach to reducing teachers' workload and improving the scalability of educational resources. However, most existing question generation methods rely on single-agent or rule-based pipelines, which still produce questions with unstable quality, limited diversity, and insufficient alignment with educational goals. To address these challenges, we propose EduAgentQG, a multi-agent collaborative framework for generating high-quality and diverse personalized questions. The framework consists of five specialized agents and operates through an iterative feedback loop: the Planner generates structured design plans and multiple question directions to enhance diversity; the Writer produces candidate questions based on the plan and optimizes their quality and diversity using feedback from the Solver and Educator; the Solver and Educator perform binary scoring across multiple evaluation dimensions and feed the evaluation results back to the Writer; and the Checker conducts final verification, including answer correctness and clarity, ensuring alignment with educational goals. Through this multi-agent collaboration and iterative feedback loop, EduAgentQG generates questions that are both high-quality and diverse, while maintaining consistency with educational objectives. Experiments on two mathematics question datasets demonstrate that EduAgentQG outperforms existing single-agent and multi-agent methods in terms of question diversity, goal consistency, and overall quality.

High-quality personalized question banks are crucial for supporting adaptive learning and individualized assessment [1], [2], [3].
In practical teaching, experienced educators can often determine the specific educational goals a student needs to achieve based on observation and prior knowledge [4], [5], [6]. Teachers typically engage in iterative cycles of planning, drafting, validation, and optimization to design questions that are both diagnostically effective and pedagogically meaningful, balancing knowledge coverage, cognitive skill development, and difficulty levels [7], [8]. Existing question banks may not always contain suitable questions, and even when relevant questions are available, they may have been previously attempted by students [9], [10], [11].
Nov 18, 2025