INTERACT: Enabling Interactive, Question-Driven Learning in Large Language Models
Kendapadi, Aum, Zaman, Kerem, Menon, Rakesh R., Srivastava, Shashank
arXiv.org Artificial Intelligence
Large language models (LLMs) excel at answering questions but remain passive learners, absorbing static data without the ability to question and refine knowledge. This paper explores how LLMs can transition to interactive, question-driven learning through student-teacher dialogues. We introduce INTERACT (INTERactive Learning for Adaptive Concept Transfer), a framework in which a "student" LLM engages a "teacher" LLM through iterative inquiries to acquire knowledge across 1,347 contexts, including song lyrics, news articles, movie plots, academic papers, and images. Our experiments show that across a wide range of scenarios and LLM architectures, interactive learning consistently enhances performance, achieving up to a 25% improvement, with 'cold-start' student models matching static learning baselines in as few as five dialogue turns. Interactive setups can also mitigate the disadvantages of weaker teachers, showcasing the robustness of question-driven learning.
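The student-teacher loop described in the abstract can be illustrated with a minimal sketch. This is a hypothetical simulation, not the paper's implementation: the `ask_question` and `answer_question` stubs stand in for real student and teacher LLM calls, and the fact-revealing heuristic is an assumption made only so the loop runs end to end.

```python
# Hypothetical sketch of an INTERACT-style question-driven learning loop.
# ask_question / answer_question are stubs standing in for student- and
# teacher-LLM API calls; the real prompting strategy is not specified here.

def ask_question(student_knowledge, turn):
    # Student asks about the next fact it has not yet learned (stub heuristic).
    return f"Question {turn}: tell me fact {len(student_knowledge) + 1}"

def answer_question(question, context_facts):
    # Teacher answers from its context; this stub reveals facts in order.
    idx = int(question.split()[-1]) - 1
    return context_facts[idx] if idx < len(context_facts) else "No more facts."

def interact(context_facts, max_turns=5):
    """Run the iterative student-teacher dialogue for up to max_turns."""
    knowledge = []
    for turn in range(1, max_turns + 1):
        question = ask_question(knowledge, turn)
        answer = answer_question(question, context_facts)
        knowledge.append(answer)
    return knowledge

# Toy context: a few facts the teacher holds about one item.
facts = [
    "The song was released in 1975.",
    "It topped the UK singles chart.",
    "It appears on the album A Night at the Opera.",
]
learned = interact(facts, max_turns=3)
print(learned)
```

With three facts and three turns, the student's acquired knowledge matches the teacher's context, mirroring the paper's observation that a cold-start student can close the gap to static baselines within a handful of dialogue turns.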
Dec-15-2024
- Country:
- Asia (1.00)
- Europe (1.00)
- North America > United States (0.67)
- Genre:
- Research Report
- Experimental Study (0.93)
- New Finding (1.00)
- Industry:
- Education
- Assessment & Standards > Student Performance (0.31)
- Educational Setting (0.67)
- Government > Regional Government (0.67)
- Health & Medicine (1.00)
- Leisure & Entertainment (1.00)
- Media > Film (0.68)
- Technology: