Hybrid EEG-Driven Brain-Computer Interface: A Large Language Model Framework for Personalized Language Rehabilitation

Ismail Hossain, Mridul Banik

arXiv.org Artificial Intelligence 

Background: Conventional augmentative and alternative communication (AAC) systems and language-learning platforms often fail to adapt in real time to the user's cognitive and linguistic needs, especially in neurological conditions such as post-stroke aphasia or amyotrophic lateral sclerosis. Recent advances in noninvasive electroencephalography (EEG)-based brain-computer interfaces (BCIs) and transformer-based large language models (LLMs) offer complementary strengths: BCIs capture users' neural intent with low fatigue, while LLMs generate contextually tailored language content.

Objective: We propose and evaluate a novel hybrid framework that leverages real-time EEG signals to drive an LLM-powered language rehabilitation assistant. This system aims to: (1) enable users with severe speech or motor impairments to navigate language-learning modules via mental commands; (2) dynamically personalize vocabulary, sentence-construction exercises, and corrective feedback; and (3) monitor neural markers of cognitive effort to adjust task difficulty on the fly.

All individuals have the right to self-expression, social participation, and the agency to impact their environment. For individuals with complex communication needs, AAC systems provide critical tools to facilitate communication. However, traditional AAC methods, such as printed communication boards or eye-gaze devices, may not be accessible for individuals with severe speech and physical impairments (SSPI).
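The closed loop sketched in the objective (decode a mental command, adapt the exercise, track cognitive effort) can be illustrated with a minimal Python sketch. Everything here is hypothetical: the command set, the argmax "decoder" standing in for a trained EEG classifier, and the theta/alpha ratio used as a workload proxy are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical mental commands the BCI might decode (illustrative only).
COMMANDS = ["next_exercise", "repeat", "request_hint"]

def decode_intent(band_power: np.ndarray) -> str:
    """Toy intent decoder: pick the command whose feature channel shows
    peak power. A real system would use a classifier trained on EEG data."""
    return COMMANDS[int(np.argmax(band_power))]

def cognitive_load(theta: float, alpha: float) -> float:
    """Theta/alpha band-power ratio, a commonly used workload proxy."""
    return theta / max(alpha, 1e-9)

def adjust_difficulty(level: int, load: float,
                      low: float = 0.8, high: float = 1.5) -> int:
    """Lower difficulty when measured load is high, raise it when low."""
    if load > high:
        return max(1, level - 1)
    if load < low:
        return level + 1
    return level

def build_prompt(command: str, level: int) -> str:
    """Template an LLM request from the decoded intent and difficulty."""
    return (f"Generate a level-{level} sentence-construction exercise "
            f"for the action '{command}'.")

# One pass through the loop: decode intent, update difficulty, form prompt.
intent = decode_intent(np.array([0.1, 0.9, 0.2]))   # -> "repeat"
level = adjust_difficulty(3, cognitive_load(4.0, 2.0))  # load 2.0 -> step down
prompt = build_prompt(intent, level)
```

The key design point is that the LLM never sees raw EEG: the BCI stage reduces the signal to a discrete command plus a scalar load estimate, and only those drive prompt construction.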