Pre-trained Transformer models using chronic invasive electrophysiology for symptom decoding without patient-individual training
Merk, Timon, Salehi, Saeed, Koehler, Richard M., Cui, Qiming, Olaru, Maria, Hahn, Amelia, Provenza, Nicole R., Little, Simon, Abbasi-Asl, Reza, Starr, Phil A., Neumann, Wolf-Julian
arXiv.org Artificial Intelligence
Neural decoding of pathological and physiological states can enable patient-individualized closed-loop neuromodulation therapy. Recent advances in large-scale pre-trained foundation models offer the potential for generalized state estimation without patient-individual training. Here we present a foundation model trained on chronic longitudinal deep brain stimulation recordings spanning over 24 days. To capture symptom fluctuations on long time scales, the model uses an extended context window of 30 minutes. We present a pre-training loss function optimized for neural electrophysiological data that corrects the frequency bias of common masked auto-encoder losses caused by the 1/f power law. In a downstream task, we demonstrate decoding of Parkinson's disease symptoms with leave-one-subject-out cross-validation, i.e., without patient-individual training.
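The abstract notes that a plain masked auto-encoder reconstruction loss is biased toward low frequencies, because the 1/f power law of neural signals concentrates most of the signal energy there. The paper does not publish its exact loss here, but one plausible way to correct such a bias is to reweight the squared reconstruction error in the frequency domain so that high-frequency components contribute comparably. The sketch below (function name, the linear-in-f weighting, and the sampling rate are all illustrative assumptions, not the authors' implementation) shows the general idea:

```python
import numpy as np

def freq_weighted_mse(pred, target, fs=250.0, eps=1e-8):
    """Hypothetical sketch of a 1/f-corrected reconstruction loss.

    Since the power of neural electrophysiology falls off roughly as
    1/f, an unweighted MSE is dominated by low frequencies. Weighting
    the spectral error by frequency counteracts that decay.
    """
    # Transform prediction and target to the frequency domain (real FFT
    # over the time axis, assumed to be the last axis).
    P = np.fft.rfft(pred, axis=-1)
    T = np.fft.rfft(target, axis=-1)
    freqs = np.fft.rfftfreq(pred.shape[-1], d=1.0 / fs)

    # Illustrative weight: squared error scaled linearly with frequency,
    # offsetting a ~1/f power decay; the DC term is down-weighted to ~0.
    w = freqs / (freqs.max() + eps)

    spectral_err = np.abs(P - T) ** 2
    return float(np.mean(spectral_err * w))
```

A perfect reconstruction yields a loss of exactly zero, and errors confined to high-frequency bands are penalized as strongly as equal-magnitude errors at low frequencies, which is the stated goal of the correction.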
Aug-15-2025
- Country:
  - Europe > Germany > Berlin (0.05)
  - North America > United States > California > San Francisco County > San Francisco (0.34)
  - North America > United States > Texas > Harris County > Houston (0.04)
- Genre:
  - Research Report (1.00)
- Industry:
  - Health & Medicine > Therapeutic Area > Neurology > Parkinson's Disease (1.00)