Brant-2: Foundation Model for Brain Signals
Zhizhang Yuan, Daoze Zhang, Junru Chen, Gefei Gu, Yang Yang
Foundation models benefit from pre-training on large amounts of unlabeled data and achieve strong performance across a wide variety of applications with only a small amount of labeled data. Such models can be particularly effective for analyzing brain signals, as this field encompasses numerous application scenarios and large-scale annotation is costly. In this work, we present Brant-2, the largest foundation model for brain signals. Compared to Brant, a foundation model designed for intracranial neural signals, Brant-2 not only exhibits robustness to data variations and modeling scales but also applies to a broader range of brain neural data. Through experiments on an extensive range of tasks, we demonstrate that Brant-2 adapts to various application scenarios in brain signals. Further analyses reveal the scalability of Brant-2, validate the effectiveness of each component, and showcase the model's ability to maintain performance when labels are scarce.
arXiv.org Artificial Intelligence
Mar-28-2024
- Country:
- North America > United States (0.28)
- Genre:
- Research Report (1.00)
- Industry:
- Health & Medicine
- Health Care Technology (1.00)
- Therapeutic Area > Neurology (1.00)
- Technology: