
Collaborating Authors

fbn


Beyond Pairwise Connections: Extracting High-Order Functional Brain Network Structures under Global Constraints

Zhan, Ling, Huang, Junjie, Yu, Xiaoyao, Chen, Wenyu, Jia, Tao

arXiv.org Artificial Intelligence

Functional brain network (FBN) modeling often relies on local pairwise interactions, whose limitation in capturing high-order dependencies is theoretically analyzed in this paper. Meanwhile, the computational burden and heuristic nature of current hypergraph modeling approaches hinder end-to-end learning of FBN structures directly from data distributions. To address this, we propose to extract high-order FBN structures under global constraints, implemented as a Global Constraints oriented Multi-resolution (GCM) FBN structure learning framework. It incorporates 4 types of global constraints (signal synchronization, subject identity, expected edge numbers, and data labels) to enable learning FBN structures at 4 distinct levels of modeling resolution (sample/subject/group/project). Experimental results demonstrate that GCM achieves up to a 30.6% improvement in relative accuracy and a 96.3% reduction in computational time across 5 datasets and 2 task settings, compared to 9 baselines and 10 state-of-the-art methods. Extensive experiments validate the contributions of individual components and highlight the interpretability of GCM. This work offers a novel perspective on FBN structure learning and provides a foundation for interdisciplinary applications in cognitive neuroscience. Code is publicly available at https://github.com/lzhan94swu/GCM.


Brain-like Functional Organization within Large Language Models

Sun, Haiyang, Zhao, Lin, Wu, Zihao, Gao, Xiaohui, Hu, Yutao, Zuo, Mengfei, Zhang, Wei, Han, Junwei, Liu, Tianming, Hu, Xintao

arXiv.org Artificial Intelligence

The human brain has long inspired the pursuit of artificial intelligence (AI). Recently, neuroimaging studies have provided compelling evidence of alignment between the computational representations of artificial neural networks (ANNs) and the neural responses of the human brain to stimuli, suggesting that ANNs may employ brain-like information processing strategies. While such alignment has been observed across sensory modalities--visual, auditory, and linguistic--much of the focus has been on the behaviors of artificial neurons (ANs) at the population level, leaving the functional organization of individual ANs that facilitates such brain-like processes largely unexplored. In this study, we bridge this gap by directly coupling sub-groups of artificial neurons with functional brain networks (FBNs), the foundational organizational structure of the human brain. Specifically, we extract representative patterns from the temporal responses of ANs in large language models (LLMs), and use them as fixed regressors to construct voxel-wise encoding models that predict brain activity recorded by functional magnetic resonance imaging (fMRI). This framework links the AN sub-groups to FBNs, enabling the delineation of brain-like functional organization within LLMs. Our findings reveal that LLMs (BERT and Llama 1-3) exhibit brain-like functional architecture, with sub-groups of artificial neurons mirroring the organizational patterns of well-established FBNs. Notably, the brain-like functional organization of LLMs evolves with increasing sophistication and capability, achieving an improved balance between the diversity of computational behaviors and the consistency of functional specializations. This research represents the first exploration of brain-like functional organization within LLMs, offering novel insights to inform the development of artificial general intelligence (AGI) with human brain principles.
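The voxel-wise encoding approach the abstract describes can be sketched as a regularized linear fit of a voxel's activity onto fixed AN response patterns. The data, function names, and single-regressor simplification below are ours for illustration, not the paper's implementation:

```python
# Toy voxel-wise encoding model: an artificial-neuron response pattern
# serves as a fixed regressor; a ridge-regularized linear fit predicts
# one voxel's time series. Synthetic data, illustrative names only.

def ridge_fit(x, y, lam=1.0):
    """Closed-form ridge solution for a single regressor (no intercept)."""
    xtx = sum(xi * xi for xi in x)
    xty = sum(xi * yi for xi, yi in zip(x, y))
    return xty / (xtx + lam)

def predict(x, beta):
    return [beta * xi for xi in x]

# Synthetic AN pattern and a voxel time series that roughly tracks it.
pattern = [0.0, 1.0, 2.0, 1.0, 0.0]
voxel = [0.1, 2.1, 3.9, 2.0, -0.1]  # approximately 2 * pattern

beta = ridge_fit(pattern, voxel, lam=0.1)
pred = predict(pattern, beta)
```

In the actual study, one such fit would be run per voxel, with the goodness of fit used to associate AN sub-groups with FBNs.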


Overcoming the Challenges of Batch Normalization in Federated Learning

Guerraoui, Rachid, Pinot, Rafael, Rizk, Geovani, Stephan, John, Taiani, François

arXiv.org Artificial Intelligence

Batch normalization has proven to be a very beneficial mechanism to accelerate the training and improve the accuracy of deep neural networks in centralized environments. Yet, the scheme faces significant challenges in federated learning, especially under high data heterogeneity. Essentially, the main challenges arise from external covariate shifts and inconsistent statistics across clients. In this paper, we introduce Federated BatchNorm (FBN), a novel scheme that restores the benefits of batch normalization in federated learning. FBN ensures that the batch normalization performed during training is consistent with what would be achieved in a centralized execution, hence preserving the distribution of the data and providing running statistics that accurately approximate the global statistics. FBN thereby reduces the external covariate shift and matches the evaluation performance of the centralized setting. We also show that, with a slight increase in complexity, we can robustify FBN to mitigate erroneous statistics and potentially adversarial attacks.
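The core consistency requirement stated above can be illustrated with sufficient statistics: if each client reports its batch size, mean, and mean of squares, the server can recover exactly the mean and variance a centralized batch would have produced. This is a minimal sketch of that idea under our own naming, not the paper's protocol:

```python
# Minimal sketch: clients share per-batch statistics so that the
# aggregated normalization matches a centralized execution.

def client_stats(batch):
    """Per-client sufficient statistics: (count, mean, mean of squares)."""
    n = len(batch)
    mean = sum(batch) / n
    mean_sq = sum(x * x for x in batch) / n
    return n, mean, mean_sq

def aggregate(stats):
    """Server-side aggregation reproducing the centralized mean/variance."""
    total = sum(n for n, _, _ in stats)
    mean = sum(n * m for n, m, _ in stats) / total
    mean_sq = sum(n * ms for n, _, ms in stats) / total
    var = mean_sq - mean * mean
    return mean, var

def normalize(batch, mean, var, eps=1e-5):
    return [(x - mean) / (var + eps) ** 0.5 for x in batch]

# Two heterogeneous clients; aggregated stats equal centralized stats.
c1, c2 = [1.0, 2.0, 3.0], [10.0, 20.0]
mean, var = aggregate([client_stats(c1), client_stats(c2)])
central = c1 + c2
assert abs(mean - sum(central) / len(central)) < 1e-12
```

The robustified variant mentioned in the abstract would additionally guard this aggregation against clients reporting erroneous or adversarial statistics.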


Spatial-Temporal Convolutional Attention for Mapping Functional Brain Networks

Liu, Yiheng, Ge, Enjie, Qiang, Ning, Liu, Tianming, Ge, Bao

arXiv.org Machine Learning

Using functional magnetic resonance imaging (fMRI) and deep learning to explore functional brain networks (FBNs) has attracted many researchers. However, most of these studies are still based on the temporal correlation between the sources and voxel signals, and lack research on the dynamics of brain function. Recently, to overcome the shallow nature of the linear models, various deep learning based methods have been proposed to discover FBNs. Most of these methods are based on autoencoders: they use different autoencoders to extract the sources in a self-supervised manner, and then use a generative linear model, such as LASSO, to generate the FBNs [6, 7]. In general, these deep learning based methods can indeed extract better encoder representations as the sources than classical methods such as ICA and SDL, but they still generate FBNs in a linear and independent manner, with source extraction and FBN generation as 2 separate steps. Generating FBNs in this way is time-consuming, does not fully utilize the advantages of deep learning, and cannot directly generate the FBNs with deep learning. Due to the widespread local correlations in the volumes, FBNs can instead be generated directly in the spatial domain in a self-supervised manner by using spatial-wise attention (SA), and the resulting FBNs have a higher spatial similarity with templates compared to the classical methods. Therefore, we propose a novel Spatial-Temporal Convolutional Attention (STCA) model to discover the FBNs.
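The "LASSO generates the FBNs" step criticized above amounts to regressing each voxel's signal onto the extracted source time courses with an L1 penalty; the sparse coefficients across voxels form the network maps. A hedged sketch of that second, separate step, reduced to a single source and synthetic data of our own:

```python
# Sketch of LASSO-based FBN generation: each voxel is regressed onto an
# extracted source time course with an L1 penalty; near-zero coefficients
# drop out, yielding a sparse spatial map. Single-source closed form.

def soft_threshold(z, t):
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def lasso_coef(source, voxel, lam):
    """Closed-form single-predictor LASSO coefficient."""
    sts = sum(s * s for s in source)
    stv = sum(s * v for s, v in zip(source, voxel))
    return soft_threshold(stv / sts, lam / sts)

source = [1.0, -1.0, 2.0, 0.0]        # extracted source time course
strong = [1.1, -0.9, 2.2, 0.1]        # voxel tracking the source
weak = [0.05, 0.0, -0.02, 0.01]       # voxel unrelated to the source

coef_strong = lasso_coef(source, strong, lam=0.5)
coef_weak = lasso_coef(source, weak, lam=0.5)  # shrunk to zero
```

The abstract's point is that this regression runs independently per voxel after source extraction, which is exactly the two-step, linear pipeline STCA aims to replace.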


Finet: Using Fine-grained Batch Normalization to Train Light-weight Neural Networks

Luo, Chunjie, Zhan, Jianfeng, Wang, Lei, Gao, Wanling

arXiv.org Machine Learning

To build light-weight networks, we propose a new normalization, Fine-grained Batch Normalization (FBN). Different from Batch Normalization (BN), which normalizes the final summation of the weighted inputs, FBN normalizes the intermediate state of the summation. We propose a novel light-weight network based on FBN, called Finet. At training time, the convolutional layer with FBN can be seen as an inverted bottleneck mechanism. FBN can be fused into the convolution at inference time. After fusion, Finet uses standard convolution with equal channel width, thus making inference more efficient. On the ImageNet classification dataset, Finet achieves state-of-the-art performance (65.706% accuracy with 43M FLOPs, and 73.786% accuracy with 303M FLOPs). Moreover, experiments show that Finet is more efficient than other state-of-the-art light-weight networks.
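The fusion mentioned above relies on the standard trick of folding a normalization's affine transform into the preceding layer's weights at inference time. A minimal sketch of that folding, simplified to a single output channel with illustrative numbers (not Finet's actual layer shapes):

```python
# Standard normalization folding: after training, a BN-style transform
# (gamma, beta, running mean/var) following a linear/convolutional layer
# can be absorbed into its weights and bias, so inference uses a plain
# convolution. Single output channel for illustration.

def fold_bn(w, b, gamma, beta, mean, var, eps=1e-5):
    """Return fused (w', b') so that bn(linear(x)) == linear'(x)."""
    scale = gamma / (var + eps) ** 0.5
    w_fused = [wi * scale for wi in w]
    b_fused = (b - mean) * scale + beta
    return w_fused, b_fused

def linear(w, b, x):
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def bn(y, gamma, beta, mean, var, eps=1e-5):
    return gamma * (y - mean) / (var + eps) ** 0.5 + beta

w, b = [0.5, -1.0, 2.0], 0.1
gamma, beta, mean, var = 1.5, -0.2, 0.3, 4.0
x = [1.0, 2.0, 3.0]

wf, bf = fold_bn(w, b, gamma, beta, mean, var)
# Fused layer reproduces the unfused layer + normalization exactly.
assert abs(linear(wf, bf, x) - bn(linear(w, b, x), gamma, beta, mean, var)) < 1e-9
```

Because FBN normalizes intermediate partial sums rather than the final sum, its fusion in the paper is applied per sub-summation, but the per-channel algebra is the same as in this sketch.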


Introduction of Recurrent Neural Networks (RNN) - Ankitaism

#artificialintelligence

Artificial Neural Networks (ANNs) have evolved tremendously, with a variety of networks suited to different applications according to their individual properties. An ANN has a simple structure consisting of nodes (also called processing units) connected to each other via weights. The network is stimulated by giving input to a few or all nodes, and this stimulation, also called activation, spreads through the entire network. The way in which layers are connected and fed categorizes ANNs into feed-forward networks (FFNs) or feed-back networks (FBNs). FFNs are acyclic in nature, i.e. signals travel forward through the weights and biases just once; whereas FBNs are cyclically connected, i.e. some layers receive connections coming back from other layers recursively.
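The acyclic vs. cyclic distinction above can be made concrete in a few lines. This is a minimal sketch with scalar weights of our own choosing, using tanh as the activation:

```python
# Minimal contrast between a feed-forward pass and a recurrent
# (feed-back) update. In the FFN, the signal travels forward exactly
# once; in the FBN/RNN step, the hidden state feeds back into itself.
import math

def feed_forward(x, w1, w2):
    """Acyclic: input -> hidden -> output, one forward pass."""
    h = math.tanh(w1 * x)
    return math.tanh(w2 * h)

def recurrent(xs, w_in, w_rec):
    """Cyclic: the hidden state is fed back at every time step."""
    h = 0.0
    for x in xs:
        h = math.tanh(w_in * x + w_rec * h)  # w_rec carries the feedback
    return h

y = feed_forward(1.0, 0.5, 0.8)
h = recurrent([1.0, 0.5, -0.3], 0.5, 0.9)
```

Setting `w_rec` to zero removes the feedback loop, at which point each step of the recurrent network degenerates into an independent feed-forward computation.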