A Bayesian method for reducing bias in neural representational similarity analysis

Mingbo Cai, Nicolas W. Schuck, Jonathan W. Pillow, Yael Niv

Neural Information Processing Systems

In neuroscience, the similarity matrix of neural activity patterns in response to different sensory stimuli or under different cognitive states reflects the structure of the neural representational space. Existing methods derive point estimates of neural activity patterns from noisy neural imaging data, and similarity is calculated from these point estimates.
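To see the bias this paper targets, consider a minimal NumPy sketch (an illustration only, not the authors' Bayesian method): correlating noise-contaminated point estimates systematically shrinks the off-diagonal entries of the similarity matrix relative to the noiseless ground truth, distorting the apparent representational structure. All sizes and noise levels below are made up for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_conditions, n_voxels, n_repeats = 4, 200, 20

# Ground-truth activity patterns: two pairs of similar conditions.
base = rng.standard_normal((2, n_voxels))
true_patterns = np.vstack([
    base[0], 0.9 * base[0] + 0.1 * rng.standard_normal(n_voxels),
    base[1], 0.9 * base[1] + 0.1 * rng.standard_normal(n_voxels),
])
true_sim = np.corrcoef(true_patterns)

# Point estimates: averages of noisy measurements. Residual noise
# attenuates the off-diagonal correlations (the bias in question).
noise = 3.0 * rng.standard_normal((n_repeats, n_conditions, n_voxels))
estimates = (true_patterns + noise).mean(axis=0)
est_sim = np.corrcoef(estimates)

print("true similarity:\n", np.round(true_sim, 2))
print("similarity from noisy point estimates:\n", np.round(est_sim, 2))
```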


Large Language Models as Urban Residents: An LLM Agent Framework for Personal Mobility Generation

Jiawei Wang

Neural Information Processing Systems

This paper introduces a novel approach using Large Language Models (LLMs) integrated into an agent framework for flexible and effective personal mobility generation. LLMs overcome the limitations of previous models by effectively processing semantic data and offering versatility in modeling various tasks.
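As a rough illustration of what such an agent framework could look like (the function names, prompt format, and stubbed LLM call below are hypothetical, not the paper's implementation), an LLM can be prompted with a resident persona and recent activity history and asked to return a structured daily schedule:

```python
import json

def llm(prompt: str) -> str:
    """Stub standing in for an LLM call; a real framework would query
    an actual model here (hypothetical interface, not the paper's code)."""
    return json.dumps([
        {"time": "08:00", "activity": "commute", "location": "station"},
        {"time": "09:00", "activity": "work", "location": "office"},
        {"time": "18:30", "activity": "dinner", "location": "restaurant"},
    ])

def generate_day(persona: str, history: list[str]) -> list[dict]:
    # The agent conditions generation on a semantic self-description
    # and recent activity history, then returns a structured schedule.
    prompt = (
        f"You are an urban resident: {persona}\n"
        f"Recent days: {history}\n"
        "Plan today's activities as JSON [{time, activity, location}, ...]."
    )
    return json.loads(llm(prompt))

schedule = generate_day("30-year-old office worker", ["work, gym", "work, groceries"])
for slot in schedule:
    print(slot["time"], slot["activity"], "@", slot["location"])
```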


Supplementary Material: Simultaneous embedding of multiple attractor manifolds in a recurrent neural network using constrained gradient optimization

Neural Information Processing Systems

The dynamics of neural activity are described by a standard rate model. Energy landscapes were uniformly shifted throughout the manuscript by a constant. For each network with a different number of total embedded maps, 15 realizations were performed, in which the permutations between the spatial maps were chosen independently and at random. Code availability: code is available at the public repository https://doi.org/10.5281/zenodo.10016179.
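A "standard rate model" typically takes the form dr/dt = -r + φ(Wr + I). The sketch below shows that general model class only; the paper's specific connectivity, nonlinearity, and embedded spatial maps differ, and the random weights here are placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

n, dt, steps = 100, 0.1, 500
W = rng.standard_normal((n, n)) / np.sqrt(n)  # random recurrent weights (placeholder)
I_ext = 0.5 * np.ones(n)                      # constant external input
phi = np.tanh                                 # firing-rate nonlinearity

r = np.zeros(n)
for _ in range(steps):
    # Euler step of dr/dt = -r + phi(W r + I), with unit time constant
    r += dt * (-r + phi(W @ r + I_ext))

print("steady-state mean rate:", round(float(r.mean()), 3))
```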




Recovering Individual-Level Activity Sequences from Location-Based Service Data Using a Novel Transformer-Based Model

Weiyu Luo, Chenfeng Xiong

arXiv.org Artificial Intelligence

Location-Based Service (LBS) data provides critical insights into human mobility, yet its sparsity often yields incomplete trip and activity sequences, making accurate inferences about trips and activities difficult. We pose the research problem: can activity sequences derived from high-quality LBS data be used to recover incomplete activity sequences at the individual level? This study proposes a new solution, the Variable Selection Network-fused Insertion Transformer (VSNIT), which integrates the Insertion Transformer's flexible sequence construction with the Variable Selection Network's dynamic covariate handling to recover missing segments in incomplete activity sequences while preserving existing data. The findings show that VSNIT inserts more diverse, realistic activity patterns that more closely match real-world variability, and restores disrupted activity transitions more effectively, aligning with the target. It also performs significantly better than the baseline model across all metrics. These results highlight VSNIT's superior accuracy and diversity in activity sequence recovery tasks, demonstrating its potential to enhance LBS data utility for mobility analysis. This approach offers a promising framework for future location-based research and applications.

Keywords: Sequence-to-Sequence Modeling, Location-Based Service Data, Data Sparsity, Insertion Transformer, Activity-Based Modeling, Human Mobility

INTRODUCTION

Activity-based model

Activity-based modeling (ABM) emerged in response to the limitations of traditional trip-based models, providing a more behaviorally appropriate framework for understanding travel demand (1-3).
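A minimal sketch of insertion-style decoding follows, assuming a trained model that proposes (position, activity) insertions. The stub below is hypothetical and omits VSNIT's Variable Selection Network entirely; it only shows the key property the abstract describes, namely that gaps are filled while existing observations are preserved.

```python
# Insertion-style recovery of an activity sequence (illustrative sketch).
END = "<eos>"

def predict_insertion(seq, covariates):
    """Hypothetical stand-in for a trained model: returns the next
    (slot_index, activity) to insert, or (None, END) when done.
    The covariates argument is unused in this stub."""
    if "work" not in seq:
        return seq.index("commute") + 1, "work"
    if "dinner" not in seq:
        return len(seq), "dinner"
    return None, END

def recover(partial, covariates):
    seq = list(partial)          # existing observations are preserved
    while True:
        slot, token = predict_insertion(seq, covariates)
        if token == END:
            return seq
        seq.insert(slot, token)  # fill a gap without touching known entries

print(recover(["home", "commute", "home"], {"weekday": True}))
# -> ['home', 'commute', 'work', 'home', 'dinner']
```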


Forcing LLMs to be evil during training can make them nicer in the long run

MIT Technology Review

For this study, Lindsey and his colleagues worked to lay down some of that groundwork. Previous research has shown that various dimensions of LLMs' behavior--from whether they are talking about weddings to persistent traits such as sycophancy--are associated with specific patterns of activity in the simulated neurons that constitute LLMs. Those patterns can be written down as a long string of numbers, in which each number represents how active a specific neuron is when the model is expressing that behavior. Here, the researchers focused on sycophantic, "evil", and hallucinatory personas--three types of behavior that LLM designers might want to avoid in their models. To identify those patterns, the team devised a fully automated pipeline that can map out the corresponding activity pattern given only a brief text description of a persona.
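One common way to derive such an activity pattern (whether this matches the team's exact pipeline is an assumption) is to contrast mean hidden-layer activations on persona-exhibiting versus neutral text. The arrays below are random placeholders standing in for activations that a real pipeline would extract from the model itself:

```python
import numpy as np

rng = np.random.default_rng(2)
hidden_dim = 512

# Stand-ins for hidden-layer activations collected while the model
# responds in-persona vs. neutrally (placeholder data, not real LLM states).
acts_persona = rng.standard_normal((100, hidden_dim)) + 0.3
acts_neutral = rng.standard_normal((100, hidden_dim))

# The "long string of numbers" for the persona: a per-neuron
# mean-difference vector between the two conditions, normalized.
persona_vector = acts_persona.mean(axis=0) - acts_neutral.mean(axis=0)
persona_vector /= np.linalg.norm(persona_vector)

# Score new activity for the persona by projecting onto the vector.
new_activation = rng.standard_normal(hidden_dim) + 0.3
print("persona score:", float(new_activation @ persona_vector))
```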