Zebrafish


Intrinsic Goals for Autonomous Agents: Model-Based Exploration in Virtual Zebrafish Predicts Ethological Behavior and Whole-Brain Dynamics

Keller, Reece, Kirsch, Alyn, Pei, Felix, Pitkow, Xaq, Kozachkov, Leo, Nayebi, Aran

arXiv.org Artificial Intelligence

Autonomy is a hallmark of animal intelligence, enabling adaptive and intelligent behavior in complex environments without relying on external reward or task structure. Existing reinforcement learning approaches to exploration in reward-free environments, including a class of methods known as model-based intrinsic motivation, exhibit inconsistent exploration patterns and do not converge to an exploratory policy, thus failing to capture robust autonomous behaviors observed in animals. Moreover, systems neuroscience has largely overlooked the neural basis of autonomy, focusing instead on experimental paradigms where animals are motivated by external reward rather than engaging in ethological, naturalistic and task-independent behavior. To bridge these gaps, we introduce a novel model-based intrinsic drive explicitly designed after the principles of autonomous exploration in animals. Our method (3M-Progress) achieves animal-like exploration by tracking divergence between an online world model and a fixed prior learned from an ecological niche. To the best of our knowledge, we introduce the first autonomous embodied agent that predicts brain data entirely from self-supervised optimization of an intrinsic goal -- without any behavioral or neural training data -- demonstrating that 3M-Progress agents capture the explainable variance in behavioral patterns and whole-brain neural-glial dynamics recorded from autonomously behaving larval zebrafish, thereby providing the first goal-driven, population-level model of neural-glial computation. Our findings establish a computational framework connecting model-based intrinsic motivation to naturalistic behavior, providing a foundation for building artificial agents with animal-like autonomy.
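The core mechanism described above, an intrinsic reward that tracks divergence between an online world model and a fixed prior, can be sketched in miniature. This is not the authors' implementation; it is a minimal illustration assuming both models emit diagonal-Gaussian predictions, with all names and values hypothetical:

```python
import numpy as np

def gaussian_kl(mu_q, var_q, mu_p, var_p):
    # KL(N(mu_q, var_q) || N(mu_p, var_p)) for diagonal Gaussians.
    return 0.5 * np.sum(np.log(var_p / var_q)
                        + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)

def intrinsic_reward(online_pred, prior_pred):
    # Reward the agent in states where its online world model diverges
    # from the fixed prior learned in the ecological niche.
    mu_q, var_q = online_pred
    mu_p, var_p = prior_pred
    return gaussian_kl(mu_q, var_q, mu_p, var_p)

# Toy example: the online model's next-state prediction differs from the prior's.
online = (np.array([0.5, -0.2]), np.array([0.1, 0.1]))
prior = (np.array([0.0, 0.0]), np.array([0.2, 0.2]))
r = intrinsic_reward(online, prior)
```

A positive reward here simply flags disagreement between the two models; how that signal is scheduled and optimized is what the paper's 3M-Progress method actually specifies.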


NLP4Neuro: Sequence-to-sequence learning for neural population decoding

Morra, Jacob J., Fouke, Kaitlyn E., Hang, Kexin, He, Zichen, Traubert, Owen, Dunn, Timothy W., Naumann, Eva A.

arXiv.org Artificial Intelligence

Delineating how animal behavior arises from neural activity is a foundational goal of neuroscience. However, as the computations underlying behavior unfold in networks of thousands of individual neurons across the entire brain, this presents challenges for investigating neural roles and computational mechanisms in large, densely wired mammalian brains during behavior. Transformers, the backbones of modern large language models (LLMs), have become powerful tools for neural decoding from smaller neural populations. These modern LLMs have benefited from extensive pre-training, and their sequence-to-sequence learning has been shown to generalize to novel tasks and data modalities, which may also confer advantages for neural decoding from larger, brain-wide activity recordings. Here, we present a systematic evaluation of off-the-shelf LLMs to decode behavior from brain-wide populations, termed NLP4Neuro, which we used to test LLMs on simultaneous calcium imaging and behavior recordings in larval zebrafish exposed to visual motion stimuli. Through NLP4Neuro, we found that LLMs become better at neural decoding when they use pre-trained weights learned from textual natural language data. Moreover, we found that a recent mixture-of-experts LLM, DeepSeek Coder-7b, significantly improved behavioral decoding accuracy, predicted tail movements over long timescales, and provided anatomically consistent highly interpretable readouts of neuron salience. NLP4Neuro demonstrates that LLMs are highly capable of informing brain-wide neural circuit dissection.
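The underlying decoding problem can be framed as regression from population activity to behavior; the LLM-based approach above replaces the linear map with a pretrained sequence model. A minimal linear-baseline sketch on synthetic data (all array sizes and names are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: calcium activity from 500 neurons over 1,000 frames,
# decoded to a 1-D tail-angle trace (sizes chosen purely for illustration).
T, N = 1000, 500
X = rng.standard_normal((T, N))                # neural activity (frames x neurons)
w_true = rng.standard_normal(N) / np.sqrt(N)
y = X @ w_true + 0.1 * rng.standard_normal(T)  # tail angle per frame

# Ridge-regression decoder: the kind of linear baseline LLM decoders are compared against.
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
y_hat = X @ w
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

A high in-sample R² on this toy problem says nothing about generalization; the paper's contribution is showing that pretrained transformers outperform such baselines on real brain-wide recordings.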


ZAPBench: A Benchmark for Whole-Brain Activity Prediction in Zebrafish

Lueckmann, Jan-Matthis, Immer, Alexander, Chen, Alex Bo-Yuan, Li, Peter H., Petkova, Mariela D., Iyer, Nirmala A., Hesselink, Luuk Willem, Dev, Aparna, Ihrke, Gudrun, Park, Woohyun, Petruncio, Alyson, Weigel, Aubrey, Korff, Wyatt, Engert, Florian, Lichtman, Jeff W., Ahrens, Misha B., Januszewski, Michał, Jain, Viren

arXiv.org Artificial Intelligence

Data-driven benchmarks have led to significant progress in key scientific modeling domains including weather and structural biology. Here, we introduce the Zebrafish Activity Prediction Benchmark (ZAPBench) to measure progress on the problem of predicting cellular-resolution neural activity throughout an entire vertebrate brain. The benchmark is based on a novel dataset containing 4d light-sheet microscopy recordings of over 70,000 neurons in a larval zebrafish brain, along with motion stabilized and voxel-level cell segmentations of these data that facilitate development of a variety of forecasting methods. Initial results from a selection of time series and volumetric video modeling approaches achieve better performance than naive baseline methods, but also show room for further improvement. The specific brain used in the activity recording is also undergoing synaptic-level anatomical mapping, which will enable future integration of detailed structural information into forecasting methods.
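The "naive baseline methods" mentioned above are easy to make concrete. A hedged sketch of two such baselines for activity forecasting, on synthetic temporally correlated traces (sizes and noise levels are invented for illustration, not taken from the benchmark):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical activity traces: 200 timesteps x 1,000 neurons with slow drift,
# standing in for the benchmark's ~70,000-neuron recordings.
T, N = 200, 1000
drift = np.cumsum(0.05 * rng.standard_normal((T, N)), axis=0)
activity = drift + 0.01 * rng.standard_normal((T, N))

context, target = activity[:-1], activity[-1]

# Two naive baselines that learned forecasting models must beat:
persistence = context[-1]            # "copy the last observed frame"
global_mean = context.mean(axis=0)   # "predict each neuron's mean"

mae_persist = np.abs(target - persistence).mean()
mae_mean = np.abs(target - global_mean).mean()
```

On temporally correlated activity, persistence is typically the stronger of the two, which is why it is a standard sanity check for time-series forecasting benchmarks.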


Mapping the shifting gaze of 'fishlets'

Popular Science

They have Wolverine-like regeneration abilities, and can almost entirely regrow their spinal cords after damage. They also give scientists insight into some of the animal brain's most primal states. While working with week-old zebrafish larvae, a team of scientists decoded how the connections made by a network of neurons in the brainstem guide where the fish looks. They also created a simplified artificial circuit that can predict visual movement and activity in the animal's brain. This discovery sheds light on how the brain handles short-term memory and could lead to some new ways to treat eye movement disorders in humans.


Giving Zebrafish Psychotropic Drugs to Train AI Algorithms - Neuroscience News

#artificialintelligence

Summary: Researchers trained an AI to determine which psychotropic agent a zebrafish had been exposed to based on the animal's behaviors and locomotion patterns. Neuroscientists from St. Petersburg University, led by Professor Allan V. Kalueff, in collaboration with an international team of IT specialists, have become the first in the world to apply artificial intelligence (AI) algorithms to phenotyping zebrafish psychoactive drug responses. They managed to train AI to determine, from the fish's responses, which psychotropic agents were used in the experiment. The research findings are published in the journal Progress in Neuro-Psychopharmacology and Biological Psychiatry. The zebrafish (Danio rerio) is a freshwater bony fish that is presently the second most widely used model organism in biomedical research, after mice.
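Phenotyping a drug from behavior amounts to classifying a fish's locomotion profile. A minimal sketch of that idea, using a nearest-centroid rule on invented behavioral features (all class names, feature choices, and values are hypothetical, not from the study):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical locomotion features per fish: [mean speed, turn rate, freezing time].
# Each drug class shifts the behavioral profile (values are illustrative).
profiles = {"control":   [1.0, 0.5, 0.10],
            "sedative":  [0.3, 0.2, 0.60],
            "stimulant": [1.8, 1.1, 0.05]}

def simulate(drug, n=50):
    # Fake recordings: class profile plus per-fish noise.
    return np.array(profiles[drug]) + 0.1 * rng.standard_normal((n, 3))

train = {d: simulate(d) for d in profiles}
centroids = {d: x.mean(axis=0) for d, x in train.items()}

def classify(features):
    # Nearest-centroid rule: assign the drug whose average profile is closest.
    return min(centroids, key=lambda d: np.linalg.norm(features - centroids[d]))

test_fish = simulate("sedative", n=1)[0]
```

The published work uses far richer behavioral descriptors and learned classifiers; this sketch only shows why distinct drug-induced locomotion patterns make the classification tractable at all.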


Scientists have captured the brain making memories for the first time

Daily Mail - Science & tech

A team of USC researchers has filmed the live brains of zebrafish to show how the brain processes and stores memories in a ground-breaking study which could offer hope for new PTSD treatments. With the help of a tailor-made microscope, researchers were able to record how brain cells of the fish - which are transparent when young - 'lit up like Times Square on New Year's Eve' during the experiment. The study, which mapped the changes in the brain, made the surprising finding that forming memories appears to create new synapses (connections between neurons) or make them disappear entirely. The widely accepted theory that learning and memory strengthen existing synapses was not borne out. 'For the last 40 years the common wisdom was that you learn by changing the strength of the synapses but that's not what we found in this case,' co-author, director of the Informatics Division at the USC Information Sciences Institute and computer scientist Prof. Carl Kesselman said in a press release.


Eye-brain connection in humans first evolved in fish 100 MILLION years earlier than previously thought

Daily Mail - Science & tech

The sophisticated network of nerves connecting our eyes to our brains evolved 100 million years earlier than previously thought - a discovery that 'literally changes the textbook.' A team of international scientists found the connection scheme was already present in the ancient gar fish that lived 450 million years ago, which means the eye-brain connection pre-dates animals living on land. The long-held theory suggests the connection first evolved in terrestrial creatures and, from there, carried on into humans, where scientists believe it helps with our depth perception and 3D vision. Michigan State University's Ingo Braasch said: 'Modern fish, they don't have this type of eye-brain connection.' 'That's one of the reasons that people thought it was a new thing in tetrapods.'


Forget Finding Nemo: This AI can identify a single zebrafish out of a 100-strong shoal

#artificialintelligence

AI systems excel in pattern recognition, so much so that they can stalk individual zebrafish and fruit flies even when the animals are in groups of up to a hundred. To demonstrate this, a group of researchers from the Champalimaud Foundation, a private biomedical research lab in Portugal, trained two convolutional neural networks to identify and track individual animals within a group. The aim is not so much to match or exceed humans' ability to spot and follow stuff, but rather to automate the process of studying the behavior of animals in their communities. "The ultimate goal of our team is understanding group behavior," said Gonzalo de Polavieja. "We want to understand how animals in a group decide together and learn together."
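Identifying an individual within a shoal reduces to matching each video crop against a library of learned appearance embeddings. A stand-in sketch where the CNN embedding is faked with random "fingerprints" (everything here is hypothetical; the real system learns these embeddings from video):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical setup: each of 100 fish has a distinctive appearance embedding.
# The trained CNN produces such embeddings; here we invent them directly.
n_fish, dim = 100, 64
fingerprints = rng.standard_normal((n_fish, dim))

def embed(fish_id):
    # Stand-in for the CNN embedding of a noisy video crop of this fish.
    return fingerprints[fish_id] + 0.3 * rng.standard_normal(dim)

def identify(embedding):
    # Assign the identity whose stored fingerprint is most similar (dot product).
    sims = fingerprints @ embedding
    return int(np.argmax(sims))

crop = embed(fish_id=42)
```

The hard part the researchers solved is learning embeddings that stay distinctive under occlusion and pose change across a hundred near-identical animals; the matching step itself is this simple.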


Scientists capture video that shows living cells interacting in 3D detail

Daily Mail - Science & tech

Existing microscope technology means that we can only observe cells in an isolated environment, often restricted to a simple glass slide. But now, thanks to a breakthrough discovery, scientists have found a way to study cellular processes in their natural habitat: deep inside living organisms. In a study published Friday in Science, researchers describe how they developed a new kind of microscope that uses sophisticated 'guide star' technology. The result is a series of mesmerizing high-resolution 3D videos that document biological processes in never-before-seen detail. Researchers from Harvard Medical School, Boston Children's Hospital and the Howard Hughes Medical Institute collaborated for the study.


Small brains, big data

#artificialintelligence

When we think about big data, we usually think about the web: the billions of users of social media, the sensors on millions of mobile phones, the thousands of contributions to Wikipedia, and so forth. Due to recent innovations, web-scale data can now also come from a camera pointed at a small, but extremely complex object: the brain. New progress in distributed computing is changing how neuroscientists work with the resulting data -- and may, in the process, change how we think about computation. The brain consists of many neurons -- a hundred thousand in a fly or larval zebrafish, millions in a mouse, billions in a human. Its function depends on the neurons' activity, and how they communicate with one another.
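The distributed-computing pattern alluded to above is the familiar map-reduce idea: split the recording across workers, compute partial statistics per chunk, then merge. A minimal single-machine sketch with time-chunked synthetic data (sizes are illustrative, and real pipelines would use a distributed framework rather than a Python list of arrays):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical recording: 20,000 neurons x 1,000 timepoints, split into
# five time-chunks as a distributed store would hold it.
chunks = [rng.standard_normal((20000, 200)) for _ in range(5)]

# Map: per-chunk partial sums. Reduce: merge into per-neuron means.
counts = sum(c.shape[1] for c in chunks)
partial = sum(c.sum(axis=1) for c in chunks)
mean_per_neuron = partial / counts

# Reference: the same statistic computed on the concatenated array.
full = np.concatenate(chunks, axis=1)
reference = full.mean(axis=1)
```

Because sums merge exactly, the chunked result matches the whole-array computation, which is what lets per-neuron statistics over brain-wide recordings scale out across a cluster.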