
Collaborating Authors

mathis


AI enables a Who's Who of brown bears in Alaska

AIHub

Being able to distinguish individual animals - including their unique history, movement patterns and habits - can help scientists better understand how their species function, and therefore better manage habitats and study population dynamics. Today, most computer vision systems for tracking animals are effective on species with distinctive patterns and markings, such as zebras, leopards and giraffes. The task is much more complicated for unmarked species, where individual differences are harder to spot. Distinguishing a particular brown bear from its peers in a non-invasive way requires an incredible eye for detail and years of viewing the same bears over time. What's more, these bears emerge from hibernation in the spring with shaggy fur, having lost quite a bit of weight; over the following months they substantially increase their body weight feasting on salmon and fully shed their winter coats. That's enough to throw off experts as well as AI algorithms.
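A common recipe for this kind of individual re-identification, and a reasonable mental model for the system described above, is to embed each animal photo with a neural network and match new sightings against a gallery of known individuals. The sketch below is illustrative only: the embedding network, dimensions, threshold and bear names are all assumptions, with random vectors standing in for real network outputs.

    import numpy as np

    def identify(query_emb, gallery, threshold=0.7):
        """Return the known individual whose embedding best matches the
        query, or None if no match clears the similarity threshold."""
        best_name, best_sim = None, -1.0
        for name, emb in gallery.items():
            # Cosine similarity between the two embedding vectors.
            sim = float(np.dot(query_emb, emb) /
                        (np.linalg.norm(query_emb) * np.linalg.norm(emb)))
            if sim > best_sim:
                best_name, best_sim = name, sim
        return best_name if best_sim >= threshold else None

    # Toy usage: random vectors stand in for outputs of a (hypothetical)
    # bear-face embedding network.
    rng = np.random.default_rng(0)
    gallery = {"bear_409": rng.normal(size=256), "bear_128": rng.normal(size=256)}
    noisy_sighting = gallery["bear_409"] + 0.05 * rng.normal(size=256)
    print(identify(noisy_sighting, gallery))  # -> "bear_409"

The threshold matters for exactly the seasonal-appearance problem the article describes: a bear photographed in spring and again in autumn must still land closer to its own gallery entry than to any other bear's.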


MBE-ARI: A Multimodal Dataset Mapping Bi-directional Engagement in Animal-Robot Interaction

Noronha, Ian, Jawaji, Advait Prasad, Soto, Juan Camilo, An, Jiajun, Gu, Yan, Kaur, Upinder

arXiv.org Artificial Intelligence

Animal-robot interaction (ARI) remains an unexplored challenge in robotics, as robots struggle to interpret the complex, multimodal communication cues of animals, such as body language, movement, and vocalizations. Unlike human-robot interaction, which benefits from established datasets and frameworks, animal-robot interaction lacks the foundational resources needed to facilitate meaningful bidirectional communication. To bridge this gap, we present the MBE-ARI (Multimodal Bidirectional Engagement in Animal-Robot Interaction), a novel multimodal dataset that captures detailed interactions between a legged robot and cows. The dataset includes synchronized RGB-D streams from multiple viewpoints, annotated with body pose and activity labels across interaction phases, offering an unprecedented level of detail for ARI research. Additionally, we introduce a full-body pose estimation model tailored for quadruped animals, capable of tracking 39 keypoints with a mean average precision (mAP) of 92.7%, outperforming existing benchmarks in animal pose estimation. The MBE-ARI dataset and our pose estimation framework lay a robust foundation for advancing research in animal-robot interaction, providing essential tools for developing perception, reasoning, and interaction frameworks needed for effective collaboration between robots and animals. The dataset and resources are publicly available at https://github.com/RISELabPurdue/MBE-ARI/, inviting further exploration and development in this critical area.
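For readers unfamiliar with keypoint mAP: figures like the 92.7% above are typically computed from Object Keypoint Similarity (OKS), which scores predicted keypoints against ground truth with a per-keypoint Gaussian falloff. A minimal sketch follows, assuming COCO-style OKS; the falloff constants and object area are placeholder values, not those used for MBE-ARI.

    import numpy as np

    def oks(pred, gt, visible, area, k):
        """Object Keypoint Similarity.
        pred, gt: (N, 2) keypoint coordinates; visible: (N,) bool mask;
        area: object scale (segment area); k: (N,) per-keypoint constants."""
        d2 = np.sum((pred - gt) ** 2, axis=1)   # squared pixel distances
        e = d2 / (2.0 * area * k ** 2)          # scale-normalized error
        return float(np.exp(-e)[visible].mean())

    # Toy example with 39 keypoints, matching the pose model above.
    rng = np.random.default_rng(1)
    gt = rng.uniform(0, 100, size=(39, 2))
    pred = gt + rng.normal(scale=2.0, size=(39, 2))
    print(oks(pred, gt, np.ones(39, dtype=bool), area=5000.0, k=np.full(39, 0.05)))

The reported mAP then averages detection precision over a sweep of OKS thresholds (commonly 0.50 to 0.95).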


Adaptive Intelligence: leveraging insights from adaptive behavior in animals to build flexible AI systems

Mathis, Mackenzie Weygandt

arXiv.org Artificial Intelligence

Biological intelligence is inherently adaptive -- animals continually adjust their actions based on environmental feedback. However, creating adaptive artificial intelligence (AI) remains a major challenge. The next frontier is to go beyond traditional AI to develop "adaptive intelligence," defined here as harnessing insights from biological intelligence to build agents that can learn online, generalize, and rapidly adapt to changes in their environment. Recent advances in neuroscience offer inspiration through studies that increasingly focus on how animals naturally learn and adapt their world models. In this Perspective, I will review the behavioral and neural foundations of adaptive biological intelligence, the parallel progress in AI, and explore brain-inspired approaches for building more adaptive algorithms.


Modeling the minutia of motor manipulation with AI

AIHub

In neuroscience and biomedical engineering, accurately modeling the complex movements of the human hand has long been a significant challenge. Current models often struggle to capture the intricate interplay between the brain's motor commands and the physical actions of muscles and tendons. This gap not only hinders scientific progress but also limits the development of effective neuroprosthetics aimed at restoring hand function for those with limb loss or paralysis. EPFL professor Alexander Mathis and his team have developed an AI-driven approach that advances our understanding of these complex motor functions. The team used a creative machine learning strategy that combined curriculum-based reinforcement learning with detailed biomechanical simulations.
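As a rough illustration of the curriculum idea named above (not the EPFL team's actual training code), the loop below only raises task difficulty once the policy succeeds reliably at the current level; the environment, thresholds and schedule are all assumptions.

    import random

    def train_one_episode(difficulty):
        """Placeholder for one RL episode in a biomechanical simulator;
        returns whether the policy completed the task at this difficulty."""
        return random.random() > 0.5 * difficulty

    def curriculum_training(n_stages=5, episodes_per_stage=200, promote_at=0.8):
        difficulty = 0.1
        for stage in range(n_stages):
            successes = sum(train_one_episode(difficulty)
                            for _ in range(episodes_per_stage))
            rate = successes / episodes_per_stage
            print(f"stage {stage}: difficulty={difficulty:.2f}, success={rate:.2f}")
            if rate >= promote_at:  # advance only once the policy is competent
                difficulty = min(1.0, difficulty + 0.2)

    curriculum_training()

The design choice is the same one that makes curricula attractive for dexterous manipulation: starting with easy variants of a task gives the agent a dense learning signal it would never get if dropped straight into the hardest setting.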


AmadeusGPT: a natural language interface for interactive animal behavioral analysis

Neural Information Processing Systems

The process of quantifying and analyzing animal behavior involves translating the naturally occurring descriptive language of their actions into machine-readable code. Yet codifying behavior analysis is often challenging without a deep understanding of animal behavior and technical machine learning knowledge. To bridge this gap, we introduce AmadeusGPT: a natural language interface that turns natural language descriptions of behaviors into machine-executable code. Large language models (LLMs) such as GPT-3.5 and GPT-4 allow for interactive language-based queries that are potentially well suited for interactive behavior analysis. However, the comprehension capability of these LLMs is limited by the context window size, which prevents them from remembering distant conversations.
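The abstract points at a general pattern worth making concrete: wrap an LLM so that natural-language behavior queries come back as executable analysis code, and fold older turns into a running summary so the conversation fits inside the model's context window. The sketch below is a generic illustration of that pattern, not AmadeusGPT's implementation; call_llm stands in for whatever chat-completion API is available, and the prompts and budget are assumptions.

    def call_llm(prompt):
        """Placeholder: route this to your LLM provider of choice."""
        raise NotImplementedError

    class BehaviorAssistant:
        def __init__(self, max_history_chars=4000):
            self.summary = ""    # compressed memory of older turns
            self.history = []    # recent raw question/answer turns
            self.max_history_chars = max_history_chars

        def ask(self, query):
            prompt = (f"Conversation summary: {self.summary}\n"
                      "Recent turns:\n" + "\n".join(self.history) +
                      f"\nUser request: {query}\n"
                      "Respond with Python code performing this behavior analysis.")
            code = call_llm(prompt)
            self.history.append(f"Q: {query}\nA: {code}")
            # Once recent turns outgrow the budget, fold them into the summary
            # so future prompts stay within the model's context window.
            if sum(len(t) for t in self.history) > self.max_history_chars:
                self.summary = call_llm(
                    "Condense these turns for future context:\n"
                    + "\n".join(self.history))
                self.history = []
            return code

Summarizing old turns trades fidelity for length: the assistant keeps a gist of distant conversations rather than forgetting them outright when the raw transcript no longer fits.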


Movie clip reconstructed by an AI reading mice's brains as they watch

New Scientist

A mouse's brain activity may give some indication of what it is seeing. A black-and-white movie has been extracted almost perfectly from the brain signals of mice using an artificial intelligence tool. Mackenzie Mathis at the Swiss Federal Institute of Technology Lausanne and her colleagues collected brain activity data from around 50 mice while they watched a 30-second movie clip nine times. The researchers then trained an AI to link this data to the 600 frames of the clip, in which a man runs to a car and opens its boot. The data was previously collected by other researchers who inserted metal probes, which record electrical pulses from neurons, into the mice's primary visual cortices, the brain area involved in processing visual information. Some brain activity data was also collected by imaging the mice's brains with a microscope. Next, Mathis and her team tested the ability of their trained AI to predict the order of frames within the clip, using brain activity data collected from the mice as they watched the movie for the tenth time.
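A minimal way to picture the decoding step is a nearest-neighbor frame decoder: given a population activity vector from the held-out tenth viewing, predict which of the 600 frames the mouse was watching by finding the most similar training response. The study itself decoded from learned latent embeddings rather than raw activity; the synthetic-data sketch below is only meant to convey the idea.

    import numpy as np

    rng = np.random.default_rng(2)
    n_frames, n_neurons = 600, 200

    # Synthetic stand-in: each frame evokes a characteristic population vector.
    templates = rng.normal(size=(n_frames, n_neurons))
    train = templates + 0.3 * rng.normal(size=(n_frames, n_neurons))  # earlier viewings
    test = templates + 0.3 * rng.normal(size=(n_frames, n_neurons))   # held-out viewing

    def decode_frame(activity, bank):
        """Return the index of the training frame with the nearest response."""
        return int(np.argmin(np.linalg.norm(bank - activity, axis=1)))

    pred = np.array([decode_frame(x, train) for x in test])
    print("frame identification accuracy:", (pred == np.arange(n_frames)).mean())

Scoring how well the predicted frame indices follow the clip's true order is then a direct test of whether the decoded signal tracks the movie over time.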


Artificial intelligence and big data to help preserve wildlife

AIHub

A team of experts in artificial intelligence and animal ecology have put forward a new, cross-disciplinary approach intended to enhance research on wildlife species and make more effective use of the vast amounts of data now being collected thanks to new technology. Their study appears in Nature Communications. The field of animal ecology has entered the era of big data and the Internet of Things. Unprecedented amounts of data are now being collected on wildlife populations, thanks to sophisticated technology such as satellites, drones and terrestrial devices like automatic cameras and sensors placed on animals or in their surroundings. These data have become so easy to acquire and share that they have reduced the distances researchers must travel and the time fieldwork requires, while minimizing the disruptive presence of humans in natural habitats.