
He'd need some LARGE SquarePants: Footage of a sea star with a 'big bottom' sparks hilarity as it's compared to SpongeBob's Patrick

Daily Mail - Science & tech

The sea floor is home to all sorts of weird and wonderful creatures. But one in particular has become an online sensation, thanks to its impressive 'buttocks'. A big-bottomed sea star has been spotted more than 1,000 metres (3,280ft) below the waves. And it appears to have a backside that will make even the most avid gymgoer jealous. This has led many baffled viewers to compare the creature to Patrick from the animated series SpongeBob SquarePants.


Flies disguised as wasps can't fool birds

Popular Science

Despite their bee-like appearance, hoverflies are all buzz, no bite. The harmless insects, more closely related to midges than wasps, imitate their distant stinging cousins with stripes, high-contrast colors, and narrow waists. In theory, the "flies in wasps' clothing" use this strategy to ward off would-be predators, without having to pay the cost of evolving venom and an appendage to inject it. The quality of hoverfly mimicry can vary, from detailed disguises to the insect equivalent of slapping on a pair of cat ears for a Halloween party.


SuoiAI: Building a Dataset for Aquatic Invertebrates in Vietnam

Vo, Tue, Sharma, Lakshay, Dinh, Tuan, Dinh, Khuong, Nguyen, Trang, Phan, Trung, Do, Minh, Vu, Duong

arXiv.org Artificial Intelligence

Understanding and monitoring aquatic biodiversity is critical for ecological health and conservation efforts. This paper proposes SuoiAI, an end-to-end pipeline for building a dataset of aquatic invertebrates in Vietnam and employing machine learning (ML) techniques for species classification. We outline the methods for data collection, annotation, and model training, focusing on reducing annotation effort through semi-supervised learning and leveraging state-of-the-art object detection and classification models. Our approach aims to overcome challenges such as data scarcity, fine-grained classification, and deployment in diverse environmental conditions.
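Semi-supervised learning reduces annotation effort through pseudo-labelling: a model trained on the small labelled set predicts on unlabelled images, and only high-confidence predictions are fed back as training labels. A minimal sketch of that filtering step, where the function name, the 0.9 threshold, and the probability-dict format are illustrative assumptions rather than SuoiAI's actual pipeline:

```python
def pseudo_label(unlabeled_scores, threshold=0.9):
    """Keep a model's prediction on an unlabelled image only when the
    top class probability clears a confidence threshold; the surviving
    (image_id, label) pairs then augment the training set."""
    kept = []
    for image_id, probs in unlabeled_scores.items():
        label, conf = max(probs.items(), key=lambda kv: kv[1])
        if conf >= threshold:
            kept.append((image_id, label))
    return kept
```

Raising the threshold trades pseudo-label quantity for quality, which matters for fine-grained classes that are easily confused.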


BarcodeMamba: State Space Models for Biodiversity Analysis

Gao, Tiancheng, Taylor, Graham W.

arXiv.org Artificial Intelligence

DNA barcodes are crucial in biodiversity analysis for building automatic identification systems that recognize known species and discover unseen species. Unlike human genome modeling, barcode-based invertebrate identification poses challenges in the vast diversity of species and taxonomic complexity. Among Transformer-based foundation models, BarcodeBERT excelled in species-level identification of invertebrates, highlighting the effectiveness of self-supervised pretraining on barcode-specific datasets. Recently, structured state space models (SSMs) have emerged, with a time complexity that scales sub-quadratically with the context length. SSMs provide an efficient parameterization of sequence modeling relative to attention-based architectures. Given the success of Mamba and Mamba-2 in natural language, we designed BarcodeMamba, a performant and efficient foundation model for DNA barcodes in biodiversity analysis. We conducted a comprehensive ablation study on the impacts of self-supervised training and tokenization methods, and compared both versions of Mamba layers in terms of expressiveness and their capacity to identify "unseen" species held back from training. Our study shows that BarcodeMamba outperforms BarcodeBERT even when using only 8.3% as many parameters, and improves species-level accuracy to 99.2% in linear probing without fine-tuning for "seen" species. In our scaling study, BarcodeMamba with 63.6% of BarcodeBERT's parameters achieved 70.2% genus-level accuracy in 1-nearest neighbor (1-NN) probing for unseen species. The code repository to reproduce our experiments is available at https://github.com/bioscan-ml/BarcodeMamba.
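The 1-NN probing protocol for unseen species amounts to embedding each barcode with the frozen model and assigning every held-out sequence the label of its nearest training embedding. A minimal NumPy sketch of that evaluation step, assuming cosine similarity; the function name and distance choice are illustrative, not BarcodeMamba's actual evaluation code:

```python
import numpy as np

def one_nn_probe(train_emb, train_labels, test_emb, test_labels):
    """Assign each test embedding the label of its nearest training
    embedding (cosine similarity) and report accuracy."""
    # L2-normalise rows so a dot product equals cosine similarity
    train = train_emb / np.linalg.norm(train_emb, axis=1, keepdims=True)
    test = test_emb / np.linalg.norm(test_emb, axis=1, keepdims=True)
    nearest = np.argmax(test @ train.T, axis=1)
    preds = np.asarray(train_labels)[nearest]
    return float(np.mean(preds == np.asarray(test_labels)))
```

Because no classifier head is trained, 1-NN probing directly measures how well the frozen embeddings cluster taxa the model never saw.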


BarcodeBERT: Transformers for Biodiversity Analysis

Arias, Pablo Millan, Sadjadi, Niousha, Safari, Monireh, Gong, ZeMing, Wang, Austin T., Lowe, Scott C., Haurum, Joakim Bruslund, Zarubiieva, Iuliia, Steinke, Dirk, Kari, Lila, Chang, Angel X., Taylor, Graham W.

arXiv.org Artificial Intelligence

Understanding biodiversity is a global challenge, in which DNA barcodes - short snippets of DNA that cluster by species - play a pivotal role. In particular, invertebrates, a highly diverse and under-explored group, pose unique taxonomic complexities. We explore machine learning approaches, comparing supervised CNNs, fine-tuned foundation models, and a DNA barcode-specific masking strategy across datasets of varying complexity. While simpler datasets and tasks favor supervised CNNs or fine-tuned transformers, challenging species-level identification demands a paradigm shift towards self-supervised pretraining. We propose BarcodeBERT, the first self-supervised method for general biodiversity analysis, leveraging a 1.5 M invertebrate DNA barcode reference library. This work highlights how dataset specifics and coverage impact model selection, and underscores the role of self-supervised pretraining in achieving high-accuracy DNA barcode-based identification at the species and genus level. Indeed, without the fine-tuning step, BarcodeBERT pretrained on a large DNA barcode dataset outperforms DNABERT and DNABERT-2 on multiple downstream classification tasks. The code repository is available at https://github.com/Kari-Genomics-Lab/BarcodeBERT
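The barcode-specific masking strategy resembles BERT-style masked token prediction over DNA k-mers: split each barcode into tokens, hide a fraction, and train the model to reconstruct them. The sketch below shows one plausible tokenize-and-mask step; the helper names, k=4, and the 15% mask rate are assumptions for illustration, not the paper's exact recipe:

```python
import random

def kmer_tokenize(seq, k=4):
    """Split a DNA barcode into non-overlapping k-mers (any trailing
    remainder shorter than k is dropped)."""
    return [seq[i:i + k] for i in range(0, len(seq) - k + 1, k)]

def mask_tokens(tokens, mask_rate=0.15, rng=None):
    """Replace a random subset of tokens with [MASK]; return the masked
    sequence and the indices the model must reconstruct."""
    rng = rng or random.Random(0)
    n = max(1, int(len(tokens) * mask_rate))
    idx = sorted(rng.sample(range(len(tokens)), n))
    masked = list(tokens)
    for i in idx:
        masked[i] = "[MASK]"
    return masked, idx
```

The pretraining loss is then computed only at the masked positions, so no species labels are needed, which is what lets the method exploit the full 1.5 M-barcode reference library.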


Why invertebrates should be included in animal welfare protections

New Scientist

FRANKLIN the cuttlefish considered the juicy morsel of prawn meat in front of her. As mouth-watering as it looked, she resisted temptation and waited for her favourite meal to become available – live shrimp. Her self-control is impressive and comparable to what we see in chimpanzees and crows. Self-control is a vital cognitive skill that underpins decision-making and future planning. In humans, these abilities are linked to sentience because they are thought to involve conscious experience.


Automatic image-based identification and biomass estimation of invertebrates

Ärje, Johanna, Melvad, Claus, Jeppesen, Mads Rosenhøj, Madsen, Sigurd Agerskov, Raitoharju, Jenni, Rasmussen, Maria Strandgård, Iosifidis, Alexandros, Tirronen, Ville, Meissner, Kristian, Gabbouj, Moncef, Høye, Toke Thomas

arXiv.org Machine Learning

Understanding how biological communities respond to environmental changes is a key challenge in ecology and ecosystem management. The apparent decline of insect populations necessitates more biomonitoring, but the time-consuming sorting and identification of taxa pose strong limitations on how many insect samples can be processed. In turn, this affects the scale of efforts to map invertebrate diversity altogether. Given recent advances in computer vision, we propose to replace the standard manual approach of human expert-based sorting and identification with an automatic image-based technology. We describe a robot-enabled image-based identification machine, which can automate the process of invertebrate identification, biomass estimation and sample sorting. We use the imaging device to generate a comprehensive image database of terrestrial arthropod species. We use this database to test the classification accuracy, i.e. how well the species identity of a specimen can be predicted from images taken by the machine. We also test the sensitivity of the classification accuracy to the camera settings (aperture and exposure time) in order to move forward with the best possible image quality. We use state-of-the-art Resnet-50 and InceptionV3 CNNs for the classification task. The results for the initial dataset are very promising ($\overline{ACC}=0.980$). The system is general and can easily be used for other groups of invertebrates as well. As such, our results pave the way for generating more data on spatial and temporal variation in invertebrate abundance, diversity and biomass.
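The reported $\overline{ACC}$ suggests an accuracy averaged across taxa. Assuming it denotes mean per-class accuracy (that reading is an assumption; the paper defines the metric precisely), it can be computed so that rare species count as much as abundant ones:

```python
from collections import defaultdict

def mean_class_accuracy(y_true, y_pred):
    """Per-class accuracy averaged over classes, so a species with few
    specimens weighs as much as an abundant one."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        correct[t] += int(t == p)
    return sum(correct[c] / total[c] for c in total) / len(total)
```

This is the natural choice for bulk invertebrate samples, where overall accuracy would otherwise be dominated by a handful of common taxa.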


AstraZeneca and research partners drive wider application of machine learning to better understand the impact of chemicals on the environment

#artificialintelligence

Today, together with researchers at King's College London, the Universities of Northumbria and Suffolk, and the Francis Crick Institute, AstraZeneca published a paper in Environmental Science and Technology calling for the wider application of machine learning in environmental toxicology research, to reduce reliance on animal testing and better meet the future challenges of scientific discovery. Environmental Protection, together with Access to Healthcare and Ethics and Transparency, is a key priority of the approach to sustainability at AstraZeneca. Our scientific approach to environmental sustainability reduces our environmental impact by protecting our air, land and water, reducing our dependence on natural resources and ensuring the environmental safety of our products. This publication is the result of an ongoing collaboration between AstraZeneca and academic partners, who have been working together to see how machine learning can help us to better understand the impact of chemicals on the environment. Pollution from contaminants continues to be a cause for concern, not only for the environment but also for public health. To understand the effects of this pollution, the research team wanted to look specifically at how chemicals can accumulate in fish and invertebrates.


Bottlenose dolphins are able to work together as a team with 'extreme precision'

Daily Mail - Science & tech

New research suggests dolphins are even smarter than first thought and can coordinate their behaviour with one another with 'extreme precision'. In a new experiment, bottlenose dolphins had to press an underwater button at the same time as their partner. Scientists found they could synchronise their actions almost perfectly. The marine mammals were working so closely they pressed the button within an average of 370 milliseconds of their partner doing the same. Pictured is a triple synchronous dive by a trio of male bottlenose dolphins.


Army Building "Self-Aware" Squid-Like Robot That Can Be "3-D Printed" During Combat

#artificialintelligence

The Army Research Laboratory's next robot weapon isn't a new Predator drone or even a robot dog like the infamous prototype developed by Boston Dynamics. Instead, it's a "self-aware" robot built from flexible materials inspired by invertebrates like squid, the Army Times reports. But in addition to its advanced machine-learning capabilities, the material used to build the robots is so lightweight and malleable that soldiers will be able to "print" the robots on the battlefield, then control them with controllers that send electric currents through the materials. In case you weren't already terrified of robots that can jump over walls, fly or crawl, Army researchers are developing your next nightmare – a flexible, soft robot inspired by squid and other invertebrates. And they want soldiers to be able to use 3D printers to make them on the battlefield.
