Rauch, Lukas
BirdSet: A Dataset and Benchmark for Classification in Avian Bioacoustics
Rauch, Lukas, Schwinger, Raphael, Wirth, Moritz, Heinrich, René, Huseljic, Denis, Lange, Jonas, Kahl, Stefan, Sick, Bernhard, Tomforde, Sven, Scholz, Christoph
Deep learning (DL) models have emerged as a powerful tool in avian bioacoustics to assess environmental health. To maximize the potential of cost-effective and minimally invasive passive acoustic monitoring (PAM), DL models must analyze bird vocalizations across a wide range of species and environmental conditions. However, data fragmentation hinders a comprehensive evaluation of generalization performance. Therefore, we introduce the BirdSet dataset, comprising approximately 520,000 global bird recordings for training and over 400 hours of PAM recordings for testing. Our benchmark offers baselines for several DL models to enhance comparability and consolidate research across studies, along with code implementations that include comprehensive training and evaluation protocols.
Fast Fishing: Approximating BAIT for Efficient and Scalable Deep Active Image Classification
Huseljic, Denis, Hahn, Paul, Herde, Marek, Rauch, Lukas, Sick, Bernhard
Deep active learning (AL) seeks to minimize the annotation costs for training deep neural networks. Bait, a recently proposed AL strategy based on the Fisher Information, has demonstrated impressive performance across various datasets. However, Bait's high computational and memory requirements hinder its applicability to large-scale classification tasks, leading current research to neglect Bait in its evaluations. This paper introduces two methods to enhance Bait's computational efficiency and scalability. Notably, we significantly reduce its time complexity by approximating the Fisher Information. In particular, we adapt the original formulation by i) taking the expectation over the most probable classes, and ii) constructing a binary classification task, leading to an alternative likelihood for gradient computations. Consequently, this allows the efficient use of Bait on large-scale datasets, including ImageNet. Our unified and comprehensive evaluation across a variety of datasets demonstrates that our approximations achieve strong performance with considerably reduced time complexity.
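The first approximation above (taking the expectation over only the most probable classes) can be illustrated with a minimal sketch. This is not the paper's implementation: the function name `topk_fisher_embeddings`, the restriction to last-layer gradients, and the renormalization over the top-k classes are simplifying assumptions for illustration only.

```python
import numpy as np

def topk_fisher_embeddings(probs, penultimate, k=2):
    """Sketch: per-sample gradient embeddings where the expectation over
    class labels is restricted to the top-k most probable classes, instead
    of all C classes (reducing the cost from O(C) to O(k) per sample).

    probs:       (n, C) softmax outputs
    penultimate: (n, d) penultimate-layer features
    returns:     (n, k * d) approximate Fisher embeddings
    """
    n, _ = probs.shape
    topk = np.argsort(probs, axis=1)[:, -k:]  # indices of the k most probable classes
    embs = []
    for i in range(n):
        p = probs[i, topk[i]]
        p = p / p.sum()  # renormalize over the retained classes
        # last-layer log-likelihood gradient for hypothetical label j: (one_hot_j - p) ⊗ h
        grads = [(np.eye(k)[j] - p)[:, None] * penultimate[i][None, :]
                 for j in range(k)]
        # expectation over the top-k classes, weighted by their probabilities
        emb = sum(pj * g.ravel() for pj, g in zip(p, grads))
        embs.append(emb)
    return np.stack(embs)
```

The point of the restriction is that for confident models the probability mass concentrates on few classes, so the truncated expectation changes the embedding little while cutting both time and memory.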
Active Bird2Vec: Towards End-to-End Bird Sound Monitoring with Transformers
Rauch, Lukas, Schwinger, Raphael, Wirth, Moritz, Sick, Bernhard, Tomforde, Sven, Scholz, Christoph
We propose a shift towards end-to-end learning in bird sound monitoring by combining self-supervised learning (SSL) and deep active learning (DAL). Leveraging transformer models, we aim to bypass traditional spectrogram conversions, enabling direct raw audio processing. ActiveBird2Vec is set to generate high-quality bird sound representations through SSL, potentially accelerating the assessment of environmental changes and decision-making processes for wind farms. Additionally, we seek to utilize the wide variety of bird vocalizations through DAL, reducing the reliance on extensively labeled datasets by human experts. We plan to curate a comprehensive set of tasks through Huggingface Datasets, enhancing future comparability and reproducibility of bioacoustic research. A comparative analysis between various transformer models will be conducted to evaluate their proficiency in bird sound recognition tasks. We aim to accelerate the progression of avian bioacoustic research and contribute to more effective conservation strategies.
DADO -- Low-Cost Query Strategies for Deep Active Design Optimization
Decke, Jens, Gruhl, Christian, Rauch, Lukas, Sick, Bernhard
In this experience report, we apply deep active learning to the field of design optimization to reduce the number of computationally expensive numerical simulations. We are interested in optimizing the design of structural components, where the shape is described by a set of parameters. If we can predict the performance based on these parameters and consider only the promising candidates for simulation, there is enormous potential for saving computing power. We present two selection strategies for self-optimization to reduce the computational cost in multi-objective design optimization problems. Our proposed methodology provides an intuitive approach that is easy to apply, offers significant improvements over random sampling, and circumvents the need for uncertainty estimation. We evaluate our strategies on a large dataset from the domain of fluid dynamics and introduce two new evaluation metrics to determine the model's performance. Findings from our evaluation highlight the effectiveness of our selection strategies in accelerating design optimization. We believe that the introduced method is easily transferable to other self-optimization problems.
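The core idea, predicting performance from the design parameters and forwarding only promising candidates to the expensive simulation, can be sketched as follows. This is not the paper's strategy: the function name `select_promising`, the linear least-squares surrogate, and the minimization convention are illustrative assumptions.

```python
import numpy as np

def select_promising(simulated_x, simulated_y, pool_x, budget=10):
    """Sketch: fit a cheap surrogate on already-simulated designs and
    return the indices of the pool designs with the best (lowest)
    predicted objective, so only those are sent to the simulator.

    simulated_x: (m, p) parameters of designs already simulated
    simulated_y: (m,)   objective values from those simulations
    pool_x:      (n, p) candidate designs not yet simulated
    """
    # linear least-squares surrogate y ≈ X w, with a bias column
    X = np.hstack([simulated_x, np.ones((len(simulated_x), 1))])
    w, *_ = np.linalg.lstsq(X, simulated_y, rcond=None)
    P = np.hstack([pool_x, np.ones((len(pool_x), 1))])
    pred = P @ w
    # lower predicted objective = more promising (minimization)
    return np.argsort(pred)[:budget]
```

Each selection round would simulate the returned candidates, append the results to the simulated set, and refit, so the surrogate improves exactly where the optimization focuses.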
ActiveGLAE: A Benchmark for Deep Active Learning with Transformers
Rauch, Lukas, Aßenmacher, Matthias, Huseljic, Denis, Wirth, Moritz, Bischl, Bernd, Sick, Bernhard
Deep active learning (DAL) seeks to reduce annotation costs by enabling the model to actively query instance annotations from which it expects to learn the most. Despite extensive research, there is currently no standardized evaluation protocol for transformer-based language models in the field of DAL. Diverse experimental settings lead to difficulties in comparing research and deriving recommendations for practitioners. To tackle this challenge, we propose the ActiveGLAE benchmark, a comprehensive collection of data sets and evaluation guidelines for assessing DAL. Our benchmark aims to facilitate and streamline the evaluation process of novel DAL strategies. Additionally, we provide an extensive overview of current practice in DAL with transformer-based language models. We identify three key challenges - data set selection, model training, and DAL settings - that pose difficulties in comparing query strategies. We establish baseline results through an extensive set of experiments as a reference point for evaluating future work. Based on our findings, we provide guidelines for researchers and practitioners.
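The DAL query step described above, selecting the unlabeled instances from which the model expects to learn the most, can be illustrated with one of the simplest strategies, entropy-based uncertainty sampling. This is a generic illustration, not a strategy from the benchmark; the function name `entropy_query` is hypothetical.

```python
import numpy as np

def entropy_query(probs, batch_size=8):
    """Sketch: rank unlabeled instances by predictive entropy and return
    the indices of the batch_size most uncertain ones for annotation.

    probs: (n, C) predicted class probabilities on the unlabeled pool
    """
    eps = 1e-12  # avoid log(0) for confident predictions
    entropy = -(probs * np.log(probs + eps)).sum(axis=1)
    return np.argsort(entropy)[-batch_size:]
```

In a full DAL loop, the returned indices would be sent to an annotator, the labeled set extended, and the model retrained; the benchmark's point is that exactly these loop details (model training, DAL settings) must be standardized for strategies to be comparable.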