Collaborating Authors

 Rymarczyk, Dawid


ProtoMIL: Multiple Instance Learning with Prototypical Parts for Fine-Grained Interpretability

arXiv.org Artificial Intelligence

Multiple Instance Learning (MIL) has gained popularity in many real-life machine learning applications due to its weakly supervised nature. However, the corresponding effort on explaining MIL models lags behind, and it is usually limited to presenting the instances of a bag that are crucial for a particular prediction. In this paper, we fill this gap by introducing ProtoMIL, a novel self-explainable MIL method inspired by the case-based reasoning process that operates on visual prototypes. Thanks to incorporating prototypical features into object descriptions, ProtoMIL combines model accuracy and fine-grained interpretability to an unprecedented degree, which we demonstrate through experiments on five recognized MIL datasets.
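
The sketch below illustrates, in PyTorch, the general idea behind prototype-based bag classification: per-instance features are compared against learned prototype vectors, the best match per prototype over the bag becomes a bag-level activation, and a linear layer maps these activations to class logits. It is a minimal sketch under assumed dimensions and a squared-distance similarity, not the authors' ProtoMIL implementation.

```python
# Minimal sketch of prototype-based MIL scoring (illustrative, not ProtoMIL itself).
import torch
import torch.nn as nn


class PrototypeMILHead(nn.Module):
    def __init__(self, feat_dim=512, num_prototypes=10, num_classes=2):
        super().__init__()
        # Learnable prototype vectors living in the instance feature space.
        self.prototypes = nn.Parameter(torch.randn(num_prototypes, feat_dim))
        self.classifier = nn.Linear(num_prototypes, num_classes)

    def forward(self, instance_feats):  # (num_instances, feat_dim)
        # Similarity of every instance to every prototype (negative squared distance).
        dists = torch.cdist(instance_feats, self.prototypes)  # (N, P)
        sims = -dists.pow(2)
        # Bag-level prototype activations: the best-matching instance per prototype.
        bag_sims, _ = sims.max(dim=0)  # (P,)
        return self.classifier(bag_sims)  # class logits for the whole bag


# Usage with an arbitrary backbone producing per-instance embeddings:
feats = torch.randn(37, 512)          # a bag of 37 instance embeddings
logits = PrototypeMILHead()(feats)
```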


Classifying bacteria clones using attention-based deep multiple instance learning interpreted by persistence homology

arXiv.org Artificial Intelligence

In this work, we analyze whether it is possible to distinguish between different clones of the same bacteria species (Klebsiella pneumoniae) based only on microscopic images. It is a challenging task, previously considered impossible due to the high similarity of the clones. For this purpose, we apply a multi-step algorithm with attention-based multiple instance learning. In addition to obtaining an accuracy at the level of 0.9, we introduce extensive interpretability based on CellProfiler and persistence homology, increasing the understandability of and trust in the model.
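
For readers unfamiliar with attention-based MIL pooling, the following PyTorch sketch shows the generic mechanism this line of work builds on: a small scoring network assigns a weight to each instance, and the bag representation is the weighted average of instance embeddings. Dimensions and layer sizes are illustrative assumptions; the paper's full multi-step pipeline is not reproduced here.

```python
# Generic attention-based MIL pooling (illustrative sketch only).
import torch
import torch.nn as nn


class AttentionMILPooling(nn.Module):
    def __init__(self, feat_dim=256, attn_dim=128, num_classes=2):
        super().__init__()
        # Two-layer scoring network producing one attention score per instance.
        self.attention = nn.Sequential(
            nn.Linear(feat_dim, attn_dim),
            nn.Tanh(),
            nn.Linear(attn_dim, 1),
        )
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, instance_feats):  # (num_instances, feat_dim)
        scores = self.attention(instance_feats)            # (N, 1)
        weights = torch.softmax(scores, dim=0)             # attention over the bag
        bag_feat = (weights * instance_feats).sum(dim=0)   # weighted average embedding
        return self.classifier(bag_feat), weights.squeeze(-1)
```

The returned weights are what makes the model interpretable at the instance level: they indicate which images in the bag drove the bag-level prediction.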


ProtoPShare: Prototype Sharing for Interpretable Image Classification and Similarity Discovery

arXiv.org Artificial Intelligence

In this paper, we introduce ProtoPShare, a self-explainable method that incorporates the paradigm of prototypical parts to explain its predictions. The main novelty of ProtoPShare is its ability to efficiently share prototypical parts between classes thanks to our data-dependent merge-pruning. Moreover, its prototypes are more consistent and the model is more robust to image perturbations than the state-of-the-art method ProtoPNet. We verify our findings on two datasets, CUB-200-2011 and Stanford Cars.
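
A rough sketch of the merge-pruning idea follows: prototypes whose activation patterns on training patches are nearly identical get merged, and the classifier weights of the removed prototype are transferred to the kept one, which is then shared across classes. The similarity criterion and threshold below are illustrative assumptions, not the exact data-dependent measure from the paper.

```python
# Illustrative merge-pruning of near-duplicate prototypes (not the ProtoPShare code).
import torch


def merge_similar_prototypes(prototypes, class_weights, activations, threshold=0.95):
    """prototypes: (P, D), class_weights: (C, P),
    activations: (P, M) prototype responses on M training patches."""
    # Cosine similarity between activation patterns (data-dependent, not weight-space).
    act = torch.nn.functional.normalize(activations, dim=1)
    sim = act @ act.t()                                      # (P, P)
    keep = torch.ones(len(prototypes), dtype=torch.bool)
    for i in range(len(prototypes)):
        if not keep[i]:
            continue
        for j in range(i + 1, len(prototypes)):
            if keep[j] and sim[i, j] > threshold:
                keep[j] = False
                class_weights[:, i] += class_weights[:, j]   # prototype i now serves both classes
    return prototypes[keep], class_weights[:, keep]
```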


Kernel Self-Attention in Deep Multiple Instance Learning

arXiv.org Machine Learning

Multiple Instance Learning (MIL) is a weakly supervised learning paradigm that assumes only one label is provided for an entire bag of instances. As such, it appears in many problems of medical image analysis, such as the classification of whole-slide biopsy images. Most recently, MIL was also applied to deep architectures by introducing an aggregation operator that focuses on the crucial instances of a bag. In this paper, we enrich this idea with a self-attention mechanism to take into account dependencies across the instances. We conduct several experiments and show that our method with various types of kernels increases accuracy, especially in the case of non-standard MIL assumptions. This is important for real-world medical problems, which usually satisfy presence-based or threshold-based assumptions.
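
The sketch below shows one way to combine self-attention over bag instances with attention pooling: instance embeddings are first enriched with context from the other instances, here using an RBF kernel between queries and keys as one example kernel choice, and then pooled with learned attention weights. The kernel, dimensions, and residual structure are assumptions for illustration and may differ from the architecture in the paper.

```python
# Illustrative kernel self-attention over a MIL bag, followed by attention pooling.
import torch
import torch.nn as nn


class KernelSelfAttentionMIL(nn.Module):
    def __init__(self, feat_dim=256, num_classes=2, gamma=0.1):
        super().__init__()
        self.q = nn.Linear(feat_dim, feat_dim)
        self.k = nn.Linear(feat_dim, feat_dim)
        self.v = nn.Linear(feat_dim, feat_dim)
        self.gamma = gamma
        self.pool = nn.Linear(feat_dim, 1)          # attention scores for pooling
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x):  # (num_instances, feat_dim)
        q, k, v = self.q(x), self.k(x), self.v(x)
        # RBF kernel between queries and keys instead of a scaled dot product.
        attn = torch.softmax(-self.gamma * torch.cdist(q, k).pow(2), dim=-1)
        x = x + attn @ v                            # instances enriched with bag context
        weights = torch.softmax(self.pool(x), dim=0)
        bag = (weights * x).sum(dim=0)              # attention-pooled bag embedding
        return self.classifier(bag)
```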


Deep learning approach to description and classification of fungi microscopic images

arXiv.org Artificial Intelligence

Diagnosis of fungal infections can rely on microscopic examination; however, in many cases it does not allow unambiguous identification of the species due to their visual similarity. Therefore, it is usually necessary to use additional biochemical tests, which involve additional costs and extend the identification process to up to 10 days. Such a delay in the implementation of targeted treatment has grave consequences, as the mortality rate for immunosuppressed patients is high. In this paper, we apply a machine learning approach based on deep learning and bag-of-words to classify microscopic images of various fungi species. Our approach makes the last stage of biochemical identification redundant, shortening the identification process by 2-3 days and reducing the cost of the diagnostic examination.
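
As a rough sketch of a deep-features plus bag-of-words pipeline of this kind: patch descriptors from a pretrained CNN are quantised into a visual codebook, each image becomes a histogram of codeword assignments, and a classical classifier operates on those histograms. The codebook size, classifier, and stand-in random descriptors below are illustrative choices, not the exact setup from the paper.

```python
# Illustrative deep-features + bag-of-words classification pipeline.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Stand-in for CNN patch descriptors: 20 images, 50 patches each, 128-dim features.
patch_feats = [rng.normal(size=(50, 128)) for _ in range(20)]
labels = np.array([0] * 10 + [1] * 10)            # two hypothetical species classes

# Build a visual codebook by clustering all patch descriptors.
codebook = KMeans(n_clusters=16, n_init=10, random_state=0).fit(np.vstack(patch_feats))

def bow_histogram(feats):
    words = codebook.predict(feats)
    hist = np.bincount(words, minlength=16).astype(float)
    return hist / hist.sum()                      # normalised bag-of-words vector

X = np.array([bow_histogram(f) for f in patch_feats])
clf = SVC(kernel="rbf").fit(X, labels)            # image-level classifier on BoW vectors
```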