Researchers Create 3D-Printed Artificial Skin That Allows Blood Circulation
Swedish researchers have developed two types of 3D bioprinting technology to artificially generate skin containing blood vessels. It could be a breakthrough in the quest to regenerate damaged skin. When treating severe burns and trauma, skin regeneration can be a matter of life or death. Extensive burns are usually treated by transplanting a thin layer of epidermis, the top layer of skin, from elsewhere on the body. However, this method not only leaves large scars but also fails to restore the skin to its original functional state.
- Europe > Sweden > Östergötland County > Linköping (0.07)
- Asia > China (0.07)
- South America (0.05)
- (6 more...)
Design of a Five-Fingered Hand with Full-Fingered Tactile Sensors Using Conductive Filaments and Its Application to Bending after Insertion Motion
Miyama, Kazuhiro, Hasegawa, Shun, Kawaharazuka, Kento, Yamaguchi, Naoya, Okada, Kei, Inaba, Masayuki
Abstract -- The purpose of this study is to construct a contact point estimation system for both sides of a finger, and to realize a motion of bending the finger after inserting it into a tool (hereinafter referred to as the bending after insertion motion). In order to know the contact points of the full finger, including the joints, we propose fabricating a nerve-inclusion flexible epidermis by combining a flexible epidermis with a nerve line made of conductive filaments, and estimating the contact position from the change in resistance of the nerve line. A nerve-inclusion flexible epidermis attached to a thin-fingered robotic hand was combined with a twin-armed robot, and tool-use experiments were conducted. The contact information can be used for tool use, confirming the effectiveness of the proposed method.
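The abstract's core sensing idea (estimating a contact position from a resistance change along a conductive-filament "nerve line") can be sketched as follows. This is an illustrative model only, not the authors' code: it assumes the filament behaves as a linear resistive divider, so resistance measured from one end grows roughly linearly with distance to the contact point.

```python
# Hypothetical sketch: locating a contact point along a conductive-filament
# nerve line from its measured resistance. Assumes a linear resistive divider:
# resistance from the proximal end is proportional to distance to the contact.

def estimate_contact_position(r_measured, r_total, line_length_mm):
    """Return the distance (mm) from the filament's proximal end to the
    estimated contact point, clamped to the physical line length."""
    fraction = r_measured / r_total          # fraction of the line before contact
    position = fraction * line_length_mm
    return max(0.0, min(line_length_mm, position))

# Example: a 100 mm nerve line with 2.0 kOhm total resistance; a reading of
# 0.5 kOhm from the proximal end places the contact a quarter of the way along.
print(estimate_contact_position(0.5, 2.0, 100.0))  # → 25.0
```

The clamping step reflects that a real sensor reading outside the calibrated range should saturate at the line's physical ends rather than report an impossible position.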
Adapting Segment Anything Model to Melanoma Segmentation in Microscopy Slide Images
Melanoma segmentation in Whole Slide Images (WSIs) is useful for prognosis and the measurement of crucial prognostic factors such as Breslow depth and primary invasive tumor size. In this paper, we present a novel approach that uses the Segment Anything Model (SAM) for automatic melanoma segmentation in microscopy slide images. Our method employs an initial semantic segmentation model to generate preliminary segmentation masks that are then used to prompt SAM. We design a dynamic prompting strategy that uses a combination of centroid and grid prompts to achieve optimal coverage of the super high-resolution slide images while maintaining the quality of generated prompts. To optimize for invasive melanoma segmentation, we further refine the prompt generation process by implementing in-situ melanoma detection and low-confidence region filtering. We select Segformer as the initial segmentation model and EfficientSAM as the segment anything model for parameter-efficient fine-tuning. Our experimental results demonstrate that this approach not only surpasses other state-of-the-art melanoma segmentation methods but also significantly outperforms the baseline Segformer by 9.1% in terms of IoU.
- Health & Medicine > Therapeutic Area > Oncology > Skin Cancer (1.00)
- Health & Medicine > Therapeutic Area > Dermatology (1.00)
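The dynamic prompting strategy described above, combining a centroid prompt with grid prompts derived from a preliminary mask, can be sketched roughly as below. This is an illustrative reconstruction, not the authors' implementation; the function name and grid step are invented for the example.

```python
import numpy as np

# Illustrative sketch (not the paper's code): turning a preliminary
# segmentation mask into point prompts for a SAM-style model, combining the
# mask centroid with a coarse grid of points restricted to the mask region.

def make_prompts(mask, grid_step=64):
    """mask: 2D boolean array. Returns an (N, 2) array of (row, col) prompts:
    the mask centroid followed by grid points that fall inside the mask."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return np.empty((0, 2), dtype=int)
    centroid = np.array([[int(ys.mean()), int(xs.mean())]])
    rr, cc = np.meshgrid(np.arange(0, mask.shape[0], grid_step),
                         np.arange(0, mask.shape[1], grid_step),
                         indexing="ij")
    grid = np.stack([rr.ravel(), cc.ravel()], axis=1)
    grid = grid[mask[grid[:, 0], grid[:, 1]]]   # keep grid points on the mask
    return np.vstack([centroid, grid])

# Toy example: a rectangular preliminary mask on a 256x256 tile.
mask = np.zeros((256, 256), dtype=bool)
mask[100:200, 50:150] = True
prompts = make_prompts(mask)
```

For super high-resolution slides, the grid step trades off prompt coverage against the number of model calls, which is the tension the paper's dynamic strategy addresses.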
S-SYNTH: Knowledge-Based, Synthetic Generation of Skin Images
Kim, Andrea, Saharkhiz, Niloufar, Sizikova, Elena, Lago, Miguel, Sahiner, Berkman, Delfino, Jana, Badano, Aldo
Development of artificial intelligence (AI) techniques in medical imaging requires access to large-scale and diverse datasets for training and evaluation. In dermatology, obtaining such datasets remains challenging due to significant variations in patient populations, illumination conditions, and acquisition system characteristics. In this work, we propose S-SYNTH, the first knowledge-based, adaptable open-source skin simulation framework to rapidly generate synthetic skin 3D models and digitally rendered images, using an anatomically inspired multi-layer, multi-component skin and growing lesion model. The skin model allows for controlled variation in skin appearance, such as skin color, presence of hair, lesion shape, and blood fraction, among other parameters. We use this framework to study the effect of possible variations on the development and evaluation of AI models for skin lesion segmentation, and show that results obtained using synthetic data follow comparative trends similar to those of real dermatologic images, while mitigating biases and limitations of existing datasets, including small dataset size, lack of diversity, and underrepresentation.
- Health & Medicine > Therapeutic Area > Dermatology (1.00)
- Health & Medicine > Diagnostic Medicine > Imaging (1.00)
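The controlled parameter variation the S-SYNTH abstract describes can be illustrated with a small sampling sketch. The parameter names and ranges below are invented for the example and are not taken from S-SYNTH; they only show the general pattern of drawing randomized configurations for a layered skin model.

```python
import random

# Hypothetical illustration of controlled variation in a knowledge-based skin
# simulator: each call draws one configuration of appearance parameters.
# Parameter names and ranges are assumptions, not S-SYNTH's actual interface.

def sample_skin_config(rng):
    return {
        "melanosome_fraction": rng.uniform(0.01, 0.40),  # drives skin color
        "blood_fraction": rng.uniform(0.002, 0.07),
        "hair_present": rng.random() < 0.5,
        "lesion_shape": rng.choice(["round", "irregular", "lobular"]),
    }

# A seeded generator makes a synthetic dataset sweep reproducible.
rng = random.Random(0)
configs = [sample_skin_config(rng) for _ in range(100)]
```

Sweeping one parameter while holding the others fixed is what lets such a framework isolate how, say, skin color or blood fraction affects a downstream segmentation model.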
Towards Highly Expressive Machine Learning Models of Non-Melanoma Skin Cancer
Thomas, Simon M., Lefevre, James G., Baxter, Glenn, Hamilton, Nicholas A.
Pathologists have a rich vocabulary with which they can describe all the nuances of cellular morphology. In their world, there is a natural pairing of images and words. Recent advances demonstrate that machine learning models can now be trained to learn high-quality image features and represent them as discrete units of information. This enables natural language, which is also discrete, to be jointly modelled alongside the imaging, resulting in a description of the contents of the imaging. Here we present experiments in applying discrete modelling techniques to the problem domain of non-melanoma skin cancer, specifically, histological images of Intraepidermal Carcinoma (IEC). Implementing a VQ-GAN model to reconstruct high-resolution (256x256) images of IEC, we trained a sequence-to-sequence transformer to generate natural language descriptions using pathologist terminology. Combined with the idea of interactive concept vectors available by using continuous generative methods, we demonstrate an additional angle of interpretability. The result is a promising step towards highly expressive machine learning systems that are useful not only as predictive/classification tools but also as a means to further our scientific understanding of disease.
- Oceania > Australia > Queensland (0.04)
- North America > United States > Washington > King County > Seattle (0.04)
- North America > United States > California > Sonoma County > Santa Rosa (0.04)
- Europe > Netherlands > North Holland > Amsterdam (0.04)
- Health & Medicine > Therapeutic Area > Oncology > Skin Cancer (1.00)
- Health & Medicine > Therapeutic Area > Dermatology (1.00)
Using imaging and machine learning tools to analyse features of plant leaves
Andrew Leakey, Jiayang (Kevin) Xie and their colleagues developed an improved method for analyzing features of plant leaves that contribute to water-use efficiency in crops like corn, sorghum and Setaria. They used advanced statistical approaches to identify regions of the genome and lists of genes that contribute to these traits. Scientists have developed and deployed a series of new imaging and machine learning tools to discover attributes that contribute to water-use efficiency in crop plants during photosynthesis and to reveal the genetic basis of variation in those traits. The findings are described in a series of four research papers led by University of Illinois Urbana-Champaign graduate students Jiayang (Kevin) Xie and Parthiban Prakash, and postdoctoral researchers John Ferguson, Samuel Fernandes and Charles Pignon. The goal is to breed or engineer crops that are better at conserving water without sacrificing yield, said Andrew Leakey, a professor of plant biology and of crop sciences at the University of Illinois Urbana-Champaign, who directed the research.