Spatiotemporal modeling of grip forces captures proficiency in manual robot control
Liu, Rongrong, Wandeto, John M., Nageotte, Florent, Zanne, Philippe, de Mathelin, Michel, Dresp-Langley, Birgitta
This paper builds on our previous work by exploiting Artificial Intelligence to predict individual grip force variability in manual robot control. Grip forces were recorded from various loci in the dominant and non-dominant hands of individuals by means of wearable wireless sensor technology. Statistical analyses bring to the fore skill-specific temporal variations in thousands of grip forces of a complete novice and a highly proficient expert in manual robot control. A brain-inspired neural network model that uses the output metric of a Self-Organizing Map (SOM) with unsupervised winner-take-all learning was run on the sensor output from both hands of each user. The neural network metric expresses the difference between an input representation and its model representation at any given moment in time t, and it reliably captures the differences between novice and expert performance in terms of grip force variability. Functionally motivated spatiotemporal analysis of individual average grip forces, computed for time windows of constant size in the output of a restricted number of task-relevant sensors in the dominant (preferred) hand, reveals finger-specific synergies reflecting robotic task skill. The analyses lead the way towards real-time grip force monitoring, permitting the tracking of task-skill evolution in trainees or the identification of individual proficiency levels in human-robot interaction in environmental contexts of high sensory uncertainty. Parsimonious Artificial Intelligence (AI) assistance will contribute to the outcome of new types of surgery, in particular single-port approaches such as NOTES (Natural Orifice Transluminal Endoscopic Surgery) and SILS (Single Incision Laparoscopic Surgery).
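As a rough illustration of the windowed spatiotemporal analysis described above, the following Python sketch averages grip forces per sensor over consecutive time windows of constant size. The sensor labels, window length, and simulated data are illustrative assumptions, not the recordings or parameters of the study.

```python
import numpy as np

# Hypothetical example: grip-force samples from a few task-relevant sensors in
# the dominant hand, arranged as (time_steps, n_sensors). Sensor labels,
# sampling layout, and window size are illustrative assumptions only.
SENSOR_LABELS = ["middle_phalanx", "ring_phalanx", "palm"]  # hypothetical loci
WINDOW_SIZE = 500  # samples per constant-size time window (assumed)

def windowed_mean_forces(forces: np.ndarray, window: int) -> np.ndarray:
    """Average grip force per sensor over consecutive windows of constant size.

    forces : array of shape (time_steps, n_sensors)
    returns: array of shape (n_windows, n_sensors)
    """
    n_windows = forces.shape[0] // window
    trimmed = forces[: n_windows * window]
    return trimmed.reshape(n_windows, window, forces.shape[1]).mean(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated recording: 10,000 samples from 3 sensors (placeholder data).
    forces = rng.gamma(shape=2.0, scale=10.0, size=(10_000, len(SENSOR_LABELS)))
    profile = windowed_mean_forces(forces, WINDOW_SIZE)
    for label, column in zip(SENSOR_LABELS, profile.T):
        print(f"{label}: mean={column.mean():.1f}, variability (SD)={column.std():.1f}")
```

The per-window averages give one variability estimate per sensor, which is the kind of finger-specific profile the abstract refers to; real analyses would of course use the recorded sensor streams rather than simulated data.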
The Grossberg Code: Universal Neural Network Signatures of Perceptual Experience
Dresp-Langley, Birgitta
Two universal functional principles of Adaptive Resonance Theory simulate the brain code of all biological learning and adaptive intelligence. Low-level representations of multisensory stimuli in their immediate environmental context are formed on the basis of bottom-up activation and under the control of top-down matching rules that integrate high-level, long-term traces of contextual configuration. These universal coding principles lead to the establishment of lasting brain signatures of perceptual experience in all living species, from Aplysia to primates. They are revisited here on the basis of examples drawn from the original code and from some of the most recent related empirical findings on contextual modulation in the brain, highlighting the potential of Grossberg's pioneering insights and groundbreaking theoretical work for intelligent solutions in the domain of developmental and cognitive robotics.
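The two coding principles named above, bottom-up category choice under the control of top-down matching, can be illustrated very loosely with a fast-learning ART-1 style step in Python. The vigilance value, binary input coding, and toy patterns below are assumptions for illustration only and do not reproduce the models discussed in the paper.

```python
import numpy as np

def art1_step(inputs, templates, vigilance=0.75, alpha=0.001):
    """One fast-learning ART-1 style pass: bottom-up category choice followed
    by a top-down matching (vigilance) test; if no stored category resonates
    with the input, a new category is created.

    inputs    : binary feature vector I (bottom-up pattern)
    templates : list of binary template vectors (top-down expectations),
                modified in place
    """
    I = np.asarray(inputs, dtype=bool)
    # Bottom-up choice function: T_j = |I AND w_j| / (alpha + |w_j|)
    choice = [np.logical_and(I, w).sum() / (alpha + w.sum()) for w in templates]
    for j in np.argsort(choice)[::-1]:                 # try best categories first
        w = templates[j]
        # Top-down matching criterion: |I AND w_j| / |I| >= vigilance
        if np.logical_and(I, w).sum() / max(I.sum(), 1) >= vigilance:
            templates[j] = np.logical_and(I, w)        # resonance: refine template
            return j
    templates.append(I.copy())                         # mismatch reset everywhere
    return len(templates) - 1

if __name__ == "__main__":
    templates = []
    for pattern in ([1, 1, 0, 0, 1], [1, 1, 0, 0, 0], [0, 0, 1, 1, 0]):
        print(pattern, "-> category", art1_step(pattern, templates))
```

In this toy run the second pattern resonates with the first category and refines its template, while the third pattern fails the matching test and triggers a new category, mirroring the bottom-up/top-down interplay described in the abstract.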
Surgical task expertise detected by a self-organizing neural network map
Dresp-Langley, Birgitta, Liu, Rongrong, Wandeto, John M.
Individual grip force profiling of bimanual simulator task performance of experts and novices, using a robotic control device designed for endoscopic surgery, permits defining benchmark criteria that distinguish true expert task skills from the skills of novices or trainee surgeons. Grip force variability in a true expert and a complete novice executing a robot-assisted surgical simulator task reveals statistically significant differences as a function of task expertise. Here we show that the skill-specific differences in local grip forces are predicted by the output metric of a Self-Organizing Map (SOM), a neural network with a bio-inspired functional architecture that maps the functional connectivity of somatosensory neural networks in the primate brain.
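A minimal Python sketch of the idea follows, assuming a small self-organizing map with winner-take-all unit selection and the quantization error as output metric. The map size, learning schedule, and the simulated expert/novice grip-force vectors are placeholder assumptions rather than the study's data or architecture.

```python
import numpy as np

def train_som(data, map_shape=(8, 8), n_iter=2000, lr=0.5, sigma=2.0, seed=0):
    """Train a small self-organizing map: winner-take-all selection of the
    best-matching unit, then a Gaussian-neighborhood weight update."""
    rng = np.random.default_rng(seed)
    rows, cols = map_shape
    weights = rng.random((rows, cols, data.shape[1]))
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                indexing="ij"), axis=-1)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        dists = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)  # winner takes all
        frac = t / n_iter
        lr_t = lr * (1.0 - frac)
        sigma_t = sigma * (1.0 - frac) + 0.5
        h = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=-1)
                   / (2.0 * sigma_t ** 2))
        weights += lr_t * h[..., None] * (x - weights)
    return weights

def quantization_error(data, weights):
    """Mean distance between each input and its best-matching map unit, i.e.
    between the input representation and its model representation."""
    units = weights.reshape(-1, weights.shape[-1])
    return float(np.mean(np.min(
        np.linalg.norm(data[:, None] - units[None], axis=-1), axis=1)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Placeholder grip-force vectors (simulated, not study recordings): an
    # "expert" with low variability and a "novice" with high variability.
    recordings = {
        "expert": rng.normal(30.0, 2.0, size=(1000, 4)),
        "novice": rng.normal(30.0, 10.0, size=(1000, 4)),
    }
    for name, data in recordings.items():
        som = train_som(data)
        print(name, "quantization error:", round(quantization_error(data, som), 3))
```

In this toy setting the map trained on the more variable, novice-like data yields the larger quantization error, which is the direction of effect the abstract describes for skill-specific grip force variability.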
Unsupervised automatic classification of Scanning Electron Microscopy (SEM) images of CD4+ cells with varying extent of HIV virion infection
Wandeto, John M., Dresp-Langley, Birgitta
Archiving large sets of medical or cell images in digital libraries may require ordering randomly scattered sets of image data according to specific criteria, such as the spatial extent of a specific local color or contrast content that reveals different meaningful states of a physiological structure, tissue, or cell in a certain order, indicating progression or recession of a pathology or the progressive response of a cell structure to treatment. Here we used a Self-Organizing Map (SOM)-based, fully automatic and unsupervised classification procedure described in our earlier work and applied it to sets of minimally processed grayscale and/or color-processed Scanning Electron Microscopy (SEM) images of CD4+ T-lymphocytes (so-called helper cells) with varying extent of HIV virion infection. It is shown that the quantization error in the SOM output after training permits scaling of the spatial magnitude and the direction of change (+ or -) in local pixel contrast or color across images of a series, with a reliability that exceeds that of any human expert. The procedure is easy to implement and fast, and it represents a promising step towards low-cost automatic digital image archiving with minimal intervention of a human operator.
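The following Python sketch, reusing the train_som and quantization_error helpers from the sketch above, orders a simulated series of grayscale images by their SOM quantization error relative to a reference image. The patch size, map size, and synthetic images are assumptions for illustration only, not the SEM data or the exact procedure of the earlier work.

```python
import numpy as np
# Assumes train_som() and quantization_error() from the previous sketch are
# defined in the same module.

def image_patches(image: np.ndarray, size: int = 4) -> np.ndarray:
    """Split a grayscale image into non-overlapping size x size patches,
    flattened into feature vectors for the SOM."""
    h = (image.shape[0] // size) * size
    w = (image.shape[1] // size) * size
    blocks = image[:h, :w].reshape(h // size, size, w // size, size)
    return blocks.transpose(0, 2, 1, 3).reshape(-1, size * size)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    # Placeholder image series: a bright region grows across the series,
    # standing in for increasing spatial extent of local contrast change.
    series = []
    for extent in (4, 8, 16, 24, 32):
        img = rng.normal(50.0, 5.0, size=(64, 64))
        img[:extent, :extent] += 120.0
        series.append(img)
    reference = image_patches(series[0])
    som = train_som(reference, map_shape=(4, 4), n_iter=1000)
    errors = [quantization_error(image_patches(img), som) for img in series]
    # A larger quantization error marks a larger deviation from the reference.
    print("QE per image:", [round(e, 2) for e in errors])
    print("ordering by extent of change:", np.argsort(errors).tolist())
```

Sorting the images by quantization error recovers the order of increasing change relative to the reference, which is the kind of automatic ordering of a randomly scattered image series the abstract describes.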