Tactile Displays Driven by Projected Light
Linnander, Max; Goetz, Dustin; Reardon, Gregory; Hawkes, Elliot; Visell, Yon
Tactile displays that lend tangible form to digital content could transform computing interactions. However, achieving the resolution, speed, and dynamic range needed for perceptual fidelity remains challenging. We present a tactile display that directly converts projected light into visible tactile patterns via a photomechanical surface populated with millimeter-scale optotactile pixels. The pixels transduce incident light into mechanical displacements through photostimulated thermal gas expansion, yielding millimeter-scale displacements with response times of 2 to 100 milliseconds. Employing projected light for power transmission and addressing renders these displays highly scalable; we demonstrate devices with up to 1,511 addressable pixels. Perceptual studies confirm that they can reproduce diverse spatiotemporal tactile patterns with high fidelity. This research establishes a foundation for practical, versatile, high-resolution tactile displays driven by light.
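To make the transduction chain concrete, here is a minimal back-of-the-envelope sketch of a single optotactile pixel, assuming isobaric ideal-gas expansion in a sealed cavity that deflects a membrane. All parameter values (ambient temperature, cavity volume, membrane area, temperature rise) are illustrative assumptions, not values from the paper.

```python
# Rough isobaric ideal-gas sketch of one optotactile pixel: projected light
# heats gas in a sealed cavity, and the expanding gas deflects a membrane.
# Every parameter below is an illustrative assumption, not a paper value.

T_AMBIENT_K = 293.0        # gas temperature before illumination (K)
CAVITY_VOLUME_M3 = 8e-9    # assumed cavity volume, ~2 x 2 x 2 mm (m^3)
MEMBRANE_AREA_M2 = 1e-6    # assumed deflecting membrane area, ~1 mm^2 (m^2)

def displacement_m(delta_t_k: float) -> float:
    """Membrane displacement for a gas temperature rise of delta_t_k.

    Isobaric ideal gas: dV = V * dT / T. The displaced volume is taken up
    by the membrane, so d = dV / A_membrane. A cavity volume larger than
    the column under the membrane geometrically amplifies the motion.
    """
    delta_v = CAVITY_VOLUME_M3 * delta_t_k / T_AMBIENT_K
    return delta_v / MEMBRANE_AREA_M2

for dt in (10.0, 25.0, 50.0):
    print(f"dT = {dt:4.1f} K -> ~{displacement_m(dt) * 1e3:.2f} mm")
```

Under these assumed numbers, modest temperature rises of tens of kelvin already yield displacements on the order of a millimeter, consistent with the scale reported above.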
Tactile Perception in Upper Limb Prostheses: Mechanical Characterization, Human Experiments, and Computational Findings
Ivani, Alessia Silvia; Catalano, Manuel G.; Grioli, Giorgio; Bianchi, Matteo; Visell, Yon; Bicchi, Antonio
Our research investigates vibrotactile perception in four prosthetic hands with distinct kinematics and mechanical characteristics. We found that rigid, simple socket-based prosthetic devices can transmit tactile information and, surprisingly, enable users to identify the stimulated finger with high reliability. This ability decreases in more advanced prosthetic hands that have additional articulations and softer mechanics. To understand the underlying mechanisms, we assessed a prosthetic user's ability to discriminate finger contacts from vibrations transmitted through the four prosthetic hands, performed numerical and mechanical vibration tests on the prostheses, and trained a machine learning classifier to identify the contacted finger. Our results show that simpler, rigid prosthetic hands facilitate contact discrimination (for instance, a user of a purely cosmetic hand can distinguish a contact on the index finger from contacts on the other fingers with 83% accuracy), yet all tested hands, including soft advanced ones, performed above chance level. Although advanced hands attenuate vibration transmission, the machine learning classifier still exceeded human performance in discriminating finger contacts. These findings suggest the potential for enhancing vibrotactile feedback in advanced prosthetic hands and lay the groundwork for future integration of such feedback in prosthetic devices.
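As a rough illustration of the classification step, the sketch below trains a standard classifier to identify the contacted finger from vibration features. The synthetic data, feature dimensionality, and SVM model are stand-in assumptions for illustration; they are not the paper's actual sensors, features, or algorithm.

```python
# Hedged sketch of the classification step: a classifier trained on
# vibration features recorded at the prosthesis predicts which finger was
# contacted. The data are synthetic stand-ins, not the study's recordings.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
N_TRIALS, N_FEATURES, N_FINGERS = 500, 32, 5

# Synthetic "spectral" features: each finger gets a distinct mean pattern,
# blurred by noise to mimic damping in softer, articulated hands.
fingers = rng.integers(0, N_FINGERS, size=N_TRIALS)
prototypes = rng.normal(size=(N_FINGERS, N_FEATURES))
X = prototypes[fingers] + rng.normal(scale=1.5, size=(N_TRIALS, N_FEATURES))

X_train, X_test, y_train, y_test = train_test_split(
    X, fingers, test_size=0.25, random_state=0, stratify=fingers)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print(f"finger-identification accuracy: {clf.score(X_test, y_test):.2f}")
# Chance level is 1/5 = 0.20; accuracy well above that mirrors the finding
# that socket-transmitted vibrations carry finger-contact information.
```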