LLM-Glasses: GenAI-driven Glasses with Haptic Feedback for Navigation of Visually Impaired People
Issatay Tokmurziyev, Miguel Altamirano Cabrera, Muhammad Haris Khan, Yara Mahmoud, Luis Moreno, Dzmitry Tsetserukou
arXiv.org Artificial Intelligence
Abstract-- We present LLM-Glasses, a wearable navigation system designed to assist visually impaired individuals by combining haptic feedback, YOLO-World object detection, and GPT-4o-driven reasoning. The system delivers real-time tactile guidance via temple-mounted actuators, enabling intuitive and independent navigation. Three user studies were conducted to evaluate its effectiveness: (1) a haptic pattern recognition study achieving an 81.3% average recognition rate across 13 distinct patterns, (2) a VICON-based navigation study in which participants successfully followed predefined paths in open spaces, and (3) an LLM-guided video evaluation demonstrating 91.8% accuracy in open scenarios, 84.6% with static obstacles, and 81.5% with dynamic obstacles. These results demonstrate the system's reliability in controlled environments, with ongoing work focusing on refining its responsiveness and adaptability to diverse real-world scenarios. LLM-Glasses showcases the potential of combining generative AI with haptic interfaces to empower visually impaired individuals with intuitive and effective mobility solutions.
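The abstract describes a pipeline in which GPT-4o reasoning over YOLO-World detections is translated into tactile cues from temple-mounted actuators. A minimal sketch of the final mapping step, from a navigation command to actuator pulses, might look as follows; the command names, pattern encodings, and function names here are illustrative assumptions, not details from the paper:

```python
# Hypothetical sketch: mapping LLM navigation commands to temple-actuator
# pulse patterns. The paper reports 13 distinct haptic patterns; only a
# few assumed directional ones are shown here.
HAPTIC_PATTERNS = {
    "forward": [("left", 1), ("right", 1)],  # both temples pulse once
    "left":    [("left", 2)],                # left temple pulses twice
    "right":   [("right", 2)],               # right temple pulses twice
    "stop":    [("left", 3), ("right", 3)],  # both temples pulse three times
}

def command_to_pulses(command: str):
    """Translate a navigation command (e.g. produced by the GPT-4o
    reasoning step) into (actuator, pulse_count) pairs for the
    vibrotactile motors on the glasses' temples."""
    if command not in HAPTIC_PATTERNS:
        raise ValueError(f"unknown navigation command: {command!r}")
    return HAPTIC_PATTERNS[command]
```

In a real deployment each (actuator, pulse_count) pair would be sent to the motor driver; keeping the pattern table as data makes it easy to swap in the full 13-pattern vocabulary evaluated in the recognition study.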
March 4, 2025