Engineers combine AI and wearable cameras in self-walking robotic exoskeletons

#artificialintelligence

Robotics researchers are developing exoskeletons and prosthetic legs capable of thinking and making control decisions on their own using sophisticated artificial intelligence (AI) technology. The system combines computer vision and deep-learning AI to mimic how able-bodied people walk by seeing their surroundings and adjusting their movements. "We're giving robotic exoskeletons vision so they can control themselves," said Brokoslaw Laschowski, a PhD candidate in systems design engineering who leads a University of Waterloo research project called ExoNet. Exoskeleton legs operated by motors already exist, but users must manually control them via smartphone applications or joysticks. "That can be inconvenient and cognitively demanding," said Laschowski, also a student member of the Waterloo Artificial Intelligence Institute (Waterloo.ai).


Researchers Take Steps Towards Autonomous AI-Powered Exoskeleton Legs

#artificialintelligence

University of Waterloo researchers are using deep learning and computer vision to develop autonomous exoskeleton legs to help users walk, climb stairs, and avoid obstacles. The ExoNet project, described in an early-access paper in the journal Frontiers in Robotics and AI, fits users with wearable cameras. AI software processes the camera's video stream and is being trained to recognize surrounding features such as stairs and doorways, then determine the best movements to take. "Our control approach wouldn't necessarily require human thought," said Brokoslaw Laschowski, Ph.D. candidate in systems design engineering and lead author on the ExoNet project. "Similar to autonomous cars that drive themselves, we're designing autonomous exoskeletons that walk for themselves."
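The control pipeline described here, in which a classifier labels the terrain in each camera frame and the controller then selects a locomotion mode, can be sketched roughly as follows. This is an illustrative assumption of how such a loop might look, not the published ExoNet implementation; the class names, gait modes, and majority-vote smoothing are all hypothetical.

```python
from collections import Counter, deque

# Hypothetical terrain classes and the gait modes they map to.
TERRAIN_CLASSES = ["level-ground", "incline-stairs", "decline-stairs", "doorway"]

GAIT_MODE = {
    "level-ground": "walk",
    "incline-stairs": "stair-ascent",
    "decline-stairs": "stair-descent",
    "doorway": "stop",
}

def classify_frame(frame):
    """Stand-in for the deep-learning classifier: picks the terrain class
    with the highest score (e.g. the softmax output of a CNN)."""
    scores = frame["scores"]
    return max(scores, key=scores.get)

def select_gait(frame_stream):
    """Emit a gait mode per frame, smoothing the terrain prediction with a
    simple majority vote over the last three frames so a single
    misclassified frame cannot abruptly switch the gait."""
    recent = deque(maxlen=3)
    modes = []
    for frame in frame_stream:
        recent.append(classify_frame(frame))
        majority = Counter(recent).most_common(1)[0][0]
        modes.append(GAIT_MODE[majority])
    return modes

# Example: two frames of level ground, then stairs come into view.
frames = [
    {"scores": {"level-ground": 0.9, "incline-stairs": 0.1}},
    {"scores": {"level-ground": 0.2, "incline-stairs": 0.8}},
    {"scores": {"level-ground": 0.3, "incline-stairs": 0.7}},
]
print(select_gait(frames))  # ['walk', 'walk', 'stair-ascent']
```

The smoothing window is one way to trade responsiveness for stability: a real controller would likely also gate mode transitions on the user's current movement phase, as the excerpt above suggests.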


This Week's Awesome Tech Stories From Around the Web (Through April 17)

#artificialintelligence

The massive document, produced by the Stanford Institute for Human-Centered Artificial Intelligence, is packed full of data and graphs, and we've plucked out 15 that provide a snapshot of the current state of AI."

Geoffrey Hinton Has a Hunch About What's Next for Artificial Intelligence (Siobhan Roberts, MIT Technology Review): "Back in November, the computer scientist and cognitive psychologist Geoffrey Hinton had a hunch. After a half-century's worth of attempts--some wildly successful--he'd arrived at another promising insight into how the brain works and how to replicate its circuitry in a computer."

Robotic Exoskeletons Could One Day Walk by Themselves (Charles Q. Choi, IEEE Spectrum): "Ultimately, the ExoNet researchers want to explore how AI software can transmit commands to exoskeletons so they can perform tasks such as climbing stairs or avoiding obstacles based on a system's analysis of a user's current movements and the upcoming terrain. With autonomous cars as inspiration, they are seeking to develop autonomous exoskeletons that can handle the walking task without human input, Laschowski says."

Microsoft Buys AI Speech Tech Company Nuance for $19.7 Billion (James Vincent, The Verge): "The $19.7 billion acquisition of Nuance is Microsoft's second-largest behind its purchase of LinkedIn in 2016 for $26 billion."


Human-Centered Design of Wearable Neuroprostheses and Exoskeletons

AI Magazine

Human-centered design of wearable robots involves the development of innovative science and technologies that minimize the mismatch between humans' and machines' capabilities, leading to their intuitive integration and confluent interaction. Here, we summarize our human-centered approach to the design of closed-loop brain-machine interfaces (BMI) to powered prostheses and exoskeletons that allow people to act beyond their impaired or diminished physical or sensory-motor capabilities. The goal is to develop multifunctional human-machine interfaces with integrated diagnostic, assistive, and therapeutic functions. Moreover, these complex human-machine systems should be effective, reliable, safe, and engaging, and should support the patient in performing intended actions with minimal effort and errors within an adequate interaction time. To illustrate our approach, we review an example of a user-in-the-loop, patient-centered, non-invasive BMI system to a powered exoskeleton for persons with paraplegia. We conclude with a summary of challenges to the translation of these complex human-machine systems to the end user.