"[T]he current capabilities of many AI systems closely match some of the specialized needs of disabled people.... Fortunately, there is a growing interest in applying the scientific knowledge and engineering experience developed by AI researchers to the domain of assistive technology and in investigating new methods and techniques that are required within the assistive technology domain."
– Bruce G. Buchanan; from his Foreword to Assistive Technology and Artificial Intelligence: Applications in Robotics, User Interfaces and Natural Language Processing
University of Waterloo researchers are using deep learning and computer vision to develop autonomous exoskeleton legs to help users walk, climb stairs, and avoid obstacles. The ExoNet project, described in an early-access paper in Frontiers in Robotics and AI, fits users with wearable cameras. AI software processes the camera's video stream and is being trained to recognize surrounding features such as stairs and doorways, and then to determine the best movements to take. "Our control approach wouldn't necessarily require human thought," said Brokoslaw Laschowski, Ph.D. candidate in systems design engineering and lead author on the ExoNet project. "Similar to autonomous cars that drive themselves, we're designing autonomous exoskeletons that walk for themselves."
Canadian boffins are testing semi-autonomous exoskeletons that could help people with limited mobility walk again without the need for implanted sensors. Researchers at the University of Waterloo, Ontario, are hard at work trying to combine modern deep-learning systems with robotic prostheses. They hope to give disabled patients who have suffered spinal cord injuries or strokes, or are afflicted with conditions including multiple sclerosis, cerebral palsy, and osteoarthritis, the ability to get back on their feet and move freely. The project differs from other efforts for amputees that involve trying to control the movement of machines using electrodes implanted in nerves and muscles in the limbs and brain, explained Brock Laschowski, a PhD student at the university who is leading the ExoNet study. "Our control approach wouldn't necessarily require human thought. Similar to autonomous cars that drive themselves, we're designing autonomous exoskeletons that walk for themselves."
Robotics researchers are developing exoskeletons and prosthetic legs capable of thinking and making control decisions on their own using sophisticated artificial intelligence (AI) technology. The system combines computer vision and deep-learning AI to mimic how able-bodied people walk by seeing their surroundings and adjusting their movements. "We're giving robotic exoskeletons vision so they can control themselves," said Brokoslaw Laschowski, a PhD candidate in systems design engineering who leads a University of Waterloo research project called ExoNet. Exoskeleton legs operated by motors already exist, but users must manually control them via smartphone applications or joysticks. "That can be inconvenient and cognitively demanding," said Laschowski, also a student member of the Waterloo Artificial Intelligence Institute (Waterloo.ai).
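The ExoNet idea described above — classify the walking environment from camera frames, then select a gait mode automatically instead of asking the user to switch modes by hand — can be sketched as follows. This is a hypothetical illustration, not the project's code: the real system uses deep convolutional networks on video, so a stub classifier stands in here, and the environment labels and gait modes are invented for the example.

```python
# Map each recognized environment to an exoskeleton gait mode.
# Labels and modes are illustrative, not ExoNet's actual taxonomy.
GAIT_MODES = {
    "level-ground": "walk",
    "incline-stairs": "stair-ascent",
    "decline-stairs": "stair-descent",
    "doorway": "walk-slow",
}

def classify_environment(frame_scores: dict) -> str:
    """Stand-in for a CNN: return the label with the highest confidence."""
    return max(frame_scores, key=frame_scores.get)

def select_gait_mode(frame_scores: dict) -> str:
    """Pick a gait mode from the predicted environment class."""
    label = classify_environment(frame_scores)
    return GAIT_MODES.get(label, "stand")  # default to a safe mode

# Example: the (mock) classifier is most confident the user faces stairs.
scores = {"level-ground": 0.1, "incline-stairs": 0.7,
          "decline-stairs": 0.1, "doorway": 0.1}
print(select_gait_mode(scores))  # stair-ascent
```

The point of the structure is that the user never issues a command: perception output feeds the mode selector directly, which is what Laschowski's autonomous-car analogy describes.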
Over the course of the next decade, humans will integrate more with technology to 'upgrade' our lives, including brain chips and exoskeletons, a new report claims. Produced by dentsu, a global advertising and digital agency, the report looks at ways the world could change over the next 10 years and the impact on global brands. 'As brands assess the impact of a seismic year and look to chart a new path to recovery, these trends provide them with a roadmap for the next decade,' the firm wrote in the executive summary to the report. One key area of change will be the continued rise of the 'synthetic society' as people increasingly incorporate the latest technology into their lives. The study suggests people could even use brain chips to aid memory and exoskeletons to make us faster and stronger. Dentsu predicts a number of 'key events' over the next decade, including the FIFA eWorld Cup becoming the most watched sporting event in the world. Over the next decade, as automation takes away jobs and technology becomes a larger part of our lives, we will see a 'human dividend' appear. Study authors claim this will come in the form of a premium on human skills that robots can't replicate or that can't easily be automated.
The hardware and software system relies on radar sensors and a trio of cameras that enable the wheelchair to "see" what's around. Collision avoidance software then acts to prevent users from unwittingly bumping into walls and objects. The company baked in drop-off detection software to recognize nearby steps and sudden declines in the pavement. If a user is going up a steep ramp and is in danger of tipping over, the software sounds an alarm and can alert designated people nearby to help.
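The three safety behaviours described above — collision avoidance from the radar and camera suite, drop-off detection, and a tip-over alarm on steep ramps — amount to a prioritized decision rule over sensor readings. The sketch below is illustrative only, not the vendor's software; the threshold values and action names are assumptions chosen for the example.

```python
def safety_action(front_distance_m: float, drop_ahead_m: float,
                  ramp_angle_deg: float) -> str:
    """Return the safety action for one sensor snapshot.

    front_distance_m: distance to nearest obstacle (radar/cameras)
    drop_ahead_m:     height of any downward step detected ahead
    ramp_angle_deg:   current incline of the wheelchair
    """
    if ramp_angle_deg > 12.0:          # assumed tip-over threshold
        return "sound-alarm-and-notify"
    if drop_ahead_m > 0.05:            # kerb or stair edge ahead
        return "stop"
    if front_distance_m < 0.5:         # obstacle too close
        return "brake"
    return "continue"

print(safety_action(2.0, 0.0, 14.0))  # sound-alarm-and-notify
print(safety_action(0.3, 0.0, 2.0))   # brake
```

Checking the most severe hazard first (tipping, then drop-offs, then obstacles) mirrors the article's ordering, where the ramp alarm also escalates to nearby helpers rather than merely stopping the chair.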
Serena Ivaldi, Pauline Maurice, Waldez Gomes, Jean Theurel, Liên Wioland, Jean-Jacques Atain-Kouadio, Laurent Claudon, Hind Hani, Antoine Kimmoun, Jean-Marc Sellal, Bruno Levy, Jean Paysant, Sergueï Malikov, Bruno Chenuel, Nicla Settembre
We conducted a pilot study to evaluate the potential and feasibility of back-support exoskeletons to help caregivers in the Intensive Care Unit (ICU) of the University Hospital of Nancy (France) execute Prone Positioning (PP) maneuvers on patients suffering from severe COVID-19-related Acute Respiratory Distress Syndrome. After comparing four commercial exoskeletons, the Laevo passive exoskeleton was selected and used in the ICU in April 2020. The first volunteers using the Laevo reported very positive feedback and a reduction of effort, confirmed by EMG and ECG analysis. The Laevo has since been used to physically assist PP maneuvers in the ICU of the Hospital of Nancy, following the resurgence of COVID-19, with overall positive feedback.
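The abstract mentions EMG analysis confirming the reduction of effort. A standard way to quantify that is to compare the root-mean-square (RMS) amplitude of muscle EMG with and without the exoskeleton; the sketch below shows that computation on synthetic numbers, since the study's actual signal processing is not described in this excerpt.

```python
import math

def emg_rms(samples):
    """Root-mean-square amplitude of an EMG window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# Synthetic back-muscle EMG windows (arbitrary units), invented
# for illustration -- not data from the Nancy study.
without_exo = [0.8, -0.9, 1.1, -1.0, 0.9]
with_exo    = [0.5, -0.4, 0.6, -0.5, 0.4]

reduction = 1 - emg_rms(with_exo) / emg_rms(without_exo)
print(f"effort reduction ~ {reduction:.0%}")
```

A lower RMS amplitude in the assisted condition is the usual proxy for reduced muscular effort; the ECG side of the study's analysis (heart-rate load) would be computed separately.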
Microsoft is opening up limited access to a text-to-speech AI called Custom Neural Voice, which allows developers to create custom synthetic voices. The tech is part of an Azure AI service called Speech. Companies can use the tech for things like voice-powered smart assistants and devices, chatbots, online learning and reading audiobooks or news. They'll have to apply for access and gain approval from Microsoft before they can harness Custom Neural Voice. The tech can deliver more natural-sounding voices than many other text-to-speech services, according to Microsoft.
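Azure's Speech service consumes Speech Synthesis Markup Language (SSML), and a specific voice — including a custom neural voice, once deployed — is selected by name inside an SSML voice element. The helper below just builds that markup; the voice name is a made-up placeholder (real names come from your own Custom Neural Voice deployment), and actually synthesizing audio would additionally require the Azure Speech SDK and an approved subscription key.

```python
def build_ssml(text: str, voice_name: str) -> str:
    """Wrap plain text in minimal SSML selecting a specific voice."""
    return (
        "<speak version='1.0' xml:lang='en-US'>"
        f"<voice name='{voice_name}'>{text}</voice>"
        "</speak>"
    )

# 'MyBrandVoiceNeural' is a hypothetical deployment name.
ssml = build_ssml("Welcome back.", "MyBrandVoiceNeural")
print(ssml)
```

Keeping voice selection in the markup rather than in application code is what lets one chatbot or audiobook pipeline switch between stock and custom voices without changes to its synthesis calls.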
A quadriplegic man with minimal movement and feeling in his limbs fed himself for the first time in 30 years, and he did so using his mind. Robert 'Buz' Chmielewski was involved in a surfing accident as a teen, but in 2019 he underwent a 10-hour surgery to have six electrodes implanted into his brain to control a pair of robotic arms. Working with Johns Hopkins Medicine (JHM), Chmielewski is now able to operate both prosthetic arms and manipulate them to perform separate tasks, like feeding himself a Twinkie. 'It's pretty cool,' said Chmielewski, whose sense of accomplishment was unmistakable after using his thoughts to command the robotic limbs to cut and feed him a piece of golden sponge cake. 'I wanted to be able to do more of it,' he said.
A Texan man has built his own bionic hand using artificial intelligence (AI) after three years of research. After finding most bionic hands can cost up to $150,000, Ryan Saavedra, 27, set out to create one at a fraction of the cost. The prosthetic he created, called the Globally Available Robotic Arm (GARA), measures electrical activity of muscle tissue – a method called electromyography (EMG) – and combines this with AI to predict hand movements. When attached to the limb of an amputee, it is capable of intuitive finger movements and clasping objects such as cups. Saavedra's company, Alt-Bionics, has already made a prototype that costs less than $700 (£520) to produce, and is now working to commercialise the device.
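The EMG-to-movement idea behind GARA — read muscle activity from surface electrodes, then let a model predict the intended hand movement — can be sketched with a toy pipeline: one simple feature per electrode channel (mean absolute value, a common EMG feature) and a nearest-centroid rule standing in for the device's AI model. The channels, centroid values, and gesture names below are invented for illustration and are not from Alt-Bionics.

```python
def mean_abs(channel):
    """Mean absolute value of one EMG channel window."""
    return sum(abs(x) for x in channel) / len(channel)

# Per-gesture "centroids" of the 2-channel feature vector, as if
# learned from calibration recordings (values are made up).
CENTROIDS = {
    "rest":  (0.1, 0.1),
    "grasp": (0.9, 0.3),
    "point": (0.3, 0.8),
}

def predict_gesture(ch1, ch2):
    """Classify a 2-channel EMG window by nearest centroid."""
    f = (mean_abs(ch1), mean_abs(ch2))
    def dist(c):
        return (f[0] - c[0]) ** 2 + (f[1] - c[1]) ** 2
    return min(CENTROIDS, key=lambda g: dist(CENTROIDS[g]))

print(predict_gesture([0.8, -1.0, 0.9], [0.2, -0.3, 0.4]))  # grasp
```

A real prosthesis would use many more channels, richer features, and a trained model, but the shape of the loop is the same: window the EMG stream, extract features, classify, actuate the matching grip.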