Researchers Take Steps Towards Autonomous AI-Powered Exoskeleton Legs

#artificialintelligence

University of Waterloo researchers are using deep learning and computer vision to develop autonomous exoskeleton legs to help users walk, climb stairs, and avoid obstacles. The ExoNet project, described in an early-access paper in Frontiers in Robotics and AI, fits users with wearable cameras. AI software processes the camera's video stream and is being trained to recognize surrounding features such as stairs and doorways, then to determine the best movements to take. "Our control approach wouldn't necessarily require human thought," said Brokoslaw Laschowski, Ph.D. candidate in systems design engineering and lead author on the ExoNet project. "Similar to autonomous cars that drive themselves, we're designing autonomous exoskeletons that walk for themselves."


In the lab: Robotic AI-powered exoskeletons to help disabled people move freely without implants

#artificialintelligence

Canadian boffins are testing semi-autonomous exoskeletons that could help people with limited mobility walk again without the need for implanted sensors. Researchers at the University of Waterloo, Ontario, are hard at work trying to combine modern deep-learning systems with robotic prostheses. They hope to give disabled patients who have suffered spinal cord injuries or strokes, or who are afflicted with conditions including multiple sclerosis, cerebral palsy, and osteoarthritis, the ability to get back on their feet and move freely. The project differs from other efforts for amputees that involve trying to control the movement of machines using electrodes implanted in nerves and muscles in the limbs and brain, explained Brokoslaw Laschowski, a PhD student at the university who is leading the ExoNet study. "Our control approach wouldn't necessarily require human thought. Similar to autonomous cars that drive themselves, we're designing autonomous exoskeletons that walk for themselves."


Computer vision and deep-learning AI combined in self-walking robotic exoskeletons

#artificialintelligence

Robotics researchers are developing exoskeletons and prosthetic legs capable of thinking and moving on their own using sophisticated artificial intelligence (AI) technology. The system combines computer vision and deep-learning AI to mimic how able-bodied people walk by seeing their surroundings and adjusting their movements. "We're giving robotic legs vision so they can control themselves," said Brokoslaw Laschowski, a PhD candidate in systems design engineering who leads a University of Waterloo research project called ExoNet. Exoskeletons and prosthetic devices operated by motors already exist, but users must manually control them via smartphone applications. That can be inconvenient and cognitively demanding.
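The pipeline described above, looking at the surroundings and choosing a movement, reduces to classifying each camera frame into an environment category. The sketch below is purely illustrative: the label set, the random "features," and the weights are invented stand-ins for ExoNet's actual trained deep network, whose architecture is not described here.

```python
import numpy as np

CLASSES = ["level-ground", "stairs", "doorway"]  # hypothetical label set

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def classify_frames(frames, W, b):
    """Score each flattened camera frame against per-class weights and
    return the most likely environment label for each frame."""
    probs = softmax(frames @ W + b)
    return [CLASSES[i] for i in probs.argmax(axis=1)]

# Toy stand-in for a trained network's final layer: random features
# and weights, just to show the classify-then-act interface.
rng = np.random.default_rng(0)
frames = rng.random((4, 16))        # 4 frames, 16 "features" each
W = rng.standard_normal((16, 3))    # one weight column per class
b = np.zeros(3)
labels = classify_frames(frames, W, b)
```

In a real controller, each predicted label would gate a different gait mode (e.g. switch from level walking to stair ascent), which is what removes the need for manual smartphone control.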


Why your brain is not a computer

#artificialintelligence

We are living through one of the greatest of scientific endeavours – the attempt to understand the most complex object in the universe, the brain. Scientists are accumulating vast amounts of data about structure and function in a huge array of brains, from the tiniest to our own. Tens of thousands of researchers are devoting massive amounts of time and energy to thinking about what brains do, and astonishing new technology is enabling us to both describe and manipulate that activity. We can now make a mouse remember something about a smell it has never encountered, turn a bad mouse memory into a good one, and even use a surge of electricity to change how people perceive faces. We are drawing up increasingly detailed and complex functional maps of the brain, human and otherwise. In some species, we can change the brain's very structure at will, altering the animal's behaviour as a result. Some of the most profound consequences of our growing mastery can be seen in our ability to enable a paralysed person to control a robotic arm with the power of their mind.


AI to help world's first removal of space debris

#artificialintelligence

The technology is being developed by Swiss startup ClearSpace, a spin-off from the Ecole Polytechnique Fédérale de Lausanne (EPFL). Its removal target is the now-obsolete Vespa Upper Part, a 100 kg payload adapter orbiting 660 km above the Earth. ClearSpace-1 will use an AI-powered camera to find the debris. Its robotic arms will then grab the object and drag it back into the atmosphere, where it will burn up. "A central focus is to develop deep learning algorithms to reliably estimate the 6D pose (three rotations and three translations) of the target from video-sequences even though images taken in space are difficult," said Mathieu Salzmann, an EPFL scientist spearheading the project.
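The "6D pose" Salzmann describes, three rotations plus three translations, is conventionally packed into a single 4x4 homogeneous transform that maps points from the target's frame into the chaser's frame. A minimal sketch (the Z·Y·X angle convention and the example values are assumptions for illustration, not ClearSpace's choices):

```python
import numpy as np

def pose_to_matrix(rx, ry, rz, tx, ty, tz):
    """Build a 4x4 homogeneous transform from a 6D pose:
    three rotation angles (radians, composed Rz @ Ry @ Rx)
    and three translations."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # combined rotation
    T[:3, 3] = [tx, ty, tz]    # translation
    return T

# Example: a target rotated 90 degrees about Z and offset 2 m along X.
T = pose_to_matrix(0.0, 0.0, np.pi / 2, 2.0, 0.0, 0.0)
point_in_target_frame = np.array([1.0, 0.0, 0.0, 1.0])  # homogeneous point
point_in_chaser_frame = T @ point_in_target_frame
```

A network that estimates this pose per video frame gives the capture arms a continuously updated target transform to servo against.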


Cyborg computer chips will get their brain from human neurons

#artificialintelligence

A.I. has already gotten to almost sci-fi levels of emulating brain activity, so much so that amputees can experience mind-controlled robotic arms, and neural implants might soon be a thing. Cortical Labs sounds like it could have been pulled from the future. Co-founder and CEO Hon Weng Chong and his team are merging biology and technology by embedding real neurons onto a specialized computer chip. Instead of being programmed to act like a human brain, it will use those neurons to think and learn and function on its own. The hybrid chips will save tremendous amounts of energy with an actual neuron doing the processing for them.


A neural network, connected to a human brain, could mean more advanced prosthetics

#artificialintelligence

In the future, some researchers hope, people who lose the use of limbs will be able to control robotic prostheses using brain-computer interfaces -- like Luke Skywalker did effortlessly in "Star Wars." The problem is that brain signals are tricky to decode, meaning that existing brain-computer interfaces that control robotic limbs are often slow or clumsy. But that could be changing. Last week, a team of doctors and neuroscientists published a paper in the journal Nature Medicine describing a brain-computer interface that uses a neural network to decode brain signals into precise movements of a lifelike, mind-controlled robotic arm. The researchers took data from a 27-year-old quadriplegic man who had an array of microelectrodes implanted in his brain and fed it into a series of neural nets, artificial intelligence systems loosely modeled on the brain's circuits that excel at finding patterns in large sets of information.
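At its core, such a decoder is a regression from multichannel firing rates to intended movement. The sketch below trains a tiny one-hidden-layer network on synthetic data; the channel count, network size, and data are all invented here, a drastic simplification of the deep models and real microelectrode recordings used in the Nature Medicine study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for microelectrode data: 96 channels of firing
# rates, with a 2D hand velocity linearly embedded plus noise.
n, channels = 500, 96
true_map = 0.1 * rng.standard_normal((channels, 2))
rates = rng.random((n, channels))
velocity = rates @ true_map + 0.01 * rng.standard_normal((n, 2))

# One-hidden-layer network: firing rates -> 2D velocity.
W1 = 0.1 * rng.standard_normal((channels, 32)); b1 = np.zeros(32)
W2 = 0.1 * rng.standard_normal((32, 2)); b2 = np.zeros(2)

def mse_of():
    h = np.tanh(rates @ W1 + b1)
    return float(np.mean((h @ W2 + b2 - velocity) ** 2))

mse_init = mse_of()

# Full-batch gradient descent on mean-squared error.
lr = 1e-3
for _ in range(2000):
    h = np.tanh(rates @ W1 + b1)
    err = (h @ W2 + b2) - velocity          # (n, 2) prediction error
    gW2 = h.T @ err / n; gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)        # backprop through tanh
    gW1 = rates.T @ dh / n; gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse_final = mse_of()
```

The real system faces the same loop at a harder scale: noisy, nonstationary neural signals decoded in real time, which is exactly where deep networks outperform the slower hand-tuned decoders the article calls clumsy.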