"[T]he current capabilities of many AI systems closely match some of the specialized needs of disabled people.... Fortunately, there is a growing interest in applying the scientific knowledge and engineering experience developed by AI researchers to the domain of assistive technology and in investigating new methods and techniques that are required within the assistive technology domain."
– Bruce G. Buchanan; from his Foreword to Assistive Technology and Artificial Intelligence: Applications in Robotics, User Interfaces and Natural Language Processing
University of Minnesota Twin Cities researchers have developed a more accurate, less invasive technology that allows amputees to move a robotic arm using their brain signals instead of their muscles. Many current commercial prosthetic limbs use a cable and harness system controlled by the shoulders or chest, and more advanced limbs use sensors to pick up subtle muscle movements in the patient's existing limb above the device. But both options can be cumbersome and unintuitive, and can take amputees months of practice to learn. Researchers in the University's Department of Biomedical Engineering, with the help of industry collaborators, have created a small, implantable device that attaches to the peripheral nerve in a person's arm. When combined with an artificial intelligence computer and a robotic arm, the device can read and interpret brain signals, allowing upper limb amputees to control the arm using only their thoughts. The researchers' most recent paper is published in the Journal of Neural Engineering, a peer-reviewed scientific journal for the interdisciplinary field of neural engineering.
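At its core, the decoding step is a pattern-recognition problem: a model maps windows of recorded nerve activity to intended movements. The sketch below is purely illustrative and is not the Minnesota team's actual pipeline; the channel count, the synthetic signal model, the RMS features, and the nearest-template decoder are all assumptions chosen to show the shape of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: each intended gesture produces a characteristic
# amplitude pattern across 4 recording channels (hypothetical values).
GESTURES = ["rest", "grip", "pinch"]
TEMPLATES = {
    "rest":  np.array([0.1, 0.1, 0.1, 0.1]),
    "grip":  np.array([0.9, 0.2, 0.7, 0.1]),
    "pinch": np.array([0.2, 0.8, 0.1, 0.6]),
}

def simulate_window(gesture, n_samples=200):
    """Simulate one window of noisy multi-channel nerve recordings."""
    amp = TEMPLATES[gesture]
    t = np.linspace(0, 8 * np.pi, n_samples)
    return rng.normal(0.0, 0.05, (n_samples, 4)) + amp * np.sin(t)[:, None]

def rms_features(window):
    """Root-mean-square amplitude per channel, a common bio-signal feature."""
    return np.sqrt((window ** 2).mean(axis=0))

def decode(window):
    """Nearest-template decoder: pick the gesture whose expected RMS
    profile (amplitude / sqrt(2) for a sinusoid) is closest."""
    feats = rms_features(window)
    return min(GESTURES,
               key=lambda g: np.linalg.norm(feats - TEMPLATES[g] / np.sqrt(2)))

for g in GESTURES:
    print(g, "->", decode(simulate_window(g)))
```

A real system would replace the templates with a classifier trained on each patient's recordings, but the structure — window the signal, extract features, map to a command for the arm — is the same.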
Meta's AI division has been busy in recent months finding ways to make concrete production more sustainable and machine translation better. Now one of the company's ML teams has created a tool, MyoSuite, that builds realistic musculoskeletal simulations that run up to 4,000 times faster than state-of-the-art simulators. According to Meta CEO Mark Zuckerberg, the company can train the models to do things like twirl pens and rotate objects. "Mark Zuckerberg just announced MyoSuite, a new AI platform we developed to build realistic musculoskeletal simulations to help accelerate development of prosthetics," the company said in its announcement. "It could also help us build avatars that move more realistically in the metaverse."
In the late nineteen-forties, Delmar Harder, a vice-president at Ford, popularized the term "automation"--a "nickname," he said, for the increased mechanization at the company's Detroit factory. Harder was mostly talking about the automatic transfer of car parts between machines, but the concept soon grew legs--and sometimes a robotic arm--to encompass a range of practices and possibilities. From the immediate postwar years to the late nineteen-sixties, America underwent what we might call an automation boom, not only in the automotive sector but in most heavy-manufacturing industries. As new technology made factory work more efficient, it also rendered factory workers redundant, often displacing them into a growing service sector. Automation looks a little different these days, but the rhetoric around it remains basically the same.
On May 4, NASA's InSight lander made a huge discovery, recording the biggest quake ever detected on another world, a magnitude 5 temblor. But InSight's greatest accomplishment may also be its last act; just two weeks later, scientists on the InSight team revealed that the lander's solar panels are now blanketed with dust, which has gradually accumulated since its arrival on the planet. Those panels' diminishing power will likely spell the end of the mission. When the lander arrived on the Red Planet, the panels generated 5,000 watt-hours per sol (a Martian day), but now they're down to about a tenth of that, said Kathya Zamora Garcia, InSight deputy project manager at NASA's Jet Propulsion Laboratory, at a virtual press conference on Tuesday. The scientists will keep running InSight's seismometer and robotic arm camera full-time for a few more weeks, and will run them for half-days every other sol after that, but they expect InSight's science operations to end this summer, possibly in July.
Ai-Da sits behind a desk, paintbrush in hand. She looks up at the person posing for her, and then back down as she dabs another blob of paint onto the canvas. A lifelike portrait is taking shape. If you didn't know a robot produced it, this portrait could pass as the work of a human artist. Ai-Da is touted as the "first robot to paint like an artist", and an exhibition of her work called Leaping into the Metaverse opened at the Venice Biennale. Ai-Da produces portraits of sitting subjects using a robotic hand attached to her lifelike feminine figure.
In March, Chipotle introduced Chippy, an AI-powered robotic arm that makes intentionally imperfect tortilla chips: some with slightly more salt, others with a more distinct tang of lime. And Chippy isn't the only robot being put to work; Cecilia.ai, a mechanical mixologist, is being deployed in bars around the world to serve up the perfect margarita while chatting with customers using conversational AI. Since the mid-2010s, the world has been advancing Industry 4.0, a combination of artificial intelligence (AI), additive manufacturing, and the Internet of Things (IoT). Experts argue that the COVID-19 pandemic accelerated the shift to Industry 5.0 and that soon AI-powered platforms and robots will largely take on monotonous tasks that no longer require human labor. So how do robots learn to fulfill these tasks?
Robotic limbs that are controlled by electrostatic brakes rather than many motors could lead to a new generation of lightweight robots that use around 90 per cent less power than existing designs. Robots usually have one or more motors for every joint to control its movement, but Patrick Lancaster at the University of Washington in Seattle and his colleagues have created simple brakes that enable joints to be frozen or released in precise combinations so that a single motor can power a limb with as many as 10 joints.
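The control idea can be pictured in a few lines: one motor drives the whole limb, and engaging or releasing a per-joint electrostatic brake decides which joints the motor's motion actually reaches. The toy model below is a sketch of that architecture under assumed kinematics, not the Washington team's controller; the class, joint count, and angle units are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class BrakeJointedLimb:
    """Toy model: a single motor drives all joints, and a per-joint
    brake decides whether each joint follows the motor's motion."""
    n_joints: int = 10
    angles: list = field(default_factory=list)
    braked: list = field(default_factory=list)

    def __post_init__(self):
        self.angles = [0.0] * self.n_joints
        self.braked = [True] * self.n_joints  # brakes hold pose with no power

    def set_brakes(self, released):
        """Release only the joints in `released`; brake all the others."""
        self.braked = [i not in released for i in range(self.n_joints)]

    def drive_motor(self, delta_deg):
        """One motor command moves every joint whose brake is released."""
        for i in range(self.n_joints):
            if not self.braked[i]:
                self.angles[i] += delta_deg

limb = BrakeJointedLimb()
limb.set_brakes({0, 3})   # free two joints; the other eight stay frozen
limb.drive_motor(15.0)    # a single actuation moves exactly those joints
print(limb.angles)
```

The power saving in the real design comes from this structure: braked joints hold their position without drawing motor power, so one actuator plus low-power brakes replaces ten always-on motors.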
Skeletonics is a 3-meter-tall exoskeleton suit that doesn't run on electricity. The suit relies on kinetic energy from the user to mirror every move they make. In a demo held at Haneda Innovation City, a user demonstrated some of the suit's capabilities, which included grasping small objects and moving around.
MIT engineers have developed a telerobotic system to help surgeons quickly and remotely treat patients experiencing a stroke or aneurysm. With a modified joystick, surgeons in one hospital may control a robotic arm at another location to safely operate on a patient during a critical window of time that could save the patient's life and preserve their brain function. The robotic system, whose movement is controlled through magnets, is designed to remotely assist in endovascular intervention -- a procedure performed in emergency situations to treat strokes caused by a blood clot. Such interventions normally require a surgeon to manually guide a thin wire to the clot, where it can physically clear the blockage or deliver drugs to break it up. One limitation of such procedures is accessibility: Neurovascular surgeons are often based at major medical institutions that are difficult to reach for patients in remote areas, particularly during the "golden hour" -- the critical period after a stroke's onset, during which treatment should be administered to minimize any damage to the brain.