For example, in Terminator XXVIII: Rise of the Earthlings (2051), a brave young android is tasked with saving the world from an army of killer humans sent from the future to destroy robotkind. Leading the human rebellion is Barry, an 18-stone unemployed bus driver from Caerphilly whose powers include the ability to eat a foot-long meatball marinara from Subway in under nine seconds. In the war zones of the future, robot generals will send human beings on to the battlefield to check for land mines and other unexploded devices. "Previously, this highly dangerous work was carried out by bomb disposal robots," explains Major-General Sir Optimus Prime. "Sending human beings instead will reduce the risk to robot life."
Robots offer an opportunity to enable people to live safely and comfortably in their homes as they grow older. In the near future (we're all hoping), robots will be able to help us by cooking, cleaning, doing chores, and generally taking care of us, but they're not yet at the point where they can do those sorts of things autonomously. Putting a human in the loop can help robots become useful more quickly, which is especially important for the people who would benefit the most from this technology--specifically, people with disabilities who rely on care. Ideally, the people who need things done would be the ones in the loop telling the robot what to do, but that can be particularly challenging for those with disabilities that limit their mobility. If you can't move your arms or hands, for example, how are you going to control a robot?
Engagement is a key factor in every social interaction, whether between humans or between humans and robots. Many studies have aimed to design robot behavior that sustains human engagement. Infants and children, however, learn how to engage their caregivers to receive more attention. We used a social robot platform, DragonBot, that learned which of its social behaviors retained human engagement. This was achieved by implementing a reinforcement learning algorithm, wherein the reward is the number of people near the robot and their proximity. The experiment was run at the World Science Festival in New York, where hundreds of people interacted with the robot. After more than two continuous hours of interaction, the robot learned by itself that making a sad face was the most rewarding expression. Further analysis showed that after a sad face, people's engagement rose for thirty seconds. In other words, the robot learned by itself in two hours that almost no one leaves a sad DragonBot.
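The learning loop described above can be sketched as a simple multi-armed bandit: pick an expression, observe how many people are nearby, and update a running value estimate for that expression. This is a minimal illustrative sketch, not DragonBot's actual code; the expression names, the epsilon-greedy strategy, and the simulated crowd-size reward are all assumptions for demonstration.

```python
import random

# Candidate expressions (illustrative names, not DragonBot's real action set).
EXPRESSIONS = ["happy", "sad", "surprised", "neutral"]

class EngagementBandit:
    """Epsilon-greedy bandit: mostly exploit the best-known expression,
    occasionally explore a random one."""

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = {e: 0 for e in EXPRESSIONS}
        self.values = {e: 0.0 for e in EXPRESSIONS}

    def choose(self):
        if random.random() < self.epsilon:
            return random.choice(EXPRESSIONS)
        return max(EXPRESSIONS, key=lambda e: self.values[e])

    def update(self, expression, reward):
        # Incremental running-mean update of the value estimate.
        self.counts[expression] += 1
        n = self.counts[expression]
        self.values[expression] += (reward - self.values[expression]) / n

def crowd_reward(expression):
    """Stand-in for the real sensor signal (crowd size / proximity).
    Here 'sad' draws the biggest crowds, mirroring the experiment's finding."""
    base = {"happy": 3, "sad": 6, "surprised": 4, "neutral": 2}[expression]
    return base + random.gauss(0, 1)

random.seed(0)
bandit = EngagementBandit()
for _ in range(2000):
    expr = bandit.choose()
    bandit.update(expr, crowd_reward(expr))

best = max(EXPRESSIONS, key=lambda e: bandit.values[e])
print(best)
```

Under this simulated reward, the bandit converges on the sad face, just as the festival robot did; in the real system the reward came from people-detection sensors rather than a hand-coded function.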
Scientists have taught a robot how to use human-like hand gestures while speaking by feeding it footage of people giving presentations. The android learned to use a pointing gesture to convey 'you' or 'me', as well as a crooked arm action to suggest holding something. Building robots that gesticulate like humans will make interactions with them feel more natural, the Korean team behind the technology said. They built the robot around machine learning software trained on 52 hours of TED talks - presentations given by expert speakers on various topics. Pictured is a Pepper robot that scientists taught to make human-like hand gestures during speech.
In the late 1950s, Sandia Laboratory was looking for a way to handle radioactive materials without putting humans in danger. The answer was the Mobot--short for either "remote robot" or "mobile robot"--a remotely operated system designed by Hughes Aircraft Co. in 1959 that offered a unique and effective combination of strength and dexterity. A Sandia press release announcing the robot called it a "Replacement for Man," but in fact the robot had no autonomy; it was teleoperated by a human sitting in front of a massive control console connected to the robot by a 60-meter cable. The Mark I version of the Mobot had a pair of meter-long hydraulically actuated arms capable of lifting up to 68 kilograms (150 pounds), along with adjustable-strength grippers. The entire system was mounted on a forklift.