Robots in the workplace can perform hazardous or even 'impossible' tasks; e.g., toxic waste clean-up, desert and space exploration, and more. AI researchers are also interested in the intelligent processing involved in moving about and manipulating objects in the real world.
There is widespread public support for a ban on so-called "killer robots", which campaigners say would "cross a moral line" after which it would be difficult to return. Polling across 26 countries found over 60 per cent of the thousands asked opposed lethal autonomous weapons that can kill with no human input, and only around a fifth backed them. The figures showed public support was growing for a treaty to regulate these controversial new technologies - a treaty which is already being pushed by campaigners, scientists and many world leaders. However, a meeting in Geneva at the close of last year ended in a stalemate after nations including the US and Russia indicated they would not support the creation of such a global agreement. Mary Wareham of Human Rights Watch, who coordinates the Campaign to Stop Killer Robots, compared the movement to successful efforts to eradicate landmines from battlefields.
Autonomous vehicles relying on light-based image sensors often struggle to see through blinding conditions, such as fog. But MIT researchers have developed a sub-terahertz-radiation receiving system that could help steer driverless cars when traditional methods fail. Sub-terahertz wavelengths, which sit between microwave and infrared radiation on the electromagnetic spectrum, can be detected through fog and dust clouds with ease, whereas the infrared-based LiDAR imaging systems used in autonomous vehicles struggle. To detect objects, a sub-terahertz imaging system sends an initial signal through a transmitter; a receiver then measures the absorption and reflection of the rebounding sub-terahertz wavelengths. The receiver passes those measurements to a processor, which reconstructs an image of the object.
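The transmit-receive-process loop described above is common to active imaging systems generally. As an illustration only, here is a minimal sketch of the receive side: the processor converts each echo's round-trip time into a range estimate and assembles the results into a depth image. All names, numbers, and the time-of-flight approach are illustrative assumptions, not details of MIT's actual design.

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s

def range_from_echo(round_trip_s: float) -> float:
    """Distance to the reflector from the pulse's round-trip time."""
    return C * round_trip_s / 2.0

def depth_image(echo_times):
    """Map a 2-D grid of measured round-trip times to ranges in metres."""
    return [[range_from_echo(t) for t in row] for row in echo_times]

# A hypothetical 2x2 scan: echoes return after 100 ns and 200 ns.
scan = [[100e-9, 200e-9],
        [200e-9, 100e-9]]
img = depth_image(scan)  # nearer pixels ~15 m, farther pixels ~30 m
```

The same loop runs regardless of wavelength; what the sub-terahertz band changes is how well the pulse survives fog and dust on the way to and from the target.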
A six-legged robot can find its way home without the help of GPS, thanks to tactics borrowed from desert ants. The robot, called AntBot, uses light from the sky to judge the direction it's going. To assess the distance travelled, it combines observing the motion of objects on the ground as they pass by with counting its steps. All three of these techniques are used by desert ants. To test AntBot, Stéphane Viollet at the Aix-Marseille University in France and colleagues set an outdoor homing task: first go to several checkpoints, then return home.
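Combining a heading estimate with a distance estimate is classic path integration (dead reckoning): accumulate each leg of the outbound trip as a vector, and the home vector is simply the negative of the sum. The sketch below is an illustrative take on that idea, assuming headings in degrees and distances in metres; it is not AntBot's actual code.

```python
import math

def integrate_path(segments):
    """Accumulate (heading_deg, distance_m) legs into an (x, y) position."""
    x = y = 0.0
    for heading_deg, dist in segments:
        a = math.radians(heading_deg)
        x += dist * math.cos(a)
        y += dist * math.sin(a)
    return x, y

def homing_vector(x, y):
    """Heading (deg) and distance (m) pointing straight back to the start."""
    return math.degrees(math.atan2(-y, -x)) % 360.0, math.hypot(x, y)

# Outbound trip: 3 m at heading 0, then 4 m at heading 90.
pos = integrate_path([(0.0, 3.0), (90.0, 4.0)])
heading, dist = homing_vector(*pos)  # a 5 m home vector (3-4-5 triangle)
```

The accuracy of the result depends entirely on the quality of the heading and distance estimates, which is why AntBot fuses three independent cues rather than relying on one.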
Antarctic scientists seeking to locate the wreck of Sir Ernest Shackleton's lost ship, the Endurance, have arrived at the search site. The team broke through thick pack ice on Sunday to reach the vessel's last known position in the Weddell Sea. Robotic submersibles will now spend the next few days scouring the ocean floor for the maritime icon. Shackleton and his crew had to abandon Endurance in 1915 when it was crushed by sea ice and sank in 3,000m of water. Their escape across the frozen floes on foot and in lifeboats is an extraordinary story that has resonated down through the years - and makes the wooden polar yacht perhaps the most sought-after of all undiscovered wrecks.
Apple could be planning to introduce an emoji version of its Siri virtual assistant, according to a new patent application from the tech giant. The patent request, filed with the US Patent and Trademark Office, describes an emoji-based avatar for a smart home speaker that can adapt to a user's mood. Though not mentioned by name in the patent, the description of the smart speaker closely resembles that of the Apple HomePod. Apple's patent application describes a "humanistic avatar, a simplified graphical representation of a digital assistant such as an emoji-based avatar" – essentially a cartoon version of Siri. Depending on what request is made through the smart speaker, the emoji assistant would be able to react appropriately.
Slowly and silently, they glide across the floor wearing bright yellow dresses that look like they were plucked from a haunted 1920s boarding school. No, you haven't encountered some Mothman-like terror entombed inside a department store mannequin, the byproduct of a twisted, futuristic fever dream. You've merely stepped inside Mongkutwattana General Hospital in Bangkok, where a team of robot nurses has been unleashed to make life easier. Their job: ferrying documents between eight stations inside the health-care facility, a job that used to be carried out by busy human nurses, hospital director Reintong Nanna told Newsflare last year. "These robotic nurses help to improve the efficiency and performance of working in the hospital," he said.
For several decades, various types of artificial intelligence have been facing off with people in highly competitive games and then quickly destroying their human competition. AI long ago mastered chess, the Chinese board game Go and even the Rubik's cube, which it managed to solve in just 0.38 seconds. Now machines have a new game that will allow them to humiliate humans: Jenga, the popular game (and source of melodramatic 1980s commercials) in which players strategically remove pieces from an increasingly unstable tower of 54 blocks, placing each one on top until the entire structure collapses. A newly released video from MIT shows a robot developed by the school's engineers playing the game with surprising precision. The machine is equipped with a soft-pronged gripper, a force-sensing wrist cuff and an external camera, allowing the robot to perceive the tower's vulnerabilities the way a human might, according to Alberto Rodriguez, the Walter Henry Gale career development assistant professor in the Department of Mechanical Engineering at MIT. "Unlike in more purely cognitive tasks or games such as chess or Go, playing the game of Jenga also requires mastery of physical skills such as probing, pushing, pulling, placing, and aligning pieces," Rodriguez said in a statement released by the school.
The person controlling the robot wears an electroencephalography (EEG) cap, which measures the brain's electrical activity via the scalp. They then look at a screen that has several pre-selected metal seams for the robot to weld. When their chosen option flickers, it generates a specific electrical response in the brain detectable by the EEG.
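Flicker-based selection of this kind is typically decoded by measuring how much EEG power appears at each option's flicker frequency, since attention to a flickering stimulus evokes a response at that same frequency. The sketch below illustrates the idea with the Goertzel algorithm, which computes signal power at a single target frequency; the sample rate, frequencies, and synthetic signal are all illustrative assumptions, not details of the welding system described above.

```python
import math

def goertzel_power(samples, target_hz, sample_rate):
    """Power of `samples` at `target_hz` via the Goertzel recurrence."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)  # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def decode_choice(samples, option_freqs, sample_rate=250.0):
    """Return the option whose flicker frequency dominates the signal."""
    return max(option_freqs,
               key=lambda f: goertzel_power(samples, f, sample_rate))

# Synthetic "EEG": a strong 10 Hz response plus a weak 8 Hz distractor,
# sampled at 250 Hz for 2 seconds.
fs = 250.0
sig = [math.sin(2 * math.pi * 10 * t / fs)
       + 0.3 * math.sin(2 * math.pi * 8 * t / fs)
       for t in range(500)]
# decode_choice(sig, [8.0, 10.0, 12.0]) selects the 10 Hz option
```

In practice each seam on the screen would flicker at a distinct frequency, so the decoder only needs to find the frequency with the strongest evoked response to know which weld the operator chose.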
The Galapagos Islands are famous for their exotic wildlife, which in most cases is not nearly as afraid of humans as it should be. Humans have done some seriously horrible things to the animals living there, like packing thousands of giant tortoises upside down on ships because they would stay alive without food or water for months and could then be eaten. People traveling to and living in the Galapagos have caused other serious problems to the fragile ecosystem: In addition to devastating oil spills, humans have introduced numerous invasive species to the islands. In particular, goats, which were brought on purpose, and rats, which were brought accidentally, have been catastrophic for endemic animal populations. For decades, the Galapagos National Park Directorate (DPNG) has been working to remove invasive species island by island, including tens of thousands of feral goats, pigs, and donkeys.