In Japan, a plastic moulding company called Nissei Eco (which also does funeral arrangements, I guess) is planning to introduce SoftBank's Pepper robot as a cheaper substitute for human priests reading sutras at Buddhist funerals. The average cost of a funeral in Japan is somewhere between two and three million yen, as of the most recent study of the industry, which is nearly a decade old. Nissei Eco is offering the small, white, and aggressively shiny humanoid robot, suitably attired in the robe of a Buddhist monk, as an optional add-on on its à la carte menu of funeral services. And at a per-funeral cost of just 50,000 yen (about $450), the robot costs "significantly less than the cash offerings typically made to Buddhist priests," according to the Japan Times.
Today (or, yesterday, but today Australia time, where it's probably already tomorrow), 116 founders of robotics and artificial intelligence companies from 26 countries released an open letter urging the United Nations to ban lethal autonomous weapon systems (LAWS). This most recent letter renews the call for a UN ban and makes the perspective of a subset of robotics companies a little more explicit than before, but otherwise we haven't been able to identify much tangible progress toward an actual ban over the past two years. One of the primary critiques of a ban is that it would be practically impossible to implement: autonomous systems are useful in all kinds of other applications, there is minimal separation between commercial and military technology, and there can be very little difference between an autonomous system and a weaponized autonomous system, or between a weaponized system with a human in the loop and one without. Is the goal, then, accountability for a human who authorizes a system to take lethal action autonomously, or verification that there's a human in the loop making every decision about whether or not a system can take lethal action?
Last Thursday at an event at Intel, participants in the NASA Frontier Development Laboratory research accelerator presented results showing how artificial intelligence can speed up space science. Their results agreed with human image classification about 98 percent of the time, about five times the accuracy of previous image analysis systems. One group's algorithm, called FlareNET, outperformed NOAA's existing system for predicting solar flares. This project is one of five being explored as part of an artificial intelligence pilot research program sponsored by NASA.
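The "agreed with human image classification about 98 percent of the time" figure is a plain agreement rate. As a toy illustration (the labels below are made up, not FDL's data), it is just the fraction of images where the model's class matches the human's:

```python
# Toy agreement-rate computation. The label lists are invented stand-ins
# for model predictions and human classifications of the same images.

def agreement_rate(model_labels, human_labels):
    """Fraction of items where the model's label matches the human's."""
    matches = sum(m == h for m, h in zip(model_labels, human_labels))
    return matches / len(human_labels)

model = ["crater", "crater", "dune", "crater", "ridge"]
human = ["crater", "crater", "dune", "dune", "ridge"]
print(agreement_rate(model, human))  # 4 of 5 labels match -> 0.8
```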
According to a recent press release, Panasonic is planning to conduct technical trials of the WHILL NEXT this year. It employs automation technology developed for Panasonic's autonomous (and adorable) hospital delivery robot, HOSPI. SMART's wheelchair, meanwhile, is the latest in a string of autonomous vehicles made by the group, including a golf cart, an electric taxi and, most recently, a scooter that zipped more than 100 MIT visitors around on tours in 2016. Beyond use in hospitals and airports, the SMART team says it envisions a connected autonomous mobility system, where a user could use a scooter or wheelchair indoors at an office, zip outside and pick up a golf cart to cross the parking lot, and slip into an autonomous car to drive home.
Dino Mehanovic, John Bass, Thomas Courteau, David Rancourt, and Alexis Lussier Desbiens from the University of Sherbrooke realized that perching with a fixed-wing aircraft doesn't need to involve a stall to achieve that vertical, ultra-low-speed approach, as long as you can maintain control over the aircraft. We are thinking about various failure causes (unsuitable states during the approach, surfaces too smooth for the microspines) and failure detection timing (before touchdown, at touchdown, and after touchdown). You also have to consider numerous factors that are sometimes hard to quantify: gear efficiency, reuse of some components between flight and climbing, transition time, propeller size, operation away from the design point, battery size, etc. "Autonomous Thrust-Assisted Perching of a Fixed-Wing UAV on Vertical Surfaces," by Dino Mehanovic, John Bass, Thomas Courteau, David Rancourt, and Alexis Lussier Desbiens from the University of Sherbrooke in Canada, was presented at the 2017 Living Machines Conference at Stanford, where it won a Best Paper award.
The Isaac robot simulator advances these tasks by providing an AI-based software platform that lets teams train robots in highly realistic virtual environments and then transfer that knowledge to real-world units. Under the hood, it demonstrated the ability to apply previously learned models of tool affordances, tool classification from vision, automatic tool pose detection, object segmentation, and full/empty hand classification to achieve its task. By utilizing quadrotors attached to a mainframe via passive spherical joints as rotating thrust generators, this SmQ (Spherically-connected multiple Quadrotor) system is fully actuated (e.g., it can resist sideways wind without tilting) and also backdrivable (e.g., impedance control is possible for compliant interaction). With design optimization to address the tight weight-thrust margin of current rotor and battery technologies, and with proper control design, the ODAR system can exhibit such capabilities for "real" manipulation as 1) a downward pushing force larger than 6 kg (much larger than its own weight of 2.6 kg) and 2) peg-in-hole teleoperation with a radial tolerance of only 0.5 mm, both unprecedented among other aerial manipulation systems (e.g., drone-manipulators).
In recent years, McOwan has teamed up with Howard Williams, another computer scientist at Queen Mary University of London, to develop computer algorithms that can help create new magic tricks. To help pull off this mind-reading illusion, the computer scientists created an algorithm that can automatically find compelling word and image combinations. First, to collect relevant data for the trick, the researchers performed an online psychology experiment, showing human participants various selections of 10 trademarks from a pool of 100 of the most famous trademarks. Second, the system used a previously developed search algorithm, called BM25, to organize and rank the collected data according to certain association categories (such as food-related words).
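BM25 itself is a standard, well-documented relevance-scoring scheme from information retrieval. Here's a minimal sketch of how it ranks documents against query terms; the toy "documents" below are invented stand-ins for the researchers' word-association data, not their actual dataset.

```python
import math
from collections import Counter

def bm25_scores(query_terms, documents, k1=1.5, b=0.75):
    """Score each document (a list of tokens) against the query terms
    using the Okapi BM25 weighting scheme."""
    N = len(documents)
    avgdl = sum(len(d) for d in documents) / N  # average document length
    # Document frequency: how many documents contain each query term.
    df = {t: sum(1 for d in documents if t in d) for t in query_terms}
    scores = []
    for doc in documents:
        tf = Counter(doc)  # term frequencies within this document
        score = 0.0
        for t in query_terms:
            if df[t] == 0:
                continue
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            denom = tf[t] + k1 * (1 - b + b * len(doc) / avgdl)
            score += idf * (tf[t] * (k1 + 1)) / denom
        scores.append(score)
    return scores

# Invented toy corpus: documents that are more about "food" score higher.
docs = [
    ["burger", "fries", "fast", "food", "food"],
    ["coffee", "latte", "espresso"],
    ["food", "delivery", "pizza"],
]
print(bm25_scores(["food"], docs))
```

The `k1` and `b` parameters are the usual BM25 knobs controlling term-frequency saturation and document-length normalization; whatever parameters and association categories the Queen Mary system actually used are not described in the article.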
A new study, in which IBM Watson took just 10 minutes to analyze a brain cancer patient's genome and suggest a treatment plan, demonstrates the potential of artificially intelligent medicine to improve patient care. Both Watson and the expert team received the patient's genome information and identified genes that showed mutations; went through the medical literature to see if those mutations had figured in other cancer cases; looked for reports of successful treatment with drugs; and checked for clinical trials that the patient might be eligible for. Both the NYGC clinicians and Watson identified mutations in genes that weren't checked in the panel test, but which nonetheless suggested potentially beneficial drugs and clinical trials. IBM's Parida notes that the cost of sequencing an entire genome has plummeted in recent years, opening up the possibility that whole-genome sequencing will soon be a routine part of cancer care.
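The four-step workflow described above (find mutated genes, check the literature, look for drugs, check trials) can be sketched as a pipeline of functions. This is purely an illustration of the workflow's shape: every function body, gene, drug mapping, and trial entry below is an invented placeholder, not IBM's or NYGC's actual system or data.

```python
# Hypothetical sketch of the genome-analysis workflow the article describes.
# All data structures here are toy placeholders.

def identify_mutations(genome, reference):
    """Step 1: genes whose sequence differs from the reference genome."""
    return [gene for gene, seq in genome.items() if reference.get(gene) != seq]

def search_literature(genes, literature_index):
    """Step 2: keep genes whose mutations have figured in reported cases."""
    return [g for g in genes if g in literature_index]

def find_drugs(genes, drug_db):
    """Step 3: map implicated genes to drugs with reported success."""
    return {g: drug_db[g] for g in genes if g in drug_db}

def check_trials(genes, trials):
    """Step 4: clinical trials the patient might be eligible for."""
    return [t for t in trials if t["target_gene"] in genes]

# Toy inputs standing in for a genome, a literature index, and a registry.
reference  = {"GENE_A": "AAA", "GENE_B": "CCC", "GENE_C": "GGG"}
genome     = {"GENE_A": "AAT", "GENE_B": "CCC", "GENE_C": "GGC"}
literature = {"GENE_A", "GENE_C"}
drug_db    = {"GENE_A": "drug_x"}
trials     = [{"id": "TRIAL-1", "target_gene": "GENE_C"}]

mutated  = identify_mutations(genome, reference)
reported = search_literature(mutated, literature)
print(find_drugs(reported, drug_db), check_trials(reported, trials))
```

The point of the study, of course, is that Watson ran this kind of search over the real literature in 10 minutes, versus roughly 160 hours for the human team.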
It's not just the temperature that makes Venus a particularly nasty place for computers--the pressure at the surface is around 90 atmospheres, equivalent to the pressure 3,000 feet down in Earth's ocean. The majority of ideas for Venus surface exploration have essentially been the same sort of thing that the Soviets did with the Venera probes: stuffing all the electronics inside an insulated container hooked up to a stupendously powerful air conditioning system, probably driven by some alarmingly radioactive plutonium-powered Stirling engines. With funding from the NASA Innovative Advanced Concepts (NIAC) program, the JPL team wants to see whether it might be possible to build a Venus exploration rover without conventional sensors, computers, or power systems. A Strandbeest (Theo Jansen's wind-powered walking machine) operates off just a couple of simple sensors, which control whether the legs move backwards or forwards, and it has built-in logic to avoid soft sand and water.
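The appeal of the Strandbeest approach is how little "computation" the control loop actually needs. As a toy illustration (the sensor names are invented, and a real Strandbeest implements this mechanically rather than in software), the entire decision logic amounts to a couple of bits of input:

```python
# Toy sketch of Strandbeest-style reactive control: two binary sensors
# decide whether the legs step forward or reverse away from a hazard.
# No processor required--which is exactly why this kind of logic could
# survive Venus conditions that would destroy conventional electronics.

def leg_direction(water_ahead: bool, soft_sand_ahead: bool) -> str:
    """Purely reactive: back away from water or soft sand, else advance."""
    if water_ahead or soft_sand_ahead:
        return "backward"
    return "forward"

print(leg_direction(water_ahead=False, soft_sand_ahead=True))
```

A mechanical rover would realize this same truth table with linkages and valves instead of code, avoiding heat-sensitive semiconductors entirely.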