Robots in the workplace can perform hazardous or even 'impossible' tasks, such as toxic-waste clean-up and desert or space exploration. AI researchers are also interested in the intelligent processing involved in moving about and manipulating objects in the real world.
Drones are often touted as the answer to a great number of modern-day challenges. Soon, we are told, they will be making shopping deliveries for us, dropping off pizzas, and even taxiing us around. A similar expectation about drone capability has been seen in the world of insurance. It was thought that insurers would have large drone divisions, allowing them to easily assess large or hazardous properties, or make claims assessments on otherwise difficult-to-view property. But this has not happened, for a few reasons.
Technology is poised to change the workplace. Soon you may have a robot for a co-worker, or a microchip embedded under your skin that serves as a work ID. Some innovations are already making an impact. Virtual reality, for example, is going beyond gaming to serve as a powerful workplace training tool. One of the biggest areas where VR training can be useful is safety, according to J. P. Gownder, vice president at research firm Forrester.
This is Recorded Future, inside threat intelligence for cybersecurity. It's a wide-ranging category, covering everything from connected thermostats, refrigerators, and security cameras to industrial control systems, self-driving cars, and medical devices. It's hardly an exaggeration to say that if a device has a power source, somebody is thinking up a way to connect it to the internet. And with that come opportunities for improving our lives and the world we live in, and risks to our security and privacy.

Our guest this week is Chris Poulin. He's a principal at Booz Allen Hamilton, where he leads their Internet of Things security practice.

Devices have been connecting to the internet for a long time, and in fact, it's kind of interesting. Way back in my career, I was always fascinated by where the physical and the digital meet, and I would say probably around 2009 or so is when I realized that the internet was a place for other things. Beyond, for example, industrial control systems, which had to send their telemetry: pumps reporting how fast their motors were spinning, how much heat was building up, how much pressure was in the pipes, et cetera. I'd say that was probably one of the first of what we would nowadays consider to be "Internet of Things" things. So there was always this awareness that they were connected, and then the rest of the world decided they were going to connect other things, like cars. OnStar and Uconnect, for example, have been connecting cars back to a call center for a long time, but over mobile airwaves. So, you could argue that those things were connected.
Ian Bremmer warns the audience about the dangers of automation at an Intelligence Squared U.S. debate. Can we all agree that Google Duplex demo was eerie? A robot, posing as a human being, scheduled a reservation over the phone. We all knew artificial intelligence was coming, but it was breathtaking to hear software come to life. Before we go any further, let's get our terms straight.
A life-saving AI system that can identify the signs of malnutrition from a single photo of a person has been developed by a non-profit organisation. The system, called MERON (Method for Extremely Rapid Observation of Nutritional status), is still a prototype, but it was 78 per cent accurate on the adults it was tested on. The technology reduces the need for bulky equipment and specialists in the field, making it easier to identify the symptoms of malnutrition, its developers claim. Spotting the signs sooner also means treatment can be administered before the condition becomes critical.
The market for construction robots is set to grow to $166.4 million by 2023, according to new research from Markets & Markets. Last year the construction robot market was worth a little over $60 million. The bump represents a projected six-year CAGR of 16.8 percent. The emergence of robots on the job site tracks recent trends in industrial automation technology. In the last decade, machine vision has enabled robots to safely navigate factory and warehouse floors without the need for dedicated tracks.
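As a rough sanity check on the figures above, the base-year market size implied by a $166.4 million 2023 target and a 16.8 percent six-year CAGR comes out near $65 million, consistent with "a little over $60 million." A minimal sketch of that arithmetic:

```python
# Sanity check on the projection above: a market reaching $166.4m
# after six years of 16.8% compound annual growth (CAGR).
cagr = 0.168
years = 6
target_2023 = 166.4  # USD millions, per the Markets & Markets figure

growth_factor = (1 + cagr) ** years        # total growth over six years
implied_base = target_2023 / growth_factor  # back out the base-year size

print(f"Growth factor over {years} years: {growth_factor:.3f}")
print(f"Implied base-year market size: ${implied_base:.1f}m")
```

The growth factor works out to about 2.54x, so the implied starting point is roughly $65m, "a little over $60 million" as the research states.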
The same can apply to autonomous vehicles. Methods will be established to test whether autonomous cars are likely to be safer than human drivers. In the unfortunate event of an accident where the robotaxi was at fault, an investigation by the authorities will determine whether the vehicle was indeed compliant. If the vehicle was compliant but a sensor malfunctioned, the incident would be handled in the same manner as a brake or steering malfunction in a human-piloted vehicle, which falls into the category of product liability. If the vehicle was compliant but the software simply failed because it encountered a corner case it could not handle correctly, then traditional auto insurance can cover the damage, and a new insurance product, "A/V insurance," can cover the costs associated with the required disclosures and investigations.
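The liability triage described above can be sketched as a small decision function. The case names and return strings here are illustrative assumptions for exposition, not any insurer's or regulator's actual logic:

```python
# Illustrative sketch of the liability triage described above.
# Categories and outcomes are hypothetical, for exposition only.

def classify_liability(vehicle_compliant: bool,
                       sensor_malfunction: bool,
                       software_corner_case: bool) -> str:
    if not vehicle_compliant:
        return "non-compliant vehicle: manufacturer/operator liability"
    if sensor_malfunction:
        # Treated like a brake or steering failure in a human-piloted car.
        return "product liability"
    if software_corner_case:
        # Traditional auto insurance covers the damage; a separate
        # "A/V insurance" product covers disclosures and investigations.
        return "auto insurance + A/V insurance"
    return "no fault attributed to the vehicle"

print(classify_liability(vehicle_compliant=True,
                         sensor_malfunction=True,
                         software_corner_case=False))  # product liability
```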
The crux of the problem is that the field of artificial intelligence has not come to grips with the infinite complexity of language. Just as you can make infinitely many arithmetic equations by combining a few mathematical symbols and following a small set of rules, you can make infinitely many sentences by combining a modest set of words and a modest set of rules. A genuine, human-level A.I. will need to be able to cope with all of those possible sentences, not just a small fragment of them. The narrower the scope of a conversation, the easier it is to have. If your interlocutor is more or less following a script, it is not hard to build a computer program that, with the help of simple phrase-book-like templates, can recognize a few variations on a theme.
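The phrase-book-style template matching described above can be sketched in a few lines. The patterns and canned replies here are hypothetical examples, chosen only to show how a scripted bot recognizes a few variations on a theme and fails on anything off-script:

```python
import re

# Minimal sketch of phrase-book-style template matching: the bot
# handles a few variations on a scripted theme. Patterns and replies
# are hypothetical, not drawn from any real system.
TEMPLATES = [
    (re.compile(r"(?:book|reserve) a table for (\d+)", re.I),
     "What time would you like your table for {0}?"),
    (re.compile(r"(?:what time|when) (?:do you|are you) open", re.I),
     "We open at 11am every day."),
]

def reply(utterance: str) -> str:
    for pattern, template in TEMPLATES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    # Off-script input immediately exposes the narrowness of the approach.
    return "Sorry, I didn't understand that."

print(reply("Can I book a table for 4?"))
print(reply("Do you allow dogs on the patio?"))
```

The first query matches a template and gets a sensible reply; the second falls outside the script and hits the fallback, which is exactly the limitation the passage describes.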
EVERY DAY AROUND 10m people take an Uber. The company has made ride-hailing commonplace in more than 600 cities in 82 countries. But the Volvo XC90 picking its way through traffic on a wintry morning in Pittsburgh is no ordinary Uber. Climb into the back, and you will see a screen mounted between the front seats, showing a digital representation of the world around the car, with other vehicles, pedestrians and cyclists highlighted as clusters of blue dots. Tap the screen to say you are ready to leave, and the car starts to move.