Tom is an analyst at the US Department of Defense (DoD).1 All day long, he and his team collect and process massive amounts of data from a variety of sources--weather data from the National Weather Service, traffic information from the US Department of Transportation, military troop movements, public website comments, and social media posts--to assess potential threats and inform mission planning. While some of the information Tom's group collects is structured and can be categorized easily (such as tropical storms in progress or active military engagements), the vast majority is simply unstructured text, including social media conversations, comments on public websites, and narrative reports filed by field agents. Because the data is unstructured, it's difficult to find patterns and draw meaningful conclusions. Tom and his team spend much of their day poring over paper and digital documents to detect trends, patterns, and activity that could raise red flags. In response to these kinds of challenges, DoD's Defense Advanced Research Projects Agency (DARPA) recently created the Deep Exploration and Filtering of Text (DEFT) program, which uses natural language processing (NLP), a form of artificial intelligence, to automatically extract relevant information and help analysts derive actionable insights from it.2 Across government, whether in defense, transportation, human services, public safety, or health care, agencies struggle with a similar problem--making sense out of huge volumes of unstructured text to inform decisions, improve services, and save lives.
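The DEFT program uses far more sophisticated NLP than this, but as a toy illustration of the kind of triage analysts like Tom do by hand, here is a minimal keyword flagger in plain Python (the watch-list terms and report texts are invented for illustration):

```python
def flag_reports(reports, watch_terms):
    """Scan unstructured text reports for watch-list terms.

    Returns (report_index, matched_terms) pairs for every report
    that mentions at least one term, so an analyst can review only
    the flagged subset instead of the whole pile.
    """
    hits = []
    for i, text in enumerate(reports):
        lowered = text.lower()
        matched = sorted(t for t in watch_terms if t in lowered)
        if matched:
            hits.append((i, matched))
    return hits


# Hypothetical field reports and watch list (illustrative only).
reports = [
    "Convoy spotted near the northern bridge at dawn.",
    "Clear skies, no unusual activity reported.",
]
print(flag_reports(reports, {"convoy", "roadblock"}))  # flags only the first report
```

Real systems like DEFT go well beyond substring matching, using statistical models to extract entities, relations, and events, but the goal is the same: surface the small fraction of unstructured text worth a human's attention.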
Machine learning is one of those buzzwords that gets thrown around as a synonym for AI (artificial intelligence), but that really isn't accurate: machine learning is a subset of AI. The field has also been around for quite some time, with roots going back to the late 1950s. It was during this period that IBM's Arthur L. Samuel created one of the first machine learning applications, a program that played checkers.
Researchers at the Georgia Institute of Technology found that state-of-the-art object recognition systems are less accurate at detecting pedestrians with darker skin tones. Crash-testing: The researchers tested eight image-recognition systems (each trained on a standard data set) against a large pool of pedestrian images. They divided the pedestrians into two groups for lighter and darker skin tones according to the Fitzpatrick skin type scale, a way of classifying human skin color. Color coded: The detection accuracy of the systems was found to be lower by an average of five percentage points for the group with darker skin. This held true even when controlling for time of day and obstructed view.
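The kind of disaggregated evaluation the Georgia Tech researchers performed can be sketched simply: compute detection accuracy separately for each skin-tone group and compare. This is a schematic illustration, not the researchers' actual code, and the sample counts below are invented to reproduce the reported five-percentage-point gap:

```python
def accuracy_by_group(records):
    """Compute detection accuracy per group.

    records: iterable of (group_label, detected) pairs, where
    `detected` is True if the system correctly detected the pedestrian.
    Returns {group_label: accuracy}.
    """
    totals, correct = {}, {}
    for group, detected in records:
        totals[group] = totals.get(group, 0) + 1
        correct[group] = correct.get(group, 0) + (1 if detected else 0)
    return {g: correct[g] / totals[g] for g in totals}


# Hypothetical results: 18/20 lighter-skin detections vs 17/20 darker-skin.
records = [("lighter", True)] * 18 + [("lighter", False)] * 2 \
        + [("darker", True)] * 17 + [("darker", False)] * 3
acc = accuracy_by_group(records)
gap = acc["lighter"] - acc["darker"]  # five percentage points in this toy data
```

Breaking a single headline accuracy number into per-group numbers is what exposes disparities like this; an aggregate score over the whole pool would hide the gap entirely.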
There are already networks in place that can detect seismic activity and send an alert as soon as an earthquake is underway. But the current technology doesn't actually send the alert until all of the sensors in the network covering a given area have detected seismic waves, and it could take about a minute from the moment activity is initially detected until an alert hits the wire. A minute is a long time in an emergency. Ideally, government agencies, public works, and local utilities need to alert the public and take actions like halting trains and shutting off power lines to mitigate damage – every second counts.
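One way to shave seconds off the delay described above is to alert once a quorum of sensors has detected shaking, rather than waiting for every sensor in the area. The sketch below is a simplified illustration of that idea (the sensor names, timestamps, and quorum size are assumptions, not details of any deployed system):

```python
def first_alert_time(detections, quorum):
    """Return the time at which `quorum` distinct sensors have
    detected seismic waves, or None if the quorum is never reached.

    detections: list of (sensor_id, detection_time) events, in any order.
    """
    seen = set()
    for sensor, t in sorted(detections, key=lambda e: e[1]):
        seen.add(sensor)
        if len(seen) >= quorum:
            return t
    return None


# Hypothetical detection times (seconds) from four sensors in one area.
events = [("A", 1.0), ("B", 3.0), ("C", 10.0), ("D", 60.0)]
# Waiting for all four sensors issues the alert at t=60; a 2-of-4
# quorum issues it at t=3, nearly a minute earlier.
```

The trade-off is false alarms: a smaller quorum fires faster but is more likely to be triggered by a single faulty or noisy sensor, which is presumably why deployed networks wait for broader agreement.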
Intel has done pretty well for itself by consistently figuring out ways of making CPUs faster and more efficient. But with the end of Moore's Law lurking on the horizon, Intel has been exploring ways of extending computing with innovative new architectures at Intel Labs. Quantum computing is one of these initiatives, and Intel Labs has been testing its own 49-qubit processors. Beyond that, Intel Labs is exploring neuromorphic computing (emulating the structure and, hopefully, some of the functionality of the human brain with artificial neural networks) as well as probabilistic computing, which is intended to help address the need to quantify uncertainty in artificial intelligence applications. Rich Uhlig has been the director of Intel Labs since December of 2018, which is really not all that long, but he's been at Intel since 1996 (most recently as Director of Systems and Software Research for Intel Labs) so he seems well qualified to hit the ground running.
BMW and the maker of Mercedes-Benz have reached a deal to collaborate on the development of self-driving car technology. The partnership between BMW and Daimler is a tectonic shift for the rival German luxury automakers, reflecting their need to collaborate on extremely expensive and challenging autonomous vehicle systems. The companies had already formed a joint venture to collaborate on "mobility services," such as car sharing and ride-hailing services. Taken together, these moves suggest that you could one day share a ride in a car jointly produced by two companies whose history of fierce competition is akin to the rivalry between American automakers Ford and General Motors.
BARCELONA, SPAIN - Next-generation wireless technology is taking the medical world a crucial step closer to robots performing remotely controlled surgery, a doctor in Spain said Wednesday after carrying out the world's first 5G-powered telementored operation. Doctors have telementored surgeries in the past using wireless networks, but blazing fast 5G increases image quality and definition, which are crucial for medical teams to make decisions with as much information, and as few mistakes, as possible. "This is a first step to achieve our dream, which is to make remote operations in the near future," said Dr. Antonio de Lacy after providing real-time guidance via a 5G video link from a Barcelona congress center to a surgical team that operated on a patient with an intestinal tumor about 5 kilometers away at the Hospital Clinic. Experts predict 5G will allow surgeons to control a robot arm to carry out operations in remote locations that lack specialist doctors. De Lacy, the head of the hospital's gastrointestinal surgery service, used his finger to draw on a screen an area of the intestine where nerves are located and instructed the team how to navigate the surgery.
A series of new Google Assistant ads aired during the 91st annual Oscars Sunday night, reimagining how some classic and recent films would change with Google's AI helper. Movies like 2001: A Space Odyssey, Ladybird, Psycho, and Scream all appeared in short ads that played throughout the award show. The ads reconsidered how crucial scenes from each movie would work with Google Assistant saving time or assisting the characters. This isn't the first time Google Assistant has appeared in classic movies; a previous commercial also reimagined Home Alone with an AI-friendly scenario back in December.