If you are looking for an answer to the question "What is artificial intelligence?" and you only have a minute, then here is the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
East Japan Railway Co. on Wednesday gave the media a demonstration of two artificial intelligence robots designed to guide passengers at Tokyo Station. The robots -- Pepper of SoftBank Robotics Corp. and SEMMI of German railway company Deutsche Bahn AG -- were deployed at an information desk in a shopping and dining center called Gransta on the basement floor of the station. Visitors can ask for directions to stores and restaurants in the facility in languages including Japanese, English and Chinese, JR East said. The trial, which began Monday and runs through May 31, is part of a technological exchange between JR East and Deutsche Bahn that began in 1992. The companies will assess the machines' capabilities and visitors' reactions to their appearance.
If you board a flight out of the United States four years from now, chances are the government is going to scan your face -- an ambitious timeline that has privacy experts reeling. That's according to a recent Department of Homeland Security report, which says that U.S. Customs and Border Protection (CBP) plans to dramatically expand its Biometric Exit program to cover 97 percent of outbound air passengers within four years. Through this program, which was already in place in 15 U.S. airports at the end of 2018, passengers have their faces scanned by cameras before boarding flights out of the nation. If the AI-powered system determines that the photo doesn't match one on file, CBP officials can look into it. The goal of these airport face scans is purportedly to catch people who have overstayed their visas, but civil liberties expert Edward Hasbrouck sees them as potentially giving the government increased control over American citizens.
Seeing the out-of-sight has turned a new corner. Now, digital cameras can take an image of an object hidden around a wall, which could help autonomous cars detect hazards in blind spots. In principle, any vertical edge can act as an accidental camera, by projecting subtle patterns of light onto the ground. These patterns reveal a semblance of what is happening on the other side of the edge and, though too faint to be noticed by the human eye, can be enhanced and interpreted by imaging algorithms.
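The underlying idea can be sketched in a few lines of Python with NumPy. This is a simplified illustration of the general "edge camera" technique described above, not the researchers' actual pipeline: subtract the temporal mean from each video frame to isolate the faint penumbra variations on the ground, then average the residual brightness within angular wedges around the corner to recover a coarse one-dimensional "view" of the hidden scene over time. The function name, wedge count, and radius are illustrative choices.

```python
import numpy as np

def edge_camera_trace(frames, corner, n_wedges=8, radius=20):
    """Recover a coarse 1-D angular signal from penumbra light near an edge.

    frames : (T, H, W) grayscale video of the floor near a vertical edge
    corner : (row, col) pixel position of the occluding edge's base
    Returns a (T, n_wedges) array: mean residual brightness per angular wedge.
    """
    frames = np.asarray(frames, dtype=float)
    background = frames.mean(axis=0)     # static component of the scene
    residual = frames - background       # faint variations cast by hidden objects

    h, w = background.shape
    rows, cols = np.mgrid[0:h, 0:w]
    dy, dx = rows - corner[0], cols - corner[1]
    dist = np.hypot(dy, dx)
    angle = np.arctan2(dy, dx)           # angle of each pixel about the corner

    # Keep pixels close to the corner, in the half-plane swept by the edge.
    in_range = (dist > 0) & (dist <= radius) & (angle >= 0)
    wedge_idx = np.clip((angle / np.pi * n_wedges).astype(int), 0, n_wedges - 1)

    trace = np.zeros((frames.shape[0], n_wedges))
    for k in range(n_wedges):
        mask = in_range & (wedge_idx == k)
        if mask.any():
            trace[:, k] = residual[:, mask].mean(axis=1)
    return trace
```

Each column of the returned trace corresponds to one direction behind the corner; a person walking through the hidden scene shows up as a bump that migrates across columns over time, which is the signal the imaging algorithms enhance and interpret.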
PROVIDENCE, RHODE ISLAND - A self-driving shuttle got pulled over by police on its first day carrying passengers on a new Rhode Island route. Providence Police Chief Hugh Clements said an officer pulled over the odd-looking autonomous vehicle because he had never seen one before. "It looked like an oversize golf cart," Clements said. The vehicle, operated by Michigan-based May Mobility, was dropping off passengers Wednesday morning at Providence's Olneyville Square when a police cruiser arrived with flashing lights and a siren. It was just hours after the public launch of a state-funded pilot for a shuttle service called "Little Roady."
Despite concerns over facial recognition's impact on civil liberties, public agencies have continued to apply the tool liberally across the U.S., with one of the biggest deployments coming to an airport near you. The U.S. Department of Homeland Security (DHS) said that it plans to expand its application of facial recognition to 97 percent of all passengers departing the U.S. by 2023, according to The Verge. By comparison, the technology was deployed in just 15 airports as of the end of 2018. Under what is being referred to as 'biometric exit,' the agency plans to use facial recognition to more thoroughly track passengers entering and leaving the country. The system functions by taking a picture of passengers before they depart and then cross-referencing the image with a database containing photos from passports and visas.
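The cross-referencing step described above is, at its core, a nearest-neighbor search over face embeddings. DHS has not published its matching algorithm, so the following is a generic hedged sketch in Python: the embeddings, the function name `best_match`, and the 0.85 threshold are all hypothetical, but the shape of the logic — compare the gate photo's feature vector against stored document photos and flag low-confidence matches for manual review — is how such systems are commonly built.

```python
import numpy as np

def best_match(probe, gallery, threshold=0.85):
    """Compare a probe face embedding against a gallery of document photos.

    probe   : 1-D feature vector extracted from the boarding-gate photo
    gallery : dict mapping document ID -> stored embedding
    Returns (doc_id, score) for the best cosine match, or (None, score)
    if no gallery entry clears the threshold -- the case an officer
    would then review manually.
    """
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    scores = {doc_id: cosine(probe, emb) for doc_id, emb in gallery.items()}
    doc_id, score = max(scores.items(), key=lambda kv: kv[1])
    return (doc_id, score) if score >= threshold else (None, score)
```

The threshold is the policy-relevant knob: set it low and more travelers are waved through on weak matches; set it high and more are pulled aside, which is where the civil-liberties questions raised in the article come in.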
The drone attack that brought Gatwick airport to a standstill last December could have been an "inside job", according to police, who said the perpetrator may have been operating the drone from within the airport. Sussex police told BBC Panorama that the fact an insider may have been behind the attack was "treated as a credible line of enquiry from the earliest stages of the police response". Gatwick's chief operating officer, Chris Woodroofe, believes the perpetrator was familiar with the airport's operational procedures and had a clear view of the runway or possibly infiltrated its communication network. "It was clear that the drone operators had a link into what was going on at the airport," he told Panorama, in his first interview since the incident. He said the culprit had carefully picked a drone that would remain undetected by the airport's DJI Aeroscope detection system being tested at the time.
The drone attack that caused chaos at Gatwick before Christmas was carried out by someone with knowledge of the airport's operational procedures, the airport has said. A Gatwick chief told BBC Panorama the drone's pilot "seemed to be able to see what was happening on the runway". Sussex Police told the programme the possibility an "insider" was involved was a "credible line" of inquiry. About 140,000 passengers were caught up in the disruption. The runway at the UK's second busiest airport was closed for 33 hours between 19 and 21 December last year - causing about 1,000 flights to be cancelled or delayed.
Royal Caribbean Cruises has begun using facial recognition systems to speed passengers on their way through security and ID checks. You and your family are at the pier, giddy to board the massive cruise ship docked nearby. Ahead lies a week of sunny beaches, indulgent buffet feasts and lounging around doing absolutely nothing. And then you see the long lines for security, baggage and ID checks. Check-in often takes 75 minutes, leaving the Pool Deck looking a lifetime away.
Explainable Artificial Intelligence (XAI) has attracted a lot of attention recently. Explainability is presented as a remedy for the lack of trust in model predictions. Model-agnostic tools such as LIME, SHAP, or Break Down promise instance-level interpretability for any complex machine learning model. But how certain are these explanations? Can we rely on additive explanations for non-additive models? In this paper, we examine the behavior of model explainers in the presence of interactions. We define two sources of uncertainty: model-level uncertainty and explanation-level uncertainty. We show that accounting for interactions reduces explanation-level uncertainty. We introduce a new method, iBreakDown, that generates non-additive explanations with local interactions.
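The problem the abstract raises can be demonstrated concretely. Below is a simplified Break Down-style attribution in Python with NumPy — an illustration of sequential conditioning, not the paper's actual iBreakDown implementation: fix the observation's features one at a time and credit each feature with the resulting change in the mean prediction. For a purely multiplicative model f(x) = x1 * x2, the credit assigned to each feature depends on the order in which features are fixed, which is exactly the explanation-level uncertainty that additive explanations hide.

```python
import numpy as np

def break_down(f, X, x, order):
    """Sequential (Break Down-style) attributions for observation x.

    f     : model, maps an (n, p) array to (n,) predictions
    X     : reference dataset used to estimate the expectations
    order : sequence of feature indices to fix one at a time
    Returns {feature_index: contribution}.
    """
    current = np.array(X, dtype=float)
    prev_mean = f(current).mean()      # E[f(X)], the intercept
    contrib = {}
    for j in order:
        current[:, j] = x[j]           # condition on feature j taking x's value
        mean = f(current).mean()
        contrib[j] = mean - prev_mean  # credit for the shift in mean prediction
        prev_mean = mean
    return contrib

# A purely multiplicative (non-additive) model: f(x) = x1 * x2.
f = lambda X: X[:, 0] * X[:, 1]
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
x = np.array([1.0, 1.0])

# Order [0, 1] yields {0: 0.25, 1: 0.5}; order [1, 0] yields {1: 0.25, 0: 0.5}.
# The same feature gets different credit depending on order, so a single
# additive explanation is ambiguous for this model.
print(break_down(f, X, x, order=[0, 1]))
print(break_down(f, X, x, order=[1, 0]))
```

An interaction-aware method attributes the leftover 0.25 to the pair (x1, x2) jointly instead of splitting it arbitrarily between orderings, which is the idea behind non-additive explanations with local interactions.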
GACHA will begin operating for the general public in the city of Espoo in April 2019, before rolling out to Hämeenlinna, Vantaa, and Helsinki later in the year. MUJI and Sensible 4 say that the inspiration for the design came from a toy capsule, a universal shape that 'embodies joy and excitement, bringing peace and happiness to those who encounter it.' 'The GACHA development got started when the Sensible 4 team, working back then with the first generation of robot buses, noticed that they just don't perform at all even in light rain, not to mention the typical winter conditions in Finland,' says Harri Santamala, CEO of Sensible 4. 'Completely autonomous self-driving technology is not here yet.'