If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Facial recognition cameras prevent crime, protect the public and do not breach the privacy of innocent people whose images are captured, a police force has argued. Ed Bridges, an office worker from Cardiff, claims South Wales police violated his privacy and data protection rights by using facial recognition technology on him. But Jeremy Johnson QC compared automated facial recognition (AFR) to the use of DNA to solve crimes and said it would have had little impact on Bridges. Johnson, representing the police, said: "AFR is a further technology that potentially has great utility for the prevention of crime, the apprehension of offenders and the protection of the public." The technology maps faces in a crowd and then compares them with a watch list of images, which can include suspects, missing people and persons of interest to the police.
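The matching step described above — mapping faces in a crowd and comparing them against a watch list — can be sketched in miniature. Modern AFR systems use a neural network to convert each face image into a numeric embedding; the sketch below assumes those embeddings already exist and simply compares them by cosine similarity. The names, embedding size, and threshold are illustrative assumptions, not details of South Wales Police's system.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match_against_watchlist(face_embedding, watchlist, threshold=0.8):
    """Return watch-list entries whose stored embedding is close to the probe face.

    `watchlist` maps an identifier to a reference embedding; both the
    identifiers and the threshold here are hypothetical.
    """
    hits = []
    for name, ref_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, ref_embedding)
        if score >= threshold:
            hits.append((name, score))
    # Strongest matches first, so operators review the most likely hit first.
    return sorted(hits, key=lambda h: h[1], reverse=True)
```

In a deployed system the threshold trades false alarms against missed matches — the central point of contention in the Bridges case.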
PROVIDENCE, RHODE ISLAND - A self-driving shuttle got pulled over by police on its first day carrying passengers on a new Rhode Island route. Providence Police Chief Hugh Clements said an officer pulled over the odd-looking autonomous vehicle because he had never seen one before. "It looked like an oversize golf cart," Clements said. The vehicle, operated by Michigan-based May Mobility, was dropping off passengers Wednesday morning at Providence's Olneyville Square when a police cruiser arrived with blinking lights and a siren. It was just hours after the public launch of a state-funded pilot for a shuttle service called "Little Roady." The shuttle offers free rides on a 12-stop urban loop.
The images on Eduardo Fidalgo's computer show mundane scenes – a sofa scattered with pillows, a folded duvet on a bed, some children's toys strewn across a floor. They depict views most of us would see around our own homes. But these rather ordinary pictures are helping to build a new weapon in the fight against crime. Fidalgo and his colleagues are using the images to train a machine to spot clues in crime scene photographs. When police officers visit a crime scene or a suspect's home, they are often confronted with an overwhelming amount of visual information.
Police in Vancouver, British Columbia, are cracking down on burglary with a machine learning solution that uses an algorithm to deconstruct crime patterns. Through spatial analytics, police are able to predict where residential break-and-enters will occur and place police patrols accordingly. The department first tried this technology with a pilot test that reduced burglary by more than 20% month over month. Now they are making the approach common practice. "Every 28 days, our management reviews crime trends, crime clustering, and crime issues across the city," said Ryan Prox, Special Constable in charge of the Crime Analytics Advisory and Development Unit, Vancouver Police.
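The spatial-analytics idea — clustering past break-ins to flag where patrols should go — can be illustrated with the simplest possible version: bucket incident coordinates into grid cells and rank cells by count. Vancouver's actual system is far more sophisticated (it forecasts, rather than just counts), so the cell size and the counting approach here are illustrative assumptions only.

```python
from collections import Counter

def hotspot_cells(incidents, cell_size=0.01):
    """Bucket (lat, lon) incident coordinates into a grid and count per cell.

    A cell_size of 0.01 degrees is roughly a 1 km square at Vancouver's
    latitude -- an assumption for illustration, not the department's setting.
    """
    counts = Counter()
    for lat, lon in incidents:
        cell = (int(lat / cell_size), int(lon / cell_size))
        counts[cell] += 1
    return counts

def top_hotspots(incidents, k=3, cell_size=0.01):
    """Return the k grid cells with the most incidents -- candidate patrol areas."""
    return hotspot_cells(incidents, cell_size).most_common(k)
```

A predictive system would then extrapolate these counts forward in time; even this static version shows why clustering matters — burglaries concentrate, so a handful of cells captures most of the risk.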
Some of the best, or at least sharpest, minds on the planet are devoted to guessing what we might buy next, and showing us advertisements for it. Often the results are ludicrously inaccurate; sometimes they are creepily precise. Would we trust the same kind of technology to predict what crimes we might next commit? That is the question raised by the latest report by campaigners at Liberty on the implications of the police's use of big data and machine learning, the technologies usually referred to as artificial intelligence. When they're used to sell us things, they are relatively harmless.
A police force in the UK is using an algorithm to help decide which crimes are solvable and should be investigated by officers. As a result, the force trialling it now investigates roughly half as many reported assaults and public order offences. This saves time and money, but some have raised concerns that the algorithm could bake in human biases and lead to some solvable cases being ignored. The tool is currently only used for assessing assaults and public order offences, but may be extended to other crimes in the future. When a crime is reported to police, an officer is normally sent to the scene to find out basic facts.
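The triage idea behind such a tool — score each reported offence on solvability factors and only dispatch investigators above a cutoff — can be sketched with a toy rule-based version. The force's real model is machine-learned from historical case outcomes, and its features, weights, and threshold are not public; everything below is an illustrative assumption.

```python
def solvability_score(case):
    """Toy evidence-weighting score for a reported offence.

    The features and integer weights here are invented for illustration;
    the deployed tool learns its weighting from past cases.
    """
    weights = {
        "cctv_available": 3,
        "suspect_named": 4,
        "witnesses": 2,
        "forensic_evidence": 3,
    }
    return sum(w for feature, w in weights.items() if case.get(feature))

def triage(cases, threshold=4):
    """Split reported cases into those allocated for investigation and those filed."""
    investigate = [c for c in cases if solvability_score(c) >= threshold]
    file_only = [c for c in cases if solvability_score(c) < threshold]
    return investigate, file_only
```

The bias concern raised in the article lives in exactly these weights: if historical data under-records evidence for some victims or areas, a learned version of this scoring quietly deprioritises their cases.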
For the second time in less than a month, suspected drone sightings have shut down a UK airport. On 8 January flights out of London Heathrow were suspended for over an hour. And between 19 and 21 December, more than 140,000 people at London Gatwick had their travel plans disrupted after drones were spotted above the airport. How can drones cause so much disruption? Airports operate on a just-in-time basis, with Heathrow moving a plane onto or off its runways every 45 seconds on average.
Police will be handed extra powers to combat drones after the mass disruption at Gatwick airport in the run-up to Christmas. Gatwick was repeatedly forced to close between 19 and 21 December due to reported drone sightings, affecting about 1,000 flights. In response the government has announced a package of measures which include plans to give police the power to land, seize and search drones. The Home Office will also begin to test and evaluate the use of counter-drone technology at airports and prisons. The exclusion zone around airports will be extended to approximately a 5 km radius (3.1 miles), with additional extensions from runway ends.
"Military capability" deployed to counter illegal drone flights at Gatwick Airport has been withdrawn, the Ministry of Defence has confirmed. The Army was deployed as hundreds of flights were cancelled on 20 December following repeated drone sightings. Gatwick said it had spent £5m to prevent future attacks, but would not comment on the nature of the system. Sussex Police said no arrests have been made since a couple were released without charge on 23 December. More than 140,000 passengers were affected by cancellations and delays during the 36 hours of chaos.