If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Difficult co-workers who defy authority are more likely to cheat on their partners, a new study suggests. Researchers at the University of Texas discovered the correlation after studying the records of police officers, financial advisers, white-collar criminals and senior executives who used the Ashley Madison marital infidelity website. The data suggests a strong connection between people's actions in their personal and professional lives. They found that Ashley Madison users were more than twice as likely to engage in corporate misconduct. Researchers investigated four study groups totalling 11,235 individuals.
A Los Angeles police officer wears an Axon body camera in 2017. On Thursday, the company announced it is holding off on facial recognition software, citing its unreliability. The largest manufacturer of police body cameras is rejecting the possibility of selling facial recognition technology – at least, for now.
San Francisco District Attorney George Gascón, left, announces a new AI tool that will curb racial biases when deciding criminal charges, alongside Alex Chohlas-Wood, right, who helped develop the tool. (ASSOCIATED PRESS) San Francisco says it will start using an artificial intelligence tool to reduce possible racial bias among prosecutors reviewing police reports, a "first-in-the-nation" use of a technology whose applications have been criticized for compounding bias. On Wednesday, District Attorney George Gascón announced that the city on July 1 would begin to use a "bias mitigation tool" that automatically redacts anything on the police report that might be suggestive of race, from hair color to zip code. Information about the police officer, such as badge number, will also be hidden. Currently, the district attorney's office manually removes the first few pages of the report, but if any race details are in the narrative (the section where the police officer describes the crime), prosecutors can see them. "This technology will reduce the threat that implicit bias poses to the purity of decisions which have serious ramifications for the accused, and that will help make our system of justice more fair and just," Gascón said.
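The article describes the tool only at a high level: it masks race-suggestive details (hair color, zip codes) and officer identifiers before a prosecutor reads the report. A minimal sketch of that general idea might look like the following; the term list, patterns, and placeholder labels are illustrative assumptions, not the actual tool's rules.

```python
import re

# Illustrative sketch of automated report redaction: mask fields that
# could suggest race (hair color, zip code) and officer identifiers.
# All terms and patterns below are assumptions for demonstration only.
RACE_SUGGESTIVE_TERMS = ["black", "white", "hispanic", "asian",
                         "blond", "blonde", "dark hair"]
PATTERNS = [
    (re.compile(r"\b\d{5}(?:-\d{4})?\b"), "[ZIP REDACTED]"),          # zip codes
    (re.compile(r"\bbadge\s*#?\s*\d+\b", re.IGNORECASE), "[BADGE REDACTED]"),
]

def redact(report: str) -> str:
    """Return a copy of the report with race-suggestive details masked."""
    for term in RACE_SUGGESTIVE_TERMS:
        report = re.sub(rf"\b{re.escape(term)}\b", "[REDACTED]",
                        report, flags=re.IGNORECASE)
    for pattern, replacement in PATTERNS:
        report = pattern.sub(replacement, report)
    return report

print(redact("Suspect: white male, blond hair, badge # 4521, zip 94110"))
```

A real system would need far more robust natural-language handling than keyword matching, since race can be implied indirectly, which is presumably why the city commissioned a purpose-built tool.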
Facial recognition cameras prevent crime, protect the public and do not breach the privacy of innocent people whose images are captured, a police force has argued. Ed Bridges, an office worker from Cardiff, claims South Wales police violated his privacy and data protection rights by using facial recognition technology on him. But Jeremy Johnson QC compared automated facial recognition (AFR) to the use of DNA to solve crimes and said it would have had little impact on Bridges. Johnson, representing the police, said: "AFR is a further technology that potentially has great utility for the prevention of crime, the apprehension of offenders and the protection of the public." The technology maps faces in a crowd and then compares them with a watch list of images, which can include suspects, missing people and persons of interest to the police.
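The matching step described above (mapping faces in a crowd, then comparing them with a watch list) can be sketched as a nearest-neighbour search over numeric face "embeddings". The 4-dimensional vectors and the threshold below are toy assumptions; real AFR systems use learned embeddings with hundreds of dimensions.

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm

# Toy watch list: identity -> reference face embedding (made-up values).
WATCH_LIST = {
    "suspect_A": [0.9, 0.1, 0.3, 0.7],
    "missing_person_B": [0.2, 0.8, 0.5, 0.1],
}

def match(face_embedding, threshold=0.95):
    """Return the watch-list identity most similar to the detected face,
    or None if no similarity clears the threshold."""
    best_id, best_sim = None, threshold
    for identity, ref in WATCH_LIST.items():
        sim = cosine_similarity(face_embedding, ref)
        if sim > best_sim:
            best_id, best_sim = identity, sim
    return best_id

print(match([0.88, 0.12, 0.31, 0.69]))  # prints "suspect_A"
```

The threshold is the crux of the privacy debate: set it low and innocent passers-by generate false matches; set it high and the system's "utility for the apprehension of offenders" shrinks.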
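The spatial-analytics approach described above (deconstructing crime patterns to predict where break-and-enters will cluster) can be sketched as a simple grid-based hot-spot count. The coordinates, grid resolution, and ranking are illustrative assumptions; Vancouver's production system is far more sophisticated than counting incidents per cell.

```python
from collections import Counter

CELL_SIZE = 0.01  # grid resolution in degrees of lat/lon (assumption)

def to_cell(lat, lon):
    """Map a coordinate to a discrete grid cell."""
    return (round(lat / CELL_SIZE), round(lon / CELL_SIZE))

def hot_spots(incidents, top_n=2):
    """Rank grid cells by historical incident count; patrols would be
    placed in the top-ranked cells."""
    counts = Counter(to_cell(lat, lon) for lat, lon in incidents)
    return [cell for cell, _ in counts.most_common(top_n)]

# Made-up historical break-and-enter locations: three cluster together.
past_burglaries = [
    (49.2827, -123.1207), (49.2829, -123.1210), (49.2825, -123.1205),
    (49.2600, -123.1000), (49.2700, -123.1500),
]
print(hot_spots(past_burglaries, top_n=1))
```

The 28-day review cycle mentioned by Prox fits this pattern: re-running the ranking on a rolling window lets the hot-spot map track shifting crime trends.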
PROVIDENCE, RHODE ISLAND - A self-driving shuttle got pulled over by police on its first day carrying passengers on a new Rhode Island route. Providence Police Chief Hugh Clements said an officer pulled over the odd-looking autonomous vehicle because he had never seen one before. "It looked like an oversize golf cart," Clements said. The vehicle, operated by Michigan-based May Mobility, was dropping off passengers Wednesday morning at Providence's Olneyville Square when a police cruiser arrived with blinking lights and a siren. It was just hours after the public launch of a state-funded pilot for a shuttle service called "Little Roady."
The images on Eduardo Fidalgo's computer show mundane scenes – a sofa scattered with pillows, a folded duvet on a bed, some children's toys strewn across a floor. They depict views most of us would see around our own homes. But these rather ordinary pictures are helping to build a new weapon in the fight against crime. Fidalgo and his colleagues are using the images to train a machine to spot clues in crime scene photographs. When police officers visit a crime scene or a suspect's home, they are often confronted with an overwhelming amount of visual information.
Police in Vancouver, British Columbia are cracking down on burglary with a machine learning solution that uses an algorithm to deconstruct crime patterns. Through spatial analytics, police are able to predict where residential break-and-enters will occur and place police patrols accordingly. The department first tried this technology with a pilot test that reduced burglary by more than 20% month over month. Now they are making the approach common practice. "Every 28 days, our management reviews crime trends, crime clustering, and crime issues across the city," said Ryan Prox, Special Constable in Charge of Crime Analytics Advisory and Development Unit, Vancouver Police.
Some of the best, or at least sharpest, minds on the planet are devoted to guessing what we might buy next, and showing us advertisements for it. Often the results are ludicrously inaccurate; sometimes they are creepily precise. Would we trust the same kind of technology to predict what crimes we might next commit? That is the question raised by the latest report by campaigners at Liberty on the implications of the police's use of big data and machine learning, the technologies usually referred to as artificial intelligence. When they're used to sell us things, they are relatively harmless.
A police force in the UK is using an algorithm to help decide which crimes are solvable and should be investigated by officers. As a result, the force trialling it now investigates roughly half as many reported assaults and public order offences. This saves time and money, but some have raised concerns that the algorithm could bake in human biases and lead to some solvable cases being ignored. The tool is currently only used for assessing assaults and public order offences, but may be extended to other crimes in the future. When a crime is reported to police, an officer is normally sent to the scene to find out basic facts.
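One common way such triage tools work, and a hedged sketch only, is to score a case on "solvability factors" and screen out cases below a cutoff. The factors, weights, and threshold below are illustrative guesses, not the UK force's actual model, which was reportedly trained on historical case outcomes.

```python
# Illustrative solvability factors and weights (assumptions, not the
# real model): evidence that historically predicts a case being solved.
SOLVABILITY_WEIGHTS = {
    "cctv_available": 3.0,
    "witness_present": 2.5,
    "suspect_named": 4.0,
    "forensic_evidence": 3.5,
}
THRESHOLD = 4.0  # cases scoring below this would be screened out

def solvability_score(case: dict) -> float:
    """Sum the weights of the solvability factors present in the case."""
    return sum(w for factor, w in SOLVABILITY_WEIGHTS.items()
               if case.get(factor))

def should_investigate(case: dict) -> bool:
    return solvability_score(case) >= THRESHOLD

case = {"cctv_available": True, "witness_present": True}
print(solvability_score(case), should_investigate(case))
```

The bias concern raised above is visible even in this toy version: if the weights are fitted to past outcomes, any historical bias in which cases officers chose to pursue is baked directly into the threshold decision.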