Real-life Minority Report: Argentina will use AI to 'predict future crimes'

Daily Mail - Science & tech

Argentinian security forces have announced plans to use artificial intelligence to 'predict future crimes', but experts warn the move could threaten citizens' rights. Far-right president Javier Milei has created the Artificial Intelligence Applied to Security Unit, which will use algorithms to analyse historical crime data. The data produced will then be used to predict future crimes, The Guardian has reported. The security unit is also expected to be able to use facial recognition software to track down wanted persons and detect suspicious activity. However, the Minority Report-esque resolution has concerned human rights campaigners, who fear certain groups in society may be over-scrutinised by the AI technology.


Argentina will use AI to 'predict future crimes' but experts worry for citizens' rights

The Guardian

Argentina's security forces have announced plans to use artificial intelligence to "predict future crimes" in a move experts have warned could threaten citizens' rights. The country's far-right president Javier Milei this week created the Artificial Intelligence Applied to Security Unit, which the legislation says will use "machine-learning algorithms to analyse historical crime data to predict future crimes". It is also expected to deploy facial recognition software to identify "wanted persons", patrol social media, and analyse real-time security camera footage to detect suspicious activities. While the ministry of security has said the new unit will help to "detect potential threats, identify movements of criminal groups or anticipate disturbances", the Minority Report-esque resolution has sent alarm bells ringing among human rights organisations. Experts fear that certain groups of society could be overly scrutinised by the technology, and have also raised concerns over who – and how many security forces – will be able to access the information.


TechScape: can AI really predict crime?

The Guardian

In 2011, the Los Angeles police department rolled out a novel approach to policing called Operation Laser. Laser – which stood for Los Angeles Strategic Extraction and Restoration – was the first predictive policing programme of its kind in the US, allowing the LAPD to use historical data to predict with laser precision (hence the name) where future crimes might be committed and who might commit them. But it was anything but precise. The programme used historical crime data like arrests, calls for service, field interview cards – which police filled out with identifying information every time they stopped someone, regardless of the reason – and more to map out "problem areas" for officers to focus their efforts on, or to assign criminal risk scores to individuals. Information collected during these policing efforts was fed into computer software that further helped automate the department's crime-prediction efforts.


AI for Crime Prevention and Detection - 5 Current Applications

#artificialintelligence

Daniel Faggella is Head of Research at Emerj. Called upon by the United Nations, World Bank, INTERPOL, and leading enterprises, Daniel is a globally sought-after expert on the competitive strategy implications of AI for business and government leaders. Companies and cities all over the world are experimenting with using artificial intelligence to reduce and prevent crime, and to more quickly respond to crimes in progress. The idea behind many of these projects is that crimes are relatively predictable; it just requires being able to sort through a massive volume of data to find patterns that are useful to law enforcement. This kind of data analysis was technologically impossible a few decades ago, but the hope is that recent developments in machine learning are up to the task.


Justice Can't Be Colorblind: How to Fight Bias with Predictive Policing

@machinelearnbot

Originally published by Scientific American. Law enforcement's use of predictive analytics recently came under fire again. Dartmouth researchers made waves reporting that simple predictive models--as well as nonexpert humans--predict crime just as well as the leading proprietary analytics software. That the leading software achieves (only) human-level performance might not actually be a deadly blow, but a flurry of press from dozens of news outlets has quickly followed. In any case, even as this disclosure raises questions about one software tool's credibility, a more enduring, inherent quandary continues to plague predictive policing.


Artificial Intelligence as a Weapon for Hate and Racism

#artificialintelligence

The stunning advancement of artificial intelligence and machine learning has brought real benefits to society. These technologies have improved medicine and how quickly doctors can diagnose disease, for example. IBM's AI platform Watson helps reduce water waste in drought-stricken areas. AI even entertains us – the more you use Netflix, the more it learns what your viewing preferences are and makes suggestions based on what you like to watch. However, there is a very dark side to AI, and it's worrying many social scientists and some in the tech industry.


After reading thousands of romance books, Google's AI is writing eerie post-modern poetry

#artificialintelligence

Risk assessment scoring algorithms are used in courtrooms throughout the United States to determine whether someone is more likely to commit a future crime. Evidence shows they are biased against blacks. "There's software used across the country to predict future criminals. And it's biased against blacks.

ON A SPRING AFTERNOON IN 2014, Brisha Borden was running late to pick up her god-sister from school when she spotted an unlocked kid's blue Huffy bicycle and a silver Razor scooter. Borden and a friend grabbed the bike and scooter and tried to ride them down the street in the Fort Lauderdale suburb of Coral Springs. Just as the 18-year-old girls were realizing they were too big for the tiny conveyances – which belonged to a 6-year-old boy – a woman came running after them saying, "That's my kid's stuff." Borden and her friend immediately dropped the bike and scooter and walked away. But it was too late – a neighbor who witnessed the heist had already called the police. Borden and her friend were arrested and charged with burglary and petty theft for the items, which were valued at a total of $80.

Compare their crime with a similar one: The previous summer, 41-year-old Vernon Prater was picked up for shoplifting $86.35 worth of tools from a nearby Home Depot store. Prater was the more seasoned criminal. He had already been convicted of armed robbery and attempted armed robbery, for which he served five years in prison, in addition to another armed robbery charge. Borden had a record, too, but it was for misdemeanors committed when she was a juvenile.

Yet something odd happened when Borden and Prater were booked into jail: A computer program spat out a score predicting the likelihood of each committing a future crime. Borden – who is black – was rated a high risk. Prater – who is white – was rated a low risk."