If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Lord Burnett of Maldon, the current Lord Chief Justice, has set up a new Advisory Body with the aim of ensuring that the Judiciary of England and Wales is fully informed about developments in artificial intelligence (AI). Professor Richard Susskind, President of the Society for Computers & Law, has been named chair of the body, and in a recent interview stated that AI has taken off in the last six or seven years, to the point where it has become "affordable and practical". Professor Susskind believes that the new group will start a dialogue among the judiciary about "one of the most influential technologies that there is", and recognises the importance of judges being open to the opportunities that AI technology could offer to the court system (with "practical tasks" cited as an example). The 10-person team will be made up of both senior judges (including Lord Neuberger, past President of the UK Supreme Court, and Lady Justice Sharp, Vice-President of the Queen's Bench Division) and leading experts on AI and law (such as Professor Katie Atkinson, past President of the International Association for AI and Law). There is little doubt that automation already plays an essential role in the legal profession, for example, in large disclosure exercises.
In 1970, Lyudmila Terentyevna Aleksandrova lost her right hand. It happened at work, where she was employed by the Russian state. With her hand gone, she fought for a disability allowance that never materialized, batted about by district and regional courts. Eventually, after decades of frustration, she brought the case to the European Court of Human Rights, which ruled in 2007 that there had been a violation of Aleksandrova's right to a fair trial. Pay the money, it told Russia.
In May of 2010, prompted by a series of high-profile scandals, the mayor of New Orleans asked the US Department of Justice to investigate the city police department (NOPD). Ten months later, the DOJ offered its blistering analysis: during the period of its review from 2005 onwards, the NOPD had repeatedly violated constitutional and federal law. It used excessive force, disproportionately against black residents; targeted racial minorities, non-native English speakers, and LGBTQ individuals; and failed to address violence against women. The problems, said assistant attorney general Thomas Perez at the time, were "serious, wide-ranging, systemic and deeply rooted within the culture of the department." Despite the disturbing findings, the city entered a secret partnership only a year later with data-mining firm Palantir to deploy a predictive policing system.
As artificial intelligence moves into the courtroom, much has been written about sentencing algorithms with hidden biases. Daniel L. Chen, a researcher at both the Toulouse School of Economics and University of Toulouse Faculty of Law, has a different idea: using AI to help correct the biased decisions of human judges. Chen, who holds both a law degree and a doctorate in economics, has spent years collecting data on judges and US courts. "One thing that's been particularly nagging my mind is how to understand all of the behavioral biases that we've found," he says -- for example, the human biases that can tip the scales when a judge makes a decision.
A machine learning model may exhibit discrimination when used to make decisions involving people. One potential cause for such outcomes is that the model uses a statistical proxy for a protected demographic attribute. In this paper we formulate a definition of proxy use for the setting of linear regression and present algorithms for detecting proxies. Our definition follows recent work on proxies in classification models, and characterizes a model's constituent behavior that: 1) correlates closely with a protected random variable, and 2) is causally influential in the overall behavior of the model. We show that proxies in linear regression models can be efficiently identified by solving a second-order cone program, and further extend this result to account for situations where the use of a certain input variable is justified as a "business necessity". Finally, we present empirical results on two law enforcement datasets that exhibit varying degrees of racial disparity in prediction outcomes, demonstrating that proxies shed useful light on the causes of discriminatory behavior in models.
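The paper's actual detection method solves a second-order cone program; as a rough illustration only, the two defining conditions of a proxy (association with the protected variable, and influence on the model's output) can be sketched for a single input of a linear model. The function `proxy_score`, the scoring formula, and the synthetic data below are assumptions for the sake of the sketch, not the paper's algorithm:

```python
import numpy as np

def proxy_score(X, w, j, z):
    """Heuristic proxy score for input j of a linear model f(x) = w @ x.

    Combines (a) how strongly the component w[j] * x_j correlates with the
    protected attribute z, and (b) how much of the output's variation that
    component accounts for. Both conditions mirror the paper's definition;
    the product form is an illustrative simplification.
    """
    component = w[j] * X[:, j]
    output = X @ w
    association = abs(np.corrcoef(component, z)[0, 1])          # closeness to z
    influence = np.std(component) / (np.std(output) + 1e-12)    # effect on output
    return association * influence

# Synthetic data: one feature is nearly a relabeling of the protected
# attribute, the other is unrelated noise.
rng = np.random.default_rng(0)
n = 5000
z = rng.integers(0, 2, n).astype(float)        # protected attribute
x_proxy = z + 0.1 * rng.normal(size=n)         # near-perfect proxy for z
x_clean = rng.normal(size=n)                   # unrelated feature
X = np.column_stack([x_proxy, x_clean])
w = np.array([1.0, 1.0])

score_proxy = proxy_score(X, w, 0, z)
score_clean = proxy_score(X, w, 1, z)
```

On this data the proxy feature scores far higher than the clean one, which is the behavior a real detector must certify efficiently over all linear combinations of inputs, hence the cone program in the paper.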
ALBANY – Sprint has agreed to pay a $330 million settlement after the company skirted New York tax law for nearly a decade, New York's attorney general announced Friday. The record-breaking settlement came in the wake of a false claims lawsuit filed by Attorney General Barbara Underwood alleging the cellular provider failed to collect and remit over $100 million in state and local taxes on flat-rate calling plans. The $330 million settlement is the largest recovery by a single state in a false claims lawsuit, according to the attorney general's office. "Sprint knew exactly how New York sales tax law applied to its plans – yet for years the company flagrantly broke the law, cheating the state and its localities out of tax dollars that should have been invested in our communities," Underwood said in a statement.
Braz, Fabricio Ataides, da Silva, Nilton Correia, de Campos, Teofilo Emidio, Chaves, Felipe Borges S., Ferreira, Marcelo H. S., Inazawa, Pedro Henrique, Coelho, Victor H. D., Sukiennik, Bernardo Pablo, de Almeida, Ana Paula Goncalves Soares, Vidal, Flavio Barros, Bezerra, Davi Alves, Gusmao, Davi B., Ziegler, Gabriel G., Fernandes, Ricardo V. C., Zumblick, Roberta, Peixoto, Fabiano Hartmann
The Brazilian court system is currently the most congested judicial system in the world. Thousands of lawsuits reach the supreme court every day. These cases need to be analyzed so that they can be associated with relevant tags and allocated to the right team. Most cases reach the court as raster-scanned documents of widely variable quality. One of the first steps in the analysis is to classify these documents. In this paper we present a Bidirectional Long Short-Term Memory network (Bi-LSTM) to classify these legal documents.
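A Bi-LSTM classifier of the kind the abstract describes can be sketched in a few lines of PyTorch. This is a minimal illustration, not the paper's architecture: it assumes the scanned documents have already been OCR'd and tokenized into integer id sequences, and the vocabulary size, dimensions, and class count below are placeholder values:

```python
import torch
import torch.nn as nn

class BiLSTMClassifier(nn.Module):
    """Minimal sketch of a Bi-LSTM document classifier.

    Hyperparameters are illustrative, not taken from the paper.
    """
    def __init__(self, vocab_size=10000, embed_dim=64, hidden=64, n_classes=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # bidirectional=True reads each token sequence left-to-right
        # and right-to-left, giving context from both directions.
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True,
                            bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)  # concat of both directions

    def forward(self, token_ids):
        emb = self.embed(token_ids)            # (batch, seq_len, embed_dim)
        _, (h, _) = self.lstm(emb)             # h: (2, batch, hidden)
        feats = torch.cat([h[0], h[1]], dim=1) # final forward + backward states
        return self.fc(feats)                  # per-class logits

model = BiLSTMClassifier()
batch = torch.randint(1, 10000, (4, 50))       # 4 dummy documents, 50 tokens each
logits = model(batch)
```

In practice the logits would be passed through a softmax and trained with cross-entropy against the documents' tag labels.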
When the head of the U.S. Supreme Court says artificial intelligence (AI) is having a significant impact on how the legal system in this country works, you pay attention. That's exactly what happened when Chief Justice John Roberts was asked the following question: "Can you foresee a day when smart machines, driven with artificial intelligences, will assist with courtroom fact-finding or, more controversially even, judicial decision-making?" His answer startled the audience. "It's a day that's here and it's putting a significant strain on how the judiciary goes about doing things," he said, as reported by The New York Times. In the last decade, the field of AI has experienced a renaissance.
Sharing a name with a famous person can prompt endless jokes and comments -- but in these particularly politically charged times, having the same name as a political figure can be especially tiresome. That's something a young man from Kentucky named Brett Kavanagh has learned only too well in recent weeks. On Friday, Brett, 27, tweeted about the recent woes of having his name ("This is a terrible time to be named Brett Kavanagh"), prompting others with famous names to commiserate: women named Siri and Alexa, men named Michael Jackson and Bruce Lee, and even a Scottish man named Steve Bannon -- not the same as Breitbart's Steve Bannon -- all chimed in about how hard it is to have a well-known name. This Brett, who works in customer service and lives in Louisville, spells his last name differently from new Supreme Court Justice Brett Kavanaugh, but their nearly identical names have still caused him some trouble.