Pennsylvania man's 'gunlike hand gesture' toward neighbor was a crime, court rules

FOX News

A Pennsylvania court ruled Tuesday that a man's "gunlike hand gesture" toward his neighbor during an argument constituted a crime -- an act that reportedly made several nearby residents nervous and prompted a call to police. Stephen Kirchner, 64, made the gesture toward his neighbor in Manor Township in June 2018, according to surveillance video. Kirchner, walking alongside a female neighbor, "stopped, made eye contact with [the male neighbor] and then made a hand gesture at him imitating the firing and recoiling of a gun," according to court documents.


Fair Algorithms for Learning in Allocation Problems

arXiv.org Machine Learning

Settings such as lending and policing can be modeled by a centralized agent allocating a resource (loans or police officers) amongst several groups, in order to maximize some objective (loans given that are repaid or criminals that are apprehended). Often in such problems fairness is also a concern. A natural notion of fairness, based on general principles of equality of opportunity, asks that conditional on an individual being a candidate for the resource, the probability of actually receiving it is approximately independent of the individual's group. In lending this means that equally creditworthy individuals in different racial groups have roughly equal chances of receiving a loan. In policing it means that two individuals committing the same crime in different districts would have roughly equal chances of being arrested. We formalize this fairness notion for allocation problems and investigate its algorithmic consequences. Our main technical results include an efficient learning algorithm that converges to an optimal fair allocation even when the frequency of candidates (creditworthy individuals or criminals) in each group is unknown. The algorithm operates in a censored feedback model in which only the number of candidates who received the resource in a given allocation can be observed, rather than the true number of candidates. This models the fact that we do not learn the creditworthiness of individuals we do not give loans to nor learn about crimes committed if the police presence in a district is low. As an application of our framework, we consider the predictive policing problem. The learning algorithm is trained on arrest data gathered from its own deployments on previous days, resulting in a potential feedback loop that our algorithm provably overcomes. We empirically investigate the performance of our algorithm on the Philadelphia Crime Incidents dataset.
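The censored-feedback model described in the abstract can be illustrated with a toy simulation. This is not the paper's algorithm, only a minimal sketch of the observation model: the learner sees how many candidates were found among the individuals it allocated the resource to, never the true number of candidates in each group. All names here (TRUE_RATES, BUDGET, the proportional re-allocation rule) are illustrative assumptions.

```python
import random

random.seed(0)

# Hidden ground truth: the fraction of candidates in each group.
# The learner never observes these directly.
TRUE_RATES = {"A": 0.6, "B": 0.3}
BUDGET = 80  # total resource units available per round

def censored_feedback(group, allocated):
    """Censored observation: count candidates only among those
    who actually received the resource in this group."""
    return sum(random.random() < TRUE_RATES[group] for _ in range(allocated))

# Running estimates of each group's candidate rate, built solely
# from censored observations (start from an uninformative prior).
est = {"A": 0.5, "B": 0.5}
hits = {"A": 0.5, "B": 0.5}
counts = {"A": 1, "B": 1}

for _ in range(200):
    # Allocate the budget proportionally to current estimates --
    # this is where a feedback loop can arise: groups estimated
    # low receive fewer units and hence generate less data.
    total = est["A"] + est["B"]
    alloc = {g: max(1, int(BUDGET * est[g] / total)) for g in est}
    for g in est:
        got = censored_feedback(g, alloc[g])
        hits[g] += got
        counts[g] += alloc[g]
        est[g] = hits[g] / counts[g]

print({g: round(est[g], 2) for g in est})
```

In this toy setting the per-individual observations are unbiased, so the naive estimator still converges; the paper's contribution is handling settings where the feedback loop would otherwise lock in a bad allocation, which this sketch does not capture.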


Clergy sex abuse has cost Catholic Church $3 billion in settlements

FOX News

Clergy sex abuse of children has rocked the Catholic Church not only in terms of trust and reputation, but also financially, to the tune of more than $3 billion, according to National Public Radio. The multibillion-dollar expense has gone to settlements in response to lawsuits filed by people abused by clergy, reports NPR. Nearly 20 Catholic dioceses and religious orders have filed for bankruptcy because of the scandals. An attorney whose firm represented abuse victims said that the money the church has paid because of the crisis is part of justice for those who suffered, though it hardly compensates for all the damage done.


Bias detectives: the researchers striving to make algorithms fair

#artificialintelligence

In 2015, a worried father asked Rhema Vaithianathan a question that still weighs on her mind. A small crowd had gathered in a basement room in Pittsburgh, Pennsylvania, to hear her explain how software might tackle child abuse. Each day, the area's hotline receives dozens of calls from people who suspect that a child is in danger; some of these are then flagged by call-centre staff for investigation. But the system does not catch all cases of abuse. Vaithianathan and her colleagues had just won a half-million-dollar contract to build an algorithm to help. Vaithianathan, a health economist who co-directs the Centre for Social Data Analytics at the Auckland University of Technology in New Zealand, told the crowd how the algorithm might work. For example, a tool trained on reams of data -- including family backgrounds and criminal records -- could generate risk scores when calls come in.


Key pretrial hearing in Cosby criminal case set for November

U.S. News

A key pretrial hearing to determine what evidence prosecutors can use in Bill Cosby's Pennsylvania sex assault case has been scheduled for early November. Prosecutors hope to call 13 other accusers to show the comedian had a pattern of drugging and molesting women. The criminal charges involve an encounter with Andrea Constand in 2004. Prosecutors also want to use Cosby's deposition from Constand's 2005 lawsuit, in which Cosby acknowledged under oath that he had sexual encounters with a series of women after giving them drugs or alcohol.


Pennsylvania's Shame

Slate

At first, he balked at Rago's explanations: that standard police identification procedures were suggestive and unreliable and entire fields of expert testimony were not based on real science. Greenleaf had routinely relied on these tactics to convict the defendants he prosecuted, and no judge or defense attorney had ever challenged him. Greenleaf recalled a long-ago case in which he had called an expert to testify that tool markings found on a victim's stolen property in a string of burglaries were a match to a tool found in the defendant's possession. The markings were a key piece of evidence in identifying the perpetrator, and Greenleaf had told the jury that they were "120 percent accurate." Since the early 2000s, arguments had been raised in courtrooms and in academia that this type of evidence should be categorically excluded.


Minority Report computers may soon mark out children as 'likely criminals'

Daily Mail - Science & tech

Is it possible to predict whether someone will commit a crime some time in the future? It sounds like an idea from the 2002 science-fiction movie Minority Report. But that's what statistical researcher Richard Berk, from the University of Pennsylvania, hopes to find out from work he's carried out this year in Norway.