The Race-Science Blogger Cited by The New York Times

The Atlantic - Technology

Lasker, the Times explained, was the "intermediary" who tipped off the publication about Mamdani's application, which was included in a larger hack of Columbia's computer systems. After the Times published its story, Lasker celebrated on X. "I break-uh dah news," he wrote to his more than 260,000 followers. On both X and Substack, where he also has a large following, Lasker is best known for compiling charts on the "Black-White IQ gap" and otherwise linking race to real-world outcomes. He seems convinced that any differences are the result of biology, and he has shot down other possible explanations. He has suggested that crime is genetic.



An Algorithm That 'Predicts' Criminality Based on a Face Sparks a Furor

WIRED

In early May, a press release from Harrisburg University claimed that two professors and a graduate student had developed a facial-recognition program that could predict whether someone would be a criminal. The release said the paper would be published in a collection by Springer Nature, a big academic publisher. With "80 percent accuracy and with no racial bias," the paper, A Deep Neural Network Model to Predict Criminality Using Image Processing, claimed its algorithm could predict "if someone is a criminal based solely on a picture of their face." The press release has since been deleted from the university website. Tuesday, more than 1,000 machine-learning researchers, sociologists, historians, and ethicists released a public letter condemning the paper, and Springer Nature confirmed on Twitter it will not publish the research.