Physicists are increasingly developing artificial intelligence and machine learning techniques to advance our understanding of the physical world, but there is rising concern about bias in such systems and their wider impact on society. In 2011, during her undergraduate degree at the Georgia Institute of Technology, Ghanaian-US computer scientist Joy Buolamwini discovered that getting a robot to play a simple game of peek-a-boo with her was impossible: the machine could not see her dark-skinned face. Later, in 2015, as a Master's student at the Massachusetts Institute of Technology's Media Lab working on a science–art project called Aspire Mirror, she ran into the same problem with facial-analysis software, which detected her face only when she wore a white mask. Her curiosity led her to run one of her profile images through four facial-recognition demos, which, she discovered, either could not identify a face at all or misgendered her – a bias she refers to as the "coded gaze". She then tested 1270 faces of politicians from three African and three European countries, spanning a range of features, skin tones and genders, in what became her Master's thesis project "Gender Shades: Intersectional accuracy disparities in commercial gender classification" (figure 1).
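The kind of audit described above boils down to measuring a classifier's accuracy separately for each intersectional subgroup (skin tone × gender) and reporting the gap between the best- and worst-served groups. The sketch below is purely illustrative – it is not Buolamwini's code, and the records are made-up stand-ins for real benchmark data – but it shows the shape of such a disaggregated evaluation:

```python
# Illustrative sketch of an intersectional accuracy audit, in the
# spirit of the Gender Shades study. The data below are hypothetical;
# a real audit would use labelled benchmark images such as the
# Pilot Parliaments Benchmark and a commercial classifier's outputs.
from collections import defaultdict

# Each record: (skin_tone, actual_gender, predicted_gender)
predictions = [
    ("darker",  "female", "male"),    # misgendered
    ("darker",  "female", "female"),
    ("darker",  "male",   "male"),
    ("lighter", "female", "female"),
    ("lighter", "male",   "male"),
    ("lighter", "male",   "male"),
]

def subgroup_accuracy(records):
    """Return classification accuracy for each (skin_tone, gender) subgroup."""
    totals = defaultdict(int)
    correct = defaultdict(int)
    for tone, actual, predicted in records:
        key = (tone, actual)
        totals[key] += 1
        if predicted == actual:
            correct[key] += 1
    return {key: correct[key] / totals[key] for key in totals}

acc = subgroup_accuracy(predictions)
for (tone, gender), a in sorted(acc.items()):
    print(f"{tone:7s} {gender:6s} accuracy = {a:.0%}")

# The headline disparity is the gap between the best- and
# worst-served subgroups.
disparity = max(acc.values()) - min(acc.values())
print(f"accuracy disparity = {disparity:.0%}")
```

With these toy records, darker-skinned female faces come out with the lowest accuracy, mirroring the pattern the real study reported across commercial systems.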
Surveillance cameras, like the one here in Boston, are used throughout Massachusetts. The state now regulates how police use facial recognition technology. Massachusetts lawmakers passed one of the first state-wide restrictions of facial recognition as part of a sweeping police reform law.
Dr Kate Darling is a research specialist in human–robot interaction, robot ethics, and intellectual property theory and policy at the Massachusetts Institute of Technology (MIT) Media Lab. In her new book, The New Breed, she argues that we would be better prepared for the future if we started thinking about robots and artificial intelligence (AI) like animals. Asked what is wrong with the way we think about robots, she says that we so often subconsciously compare robots to humans and AI to human intelligence, and that the comparison limits our imagination.
Let's say, just hypothetically, that a surveillance robot styled after a dog was giving you a hard time. In this situation, you'd want to shut the thing down, and quickly. Thankfully, when it comes to Boston Dynamics' Spot robot, there are several ways to do just that. The robots, marketed for industrial use and deployed for viral hijinks, evoke a robot dystopia in the public imagination, a perception compounded by a viral April video of the NYPD trotting out its own customized Spot. The first reported instance of police using Spot was in November 2019, when the Massachusetts State Police leased at least one of the robots for a three-month trial period.
Though police have been using facial recognition technology for the last two decades to try to identify unknown people in their investigations, the practice of putting the majority of Americans into a perpetual photo lineup has gotten surprisingly little attention from lawmakers and regulators. Lawmakers, civil liberties advocates and police chiefs have debated whether and how to use the technology because of concerns about both privacy and accuracy. But figuring out how to regulate it is tricky. So far, that has meant an all-or-nothing approach. City councils in Oakland, Portland, San Francisco, Minneapolis and elsewhere have banned police use of the technology, largely because of bias in how it works.
Civil rights activists have successfully pushed for bans on police use of facial recognition in cities like Oakland, San Francisco, and Somerville, Massachusetts. Now, a coalition led by Amnesty International is setting its sights on the nation's biggest city, New York, as part of a drive for a global moratorium on government use of the technology. Amnesty's #BantheScan campaign is backed by Legal Aid, the New York Civil Liberties Union, and AI For the People, among other groups. After New York, the group plans to target New Delhi and Ulaanbaatar, Mongolia. "New York is the biggest city in the country," says Michael Kleinman, director of Amnesty International's Silicon Valley Initiative.
Gov. Charlie Baker says he now "looks forward" to signing an amended police reform bill after the Senate passed changes scaling back limitations on law enforcement's use of facial recognition technology and leaving training oversight under the purview of police. But before it lands on the Republican governor's desk, the amended bill must now clear the House, where an earlier version passed without a veto-proof majority. State senators passed the 15-page amendment in a 31-9 vote following a Monday-night session, hours after releasing the redrafted language. In a statement to reporters, the Baker administration said the Senate proposal "reflects the amendments that the Governor made to the bill two weeks ago." "After discussing the governor's amendments with the Black and Latino Legislative Caucus, the Administration believes this package addresses the issues identified by the Governor's amendments and he looks forward to signing this version should it reach his desk," said Lizzy Guyton, Baker's communications director.
Gov. Baker asked the Massachusetts Legislature to amend an expansive police reform bill last week, citing the bill's ban on facial recognition technology. Despite his refusal to sign the bill in its current form, Gov. Baker agrees with legislators on many other measures, such as new training standards. The Bay State's efforts to restore public trust in police are commendable, but state officials left out a crucial component: rural police departments. While the debate over facial recognition technology rages in Boston, towns in Western Massachusetts are struggling to pay for new brakes for their police cruisers. Higher training standards are a step in the right direction, but small departments often don't have enough officers to cover shifts while others are in class. At the core of this disconnect, urban activists and journalists characterize police departments as structurally broken and officers as an occupying force, whereas small-town residents tend to see police officers as friends and neighbors.
Joy Buolamwini from the MIT Media Lab says facial-recognition software has the highest error rates for darker-skinned females. New applications powered by artificial intelligence (AI) are being embraced by the public and private sectors. Their early uses hint at what's to come. In June 2020, IBM, Amazon and Microsoft announced that they were stepping back from facial-recognition software development amid concerns that it reinforces racial and gender bias. Amazon and Microsoft said they would stop selling facial-recognition software to police until new laws are passed in the United States to address potential human-rights abuses.
Massachusetts could make history as the first state to ban the use of facial recognition by law enforcement. The state's House and Senate lawmakers have approved a police reform bill that would prohibit police departments and other public agencies from using facial recognition systems. As Forbes notes, there will be exceptions, such as when police secure a warrant to run a facial recognition search against driver's license records. Officers can also submit a written request to use the technology if they can show evidence that it's needed to prevent serious injury or death. In addition to the facial recognition ban, the police reform bill also prohibits cops from using chokeholds and rubber bullets.