Civil Rights & Constitutional Law


Facial recognition helps mom and dad see kids' camp photos, raises privacy concerns for some

USATODAY

Photos from summer camp are posted to the camp's website so parents can view them. Venture capital-backed Waldo Photos has been selling a service that identifies specific children in the flood of photos many sleep-away camps provide to parents daily. Camps working with the Austin, Texas-based company give parents a private code to sign up. When the camp uploads photos taken during activities to its website, Waldo's facial recognition software scans them for matches against parent-provided headshots. Once it finds a match, the Waldo system (as in "Where's Waldo?") automatically texts the photos to the child's parents.
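The article does not describe Waldo's implementation, but the workflow it outlines (match faces in newly uploaded photos against a parent-provided headshot, then notify the parent) can be sketched with the open-source face_recognition library. The file paths and the final notification step are illustrative assumptions, not details from the story.

```python
# Illustrative sketch only -- not Waldo Photos' actual code.
# Matches faces in uploaded camp photos against one parent-provided headshot.
from pathlib import Path

import face_recognition


def load_reference_encoding(headshot_path: str):
    """Encode the single face expected in the parent-provided headshot."""
    image = face_recognition.load_image_file(headshot_path)
    encodings = face_recognition.face_encodings(image)
    if not encodings:
        raise ValueError(f"No face found in headshot: {headshot_path}")
    return encodings[0]


def find_matching_photos(photo_dir: str, reference_encoding, tolerance: float = 0.6):
    """Yield paths of uploaded photos containing a face close to the reference."""
    for photo_path in sorted(Path(photo_dir).glob("*.jpg")):
        image = face_recognition.load_image_file(photo_path)
        for encoding in face_recognition.face_encodings(image):
            # compare_faces returns one boolean per known encoding.
            if face_recognition.compare_faces([reference_encoding], encoding,
                                              tolerance=tolerance)[0]:
                yield photo_path
                break  # one match per photo is enough


if __name__ == "__main__":
    # Hypothetical paths for illustration.
    ref = load_reference_encoding("headshots/camper_042.jpg")
    for match in find_matching_photos("uploads/2018-07-04", ref):
        print(f"Match found: {match}")  # a real service would text the parent here
```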


The cameras that know if you're happy - or a threat

BBC News

Facial recognition tech is becoming more sophisticated, with some firms claiming it can even read our emotions and detect suspicious behaviour. But what implications does this have for privacy and civil liberties? Facial recognition tech has been around for decades, but it has been progressing in leaps and bounds in recent years due to advances in computer vision and artificial intelligence (AI), tech experts say. It is now being used to identify people at borders, unlock smartphones, spot criminals and authenticate banking transactions. But some tech firms claim it can also assess our emotional state.


Turns out Orlando won't stop using Amazon's facial recognition software

Mashable

Looks like Orlando won't quit its controversial test of Amazon's facial recognition software after all. The city of Orlando and Orlando Police Department released a joint statement on Monday announcing the city would continue testing Rekognition, Amazon's deep learning facial recognition technology, which has the power to identify every face in a crowd. Last month, the American Civil Liberties Union sent a letter to Orlando lawmakers claiming the city started testing the program "without inviting a public debate, obtaining local legislative authorization, or adopting rules to prevent harm to Orlando community members," and demanded that it "immediately" stop using it. Orlando did stop using Rekognition, but the decision wasn't due to the outcry from privacy and anti-surveillance advocates. Instead, the trial contract simply expired, which left open the possibility of using Rekognition again.


Facial recognition technology: The need for public regulation and corporate responsibility - Microsoft on the Issues

#artificialintelligence

All tools can be used for good or ill. Even a broom can be used to sweep the floor or hit someone over the head. The more powerful the tool, the greater the benefit or damage it can cause. The last few months have brought this into stark relief when it comes to computer-assisted facial recognition – the ability of a computer to recognize people's faces from a photo or through a camera. This technology can catalog your photos, help reunite families or potentially be misused and abused by private companies and public authorities alike. Facial recognition technology raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression. These issues heighten responsibility for tech companies that create these products.


Orlando police decide to keep testing controversial Amazon facial recognition program

USATODAY

An image from the product page of Amazon's Rekognition service, which provides facial and object recognition and analysis for images and video. SAN FRANCISCO -- The Orlando Police Department in Florida is planning to continue its test of a facial recognition program from Amazon, despite outcry from civil rights and privacy groups that law enforcement and government agencies could abuse the technology. OPD announced last month that the proof-of-concept trial of the software had expired, but OPD public information officer Sgt. Eduardo Bernal said in a release Monday that the department will continue its testing of the program. Two years ago, Amazon built the facial and product recognition tool, called Rekognition, as a way for customers to quickly search a database of images and look for matches.
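As a rough idea of the "search a database of images for matches" capability described above, the public boto3 Rekognition API lets a customer index known faces into a collection and then search it with a new image. This is a minimal sketch of that general pattern; the collection name, S3 bucket, and object keys are assumptions for illustration, not anything from the article or Orlando's deployment.

```python
# Hedged sketch of a Rekognition face-search workflow using the boto3 API.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

COLLECTION_ID = "demo-face-collection"  # assumed name for illustration


def ensure_collection():
    """Create the face collection if it does not already exist."""
    if COLLECTION_ID not in rekognition.list_collections()["CollectionIds"]:
        rekognition.create_collection(CollectionId=COLLECTION_ID)


def index_face(bucket: str, key: str, person_id: str):
    """Add faces from an S3 image to the collection, tagged with person_id."""
    rekognition.index_faces(
        CollectionId=COLLECTION_ID,
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        ExternalImageId=person_id,
    )


def search_face(bucket: str, key: str, threshold: float = 90.0):
    """Return (person_id, similarity) pairs for faces resembling the query image."""
    response = rekognition.search_faces_by_image(
        CollectionId=COLLECTION_ID,
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        FaceMatchThreshold=threshold,
        MaxFaces=5,
    )
    return [(m["Face"]["ExternalImageId"], m["Similarity"])
            for m in response["FaceMatches"]]


if __name__ == "__main__":
    # Hypothetical bucket and keys.
    ensure_collection()
    index_face("example-bucket", "known/person_a.jpg", "person_a")
    print(search_face("example-bucket", "queries/crowd_shot.jpg"))
```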


Safeguarding human rights in the era of artificial intelligence

#artificialintelligence

The use of artificial intelligence in our everyday lives is increasing, and it now covers many fields of activity. Something as seemingly banal as avoiding a traffic jam through the use of a smart navigation system, or receiving targeted offers from a trusted retailer, is the result of big data analysis that AI systems may use. While these particular examples have obvious benefits, the ethical and legal implications of the data science behind them often go unnoticed by the public at large. Artificial intelligence, and in particular its subfields of machine learning and deep learning, is neutral only in appearance, if at all. Beneath the surface, it can become extremely personal.


NSA Spy Buildings, Facebook Data, and More Security News This Week

WIRED

It has been, to be quite honest, a fairly bad week, as far as weeks go. But despite the sustained downbeat news, a few good things managed to happen as well. California has passed the strongest digital privacy law in the United States, for starters, which as of 2020 will give consumers the right to know what data companies collect about them, and to bar those companies from selling it. It's just the latest in a string of uncommonly good bits of privacy news, which included last week's landmark Supreme Court decision in Carpenter v. US. That ruling will require law enforcement to get a warrant before accessing cell tower location data.


Orlando ends Amazon facial recognition program over privacy concerns

Daily Mail

Orlando, Florida, has stopped testing Amazon's facial recognition program after rights groups raised concerns that the service could be used in ways that could violate civil liberties. The city ended a pilot program last week after its contract with Amazon.com Inc to use its Rekognition service expired. 'Partnering with innovative companies to test new technology - while also ensuring we uphold privacy laws and in no way violate the rights of others - is critical to us as we work to further keep our community safe,' the city and the Orlando Police Department said in a joint statement Monday. Orlando was one of several U.S. jurisdictions that Amazon has pitched its service to since unveiling it in late 2016 as a way to detect offensive content and improve public safety.


Some Amazon investors side with ACLU on facial recognition

Washington Post

Some Amazon investors said Monday they are siding with privacy and civil rights advocates who are urging the tech giant not to sell a powerful face recognition tool to police. The American Civil Liberties Union is leading the effort against Amazon's Rekognition product, delivering a petition with 152,000 signatures to the company's Seattle headquarters Monday, telling the company to "cancel this order." They are asking Amazon to stop marketing Rekognition to government agencies, citing privacy concerns and the risk that the technology could be used to discriminate against minorities. Amazon has described Rekognition as an object detection tool. The company, through a spokesman, said it can be used for law enforcement tasks ranging from fighting human trafficking to finding lost children, and that, like computers generally, it can be a force for good in responsible hands.