Results


#KindrGrindr: Gay dating app launches anti-racism campaign

BBC News

If you're a black or Asian user of gay dating app Grindr, then it's possible you've encountered racism while using it. Some users of the app have said they've come across what they believe are discriminatory statements on other profiles - things like "no blacks and no Asians". Others say they've faced racist comments in conversation with users when they've rejected their advances. Now Grindr has taken a stand against discrimination on its platform and says no user is entitled to tear another down for "being who they are". It's launched the #KindrGrindr campaign to raise awareness of racism and discrimination and promote inclusivity among users.


Google in China: Internet giant 'plans censored search engine'

BBC News

Google is developing a version of its search engine that will conform to China's censorship laws, reports say. The company shut down its Chinese search engine in 2010, complaining that free speech was being limited. But online news site The Intercept says Google has been working on a project code-named Dragonfly that will block search terms such as "human rights" and "religion", a move sure to anger activists. One state-owned newspaper in China, Securities Daily, dismissed the report. Google did not confirm the project, saying only: "We provide a number of mobile apps in China, such as Google Translate and Files Go, help Chinese developers, and have made significant investments in Chinese companies like JD.com."


Google Might Be Ready to Play By China's Censorship Rules

WIRED

In 2010, Google made a moral calculus. The company had been censoring search results in China at the behest of the Communist government since launching there in 2006. But after a sophisticated phishing attack to gain access to the Gmail accounts of Chinese human rights activists, Google decided to stop censoring results, even though it cost the company access to the lucrative Chinese market. Across nearly a decade, Google's decision to weigh social good over financial profit became part of Silicon Valley folklore, a handy anecdote that cast the tech industry as a democratizing force in the world. But to tech giants with an insatiable appetite for growth, China's allure is just as legendary.


Google under fire over reported plans to launch a censored search engine in China

Daily Mail

Google is reportedly going to launch a censored version of its search engine in China. The tech giant has been secretly planning the product since last year as part of a project referred to inside the company as 'Dragonfly,' according to The Intercept, which was given internal documents by a whistleblower. It comes as Google has tried and failed to make inroads in the Chinese market over the past several years. While China is home to the world's largest number of internet users, a 2015 report by US think tank Freedom House found that the country had the most restrictive online use policies of the 65 nations it studied, ranking below Iran and Syria.


Facial recognition helps mom and dad see kids' camp photos, raises privacy concerns for some

USATODAY

Photos from a summer camp are posted to the camp's website so parents can view them. Venture capital-backed Waldo Photos has been selling a service to identify specific children in the flood of photos provided daily to parents by many sleep-away camps. Camps working with the Austin, Texas-based company give parents a private code to sign up. When the camp uploads photos taken during activities to its website, Waldo's facial recognition software scans them for matches against parent-provided headshots. Once it finds a match, the Waldo system (as in "Where's Waldo?") automatically texts the photos to the child's parents.
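Waldo's actual pipeline is proprietary, but the enroll-then-match flow described above can be sketched with the open-source face_recognition library. Everything here is an illustrative assumption: the file names, the tolerance value, and the idea that matching happens one photo at a time.

```python
# Minimal sketch of the headshot-matching flow described above, using the
# open-source face_recognition library. File names and the tolerance are
# hypothetical; Waldo's real system is proprietary.
import face_recognition

# A parent enrolls their child by providing a headshot.
headshot = face_recognition.load_image_file("child_headshot.jpg")
known_encoding = face_recognition.face_encodings(headshot)[0]  # assumes one face is found

def photos_with_child(camp_photo_paths, tolerance=0.6):
    """Return the camp photos containing a face that matches the enrolled headshot."""
    matches = []
    for path in camp_photo_paths:
        image = face_recognition.load_image_file(path)
        for encoding in face_recognition.face_encodings(image):
            if face_recognition.compare_faces([known_encoding], encoding, tolerance=tolerance)[0]:
                matches.append(path)
                break  # one matching face per photo is enough
    return matches

# The matched paths are what a service like this would then text to the parent.
print(photos_with_child(["camp_day1_001.jpg", "camp_day1_002.jpg"]))
```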


Fighting the "coded gaze"

#artificialintelligence

When I was a master's student at MIT, I worked on a number of different art projects that used facial analysis technology. One in particular, called the Aspire Mirror, would detect my face in a mirror and then display a reflection of something different, based on what inspired me or what I wanted to empathize with. As I was working on it, I realized that the software I was using had a hard time detecting my face. But after I made one adjustment, the software no longer struggled: I put on a white mask. This disheartening moment brought to mind Frantz Fanon's book Black Skin, White Masks, which interrogates the complexities of changing oneself, putting on a mask to fit the norms or expectations of a dominant culture.
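The essay does not name the detection software it used, so purely for illustration, the failure mode it describes (a standard detector simply returning no faces) looks like the sketch below with OpenCV's pretrained Haar cascade; the frame file name is a stand-in.

```python
# Generic face-detection sketch; the essay above does not name the software it
# used, so this OpenCV Haar-cascade detector and the frame file are stand-ins.
import cv2

cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

frame = cv2.imread("mirror_frame.jpg")  # hypothetical frame from the mirror's camera
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if len(faces) == 0:
    # The failure mode described above: the detector returns nothing, so the
    # mirror's overlay never triggers for the person standing in front of it.
    print("No face detected")
else:
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
```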


NSA Spy Buildings, Facebook Data, and More Security News This Week

WIRED

It has been, to be quite honest, a fairly bad week, as far as weeks go. But despite the sustained downbeat news, a few good things managed to happen as well. California has passed the strongest digital privacy law in the United States, for starters, which as of 2020 will give customers the right to know what data companies collect and use, and to stop those companies from selling it. It's just the latest in a string of uncommonly good bits of privacy news, which included last week's landmark Supreme Court decision in Carpenter v. US. That ruling will require law enforcement to get a warrant before accessing cell tower location data.


Finding the Vulnerable with Biometrics, Artificial Intelligence: Atlanta's Trust Stamp to aid in locating those lost to human trafficking

#artificialintelligence

Artificial intelligence may help put an end to a long-running criminal trade: human trafficking. The average age at which a minor enters the sex trade in the U.S. is 12 to 14 years old, and many of the victims are runaway girls who were sexually abused. Thankfully, Attorneys General in the U.S. and Mexico are planning to implement a new system that will help locate victims of human trafficking. Trust Stamp, an Atlanta-based startup, will be providing the 'meat and potatoes' of the life-saving technology. According to the company website, "[Trust Stamp] creates proprietary artificial intelligence solutions; researching and leveraging facial biometric science and wide-scale data mining to deliver insightful identity & trust predictions while identifying and defending against fraudulent identity attacks."


Orlando ends Amazon facial recognition program over privacy concerns

Daily Mail

Orlando, Florida, has stopped testing Amazon's facial recognition program after rights groups raised concerns that the service could be used in ways that violate civil liberties. The city ended the pilot program last week after its contract with Amazon.com Inc to use the Rekognition service expired. 'Partnering with innovative companies to test new technology - while also ensuring we uphold privacy laws and in no way violate the rights of others - is critical to us as we work to further keep our community safe,' the city and the Orlando Police Department said in a joint statement Monday. Orlando was one of several U.S. jurisdictions to which Amazon has pitched the service since unveiling it in late 2016 as a way to detect offensive content and enhance public safety.
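The article does not describe Rekognition's interface, but for a rough sense of scale, a single face-comparison call through Amazon's public boto3 SDK looks like the sketch below. The bucket and object names are hypothetical, and Orlando's pilot reportedly ran against live video feeds rather than individual images like this.

```python
# Illustrative sketch of one Amazon Rekognition face-comparison call via boto3.
# The bucket and object names are hypothetical; Orlando's pilot reportedly used
# live video feeds rather than single images like this.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.compare_faces(
    SourceImage={"S3Object": {"Bucket": "example-bucket", "Name": "person_of_interest.jpg"}},
    TargetImage={"S3Object": {"Bucket": "example-bucket", "Name": "camera_frame.jpg"}},
    SimilarityThreshold=90,
)

for match in response["FaceMatches"]:
    print(f"Possible match with similarity {match['Similarity']:.1f}%")
```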


Understanding Self-Narration of Personally Experienced Racism on Reddit

AAAI Conferences

We identify and classify users’ self-narration of racial discrimination and corresponding community support in social media. We developed natural language models first to distinguish self-narration of racial discrimination in Reddit threads, and then to identify which types of support are provided and valued in subsequent replies. Our classifiers can detect the self-narration of personally experienced racism in online textual accounts with 83% accuracy and can recognize four types of supportive actions in replies with up to 88% accuracy. Descriptively, our models identify types of racism experienced and the racist concepts (e.g., sexism, appearance or accent related) most experienced by people of different races. Finally, we show that commiseration is the most valued form of social support.
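The abstract reports accuracy figures but not the model details. As a loose illustration only, a baseline classifier for the first task (flagging posts that self-narrate experienced discrimination) could look like the scikit-learn sketch below; the model choice and the toy examples are assumptions, not the authors' method or data.

```python
# Illustrative baseline for the first classification task described above.
# The paper does not specify this architecture; TF-IDF + logistic regression
# and the toy data are stand-ins, not the authors' models or dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "A stranger yelled a slur at me on the bus and nobody said anything",
    "Any recommendations for a good sci-fi novel?",
    "My landlord keeps 'losing' my application but approves other tenants instantly",
    "Selling a used bike, barely ridden",
]
labels = [1, 0, 1, 0]  # 1 = self-narrated experience of discrimination, 0 = other

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
clf.fit(posts, labels)

print(clf.predict(["A coworker mocked my accent in front of the whole team today"]))
```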