When ProPublica revealed that Facebook allowed advertisers to target or exclude users by their "ethnic affinity," the social network at first doubled down on defending the practice. "We are committed to providing people with quality ad experiences, which includes helping people see messages that are both relevant to the cultural communities they are interested in and have content that reflects or represents their communities," Facebook said at the time. But the American Civil Liberties Union, the NAACP Legal Defense Fund and other advocacy groups had been questioning the legality of the feature long before it became public knowledge. Conversations about its use began last spring, advocates said. Facebook was sued over the feature last week.
Facebook's ad-targeting features let advertisers limit which users see their material, ideally reaching those most likely to be interested in their products. But currently included in the "demographics" section of its ad-targeting tool is the ability to select which users see material based on their "ethnic affinity," an option the social titan began offering two years ago to aid its multicultural advertising. Since the social network doesn't ask users to racially identify themselves, Facebook instead collects activity data and automatically assigns each user an "ethnic affinity" -- a category advertisers can then choose to exclude or specifically target. The label is essentially a preference for stories, events and organizations that the social network believes coincide with those held by a certain ethnic group.
The suit, filed in U.S. District Court for the Northern District of California last week, accuses the Menlo Park, Calif.-based company of violating federal anti-discrimination laws for housing and employment. The practice came to light late last month when the non-profit news organization ProPublica published an analysis showing that the social network lets advertisers target who sees their ads by "ethnic affinity." The wording on Facebook's ad-buy page under "Narrow audience" says "EXCLUDE people who match at least ONE of the following," and includes African Americans, Asian Americans and four categories of Hispanics. Ad purchasers can also add demographic interests or behaviors they want to exclude.
In October, Google Maps rolled out an experimental feature that estimated the number of calories burned when people walked to their destinations. The feature had a quirk, though: It used mini-cupcakes to illustrate the calorie counts. Users pointed out that calorie counts could act as a trigger for people with eating disorders. Others found it shaming, and questioned how Google was calculating the calories. There was no easy way to turn the feature off.
Facebook is scaling back its ad-targeting options in an attempt to prevent advertisers from discriminating against users, including by religion or ethnicity. Facebook announced on Tuesday that it will remove more than 5,000 ways of narrowing the audience for an ad. Now advertisers will not be able to hide their ads from Facebook users interested in topics such as "Passover," "Native American culture," "Buddhism," "Evangelicalism," and "Islamic culture." "While these options have been used in legitimate ways to reach people interested in a certain product or service, we think minimizing the risk of abuse is more important," Facebook explained in its statement. Facebook's announcement comes on the heels of a complaint filed against the company by the U.S. Department of Housing and Urban Development.