The Contestation of Tech Ethics: A Sociotechnical Approach to Ethics and Technology in Action
Recent controversies related to topics such as fake news, privacy, and algorithmic bias have prompted increased public scrutiny of digital technologies and soul-searching among many of the people associated with their development. In response, the tech industry, academia, civil society, and governments have rapidly increased their attention to "ethics" in the design and use of digital technologies ("tech ethics"). Yet almost as quickly as ethics discourse has proliferated across the world of digital technologies, the limitations of these approaches have also become apparent: tech ethics is vague and toothless, is subsumed into corporate logics and incentives, and has a myopic focus on individual engineers and technology design rather than on the structures and cultures of technology production. As a result of these limitations, many have grown skeptical of tech ethics and its proponents, charging them with "ethics-washing": promoting ethics research and discourse to defuse criticism and government regulation without committing to ethical behavior. By looking at how ethics has been taken up in both science and business in superficial and depoliticizing ways, I recast tech ethics as a terrain of contestation where the central fault line is not whether it is desirable to be ethical, but what "ethics" entails and who gets to define it. This framing highlights the significant limits of current approaches to tech ethics and the importance of studying the formulation and real-world effects of tech ethics. In order to identify and develop more rigorous strategies for reforming digital technologies and the social relations that they mediate, I describe a sociotechnical approach to tech ethics, one that reflexively applies many of tech ethics' own lessons regarding digital technologies to tech ethics itself.
- North America > United States > California (0.93)
- Europe (0.93)
- Asia (0.67)
- Media > News (1.00)
- Law > Statutes (1.00)
- Law > Civil Rights & Constitutional Law (1.00)
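Each snippet in this collection is followed by hierarchical topic and location tags with confidence scores, written as `- Parent > Child (score)`. A minimal sketch of parsing and thresholding lines in that format (the function name and the 0.9 cutoff are illustrative choices, not part of any real tagging tool):

```python
import re

def parse_tag(line: str):
    """Parse a line like '- Law > Statutes (1.00)' into (path, confidence)."""
    m = re.match(r"-\s*(.+?)\s*\((\d+(?:\.\d+)?)\)\s*$", line.strip())
    if m is None:
        raise ValueError(f"not a tag line: {line!r}")
    # Split the '>'-delimited hierarchy into a list of path components.
    path = [part.strip() for part in m.group(1).split(">")]
    return path, float(m.group(2))

tags = [
    "- North America > United States > California (0.93)",
    "- Law > Statutes (1.00)",
    "- Asia (0.67)",
]
# Keep only tags the classifier is fairly confident about (cutoff is arbitrary).
confident = [parse_tag(t) for t in tags if parse_tag(t)[1] >= 0.9]
print(confident)
```

This drops the weakly supported `Asia (0.67)` tag and keeps the two high-confidence ones with their full hierarchy paths.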
This AI Uses Echolocation to Identify What You're Doing
Guo and his colleagues have built a device, about the size of a thin laptop, that emits sound at frequencies 10 times higher than the shrillest note a piccolo can sustain. The pitches it produces are inaudible to the human ear. When Guo's team aims the device at a person and fires an ultrasonic pitch, the gadget listens for the echo using its hundreds of embedded microphones. Then, employing artificial intelligence techniques, the team tries to decipher what the person is doing from the reflected sound alone. The technology is still in its infancy, but they've achieved some promising initial results.
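The arithmetic behind the passage can be sketched. The reference values below are assumptions for illustration (a piccolo's highest sustained note taken as roughly C8 ≈ 4,186 Hz, human hearing topping out near 20 kHz), and the time-of-flight ranging formula is the generic physics of echolocation, not a description of Guo's actual pipeline:

```python
# Illustrative arithmetic behind ultrasonic sensing (not the device's real code).
SPEED_OF_SOUND = 343.0      # m/s in air at ~20 C
HEARING_LIMIT_HZ = 20_000   # rough upper bound of human hearing
PICCOLO_TOP_HZ = 4_186      # assumed reference: C8, near a piccolo's highest note

# "10 times higher than the shrillest note a piccolo can sustain"
emit_hz = 10 * PICCOLO_TOP_HZ
print(emit_hz, emit_hz > HEARING_LIMIT_HZ)  # well above the audible range

def echo_distance_m(round_trip_s: float) -> float:
    """Range to a reflector from a round-trip echo delay (sound goes out and back)."""
    return SPEED_OF_SOUND * round_trip_s / 2

print(echo_distance_m(0.010))  # an echo after 10 ms puts the reflector about 1.7 m away
```

The ~42 kHz result is consistent with the article's claim that the emitted pitches are inaudible; the activity-recognition step itself, deciphering what a person is doing from the echo, is the machine learning part the snippet describes but does not detail.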
- North America > United States > New York (0.05)
- Europe > Germany > Bavaria > Upper Bavaria > Munich (0.05)
- Asia > China > Hubei Province > Wuhan (0.05)
AI Research Is in Desperate Need of an Ethical Watchdog
About a week ago, Stanford University researchers posted online a study on the latest dystopian AI: They'd made a machine learning algorithm that essentially works as gaydar. After training it with tens of thousands of photographs from dating sites, the algorithm could perform better than a human judge in specific instances. For example, when given photographs of a gay white man and a straight white man taken from dating sites, the algorithm could guess which one was gay more accurately than actual people participating in the study. They wanted to protect gay people. "[Our] findings expose a threat to the privacy and safety of gay men and women," wrote Michal Kosinski and Yilun Wang in the paper.
- Law (0.98)
- Information Technology > Services (0.76)
AI Research Is in Desperate Need of an Ethical Watchdog
About a week ago, Stanford University researchers posted online a study on the latest dystopian AI: They'd made a machine learning algorithm that essentially works as gaydar. After training the algorithm with tens of thousands of photographs from a dating site, the algorithm could, for example, guess if a white man in a photograph was gay with 81 percent accuracy. They wanted to protect gay people. "[Our] findings expose a threat to the privacy and safety of gay men and women," wrote Michal Kosinski and Yilun Wang in the paper. They built the bomb so they could alert the public about its dangers.
- North America > United States > New York > Suffolk County > Stony Brook (0.06)
- North America > United States > California (0.05)
- Health & Medicine > Therapeutic Area > Immunology (0.50)
- Law > Statutes (0.49)
- Health & Medicine > Therapeutic Area > Infections and Infectious Diseases (0.32)
AI Research Is in Desperate Need of an Ethical Watchdog
Stanford's review board approved Kosinski and Wang's study. "The vast, vast, vast majority of what we call 'big data' research does not fall under the purview of federal regulations," says Metcalf. Take a recent example: Last month, researchers affiliated with Stony Brook University and several major internet companies released a free app, a machine learning algorithm that guesses ethnicity and nationality from a name to about 80 percent accuracy. The group also went through an ethics review at the company that provided the training list of names, although Metcalf says that an evaluation at a private company is the "weakest level of review that they could do."
- Information Technology > Networks (0.35)
- Law > Civil Rights & Constitutional Law (0.30)
- Health & Medicine > Therapeutic Area > Internal Medicine (0.30)
- Information Technology > Artificial Intelligence > Machine Learning (0.93)
- Information Technology > Data Science > Data Mining > Big Data (0.36)