'Pretty revolutionary': a Brooklyn exhibit interrogates white-dominated AI to make it more inclusive
At the Plaza at 300 Ashland Place in downtown Brooklyn, patrons mill around a large yellow shipping container with black triangles painted on its side. A nod to the flying geese quilt pattern, which may have served as a coded message for enslaved people escaping to freedom along the Underground Railroad, the design and container serve as a bridge between the past and the future of the African diaspora. At the center of the art project by the Brooklyn-based transmedia artist Stephanie Dinkins, a large screen displays artificial intelligence (AI)-generated images that showcase the diversity of the city. Commissioned by the New York-based art non-profit More Art and designed in collaboration with the architects LOT-EK, the AI laboratory If We Don't, Who Will? will be on display until 28 September. It seeks to challenge a white-dominated generative-AI space by highlighting Black ethos and cultural cornerstones.
The Case for Giving Robots an Identity
The first time Stephanie Dinkins met Bina48, in 2014, she worried the thing was dead. "She was turned off," Dinkins says. Dinkins caught the robot's stare and knew she'd found her muse. Bina48 had been conceived several years earlier by Martine Rothblatt, the polymathic entrepreneur. Rothblatt fashioned the AI-powered bot in the likeness of her wife, Bina, training its speech patterns on a database of Bina-isms.
The Artist Working to Make Artificial Intelligence Less White
It's no secret by now that artificial intelligence has a white guy problem. One could say the same of almost any industry, but the tech world is singular in rapidly shaping the future. As has been widely publicized, the unconscious biases of white developers proliferate on the internet, mapping our social structures and behaviors onto code and repeating the imbalances and injustices that exist in the real world. There was the case of Black people being classified as gorillas; the computer system that rejected an Asian man's passport photo because it read his eyes as being closed; and the controversy surrounding the predictive policing algorithms that have been deployed in cities like Chicago and New Orleans, enabling police officers to pinpoint individuals the algorithms deem predisposed to crime, giving rise to accusations of profiling. Earlier this year, the release of Google's Arts and Culture App, which allows users to match their faces with a historical painting, produced less than nuanced results for Asians, as well as African Americans.