So, what if I maliciously and knowingly wrote something false here about the poke bowl place a block away from me? I'd clearly be liable, but my poke friends would probably want to go after deeper pockets. Slate would be on the hook, too (sorry, guys), as the creator of the content, which would also be published on its website. I called up Jeff Kosseff to walk me through the chain of defamation liability. Jeff, a former journalist turned cybersecurity professor at the U.S. Naval Academy, is like the character at the beginning of every apocalyptic movie who single-mindedly obsesses over something, to the amusement or concern of those around them, until … For Jeff, that something is Section 230 of the 1996 Communications Decency Act--or, as he calls it in his brilliant 2019 biography of the legislation, The Twenty-Six Words That Created the Internet: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." It's the reason Facebook, Reddit, Wikipedia, and other platforms aren't liable for content published by their users.
A drive to improve the lives of people in every corner of America and beyond, by helping them save. That's what drives us here at Future Tense, as our close followers have come to appreciate. If you're in Bentonville, Arkansas, and somehow happened to find this in your inbox this morning, welcome. If your bid for TikTok doesn't succeed, we'd be happy to entertain offers. Future Tense could provide you with that same cool vibe and a content platform oozing e-commerce possibilities.
In late January, when the coronavirus was just starting to hint that it might disrupt life as we knew it, I decided to reread Station Eleven by Emily St. John Mandel--a novel that alternates between the time when a deadly virus sweeps across the world and the remnants of society 20 years later. Reading it--especially on a plane--in that moment felt a bit like daring the virus to get worse, but in a safe way, like telling a ghost story at a campfire in a dark, spooky wood. As the pandemic has progressed, I have continued to notice beats that echo moments small and large in pandemic literature, long one of my favorite genres. The rise of goofy masks is foretold in Ling Ma's Severance. Elements of the far-right response bring to mind the militias of Chuck Wendig's alarming Wanderers.
While I was making dinner, I yelled at Alexa. The recipe was a little complicated, and I kept having to repeat myself to get the damn Amazon Echo to turn off the timer. And when I used my computer communication voice to ask it to play NPR One so I could catch up on the news--it had been a whole eight or nine minutes since I had checked in with the world--it tried three times to instead play "The Austin 100: A SXSW Mix From NPR Music." I feel a little bad about it, remembering Rachel Withers' (very persuasive!) 2018 piece for Future Tense about why she won't date men who are rude to Alexa: It matters how you interact with your virtual assistant, not because it has feelings or will one day murder you in your sleep for disrespecting it, but because of how it reflects on you. Alexa is not human, but we engage with her like one.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society. Consider Amazon's object and facial recognition software, which the company claims offers real-time detection across tens of millions of mugs, including "up to 100 faces in challenging crowded photos." After its launch in late 2016, Amazon Web Services began marketing the visual surveillance tool (which it dubbed "Rekognition") to law enforcement agencies around the country--including partnering directly with the police department in Orlando and a sheriff's department in Oregon. But now, as April Glaser reports, civil rights groups are pushing back. Last week, a coalition including the ACLU, Human Rights Watch, and the Council on American-Islamic Relations sent an open letter expressing "profound concerns" that governments could easily abuse the technology to target communities of color, undocumented immigrants, and political protesters.