Coded Bias, directed by Shalini Kantayya, is a documentary about the way Artificial Intelligence trails human data through algorithms embedded in sophisticated Machine Learning models. Although many of the algorithms in use today were created in the 1980s, we have since digitalised our lives, producing data on a scale never before so accessible in the history of humankind. Add to that the increase in computer processing power and the wireless exchange of information over 5G, and AI is probably the most powerful technology ever designed. It already has the capacity to deliver individualised strategies that nudge behaviours desired by a third party. Such targeting is visible only to the targeted person, leaves no traces, and is almost unregulated, with few exceptions such as the GDPR (General Data Protection Regulation).
This week viewers can pick up some catalog titles in 4K, like Saw, as well as 20th Anniversary Edition versions of Shrek and the first Fast and the Furious movie. But the major launch this week is BioWare's remastered version of the Mass Effect trilogy, now available across console generations and on PC, with improved graphics, gameplay and almost all of the content ever released for the games. Otherwise, Netflix has the final season of Castlevania, as well as a new round of episodes in the Love, Death & Robots anthology. HBO's feature film release of the week is Those Who Wish Me Dead, and if you're looking for something a little different then try Intergalactic, a sci-fi prison break series from the UK that's streaming on Peacock. Look below to check out each day's highlights, including trailers, and let us know what you think (or what we missed).
Let's Be Real, Fox, 9:30 PM
Everything's Gonna Be Okay, Freeform, 10 PM
All times listed are ET.
As a reader of this column, you probably know I am releasing a new book, The Augmented Workforce: How AR, AI, and 5G Will Impact Every Dollar You Make, on May 25 with my co-author, John Buzzell. For the next four weeks, I'll be sharing excerpts from the book as columns to help brands, businesses and professionals understand what the era of the augmented workforce holds, how to prepare and how to leverage the changes to stay relevant in the future. Dearly disrupted, we are gathered here to talk about the future spaces we will inhabit. We are living through a period of rapid change, possibly beyond society's capacity to keep up. Various emerging technologies, such as artificial intelligence (AI), augmented reality (AR), virtual reality (VR), and 5G, along with dozens of devices that work together (the Internet of Things), have helped to create an environment in which new inventions, possibilities, and learning curves change weekly. According to Peter Diamandis and Steven Kotler, authors of The Future Is Faster Than You Think, "Moore's Law is the reason the smartphone in your pocket is a thousand times smaller, a thousand times cheaper, and a million times more powerful than a supercomputer from the 1970s. In 2023 the average thousand-dollar laptop will have the same computing power as a human brain (roughly 10^16 cycles per second). Twenty-five years after that, that same average laptop will have the power of all the human brains currently on Earth."
Progress in technology and increased levels of private investment in startup AI companies is accelerating, according to the 2021 AI Index, an annual study of AI impact and progress developed by an interdisciplinary team at the Stanford Institute for Human-Centered Artificial Intelligence. Indeed, AI is showing up just about everywhere. In recent weeks, there have been stories of how AI is used to monitor the emotional state of cows and pigs, dodge space junk in orbit, teach American Sign Language, speed up assembly lines, win elite crossword puzzle tournaments, assist fry cooks with hamburgers, and enable "hyperautomation." Soon there will be little left for humans to do beyond writing long-form journalism -- until that, too, is replaced by AI. The text generation engine GPT-3 from OpenAI is potentially revolutionary in this regard, leading a New Yorker essay to claim: "Whatever field you are in, if it uses language, it is about to be transformed." AI is marching forward, and its wonders are increasingly evident and applied.
If you've ever seen Finding Dory, the sweet and funny sequel to box-office hit Finding Nemo, you'll know what it means to 'speak whale.' Without the help of a befuddled beluga whale called Bailey, the film's forgetful fishy heroine would never escape the fictional Marine Life Institute and "just keep swimming" to find her long-lost parents. Thanks to robotics and artificial intelligence (specifically natural language processing powered by machine learning), it might also be possible to understand whale lingo outside Disney-Pixar's imaginary kaleidoscopic multiverse. In probably the most significant interspecies communication project of all time, researchers recently set about the titanic task of translating the Morse code-like series of clicks (or codas) used by sperm whales to chat with one another. And the results of Project CETI will no doubt be far more interesting than human discussions around the price of fish -- or any other clever piscine reference you can think of.
Once upon a time, in a world where streaming was still a novelty, TV remotes were filled with what seemed like thousands of buttons. They were complicated to navigate and intimidating to use. On top of all that input overload, you also had to memorize an ever-increasing glut of channel numbers and then punch them into a number pad like you were calling someone on the phone. It was cumbersome, but we accepted it. Over the past several years, however, streaming's dominance has shifted the remote's form and function from an overwhelmingly long channel changer to a simple and compact menu navigator. Roku, a pioneer that helped popularize dedicated streaming hardware more than a decade ago, is back to further cement this sea change with an upgraded remote that brings even more quality-of-life improvements.
Welcome to Thanks, I Love It, our series highlighting something onscreen we're obsessed with this week. The Outside Story centers on a man who's been stuck at home for some ungodly period of time, who is now being made to venture out and try to interact with other humans again. It sounds like a pandemic story -- except that, at least, it wasn't meant to be. "I don't even feel like I've fully processed how much it echoes our current reality," writer-director Casimir Nozkowski says on the phone with Mashable. "The fact that the film connects to the pandemic, connects to something that's so epic, so devastating, so ridiculous -- I don't even really know what to do with that information."
After creating the AI Platform Notebooks instance, you can start with your experiments. Let's look into the model specifics for the use case. To analyze the sentiment of movie reviews in the IMDB dataset, we will fine-tune a pre-trained BERT model from Hugging Face. Fine-tuning takes a model that has already been trained on one task and tweaks it for another, similar task. Specifically, all of the layers in the pre-trained model, including their weights and parameters, are carried over, while the output layer is replaced with a new task-specific one.
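The carry-over-everything-but-the-head idea can be sketched in a few lines of PyTorch. This is a minimal illustration, not the actual pipeline: `TinyEncoder` is a hypothetical toy module standing in for BERT, and the 2-label head stands in for the positive/negative sentiment classifier; in practice you would load `bert-base-uncased` through Hugging Face's `transformers` library (e.g. `AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)`), which performs the same head replacement for you.

```python
import torch
import torch.nn as nn

# Toy stand-in for a pre-trained model: an encoder plus an output layer.
class TinyEncoder(nn.Module):
    def __init__(self, hidden=16, num_labels=5):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(8, hidden), nn.ReLU())
        self.head = nn.Linear(hidden, num_labels)  # original output layer

    def forward(self, x):
        return self.head(self.encoder(x))

pretrained = TinyEncoder()

# Fine-tuning setup: replicate every layer and its weights from the
# pre-trained model, then swap only the output layer for a fresh
# task-specific head (2 labels: positive / negative sentiment).
finetune = TinyEncoder()
finetune.load_state_dict(pretrained.state_dict())  # copy all weights
finetune.head = nn.Linear(16, 2)                   # new output layer

x = torch.randn(4, 8)                 # a batch of 4 dummy inputs
print(finetune(x).shape)              # torch.Size([4, 2])
```

The encoder weights in `finetune` start out identical to `pretrained`'s, while the new head is randomly initialized; training then updates the whole network (or just the head, if you freeze the encoder) on the sentiment labels.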
A photograph of the sky by Trevor Paglen can look like a massive abstraction, except for a tiny speck, a surveillance drone, spotted like a malignant dot on a chest x-ray. His images of secluded military sites in Nevada can also ooze with colour from the churning heat and dust. In the new documentary film Unseen Skies, directed by Yaara Bou Melhem, Paglen calls the effect "impressionistic haze". Photographing those places, often from miles away (or farther), is about "seeing and not seeing at the same time," Paglen says. "For me those images were about capturing that paradox."
After the runaway success of his first book The Martian, a science-driven thriller about a stranded astronaut which spawned a blockbuster movie starring Matt Damon, Andy Weir tried to do what many science fiction authors before him have attempted: an epic science fiction saga. It was going to be called Zhek. This story originally appeared on WIRED UK. "I thought this was going to be my magnum opus," he says. "My epic science fiction saga that everyone is going to know me for. I got about 70,000 words in and I had to abandon it, because it was just not coming together -- the characters weren't interesting, the plot was crawling along. It was going to be this massive tome that nobody wanted to read."