Facial recognition technology is known to have flaws. In 2019, a national study of over 100 facial recognition algorithms found that they did not work as well on Black and Asian faces. Two other Black men -- Robert Williams and Michael Oliver, both of whom live in the Detroit, Mich., area -- were also arrested for crimes they did not commit based on bad facial recognition matches. Like Mr. Parks, Mr. Oliver filed a lawsuit against the city over the wrongful arrest. Nathan Freed Wessler, an attorney with the American Civil Liberties Union who believes that police should stop using face recognition technology, said the three cases demonstrate "how this technology disproportionately harms the Black community."
By Mike Williams
When you take a medication, you want to know precisely what it does. Pharmaceutical companies go through extensive testing to ensure that you do. With a new deep learning-based technique created at Rice University's Brown School of Engineering, they may soon get a better handle on how drugs in development will perform in the human body. Lydia Kavraki, a professor of computer science, has introduced Metabolite Translator, a computational tool that predicts metabolites, the products of interactions between small molecules like drugs and enzymes. The Rice researchers take advantage of deep-learning methods and the availability of massive reaction datasets to give developers a broad picture of what a drug will do.
Anyone who's ever been on an earnings call knows company executives already tend to look at the world through rose-colored glasses, but a new study by economics and machine learning researchers says that's getting worse, thanks to machine learning. The analysis found that companies are adapting their language in forecasts, SEC regulatory filings, and earnings calls due to the proliferation of AI used to analyze and derive signals from the words they use. In other words: Businesses are beginning to change the way they talk because they know machines are listening. Forms of natural language processing are used to parse and process text in the financial documents companies are required to submit to the SEC. Machine learning tools are then able to do things like summarize text or determine whether language used is positive, neutral, or negative.
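The kind of tone analysis described above can be sketched with a toy keyword-based polarity scorer. This is an illustration only: the word lists and counting rule are assumptions for demonstration, not the proprietary NLP pipelines the study examines.

```python
# Toy dictionary-based sentiment scorer for financial language.
# The word lists below are illustrative assumptions, not a real lexicon.
POSITIVE = {"growth", "strong", "record", "improved", "exceeded"}
NEGATIVE = {"decline", "weak", "loss", "impairment", "restructuring"}

def polarity(text: str) -> str:
    """Classify text as positive, negative, or neutral by keyword counts."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(polarity("Record growth exceeded expectations"))    # positive
print(polarity("Impairment charges and a weak quarter"))  # negative
```

If executives know a scorer like this reads their filings, the study's point follows directly: they can swap flagged negative words for unlisted synonyms and shift the measured tone without changing the facts.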
So you just got a smart speaker as a holiday present. Now what do you do with it? You've come to the right place. By saying "Hey Siri" to the HomePod, "Hey Google" to the Nest Audio or "Alexa" to Echo speakers, you can instruct them to play music of your choice, either through a subscription service or, more generically, as part of a themed radio station via Pandora. Amazon's speakers play music from Amazon Music, Spotify, Apple Music, Pandora and iHeartRadio, while Apple's plays only from Apple Music and Pandora.
For the first time in his life, Pete Peeks was able to use both hands to hang Christmas lights outside his house this year -- thanks to the help of a high school robotics team. Peeks, 38, was born without the full use of his right hand, and though many may take gripping a nail, hammering it in and stringing holiday lights for granted, Peeks said it was beyond his wildest dreams. Early this month, he became one of the latest clients of the Sequoyah High School Robotics Team in Canton, Georgia. The team designs and 3D-prints custom prostheses to send for free to people around the world who need them. And as Americans gather for the winter holidays, the students will be at home continuing their work.
In an internal memo that he later posted online explaining Gebru's departure, Dean told employees that the paper "didn't meet our bar for publication" and "ignored too much relevant research" on recent improvements to the technology. Gebru's superiors had insisted that she and the other Google co-authors either retract the paper or remove their names. Employees in Google Research, the department that houses the ethical AI team, say authors who make claims about the benefits of large language models have not received the same scrutiny during the approval process as those who highlight the shortcomings.
David Silver is responsible for several eye-catching demonstrations of artificial intelligence in recent years, working on advances that helped revive interest in the field after the last great AI Winter. At DeepMind, a subsidiary of Alphabet, Silver has led the development of techniques that let computers learn for themselves how to solve problems that once seemed intractable. Most famously, this includes AlphaGo, a program revealed in 2017 that taught itself to play the ancient board game Go to a grandmaster level. Go is too subtle and instinctive to be tamed using conventional programming, but AlphaGo learned to play through practice and positive reward--an AI technique known as "reinforcement learning." In 2018, Silver and colleagues developed a more general version of the program, called AlphaZero, capable of learning to play expert chess and shogi as well as Go.
The hand is made of aviation-grade aluminum and plastic. To set it up, amputees are instructed to "think" about moving individual fingers and making hand gestures while the prosthetic is attached. Meanwhile, the device measures and remembers what each signal looks like. It's ready to operate within 15 minutes, the company says. After training, the robotic hand will respond to each of the muscle triggers it picked up during the exercise.
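The calibrate-then-match idea above can be sketched as a nearest-centroid classifier: record a few signal readings per gesture, average them into a remembered pattern, and match new readings to the closest pattern. The feature vectors and matching rule here are illustrative assumptions, not the company's actual algorithm.

```python
import math

def train(samples):
    """samples: {gesture: [feature vectors]} -> one centroid per gesture."""
    return {
        gesture: [sum(col) / len(vecs) for col in zip(*vecs)]
        for gesture, vecs in samples.items()
    }

def classify(centroids, reading):
    """Return the gesture whose remembered pattern is nearest."""
    return min(centroids, key=lambda g: math.dist(centroids[g], reading))

# Calibration: the wearer "thinks" each gesture while signals are logged.
# The two-number vectors stand in for real muscle-signal features.
model = train({
    "fist":  [[0.9, 0.1], [0.8, 0.2]],
    "point": [[0.1, 0.9], [0.2, 0.8]],
})
print(classify(model, [0.85, 0.15]))  # fist
```

A setup like this needs only a handful of samples per gesture, which is consistent with the article's claim of a roughly 15-minute calibration.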