In 2016, DeepMind's AlphaGo famously defeated Lee Sedol, an international Go champion, becoming the first computer program to beat a human world champion. In 2018, LawGeex, an AI contract review platform, pulled the same stunt on human lawyers. The AI system achieved a 94 percent accuracy rate at surfacing risks in non-disclosure agreements (NDAs). Experienced human lawyers averaged 85 percent accuracy on the same task. The study, conducted in collaboration with Duke and Stanford Law Schools, pitted AI against 20 top U.S.-trained lawyers with decades of experience specifically in reviewing NDAs, one of the most common agreements in law.
Epic Games, Cubic Motion, 3Lateral, Tencent, and Vicon took a big step toward creating believable digital humans today with the debut of Siren, a demo of a woman rendered in real time using Epic's Unreal Engine 4 technology. The move is a step toward transforming both films and games using digital humans who look and act like the real thing. The tech, shown off at Epic's event at the Game Developers Conference in San Francisco, is available for licensing by game or film makers. Cubic Motion's computer vision technology let producers create digital facial animation quickly and conveniently, saving the time and cost of animating faces by hand. "Everything you saw was running in the Unreal Engine at 60 frames per second," said Epic Games chief technology officer Kim Libreri, during a press briefing on Wednesday morning at GDC. "Creating believable digital characters that you can interact with and direct in real-time is one of the most exciting things that has happened in the computer graphics industry in recent years."
The full instructions are here, and a sample game is here. AIs are now better than humans at Backgammon, Checkers, Chess, Othello, and Go. See Andrey Kurenkov's "A 'Brief' History of Game AI Up to AlphaGo" for a more in-depth timeline. In 2017, Michael Tucker, Nikhil Prabala, and I set out to create PAI, the world's first AI for Pathwayz. The AIs for Othello and Backgammon were especially relevant to our development of PAI. Othello, like Pathwayz, is a relatively young game -- at least compared to the ancient Backgammon, Checkers, Chess, and Go.
Researchers at MIT have developed what they call a soft underwater robot that looks and moves like a fish, offering scientists a vehicle for observing marine life that's less conspicuous to other sea creatures than humans in diving gear. The team at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) reckon SoFi could be the answer to the challenge of documenting marine life up close. Unlike other underwater drones, SoFi doesn't need to be tethered to boats and doesn't have any noisy propellers, relying instead on the same tail movements fish use to accelerate and pivot in water. To achieve this fishtail motion, SoFi's lithium polymer battery powers a motor that pumps water into two diaphragms located inside its flexible silicone tail. The expansion and contraction of each diaphragm causes the tail to bend left and right, just like a fish's, and propels the robot at about "half a body length per second", according to MIT.
Every product here is independently selected by Mashable journalists. If you buy something featured, we may earn an affiliate commission which helps support our work. Easter is a great time to snag a bargain from Amazon as it holds its annual Amazon Easter sale. However, there is only one day left, so if you haven't had a look already, it might be worth taking some time out and heading over to Amazon to see what you can pick up on the cheap. SEE ALSO: 8 Reasons Easter and Children Don't Mix
It took Ernõ Rubik more than a month to solve his namesake puzzle the first time. Today, competitive cubers can best the classic brain teaser in less than five seconds, and casual players can do it in minutes. Their not-so-secret weapon is math. Devising or memorizing sequences of moves that accomplish a particular goal--for instance, swapping two corners--is key to cracking your Rubik's Cube. When game designers start stacking more layers onto a standard 3-by-3-by-3-square cuboid, it doesn't change those algorithms much; it just makes the solve mega-tedious.
Artificial intelligence and robotics are disrupting every aspect of work and redefining productivity. The old ways of not just working, but also assessing capabilities, hiring and compensation, are undergoing a massive change. In a conversation with Knowledge@Wharton, Srikanth Karra, chief human resource officer at Indian IT services firm Mphasis, discusses what this means for individuals, organizations and countries. Karra said managerial jobs and tasks that are repetitive in nature will be displaced and the ability to learn new skills will be critical for individuals who want to stay relevant. Companies will need to devise new ways of training and assessing the skills of employees while countries must develop a learning ecosystem. "Work will be more contractual in nature and deep technical skills, creativity and learnability will be at a premium," he noted.
Video game adaptations have a long history of being, well … mostly completely terrible, thanks largely to the vapid efforts of one Uwe Boll. And even the most ardent Angelina Jolie fan would presumably admit that the Tomb Raider movies were hardly the Oscar-winner's finest hour. So why would Alicia Vikander, Hollywood It girl and current art house dahling, sign up to star as Lara Croft in a reboot of the action-adventure series? Were there no Marvel superhero parts available? Critics have reacted with predictable sniffiness to Norwegian director Roar Uthaug's debut Hollywood outing, with the movie rated just 50% on the review aggregator Rotten Tomatoes.
Artificial intelligence is suddenly in people's homes, driving their cars, and running their security systems. Users interact with chatbots, sometimes unaware they're not talking to live people. Designers and marketing agencies trust computer-generated insights and machine learning over human input in making business decisions. Artificial intelligence development seems to have happened overnight, but it is the product of a series of developments stretching back hundreds of years. It's hard to imagine that, 381 years ago, anyone could have conceived of artificial intelligence.
Artificial Intelligence (AI) is once again a promising technology. The last time this happened was in the 1980s, and before that, the late 1950s through the early 1960s. In between, commentators often described AI as having fallen into "Winter," a period of decline, pessimism, and low funding. Understanding the field's more than six decades of history is difficult because most of our narratives about it have been written by AI insiders and developers themselves, most often from a narrowly American perspective. In addition, the trials and errors of the early years are scarcely discussed in light of the current hype around AI, heightening the risk that past mistakes will be repeated. How can we make better sense of AI's history and what might it tell us about the present moment?