You might know motion capture as the tech that transformed Andy Serkis into Gollum, but now it can transform everyday people into animated avatars in virtual worlds, all in real time. Motion capture, or mocap--which uses body markers, ultra-precise cameras, and modeling software to create 3D animations from real-life human movement--is now taking on location-based virtual reality, or LBVR. PCWorld visited Vicon, a leading motion capture company in Oxford, England, to learn how mocap has evolved to take on this new frontier in entertainment. If you've watched behind-the-scenes footage of how mocap works, you've probably seen actors in skintight Lycra suits covered with golf-ball-sized reflective markers. Normally, dozens of infrared cameras track these markers to model an actor's movements.
For years, movie and video game studios have used mocap to bring digital characters to life. The cinematics were crafted with motion capture technology developed by Weta Digital, a visual effects company in New Zealand co-owned by Peter Jackson. In a makeshift changing room filled with Disney Infinity figures, I strip down to my boxers and pull on a two-part Lycra suit. For facial capture, a circular plastic arm wraps around the front of the performer's face, similar to orthodontic headgear, with an LED light strip and cameras fitted on the inside.
Epic Games has been obsessed with real-time motion capture for years, but the company is now trying to take its experiments with the technology one step further. Enter "Siren," a digital personality that it created alongside a few prominent firms in the gaming industry: Vicon, Cubic Motion, 3Lateral and Tencent (which just became a major investor in Ubisoft). The crazy thing about Siren is that she comes to life using live mocap tech, powered by software from Vicon, that captures her body and finger movements and live-streams them into an Unreal Engine project. Back in 2016, Epic Games teased the live motion-capture technology first used for Hellblade, which was stunning and showed the potential of the tech. With this new iteration, though, the company says it hopes to take "live-captured digital humans to the next level."
Epic Games, Cubic Motion, 3Lateral, Tencent, and Vicon took a big step toward creating believable digital humans today with the debut of Siren, a demo of a woman rendered in real time using Epic's Unreal Engine 4 technology. The move is a step toward transforming both films and games using digital humans who look and act like the real thing. The tech, shown off at Epic's event at the Game Developers Conference in San Francisco, is available for licensing by game or film makers. Cubic Motion's computer vision technology lets producers create digital facial animation instantly, saving the time and cost of animating it by hand. "Everything you saw was running in the Unreal Engine at 60 frames per second," said Epic Games chief technology officer Kim Libreri during a press briefing on Wednesday morning at GDC. "Creating believable digital characters that you can interact with and direct in real-time is one of the most exciting things that has happened in the computer graphics industry in recent years."
It's Wednesday morning, and your password philosophy is wrong. New guidelines will try to fix the password mess. Bill Burr, a manager at the National Institute of Standards and Technology (NIST), wrote a password primer in 2003 that recommended many of the rules we have now: special characters, capitals and numbers. Sadly, the primer wasn't updated as regularly as he intended, and it created the mess of hard-to-remember logins we're dealing with now. Now, NIST has a set of draft guidelines that are intended to be more secure, mostly because they will be easier for people and businesses to implement and use every day.