Media


Andy Serkis, Motion Capture Master

Slate

Listen to Slate's The Gist via Apple Podcasts, Overcast, Spotify, Stitcher, or Google Play.


How next-gen motion capture will supercharge VR arcades

PCWorld

You might know motion capture as the tech that transformed Andy Serkis into Gollum, but now it can transform everyday people into animated avatars in virtual worlds, all in real time. Motion capture--which uses body sensors, ultra-precise cameras, and modeling software to create 3D animations from real-life human movement--is now taking on location-based virtual reality, or LBVR. PCWorld visited a leading motion capture company called Vicon in Oxford, England, to learn how mocap has evolved to take on this new frontier in entertainment. If you've watched behind-the-scenes footage of how motion capture (or mocap) works, you've probably seen actors in skintight Lycra suits covered with golf ball-sized sensors. Normally, dozens of infrared cameras track these sensors to model an actor's movements.
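The camera-based rigs described above recover each marker's 3D position by triangulating its 2D detections across multiple calibrated views. As a rough illustration of that core step (not Vicon's actual pipeline), here is a minimal sketch that triangulates one marker from two cameras with OpenCV; the projection matrices and image coordinates are made-up placeholders.

```python
# Minimal sketch: triangulating a single mocap marker seen by two calibrated cameras.
# P1, P2 and the 2D detections below are made-up placeholders; a real optical mocap
# system calibrates dozens of cameras and tracks many markers per frame.
import numpy as np
import cv2

# 3x4 projection matrices for two hypothetical cameras (identity intrinsics,
# second camera shifted 0.5 m along the x axis).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

# Normalized image coordinates of the same marker in each view (2x1 arrays).
pt1 = np.array([[0.20], [0.10]], dtype=np.float64)
pt2 = np.array([[0.15], [0.10]], dtype=np.float64)

# OpenCV returns homogeneous 4x1 coordinates; divide by w to get metric XYZ.
point_h = cv2.triangulatePoints(P1, P2, pt1, pt2)
point_3d = (point_h[:3] / point_h[3]).ravel()
print("Estimated marker position (x, y, z):", point_3d)  # roughly (2.0, 1.0, 10.0)
```

Repeating this for every marker in every frame, then fitting a skeleton to the resulting point cloud, is essentially what turns an actor in a marker suit into a moving 3D character.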


100 years of motion-capture technology

Engadget

Modern motion-capture systems are the product of a century of tinkering, innovation and computational advances. Mocap was born a lifetime before Gollum hit the big screen in The Lord of the Rings, and ages before the Cold War, Vietnam War or World War II. It was 1915, in the midst of the First World War, when animator Max Fleischer developed a technique called rotoscoping and laid the foundation for today's cutting-edge mocap technology. Rotoscoping was a primitive and time-consuming process, but it was a necessary starting point for the industry. In the rotoscope method, animators stood at a glass-topped desk and traced over a projected live-action film frame by frame, copying actors' or animals' actions directly onto a hand-drawn world.


[D] What resources for 3D human pose estimation? • r/MachineLearning

#artificialintelligence

Hello, I'm searching for resources for 3D human pose estimation (single person, real time, single or multiple RGB/RGBD cameras).
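For the single-person, real-time, single-RGB-camera case asked about here, one commonly used starting point is MediaPipe's Pose solution, which estimates approximate 3D body landmarks from a webcam feed. The snippet below is a minimal sketch of that usage, assuming the mediapipe and opencv-python packages are installed; the webcam index and confidence thresholds are arbitrary.

```python
# Minimal sketch: real-time single-person 3D pose landmarks from one RGB webcam
# using MediaPipe Pose. Webcam index and confidence thresholds are arbitrary.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
cap = cv2.VideoCapture(0)  # default webcam

with mp_pose.Pose(model_complexity=1,
                  min_detection_confidence=0.5,
                  min_tracking_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR order.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_world_landmarks:
            # pose_world_landmarks are approximate metric 3D coordinates,
            # with the origin roughly at the center of the hips.
            nose = results.pose_world_landmarks.landmark[mp_pose.PoseLandmark.NOSE]
            print(f"nose (x, y, z): {nose.x:.3f}, {nose.y:.3f}, {nose.z:.3f}")
        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to stop
            break

cap.release()
```

For multi-camera or RGB-D setups, the usual approach is to fuse per-view estimates (for example by triangulation or depth back-projection), which this single-camera sketch does not cover.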


[P] I've got 1k to pay somebody to help me put together an art project using human pose estimation. • r/MachineLearning

@machinelearnbot

I don't really belong here, considering I barely know how to write code. But I'm not sure where to look for help building a project where I am trying to utilize Human Pose Estimation and/or skeleton tracking. I've got some funding to pay somebody who is interested in helping; I just don't know where to find that person, so I thought I'd look here.


Motion-capture footage from the original 'Mortal Kombat' has us on a nostalgia trip

Mashable

If you ever played the OG Mortal Kombat, you have to see the original motion capture of the actors whose movements and likenesses inspired one of the greatest gaming franchises of all time. Imgur user RambleKhron compiled the videos, turning sections of them into GIFs with a short description and a link to the video each came from. Below are those choice GIFs made by RambleKhron. It's no secret to fans of the original game that motion capture was used for the playable fighters. It gave the game its gritty feel and made the violence more realistic and dangerous at a time when video game violence was under serious attack by both the media and governments worldwide.


Paris Machine Learning #10 Ending Season 4: Large-scale video classification, community detection, Code Mining, Maps, Load Monitoring and Cognitive.

#artificialintelligence

At Qucit we use geographic data collected from hundreds of cities on a daily basis. It is collected from many different sources, often from each city's open data website, and used as input to our models. We can then predict parking times, bike-share station occupancy, stress levels, parking fraud… Gathering this data is tedious work, and we aim to automate it. To do so, we want to get our data from a source available everywhere: satellite images. We now have good enough precision to detect roads and buildings, and we believe that single trees can be detected too. We tested our model on the SpaceNet images and labels, acquired thanks to the SpaceNet Challenge, during which the Topcoder community had to develop automated methods for extracting building footprints from high-resolution satellite imagery.

Non-Intrusive Load Monitoring is the field of disaggregating a building's electrical consumption, enabling people to increase their energy efficiency and reduce both energy demand and electricity costs. We will present this active research field and the learning challenges at Smart Impulse.
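As a rough illustration of the SpaceNet-style task mentioned above (per-pixel extraction of building footprints from satellite tiles), here is a minimal sketch of one binary-segmentation training step in PyTorch. The tiny convolutional model, tensor shapes, and random stand-in tiles are all hypothetical placeholders, not the Qucit or Topcoder solutions.

```python
# Minimal sketch: one training step of binary building-footprint segmentation.
# The toy ConvNet, tensor shapes, and random "satellite tiles" are placeholders;
# real SpaceNet pipelines use far larger encoder-decoder models and real imagery.
import torch
import torch.nn as nn

# A deliberately tiny fully convolutional model: 3-band tile in, 1-channel logit map out.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, kernel_size=1),  # per-pixel building / not-building logit
)

criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Stand-in batch: 4 RGB tiles of 128x128 pixels with binary footprint masks.
tiles = torch.rand(4, 3, 128, 128)
masks = torch.randint(0, 2, (4, 1, 128, 128)).float()

logits = model(tiles)
loss = criterion(logits, masks)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"training loss on toy batch: {loss.item():.4f}")
```

The same encoder-plus-per-pixel-classifier pattern extends to roads or single-tree detection by changing the labels, which is why one segmentation pipeline can serve several of the map layers mentioned above.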


This Camera Free Motion Capture Smartsuit Pro Is Insanely Cool

Forbes Technology

Each time I move my arm, the frog on screen extends his. I drop into a low split and the frog follows suit, even mirroring me when I transition into a handstand. Yes, his hands float a little over the ground, but that's because he's been calibrated as someone six inches taller than me. Finally, I take my headset off, and the frog reacts like I've decapitated it, head hanging backward, reminiscent of Shakespeare's Macbeth. But even in its death throes, it's still uncannily accurate, following my every move perfectly -- with zero cameras involved in this motion capture.


Motion capture and visual effects bring back Tarkin for 'Rogue One'

Los Angeles Times

One of the best-kept secrets of 2016 was the fact that a major character in Gareth Edwards' "Rogue One: A Star Wars Story" would be appearing on screen for the first time since the actor who portrayed him passed away more than 20 years ago. Through visual effects wizardry and a live-action performance by actor Guy Henry, the commander of the first Death Star in 1977's "Star Wars," Grand Moff Tarkin, was brought back to the big screen as though the late Peter Cushing were still portraying him. For John Knoll, it was the most difficult aspect of his responsibilities as visual effects supervisor on the global blockbuster. An Oscar winner for his work on "Pirates of the Caribbean: Dead Man's Chest," Knoll believes the illusion wouldn't have succeeded without Henry's presence. The effects team's job was effectively that of a cosmetic or prosthetic makeup artist.


Faceshift: Apple buys Star Wars motion-capture company - BBC News

AITopics Original Links

Apple has purchased the company behind motion-capture technology used in the latest Star Wars film. Faceshift, a Zurich-based start-up, specialises in software that allows 3D animated characters to mimic the facial expressions of an actor. Apple has now bought the company, though it is not known how much the deal cost the tech giant. It is also unclear what Apple's plans are for the company following its acquisition. A spokesman said: "Apple buys smaller technology companies from time to time, and we generally do not discuss our purpose or plans."