Adobe Character Animator Lets You Make Cartoons Speak With Face Tracking

AITopics Original Links

Recognizing faces is one of the earliest skills human infants learn. Considering how important paying attention to faces is to our survival, it is no surprise that computer vision researchers have put so much effort into this field. Applications of static facial recognition range from helping digital cameras focus on multiple faces in an image to Facebook's automatic identification of people in photo galleries. Face tracking is the ability to recognize the geometry of facial features in moving images and follow them despite changes in angle or lighting. Adobe's new Character Animator app, announced today for its Creative Cloud service, uses advanced face tracking to create animated effects that are downright playful.
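As a rough sketch of what frame-by-frame face tracking involves, the loop below detects and boxes faces in a live video feed. It uses OpenCV's stock Haar cascade rather than Adobe's unpublished tracker; the camera index and detection parameters are illustrative assumptions.

```python
# Minimal face-detection loop over video frames (a sketch, not Adobe's method).
import cv2

# Stock frontal-face Haar cascade shipped with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)  # assumption: default webcam at index 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Re-detect each frame; a real tracker would also smooth the boxes
    # over time to stay stable under pose and lighting changes.
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("face tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```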


Correlation and Interpolation Networks for Real-time Expression Analysis/Synthesis

Neural Information Processing Systems

We describe a framework for real-time tracking of facial expressions that uses neurally inspired correlation and interpolation methods. A distributed, view-based representation characterizes facial state and is computed using a replicated correlation network. The ensemble of view correlation scores is fed to a network-based interpolation method, which maps perceptual state to motor control states for a simulated 3-D face model. Activation levels of the motor state correspond to muscle activations in an anatomically derived model. By integrating fast, robust 2-D processing with 3-D models, we obtain a system that tracks and interprets complex facial motions in real time.
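A minimal NumPy sketch of the two-stage idea described in the abstract: correlate an input patch against stored expression views, then interpolate from the score vector to muscle activations. The template patches, RBF interpolation, and muscle dimensions are toy stand-ins assumed for illustration, not the paper's actual networks or data.

```python
# Sketch: view correlation scores -> interpolated motor (muscle) state.
import numpy as np

def view_correlation_scores(patch, templates):
    """Normalized correlation of an input patch against stored view templates
    (approximating the 'replicated correlation network' stage)."""
    p = (patch - patch.mean()) / (patch.std() + 1e-8)
    scores = []
    for t in templates:
        tt = (t - t.mean()) / (t.std() + 1e-8)
        scores.append(float((p * tt).mean()))
    return np.array(scores)

def interpolate_motor_state(scores, centers, muscle_targets, sigma=0.5):
    """RBF interpolation from the ensemble of view scores ('perceptual state')
    to muscle activations ('motor control state') for a 3-D face model."""
    d2 = ((scores - centers) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    w /= w.sum() + 1e-8
    return w @ muscle_targets  # convex blend of stored muscle activations

# Toy usage: 4 stored expression views (8x8 patches), 6 facial muscles.
rng = np.random.default_rng(0)
templates = [rng.normal(size=(8, 8)) for _ in range(4)]
centers = np.stack([view_correlation_scores(t, templates) for t in templates])
muscle_targets = rng.uniform(0, 1, size=(4, 6))  # activations per stored view

frame_patch = templates[2] + 0.1 * rng.normal(size=(8, 8))  # noisy input view
scores = view_correlation_scores(frame_patch, templates)
activations = interpolate_motor_state(scores, centers, muscle_targets)
print("view scores:", np.round(scores, 2))
print("muscle activations:", np.round(activations, 2))
```

The blend weights concentrate on the stored view most correlated with the input, so the recovered muscle state degrades gracefully as the input drifts between known expressions.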


How AI Is Breathing Life Into Animation

#artificialintelligence

"Knowing how powerful machine learning has become, it's just a matter of time before it completely takes over the animation industry." In recent years, deep learning has increased the modern scope of animation, making it more accessible and powerful than before. Artificial intelligence has become a shiny new weapon in the creator's arsenal. The advancement of hardware and AI has blurred the lines between virtual and real characters(eg: movies like Alita). Something that could have taken hours to perform by animators is being done by automation in minutes.