Deep Learning Is Making Video Game Characters Move Like Real People


As video games give players more freedom to explore complex digital worlds, it becomes harder for a CG character to move through and interact with those worlds naturally. To smooth out the awkward transitions between pre-programmed movements, researchers have turned to AI and deep learning to make video game characters move almost as realistically as real humans do.

To make characters walk, run, jump, and perform other movements convincingly, game developers often rely on motion capture: human performances that are recorded and translated onto digital characters. Motion capture produces faster and better-looking results than animating characters by hand, but, as the researchers note, it's impossible to plan for every way a character might interact with a digital world. Developers try to cover as many possibilities as they can, yet they ultimately rely on software to transition between animations, say, from a character walking up to a chair to the character sitting down on it, and more often than not those segues feel stilted and unnatural, diminishing the player's experience.

Computer scientists from the University of Edinburgh and Adobe Research have come up with a novel solution, which they will present at the ACM SIGGRAPH Asia conference in Brisbane, Australia, next month.
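The "segue" problem the article describes is often handled in traditional game engines by cross-fading: linearly blending the tail of one pre-recorded clip into the head of the next. The sketch below is purely illustrative of that conventional technique, not the researchers' method; all function names and the toy one-joint clips are hypothetical.

```python
# Illustrative sketch of conventional animation cross-fading, the kind of
# software-driven transition the article says often feels stilted.
# All names and data here are hypothetical, not from the paper.

def lerp(a, b, t):
    """Linearly interpolate between two scalar joint values."""
    return a + (b - a) * t

def blend_poses(pose_a, pose_b, t):
    """Blend two poses (lists of joint angles) with weight t in [0, 1]."""
    return [lerp(ja, jb, t) for ja, jb in zip(pose_a, pose_b)]

def crossfade(clip_a, clip_b, blend_frames):
    """Cross-fade the tail of clip_a into the head of clip_b.

    Each clip is a list of poses; over blend_frames frames the blend
    weight shifts linearly from clip_a to clip_b.
    """
    head = clip_a[:-blend_frames]
    transition = [
        blend_poses(clip_a[-blend_frames + i], clip_b[i], (i + 1) / blend_frames)
        for i in range(blend_frames)
    ]
    tail = clip_b[blend_frames:]
    return head + transition + tail

# Toy example: a one-joint 'walk' clip fading into a 'sit' clip.
walk = [[0.0], [0.5], [1.0]]
sit = [[2.0], [2.5], [3.0]]
print(crossfade(walk, sit, blend_frames=2))  # → [[0.0], [1.25], [2.5], [3.0]]
```

A straight linear blend like this ignores physics and context, which is why such transitions can look unnatural; the learned approach the researchers present aims to generate the in-between motion instead.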
