In video games, non-playable characters can be somewhat clueless. An NPC might wander across a city block, face-plant into a streetlamp, and then vanish the next block over. NPCs leap into player-characters' punches or commit to kicking a wall 400 times, never learning that the wall won't kick back. Unity Technologies is in the business of NPCs. Founded in 2004, Unity makes an eponymous game engine whose real-time 3D graphics technology provides the underlying architecture for hundreds of video games.
Airbus is deriving value from synthetic data used to process satellite imagery, and premier game-engine maker Unity is using synthetic data to break into new markets, including robotics and autonomous vehicles. Those were takeaways from two talks at the recent Ai4 virtual conference, which brought together enterprise AI users from a range of industries. "Simulation and synthetic data generation is a game changer for AI," stated Danny Lange, senior VP of AI and machine learning at San Francisco-based Unity. The company grew up in game development and now offers an engine for producing 3D projects, which it plans to extend into new industries including robotics, autonomous cars, engineering, and manufacturing. Over 60% of all games are made with Unity, which is installed on over six billion unique devices and has three billion monthly active players, Lange stated.
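Why is synthetic data such a game changer? Because when you author a scene in simulation, you know the ground truth for free: no human annotators are needed. The sketch below illustrates the idea with domain randomization, a common synthetic-data technique. It is a hypothetical, stripped-down example, not Unity's or Airbus's actual pipeline; in practice a renderer such as a game engine would turn the randomized parameters into images.

```python
import random

def generate_sample(rng):
    """Randomize scene parameters and return (scene, label).

    In a real pipeline the scene dict would drive a 3D renderer; here the
    parameter dict itself stands in for the rendered sample.
    """
    scene = {
        "object_class": rng.choice(["car", "pedestrian", "cyclist"]),
        "x": rng.uniform(-10.0, 10.0),        # object position (meters)
        "y": rng.uniform(0.0, 50.0),
        "sun_angle": rng.uniform(0.0, 90.0),  # lighting variation
        "occlusion": rng.uniform(0.0, 0.5),   # fraction of object hidden
    }
    # The label is exact because we authored the scene ourselves.
    label = {"class": scene["object_class"],
             "bbox_center": (scene["x"], scene["y"])}
    return scene, label

def generate_dataset(n, seed=0):
    rng = random.Random(seed)
    return [generate_sample(rng) for _ in range(n)]

dataset = generate_dataset(1000)
print(len(dataset))  # 1000 perfectly labeled samples, zero annotation cost
```

Randomizing lighting, position, and occlusion across thousands of samples is what lets models trained on simulated scenes generalize to real satellite photos or camera feeds.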
This article is part of a VB special issue. Read the full series here: The metaverse - How close are we? Defining the "metaverse" is a difficult task, but one commonly accepted definition is a digital space populated by representations of people, places, and things. Through a combination of technologies including virtual reality (VR), augmented reality (AR), and AI, the metaverse that some futurists envision is an extension of the real world, albeit without the physical trappings. Companies like Rockstar and Roblox have pitched the metaverse as the ideal platform for gaming, but there's no limit to its potential applications in the enterprise.
For years, video game developers have used artificial intelligence to animate the characters a player encounters, but those non-playable characters, or NPCs, have been based on sets of rules coded by humans. Using the AI technology du jour, machine learning, future NPCs will program and reprogram their own rules based on the experiences they encounter in games, getting smarter the longer they play. So says Danny Lange, the VP of AI and machine learning at Unity Technologies, a major maker of game "engine" software that handles the underlying mechanics of titles like Firewatch and ChronoBlade. Today the company announced Unity Machine Learning Agents, open-source software linking its game engine to machine learning frameworks such as Google's TensorFlow. Using a branch of machine learning called deep reinforcement learning, it will allow non-playable characters to develop, through trial and error, better and more creative strategies than a human could program, says Lange.
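The trial-and-error loop Lange describes can be sketched in a few lines. The toy below is NOT Unity's ML-Agents API or TensorFlow code; it is a minimal tabular Q-learning agent (a simpler cousin of deep reinforcement learning, where a neural network would replace the table) on a hypothetical one-dimensional corridor. The NPC starts at cell 0; stepping left bumps into a wall for a penalty, stepping right eventually reaches a goal for a reward. No one codes the rule "don't kick the wall": the agent learns it from rewards.

```python
import random

N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]                  # left, right
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.1
rng = random.Random(0)

def step(state, action):
    """Environment dynamics: wall on the left, goal on the right."""
    nxt = state + action
    if nxt < 0:                     # kicked the wall -- it doesn't kick back
        return state, -1.0, False
    if nxt == GOAL:
        return nxt, 10.0, True
    return nxt, 0.0, False

for episode in range(200):
    state, done = 0, False
    while not done:
        if rng.random() < epsilon:  # occasionally explore at random
            action = rng.choice(ACTIONS)
        else:                       # otherwise exploit the best known action
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        nxt, reward, done = step(state, action)
        # Q-learning update: nudge the estimate toward reward + discounted future value
        best_next = max(q[(nxt, a)] for a in ACTIONS)
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = nxt

# The learned policy heads right toward the goal instead of kicking the wall.
policy = [max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES - 1)]
print(policy)
```

The same loop, scaled up with deep neural networks and a 3D game world standing in for the corridor, is what lets ML-Agents-trained NPCs discover strategies no designer scripted.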
No other topic took 2021 by storm quite like the metaverse. As we all experienced yet another year of living through a pandemic, the idea of a new, immersive reality captured the interests and imaginations of many. As with any new concept, it's helpful to establish a shared understanding of what the metaverse is, or will be. I like how my Unity colleague Tony Parisi, one of the early pioneers of 3D media and virtual reality, put it in his excellent article on the metaverse: "The metaverse is the next evolution of the internet … enhanced and upgraded to consistently deliver 3D content, spatially organized information and experiences, and real-time synchronous communication." Much of the attention around the metaverse to date has centered on social experiences where people can meet up. But I'm most excited by the potential of the "industrial metaverse," where the goal has nothing to do with social interaction; rather, it's about simulating experiences in the virtual world before moving into the physical world.