
AI can now play Minecraft just as well as you - here's why that matters


Experts at OpenAI have trained a neural network to play Minecraft to a standard matching human players. The neural network was trained on 70,000 hours of miscellaneous in-game footage, supplemented with a small database of videos in which contractors performed specific in-game tasks, with the keyboard and mouse inputs also recorded. After fine-tuning, OpenAI found the model was able to perform all manner of complex skills, from swimming to hunting animals and consuming their meat. It also grasped the "pillar jump", a move whereby the player places a block of material below themselves mid-jump in order to gain elevation. Perhaps most impressively, the AI was able to craft diamond tools (requiring a long string of actions to be executed in sequence), which OpenAI described as an "unprecedented" achievement for a computer agent.

OpenAI Introduces a Neural Network That Can Play 'Minecraft'


OpenAI has developed a neural network that can play Minecraft like humans. The Artificial Intelligence (AI) model was trained on 70,000 hours of miscellaneous in-game footage, along with a small database of videos in which specific in-game tasks were performed, with keyboard and mouse inputs also recorded. OpenAI fine-tuned the AI, and now it is as skillful as a human: it can swim, hunt animals, and eat. The AI can also perform the pillar jump, where a player places a block of material below themselves in mid-air to gain more elevation.

In letter to board, Enthusiast leadership asks CEO to step down

Washington Post - Technology News

On June 7, Greywood announced it intended to nominate Shinggo Lu, a current Enthusiast employee and the co-founder of U.GG, a "League of Legends" analytics platform and a recent Enthusiast acquisition, to the new board. Lu shared the news in an Enthusiast Slack channel with over 250 employees, entreating other employees to ask him questions, and sparking a spirited but largely cordial conversation between staff and some members of the company's leadership over Enthusiast's direction and treatment of employees, according to messages viewed by The Post.

In 'Hindsight,' players explore memories and come to terms with grief

Washington Post - Technology News

As the team learned who Mary was, the game's writing also changed. The tone shifted significantly midway through development, Kidwell said. It was initially inspired by Terrence Malick's "The Tree of Life," a coming-of-age story that unfolds as the characters, through voice-overs, philosophize about the meaning of life. The movie's voice-overs are eloquent and emotional but use stilted language, which didn't fit Mary's journey. Ultimately, as the team gained a better understanding of Mary's character, they switched to a more grounded, conversational tone for her voice-overs, Kidwell said, which sounded much more natural.

Blizzard buys 'Spellbreak' studio Proletariat to speed up 'WoW' development


The studio has another major release lined up in the form of World of Warcraft expansion Dragonflight, which is expected to arrive by the end of 2022. To help get WoW expansions out on time and ensure they meet the quality bar, Blizzard bought Spellbreak studio Proletariat to bolster its ranks of developers, as GamesBeat reports. The news comes one day after Proletariat announced it will shut down Spellbreak early next year. The free-to-play game is an intriguing take on the battle royale genre, with players using magical powers instead of guns. The game never took off, though.

AI learns how to play Minecraft by watching videos - AI News


OpenAI has trained a neural network to play Minecraft using Video PreTraining (VPT) on a massive unlabeled video dataset of human Minecraft play, supplemented by just a small amount of labeled contractor data. With a bit of fine-tuning, the AI research and deployment company is confident that its model can learn to craft diamond tools, a task that usually takes proficient humans over 20 minutes (24,000 actions). The model uses the native human interface of keypresses and mouse movements, making it quite general, and represents a step towards general computer-using agents.

A spokesperson for the Microsoft-backed firm said: "The internet contains an enormous amount of publicly available videos that we can learn from. You can watch a person make a gorgeous presentation, a digital artist draw a beautiful sunset, and a Minecraft player build an intricate house. However, these videos only provide a record of what happened, not precisely how it was achieved, i.e. you will not know the exact sequence of mouse movements and keys pressed. If we would like to build large-scale foundation models in these domains as we've done in language with GPT, this lack of action labels poses a new challenge not present in the language domain, where 'action labels' are simply the next words in a sentence."

In order to utilise the wealth of unlabeled video data available on the internet, OpenAI introduces a novel yet simple semi-supervised imitation learning method: Video PreTraining (VPT). The team began by gathering a small dataset from contractors, recording not only their video but also the actions they took - in this case, keypresses and mouse movements. With this data the company can train an inverse dynamics model (IDM), which predicts the action being taken at each step in the video. Importantly, the IDM can use both past and future information to guess the action at each step.
The spokesperson added: "This task is much easier and thus requires far less data than the behavioral cloning task of predicting actions given past video frames only, which requires inferring what the person wants to do and how to accomplish it.
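The pipeline described above - train an IDM on a small labeled dataset, pseudo-label the large unlabeled corpus with it, then do behavioral cloning on the pseudo-labels - can be sketched in miniature. Everything here is an illustrative assumption, not OpenAI's code: a one-nearest-neighbour classifier over toy scalar "frames" stands in for both the IDM and the cloned policy.

```python
# Minimal sketch of the VPT-style semi-supervised pipeline. The toy
# "model" is a 1-nearest-neighbour lookup; frames are scalars and
# actions are keypress names. All names here are hypothetical.

def train_1nn(examples):
    """Return a predictor that copies the label of the closest stored input."""
    def predict(x):
        return min(examples, key=lambda ex: abs(ex[0] - x))[1]
    return predict

# Step 1: small labeled contractor dataset of (frame, action) pairs.
labeled = [(0.1, "jump"), (0.5, "mine"), (0.9, "place_block")]
# The IDM may look at past AND future frames, which makes its task easier;
# this toy version just maps a single frame to an action.
idm = train_1nn(labeled)

# Step 2: pseudo-label the large unlabeled video corpus with the IDM.
unlabeled_frames = [0.12, 0.48, 0.88, 0.15]
pseudo_labeled = [(f, idm(f)) for f in unlabeled_frames]

# Step 3: behavioral cloning on the pseudo-labeled corpus. A real policy
# would only condition on past frames, unlike the IDM.
policy = train_1nn(pseudo_labeled)

print(policy(0.11))  # -> jump
```

The point of the construction is the one the spokesperson makes below: labeling actions from full video context (the IDM's job) is far easier than predicting them causally, so a little labeled data can unlock a vast unlabeled corpus.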

Sony targets PC gamers with new hardware brand, Inzone

Washington Post - Technology News

Sony's gaming division, PlayStation, has primarily been focused on the console games market.

The AI tool behind Thanos made facial animation in 'The Quarry' a snap

Washington Post - Technology News

The Oscar-winning studio has produced visual effects for movies like "Titanic," "The Curious Case of Benjamin Button" and several Marvel films. To create the photorealistic characters seen in "The Quarry," it used the AI facial capture system Masquerade, which was developed to replicate Josh Brolin's likeness for his character Thanos in "Avengers: Infinity War." Masquerade was originally designed to do one thing: to take the performance from a head-mounted camera and translate it into a digital mesh that could then be rendered in a movie. For "The Quarry," the VFX team needed something that could track the movement and facial expressions of actors and create digital characters that could be edited in real time. So they built Masquerade 2.0.

Evil Geniuses wants data, not money, to determine success in esports

Washington Post - Technology News

Another good example of where we're looking five, ten years from now is how the definition of who can be an esports coach expands if in-game decision-making can be more automated and standardized. We can focus on other historically overlooked areas, which is what our current director of performance is really focused on: leadership skills, how to build team culture, how to manage resiliency. It's hard to find the unicorn that can be all facets of a traditional sports coach: a people leader with tactical-positional knowledge. And that's why data partnerships, like our HPE partnership, are really exciting for me. It gives us stepwise development in our professionalization of this, as well as opportunities to scale.

An AI Was Trained To Play Minecraft With 70,000 Hours Of YouTube Videos


OpenAI, the artificial intelligence research organization co-founded by Elon Musk, has trained an AI to play Minecraft almost as well as humans. It only took about 70,000 hours of binging YouTube videos. A blog post detailing the feat reveals that researchers used a technique called Video PreTraining (VPT) to train a neural network to play Minecraft. This involved gathering a 2,000-hour sample dataset of actual humans playing Minecraft, including not just the raw video but also the exact keypresses and mouse movements. From there, the researchers trained an inverse dynamics model (IDM) to predict the action being taken at each step in the videos, using both past and future frames.