OpenAI researchers demonstrated a neural network that learned to play Minecraft using only gameplay videos.
To train the neural network, engineers assembled a large dataset of publicly available videos of ordinary people playing Minecraft and adapted their training algorithms to learn from it.
In the first stage, the neural network was shown 2,000 hours of labeled Minecraft gameplay video, annotated with which buttons users pressed during play. This allowed the system to learn to predict which buttons were being pressed. In the second stage, the network viewed 70,000 hours of unlabeled video, with no data on the buttons pressed. All recordings were taken from open sources.
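The two-stage idea described above can be sketched roughly as follows: a small labeled set teaches a model to guess actions from frames, and that model then pseudo-labels the much larger unlabeled corpus so an agent can be trained by imitation. This is a minimal toy illustration, not OpenAI's code; the frame representation, model, and action names here are all hypothetical stand-ins.

```python
# Toy sketch of the two-stage video-pretraining pipeline described above.
# Stage 1: learn an action labeler from a small labeled set of (frame, action) pairs.
# Stage 2: use it to pseudo-label a large unlabeled set of frames.
# The "model" is a trivial per-frame majority vote, purely for illustration.

from collections import Counter

def train_action_labeler(labeled_frames):
    """Stage 1: remember the most common action observed for each frame."""
    votes = {}
    for frame, action in labeled_frames:
        votes.setdefault(frame, Counter())[action] += 1
    return {frame: counts.most_common(1)[0][0] for frame, counts in votes.items()}

def pseudo_label(labeler, unlabeled_frames, default="noop"):
    """Stage 2: label the large unlabeled corpus with the stage-1 model."""
    return [(frame, labeler.get(frame, default)) for frame in unlabeled_frames]

# Tiny worked example: a few labeled frames, then a (nominally larger) unlabeled set.
labeled = [("grass", "jump"), ("water", "swim"), ("grass", "jump")]
labeler = train_action_labeler(labeled)
pseudo = pseudo_label(labeler, ["grass", "water", "lava"])
print(pseudo)  # → [('grass', 'jump'), ('water', 'swim'), ('lava', 'noop')]
```

The pseudo-labeled pairs would then serve as training data for the agent itself, which is what lets the cheap unlabeled video do most of the work.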
In a demonstration video, the researchers showed the neural network swimming, hunting, gathering resources, and crafting items from them. The algorithms even mastered pillar jumping, hunted cows, and learned to craft a diamond pickaxe.
The researchers noted that training the neural network on video proved highly effective, delivering results much faster than alternative approaches.