Researchers at the University of Edinburgh have trained a neural network to animate game characters so that they can interact with their environment to perform actions such as sitting, standing, moving around, avoiding obstacles, and lifting objects. The model's architecture is based on a Neural State Machine.

Even simple tasks, such as sitting on a chair, are difficult to model with supervised learning: they require integrated planning and the ability to navigate the environment. The Neural State Machine models character-scene interactions in changing environments.

The neural network takes the target location and the type of action to perform as input, and outputs the sequence of movements needed to carry out that action. The researchers also built a data-augmentation method into the training so that characters adapt to the specific geometry of the environment: the 3D geometry is varied randomly while the context of the original action is preserved. Thanks to this approach, the neural network learns to act in a variety of environments.
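The input/output interface described above can be sketched as an autoregressive loop: at each step the network maps the current pose, the goal location, and an action label to the next pose. The sketch below is a minimal illustration, not the paper's implementation; the dimensions, the single linear layer standing in for the trained network, and the function name `predict_next_pose` are all hypothetical.

```python
import numpy as np

# Hypothetical dimensions; the real feature layout in the paper is richer.
POSE_DIM = 12      # simplified joint parameters
GOAL_DIM = 3       # target location in the scene
N_ACTIONS = 5      # sit, stand, walk, avoid, carry

def predict_next_pose(pose, goal, action_id, weights):
    """One autoregressive step: (current pose, goal, action) -> next pose.

    `weights` stands in for a trained network; here it is a single
    linear layer purely for illustration.
    """
    action_onehot = np.eye(N_ACTIONS)[action_id]
    x = np.concatenate([pose, goal, action_onehot])
    return weights @ x

# Roll the model forward to produce a motion sequence toward the goal.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(POSE_DIM, POSE_DIM + GOAL_DIM + N_ACTIONS))
pose = np.zeros(POSE_DIM)
goal = np.array([2.0, 0.0, 1.5])
trajectory = []
for _ in range(10):
    pose = predict_next_pose(pose, goal, action_id=0, weights=W)
    trajectory.append(pose)

print(len(trajectory), trajectory[0].shape)
```

The point of the loop is that the model's output is fed back in as the next step's input, which is how a single-frame predictor produces an entire motion sequence.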



What is inside the neural network?

The architecture consists of a gating network and a motion prediction network. The gating network receives a subset of the current state parameters and the goal action vector as input, and outputs blending coefficients that are used when predicting the next action. The motion prediction network takes the character's pose and trajectory control variables together with the gating network's output, and predicts the parameters of the next action.
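The two-network structure can be sketched in mixture-of-experts style: the gating network produces blending coefficients, those coefficients combine a set of expert weight matrices, and the blended weights form the motion prediction step. This is a minimal sketch under assumed sizes; the number of experts, the single linear gating layer, and the names `gate_W`, `experts`, and `motion_step` are illustrative, not the paper's actual configuration.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical sizes for illustration only.
N_EXPERTS = 4
IN_DIM, OUT_DIM = 8, 6
GATE_IN = 5  # subset of state features plus the goal action vector

rng = np.random.default_rng(1)
gate_W = rng.normal(size=(N_EXPERTS, GATE_IN))           # gating network (one linear layer here)
experts = rng.normal(size=(N_EXPERTS, OUT_DIM, IN_DIM))  # expert weight matrices

def motion_step(gate_input, motion_input):
    # 1) The gating network outputs blending coefficients (sum to 1).
    alpha = softmax(gate_W @ gate_input)          # shape (N_EXPERTS,)
    # 2) Blend the experts' weights with those coefficients.
    W = np.tensordot(alpha, experts, axes=1)      # shape (OUT_DIM, IN_DIM)
    # 3) The blended network predicts the next-frame parameters.
    return W @ motion_input

out = motion_step(np.ones(GATE_IN), np.ones(IN_DIM))
print(out.shape)
```

The design choice worth noting is that the gating output selects *weights*, not outputs: rather than averaging several predictions, the coefficients synthesize one network whose parameters shift smoothly as the character's state and goal change.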

Evaluating the model

The researchers compared the Neural State Machine with MLP, PFNN, MANN, LSTM, and Auto-LSTM architectures. The NSM produced more accurate results than the other approaches.

* Source: https://github.com/sebastianstarke/AI4Animation//blob/master/Media/SIGGRAPH_Asia_2019/Paper.pdf