
Encoder-Decoder Recurrent Network Model for Interactive Character Animation Generation
Jul 14, 2017

Title: Encoder-decoder recurrent network model for interactive character animation generation

Authors: Wang, YM; Che, WJ; Xu, B

Full Names: Wang, Yumeng; Che, Wujun; Xu, Bo

Source: VISUAL COMPUTER, 33 (6-8):971-980; DOI: 10.1007/s00371-017-1378-5; JUN 2017

Language: English

Abstract: In this paper, we propose a generative recurrent model for human-character interaction. Our model is an encoder-recurrent-decoder network. The recurrent network is composed of multiple layers of long short-term memory (LSTM) units and is preceded by an encoder network and followed by a decoder network. With the proposed model, the virtual character's animation is generated on the fly while it interacts with the human player. The character's upcoming animation is generated automatically from the motion history of both itself and its opponent. We evaluated our model on both public motion capture databases and our own recorded motion data. Experimental results demonstrate that the LSTM layers help the character learn a long history of human dynamics to animate itself. In addition, the encoder-decoder networks significantly improve the stability of the generated animation. This method can automatically animate a virtual character responding to a human player.
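The generation loop the abstract describes can be sketched in code. The following is a minimal illustration, not the authors' implementation: all layer sizes, weight names, and the single-layer LSTM are assumptions. A dense encoder projects the concatenated human and character poses, an LSTM cell carries the motion history, and a dense decoder emits the character's next pose, which is fed back in on the following frame.

```python
import numpy as np

# Hypothetical sizes; the paper does not specify these here.
POSE = 10   # pose vector size
ENC = 16    # encoder output width
HID = 32    # LSTM hidden size

rng = np.random.default_rng(0)

def dense(x, W, b):
    return x @ W + b

def lstm_step(x, h, c, Wx, Wh, b):
    """One LSTM step; gates packed as [input, forget, cell, output]."""
    z = x @ Wx + h @ Wh + b
    H = h.shape[-1]
    i = 1.0 / (1.0 + np.exp(-z[:H]))          # input gate
    f = 1.0 / (1.0 + np.exp(-z[H:2 * H]))     # forget gate
    g = np.tanh(z[2 * H:3 * H])               # candidate cell state
    o = 1.0 / (1.0 + np.exp(-z[3 * H:]))      # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Random (untrained) parameters, for shape illustration only.
params = {
    "We": rng.normal(0, 0.1, (2 * POSE, ENC)), "be": np.zeros(ENC),
    "Wx": rng.normal(0, 0.1, (ENC, 4 * HID)),
    "Wh": rng.normal(0, 0.1, (HID, 4 * HID)), "bl": np.zeros(4 * HID),
    "Wd": rng.normal(0, 0.1, (HID, POSE)), "bd": np.zeros(POSE),
}

def generate(human_frames, char_start, p):
    """For each incoming human frame, produce the character's next pose,
    conditioning on the history of both the human and the character."""
    h, c = np.zeros(HID), np.zeros(HID)
    char_pose = char_start
    out = []
    for human_pose in human_frames:
        x = np.concatenate([human_pose, char_pose])           # both motions
        e = np.tanh(dense(x, p["We"], p["be"]))               # encoder
        h, c = lstm_step(e, h, c, p["Wx"], p["Wh"], p["bl"])  # recurrent
        char_pose = dense(h, p["Wd"], p["bd"])                # decoder
        out.append(char_pose)
    return np.stack(out)

frames = rng.normal(size=(5, POSE))  # 5 frames of synthetic human motion
poses = generate(frames, np.zeros(POSE), params)
print(poses.shape)  # (5, 10): one character pose per human frame
```

Because the predicted pose is concatenated with the next human frame at every step, the character's output depends on the accumulated LSTM state rather than on the current frame alone, which is what lets the model respond to long-range human dynamics.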

  ISSN: 0178-2789

  eISSN: 1432-2315

  IDS Number: EX1EY

  Unique ID: WOS:000402964800027
