ItMRnet

Published: 9 May 2022| Version 1 | DOI: 10.17632/gx8bmc3hmz.1
Contributors:
Wei Jia, ShuJi Li, Lei Wang

Description

An iterative solution for improving the generalization ability of unsupervised skeleton motion retargeting.

>>>>>>>>>Downloading and Preprocessing Data<<<<<<<<<

All data we use come from the Mixamo website (https://www.mixamo.com/). You need to create an account to download the animation files. For a quick way to download files in bulk, see:

> *https://github.com/gnuton/mixamo_anims_downloader*

Once the fbx animation files have been downloaded, run *fbx2bvh* in Blender (note: press *"Shift + F11"* to switch to the scripting screen). Then convert the bvh files into npy files with *bvh2npy.py*.

>>>>>>>>>Training Phase<<<<<<<<<

Train ItMRNet with:

> *./src/ItMRnet_Train.py*

The model will be saved to "./models". We also provide pre-trained models to help you get started quickly (saved at "./models").

>>>>>>>>>Execution Phase<<<<<<<<<

Run ItMRNet with:

> *./src/ItMRnet_Test.py*

The details of the four groups of experiments are described at *https://drive.google.com/file/d/1T0o6kFho96CniyjX7LXV_xgaeHlDJqbV/view?usp=sharing*. The experimental data are located at "./datasets/test/".

>>>>>>>>>Evaluation<<<<<<<<<

> Run *./result/evaluate.py*

>>>>>>>>>Rendering model to generate videos<<<<<<<<<

Download Blender files with character skins from https://drive.google.com/file/d/1m_n1RzdmOv8o1KQJpH3YDC0Drb7KcH8Z/view?usp=sharing

> Run *./result/make_videos.py* to generate videos.

The visualization results of ItMRNet are all collected in the video at https://drive.google.com/file/d/15Jv_qdcdsS8Rn26lIZc-o5T8YaKgi17R/view?usp=sharing.

>>>>>>>>>IK processing<<<<<<<<<

Please refer to this work: *https://github.com/DeepMotionEditing/deep-motion-editing*
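After preprocessing, each motion clip is an ordinary NumPy `.npy` file on disk. A minimal sketch of writing and reading such a clip, assuming a hypothetical (frames, joints, 3) joint-position layout (the actual array layout produced by *bvh2npy.py* may differ):

```python
import os
import tempfile

import numpy as np

# Hypothetical layout: 120 frames, 22 joints, xyz coordinates per joint.
# This shape is an assumption for illustration, not the repo's exact format.
motion = np.zeros((120, 22, 3), dtype=np.float32)

path = os.path.join(tempfile.mkdtemp(), "clip.npy")
np.save(path, motion)          # bvh2npy.py would save clips like this

loaded = np.load(path)         # training code would load clips like this
print(loaded.shape)            # (120, 22, 3)
```

Storing clips as plain arrays keeps the training loader simple: `np.load` returns the clip ready for batching, with no BVH parsing at train time.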
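The evaluation step compares retargeted motion against reference motion. A common metric for this task is mean per-joint position error; the sketch below is a hedged illustration only (the exact metric computed by *evaluate.py* is not specified here, and `joint_position_error` is a hypothetical helper):

```python
import numpy as np

def joint_position_error(pred: np.ndarray, gt: np.ndarray) -> float:
    """Mean Euclidean distance between corresponding joints over all frames.

    pred, gt: arrays of shape (frames, joints, 3).
    """
    return float(np.mean(np.linalg.norm(pred - gt, axis=-1)))

# Toy usage: a prediction offset from ground truth by 0.1 along x.
gt = np.zeros((4, 22, 3))
pred = gt.copy()
pred[..., 0] += 0.1
print(joint_position_error(pred, gt))  # 0.1
```

Averaging the per-joint distances over frames and joints yields a single scalar, which makes it easy to compare the four experiment groups side by side.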

Files

Institutions

Hefei University of Technology

Categories

Motion Synthesis

Licence