UNIC: Neural Garment Deformation Field
for Real-time Clothed Character Animation

Chengfeng Zhao1, Junbo Qi2, Zhiyang Dou3, Minchen Li4, Ziwei Liu5, Wenping Wang6, Yuan Liu1,5,†

1The Hong Kong University of Science and Technology    2Waseda University    3The University of Hong Kong   
4Carnegie Mellon University    5Nanyang Technological University    6Texas A&M University
†Corresponding author

Simulating physically realistic garment deformations is an essential task for immersive content in computer graphics and is often achieved with physics simulation methods. However, these methods are typically time-consuming, computationally demanding, and require costly hardware, making them unsuitable for real-time applications. Recent learning-based methods attempt to resolve this problem by training graph neural networks to learn per-vertex garment deformations, but they fail to capture the intricate deformations of garment meshes with complex topologies. In this paper, we introduce a novel neural deformation field-based method, named UNIC, to animate the garments of an avatar in real time given motion sequences. Our key idea is to learn an instance-specific neural deformation field to animate the garment meshes. Such an instance-specific learning scheme does not require UNIC to generalize to new garments but only to new motion sequences, which greatly reduces the difficulty of training and improves the deformation quality. Moreover, the neural deformation field maps 3D points to their deformation offsets, which not only avoids handling the topologies of complex garments but also injects a natural smoothness constraint into the deformation learning. Extensive experiments on various kinds of garment meshes demonstrate the effectiveness and efficiency of UNIC over baseline methods.
Pipeline

UNIC learns an instance-specific neural deformation field that deforms arbitrarily complex garments to follow unseen character poses in real time. We first encode the character poses of two consecutive frames into a compact latent space. Then, we sample a latent vector from the learned space and concatenate it with the garment vertex coordinates. After that, we feed the concatenation into an MLP-based deformation decoder to predict deformation offsets for all vertices. Finally, a post-processing intersection-handling step is applied to avoid intersections between the avatar mesh and the deformed garment mesh.
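The decoder step above can be sketched as follows. This is a minimal, hypothetical illustration with randomly initialized weights standing in for a trained network; the latent size, hidden width, and function names are assumptions for illustration, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 32   # assumed size of the pose latent vector
HIDDEN_DIM = 64   # assumed hidden width of the MLP decoder

# Randomly initialized weights stand in for the trained decoder.
W1 = rng.standard_normal((3 + LATENT_DIM, HIDDEN_DIM)) * 0.1
b1 = np.zeros(HIDDEN_DIM)
W2 = rng.standard_normal((HIDDEN_DIM, 3)) * 0.1
b2 = np.zeros(3)

def deform(vertices: np.ndarray, z: np.ndarray) -> np.ndarray:
    """Concatenate the pose latent with each vertex coordinate,
    run the MLP, and add the predicted per-vertex offset."""
    n = vertices.shape[0]
    x = np.concatenate(
        [vertices, np.broadcast_to(z, (n, z.shape[0]))], axis=1
    )
    h = np.maximum(x @ W1 + b1, 0.0)   # ReLU hidden layer
    offsets = h @ W2 + b2              # one 3D offset per vertex
    return vertices + offsets          # deformed garment vertices

verts = rng.standard_normal((100, 3))  # toy garment vertex positions
z = rng.standard_normal(LATENT_DIM)    # toy pose latent vector
deformed = deform(verts, z)
print(deformed.shape)                  # (100, 3)
```

Because the field maps any 3D point to an offset, this formulation is independent of the mesh connectivity, which is why it sidesteps the topology handling that graph-based methods require.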

Results

Given garments of varying topological complexity and geometric density, UNIC achieves simulation quality comparable to the "gold standard".

Qualitative Comparisons

We present the input garment geometries and the simulation results of each method. Note that, to fairly evaluate simulation quality, we select the garment samples within each method's data domain that are closest to our dataset.

Runtime Cost Comparisons

We test both the learning-based methods and the physics simulation method on all four garment styles in our dataset, ranging from a simple t-shirt to a complex tulle skirt. In our experiments, UNIC consistently outperforms all other approaches in efficiency, including GPU-accelerated professional software (Marvelous Designer 2009). Runtime cost is measured in frames per second (FPS) on a single NVIDIA RTX 3090 GPU.

Citation

@article{zhao2025unic,
  title   = {UNIC: Neural Garment Deformation Field for Real-time Clothed Character Animation},
  author  = {Zhao, Chengfeng and Qi, Junbo and Dou, Zhiyang and Li, Minchen and Liu, Yuan},
  journal = {arXiv preprint arXiv:},
  year    = {2025}
}

Thanks to Lior Yariv and Jianfeng Xiang for the website template