A GAN-based temporally stable shading model for fast animation of photorealistic hair
hair animation, fast shading, neural networks
We introduce an unsupervised GAN-based model for shading photorealistic hair animations. Our model is much faster than previous rendering algorithms and produces fewer artifacts than other neural image-translation methods. The main idea is to extend the Cycle-GAN structure to avoid semi-transparent hair appearance and to exactly reproduce the interaction of the lights with the scene. We use two constraints to ensure temporal coherence and highlight stability. Our approach outperforms previous methods in both quality and computational efficiency.
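The abstract mentions constraints enforcing temporal coherence between generated frames but does not spell them out. As a purely illustrative sketch (not the authors' actual loss), a common form of such a constraint penalizes the difference between consecutive generated frames; the function name and formulation below are assumptions for illustration only:

```python
import numpy as np

def temporal_coherence_penalty(frame_t, frame_t1):
    """Hypothetical temporal-coherence term: mean squared difference
    between two consecutive generated frames. The paper's actual
    constraints are not specified in this abstract."""
    return float(np.mean((np.asarray(frame_t1) - np.asarray(frame_t)) ** 2))

# Identical consecutive frames incur zero penalty;
# differing frames incur a positive one.
a = np.zeros((4, 4, 3))
b = np.ones((4, 4, 3))
print(temporal_coherence_penalty(a, a))  # → 0.0
print(temporal_coherence_penalty(a, b))  # → 1.0
```

In practice such a term is typically added to the adversarial loss with a weighting factor, trading shading fidelity against frame-to-frame stability.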
Tsinghua University Press
Zhi Qiao, Takashi Kanai. A GAN-based temporally stable shading model for fast animation of photorealistic hair. Computational Visual Media 2021, 7(1): 127-138.