Research on Photorealistic Virtual Face Modeling

Xiangzhen He, Shengyin Zhu, Yihao Zhang, Yerong Hu, Dengyun Zhu, Xiaoyue Liu, Fucheng Wan
Copyright: © 2022 | Pages: 11
DOI: 10.4018/JITR.299949

Abstract

Most traditional face-modeling methods are parameter-based, but the resulting models are overly smooth and ignore the detailed features of the face. To address this problem, this paper proposes a virtual face modeling method based on 3D motion-capture data. To improve the deformation method for realistic modeling, the personalized face-modeling process is divided into two parts: overall modification and local deformation. The overall deformation adjusts the facial shape and the positions of the facial features of a generic Maya model. Based on the principle of radial basis function (RBF) interpolation, a smooth interpolation function is constructed, and the new position coordinates of non-feature points are obtained by solving a system of linear equations, so that they better conform to the physiological characteristics of human faces. The result is a more realistic virtual face model.
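The local deformation described above, which is moving non-feature vertices smoothly based on the displacements of a small set of feature points, can be sketched as standard RBF interpolation. The following is a minimal illustration of that idea; the function name, the Gaussian kernel, and the parameter `sigma` are assumptions for demonstration, not necessarily the paper's exact formulation:

```python
import numpy as np

def rbf_deform(feature_pts, displacements, query_pts, sigma=1.0):
    """Displace query_pts by an RBF interpolant fitted to feature displacements.

    feature_pts:   (n, 3) feature-point positions on the generic model
    displacements: (n, 3) target offsets for those feature points
    query_pts:     (m, 3) non-feature vertices to deform
    """
    def kernel(r):
        # Gaussian radial basis; one common positive-definite choice
        return np.exp(-(r / sigma) ** 2)

    # Pairwise distances between feature points -> interpolation matrix
    diff = feature_pts[:, None, :] - feature_pts[None, :, :]
    phi = kernel(np.linalg.norm(diff, axis=-1))          # (n, n)

    # Solve phi @ w = displacements for the RBF weights
    w = np.linalg.solve(phi, displacements)              # (n, 3)

    # Evaluate the interpolant at the non-feature vertices
    qdiff = query_pts[:, None, :] - feature_pts[None, :, :]
    qphi = kernel(np.linalg.norm(qdiff, axis=-1))        # (m, n)
    return query_pts + qphi @ w
```

By construction the interpolant reproduces the feature-point displacements exactly, while vertices between features receive a smooth blend of neighboring offsets, which is what avoids the overly smooth, detail-free result of purely parametric models.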

1. Introduction

With the rapid development of science and technology, realistic reconstruction of 3D facial expressions has long been a research hotspot in computer graphics and computer vision (Wang and Shen, 2018; Wang et al., 2020; Mohammad et al., 2018). Facial expression plays a crucial role in face-to-face communication and directly conveys the inner emotional changes of the people communicating (Sheng, 2017). Facial movements are rich, and people are highly sensitive to facial expressions (Ge, 2014). In recent years, the deep integration of expression capture and interactive experience has opened a new direction for real-time human-computer interaction (Ran and Chen, 2016). A single human-computer interaction mode can no longer meet users' needs; user-centered, highly personalized, omni-directional perceptual interaction has become the development trend, and efficient tracking of face data under limited hardware resources remains a challenge (Jon, 2020). In this context, a new generation of integrated, networked, intelligent, and standardized interaction methods has emerged, emphasizing that human-computer interaction should be as natural and convenient as interaction between people. Human interaction involves language, expression, body movement, and many other channels, with facial expressions being especially important. Face modeling is key to modern visual effects in special-effects films and computer games (Yang and Tong, 2018). The human face carries a large amount of feature information and is the most important organ for character recognition; the study of computer simulation of specific faces is therefore of great significance (Lv and Li, 2018) and plays a crucial role in information communication. The authenticity of facial expression animation is one of the most critical evaluation indexes of facial-expression simulation.
In traditional facial expression animation, a lifelike result typically comes at the cost of substantial manual intervention (Fang, 2010). With the continuous development of computer animation, to obtain realistic character animation, motion-capture equipment is used to record performers' movements, and these motion data directly drive the movement of virtual characters (Wei, 2016). Therefore, to meet the needs of human visual experience, the study of realistic human faces has become an urgent need.
