Tracking facial motion
- 17 December 2002
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
Abstract
We describe a computer system that allows real-time tracking of facial expressions. Sparse, fast visual measurements using 2D templates are used to observe the face of a subject. Rather than track features on the face, the distributed response of a set of templates is used to characterize a given facial region. These measurements are coupled via a linear interpolation method to states in a physically-based model of facial animation, which includes both skin and muscle dynamics. By integrating real-time 2D image processing with 3D models, we obtain a system that is able to quickly track and interpret complex facial motions.
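The abstract outlines a two-stage pipeline: distributed 2D template responses measured over a facial region, then a linear interpolation from those responses to the states of a physically-based face model. The sketch below illustrates that coupling under stated assumptions; it uses OpenCV normalized cross-correlation as the template-response measure and a learned interpolation matrix `B`, both of which are illustrative choices and not the paper's implementation.

```python
import numpy as np
import cv2  # assumption: OpenCV template matching stands in for the paper's 2D template measurements


def template_responses(frame_gray, templates, region):
    """Summarize a facial region by the distributed response of a set of 2D templates.

    `templates` is a list of small grayscale patches; `region` is (x, y, w, h).
    """
    x, y, w, h = region
    roi = frame_gray[y:y + h, x:x + w]
    responses = []
    for tpl in templates:
        # Normalized cross-correlation over the region; the peak value
        # summarizes how strongly this template responds anywhere in it.
        corr = cv2.matchTemplate(roi, tpl, cv2.TM_CCOEFF_NORMED)
        responses.append(float(corr.max()))
    return np.array(responses)


def interpolate_state(response_vec, B, state0):
    """Linearly map the template response vector to model state
    (e.g. muscle activations driving the skin/muscle dynamics).

    `B` is a hypothetical interpolation matrix fit from example
    expression/response pairs; `state0` is a rest-state offset.
    """
    return state0 + B @ response_vec
```

In use, each video frame would yield a response vector for the tracked facial region, and the interpolated state would drive the 3D physically-based animation model frame by frame.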