
User Interactive MPEG-4 Compatible Facial Animation System

Escher, M. and Goto, T. and Kshirsagar, S. and Zanardi, C. and Magnenat-Thalmann, N.


Abstract: This paper describes the different processes and their interactions needed to generate a virtual environment inhabited by a clone representing a real person and by virtual autonomous actors. This requires communication between a cloned face (or avatar) and a virtual face, which in turn needs cloning and mimicking capabilities to reconstruct the 3D model and the movements of the real face. The autonomous virtual face is able to respond and interact through facial expressions and speech. Several main processing steps are necessary to reach this goal. The processing of the input data is crucial, since it is the user's only means of interacting with the virtual world and the autonomous actor. We have implemented the processing of the two basic media used in a dialog: speech and facial expressions. We also discuss the implementation of the emotionally autonomous actor. Finally, we give a description of the real-time facial animation system. The whole system is based on the MPEG-4 definitions of FAPs, visemes and expressions.
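As background for the abstract's mention of MPEG-4 FAPs, visemes and expressions, the following is a minimal illustrative sketch, not the authors' code: MPEG-4 defines 68 Facial Animation Parameters, where FAP 1 carries the viseme, FAP 2 the expression, and FAPs 3-68 are low-level feature-point displacements. The `FAPFrame` structure and the `blend` helper below are hypothetical names used only to show how a speech-driven channel and an expression channel might be merged per frame.

```python
from dataclasses import dataclass, field
from typing import List

NUM_LOW_LEVEL_FAPS = 66  # FAPs 3..68 (low-level displacements in FAP units)

@dataclass
class FAPFrame:
    viseme: int = 0                    # FAP 1: one of the 14 standard visemes (0 = neutral)
    expression: int = 0                # FAP 2: joy, sadness, anger, fear, disgust, surprise
    expression_intensity: float = 0.0  # 0.0 (neutral) .. 1.0 (full expression)
    low_level: List[float] = field(default_factory=lambda: [0.0] * NUM_LOW_LEVEL_FAPS)

def blend(speech: FAPFrame, emotion: FAPFrame, t: float) -> FAPFrame:
    """Hypothetical per-frame merge of a speech-driven viseme stream with an
    emotion-driven expression stream before rendering (t weights the emotion)."""
    out = FAPFrame(
        viseme=speech.viseme,
        expression=emotion.expression,
        expression_intensity=(1 - t) * speech.expression_intensity
                             + t * emotion.expression_intensity,
    )
    out.low_level = [(1 - t) * a + t * b
                     for a, b in zip(speech.low_level, emotion.low_level)]
    return out
```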


@inproceedings{13,
  booktitle = {International Workshop on Synthetic - Natural Hybrid Coding and Three Dimensional Imaging (IWSNHC3DI'99)},
  author = {Escher, M. and Goto, T. and Kshirsagar, S. and Zanardi, C. and Magnenat-Thalmann, N.},
  title = {User Interactive MPEG-4 Compatible Facial Animation System},
  pages = {29--32},
  month = sep,
  year = {1999},
  topic = {Facial Animation}
}