Compared to video-based systems, the technique we developed using electrophysiological data enables faster detection of facial expressions, even when the underlying movements are subtle. Features were extracted from 8 EMG sensors placed around the face. Gaussian models for six basic facial expressions - anger, surprise, disgust, happiness, sadness, and neutral - were learnt from these features and achieved a mean recognition rate of 92%. Finally, we developed a prototype of one possible application of this system, in which the output of the recognizer was sent to the expression module of a 3D avatar that then mimicked the recognized expression.
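The classification step above can be sketched as fitting one multivariate Gaussian per expression class and picking the class with the highest log-likelihood at recognition time. The snippet below is a minimal illustration of that idea, not the project's actual implementation: the feature extraction from the 8 EMG channels, the regularization constant, and the function names are all assumptions.

```python
import numpy as np

# Six basic expression classes, as in the project description.
EXPRESSIONS = ["anger", "surprise", "disgust", "happiness", "sadness", "neutral"]

def fit_gaussians(features, labels):
    """Estimate a mean vector and covariance matrix for each class.

    features: (n_samples, n_features) array, e.g. one feature per EMG channel.
    labels:   sequence of class names, one per sample.
    """
    labels = np.asarray(labels)
    models = {}
    for c in np.unique(labels):
        X = features[labels == c]
        mean = X.mean(axis=0)
        # Small ridge term (an assumption) keeps the covariance invertible
        # when few training samples are available per class.
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
        models[c] = (mean, cov)
    return models

def classify(models, x):
    """Return the class whose Gaussian gives the highest log-likelihood."""
    best, best_ll = None, -np.inf
    for c, (mean, cov) in models.items():
        d = x - mean
        ll = -0.5 * (d @ np.linalg.solve(cov, d)
                     + np.linalg.slogdet(cov)[1]
                     + len(x) * np.log(2 * np.pi))
        if ll > best_ll:
            best, best_ll = c, ll
    return best
```

In an online setting, `classify` would be called on each incoming feature vector, and its output forwarded to the avatar's expression module.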
Online recognition of facial expressions from EMG signals (the video has been slowed down for better visualization).
The avatar replicates the recognized facial expression.
This work was supported by the Thinking Head project, a special initiative of the ARC and NH&MRC.