Avatar mirrors user's facial expressions in real time using a standard webcam
A Keio University group, led by Associate Professor Yasue Mitsukura, has developed a method for measuring which way a person is facing and how their expression changes. The system achieves high speed and high precision using only an ordinary PC and a USB camera.
"We think this system could be used by CG animation hobbyists, in Web dialog systems that show a character instead of the person's face, and for making characters move in real time at events. Because the system uses just one PC and one camera, it can be applied in many situations very easily."
To detect and track faces, this system uses time-series signal processing. It tracks characteristic points, including the eyes, nose, and mouth, at high speed and with high precision. The white dots on the screen show the points used to track the face, and the red line shows the orientation of the face. This shows the system detecting the face correctly, following both its orientation and the movement of the mouth.
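The article does not disclose how the orientation line is computed. As a rough illustration of how a facing direction could be derived from tracked feature points, here is a minimal sketch that estimates yaw from the horizontal offset of the nose relative to the eye midpoint; the landmark names, coordinates, and geometry are illustrative assumptions, not the Keio system's actual output.

```python
import math

# Hypothetical 2D landmark positions in pixels, as a tracker might report them.
# These values are made up for illustration.
landmarks = {
    "left_eye":  (220.0, 180.0),
    "right_eye": (300.0, 180.0),
    "nose":      (270.0, 230.0),
}

def estimate_yaw(points):
    """Rough yaw estimate: the nose's horizontal offset from the eye midpoint,
    normalized by the inter-eye distance. 0.0 means facing the camera; the
    sign of nonzero values depends on camera mirroring."""
    lx, ly = points["left_eye"]
    rx, ry = points["right_eye"]
    nx, _ = points["nose"]
    eye_mid_x = (lx + rx) / 2.0
    eye_dist = math.hypot(rx - lx, ry - ly)
    return (nx - eye_mid_x) / eye_dist

# Nose 10 px right of the eye midpoint, eyes 80 px apart -> 0.125
print(estimate_yaw(landmarks))
```

A real system would estimate full 3D pose from many more points, but the principle is the same: the orientation is recovered from the relative geometry of the tracked landmarks.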
"We're using an algorithm that gets updated in line with the motion of the face. So it can track the face very fast, with very high precision. That's the basic technology for this avatar system."
This system also analyzes the shape of the person's expression. So it can reproduce how the eyebrows and mouth are moving, and whether the person is laughing, angry, or surprised.
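The article does not detail how the expression shape is analyzed. A toy sketch of the general approach is to reduce landmark geometry to a few normalized features (mouth-corner lift, mouth opening, eyebrow raise) and map them to expression labels; the feature names and thresholds below are invented for illustration.

```python
# Toy expression classifier from landmark-derived features. Feature names
# and thresholds are illustrative assumptions, not the Keio system's rules.

def classify_expression(mouth_corner_lift, mouth_open, brow_raise):
    """Inputs are normalized offsets from a neutral face (positive = up/open)."""
    if mouth_open > 0.5 and brow_raise > 0.3:
        return "surprised"   # wide-open mouth plus raised brows
    if mouth_corner_lift > 0.3:
        return "laughing"    # mouth corners pulled up
    if brow_raise < -0.3 and mouth_corner_lift < 0.0:
        return "angry"       # lowered brows, downturned mouth
    return "neutral"

print(classify_expression(mouth_corner_lift=0.5, mouth_open=0.1, brow_raise=0.0))   # laughing
print(classify_expression(mouth_corner_lift=0.0, mouth_open=0.8, brow_raise=0.6))   # surprised
print(classify_expression(mouth_corner_lift=-0.2, mouth_open=0.0, brow_raise=-0.5)) # angry
```

The same per-frame features that drive classification can also drive the avatar directly, which is why the eyebrows and mouth can be reproduced continuously rather than only as discrete labels.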
Besides avatars, this system could be used in games that detect and react to changes in people's faces.
"Systems using motion capture with markers on the face aren't convenient for ordinary users. In that sense, until now, there hasn't been technology for moving a model freely in real time."
Going forward, the researchers aim to develop motion-generation software for standard PCs, which should enable a variety of ways to apply and implement this system.