I've heard a lot about it.
Do you think this will be implemented?
Sounds nice but complicated.
And I heard that the UT3 engine includes FaceFX.
Basically, this means that if Epic aren't too lazy, we could see characters moving their lips according to what we're saying over the microphone in voice chat. (This feature isn't confirmed; it's just my guess.)
Originally posted by http://www.beyondunreal.com/content/articles/168_1.php
FaceFX can be used to animate characters by hand or generate lip-synchronization and speech gesture data automatically from audio dialogue. FaceFX uses industry leading VoiceIn® speech recognition technology from Fonix to perfectly synchronize the audio to the mouth movements.
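For comparison, here is a minimal sketch of the much cruder approach games often fall back on for live voice chat, where there is no time for full speech analysis: drive a single "mouth open" weight from the loudness of the incoming audio. This is not how FaceFX itself works (per the quote above, it uses speech recognition to derive phonemes), and the file name, function name, and normalization constant below are just illustrative assumptions.

# Hypothetical amplitude-driven "lip flap" sketch, assuming a mono 16-bit PCM WAV.
# Emits one 0..1 mouth-open value per 20 ms window; a game would feed these
# values into a jaw bone or morph target each frame.
import wave
import struct
import math

def mouth_open_curve(path, window_ms=20):
    values = []
    with wave.open(path, "rb") as wav:
        assert wav.getsampwidth() == 2, "expects 16-bit PCM"
        rate = wav.getframerate()
        window = max(1, rate * window_ms // 1000)
        while True:
            frames = wav.readframes(window)
            if not frames:
                break
            samples = struct.unpack("<%dh" % (len(frames) // 2), frames)
            rms = math.sqrt(sum(s * s for s in samples) / len(samples))
            # Map RMS loudness onto a 0..1 weight; 8000 is an arbitrary tuning knob.
            values.append(min(1.0, rms / 8000.0))
    return values

if __name__ == "__main__":
    # "voice_clip.wav" is a placeholder input file.
    for i, v in enumerate(mouth_open_curve("voice_clip.wav")):
        print("%.2fs  mouth_open=%.2f" % (i * 0.02, v))

The obvious limitation is that loudness alone can't distinguish an "oh" from an "ee", which is exactly why FaceFX goes the phoneme-recognition route for proper lip sync.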