The dead-eyed NPC has been the butt of endless jokes and memes for about as long as 3D games have existed. In big-time open-world RPGs, you spend most of your time talking to digital faces. Whether that's in Deus Ex, Skyrim, or even The Witcher 3, an NPC that seems off in any way can ruin every gamer's favorite buzzword: "immersion." This often leaves NPCs in the tough spot of either looking like a puppet or looking like Angelina Jolie in the 2007 Beowulf movie.
Now, with the release of Cyberpunk 2077, a new facial-animation system has entered the video game industry. Jali Technology has built a system that takes all 44 English phonemes and animates them onto an NPC's face in sync with an audio file, driven by machine learning and the basic positions of the jaw, lips, and tongue. Jali's tech also goes beyond English, supporting phonemes from other languages, which lets Cyberpunk 2077 lip-sync its many localizations with relative ease.
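To get an intuition for the phoneme-driven approach, here is a minimal, purely illustrative sketch in Python. It is not Jali's actual model: the phoneme labels, pose parameters, and numeric values are all assumptions for demonstration. The idea is just that each phoneme maps to a mouth pose (a "viseme"), and timed phonemes from the audio become animation keyframes.

```python
# Illustrative sketch only: phoneme -> mouth-pose lookup, timed against audio.
# All names and values here are hypothetical, not JALI's real parameters.
from dataclasses import dataclass

@dataclass
class Viseme:
    jaw_open: float   # 0.0 = closed, 1.0 = fully open
    lip_round: float  # 0.0 = spread, 1.0 = rounded

# A tiny subset of the ~44 English phonemes with plausible mouth poses.
VISEME_TABLE = {
    "AA": Viseme(jaw_open=0.9, lip_round=0.1),  # as in "father"
    "IY": Viseme(jaw_open=0.2, lip_round=0.0),  # as in "see"
    "UW": Viseme(jaw_open=0.3, lip_round=0.9),  # as in "blue"
    "M":  Viseme(jaw_open=0.0, lip_round=0.2),  # lips pressed shut
}

def keyframes(timed_phonemes):
    """Turn (phoneme, start_seconds) pairs into (time, pose) keyframes."""
    return [(t, VISEME_TABLE[p]) for p, t in timed_phonemes if p in VISEME_TABLE]

# Phoneme timings would come from analyzing the dialogue audio; here
# they are hand-written for the word "moo-ee".
frames = keyframes([("M", 0.00), ("UW", 0.08), ("IY", 0.25)])
```

A real system would blend smoothly between poses, add tongue and expression channels, and learn the mapping from data rather than hand-coding it, but the table-plus-timing structure is the core of why adding a new language mostly means adding its phonemes.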
So what does the future hold for this technology, and will it bring us closer to crossing the uncanny valley?