Adobe’s Character Animator wins an Emmy

The National Academy of Television Arts and Sciences recognizes Character Animator as a “pioneering system for live performance-based animation based on facial recognition.”

I’d like to thank my early ancestors: pen and paper. Character Animator closes the loop between character and animator.

Adobe’s Character Animator takes motion capture to the desktop. Animated characters can be driven by actors using just a computer’s webcam and microphone. Using Adobe’s Sensei AI, the software analyzes the user’s facial movements and audio and maps them onto simple animated characters. Someday it may transform the way facial animation is created. But long before that, Character Animator is giving people a new medium for creation and self-expression. Anyone can use it, but it seems to be a natural fit for TV. So far, it has been used for The Simpsons, Our Cartoon President, and more recently for Tooning Out the News, a new show produced with Stephen Colbert.

For more on the history of and previous work on Character Animator, see the blog post in which Adobe’s Bryan Lamkin, EVP/GM of Digital Media, celebrates the award.