
This year, we celebrated International Animation Day a little early — at Silicon Valley Fashion Week?! Dave Werner, senior experience designer on our Character Animator team, helped the character Growly McMonsterpants talk about his style choices with host Mustafa Khan. Dave used augmented reality and live animation to perform the comedy sketch, and Adobe even got a shout-out in The New York Times.

Adobe Character Animator has been in development since 2013, and even though the product is still in preview, the technology has been in high demand. It has already been used successfully for live episodes of “The Simpsons” and to create live interaction between Stephen Colbert and Cartoon Donald Trump and Cartoon Hillary Clinton on “The Late Show.”

Character Animator makes it possible for anyone to animate any asset — whether illustration or photo — and make it come alive. Because it tracks a user’s facial movements and relies on performance-based animation rather than keyframes, animation becomes much easier and more fluid.

Partnering Research and Development

New technology emerges all the time, but there is usually a lag before it transfers to commercial products. Character Animator combined research and engineering from the initial product concept, as opposed to transferring technology into an existing product.

Adobe Researcher Jovan Popovic sought to simplify animation with new techniques that relied on performance- and physics-based simulation. As this work matured, he began to advocate for simpler animation tools within Adobe products. At first, it was difficult to convey how tedious keyframe animation could be made simpler.

“After talking past product teams for some time, I realized I needed to change my approach. I modified my presentation to talk about simulation and performances to automate animation, rather than just saying that we should make animation easier,” explains Jovan.

At the same time, David Simons, senior principal scientist and one of the original creators of After Effects, was watching After Effects customers create complicated stacks of layers and use expressions to get animated characters — they were doing a lot of extra work to make it happen. Because David and Jovan both worked in the same office building in Seattle, it became natural for the two of them to talk — over lunch, many times. And that’s how Character Animator was born.

Jovan recalls, “I had access to the technology, but David brought the product development team. Five of us — David Simons, Dan Wilk, Jim Acquavella, Wil Li and I from Research — started by asking a lot of questions about what we needed to develop and how to make it happen.

“It’s been fantastic to be able to conceive of a product in an unexplored area. I love having discussions between two points of view and then having some consensus emerge for what we think is the best. Character Animator is unlike any other product at Adobe simply because researchers have been involved from the early stages and so the product has had really awesome technology — things that no one else can do — behind it since its inception. Wil and I did whatever needed to be done, just like the engineers did whatever needed to be done. It’s been a great partnership of researchers doing product things and product people doing research things.”

And at David’s insistence, the entire team participated in customer research and customer visits, because, according to David, “If you don’t know who the customers are or can’t relate to them, it’s hard to make good decisions.”

[Photo from Silicon Valley Fashion Week, 2016]

Applying Technology to New Products

Character Animator is designed for two different types of artists: those who want to easily create character animations using two-dimensional artwork, and those who want to rig complex characters without creating a confusing tangle of expressions.

With computer animation, geometric modeling, and physical simulation, the team developed a tool that makes animating feel organic. Through a webcam, users create animation by performing the actions they want their character to make; whatever the camera captures, the character mimics on screen. This includes not only gross motor movements, but also detailed facial expressions such as mouth shape (open, smile, neutral), head position (rotation, angle, scale), and eyebrow movement.
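The core idea is that each captured frame drives the puppet directly, with no keyframes in between. Here is a minimal sketch of that mapping in TypeScript; all type and field names are hypothetical and are not Character Animator’s actual API.

```typescript
// Performance-driven rigging sketch: tracked face data drives puppet
// parameters directly, frame by frame, with no keyframes involved.
// All names here are hypothetical, for illustration only.

interface FaceFrame {
  headRotation: number;   // degrees, from the face tracker
  headScale: number;      // apparent head size, a proxy for distance to camera
  mouthOpenness: number;  // 0 (closed) to 1 (fully open)
  smile: number;          // 0 (neutral) to 1 (full smile)
  browRaise: number;      // -1 (furrowed) to 1 (raised)
}

interface PuppetPose {
  headAngle: number;
  headScale: number;
  mouthShape: "neutral" | "open" | "smile";
  browOffset: number;
}

// Each webcam frame maps straight to a pose; the performance IS the animation.
function poseFromFrame(f: FaceFrame): PuppetPose {
  return {
    headAngle: f.headRotation,
    headScale: f.headScale,
    mouthShape:
      f.mouthOpenness > 0.4 ? "open" : f.smile > 0.5 ? "smile" : "neutral",
    browOffset: f.browRaise * 10, // pixels of eyebrow travel
  };
}
```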

Additionally, the tool makes cartoon-style lip sync fully automatic. Lip movements are generated from audio rather than video, so the character’s lips move in reaction to what the user is saying. Using machine learning, the audio is classified into one of 61 phonemes, and those phonemes are then mapped to 11 mouth shapes, or visemes. Character Animator recognizes a variety of languages, but it currently works best with English.
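To make the phoneme-to-viseme step concrete, here is a small sketch of that lookup: many phonemes collapse onto the same mouth drawing. The viseme names and groupings below are illustrative, not Adobe’s actual 61-phoneme, 11-viseme table.

```typescript
// Speech is first classified into phonemes; many phonemes then collapse
// onto a much smaller set of mouth drawings (visemes). The names and
// groupings below are illustrative, not Adobe's actual mapping.

type Viseme = "Ah" | "D" | "Ee" | "F" | "L" | "M" | "Oh" | "R" | "S" | "Uh" | "W-Oo";

// A few representative phoneme -> viseme entries (ARPAbet-style symbols).
const phonemeToViseme: Record<string, Viseme> = {
  AA: "Ah", AE: "Ah", AH: "Uh",
  B: "M", P: "M", M: "M",   // lips pressed together
  F: "F", V: "F",           // lower lip tucked under the teeth
  IY: "Ee", EY: "Ee",
  OW: "Oh", AO: "Oh",
  UW: "W-Oo", W: "W-Oo",
  S: "S", Z: "S",
  L: "L", D: "D", T: "D",
};

// Convert a recognized phoneme sequence into the mouth drawings to display,
// falling back to a closed mouth for phonemes this sketch doesn't cover.
function lipSync(phonemes: string[]): Viseme[] {
  return phonemes.map((p) => phonemeToViseme[p] ?? "M");
}

// e.g. lipSync(["HH", "AH", "L", "OW"]) -> ["M", "Uh", "L", "Oh"]
```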

The Simpsons team worked closely with Adobe to adjust Character Animator to work the way they wanted it to. For instance, they wanted Homer’s lips to sync better with his voice, so the team engineered a solution and that enhancement is now included in the software for everyone.

With Cartoon Donald Trump, the animators loved how his body warped into a recoil when the performer angled away to the edge of the camera. But with The Simpsons, the creators wanted Homer to animate the same way he does in the show — a style they perfected over 20 years. Their team created numerous drawings of how Homer should look at certain points and used keystrokes to trigger those drawings during a performance.
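Conceptually, that workflow is a small trigger system: each key is bound to a replacement drawing, and pressing it swaps the visible artwork while the live performance continues. Here is a rough sketch under that assumption; the names and structure are hypothetical, not how Character Animator implements triggers.

```typescript
// Keystroke-triggered artwork swaps: a hypothetical sketch of the workflow
// described above, not Character Animator's actual trigger system.

interface TriggerRig {
  bindings: Map<string, string>; // key -> name of the replacement drawing
  defaultPose: string;
  activePose: string;
}

function makeRig(defaultPose: string, bindings: [string, string][]): TriggerRig {
  return { bindings: new Map(bindings), defaultPose, activePose: defaultPose };
}

// Pressing a bound key swaps in the pre-drawn pose; unbound keys do nothing.
function onKeyDown(rig: TriggerRig, key: string): void {
  rig.activePose = rig.bindings.get(key) ?? rig.activePose;
}

// Releasing the key returns the puppet to its default artwork.
function onKeyUp(rig: TriggerRig): void {
  rig.activePose = rig.defaultPose;
}

// Usage: hand-drawn poses bound to number keys for a live performance.
const homer = makeRig("homer-neutral", [
  ["1", "homer-doh"],
  ["2", "homer-laugh"],
]);
onKeyDown(homer, "1"); // homer.activePose === "homer-doh"
onKeyUp(homer);        // back to "homer-neutral"
```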

Making Character Animator Work for You

Despite animation’s new accessibility to people who never thought they would have the time or the talent for it, Character Animator was not developed to replace professional animators. On the contrary, the program is designed so that it never dictates how your artwork is drawn or how your character moves. Any image you can create can be animated in any way you want.

We believe this new technology will provide more opportunities for animators, because they can now create more in less time and on a smaller budget, fueling increased demand for great animation. In fact, we envision traditional animators creating detailed hand-drawn puppets that can be packaged for use by non-animators in the same organization or sold through a marketplace, similar to the way stock images are transacted today.

We’ll have some exciting updates at Adobe MAX next week, so if you’ll be there, please feel free to stop by and talk about what you think the product should be doing. If you’re already using Character Animator, we’d love to have you share your feedback in our user-to-user forum.
