Character Animator is a character animation application that lets you bring expressive characters to life from your own artwork.
How does it work?
Adobe Character Animator lets you import and animate artwork created in Adobe Photoshop or Adobe Illustrator. You record a performance with your camera and microphone, and as you perform, Character Animator makes the character mimic your facial expressions, synchronizes its lips with your speech, and gives you full control over every aspect of its movement.
You can animate a character you got from another creator, or animate your own artwork from Photoshop or Illustrator. You can even record your own behaviors or use existing ones from other sources.
Who is it for?
Digital artwork is often created from scratch: the artist starts with a sketch or line drawing and gradually paints it in, working on textures, light, and shadow, much as a traditional pencil-and-brush artist would. But applications and technologies like this one open up other possibilities for creativity.
Details
Parameters such as scale, position, and rotation become available for the character. A built-in graph editor lets you control the speed of a change and ease the character smoothly into the desired position. You can also create a "replay" or "trigger": a pre-animated sequence that you switch on at the right moment.
Thanks to webcam capture, Character Animator is a very useful tool for synchronizing animation with character dialogue in real time. The animators of The Simpsons, for example, used it to let Homer answer calls from real viewers live on air.
Where the characters come from
The workflow begins with creating a character in Illustrator or Photoshop. You can either draw the whole character yourself or download eyes, mouths, and other parts from Creative Cloud Market. If the layer names correspond to body parts (chest, head, eyes, mouth), no further adjustment of the animation skeleton is required (an example layer structure is sketched below); you can also assign layers to body parts directly in Character Animator. You then control the character yourself while sitting in front of the webcam, and any changes made to the artwork in Photoshop or Illustrator automatically appear in Character Animator.
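For illustration, here is a hypothetical layer structure of the kind Character Animator can rig automatically. The exact tag names and the "+" prefix marking independent layers may differ between versions, so treat this as an assumed example rather than the definitive convention.

```
+Character
   +Head
      Left Eyebrow
      Right Eyebrow
      +Left Eye
      +Right Eye
      Mouth
   Body
      Left Arm
      Right Arm
```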
Tracking facial expressions
A webcam can be used to track facial expressions: the position and rotation of the face, its distance from the camera, and the movement of features such as the lips and eyebrows are all monitored. Whatever emotion you portray, joy or surprise, the face tracker picks it up and applies it to the character.
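Conceptually, this kind of tracking boils down to detecting facial landmarks in every webcam frame and mapping their positions onto puppet parameters. The sketch below is not Adobe's implementation; it assumes the OpenCV and MediaPipe libraries purely to illustrate the idea.

```python
# Conceptual sketch of webcam face tracking (not Adobe's implementation).
# Assumed dependencies: pip install opencv-python mediapipe
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True)
cap = cv2.VideoCapture(0)          # default webcam

for _ in range(300):               # process a few seconds of frames
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB input; OpenCV delivers BGR frames.
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # Distance between the inner upper and lower lip landmarks
        # (indices 13 and 14 in the MediaPipe face mesh) as a rough
        # measure of how open the mouth is.
        mouth_open = abs(lm[13].y - lm[14].y)
        # A real rig would feed values like this into puppet parameters.
        print(f"mouth openness: {mouth_open:.3f}")

cap.release()
```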
Automatic limb animation
While Character Animator controls the face automatically, you can move the character's limbs around with the mouse. You can also add behaviors, such as making the character's jaw and mouth move vertically like a ventriloquist's dummy, or triggering hand gestures.
Automatic lip sync with sound (Lip Sync)
Just speak into the microphone while sitting in front of the webcam, and your character will talk along with you.
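Lip Sync analyzes the audio for phonemes and maps them to mouth shapes (visemes). As a much simpler illustration of the idea, the sketch below picks a mouth drawing from the loudness of each audio chunk; the thresholds and layer names are made-up assumptions, not Adobe's algorithm.

```python
# Simplified, loudness-based lip sync sketch (real lip sync analyzes phonemes
# and maps them to visemes; this is only an illustration of the idea).
# Assumed dependency: numpy. The thresholds and mouth layer names are made up.
import numpy as np

def mouth_shape(samples: np.ndarray) -> str:
    """Pick a mouth drawing for one chunk of audio samples in [-1, 1]."""
    rms = float(np.sqrt(np.mean(samples ** 2)))
    if rms < 0.02:
        return "mouth_closed"
    if rms < 0.10:
        return "mouth_half_open"
    return "mouth_open"

# Example: a 50 ms, 3 kHz tone at moderate volume maps to an open mouth.
t = np.linspace(0.0, 0.05, 2205)            # 50 ms at 44.1 kHz
chunk = 0.3 * np.sin(2.0 * np.pi * 3000.0 * t)
print(mouth_shape(chunk))                   # -> mouth_open
```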
Physics-based animation
When a character's body parts react to its surroundings, it adds believability. Adobe Character Animator has a Dangle behavior that controls how parts of the character swing and hang. For example, a character's long hair can sway left and right when it shakes its head, and its ears can bounce up and down. You can also add ambient animations such as falling snow or floating bubbles.
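A minimal way to picture this swing-and-settle motion is a damped spring: the dangling tip is pulled back under its anchor, and its swing dies out over time. The sketch below is not Adobe's solver; the stiffness and damping values are arbitrary.

```python
# Conceptual swing-and-settle sketch for a Dangle-style behavior (not Adobe's
# solver): the tip of a dangling strand is pulled back under its moving anchor
# by a damped spring, integrated with semi-implicit Euler at 60 fps.
import math

stiffness = 40.0    # spring constant pulling the tip back under the anchor
damping = 4.0       # velocity damping; higher values kill the swing faster
dt = 1.0 / 60.0     # simulation step, one frame at 60 fps

x, vx = 0.0, 0.0    # horizontal offset and velocity of the strand tip

for frame in range(181):                 # simulate three seconds
    t = frame * dt
    # The anchor (the character's head) shakes left-right for one second.
    anchor_x = 0.2 * math.sin(8.0 * t) if t < 1.0 else 0.0
    ax = -stiffness * (x - anchor_x) - damping * vx
    vx += ax * dt
    x += vx * dt
    if frame % 30 == 0:
        print(f"t={t:.1f}s  tip offset={x:+.3f}")
```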
Breathe and other behaviors
The Breathe behavior automatically controls how often and how much the character's chest expands. You can restrict movement to certain parts of the body and leave others at rest, for example fixing the torso and moving only the arms and head.
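As an illustration of what such a behavior computes, the sketch below oscillates the vertical scale of a chest layer on a smooth cosine curve; the breathing rate and depth are made-up values, not Character Animator's defaults.

```python
# Illustrative Breathe-style oscillation (rate and depth are made-up values,
# not Character Animator's defaults): the chest layer's vertical scale rises
# and falls on a smooth cosine curve.
import math

breaths_per_minute = 15
depth = 0.03                     # chest expands by up to 3% at the peak

def chest_scale(t: float) -> float:
    """Vertical scale factor of the chest layer at time t (in seconds)."""
    rate_hz = breaths_per_minute / 60.0
    return 1.0 + depth * 0.5 * (1.0 - math.cos(2.0 * math.pi * rate_hz * t))

for t in range(0, 5):
    print(f"t={t}s  scale={chest_scale(float(t)):.3f}")
```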
Recording and editing
A scene can be recorded in parts, across many takes, and each recording appears as a separate track in the Timeline panel. By trimming and shifting these tracks you assemble the final scene, which can then be exported to Adobe After Effects CC for further editing, to Adobe Premiere Pro CC, or to Adobe Media Encoder CC.
As Forbes points out, Character Animator can take custom artwork and add realistic physics to it. Adobe was not a pioneer here, but its innovation is the ability to use a webcam to track movement and facial expressions and apply them to illustrations in real time.