The LipSync Component inspector is custom-designed to simplify setting up a character with
the poses and gestures they need to play back LipSyncData animations.
When a new LipSync component is added to a GameObject, the inspector will look like this:
You can select a Blend System from the drop-down. Any script in the project that inherits from
the BlendSystem class will be shown here. When this choice is changed, the editor will handle
the adding/removing of Blend Systems for you. Different Blend Systems are designed to work with
different types of character, and so will have different options and requirements, but most will
require some kind of renderer to be assigned. Some, such as the Sprite Blend System, may delegate
this to another component, which will be added automatically.
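The inheritance mechanism described above can be sketched as a small custom subclass. This is a hypothetical example only: the namespace, the overridden member, and its signature are assumptions for illustration, not the actual BlendSystem contract, so check the scripting reference before basing a real Blend System on it.

```csharp
using UnityEngine;
using RogoDigital.Lipsync; // namespace is an assumption

// Hypothetical custom Blend System. Because it inherits from BlendSystem,
// it would appear in the LipSync component's Blend System drop-down.
public class MyBlendSystem : BlendSystem
{
    // Most Blend Systems require some kind of renderer to be assigned.
    public SkinnedMeshRenderer targetRenderer;

    // Illustrative override (method name and parameters are assumptions):
    // apply a blendable's weight to the character's mesh.
    public override void SetBlendableValue(int blendable, float value)
    {
        if (targetRenderer != null)
            targetRenderer.SetBlendShapeWeight(blendable, value);
    }
}
```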
Once a Blend System is chosen (and has any required fields filled in), the full LipSync Inspector
will be visible. It consists of the following parts:
Blend System Settings
The contents of this section vary depending on which Blend System has been added, but it
will contain all the public options, such as the Mesh or Sprite Renderer to be used. The
BlendSystem Commands area below it is also determined by the Blend System chosen.
Editor Tabs
These tabs switch between the Phoneme, Emotion and Gesture sections of the inspector. This row
also contains the presets button, which brings up a menu for loading and saving preset phoneme & gesture poses.
In Phoneme and Emotion modes, this section displays a list of poses. Clicking one will
expand the pose editor to set up that pose. In Gesture mode, a list of gestures is shown
here instead, with fields for assigning animation clips. See the Phonemes & Emotions
and Gestures pages for more detail on each.
This section contains the bulk of the settings for an individual LipSync component. They
are as follows:
Additionally, below the AudioSource at the top of the inspector, is the Use Bone Transforms checkbox.
When this is checked, it will be possible to add Transforms to poses, along with Blendables (blend shapes, Sprites, etc.).
Determines which method is used to play back LipSyncData animations:
Audio Playback mode uses the playback position of the audio clip. This
ensures perfect synchronisation between audio and animation, but won't work well
with non-real-time rendering. If the animation has no audio, Custom Timer will be used.
Custom Timer mode uses a built-in timer based on Time.deltaTime. Technically
it allows audio and visuals to become out-of-sync, but it's compatible with WebGL
and some mobile platforms that don't work correctly with Audio Playback mode.
Fixed Frame Rate mode plays the animation back at a fixed frames-per-second
(FPS) rate. This mode is good for non-real-time rendering such as VRPanorama Pro.
If this option is chosen, a Frame Rate option is revealed below.
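As a rough illustration of the difference between the two real-time modes, the sketch below uses only standard Unity APIs (AudioSource.time and Time.deltaTime). It is not LipSync's actual implementation, just the general mechanism each mode relies on.

```csharp
using UnityEngine;

// Illustration of how the two real-time playback modes track position.
public class PlaybackTimerSketch : MonoBehaviour
{
    public AudioSource source;
    float customTimer;

    void Update()
    {
        // Audio Playback mode: read the position straight from the clip,
        // so the animation can never drift from the audio.
        float audioTime = source.isPlaying ? source.time : 0f;

        // Custom Timer mode: accumulate deltaTime instead. This works on
        // platforms where AudioSource.time is unreliable (e.g. WebGL),
        // but audio and visuals can technically drift apart.
        customTimer += Time.deltaTime;
    }
}
```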
Play on Awake
If checked, a LipSyncData clip will be played when the component awakes. This
functions the same as on an AudioSource. The following options are revealed when it is checked:
The LipSyncData clip that will be played on awake.
The delay, in seconds, before the animation will start playing.
If true, any clip played on this component will loop back to the start when finished.
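The same result can be achieved from script instead of using Play on Awake. This is a hedged sketch: the Play overload and its delay parameter are assumptions based on the options above, so consult the scripting reference for the real signature.

```csharp
using UnityEngine;
using RogoDigital.Lipsync; // namespace is an assumption

public class DialogueTrigger : MonoBehaviour
{
    public LipSync lipSync;     // the character's LipSync component
    public LipSyncData clip;    // the animation to play

    void Start()
    {
        // Assumed overload: play 'clip' after a 0.5 second delay,
        // mirroring the Clip and Delay fields in the inspector.
        lipSync.Play(clip, 0.5f);
    }
}
```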
Scale Audio Speed
Whether to set the audio's playback speed to match the current timescale (Edit >
Project Settings > Time). By default, audio in Unity disregards the timescale, so pausing
or slowing down time has no effect on dialogue unless this is checked.
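Conceptually, Scale Audio Speed does something like the following with standard Unity APIs (AudioSource.pitch and Time.timeScale); the actual implementation may differ.

```csharp
using UnityEngine;

// Conceptual sketch of the Scale Audio Speed behaviour.
public class ScaleAudioSketch : MonoBehaviour
{
    public AudioSource source;

    void Update()
    {
        // Unity audio ignores Time.timeScale by default. Matching the
        // pitch to the timescale makes slow motion (or a pause, where
        // timeScale = 0) affect dialogue playback as well.
        source.pitch = Time.timeScale;
    }
}
```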
If there are no phonemes within this many seconds in an animation, the character
will go back to its Rest pose (or neutral if Rest is not available).
Pre-Rest Hold Time
The length of time, in seconds, the animation will take to transition to or from
a neutral or rest state.
Phoneme Curve Generation Mode
Determines what method is used to generate animation curves for phonemes.
Loose will create smoother curves that can look more natural, but are prone
to overshooting on fast changes (so it may cause exaggerated poses).
Tight will ensure that the animation never overshoots, but may give a slightly
more robotic look to animations.
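The overshoot trade-off can be demonstrated with plain AnimationCurve code. This is an illustration of the behaviour only, not LipSync's actual curve generator.

```csharp
using UnityEngine;

public static class CurveModesSketch
{
    // Loose-style: smoothed tangents give a natural ease, but sampling
    // between closely spaced keys with very different values can exceed
    // the key values themselves (overshoot).
    public static AnimationCurve LooseStyle()
    {
        var curve = new AnimationCurve(
            new Keyframe(0f, 0f),
            new Keyframe(0.05f, 1f),
            new Keyframe(0.1f, 0f));
        for (int i = 0; i < curve.length; i++)
            curve.SmoothTangents(i, 0f);
        return curve;
    }

    // Tight-style: linear interpolation between keys never exceeds the
    // key values, at the cost of a stiffer, more robotic look.
    public static AnimationCurve TightStyle()
    {
        return AnimationCurve.Linear(0f, 0f, 0.1f, 1f);
    }
}
```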
Keep Emotion When Finished
If true, an emotion marker that doesn't blend out before the end of a clip will remain set
on the character after the clip finishes playing. Otherwise the emotion will simply blend out.
You can call .ResetEmotion from a script to blend back to neutral at a later point.
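A minimal sketch of that call, assuming the component type is named LipSync and ResetEmotion takes no arguments (neither is confirmed here):

```csharp
using UnityEngine;
using RogoDigital.Lipsync; // namespace is an assumption

public class EmotionResetter : MonoBehaviour
{
    public LipSync lipSync;

    // Call this later (e.g. from a dialogue system) to blend a held
    // emotion back to neutral after a clip has finished.
    public void ClearEmotion()
    {
        lipSync.ResetEmotion();
    }
}
```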
Emotion Curve Generation Mode
Animation curve generation method for emotions. See 'Phoneme Curve Generation Mode' for details.
Checking Use Bone Transforms will also reveal another option, Account for Animation. With this checked,
bones that are both used in one or more poses and controlled by an Animator will
work better. Note: Any transforms that are in poses but aren't controlled by an
Animator won't animate correctly! Only turn this on if you are using major, animated bones
in your poses.