Live Link Face (Intermediate)

Live Link Face allows you to drive MetaHuman facial animations in real-time using an iPhone with ARKit face tracking. This enables performance capture workflows where an actor's expressions are mirrored onto a digital character instantly — no markers, headsets, or expensive mocap equipment required.

Requirements

Item        Requirement
iPhone      iPhone X or later (TrueDepth camera required)
App         Live Link Face app (free on the App Store)
Network     iPhone and PC on the same local network
UE Plugin   Live Link and Apple ARKit plugins enabled in Unreal

Setup Steps

  1. Enable Plugins

    In Unreal Engine, enable the Live Link, Apple ARKit, and Apple ARKit Face Support plugins. Restart the editor.

  2. Configure Live Link Source

    Open Window → Live Link. Add a new source of type "Apple ARKit Face". The iPhone will appear once it connects.

  3. Connect the iPhone

    In the Live Link Face app, enter your PC's IP address and tap Connect. You should see tracking data streaming in the Live Link panel.

  4. Map to MetaHuman

    In the MetaHuman's Animation Blueprint, add a Live Link Pose node. Map the ARKit blend shapes to the MetaHuman face control rig inputs.
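For step 3 you need your PC's local IP address. As a convenience, here is a small self-contained Python sketch that asks the OS which LAN address it would use for outbound traffic (the `8.8.8.8` target is arbitrary; no packet is actually sent by a UDP connect). You can equally use `ipconfig` on Windows or `ifconfig`/`ip addr` elsewhere.

```python
import socket

def local_ip() -> str:
    """Return the LAN IP this machine routes outbound traffic through."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # UDP connect() sends nothing; it only makes the OS pick an interface.
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        # Fallback when no route is available (may return 127.0.0.1).
        return socket.gethostbyname(socket.gethostname())
    finally:
        s.close()

print(local_ip())  # e.g. 192.168.1.42
```

Enter the printed address in the Live Link Face app's target list.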

ARKit Blend Shapes

ARKit provides 52 blend shape coefficients covering eyebrows, eyes, cheeks, jaw, lips, and tongue. MetaHuman's face rig natively accepts these values, making the mapping straightforward. Fine-tune with multiplier curves to match your character's facial proportions.
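The multiplier-curve idea above can be sketched outside the engine as a simple per-shape gain table. The blend shape names (`jawOpen`, `eyeBlinkLeft`, `browInnerUp`, …) are ARKit's real coefficient names, but the multiplier values here are illustrative examples, not recommended settings:

```python
# Example per-shape gains; tune these to your character, they are not defaults.
ARKIT_MULTIPLIERS = {
    "jawOpen": 0.9,      # tone down an over-wide jaw
    "eyeBlinkLeft": 1.0,
    "eyeBlinkRight": 1.0,
    "browInnerUp": 1.2,  # exaggerate subtle brow motion
}

def remap(coeffs: dict) -> dict:
    """Scale raw ARKit coefficients (0..1) and clamp the result back to 0..1."""
    return {
        name: min(1.0, max(0.0, value * ARKIT_MULTIPLIERS.get(name, 1.0)))
        for name, value in coeffs.items()
    }

print(remap({"jawOpen": 1.0, "browInnerUp": 0.5}))
# {'jawOpen': 0.9, 'browInnerUp': 0.6}
```

Clamping matters because a gain above 1.0 can push a coefficient past the 0..1 range the face rig expects.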

Recording Tip: Use the Take Recorder in Unreal Engine to capture Live Link data as animation sequences. This lets you review, edit, and composite performances in Sequencer just like traditional keyframe animation.

Troubleshooting

  • If the connection drops, ensure both devices are on the same subnet and no firewall is blocking UDP traffic
  • Calibrate the actor's neutral face in the app before each session for best results
  • Reduce buffer size in Live Link settings for lower latency (at the cost of some jitter)
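When debugging a dropped connection, it can help to confirm that UDP packets from the iPhone are reaching the PC at all, independent of Unreal. The sketch below listens briefly on a port and reports whether any datagram arrives; 11111 is assumed here as the app's default port, so verify it against the port shown in the app's settings, and run this with the editor closed, since only one process can bind the port:

```python
import socket

def packets_arriving(port: int = 11111, timeout: float = 3.0) -> bool:
    """Listen on a UDP port for `timeout` seconds; True if any datagram arrives."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.bind(("0.0.0.0", port))  # fails if another process holds the port
        sock.recvfrom(2048)           # wait for a single datagram
        return True
    except socket.timeout:
        return False
    finally:
        sock.close()

if __name__ == "__main__":
    print("receiving" if packets_arriving() else "no packets seen")
```

If this reports no packets while the app says it is connected, suspect the firewall or subnet configuration rather than Unreal's Live Link settings.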