Live Link Intermediate
Live Link Face lets you drive MetaHuman facial animation in real time using an iPhone with ARKit face tracking. This enables performance-capture workflows where an actor's expressions are mirrored onto a digital character instantly, with no markers, headsets, or expensive mocap equipment required.
Requirements
| Item | Requirement |
|---|---|
| iPhone | iPhone X or later (TrueDepth camera required) |
| App | Live Link Face app (free on App Store) |
| Network | iPhone and PC on the same local network |
| UE Plugin | Live Link and Apple ARKit plugins enabled in Unreal |
Setup Steps
- Enable Plugins: In Unreal Engine, enable the Live Link, Apple ARKit, and Apple ARKit Face Support plugins, then restart the editor (a .uproject sketch follows this list).
- Configure Live Link Source: Open Window → Live Link and add a new source of type "Apple ARKit Face". The iPhone will appear once it connects.
- Connect the iPhone: In the Live Link Face app, enter your PC's IP address and tap Connect. You should see tracking data streaming in the Live Link panel (see the verification sketch after this list).
- Map to MetaHuman: In the MetaHuman's Animation Blueprint, add a Live Link Pose node and map the ARKit blend shapes to the MetaHuman face control rig inputs.
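If you want the plugin state captured in version control, the same three plugins can also be declared in the project's .uproject file. A minimal sketch, assuming the internal plugin names LiveLink, AppleARKit, and AppleARKitFaceSupport match your engine version:

```json
{
  "FileVersion": 3,
  "Plugins": [
    { "Name": "LiveLink", "Enabled": true },
    { "Name": "AppleARKit", "Enabled": true },
    { "Name": "AppleARKitFaceSupport", "Enabled": true }
  ]
}
```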
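Once the phone is connected (steps 2 and 3), you can sanity-check the stream from the editor side by polling the Live Link client for its current subjects. A minimal C++ sketch, assuming your module's Build.cs lists LiveLinkInterface as a dependency; LogLiveLinkSubjects is a hypothetical helper name:

```cpp
// Minimal sketch: log every subject the Live Link client currently sees.
#include "CoreMinimal.h"
#include "ILiveLinkClient.h"
#include "Features/IModularFeatures.h"

void LogLiveLinkSubjects()
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return; // Live Link plugin is not loaded
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // Each connected device (e.g. the iPhone) appears as a subject key.
    const TArray<FLiveLinkSubjectKey> Subjects =
        Client.GetSubjects(/*bIncludeDisabledSubject=*/false, /*bIncludeVirtualSubject=*/false);

    for (const FLiveLinkSubjectKey& Key : Subjects)
    {
        UE_LOG(LogTemp, Log, TEXT("Live Link subject: %s"), *Key.SubjectName.Name.ToString());
    }
}
```

An empty list after tapping Connect usually means the phone never reached the editor, which points back to the network checks in the Troubleshooting section.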
ARKit Blend Shapes
ARKit provides 52 blend shape coefficients covering eyebrows, eyes, cheeks, jaw, lips, and tongue. MetaHuman's face rig natively accepts these values, making the mapping straightforward. Fine-tune with multiplier curves to match your character's facial proportions.
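As a sketch of that fine-tuning, the snippet below scales selected ARKit curves before they drive the rig. ApplyShapeMultipliers is a hypothetical helper; the curve names and tuning values are illustrative and may differ from your rig's conventions:

```cpp
// Hedged sketch: scale selected ARKit curves before they drive the face rig.
#include "CoreMinimal.h"

static TMap<FName, float> ApplyShapeMultipliers(const TMap<FName, float>& RawShapes)
{
    // Example per-shape gains; calibrate against your character's proportions.
    static const TMap<FName, float> Multipliers = {
        { FName(TEXT("JawOpen")),     0.85f },  // damp a jaw that reads too wide
        { FName(TEXT("BrowInnerUp")), 1.20f },  // boost subtle brow motion
    };

    TMap<FName, float> Scaled = RawShapes;
    for (TPair<FName, float>& Shape : Scaled)
    {
        if (const float* Gain = Multipliers.Find(Shape.Key))
        {
            // ARKit coefficients arrive in [0, 1]; keep the result in range.
            Shape.Value = FMath::Clamp(Shape.Value * (*Gain), 0.0f, 1.0f);
        }
    }
    return Scaled;
}
```

Clamping after scaling matters: a gain above 1.0 can otherwise push a coefficient past the range the face rig expects.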
Troubleshooting
- If the connection drops, ensure both devices are on the same subnet and no firewall is blocking UDP traffic (the app streams to UDP port 11111 by default)
- Calibrate the actor's neutral face in the app before each session for best results
- Reduce buffer size in Live Link settings for lower latency (at the cost of some jitter)