AI in Unity Best Practices
Ship production-quality AI by following proven patterns for performance, debugging, testing, and cross-platform deployment in Unity.
Performance Optimization
- Limit NavMesh recalculation: Avoid baking NavMesh every frame. Use NavMeshObstacle carving for dynamic changes and rebake only when the level layout changes significantly.
- Stagger AI updates: Do not update all agents every frame. Distribute updates across frames using coroutines or a manager that ticks a subset of agents each frame.
- Use Physics layers: Put AI perception raycasts on dedicated physics layers to reduce collision check overhead.
- Pool everything: Use object pooling for projectiles, effects, and temporary AI objects to avoid garbage collection spikes.
- Profile with Unity Profiler: Use the CPU and Memory profilers to identify AI-related bottlenecks. Look for GC allocations in AI update loops.
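The staggered-update bullet above can be sketched as a small manager that ticks one slice of registered agents per frame. The `IStaggeredAgent` interface and `TickAI()` method are illustrative names, not Unity APIs:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical agent contract: TickAI() runs the expensive decision
// logic (perception, planning), separate from per-frame movement.
public interface IStaggeredAgent { void TickAI(); }

// Ticks one slice of agents per frame. With slicesPerCycle = 4, each
// agent runs its AI every 4th frame instead of every frame.
public class AIUpdateManager : MonoBehaviour
{
    [SerializeField] private int slicesPerCycle = 4;
    private readonly List<IStaggeredAgent> agents = new List<IStaggeredAgent>();
    private int currentSlice;

    public void Register(IStaggeredAgent agent) => agents.Add(agent);
    public void Unregister(IStaggeredAgent agent) => agents.Remove(agent);

    private void Update()
    {
        // Agents whose index falls in the current slice get ticked this frame.
        for (int i = currentSlice; i < agents.Count; i += slicesPerCycle)
            agents[i].TickAI();
        currentSlice = (currentSlice + 1) % slicesPerCycle;
    }
}
```

Movement and animation (e.g. an active NavMeshAgent) still update every frame; only the decision logic is staggered, which is where the CPU savings come from.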
Debugging AI in Unity
Gizmos and Debug Drawing
Use `OnDrawGizmos()` to visualize detection ranges, NavMesh paths, behavior tree states, and perception cones directly in the Scene view.
Debug UI Overlay
Create a runtime UI that shows each agent's current state, target, health, and decision history. Toggle it with a debug key.
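A minimal sketch combining the gizmo drawing and runtime overlay described above, assuming the agent exposes a `detectionRadius` and a `CurrentState` string (both illustrative):

```csharp
using UnityEngine;

public class AIDebugView : MonoBehaviour
{
    public float detectionRadius = 10f;   // illustrative perception range
    public string CurrentState = "Idle";  // illustrative FSM state name
    private static bool showOverlay;

    // Draws the detection range in the Scene view when the agent is selected.
    private void OnDrawGizmosSelected()
    {
        Gizmos.color = Color.yellow;
        Gizmos.DrawWireSphere(transform.position, detectionRadius);
    }

    private void Update()
    {
        // Toggle the runtime overlay with a debug key (F3 here).
        if (Input.GetKeyDown(KeyCode.F3)) showOverlay = !showOverlay;
    }

    // Immediate-mode overlay; fine for debugging, not for shipping UI.
    private void OnGUI()
    {
        if (!showOverlay) return;
        Vector3 screen = Camera.main.WorldToScreenPoint(transform.position);
        GUI.Label(new Rect(screen.x, Screen.height - screen.y, 200f, 20f),
                  $"{name}: {CurrentState}");
    }
}
```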
Logging Framework
Implement a structured logging system for AI decisions. Include timestamps, agent IDs, and decision context so you can trace exactly why an agent did something.
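One possible shape for such a logger, assuming a static `AILog` helper (a hypothetical name) that wraps `Debug.Log` with a consistent, grep-able format:

```csharp
using UnityEngine;

// Minimal structured decision log: timestamp, agent ID, decision, context.
public static class AILog
{
    public static bool Enabled = true;

    public static void Decision(int agentId, string decision, string context)
    {
        if (!Enabled) return;
        // One line per decision with key=value pairs keeps logs easy to
        // filter and trace back to a specific agent and moment.
        Debug.Log($"[AI] t={Time.time:F2} agent={agentId} " +
                  $"decision={decision} context={context}");
    }
}

// Usage inside an agent:
// AILog.Decision(agentId, "Flee", "health=12 threat=Player distance=3.4");
```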
Replay System
Record agent inputs and world state so you can replay and analyze AI behavior frame-by-frame after the fact.
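A bare-bones recorder along these lines might capture one snapshot per agent per frame; a replay tool can then scrub through the list. The `AgentSnapshot` layout here is an assumption, not a Unity API:

```csharp
using System.Collections.Generic;
using UnityEngine;

// One recorded data point for one agent on one frame.
public struct AgentSnapshot
{
    public int Frame;
    public int AgentId;
    public Vector3 Position;
    public string State;   // illustrative: current FSM/BT state name
}

public class AIRecorder
{
    private readonly List<AgentSnapshot> snapshots = new List<AgentSnapshot>();

    public void Record(int agentId, Vector3 position, string state)
    {
        snapshots.Add(new AgentSnapshot
        {
            Frame = Time.frameCount,
            AgentId = agentId,
            Position = position,
            State = state
        });
    }

    // A replay/inspection tool reads this to step through history.
    public IReadOnlyList<AgentSnapshot> Snapshots => snapshots;
}
```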
Cross-Platform Deployment
| Platform | AI Considerations |
|---|---|
| Mobile | Reduce agent count, simplify BTs, use smaller neural networks, watch thermal throttling |
| Console | Full AI budget available, but test on target hardware for accurate profiling |
| WebGL | No threading, limited memory. Keep AI simple and avoid large ONNX models |
| VR | Must maintain 90fps. Strict AI time budget of 1-2ms. Offload to background threads where possible |
Testing AI
- Unit tests: Test individual behavior tree nodes and FSM transitions in isolation using Unity Test Framework.
- Integration tests: Run automated scenarios where AI agents must complete objectives. Assert on outcomes.
- Soak tests: Let agents run unattended for extended periods to catch edge cases, stuck states, and performance degradation that short runs miss.
- Playtesting: No automated test replaces human playtesters. Schedule regular playtests focused on AI behavior.
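The unit-testing bullet above can be sketched as an Edit Mode test under the Unity Test Framework (which runs NUnit). `GuardFsm` is a hypothetical minimal state machine, not part of Unity:

```csharp
using NUnit.Framework;

// Hypothetical FSM under test: Idle -> Chase on sight, back on loss.
public class GuardFsm
{
    public string State { get; private set; } = "Idle";

    public void OnTargetSeen() { if (State == "Idle") State = "Chase"; }
    public void OnTargetLost() { if (State == "Chase") State = "Idle"; }
}

public class GuardFsmTests
{
    [Test]
    public void SeeingTarget_TransitionsIdleToChase()
    {
        var fsm = new GuardFsm();
        fsm.OnTargetSeen();
        Assert.AreEqual("Chase", fsm.State);
    }

    [Test]
    public void LosingTarget_ReturnsToIdle()
    {
        var fsm = new GuardFsm();
        fsm.OnTargetSeen();
        fsm.OnTargetLost();
        Assert.AreEqual("Idle", fsm.State);
    }
}
```

Because the FSM has no UnityEngine dependencies, these tests run instantly in Edit Mode without entering Play Mode.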
Traditional AI vs ML-Agents
Start with traditional AI (NavMesh, behavior trees, FSMs) for production games. ML-Agents is excellent for research, prototyping adaptive behaviors, and specific use cases like racing line optimization. Hybrid approaches work well: use traditional AI for core behaviors and ML for specific subsystems.
Multiplayer AI
Run AI on the server to prevent cheating. Replicate only the results (positions, animations) to clients. Use server-authoritative NavMesh pathfinding and keep AI decision-making server-side. For client-side prediction, interpolate AI positions smoothly.
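The client-side interpolation mentioned above can be sketched as a lerp toward the latest authoritative position instead of snapping. `OnServerPosition` is a hypothetical hook your networking layer would call:

```csharp
using UnityEngine;

// Client-side smoothing for server-replicated AI positions.
public class ReplicatedAgentView : MonoBehaviour
{
    [SerializeField] private float smoothing = 10f;
    private Vector3 targetPosition;

    // Called by the networking layer when a server update arrives.
    public void OnServerPosition(Vector3 position) => targetPosition = position;

    private void Update()
    {
        // Exponential-style smoothing: moves a fraction of the remaining
        // distance each frame, so corrections are gradual, not teleports.
        transform.position = Vector3.Lerp(
            transform.position, targetPosition, smoothing * Time.deltaTime);
    }
}
```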
Lilly Tech Systems