Intermediate · ~10 min read

Replay System Tutorial

Record real gameplay sessions — combat, AI, physics, streaming — and use them as repeatable performance benchmarks. This tutorial walks you through the complete replay workflow from recording to CI integration.

In This Tutorial

  1. Prerequisites
  2. Create a Replay Scenario
  3. Record a Replay
  4. Review the Replay File
  5. Run the Capture
  6. Record a Baseline and Compare
  7. Read the Replay-Aware Report
  8. Understanding Replay Variance
  9. Blueprint API for Replays
  10. CI Integration

1. Prerequisites

Before you begin, make sure you have the following:

PerfGuard plugin installed in your UE5 project. If you haven't set it up yet, see the Quick Start tutorial first.

A map with gameplay content — something with AI, physics, or streaming that you want to benchmark. You can also use the bundled demo scene at Content/Demo/Levels/DemoScene.

Basic PIE familiarity — you should know how to start Play-In-Editor and use the console (~ key).

2. Create a Replay Scenario

Right-click in the Content Browser, then select PerfGuard → Performance Scenario. Name it something descriptive like PS_CombatReplay.

Open the scenario asset and configure these settings:

Scenario Name: CombatReplay (or any name you prefer)
Capture Source: Replay — this tells PerfGuard to play back a recorded demo instead of a Level Sequence
Map Asset: Select your gameplay map
Warmup Frames: 120 — replay scenarios need more warmup for streaming and initialization
Stat Configs: Add FrameTime, GPUTime, GameThreadTime, and DrawCalls

Tip: Set warmup to 120+ frames. Streaming-heavy maps need extra warmup time to finish loading textures, building HLODs, and stabilizing frame times before capture begins.

3. Record a Replay

You have two options for recording gameplay.

Option A: In-Editor Recorder (Recommended)

Open your replay scenario in the Details panel. The Replay Configuration section appears automatically when Capture Source is set to Replay. Click Play (PIE) to start Play-In-Editor, then click Record New... in the Replay Configuration section.

Play through your scenario — fight enemies, trigger physics, run through streaming zones. When you're done, click Stop Recording. The replay file is automatically selected in the dropdown, and the duration is auto-detected.

PerfGuard replay recording controls in the Details panel

Option B: Console Commands

If you prefer manual control, start PIE and open the console (~ key):

Console
// Start recording
demorec MyCombatReplay

// Play through your scenario, then stop
demostop

The replay is saved to Saved/Demos/MyCombatReplay.replay. Back in the scenario Details panel, click Refresh in the Replay Configuration dropdown and select your file.
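The file discovery behind the Refresh button can be approximated with a short script. This is an illustrative sketch only; the function name is invented, and nothing here is part of the PerfGuard API (only the `Saved/Demos` layout and `.replay` extension come from the docs above):

```python
from pathlib import Path
from typing import Optional

def newest_replay(demo_dir: str) -> Optional[Path]:
    """Return the most recently modified .replay file, as the Refresh dropdown would."""
    replays = sorted(Path(demo_dir).glob("*.replay"),
                     key=lambda p: p.stat().st_mtime, reverse=True)
    return replays[0] if replays else None
```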

4. Review the Replay File

After recording, the Replay Configuration section shows metadata for your selected file:

Replay file metadata showing size, date, and duration

File size — a healthy replay is typically 1–50 MB depending on duration and complexity.
Last modified date — replays older than 30 days may not be compatible with engine updates.
Duration — auto-detected from the replay header when available.

Click the Validate Scenario button in the Actions section to run replay-specific checks: replay file exists and is non-empty, capture duration is set, warmup frames are adequate (120+ recommended), and determinism settings are configured.
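The replay-specific checks can be mirrored in a few lines. A minimal sketch covering only the checks listed above; the function name and return format are invented for illustration:

```python
from pathlib import Path
from typing import List

def validate_replay_scenario(replay_path: str, duration_s: float,
                             warmup_frames: int) -> List[str]:
    """Return a list of validation problems; an empty list means the scenario passes."""
    problems = []
    p = Path(replay_path)
    # Replay file exists and is non-empty
    if not p.is_file() or p.stat().st_size == 0:
        problems.append("replay file missing or empty")
    # Capture duration is set
    if duration_s <= 0:
        problems.append("capture duration is not set")
    # Warmup frames are adequate (120+ recommended)
    if warmup_frames < 120:
        problems.append("warmup below recommended 120 frames")
    return problems
```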

Note: Enable fixed frame rate. For more deterministic results, enable Use Fixed Frame Rate under Project Settings → General Settings → Framerate. This reduces variance from frame-to-frame timing differences during replay playback.

5. Run the Capture

Click Run Standalone in the Actions section of the scenario Details panel. This launches a separate process (not PIE) that loads the map, starts replay playback, warms up for the configured number of frames, captures CSV performance data, and saves results to Saved/PerfGuard/Results/.

You can also run from the CLI:

CLI
perfguard run suite.json

Or directly with Gauntlet:

Gauntlet — Linux / macOS
UnrealEditor MyProject.uproject -game \
    -gauntlet=PerfGuardGauntletController \
    -scenario=CombatReplay \
    -csvprofile -RenderOffScreen -unattended -log
Gauntlet — Windows
UnrealEditor.exe MyProject.uproject -game ^
    -gauntlet=PerfGuardGauntletController ^
    -scenario=CombatReplay ^
    -csvprofile -RenderOffScreen -unattended -log

6. Record a Baseline and Compare

Once you have a clean capture, save it as the performance baseline for future comparisons:

CLI — Record Baseline
perfguard baseline record CombatReplay --auto-csv

After making gameplay or rendering changes, run the capture again (Step 5) and then compare against the baseline:

CLI — Compare
perfguard baseline compare CombatReplay --auto-csv

Tip: Run 3+ times for statistical confidence. Replay captures have more variance than Level Sequence captures. Run the capture at least 3 times and use perfguard analyze to get confidence intervals before drawing conclusions about regressions.
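The statistical idea behind multi-run analysis can be shown with a small confidence-interval calculation over per-run mean frame times. The numbers, the function name, and the 1.96 z-value are illustrative; this is not PerfGuard output:

```python
import statistics

def mean_ci95(samples):
    """Mean and approximate 95% confidence half-width for a set of run averages."""
    m = statistics.mean(samples)
    if len(samples) < 2:
        # A single run says nothing about run-to-run spread
        return m, float("inf")
    half = 1.96 * statistics.stdev(samples) / len(samples) ** 0.5
    return m, half

# Three runs of mean FrameTime (ms) for a baseline and a candidate build:
# if the intervals overlap, a single-run "regression" may just be noise.
baseline_mean, baseline_half = mean_ci95([16.4, 16.9, 16.6])
candidate_mean, candidate_half = mean_ci95([16.8, 17.3, 16.9])
```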

7. Read the Replay-Aware Report

Generate the HTML report from your results:

CLI
perfguard report Saved/PerfGuard/Results/

HTML report with replay capture source badge and variance warnings

The HTML report includes several replay-specific features:

Capture Source Badge — an orange "Replay" badge next to the scenario name, so you can instantly distinguish replay captures from Level Sequence captures.

Single-Run Warning — if you only ran once, the report recommends running 3+ times for reliable regression detection.

High CoV Warning — if the coefficient of variation exceeds 5%, the report flags it and explains that this is expected with replay captures due to non-deterministic physics, AI, and streaming.

Replay Metadata — the report's metadata section shows the replay file name and capture source for traceability.
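The 5% CoV check is straightforward to reproduce. A sketch of the computation on raw per-frame times; the 5% threshold matches the report's, while the function names are invented for illustration:

```python
import statistics

def cov_percent(frame_times_ms):
    """Coefficient of variation: sample stdev as a percentage of the mean."""
    return 100.0 * statistics.stdev(frame_times_ms) / statistics.mean(frame_times_ms)

def flag_high_cov(frame_times_ms, threshold=5.0):
    """True when variance is high enough that the report would warn about it."""
    return cov_percent(frame_times_ms) > threshold
```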

8. Understanding Replay Variance

Replay captures have inherent variance that Level Sequence captures do not. Understanding the sources helps you set appropriate thresholds and interpret results correctly.

Note: Replay captures have inherent variance. Use wider regression thresholds (8–15%) for replay scenarios, compared to the typical 5% for Level Sequence captures. Physics non-determinism, AI randomization, and async streaming all contribute to run-to-run variation.

| Source | Cause | Mitigation |
| --- | --- | --- |
| Physics | Floating-point non-determinism across runs | Use wider thresholds (8–15%) or multi-run analysis |
| AI | Behavior tree randomization and decision variance | Seed RNGs for test scenarios, or accept the variance |
| Streaming | Async loading timing varies between runs | Increase warmup frames (120–180) |
| Network | Replay doesn't capture all client-side effects | Use local replays only |

Tip: Recommended practices
  - Run 3+ times and use perfguard analyze for statistical confidence.
  - Set warmup to 120+ frames for streaming-heavy maps.
  - Use an explicit duration; don't rely on auto-detection in CI.
  - Enable fixed frame rate in Project Settings for more deterministic results.
  - Use wider thresholds (8–15%) for physics-heavy replays.

9. Blueprint API for Replays

The replay system is fully exposed to Blueprint via the PerfGuard Subsystem (access via Get Game Instance → Get Subsystem → PerfGuard Subsystem). These functions let you build custom replay workflows entirely in Blueprint:

Blueprint Functions
StartReplayPlayback(ReplayFilePath, World)  // Start replay playback
StopReplayPlayback()                        // Stop replay playback
IsReplayPlaying()                           // Check if a replay is currently playing
GetReplayDuration(ReplayFilePath)           // Get duration in seconds
GetDefaultReplayDir()                       // Get configured replay directory path

Tip: Full Blueprint reference. For a complete walkthrough of PerfGuard's Blueprint API, including delegates, events, and custom capture workflows, see the Blueprint API tutorial.

10. CI Integration

Replay scenarios work with all CI systems just like Level Sequence scenarios. The key difference is using wider thresholds to account for replay variance.

In your suite.json, replay scenarios look like any other — the scenario asset itself defines the capture source:

suite.json
{
  "scenarios": [
    {
      "name": "CombatReplay",
      "csv": "Saved/Profiling/CSV/",
      "thresholds": {
        "FrameTime": {"max_regression_percent": 10.0},
        "GPUTime": {"max_regression_percent": 10.0}
      }
    }
  ]
}

Note the wider thresholds (10% vs the typical 5%) to account for replay variance.
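
Because the scenario asset, not the suite file, defines the capture source, nothing in suite.json marks a scenario as a replay. One way to keep replay thresholds honest is a small pre-flight check in CI. A sketch assuming the schema shown above; the hand-maintained scenario list, the 8% lower bound, and the function name are all invented for illustration:

```python
REPLAY_SCENARIOS = {"CombatReplay"}  # maintained by hand in this sketch
MIN_REPLAY_THRESHOLD = 8.0           # lower bound of the 8-15% guidance above

def check_replay_thresholds(suite: dict):
    """Warn when a replay scenario uses a Level Sequence-sized regression threshold."""
    warnings = []
    for sc in suite.get("scenarios", []):
        if sc["name"] not in REPLAY_SCENARIOS:
            continue
        for stat, t in sc.get("thresholds", {}).items():
            if t["max_regression_percent"] < MIN_REPLAY_THRESHOLD:
                warnings.append(f"{sc['name']}.{stat} threshold is too tight for a replay")
    return warnings
```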

Your CI workflow runs the same command regardless of capture source:

GitHub Actions YAML
- name: Run replay benchmark
  run: |
    perfguard run suite.json --mode compare --budget 30fps

For more details on CI pipeline configuration including Jenkins, GitLab, and BuildGraph, see the CI Setup tutorial.