Each card shows who executes, who signs off, entry / exit criteria, deliverables, and the most common pitfall.
The dividing line is field-level versus process-level. Get this wrong and your acceptance evidence doesn't reflect the real user experience.
FAT — field-level / configuration-level: tests that individual user stories or configurations work as designed.
UAT — process-level / business-flow: tests end-to-end business journeys.
If a tester is checking that field X accepts data type A → that's FAT.
If a tester is walking through "I'm a stock controller, show me my Tuesday morning workflow" → that's UAT.
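The field-level vs process-level rule of thumb can be sketched as a tiny classifier. This is a minimal illustration, not the real decision procedure: the keyword lists and the function name `classify_test` are assumptions introduced here.

```python
# Hypothetical sketch: route a test description to FAT or UAT.
# The keyword heuristics below are illustrative assumptions, not the rule book.

FIELD_LEVEL_HINTS = ("field", "accepts", "data type", "configuration", "validation")
PROCESS_LEVEL_HINTS = ("workflow", "journey", "end-to-end", "morning")

def classify_test(description: str) -> str:
    """Return 'FAT' for field/config-level checks, 'UAT' for business journeys."""
    text = description.lower()
    if any(hint in text for hint in PROCESS_LEVEL_HINTS):
        return "UAT"
    if any(hint in text for hint in FIELD_LEVEL_HINTS):
        return "FAT"
    return "UNCLASSIFIED"  # ambiguous — needs a human call

print(classify_test("Check field X accepts data type A"))          # FAT
print(classify_test("Stock controller Tuesday morning workflow"))  # UAT
```

Process-level hints are checked first: a journey description often mentions individual fields along the way, but the journey framing is what makes it UAT.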
P1 — Blocks all testing or renders the area untestable. Ship-stopper.
P2 — Important function broken, but the business can continue with manual or alternative steps.
P3 — Inconvenience but not blocking. Logged with a post-go-live fix plan.
P4 — Typos, alignment, label inconsistencies. Backlog candidates.
Go-live rule: zero P1; zero P2 unless a workaround is agreed and documented; P3 count within the published ceiling, each with a post-go-live fix plan; P4 in the backlog. P2 workarounds must be agreed by the Design Authority and Process Owner and recorded in the Test Exit Report — no grey area.
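The go-live rule above can be expressed as a small gate check. A minimal sketch, assuming a flat defect list; the `Defect` fields, the default ceiling of 10, and the function name are illustrative, not taken from the source.

```python
# Hypothetical sketch of the go-live defect gate described above.
# Field names and the default P3 ceiling are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Defect:
    priority: int                        # 1..4
    workaround_agreed: bool = False      # agreed by Design Authority and Process Owner
    workaround_documented: bool = False  # recorded in the Test Exit Report

def go_live_allowed(defects: list[Defect], p3_ceiling: int = 10) -> bool:
    """Zero P1; P2 only with an agreed, documented workaround; P3 within ceiling."""
    if any(d.priority == 1 for d in defects):
        return False
    for d in defects:
        if d.priority == 2 and not (d.workaround_agreed and d.workaround_documented):
            return False
    p3_count = sum(1 for d in defects if d.priority == 3)
    return p3_count <= p3_ceiling

# A P2 with an agreed, documented workaround plus one P3 passes the gate:
print(go_live_allowed([Defect(2, True, True), Defect(3)]))  # True
print(go_live_allowed([Defect(1)]))                         # False
```

Note that P4s never block the gate — they go straight to the backlog, mirroring the rule in the text.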
Tick items off as their evidence comes in. The sponsor signs the final authorisation only when all are green.
Start with the test-completion sign-offs (FAT, SAT, SIT, UAT, BAT). Each is non-negotiable.
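The checklist logic amounts to a simple all-green gate over the five test-completion sign-offs named above. A minimal sketch; the function names are illustrative assumptions.

```python
# Hypothetical sketch of the sign-off checklist: tick each item as its
# evidence is in; the sponsor signs only when every item is green.

signoffs = {"FAT": False, "SAT": False, "SIT": False, "UAT": False, "BAT": False}

def record_evidence(item: str) -> None:
    """Tick an item once its completion evidence is filed."""
    if item not in signoffs:
        raise KeyError(f"unknown sign-off: {item}")
    signoffs[item] = True

def sponsor_can_sign() -> bool:
    """Final authorisation is unlocked only when all items are green."""
    return all(signoffs.values())
```

Because `sponsor_can_sign` checks every item, a single missing sign-off (say, BAT) keeps the final authorisation locked — each is non-negotiable.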
Describe what's being tested and who's doing it. The finder maps it to the right level.