What you'll learn
- How to structure a control matrix across four blocks with 28 columns that satisfy ISAE 3402.16 and ISAE 3402.23
- How to write control descriptions that pass the five-element test (WHO, WHAT, WHEN, EVIDENCE, EXCEPTION)
- How IPE flagging prevents the most common PCAOB finding on system-generated reports
- How to classify controls as key or non-key with rationale that survives a reviewer's first pass
- How the ISAE 3402 template pack pre-populates 11 worked example controls across seven control objectives
Planning meeting. The EP opens the PY ISAE 3402 file and pulls last year's control matrix forward. Five columns (control objective, control description, frequency, type, notes) and a nod that "we just refresh the dates." In the files we've reviewed, that approach generates the first wave of RNs before fieldwork even starts, because the matrix cannot carry the weight the standard places on it.
An ISAE 3402 control matrix that survives review needs a minimum of 28 columns across four distinct blocks (identification, classification, linkage, assessment), with every control description passing a five-element test covering who performs it, what they do, when, what evidence it produces, and what happens when exceptions occur. The file should tell a story. A reviewer should be able to trace from objective to testing conclusion without a single clarifying question.
Why five columns is not enough
ISAE 3402.16(a) lists eight mandatory description criteria for the service organisation's system. The control matrix is where most of those criteria land. A five-column matrix cannot address the link between controls and risks (ISAE 3402.23), the distinction between key and non-key controls with documented rationale, IPE identification, CUEC dependencies, or the pre-defined evidence expectations that set the baseline for testing.
Every missing column becomes a gap that the reviewer fills with a question. Five questions in the review means the matrix goes back for rework. Twenty-eight columns answered upfront means the reviewer traces a complete chain from control objective to testing conclusion without interruption.
The column count is not arbitrary. It reflects what the standard requires, what inspectors check, and what the testing protocol needs as input. Each column feeds forward into subsequent tabs. The testing protocol pulls frequency, population, risk level, and key/non-key classification directly from the matrix. If those columns do not exist, the tester invents them on the fly, and consistency breaks.
The four-block structure
The ISAE 3402 template pack organises the 28 columns into four blocks. Each block serves a different purpose in the audit chain.
Identification block (columns 1 to 6)
The identification block answers which control objective this control serves, what process area it belongs to, its unique identifier, what the control actually does (per the five-element test), and who owns it. The control objective description follows a formula: "Controls provide reasonable assurance that [specific operational outcome]." Vague objectives ("controls over IT") fail the specificity test in ISAE 3402.18.
Classification block (columns 7 to 13)
The classification block answers whether the control is manual, automated, or IT-dependent manual, how often it operates, whether it is key or non-key (with mandatory rationale), which system it runs in, and whether it relies on information produced by the entity (IPE). This block is where most RNs originate because it requires judgment calls that teams often skip or answer generically. The EP signs, the RN clears, nobody loves it.
Linkage block (columns 14 to 20)
The linkage block connects the control to the rest of the engagement file. Every control must trace to a risk in the risk assessment. Every control maps to a COSO component and to specific FS assertions. If a CUEC dependency exists, the linkage block records it. Without this block, the control matrix is a standalone document. With it, the matrix becomes a node in a connected chain.
Assessment block (columns 21 to 28)
The assessment block defines the testing expectations. What evidence should the tester expect to see? What testing approach will be used (the standard's prohibition on inquiry alone under ISAE 3402.25(a) is reinforced here). What is the sample size basis? What is the population? Design effectiveness and operating effectiveness get separate columns because a control that is well-designed but does not operate consistently has a fundamentally different deficiency profile. Exception notes and PY references close the block.
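The four blocks and their columns can be sketched as a simple data structure. This is a hypothetical Python layout: the column names are illustrative placeholders, not the template pack's exact labels.

```python
# Hypothetical sketch of the 28-column matrix grouped into four blocks.
# Column names are illustrative; the template pack's exact labels may differ.
MATRIX_BLOCKS = {
    "identification": [            # columns 1-6
        "control_objective", "process_area", "control_id",
        "control_description", "control_owner", "objective_reference",
    ],
    "classification": [            # columns 7-13
        "control_type", "frequency", "key_or_non_key", "key_rationale",
        "system", "ipe_flag", "ipe_description",
    ],
    "linkage": [                   # columns 14-20
        "risk_reference", "coso_component", "fs_assertions",
        "cuec_reference", "subservice_dependency",
        "related_controls", "policy_reference",
    ],
    "assessment": [                # columns 21-28
        "evidence_expected", "testing_approach", "sample_size_basis",
        "population", "design_effectiveness", "operating_effectiveness",
        "exception_notes", "py_reference",
    ],
}

# Sanity check: four blocks, 28 columns in total.
assert len(MATRIX_BLOCKS) == 4
assert sum(len(cols) for cols in MATRIX_BLOCKS.values()) == 28
```

The point of the grouping is traceability: downstream tabs (the testing protocol, the gap analysis) read specific columns by block, so a column that does not exist cannot feed forward.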
The five-element control description test
The single most common deficiency in ISAE 3402 files is a vague control description. "Management reviews the payroll report" tells the tester almost nothing. Compare that to a description that passes all five elements:
WHO performs the control: the Payroll Manager (a role title, never a person's name).
WHAT they do: reviews the monthly payroll variance checklist, comparing total payroll cost per department to the approved budget and prior month, investigating variances exceeding 5%.
WHEN they do it: by the 15th of the following month.
What EVIDENCE the control produces: signed variance checklist with investigation notes for all flagged departments, retained in the payroll SharePoint folder.
What happens when an EXCEPTION occurs: variances exceeding 10% are escalated to the Finance Director with a written explanation within two business days.
A control description that omits any element creates problems downstream. Without WHO, the tester cannot confirm the right person performed the control. Without WHEN, the tester cannot determine whether the control operated on time. Without EVIDENCE, the tester does not know what to inspect. Without EXCEPTION, the tester has no benchmark for evaluating whether deviations were handled appropriately. This is the finding that generates the most review notes on ISAE 3402 files.
The five-element test applies to every control in the matrix. Automated controls still need it. WHO is the system. WHAT is the specific validation or calculation. WHEN is real-time or batch (specify which). EVIDENCE is the system log or rejection report, and EXCEPTION is the error handling routine.
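The five-element test is mechanical enough to express as a validator. A minimal sketch, assuming a control description is stored as a dict with one field per element (the field names are assumptions, not the template's):

```python
REQUIRED_ELEMENTS = ("who", "what", "when", "evidence", "exception")

def missing_elements(description: dict) -> list:
    """Return the five-element fields that are absent or blank."""
    return [e for e in REQUIRED_ELEMENTS
            if not str(description.get(e, "")).strip()]

# Example: a description that documents everything except exception handling.
payroll_review = {
    "who": "Payroll Manager",
    "what": "Reviews monthly payroll variance checklist; investigates "
            "variances exceeding 5%",
    "when": "By the 15th of the following month",
    "evidence": "Signed variance checklist with investigation notes",
    "exception": "",  # escalation path not yet documented
}

print(missing_elements(payroll_review))  # → ['exception']
```

A check like this catches the most common deficiency (the missing exception element) before the matrix ever reaches review.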
IPE flagging: the column that prevents the PCAOB's top finding
Information produced by the entity (IPE) is any system-generated report or data extract that the auditor uses as audit evidence. The PCAOB's 2024 Staff Alert identified IPE testing failures as the most common deficiency in service organisation engagements. Firms relied on system-generated access listings, payroll registers, and exception reports without testing whether those reports were complete and accurate.
The control matrix contains a dedicated IPE flag column. When a control relies on a system-generated report as evidence (a user access listing from the ERP, a payroll variance report, or a change management log), the IPE flag is set to "Y." A companion column then requires documentation of which report constitutes the IPE, which system generates it, what the tester will do to test completeness, and what the tester will do to test accuracy.
This flag is set at the matrix stage, not during testing. If IPE identification happens only when the tester encounters a report in the field, some IPE will be missed. By flagging it in the matrix, the testing protocol inherits the flag and includes IPE testing steps automatically.
For the 11 example controls in the pack, IPE flags are pre-populated. The quarterly access review relies on the user access listing (IPE). The payroll variance review relies on the payroll register (IPE). The change management procedure relies on the CAB meeting minutes generated by the ITSM tool (IPE). The incident management procedure relies on the ticketing system export (IPE). Each flagged control has a pre-written completeness and accuracy approach that the tester can adapt to the specific entity's systems.
Key versus non-key classification
Every control in the matrix receives a key or non-key classification. The distinction matters for two reasons. Sample sizes are higher for key controls, and a deviation in a key control is more likely to trigger a gap analysis entry.
The rationale column requires the team to answer four questions. What risk does this control address? Does a compensating control exist? What happens if this control fails? How quickly would a failure be detected by another procedure in the file?
A control is key when it is the primary control for a risk and no compensating detective control would catch failures between operating cycles. A control is non-key when it supplements a primary control. In the pack's worked examples, the quarterly access review by the Information Security Manager is classified as key because no compensating control prevents unauthorised access between quarterly review cycles. The annual access certification by department heads is non-key because the quarterly review is the primary control.
A vague classification with no rationale ("key because it's important") is the finding most likely to generate an RN. The rationale column forces specificity. The file should tell a story from risk, through control, to test result, without the reviewer having to fill in the reasoning.
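The classification logic described above reduces to a small decision rule. A sketch with assumed boolean inputs, mirroring the pack's worked examples:

```python
def classify(is_primary: bool, compensating_detective: bool) -> str:
    """Key when the control is the primary control for a risk and no
    compensating detective control would catch failures between cycles."""
    if is_primary and not compensating_detective:
        return "key"
    return "non-key"

# Quarterly access review: primary control, nothing compensates between cycles.
print(classify(is_primary=True, compensating_detective=False))   # key
# Annual certification: supplements the quarterly review.
print(classify(is_primary=False, compensating_detective=True))   # non-key
```

The rule is deliberately strict: a control that is primary but backed by a timely compensating detective control can defensibly be non-key, which is exactly the judgment the rationale column exists to document.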
Worked example: Van der Berg Payroll Services B.V.
Van der Berg Payroll Services B.V. is a Dutch payroll processing bureau with €34M revenue, processing payroll for 112 client entities. The engagement is Type II, covering 1 January to 31 December 2025, with seven control objectives across ITGC, payroll processing, change management, and financial reporting.
Define the control objective
The engagement team opens the control matrix and defines the payroll processing control objective: "Controls provide reasonable assurance that payroll transactions are processed accurately, completely, and only with proper authorisation." The process area is Payroll Processing. Documentation note: the payroll processing objective description follows the formula template. Objective links to the inaccurate payroll output risk and the unauthorised payroll changes risk in the risk assessment tab.
Write the control description with the five-element test
The team populates the payroll variance review control using the five-element test. WHO: Payroll Manager. WHAT: Reviews monthly payroll variance checklist, comparing total payroll cost per department to approved budget and prior month, investigating variances exceeding 5%. WHEN: By the 15th of the following month. EVIDENCE: Signed variance checklist with investigation notes, retained in payroll SharePoint folder. EXCEPTION: Variances exceeding 10% escalated to Finance Director within two business days. Documentation note: control description field contains all five elements in sequence. Control type classified as IT-dependent manual (relies on system-generated variance report but performed by a person).
Populate the classification block
The classification block records: frequency is monthly (12 occurrences per year), key control (rationale: "Primary detective control over payroll accuracy. No other control independently verifies total payroll cost against budget. If this control fails, variances up to 10% go undetected until quarterly financial review"), system is the payroll application, IPE flag is Y. Documentation note: IPE description reads "Payroll variance report generated by [Payroll System]. Completeness: reconcile report total to general ledger payroll expense. Accuracy: reperform 2 variance calculations."
Connect the linkage block
The linkage block connects the payroll variance review control to the inaccurate payroll output risk, COSO component Control Activities, assertions Accuracy and Completeness, the payroll authorisation complementary user entity control (user entities must authorise payroll transactions before submission), no subservice organisation dependency. Documentation note: the payroll authorisation CUEC cross-reference means the control's effectiveness depends on user entities submitting authorised data. If the payroll authorisation CUEC does not operate, this control could process accurately but on fraudulent input.
Set the assessment block testing parameters
The assessment block sets: evidence expected is the signed checklist with investigation notes, testing approach is inspection plus reperformance, sample size basis is 3 to 4 per standard monthly sampling at 95% confidence, population is 12 monthly reviews, design effectiveness assessed as effective per the payroll variance review design walkthrough. Documentation note: operating effectiveness left blank at matrix stage. Populated during testing phase. Exception column pre-populated with "No exceptions" as default, to be updated if deviations found.
The completed matrix row for the payroll variance review control contains 28 populated fields. A reviewer can trace from the control objective through the description, classification, risk linkage, and testing parameters without asking a single clarifying question. That is what a defensible file looks like at the row level.
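Abridged, the completed row can be sketched as a record assembled from the steps above. This is an illustrative subset of the 28 columns with hypothetical field names; every value is taken from the worked example.

```python
# Abridged sketch of the payroll variance review row (subset of 28 columns).
# Field names are illustrative; values come from the worked example.
payroll_variance_row = {
    # identification block
    "process_area": "Payroll Processing",
    "control_description": "Payroll Manager reviews monthly payroll "
                           "variance checklist by the 15th of the "
                           "following month",
    # classification block
    "control_type": "IT-dependent manual",
    "frequency": "Monthly",
    "key_or_non_key": "Key",
    "ipe_flag": "Y",
    # linkage block
    "risk_reference": "Inaccurate payroll output",
    "coso_component": "Control Activities",
    "fs_assertions": ["Accuracy", "Completeness"],
    "cuec_reference": "Payroll authorisation CUEC",
    # assessment block
    "testing_approach": "Inspection plus reperformance",
    "population": 12,
    "operating_effectiveness": None,  # populated during testing phase
}

print(payroll_variance_row["key_or_non_key"])  # → Key
```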
Practical checklist for control matrix construction
Common mistakes
Writing control descriptions that omit the exception-handling element. Without it, the tester has no benchmark for evaluating whether deviations were handled appropriately, and operating effectiveness conclusions become unsupportable.
Classifying all controls as key. When every control is key, the classification carries no information. The distinction exists to focus testing effort on controls where failure has the greatest consequence. If 90% of controls are key, the team has not applied judgment.
Omitting the linkage block entirely. A control matrix without risk linkage is a list of controls, not an audit working paper. ISAE 3402.23 requires the auditor to identify risks that threaten control objective achievement. Controls that do not map to identified risks cannot demonstrate they address those risks.
Frequently asked questions
What is the difference between an ISAE 3402 Type I and Type II report?
A Type I report covers the description of the service organisation's system and the suitability of the design of controls at a specific date. A Type II report covers both the design and operating effectiveness of controls over a period, typically six to twelve months. Type II reports require the auditor to test whether controls operated effectively throughout the period by selecting samples and examining evidence of control execution. Most user auditors require a Type II report because it provides assurance that controls not only existed but actually worked during the period covered by their audit.
What are the five elements of a complete control description in an ISAE 3402 matrix?
A complete control description must specify: who performs the control (the named role, not a department), what the control does (the specific action or verification performed), when it is performed (the frequency, whether daily, weekly, monthly, or per transaction), what evidence it produces (the document, system log, sign-off, or approval record), and what happens when an exception is identified (the follow-up action or escalation). If any of these five elements is missing, the control description is incomplete and the auditor cannot design an effective test of operating effectiveness.
How many controls should an ISAE 3402 control matrix typically contain?
There is no prescribed number. The matrix should contain enough controls to address all risks identified for each control objective. A typical payroll service organisation might have 40 to 60 controls across 8 to 12 control objectives. A SaaS platform handling financial data might have 60 to 100 controls covering access management, change management, data integrity, backup and recovery, and incident management. The risk is having too few controls (leaving risks unaddressed) or too many with overlapping descriptions that make testing unnecessarily expensive.
What is Information Produced by the Entity (IPE) and why does it matter for ISAE 3402?
IPE refers to reports, data extracts, or system outputs that the service organisation uses as inputs to its controls. For example, an exception report that triggers a review control is IPE. ISAE 3402.A32 requires the auditor to evaluate the completeness and accuracy of IPE used in control operation. If a monthly reconciliation control relies on a system-generated report, the auditor must test that the report captures all transactions and calculates correctly. Flagging IPE in the control matrix ensures the auditor designs specific procedures to validate each report before relying on it as evidence.
How do you link controls to risks and control objectives in the matrix?
Each control should map to one or more risks, and each risk should map to a control objective. The linkage block in the matrix typically contains columns for the control objective reference, the risk reference, and the control reference. This three-way mapping demonstrates that every identified risk is addressed by at least one control and that every control serves a purpose. ISAE 3402.23 requires the practitioner to identify risks that threaten the achievement of control objectives. Controls that do not link to an identified risk suggest either a missing risk or a redundant control.
Related content
- ISAE 3402 glossary entry. Covers the difference between Type I and Type II engagements and the reporting requirements under the standard.
- ISAE 3402 template pack. Contains the 28-column control matrix with 11 pre-populated example controls, plus the testing protocol, gap analysis, and CUEC register.
- ISAE 3402 testing: why your sampling paragraph references are probably wrong. Explains how the testing protocol connects to the control matrix and why most firms cite the wrong ISAE 3402 paragraphs for sampling.
Related tools
Put audit concepts into practice with these free tools: