
SOC 2 Readiness Assessment

How to run a SOC 2 readiness assessment. Gap analysis, scoping, remediation planning, and preparing for Type I fieldwork.

Readiness is the single highest-leverage phase of SOC 2

A well-run SOC 2 readiness assessment saves more time and money than any other activity in the compliance program. Skip it and you enter fieldwork with unknown gaps, triggering remediation mid-audit, extending timelines, and burning goodwill with your CPA firm. Do it properly and you arrive at the audit knowing exactly where you stand, with a clean, prioritized punch list already worked through.

Readiness is not a dress rehearsal. It is a structured gap analysis that compares your current controls against the Trust Services Criteria you have selected and outputs a remediation plan. The plan — with owners, due dates, and evidence requirements — becomes the roadmap for the weeks or months before fieldwork begins.

What a readiness assessment covers

A SOC 2 readiness assessment has five components. Skipping any of them weakens the value of the output.

1. Scoping

Scoping defines what the audit will cover. This includes:

  • Systems in scope: the applications, infrastructure, databases, and third-party services that store, process, or transmit customer data
  • Trust Services Criteria: security is required; availability, processing integrity, confidentiality, and privacy are optional and selected based on commitments
  • Locations: physical offices or data centers in scope
  • Entities: if the company has subsidiaries or separate business units, decide which are covered
  • Observation period (for Type II): the start and end dates the auditor will test against

Scoping decisions made during readiness typically carry through to the audit contract. Changing scope mid-engagement is expensive.
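Because scoping decisions carry through to the audit contract, it helps to capture the scope statement as structured data rather than prose. The sketch below is purely illustrative: the field names and the validation checks are assumptions, not a required format.

```python
# Illustrative scope statement as structured data; field names are assumptions.
scope = {
    "systems": ["production app", "PostgreSQL", "AWS infrastructure"],
    "criteria": ["security", "availability"],  # security is always required
    "locations": ["HQ office"],
    "entities": ["Parent Co"],
    "observation_period": ("2025-01-01", "2025-06-30"),  # Type II only
}

def validate_scope(s: dict) -> list[str]:
    """Basic sanity checks before the scope carries into the audit contract."""
    problems = []
    if "security" not in s["criteria"]:
        problems.append("the security criteria are mandatory in every SOC 2 scope")
    if not s["systems"]:
        problems.append("at least one in-scope system is required")
    return problems
```

Keeping scope as data makes it easy to diff against the engagement letter later and to spot mid-engagement scope drift early.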

2. Control inventory

Catalog every control currently in place that could contribute to SOC 2 coverage. Sources include:

  • Existing information security policy
  • Identity and access management configuration
  • Infrastructure and application security tooling
  • HR processes (onboarding, offboarding, training)
  • Vendor management practices
  • Incident response and business continuity plans

The output is a control inventory mapped to the Trust Services Criteria categories. It does not need to be exhaustive — the goal is to understand what exists, not to perfect it.

3. Gap analysis

With scope and inventory defined, compare what you have against what the Trust Services Criteria require. For every point of focus, answer:

  • Is there a control in place?
  • Is the control documented?
  • Is the control operating?
  • Is there evidence the control operated over time (for Type II)?

Gaps fall into three categories.

| Gap type | Description | Typical effort |
| --- | --- | --- |
| Missing control | No control exists for the criterion | High — design and implement |
| Undocumented control | Control exists but is not written down | Low — document what you do |
| No evidence | Control exists but generates no auditable evidence | Medium — instrument evidence generation |
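The four yes/no questions map mechanically onto the three gap categories, which makes gap classification easy to automate. A minimal sketch, with illustrative names of my own choosing:

```python
from enum import Enum

class GapType(Enum):
    NONE = "no gap"
    MISSING_CONTROL = "missing control"    # high effort: design and implement
    UNDOCUMENTED = "undocumented control"  # low effort: document what you do
    NO_EVIDENCE = "no evidence"            # medium effort: instrument evidence

def classify_gap(exists: bool, documented: bool, has_evidence: bool) -> GapType:
    """Map the answers for one point of focus to a gap category.

    Checks run in order of severity: a control that does not exist is a
    missing-control gap regardless of the other answers.
    """
    if not exists:
        return GapType.MISSING_CONTROL
    if not documented:
        return GapType.UNDOCUMENTED
    if not has_evidence:
        return GapType.NO_EVIDENCE
    return GapType.NONE
```

Running this across every point of focus yields the raw material for the remediation plan in the next step.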

4. Remediation planning

Each gap becomes a remediation item with:

  • Description of what is missing
  • Owner (named individual or team)
  • Priority (must-fix before audit vs nice-to-have)
  • Estimated effort
  • Due date aligned to the audit timeline
  • Evidence requirement after remediation

Prioritize gaps that are likely to be tested first and gaps that take the longest to close. Examples of items that frequently need the most lead time: centralized logging deployment, MFA rollout to all in-scope systems, policy set formalization, and vendor assessments. See policies and procedures for the policy baseline most SOC 2 programs need.
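The remediation item fields and the prioritization rule above can be sketched as a small data structure. This is an assumption about how one might encode it, not a prescribed schema; the field names are illustrative.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class RemediationItem:
    description: str        # what is missing
    owner: str              # named individual or team
    must_fix: bool          # must-fix before audit vs nice-to-have
    effort_weeks: int       # estimated effort
    due: date               # aligned to the audit timeline
    evidence_required: str  # artifact the auditor will see after remediation

def prioritized(items: list[RemediationItem]) -> list[RemediationItem]:
    # Must-fix items first; within each group, longest-effort items first,
    # so the slowest gaps (MFA rollout, centralized logging) start earliest.
    return sorted(items, key=lambda i: (not i.must_fix, -i.effort_weeks))
```

Sorting by (must-fix, effort) is a simple proxy for the guidance above: start the long-lead, audit-blocking work immediately and let nice-to-haves queue behind it.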

5. Evidence catalog

For every control — existing or newly created — identify the evidence the auditor will request. Evidence may be:

  • Static documents (policies, agreements, plans)
  • Snapshots (access review exports, configuration screenshots)
  • Continuous artifacts (logs, tickets, alerts) for Type II

The catalog prevents the scramble during fieldwork when auditors send their first request list and the team realizes half the evidence is not where it needs to be. Related glossary: evidence collection and remediation.
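An evidence catalog is essentially a mapping from control to required artifacts and their evidence kind. A minimal sketch, assuming a flat list of entries; the criterion IDs shown (CC6.1, CC7.2) are examples only:

```python
from collections import defaultdict

# Evidence kinds from the list above.
STATIC, SNAPSHOT, CONTINUOUS = "static", "snapshot", "continuous"

catalog = [
    # (control/criterion id, evidence artifact, kind) -- example entries
    ("CC6.1", "Access control policy", STATIC),
    ("CC6.1", "Quarterly access review export", SNAPSHOT),
    ("CC7.2", "Alert tickets from monitoring system", CONTINUOUS),
]

def by_control(entries):
    """Group required artifacts per control so nothing is missed at fieldwork."""
    grouped = defaultdict(list)
    for control, artifact, kind in entries:
        grouped[control].append((artifact, kind))
    return dict(grouped)
```

Grouping per control mirrors how auditors issue request lists, so gaps in evidence coverage surface before the first request arrives.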

How readiness connects to Type I and Type II

Readiness is typically the first stop on the path to a SOC 2 report.

  • If your next report will be Type I, readiness identifies gaps to close so control design passes. Remediation must be complete before the Type I reporting date. See SOC 2 Type 1 vs Type 2.
  • If your next report will be Type II, readiness closes gaps so controls can operate cleanly across the observation period. Any remediation that happens during the observation period creates risk that the auditor will see control failure earlier in the period.
  • If you plan to skip Type I and go straight to Type II, readiness is even more important because there is no point-in-time checkpoint to catch design flaws before the observation clock starts.

How this fits into SOC 2

Readiness is not a Trust Services Criterion itself but supports CC3 (risk assessment) and CC4 (monitoring activities). The readiness output becomes evidence that the organization assessed control adequacy and took action. Many auditors ask to see the readiness assessment or equivalent gap analysis during fieldwork as an indicator of program maturity.

Readiness also informs scoping conversations with the CPA firm. Sharing your gap analysis with a prospective auditor during the selection process demonstrates seriousness and can help estimate fieldwork effort accurately. This in turn affects the cost estimate.

Deliverables of a good readiness assessment

By the end of readiness, you should have:

  • A written scope statement (systems, criteria, observation period)
  • A control inventory mapped to the Trust Services Criteria
  • A gap analysis document
  • A remediation plan with owners and due dates
  • An evidence catalog listing required artifacts per control
  • A refined understanding of likely audit cost and timeline

These artifacts are worth maintaining after readiness ends — they become the operating system of the SOC 2 program through the audit and beyond.

Common mistakes

  • Scoping too broadly. Including criteria you have no customer commitment for adds work without adding value. Start tight.
  • Skipping evidence planning. Identifying gaps without identifying how evidence will be produced leads to scrambling later.
  • No owner on remediation items. Items without owners stall. Every gap needs a name attached.
  • Treating readiness as a document exercise. Readiness is an operational sprint, not a report. The goal is to close gaps, not just describe them.
  • Using readiness as a substitute for Type I. Some buyers ask for Type I specifically. Readiness is not an auditor's opinion and does not satisfy that request.

Implementation tips

  • Start readiness at least three months before you want to begin fieldwork for Type I, or before the observation period begins for Type II.
  • Use your compliance platform to run the gap analysis rather than a spreadsheet. The platform becomes the living record after readiness ends.
  • Involve engineering, IT, HR, and legal from day one. SOC 2 is cross-functional, and single-team readiness misses gaps.
  • Review the readiness output with your prospective auditor before signing the engagement letter. They may flag scoping issues or evidence expectations.
  • Re-run readiness annually or whenever scope changes. The program is never done.

How episki helps

episki ships with a pre-mapped SOC 2 control library, scoping wizard, gap analysis engine, and remediation tracker — turning readiness from a multi-week consulting engagement into a workflow your team can run in-house. Start a free trial or review the full SOC 2 framework guide to see how readiness connects to the rest of the audit lifecycle.
