Writing a Cybersecurity Assessment Report

Learn best practices for crafting an effective cybersecurity assessment report.

---

Writing a Cybersecurity Assessment Report

A cybersecurity assessment report and a System Security Plan are not the same document. Many defense contractors confuse them, which creates problems when the assessor asks for one and they hand over the other.

Your SSP describes how your organization implements each security control. It's your documentation of current state.

Your assessment report documents the results of evaluating those controls against the NIST 800-171 requirements — what's working, what's not, the severity of gaps, and what you're doing to close them. It's the output of examining the SSP claims against reality.

CMMC Level 2 requires both. CA.L2-3.12.1 requires periodic assessment of security controls. CA.L2-3.12.2 requires that you develop a plan of action (POA&M) to address deficiencies found in the assessment. The assessment report is the document that connects those two requirements: it records what the assessment found, and it feeds the POA&M.

Here's what an assessment report needs to contain and how to write one that's actually useful.

When You Need an Assessment Report

There are three contexts where defense contractors produce assessment reports:

Self-assessment: Required on a recurring basis under CMMC — annually at Level 1, and at least every three years (with annual affirmations) for a Level 2 self-assessment. Your organization evaluates its own controls against the 110 NIST 800-171 requirements using the NIST 800-171A assessment methodology, produces a score, and submits that score to the Supplier Performance Risk System (SPRS). The self-assessment report documents the evaluation. You're not required to submit the full report to SPRS — just the score and a signed affirmation — but you keep the report internally as evidence of how you arrived at the score.

C3PAO assessment: The triennial third-party assessment required for CMMC Level 2 certification. Your C3PAO conducts the assessment using NIST 800-171A and produces their own assessment report. You don't write this one — they do. But you need to understand its structure because you'll be responding to its findings.

Internal gap assessments: Many contractors run internal assessments before their C3PAO assessment to identify gaps and build their remediation plan. These follow the same structure as a formal assessment report. The output feeds your POA&M and helps you prioritize the work needed before certification.

This article focuses on writing your internal and self-assessment reports — the ones you produce to document your own evaluation and track your progress.

The Structure of an Assessment Report

1. Scope Statement

Document exactly what you assessed: which systems, which networks, which organizational units, which sites. If you have an enclave, define its boundaries. If some systems are out of scope or treated differently under CMMC scoping guidance (Contractor Risk Managed Assets, Specialized Assets), state that and explain why.

This matters because assessment scope defines the boundaries of your claims. If you assess only your CUI enclave and exclude your corporate network, the report is valid only for that scope. The SPRS score you submit covers your entire CUI environment — if you only assessed part of it, say so.

A clear scope statement also prevents scope creep during the assessment itself. Without it, assessments drift and findings from out-of-scope systems can create confusion about applicability.

2. Assessment Methodology

State how you conducted the assessment. Reference NIST SP 800-171A, which defines three assessment methods for each practice:

  • Examine: Review documentation (SSP, policies, configurations, logs)
  • Interview: Talk to personnel who implement and maintain the controls
  • Test: Technically verify controls work as documented (configuration checks, access testing, log review)

For each control, NIST 800-171A specifies which methods are required. Some controls require only examination. Others require examination plus interview. High-risk controls may require all three. Your methodology section should note which methods you applied and why.

Including methodology in your report serves two purposes: it demonstrates rigor to anyone reviewing the report, and it forces you to actually conduct a substantive evaluation rather than just walking through the control list and marking things "implemented."

3. Control-by-Control Findings

This is the core of the report. For each of the 110 NIST 800-171 Rev 2 practices, document:

Assessment objective: Restate the specific objective(s) from NIST 800-171A for this practice. Each practice has one or more assessment objectives, and many have several.

Current status: Met, Not Met, or Partially Met. For SPRS scoring, Partially Met counts as Not Met — with two narrow exceptions in the DoD Assessment Methodology (multifactor authentication, 3.5.3, and FIPS-validated cryptography, 3.13.11, carry reduced deductions for partial implementation), you either meet the full requirement for a practice or you don't. Scoring starts at the 110-point maximum and deducts 1, 3, or 5 points for each unmet practice based on its weight, so the score ranges from 110 down to −203.

Evidence reviewed: What you looked at. Configuration screenshots, policy documents, interview notes, test results. Cite specific documents with version dates. This is what an assessor would review to validate your finding.

Finding narrative: A two-to-four-sentence description of what you found. If Met: describe how the control is implemented and reference the evidence. If Not Met or Partially Met: describe the specific gap — what's missing, what doesn't meet the standard.

Risk rating: For gaps, rate the severity. High (gap creates direct path to CUI exposure), Medium (gap increases risk but with mitigating factors), Low (gap is procedural or administrative with no direct exploitation path). Risk rating drives POA&M prioritization.

For a 110-control assessment, this section will be long. A thorough self-assessment typically takes 80–200 hours of internal effort depending on the complexity of your environment and the completeness of your SSP and documentation. If your documentation is thin, the assessment itself will surface that — and you'll need to document before you can assess.

4. Summary Scorecard

After completing the control-by-control findings, produce a summary:

  • Total practices assessed: 110
  • Met: [number]
  • Not Met: [number]
  • SPRS score: [calculated value]

Include a breakdown by CMMC domain so you can see where the gaps cluster. If you have 8 not-met findings in the Access Control domain and 2 in everything else, your remediation effort should reflect that priority.
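Once findings are structured, tallying gaps by domain is a one-liner. A quick sketch — the practice IDs are hypothetical examples, and it assumes the domain abbreviation is the prefix of the ID:

```python
from collections import Counter

# Hypothetical not-met practice IDs (illustrative, not real findings)
not_met = ["AC.L2-3.1.1", "AC.L2-3.1.5", "AC.L2-3.1.20", "IA.L2-3.5.3"]

# Domain abbreviation is the segment before the first dot
by_domain = Counter(p.split(".")[0] for p in not_met)
print(by_domain.most_common())  # → [('AC', 3), ('IA', 1)]
```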

The SPRS score calculation uses specific deduction weights for each practice — this isn't a simple percentage. Use the DoD Assessment Methodology scoring template or a compliance platform (Drata, Secureframe, and similar tools automate the calculation from your finding inputs) to ensure you're computing it correctly.
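The deduction logic itself is simple once you have the weight table. A sketch with made-up weights standing in for the DoD Assessment Methodology's official 1/3/5-point values:

```python
# Illustrative weights only — pull the real values from the
# DoD Assessment Methodology, where each practice deducts 1, 3, or 5 points.
PRACTICE_WEIGHTS = {
    "3.1.1": 5,
    "3.1.20": 1,
    "3.3.1": 3,
    "3.5.3": 5,
    "3.12.4": 1,
}

def sprs_score(not_met: set[str], weights: dict[str, int]) -> int:
    """Start at the 110-point maximum and deduct each unmet practice's
    weight. Scores can go negative (the floor under the official table is -203)."""
    return 110 - sum(weights[p] for p in not_met)

print(sprs_score(set(), PRACTICE_WEIGHTS))                 # → 110
print(sprs_score({"3.3.1", "3.12.4"}, PRACTICE_WEIGHTS))   # → 106
```

This is why two organizations with the same number of gaps can have very different scores: five 1-point gaps cost 5 points, while five 5-point gaps cost 25.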

5. Plan of Action and Milestones (POA&M)

CA.L2-3.12.2 requires that you develop plans of action showing how you'll address deficiencies. The POA&M is typically produced as a separate document but referenced in the assessment report.

For each Not Met or Partially Met finding, the POA&M should include:

  • Control ID and description of the gap
  • Planned remediation action (specific, not generic)
  • Responsible individual or team
  • Required resources (budget, tools, personnel)
  • Estimated completion date
  • Interim mitigating measures (if any) that reduce risk while the finding is open

The POA&M is a living document. Update it as findings are remediated, add new findings from subsequent scans or reviews, and date every change. Your assessor will review your POA&M as part of the assessment — they want to see that you're actively managing it, not that it was last touched two years ago.
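Keeping the POA&M "living" is easier when every entry carries its own dated change log. A minimal sketch — the fields mirror the list above, and the names are illustrative:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class PoamEntry:
    control_id: str
    gap: str
    remediation: str                  # specific planned action, not generic
    owner: str                        # responsible individual or team
    resources: str                    # budget, tools, personnel
    target_date: date
    interim_mitigations: list[str] = field(default_factory=list)
    history: list[str] = field(default_factory=list)  # dated change log

    def log(self, note: str) -> None:
        """Date every change so a reviewer can see the entry is actively managed."""
        self.history.append(f"{date.today().isoformat()}: {note}")
```

Whether you track this in code, a GRC platform, or a spreadsheet column, the dated history is what shows an assessor the document wasn't last touched two years ago.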

Self-Assessment vs. C3PAO Assessment Report

There's an important distinction in how these reports are used:

A self-assessment report is internal evidence. You submit your SPRS score to the DoD contractor database, but the report stays in your files. If DoD audits your self-assessment or initiates a False Claims Act investigation into your reported score, they can subpoena your report to verify the score is accurate. The report needs to honestly reflect your evaluation — not be reverse-engineered from a target score.

A C3PAO assessment report is produced by the third party and forms the basis of your CMMC Level 2 certification. You'll receive a copy of their findings. Review it carefully — not all C3PAO findings will be accurate, and you have a process to dispute errors before the assessment is finalized.

Common Mistake: Confusing the Assessment Report With the SSP

These are different documents with different purposes, and mixing them up creates real problems.

The SSP describes how controls are implemented: "We use Azure AD with Conditional Access to enforce MFA for all users with access to CUI systems." It's written in present tense and describes your current configuration.

The assessment report evaluates that claim: "Examined Conditional Access policy configuration; interviewed IT administrator; tested MFA enforcement by attempting access without a registered token. Confirmed MFA is enforced for all CUI-system user accounts. Evidence: CA policy screenshot dated [date], test log showing blocked access attempt." It's written as an evaluation finding with evidence.

When contractors hand an assessor a document that's half SSP and half assessment, the assessor has to spend time separating the claims from the evidence. Worse, it often means the organization hasn't actually conducted a structured evaluation — they've written descriptions and called them findings. That's not an assessment report; it's an annotated SSP, and it doesn't satisfy CA.L2-3.12.1.

What Your Assessor Expects

For CA.L2-3.12.1, the assessor wants evidence that you've conducted a structured evaluation of your controls, not just written them down. They'll ask to see your assessment report, review the methodology, check whether you applied examine, interview, and test methods appropriately, and look at your findings for internal consistency.

For CA.L2-3.12.2, they'll review your POA&M. Is it current? Does it reflect the findings from your most recent assessment? Are the remediation timelines reasonable and being met?

They're not expecting a perfect score or an immaculate control environment. They're evaluating whether you have a functioning assessment and continuous improvement cycle. An organization that conducts honest self-assessments, maintains a current POA&M, and closes findings systematically demonstrates more real compliance maturity than one with a glossy report and a suspiciously perfect score.

---

CTA: If your last self-assessment doesn't have control-by-control findings with supporting evidence, it's not an assessment — it's a gap analysis. Run a structured evaluation using NIST 800-171A before your C3PAO assessment so you're not discovering gaps in the assessor's presence.