Explore best practices for effective IT remediation to enhance compliance and cybersecurity resilience.

---

Closing CMMC Gaps: A Remediation Playbook

A gap assessment gives you a list of problems. Remediation is solving them in the right order, at the right pace, with enough documentation to satisfy an assessor who wasn't there when you fixed things.

Most organizations get the gap assessment done and then lose momentum. They start fixing things that seem urgent or that someone on the team knows how to do, rather than working through a prioritized plan. Six months later they've closed 40 gaps but missed 15 high-impact ones, and their SPRS score is still low because the scoring methodology penalizes missing controls unevenly.

This playbook is designed to prevent that.

Understand How Gaps Translate to SPRS Score Impact

Before you start remediating, you need to understand the scoring mechanism you're working against. The NIST SP 800-171 DoD Assessment Methodology assigns a point value to each 800-171 control based on impact. Not all controls are worth the same number of points.

The scoring works from a baseline of 110 (all controls met). Each unimplemented control subtracts either 1, 3, or 5 points. The SPRS score can go as low as -203. Here's the distribution:

  • 5-point controls: The highest-impact gaps. These are typically in the Access Control (AC), Identification and Authentication (IA), Configuration Management (CM), and System and Communications Protection (SC) domains.
  • 3-point controls: Medium-impact. Spread across Audit and Accountability (AU), Incident Response (IR), System and Information Integrity (SI), and Risk Assessment (RA) domains.
  • 1-point controls: Lower individual impact but still required.

A practical example: If you're missing MFA for privileged accounts (IA.L2-3.5.3, 5 points) and haven't deployed FIPS-validated cryptography to protect CUI (SC.L2-3.13.11, 5 points), those two gaps alone cost you 10 points. Fixing both before submission improves your score by 10 points without touching anything else.

Where to find the specific point values: The DoD Assessment Methodology document, available at dodcio.defense.gov. It lists every control with its point value. Download it, add a column to your gap list showing the SPRS impact of each gap, and sort by impact. That's your remediation priority list.
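The arithmetic above can be sketched in a few lines. The control IDs and point weights below are illustrative examples; the authoritative values come from the methodology document itself:

```python
# Minimal sketch of SPRS scoring: start from the 110-point baseline and
# subtract each unimplemented control's weight. Weights shown here are
# examples only; pull the real values from the DoD Assessment Methodology.

def sprs_score(gaps: dict[str, int]) -> int:
    """gaps maps an unimplemented control ID to its point value (1, 3, or 5)."""
    return 110 - sum(gaps.values())

def priority_order(gaps: dict[str, int]) -> list[str]:
    """Highest-value gaps first: this is your remediation queue."""
    return sorted(gaps, key=gaps.get, reverse=True)

if __name__ == "__main__":
    gaps = {
        "IA.L2-3.5.3": 5,    # MFA for privileged accounts (example weight)
        "SC.L2-3.13.11": 5,  # FIPS-validated cryptography (example weight)
        "AU.L2-3.3.1": 3,    # audit logging (example weight)
        "MP.L2-3.8.4": 1,    # media marking (example weight)
    }
    print(sprs_score(gaps))        # 110 - 14 = 96
    print(priority_order(gaps))    # 5-point gaps first
```

Sorting the gap list by weight, as `priority_order` does, is exactly the "add a column, sort by impact" step described above, just automated.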

Phase 1: Triage Your Gaps (Weeks 1–2)

Take your gap assessment output and categorize every gap into three buckets:

Must fix before assessment. Controls where the gap is so fundamental that no C3PAO would issue certification with them open. Examples: no MFA implementation at all, no encryption for CUI at rest, no centralized audit logging, no security awareness training program. These are generally not POA&M-eligible: under the CMMC final rule, Level 2 POA&Ms are limited to select lower-weight controls, and your score must still reach at least 88 of 110 for conditional certification.

Can be POA&M'd. Controls that are partially implemented or where a viable path to implementation exists within 180 days of assessment. CMMC does allow for POA&M items at Level 2, but they must represent genuine partially-implemented controls — not "we haven't started this yet." A control where you've deployed the tooling but haven't completed documentation is a reasonable POA&M item. A control where you haven't thought about it yet is a gap that needs to go in the "must fix" bucket.

Can be addressed in parallel. Policy-level and procedural gaps that don't require significant technical work — documentation updates, policy formalization, procedure writing — can often proceed in parallel with technical implementation work. Don't let policy work slow down technical remediation.

Output from Phase 1: Three categorized lists, each sorted by SPRS impact. This is your remediation roadmap.
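The triage output can live in a simple structure that keeps each bucket sorted by SPRS impact. The bucket names and sample gaps here are this sketch's own, not part of any DoD schema:

```python
# A sketch of the Phase 1 triage output: each gap carries its SPRS point
# value and a bucket label assigned during review. Bucket names and the
# sample gaps are illustrative only.

from collections import defaultdict

MUST_FIX, POAM, PARALLEL = "must_fix", "poam", "parallel"

def triage(gaps: list[dict]) -> dict[str, list[dict]]:
    """Group gaps by bucket, each list sorted by SPRS impact (descending)."""
    buckets = defaultdict(list)
    for gap in gaps:
        buckets[gap["bucket"]].append(gap)
    for items in buckets.values():
        items.sort(key=lambda g: g["points"], reverse=True)
    return dict(buckets)

gaps = [
    {"control": "IA.L2-3.5.3", "points": 5, "bucket": MUST_FIX},  # no MFA at all
    {"control": "AT.L2-3.2.1", "points": 3, "bucket": PARALLEL},  # training policy
    {"control": "CM.L2-3.4.1", "points": 5, "bucket": POAM},      # baselines drafted
]
roadmap = triage(gaps)
```

The same structure feeds the sprint planning below: Sprint 1 pulls from the top of the must-fix bucket.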

Phase 2: Execute in 90-Day Sprints (Months 2–9)

Don't try to close everything at once. Work in 90-day sprints with defined targets.

Sprint 1 (Months 2–4): High-impact, high-SPRS-value gaps.

Start with the controls that will improve your SPRS score most and eliminate the most fundamental assessment risks. This typically means:

  • Identity and authentication: MFA for all remote access and privileged accounts (IA.L2-3.5.3) with replay-resistant authentication (IA.L2-3.5.4), unique identification of users and devices (IA.L2-3.5.1), and authentication of those identities (IA.L2-3.5.2). If you don't have MFA everywhere it's required, this is your first sprint target. Implementation: Microsoft Entra ID (formerly Azure AD) Conditional Access, Okta, Duo Security, or hardware tokens depending on your environment. Timeline: 4–8 weeks to fully deploy and document.
  • Encryption: Cryptographic protection for CUI in transit (SC.L2-3.13.8) and at rest (SC.L2-3.13.16), using FIPS-validated modules (SC.L2-3.13.11). BitLocker with FIPS mode on Windows, FileVault on Mac, TLS 1.2+ with FIPS cipher suites for all CUI transmission. Timeline: 4–6 weeks to deploy and generate validation documentation.
  • Centralized audit logging: The Audit and Accountability (AU) domain has nine controls that collectively require logging security-relevant events from CUI systems, protecting those logs, and retaining them long enough to support after-the-fact investigation (one year is a common benchmark, though 800-171 doesn't set a fixed period). If you don't have a SIEM or centralized logging solution (Splunk, Microsoft Sentinel, Elastic SIEM, Graylog), implement one. This is not optional, and assessors verify it directly. Timeline: 6–10 weeks to deploy, configure, and document.

Sprint 2 (Months 4–6): Configuration management and boundary protection.

  • System baselines and change control (CM.L2-3.4.1 through CM.L2-3.4.3): Documented baseline configurations for every CUI system type, with a formal change control process. Use CIS Benchmarks or DISA STIGs as your baseline source. The documentation work here is significant — plan for 40–80 hours to document baselines for a typical environment.
  • Network boundary protection (SC.L2-3.13.1, SC.L2-3.13.5): Firewalls at the CUI environment boundary with logging and traffic monitoring. If you're using a cloud enclave, verify that the network security groups and firewall rules match your documented boundary controls.
  • Vulnerability scanning (RA.L2-3.11.2): Deploy and configure a vulnerability scanner (Nessus Professional, Rapid7 InsightVM, Qualys) and run your first scan. Document your scanning schedule and remediation SLAs. Assessors will ask about this and may want to see scan reports going back at least 90 days.

Sprint 3 (Months 6–9): Documentation, training, and procedural controls.

  • SSP completion: You should be building the SSP throughout remediation, documenting each control as it's implemented. By Sprint 3, the SSP should reflect the current implemented state. Review every control description against your actual configuration. This is where most discrepancies appear.
  • Security awareness training (AT.L2-3.2.1 through AT.L2-3.2.3): Run training for all users, document completions, and complete role-specific training for system administrators and privileged users. Don't leave this to the last week before your assessment — assessors will check training completion dates and a batch of completions one week before the assessment looks staged.
  • Incident response program (IR.L2-3.6.1 through IR.L2-3.6.3): Written incident response plan, a mechanism for reporting incidents to DoD within 72 hours, and documented incident response capability testing. This is frequently underdeveloped at small contractors.
  • Remaining policy and procedure documentation: Personnel security, physical access procedures, media protection procedures, maintenance procedures. These don't require new technical implementations but they do require time to write properly.

Building and Managing Your POA&M

The Plan of Action and Milestones is a living document, not a parking lot for things you don't want to fix. Here's how to build one that actually supports your assessment:

Structure each POA&M item with:

  • Control ID and description
  • Current state (why it's not fully implemented)
  • Planned remediation action
  • Resources needed (who, what tools, what budget)
  • Target completion date
  • Milestone dates for intermediate steps
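As a sketch, those fields can be captured in a simple trackable record. The field names are this example's own, not a DoD-mandated schema, and the sample item is hypothetical:

```python
# A minimal POA&M item record, mirroring the fields listed above.
# Field names and the sample values are illustrative, not a DoD schema.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class PoamItem:
    control_id: str
    description: str
    current_state: str                 # why it's not fully implemented
    planned_action: str
    resources: str                     # who, what tools, what budget
    target_date: date                  # must fit the 180-day window
    milestones: list[tuple[str, date]] = field(default_factory=list)
    closed: bool = False

item = PoamItem(
    control_id="CM.L2-3.4.1",
    description="Baseline configurations for CUI systems",
    current_state="Baselines drafted for servers; workstations pending",
    planned_action="Apply CIS Benchmark baselines to workstation fleet",
    resources="IT ops, ~40 hours, CIS-CAT Lite",
    target_date=date(2025, 9, 30),
    milestones=[("Workstation baseline documented", date(2025, 8, 15))],
)
```

The same record exports cleanly into whatever tracker you use (Jira, ServiceNow, or a spreadsheet), so the POA&M and the tracking system stay in sync.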

Realistic timelines matter. A POA&M item whose target date matches the actual effort, say 30 days for 30 days of work, tells the assessor you understand the work. Items parked at the 180-day maximum that looked exactly the same at your previous assessment tell them you're using the POA&M to avoid fixing things.

Keep it current. Update the POA&M as you close items. An assessor who sees 20 POA&M items whose controls your SSP now shows as implemented will view it favorably; it demonstrates a functioning remediation program. An assessor who sees 20 items that haven't moved in six months sees a compliance program in name only.

What cannot go in a POA&M: Controls where the risk to CUI is so high that no reasonable POA&M timeline is acceptable. MFA, encryption, and audit logging are the most common examples. These need to be implemented before your assessment, not planned for after it.

Evidence as You Go

One of the most expensive mistakes in CMMC remediation is implementing controls without capturing evidence in real time. Three months after you deploy BitLocker, you may not have documentation of the deployment process, the FIPS validation certificate number, or screenshots of the Group Policy settings. You'll need all of that for your assessor.

Build evidence collection into your remediation process:

  • Screenshot configurations immediately after implementation
  • Save vulnerability scan reports as PDFs with dates
  • Archive training completion records with timestamps
  • Capture change management tickets for every security-significant configuration change
  • Save encryption validation certificates (NIST CMVP validation numbers) for every cryptographic module protecting CUI

Organize evidence by CMMC domain and control as you go. Don't wait until the pre-assessment phase to build your evidence package — you'll be collecting evidence for 110 controls under time pressure, and the quality will show.
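One way to make that organization automatic is to standardize where each artifact lands the moment it's captured. A minimal sketch, with illustrative folder layout and file names:

```python
# A sketch of per-control evidence filing: domain/control folders with
# date-stamped artifact names, built as evidence is captured. The layout
# and file names are this example's own convention, not a CMMC requirement.

from datetime import date
from pathlib import Path

def evidence_path(root: Path, domain: str, control_id: str, artifact: str) -> Path:
    """Build a dated, per-control evidence path, creating folders as needed."""
    folder = root / domain / control_id
    folder.mkdir(parents=True, exist_ok=True)
    return folder / f"{date.today().isoformat()}_{artifact}"

root = Path("evidence")
dest = evidence_path(root, "SC", "SC.L2-3.13.11", "bitlocker_fips_gpo.png")
# e.g. evidence/SC/SC.L2-3.13.11/2025-06-01_bitlocker_fips_gpo.png
```

Date-stamping file names as they're saved also gives you the timestamps assessors look for without any extra bookkeeping.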

Common Mistakes in Remediation

Remediating in random order. Fixing the easy things first feels productive but often doesn't move the SPRS score or eliminate the highest-risk gaps. Sort by SPRS impact and risk severity, not by difficulty or familiarity.

Writing policies before implementing controls. Policy documentation that describes controls you haven't implemented creates two problems: the policy gets outdated as the implementation changes, and your assessor will find discrepancies between what the policy says and what your systems actually do. Implement first, document accurately second.

Treating POA&M items as permanent fixtures. A POA&M item with a 180-day target date that's still open 12 months later isn't a POA&M item — it's an unresolved finding waiting to become an assessment failure. Close your POA&M items or escalate them as resource problems that need management attention.
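A small script can surface aging items before they become findings. The field names here are hypothetical, matching whatever export your tracker produces:

```python
# Flag open POA&M items whose target date has passed, so they surface as
# management escalations rather than aging silently. Field names are
# illustrative; adapt them to your tracker's export format.

from datetime import date

def stale_items(poam: list[dict], today: date) -> list[dict]:
    """Return open items whose target date has passed."""
    return [i for i in poam if not i["closed"] and i["target_date"] < today]

poam = [
    {"control": "CM.L2-3.4.1", "target_date": date(2025, 3, 1), "closed": False},
    {"control": "AT.L2-3.2.2", "target_date": date(2025, 9, 1), "closed": False},
]
overdue = stale_items(poam, today=date(2025, 6, 1))  # flags CM.L2-3.4.1 only
```

Run something like this monthly and route the output to whoever owns the remediation budget.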

Not updating the SSP during remediation. Your SSP needs to reflect the implemented state of your controls. If you implement MFA in month 3 but don't update the SSP until month 8, the SSP is inaccurate for five months and will likely diverge from your actual configuration by the time you update it. Update the relevant SSP section within two weeks of implementing each control.

What Your Assessor Expects

Your C3PAO assessor will review your POA&M as part of the assessment. They're looking for:

  • Genuine partial implementation for each POA&M item (not "haven't started")
  • Realistic timelines that match the stated work
  • Evidence of progress since the POA&M was created
  • POA&M items consistent with what the SSP describes

The strongest evidence of a functioning remediation program is a closed POA&M history — items that were open at the start of your assessment preparation and are now fully implemented and documented. That demonstrates a security program that works, not just one that exists on paper.

---

Calculating your current SPRS score? Use the DoD Assessment Methodology spreadsheet from dodcio.defense.gov to apply point values to each of your gaps. Sort by points-at-risk to build your priority queue. Your score before and after remediation tells you exactly what each sprint of work is worth.