Building a Control Implementation Tracker
Establish a comprehensive cybersecurity controls list for effective compliance and risk management.
---
Implementing 110 security controls across a real organization isn't something you can track in your head or manage with email threads. You need a system that shows you what's done, what isn't, who owns each item, what the evidence is, and what the gaps are. That's a control implementation tracker.
This isn't a compliance platform sales pitch. A well-built spreadsheet does the job for most small and mid-size defense contractors. The point is to have an organized, current record of where each of your 110 controls stands — and to understand what belongs in that tracker versus what belongs in other documents.
What a Control Implementation Tracker Is (and Isn't)
The tracker is not your System Security Plan. The SSP is a narrative document that describes how each control is implemented. The tracker is a management tool that shows whether each control is implemented, who's responsible for it, and what evidence exists.
The tracker is not your POA&M. The POA&M (Plan of Action & Milestones) is specifically for controls that are not yet fully implemented — it documents deficiencies, planned remediation, and timelines. The tracker covers all 110 controls, including the ones that are fully met. Under CA.L2-3.12.2, you're required to develop and implement plans of action to correct deficiencies; the tracker is the upstream tool that identifies those deficiencies.
Think of the relationship this way: the tracker is your dashboard. The SSP is your detailed technical documentation. The POA&M is your to-do list for what's broken. All three need to stay in sync.
The Structure: What Fields Actually Matter
A functional tracker doesn't need to be elaborate. The fields that provide real value:
Control ID — The NIST 800-171 control identifier (e.g., AC.L2-3.1.1, SI.L2-3.14.2). Use the CMMC notation consistently so you can cross-reference against NIST SP 800-171A assessment objectives.
Control Name / Description — A brief description of what the control requires. Don't copy the full control text — a short label is enough. "Limit system access to authorized users" for AC.L2-3.1.1.
Implementation Status — The status of each control. Use a simple four-value scale:
- Met — fully implemented, with evidence
- Partially Met — implementation is in progress or incomplete
- Not Met — not yet implemented
- Not Applicable — with documented justification for why it doesn't apply
Avoid invented status categories like "In Progress," "Planned," or "Partially Compliant" that don't map to how CMMC assessors evaluate controls. Your assessor scores each control Met, Not Met, or Not Applicable; there is no partial credit. Partially Met is useful for internal tracking, but it scores as Not Met on assessment day, and your tracker should reflect reality in those terms.
Owner — The specific person or role responsible for implementing and maintaining this control. Not a team or department — a name. Ownership without a name means nobody owns it.
Implementation Description (brief) — A 1–2 sentence summary of how the control is implemented. "MFA enforced via Azure AD Conditional Access policy for all remote and privileged account access." This is a compressed version of your SSP entry — useful for quick review without reading the full SSP.
Evidence Reference — Where the evidence lives. A file path, SharePoint link, or ticket number that points to the documentation the assessor will review. "SSP Section 3.1 / screenshots: /evidence/AC/AC.L2-3.1.3-screenshots.pdf." If you can't point to evidence, the control isn't Met regardless of what the implementation status says.
Assessment Objective Coverage — NIST SP 800-171A breaks each control into assessment objectives (the specific things assessors check). For some controls, there's one objective. For others, there are five or six. Tracking objective-level coverage helps you avoid marking a control Met when you've only addressed part of it. This is the field most organizations skip — and the one that most often produces surprises in assessments.
Last Verified — The date the implementation was last verified to be in place. Controls drift. A control that was Met 18 months ago may not be Met today. This field tells you how stale your verification is.
POA&M Reference — If the control has an open POA&M item, link to it here. This maintains the connection between the tracker and the POA&M.
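The fields above can be sketched as a simple row schema. This is a hedged Python illustration of one way to structure a tracker row; the field names are my own shorthand, not a required or standard format:

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import Optional

class Status(Enum):
    MET = "Met"
    PARTIALLY_MET = "Partially Met"
    NOT_MET = "Not Met"
    NOT_APPLICABLE = "Not Applicable"  # requires documented justification

@dataclass
class ControlRow:
    control_id: str                       # e.g. "AC.L2-3.1.1"
    name: str                             # short label, not the full control text
    status: Status
    owner: str                            # a named person, not a department
    implementation: str = ""              # 1-2 sentence summary of how it's met
    evidence_ref: str = ""                # file path, SharePoint link, or ticket
    objectives_met: int = 0               # 800-171A assessment objectives satisfied
    objectives_total: int = 0             # total objectives for this control
    last_verified: Optional[date] = None  # when implementation was last checked
    poam_ref: str = ""                    # open POA&M item, if any
```

The objectives_met / objectives_total pair is the point of the Assessment Objective Coverage field: a control with 4 of 6 objectives covered is visibly not Met, even if the headline status looks close.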
The Common Mistake: Marking Controls Met Without Evidence
The most frequent tracker error: filling in "Met" status for controls before the evidence exists — optimistically marking things done as part of implementation planning rather than as verification of actual implementation.
Your assessor evaluates each control against three methods per NIST 800-171A: examine, interview, and test. "Met" in your tracker means you have documentation to examine, personnel who can speak to the implementation in an interview, and the control holds up to technical testing. A control that works in practice but has no documentation is not Met from an assessment standpoint. A control documented in policy but not implemented technically is not Met either.
Before marking a control Met:
- Verify the technical implementation is actually in place
- Confirm the SSP description accurately describes the implementation
- Confirm the evidence exists and is accessible
- Confirm the person listed as owner actually knows how this control works and can speak to it in an interview
If any of those four things is false, the control isn't ready to be marked Met.
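That four-point check is mechanical enough to script. A minimal sketch, assuming hypothetical tracker fields like implementation_verified and owner_briefed that your own tracker may name differently:

```python
def ready_to_mark_met(row: dict) -> list[str]:
    """Return the reasons a control is NOT ready to be marked Met.
    An empty list means all four checks pass."""
    problems = []
    if not row.get("implementation_verified"):
        problems.append("technical implementation not verified in place")
    if not row.get("ssp_matches"):
        problems.append("SSP description does not match the real implementation")
    if not row.get("evidence_ref"):
        problems.append("no accessible evidence reference")
    if not row.get("owner_briefed"):
        problems.append("owner cannot speak to the control in an interview")
    return problems
```

The useful output is the list of reasons, not a boolean: each reason is a concrete task before the status can change.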
Building the Initial Tracker
Starting from scratch, the most efficient approach:
Step 1: Get the control list. Download the CMMC Assessment Guide — Level 2, published by the Department of Defense (available from the DoD CIO's CMMC documentation page). It lists all 110 practices with the associated assessment objectives and discussion. This is the authoritative source for what assessors evaluate.
Step 2: Set up your structure. A spreadsheet (Excel, Google Sheets) with the fields described above. Organize by domain (AC, AT, AU, CM, IA, IR, MA, MP, PE, PS, RA, CA, SI, SC) for easier navigation. Each domain gets a tab or a section.
Step 3: Populate status honestly. Go through each control and mark your current status based on reality, not aspiration. For a contractor starting their CMMC journey, most controls will be Not Met or Partially Met. That's fine — the tracker's value is in honesty, not in looking good.
Step 4: Assign owners. Every control gets a named owner. If you can't identify who owns a control, that's a gap in your security governance that needs to be addressed.
Step 5: Document existing evidence. For controls that are already Met, document where the evidence lives. This is often the most time-consuming part — not because the evidence doesn't exist, but because it's scattered across email threads, shared drives, and tribal knowledge.
Initial population of a 110-control tracker takes 20–40 hours for a typical small contractor environment, assuming someone who knows the environment and the framework handles it. Budget for this in your assessment preparation timeline.
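Step 2 can be bootstrapped with a short script. This sketch writes a blank CSV skeleton with the fields described earlier and every control defaulted to Not Met; it assumes you supply the control IDs yourself from the assessment guide (they are not bundled here):

```python
import csv

# The 14 NIST 800-171 control families, matching the domain-per-tab layout
DOMAINS = ["AC", "AT", "AU", "CA", "CM", "IA", "IR",
           "MA", "MP", "PE", "PS", "RA", "SC", "SI"]

FIELDS = ["Control ID", "Control Name", "Status", "Owner",
          "Implementation Description", "Evidence Reference",
          "Objectives Met", "Objectives Total", "Last Verified", "POA&M Ref"]

def write_skeleton(control_ids, path="tracker.csv"):
    """Write a blank tracker: one row per control, status defaulted to
    Not Met so the honest-population pass in Step 3 starts from zero."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        for cid in sorted(control_ids):
            writer.writerow({"Control ID": cid, "Status": "Not Met"})
```

Defaulting everything to Not Met is deliberate: it forces each Met status to be an explicit, evidenced decision rather than the starting state.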
Spreadsheet vs. GRC Platform
For most small contractors (under 200 people, one CUI environment, straightforward IT infrastructure), a well-maintained spreadsheet in SharePoint or Confluence does everything a GRC platform does at no cost. It's easy to customize, easy to share, and easy for an assessor to review.
The case for upgrading to a GRC platform (Drata, Secureframe, RegScale, or similar): your environment is complex enough that manual evidence collection becomes a significant burden, you have multiple systems and teams contributing to control implementation, you need continuous monitoring to detect control drift, or you're approaching your assessment and want automated evidence collection.
GRC platforms in the CMMC space run $15,000–$50,000/year depending on company size and features. They integrate with your technical infrastructure (Azure AD, endpoint management, SIEM) to pull configuration evidence automatically — which is the main value over a spreadsheet. They don't replace the judgment calls about what "Met" means, and they don't write your SSP.
The decision point: if you're spending more than 2–3 hours per week maintaining your tracker and evidence package manually, a GRC platform probably pays for itself. If maintenance is a monthly task, a spreadsheet is fine.
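The 2–3 hour heuristic falls out of simple break-even arithmetic. A sketch, using a hypothetical $100/hr fully loaded labor rate (substitute your own):

```python
def breakeven_hours_per_week(platform_cost_per_year, loaded_rate_per_hour,
                             work_weeks=50):
    """Weekly maintenance hours at which a GRC platform's annual cost
    equals the manual labor it would replace."""
    return platform_cost_per_year / (loaded_rate_per_hour * work_weeks)

# At a hypothetical $100/hr loaded rate, the $15K entry point breaks
# even at 3.0 hrs/week; the $50K high end needs 10 hrs/week.
print(breakeven_hours_per_week(15_000, 100))  # 3.0
print(breakeven_hours_per_week(50_000, 100))  # 10.0
```

The arithmetic ignores soft benefits (continuous monitoring, audit-trail automation), so treat the break-even hours as a floor, not the whole decision.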
Using the Tracker to Prepare for Assessment
Six months before your C3PAO assessment, your tracker should be your primary assessment readiness tool. Walk through it systematically:
Verify every "Met" control — for each one, click through to the evidence reference and confirm the evidence is current and accurate. If evidence was collected 12 months ago, verify the control is still in the same state.
Resolve "Partially Met" controls — every partially met control either gets remediated to Met (with evidence) or gets documented in the POA&M with a realistic timeline. Your assessor scores a partially implemented control as Not Met; there is no partial credit for the pieces you have finished.
Confirm owner knowledge — for each high-risk domain (AC, SC, SI, AU), brief the control owners on what they'll be asked in interviews. The assessor will sit down with your IT admin and ask how your audit logging works. They should be able to answer without consulting the SSP.
Check evidence accessibility — all evidence should be in a location you can navigate to quickly during the assessment. Create an evidence folder structure that mirrors your domain-by-domain tracker.
Reconcile tracker, SSP, and POA&M — these three documents should tell the same story. If the tracker shows AC.L2-3.1.3 as Met but the SSP description is generic and there's no evidence reference, something needs to be fixed before the assessors arrive.
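The first and last of those walkthrough steps are easy to automate over a tracker export. A sketch, assuming each row is a dict with hypothetical control_id, status, evidence_ref, and last_verified keys:

```python
from datetime import date

def readiness_issues(rows, max_age_days=365, today=None):
    """Flag Met controls whose verification is stale or missing, or whose
    evidence reference is empty — the rows to walk through first."""
    today = today or date.today()
    issues = []
    for r in rows:
        if r["status"] != "Met":
            continue  # Partially/Not Met controls belong on the POA&M instead
        if not r["evidence_ref"]:
            issues.append((r["control_id"], "Met with no evidence reference"))
        last = r.get("last_verified")
        if last is None or (today - last).days > max_age_days:
            issues.append((r["control_id"], "verification stale or missing"))
    return issues
```

Run it monthly, not just before the assessment: an empty issues list is a cheap ongoing signal that the tracker, evidence, and reality still agree.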
What Your Assessor Expects
Assessors don't require a specific tracker format. They're not going to grade your spreadsheet design. What they're evaluating is whether you have a working security management program — and a well-maintained tracker is one of the clearest signals that you do.
During the assessment, your tracker serves as a navigation tool: when the assessor asks about a specific control, you can quickly point to the implementation description, the owner, and the evidence location. Contractors who can navigate to evidence quickly and clearly appear prepared. Contractors who don't have organized records spend assessment days scrambling.
Keep the tracker current. An accurate tracker with 15 Not Met controls is better than an aspirational tracker with 110 controls marked Met and no supporting evidence. Assessors expect to find some gaps — what they're evaluating is whether you know where they are and have a plan to address them.
---
Starting your tracker? Download the CMMC Assessment Guide — Level 2 from the DoD CIO's CMMC documentation page first; building your structure around the actual assessment objectives is more useful than building it around the control text alone.