DoD Basic Assessment: How SPRS Scoring Works
Learn how DoD Basic Assessment scoring works: control weights, the scoring math, SPRS submission steps, and the POA&M that backs your score.
Meta description character count: 140
---
Word count: ~2,100
Tier: 2 (Practitioner)
Specificity markers hit (5 of 5):
- ✅ NIST/CMMC control reference — DFARS 252.204-7019/7020, NIST SP 800-171 Rev 2, DoD Assessment Methodology v1.2.1, specific control domains
- ✅ Cost/time estimate — assessment timeline (2–6 weeks), POA&M remediation windows
- ✅ Tool/product name — SPRS portal (sprs.csd.disa.mil), DoD Assessment Methodology v1.2.1 spreadsheet, Totem.tech free assessment tool, DIBCAC
- ✅ Common mistake — multiple, including counting planned work as implemented, missing the POA&M date, single-SSP errors
- ✅ Decision point with guidance — what to do when score is negative, whether to wait before submitting
---
Full Article
Every DoD contractor who handles Controlled Unclassified Information has to complete a DoD Basic Assessment and post the result to the Supplier Performance Risk System — better known as SPRS. This is not optional. DFARS 252.204-7020 makes it a contract requirement, and contracting officers check SPRS before award. If there's no score on file, you're out of the running.
The problem is that most contractors don't fully understand how the scoring works until they're staring at a negative number wondering how they got there. This article walks you through the mechanics — how the score is calculated, what the weights mean, and the specific decisions that determine whether you end up at +110 or in the red.
---
What the DoD Basic Assessment Actually Is
The DoD Assessment Methodology framework has three tiers:
- Basic — a self-assessment conducted by the contractor against the 110 controls in NIST SP 800-171 Rev 2. No third party involved. You evaluate yourself, calculate a score, and post it to SPRS.
- Medium — conducted by the DoD (typically DCMA's DIBCAC team) using document review and interviews. Posted to SPRS by the government.
- High — a DIBCAC on-site assessment with documentation, interviews, and technical testing.
The Basic Assessment is what most contractors are dealing with right now. It's the self-assessment version — which sounds easier than a third-party audit, but "self-reported" doesn't mean "consequence-free." Knowingly submitting a false score to SPRS creates False Claims Act exposure. That has been the basis of several DoJ enforcement actions since 2021. Score honestly.
The governing documents are:
- DFARS 252.204-7019 — notice that NIST SP 800-171 assessment requirements apply to this contract
- DFARS 252.204-7020 — the actual assessment requirements, including what you submit to SPRS
- NIST SP 800-171 DoD Assessment Methodology, Version 1.2.1 — the scoring rubric. This is the document that defines how points are assigned and deducted.
Download the DoD Assessment Methodology v1.2.1 from the DAU/Office of the Secretary of Defense website. It includes a spreadsheet you fill out for each control family. That spreadsheet is your primary working document for the assessment.
---
The Scoring Math
Start with a perfect score of 110. Every control you haven't fully implemented reduces that score. The amount it reduces depends on the control's weight.
Each of the 110 NIST SP 800-171 Rev 2 controls has an assigned point value: 1, 3, or 5. The weights were assigned based on security impact — how much damage an unimplemented control could allow an attacker to cause. The most operationally critical controls carry 5-point weights. Controls with a specific but limited effect on security carry 3-point weights. Everything else is 1 point.
The control weights sum to 313 — far more than 110 — which is why scores can go negative. If you fail to implement every single control, you land at 110 − 313 = −203. That's the floor. The range is −203 to +110.
For each control, the methodology gives you two possible determinations:
- Met — control is fully implemented. No deduction.
- Not Met — control is not fully implemented. Subtract the full weight (5, 3, or 1) and record the gap in your POA&M.
There is no general "partially met" category: a control that's mostly implemented scores the same as one you haven't started. The methodology carves out exactly two exceptions:
- 3.5.3 (multi-factor authentication) — if MFA is implemented for remote access and privileged accounts but not yet for general users, subtract 3 points instead of 5.
- 3.13.11 (FIPS-validated cryptography) — if encryption is employed but isn't FIPS-validated, subtract 3 points instead of 5.
Those two exceptions are the only partial credit in the entire methodology. Don't generalize them: for every other control, "in progress" still costs the full weight, and that all-or-nothing math matters when you're deciding which gaps to close first.
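The arithmetic is simple enough to sanity-check in a few lines of code. Here's a minimal sketch in Python; the control IDs, weights, and statuses are hypothetical sample data, and the authoritative weights live in Annex A of the Assessment Methodology v1.2.1:

```python
# Minimal SPRS-style score calculator. Illustrative sketch only; the control
# data below is hypothetical, not an official weight table.

PERFECT_SCORE = 110

# control_id -> (weight, fully_implemented)
controls = {
    "3.1.1":  (5, True),   # limit system access to authorized users
    "3.5.3":  (5, False),  # multi-factor authentication, not yet deployed
    "3.13.1": (5, True),   # boundary protection
    "3.1.9":  (1, False),  # privacy and security notices, documentation gap
}

def sprs_score(controls: dict[str, tuple[int, bool]]) -> int:
    """Start at 110 and subtract the full weight of every unimplemented control."""
    deductions = sum(w for w, implemented in controls.values() if not implemented)
    return PERFECT_SCORE - deductions

print(sprs_score(controls))  # 110 - 5 - 1 = 104
```

Note the two open gaps cost their full weights; there's no credit for work that's merely planned.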
---
What the Weights Mean in Practice
The 5-point controls are the ones DoD considers most damaging if left unaddressed. This group includes:
- Most of the basic safeguarding requirements from FAR 52.204-21 (15 requirements that map onto 17 NIST controls) — things like uniquely identifying users, limiting system access to authorized individuals, and sanitizing media before disposal. If you're a federal contractor and you're not doing these, you're failing the basics.
- Controls governing multi-factor authentication, encryption for CUI in transit and at rest, and audit log generation.
- Controls that, if unimplemented, would give an attacker a direct path to CUI — things like boundary protection (3.13.1) and protection from malicious code (3.14.2).
The 3-point controls have a real but more contained security impact — things like session termination, Wi-Fi access authorization, and data-at-rest protection for mobile devices.
The 1-point controls are important but their absence doesn't immediately expose CUI. Security training frequency reminders, physical access log reviews, that tier of thing.
If you're triaging where to focus remediation effort, start with the 5-pointers. Implementing one unaddressed 5-point control recovers more score than implementing five 1-point controls. Work the list by weight, not alphabetically.
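That triage rule is easy to turn into a worklist. A small sketch, again with hypothetical gap data:

```python
# Rank open gaps by points recovered and show the score trajectory as each one
# closes. Gap data is hypothetical sample data, not a real assessment.
score = 96  # current SPRS score: 110 minus the 14 points of open gaps below
gaps = [
    ("3.1.9",  1, "privacy and security notices"),
    ("3.5.3",  5, "multi-factor authentication"),
    ("3.13.8", 3, "CUI encryption in transit"),
    ("3.14.2", 5, "malicious code protection"),
]

# Highest-weight gaps first: one 5-pointer recovers as much as five 1-pointers.
for control_id, weight, name in sorted(gaps, key=lambda g: g[1], reverse=True):
    score += weight
    print(f"close {control_id} ({name}): score -> {score}")
```

Sorting by weight rather than control number is the whole trick: the two 5-pointers alone move this example from 96 to 106.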
---
Getting to a Score You Can Submit
DFARS 252.204-7020 requires you to submit the following to SPRS:
- System Security Plan name
- CAGE code(s) the SSP covers
- Brief description of the system architecture
- Date of assessment
- Total score
- Date by which a score of 110 is expected — meaning the date you plan to have all deficiencies remediated
That last item — the "expected 110 date" — is often misunderstood. It is not optional. If your score is below 110, you must submit a realistic target date for full implementation tied to your Plan of Action and Milestones (POA&M). You're not certifying that you'll hit 110 by that date; you're representing your best current estimate. Assessors and contracting officers do look at this date.
To submit to SPRS, go to sprs.csd.disa.mil. You'll need an account in the Procurement Integrated Enterprise Environment (PIEE), which handles identity verification for SPRS access — it's a government system. Contractors without access submit via encrypted email to [email protected], and the government posts the score on your behalf.
---
Common Mistakes
Counting planned work as implemented
Any determination better than a full deduction requires that the safeguard is actually in place — not just planned. If you have a policy document that says "we will implement multi-factor authentication" but MFA isn't turned on for any systems yet, that control is not met and costs the full 5 points. Upgrading the determination to save a few points on your score is exactly the kind of inaccuracy that triggers False Claims Act liability.
The standard: a control counts only when technical or procedural safeguards are operating across its full scope, with evidence you can produce and a POA&M entry for anything still open. A draft policy with no implementation doesn't count. Intent doesn't count.
Submitting one SSP when you need multiple
If your organization operates multiple separate IT environments — different network segments, different business units, different locations — each environment that processes CUI needs its own SSP. And each SSP needs its own SPRS submission.
A common mistake is rolling all CUI environments into one SSP, calculating a single score, and submitting that. If a DIBCAC Medium or High assessment later finds that the single SSP didn't accurately capture all CUI environments, the discrepancy becomes a finding that goes beyond a score correction.
Confusing "the SPRS score" with "CMMC certification"
An SPRS score — even a perfect 110 — is not CMMC Level 2 certification. It's a self-reported assessment under a related but distinct framework. CMMC Level 2 requires a third-party assessment by a certified C3PAO, which will examine your evidence, interview your people, and test your controls. Your SPRS score may be 110 and a CMMC assessment may still surface findings. The two processes are complementary, not interchangeable.
Getting the assessment date wrong
The assessment date you submit to SPRS must reflect when you actually completed the evaluation — not when you started it, not when you uploaded it, not the date you plan to finish your POA&M. If you spend six weeks going through all 110 controls and finish on a Thursday, Thursday is the date. This seems minor but DIBCAC reviewers look for consistency between assessment dates, SSP revision dates, and POA&M creation dates.
---
The POA&M: Your Score's Other Half
A score below 110 without a POA&M is a problem. DFARS 252.204-7020 expects that any deficiencies are documented in a plan with remediation actions, responsible parties, and target completion dates. Your POA&M doesn't need to be a masterpiece — it needs to be credible and maintained.
Typical remediation timelines that assessors consider reasonable:
- Critical 5-point gaps (missing MFA, no audit logging, no boundary protection): 60–90 days to close, assuming you're actively working on it.
- Moderate 3-point gaps (wireless access controls, session management): 90–180 days.
- Lower 1-point gaps (documentation, policy reviews, training cadence items): up to 12 months if resource-constrained, but the longer your timeline, the more scrutiny it draws.
A POA&M with 40 items all scheduled for completion three years from now is going to raise questions. A POA&M with 8 items on a 6-month schedule with interim milestones is credible. The difference isn't just aesthetics — contracting officers sometimes use the POA&M timeline in contract risk decisions.
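Those bands make for a quick self-check over your POA&M before you submit. A sketch; the entry format and day limits are my reading of the timelines above, not a DFARS rule:

```python
from datetime import date

# Rough outer bounds from the timelines above, keyed by control weight.
# Rules of thumb only, not a regulatory requirement.
MAX_DAYS = {5: 90, 3: 180, 1: 365}

def flag_slow_items(poam, assessment_date):
    """Return POA&M entries whose target date falls outside the band for their weight."""
    return [
        control_id
        for control_id, weight, target_date in poam
        if (target_date - assessment_date).days > MAX_DAYS[weight]
    ]

poam = [
    ("3.5.3", 5, date(2024, 6, 1)),   # MFA, ~80 days out: inside the band
    ("3.1.9", 1, date(2027, 3, 14)),  # three years out: will draw scrutiny
]
print(flag_slow_items(poam, date(2024, 3, 14)))  # ['3.1.9']
```

Anything this check flags is exactly the kind of entry a contracting officer will ask about.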
Decision point: If your score is significantly negative (below −50), consider whether you should submit immediately or spend 4–8 weeks closing the worst gaps first. There's no DFARS rule that says you must submit before achieving a certain score — only that you must submit before contract award when the clause appears in the solicitation. Getting your score above zero before you submit isn't gaming the system; it's doing the work. The risk of waiting: if a proposal deadline passes while you're still remediating, you'll be disqualified. Don't cut it close.
---
How Long a Basic Assessment Takes
Expect 2–6 weeks of real effort, depending on your environment size and how well-documented your controls already are.
The heaviest lift is usually the evidence gathering — going through each of the 110 controls, determining actual implementation status, and documenting the findings. If your SSP is current and your configurations are documented, you're on the shorter end. If you're building your SSP from scratch while conducting the assessment, you're on the longer end.
Free tools like the DoD Assessment Methodology spreadsheet (embedded in the v1.2.1 document) give you a structured worksheet for working through each control. Totem.tech also offers a free preliminary assessment tool that maps directly to the methodology. Neither replaces the judgment calls — only someone who knows your environment can accurately determine "met," "partially met," or "not met" for each control.
---
What Your Assessor Expects
If your organization receives a DIBCAC Medium or High assessment — triggered because you've hit a certain contract threshold or because a prime selected you for review — the assessors will compare your submitted SPRS score against what they find. They're not just checking your current state; they're checking whether your self-assessment was accurate.
Specifically, assessors look for:
- SSP completeness — does the SSP accurately describe the CUI environment? Does the architecture description match what they're seeing?
- Score accuracy — are the controls you marked "met" actually implemented and evidenced? Can you produce logs, configurations, and screenshots for each one?
- POA&M credibility — for each control with a deduction, is there a real POA&M entry with a realistic timeline and documented progress?
- Assessment consistency — do your assessment date, SSP revision date, and POA&M dates tell a coherent story?
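That last check is mechanical enough to script against your own records before a reviewer does it for you. A sketch with hypothetical dates; the orderings it flags are heuristics, not official criteria:

```python
from datetime import date

def date_story_issues(ssp_revision: date, assessment: date, poam_created: date) -> list[str]:
    """Flag date orderings a reviewer would question. Heuristic, not official guidance."""
    issues = []
    if assessment < ssp_revision:
        issues.append("assessment predates the SSP revision it was scored against")
    if poam_created < assessment:
        issues.append("POA&M created before the assessment that identified the gaps")
    return issues

print(date_story_issues(
    ssp_revision=date(2024, 2, 1),
    assessment=date(2024, 3, 14),
    poam_created=date(2024, 3, 20),
))  # empty list: the dates tell a coherent story
```

The coherent ordering — SSP revised, then assessed, then POA&M opened for what the assessment found — is what a reviewer expects to see.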
The most common outcome of a DIBCAC review of a self-assessment isn't outright fraud — it's score inflation from marking controls implemented when the reality was closer to not implemented. Tighten those determinations before any government assessment comes looking.
Your SPRS score is visible to contracting officers with access to the system. It tells them two things at a glance: your current security posture and whether you're making honest progress toward 110. A low score with a credible, actively worked POA&M is better than an inflated score that doesn't hold up to scrutiny.
---
If you're working through your first Basic Assessment, start with the DoD Assessment Methodology v1.2.1 spreadsheet and go control by control. Score yourself against what you can actually evidence today — not what you're planning or hoping. Get your POA&M in shape before you submit. Then post it, and keep it current: a Basic Assessment score remains valid for three years, but update your submission whenever your implementation status materially changes.
Need help working through the 110 controls or building a defensible POA&M? A qualified CMMC Registered Practitioner can run you through the assessment methodology, identify the high-weight gaps, and help you prioritize remediation before your next contract cycle.