The 5 CMMC Mistakes That Cost Contractors Millions

The Aerojet Rocketdyne settlement was $9 million. A former employee filed a qui tam lawsuit alleging the company misrepresented its cybersecurity compliance on Department of Defense (DoD; redesignated the Department of War by executive order in September 2025) and NASA contracts. The core allegation: Aerojet knew it was not meeting the required security controls and certified anyway.

That settlement will not be the last one. The Department of Justice's Civil Division runs an active cybersecurity enforcement initiative, and the False Claims Act creates a mechanism for employees, competitors, and whistleblowers to report misrepresentation directly to the government.

Most CMMC compliance failures are not intentional fraud. They are the result of predictable, preventable mistakes that are made repeatedly across the defense industrial base. This article covers the five most common and most costly ones.

Mistake 1: Waiting for the Contract Requirement

The most damaging mistake a defense contractor can make is treating CMMC as a contract response task. The thinking goes: when a solicitation requires CMMC Level 2 C3PAO certification, we will start the process.

The problem is that C3PAO certification has a real-world lead time measured in months, not weeks. A gap assessment and initial scoping exercise take weeks. Remediation of significant gaps takes 12 to 24 months for organizations starting from a low baseline. The C3PAO assessment itself requires scheduling in advance. Conditional certification allows 180 days to close out POA&M items. Add these together and the minimum realistic timeline from start to certified is 12 months for a well-prepared organization, and 18 to 24 months for organizations with significant gaps.
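As a rough sanity check, the arithmetic above can be sketched in a few lines of Python. The phase durations below are illustrative assumptions for a well-prepared organization, not figures from the rule:

```python
# Illustrative phase durations in months. These specific numbers are
# assumptions for the sketch; actual durations depend on scope, gap
# count, and C3PAO availability.
phases = {
    "gap assessment and scoping": 1,
    "remediation of findings": 5,
    "C3PAO scheduling lead time": 3,
    "assessment plus POA&M closeout": 3,
}

total_months = sum(phases.values())
print(f"Minimum realistic timeline: {total_months} months")  # 12 months
```

For an organization with a low starting baseline, stretching the remediation line item to 12 to 18 months yields the 18-to-24-month figure.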

When Phase 2 solicitations begin appearing in late 2026, any contractor who has not started their program will be unable to compete for those awards. The window to build a CMMC program that supports contract performance during the Phase 2 and Phase 3 rollout is open right now and narrowing.

What to do instead: Start your gap assessment now, regardless of whether a specific solicitation has arrived. If you handle CUI under a DoD contract, you already have a compliance obligation under DFARS 252.204-7012. Treat CMMC readiness as an ongoing program, not a pre-bid activity.

Mistake 2: Inflating the SPRS Score

The Supplier Performance Risk System score is where many compliance representations go wrong. Organizations self-assess, find that their actual score is lower than they would like it to be, and then either ignore gaps in their scoring or over-interpret "partial implementation" as "Met."

This is a serious problem for two reasons.

First, inflating your SPRS score is a potential False Claims Act violation. When a senior official affirms the accuracy of a SPRS submission under penalty of law, and the submission overstates compliance, the affirmation creates civil liability. Under 31 U.S.C. § 3729, the government can recover treble damages and civil penalties. Any employee who knows about the misrepresentation can file a qui tam lawsuit.

Second, an inflated SPRS score creates operational security risk. If you tell yourself and your customers that you have security controls you actually do not have, you are operating under a false picture of your actual risk posture. When a breach occurs, the gap between your stated compliance and your actual controls will be very visible in the litigation record.

What to do instead: Score accurately. A lower SPRS score is not a disqualifier in and of itself. It is data. It tells you what gaps need to be fixed and in what priority order. Build a POA&M, remediate the gaps systematically, and resubmit as your posture improves. A well-documented gap assessment and an honest SPRS score with a credible POA&M are far better than a fraudulent score.
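The scoring mechanics come from the NIST SP 800-171 DoD Assessment Methodology: you start at 110 (one point per requirement) and deduct each unmet requirement's weighted value of 1, 3, or 5 points, so the score can go well below zero. A minimal sketch, with hypothetical gap counts:

```python
MAX_SCORE = 110  # one point available per NIST SP 800-171 requirement


def sprs_score(unmet_weights):
    """Deduct the weighted value (1, 3, or 5) of each requirement not met."""
    return MAX_SCORE - sum(unmet_weights)


# Hypothetical posture: 10 five-point, 8 three-point, and 6 one-point gaps.
score = sprs_score([5] * 10 + [3] * 8 + [1] * 6)
print(score)  # 110 - 80 = 30
```

An honest 30 with a credible POA&M is exactly the "data" described above. Note that the methodology also defines partial-credit cases for a few requirements, which this sketch omits.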

Mistake 3: Treating the SSP as a Checkbox Document

The System Security Plan is the most commonly deficient artifact in CMMC gap assessments. It is also the artifact that C3PAO assessors rely on most heavily to understand your environment and evaluate your controls.

Two SSP failure modes appear repeatedly:

  • The aspirational SSP: The SSP describes security controls in terms of what the organization intends to implement or is planning to implement, rather than what is currently operational. When an assessor reviews the SSP and then tests the technical environment, they find that controls described as “implemented” are not actually in place. The result is a long list of Not Met findings that could have been avoided if the SSP had been accurate.
  • The missing SSP: Some organizations have never built an SSP at all. This produces an immediate “No Score” in SPRS because CA.L2-3.12.4 (SSP requirement) is a hard gate. You cannot complete a Level 2 assessment without an SSP assessed as Met. An assessor cannot evaluate controls that are not described anywhere.

A third SSP failure is the outdated SSP: a document that was written a year or two ago and has not been updated to reflect changes to the IT environment, new systems, new personnel, or changes to how CUI is handled.

What to do instead: Build the SSP to reflect your current, actual security posture. For each requirement, document the current implementation status (Implemented, Partially Implemented, Planned, Alternative Implementation, or Not Applicable), describe specifically how it is implemented, and reference supporting evidence artifacts. Update it when your environment changes. Treat it as a living document, not a one-time deliverable.
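One way to keep the SSP honest is to track each requirement as structured data rather than prose alone. The record layout below is a hypothetical sketch, not a prescribed format; the status values mirror the list above:

```python
from dataclasses import dataclass, field
from enum import Enum


class Status(Enum):
    IMPLEMENTED = "Implemented"
    PARTIALLY_IMPLEMENTED = "Partially Implemented"
    PLANNED = "Planned"
    ALTERNATIVE = "Alternative Implementation"
    NOT_APPLICABLE = "Not Applicable"


@dataclass
class SspEntry:
    requirement_id: str  # e.g. "AC.L2-3.1.1"
    status: Status
    implementation: str  # how the control is implemented today
    evidence: list[str] = field(default_factory=list)  # artifact references


def aspirational(entries):
    """Flag entries claimed as Implemented but backed by no evidence --
    the exact gap a C3PAO assessor will find during technical testing."""
    return [e.requirement_id for e in entries
            if e.status is Status.IMPLEMENTED and not e.evidence]
```

A periodic run of a check like `aspirational()` against the current environment is a cheap way to catch the drift that produces the outdated-SSP failure mode.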

Mistake 4: Mishandling the POA&M

Plan of Action and Milestones (POA&M) management has three common failure modes.

  • Putting prohibited controls on the POA&M. Six specific controls defined at 32 CFR § 170.21(a)(2)(iii) cannot be placed on a POA&M. If any of these six are Not Met and listed on your POA&M, SPRS returns “No Status.” Your CMMC certification attempt fails. The six prohibited controls represent gaps that regulators have determined are too critical to defer. Organizations sometimes discover they have a prohibited control on their POA&M mid-assessment, which terminates the assessment.
  • Underestimating the 180-day closeout. Organizations that receive Conditional Level 2 status have 180 days to close all POA&M items. The POA&M closeout can only be finalized in eMASS once during that 180-day period. If any item is still Not Met when the closeout assessment is submitted, Conditional Status terminates and the organization must start a new full assessment. There is no partial credit for closing most but not all items.
  • Building a POA&M with unrealistic timelines. A POA&M with 30-day remediation timelines for items that realistically take 6 months signals to assessors that the document is aspirational rather than operational. Assessors will question the credibility of a POA&M that does not reflect realistic project timelines.

What to do instead: Before finalizing your POA&M, verify that none of the six prohibited controls are included. Build realistic timelines based on actual remediation complexity and your team’s bandwidth. When pursuing Conditional certification, treat the 180-day window as a hard deadline and plan for the work to be complete at least 30 days before the deadline to allow for closeout preparation.
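The two hard rules above, no prohibited controls and a single 180-day closeout, lend themselves to an automated pre-flight check. A sketch, assuming each open item is a (control ID, planned close date) pair; the prohibited-control IDs must be populated from 32 CFR § 170.21 and are intentionally not hard-coded here:

```python
from datetime import date, timedelta


def validate_poam(items, prohibited_ids, conditional_start):
    """Pre-flight checks for a POA&M under Conditional Level 2 status.

    items: open (control_id, planned_close_date) pairs.
    prohibited_ids: the control IDs that may never appear on a POA&M
        (populate from 32 CFR section 170.21; not listed here).
    conditional_start: the date Conditional status was granted.
    """
    violations = [cid for cid, _ in items if cid in prohibited_ids]
    deadline = conditional_start + timedelta(days=180)
    # Aim to finish ~30 days early to leave room for closeout preparation.
    target = deadline - timedelta(days=30)
    late = [cid for cid, close in items if close > target]
    return violations, deadline, late
```

Running a check like this before finalizing the POA&M catches both the "No Status" trap and remediation timelines that blow past the single closeout window.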

Mistake 5: Choosing the Wrong CMMC Partner

This mistake is not visible until it is too late. An organization selects a CMMC Registered Provider Organization (RPO) based on the lowest price and receives a gap assessment that looks comprehensive, produces a polished deliverable, and leaves the organization believing they are ready for a C3PAO assessment.

When the C3PAO arrives and starts testing actual controls, they find gaps the RPO missed. Evidence documentation is missing. The SSP describes controls that are not implemented. The assessment produces a long list of Not Met findings, remediation requirements, and a delayed or failed certification.

The cost of choosing the wrong RPO is measured in: the C3PAO assessment fees spent on an unsuccessful assessment, the time and cost of additional remediation, the delay in achieving certification, and potentially lost contracts during the gap period.

What distinguishes a capable CMMC RPO from a less capable one:

  • Certified professionals: The team working on your engagement should hold CCP (Certified CMMC Professional) or CCA (Certified CMMC Assessor) credentials from the Cyber AB. These credentials require passing a proctored exam and completing recognized training.
  • Technical depth: Effective CMMC gap assessments require actual technical testing, not just document review and interviews. An RPO that produces a gap assessment without verifying technical implementation is producing a documentation gap assessment, not a security gap assessment.
  • Evidence methodology: The evidence collected and documented during your readiness engagement should be aligned with what DIBCAC objective evidence lists expect. If your RPO does not know what a DIBCAC assessor looks for, the evidence they help you collect may not satisfy the actual assessment.
  • References from similar organizations: Ask for references from organizations of similar size and complexity that achieved C3PAO certification after working with the RPO.

What to do instead: Vet your CMMC RPO before signing an engagement. Ask about credentials, methodology, and the specific experience of the people who will work on your engagement. A lower price with a less capable team is a false economy.

The Pattern Behind the Mistakes

These five mistakes share a common thread: they all stem from treating CMMC as a documentation exercise rather than a real security program.

CMMC is a verification program. C3PAOs are trained to distinguish between organizations that have actually implemented security controls and organizations that have written documents saying they have. The depth of testing in a C3PAO assessment, including technical verification, system log review, and interviews with multiple personnel, is designed specifically to find the gap between documentation and reality.

The organizations that succeed in CMMC assessments are the ones that built real security programs, documented them accurately, and engaged partners who understood what assessors actually look for.

Key Takeaways

  • Start your CMMC program before a specific contract requirement forces the issue
  • Score your SPRS accurately — an inflated score creates False Claims Act liability
  • Build an SSP that reflects your current, actual security posture
  • Never place a prohibited control on your POA&M; treat the 180-day closeout window as a hard deadline with one shot
  • Vet your CMMC RPO on credentials, technical depth, and evidence methodology

Learn More

For the complete CMMC framework, see CMMC 101: The Complete Guide to CMMC Compliance for Defense Contractors.

Concerned your current CMMC program has one of these gaps? NR Labs provides CMMC gap assessments and program reviews designed to identify and fix exactly these kinds of issues before they become assessment failures. Contact us to schedule a review.