Evidence collection is where CMMC readiness programs either hold up under scrutiny or fall apart. An organization can have every security control technically implemented, but if it cannot produce the artifacts an assessor expects, those controls will score as Not Met.
This article explains what CMMC evidence actually is, how assessors use it, what types of artifacts satisfy common requirements, and how to build a defensible evidence package before your C3PAO assessment begins.
How CMMC Assessors Evaluate Requirements
C3PAO assessors evaluate each requirement using three types of evidence:
- Examine: Review of documentation, policies, procedures, plans, system configurations, reports, and other artifacts. This is the largest category of evidence.
- Interview: Conversations with personnel who are responsible for or knowledgeable about security controls. Assessors interview system administrators, security staff, HR representatives, end users, and management.
- Test: Technical verification that controls are actually working as described. Assessors run queries, review logs, test authentication, review firewall rules, and verify configurations directly.
A requirement scores Met only when the results of all three methods (examine, interview, and test) are consistent and support that determination. If the documentation says MFA is enforced but a test shows an account can authenticate with just a password, the requirement is Not Met regardless of what the policy says.
The DIBCAC Objective Evidence Lists
The Defense Industrial Base Cybersecurity Assessment Center publishes objective evidence lists that describe specifically what assessors expect to see for each CMMC requirement. As of July 2025, these lists are the most authoritative reference for evidence preparation.
The key principle behind the DIBCAC evidence lists: evidence must be specific, dated, and traceable to your actual environment. Generic policy templates, screenshots from demo environments, and undated documents do not satisfy the standard.
Evidence Types by Domain
Access Control (AC)
Access control evidence covers who has access to what, how that access is managed, and how it is controlled.
Key artifacts:
- Active Directory or identity platform screenshots showing user account configuration (account types, group memberships, MFA enrollment status)
- Access control policy document with effective date and approval signature
- User access review records: dated spreadsheet or report showing last access review, changes made, accounts removed
- Privileged access management screenshots confirming privileged accounts require separate credentials from standard accounts
- Screenshots of remote access configuration (VPN with MFA required)
- Network access control policy showing external connection approval process
Common evidence gaps: Undated access reviews, access review spreadsheets with no evidence they were acted upon, missing documentation of the approval process for new access grants.
Audit and Accountability (AU)
Audit evidence demonstrates that your systems are logging what they need to log and that someone reviews those logs.
Key artifacts:
- SIEM or log management platform screenshot showing all in-scope systems are sending logs
- Log retention policy confirming logs are retained for the required period
- Evidence of log review activity: SIEM dashboard showing review cadence, tickets or reports generated from log review, dated alert review records
- Sample audit event types enabled: logon/logoff, privilege use, account management events, policy changes
- Screenshots of log integrity protection (logs stored where they cannot be modified by the systems that generated them)
Common evidence gaps: Logs configured but no evidence of review, logs retained but only for 30 days (usually insufficient), missing coverage for key event types.
Configuration Management (CM)
Configuration management evidence shows that systems are built to a secure baseline and that changes are controlled.
Key artifacts:
- Configuration baseline document for each operating system type in scope (Windows, Linux, network devices)
- Screenshots of group policy or configuration management platform showing baseline enforcement
- Change management policy and procedure
- Sample change request records showing approval before implementation
- Vulnerability scan results showing patch status
- Software inventory showing only authorized software is installed
Common evidence gaps: No formal configuration baseline, out-of-date patch scan results, a change management process that exists on paper with no records showing it is actually followed.
Identification and Authentication (IA)
Authentication evidence demonstrates how user identities are verified and how authentication controls are enforced.
Key artifacts:
- Identity platform (Azure AD, Okta, Active Directory) screenshots showing MFA enrollment rates and enforcement policies
- Password policy screenshots showing minimum length, complexity, and lockout settings
- Privileged Access Management (PAM) screenshots confirming privileged accounts use MFA
- Remote access authentication screenshots (VPN login flow showing MFA challenge)
- Service account inventory with ownership and purpose documented
- Evidence of regular review of service accounts and credential rotation
Common evidence gaps: MFA "enabled" but not enforced (users can bypass), service accounts with shared or undocumented credentials, privileged accounts that share credentials with standard user accounts.
Incident Response (IR)
Incident response evidence demonstrates that the organization is prepared to detect, respond to, and recover from security incidents.
Key artifacts:
- Incident response plan document (current version with effective date, approval signature)
- Incident response team roles and contact list
- Evidence of at least annual incident response test or tabletop exercise: exercise plan, scenario, participation list, after-action report
- Incident tracking record (even if no incidents occurred, the system for tracking them must exist)
- Process for reporting cyber incidents to DoD via DIBNet within the 72-hour window required by DFARS 252.204-7012
- Evidence that employees know how to report a suspected incident
Common evidence gaps: Incident response plan that has never been tested, no evidence of exercises, plan that references personnel or systems that no longer exist.
Maintenance (MA)
Maintenance evidence covers how system maintenance is conducted and controlled, including remote maintenance.
Key artifacts:
- Maintenance policy covering both local and remote maintenance
- Maintenance log showing scheduled and unscheduled maintenance activities
- Remote maintenance session records showing authentication and authorization before sessions begin
- Evidence that remote maintenance sessions are terminated when complete (no persistent backdoor access)
- Process for sanitizing or escorting maintenance equipment brought on-site
Common evidence gaps: No maintenance logs, remote maintenance conducted via uncontrolled means (direct RDP without MFA), vendor remote access accounts left active after maintenance is complete.
Multi-Factor Authentication (IA.L2-3.5.3)
MFA is a 5-point requirement and one of the most commonly tested controls. Evidence here is critical.
Key artifacts:
- Identity platform screenshot showing the MFA enforcement policy (enforced, not merely enabled, so users cannot bypass it)
- Screenshot of conditional access policy or equivalent that blocks access without MFA
- Evidence that MFA is required specifically for privileged account access
- For remote access: VPN or remote desktop gateway configuration showing MFA is required before session establishment
- Test results or authentication flow screenshots showing the MFA challenge in action
Important: "MFA is available" is not the same as "MFA is enforced." Assessors test this. If a test account can log in with only a password, the requirement is Not Met regardless of the policy documentation.
System and Communications Protection (SC)
SC evidence covers network architecture, encryption, and communication controls.
Key artifacts:
- Network diagram showing CUI environment boundary, firewall placement, and network segment separation
- Firewall rule set review (current rule set with documentation of each rule's business purpose)
- Encryption configuration screenshots: TLS 1.2+ enforced for data in transit, disk encryption enabled for CUI data at rest
- FIPS-validated cryptography confirmation: certificate or product documentation showing FIPS 140-2/3 validation
- VPN configuration showing encryption standards in use
- For cloud environments: FedRAMP authorization documentation for the CSP
Common evidence gaps: Network diagram that does not reflect the actual current architecture, firewall rules that allow broader access than necessary, encryption present but not FIPS-validated.
System and Information Integrity (SI)
SI evidence demonstrates that systems are protected against malware, vulnerabilities, and integrity compromise.
Key artifacts:
- Endpoint protection platform (EDR/AV) screenshot showing coverage across all in-scope systems
- Definition update configuration showing automatic updates enabled
- Patch management console output showing current patch status (date of last scan, systems with outstanding patches, policy for remediation timelines)
- Vulnerability scan results (dated within 90 days for most assessments)
- Security alerting configuration showing what alerts are generated and to whom
Common evidence gaps: AV present but not on all systems, definitions out of date on some systems, patch management console showing significant backlog with no documented remediation plan.
Evidence Packaging: How to Organize for Assessment
Evidence organization matters as much as the evidence itself. An assessor reviewing 500 unorganized screenshots cannot efficiently evaluate your controls. Proper organization reduces assessment friction and the risk of findings caused by an assessor not being able to locate relevant evidence.
Recommended organization structure:
Evidence Package/
├── AC - Access Control/
│ ├── AC.L2-3.1.1 - Access Control Policy.pdf
│ ├── AC.L2-3.1.1 - User Account Screenshot_2026-03-01.png
│ └── AC.L2-3.1.12 - MFA Enforcement Policy Screenshot.png
├── AU - Audit and Accountability/
│ ├── AU.L2-3.3.1 - Logging Configuration Screenshot.png
│ └── AU.L2-3.3.1 - Log Review Record_Feb2026.pdf
[... continue by domain]
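A structure like the one above can be scaffolded in a few lines. This is a sketch, not official tooling: the `scaffold` function name is my own, and the `DOMAINS` list covers only the eight domains discussed in this article (a full Level 2 package spans all 14 domains).

```python
from pathlib import Path

# Illustrative subset of CMMC domains; extend to the full 14-domain set
# for a real Level 2 evidence package.
DOMAINS = [
    "AC - Access Control",
    "AU - Audit and Accountability",
    "CM - Configuration Management",
    "IA - Identification and Authentication",
    "IR - Incident Response",
    "MA - Maintenance",
    "SC - System and Communications Protection",
    "SI - System and Information Integrity",
]

def scaffold(root: str = "Evidence Package") -> None:
    """Create one folder per domain under the evidence package root."""
    base = Path(root)
    for domain in DOMAINS:
        (base / domain).mkdir(parents=True, exist_ok=True)
```

Running `scaffold()` once gives every team member the same place to drop artifacts, which keeps the package consistent as it grows.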
Naming conventions:
- Include the requirement ID in the file name
- Include the date the artifact was captured
- Use descriptive names that indicate what the artifact shows
- Avoid generic names like "screenshot1.png"
Artifact freshness: Evidence should be current. Screenshots taken 12 months before an assessment may not reflect the current state of controls. For a C3PAO assessment, evidence collected within 90 days of the assessment start date is the standard for most technical artifacts. Policy documents should show a recent review or approval date.
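The naming and freshness rules above are easy to enforce mechanically. The following is a hypothetical helper, not official CMMC tooling; it assumes a requirement ID such as "AC.L2-3.1.1" at the start of each file name and a capture date embedded as YYYY-MM-DD, matching the example file names shown earlier.

```python
import re
from datetime import date
from pathlib import Path
from typing import List, Optional

# Assumed conventions: requirement ID prefix like "AC.L2-3.1.1",
# capture date embedded in the file name as YYYY-MM-DD.
REQ_ID = re.compile(r"^[A-Z]{2}\.L[12]-3\.\d+\.\d+")
CAPTURE_DATE = re.compile(r"(\d{4})-(\d{2})-(\d{2})")

def check_artifact(path: Path, max_age_days: int = 90,
                   today: Optional[date] = None) -> List[str]:
    """Return a list of naming/freshness problems for one artifact file."""
    today = today or date.today()
    problems = []
    name = path.name
    if not REQ_ID.match(name):
        problems.append("missing requirement ID prefix")
    m = CAPTURE_DATE.search(name)
    if not m:
        problems.append("no capture date in file name")
    else:
        captured = date(int(m.group(1)), int(m.group(2)), int(m.group(3)))
        age = (today - captured).days
        if age > max_age_days:
            problems.append(f"stale: captured {captured} ({age} days old)")
    return problems
```

Sweeping this over the package (for example, every file under `Path("Evidence Package").rglob("*")`) surfaces generic names like screenshot1.png and stale captures before the assessor does.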
The SHA-256 Hashing Requirement
All assessment artifacts must be hashed using SHA-256. The artifacts themselves remain with your organization; the hash log and the overall folder hash value are what get submitted to eMASS, so the artifacts can later be proven unchanged. This is a technical submission requirement that often catches organizations off guard. Ensure your evidence packaging process includes hash generation before upload.
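One workable way to generate the hashes is sketched below. This is an assumption-laden sketch, not the official eMASS tooling: `sha256_file` and `build_hash_log` are hypothetical helper names, and the `<digest>  <relative path>` log format should be adjusted to whatever your C3PAO specifies.

```python
import hashlib
from pathlib import Path

def sha256_file(path: Path) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_hash_log(package_root: str, log_name: str = "hash_log.txt") -> str:
    """Write '<digest>  <relative path>' for every artifact in the package,
    then return the SHA-256 of the log file itself."""
    root = Path(package_root)
    lines = []
    for path in sorted(root.rglob("*")):
        if path.is_file() and path.name != log_name:
            lines.append(f"{sha256_file(path)}  {path.relative_to(root)}")
    log_path = root / log_name
    log_path.write_text("\n".join(lines) + "\n", encoding="utf-8")
    return sha256_file(log_path)
```

Sorting the paths keeps the log deterministic, so regenerating it over an unchanged package yields the same log hash.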
Common Evidence Mistakes That Cause Assessment Failures
- Policies without implementation evidence. A strong written policy without technical evidence that it is being followed scores as Not Met. Policy and technical implementation must corroborate each other.
- Screenshots without context. A screenshot showing a configuration setting without context (which system, which policy, what date) is difficult for an assessor to evaluate. Annotate screenshots to identify the system, environment, and date.
- Undated artifacts. Evidence without dates cannot be verified as current. Every artifact should have a visible date.
- Missing personnel interview preparation. Assessors will interview your IT staff, security personnel, and sometimes end users. Personnel who cannot explain how controls work in the context of your specific environment create interview findings. Brief your team on what controls exist and how to describe them.
Key Takeaways
- Assessors evaluate evidence through examine, interview, and test — all three must be consistent
- Evidence must be specific, dated, and traceable to your actual environment
- Organize evidence by domain and requirement before the assessment begins
- All eMASS artifacts require SHA-256 hashing
- Common gaps: policies without technical corroboration, undated artifacts, MFA "enabled" but not enforced
Learn More
For the complete CMMC framework, see CMMC 101: The Complete Guide to CMMC Compliance for Defense Contractors.
Need help building a defensible evidence package before your C3PAO assessment? NR Labs builds evidence packages aligned to DIBCAC standards as part of our C3PAO readiness engagements. Contact us to get started.