A comprehensive guide for conducting Digital Sovereignty and Security maturity assessments
Version 1.1 - 7th March 2026

This is the 201 - Domain Overview & Assessment guide. Other levels available:
This Level 201 Enablement Guide provides comprehensive instructions for conducting Full Maturity Assessments with customers and partners. It is designed for technical managers, solution architects, and workshop facilitators who need to deliver consistent, high-quality assessments that provide valuable insights into an organization's Digital Sovereignty and Security maturity.
This guide assumes familiarity with basic Digital Sovereignty concepts. If you or your audience are new to Digital Sovereignty, consider starting with the 101 - Introduction guide.
The Full Maturity Assessment is a structured evaluation tool that measures an organization's capabilities across multiple domains using a proven 5-level maturity model based on the CMMI (Capability Maturity Model Integration) framework:
| Level | Name | Range | Description |
|---|---|---|---|
| Level 1 | Initial | 0-20% | Unpredictable, reactive processes; ad-hoc approach |
| Level 2 | Managed | 21-40% | Planned and executed processes; basic controls in place |
| Level 3 | Defined | 41-60% | Standardized and documented processes across organization |
| Level 4 | Quantitatively Managed | 61-80% | Measured and controlled processes with metrics |
| Level 5 | Optimizing | 81-100% | Continuous improvement and innovation |
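For facilitators who want to sanity-check scores by hand, here is a minimal sketch of the percentage-to-level mapping. The thresholds come from the table above; the function name and signature are illustrative, not the assessment tool's actual API.

```python
def maturity_level(score_pct: float) -> tuple[int, str]:
    """Map an overall score (0-100%) to a maturity level, per the table above."""
    levels = [
        (20, 1, "Initial"),
        (40, 2, "Managed"),
        (60, 3, "Defined"),
        (80, 4, "Quantitatively Managed"),
        (100, 5, "Optimizing"),
    ]
    for upper, level, name in levels:
        if score_pct <= upper:
            return level, name
    raise ValueError("score_pct must be between 0 and 100")

print(maturity_level(55))  # -> (3, 'Defined')
print(maturity_level(81))  # -> (5, 'Optimizing')
```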
We offer two primary assessment profiles, each focused on different organizational priorities:
Digital Sovereignty profile - 7 Domains: Data Sovereignty, Technical Sovereignty, Operational Sovereignty, Assurance Sovereignty, Open Source, Executive Oversight, Managed Services
Focus: Organizational control and independence from external dependencies, particularly important for government, healthcare, finance, and organizations with strict data residency requirements.
Security profile - 7 Domains: Secure Infrastructure, Secure Data, Secure Identity, Secure Application, Secure Network, Secure Recovery, Secure Operations
Focus: Comprehensive security posture across all layers of the technology stack, ideal for compliance-driven organizations and those with high security requirements.
Most organizations benefit from starting with Digital Sovereignty as it addresses strategic independence concerns. Security assessments can follow to provide deeper technical security insights.
Proper preparation is critical to a successful assessment. Consider the following when scheduling:
The assessment requires input from multiple stakeholders to ensure accurate ratings. Recommended participants:
| Role | Why They're Needed | Essential? |
|---|---|---|
| CIO / CTO | Strategic oversight, budget authority, executive-level questions | Yes |
| CISO / Security Lead | Security controls, risk management, compliance frameworks | Yes |
| Cloud/Infrastructure Lead | Technical sovereignty, infrastructure control, vendor relationships | Yes |
| Compliance/Legal Officer | Data residency, jurisdictional control, regulatory requirements | Recommended |
| Operations Manager | Operational processes, disaster recovery, managed services | Recommended |
| Procurement Lead | Vendor management, supply chain, contract terms | Optional |
Selecting the appropriate Line of Business (LOB) is crucial as it applies industry-specific weightings to domains. Guide your customer through this decision:
Best for: Banks, insurance companies, financial services, payment processors
Emphasized domains: Data Sovereignty (2.0×), Assurance Sovereignty (2.0×), Operational Sovereignty (1.5×)
Rationale: Financial institutions face stringent regulatory requirements (PCI DSS, SOX, DORA) demanding strong data protection, audit controls, and business continuity.
Best for: Hospitals, health systems, medical research, healthcare technology
Emphasized domains: Data Sovereignty (2.0×), Operational Sovereignty (2.0×)
Rationale: Healthcare organizations must protect sensitive patient data (HIPAA, GDPR) while maintaining 24/7 operational resilience for patient safety.
Best for: Federal/state/local government, public sector, defense contractors
Emphasized domains: Data Sovereignty (2.0×), Assurance Sovereignty (2.0×), Executive Oversight (2.0×)
Rationale: Government entities handle sensitive citizen data and critical infrastructure with strict sovereignty requirements, transparency needs, and national security concerns.
Best for: Industrial manufacturing, automotive, aerospace, discrete manufacturing
Emphasized domains: Operational Sovereignty (2.0×), Managed Services (2.0×)
Rationale: Manufacturers prioritize production uptime, OT/IT integration, and IP protection for proprietary designs and processes.
Best for: Telecom providers, ISPs, mobile carriers, network infrastructure
Emphasized domains: Data Sovereignty (2.0×), Operational Sovereignty (2.0×), Assurance Sovereignty (2.0×)
Rationale: Telecom operators manage critical communications infrastructure with subscriber data protection requirements and strict regulatory compliance (NIS2).
Best for: Organizations without specific industry focus or those spanning multiple sectors
Emphasized domains: All domains equally weighted (1.0×)
Rationale: Provides an unbiased assessment across all domains without industry-specific emphasis.
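To show how industry weighting changes the picture, here is a toy roll-up using the Financial Services weights above. The weighted-average formula is an assumption for illustration; the guide does not specify exactly how the tool aggregates weighted domains.

```python
# Hypothetical weighted roll-up. Weights are from the Financial Services
# profile above; the aggregation formula itself is an assumption.
FINANCIAL_WEIGHTS = {
    "Data Sovereignty": 2.0,
    "Technical Sovereignty": 1.0,
    "Operational Sovereignty": 1.5,
    "Assurance Sovereignty": 2.0,
    "Open Source": 1.0,
    "Executive Oversight": 1.0,
    "Managed Services": 1.0,
}

def weighted_overall(domain_scores: dict[str, float],
                     weights: dict[str, float]) -> float:
    """Weighted average of per-domain scores (each 0-100%)."""
    total = sum(domain_scores[d] * weights[d] for d in domain_scores)
    return total / sum(weights[d] for d in domain_scores)

scores = {d: 50.0 for d in FINANCIAL_WEIGHTS}  # flat mid-range profile
scores["Data Sovereignty"] = 20.0              # weak heavily weighted domain
print(round(weighted_overall(scores, FINANCIAL_WEIGHTS), 1))  # 43.7, well below 50
```

The point for facilitation: a gap in a 2.0× domain drags the weighted result down much harder than the same gap in a 1.0× domain, which is why profile selection matters before any rating begins.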
Send this checklist to participants at least 1 week before the assessment:
Before the session, ensure:
A well-structured session keeps participants engaged and ensures comprehensive coverage of all domains.
Explain assessment purpose, maturity model, review agenda, confirm participants and roles
Discuss and select appropriate assessment profile and industry weighting
Work through each domain systematically (~17 min per domain for 7 domains)
Review spider chart, discuss scores, identify obvious gaps
Discuss next steps, schedule follow-up, export results
Sessions often run long as participants want to discuss their challenges. Build in buffer time or be prepared to schedule a continuation session. Consider breaking complex assessments into multiple shorter sessions.
Use this script to open your assessment session professionally:
"Thank you all for joining today's Full Maturity Assessment. Over the next 2-3 hours, we'll be evaluating your organization's capabilities across [Digital Sovereignty / Security] domains using a proven 5-level maturity framework."
"This assessment is designed to be honest and constructive—not punitive. Most organizations score between levels 2-3 initially, and that's perfectly normal. The goal is to establish a baseline and identify priority areas for improvement."
"I'll be asking questions about your current capabilities and asking for evidence of implementation. Please be candid—overestimating maturity only hurts your own planning. If you're unsure about an answer, we can flag it for follow-up."
"Let's start by selecting your industry profile, which will adjust the weighting of domains based on your sector's specific needs..."
Each capability is rated on a 0-3 slider with four implementation status levels. Guide participants through this process:
Note: These capability ratings contribute to the overall 5-level maturity score for each domain.
Always ask for evidence to support capability ratings. Here are examples of acceptable evidence for each implementation status:
| Status | Slider Value | Acceptable Evidence Examples |
|---|---|---|
| No Capability | 0 | Verbal confirmation of gap, acknowledgment that capability doesn't exist, no current plans |
| In Planning | 1 | Project proposals, budget requests, initial requirements gathering, vendor evaluations, roadmap items |
| Work in Progress | 2 | Draft policies, active projects, pilot implementations, partial rollouts, some teams using it, configuration in progress |
| Fully Complete | 3 | Approved policies, documented procedures, organization-wide implementation, training completed, metrics being collected, regular reviews occurring |
Don't confuse capability status with overall maturity: A single capability rated "Fully Complete" (3) doesn't mean the organization is at "Optimizing" maturity level (Level 5). The overall maturity rating is calculated across all capabilities in a domain based on the percentage of possible points achieved.
Example: The CIO believes they have Level 4 disaster recovery, but the Operations Manager says they've never successfully tested it.
Response: "I'm hearing different perspectives here. Let's focus on what we can verify. [Operations Manager], can you describe your most recent DR test? [CIO], what metrics are you using to assess DR maturity? Based on industry best practices, regular testing is required for Level 4. Without test evidence, we should consider Level 2 or 3."
Approach: Stay neutral, ask for evidence, refer to maturity definitions, help them reach consensus based on facts.
Example: After several Level 1-2 scores, the CISO becomes defensive: "We have excellent security! This assessment is unfair!"
Response: "I appreciate your commitment to security. These scores reflect maturity along a journey—they're not a judgment of your team's effort or capability. Many excellent organizations score at Level 2-3 initially. The assessment helps us identify where focused investment will have the most impact. Would it help to review the scoring criteria together?"
Approach: Validate their feelings, emphasize growth mindset, reframe scores as opportunities, avoid blame.
Example: Multiple participants don't know the answer to questions about vendor contracts or key management.
Response: "That's valuable information in itself—if key stakeholders don't know, that typically indicates Level 1 or 2 maturity. Let's mark this for follow-up investigation and make a provisional rating of Level 1. You can update it later once you've verified."
Approach: Frame "don't know" as data, assign conservative rating, offer to revisit, ensure follow-up action item is captured.
Keep the assessment moving while ensuring thoroughness:
This section provides detailed guidance for each Digital Sovereignty domain. Each domain contains 8 questions organized into three tiers:
Each capability is assigned points (1-8) reflecting its importance within the domain. Higher point values indicate more critical capabilities for achieving sovereignty.
How scores are calculated: Each capability's slider value (0-3) is converted to a fraction of full implementation (0, 1/3, 2/3, or 1), then multiplied by the capability's point value. For example, a 5-point capability rated "Work in Progress" (slider value 2) contributes 2/3 × 5 ≈ 3.33 points.
The assessment automatically calculates domain scores by summing all capability contributions, then converts the total to an overall maturity level (Initial, Managed, Defined, Quantitatively Managed, or Optimizing).
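To make the arithmetic concrete, here is a minimal scoring sketch. The slider-to-fraction conversion and the percentage roll-up follow the description above; the capability names, point values, and slider ratings are hypothetical, and the tool's internals may differ.

```python
# Minimal sketch of the scoring described above: slider values 0-3 map to
# 0, 1/3, 2/3, and 1 of each capability's point value. Capability names,
# point values, and ratings below are hypothetical.
capabilities = [
    {"name": "Key management control", "points": 6, "slider": 2},
    {"name": "Data residency policy",  "points": 4, "slider": 3},
    {"name": "Confidential computing", "points": 3, "slider": 0},
]

earned = sum(c["points"] * c["slider"] / 3 for c in capabilities)
possible = sum(c["points"] for c in capabilities)
domain_pct = 100 * earned / possible

print(f"{earned:.2f} of {possible} points -> {domain_pct:.0f}%")
# 8.00 of 13 points -> 62%  (Level 4, Quantitatively Managed)
```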
This domain assesses an organization's ultimate control over its data, independent of external jurisdictions or political influences. It goes beyond basic data residency by focusing on legal control, access, and encryption management. Maturity here confirms that data location is actively governed by the organization's legal and business requirements, rather than dictated solely by a cloud provider or foreign law.
What this measures: Whether the organization explicitly controls where data is stored based on legal requirements
Key questions to ask:
Evidence to request: Data residency policy document, cloud provider contracts specifying regions, configuration screenshots showing geo-restrictions
Red flags: "We think it's in [region]", "The cloud provider handles that", "We haven't checked recently"
What this measures: Compliance with data protection regulations and implementation of privacy controls
Key questions to ask:
Evidence to request: Privacy policies, consent management systems, GDPR compliance documentation, Privacy Impact Assessments
Red flags: Confusion about applicable regulations, no defined process for data subject requests, relying solely on vendor certifications
What this measures: Whether the organization knows what data it has, where it is, and how sensitive it is
Key questions to ask:
Evidence to request: Data inventory/catalog, classification framework document, data discovery tool demonstrations, data ownership registers
Red flags: "We're working on that", manual spreadsheet-based tracking, no data ownership assigned
What this measures: Ability to resist extra-territorial legal demands and maintain domestic legal control
Key questions to ask:
Evidence to request: Vendor contracts showing governing law clauses, legal risk register, documented escalation procedures
Red flags: Contracts governed by foreign law, no notification provisions, unaware of jurisdictional conflicts
What this measures: Whether the organization exclusively controls encryption keys, independent of cloud providers
Key questions to ask:
Evidence to request: Key management architecture diagrams, HSM procurement/contracts, key rotation policies, external key management (EKM) solution documentation
Red flags: Provider-managed keys, lack of HSMs, no key rotation schedule, unclear about who can access keys
Note: This is a 6-point question because key control is fundamental to data sovereignty. Organizations often struggle here.
What this measures: Protection of data during processing (data-in-use), not just storage and transit
Key questions to ask:
Evidence to request: Confidential computing implementations (Intel SGX, AMD SEV, AWS Nitro Enclaves), memory encryption configurations, log sanitization policies
Red flags: Unaware of data-in-use protection, relying only on at-rest and in-transit encryption, plaintext logging of sensitive data
Note: This is often Level 1-2 for most organizations; confidential computing is still emerging.
What this measures: Real-time monitoring and immutable logging of all data movements
Key questions to ask:
Evidence to request: Data flow maps, DLP dashboards, audit log retention policies, SIEM integration, transfer blocking evidence
Red flags: No data flow visibility, reactive rather than preventive controls, logs stored with cloud provider
What this measures: Strict, audited, and revocable control over vendor and partner access to data
Key questions to ask:
Evidence to request: Third-party access policies, Privileged Access Management (PAM) systems, session recordings, vendor risk assessments
Red flags: Persistent vendor access, no session monitoring, vendors located in concerning jurisdictions, inability to quickly revoke access
Note: This is the highest point value question as third-party access is a primary sovereignty risk.
Technical Sovereignty evaluates the degree of control an organization maintains over the foundational components of its technology stack—from hardware and firmware to application source code and runtime environments. High maturity signifies deliberate reduction in reliance on proprietary interfaces and single-vendor ecosystems, ensuring the ability to rebuild or migrate critical functions if necessary.
Key Focus Areas: Technology stack ownership, vendor lock-in mitigation, standardized frameworks, interoperability, hardware provenance, self-hosted runtimes, IP control, future-proofing
Common Discussion Topics: Open source adoption, Kubernetes and containerization, multi-cloud strategies, escrow agreements, supply chain security
What this measures: The extent to which the organization controls its foundational technology components
Key questions to ask:
Evidence to request: Technology inventory, internal skills matrix, documentation of core systems
Red flags: Heavy reliance on proprietary systems, lack of internal technical expertise, "vendor handles everything" mentality
What this measures: Assessment and mitigation of vendor lock-in risks
Key questions to ask:
Evidence to request: Lock-in risk assessment, vendor contracts with exit clauses, list of vendor-specific dependencies
Red flags: No exit strategy, heavy use of proprietary APIs, contracts without portability provisions
What this measures: Use of industry-standard, non-proprietary technical frameworks
Key questions to ask:
Evidence to request: Technical standards documentation, API specifications, architecture review guidelines
Red flags: No standards policy, heavy reliance on vendor-specific APIs, systems that can't interoperate
What this measures: Ability to migrate workloads between platforms
Key questions to ask:
Evidence to request: Container adoption metrics, IaC repositories, migration test results
Red flags: No containerization strategy, manual infrastructure management, untested migration procedures
What this measures: Control over hardware supply chain and verification
Key questions to ask:
Evidence to request: Hardware procurement policies, supply chain verification procedures, TPM/secure boot configurations
Red flags: No hardware provenance tracking, unverified firmware updates, no supply chain validation
What this measures: Direct control over application runtime environments
Key questions to ask:
Evidence to request: Hosting architecture diagrams, administrative access controls, runtime configuration
Red flags: Fully managed PaaS with no underlying access, provider-controlled runtimes, limited administrative rights
What this measures: Ownership and control of source code and intellectual property
Key questions to ask:
Evidence to request: Source code repository locations, code escrow agreements, development contracts with IP clauses
Red flags: Vendor-owned custom code, third-party hosted version control, unclear IP ownership
What this measures: Strategic planning to address sovereignty risks in technology evolution
Key questions to ask:
Evidence to request: Technology roadmap documents, dependency risk register, sovereignty evaluation criteria
Red flags: No long-term planning, reactive approach to sovereignty, no evaluation framework for new tech
This domain examines the organization's autonomy and independence in executing critical business and IT operations. It ensures that essential functions can be performed without reliance on external human expertise or infrastructure outside the organization's direct control or trusted sovereign borders.
Key Focus Areas: Process documentation, managed service dependencies, IAM, internal skills, disaster recovery, supply chain vetting, incident response, operational autonomy
Common Discussion Topics: Break-glass procedures, in-house vs. outsourced operations, business continuity planning, geopolitical isolation scenarios
What this measures: Documentation of critical operational procedures for independence
Key questions to ask:
Evidence to request: Operational runbooks, procedure documentation, documentation update logs
Red flags: Reliance on vendor documentation, undocumented critical procedures, tribal knowledge
What this measures: Internal operational capability vs. external dependencies
Key questions to ask:
Evidence to request: Skills matrix, training programs, vendor dependency assessment
Red flags: Heavy vendor dependency, no internal capability, no skills development program
What this measures: Operational resilience and recovery capabilities
Key questions to ask:
Evidence to request: DR test results, backup infrastructure documentation, continuity plans
Red flags: Untested DR plans, backup dependency on same vendor, no failover capability
What this measures: Control over incident response processes
Key questions to ask:
Evidence to request: Incident response playbooks, SOC operations, log management infrastructure
Red flags: Vendor-controlled incident response, logs only accessible via vendor, no internal SOC
What this measures: Control over deployment processes
Key questions to ask:
Evidence to request: CI/CD pipeline architecture, deployment automation tools, release management processes
Red flags: Vendor-managed deployments, external CI/CD platforms, no automated deployment capability
What this measures: Operational continuity through staff development
Key questions to ask:
Evidence to request: Succession plans, cross-training programs, knowledge transfer documentation
Red flags: Single points of failure in staffing, no succession planning, concentrated expertise
What this measures: Independent infrastructure monitoring capability
Key questions to ask:
Evidence to request: Monitoring tool architecture, telemetry data storage, alerting systems
Red flags: Vendor-provided monitoring only, no independent telemetry, limited visibility
What this measures: Ability to exit infrastructure and migrate workloads
Key questions to ask:
Evidence to request: Migration test results, automated migration tools, documented RTO/RPO
Red flags: No exit strategy, untested migration, undefined RTO, manual migration processes
Assurance Sovereignty addresses the right, capability, and transparency required to verify the security and compliance claims of both internal systems and external vendors. It's the mechanism by which trust is verified, not assumed, through independent audits and continuous technical validation.
Key Focus Areas: Audit rights, sovereign SIEM, compliance verification, transparency requirements, sovereign certifications, continuous monitoring, security testing, vulnerability management
Common Discussion Topics: Right to audit clauses, SOC 2 Type II, penetration testing, third-party attestations, domestic vs. foreign auditors
What this measures: Right and capability to audit service providers
Key questions to ask:
Evidence to request: Audit rights in contracts, recent audit reports, audit schedules
Red flags: No audit rights, relying solely on vendor attestations, never conducted an audit
What this measures: Compliance framework implementation and verification
Key questions to ask:
Evidence to request: Compliance certifications, compliance monitoring systems, verification procedures
Red flags: Unclear compliance requirements, expired certifications, no verification process
What this measures: Control over security monitoring infrastructure
Key questions to ask:
Evidence to request: SIEM architecture, log aggregation infrastructure, monitoring tool ownership
Red flags: Vendor-controlled SIEM, logs not accessible independently, foreign-hosted security tools
What this measures: Infrastructure integrity verification capabilities
Key questions to ask:
Evidence to request: Integrity monitoring systems, tamper detection tools, baseline configurations
Red flags: No integrity verification, unable to detect tampering, no baseline management
What this measures: Independent security testing and validation
Key questions to ask:
Evidence to request: Penetration test reports, third-party assessment results, validation methodology
Red flags: No independent testing, relying on vendor testing only, unvalidated controls
What this measures: Compliance audit trail management
Key questions to ask:
Evidence to request: Audit log infrastructure, immutability controls, retention policies
Red flags: Mutable audit logs, no retention policy, inability to produce evidence quickly
What this measures: Control plane security and monitoring
Key questions to ask:
Evidence to request: Control plane access controls, monitoring configuration, alerting rules
Red flags: No control plane visibility, unmonitored access, no alerting for anomalies
What this measures: Tested capability to migrate due to assurance failures
Key questions to ask:
Evidence to request: Migration test results, exit criteria documentation, automated playbooks
Red flags: No tested migration capability, undefined exit criteria, manual migration only
This domain assesses the organization's strategic use of open-source software to reduce proprietary dependencies, increase transparency, and build internal capabilities. Mature organizations actively contribute to and influence open-source projects critical to their sovereignty goals.
Key Focus Areas: Open source strategy, community participation, license compliance, vulnerability management, sovereign distributions, contribution policies, internal expertise, project governance
Common Discussion Topics: Red Hat Enterprise Linux, Kubernetes, Apache projects, InnerSource, security scanning, open source vs. commercial support
What this measures: Strategic approach to open source software adoption
Key questions to ask:
Evidence to request: OSS adoption policy, software inventory showing OSS vs proprietary, justification process
Red flags: No OSS policy, default to proprietary solutions, no evaluation of alternatives
What this measures: Governance and tracking of open source components
Key questions to ask:
Evidence to request: OSS governance documentation, dependency tracking tools (SBOM), vulnerability scanning
Red flags: No OSS governance, unknown dependencies, no vulnerability tracking
What this measures: Active participation in open source communities
Key questions to ask:
Evidence to request: OSS contribution records, community participation metrics, sponsorship agreements
Red flags: No contributions, developers not allowed to participate, only consuming OSS
What this measures: Internal capability to support critical OSS
Key questions to ask:
Evidence to request: Skills matrix for OSS technologies, fork capability demonstration, support procedures
Red flags: No internal OSS expertise, unable to support critical components, total reliance on community
What this measures: OSS security and vulnerability management
Key questions to ask:
Evidence to request: Vulnerability monitoring systems, patching processes, security scanning tools
Red flags: No security monitoring for OSS, reactive patching only, no scanning capability
What this measures: Assessment of OSS project health and sustainability
Key questions to ask:
Evidence to request: Project health assessments, sustainability criteria, contingency plans
Red flags: No project assessment, dependency on unmaintained projects, no contingency planning
What this measures: OSS supply chain security
Key questions to ask:
Evidence to request: Signature verification processes, provenance tracking, supply chain security controls
Red flags: No signature verification, unverified sources, no supply chain security
What this measures: Strategic use of OSS for sovereignty goals
Key questions to ask:
Evidence to request: Sovereignty roadmap showing OSS initiatives, replacement projects, metrics
Red flags: No strategic OSS approach, no measurable sovereignty benefits, OSS not in roadmap
Executive Oversight ensures that sovereignty concerns are understood, prioritized, and actively managed at the highest levels of the organization. This domain measures board and C-suite engagement, dedicated budgets, governance structures, and accountability for sovereignty outcomes.
Key Focus Areas: Board awareness, dedicated governance, budget allocation, sovereignty policies, risk management, accountability, strategic planning, regulatory engagement
Common Discussion Topics: Board reporting, sovereignty champions, dedicated budgets vs. embedded costs, KPIs and metrics, regulatory relationships
What this measures: Executive and board-level awareness of sovereignty
Key questions to ask:
Evidence to request: Board meeting agendas, executive sponsor designation, governance documentation
Red flags: No board discussion, no executive ownership, sovereignty relegated to IT only
What this measures: Formal sovereignty strategy and alignment
Key questions to ask:
Evidence to request: Sovereignty strategy document, goals and metrics, review schedules
Red flags: No formal strategy, unclear goals, no alignment with business objectives
What this measures: Financial commitment to sovereignty
Key questions to ask:
Evidence to request: Budget allocation, investment tracking, ROI measurements
Red flags: No dedicated budget, costs buried in general IT spend, no ROI tracking
What this measures: Sovereignty maturity tracking and KPIs
Key questions to ask:
Evidence to request: KPI dashboards, tracking systems, benchmark reports
Red flags: No tracking, no defined KPIs, no benchmarking
What this measures: Sovereignty integration into procurement
Key questions to ask:
Evidence to request: Procurement policies, vendor selection criteria, contract templates
Red flags: No sovereignty in procurement, standard vendor contracts accepted, no negotiation
What this measures: Employee awareness and training
Key questions to ask:
Evidence to request: Training programs, onboarding materials, awareness campaigns
Red flags: No training, employees unaware, sovereignty not in culture
What this measures: Regulatory engagement and awareness
Key questions to ask:
Evidence to request: Regulatory engagement records, monitoring systems, industry participation
Red flags: No regulatory engagement, reactive to regulations, no industry participation
What this measures: External communication of sovereignty posture
Key questions to ask:
Evidence to request: Public commitments, customer materials, stakeholder communications
Red flags: No external communication, sovereignty not mentioned to customers, no public commitments
This domain evaluates how the organization manages relationships with external managed service providers while maintaining sovereignty. It addresses vendor selection criteria, contractual controls, geographic restrictions, transition planning, and the balance between operational efficiency and sovereign control.
Key Focus Areas: Vendor selection criteria, contractual controls, geographic restrictions, data access limitations, performance monitoring, transition planning, alternatives evaluation, insourcing capabilities
Common Discussion Topics: Domestic vs. foreign MSPs, data center locations, support personnel jurisdictions, exit strategies, dual-source strategies
What this measures: Inventory and classification of managed service providers
Key questions to ask:
Evidence to request: MSP inventory, outsourced functions list, risk classifications
Red flags: No MSP inventory, unknown sovereignty risks, unclassified services
What this measures: Contractual data ownership and portability
Key questions to ask:
Evidence to request: MSP contracts showing data ownership clauses, portability provisions
Red flags: Ambiguous ownership, limited data access, vendor-controlled keys, no portability guarantee
What this measures: Geographic and jurisdictional control of MSPs
Key questions to ask:
Evidence to request: Provider locations, contract governing law, personnel jurisdiction documentation
Red flags: Foreign-based providers, foreign jurisdiction contracts, unrestricted access locations
What this measures: Exit planning and transition capabilities
Key questions to ask:
Evidence to request: Exit clauses in contracts, transition plans, alternative provider evaluations
Red flags: No exit provisions, undefined transition time, untested procedures, no alternatives
What this measures: Visibility into managed service operations
Key questions to ask:
Evidence to request: Audit capabilities, operational logs access, monitoring dashboards
Red flags: No visibility, unable to audit, limited logging, no incident detection
What this measures: Service level agreements and accountability
Key questions to ask:
Evidence to request: SLA documentation, sovereignty requirements in contracts, compliance monitoring
Red flags: Weak SLAs, no sovereignty requirements, no monitoring, limited recourse
What this measures: Provider security and compliance verification
Key questions to ask:
Evidence to request: Provider certifications, risk assessment reports, security requirements
Red flags: Unverified certifications, no risk assessments, different standards for providers
What this measures: Multi-provider strategy for resilience
Key questions to ask:
Evidence to request: Multi-provider architecture, failover tests, geographic distribution
Red flags: Single provider dependency, no failover capability, untested resilience, no geographic diversity
After completing the assessment, guide the customer through understanding their results.
The spider/radar chart provides a visual representation of maturity across all domains:
Most organizations on their first assessment score:
Reassure customers that these results are normal starting points, not failures.
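If you need to reproduce the spider chart outside the tool (for slides or a report), a minimal matplotlib sketch follows. The domain scores are invented for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative first-assessment scores (0-100%) per sovereignty domain;
# the values are made up for demonstration.
domains = ["Data", "Technical", "Operational", "Assurance",
           "Open Source", "Exec Oversight", "Managed Services"]
scores = [45, 30, 38, 25, 50, 20, 35]

# Close the polygon by repeating the first point.
angles = np.linspace(0, 2 * np.pi, len(domains), endpoint=False).tolist()
angles += angles[:1]
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values, linewidth=2)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(domains)
ax.set_ylim(0, 100)
ax.set_title("Digital Sovereignty maturity (illustrative)")
plt.show()
```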
The results page now includes a Themes tab that reorganizes all 56 capabilities by concept rather than domain:
The thematic view reveals patterns that domain-based views miss:
Use this view to have strategic conversations: "Your governance is strong, but technical implementation lags—let's discuss resourcing."
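As a toy illustration of the regrouping behind the Themes tab, the sketch below maps a handful of capabilities to cross-domain themes. The theme names and the mapping are hypothetical, not the tool's actual taxonomy.

```python
from collections import defaultdict

# Hypothetical (domain, capability) -> theme mapping; the real Themes tab
# covers all 56 capabilities, only a few are shown here for illustration.
capability_themes = {
    ("Data Sovereignty", "Encryption key control"): "Encryption & Keys",
    ("Technical Sovereignty", "Vendor lock-in mitigation"): "Vendor Independence",
    ("Managed Services", "Exit planning"): "Vendor Independence",
    ("Executive Oversight", "Board awareness"): "Governance",
}

by_theme = defaultdict(list)
for (domain, capability), theme in capability_themes.items():
    by_theme[theme].append(f"{capability} ({domain})")

for theme, caps in sorted(by_theme.items()):
    print(theme, "->", ", ".join(caps))
```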
The assessment interface now displays a real-time progress counter in the header:
Work with the customer to translate scores into actionable priorities:
| Priority | Criteria | Example |
|---|---|---|
| Critical (0-3 months) | Regulatory requirement, high industry weighting, Level 1 on high-point questions | Implementing HSM-based key management for healthcare patient data |
| High (3-6 months) | Significant sovereignty risk, medium weighting, Level 2 on strategic questions | Establishing sovereign audit rights with cloud providers |
| Medium (6-12 months) | Important capability, standard weighting, Level 2-3 on foundation questions | Implementing data classification and discovery tools |
| Low (12+ months) | Optimization, already at Level 3+, advanced questions | Establishing open source contribution programs |
Phase 1 - Foundation: Policy development, basic controls, compliance alignment, data inventory, vendor assessment
Phase 2 - Implementation: Technical implementations, key management, vendor migrations, skills development, tooling deployment
Phase 3 - Optimization: Continuous monitoring, optimization, innovation, industry leadership, operational autonomy
Help customers export and distribute results appropriately:
Schedule follow-up activities to maintain momentum:
The assessment often reveals opportunities for further engagement:
When conducting assessments remotely:
Behavior: Claims high maturity without evidence, dismisses concerns, believes "we have the best security"
Approach: Acknowledge their confidence, then request specific evidence. Use data and industry benchmarks. Ask their technical team to verify claims. Frame lower scores as "industry-standard journey" rather than failures.
Behavior: Late to session, distracted, checking phone, wants to rush through
Approach: Respectfully emphasize the value of their time investment. Show early results to demonstrate value. Offer to reschedule if they can't focus. Break into shorter sessions if needed.
Behavior: Takes low scores personally, explains why gaps aren't their fault, blames budget/management
Approach: Emphasize this is organizational assessment, not personal evaluation. Validate resource constraints. Position results as ammunition for budget requests. Frame gaps as opportunities to demonstrate need for investment.
Behavior: Debates every nuance, wants to discuss technical details extensively, struggles to choose between maturity levels
Approach: Appreciate their thoroughness. Set time limits for each question. Offer to deep-dive on specific topics afterward. Remind that perfect accuracy is less important than directional understanding. Use "parking lot" for detailed technical discussions.
Ready-to-use templates are available to support your assessment delivery:
Visit the Templates Library for:
| Term | Definition |
|---|---|
| BYOK | Bring Your Own Key - Customer-generated encryption keys imported to cloud provider |
| CLOUD Act | US law allowing government access to data held by US companies regardless of location |
| Confidential Computing | Protection of data during processing using hardware-based secure enclaves |
| Data Residency | Physical location where data is stored |
| Data Sovereignty | Legal and technical control over data, including ability to resist foreign access demands |
| DLP | Data Loss Prevention - Tools to monitor and prevent unauthorized data transfers |
| EKM | External Key Management - Encryption keys managed outside cloud provider infrastructure |
| HSM | Hardware Security Module - Dedicated cryptographic processor for key management |
| PAM | Privileged Access Management - System for controlling and monitoring administrative access |
| SCA | Software Composition Analysis - Scanning third-party code for vulnerabilities |
| TEE | Trusted Execution Environment - Secure area of processor for sensitive operations |
| Zero Trust | Security model assuming no implicit trust, requiring verification for all access |
Subject: Preparation for Digital Sovereignty Maturity Assessment - [Date]
Dear [Stakeholders],
Thank you for scheduling a Full Maturity Assessment. This session will evaluate your organization's Digital Sovereignty capabilities across 7 key domains using a proven 5-level maturity framework.
Session Details:
Date/Time: [Date/Time]
Location/Link: [Details]
Required Participants: CIO/CTO, CISO, Cloud/Infrastructure Lead, Compliance Officer
Please prepare:
Looking forward to our session.
Best regards,
[Your Name]
Subject: Digital Sovereignty Assessment Results and Next Steps
Dear [Stakeholders],
Thank you for participating in yesterday's maturity assessment. Your engagement and candor were excellent.
Key Findings:
Attached you'll find:
Recommended Next Steps:
I'll follow up next week to schedule our roadmap session.
Best regards,
[Your Name]
| Level | Key Indicators | Common Language |
|---|---|---|
| 1 | No policy, ad-hoc, reactive, "we're planning to" | "We know we need to do this" |
| 2 | Draft policies, pilots, project plans, some implementation | "We're working on it" |
| 3 | Approved policies, widespread deployment, documented standards | "We have this in place" |
| 4 | Metrics, dashboards, KPIs, regular reporting, measured outcomes | "We measure and optimize this" |
| 5 | Continuous improvement, innovation, industry leadership | "We're leading the industry" |