## Overview
The Safety Readiness Scorecard serves as the executive safety assessment tool for project stakeholders, compliance auditors, and safety engineers. It consolidates four critical dimensions of safety evidence — requirements completeness, traceability integrity, verification coverage, and FMEA analysis — into a unified readiness matrix that maps directly to standards compliance requirements.
The scorecard is a reference dashboard accessible from the Home Dashboard and linked from the Standards Compliance Overview. Unlike operational dashboards that update in real-time, the scorecard includes both automated metrics (computed from work item linkage) and manual assessment inputs (documented compliance evidence).
## Scorecard Layout and Structure
The Safety Readiness Scorecard displays a matrix with standards listed as rows and five metric columns:
| ISO 26262 Part | Clause Coverage | Work Products | Status | Gap Count | Readiness |
|---|---|---|---|---|---|
| Part 3: Concept Phase | 92% | HARA, Safety Goals, FSR | Complete | 2 | High |
| Part 4: System Design | 49% | TSC, FTTI, Safety Mechanisms | In Progress | 8 | Medium |
| Part 5: HW Design | 35% | HW Safety Req, FMEDA | In Progress | 12 | Low |
| Part 6: SW Design | 41% | SW Safety Req, SW Architecture | In Progress | 10 | Low |
| Part 7: Production | 15% | Production Plan, Control Plan | Not Started | 18 | Critical |
| Part 8: Supporting | 78% | Config Mgmt, Change Mgmt | In Progress | 4 | Medium |
| Part 9: ASIL Decomp | 88% | Decomposition Arguments | Complete | 1 | High |
Each cell displays:
- A numeric percentage (0–100%)
- A status badge:
    - :fontawesome-regular-check-circle: ✓ (green) — 85% or higher, compliant
    - :fontawesome-solid-triangle-exclamation: ⚠️ (orange) — 50–84%, gaps exist
    - :fontawesome-solid-circle-xmark: ✗ (red) — below 50%, critical gaps
Cells marked N/A indicate metrics not applicable to that standard.
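The badge mapping above can be sketched as a small helper. This is an illustrative sketch, not product code: `status_badge` and its keyword arguments are hypothetical names, with defaults matching the 85/50 bands shown here; metrics that use different bands (e.g. traceability's 90/70) can pass their own thresholds.

```python
def status_badge(pct, green=85, orange=50):
    """Map a metric percentage to its scorecard badge; None means N/A."""
    if pct is None:
        return "N/A"
    if pct >= green:
        return "✓"   # green: compliant
    if pct >= orange:
        return "⚠️"  # orange: gaps exist
    return "✗"       # red: critical gaps
```

For example, `status_badge(75, green=90, orange=70)` applies the stricter traceability bands and yields the orange badge.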
## Metric Definitions and Calculation

### Requirements Completeness (Requirements %)
| Property | Value |
|---|---|
| Name | Requirements Completeness |
| Type | Numeric Percentage (0–100%) |
| Calculation | (Requirements with at least one traceability link) ÷ (Total requirements of relevant types) × 100 |
| Data Source | Lucene query of linkedWorkItems and backlinkedWorkItems fields |
| Status Threshold | ≥85% green, 50–84% orange, <50% red |
| Example | 87 of 100 system requirements linked to other work items → 87% |
Details:
Requirements completeness validates that all safety-critical requirements have been considered in downstream analysis. A requirement is marked “complete” if it has at least one outgoing or incoming link to any work item (regardless of link type or direction).
Applicable requirement types vary by standard:
- ISO 26262 Part 3 (Concept Phase): Customer Requirements, Safety Requirements, Functional Requirements
- ISO 26262 Part 4 (System Design): System Requirements, Safety Goals
- ISO 26262 Part 5 (Hardware Design): Design Requirements (hardware-classified)
- ISO 26262 Part 6 (Software Development): Design Requirements (software-classified)
- IATF 16949/APQP: Design Characteristics with SC/CC classification
A requirement with a link to ANY work item counts as “complete” — even a single annotation or customer feedback link qualifies. Use Traceability Coverage for stricter checks of specific link types (e.g., verification, refinement).
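The "any link counts" rule can be sketched as follows. This is a minimal illustration, assuming each requirement is represented as a dict carrying its `linkedWorkItems` and `backlinkedWorkItems` entries; the dict layout is hypothetical, only the field names come from the table above.

```python
def requirements_completeness(requirements):
    """Percent of requirements with at least one incoming or outgoing link.

    Any link type or direction counts, per the completeness rule above.
    """
    if not requirements:
        return 0.0
    linked = sum(
        1 for r in requirements
        if r.get("linkedWorkItems") or r.get("backlinkedWorkItems")
    )
    return 100 * linked / len(requirements)

# 87 of 100 requirements linked -> 87%, matching the example above
reqs = [{"linkedWorkItems": ["X"]}] * 87 + [{}] * 13
```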
### Traceability Coverage (Traceability %)
| Property | Value |
|---|---|
| Name | Traceability Coverage |
| Type | Numeric Percentage (0–100%) |
| Calculation | Sum of bidirectional link coverage per chain ÷ Total chains × 100 |
| Data Source | backlinkedWorkItems:<roleType> Lucene queries for each standard-specific chain |
| Status Threshold | ≥90% green, 70–89% orange, <70% red |
| Example | System Reqs→Design Reqs (100%) + Design Reqs→Tests (100%) + Chars→Failure Modes (78%) → avg 93% |
Details:
Traceability coverage ensures the V-model decomposition is complete across all levels. Each standard specifies required linkage chains:
ISO 26262 Part 4 (System Design):

- Customer Reqs → System Reqs (via `refines`)
- System Reqs → Design Reqs (via `refines`)
- System Reqs → Safety Goals (via `assesses`)

ISO 26262 Part 5 (Hardware Design):

- Design Reqs → Functions (via `implements`)
- Design Reqs → Characteristics (via `characterizes`)
- Characteristics → Failure Modes (via `failureMode` link or implicit via component design structure)

AIAG-VDA FMEA:

- Failure Modes → Risk Controls (via `mitigates`)
- Risk Controls → Test Cases (via `verifies`)

IATF 16949/APQP:

- Characteristics → Control Plan Items (via `controlPlan`)
- Process Steps → Control Plan Items (via `monitors`)
Coverage is calculated as the percentage of source items with at least one link to an expected target item. Gaps are identified using Lucene queries like:

```
type:sysReq AND NOT backlinkedWorkItems:refines=*
```
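The chain-averaging step can be sketched like this. The function name and the per-chain `(covered, total)` tuples are illustrative; in practice the counts come from the Lucene queries above. The sample data reproduces the example row from the table (two fully covered chains plus Chars→Failure Modes at 78%, averaging to about 93%).

```python
def traceability_coverage(chains):
    """Average per-chain coverage across a standard's required chains.

    Each chain maps to (sources with at least one expected link, total sources).
    """
    ratios = [100 * covered / total for covered, total in chains.values() if total]
    return sum(ratios) / len(ratios) if ratios else 0.0

chains = {
    "System Reqs -> Design Reqs": (31, 31),   # 100%
    "Design Reqs -> Tests": (20, 20),         # 100%
    "Chars -> Failure Modes": (78, 100),      # 78%
}
```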
### Verification Coverage (Verification %)
| Property | Value |
|---|---|
| Name | Verification Coverage |
| Type | Numeric Percentage (0–100%) |
| Calculation | (Requirements with back-linked test cases) ÷ (Total testable requirements) × 100 |
| Data Source | backlinkedWorkItems:verifies Lucene query filtered to system/design requirements |
| Status Threshold | ≥85% green, 60–84% orange, <60% red |
| Example | 26 of 31 system requirements have back-linked Verification Test Cases → 83.9% |
Details:
Verification coverage demonstrates that all testable requirements have corresponding verification evidence (test cases, test runs, or verification protocols). This is a mandatory requirement for ISO 26262 Part 4 Clause 6 (verification and validation planning).
Applicable requirement types:
- System Requirements (testable at system level)
- Design Requirements (testable at component/unit level)
- User Needs (validated via validation test cases)
- Safety Goals (indirect verification through derived requirements)
A requirement is marked “verified” if it has at least one back-link via the `verifies` role to:

- `testCase` (unit/component test)
- `validationTestCase` (system or user acceptance test)
- `verificationTestCase` (formal verification protocol)
Software requirements typically achieve 100% verification coverage because software is verified at the source code level (unit tests, integration tests, static analysis). Gaps here typically indicate missing test case creation during development, not unverifiable requirements.
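As a sketch of the verification rule, assuming each requirement dict carries a `backlinks` list of `{role, type}` entries (a hypothetical layout; the role and type names come from the list above). The sample data reproduces the 26-of-31 example from the table.

```python
TEST_TYPES = {"testCase", "validationTestCase", "verificationTestCase"}

def verification_coverage(requirements):
    """Percent of testable requirements with a `verifies` back-link
    from one of the three test work-item types."""
    testable = [r for r in requirements if r.get("testable", True)]
    if not testable:
        return 0.0
    verified = sum(
        1 for r in testable
        if any(link.get("role") == "verifies" and link.get("type") in TEST_TYPES
               for link in r.get("backlinks", []))
    )
    return 100 * verified / len(testable)

# 26 of 31 system requirements verified -> 83.9%
reqs = ([{"backlinks": [{"role": "verifies", "type": "testCase"}]}] * 26
        + [{"backlinks": []}] * 5)
```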
### FMEA Coverage (FMEA Coverage %)
| Property | Value |
|---|---|
| Name | FMEA Coverage |
| Type | Numeric Percentage (0–100%) |
| Calculation | (Failure modes with post-mitigation assessment) ÷ (Total failure modes in scope) × 100 |
| Data Source | postmitigationAP and pfmRPNPost field values for Design FMEA and Process FMEA |
| Status Threshold | ≥90% green, 70–89% orange, <70% red |
| Example | 248 of 260 failure modes have postmitigationAP value → 95.4% coverage |
Details:
FMEA coverage validates that all identified failure modes have been subjected to risk assessment and that mitigation measures are documented. This is the key evidence artifact for AIAG-VDA FMEA and ISO 26262 Part 4 Clause 7 (FMEA and HARA).
Two distinct coverage metrics are tracked:

Design FMEA (DFMEA & System FMEA):

- Metric: Post-mitigation Action Priority (`postmitigationAP` field)
- Values: H (High), M (Medium), L (Low), or NA (not applicable)
- Interpretation: All failure modes must have an assessed post-mitigation AP per AIAG-VDA methodology
- Calculation: `count(FM where postmitigationAP is not empty) ÷ total(FM) × 100`

Process FMEA (PFMEA):

- Metric: Post-mitigation RPN (`pfmRPNPost` field, calculated as Severity × Occurrence Post × Detection Post)
- Values: Numeric (0–1000)
- Interpretation: All process failure modes must have post-mitigation risk assessment
- Calculation: `count(PFM where pfmRPNPost is not empty) ÷ total(PFM) × 100`
Gaps indicate incomplete risk assessment (missing mitigation actions) and must be resolved before the FMEA can support ISO 26262 compliance claims.
The scorecard displays post-mitigation coverage only. Pre-mitigation gaps are expected during initial FMEA creation. Monitor trending of post-mitigation coverage as the project evolves — it should increase toward 100% as risk controls are implemented and verified.
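Both coverage calculations follow the same pattern and can be sketched together. The function names are illustrative; the field names `postmitigationAP` and `pfmRPNPost` and the RPN formula come from the definitions above. The sample data reproduces the 248-of-260 example.

```python
def fmea_coverage(failure_modes, field):
    """Percent of failure modes whose post-mitigation field is filled in.

    Use field='postmitigationAP' for DFMEA, field='pfmRPNPost' for PFMEA.
    """
    if not failure_modes:
        return 0.0
    assessed = sum(1 for fm in failure_modes if fm.get(field) not in (None, ""))
    return 100 * assessed / len(failure_modes)

def rpn_post(severity, occurrence_post, detection_post):
    """Post-mitigation RPN = Severity x Occurrence Post x Detection Post."""
    return severity * occurrence_post * detection_post

# 248 of 260 failure modes have a post-mitigation AP -> 95.4%
fms = [{"postmitigationAP": "M"}] * 248 + [{"postmitigationAP": ""}] * 12
```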
### Document Status
| Property | Value |
|---|---|
| Name | Document Status Count |
| Type | Count + Status Badges |
| Data Source | LiveDoc modules in specified spaces filtered by document type |
| Example | ISO 26262 Part 4: “4 specs” (System Requirements, System Design, Interface Design, Hardware Specification) |
Details:
The scorecard lists the count of specification documents (LiveDocs) that provide formal compliance evidence for the standard. Each document is linked in the detailed scorecard view.
Typical document types per standard:
ISO 26262:
- Part 3 (Concept Phase): Concept specification, HARA report, Safety concept
- Part 4 (System Design): System design specification, Interface design specification, System FMEA report
- Part 5 (Hardware): Hardware design specification, Hardware FMEA report, Component verification plans
- Part 6 (Software): Software design specification, Software verification report, Code review records
AIAG-VDA:
- FMEA reports (Design, Process, Control Plan)
- Process flowcharts
- Risk mitigation action tracking
IATF 16949/APQP:
- Control Plans per component
- Process capability studies
- PPAP documentation
### Overall Readiness Score (Overall %)
| Property | Value |
|---|---|
| Name | Overall Readiness Score |
| Type | Numeric Percentage (0–100%) |
| Calculation | Average of all applicable metric percentages (N/A metrics excluded) |
| Status Threshold | ≥85% green (compliant), 60–84% orange (gaps), <60% red (critical) |
| Example | (87% + 100% + 84% + 100%) ÷ 4 = 92.75% → 93% (rounded) |
Details:
The Overall score is a simple arithmetic mean of all non-N/A metrics for that standard. This provides a single readiness indicator for executive reporting.
Interpretation:
- ≥85% (✓ Green): Standard compliance requirements substantially met; document gaps for audit trail
- 60–84% (⚠️ Orange): Compliance gaps exist but remediation is in progress; escalate to safety management review
- <60% (✗ Red): Critical compliance gaps; halt release decisions until gaps are closed
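The averaging rule can be sketched in a few lines, using `None` to stand for N/A metrics (an illustrative convention, not a product API). The sample data reproduces the worked example from the table (87 + 100 + 84 + 100 over four applicable metrics).

```python
def overall_readiness(metrics):
    """Arithmetic mean of all non-N/A metric percentages (None = N/A)."""
    values = [v for v in metrics.values() if v is not None]
    return sum(values) / len(values) if values else 0.0

# Doc Status is N/A here, so it is excluded from the mean
scores = {"requirements": 87, "traceability": 100,
          "verification": 84, "fmea": 100, "docStatus": None}
```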
### Configuration Parameters

| Property | Type | Default | Description |
|---|---|---|---|
| `title` | String | "Safety Readiness Scorecard" | Scorecard headline |
| `standards` | Enum Array | `[iso26262, aiag-vda, iatf16949, iso21448]` | Which standards to include in the matrix |
| `includeMetrics` | Enum Array | `[requirements, traceability, verification, fmea, docStatus]` | Which columns to display |
| `refreshInterval` | Integer (seconds) | `3600` | How often metrics are recalculated; 3600 = hourly |
| `highlightThreshold` | Object | `{green: 85, orange: 60}` | Percentage thresholds for color coding |
| `showGapLinks` | Boolean | `true` | Whether to hyperlink gap counts to detail views |
| `exportFormat` | Enum | `pdf` | Export format for scorecard report (`pdf`, `xlsx`, `html`) |
### Example Configuration (Velocity Macro)
```velocity
#set($scorecard = {
  'title': 'ISO 26262 Functional Safety Readiness',
  'standards': ['iso26262-part3', 'iso26262-part4', 'iso26262-part5', 'iso26262-part6'],
  'metrics': ['requirements', 'traceability', 'verification', 'fmea'],
  'refreshInterval': 7200,
  'exportButton': true
})
#nxSafetyReadinessScorecard($scorecard)
```
## Standard-Specific Scorecard Rows

### ISO 26262 Part 3 (Concept Phase)
| Aspect | Requirement |
|---|---|
| Standards Chain | Concept Specification → Functional Safety Concept → HARA → Preliminary Safety Goals |
| Requirements | All customer needs and operational scenarios documented; includes “What if” analysis |
| Traceability | Concept elements → HARA hazards → Preliminary Safety Goals; 100% bidirectional linkage |
| Verification | Concept reviews and safety analysis sign-off; not quantified (N/A in scorecard) |
| FMEA Coverage | Not applicable (FMEA begins in Part 4 System Design phase) |
| Documentation | Concept phase typically includes 20+ specification and analysis documents |
| Readiness Target | ≥75% acceptable for concept phase gate (verification is qualitative) |
### ISO 26262 Part 4 (System Design)
| Aspect | Requirement |
|---|---|
| Standards Chain | System Design Spec → System FMEA → System Verification Plan → Risk Control Strategy |
| Requirements | All system requirements with SC/CC classification; 87% baseline for automotive projects |
| Traceability | System Reqs → Design Reqs (100%); Design Reqs → Tests (85%+); Characteristics → Failure Modes (90%+) |
| Verification | 80%+ of system requirements must have verification test plan or equivalent evidence |
| FMEA Coverage | System FMEA must cover 100% of identified failure modes with post-mitigation assessment per AIAG-VDA AP methodology |
| Documentation | Typically 4–6 documents: System Design Spec, System FMEA Report, Interface Spec, Verification Plan, Risk Matrix |
| Readiness Target | ≥90% required for design phase completion; <85% blocks system-level testing |
### ISO 26262 Part 5 (Hardware Design)
| Aspect | Requirement |
|---|---|
| Standards Chain | Hardware Design Spec → Hardware FMEA → Component Verification → Design Robustness Analysis |
| Requirements | All hardware design requirements with ASIL allocation; includes safety mechanisms, diagnostic coverage |
| Traceability | Design Reqs → Characteristics (100%); Characteristics → Component Tests (95%+); Failure Modes → Risk Controls (100%) |
| Verification | 100% of hardware design requirements must have verification evidence (testing, analysis, or review) |
| FMEA Coverage | Hardware FMEA 100% complete with post-mitigation RPN; all ASIL D/C controls verified with diagnostic coverage ≥60% |
| Documentation | Typically 6–8 documents: Hardware Design Spec, Component DFMEAs, Diagnostic Coverage Analysis, Hardware Verification Report |
| Readiness Target | ≥95% required for hardware release to manufacturing |
### ISO 26262 Part 6 (Software Development)
| Aspect | Requirement |
|---|---|
| Standards Chain | Software Arch Design → Software Implementation → Software Unit Tests → Software Integration Tests → Verification Report |
| Requirements | All software design requirements specified; the requirement count typically grows beyond the initial baseline as code review findings are captured as new requirements |
| Traceability | Requirements → Design Units (100%); Design Units → Source Code (100%); Code → Unit Tests (100%) |
| Verification | 100% of software requirements verified via unit tests, integration tests, and static analysis |
| FMEA Coverage | Software FMEA optional (not typically used); instead document code review results, test coverage metrics |
| Documentation | Typically 8–12 documents: Software Architecture, Software Design, Code Review Checklist, Test Plans, Test Reports, Static Analysis Reports |
| Readiness Target | ≥95% for feature-complete software release; 100% for release to production |
### AIAG-VDA FMEA
| Aspect | Requirement |
|---|---|
| Standards Chain | DFMEA (3 levels: System/Subsystem/Component) → Risk Control Action Plan → PFMEA → Control Plan |
| Requirements | Not applicable (FMEA does not require formal requirement traceability) |
| Traceability | Design Reqs → Failure Modes (90%+); Failure Modes → Risk Controls (95%+); Risk Controls → Tests (85%+) |
| Verification | Verification evidence (test cases, CAP closure) must support all risk controls with Action Priority H or M |
| FMEA Coverage | DFMEA: 100% post-mitigation AP assigned; Process FMEA: 100% post-mitigation RPN ≤10 (Low) or mitigated to acceptable residual risk |
| Documentation | DFMEA Report, PFMEA Report, Control Plan, Process Capability Study, Mistake-Proofing Evidence |
| Readiness Target | ≥94% FMEA coverage typical; <90% indicates incomplete risk assessment |
### IATF 16949 / APQP (Advanced Product Quality Planning)
| Aspect | Requirement |
|---|---|
| Standards Chain | Design Characteristics (SC/CC) → Control Plan Items → Reaction Plans → Verification Test Results |
| Requirements | All design characteristics must be identified and classified (SC/CC); 60%+ baseline for new design programs |
| Traceability | Characteristics → Control Plan (60–80%); Characteristics → Process Steps (60–70%); Tests → Characteristics (target 100%) |
| Verification | Design verification test data required for all SC/CC characteristics; typically 70–90% coverage during APQP phase gate 3 |
| FMEA Coverage | Control Plan items derived from FMEA risk controls; 80%+ of control plan items should trace to failure modes |
| Documentation | APQP checklist, Control Plan (3–5 pages), Process Capability Study, Product Design Record, Warranty/Recall Analysis |
| Readiness Target | ≥75% acceptable for prototype phase; ≥90% required for production release |
### ISO 21448 SOTIF (Safety of the Intended Functionality)
| Aspect | Requirement |
|---|---|
| Standards Chain | Functional Specification → Scenario Analysis → Hazard Analysis (beyond malfunction) → Residual Risk Evaluation |
| Requirements | SOTIF hazards identified and assessed; includes degraded operation, sensor failures, and non-malfunction scenarios |
| Traceability | Use Cases / Scenarios → SOTIF Hazards (95%+); SOTIF Hazards → Safety Goals (100%); Safety Goals → Design Requirements (90%+) |
| Verification | Field usage data, degraded mode testing, and scenario-based verification; 80%+ test scenario coverage |
| FMEA Coverage | SOTIF FMEA complements traditional FMEA; covers “what if sensor saturates”, “partial failures”, “signal delays” not covered by malfunction analysis |
| Documentation | SOTIF Hazard Analysis Report, Scenario Test Plan, Field Data Analysis, Operational Safety Specification |
| Readiness Target | ≥80% for pre-production; ≥95% after first field season with real usage data |
## Interpreting the Scorecard

### Green Status (≥85%)
Metric meets standard requirements. Action: Document compliance evidence in audit trail (e.g., link to verification report, test case results, or management review meeting notes). No further action required unless standard explicitly mandates additional controls.
Example: System Requirements Traceability = 100% — all system requirements have design refinements or test links.
### Orange Status (60–84%)
Metric indicates non-compliance risk. Action: Identify and escalate gaps to safety management. Typical gap categories:
- Incomplete Requirements (Requirements % = 75%): 25 of 100 requirements not yet linked to design/tests. Assign owners for each gap, define completion date.
- Unverified Requirements (Verification % = 70%): 30 of 100 testable items lack verification evidence. Create missing test cases or document design review as alternative verification method.
- Uncovered Failure Modes (FMEA Coverage % = 80%): 50 of 250 failure modes missing post-mitigation AP assessment. Assign FMEA team to complete risk assessment.
Schedule a remediation review within 2 weeks. Do not advance to the next phase gate until the gaps are remediated.
### Red Status (<60%)
Metric fails standard requirements. Action: Halt release decisions immediately. Escalate to the project safety manager and compliance officer. Root cause analysis is required:
- Root Cause: Insufficient resources, process breakdown, tool configuration error, or misalignment on standard interpretation
- Mitigation: Assign dedicated task force, extend schedule if necessary, allocate additional resources
- Target: Reach ≥60% (orange) within 1 week, ≥85% (green) within 4 weeks
Example red scenarios:
- Requirements % = 40%: Only 40 of 100 requirements have any linkage. Indicates major gap in traceability process.
- Verification % = 45%: Less than half of design requirements have verification test cases. Indicates incomplete test planning.
- FMEA Coverage % = 50%: Half of failure modes have no post-mitigation assessment. Indicates incomplete risk analysis.
### Drill-Down to Detailed Gaps
Each scorecard cell with a gap count (e.g., “87% | 13 gaps”) is a hyperlink to a detailed Gap Report showing:
- List of Uncovered Items: All work items not meeting the metric (e.g., all system requirements without design refinements)
- Owner Assignment: Shows current owner if assigned, otherwise shows “Unassigned”
- Due Date: Estimated completion date based on project schedule
- Remediation Action: Proposed resolution (e.g., “Create design requirement”, “Link to existing test case”, “Close as obsolete”)
- Evidence Link: Hyperlink to supporting documentation (e.g., design review minutes, CAP closure)
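The shape of a gap-report row can be sketched from the fields above. This is an illustrative data-shaping sketch only; the function name, dict keys, and default strings are hypothetical, chosen to mirror the column list.

```python
def gap_report_row(item):
    """Shape one uncovered work item into the gap-report columns above."""
    return {
        "id": item["id"],
        "owner": item.get("owner") or "Unassigned",   # show "Unassigned" when no owner
        "due": item.get("due", "TBD"),                # estimated completion date
        "action": item.get("action", "Review and assign remediation"),
        "evidence": item.get("evidence"),             # link to supporting docs, if any
    }

# A work item with no owner or action yet, as in a freshly detected gap
row = gap_report_row({"id": "SysReq-107", "due": "Sprint 24"})
```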
Example gap report for “Requirements Completeness = 87%”:
- SysReq-107: Missing FTTI specification for AEB braking latency
- Assigned to: Systems Engineer
- Due: Sprint 24
- Impact: Blocks Part 4 clause 6.4.3 compliance
- SysReq-108: Incomplete diagnostic coverage analysis for radar sensor
- Assigned to: HW Safety Engineer
- Due: Sprint 25
- Impact: Blocks Part 5 FMEDA completion
- SysReq-109: SW unit testing coverage below ASIL-D threshold (94% vs 100% required)
- Assigned to: SW Test Lead
- Due: Sprint 23
- Impact: Blocks Part 6 clause 9 verification
## Cross-References
For detailed guidance on improving each metric:
For generating formal compliance reports: