Dashboard Overview
The Testing Space Dashboard is implemented as a Velocity template that displays:
- Real-time test case statistics showing total verification and validation work items
- Document inventory navigation organized by system element hierarchy
- Traceability coverage metrics with clickable gap analysis for identifying unverified requirements
- Verification methods reference explaining ISO 26262 verification approaches
- Quick links to verification and validation sheets for drill-down analysis
The Testing space separates verification (the system is built correctly, per ISO 26262 Parts 4-6) from validation (the right system is built, per Part 4 §8). This distinction is critical for automotive functional safety compliance. Verification uses the verifies link role; validation uses the validates link role.
Dashboard Components
Testing Space Banner
| Property | Value |
|---|---|
| Component | Space header with branded identity |
| Color Scheme | Teal: #00695c (primary), #00897b (secondary) |
| Title | Testing |
| Subtitle | Verification and Validation — Test Coverage Dashboard |
| Macro | nxSpaceBanner(spaceId, title, subtitle, primaryColor, secondaryColor) |
| Purpose | Visual identification and navigation anchoring for Testing space |
The banner establishes the testing space as a distinct navigation zone within the TestAuto2 project, using teal branding to differentiate from Requirements (blue), Design (purple), and Risks (red) spaces.
Test Cases and Documents Count Stats Bar
| Property | Value |
|---|---|
| Component | Real-time metrics display |
| Metrics Tracked | Total test case work items (type:testCase), total documents in Testing space |
| Data Source | Lucene query: type:testCase + document module folder filter |
| Macro | nxStatsBar() with nxStatItem() children |
| Update Frequency | Live (on page load) |
| Use Case | At-a-glance project test coverage status |
The stats bar queries all work items of type testCase to calculate the current test inventory, and counts LiveDoc modules located in the Testing space folder for document inventory visibility.
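The two counts can be sketched in plain Python. This is an illustrative model only: the `work_items` and `documents` data shapes are assumptions for the sketch, not a real Polarion API — in the actual dashboard the counts come from the Lucene query (type:testCase) and the Testing folder filter.

```python
def count_test_cases(work_items):
    """Count work items whose type is 'testCase' (Lucene: type:testCase)."""
    return sum(1 for wi in work_items if wi["type"] == "testCase")

def count_testing_docs(documents):
    """Count LiveDoc modules located in the Testing space folder."""
    return sum(1 for doc in documents if doc["folder"] == "Testing")

# Hypothetical inventory for illustration
work_items = [{"type": "testCase"}, {"type": "sysReq"}, {"type": "testCase"}]
documents = [{"folder": "Testing"}, {"folder": "Design"}]

print(count_test_cases(work_items))   # 2
print(count_testing_docs(documents))  # 1
```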
Verification Methods Reference Table
ISO 26262 Parts 4-6 define four verification methods. This static reference section documents each approach with descriptions and applicability:
| Method | Description | ISO Reference | Typical Use | Limitation |
|---|---|---|---|---|
| Review | Formal inspection by qualified engineers of design documents, code, or test procedures | ISO 26262 Part 6 §8.5.2 | Design reviews, code reviews, test plan review | Does not prove functional correctness |
| Analysis | Mathematical or logical analysis of design properties, failure modes, or timing behavior | ISO 26262 Part 6 §8.5.3 | Failure mode analysis (FMEA), timing analysis, dataflow analysis | Does not prove execution in hardware environment |
| Simulation | Execution of design in virtual environment (model-in-the-loop, software-in-the-loop) | ISO 26262 Part 6 §8.5.4 | ECU algorithm testing before hardware, scenario testing | Does not verify hardware integration |
| Testing | Execution on target hardware (integration testing, system testing) | ISO 26262 Part 4 §7.4, Part 6 §8.5.5 | System integration, end-to-end behavior, hardware-software interaction | Cannot verify all failure scenarios |
ISO 26262 requires a mix of verification methods for each requirement. High-ASIL requirements typically use Review + Analysis + Simulation + Testing. Lower-ASIL requirements may use Review + Testing alone. Select methods based on the requirement’s ASIL level and complexity.
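The method-selection rule above can be expressed as a simple lookup. The exact method sets per ASIL level here are an assumption for this sketch (following the "high-ASIL uses all four, lower-ASIL may use Review + Testing" guidance), not a normative ISO 26262 table.

```python
# Illustrative mapping from ASIL level to a typical verification-method mix.
METHODS_BY_ASIL = {
    "QM": {"Review", "Testing"},
    "A":  {"Review", "Testing"},
    "B":  {"Review", "Analysis", "Testing"},
    "C":  {"Review", "Analysis", "Simulation", "Testing"},
    "D":  {"Review", "Analysis", "Simulation", "Testing"},
}

def required_methods(asil):
    """Return the typical verification-method set for an ASIL level."""
    return METHODS_BY_ASIL[asil]

print(sorted(required_methods("D")))
print(sorted(required_methods("A")))
```

In practice the mapping would also weigh requirement complexity, as the text notes, so a real implementation would take more inputs than the ASIL level alone.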
Traceability Coverage Metrics
The Testing space dashboard displays three critical coverage metrics, each with a clickable gap query to drill into unverified requirements.
System Requirements Verification Coverage
| Property | Value |
|---|---|
| Metric Name | System Requirements Verification Coverage |
| Definition | Percentage of system requirement work items with backlinked test cases via verifies link role |
| Numerator | System requirements with backlinkedWorkItems:verifies=* |
| Denominator | Total system requirement work items (type:sysReq) |
| Gap Query | type:sysReq AND NOT backlinkedWorkItems:verifies=TA* |
| Target | 100% (all system requirements verified by at least one test case) |
| Compliance Requirement | ISO 26262 Part 4 §7.4 (System Testing) |
Example:
- Covered: 26 of 31 system requirements verified
- Gap: 5 unverified system requirements (identified by the gap query)
- Coverage: 83.9%
The metric renders as a clickable progress bar showing the percentage and counts. Clicking the gap count opens the Lucene gap-query results, listing unverified requirements for immediate action.
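The coverage arithmetic is straightforward: a requirement counts as covered when it has at least one back-linked test case via the verifies role. A minimal sketch, with an assumed `verified_by` field standing in for the backlinkedWorkItems:verifies relation:

```python
def verification_coverage(requirements):
    """Split requirements into covered/gap and compute the coverage percent."""
    covered = [r for r in requirements if r["verified_by"]]
    gaps = [r for r in requirements if not r["verified_by"]]
    pct = round(100 * len(covered) / len(requirements), 1)
    return len(covered), gaps, pct

# Hypothetical data reproducing the example above: 26 of 31 covered
reqs = [{"id": f"SR-{i:03d}", "verified_by": ["TC-001"] if i <= 26 else []}
        for i in range(1, 32)]
covered, gaps, pct = verification_coverage(reqs)
print(covered, len(gaps), pct)  # 26 5 83.9
```

The same computation applies to the validation and design-verification metrics below; only the requirement type and link role change.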
Customer Requirements Validation Coverage
| Property | Value |
|---|---|
| Metric Name | Customer Requirements Validation Coverage |
| Definition | Percentage of customer requirement work items with backlinked validation test cases via validates link role |
| Numerator | Customer requirements with backlinkedWorkItems:validates=* |
| Denominator | Total customer requirement work items (type:customerRequirement) |
| Gap Query | type:customerRequirement AND NOT backlinkedWorkItems:validates=TA* |
| Target | 100% (all customer requirements validated by at least one test case) |
| Compliance Requirement | ISO 26262 Part 4 §8 (Functional Safety Validation) |
Example:
- Covered: 12 of 25 customer requirements validated
- Gap: 13 unvalidated customer requirements
- Coverage: 48%
Validation differs from verification: validation confirms the system meets actual user needs and operates safely in real-world conditions, while verification confirms system-level requirements are correctly implemented.
Design Requirements Verification Coverage
| Property | Value |
|---|---|
| Metric Name | Design Requirements Verification Coverage |
| Definition | Percentage of design requirement work items with backlinked test cases via verifies link role |
| Numerator | Design requirements with backlinkedWorkItems:verifies=* |
| Denominator | Total design requirement work items (type:desReq) |
| Gap Query | type:desReq AND NOT backlinkedWorkItems:verifies=TA* |
| Target | 100% (all design requirements verified by test cases) |
| Compliance Requirement | ISO 26262 Part 4 §7.4 (Design Verification Testing) |
Example:
- Covered: 15 of 15 design requirements verified
- Gap: 0 unverified design requirements
- Coverage: 100%
Design verification ensures that detailed design implementations (components, interfaces, algorithms) correctly implement system-level requirements before integration.
Test Document Inventory Tree
| Property | Value |
|---|---|
| Component | Hierarchical document navigation table |
| Columns | System Element / Document, Type, Status, Work Item Count, Tool |
| Data Source | LiveDoc modules in Testing space folder |
| Macro | nxDocInventoryTree(spaceId='Testing', columnHeader='System Element / Document', expandFirstLevel=true) |
| Hierarchy | System → Subsystem → Component → Document |
| Sort Order | By system element custom field value |
Example Structure:
- AEB System (System Level)
  - System Test Specification (12 test cases, 8 passed)
  - System Integration Test Plan (6 test cases, 4 passed)
- Sensor Housing (Subsystem Level)
  - Sensor Housing Test Spec (8 test cases, 6 passed)
  - Environmental Test Plan (5 test cases, 3 passed)
- ECU Processing (Subsystem Level)
  - ECU Functional Test Spec (10 test cases, 7 passed)
  - ECU Integration Test Plan (4 test cases, 2 passed)
- Vehicle Interface (Subsystem Level)
  - CAN Bus Test Spec (4 test cases, 4 passed)
The document inventory tree enables V&V engineers to quickly locate test documents organized by system hierarchy, rather than a flat alphabetical listing.
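The grouping behind the tree can be sketched as a fold over the documents' system-element field. The `system_element` and `title` field names are illustrative stand-ins for the custom field the macro actually sorts by:

```python
from collections import defaultdict

def build_inventory(documents):
    """Group document titles under their system-element node."""
    tree = defaultdict(list)
    for doc in documents:
        tree[doc["system_element"]].append(doc["title"])
    return dict(tree)

# Hypothetical documents mirroring part of the example structure
docs = [
    {"system_element": "AEB System", "title": "System Test Specification"},
    {"system_element": "Sensor Housing", "title": "Sensor Housing Test Spec"},
    {"system_element": "AEB System", "title": "System Integration Test Plan"},
]
print(build_inventory(docs))
```

A real implementation would recurse through the System → Subsystem → Component levels; this flat grouping shows only the first level of that hierarchy.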
Verification Sheet Links
System Verification Sheet
| Property | Value |
|---|---|
| Sheet Type | PowerSheet |
| Purpose | Display system requirements with linked test cases via verifies role |
| Configuration | System Verification Sheet.yaml |
| Expansion | System requirements expand to show back-linked test cases |
| Link Role | verifies (system requirement ← test case) |
| Use Case | Review all system requirements and their verification status |
Sheet Structure:
| System Requirement | Test Case ID | Test Case Title | Test Method | Status |
|---|---|---|---|---|
| SR-001 | TC-001 | Sensor Init Test | Analysis | ✓ Verified |
| SR-002 | TC-002, TC-003 | Failover Tests | Testing | ✓ Verified |
| SR-003 | (gap) | (unverified) | — | ⚠ Gap |
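The sheet's gap marking reduces to a join between requirements and their verifies backlinks. A minimal sketch, with an assumed `links` mapping standing in for the PowerSheet's expansion data:

```python
def sheet_rows(requirements, links):
    """Build (requirement, test cases, status) rows; empty links become gaps."""
    rows = []
    for req in requirements:
        tcs = links.get(req, [])
        status = "Verified" if tcs else "Gap"
        rows.append((req, ", ".join(tcs) or "(gap)", status))
    return rows

# Hypothetical backlinks reproducing the sheet structure above
links = {"SR-001": ["TC-001"], "SR-002": ["TC-002", "TC-003"]}
for row in sheet_rows(["SR-001", "SR-002", "SR-003"], links):
    print(row)
```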
Subsystem Verification Sheet
| Property | Value |
|---|---|
| Sheet Type | PowerSheet |
| Purpose | Display subsystem-level requirements with linked test cases |
| Scope | Filtered to subsystem-level requirements (type:sysReq with subsystem classification) |
| Expansion | Shows test cases linked via verifies role |
| Use Case | Track verification completeness for specific subsystems |
Subsystem verification sheets enable component teams to own verification of their requirements independently, with roll-up to system level.
Design Verification Sheet
| Property | Value |
|---|---|
| Sheet Type | PowerSheet |
| Purpose | Display design requirements with linked verification test cases |
| Configuration | Design Verification Sheet.yaml |
| Expansion | Design requirements expand to show back-linked test cases |
| Link Role | verifies (design requirement ← test case) |
| Use Case | Verify detailed design implementations before system integration |
Sheet Structure:
| Design Requirement | Component | Test Case | Test Method | Status |
|---|---|---|---|---|
| DR-001 (SC) | SoC | TC-021 | LSIL | ✓ Verified |
| DR-002 (CC) | Memory | TC-022 | LSIL | ✓ Verified |
| DR-003 | PMIC | (gap) | — | ⚠ Gap |
V-Model Verification Workflow Diagram
Test Case Work Item Type
The Testing space dashboard operates on the testCase work item type, which represents executable verification or validation procedures linked to requirements.
| Property | Value |
|---|---|
| Work Item Type | testCase |
| Instances in Project | 49 total (system + design + component level) |
| Parent Type | verificationTestCase or validationTestCase |
| Link Roles (Outbound) | verifies (to sysReq, desReq), validates (to customerRequirement) |
| Link Roles (Inbound) | None (test cases are terminal in traceability) |
| Custom Fields | testMethod, executionStatus, evidence |
| Workflow States | Draft → Ready → Executed → Passed/Failed |
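The workflow states in the table above form a simple linear state machine with one branch at the end. A sketch as a transition table — the states come from the table, but the enforcement code itself is an assumption, not the actual Polarion workflow configuration:

```python
# Allowed transitions for the testCase workflow:
# Draft -> Ready -> Executed -> Passed/Failed (terminal).
TRANSITIONS = {
    "Draft": {"Ready"},
    "Ready": {"Executed"},
    "Executed": {"Passed", "Failed"},
    "Passed": set(),   # terminal
    "Failed": set(),   # terminal
}

def can_transition(current, target):
    """Return True if the workflow permits moving current -> target."""
    return target in TRANSITIONS.get(current, set())

print(can_transition("Draft", "Ready"))      # True
print(can_transition("Draft", "Passed"))     # False
print(can_transition("Executed", "Failed"))  # True
```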
For deeper guidance on testing workflows, see:
Customization
The Testing Space Dashboard can be customized by editing the Velocity template at .polarion/pages/spaces/Testing/page.xml. Common customizations include:
- Adding new traceability metrics — Insert additional nxLinkCoverage() blocks to track custom link roles or requirement types
- Changing color scheme — Modify nxSpaceBanner() color parameters (currently teal #00695c/#00897b)
- Adding new PowerSheet links — Insert quick-link cards to newly created verification sheets
- Filtering by ASIL level — Add coverage metrics filtered by requirement ASIL classification
All changes must preserve the {{PLACEHOLDER}} syntax in dynamic content and use relative wiki page paths for cross-references.