This dashboard serves as the central workspace for V&V engineers performing verification (confirming the system is built correctly, per ISO 26262 Parts 4–6) and validation (confirming the right system is built) activities. It provides real-time visibility into test coverage metrics, traceability to requirements, and verification evidence management.
Dashboard Overview
The V&V Engineer Dashboard implements the ISO 26262 distinction between verification and validation:
- Verification: Confirms that system, subsystem, and component designs conform to their respective requirement specifications (system built correctly). Uses the ‘verifies’ link role.
- Validation: Confirms that customer requirements are satisfied by the complete integrated system (right system built). Uses the ‘validates’ link role.
The dashboard centralizes both verification and validation activities in a single workspace, enabling V&V engineers to track coverage across all test levels and ensure the bidirectional traceability required by ISO 26262 Part 8 (Supporting Processes).
Dashboard Components
Test Artifact Statistics
| Component | Type | Purpose | Default |
|---|---|---|---|
| Test Cases Count | KPI Metric | Total test case work items in project | Dynamic count |
| Verification Test Cases | KPI Metric | Test cases linked via ‘verifies’ role to requirements | Dynamic count |
| Validation Test Cases | KPI Metric | Test cases linked via ‘validates’ role to customer requirements | Dynamic count |
| Test Documents | KPI Metric | Total documents in Testing space | Dynamic count |
| Coverage Trend | Chart | Historical trend showing coverage percentage over time | 7-day rolling |
Technical Details:
The statistics bar is rendered by the Nextedy macro nxStatCount, which executes queries through the Polarion Transaction API. Work items are counted by type using Lucene queries: type:testCase AND moduleId:Testing for test cases in Testing documents, linkedWorkItems:verifies for verification links, and linkedWorkItems:validates for validation links.
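The counting logic behind these KPIs can be sketched outside Polarion. Below is a minimal Python sketch, assuming work items arrive as plain dicts with hypothetical "type" and "link_roles" fields; this is illustrative only, not the actual Polarion schema or the nxStatCount implementation.

```python
# Sketch: tally dashboard KPI counts from a flat list of work items.
# The field names ("type", "link_roles") are illustrative placeholders,
# not the real Polarion data model.
from collections import Counter

def kpi_counts(work_items):
    counts = Counter()
    for item in work_items:
        if item["type"] == "testCase":
            counts["test_cases"] += 1
            roles = item.get("link_roles", [])
            if "verifies" in roles:
                counts["verification_test_cases"] += 1
            if "validates" in roles:
                counts["validation_test_cases"] += 1
    return counts

items = [
    {"type": "testCase", "link_roles": ["verifies"]},
    {"type": "testCase", "link_roles": ["validates"]},
    {"type": "testCase", "link_roles": ["verifies", "validates"]},
    {"type": "sysReq", "link_roles": []},
]
print(kpi_counts(items))
```

A test case that carries both roles is counted in both KPI buckets, which matches the table above: the verification and validation counts are independent metrics, not a partition of the total.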
Verification Methods Reference Table
ISO 26262 Part 4 Section 8 defines four acceptable verification methods. The dashboard displays these methods with implementation guidance:
| Method | Description | ISO 26262 Reference | Applicable Levels |
|---|---|---|---|
| Review | Examination of specification or design by qualified personnel | Part 4 §8.2.1 | System, Subsystem, Component, Software |
| Analysis | Mathematical or logical proof of correctness; timing analysis; WCET analysis | Part 4 §8.2.2 | System, Subsystem, Component |
| Simulation | Execution of model against test scenarios to verify behavior | Part 4 §8.2.3 | System (HILS), Subsystem, Component (SIL) |
| Testing | Execution of compiled code or hardware against test cases and acceptance criteria | Part 4 §8.2.4 | All levels (SIL, HIL, vehicle testing) |
Each method can be applied individually or in combination. The dashboard tracks test coverage using the ‘verifies’ link role to establish traceability between test cases and system/subsystem/design requirements.
Test Document Inventory Tree
The Test Document Inventory Tree displays all documents in the Testing space organized by system element hierarchy. The tree structure enables V&V engineers to navigate by system decomposition:
SYSTEM: AEB System
├── SUBSYSTEM: Sensor Housing Subsystem
│ ├── Sensor Housing Assembly - Test Cases (12 items)
│ ├── Camera Module - Test Cases (8 items)
│ └── Radar Module - Test Cases (10 items)
├── SUBSYSTEM: ECU Processing Subsystem
│ ├── System-on-Chip (SoC) - Test Cases (7 items)
│ ├── Safety Co-Processor - Test Cases (4 items)
│ └── Memory - Test Cases (3 items)
└── SUBSYSTEM: Vehicle Interface Subsystem
└── CAN Transceivers - Test Cases (5 items)
Macro Implementation: Uses Nextedy nxDocInventoryTree(spaceFilter="Testing", columnHeader="System Element / Document", expandFirstLevel=true) to generate the hierarchical tree filtered to the Testing space only.
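The grouping the macro performs can be approximated with a short sketch. The record fields ("subsystem", "document", "count") are illustrative placeholders, not the Nextedy data model:

```python
# Sketch: group test documents by subsystem to mirror the inventory tree.
# Field names are hypothetical; nxDocInventoryTree's real inputs differ.
from collections import defaultdict

def build_tree(docs):
    tree = defaultdict(list)
    for d in docs:
        tree[d["subsystem"]].append((d["document"], d["count"]))
    return dict(tree)

docs = [
    {"subsystem": "Sensor Housing Subsystem",
     "document": "Camera Module - Test Cases", "count": 8},
    {"subsystem": "Sensor Housing Subsystem",
     "document": "Radar Module - Test Cases", "count": 10},
    {"subsystem": "Vehicle Interface Subsystem",
     "document": "CAN Transceivers - Test Cases", "count": 5},
]
for subsystem, entries in build_tree(docs).items():
    print(f"SUBSYSTEM: {subsystem}")
    for name, n in entries:
        print(f"  {name} ({n} items)")
```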
System Requirements Verification Coverage
This metric displays the percentage of system requirements that are verified by at least one test case:
| Metric | Value | Formula |
|---|---|---|
| Verified Requirements | Dynamic | Count of sysReq items with backlinks from testCase via ‘verifies’ role |
| Total System Requirements | 31 | Count of all sysReq work items in project |
| Coverage Percentage | Dynamic | Verified / Total × 100% |
| Gap Query | Link to gaps | Lucene: type:sysReq AND NOT backlinkedWorkItems:verifies |
Coverage Bar Implementation: The dashboard renders a traffic-light progress bar using nxCoverageBar(covered=$verified, total=$total, label="System Requirements Verification", gapQuery="type:sysReq AND NOT backlinkedWorkItems:verifies"). The bar displays:
- Green (≥80%): Coverage target met
- Yellow (50–79%): Partial coverage requiring gap closure
- Red (<50%): Significant verification gaps
Clicking the gap count link navigates directly to the Work Items Tracker filtered to show unverified requirements, enabling rapid gap closure workflows.
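The percentage calculation and traffic-light banding can be expressed as a small sketch; the thresholds come from the bullet list above, while the function name is hypothetical:

```python
# Sketch: coverage percentage and traffic-light banding for the coverage bar.
# Thresholds per the dashboard: green >= 80%, yellow 50-79%, red < 50%.
def coverage_status(covered, total):
    if total == 0:
        return 0.0, "red"  # no requirements means no demonstrable coverage
    pct = covered / total * 100
    if pct >= 80:
        color = "green"
    elif pct >= 50:
        color = "yellow"
    else:
        color = "red"
    return pct, color

# Example: 25 of the 31 system requirements verified.
pct, color = coverage_status(25, 31)
print(f"{pct:.1f}% -> {color}")  # prints 80.6% -> green
```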
Customer Requirements Validation Coverage
This metric displays the percentage of customer requirements validated by at least one test case:
| Metric | Value | Formula |
|---|---|---|
| Validated Requirements | Dynamic | Count of customerReq items with backlinks from testCase via ‘validates’ role |
| Total Customer Requirements | 25 | Count of all customerReq work items in project |
| Coverage Percentage | Dynamic | Validated / Total × 100% |
| Gap Query | Link to gaps | Lucene: type:customerReq AND NOT backlinkedWorkItems:validates |
Technical Implementation: Parallel to the system requirements verification coverage metric, but using the ‘validates’ link role instead of ‘verifies’. This implements the ISO 26262 Part 8 requirement for bidirectional traceability between customer requirements and validation evidence.
Design Requirements Verification Coverage
This metric tracks verification coverage for design-level requirements:
| Metric | Value | Formula |
|---|---|---|
| Verified Design Reqs | Dynamic | Count of desReq items with backlinks from testCase via ‘verifies’ role |
| Total Design Requirements | 15 | Count of all desReq work items in project |
| Coverage Percentage | Dynamic | Verified / Total × 100% |
Design verification typically includes unit testing, component integration testing, and subsystem testing phases. The dashboard supports traceability from design requirements through characteristics to failure mode analysis, enabling comprehensive design V&V.
System/Subsystem/Design Verification Sheet Links
The dashboard provides quick links to PowerSheet documents that expand verification traceability:
| Sheet | Purpose | Link Target |
|---|---|---|
| System Verification Sheet | Expands sysReq items with testCase backlinks via ‘verifies’ role | .polarion/nextedy/sheet-configurations/System Verification Sheet.yaml |
| Subsystem Verification Sheet | Expands subsystem-scoped requirements with verification test cases | .polarion/nextedy/sheet-configurations/Subsystem Verification Sheet.yaml |
| Design Verification Sheet | Expands desReq items with testCase backlinks via ‘verifies’ role | .polarion/nextedy/sheet-configurations/Design Verification Sheet.yaml |
| User Need Validation Sheet | Expands customerReq items with validation test cases via ‘validates’ role | .polarion/nextedy/sheet-configurations/User Need Validation Sheet.yaml |
These PowerSheets enable V&V engineers to view complete verification/validation chains with traceability weights, requirement attributes, test case details, and evidence attachments in a single interactive view.
V&V Dashboard Workflow
The dashboard supports the core V&V engineer workflows: monitoring verification and validation coverage against targets, navigating the gap query links to close traceability gaps, and drilling into the PowerSheets for detailed evidence review.
Configuration Properties
| Property | Type | Default | Description |
|---|---|---|---|
| TESTING_SPACE_ID | String | Testing | Polarion space ID for test documents |
| VERIFICATION_COVERAGE_TARGET | Percentage | 80% | Target coverage % for system/design verification |
| VALIDATION_COVERAGE_TARGET | Percentage | 75% | Target coverage % for customer requirement validation |
| VERIFICATION_LINK_ROLE | String | verifies | Link role name for verification traceability |
| VALIDATION_LINK_ROLE | String | validates | Link role name for validation traceability |
| INCLUDE_MANUAL_TESTS | Boolean | true | Include manual test cases in coverage metrics |
| INCLUDE_AUTOMATED_TESTS | Boolean | true | Include automated test cases in coverage metrics |
| REFRESH_INTERVAL | Minutes | 5 | Dashboard metric refresh frequency |
| EXPORT_FORMAT | Enum | PDF, Excel | Supported export formats for coverage reports |
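As an illustration only, the defaults above could be modeled as a simple configuration loader. The dictionary and load_config function below are hypothetical, not Polarion's actual configuration mechanism:

```python
# Sketch: dashboard configuration defaults mirroring the properties table.
# The loading mechanism is illustrative; Polarion stores configuration
# differently.
DEFAULTS = {
    "TESTING_SPACE_ID": "Testing",
    "VERIFICATION_COVERAGE_TARGET": 80,   # percent
    "VALIDATION_COVERAGE_TARGET": 75,     # percent
    "VERIFICATION_LINK_ROLE": "verifies",
    "VALIDATION_LINK_ROLE": "validates",
    "INCLUDE_MANUAL_TESTS": True,
    "INCLUDE_AUTOMATED_TESTS": True,
    "REFRESH_INTERVAL": 5,                # minutes
}

def load_config(overrides=None):
    cfg = dict(DEFAULTS)
    cfg.update(overrides or {})
    # Sanity-check the coverage targets before the dashboard uses them.
    for key in ("VERIFICATION_COVERAGE_TARGET", "VALIDATION_COVERAGE_TARGET"):
        if not 0 <= cfg[key] <= 100:
            raise ValueError(f"{key} must be a percentage (0-100)")
    return cfg

cfg = load_config({"REFRESH_INTERVAL": 10})
print(cfg["REFRESH_INTERVAL"], cfg["VERIFICATION_LINK_ROLE"])
```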
Related Pages
For more information on V&V workflows and test management, see the related pages in this space.
Aim for ≥80% verification coverage for all system and design requirements before system integration testing. Track validation coverage separately to ensure customer requirements are addressed by the complete system. Use the gap query links to identify and prioritize coverage gaps.
Do not confuse verification (system built correctly) with validation (right system built). Use the ‘verifies’ link role exclusively for requirement-to-test-case verification traceability, and ‘validates’ exclusively for customer requirement validation. Mixing the two roles invalidates traceability evidence in ISO 26262 compliance audits.