Overview
Verification Test Cases verify system, subsystem, and design requirements through the verifies link role. They are distinct from Validation Test Cases, which validate Customer Requirement work items. This distinction aligns with the V-model methodology, where verification confirms technical specifications while validation confirms customer needs.
VERIFICATION HIERARCHY (V-Model Right Side)
System Requirements (31)
↓ verifies ↓
System Verification Tests (31)
↓ evidence ↓
External Artifacts
Design Requirements (15)
↓ verifies ↓
Design Verification Tests (15)
↓ evidence ↓
Test Reports, CAT Analysis
Core Properties
| Name | Type | Default | Description |
|---|---|---|---|
| ID | String (auto-generated) | — | Unique work item identifier (e.g., TA-TEST-001) assigned by Polarion |
| Title | String | — | Test case name describing what is being verified (e.g., Verify System Response Time Under Peak Load) |
| Description | String | — | Detailed narrative explaining test objectives, scope, and success criteria |
| Status | Enum | draft | Workflow state: draft, inProgress, inReview, pendingApproval, approved |
| Assignee | User | — | V&V engineer responsible for creating and executing the test case |
Test Content Properties
| Name | Type | Default | Description |
|---|---|---|---|
| Test Objective | String | — | High-level goal of the test (e.g., Validate ECU response to loss-of-signal condition) |
| Test Method | Enum | — | Verification approach: blackBox, whiteBox, integration, inspection, demonstration |
| Test Environment | String | — | Equipment, tools, or simulator configuration required (e.g., dSPACE MicroAutoBox II, CANoe 11) |
| Preconditions | String | — | Initial system state or configuration before test execution (e.g., ECU powered, CAN bus running at 500 kbps) |
| Test Steps | String | — | Sequence of actions and observations (step number, action, expected result) |
| Expected Results | String | — | Acceptance criteria and pass/fail thresholds (e.g., Response time ≤ 100 ms, no CAN errors logged) |
| Pass Criteria | String | — | Quantitative or qualitative criteria for test success |
| Notes | String | — | Additional context, dependencies, or known issues |
Execution Tracking Properties
| Name | Type | Default | Description |
|---|---|---|---|
| Execution Status | Enum | notStarted | Test run state: notStarted, inProgress, passed, failed, blocked |
| Executed By | User | — | Engineer who performed the test run |
| Execution Date | Date | — | Date test was executed (ISO 8601 format: YYYY-MM-DD) |
| Result Summary | String | — | Brief outcome description with notable findings or deviations |
| Evidence Attached | Boolean | — | Indicates if external verification evidence is linked (e.g., test report, CAT file, oscilloscope screenshot) |
Traceability Properties
| Name | Type | Default | Description |
|---|---|---|---|
| Verifies (Link Role) | Link | — | Reference to System Requirement, Subsystem Requirement, or Design Requirement work item being verified |
| External References | Link | — | Link role pointing to external evidence artifacts (test reports, CAT files, simulation logs, oscilloscope captures) stored in Polarion or external repositories |
| Relates To | Link | — | Generic relationship to other test cases or related work items for context |
Relationships and Link Roles
Primary Relationships
Link Role: verifies
- Direction: VerificationTestCase → Requirement
- Cardinality: Many-to-many (one test can verify multiple requirements; one requirement can be verified by multiple tests)
- Usage: Establishes the core V-model right-side traceability. A single test case often verifies multiple related requirements (e.g., functional + performance requirements for the same system element).
Link Role: externalReferences
- Direction: VerificationTestCase → External Artifact
- Cardinality: One-to-many (one test case can reference multiple evidence artifacts)
- Usage: Points to external verification evidence such as CAT (Computer Aided Test) files, oscilloscope waveforms, test execution reports, or simulation logs in .polarion/documents/external or external URL repositories.
Example Link Configuration
<!-- RTM Model: VerificationTestCase entity definition -->
<entity name="VerificationTestCase" polarionType="testCase">
<relationship name="verifies" role="verifies"
targetEntity="SystemRequirement|SubsystemRequirement|DesignRequirement"/>
<relationship name="externalReferences" role="externalReferences"
targetEntity="ExternalReference"/>
</entity>
PowerSheet Integration
Verification Test Cases appear in two PowerSheet configurations supporting ISO 26262 V-model verification workflows:
System Verification Sheet
Purpose: Bidirectional traceability matrix linking system requirements → verification test cases → evidence
Column Groups:
- System Requirements (purple): ID, title, description, classification
- Verification Tests (orange): Linked test case IDs and titles with multiLine support for detailed test descriptions
- External Evidence (red): Verification evidence artifacts (test reports, CAT files)
Query: RTM model query starting from SystemRequirement entity, filtered by TestCase module parameter
Entity Factory: Enables inline test case creation directly from the sheet, auto-creating linked VerificationTestCase work items in Testing/SystemVerificationSpecification module
Design Pattern: Three-level expansion hierarchy (requirement → verificationTestCases → externalReferences) supporting complete verification evidence chain
See System Verification Sheet for configuration details.
Design Verification Sheet
Purpose: Traceability matrix linking design requirements → verification test cases → evidence for design-level verification
Column Groups:
- Design Requirements (blue): ID, title, type, rationale with read-only display
- Verification Tests (orange): Linked test case titles with bold formatting for visual emphasis
- External Evidence (red): Supporting evidence artifacts
Query: DesignRequirement source with verificationTestCases and externalReferences expansion
Entity Factory: Auto-creates VerificationTestCase work items in Testing/DesignVerificationSpecification module with automatic link creation
Document Grouping: Groups rows by owning document, enabling multi-document verification reviews
See Design Verification Sheet for configuration details.
Workflow Integration
Verification Test Cases participate in the standard Document Workflow States through their parent module:
Verification Test Cases must reach the approved state before test execution can begin. This ensures design review gate compliance for ISO 26262 Part 5 (Hardware) and Part 6 (Software) requirements.
RTM Model Configuration
Verification Test Cases are configured in the RTM domain model as follows:
Entity Type: TestCase (polarionType: testCase)
Document Constraints:
- System Verification Tests: Constrained to Testing/SystemVerificationSpecification document type
- Design Verification Tests: Constrained to Testing/DesignVerificationSpecification document type
Relationship Configuration:
entity_type: TestCase
polarion_type: testCase
document_filters:
  - type: systemVerificationSpecification
    path: Testing/SystemVerificationSpecification
  - type: designVerificationSpecification
    path: Testing/DesignVerificationSpecification
relationships:
  verifies:
    target_types: [SystemRequirement, SubsystemRequirement, DesignRequirement]
  externalReferences:
    target_types: [ExternalReference]
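A lightweight consistency check over this relationship configuration can catch typos in target entity names. The config is shown here as the dict a YAML parser would produce, and the set of known entity names is an assumption for illustration:

```python
# Consistency check: every relationship target must name a declared entity.
# The config dict mirrors the YAML fragment above; KNOWN_ENTITIES is a
# hypothetical registry of entity names in the RTM model.
config = {
    "entity_type": "TestCase",
    "relationships": {
        "verifies": {
            "target_types": [
                "SystemRequirement", "SubsystemRequirement", "DesignRequirement",
            ],
        },
        "externalReferences": {"target_types": ["ExternalReference"]},
    },
}

KNOWN_ENTITIES = {
    "SystemRequirement", "SubsystemRequirement", "DesignRequirement",
    "ExternalReference", "TestCase",
}

def unknown_targets(cfg, known):
    """Return (relationship, target) pairs whose target entity is undeclared."""
    return [
        (rel, target)
        for rel, spec in cfg["relationships"].items()
        for target in spec["target_types"]
        if target not in known
    ]

print(unknown_targets(config, KNOWN_ENTITIES))  # [] when all targets are declared
```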
Code Example
<!-- Verification Test Case work item example (minimal structure) -->
<workitem>
<type>testCase</type>
<title>Verify ECU Response to Sensor Failure</title>
<description>
This test verifies that the ECU transitions to safe state when the front-facing camera sensor fails.
Test Objective: Validate limp-home mode activation
Test Method: blackBox
Test Environment: dSPACE MicroAutoBox II with CANoe
Preconditions:
- ECU powered and running
- CAN bus initialized at 500 kbps
- Front camera sensor ready
Test Steps:
1. Start ECU in normal operating mode
2. Simulate camera sensor loss (unplug harness)
3. Monitor CAN bus for DTC (Diagnostic Trouble Code)
4. Verify AEB system transitions to degraded mode
5. Record response time from fault detection to mode transition
Expected Results:
- DTC logged within 100 ms
- Mode transition completed within 500 ms
- AEB system operational at reduced capability
- No unintended braking events
Pass Criteria: All expected results observed; response time ≤ 500 ms
</description>
<status>approved</status>
<assignee>john.doe</assignee>
<custom-field name="executionStatus">notStarted</custom-field>
<links>
<link role="verifies" type="SystemRequirement" workitem="TA-SR-015"/>
<link role="externalReferences" workitem="evidence-001"/>
</links>
</workitem>
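A script consuming exported work items of this shape can extract the verifies links with Python's standard library XML parser. The snippet below, a sketch over a trimmed copy of the example above, shows the kind of extraction a coverage report might perform:

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the minimal work item XML shown above.
WORKITEM_XML = """\
<workitem>
  <type>testCase</type>
  <title>Verify ECU Response to Sensor Failure</title>
  <status>approved</status>
  <links>
    <link role="verifies" type="SystemRequirement" workitem="TA-SR-015"/>
    <link role="externalReferences" workitem="evidence-001"/>
  </links>
</workitem>
"""

root = ET.fromstring(WORKITEM_XML)

# Collect the target IDs of all links with the "verifies" role.
verifies = [
    link.get("workitem")
    for link in root.findall("./links/link")
    if link.get("role") == "verifies"
]

print(root.findtext("title"))  # Verify ECU Response to Sensor Failure
print(verifies)                # ['TA-SR-015']
```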
Custom Fields
Verification Test Cases may include project-specific custom fields for detailed traceability. Common examples:
| Field Name | Type | Purpose |
|---|---|---|
| ISO26262_Requirement_Level | Enum | Compliance level: Part 3 (Concept), Part 4 (System), Part 5 (Hardware), Part 6 (Software) |
| ASIL_Applicable | Enum | ASIL classification of verified requirement (QM, A, B, C, D) |
| Test_Complexity | Enum | Estimated complexity: simple, moderate, complex |
| Estimated_Duration_Hours | Number | Time estimate for test execution and documentation |
| Verification_Environment_Category | Enum | Scope: unit, integration, system, field |
| Traceability_Complete | Boolean | Flag indicating all required linkages established |
Usage Patterns
Pattern 1: Requirement-Driven Verification Planning
- Create Design Requirement work item in designRequirementsSpecification
- Use Design Verification Sheet to create linked VerificationTestCase via entity factory
- Test case auto-created in Testing/DesignVerificationSpecification module with verifies link established
- Author test method, environment, and acceptance criteria
- Route through approval workflow
Pattern 2: Multi-Level Verification Coverage
Verify the same functional requirement at multiple levels:
- System Level: VerificationTestCase in systemVerificationSpecification verifies System Requirement
- Subsystem Level: VerificationTestCase in subsystem-specific module verifies Subsystem Requirement
- Design Level: VerificationTestCase in designVerificationSpecification verifies Design Requirement
This hierarchical approach ensures ISO 26262 Part 4/5/6 compliance with evidence at each design level.
Pattern 3: Evidence Chain Documentation
Link verification evidence through the external reference chain:
DesignRequirement (Brake Response Time ≤ 200 ms)
↓ verifies
VerificationTestCase (Verify Response Time)
↓ externalReferences
- ExternalReference: simulation-results-v2.dsx (CAT file)
- ExternalReference: test-report-2025-02-15.pdf (Test execution report)
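The chain above can be modeled in memory to check evidence completeness, e.g. before a safety review. This is a sketch using the artifact names from the diagram; the classes and the completeness rule (every verifying test must have at least one evidence artifact) are illustrative assumptions, not the RTM model's API:

```python
from dataclasses import dataclass, field

# Hypothetical in-memory model of the evidence chain shown above.
@dataclass
class TestCase:
    title: str
    external_references: list = field(default_factory=list)  # evidence artifacts

@dataclass
class DesignRequirement:
    title: str
    verified_by: list = field(default_factory=list)  # linked test cases

test = TestCase(
    "Verify Response Time",
    external_references=["simulation-results-v2.dsx", "test-report-2025-02-15.pdf"],
)
req = DesignRequirement("Brake Response Time <= 200 ms", verified_by=[test])

def evidence_complete(requirement):
    """Chain is complete when the requirement has at least one verifying
    test and every such test carries at least one evidence artifact."""
    return bool(requirement.verified_by) and all(
        tc.external_references for tc in requirement.verified_by
    )

print(evidence_complete(req))  # True
```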
Coverage Reporting
Verification Test Case coverage is tracked in safety readiness metrics:
Metric: Verification Coverage
Covered = Count(DesignRequirement with back-linked VerificationTestCase)
Total = Count(DesignRequirement)
Coverage% = (Covered / Total) × 100
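The metric can be computed directly from the requirement list and the verifies link pairs. The IDs below are hypothetical:

```python
# Verification coverage: fraction of design requirements with at least one
# back-linked verification test case. IDs are hypothetical examples.
design_requirements = ["TA-DR-001", "TA-DR-002", "TA-DR-003", "TA-DR-004"]
verifies_links = [  # (test case, design requirement)
    ("TA-TEST-010", "TA-DR-001"),
    ("TA-TEST-011", "TA-DR-002"),
    ("TA-TEST-012", "TA-DR-002"),  # multiple tests on one requirement count once
]

covered = {req for _, req in verifies_links if req in design_requirements}
coverage_pct = len(covered) / len(design_requirements) * 100

print(f"Coverage: {coverage_pct:.0f}%")  # Coverage: 50%
```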
Gap Query (Lucene): type:testCase AND NOT linkedWorkItems:verifies=TA* identifies verification test cases not linked to any requirements.
Reverse Gap Query (Lucene): type:desReq AND NOT backlinkedWorkItems:verifies=TA* identifies design requirements without verification test cases.
Cross-References