Overview

Verification Test Cases verify system, subsystem, and design requirements through the verifies link role. They are distinct from Validation Test Cases, which validate Customer Requirement work items. This distinction follows V-model methodology: verification confirms that the implementation meets its technical specifications, while validation confirms that it meets customer needs.
VERIFICATION HIERARCHY (V-Model Right Side)

  System Requirements (31)
           ↓ verifies ↓
    System Verification Tests (31)
           ↓ evidence ↓
      External Artifacts

  Design Requirements (15)
           ↓ verifies ↓
    Design Verification Tests (15)
           ↓ evidence ↓
      Test Reports, CAT Analysis
Verification Test Cases appear in two PowerSheet configurations: System Verification Sheet and Design Verification Sheet. These sheets provide three-tier traceability matrices linking requirements → test cases → external evidence.

Core Properties

| Name | Type | Default | Description |
| --- | --- | --- | --- |
| ID | String | (auto-generated) | Unique work item identifier (e.g., TA-TEST-001) assigned by Polarion |
| Title | String | | Test case name describing what is being verified (e.g., Verify System Response Time Under Peak Load) |
| Description | String | | Detailed narrative explaining test objectives, scope, and success criteria |
| Status | Enum | draft | Workflow state: draft, inProgress, inReview, pendingApproval, approved |
| Assignee | User | | V&V engineer responsible for creating and executing the test case |

Test Content Properties

| Name | Type | Default | Description |
| --- | --- | --- | --- |
| Test Objective | String | | High-level goal of the test (e.g., Validate ECU response to loss-of-signal condition) |
| Test Method | Enum | | Verification approach: blackBox, whiteBox, integration, inspection, demonstration |
| Test Environment | String | | Equipment, tools, or simulator configuration required (e.g., dSPACE MicroAutoBox II, CANoe 11) |
| Preconditions | String | | Initial system state or configuration before test execution (e.g., ECU powered, CAN bus running at 500 kbps) |
| Test Steps | String | | Sequence of actions and observations (step number, action, expected result) |
| Expected Results | String | | Acceptance criteria and pass/fail thresholds (e.g., Response time ≤ 100 ms, no CAN errors logged) |
| Pass Criteria | String | | Quantitative or qualitative criteria for test success |
| Notes | String | | Additional context, dependencies, or known issues |

Execution Tracking Properties

| Name | Type | Default | Description |
| --- | --- | --- | --- |
| Execution Status | Enum | notStarted | Test run state: notStarted, inProgress, passed, failed, blocked |
| Executed By | User | | Engineer who performed the test run |
| Execution Date | Date | | Date test was executed (ISO 8601 format: YYYY-MM-DD) |
| Result Summary | String | | Brief outcome description with notable findings or deviations |
| Evidence Attached | Boolean | | Indicates whether external verification evidence is linked (e.g., test report, CAT file, oscilloscope screenshot) |
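The execution states in the table above form a small state machine. The sketch below shows one way to validate status changes; only the five state names come from this document, while the transition map itself is a hypothetical example, not the product's actual workflow configuration:

```python
# Minimal sketch of execution-status validation. The five state names come
# from the table above; ALLOWED_TRANSITIONS is a hypothetical example, not
# the tool's real workflow rules.
EXECUTION_STATES = {"notStarted", "inProgress", "passed", "failed", "blocked"}

ALLOWED_TRANSITIONS = {
    "notStarted": {"inProgress"},
    "inProgress": {"passed", "failed", "blocked"},
    "blocked": {"inProgress"},          # a blocked run may resume
    "failed": {"inProgress"},           # e.g., re-run after a fix
    "passed": set(),                    # terminal in this sketch
}

def can_transition(current: str, target: str) -> bool:
    """Return True if the (assumed) workflow permits current -> target."""
    if current not in EXECUTION_STATES or target not in EXECUTION_STATES:
        raise ValueError(f"unknown execution status: {current!r} -> {target!r}")
    return target in ALLOWED_TRANSITIONS[current]
```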

Traceability Properties

| Name | Type | Default | Description |
| --- | --- | --- | --- |
| Verifies (Link Role) | Link | | Reference to the System Requirement, Subsystem Requirement, or Design Requirement work item being verified |
| External References | Link | | Link role pointing to external evidence artifacts (test reports, CAT files, simulation logs, oscilloscope captures) stored in Polarion or external repositories |
| Relates To | Link | | Generic relationship to other test cases or related work items for context |

Primary Relationships

Link Role: verifies
  • Direction: VerificationTestCase → Requirement
  • Cardinality: Many-to-many (one test can verify multiple requirements; one requirement can be verified by multiple tests)
  • Usage: Establishes the core V-model right-side traceability. A single test case often verifies multiple related requirements (e.g., functional + performance requirements for the same system element).
Link Role: externalReferences
  • Direction: VerificationTestCase → External Artifact
  • Cardinality: One-to-many (one test case can reference multiple evidence artifacts)
  • Usage: Points to external verification evidence such as CAT (Computer Aided Test) files, oscilloscope waveforms, test execution reports, or simulation logs in .polarion/documents/external or external URL repositories.
<!-- RTM Model: VerificationTestCase entity definition -->
<entity name="VerificationTestCase" polarionType="testCase">
  <relationship name="verifies" role="verifies" 
    targetEntity="SystemRequirement|SubsystemRequirement|DesignRequirement"/>
  <relationship name="externalReferences" role="externalReferences" 
    targetEntity="ExternalReference"/>
</entity>
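The many-to-many verifies relationship defined above can be sketched as a pair of lookup maps, one per traversal direction. The work item IDs below are illustrative, not from a real project:

```python
from collections import defaultdict

# Illustrative many-to-many "verifies" links (IDs are made up for the sketch).
verifies_links = [
    ("TA-TEST-001", "TA-SR-015"),   # one test verifying two requirements
    ("TA-TEST-001", "TA-SR-016"),
    ("TA-TEST-002", "TA-SR-016"),   # one requirement verified by two tests
]

# Forward map: test case -> requirements it verifies.
verifies = defaultdict(set)
# Back-link map: requirement -> test cases that verify it.
verified_by = defaultdict(set)
for test_id, req_id in verifies_links:
    verifies[test_id].add(req_id)
    verified_by[req_id].add(test_id)

print(sorted(verifies["TA-TEST-001"]))    # ['TA-SR-015', 'TA-SR-016']
print(sorted(verified_by["TA-SR-016"]))   # ['TA-TEST-001', 'TA-TEST-002']
```

The back-link map is what coverage reporting (see below) iterates over: a requirement with an empty entry is a verification gap.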

PowerSheet Integration

Verification Test Cases appear in two PowerSheet configurations supporting ISO 26262 V-model verification workflows:

System Verification Sheet

Purpose: Bidirectional traceability matrix linking system requirements → verification test cases → evidence

Column Groups:
  • System Requirements (purple): ID, title, description, classification
  • Verification Tests (orange): Linked test case IDs and titles with multiLine support for detailed test descriptions
  • External Evidence (red): Verification evidence artifacts (test reports, CAT files)
Query: RTM model query starting from the SystemRequirement entity, filtered by the TestCase module parameter

Entity Factory: Enables inline test case creation directly from the sheet, auto-creating linked VerificationTestCase work items in the Testing/SystemVerificationSpecification module

Design Pattern: Three-level expansion hierarchy (requirement → verificationTestCases → externalReferences) supporting the complete verification evidence chain

See System Verification Sheet for configuration details.

Design Verification Sheet

Purpose: Traceability matrix linking design requirements → verification test cases → evidence for design-level verification

Column Groups:
  • Design Requirements (blue): ID, title, type, rationale with read-only display
  • Verification Tests (orange): Linked test case titles with bold formatting for visual emphasis
  • External Evidence (red): Supporting evidence artifacts
Query: DesignRequirement source with verificationTestCases and externalReferences expansion

Entity Factory: Auto-creates VerificationTestCase work items in the Testing/DesignVerificationSpecification module with automatic link creation

Document Grouping: Groups rows by owning document, enabling multi-document verification reviews

See Design Verification Sheet for configuration details.

Workflow Integration

Verification Test Cases participate in the standard Document Workflow States through their parent module. A test case must reach the approved state before test execution can begin, which enforces design review gate compliance for ISO 26262 Part 5 (Hardware) and Part 6 (Software) requirements.

RTM Model Configuration

Verification Test Cases are configured in the RTM domain model as follows:

Entity Type: TestCase (polarionType: testCase)

Document Constraints:
  • System Verification Tests: Constrained to Testing/SystemVerificationSpecification document type
  • Design Verification Tests: Constrained to Testing/DesignVerificationSpecification document type
Relationship Configuration:
entity_type: TestCase
polarion_type: testCase
document_filters:
  - type: systemVerificationSpecification
    path: Testing/SystemVerificationSpecification
  - type: designVerificationSpecification
    path: Testing/DesignVerificationSpecification
relationships:
  verifies:
    target_types: [SystemRequirement, SubsystemRequirement, DesignRequirement]
  externalReferences:
    target_types: [ExternalReference]
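One consumer of this configuration is link validation at creation time. The sketch below inlines the relationship section as a Python dict mirroring the YAML (a real implementation would load it with a YAML parser); the check_link helper is illustrative, not part of the tool's API:

```python
# Relationship config inlined as a dict mirroring the YAML above.
# check_link is a hypothetical helper for the sketch, not a real tool API.
RELATIONSHIPS = {
    "verifies": {"SystemRequirement", "SubsystemRequirement", "DesignRequirement"},
    "externalReferences": {"ExternalReference"},
}

def check_link(role: str, target_type: str) -> None:
    """Raise ValueError if a link of this role may not point at target_type."""
    allowed = RELATIONSHIPS.get(role)
    if allowed is None:
        raise ValueError(f"unknown link role: {role!r}")
    if target_type not in allowed:
        raise ValueError(f"{role!r} may not target {target_type!r}")

check_link("verifies", "DesignRequirement")      # OK: allowed target
# check_link("verifies", "CustomerRequirement")  # would raise ValueError
```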

Code Example

<!-- Verification Test Case work item example (minimal structure) -->
<workitem>
  <type>testCase</type>
  <title>Verify ECU Response to Sensor Failure</title>
  <description>
    This test verifies that the ECU transitions to safe state when the front-facing camera sensor fails.
    
Test Objective: Validate limp-home mode activation
Test Method: blackBox
Test Environment: dSPACE MicroAutoBox II with CANoe
Preconditions:
  - ECU powered and running
  - CAN bus initialized at 500 kbps
  - Front camera sensor ready
  
Test Steps:
  1. Start ECU in normal operating mode
  2. Simulate camera sensor loss (unplug harness)
  3. Monitor CAN bus for DTC (Diagnostic Trouble Code)
  4. Verify AEB system transitions to degraded mode
  5. Record response time from fault detection to mode transition
  
Expected Results:
  - DTC logged within 100 ms
  - Mode transition completed within 500 ms
  - AEB system operational at reduced capability
  - No unintended braking events
  
Pass Criteria: All expected results observed; response time ≤ 500 ms
  </description>
  <status>approved</status>
  <assignee>john.doe</assignee>
  <custom-field name="executionStatus">notStarted</custom-field>
  
  <links>
    <link role="verifies" type="SystemRequirement" workitem="TA-SR-015"/>
    <link role="externalReferences" workitem="evidence-001"/>
  </links>
</workitem>
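The work item structure above is plain XML, so traceability data can be pulled out with a standard parser. This sketch extracts the verifies links from a trimmed copy of the example using Python's stdlib:

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the work item example above, enough to show link extraction.
WORKITEM_XML = """\
<workitem>
  <type>testCase</type>
  <title>Verify ECU Response to Sensor Failure</title>
  <status>approved</status>
  <links>
    <link role="verifies" type="SystemRequirement" workitem="TA-SR-015"/>
    <link role="externalReferences" workitem="evidence-001"/>
  </links>
</workitem>
"""

root = ET.fromstring(WORKITEM_XML)
verified_ids = [link.get("workitem")
                for link in root.findall("./links/link")
                if link.get("role") == "verifies"]
print(verified_ids)  # ['TA-SR-015']
```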

Custom Fields

Verification Test Cases may include project-specific custom fields for detailed traceability. Common examples:
| Field Name | Type | Purpose |
| --- | --- | --- |
| ISO26262_Requirement_Level | Enum | Compliance level: Part 3 (Concept), Part 4 (System), Part 5 (Hardware), Part 6 (Software) |
| ASIL_Applicable | Enum | ASIL classification of the verified requirement (QM, A, B, C, D) |
| Test_Complexity | Enum | Estimated complexity: simple, moderate, complex |
| Estimated_Duration_Hours | Number | Time estimate for test execution and documentation |
| Verification_Environment_Category | Enum | Scope: unit, integration, system, field |
| Traceability_Complete | Boolean | Flag indicating all required linkages are established |

Usage Patterns

Pattern 1: Requirement-Driven Verification Planning

  1. Create Design Requirement work item in designRequirementsSpecification
  2. Use Design Verification Sheet to create linked VerificationTestCase via entity factory
  3. Test case auto-created in Testing/DesignVerificationSpecification module with verifies link established
  4. Author test method, environment, and acceptance criteria
  5. Route through approval workflow

Pattern 2: Multi-Level Verification Coverage

Verify the same functional requirement at multiple levels:
  • System Level: VerificationTestCase in systemVerificationSpecification verifies System Requirement
  • Subsystem Level: VerificationTestCase in subsystem-specific module verifies Subsystem Requirement
  • Design Level: VerificationTestCase in designVerificationSpecification verifies Design Requirement
This hierarchical approach ensures ISO 26262 Part 4/5/6 compliance with evidence at each design level.

Pattern 3: Evidence Chain Documentation

Link verification evidence through the external reference chain:

DesignRequirement (Brake Response Time ≤ 200 ms)
         ↓ verifies
VerificationTestCase (Verify Response Time)
         ↓ externalReferences
  • ExternalReference: simulation-results-v2.dsx (CAT file)
  • ExternalReference: test-report-2025-02-15.pdf (Test execution report)

Coverage Reporting

Verification Test Case coverage is tracked in safety readiness metrics.

Metric: Verification Coverage
Covered = Count(DesignRequirement with back-linked VerificationTestCase)
Total   = Count(DesignRequirement)
Coverage% = (Covered / Total) × 100
Gap Query (Lucene): type:testCase AND NOT linkedWorkItems:verifies=TA* identifies verification test cases not linked to any requirement.

Reverse Gap Query (Lucene): type:desReq AND NOT backlinkedWorkItems:verifies=TA* identifies design requirements without verification test cases.
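The coverage formula above can be computed directly from the back-link data. Requirement and test IDs in this sketch are illustrative:

```python
# Sketch of the Verification Coverage metric above (IDs are illustrative).
design_requirements = ["TA-DR-001", "TA-DR-002", "TA-DR-003", "TA-DR-004"]

# requirement -> test cases that back-link to it via "verifies"
verified_by = {
    "TA-DR-001": ["TA-TEST-010"],
    "TA-DR-002": ["TA-TEST-011", "TA-TEST-012"],
    # TA-DR-003 and TA-DR-004 have no verification tests yet
}

covered = [r for r in design_requirements if verified_by.get(r)]
gaps = [r for r in design_requirements if not verified_by.get(r)]
coverage_pct = 100.0 * len(covered) / len(design_requirements)

print(f"Coverage: {coverage_pct:.0f}%")  # Coverage: 50%
print("Gaps:", gaps)                     # Gaps: ['TA-DR-003', 'TA-DR-004']
```

The gaps list corresponds to the results of the reverse gap query: design requirements with no back-linked verification test case.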

Cross-References