Overview
Test Cases play a dual role in automotive safety engineering:
- Verification Test Cases — Verify system, subsystem, and design requirements through planned testing activities (ISO 26262 Parts 4-6)
- Validation Test Cases — Validate customer requirements and user needs against stakeholder expectations (ISO 26262 Part 3)
The Test Case work item type implements a sophisticated relationship model that supports many-to-many traceability from requirements to test activities to external evidence artifacts, enabling comprehensive V-model coverage tracking and compliance reporting.
Test Case Hierarchy and Linking
Test Cases exist in a three-level traceability chain:
- Verification Test Cases use the verifies link role to trace system, subsystem, and design requirements
- Validation Test Cases use the validates link role to trace customer requirements
- External References document the actual test evidence (reports, logs, certificates) via the externalReferences link role
Multiple test cases can verify a single requirement, and a single test case can verify multiple requirements, enabling flexible many-to-many test-to-requirement coverage.
Core Identity Fields
| Name | Type | Default | Description |
|---|---|---|---|
| ID | String (read-only) | Auto-generated | Unique work item identifier, typically TC-### for test case numbering |
| title | String | (required) | Human-readable test case name, e.g. “Verify AEB activation below 80 km/h” |
| description | String | (optional) | Detailed description of what the test case validates, preconditions, and expected outcome |
| type | Enumeration (read-only) | testCase | Work item type identifier (always testCase for this type) |
| status | Enumeration | draft | Current state in work item lifecycle: draft, inProgress, inReview, pendingApproval, approved, rejected, obsolete |
| assignee | User | (optional) | Engineer responsible for executing or maintaining the test case |
| created | Date (read-only) | Current timestamp | Date and time when test case was created |
| updated | Date (read-only) | Current timestamp | Date and time of last modification |
Test Planning and Execution Fields
| Name | Type | Default | Description |
|---|---|---|---|
| priority | Enumeration | medium | Test execution priority: low, medium, high, critical. Critical tests are essential for safety compliance verification. |
| expectedResult | String | (optional) | Description of the expected outcome when test passes, including measurable acceptance criteria |
| actualResult | String | (optional) | Recorded outcome from actual test execution, captured when test is run |
| testMethod | Enumeration | (optional) | Testing approach: manual, automated, simulation, analysis. Defines how verification will be performed. |
| estimatedDuration | String | (optional) | Expected time to execute test, e.g. “2 hours”, “1 day”. Aids in test execution planning. |
| evidenceLocation | String | (optional) | Reference to where test evidence will be stored (file path, URL, or document reference) |
Traceability Relationship Fields
| Name | Type | Default | Description |
|---|---|---|---|
| verifies | Link (outgoing) | (optional) | Links to system, subsystem, or design requirements that this test case verifies. Multiple test cases can verify one requirement; one test case can verify multiple requirements. |
| validates | Link (outgoing) | (optional) | Links to customer requirements or user needs that this test case validates. Used for validation testing per V-model methodology. |
| externalReferences | Link (outgoing) | (optional) | Links to external evidence artifacts (test reports, logs, certificates, analysis documents) that document test execution and results |
| derivedFrom | Link (outgoing) | (optional) | Links to risk controls or requirements that motivated the creation of this test case |
| relatedTests | Link (outgoing) | (optional) | Links to other test cases that have dependency or relationship (e.g., prerequisite tests, integration tests) |
Coverage and Completion Fields
| Name | Type | Default | Description |
|---|---|---|---|
| coveragePercentage | Integer | 0 | Percentage of test case completion (0-100), used in dashboards and coverage reports to track verification progress |
| testExecutionDate | Date | (optional) | Date when test was most recently executed, enabling lifecycle tracking and audit trail |
| testResult | Enumeration | (optional) | Result of most recent test execution: passed, failed, inconclusive, blocked, skipped. Core metric for verification readiness. |
| approvalSignature | User (read-only) | (optional) | Approver who signed off on test case design and coverage when transitioning to approved status |
ISO 26262 and IATF Classification Fields
| Name | Type | Default | Description |
|---|---|---|---|
| classification | Enumeration | (optional) | Safety classification: SC (Safety Critical), CC (Conformance Critical), or standard. Inherited from traced requirement classification. Use orange (SC) or red (CC) badge styling in PowerSheets. |
| asil | Enumeration | (optional) | ASIL level (QM, A, B, C, D) inherited from linked requirement’s safety goal. Indicates rigor and traceability completeness required. |
| verificationLevel | Enumeration | (optional) | V-model decomposition level: system, subsystem, design, unit. Determines which requirements this test case can verify. |
| referenceStandard | String | (optional) | Standard or specification this test traces to (e.g., “ISO 26262 Part 5”, “IATF 16949 Section 8.5.4”) |
Test Case Example: Design Verification Test
```xml
<workItem type="testCase" id="TC-045">
  <title>Verify AEB Activation Response Time under 100 ms</title>
  <description>
    This test case verifies that the AEB system's electronic control unit detects
    an obstacle and initiates braking within 100 milliseconds, meeting the timing
    requirement for functional safety under ISO 26262.

    Preconditions:
    - Camera module powered and calibrated
    - Radar module active and tracking obstacle at 50m range
    - Vehicle speed: 60 km/h
    - Road surface: dry asphalt

    Test Steps:
    1. Initialize test environment with simulated obstacle at 50m distance
    2. Record timestamp of obstacle detection signal entering ECU
    3. Record timestamp of first braking command output from ECU
    4. Calculate elapsed time between detection and command
    5. Compare against 100ms threshold

    Expected Result: Activation response time ≤ 100 milliseconds in all test runs
  </description>
  <priority>critical</priority>
  <testMethod>automated</testMethod>
  <estimatedDuration>4 hours</estimatedDuration>
  <classification>SC</classification>
  <asil>B</asil>
  <verificationLevel>design</verificationLevel>
  <verifies>
    <link workItemId="DREQ-023" role="verifies">
      Design Requirement: AEB activation response time ≤ 100 ms
    </link>
    <link workItemId="SREQ-015" role="verifies">
      System Requirement: Timely braking response activation
    </link>
  </verifies>
  <externalReferences>
    <link reference="shared://TestReports/ECU_ResponseTime_v2.3.pdf" role="evidence">
      Test Report: ECU Response Time Validation v2.3
    </link>
    <link reference="shared://Logs/TestRun_20260215_ECU_ResponseTime.csv" role="evidence">
      Test Execution Log: 2026-02-15 09:30 UTC
    </link>
  </externalReferences>
  <testExecutionDate>2026-02-15</testExecutionDate>
  <testResult>passed</testResult>
  <coveragePercentage>100</coveragePercentage>
  <status>approved</status>
  <assignee>eng-maria-design</assignee>
</workItem>
```
Verification vs Validation Test Cases
The TestAuto2 solution distinguishes between verification and validation using different link roles and document types:
| Aspect | Verification Test Case | Validation Test Case |
|---|---|---|
| Link Role | verifies | validates |
| Requirement Level | System, Subsystem, Design | Customer Requirement, User Need |
| V-Model Phase | System Design (Part 4), Design (Part 5), Implementation (Part 6) | Requirements (Part 3) |
| Success Criterion | Does the design implement the requirement? | Does the system meet customer expectations? |
| Evidence Type | Component test logs, integration test results, analysis | System-level test reports, user acceptance records |
| Document Location | Testing/SystemVerificationSpecification, Testing/DesignVerificationSpecification | Testing/ValidationSpecification |
| Stakeholder | Design and V&V engineers | Customers, product managers, safety assessors |
| PowerSheet Display | Purple (requirement) + Orange (verification) columns | Green (customer requirement) + Validation columns |
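The link-role distinction in the table can be reduced to a small selection rule. The helper below is a hypothetical sketch; the level identifiers used for validation targets ("customerRequirement", "userNeed") are illustrative, not canonical names from the solution:

```python
# Hypothetical mapping; only the role names "verifies"/"validates"
# come from the table above, the level identifiers are assumptions.
VERIFICATION_LEVELS = {"system", "subsystem", "design"}
VALIDATION_LEVELS = {"customerRequirement", "userNeed"}

def link_role_for(requirement_level):
    """Pick the outgoing link role a test case should use for a requirement."""
    if requirement_level in VERIFICATION_LEVELS:
        return "verifies"
    if requirement_level in VALIDATION_LEVELS:
        return "validates"
    raise ValueError(f"unknown requirement level: {requirement_level!r}")
```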
PowerSheet Integration
Test Cases appear in several PowerSheet configurations, including the System Verification and Design Verification sheets. Each sheet enables inline test case creation via an entity factory, allowing users to define tests directly from the traceability context without navigating away.
Test Case Lifecycle and Workflow
Test Cases follow the standard Document Workflow States, a seven-state lifecycle: draft, inProgress, inReview, pendingApproval, approved, rejected, and obsolete.
When a test case transitions to pendingApproval, approvers from the project’s project_approver role are automatically added via the AddDefaultApprovals function. At least one approver must sign with “approved” verdict, and no disapprovals are permitted, before the test case can transition to approved status.
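The approval gate described above can be expressed as a small predicate. This is a hypothetical sketch, not Polarion's actual workflow engine; the verdict literal "disapproved" is an assumption for what the source calls a disapproval:

```python
def can_transition_to_approved(signatures):
    """Approval gate: at least one "approved" verdict and no
    disapprovals among the collected approver signatures."""
    verdicts = [s["verdict"] for s in signatures]
    return "approved" in verdicts and "disapproved" not in verdicts
```

Note that an empty signature list fails the gate, which matches the requirement that at least one approver must sign.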
Mandatory Fields by Workflow State
| State | Mandatory Fields | Purpose |
|---|---|---|
| draft | title | Allows initial creation without full details |
| inProgress | assignee, description | Indicates work is active; assigned and has scope |
| inReview | assignee, testMethod, expectedResult | Peer review can assess test design quality |
| pendingApproval | assignee, description, expectedResult, actualResult (if executed) | Formal approval gate; full test documentation required |
| approved | All above + approvalSignature | Test case is accepted and ready for execution |
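A workflow hook might validate these gates before a state transition. The sketch below transcribes the table into cumulative field sets, under the assumption that earlier states' mandatory fields stay mandatory later (the table implies this for approved via "All above"); the conditional actualResult rule is omitted for brevity:

```python
# Cumulative mandatory-field sets per workflow state, transcribed from
# the table above. Assumption: earlier fields remain mandatory in later
# states. The "actualResult (if executed)" condition is not modeled here.
MANDATORY_FIELDS = {
    "draft": {"title"},
    "inProgress": {"title", "assignee", "description"},
    "inReview": {"title", "assignee", "description", "testMethod",
                 "expectedResult"},
    "pendingApproval": {"title", "assignee", "description", "testMethod",
                        "expectedResult"},
    "approved": {"title", "assignee", "description", "testMethod",
                 "expectedResult", "approvalSignature"},
}

def missing_fields(work_item, state):
    """Fields still empty that the target state requires."""
    present = {name for name, value in work_item.items() if value}
    return sorted(MANDATORY_FIELDS[state] - present)
```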
Test Case Documentation and Evidence Linking
Test Cases serve as the single point of entry for test evidence in the traceability matrix. External evidence artifacts are linked via the externalReferences relationship:
| Evidence Type | Example | Use Case |
|---|---|---|
| Test Report (PDF) | TestReports/AEB_Integration_Test_Report_v3.1.pdf | Formal test results for audits |
| Test Execution Log (CSV/JSON) | TestLogs/20260215_ECU_ResponseTime.csv | Detailed test data for root cause analysis |
| Test Code / Script | TestScripts/aeb_sensor_fusion_unit_test.py | Automated test implementation for reproducibility |
| Analysis Document | Analysis/StatisticalAnalysis_SensorAccuracy.md | Mathematical or statistical evidence |
| Certification / Certificate | Certs/Sensor_Module_Safety_Cert_ISO26262.pdf | Third-party validation evidence |
| Video / Recording | TestVideos/AEB_Full_System_Test_20260215.mp4 | Visual evidence for complex scenarios |
Test evidence must be retained for the product’s entire lifecycle per ISO 26262 Part 8 requirements. Use the evidenceLocation field to document where evidence is stored and archived, and configure document retention policies in Polarion administration.
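Tooling that ingests externalReferences entries often needs a rough evidence category. The classifier below is a hypothetical sketch keyed on file extension, following the evidence-type table above; the shared:// prefix follows the XML example and is not a required scheme. Note that reports and certificates share the .pdf extension, so extension alone cannot distinguish them:

```python
from pathlib import PurePosixPath

# Hypothetical extension-to-kind mapping derived from the evidence table
# above; real classification may need metadata beyond the file name.
EVIDENCE_KINDS = {
    ".pdf": "report",    # also matches certificates, a known limitation
    ".csv": "log",
    ".json": "log",
    ".py": "script",
    ".md": "analysis",
    ".mp4": "video",
}

def classify_evidence(reference):
    """Rough evidence category for an externalReferences entry."""
    suffix = PurePosixPath(reference).suffix.lower()
    return EVIDENCE_KINDS.get(suffix, "other")
```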
Test Case Creation Context
Test Cases are typically created in the Testing space under one of three document types:
- systemVerificationSpecification — For test cases verifying system requirements
- designVerificationSpecification — For test cases verifying design requirements
- validationSpecification — For test cases validating customer requirements
When creating a test case from a PowerSheet, the entity factory automatically:
- Creates the test case in the appropriate document based on requirement level
- Establishes the verifies or validates link from the requirement
- Sets the verificationLevel and classification fields from the linked requirement
- Assigns the test case to the current user as initial assignee
A single test case can verify multiple requirements (e.g., a system integration test that validates both timing and data accuracy requirements). Similarly, multiple test cases can verify one requirement (e.g., different test methods: simulation, hardware-in-the-loop, road test). This flexibility supports realistic test planning while maintaining complete traceability coverage.
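The entity-factory steps listed above can be sketched as a single constructor. This is a hypothetical mirror of the factory's visible effects, not the Polarion implementation; field names follow this page's field tables:

```python
def create_test_case(requirement, current_user, new_id):
    """Hypothetical sketch of the entity-factory steps: create the item,
    link it back, inherit classification fields, assign the creator."""
    verification = requirement["level"] in {"system", "subsystem", "design"}
    return {
        "id": new_id,
        "type": "testCase",
        "status": "draft",
        # Establish the verifies/validates link to the source requirement
        "links": [{"workItemId": requirement["id"],
                   "role": "verifies" if verification else "validates"}],
        # Inherit verificationLevel and classification from the requirement
        "verificationLevel": requirement["level"],
        "classification": requirement.get("classification"),
        # The current user becomes the initial assignee
        "assignee": current_user,
    }
```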
Integration with Verification and Validation PowerSheets
The PowerSheet configurations use Test Cases as the core data model for V-model verification tracking. When using a Verification PowerSheet:
- Requirement Source Query — Queries system/design requirements from the current document
- First-Level Expansion — Traverses the verifies link role to show all test cases linked to each requirement
- Second-Level Expansion — Traverses the externalReferences link role from test cases to show evidence artifacts
- Entity Factory — Enables inline test case creation with automatic link establishment
- Coverage Calculation — Computes percentage of requirements with at least one test case
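The coverage calculation in the last step is simple: a requirement counts as covered once at least one test case links to it. A hypothetical sketch of that rule:

```python
def requirement_coverage(requirement_ids, verifies_links):
    """Percentage of requirements with at least one linked test case;
    verifies_links is an iterable of (test_case_id, requirement_id) pairs."""
    if not requirement_ids:
        return 0
    covered = {req for _tc, req in verifies_links} & set(requirement_ids)
    return round(100 * len(covered) / len(requirement_ids))
```

Because coverage counts requirements rather than links, adding a second test case to an already-covered requirement does not raise the percentage.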
See System Verification Sheet and Design Verification Sheet for detailed PowerSheet configuration examples.
Related Work Item Types
| Type | Relationship | Purpose |
|---|---|---|
| System Requirement | Test cases verify system requirements | Functional requirements at system decomposition level |
| Design Requirement | Test cases verify design requirements | Implementation-level technical specifications |
| Customer Requirement | Validation test cases validate customer requirements | Stakeholder and regulatory needs |
| User Need | Validation test cases can validate user needs | Informal stakeholder requirements |
| Risk Control | Test cases may be derived from risk controls | Mitigation verification and effectiveness testing |
| Characteristic | Test cases verify characteristic target values | Quality parameter validation |
| Subsystem Requirement | Test cases verify subsystem requirements | Intermediate decomposition level |
Configuration and Customization
Test Case behavior can be customized through Polarion project configuration:
- Custom Fields — Add project-specific test metadata (e.g., test environment, tools used, tester qualifications)
- Workflow States — Extend or modify the seven-state lifecycle to match your organization’s approval gates
- Form Layout — Configure which fields appear for different user roles via Hazard Form Layout equivalents
- Reports — Generate test coverage reports using Verification Test Case and Validation Test Case specifications
See Project Properties for project-level configuration guidance.
Summary
The Test Case work item type is the cornerstone of V-model verification and validation in TestAuto2 — Automotive Safety Solution. By implementing sophisticated many-to-many traceability between requirements and test evidence, Test Cases enable organizations to demonstrate complete verification coverage and maintain audit trails required by ISO 26262, AIAG-VDA FMEA, and IATF 16949 standards. The tight integration with PowerSheet verification matrices and external evidence linking makes Test Cases the primary mechanism for documenting and proving that functional safety requirements have been met.