
Overview

Test Cases play a dual role in automotive safety engineering:
  • Verification Test Cases — Verify system, subsystem, and design requirements through planned testing activities (ISO 26262 Parts 4–6)
  • Validation Test Cases — Validate customer requirements and user needs against stakeholder expectations (ISO 26262 Part 3)
The Test Case work item type implements a sophisticated relationship model that supports many-to-many traceability from requirements to test activities to external evidence artifacts, enabling comprehensive V-model coverage tracking and compliance reporting.

Test Case Hierarchy and Linking

Test Cases exist in a three-level traceability chain (requirement → test case → external evidence):
  • Verification Test Cases use the verifies link role to trace system, subsystem, and design requirements
  • Validation Test Cases use the validates link role to trace customer requirements
  • External References document the actual test evidence (reports, logs, certificates) via the externalReferences link role
Multiple test cases can verify a single requirement, and a single test case can verify multiple requirements, enabling flexible many-to-many test-to-requirement coverage.
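The many-to-many relationship can be sketched as two reverse indexes over the set of verifies links. This is an illustrative Python model, not Polarion's actual data structures; the IDs are taken from the examples in this document.

```python
from collections import defaultdict

# Each verifies link pairs one test case with one requirement.
verifies_links = [
    ("TC-045", "DREQ-023"),
    ("TC-045", "SREQ-015"),  # one test case verifies two requirements
    ("TC-046", "DREQ-023"),  # two test cases verify the same requirement
]

tests_for_requirement = defaultdict(set)
requirements_for_test = defaultdict(set)
for test_id, req_id in verifies_links:
    tests_for_requirement[req_id].add(test_id)
    requirements_for_test[test_id].add(req_id)

print(sorted(tests_for_requirement["DREQ-023"]))  # ['TC-045', 'TC-046']
print(sorted(requirements_for_test["TC-045"]))    # ['DREQ-023', 'SREQ-015']
```

Both directions of the index are needed in practice: coverage reports walk from requirements to tests, while impact analysis walks from a changed test back to its requirements.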

Core Identity Fields

| Name | Type | Default | Description |
| --- | --- | --- | --- |
| ID | String (read-only) | Auto-generated | Unique work item identifier, typically TC-### for test case numbering |
| title | String | (required) | Human-readable test case name, e.g. “Verify AEB activation below 80 km/h” |
| description | String | (optional) | Detailed description of what the test case validates, preconditions, and expected outcome |
| type | Enumeration (read-only) | testCase | Work item type identifier (always testCase for this type) |
| status | Enumeration | draft | Current state in the work item lifecycle: draft, inProgress, inReview, pendingApproval, approved, rejected, obsolete |
| assignee | User | (optional) | Engineer responsible for executing or maintaining the test case |
| created | Date (read-only) | Current timestamp | Date and time when the test case was created |
| updated | Date (read-only) | Current timestamp | Date and time of last modification |

Test Case Metadata

| Name | Type | Default | Description |
| --- | --- | --- | --- |
| priority | Enumeration | medium | Test execution priority: low, medium, high, critical. Critical tests are essential for safety compliance verification. |
| expectedResult | String | (optional) | Description of the expected outcome when the test passes, including measurable acceptance criteria |
| actualResult | String | (optional) | Recorded outcome from actual test execution, captured when the test is run |
| testMethod | Enumeration | (optional) | Testing approach: manual, automated, simulation, analysis. Defines how verification will be performed. |
| estimatedDuration | String | (optional) | Expected time to execute the test, e.g. “2 hours”, “1 day”. Aids test execution planning. |
| evidenceLocation | String | (optional) | Reference to where test evidence will be stored (file path, URL, or document reference) |

Traceability Relationship Fields

| Name | Type | Default | Description |
| --- | --- | --- | --- |
| verifies | Link (outgoing) | (optional) | Links to system, subsystem, or design requirements that this test case verifies. Multiple test cases can verify one requirement; one test case can verify multiple requirements. |
| validates | Link (outgoing) | (optional) | Links to customer requirements or user needs that this test case validates. Used for validation testing per the V-model methodology. |
| externalReferences | Link (outgoing) | (optional) | Links to external evidence artifacts (test reports, logs, certificates, analysis documents) that document test execution and results |
| derivedFrom | Link (outgoing) | (optional) | Links to risk controls or requirements that motivated the creation of this test case |
| relatedTests | Link (outgoing) | (optional) | Links to other test cases with a dependency or relationship (e.g., prerequisite tests, integration tests) |

Coverage and Completion Fields

| Name | Type | Default | Description |
| --- | --- | --- | --- |
| coveragePercentage | Integer | 0 | Percentage of test case completion (0–100), used in dashboards and coverage reports to track verification progress |
| testExecutionDate | Date | (optional) | Date when the test was most recently executed, enabling lifecycle tracking and an audit trail |
| testResult | Enumeration | (optional) | Result of the most recent test execution: passed, failed, inconclusive, blocked, skipped. Core metric for verification readiness. |
| approvalSignature | User (read-only) | (optional) | Approver who signed off on test case design and coverage when transitioning to approved status |

ISO 26262 and IATF Classification Fields

| Name | Type | Default | Description |
| --- | --- | --- | --- |
| classification | Enumeration | (optional) | Safety classification: SC (Safety Critical), CC (Conformance Critical), or standard. Inherited from the traced requirement’s classification. Use orange (SC) or red (CC) badge styling in PowerSheets. |
| asil | Enumeration | (optional) | ASIL level (QM, A, B, C, D) inherited from the linked requirement’s safety goal. Indicates the rigor and traceability completeness required. |
| verificationLevel | Enumeration | (optional) | V-model decomposition level: system, subsystem, design, unit. Determines which requirements this test case can verify. |
| referenceStandard | String | (optional) | Standard or specification this test traces to (e.g., “ISO 26262 Part 5”, “IATF 16949 Section 8.5.4”) |

Test Case Example: Design Verification Test

<workItem type="testCase" id="TC-045">
  <title>Verify AEB Activation Response Time under 100 ms</title>
  <description>
    This test case verifies that the AEB system's electronic control unit detects 
    an obstacle and initiates braking within 100 milliseconds, meeting the timing 
    requirement for functional safety under ISO 26262.
    
    Preconditions:
    - Camera module powered and calibrated
    - Radar module active and tracking obstacle at 50m range
    - Vehicle speed: 60 km/h
    - Road surface: dry asphalt
    
    Test Steps:
    1. Initialize test environment with simulated obstacle at 50m distance
    2. Record timestamp of obstacle detection signal entering ECU
    3. Record timestamp of first braking command output from ECU
    4. Calculate elapsed time between detection and command
    5. Compare against 100ms threshold
    
    Expected Result: Activation response time ≤ 100 milliseconds in all test runs
  </description>
  <priority>critical</priority>
  <testMethod>automated</testMethod>
  <estimatedDuration>4 hours</estimatedDuration>
  <classification>SC</classification>
  <asil>B</asil>
  <verificationLevel>design</verificationLevel>
  <verifies>
    <link workItemId="DREQ-023" role="verifies">
      Design Requirement: AEB activation response time ≤ 100 ms
    </link>
    <link workItemId="SREQ-015" role="verifies">
      System Requirement: Timely braking response activation
    </link>
  </verifies>
  <externalReferences>
    <link reference="shared://TestReports/ECU_ResponseTime_v2.3.pdf" role="evidence">
      Test Report: ECU Response Time Validation v2.3
    </link>
    <link reference="shared://Logs/TestRun_20260215_ECU_ResponseTime.csv" role="evidence">
      Test Execution Log: 2026-02-15 09:30 UTC
    </link>
  </externalReferences>
  <testExecutionDate>2026-02-15</testExecutionDate>
  <testResult>passed</testResult>
  <coveragePercentage>100</coveragePercentage>
  <status>approved</status>
  <assignee>eng-maria-design</assignee>
</workItem>
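When such work item records are exported or exchanged as XML, the traceability links can be read back programmatically. The following sketch parses an abbreviated copy of the example above with Python's standard library; it assumes only the XML shape shown in this document, not any Polarion export API.

```python
import xml.etree.ElementTree as ET

# Abbreviated copy of the TC-045 example above.
xml = """<workItem type="testCase" id="TC-045">
  <verifies>
    <link workItemId="DREQ-023" role="verifies">Design Requirement</link>
    <link workItemId="SREQ-015" role="verifies">System Requirement</link>
  </verifies>
  <testResult>passed</testResult>
  <coveragePercentage>100</coveragePercentage>
</workItem>"""

root = ET.fromstring(xml)
# Collect the IDs of all requirements this test case verifies.
verified = [link.get("workItemId") for link in root.findall("./verifies/link")]
result = root.findtext("testResult")
print(root.get("id"), verified, result)  # TC-045 ['DREQ-023', 'SREQ-015'] passed
```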

Verification vs Validation Test Cases

The TestAuto2 solution distinguishes between verification and validation using different link roles and document types:
| Aspect | Verification Test Case | Validation Test Case |
| --- | --- | --- |
| Link Role | verifies | validates |
| Requirement Level | System, Subsystem, Design | Customer Requirement, User Need |
| V-Model Phase | System Design (Part 4), Design (Part 5), Implementation (Part 6) | Requirements (Part 3) |
| Success Criterion | Does the design implement the requirement? | Does the system meet customer expectations? |
| Evidence Type | Component test logs, integration test results, analysis | System-level test reports, user acceptance records |
| Document Location | Testing/SystemVerificationSpecification, Testing/DesignVerificationSpecification | Testing/ValidationSpecification |
| Stakeholder | Design and V&V engineers | Customers, product managers, safety assessors |
| PowerSheet Display | Purple (requirement) + Orange (verification) columns | Green (customer requirement) + Validation columns |

PowerSheet Integration

Test Cases appear in several PowerSheet configurations. Each sheet enables inline test case creation via the entity factory, allowing users to define tests directly from the traceability context without navigating away.

Test Case Lifecycle and Workflow

Test Cases follow the standard Document Workflow States with seven states: draft, inProgress, inReview, pendingApproval, approved, rejected, obsolete.
When a test case transitions to pendingApproval, approvers from the project’s project_approver role are automatically added via the AddDefaultApprovals function. At least one approver must sign with an “approved” verdict, and no disapprovals are permitted, before the test case can transition to the approved status.
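The approval gate described above reduces to a simple predicate over the recorded verdicts. A minimal sketch (the function name and verdict strings are illustrative, not Polarion API identifiers):

```python
def can_transition_to_approved(verdicts):
    """Approval gate: at least one 'approved' verdict among the signers,
    and no 'disapproved' verdicts. `verdicts` maps approver -> verdict."""
    values = verdicts.values()
    return "approved" in values and "disapproved" not in values

print(can_transition_to_approved({"alice": "approved"}))                        # True
print(can_transition_to_approved({"alice": "approved", "bob": "disapproved"}))  # False
print(can_transition_to_approved({}))                                           # False (no signer yet)
```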

Mandatory Fields by Workflow State

| State | Mandatory Fields | Purpose |
| --- | --- | --- |
| draft | title | Allows initial creation without full details |
| inProgress | assignee, description | Indicates work is active; assigned and has scope |
| inReview | assignee, testMethod, expectedResult | Peer review can assess test design quality |
| pendingApproval | assignee, description, expectedResult, actualResult (if executed) | Formal approval gate; full test documentation required |
| approved | All above + approvalSignature | Test case is accepted and ready for execution |
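A transition guard can check these mandatory fields mechanically. The sketch below transcribes the table, assuming states are cumulative (each state also requires the fields of earlier states) and omitting the conditional actualResult check; field names match the tables in this document, everything else is illustrative.

```python
# Mandatory fields per workflow state, transcribed from the table above
# (cumulative across states; conditional actualResult omitted for brevity).
MANDATORY_FIELDS = {
    "draft": {"title"},
    "inProgress": {"title", "assignee", "description"},
    "inReview": {"title", "assignee", "description", "testMethod", "expectedResult"},
    "pendingApproval": {"title", "assignee", "description", "testMethod", "expectedResult"},
    "approved": {"title", "assignee", "description", "testMethod", "expectedResult",
                 "approvalSignature"},
}

def missing_fields(state, test_case):
    """Return the mandatory fields not yet filled in for the given state."""
    return {f for f in MANDATORY_FIELDS[state] if not test_case.get(f)}

tc = {"title": "Verify AEB activation", "assignee": "eng-maria-design"}
print(missing_fields("inProgress", tc))  # {'description'}
```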

Test Case Documentation and Evidence Linking

Test Cases serve as the single point of entry for test evidence in the traceability matrix. External evidence artifacts are linked via the externalReferences relationship:
| Evidence Type | Example | Use Case |
| --- | --- | --- |
| Test Report (PDF) | TestReports/AEB_Integration_Test_Report_v3.1.pdf | Formal test results for audits |
| Test Execution Log (CSV/JSON) | TestLogs/20260215_ECU_ResponseTime.csv | Detailed test data for root cause analysis |
| Test Code / Script | TestScripts/aeb_sensor_fusion_unit_test.py | Automated test implementation for reproducibility |
| Analysis Document | Analysis/StatisticalAnalysis_SensorAccuracy.md | Mathematical or statistical evidence |
| Certification / Certificate | Certs/Sensor_Module_Safety_Cert_ISO26262.pdf | Third-party validation evidence |
| Video / Recording | TestVideos/AEB_Full_System_Test_20260215.mp4 | Visual evidence for complex scenarios |
Test evidence must be retained for the product’s entire lifecycle per ISO 26262 Part 8 requirements. Use the evidenceLocation field to document where evidence is stored and archived, and configure document retention policies in Polarion administration.

Test Case Creation Context

Test Cases are typically created in the Testing space under one of three document types:
  • systemVerificationSpecification — For test cases verifying system requirements
  • designVerificationSpecification — For test cases verifying design requirements
  • validationSpecification — For test cases validating customer requirements
When creating a test case from a PowerSheet, the entity factory automatically:
  1. Creates the test case in the appropriate document based on requirement level
  2. Establishes the verifies or validates link from the requirement
  3. Sets the verificationLevel and classification fields from the linked requirement
  4. Assigns the test case to the current user as initial assignee
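The four factory steps above can be sketched as a single function. This is an illustrative model only: the document-type mapping, field names, and function signature are assumptions, not the actual entity factory API (the subsystem level, not specified above, is mapped to the system verification document here as an assumption).

```python
def create_test_case_from_powersheet(requirement, current_user):
    """Hypothetical sketch of the PowerSheet entity-factory steps."""
    # Step 1: choose the target document from the requirement level.
    documents = {
        "system": "systemVerificationSpecification",
        "subsystem": "systemVerificationSpecification",  # assumption
        "design": "designVerificationSpecification",
        "customer": "validationSpecification",
    }
    level = requirement["level"]
    return {
        "type": "testCase",
        "document": documents[level],
        # Step 2: validates for customer requirements, verifies otherwise.
        "links": [{"role": "validates" if level == "customer" else "verifies",
                   "workItemId": requirement["id"]}],
        # Step 3: inherit classification fields from the linked requirement.
        "verificationLevel": level,
        "classification": requirement.get("classification", "standard"),
        # Step 4: assign to the creating user; start in draft.
        "assignee": current_user,
        "status": "draft",
    }

tc = create_test_case_from_powersheet(
    {"id": "DREQ-023", "level": "design", "classification": "SC"},
    "eng-maria-design",
)
print(tc["document"], tc["links"][0]["role"], tc["assignee"])
```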
A single test case can verify multiple requirements (e.g., a system integration test that validates both timing and data accuracy requirements). Similarly, multiple test cases can verify one requirement (e.g., different test methods: simulation, hardware-in-the-loop, road test). This flexibility supports realistic test planning while maintaining complete traceability coverage.

Integration with Verification and Validation PowerSheets

The PowerSheet configurations use Test Cases as the core data model for V-model verification tracking. When using a Verification PowerSheet:
  1. Requirement Source Query — Queries system/design requirements from the current document
  2. First-Level Expansion — Traverses the verifies link role to show all test cases linked to each requirement
  3. Second-Level Expansion — Traverses the externalReferences link role from test cases to show evidence artifacts
  4. Entity Factory — Enables inline test case creation with automatic link establishment
  5. Coverage Calculation — Computes percentage of requirements with at least one test case
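Step 5, the coverage calculation, is the same "at least one test case per requirement" metric throughout this document. A minimal sketch (function and variable names are illustrative):

```python
def requirement_coverage(requirement_ids, verifies_links):
    """Percentage of requirements that have at least one linked test case.
    `verifies_links` is a list of (test_case_id, requirement_id) pairs."""
    if not requirement_ids:
        return 0.0
    covered = {req_id for _test_id, req_id in verifies_links}
    hit = sum(1 for r in requirement_ids if r in covered)
    return 100.0 * hit / len(requirement_ids)

reqs = ["SREQ-015", "DREQ-023", "DREQ-024"]
links = [("TC-045", "DREQ-023"), ("TC-045", "SREQ-015")]
print(round(requirement_coverage(reqs, links), 1))  # 66.7 — DREQ-024 is uncovered
```

Note that this metric counts link existence only; it says nothing about test results, so a requirement whose only test failed still counts as covered. The testResult field carries that separate dimension.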
See System Verification Sheet and Design Verification Sheet for detailed PowerSheet configuration examples.
Test Cases relate to the following work item types:

| Type | Relationship | Purpose |
| --- | --- | --- |
| System Requirement | Test cases verify system requirements | Functional requirements at the system decomposition level |
| Design Requirement | Test cases verify design requirements | Implementation-level technical specifications |
| Customer Requirement | Validation test cases validate customer requirements | Stakeholder and regulatory needs |
| User Need | Validation test cases can validate user needs | Informal stakeholder requirements |
| Risk Control | Test cases may be derived from risk controls | Mitigation verification and effectiveness testing |
| Characteristic | Test cases verify characteristic target values | Quality parameter validation |
| Subsystem Requirement | Test cases verify subsystem requirements | Intermediate decomposition level |

Configuration and Customization

Test Case behavior can be customized through Polarion project configuration:
  • Custom Fields — Add project-specific test metadata (e.g., test environment, tools used, tester qualifications)
  • Workflow States — Extend or modify the seven-state lifecycle to match your organization’s approval gates
  • Form Layout — Configure which fields appear for different user roles via Hazard Form Layout equivalents
  • Reports — Generate test coverage reports using Verification Test Case and Validation Test Case specifications
See Project Properties for project-level configuration guidance.

Summary

The Test Case work item type is the cornerstone of V-model verification and validation in TestAuto2 — Automotive Safety Solution. By implementing sophisticated many-to-many traceability between requirements and test evidence, Test Cases enable organizations to demonstrate complete verification coverage and maintain audit trails required by ISO 26262, AIAG-VDA FMEA, and IATF 16949 standards. The tight integration with PowerSheet verification matrices and external evidence linking makes Test Cases the primary mechanism for documenting and proving that functional safety requirements have been met.