
Purpose and Role

Validation Test Cases implement the top arm of the V-model, creating bidirectional traceability between Customer Requirement work items and actual evidence of validation. In ISO 26262 terminology, validation ensures the system meets its intended use; in ISO 14971, validation confirms that risk controls are effective in practice.

Validation Test Cases are work items of type validationTestCase that:
  • Link to one or more Customer Requirement entities via the validates relationship
  • May include test procedure steps and acceptance criteria
  • Support evidence attachment and traceability to actual test runs
  • Track sign-off status in document workflow
  • Enable V-model compliance reporting per ISO 26262, ISO 14971, and IATF 16949

Properties

Core Identification

| Property Name | Type | Default | Description |
|---|---|---|---|
| id | String | Auto-generated | Unique work item identifier (e.g., VTC-001) within the project prefix |
| title | String | | Descriptive title of the validation test case (e.g., “Validate AEB activation in low-light conditions”) |
| description | String | | Detailed test description, acceptance criteria, and expected behavior |
| status | Enumeration | draft | Workflow state: draft, inProgress, inReview, pendingApproval, approved |
| assignee | User Reference | | Person responsible for executing the validation test |

Validation and Linkage

| Property Name | Type | Default | Description |
|---|---|---|---|
| validates | Requirements Link (validates) | | Back-linked Customer Requirement work items this test validates. Populated via link.role=validates in the RTM model |
| testCaseType | Enumeration | userAcceptance | Type of validation test: userAcceptance, scenarioTesting, usabilityTest, regressionValidation, fieldTest |
| testPhase | Enumeration | systemValidation | Validation phase in the V-model: systemValidation, integrationValidation, fieldValidation |

Test Procedure and Evidence

| Property Name | Type | Default | Description |
|---|---|---|---|
| testProcedure | String | | Step-by-step test instructions (may reference external test procedure documents) |
| acceptanceCriteria | String | | Explicit pass/fail criteria for the validation test |
| testEnvironment | String | | Description of the test environment: hardware platform, OS version, test tools, user population |
| testData | String | | Input data, test scenarios, or boundary conditions to be validated |
| externalReferences | Link Collection | | Evidence artifacts (test logs, photos, videos, customer sign-off documents) linked via the externalReferences relationship |

Traceability and Coverage

| Property Name | Type | Default | Description |
|---|---|---|---|
| linkedCustomerRequirements | Count (computed) | | Number of Customer Requirement items this test validates (used in traceability reports) |
| testCaseStatus | Enumeration | notStarted | Execution status: notStarted, inProgress, passed, failed, blocked, deferred |
| lastExecutedDate | Date | | Timestamp of the most recent test execution |
| executedBy | User Reference | | Person who executed the test (may differ from assignee) |

Optional Advanced Fields

| Property Name | Type | Default | Description |
|---|---|---|---|
| riskLevel | Enumeration | medium | Risk level if the validation test fails: low, medium, high, critical. Informs validation priority and scheduling |
| customerRepresentativeSignoff | User Reference | | Customer or stakeholder who approved the validation result |
| notes | String | | Additional validation notes, edge cases tested, or anomalies observed |
| version | String | | Document version where this validation test is documented (e.g., “v1.2”) |

Relationships and Traceability

Outgoing Relationships

Validates → Customer Requirement

Every Validation Test Case must link to at least one Customer Requirement via the validates role. A single customer requirement may be validated by multiple test cases (testing different scenarios or user populations). Example:
  • ValidationTestCase VTC-001
    • validates > CR-005: User shall be able to activate manual override
    • validates > CR-006: Manual override shall require confirmation
    • validates > CR-007: System shall log all manual overrides
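The validates relationship is effectively a many-to-one-or-more link from test case to requirements. As a minimal in-memory sketch (the Python class below is illustrative only, not the actual work item API):

```python
from dataclasses import dataclass, field

@dataclass
class ValidationTestCase:
    """Illustrative in-memory model of a validationTestCase work item."""
    id: str
    title: str
    validates: list = field(default_factory=list)  # linked Customer Requirement IDs

# VTC-001 from the example above validates three customer requirements
vtc_001 = ValidationTestCase(
    id="VTC-001",
    title="Validate manual override behavior",  # title is hypothetical
    validates=["CR-005", "CR-006", "CR-007"],
)

# A well-formed test case must link to at least one requirement
assert len(vtc_001.validates) >= 1
```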

Incoming Relationships

Validated By (back-link) ← Customer Requirement

Each Customer Requirement is displayed with a count of the Validation Test Cases that validate it, enabling V-model traceability verification.

External References

Validation Test Cases may link to external evidence artifacts via externalReferences:
  • Test execution logs (CSV, JSON)
  • Screen recordings or photos
  • Customer acceptance sign-offs (PDF)
  • Field validation reports
  • Regulatory audit evidence
These are managed through the PowerSheet User Need Validation Sheet.

Workflow Lifecycle

Validation Test Cases progress through the document workflow states below. State meanings:
  • Draft — Test case being authored; not yet ready for formal review
  • In Review — Test procedure under review by quality/safety engineer
  • Pending Approval — Approved by technical reviewer; awaiting management sign-off
  • Approved — Ready for execution; test can now be run
Once a test is Approved, execution status (testCaseStatus) tracks whether the test has been run:
  • notStarted → inProgress → passed / failed / blocked

Validation vs. Verification

  • Validation Test Cases validate customer requirements (does it meet user needs?)
  • Verification Test Cases verify design/system requirements (does the design meet specifications?)
  • Both are essential for ISO 26262-8 (Supporting Processes) compliance
  • See Verification vs Validation for detailed methodology
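The execution-status flow can be expressed as a transition table. A minimal sketch, assuming only the transitions shown above (resuming from blocked or failed is an assumption, not a documented rule):

```python
# Allowed testCaseStatus transitions; the table itself is illustrative and
# should be adapted to your project's actual workflow configuration.
ALLOWED_TRANSITIONS = {
    "notStarted": {"inProgress"},
    "inProgress": {"passed", "failed", "blocked", "deferred"},
    "blocked": {"inProgress"},   # assumption: blocked tests can be resumed
    "failed": {"inProgress"},    # assumption: re-execution after a fix
}

def can_transition(current: str, target: str) -> bool:
    """Return True if moving testCaseStatus from current to target is allowed."""
    return target in ALLOWED_TRANSITIONS.get(current, set())

assert can_transition("notStarted", "inProgress")
assert not can_transition("notStarted", "passed")  # must pass through inProgress
```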

Usage in TestAuto2

Creating Validation Test Cases

Validation Test Cases are typically created in a testsSpecification module (e.g., “Validation Test Specification”) within the Testing space. Use the Validation Test Case work item type when:
  1. Customer requirements need evidence of fulfillment
  2. User acceptance testing is being planned
  3. Field validation or beta testing is required
  4. Regulatory audit requires customer sign-off
See Create Validation Test Cases for step-by-step creation workflow.

Linking to Customer Requirements

Use the validates link role to establish traceability from each Validation Test Case to the Customer Requirements it covers.
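The exact linking mechanism depends on your tooling (document UI, import, or API). Conceptually, the validates role behaves like a many-to-many mapping from test cases to requirements; the in-memory store below is an illustrative sketch, not a real Polarion API:

```python
# Illustrative only: a minimal link store keyed by test case ID.
# In practice these links are created through the ALM tool itself.
links: dict[str, set[str]] = {}

def add_validates_link(test_case_id: str, requirement_id: str) -> None:
    """Record a validates link from a Validation Test Case to a Customer Requirement."""
    links.setdefault(test_case_id, set()).add(requirement_id)

add_validates_link("VTC-001", "CR-005")
add_validates_link("VTC-001", "CR-006")
add_validates_link("VTC-001", "CR-006")  # adding twice is harmless: sets deduplicate
```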

Tracking Validation Coverage

Use the User Need Validation Sheet PowerSheet to visualize:
  • All customer requirements and their validation test cases
  • External evidence linked to each test
  • Validation completion status (passed/failed/blocked)
Validation Coverage Formula:
Validation Coverage % = (# Customer Requirements with ≥1 passed validation test) / (Total Customer Requirements) × 100
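The formula can be computed directly from the link data. A minimal sketch, assuming you already have the list of requirements and a map from passed tests to the requirements they validate (all names are illustrative):

```python
def validation_coverage(requirements: list[str],
                        passed_links: dict[str, list[str]]) -> float:
    """Validation Coverage %: requirements with >= 1 passed validation test,
    divided by total requirements, times 100."""
    if not requirements:
        return 0.0
    # Union of all requirements covered by at least one passed test
    validated = {req for reqs in passed_links.values() for req in reqs}
    covered = sum(1 for r in requirements if r in validated)
    return 100.0 * covered / len(requirements)

# Example: 3 requirements; only CR-008 has a passed validation test
coverage = validation_coverage(
    ["CR-005", "CR-006", "CR-008"],
    {"VTC-031": ["CR-008"]},  # include only tests whose status is passed
)
```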

Integration with Test Runs

If your Polarion project includes Test Management, Validation Test Cases can be linked to Test Runs for formal execution tracking. See Integrate with Test Runs.

Example: AEB System Validation

Scenario: AEB (Automatic Emergency Braking) system customer requirement validation

CR-008: User shall be able to preview braking action in simulation mode
├─ VTC-031: Validate simulation mode preview
│    Test Procedure:
│      1. Load AEB Simulation App
│      2. Set obstacles at 10m, 15m, 20m
│      3. Run simulation
│      4. Verify braking preview overlay appears
│    Acceptance Criteria:
│      • Preview shows exact braking distance for each obstacle
│      • Braking overlay color matches severity level
│      • Preview completes within 500ms
│    Test Environment: Windows 10, AEB Sim v2.3
│    Evidence: simulation-log-20260215.csv, screenshot.png, customer-signoff.pdf
└─ VTC-032: Validate simulation accuracy vs. real system
     Test Procedure:
       1. Compare simulation prediction with field test data
       2. Calculate prediction error for 50 scenarios
     Acceptance Criteria:
       • Mean absolute error ≤ 5% of actual braking distance
       • 95th percentile error ≤ 10%
     Evidence: accuracy-analysis.xlsx, field-comparison-report.pdf
Standards References:
  • ISO 26262-2:2018 § 5.3.2 — Functional Safety Management; validation requirements
  • ISO 14971:2019 § 7.6 — Risk Management; validation of risk controls
  • IATF 16949:2016 § 8.5.5 — Control of externally provided processes; customer validation