  • Verification asks: “Are we building the product correctly?” — Did we implement the design according to specifications?
  • Validation asks: “Are we building the right product?” — Does the final system satisfy the actual user needs and safety goals?
ISO 26262 Part 4 formalizes this distinction across the V-model development process. Verification occurs on the right side of the V (ascending from implementation back to design), while validation connects the top of the V (system behavior) back to the bottom-left origin (customer needs and hazard scenarios).

Why the Distinction Matters

Consider an Automatic Emergency Braking (AEB) system:

Verification scenario: A system requirement states “The ECU shall activate braking within 200ms of obstacle detection.” A verification test measures the actual response time under controlled conditions. If the measured time is 180ms, verification passes — the system meets its specification.

Validation scenario: A customer need states “The AEB system shall prevent rear-end collisions in urban traffic.” Validation testing involves real-world driving scenarios with pedestrians, cyclists, and unexpected obstacles. Even if all requirements verify correctly, validation might reveal that a 200ms response time is insufficient for certain real-world edge cases (e.g., a child running into the road).

The critical insight: you can perfectly verify all requirements yet still fail validation if the requirements themselves don’t capture real safety needs.
Passing all verification tests does NOT guarantee a safe product. Verification only proves you built what you specified — it cannot detect gaps in the specification itself. That’s why ISO 26262 mandates both verification AND validation as separate activities.
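The AEB timing check above can be sketched as a conventional verification test. This is a minimal illustration, not TestAuto2 functionality: `measure_brake_response_ms` is a hypothetical stand-in for a real HiL measurement harness.

```python
REQUIREMENT_LIMIT_MS = 200  # "ECU shall activate braking within 200ms"

def measure_brake_response_ms() -> float:
    """Hypothetical stub: a real harness would trigger a simulated
    obstacle and time the brake actuation signal."""
    return 180.0  # the measured value from the scenario above

def test_brake_response_time():
    measured = measure_brake_response_ms()
    # Verification checks conformance to the spec only; it cannot tell
    # us whether 200ms is the right limit for real traffic.
    assert measured <= REQUIREMENT_LIMIT_MS

test_brake_response_time()  # passes: 180ms meets the 200ms specification
```

Note that this test would keep passing even if 200ms turned out to be unsafe in the field, which is exactly the gap validation exists to close.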

The V-Model Framework

TestAuto2 implements the ISO 26262 V-model as a bidirectional traceability structure:

Left side (decomposition): Requirements flow downward and become more detailed at each level. Customer needs refine into system-level requirements, which refine into design-level requirements, which guide implementation.

Right side (verification): Each design artifact is verified against its parent requirement. Design verification confirms that code/hardware matches design requirements. System verification confirms that integrated subsystems match system requirements.

Top span (validation): Validation tests link directly to customer requirements and user needs, skipping intermediate abstraction layers. They ask: “Does the complete system solve the original problem?”

Traceability in TestAuto2

TestAuto2 uses two distinct link roles to enforce the verification/validation distinction:
  • verifies: links a Verification Test Case to a System/Design Requirement, confirming that the requirement is implemented. Example: “Brake response time test” verifies “ECU shall activate braking within 200ms”.
  • validates: links a Validation Test Case to a Customer Requirement or User Need, confirming that the customer need is satisfied. Example: “Urban collision avoidance test” validates “Prevent rear-end collisions in city traffic”.
These link roles appear throughout the RTM model:
  • Requirements PowerSheets show which requirements have verification or validation evidence
  • Verification PowerSheets expand requirements to show linked test cases
  • Coverage dashboards compute separate metrics for verification coverage vs validation coverage
  • Traceability reports trace both chains independently to identify gaps
Use the V&V Engineer Dashboard to see verification and validation coverage side-by-side with drill-down queries for gaps.
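The gap queries described above can be sketched in a few lines. This is an illustrative in-memory model, not TestAuto2’s actual schema; the requirement IDs and record shapes are invented.

```python
# Requirement ID -> requirement type (invented examples)
requirements = {
    "SYS-REQ-42": "system",
    "SYS-REQ-43": "system",
    "CUST-REQ-3": "customer",
}

# Links are created from test -> requirement under one of the two roles
links = [
    {"role": "verifies",  "test": "TC-101", "target": "SYS-REQ-42"},
    {"role": "validates", "test": "VT-07",  "target": "CUST-REQ-3"},
]

def gaps(role, req_type):
    """Requirements of one type with no back-linked test under the given role."""
    covered = {link["target"] for link in links if link["role"] == role}
    return [req for req, kind in requirements.items()
            if kind == req_type and req not in covered]

print(gaps("verifies", "system"))     # ['SYS-REQ-43'] lacks verification evidence
print(gaps("validates", "customer"))  # [] -- every customer need has validation
```

Because the two roles are queried independently, a requirement that is fully verified can still surface as a validation gap, and vice versa.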

ISO 26262 Verification Methods

ISO 26262 Part 8 defines four acceptable verification methods:
  • Requirements Review: applicable to all requirements; evidence: review records
  • Design Analysis: applicable to architecture; evidence: analysis reports
  • Simulation/Modeling: applicable to system behavior; evidence: simulation data
  • Hardware/Software Testing: applicable to implementation; evidence: test results

Review: Peer inspection of requirement text for completeness, consistency, and correctness. Evidence: signed review checklist.

Analysis: Static analysis of architecture, code, or models without execution. Examples: FTA, FMEA, static code analysis. Evidence: analysis report with findings.

Simulation: Executing models or prototypes in controlled environments. Examples: SiL (Software-in-Loop), HiL (Hardware-in-Loop), vehicle dynamics simulation. Evidence: simulation test logs.

Testing: Executing the actual implementation with instrumentation. Examples: unit tests, integration tests, system tests. Evidence: test execution records with pass/fail status.

TestAuto2 tracks the verification method in custom fields on test case work items, enabling compliance reporting per ISO 26262 Part 8 Clause 9.
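A per-method compliance rollup over that custom field might look like the following sketch. The field name `method` and the test case IDs are assumptions for illustration, not TestAuto2’s actual data model.

```python
from collections import Counter

# Test case work items with an assumed "method" custom field
test_cases = [
    {"id": "RV-01",  "method": "Review"},
    {"id": "AN-02",  "method": "Analysis"},
    {"id": "SIM-03", "method": "Simulation"},
    {"id": "TC-101", "method": "Testing"},
    {"id": "TC-102", "method": "Testing"},
]

# Count evidence items per ISO 26262 Part 8 verification method
by_method = Counter(tc["method"] for tc in test_cases)
print(dict(by_method))
# {'Review': 1, 'Analysis': 1, 'Simulation': 1, 'Testing': 2}
```

A rollup like this makes it easy to spot, for instance, a requirement set with plenty of testing evidence but no review or analysis records.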

Validation Scope and Timing

Validation occurs at two key points in the development lifecycle:
  1. Early validation (Part 4 §8.4): During the concept phase, validate that safety goals correctly address the hazards identified in the HARA. Evidence: safety goal review against hazard scenarios.
  2. Final validation (Part 4 §8): After integration, validate that the complete system satisfies customer requirements and user needs in operational scenarios. Evidence: validation test execution in representative environments.
Validation test characteristics:
  • Use real or representative hardware (not simulation)
  • Test in operational environments (temperature, vibration, EMI)
  • Include edge cases and degraded modes
  • Measure user-observable behavior (not internal signals)
  • Cover all ASIL-rated safety goals
Higher ASIL levels require more rigorous validation. ASIL D may require validation testing on production-intent hardware, while ASIL A may accept validation on prototype platforms. TestAuto2’s ASIL Classification System explains how ASIL affects testing rigor.

Verification vs Validation Coverage Metrics

TestAuto2 computes separate coverage metrics for each process.

Verification coverage:
Verified Requirements     System/Design Requirements with backlinks from test cases
─────────────────────── = ───────────────────────────────────────────────────────
Total Requirements        Total System/Design Requirements
Displayed on: Requirements Space Dashboard, Testing Space Dashboard

Validation coverage:
Validated Requirements   Customer Requirements with backlinks from validation tests
────────────────────── = ──────────────────────────────────────────────────────────
Total Customer Needs     Total Customer Requirements + User Needs
Displayed on: Testing Space Dashboard, Safety Readiness Scorecard

Both metrics use the RTM model’s bidirectional link traversal: the relationship is created from test → requirement, but coverage is computed by querying requirement → backlinked tests.
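The two ratios above can be sketched with one helper. The requirement IDs are invented; the point is only that the two metrics run over disjoint requirement populations.

```python
def coverage(total_ids: set, covered_ids: set) -> float:
    """Fraction of requirements with at least one back-linked test."""
    return len(covered_ids & total_ids) / len(total_ids) if total_ids else 0.0

system_reqs   = {"SYS-1", "SYS-2", "SYS-3", "SYS-4"}
verified      = {"SYS-1", "SYS-2", "SYS-3"}   # have `verifies` backlinks
customer_reqs = {"CUST-1", "CUST-2"}
validated     = {"CUST-1"}                    # have `validates` backlinks

print(coverage(system_reqs, verified))        # 0.75 verification coverage
print(coverage(customer_reqs, validated))     # 0.5  validation coverage
```

Here verification coverage looks healthy at 75% while validation coverage sits at 50%, exactly the kind of divergence a single combined metric would hide.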

PowerSheet Views for Verification and Validation

TestAuto2 provides specialized PowerSheets for each activity: Verification PowerSheets and a Validation PowerSheet. Each sheet uses RTM model expansion to show requirements in the left column group and linked test cases in the right column group, with an entity factory configured to create test cases directly from the sheet interface.

Common Pitfalls

  • Using verification tests for validation. Consequence: misses real-world edge cases; the system passes all tests but fails in the field. Solution: create separate validation test work items linked directly to customer requirements.
  • Validating against design requirements. Consequence: confirms the implementation matches the design, but not that the design is correct. Solution: validation tests must link to customer requirements or user needs, NOT design specs.
  • Using a single coverage metric for both. Consequence: cannot detect when verification is strong but validation is missing. Solution: use separate coverage queries; TestAuto2 dashboards show both metrics independently.
  • Testing internal signals instead of user-observable behavior. Consequence: validation tests read internal ECU signals rather than measuring actual braking distance. Solution: validation tests must measure outcomes users experience (stopping distance, collision avoidance).
ISO 26262 Part 8 audits specifically check for evidence of BOTH verification and validation. A common audit finding: “All requirements have test cases, but test cases only verify technical specs — no validation of actual customer needs.”
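The second pitfall can be caught mechanically at link-creation time. This is a sketch of such a guard; the target-type names are illustrative, not TestAuto2’s actual schema.

```python
# Which requirement types each link role may legitimately target
VALID_TARGETS = {
    "verifies":  {"system_requirement", "design_requirement"},
    "validates": {"customer_requirement", "user_need"},
}

def link_is_valid(role: str, target_type: str) -> bool:
    """A `validates` link must point at a customer requirement or user
    need, never at a design spec."""
    return target_type in VALID_TARGETS.get(role, set())

assert link_is_valid("validates", "customer_requirement")
assert not link_is_valid("validates", "design_requirement")  # pitfall: rejected
```

Enforcing the rule at the link level means the coverage dashboards can trust that every `validates` backlink genuinely traces to a customer need.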

Integration with FMEA and Risk Analysis

Verification and validation link to Risk Control Types through test evidence:

Detection controls require verification: A risk control that detects failures (e.g., a plausibility check on sensor data) must be verified to confirm the detection logic works. The verification test links to the risk control via the verifies link role.

Safety goals require validation: A safety goal derived from the HARA (e.g., “Ensure obstacle detection reliability”) must be validated in operational scenarios. Validation tests link to the safety goal to confirm it is achieved.

TestAuto2’s Risk Control Effectiveness Report checks that every risk control has verification evidence, and the Safety Readiness Scorecard checks that every safety goal has validation evidence.
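The two evidence checks above can be rolled into one pass/fail summary, sketched here under assumed data shapes (the IDs and record layout are invented):

```python
def safety_readiness(risk_controls, safety_goals, links):
    """Every risk control needs `verifies` evidence; every safety goal
    needs `validates` evidence. Returns (ready, missing_v, missing_val)."""
    verified  = {l["target"] for l in links if l["role"] == "verifies"}
    validated = {l["target"] for l in links if l["role"] == "validates"}
    missing_verification = [rc for rc in risk_controls if rc not in verified]
    missing_validation   = [sg for sg in safety_goals if sg not in validated]
    ready = not missing_verification and not missing_validation
    return ready, missing_verification, missing_validation

links = [
    {"role": "verifies",  "target": "RC-PLAUSIBILITY-CHECK"},
    {"role": "validates", "target": "SG-OBSTACLE-DETECTION"},
]
print(safety_readiness(["RC-PLAUSIBILITY-CHECK"],
                       ["SG-OBSTACLE-DETECTION"], links))
# (True, [], [])
```

Returning the missing IDs alongside the boolean mirrors how a scorecard would surface not just a red/green status but the specific items blocking release.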

Practical Workflow

For verification:
  1. Create verification test cases targeting system or design requirements
  2. Link tests to requirements using the verifies link role
  3. Use Verification PowerSheets to review coverage and create missing tests
  4. Check verification coverage on the Testing Space dashboard
  5. Track verification evidence by linking test runs or external artifacts
For validation:
  1. Create validation test cases targeting customer requirements or user needs
  2. Link tests using the validates link role
  3. Use the User Need Validation Sheet to plan validation campaigns
  4. Check validation coverage on the Testing Space Dashboard
  5. Execute validation tests in operational environments and attach evidence

Further Reading

  • ISO 26262-4:2018 Clause 7 (Verification of System Design)
  • ISO 26262-4:2018 Clause 8 (Validation of Safety Concept)
  • ISO 26262-8:2018 Clause 9 (Verification and Validation)
  • Testing Space Dashboard — Verification vs Validation coverage metrics
  • Verification Test Case work item type reference
  • Validation Test Case work item type reference