Test cases serve three critical verification roles:

- **Validation** — Confirm customer requirements are correctly understood and achievable
- **Verification** — Demonstrate that system, subsystem, and design requirements are correctly implemented
- **Traceability** — Maintain bidirectional links from requirements through testing to certification evidence
Test cases are defined in the Testing space and organized by verification stage (system verification, subsystem verification, design verification). Each test case includes the verification method, test level, expected results, and environmental category for DO-160G environmental qualification tracking.
| Property | Type | Default | Description |
| --- | --- | --- | --- |
| expectedResult | Text (plain) | Optional | Expected output, behavior, or performance metric when the test passes |
| passFailCriteria | Text (plain) | Optional | Quantitative or qualitative criteria for determining test success |
Example:

```yaml
expectedResult: Output voltage within ±10% of 28V nominal
passFailCriteria: |
  Pass if measured output ≥ 25.2V and ≤ 30.8V.
  Fail if measurement is outside this range or the voltmeter reading is unstable.
```
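As a rough sketch, the ±10% pass/fail criterion in the example above could be evaluated programmatically. The function and parameter names here are illustrative only, not part of any Polarion API:

```python
# Illustrative check for the voltage example; names are hypothetical.
def check_output_voltage(measured_v, nominal_v=28.0, tolerance=0.10):
    """Return True when the measured voltage is within ±tolerance of nominal."""
    low = nominal_v * (1 - tolerance)    # 25.2 V for 28 V nominal
    high = nominal_v * (1 + tolerance)   # 30.8 V for 28 V nominal
    return low <= measured_v <= high

print(check_output_voltage(27.5))  # within range: True
print(check_output_voltage(24.0))  # below range: False
```

Expressing the criterion numerically like this is what makes it objective: two testers measuring the same output reach the same verdict.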
Test cases link to DO-160G environmental qualification categories. This enables the DO-160G Environmental Qualification PowerSheet to generate traceability from design requirements through environmental specifications (characteristics) to qualification tests.
| Property | Type | Default | Description |
| --- | --- | --- | --- |
| environmentalCategory | Enum | Optional | DO-160G environmental qualification section for this test. Enables traceability through the Environmental Qualification PowerSheet |

Enum values include, for example:

- Fungus and Bacteria — DO-160G §3.14 Fungus and Bacteria
- Salt Fog — DO-160G §3.15 Salt Fog
When a test case is assigned an environmentalCategory, it appears in the Qualification Tests column group of the DO-160G Environmental Qualification PowerSheet, enabling traceability from design requirements → characteristics → test cases across environmental qualification sections.
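A test case carrying this property might look like the following sketch. The ID, title, and salt-fog details are hypothetical, and the exact field names should be checked against your project configuration:

```yaml
# Hypothetical test case definition; verify field names in your project.
id: TC-ENV-015
title: Salt Fog Exposure Test
verificationMethod: Test
environmentalCategory: Salt Fog        # maps to DO-160G §3.15
expectedResult: No corrosion or functional degradation after exposure
```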
A test case validates a customer requirement (customerRequirement). Validation confirms that the customer requirement is achievable and correctly understood.
```
testCase --[validates]--> customerRequirement
```
Interpretation: This test case demonstrates that the customer requirement is correct and feasible.
Test cases may be associated with system elements, functions, characteristics, or other work items to provide additional context.
```
testCase <--[associatedWith]--> systemElement | function | characteristic
```
The complete set of link roles available for test cases, including cardinality rules and required roles, should be verified in the RTM domain model configuration (rtm.yaml) within your Polarion project.
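Assuming rtm.yaml expresses link roles with targets and cardinalities, a configuration entry might resemble the sketch below. The keys shown are illustrative, not the actual schema; treat your project's rtm.yaml as authoritative:

```yaml
# Illustrative sketch only; the real rtm.yaml schema may differ.
testCase:
  linkRoles:
    validates:
      target: customerRequirement
      cardinality: "1..*"
      required: true
    associatedWith:
      target: [systemElement, function, characteristic]
      cardinality: "0..*"
      required: false
```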
Test execution tracking, test result recording, and links to test evidence artifacts (logs, reports, inspection records) are typically handled through Polarion’s Test Management module. Verify the specific fields and workflow in your project configuration.
Test case execution status is tracked separately from the test case definition. Once test cases are approved, they transition to Approved status and become part of the verification baseline. Test execution results are recorded in test runs, maintaining a complete traceability chain from requirement → test case → test execution → evidence.
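The requirement → test case → test execution → evidence chain described above can be modeled as a minimal in-memory sketch. The IDs and structure are illustrative only; Polarion itself stores these as linked work items and test runs:

```python
from dataclasses import dataclass, field

# Minimal model of the traceability chain; all IDs are hypothetical.
@dataclass
class TestRun:
    test_case_id: str
    verdict: str            # e.g. "pass" or "fail"
    evidence: list[str]     # references to logs, reports, inspection records

@dataclass
class TestCase:
    id: str
    verifies: str           # ID of the requirement this case verifies
    runs: list[TestRun] = field(default_factory=list)

def trace(requirement_id: str, test_cases: list[TestCase]):
    """Collect (test case, verdict, evidence) tuples for one requirement."""
    return [(tc.id, run.verdict, run.evidence)
            for tc in test_cases if tc.verifies == requirement_id
            for run in tc.runs]

cases = [TestCase("TC-001", "SYS-REQ-042",
                  [TestRun("TC-001", "pass", ["run_001.log"])])]
print(trace("SYS-REQ-042", cases))  # [('TC-001', 'pass', ['run_001.log'])]
```

Walking the chain in one direction (requirement to evidence) is what audit queries do; the bidirectional links mentioned earlier let the same data answer the reverse question, "which requirement does this evidence support?"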
Test cases are created in multiple modules based on verification level:
| Module Path | Scope | Purpose |
| --- | --- | --- |
| Testing/CustomerReqsValidation | Customer requirements | Validation cases (VAL-*) demonstrating customer requirements are achievable |
| Testing/SystemReqsVerification | System requirements | System-level verification cases for system requirements |
| Testing/SubsystemReqsVerification | Subsystem requirements | Subsystem-level verification cases, auto-generated per subsystem |
| Testing/DesignReqsVerification | Design requirements | Design-level verification cases for design requirements |
| Testing/SystemElementVerification | System elements | Component- and subsystem-level test cases per system element |
Each document in these modules contains test cases that verify requirements at that level in the system hierarchy. Test cases are typically auto-created by the PowerSheet or imported from test management tools.
The Design Verification Sheet PowerSheet displays test case verification traceability:

```
Design Requirements (leftmost)
  ↓ expand via "verifies" link
Test Cases (rightmost)
```

Columns in the Design Verification Sheet:
- **Test Level** — Unit, Component, Subsystem, System, Integration, Field
The Design Verification Sheet includes an entityFactory configuration that enables creating new test cases directly within the Testing/DesignReqsVerification document from the PowerSheet interface — streamlining the verification planning process.
- **Define Expected Results Early** — Before test execution, capture expected outputs and pass/fail criteria in the test case definition. This prevents subjective interpretation during testing.
- **Use Quantitative Pass/Fail Criteria** — Specify measurable thresholds (voltage ranges, timing windows, error counts) rather than qualitative descriptions ("acceptable," "satisfactory").
- **Assign Environmental Categories** — For environmental or qualification testing, assign the appropriate environmentalCategory to enable DO-160G traceability and automated qualification matrix generation.
- **Maintain One-to-One or Many-to-One** — A single test case may verify multiple requirements (many-to-one), but each requirement should have a clear verification strategy. Use multiple test cases if different verification methods are needed.
- **Track Verification Method** — Document whether the requirement is verified by test, analysis, inspection, demonstration, or review. This is required for compliance with DO-178C and DO-254.
- **Reference Standard Procedures** — For critical tests, reference the procedure document, test specification, or standard used (e.g., "Per DO-160G §3.7 Vibration Resonance Survey").
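As an illustration of the "Track Verification Method" practice, a simple consistency check could flag requirements whose method is missing or unrecognized. The record field names ("id", "verificationMethod") are assumptions for this sketch, not Polarion's actual schema:

```python
# Hypothetical consistency check; field names are illustrative only.
ALLOWED_METHODS = {"Test", "Analysis", "Inspection", "Demonstration", "Review"}

def unverified_requirements(requirements):
    """Return IDs whose verification method is missing or unrecognized."""
    return [r["id"] for r in requirements
            if r.get("verificationMethod") not in ALLOWED_METHODS]

reqs = [
    {"id": "DES-REQ-010", "verificationMethod": "Test"},
    {"id": "DES-REQ-011", "verificationMethod": "TBD"},
    {"id": "DES-REQ-012"},  # method not yet assigned
]
print(unverified_requirements(reqs))  # ['DES-REQ-011', 'DES-REQ-012']
```

Running a check like this before a baseline review helps ensure every requirement enters the verification matrix with a defined method.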