
Overview

Test cases serve three critical verification roles:
  • Validation — Confirm customer requirements are correctly understood and achievable
  • Verification — Demonstrate that system, subsystem, and design requirements are correctly implemented
  • Traceability — Maintain bidirectional links from requirements through testing to certification evidence
Test cases are defined in the Testing space and organized by verification stage (system verification, subsystem verification, design verification). Each test case includes the verification method, test level, expected results, and environmental category for DO-160G environmental qualification tracking.

Core Properties

| Property | Type | Default | Description |
| --- | --- | --- | --- |
| ID | String | Auto-generated | Unique test case identifier with prefix convention (VAL-* for validation, VER-* for verification) |
| Title | String | Required | Descriptive name of the test case or acceptance criterion |
| Description | Text | Optional | Detailed explanation of test setup, procedures, and rationale |
| Status | Enum | Draft | Current state: Draft, In Review, Pending Approval, Approved, Rejected, Obsolete |
| Priority | Enum | Medium | Test execution priority: Highest, High, Medium, Low, Lowest |

Verification Method

Test cases must specify how verification is performed. This custom field classifies the verification evidence type:
| Property | Type | Default | Description |
| --- | --- | --- | --- |
| verificationMethod | Enum | Required | Method used to verify the requirement. Valid values: Test, Analysis, Inspection, Demonstration, Review |
Guidance:
  • Test — Functional, performance, or stress testing of actual hardware or software
  • Analysis — Mathematical, simulation, or code review demonstrating compliance
  • Inspection — Physical examination of design artifacts, documentation, or manufacturing records
  • Demonstration — Live operation or simulation showing correct behavior
  • Review — Formal review of design, documentation, or process records

Test Level

Test level defines where verification occurs in the system hierarchy and development lifecycle:
| Property | Type | Default | Description |
| --- | --- | --- | --- |
| testLevel | Enum | Required | Level at which the test is performed. Valid values: Unit, Component, Subsystem, System, Integration, Field |
Typical Mapping:
  • Unit — Individual software functions or integrated circuits
  • Component — LRU (Line Replaceable Unit) or subassembly
  • Subsystem — Complete subsystem (e.g., Flight Control, Power Distribution)
  • System — Full aircraft or platform system
  • Integration — System integration and interface verification
  • Field — Operational testing in actual aircraft or production environment
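
The two classification fields above are simple enumerations. A minimal sketch in Python — the enum values come from the tables above, but the class and field names are illustrative, not Polarion's actual data model:

```python
from dataclasses import dataclass
from enum import Enum

class VerificationMethod(Enum):
    TEST = "Test"
    ANALYSIS = "Analysis"
    INSPECTION = "Inspection"
    DEMONSTRATION = "Demonstration"
    REVIEW = "Review"

class TestLevel(Enum):
    UNIT = "Unit"
    COMPONENT = "Component"
    SUBSYSTEM = "Subsystem"
    SYSTEM = "System"
    INTEGRATION = "Integration"
    FIELD = "Field"

@dataclass
class TestCase:
    """Illustrative stand-in for a test case work item (not the real schema)."""
    id: str
    title: str
    verification_method: VerificationMethod
    test_level: TestLevel

# Constructing from the stored string values validates them automatically:
tc = TestCase("VER-001", "System Voltage Regulation",
              VerificationMethod("Test"), TestLevel("System"))
```

Building the enum from the stored string (`VerificationMethod("Test")`) raises `ValueError` on any value outside the allowed set, which is the behavior the `Enum`-typed custom fields enforce.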

Expected Results and Pass/Fail Criteria

| Property | Type | Default | Description |
| --- | --- | --- | --- |
| expectedResult | Text (plain) | Optional | Expected output, behavior, or performance metric when the test passes |
| passFailCriteria | Text (plain) | Optional | Quantitative or qualitative criteria for determining test success |
Example:
expectedResult:
Output voltage within ±10% of nominal 28V

passFailCriteria:
Pass if measured output ≥ 25.2V and ≤ 30.8V
Fail if measurement outside this range or voltmeter reading is unstable
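
A quantitative criterion like the one above can be checked mechanically. A sketch, assuming measurements arrive as a list of voltage readings (the function name and data shape are ours; the 25.2–30.8V limits come from the example):

```python
def evaluate_voltage(measurements, low=25.2, high=30.8):
    """Pass only if every reading lies within [low, high] volts.

    An unstable voltmeter produces readings that stray outside the
    limits, so instability is rejected by the same range check.
    """
    if not measurements:
        return "Fail"  # no evidence recorded
    return "Pass" if all(low <= v <= high for v in measurements) else "Fail"

print(evaluate_voltage([27.9, 28.1, 28.0]))  # steady readings near nominal 28V
print(evaluate_voltage([24.9, 28.0]))        # one reading below the 25.2V floor
```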

Environmental Category

Test cases link to DO-160G environmental qualification categories. This enables the DO-160G Environmental Qualification PowerSheet to generate traceability from design requirements through environmental specifications (characteristics) to qualification tests.
| Property | Type | Default | Description |
| --- | --- | --- | --- |
| environmentalCategory | Enum | Optional | DO-160G environmental qualification section for this test. Enables traceability through the Environmental Qualification PowerSheet |
Valid Categories: See DO-160G Environmental Qualification enumeration for complete list. Common categories include:
  • Temperature — DO-160G §3.3 Temperature (ambient, rapid change, thermal gradient)
  • Altitude — DO-160G §3.4 Altitude (decompression, cabin pressure)
  • Humidity — DO-160G §3.5 Humidity
  • Vibration — DO-160G §3.7 Vibration (resonance survey, random vibration)
  • Acceleration — DO-160G §3.8 Acceleration (linear)
  • Acoustic Noise — DO-160G §3.9 Acoustic Noise
  • Explosive Atmosphere — DO-160G §3.10 Explosive Atmosphere
  • Sand and Dust — DO-160G §3.11 Sand and Dust
  • Immersion — DO-160G §3.12 Immersion
  • Fluid Contamination — DO-160G §3.13 Fluid Contamination
  • Fungus and Bacteria — DO-160G §3.14 Fungus and Bacteria
  • Salt Fog — DO-160G §3.15 Salt Fog
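
Grouping test cases under these categories is the essence of the qualification matrix. A hypothetical sketch — the `(id, category)` pair format is an assumption for illustration, not the PowerSheet's actual data model:

```python
from collections import defaultdict

# (test case ID, environmentalCategory) pairs — illustrative data only
test_cases = [
    ("VER-101", "Vibration"),
    ("VER-102", "Temperature"),
    ("VER-103", "Vibration"),
    ("VER-104", None),  # no category assigned: excluded from the matrix
]

def qualification_matrix(cases):
    """Group test case IDs under their DO-160G category, skipping untagged cases."""
    matrix = defaultdict(list)
    for tc_id, category in cases:
        if category is not None:
            matrix[category].append(tc_id)
    return dict(matrix)
```

Only test cases with an assigned environmentalCategory contribute a row, which mirrors the PowerSheet behavior described below.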
When a test case is assigned an environmentalCategory, it appears in the Qualification Tests column group of the DO-160G Environmental Qualification PowerSheet, enabling traceability from design requirements → characteristics → test cases across environmental qualification sections.

Traceability Relationships

Test cases participate in four primary traceability relationships:

Validates (outbound)

A test case validates a customer requirement (customerRequirement). Validation confirms that the customer requirement is achievable and correctly understood.
testCase --[validates]--> customerRequirement
Interpretation: This test case demonstrates that the customer requirement is correct and feasible.

Verifies (outbound)

A test case verifies system, subsystem, or design requirements (sysReq, desReq). Verification confirms that the requirement is correctly implemented.
testCase --[verifies]--> sysReq | desReq
Interpretation: Passing this test case proves the requirement is implemented correctly.

Tested By (inbound)

A work item (requirement, design characteristic, or configuration item) is tested by a test case (inbound relationship).
sysReq <--[testedBy]-- testCase
This is the inverse of the “verifies” link, used in query and report views.

Associated With (inbound/outbound)

Test cases may be associated with system elements, functions, characteristics, or other work items to provide additional context.
testCase <--[associatedWith]--> systemElement | function | characteristic
The complete set of link roles available for test cases, including cardinality rules and required roles, should be verified in the RTM domain model configuration (rtm.yaml) within your Polarion project.

Test Case Naming Convention

Test case IDs follow a prefix convention based on verification type:
| Prefix | Purpose | Example |
| --- | --- | --- |
| VAL-* | Validation (tests customer requirement) | VAL-001 Power Supply Input Voltage Range |
| VER-* | Verification (tests system/design requirement) | VER-1.2.3 System Voltage Regulation Accuracy |
The prefix makes it easy to distinguish validation cases (confirming requirements are correct) from verification cases (confirming implementation).
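
The convention lends itself to automated classification, e.g. in reports or validation scripts. A sketch (the function is illustrative, not part of any tool described here):

```python
import re

def classify_test_case(tc_id):
    """Map the ID prefix to its verification role per the naming convention."""
    if re.match(r"^VAL-", tc_id):
        return "validation"    # confirms a customer requirement is correct
    if re.match(r"^VER-", tc_id):
        return "verification"  # confirms a system/design requirement is implemented
    return "unknown"
```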

Test Execution and Evidence

Test execution tracking, test result recording, and links to test evidence artifacts (logs, reports, inspection records) are typically handled through Polarion’s Test Management module. Verify the specific fields and workflow in your project configuration.
Test case execution status is tracked separately from the test case definition. Once test cases are approved, they transition to Approved status and become part of the verification baseline. Test execution results are recorded in test runs, maintaining a complete traceability chain from requirement → test case → test execution → evidence.
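
The baseline rule above (only Approved test cases enter the verification baseline) amounts to a simple status filter. A sketch with assumed field names, reflecting the rule as stated rather than any configured workflow:

```python
BASELINE_STATUS = "Approved"

test_cases = [
    {"id": "VER-001", "status": "Approved"},
    {"id": "VER-002", "status": "Draft"},
    {"id": "VER-003", "status": "In Review"},
]

def verification_baseline(cases):
    """Keep only approved test cases; drafts and in-review items stay out."""
    return [tc["id"] for tc in cases if tc["status"] == BASELINE_STATUS]
```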

Document Scope and Organization

Test cases are created in multiple modules based on verification level:
| Module Path | Scope | Purpose |
| --- | --- | --- |
| Testing/CustomerReqsValidation | Customer requirements | Validation cases (VAL-*) demonstrating customer requirements are achievable |
| Testing/SystemReqsVerification | System requirements | System-level verification cases for system requirements |
| Testing/SubsystemReqsVerification | Subsystem requirements | Subsystem-level verification cases, auto-generated per subsystem |
| Testing/DesignReqsVerification | Design requirements | Design-level verification cases for design requirements |
| Testing/SystemElementVerification | System elements | Component- and subsystem-level test cases per system element |
Each document in these modules contains test cases that verify requirements at that level in the system hierarchy. Test cases are typically auto-created by the PowerSheet or imported from test management tools.
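
Routing a new test case to the right module reduces to a lookup keyed on what it verifies. A sketch using the module paths from the table above (the type keys and function are our assumptions for illustration):

```python
# Requirement/element type → Testing module, per the table above.
# The type-name keys are assumptions, not the project's exact type IDs.
MODULE_BY_REQ_TYPE = {
    "customerRequirement": "Testing/CustomerReqsValidation",
    "sysReq": "Testing/SystemReqsVerification",
    "subsystemReq": "Testing/SubsystemReqsVerification",
    "desReq": "Testing/DesignReqsVerification",
    "systemElement": "Testing/SystemElementVerification",
}

def module_for(req_type):
    """Return the module path where the verifying test case belongs."""
    try:
        return MODULE_BY_REQ_TYPE[req_type]
    except KeyError:
        raise ValueError(f"No Testing module mapped for type {req_type!r}")
```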

Integration with PowerSheets

The Design Verification Sheet PowerSheet displays test case verification traceability:

Design Requirements (leftmost)
  ↓ expand via “verifies” link
Test Cases (rightmost)

Columns in the Design Verification Sheet:
  • Design Requirement ID — Linked design requirement
  • Requirement Title — Description
  • Requirement SubType — Discipline (system/software/hardware)
  • Requirement Rationale — Justification
  • Test Case ID — Linked test case
  • Test Case Title — Test description
  • Verification Method — Test, Analysis, Inspection, Demonstration, Review
  • Test Level — Unit, Component, Subsystem, System, Integration, Field
The Design Verification Sheet includes an entityFactory configuration that enables creating new test cases directly within the Testing/DesignReqsVerification document from the PowerSheet interface — streamlining the verification planning process.

Best Practices

  1. Define Expected Results Early — Before test execution, capture expected outputs and pass/fail criteria in the test case definition. This prevents subjective interpretation during testing.
  2. Use Quantitative Pass/Fail Criteria — Specify measurable thresholds (voltage ranges, timing windows, error counts) rather than qualitative descriptions (“acceptable,” “satisfactory”).
  3. Assign Environmental Categories — For environmental or qualification testing, assign the appropriate environmentalCategory to enable DO-160G traceability and automated qualification matrix generation.
  4. Maintain One-to-One or Many-to-One — A single test case may verify multiple requirements (many-to-one), but each requirement should have a clear verification strategy. Use multiple test cases if different methods are needed.
  5. Track Verification Method — Document whether the requirement is verified by test, analysis, inspection, demonstration, or review. This is required for compliance with DO-178C and DO-254.
  6. Reference Standard Procedures — For critical tests, reference the procedure document, test specification, or standard used (e.g., “Per DO-160G §3.7 Vibration Resonance Survey”).