Understanding DO-178C within the Aerospace Safety Solution starts with its relationship to the broader V-model: software requirements inherit their criticality classification from system-level safety analysis, and the standard then prescribes how thoroughly those requirements must be verified, depending on that criticality.

The DAL Bridge

The central concept in DO-178C is the Design Assurance Level (DAL), which translates a safety outcome (from ARP 4761 failure condition classification) into a verification rigor prescription. DAL is not assigned by software engineers — it flows down from the safety assessment. The ARP 4761 failure condition classification (Catastrophic → Hazardous → Major → Minor) maps directly to DAL A through D. This means software teams cannot independently determine how thoroughly to test a component; the answer comes from the FHA and PSSA upstream. DAL E (No Safety Effect) requires no certification objectives at all.
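The flow-down can be sketched as a simple lookup. This is an illustrative helper, not the Solution's actual API — `assign_dal` and the dictionary name are hypothetical; the mapping itself is the one stated above.

```python
# ARP 4761 failure condition classification -> DO-178C DAL.
# Mapping is from the text; names are illustrative, not the Solution's API.
FAILURE_CONDITION_TO_DAL = {
    "Catastrophic": "A",
    "Hazardous": "B",
    "Major": "C",
    "Minor": "D",
    "No Safety Effect": "E",  # no certification objectives apply
}

def assign_dal(failure_condition: str) -> str:
    """DAL flows down from the safety assessment; software teams do not choose it."""
    try:
        return FAILURE_CONDITION_TO_DAL[failure_condition]
    except KeyError:
        raise ValueError(f"unknown failure condition: {failure_condition!r}")
```

The point the sketch makes explicit: an unknown classification is an error, never a default — there is no path where software supplies its own DAL.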

Two PowerSheet Instruments

The Aerospace Safety Solution provides two complementary views for DO-178C compliance, each serving a different audience and audit purpose.

Objectives Compliance Matrix

The DO-178C Objectives Compliance Matrix is an auditor-facing instrument. Think of it as a checklist aligned to FAA Table A-7 structure: it lists every software assurance objective prescribed by the standard and tracks its completion status separately for each DAL level. Each objective carries one of five statuses:
  • Satisfied — Objective has been met with documented evidence
  • In Progress — Work underway but not yet complete
  • Not Started — No activity yet
  • Not Applicable — Objective does not apply to this system
  • Waived — Approved deviation; DER waiver on record
The Waived status is significant: it acknowledges that not every project achieves every objective. A waiver is a documented, approved deviation — its presence in the matrix is evidence of process discipline, not a gap. The matrix supports progressive disclosure through four views — Full Matrix, DAL A Only, DAL A + B, and Summary — which correspond to increasing stages of SOI (Stage of Involvement) audit scrutiny. During early audits, a Summary view reduces noise; during final SOI, the full matrix is reviewed.
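A Summary-view rollup over these statuses might look like the following hedged sketch; the objective records and field names are assumptions for illustration, not the actual matrix schema.

```python
from collections import Counter

# The five statuses from the matrix above.
STATUSES = {"Satisfied", "In Progress", "Not Started", "Not Applicable", "Waived"}

def summarize(objectives):
    """Count objectives per status and report how many are closed.
    Waived counts toward closure: a DER waiver is a documented, approved
    deviation, not a gap."""
    counts = Counter()
    for obj in objectives:
        status = obj["status"]
        if status not in STATUSES:
            raise ValueError(f"invalid status: {status!r}")
        counts[status] += 1
    closed = counts["Satisfied"] + counts["Not Applicable"] + counts["Waived"]
    return dict(counts), closed
```

The design choice mirrored here is the one the text makes: Waived sits on the "closed" side of the ledger, so a waiver improves the summary rather than dragging it down.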

Software Verification Traceability Matrix

Where the Objectives Matrix answers “have we satisfied each standard requirement?”, the Software Verification Matrix answers “is every software requirement tested?” — a bidirectional traceability question. The chain runs from system requirements down through subsystem and low-level design requirements, with test cases attached at both system and design levels. The dual verification column structure is intentional: DO-178C requires demonstrating coverage at multiple levels of decomposition, not just at the lowest level.
The Software Verification Matrix deliberately excludes customer requirements at the top of its chain. Customer requirements describe system behavior; DO-178C governs software implementation. Starting from the system requirement level keeps the scope focused on what the software must do, not what the customer wants the aircraft to do.
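The bidirectional question can be made concrete with a minimal gap check. The data shapes below are assumptions for illustration: a set of requirement IDs and a mapping from test cases to the requirements they verify.

```python
# Bidirectional traceability sketch: every requirement needs a test
# (forward), and every test must trace to a known requirement (backward).
def trace_gaps(requirement_ids, test_links):
    """requirement_ids: iterable of requirement IDs.
    test_links: mapping of test-case ID -> set of requirement IDs it verifies.
    Returns (untested requirements, tests tracing to no known requirement)."""
    requirement_ids = set(requirement_ids)
    covered = set()
    orphan_tests = set()
    for test_id, reqs in test_links.items():
        linked = set(reqs) & requirement_ids
        if not linked:
            orphan_tests.add(test_id)  # backward gap
        covered |= linked
    return requirement_ids - covered, orphan_tests  # forward gaps, backward gaps
```

In the matrix, the same check would run once per decomposition level — system requirements against system-level tests, design requirements against design-level tests — which is why the dual verification columns exist.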

Evidence Categories

Both instruments recognize six categories of compliance evidence:
  • Document — design specifications, plans, standards
  • Test Result — executed test procedure outcomes
  • Analysis — formal methods, model checking, code analysis
  • Review Record — peer reviews, inspections, checklists
  • Inspection — physical or configuration inspection
  • Demonstration — observed system behavior
The evidence type matters for audit purposes. A DAL A objective satisfied by Analysis carries different weight than one satisfied by Test Result — and regulators may require specific evidence types for specific objectives.
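A policy check along those lines could be sketched as below. The sample policy entry and objective ID are invented for illustration; they are not taken from DO-178C tables or the Solution's configuration.

```python
# Hypothetical evidence-type policy: some (DAL, objective) pairs demand
# specific evidence categories. The entry below is made up for illustration.
REQUIRED_EVIDENCE = {
    ("A", "OBJ-EXAMPLE-1"): {"Analysis", "Test Result"},
}

def evidence_acceptable(dal, objective_id, evidence_types):
    """True if at least one recorded evidence type meets the policy for
    this (DAL, objective) pair; objectives without a policy entry pass."""
    required = REQUIRED_EVIDENCE.get((dal, objective_id))
    if required is None:
        return True  # no specific evidence-type policy recorded
    return bool(required & set(evidence_types))
```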

Certification Readiness Metrics

The Certification Readiness Scorecard provides per-DAL health metrics that allow program managers to track software certification progress over time:
  • Classification % — design requirements with DAL classification assigned
  • Decomposition % — requirements with established parent/child traceability links
  • Test Coverage % — design requirements linked to at least one test case
Color thresholds give instant status: green ≥ 80%, orange 50–79%, red < 50%.
The current Aero1 Flight Control Computer project shows 0% classification for DO-178C because all existing design requirements are hardware/electrical subtypes. Software design requirements must be created and decomposed before DAL assignment and classification metrics become meaningful. The compliance scorecard reflects this state accurately — a red score here signals missing artifacts, not a failed process.

Common Misconception: DAL Is Not Assigned by Software Teams

A frequent misunderstanding is that software architects or developers choose the DAL for a software component based on how important they believe it is. This is incorrect in the model the Aerospace Safety Solution implements. DAL derives from the failure condition classification established in the FHA and PSSA. If the safety assessment determines that a failure of the flight control software could cause a Catastrophic outcome, DAL A applies — regardless of developer judgment. The flow is strictly top-down: safety analysis → DAL assignment → verification rigor. This design is deliberate: it prevents teams from under-classifying software to reduce certification burden, and it ensures that verification investments are proportional to actual safety risk.

Standards Integration

DO-178C does not operate in isolation. In the Aerospace Safety Solution, it integrates with:
  • ARP 4754A — System development requirements decompose into software design requirements, providing the starting point for the verification matrix
  • ARP 4761 — Failure condition classification from the safety assessment drives DAL assignment
  • DO-254 — Hardware design assurance uses an identical Objectives Compliance Matrix structure, enabling parallel audit workflows for mixed hardware/software systems
  • DO-326A — Security requirements traceability is tracked separately from software verification; security threats and their mitigations are not conflated with functional software objectives
This integration reflects the real audit environment: a certification authority reviewing a Flight Control Computer will examine evidence across all four standards simultaneously, and the compliance instruments are designed to support that cross-standard review without duplication. For practical guidance on building out software design requirements and linking test cases, the V-model walkthrough in The V-Model Development Process provides the system-level context for where DO-178C activities fit within the full development lifecycle.