Safety Assurance: Closing the SRM Loop
“Safety Risk Management reduces risk. Safety Assurance verifies that risk stays reduced.”
In the previous post, we developed safety risk controls and evaluated residual risk under Title 14, Code of Federal Regulations (14 CFR) § 5.55(c)–(d). Now that controls are implemented and residual risk is accepted, we can transition to safety assurance: the set of processes used to monitor performance and verify those controls remain effective over time.
What Is Safety Assurance?
Safety assurance is not a separate program layered on top of Safety Risk Management (SRM)—it is the feedback loop confirming whether earlier risk‑control decisions are achieving their intended safety outcomes. This is where Subpart D (§§ 5.71–5.75) comes in: under Subpart D, organizations must—
Develop and maintain processes to monitor safety performance,
Assess that performance against safety objectives,
Identify ineffective safety risk controls and/or new hazards, and
Implement corrective action through continuous improvement.
In practice, safety assurance answers two ongoing questions:
Are our risk controls working?
Has anything changed that introduces new risk?
Safety Performance Monitoring and Measurement (§ 5.71)
Section 5.71 requires organizations to acquire and analyze data to monitor safety performance. The regulation identifies eight minimum data‑acquisition processes, which among other things require the organization to—
Continuously observe operational processes and the operating environment for conditions that could affect safety performance.
Conduct systematic audits to verify that operational systems are functioning as intended and in conformance with established procedures.
Periodically evaluate both the SMS and operational processes to confirm the system is performing as designed.
Investigate incidents, accidents, and cases of potential noncompliance to identify root causes and systemic factors.
Maintain a confidential employee reporting system. Frontline employees observe conditions that audits and evaluations may never capture.
Investigate hazard notifications received from outside the organization — including regulatory sources, industry groups, and peer operators.
The organization must document these processes as part of its SMS procedures under § 5.95. The purpose is not to generate paperwork, but to determine whether established safety risk controls are functioning as intended and whether the operational environment has changed in ways that could introduce new hazards.
Advisory Circular (AC) 120–92D, Safety Management Systems for Aviation Service Providers, Section 3.5, emphasizes that data sources for monitoring can range from day‑to‑day supervisory activities and pass‑down logs to formal, structured safety programs.
The AC also makes clear that employee‑reporting systems, defined in the organization’s safety policy under § 5.21(a)(4), fill important gaps in an organization’s data‑collection process—frontline employees are often in the best position to observe conditions audits or evaluations may not capture.
For smaller organizations, much of this data‑gathering occurs naturally during normal operations, such as reviewing—
Dispatch logs,
Flightcrew duty records, or
Maintenance deferrals.
The key is recognizing those activities are part of safety assurance and ensuring the data feeds into analysis and decision making.
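As a rough sketch of what "feeding the data into analysis" can look like in practice, the snippet below tags routine operational records by source and filters the safety‑relevant ones forward for assessment. Every field name, record, and note here is invented for illustration; nothing in Part 5 prescribes any particular data structure.

```python
from dataclasses import dataclass

@dataclass
class MonitoringRecord:
    source: str            # e.g., "dispatch_log", "duty_record", "maintenance_deferral"
    period: str            # reporting period, e.g., "2024-Q3"
    safety_relevant: bool  # flagged during routine review
    note: str

# Hypothetical records pulled from routine operational paperwork.
records = [
    MonitoringRecord("dispatch_log", "2024-Q3", True, "late release due to weather re-route"),
    MonitoringRecord("duty_record", "2024-Q3", False, "normal duty day"),
    MonitoringRecord("maintenance_deferral", "2024-Q3", True, "repeat deferral on the same system"),
]

# Forward only the safety-relevant items for assessment under § 5.73.
for_assessment = [r for r in records if r.safety_relevant]
print(f"{len(for_assessment)} of {len(records)} records flagged for assessment")
```

The point is not the tooling; a spreadsheet accomplishes the same thing. What matters is that the routine review produces a defined output that reaches the people making § 5.73 assessments.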
Safety Performance Assessment (§ 5.73)
Section 5.73 requires organizations to assess safety performance against the safety objectives documented in their safety policy under § 5.21(a)(1). These assessments must include reviews by the accountable executive and must—
Confirm the organization is complying with its established safety risk controls — the ones documented under § 5.55(c).
Assess how well the Safety Management System as a whole is performing against the objectives in the safety policy.
Evaluate whether each risk control under § 5.55(c) is producing the intended safety outcome. Flag any that are not working.
Identify changes in the operational environment — new routes, equipment, procedures, or personnel — that could introduce new hazards.
Look beyond known risks. Safety assurance must actively surface hazards that did not exist — or were not recognized — when controls were first designed.
This is the decision‑making stage. The data collected under § 5.71 is examined by the individuals the organization has identified under § 5.23(b) as having the authority to make safety risk acceptance decisions. If ineffective controls or new hazards are identified, § 5.73(b) requires the organization to reenter the SRM process. This is not a failure; it is how SMS is designed to function.
Continuous Improvement (§ 5.75)
Section 5.75 requires organizations to establish and implement processes to correct safety‑performance deficiencies identified during § 5.73 assessments. Like the monitoring and assessment processes, these corrective‑action processes must be documented under § 5.95.
AC 120–92D (Section 3.5.11) draws an important distinction. The corrective action process under § 5.75 is typically triggered when employees are not using established risk controls properly—the controls exist, but conformance is the issue. In these cases, retraining or procedural reinforcement may be sufficient; however, when performance data indicates the risk controls themselves are not producing the expected results despite proper use, the path leads back to SRM for system redesign. The key is determining whether the problem is systemic or employee‑related.
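That triage logic can be sketched as a simple decision function. The conformance threshold and both inputs are assumptions made up for this example, not regulatory values; the structure simply mirrors the distinction the AC draws.

```python
def corrective_action_path(conformance_rate: float, control_effective: bool) -> str:
    """Rough triage of a safety-performance deficiency.

    conformance_rate: fraction of audited events where the control was
        actually used as written (an assumed internal metric).
    control_effective: whether performance data shows the control produced
        the expected result when it WAS used properly.
    """
    CONFORMANCE_THRESHOLD = 0.90  # assumed internal target, not a regulatory value

    if conformance_rate < CONFORMANCE_THRESHOLD:
        # Control exists but is not being followed: a conformance issue under § 5.75.
        return "corrective action: retraining / procedural reinforcement"
    if not control_effective:
        # Control is used properly but is not working: redesign via SRM.
        return "re-enter SRM per 5.73(b): control redesign"
    return "control effective: continue monitoring under 5.71"

# Example: high conformance, but the control still is not delivering results.
print(corrective_action_path(0.97, control_effective=False))
```

Either branch produces a documented decision, which is exactly the kind of record § 5.97 expects you to retain.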
Continuous improvement does not mean constant change. It means responsive adjustment when performance data indicates it is necessary.
What This Means for Your SMS Documentation
If you are building or refining your SMS manual, Subpart D connects to several areas that should already be documented.
Your SMS processes and procedures (§ 5.95) should define how the organization will monitor safety performance, including—
Data sources,
Responsibilities, and
Review intervals.
Your safety policy (§ 5.21) should include—
The measurable safety objectives against which § 5.73 assessments will be conducted,
The confidential employee‑reporting policy that feeds data into safety assurance, and
The code of ethics reinforcing the organization’s commitment to acting on safety information.
Your safety accountability structure (§ 5.23) should identify who has the authority to make risk‑acceptance decisions during assessments, and who is responsible for directing corrective action.
These are not separate documentation exercises—they are connections between Subpart D and the safety policy, SRM outputs, and accountability structures already established.
Recordkeeping (§ 5.97)
Organizations must retain records of safety assurance outputs for a minimum of 5 years under § 5.97(b). This documentation supports future assessments and provides evidence the organization is monitoring, evaluating, and acting on safety‑performance data.
Closing the Loop: Plain‑language Examples
The following examples illustrate how safety assurance works in practice, using the same stepped format from our SRM posts.
An organization implemented additional taxi briefing procedures in response to surface movement risk at complex airports during low-visibility operations. Under SRM, severity was assessed as "major" and likelihood as "remote." With controls applied, residual risk was accepted.
The organization monitors surface-deviation events through operational data, reviews employee safety reports for taxi-related concerns, and audits compliance with the enhanced briefing procedures at affected stations.
After six months, the organization reviews the data. Surface-deviation reports at the affected airports have declined, and audits confirm flightcrews are consistently completing the enhanced briefings. The control appears effective.
In our previous post, an operator identified an increase in unstable approaches during winter weather. Under SRM, the organization revised its stabilized approach criteria, reinforced go-around expectations, and increased flight data monitoring of approach stability. Residual risk was formally accepted.
The operator monitors approach stability rates through flight-data analysis, tracks go-around compliance, and reviews employee reports related to approach procedures. Seasonal trends are compared against the pre-control baseline.
After the first winter season under the revised criteria, leadership reviews the data. Approach stability rates have improved — but not by as much as expected at two specific airports with challenging terrain and approach profiles.
Because the controls are not fully achieving the intended risk reduction at those airports, the organization initiates SRM under § 5.73(b) to evaluate airport-specific factors, such as approach design, terrain, and prevailing weather, and develop targeted controls for those locations.
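The baseline comparison in this example can be illustrated with a short sketch. The airport identifiers, rates, and reduction target below are all invented for illustration; the logic simply shows how a pre‑control baseline makes "improved, but not enough at two airports" a defensible, data‑driven conclusion rather than a hunch.

```python
# Hypothetical unstable-approach rates (events per 1,000 approaches).
baseline = {"KASE": 14.0, "KEGE": 12.5, "KDEN": 6.0}   # pre-control winter baseline
current  = {"KASE": 12.8, "KEGE": 11.9, "KDEN": 3.1}   # first winter under revised criteria
TARGET_REDUCTION = 0.40  # assumed internal goal: 40% reduction from baseline

# Flag airports where the achieved reduction falls short of the target.
needs_srm = []
for airport, base in baseline.items():
    achieved = (base - current[airport]) / base
    if achieved < TARGET_REDUCTION:
        needs_srm.append(airport)

print("Airports needing airport-specific SRM review:", needs_srm)
```

In this made‑up data set, two airports clear only a fraction of the targeted reduction and get flagged, while the third exceeds it; the flagged airports become the scope for the re‑entered SRM process.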
This is the SMS feedback loop in action: the system adjusts based on evidence, not assumption.
✈ Conclusion
Under Subpart D, safety assurance closes the SRM loop. Organizations—
Continuously collect safety data from operational processes, audits, employee reports, and external sources (§ 5.71).
Review data against safety objectives with the accountable executive. Determine whether risk controls are working (§ 5.73).
Flag ineffective controls and any new hazards introduced by changes in the operational environment.
Address conformance issues through retraining or reinforcement. Re-enter SRM when the controls themselves need redesign (§ 5.75).
Not constant change — responsive adjustment. The SMS evolves based on evidence, not assumption.
Where SRM is a design process—building controls into the system—safety assurance is the ongoing verification that those controls are delivering the intended results.
It bears repeating: SRM reduces risk. Safety assurance ensures it stays reduced.