A picture of two books. The one in front is titled Systematic Safety by E. Lloyd and W. Tye; the second is partly obscured behind it, its title not fully visible, beginning 'Aircraft'.

How Aviation Learned to Learn

For much of the twentieth century, flying was dangerous in ways we now struggle to imagine. In the 1960s and early 70s, commercial aviation accidents were frequent enough to be accepted as an unfortunate reality of progress. Aircraft were improving, but not fast enough to keep pace with the complexity of the system they were operating within.

At first, the industry’s response focused almost entirely on technology. When something went wrong, investigators looked for mechanical failure: a faulty engine, a design flaw, a systems malfunction. And in many cases, those explanations were valid. Early aircraft were less reliable, and safety improvements were urgently needed.

But over time, a pattern emerged that engineering alone couldn’t explain.

Accidents were happening even when the aircraft was technically sound. Crews with thousands of hours of experience were making errors that, in hindsight, seemed avoidable. Communication broke down. The problem wasn’t just the machine. It was the system.

This realisation marked a turning point.

The 1977 Tenerife disaster (still the deadliest accident in aviation history) forced the industry to confront uncomfortable truths. Two fully functional aircraft, experienced crews, and no mechanical failure. What failed was communication, situational awareness, and the ability to speak up under pressure.

The response that followed was profound.

Human factors moved from the margins to the centre of safety thinking. Instead of asking, “Who made the mistake?”, the question became, “Why did this make sense to the people involved at the time?”

One of the most significant developments was the Aviation Safety Reporting System (ASRS), introduced in the United States in 1976. It offered pilots, air traffic controllers, cabin crew and others a confidential, non-punitive way to report near-misses, hazards, and unsafe conditions. In the UK, the Civil Aviation Authority embedded similar principles, and independent schemes such as CHIRP (the Confidential Human Factors Incident Reporting Programme) were established in 1982.

The intent was the same on both sides of the Atlantic, and elsewhere: to separate learning from blame.

The impact was transformative.

The industry gained insight it had never had before. Reports flooded in, not just about accidents, but about the small things that almost went wrong. Patterns emerged long before lives were lost.

Insight flowed back into training, cockpit design, communication protocols, and organisational practice. Speaking up became a responsibility, not a risk.

Aviation didn’t eliminate mistakes. It accepted that people are human, and designed systems that could learn from that reality. 

Over time, these practices became formalised through Safety Management Systems, now mandated internationally. Learning became continuous rather than reactive. Safety became actively managed, not passively hoped for. 

By the early 2000s, commercial aviation had become one of the safest forms of transport in the world. Not because people stopped making errors, but because the system got much better at noticing, understanding, and responding to them.

And that lesson reaches far beyond aviation.

Resilient organisations don’t rely on perfect plans or tighter control. Aviation shows what’s possible when organisations design for learning. The key is to treat trust as a strategic asset and involve the people closest to the work in shaping practical responses. 

That’s what we at EmpowerPath aim to do: work with you to build an environment where open communication and trust come first, so that your business, too, can take off without a hitch!

Find out more about how we can help here.