WEBINAR | Introducing Event Learning Review
Andy Shone highlights a global shift in safety thinking—from a traditional investigation mindset to a learning mindset. This shift, central to Human & Organisational Performance (HOP), recognises that conventional investigations often fail to deliver genuine learning. They tend to produce generic findings such as non-compliance, lack of training, or inadequate supervision, which lead to equally generic fixes like reminders, retraining, or rewriting procedures. These so-called solutions rarely improve work or prevent recurrence.
Event Learning Review (ELR) is designed as a direct replacement for traditional investigations, offering greater flexibility, adaptability, and focus on context. It seeks to understand the messy, everyday reality of work—why actions made sense at the time—rather than fitting findings into pre-set categories or chasing root causes. The approach builds on HOP principles: people make mistakes, blame fixes nothing, context drives behaviour, learning is vital, and response matters.
Andy emphasises that current models, while once helpful, now constrain learning by driving “investigation by numbers”—finding what you look for, then fixing only that. Event Learning Review promotes curiosity, exploring normal work as well as the event, and identifying emergent themes rather than causes.
Relationship with Learning Teams
Event Learning Review complements Learning Teams. While Learning Teams are excellent for proactive exploration and collective problem-solving, they can be resource-intensive and less suited for some technical failures. ELR uses similar facilitation skills and questioning styles but is more flexible—interviews, observations, and document reviews can occur without assembling everyone in the same room.
The Event Learning Review Process
The method follows a Learn – Define – Improve cycle:
- Trigger – Determine which events merit an Event Learning Review, favouring fewer, higher-quality reviews over volume.
- Learning phase – Begin with understanding normal work and context, then move towards the event. Use interviews, observations, and collaborative discussions, focusing on local rationality (why the actions made sense at the time).
- Defining themes – Identify key themes emerging from the data, avoiding rigid categorisation.
- Improvement actions – Develop actions that improve work broadly, not just prevent the same event.
- Verification – Check not only that actions are completed, but that they are effective in practice and have not introduced new problems.
Key Shifts in Perspective
- From "work as imagined" to "work as done" – prioritising firsthand accounts over documented procedures.
- From linear cause-effect models to complex, messy realities.
- From simple explanations to deeper contextual understanding.
Benefits and Outcomes
Andy describes outcomes beyond preventing recurrence: improved systems, a richer understanding of operational challenges, and a reframing of events so that, rather than being treated as aberrations, they highlight the resilience people routinely show in preventing more frequent failures.
Questions from Attendees
- Difference from Learning Teams? Similar philosophy, different vehicle; ELR is a direct replacement for investigations.
- Extra workload? No—it replaces current investigations, ideally reducing volume and increasing quality.
- Regulatory requirements? Most jurisdictions do not mandate a specific investigation method; regulators have generally been receptive when outcomes improve.
- Reviewing actions? Typically at 6–12 months, with informal follow-ups earlier.
- Who to involve? Primarily those closest to the work and involved in the event, plus others with relevant operational insight.
- Training focus? The HOPLAB course emphasises facilitation skills as much as process steps.
Andy closes by highlighting that Event Learning Review is both a technical tool and a philosophical shift, grounded in HOP principles. When applied with genuine curiosity and disciplined follow-through, it can transform post-event learning into an engine for meaningful operational improvement.