Episodes

  • In this episode, we explore how safety in diving is not just about avoiding accidents but about building systems that can fail safely. Drawing on a real-life incident shared by Phil Short, we examine how a small technical issue—debris in a rebreather valve—could have escalated into a life-threatening situation during a cave dive. We highlight the critical role of technical preparation, situational awareness, and non-technical skills like teamwork, leadership, and communication in managing and recovering from unexpected challenges. By sharing these stories, we aim to help divers understand how to plan for failures and enhance safety through learning, reflection, and a chronic unease about what could go wrong.

    Original blog: https://www.thehumandiver.com/blog/failing-safely-400m-back-in-a-cave

     

    Links: How Safe is Your Diving blog: https://www.thehumandiver.com/blog/how-safe-is-your-diving

     

    Tags:  English, CCR, Decision Making, Gareth Lock, Leadership, Non-Technical Skills, Teamwork

  • In this episode, we dive into the concept of psychological safety and its critical role in diving and team performance. Psychological safety, defined as a shared belief that it's safe to take interpersonal risks, enables people to ask questions, make mistakes, contribute ideas, and challenge the status quo without fear of judgment or reprisal. Drawing on insights from experts like Amy Edmondson and Dr. Timothy Clark, we explore its four stages: inclusion, learner safety, contributor safety, and challenger safety, with a focus on how each stage impacts divers, instructors, and teams. From life-or-death scenarios to fostering innovation, creating a culture of psychological safety can improve decision-making, teamwork, and training outcomes. Tune in to learn how to build this essential skill in your diving and beyond.

    Original blog: https://www.thehumandiver.com/blog/how-safe-is-your-diving

     

    Links: If Only video: https://vimeo.com/382399090

    Debrief guide: https://www.thehumandiver.com/debrief

    Psychological Safety and Learning Behaviour in Work Teams: http://web.mit.edu/curhan/www/docs/Articles/15341_Readings/Group_Performance/Edmondson%20Psychological%20safety.pdf

    High Performing Teams need Psychological Safety: https://liberationist.org/high-performing-teams-need-psychological-safety/

    What Psychological Safety is not: https://qz.com/work/1470164/what-is-psychological-safety/

     

    Tags:  English, Communication, Decision Making, Gareth Lock, Just Culture, Leadership, Teamwork

  • In this episode, we explore the double-edged nature of goal setting—how it drives achievement but can also lead to risky decisions when pressure and commitment override safety and judgment. Using examples from mountaineering and advanced diving, including a personal story about a challenging CCR trimix course, we delve into the concept of "destructive goal setting." The discussion highlights how external pressures and an unwillingness to abandon goals can cloud decision-making, and emphasizes the importance of open communication, team empowerment, and stepping back to reassess whether "the juice is worth the squeeze."

    Original blog: https://www.thehumandiver.com/blog/is-the-juice-worth-the-squeeze

     

    Tags:  English, Cognitive Biases, Decision Making, Guy Shockey

  • In this episode, we explore the concept of counterfactual reasoning—our tendency to imagine how incidents could have been avoided by different actions—and why it falls short in improving safety. While this type of hindsight helps us feel better by creating a sense of order, it doesn’t address the real-world conditions or decisions that led to the incident. Instead of asking, "Why didn’t they do Y instead of X?" we should ask, "How did doing X make sense to them at the time?" By focusing on what actually happened and understanding the context, we can uncover valuable insights to improve safety and decision-making in diving.

    Original blog: https://www.thehumandiver.com/blog/counter-factuals

     

    Tags:  English, Cognitive Biases, Decision Making, Gareth Lock, Incident Analysis

  • In this episode, we explore a personal account of Gareth’s experience with decompression sickness (DCS) and the critical decision-making process that followed. The story dives into the internal monologue, biases, and stigmas surrounding DCS, highlighting how emotions and uncertainties influence risk-based decisions. We also examine industry practices, the importance of creating a psychologically safe culture for discussing incidents, and the need for better preparedness when things go wrong. This episode challenges listeners to reflect on their own decision-making and encourages a shift toward curiosity and learning in the diving community.

    Original blog: https://www.thehumandiver.com/blog/the-bend-is-uninteresting-the-related-decisions-are-much-more-so

     

    Links: PACE model: https://gcaptain.com/graded-assertiveness-captain-i-have-a-concern/

    Prospect Theory: https://www.jstor.org/stable/1914185

    Blog about Normalisation of Deviance: https://www.thehumandiver.com/blog/being-a-deviant-is-normal

    Distancing through Differencing: https://www.researchgate.net/profile/David_Woods11/publication/292504703_Distancing_through_differencing_An_obstacle_to_organizational_learning_following_accidents/links/5742fb1808ae9ace8418b7ea/Distancing-through-differencing-An-obstacle-to-organizational-learning-following-accidents.pdf

     

    Tags: English, Decision Making, Gareth Lock

  • In this episode, we explore Professor James Reason's Swiss Cheese Model, which helps explain how incidents occur when multiple safety barriers fail at different levels within a system. We discuss how organizational, supervisory, and individual errors can combine to create accidents, and how the holes in these barriers move and shift over time. Using dynamic models, we highlight that safety is an emergent property of a system, where small errors accumulate and can lead to larger, more significant failures. We also examine the role of human error, risk management, and attention to detail in preventing accidents and emphasize the complexity of real-world systems, where multiple factors often lead to a critical mass of failure before an incident happens.

    Original blog: https://www.thehumandiver.com/blog/when-the-holes-line-up

     

    Links: Animated simple Swiss Cheese model: https://vimeo.com/326723142

    Big Hole model: https://vimeo.com/326723122

    Little Hole model: https://vimeo.com/326723109

     

    Tags:  English, Gareth Lock, Human Factors, Incident Investigation

  • In this episode, we dive into the concept of "good enough" in diving and how it relates to decision-making, risk, and safety. We explore why terms like "safe" and "good" are subjective and often influenced by context, experience, and social pressures, rather than absolutes. Using real-life examples, we discuss how divers weigh trade-offs between efficiency and thoroughness, balancing time, money, and risk to make decisions in uncertain situations. By understanding the biases and constraints that shape our choices, we can better assess what "good enough" means in different scenarios and improve through shared stories and context-rich learning.

    Original blog:

    Links: Spiderman drawing video: https://youtu.be/x9wn633vl_c

    Blog from Steve Shorrock: https://humanisticsystems.com/2016/12/05/the-varieties-of-human-work/

    Efficiency-Thoroughness Trade Off: http://erikhollnagel.com/ideas/etto-principle/index.html

    Latent Pathogens from James Reason: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1117770/

    Outcome bias: https://en.wikipedia.org/wiki/Outcome_bias

     

    Tags: English, Gareth Lock

  • In this episode, we explore how decision-making under uncertainty plays a crucial role in scuba diving, drawing insights from Prospect Theory and real-life scenarios. We discuss how psychological factors, like loss aversion, influence divers to take risks they might otherwise avoid—whether it's diving with faulty gear after weeks of being unable to dive or dealing with pressures during high-profile expeditions. Highlighting examples from both individual dives and operational standards in dive centers, we examine the balance between minimizing loss and managing uncertainty. Finally, we emphasize the importance of teamwork, robust communication, and standardization to mitigate risks, ensuring safer and more informed diving decisions.

    Original blog:

    Links: DOSPERT Study: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1301089

    Near death experience in Truk lagoon: https://www.scubaboard.com/community/threads/complacency-kills-its-not-just-an-empty-threat.567481/

     

    Tags:  English, Decision Making, Gareth Lock, Human Factors, Risk

  • In this episode, we explore how risk is perceived and managed in diving, where emotions, biases, and mental shortcuts often outweigh logic and statistics. Diving fatalities are statistically rare, but those numbers don’t resonate emotionally—our decisions are more influenced by stories and personal experiences. Through real-life examples, we unpack biases like availability bias, outcome bias, and the “turkey illusion,” showing how these distort our understanding of risks. The discussion also highlights strategies for improving risk management, such as using checklists, planning and debriefing effectively, and sharing experiences to enhance collective learning. Join us to rethink how we approach uncertainty and decision-making in diving and beyond.

    Original blog: https://www.thehumandiver.com/blog/riskoffatality

     

    Links: Fatalities Conference Proceedings: https://www.diversalertnetwork.org/files/Fatalities_Proceedings.pdf

    Numbers don’t have the same emotional relevance as stories: https://hbr.org/2003/06/storytelling-that-moves-people

    Risk of dying from a shark attack: https://www.floridamuseum.ufl.edu/shark-attacks/odds/compare-risk/death/

    Behavioural economics: https://www.behavioraleconomics.com/resources/introduction-behavioral-economics/

    Prospect theory: https://www.behavioraleconomics.com/resources/mini-encyclopedia-of-be/prospect-theory/

    Video about normalisation of deviance: https://vimeo.com/174875861

    4 T’s of risk management: https://www.facebook.com/groups/1612046102342961/permalink/2160646497482916/

    How it makes sense for “stupid” decisions: https://www.facebook.com/gareth.lock.5/videos/10155465887236831/

     

    Tags: English, Gareth Lock

  • How do you measure safety in diving? This episode dives into a real story of a dive team that adapted to an emerging safety risk when two divers, certified but inexperienced in drysuits and challenging conditions, showed signs of stress. Through situational awareness, communication, and teamwork, the team adjusted their plan, choosing a safer dive site where the less experienced divers could build confidence. The story highlights how safety isn’t about luck or strict rule-following but proactive decision-making and collaboration. We explore how divers can develop the skills to create safety and why “nothing happening” often means someone made it happen.

    Original blog: https://www.thehumandiver.com/blog/safety-is-nothingness

     

    Tags:  English, Decision Making, Gareth Lock, Human Factors, Leadership, Safety, Teamwork

  • In this episode, we explore the decision-making challenges in diving, sharing a personal story of risky dives and lessons learned. A diver reflects on their early diving experiences, from breaking training depth limits to encountering equipment failures at 30m, and how a lack of knowledge and overconfidence contributed to risky choices. We discuss the importance of understanding context when evaluating incidents, avoiding hindsight bias, and learning from mistakes to improve safety. Diving involves inherent risks, but by fostering curiosity, sharing lessons, and acknowledging uncertainties, we can create a safer and more informed diving community.

    Original blog: https://www.thehumandiver.com/blog/responsible-but-not-informed

     

    Tags:  English, Decision Making, Gareth Lock, Human Factors, Risk Management

  • In this episode, we dive into cognitive dissonance—the psychological discomfort of confronting facts that challenge our beliefs—and how it impacts decision-making and safety in diving. Drawing on insights from Black Box Thinking by Matthew Syed and examples from aviation, justice, and diving, we explore why even highly educated individuals can resist change to protect their reputation. From misconceptions about Nitrox and gas planning to biases in equipment and training preferences, we examine common examples in diving and discuss how human factors can improve safety. We also share practical steps to reduce cognitive dissonance, embrace learning from failure, and foster open-mindedness in the diving community.

    Original blog: https://www.thehumandiver.com/blog/cognitive-dissonance

     

    Links: Ditching of Cactus 1549 in the Hudson: https://en.wikipedia.org/wiki/US_Airways_Flight_1549

    Story about cult followers expecting a UFO: https://www.minnpost.com/second-opinion/2011/04/when-facts-fail-ufo-cults-birthers-and-cognitive-dissonance

    “Unindicted co-ejaculator”: https://ethicsunwrapped.utexas.edu/cognitive-dissonance-case-unindicted-co-ejaculator

    Examples of cognitive dissonance: https://en.wikipedia.org/wiki/Cognitive_dissonance

     

    Tags:  English, Decision Making, Gareth Lock, Human Factors

  • In this episode, we explore the gap between knowledge and action, focusing on how even small, intentional changes can lead to significant improvements in safety and performance. Drawing from examples like the WHO Safe Surgical Checklist and lessons from diving, we highlight the importance of applying what we know—whether through simple tools like checklists and debriefs or by understanding decision-making and systemic issues. Alongside a personal story about working with a coach to turn knowledge into impactful action, we challenge listeners to reflect: what will you do to turn your insights into meaningful change?

    Original blog: https://www.thehumandiver.com/blog/anotherbrickinthewall

     

    Links: CAP 737 http://publicapps.caa.co.uk/modalapplication.aspx?appid=11&mode=detail&id=6480

    IOGP Doc 502 https://www.iogp.org/bookstore/product/guidelines-for-implementing-well-operations-crew-resource-management-training/

    Non-technical skills for surgeons: https://www.rcsed.ac.uk/professional-support-development-resources/learning-resources/non-technical-skills-for-surgeons-notss

    The Castle: http://www.thisiscolossal.com/2018/02/the-castle-by-jorge-mendez-blake/

    World Health Organisation Safe Surgical Checklist: https://www.nejm.org/doi/full/10.1056/NEJMsa0810119

    Semmelweis: https://en.wikipedia.org/wiki/Ignaz_Semmelweis

    Distancing through differencing: https://www.researchgate.net/publication/292504703_Distancing_through_differencing_An_obstacle_to_organizational_learning_following_accidents

     

    Tags:  English, Decision Making, Gareth Lock, Human Factors, Non-Technical Skills

  • In this episode, we delve into the complexities of managing risk and uncertainty in diving, challenging the notion that accidents are "entirely predictable." Unlike measurable risks, diving involves countless variables that create uncertainty, often managed through mental shortcuts and biases. We discuss how hindsight bias, overconfidence, and peer pressure can cloud judgment, leading to poor decisions. Effective feedback, teamwork, and tools like checklists can reduce uncertainty, while debriefs and learning from others’ mistakes are crucial for improvement. Tune in to explore how divers can navigate uncertainty to enhance safety and performance in this high-stakes environment.

    Original blog: https://www.thehumandiver.com/blog/uncertainty-vs-predictable

     

    Links: Risk vs Uncertainty: http://www.mindtherisk.com/literature/67-risk-savvy-by-gerd-gigerenzer

    Thinking, Fast and Slow: https://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow

    Blog about the Dunning Kruger effect: https://www.thehumandiver.com/blog/incompetent-and-unaware-you-don-t-know-what-you-don-t-know

    Blog about biases: https://www.humaninthesystem.co.uk/blog/i-am-biased-you-are-biased-we-are-all-biased

     

    Tags:  English, Decision Making, Gareth Lock, Risk

  • The diving industry faces challenges in maintaining high safety standards due to a lack of effective feedback mechanisms and a fear of reprisal for reporting substandard practices. Feedback is essential for improving performance and preventing dangerous "normalization of deviance," but it’s often viewed as blame rather than an opportunity for learning. Without proper acknowledgment or action from agencies, divers and instructors lose trust in the system, leading to fewer reports and greater risks. To protect the self-regulating nature of the industry, the community must embrace constructive feedback, report unsafe practices, and demand accountability from agencies to ensure safety and uphold standards.

    Original blog: https://www.thehumandiver.com/blog/standard-you-accept

     

    Links: Blog about normalisation of deviance: https://www.thehumandiver.com/blog/being-a-deviant-is-normal

    Willful blindness: https://www.ted.com/talks/margaret_heffernan_the_dangers_of_willful_blindness

    Case study from healthcare in the US: https://news.aamc.org/patient-care/article/best-response-medical-errors-transparency/

     

    Tags:  English, Gareth Lock, Just Culture, Reporting

  • When discussing diving incidents, it’s vital to shift away from blame and hindsight bias and instead foster a culture of open dialogue to understand why decisions made sense at the time. Often, divers are doing their best with the resources, training, and information available, but situational awareness and decision-making are shaped by incomplete data, personal experience, and environmental factors. Criticism without context or constructive feedback doesn’t improve safety or learning; instead, it deters people from sharing critical insights. By embracing a "just culture," the diving community can better explore the underlying factors behind incidents, address systemic issues, and create meaningful opportunities for growth and safety improvement.

    Original blog: https://www.thehumandiver.com/blog/cannot-improve-do-not-understand

     

    Links: Report of the death of CCR diver: https://cognitasresearch.files.wordpress.com/2015/05/dillon-2015-findings-in-the-inquest-into-the-death-of-philip-gray.pdf

     

    Tags: English, Gareth Lock, Just Culture

  • Safety in diving is not a standalone priority but one of many factors, including time, money, resources, and productivity, that individuals and organizations must balance in a dynamic environment. Safety is best understood as reducing risk to an "acceptable level," but defining what is acceptable can be complex and context-dependent. Using principles like ALARP (As Low As Reasonably Practicable), risk is mitigated until further reduction becomes disproportionately expensive or impractical. Both training organizations and divers face trade-offs between safety and competing priorities, which can shift depending on circumstances. Divers must critically assess their own safety standards and weigh the effort, time, and money required to mitigate risks, understanding that "safety" is a shared responsibility within the larger system of diving. Ultimately, improving safety requires self-awareness, courage, and a commitment to learning from near-misses and incidents.

    Original blog: https://www.thehumandiver.com/blog/safetyisnot_the_priority

     

    Links: ICAO Safety Management Manual: https://www.icao.int/safety/SafetyManagement/Documents/Doc.9859.3rd%20Edition.alltext.en.pdf

    Royal Society Risk Assessment report: https://books.google.co.uk/books/about/Risk_Assessment.html?id=LRcmQwAACAAJ&redir_esc=y

    John Adams book ‘Risk’: http://www.john-adams.co.uk/wp-content/uploads/2017/01/RISK-BOOK.pdf

    Efficiency-Thoroughness Trade Off: http://erikhollnagel.com/ideas/etto-principle/index.html

    Work as Imagined/Work as Done: https://www.thehumandiver.com/blog/what-does-human-factors-in-diving-mean

    Cognitive biases: https://www.thehumandiver.com/blog/17-cognitive-biases

     

    Tags:  English, Gareth Lock, Human Factors, Safety

  • Human factors in diving encompass everything from individual behavior to the interaction between divers, technology, and organizational systems. This podcast dives into the complexities of human factors, exploring how they influence safety, performance, and decision-making. Topics include cognitive biases, stress, and fatigue, as well as the gap between "Work as Imagined" and "Work as Done." We also discuss the importance of Crew Resource Management (CRM) and Non-Technical Skills (NTS) in improving team dynamics and situational awareness, even in solo diving. Additionally, we touch on the lack of formal human factors standards in diving and the need for better incident reporting systems. Finally, we highlight practical approaches to training, such as effective pre-dive briefs, debriefs, and feedback mechanisms, to help divers and instructors foster safer, more adaptive practices.

    Original blog: https://www.thehumandiver.com/blog/what-does-human-factors-in-diving-mean

     

    Links: Steven Shorrock’s blogs about the four parts of Human Factors: 

    Tags: English, Gareth Lock, Human Factors

  • This podcast explores the limitations of attributing diving accidents to "human error," a reductionist explanation that fails to address the complexities of real-world decision-making and system failures. By examining a case study involving oxygen toxicity during a rebreather dive, the episode delves into how biases, situational awareness, and flawed mental models contribute to adverse events. It highlights the importance of understanding the context behind decisions, recognizing that divers rarely intend to put themselves or others at risk. Drawing parallels with aviation and other industries, the podcast advocates for systemic changes, better training, and a culture of learning to enhance safety, rather than placing blame.

    Original blog: https://www.thehumandiver.com/blog/why-human-error-is-a-poor-term

    Links: Animated Swiss cheese model: https://vimeo.com/249087556

    References:
    1. Bierens, J. (ed.) Handbook on Drowning: Prevention, Rescue, Treatment. (2006).

    2. Denoble, P. J. Medical Examination of Diving Fatalities Symposium: Investigation of Diving Fatalities for Medical Examiners and Diving. (2014).

    3. Denoble, P. J., Caruso, J. L., de Dear, G. L., Pieper, C. F. & Vann, R. D. Common causes of open-circuit recreational diving fatalities. Undersea Hyperb Med 35, 393–406 (2008).

    4. Parry, G. W. Review of Human Reliability Analysis: Context and Control by Erik Hollnagel (Academic Press, 1993, ISBN 0-12-352658-2). Reliability Engineering & System Safety, 99–101 (1996). doi:10.1016/0951-8320(96)00023-3

    5. Reason, J. T. Human Error. (Cambridge University Press, 1990).

    6. Phipps, D. L. et al. Identifying violation-provoking conditions in a healthcare setting. Ergonomics 51, 1625–1642 (2008).

    7. Dekker, S. The Field Guide to Understanding Human Error. 205–214 (2013). doi:10.1201/9781315239675-20

    8. Endsley, M. R. Toward a theory of situation awareness in dynamic systems. Human Factors: The Journal of the Human Factors and Ergonomics Society 37, 32–64 (1995).

    9. Klein, G. A. Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making. (2011).

    10. Amalberti, R., Vincent, C., Auroy, Y. & de Saint Maurice, G. Violations and migrations in health care: a framework for understanding and management. Quality & Safety in Health Care 15 Suppl 1, i66–i71 (2006).

    11. Cook, R. & Rasmussen, J. ‘Going solid’: a model of system dynamics and consequences for patient safety. Quality & Safety in Health Care 14, 130–134 (2005).

    12. Woods, D. D. & Cook, R. I. Mistaking Error. Patient Safety Handbook, 1–14 (2003).

    Tags: English, Gareth Lock, Human Error

  • In this episode, we explore a diving incident that highlights the critical importance of understanding human factors in high-risk activities like technical diving. A diver survived an oxygen toxicity seizure thanks to her buddy's quick thinking, but the investigation revealed a web of human errors, from outdated equipment to flawed decision-making. We discuss the lessons learned, the role of human variability in performance, and how other industries like aviation and healthcare have transformed safety through Crew Resource Management (CRM). Diving’s focus on technical skills often overlooks the human element—decision-making, communication, and teamwork—that can make or break a dive. Tune in to learn how adopting these skills can enhance safety, performance, and the culture of diving.

    Original blog: https://www.thehumandiver.com/blog/stop-making-stupid-mistakes

     

    Tags:  English, Gareth Lock, Human Factors