
Cognitive bias in the cockpit: the invisible enemy of operational safety


By Captain Bassani - ATPL/B-727/DC-10/B-767 - Former Air Accident Inspector SIA PT. captbassani@gmail.com - Feb/2026 - https://www.personalflyer.com.br


After hundreds of messages, comments, and questions from fellow aviation professionals, it was time to take the discussion to the next level. This post was written specifically to answer those requests, adding more depth and extra practical insight for those who fly the line.


[AI-generated image]


Even in highly standardized operations, professional pilots remain vulnerable to cognitive biases that distort risk perception and influence critical decisions. Reviews of commercial aviation incident and accident reports show that overconfidence, plan continuation bias, anchoring, authority bias, and confirmation bias appear repeatedly in safety events, especially during approach and landing.


Plan continuation bias (“get‑there‑itis”) drives crews to persist with the original route, approach, or strategy even in the face of clear signs that conditions have deteriorated or that stabilization criteria are no longer met. NASA‑linked analyses of airline accidents suggest that a significant proportion of tactical decision errors are associated with the choice to continue with the initial plan despite multiple cues that diverting, returning, or going around would have been safer. In practical terms, this may mean pressing on under VFR into deteriorating weather until controlled flight into terrain (CFIT), or continuing an unstable approach beyond the point at which a go‑around should have been executed.
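One practical antidote is to strip the go‑around decision of ambiguity before the approach even begins. Below is a minimal sketch in Python of a stabilized‑approach “gate” expressed as a hard boolean rule. The thresholds and field names are invented for illustration only; they are not any operator’s actual SOP values, which always take precedence.

```python
# Illustrative only: a stabilized-approach gate as an objective rule.
# Thresholds are made-up examples, not real SOP values.
from dataclasses import dataclass

@dataclass
class ApproachState:
    altitude_agl_ft: float
    speed_dev_kt: float        # deviation from target approach speed
    sink_rate_fpm: float
    glidepath_dev_dots: float  # deviation from the glidepath
    landing_config: bool       # gear down, landing flap set
    checklists_complete: bool

def stabilized(s: ApproachState) -> bool:
    # Every criterion must be met; there is no "almost stabilized".
    return (
        -5.0 <= s.speed_dev_kt <= 10.0
        and s.sink_rate_fpm <= 1000.0
        and abs(s.glidepath_dev_dots) <= 1.0
        and s.landing_config
        and s.checklists_complete
    )

def gate_decision(s: ApproachState, gate_agl_ft: float = 1000.0) -> str:
    # At or below the gate, any failed criterion mandates a go-around.
    if s.altitude_agl_ft <= gate_agl_ft and not stabilized(s):
        return "GO AROUND"
    return "CONTINUE"
```

The value of writing the rule this way is precisely that it leaves no room for “one more second”: once any criterion fails at the gate, the decision is already made, and plan continuation bias has nothing left to negotiate with.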


Confirmation bias strengthens this pattern by leading pilots to seek, interpret, and overweight cues that support their current mental model of the situation. Studies with pilots in simulators show that, under high workload, there is a tendency to focus on indications that fit the desired plan while discounting signs of worsening weather, instrument discrepancies, or system alerts. In line with broader human‑factors research, this creates a loop in which the further the flight progresses and the stronger the commitment to the plan, the harder it becomes to recognize the need to change course.


Other relevant biases documented in aviation and applied psychology include outcome bias and over‑learning from near‑misses. Research indicates that decision‑makers often evaluate the quality of past choices mainly by the final outcome (for example, an uneventful landing) rather than by the decision process itself. As a result, risky decisions that “turned out fine” can be encoded as acceptable, reinforcing dangerous patterns that only reveal their full severity when the outcome is negative.


To mitigate these effects, academic evidence and safety authority guidance converge on several lines of action. First, make cognitive biases an explicit training topic, embedding them into CRM, EBT, and LOFT programs with scenarios deliberately designed to trigger and then debrief plan continuation, over‑anchoring on initial information, and over‑confidence in automation. Second, promote a culture of structured debriefing in which decisions are analyzed against the information available at the time, not just by the eventual outcome, reducing outcome bias and enabling more realistic learning from near‑occurrences. Third, strengthen the use of “tactical pause” techniques (time‑outs, formal cross‑checks, explicit verbalization of alternatives) at key points in the flight, creating deliberate opportunities to break cognitive inertia and reassess risk.
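To make the second point concrete, here is a minimal sketch, again in Python with invented field names and a deliberately crude scoring, of what an outcome‑neutral debrief record could look like: the outcome is logged, but the evaluation of the decision reads only the cues and alternatives that were available at the time.

```python
# Illustrative only: an outcome-neutral debrief record.
from dataclasses import dataclass

@dataclass
class DecisionRecord:
    decision: str                  # e.g. "continued the approach"
    cues_available: list[str]      # what the crew could know at the time
    contrary_cues: list[str]       # cues arguing against the decision
    alternatives_verbalized: bool  # was another plan stated aloud?
    outcome: str                   # recorded, but never scored

def process_score(r: DecisionRecord) -> float:
    """Toy 0..1 score of decision-process quality.
    Note that r.outcome is intentionally never read."""
    score = 0.0
    if r.contrary_cues:            # contrary evidence was at least named
        score += 0.5
    if r.alternatives_verbalized:  # an alternative was made explicit
        score += 0.5
    return score

# Two identical decisions with different outcomes score the same:
base = dict(decision="continued unstable approach",
            cues_available=["tailwind report", "late configuration"],
            contrary_cues=[],
            alternatives_verbalized=False)
lucky = DecisionRecord(**base, outcome="uneventful landing")
unlucky = DecisionRecord(**base, outcome="runway excursion")
assert process_score(lucky) == process_score(unlucky)  # both 0.0
```

That the lucky and the unlucky flight receive the same process score is exactly the property a debrief needs in order to counter outcome bias: a risky decision that “turned out fine” is not rewarded for its luck.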


For the professional pilot, recognizing that “thinking well” is not only about following SOPs but also about understanding how the brain itself distorts reality is a safety skill as critical as technical proficiency. The next hard decision you face en route may be filtered through biases you cannot see, but which can be mitigated if they are known, discussed, and systematically trained.


Safe flights!


Captain Luiz Bassani


Sources

SKYbrary – articles on Continuation Bias and its effects on pilot decision‑making.

Safety articles and bulletins on “get‑there‑itis” and plan continuation bias in general aviation and airline operations, including NASA studies.

Texts and analyses on the link between plan continuation bias and confirmation bias in pilot decision‑making.


WINGS OF KNOWLEDGE SERIES

The first volumes of this 18-book series, which is already emerging as a reference for aviation professionals and enthusiasts, are now available for online purchase in Portuguese and English:

Volume 1 – Introduction to Aviation and the Role of the Pilot

Volume 2 – Human and Physiological Factors in Flight Safety

Volume 3 – The Importance of Aeronautical Knowledge – May 2026

More than books, these guides represent a true transformation in the education and development of aviation professionals, from cadets to experienced pilots, instructors, safety managers, and dedicated enthusiasts.

They provide a structured path toward mastering the art and science of flight, integrating technical fundamentals, human factors, internationally recognized best practices, and a forward-looking perspective.
