Tuesday 4 June 2019

PDU DX SYSTEM 1 AND 2 THINKING

Fast and slow thinking; and the problem of conflating clinical reasoning and ethical deliberation in acute decision‐making

First published: 01 April 2019
 
Conflict of interest: None declared.

Abstract

Expertise in a medical specialty requires countless hours of learning and practice and a combination of neural plasticity and contextual case experience resulting in advanced gestalt clinical reasoning. This holistic thinking assimilates complex segmented information and is advantageous for timely clinical decision‐making in the emergency department and paediatric or neonatal intensive care units. However, the same agile reasoning that is essential acutely may be at odds with the slow deliberative thought required for ethical reasoning and weighing the probability of patient morbidity. Recent studies suggest that inadequate ethical decision‐making results in increased morbidity for patients and that clinical ethics consultation may reduce the inappropriate use of life‐sustaining treatment. Behavioural psychology research suggests there are two systems of thinking – fast and slow – that control our thoughts and therefore our actions. The problem for experienced clinicians is that fast thinking, which is instinctual and reflexive, is particularly vulnerable to experiential biases or assumptions. While it has significant utility for clinical reasoning when timely life and death decisions are crucial, I contend it may simultaneously undermine the deliberative slow thought required for ethical reasoning to determine appropriate therapeutic interventions that reduce future patient morbidity. Whilst health‐care providers generally make excellent therapeutic choices leading to good outcomes, a type of substitutive thinking that conflates clinical reasoning and ethical deliberation in acute decision‐making may impinge on therapeutic relationships, have adverse effects on patient outcomes and inflict lifelong burdens on some children and their families.
Experienced physicians are adept at gestalt thinking, the ability to view a whole as greater than just component parts, to recognise patterns and to create order from disorder in subconscious and conscious thought.1 This clinical skill evolves and is refined over many years by learning and memorising the anatomy, physiology and pathogenesis of complex conditions, developing disease scripts and timely triage to clinical specialties, and choosing therapeutic options that benefit patients. This accumulation of knowledge hones a physician's executive cognitive function. As patient interaction and case management become the norm with practice and clinical specialisation, neuroplasticity attenuates unused neuronal paths, while utilised ones develop acuity – creating a nexus of expertise and swift clinical reasoning.2
This highly developed executive cognition is beneficial when life and death decision-making is paramount in stressful, time-constrained environments such as the neonatal intensive care unit (NICU), the paediatric intensive care unit (PICU) or the emergency department (ED). However, the same agile reasoning may simultaneously make it difficult to access the slow reflective thinking required for ethical logic when clinical outcomes are incalculable. This rational process systematises, defends and endorses concepts of right and wrong conduct, leading to what Thomas Aquinas called ‘a well-informed conscience’ and Socrates ‘an examined life’. However, heuristics, which are useful mental short cuts adopted to solve a problem,3 may fail to accurately assess a complex dilemma and result in cognitive bias or cognitive disposition to respond (CDR).4 This, I suggest, can subsequently cause missteps in the reasoning process, with unanticipated yet burdensome morbidity outcomes.5
Daniel Kahneman, a Professor of Psychology, won the Nobel Prize in 2002 for his influence on the field of behavioural economics. In his book ‘Thinking, Fast and Slow’,6 he argued that rational cognition, left unchallenged, is apt to adopt systematic errors and fallacies: two opposing systems of the human mind compete with each other and lead to mistakes in decision-making.
‘System 1’ is fast, instinctual and under emotional command; it is the primitive thinking that has enabled our survival, like sensing danger or detecting a sound source.6 Fast thinking skills are gained by rehearsal and enable quick and complex decisions essential for skilled clinicians working in pressured environments. They are utilised when physicians encounter symptoms or signs that subconsciously form a recognisable pattern or ‘disease script’.7
In contrast, ‘System 2’ is slow, considered and controlled by reason. It is harder to activate and is mostly on standby, while System 1 idles fluently and automatically. The problem is that slow thinking is the rational deliberative thought required for the extremely complex, multifaceted decisions involved in clinical prognostication, ethical reasoning and situations where there is dissent between clinicians or between clinicians and families.5
Mostly, these two systems operate in synergy: System 1 runs continuously, and when it is intermittently overruled, slow-thinking System 2 has the final say. The catch is that System 1 tends to maintain command when it ought to cede control to System 2. System 1 is hard to switch off, which leads us to make crucial errors and draw incorrect conclusions based on ‘intuition’. So, at the very times we most need our logical slow thinking, our brains get lazy: we are much less rational than we think, and we make errors of judgement based on cognitive biases.
Heuristics, therefore, sometimes mislead: System 1 seeks to form a coherent, plausible story by relying on association, memory, pattern matching and assumption.8 The problem for time-poor expert clinicians is that System 1 will default to that plausible, convenient conclusion even when it is based on incorrect information.
While many experiential biases potentially lead to questionable outcomes in clinical decision-making,4, 5, 9 the most notable for the scope of this proposal relating to ethical reasoning are availability, anchoring, association, pervasive optimism and ‘sunk cost’ bias3 (Table 1).
Table 1. Notable heuristic biases that lead to cognitive dispositions to respond that are relevant to medical decision-making3, 6, 10
Affective bias: Aversion to images that provoke strong emotion, which leads to avoidance. The desire to avoid strong emotion influences decision-making.
Aggregate bias: Disbelief of aggregate data, such as that used to develop clinical practice guidelines, based on a belief that one's own patients are not typical. This may result in unnecessary or burdensome clinical assessments/tests not matching guideline criteria.
Anchoring: When a clinician remains attached to specific features of the patient's initial presentation and is unable to adjust their outcome reasoning, even when updated information becomes available. This may be compounded by confirmation bias.
Association: A tendency to base decisions on previous clinical situations encountered.
Availability: A tendency to assess things that come to mind readily as more likely to occur. Often provoked by recent similar clinical experiences, it can readily result in diagnostic or prognostic error.
Base-rate neglect: When a clinician reduces or inflates the base-rate likelihood of a disease, distorting accurate Bayesian reasoning (a worked sketch follows this table).
Commission: More common in overconfident clinicians, a tendency towards action rather than inaction.
Confirmation: A predisposition to remember events and outcomes as one wishes they had occurred, and to consider only favoured diagnostic signs or symptoms while disregarding inconsistent negative ones, which leads to excluding diagnostic or prognostic possibilities.
Diagnosis momentum: When an early possible diagnosis becomes (incorrectly) definite, excluding other possibilities.
Framing effect: The way problems are delineated influences how clinicians perceive, assess and diagnose.
Fundamental attribution error: A tendency to judge or blame patients for their illness rather than examine environment or circumstance.
Gambler's fallacy: A cognitive error suggesting independent events are related, such as preceding patient diagnoses.
Gender bias: When a clinician believes a certain gender presupposes a particular diagnosis without data to support this belief.
Hindsight bias: The tendency to interpret outcomes or events as more predictable than they were before they occurred.
Omission bias: The tendency to prefer inaction over action, and to find adverse events more acceptable when related to a natural disease trajectory than when related to a clinician's actions.
Order effects bias: The tendency to remember the first (primacy) and last (recency) pieces of information, rather than accounting for all the information equally.
Pervasive optimism or overconfidence: Downplays negative outcomes and overplays positive ones; an overestimation of one's knowledge and making decisions based on incomplete information.
Representativeness: When clinicians judge the likelihood of a hypothesis based on how representative it is of the available clinical data.
Substitution bias: Where a complex question is replaced with a simple one.
Sunk cost fallacy: A predisposition to increase risk taking even when evidence suggests a negative return.
Visceral bias: Relates to an excessive emotional connection with the patient, causing error to be underestimated.
Zebra retreat bias: When a physician avoids a rare diagnosis even when evidence makes it likely. This may be due to under-confidence or fear of being seen as unrealistic.
† Biases discussed in this article are perhaps more likely to undermine ethical reasoning in acute time frames, although all have the potential to be relevant in ethical and clinical decision-making.
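To make base-rate neglect concrete, here is a minimal worked sketch of the Bayesian calculation it distorts. The prevalence, sensitivity and specificity figures below are invented for illustration only and are not drawn from any study cited here.

```python
# Illustrative Bayes' theorem calculation: why ignoring the base rate misleads.
# All numbers are hypothetical, chosen only to demonstrate the arithmetic.

prevalence = 0.01   # base rate: 1% of patients have the condition
sensitivity = 0.95  # P(positive test | disease)
specificity = 0.90  # P(negative test | no disease)

# P(positive test), by the law of total probability
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Posterior probability of disease given a positive test (Bayes' theorem)
posterior = sensitivity * prevalence / p_positive

print(f"P(disease | positive test) = {posterior:.2f}")  # ~0.09, not 0.95
```

The intuitive System 1 answer, that a 95% sensitive test makes disease roughly 95% likely, is an order of magnitude too high once the 1% base rate is honoured.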
Availability refers to gauging the probability that something will happen based on our previous experience of a witnessed chain of events. The memory of this sequence is readily available to our imagination, and the event is then judged (usually incorrectly) to be more likely to occur. This may explain why physicians' prognoses may be swayed by the outcome of a memorable patient or case. Where a previous patient recovered, the physician may overestimate the likelihood of improvement for the current patient or, conversely, the likelihood of clinical deterioration or death if their previous memory was a negative one.
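A toy simulation can illustrate this mechanism. Assuming, purely for illustration, that memory weights recent cases far more heavily than older ones, a single memorable recent complication inflates the estimated rate well above its true value:

```python
import random

# Toy model of the availability heuristic (all numbers invented): a clinician
# 'estimates' a complication rate from memory, but recall is recency-weighted.
random.seed(1)
true_rate = 0.05
cases = [1 if random.random() < true_rate else 0 for _ in range(200)]
cases[-1] = 1  # suppose the most recent case was a memorable complication

decay = 0.8  # assumed memory decay: each older case counts 20% less
recent_first = list(reversed(cases))
weights = [decay ** age for age in range(len(recent_first))]

estimate = sum(w * c for w, c in zip(weights, recent_first)) / sum(weights)
print(f"true rate = {true_rate:.2f}, recency-weighted estimate = {estimate:.2f}")
# The most recent case alone carries ~20% of the total weight, so one vivid
# memory can multiply the judged probability several-fold.
```

An unweighted average of the same 200 cases would recover something close to the true rate; the distortion comes entirely from the assumed recency weighting.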
Anchoring, failure to reconsider an initial incorrect assumption or diagnosis, is subtle, pervasive and challenging to surmount, especially if evaluation of risk leads a clinician to overemphasise the likelihood of negative outcomes when outlining pros and cons. If we first emphasise risks, it becomes more difficult to shift concerns in future decisions than if the positives had initially been outlined.3 Remediation requires System 2 thinking by wilfully contemplating the anchor and then deliberately choosing an alternate view and presenting benefits and harms of treatment equally.3 Analogous to ethical thinking, this requires reflective consideration of disparate viewpoints.
Affective heuristic bias relates to aversion: contemplating a child suffering or dying evokes poignant images and a strong emotional response. These aversions can lead to a mental shortcut to avoid an intervention that is perceived to generate pain or suffering,3 arguably sabotaging accurate long-term prognostication about morbidity. This bias may also be prevalent in parental thought processing related to their sick child and increases the likelihood of dissent in shared decision-making at end of life.3
Pervasive optimistic bias is similarly relevant to such a clinical context, as it downplays the likelihood of a negative outcome and gives an illusion of substantive control. Studies suggest that this has a negative effect on diagnostic accuracy.11
Substitution bias is where a difficult question or decision is replaced with a simpler one. For example, in the acute setting it may be appropriate to commence ventilatory support while gathering more diagnostic data, when time for reflection on the future burdens of life-sustaining therapy (LST) is unavailable. This so-called window of opportunity12 is the narrow time frame in which difficult questions must be addressed, such as whether it will be possible to wean a child with other comorbidities from ventilation. If not, then the medical appropriateness of continuing LST needs consideration. While withholding and withdrawal of LST are considered by most to be ethically equivalent, in practice this is equivocal,13 and although not in the scope of this paper, these complex ethical deliberations require collaborative team and family discussions.3, 14-16
The ‘sunk cost’ fallacy is the tendency to increase risk-taking in the belief it will yield better returns, even when the demonstrable probability of this is low.6 Recognising that an early decision in the clinical path to initiate LST was flawed might logically lead to an ethical decision to withdraw life support such as ventilation. However, sunk cost bias may urge a clinician to continue LST, hoping for a better outcome despite evidence of harm or evidence that the intervention itself demotes rather than promotes the child's best interests.
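Decision theory makes plain why sunk costs should not count: only the expected value of each option from this point forward is decision-relevant. The following stylised sketch uses probabilities and utility values that are invented placeholders, not clinical estimates:

```python
# Stylised sunk-cost illustration (invented numbers). Rational choice compares
# only the FUTURE expected value of each option; whatever has already been
# invested is identical under both options, so it cancels out of the comparison.

p_success = 0.10       # hypothetical chance that continuing yields benefit
value_success = 100.0  # hypothetical utility if it does
value_failure = -30.0  # hypothetical burden if it does not
value_redirect = 20.0  # hypothetical utility of changing course now

ev_continue = p_success * value_success + (1 - p_success) * value_failure
ev_redirect = value_redirect

print(f"EV(continue) = {ev_continue:.1f}, EV(redirect) = {ev_redirect:.1f}")
# -17.0 vs 20.0: continuing is the worse option however much effort, cost or
# hope has already been 'sunk' into the current path.
```

Under these assumed numbers, continuing remains worse no matter how large the sunk investment; the fallacy lies in letting that investment re-enter the forward-looking comparison.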
Kahneman and Tversky initially distinguished eight biases in 1987.6 Further supporting research suggests that there are now over 50 documented biases that may influence or impinge on decision-making.4, 9, 17, 18 This has significant ramifications for both clinical and ethical choices, particularly when life-sustaining or life-limiting treatments are being considered for an acutely deteriorating child or premature neonate and where requests for burdensome treatment may not be in the child's best interests.19
While decisive therapeutic action may be crucial to buy time and collect further data, arguably this fast-reactive decision-making may sometimes lead clinicians too far down a therapeutic path to easily change direction in complex care. If there are divergent viewpoints either between specialist clinicians or between clinicians and parents, inflexibility may lead to entrenched conflict and paralyse decision-making.20, 21 While ethical reasoning should never delay timely treatment in sick children, ethical deliberation paradoxically requires activating the slow-thinking System 2 – the system that tends to be in ‘dormant mode’ in highly stressed and chaotic situations.
Experienced clinicians, I believe, generally make excellent clinical decisions when diagnoses and prognoses are complicated, using advanced gestalt reasoning and a type of ‘clinico-ethical substitutive reasoning’, which mostly leads to good outcomes. They readily utilise System 2 cognition for complex clinical decision-making where time is not of the essence, in team discussions with other clinical specialists and in multidisciplinary care meetings.18 However, many physicians have not learned ethical reasoning or are not as adept at it as at clinical reasoning. Crucial decisions are deferred to ‘clinicians who are expert in the facts, but…not experts in the ethics’.22 When the stakes are high and the situation more chaotic8 due to fundamental end-of-life decisions and parental expectations, dissent or extreme complexity, System 1 thinking can lead to suboptimal decisions at crucial points through the heuristic biases and assumptions unique to ‘fast’ thinking. While techniques to remediate these biases have been proposed elsewhere, more training in ethical reasoning for medical students and junior clinicians would be beneficial.17, 18
These issues are demonstrated by questionable decisions to initiate medical interventions that serve to sustain life despite catastrophic brain injury and dire prognostic outcomes.3, 4 Likewise, diagnostic tests may predict imminent death in a fragile neonate who, once care is redirected to palliation, subsequently thrives and disproves senior neonatologists' prognostication.23 These situations, although relatively rare, can cause significant moral distress in health professionals such as nurses.24 Experienced clinicians accustomed to a technocratic approach may also struggle, being used to controlling the clinical environment primarily by managing the medical technology, adjusting ventilation and perfusion and titrating doses unchallenged for the good of their patients. However, moral distress can also have a positive effect in this cohort by forcing clinicians to initiate slow thinking, confer with other specialists and engage ethical reasoning that may more accurately account for the family burdens and complex care needs of a child facing profound lifelong disability.25
Perhaps this is why specialist clinicians in high-stress streamlined units such as PICUs, NICUs or the ED may be reluctant to invite clinical ethics services in to offer an alternate ethical frame in decision-making.26, 27 This is despite recent evidence that clinical ethics consultation can support better clinician decisions28 on more appropriate use of LST.17, 29 Understandably, such units have learned to rely mostly on teams with whom they have built trusting long-term relationships based on a history of achieving good patient outcomes. This approach is supported by evidence suggesting that developed-world PICUs have a mortality rate of less than 3%.30 What has been less studied until recently, however, is the burden of lifelong disability some families then manage indefinitely post discharge, perhaps due in part to this mode of decision-making.31
Advances in medical technology and expectations of access to it have fostered a technocratic paradigm in clinical decision-making,32 and while there is no doubt families undertake the task of lifelong care selflessly and enduringly, the ability of expert clinicians to pause and acknowledge the prevalence of these biases is crucial. Initiating slow System 2 thinking earlier in the acute decision-making pathway, rather than reflexively adopting a type of ‘clinico-ethical substitutive’ thinking,3, 4 may allow a more realistic account of the lifelong burdens of medical interventions, such as long-term ventilation, that these children and their families knowingly or unknowingly undertake.10, 33
In conclusion, valid studies are required to ascertain the potentially deleterious effects of cognitive biases on acute medical decision-making where complex ethical dilemmas coexist. Gestalt System 1 clinical reasoning benefits short-term outcomes in ED and ICU crises but may paradoxically impede the accurate long-term prognoses and ethical decision-making regarding LST that require System 2 reflective thought. More comprehensive medical training in ethics might help to remediate these biases and lead to better long-term patient outcomes, reduced dissent and improved shared decision-making. Ethical reasoning skills – not just excellent clinical reasoning skills – as well as timely ethics support may help clinicians to more fully account for potential social, psychological and physical burdens following a child's admission for critical care treatment.
