Please be assured this is not a woke rant about micro-aggressions or cultural stereotyping, but about something potentially much more impactful on drug development outcomes: the associations we all hold, outside of our conscious awareness and control, that are triggered when our brains automatically make quick judgements and assessments. The essence of the scientific method is to be objective, grounded in sound logic and documented assumptions, and to act on facts and evidence. As humans, however, it is inevitable that unconscious bias creeps into the decisions we make, even in an R&D context. Biases can lead to poor decisions that result in reduced productivity and efficiency, missed opportunities, and even expensive late-stage failures.

Thinking, Fast and Slow

In 2011, psychologist Daniel Kahneman published his groundbreaking book Thinking, Fast and Slow [1], in which he put forth the now widely accepted theory of System 1 and System 2, the two cognitive systems that people use when making decisions. System 1, or gut instinct, evolved to provide rapid decisions based on complex information, particularly when quick action is critical, such as in a fight-or-flight response. System 2, by contrast, is more deliberate, providing a more rational approach based on conscious analysis of the available information.

System 1: Fast

  • Defining characteristics: unconscious, automatic, effortless
  • Operates without self-awareness or control: “What you see is all there is.”
  • Role: assesses the situation, delivers updates
  • Does 98% of all our thinking

System 2: Slow

  • Defining characteristics: deliberate and conscious, effortful, controlled mental process, rational thinking
  • Operates with self-awareness and control; logical and skeptical
  • Role: seeks new/missing information, makes decisions
  • Does 2% of all our thinking


Through the use of priming studies, Kahneman showed experimentally that System 1 forms an initial decision when presented with stimuli, which is then modified by System 2 when time permits. However, the correction provided by System 2 is often imperfect, leaving a residual degree of irrationality even after further consideration. Whether his theory is true or not, the concept is worth noting, summed up aptly by Harvard psychologist and author Steven Pinker: “His central message could not be more important, namely, that human reason left to its own devices is apt to engage in a number of fallacies and systematic errors, so if we want to make better decisions in our personal lives and as a society, we ought to be aware of these biases and seek workarounds. That's a powerful and important discovery.”

The Many Faces of Bias

The issue of bias was more recently examined by Bieske et al. in 2023 [2], who cover 13 common cognitive biases encountered in pharma R&D, each well documented in the literature and summarized below. They conducted an online survey of 92 industry practitioners working for pharmaceutical and biotech companies. Most of the surveyed experts observed a wide range of biases in portfolio decisions they were involved with. Overall, the five most frequently observed cognitive biases were confirmation bias, champion bias, misaligned incentives, consensus bias, and groupthink. The survey also uncovered additional common biases, namely gender bias, fear of challenging authorities, fear of social punishment for being critical, fear of punishment for failure, fear of risking one's career, rejection of external ideas, bias toward hypothesis validation versus rejection, bias toward old methods, and distraction.


Overview of common cognitive biases impacting decision-making:

  • Confirmation bias: Discounting information that undermines personal beliefs, past choices and judgments; overweighing evidence supporting personally favored views
  • Champion bias: Projecting a project champion’s previous success onto a project proposal, or overweighing the champion’s personal view when selecting projects
  • Misaligned individual incentives: Incentives creating conflicting interests, e.g., misalignment of executives’ compensation plans and shareholder value
  • Consensus bias: A leader overestimates the similarity between their own preferences and those of the group (e.g., overestimation of product acceptance in the market)
  • Groupthink: Seeking consensus in a group to such an extent that irrational decisions are made
  • Availability bias: Tendency to make decisions based on information that readily comes to mind, leading to a bias toward easily recallable options rather than the most important ones
  • Power of storytelling: The way in which information is framed and presented can lead to different conclusions; facts embedded in coherent stories are easier to remember
  • Status quo bias: Change aversion leading to a bias toward existing views/options
  • Anchoring bias: Rooting oneself to an initial quantitative value, leading to over- or underestimation of subsequent scenarios with differing conditions
  • Loss aversion: Tendency to prefer avoiding losses and uncertainty (preferring safe bets with small rewards over risky projects with high rewards)
  • Optimism bias: Overconfidence, which makes one believe that a project will be successful
  • Sunk-cost fallacy: Continuing failing projects because they have already consumed substantial resources (previous expenditures influence decision-making)
  • Misaligned perception of corporate goals: For example, focusing on short-term success versus working toward the corporate long-term vision


Some of these can be checked consciously and by introspection, e.g., gender bias. Others, such as champion bias and the power of storytelling, are very difficult to avoid, and still others are endemic to hierarchical management structures and particularly prevalent in large organizations. For example, only the very brave will openly challenge those who wield decision power over their future career progression, or question senior scientists with a stellar track record whose last project turned out to be a commercial blockbuster.

An important consideration in the context of this piece is that the bias in these scenarios may not always be conscious, meaning the individual isn’t always aware it is occurring, i.e., they aren’t thinking about the issue and deliberately choosing not to take corrective action, though that obviously occurs too. Bias can operate automatically, at a level beneath the surface, simply skewing one’s perception of the information being presented.

Although many of us may recognize the types of bias listed above, the one we come across most in biotech due diligence is optimism bias.

Optimism Bias

As Lovallo and Kahneman put it in their ‘Delusions of Success’ article in HBR [3], “optimism generates much more enthusiasm than does realism, and it enables people to be resilient when confronting difficult situations or challenging goals.” Companies therefore certainly should promote optimism to keep employees motivated and focused. A certain degree of optimism bias is actually essential to the survival of many startups: who could really be motivated to hitch their star to a preclinical wagon if they focused only on the single-digit percentage chance of generating an approved drug?

On the other hand, what can you say about a management team responsible for a >150% time overrun on one project that now describes a new and very similar project as "low risk"? The optimist might hope that the lessons learned in the first exercise would enable timely success in its successor. This is where outside viewpoints can add significant value, specifically in carefully probing to validate this hypothesis and to ensure it is not another example of optimism bias. This was indeed the case in a recent due diligence assessment we conducted, where the team in question matched the above example quite well and described the similar initial project as "having gone well in the end", strongly suggesting a lack of self-awareness and, hence, bias. Conscious biases can easily become unconscious and seat themselves firmly at the base of an individual’s or team’s decision making.

As the outside consultants who are often the ones presenting findings that run contrary to internal dogma, we are frequently met with significant scrutiny and skepticism. Once the dust settles, this turns to surprise when our assessments prove valid: how could we have missed that? If half as much skepticism had been directed internally at the object of our due diligence itself (the deficiency, oversight or overly optimistic plan), perhaps it would have been uncovered and dealt with. But the truth is that it is simply much easier to challenge the assumptions of parties external to the organization than those within it. A certain amount of optimism is helpful and even necessary for a biotech’s culture; the challenge is realizing when that thinking has strayed too far from what is realistic. And that can be difficult.

Conversely, biopharma development teams harbor a well-known skepticism of prospective clinical investigators when told that a center can recruit x patients in y months for a clinical trial. One reasonably assumes that recruitment will deliver half of x and take twice y, i.e., a quarter of the claimed recruitment rate (and even then one often stands to be disappointed). It is much easier to make this type of corrective assumption about someone external to the organization, yet it is extremely important to be just as critical when dealing with internal assumptions.

The Convergence of Optimism Bias and Sunk-Cost Fallacy

The sunk-cost fallacy occurs when companies continue to invest in a drug despite mounting evidence of failure, because of the significant resources already invested rather than the drug's potential for success. This can easily converge with optimism bias, a combination that tends to keep projects alive even when the evidence has deviated well away from pre-specified objectives and success criteria. In these cases, the success criteria are continually redefined more loosely, e.g., twice-daily (BID) dosing will be acceptable rather than the original once-daily (OD) requirement, or an IV (intravenous) formulation rather than an IM (intramuscular) one. The case for the project gets weaker and weaker, but everyone has a vested interest in keeping it going.

Overcoming Bias

This piece isn’t meant to solve the challenges unconscious biases create, but simply to draw attention to their existence in an area where we expect objective decision-making to be paramount: developing drug products. There is no magic bullet; the best solutions for ensuring sound decision making usually involve dispassionate third parties. A periodic external reality check is usually the safest approach, whether conducted by a scientific advisory panel, an independent due diligence team, or another type of board with external representation and the requisite expertise. Whichever path is chosen, just be sure to select your external experts without involving your unconscious biases!


Bibliography

[1] Kahneman, D. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.

[2] Bieske, L., Zinner, M., Dahlhausen, F. and Truebel, H. Trends, challenges, and success factors in pharmaceutical portfolio management: Cognitive biases in decision-making and their mitigating measures. Drug Discovery Today 28(10), 2023: 103734. https://doi.org/10.1016/j.drudis.2023.103734

[3] Lovallo, D. and Kahneman, D. Delusions of Success. Harvard Business Review, July 2003.


Our Drug Development Expertise

Alacrita's core team leverages 150+ industry-experienced clinical development consultants with backgrounds spanning a broad range of functional disciplines, therapeutic areas and product modalities. Our extensive consulting resources allow us to offer versatile, fit-for-purpose expertise that can be tailored to the exact needs of your clinical development program, enabling us to support you in each crucial area from planning and strategy to execution and tactical support.

