    Insight Article
    Brad Wakefield, MBA, FACMPE
    Editor’s note: This article is adapted from a Fellow paper submitted in fulfillment of the requirements of Fellowship in the American College of Medical Practice Executives.
    Bias is a factor in decision-making1 and can lead to less than optimal decisions. Unfortunately, decision-making theory and training are not commonly taught to managers in healthcare; rather, good decision-making is an expected outcome of education and experience.

    In the 1980s, researchers at RAND published a series of studies showing that a large proportion of the procedures performed on patients were inappropriate based on the medical evidence.2 The standard for medical decision-making at that time was education and experience.

    We now hold clinicians to the standard of evidence-based medicine, but are healthcare managers held to the same standard for the decisions they make? If bias is part of their decision-making process, are those decisions evidence-based?

    In this era of team-based care, management decisions affect patient care more than ever. For example, decisions such as determining the ideal staffing ratio for primary care practices or when phones should be answered in the morning have a direct effect on patients.

    Intuition, biases and heuristics explained

    “I knew as soon as the interview started I did (or didn’t) want to hire him.” This is a management decision made on intuition. In a review of 17 studies on hiring, researchers at Harvard University found that a simple equation outperformed human judgment by 25% in identifying good hires. The researchers also found that 85% to 97% of professionals rely on some degree of intuition when making hiring decisions.3

    Cognitive bias involves drawing inferences or adopting beliefs “where the evidence for doing so in a logically sound manner is either insufficient or absent.”4 Examples of cognitive biases include:
    • Ben Franklin effect (Becoming more inclined to do a favor for someone who has previously sought your help)
    • Gambler’s fallacy (Thinking future probabilities are affected by past events that bear no weight on an outcome)
    • IKEA effect (Placing higher value on objects that you partially assembled yourself)
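    The gambler’s fallacy in the list above is easy to test empirically: in an independent process such as a fair coin, a streak carries no information about the next outcome. A minimal sketch with simulated flips (entirely hypothetical data):

    ```python
    import random

    random.seed(1)
    # Simulate a long run of fair, independent coin flips.
    flips = [random.choice("HT") for _ in range(500_000)]

    # Collect the flip that follows every run of three tails.
    after_streak = [flips[i + 3]
                    for i in range(len(flips) - 3)
                    if flips[i:i + 3] == list("TTT")]

    # Despite the streak, heads still comes up about half the time.
    heads_rate = after_streak.count("H") / len(after_streak)
    print(f"P(heads after TTT) ~ {heads_rate:.3f}")  # ~0.500
    ```

    The same independence argument applies in the clinic: a run of unusually bad (or good) days does not, by itself, change the underlying probability of the next one.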
    As humans, we like shortcuts or heuristics — “efficient cognitive processes, conscious or unconscious, that ignore part of the information.”5 For example, something is coming at you quickly; you see it out of the corner of your eye, and you instinctively move so it does not hit you. You could have looked at it directly, determined how fast it was traveling and if the current path of the object was going to intersect with you. However, by the time you did all that, the object would have either hit you or passed by. You moved intuitively because you didn’t want to take a chance on getting hit. Most of the biases affecting leaders are related to wanting to make fast, intuitive and emotional decisions, all of which happen in our brains without us being explicitly aware of them.

    There is a place in management decision-making for shortcuts and heuristics, but there is also a place for more rational decision-making. As leaders, we should understand when and how to employ these different decision-making processes.

    Management decisions are inherently uncertain because the exact outcome of those decisions cannot be predicted. Error management theory (EMT) suggests that we choose the “error” that will have the least cost over time. In business terms, the best decisions sometimes are those with the least costs over time, but EMT research has also found that bias is a part of error management.6

    If we were completely rational in our economic decision-making, we would always purchase products that provide the greatest benefits at the least cost. The truth is that most of us use shortcuts to make buying decisions. We purchase the same products week after week. We purchase products that are easier to find and reach. We don’t have time to research every buying decision, so we take shortcuts to get our shopping done in a reasonable amount of time. We also make similar decisions when we park our cars each day. We don’t always search for the “best” spot every day but tend to park in the same spot or the same area since it is easier to remember where we parked.

    Across five decades of bias research, there are some accepted findings, but decision-making science and decision-making bias are not part of standard healthcare management education. We study the good and bad decisions others have made, but we don’t really study the “why” behind those decisions.

    Performance reviews

    Performance reviews are susceptible to bias. If you like the person you are reviewing, the halo effect may lead you to focus on their good qualities and ignore their weaknesses. Recency bias may lead you to weigh recent performance more heavily, at the expense of relevant but less recent performance. Central tendency bias may cause a manager to review an entire team near the mean. Being aware of these biases is the first step to mitigating their impact on your next set of performance reviews.

    Hiring decisions

    Managers should watch for biases, especially stereotyping, while reviewing résumés and interviewing candidates. Not everyone from a particular college or university is a good fit for a position. Confirmation bias may lead a manager to solidify what he or she already thinks, which may be based on limited information (such as a quick glance at a résumé). The availability heuristic may lead you to think that the information you have (résumé, interview, references) is all you need.

    Hiring managers should consider how much better a decision could be with more information, such as a Myers-Briggs Type Indicator or an emotional intelligence assessment. Remind the managers in your organization that the information they receive from candidates is not objective and is likely not enough to assess an individual’s potential for success.

    Strategic planning

    A healthy SWOT (strengths, weaknesses, opportunities and threats) analysis or balanced scorecard process helps leaders craft better strategic plans, but there are still biases to watch for during high-level planning. Even actuaries have found that bias can affect their mathematical analysis of risk and reward.7

    Loss aversion may influence us to avoid losses rather than pursue equivalent gains. In forecasting growth for next year, if the average growth in visits across all locations is projected to be 5%, what will the specific growth be at each location? Intuitively, it seems as if the busy locations would grow faster and the less busy locations more slowly. However, regression to the mean predicts that your less busy locations will grow faster on a percentage basis.8 At your next strategic planning meeting, if a high-ranking executive says, “We’re going to open a series of new retail clinics next year,” how many people will jump on the bandwagon before applying the same rigor they would to any other growth idea? Does the source of a suggestion, or the number of people who agree with it, affect whether it moves forward?
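    Regression to the mean can be demonstrated with a small simulation. The sketch below uses entirely hypothetical numbers: each location has a stable underlying demand, and observed visit counts fluctuate randomly around it. Locations that looked slow in year one tend to post higher percentage growth in year two, simply because part of their “slowness” was noise.

    ```python
    import random

    random.seed(42)
    # Hypothetical example: 1,000 clinic locations, each with a stable
    # underlying demand, observed with random year-to-year fluctuation.
    demands = [random.uniform(8_000, 12_000) for _ in range(1_000)]

    def observe(demand):
        # Observed visits = true demand with roughly 10% random noise.
        return demand * random.gauss(1.0, 0.10)

    year1 = [observe(d) for d in demands]
    year2 = [observe(d) for d in demands]

    # Rank locations by their *observed* year-1 volume.
    ranked = sorted(zip(year1, year2))
    half = len(ranked) // 2
    low, high = ranked[:half], ranked[half:]

    def avg_growth(pairs):
        return sum((y2 - y1) / y1 for y1, y2 in pairs) / len(pairs)

    # Regression to the mean: the "slow" half grows faster on average.
    print(f"low-volume half:  {avg_growth(low):+.1%}")
    print(f"high-volume half: {avg_growth(high):+.1%}")
    ```

    This is why an unusually slow (or busy) year at one location is weak evidence about its next year; forecasts are better built from the underlying trend than from the latest deviation.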

    Most executives work to obtain as much information as possible before making a decision, but the availability heuristic may make some people feel as though they have all the information needed for due diligence.

    Group decision-making

    Committees or groups have some advantages over individuals when making decisions, including diverse opinions and experience, mitigation of individual biases and better buy-in for the agreed-upon decisions.9 Still, some biases influence groups, especially the bandwagon effect. Executives need to be explicit in their communication with their teams to ensure they are comfortable sharing their thoughts in meetings.

    Forecasting and budgeting

    To some extent, practice leaders must make predictions when forecasting and creating a budget. If your last quarter was good, how hard is it to budget based on the full previous year rather than just that quarter? Will others ask, “Your last quarter was great, so why isn’t your budget more optimistic?” Recency bias suggests it is going to be difficult to tell them “no.”

    New services

    Few things are potentially riskier than opening a new location or offering a new service. While we do our due diligence by reviewing the best available information and working on a pro forma, we know that the outcome is uncertain.

    While making plans, confirmation bias may lead us to seek information that confirms what we already believe and to ignore information to the contrary. If you hear a rumor that a competitor is looking for office space near the location you plan to lease, confirmation bias can tempt you to dismiss the rumor as having little merit. On the other hand, if you have already decided not to go forward with the new location, the same rumor will weigh heavily in favor of your decision.


    Recognizing bias is not by itself a reason to avoid making a decision. If you recognize some of these traits in your thinking, begin by reflecting on how your mind works when making decisions. Understand that your brain wants you to feel certain; being humble and realizing that you really don’t “know” the best decision is a healthy place to start.

    Consider the example of a quick decision on an interview candidate (perhaps the interview served a confirmation bias formed while reviewing the candidate’s résumé). Sometimes we’re just too busy to go through a behavioral interviewing process and talk to other stakeholders or references. If you don’t spend enough time on the process, you will be taking some mental shortcuts.

    Overcoming bias

    We can institute processes within our organizations to help us combat many forms of bias.
    • Checklists are very effective in reducing reliance on mental shortcuts.
    • Ask someone to be “devil's advocate” in your group or committee to help diminish the chances of a bandwagon effect or confirmation bias.10
    • Create structured interview questions.11
    • To avoid anchoring bias, follow negotiation preparation best practices, such as entering a discussion with your own number in mind and being able to justify it. Even if an anchor number has been set, don’t let the anchor drag you away from the number you brought with you.
    • To avoid confirmation bias, actively look for opposing viewpoints. If you think a stock is going to go up, explore differing opinions or ideas you have not considered. You may be right, but you may learn something you didn’t know.
    • If you really like an employee’s performance, make sure you let him or her know — and then help make him or her better. We all have weaknesses, so take the opportunity to coach someone and push him or her to reach a higher performance level.
    • If you made a poor decision, acknowledging it will be emotionally difficult. It is human nature to blame the poor outcome on something beyond your control, which is the driver behind choice-supportive bias. The most important thing is learning from your mistakes.
    • Reject stereotypes. Reflect on how lazy thinking based on someone’s birthplace, race or other characteristic would feel if you were on the receiving end.
    • Each decision you make needs to stand on its merits. Outcome bias is hard to resist. If we have been successful once, it is easier to believe we can do it again.
    • Loss aversion bias is simply an emotional reaction to the pain of loss and embarrassment of failure. When you are worried about a potential loss, talk about it with others. They may not share your emotions about the potential for loss. It is normal for us not to want to “lose” in front of our peers or boss, but those emotions may make us more conservative than necessary.
    It should not come as a surprise that we are occasionally irrational. We are subject to a long list of cognitive biases that affect our decision-making.

    It is often easier to take shortcuts because we are busy and expected to make quick decisions. We need to realize that shortcuts and emotions are often not the best ingredients in the decision-making process. If we are aware of our biases, we can do something about them. Better management decisions will lead to better patient care.

    As practice leaders, we should be open to and accepting of evidence-based management decision-making. Our physician colleagues have set a great example for us; let’s follow their lead and make fewer biased and more evidence-based management decisions.

    Addendum: Explaining specific biases

    • Anchoring bias: an overreliance on the first piece of information we hear.12 A notable example involved asking students to estimate how old Mahatma Gandhi was when he died. Some students were first asked whether he was older or younger than 144 when he died, while others were asked whether he was older or younger than 35. Estimates were much lower among students given 35 as an anchor than among those given 144.13 Anchoring bias has also been shown to operate in salary and contract negotiations; research suggests you should make the first offer rather than let the other party set the anchor.
    • Confirmation bias: the tendency to look only for information that confirms what you already believe.14 A famous example of confirmation bias is the lead-up to the 2003 Iraq war. Officials who believed weapons of mass destruction existed in Iraq worked hard to produce evidence supporting their belief, and they discounted information suggesting there were no WMDs in Iraq.15
    • The bandwagon effect: Managers are more likely to agree with information or decisions based on the number of people who already hold that belief.16 Imagine an executive team meeting where a CEO and CFO think cash should be eliminated at check-in desks. A patient experience director might have a difficult time expressing concerns about that change hurting patient experience scores, knowing that the top of the C-suite favors a different outcome.
    • Availability heuristic: Managers usually overestimate the value of information that is available to them. They think, “What you see is all there is.”17 The information that is available is not always complete and is not always the information you need to make a decision. You may need to delay the decision while you gather more information.
    • The halo effect: Managers sometimes have an overly positive view of their employees and a difficult time seeing or expressing areas of improvement for those employees.18 The truth is that even good employees have weaknesses and areas they could improve, and even when business is good, there are pieces of the business that are weak and need improvement. Managers need to be prepared to see through the halo to help good people, and good businesses, get better.
    • Choice-supportive bias: Feeling positive about a decision even if it has flaws.19 If we return to the example of the C-suite that wanted to stop collecting cash at patient check-in, you can imagine those leaders will feel good about their decision even as feedback arrives that certain patients don’t like it. This bias helps protect the self-esteem of those who made the decision: successes are attributed to personal effort, while failures are attributed to things beyond our control. Managers should support their decisions, but they also need to stay objective enough to listen for potentially better ideas.
    • Stereotyping: Expecting a group or person to have certain qualities without having real information about the person or group.20 Stereotyping is lazy — a decision shortcut that harms individuals, groups and society. From a management perspective, we have a duty to find the best individuals for our organizations, not just those who fit preconceived notions about a particular background or even the college they attended.
    • Outcome bias: Judging a decision on the outcome rather than the process used to get to that outcome.21 If a hiring decision is made in the first few seconds of an interview and that employee turns out to be successful, one might perceive it as a good decision and that the process was sound. However, the process that led to that decision was not a “best practice.” We should realize we were lucky and not believe that we can use intuition alone to find the best candidates.
    • Central tendency bias: When rating people’s performance, most managers will rate people in the middle of the scale.22 This bias does not provide the best feedback for any employee: It leaves high performers with the feedback that they are closer to average than exceptional, and low performers are told they are closer to average than needing improvement. Both sets of individuals are not getting the help they need with accurate feedback.
    • Recency bias: Managers weigh recent information as more valuable or relevant than older information.23 Have you ever encountered a physician who had three no-shows in a day and wants something done about the high no-show rate, even though his or her three-month average is below 5% or below the clinic average? A productive conversation becomes difficult when recent data does not correlate with the more comprehensive data measured over months. Recency bias tends to keep us in firefighting mode, always reacting to the latest data with some type of action, even if that data is no more relevant than data from previous months.
    • Loss aversion bias: The tendency, rooted in how we feel about ourselves and how sensitive we are to what others think of us, to avoid losses rather than pursue equivalent gains.24 It is much easier to sell a stock for more than you paid than to sell at a loss, even though either could be an equally good or bad decision.
    1. Johnson DDP, Blumstein DT, Fowler JH and Haselton MG. "The evolution of error: error management, cognitive constraints, and adaptive decision-making biases." Trends in Ecology and Evolution, 2013, 474-481.
    2. Chassin MR, Kosecoff J, Solomon DH and Brook RH. "How coronary angiography is used: Clinical determinants of appropriateness." JAMA. 258 (18), 2543–2547. doi:10.1001/jama.258.18.2543.
    3. Kuncel NR, Ones DS and Klieger DM. "In Hiring, Algorithms Beat Instinct." Harvard Business Review, May 2014.
    4. Haselton MG, Nettle D and Andrews PW. 2005. “The evolution of cognitive bias.” In Buss DM, editor, The Handbook of Evolutionary Psychology. Hoboken, N.J.: John Wiley & Sons, 725.
    5. Gigerenzer G and Gaissmaier W. "Heuristic Decision Making." Annual Review of Psychology, 2011, 451-482.
    6. Johnson, et al.
    7. Wolf R. “How to minimize your biases when making decisions.” Harvard Business Review, Sept. 24, 2012. Available from:
    8. Kahneman D. Thinking, Fast and Slow, 1st edition. New York: Farrar, Straus and Giroux, 2011.
    9. Bang D and Frith CD. "Making better decisions in groups." Royal Society Open Science, 2017.
    10. Healy P. “Confirmation bias – how it affects your organization and how to overcome it.” HBX Business Blog, Aug. 18, 2016. Available from:
    11. Bohnet I. “How to take the bias out of interviews.” Harvard Business Review, April 2016.
    12. Rosin T. “How 4 types of cognitive bias contribute to physician diagnostic errors — and how to overcome them.” Becker’s Hospital Review, June 9, 2017. Available from:
    13. Kahneman.
    14. Healy.
    15. CNN. "Ex-CIA official: WMD evidence ignored," April 23, 2006. Available from:
    16. Lebowitz S and Lee S. “20 cognitive biases that screw up your decisions.” Business Insider, Aug. 26, 2015. Available from:
    17. Kahneman.
    18. Maier S. “4 unconscious Biases that Distort Performance Reviews.” Entrepreneur, Sept. 22, 2016. Available from:
    19. Lind M, Visentini M, Mantyla T and Del Missier F. "Choice-Supportive Misremembering: A new taxonomy and review." Frontiers in Psychology, 2017, 8: 2062.
    20. Baer.
    21. Ibid.
    22. Maier.
    23. Ibid.
    24. Lovallo DP and Sibony O. Distortions and deceptions in strategic decisions. McKinsey Quarterly, Boston: McKinsey, 2006.

    Continue adding critical thinking and evidence-based decisions to your day-to-day role with our resource, Don’t Do Something Just Stand There by Frank Cohen. Learn to reduce the uncertainty around the decisions you make, thereby increasing the likelihood of a positive outcome.
