Edgar Anaya

What You See Is All There Is


The human mind operates as an "associative machine": as we constantly analyze the world, we draw on prior knowledge to make sense of what we observe. However, the "What You See Is All There Is" bias affects our decision-making.


Daniel Kahneman, winner of the Nobel Prize in Economics, introduced the cognitive bias called "What You See Is All There Is" (WYSIATI) in his book Thinking, Fast and Slow. It illustrates an idea we are all too familiar with, but one we ignore far too often: individuals often fail to question the absence of evidence when presented with information that confirms their existing mental models.


Our brain divides the work between two “systems.” System 1 could be called our intuition. It works “automatically and quickly” with “little or no effort.”


This system is useful because most of our lives are filled with routines. We wake up, go to work, eat, talk, drive, watch television, and sleep. Practicing these habits carves pathways into our brains' connections, which makes those routines easy to repeat.


The problem is that because these routes are so easy to repeat, our brains sometimes choose them without considering all the critical information.

This is where System 2 comes into play. System 2 “requires attention.” It is responsible for “choice,” “concentration,” and “complex computations.” This analytical tool helps us apply self-control when needed. It is a rational machine that enables us to learn, perform, and grow.



But System 2 comes at a cost: it requires effort. Most of the time, our brains are lazy. Or more appropriately, we are.


We often overlook the importance of being present, paying attention to the questions we're asked, and processing the constant influx of information in our fast-paced 21st-century society. Therefore, we rely on System 1 more frequently than necessary. We think fast when we should be thinking slow.


Kahneman shows throughout his book that we do not make decisions as rationally as we would like to believe. We let unconscious effects influence our decisions without realizing it. WYSIATI often leads us to believe we have more information than we actually do, because our brains "fill in the gaps" with memories, feelings, and unconscious cues. We fall for "one-sided evidence" because our brains have a bias towards belief, or more precisely, a bias to believe what they already believe.

 

Overcoming the "What You See Is All There Is" (WYSIATI) bias involves conscious effort and deliberate strategies to expand our perspective and improve decision-making.


1. Seek Additional Information:

  • Make a habit of gathering more data before making decisions. Consider what information might be missing and seek it out actively.

  • Use checklists or structured decision-making frameworks to ensure all relevant factors are considered.


2. Consider Alternative Perspectives:

  • Challenge your assumptions by considering alternative viewpoints and scenarios. Ask yourself what someone with a different perspective might think.

  • Engage with diverse opinions and encourage debate or discussion to surface different angles and insights.


3. Use Analytical Tools:

  • Leverage tools such as SWOT analysis (Strengths, Weaknesses, Opportunities, Threats) to systematically evaluate all aspects of a situation.

  • Consider scenario planning to explore different potential futures and how they might impact decisions.


4. Slow Down:

  • Take time to reflect and avoid rushing to conclusions. Quick decisions often rely on limited information.

  • Use techniques such as the "premortem" method, where you imagine a decision has failed and work backward to identify potential causes, encouraging a more thorough evaluation of factors.


5. Foster a Culture of Inquiry:

  • Create an environment where questioning and curiosity are encouraged. Promote open communication and reward those who identify gaps in information or assumptions.

  • Encourage team members to play "devil's advocate" to explore all sides of an issue.


By incorporating these strategies into our decision-making process, we can reduce the influence of WYSIATI bias and make more informed and balanced decisions.


Sources:

Daniel Kahneman – Facts. NobelPrize.org. Nobel Prize Outreach AB, 2024. Accessed 21 Jul. 2024. https://www.nobelprize.org/prizes/economic-sciences/2002/kahneman/facts/

Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.



