‘Thinking, Fast and Slow’ May Be the Most Helpful Book I Have Ever Read

Some books make an impact; others change your life. Thinking, Fast and Slow by Daniel Kahneman has changed how I see the world. It is truly the most helpful book I have ever read.
When Kahneman published this book (2011), which comprises his life’s work in psychology and behavioral economics, social media was in its infancy. The iPhone 4s was the latest device on the streets.
Now we are constantly inundated with content that can psychologically manipulate us, perpetuate our cognitive biases, and lead us further into a critical thinking crisis. Kahneman’s book on how we think is more important than ever. And while it may be dense, it’s probably the most accessible and comprehensive exploration of human psychology that’s available to general readers.
At its core, the book illuminates the fascinating duality of our minds, something I had never considered before reading Thinking, Fast and Slow. We all have two systems according to Kahneman:
- System 1 — our lightning-fast intuitions and automatic functions that drive our daily actions; and
- System 2 — the energy-intensive, deliberate focus we engage for complex problem-solving
Through this framework, Kahneman doesn’t just tell his readers how we think. He introduces each of us to ourselves (which is why it’s the most helpful book). If you’re like me and have never read his work or studied psychology, you may be meeting yourself for the first time. My first encounter with my mental flaws, including some of the hidden patterns and shortcuts my mind takes (System 1), happened early in the book. Answer the following question posed by Kahneman:
A ball and a bat cost $1.10
The bat costs $1 more than the ball
How much does the ball cost?
If you initially answered 10 cents like me, you’re wrong.
The correct answer is 5 cents. Many people get this question wrong on the first try, so if you did, don’t feel bad. It’s a compelling example of relying too heavily on our intuition (System 1) instead of slowing down and thinking harder (System 2).
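To see why, it helps to write out the quick bit of algebra that System 1 skips over:
Ball = x
Bat = x + $1.00
x + (x + $1.00) = $1.10, so 2x = $0.10 and x = $0.05
The intuitive 10-cent answer would make the bat $1.10 and the total $1.20.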
What separates Thinking, Fast and Slow from other books in its genre is its remarkable ability to bridge the divide between theory and practice. Throughout the book, Kahneman reveals the psychological forces at play in our daily lives, forces that, little did he know (or did he?), have since been compounded exponentially by social media. From the countless cognitive biases that shape our decision-making to the anchoring effects that can significantly influence economic decisions, this book serves as both a warning and a toolkit.
Kahneman alerts us to mental traps while providing actionable strategies to make better choices. And this spans all areas of life — professional settings, personal relationships, financial decisions, etc. One of the biggest mental traps is the lazy default setting of our brains.
We conduct our mental lives by the law of least effort
Kahneman describes how laziness is built deep into our nature. Our brains naturally want to conserve energy, and any effort is a cost. Whether we acquire a skill comes down to how those costs balance against the benefits the skill delivers.
No wonder so many of us default to scrolling TikTok or “Netflix and chilling” instead of pursuing that creative project we’ve been putting off, or making time for old friends and new ones alike. Scrolling social media for quick dopamine hits is simply easier than most alternatives, and our brains quickly default to that option.
But as I have learned from this book, if I start from the premise that I operate by the law of least effort, I know that I must remove as many distractions or easy temptations as possible. If I’m pursuing something creative or focus-intensive, I clean my workspace, turn off email and social media, and focus on the task at hand.
When done effectively, Kahneman describes, our System 2 can kick in and we can enter a “flow state,” in which people exhibit “effortless concentration so deep that they lose their sense of time, of themselves, and of their problems.” Maintaining focus in flow requires no self-control, leaving precious mental resources for the task itself.
This law of least effort touches other areas of our lives as well. My intuitive System 1 brain read the bat-and-ball problem above and simply subtracted the two available numbers; it took the path of least mental effort. In retrospect, I should have slowed down and worked through the math before jumping to a conclusion. It made me think of all the other areas of life where I fall prey to similar mental hiccups. I know far less about myself than I thought.
You know far less about yourself than you think you do
Kahneman describes how this reveals itself in myriad ways. For example, priming can shape and influence our behavior through symbolic cues without our even realizing it. One study Kahneman cites displayed an image of watching eyes some weeks and an image of flowers other weeks above an honor-system contribution box. Contributions rose noticeably when the eyes were on display, because they made people feel they were being watched. We are motivated by what others may think of us. I certainly am, probably more often than I want to admit.
We may also know less about ourselves and our beliefs as the world spreads more misinformation. As Kahneman notes (in 2011!), “A reliable way to make people believe falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.”
This can also manifest itself when a single phrase in an otherwise false statement happens to be familiar. That familiarity can lead us to feel that the whole statement is familiar, and therefore true, unless we’re able to slow down and engage our System 2.
This can become particularly dangerous when we believe a conclusion is true and are then more likely to believe any arguments that support it, even if those arguments are obviously unsound. I have to constantly remind myself that even if a certain conclusion is desirable or preferable, I cannot assume the conclusion or start my analysis by finding ways to support it. The arguments that lead to a conclusion must be sound. System 2 must be engaged.
We are also hit with many other strategies, whether in marketing, on social media, or elsewhere, that influence our behavior without our realizing it. For example, Kahneman notes that text printed in bright blue or red is more likely to be believed than text in middling shades of green, yellow, or pale blue. Similar strategies are used with images, videos, and hooks in advertising to engage our System 1 and suppress any critical thought from System 2.
Just think how easy it is to mindlessly scroll on social media. One bright screen or engaging image, text, or clip after another.
Kahneman warns that we’re especially susceptible to these influences when we’re in a bad mood or when we’re tired. Negative emotions and fatigue can have a powerful effect on intuitive (System 1) performance.
What makes this the most helpful book I’ve read is that now I’ve learned to apply more scrutiny to fonts, familiar concepts, and regularly repeated words or phrases. I’m also more cautious whenever I’m in a bad mood or tired. If we are not vigilant when our System 2 is weakened, it’s easier to fall prey to deceptive marketing and unsound arguments.
What you see is all there is
Another common mental trap is thinking that what you see is all there is. When information is scarce, System 1 operates as a machine for jumping to conclusions. You might be thinking that this is less of an issue in the modern world with so much information at our fingertips. But the opposite can be true, especially when we don’t know what information to trust.
When someone asks whether a company that manufactures one of your favorite products is financially sound, you might be willing to say yes despite having never reviewed any financial reports or statements.
One of the most classic examples is meeting someone for the first time. If they do something silly or foolish, that first impression likely clouds your view of them forever. The inverse is also true: you might hold someone in higher esteem than they deserve simply because they left a positive first impression. This halo effect unduly colors everything you subsequently learn about them.
In the political world, the voting public suffers from “what you see is all there is” on a regular basis. If they are told something that’s difficult to substantiate, but the messenger is on their “team”, they’re more likely to believe it without fact-checking.
In the marketing world, consumers neglect key information because companies frame their advertising to evoke particular emotions. For example, marketing a product as 90% fat-free rather than 10% fat is a framing effect that leaves many consumers with a comforting sense that the overall product must be good (despite the many other unhealthy ingredients it may have).
Kahneman argues that we can combat the halo effect, base rate neglect, and other pitfalls of “what you see is all there is” through a principle he calls decorrelating errors: reducing bias and improving accuracy by increasing the number of independent inputs that feed a conclusion.
In my own life, when I meet someone new, I now make a conscious effort to reserve judgment. If the person comes off negatively, I try (emphasis on try) to extend charitable assumptions. I have no idea what that person is going through, and I have far too small a sample size to draw conclusions about their character. The same is true if my first encounter with them is positive.
This is why Kahneman has recommended over the years that companies require everyone to write a brief summary of their position before an important meeting. Amazon uses internal memos in this manner. The goal is to gather as many independent inputs as possible, without any one position influencing the others. We have all been in settings where the first person to speak confidently and assertively ends up swaying the opinions that follow (for better or worse).
Why this is the most helpful book I have read: it teaches you how to constantly question your influential System 1 and your lazy System 2
Prior to reading Thinking, Fast and Slow, I never appreciated the duality of my mind, or how the intuitions served up by my System 1 brain can shape my thinking, perspectives, and worldview. While that system is largely beneficial (I don’t have to deliberate over every single action or thought, second by second), it’s important to understand its limitations and dangers.
Similarly, the laziness of System 2, my focused and thoughtful brain, is a reality of life. Sometimes intuition (System 1) beats logic (System 2) even when the correct answer stares me in the face. I have to be more mindful of engaging my System 2 by slowing down and not jumping to conclusions as I first did with the word problem posed at the beginning of this essay.
Kahneman warns that our intuitions will deliver predictions that are too extreme, and we’ll be inclined to put too much faith in them. The more we can recognize that System 1 is designed to jump to conclusions from little evidence, the more we can reduce the size of its jumps. While System 1 is the origin of much of what we do right, it’s also the impetus of what many of us, including myself, do wrong.
Unfortunately, there are no easy solutions here. But simply learning this basic psychology has helped me in a way that no other book has done. It’s made me understand that no matter how pure of thought I think I am, I have biases and there’s little I can do about them without considerable investment of effort to recognize situations where errors are likely.
One of the best ways to recognize these cognitive minefields is by reading Thinking, Fast and Slow and then applying its principles to daily life. It starts by slowing down and asking for reinforcement from System 2 while appreciating how difficult that is in practice.