Think Again by Adam Grant | Book Notes

“Progress is impossible without change; and those who cannot change their minds cannot change anything.” – George Bernard Shaw

When it comes to our ideas and mindsets, we tend to adopt one of three personalities.


We’re either a preacher, a prosecutor, or a politician.


Preacher mode comes out when our long-held, sacred beliefs are in jeopardy. We preach elegant messages and illustrations to protect and promote our precious ideas.


Prosecutor mode reveals itself when we see mishaps, flaws, or illogical thinking in someone else’s argument. We do everything we can to prove them wrong and win our case.


Finally, we all become “influential” politicians when trying to win over a crowd or an audience. We flash a smile and crack some jokes (or laugh at their corny ones), all in hopes that they will adopt our idea.


The Danger


But there’s danger in each of these. As Adam Grant puts it in the book, “The risk is that we become so wrapped up in preaching that we’re right, prosecuting others who are wrong, and politicking for support that we don’t bother to rethink our views.”


Once we choose how to think of something, we often stick to that mindset no matter the evidence presented to us.


Instead of being a stupid “P,” Grant proposes, we need to all be scientists. Don’t worry. You don’t have to quit your job and buy a white lab coat, although you could. Instead, being a scientist means searching for truth, embracing evidence that seems to differ from your current position, and running experiments and tests to discover what the world holds beyond the views you’ve held for the past twenty years.


In preacher mode, changing your mind is a sure sign of weakness. In scientist mode, it shows you have intellectual integrity.


But what if we’re in scientist mode, have the data and research, but can’t get other people to change their minds? Then what do we do?


Note: there are times when preaching and prosecuting make an argument stronger. Grant puts it this way: “Research suggests that the effectiveness of these approaches hinges on three key factors: how much people care about the issue, how open they are to our particular argument, and how strong-willed they are in general.”


Through an incredible story about how Steve Jobs was at first resistant to creating the iPhone, Grant demonstrates the power of arguing for what will stay the same. Change is scary. If things are going well in the moment, people wonder, why change what we’re doing now? But when you affirm what will stay the same through the change, they’re more open to hearing and enacting your idea. They realize, “Oh, not that much is changing. A, B, C, and D are all staying the same. We’re just changing E. Sure, let’s try it.”


Counterfactual Thinking


If reinforcing what will stay the same doesn’t help, you could try leading them down a path to imagine a world where their life circumstances had unfolded differently. This is what psychologists call “counterfactual thinking.”


It’s easy to forget how influential things like where we were born, who raised us, or where we went to school can be on our thinking. If you can successfully get someone to understand how easily they might have held different initial opinions or stereotypes, they might be much more willing to change their view. 


If you’re in a heated debate, argument, or conflict at work (or elsewhere), it can be helpful to ask, “What evidence would change your mind?” Because sometimes, that answer is nothing. And that’s okay. I don’t know about you, but I’d rather know that before spending hours trying to do so.


The Confident Impostor


I played baseball for 15 years as a kid. (My goal was to play in college, but life had other plans, I guess.)


No matter where we played or who we played against, there was always the teammate or parent on the bleachers who thought they could do a better job of coaching the team than the actual coach. Sometimes they honestly might have been able to; there weren’t many requirements for being a baseball coach in travel ball. But most of the time, they couldn’t.


That’s what’s called the armchair quarterback: the parent or fan who thinks they know how to do a better job than the actual coach. The armchair quarterback’s confidence exceeds their competence.


On the other end of the spectrum is impostor syndrome. It’s a bit more common than the armchair quarterback: people with impostor syndrome have competence that exceeds their confidence. They are good at what they do but don’t believe in themselves enough, or don’t want others to think they are good.


Either end of that spectrum is negative. If your confidence exceeds your competence, you’re just a jerk. If your competence exceeds your confidence, you won’t get anything done, and no one will trust you. So the ideal amount of confidence lies somewhere in the middle. Your competence and your confidence need to be in sync. You have to be confident in your ability to achieve a goal in the future, yet humble enough to question whether you have the right tools or you’re the right person for the job.


When someone lacks skills in a specific area, they’re afraid to ask for help or to let someone else do it. When someone tries to force them to change their mind, they just double down. If that person is leading your team, you’re headed straight for a nosedive that ends in a fiery crash.


Humility is a reflective lens: it helps us see our weaknesses. Confident humility is a corrective lens: it enables us to overcome those weaknesses.


You Are Not Your Ideas


When we think we know better than someone else, it’s not just the armchair quarterback doing all that work alone. They’ve got a teammate, and that teammate is the totalitarian ego.


The totalitarian ego is like a mini-dictator living inside your head, sitting right next to that pesky little armchair quarterback. Its job is to control the facts that flow into our heads, swatting down anything that disproves our hypothesis about a topic.


This happens in a two-step process:


First, the opinions we hold are shielded in “filter bubbles.” These bubbles grow with each new piece of information that confirms our opinion, and the ego grows along with them.


Once enough confirming data reinforce these opinions, they’re sealed inside an impenetrable safe that forms an echo chamber, repeating only what we want to hear. Sound familiar?


There’s only one cure for this totalitarian ego, and it comes from Daniel Kahneman: detachment.


As a psychologist, Kahneman doesn’t go a day without forming a new, weakly held opinion. Also as a psychologist, his job is to look at the studies, interpret the data, and see whether his hypothesis was correct. And if it wasn’t, to change it.


The only way he can have so many fleeting ideas is that he refuses to let his beliefs become a part of his identity. 


“I am ______________________. Here is a belief that crossed my mind. Show me how I’m wrong.”


One practical way to put this into practice tomorrow: whenever you form a new idea, opinion, or hypothesis about how something works, ask yourself, “What would have to happen to prove this false?”

Written by
Dalton Mabery