How to avoid the cognitive hooks and habits that make us vulnerable to cons

By Sedoso


Daniel Simons and Christopher Chabris are the authors of Nobody’s Fool: Why We Get Taken In and What We Can Do About It.
Basic Books
There’s rarely time to write about every cool science-y story that comes our way. So this year, we’re once again running a special Twelve Days of Christmas series of posts, highlighting one science story that fell through the cracks in 2023, each day from December 25 through January 5. Today: A conversation with psychologists Daniel Simons and Christopher Chabris on the key habits of thinking and reasoning that may serve us well most of the time, but can make us vulnerable to being fooled.

It’s one of the most famous experiments in psychology. Back in 1999, Daniel Simons and Christopher Chabris conducted an experiment on inattentional blindness. They asked test subjects to watch a short video in which six people—half in white T-shirts, half in black ones—passed basketballs around. The subjects were asked to count the number of passes made by the people in white shirts. Halfway through the video, a person in a gorilla suit walked into the midst of the players and thumped their chest at the camera before strolling off-screen. What surprised the researchers was that fully half the test subjects were so busy counting the number of basketball passes that they never saw the gorilla.

The experiment became a viral sensation—helped by the amusing paper title, “Gorillas in Our Midst”—and snagged Simons and Chabris the 2004 Ig Nobel Psychology Prize. It also became the basis of their bestselling 2010 book, The Invisible Gorilla: How Our Intuitions Deceive Us. Thirteen years later, the two psychologists are back with their latest book, published last July, called Nobody’s Fool: Why We Get Taken In and What We Can Do About It. Simons and Chabris have penned an entertaining examination of key habits of thinking that usually serve us well but also make us vulnerable to cons and scams. They also offer some practical tools based on cognitive science to help us spot deceptions before being taken in.

“People love reading about cons, yet they keep happening,” Simons told Ars. “Why do they keep happening? What is it those cons are tapping into? Why do we not learn from reading about Theranos? We realized there was a set of cognitive principles that seemed to apply across all of the domains, from cheating in sports and chess to cheating in finance and biotech. That became our organizing theme.”

Ars spoke with Simons and Chabris to learn more.

Ars Technica: I was surprised to learn that people still fall for basic scams like the Nigerian Prince scam. It reminds me of Fox Mulder’s poster on The X-Files: “I want to believe.”

Daniel Simons: The Nigerian Prince scam is an interesting one because it’s been around forever. Its original form was in letters. Most people don’t get fooled by that one. The vast majority of people look at it and say, this thing is written in terrible grammar. It’s a mess. And why would anybody believe that they’re the one to recover this vast fortune? So there are some people who fall for it, but it’s a tiny percentage of people. I think it’s still illustrative because that one is obviously too good to be true for most people, but there’s some small subset of people for whom it’s just good enough. It’s just appealing enough to say, “Oh yeah, maybe I could become rich.”

There was a profile in the New Yorker of a clinical psychologist who fell for it. There are people who, for whatever reason, are either desperate or have the idea that they deserve to inherit a lot of money. But there are a lot of scams that are much less obvious than that one, selecting for the people who are most naive about it. I think the key insight there is that we tend to assume that only gullible people fall for this stuff. That is fundamentally wrong. We all fall for this stuff if it’s framed in the right way.

Christopher Chabris: I don’t think they’re necessarily people who always want to believe. I think it really depends on the situation. Some people might want to believe that they can strike it rich in crypto, but they would never fall for a Nigerian email or, for that matter, they might not fall for a traditional Ponzi scheme because they don’t believe in fiat money or the stock market. Going back to the Invisible Gorilla, one thing we noticed was a lot of people would ask us, “What’s the difference between the people who noticed the gorilla and the people who didn’t notice the gorilla?” The answer is, well, some of them happened to notice it and some of them didn’t. It’s not an IQ or personality test. So in the case of the Nigerian email, there might’ve been something going on in that guy’s life at that moment when he got that email that maybe led him to initially accept the premise as true, even though he knew it seemed kind of weird. Then, he got committed to the idea once he started interacting with these people.

So one of our principles is commitment: the idea that if you accept something as true and you don’t question it anymore, then all kinds of bad decisions and bad outcomes can flow from that. So, if you somehow actually get convinced that these guys in Nigeria are real, that can explain the bad decisions you make after that. I think there’s a lot of unpredictability to it. We all need to understand how these things work. We might think it sounds crazy and we would never fall for it, but we might if it were a different scam at a different time.

Ars Technica: Scientists are prone to think that they cannot be fooled. You wrote about the Hendrik Schön case of fraud in physics, where he claimed, for instance, that he could turn organic semiconductors into superconductors and had built a molecular-scale transistor. One would think it should have been caught, but so many institutional assumptions and biases crept in.

Daniel Simons: All scams, in hindsight, look obvious. I mean, Diederik Stapel got away with publishing 50-some fraudulent papers in social psychology—that we know of. The Schön case was interesting because the findings were so radical: They didn’t fit what everybody was expecting, but they were what a lot of people hoped for. I think that’s part of what’s driving this. A lot of scientists are less critical of things that fit in with their expectations and are just a little bit better than what other people produce.

The Stapel findings weren’t earth-shattering in the context of the theories that he was working within. They weren’t especially clever. They were sometimes methodologically challenging for other people to do, but they weren’t breaking entirely new ground. The contrast to that would be Daryl Bem, who published a well-known ESP paper. That one, almost nobody believed from the outset because it was a step too far. People went in not believing ESP. So they looked for anything they could do to tear down a finding of psychic phenomena. When it fits with our expectations, we’re less critical than when it doesn’t.

Christopher Chabris: Psychologists didn’t want to believe in ESP. Perhaps physicists wanted Schön’s work to be valid because it would be so beneficial for society if that technology could be developed. I am fascinated by how Schön was able to publish so many papers in both Science and Nature on the same topic within such a short period of time. Maybe just the existence of that body of “work” provides an answer to a lot of objections. That sometimes happens in psychology, too.

People rarely publish a paper that has just a single study all by itself. Usually, it’s a package of three or four studies using different methodologies, and a package like that helps readers drop their guard about whether the findings are real. But it’s a well-known publication strategy in psychology. To get into certain journals, you pretty much have to do it. So a con artist, not to name anyone in particular, would of course learn that and figure out how to craft exactly the right package. At that point, the fact that there are three or four studies no longer has any merit as evidence that it’s a real phenomenon. Maybe that was part of what happened in the Schön case as well.

Daniel Simons: The groups that feel like they can’t be fooled—researchers, scientists, and skeptics—have a tendency to assume, “I’m a cynical person, I’m a contrarian, I’m going to challenge everything.” But if you give them something that fits what they’re expecting and feels sufficiently contrarian, they don’t question it. If you talk to magicians, they’ll say the easiest people to fool are often scientists and skeptics, because they’ll lock onto the first method the magician suggests they might be using. You introduce something that seems like an obvious explanation for what you’re doing, they latch onto it right away, and they never let it go. By the time they realize they’re wrong, it’s too late. Magicians have a harder time fooling kids, who aren’t paying attention to the banter and the cues, but skeptics and professors are easy.

We can’t be skeptics all the time. It doesn’t make sense, because the vast majority of the time, nobody’s trying to deceive us. It’s a relatively rare event. And even if they are, the vast majority of the time it’s not consequential. It’s not a big deal if you get a little bit misled. Having to second-guess everything anybody says would be disastrous for our ability to live in a community and talk to other people.

Christopher Chabris: If you follow contrarians on social media, they sound right at first because they had a good contrarian take, and they seem smart because they saw something other people didn’t. But contrarianism might be one of the most overrated signs of intelligence or cleverness, because if you follow them long enough, you see that they’re contrarian about all kinds of things, and those things can’t all be right. Your contrarian takes are popular at first, so you keep generating them, and then collectively they stop making sense. That’s their way of being fooled; they’ve been fooled at a meta level. Believing that the official story is usually wrong is itself a mistaken belief.

Daniel Simons: And it can lead you to conspiratorial beliefs.

Christopher Chabris: You can see some contrarians—I’m not going to name names, but you can find them on social media. They’ve stumbled down into that place where they no longer seem to be saying anything. They’re talking to the people who already believe them. So they’ve now become a closed system with their audience.

Ars Technica: Perhaps people don’t want to admit they’ve been fooled, so they end up doubling and tripling down, trapped in a feedback loop of disinformation. From there, it’s just a short step into conspiracy theory.

Daniel Simons: We talked a little bit about cults in the book, just briefly, and I think one of the important elements there is that it really comes down to commitment. Once people accept an initial premise, they might be arguing completely logically from it. They just won’t abandon that premise, and they get so far from it that they don’t realize it underlies all of the logical choices they’ve made up to that point.

So if, for example, you believe that [Donald] Trump is infallible and never lies, you have to run in all sorts of circles to make his contradictory statements seem coherent. But if you take that as an unquestioned commitment, then logically, a lot of the stuff that’s down that rabbit hole holds together. It’s just that the premise is wrong, and it’s hard to get somebody to come back up out of that rabbit hole. They don’t see themselves as wrong because that premise is unquestionable. Once you’re in that mode, somebody on the outside can look at it and say, “That’s nuts.” But from the inside, it doesn’t look that way. It’s the same with cults.

Christopher Chabris: The question is how we get people out of these rabbit holes. We talk more about how they got there and how not to get there yourself; for the getting-out part, I’d recommend David McRaney’s book, How Minds Change. He really goes into the process of how people change their minds, and a lot of it does come back to this concept of commitment. What information might get you to change your mind about that? What leads you to believe that? What follows from that? He calls it “street epistemology.” You have to really go through the web of beliefs, how they’re connected, and the evidence and logic supporting them. A little examination of one’s own thought processes about this stuff might help.

Ars Technica: This ties into the media ecosystem and how easy it is these days to choose only those sources that reinforce our beliefs and dismiss anything that challenges them as “fake news.” I know we all have confirmation bias. We all feel good when something reinforces our beliefs. But at some point, shouldn’t some rational common sense kick in?

Daniel Simons: It’s hard because once you’re immersed in that, and you’re committed to the idea that one channel is telling you the truth and the other channel is lying, then you end up in a really tough spot. This is an equal opportunity thing. We all get pulled into our media circles and don’t break out of them. Once you’ve committed to the idea that this source is trustworthy and this other source is not, it’s really hard to break out of that. And, of course, the networks do their best to convey that. Fox does everything it can to make sure that people think CNN is not trustworthy. And MSNBC does the same to Fox. That’s not surprising because it’s a competitive landscape.

Christopher Chabris: I think it’s really a battle where a lot of one’s natural impulses tug in the wrong direction, as you said. We’re all doing that; it’s just that some of us are picking sources that are more connected to reality than others. One good heuristic, I think, is to pay attention to the credentialed experts. Sure, they’re not always right; sometimes they’re massively wrong, and we have to acknowledge that experts have sometimes held a consensus that turned out to be wrong. But it’s hard to do better than paying attention to them. The odds that any one individual, or any one anonymous online pretend “spy,” is going to clue you in are much lower than the odds that the experts are right.

Daniel Simons: There’s one trick that I think helps sometimes, which is to anonymize who you’re hearing something from. This is the principle of familiarity: We tend to be less skeptical of things that are familiar. In the past, familiar meant family, friends, and people you’d known a long time; if they steered you wrong all the time, you stopped trusting them. So it makes sense to trust things that are familiar. But sometimes we do that at the celebrity level. If RFK Jr. were not named Kennedy, nobody would believe most of his nonsense, because it really sounds crazy. If you just put him on a street corner saying this stuff, nobody would believe him; it would just be bizarre. But because he’s got that name attached, there’s a halo effect: he’s familiar. So people tend to listen to him expound on things that he actually knows nothing whatsoever about.

Ars Technica: Out of all the tips you offer in your book, what would each of you choose as the single most important thing we can do to avoid being fooled?

Daniel Simons: It depends on the scale of what you’re talking about for being fooled. One way to do it is just to ask a simple question: “Am I missing something? Is this really true?” But the question I tend to like is, “If somebody were scamming me, what would they do?” It puts you in the position of a con artist. If I were buying a new car and the dealer were going to try to mislead me, what would they do? If you’re doing something that’s potentially high cost or high risk, they’re going to go to much greater lengths. If you go to the grocery store and you think, “How would the grocery store cheat me?”—well, they might inflate their prices a bit on some of the products between the shelf and the register. It’s not a big deal. But if you’re investing your life savings and you’re picking some independent financial advisor instead of a giant bank, they could go a long way toward taking all your money. How would they do it? Well, they’d mislead you about their returns. They’d set up fake information.

Christopher Chabris: Dan began his answer with “It depends,” so you know he’s more credible than I’m going to be right now, because I’m going to give you an answer that doesn’t depend. The most important thing to think about is: what are they trying to get me to pay attention to, and is there something else I’m not paying attention to that could be really important? That theme recurs in a lot of the cons we talked about—from psychics rattling off a lot of stuff really quickly, hoping you’ll forget the things that didn’t work and only pay attention to the ones that did, to long cons where people call a particular person and try to convince them to wire money, working on them week after week after week. Even in those cases, there’s a tendency to become focused on what they’re telling you and to stop thinking about what other facts might be relevant to the situation.

Some guys pretending to be the Minister of Defense in France scammed millions and millions of euros out of people by telling them the money was needed to rescue French hostages being held in Syria by ISIS. Nobody looked very carefully into the obvious questions: if all these French hostages were being held, how come nobody else knew about it? Why couldn’t they call the French defense ministry back and reach this guy? They didn’t break outside the channel even a little bit. It’s kind of like what the Nigerian Prince wants you to do: just interact with them, don’t think about anything else, just think about their story. So I would say: think about what they’re not showing you.
