The latest episode of the You Are Not So Smart podcast touches on a great example of self-delusion that we haven’t covered here yet. In this episode, David McRaney interviews Jesse Richardson of “Your Logical Fallacy Is” (a site that I am definitely going to have to check out) about the strawman fallacy.
Person A: Soup is delicious.
Person B: I tried soup once. It was terrible. Therefore you are wrong.
ISIS uses some incredibly sophisticated methods that are grounded in solid cognitive science and persuasive design. They are great at framing their narrative in a way that is engaging and convincing. They hit just the right affective buttons. They leverage powerful cognitive heuristics to anchor, confirm, and solidify their legitimacy in the minds of their prospective recruits and to incite action. It is scary just how good they are at it.
“The best thing to speak against recruitment by Isis are the voices of people who were recruited by Isis, understand what the true experience is, have escaped and have come back to tell the truth … Counter-speech to the speech that is perpetuating hate we think by far is the best answer.”
This study in Brain came from a team in the UK and Spain studying the link between risk preference and the nucleus accumbens (NAcc). Even given the limitations in associating brain activity with human behavior that I acknowledged above, there is reasonable evidence that the NAcc is linked to risk preference. This study is remarkable in that it was a controlled study and the participants were blind to the intervention. That is rare in neuropsych studies, where confounds and mediators are hard to control for.
Short-lived phasic electrical stimulation of the region of the nucleus accumbens dynamically altered risk behaviour, transiently shifting the psychometric function towards more risky decisions only for the duration of stimulation. A critical, on-line role of human nucleus accumbens in dynamic risk control is thereby established.
In anticipation of all of your New Year’s Resolutions, I thought I would share with you some new ideas on setting goals.
The first example comes from Jeffrey Davis in the Creativity Post. He calls this a radical alternative, but I think his approach makes perfect sense. First, he warns against using a long time horizon for your goals. Not that long term thinking is bad – in fact it is best. But long term goals are too easy to forget about or put off for later, and they make it even easier for us to delude ourselves with false progress. Instead, he recommends using vision goals that add meaning instead of milestones. Imagine where you want to be in the long term, and then set a goal for what you can do right now to move yourself toward that vision.
There has been a long history of movements in the business, psychology, and human factors communities to help people overcome the natural tendencies in decision making that often lead us astray. You know – what we often refer to as biases but that evolved to help us make fast, frugal decisions in the muddy context we call the real world.
If you believe this article in Harvard Business Review, a team of researchers led by Carey Morewedge at Boston University may have discovered a viable approach. They used a serious game to train participants in intelligence analysis.
Today I am coming full circle. I found a report that applies human factors to philanthropic management, so I am going to apply HF to philanthropic management and then bring the lessons back to HF. Or something like that.
many of our decisions rely on mental shortcuts or “cognitive traps,” which can lead us to make uninformed or even bad decisions. Shortcuts provide time-pressured staff with simple ways of making decisions and managing complex strategies that play out in an uncertain world. These shortcuts affect how we access information, what information we pay attention to, what we learn, and whether and how we apply what we learn. Like all organizations, foundations and the people who work in them are subject to these same traps.
If you are not familiar with the term Surprise Validator, I am sure the concept will resonate with you. Cass Sunstein described it during an interview on NPR where he was promoting his book.
The Halo Effect is one of my favorite illustrations of self-delusion. In part because it is so common. But more because it is just so irrational most of the time. The basic idea of the halo effect is that when decisions or evaluations are difficult, instead of focusing on the most important or most diagnostic attributes, we focus on the one(s) that are easiest to evaluate…
In this season of New Year’s resolutions, Jayashri Kulkarni from Monash University has some useful insights for us to keep in mind.
In my patient’s case, unfortunately, I suspect her New Year’s resolution provided her with the opportunity to procrastinate. Despite comprehensive development of a smoking cessation plan, and extensive knowledge about the dangers to her health, she just didn’t want to give up smoking.
Springwise has been reporting on a variety of vending machines that constrain user behavior for their own good.
Businesses often stand by the motto ‘the customer is always right’ — but are they? We’ve already seen a few services that deny consumers what they want based on their personal info…