Some quotes from "Giving Debiasing Away: Can Psychological Research on Correcting Cognitive Errors Promote Human Welfare?" by Scott O. Lilienfeld, Rachel Ammirati, and Kristin Landfield.

I was happy to learn that Lilienfeld is part of the neorational movement! I have already read one of his books, 50 Great Myths of Popular Psychology: Shattering Widespread Misconceptions about Human Behavior, and plan to read another: Science and Pseudoscience in Clinical Psychology.

 

ABSTRACT—Despite Miller's (1969) now-famous clarion call to "give psychology away" to the general public, scientific psychology has done relatively little to combat festering problems of ideological extremism and both inter- and intragroup conflict. After proposing that ideological extremism is a significant contributor to world conflict and that confirmation bias and several related biases are significant contributors to ideological extremism, we raise a crucial scientific question: Can debiasing the general public against such biases promote human welfare by tempering ideological extremism? We review the knowns and unknowns of debiasing techniques against confirmation bias, examine potential barriers to their real-world efficacy, and delineate future directions for research on debiasing. We argue that research on combating extreme confirmation bias should be among psychological science's most pressing priorities.

Second, the term bias blind spot (Pronin, Gilovich, & Ross, 2004), more informally called the "not me fallacy" (Felson, 2002), refers to the belief that others are biased but that we are not. Research shows that people readily recognize confirmation bias and related biases in others, but not in themselves (Pronin et al., 2004). The bias blind spot, which we can think of as a "meta-bias," leads us to believe that only others, not ourselves, interpret evidence in a distorted fashion.

Second, many individuals may be unreceptive to debiasing efforts because they do not perceive these efforts as relevant to their personal welfare. Research suggests that at least some cognitive biases may be reduced by enhancing participants' motivation to examine evidence thoughtfully (e.g., by increasing their accountability to others), thereby promoting less perfunctory processing of information (Arkes, 1991; Tetlock & Kim, 1987). Therefore, some debiasing efforts may succeed only if participants can be persuaded that their biases result in poor decisions of real-world consequence to them.

 

Surely this is correct.

Fifth, researchers must be cognizant of the possibility that efforts to combat confirmation bias may occasionally backfire (Wilson, Centerbar, & Brekke, 2002). Researchers have observed a backfire effect in the literature on hindsight bias (Sanna, Schwarz, & Stocker, 2002), in which asking participants to generate many alternative outcomes for an event paradoxically increases their certainty that the original outcome was inevitable. This effect may arise because participants asked to think of numerous alternative outcomes find doing so difficult, leading them (by means of the availability heuristic; Tversky & Kahneman, 1973) to conclude that there weren't so many alternative outcomes after all. Whether similar backfire effects could result from efforts to debias participants against confirmation bias by encouraging them to consider alternative viewpoints is unclear. Moreover, because research on attitude inoculation (McGuire, 1962) suggests that exposure to weak versions of arguments may actually immunize people against these arguments, exposing people to alternative positions may be effective only to the extent that these arguments are presented persuasively.
