Here’s an interesting article that suggests why people tend to give equal weight to two sides of an argument even when the two sides have very different levels of quality and expertise:
- And here’s the text:
- Energy and Environment
- The science of protecting people’s feelings: why we pretend all opinions are equal
By Chris Mooney March 10
It’s both the coolest — and also in some ways the most depressing — psychology study ever.
Indeed, it’s so cool (and so depressing) that the name of its chief finding — the Dunning-Kruger effect — has at least halfway filtered into public consciousness. In the classic 1999 paper, Cornell researchers David Dunning and Justin Kruger found that the less competent people were in three domains — humor, logic, and grammar — the less likely they were to be able to recognize that. Or as the researchers put it:
We propose that those with limited knowledge in a domain suffer from a dual burden: Not only do they reach mistaken conclusions and make regrettable errors, but their incompetence robs them of the ability to realize it.
Dunning and Kruger didn’t directly apply this insight to our debates about science. But I would argue that the effect named after them certainly helps to explain phenomena like vaccine denial, in which medical authorities have voiced a very strong opinion, but some parents just keep on thinking that, somehow, they’re in a position to challenge or ignore this view.
So why do I bring this classic study up now?
The reason is that an important successor to the Dunning-Kruger paper has just come out — and it, too, is pretty depressing (at least for those of us who believe that domain expertise is a thing to be respected and, indeed, treasured). This time around, psychologists have not uncovered an endless spiral of incompetence and the inability to perceive it. Rather, they’ve shown that people have an “equality bias” when it comes to competence or expertise, such that even when it’s very clear that one person in a group is more skilled, expert, or competent (and the other less), they are nonetheless inclined to seek out a middle ground in determining how correct different viewpoints are.
Yes, that’s right — we’re all right, nobody’s wrong, and nobody gets hurt feelings.
The new study, just published in the Proceedings of the National Academy of Sciences, is by Ali Mahmoodi of the University of Tehran and a long list of colleagues from universities in the UK, Germany, China, Denmark, and the United States. And no wonder: The research was transnational, and the same experiment — with the same basic results — was carried out across cultures in China, Denmark, and Iran.
In the experiment (described in further detail in this previous paper), two separate people view two successive images, which are almost exactly the same, but not quite. In one of the images, there is an “oddball target” that looks slightly different. The images flash by very fast, and the two individuals have to decide which one, the first or the second, contained the target.
Sounds simple enough — but the two individuals didn’t merely have to identify the target. They also had to agree. Each member of the pair — the scientists wonkily call it a “dyad” — separately indicated which of the images contained the target, and how confident they were about that. Then, if there was a disagreement, one individual was chosen at random to decide what the right answer was – and thus, who was right and who was wrong. And then, both individuals learned the truth about whether their group decision had been the correct one or not.
This went on for 256 intervals, so the two individuals got to know each other quite well — and to know one another’s accuracy and skill quite well. Thus, if one member of the group was better than the other, both would pretty clearly notice. And a rational decision, you might think, would be for the less accurate group member to begin to favor the views of the more accurate one — and for the accurate one to favor his or her own assessments.
But that’s not what happened. Instead, report the study authors, “the worse members of each dyad underweighted their partner’s opinion (i.e., assigned less weight to their partner’s opinion than recommended by the optimal model), whereas the better members of each dyad overweighted their partner’s opinion.” Or to put it more bluntly, individuals tended to act “as if they were as good or as bad as their partner” — even when they quite obviously weren’t.
The researchers tried several variations on the experiment, and this “equality bias” didn’t go away. In one case, a “running score” reminded both members of the pair who was faring better (and who worse) at identifying the target — just in case it wasn’t obvious enough already. In another case, the task became much more difficult for one group member than the other, leading to a bigger gap in scores — accentuating differences in performance. And finally, in a third variant, actual money was offered for getting it right.
None of this did away with the “equality bias.”
So why do we do this? The authors, not surprisingly, point to the incredible power of human groups, and our dependence upon remaining members of them in good standing:
By confirming themselves more often than they should have, the inferior member of each dyad may have tried to stay relevant and socially included. Conversely, the better performing member may have been trying to avoid ignoring their partner.
Great instincts in general — except, of course, when facts and reality are at stake.
Nobody’s saying we ought to be mean to people, or put them down when they’re wrong — or even that experts always get it right. They don’t.
Still, I think it’s pretty obvious that human groups (especially in the United States) err much more in the direction of giving everybody a say than in the direction of deferring too much to experts. And that’s quite obviously harmful on any number of issues, especially in science, where what experts know really matters and lives, or the world itself, depend on it — like vaccinations or climate change.
The new research underscores this conclusion — that we need to recognize experts more, respect them, and listen to them. It also shows how our evolution in social groups binds us powerfully together and enforces collective norms, yet can go haywire when it comes to recognizing and accepting inconvenient truths.
Chris Mooney reports on science and the environment.
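Before moving on: the study’s point about the “optimal model” can be made concrete with a toy calculation. This is a minimal sketch with assumed individual accuracies (0.8 and 0.6 are illustrative numbers, not the study’s data): it compares a dyad that splits the difference on disagreements against one that defers to its better member.

```python
# Toy model of the dyad task: compare "equality bias" (a coin flip on
# disagreements) against always deferring to the better observer.
# The accuracies below are illustrative assumptions, not the study's data.

def dyad_accuracy(p_better: float, p_worse: float, w_better: float) -> float:
    """Probability the dyad's joint decision is correct when, on a
    disagreement, the better member's answer wins with weight w_better."""
    agree_correct = p_better * p_worse                  # both members right
    better_right_alone = p_better * (1 - p_worse)       # only the better one right
    worse_right_alone = (1 - p_better) * p_worse        # only the worse one right
    # On disagreement, the dyad is correct only if it sides with whichever
    # member happened to be right on that trial.
    return (agree_correct
            + w_better * better_right_alone
            + (1 - w_better) * worse_right_alone)

p_better, p_worse = 0.8, 0.6   # assumed individual accuracies

equality = dyad_accuracy(p_better, p_worse, 0.5)  # equality bias: 50/50 split
defer = dyad_accuracy(p_better, p_worse, 1.0)     # always follow the better member

print(f"equality-weighted dyad: {equality:.2f}")  # 0.70
print(f"defer to better member: {defer:.2f}")     # 0.80
```

Note what the numbers show: under equality weighting the dyad (0.70) actually performs worse than its better member alone (0.80), which is exactly why weighting opinions by demonstrated competence, rather than splitting the difference, is the rational strategy.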
The upshot, it appears, is that social concerns can often trump real-world concerns: if our social group has one belief and some few academics, experts, or professionals have another, we have evolved in such a way that the social group matters more. I imagine this means that, for much of our history and pre-history, being on the outs with our social group had the more immediate and dire consequences. Opting to be ‘sheeple’ has been a matter of survival.
This – what would you call it, principle? Function, tendency? – this useful-for-a-social-animal adaptation would appear to have the negative effect that, when the whole group is wrong about something, then wrong the group shall be, and for a long time. It takes something big to effect a change in a group mindset. Perhaps some smallish, tribal group is absorbed into a larger group like a modern nation, and the smaller group’s erroneous belief is rendered an outlying one in the new, larger group – though only if they really do join the new nation, if the culture is absorbed, by choice or by force. Nothing changes their minds if they remain obstinate and insular, which then turns the conversation to motivation: many and varied are the reasons for a group to close ranks and remain distinct, and the maintenance of such beliefs may be a big one. Or maybe the group’s erroneous belief gets proved false in a catastrophic way, one that is undeniable, and the group must face reality to survive. Perhaps the volcano erupts and burns our whole island down despite all the virgins we sacrificed to it, something like that.
Some might think that this is what it would take to change the minds of the groups mentioned in the article, the global warming deniers and the anti-vaccination people. Let’s hope not, of course. Sometimes we’d rather have been wrong.
Something the article only grazes with its mention of global warming is a slightly larger issue: the economy vs. the environment. I would think that one definition of the expression ‘the economy’ could be, simply and expansively, ‘the human system of living.’ Perhaps this adaptation of group thinking applies here too, and that may explain why protecting the environment and the Earth’s resources is so often viewed as beside the point, a concern secondary to the primary concern that we all have jobs and that the economy continues to roll on. David Suzuki is fond of pointing out that indigenous peoples find our separation of these two things impossible to understand. ‘What is the economy if the world cannot support the people in it?’ is their question. How can the living Earth somehow not be relevant?
I think what is missing from the aboriginal person’s understanding of the modern, industrialized person’s POV is this: urbanism, industry, and agriculture have allowed people to become the dominant environment. The physical world and nature are a few steps removed. For modern, industrialized people, other people are the environment, the only things we have to interact with to survive, and the only things we will not survive disdaining. Tigers and lightning are not the modern person’s biggest threats – Republicans are.
(That’s a joke, sort of.)
That’s from the geo-political side of things, to be sure, but for those few who’ve read anything from me before, you know I see the world as a fractal sort of thing, the macro matching the microcosm, with the family as the model for society and the world. So, a sharp turn here. For all of us, when we are young, at our most vulnerable and impressionable, the environment we most immediately need to survive is the family. This is where this useful-for-a-social-animal adaptation happens first, in the home, in our nuclear families, and so we make it. This is where we learn all the things that become the larger conversations later in life: we must work, everyone needs a job, thou shalt and thou shalt not, we are here to do God’s Will, which is this and this . . . While the real-world consequences of so many of these concepts are still far beyond our grasp, we learn that what our parents and caregivers tell us, we had better learn, or else. The real-world consequences of these things may be far away, but the immediate social consequences of not learning what our parents teach are right there in front of us (or behind us, as the case may be, on our backsides).
To bring the very interesting article linked above home, and to put it in a less academic, more brutal context, let’s view it this way: we don’t listen to the experts, not because we’ve weighed their theses and found them lacking, but because the experts aren’t likely to hurt us if we don’t. Which gives their views a whole lot less weight than those of our internalized parents, who are the real leaders of our social groups.