Dr. Ken Broda-Bahm | Holland & Hart
In my opinion, it is one of the most interesting and important areas of social science at the moment. And if it’s not that, then it’s certainly the sassiest. A group of researchers has been focused on our susceptibility and resistance to various forms of bad information: disinformation, misinformation, rumors, bald claims, conspiracy theories, and fake news. And I can just picture one of the researchers plaintively raising the question, “Can’t we just call it ‘bullshit?’” Well, they decided that they could, so now we have peer-reviewed scholarly articles on bullshit influence, persistence, and vulnerability. We even have the sine qua non of academic tools — a validated psychometric measure called the “Bullshit Receptivity” Scale.
While it won’t be a good look to be applying that scale to potential jurors as they come into court, the overarching concept and research findings are relevant to litigation, since they measure how gullible people can be when presented with information that lacks clear meaning or foundation. The question can be particularly important when it comes to expert testimony: you have a situation where there is complex information, as well as economic motive for a particular answer, so there’s a real chance that your jurors just might be presented with some bullshit. Researchers have found that, while susceptibility is significant, people can often counter this bad information through reflection. When they have to try to explain in their own words why a bullshit idea is actually valid, they become significantly less likely to support it. So, you would think that group deliberation, where individuals are called on to explain and defend their positions, would be the ideal setting for countering testimony that lacks clear meaning or foundation. And in many circumstances it is. But based on a new study, there seems to be a unique susceptibility when it comes to expert testimony. In this post, I’ll explain those study results and share a few implications for countering a bullshitting expert on the other side.
The Research: Expert Bullshit Is Different From Other Bullshit
Defining bullshit as “information constructed with a carefree indifference for conveying truth, accuracy, clarity, or meaning that is often used to impress, persuade, or otherwise mislead others,” the study (Littrell, Meyers & Fugelsang, 2022) tested susceptibility to various forms of bullshit, including pseudo-profound statements and fake news headlines, as well as scientific statements that were either anonymous or sourced to experts. Specifically, they looked at whether reflection (e.g., being asked to “describe in detail why the statement below is or is not true”) could be a cure. They found that while reflection can reduce the effects of fake profundity and fake news, there is a blind spot when it comes to expert opinion.
This suggests that we cannot expect jurors to reason their way out of questionable scientific testimony in the same way they can deliberately work through other kinds of questionable information. The reason that statements from perceived experts seem to work differently comes down to something the researchers call the guru effect: “People often perceive bullshit statements from purported experts as more meaningful and convincing than bullshit attributed to anonymous sources.” To some extent, the research participants also seem to outsource responsibility for the explanation from themselves to that expert. As the researchers note, “failing to generate an explanation for how something works makes individuals doubt the knowledge they possess, but not the knowledge others possess.”
The Implications: Protect and Empower Your Jurors
A good trial attorney, of course, wouldn’t expect jurors to reason their way independently to a discovery of the problems with an opposing expert. That litigator has other tools — namely cross-examination and opposing experts — that were not available in the research setting. At the same time, both courts and attorneys have good reason to be sensitive to the particular risk that comes from a bad expert.
An Additional Reason for Daubert
The classic response to the possibility of bad expert evidence has been that jurors are the gatekeepers. Now, however, in a post-Daubert climate, judges are often the ones deciding whether proposed testimony has scientific merit that can be applied to the facts of the case. This research suggests that there is good reason for that shift. While the kinds of reflection promoted by deliberation are very valuable across a wide array of common-sense determinations, these collective tools can fail when jurors are outside the realm of their own experience and understanding. For those who cite social science to the court, this study might add to your motion against the other side’s bad expert.
An Additional Reason to Unpack the ‘Why’ and to Be the Better Teacher
The research finding also underscores the intuition that expert witness testimony should never just be presented. It should be taught. The more jurors outsource their judgment to someone else, the more susceptible they are to bad information. But the more they come to understand the process, the steps, and the reasons underlying an expert’s conclusions, the more they can appreciate and use your expert’s counter. This “show your work” emphasis should be a reminder to your testifying expert that they are not there simply to be an “authority.” They are there to be the better teacher. Ultimately, it is as much about being clear, concrete, and engaging as it is about being right.