Which expert should you believe?

Steve Simon

2007-08-03

There’s a common saying in research circles that goes something like this: “For every PhD, there is an equal and opposite PhD.” That saying is certainly true in my experience. For just about any scientific controversy, you can find an expert on either side of the issue. Some examples of these controversies are:

Quite often the experts on both sides know far more about the controversy than you or I will ever get a chance to know. So the question becomes: “Which expert should you believe?”

The easy answer is to select the expert who supports the perspective you have believed all along. Go ahead, if you like, because that expert gives you some cover and allows you to continue in your belief. It offers another advantage, too: the time you save when you don’t have to think hard and critically about the issue.

A harder, but equally unsatisfactory, answer is to read more about the issue. Far be it from me to suggest that people do less reading, but there are two problems. First, people tend to read only those sources that confirm what they already believe. Second, many of these controversies involve technical issues that are difficult for you and me to understand.

I would argue that in addition to reading and learning more about the controversy, you need to look at the experts themselves. There are certain things that these experts will say and do that help you to assess their credibility.

These questions are not foolproof, but the more of them you can answer “no,” the more credible the expert.

Has this expert been wrong before? Look at the track record of your expert in other scientific controversies. An expert who has been wrong before might be right this time, but the odds are not in his/her favor. Be especially cautious of an expert who repeatedly sounds false alarms of catastrophes.

Does the expert profit financially from his/her beliefs? Monetary incentives are pervasive in today’s society, so you have to be careful here. But ask whether this person is making an outrageous claim in order to sell more books or attract more patients. Even if that is not his/her only motivation, a financial conflict will often color a person’s perspective and make it difficult for him/her to be dispassionate.

Does the expert rely on personal (ad hominem) attacks? Most of the controversies listed above elicit strong emotions on both sides, but failure to keep those emotions under control is a warning sign. No one can look into the soul of another and discern their true motives, and a personal attack is often a distraction from the facts. Be especially wary of insinuations about the motivations that one expert attributes to the experts on the opposite side. Note that the previous question does ask you to consider motives; it is okay to point out financial influences, but watch out for negative characterizations (greedy, in it only for the money).

Is the expert an “outsider”? While there is some benefit to a fresh perspective, outsiders will often get it wrong. Without formal training in the area of the controversy, outside experts tend to oversimplify the problem at hand. Some would argue that being an outsider is a benefit and that the insiders have a collective blind spot. While collective blind spots are sometimes a problem, they can only occur if the group of insiders is very small. Some will also argue that only an outsider can raise objections, because the insiders are all conspiring to maintain the status quo. I disagree (see Red herring #2, Conspiracy theories).

Is the expert part of a narrowly drawn discipline? While collective blind spots are not common, they may be more likely in a group that is narrowly drawn. There is indeed safety in numbers, especially when the experts are drawn from a wide range of perspectives. Experts who hail from varying disciplines are unlikely to share the same biases and the same conflicts of interest.

Does the expert bypass the peer-reviewed literature? Experts who regularly publish research results in the peer-reviewed literature have earned a high level of respect from their peers. Beware of experts who rely predominantly or solely on non-peer-reviewed outlets. The threshold for publishing a book, for example, is much lower than the threshold for publishing a peer-reviewed article. It might be more accurate to say that books have a different threshold (profitability) than peer-reviewed articles: if you can convince a publisher that you can make them and yourself a lot of money, they will not try hard to find flaws in your reasoning. It is just not in their economic interest to do so. Also beware of experts who use press conferences to disseminate information. While I am reluctant to attribute motives (see above), an excessive reliance on press conferences may be a sign of an expert who is in it for the publicity. One thing is for sure: someone who takes the trouble to get published in the peer-reviewed literature is not likely to be in it for shallow reasons. It is just too darn hard to get a peer-reviewed publication.

There are some red herrings that people tend to focus on, but they don’t really help in gauging the credibility of an expert.

Red herring #1: Intellectual conflict of interest. A lot has been written about intellectual conflicts of interest, but it is all highly speculative and has little data to support it. Is it true, for example, that surgeons are biased towards surgical solutions to disease over non-surgical solutions? It is attractive to postulate such a link, but I have not seen much empirical support for it. Another common claim is that researchers supported by the U.S. Environmental Protection Agency are prone to exaggerate environmental problems because doing so ensures further funding. This is an easy charge to make, and it has a certain resonance: the argument is that the agency wants to expand its budget by exaggerating the degree of environmental problems. Again, there is no evidence that I am aware of that researchers deliberately slant their findings or that the EPA preferentially seeks out researchers who report environmental problems. Another claim is that scientists are reluctant to look for evidence that might overturn what they learned in school. If anything, the evidence points in the opposite direction. Scientists, by their nature, are always questioning and prodding and trying to find a different answer. There is, in fact, a strong personal incentive to be the first to recognize an approach that changes the way science is done.

Red herring #2: Conspiracy theories. Some conspiracies are real, but it takes a lot of energy to conduct and maintain a conspiracy, and the effort grows rapidly as the number of people involved grows. Tom Clancy is fond of quoting a saying in the intelligence community that the probability that a secret will leak is proportional to the square of the number of people who know it. The effort needed to maintain a conspiracy also grows over time. People who are motivated to keep a secret while they work for the group in charge of the conspiracy lose that incentive when they retire. A lengthy time frame also increases the number of people involved, because of the natural turnover in positions of authority within the conspiring organization.
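As a side note, here is one informal way to see where a square-of-the-group-size rule could come from; this is my own back-of-the-envelope sketch, not a derivation from the saying itself. Suppose, purely for illustration, that every pair of people who share the secret is an independent channel that might leak it with some small probability (the 0.001 below is a made-up number). A group of n people contains n(n-1)/2 such pairs, so for small per-pair probabilities the overall chance of a leak grows roughly with the square of n.

```python
# A minimal sketch (not from the article) of why a leak probability might scale
# roughly with the square of the group size: a group of n people has n*(n-1)/2
# pairwise relationships, and each one is assumed here to be an independent
# channel that leaks with a small, made-up probability p_pair.

def leak_probability(n_people: int, p_pair: float = 0.001) -> float:
    """Chance that at least one pairwise channel leaks the secret."""
    channels = n_people * (n_people - 1) // 2
    return 1.0 - (1.0 - p_pair) ** channels

if __name__ == "__main__":
    for n in (2, 5, 10, 20, 50):
        print(f"{n:3d} people -> leak probability about {leak_probability(n):.3f}")
```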

You can find an earlier version of this page on my old website.