16 Comments
Jul 1, 2021 · Liked by Will Wilkinson

Because I work in an area (litigation) populated with paid and hired experts of varying levels of credibility, I would add that any kind of bullshit detector should incorporate a coherent understanding of the speaker's motivations. Living in a world of increasingly targeted advertising, I default to the assumption that most people calling, emailing, or popping up in my feed are trying to sell me something, and that it is unlikely to be something I want or need. It's not always true, but it's a decent first position.

Industry analysts are offering plausible arguments to protect their industry. Grant applicants are protecting their future grants and talking up their interesting but not-too-out-there research. The exception to the rule of trusting top experts is that there are lots of niche topics where the leading experts are captured or biased by the dynamics of their field, and outsiders really can identify things they are missing.

There is some kind of minimal rationality bar (Occam's Razor?) that these critiques need to pass, and I don't know that I have a more precise way to articulate the difference between Nate Silver posing questions about epidemiology and a random conspiracy theorist than to say that when they show their work one is vastly more persuasive to me than the other.

Jul 1, 2021 · Liked by Will Wilkinson

I thoroughly enjoyed this piece, but I want to flag two additional challenges to Wilkinson's prescriptions:

1. If you try to adhere to the consensus of most disciplines you will be right a substantial majority of the time. But you also guarantee that you'll be wrong a certain fraction of the time, because consensus does sometimes go through a revolution. And these occasional revolutions provide ammunition for those who make sweeping declarations about the fallibility of experts. In a nutshell, every crank arguing for a dissenting position claims his or her theory is the next heliocentrism or plate tectonics.

2. It can be hard for people to determine the consensus position of a discipline. I think climate change is a good example of this. There exists an entire industry of faux-experts whose purpose is to shatter the notion of consensus. In some cases, these experts have genuine credentials, publications, and academic standing. They are just outliers. Convincing people not only to trust the experts, but to remain skeptical of experts deviating from the consensus, is a true challenge.


One further point I would add is that people working within a specialty have an incentive to take a somewhat radical and distinctive view on that topic. You don't make a name for yourself as a specialist by saying "everything we thought ten years ago is totally right", but rather by saying "here's the one weird thing that you all got wrong over the past ten years".

In my experience, you want to trust experts, but not the very most specialized experts - rather, you want to trust the experts in immediately adjacent fields, who can see all the weird things the new stars of that field are proposing, and have some sense about which weird thing is most promising, and where the previous generation's consensus is probably still ok.

(This probably depends a lot on how precisely you zoom in. For instance, "climate science" is very broad compared to the effect I'm mentioning, but for a specialty like "influences of the Pacific Ocean heat exchange on North Atlantic hurricane season", you might do better to trust climate scientists who don't work specifically on this than the specialists themselves.)


Interesting argument. Academia creates powerful incentives for hyper-specialization, so I suspect it will become harder over time to find people with the goldilocks depth of specialization in some areas.


Oh, I don't think there's a goldilocks depth of specialization - I think you want to ask hyper-specialists in some specialty that is a goldilocks distance from the topic in question. A hyper-specialist too close to the topic in question is buffeted by all the incentives and fashions and novelty and so on. A hyper-specialist too far from the topic is not able to properly sift through all that. But a hyper-specialist somewhere nearby, and not directly involved, can hopefully do a better job than either.


Love it! (Or at least most of it. Having studied critical theory (at the U of Iowa no less!), I find that its approach to the question of truth means I can’t weight CRT proponents’ statements particularly heavily. CRT is a nice tool in one’s toolbox but a really bad totalizing ideology. But I recognize that you add this line only to annoy.)

I wonder if you’re missing something about the rationalist community, though. It seems to me that, if you accept that their methods actually help one evaluate evidence more accurately, then one can figure out whom to trust by seeing who hews closest to these practices when presenting evidence. What’s more, by trying to spread these norms, we can make bootstrapping our way to an accurate model easier by upgrading the quality of the experts. If I’m listening to two supposed experts talk about how to deal with a pandemic, and one is using rationalistic assessments of the evidence while the other is using emotive appeals, I am likelier to trust the first one, because I trust their decision-making processes more.

I’m not entirely sure the rationalist movement succeeds in all this, mind you. But I think it’s far from useless.


I think this proposed solution just pushes the problem back one level. Will mentions that one does better by trusting the right people than by following some canons of rationality oneself. You suggest that trusting the right people involves finding out who is following the canons of rationality, but one might think it's even better to trust people who trust the right people themselves, rather than trusting people who follow the canons of rationality themselves.

Probably the right way to approach this is to find some sort of mix of trusting the right people and following canons of rationality, with the balance depending on one's own areas of expertise, but the mix needs to persist at every level of the hierarchy, not just at the first level.


Yeah--I think that's what I was trying to say, but you put it very well, thanks. :)


Careful. Overwrought concern about “CRT” is a gateway drug to the QAnon wormhole.

Jul 1, 2021 · Liked by Will Wilkinson

Oh, I know! I try to work my CRT concerns carefully without going over the top. A pomo and abstruse ideology is not the threat to liberalism that bloviating conservatives want it to be.

But I reserve the right to think it's wrong, even if it's not some existential intellectual danger.


I think the criticism you give here of the online "rationalist" community is equally applicable to most of the history of epistemology in academic philosophy. All the way from Descartes up to the 1980s, the majority of epistemology focused on the question of what it takes for an individual to be justified in believing something. Even when philosophers addressed the epistemology of testimony (as people like Hume, Reid, and even Spinoza sometimes did), it was from the point of view of the individual receiving the testimony, and whether that individual needs evidence to justify relying on testimony or can trust it by default.

But starting at least with John Hardwig's paper "Epistemic Dependence", and in a much broader sense in the past decade, there's been a big rise in the study of social epistemology, in all sorts of ways, which is absolutely an important thing.


I agree with your epistemology points, except that they only get us so far. What if the whole system is biased? What if, say, we lived in the USSR during the Cold War and all the experts preached communism? Or what if we are currently being fed a too-capitalist economic picture by the consensus of economists and the tankies are right? How would I know if anarcho-capitalists are right? These are hard questions. My heuristic is generally:

1. Be very skeptical of any claim that goes against a large academic consensus (I know global warming is true because it's what almost all climate scientists assert)

2. Be very skeptical of any claim that requires a large number of government employees to keep a secret

3. Read and understand the topics of interest as much as possible

4. I like the Bryan Caplan approach of having people make predictions and putting more weight on those who get more right than wrong (see the sketch after this list)

5. Read broadly
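
A minimal sketch of the weighting idea in point 4, in Python. Everything here is hypothetical - the forecaster names, track records, and simple accuracy-based weights are invented purely to illustrate the heuristic, not to describe any method Caplan actually uses.

```python
# Toy illustration: weight each commentator's current forecast by their
# historical accuracy, then pool the forecasts. All names and numbers
# are hypothetical.

def accuracy(record):
    """Fraction of past predictions that came true (1 = right, 0 = wrong)."""
    return sum(record) / len(record)

# Each entry: (past prediction results, probability the forecaster
# currently assigns to some claim of interest)
experts = {
    "forecaster_a": ([1, 1, 0, 1, 1], 0.80),
    "forecaster_b": ([0, 1, 0, 0, 1], 0.30),
}

# Pool the current forecasts, weighted by track record.
total_weight = sum(accuracy(rec) for rec, _ in experts.values())
pooled = sum(accuracy(rec) * p for rec, p in experts.values()) / total_weight

for name, (rec, p) in experts.items():
    print(f"{name}: accuracy={accuracy(rec):.2f}, forecast={p:.2f}")
print(f"pooled, accuracy-weighted forecast: {pooled:.2f}")
```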

Now granted, not all people have the luxury of this, so I grant that it's difficult and a bit unrealistic for most people. Which is why I think this is more of a bipartisan problem than you allude to in your post. Yes, the right has it worse right now - but this is expected, given that the Republicans are quickly becoming the party of the non-elites. However, the Democrats are going through their own epistemology problems as well, given the BLM side of the party. I'm reminded of a tweet by Wilfred Reilly: "We all want justice for George Floyd, but the OVERALL #BLM narrative isn't substantially less fictional than the overall QAnon narrative. The average "very liberal" believes 1-10K unarmed Black men are killed annually by the police. Last year's actual figure was 18." And this from the party of the elites. How much more blame can we give the non-elite party?


To be honest, Will, I've always thought of you as a bit of an iconoclast...


This is so good.

Way too many people are simply looking to "belong" to something. It has been my experience, in observing humans for 62 years, that if that belonging can be achieved without much thought or analysis, so much the better. Many humans are just not as bright as many of our narratives like to espouse.


You tackle questions here that I've been sort of obsessed with lately, and we've landed roughly in the same place. I like your notion of a "stable equilibrium," and I think what you refer to as "membership in a community that confers status and trust..." is a huge part of what makes the equilibrium stable, though you didn't explicitly connect these two notions here.

My own obsession with this topic led me to Daniel Schmachtenberger, who has been part of a lot of great conversations around what he refers to as the current epistemic crisis. As a solution, he pushes ideas around a discipline he calls "sensemaking," which are in line with your ideas here. Find some of his conversations on YouTube.

Schmachtenberger in turn led me to a scholar named John Vervaeke, who has posited a framework to describe knowledge or knowing, which I've found hugely useful. Specifically, he posits four types of knowledge, four ways to know something:

Procedural Knowing – to know how to enact a procedure, perhaps as a consequence of having practiced it. Riding a bike or baking bread are examples.

Perspectival Knowing – to know based on one’s place in the world, one’s point of view. Someone “knows” what it’s like to be trans, to be a Spanish-speaking immigrant in America, etc.

Propositional Knowing – to know information, logic, or facts via language; mediated knowledge.

Participatory Knowing – to know something somatically, a deep intuitive knowing. Being in a flow state is a manifestation of this.

As a culture, we spend a lot of time arguing in the realm of propositional knowing, but this is the weakest and most abstract kind of knowing. Ironically, it’s also the only form of knowledge we can share (its “shared-ness” is its form). But it’s mediated, so it’s knowing by proxy. Almost by definition, we can’t have certainty with propositional knowing, only degrees of confidence.

You discuss this problem at the end of your piece here. I personally think that some people simply can't tolerate uncertainty and doubt, whereas other people are naturally wary of certainty. The latter are more willing (or more naturally predisposed) to acknowledge how reliant we are on testimony, and the limits of that.

I think of propositional knowledge as algorithmic, or maybe probabilistic. I use the same kinds of heuristics you mention to decide whether something (some testimony) is true (a toy version of this follows the list):

Consensus and corroboration – Are multiple sources saying the same thing?

Reputation and credentials – Does the person or publication have a good track record? Who are they relying on in terms of experts and authorities?

Falsification – We should explicitly consider as many counter-propositions as we can think of and try to falsify the original proposition with them, and try to falsify each of these as well.
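
Since I called this algorithmic, here is a toy Python version of how those three heuristics might be combined into a degree of confidence. The function, weights, and inputs are all invented for illustration; real judgment is obviously messier.

```python
# Purely illustrative: treat the three heuristics above as inputs to a
# rough confidence score in [0, 1]. Weights and inputs are made up.

def confidence(corroborating_sources, source_track_record, survived_falsification):
    """corroborating_sources: independent sources saying the same thing
    source_track_record: 0..1 reputation of the person or publication
    survived_falsification: 0..1 fraction of counter-propositions that
    failed to falsify the claim"""
    # Diminishing returns on corroboration: each extra source helps less.
    corroboration = 1 - 0.5 ** corroborating_sources
    # Equal weights, chosen arbitrarily for illustration.
    return (corroboration + source_track_record + survived_falsification) / 3

print(confidence(corroborating_sources=3,
                 source_track_record=0.9,
                 survived_falsification=0.8))  # ~0.86: confident, never certain
```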

But in the end, I also understand that I am not certain.


I appreciate this argument and definitely agree that most people are bad at identifying authoritative sources, which leads to all kinds of mayhem. But expertise is socially constructed, and in the few areas in which I've attempted to become an "expert," I've discovered the process of manufacturing consensus can be very ugly.

I don't know. Read lots, talk to multiple sources, and constructively debate people who disagree with you if it's at all possible. That's the best I can come up with.
