Huemer, agent-centered norms, and motivated reasoning
(This was initially written as a comment, so it was dashed off fairly quickly. Let me know if I made any glaring errors!)
Huemer recently posted this blog post about agent-centered norms.
We can distinguish between a more agent-centric approach to forming our beliefs, in which we give greater epistemic weight to our own judgments than to those of others, and a more agent-neutral approach, in which we assign greater relative weight to the judgments of others (see Huemer’s article for a more precise articulation of the distinction).
According to Huemer, the agent-centric view is correct “at the fundamental level,” but people should function, in practice, in a way that more closely approximates the agent-neutral approach. I have little to say by way of a direct dispute with this claim. Maybe that’s a good way to go about things (though I hesitate to call it “correct,” since I’m an epistemic antirealist and don’t want to give the impression that any epistemic account is correct in some stance-independent way).
The piece is nicely written, and I don’t intend to raise serious objections to the points raised in the post. Instead, I want to focus on the broad social consequences that an explicit emphasis on an agent-centered epistemic approach may have, not as a result of judiciously employing Huemer’s conception of what the “correct” approach is, but as a byproduct of the interaction between the optimal deployment of that approach and the unfortunately motivated nature of human psychology. In short, I am worried about epistemic blank checks.
While prioritizing one’s own judgments but carefully apportioning weight to those of others is probably the most effective way to form an accurate picture of the world, in practice I think that much of what Huemer says may make some people feel that they are giving due deference to the views of others while granting themselves unwitting epistemic veto power. That is, someone can believe that they give due consideration to what others think: “most experts think this,” “my colleague, who specializes in this area, strongly disagrees, so I should adjust my confidence in light of that,” and so on. But in practice, people dupe themselves into thinking they’re giving enough weight to others when in fact they only give “safe weight”: just enough weight that it never actually forces them to override any of their beliefs, or that allows them to strategically sacrifice unimportant beliefs in order to maintain the most important ones.
In other words, insofar as people are equipped with the distinction between agent-centric and agent-neutral norms, they can domesticate the latter by rendering it subordinate to the former, allowing them to turn the dial up on motivated reasoning just enough that it flies under their own radar. The net result is that people’s central beliefs end up being driven, not only at a fundamental level, but in practice, by an almost exclusive concern with agent-centric norms.
In short: I think explicit awareness of the distinction can facilitate a methodological mirage in which people give themselves and others a superficial but false impression of epistemic virtue while merely paying lip service to agent-neutral norms.
However, this is not what troubles me most about the epistemological approach Huemer promotes. Rather, it is the kind of “epistemic culture” that it may, as an unintended consequence, cultivate. I have seen hints of this culture emerge in online debate spaces, and it may also shape the way some subsets of philosophers conduct their work.
What troubles me is this: It is one thing for us to place greater priority on our own judgments than on those of others. We already do that, and perhaps it is reasonable to do so. But it is another to make a point of explicitly stressing that one is justified in doing so. Even if this is true, stressing the point may exacerbate the intellectual siloing that results when the agent-centeredness of our judgments becomes salient.
What I mean is this: the more conscious people become that they and others each place greater weight on their own respective judgments, and that at least some of those judgments are private, publicly inaccessible intuitions or “seemings,” the more intractable disputes will seem and, like a self-fulfilling prophecy, the more intractable they will in fact functionally become.
This is because people may operate as though there is no limit to the amount of epistemic weight one can put on one’s private seemings. Whatever objections you raise to my view, I can always put on my Moore cap and say “here is one seeming, and here is another…” and maintain that your objections, however brilliant, are insufficient to overcome how things seem to me.
I have seen this play out in actual conversations, with some people explicitly referencing Huemer or phenomenal conservatism. A conversation that could have been educational for everyone never gets off the ground, because one person immediately responds to the other that no arguments against their view are going to be more plausible than the view itself.
I worry that the promotion of phenomenal conservatism and appeals to seemings will have a deleterious effect on philosophical discourse. Such discourse is already in a dismal state.
Given the insularity of the discipline, encouraging an even greater reliance on private, inaccessible evidence will further widen the epistemic gap between philosophers and everyone else. Eventually, philosophers will live on an intellectual archipelago, separated from the rest of the academic world, spinning fantastic tales about forms and properties, but there will be nobody there to listen.
This is why I favor an approach that focuses more on human psychology itself, and that draws attention to the aspects of human psychology that impede our ability to think rationally and avoid mistakes, e.g., our dependence on heuristics and biases. While I am critical of much of the research on cognitive biases, I still believe that an approach to epistemology that doesn’t put human psychology at its center is misguided. For instance, much of epistemology should center on our penchant for motivated reasoning and self-deception, and on our responsiveness to social incentives. Any approach that doesn’t put these and other psychological considerations at the forefront is not going to provide as effective a foundation for cultivating good epistemic practices as one that does.