When Can I Stop Listening to my Enemy's Points?
Flat Earth will probably not have any luck convincing me.
Creationism
Climate Change Being Something That Exists At All
Astrology
Vaccines Preventing Disease
Flat Earth
5G Causes COVID-19
Moon Landing was Fake
Holocaust was Fake
The Pyramids were Built by Aliens
Geocentrism
…I’m sure you can think of a number of other issues so lopsided that it’s difficult to find any good-faith arguments for the fringe side at all.
(When I use “smart” here, I’m not simply talking about high IQ. Being smart, in the sense I mean, implies being able to discern the quality of an argument; otherwise your beliefs and arguments suck, and what are we doing here! The philosophers I listen to, on Substack and elsewhere, are smart in that second sense. If Bentham meant “no side has a monopoly on high-IQ people,” that’s trivially true, and much less interesting: anyone can pump out convoluted bad arguments for any belief. More discussion in the comments.)
But Bentham’s point in the article was not to defend these issues, but to gesture at closed-minded people who aren’t willing to entertain arguments for veganism, theism, abortion, or other issues where there are, in fact, many intelligent people with reasonable arguments on both sides.
And I think he’s hit on something very important: When do we decide that we’ve seen enough arguments for whichever side we’ve chosen in religion, politics, morality, philosophy, and stop listening? It’s not so simple as “keep an open mind! Value your personal truth over the consensus!” If I write a post defending flat earth, even if I make some interesting arguments you haven’t heard before, I’d forgive you for assuming I’m stupid and leaving my post.
It’s also not so simple as radical skepticism. You should believe, very close to 100%, that the Earth is round! That’s the correct opinion! If you examine an issue like whether raising the minimum wage is good for GDP, read the competing meta-analyses, and decide “Huh, I don’t know which side is better, it’s maybe 70-30, I wish I were more of an expert,” then I think that’s reasonable. But it’s far less reasonable to look at the arguments for flat earth vs round earth and say “Interesting points on both sides, round earth is a bit stronger, I bet it’s 70-30.”
It’s also definitely not so simple as entrenching all the beliefs and biases you have right now. Refusing to change your mind even a little bit has been a failure mode for many a group throughout history. It’s vital to update when new evidence comes out, or when you finally encounter evidence that was already out there; that’s a trivial point. If evidence can’t sway you, then you’re the one living in the post-truth world all the academics like to point at.
So the issue… is a misclassification of issues. There are issues where everyone respectable agrees. There are issues where the consensus swings one way, but many people still disagree. And there are issues where it’s a deadlock. This all gets complicated when you realize that every individual wants to position their chosen belief AS the consensus, because a great way to sway people to your side is for them to think “every respectable person believes X.” Movements on Twitter have used this technique to influence major political parties’ core platforms! Manufacturing apparent consensus is just another underhanded rhetorical technique on social media; it doesn’t tell you which side is actually correct, but it obviously muddies this classification.
If genuinely intelligent people whom you respect think the opposite of you, you should consider that solid evidence against your position. But how do you identify who’s intelligent about the issue? Is there a political bias making them reluctant to tell the truth? And how do you know they ARE intelligent if you judge intelligence by the criterion of “do they agree with my beliefs right now”?
This gets even more interesting when you have the expertise, and you keep seeing the same bad arguments over and over again, but there are still intelligent people on both sides. I left a comment heavily disagreeing with a Substack post about the Sleeping Beauty Problem (I’m a thirder!), and the poster called me stupid and uneducated compared to him, and said I didn’t understand any of his post. Now, being mean to people you disagree with is obviously bad epistemic practice (I could write a whole post about that), but maybe my arguments were bad and he’d seen them 1,000 times! If I make a post about why the Earth is round, and someone replies with the same terrible arguments in favor of it being flat, I would certainly be dismissive. I don’t think my arguments fell into this category, but if you’re an expert in something, there are points casuals make that you can instantly dismiss. I have some AI doomer friends who feel that way about arguments against doomerism, though I mostly disagree.
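(For the curious: the thirder answer falls out of counting awakenings rather than coin flips. Here’s a minimal Monte Carlo sketch of that counting argument, my own illustration rather than anything from the post I was arguing with.)

```python
import random

def sleeping_beauty(trials=100_000):
    """Monte Carlo sketch of the thirder's counting argument.

    Heads: Beauty is awakened once. Tails: she is awakened twice,
    with her memory erased in between. The thirder treats each
    awakening, not each coin flip, as a sample point, and asks
    what fraction of awakenings happen in a heads-world.
    """
    heads_awakenings = 0
    total_awakenings = 0
    for _ in range(trials):
        if random.random() < 0.5:  # heads: one awakening
            heads_awakenings += 1
            total_awakenings += 1
        else:                      # tails: two awakenings
            total_awakenings += 2
    return heads_awakenings / total_awakenings

print(sleeping_beauty())  # converges to ~1/3
```

Halfers, of course, dispute the premise that awakenings are the right thing to count in the first place, which is exactly why the disagreement survives the arithmetic.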
I wish I had a stronger conclusion, but all I can say is that you must take each belief on a case-by-case basis. I don’t think there’s any broad, sweeping generalization that can instantly decide whether an issue is hogwash or genuinely contentious, as nice as that would be. And I think most people do not take the time to distinguish the beliefs they hold that may be wrong from the ones that are obviously correct. And if they do, they don’t do it well! In many cases, you won’t be smart or educated enough to evaluate the best evidence yourself, so you have to gauge the consensus among experts, which category the issue falls into, and whether that consensus is biased or misguided, and by how much. And you have to gauge whether you’re even smart enough to detect the bias! It’s all a mess. I think this is a very hard problem that people don’t think is very hard.
Eliezer Yudkowsky has made a similar point before, but I think he has far too much confidence in his beliefs about the consciousness of infants and animals, just as I certainly have far too much confidence in some of my unexamined beliefs. If the figurehead of rationality isn’t immune to this classification problem, who is?
(The best way you can support me is to subscribe, for free! If I get more subscribers, I can spend more time writing. Just click the helpful orange button that’s very shiny, right above this. I hear people who subscribe to me all get six-packs within a week of subscribing, too.)
I think that some approaches to a situation do not admit any win condition. By framing it as “when do I get to stop listening to my enemy’s arguments,” you enter a labyrinth of contradictions and paradoxes that will never resolve, by design. You can loop and spiral and feel like you are making progress, but there is no stability in whatever truth you come to; something always loops you back around. It feels like ascent, but it’s a Shepard tone: it sounds like it’s forever rising while never actually getting any higher.
The only way out is to change the premise, to one where people with different views are not enemies to assimilate or conquer. We could grow past our first-order categorization of a view as right/correct/good or wrong/incorrect/bad, and dig into the threads that make up the fabric of each belief. If we stop seeking to evaluate and change, space opens up to hear the subtle notes.
You can stop listening to your enemy’s points at any time; just stop treating them like your enemy. It’s not so much about the belief as it is about the person you are engaging with. Instead of saying “there are smart people on both sides of the issue,” it may be more helpful to say “there is always someone you could learn from on both sides of an issue.” Maybe you can stop listening to someone when you feel there isn’t more you can learn from them at this time.
Generally, though, I’m not listening to Flat Earth podcasts to decipher what foundational value sets or experiences tend to lead to those beliefs; I simply don’t have time to pay attention to everything. For the things I do choose to attend to, though, I have found the energy much better spent in curiosity than in compelling others to adopt my views. (I further believe that this kind of observational curiosity brings an individual closer to the truths of existence, whether you conduct that work by listening to flat earthers or by listening to rationalists.)
---
There are two factions in conflict. An outsider comes in with superior knowledge, assuming common humanity will override cultural distance, only to realize that their own categories of thought blinded them to local reality. Does the outsider gain a newfound humility and cease judging how they thought things ought to be?
(check out The Left Hand of Darkness by Ursula K. Le Guin)
I think one thing that’s less understood by younger people online (not specific to Gen Z/A, just people who haven’t been online and thinking about these issues for very long) is that a lot of people are dismissive of arguments prima facie because they’ve heard them a million times.
I think that’s the main reason folks like BB get such a reaction: lots of us have thought about these issues (admittedly not as deeply at times!) for longer than he himself has been reading books, period. He likes to assert that atheists think you can point out where an argument is wrong just by listening to it, which is kind of wrong… but also kind of right?
Like, every argument falls into a species of arguments that all make the same or similar assumptions, and a lot of popular apologetics that’s slightly higher-brow than Frank Turek seems to be just more creative ways to formulate the same five species of argument. So, sure, I can’t debunk it just by listening to it, but if you gave me 5-10 minutes, I probably could, not because I’m super smart, but because I’m familiar enough with the species of argument!