Scott Alexander is Smarter Than Me. Should I Steal His Beliefs?
Arguments are overrated; you can find truth without them
It’s no secret I’m a Scott Alexander glazer. Scott once said that he was an embarrassing fanboy of Eliezer Yudkowsky, and that it may be his fate to have embarrassing fanboys of his own one day. Well, the bell tolls. I find Scott consistently interesting and intelligent, and he connects topics to one another in ways I’ve never seen from anyone else. He seems more dedicated to truth than anyone else I know.
So, he’s smarter than me, a better thinker than me, and he’s spent a lot of time on many different topics trying to find the truth. As the smartest guy I’ve found, should I steal all of his beliefs indiscriminately? Legitimately. If a person only cares about having the most correct beliefs, which I feel is a reasonable goal, is finding the smartest person you know and stealing his beliefs a good idea?
Down the rabbit hole
Epistemology and epistemic practices are the study and methods of finding true things1. For the vast majority of issues, in politics, religion, psychology, and philosophy, where Scott’s gotten his beliefs through fantastic epistemic practices and I’ve gotten mine from random sources and friends and biases I can’t remember, does it make sense to copy everything?
Well, it seems obvious that there are some issues I shouldn’t copy from him — if there’s an issue where I’m an expert and he is not, then I probably shouldn’t steal his beliefs. But for the rest, I can treat the very act of him having a belief as strong Bayesian2 evidence in its favor, and update strongly towards his position, because I know he’s spent a lot of time thinking about any given topic trying to reach the truth, and I know I’m more biased!
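To make that concrete, here’s a minimal sketch of the update in Python. The numbers are completely made up for illustration; the 9:1 likelihood ratio (“Scott is nine times more likely to hold a belief when it’s true than when it’s false”) is my hypothetical, not a measurement of Scott.

```python
# A toy Bayesian update: treat "Scott believes X" as evidence for X.

def update(prior: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

prior = 0.5             # I'm on the fence about claim X
likelihood_ratio = 9.0  # hypothetical: Scott believes true things 9x as often

print(update(prior, likelihood_ratio))  # 0.9 -- one smart believer moves me a lot
```

If the evidence really worked like that, copying Scott wholesale would be a bargain; the catch, as the rest of this post discovers, is figuring out what that likelihood ratio actually is.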
Alright, to back off from the Scott glazing: am I too preoccupied with having my own, interesting opinions, and should I change ALL my beliefs to what the smart people I respect think? Fully adopting someone else’s opinions without understanding them can be a dangerous game. Without understanding why someone believes X, you don’t actually know which way new evidence should swing that belief, or by how much. But I find it very easy to believe that if 99% of people just fully adopted Scott’s opinions, they would be much more correct. And if rationality and epistemology aren’t about being correct, what even are they about? Those people could still try to learn and understand the arguments, and once their expertise eclipsed Scott’s they could place more weight on their own judgment, but the point of epistemology is to be correct.
But of course, most people don’t think as highly of Scott as I do. Plenty of people are mirrors for the beliefs of a political figure, or even more commonly, their political side. If I could convince them to copy Scott’s epistemic gusto, they’d be more correct, but they would need to think they should trust me, which brings a whole host of issues with it. I’m using my own shaky epistemic tools to decide that Scott is the guy I should copy, so if I can tell he’s really that much better a choice than, say, Donald Trump, most of the work of stealing his beliefs is already done.
Eliezer has an old, relevant chestnut in this post about how authority and argument are two very different types of evidence. It’s only if you don’t understand an argument, or don’t attempt to, that authority becomes relevant. If you fully grasp an argument and judge it to be sound, then the person who said it becomes only a footnote. But I feel that I’m able to understand most of Scott’s posts, which is how I decided he was such a good candidate for this thievery to begin with! It seems like if there were a hypothetical writer X, with even better epistemic practices than Scott, who didn’t care enough to dumb his arguments down to something heathens like me could understand, I’d be out of luck.
Wait a second, Scott’s not a computer scientist or political scientist, he’s a writer
Now’s about the time where I point out that as great as Scott is, he is first and foremost a writer, not an expert in the fields he talks about (except psychiatry). His best skill is his ability to weave words into something interesting, which makes them stick in my brain. So I’m basically doing what the person considering becoming a mirror of Trump is doing: finding the most charismatic person I know and stealing all his beliefs. It’s just that my definition of charismatic is poisoned by wanting to feel like I’m learning.
If someone said “hey, your posts are insightful, I’m going to blindly steal all your beliefs” I would implore them to steal Scott’s instead! But Scott himself is certainly not the smartest person that SCOTT knows, as he says here in a post where he admits he always struggled with math:
Every so often an overly kind commenter here praises my intelligence … But at my level, I spend my time feeling intellectually inadequate compared to Scott Aaronson.
Scott Aaronson describes feeling ‘in awe’ of Terence Tao and frequently struggling to understand him.
Terence Tao – well, I don’t know if he’s religious, but maybe he feels intellectually inadequate compared to God.
So wait a second, I’m at the bottom of this long totem pole of smartness. I’ve barely read anything from Aaronson or Tao! Okay, we’re on this truth hunt, let’s cut out the middleman and just steal Terence Tao’s beliefs. But Tao is private about most of his personal beliefs. As an example, YIMBYs are people who want to build more houses and have less housing regulation, and their naysayers are the dastardly NIMBYs — this is a niche political topic! What do I do when I want an opinion on YIMBYism vs NIMBYism, I go up the totem chain, and Tao has spent his time on math stuff and doesn’t have anything public about his thoughts on a niche topic?
Screw it, Tao isn’t a political scientist, why would I trust him there? Why don’t I just find a smart person in every field and steal their beliefs? I’ll find a smart political scientist, a smart philosopher, and a smart AI expert. But it becomes immediately obvious that I’m not smart enough in fields I don’t understand to determine who’s actually smart and who’s bullshitting. Let’s say I don’t understand any political science — how could I tell the difference between Curtis Yarvin and, uh, I guess literally any other substack political commentator? If I don’t even have a basic grasp of the material, I’m screwed. But actually, experts in a topic do tend to be right when they have a consensus — on, say, climate change existing versus not, trusting the experts is a great heuristic. And if an issue is contentious, maybe trusting the majority is best? Once again, if I’m looking for a truth shortcut, the side with 60% seems better than the side with 40%!
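As a sanity check on that 60/40 intuition, here’s a toy model in Python. The big assumption, which is mine and is exactly the part that fails in the real world, is that each expert reaches the right answer independently with some fixed accuracy.

```python
# Toy model: how strong is a 60-40 expert split as evidence?
# Assumes each expert is independently correct with probability p,
# which real experts (who read the same papers and share the same
# biases) emphatically are not.

def split_likelihood_ratio(votes_majority: int, votes_minority: int, p: float) -> float:
    """Likelihood ratio for the majority side: each vote multiplies
    the odds by p/(1-p) if it agrees, by (1-p)/p if it disagrees."""
    return (p / (1 - p)) ** (votes_majority - votes_minority)

print(split_likelihood_ratio(60, 40, 0.6))  # ~3325: implausibly strong
```

The implausibly huge number is the tell: expert votes are correlated, so a 60/40 split is real evidence, just nowhere near this much of it.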
So throw out the idea of finding one specific guy who’s really smart in a topic; let’s steal the expert consensus on everything. Except how am I supposed to know who the experts are? If everyone working in a field counts, you’re gonna get a lot of stupid people. Now we’re stuck between one person being too few and a consensus of everyone giving college dropout interns in the field a say. But maybe it all just averages out?
Maybe I need a consensus of the top 5 experts in a field, chosen by experts in the field. But that’s stupid, the experts doing the choosing are just gonna pick whoever already agrees with them.
Screw it, the experts can’t be trusted, just get a consensus of the people. Hah! Are you stupid? That’s way worse. Steve Kirsch has 250,000 substack subscribers! Everybody is stupid except me! Maybe I’m just uniquely suited to finding truth in the universe because I got a 1480 on the SAT.
Wait a second, a 1480 literally isn’t even that high. Smart people? Let’s poll smart people! What’s the consensus of everybody who’s got a 1600 on the SAT on every topic? They can’t go wrong. Where can I find this? Do I need to create a database and find out what the smartest people think to find truth?
This is all going wrong. All I want is a fast track to truth without having to understand the arguments for every single topic. Wait, maybe epistemology and Bayes’ Rule are the problem. Who decided that the truth needed to be backed by statistics? Maybe the universe operates outside numbers. Religion! The majority of the US is religious! I’m not, but if we’re trusting the masses, what does God think? Err, what’s the consensus of people who talk to God about what God’s beliefs are? Does God think I should be a YIMBY? Maybe I operate in a privileged place in the universe, and epistemology doesn’t even work for me because of something something anthropic reasoning. What’s the best way to get God to speak to me about whether I should support Israel or Palestine? How can I find only the true prophets?
No! The prophets have no clear miracles! Why am I assuming other people are even real, or that my beliefs are real? What’s a belief anyway? If my conscious experience right now isn’t experiencing the belief, and it’s just somewhere else in my brain, am I experiencing it? What if solipsism is true? Maybe this is a religious test in a simulation or something, to see if I can discover NIMBYism off of pure faith without any evidence. Can God create another God so powerful he cannot control him? Can I control my own brain? How can I be a YIMBY if the fabric of time travelling from the past into the future cannot even be independently verified? What if I’m a Boltzmann brain and statistics and reality and truth and time are gonna break down? I’ll give it 3… 2… 1…
RAHHHHHHH
This is all ridiculous, right?
Okay, back up. I hope you realize this is all insane. It’s really, really hard to get a fast track to truth without understanding the arguments. A cursory understanding isn’t as good as a deep understanding, but it’s better than nothing. Scott Alexander is valuable because he explains the arguments so well. And, of course, trusting the experts is usually very good advice and a good heuristic to have. If the experts all disagree with you, you’re probably wrong. On contentious issues, weigh the sides, the number of people for or against, and the arguments, and the world will mostly make sense. And because many people have the incentive and the ability to position their false beliefs as the expert opinion, arguments and statistics stay necessary. As a closer, I’m reminded of one of my favorite parables, about smallpox:
Louis XV died of smallpox in 1774. He had all the power, and money, and resources in the world, yet he met his fate all the same; was he truly doomed? An inevitable and unavoidable tragedy?
Nay; three months before his death, a lowly dairy farmer across the Atlantic in the United States braced his family as smallpox ravaged his town. Luckily, it was folk wisdom that cowpox, a relatively mild affliction, made you completely immune to the much more devilish smallpox. He took his family to his cows and rubbed the cows’ pus on their arms, and they were saved from that terrible fate for the rest of their lives.
Louis XV’s fate was sealed not by a lack of resources, but by a lack of knowledge; there would have been no way for him to distinguish true knowledge from the snake oil salesmen and faith healers that surrounded him. The only path is true understanding.
1. Alright, you got me: epistemology is the study of knowledge, and is interested in the difference between true beliefs and justified beliefs, how they differ from opinion, etc. etc. You’re damn right that I will abuse it in my article to mean “good at finding true things,” because it’s a fancy way to say something that resembles that. Also, the word is fun to say. Ep-eh-stem-ick.
2. Bayes bayes bayes bayes bayes. If you say it 5 times it almost doesn’t sound like a word. This word is actually completely irrelevant in this sentence, and can be ignored. Also, I’m kinda totally abusing Bayes here, as, strictly speaking, you can’t be under- or overconfident, but tuning your method to find truth is fine! But yes, fine, it’s “one of the most important equations of all time.” Yes, it’s “a mathematical way to explain how to accurately update beliefs so you can find truth.” I know, I know, I know. Scott’s bio actually is literally just Bayes’ rule, so if I don’t include it, given how important it is, I’m basically disrespecting Scott.
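For reference, since this footnote keeps gesturing at it, here’s the rule itself in its standard form, where $H$ is a hypothesis and $E$ is the evidence:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$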
Love this!
Regarding your claim that once you understand an argument, authority is a footnote: I think this neglects the possibility that you might think you understand an argument while not actually getting it. Pretty much nobody who misunderstands an argument, or thinks they have rebutted it when they haven’t, believes they don’t understand the argument. For example, back in the day, I used to be confident that I had seen the problems with the argument from fine-tuning, and therefore that the actual values of the physical constants weren’t a problem I needed to worry about. I still think it doesn’t work as an argument for God, but now that I have a better grasp of anthropics, I think I was clearly in error when I thought I had seen why the argument was mistaken. Basically, if you and a super smart person disagree about whether an argument is correct, you should, all else equal, assume that the smarter person is correct.
I do think Scott is a poor example, because while I’m pretty confident he is smarter than I am (and possibly you), his actual performance in his prediction contests makes me think you’d be better off consulting the opinion of superforecasters. Of course, as a practical matter, I notice that I feel comfortable disregarding their opinions where I think their reasoning is clearly wrong, for example on existential risk from artificial intelligence. Similarly, I think many-worlds is correct, even though the community of physicists who have probably forgotten more on the topic than I’ll ever know is sharply divided on this subject. This does indicate that subconsciously my brain still thinks it’s knowledgeable enough to judge these topics to some extent, but I notice I thought the same back in my anti-string-theory phase, and in retrospect I was clearly not thinking clearly in that period and definitely wasn’t as knowledgeable as I needed to be to form confident opinions. I think in practice you need to give substantial weight to authority when reasoning about a topic, but I also think I’m not making a mistake when I don’t give authority infinite weight, and do let sufficient evidence from my own thinking overrule it. Of course, this could be a mistake, and I notice most people have overly confident opinions on topics where they have done minimal research and thinking, even though they are aware that experts disagree. Think of topics like the minimum wage, where people will actively get angry at the suggestion that it’s not definitely a good idea, even though professional economists disagree on the topic. Of course, I notice I myself have several topics on which I think expert disagreement is irrelevant, like the non-existence of libertarian free will or my belief that electrons aren’t conscious.
I absolutely agree that you can’t judge who is an expert without already having a good understanding of the topic. As Scott noted, if you are a young earth creationist, you think the relevant experts are fundamentalist preachers, and “listen to the experts” can’t possibly save you then. In fact, very often you’ll judge whether you can trust the experts by looking at whether you think they come to correct conclusions, in which case you can’t then turn around and use their conclusions to form your own. At the very least, even if you don’t want to judge them by their conclusions, you have to judge them by something like the quality of their arguments, in which case you’re already filtering for people who agree with you about which arguments are valid.
In general, the actual solution most people in our position seem to settle on is: generally trust the experts, but feel free to disagree with them if you have thought about the topic sufficiently. Of course, I notice that many of these same people are much more absolute in their trust of the stock market, even though it effectively works by aggregating the knowledge of society, which these selfsame people feel free to second-guess in other situations. And given that these people end up effectively disagreeing with the stock market on various things like artificial intelligence, perhaps this is evidence they should trust the stock market less. Although I would note here that, even if it leads to less accurate beliefs, some willingness to ignore authority is desirable, because the disagreement and discussion that follow will generally make society more accurate in its beliefs, even if individual members are pushed further from the truth. Though I do think that, outside of expert communities, on the margin we have too much willingness to ignore authority. Also, to be fair, while the arguments for complete adherence to authority can be philosophically strong, most people can’t actually follow such a principle in practice.