Almost six years ago, Scott wrote a post on LW about the apparent lack of strong correlation between real-world instrumental success and studying what he calls “x-rationality” – that is, OB/LW-style rationality, of the kind that’s above and beyond the regular skills you can get from being a generally intelligent, thoughtful, and scientifically-minded person.
I’d be quite interested in hearing what his opinion is six years later, but my current one is that this situation hasn’t changed much, in general. In fact, I was linked to his post by a recent reply Ozy sent someone on zir blog, in which zie commented that zie didn’t spread LW memes because zie didn’t feel they were very useful. I’m not alone in this, then. (Let’s remember that CFAR exists now, though.)
I’d like to share my thoughts on another potential factor contributing to this situation, something that was alluded to by the post and many of its commenters (including Scott himself and Anna Salamon), something I’ve noticed that… I do. A skill, maybe.
Aspiring x-rationalists are the people who look at the mountain of stuff on Overcoming Bias, Less Wrong, and other such sources, and decide that it makes sense, that their lives would be improved by the application of these techniques, so they go on and learn everything about it. They memorise it, they absorb all these memes to the point of being able to recite many of the more famous quotes by heart. And yet there isn’t a strong correlation! We’re not producing superheroes every other Tuesday! What gives?
I’d say it’s that believing rationality and alieving rationality are really different things.
Sure, you may believe that ghosts don’t exist. But after watching a thriller, in the dark of night, knowing full well that your house is locked and perfectly safe from supernatural phenomena, you still jump when you hear the wind rustling something in your living room. Likewise, you may very well believe that conjunctions are less likely than disjunctions, but you still find that story your neighbour told you about some ridiculously contrived happenstance mighty intriguing.
What I’m getting at here is that, maybe, for rationality to work for you, it’s not enough to believe it. You have to alieve it. You have to feel it in your bones. You have to turn it into heuristic, into habit, something as natural as breathing. Your flinching reactions have to be rational ones.
And… that’s just really, really hard. And I’d say it’s a general skill: turning beliefs into aliefs. In fact, I might say it’s the single most important rationality skill, without which all the others are superfluous. It’s no use learning that “politics is the mind-killer” if you don’t grok that what it means is “whenever I’m discussing politics, I need to be extra careful, because it’s a field that’s particularly likely to trigger my biases”, and instead use it as a general get-out-of-jail-free card to avoid discussing anything that might require taking a position. It’s no use learning that “the map is not the territory” if you don’t internalise that something being true, something being provable, and something being believed are three different things.
I’d say I’m pretty good at it. I’ve only recently realised it, but yeah, I think I’m good at turning beliefs into aliefs.
In my last post about LW, I said that one of the most important things LW gave me was kindness. I learnt how to be a kinder person, a gentler person. This is because the most obvious and direct lesson of x-rationality isn’t how to Overcome your Biases, it’s that everyone’s biased. Almost no one is evil, almost everything’s broken. Malice is actually really rare, and I find that most people do evil because they’re wrong, not because they’re bad; either they’re wrong about what they believe (gay marriage will destroy families!), wrong about what’s good (men’s egos are so fragile LOL let’s mock them for it), or wrong about how to do good (if I shame these white mentally ill people really hard about their privilege they’ll surely join our cause).
Other aliefs I have acquired include various things about social justice, gender, and sexuality; how to deal with my own emotions; what akrasia means and how to fight it; and when not to; amongst other things. And I have had marked increases in my productivity. Last year I read 52 books, of which 23 (or thereabouts) were maths or otherwise instructive non-fiction. I learnt three (or four?) new programming languages, wrote a lot, and refined my goals and beliefs. And I claim it was because of x-rationality, and because I turned a lot of rationality techniques into intuitive heuristics.
(But I don’t know that this is actually true; I don’t know how I would’ve turned out if I hadn’t been exposed to x-rationality, and it might just be the case that I’d be very similar.)
(The above is an example of second-guessing myself that I’ve turned instinctual and which now happens naturally.)
And now’s the point where I explain to you what the magic secret is…
Except I don’t know. It’s not an explicit skill; it’s just something that happens. I have an easy time turning beliefs into aliefs, really, truly, gut-level grokking something. If someone has some advice on that front, now’d be the perfect time.
But until then? Yeah, I’m pretty sure this should be a top-priority skill to learn: how to turn your other skills into instinct and instantaneous heuristic.