The largest mental gains I made in the shortest period of time were from studying rationality.
I was amazed to discover a couple years ago that there were people who regularly studied and discussed how to think, how to get correct and accurate beliefs about how the world works, how to understand how your mind works, and how to get at the real reasons people make decisions.
The whole rationality thing is as addictive as crack-cocaine for me. I love it. The difference from crack, though, is you grow stronger and smarter the more you dive in.
Our minds are funny. We humans, we're "adaptation-executers, not fitness-maximizers" -
Fifty thousand years ago, the taste buds of Homo sapiens directed their bearers to the scarcest, most critical food resources - sugar and fat. Calories, in a word. Today, the context of a taste bud's function has changed, but the taste buds themselves have not. Calories, far from being scarce (in First World countries), are actively harmful. Micronutrients that were reliably abundant in leaves and nuts are absent from bread, but our taste buds don't complain. A scoop of ice cream is a superstimulus, containing more sugar, fat, and salt than anything in the ancestral environment.
Insights like this go a long way. Why do we eat so much crap? Because the crap was rare when our ancestors were evolving. A long, long time ago, the people who ate as much sugar and fat as they could when they found a good source were more likely to survive. But nowadays that instinct is a pain in the ass, and we just get fat.
The two sites that I've learned the most about rationality from are Overcoming Bias and Less Wrong. Overcoming Bias is Robin Hanson's blog. Less Wrong is a community site primarily led by Eliezer Yudkowsky. Both guys are really, really, really smart and very insightful.
What's in it for you to learn rationality? You'll start understanding how the world works, how people work, and how your own mind works. Also, it's really fascinating. Rationality is full of epiphanies and lightbulbs going off. And though it can be dry and dense sometimes, Hanson and Yudkowsky do a pretty good job of keeping it fun and easy enough to read, with good illustrative examples.
Here are a few good Eliezer Yudkowsky posts to start with:
Are your enemies innately evil?
Politics is the Mind-Killer
Tsuyoku Naritai
Making Beliefs Pay Rent
My favorite non-Yudkowsky post on Less Wrong is Generalizing From One Example - I read it at just the right time, and it had a huge impact on how I evaluate things. Basically, a lot of people assume that others think the same way they do, and then can't understand why those people make the decisions they make. I won't even try to summarize it - just make sure you read Generalizing From One Example. It's very good, and it'll help you understand conflict a lot.
On Eliezer Yudkowsky's personal site, I highly recommend Twelve Virtues of Rationality.
Some choice quotes:
A burning itch to know is higher than a solemn vow to pursue truth.
P. C. Hodgell said: “That which can be destroyed by the truth should be.” Do not flinch from experiences that might destroy your beliefs. The thought you cannot think controls you more than thoughts you speak aloud.
On Overcoming Bias, Robin Hanson talks about "signalling" a lot - people taking actions to demonstrate they're a certain kind of person, or allegiance to a certain group. I really liked this recent post -
The Meaning of the Meaning of Life. Here's an excerpt:
It seems what people want is a satisfying story about their place in the universe. Since characters are the most important elements of a story, the main “place” that matters to people is their social place – who they relate to and how. People feel they understand their place when they have a story saying how they can relate well to important social entities.
Central to any social relation is whether the related person supports or opposes you in your conflicts. In fact, it seems enough to give your life meaning to just know who are your main natural allies and enemies among the important actors around, and what you can do to keep your allies supporting you, to give you high enough status.
For example, if there is a great powerful God, it seems enough to know what he wants you to do to keep him on your side. If you are a lowly servant but have the King for an ally, little else matters but pleasing him. (Unless you had higher status ambitions.) If you have committed yourself to certain strong relations, like a spouse or kids, then it may be enough to know how to keep them on your side. If your relations shift more often, you might instead focus on general features of your natural allies, such as gender, personality, ethnicity, or some grand shared far value. For example, knowing you are good at and love music may ensure the support of music lovers, “your people,” wherever you go.
People think their life has less meaning when enough aspects of it are determined by “impersonal” forces that refuse to take social sides. For example, a death caused by an enemy’s plan, or an ally’s failure to help, or by the dead person’s trying to help his allies, has far more meaning than a death caused by simple physics.
The community is pretty amazing. I signed up recently and decided I'd been lurking too long and wanted to give back. You need to get 20 karma points before you're allowed to post on Less Wrong, which means you need to comment a little bit and show you understand the basics of civility and discussion.
I was pretty fortunate in that the community liked my first post, "A 'Failure to Evaluate Return-on-Time' Fallacy," enough to "promote" it - list it on the curated front page. I try to make my writing a little more humble there - Less Wrong is like an informal discussion led by a very intelligent professor in the basement of a university library. People don't want theatrics or hand-waving there at all, and there's almost zero tolerance for faulty, emotionally-based arguments.
In "Failure to Evaluate", I wanted to bring up discussion around a main point - "A large majority of otherwise smart people spend time doing semi-productive things, when there are massively productive opportunities untapped. ... I'm curious as to why."
The comments on that were good, but even better was an amazing reply by Anna Salamon - "Humans Are Not Automatically Strategic" -
Note: The next excerpt is dense, but take the time to work through it because it's worth it and you'll get a lot smarter if you can understand it.
But there are clearly also heuristics [heuristics: "rules of thumb" we use for quick decisionmaking] that would be useful to goal-achievement (or that would be part of what it means to “have goals” at all) that we do not automatically carry out. We do not automatically:
(a) Ask ourselves what we’re trying to achieve;
(b) Ask ourselves how we could tell if we achieved it (“what does it look like to be a good comedian?”) and how we can track progress;
(c) Find ourselves strongly, intrinsically curious about information that would help us achieve our goal;
(d) Gather that information (e.g., by asking how folks commonly achieve our goal, or similar goals, or by tallying which strategies have and haven’t worked for us in the past);
(e) Systematically test many different conjectures for how to achieve the goals, including methods that aren’t habitual for us, while tracking which ones do and don’t work;
(f) Focus most of the energy that *isn’t* going into systematic exploration, on the methods that work best;
(g) Make sure that our "goal" is really our goal, that we coherently want it and are not constrained by fears or by uncertainty as to whether it is worth the effort, and that we have thought through any questions and decisions in advance so they won't continually sap our energies;
(h) Use environmental cues and social contexts to bolster our motivation, so we can keep working effectively in the face of intermittent frustrations, or temptations based in hyperbolic discounting; ... or carry out any number of other useful techniques. Instead, we mostly just do things. We act from habit; we act from impulse or convenience when primed by the activities in front of us; we remember our goal and choose an action that feels associated with our goal. We do any number of things. But we do not systematically choose the narrow sets of actions that would effectively optimize for our claimed goals, or for any other goals.
That's a little word-dense, but spend the time to work through it, or just go read the entire post. That's one of the reasons Less Wrong is so amazing - I brought up something I noticed for discussion, and got this brilliant reply from Anna.
The whole rationality community is a pretty amazing place. It can get a little word-dense sometimes and take a while to think through, but man oh man is it worth it. If you haven't gotten started yet, there's enough material to surf through Overcoming Bias and Less Wrong for two weeks with epiphany after epiphany, you could keep studying and learning at a fast clip for six solid months, and you could enjoy casually studying it for the rest of your life. It's informative, entertaining, and really valuable for accomplishing things.
Don't wait - get started! If you're surfing the net, go through some of the links in this post as a jumping-off point. Also, if you have any favorite posts on rationality or other good rationality websites, please let us know in the comments.