LessWrong, MIRI etc. meltdown

What happens if you take a bunch of socially inept, above-average-intelligence, mentally ill people and have them try to do stuff together? Sounds like a disaster. Well…

lesswrong.com/lw/p23/dragon_army_theory_charter_30min_read/

One comment, which I repost here because some are advocating for censorship, is very telling:

Comment author: 18239018038528017428, 26 May 2017 08:43:41PM, 22 points

This post is so thoroughly repulsive and disgusting that I made an account for the sole purpose of pointing out how transparently and obviously perverse this fucked-up proposal is. Naturally I don’t have any actual desire to be critical or rude; it’s just that nobody else is doing it, so because of my infinite kindness and charity (if you have any doubts, rest assured that my closest friends and colleagues will all attest to my beneficent nature), I find myself obligated to step up to the batting plate, so to speak. Ah, if only someone could release me from this great burden. If only.

The author seems to have missed the part of Ender’s Game about the protagonists being children. It’s generally not a good thing for adults to role-play as children (the reasons for which are, I hope, sufficiently obvious to not require elaboration). The dominant impression I get is that this resembles the antifa movement and the anti-antifa movement: a bunch of immature adults LARPing while pretending that they aren’t.

Note that despite the author’s insistence on the validity of his experience as a CFAR instructor, he fails to actually point to any concrete benefits that people have derived from that instruction — plausibly because those benefits, when concretely stated without embellishment, are at best underwhelming. Note also that (1) the post makes no mention of dealing with problems arising from interpersonal romance, and (2) the author’s reply to the comment that does point out the probable future existence of such problems is at best cursory and dismissive.

This suggests that, contrary to the author’s assertion of having amassed a diverse and broad range of skills, and contrary to whatever accolades his colleagues may see fit to place upon him, he hasn’t yet attained the level of social awareness of a typical American high school student. It also suggests that the author’s ability to model himself and to model others has not yet attained the sophistication required to view people as more than one-dimensional. I.e., the post seems to suggest an attitude of “I, a good person, will find a bunch of good people, and we’ll make these good things happen”. I’m pretty sure I’ve met high school students with a more nuanced (and less optimistic) understanding of human nature.

Naturally, this would be excused if the Berkeley rationalist community were full of people who are actually good people and who tend to get things done. Let’s check: Qiaochu Yuan, one of the most mathematically sophisticated members, has to the best of my knowledge hit a dead end in his PhD, and is becoming a CFAR instructor in Seattle, which makes it seem as though he’s actually concretely worse off compared to the counterfactual in which the rationalist community didn’t exist; Eliezer Yudkowsky has shifted in the direction of posting practically-untrue, self-aggrandizing bullshit on Twitter and Facebook instead of doing anything productive; Arbital is best described as a failure; word is going around that Anna Salamon and Nate Soares are engaging in bizarre conspiratorial planning around some unsubstantiated belief that the world will end in ten years, leading to severe dissatisfaction among the staff of MIRI; despite the efforts of a very valiant man, people have still not realized that autogynephilic men with repressed femininity and a crossdressing fetish pretending to be women aren’t actually women; CFAR itself is trending in the direction of adding bureaucracy for bureaucracy’s sake; my own personal experience with people branded as “CFAR instructors” has been extremely negative, with them effectively acting arrogant out of proportion to their competence, not to mention their below-average levels of empathy; there was that bizarre scandal last year in which someone was accidentally impregnated and then decided not to abort the child, going against what had previously been agreed upon, and proceeded to shamelessly solicit donations from the rationalist community to support her child; etc., etc., etc.

In effect, there seems to be some sort of self-deception around the fact that the Berkeley rationalist community is by almost all reasonable standards severely dysfunctional, with the best people actually being on the periphery of the community. It’s almost as if the author is coming up with the “Dragon Army” in an attempt to help everyone collectively delude themselves into believing they’re much better than they are, because he can’t bear to actually look at the Berkeley rationalist community and see it for what it is: a pile of garbage. Just like how a child from a broken family might imagine that everyone’s getting along. Unfortunately(?), flinching away from the truth doesn’t actually make reality go away.

Amusingly, it actually does seem as though the author partially realizes this. Let’s review the criteria which the author hopes the members of “Dragon Army” will fulfill after a year’s worth of cult membership:

  1. Above-average physical capacity
  2. Above-average introspection
  3. Above-average planning & execution skill
  4. Above-average communication/facilitation skill
  5. Above-average calibration/debiasing/rationality knowledge
  6. Above-average scientific lab skill/ability to theorize and rigorously investigate claims
  7. Average problem-solving/debugging skill
  8. Average public speaking skill
  9. Average leadership/coordination skill
  10. Average teaching and tutoring skill
  11. Fundamentals of first aid & survival
  12. Fundamentals of financial management
  13. At least one of: fundamentals of programming, graphic design, writing, A/V/animation, or similar (employable mental skill)
  14. At least one of: fundamentals of woodworking, electrical engineering, welding, plumbing, or similar (employable trade skill)

“Above-average”? “Average”? Not exactly a high bar. “At least one employable mental skill, and at least one employable trade skill”? Is the correct inference here that the typical participant is actually expected to be not employable at all (i.e., deficient in both categories)? “First aid & survival” — if there was ever any doubt that this is actually just sophisticated childish role-playing… The fact that I (in contrast with the Berkeley rationalist community) have put very little directed effort into the meta-goal of self-improvement and nevertheless plausibly already satisfy 11 of these 14 criteria, with the other 3 not seeming particularly difficult to attain, is not a good sign!

Despite the fixation on “evolving norms” or whatever, the author seems to be particularly blind to what social reality is actually like and what actually makes communities get along. Consider, e.g., the following quote:

for example, a Dragon who has been having trouble getting to sleep but has never informed the other Dragons that their actions are keeping them awake will agree that their anger and frustration, while valid internally, may not fairly be vented on those other Dragons, who were never given a chance to correct their behavior

Let me pose a question to the reader of my comment: would you rather live in a house where you have to constantly verbally ask the other residents to stop doing things that they could have reasonably foreseen would bother you, or would you rather live in a house where people actually used reasonable expectations of what other people want to guide their behavior and therefore acted in a way that preempted causing other people irritation?

There are two inferences to be made here:

  1. Members of the Berkeley rationalist community are particularly prone to using bureaucratic rule-setting as a way to compensate for their severely below-average social skills, and
  2. Members of the Berkeley rationalist community are particularly low-empathy and embody the worst of individualism, such that they don’t actually care whether or not what they’re doing might bother others until they’re told to stop.

In my personal experience, both inferences are correct. Ultimately, what this comes down to is a bunch of socially-inept losers with near-autistic social skills trying to attain the sort of basic social harmony that comes naturally to more competent people via a combination of bizarre mimicry and a mountain of bureaucracy. Naturally, and contrary to the author’s bizarre childish idealism, one can expect a hell of a lot of repressed irritation, interpersonal drama, and general unpleasantness from this experiment.

To top off the turd cake with a cherry, the author’s science fiction writing is trash:

I felt my stomach twist, felt that same odd certainty, this time wrapped in a layer of the coldest, blackest ice. “You came to kill us,” I said. There was a soft rustle as the others straightened, pressure on my shoulders as the space between us closed. “You came to kill us all.”

Anyone who can vomit that out on a page and feel proud of it isn’t fit to lead or teach anything. Period. The world would be concretely better off if the author, and anyone like him, killed themselves.

 
