From: M. Taylor Saotome-Westlake
Date: Tue, 16 Feb 2021 04:49:37 +0000 (-0800)
Subject: Presidential doldrums
X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=6968b938c932c906a1630dbb1e74456ebf9ce38f;p=Ultimately_Untrue_Thought.git

Presidential doldrums

It's so hard to not get distracted!! But I have to prove—that it's possible to try.
---

diff --git a/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md b/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md
index dd7113e..780d5ad 100644
--- a/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md
+++ b/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md
@@ -654,17 +654,15 @@ In ["The Ideology Is Not the Movement"](https://slatestarcodex.com/2016/04/04/th
 Alexander jokingly identifies the identifying feature of our robot cult as being the belief that "Eliezer Yudkowsky is the rightful caliph": the Sequences were a rallying flag that brought together a lot of like-minded people to form a subculture with its own ethos and norms—among which Alexander includes "don't misgender trans people"—but the subculture emerged as its own entity that isn't necessarily _about_ anything outside itself.
 
-No one seemed to notice at the time, but this characterization of our movement is actually a _declaration of failure_.
+No one seemed to notice at the time, but this characterization of our movement [is actually a _declaration of failure_](https://sinceriously.fyi/cached-answers/#comment-794).
 
 There's a word, "rationalist", that I've been trying to avoid in this post, because it's the subject of so much strategic equivocation, where the motte is "anyone who studies the ideal of systematically correct reasoning, general methods of thought that result in true beliefs and successful plans", and the bailey is "members of our social scene centered around Eliezer Yudkowsky and Scott Alexander". (Since I don't think we deserve the "rationalist" brand name, I had to choose something else to refer to the social scene. Hence, "robot cult.")
 
+What I would have _hoped_ for from a systematically correct reasoning community worthy of the brand name is a world where _good arguments_ would propagate through the population no matter where they arose, "guided by the beauty of our weapons" ([following Scott Alexander](https://slatestarcodex.com/2017/03/24/guided-by-the-beauty-of-our-weapons/) [following Leonard Cohen](https://genius.com/1576578)). I think what actually happens is that people like Yudkowsky and Alexander rise to power on the strength of good arguments and entertaining writing (but mostly the latter), and then everyone else sort-of absorbs most of their worldview (plus noise and conformity with the local environment)—with the result that if Yudkowsky and Alexander _aren't interested in getting the right answer_ (in public), then there's no way for anyone who didn't [win the talent lottery](https://slatestarcodex.com/2015/01/31/the-parable-of-the-talents/) to fix the public understanding.
 
-Hence, "robot cult."
-
-[TODO: risk factor of people getting drawn into a subculture that claims to be about reasoning, but is actually very heavily optimized for cutting boys dicks off. "The Ideology Is Not the Movement" is very explicit about this!!
-People use trans as political cover; no one seemed to notice that "The Ideology Is Not the Movement" is a declaration of _failure_
-http://benjaminrosshoffman.com/construction-beacons/
+[TODO: risk factor of people getting drawn into a subculture that claims to be about reasoning, but is actually very heavily optimized for cutting boys dicks off.
+People use trans as political cover; https://srconstantin.github.io/2017/08/08/the-craft-is-not-the-community.html
 
-I'm worried about the failure mode where the awesomeness of the Sequences
-caliphate ->
 Rather than good arguments propagating through the population of so-called "rationalists" no matter where they arise, what actually happens is that people like Eliezer and you rise to power on the strength of good arguments and entertaining writing (but mostly the latter), and then everyone else sort-of absorbs most of their worldview (plus noise and conformity with the local environment). So for people who didn't [win the talent lottery](https://slatestarcodex.com/2015/01/31/the-parable-of-the-talents/) but think they see a flaw in the Zeitgeist, the winning move is "persuade Scott Alexander".
+
 
 ]

diff --git a/notes/sexual-dimorphism-in-the-sequences-notes.md b/notes/sexual-dimorphism-in-the-sequences-notes.md
index 4e44b56..293aa26 100644
--- a/notes/sexual-dimorphism-in-the-sequences-notes.md
+++ b/notes/sexual-dimorphism-in-the-sequences-notes.md
@@ -441,3 +441,4 @@ It _follows logically_ that, in particular, if _N_ := "woman", you can't define
 
 > Michael fucking Vassar. Shit!
 
+I'm worried about the failure mode where bright young minds [lured in](http://benjaminrosshoffman.com/construction-beacons/) by the beautiful propaganda about _systematically correct reasoning_, are instead recruited into what is, effectively, the Eliezer-Yudkowsky-and-Scott-Alexander fan club.
\ No newline at end of file