From: Zack M. Davis Date: Tue, 28 Nov 2023 17:13:50 +0000 (-0800) Subject: memoir: pt. 4 fills X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=78352bcc30d679cef5a2dfbf582637e734739ecb;p=Ultimately_Untrue_Thought.git memoir: pt. 4 fills --- diff --git a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md index c5698fc..79ed97e 100644 --- a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md +++ b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md @@ -9,7 +9,7 @@ Status: draft > > —_Atlas Shrugged_ by Ayn Rand -Quickly recapping my Whole Dumb Story so far: [ever since puberty, I've had this obsessive sexual fantasy about being magically transformed into a woman, which got contextualized by these life-changing Sequences of blog posts by Eliezer Yudkowsky which taught me (amongst many, many other things) how fundamentally disconnected from reality my fantasy was.](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/) [So it came as a huge surprise when, around 2016, the "rationalist" community that had formed around the Sequences seemingly unanimously decided that guys like me might actually be women in some unspecified metaphysical sense.](/2023/Jul/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer/) [A couple years later, after having put some effort into arguing against the popular misconception that the matter could be resolved by simply redefining the word _woman_ (on the grounds that you can define the word any way you like), I flipped out when Yudkowsky prevaricated about how his own philosophy of language says that you can't define a word any way you like, prompting me to join up with a handful of allies to attempt to persuade him to clarify.](/2023/Jul/a-hill-of-validity-in-defense-of-meaning/) [When that failed, my attempts to cope with the "rationalists" being fake led to a series of small misadventures culminating in Yudkowsky eventually clarifying the philosophy of lanugage issue after I ran out of patience and yelled at him over email.](/2023/Nov/if-clarity-seems-like-death-to-them/) +Quickly recapping my Whole Dumb Story so far: [ever since puberty, I've had this obsessive sexual fantasy about being magically transformed into a woman, which got contextualized by these life-changing Sequences of blog posts by Eliezer Yudkowsky which taught me (amongst many, many other things) how fundamentally disconnected from reality my fantasy was.](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/) [So it came as a huge surprise when, around 2016, the "rationalist" community that had formed around the Sequences seemingly unanimously decided that guys like me might actually be women in some unspecified metaphysical sense.](/2023/Jul/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer/) [A couple years later, after having put some effort into arguing against the popular misconception that the matter could be resolved by simply redefining the word _woman_ (on the grounds that you can define the word any way you like), I flipped out when Yudkowsky prevaricated about how his own philosophy of language says that you can't define a word any way you like, prompting me to join up with a handful of allies to attempt to persuade him to 
clarify.](/2023/Jul/a-hill-of-validity-in-defense-of-meaning/) [When that failed, my attempts to cope with the "rationalists" being fake led to a series of small misadventures culminating in Yudkowsky eventually clarifying the philosophy of language issue after I ran out of patience and yelled at him over email.](/2023/Dec/if-clarity-seems-like-death-to-them/)

Really, that should have been the end of the story—and it would have had a relatively happy ending, too: that it's possible to correct straightforward philosophical errors, at the cost of almost two years of desperate effort by someone with [Something to Protect](https://www.lesswrong.com/posts/SGR4GxFK7KmW7ckCB/something-to-protect).

@@ -235,9 +235,9 @@ On a close reading of the comment section, we see hints that Yudkowsky ... does

So, the explanation of [the problem of political censorship filtering evidence](https://www.lesswrong.com/posts/DoPo4PDjgSySquHX8/heads-i-win-tails-never-heard-of-her-or-selective-reporting) here is great, but the part where Yudkowsky claims "confidence in [his] own ability to independently invent everything important that would be on the other side of the filter" is laughable. My point (articulated at length in ["Challenges"](/2022/Mar/challenges-to-yudkowskys-pronoun-reform-proposal/)) that _she_ and _he_ have existing meanings that you can't just ignore by fiat given that the existing meanings are exactly what motivate people to ask for new pronouns in the first place is obvious.

-Really, it would be less embarassing for Yudkowsky if he were lying about having tried to think of counterarguments. The original post isn't that bad if you assume that Yudkowsky was writing off the cuff, that he clearly just didn't put any effort whatsoever into thinking about why someone might disagree. I don't have a problem with selective argumentation that's clearly labeled as such: there's no shame in being an honest specialist who says, "I've mostly thought about these issues though the lens of ideology _X_, and therefore can't claim to be comprehensive; if you want other perspectives, you'll have to read other authors and think it through for yourself."
+Really, it would be less embarrassing for Yudkowsky if he were lying about having tried to think of counterarguments. The original post isn't that bad if you assume that Yudkowsky was writing off the cuff, that he just didn't put any effort whatsoever into thinking about why someone might disagree. I don't have a problem with selective argumentation that's clearly labeled as such: there's no shame in being an honest specialist who says, "I've mostly thought about these issues through the lens of ideology _X_, and therefore can't claim to be comprehensive or even-handed; if you want other perspectives, you'll have to read other authors and think it through for yourself."

-But if he _did_ put in the effort to aspire to comprehensiveness—enough that he felt comfortable bragging about his ability to see the other side of the argument—and still ended up proclaiming his "simplest and best protocol" without even so much as mentioning any of its obvious costs, that's discrediting. If Yudkowsky's ability to explore the space of arguments is that bad, why would you trust his opinion about anything? 
+But if he _did_ put in the effort to aspire to even-handedness—enough that he felt comfortable bragging about his ability to see the other side of the argument—and still ended up proclaiming his "simplest and best protocol" without even so much as mentioning any of its obvious costs, that's discrediting. If Yudkowsky's ability to explore the space of arguments is that bad, why would you trust his opinion about anything?

Furthermore, the claim that only I "would have said anything where you could hear it" is also discrediting of the community. Transitioning or not is a _major life decision_ for many of the people in this community. People in this community _need the goddamned right answers_ to the questions I've been asking in order to make that kind of life decision sanely [(whatever the sane decisions turn out to be)](/2021/Sep/i-dont-do-policy/). If the community is so bad at exploring the space of arguments that I'm the only one who can talk about any of the obvious decision-relevant considerations that code as "anti-trans" when you project into the one-dimensional subspace corresponding to our Society's usual Culture War, why would you pay attention to the community _at all_? Insofar as the community is successfully marketing itself to promising young minds as the uniquely best place in the entire world for reasoning and sensemaking, then "the community" is _fraudulent_ (misleading people about what it has to offer in a way that's optimized to move resources to itself): it needs to either _rebrand_—or failing that, _disband_—or failing that, _be destroyed_.

@@ -364,31 +364,28 @@ For the savvy people in the know, it would certainly be convenient if everyone s

But if you cared about not deceiving your readers, you would want to be sure that those _really were_ the only two options. You'd [spend five minutes by the clock looking for third alternatives](https://www.lesswrong.com/posts/erGipespbbzdG5zYb/the-third-alternative)—including, possibly, not issuing proclamations on your honor as leader of the so-called "rationalist" community on topics where you _explicitly intend to ignore politically unfavorable counterarguments_. Yudkowsky rejects this alternative on the grounds that it allegedly implies "utter silence about everything Stalin has expressed an opinion on including '2 + 2 = 4' because if that logically counterfactually were wrong you would not be able to express an opposing opinion".

-I think Yudkowsky is playing dumb here. In other contexts, he's written about ["attack[s] performed by selectively reporting true information"](https://twitter.com/ESYudkowsky/status/1634338145016909824) and ["[s]tatements which are technically true but which deceive the listener into forming further beliefs which are false"](https://hpmor.com/chapter/97). I think that _if he wanted to_, Eliezer Yudkowsky could think of some relevant differences between "2 + 2 = 4" and "the simplest and best protocol is, "'He' refers to the set of people who have asked us to use 'he'".
+I think he's playing dumb here. In other contexts, he's written about ["attack[s] performed by selectively reporting true information"](https://twitter.com/ESYudkowsky/status/1634338145016909824) and ["[s]tatements which are technically true but which deceive the listener into forming further beliefs which are false"](https://hpmor.com/chapter/97). 
He's undoubtedly familiar with the motte-and-bailey doctrine as [described by Nicholas Shackel](https://philpapers.org/archive/SHATVO-2.pdf) and [popularized by Scott Alexander](https://slatestarcodex.com/2014/11/03/all-in-all-another-brick-in-the-motte/). I think that _if he wanted to_, Eliezer Yudkowsky could think of some relevant differences between "2 + 2 = 4" and "the simplest and best protocol is, 'He' refers to the set of people who have asked us to use 'he'".

-[TODO:
-
-topic sentence
-
-https://twitter.com/ESYudkowsky/status/1404697716689489921
-> I have never in my own life tried to persuade anyone to go trans (or not go trans)—I don't imagine myself to understand others that much.

+If you think it's "sometimes personally prudent and not community-harmful" to go out of your way to say positive things about Republican candidates and never, ever say positive things about Democratic candidates (because you live in a red state and "don't see what the alternative is besides getting shot"), you can see why people might regard you as a Republican shill, even if all the things you said were true. If you tried to defend yourself against the charge of being a Republican shill by pointing out that you've never told any specific individual, "You should vote Republican," that's a nice motte that might work on some people, but you shouldn't expect seasoned and devoted rationalists to fall for it.

+Similarly, when Yudkowsky [wrote in June 2021](https://twitter.com/ESYudkowsky/status/1404697716689489921), "I have never in my own life tried to persuade anyone to go trans (or not go trans)—I don't imagine myself to understand others that much", it was a great motte. I don't doubt the literal motte stated literally.

+And yet it seems worth noticing that shortly after proclaiming in March 2016 that he was "over 50% probability at this point that at least 20% of the ones with penises are actually women", he made [a followup post gloating about causing someone's transition](https://www.facebook.com/yudkowsky/posts/10154110278349228):

-If you think it's "sometimes personally prudent and not community-harmful" to go out of your way to say positive things about Republican candidates and never, ever say positive things about Democratic candidates (because you "don't see what the alternative is besides getting shot"), you can see why people might regard you as a Republican shill—even if all the things you said were true, and even if you never told any specific individual, "You should vote Republican."

-https://www.facebook.com/yudkowsky/posts/10154110278349228
> Just checked my filtered messages on Facebook and saw, "Your post last night was kind of the final thing I needed to realize that I'm a girl."
> ==DOES ALL OF THE HAPPY DANCE FOREVER==

+In the comments, he added:
+
> Atheists: 1000+ Anorgasmia: 2 Trans: 1

-https://twitter.com/ESYudkowsky/status/1404821285276774403
-> It is not trans-specific. When people tell me I helped them, I mostly believe them and am happy.

+He [later clarified on Twitter](https://twitter.com/ESYudkowsky/status/1404821285276774403), "It is not trans-specific. When people tell me I helped them, I mostly believe them and am happy." 
-]

+But if Stalin is committed to convincing gender-dysphoric males that they need to cut their dicks off, and you're committed to not disagreeing with Stalin, you _shouldn't_ mostly believe it when gender-dysphoric males thank you for providing the final piece of evidence they needed to realize that they need to cut their dicks off, for the same reason a self-aware Republican shill shouldn't take it literally when people thank him for warning them against Democrat treachery. We know—he's told us very clearly—that Yudkowsky isn't trying to provide gender-dysphoric people with the full state of information that they would need to decide on the optimal quality-of-life interventions. He's playing on a different chessboard.

-"[P]eople do _know_ they're living in a half-Stalinist environment," Yudkowsky says. "I think people are better off at the end of that," he says. But who are "people", specifically? One of the problems with utilitarianism is that it doesn't interact well with game theory. If a policy makes most people better off, at the cost of throwing a few others under the bus, is enacting that policy the right thing to do?
+"[P]eople do _know_ they're living in a half-Stalinist environment," Yudkowsky claims. "I think people are better off at the end of that," he says. But who are "people", specifically? One of the problems with utilitarianism is that it doesn't interact well with game theory. If a policy makes most people better off, at the cost of throwing a few others under the bus, is enacting that policy the right thing to do?

-Depending on the details, maybe—but you probably shouldn't expect the victims to meekly go under the wheels without a fight. That's why I've been telling you this 100,000-word sob story about how _I_ didn't know, and _I'm_ not better off.
+Depending on the details, maybe—but you probably shouldn't expect the victims to meekly go under the wheels without a fight. That's why I've [been](/2023/Jul/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer/) [telling](/2023/Jul/a-hill-of-validity-in-defense-of-meaning/) [you](/2023/Dec/if-clarity-seems-like-death-to-them/) this 100,000-word sob story about how _I_ didn't know, and _I'm_ not better off.

In [one of Yudkowsky's roleplaying fiction threads](https://www.glowfic.com/posts/4508), Thellim, a woman hailing from [a saner alternate version of Earth called dath ilan](https://www.lesswrong.com/tag/dath-ilan), [expresses horror and disgust at how shallow and superficial the characters in Jane Austen's _Pride and Prejudice_ are, in contrast to what a human being _should_ be](https://www.glowfic.com/replies/1592898#reply-1592898):

@@ -508,44 +505,38 @@ Scott Alexander chose Feelings, but I can't really hold that against him, becaus

[^hexaco]: The authors of the [HEXACO personality model](https://en.wikipedia.org/wiki/HEXACO_model_of_personality_structure) may have gotten something importantly right in [grouping "honesty" and "humility" as a single factor](https://en.wikipedia.org/wiki/Honesty-humility_factor_of_the_HEXACO_model_of_personality).

-Eliezer Yudkowsky did not _unambiguously_ choose Feelings. He's been very careful with his words to strategically mood-affiliate with the side of Feelings, without consciously saying anything that he consciously knows to be unambiguously false. And the reason I can hold it against _him_ is because Eliezer Yudkowsky does not identify as just some guy with a blog. Eliezer Yudkowsky is _absolutely_ trying to be a religious leader. 
He markets himself as a master of the hidden Bayesian structure of cognition, who ["aspires to make sure [his] departures from perfection aren't noticeable to others"](https://twitter.com/ESYudkowsky/status/1384671335146692608). He [complains that "too many people think it's unvirtuous to shut up and listen to [him]"](https://twitter.com/ESYudkowsky/status/1509944888376188929).
+Eliezer Yudkowsky did not _unambiguously_ choose Feelings. He's been very careful with his words to strategically mood-affiliate with the side of Feelings, without consciously saying anything that he consciously knows to be unambiguously false. And the reason I can hold it against _him_ is because Eliezer Yudkowsky does not identify as just some guy with a blog. Eliezer Yudkowsky is _absolutely_ trying to be a religious leader. He markets himself as a master of the hidden Bayesian structure of cognition, who ["aspires to make sure [his] departures from perfection aren't noticeable to others"](https://twitter.com/ESYudkowsky/status/1384671335146692608), who [complains that "too many people think it's unvirtuous to shut up and listen to [him]"](https://twitter.com/ESYudkowsky/status/1509944888376188929).

In making such boasts, I think Yudkowsky is opting in to being held to higher standards than other mortals. If Scott Alexander gets something wrong when I was trusting him to be right, that's disappointing, but I'm not the victim of false advertising, because Scott Alexander doesn't claim to be anything more than some guy with a blog. If I trusted him more than that, that's on me.

-If Eliezer Yudkowsky gets something wrong when I was trusting him to be right, and refuses to acknowledge corrections (in the absence of an unsustainable 21-month nagging campaign) and keeps inventing new galaxy-brained ways to be wrong in the service of his political agenda of being seen to agree with Stalin without technically lying, then I think I _am_ the victim of false advertising. His marketing bluster was optimized to trick people like me into trusting him, even if my being dumb enough to believe him is on me.
+If Eliezer Yudkowsky gets something wrong when I was trusting him to be right, and refuses to acknowledge corrections (in the absence of an unsustainable 21-month nagging campaign) and keeps inventing new galaxy-brained ways to be wrong in the service of his political agenda of being seen to agree with Stalin without technically lying, then I think I _am_ the victim of false advertising. His marketing bluster was designed to trick people like me into trusting him, even if my being dumb enough to believe him is on me.

-Because, I did, actually, trust him. Back in 2009 when _Less Wrong_ was new, we had a thread of hyperbolic ["Eliezer Yudkowsky Facts"](https://www.lesswrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts) (in the style of [Chuck Norris facts](https://en.wikipedia.org/wiki/Chuck_Norris_facts)). And of course, it was a joke, but the hero-worship that make the joke funny was real. (You wouldn't make those jokes for your community college physics teacher, even if they were a good teacher.)
+Because, I did, actually, trust him. Back in 2009 when _Less Wrong_ was new, we had a thread of hyperbolic ["Eliezer Yudkowsky Facts"](https://www.lesswrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts) (in the style of [Chuck Norris facts](https://en.wikipedia.org/wiki/Chuck_Norris_facts)). And of course, it was a joke, but the joke was an over-the-top exaggeration of a hero worship that was very real. 
(You wouldn't make those jokes for your community college physics teacher, even if they were a good teacher.)

["Never go in against Eliezer Yudkowsky when anything is on the line"](https://www.greaterwrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts/comment/Aq9eWJmK6Liivn8ND), said one of the facts—and back then, I didn't think I would _need_ to.

-[TODO—
-
- * Back then, he was trustworthy—and part of what made him so trustworthy was specifically that he was clearly trying to help people think for themselves. He didn't think
-
-https://www.lesswrong.com/posts/t6Fe2PsEwb3HhcBEr/the-litany-against-gurus
-
-https://www.lesswrong.com/posts/wustx45CPL5rZenuo/no-safe-defense-not-even-science
-> I'm not sure that human beings realistically _can_ trust and think at the same time.

+What made him so trustworthy back then was that he wasn't asking for trust. He clearly _did_ think it was [unvirtuous to just shut up and listen to him](https://www.lesswrong.com/posts/t6Fe2PsEwb3HhcBEr/the-litany-against-gurus): "I'm not sure that human beings realistically _can_ trust and think at the same time," [he wrote](https://www.lesswrong.com/posts/wustx45CPL5rZenuo/no-safe-defense-not-even-science). He was always arrogant, but it was tempered by the expectation of being held to account by arguments; he wrote about ["avoid[ing] sending [his] brain signals which tell it that [he was] high-status, just in case that cause[d his] brain to decide it [was] no longer necessary"](https://www.lesswrong.com/posts/cgrvvp9QzjiFuYwLi/high-status-and-stupidity-why).

-https://www.lesswrong.com/posts/cgrvvp9QzjiFuYwLi/high-status-and-stupidity-why
-> I try in general to avoid sending my brain signals which tell it that I am high-status, just in case that causes my brain to decide it is no longer necessary. In fact I try to avoid sending my brain signals which tell it that I have achieved acceptance in my tribe. When my brain begins thinking something that generates a sense of high status within the tribe, I stop thinking that thought.

+He visibly [cared about other people being in touch with reality](https://www.lesswrong.com/posts/anCubLdggTWjnEvBS/your-rationality-is-my-business). "I've informed a number of male college students that they have large, clearly detectable body odors. In every single case so far, they say nobody has ever told them that before," [he wrote](https://www.greaterwrong.com/posts/kLR5H4pbaBjzZxLv6/polyhacking/comment/rYKwptdgLgD2dBnHY). (I can testify that this is true: while sharing a car ride with Anna Salamon in 2011, he told me I had B.O.)

- * He visibly cared about people—Earth people—being in touch with reality, as with his body odors comment (I can testify that he actually told me about my B.O. in a car ride with Anna Salamon in 2011)
- * This is above-and-beyond truth-encouraging behavior
-
-> I've informed a number of male college students that they have large, clearly detectable body odors. In every single case so far, they say nobody has ever told them that before.
-https://www.greaterwrong.com/posts/kLR5H4pbaBjzZxLv6/polyhacking/comment/rYKwptdgLgD2dBnHY

+Informing people about their body odor is above-and-beyond truth-telling behavior: it's an area where people would benefit from feedback (if you know, you can invest in deodorant), but aren't getting that feedback by default (because no one wants to be so rude as to tell people they smell bad).

+[TODO—
 * There's an obvious analogy between telling people they have B.O. and telling trans people that they don't pass. 
 * It's not that I expect him to do it. (I don't do that, either.) But I'd expect him to _notice_ it as a pro-Truth action. His pattern of public statements suggests he doesn't even notice!!
+ * The EY who cared about people being in touch with reality is dead now.

- * In contrast, he now wonders if trying to teach people was a mistake
- * I complained in Discord that this amounted to giving up on the concept of intellectual honesty
- * He put a checkmark on it.

+In a Discord discussion, [he remarked](yudkowsky-i_might_have_made_a_fundamental_mistake.png):
+
+> I might have made a fundamental mistake when I decided, long ago, that I was going to try to teach people how to reason so that they'd be able to process my arguments about AGI and AGI alignment through a mechanism that would discriminate true from false statements.
+>
+> maybe I should've just learned to persuade people of things instead
+
+ * I got offended; I complained in Discord that this amounted to giving up on the concept of intellectual honesty; davis-amounts-to-giving-up-on-the-concept-of-intellectual-honesty.png
+ * He put a checkmark on it.
]

-[Yudkowsky writes](https://twitter.com/ESYudkowsky/status/1096769579362115584):
+The modern [Yudkowsky writes](https://twitter.com/ESYudkowsky/status/1096769579362115584):

> When an epistemic hero seems to believe something crazy, you are often better off questioning "seems to believe" before questioning "crazy", and both should be questioned before shaking your head sadly about the mortal frailty of your heroes.