From: M. Taylor Saotome-Westlake Date: Mon, 27 Jun 2022 00:17:02 +0000 (-0700) Subject: "A Hill" tap, shovel X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=f6b1cc69a7171bba8c61b1c7b46af347b8ff117b;p=Ultimately_Untrue_Thought.git "A Hill" tap, shovel --- diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md index 0b4e1cf..9ed8596 100644 --- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md +++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md @@ -28,10 +28,12 @@ At this point, I was _disappointed_ with my impact, but not to the point of bear ... and, really, that _should_ have been the end of the story. Not much of a story at all. If I hadn't been further provoked, I would have still kept up this blog, and I still would have ended up arguing about gender with people occasionally, but my personal obsession wouldn't have been the occasion of a full-on religious civil war. -[TODO: I was at the company offsite browsing Twitter (which I had recently joined with fantasies of self-cancelling) when I saw the "Hill of Validity in Defense of Meaning"] +[TODO: I was at the company offsite browsing Twitter (which I had recently joined with fantasies of self-cancelling) after already having just spent a lot of time arguing with people about gender on Twitter, when I saw the "Hill of Validity in Defense of Meaning"] This is the moment where I _flipped the fuck out_. +[TODO: if everyone else did it, fine; if Yudkowsky did it ...] + > ["It is a common misconception that you can define a word any way you like. [...] 
If you believe that you can 'define a word any way you like', without realizing that your brain goes on categorizing without your conscious oversight, then you won't take the effort to choose your definitions wisely."](https://www.lesswrong.com/posts/3nxs2WYDGzJbzcLMp/words-as-hidden-inferences) > ["So that's another reason you can't 'define a word any way you like': You can't directly program concepts into someone else's brain."](https://www.lesswrong.com/posts/HsznWM9A7NiuGsp28/extensions-and-intensions) diff --git a/content/drafts/challenges-coda.md b/content/drafts/challenges-coda.md deleted file mode 100644 index 0623184..0000000 --- a/content/drafts/challenges-coda.md +++ /dev/null @@ -1,252 +0,0 @@ -Title: I Don't Trust Eliezer Yudkowsky's Intellectual Honesty -Date: 2022-01-01 11:00 -Category: commentary -Tags: Eliezer Yudkowsky -Status: draft - -> If you are silent about your pain, they'll kill you and say you enjoyed it. -> -> —Zora Neale Hurston - -### Summary - - * [TODO] - -In ["Challenges to Yudkowsky's Pronoun Reform Proposal"](TODO: link), I analyze a proposal by Eliezer Yudkowsky to normatively redefine the English pronouns _she_ and _he_ as referring to those who have asked us to use those pronouns. That object-level argument about pronoun conventions is probably not very interesting to many readers. For those who find my arguments persuasive (perhaps just from reading the summary bullet points rather than the entire post), what is perhaps more interesting is to jump up a meta level and ask: why is Eliezer Yudkowsky playing dumb like this? If his comments obviously can't be taken seriously, what's _actually_ going on here? - -(But on pain of committing [Bulverism](https://en.wikipedia.org/wiki/Bulverism), it's important that I wrote up the object-level counterargument _first_ as its own post, before engaging in [the fraught endeavor of speculating about psychology](https://arbital.com/p/psychologizing/) in this separate post.) 
- -Fortunately, Yudkowsky graciously grants us a clue in the form of [a disclaimer comment](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228) on the post: - -> It unfortunately occurs to me that I must, in cases like these, disclaim that—to the extent there existed sensible opposing arguments against what I have just said—people might be reluctant to speak them in public, in the present social atmosphere. That is, in the logical counterfactual universe where I knew of very strong arguments against freedom of pronouns, I would have probably stayed silent on the issue, as would many other high-profile community members, and only Zack M. Davis would have said anything where you could hear it. -> -> This is a filter affecting your evidence; it has not to my own knowledge filtered out a giant valid counterargument that invalidates this whole post. I would have kept silent in that case, for to speak then would have been dishonest. - -I claim that my point that _she_ and _he_ already have existing meanings that you can't just ignore by fiat given that the existing meanings are _exactly_ what motivate people to ask for new pronouns in the first place, is a giant valid counterargument that invalidates the claim that "the simplest and best protocol is, '"He" refers to the set of people who have asked us to use "he", with a default for those-who-haven't-asked that goes by gamete size' and to say that this just _is_ the normative definition." (It doesn't invalidate the whole post: the part about English being badly designed is fine.) - -Moreover, I claim that this point is _obvious_. This is not a _subtle_ objection. Any sane adult—really, any bright seven-year-old in the year 2016—who puts _any thought whatsoever_ into thinking about why someone might object to the pronouns-by-self-identity convention is going to think of this. 
And yet, remarkably, Yudkowsky continues: - -> Personally, I'm used to operating without the cognitive support of a civilization in controversial domains, and have some confidence in my own ability to independently invent everything important that would be on the other side of the filter and check it myself before speaking. So you know, from having read this, that I checked all the speakable and unspeakable arguments I had thought of, and concluded that this speakable argument would be good on net to publish, as would not be the case if I knew of a stronger but unspeakable counterargument in favor of Gendered Pronouns For Everyone and Asking To Leave The System Is Lying. -> -> But the existence of a wide social filter like that should be kept in mind; to whatever quantitative extent you don't trust your ability plus my ability to think of valid counterarguments that might exist, as a Bayesian you should proportionally update in the direction of the unknown arguments you speculate might have been filtered out. - -So, the explanation of [the problem of political censorship filtering evidence](https://www.lesswrong.com/posts/DoPo4PDjgSySquHX8/heads-i-win-tails-never-heard-of-her-or-selective-reporting) here is great, but the part where Yudkowsky claims "confidence in [his] own ability to independently invent everything important that would be on the other side of the filter" is utterly _laughable_. - -In some ways, it would be _less_ embarrassing for Yudkowsky if he were just outright lying about having tried to think of counterarguments. The original post isn't _that_ bad if you assume that Yudkowsky was writing off the cuff, that he clearly just _didn't put any effort whatsoever_ into thinking about why someone might object. If he _did_ put in the effort—enough that he felt comfortable bragging about his ability to see "everything important" (!!) 
on the other side of the argument—and _still_ ended up proclaiming his "simplest and best protocol" without even so much as _mentioning_ any of its incredibly obvious costs ... that's just _pathetic_. If someone's ability to explore the space of arguments is _that_ bad, why would you trust their opinion about _anything_?
-
-But perhaps it's premature to judge Yudkowsky without appreciating what tight constraints he labors under. The disclaimer comment mentions "speakable and unspeakable arguments"—but what, exactly, is the boundary of the "speakable"? In response to a commenter mentioning the cost of having to remember pronouns as a potential counterargument, Yudkowsky [offers us another clue](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228&reply_comment_id=10159421871809228):
-
-> People might be able to speak that. A clearer example of a forbidden counterargument would be something like e.g. imagine if there was a pair of experimental studies somehow proving that (a) everybody claiming to experience gender dysphoria was lying, and that (b) they then got more favorable treatment from the rest of society. We wouldn't be able to talk about that. No such study exists to the best of my own knowledge, and in this case we might well hear about it from the other side to whom this is the exact opposite of unspeakable; but that would be an example.
-
-(As an aside, the wording of "we might well hear about it from _the other side_" (emphasis mine) is _very_ interesting, suggesting that the so-called "rationalist" community is, effectively, a partisan institution, despite its claims to be about advancing the generically human art of systematically correct reasoning.)
-
-I think (a) and (b) _as stated_ are clearly false, so "we" (who?) fortunately aren't losing much by allegedly not being able to speak them. But what about some _similar_ hypotheses, that might be similarly unspeakable for similar reasons?
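Yudkowsky's remark that, as a Bayesian, you should "proportionally update in the direction of the unknown arguments you speculate might have been filtered out" admits a simple quantitative illustration. Here is a toy two-hypothesis sketch (all probabilities are invented for illustration, not drawn from anything Yudkowsky wrote): the stronger the filter, the less the _absence_ of a published counterargument should move you.

```python
from fractions import Fraction

def p_counterargument_given_silence(prior, p_publish, f):
    """Toy model: P(a strong counterargument exists | none was published).

    prior     -- prior probability that a strong counterargument exists
    p_publish -- probability it would get published if speech were free
    f         -- strength of the social filter (probability that speech
                 about it is suppressed)
    """
    # If the counterargument exists, we observe silence unless it both
    # survives the filter and gets published.
    p_silence_given_exists = 1 - (1 - f) * p_publish
    # If it doesn't exist, silence is guaranteed.
    numerator = prior * p_silence_given_exists
    return numerator / (numerator + (1 - prior) * 1)

prior = Fraction(1, 2)       # 50/50 prior, purely for illustration
p_publish = Fraction(9, 10)  # would very likely be published if speakable

no_filter = p_counterargument_given_silence(prior, p_publish, Fraction(0))
heavy_filter = p_counterargument_given_silence(prior, p_publish, Fraction(9, 10))
# With no filter, silence is strong evidence of absence (posterior 1/11);
# with a 90% filter, silence barely moves the needle (posterior 91/191).
```

Under these made-up numbers, silence drives the posterior down to about 9% when speech is free, but only to about 48% under a heavy filter—which is why the _strength_ of the filter, and not merely its acknowledged existence, is what matters.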
-
-Instead of (a), consider the claim that (a′) self-reports about gender dysphoria are substantially distorted by [socially-desirable responding tendencies](https://en.wikipedia.org/wiki/Social-desirability_bias)—as a notable and common example, heterosexual males with [sexual fantasies about being female](http://www.annelawrence.com/autogynephilia_&_MtF_typology.html) [often falsely deny or minimize the erotic dimension of their desire to change sex](/papers/blanchard-clemmensen-steiner-social_desirability_response_set_and_systematic_distortion.pdf). (The idea that self-reports can be motivatedly inaccurate without the subject consciously "lying" should not be novel to someone who co-blogged with [Robin Hanson](https://en.wikipedia.org/wiki/The_Elephant_in_the_Brain) for years!)
-
-And instead of (b), consider the claim that (b′) transitioning is socially rewarded within particular _subcultures_ (although not Society as a whole), such that many of the same people wouldn't think of themselves as trans or even gender-dysphoric if they lived in a different subculture.
-
-I claim that (a′) and (b′) are _overwhelmingly likely to be true_. Can "we" talk about _that_? Are (a′) and (b′) "speakable", or not?
-
-We're unlikely to get clarification from Yudkowsky, but based on my experiences with the so-called "rationalist" community over the past coming-up-on-six years—I'm going to _guess_ that the answer is broadly No: no, "we" can't talk about that.
-
-But if I'm right that (a′) and (b′) should be live hypotheses and that Yudkowsky would consider them "unspeakable", that means "we" can't talk about what's _actually going on_ with gender dysphoria and transsexuality, which puts the whole discussion in a different light.
In another comment, Yudkowsky lists some gender-transition interventions he named in [a November 2018 Twitter thread](https://twitter.com/ESYudkowsky/status/1067183500216811521) that was the precursor to the present discussion—using a different bathroom, changing one's name, asking for new pronouns, and getting sex reassignment surgery—and notes that none of these are calling oneself a "woman". [He continues](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421986539228&reply_comment_id=10159424960909228): - -> [Calling someone a "woman"] _is_ closer to the right sort of thing _ontologically_ to be true or false. More relevant to the current thread, now that we have a truth-bearing sentence, we can admit of the possibility of using our human superpower of language to _debate_ whether this sentence is indeed true or false, and have people express their nuanced opinions by uttering this sentence, or perhaps a more complicated sentence using a bunch of caveats, or maybe using the original sentence uncaveated to express their belief that this is a bad place for caveats. Policies about who uses what bathroom also have consequences and we can debate the goodness or badness (not truth or falsity) of those policies, and utter sentences to declare our nuanced or non-nuanced position before or after that debate. -> -> Trying to pack all of that into the pronouns you'd have to use in step 1 is the wrong place to pack it. - -Sure, _if we were in the position of designing a constructed language from scratch_ under current social conditions in which a person's "gender" is a contested social construct, rather than their sex an objective and undisputed fact, then yeah: in that situation _which we are not in_, you definitely wouldn't want to pack sex or gender into pronouns. 
But it's a disingenuous derailing tactic to grandstand about how people need to alter the semantics of their _already existing_ native language so that we can discuss the real issues under an allegedly superior pronoun convention when, _by your own admission_, you have _no intention whatsoever of discussing the real issues!_ - -(Lest the "by your own admission" clause seem too accusatory, I should note that given constant behavior, admitting it is _much_ better than not-admitting it; so, huge thanks to Yudkowsky for the transparency on this point!) - -Again, as discussed in "Challenges to Yudkowsky's Pronoun Reform Proposal", a comparison to [the _tú_/_usted_ distinction](https://en.wikipedia.org/wiki/Spanish_personal_pronouns#T%C3%BA/vos_and_usted) is instructive. It's one thing to advocate for collapsing the distinction and just settling on one second-person singular pronoun for the Spanish language. That's principled. - -It's quite another thing altogether to _simultaneously_ try to prevent a speaker from using _tú_ to indicate disrespect towards a social superior (on the stated rationale that the _tú_/_usted_ distinction is dumb and shouldn't exist), while _also_ refusing to entertain or address the speaker's arguments explaining _why_ they think their interlocutor is unworthy of the deference that would be implied by _usted_ (because such arguments are "unspeakable" for political reasons). That's just psychologically abusive. - -If Yudkowsky _actually_ possessed (and felt motivated to use) the "ability to independently invent everything important that would be on the other side of the filter and check it [himself] before speaking", it would be _obvious_ to him that "Gendered Pronouns For Everyone and Asking To Leave The System Is Lying" isn't the hill anyone would care about dying on if it weren't a Schelling point. 
A lot of TERF-adjacent folk would be _overjoyed_ to concede the (boring, insubstantial) matter of pronouns as a trivial courtesy if it meant getting to _actually_ address their real concerns of "Biological Sex Actually Exists", and ["Biological Sex Cannot Be Changed With Existing or Foreseeable Technology"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions) and "Biological Sex Is Sometimes More Relevant Than Self-Declared Gender Identity." The reason so many of them are inclined to stand their ground and not even offer the trivial courtesy is because they suspect, correctly, that the matter of pronouns is being used as a rhetorical wedge to try to prevent people from talking or thinking about sex. - ----- - -Having analyzed the _ways_ in which Yudkowsky is playing dumb here, what's still not entirely clear is _why_. Presumably he cares about maintaining his credibility as an insightful and fair-minded thinker. Why tarnish that by putting on this haughty performance? - -Of course, presumably he _doesn't_ think he's tarnishing it—but why not? [He graciously explains in the Facebook comments](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228&reply_comment_id=10159421901809228): - -> it is sometimes personally prudent and not community-harmful to post your agreement with Stalin about things you actually agree with Stalin about, in ways that exhibit generally rationalist principles, especially because people do _know_ they're living in a half-Stalinist environment [...] I think people are better off at the end of that. - -Ah, _prudence_! He continues: - -> I don't see what the alternative is besides getting shot, or utter silence about everything Stalin has expressed an opinion on including "2 + 2 = 4" because if that logically counterfactually were wrong you would not be able to express an opposing opinion. 
-
-The problem with trying to "exhibit generally rationalist principles" in a line of argument that you're constructing in order to be prudent and not community-harmful, is that you're thereby necessarily _not_ exhibiting the central rationalist principle that what matters is the process that _determines_ your conclusion, not the reasoning you present to _reach_ your presented conclusion, after the fact.
-
-The best explanation of this I know was authored by Yudkowsky himself in 2007, in a post titled ["A Rational Argument"](https://www.lesswrong.com/posts/9f5EXt8KNNxTAihtZ/a-rational-argument). It's worth quoting at length. The Yudkowsky of 2007 invites us to consider the plight of a political campaign manager:
-
-> As a campaign manager reading a book on rationality, one question lies foremost on your mind: "How can I construct an impeccable rational argument that Mortimer Q. Snodgrass is the best candidate for Mayor of Hadleyburg?"
->
-> Sorry. It can't be done.
->
-> "What?" you cry. "But what if I use only valid support to construct my structure of reason? What if every fact I cite is true to the best of my knowledge, and relevant evidence under Bayes's Rule?"
->
-> Sorry. It still can't be done. You defeated yourself the instant you specified your argument's conclusion in advance.
-
-The campaign manager is in possession of a survey of mayoral candidates on which Snodgrass compares favorably to other candidates, except for one question. The post continues (bolding mine):
-
-> So you are tempted to publish the questionnaire as part of your own campaign literature ... with the 11th question omitted, of course.
->
-> **Which crosses the line between _rationality_ and _rationalization_.** It is no longer possible for the voters to condition on the facts alone; they must condition on the additional fact of their presentation, and infer the existence of hidden evidence.
->
-> Indeed, **you crossed the line at the point where you considered whether the questionnaire was favorable or unfavorable to your candidate, before deciding whether to publish it.** "What!" you cry. "A campaign should publish facts unfavorable to their candidate?" But put yourself in the shoes of a voter, still trying to select a candidate—why would you censor useful information? You wouldn't, if you were genuinely curious. If you were flowing _forward_ from the evidence to an unknown choice of candidate, rather than flowing _backward_ from a fixed candidate to determine the arguments.
-
-The post then briefly discusses the idea of a "logical" argument, one whose conclusions follow from its premises. "All rectangles are quadrilaterals; all squares are quadrilaterals; therefore, all squares are rectangles" is given as an example of _illogical_ argument, even though both premises are true (all rectangles and squares are in fact quadrilaterals) _and_ the conclusion is true (all squares are in fact rectangles). The problem is that the conclusion doesn't _follow_ from the premises; the _reason_ all squares are rectangles isn't _because_ they're both quadrilaterals. If we accepted arguments of the general _form_ "all A are C; all B are C; therefore all A are B", we would end up believing nonsense.
-
-Yudkowsky's conception of a "rational" argument—at least, Yudkowsky's conception in 2007, which the Yudkowsky of the current year seems to disagree with—has a similar flavor: the stated reasons should be the actual reasons. The post concludes:
-
-> If you really want to present an honest, rational argument _for your candidate_, in a political campaign, there is only one way to do it:
->
-> * _Before anyone hires you_, gather up all the evidence you can about the different candidates.
-> * Make a checklist which you, yourself, will use to decide which candidate seems best.
-> * Process the checklist.
-> * Go to the winning candidate.
-> * Offer to become their campaign manager. -> * When they ask for campaign literature, print out your checklist. -> -> Only in this way can you offer a _rational_ chain of argument, one whose bottom line was written flowing _forward_ from the lines above it. Whatever _actually_ decides your bottom line is the only thing you can _honestly_ write on the lines above. - -I remember this being pretty shocking to read back in 'aught-seven. What an alien mindset! But it's _correct_. You can't rationally argue "for" a chosen conclusion, because only the process you use to _decide what to argue for_ can be your real reason. - -This is a shockingly high standard for anyone to aspire to live up to—but what made Yudkowsky's Sequences so life-changingly valuable, was that they articulated the _existence_ of such a standard. For that, I will always be grateful. - -... which is why it's so _bizarre_ that the Yudkowsky of the current year acts like he's never heard of it! If your _actual_ bottom line is that it is sometimes personally prudent and not community-harmful to post your agreement with Stalin, then sure, you can _totally_ find something you agree with to write on the lines above! Probably something that "exhibits generally rationalist principles", even! It's just that any rationalist who sees the game you're playing is going to correctly identify you as a partisan hack on this topic and take that into account when deciding whether they can trust you on other topics. - -"I don't see what the alternative is besides getting shot," Yudkowsky muses (where presumably, 'getting shot' is a metaphor for a large negative utility, like being unpopular with progressives). Yes, an astute observation! And _any other partisan hack could say exactly the same_, for the same reason. Why does the campaign manager withhold the results of the 11th question? Because he doesn't see what the alternative is besides getting shot. 
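The checklist procedure from "A Rational Argument" can be contrasted with the 11th-question maneuver in a toy simulation (candidate names other than Snodgrass, the scores, and the scoring scheme are all invented for illustration): flowing _forward_ picks whichever candidate the evidence favors, while flowing _backward_ always "concludes" its predetermined client and simply omits the unfavorable questions.

```python
import random

CANDIDATES = ["Snodgrass", "Beaumont", "Carruthers"]  # hypothetical slate
NUM_QUESTIONS = 11

def run_survey(rng):
    """Toy survey: a random score per candidate per question."""
    return {c: [rng.random() for _ in range(NUM_QUESTIONS)]
            for c in CANDIDATES}

def flow_forward(scores):
    """Rationality: total up all the evidence, then pick the winner."""
    return max(CANDIDATES, key=lambda c: sum(scores[c]))

def flow_backward(scores, client):
    """Rationalization: fix the conclusion first, then publish only the
    questions on which the client compares favorably."""
    favorable = [q for q in range(NUM_QUESTIONS)
                 if scores[client][q] == max(scores[c][q] for c in CANDIDATES)]
    return client, favorable  # the unfavorable questions are omitted

rng = random.Random(2007)
scores = run_survey(rng)
winner = flow_forward(scores)                        # a function of the evidence
conclusion, published = flow_backward(scores, "Snodgrass")
# flow_backward's bottom line never depends on the survey at all.
```

The point of the sketch is that `flow_backward` returns its client no matter what `scores` contains; only `flow_forward`'s output is actually determined by the evidence.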
-
-If the idea of being fired from the Snodgrass campaign or being unpopular with progressives is _so_ terrifying to you that it seems analogous to getting shot, then, if those are really your true values, sure—say whatever you need to say to keep your job and your popularity, as is personally prudent. You've set your price. But if the price you put on the intellectual integrity of your so-called "rationalist" community is similar to that of the Snodgrass for Mayor campaign, you shouldn't be surprised if intelligent, discerning people accord similar levels of credibility to the two groups' output.
-
-I see the phrase "bad faith" thrown around more than I think people know what it means. "Bad faith" doesn't mean "with ill intent", and it's more specific than "dishonest": it's [adopting the surface appearance of being moved by one set of motivations, while actually acting from another](https://en.wikipedia.org/wiki/Bad_faith).
-
-For example, an [insurance company employee](https://en.wikipedia.org/wiki/Claims_adjuster) who goes through the motions of investigating your claim while privately intending to deny it might never consciously tell an explicit "lie", but is definitely acting in bad faith: they're asking you questions, demanding evidence, _&c._ in order to _make it look like_ you'll get paid if you prove the loss occurred—whereas in reality, you're just not going to be paid. Your responses to the claim inspector aren't completely causally _inert_: if you can make an extremely strong case that the loss occurred as you say, then the claim inspector might need to put some effort into coming up with some ingenious excuse to deny your claim in ways that exhibit general claim-inspection principles. But at the end of the day, the inspector is going to say what they need to say in order to protect the company's loss ratio, as is personally prudent.
- -With this understanding of bad faith, we can read Yudkowsky's "it is sometimes personally prudent [...]" comment as admitting that his behavior on politically-charged topics is in bad faith—where "bad faith" isn't a meaningless insult, but [literally refers](http://benjaminrosshoffman.com/can-crimes-be-discussed-literally/) to the pretending-to-have-one-set-of-motivations-while-acting-according-to-another behavior, such that accusations of bad faith can be true or false. Yudkowsky will never consciously tell an explicit "lie", but he'll go through the motions to _make it look like_ he's genuinely engaging with questions where I need the right answers in order to make extremely impactful social and medical decisions—whereas in reality, he's only going to address a selected subset of the relevant evidence and arguments that won't get him in trouble with progressives. - -To his credit, he _will_ admit that he's only willing to address a selected subset of arguments—but while doing so, he claims an absurd "confidence in [his] own ability to independently invent everything important that would be on the other side of the filter and check it [himself] before speaking" while _simultaneously_ blatantly mischaracterizing his opponents' beliefs! ("Gendered Pronouns For Everyone and Asking To Leave The System Is Lying" doesn't pass anyone's [ideological Turing test](https://www.econlib.org/archives/2011/06/the_ideological.html).) - -Counterarguments aren't completely causally _inert_: if you can make an extremely strong case that Biological Sex Is Sometimes More Relevant Than Self-Declared Gender Identity, Yudkowsky will put some effort into coming up with some ingenious excuse for why he _technically_ never said otherwise, in ways that exhibit generally rationalist principles. But at the end of the day, Yudkowsky is going to say what he needs to say in order to protect his reputation, as is personally prudent. 
-
-Even if one were to agree with this description of Yudkowsky's behavior, it doesn't immediately follow that Yudkowsky is making the wrong decision. Again, "bad faith" is meant as a literal description that makes predictions about behavior, not a contentless attack—maybe there are some circumstances in which engaging in some amount of bad faith is the right thing to do, given the constraints one faces! For example, when talking to people on Twitter with a very different ideological background from me, I sometimes anticipate that if my interlocutor knew what I was actually thinking, they wouldn't want to talk to me, so I engage in a bit of what is sometimes called ["concern trolling"](https://geekfeminism.fandom.com/wiki/Concern_troll): I take care to word my replies in a way that makes it look like I'm more ideologically aligned with them than I actually am. (For example, I [never say "assigned female/male at birth" in my own voice on my own platform](/2019/Sep/terminology-proposal-developmental-sex/), but I'll do it in an effort to speak my interlocutor's language.) I think of this as the _minimal_ amount of strategic bad faith needed to keep the conversation going, to get my interlocutor to evaluate my argument on its own merits, rather than rejecting it for coming from an ideological enemy. In cases such as these, I'm willing to defend my behavior as acceptable—there _is_ a sense in which I'm being deceptive by optimizing my language choice to make my interlocutor make bad guesses about my ideological alignment, but I'm comfortable with that amount and scope of deception in the service of correcting the distortion where I don't think my interlocutor _should_ be paying attention to my personal alignment.
-
-That is, my bad faith concern-trolling gambit of deceiving people about my ideological alignment in the hopes of improving the discussion seems like something that makes our collective beliefs about the topic-being-argued-about _more_ accurate.
(And the topic-being-argued-about is presumably of greater collective interest than which "side" I personally happen to be on.)
-
-In contrast, the "it is sometimes personally prudent [...] to post your agreement with Stalin" gambit is the exact reverse: it's _introducing_ a distortion into the discussion in the hopes of correcting people's beliefs about the speaker's ideological alignment. (Yudkowsky is not a right-wing Bad Guy, but people would tar him as a right-wing Bad Guy if he ever said anything negative about trans people.) This doesn't improve our collective beliefs about the topic-being-argued-about; it's a _pure_ ass-covering move.
-
-Yudkowsky names the alleged fact that "people do _know_ they're living in a half-Stalinist environment" as a mitigating factor. But the _reason_ censorship is such an effective tool in the hands of dictators like Stalin is because it ensures that many people _don't_ know—and that those who know (or suspect) don't have [game-theoretic common knowledge](https://www.lesswrong.com/posts/9QxnfMYccz9QRgZ5z/the-costly-coordination-mechanism-of-common-knowledge#Dictators_and_freedom_of_speech) that others do too.
-
-Zvi Mowshowitz has [written about how the false assertion that "everybody knows" something](https://thezvi.wordpress.com/2019/07/02/everybody-knows/) is typically used to justify deception: if "everybody knows" that we can't talk about biological sex (the reasoning goes), then no one is being deceived when our allegedly truthseeking discussion carefully steers clear of any reference to the reality of biological sex when it would otherwise be extremely relevant.
-
-But if it were _actually_ the case that everybody knew (and everybody knew that everybody knew), then what would be the point of the censorship?
It's not coherent to claim that no one is being harmed by censorship because everyone knows about it, because the entire appeal and purpose of censorship is precisely that _not_ everybody knows and that someone with power wants to _keep_ it that way.
-
-For the savvy people in the know, it would certainly be _convenient_ if everyone secretly knew: then the savvy people wouldn't have to face the tough choice between acceding to Power's demands (at the cost of deceiving their readers) and informing their readers (at the cost of incurring Power's wrath).
-
-Policy debates should not appear one-sided. Faced with this kind of dilemma, I can't say that defying Power is necessarily the right choice: if there really _were_ no other options between deceiving your readers with a bad faith performance, and incurring Power's wrath, and Power's wrath would be too terrible to bear, then maybe deceiving your readers with a bad faith performance is the right thing to do.
-
-But if you actually _cared_ about not deceiving your readers, you would want to be _really sure_ that those _really were_ the only two options. You'd [spend five minutes by the clock looking for third alternatives](https://www.lesswrong.com/posts/erGipespbbzdG5zYb/the-third-alternative)—including, possibly, not issuing proclamations on your honor as leader of the so-called "rationalist" community on topics where you _explicitly intend to ignore counterarguments on grounds of their being politically unfavorable_.
Yudkowsky rejects this alternative on the grounds that it allegedly implies "utter silence about everything Stalin has expressed an opinion on including '2 + 2 = 4' because if that logically counterfactually were wrong you would not be able to express an opposing opinion", but this seems like yet another instance of Yudkowsky motivatedly playing dumb: if he _wanted_ to, I'm sure Eliezer Yudkowsky could think of _some relevant differences_ between "2 + 2 = 4" (a trivial fact of arithmetic) and "the simplest and best protocol is, '"He" refers to the set of people who have asked us to use "he"'" (a complex policy proposal whose numerous flaws I have analyzed in detail).
-
-"[P]eople do _know_ they're living in a half-Stalinist environment," Yudkowsky says to justify himself. "I think people are better off at the end of that," he says. But who are "people", specifically? I'm asking because I have a long sob story about how _I_ didn't know, and _I'm_ not better off.
-
-For a long time, I've been meaning to write up the Whole Dumb Story.
-
-
-[TODO: For a long time, I've been meaning to write up the Whole Dumb Story, but it's been hard for me to finish, and it would be too long for you to read, so I hope it's OK if I just hit the highlights]
-
-[back in the Sequences-era, ["Sexual Dimorphism in Yudkowsky's Sequences, in Relation to My Gender Problems"](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/)]
-
-In [a 26 March 2016 Facebook post](https://www.facebook.com/yudkowsky/posts/10154078468809228), he wrote—
-
-> I'm not sure if the following generalization extends to all genetic backgrounds and childhood nutritional backgrounds. There are various ongoing arguments about estrogenlike chemicals in the environment, and those may not be present in every country ...
-
-> Still, for people roughly similar to the Bay Area / European mix, I think I'm over 50% probability at this point that at least 20% of the ones with penises are actually women.
(***!?!?!?!?***)

> A lot of them don't know it or wouldn't care, because they're female-minds-in-male-bodies but also cis-by-default (lots of women wouldn't be particularly disturbed if they had a male body; the ones we know as 'trans' are just the ones with unusually strong female gender identities). Or they don't know it because they haven't heard in detail what it feels like to be gender dysphoric, and haven't realized 'oh hey that's me'. See, e.g., and

https://www.facebook.com/yudkowsky/posts/10154110278349228
> Just checked my filtered messages on Facebook and saw, "Your post last night was kind of the final thing I needed to realize that I'm a girl."
> ==DOES ALL OF THE HAPPY DANCE FOREVER==

uncritically (uncharacteristically uncritically) taking the newly-ascendant gender-identity theory for granted ("lots of women wouldn't be particularly disturbed if they had a male body; the ones we know as 'trans' are just the ones with unusually strong female gender identities"), without considering the obvious-in-retrospect hypothesis that "guy who speculates about his female analogue on a transhumanist mailing list in 2004" and "guy who thinks he might be a trans woman in Berkeley 2016" are the same guy.

[The claim that the "Hill of Validity" thread wasn't partisan, but was just neutrally trying to teach the difference between facts and policy decisions is not credible. Imagine if someone were complaining about being required to say "Peace Be Upon Him" before referencing the prophet Muhammad. "It's a speech act, there's nothing factually false about saying 'peace be unto him'"]
unremediatedgender.space/2020/Aug/yarvin-on-less-wrong/

[For example](https://twitter.com/ESYudkowsky/status/1067490362225156096):
> The more technology advances, the further we can move people towards where they say they want to be in sexspace. Having said this we've said all the facts.
Who competes in sports segregated around an Aristotelian binary is a policy question (that I personally find very humorous).

"Beliefs about the self aren't special" is part of the whole AI reflectivity thing, too!!

> Still think this was a perfectly fine tweet btw. Some people afaict were doing the literal ontologically confused thing; seemed like a simple thing to make progress on. Some people wanted to read it as a coded statement despite all my attempts to narrow it, but what can you do.
https://twitter.com/ESYudkowsky/status/1356535300986523648
If you were actually HONESTLY trying to narrow it, you would have said, "By the way, this is just about pronouns, I'm not taking a position on whether trans women are women"

> If you think you can win a battle about 2 + 3 = 5, then it can feel like victory or self-justification to write a huge long article hammering on that; but it doesn't feel as good to engage with how the Other does not think they are arguing 2 + 3 = 6, they're talking about 2 * 3.
https://twitter.com/ESYudkowsky/status/1435618825198731270

Speaking of narcissism and perspective-taking, "deception" isn't about whether you personally "lied" according to your own re-definitions; it's about whether you predictably made others update in the wrong direction

> I have never in my own life tried to persuade anyone to go trans (or not go trans) - I don't imagine myself to understand others that much.
https://twitter.com/ESYudkowsky/status/1404697716689489921

https://twitter.com/ESYudkowsky/status/1404821285276774403
> It is not trans-specific. When people tell me I helped them, I mostly believe them and am happy.

https://twitter.com/ESYudkowsky/status/1434906470248636419
> Anyways, Scott, this is just the usual division of labor in our caliphate: we're both always right, but you cater to the crowd that wants to hear it from somebody too modest to admit that, and I cater to the crowd that wants somebody out of that closet.
https://www.facebook.com/yudkowsky/posts/10154981483669228
> I know that it's a bad sign to worry about which jokes other people find funny. But you can laugh at jokes about Jews arguing with each other, and laugh at jokes about Jews secretly being in charge of the world, and not laugh at jokes about Jews cheating their customers. Jokes do reveal conceptual links and some conceptual links are more problematic than others.

What makes all of this especially galling is the fact that _all of my heretical opinions are literally just Yudkowsky's opinions from the 'aughts!_ My whole thing about how changing sex isn't possible with existing technology because the category encompasses so many high-dimensional details? Not original to me! I [filled in a few trivial technical details](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/#changing-sex-is-hard), but again, this was _in the Sequences_ as ["Changing Emotions"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions). My thing about how you can't define concepts any way you want, because there are mathematical laws governing which category boundaries compress your anticipated experiences? Not original to me! I [filled in](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) [a few technical details](https://www.lesswrong.com/posts/onwgTH6n8wxRSo2BJ/unnatural-categories-are-optimized-for-deception), but [_we had a whole Sequence about this._](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong)

Seriously, you think I'm _smart enough_ to come up with all of this independently? I'm not! I ripped it all off from Yudkowsky back in the 'aughts _when he still thought he could politically afford to give a shit about telling the truth_ in this domain.

Now that the political environment has changed and he doesn't think he can afford to give a shit, does ... does he expect us not to _notice_?
Or does he just think that "everybody knows"? - - - -But I don't think that everybody knows. So I'm telling you. - - - - - -[Why does this matter? It would be dishonest for me to claim that this is _directly_ relevant to xrisk, because that's not my real bottom line] - -Yudkowsky [sometimes](https://www.lesswrong.com/posts/K2c3dkKErsqFd28Dh/prices-or-bindings) [quotes](https://twitter.com/ESYudkowsky/status/1456002060084600832) _Calvin and Hobbes_: "I don't know which is worse, that everyone has his price, or that the price is always so low." - - -a rationality community that can't think about _practical_ issues that affect our day to day lives, but can get existential risk stuff right, is like asking for self-driving car software that can drive red cars but not blue cars - -It's a _problem_ if public intellectuals in the current year need to pretend to be dumber than seven-year-olds in 2016 - -> _Perhaps_, replied the cold logic. _If the world were at stake._ -> -> _Perhaps_, echoed the other part of himself, _but that is not what was actually happening._ -https://www.yudkowsky.net/other/fiction/the-sword-of-good - -https://www.readthesequences.com/ -> Because it is all, in the end, one thing. I talked about big important distant problems and neglected immediate life, but the laws governing them aren't actually different. - -> the challenge is almost entirely about high integrity communication by small groups -https://twitter.com/HiFromMichaelV/status/1486044326618710018 - -https://www.econlib.org/archives/2016/01/the_invisible_t.html diff --git a/notes/a-hill-of-validity-sections.md b/notes/a-hill-of-validity-sections.md index e697756..b51c03a 100644 --- a/notes/a-hill-of-validity-sections.md +++ b/notes/a-hill-of-validity-sections.md @@ -2,16 +2,6 @@ The thing about our crowd is that we have a lamentably low proportion of women ( https://slatestarscratchpad.tumblr.com/post/142995164286/i-was-at-a-slate-star-codex-meetup. 
"We are solving the gender ratio issue one transition at a time"

-So, a striking thing about my series of increasingly frustrating private conversations and subsequent public Facebook meltdown (the stress from which soon landed me in psychiatric jail, but that's [another](/2017/Mar/fresh-princess/) [story](/2017/Jun/memoirs-of-my-recent-madness-part-i-the-unanswerable-words/)) was the tendency for some threads of conversation to get _derailed_ on some variation of, "Well, the word _woman_ doesn't necessarily mean that," often with a link to ["The Categories Were Made for Man, Not Man for the Categories"](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/), a 2014 post by Scott Alexander
-
-, the _second_ most prominent writer in our robot cult.
-
-So, this _really_ wasn't what I was trying to talk about; _I_ thought I was trying to talk about autogynephilia as an _empirical_ theory in psychology, the truth or falsity of which
-
-Psychology is a complicated empirical science: no matter how "obvious" I might think something is, I have to admit that I could be wrong—not just as a formal profession of modesty, but _actually_ wrong in the real world.
-
-But this "I can define the word _woman_ any way I want" mind game? _That_ part was _absolutely_ clear-cut. That part of the argument, I knew I could win.
-
[TODO: contrast "... Not Man for the Categories" to "Against Lie Inflation"; When the topic at hand is how to define "lying", Scott Alexander has written exhaustively about the dangers of strategic equivocation ("Worst Argument", "Brick in the Motte"); insofar as I can get a _coherent_ position out of the conjunction of "...
for the Categories" and Scott's other work, it's that he must think strategic equivocation is OK if it's for being nice to people

@@ -24,7 +14,7 @@ At first I did this in the object-level context of gender on this blog, in ["The

Later, after [Eliezer Yudkowsky joined in the mind games on Twitter in November 2018](https://twitter.com/ESYudkowsky/status/1067183500216811521) [(archived)](https://archive.is/ChqYX), I _flipped the fuck out_, and ended up doing more [strictly abstract philosophy-of-language work](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) [on](https://www.lesswrong.com/posts/edEXi4SpkXfvaX42j/schelling-categories-and-simple-membership-tests) [the](https://www.lesswrong.com/posts/fmA2GJwZzYtkrAKYJ/algorithms-of-deception) [robot](https://www.lesswrong.com/posts/4hLcbXaqudM9wSeor/philosophy-in-the-darkest-timeline-basics-of-the-evolution)-[cult](https://www.lesswrong.com/posts/YptSN8riyXJjJ8Qp8/maybe-lying-can-t-exist) [blog](https://www.lesswrong.com/posts/onwgTH6n8wxRSo2BJ/unnatural-categories-are-optimized-for-deception).

-An important thing to appreciate is that the philosophical point I was trying to make has _absolutely nothing to do with gender_. In 2008, Yudkowsky had explained that _for all_ nouns N, you can't define _N_ any way you want, because _useful_ definitions need to "carve reality at the joints."
+_for all_ nouns N, you can't define _N_ any way you want, because _useful_ definitions need to "carve reality at the joints."

It [_follows logically_](https://www.lesswrong.com/posts/WQFioaudEH8R7fyhm/local-validity-as-a-key-to-sanity-and-civilization) that, in particular, if _N_ := "woman", you can't define the word _woman_ any way you want. Maybe trans women _are_ women! But if so—that is, if you want people to agree to that word usage—you need to be able to _argue_ for why that usage makes sense on the empirical merits; you can't just _define_ it to be true.
And this is a _general_ principle of how language works, not something I made up on the spot in order to attack trans people.

@@ -275,8 +265,10 @@

https://distill.pub/2021/multimodal-neurons/

https://www.jefftk.com/p/an-update-on-gendered-pronouns

-> Still think this was a perfectly fine tweet btw. Some people afaict were doing the literal ontologically confused thing; seemed like a simple thing to make progress on. Some people wanted to read it as a coded statement despite all my attempts to narrow it, but what can you do.
+> Still think this was a perfectly fine tweet btw. Some people afaict were doing the literal ontologically confused thing; seemed like a simple thing to make progress on. Some people wanted to read it as a coded statement despite all my attempts to narrow it, but what can you do.
https://twitter.com/ESYudkowsky/status/1356535300986523648
+
+
If you were actually HONESTLY trying to narrow it, you would have said, "By the way, this is just about pronouns, I'm not taking a position on whether trans women are women"

https://www.gingersoftware.com/content/grammar-rules/adjectives/order-of-adjectives/

@@ -416,15 +408,7 @@ But I think Eliezer and I _agree_ on what he's doing; he just doesn't see it's b

Speaking of narcissism and perspective-taking, "deception" isn't about whether you personally "lied" according to your own re-definitions; it's about whether you predictably made others update in the wrong direction

-[
-> I have never in my own life tried to persuade anyone to go trans (or not go trans) - I don't imagine myself to understand others that much.
-https://twitter.com/ESYudkowsky/status/1404697716689489921
-
-Tweet said "I've never persuaded anyone to go trans" in light of his track record; it's like thinking it's personally prudent and not community-harmful to bash Democrats and praise Republicans. If any possible good thing about Democrats is something you mention that "the other side" would say.
Even if you can truthfully say "I've never _told_ anyone to _vote_ Republican", you shouldn't be surprised if people regard you as a Republican shill; the "30% of the ones with penises" proclamation sort of was encouraging it, really!

-https://twitter.com/ESYudkowsky/status/1404821285276774403
-> It is not trans-specific. When people tell me I helped them, I mostly believe them and am happy.

I really appreciated Anatoly Vorobey's comments:

@@ -434,11 +418,6 @@ I really appreciated Anatoly Vorobey's comments:

> ...(then twitter dogma of the time, and now almost the blue tribe dogma of our time)... that I can understand how someone like Zack, embedded in the rat culture physically and struggling with this reigning ideology, could feel it as gaslighting.

-]

-https://www.facebook.com/yudkowsky/posts/10154110278349228
-> Just checked my filtered messages on Facebook and saw, "Your post last night was kind of the final thing I needed to realize that I'm a girl."
-> ==DOES ALL OF THE HAPPY DANCE FOREVER==

https://www.lesswrong.com/posts/sCCdCLPN9E3YvdZhj/shulman-and-yudkowsky-on-ai-progress

https://www.facebook.com/algekalipso/posts/4769054639853322?comment_id=477040850

> recursively cloning Scott Alexander—with promising allelic variations - and hothousing the “products” could create a community of super-Scotts with even greater intellectual firepower

https://twitter.com/ESYudkowsky/status/1434906470248636419
-
> Anyways, Scott, this is just the usual division of labor in our caliphate: we're both always right, but you cater to the crowd that wants to hear it from somebody too modest to admit that, and I cater to the crowd that wants somebody out of that closet.

Okay, I get that it was meant as humorous exaggeration. But I think it still has the effect of discouraging people from criticizing Scott or Eliezer because they're the leaders of the caliphate.
I spent three and a half years of my life explaining in exhaustive, exhaustive detail, with math, how Scott was wrong about something, no one serious actually disagrees, and Eliezer is still using his social power to boost Scott's right-about-everything (!!) reputation. That seems really unfair, in a way that isn't dulled by "it was just a joke."

@@ -478,7 +456,7 @@ Respect needs to be updateable. No one can think fast enough to think all their

I think it's a problem for our collective epistemology that Scott has the power to sneeze his mistakes onto everyone else—that our 2021 beliefs about dolphins (literally, dolphins in particular!) are causally downstream of Scott's political incentives in 2014, even if Scott wasn't consciously lying and Nate wasn't thinking about gender politics. I think this is the problem that Eliezer identified as dark side epistemology: people invent fake epistemology lessons to force a conclusion that they can't get on the merits, and the fake lessons can spread, even if the meme-recipients aren't trying to force anything themselves. I would have expected people with cultural power to be interested in correcting the problem once it was pointed out.

https://twitter.com/esyudkowsky/status/1374161729073020937
-> Also: Having some things you say "no comment" to, is not at *all* the same phenomenon as being an organization that issues Pronouncements. There are a *lot* of good reasons to have "no comments" about things. Anybody who tells you otherwise has no life experience, or is lying.
+> Also: Having some things you say "no comment" to, is not at *all* the same phenomenon as being an organization that issues Pronouncements. There are a *lot* of good reasons to have "no comments" about things. Anybody who tells you otherwise has no life experience, or is lying.
"Speak out in order to make it clear how not alt-right you are; nothing wrong with that because I'm not lying" is being inconsistent about whether signaling and mood-affiliation matters—it's trying to socially profit by signaling pro-Stalin-ness, while simultaneously denying that anyone could object (because you didn't lie—pivoting to a worldview where only literal meanings matter and signals aren't real).

Can I sketch this out mathematically?

@@ -559,7 +537,7 @@ If Yudkowsky is obviously playing dumb (consciously or not) and his comments can

Fortunately, Yudkowsky graciously grants us a clue in the form of [a disclaimer comment](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228):

-> It unfortunately occurs to me that I must, in cases like these, disclaim that—to the extent there existed sensible opposing arguments against what I have just said—people might be reluctant to speak them in public, in the present social atmosphere. [...]
+> It unfortunately occurs to me that I must, in cases like these, disclaim that—to the extent there existed sensible opposing arguments against what I have just said—people might be reluctant to speak them in public, in the present social atmosphere. That is, in the logical counterfactual universe where I knew of very strong arguments against freedom of pronouns, I would have probably stayed silent on the issue, as would many other high-profile community members, and only Zack M. Davis would have said anything where you could hear it.
>
> This is a filter affecting your evidence; it has not to my own knowledge filtered out a giant valid counterargument that invalidates this whole post. I would have kept silent in that case, for to speak then would have been dishonest.
>
@@ -571,27 +549,43 @@ So, the explanation of [the problem of political censorship filtering evidence](

Really, it would be _less_ embarrassing for Yudkowsky if he were outright lying about having tried to think of counterarguments.
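The "filter affecting your evidence" point, and the claim that "deception" is about predictably moving others' beliefs rather than about uttering literal falsehoods, can be made quantitative with a toy simulation. (This sketch is my own illustration, not anything from the posts under discussion.) A reporter who never lies, but who passes along unflattering outcomes only some of the time, still predictably pushes a naive audience's estimate away from the truth:

```python
import random

# Toy model: the coin is actually fair, but the "reporter" (who never
# utters a falsehood) always relays heads and relays tails only 25% of
# the time. An audience estimating the heads frequency from the stream
# of (individually true!) reports is predictably misled.
random.seed(0)

reported_heads = 0
reported_tails = 0
for _ in range(100_000):
    heads = random.random() < 0.5        # true heads frequency: 0.5
    if heads:
        reported_heads += 1              # heads always pass the filter
    elif random.random() < 0.25:
        reported_tails += 1              # tails rarely pass the filter

naive_estimate = reported_heads / (reported_heads + reported_tails)
print(f"{naive_estimate:.2f}")  # ~0.80, though the true frequency is 0.50
```

No individual report was a lie; the entire misdirection lives in the selection of which true things get said.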
The original post isn't _that_ bad if you assume that Yudkowsky was writing off the cuff, that he clearly just _didn't put any effort whatsoever_ into thinking about why someone might disagree. If he _did_ put in the effort—enough that he felt comfortable bragging about his ability to see the other side of the argument—and _still_ ended up proclaiming his "simplest and best protocol" without even so much as _mentioning_ any of its incredibly obvious costs ... that's just _pathetic_. If Yudkowsky's ability to explore the space of arguments is _that_ bad, why would you trust his opinion about _anything_?

+But perhaps it's premature to judge Yudkowsky without appreciating what tight constraints he labors under. The disclaimer comment mentions "speakable and unspeakable arguments"—but what, exactly, is the boundary of the "speakable"? In response to a commenter mentioning the cost of having to remember pronouns as a potential counterargument, Yudkowsky [offers us another clue](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228&reply_comment_id=10159421871809228):

-... and, I had been planning to save the Whole Dumb Story about my alienation from Yudkowsky's so-called "rationalists" for a _different_ multi-thousand-word blog post, because _this_ multi-thousand-word blog post was supposed to be narrowly scoped to _just_ exhaustively replying to Yudkowsky's February 2021 Facebook post about pronoun conventions. But in order to explain the problems with "people do _know_ they're living in a half-Stalinist environment" and "people are better off at the end of that", I may need to _briefly_ recap some of the history leading to the present discussion, which explains why _I_ didn't know and _I'm_ not better off, with the understanding that it's only a summary and I might still need to tell the long version in a separate post, if it still feels necessary relative to everything else I need to get around to writing.
(It's not actually a very interesting story; I just need to get it out of my system so I can stop grieving and move on with my life.)

+> People might be able to speak that. A clearer example of a forbidden counterargument would be something like e.g. imagine if there was a pair of experimental studies somehow proving that (a) everybody claiming to experience gender dysphoria was lying, and that (b) they then got more favorable treatment from the rest of society. We wouldn't be able to talk about that. No such study exists to the best of my own knowledge, and in this case we might well hear about it from the other side to whom this is the exact opposite of unspeakable; but that would be an example.

-I _never_ expected to end up arguing about something so _trivial_ as the minutiae of pronoun conventions (which no one would care about if historical contingencies of the evolution of the English language hadn't made them a Schelling point and typographical attack surface for things people do care about). The conversation only ended up here after a series of derailings. At the start, I was _trying_ to say something substantive about the psychology of straight men who wish they were women.

+(As an aside, the wording of "we might well hear about it from _the other side_" (emphasis mine) is _very_ interesting, suggesting that the so-called "rationalist" community is, effectively, a partisan institution, despite its claims to be about advancing the generically human art of systematically correct reasoning.)
+
+I think (a) and (b) _as stated_ are clearly false, so "we" (who?) fortunately aren't losing much by allegedly not being able to speak them. But what about some _similar_ hypotheses, that might be similarly unspeakable for similar reasons?
-You see, back in the 'aughts when Yudkowsky was writing his Sequences, he occasionally said some things about sex differences that I often found offensive at the time, but which ended up being hugely influential on me, especially in the context of my ideological denial of psychological sex differences and my secret lifelong-since-puberty erotic fantasy about being magically transformed into a woman. I wrote about this at length in a previous post, ["Sexual Dimorphism in Yudkowsky's Sequences, in Relation to my Gender Problems"](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/)] -. +Instead of (a), consider the claim that (a′) self-reports about gender dysphoria are substantially distorted by [socially-desirable responding tendencies](https://en.wikipedia.org/wiki/Social-desirability_bias)—as a notable and common example, heterosexual males with [sexual fantasies about being female](http://www.annelawrence.com/autogynephilia_&_MtF_typology.html) [often falsely deny or minimize the erotic dimension of their desire to change sex](/papers/blanchard-clemmensen-steiner-social_desirability_response_set_and_systematic_distortion.pdf) (The idea that self-reports can be motivatedly inaccurate without the subject consciously "lying" should not be novel to someone who co-blogged with [Robin Hanson](https://en.wikipedia.org/wiki/The_Elephant_in_the_Brain) for years!) -In particular, in ["Changing Emotions"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions) (and its precursor in a [2004 Extropians mailing list post](https://archive.is/En6qW)), Yudkowsky explains that "changing sex" is vastly easier said than done— +And instead of (b), consider the claim that (b′) transitioning is socially rewarded within particular _subcultures_ (although not Society as a whole), such that many of the same people wouldn't think of themselves as trans or even gender-dysphoric if they lived in a different subculture. 
+I claim that (a′) and (b′) are _overwhelmingly likely to be true_. Can "we" talk about _that_? Are (a′) and (b′) "speakable", or not? +We're unlikely to get clarification from Yudkowsky, but based on my experiences with the so-called "rationalist" community over the past coming-up-on-six years—I'm going to _guess_ that the answer is broadly No: no, "we" can't talk about that. -[[[ TODO summarize the Whole Dumb Story (does there need to be a separate post? I'm still not sure) +But if I'm right that (a′) and (b′) should be live hypotheses and that Yudkowsky would consider them "unspeakable", that means "we" can't talk about what's _actually going on_ with gender dysphoria and transsexuality, which puts the whole discussion in a different light. In another comment, Yudkowsky lists some gender-transition interventions he named in [a November 2018 Twitter thread](https://twitter.com/ESYudkowsky/status/1067183500216811521) that was the precursor to the present discussion—using a different bathroom, changing one's name, asking for new pronouns, and getting sex reassignment surgery—and notes that none of these are calling oneself a "woman". [He continues](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421986539228&reply_comment_id=10159424960909228): -[TODO: But that was all about me—I assumed "trans" was a different thing. My first clue that I might not be living in that world came from—Eliezer Yudkowsky, with the "at least 20% of the ones with penises are actually women" thing] +> [Calling someone a "woman"] _is_ closer to the right sort of thing _ontologically_ to be true or false. 
More relevant to the current thread, now that we have a truth-bearing sentence, we can admit of the possibility of using our human superpower of language to _debate_ whether this sentence is indeed true or false, and have people express their nuanced opinions by uttering this sentence, or perhaps a more complicated sentence using a bunch of caveats, or maybe using the original sentence uncaveated to express their belief that this is a bad place for caveats. Policies about who uses what bathroom also have consequences and we can debate the goodness or badness (not truth or falsity) of those policies, and utter sentences to declare our nuanced or non-nuanced position before or after that debate. +> +> Trying to pack all of that into the pronouns you'd have to use in step 1 is the wrong place to pack it. -_After it's been pointed out_, it should be a pretty obvious hypothesis that "guy on the Extropians mailing list in 2004 who fantasizes about having a female counterpart" and "guy in 2016 Berkeley who identifies as a trans woman" are the _same guy_. +Sure, _if we were in the position of designing a constructed language from scratch_ under current social conditions in which a person's "gender" is a contested social construct, rather than their sex an objective and undisputed fact, then yeah: in that situation _which we are not in_, you definitely wouldn't want to pack sex or gender into pronouns. But it's a disingenuous derailing tactic to grandstand about how people need to alter the semantics of their _already existing_ native language so that we can discuss the real issues under an allegedly superior pronoun convention when, _by your own admission_, you have _no intention whatsoever of discussing the real issues!_ -[So I ended up arguing with people about the two-type taxonomy, and I noticed that those discussions kept getting _derailed_ on some variation of "The word woman doesn't actually mean that". 
So I took the bait, and started arguing against that, and then Yudkowsky comes back to the subject with his "Hill of Validity in Defense of Meaning"—and I go on a philosophy of language crusade, and Yudkowsky eventually clarifies, and _then_ he comes back _again_ in Feb. 2022 with his "simplest and best protocol"]

(Lest the "by your own admission" clause seem too accusatory, I should note that given constant behavior, admitting it is _much_ better than not-admitting it; so, huge thanks to Yudkowsky for the transparency on this point!)

-]]]

Again, as discussed in "Challenges to Yudkowsky's Pronoun Reform Proposal", a comparison to [the _tú_/_usted_ distinction](https://en.wikipedia.org/wiki/Spanish_personal_pronouns#T%C3%BA/vos_and_usted) is instructive. It's one thing to advocate for collapsing the distinction and just settling on one second-person singular pronoun for the Spanish language. That's principled.
+
It's quite another thing altogether to _simultaneously_ try to prevent a speaker from using _tú_ to indicate disrespect towards a social superior (on the stated rationale that the _tú_/_usted_ distinction is dumb and shouldn't exist), while _also_ refusing to entertain or address the speaker's arguments explaining _why_ they think their interlocutor is unworthy of the deference that would be implied by _usted_ (because such arguments are "unspeakable" for political reasons). That's just psychologically abusive.
+
If Yudkowsky _actually_ possessed (and felt motivated to use) the "ability to independently invent everything important that would be on the other side of the filter and check it [himself] before speaking", it would be _obvious_ to him that "Gendered Pronouns For Everyone and Asking To Leave The System Is Lying" isn't the hill anyone would care about dying on if it weren't a Schelling point.
A lot of TERF-adjacent folk would be _overjoyed_ to concede the (boring, insubstantial) matter of pronouns as a trivial courtesy if it meant getting to _actually_ address their real concerns of "Biological Sex Actually Exists", ["Biological Sex Cannot Be Changed With Existing or Foreseeable Technology"](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions), and "Biological Sex Is Sometimes More Relevant Than Self-Declared Gender Identity." The reason so many of them are inclined to stand their ground and not even offer the trivial courtesy is that they suspect, correctly, that the matter of pronouns is being used as a rhetorical wedge to try to prevent people from talking or thinking about sex.

---------

I _never_ expected to end up arguing about something so _trivial_ as the minutiae of pronoun conventions (which no one would care about if historical contingencies of the evolution of the English language hadn't made them a Schelling point and typographical attack surface for things people do care about). The conversation only ended up here after a series of derailings. At the start, I was _trying_ to say something substantive about the psychology of straight men who wish they were women.

_After it's been pointed out_, it should be a pretty obvious hypothesis that "guy on the Extropians mailing list in 2004 who fantasizes about having a female counterpart" and "guy in 2016 Berkeley who identifies as a trans woman" are the _same guy_.

At this point, the nature of the game is very clear. Yudkowsky wants to make sure he's on peaceful terms with the progressive _Zeitgeist_, subject to the constraint of not saying anything he knows to be false. Meanwhile, I want to actually make sense of what's actually going on in the world as regards sex and gender, because _I need the correct answer to decide whether or not to cut my dick off_.
@@ -665,6 +659,8 @@ congrats after Whale and Sawyer chimed in: https://twitter.com/ESYudkowsky/statu https://twitter.com/ESYudkowsky/status/1404700330927923206 > That is: there's a story here where not just particular people hounding Zack as a responsive target, but a whole larger group, are engaged in a dark conspiracy that is all about doing damage on issues legible to Zack and important to Zack. This is merely implausible on priors. +the "It is sometimes personally prudent to be seen to agree with Stalin" attitude behaves like a conspiracy, even if + I feel I've outlived myself https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4166378/ > Admitting something when being pushed a little, but never thinking it spontaneously and hence having those thoughts absent from your own thought processes, remains not sane. @@ -788,89 +784,6 @@ https://twitter.com/satisfiesvalues/status/1524475059695505409 flu virus that cures Borderer culture https://twitter.com/Kenku_Allaryi/status/1524646257976877057 -Anna thinks that committees can't do anything worthwhile; for endeavors requiring a lot of coordination, it's useful for leaders to have slack to make decisions without having to justify themselves to a mob. 
Anna endorses Straussianism: writing for the few is different from writing for the many, and that some of Ben's stuff may have veered too far towards loading negative affect on EA leaders; I and my model of Michael have reservations about the extent to which "writing for the few" could also be described as "colluding to deceive the rest of the world"
-
-an irony: in my psychosis, I was scared that the world was far less legible than I had imagined, but that _wasn't_ why my ordeal and Devi's were so traumatic _at all_: the psych ward is _very much_ governed by legible rules, rules that I had no control over
-
-[initial fan mail to Bailey on 7 January, followup to include blog link on 11 February; initial fan mail to Blanchard 10 August]
-
-I had some self-awareness that I was going off the rails—
-> She had a delusional mental breakdown; you're a little bit manic; I'm in the Avatar state. https://www.facebook.com/zmdavis/posts/10154813104220199
-
-to Ben: "I thought I got a message from Michael Vassar saying that the main coalitions were you, and Sarah Constantine, and Zack Davis vs. the world" Fri Feb 17 2017 14:30:55 GMT-0800
-
-scared that Orion was going to kill me
-
-to Ben: You can use police cars as Ubers???? Fri Feb 17 2017 15:19:59 GMT-0800
-
-]
-
-You gave me hot chocolate last night, right? I was worried that you were subconsciously poisoning me; not on purpose, but because there are just a lot of contaminants in cities; things that taste sweet to children but are actually poisonous; but, Anna said that most events are normal; I don't remember that note"
-Mon Apr 10 08:51:03 PDT 2017
-
-Michael's "Congratulations on not going back to work at the carpet store!"
was a reference to Rick & Morty "Mortynight Run", which aired in August 2015, but I hadn't seen it yet
-
-winning Hamilton tickets at $2200
-
-meeting Katie—
-
-Sun Jan 15 2017 08:35:40
-Folks, I'm not sure it's feasible to have an intellectually-honest real-name public conversation about the etiology of MtF. If no one is willing to mention some of the key relevant facts, maybe it's less misleading to just say nothing."
-
-He was always more emotionally tentative and less comfortable with the standard gender role and status stuff"
-But in the way of like, a geeky nerd guy
-Not in the way of someone feminine
-The only thing I knew about it at the point we got married was that he thought it was fun to go in drag sometimes
-Like Halloween
-
-And he thought feminization kink was fun
-Like me making him dress up? But he said it was about humiliation
-We didn't even do it more than a handful of times, it wasn't really my thing
-Nothing in my experience ever caused me to think he was trans
-
-"He talked about being a child always feeling out of place
-"But out of place seemed like because he was shy and anxious
-He said he was convinced his anxiety and social problems was *because* he was trans
-
-Spencer seemed much less happy to me after admitting to want transition, often crying about how ugly his body was
-
-because it basically amounts to, "You rebuilt your entire life around your perverted narcissistic fantasy and didn't even notice"
-like, there's no nice way to say that
-
-My taxon, right or wrong; if right, to be kept right; and if wrong, to be set right."
-
-all those transwomen are going to be so embarrassed when the FAI gives us telepathy after the Singularity
-and it turns out that what actual women feel has _absolutely nothing to do_ with what AGP fantasy feels like
-
-
-
-Tue Feb 14 2017 11:26:20 (this conversation was actually during the tantrum)—
-K: I really *was* getting to the point that I hated transwomen
-Z: I hate them, too!
-Z: Fuck those guys!
-K: I hated what happened to my husband, I hate the insistence that I use the right pronouns and ignore my senses, I hate the takeover of women's spaces, I hate the presumption that they know what a woman's life is like, I was *getting* to the point that I deeply hated them, and saw them as the enemy
-K: But you're actually changing that for me
-K: You're reconnecting me with my natural compassion
-K: To people who are struggling and have things that are hard
-K: It's just that, the way they think things is hard is not the way I actually think it is anymore
-Z: the "suffering" is mostly game-theoretic victimhood-culture
-K: You've made me hate transwomen *less* now
-K: Because I have a model
-K: I understand the problem
-[...]
-K: I understand why it's hard
-K: I feel like I can forgive it, to the extent that forgiveness is mine to give
-K: This is a better thing for me
-I did not *want* to be a hateful person"
-I did not want to take seeming good people as an enemy in my head, while trying to be friends with them in public
-I think now I can do it more honestly
-They might not want *me* as a friend
-But now I feel less threatened and confused and insulted
-And that has dissolved the hatred that was starting to take root
-I'm very grateful for that
-
-https://www.lesswrong.com/posts/ZEgQGAjQm5rTAnGuM/beware-boasting-about-non-existent-forecasting-track-records
- https://www.lesswrong.com/tag/criticisms-of-the-rationalist-movement @@ -925,7 +838,7 @@ https://postchimpblog.wordpress.com/2020/03/05/alexs-guide-to-transitioning/ The LW community is a bubble/machine that made me who I am (it doesn't infringe on my independence more than school, but it's still a shaping force in the way that going to University or Google shapes people) https://www.lesswrong.com/posts/j9Q8bRmwCgXRYAgcJ/miri-announces-new-death-with-dignity-strategy -> If those people went around lying to others and paternalistically deceiving them—well, mostly, I don't think they'll have really been the
types to live inside reality themselves. But even imagining the contrary, good luck suddenly unwinding all those deceptions and getting other people to live inside reality with you, to coordinate on whatever suddenly needs to be done when hope appears, after you drove them outside reality before that point. Why should they believe anything you say? +> If those people went around lying to others and paternalistically deceiving them—well, mostly, I don't think they'll have really been the types to live inside reality themselves. But even imagining the contrary, good luck suddenly unwinding all those deceptions and getting other people to live inside reality with you, to coordinate on whatever suddenly needs to be done when hope appears, after you drove them outside reality before that point. Why should they believe anything you say? the Extropians post _explicitly_ says "may be a common sexual fantasy" > So spending a week as a member of the opposite sex may be a common sexual fantasy, but I wouldn't count on being able to do this six seconds after the Singularity. I would not be surprised to find that it took three subjective centuries before anyone had grown far enough to attempt a gender switch. @@ -973,16 +886,128 @@ re Yudkowsky not understanding the "That's So Gender" sense, I suspect this is b ------ - - - - - [TODO: Email to Scott at 0330 a.m. > In the last hour of the world before this is over, as the nanobots start consuming my flesh, I try to distract myself from the pain by reflecting on what single blog post is most responsible for the end of the world. And the answer is obvious: "The Categories Were Made for the Man, Not Man for the Categories." That thing is a fucking Absolute Denial Macro! 
] +------ + So, because -[TODO: the rats not getting AGP was excusable, the rats not getting the category boundary thing was extremely disappointing but not a casus belli; Eliezer Yudkowsky not getting the category boundary thing was an emergency] \ No newline at end of file +[TODO: the rats not getting AGP was excusable, the rats not getting the category boundary thing was extremely disappointing but not a casus belli; Eliezer Yudkowsky not getting the category boundary thing was an emergency] + +----- + +Yudkowsky [sometimes](https://www.lesswrong.com/posts/K2c3dkKErsqFd28Dh/prices-or-bindings) [quotes](https://twitter.com/ESYudkowsky/status/1456002060084600832) _Calvin and Hobbes_: "I don't know which is worse, that everyone has his price, or that the price is always so low." + +a rationality community that can't think about _practical_ issues that affect our day-to-day lives, but can get existential risk stuff right, is like asking for self-driving car software that can drive red cars but not blue cars + +It's a _problem_ if public intellectuals in the current year need to pretend to be dumber than seven-year-olds in 2016 + +https://www.econlib.org/archives/2016/01/the_invisible_t.html + +------ + + +Having analyzed the _ways_ in which Yudkowsky is playing dumb here, what's still not entirely clear is _why_. Presumably he cares about maintaining his credibility as an insightful and fair-minded thinker. Why tarnish that by putting on this haughty performance? + +Of course, presumably he _doesn't_ think he's tarnishing it—but why not?
[He graciously explains in the Facebook comments](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228&reply_comment_id=10159421901809228): + +> it is sometimes personally prudent and not community-harmful to post your agreement with Stalin about things you actually agree with Stalin about, in ways that exhibit generally rationalist principles, especially because people do _know_ they're living in a half-Stalinist environment [...] I think people are better off at the end of that. + +Ah, _prudence_! He continues: + +> I don't see what the alternative is besides getting shot, or utter silence about everything Stalin has expressed an opinion on including "2 + 2 = 4" because if that logically counterfactually were wrong you would not be able to express an opposing opinion. + +The problem with trying to "exhibit generally rationalist principles" in a line of argument that you're constructing in order to be prudent and not community-harmful is that you're thereby necessarily _not_ exhibiting the central rationalist principle that what matters is the process that _determines_ your conclusion, not the reasoning you present to _reach_ your presented conclusion, after the fact. + +The best explanation of this I know was authored by Yudkowsky himself in 2007, in a post titled ["A Rational Argument"](https://www.lesswrong.com/posts/9f5EXt8KNNxTAihtZ/a-rational-argument). It's worth quoting at length. The Yudkowsky of 2007 invites us to consider the plight of a political campaign manager: + +> As a campaign manager reading a book on rationality, one question lies foremost on your mind: "How can I construct an impeccable rational argument that Mortimer Q. Snodgrass is the best candidate for Mayor of Hadleyburg?" +> +> Sorry. It can't be done. +> +> "What?" you cry. "But what if I use only valid support to construct my structure of reason?
What if every fact I cite is true to the best of my knowledge, and relevant evidence under Bayes's Rule?" +> +> Sorry. It still can't be done. You defeated yourself the instant you specified your argument's conclusion in advance. + +The campaign manager is in possession of a survey of mayoral candidates on which Snodgrass compares favorably to other candidates, except for one question. The post continues (bolding mine): + +> So you are tempted to publish the questionnaire as part of your own campaign literature ... with the 11th question omitted, of course. +> +> **Which crosses the line between _rationality_ and _rationalization_.** It is no longer possible for the voters to condition on the facts alone; they must condition on the additional fact of their presentation, and infer the existence of hidden evidence. +> +> Indeed, **you crossed the line at the point where you considered whether the questionnaire was favorable or unfavorable to your candidate, before deciding whether to publish it.** "What!" you cry. "A campaign should publish facts unfavorable to their candidate?" But put yourself in the shoes of a voter, still trying to select a candidate—why would you censor useful information? You wouldn't, if you were genuinely curious. If you were flowing _forward_ from the evidence to an unknown choice of candidate, rather than flowing _backward_ from a fixed candidate to determine the arguments. + +The post then briefly discusses the idea of a "logical" argument, one whose conclusions follow from its premises. "All rectangles are quadrilaterals; all squares are quadrilaterals; therefore, all squares are rectangles" is given as an example of _illogical_ argument, even though both premises are true (all rectangles and squares are in fact quadrilaterals) _and_ the conclusion is true (all squares are in fact rectangles).
The problem is that the conclusion doesn't _follow_ from the premises; the _reason_ all squares are rectangles isn't _because_ they're both quadrilaterals. If we accepted arguments of the general _form_ "all A are C; all B are C; therefore all A are B", we would end up believing nonsense. + +Yudkowsky's conception of a "rational" argument—at least, Yudkowsky's conception in 2007, which the Yudkowsky of the current year seems to disagree with—has a similar flavor: the stated reasons should be the actual reasons. The post concludes: + +> If you really want to present an honest, rational argument _for your candidate_, in a political campaign, there is only one way to do it: +> +> * _Before anyone hires you_, gather up all the evidence you can about the different candidates. +> * Make a checklist which you, yourself, will use to decide which candidate seems best. +> * Process the checklist. +> * Go to the winning candidate. +> * Offer to become their campaign manager. +> * When they ask for campaign literature, print out your checklist. +> +> Only in this way can you offer a _rational_ chain of argument, one whose bottom line was written flowing _forward_ from the lines above it. Whatever _actually_ decides your bottom line is the only thing you can _honestly_ write on the lines above. + +I remember this being pretty shocking to read back in 'aught-seven. What an alien mindset! But it's _correct_. You can't rationally argue "for" a chosen conclusion, because only the process you use to _decide what to argue for_ can be your real reason. + +This is a shockingly high standard for anyone to aspire to live up to—but what made Yudkowsky's Sequences so life-changingly valuable, was that they articulated the _existence_ of such a standard. For that, I will always be grateful. + +... which is why it's so _bizarre_ that the Yudkowsky of the current year acts like he's never heard of it! 
If your _actual_ bottom line is that it is sometimes personally prudent and not community-harmful to post your agreement with Stalin, then sure, you can _totally_ find something you agree with to write on the lines above! Probably something that "exhibits generally rationalist principles", even! It's just that any rationalist who sees the game you're playing is going to correctly identify you as a partisan hack on this topic and take that into account when deciding whether they can trust you on other topics. + +"I don't see what the alternative is besides getting shot," Yudkowsky muses (where presumably, 'getting shot' is a metaphor for a large negative utility, like being unpopular with progressives). Yes, an astute observation! And _any other partisan hack could say exactly the same_, for the same reason. Why does the campaign manager withhold the results of the 11th question? Because he doesn't see what the alternative is besides getting shot. + +If the idea of being fired from the Snodgrass campaign or being unpopular with progressives is _so_ terrifying to you that it seems analogous to getting shot—if those are really your true values—then sure: say whatever you need to say to keep your job and your popularity, as is personally prudent. You've set your price. But if the price you put on the intellectual integrity of your so-called "rationalist" community is similar to that of the Snodgrass for Mayor campaign, you shouldn't be surprised if intelligent, discerning people accord similar levels of credibility to the two groups' output. + +I see the phrase "bad faith" thrown around more than I think people know what it means. "Bad faith" doesn't mean "with ill intent", and it's more specific than "dishonest": it's [adopting the surface appearance of being moved by one set of motivations, while actually acting from another](https://en.wikipedia.org/wiki/Bad_faith).
+ +For example, an [insurance company employee](https://en.wikipedia.org/wiki/Claims_adjuster) who goes through the motions of investigating your claim while privately intending to deny it might never consciously tell an explicit "lie", but is definitely acting in bad faith: they're asking you questions, demanding evidence, _&c._ in order to _make it look like_ you'll get paid if you prove the loss occurred—whereas in reality, you're just not going to be paid. Your responses to the claim inspector aren't completely causally _inert_: if you can make an extremely strong case that the loss occurred as you say, then the claim inspector might need to put some effort into coming up with some ingenious excuse to deny your claim in ways that exhibit general claim-inspection principles. But at the end of the day, the inspector is going to say what they need to say in order to protect the company's loss ratio, as is personally prudent. + +With this understanding of bad faith, we can read Yudkowsky's "it is sometimes personally prudent [...]" comment as admitting that his behavior on politically-charged topics is in bad faith—where "bad faith" isn't a meaningless insult, but [literally refers](http://benjaminrosshoffman.com/can-crimes-be-discussed-literally/) to the pretending-to-have-one-set-of-motivations-while-acting-according-to-another behavior, such that accusations of bad faith can be true or false. Yudkowsky will never consciously tell an explicit "lie", but he'll go through the motions to _make it look like_ he's genuinely engaging with questions where I need the right answers in order to make extremely impactful social and medical decisions—whereas in reality, he's only going to address a selected subset of the relevant evidence and arguments that won't get him in trouble with progressives.
+ +To his credit, he _will_ admit that he's only willing to address a selected subset of arguments—but while doing so, he claims an absurd "confidence in [his] own ability to independently invent everything important that would be on the other side of the filter and check it [himself] before speaking" while _simultaneously_ blatantly mischaracterizing his opponents' beliefs! ("Gendered Pronouns For Everyone and Asking To Leave The System Is Lying" doesn't pass anyone's [ideological Turing test](https://www.econlib.org/archives/2011/06/the_ideological.html).) + +Counterarguments aren't completely causally _inert_: if you can make an extremely strong case that Biological Sex Is Sometimes More Relevant Than Self-Declared Gender Identity, Yudkowsky will put some effort into coming up with some ingenious excuse for why he _technically_ never said otherwise, in ways that exhibit generally rationalist principles. But at the end of the day, Yudkowsky is going to say what he needs to say in order to protect his reputation, as is personally prudent. + +Even if one were to agree with this description of Yudkowsky's behavior, it doesn't immediately follow that Yudkowsky is making the wrong decision. Again, "bad faith" is meant as a literal description that makes predictions about behavior, not a contentless attack—maybe there are some circumstances in which engaging some amount of bad faith is the right thing to do, given the constraints one faces! For example, when talking to people on Twitter with a very different ideological background from me, I sometimes anticipate that if my interlocutor knew what I was actually thinking, they wouldn't want to talk to me, so I engage in a bit of what is sometimes called ["concern trolling"](https://geekfeminism.fandom.com/wiki/Concern_troll): I take care to word my replies in a way that makes it look like I'm more ideologically aligned with them than I actually am. 
(For example, I [never say "assigned female/male at birth" in my own voice on my own platform](/2019/Sep/terminology-proposal-developmental-sex/), but I'll do it in an effort to speak my interlocutor's language.) I think of this as the _minimal_ amount of strategic bad faith needed to keep the conversation going, to get my interlocutor to evaluate my argument on its own merits, rather than rejecting it for coming from an ideological enemy. In cases such as these, I'm willing to defend my behavior as acceptable—there _is_ a sense in which I'm being deceptive by optimizing my language choice to make my interlocutor make bad guesses about my ideological alignment, but I'm comfortable with that amount and scope of deception in the service of correcting the distortion where I don't think my interlocutor _should_ be paying attention to my personal alignment. + +That is, my bad faith concern-trolling gambit of deceiving people about my ideological alignment in the hopes of improving the discussion seems like something that makes our collective beliefs about the topic-being-argued-about _more_ accurate. (And the topic-being-argued-about is presumably of greater collective interest than which "side" I personally happen to be on.) + +In contrast, the "it is sometimes personally prudent [...] to post your agreement with Stalin" gambit is the exact reverse: it's _introducing_ a distortion into the discussion in the hopes of correcting people's beliefs about the speaker's ideological alignment. (Yudkowsky is not a right-wing Bad Guy, but people would tar him as a right-wing Bad Guy if he ever said anything negative about trans people.) This doesn't improve our collective beliefs about the topic-being-argued about; it's a _pure_ ass-covering move. + +Yudkowsky names the alleged fact that "people do _know_ they're living in a half-Stalinist environment" as a mitigating factor. 
But the _reason_ censorship is such an effective tool in the hands of dictators like Stalin is because it ensures that many people _don't_ know—and that those who know (or suspect) don't have [game-theoretic common knowledge](https://www.lesswrong.com/posts/9QxnfMYccz9QRgZ5z/the-costly-coordination-mechanism-of-common-knowledge#Dictators_and_freedom_of_speech) that others do too. + +Zvi Mowshowitz has [written about how the false assertion that "everybody knows" something](https://thezvi.wordpress.com/2019/07/02/everybody-knows/) is typically used to justify deception: if "everybody knows" that we can't talk about biological sex (the reasoning goes), then no one is being deceived when our allegedly truthseeking discussion carefully steers clear of any reference to the reality of biological sex when it would otherwise be extremely relevant. + +But if it were _actually_ the case that everybody knew (and everybody knew that everybody knew), then what would be the point of the censorship? It's not coherent to claim that no one is being harmed by censorship because everyone knows about it, because the entire appeal and purpose of censorship is precisely that _not_ everybody knows and that someone with power wants to _keep_ it that way. + +For the savvy people in the know, it would certainly be _convenient_ if everyone secretly knew: then the savvy people wouldn't have to face the tough choice between acceding to Power's demands (at the cost of deceiving their readers) and informing their readers (at the cost of incurring Power's wrath). + +Policy debates should not appear one-sided. Faced with this kind of dilemma, I can't say that defying Power is necessarily the right choice: if there really _were_ no other options between deceiving your readers with a bad faith performance, and incurring Power's wrath, and Power's wrath would be too terrible to bear, then maybe deceiving your readers with a bad faith performance is the right thing to do.
+ +But if you actually _cared_ about not deceiving your readers, you would want to be _really sure_ that those _really were_ the only two options. You'd [spend five minutes by the clock looking for third alternatives](https://www.lesswrong.com/posts/erGipespbbzdG5zYb/the-third-alternative)—including, possibly, not issuing proclamations on your honor as leader of the so-called "rationalist" community on topics where you _explicitly intend to ignore counterarguments on grounds of their being politically unfavorable_. Yudkowsky rejects this alternative on the grounds that it allegedly implies "utter silence about everything Stalin has expressed an opinion on including '2 + 2 = 4' because if that logically counterfactually were wrong you would not be able to express an opposing opinion", but this seems like yet another instance of Yudkowsky motivatedly playing dumb: if he _wanted_ to, I'm sure Eliezer Yudkowsky could think of _some relevant differences_ between "2 + 2 = 4" (a trivial fact of arithmetic) and "the simplest and best protocol is, 'He' refers to the set of people who have asked us to use 'he'" (a complex policy proposal whose numerous flaws I have analyzed in detail). + +"[P]eople do _know_ they're living in a half-Stalinist environment," Yudkowsky says. "I think people are better off at the end of that," he says. But who are "people", specifically? I'm asking because I have a long sob story about how _I_ didn't know, and _I'm_ not better off. + +------ + +https://twitter.com/ESYudkowsky/status/1404697716689489921 +> I have never in my own life tried to persuade anyone to go trans (or not go trans)—I don't imagine myself to understand others that much.
+ +If you think it "sometimes personally prudent and not community-harmful" to strategically say positive things about Republican candidates, and make sure to never, ever say negative things about Democratic candidates (because you "don't see what the alternative is besides getting shot"), you can see why people might regard you as a _Republican shill_—even if all the things you said were true, and even if you never told any specific individual, "You should vote Republican." + +https://www.facebook.com/yudkowsky/posts/10154110278349228 +> Just checked my filtered messages on Facebook and saw, "Your post last night was kind of the final thing I needed to realize that I'm a girl." +> ==DOES ALL OF THE HAPPY DANCE FOREVER== + +https://twitter.com/ESYudkowsky/status/1404821285276774403 +> It is not trans-specific. When people tell me I helped them, I mostly believe them and am happy. + +[The claim that the "Hill of Validity" thread wasn't partisan, but was just neutrally trying to teach the difference between facts and policy decisions is not credible. Imagine if someone was complaining about being required to say "Peace Be Upon Him" before referencing the prophet Muhammad. 
"It's a speech act, there's nothing factually false about saying 'peace be unto him'"] +unremediatedgender.space/2020/Aug/yarvin-on-less-wrong/ diff --git a/notes/blanchards-dangerous-idea-sections.md b/notes/blanchards-dangerous-idea-sections.md index 02c8401..1818e0a 100644 --- a/notes/blanchards-dangerous-idea-sections.md +++ b/notes/blanchards-dangerous-idea-sections.md @@ -208,3 +208,86 @@ On my last day at SwiftStack, I said that I was taking a sabbatical from my soft someone posted an inflation fetish joke in Kelsey's channel and got a lot of laugh reactions including from Big Yud, and it's salient to me that people don't gaslight people with other transformation fetishes https://cdn.discordapp.com/attachments/458329253595840522/987991378590052402/SPOILER_unknown.png + + +Anna thinks that committees can't do anything worthwhile; for endeavors requiring a lot of coordination, it's useful for leaders to have slack to make decisions without having to justify themselves to a mob. Anna endorses Straussianism: writing for the few is different from writing for the many, and that some of Ben's stuff may have veered too far towards loading negative affect on EA leaders; I and my model of Michael have reservations about the extent to which "writing for the few" could also be described as "colluding to deceive the rest of the world" + +an irony: in my psychosis, I was scared that the world was far less legible than I had imagined, but that _wasn't_ why my ordeal and Devi's were so traumatic _at all_: the psych ward is _very much_ governed by legible rules, rules that I had no control over + +[initial fan mail to Bailey on 7 January, followup to include blog link on 11 February; initial fan mail to Blanchard 10 August] + +I had some self-awareness that I was going off the rails— +> She had a delusional mental breakdown; you're a little bit manic; I'm in the Avatar state.
https://www.facebook.com/zmdavis/posts/10154813104220199 + +to Ben: "I thought I got a message from Michael Vassar saying that the main coalitions were you, and Sarah Constantine, and Zack Davis vs. the world" Fri Feb 17 2017 14:30:55 GMT-0800 + +scared that Orion was going to kill me + +to Ben: You can use police cars as Ubers???? Fri Feb 17 2017 15:19:59 GMT-0800 + +] + +You gave me hot chocolate last night, right? I was worried that you were subconsciously poisoning me; not on purpose, but because there are just a lot of contaminants in cities; things that taste sweet to children but are actually poisonous; but, Anna said that most events are normal; I don't remember that note" +Mon Apr 10 08:51:03 PDT 2017 + +Michael's "Congratulations on not going back to work at the carpet store!" was a reference to Rick & Morty "Mortynight Run", which aired in August 2015, but I hadn't seen it yet + +winning Hamilton tickets at $2200 + +meeting Katie— + +Sun Jan 15 2017 08:35:40 +Folks, I'm not sure it's feasible to have an intellectually-honest real-name public conversation about the etiology of MtF. If no one is willing to mention some of the key relevant facts, maybe it's less misleading to just say nothing." + +He was always more emotionally tentative and less comfortable with the standard gender role and status stuff" +But in the way of like, a geeky nerd guy +Not in the way of someone feminine +The only thing I knew about it at the point we got married was that he thought it was fun to go in drag sometimes +Like Halloween + +And he thought feminization kink was fun +Like me making him dress up?
But he said it was about humiliation +We didn't even do it more than a handful of times, it wasn't really my thing +Nothing in my experience ever caused me to think he was trans + +"He talked about being a child always feeling out of place +"But out of place seemed like because he was shy and anxious +He said he was convinced his anxiety and social problems was *because* he was trans + +Spencer seemed much less happy to me after admitting to want transition, often crying about how ugly his body was + +because it basically amounts to, "You rebuilt your entire life around your perverted narcissistic fantasy and didn't even notice" +like, there's no nice way to say that + +My taxon, right or wrong; if right, to be kept right; and if wrong, to be set right." + +all those transwomen are going to be so embarrassed when the FAI gives us telepathy after the Singularity +and it turns out that what actual women feel has _absolutely nothing to do_ with what AGP fantasy feels like + +Tue Feb 14 2017 11:26:20 (this conversation was actually during the tantrum)— +K: I really *was* getting to the point that I hated transwomen +Z: I hate them, too! +Z: Fuck those guys! +K: I hated what happened to my husband, I hate the insistence that I use the right pronouns and ignore my senses, I hate the takeover of women's spaces, I hate the presumption that they know what a woman's life is like, I was *getting* to the point that I deeply hated them, and saw them as the enemy +K: But you're actually changing that for me +K: You're reconnecting me with my natural compassion +K: To people who are struggling and have things that are hard +K: It's just that, the way they think things is hard is not the way I actually think it is anymore +Z: the "suffering" is mostly game-theoretic victimhood-culture +K: You've made me hate transwomen *less* now +K: Because I have a model +K: I understand the problem +[...]
+K: I understand why it's hard +K: I feel like I can forgive it, to the extent that forgiveness is mine to give +K: This is a better thing for me +I did not *want* to be a hateful person" +I did not want to take seeming good people as an enemy in my head, while trying to be friends with them in public +I think now I can do it more honestly +They might not want *me* as a friend +But now I feel less threatened and confused and insulted +And that has dissolved the hatred that was starting to take root +I'm very grateful for that + +https://www.lesswrong.com/posts/ZEgQGAjQm5rTAnGuM/beware-boasting-about-non-existent-forecasting-track-records +