From: M. Taylor Saotome-Westlake Date: Sun, 5 Jun 2022 19:14:27 +0000 (-0700) Subject: Sunday retreat 4: religion, privacy X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=48bd0721de19562b6d5359ea0c70a94a02e94940;p=Ultimately_Untrue_Thought.git Sunday retreat 4: religion, privacy --- diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md index cee04d7..1b096ec 100644 --- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md +++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md @@ -242,7 +242,7 @@ The claim that political privileges are inculcating "a culture of worthless, unr While I was in this flurry of excitement about my recent updates and the insanity around me, I thought back to that "at least 20% of the ones with penises are actually women" Yudkowsky post from back in March that had been my wake-up call to all this. What _was_ going on with that? -I wasn't, like, _friends_ with Yudkowsky, obviously; I didn't have a natural social affordance to _just_ ask him the way you would ask a work buddy or a college friend something. But ... 
he _had_ posted about how he was willing to accept money to do things he otherwise wouldn't in exchange for enough money to feel happy about he trade—a Happy Price, or [Cheerful Price, as the custom was later termed](https://www.lesswrong.com/posts/MzKKi7niyEqkBPnyu/your-cheerful-price)—and his [schedule of happy prices](https://www.facebook.com/yudkowsky/posts/10153956696609228) listed $1,000 as the price for a 2 hour conversation, and I had his email address from previous contract work I had done for MIRI back in '12, so I wrote him offering $1,000 to talk about sometime between [January 2009](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions) and [March of the current year](https://www.facebook.com/yudkowsky/posts/10154078468809228) what kind of _massive_ update he made on the topics of human psychological sex differences and MtF transsexuality, mentioning that I had been "feeling baffled and disappointed (although I shouldn't be) that the rationality community is getting this _really easy_ scientific question wrong." +I wasn't, like, _friends_ with Yudkowsky, obviously; I didn't have a natural social affordance to _just_ ask him the way you would ask a work buddy or a college friend something. But ... 
he _had_ posted about how he was willing to do things he otherwise wouldn't in exchange for enough money to feel happy about the trade—a Happy Price, or [Cheerful Price, as the custom was later termed](https://www.lesswrong.com/posts/MzKKi7niyEqkBPnyu/your-cheerful-price)—and his [schedule of happy prices](https://www.facebook.com/yudkowsky/posts/10153956696609228) listed $1,000 as the price for a 2-hour conversation, and I had his email address from previous contract work I had done for MIRI back in '12, so on 29 September 2016, I wrote him offering $1,000 to talk about what kind of _massive_ update he had made on the topics of human psychological sex differences and MtF transsexuality sometime between [January 2009](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions) and [March of the current year](https://www.facebook.com/yudkowsky/posts/10154078468809228), mentioning that I had been "feeling baffled and disappointed (although I shouldn't be) that the rationality community is getting this _really easy_ scientific question wrong."

At this point, any _normal people_ who are (somehow?) reading this might be thinking, isn't that weird and kind of cultish?—some blogger you follow posted something you thought was strange earlier this year, and you want to pay him _one grand_ to talk about it? To the normal person I would explain thusly—

@@ -252,15 +252,15 @@ Second, $1000 isn't actually real money to a San Francisco software engineer.

Third—yes. Yes, it _absolutely_ was kind of cultish. There's a sense in which, _sociologically and psychologically speaking_, Yudkowsky is a religious leader, and I was—am—a devout adherent of the religion he made up.

-By this I don't mean that the _content_ of Yudkowskian rationalism is much comparable to Christianity or Buddhism. 
But whether or not there is a God or a Divine (there is not), the _features of human psychology_ that make Christianity or Buddhism adaptive memeplexes are still going to be active; the God-shaped whole in my head can't not be filled by _something_, and Yudkowsky's writings on the hidden Bayesian structure of the universe were a potent way to fill that whole.
-
-It seems fair to compare my tendency to write in Sequences links to a devout Christian's tendency to quote Scripture by chapter and verse; the underlying mental motion of "appeal to the holy text" is probably pretty similar. My only defense is that _my_ religion is _actually true_ (and that my religion says you should read the texts and think it through for yourself, rather than taking anything on "faith").
+By this I don't mean that the _content_ of Yudkowskian rationalism is much comparable to Christianity or Buddhism. But whether or not there is a God or a Divine (there is not), the _features of human psychology_ that make Christianity or Buddhism adaptive memeplexes are still going to be active. If the God-shaped hole in my head can't not be filled by _something_, it's better to fill it with a "religion" _about good epistemology_, one that can _reflect_ on the fact that beliefs that are adaptive memeplexes are not therefore true, and Yudkowsky's writings on the hidden Bayesian structure of the universe were a potent way to do that. It seems fair to compare my tendency to write in Sequences links to a devout Christian's tendency to quote Scripture by chapter and verse; the underlying mental motion of "appeal to the holy text" is probably pretty similar. My only defense is that _my_ religion is _actually true_ (and that my religion says you should read the texts and think it through for yourself, rather than taking anything on "faith").
That's the context in which my happy-price email thread ended up including the sentence, "I feel awful writing _Eliezer Yudkowsky_ about this, because my interactions with you probably have disproportionately more simulation-measure than the rest of my life, and do I _really_ want to spend that on _this topic_?" (Referring to the idea that, in a sufficiently large universe where many subjectively-indistinguishable copies of everyone exist, including inside of future superintelligences running simulations of the past, there would plausibly be _more_ copies of my interactions with Yudkowsky than of other moments of my life, on account of that information being of greater decision-relevance to those superintelligences.)

-[TODO: I can't actually confirm or deny whether he accepted the happy price offer because if we did talk, it would have been a private conversation]
+I say all this to emphasize just how much Yudkowsky's opinion meant to me. If you were a devout Catholic, and something in the Pope's latest encyclical seemed wrong according to your understanding of Scripture, and you had the opportunity to talk it over with the Pope for a measly $1000, wouldn't you take it? Of course you would!
+
+Anyway, I can't talk about the results of my happy-price inquiry (whether he accepted the offer and a conversation occurred, or what was said if it did occur), because I think the rule I should follow for telling this Whole Dumb Story is that while I have complete freedom to talk about _my_ actions and things that happened in public, I'm not allowed to divulge information about what Yudkowsky may or may not have said in private conversations that may or may not have occurred, because even without an explicit secrecy promise, people might be less forthcoming in private conversations if they knew that you might blog about them later.
Personally, I think most people are _way_ too paranoid about this, and often wish I could just say what relevant things I know without worrying about whether it might infringe on someone's "privacy", but I'm eager to cooperate with widely-held norms even if I personally think they're dumb.

-(It was also around this time that I snuck a copy of _Men Trapped in Men's Bodies_ into the [MIRI](https://intelligence.org/) office library, which was sometimes possible for community members to visit. It seemed like something Harry Potter-Evans-Verres would do—and ominously, I noticed, not like something Hermione Granger would do.)
+(Incidentally, it was also around this time that I snuck a copy of _Men Trapped in Men's Bodies_ into the [MIRI](https://intelligence.org/) office library, which it was sometimes possible for community members to visit. It seemed like something Harry Potter-Evans-Verres would do—and ominously, I noticed, not like something Hermione Granger would do.)

Gatekeeping sessions finished, I finally started HRT at the end of December 2016. In an effort to not let my anti–autogynephilia-denialism crusade take over my life, earlier that month, I promised myself (and [published the SHA256 hash of the promise](https://www.facebook.com/zmdavis/posts/10154596054540199)) not to comment on gender issues under my real name through June 2017—_that_ was what my new pseudonymous blog was for.

@@ -270,6 +270,8 @@ Gatekeeping sessions finished, I finally started HRT at the end of December 2016

As a result of that, I got a PM from a woman whose marriage had fallen apart after (among other things) her husband transitioned. She told me about the parts of her husband's story that had never quite made sense to her (but which sounded like a textbook case from my reading).
In her telling, the husband was always more emotionally tentative and less comfortable with the standard gender role and status stuff, but in the way of like, a geeky nerd guy, not in the way of someone feminine. He was into crossdressing sometimes, but she had thought that was just a weird and insignificant kink, not that he didn't like being a man—until they moved to the Bay Area and he fell in with a social-justicey crowd. When I linked her to Kay Brown's article on ["Advice for Wives and Girlfriends of Autogynephiles"](https://sillyolme.wordpress.com/advice-for-wivesgirlfriends-of-autogynephiles/), her response was, "Holy shit, this is _exactly_ what happened with me." +It was nice to make a friend over shared heresy. + [TODO: the story of my Facebook crusade, going off the rails, getting hospitalized] A striking pattern from my attempts to argue with people about the two-type taxonomy was the tendency for the conversation to get derailed on some variation of "Well, the word _woman_ doesn't necessarily mean that," often with a link to ["The Categories Were Made for Man, Not Man for the Categories"](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/), a 2014 post by Scott Alexander arguing that because categories exist in our model of the world rather than the world itself, there's nothing wrong with simply _defining_ trans people to be their preferred gender, in order to alleviate their dysphoria.