From: M. Taylor Saotome-Westlake
Date: Sun, 5 Jun 2022 00:09:36 +0000 (-0700)
Subject: Saturday evening retreat 1: re-org first Yudkowsky meet §
X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=5c624db8305572d67a0a0eaa337549f3a602eb25;p=Ultimately_Untrue_Thought.git

Saturday evening retreat 1: re-org first Yudkowsky meet §

It's so hard to get started—but now that I am started (far too late in the day), I just have to keep rolling, and then keep rolling more tomorrow (still with the network cable out), and then I'll be able to meet my mom for Star Trek tomorrow evening. (I also told my roomie I would go to the store tonight and get milk, but first—more writing!)
---

diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
index 13080cd..a555e12 100644
--- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
+++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
@@ -238,19 +238,22 @@ It wasn't my place. I'm not a woman or a racial minority; I don't have their liv

Until suddenly, in what was then the current year of 2016, it was now seeming that the designated sympathetic victim group of our age was ... _straight boys who wish they were girls_. And suddenly, [_I had standing_](/2017/Feb/a-beacon-through-the-darkness-or-getting-it-right-the-first-time/). When a political narrative is being pushed for _your_ alleged benefit, it's much easier to make the call that it's obviously full of lies.

-The claim that political privileges are inculcating "a culture of worthless, unredeemable scoundrels" in some _other_ group is easy to dismiss as bigotry, but it hits differently when you can _see it happening to people like you_. Notwithstanding whether the progressive story had been right about the travails of blacks and women, I _know_ that straight boys who wish they were girls are not actually as fragile and helpless as we were being portrayed—that we _weren't_ that fragile, if anyone still remembers the world of 2006, when straight boys who wished they were girls knew that they were, in fact, straight boys, and didn't think the world owed them deference for their perversion. And this experience _did_ raise further questions about whether previous iterations of progressive ideology had been entirely honest with me. (If nothing else, I couldn't help but notice that my update from "Blanchard is probably wrong because trans women's self-reports say it's wrong" to "Self-reports are pretty crazy" probably had implications for "[Red Pill](https://heartiste.org/the-sixteen-commandments-of-poon/) is probably wrong because women's self-reports say it's wrong".)
+The claim that political privileges are inculcating "a culture of worthless, unredeemable scoundrels" in some _other_ group is easy to dismiss as bigotry, but it hits differently when you can see it happening to _people like you_. Notwithstanding whether the progressive story had been right about the travails of blacks and women, I _know_ that straight boys who wish they were girls are not actually as fragile and helpless as we were being portrayed—that we _weren't_ that fragile, if anyone still remembers the world of 2006, when straight boys who wished they were girls knew that they were, in fact, straight boys, and didn't think the world owed them deference for their perversion. And this experience _did_ raise further questions about whether previous iterations of progressive ideology had been entirely honest with me.
+
+(If nothing else, I couldn't help but notice that my update from "Blanchard is probably wrong because trans women's self-reports say it's wrong" to "Self-reports are pretty crazy" probably had implications for "[Red Pill](https://heartiste.org/the-sixteen-commandments-of-poon/) is probably wrong because women's self-reports say it's wrong".)

While I was in this flurry of excitement about my recent updates and the insanity around me, I thought back to that "at least 20% of the ones with penises are actually women" Yudkowsky post from back in March that had been my wake-up call to all this. What _was_ going on with that?

-[
+I wasn't, like, _friends_ with Yudkowsky, obviously; I didn't have a natural social affordance to _just_ ask him the way you would ask a work buddy or a college friend. But ... he _had_ posted about how he was willing to do things he otherwise wouldn't in exchange for enough money to feel happy about the trade—a Happy Price, or [Cheerful Price, as the custom was later termed](https://www.lesswrong.com/posts/MzKKi7niyEqkBPnyu/your-cheerful-price)—and his [schedule of happy prices](https://www.facebook.com/yudkowsky/posts/10153956696609228) listed $1,000 as the price for a 2-hour conversation, and I had his email address from previous contract work I had done for MIRI back in '12, so I wrote him offering $1,000 to talk about what kind of _massive_ update he had made, sometime between [January 2009](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions) and [March of the current year](https://www.facebook.com/yudkowsky/posts/10154078468809228), on the topics of human psychological sex differences and MtF transsexuality, mentioning that I had been "feeling baffled and disappointed (although I shouldn't be) that the rationality community is getting this _really easy_ scientific question wrong."

-Under ordinary circumstances, I wouldn't have dared bother Eliezer Yudkowsky (who I considered to be the most important person in the world) on _any_ topic, let alone this weird personal obsession of mine.
+At this point, any _normal people_ who are (somehow?) reading this might be thinking, isn't that weird and a little cultish?—some blogger you follow posted something you thought was strange earlier this year, and you want to pay him _one grand_ to talk about it?

-But ... Yudkowsky _had_ posted about how he was willing to do things he otherwise wouldn't in exchange for enough money to feel happy about the trade—a Happy Price, or [Cheerful Price, as the custom was later termed](https://www.lesswrong.com/posts/MzKKi7niyEqkBPnyu/your-cheerful-price)—and his [schedule of happy prices](https://www.facebook.com/yudkowsky/posts/10153956696609228) listed $1,000 as the price for a 2-hour conversation. Cheap, right?!
+To the normal person I would explain thusly. First, in our subculture, we don't have your weird hangups about money: people's time is valuable, and

-[TODO: does this seem cultish]
+Second, $1000 isn't actually real money to a San Francisco software engineer.

-I had his email address from previous contract work I had done for MIRI back in '12, so I wrote him offering $1,000 to talk about what kind of _massive_ update he had made, sometime between [January 2009](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions) and [March of the current year](https://www.facebook.com/yudkowsky/posts/10154078468809228), on the topics of human psychological sex differences and MtF transsexuality, mentioning that I had been "feeling baffled and disappointed (although I shouldn't be) that the rationality community is getting this _really easy_ scientific question wrong."
+[$1000 isn't actually real money to a San Francisco software engineer]
+[the absurd hero-worship I had]
+
+One of my emails included the sentence, "I feel awful writing _Eliezer Yudkowsky_ about this, because my interactions with you probably have disproportionately more simulation-measure than the rest of my life, and do I _really_ want to spend that on _this topic_?"

[TODO: I can't actually confirm or deny whether he accepted the happy price offer because if we did talk, it would have been a private conversation]

diff --git a/notes/a-hill-of-validity-sections.md b/notes/a-hill-of-validity-sections.md
index fdfda64..2a28774 100644
--- a/notes/a-hill-of-validity-sections.md
+++ b/notes/a-hill-of-validity-sections.md
@@ -235,10 +235,11 @@ I'm worried about the failure mode where bright young minds [lured in](http://be

> I'm not trying to get Eliezer or "the community" to take a public stance on gender politics; I'm trying to get us to take a stance in favor of the kind of epistemology that we were doing in 2008. It turns out that epistemology has implications for gender politics which are unsafe, but that's more inferential steps, and ... I guess I just don't expect the sort of people who would punish good epistemology to follow the inferential steps? Maybe I'm living in the should-universe a bit here, but I don't think it "should" be hard for Eliezer to publicly say, "Yep, categories aren't arbitrary because you need them to carve reality at the joints in order to make probabilistic inferences, just like I said in 2008; this is obvious."

-Scott got a lot of pushback just for including the blog that I showed him in a links post (Times have changed! BBL is locally quasi-mainstream after Ozy engaged)
-https://slatestarcodex.com/2016/11/01/links-1116-site-unseen/
-https://slatestarscratchpad.tumblr.com/post/152736458066/hey-scott-im-a-bit-of-a-fan-of-yours-and-i
+(Times have changed! BBL is locally quasi-mainstream after Ozy engaged)
+[Scott got a lot of pushback just for including the blog that I showed him in a links post
+https://slatestarcodex.com/2016/11/01/links-1116-site-unseen/
+https://slatestarscratchpad.tumblr.com/post/152736458066/hey-scott-im-a-bit-of-a-fan-of-yours-and-i]

It's weird that he thinks telling the truth is politically impossible, because the specific truths I'm focused on are things he _already said_, that anyone could just look up. I guess the point is that the egregore doesn't have the logical or reading comprehension for that?—or rather (a reader points out) the egregore has no reason to care about the past; if you get tagged as an enemy, your past statements will get dug up as evidence of foul present intent, but if you're doing a good enough job of playing the part today, no one cares what you said in 2009

@@ -966,20 +967,15 @@ It's as if the guy has just completely given up on the idea that public speech i

> "too many people think it's unvirtuous to shut up and listen to me" I wish I had never written about LDT and just told people to vote for reasons they understand when they're older
https://twitter.com/ESYudkowsky/status/1509944888376188929
-
-
What a _profoundly_ anti-intellectual statement! This is just not something you would say if you cared about having a rationality community that could process arguments and correct errors, rather than a robot cult
-
To be clear, there _is_ such a thing as legitimately trusting an authority who knows better than you.
+If he's frustrated that people won't listen _now_, he should remember the only reason he has _any_ people who defer to him _at all_ is because he used to be such a good explainer who actually argued for things.
+
That trust is a _finite resource_.

------