From: M. Taylor Saotome-Westlake
Date: Tue, 18 Jan 2022 16:10:14 +0000 (-0800)
Subject: check in notes
X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=4a85593500159cacd8620f5c7bb4890921ef51e4;p=Ultimately_Untrue_Thought.git

check in notes
---

diff --git a/notes/a-hill-of-validity-sections.md b/notes/a-hill-of-validity-sections.md
index 41d0ba8..65c41ec 100644
--- a/notes/a-hill-of-validity-sections.md
+++ b/notes/a-hill-of-validity-sections.md
@@ -438,3 +438,40 @@ https://www.facebook.com/yudkowsky/posts/10154110278349228
 > Just checked my filtered messages on Facebook and saw, "Your post last night was kind of the final thing I needed to realize that I'm a girl."
 > ==DOES ALL OF THE HAPPY DANCE FOREVER==
+
+https://www.lesswrong.com/posts/sCCdCLPN9E3YvdZhj/shulman-and-yudkowsky-on-ai-progress
+> I'm curious about how much you think these opinions have been arrived at independently by yourself, Paul, and the rest of the OpenPhil complex?
+
+If he's worried about Carl being corrupted by OpenPhil, it makes sense for me to worry about him being corrupted by the Glowfic cluster.
+
+https://www.lesswrong.com/posts/sCCdCLPN9E3YvdZhj/shulman-and-yudkowsky-on-ai-progress
+> If you mean that say Mike Blume starts getting paid $20m/yr base salary
+Weirdly specific that Mike (a random member of your robot cult) is getting name-dropped.
+
+Example of hero-worship: David Pearce writes—
+https://www.facebook.com/algekalipso/posts/4769054639853322?comment_id=4770408506384602
+> recursively cloning Scott Alexander—with promising allelic variations - and hothousing the “products” could create a community of super-Scotts with even greater intellectual firepower
+
+https://twitter.com/ESYudkowsky/status/1434906470248636419
+
+> Anyways, Scott, this is just the usual division of labor in our caliphate: we're both always right, but you cater to the crowd that wants to hear it from somebody too modest to admit that, and I cater to the crowd that wants somebody out of that closet.
+
+Okay, I get that it was meant as humorous exaggeration. But I think it still has the effect of discouraging people from criticizing Scott or Eliezer, because they're the leaders of the caliphate. I spent three and a half years of my life explaining in exhaustive detail, with math, how Scott was wrong about something (no one serious actually disagrees), and Eliezer is still using his social power to boost Scott's right-about-everything (!!) reputation. That seems really unfair, in a way that isn't dulled by "it was just a joke."
+
+It's totally understandable to not want to get involved in a political scuffle, because x-risk reduction is astronomically more important! But I don't see any plausible case that metaphorically sucking Scott's dick in public reduces x-risk. It would be so easy to just not engage in this kind of cartel behavior!
+
+An analogy: racist jokes are also just jokes. Alice says, "What's the difference between a black dad and a boomerang? A boomerang comes back." Bob says, "That's super racist! Tons of African-American fathers are devoted parents!!" Alice says, "Chill out, it was just a joke." In a way, Alice is right. It was just a joke; no sane person could think that Alice was literally claiming that all black men are deadbeat dads. But the joke only makes sense in the first place in the context of a culture where the black-father-abandonment stereotype is operative. If you thought the stereotype was false, or if you were worried about it being a self-fulfilling prophecy, you would find it tempting to be a humorless scold and get angry at the joke-teller.
+
+Similarly, the "Caliphate" humor only makes sense in the first place in the context of a celebrity culture where deferring to Scott and Eliezer is expected behavior. (In a way that deferring to Julia Galef or John S. Wentworth is not expected behavior, even if Galef and Wentworth also have a track record as good thinkers.) I think this culture is bad. Nullius in verba ("take nobody's word for it").
+
+
+
+Respect needs to be updateable. No one can think fast enough to think all their own thoughts. I have a draft explaining the dolphins thing, about why Nate's distaste for paraphyly is wrong. In Nate's own account, he "suspect[s] that ['[...] Not Man for the Categories'] played a causal role in [...] starting the thread out on fish." Okay, where did Scott get it from, then? I don't have access to his thoughts, but I think he pulled it out of his ass because it was politically convenient for him. I suspect that if you asked him in 2012 whether dolphins are fish, he would have said, "No, they're mammals," like any other educated adult. Can you imagine "... Not Man for the Categories" being as popular as it is in our world if it just cut off after section III? Me neither.
+
+I think it's a problem for our collective epistemology that Scott has the power to sneeze his mistakes onto everyone else—that our 2021 beliefs about dolphins (literally, dolphins in particular!) are causally downstream of Scott's political incentives in 2014, even if Scott wasn't consciously lying and Nate wasn't thinking about gender politics. I think this is the problem that Eliezer identified as dark side epistemology: people invent fake epistemology lessons to force a conclusion that they can't get on the merits, and the fake lessons can spread, even if the meme-recipients aren't trying to force anything themselves. I would have expected people with cultural power to be interested in correcting the problem once it was pointed out.
+
+And the thing where David Xu interprets criticism of Eliezer as me going "full post-rat"?! https://twitter.com/davidxu90/status/1435106339550740482
+
+https://twitter.com/esyudkowsky/status/1374161729073020937
+
+> Also: Having some things you say "no comment" to, is not at *all* the same phenomenon as being an organization that issues Pronouncements. There are a *lot* of good reasons to have "no comments" about things. Anybody who tells you otherwise has no life experience, or is lying.
diff --git a/notes/post_ideas.txt b/notes/post_ideas.txt
index c8485d3..da4437c 100644
--- a/notes/post_ideas.txt
+++ b/notes/post_ideas.txt
@@ -1,7 +1,9 @@
-2021+ significant posts—
+2022 significant posts—
 _ Challenges to Yudkowsky's Pronoun Reform Proposal
-_ A Hill of Validity in Defense of Meaning
 _ Book Review: Charles Murray's Facing Reality: Two Truths About Race in America
+_ Trans Kids on the Margin, and Harms From Misleading Training Data
+_ Reply to Scott Alexander on Autogenderphilia
+_ A Hill of Validity in Defense of Meaning
 
 Queue—
 _ "Never Smile" linkpost
@@ -9,7 +11,7 @@
 _ Student Dysphoria, and a Previous Life's War
 _ Happy Meal
 _ https://www.lesswrong.com/posts/WikzbCsFjpLTRQmXn/declustering-reclustering-and-filling-in-thingspace
-_ Reply to Scott Alexander on Autogenderphilia
+
 _ Subspatial Distribution Overlap and Cancellable Stereotypes
 _ "But I'm Not Quite Sure What That Means": Costs of Nonbinary Gender as a Social Technology
 _ Four Clusters
@@ -19,7 +21,6 @@
 _ motivation for positing meta-attraction
 
 _ HPMoR on the function of democracy vs. Yarvin's true election
-_ Trans Kids on the Margin, and Harms From Misleading Training Data
 _ Blanchard's Dangerous Idea and the Plight of the Lucid Crossdreamer
 _ Joint Book Review: Kathleen Stock's Material Girls and Kathryn Paige Harden's The Genetic Lottery