From: M. Taylor Saotome-Westlake
Date: Sun, 30 Oct 2022 21:39:05 +0000 (-0700)
Subject: memoir: bullet and shovel
X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=e92c359154abdd0861069b76c2de22522a369c7a;p=Ultimately_Untrue_Thought.git

memoir: bullet and shovel
---

diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
index d2bba00..868b6e8 100644
--- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
+++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
@@ -1064,7 +1064,7 @@ I continued to work on my "advanced" philosophy of categorization thesis. The di

> I had hoped that the Israel/Palestine example above made it clear that you have to deal with the consequences of your definitions, which can include confusion, muddling communication, and leaving openings for deceptive rhetorical strategies.

-This is certainly an _improvement_ over the original text without the note, but I took the use of the national borders metaphor here to mean that Scott still hadn't really gotten my point about there being underlying laws of thought underlying categorization: mathematical principles governing _how_ definition choices can muddle communication or be deceptive. (But that wasn't surprising; [by Scott's own admission, he's not a math guy](https://slatestarcodex.com/2015/01/31/the-parable-of-the-talents/).)
+This is certainly an _improvement_ over the original text without the note, but I took the use of the national borders metaphor here to mean that Scott still hadn't really gotten my point about there being laws of thought underlying categorization: mathematical principles governing _how_ definition choices can muddle communication or be deceptive. (But that wasn't surprising; [by Scott's own admission](https://slatestarcodex.com/2013/06/30/the-lottery-of-fascinations/), [he's not a math guy](https://slatestarcodex.com/2015/01/31/the-parable-of-the-talents/).)

Category "boundaries" are a useful _visual metaphor_ for explaining the cognitive function of categorization: you imagine a "boundary" in configuration space containing all the things that belong to the category.

diff --git a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md
index a63ca8f..a0cb2fa 100644
--- a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md
+++ b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md
@@ -112,7 +112,7 @@ It would seem that in the current year, that culture is dead—or at least, if i

At this point, some people would argue that I'm being too uncharitable in harping on the "not liking to be tossed into a [...] Bucket" paragraph. The same post does _also_ explicitly say that "[i]t's not that no truth-bearing propositions about these issues can possibly exist." I _agree_ that there are some interpretations of "not lik[ing] to be tossed into a Male Bucket or Female Bucket" that make sense, even though biological sex denialism does not make sense. Given that the author is Eliezer Yudkowsky, should I not give him the benefit of the doubt and assume that he "really meant" to communicate the reading that does make sense, rather than the one that doesn't make sense?

-I reply: _given that the author is Eliezer Yudkowsky_, no, obviously not. 
I have been ["trained in a theory of social deception that says that people can arrange reasons, excuses, for anything"](https://www.glowfic.com/replies/1820866#reply-1820866), such that it's informative ["to look at what _ended up_ happening, assume it was the _intended_ result, and ask who benefited."](http://www.hpmor.com/chapter/47) Yudkowsky is just _too talented of a writer_ for me to excuse his words as an accidental artifact of unclear writing. Where the text is ambiguous about whether biological sex is a real thing that people should be able to talk about, I think it's _deliberately_ ambiguous. When smart people act dumb, it's often wise to conjecture that their behavior represents [_optimized_ stupidity](https://www.lesswrong.com/posts/sXHQ9R5tahiaXEZhR/algorithmic-intent-a-hansonian-generalized-anti-zombie)—apparent "stupidity" that achieves a goal through some other channel than their words straightforwardly reflecting the truth. Someone who was _actually_ stupid wouldn't be able to generate text with a specific balance of insight and selective stupidity fine-tuned to reach a gender-politically convenient conclusion without explicitly invoking any controversial gender-political reasoning. I think the point of the post is to pander to the biological sex denialists in his robot cult, without technically saying anything unambiguously false that someone could point out as a "lie." +I reply: _given that the author is Eliezer Yudkowsky_, no, obviously not. I have been ["trained in a theory of social deception that says that people can arrange reasons, excuses, for anything"](https://www.glowfic.com/replies/1820866#reply-1820866), such that it's informative ["to look at what _ended up_ happening, assume it was the _intended_ result, and ask who benefited."](http://www.hpmor.com/chapter/47) Yudkowsky is just _too talented of a writer_ for me to excuse his words as an accidental artifact of unclear writing. Where the text is ambiguous about whether biological sex is a real thing that people should be able to talk about, I think it's _deliberately_ ambiguous. When smart people act dumb, it's often wise to conjecture that their behavior represents [_optimized_ stupidity](https://www.lesswrong.com/posts/sXHQ9R5tahiaXEZhR/algorithmic-intent-a-hansonian-generalized-anti-zombie)—apparent "stupidity" that achieves a goal through some other channel than their words straightforwardly reflecting the truth. Someone who was _actually_ stupid wouldn't generate text with a specific balance of insight and selective stupidity fine-tuned to reach a gender-politically convenient conclusion without explicitly invoking any controversial gender-political reasoning. I think the point of the post is to pander to the biological sex denialists in his robot cult, without technically saying anything unambiguously false that someone could point out as a "lie." 
Consider the implications of Yudkowsky giving us a clue as to the political forces at play in the form of [a disclaimer comment](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228):

@@ -126,9 +126,14 @@ Consider the implications of Yudkowsky giving as a clue as to the political forc

So, the explanation of [the problem of political censorship filtering evidence](https://www.lesswrong.com/posts/DoPo4PDjgSySquHX8/heads-i-win-tails-never-heard-of-her-or-selective-reporting) here is great, but the part where Yudkowsky claims "confidence in [his] own ability to independently invent everything important that would be on the other side of the filter" is just _laughable_. My point that _she_ and _he_ have existing meanings that you can't just ignore by fiat given that the existing meanings are _exactly_ what motivate people to ask for new pronouns in the first place is _really obvious_.

-Really, it would be _less_ embarrassing for Yudkowsky if he were outright lying about having tried to think of counterarguments. The original post isn't _that_ bad if you assume that Yudkowsky was writing off the cuff, that he clearly just _didn't put any effort whatsoever_ into thinking about why someone might disagree. If he _did_ put in the effort—enough that he felt comfortable bragging about his ability to see the other side of the argument—and _still_ ended up proclaiming his "simplest and best protocol" without even so much as _mentioning_ any of its incredibly obvious costs ... that's just _pathetic_. If Yudkowsky's ability to explore the space of arguments is _that_ bad, why would you trust his opinion about _anything_?
+Really, it would be _less_ embarrassing for Yudkowsky if he were outright lying about having tried to think of counterarguments. The original post isn't _that_ bad if you assume that Yudkowsky was writing off the cuff, that he clearly just _didn't put any effort whatsoever_ into thinking about why someone might disagree. If he _did_ put in the effort—enough that he felt comfortable bragging about his ability to see the other side of the argument—and _still_ ended up proclaiming his "simplest and best protocol" without even so much as _mentioning_ any of its incredibly obvious costs, that's utterly discrediting. If Yudkowsky's ability to explore the space of arguments is _that_ bad, why would you trust his opinion about _anything_?

-[TODO: discrediting to the community]
+[TODO: discrediting to the community
+ * "would have said anything where you could hear it" is _discrediting_
+ * I mean, it's gratifying to be acknowledged by my caliph (or it would have been, if he were still my caliph), but
+ * This is actually a major life decision for a lot of people in your community! People in this community actually need the right answer here, whatever the right answer turns out to be
+ * The "where you could hear it" is also bizarre—don't you people read widely? I'm unusual in the amount of analytical rigor I bring to bear (Kathleen Stock doesn't say anything about Bayesian networks), but my basic points are obvious
+]

The disclaimer comment mentions "speakable and unspeakable arguments"—but what, one wonders, is the boundary of the "speakable"? 
In response to a commenter mentioning the cost of having to remember pronouns as a potential counterargument, Yudkowsky [offers us another clue](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228&reply_comment_id=10159421871809228):

@@ -290,14 +295,17 @@ I agree that you won't have much luck yelling at the Other about how they must r

Let's recap.

[TODO: recap—
-* in 2009, "Changing Emotions"
-* in 2016, "20% of the ones with penises"
-* ...
+* In 2009, "Changing Emotions" pointed out that men who sexually fantasize about being women aren't actually women, and can't become women with foreseeable technology
+* In March 2016, "20% of the ones with penises"
+* This was a confusing reversal. What changed? I inquired via the Cheerful Price mechanism
+* November 2018, "You're not standing in defense of truth ..."
+* I spent an absurd amount of effort correcting that, and he eventually clarified in
+* February 2021, "simplest and best proposal"
]

I _never_ expected to end up arguing about something so _trivial_ as the minutiae of pronoun conventions (which no one would care about if historical contingencies of the evolution of the English language hadn't made them a Schelling point and typographical attack surface for things people do care about). The conversation only ended up here after a series of derailings. At the start, I was _trying_ to say something substantive about the psychology of straight men who wish they were women.

-_After it's been pointed out_, it should be a pretty obvious hypothesis that "guy on the Extropians mailing list in 2004 who fantasizes about having a female counterpart" and "guy in 2016 Berkeley who identifies as a trans woman" are the _same guy_.
+_After it's been pointed out_, it should be a pretty obvious hypothesis that "guy on the Extropians mailing list in 2004 who fantasizes about having a female but 'otherwise identical' copy of himself" and "guy in 2016 Berkeley who identifies as a trans woman" are the _same guy_.

At this point, the nature of the game is very clear. Yudkowsky wants to make sure he's on peaceful terms with the progressive _Zeitgeist_, subject to the constraint of not saying anything he knows to be false. Meanwhile, I want to actually make sense of what's actually going on in the world as regards sex and gender, because _I need the correct answer to decide whether or not to cut my dick off_.

@@ -331,7 +339,11 @@ Yudkowsky [defends his behavior](https://twitter.com/ESYudkowsky/status/13568121

> I think that some people model civilization as being in the middle of a great battle in which this tweet, even if true, is giving comfort to the Wrong Side, where I would not have been as willing to tweet a truth helping the Right Side. From my perspective, this battle just isn't that close to the top of my priority list. I rated nudging the cognition of the people-I-usually-respect, closer to sanity, as more important; who knows, those people might matter for AGI someday. And the Wrong Side part isn't as clear to me either. 
-[TODO: first of all, "A Rational Argument" is very explicit about "not have been as willing to Tweet a truth helping the side" meaning you've crossed the line; second of all, it's if anything more plausible that trans women will matter to AGI, as I pointed out in my email]
+[TODO: there are a number of things to be said to this—
+ * "A Rational Argument" is very explicit about "not have been as willing to Tweet a truth helping the side" meaning you've crossed the line;
+ * It's not clear anyone he usually respects was making this mistake; it seems likely that the original thread was subtweeting Eric Weinstein, who was not making this mistake
+ * It's if anything more plausible that trans women will matter to AGI, as I pointed out in my email
+]

But the battle that matters—the battle with a Right Side and a Wrong Side—isn't "pro-trans" _vs._ "anti-trans". (The central tendency of the contemporary trans rights movement is firmly on the Wrong Side, but that's not the same thing as all trans people as individuals.) That's why Jessica joined our posse to try to argue with Yudkowsky in early 2019. (She wouldn't have, if my objection had been, "trans is fake; trans people Bad".) That's why Somni—one of the trans women who [infamously protested the 2019 CfAR reunion](https://www.ksro.com/2019/11/18/new-details-in-arrests-of-masked-camp-meeker-protesters/) for (among other things) CfAR allegedly discriminating against trans women—[understands what I've been saying](https://somnilogical.tumblr.com/post/189782657699/legally-blind).

@@ -349,11 +361,21 @@ Scott Alexander chose Feelings, but I can't really hold that against him, becaus

Eliezer Yudkowsky ... did not _unambiguously_ choose Feelings. He's been very careful with his words to strategically mood-affiliate with the side of Feelings, without consciously saying anything that he knows to be unambiguously false.

+[TODO—
+ * Eliezer Yudkowsky is _absolutely_ trying to be a religious leader.
+
+> I aspire to make sure my departures from perfection aren't noticeable to others, so this tweet is very validating.
+https://twitter.com/ESYudkowsky/status/1384671335146692608
+
+* papal infallibility / Eliezer Yudkowsky facts
+https://www.lesswrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts?commentId=Aq9eWJmK6Liivn8ND
+Never go in against Eliezer Yudkowsky when anything is on the line.
+https://en.wikipedia.org/wiki/Chuck_Norris_facts

-[TODO— finish Yudkowsky trying to be a religious leader
-Eliezer Yudkowsky is _absolutely_ trying to be a religious leader.
+https://twitter.com/ESYudkowsky/status/1096769579362115584
+> When an epistemic hero seems to believe something crazy, you are often better off questioning "seems to believe" before questioning "crazy", and both should be questioned before shaking your head sadly about the mortal frailty of your heroes.

-If Eliezer Yudkowsky can't _unambiguously_ choose Truth over Feelings, _then Eliezer Yudkowsky is a fraud_.
+ * If Eliezer Yudkowsky can't _unambiguously_ choose Truth over Feelings, _then Eliezer Yudkowsky is a fraud_.
 ]

@@ -382,8 +404,6 @@ The turd on c3 is a pretty big likelihood ratio!
- - [TODO: the dolphin war, our thoughts about dolphins are literally downstream from Scott's political incentives in 2014; this is a sign that we're a cult https://twitter.com/ESYudkowsky/status/1404700330927923206 @@ -397,6 +417,23 @@ I mean, I wouldn't _call_ it a "dark conspiracy" exactly, but if the people with [TODO: sneering at post-rats; David Xu interprets criticism of Eliezer as me going "full post-rat"?! 6 September 2021 + +https://twitter.com/ESYudkowsky/status/1434906470248636419 +> Anyways, Scott, this is just the usual division of labor in our caliphate: we're both always right, but you cater to the crowd that wants to hear it from somebody too modest to admit that, and I cater to the crowd that wants somebody out of that closet. + +Okay, I get that it was meant as humorous exaggeration. But I think it still has the effect of discouraging people from criticizing Scott or Eliezer because they're the leaders of the caliphate. I spent three and a half years of my life explaining in exhaustive, exhaustive detail, with math, how Scott was wrong about something, no one serious actually disagrees, and Eliezer is still using his social power to boost Scott's right-about-everything (!!) reputation. That seems really unfair, in a way that isn't dulled by "it was just a joke." + +Or as Yudkowsky put it— + +https://www.facebook.com/yudkowsky/posts/10154981483669228 +> I know that it's a bad sign to worry about which jokes other people find funny. But you can laugh at jokes about Jews arguing with each other, and laugh at jokes about Jews secretly being in charge of the world, and not laugh at jokes about Jews cheating their customers. Jokes do reveal conceptual links and some conceptual links are more problematic than others. + +It's totally understandable to not want to get involved in a political scuffle because xrisk reduction is astronomically more important! But I don't see any plausible case that metaphorically sucking Scott's dick in public reduces xrisk. It would be so easy to just not engage in this kind of cartel behavior! + +An analogy: racist jokes are also just jokes. Alice says, "What's the difference between a black dad and a boomerang? A boomerang comes back." Bob says, "That's super racist! Tons of African-American fathers are devoted parents!!" Alice says, "Chill out, it was just a joke." In a way, Alice is right. It was just a joke; no sane person could think that Alice was literally claiming that all black men are deadbeat dads. But, the joke only makes sense in the first place in context of a culture where the black-father-abandonment stereotype is operative. If you thought the stereotype was false, or if you were worried about it being a self-fulfilling prophecy, you would find it tempting to be a humorless scold and get angry at the joke-teller. + +Similarly, the "Caliphate" humor only makes sense in the first place in the context of a celebrity culture where deferring to Scott and Eliezer is expected behavior. (In a way that deferring to Julia Galef or John S. Wentworth is not expected behavior, even if Galef and Wentworth also have a track record as good thinkers.) I think this culture is bad. _Nullius in verba_. + > Also: speaking as someone who's read and enjoyed your LW content, I do hope this isn't a sign that you're going full post-rat. It was bad enough when QC did it (though to his credit QC still has pretty decent Twitter takes, unlike most post-rats). 
https://twitter.com/davidxu90/status/1435106339550740482 diff --git a/notes/memoir-sections.md b/notes/memoir-sections.md index 705ec95..c89d969 100644 --- a/notes/memoir-sections.md +++ b/notes/memoir-sections.md @@ -1,10 +1,10 @@ waypoints— ✓ Jessica help with "Unnatural Categories" -_ wireheading his fiction subreddit -_ discrediting to the community -_ let's recap -_ people-I-usually-respect footnote -_ Yudkowsky is trying to be a religious leader +- wireheading his fiction subreddit +- discrediting to the community +- let's recap +- people-I-usually-respect footnote +- Yudkowsky is trying to be a religious leader medium section now— _ existential risk interlude; social justice game theory; I could forgive him @@ -38,6 +38,7 @@ _ more examples of Yudkowsky's arrogance far editing tier— +_ pull "agreeing with Stalin" quote earlier in ms. to argue that Yudkowsky apparently doesn't disagree with my "deliberately ambiguous" _ elaborate on why I'm not leaking sensitive bits, by explaining what can be inferred by what was and wasn't said in public _ footnote on "no one would even consider" _ post-Christmas conversation should do a better job of capturing the war, that Jessica thinks Scott is Bad for being a psychiatrist @@ -1209,49 +1210,12 @@ If you _don't_ have intent-to-inform, but make sure to never, ever say false thi ---- - - bitter comments about rationalists— https://www.greaterwrong.com/posts/qXwmMkEBLL59NkvYR/the-lesswrong-2018-review-posts-need-at-least-2-nominations/comment/d4RrEizzH85BdCPhE https://www.greaterwrong.com/posts/tkuknrjYCbaDoZEh5/could-we-solve-this-email-mess-if-we-all-moved-to-paid/comment/ZkreTspP599RBKsi7 - - ------ -Yudkowsky's hyper-arrogance— -> I aspire to make sure my departures from perfection aren't noticeable to others, so this tweet is very validating. -https://twitter.com/ESYudkowsky/status/1384671335146692608 - -* papal infallability / Eliezer Yudkowsky facts -https://www.lesswrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts?commentId=Aq9eWJmK6Liivn8ND -Never go in against Eliezer Yudkowsky when anything is on the line. -https://en.wikipedia.org/wiki/Chuck_Norris_facts - -"epistemic hero" -https://twitter.com/ESYudkowsky/status/1096769579362115584 - -https://twitter.com/ESYudkowsky/status/1096769579362115584 -> When an epistemic hero seems to believe something crazy, you are often better off questioning "seems to believe" before questioning "crazy", and both should be questioned before shaking your head sadly about the mortal frailty of your heroes. - -https://twitter.com/ESYudkowsky/status/1434906470248636419 -> Anyways, Scott, this is just the usual division of labor in our caliphate: we're both always right, but you cater to the crowd that wants to hear it from somebody too modest to admit that, and I cater to the crowd that wants somebody out of that closet. - -Okay, I get that it was meant as humorous exaggeration. But I think it still has the effect of discouraging people from criticizing Scott or Eliezer because they're the leaders of the caliphate. I spent three and a half years of my life explaining in exhaustive, exhaustive detail, with math, how Scott was wrong about something, no one serious actually disagrees, and Eliezer is still using his social power to boost Scott's right-about-everything (!!) reputation. That seems really unfair, in a way that isn't dulled by "it was just a joke." 
- -Or as Yudkowsky put it— - -https://www.facebook.com/yudkowsky/posts/10154981483669228 -> I know that it's a bad sign to worry about which jokes other people find funny. But you can laugh at jokes about Jews arguing with each other, and laugh at jokes about Jews secretly being in charge of the world, and not laugh at jokes about Jews cheating their customers. Jokes do reveal conceptual links and some conceptual links are more problematic than others. - -It's totally understandable to not want to get involved in a political scuffle because xrisk reduction is astronomically more important! But I don't see any plausible case that metaphorically sucking Scott's dick in public reduces xrisk. It would be so easy to just not engage in this kind of cartel behavior! - -An analogy: racist jokes are also just jokes. Alice says, "What's the difference between a black dad and a boomerang? A boomerang comes back." Bob says, "That's super racist! Tons of African-American fathers are devoted parents!!" Alice says, "Chill out, it was just a joke." In a way, Alice is right. It was just a joke; no sane person could think that Alice was literally claiming that all black men are deadbeat dads. But, the joke only makes sense in the first place in context of a culture where the black-father-abandonment stereotype is operative. If you thought the stereotype was false, or if you were worried about it being a self-fulfilling prophecy, you would find it tempting to be a humorless scold and get angry at the joke-teller. - -Similarly, the "Caliphate" humor only makes sense in the first place in the context of a celebrity culture where deferring to Scott and Eliezer is expected behavior. (In a way that deferring to Julia Galef or John S. Wentworth is not expected behavior, even if Galef and Wentworth also have a track record as good thinkers.) I think this culture is bad. _Nullius in verba_. - - [TODO: asking Anna to weigh in] (I figured that spamming people with hysterical and somewhat demanding physical postcards was more polite (and funnier) than my recent habit of spamming people with hysterical and somewhat demanding emails.) - https://trevorklee.substack.com/p/the-ftx-future-fund-needs-to-slow > changing EA to being a social movement from being one where you expect to give money @@ -1609,3 +1573,5 @@ messages with Leon about PDF templating as an easier task were on 22-23 May Berkeley rat culture trains people to steer towards fake agreements rather than clarifying disagreement, because realistic models of disagreement include bad faith (being wrong because of reasons rather than randomly), which is against the principle of charity Greg Egan's "Closer" (1992) predicted language models, and seemed wild at the time + +The text of _Inadequate Equilibria_ is more modest than his rhetorical marketing blunder