From: M. Taylor Saotome-Westlake
Date: Thu, 29 Sep 2022 15:48:48 +0000 (-0700)
Subject: check in
X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=7c03008475ef027a5ad4cdbe1213e8271bb77568;p=Ultimately_Untrue_Thought.git

check in
---

diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
index 03cd742..4406caf 100644
--- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
+++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
@@ -756,9 +756,9 @@ _I knew_. Even then, _I knew_ I had to qualify my not liking to be tossed into a

It would seem that in the current year, that culture is dead—or at least, if it does have any remaining practitioners, they do not include Eliezer Yudkowsky.

-At this point, some people would argue that I'm being too uncharitable in harping on the "not liking to be tossed into a [...] Bucket" paragraph. The same post does also explicitly say that "[i]t's not that no truth-bearing propositions about these issues can possibly exist." I agree that there are some interpretations of "not lik[ing] to be tossed into a Male Bucket or Female Bucket" that make sense, even though biological sex denialism does not make sense. Given that the author is Eliezer Yudkowsky, should I not assume that he "really meant" to communicate the reading that does make sense, rather than the one that doesn't make sense?
+At this point, some people would argue that I'm being too uncharitable in harping on the "not liking to be tossed into a [...] Bucket" paragraph. The same post does _also_ explicitly say that "[i]t's not that no truth-bearing propositions about these issues can possibly exist." I _agree_ that there are some interpretations of "not lik[ing] to be tossed into a Male Bucket or Female Bucket" that make sense, even though biological sex denialism does not make sense. Given that the author is Eliezer Yudkowsky, should I not give him the benefit of the doubt and assume that he "really meant" to communicate the reading that does make sense, rather than the one that doesn't make sense?

-I reply: _given that the author is Eliezer Yudkowsky_, no, obviously not. Yudkowsky is just _too talented of a writer_ for me to excuse his words as an artifact of unclear writing. Where the text is ambiguous about whether biological sex is a real thing that people should be able to talk about, I think it's _deliberately_ ambiguous. When smart people act dumb, [it's often wise to conjecture that their behavior represents _optimized_ stupidity](https://www.lesswrong.com/posts/sXHQ9R5tahiaXEZhR/algorithmic-intent-a-hansonian-generalized-anti-zombie)—apparent "stupidity" that achieves a goal through some other channel than their words straightforwardly reflecting the truth. Someone who was _actually_ stupid wouldn't be able to generate text with a specific balance of insight and selective stupidity fine-tuned to reach a gender-politically convenient conclusion without explicitly invoking any controversial gender-political reasoning. The point of the post is to pander to the biological sex denialists in his robot cult, without technically saying anything unambiguously false that someone could point out as a "lie."
+I reply: _given that the author is Eliezer Yudkowsky_, no, obviously not. Yudkowsky is just _too talented of a writer_ for me to excuse his words as an artifact of unclear writing. Where the text is ambiguous about whether biological sex is a real thing that people should be able to talk about, I think it's _deliberately_ ambiguous. When smart people act dumb, [it's often wise to conjecture that their behavior represents _optimized_ stupidity](https://www.lesswrong.com/posts/sXHQ9R5tahiaXEZhR/algorithmic-intent-a-hansonian-generalized-anti-zombie)—apparent "stupidity" that achieves a goal through some other channel than their words straightforwardly reflecting the truth. Someone who was _actually_ stupid wouldn't be able to generate text with a specific balance of insight and selective stupidity fine-tuned to reach a gender-politically convenient conclusion without explicitly invoking any controversial gender-political reasoning. I think the point of the post is to pander to the biological sex denialists in his robot cult, without technically saying anything unambiguously false that someone could point out as a "lie."

Consider the implications of Yudkowsky giving us a clue as to the political forces at play in the form of [a disclaimer comment](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228):

@@ -857,7 +857,7 @@ This is a shockingly high standard for anyone to aspire to live up to—but what

"I don't see what the alternative is besides getting shot," Yudkowsky muses (where presumably, 'getting shot' is a metaphor for a large negative utility, like being unpopular with progressives). Yes, an astute observation! And _any other partisan hack could say exactly the same_, for the same reason. Why does the campaign manager withhold the results of the 11th question? Because he doesn't see what the alternative is besides getting shot.

-If the idea of being fired from the Snodgrass campaign or being unpopular with progressives is _so_ terrifying to you that it seems analogous to getting shot, then, if those are really your true values, then sure—say whatever you need to say to keep your job and your popularity, as is personally prudent. You've set your price. But if the price you put on the intellectual integrity of your so-called "rationalist" community is similar to that of the Snodgrass for Mayor campaign, you shouldn't be surprised if intelligent, discerning people accord similar levels of credibility to the two groups' output.
+Yudkowsky [sometimes](https://www.lesswrong.com/posts/K2c3dkKErsqFd28Dh/prices-or-bindings) [quotes](https://twitter.com/ESYudkowsky/status/1456002060084600832) _Calvin and Hobbes_: "I don't know which is worse, that everyone has his price, or that the price is always so low." If the idea of being fired from the Snodgrass campaign or being unpopular with progressives is _so_ terrifying to you that it seems analogous to getting shot, then, if those are really your true values, then sure—say whatever you need to say to keep your job and your popularity, as is personally prudent. You've set your price. But if the price you put on the intellectual integrity of your so-called "rationalist" community is similar to that of the Snodgrass for Mayor campaign, you shouldn't be surprised if intelligent, discerning people accord similar levels of credibility to the two groups' output.

I see the phrase "bad faith" thrown around more than I think people know what it means. "Bad faith" doesn't mean "with ill intent", and it's more specific than "dishonest": it's [adopting the surface appearance of being moved by one set of motivations, while actually acting from another](https://en.wikipedia.org/wiki/Bad_faith).
@@ -1131,7 +1131,11 @@ I don't doubt Yudkowsky could come up with some clever casuistry why, _technical

[TODO: if he's reading this, win back respect— reply, motherfucker]

-[TODO: the Death With Dignity era]
+[TODO: the Death With Dignity era
+
+/2017/Jan/from-what-ive-tasted-of-desire/
+
+]

I don't, actually, know how to prevent the world from ending. Probably we were never going to survive. (The cis-human era of Earth-originating intelligent life wasn't going to last forever, and it's hard to exert detailed control over what comes next.) But if we're going to die either way, I think it would be _more dignified_ if Eliezer Yudkowsky were to behave as if he wanted his faithful students to be informed. Since it doesn't look like we're going to get that, I think it's _more dignified_ if his faithful students _know_ that he's not behaving like he wants us to be informed. And so one of my goals in telling you this long story about how I spent (wasted?) the last six years of my life, is to communicate the moral that **I don't trust Eliezer Yudkowsky to tell the truth, and I don't think you should trust him, either**—and that this is a _problem_ for the future of humanity, to the extent that there is a future of humanity.

diff --git a/notes/a-hill-of-validity-sections.md b/notes/a-hill-of-validity-sections.md
index 22b9429..1352406 100644
--- a/notes/a-hill-of-validity-sections.md
+++ b/notes/a-hill-of-validity-sections.md
@@ -790,8 +790,6 @@ So, because

-----

-Yudkowsky [sometimes](https://www.lesswrong.com/posts/K2c3dkKErsqFd28Dh/prices-or-bindings) [quotes](https://twitter.com/ESYudkowsky/status/1456002060084600832) _Calvin and Hobbes_: "I don't know which is worse, that everyone has his price, or that the price is always so low."
-
a rationality community that can't think about _practical_ issues that affect our day to day lives, but can get existential risk stuff right, is like asking for self-driving car software that can drive red cars but not blue cars

It's a _problem_ if public intellectuals in the current year need to pretend to be dumber than seven-year-olds in 2016

@@ -1230,3 +1228,7 @@ Still citing it (19 Sep 22): https://twitter.com/ingalala/status/156839169106472
If you _have_ intent-to-inform and occasionally end up using your megaphone to say false things (out of sloppiness or motivated reasoning in the passion of the moment), it's actually not that big of a deal, as long as you're willing to acknowledge corrections. (It helps if you have critics who personally hate your guts and therefore have a motive to catch you making errors, and a discerning audience who will only reward the critics for finding real errors and not fake errors.) In the long run, the errors cancel out.

If you _don't_ have intent-to-inform, but make sure to never, ever say false things (because you know that "lying" is wrong, and think that as long as you haven't "lied", you're in the clear), but you don't feel like you have an obligation to acknowledge criticisms (for example, because you think you and your flunkies are the only real people in the world, and anyone who doesn't want to become one of your flunkies can be disdained as a "post-rat"), that's potentially a much worse situation, because the errors don't cancel.
+
+----
+
+comment on pseudo-lies post in which he says it's OK for me to comment even though