From: M. Taylor Saotome-Westlake Date: Sun, 14 Aug 2022 21:05:26 +0000 (-0700) Subject: memoir: Anna–Michael feud X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=d7146ef7e8c4959445bebfb702ffe5bb618b9c8d;p=Ultimately_Untrue_Thought.git memoir: Anna–Michael feud --- diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md index de52b0d..f85e7ae 100644 --- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md +++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md @@ -246,26 +246,18 @@ One thing I regret about my behavior during this period was the extent to which Now, the memory of that social proof was a lifeline. Dear reader, if you've never been in the position of disagreeing with the entire weight of Society's educated opinion, _including_ your idiosyncratic subculture that tells itself a story about being smarter than the surrounding Society—well, it's stressful. [There was a comment on /r/slatestarcodex around this time](https://old.reddit.com/r/slatestarcodex/comments/anvwr8/experts_in_any_given_field_how_would_you_say_the/eg1ga9a/) that cited Yudkowsky, Alexander, Ozy, _The Unit of Caring_, and Rob Bensinger as leaders of the "rationalist" community—just an arbitrary Reddit comment of no significance whatsoever—but it was a salient indicator of the _Zeitgeist_ to me, because _[every](https://twitter.com/ESYudkowsky/status/1067183500216811521) [single](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/) [one](https://thingofthings.wordpress.com/2018/06/18/man-should-allocate-some-more-categories/) of [those](https://theunitofcaring.tumblr.com/post/171986501376/your-post-on-definition-of-gender-and-woman-and) [people](https://www.facebook.com/robbensinger/posts/10158073223040447?comment_id=10158073685825447&reply_comment_id=10158074093570447)_ had tried to get away with some variant on the "categories
are subjective, therefore you have no grounds to object to the claim that trans women are women" _mind game_. -In the face of that juggernaut of received opinion, I was already feeling pretty gaslighted. ("We ... we had a whole Sequence about this. Didn't we? And, and ... [_you_ were there](https://tvtropes.org/pmwiki/pmwiki.php/Main/AndYouWereThere), and _you_ were there ... It—really happened, right? I didn't just imagine it? The [hyperlinks](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong) [still](https://www.lesswrong.com/posts/d5NyJ2Lf6N22AD9PB/where-to-draw-the-boundary) [work](https://www.lesswrong.com/posts/yLcuygFfMfrfK8KjF/mutual-information-and-density-in-thingspace) ...") I don't know how my mind would have held up intact if I were just facing it alone; it's hard to imagine what I would have done in that case. I definitely wouldn't have had the impudence to pester Scott and Yudkowsky the way I did—_especially_ Yudkowsky—if it was just me against everyone else. - -But _Michael thought I was in the right_—not just intellectually on the philosophy issue, but morally in the right to be _prosecuting_ the philosophy issue, and not accepting stonewalling as an answer. That meant a lot to me. - - -[TODO SECTION: Anna Michael feud - * This may have been less effective than it was in my head; _I remembered_ Michael as being high-status - * Anna's 2 Mar comment badmouthing Michael - * my immediate response: I strongly agree with your point about "ridicule of obviously-fallacious reasoning plays an important role in discerning which thinkers can (or can't) help fill these functions"! That's why I'm so heartbroken about the "categories are arbitrary, therefore trans women are women" thing, which deserves to be laughed out of the room.
- * "sacrificed all hope of success in favor of maintaining his own sanity by CC'ing you guys" - * Anna's case against Michael: he was talking to Devi even when Devi needed a break, and he wanted to destroy EA - * I remember at a party in 2015ish, asking Michael what else I should invest my money in, if not New Harvest/GiveWell, and his response was, "You" - * backstory of anti-EA sentiment: Ben's critiques, Sarah's "EA Has a Lying Problem"—Michael had been in the background - * Anna had any actual dirt on him, you'd expect her to use it while trashing him in public, but her only example basically amounts to "he gave people career advice I disagree with" - * "I should have noticed earlier that my emotional dependence on "Michael says X" validation is self-undermining, because Michael says that the thing that makes me valuable is my ability to think independently." - * fairly destructive move - * https://everythingtosaveit.how/case-study-cfar/#attempting-to-erase-the-agency-of-everyone-who-agrees-with-our-position - http://benjaminrosshoffman.com/why-i-am-no-longer-supporting-reach/ - He ... flatters people? He ... _didn't_ tell people to abandon their careers? What?! -] +In the face of that juggernaut of received opinion, I was already feeling pretty gaslighted. ("We ... we had a whole Sequence about this. Didn't we? And, and ... [_you_ were there](https://tvtropes.org/pmwiki/pmwiki.php/Main/AndYouWereThere), and _you_ were there ... It—really happened, right? I didn't just imagine it? The [hyperlinks](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong) [still](https://www.lesswrong.com/posts/d5NyJ2Lf6N22AD9PB/where-to-draw-the-boundary) [work](https://www.lesswrong.com/posts/yLcuygFfMfrfK8KjF/mutual-information-and-density-in-thingspace) ...") I don't know how my mind would have held up intact if I were just facing it alone; it's hard to imagine what I would have done in that case. 
I definitely wouldn't have had the impudence to pester Scott and Eliezer the way I did—especially Eliezer—if it was just me alone against everyone else. + +But _Michael thought I was in the right_—not just intellectually on the philosophy issue, but morally in the right to be _prosecuting_ the philosophy issue, and not accepting stonewalling as an answer. That social proof gave me a lot of social bravery that I otherwise wouldn't have been able to muster—even though it would have been better if I could have propagated the implications of the observation that my dependence on him was self-undermining, because Michael himself said that the thing that made me valuable was my ability to think independently. + +The social proof was probably more effective in my own head than it was with anyone we were arguing with. _I remembered_ Michael as a high-status community elder back in the _Overcoming Bias_ era, but that was a long time ago. (Luke Muehlhauser had taken over leadership of the Singularity Institute in 2011; some sort of rift between Michael and Eliezer had widened in recent years, the details of which had never been explained to me.) Michael's status in "the community" of 2019 was much more mixed. He was intensely critical of the rise of Effective Altruism. (I remember asking Michael at a party in 2015 what else I should spend my San Francisco software engineer money on if not the EA charities I was considering, and being surprised that his answer was, "You.") + +Another blow to Michael's "community" reputation was dealt on 27 February, when Anna [published a comment badmouthing Michael and suggesting that talking to him was harmful](https://www.lesswrong.com/posts/u8GMcpEN9Z6aQiCvp/rule-thinkers-in-not-out?commentId=JLpyLwR2afav2xsyD), which I found pretty disappointing—more so as I began to realize the implications.
+ +I agreed with her point about how "ridicule of obviously-fallacious reasoning plays an important role in discerning which thinkers can (or can't) help fill these functions." That's why I was so heartbroken about the "categories are arbitrary, therefore trans women are women" thing, which deserved to be _laughed out of the room_. Why was she trying to ostracize the guy who was one of the very few to back me up on this incredibly obvious thing!? The reasons to discredit Michael given in the comment seemed incredibly weak. (He ... flatters people? He ... _didn't_ tell people to abandon their careers? What?) And the anti-Michael evidence she offered in private didn't seem much more compelling (_e.g._, at a CfAR event, he had been insistent on continuing to talk to someone who Anna thought was looking sleep-deprived and needed a break). + +It made sense for Anna to not like Michael, because of his personal conduct, or because he didn't like EA. (Expecting all of my friends to be friends with _each other_ would be [Geek Social Fallacy #4](http://www.plausiblydeniable.com/opinion/gsf.html).) If she didn't want to invite him to CfAR stuff, fine; that's her business. But what did she gain from _escalating_ to publicly denouncing him as someone whose "lies/manipulations can sometimes disrupt [people's] thinking for long and costly periods of time"?! + [TODO SECTION: RIP Culture War thread, and defense against alt-right categorization @@ -308,9 +300,9 @@ Meanwhile, my email thread with Scott got started back up again, although I wasn One of Alexander's [most popular _Less Wrong_ posts ever had been about the noncentral fallacy, which Alexander called "the worst argument in the world"](https://www.lesswrong.com/posts/yCWPkLi8wJvewPbEp/the-noncentral-fallacy-the-worst-argument-in-the-world): for example, those who crow that abortion is _murder_ (because murder is the killing of a human being), or that Martin Luther King, Jr.
was a _criminal_ (because he defied the segregation laws of the South), are engaging in a dishonest rhetorical maneuver in which they're trying to trick their audience into attributing attributes of the typical "murder" or "criminal" onto what are very noncentral members of those categories. -_Even if_ you're opposed to abortion, or have negative views about the historical legacy of Dr. King, this isn't the right way to argue. If you call Janie a _murderer_, that causes me to form a whole bunch of implicit probabilistic expectations—about Janie's moral character, about the suffering of victim whose hopes and dreams were cut short, about Janie's relationship with the law, _&c._—most of which get violated when you subsequently reveal that the murder victim was a fetus. +_Even if_ you're opposed to abortion, or have negative views about the historical legacy of Dr. King, this isn't the right way to argue. If you call Janie a _murderer_, that causes me to form a whole bunch of implicit probabilistic expectations—about Janie's moral character, about the suffering of the victim whose hopes and dreams were cut short, about Janie's relationship with the law, _&c._—most of which get violated when you subsequently reveal that the murder victim was a four-week-old fetus. -Thus, we see that Alexander's own "The Worst Argument in the World" is really complaining about the _same_ category-gerrymandering move that his "... Not Man for the Categories" comes out in favor of. We would not let someone get away with declaring, "I ought to accept an unexpected abortion or two deep inside the conceptual boundaries of what would normally not be considered murder if it'll save someone's life." +Thus, we see that Alexander's own "The Worst Argument in the World" is really complaining about the _same_ category-gerrymandering move that his "... Not Man for the Categories" comes out in favor of.
We would not let someone get away with declaring, "I ought to accept an unexpected abortion or two deep inside the conceptual boundaries of what would normally not be considered murder if it'll save someone's life." Maybe abortion _is_ wrong, but you need to make that case _on the merits_, not by linguistic fiat. ... Scott still didn't get it. He said that he didn't see why he shouldn't accept one unit of categorizational awkwardness in exchange for sufficiently large utilitarian benefits. I started drafting a long reply—but then I remembered that in recent discussion with my posse about what we might have done wrong in our attempted outreach to Yudkowsky, the idea had come up that in-person meetings are better for updateful disagreement-resolution. Would Scott be up for meeting in person some weekend? Non-urgent. Ben would be willing to moderate, unless Scott wanted to suggest someone else, or no moderator. diff --git a/notes/a-hill-of-validity-sections.md b/notes/a-hill-of-validity-sections.md index 55f6258..3fb83bd 100644 --- a/notes/a-hill-of-validity-sections.md +++ b/notes/a-hill-of-validity-sections.md @@ -1115,3 +1115,4 @@ all I actually want out of a post-Singularity utopia is the year 2007 except tha The McGonagall turning into a cat parody may actually be worth fitting in—McGonagall turning into a cat broke Harry's entire worldview. Similarly, the "pretend to turn into a cat, and everyone just buys it" maneuver broke my religion + * https://everythingtosaveit.how/case-study-cfar/#attempting-to-erase-the-agency-of-everyone-who-agrees-with-our-position diff --git a/notes/post_ideas.txt b/notes/post_ideas.txt index f0cc1e4..15b2bba 100644 --- a/notes/post_ideas.txt +++ b/notes/post_ideas.txt @@ -10,6 +10,9 @@ _ mention Nov.
2018 conversation with Ian somehow _ Said on Yudkowsky's retreat to Facebook being bad for him _ Discord discourse with Alicorner _ screenshot Rob's Facebook comment which I link +_ explain first use of Center for Applied Rationality +_ erasing agency of Michael's friends, construed as a pawn +_ Anna thought badmouthing Michael was OK by Michael's standards Urgent/needed for healing—