From: M. Taylor Saotome-Westlake Date: Sat, 13 Aug 2022 20:24:00 +0000 (-0700) Subject: memoir: tap at growing ms. X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=470ecee08dff4995b7629ed1ae7cdb6e1555f729;p=Ultimately_Untrue_Thought.git memoir: tap at growing ms. --- diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md index 4c40d73..2c15c5f 100644 --- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md +++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md @@ -245,11 +245,45 @@ Ben thought that making them understand was hopeless and that becoming a stronge (I guess I'm only now, after spending an additional three years exhausting every possible line of argument, taking Ben's advice on this by writing this memoir. Sorry, Ben—and thanks.) +[TODO SECTION: relying on Michael too much; I'm not crazy + * "I should have noticed earlier that my emotional dependence on "Michael says X" validation is self-undermining, because Michael says that the thing that makes me valuable is my ability to think independently." + * fairly destructive move +* _Everyone got it wrong_. There was a comment on /r/slatestarcodex the other week that cited Scott, Eliezer, Ozy, Kelsey, and Rob as leaders of the rationalist movement. https://www.reddit.com/r/slatestarcodex/comments/anvwr8/experts_in_any_given_field_how_would_you_say_the/eg1ga9a/ + +"We ... we had a whole Sequence about this. Didn't we? And, and ... [_you_ were there](https://tvtropes.org/pmwiki/pmwiki.php/Main/AndYouWereThere), and _you_ were there ... It—really happened, right? I didn't just imagine it? The [hyperlinks](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong) [still](https://www.lesswrong.com/posts/d5NyJ2Lf6N22AD9PB/where-to-draw-the-boundary) [work](https://www.lesswrong.com/posts/yLcuygFfMfrfK8KjF/mutual-information-and-density-in-thingspace) ..." +] + + +[TODO SECTION: Anna Michael feud + * Anna's 2 Mar comment badmouthing Michael + * This may have been less effective than it was in my head; I _remembered_ Michael as being high-status + * my immediate response: I strongly agree with your point about "ridicule of obviously-fallacious reasoning plays an important role in discerning which thinkers can (or can't) help fill these functions"! That's why I'm so heartbroken about the "categories are arbitrary, therefore trans women are women" thing, which deserves to be laughed out of the room. + * Anna's case against Michael: he was talking to Devi even when Devi needed a break, and he wanted to destroy EA + * I remember at a party in 2015ish, asking Michael what else I should invest my money in, if not New Harvest/GiveWell, and his response was, "You" + * backstory of anti-EA sentiment: Ben's critiques, Sarah's "EA Has a Lying Problem"—Michael had been in the background + * if Anna had any actual dirt on him, you'd expect her to use it while trashing him in public, but her only example basically amounts to "he gave people career advice I disagree with" + http://benjaminrosshoffman.com/why-i-am-no-longer-supporting-reach/ + He ... flatters people? He ... didn't tell people to abandon their careers? What?! +] + + +I wasn't the only one whose life was being disrupted by political drama in early 2019.
On 22 February, Scott Alexander [posted that the /r/slatestarcodex Culture War Thread was being moved](https://slatestarcodex.com/2019/02/22/rip-culture-war-thread/) to a new non–_Slate Star Codex_–branded subreddit in the hopes that it would help curb some of the harassment he had been receiving. + +The problem with hosting an open discussion, Alexander explained, wasn't the difficulty of moderating obvious spam or advocacy of violence. + +The pro + + [TODO SECTION: RIP Culture War thread, and defense against alt-right categorization -https://slatestarcodex.com/2019/02/22/rip-culture-war-thread/ + * I wasn't the only one whose life was being wracked with political drama: Scott exiled the /r/slatestarcodex Culture War Thread to /r/TheMotte due to private harassment - * Yudkowsky: "Your annual reminder that Slate Star Codex is not and never was alt-right, every real stat shows as much, and the primary promoters of this lie are sociopaths who get off on torturing incredibly nice targets like Scott A." + +> Your annual reminder that Slate Star Codex is not and never was alt-right, every real stat shows as much, and the primary promoters of this lie are sociopaths who get off on torturing incredibly nice targets like Scott A. + + + + * Suppose the one were to reply: "Using language in a way you dislike, openly and explicitly and with public focus on the language and its meaning, is not lying. The proposition you claim false (Scott Alexander's explicit advocacy of a white ethnostate?) is not what the speech is meant to convey—and this is known to everyone involved, it is not a secret. You're not standing in defense of truth if you insist on a word, brought explicitly into question, being used with some particular meaning. Now, maybe as a matter of policy, you want to make a case for language like 'alt-right' being used a certain way. Well, that's a separate debate then. But you're not making a stand for Truth in doing so, and your opponents aren't tricking anyone or trying to." * What direct falsehood is being asserted by Scott's detractors? I don't think anyone is claiming that, say, Scott identifies as alt-right (not even privately), any more than anyone is claiming that trans women have two X chromosomes. Sneer Club has been pretty explicit in their criticism * examples: @@ -271,29 +305,12 @@ Well, you're still somewhat better off listening to them than the whistling of t * I know you're very busy; I know your work's important—but it might be a useful exercise? Just for a minute, to think of what you would actually say if someone with social power _actually did this to you_ when you were trying to use language to reason about Something you had to Protect? ] -Without disclosing any _specific content_ from private conversations with Yudkowsky that may or may not have happened, I think I am allowed to say that our posse did not get the kind of engagement from him that we were hoping for. (That is, I'm Glomarizing over whether Yudkowsky just didn't reply, or whether he did reply and our posse was not satisfied with the response.) Michael said that it seemed important that, if we thought Yudkowsky wasn't interested, we should have common knowledge among ourselves that we consider him to be choosing to be a cult leader.
-[TODO SECTION: relying on Michael too much; I'm not crazy - * This may have been less effective than it was in my head; I _remembered_ Michael as being high-status - * "I should have noticed earlier that my emotional dependence on "Michael says X" validation is self-undermining, because Michael says that the thing that makes me valuable is my ability to think independently." - * fairly destructive move -* _Everyone got it wrong_. there was a comment on /r/slatestarcodex the other week that cited Scott, Eliezer, Ozy, Kelsey, and Rob as leaders of rationalist movement. https://www.reddit.com/r/slatestarcodex/comments/anvwr8/experts_in_any_given_field_how_would_you_say_the/eg1ga9a/ - * "losing his ability to model other people and I'm worried about him", I think Ben-and-Jessica would see as [...] angry about living in simulacrum level 3 and we're worried about everyone else." +Without disclosing any _specific content_ from private conversations with Yudkowsky that may or may not have happened, I think I am allowed to say that our posse did not get the kind of engagement from Yudkowsky that we were hoping for. (That is, I'm Glomarizing over whether Yudkowsky just didn't reply, or whether he did reply and our posse was not satisfied with the response.) -"We ... we had a whole Sequence about this. Didn't we? And, and ... [_you_ were there](https://tvtropes.org/pmwiki/pmwiki.php/Main/AndYouWereThere), and _you_ were there ... It—really happened, right? I didn't just imagine it? The [hyperlinks](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong) [still](https://www.lesswrong.com/posts/d5NyJ2Lf6N22AD9PB/where-to-draw-the-boundary) [work](https://www.lesswrong.com/posts/yLcuygFfMfrfK8KjF/mutual-information-and-density-in-thingspace) ..." -] +Michael said that it seemed important that, if we thought Yudkowsky wasn't interested, we should have common knowledge among ourselves that we consider him to be choosing to be a cult leader. -[TODO: Anna Michael feud - * Anna's 2 Mar comment badmouthing Michael - * my immediate response: I strongly agree with your point about "ridicule of obviously-fallacious reasoning plays an important role in discerning which thinkers can (or can't) help fill these functions"! That's why I'm so heartbroken about the "categories are arbitrary, therefore trans women are women" thing, which deserves to be laughed out of the room. - * Anna's case against Michael: he was talking to Devi even when Devi needed a break, and he wanted to destroy EA - * I remember at a party in 2015ish, asking Michael what else I should invest my money in, if not New Harvest/GiveWell, and his response was, "You" - * backstory of anti-EA sentiment: Ben's critiques, Sarah's "EA Has a Lying Problem"—Michael had been in the background - * Anna had any actual dirt on him, you'd expect her to use it while trashing him in public, but her only example basically amounts to "he gave people career advice I disagree with" - http://benjaminrosshoffman.com/why-i-am-no-longer-supporting-reach/ -] - -Meaninwhile, my email thread with Scott got started back up, although I wasn't expecting anything to come out of it. I expressed some regret that all the times I had emailed him over the past couple years had been when I was upset about something (like psych hospitals, or—something else) and wanted something from him, which was bad, because it was treating him as a means rather than an end—and then, despite that regret, continued prosecuting the argument. 
+Meanwhile, my email thread with Scott got started back up, although I wasn't expecting anything to come out of it. I expressed some regret that all the times I had emailed him over the past couple years had been when I was upset about something (like psych hospitals, or—something else) and wanted something from him, which was bad, because it was treating him as a means rather than an end—and then, despite that regret, continued prosecuting the argument. One of Alexander's [most popular _Less Wrong_ posts ever had been about the noncentral fallacy, which Alexander called "the worst argument in the world"](https://www.lesswrong.com/posts/yCWPkLi8wJvewPbEp/the-noncentral-fallacy-the-worst-argument-in-the-world): for example, those who crow that abortion is _murder_ (because murder is the killing of a human being), or that Martin Luther King, Jr. was a _criminal_ (because he defied the segregation laws of the South), are engaging in a dishonest rhetorical maneuver in which they're trying to trick their audience into projecting the attributes of the typical "murder" or "criminal" onto what are very noncentral members of those categories. @@ -301,7 +318,7 @@ _Even if_ you're opposed to abortion, or have negative views about the historica Thus, we see that Alexander's own "The Worst Argument in the World" is really complaining about the _same_ category-gerrymandering move that his "... Not Man for the Categories" comes out in favor of. We would not let someone get away with declaring, "I ought to accept an unexpected abortion or two deep inside the conceptual boundaries of what would normally not be considered murder if it'll save someone's life." -... Scott still didn't get it. He said that he didn't see why he shouldn't accept one unit of categorizational awkwardness in exchange for sufficiently large utilitarian benefits. I started drafting a long reply—but then I remembered that in recent discussion with my posse about what had gone wrong in our attempted outreach to Yudkowsky, the idea had come up that in-person meetings are better for updateful disagreement-resolution. Would Scott be up for meeting in person some weekend? Non-urgent. Ben would be willing to moderate, unless Scott wanted to suggest someone else, or no moderator. +... Scott still didn't get it. He said that he didn't see why he shouldn't accept one unit of categorizational awkwardness in exchange for sufficiently large utilitarian benefits. I started drafting a long reply—but then I remembered that in recent discussion with my posse about what we might have done wrong in our attempted outreach to Yudkowsky, the idea had come up that in-person meetings are better for updateful disagreement-resolution. Would Scott be up for meeting in person some weekend? Non-urgent. Ben would be willing to moderate, unless Scott wanted to suggest someone else, or no moderator. ... Scott didn't want to meet. At this point, I considered resorting to the tool of cheerful prices again, which I hadn't yet used against Scott—to say, "That's totally understandable! Would a financial incentive change your decision? For a two-hour meeting, I'd be happy to pay up to $4000 to you or your preferred charity. If you don't want the money, then sure, yes, let's table this. I hope you're having a good day." But that seemed sufficiently psychologically coercive and socially weird that I wasn't sure I wanted to go there.
I emailed my posse asking what they thought—and then added that maybe they shouldn't reply until Friday, because it was Monday, and I really needed to focus on my dayjob that week. @@ -353,6 +370,9 @@ Maybe that's why I felt like I had to stand my ground and fight a culture war to * We need to figure out how to win against bad faith arguments ] + + + [TODO: Jessica joins the coalition; she tells me about her time at MIRI (link to Zoe-piggyback and Occupational Infohazards); Michael said that me and Jess together have more moral authority] [TODO: wrapping up with Scott; Kelsey; high and low Church https://slatestarcodex.com/2019/07/04/some-clarifications-on-rationalist-blogging/] diff --git a/notes/i-tell-myself-sections.md b/notes/i-tell-myself-sections.md index e801fd6..eac46dd 100644 --- a/notes/i-tell-myself-sections.md +++ b/notes/i-tell-myself-sections.md @@ -140,4 +140,6 @@ This post has been incredibly emotionally difficult to write, because intellectu ------ -[Dark Side Epistemology] \ No newline at end of file +[Dark Side Epistemology] + +Is this the hill _he_ wants to die on? \ No newline at end of file diff --git a/notes/post_ideas.txt b/notes/post_ideas.txt index 76fb779..437c408 100644 --- a/notes/post_ideas.txt +++ b/notes/post_ideas.txt @@ -5,6 +5,8 @@ _ maybe quote Michael's Nov 2018 texts? _ the right way to explain how I'm respecting Yudkowsky's privacy _ clarify sequence of outreach attempts _ clarify existence of a shadow posse member +_ mention Nov. 2018 conversation with Ian somehow +_ Said on Yudkowsky's retreat to Facebook being bad for him Urgent/needed for healing—