From: M. Taylor Saotome-Westlake Date: Sun, 2 Oct 2022 18:13:19 +0000 (-0700) Subject: memoir: skeleton prep for Category War initial defeat § X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=c0401d54544af0eeff0eea09ba29b29879c0aba2;p=Ultimately_Untrue_Thought.git memoir: skeleton prep for Category War initial defeat § --- diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md index 7bddde6..b341ea8 100644 --- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md +++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md @@ -204,9 +204,9 @@ Suppose Alice messages Bob at 5 _p.m._, "Can you come to the party?", and also, I think commonsense privacy-norm-adherence intuitions actually say _No_ here: the text of Alice's messages makes it too easy to guess that sometime between 5 and 6, Bob probably said that he couldn't come to the party because he has gout. It would seem that Alice's right to talk about her own actions in her own life _does_ need to take into account some commonsense judgement of whether that leaks "sensitive" information about Bob. -In part of the Dumb Story that follows, I'm going to describe several times when I and others emailed Yudkowsky to try to argue with what he said in public, without saying anything about whether Yudkowsky replied, or what he might have said if he did reply. I maintain that I'm within my rights here, because I think commonsense judgement will agree that me talking about the arguments _I_ made, does not in this case leak any sensitive information about the other side of a conversation that may or may not have happened: I think the story comes off relevantly the same whether Yudkowsky didn't reply at all (_e.g._, because he was busy with more existentially important things and didn't check his email), or whether he replied in a way that I found sufficiently unsatisfying as to occasion the further emails with followup arguments that I describe. (Talking about later emails _does_ rule out the possible world where Yudkowsky had said, "Please stop emailing me," because I would have respected that, but the fact that he didn't say that isn't "sensitive".) +In part of the Dumb Story that follows, I'm going to describe several times when I and others emailed Yudkowsky to try to argue with what he said in public, without saying anything about whether Yudkowsky replied, or what he might have said if he did reply. I maintain that I'm within my rights here, because I think commonsense judgement will agree that me talking about the arguments _I_ made, does not in this case leak any sensitive information about the other side of a conversation that may or may not have happened: I think the story comes off relevantly the same whether Yudkowsky didn't reply at all (_e.g._, because he was too busy with more existentially important things to check his email), or whether he replied in a way that I found sufficiently unsatisfying as to occasion the further emails with followup arguments that I describe. (Talking about later emails _does_ rule out the possible world where Yudkowsky had said, "Please stop emailing me," because I would have respected that, but the fact that he didn't say that isn't "sensitive".)
-It seems particularly important to lay out these principles of adherence to privacy norms in connection to my attempts to contact Yudkowsky, because part of what I'm trying to accomplish in telling this Whole Dumb Story is to deal reputational damage to Yudkowsky, which I claim is deserved. (We want reputations to track reality. If you see Carol exhibiting a pattern of intellectual dishonesty, and she keeps doing it even after you try talking to her about it privately, you might want to write a blog post describing the pattern in detail—not to _hurt_ Carol, particularly, but so that everyone else can make higher-quality decisions about whether they should believe the things that Carol says.) Given that motivation of mine, it seems important that I only try to hang Yudkowsky with the rope of what he said in public, where you can click the links and read the context for yourself. In the Dumb Story that follows, I _also_ describe some of my correspondence with Scott Alexander, but that doesn't seem sensitive in the same way, because I'm not particularly trying to deal reputational damage to Alexander in the same way. (Not because Scott performed well, but because one wouldn't really have _expected_ Scott to perform well in this situation; Scott's reputation isn't so direly in need of correction.) +It seems particularly important to lay out these principles of adherence to privacy norms in connection to my attempts to contact Yudkowsky, because part of what I'm trying to accomplish in telling this Whole Dumb Story is to deal reputational damage to Yudkowsky, which I claim is deserved. (We want reputations to track reality. If you see Carol exhibiting a pattern of intellectual dishonesty, and she keeps doing it even after you try talking to her about it privately, you might want to write a blog post describing the pattern in detail—not to _hurt_ Carol, particularly, but so that everyone _else_ can make higher-quality decisions about whether they should believe the things that Carol says.) Given that motivation of mine, it seems important that I only try to hang Yudkowsky with the rope of what he said in public, where you can click the links and read the context for yourself. In the Dumb Story that follows, I _also_ describe some of my correspondence with Scott Alexander, but that doesn't seem sensitive in the same way, because I'm not particularly trying to deal reputational damage to Alexander in the same way. (Not because Scott performed well, but because one wouldn't really have _expected_ Scott to perform well in this situation; Alexander's reputation isn't so direly in need of correction.) In accordance with the privacy-norm-adherence policy just described, I don't think I should say whether Yudkowsky replied to Michael's and my emails, nor ([again](/2022/TODO/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer/#cheerful-price-privacy-constraint)) whether he accepted the cheerful price money, because any conversation that may or may not have occurred would have been private. But what I _can_ say, because it was public, is that we saw [this addition to the Twitter thread](https://twitter.com/ESYudkowsky/status/1068071036732694529): @@ -214,7 +214,7 @@ In accordance with the privacy-norm-adherence policy just described, I don't thi Look at that! The great Eliezer Yudkowsky said that my position is "not ontologically confused." That's _probably_ high praise coming from him! -You might think that that should have been the end of the story.
Yudkowsky denounced a particular philosophical confusion, I already had a related objection written up, and he acknowledged my objection as not being the confusion he was trying to police. I _should_ be satisfied, right? +You might think that that should have been the end of the story. Yudkowsky denounced a particular philosophical confusion, I already had a related objection written up, and he publicly acknowledged my objection as not being the confusion he was trying to police. I _should_ be satisfied, right? I wasn't, in fact, satisfied. This little "not ontologically confused" clarification buried deep in the replies was _much less visible_ than the bombastic, arrogant top level pronouncement insinuating that resistance to gender-identity claims _was_ confused. (1 Like on this reply, _vs._ 140 Likes/21 Retweets on start of thread.) I expected that the typical reader who had gotten the impression from the initial thread that Yudkowsky thought that gender-identity skeptics didn't have a leg to stand on, would not, actually, be disabused of this impression by the existence of this little follow-up. Was it greedy of me to want something _louder_? @@ -332,7 +332,7 @@ In the face of that juggernaut of received opinion, I was already feeling pretty But _Michael thought I was in the right_—not just intellectually on the philosophy issue, but morally in the right to be _prosecuting_ the philosophy issue with our leaders, and not accepting stonewalling as an answer. That social proof gave me a lot of bravery that I otherwise wouldn't have been able to muster up—even though it would have been better if I could have propagated the implications of the observation that my dependence on him was self-undermining, because Michael himself said that the thing that made me valuable was my ability to think independently. -The social proof was probably more effective in my own head, than it was with anyone we were arguing with. _I remembered_ Michael as a high-status community elder back in the _Overcoming Bias_ era, but that had been a long time ago. (Luke Muehlhauser had taken over leadership of the Singularity Institute in 2011; some sort of rift between Michael and Eliezer had widened in recent years, the details of which had never been explained to me.) Michael's status in "the community" of 2019 was much more mixed. He was intensely critical of the rise of Effective Altruism, which he saw as preying on the energies of the smartest and most scrupulous people around with bogus claims about how to do good in the world. (I remember being at a party in 2015 and asking Michael what else I should spend my San Francisco software engineer money on, if not the EA charities I was considering. I was surprised when his answer was, "You.") +The social proof was probably more effective in my own head, than it was with anyone we were arguing with. _I remembered_ Michael as a high-status community elder back in the _Overcoming Bias_ era, but that had been a long time ago. (Luke Muehlhauser had taken over leadership of the Singularity Institute in 2011; and apparently, some sort of rift between Michael and Eliezer had widened in recent years, the details of which had never been explained to me.) Michael's status in "the community" of 2019 was much more mixed. He was intensely critical of the rise of the Effective Altruism movement, which he saw as using bogus claims about how to do the most good to prey on the energies of the smartest and most scrupulous people around.
(I remember being at a party in 2015 and asking Michael what else I should spend my San Francisco software engineer money on, if not the EA charities I was considering. I was surprised when his answer was, "You.") Another blow to Michael's "community" reputation was dealt on 27 February, when Anna [published a comment badmouthing Michael and suggesting that talking to him was harmful](https://www.lesswrong.com/posts/u8GMcpEN9Z6aQiCvp/rule-thinkers-in-not-out?commentId=JLpyLwR2afav2xsyD), which I found pretty disappointing—more so as I began to realize the implications. @@ -372,7 +372,7 @@ Without disclosing any specific content from private conversations that may or m Michael said that it seemed important that, if we thought Yudkowsky wasn't interested, we should have common knowledge among ourselves that we consider him to be choosing to be a cult leader. -[I](https://www.youtube.com/watch?v=TqamOOSdeHs) [settled](https://www.youtube.com/watch?v=TF18bz2j5PM) [on](https://www.youtube.com/watch?v=Hny1prRDE3I) [Sara](https://www.youtube.com/watch?v=emdVSVoCLmg) [Bareilles's](https://www.youtube.com/watch?v=jZMQ0OKVO80&t=112s) ["Gonna Get Over You"](https://www.youtube.com/watch?v=OUe3oVlxLSA) as my breakup song with Yudkowsky and the rationalists. ("And I tell myself to let the story end / And my heart will rest in someone else's hand"—Michael Vassar's—"My 'why not me?' philosophy began" ...) +[I](https://www.youtube.com/watch?v=TqamOOSdeHs) [settled](https://www.youtube.com/watch?v=TF18bz2j5PM) [on](https://www.youtube.com/watch?v=Hny1prRDE3I) [Sara](https://www.youtube.com/watch?v=emdVSVoCLmg) [Bareilles's](https://www.youtube.com/watch?v=jZMQ0OKVO80&t=112s) ["Gonna Get Over You"](https://www.youtube.com/watch?v=OUe3oVlxLSA) as my breakup song with Yudkowsky and the rationalists, often listening to a cover of it on loop to numb the pain. ("And I tell myself to let the story end / And my heart will rest in someone else's hand"—Michael Vassar's.) Meanwhile, my email thread with Scott got started back up again, although I wasn't expecting anything public to come out of it. I expressed some regret that all the times I had emailed him over the past couple years had been when I was upset about something (like psych hospitals, or—something else) and wanted something from him, which was bad, because it was treating him as a means rather than an end—and then, despite that regret, continued prosecuting the argument. @@ -422,7 +422,7 @@ Anyway, I did successfully get to my apartment and get a few hours of sleep. One (Incidentally, the code that I wrote intermittently between 11 _p.m._ and 4 _a.m._ was a horrible bug-prone mess, and the company has been paying for it ever since, every time someone needs to modify that function and finds it harder to make sense of than it would be if I had been less emotionally overwhelmed in March 2019 and written something sane instead.) -I think at some level, I wanted Scott to know how frustrated I was about his use of "mental health for trans people" as an Absolute Denial Macro. But then when Michael started advocating on my behalf, I started to minimize my claims because I had a generalized attitude of not wanting to sell myself as a victim. Ben pointed out that [making oneself mentally ill in order to extract political concessions](/2018/Jan/dont-negotiate-with-terrorist-memeplexes/) only works if you have a lot of people doing it in a visibly coordinated way.
And even if it did work, getting into a dysphoria contest with trans people didn't seem like it led anywhere good. +I think at some level, I wanted Scott to know how frustrated I was about his use of "mental health for trans people" as an Absolute Denial Macro. But then when Michael started advocating on my behalf, I started to minimize my claims because I had a generalized attitude of not wanting to sell myself as a victim. Ben pointed out that [making oneself mentally ill in order to extract political concessions](/2018/Jan/dont-negotiate-with-terrorist-memeplexes/) only works if you have a lot of people doing it in a visibly coordinated way—and even if it did work, getting into a dysphoria contest with trans people didn't seem like it led anywhere good. I supposed that, in Michael's worldview, aggression is more honest than passive-aggression. That seemed obviously true, but I was psychologically limited in how much overt aggression I was willing to deploy against my friends. (And particularly Yudkowsky, who I still hero-worshipped.) But clearly, the tension between "I don't want to do too much social aggression" and "losing the Category War within the rationalist community is _absolutely unacceptable_" was causing me to make wildly inconsistent decisions. (Emailing Scott at 4 _a.m._, and then calling Michael "aggressive" when he came to defend me was just crazy: either one of those things could make sense, but not _both_.) @@ -531,27 +531,31 @@ I could see a case that it was unfair of me to include subtext and then expect p (I did regret having accidentally "poisoned the well" the previous month by impulsively sharing the previous year's ["Blegg Mode"](/2018/Feb/blegg-mode/) [as a _Less Wrong_ linkpost](https://www.lesswrong.com/posts/GEJzPwY8JedcNX2qz/blegg-mode). "Blegg Mode" had originally been drafted as part of "... To Make Predictions" before getting spun off as a separate post. Frustrated in March at our failing email campaign, I thought it was politically "clean" enough to belatedly share, but it proved to be insufficiently [deniably allegorical](/tag/deniably-allegorical/). It's plausible that some portion of the _Less Wrong_ audience would have been more receptive to "... Boundaries?" as not-politically-threatening philosophy, if they hadn't been alerted to the political context by the 60+-comment trainwreck on the "Blegg Mode" linkpost.) -On 13 April, I pulled the trigger on publishing "... Boundaries?", and wrote to Yudkowsky again, a fourth time (!), asking if he could _either_ publicly endorse the post, _or_ publicly comment on what he thought the post got right and what he thought it got wrong; and, that if engaging on this level was too expensive for him in terms of spoons, if there was any action I could take to somehow make it less expensive? (Subject: "movement to clarity; or, rationality court filing") +On 13 April, I pulled the trigger on publishing "... Boundaries?", and wrote to Yudkowsky again, a fourth time (!), asking if he could _either_ publicly endorse the post, _or_ publicly comment on what he thought the post got right and what he thought it got wrong; and, that if engaging on this level was too expensive for him in terms of spoons, if there was any action I could take to somehow make it less expensive? [TODO: my stated justification] (Subject: "movement to clarity; or, rationality court filing") +Again, without revealing any content from private conversations that may or may not have occurred, we did not get any public engagement from Yudkowsky. 
+It seemed that the Category War was over, and we lost. +We _lost?!_ How could we _lose?!_ The philosophy here was _very clear-cut_. This shouldn't be hard or expensive or difficult to clear up. I could believe that Alexander was "honestly" confused, but Yudkowsky ...!? +I could see how, under ordinary circumstances, asking Yudkowsky to weigh in on my post would seem inappropriately demanding of a Very Important Person's time, given that a simple person such as me was surely as a mere _worm_ in the presence of the great Eliezer Yudkowsky. +But the only reason for my post to exist was _because_ [...] +[TODO: Ben on Eliza analogy] - -Jessica mentioned talking with someone about me writing to Yudkowsky and Alexander requesting that they clarify the category boundary thing. +Jessica mentioned talking with someone about me writing to Yudkowsky and Alexander requesting that they clarify the category boundary thing. This person described having a sense that I should have known that wouldn't work—because of the politics involved, not because I wasn't right. +"Those who are savvy in high-corruption equilibria maintain the delusion that high corruption is common knowledge, to justify expropriating those who naively don't play along, by narratizing them as already knowing and therefore intentionally attacking people, rather than being lied to and confused." -that I should have known that wouldn't work—because of the politics involved, not because I wasn't right. -"Those who are savvy in high-corruption equilibria maintain the delusion that high corruption is common knowledge, to justify expropriating those who naively don't play along, by narratizing them as already knowing and therefore intentionally attacking people, rather than being lied to and confused." +[TODO: asking Anna to weigh in] (I figured that spamming people with hysterical and somewhat demanding physical postcards was more polite (and funnier) than my recent habit of spamming people with hysterical and somewhat demanding emails.) -[TODO: We lost?! How could we lose??!!?!? And, post-war concessions ... curation hopes ... 22 Jun: I'm expressing a little bit of bitterness that a mole rats post got curated https://www.lesswrong.com/posts/fDKZZtTMTcGqvHnXd/naked-mole-rats-a-case-study-in-biological-weirdness @@ -562,35 +566,31 @@ scuffle on LessWrong FAQ 31 May https://www.lesswrong.com/posts/MqrzczdGhQCRePgq ] +[TODO section on factional conflict: +Michael on Anna as cult leader +Jessica told me about her time at MIRI (link to Zoe-piggyback and Occupational Infohazards) +24 Aug: I had told Anna about Michael's "enemy combatants" metaphor, and how I originally misunderstood +me being regarded as Michael's pawn +assortment of agendas +mutualist pattern where Michael by himself isn't very useful for scholarship (he just says a lot of crazy-sounding things and refuses to explain them), but people like Sarah and me can write intelligible things that secretly benefited from much less legible conversations with Michael. +] + Since arguing at the object level had failed (["... To Make Predictions"](/2018/Feb/the-categories-were-made-for-man-to-make-predictions/), ["Reply on Adult Human Females"](/2018/Apr/reply-to-the-unit-of-caring-on-adult-human-females/)), and arguing at the strictly meta level had failed (["... 
Boundaries?"](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries)), the obvious thing to do next was to jump up to the meta-meta level and tell the story about why the "rationalists" were Dead To Me now, that [my price for joining](https://www.lesswrong.com/posts/Q8evewZW5SeidLdbA/your-price-for-joining) was not being met. (Just like Ben had suggested in December and in April.) I found it difficult to make progress on. I felt—constrained. I didn't know how to tell the story without (as I perceived it) escalating personal conflicts or leaking info from private conversations. So instead, I mostly turned to a combination of writing bitter and insulting comments whenever I saw someone praise "the rationalists" collectively, and—more philosophy-of-language blogging! - [TODO 2019 activities— "epistemic defense" meeting "Schelling Categories" Aug 2019 - -"Maybe Lying Doesn't Exist" Oct 2019 - -"Algorithms of Deception!" Oct 2019 - "Heads I Win" Sep 2019 - +"Algorithms of Deception!" Oct 2019 "Firming Up ..." Dec 2019 +"Against Lie Inflation"/"Maybe Lying Doesn't Exist" Oct 2019 -[TODO section on factional conflict: -Michael on Anna as cult leader -Jessica told me about her time at MIRI (link to Zoe-piggyback and Occupational Infohazards) -24 Aug: I had told Anna about Michael's "enemy combatants" metaphor, and how I originally misunderstood -me being regarded as Michael's pawn -assortment of agendas -mutualist pattern where Michael by himself isn't very useful for scholarship (he just says a lot of crazy-sounding things and refuses to explain them), but people like Sarah and me can write intelligible things that secretly benefited from much less legible conversations with Michael. -] [TODO: Yudkowsky throwing NRx under the bus; tragedy of recursive silencing diff --git a/notes/a-hill-of-validity-sections.md b/notes/a-hill-of-validity-sections.md index 9aefff1..f979353 100644 --- a/notes/a-hill-of-validity-sections.md +++ b/notes/a-hill-of-validity-sections.md @@ -11,6 +11,7 @@ _ excerpt 2nd "out of patience" email with internet available— +_ Sara Bareilles cover links _ "watchful waiting" _ Atlantic article on "My Son Wears Dresses" https://archive.is/FJNII _ in "especially galling" §: from "Changing Emotions"—"somehow it's always about sex when men are involved"—he even correctly pinpointing AGP in ordinary men (as was obvious back then), just without the part that AGP _is_ "trans" @@ -30,6 +31,7 @@ _ refusing to give a probability (When Not to Use Probabilities? Shut Up and Do far editing tier— +_ quote one more "Hill of Meaning" Tweet emphasizing fact/policy distinction _ conversation with Ben about physical injuries (this is important because it explains where the "cut my dick off rhetoric" came from) _ context of his claim to not be taking a stand _ rephrase "gamete size" discussion to make it clearer that Yudkowsky's proposal also implicitly requires people to agree about the clustering thing