From: M. Taylor Saotome-Westlake
Date: Sat, 27 Aug 2022 01:21:23 +0000 (-0700)
Subject: memoir: the story about Melkor's orcs
X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=0d7e6c46bb0268b6aaad8221a151f626c90edb50;p=Ultimately_Untrue_Thought.git

memoir: the story about Melkor's orcs
---

diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
index 2df3a47..c9be176 100644
--- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
+++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
@@ -316,9 +316,13 @@ _Even if_ you're opposed to abortion, or have negative views about the historica
 
 Thus, we see that Alexander's own "The Worst Argument in the World" is really complaining about the _same_ category-gerrymandering move that his "... Not Man for the Categories" comes out in favor of. We would not let someone get away with declaring, "I ought to accept an unexpected abortion or two deep inside the conceptual boundaries of what would normally not be considered murder if it'll save someone's life." Maybe abortion _is_ wrong and relevantly similar to the central sense of "murder", but you need to make that case _on the merits_, not by linguistic fiat.
 
-... Scott still didn't get it. He said that he didn't see why he shouldn't accept one unit of categorizational awkwardness in exchange for sufficiently large utilitarian benefits. I started drafting a long reply—but then I remembered that in recent discussion with my posse about what we might have done wrong in our attempted outreach to Yudkowsky, the idea had come up that in-person meetings are better for updateful disagreement-resolution. Would Scott be up for meeting in person some weekend? Non-urgent. Ben would be willing to moderate, unless Scott wanted to suggest someone else, or no moderator.
+... Scott still didn't get it. He said that he didn't see why he shouldn't accept one unit of categorizational awkwardness in exchange for sufficiently large utilitarian benefits. He made an analogy to some [Glowfic](https://www.glowfic.com/) lore, a story about orcs who had unwisely sworn an oath to serve the evil god Melkor. Though the orcs intend no harm of their own will, they're magically bound to obey Melkor's commands and serve as his terrible army or else suffer unbearable pain. Our heroine comes up with a solution: she founds a new religion featuring a deist noninterventionist God, who also happens to be named Melkor. She convinces the orcs that since the oath didn't specify _which_ Melkor, they're free to follow her new God instead of evil-Melkor, and the magic making the oath binding apparently accepts this casuistry if the orc themself does.
 
-... Scott didn't want to meet. At this point, I considered resorting to the tool of cheerful prices again, which I hadn't yet used against Scott—to say, "That's totally understandable! Would a financial incentive change your decision? For a two-hour meeting, I'd be happy to pay up to $4000 to you or your preferred charity. If you don't want the money, then sure, yes, let's table this. I hope you're having a good day." But that seemed sufficiently psychologically coercive and socially weird that I wasn't sure I wanted to go there. I emailed my posse asking what they thought—and then added that maybe they shouldn't reply until Friday, because it was Monday, and I really needed to focus on my dayjob that week.
+Scott's attitude towards the new interpretation of the oath in the story was analogous to his thinking about transgenderedness: sure, the new definition may be somewhat awkward and unnatural in some sense, but it's not literally objectively false, and it makes life better for so many orcs. If [rationalists should win](https://www.lesswrong.com/posts/6ddcsdA2c2XpNpE5x/newcomb-s-problem-and-regret-of-rationality), then the true rationalist in this situation is the one who thought up this clever hack to save an entire species.
+
+I started drafting a long reply—but then I remembered that in recent discussion with my posse about what we might have done wrong in our attempted outreach to Yudkowsky, the idea had come up that in-person meetings are better for updateful disagreement-resolution. Would Scott be up for meeting in person some weekend? Non-urgent. Ben would be willing to moderate, unless Scott wanted to suggest someone else, or no moderator.
+
+... Scott didn't want to meet. At this point, I considered resorting to the tool of cheerful prices again, which I hadn't yet used against Scott—to say, "That's totally understandable! Would a financial incentive change your decision? For a two-hour meeting, I'd be happy to pay up to $4000 to you or your preferred charity. If you don't want the money, then sure, yes, let's table this. I hope you're having a good day." But that seemed sufficiently psychologically coercive and socially weird that I wasn't sure I wanted to go there. On 18 March, I emailed my posse asking what they thought—and then added that maybe they shouldn't reply until Friday, because it was Monday, and I really needed to focus on my dayjob that week.
 
 This is the part where I began to ... overheat. I tried ("tried") to focus on my dayjob, but I was just _so angry_. Did Scott _really_ not understand the rationality-relevant distinction between "value-dependent categories as a result of caring about predicting different variables" (as explained by the _dagim_/water-dwellers _vs._ fish example) and "value-dependent categories _in order to not make my friends sad_"? I thought I was pretty explicit about this? Was Scott _really_ that dumb?? Or was it that he was only verbal-smart and this was the sort of thing that only makes sense if you've ever been good at linear algebra?? (Such that the language of "only running your clustering algorithm on the subspace of the configuration space spanned by the variables that are relevant to your decisions" would come naturally.) Did I need to write a post explaining just that one point in mathematical detail? (With executable code and a worked example with entropy calculations; a toy sketch of that point is appended at the end of this patch.)
 
@@ -388,13 +392,19 @@ But ... if there's some _other_ reason you suspect there might be multiple speci
 
 I asked the posse if this analysis was worth sending to Yudkowsky. Michael said it wasn't worth the digression. He asked if I was comfortable generalizing from Scott's behavior, and what others had said about fear of speaking openly, to assuming that something similar was going on with Eliezer. If so, then now that we had common knowledge, we needed to confront the actual crisis, which was that dread was tearing apart old friendships and causing fanatics to betray everything that they ever stood for while its existence was still being denied.
 
+As it happened, I ran into Scott on the train that Friday, the twenty-second.
+
+He said he doesn't
+[TODO section: wrapping up with Scott; Kelsey; high and low Church https://slatestarcodex.com/2019/07/04/some-clarifications-on-rationalist-blogging/]
+
 [TODO: Jessica joins the coalition; she tells me about her time at MIRI (link to Zoe-piggyback and Occupational Infohazards); Michael said that Jess and I together have more moral authority]
+[TODO: Michael on Anna as cult leader]
+[30 Mar: Michael—we need to figure out how to win against bad faith arguments]
 
-[TODO section: wrapping up with Scott; Kelsey; high and low Church https://slatestarcodex.com/2019/07/04/some-clarifications-on-rationalist-blogging/]
 
 [SECTION: treachery, faith, and the great river

diff --git a/notes/a-hill-of-validity-sections.md b/notes/a-hill-of-validity-sections.md
index 4b88eaf..0fd8c19 100644
--- a/notes/a-hill-of-validity-sections.md
+++ b/notes/a-hill-of-validity-sections.md
@@ -36,6 +36,7 @@ _ Ben/Jessica
 _ Scott
 _ Anna
 _ secret posse member
+_ Alicorn: about privacy, and for Melkor Glowfic reference link
 _ someone from Alicorner #drama as a hostile prereader (Swimmer?)
 _ maybe Kelsey (very briefly, just about her name)?
 _ maybe SK (briefly about his name)?
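
A toy sketch of the entropy point referenced in the draft above (illustrative only: it assumes NumPy, and the clusters, thresholds, and numbers are invented for the example, not taken from any actual exchange with Scott). The claim being illustrated: a category boundary drawn along decision-relevant variables reduces your uncertainty about the variable you care about, while a boundary gerrymandered along an irrelevant dimension doesn't.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two clusters in a three-dimensional configuration space. Dimensions 0 and 1
# covary with cluster membership; dimension 2 is pure noise.
cluster_a = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(500, 3))
cluster_b = rng.normal(loc=[4.0, 4.0, 0.0], scale=1.0, size=(500, 3))
x = np.vstack([cluster_a, cluster_b])

def entropy(labels):
    """Shannon entropy (in bits) of a discrete labeling."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def conditional_entropy(labels, given):
    """H(labels | given): expected entropy of `labels` within each category."""
    return sum(
        (given == g).mean() * entropy(labels[given == g])
        for g in np.unique(given)
    )

# The decision-relevant variable we want our words to help us predict.
target = (x[:, 0] > 2.0).astype(int)

# A category drawn on the decision-relevant subspace: dimension 1 covaries
# with dimension 0, so knowing the category tells you about the target.
subspace_category = (x[:, 1] > 2.0).astype(int)

# A "gerrymandered" category drawn along the noise dimension: it partitions
# the space, but not along any variable our decisions depend on.
gerrymandered_category = (x[:, 2] > 0.0).astype(int)

for name, category in [("subspace-respecting", subspace_category),
                       ("gerrymandered", gerrymandered_category)]:
    bits = entropy(target) - conditional_entropy(target, category)
    print(f"{name} category: {bits:.3f} bits of information about the target")
```

Run as written, the subspace-respecting category recovers most of a bit of information about the target, while the gerrymandered category recovers approximately none, even though both partitions carve up the same space; that asymmetry is the sense in which choosing category boundaries isn't value-neutral bookkeeping.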