From: M. Taylor Saotome-Westlake
Date: Sun, 31 Jul 2022 20:09:16 +0000 (-0700)
Subject: memoir: more email review up to March '19 overnight overheating incident
X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=450abb2393a526ac2aa3773a9fc2d93328039a67;p=Ultimately_Untrue_Thought.git

memoir: more email review up to March '19 overnight overheating incident
---

diff --git a/notes/a-hill-email-review.md b/notes/a-hill-email-review.md
index e0ed113..a2a7453 100644
--- a/notes/a-hill-email-review.md
+++ b/notes/a-hill-email-review.md
@@ -40,7 +40,23 @@ Email timeline—
 25 Feb: first draft of defending against "alt-right" categorization
 27 Feb: more support from the gang
 27-28 Feb: on private universes
-
+1 Mar: intra-group disagreement re "ambient epistemic violence"
+2 Mar: Michael—independence of perspectives vs. failure to present a united front
+2 Mar: I saw the new Rebel Wilson movie last night, went to Simon's birthday (I remember seeing Scott there), then met Anna
+2 Mar: Anna's comment about Michael
+3 Mar: "Apologies are always intrinsically worthless, but the fact that we are able to legitimately demand an apology means that we can legitimately demand that he look at something."
+3 Mar: I heard Anna say my desire for good arguments to win was "selfish"; what she actually said was "self-centered", "it seems less important than what happens on topics that we don't all know there isn't free speech about."
+3 Mar: "told them that the thing that would most help AI risk from them anyhow was for them to continue on their present career," is that if I try to only focus on "important" things, I'll probably just come up with something fake
+3 Mar: more draft, more thanks for social proof
+5 Mar: Scott gets back to me
+5 Mar: 12 short stories about language (and expressing regret that I only email Scott when I need something from him)
+6 Mar: SJWs without guns doing this than you are about mostly white men with money, gavels and guns doing it? ... SJW agenda is to play by the same rules as the legal system plays by, and they do it with less support from tradition and from ritual, so it's more obvious,
+6 Mar: much better draft to EY (defending against alt-right)
+10 Mar: send revised draft to EY
+14 Mar: meeting with Ben/Sarah/Marie/Jessica; Ben assists on "Blegg Mode"
+17 Mar: me to Scott, bringing up idea that in-person ideas were better
+17 Mar: Michael characterizing EY as choosing to be a cult leader
+18 Mar: starting to overheat (subject: "strategy huddle II")
@@ -243,3 +259,48 @@ Michael—
 > I think that if Zack is having difficulties it would be ideal if we could move this ahead without him. Needless to say, he doesn't need to feel any responsibility and has done all that could be asked of him and more to cause this project to survive unanticipated difficulties. Right now, I don't think there's much help he can offer except for periodic pushes, like the current one, for the rest of us to keep at it.
+
+> On my model, it's good to send this because there are two possibilities
+> 1) that this demonstrates the independence of our perspectives, which causes them to count as additional evidence for a proposition, and that means that this makes us more convincing
+> 2) that this shows a lack of unity, a failure to present a United Front, which is bad, because Arguments are Soldiers, and thus we are making ourselves less convincing.
+> In the latter case, Eliezer is dead to us and it would probably help if we were to bury something in a funeral.
+> In the former case, our friends are still in there, and the way to reach them isn't by balancing the social forces arrayed against them.
+
+I'm pretty sure I can directly link to Ozy and Kelsey speaking favorably of literal self-identity, not gating on hormones or social passability. And just ... I keep rereading this stuff trying to check that I'm not misreading, because I can't take it seriously. They're just fucking with me, right???? I don't harbor ill will against any individual people, but I just can't share a rationalist community with this level of mindfuckery.
+
+I mean, okay, if that's what you want to do. (If the outcome of these hopefully-not-too-annoying emails is that Eliezer Yudkowsky feels more social/political constraints on what he can say rather than less, then this horribly backfired and you should forget we said anything.)
+
+"How would you feel if someone else tried to get away with the 'hill of meaning in defense of validity' appeal-to-arbitrariness conversation-halter against you when you were trying to use language to reason about something you actually cared about (like defending Scott, who is great)? If it would be logically rude in that context, maybe it was also logically rude the first time?"
+
+Michael—
+> We have all been essentially trained to believe that we are not entitled to voice our complaints. When someone backs off, we are prone to losing confidence and to think that maybe we shouldn't be talking with them, yet functional institutions have existed in the past, and if people aren't adhering to the norms of functional institutions, we have to not allow that to cause us to stop adhering to those norms. Therefore, we should keep doubling down with reminders regarding the rules for discussion. If he decides to not play, we should establish that to ourselves with no plausible deniability and move on.
+
+
+"An alternative categorization system is not an error", like all appeals-to-arbitrariness, is necessarily what you called a symmetric weapon: the same form of argument works just as well for being mean to our friends as it does for being nice to our friends. The best the good guys can hope for is to win by coincidence. (Assuming there's any way to distinguish who the "good guys" are, if alternative systems that draw the "good guys" category boundary differently aren't in error, assuming there's any way to distinguish what "winning" is, if alternative systems that draw the "winning" category boundary differently aren't in error, &c.)
+
+If someone showed up at a Less Wrong meetup claiming that they had an invisible dragon, I would expect ordinary meetup attendees to exert social pushback against the dragon-claimant: "Dude, that doesn't make sense. Beliefs need to be falsifiable! You should read the Sequences!" I would even expect the social pushback to be strong enough that the claimant would eventually either internalize the "Beliefs need to be falsifiable" idea or get the fuck out of my rationalist community.
+
+If I show up at a Less Wrong meetup claiming that people don't have invisible genders, and I get social pushback ... then I'm not really sure what my options are other than writing off the entire community.
+
+
+
+I'm not extremely troubled by my minor disagreements with you. All the actually-smart high-status old-timers I've talked to (you, Eliezer, Anna, Michael Vassar, Sarah Constantin, Ben Hoffman, Steve Rayhawk, &c.) clearly see the hidden-Bayesian-structure-of-language-and-cognition thing I'm trying to point at.
+Even if you say it's mere awkwardness and not a serious epistemology problem, at least you acknowledge the awkwardness, which is more than I've gotten out of Kelsey.
+
+The thing I find extremely troubling is that none of the old-timers who clearly understand the thing are willing to say so loudly and in public where other people can learn from it, with the exceptions of Ben and Vassar—and Vassar's in-community status is getting debased. (Sarah drafted a post supporting me on the categories thing, but decided not to publish it.)
+
+You wrote about the "inevitable" forces pointing towards a future guided by the beauty of our weapons, where good arguments eventually win out over bad arguments. What if you were too optimistic? What if good arguments don't win—not even in the so-called "rationalist" community?
+
+I think "Categories should carve reality at the joints to help us make efficient probabilistic predictions" is a better argument than "Categories capture tradeoffs you care about; they don't have to carve reality at the joints if that makes people sad." Right now, the better argument is losing. Unless something changes, the better argument will continue to lose. In other words, we live in a world where reason doesn't work. If we live in a world where reason doesn't work, then we're dead. If we can't get consensus on easy problems (Q: Can men become women by means of saying so? A: No. Why would you think that?), then how are we supposed to solve AI alignment?
+
+
+> I don't know if you read the glowfic that Alicorn and Kelsey write. There was a scene in one of their LoTR things where some elves capture some orcs. They learn that the orcs are nice people deep down, but the evil god Melkor has forced the orcs to swear an oath to serve him and do evil all the time and kill people. And for complicated magic reasons, orcs cannot break oaths, or else they are tormented by unbearable pain. So even though the orcs are nice, they are compelled to serve Melkor, who usually wants them to kill people and stuff.
+
+> After a while, our hero comes up with a solution. She founds a religion on the spot devoted to a Deist-style omnipotent observer God, who just so happens to also be named Melkor. Deist-Melkor just wants people to live their lives and be happy and help others. She claims that the orcs did swear an oath to follow Melkor, but they didn't specify *which* Melkor, and since her Melkor is omnipotent he takes precedence over evil-Melkor. The orcs (who are all kind of dumb, and also incentivized toward highly motivated reasoning) agree this makes sense, this seems to satisfy whatever magic enforces their oath, and so they go off and live their lives and be happy and help others.
+
+> Part of this story is that occasionally some people will approach the orcs and be like "This is kind of dumb, right, on some level you know you're supposed to be following evil Melkor", and this information is infohazardous to orcs and can put them in unbearable pain (or compel them to kill people and stuff) until they're able to talk themselves out of it.
+
+> I asked Alicorn whether this was an intentional metaphor for this discussion, since I know you've been talking to her about it. She says no. But it captures a lot of how I think about this. Are the *real rationalists!!!* the ones who go to the orcs and tell them that they obviously meant to serve evil Melkor and so should be in constant pain all the time? Are those the people who are """winning"""?
+> Or would saying that just be pointlessly harming them for no reason?
+
+> I couldn't care less about transgender.
+
+> I agree that he isn't, but it seems Important to me that if we think he isn't interested we actually establish common knowledge that we consider him at this point to be choosing to try to be a cult leader.