From: M. Taylor Saotome-Westlake
Date: Sat, 11 Mar 2023 05:28:42 +0000 (-0800)
Subject: memoir: Eliezerfic fight (authorial fiat), poke at conclusion
X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=46a1ddf698f0b9f7e7c7fb5c37fedccccdf9000b;p=Ultimately_Untrue_Thought.git

memoir: Eliezerfic fight (authorial fiat), poke at conclusion
---

diff --git a/content/drafts/standing-under-the-same-sky.md b/content/drafts/standing-under-the-same-sky.md
index 5473a8c..a44acad 100644
--- a/content/drafts/standing-under-the-same-sky.md
+++ b/content/drafts/standing-under-the-same-sky.md
@@ -828,8 +828,46 @@ Yudkowsky replied:
 I wish I had thought to ask if he'd have felt the same way in 2008.
 
+Ajvermillion was still baffled at my skepticism: if the author specifies that the world of the story is simple in this-and-such direction, on what grounds could I _disagree_?
-[TODO: regrets and wasted time
+I admitted, again, that there was a sense in which I couldn't argue with authorial fiat. But I thought that an author's choice of assumptions reveals something about what they think is true in our world, and commenting on that should be fair game for literary critics. Suppose someone wrote a story and said, "in the world portrayed in this story, everyone is super-great at _kung fu_, and they could beat up everyone from our Earth, but they never have to practice at all."
+
+(Yudkowsky retorted, "...you realize you're describing like half the alien planets in comic books? when did Superman ever get depicted as studying kung fu?" I wish I had thought to admit that, yes, I _did_ hold Eliezer Yudkowsky to a higher standard of consilient worldbuilding than DC Comics. Would he rather I _didn't_?)
+
+Something about the innate _kung fu_ world seems fake in a way that amounts to a literary flaw. It's not just about plausibility. Innate _kung fu_ skills are scientifically plausible[^instinct] in a way that faster-than-light travel is not. Fiction incorporates unrealistic elements in order to tell a story that has relevance to real human lives. Throwing faster-than-light travel into the universe so that you can do a space opera doesn't make the _people_ fake in the way that Superman's fighting skills are fake.
+
+[^instinct]: All sorts of other instinctual behaviors exist in animals; I don't see why skills that humans have to study for years as a "martial art" couldn't be coded into the genome.
+
+Similarly, a world that's claimed by authorial fiat to be super-great at epistemic rationality, but where the people don't have a will-to-truth stronger than their will-to-happiness, felt fake to me. I couldn't _prove_ that it was fake. I agreed with Harmless's case that, _technically_, as far as the Law went, you could build a Civilization or a Friendly AI to see all the ugly things that you preferred not to see.
+
+But if you could—would you? And more importantly, if you would—could you?
+
+It was possible that the attitude I was evincing here was just a difference between the eliezera out of dath ilan and the Zackistani from my medianworld, and that there was nothing more to be said about it. But I didn't think the thing was a _genetic_ trait of the Zackistani! _I_ got it from spending my early twenties obsessively re-reading blog posts that said things like, ["I believe that it is right and proper for me, as a human being, to have an interest in the future [...] One of those interests is the human pursuit of truth [...] I wish to strengthen that pursuit further, in this generation."](https://www.lesswrong.com/posts/anCubLdggTWjnEvBS/your-rationality-is-my-business)
+
+There were definitely communities on Earth that I wasn't allowed into because of my tendency to shout things from street corners, and I respected those people's right to have a safe space for themselves.
+
+But those communities ... didn't call themselves _rationalists_, weren't _pretending_ to be inheritors of the great tradition of E. T. Jaynes and Robyn Dawes and Richard Feynman. And if they _did_, I think I would have a false advertising complaint against them.
+
+"The eleventh virtue is scholarship. Study many sciences and absorb their power as your own ... unless a prediction market says that would make you less happy," just didn't have the same ring to it. Neither did "The first virtue is curiosity. A burning itch to know is higher than a solemn vow to pursue truth. But higher than both of those is trusting your Society's institutions to tell you which kinds of knowledge will make you happy"—even if you stipulated by authorial fiat that your Society's institutions are super-competent, such that they're probably right about the happiness thing.
+
+[TODO: Atlas Shrugged quote and children's morals]
+
+[TODO: Yudkowsky tests me]
+
+[TODO: derail with Lintamande]
+
+[TODO: knives, and showing myself out]
+
+------
+
+Anyway, that—briefly (I mean it)—is the Whole Dumb Story about how I wasted the last seven years of my life. It's probably not that interesting? Life goes on—for now. My dayjob contract expired at the end of 2022. In 2023, I've been finishing up this memoir and posting some other ideas to _Less Wrong_. (I got into another slapfight about me being un-collaborative, which is not interesting enough to summarize.)
+
+After this, the AI situation is looking worrying enough that I'm thinking I should try to do some more direct xrisk-reduction work, although I haven't definitively selected any particular job or project. (It probably won't matter, but it will be dignified.) Now that the shape of the threat is on the horizon, I think I'm less afraid of being directly involved. Something about having large language models to study in the 'twenties is—grounding, compared to the superstitious fears of the paperclip boogeyman of my nightmares in the 'teens.
+
+Like all intellectuals, as a teenager I imagined that I would write a book. It was always going to be about gender, but I was imagining a novel, which never got beyond vague imaginings. That was before the Sequences. I'm 35 years old now. I think my intellectual life has succeeded in ways I didn't know how to imagine, before. I think my past self would be proud of this blog—140,000 words of blog posts stapled together is _morally_ a book—once he got over the shock of heresy.
+
+[TODO conclusion, cont'd—
+ * Do I have regrets about this Whole Dumb Story? A lot, surely—it's been a lot of wasted time. But it's also hard to say what I should have done differently; I could have listened to Ben more and lost faith in Yudkowsky earlier, but he had earned a lot of benefit of the doubt?
+ * even young smart AGPs who can appreciate my work have still gotten pinkpilled
+ * less drama (in my youth, I would have been proud that at least this vice was a feminine trait; now, I prefer to be good even if that means being a good man)
+]