From: Zack M. Davis Date: Wed, 27 Sep 2023 19:08:09 +0000 (-0700) Subject: dath ilan ancillary: functionalist preferences for deception X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=405544ed3be3910e3f656c74a76a335948b00bd1;p=Ultimately_Untrue_Thought.git dath ilan ancillary: functionalist preferences for deception --- diff --git a/content/drafts/on-the-public-anti-epistemology-of-dath-ilan.md b/content/drafts/on-the-public-anti-epistemology-of-dath-ilan.md index a7f6506..c786a38 100644 --- a/content/drafts/on-the-public-anti-epistemology-of-dath-ilan.md +++ b/content/drafts/on-the-public-anti-epistemology-of-dath-ilan.md @@ -9,19 +9,19 @@ Status: draft > > —Thomas Jefferson, earthling -Eliezer Yudkowsky's fiction about the world of dath ilan (capitalization _sic_) aims to portray a smarter, saner alternate version of Earth. Dath ilan had originally been introduced in a [2014 April Fool's Day post](https://yudkowsky.tumblr.com/post/81447230971/my-april-fools-day-confession), in which Yudkowsky "confessed" that the explanation for his seemingly implausible genius is that he's "actually" an ordinary person from dath ilan, where the ideas he presented to this world as his own were common knowledge. (This likely inspired the trope of a [_medianworld_](https://www.glowfic.com/replies/1619639#reply-1619639), a setting where the average person is like the author along important dimensions.)[^medianworlds] +Eliezer Yudkowsky's fiction about the world of dath ilan (capitalization _sic_) aims to portray a smarter, saner, better-coordinated alternate version of Earth. Dath ilan had originally been introduced in a [2014 April Fool's Day post](https://yudkowsky.tumblr.com/post/81447230971/my-april-fools-day-confession), in which Yudkowsky "confessed" that the explanation for his seemingly implausible genius is that he's "actually" an ordinary person from dath ilan, where the ideas he presented to this world as his own were common knowledge. 
(This likely inspired the trope of a [_medianworld_](https://www.glowfic.com/replies/1619639#reply-1619639), a setting where the average person is like the author along important dimensions.)[^medianworlds] [^medianworlds]: You might think that the thought experiment of imagining what someone's medianworld is like would only be interesting for people who are "weird" in our own world, on the grounds that our world is already a medianworld for people who are normal in it. But [in high-dimensional spaces, _most_ of the probability-mass is concentrated in a "shell" some distance around the mode](/2021/May/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems/#typical-point), because even though the per-unit-hypervolume probability _density_ is greatest at the mode, there's vastly more hypervolume away from it. The upshot is that typical people are atypical along _some_ dimensions, so normies can play the medianworld game, too. (Or they could, if the normies of our world were into worldbuilding.) -Dath ilan's purported cognitive superiority to the real world is a recurring theme in discussions of dath ilan. In the fictional canon, it's a focus of the story ["But Hurting People is Wrong"](https://www.glowfic.com/posts/4508), but even when not discussing fiction, Yudkowsky often makes sneering comments about "Earth" or "Earth people", apparently meant to disparage all actually existing humans for not living up to his fiction. +Dath ilan's cognitive superiority to the real world is a recurring theme in modern Yudkowsky's work. In the fictional canon, it's a focus of the story ["But Hurting People is Wrong"](https://www.glowfic.com/posts/4508), but even when not discussing fiction, Yudkowsky often makes sneering comments about "Earth" or "Earth people", apparently meant to disparage all actually existing humans for not living up to his fiction. 
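The "shell" phenomenon in the footnote can be checked numerically. Here's a minimal illustrative sketch (not part of the original draft; the dimension and sample counts are arbitrary choices for demonstration): sampling "people" from a standard Gaussian over many independent traits, every sample's distance from the mode clusters tightly around √d, and every sample is extreme (more than two standard deviations out) in at least one trait.

```python
# Illustrative sketch: in high dimensions, standard-Gaussian samples
# concentrate in a thin shell of radius ~sqrt(d) around the mode at the
# origin, and essentially every sample is "atypical" in some coordinate.
import math
import random

random.seed(0)
d = 2500  # number of independent "trait" dimensions (arbitrary)
n = 200   # number of sampled "people" (arbitrary)

norms = []       # distance of each sample from the mode
max_coords = []  # each sample's most extreme single trait
for _ in range(n):
    x = [random.gauss(0.0, 1.0) for _ in range(d)]
    norms.append(math.sqrt(sum(v * v for v in x)))
    max_coords.append(max(abs(v) for v in x))

print(f"expected shell radius ~ sqrt(d) = {math.sqrt(d):.1f}")
print(f"observed distances from mode: min={min(norms):.1f}, max={max(norms):.1f}")
# No sample lands anywhere near the mode itself, even though density
# is highest there; and every sample is >2 sigma in some dimension:
print(f"smallest 'most extreme trait' across samples: {min(max_coords):.2f}")
```

With d = 2500, all 200 distances land within a few units of 50, while every sample has at least one coordinate beyond ±2 — which is the footnote's point that "typical" points are atypical along _some_ dimensions.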
-One is led to believe that people who were deeply inspired by Yudkowsky's [Sequences](https://www.readthesequences.com/) (a series of influential posts about rationality published on the _Overcoming Bias_ blog largely between 2007 and 2009) should regard dath ilan as a rationalist utopia. (After all, on the terms of the 2014 April Fools' Day joke, that's where the knowledge came from.) +One is led to believe that people who were deeply inspired by Yudkowsky's [Sequences](https://www.readthesequences.com/) (a series of influential posts about rationality published on the _Overcoming Bias_ blog largely between 2007 and 2009) should regard dath ilan as a rationalist utopia. After all, on the terms of the 2014 April Fools' Day joke, that's where the knowledge came from. -And yet for such a supposed rationalist utopia, it's remarkable the extent to which dath ilan's Society is portrayed as being organized around conspiracies to lie or otherwise cover up the truth—not just when forced to by dire matters of planetary security (as when keeping nuclear or AGI secrets), but seemingly for any somewhat plausible excuse whatsoever, including protecting the feelings of people who would be happier if kept ignorant. Evidently, there are _many_ truths existing which dath ilan fears and would wish unknown to the whole world. +And yet, for such a supposed rationalist utopia, it's remarkable the extent to which dath ilan's Society is portrayed as being organized around conspiracies to lie or otherwise cover up the truth—not just when forced to by dire matters of planetary security (as when keeping nuclear or AGI secrets), but seemingly for any somewhat plausible excuse whatsoever, including protecting the feelings of people who would be happier if kept ignorant. Evidently, there are _many_ truths existing which dath ilan fears and would wish unknown to the whole world. 
-The contrast to the [sense of life](http://aynrandlexicon.com/lexicon/sense_of_life.html) portrayed in the Sequences is striking. The Sequences emphasized that you—yes, you, the reader—had an interest in having true beliefs. On the subject of confronting unpleasant thoughts, the Sequences [gave this advice](https://www.readthesequences.com/Avoiding-Your-Beliefs-Real-Weak-Points) (bolding mine): +The contrast to the [sense of life](http://aynrandlexicon.com/lexicon/sense_of_life.html) portrayed in the Sequences is striking. The Sequences emphasized that you—yes, you, the reader—had an interest in having a maximally accurate world-model. On the subject of confronting unpleasant hypotheses, the Sequences [gave this advice](https://www.readthesequences.com/Avoiding-Your-Beliefs-Real-Weak-Points) (bolding mine): -> When you're doubting one of your most cherished beliefs, close your eyes, empty your mind, grit your teeth, and **deliberately think about whatever hurts the most**. Don't rehearse standard objections whose standard counters would make you feel better. Ask yourself what _smart_ people who disagree would say to your first reply, and your second reply. Whenever you catch yourself flinching away from an objection you fleetingly thought of, drag it out into the forefront of your mind. **Punch yourself in the solar plexus. Stick a knife in your heart, and wiggle to widen the hole. In the face of the pain, rehearse only this:** +> When you're doubting one of your most cherished beliefs, close your eyes, empty your mind, grit your teeth, and **deliberately think about whatever hurts the most**. Don't rehearse standard objections whose standard counters would make you feel better. Ask yourself what _smart_ people who disagree would say to your first reply, and your second reply. Whenever you catch yourself flinching away from an objection you fleetingly thought of, drag it out into the forefront of your mind. **Punch yourself in the solar plexus. 
Stick a knife in your heart, and wiggle to widen the hole.** In the face of the pain, rehearse only this: > > What is true is already so. > Owning up to it doesn't make it worse. @@ -31,12 +31,38 @@ The contrast to the [sense of life](http://aynrandlexicon.com/lexicon/sense_of_l Meanwhile, the dath ilan mythos portrays a world steered by a secretive order of [Keepers of Highly Unpleasant Things it is Sometimes Necessary to Know](https://www.glowfic.com/replies/1612937#reply-1612937). (Ordinary dath ilani do receive rationality training, but it's implied to be deliberately crippled, featuring ["signposts around the first steps [towards becoming a Keeper], placed to warn dath ilani off starting down that path unless they mean it."](https://www.glowfic.com/replies/1799590#reply-1799590)) The maxim that "That which can be destroyed by the truth should be" is described as being ["remembered as much for how it's false, as for how it's true, because among the things that truths can destroy is people."](https://www.glowfic.com/replies/1687922#reply-1687922) +Clearly, this is not a culture that cares about ordinary people being well-informed. Apparently, they believe that owning up to it _does_ make it worse, that the untrue _is_ there to be lived (for non-Keepers). + +As a result, the algorithm that designed dath ilan's Civilization can be seen as systematically preferring deception. When I speak of an algorithm preferring deception, [what I mean is](https://www.lesswrong.com/posts/fmA2GJwZzYtkrAKYJ/algorithms-of-deception) that given a social problem, candidate solutions that involve deceiving the populace seem to be higher in dath ilani Civilization's implicit search ordering than solutions that involve informing the populace. Solutions that work by means of telling the truth will be implemented only when solutions that work by means of deception are seen to fail. + +Crucially, these are functionalist criteria of "preference" and "deception". 
It's about how Civilization is structured in a way that systematically encourages divergences between popular belief and reality. + +I'm _not_ positing that Civilization's Keepers and Legislators and Chief Executive are laughing maniacally and consciously telling each other, "As an individual, I love it when non-Keepers have false beliefs; we need to do as much of that as possible—as a terminal value!" + +Rather, I'm positing that they don't give a shit about non-Keepers having false beliefs. (They may give a shit about [not technically lying](https://www.lesswrong.com/posts/PrXR66hQcaJXsgWsa/not-technically-lying), but that [turns out to be a weak constraint](https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly).) + +If you're in the business of coming up with clever plans to solve problems, and you don't give a shit about people having false beliefs, you mostly end up with clever plans that work by means of giving people false beliefs that trick them into doing what you want them to do (perhaps without technically lying). + +Why wouldn't you? There are more false maps than true maps; if you don't specifically give a shit about affirmatively telling the truth, you mostly end up supplying false maps. Instrumental convergence is a harsh mistress. + +### Interlude: "I Can't Argue With Authorial Fiat" + +### History Screening + +### The Merrin Show + +### Keltham and S/M + +### Kitchen Knives + +### Conclusion + + ------- [OUTLINE— - * Introduction and Thesis - * Yudkowsky's new fictional universe is dath ilan, a medianworld centered around him (the race of dath ilani humans are called the "eliezera" in Word of God canon); the "joke" is that this is where the rationality tech of the Sequences came from (per the 2014 April Fools' Day post). Yudkowsky even talks this way in other contexts, including a trope of making fun of "Earth people" and presenting an eliezera racial supremacy narrative. 
(It's still a racial supremacy narrative even if he doesn't _use the verbatim phrase_ "racial supremacy.") One is led to believe that dath ilan represents a canonical rationalist utopia, someplace that readers of the Sequences would be proud of. - * And yet, for such a supposed rationalist utopia, it's striking that dath ilan is _lying about everything_, seemingly whenever it "sounds like a good idea" to someone: not just keeping AGI secrets (the way we keep nuclear secrets on Earth), but preventing the idea from coming up as science fiction, keeping Merrin in a Truman-show-like reality where she doesn't know that she's famous, hiding info about sexuality on utilitarian grounds—even bizarre trivial stuff like knives. I discuss these examples in detail later in this essay. + * the race of dath ilani humans are called the "eliezera" in Word of God canon + presenting an eliezera racial supremacy narrative. (It's still a racial supremacy narrative even if he doesn't _use the verbatim phrase_ "racial supremacy.") * Bluntly, this is not a culture that gives a shit about people being well-informed. This is a culture that has explicitly * In more detail: the algorithm that designed dath ilani Civilization is one that systematically favors plans that involve deception over plans that involve being honest. * This is not a normative claim or a generic slur that dath ilani are "evil" or "bad"; it's a positive claim about systematic deception. If you keep seeing plans for which social-deception-value exceeds claimed-social-benefit value, you should infer that the plans are being generated by a process that "values" (is optimizing for) deception, whether or not that process is a person or a conscious mind.