From: M. Taylor Saotome-Westlake Date: Fri, 7 Oct 2022 17:33:24 +0000 (-0700) Subject: check in / false start? X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=ed35fefc4c5b58bb72e8c0444b4e5a341afbd744;p=Ultimately_Untrue_Thought.git check in / false start? Starting a serious work day at ten-thirty-something is fine! It's fine. I'm here. --- diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md index 9936af9..445ad3f 100644 --- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md +++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md @@ -182,9 +182,9 @@ But if Yudkowsky didn't want to get into a distracting political fight about a t But trusting Eliezer Yudkowsky—whose writings, more than any other single influence, had made me who I am—_did_ seem reasonable. If I put him on a pedestal, it was because he had earned the pedestal, for supplying me with my criteria for how to think—including, as a trivial special case, [how to think about what things to put on pedestals](https://www.lesswrong.com/posts/YC3ArwKM8xhNjYqQK/on-things-that-are-awesome). -So if the rationalists were going to get our own philosophy of language wrong over this _and Eliezer Yudkowsky was in on it_ (!!!), that was intolerable, inexplicable, incomprehensible—like there _wasn't a real world anymore_. +So if the rationalists were going to get our own philosophy of language wrong over this _and Eliezer Yudkowsky was in on it_ (!!!), that was intolerable, inexplicable, incomprehensible—like there _wasn't a real world anymore_. 
I remember going downstairs to impulsively confide in a senior engineer, an older bald guy who exuded masculinity, who you could tell by his entire manner and being was not infected by the Berkeley mind-virus, no matter how loyally he voted Democrat—not just about the immediate impetus of this Twitter thread, but this whole _thing_ of the past couple years where my entire social circle just suddenly decided that guys like me could be women by means of saying so. He was noncommittally sympathetic; he told me an anecdote about him accepting a trans person's correction of his pronoun usage, with the thought that different people have their own beliefs, and that's OK. -But if Yudkowsky was _already_ stonewalling his Twitter followers, entering the thread myself didn't seem likely to help. (Also, I hadn't intended to talk about gender on that account yet, although that seemed unimportant in light of the present cause for flipping out.) +If Yudkowsky was _already_ stonewalling his Twitter followers, entering the thread myself didn't seem likely to help. (Also, I hadn't intended to talk about gender on that account yet, although that seemed unimportant in light of the present cause for flipping out.) It seemed better to try to clear this up in private. I still had Yudkowsky's email address. I felt bad bidding for his attention over my gender thing _again_—but I had to do _something_. 
Hands trembling, I sent him an email asking him to read my ["The Categories Were Made for Man to Make Predictions"](/2018/Feb/the-categories-were-made-for-man-to-make-predictions/), suggesting that it may qualify as an answer to his question about ["a page [he] could read to find a non-confused exclamation of how there's scientific truth at stake"](https://twitter.com/ESYudkowsky/status/1067482047126495232)—and that, because I cared very much about correcting what I claimed were confusions in my rationalist subculture, that I would be happy to pay up to $1000 for his time—and that, if he liked the post, he might consider Tweeting a link—and that I was cc'ing my friends Anna Salamon and Michael Vassar as a character reference (Subject: "another offer, $1000 to read a ~6500 word blog post about (was: Re: Happy Price offer for a 2 hour conversation)"). Then I texted Anna and Michael begging them to chime in and vouch for my credibility. @@ -238,9 +238,9 @@ And the reason to write this as a desperate email plea to Scott Alexander when I Back in 2010, the rationalist community had a shared understanding that the function of language is to describe reality. Now, we didn't. If Scott didn't want to cite my creepy blog about my creepy fetish, that was _totally fine_; I liked getting credit, but the important thing is that this "No, the Emperor isn't naked—oh, well, we're not claiming that he's wearing any garments—it would be pretty weird if we were claiming _that!_—it's just that utilitarianism implies that the _social_ property of clothedness should be defined this way because to do otherwise would be really mean to people who don't have anything to wear" gaslighting maneuver needed to _die_, and he alone could kill it. -... Scott didn't get it. 
We agreed that self-identity-, natal-sex-, and passing-based gender categories each had their own pros and cons, and that it's uninteresting to focus on whether something "really" belongs to a category, rather than on communicating what you mean. Scott took this to mean that what convention to use is a pragmatic choice that we can make on utilitarian grounds, and that being nice to trans people is worth a little bit of clunkiness. +... Scott didn't get it. We agreed that self-identity-, natal-sex-, and passing-based gender categories each had their own pros and cons, and that it's uninteresting to focus on whether something "really" belongs to a category, rather than on communicating what you mean. Scott took this to mean that what convention to use is a pragmatic choice that we can make on utilitarian grounds, and that being nice to trans people was worth a little bit of clunkiness—that the mental health benefits to trans people were obviously enough to tip the first-order utilitarian calculus. -But I considered myself to be prosecuting _not_ the object-level question of which gender categories to use, but the meta-level question of what normative principles govern which categories we should use, for which, "whatever, it's a pragmatic choice, just be nice" wasn't an answer, because (I claimed) the principles exclude "just be nice" from being a relevant consideration. I didn't have a simple, [mistake-theoretic](https://slatestarcodex.com/2018/01/24/conflict-vs-mistake/) characterization of the language and social conventions that everyone should use such that anyone who defected from the compromise would be wrong. The best I could do was try to objectively predict the consequences of different possible conventions—and of _conflicts_ over possible conventions. 
+I didn't think _anything_ about "mental health benefits to trans people" was obvious, but more importantly, I considered myself to be prosecuting _not_ the object-level question of which gender categories to use, but the meta-level question of what normative principles govern which categories we should use, for which (I claimed) "whatever, it's a pragmatic choice, just be nice" wasn't an answer, because (I claimed) the principles exclude "just be nice" from being a relevant consideration. ["... Not Man for the Categories"](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/) had concluded with a section on Emperor Norton, a 19th century San Francisco resident who declared himself Emperor of the United States. Certainly, it's not difficult or costly for the citizens of San Francisco to _address_ Norton as "Your Majesty" as a courtesy or a nickname. But there's more to being the Emperor of the United States than people calling you "Your Majesty." Unless we abolish Congress and have the military enforce Norton's decrees, he's not _actually_ functioning in the role of emperor—at least not according to the currently generally-understood meaning of the word "emperor." @@ -557,6 +557,11 @@ But ... it's only "obvious" if you _take as a given_ that Yudkowsky is playing a But since I _did_ spend my entire adult life in his robot cult, the idea that Eliezer Yudkowsky was going to behave just as badly as any other public intellectual in the current year, was not really in my hypothesis space. + +"sacrificed all hope of success in favor of maintaining his own sanity by CC'ing you guys (which I think he was correct to do conditional on email happening at all)" + + + At the start, I _had_ to assume that the "hill of validity in defense of meaning" Twitter performance was an "honest mistake" in his rationality lessons, and that honest mistakes could be corrected if someone put in the effort to explain the problem. 
@@ -564,7 +569,6 @@ It took some pretty large likelihood ratios to promote the "obvious" explanation -"sacrificed all hope of success in favor of maintaining his own sanity by CC'ing you guys (which I think he was correct to do conditional on email happening at all)" But the guy doesn't _market_ himself as being like any other public intellectual in the current year. As Ben put it, Yudkowsky's "claim to legitimacy really did amount to a claim that while nearly everyone else was criminally insane (causing huge amounts of damage due to disconnect from reality, in a way that would be criminal if done knowingly), he almost uniquely was not." Call me a sucker, but ... I _actually believed_ Yudkowsky's marketing story. The Sequences _really were just that good_. That's why it took so much fuss and wasted time to generate a likelihood ratio large enough to falsify that story. diff --git a/content/drafts/review-of-agp-erotica-automation-tools.md b/content/drafts/review-of-agp-erotica-automation-tools.md index df32b4e..12ddd01 100644 --- a/content/drafts/review-of-agp-erotica-automation-tools.md +++ b/content/drafts/review-of-agp-erotica-automation-tools.md @@ -7,11 +7,6 @@ Status: draft * recent shocking progress in deep learning has huge implications for Society—and the future of our species itself * but today I want to talk about the implications for—porn! By reviewing some porn tools—okay, they weren't designed or marketed as such, but you know that's one of the key use-cases -### Xpression Camera - - * it's limited in the sense of noticeably not being a magic mirror (that key fraction of a second behind real time, pixelly, it's just doing basic eye-mouth tracking, so expressions are limited in that sense, distortion if you tilt your head too much), but it gets _close enough_ to being a magic mirror to be really exciting: I blink, the woman on the screen blinks; I open my mouth, she opens her mouth. Amazing!! 
- * I haven't actually tried this on a video call yet (any readers interested in a call?) - ### GPT-3 * GPT-3 is pretty good at writing generic webslush text, which means it's also pretty good at writing generic webslush erotica!! @@ -20,6 +15,10 @@ Status: draft * The text is not actually totally coherent; it'll forget which of the characters is which * the default InstructGPT can be kind of uncreative sometimes, but regular davinci seems more likely to go off the rails; I think I like Instruct better +### Xpression Camera + + * it's limited in the sense of noticeably not being a magic mirror (that key fraction of a second behind real time, pixelly, it's just doing basic eye-mouth tracking, so expressions are limited in that sense, distortion if you tilt your head too much), but it gets _close enough_ to being a magic mirror to be really exciting: I blink, the woman on the screen blinks; I open my mouth, she opens her mouth. Amazing!! + * I haven't actually tried this on a video call yet (any readers interested in a call?) ### Stable Diffusion diff --git a/notes/a-hill-of-validity-sections.md b/notes/a-hill-of-validity-sections.md index 3c64573..1448a04 100644 --- a/notes/a-hill-of-validity-sections.md +++ b/notes/a-hill-of-validity-sections.md @@ -862,13 +862,12 @@ My 28 November 2018 text to Michael— > just a thread reply to Eliezer that says "I trust Zack's rationality and think you should pay attention to what he has to say" (if and only if you actually believe that to be true, obviously)? 
-(don't know how to summarize the part with Ian—) -I remember going downstairs to impulsively confide in a senior engineer, an older bald guy who exuded masculinity, who you could tell by his entire manner and being was not infected by the Berkeley mind-virus, no matter how loyally he voted Democrat—not just about the immediate impetus of this Twitter thread, but this whole _thing_ of the past couple years where my entire social circle just suddenly decided that guys like me could be women by means of saying so. He was sympathetic. - helping Norton live in the real world -Scott says, "It seems to me pretty obvious that the mental health benefits to trans people are enough to tip the object-level first-pass utilitarian calculus."; I don't think _anything_ about "mental health benefits to trans people" is obvious +Scott says, "It seems to me pretty obvious that the mental health benefits to trans people are enough to tip the object-level first-pass utilitarian calculus. + +"; I don't think _anything_ about "mental health benefits to trans people" is obvious ] [TODO: connecting with Aurora 8 December, maybe not important] @@ -1227,3 +1226,5 @@ https://trevorklee.substack.com/p/the-ftx-future-fund-needs-to-slow > changing EA to being a social movement from being one where you expect to give money when I talked to the Kaiser psychiatrist in January 2021, he said that the drugs that they gave me in 2017 were Zyprexa 5mg and Trazodone 50mg, which actually seems a lot more reasonable in retrospect (Trazodone is on Scott's insomnia list), but it was a lot scarier in the context of not trusting the authorities + +I didn't have a simple, [mistake-theoretic](https://slatestarcodex.com/2018/01/24/conflict-vs-mistake/) characterization of the language and social conventions that everyone should use such that anyone who defected from the compromise would be wrong. 
The best I could do was try to objectively predict the consequences of different possible conventions—and of _conflicts_ over possible conventions. \ No newline at end of file