From: Zack M. Davis Date: Tue, 10 Oct 2023 03:10:48 +0000 (-0700) Subject: check in X-Git-Url: http://534655.efjtl6rk.asia/source?a=commitdiff_plain;h=0e65e79cd297f1c4d70195f4752608e69592791a;p=Ultimately_Untrue_Thought.git check in --- diff --git a/content/drafts/fake-deeply.md b/content/drafts/fake-deeply.md index 462d593..01571ad 100644 --- a/content/drafts/fake-deeply.md +++ b/content/drafts/fake-deeply.md @@ -150,6 +150,8 @@ Was there anything else he was missing? The object storage cluster did have a op "Imagine you're training your AI to act as a general home assistant, to run everything in a user's household in a way that the user rates highly," Chloë continued. "If—I don't know, say, the family cat dies, that's a negative reward—maybe a better feeding schedule or security monitoring could have prevented it. But if the cat dies and the system _tries to cover it up_—says the cat is out for a walk right now to avoid telling you the bad news—that's going to be an even larger negative reward when the deception is discovered. +[TODO: go with the vase example and cite: https://www.lesswrong.com/posts/AqsjZwxHNqH64C2b6/let-s-see-you-write-that-corrigibility-tag?commentId=8kPhqBc69HtmZj6XR ] + "Uh _huh_," Jake said, more unhappily. It turned out that versioning _was_ on for the bucket. (Why? But probably whoever's job it was to set up the bucket had instead asked, Why not?) A basic `GET` request for the file name would return puppies, but any previous revisions were still available for anyone who thought to query them. "If the system is trained to pass rigorous evaluations, a deceptive policy has to do a lot more work, different work, to pass the evaluations," Chloë said. "In the limit, that could mean—using Multigen-like capabilities to make videos to convince you that the cat is there while you're on vacation? Constructing a realistic cat-like robot to fool you when you get back? Maybe this isn't the best illustrative example. The point is, small, 'shallow' deceptions aren't stable. The set of policies that do well on evaluations comes in two disconnected parts: the part that tells the truth, and the part that—not just lies, but, um—" diff --git a/content/drafts/if-clarity-seems-like-death-to-them.md b/content/drafts/if-clarity-seems-like-death-to-them.md index a050c8f..940b7ac 100644 --- a/content/drafts/if-clarity-seems-like-death-to-them.md +++ b/content/drafts/if-clarity-seems-like-death-to-them.md @@ -465,7 +465,9 @@ I said I would bite that bullet: yes! Yes, I was trying to figure out whether I ------- -[TODO: Ziz's protest] +[TODO: Ziz's protest +https://archive.ph/jChxP +] -------- diff --git a/content/drafts/on-the-public-anti-epistemology-of-dath-ilan.md b/content/drafts/on-the-public-anti-epistemology-of-dath-ilan.md index 8a069bc..56f134f 100644 --- a/content/drafts/on-the-public-anti-epistemology-of-dath-ilan.md +++ b/content/drafts/on-the-public-anti-epistemology-of-dath-ilan.md @@ -137,9 +137,9 @@ Notwithstanding that Rittaen can be Watsonianly assumed to have detailed neurosc That's not an accusation of hypocrisy. The dath ilani are being consistent. In a Society that prizes freedom-from-infohazards over freedom-of-speech, it makes sense that telling people things would be meddling, but colluding to not tell them, isn't. -### Keltham and the Masochism Coverup +### Keltham and the Sadism/Masochism Coverup -The story of Keltham's sexuality provides another case study on dath ilani Civilization's predilection for deception. 
In Golarion, Keltham discovers that he's a sexual sadist, and infers that ["not many sadists in dath ilan know what they are and Civilization tries to prevent us from finding out, because dath ilan does _not_ have masochists."](https://glowfic.com/replies/1735044#reply-1735044) (The notion of an evolved organism preferring pain in any context strikes him as implausible.) Word of God clarifies that masochists do exist, but are very rare, but Keltham's guess is otherwise correct; there is a deliberate effort (by the Keepers?—the exact agency responsible isn't clear) to prevent public knowledge that sadism and masochism even exist. +The story of Keltham's sexuality provides another case study on dath ilani Civilization's predilection for deception. In Golarion, Keltham discovers that he's a sexual sadist, and infers that ["not many sadists in dath ilan know what they are and Civilization tries to prevent us from finding out, because dath ilan does _not_ have masochists."](https://glowfic.com/replies/1735044#reply-1735044) (The notion of an evolved organism preferring pain in any context strikes him as implausible.) Word of God clarifies that masochists do exist at low frequencies, but that Keltham's guess is otherwise correct; there is a deliberate effort (by the Keepers?—the exact agency responsible isn't clear) to prevent public knowledge that sadism and masochism even exist. The naïve utilitarian case for the coverup is straightforward. There aren't enough masochists to go around. People with a latent inclination towards sadism are better off not being self-aware about it, because if they knew what they wanted, they would feel sad about not being able to have it. @@ -154,7 +154,7 @@ Again, [facts are connected to each other](https://www.lesswrong.com/posts/wyyfF * transsexualism example - * "They're better off not knowing about masochists that they can't have" disregards other ways of dealing with it: fantasy, porn, inventing in better sex robots, implications for eugenic policy ... + * "They're better off not knowing about masochists that they can't have" disregards other ways of dealing with it: fantasy, porn, inventing in better sex robots, implications for eugenic policy ... * Difficulties of recursive censorship; do the people who can afford it have to stay in the closet for the benefit of people like Keltham? You could claim that they're so good at coordinating that they negotiated this, but how does the negotiation work when the people the deal allegedly benefits don't know about it? diff --git a/content/drafts/standing-under-the-same-sky.md b/content/drafts/standing-under-the-same-sky.md index e9c8bfc..9fa429a 100644 --- a/content/drafts/standing-under-the-same-sky.md +++ b/content/drafts/standing-under-the-same-sky.md @@ -65,7 +65,13 @@ Yudkowsky made an appearance. (After he replied to someone else, I remarked pare > **Eliezer** — 11/29/2022 10:36 PM > I am sorry that some of the insane people I attracted got together and made each other more insane and then extensively meta-gaslit you into believing that everyone generally and me personally was engaging in some kind of weird out-in-the-open gaslighting that you could believe in if you attached least-charitable explanations to everything we were doing -It was pretty annoying that Yudkowsky was still attributing my greviances to Michael's malign influence—as if the gender identity revolution was something I would otherwise have just taken lying down. 
In the counterfactual where Michael had died in 2015, I think something like my February 2017 breakdown would have likely happened anyway. (Between August 2016 and January 2017, I sent Michael 14 emails, met with him once, and watched 60% of South Park season 19 at his suggestion, so he was _an_ influence on my thinking during that period, but not a disproportionately large one compared to everything else I was doing at the time.) How would I have later reacted to the November 2018 "hill of meaning" Tweets (assuming they weren't butterfly-effected away in this counterfactual)? It's hard to say. Maybe, if that world's analogue of my February 2017 breakdown had gone sufficiently badly (with no Michael to visit me in the psych ward or help me make sense of things afterwards), I would have already been a broken man, and not even sent Yudkowsky an email. In any case, I feel very confident that my understanding of the behavior of "everyone generally and [Yudkowsky] personally" would not have been _better_ without Michael _et al._'s influence.
+It was pretty annoying that Yudkowsky was still attributing my grievances to Michael's malign influence—as if [the rationalists' wholesale embrace of the gender identity revolution](/2023/Jul/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer/) was something I would otherwise have just taken lying down. In the counterfactual where Michael had died in 2015, I think something like [my February 2017 mental breakdown](/2017/Mar/fresh-princess/) would have likely happened anyway.[^vassar-influence]
+
+[^vassar-influence]: Between August 2016 and January 2017, I sent Michael 14 emails, met with him once, and watched 60% of South Park season 19 at his suggestion, so he was _an_ influence on my thinking during that period, but not a disproportionately large one compared to everything else I was doing at the time.
+
+How would I have later reacted to [Yudkowsky's November 2018 "hill of meaning in defense of validity" Tweets](/2023/Jul/a-hill-of-validity-in-defense-of-meaning/) (assuming they weren't butterfly-effected away in this counterfactual)? It's hard to say. Maybe, if that world's analogue of my February 2017 breakdown had gone sufficiently badly (with no Michael to visit me in the psych ward or help me make sense of things afterwards), I would have already been a broken man, and not even sent Yudkowsky an email. In any case, I feel very confident that my understanding of the behavior of "everyone generally and [Yudkowsky] personally" would not have been _better_ without Michael _et al._'s influence.
+
+Also, while "weird out-in-the-open gaslighting" was clearly a hostile paraphrase optimized to make my position look silly, I'm tempted to bite the bullet on it being an apt description of repeatedly publishing text that's optimized to delegitimize thinking about biological sex (which is gaslighting, manipulating people such that they doubt their own perceptions of reality) and justifying it on the grounds that ["people do _know_ they're living in a half-Stalinist environment"](/images/yudkowsky-personally_prudent_and_not_community-harmful.png) (which makes it weird and out in the open).

> [cont'd]
> you may recall that this blog included something called the "Bayesian Conspiracy"

Is it, though?
The "always a complicated person [who has] shifted in [his] empha (As far as [the](https://www.lesswrong.com/posts/fnEWQAYxcRnaYBqaZ/initiation-ceremony) [Bayesian](https://www.lesswrong.com/posts/ZxR8P8hBFQ9kC8wMy/the-failures-of-eld-science) [Conspiracy](https://www.lesswrong.com/posts/xAXrEpF5FYjwqKMfZ/class-project) [stories](https://www.lesswrong.com/posts/kXAb5riiaJNrfR8v8/the-ritual) [went](https://www.lesswrong.com/posts/yffPyiu7hRLyc7r23/final-words), I think there's a significant narrative contrast between Brennan _seeking_ knowledge from the master _beisutsukai_, and Keltham, Merrin, and Thellim being _protected from_ knowledge by the Keepers. Neither the Bayesian Conspiracy nor the Keepers are publishing open-access textbooks, but at least the Conspiracy isn't claiming that their secretiveness is _for others' benefit_.) -It's notable that Yudkowsky listed "still having posts like [Meta-Honesty](https://www.lesswrong.com/posts/xdwbX9pFEr7Pomaxv/meta-honesty-firming-up-honesty-around-its-edge-cases)" as an exculpatory factor here. The thing is, I [wrote a _critique_ of Meta-Honesty](https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly). It was well-received (being [cited as a good example in the introductory post for the 2019 Less Wrong Review](https://www.lesswrong.com/posts/QFBEjjAvT6KbaA3dY/the-lesswrong-2019-review), for instance). I don't think I could have written a similarly impassioned critique of anything from the Sequences era, because the stuff from the Sequences era still looked correct to me. To me, "Meta-Honesty" was evidence _for_ Yudkowsky having relinquished his Art and lost his powers, not evidence that his powers were still intact. +It's notable that Yudkowsky listed "still having posts like [Meta-Honesty](https://www.lesswrong.com/posts/xdwbX9pFEr7Pomaxv/meta-honesty-firming-up-honesty-around-its-edge-cases)" as an exculpatory factor here. The thing is, I [wrote a _critique_ of "Meta-Honesty"](https://www.lesswrong.com/posts/MN4NRkMw7ggt9587K/firming-up-not-lying-around-its-edge-cases-is-less-broadly). It was well-received (being [cited as a good example in the introductory post for the 2019 Less Wrong Review](https://www.lesswrong.com/posts/QFBEjjAvT6KbaA3dY/the-lesswrong-2019-review), for instance). I don't think I could have written a similarly impassioned critique of anything from the Sequences era, because the stuff from the Sequences era still looked correct to me. To me, "Meta-Honesty" was evidence _for_ Yudkowsky having relinquished his Art and lost his powers, not evidence that his powers were still intact. I didn't have that response thought through in real time. At the time, I just agreed: @@ -172,17 +178,15 @@ I admitted, again, that there was a sense in which I couldn't argue with authori Something about innate _kung fu_ world seems fake in a way that seems like a literary flaw. It's not just about plausibility. Fiction often incorporates unrealistic elements in order to tell a story that has relevance to real human lives. Innate _kung fu_ skills are scientifically plausible[^instinct] in a way that faster-than-light travel is not, but throwing faster-than-light travel into the universe so that you can do a [space opera](https://tvtropes.org/pmwiki/pmwiki.php/Main/SpaceOpera) doesn't make the _people_ fake in the way that Superman's fighting skills are fake. 
-[^instinct]: All sorts of other instinctual behaviors exist in animals; I don't se why skills humans have to study for years as a "martial art" couldn't be coded into the genome.
+[^instinct]: All sorts of other instinctual behaviors exist in animals; I don't see why skills humans have to study for years as a "martial art" couldn't be coded into the genome.

-Maybe it was okay for Superman's fighting skills to be fake from a literary perspective (because realism along that dimension is not what Superman is _about_), but if the Yudkowskian ethos exulted intelligence as ["the power that cannot be removed without removing you"](https://www.lesswrong.com/posts/SXK87NgEPszhWkvQm/mundane-magic), readers had grounds to demand that the dath ilani's thinking skills be real, and a world that's claimed by authorial fiat to be super-great at epistemic rationality, but where the people don't have a will-to-truth stronger than their will-to-happiness, felt fake to me. I couldn't _prove_ that it was fake. I agreed with Harmless's case that, _technically_, as far as the Law went, you could build a Civilization or a Friendly AI to see all the ugly things that you preferred not to see.
+Maybe it was okay for Superman's fighting skills to be fake from a literary perspective (because realism along that dimension is not what Superman is _about_), but if the Yudkowskian ethos exalted intelligence as ["the power that cannot be removed without removing you"](https://www.lesswrong.com/posts/SXK87NgEPszhWkvQm/mundane-magic), readers had grounds to demand that the dath ilani's thinking skills be real, and a world that's claimed by authorial fiat to be super-great at epistemic rationality, but where the people don't have a will-to-truth stronger than their will-to-happiness, felt fake to me. I couldn't _prove_ that it was fake. I agreed with Harmless's case that, technically, as far as the Law went, you could build a Civilization or a Friendly AI to see all the ugly things that you preferred not to see.

But if you could—would you? And more importantly, if you would—could you? It was possible that the attitude I was evincing here was just a difference between the eliezera out of dath ilan and the Zackistani from my medianworld, and that there was nothing more to be said about it. But I didn't think the thing was a _genetic_ trait of the Zackistani! _I_ got it from spending my early twenties obsessively re-reading blog posts that said things like, ["I believe that it is right and proper for me, as a human being, to have an interest in the future [...] One of those interests is the human pursuit of truth [...] I wish to strengthen that pursuit further, in this generation."](https://www.lesswrong.com/posts/anCubLdggTWjnEvBS/your-rationality-is-my-business)

-There were definitely communities on Earth where I wasn't allowed in because of my tendency to shout things from street corners, and I respected those people's right to have a safe space for themselves.
-
-But those communities didn't call themselves _rationalists_, weren't _pretending_ be to be inheritors of the great tradition of E. T. Jaynes and Richard Feynman and Robin Hanson. And if they _did_, I think I would have a false advertising complaint against them.
+There were definitely communities on Earth where I wasn't allowed in because of my tendency to shout things from street corners, and I respected those people's right to have a safe space for themselves. But those communities didn't call themselves _rationalists_, weren't pretending to be inheritors of the great tradition of E. T. Jaynes and Richard Feynman and Robin Hanson. And if they _did_, I think I would have a false advertising complaint against them.

"[The eleventh virtue is scholarship. Study many sciences and absorb their power as your own](https://www.yudkowsky.net/rational/virtues) ... unless a prediction market says that would make you less happy," just didn't have the same ring to it. Neither did "The first virtue is curiosity. A burning itch to know is higher than a solemn vow to pursue truth. But higher than both of those, is trusting your Society's institutions to tell you which kinds of knowledge will make you happy"—even if you stipulated by authorial fiat that your Society's institutions are super-competent, such that they're probably right about the happiness thing.
@@ -219,7 +223,7 @@ Apparently I struck a nerve. Yudkowsky started "punching back":

I thought the "you could not have generated the answer I just told you" gambit was a pretty dirty argumentative trick on Yudkowsky's part. (Given that I could, how would I be able to prove it?—this was itself a good use-case for concealing spoilers.)

-As it happened, however, I _had_ already considered the case of spoilers as a class of legitimate infohazards, and was prepared to testify that I had already thought of it, and explain why I thought hiding spoilers were relevantly morally different from the coverups I was objecting to. The previous night, 7 December 2022, I had had a phone call with Anna Salamon,[^evidence-of-independent-generation] in which I had cited dath ilan's [practice of letting children figure out heliocentrism for themselves](https://www.glowfic.com/replies/1777588#reply-1777588) as not being objectionable in the way the sadism/masochism coverup was.
+As it happened, however, I had already considered the case of spoilers as a class of legitimate infohazards, and was prepared to testify that I had already thought of it, and explain why I thought hiding spoilers was relevantly morally different from the coverups I was objecting to. The previous night, 7 December 2022, I had had a phone call with Anna Salamon,[^evidence-of-independent-generation] in which I had cited dath ilan's [practice of letting children figure out heliocentrism for themselves](https://www.glowfic.com/replies/1777588#reply-1777588) as not being objectionable in the way the sadism/masochism coverup was.

[^evidence-of-independent-generation]: I was lucky to be able to point to Anna as a potential witness to defend myself against the "could not have generated" trick—as a matter of principle, not because I seriously expected anyone to care enough to go ask Anna if she remembered the conversation the same way.
@@ -366,11 +370,11 @@ But, as I pointed out, it was significant that the particular problem to which m

Apparently, yes:

-**Eliezer** — 12/17/2022 5:50 PM
-you sure are supposed to not get angry at the people who didn't create those political punishments
-that's insane
-they're living in Cheliax and you want them to behave like they're not in Cheliax and get arrested by the Church
-your issue is with Asmodeus. take it to Him, and if you can't take Him down then don't blame others who can't do that either.
+> **Eliezer** — 12/17/2022 5:50 PM
+> you sure are supposed to not get angry at the people who didn't create those political punishments
+> that's insane
+> they're living in Cheliax and you want them to behave like they're not in Cheliax and get arrested by the Church
+> your issue is with Asmodeus. take it to Him, and if you can't take Him down then don't blame others who can't do that either.

Admirably explicit! If he were that frank all the time, I wouldn't actually have had a problem with him. (I don't expect people to pay arbitrary costs to defy their political incentives; my problem with the "hill of meaning in defense of validity" and "simplest and best protocol" performances was precisely that they were _pretending not to be political statements_; if we can be clear about the _existence_ of the Asmodean elephant in the room listening to everything we say, I don't blame anyone for not saying anything else that the elephant would report to its superiors.)
diff --git a/notes/memoir-sections.md b/notes/memoir-sections.md
index f5aaca0..a452bad 100644
--- a/notes/memoir-sections.md
+++ b/notes/memoir-sections.md
@@ -81,6 +81,9 @@ _ sucking Scott's dick is helpful because he's now the main gateway instead of H
_ Sarah's point that Scott gets a lot of undeserved deference, too: https://twitter.com/s_r_constantin/status/1435609950303162370
_ clarify that Keltham infers there are no masochists, vs. Word of God
_ "Doublethink" ref in Xu discussion should mention that Word of God Eliezerfic clarification that it's not about telling others
+_ https://www.greaterwrong.com/posts/vvc2MiZvWgMFaSbhx/book-review-the-bell-curve-by-charles-murray/comment/git7xaE2aHfSZyLzL
+
+pt. 6 edit tier—

dath ilan ancillary tier—
_ Who are the 9 most important legislators called?
@@ -2829,4 +2832,8 @@ https://www.lesswrong.com/posts/qbcuk8WwFnTZcXTd6/thomas-kwa-s-miri-research-exp
> The model was something like: Nate and Eliezer have a mindset that's good for both capabilities and alignment, and so if we talk to other alignment researchers about our work, the mindset will diffuse into the alignment community, and thence to OpenAI, where it would speed up capabilities.

27 January 2020—
-> I'm also afraid of the failure mode where I get frame-controlled by the Michael/Ben/Jessica mini-egregore (while we tell ourselves a story that we're the real rationalist coordination group and not an egregore at all). Michael says that the worldview he's articulating would be the one that would be obvious to me if I felt that I was in danger. Insofar as I trust that my friends' mini-egregore is seeing something but I don't trust the details, the obvious path forward is to try to do original seeing while leaning into fear—trusting Michael's meta level advice, but not his detailed story.
\ No newline at end of file
+> I'm also afraid of the failure mode where I get frame-controlled by the Michael/Ben/Jessica mini-egregore (while we tell ourselves a story that we're the real rationalist coordination group and not an egregore at all). Michael says that the worldview he's articulating would be the one that would be obvious to me if I felt that I was in danger. Insofar as I trust that my friends' mini-egregore is seeing something but I don't trust the details, the obvious path forward is to try to do original seeing while leaning into fear—trusting Michael's meta level advice, but not his detailed story.
+ +Weird tribalist praise for Scott: https://www.greaterwrong.com/posts/GMCs73dCPTL8dWYGq/use-normal-predictions/comment/ez8xrquaXmmvbsYPi + + diff --git a/notes/memoir_wordcounts.csv b/notes/memoir_wordcounts.csv index f52d760..66222ef 100644 --- a/notes/memoir_wordcounts.csv +++ b/notes/memoir_wordcounts.csv @@ -537,5 +537,5 @@ 10/05/2023,123572,945 10/06/2023,119136,-4436 10/07/2023,118644,-492 -10/08/2023,, +10/08/2023,118644,0 10/09/2023,, diff --git a/notes/memoir_wordcounts.py b/notes/memoir_wordcounts.py index c87c256..c462761 100755 --- a/notes/memoir_wordcounts.py +++ b/notes/memoir_wordcounts.py @@ -21,7 +21,7 @@ MONTHS = { } def wordcount_at_this_sha(): - result = subprocess.run("wc -w content/2023/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer.md content/2023/a-hill-of-validity-in-defense-of-meaning.md content/drafts/if-clarity-seems-like-death-to-them.md content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md content/drafts/on-the-public-anti-epistemology-of-dath-ilan.md content/drafts/standing-under-the-same-sky.md".split(), stdout=subprocess.PIPE) + result = subprocess.run("wc -w content/2023/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer.md content/2023/a-hill-of-validity-in-defense-of-meaning.md content/drafts/if-clarity-seems-like-death-to-them.md content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md content/drafts/zevis-choice.md content/drafts/on-the-public-anti-epistemology-of-dath-ilan.md content/drafts/standing-under-the-same-sky.md".split(), stdout=subprocess.PIPE) wc_lines = result.stdout.decode('utf8').split('\n') total_line = wc_lines[-2] # last line is empty return int(total_line.split()[0]) diff --git a/notes/notes.txt b/notes/notes.txt index 3a9bb32..4e0923e 100644 --- a/notes/notes.txt +++ b/notes/notes.txt @@ -3359,3 +3359,5 @@ https://deathisbad.substack.com/p/my-agp-dudes-it-does-get-better https://www.newyorker.com/magazine/2023/10/09/alliance-defending-freedoms-legal-crusade > Michael says the thing we call "trans women" are basically males who have been lied to about how sex works and who don't, can't participate in rape culture. As a virgin from the 1987 birth cohort, this resonates. I said, "Comment 171 syndrome." Although ... I have been in bed with a woman a few times, and from there the proximate cause of still-being-a-virgin-afterward was erectile nonperformance due to obligate-AGP. This is likely related to Comment 171 syndrome. Blanchard wrote about "developmental competition" (the balance between allo- and auto- hardening during psycho-sexual development): my analogue in a world where I had known how and why to ethically pursue girls as a teenager would still be in the same taxon, but maybe wouldn't have gone so far down the obligate track. 
+ +https://old.reddit.com/r/askAGP/comments/173rtqp/im_agp_and_i_transitioned_at_12_ama/ diff --git a/notes/post_ideas.txt b/notes/post_ideas.txt index a3afa93..f49d5c0 100644 --- a/notes/post_ideas.txt +++ b/notes/post_ideas.txt @@ -5,9 +5,10 @@ _ Reply to Scott Alexander on Autogenderphilia _ Hrunkner Unnerby and the Shallowness of Progress _ If Clarity Seems Like Death to Them (April 2019–January 2021) -_ On the Public Anti-Epistemology of dath ilan _ Agreeing With Stalin in Ways that Exhibit Generally Rationalist Principles (February 2021) -_ Standing Under the Same Sky (April 2021–December 2022) +_ Zevi's Choice (March 2021–April 2022) +_ On the Public Anti-Epistemology of dath ilan +_ Standing Under the Same Sky (September–December 2022) Minor—