Title: <em>Memento Mori</em>
Date: 2020-08-29 16:45
Category: commentary
Tags: personal, my robot cult, Star Trek, COVID-19

_(Attention conservation notice: personal thoughts on the passing scene; [previously](/2020/Jun/oceans-rise-empires-fall/), [previously](/2018/Oct/sticker-prices/))_

> _But always above you
> The idea raises its head
> What would I do if the Earth fell apart?
> Who would I save, or am I not quite brave enough?_
>
> —Laura Barrett, ["Deception Island Optimists Club"](https://www.youtube.com/watch?v=7L4yB0LtP0U)

Six or sixteen or twenty-one or forty-seven months later—depending on when you start counting—I think I'm almost ready to stop grieving and move on with my life. I have two more long blog posts to finish—one for the robot-cult blog restating my thesis about the cognitive function of categorization with somewhat more math this time and then using it to give an account of _mimicry_, and one here doing some robot-cult liturgical commentary plus necessary autobiographical scaffolding—and then I'll be _done_.

Not done writing. Done _grieving_. Done with this impotent rage that expects (normative sense) this world to be something other than what I know enough to expect (positive sense). Maybe I'll start learning math again.

Last week, I "e-ttended" the conference associated with this open-source scene I've been into for a while—although I've been so distracted by the Category War that I've landed [exactly one commit](https://github.com/rust-lang/rust/commit/d1cdb02e4d4) in master in the last 13 months. (I think I'm still allowed to say "in master", although ["whitelist" is out](https://github.com/rust-lang/rust/pull/74127).)

Traditionally (since 2016), this [has](http://zackmdavis.net/blog/2016/09/rustconf-2016-travelogue/) [been](http://zackmdavis.net/blog/2017/10/some-excuse-for-a-rustconf-2017-travelogue/) [my annual occasion](/2019/Aug/a-love-that-is-out-of-anyones-control/#tech-conference) to travel up to Portland (the _real_ Portland, and not a cowardly obfuscation) and stay with friend of the blog [Sophia](/author/sophia/) (since 2017), but everything is remote this year because of the pandemic.
Only, if I'm serious about exiting my grief loop, I need to stop being so profoundly alienated by how thoroughly the finest technical minds of my generation are wholly owned by [Blue Egregore](/2019/Aug/the-social-construction-of-reality-and-the-sheer-goddamned-pointlessness-of-reason/#blue-egregore). _I fear the successor ideology_—the righteous glee with which they proclaim that everything is political, that anyone with reservations about the Code of Conduct is _ipso facto_ a bigot, that empathy is as important as, if not more important than, technical excellence ...

I can't even think of them as enemies. We're the _same people_. I was born in 1987 and grew up in California with the same [beautiful moral ideal](/2017/Dec/theres-a-land-that-i-see-or-the-spirit-of-intervention/) as everyone else. I just—stopped receiving updates a few years back. From their perspective, an unpatched copy of Social Liberalism 2009 must look hopelessly out-of-date with the Current Year's nexus of [ideological coordination](https://palladiummag.com/2019/08/05/the-real-problem-at-yale-is-not-free-speech/), which everyone wants to be [corrigible](https://arbital.greaterwrong.com/p/corrigibility) to.

Or maybe I'm not _even_ running unpatched Liberalism 2009? I'm still loyal to the beauti—to _my interpretation of_ the beautiful moral ideal. But I've done a lot of off-curriculum reading—[it usually begins with Ayn Rand](https://en.wikipedia.org/wiki/It_Usually_Begins_with_Ayn_Rand), but it [gets](/2020/Apr/book-review-human-diversity/) [much](/2017/Jan/and-yet-none-more-blameable/#goldberg) [worse](/2020/Aug/yarvin-on-less-wrong/). It ... leaves a mark. It's _supposed to_ leave a mark on the world-model without touching the utility function. But how do you explain that to anyone outside of your robot cult?

One of the remote conference talks was about using our software for computational biology. There was something I wanted to say in the Discord channel, related to how I might want to redirect my energies after I'm done grieving. I typed it out in my Emacs `*scratch*` buffer, but, after weighing the risks for a few seconds, deleted a parenthetical at the end.

What I posted was:

> really excited to hear about applying tech skills to biology; my current insurance dayjob is not terribly inspiring, and I've been wondering if I should put effort into making more of an impact with my career

The parenthetical I deleted was:

> (_e.g._ if someone in the world is working on <https://www.gwern.net/Embryo-selection> and needs programmers)

It probably wouldn't have mattered either way, with so many messages flying by in the chat. In some ways, Blue Egregore is less like an ideology and more like a regular expression filter: you can get surprisingly far into discussing the actual substance of ideas as long as no one _says a bad word_ like "eugenics".
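
(The metaphor is nearly literal. A toy sketch, where the blocklist and the `passes_filter` helper are my own inventions for illustration and not anyone's actual moderation code:)

```python
import re

# A surface-level filter: the pattern matches *words*, not ideas, so any
# amount of substance passes as long as the forbidden strings don't appear.
BAD_WORDS = re.compile(r"\b(eugenics)\b", re.IGNORECASE)

def passes_filter(message):
    """Return True if the message contains no bad words."""
    return BAD_WORDS.search(message) is None

assert passes_filter("applying tech skills to biology")
assert passes_filter("selecting embryos on polygenic scores")  # substance: fine
assert not passes_filter("in other words, eugenics")  # bad word: filtered
```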

—if we even have enough _time_ for things like embryo selection to help, if AI research somehow keeps plodding along [even as everything _else_ falls apart](https://www.unqualified-reservations.org/2007/05/antisingularity/). The [GPT-3 demos](https://www.gwern.net/GPT-3) have been tickling my neuroticism. Sure, it's "just" a language model, doing nothing but predicting the next token of human-generated text. But [you can do a lot with language](https://bmk.sh/2020/08/17/Building-AGI-Using-Language-Models/). As _disgusted_ as I am with my robot cult as presently constituted, the _argument_ for why you should fear the coming robot apocalypse in which all will be consumed in a cloud of tiny molecular paperclips still looks solid. But I had always thought of it as a long-term thing—this unspoken sense of, okay, we're probably all going to die, but that'll probably be in, like, 2060 or whatever. People freaking out about it coming _soon_-soon are probably just [following the gradient into](https://www.lesswrong.com/posts/yEjaj7PWacno5EvWa/every-cause-wants-to-be-a-cult) being [a doomsday cult](https://unstableontology.com/2019/07/11/the-ai-timelines-scam/). Now the threat, [and the uncertainty around it](https://intelligence.org/2017/10/13/fire-alarm/), feels more real—like maybe we'll all die in 2035 instead of 2060.

At some point, I should write a post on the causes and consequences of the psychological traits of fictional characters not matching the real-life distributions by demographic. [The new _Star Trek_ cartoon](https://memory-alpha.fandom.com/wiki/Star_Trek:_Lower_Decks) is not very good, but I'm obligated to enjoy it anyway out of brand loyalty. One of the main characters, Ens. Beckett Mariner, is [brash and boisterous and dominant](https://youtu.be/64obsPsXxkE?t=45)—[friendly, but in a way](https://www.youtube.com/watch?v=7CVJGy0Do5I) that makes it clear that she's _on top_. If you've seen _Rick and Morty_, her relationship with Ens. Brad Boimler has the Rick and Morty dynamic, with Mariner as Rick. ([Series creator](https://en.wikipedia.org/wiki/Mike_McMahan) Mike McMahan actually worked on _Rick and Morty_, so it likely _is_ the same dynamic, not just superficially, but generated by the same algorithm in McMahan's head.)

Overall, I'm left with this uncanny feeling that Mariner is ... not drawn from the (straight) female distribution?—like she's a jockish teenage boy [StyleGANed](https://youtu.be/kSLJriaOumA?t=28) into a cute mulatto woman's body. That, given the Federation's established [proficiency with cosmetic surgery](https://memory-alpha.fandom.com/wiki/Cosmetic_surgery), I'm almost _forced_ to formulate the headcanon that she's an AGP trans woman. (The name "Beckett" doesn't help, either. Maybe I should expand this theory into a full post and try to pick up some readers from [/r/DaystromInstitute](https://www.reddit.com/r/DaystromInstitute/), but maybe that would just get me banned.)

I wish I knew _in more detail_ what my brain thinks it's picking up on here? (I could always be wrong.) It's important that I use the word _distribution_ everywhere; I'm at least definitely not being one of those _statistically-illiterate_ sexists. Most men _also_ don't have that kind or degree of boisterous dominance; my surprise is a matter of ratios in the right tail.
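
(The tail point is worth spelling out with actual numbers. In a toy model where the trait is normally distributed in both sexes with equal variance, and a mean difference that I'm making up for illustration, a high cutoff yields a lopsided sex ratio even though most men are also below the cutoff:)

```python
from statistics import NormalDist

# Toy model: one trait, normal in both sexes, equal variance, and a
# made-up mean difference of 0.5 standard deviations.
men = NormalDist(mu=0.5, sigma=1.0)
women = NormalDist(mu=0.0, sigma=1.0)

def tail_ratio(cutoff):
    """Men per woman above the cutoff, assuming equal population sizes."""
    return (1 - men.cdf(cutoff)) / (1 - women.cdf(cutoff))

print(round(1 - men.cdf(2.0), 3))  # → 0.067: most men are below the cutoff
print(round(tail_ratio(2.0), 1))   # → 2.9: but men predominate above it
print(round(tail_ratio(3.0), 1))   # → 4.6: and increasingly so further out
```

So "the group means differ" and "most members of the higher-mean group aren't extreme either" are not in tension; the action is all in the tails.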

I wish there was some way I could get a chance to explain to all _my people_ still under the Egregore's control, what _should_ be common knowledge too obvious to mention—that Bayesian _surprise_ is not moral _disapproval_. Beckett Mariner _deserves to exist_. (And, incidentally, I deserve the chance to _be_ her.) But I think the way you realistically _get_ starships and StyleGAN—and survive—is going to require an uncompromising focus on the kind of technical excellence that can explain in mathematical detail what black-box abstractions like "politics" and "empathy" are even supposed to _mean_—an excellence that _doesn't fit past the regex filter_.

But I don't expect to live to get the chance.