X-Git-Url: http://unremediatedgender.space/source?p=Ultimately_Untrue_Thought.git;a=blobdiff_plain;f=content%2F2020%2Fmemento-mori.md;fp=content%2F2020%2Fmemento-mori.md;h=13571dffaa546d2e28e257e617c502c3934c09cc;hp=08c58328a6d17d9691c31ae7fee08021aee1a646;hb=b2b8d05669cbb707226a083a9384162d1f9f01e0;hpb=8d0a2be60ccef09bebc12b468cc88500891bb229

diff --git a/content/2020/memento-mori.md b/content/2020/memento-mori.md
index 08c5832..13571df 100644
--- a/content/2020/memento-mori.md
+++ b/content/2020/memento-mori.md
@@ -38,7 +38,7 @@ The parenthetical I deleted was:
 
 It probably wouldn't have mattered either way, with so many messages flying by in the chat. In some ways, Blue Egregore is less like an ideology and more like a regular expression filter: you can get surprisingly far into discussing the actual substance of ideas as long as no one _says a bad word_ like "eugenics".
 
-—if we even have enough _time_ for things like embryo selection to help, if AI research somehow keeps plodding along [even as everything _else_ falls apart](https://www.unqualified-reservations.org/2007/05/antisingularity/). The [GPT-3 demos](https://www.gwern.net/GPT-3) have been tickling my neuroticism. Sure, it's "just" a language model, doing nothing more but predicting the next token of human-generated text. But [you can do a lot with language](https://bmk.sh/2020/08/17/Building-AGI-Using-Language-Models/). As _disgusted_ as I am with my robot cult as presently constituted, the _argument_ for why you should fear the coming robot apocalypse in which all will be consumed in a cloud of tiny molecular paperclips, still looks solid. But I had always thought of it as a long-term thing—this unspoken sense of, okay, we're probably all going to die, but that'll probably be in, like, 2060 or whatever. People freaking out about it coming _soon_-soon are probably just [following the gradient into](https://www.lesswrong.com/posts/yEjaj7PWacno5EvWa/every-cause-wants-to-be-a-cult) being [a doomsday cult](https://unstableontology.com/2019/07/11/the-ai-timelines-scam/). Now the threat, [and the uncertainty around it](https://intelligence.org/2017/10/13/fire-alarm/), feel more real—like maybe we'll all die in 2035 instead of 2060.
+—if we even have enough _time_ for things like embryo selection to help, if AI research somehow keeps plodding along [even as everything _else_ falls apart](https://www.unqualified-reservations.org/2007/05/antisingularity/). The [GPT-3 demos](https://www.gwern.net/GPT-3) have been tickling my neuroticism. Sure, it's "just" a language model, doing nothing more but predicting the next token of human-generated text. But [you can do a lot with language](https://bmk.sh/2020/08/17/Building-AGI-Using-Language-Models/). As _disgusted_ as I am with my robot cult as presently constituted, the _argument_ for why you should fear the coming robot apocalypse in which all will be consumed in a cloud of tiny molecular paperclips, still looks solid. But I had always thought of it as a long-term thing—this unspoken sense of, okay, we're probably all going to die, but that'll probably be in, like, 2060 or whatever. People freaking out about it coming _soon_-soon are probably just [following the gradient into](https://www.lesswrong.com/posts/yEjaj7PWacno5EvWa/every-cause-wants-to-be-a-cult) being [a doomsday cult](https://unstableontology.com/2019/07/11/the-ai-timelines-scam/). Now the threat, [and the uncertainty around it](https://intelligence.org/2017/10/13/fire-alarm/), feel more real—like maybe we'll all die in 2035 instead of 2060. At some point, I should write a post on the causes and consequences of the psychological traits of fictional characters not matching the real-life distributions by demographic.
 
 [The new _Star Trek_ cartoon](https://memory-alpha.fandom.com/wiki/Star_Trek:_Lower_Decks) is not very good, but I'm obligated to enjoy it anyway out of brand loyalty. One of the main characters, Ens. Beckett Mariner, is [brash and boisterous and dominant](https://youtu.be/64obsPsXxkE?t=45)—[friendly, but in a way](https://www.youtube.com/watch?v=7CVJGy0Do5I) that makes it clear that she's _on top_. If you've seen _Rick and Morty_, her relationship with Ens. Brad Boimler has the Rick and Morty dynamic, with Mariner as Rick. ([Series creator](https://en.wikipedia.org/wiki/Mike_McMahan) Mike McMahan actually worked on _Rick and Morty_, so it likely _is_ the same dynamic, not just superficially, but generated by the same algorithm in McMahan's head.)