X-Git-Url: http://unremediatedgender.space/source?a=blobdiff_plain;f=content%2Fdrafts%2Fi-tell-myself-to-let-the-story-end-or-a-hill-of-validity-in-defense-of-meaning.md;h=6a0f965569863dd13a009cf84fdaa559124f65dd;hb=471610982dfbcf03e75a97374f20dd3e142df2cc;hp=4970d78886865bc614c4aae8d87a04491f6ba869;hpb=766f92d27caeb914774ffdf225b75b4bc95d60e9;p=Ultimately_Untrue_Thought.git

diff --git a/content/drafts/i-tell-myself-to-let-the-story-end-or-a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/i-tell-myself-to-let-the-story-end-or-a-hill-of-validity-in-defense-of-meaning.md
index 4970d78..6a0f965 100644
--- a/content/drafts/i-tell-myself-to-let-the-story-end-or-a-hill-of-validity-in-defense-of-meaning.md
+++ b/content/drafts/i-tell-myself-to-let-the-story-end-or-a-hill-of-validity-in-defense-of-meaning.md
@@ -6,15 +6,18 @@ Status: draft
 
 > _And I tell myself to let the story end
 > And my heart will rest in someone else's hand
-> My 'why not me?' philosophy began
+> My "Why not me?" philosophy began
 > And I say
 > Ooh, how'm I gonna get over you?
 > I'll be alright, just not tonight
-> But someday_
+> But someday
+> Hey, ooh I wish you'd want me to stay
+> I'll be alright, just not tonight
+> But someday—_
 >
 > —Sara Bareilles, ["Gonna Get Over You"](https://genius.com/Sara-bareilles-gonna-get-over-you-lyrics)
 
-I haven't been doing so well for a lot of the last ... um, thirteen months. I mean, I've always been a high-neuroticism person, but this has probably been a below-average year even by my standards, with hours of lost sleep, occasional crying bouts, _many, many_ hours of obsessive ruminating-while-pacing instead of doing my dayjob, and too long with a Sara Bareilles song on loop to numb the pain. I've been reluctant to write about it in too much detail for poorly-understood psychological reasons. Maybe it would feel too much like attacking my friends?
+I haven't been doing so well for a lot of the last ... um, thirteen months. I mean, I've always been a high-neuroticism person, but this has probably been a below-average year even by my standards, with hours of lost sleep, occasional crying bouts, _many, many_ hours of obsessive ruminating-while-pacing instead of doing my dayjob, and too long with a [Sara](https://www.youtube.com/watch?v=OUe3oVlxLSA) [Bareilles](https://www.youtube.com/watch?v=emdVSVoCLmg) [song](https://youtu.be/jZMQ0OKVO80?t=112) on loop to numb the pain. I've been reluctant to write about it in too much detail for poorly-understood psychological reasons. Maybe it would feel too much like attacking my friends?
 
 But this blog is not about _not_ attacking my friends. This blog is about the truth. For my own sanity, for my own emotional closure, I need to tell the story as best I can. If it's an _incredibly boring and petty_ story about me getting _unreasonably angry_ about philosophy-of-language minutiæ, well, you've been warned. If the story makes me look bad in the reader's eyes (because you think I'm crazy for getting so unreasonably angry about philosophy-of-language minutiæ), then I shall be happy to look bad for _what I actually am_. (If _telling the truth_ about what I've been obsessively preoccupied with all year makes you dislike me, then you probably _should_ dislike me. If you were to approve of me on the basis of _factually inaccurate beliefs_, then the thing of which you approve, wouldn't be _me_.)
 
@@ -52,7 +55,7 @@ The short story ["Failed Utopia #4-2"](https://www.lesswrong.com/posts/ctpkTaqTK
 
 At the time, [I expressed horror](https://www.greaterwrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2/comment/PhiGnX7qKzzgn2aKb) at the idea in the comments section, because my quasi-religious psychological-sex-differences denialism required that I be horrified. But looking back eleven years later (my deconversion from my teenage religion being pretty thorough at this point, I think), the _argument makes sense_ (though you need an additional [handwave](https://tvtropes.org/pmwiki/pmwiki.php/Main/HandWave) to explain why the AI doesn't give every _individual_ their separate utopia—if existing women and men aren't optimal partners for each other, so too are individual men not optimal same-sex friends for each other).
 
-On my reading of the text, it is _significant_ that the AI-synthesized complements for men are given their own name, the _verthandi_, rather than just being referred to as women. The _verthandi_ may _look like_ women, they may be _approximately_ psychologically human, but the _detailed_ psychology of "superintelligently-engineered optimal romantic partner for a human male" is not going to come out of the distribution of actual human females, and judicious exercise of the [tenth virtue of precision](http://yudkowsky.net/rational/virtues/) demands that a _different word_ be coined for this hypothetical science-fictional type of person. Calling the _verthandi_ "women" would be _worse writing_; it would _fail to communicate_ what, in the story, has been lost.
+On my reading of the text, it is _significant_ that the AI-synthesized complements for men are given their own name, the _verthandi_, rather than just being referred to as women. The _verthandi_ may _look like_ women, they may be _approximately_ psychologically human, but the _detailed_ psychology of "superintelligently-engineered optimal romantic partner for a human male" is not going to come out of the distribution of actual human females, and judicious exercise of the [tenth virtue of precision](http://yudkowsky.net/rational/virtues/) demands that a _different word_ be coined for this hypothetical science-fictional type of person. Calling the _verthandi_ "women" would be _worse writing_; it would _fail to communicate_ the impact of what has taken place in the story.
 
 [section: reaction to "Changing Emotions"]