diff --git a/notes/i-tell-myself-notes.txt b/notes/i-tell-myself-notes.txt
index 1313b5f..c7265fe 100644
--- a/notes/i-tell-myself-notes.txt
+++ b/notes/i-tell-myself-notes.txt
@@ -86,6 +86,8 @@ the Church
 
 won't you be embarrassed to leave if we create utopia
 
+won't all those trans women be embarrassed after the singularity
+
 invent a fake epistemology lesson
 
 we live in a world where reason doesn't work
@@ -279,10 +281,16 @@ I just don't _know how_ to tell the true tale of personal heartbreak without exp
 
 The "I can define the word 'woman' any way I want" argument is bullshit. All the actually-smart people know that it's bullshit at _some_ level, perhaps semi-consciously buried under a lot of cognitive dissonance. But it's _socially load-bearing_ bullshit that _not only_ does almost no one have an incentive to correct—
+
+
 But no one has the incentive to correct the mistake in public.
 
+"woah, [toddler]'s learning about the facts of life
+[friend]'s explaining about how some parts usually get covered and about how some people don't have penises and stuff"
 
 "Some people don't have penises" ... can you be a little more specific?!
 
+same person: "people do tend to present as their genders"
+
 politicizing the question of what 2 + 2 should equal
 
 Aumann is an Orthodox Jew
@@ -462,3 +470,48 @@ cat/dog gaslighting; even if you don't particularly need that particular classif
 
 fame: arguing with a Discord server was low-impact compared to getting the leadership on board
 
+https://www.lesswrong.com/posts/CEGnJBHmkcwPTysb7/lonely-dissent
+
+Telling the difference between fantasy and reality is kind of an important life skill
+
+
+https://slatestarcodex.com/2017/11/07/does-age-bring-wisdom/
+
+> Sometimes I can almost feel this happening. First I believe something is true, and say so. Then I realize it's considered low-status and cringeworthy. Then I make a principled decision to avoid saying it–or say it only in a very careful way–in order to protect my reputation and ability to participate in society. Then when other people say it, I start looking down on them for being bad at public relations. Then I start looking down on them just for being low-status or cringeworthy. Finally the idea of "low-status" and "bad and wrong" have merged so fully in my mind that the idea seems terrible and ridiculous to me, and I only remember it's true if I force myself to explicitly consider the question. And even then, it's in a condescending way, where I feel like the people who say it's true deserve low status for not being smart enough to remember not to say it. This is endemic, and I try to quash it when I notice it, but I don't know how many times it's slipped my notice all the way to the point where I can no longer remember the truth of the original statement.
+
+
+https://www.greaterwrong.com/posts/FBgozHEv7J72NCEPB/my-way#comment-W4TAp4LuW3Ev6QWSF
+> I am skeptical that either sex can ever really model and predict the other's deep internal life, short of computer-assisted telepathy. These are different brain designs we're talking about here.
+
+
+The short story ["Failed Utopia #4-2"](https://www.lesswrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2) portrays an almost-aligned superintelligence constructing a happiness-maximizing utopia for humans—except that because [evolution didn't design women and men to be optimal partners for each other](https://www.lesswrong.com/posts/Py3uGnncqXuEfPtQp/interpersonal-entanglement), and the AI is prohibited from editing people's minds, the happiness-maximizing solution ends up splitting the human species by sex and giving women and men their own _separate_ utopias, complete with artificially-synthesized romantic partners.
+
+At the time, [I expressed horror](https://www.greaterwrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2/comment/PhiGnX7qKzzgn2aKb) at the idea in the comments section, because my quasi-religious psychological-sex-differences denialism required that I be horrified. But looking back eleven years later (my deconversion from my teenage religion being pretty thorough at this point, I think), the _argument makes sense_ (though you need an additional [handwave](https://tvtropes.org/pmwiki/pmwiki.php/Main/HandWave) to explain why the AI doesn't give every _individual_ their own separate utopia—if existing women and men aren't optimal partners for each other, then individual men aren't optimal same-sex friends for each other, either).
+
+On my reading of the text, it is _significant_ that the AI-synthesized complements for men are given their own name, the _verthandi_, rather than just being referred to as women. The _verthandi_ may _look like_ women, they may be _approximately_ psychologically human, but the _detailed_ psychology of "superintelligently-engineered optimal romantic partner for a human male" is not going to come out of the distribution of actual human females, and judicious exercise of the [tenth virtue of precision](http://yudkowsky.net/rational/virtues/) demands that a _different word_ be coined for this hypothetical science-fictional type of person.
+
+
+https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions
+
+> Jun 18, 2008
+> this is too perfectly terrifying, too terrifyingly perfect
+>
+> My search for not-previously-read Eliezer Yudkowsky material was getting kind of pathetic--I'd gotten to the point of reading his old messages in the archives of the extropians mailing list. And then I read this:
+>
+> http://lists.extropy.org/pipermail/extropy-chat/2004-September/008924.html
+>
+> --and the worst thing is that I cannot adequately talk about my feelings. Am I shocked, liberated, relieved, scared, angry, amused? I'm not going to read the replies right now. I have work to do, and--and I'm too floored? _I'm just not built to handle this sort of thing_. I remain,
+>
+> Zachary Michael Davis
+
+Arguing with a specific person's published words is important, because otherwise it's too easy to strawman
+
+[Am I the asshole?](https://www.reddit.com/r/AmItheAsshole/)
+
+(I told people that my father was coming to pick me up at the end of my 72-hour (== 3 days) evaluation period, but that it wasn't fair that I couldn't rescue everyone.)
+
+blegg commentary: https://www.lesswrong.com/posts/GEJzPwY8JedcNX2qz/blegg-mode#aAgSDZ4ddHpzj9fNN