From: M. Taylor Saotome-Westlake Date: Fri, 13 Jan 2023 01:11:02 +0000 (-0800) Subject: memoir pokes: writing to Scott and /r/gendercritical, dayjob IQ minimum X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=1d8d7891d1f06e6bc167c3b25e81f4037c1b777d;p=Ultimately_Untrue_Thought.git memoir pokes: writing to Scott and /r/gendercritical, dayjob IQ minimum --- diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md index 20cb8ce..7d8fc0a 100644 --- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md +++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md @@ -411,9 +411,11 @@ I started drafting a long reply—but then I remembered that in recent discussio This is the part where I began to ... overheat. I tried ("tried") to focus on my dayjob, but I was just _so angry_. Did Scott _really_ not understand the rationality-relevant distinction between "value-dependent categories as a result of caring about predicting different variables" (as explained by the _dagim_/water-dwellers _vs._ fish example) and "value-dependent categories _in order to not make my friends sad_"? I thought I was pretty explicit about this? Was Scott _really_ that dumb?? Or is it that he was only verbal-smart and this is the sort of thing that only makes sense if you've ever been good at linear algebra?? (Such that the language of "only running your clustering algorithm on the subspace of the configuration space spanned by the variables that are relevant to your decisions" would come naturally.) Did I need to write a post explaining just that one point in mathematical detail? (With executable code and a worked example with entropy calculations.) -My dayjob boss made it clear that he was expecting me to have code for my current Jira tickets by noon the next day, so I resigned myself to stay at the office late to finish that. 
+My dayjob boss made it clear that he was expecting me to have code for my current Jira tickets by noon the next day, so I deceived myself into thinking I could successfully accomplish that by staying at the office late. -But I was just in so much (psychological) pain. Or at least—as I noted in one of a series of emails to my posse that night—I felt motivated to type the sentence, "I'm in so much (psychological) pain." I'm never sure how to intepret my own self-reports, because even when I'm really emotionally trashed (crying, shaking, randomly yelling, _&c_.), I think I'm still noticeably _incentivizable_: if someone were to present a credible threat (like slapping me and telling me to snap out of it), then I would be able to calm down: there's some sort of game-theory algorithm in the brain that subjectively feels genuine distress (like crying or sending people too many hysterical emails) but only when it can predict that it will be either rewarded with sympathy or at least tolerated. (Kevin Simler: [tears are a discount on friendship](https://meltingasphalt.com/tears/).) +(Maybe I could have caught up, if it had been just a matter of the task being slightly harder than anticipated and I hadn't been psychologically impaired. The problem was that focus is worth 30 IQ points, and an IQ 100 person _can't do my job_.) + +I was in so much (psychological) pain. Or at least—as I noted in one of a series of emails to my posse that night—I felt motivated to type the sentence, "I'm in so much (psychological) pain."
I'm never sure how to interpret my own self-reports, because even when I'm really emotionally trashed (crying, shaking, randomly yelling, _&c_.), I think I'm still noticeably _incentivizable_: if someone were to present a credible threat (like slapping me and telling me to snap out of it), then I would be able to calm down: there's some sort of game-theory algorithm in the brain that subjectively feels genuine distress (like crying or sending people too many hysterical emails) but only when it can predict that it will be either rewarded with sympathy or at least tolerated. (Kevin Simler: [tears are a discount on friendship](https://meltingasphalt.com/tears/).) I [tweeted a Sequences quote](https://twitter.com/zackmdavis/status/1107874587822297089) (the mention of @ESYudkowsky being to attribute credit, I told myself; I figured Yudkowsky had enough followers that he probably wouldn't see a notification): diff --git a/content/drafts/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer.md b/content/drafts/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer.md index aabc07f..3e92a6e 100644 --- a/content/drafts/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer.md +++ b/content/drafts/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer.md @@ -101,7 +101,7 @@ Suppose our alien AI were to be informed that many of the human males seeking to What's the _usual_ reason for males to be obsessed with female bodies? -So, yeah. Basically, I think a _substantial majority_ of trans women under modern conditions in Western countries are, essentially, guys like me who were _less self-aware about what the thing actually is_. +So, yeah. Basically, I think a _substantial majority_ of trans women under modern conditions in Western countries are, essentially, guys like me who were _less self-aware about what the thing actually is_.
It's not an innate gender identity; it's a sexual orientation that's _surprisingly easy to misinterpret_ as a gender identity. So, I realize this is an inflammatory and (far more importantly) _surprising_ claim. Obviously, I don't have introspective access into other people's minds. If someone claims to have an internal sense of her own gender that doesn't match her assigned sex at birth, on what evidence could I possibly, _possibly_ have the astounding arrogance to reply, "No, I think you're really just a perverted male like me"? @@ -327,13 +327,11 @@ I say all this to emphasize just how much Yudkowsky's opinion meant to me. If yo (Incidentally, it was also around this time that I snuck a copy of _Men Trapped in Men's Bodies_ into the [MIRI](https://intelligence.org/) office library, which was sometimes possible for community members to visit. It seemed like something Harry Potter-Evans-Verres would do—and ominously, I noticed, not like something Hermione Granger would do.) -[TODO: Scott linked to Kay Brown as part of his links post and got pushback -https://slatestarcodex.com/2016/11/01/links-1116-site-unseen/ -https://slatestarscratchpad.tumblr.com/post/152736458066/hey-scott-im-a-bit-of-a-fan-of-yours-and-i] +In October 2016, I wrote about my frustrations to Scott Alexander of _Slate Star Codex_ fame (Subject: "J. Michael Bailey did nothing wrong"). The immediate result was that he included a link to one of Kay Brown's study summaries (about non-androphilic trans women having very high IQs) in his [November links post](https://slatestarcodex.com/2016/11/01/links-1116-site-unseen/), and he [got some pushback just for that](https://slatestarscratchpad.tumblr.com/post/152736458066/hey-scott-im-a-bit-of-a-fan-of-yours-and-i).
-[TODO: I posted to /r/gendercritical (post the full text in an ancillary page; it's currently in my "Collective Debt, Collective Shame" draft) +In late December 2016, I posted [an introductory message to the "Peak Trans" thread on /r/GenderCritical](/ancillary/what-i-said-to-r-gendercritical/). The first comment was "You are a predator." -The first comment was "You are a predator." ... I'm not sure what I was expecting. I spent part of Christmas Day crying.] +... I'm not sure what I was expecting. I spent part of Christmas Day crying. Gatekeeping sessions finished, I finally started HRT at the end of December 2016. In an effort to not let my anti–autogynephilia-denialism crusade take over my life, earlier that month, I [promised myself](/ancillary/a-broken-promise/) (and [published the SHA256 hash of the promise](https://www.facebook.com/zmdavis/posts/10154596054540199) to signal that I was Serious) not to comment on gender issues under my real name through June 2017—_that_ was what my new pseudonymous blog was for. diff --git a/content/drafts/if-clarity-seems-like-death-to-them.md b/content/drafts/if-clarity-seems-like-death-to-them.md index 66f2dae..67f52f9 100644 --- a/content/drafts/if-clarity-seems-like-death-to-them.md +++ b/content/drafts/if-clarity-seems-like-death-to-them.md @@ -23,7 +23,7 @@ Ben had [previously](http://benjaminrosshoffman.com/givewell-and-partial-funding I believed that there _was_ a real problem, but didn't feel like I had a good grasp on what it was specifically. Cultural critique is a fraught endeavor: if someone tells an outright lie, you can, maybe, with a lot of effort, prove that to other people, and get a correction on that specific point. (Actually, as we had just discovered, even that might be too much to hope for.) But _culture_ is the sum of lots and lots of little micro-actions by lots and lots of people.
If your _entire culture_ has visibly departed from the Way that was taught to you in the late 'aughts, how do you demonstrate that to people who, to all appearances, are acting like they don't remember the old Way, or like they don't think anything has changed, or like they notice some changes but think the new way is better? It's not as simple as shouting, "Hey guys, Truth matters!"—any ideologue or religious person would agree with _that_. It's not feasible to litigate every petty epistemic crime in something someone said, and if you tried, someone who thought the culture was basically on track could accuse you of cherry-picking. If "culture" is a real thing at all—and it certainly seems to be—we are condemned to grasp it unclearly, relying on the brain's pattern-matching faculties to sum over thousands of little micro-actions as a [_gestalt_](https://en.wiktionary.org/wiki/gestalt), rather than having the kind of robust, precise representation a well-designed AI could compute plans with. -Ben called the _gestalt_ he saw the Blight, after the rogue superintelligence in _A Fire Upon the Deep_: the problem wasn't that people were getting dumber; it's that there was locally coherent coordination away from clarity and truth and towards coalition-building, which was validated by the official narrative in ways that gave it a huge tactical advantage; people were increasingly making decisions that were better explained by their political incentives rather than acting on coherent beliefs about the world—using and construing claims about facts as moves in a power game, albeit sometimes subject to genre constraints under which only true facts were admissible moves in the game.
+Ben called the _gestalt_ he saw the Blight, after the rogue superintelligence in Vernor Vinge's _A Fire Upon the Deep_: the problem wasn't that people were getting dumber; it was that there was locally coherent coordination away from clarity and truth and towards coalition-building, which was validated by the official narrative in ways that gave it a huge tactical advantage; people were increasingly making decisions that were better explained by their political incentives rather than acting on coherent beliefs about the world—using and construing claims about facts as moves in a power game, albeit sometimes subject to genre constraints under which only true facts were admissible moves in the game. When I asked him for specific examples of MIRI or CfAR leaders behaving badly, he gave the example of [MIRI executive director Nate Soares posting that he was "excited to see OpenAI joining the space"](https://intelligence.org/2015/12/11/openai-and-other-news/), despite the fact that [_no one_ who had been following the AI risk discourse](https://slatestarcodex.com/2015/12/17/should-ai-be-open/) [thought that OpenAI as originally announced was a good idea](http://benjaminrosshoffman.com/openai-makes-humanity-less-safe/). Nate had privately clarified to Ben that the word "excited" wasn't necessarily meant positively, and in this case meant something more like "terrified."
diff --git a/content/pages/ancillary/what-i-said-to-r-gendercritical.md b/content/pages/ancillary/what-i-said-to-r-gendercritical.md new file mode 100644 index 0000000..495b811 --- /dev/null +++ b/content/pages/ancillary/what-i-said-to-r-gendercritical.md @@ -0,0 +1,31 @@ +Title: What I Said to /r/GenderCritical (December 2016) +Status: Hidden + +> Dear /r/gendercritical: +> +> So, I'm a man in the interesting position of simultaneously possessing what I now understand to be the same underlying psychological variation that motivates some males to become the kind of MtTs that justifiably draw the ire of gender-critical feminists, _and_ hitting peak trans. +> +> The psychologist Ray Blanchard proposed that male-to-female (... -to-"female") transsexuals come in two distinct types. So in one taxon, you have extremely feminine gay males who find they fit into society better as women rather than as anomalously feminine men. And in the other taxon, you have otherwise-mostly-ordinary men with an unusual sexual interest that Blanchard called _autogynephilia_ ("love of oneself as a woman") wherein they are erotically interested in the idea of having a female body, and over a period of years, gradually build up self-identity feelings around that image. The thing to appreciate here is that it's not _just_ a fetish! It's also a beautiful pure sacred self-identity feeling ... that, yes, happens to almost certainly be _causally related_ to the fetish. ["Men who love women and want to become what they love."](http://www.annelawrence.com/becoming_what_we_love.pdf) +> +> ... men like me. +> +> When I encountered the word _autogynephilia_ ten years ago at age 18, I immediately thought, _There's a word for it!_ I was actually surprised that it had been coined in the context of a theory of transsexualism; I wasn't _unhappy with my assigned gender_, because (like many of you) I was something of a gender-abolitionist at the time and didn't think gender roles should exist. It was just ...
my happy fantasy. I didn't have any _reason_ to come up with any ludicrous rationalizations that I was somehow _literally_ a girl in some unspecified metaphysical sense. +> +> But the Blanchard taxonomy did not seem to be the standard view, and (I soon learned) people get mad at you when you use the word _autogynephilia_ in a comment section, so I assumed that the _theory_ that autogynephilia could be a root cause of transsexualism was false, while continuing to be grateful that _there was a word_ for the beautiful feeling at the center of my life. +> +> And I spent the next ten years continuing to have the sorts of experiences that I guess pass for "gender dysphoria" (not wanting to identify with maleness or masculinity, preferring to identify with women if not femininity, growing a ponytail for symbolic reasons, trying to go by a gender-neutral nickname for a few years, feeling happy when someone assumed I was gay or "ma'am"ed me over the phone, _&c._), all the while thinking that I wasn't one of those people who are like, _actually trans_, because they claim to have gender identities, and I didn't know what a gender identity was supposed to feel like. I was just, you know, one of those guys who are pointedly insistent about not being _proud_ of the fact that they're guys, and who like to fantasize about things being different, all of this being (at a guess) probably related somehow to my erotic fantasies about having magical shapeshifting powers. +> +> ... and then, I moved to Berkeley. +> +> I met some _very interesting_ people whom I am _very jealous_ of. I talked to some of them. I did [some reading](https://sillyolme.wordpress.com/faq-on-the-science/). And ... it's starting to look like Blanchard was right. Most _actual trans women_ (MtTs in your terminology) are, in fact, guys like me who were _less self-aware about it_, who had all the same happy romantic fantasies about being a woman and then—somehow—_took them literally_ (!?!).
+> +> This revelation has left me with many conflicting feelings. +> +> So, cards on the table: _in itself_, I don't think autogynephilia is a bad thing. I think it's a _good_ thing. I think it's _made me a better person_. (I may not exactly be a _good_ person by radical feminist standards, but I'm probably better than I would have been if I had just been a _normal_ nerdy straight white guy without this obsessive need to identify with women.) I think people _should_ have the freedom to body-mod and choose their pronouns and have that be respected. +> +> But to exercise that freedom responsibly, I think it's important to be _realistic_ about what the underlying psychological mechanisms are, to be _realistic_ about what the existing technology can and cannot do, and to respect the interests of, you know, _actual women_, who might have legitimate reasons to want their own sports teams or music festivals without people like me around! +> +> The currently-existing trans rights _Zeitgeist_, insofar as it doesn't even want to admit that autogynephilia is a thing, does not seem realistic to me. I'm kind of upset about this! I've started a blog, [_The Scintillating But Ultimately Untrue Thought_](http://unremediatedgender.space/), where I intend to write about this and other gender issues. If it's alright with you-all, I may want to share some of my future posts on this sub? (I am sympathetic to many of the goals of gender-critical feminism, but am writing from my own idiosyncratic perspective; I am eager to contribute insofar as our interests overlap, but don't want to intrude in spaces where I am not wanted.) I remain, +> +> Critically yours, +> M. Taylor Saotome-Westlake diff --git a/notes/memoir-sections.md b/notes/memoir-sections.md index b5d6ab2..265280e 100644 --- a/notes/memoir-sections.md +++ b/notes/memoir-sections.md @@ -1,9 +1,9 @@ smaller TODO blocks— ✓ previous AI milestones [pt. 4] ✓ short timelines and politics [pt. 4] -_ vaccine joke [pt.
4] -_ Scott linked to Kay Brown [pt. 2] -_ posted to /r/gendercritical [pt. 2] +✓ vaccine joke [pt. 4] +✓ Scott linked to Kay Brown [pt. 2] +✓ /ancillary/what-i-said-to-r-gendercritical/ [pt. 2] _ the Death With Dignity era [pt. 4] _ social justice and defying threats [pt. 4] _ complicity and friendship [pt. 4] @@ -18,6 +18,10 @@ _ Michael Vassar and the Theory of Optimal Gossip _ psychiatric disaster With internet available— +_ archive.is Nov. 2016 Scott Tumblr pushback +_ check summary of Nov. 2016 link +_ Smallpox Eradication Day +_ privacy ask _ May 2019 thread vs. Vanessa _ historical accuracy of Galileo _ explain the "if the world were at stake" _Sword of Good_ reference better @@ -45,6 +49,8 @@ _ Anna's claim that Scott was a target specifically because he was good, my coun _ Yudkowsky's LW moderation policy far editing tier— +_ "no one else would have spoken" should have been a call-to-action to read more widely +_ explain who Kay Brown is _ mention Will MacAskill's gimped "longtermism" somehow _ re-read a DALL-E explanation and decide if I think it's less scary now _ Scott Aaronson on the blockchain of science https://scottaaronson.blog/?p=6821 @@ -114,6 +120,7 @@ _ update "80,000 words" refs with the near-final wordcount terms to explain on first mention— +_ Valinor _ "Caliphate" _ "rationalist" _ Center for Applied Rationality @@ -1966,4 +1973,4 @@ the generic experience is that the future is more capable but less aligned, and people from the past would envy our refrigeration, vaccines, infinite food, &c., but that doesn't mean they would regard our value-drifted-with-respect-to-them culture as superior paperclipping is just that turned up to 11 (well, 10¹¹) -Bostrom's apology for an old email—who is this written for?? Why get ahead, when you could just not comment? +Bostrom's apology for an old email—who is this written for?? Why get ahead, when you could just not comment? \ No newline at end of file