From: M. Taylor Saotome-Westlake
Date: Mon, 28 Oct 2019 06:09:48 +0000 (-0700)
Subject: at least poking at "'I Tell Myself' ..."
X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=9b1fe4f645dca79bbdc0cbafb15c0618d4b71bbe;p=Ultimately_Untrue_Thought.git

at least poking at "'I Tell Myself' ..."

The pain won't go away until you write about it.
---

diff --git a/content/drafts/i-tell-myself-to-let-the-story-end-or-a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/i-tell-myself-to-let-the-story-end-or-a-hill-of-validity-in-defense-of-meaning.md
index f388d3e..91673b5 100644
--- a/content/drafts/i-tell-myself-to-let-the-story-end-or-a-hill-of-validity-in-defense-of-meaning.md
+++ b/content/drafts/i-tell-myself-to-let-the-story-end-or-a-hill-of-validity-in-defense-of-meaning.md
@@ -14,37 +14,35 @@ Status: draft

>
> —Sara Bareilles, ["Gonna Get Over You"](https://genius.com/Sara-bareilles-gonna-get-over-you-lyrics)

-I haven't been doing so well for a lot of the past ten months or so. I mean, I've always been a high-neuroticism person, but this has probably been a below-average year even by my standards, with hours of lost sleep, occasional crying bouts, _many, many_ hours of obsessive ruminating-while-pacing instead of doing my dayjob, and too long with a Sara Bareilles song on loop to numb the pain. I've been reluctant to write about it in too much detail for poorly-understood psychological reasons. Maybe it would feel too much like attacking my friends?
+I haven't been doing so well for a lot of the last ... um, year. I mean, I've always been a high-neuroticism person, but this has probably been a below-average year even by my standards, with hours of lost sleep, occasional crying bouts, _many, many_ hours of obsessive ruminating-while-pacing instead of doing my dayjob, and too long with a Sara Bareilles song on loop to numb the pain. I've been reluctant to write about it in too much detail for poorly-understood psychological reasons.
Maybe it would feel too much like attacking my friends? But this blog is not about _not_ attacking my friends. This blog is about the truth. For my own sanity, for my own emotional closure, I need to tell the story as best I can. If it's an _incredibly boring and petty_ story about me getting _unreasonably angry_ about philosophy-of-language minutiæ, well, you've been warned. If the story makes me look bad in the reader's eyes (because you think I'm crazy for getting so unreasonably angry about philosophy-of-language minutiæ), then I shall be happy to look bad for _what I actually am_. (If _telling the truth_ about what I've been obsessively preoccupied with all year makes you dislike me, then you probably _should_ dislike me. If you were to approve of me on the basis of _factually inaccurate beliefs_, then the thing of which you approve, wouldn't be _me_.) -So, I've spent basically my entire adult life in this insular little intellectual subculture that was founded in the late 'aughts on an ideal of _systematically correct reasoning_. Starting with the shared canon of knowledge of cognitive biases, reflectivity, and Bayesian probability theory bequeathed to us by our founder, _we_ were going to make serious [collective](https://www.lesswrong.com/posts/XqmjdBKa4ZaXJtNmf/raising-the-sanity-waterline) [intellectual progress](https://www.lesswrong.com/posts/Nu3wa6npK4Ry66vFp/a-sense-that-more-is-possible) in a way that had [never been done before](https://slatestarcodex.com/2017/04/07/yes-we-have-noticed-the-skulls/)—and not just out of a duty towards some philosophical ideal of Truth, but as a result of _understanding how intelligence works_—the reduction of "thought" to _cognitive algorithms_. Intelligent systems that construct predictive models of the world around them—that have "true" "beliefs"—can _use_ those models to compute which actions will best achieve their goals. 
+So, I've spent basically my entire adult life in this insular little intellectual subculture that was founded in the late 'aughts on an ideal of _systematically correct reasoning_. Starting with the shared canon of knowledge of [cognitive biases](https://www.lesswrong.com/posts/jnZbHi873v9vcpGpZ/what-s-a-bias-again), [reflectivity](https://www.lesswrong.com/posts/TynBiYt6zg42StRbb/my-kind-of-reflection), and [Bayesian probability theory](http://yudkowsky.net/rational/technical/) bequeathed to us by our founder, the [Great Teacher](http://yudkowsky.net/rational/virtues/), _we_ were going to make serious [collective](https://www.lesswrong.com/posts/XqmjdBKa4ZaXJtNmf/raising-the-sanity-waterline) [intellectual progress](https://www.lesswrong.com/posts/Nu3wa6npK4Ry66vFp/a-sense-that-more-is-possible) in a way that had [never been done before](https://slatestarcodex.com/2017/04/07/yes-we-have-noticed-the-skulls/)—and [not just out of a duty towards some philosophical ideal of Truth](https://www.lesswrong.com/posts/XqvnWFtRD2keJdwjX/the-useful-idea-of-truth), but as a result of _understanding how intelligence works_—[the reduction of "thought"](https://www.lesswrong.com/posts/p7ftQ6acRkgo6hqHb/dreams-of-ai-design) to [_cognitive algorithms_](https://www.lesswrong.com/posts/HcCpvYLoSFP4iAqSz/rationality-appreciating-cognitive-algorithms). Intelligent systems that construct predictive models of the world around them—that have "true" "beliefs"—can _use_ those models to compute which actions will best achieve their goals.

(Oh, and there was also [this part about](https://intelligence.org/files/AIPosNegFactor.pdf) how [the entire future of humanity and the universe depended on](https://www.lesswrong.com/posts/GNnHHmm8EzePmKzPk/value-is-fragile) our figuring out how to reflect human values in a recursively self-improving artificial superintelligence. That part's complicated.)

I guess I feel pretty naïve now, but—I _actually believed our own propaganda_.
I _actually thought_ we were doing something new and special of historical and possibly even _cosmological_ significance. -This does not seem remotely credible to me any more. I should explain. _Not_ because I expect anyone to actually read this petty Diary-like post, much less change their mind about anything because of it. I should explain for my own mental health. For closure. The sooner I manage to get the Whole Dumb Story _written down_, the sooner I can stop grieving and _move on with my life_. (However many decades that turns out to be. The part about superintelligence eventually destroying the world still seems right; it's just the part about there existing a systematically-correct-reasoning community that seems fake now.) +This does not seem remotely credible to me any more. I should explain. _Not_ because I expect anyone to actually read this petty Diary-like post, much less change their mind about anything because of it. I should explain for my own mental health. For closure. The sooner I manage to get the Whole Dumb Story _written down_, the sooner I can stop grieving and _move on with my life_. (However many decades that turns out to be. The part about superintelligence eventually destroying the world still seems right; it's just the part about there existing a systematically-correct-reasoning community poised to help save it that seems fake now.) -(A _secondary_ reason for explaining, is that it could _possibly_ function as a useful warning to the next guy to end up in an analogous situation of trusting the branded systematically-correct-reasoning community to actually be interested in doing systematically correct reasoning, and incurring a lot of wasted effort and pain desperately trying to correct the situation. But I don't know how common that is.) 
+(A _secondary_ reason for explaining, is that it could _possibly_ function as a useful warning to the next guy to end up in an analogous situation of trusting the branded systematically-correct-reasoning community to actually be interested in doing systematically correct reasoning, and incurring a lot of wasted effort and pain [making an extraordinary effort](https://www.lesswrong.com/posts/GuEsfTpSDSbXFiseH/make-an-extraordinary-effort) to [try to](https://www.lesswrong.com/posts/XqvnWFtRD2keJdwjX/the-useful-idea-of-truth) correct the situation. But I don't know how common that is.) I fear the explanation requires some personal backstory about me. I ... almost don't want to tell the backstory, because the thing I've been upset about all year is that I thought a systematically-correct-reasoning community should be able to correct a _trivial_ philosophy-of-language mistake which has nothing to do with me, and it was pretty frustrating when some people seemed to ignore the literal content of my careful very narrowly-scoped knockdown philosophy-of-language argument, and dismiss me with, "Oh, you're just upset about your personal thing (which doesn't matter)." So part of me is afraid that such a person reading the parts of this post that are about the ways in which I _am_, in fact, _really upset_ about my personal thing (which I _don't_ expect anyone else to care about), might take it as vindication that they were correct to be dismissive of my explicit philosophical arguments (which I _did_ expect others to take seriously). -But I shouldn't let that worry control what I write in _this_ post, because _this_ post isn't about making arguments that might convince anyone of anything: I _already_ made my arguments elsewhere, and it _didn't work_. _This_ post is about telling the story about that, so that I can finish grieving for the systematically-correct-reasoning community that I _thought_ I had, but which turned out to be a gaudy delusion. 
+But I shouldn't let that worry control what I write in _this_ post, because _this_ post isn't about making arguments that might convince anyone of anything: I _already_ made my arguments elsewhere, and it _didn't work_. _This_ post is about telling the story about that, so that I can finish grieving for the systematically-correct-reasoning community that I _thought_ I had.

-So. Here it is. The Whole Dumb Story about my obsessive special interest and how I lost faith in the alleged systematically-correct-reasoning community. I don't know why you would want to read this, but I need to write it.
-
-Ever since I was thirteen years old—
+So. Some backstory about me. Ever since I was thirteen years old—

(and I _really_ didn't expect to be blogging about this eighteen years later)

-(I _still_ don't want to be blogging about this, but somebody has to and no one else will)
+(I _still_ don't want to be blogging about this, but it actually turns out to be relevant to the story about trying to correct a philosophy-of-language mistake)

—my _favorite_—and basically only—masturbation fantasy has always been some variation on me getting magically transformed into a woman. I ... want to write more about the phenomenology of this, some time. I don't think the details are important here.

So, there was that erotic thing, which I was pretty ashamed of (at least, at first), and _of course_ never told a single soul. (It would have been about three years after the fantasy started before I even worked up the bravery to tell my Diary about it, in the addendum to entry number 53 on 8 March 2005.)

-But within a couple years, I also developed this beautiful pure sacred self-identity thing, where I was also having a lot of thoughts about being a girl. Just—little day-to-day thoughts. Like when I would write in my pocket notebook as my female analogue.
Or when I would practice swirling the descenders on all the lowercase letters that had descenders [(_g_, _j_, _p_, _z_)](TODO: linky "jazzy puppy" demo image) because I thought my handwriting would look more feminine.

+But within a couple years, I also developed this beautiful pure sacred self-identity thing, where I was also having a lot of thoughts about being a girl. Just—little day-to-day thoughts. Like when I would write in my pocket notebook as my female analogue. Or when I would practice swirling the descenders on all the lowercase letters that had descenders [(_g_, _j_, _p_, _z_)](TODO: linky "jazzy puppy" demo image) because I thought my handwriting would look more feminine.

[TODO: another anecdote]

Now, of course I had _heard of_ there being such a thing as transsexualism.

@@ -84,23 +82,16 @@ Similarly, the Popular Author himself has written extensively about [the noncentral

[...]

-You see the problem. If "You can't define a word any way you want" is a good philosophy lesson, it should be a good philosophy lesson _independently_ of the particular word in question and _independently_ of the current year.
-
-
-
-If we've _learned something new_ about the philosophy of language in the last ten years
-
-
+You see the problem. If "You can't define a word any way you want" is a good philosophy lesson, it should be a good philosophy lesson _independently_ of the particular word in question and _independently_ of the current year. If we've _learned something new_ about the philosophy of language in the last ten years, that's _really interesting_ and I want to know what it is! This is _basic shit_. As we say locally, this is _basic Sequences shit_.

[...]

-A friend tells me that the ori
+A friend tells me that I'm delusional to expect so much from "the community", that the original vision _never_ included tackling politically sensitive subjects.
(I remember this friend recommending Paul Graham's ["What You Can't Say"](http://www.paulgraham.com/say.html) back in 'aught-nine, with the suggestion to take Graham's advice to figure out what you can't say, and then don't say it.) -we did not realize that _whether I should cut my dick off_ would become a politicized issue. +Perhaps so. But back in 2009, we did not realize that _whether or not I should cut my dick off_ would _become_ a politicized issue. -To be fair, it's not obvious that I _shouldn't_ cut my dick off! A lot of people seem to be doing it nowadays, and a lot of them seem to be pretty happy with their decision! But in order to _decide_ whether it's a good idea, I need _accurate information_. I need an _honest_ accounting of the costs and benefits of transition, so that I can cut my dick off in the possible worlds where that's a good idea, and not cut my dick off in the possible worlds where that's not a good idea. +To be fair, it's not obvious that I _shouldn't_ cut my dick off! A lot of people seem to be doing it nowadays, and a lot of them seem to be pretty happy. But in order to _decide_ whether to join them, I need _accurate information_. I need an _honest_ accounting of the costs and benefits of transition, so that I can cut my dick off in the possible worlds where that's a good idea, and not cut my dick off in the possible worlds where it's not a good idea. 
-actively manufacture _fake rationality lessons_ that have been optimized to confuse me into cutting my dick off _independently_ of whether or not we live in a world +And if the community whose marketing literature says they're all about systematically correct reasoning, is not only not going to be helpful at producing accurate information, but is furthermore going _actively manufacture fake rationality lessons_ that have been optimized to _confuse me into cutting my dick off_ independently of whether or not we live in one of the possible worlds where cutting my dick off is a good idea, then that community is _fraudulent_. It needs to either _rebrand_—or failing that, _disband_—or failing that, _be destroyed_. diff --git a/notes/i-tell-myself-notes.txt b/notes/i-tell-myself-notes.txt index 64fcf7b..d581e5c 100644 --- a/notes/i-tell-myself-notes.txt +++ b/notes/i-tell-myself-notes.txt @@ -4,14 +4,14 @@ OUTLINE * I haven't been doing so well, and I need to tell the story for my own sanity * I spent my entire adult life in "rationality", and I actually believed * In 2016, it was a huge shock to realize that I could be trans, too (I thought AGP was a different thing), and making this less confusing for other people seemed in line with the rationality mission -* slowly waking up from sex-differences denialim through LessWrong +* slowly waking up from sex-differences denialim through LessWrong: "Changing Emotions", "Failed Utopia #4-2" * It was pretty traumatizing when it turned out not to be! 
* But I notice people kept bringing up this "Categories are arbitrary, therefore it's not wrong to insist that TWAW", and that's _definitely_ wrong; that, I knew I could win
* But then Eliezer did it, too, and I _flipped the fuck out_, and set out on a mission to try to get this shit settled in public
* Theory of jurisprudence, standing, rudeness; Outside View of bad-faith nitpickers
* When email didn't work (details redacted), I thought, "Oh, it's probably because of the politics", so I wrote up the completely general version with examples about dolphins and job titles and Müllerian mimicry in snakes
* And this is _still_ being perceived as too political, even though everyone else shot first?!?!
-* And I can't object without looking
+* And I can't object without looking like a status grab
* What I think is going on. People want to stay on the good side of the Blue Egregore, and that means they can't even defend the _basics_ if there's _any_ political context, or even the context of a _person_ with political cooties
* I don't know what the optimal play is ("pretend that political constraints don't exist" might not actually work in the real world), but this is pretty bad for our collective sanity, and my mental health, and I wish we could at least try to deal with it on the meta level
* I'm politically constrained, too: I don't talk about race differences even though I believe them (link to apophasis/hypocritical humor)
@@ -368,3 +368,8 @@ M.L. Morris "Vocational Interests in the United States" d=1.7 on occupational pr

playing for scraps vs. playing for keeps https://twitter.com/DarrenJBeattie/status/1151902363059392512

You can't optimize your group's culture for not-talking-about-atheism without also optimizing against understanding Occam's razor; you can't optimize for not questioning gender self-identity without also optimizing against understanding "A Human's Guide to Words."
+
+I didn't have any reason to _invent the hypotheses_ that I had some undiagnosed brain-intersex condition, or that I was literally a girl in some unspecified metaphysical sense.
+
+Men who wish they were women do not particularly resemble actual women! We just—don't? This seems pretty obvious, really?
+