From: M. Taylor Saotome-Westlake
Date: Tue, 16 Feb 2021 07:24:33 +0000 (-0800)
Subject: Presidential doldrums, cont'd
X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=e9dce8f6fb4c7cd9690fe6aaf6477f5d5daf565d;p=Ultimately_Untrue_Thought.git

Presidential doldrums, cont'd
---

diff --git a/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md b/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md
index 780d5ad..f273577 100644
--- a/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md
+++ b/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md
@@ -624,7 +624,11 @@ _Was_ it a "political" act for me to write about the cognitive function of categ
 _Everyone else shot first_. That's not just my subjective perspective; the timestamps back me up here: my ["... To Make Predictions"](/2018/Feb/the-categories-were-made-for-man-to-make-predictions/) (February 2018) was a _response to_ Alexander's ["... Not Man for the Categories"](https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/) (November 2014). My philosophy-of-language work on the robot-cult blog (April 2019–January 2021) was (stealthily) _in response to_ Yudkowsky's November 2018 Twitter thread. When I started trying to talk about autogynephilia with all my robot cult friends in 2016, I _did not expect_ to get dragged into a multi-year philosophy-of-language crusade! That was just _one branch_ of the argument-tree that, once begun, I thought should be easy to _definitively settle in public_ (within our robot cult, whatever the _general_ public thinks).
 
-I guess by now the branch is as close to settled as it's going to get? Alexander ended up [adding an edit note to the end of "... Not Man to the Categories" in December 2019](https://archive.is/1a4zV#selection-805.0-817.1), and Yudkowsky would [go on to clarify his position on the philosophy of language in September 2020](https://www.facebook.com/yudkowsky/posts/10158853851009228). So, that's nice. But I will confess to being quite disappointed that the public argument-tree evaluation didn't get much further, much faster? The thing you have understand about this whole debate is—
+I guess by now the branch is as close to settled as it's going to get? Alexander ended up [adding an edit note to the end of "... Not Man for the Categories" in December 2019](https://archive.is/1a4zV#selection-805.0-817.1), and Yudkowsky would [go on to clarify his position on the philosophy of language in September 2020](https://www.facebook.com/yudkowsky/posts/10158853851009228). So, that's nice.
+
+[TODO: although I think even with the note, in practice, people are going to keep citing "... Not Man for the Categories" in a way that doesn't understand how the note undermines the main point]
+
+But I will confess to being quite disappointed that the public argument-tree evaluation didn't get much further, much faster? The thing you have to understand about this whole debate is—
 
 _I need the correct answer in order to decide whether or not to cut my dick off_. As I've said, I _currently_ believe that cutting my dick off would be a _bad_ idea. But that's a cost–benefit judgement call based on many _contingent, empirical_ beliefs about the world. I'm obviously in the general _reference class_ of males who are getting their dicks cut off these days, and a lot of them seem to be pretty happy about it!
 I would be much more likely to go through with transitioning if I believed different things about the world—if I thought my beautiful pure sacred self-identity thing were a brain-intersex condition, or if I still believed in my teenage psychological-sex-differences denialism (such that there would be _axiomatically_ no worries about fitting with "other" women after transitioning), or if I were more optimistic about the degree to which HRT and surgeries approximate an actual sex change.
@@ -654,18 +658,29 @@ In ["The Ideology Is Not the Movement"](https://slatestarcodex.com/2016/04/04/th
 Alexander jokingly identifies the identifying feature of our robot cult as being the belief that "Eliezer Yudkowsky is the rightful caliph": the Sequences were a rallying flag that brought together a lot of like-minded people to form a subculture with its own ethos and norms—among which Alexander includes "don't misgender trans people"—but the subculture emerged as its own entity that isn't necessarily _about_ anything outside itself.
 
-No one seemed to notice at the time, but this characterization of our movement [is actually a _declaration of failure_](https://sinceriously.fyi/cached-answers/#comment-794). There's a word, "rationalist", that I've been trying to avoid in this post, because it's the subject of so much strategic equivocation, where the motte is "anyone who studies the ideal of systematically correct reasoning, general methods of thought that result in true beliefs and successful plans", and the bailey is "members of our social scene centered around Eliezer Yudkowsky and Scott Alexander". (Since I don't think we deserve the "rationalist" brand name, I had to choose something else to refer to the social scene. Hence, "robot cult.")
+No one seemed to notice at the time, but this characterization of our movement [is actually a _declaration of failure_](https://sinceriously.fyi/cached-answers/#comment-794). There's a word, "rationalist", that I've been trying to avoid in this post, because it's the subject of so much strategic equivocation, where the motte is "anyone who studies the ideal of systematically correct reasoning, general methods of thought that result in true beliefs and successful plans", and the bailey is "members of our social scene centered around Eliezer Yudkowsky and Scott Alexander". (Since I don't think we deserve the "rationalist" brand name, I had to choose something else to refer to [the social scene](https://srconstantin.github.io/2017/08/08/the-craft-is-not-the-community.html). Hence, "robot cult.")
+
+What I would have _hoped_ for from a systematically correct reasoning community worthy of the brand name is one goddamned place in the whole goddamned world where _good arguments_ would propagate through the population no matter where they arose, "guided by the beauty of our weapons" ([following Scott Alexander](https://slatestarcodex.com/2017/03/24/guided-by-the-beauty-of-our-weapons/) [following Leonard Cohen](https://genius.com/1576578)).
 
-What I would have _hoped_ for from a systematically correct reasoning community worthy of the brand name is a world where _good arguments_ would propagate through the population no matter where they arose, "guided by the beauty of our weapons" ([following Scott Alexander](https://slatestarcodex.com/2017/03/24/guided-by-the-beauty-of-our-weapons/) [following Leonard Cohen](https://genius.com/1576578)).
-
-I think what actually happens is that people like Yudkowsky and Alexander rise to power on the strength of good arguments and entertaining writing (but mostly the latter), and then everyone else sort-of absorbs most of their worldview (plus noise and conformity with the local environment)—with the result that if Yudkowsky and Alexander _aren't interested in getting the right answer_ (in public), then there's no way for anyone who didn't [win the talent lottery](https://slatestarcodex.com/2015/01/31/the-parable-of-the-talents/) to fix the public understanding.
+I think what actually happens is that people like Yudkowsky and Alexander rise to power on the strength of good arguments and entertaining writing (but mostly the latter), and then everyone else sort-of absorbs most of their worldview (plus noise and conformity with the local environment)—with the result that if Yudkowsky and Alexander _aren't interested in getting the right answer_ (in public)—because getting the right answer in public would be politically suicidal—then there's no way for anyone who didn't [win the talent lottery](https://slatestarcodex.com/2015/01/31/the-parable-of-the-talents/) to fix the public understanding by making better arguments.
+
+It makes sense for public figures to not want to commit political suicide!
 
 [TODO: risk factor of people getting drawn in to a subculture that claims to be about reasoning, but is actualy very heavily optimized for cutting boys dicks off.
-People use trans as political cover;
-https://srconstantin.github.io/2017/08/08/the-craft-is-not-the-community.html
+We already get a lot of shit for being right-wing; People use trans as political cover;
+
+https://twitter.com/yashkaf/status/1275524303430262790
+> "Who cares about a blog for male nerd know-it-alls?"
+> The two demographics most over-represented in the SlateStarCodex readership according to the surveys are transgender people and Ph.D. holders.
+https://www.scottaaronson.com/blog/?p=5310
+> Another thing that would’ve complicated the picture: the rationalist community’s legendary openness to alternative gender identities and sexualities, before
+- ]
+
 Someone asked me: "If we randomized half the people at [OpenAI](https://openai.com/) to use trans pronouns one way, and the other half to use it the other way, do you think they would end up with significantly different productivity?" But the thing I'm objecting to is a lot more fundamental than the specific choice of pronoun convention, which obviously isn't going to be uniquely determined. Turkish doesn't have gender pronouns, and that's fine. Naval ships traditionally take feminine pronouns in English, and it doesn't confuse anyone into thinking boats have a womb. [Many other languages are much more gendered than English](https://en.wikipedia.org/wiki/Grammatical_gender#Distribution_of_gender_in_the_world's_languages) (where pretty much only third-person singular pronouns are at issue). The conventions used in one's native language probably _do_ [color one's thinking to some extent](/2020/Dec/crossing-the-line/)—but when it comes to that, I have no reason to expect the overall design of English grammar and vocabulary "got it right" where Spanish or Russian "got it wrong."
diff --git a/notes/notes.txt b/notes/notes.txt
index 3e88984..d57eab0 100644
--- a/notes/notes.txt
+++ b/notes/notes.txt
@@ -1895,9 +1895,6 @@ https://twitter.com/KirinDave/status/1275647936194654208
 
 > And finally, I strongly disagree that one good article about accepting trans-ness means you get to walk away from writing that is somewhat white supremacist and quite fascist without at least awknowledging you were wrong. And using it for defense.
 
-https://twitter.com/yashkaf/status/1275524303430262790
-> "Who cares about a blog for male nerd know-it-alls?"
-> The two demographics most over-represented in the SlateStarCodex readership according to the surveys are transgender people and Ph.D. holders.
 
 Two mentions in https://www.reddit.com/r/slatestarcodex/comments/hhy2yc/what_would_you_put_in_the_essential_ssc_collection/
 
@@ -2400,3 +2397,6 @@ https://patch.com/illinois/evanston/northwestern-neuroscientist-found-dead-after
 https://www.scientificamerican.com/article/its-time-to-take-the-penis-off-its-pedestal/
 
 AGP testimony https://www.youtube.com/watch?v=FDE9H8i2Vw0
+
+"A Typology of Gender Detransition and Its Implications for Healthcare Providers"
+https://www.tandfonline.com/doi/full/10.1080/0092623X.2020.1869126
diff --git a/notes/sexual-dimorphism-in-the-sequences-notes.md b/notes/sexual-dimorphism-in-the-sequences-notes.md
index 293aa26..3e1bcc8 100644
--- a/notes/sexual-dimorphism-in-the-sequences-notes.md
+++ b/notes/sexual-dimorphism-in-the-sequences-notes.md
@@ -1,5 +1,4 @@
 Sections I should be able to easily make progress-per-minute on—
-* Ideology is not the Movement
 * Dr. Will Powers
 * I Don't Do Policy
 * Cara's Egregore
@@ -18,6 +17,10 @@ Terminology to explain before use—
 * Singularity/paperclip
 * "non-exclusively androphilic" / nonhomosexual
+* too big to fail
+
+* I need to put this behind me
+
 * not a theory of trans men
 * [the autogynephilic analogue of romantic love](/papers/lawrence-becoming_what_we_love.pdf)