From: M. Taylor Saotome-Westlake Date: Sun, 13 Nov 2022 03:03:29 +0000 (-0800) Subject: memoir: trans as shield, email leak X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=f4b907f77e03b13ce7f00ef7953e3327566aa67f;p=Ultimately_Untrue_Thought.git memoir: trans as shield, email leak --- diff --git a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md index b5f4b40..a0a27f0 100644 --- a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md +++ b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md @@ -9,7 +9,7 @@ Status: draft > > _Atlas Shrugged_ by Ayn Rand -[TODO: Sasha disaster, breakup with Vassar group, this was really bad for me] +[TODO: psychiatric disaster, breakup with Vassar group, this was really bad for me] On 13 February 2021, ["Silicon Valley's Safe Space"](https://archive.ph/zW6oX), the _New York Times_ piece on _Slate Star Codex_ came out. It was ... pretty lame? (_Just_ lame, not a masterfully vicious hit piece.) Metz did a mediocre job of explaining what our robot cult is about, while [pushing hard on the subtext](https://scottaaronson.blog/?p=5310) to make us look racist and sexist, occasionally resorting to odd constructions that are surprising to read from someone who has been a professional writer for decades. ("It was nominally a blog", Metz wrote of _Slate Star Codex_. ["Nominally"](https://en.wiktionary.org/wiki/nominally)?) The article's claim that Alexander "wrote in a wordy, often roundabout way that left many wondering what he really believed" seemed to me more like a critique of the "many"'s reading comprehension, rather than Alexander's writing. @@ -27,27 +27,55 @@ But the sense in which Alexander "aligned himself with Murray" in ["Three Great It _is_ a weirdly brazen invalid _inference_. 
But by calling it a "falsehood", Alexander heavily implies this means he disagrees with Murray's offensive views on race: in invalidating the _Times_'s charge of guilt-by-association with Murray, Alexander validates Murray's guilt.

-But ... anyone who's actually read _and understood_ Scott's work should be able to infer that Scott probably finds genetically-mediated group differences plausible (as a value-free matter of empirical Science with no particular normative implications): his [review of Judith Rich Harris](https://archive.ph/Zy3EL) indicates that he accepts the evidence from twin studies for individual behavioral differences having a large genetic component, and section III. of his ["The Atomic Bomb Considered As Hungarian High School Science Fair Project"](https://slatestarcodex.com/2017/05/26/the-atomic-bomb-considered-as-hungarian-high-school-science-fair-project/) indicates that he accepts genetics as an explantion for group differences in intelligence (in the case of Ashkenazi Jews).
+But ... anyone who's actually read _and understood_ Scott's work should be able to infer that Scott probably finds genetically-mediated group differences plausible (as a value-free matter of empirical Science with no particular normative implications): his [review of Judith Rich Harris](https://archive.ph/Zy3EL) indicates that he accepts the evidence from twin studies for individual behavioral differences having a large genetic component, and section III. of his ["The Atomic Bomb Considered As Hungarian High School Science Fair Project"](https://slatestarcodex.com/2017/05/26/the-atomic-bomb-considered-as-hungarian-high-school-science-fair-project/) indicates that he accepts genetics as an explanation for group differences (in the particular case of Ashkenazi Jewish intelligence).
There are a lot of standard caveats that go here that Scott would no doubt scrupulously address if he ever chose to tackle the subject of genetically-mediated group differences in general: [the mere existence of a group difference in a "heritable" trait doesn't itself imply a genetic cause of the group difference (because the groups' environments could also be different)](/2020/Apr/book-review-human-diversity/#heritability-caveats). It is without a doubt _entirely conceivable_ that the Ashkenazi IQ advantage is real and genetic, but the black–white gap is fake and environmental.[^bet] Moreover, group averages are just that—averages. They don't imply anything about individuals and don't justify discrimination against individuals.

-[^bet]: It's just—how much do you want to bet on that? How much do you think _Scott_ wants to bet on that?
+[^bet]: It's just—how much do you want to bet on that? How much do you think _Scott_ wants to bet?

But ... anyone who's actually read _and understood_ Charles Murray's work knows that Murray _also_ includes the standard caveats! (Even though the one about group differences not implying anything about individuals is [actually](/2020/Apr/book-review-human-diversity/#individuals-should-not-be-judged-by-the-average) [wrong](/2022/Jun/comment-on-a-scene-from-planecrash-crisis-of-faith/).) The _Times_'s insinuation that Scott Alexander is a racist _like Charles Murray_ seems like a "[Gettier](https://en.wikipedia.org/wiki/Gettier_problem) attack": the charge is essentially correct, even though the evidence used to justify the charge to distracted _New York Times_ readers is completely bogus.

Why do I keep bringing this up, that "rationalist" leaders almost certainly believe in cognitive race differences (even if it's hard to get them to publicly admit it in a form that's easy for _New York Times_ readers to decode)?
-Because one of the things I noticed while trying to make sense of why my entire social circle suddenly decided in 2016 that guys like me could become women by means of saying so, is that in the conflict between the "rationalist" Caliphate and mainstream progressives, the "rationalists"' defensive strategy is one of deception. The _New York Times_ accuses us of being racists like Charles Murray. Instead of pointing out that being a racist _like Charles Murray_ is the obviously correct position that sensible people will tend to reach by being sensible, we disingenuously deny everything. (Or rather, people are distributed on a spectrum between disingenuously denying everything and sincerly accepting that Charles Murray is Actually Bad, with the older and more skilled among us skewed more towards disingenuous denial.) +Because one of the things I noticed while trying to make sense of why my entire social circle suddenly decided in 2016 that guys like me could become women by means of saying so, is that in the conflict between the "rationalist" Caliphate and mainstream progressives, the "rationalists"' defensive strategy is one of deception. + +Because of the particular historical moment in which we live, we end up facing pressure from progressives, because—whatever our _object-level_ beliefs about (say) [sex, race, and class differences](/2020/Apr/book-review-human-diversity/)—and however much many of us would prefer not to talk about them—on the _meta_ level, our creed requires us to admit _it's an empirical question_, not a moral one—and that [empirical questions have no privileged reason to admit convenient answers](https://www.lesswrong.com/posts/sYgv4eYH82JEsTD34/beyond-the-reach-of-god). + +I view this conflict as entirely incidental, something that [would happen in some form in any place and time](https://www.lesswrong.com/posts/cKrgy7hLdszkse2pq/archimedes-s-chronophone), rather than having to do with American politics or "the left" in particular. 
In a Christian theocracy, our analogues would get in trouble for beliefs about evolution; in the old Soviet Union, our analogues would get in trouble for [thinking about market economics](https://slatestarcodex.com/2014/09/24/book-review-red-plenty/) (as a [positive technical discipline](https://en.wikipedia.org/wiki/Fundamental_theorems_of_welfare_economics#Proof_of_the_first_fundamental_theorem) adjacent to game theory, not yoked to a particular normative agenda).[^logical-induction]
+
+[^logical-induction]: I sometimes wonder how hard it would have been to come up with MIRI's logical induction result (which describes an asymptotic algorithm for estimating the probabilities of mathematical truths in terms of a betting market of increasingly complex traders) in the Soviet Union.
+
+Incidental or not, the conflict is real, and everyone smart knows it—even if it's not easy to _prove_ that everyone smart knows it, because everyone smart is very careful what they say in public. (I am not smart.)
+
+So the _New York Times_ implicitly accuses us of being racists, like Charles Murray. Instead of pointing out that being a racist _like Charles Murray_ is the obviously correct position that sensible people will tend to reach in the course of being sensible, we disingenuously deny everything.[^deny-everything]
+
+[^deny-everything]: Or rather, people are distributed on a spectrum between disingenuously denying everything and sincerely accepting that Charles Murray is Actually Bad, with the older and more skilled among us skewed somewhat more towards disingenuous denial.

It works surprisingly well. I fear my love of Truth is not so great that if I didn't have Something to Protect, I would have happily participated in the cover-up.

-[TODO: explain the strategy whereby people are using being pro-trans as their progressive "dump stat"—maybe this (and maybe some of the above) should slot in after the discussion of Yudkowsky's post on our haters being Bad?]
+As it happens, in our world, the defensive cover-up consists of _throwing me under the bus_. Facing censure from the egregore for being insufficiently progressive, we can't defend ourselves ideologically. (_We_ think we're egalitarians, but progressives won't buy that because we like markets too much.) We can't point to our racial diversity. (Mostly white if not Jewish, with a scattering of Asians.) The sex balance is doing a little better after hybridizing with Tumblr and Effective Altruism (as [contrasted with the _Overcoming Bias_ days](/2017/Dec/a-common-misunderstanding-or-the-spirit-of-the-staircase-24-january-2009/)), but still isn't great.
+
+But _trans!_ We do have plenty of trans people to trot out as a shield! [Jacob Falkovich noted](https://twitter.com/yashkaf/status/1275524303430262790) (on 23 June 2020, just after _Slate Star Codex_ went down), "The two demographics most over-represented in the SlateStarCodex readership according to the surveys are transgender people and Ph.D. holders." Scott Aaronson [cited (in commentary on the _Times_ article)](https://www.scottaaronson.com/blog/?p=5310) "the rationalist community's legendary openness to alternative gender identities and sexualities" as something that would have "complicated the picture" of our portrayal as anti-feminist.
+
+Even the _haters_ grudgingly give Alexander credit for "... Not Man for the Categories": ["I strongly disagree that one good article about accepting transness means you get to walk away from writing that is somewhat white supremacist and quite fascist without at least awknowledging you were wrong"](https://archive.is/SlJo1), wrote one.
+
+Under these circumstances, dethroning the supremacy of gender identity ideology is politically impossible. All our Overton margin is already being spent somewhere else; sanity on this topic is our [dump stat](https://tvtropes.org/pmwiki/pmwiki.php/Main/DumpStat).
But this being the case, _I have no remaining reason to participate in the cover-up_. What's in it for me?

On 17 February 2021, Topher Brennan [claimed on Twitter that](https://web.archive.org/web/20210217195335/https://twitter.com/tophertbrennan/status/1362108632070905857) Scott Alexander "isn't being honest about his history with the far-right", and published [an email he had received from Scott in 2014](https://emilkirkegaard.dk/en/2021/02/backstabber-brennan-knifes-scott-alexander-with-2014-email/), on what Scott thought some neoreactionaries were getting importantly right.

-I think to people who have actually read _and understood_ Scott's work, there is nothing at all surprising or scandalous about the contents of this email.
+I think to people who have actually read _and understood_ Scott's work, there is nothing particularly surprising or scandalous about the contents of the email.

Scott says that biologically-mediated group differences are probably real, and that neoreactionaries are the only people discussing the object-level hypotheses _or_ the meta-level question of why our Society's collective epistemology is falling down on this. He says that reactionaries as a whole generate a lot of garbage, but that he trusts himself to sift through the noise and extract the novel insights. (In contrast, RationalWiki didn't generate garbage, but by hewing so closely to the mainstream, it also didn't say much that Scott didn't already know.)
+
+The email contains details that Scott hadn't already blogged about—most notably the section on "My behavior is the most appropriate response to these facts", explaining his social strategizing—but none of it is really _surprising_ if you actually know Scott from his writing.
+
+I think the main reason someone _would_ consider the email a scandalous revelation is if they hadn't read _Slate Star Codex_ that deeply—if their picture of Scott Alexander as a political writer was, "that guy who's _so_ committed to charity and discourse that he [wrote up an explanation of what _reactionaries_ (of all people) believe](https://slatestarcodex.com/2013/03/03/reactionary-philosophy-in-an-enormous-planet-sized-nutshell/)—and then, of course, turned around and wrote up the definitive explanation of why they're wrong and you shouldn't pay them any attention." As a first approximation, it's not a bad picture. But what it misses—what _Scott_ knows—is that charity isn't about putting on a show of superficially respecting your ideological opponent, before concluding that they were wrong and you were right all along in every detail. Charity is about seeing what the other guy is getting _right_.
+
+The same day, Yudkowsky published a Facebook post, which said:
+
+> I feel like it should have been obvious to anyone at this point that anybody who openly hates on this community generally or me personally is probably also a bad person inside and has no ethics and will hurt you if you trust them and will break rules to do so; but in case it wasn't obvious, consider the point made explicitly. (Subtext: Topher Brennan. Do not provide any link in comments to Topher's publication of private emails, explicitly marked as private, from Scott Alexander.)
+
+In response to comments, Yudkowsky edited the post several times to clarify that he perceived an obvious distinction between hate and heated criticism. The next day, frustrated at how the discussion seemed to be ignoring the obvious political angle.

-[...]
+[TODO Brennan leak discussion cont'd ...]

...
except that Yudkowsky reopened the conversation in February 2021, with [a new Facebook post](https://www.facebook.com/yudkowsky/posts/10159421750419228) explaining the origins of his intuitions about pronoun conventions and concluding that, "the simplest and best protocol is, '"He" refers to the set of people who have asked us to use "he", with a default for those-who-haven't-asked that goes by gamete size' and to say that this just _is_ the normative definition. Because it is _logically rude_, not just socially rude, to try to bake any other more complicated and controversial definition _into the very language protocol we are using to communicate_."

@@ -405,6 +433,7 @@ https://twitter.com/ESYudkowsky/status/1096769579362115584

[TODO section existential stakes, cooperation
* so far, I've been writing this from the perspective of caring about _rationality_ and wanting there to be a rationality movement, the common interest of many causes
+ * Singularity stuff scares me
* e.g., as recently as 2020 I was daydreaming about working for an embryo selection company as part of the "altruistic" (about optimizing the future, rather than about my own experiences) component of my actions
* if you have short timelines, and want to maintain influence over what big state-backed corporations are doing, self-censoring about contradicting the state religion makes sense
* you could tell a story in which I'm the villain for undermining Team Singularity with my petty temporal concerns
diff --git a/notes/memoir-sections.md b/notes/memoir-sections.md
index 4ab8a07..fafa1a6 100644
--- a/notes/memoir-sections.md
+++ b/notes/memoir-sections.md
@@ -1,7 +1,7 @@ battle stations—
✓ outline NYT defense/prelude to pronouns
-_ flesh out prelude to pronouns I
-_ flush out prelude to pronouns II
+✓ flesh out prelude to pronouns I
+- flesh out prelude to pronouns II
_ outlining existential stakes and social justice FDT
_ outlining Dolphin War II
_ flesh out existential stakes and social justice FDT
@@ -13,9 +13,12 @@ _ Michael Vassar and the Theory of Optimal Gossip _ Sasha disaster With internet available— +_ check original wording of Brennan denunciation +_ link "Anti-Reactionary FAQ" +_ RationalWiki link (is that still a thing?) +_ logical induction link _ dolphin thread also referenced Georgia on trees, which also cited "Not Man for the Categories" _ commit patch URL for my slate_starchive script -_ double-check on exact pre-edit wording of Brennan denunciation _ Wentworth specifically doesn't want people's John-models to overwrite their own models _ footnote Charles Murray's caveats _ record of Yudkowsky citing TDT as part of decision to prosecute Emerson? @@ -106,6 +109,7 @@ _ notice the symmetry where _both_ E and I want to partition the discussion with _ contract-drafting em, SSC blogroll is most of my traffic _ "Common Interest of Many Causes" and "Kolmogorov Complicity" offer directly contradictory strategies _ Vassar's about-face on gender +_ risk of people bouncing off progressivism @@ -242,22 +246,11 @@ I'm not optimistic about the problem being fixable, either. Our robot cult _alre ---- -Because of the particular historical moment in which we live, we end up facing pressure from progressives, because—whatever our _object-level_ beliefs about (say) [sex, race, and class differences](/2020/Apr/book-review-human-diversity/)—and however much many of us would prefer not to talk about them—on the _meta_ level, our creed requires us to admit _it's an empirical question_, not a moral one—and that [empirical questions have no privileged reason to admit convenient answers](https://www.lesswrong.com/posts/sYgv4eYH82JEsTD34/beyond-the-reach-of-god). - -I view this conflict as entirely incidental, something that [would happen in some form in any place and time](https://www.lesswrong.com/posts/cKrgy7hLdszkse2pq/archimedes-s-chronophone), rather than having to do with American politics or "the left" in particular. 
In a Christian theocracy, our analogues would get in trouble for beliefs about evolution; in the old Soviet Union, our analogues would get in trouble for [thinking about market economics](https://slatestarcodex.com/2014/09/24/book-review-red-plenty/) (as a [positive technical discipline](https://en.wikipedia.org/wiki/Fundamental_theorems_of_welfare_economics#Proof_of_the_first_fundamental_theorem) adjacent to game theory, not yoked to a particular normative agenda).
-
-Incidental or not, the conflict is real, and everyone smart knows it—even if it's not easy to _prove_ that everyone smart knows it, because everyone smart is very careful what they say in public. (I am not smart.)
-
(which Alexander aptly renamed [Kolmogorov complicity](https://slatestarcodex.com/2017/10/23/kolmogorov-complicity-and-the-parable-of-lightning/):
-
-
Because of the conflict, and because all the prominent high-status people are running a Kolmogorov Option strategy, and because we happen to have a _wildly_ disproportionate number of _people like me_ around, I think being "pro-trans" ended up being part of the community's "shield" against external political pressure, of the sort that perked up after [the February 2021 _New York Times_ hit piece about Alexander's blog](https://archive.is/0Ghdl). (The _magnitude_ of heat brought on by the recent _Times_ piece and its aftermath was new, but the underlying dynamics had been present for years.)

-Jacob Falkovich noted (on 23 June 2020, just after _Slate Star Codex_ went down), ["The two demographics most over-represented in the SlateStarCodex readership according to the surveys are transgender people and Ph.D.
holders."](https://twitter.com/yashkaf/status/1275524303430262790) [Scott Aaronson noted (in commentary on the _Times_ article)](https://www.scottaaronson.com/blog/?p=5310) "the rationalist community's legendary openness to alternative gender identities and sexualities" as something that would have "complicated the picture" of our portrayal as anti-feminist. - -Even the _haters_ grudgingly give Alexander credit for "... Not Man for the Categories": ["I strongly disagree that one good article about accepting transness means you get to walk away from writing that is somewhat white supremacist and quite fascist without at least awknowledging you were wrong."](https://archive.is/SlJo1) Given these political realities, you'd think that I _should_ be sympathetic to the Kolmogorov Option argument, which makes a lot of sense. _Of course_ all the high-status people with a public-facing mission (like building a movement to prevent the coming robot apocalypse) are going to be motivatedly dumb about trans stuff in public: look at all the damage [the _other_ Harry Potter author did to her legacy](https://en.wikipedia.org/wiki/Politics_of_J._K._Rowling#Transgender_people).