From: M. Taylor Saotome-Westlake Date: Mon, 22 Feb 2021 00:56:59 +0000 (-0800) Subject: "Sexual Dimorphism": "Tails"?—never heard of her X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=b836a21ee5182086cb2a8356bf6c236d941d76d7;p=Ultimately_Untrue_Thought.git "Sexual Dimorphism": "Tails"?—never heard of her --- diff --git a/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md b/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md index 720f2de..0aa6cf4 100644 --- a/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md +++ b/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md @@ -638,7 +638,7 @@ In that November 2018 Twitter thread, [Yudkowsky wrote](https://archive.is/y5V9i This seems to suggest that gender pronouns in the English language as currently spoken don't have effective truth conditions. I think this is false _as a matter of cognitive science_. If someone told you, "Hey, you should come meet my friend at the mall, she is really cool and I think you'll like her," and then the friend turned out to look like me (as I am now), _you would be surprised_. (Even if people in Berkeley would socially punish you for _admitting_ that you were surprised.) The "she ... her" pronouns would prompt your brain to _predict_ that the friend would appear to be female, and that prediction would be _falsified_ by someone who looked like me (as I am now). 
Pretending that the social-norms dispute is about chromosomes was a _bullshit_ [weakmanning](https://slatestarcodex.com/2014/05/12/weak-men-are-superweapons/) move on the part of Yudkowsky, [who had once written that](https://www.lesswrong.com/posts/qNZM3EGoE5ZeMdCRt/reversed-stupidity-is-not-intelligence) "[t]o argue against an idea honestly, you should argue against the best arguments of the strongest advocates[;] [a]rguing against weaker advocates proves _nothing_, because even the strongest idea will attract weak advocates." Thanks to the skills I learned from Yudkowsky's _earlier_ writing, I wasn't dumb enough to fall for it, but we can imagine someone otherwise similar to me who was, who might have thereby been misled into making worse life decisions. -[TODO: ↑ tone] +[TODO: ↑ soften tone, be more precise, including about "dumb enough to fall for it"] If this "rationality" stuff is useful for _anything at all_, you would _expect_ it to be useful for _practical life decisions_ like _whether or not I should cut my dick off_. @@ -662,12 +662,24 @@ No one seemed to notice at the time, but this characterization of our movement [ What I would have _hoped_ for from a systematically correct reasoning community worthy of the brand name is one goddamned place in the whole goddamned world where _good arguments_ would propagate through the population no matter where they arose, "guided by the beauty of our weapons" ([following Scott Alexander](https://slatestarcodex.com/2017/03/24/guided-by-the-beauty-of-our-weapons/) [following Leonard Cohen](https://genius.com/1576578)). 
-I think what actually happens is that people like Yudkowsky and Alexander rise to power on the strength of good arguments and entertaining writing (but mostly the latter), and then everyone else sort-of absorbs most of their worldview (plus noise and conformity with the local environment)—with the result that if Yudkowsky and Alexander _aren't interested in getting the right answer_ (in public)—because getting the right answer in public would be politically suicidal—then there's no way for anyone who didn't [win the talent lottery](https://slatestarcodex.com/2015/01/31/the-parable-of-the-talents/) to fix the public understanding by making better arguments. +Instead, I think what actually happens is that people like Yudkowsky and Alexander rise to power on the strength of good arguments and entertaining writing (but mostly the latter), and then everyone else sort-of absorbs most of their worldview (plus noise and conformity with the local environment)—with the result that if Yudkowsky and Alexander _aren't interested in getting the right answer_ (in public)—because getting the right answer in public would be politically suicidal—then there's no way for anyone who didn't [win the talent lottery](https://slatestarcodex.com/2015/01/31/the-parable-of-the-talents/) to fix the public understanding by making better arguments. -It makes sense for public figures to not want to commit political suicide! +It makes sense for public figures to not want to commit political suicide! Even so, it's a _problem_ if public figures whose brand is premised on the ideal of _systematically correct reasoning_ end up drawing attention and resources into a subculture that's optimized for tricking men into cutting their dick off on false pretenses. 
(Although note that Alexander has [specifically disclaimed aspirations or pretensions to being a "rationalist" authority figure](https://slatestarcodex.com/2019/07/04/some-clarifications-on-rationalist-blogging/); that fate befell him without his consent because he's just too good and prolific of a writer compared to everyone else.) -[TODO: risk factor of people getting drawn in to a subculture that claims to be about reasoning, but is actualy very heavily optimized for cutting boys dicks off. -We already get a lot of shit for being right-wing; People use trans as political cover; +I'm not optimistic about the problem being fixable, either. Our robot cult _already_ gets a lot of shit from progressive-minded people for being "right-wing"—not because we are in any _useful_, non-gerrymandered sense, but because [attempts to achieve the map that reflects the territory are going to run afoul of ideological taboos for almost any ideology](https://www.lesswrong.com/posts/DoPo4PDjgSySquHX8/heads-i-win-tails-never-heard-of-her-or-selective-reporting). + +Because of the particular historical moment in which we live, we end up facing pressure from progressives, because—whatever our _object-level_ beliefs about (say) [sex, race, and class differences](/2020/Apr/book-review-human-diversity/)—and however much many of us would prefer not to talk about them—on the _meta_ level, our creed requires us to admit _it's an empirical question_, not a moral one—and that [empirical questions have no privileged reason to admit convenient answers](https://www.lesswrong.com/posts/sYgv4eYH82JEsTD34/beyond-the-reach-of-god). + +I view this conflict as entirely incidental, nothing to do with American politics or "the left" in particular. 
In a Christian theocracy, our analogues would get in trouble for beliefs about evolution; in the old Soviet Union, our analogues would get in trouble for [thinking about market economics](https://slatestarcodex.com/2014/09/24/book-review-red-plenty/) (as a positive technical discipline, adjacent to game theory, not yoked to a particular normative agenda). + +https://www.scottaaronson.com/blog/?p=3376 +https://slatestarcodex.com/2017/10/23/kolmogorov-complicity-and-the-parable-of-lightning/ + +[Leeroy Jenkins](https://en.wikipedia.org/wiki/Leeroy_Jenkins) Option. + +People use trans as political cover; + +[It would have been harder to recruit me] https://twitter.com/yashkaf/status/1275524303430262790 > "Who cares about a blog for male nerd know-it-alls?" @@ -675,9 +687,8 @@ https://twitter.com/yashkaf/status/1275524303430262790 https://www.scottaaronson.com/blog/?p=5310 -> Another thing that would've complicated the picture: the rationalist community’s legendary openness to alternative gender identities and sexualities +> Another thing that would've complicated the picture: the rationalist community’s legendary openness to alternative gender identities and sexualities] -] Someone asked me: "If we randomized half the people at [OpenAI](https://openai.com/) to use trans pronouns one way, and the other half to use it the other way, do you think they would end up with significantly different productivity?" diff --git a/notes/sexual-dimorphism-in-the-sequences-notes.md b/notes/sexual-dimorphism-in-the-sequences-notes.md index 02be39a..9dc36c4 100644 --- a/notes/sexual-dimorphism-in-the-sequences-notes.md +++ b/notes/sexual-dimorphism-in-the-sequences-notes.md @@ -77,6 +77,10 @@ Normal straight men also have positive-valence thoughts about women when they're NYT hit piece https://archive.is/0Ghdl +----- + +(write script to extract links, with useful argparse, incl. 
dropping into a Python shell) + ------ no safe defense https://www.lesswrong.com/posts/wustx45CPL5rZenuo/no-safe-defense-not-even-science
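
The notes file's TODO describes a link-extraction script "with useful argparse, incl. dropping into a Python shell". A minimal standard-library sketch of what that might look like (the function and flag names here are my own invention, not anything in the repository):

```python
# Hypothetical sketch of the link-extraction script mentioned in the notes.
# Assumes Markdown inline links `[text](url)` plus bare http(s) URLs.
import argparse
import code
import re

# `(?<!\()` keeps the bare-URL pattern from re-matching URLs that sit
# inside a Markdown link's parentheses.
MARKDOWN_LINK = re.compile(r"\[[^\]]*\]\((https?://[^\s)]+)\)")
BARE_URL = re.compile(r"(?<!\()https?://[^\s)\]]+")


def extract_links(text):
    """Return the URLs appearing in `text`, in order, without duplicates."""
    seen = []
    for match in MARKDOWN_LINK.finditer(text):
        url = match.group(1)
        if url not in seen:
            seen.append(url)
    for match in BARE_URL.finditer(text):
        url = match.group(0)
        if url not in seen:
            seen.append(url)
    return seen


def main():
    parser = argparse.ArgumentParser(
        description="Extract links from Markdown drafts."
    )
    parser.add_argument("files", nargs="*", help="Markdown files to scan")
    parser.add_argument(
        "-i", "--interact", action="store_true",
        help="drop into a Python shell with `links` bound",
    )
    args = parser.parse_args()
    links = []
    for path in args.files:
        with open(path) as f:
            links.extend(extract_links(f.read()))
    for url in links:
        print(url)
    if args.interact:
        # The "dropping into a Python shell" part of the TODO.
        code.interact(local={"links": links})


if __name__ == "__main__":
    main()
```

Invoked as `python extract_links.py draft.md -i`, it would print the draft's URLs and then leave you at an interactive prompt with `links` in scope.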