From: M. Taylor Saotome-Westlake Date: Sat, 1 May 2021 03:11:41 +0000 (-0700) Subject: poking towards the finish ... X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=52de244e98eb7e4444a5dcdd0d8c242444dcf730;p=Ultimately_Untrue_Thought.git poking towards the finish ... --- diff --git a/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md b/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md index 0cf99c1..3bbc60d 100644 --- a/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md +++ b/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md @@ -160,11 +160,11 @@ Yudkowsky dramatized the implications in a short story, ["Failed Utopia #4-2"](h Of course no one _wants_ that—our male protagonist doesn't _want_ to abandon his wife and daughter for some catgirl-adjacent (if conscious) hussy. But humans _do_ adapt to loss; if the separation were already accomplished by force, people would eventually move on, and post-separation life with companions superintelligently optimized _for you_ would ([_arguendo_](https://en.wikipedia.org/wiki/Arguendo)) be happier than life with your real friends and family, whose goals will sometimes come into conflict with yours because they weren't superintelligently designed _for you_. -The alignment-theory morals are those of [unforseen maxima](https://arbital.greaterwrong.com/p/unforeseen_maximum) and [edge instantiation](https://arbital.greaterwrong.com/p/edge_instantiation). An AI designed to maximize happiness would kill all humans and tile the galaxy with maximally-efficient happiness-brainware. 
If this sounds "crazy" to you, that's the problem with anthropomorphism I was telling you about: [don't imagine "AI" as an emotionally-repressed human](https://www.lesswrong.com/posts/zrGzan92SxP27LWP9/points-of-departure), just think about [a machine that calculates what actions would result in what outcomes](https://web.archive.org/web/20071013171416/http://www.singinst.org/blog/2007/06/11/the-stamp-collecting-device/), and does the action that would result in the outcome that maximizes some function. It turns out that picking a function that doesn't kill everyone looks hard. Just tacking on the constaints that you can think of (make the _existing_ humans happy without tampering with their minds) [will tend to produce similar "crazy" outcomes that you didn't think to exclude](https://arbital.greaterwrong.com/p/nearest_unblocked).
+The alignment-theory morals are those of [unforeseen maxima](https://arbital.greaterwrong.com/p/unforeseen_maximum) and [edge instantiation](https://arbital.greaterwrong.com/p/edge_instantiation). An AI designed to maximize happiness would kill all humans and tile the galaxy with maximally-efficient happiness-brainware. If this sounds "crazy" to you, that's the problem with anthropomorphism I was telling you about: [don't imagine "AI" as an emotionally-repressed human](https://www.lesswrong.com/posts/zrGzan92SxP27LWP9/points-of-departure), just think about [a machine that calculates what actions would result in what outcomes](https://web.archive.org/web/20071013171416/http://www.singinst.org/blog/2007/06/11/the-stamp-collecting-device/), and does the action that would result in the outcome that maximizes some function. It turns out that picking a function that doesn't kill everyone looks hard. 
Just tacking on the constraints that you can think of (like making the _existing_ humans happy without tampering with their minds) [will tend to produce similar "crazy" outcomes that you didn't think to exclude](https://arbital.greaterwrong.com/p/nearest_unblocked).

At the time, [I expressed horror](https://www.greaterwrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2/comment/PhiGnX7qKzzgn2aKb) at "Failed Utopia #4-2" in the comments section, because my quasi-religious psychological-sex-differences denialism required that I be horrified. But looking back a dozen years later—[or even four years later](https://www.greaterwrong.com/posts/ctpkTaqTKbmm6uRgC/failed-utopia-4-2/comment/D34jhYBcaoE7DEb8d)—my performative horror was missing the point.

-_The argument makes sense_. Of course, it's important to notice that you'd need an additional [handwave](https://tvtropes.org/pmwiki/pmwiki.php/Main/HandWave) to explain why the AI in the story doesn't give every _individual_ their separate utopia—if existing women and men aren't optimal partners for each other, so too are individual men not optimal same-sex friends for each other. A faithful antisexist (as I was) might insist that that should be the _only_ moral, as it implies the other [_a fortiori_](https://en.wikipedia.org/wiki/Argumentum_a_fortiori). But if you're trying to _learn about reality_ rather than protect your fixed quasi-religious beliefs, it should be _okay_ for one of the lessons to get a punchy sci-fi short story; it should be _okay_ to think about the hyperplane between two coarse clusters, even while it's simultaneously true that a set of hyperplanes would suffice to [shatter](https://en.wikipedia.org/wiki/Shattered_set) every individual point, without deigning to acknowledge the existence of clusters.
+_The argument makes sense_. 
Of course, it's important to notice that you'd need an additional [handwave](https://tvtropes.org/pmwiki/pmwiki.php/Main/HandWave) to explain why the AI in the story doesn't give every _individual_ their separate utopia—if existing women and men aren't optimal partners for each other, so too are individual men not optimal same-sex friends for each other. A faithful antisexist (as I was) might insist that that should be the _only_ moral, as it implies the other [_a fortiori_](https://en.wikipedia.org/wiki/Argumentum_a_fortiori). But if you're trying to _learn about reality_ rather than protect your fixed quasi-religious beliefs, it should be _okay_ for one of the lessons to get a punchy sci-fi short story; it should be _okay_ to think about the hyperplane between two coarse clusters, even while it's simultaneously true that you could build a wall around every individual point, without deigning to acknowledge the existence of clusters. On my reading of the text, it is _significant_ that the AI-synthesized complements for men are given their own name, the _verthandi_ (presumably after [the Norse deity](https://en.wikipedia.org/wiki/Ver%C3%B0andi)), rather than just being referred to as women. The _verthandi_ may _look like_ women, they may be _approximately_ psychologically human, but since the _detailed_ psychology of "superintelligently-engineered optimal romantic partner for a human male" is not going to come out of the distribution of actual human females, judicious exercise of the [tenth virtue of precision](http://yudkowsky.net/rational/virtues/) demands that a _different word_ be coined for this hypothetical science-fictional type of person. Calling the _verthandi_ "women" would be _worse writing_; it would _fail to communicate_ the impact of what has taken place in the story. @@ -208,7 +208,7 @@ Do I want the magical transformation technology to fix all that, too? 
Do I have _any idea_ what it would even _mean_ to fix all that, without spending multiple lifetimes studying neuroscience?

-I think I have just enough language to _start_ to talk about what it would mean. Since sex isn't an atomic attribute, but rather a high-level statistical regularity such that almost everyone can be cleanly classified as "female" or "male" _in terms of_ lower-level traits (genitals, hormone levels, _&c._), then, abstractly, we're trying to take points from male distribution and map them onto the female distribution in a way that preserves as much structure (personal identity) as possible. My female analogue doesn't have a penis (because then she wouldn't be female), but she is going to speak American English like me and be [85% Ashkenazi like me](/images/ancestry_report.png), because language and autosomal genes don't have anything to do with sex.
+I think I have just enough language to _start_ to talk about what it would mean. Since sex isn't an atomic attribute, but rather a high-level statistical regularity such that almost everyone can be cleanly classified as "female" or "male" _in terms of_ lower-level traits (genitals, hormone levels, _&c._), then, abstractly, we're trying to take points from the male distribution and map them onto the female distribution in a way that preserves as much structure (personal identity) as possible. My female analogue doesn't have a penis like me (because then she wouldn't be female), but she is going to speak American English like me and be [85% Ashkenazi like me](/images/ancestry_report.png), because language and autosomal genes don't have anything to do with sex.

The hard part has to do with traits that are meaningfully sexually dimorphic, but not as a discrete dichotomy—where the sex-specific universal designs differ in ways that are _subtler_ than the presence or absence of entire reproductive organs. 
(Yes, I know about [homology](https://en.wikipedia.org/wiki/Homology_(biology))—and _you_ know what I meant.) We are _not_ satisfied if the magical transformation technology swaps out my penis and testicles for a functioning female reproductive system without changing the rest of my body, because we want the end result to be indistinguishable from having been drawn from the female distribution (at least, indistinguishable _modulo_ having my memories of life as a male before the magical transformation), and a man-who-somehow-magically-has-a-vagina doesn't qualify.

@@ -236,9 +236,11 @@ Daphna Joel _et al._ [argue](https://www.pnas.org/content/112/50/15468) [that](h

Sex differences in the brain are like sex differences in the skeleton: anthropologists can tell female and male skeletons apart (the [pelvis is shaped differently](https://johnhawks.net/explainer/laboratory/sexual-dimorphism-pelvis), for obvious reasons), and [machine-learning models can see very reliable differences that human radiologists can't](/papers/yune_et_al-beyond_human_perception_sexual_dimorphism_in_hand_and_wrist_radiographs.pdf), but neither sex has entire _bones_ that the other doesn't, and the same is true of brain regions. (The evopsych story about complex adaptations being universal-up-to-sex suggests that sex-specific bones or brain regions should be _possible_, but in a bit of _relative_ good news for antisexism, apparently evolution didn't need to go that far. Um, in humans—a lot of other mammals actually have [a penis bone](https://en.wikipedia.org/wiki/Baculum).)

-Maybe this should just look like supplementary Statistics Details brushed over some basic facts of human existence that everyone knows? I'm a pretty weird guy, in more ways than one. I am not prototypically masculine. Most men are not like me. If I'm allowed to cherry-pick what measurements to take, I can name ways in which my mosaic is more female-typical than male-typical. 
(For example, I'm _sure_ I'm above the female mean in [Big Five Neuroticism](https://en.wikipedia.org/wiki/Big_Five_personality_traits).) ["[A] weakly negative correlation can be mistaken for a strong positive one with a bit of selective memory."](https://www.lesswrong.com/posts/veN86cBhoe7mBxXLk/categorizing-has-consequences) But "weird" represents a much larger space of possibilities than "normal", much as [_nonapples_ are a less cohesive category than _apples_](https://www.lesswrong.com/posts/2mLZiWxWKZyaRgcn7/selling-nonapples). If you _sum over_ all of my traits, everything that makes me, _me_—it's going to be a point in the _male_ region of the existing, unremediated, genderspace. In the course of _being myself_, I'm going to do more male-typical things than female-typical things, not becuase I'm _trying_ to be masculine (I'm not), and not because I "identify as" male (I don't—or I wouldn't, if someone could give me a straight answer as to what this "identifying as" operation is supposed to consist of), but because I literally in-fact am male in the same sense that male chimpanzees or male mice are male, whether or not I like it (I don't—or I wouldn't, if I still believed that preference was coherent), and whether or not I _notice_ all the little details that implies (I almost certainly don't). +Maybe this should just look like supplementary Statistics Details brushed over some basic facts of human existence that everyone knows? I'm a pretty weird guy, in more ways than one. I am not prototypically masculine. Most men are not like me. If I'm allowed to cherry-pick what measurements to take, I can name ways in which my mosaic is more female-typical than male-typical. (For example, I'm _sure_ I'm above the female mean in [Big Five Neuroticism](https://en.wikipedia.org/wiki/Big_Five_personality_traits).) 
["[A] weakly negative correlation can be mistaken for a strong positive one with a bit of selective memory."](https://www.lesswrong.com/posts/veN86cBhoe7mBxXLk/categorizing-has-consequences) -Okay, maybe I'm _not_ completely over my teenage religion of psychological sex differences denialism?—that belief still feels uncomfortable to put my weight on. I would _prefer_ to believe that there are women who are relevantly "like me" with respect to some fair (not gerrymandered) metric on personspace. But, um ... it's not completely obvious whether I actually know any? (Well, maybe two or three.) When I look around me—most of the people in my robot cult (and much more so if you look the core of old-timers from the _Overcoming Bias_ days, rather than the greater Berkeley "community" of today) are male. Most of the people in my open-source programming scene are male. These days, [most of the _women_](/2020/Nov/survey-data-on-cis-and-trans-women-among-haskell-programmers/) in [my open-source programming scene](/2017/Aug/interlude-vii/) are male. Am I not supposed to _notice_? +But "weird" represents a much larger space of possibilities than "normal", much as [_nonapples_ are a less cohesive category than _apples_](https://www.lesswrong.com/posts/2mLZiWxWKZyaRgcn7/selling-nonapples): a woman trapped in a man's body would be weird, but it doesn't follow that weird men are secretly women, as opposed to some other, _specific_, kind of weird. If you _sum over_ all of my traits, everything that makes me, _me_—it's going to be a point in the _male_ region of the existing, unremediated, genderspace. 
In the course of _being myself_, I'm going to do more male-typical things than female-typical things, not because I'm _trying_ to be masculine (I'm not), and not because I "identify as" male (I don't—or I wouldn't, if someone could give me a straight answer as to what this "identifying as" operation is supposed to consist of), but because I literally in-fact am male in the same sense that male chimpanzees or male mice are male, whether or not I like it (I don't—or I wouldn't, if I still believed that preference was coherent), and whether or not I _notice_ all the little details that implies (I almost certainly don't).
+
+Okay, maybe I'm _not_ completely over my teenage religion of psychological sex differences denialism?—that belief still feels uncomfortable to put my weight on. I would _prefer_ to believe that there are women who are relevantly "like me" with respect to some fair (not gerrymandered) metric on personspace. But, um ... it's not completely obvious whether I actually know any? (Well, maybe two or three.) When I look around me—most of the people in my robot cult (and much more so if you look at the core of old-timers from the _Overcoming Bias_ days, rather than the greater "community" of today) are male. Most of the people in my open-source programming scene are male. These days, [most of the _women_](/2020/Nov/survey-data-on-cis-and-trans-women-among-haskell-programmers/) in [my open-source programming scene](/2017/Aug/interlude-vii/) are male. Am ... am I not supposed to _notice_?

Is _everyone else_ not supposed to notice? Suppose I got the magical body transformation (with no brain mods beyond the minimum needed for motor control). Suppose I caught the worshipful attention of a young man just like I used to be ("a" young man, as if there wouldn't be _dozens_), who privately told me, "I've never met a woman quite like you." What would I be supposed to tell him? 
["There's a _reason_ for that"](https://www.dumbingofage.com/2014/comic/book-5/01-when-somebody-loved-me/purpleandskates/)? @@ -256,7 +258,7 @@ I think everything is a system ... because I'm male?? (Or whatever the appropriate generalization of "because" is for statistical group differences. The sentence "I'm 5′11″ because I'm male" doesn't seem quite right, but it's pointing to something real.) -I could _assert_ that it's all down to socialization and stereotyping and self-fulfilling prophecies—and I know that _some_ of it is. (Self-fulfilling prophecies [are coordination equilibria](/2020/Jan/book-review-the-origins-of-unfairness/).) But I still want to speculate that the nature of my X factor—the things about my personality that let me write the things I do even though I'm [objectively not that smart](/images/wisc-iii_result.jpg) compared to some of my robot-cult friends—is a pattern of mental illness that could realistically only occur in males. (Yudkowsky: ["It seems to me that male teenagers especially have something like a _higher cognitive temperature_, an ability to wander into strange places both good and bad."](https://www.lesswrong.com/posts/xsyG7PkMekHud2DMK/of-gender-and-rationality)) +I could _assert_ that it's all down to socialization and stereotyping and self-fulfilling prophecies—and I know that _some_ of it is. (Self-fulfilling prophecies [are coordination equilibria](/2020/Jan/book-review-the-origins-of-unfairness/).) But I still want to speculate that the nature of my X factor—the things about my personality that let me write the _specific_ things I do even though I'm [objectively not that smart](/images/wisc-iii_result.jpg) compared to some of my robot-cult friends—is a pattern of mental illness that could realistically only occur in males. 
(Yudkowsky: ["It seems to me that male teenagers especially have something like a _higher cognitive temperature_, an ability to wander into strange places both good and bad."](https://www.lesswrong.com/posts/xsyG7PkMekHud2DMK/of-gender-and-rationality)) I can _imagine_ that all the gaps will vanish after the revolution. I can imagine it, but I can no longer _assert it with a straight face_ because _I've read the literature_ and can tell you several observations about chimps and [congenital adrenal hyperplasia](/images/cah_diffs_table.png) that make that seem _relatively unlikely_. @@ -282,7 +284,7 @@ Of course same-sex siblings would _also_ be different pictures. (Identical twins Okay. Having supplied just enough language to _start_ to talk about what it would even mean to actually become female—is that what I _want_? -I've just explained that, _in principle_, it could be done, so you might think there's no _conceptual_ problem with the idea of changing sex, in the same sense that there's nothing _conceptually_ wrong with Jules Verne's [pair](https://en.wikipedia.org/wiki/From_the_Earth_to_the_Moon) of [novels](https://en.wikipedia.org/wiki/Around_the_Moon) about flying around the moon. There are lots of technical rocket-science details that Verne didn't and couldn't have known about in the 1860s, but the _basic idea_ was sound, and [actually achieved a hundred years later](https://en.wikipedia.org/wiki/Apollo_8). So why is it in any way is it _relevant_ that making the magical transformation fantasy real would be technically complicated? +I've just explained that, _in principle_, it could be done, so you might think there's no _conceptual_ problem with the idea of changing sex, in the same sense that there's nothing _conceptually_ wrong with Jules Verne's [pair](https://en.wikipedia.org/wiki/From_the_Earth_to_the_Moon) of [novels](https://en.wikipedia.org/wiki/Around_the_Moon) about flying around the moon. 
There are lots of technical rocket-science details that Verne didn't and couldn't have known about in the 1860s, but the _basic idea_ was sound, and [actually achieved a hundred years later](https://en.wikipedia.org/wiki/Apollo_8). So why is it in any way _relevant_ that making the magical transformation fantasy real would be technically complicated?

It's relevant insofar as the technical details change your evaluation of the desirability of _what_ is to be accomplished, which can differ from [what sounds like good news in the moment of being first informed](https://www.lesswrong.com/posts/pK4HTxuv6mftHXWC3/prolegomena-to-a-theory-of-fun). So, I mean, if it's reversible, I would definitely be extremely eager to _try_ it ...

@@ -304,7 +306,7 @@ I guess I _want_ to be "a normal [...] man wearing a female body like a suit of

Is that weird? Is that wrong?

-Okay, yes, it's _obviously_ weird and wrong, but should I care more about not being weird and wrong, than I do about my deepest most heartfelt desire that I've thought about every day for the last eighteen years?
+Okay, yes, it's _obviously_ weird and wrong, but should I care more about not being weird and wrong, than I do about my deepest most heartfelt desire that I've thought about every day for the last nineteen years?

This is probably counterintuitive if you haven't been living with it your entire adult life? People have _heard of_ the "born in the wrong body" narrative, which makes intuitive sense: if female souls are designed to work female bodies, and you're a female soul tethered to a male body, you can imagine the soul finding the mismatch distressing and wanting to fix it. But if, as I'm positing for my case, there _is no mismatch_ in any objective sense, then where does the desire come from? How do you make sense of wanting to change physiological sex, for reasons that _don't_ have anything to do with already neurologically resembling that sex? What's really going on there, psychologically? 

@@ -332,7 +334,7 @@ Two, my fantasies about having a female body aren't particularly, um, discrimina

Three, the thought of being transformed into a _different_ male body, other than my own, _is_ repulsive. Perhaps less so in the sense that thinking about it is horrifying, and more that I _can't_ think about it—my imagination "bounces off" the idea before any [Body Horror](https://tvtropes.org/pmwiki/pmwiki.php/Main/BodyHorror) emotions can kick in.

-These details seem hard to square with gender identity theories: why is my own male body, and _only_ my own male body, seem "okay"? Whereas this is exactly what you would expect from the "male sexuality getting confused about a self–other distinction" story: I want to be all different sorts of women (and not men) for the same reason ordinary straight guys [want to fuck all different sorts of women](https://link.springer.com/article/10.1007/s10508-020-01730-x) (and not men).
+These details seem hard to square with gender identity theories: why does my own male body, and _only_ my own male body, seem "okay"? Whereas this is exactly what you would expect from the "male sexuality getting confused about a self–other distinction" story: I want to be transformed into all different sorts of women for the same reason ordinary straight guys [want to fuck all different sorts of women](https://link.springer.com/article/10.1007/s10508-020-01730-x), and I can't even entertain the idea of being transformed into other men for the same reason ordinary straight guys can't even entertain the idea of fucking other men.

An interesting prediction of this story is that if the nature of the "confusion", this—["erotic target location error"](/papers/lawrence-etle_an_underappreciated.pdf)?—is agnostic to the object of sexual attraction, then you should see the same pattern in men with unusual sexual interests. 
("Men" because I think we legitimately want to be [shy about generalizing across sexes](/papers/bailey-what_is_sexual_orientation_and_do_women_have_one.pdf) for sex differences in the parts of the mind that are specifically about mating.)

@@ -354,7 +356,9 @@ I just don't see any _reason_ to doubt the obvious explanation that the root cau

(A "bug" with respect to the design criteria of evolution, not with respect to the human morality that affirms that I _like_ being this way. Some, fearing stigma, would prefer to tone-police "bug" down to "variation", but people who don't [understand the naturalistic fallacy](https://www.lesswrong.com/posts/YhNGY6ypoNbLJvDBu/rebelling-within-nature) aren't going to understand anything _else_ I'm saying, and I want to emphasize that the mirror-neurons-or-whatever and ordinary male heterosexuality weren't functionally optimized to collide like this.)

-If I were to _actually_ become neurologically female, it _wouldn't_ seem like the scintillating apotheosis of sexual desire and the most important thing in the world. It would just feel normal, in the way that (I can only presume) actual women feel their own existence is normal.
+If I were to _actually_ become neurologically female, it _wouldn't_ seem like the scintillating apotheosis of sexual desire and the most important thing in the world. It would just feel normal, in the way that (I can only imagine) actual women feel their own existence is normal.
+
+No doubt many women appreciate their own bodies, but a woman's positive body self-image experience of, "I feel sexy today", is going to be _very different_ from the autogynephile-with-BodyApp's experience of, "Oh my God, I have _breasts_ and a _vagina_ that I can look at and touch _without needing anyone's permission_; this is _the scintillating apotheosis of sexual desire and the most important thing in the world._" In this way, autogynephilia is _intrinsically self-undermining_ in a way that fantasies of flying to the moon are not. 
This doesn't in any way lessen the desire or make it go away—any more than [the guy who gets turned on by entropy decreasing in a closed system](https://qwantz.com/index.php?comic=1049) would have his libido suddenly and permanently vanish upon learning about the second law of thermodynamics. But it does, I suspect, change the way you think of it: it makes a difference whether you interpret the desire as a confused anomaly in male sexuality—the scintillating but ultimately untrue thought—or _take it literally_.

But the reasons not to take it literally might not be obvious to _everyone_. The

We can imagine that a male who was _like_ me in having this erotic-target-location-erroneous sexuality and associated beautiful pure sacred self-identity feelings, but who [read different books in a different order](/2020/Nov/the-feeling-is-mutual/), might come to very different conclusions about himself.

-If you don't have the conceptual vocabulary to say, "I have a lot of these beautiful pure sacred self-identity feelings about being female, but it seems like a pretty obvious guess that there must be some sort of causal relationship between that and this erotic fantasy, which is realistically going to be a variation in _male_ sexuality," you might end up saying something simpler like, "I want to be a woman." Or possibly even, "I _am_ a woman, on the inside, where it counts."
+If you don't have the conceptual vocabulary to say, "I have a lot of these beautiful pure sacred self-identity feelings about being female, but it seems like a pretty obvious guess that there must be some sort of causal relationship between that and this erotic fantasy, which is realistically going to be a variation in _male_ sexuality, such that it would be silly to interpret the beautiful pure sacred self-identity thing literally," you might end up saying something simpler like, "I want to be a woman." Or possibly even, "I _am_ a woman, on the inside, where it counts."
(As Yudkowsky [occasionally](https://www.lesswrong.com/posts/3nxs2WYDGzJbzcLMp/words-as-hidden-inferences) [remarks](https://www.lesswrong.com/posts/f4RJtHBPvDRJcCTva/when-anthropomorphism-became-stupid), our _beliefs about_ how our minds work have very little impact on how they actually work. Aristotle thought the brain was an organ for cooling the blood, but he was just wrong; the theory did not _become true of him_ because he believed it.)

@@ -398,7 +402,7 @@ Did you know that trans women [have to dilate their neovagina after bottom surge

I am glad that these interventions _exist_ for the people who are brave and desperate enough to need them. But given that I'm not that desperate and not that brave, would it not be wiser to trust the paraphrased proverb and not look a gift man in the mouth?

-My beautiful–beautiful ponytail was a _smart move_ (and hair length isn't sexually dimorphic anyway; it's only our culture's sexism that makes it seem relevant in this context).
+My beautiful–beautiful ponytail was a _smart move_ (and hair length isn't sexually dimorphic anyway; it's only our culture's [arbitrary gender conventions](/2020/Jan/book-review-the-origins-of-unfairness/) that make it seem relevant in this context).

My [five-month HRT experiment](/tag/hrt-diary/) was a _smart move_, both for the beautiful–beautiful breast tissue, and [For Science](https://tvtropes.org/pmwiki/pmwiki.php/Main/ForScience). 
diff --git a/notes/sexual-dimorphism-in-the-sequences-notes.md b/notes/sexual-dimorphism-in-the-sequences-notes.md index 76d9599..34dda78 100644 --- a/notes/sexual-dimorphism-in-the-sequences-notes.md +++ b/notes/sexual-dimorphism-in-the-sequences-notes.md @@ -6,17 +6,25 @@ https://www.lesswrong.com/posts/vjmw8tW6wZAtNJMKo/which-parts-are-me -------- -While [the Sequence explaining Yudkowsky's metaethics](https://www.lesswrong.com/tag/metaethics-sequence) was being published (which a lot of people, including me, didn't quite "get" at the time; a [later précis](https://www.lesswrong.com/posts/zqwWicCLNBSA5Ssmn/by-which-it-may-be-judged) was perhaps more successful), I was put off by the extent to which Yudkowsky seemed to want to ground the definition of value in [the evolved design of the human brain](https://www.lesswrong.com/posts/cSXZpvqpa9vbGGLtG/thou-art-godshatter), as if culturally-defined values were irrelevant, to be wiped away by [the extrapolation of what people _would_ want if they knew more, thought faster, _&c._](https://arbital.com/p/normative_extrapolated_volition/). +While [the Sequence explaining Yudkowsky's metaethics](https://www.lesswrong.com/tag/metaethics-sequence) was being published (which a lot of people, including me, didn't quite "get" at the time; a [later précis](https://www.lesswrong.com/posts/zqwWicCLNBSA5Ssmn/by-which-it-may-be-judged) was perhaps more successful), I was put off by the extent to which Yudkowsky seemed to want to ground the specification of value in [the evolved design of the human brain](https://www.lesswrong.com/posts/cSXZpvqpa9vbGGLtG/thou-art-godshatter), as if culturally-defined values were irrelevant, to be wiped away by [the extrapolation of what people _would_ want if they knew more, thought faster, _&c._](https://arbital.com/p/normative_extrapolated_volition/). 

-And the _reason_ I felt that way was because I was aware of how much of a historical anomaly my sacred ideological value of antisexism, and felt threatened by it. Contrast to Yudkowsky's [casual sex-realist speculation in the comment section](https://www.greaterwrong.com/posts/BkkwXtaTf5LvbA6HB/moral-error-and-moral-disagreement/comment/vHNejGa6cRxh6kdnE):
+And the _reason_ I felt that way was that I was aware of how much of a historical anomaly my sacred ideological value of antisexism was. Contrast to Yudkowsky's [casual sex-realist speculation in the comment section](https://www.greaterwrong.com/posts/BkkwXtaTf5LvbA6HB/moral-error-and-moral-disagreement/comment/vHNejGa6cRxh6kdnE):

> If there are distinct categories of human transpersonal values, I would expect them to look like "male and female babies", "male children", "male adults", "female children", "female adults", "neurological damage 1", "neurological damage 2", not "Muslims vs. Christians!"

You can see why this view would be unappealing to an ideologue eager to fight a culture war along an "Antisexism _vs._ Sexism" axis.

-Looking back, while I had a point that culturally-inculcated values might not wash out under extrapolation, I was vastly underestimating the extent to which your current sacred ideology _can_ be shown to be meaningfully "wrong" with better information—and, by design of the extrapolation procedure, this _shouldn't_ be threatening.
+Looking back—I do think I had a point that culturally-inculcated values won't completely wash out under extrapolation, but I think I was vastly underestimating the extent to which your current sacred ideology _can_ be shown to be meaningfully "wrong" with better information—and, by design of the extrapolation procedure, this _shouldn't_ be threatening.

-Suppose it _is_ true that female adults and male adults have distinct transpersonal values. 
At the time, I found the prospect horrifying—but that just shows that the design of male transpersonal values _contains within it_ the potential (under appropriate cultural conditions) to be horrified by sex differences in transpersonal values. The thing to be committed to is not any potentially flawed object-level ideology, like antisexism or Christianity, but [the features of human psychology that make the object-level ideology _seem like a good idea_](http://zackmdavis.net/blog/2017/03/dreaming-of-political-bayescraft/). + + + + +Suppose it _is_ true that female adults and male adults have distinct transpersonal values. At the time, I found the prospect horrifying—but that just shows that the design of male transpersonal values _contains within it_ the potential (under appropriate cultural conditions) to be horrified by sex differences in transpersonal values. + + + +The thing to be committed to is not any potentially flawed object-level ideology, like antisexism or Christianity, but [the features of human psychology that make the object-level ideology _seem like a good idea_](http://zackmdavis.net/blog/2017/03/dreaming-of-political-bayescraft/). If, naïvely, [I don't _want_ it to be the case that women are a different thing that I don't understand](/2019/Jan/interlude-xvi/), but that preference _itself_ arises out of