From: M. Taylor Saotome-Westlake Date: Thu, 9 Jul 2020 08:07:48 +0000 (-0700) Subject: gearing towards "Sexual Dimorphism" X-Git-Url: http://unremediatedgender.space/source?p=Ultimately_Untrue_Thought.git;a=commitdiff_plain;h=6b7433677d80c6a2cca73f2c6d1d3e3506e7a275 gearing towards "Sexual Dimorphism" --- diff --git a/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md b/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md index d3f91a7..bb6d3f2 100644 --- a/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md +++ b/content/drafts/sexual-dimorphism-in-the-sequences-in-relation-to-my-gender-problems.md @@ -4,17 +4,15 @@ Category: commentary Tags: autogynephilia, Eliezer Yudkowsky, epistemic horror, my robot cult, personal, sex differences Status: draft -> Given that females prefer to live with female relatives if they are going to live in groups at all, and given that the size and dispersion of these groups is determined by the interaction of food availability and predation risk, how should males map themselves onto the female distribution? -> -> Robin I. M. Dunbar, Primate Social Systems, Ch. 7, "Evolution of Grouping Patterns" +So, as I sometimes allude to, I've spent basically my entire adult life in this insular intellectual subculture that was founded in the late 'aughts to promulgate an ideal of _systematically correct reasoning_—general methods of thought that result in true beliefs and successful plans—and, incidentally, to use these methods of systematically correct reasoning to prevent superintelligent machines from [destroying all value in the universe](https://www.lesswrong.com/posts/GNnHHmm8EzePmKzPk/value-is-fragile). Lately I've been calling it my "robot cult" (a phrase [due to Dale Carrico](https://amormundi.blogspot.com/2011/08/ten-reasons-to-take-seriously.html))—the pejorative is partially ironically affectionate, and partially an expression of bitter contempt and white-hot rage acquired from that time almost everyone I [used to trust](https://www.lesswrong.com/posts/wustx45CPL5rZenuo/no-safe-defense-not-even-science) insisted on selectively playing dumb about our _own_ philosophy of language in a way that [was optimized for](TODO: linky "Algorithmic Intent") tricking me into cutting my dick off (independently of the empirical facts that [determine whether or not](https://www.lesswrong.com/posts/9f5EXt8KNNxTAihtZ/a-rational-argument) cutting my dick off is actually a good idea). -So, as I sometimes allude to, I've spent basically my entire adult life in this insular intellectual subculture that was founded in the late 'aughts to promulgate an ideal of _systematically correct reasoning_—general methods of thought that result in true beliefs and successful plans—and, incidentally, to use these methods of systematically correct reasoning to prevent superintelligent machines from [destroying all value in the universe](https://www.lesswrong.com/posts/GNnHHmm8EzePmKzPk/value-is-fragile). 
Lately I've been calling it my "robot cult" (a phrase [due to Dale Carrico](https://amormundi.blogspot.com/2011/08/ten-reasons-to-take-seriously.html))—the pejorative is partially ironically affectionate, and partially an expression of bitter contempt acquired from that time almost everyone I used to trust insisted on selectively playing dumb about our own philosophy of language in a way that [was optimized for](TODO: linky "Algorithmic Intent") tricking me into cutting my dick off (independently of the empirical facts that determine whether or not cutting my dick off is actually a good idea). +But that's a _long story_—for another time, perhaps. (Working title: "I Tell Myself to Let the Story End; Or, [A Hill of Validity in Defense of Meaning](https://archive.is/3epp2); Or, We Had [an Entire Sequence About This](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong), You Lying Motherfuckers.") -But that's a _long story_—for another time, perhaps. For now, I want to explain how my robot cult's foundational texts had an enormous influence on my self-concept in relation to sex and gender. +For _now_, I want to explain how my robot cult's foundational texts had an enormous influence on my self-concept in relation to sex and gender. It all started in summer 2007 when I came across _Overcoming Bias_, a blog on the theme of how to achieve more accurate beliefs. (I don't remember exactly how I was referred, but I think it was likely to have been [a link from Megan McArdle](https://web.archive.org/web/20071129181942/http://www.janegalt.net/archives/009783.html), then writing as "Jane Galt" at _Asymmetrical Information_.) -[Although](http://www.overcomingbias.com/author/hal-finney) [technically](http://www.overcomingbias.com/author/james-miller) [a](http://www.overcomingbias.com/author/david-j-balan) [group](http://www.overcomingbias.com/author/andrew) [blog](http://www.overcomingbias.com/author/anders-sandberg), the vast majority of posts on _Overcoming Bias_ were by Robin Hanson or Eliezer Yudkowsky. I was previously acquainted in passing with Yudkowsky's [writing about future superintelligence](http://yudkowsky.net/obsolete/tmol-faq.html). (I had [mentioned him in my Diary once](/ancillary/diary/42/), albeit without spelling his name correctly).) Yudkowsky was now using _Overcoming Bias_ and the medium of blogging [to generate material for a future book about rationality](https://www.lesswrong.com/posts/vHPrTLnhrgAHA96ko/why-i-m-blooking). Hanson's posts I could take or leave, but Yudkowsky's sequences of posts about rationality (coming out almost-daily through early 2009, eventualy totaling hundreds of thousands of words) were _life-changingly great_, drawing on fields from [cognitive psychology](https://www.lesswrong.com/s/5g5TkQTe9rmPS5vvM) to [evolutionary biology](https://www.lesswrong.com/s/MH2b8NfWv22dBtrs8) to explain the [mathematical](https://www.readthesequences.com/An-Intuitive-Explanation-Of-Bayess-Theorem) [principles](https://www.readthesequences.com/A-Technical-Explanation-Of-Technical-Explanation) [governing](https://www.lesswrong.com/posts/eY45uCCX7DdwJ4Jha/no-one-can-exempt-you-from-rationality-s-laws) _how intelligence works_—[the reduction of "thought"](https://www.lesswrong.com/posts/p7ftQ6acRkgo6hqHb/dreams-of-ai-design) to [_cognitive algorithms_](https://www.lesswrong.com/posts/HcCpvYLoSFP4iAqSz/rationality-appreciating-cognitive-algorithms). 
Intelligent systems that use [evidence](https://www.lesswrong.com/posts/6s3xABaXKPdFwA3FS/what-is-evidence) to construct [predictive](https://www.lesswrong.com/posts/a7n8GdKiAZRX86T5A/making-beliefs-pay-rent-in-anticipated-experiences) models of the world around them—that have "true" "beliefs"—can _use_ those models to compute which actions will best achieve their goals. I would later frequently [joke](https://en.wiktionary.org/wiki/ha_ha_only_serious) that Yudkowsky rewrote my personality over the internet.
+[Although](http://www.overcomingbias.com/author/hal-finney) [technically](http://www.overcomingbias.com/author/james-miller) [a](http://www.overcomingbias.com/author/david-j-balan) [group](http://www.overcomingbias.com/author/andrew) [blog](http://www.overcomingbias.com/author/anders-sandberg), the vast majority of posts on _Overcoming Bias_ were by Robin Hanson or Eliezer Yudkowsky. I was previously acquainted in passing with Yudkowsky's [writing about future superintelligence](http://yudkowsky.net/obsolete/tmol-faq.html). (I had [mentioned him in my Diary once in 2005](/ancillary/diary/42/), albeit without spelling his name correctly.) Yudkowsky was now using _Overcoming Bias_ and the medium of blogging [to generate material for a future book about rationality](https://www.lesswrong.com/posts/vHPrTLnhrgAHA96ko/why-i-m-blooking). Hanson's posts I could take or leave, but Yudkowsky's sequences of posts about rationality (coming out almost-daily through early 2009, eventually totaling hundreds of thousands of words) were _life-changingly great_, drawing on fields from [cognitive psychology](https://www.lesswrong.com/s/5g5TkQTe9rmPS5vvM) to [evolutionary biology](https://www.lesswrong.com/s/MH2b8NfWv22dBtrs8) to explain the [mathematical](https://www.readthesequences.com/An-Intuitive-Explanation-Of-Bayess-Theorem) [principles](https://www.readthesequences.com/A-Technical-Explanation-Of-Technical-Explanation) [governing](https://www.lesswrong.com/posts/eY45uCCX7DdwJ4Jha/no-one-can-exempt-you-from-rationality-s-laws) _how intelligence works_—[the reduction of "thought"](https://www.lesswrong.com/posts/p7ftQ6acRkgo6hqHb/dreams-of-ai-design) to [_cognitive algorithms_](https://www.lesswrong.com/posts/HcCpvYLoSFP4iAqSz/rationality-appreciating-cognitive-algorithms). Intelligent systems that use [evidence](https://www.lesswrong.com/posts/6s3xABaXKPdFwA3FS/what-is-evidence) to construct [predictive](https://www.lesswrong.com/posts/a7n8GdKiAZRX86T5A/making-beliefs-pay-rent-in-anticipated-experiences) models of the world around them—that have "true" "beliefs"—can _use_ those models to compute which actions will best achieve their goals. I would later frequently [joke](https://en.wiktionary.org/wiki/ha_ha_only_serious) that Yudkowsky rewrote my personality over the internet.

Ever since I was fourteen years old—

diff --git a/notes/epigraph_quotes.md b/notes/epigraph_quotes.md
index b459028..9a61914 100644
--- a/notes/epigraph_quotes.md
+++ b/notes/epigraph_quotes.md
@@ -346,3 +346,11 @@ https://xkcd.com/1942/
> In some cases, there is a reason for one gender to adopt certain tasks—males have stronger muscles, and females give birth, and thus big game hunting tends to be done by men (who, in most cultures, are typically male). 
>
> —Cailin O'Connor, "Measuring Conventionality"
+
+> Given that females prefer to live with female relatives if they are going to live in groups at all, and given that the size and dispersion of these groups is determined by the interaction of food availability and predation risk, how should males map themselves onto the female distribution?
+>
+> Robin I. M. Dunbar, Primate Social Systems, Ch. 7, "Evolution of Grouping Patterns"
+
+> If you are silent about your pain, they'll kill you and say you enjoyed it.
+>
+> —Zora Neale Hurston
\ No newline at end of file
diff --git a/notes/post_ideas.txt b/notes/post_ideas.txt
index 934d6aa..ceb67ab 100644
--- a/notes/post_ideas.txt
+++ b/notes/post_ideas.txt
@@ -1,12 +1,11 @@
Main path (important posts)—
_ Algorithmic Intent: A Hansonian Generalized Anti-Zombie Principle (LW)
-_ Intrumental Categories, Wireheading, and War (LW)
-_ theory of MMB's transition
+_ Sexual Dimorphism in Yudkowsky's Sequences, in Relation to My Gender Problems
+_ Unnatural Categories Are Optimized for Deception (LW)
_ Irreversible Damage review
_ Jessica/wiz AGP defense
-_ Sexual Dimorphism in Yudkowsky's Sequences, in Relation to My Gender Problems
_ Elision _vs_. Choice (working title)
_ Phenotypic Identity and Memetic Capture
diff --git a/notes/sexual-dimorphism-in-the-sequences-notes.md b/notes/sexual-dimorphism-in-the-sequences-notes.md
index 239c696..dd1911e 100644
--- a/notes/sexual-dimorphism-in-the-sequences-notes.md
+++ b/notes/sexual-dimorphism-in-the-sequences-notes.md
@@ -13,7 +13,7 @@ Easy—
Harder—
* "I often wish some men/women would appreciate"
* empathic inference: https://www.lesswrong.com/posts/NLMo5FZWFFq652MNe/sympathetic-minds https://www.lesswrong.com/posts/Zkzzjg3h7hW5Z36hK/humans-in-funny-suits
-* "The Opposite Sex" (memoryholed, but there should be an archive)
+* "The Opposite Sex" https://web.archive.org/web/20130216025508/http://lesswrong.com/lw/rp/the_opposite_sex/
* EY was right about "men need to think about themselves _as men_" (find cite)
* wipe culturally defined values
* finding things in the refrigerator
@@ -21,7 +21,8 @@ Harder—

My ideological commitment to psychological-sex-differences denialism made me uncomfortable when the topic of sex differences happened to come up on the blog—which wasn't particularly often, but in such a vast, sprawling body of work as the Sequences, it occasionally turned out to be relevant in a discussion of evolution or human values.

-For example, 
+For example, an early explanation of why
+
"the love of a man for a woman, and the love of a woman for a man, have not been cognitively derived from each other or from any other value."