+While [the Sequence explaining Yudkowsky's metaethics](https://www.lesswrong.com/tag/metaethics-sequence) was being published (which a lot of people, including me, didn't quite "get" at the time; a [later précis](https://www.lesswrong.com/posts/zqwWicCLNBSA5Ssmn/by-which-it-may-be-judged) was perhaps more successful), I was put off by the extent to which Yudkowsky seemed to want to ground the specification of value in [the evolved design of the human brain](https://www.lesswrong.com/posts/cSXZpvqpa9vbGGLtG/thou-art-godshatter), as if culturally-defined values were irrelevant, to be wiped away by [the extrapolation of what people _would_ want if they knew more, thought faster, _&c._](https://arbital.com/p/normative_extrapolated_volition/).
+
+And the _reason_ I felt that way was that I was aware of what a historical anomaly my sacred ideological value of antisexism was. Contrast this with Yudkowsky's [casual sex-realist speculation in the comment section](https://www.greaterwrong.com/posts/BkkwXtaTf5LvbA6HB/moral-error-and-moral-disagreement/comment/vHNejGa6cRxh6kdnE):
+
+> If there are distinct categories of human transpersonal values, I would expect them to look like "male and female babies", "male children", "male adults", "female children", "female adults", "neurological damage 1", "neurological damage 2", not "Muslims vs. Christians!"
+
+You can see why this view would be unappealing to an ideologue eager to fight a culture war along an "Antisexism _vs._ Sexism" axis.
+
+Looking back—I do think I had a point that culturally-inculcated values won't completely wash out under extrapolation, but I was vastly underestimating the extent to which one's current sacred ideology _can_ be shown to be meaningfully "wrong" given better information—and, by the design of the extrapolation procedure, this _shouldn't_ be threatening.