-Needless to say (it _should_ be needless to say), I agree that old-timey patriarchy and chattel slavery were Actually Really Bad. However, I feel like Murray's overall positioning strategy is trying to have it both ways: challenging the orthodoxy, while downplaying the possibility of any [unfortunate implications](https://tvtropes.org/pmwiki/pmwiki.php/Main/UnfortunateImplications) of the orthodoxy being false. This is sympathetic, but [ultimately ineffective](http://zackmdavis.net/blog/2016/08/ineffective-deconversion-pitch/), and I think we can do better by going meta and analyzing the _functions_ being served by the constraints on our discourse and seeking out clever self-aware strategies for satisfying those functions _without_ [lying about everything](/2017/Jan/im-sick-of-being-lied-to/). We mustn't fear opening the dread meta-door in front of whether there actually _are_ dread doors that we must fear opening.
+(Okay, this story is actually somewhat complicated by the fact that [evolution didn't "figure out" how to build brains](https://www.lesswrong.com/posts/gTNB9CQd5hnbkMxAG/protein-reinforcement-and-dna-consequentialism) that [keep track of probability and utility separately](https://plato.stanford.edu/entries/decision-theory/): my analogues in the environment of evolutionary adaptedness might also have been better off assuming that a rustling in the bush was a tiger, even if it usually wasn't a tiger, because failing to detect actual tigers was so much more costly than erroneously "detecting" an imaginary tiger. But let this pass.)
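+
+As a toy illustration of that asymmetry (with made-up numbers, just to show the shape of the tradeoff, not anything from the actual evolutionary record), here's a quick Python sketch comparing the expected costs of the two policies:
+
+```python
+# Hypothetical, purely illustrative numbers: suppose 1 in 100 rustles is a real
+# tiger, a needless flinch costs 1 unit of fitness, and getting eaten costs 1000.
+p_tiger = 0.01
+cost_false_alarm = 1
+cost_missed_tiger = 1000
+
+# Policy A: always treat the rustle as a tiger (pay the flinch cost when it isn't one).
+expected_cost_assume_tiger = (1 - p_tiger) * cost_false_alarm   # 0.99
+
+# Policy B: always shrug it off (pay the catastrophic cost when it is one).
+expected_cost_ignore = p_tiger * cost_missed_tiger               # 10.0
+
+print(expected_cost_assume_tiger < expected_cost_ignore)  # True: flinching wins
+```
+
+So even though the rustle usually isn't a tiger, the jumpy policy has the lower expected cost, because the rare miss is so much worse than the routine false alarm.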
+
+The problem is that, while any individual should always want true beliefs for _themselves_ in order to navigate the world, you might want _others_ to have false beliefs in order to trick them into _mis_-navigating the world in a way that benefits you. If I'm trying to sell you a used car, then—counterintuitively—I might not _want_ you to have accurate beliefs about the car, if that will reduce the sale price or result in no deal. If our analogues in the environment of evolutionary adaptedness regularly faced structurally similar situations, and if it's expensive to maintain two sets of beliefs (the real map for ourselves, and a fake map for our victims), we might end up with a tendency not just to be lying motherfuckers who deceive others, but also to _self_-deceive in situations where the fitness payoffs of tricking others outweighed those of being clear-sighted ourselves.
+
+That's why we're not smart enough to want a discipline of Actual Social Science. The benefits of having a collective understanding of human behavior—a _shared_ map—could be enormous, but beliefs about our own qualities, and those of socially-salient groups to which we belong (_e.g._, sex, race, and class), are _exactly_ those for which we face the largest incentive to deceive and self-deceive. Counterintuitively, I might not _want_ you to have accurate beliefs about the value of my friendship, for the same reason that I might not want you to have accurate beliefs about the value of my used car. That makes it a lot harder not just to _get the right answer for the right reasons_, but also to _trust_ that your fellow so-called "scholars" are trying to get the right answer, rather than trying to sneak self-serving lies into the shared map in order to fuck you over.