From: M. Taylor Saotome-Westlake
Date: Sun, 20 Sep 2020 02:42:17 +0000 (-0700)
Subject: self-deception hash-anchor in "Human Diversity" review
X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=7f9666c07995597a09bd939b0a8784437ab3e0ea;p=Ultimately_Untrue_Thought.git

self-deception hash-anchor in "Human Diversity" review
---

diff --git a/content/2020/book-review-human-diversity.md b/content/2020/book-review-human-diversity.md
index 1407bcc..6dd3d0b 100644
--- a/content/2020/book-review-human-diversity.md
+++ b/content/2020/book-review-human-diversity.md
@@ -83,7 +83,7 @@ We really _shouldn't_. Everyone _should_ perceive a common interest in true beli
 (Okay, this story is actually somewhat complicated by the fact that [evolution didn't "figure out" how to build brains](https://www.lesswrong.com/posts/gTNB9CQd5hnbkMxAG/protein-reinforcement-and-dna-consequentialism) that [keep track of probability and utility separately](https://plato.stanford.edu/entries/decision-theory/): my analogues in the environment of evolutionary adaptedness might also have been better off assuming that a rustling in the bush was a tiger, even if it usually wasn't a tiger, because failing to detect actual tigers was so much more costly (in terms of fitness) than erroneously "detecting" an imaginary tiger. But let this pass.)
 
-The problem is that, while any individual should always want true beliefs for _themselves_ in order to navigate the world, you might want _others_ to have false beliefs in order to trick them into _mis_-navigating the world in a way that benefits you. If I'm trying to sell you a used car, then—counterintuitively—I might not _want_ you to have accurate beliefs about the car, if that would reduce the sale price or result in no deal. If our analogues in the environment of evolutionary adaptedness regularly faced structurally similar situations, and if it's expensive to maintain two sets of beliefs (the real map for ourselves, and a fake map for our victims), we might end up with a tendency not just to be lying motherfuckers who deceive others, but also to _self_-deceive in situations where the payoffs (in fitness) of tricking others outweighed those of being clear-sighted ourselves.
+The problem is that, while any individual should always want true beliefs for _themselves_ in order to navigate the world, you might want _others_ to have false beliefs in order to trick them into _mis_-navigating the world in a way that benefits you. If I'm trying to sell you a used car, then—counterintuitively—I might not _want_ you to have accurate beliefs about the car, if that would reduce the sale price or result in no deal. If our analogues in the environment of evolutionary adaptedness regularly faced structurally similar situations, and if it's expensive to maintain two sets of beliefs (the real map for ourselves, and a fake map for our victims), we might end up with a tendency not just to be lying motherfuckers who deceive others, but also to _self_-deceive in situations where the payoffs (in fitness) of tricking others outweighed those of being clear-sighted ourselves. That's why we're not smart enough to want a discipline of Actual Social Science. The benefits of having a collective understanding of human behavior—a _shared_ map that reflects the territory that we are—could be enormous, but beliefs about our own qualities, and those of socially-salient groups to which we belong (_e.g._, sex, race, and class) are _exactly_ those for which we face the largest incentive to deceive and self-deceive.
 Counterintuitively, I might not _want_ you to have accurate beliefs about the value of my friendship (or the disutility of my animosity), for the same reason that I might not want you to have accurate beliefs about the value of my used car. That makes it a lot harder not just to _get the right answer for the right reasons_, but also to _trust_ that your fellow so-called "scholars" are trying to get the right answer, rather than trying to sneak self-aggrandizing lies into the shared map in order to fuck you over. You can't _just_ write a friendly science book for oblivious science nerds about "things we know about some ways in which people are different from each other", because almost no one is that oblivious. To write and be understood, you have to do some sort of _positioning_ of how your work fits in to [the war](/2020/Feb/if-in-some-smothering-dreams-you-too-could-pace/) over the shared map.