Title: Teleology
Date: 2020-04-29 05:00
Category: other
Tags: epistemology

"I mean, if that explanation actually makes you feel happier, then fine."

"Feeling happier isn't what explanations are for. Explanations are for [predicting our observations](https://www.lesswrong.com/posts/a7n8GdKiAZRX86T5A/making-beliefs-pay-rent-in-anticipated-experiences).

"Emotions, too, are functional: happiness measures whether things in your life are going well or going poorly, but does not _constitute_ things going well, much as a high reading on a thermometer measures heat as 'temperature' without itself _being_ heat.

"If the explanation that predicts your observations makes you unhappy, then the explanation—and the unhappiness—are functioning as designed."