That's the context in which my happy-price email thread ended up including the sentence, "I feel awful writing _Eliezer Yudkowsky_ about this, because my interactions with you probably have disproportionately more simulation-measure than the rest of my life, and do I _really_ want to spend that on _this topic_?" (Referring to the idea that, in a sufficiently large universe where many subjectively-indistinguishable copies of everyone exist, including [inside of future superintelligences running simulations of the past](https://www.simulation-argument.com/), there would plausibly be _more_ copies of my interactions with Yudkowsky than of other moments of my life, on account of that information being of greater decision-relevance to those superintelligences.)