+------
+
+Yudkowsky's political cowardice is arguably puzzling in light of his timeless decision theory's recommendations against giving in to extortion.
+
+The "arguably" is important, because randos on the internet are notoriously bad at drawing out the consequences of the theory, to the extent that Yudkowsky has said that he wishes he hadn't published—and though I think I'm smarter than the average rando, I don't expect anyone to _take my word for it_. So let me disclaim that this is _my_ explanation of how Yudkowsky's decision theory _could be interpreted_ to recommend that he behave the way I want him to, without any pretense that I'm any sort of neutral expert witness on decision theory.
+
+The idea of timeless decision theory is that you should choose the action that has the best consequences _given_ that your decision is mirrored at all the places your decision algorithm is embedded in the universe.
+
+The reason this is any different from the "causal decision theory" of just choosing the action with the best consequences (locally, without any regard to this "multiple embeddings in the universe" nonsense) is that it's possible for other parts of the universe to depend on your choices. For example, in the "Parfit's Hitchhiker" scenario, someone might give you a ride out of the desert if they _predict_ you'll pay them back later. After you've already received the ride, you might think that you can get away with stiffing them—but if they'd predicted you would do that, they wouldn't have given you the ride in the first place. Your decision is mirrored _inside the world-model of every other agent with sufficiently good knowledge of you_.
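+
+To make the payoff structure concrete, here's a toy sketch in Python (the numbers, names, and the assumption of a perfectly accurate predictor are my own illustration, not anything from the decision-theory literature): the agent whose policy is to pay loses only the fare, while the agent whose policy is to stiff the driver is never picked up at all, because the driver's prediction mirrors the policy.
+
+```python
+# Toy payoff model of Parfit's Hitchhiker with a perfect predictor.
+# The utility numbers are illustrative placeholders of my own.
+
+DIE_IN_DESERT = -1_000_000  # utility of being left stranded
+PAYMENT = -100              # utility cost of paying the driver after rescue
+
+def hitchhiker_utility(policy: str) -> int:
+    """Return the hitchhiker's utility given their post-rescue policy.
+
+    `policy` is what the agent would do once safely in town: "pay" or "stiff".
+    The driver predicts this policy and only offers the ride to predicted payers.
+    """
+    predicted_to_pay = (policy == "pay")  # the driver reads your decision algorithm
+    if not predicted_to_pay:
+        return DIE_IN_DESERT              # no ride: the stiffing was foreseen
+    return PAYMENT                        # rescued, then pays as predicted
+
+print(hitchhiker_utility("pay"))    # -100
+print(hitchhiker_utility("stiff"))  # -1000000
+```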
+
+In particular, if you're the kind of agent that gives in to extortion—if you respond to threats of the form "Do what I want, or I'll hurt you" by doing what the threatener wants—that gives other agents an incentive to spend resources trying to extort you. On the other hand, if any would-be extortionist knows you'll never give in, they have no reason to bother trying. This is where the standard ["Don't negotiate with terrorists"](/2018/Jan/dont-negotiate-with-terrorist-memeplexes/) advice comes from.
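+
+In the same spirit, here's a second toy sketch (again, my own illustrative numbers and the same simplifying assumption of accurate prediction): against a would-be extortionist who only issues threats they predict will be obeyed, the "refuse" policy is never threatened in the first place, while the "give in" policy attracts the threat and then pays the concession.
+
+```python
+# Toy payoff model of extortion against a predictor, in the same style as above.
+# The utility numbers are illustrative placeholders of my own.
+
+CONCESSION = -50  # utility cost of doing what the extortionist demands
+
+def victim_utility(policy: str) -> int:
+    """Return the victim's utility given their policy: "give_in" or "refuse".
+
+    A would-be extortionist predicts the policy and only bothers issuing a
+    threat when they expect it to be obeyed; threatening a known refuser
+    costs them something and gains them nothing.
+    """
+    threatened = (policy == "give_in")  # only predicted capitulators get threatened
+    if not threatened:
+        return 0          # no one wastes resources extorting you
+    return CONCESSION     # you invited the threat, and then you pay up
+
+print(victim_utility("refuse"))   # 0
+print(victim_utility("give_in"))  # -50
+```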
+
+So, naïvely, doesn't Yudkowsky's "personally prudent to post your agreement with Stalin"[^gambit] gambit constitute giving in to an extortion threat of the form, "Support the progressive position, or we'll hurt you", which Yudkowsky's own decision theory says not to do?
+
+[^gambit]: In _ways that exhibit generally rationalist principles_, natch.