+But fighting for public epistemology is a long battle; it makes more sense if you have _time_ for it to pay off. Back in the late 'aughts and early 'tens, it looked like we had time. We had these abstract philosophical arguments for worrying about AI, but no one really talked about _timelines_. I believed the Singularity was going to happen in the 21st century, but it felt like something to expect in the _second_ half of the 21st century.
+
+Now it looks like we have—less time? Not just tautologically because time has passed (the 21st century is one-fifth over—closer to a quarter over), but because of new information from the visible results of the deep learning revolution during that time. Yudkowsky seemed particularly spooked by AlphaGo and AlphaZero in 2016–2017.
+
+[TODO: specifically, AlphaGo seemed "deeper" than minimax search so you shouldn't dismiss it as "meh, games", the way it rocketed past human level from self-play https://twitter.com/zackmdavis/status/1536364192441040896]
+
+My AlphaGo moment was 5 January 2021, OpenAI's release of [DALL-E](https://openai.com/blog/dall-e/) (by far the most significant news story of that week of January 2021).
+
+[TODO: previous AI milestones had seemed dismissible as a mere clever statistics trick; this looked more like "real" understanding, "real" creativity]
+
+[As recently as 2020, I had been daydreaming about](/2020/Aug/memento-mori/#if-we-even-have-enough-time) working at an embryo selection company (if they needed programmers—but everyone needs programmers, these days), and having that be my altruistic[^altruism] contribution to the world.
+
+[^altruism]: If it seems odd to frame _eugenics_ as "altruistic", translate it as a term of art referring to the component of my actions dedicated to optimizing the future of the world, as opposed to selfishly optimizing my own experiences.
+
+[TODO—
+
+ * If you have short timelines, and want to maintain influence over what big state-backed corporations are doing, self-censoring about contradicting the state religion makes sense. There's no time to win a culture war; we need to grab hold of the Singularity now!!
+
+ * So isn't there a story here where I'm the villain for not falling in line and accepting orders? Aren't I causing damage, losing dignity points for our Earth, by criticizing Yudkowsky so harshly, when actually the rest of the world should listen to him more about the coming robot apocalypse?
+
+]
+
+> [_Perhaps_, replied the cold logic. _If the world were at stake._