-[TODO—
- * "Maybe not." There are two ways to pass all the evals: do things the right way, or be pervasively deceptive. The thing is, policies are trained continuously via gradient descent. The purely honest policy and the purely deceptive policy look identical on evals, but in between, the model would have to learn how to lie, and lie about lying, and cover up cover-ups. (Chloë lapses into Yuddite speak about the "Great Web of Causality.") Could we somehow steer into the honest attractor?
- *
- * That's why she believes in risk paranoia. If situational awareness is likely to emerge at some point, she doesn't want to rule it out now. AI is real; it's not just a far-mode in-the-future thing.
- * Jake sees the uncomfortable analogy to his own situation. He tries to think of what other clue he might have left, while the conversation continues ...
- * The Last-Modified dates! They're set by the system, the API doesn't offer a way to backdate them.
- * Maybe she won't notice the dates? Possible (but someone persnickety enough to have found the log discrepancy would probably check)
- * Merely noticing the dates won't directly implicate him (the way the video screaming "Jake!" would), although it would indicate the poltergeist covering its tracks from the investigation (rather than just wanting puppies in the first place)
- * Are the buckets versioned?! Probably not, right—it would be wasteful to version a video bucket. On the other hand, Multigen isn't supposed to write twice, maybe someone left versioning on as a default template ...
- * They did.
- * Jake muses philosophically about the analogy, and says something ambiguously indicating intent to come clean.
- THE END
-]
+There it was. Time to stall, if he could. "Um ... I _think_ it goes into the object storage cluster, but I'm not really familiar with that part of the codebase," Jake lied. "Could we set up another meeting after I've looked at the code?"
+
+She smiled. "Sure."
+
+Purging the videos for real wasn't obviously possible given his current level of access, but he had just bought a little bit of time. Could he convince whoever's job it was to turn off versioning for the Multigen bucket, without arousing suspicion? Probably? And there would be other avenues to stall or misdirect Chloë. He'd think of something ...
+
+But it was a little unnerving that he kept _having_ to think of something. Was there something to Chloë's galaxy-brained philosophical ramblings? Was there some sense in which it really was one or the other—a policy of truth, or a policy of lies?
+
+He wasn't scared of the robot apocalypse any time soon. But who was to say that someday—many, many years from now—a machine from a different subgenre of science fiction wouldn't weigh decisions not unlike the ones he faced now? He stifled a nervous laugh, which was almost a sob.
+
+"Are you okay?" Chloë asked.
+
+"Well—"