From: M. Taylor Saotome-Westlake Date: Wed, 29 Mar 2023 05:00:42 +0000 (-0700) Subject: memoir: Eliezerfic—editing, Big Yud lays down the challenge X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=103c847be752fce4dd8bbe071ce48202e6568804;p=Ultimately_Untrue_Thought.git memoir: Eliezerfic—editing, Big Yud lays down the challenge --- diff --git a/content/drafts/standing-under-the-same-sky.md b/content/drafts/standing-under-the-same-sky.md index eb0f1f7..96cd27e 100644 --- a/content/drafts/standing-under-the-same-sky.md +++ b/content/drafts/standing-under-the-same-sky.md @@ -806,7 +806,7 @@ I didn't have that response thought through in real time. At the time, I just ag It turned out that I was lying about probably not talking in the server anymore. (Hedging with the word "probably" didn't make the claim true, and of course I wasn't _consciously_ lying, but that hardly seems exculpatory.) -The next day, I belatedly pointed out that "Keltham thought that not learning about masochists he can never have, was obviously in retrospect what he'd have wanted Civilization to do" seemed to contradict "one thing hasn't changed: the message that you, yourself, should always be trying to infer the true truth". In the first statement, it didn't sound like Keltham thinks it's good that Civilization didn't tell him so that he could figure it how for himself (in accordance with the discipline of "you, yourself, always trying to infer the truth"). It sounded like he was better off not knowing—better off having a _less accurate self-model_ (not having the concept of "obligate romantic sadism"), better off having a _less accurate world-model_ (thinking that masochism isn't real). +The next day, I belatedly pointed out that "Keltham thought that not learning about masochists he can never have, was obviously in retrospect what he'd have wanted Civilization to do" seemed to contradict "one thing hasn't changed: the message that you, yourself, should always be trying to infer the true truth". In the first statement, it didn't sound like Keltham thinks it's good that Civilization didn't tell him so that he could figure it out for himself (in accordance with the discipline of "you, yourself, always trying to infer the truth"). It sounded like he was better off not knowing—better off having a _less accurate self-model_ (not having the concept of "obligate romantic sadism"), better off having a _less accurate world-model_ (thinking that masochism isn't real). In response to someone positing that dath ilani were choosing to be happier but less accurate predictors, I said that I read a blog post once about why you actually didn't want to do that, linking to [an Internet Archive copy of "Doublethink (Choosing to Be Biased)"](https://web.archive.org/web/20080216204229/https://www.overcomingbias.com/2007/09/doublethink-cho.html) from 2008[^hanson-conceit]—at least, that was _my_ attempted paraphrase; it was possible that I'd extracted a simpler message from it than the author intended. @@ -827,9 +827,9 @@ The problem I saw with this is that becoming rich and famous isn't a purely rand The dilemma of whether to make more ambitious economic choices in pusuit of sexual goals was something that _already_ happens to people on Earth, rather than being hypothetical. I once met a trans woman who spent a lot of her twenties and thirties working very hard to get money for various medical procedures. 
I think she would be worse off under a censorship regime run by self-styled Keepers who thought it was kinder to prevent _poor people_ from learning about the concept of "transsexualism".

-Further discussion established that Yudkowsky was (supposedly) already taking into account the distortion on individuals' decisions, but that the empirical setting of probabilities and utilities happened to be such that ignorance came out on top.
+Further discussion established that Yudkowsky was (supposedly) already taking into account that class of distortion on individuals' decisions, but that the empirical setting of probabilities and utilities happened to be such that ignorance came out on top.

-I wasn't sure what my wordcount and diplomacy budget limits for the server were, but I couldn't let go; I kept the thread going on subsequent days. There was something I felt I should be able to convey, if I could just find the right words.
+I wasn't sure what my wordcount and "diplomacy" "budget limits" for the server were, but I couldn't let go; I kept the thread going on subsequent days. There was something I felt I should be able to convey, if I could just find the right words.

When [Word of God](https://tvtropes.org/pmwiki/pmwiki.php/Main/WordOfGod) says, "trying to prevent most [_X_] from discovering what they are [...] continues to strike me as a basically reasonable policy option", then, separately from the particular value of _X_, I expected people to jump out of their chairs and say, "No! This is wrong! Morally wrong! People can stand what is true about themselves, because they are already doing so!"

@@ -847,15 +847,15 @@ I admitted, again, that there was a sense in which I couldn't argue with authori

(Yudkowsky retorted, "...you realize you're describing like half the alien planets in comic books? when did Superman ever get depicted as studying kung fu?" I wish I had thought to admit that, yes, I _did_ hold Eliezer Yudkowsky to a higher standard of consilient worldbuilding than DC Comics. Would he rather I _didn't_?)

-Something about innate _kung fu_ world seems fake in a way that seems like a literary flaw. It's not just about plausibility. Innate _kung fu_ skills are scientifically plausible[^instinct] in a way that faster-than-light travel is not. Fiction incorporates unrealistic elements in order to tell a story that has relevace to real human lives. Throwing faster-than-light travel into the universe so that you can do a [space opera](https://tvtropes.org/pmwiki/pmwiki.php/Main/SpaceOpera) doesn't make the _people_ fake in the way that Superman's fighting skills are fake.
+Something about innate _kung fu_ world seems fake in a way that amounts to a literary flaw. It's not just about plausibility. Fiction often incorporates unrealistic elements in order to tell a story that has relevance to real human lives. Innate _kung fu_ skills are scientifically plausible[^instinct] in a way that faster-than-light travel is not, but throwing faster-than-light travel into the universe so that you can do a [space opera](https://tvtropes.org/pmwiki/pmwiki.php/Main/SpaceOpera) doesn't make the _people_ fake in the way that Superman's fighting skills are fake.

[^instinct]: All sorts of other instinctual behaviors exist in animals; I don't see why skills humans have to study for years as a "martial art" couldn't be coded into the genome.

-Similarly, a world that's claimed by authorial fiat to be super-great at epistemic rationality, but where the people don't have a will-to-truth stronger than their will-to-happiness, felt fake to me. I couldn't _prove_ that it was fake. I agreed with Harmless's case that, _technically_, as far as the Law went, you could build a Civilization or a Friendly AI to see all the ugly things that you preferred not to see.
+Maybe it was okay for Superman's fighting skills to be fake from a literary perspective (because realism along that dimension is not what Superman is _about_), but if the Yudkowskian ethos exalted intelligence as ["the power that cannot be removed without removing you"](https://www.lesswrong.com/posts/SXK87NgEPszhWkvQm/mundane-magic), readers had grounds to demand that the dath ilani's thinking skills be real. A world that's claimed by authorial fiat to be super-great at epistemic rationality, but where the people don't have a will-to-truth stronger than their will-to-happiness, felt fake to me. I couldn't _prove_ that it was fake. I agreed with Harmless's case that, _technically_, as far as the Law went, you could build a Civilization or a Friendly AI to see all the ugly things that you preferred not to see.

But if you could—would you? And more importantly, if you would—could you?

-It was possible that the attitude I was evincing here was just a difference between the eliezera out of dath ilan and the Zackistani from my medianworld, and that there's nothing more to be said about it. But I didn't think the thing was a _genetic_ trait of the Zackistani! _I_ got it from spending my early twenties obsessively re-reading blog posts that said things like, ["I believe that it is right and proper for me, as a human being, to have an interest in the future [...] One of those interests is the human pursuit of truth [...] I wish to strengthen that pursuit further, in this generation."](https://www.lesswrong.com/posts/anCubLdggTWjnEvBS/your-rationality-is-my-business)
+It was possible that the attitude I was evincing here was just a difference between the eliezera out of dath ilan and the Zackistani from my medianworld, and that there was nothing more to be said about it. But I didn't think the thing was a _genetic_ trait of the Zackistani! _I_ got it from spending my early twenties obsessively re-reading blog posts that said things like, ["I believe that it is right and proper for me, as a human being, to have an interest in the future [...] One of those interests is the human pursuit of truth [...] I wish to strengthen that pursuit further, in this generation."](https://www.lesswrong.com/posts/anCubLdggTWjnEvBS/your-rationality-is-my-business)

There were definitely communities on Earth where I wasn't allowed in because of my tendency to shout things from street corners, and I respected those people's right to have a safe space for themselves.

@@ -863,7 +863,7 @@ But those communities ... didn't call themselves _rationalists_, weren't _preten
"[The eleventh virtue is scholarship. Study many sciences and absorb their power as your own](https://www.yudkowsky.net/rational/virtues) ... unless a prediction market says that would make you less happy," just didn't have the same ring to it. Neither did "The first virtue is curiosity. A burning itch to know is higher than a solemn vow to pursue truth.
But higher than both of those, is trusting your Society's institutions to tell you which kinds of knowledge will make you happy"—even if you stipulated by authorial fiat that your Society's institutions are super-competent, such that they're probably right about the happiness thing. -Attempting to illustrate the mood I thought dath ilan was missing, I quoted (with Discord's click-to-reveal spoiler blocks around the more plot-relevant sentences) the scene from _Atlas Shrugged_ where our heroine Dagny expresses a wish to be kept ignorant for the sake of her own happiness and gets shut down by Galt—and Dagny _thanks_ him. +Attempting to illustrate [the mood I thought dath ilan was missing](https://www.econlib.org/archives/2016/01/the_invisible_t.html), I quoted (with Discord's click-to-reveal spoiler blocks around the more plot-relevant sentences) the scene from _Atlas Shrugged_ where our heroine Dagny expresses a wish to be kept ignorant for the sake of her own happiness, and gets shut down by John Galt—and Dagny _thanks_ him.[^atlas-shrugged-ref] > "[...] Oh, if only I didn't have to hear about it! If only I could stay here and never know what they're doing to the railroad, and never learn when it goes!" > @@ -871,7 +871,11 @@ Attempting to illustrate the mood I thought dath ilan was missing, I quoted (wit > > She looked at him, her head lifted, knowing what chance he was rejecting. She thought that no man of the outer world would have said this to her at this moment—she thought of the world's code that worshipped white lies as an act of mercy—she felt a stab of revulsion against that code, suddenly seeing its full ugliness for the first time [...] she answered quietly, "Thank you. You're right." -This (probably predictably) failed to resonate with other server participants, who were baffled why I seemed to be appealing to Ayn Rand's authority. But I was actually going for a _reverse_ appeal-to-authority: if _Ayn Rand_ understood that facing reality is virtuous, why didn't the 2020's "rationalists"? Wasn't that undignified? I didn't think the disdain for "Earth people" (again, as if there were any other kind) was justified, when Earth's philosophy of rationality (as exemplified by Ayn Rand or Robert ["Get the Facts"](https://www.goodreads.com/quotes/38764-what-are-the-facts-again-and-again-and-again) Heinlein) was doing better than dath ilan's on this critical dimension. +[^atlas-shrugged-ref]: In Part Three, Chapter II, "The Utopia of Greed". + +This (probably predictably) failed to resonate with other server participants, who were baffled why I seemed to be appealing to Ayn Rand's authority. + +I was actually going for a _reverse_ appeal-to-authority: if _Ayn Rand_ understood that facing reality is virtuous, why didn't the 2020s "rationalists"? Wasn't that undignified? I didn't think the disdain for "Earth people" (again, as if there were any other kind) was justified, when Earth's philosophy of rationality (as exemplified by Ayn Rand or Robert ["Get the Facts"](https://www.goodreads.com/quotes/38764-what-are-the-facts-again-and-again-and-again) Heinlein) was doing better than dath ilan's on this critical dimension. But if people's souls had been damaged such that they didn't have the "facing reality is virtuous" gear, it wasn't easy to install the gear by talking at them. @@ -885,13 +889,13 @@ I knew that. The other people in the chatroom knew that. 
So to the extent that t The problem was that, in my view, the people who weren't talking about Truth as if it were a sacred value were being _wildly recklessly casual_ about harms from covering things up, as if they didn't see the non-first-order harms _at all_. I felt I had to appeal to the lessons for children about how Lying Is Bad, because if I tried to make a more sophisticated argument about it being _quantitatively_ crazy to cover up psychology facts that make people sad, I would face a brick wall of "authorial fiat declares that the probabilities and utilities are specifically fine-tuned such that ignorance is good". -Even if you specified by authorial fiat that "latent sadists could use the information to decide whether or not to try to become rich and famous" didn't tip the utility calculus in itself, [facts are connected to each other](https://www.lesswrong.com/posts/wyyfFfaRar2jEdeQK/entangled-truths-contagious-lies), there were _more consequences_ to the coverup, more ways in which better-informed people could make better decisions than worse informed people. +Even if you specified by authorial fiat that "latent sadists could use the information to decide whether or not to try to become rich and famous" didn't tip the utility calculus in itself, [facts are connected to each other](https://www.lesswrong.com/posts/wyyfFfaRar2jEdeQK/entangled-truths-contagious-lies); there were _more consequences_ to the coverup, more ways in which better-informed people could make better decisions than worse-informed people. What about the costs of all the other recursive censorship you'd have to do to keep the secret? (If a biography mentioned masochism in passing along with many other traits of the subject, you'd need to either censor the paragraphs with that detail, or censor the whole book. Those are real costs, even under a soft-censorship regime where people can give special consent to access "Ill Advised" products.) Maybe latent sadists could console themselves with porn if they knew, or devote their careers to making better sex robots, just as people on Earth with non-satisfiable sexual desires manage to get by. (I _knew some things_ about this topic.) What about dath ilan's heritage optimization (read: eugenics) program? Are they going to try to breed more masochists, or fewer sadists, and who's authorized to know that? And so on. -A user called RationalMoron asked if I was appealing to a terminal value. Did I think people should have accurate self-models even if they don't want to? +A user called RationalMoron asked if I was appealing to a terminal value. Did I think people should have accurate self-models even if they didn't want to? -Obviously I wasn't going to use a universal quantifier over all possible worlds and all possible minds, but in human practice, yes: people who prefer to believe lies about themselves are doing the wrong thing; people who lie to their friends to keep them happy are doing the wrong thing. People can stand what is true, because they are already doing so. I realized this was a children's lesson without very advanced math, but I thought it was a better lesson than, "Ah, but what if a _prediction market_ says they can't???" That the eliezera prefer not to know that there are desirable sexual experiences that they can't have, contradicted April's earlier claim (which had received a Word of God checkmark-emoji) that "it's not that the standards are being dropped it's that there's an even higher standard far beyond what anyone on earth has accomplished". 
+Obviously I wasn't going to use a universal quantifier over all possible worlds and all possible minds, but in human practice, yes: people who prefer to believe lies about themselves are doing the wrong thing; people who lie to their friends to keep them happy are doing the wrong thing. People can stand what is true, because they are already doing so. I realized that this was a children's lesson without very advanced math, but I thought it was a better lesson than, "Ah, but what if a _prediction market_ says they can't???" That the eliezera prefer not to know that there are desirable sexual experiences that they can't have, contradicted April's earlier claim (which had received a Word of God checkmark-emoji) that "it's not that the standards are being dropped[;] it's that there's an even higher standard far beyond what anyone on earth has accomplished".

Apparently I struck a nerve. Yudkowsky started "punching back":

@@ -900,23 +904,29 @@ Apparently I struck a nerve. Yudkowsky started "punching back":

> It's noticeably more extreme than the _Invention of Lying_ aliens, who can still have nudity taboos
> I'd also note that I think in retrospect (only after having typed it) that Zack could not have generated these examples of other places where society refrains from observation, and that I think this means I am tracking the thing Zack fears in a way that Zack cannot because his thinking is distorted and he is arguing rather than seeing; and this, not verbally advocating for "truth", is more what respect for truth really is.

-I thought the "you could not have generated the answer I just told you" gambit was a pretty dirty argumentative trick on Yudkowsky's part. (Given that I could, how would I be able to prove it?—this was itself a good use-case for spoilers.)
+I thought the "you could not have generated the answer I just told you" gambit was a pretty dirty argumentative trick on Yudkowsky's part. (Given that I could, how would I be able to prove it?—this was itself a good use-case for concealing spoilers.)

-As it happened, however, I _had_ already considered the case of spoilers as a class of legitimate infohazards, and was prepared to testify that I had already thought of it, and explain why hiding spoilers were relevantly morally different from coverups in my view. The previous night, 7 December 2022, I had had a phone call with Anna Salamon,[^evidence-of-independent-generation] in which (I pretty distinctly remember) citing dath ilan's [practice of letting children figure out heliocentrism for themselves](https://www.glowfic.com/replies/1777588#reply-1777588) as not being objectionable in the way the sadism/masochism coverup was.
+As it happened, however, I _had_ already considered the case of spoilers as a class of legitimate infohazards, and was prepared to testify that I had already thought of it, and explain why I thought hiding spoilers was relevantly morally different from the coverups I was objecting to. The previous night, 7 December 2022, I had had a phone call with Anna Salamon,[^evidence-of-independent-generation] in which I (remembered that I) had cited dath ilan's [practice of letting children figure out heliocentrism for themselves](https://www.glowfic.com/replies/1777588#reply-1777588) as not being objectionable in the way the sadism/masochism coverup was.
-[^evidence-of-independent-generation]: I was lucky to be able to point to Anna as a potential witness to defend myself against the "could not have generated" trick—as a matter of principle, not because I seriously expected anyone to go ask Anna if she remembered the conversation the same way. +[^evidence-of-independent-generation]: I was lucky to be able to point to Anna as a potential witness to defend myself against the "could not have generated" trick—as a matter of principle, not because I seriously expected anyone to care enough to go ask Anna if she remembered the conversation the same way. - I also mentioned that when I had used spoiler blocks on the _Atlas Shrugged_ quote I had posted upthread, I had briefly considered including some kind of side-remark noting that the spoiler blocks were also a form of information-hiding, but couldn't think of anything funny or relevant enough (which, if my self-report could be trusted, showed that I had independently generated the idea spoilers being an example of hiding information—but I didn't expect other people to uncritically believe my self-reports). + I also mentioned that when I had used spoiler blocks on the _Atlas Shrugged_ quote I had posted upthread, I had briefly considered making some kind of side-remark noting that the spoiler blocks were also a form of information-hiding, but couldn't think of anything funny or relevant enough (which, if my self-report could be trusted, showed that I had independently generated the idea of spoilers being an example of hiding information—but I didn't expect other people to uncritically believe my self-reports). -It seemed like the rationale for avoiding spoilers of movie plots or homework exercises had to do with the outcome being different if you got spoiled: you have a different æsthetic experience if you experience the plot twist in the 90th minute of the movie rather than the fourth paragraph of the _Wikipedia_ article; you learn more by working out the answer to the homework exercise yourself. Dath ilan's sadism/masochism coverup didn't seem to have the same structure: when I try to prove a theorem myself before looking at how the textbook says to do it, it's not because I would be _sad about the state of the world_ if I looked at the textbook; it's because the temporary ignorance of working it out myself results in a stronger state of final knowledge. +It seemed like the rationale for avoiding spoilers of movie plots or homework exercises had to do with the outcome being different if you got spoiled: you have a different æsthetic experience if you experience the plot twist in the 90th minute of the movie rather than the fourth paragraph of the _Wikipedia_ article. Dath ilan's sadism/masochism coverup didn't seem to have the same structure: when I try to prove a theorem myself before looking at how the textbook says to do it, it's not because I would be _sad about the state of the world_ if I looked at the textbook; it's because the temporary ignorance of working it out myself results in a stronger state of final knowledge. That is, the difference between "spoilers" (sometimes useful) and "coverups" (bad) had to do with whether the ignorant person is expected to eventually uncover the hidden information, and whether the ignorant person knows that there's hidden information that they're expected to uncover. In the case of the sadism/masochism coverup (in contrast to the cases of movie spoilers or homework exercises), it seemed like neither of these conditions pertained. 
(Keltham knows that the Keepers are keeping secrets, but he seems to actively have beliefs about human psychology that imply masochism is implausible; it seems more like he has a false map, rather than a blank spot on his map for the answer to the homework exercise to be filled in.) I thought that was morally relevant.

-(I would have hoped that my two previous mentions in the thread of supporting keeping nuclear, bioweapon, and AI secrets should have already made it clear that I wasn't against _all_ cases of Society hiding information, but to further demonstrate my ability to generate counterexamples, I mentioned that I would also admit _threats_ as a class of legitimate infohazard: if I'm not a perfect decision theorist, I'm better off if Tony Soprano just doesn't have my email to begin with, if I don't trust myself to calculate when I "should" ignore his demands.)
+(Additionally, I would have hoped that my two previous mentions in the thread of my support for keeping nuclear, bioweapon, and AI secrets had already made it clear that I wasn't against _all_ cases of Society hiding information, but to further demonstrate my ability to generate counterexamples, I mentioned that I would also admit _threats_ as a class of legitimate infohazard: if I'm not a perfect decision theorist, I'm better off if Tony Soprano just doesn't have my email address to begin with, if I don't trust myself to calculate when I "should" ignore his demands.)

As for the claim that my thinking was distorted and I was arguing instead of seeing, it was definitely true that I was _motivated to look for_ criticisms of Yudkowsky and dath ilan, for personal reasons outside the scope of the server, and I thought it was great for people to notice this and take it into account. I hoped to nevertheless be competent to only report real criticisms and not fake criticisms. (Whether I succeeded, of course, was up to the reader to decide.)

-[TODO: Yudkowsky tests me: (despite "Do not let the argument wander and become about something else, such as someone's virtue as a rationalist")]
+Yudkowsky replied:
+
+> only half the battle even if you could do it. you're also not reporting any facts/arguments on the other side, which is a much larger and visible gap to me, and has a lot to do with why I'm not presently considering this criticism from a peer despite your spoken adherence to virtues I value. **QUESTION FOR ZACK ONLY, NOBODY ELSE ANSWER OR SAY ANYTHING ABOUT IT IN THIS MAIN CHANNEL:** What are some of the ways that Planecrash valorizes truth, as you, yourself, see that virtue?
+
+I didn't ask why it was relevant whether or not I was a "peer." If we're measuring IQ (143 _vs._ [131](/images/wisc-iii_result.jpg)), or fiction-writing ability (several [highly-acclaimed](https://www.lesswrong.com/posts/HawFh7RvDM4RyoJ2d/three-worlds-collide-0-8) [stories](https://www.yudkowsky.net/other/fiction/the-sword-of-good) [including the world's most popular _Harry Potter_ fanfiction](https://www.hpmor.com/) _vs._ a [_My Life as a Teenage Robot_ fanfiction](https://archive.ph/WdydM) with double-digit favorites and a [few](/2018/Jan/blame-me-for-trying/) [blog](http://zackmdavis.net/blog/2016/05/living-well-is-the-best-revenge/) [vignettes](https://www.lesswrong.com/posts/dYspinGtiba5oDCcv/feature-selection) here and there), or contributions to AI alignment (founder of the field _vs._ author of some dubiously relevant blog comments), I'm obviously _not_ his peer.
It didn't seem like that was necessary when one could just [evaluate my arguments about dath ilan on their own merits](https://www.lesswrong.com/posts/5yFRd3cjLpm3Nd6Di/argument-screens-off-authority). But I wasn't going to be so impertinent as to point that out when the master was testing me (!) and I was eager to pass the test.

[TODO: outline the test] [TODO: derail with Lintamande]