From: M. Taylor Saotome-Westlake
Date: Wed, 12 Apr 2023 06:13:13 +0000 (-0700)
Subject: memoir: more bluebook grading
X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=8d89eefb925e84888db493fe774d89791f8e2a1c;p=Ultimately_Untrue_Thought.git

memoir: more bluebook grading
---

diff --git a/content/drafts/standing-under-the-same-sky.md b/content/drafts/standing-under-the-same-sky.md
index af5a4d6..0cb84fa 100644
--- a/content/drafts/standing-under-the-same-sky.md
+++ b/content/drafts/standing-under-the-same-sky.md
@@ -371,7 +371,7 @@ As it happened, however, I _had_ already considered the case of spoilers as a cl

It seemed like the rationale for avoiding spoilers of movie plots or homework exercises had to do with the outcome being different if you got spoiled: you have a different æsthetic experience if you experience the plot twist in the 90th minute of the movie rather than the fourth paragraph of the _Wikipedia_ article. Dath ilan's sadism/masochism coverup didn't seem to have the same structure: when I try to prove a theorem myself before looking at how the textbook says to do it, it's not because I would be _sad about the state of the world_ if I looked at the textbook; it's because the temporary ignorance of working it out myself results in a stronger state of final knowledge.

-That is, the difference between "spoilers" (sometimes useful) and "coverups" (bad) had to do with whether the ignorant person is expected to eventually uncover the hidden information, and whether the ignorant person knows that there's hidden information that they're expected to uncover. In the case of the sadism/masochism coverup (in contrast to the cases of movie spoilers or homework exercises), it seemed like neither of these conditions pertained. (Keltham knows that the Keepers are keeping secrets, but he seems to actively have beliefs about human psychology that imply masochism is implausible; it seems more like he has a false map, rather than a blank spot on his map for the answer to the homework exercise to be filled in.) I thought that was morally relevant.
+That is, the difference between "spoiler protections" (sometimes useful) and "coverups" (bad) had to do with whether the ignorant person is expected to eventually uncover the hidden information, and whether the ignorant person knows that there's hidden information that they're expected to uncover. In the case of the sadism/masochism coverup (in contrast to the cases of movie spoilers or homework exercises), it seemed like neither of these conditions pertained. (Keltham knows that the Keepers are keeping secrets, but he seems to actively have beliefs about human psychology that imply masochism is implausible; it seems more like he has a false map, rather than a blank spot on his map for the answer to the homework exercise to be filled in.) I thought that was morally relevant.

(Additionally, I would have hoped that my two previous mentions in the thread of supporting keeping nuclear, bioweapon, and AI secrets should have already made it clear that I wasn't against _all_ cases of Society hiding information, but to further demonstrate my ability to generate counterexamples, I mentioned that I would also admit _threats_ as a class of legitimate infohazard: if I'm not a perfect decision theorist, I'm better off if Tony Soprano just doesn't have my email address to begin with, if I don't trust myself to calculate when I "should" ignore his demands.)

@@ -478,40 +478,54 @@ I tried to explain that my third answer wasn't _just_ doubling down on my previo

He had me there. I had no more excuses after that: I had apparently failed the test.
I was feeling pretty glum about this, and lamented my poor performance in the `#drama` channel of another Discord server (that Yudkowsky was also a member of). I had thought I was doing okay—I definitely _didn't_ say, "That's impossible because Big Yud and Linta are lying liars who hate Truth", and there were reasons why my Original Seeing answer made sense _to me_ as a thing to say, but _that wasn't what I was being tested on_. It _genuinely_ looked bad in context. I had failed in [my ambition to know how it looks](/2022/context-is-for-queens/#knowing-how-that-looks).

-I think Yudkowsky saw the #drama messages (he left an emoji-reaction in the relevant timespan of messages) and took pity on me.
+I think Yudkowsky saw the #drama messages in the other server (he left an emoji-reaction in the relevant timespan of messages) and took pity on me.

(Negative feedback from a teacher is kinder than the teacher not even deigning to grade your assignment at all.)

-[TODO: summarize teacher feedback]
+As examples of the kind of thing he was looking for, he cited Keltham letting Carissa wait before telling him disturbing things about Golarion, or talking himself out of taking another Owl's Wisdom or putting on a cognitive-enhancement headband on account of his squeamishness about mind-altering interventions. If Keltham had been more proactive about seeking knowledge, he could have uncovered the Conspiracy earlier; the universe punished his cowardice. Or consider Peranza, who awakens to seeing the evil of Asmodeanism—but manages to get out a critical warning to the Good god Iomedae, and ends up being rescued from punishment in Hell; the universe rewarded her bravery. This is a big theme, Yudkowsky said; I shouldn't have had to look in weird side corners to dredge up something exotic to say; my initial answers were "really small on the scale of a story whose central conflict is that Cheliax is hiding the truth from Keltham and Asmodeus is hiding the truth from Cheliax."
-[TODO: summarize my admitting that it did have something to do with my state of mind; I would have done better by giving the 11th grade English class algorithm more compute; Peranza's username was 'not-looking-there'!; proposed revision]
+In characteristically condescending fashion, he said that he was worried about "the possibility that earthlings are only capable of hearing what the characters said to each other, because to ask what the universe thinks is some kind of direction of thought that Twitter has trained out of them", and hoped that readers don't "come away with the wordless sense of the universe being a place that rewards you for not looking places."
+
+Regarding the intended exam answers about the universe's treatment of Keltham and Peranza—fair enough; I'll acknowledge that I didn't do great on the literary exam as assigned. Other participants in the chatroom, and readers of this memoir, _should_ correspondingly update their beliefs about my competence. When I tried to do Original Seeing about what the universe of _Planecrash_ was saying, it came out in a particular _shape_ (characteristic of my recent preoccupations), and a more powerful mind would be able to do different shapes; I could protest that the prompts didn't do enough to steer me away from that (the use of the second person in "as you, yourself, see that virtue" and "your Most Important Issues" keeping me anchored on my own concerns), but that would be too much excuse-making for a mediocre exam performance.
+
+(Peranza's pre-awakening username[^glowfic-username] was 'not-looking-there'! The 11th-grade English class algorithm probably would have gotten there if I had just given it more compute, instead of running with my philosophy insight!)
+
+[^glowfic-username]: "Glowfic" stories were originally hosted on Dreamwidth (a LiveJournal clone), with each character's dialogue and actions being posted from "their own" account (and therefore their own username, typically distinct from the character's own name). When the bespoke _glowfic.com_ website launched, the convention of characters having usernames was retained.
+
+On the other hand, however poorly my exam performance reflected on other people's estimates of my competence and the question of whether Yudkowsky should consider my criticisms of dath ilan as coming from a "peer"—it still doesn't invalidate my criticisms of dath ilan, which can, still, be evaluated on their own merits!
+
+(Was I a fool to so submissively agree to be tested, given that Yudkowsky could predictably find some grounds to dismiss me as a mere earthling? Should I have tried to negotiate—I'm happy to take your test, but only if _you_ reply to my argument that spoiler protections are morally different from coverups?)
+
+The universe of _Planecrash_ (like [almost all](https://en.wikipedia.org/wiki/Almost_all) universes) doesn't itself reward you for not looking places. But dath ilan as a Society _absolutely_ punishes you for looking places _if you expect to tell anyone about it_.[^punishment]
+
+[^punishment]: I mean "punish" in a colloquial sense: just that there are things most dath ilani get to do, like living in most cities, that my analogue in dath ilan wouldn't be allowed to do on account of his tendency to shout truths from street corners. I understand that there's a decision-theoretic sense in which this doesn't count as a "punishment", because dath ilan is only trying to advance its _own_ interests in preventing the spread of what it considers infohazards; the "punishment" makes sense for them whether or not I change my policy in response to it.
+
+Yudkowsky added that he wished he had paid more attention to my re-framing, where "[he] said 'valorizes truth' and [I] repeated back 'valorizes truth-telling'". I pointed out that I had marked that as a proposed revision; I thought I was proposing a change rather than repeating. But maybe you don't get to propose changes when someone is testing you. He then gave a nice speech (in the style of C. S. Lewis's _The Screwtape Letters_) about the dangers of focusing on truth-telling:

> so if you have an awareness of you in how people can be broken, where it's possible to redirect them into infinite loops, how they can be induced to press the anger button over and over, then you can perhaps see how somebody setting out to break Zack Davis would get him to focus on truth-telling rather than truth-seeking. for the Way of searching out truth within yourself is one of calm, balance, questioning not 'what society tells you' but also your own thoughts, and also sometimes answering those questions and moving on to different ones; the operation, not of firmly rooting your feet, nor finding somewhere to hover forever uncertainly in place and immovable in that proud ignorance, but of picking up your feet and putting them back down, over and over, the uncomfortable operation of not staying in the same mental place, which most people find some way or another to reject. it valorizes calm, and balance, and these are not useful states of mind to people who would like you frantically doing something useful to them.

> when you get somebody to turn outward and away from Reality and towards their fellow monkeys and focus on truth-telling, then, their fellow monkeys being imperfect, there will always be something in which to explode into fury; so this is a useful state of mind to inculcate in somebody, lending itself to constant outrage at a world where somebody has once said a thing that if you look at it hard could be misleading or teach the wrong lesson, it misled you, how dare they!
> so by all means if you would like to destroy a rationalist, teach them anger and focus it on others' imperfect conformance to the principles they were once taught to hold dear to themselves

-> see also C. S. Lewis, "The Screwtape Letters", which I keep thinking I ought to rewrite and have an unfinished draft in whose voice I was speaking
-> now, this is not to say that you should not notice when other people are falling down about things
-> nor that you should not speak of it when you notice

-... and you know, that was a fair criticism of me. It _is_ unhealthy to focus on other people's errors rather than perfecting oneself! I'm optimistic about rectifying this after I've gotten this Whole Dumb Story out of my system—to retire from this odious chore of criticizing "the community", and just go directly do the thing that I thought "the community" was for. (In the time we have left.)
+... and you know, that's a fair criticism of me. It _is_ unhealthy to focus on other people's errors rather than perfecting oneself! I'm optimistic about rectifying this after I've gotten this Whole Dumb Story out of my system—to retire from this distasteful chore of criticizing Yudkowsky and "the community", and just go directly do the thing that I thought "the community" was for, in the time we have left.

-But as I pointed out, it was significant that the particular problem to which my Art had been shaped (in some ways) and misshaped (in others) wasn't just a matter of people being imperfect. Someone at the 2021 Event Horizon Independence Day party had told me that people couldn't respond to my arguments because of the obvious political incentives. And so, the angry question I wanted to ask, since I didn't immediately know how to rephrase it to not be doing the angry monkey thing, was, did Yudkowsky think I was supposed to _take that lying down?_
+But, as I pointed out, it was significant that the particular problem to which my Art had been shaped (in some ways) and misshaped (in others) wasn't just a matter of people being imperfect. Someone at the 2021 Event Horizon Independence Day party had told me that people couldn't respond to my arguments because of the obvious political incentives. And so, the angry question I wanted to ask, since I didn't immediately know how to rephrase it to not be doing the angry monkey thing, was, did Yudkowsky think I was supposed to _take that lying down?_

Apparently, yes:

-**Eliezer** — 12/17/2022 5:50 PM
+**Eliezer** — 12/17/2022 5:50 PM

you sure are supposed to not get angry at the people who didn't create those political punishments
that's insane
they're living in Cheliax and you want them to behave like they're not in Cheliax and get arrested by the Church
your issue is with Asmodeus. take it to Him, and if you can't take Him down then don't blame others who can't do that either.

-Admirably explicit.
+Admirably explicit! If he were that frank all the time, I wouldn't actually have had a problem with him. (I don't expect people to pay arbitrary costs to defy their political incentives; my problem with the "hill of meaning in defense of validity" and "simplest and best protocol" performances was precisely that they were _pretending not to be political statements_; if we can be clear about the _existence_ of the Asmodean elephant in the room listening to everything we say, I don't blame anyone for not saying anything else that the elephant would report to its superiors.)
-[TODO: Yudkowsky's story: the story is about Keltham trusting Cheliax wrongly; leaving that part out is politicized; other commenters pick up on "But you're still saying to trust awesome institutions"]
+[TODO: Arete points out that the story isn't saying not to outsource your reasoning; other commenters debate; Yudkowsky's story: the story is about Keltham trusting Cheliax wrongly; leaving that part out is politicized; other commenters pick up on "But you're still saying to trust awesome institutions"; this might belong earlier in "logical time" than the first "take this lying down" mention, even if it was chronologically later?]

[TODO: I think there's a bit of question-substitution going on; the reason the virtue of evenness is important is because if you only count arguments for and not against the hypothesis, you mess up your beliefs about the hypothesis; if you substitute a different question "Is Yudkowsky bad?"/"Am I a good coder?", that's a bucket error—or was he "correctly" sensing that the real question was "Is Yudkowsky bad?"]

[TODO: I express my fully-updated grievance (this doesn't seem to be in the transcript I saved??); I hadn't consciously steered the conversation this way, but the conversation _bounced_ in a way that made it on-topic; that's technically not my fault, even if the elephant in my brain was optimizing for this outcome.

-The fact that Yudkowsky had been replying to me at length—explaining why my literary criticism was nuts, but in a way that respected my humanity and expected me to be able to hear it—implied that I was apparently in his "I can cheaply save him (from crazy people like Michael)" bucket, rather than the "AI timelines and therefore life is too short" bucket.]
+The fact that Yudkowsky had been replying to me at length—explaining why my literary criticism was nuts, but in a way that respected my humanity and expected me to be able to hear it—implied that I was apparently in his "I can cheaply save him (from crazy people like Michael Vassar)" bucket, rather than the "AI timelines and therefore life is too short" bucket.]
+
It was disappointing that Yudkowsky's reaction to my complaints was (verbatim!) "that's insane", rather than something more like, "OK, I totally see why you see this as a betrayal, but unfortunately for you, I don't actually consider myself bound by what you thought I was promising."