From: M. Taylor Saotome-Westlake
Date: Sat, 5 Nov 2022 05:38:26 +0000 (-0700)
Subject: poke and bullet?
X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=b7906fb0f95f76c2dffaeaac76e14af21b6ef97f;p=Ultimately_Untrue_Thought.git

poke and bullet?
---

diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
index 130db9c..2d6fd14 100644
--- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
+++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
@@ -199,13 +199,13 @@ Again, I realize this must seem weird and cultish to any normal people reading t

Anna didn't reply, but I apparently did interest Michael, who chimed in on the email thread to Yudkowsky. We had a long phone conversation the next day lamenting how the "rationalists" were dead as an intellectual community.

-As for the attempt to intervene on Yudkowsky—here I need to make a digression about the constraints I'm facing in telling this Whole Dumb Story. _I_ would prefer to just tell the Whole Dumb Story as I would to my long-neglected Diary—trying my best at the difficult task of explaining _what actually happened_ in a very important part of my life, without thought of concealing anything.
+As for the attempt to intervene on Yudkowsky—here I need to make a digression about the constraints I'm facing in telling this Whole Dumb Story. _I_ would prefer to just tell this Whole Dumb Story as I would to my long-neglected Diary—trying my best at the difficult task of explaining _what actually happened_ in a very important part of my life, without thought of concealing anything.

(If you are silent about your pain, _they'll kill you and say you enjoyed it_.)

Unfortunately, a lot of _other people_ seem to have strong intuitions about "privacy", which bizarrely impose constraints on what _I'm_ allowed to say about my own life: in particular, it's considered unacceptable to publicly quote or summarize someone's emails from a conversation that they had reason to expect to be private. I feel obligated to comply with these widely-held privacy norms, even if _I_ think they're paranoid and [anti-social](http://benjaminrosshoffman.com/blackmailers-are-privateers-in-the-war-on-hypocrisy/).

-So I would _think_ that the commonsense privacy-norm-compliance rule I should hold myself to while telling this Whole Dumb Story is that I obviously have an inalienable right to blog about _my own_ actions, but that I'm not allowed to refer to private conversations in cases where I don't think I'd be able to get the consent of the other party. (I don't think I'm required to go through the ritual of asking for consent in cases where the revealed information couldn't reasonably be considered "sensitive", or if I know the person doesn't have hangups about this weird "privacy" thing.) In this case, I'm allowed to talk about _me_ emailing Yudkowsky (because that was _my_ action), but I'm not allowed to talk about anything he might have said in reply, or whether he replied.
+So I would _think_ that the commonsense privacy-norm-compliance rule I should hold myself to while telling this Whole Dumb Story is that I obviously have an inalienable right to blog about _my own_ actions, but that I'm not allowed to directly refer to private conversations in cases where I don't think I'd be able to get the consent of the other party.
(I don't think I'm required to go through the ritual of asking for consent in cases where the revealed information couldn't reasonably be considered "sensitive", or if I know the person doesn't have hangups about this weird "privacy" thing.) In this case, I'm allowed to talk about _me_ emailing Yudkowsky (because that was _my_ action), but I'm not allowed to talk about anything he might have said in reply, or whether he replied.

Unfortunately, there's a potentially serious loophole in the commonsense rule: what if some of my actions (which I would have _hoped_ to have an inalienable right to blog about) _depend on_ content from private conversations? You can't, in general, only reveal one side of a conversation.

@@ -1101,7 +1101,10 @@ I put the question to a few friends (Subject: "rubber duck philosophy"), and Jes

------

-[TODO: blowing up at a stray remark; robot cult to stop tricking me]
+[TODO:
+ * Yudkowsky made a stray remark about social media causing people to say crazy things
+
+]

[TODO: "out of patience" email]

diff --git a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md
index 02d2af6..48e5f33 100644
--- a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md
+++ b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md
@@ -132,7 +132,7 @@ Furthermore, the claim that only I "would have said anything where you could hea

The "where you could hear it" clause is _particularly_ bizarre—as if Yudkowsky takes it as an unexamined and unproblematic assumption that people in "the community" _don't read widely_. It's gratifying to be acknowledged by my caliph—or it would be, if he were still my caliph—but I don't think the points I've been making about the relevance of autogynephilia to transgender identity and the reality of biological sex (!) are particularly novel. I think I _am_ unusual in the amount of analytical rigor I can bring to bear on these topics: when people like Kathleen Stock or Corrina Cohn or Aaron Terrell make similar points, they don't have the background to formulate it [in the language of probabilistic graphical models](/2022/Jul/the-two-type-taxonomy-is-a-useful-approximation-for-a-more-detailed-causal-model/). _That_ part is a genuine value-add of the "rationalist" memeplex—something I wouldn't have been able to do without the influence of Yudkowsky's Sequences, and all the math books I studied afterwards because the vibe of the _Overcoming Bias_ comment section made that sound like an important and high-status thing to do.

-But the promise of Sequences was in offering a discipline of thought that could be _applied to_ everything else you would have read and thought about anyway. This notion that if someone in "the community" (such as it was) didn't say something, then I _wouldn't be able to hear it_ (?!?), would be absurd: _Overcoming Bias_ was a gem of the blogoshere, but not a substitute for the rest of it. (Nor was the blogosphere a substitute for the University library, which escaped the autodidact's resentment of the tyranny of schools by selling borrowing privileges to the public for $100 a year.)
To the extent that the Yudkowsky of the current year doesn't expect his followers to be able to pick up on obvious points when someone like Kathleen Stock or Corrina Cohn or Aaron Terrell says it, he should notice that he's running a mere cult or fandom rather than anything one would want to dignify by calling it an intellectual community.
+But the promise of the Sequences was in offering a discipline of thought that could be _applied to_ everything else you would have read and thought about anyway. This notion that if someone in "the community" (such as it was) didn't say something, then Yudkowsky's faithful students therefore _wouldn't be able to hear it_ (?!?), would be absurd: _Overcoming Bias_ was a gem of the blogosphere, but not a substitute for the rest of it. (Nor was the blogosphere a substitute for the University library, which escaped the autodidact's resentment of the tyranny of schools by selling borrowing privileges to the public for $100 a year.) To the extent that the Yudkowsky of the current year doesn't expect his followers to be able to pick up on obvious points when someone like Kathleen Stock or Corrina Cohn or Aaron Terrell says it, he should notice that he's running a mere cult or fandom rather than anything one would want to dignify by calling it an intellectual community.

Yudkowsky's disclaimer comment mentions "speakable and unspeakable arguments"—but what, one wonders, is the boundary of the "speakable"? In response to a commenter mentioning the cost of having to remember pronouns as a potential counterargument, Yudkowsky [offers us another clue](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228&reply_comment_id=10159421871809228):

@@ -378,7 +378,14 @@ https://twitter.com/ESYudkowsky/status/1096769579362115584

]

-[TODO section existential stakes, cooperation]
+[TODO section existential stakes, cooperation
+ * so far, I've been writing this from the perspective of _rationalit
+
+
+
+
+
+]

> [_Perhaps_, replied the cold logic](https://www.yudkowsky.net/other/fiction/the-sword-of-good). _If the world were at stake._
>
@@ -386,6 +393,8 @@ https://twitter.com/ESYudkowsky/status/1096769579362115584

[TODO: social justice and defying threats

+ * back in 'aught-nine, SingInst had made a point of prosecuting Tyler Emerson
+
at least Sabbatai Zevi had an excuse: his choices were to convert to Islam or be impaled https://en.wikipedia.org/wiki/Sabbatai_Zevi#Conversion_to_Islam
]

@@ -414,8 +423,13 @@ I mean, I wouldn't _call_ it a "dark conspiracy" exactly, but if the people with

]

-[TODO: sneering at post-rats; David Xu interprets criticism of Eliezer as me going "full post-rat"?! 6 September 2021
+[TODO:
+ * It wouldn't be so bad if he weren't trying to sell himself as a religious leader, and profiting from the conflation of rationalist-someone-who-cares-about-reasoning, and rationalist-member-of-robot-cult
+ * But he does, in fact, seem to actively encourage this conflation (contrast to how the Sequences had a litany against gurus)
+ * a specific example that made me very angry in September 2021
+ * the fact that David Xu interpreted criticism of the robot cult as me going "full post-rat" suggests that Yudkowsky's framing had spilled onto others
+sneering at post-rats; David Xu interprets criticism of Eliezer as me going "full post-rat"?!
6 September 2021
https://twitter.com/ESYudkowsky/status/1434906470248636419
> Anyways, Scott, this is just the usual division of labor in our caliphate: we're both always right, but you cater to the crowd that wants to hear it from somebody too modest to admit that, and I cater to the crowd that wants somebody out of that closet.

@@ -521,7 +535,7 @@ I don't doubt Yudkowsky could come up with some clever casuistry why, _technical

On the offhand chance that Eliezer Yudkowsky happens to be reading this—if someone _he_ trusts (MIRI employees?) genuinely thinks it would be good for the lightcone to bring this paragraph to his attention—he should know that if he _wanted_ to win back _some_ of the trust and respect he's lost from me and everyone I can influence—not _all_ of it, but _some_ of it[^some-of-it]—I think it would be really easy. All he would have to do is come clean about the things he's _already_ misled people about.

-[^some-of-it]: Coming clean _after_ someone writes a 70,000 word memoir explaining how dishonest you've been, engenders less trust than coming clean spontenously of your own accord.
+[^some-of-it]: Coming clean _after_ someone writes an 80,000-word memoir explaining how dishonest you've been engenders less trust than coming clean spontaneously of your own accord.

I don't, actually, expect people to spontaneously blurt out everything they believe to be true, that Stalin would find offensive. "No comment" would be fine. Even selective argumentation that's _clearly labeled as such_ would be fine. (There's no shame in being an honest specialist who says, "I've mostly thought about these issues through the lens of ideology _X_, and therefore can't claim to be comprehensive; if you want other perspectives, you'll have to read other authors and think it through for yourself.")

@@ -543,10 +557,12 @@ Again, that's the administrator of Yudkowsky's _own website_ saying that he's de

... but I'm not, holding my breath. If Yudkowsky _wants_ to reply—if he _wants_ to try to win back some of the trust and respect he's lost from me—he's totally _welcome_ to. (_I_ don't censor my comment sections of people whom it "looks like it would be unhedonic to spend time interacting with".)

- [TODO: I've given up talking to the guy (nearest unblocked strategy sniping in Eliezerfic doesn't count), my last email, giving up on hero-worship I don't want to waste any more of his time. I owe him that much.]

-[TODO: https://www.lesswrong.com/posts/MnFqyPLqbiKL8nSR7/my-experience-at-and-around-miri-and-cfar-inspired-by-zoe ]
+[TODO: https://www.lesswrong.com/posts/MnFqyPLqbiKL8nSR7/my-experience-at-and-around-miri-and-cfar-inspired-by-zoe
+ * when Jessica published her story, the karma took a nosedive when Scott commented blaming all of Jessica's problems on Michael, and Yudkowsky backed up Scott; to me, this looks like raw factional conflict: Jessica had some negative-valence things to say about the Caliphate, so Caliphate leaders moved in to discredit her by association.
+ * (extract points from my conversation about Michael with Scott)
+]

[TODO: the Death With Dignity era

"Death With Dignity" isn't really an update; he used to refuse to give a probability, and now he says the probability is ~0

]

-[TODO: regrets and wasted time]
+[TODO:
+ * I wrote to him asking if he cared if I said negative things about him, that it would be easier if he wouldn't hold it against me, and explained my understanding of the privacy norm
+ * in retrospect, I was wrong to ask that. I _do_ hold it against him.
And if I'm entitled to my feelings, isn't he entitled to his?
+
+like a crazy ex-girlfriend (["I have no underlying issues to address / I'm certifiably cute, and adorably obsessed"](https://www.youtube.com/watch?v=UMHz6FiRzS8))
+]
+
+[TODO: regrets and wasted time
+ * Do I have regrets about this Whole Dumb Story? A lot, surely—it's been a lot of wasted time. But it's also hard to say what I should have done differently; I could have listened to Ben more and lost faith in Yudkowsky earlier, but he had earned a lot of benefit of the doubt?
+
+]
diff --git a/notes/memoir-sections.md b/notes/memoir-sections.md
index f137eb1..035e123 100644
--- a/notes/memoir-sections.md
+++ b/notes/memoir-sections.md
@@ -19,6 +19,7 @@ With internet available—
_ university library
_ Aaron Terrell and Corina Cohn
_ Eddie Izzard
+_ more examples of Yudkowsky's arrogance (MIRI dialogues, knew how to say anything at all)
_ rational-ist/physics-ist analogy (and link to Yudkowsky Tweet where I think I got this from)
_ "linking to K-complicity in my replies", link the specific reply
_ Wentworth on mutual information being on the right track?
@@ -37,7 +38,7 @@ _ tussle with Ruby on "Causal vs. Social Reality"
_ examples of snarky comments about "the rationalists"
_ 13th century word meanings
_ weirdly hostile comments on "... Boundaries?"
-_ more examples of Yudkowsky's arrogance
+

far editing tier—

@@ -848,8 +849,6 @@ and Keltham tells Carissa (null action pg 39) to keep the Light alive as long as

> It, it—the fe—it, flame—flames. Flames—on the side of my face. Breathing—breathl—heaving breaths, heaving—

-like a crazy ex-girlfriend (["I have no underlying issues to address / I'm certifiably cute, and adorably obsessed"](https://www.youtube.com/watch?v=UMHz6FiRzS8))
-
But he is willing to go to bat for killing babies, but not for "Biological Sex is Actually Real Even If That Hurts Your Feelings"
https://mobile.twitter.com/AuronMacintyre/status/1547562974927134732
https://extropians.weidai.com/extropians.3Q97/4361.html

@@ -1605,6 +1604,15 @@ And I think I _would_ have been over it, except—

-----

-I'm available if you want to contest anything you think is unfair or challenge my interpretation of the "can't directly refer to private conversations" privacy norm, but I doubt it's a good use of your time.
+FYI, I think this turned out significantly harsher on you than my January 2022 emails made it sound, thus occasioning this one additional email (because I want to be very sure I'm only attacking you and not betraying you; a true friend stabs you in the front).
+
+I'm planning on publishing the drafts linked below on [dates].
+
+ * "Blanchard's Dangerous Idea and the Plight of the Lucid Crossdreamer" (18K words)
+ * "A Hill of Validity in Defense of Meaning" (46K words)
+ * "Agreeing with Stalin in Ways That Exhibit Generally Rationalist Principles" (19K words)
+ * **"Why I Don't Trust Eliezer Yudkowsky's Intellectual Honesty"** (1.5K word summary of the parts of the Whole Dumb Story that are specifically an attack on your reputation)
+
+I'm available if you want to contest anything you think is unfair or challenge my interpretation of the "can't directly refer to private conversations" privacy norm, but I doubt it's a high-value use of your time. End transmission.
-(_I've_ given up on you; my intent is to explain to _everyone else_ why you're not worth it.)
+----
\ No newline at end of file
diff --git a/notes/notes.txt b/notes/notes.txt
index 725f3a1..0ece53d 100644
--- a/notes/notes.txt
+++ b/notes/notes.txt
@@ -3202,3 +3202,7 @@ https://discord.com/channels/401181628015050773/538097598008131594/1037807776065

... it's not surprising that attraction to masculine-presenting women (females, a.f.a.b.s) would fail to generalize to masculine-presenting men (males, a.m.a.b.s)? Biological sex actually exists?

----
+
+don't like it when science-fictional or fantasy characters are held up as trans icons (like Jadzia Dax, or when Janet in _The Good Place_ says "Not a girl") when the character has an in-universe rationale for holding beyond-the-binary status (if Trill symbionts have sexual dimorphism, it doesn't need to match their host; Janet is a heavenly robot-analogue), and real-life trans people and enbies do not
+
+