From: M. Taylor Saotome-Westlake
Date: Mon, 8 Aug 2022 01:58:17 +0000 (-0700)
Subject: memoir: for the uniform; outlining
X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=2a7eb34b454eebb94ab56d6bac13e348a396a691;p=Ultimately_Untrue_Thought.git

memoir: for the uniform; outlining
---

diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
index 17c3192..30abb64 100644
--- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
+++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
@@ -269,9 +269,24 @@ Well, you're still somewhat better off listening to them than the whistling of t
  * I know you're very busy; I know your work's important—but it might be a useful exercise? Just for a minute, to think of what you would actually say if someone with social power _actually did this to you_ when you were trying to use language to reason about Something you had to Protect?
 ]
 
+Without disclosing any _specific content_ from private conversations with Yudkowsky that may or may not have happened, I think I am allowed to say that our posse did not get the kind of engagement from him that we were hoping for. (That is, I'm Glomarizing over whether Yudkowsky just didn't reply, or whether he did reply and our posse was not satisfied with the response.) Michael said that it seemed important that, if we thought Yudkowsky wasn't interested, we should have common knowledge among ourselves that we consider him to be choosing to be a cult leader.
+
+[TODO SECTION: relying on Michael too much; I'm not crazy
+ * This may have been less effective than it was in my head; I _remembered_ Michael as being high-status
+ * "I should have noticed earlier that my emotional dependence on "Michael says X" validation is self-undermining, because Michael says that the thing that makes me valuable is my ability to think independently."
+ * fairly destructive move
+* _Everyone got it wrong_. There was a comment on /r/slatestarcodex the other week that cited Scott, Eliezer, Ozy, Kelsey, and Rob as leaders of the rationalist movement. https://www.reddit.com/r/slatestarcodex/comments/anvwr8/experts_in_any_given_field_how_would_you_say_the/eg1ga9a/
+"We ... we had a whole Sequence about this. Didn't we? And, and ... [_you_ were there](https://tvtropes.org/pmwiki/pmwiki.php/Main/AndYouWereThere), and _you_ were there ... It—really happened, right? I didn't just imagine it? The [hyperlinks](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong) [still](https://www.lesswrong.com/posts/d5NyJ2Lf6N22AD9PB/where-to-draw-the-boundary) [work](https://www.lesswrong.com/posts/yLcuygFfMfrfK8KjF/mutual-information-and-density-in-thingspace) ..."
+]
 
-Without disclosing any _specific content_ from private conversations with Yudkowsky that may or may not have happened, I think I am allowed to say that our posse did not get the kind of engagement from him that we were hoping for. (That is, I'm Glomarizing over whether Yudkowsky just didn't reply, or whether he did reply and our posse was not satisfied with the response.) Michael said that it seemed important that, if we thought Yudkowsky wasn't interested, we should have common knowledge among ourselves that we consider him to be choosing to be a cult leader.
+[TODO: Anna Michael feud
+ * Anna's 2 Mar comment badmouthing Michael
+ * my immediate response: I strongly agree with your point about "ridicule of obviously-fallacious reasoning plays an important role in discerning which thinkers can (or can't) help fill these functions"! That's why I'm so heartbroken about the "categories are arbitrary, therefore trans women are women" thing, which deserves to be laughed out of the room.
+ * Anna's case against Michael: he was talking to Devi even when Devi needed a break, and he wanted to destroy EA
+ * I remember at a party in 2015ish, asking Michael what else I should invest my money in, if not New Harvest/GiveWell, and his response was, "You"
+ * backstory of anti-EA sentiment: Ben's critiques, Sarah's "EA Has a Lying Problem"—Michael had been in the background
+]
 
 Meanwhile, my email thread with Scott got started back up, although I wasn't expecting anything to come out of it. I expressed some regret that all the times I had emailed him over the past couple years had been when I was upset about something (like psych hospitals, or—something else) and wanted something from him, which was bad, because it was treating him as a means rather than an end—and then, despite that regret, continued prosecuting the argument.
 
@@ -309,30 +324,56 @@ Lying down didn't work. So at 5:26 _a.m._, I sent an email to Scott cc my posse
 
 [TODO: Michael jumps in to help, I rebuff him, Michael says WTF and calls me, I take a train home, Alicorn visits with her son—I mean, her son at the time]
 
-[TODO: proton concession
- * 
+(Incidentally, the code that I wrote intermittently between 11 _p.m._ and 4 _a.m._ was a horrible mess, and the company has been paying for it ever since: every time someone needs to modify that function, they find it harder to navigate than it would be if I had been less emotionally overwhelmed in March 2019 and had written something sane instead.)
 
-]
+I think at some level, I wanted Scott to know how frustrated I was about his use of "mental health for trans people" as an Absolute Denial Macro. But then when Michael started advocating on my behalf, I started to minimize my claims because I had a generalized attitude of not wanting to sell myself as a victim. (Michael seemed to have a theory that people will only change their bad behavior when they see a victim who is being harmed.)
-[TODO SECTION: relying on Michael too much; I'm not crazy
- * This may have been less effective than it was in my head; I _remembered_ Michael as being high-status
- * "I should have noticed earlier that my emotional dependence on "Michael says X" validation is self-undermining, because Michael says that the thing that makes me valuable is my ability to think independently."
- * fairly destructive move
-* _Everyone got it wrong_. there was a comment on /r/slatestarcodex the other week that cited Scott, Eliezer, Ozy, Kelsey, and Rob as leaders of rationalist movement. https://www.reddit.com/r/slatestarcodex/comments/anvwr8/experts_in_any_given_field_how_would_you_say_the/eg1ga9a/
+I supposed that, in Michael's worldview, aggression is more honest than passive-aggression. That seemed obviously true, but I was psychologically limited in how much aggression I was willing to deploy against my friends. (And particularly Yudkowsky, whom I still hero-worshipped.) But clearly, the tension between "I don't want to do too much social aggression" and "losing the Category War within the rationalist community is _absolutely unacceptable_" was causing me to make wildly inconsistent decisions. (Emailing Scott at 4 a.m., and then calling Michael "aggressive" when he came to defend me, was just crazy.)
 
-"We ... we had a whole Sequence about this. Didn't we? And, and ... [_you_ were there](https://tvtropes.org/pmwiki/pmwiki.php/Main/AndYouWereThere), and _you_ were there ... It—really happened, right? I didn't just imagine it? The [hyperlinks](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong) [still](https://www.lesswrong.com/posts/d5NyJ2Lf6N22AD9PB/where-to-draw-the-boundary) [work](https://www.lesswrong.com/posts/yLcuygFfMfrfK8KjF/mutual-information-and-density-in-thingspace) ..."
+Was the answer just that I needed to accept that there wasn't such a thing in the world as a "rationalist community"? (Sarah had told me as much two years ago, at BABSCon, and I just hadn't made the corresponding mental adjustments.)
+
+On the other hand, a possible reason to be attached to the "rationalist" brand name and social identity that wasn't just me being stupid was that _the way I talk_ had been trained really hard on this subculture for _ten years_. Most of my emails during this whole campaign had contained multiple Sequences or _Slate Star Codex_ links that I could just expect people to have read. I could spontaneously use the phrase "Absolute Denial Macro" in conversation and expect to be understood. That's a massive "home field advantage." If I just gave up on "rationalists" being a thing, and went out into the world to make intellectual friends elsewhere (by making friends with _Quillette_ readers or arbitrary University of Chicago graduates), then I would lose all that accumulated capital.
+
+The language I spoke was _mostly_ educated American English, but I relied on subculture dialect for a lot. My sister has a chemistry doctorate from MIT (and so speaks the language of STEM intellectuals generally), and when I showed her ["... To Make Predictions"](/2018/Feb/the-categories-were-made-for-man-to-make-predictions/), she reported finding it somewhat hard to read, likely because I casually use phrases like "thus, an excellent [motte](https://slatestarcodex.com/2014/11/03/all-in-all-another-brick-in-the-motte/)", and expect to be understood without the reader taking 10 minutes to read the link. That essay, which was me writing from the heart in the words that came most naturally to me, could not have been published in _Quillette_. The links and phraseology were just too context-bound.
+
+Maybe that's why I felt like I had to stand my ground and fight a culture war to preserve the world I was made in, even though the contradiction between the war effort and my general submissiveness was leading me to make crazy decisions.
+
+[TODO SECTION: proton concession
+ * as it happened, the next day, Wednesday, we got this: https://twitter.com/ESYudkowsky/status/1108277090577600512 (Why now? Maybe he saw the "tools have shattered in their hand"; maybe the Quillette article just happened to be timely)
+ * A concession! In the war frame, you'd think this would make me happy
+ * "I did you a favor by Tweeting something obliquely favorable to your object-level crusade, and you repay me by criticizing me? How dare you?!" My model of Sequences-era Eliezer-2009 would never do that, because of the species-typical arguments-as-social-exchange
+ * do you think Eliezer is thinking, "Fine, if I tweet something obliquely favorable towards Zack's object-level agenda, maybe Michael's gang will leave me alone now"
+ * If there's some other reason you suspect there might be multiple species of dysphoria, but you tell people your suspicion is because dysphoria has more than one proton, then you're still kind of misinforming them for political reasons, which is the generalized problem that we're worried about?
+ * Michael's take: not worth the digression; we need to confront the actual crisis
+ * We need to figure out how to win against bad faith arguments
]
-[TODO: Anna Michael feud
- * Anna's 2 Mar comment badmouthing Michael
- * my immediate response: I strongly agree with your point about "ridicule of obviously-fallacious reasoning plays an important role in discerning which thinkers can (or can't) help fill these functions"! That's why I'm so heartbroken about the "categories are arbitrary, therefore trans women are women" thing, which deserves to be laughed out of the room.
- * Anna's case against Michael: he was talking to Devi even when Devi needed a break, and he wanted to destroy EA
- * I remember at a party in 2015ish, asking Michael what else I should invest my money in, if not New Harvest/GiveWell, and his response was, "You"
- * backstory of anti-EA sentiment: Ben's critiques, Sarah's "EA Has a Lying Problem"—Michael had been in the background
+[TODO: Jessica joins the coalition; she tells me about her time at MIRI (link to Zoe-piggyback and Occupational Infohazards); Michael said that Jess and I together have more moral authority]
+
+[TODO: wrapping up with Scott; Kelsey; high and low Church https://slatestarcodex.com/2019/07/04/some-clarifications-on-rationalist-blogging/]
+
+[TODO: Ben reiterated that the most important thing was explaining why I've written them off; self-promotion imposes a cost on others; Jessica on creating clarity; Michael on less precise is more violent]
+
+[TODO: after some bouncing off the posse, what was originally an email draft became a public _Less Wrong_ post, "Where to Draw the Boundaries?" (note, plural)
+ * Wasn't the math overkill?
+ * math is important for appeal to principle—and as intimidation https://slatestarcodex.com/2014/08/10/getting-eulered/
+ * four simulacra levels got kicked off here
+ * no politics! just philosophy!
+ * Ben on Michael on whether we are doing politics; "friendship, supplication, and economics"
+ * I could see that I'm including subtext and expecting people to only engage with the text, but if we're not going to get into full-on gender politics on Less Wrong, while gender politics is motivating an epistemology error, I'm not sure what else I'm supposed to do! I'm pretty constrained here!
+ * I had already poisoned the well with "Blegg Mode" the other month, a bad decision
+ * We lost?! How could we lose??!!?!?
]
+------
 
-[TODO: "Blegg Mode", "Where to Draw the Boundaries?", and failure /2019/May/hiatus/ ]
+
+[TODO: I was floored; math and wellness month
+ Anna doesn't want money from me
+ scuffle on "Yes Requires the Possibility of No"
+ LessWrong FAQ https://www.lesswrong.com/posts/MqrzczdGhQCRePgqN/feedback-requested-draft-of-a-new-about-welcome-page-for#iqEEme6M2JmZEXYAk
+
+]
 
 [TODO: more blogging (https://www.lesswrong.com/posts/5aqumaym7Jd2qhDcy/containment-thread-on-the-motivation-and-political-context), 2019 Christmas party, disclaimer on "Categories Were Made"]
 
 ["Univariate fallacy" also a concession]

diff --git a/notes/post_ideas.txt b/notes/post_ideas.txt
index 10b751d..76fb779 100644
--- a/notes/post_ideas.txt
+++ b/notes/post_ideas.txt
@@ -1,16 +1,3 @@
-writing tier—
-_ fill minor psych episode §
-_ fill Culture War RIP/defending against alt-right §
-_ fill relying on Michael et al. too much §
-_ fill Anna–Michael feud §
-
-outline tier—
-_ outline proton concession §
-
-harvesting tier—
-- name more sections
-- more email harvesting
-
 editing tier—
 _ Emperor Norton ordered Hayes executed
 _ address the "maybe it's good to be called names" point from "Hill" thread
@@ -20,7 +7,6 @@ _ clarify sequence of outreach attempts
 _ clarify existence of a shadow posse member
 
-
 Urgent/needed for healing—
 _ I Am Not Great With Secrets (aAL)
 _ Reply to Scott Alexander on Autogenderphilia