I think commonsense privacy-norm-adherence intuitions actually say _No_ here: the text of Alice's messages makes it too easy to guess that sometime between 5 and 6, Bob probably said that he couldn't come to the party because he has gout. It would seem that Alice's right to talk about her own actions in her own life _does_ need to take into account some commonsense judgement about whether that leaks "sensitive" information about Bob.
-In part of the Dumb Story that follows, I'm going to describe several times when I others emailed Yudkowsky to try to argue with what he said in public, without telling whether Yudkowsky replied, or what he might have said if he did reply. I maintain that I'm within my rights here, because I think commonsense judgement will agree that me talking about the arguments _I_ made, doesn't leak any sensitive information about the other side of a conversation that may or may not have happened: the story comes off about the same whether Yudkowsky didn't reply at all, or whether he replied in a way that I found sufficiently unsatisfying as to merit the futher emails with followup arguments that I describe. (Talking about later emails _does_ rule out the possible world where Yudkowsky had said, "Please stop contacting me," because I would have respected that, but the fact that he didn't say that isn't "sensitive": you probably don't reply to spammers demanding your precious time, either.)
+In part of the Dumb Story that follows, I'm going to describe several times when I and others emailed Yudkowsky to try to argue with what he said in public, without telling whether Yudkowsky replied, or what he might have said if he did reply. I maintain that I'm within my rights here, because I think commonsense judgement will agree that my talking about the arguments _I_ made does not in this case leak any sensitive information about the other side of a conversation that may or may not have happened: the story comes off about the same whether Yudkowsky didn't reply at all, or whether he replied in a way that I found sufficiently unsatisfying as to merit the further emails with follow-up arguments that I describe. (Talking about later emails _does_ rule out the possible world where Yudkowsky had said, "Please stop contacting me," because I would have respected that, but the fact that he didn't say that isn't "sensitive".)
-It seems particularly important to lay out these principles of adherence to privacy norms in connection to my attempts to contact Yudkowsky, because part of what I'm trying to accomplish in telling this Whole Dumb Story is to deal reputational damage to Yudkowsky, which I claim is deserved. (We want reputations to track reality. If you see Carol exhibiting a pattern of intellectual dishonesty, and she keeps doing it even after you try talking to her about it privately, you might want to write a blog post describing the pattern in detail—not to _hurt_ Carol, particularly, but so that everyone else can make higher-quality decisions about whether they should believe the things that Carol says.)
-
-In that context, it seems right that I only try to hang Yudkowsky with the rope of what he said in public, where you can click the links and read the context for yourself. In the Dumb Story that follows, I _also_ describe some of my correspondence with Scott Alexander, but that doesn't seem sensitive in the same way, because I'm not particularly trying to deal reputational damage to Scott. (Not because Scott performed well, but because I didn't really _expect_ Scott to perform well in this situation.)
+It seems particularly important to lay out these principles of adherence to privacy norms in connection to my attempts to contact Yudkowsky, because part of what I'm trying to accomplish in telling this Whole Dumb Story is to deal reputational damage to Yudkowsky, which I claim is deserved. (We want reputations to track reality. If you see Carol exhibiting a pattern of intellectual dishonesty, and she keeps doing it even after you try talking to her about it privately, you might want to write a blog post describing the pattern in detail—not to _hurt_ Carol, particularly, but so that everyone else can make higher-quality decisions about whether they should believe the things that Carol says.) Given that motivation of mine, it seems important that I only try to hang Yudkowsky with the rope of what he said in public, where you can click the links and read the context for yourself. In the Dumb Story that follows, I _also_ describe some of my correspondence with Scott Alexander, but that doesn't seem sensitive in the same way, because I'm not particularly trying to deal reputational damage to Scott. (Not because Scott performed well, but because one wouldn't really have _expected_ Scott to perform well in this situation; a public reputation-update isn't called for in the same way.)
In accordance with the privacy-norm-adherence policy just described, I don't think I should say whether Yudkowsky replied to Michael's and my emails, nor ([again](/2022/TODO/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer/#cheerful-price-privacy-constraint)) whether he accepted the cheerful price money, because any conversation that may or may not have occurred would have been private. But what I _can_ say, because it was public, is that we saw [this addition to the Twitter thread](https://twitter.com/ESYudkowsky/status/1068071036732694529):
It's true that [the reason _I_ was continuing to freak out about this](/2019/Jul/the-source-of-our-power/) to the extent of sending him this obnoxious email telling him what to write (seriously, who does that?!) had to do with transgender stuff, but that wasn't the reason _Scott_ should care.
-The other year, Alexander had written a post, ["Kolmogorov Complicity and the Parable of Lightning"](http://slatestarcodex.com/2017/10/23/kolmogorov-complicity-and-the-parable-of-lightning/), explaining the consequences of political censorship by means of an allegory about a Society with the dogma that thunder occurs before lightning. The problem isn't so much the sacred dogma itself (it's not often that you need to _directly_ make use of the fact that thunder comes first), but that the need to _defend_ the sacred dogma _destroys everyone's ability to think_.
+The other year, Alexander had written a post, ["Kolmogorov Complicity and the Parable of Lightning"](http://slatestarcodex.com/2017/10/23/kolmogorov-complicity-and-the-parable-of-lightning/), explaining the consequences of political censorship by means of an allegory about a Society with the dogma that thunder occurs before lightning. The problem isn't so much the sacred dogma itself (it's not often that you need to _directly_ make use of the fact that lightning comes first), but that the need to _defend_ the sacred dogma _destroys everyone's ability to think_.
It was the same thing here. It wasn't that I had any direct practical need to misgender anyone in particular. It still wasn't okay that trying to talk about the reality of biological sex to so-called "rationalists" got you an endless deluge of—polite! charitable! non-ostracism-threatening!—_bullshit nitpicking_. (What about [complete androgen insensitivity syndrome](https://en.wikipedia.org/wiki/Complete_androgen_insensitivity_syndrome)? Why doesn't this ludicrous misinterpretation of what you said [imply that lesbians aren't women](https://thingofthings.wordpress.com/2018/06/18/man-should-allocate-some-more-categories/)? _&c. ad infinitum_.) With enough time, I thought the nitpicks could and should be satisfactorily answered. (Any that couldn't would presumably be fatal criticisms rather than bullshit nitpicks.) But while I was in the process of continuing to write all that up, I hoped Alexander could see why I felt somewhat gaslighted.
Another woman said, "'the original thing that already exists without having to try' sounds fake to me" (to the acclaim of 4 "+1" emoji reactions).
-The problem with this kind of exchange is not that anyone is being shouted down, nor that anyone is lying. The _problem_ is that people are motivatedly, ["algorithmically"](https://www.lesswrong.com/posts/sXHQ9R5tahiaXEZhR/algorithmic-intent-a-hansonian-generalized-anti-zombie) "playing dumb." I wish we had more standard terminology for this phenomenon, which is ubiquitous in human life. By "playing dumb", I don't mean that to suggest that Kelsey was _consciously_ thinking, "I'm playing dumb in order gain an advantage in this argument". I don't doubt that, _subjectively_, mentioning that cis women also get cosmetic surgery sometimes _felt like_ a relevant reply (because I had mentioned transition technology). It's just that, in context, I was very obviously trying to talk about the natural category of "biological sex", and Kelsey could have figured that out _if she had wanted to_.
+The problem with this kind of exchange is not that anyone is being shouted down, nor that anyone is lying. The _problem_ is that people are motivatedly, ["algorithmically"](https://www.lesswrong.com/posts/sXHQ9R5tahiaXEZhR/algorithmic-intent-a-hansonian-generalized-anti-zombie) "playing dumb." I wish we had more standard terminology for this phenomenon, which is ubiquitous in human life. By "playing dumb", I don't mean to suggest that Kelsey was _consciously_ thinking, "I'm playing dumb in order to gain an advantage in this argument". I don't doubt that, _subjectively_, mentioning that cis women also get cosmetic surgery sometimes _felt like_ a relevant reply (because I had mentioned transitioning interventions). It's just that, in context, I was very obviously trying to talk about the natural category of "biological sex", and Kelsey could have figured that out _if she had wanted to_.
It's not that anyone explicitly said, "Biological sex isn't real" in those words. ([The elephant in the brain](https://en.wikipedia.org/wiki/The_Elephant_in_the_Brain) knows it wouldn't be able to get away with _that_.) But if everyone correlatedly plays dumb whenever someone tries to _talk_ about sex in clear language in a context where that could conceivably hurt some trans person's feelings, I think what you have is a culture of _de facto_ biological sex denialism. ("'The original thing that already exists without having to try' sounds fake to me"!!) It's not hard to get people to admit that trans women are different from cis women, but somehow they can't (in public, using words) follow the implication that trans women are different from cis women _because_ trans women are male.
Ben thought I was wrong to think of this kind of behavior as non-ostracizing. The deluge of motivated nitpicking _is_ an implied marginalization threat, he explained: the game people are playing when they do that is to force me to choose between doing arbitrarily large amounts of interpretive labor, or being cast as never having answered these construed-as-reasonable objections, and therefore over time losing standing to make the claim, being thought of as unreasonable, not getting invited to events, _&c._
-I saw the dynamic he was pointing at, but as a matter of personality, I was more inclined to respond, "Welp, I guess I need to write faster and more clearly", rather than to say, "You're dishonestly demanding arbitrarily large amounts of interpretive labor from me." I thought Ben was far too quick to give up on people who he modeled as trying not to understand, whereas I continued to have faith in the possibility of _making_ them understand if I just never gave up. Not to be _so_ much of a scrub as to play chess with a pigeon (which shits on the board and then struts around like it's won), or wrestle with a pig (which gets you both dirty, and the pig likes it), or dispute [what the Tortise said to Achilles](https://en.wikipedia.org/wiki/What_the_Tortoise_Said_to_Achilles)—but to hold out hope that people in "the community" could only be _boundedly_ motivatedly dense, and anyway that giving up wouldn't make me a stronger writer.
+I saw the dynamic he was pointing at, but as a matter of personality, I was more inclined to respond, "Welp, I guess I need to write faster and more clearly", rather than to say, "You're dishonestly demanding arbitrarily large amounts of interpretive labor from me." I thought Ben was far too quick to give up on people who he modeled as trying not to understand, whereas I continued to have faith in the possibility of _making_ them understand if I just ... never gave up. Not to be _so_ much of a scrub as to play chess with a pigeon (which shits on the board and then struts around like it's won), or wrestle with a pig (which gets you both dirty, and the pig likes it), or dispute [what the Tortoise said to Achilles](https://en.wikipedia.org/wiki/What_the_Tortoise_Said_to_Achilles)—but to hold out hope that people in "the community" could only be _boundedly_ motivatedly dense, and anyway that giving up wouldn't make me a stronger writer.
(Picture me playing Hermione Granger in a post-Singularity [holonovel](https://memory-alpha.fandom.com/wiki/Holo-novel_program) adaptation of _Harry Potter and the Methods of Rationality_ (Emma Watson having charged me [the standard licensing fee](/2019/Dec/comp/) to use a copy of her body for the occasion): "[We can do anything if we](https://www.hpmor.com/chapter/30) exert arbitrarily large amounts of interpretive labor!")
One of Alexander's [most popular _Less Wrong_ posts ever had been about the noncentral fallacy, which Alexander called "the worst argument in the world"](https://www.lesswrong.com/posts/yCWPkLi8wJvewPbEp/the-noncentral-fallacy-the-worst-argument-in-the-world): for example, those who crow that abortion is _murder_ (because murder is the killing of a human being), or that Martin Luther King, Jr. was a _criminal_ (because he defied the segregation laws of the South), are engaging in a dishonest rhetorical maneuver in which they're trying to trick their audience into attributing the attributes of the typical "murder" or "criminal" to what are very noncentral members of those categories.
-_Even if_ you're opposed to abortion, or have negative views about the historical legacy of Dr. King, this isn't the right way to argue. If you call Janie a _murderer_, that causes me to form a whole bunch of implicit probabilistic expectations—about Janie's moral character, about the suffering of victim whose hopes and dreams were cut short, about Janie's relationship with the law, _&c._—most of which get violated when you subsequently reveal that the murder victim was a four-week-old fetus.
+_Even if_ you're opposed to abortion, or have negative views about the historical legacy of Dr. King, this isn't the right way to argue. If you call Janie a _murderer_, that causes me to form a whole bunch of implicit probabilistic expectations on the basis of what the typical "murder" is like—expectations about Janie's moral character, about the suffering of the victim whose hopes and dreams were cut short, about Janie's relationship with the law, _&c._—most of which get violated when you subsequently reveal that the murder victim was a four-week-old fetus.
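(To make the talk of "implicit probabilistic expectations" concrete, here's a minimal sketch in Python, with made-up numbers that are my own illustration rather than anything from Alexander's post: if your model of "murderer" is fit to central examples of the category, then applying the label to a noncentral member generates confident predictions that mostly get violated.)

```python
import math

# Toy model (made-up numbers, for illustration only): a listener's
# expectations about someone described as a "murderer", fit to central
# examples of the category.
central_murderer_model = {
    "victim had hopes and dreams that were cut short": 0.99,
    "perpetrator has bad moral character": 0.90,
    "perpetrator is wanted by the law": 0.95,
}

# The noncentral case: none of these features actually hold.
janie_case = {
    "victim had hopes and dreams that were cut short": False,
    "perpetrator has bad moral character": False,
    "perpetrator is wanted by the law": False,
}

# Total surprisal (in bits) incurred when the central-category
# expectations meet the noncentral reality: each confidently predicted
# feature that fails to hold costs -log2(1 - p) bits.
surprisal = sum(
    -math.log2(p if janie_case[feature] else 1 - p)
    for feature, p in central_murderer_model.items()
)
print(f"{surprisal:.1f} bits of surprise")  # ≈ 14.3 bits: the label misled us
```

On this toy picture, the noncentral-fallacy move is an attempt to collect that connotational payload without paying for it on the empirical merits.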
In the form of a series of short parables, I tried to point out that Alexander's own "The Worst Argument in the World" is really complaining about the _same_ category-gerrymandering move that his "... Not Man for the Categories" comes out in favor of. We would not let someone get away with declaring, "I ought to accept an unexpected abortion or two deep inside the conceptual boundaries of what would normally not be considered murder if it'll save someone's life." Maybe abortion _is_ wrong and relevantly similar to the central sense of "murder", but you need to make that case _on the empirical merits_, not by linguistic fiat (Subject: "twelve short stories about language").
]
+
+
+
[TODO: We lost?! How could we lose??!!?!? And, post-war concessions ...
curation hopes ... 22 Jun: I'm expressing a little bit of bitterness that a mole rats post got curated https://www.lesswrong.com/posts/fDKZZtTMTcGqvHnXd/naked-mole-rats-a-case-study-in-biological-weirdness
The reason it _should_ be safe to write is because Explaining Things is Good. It should be possible to say, "This is not a social attack; I'm not saying 'rationalists Bad, Yudkowsky Bad'; I'm just trying to carefully _tell the true story_ about why, as a matter of cause-and-effect, I've been upset this year, including addressing counterarguments for why some would argue that I shouldn't be upset, why other people could be said to be behaving 'reasonably' given their incentives, why I nevertheless wish they'd be braver and adhere to principle rather than 'reasonably' following incentives, _&c_."
-So why couldn't I write? Was it that I didn't know how to make "This is not a social attack" credible? Maybe because it's wasn't true?? I was afraid that telling a story about our leader being intellectually dishonest was "the nuclear option" in a way that I couldn't credibly cancel with "But I'm just telling a true story about a thing that was important to me that actually happened" disclaimers. If you're slowly-but-surely gaining territory in a conventional war, _suddenly_ escalating to nukes seems pointlessly destructive. This metaphor is horribly non-normative ([arguing is not a punishment!](https://srconstantin.github.io/2018/12/15/argue-politics-with-your-best-friends.html) carefully telling a true story _about_ an argument is not a nuke!), but I didn't know how to make it stably go away.
+So why couldn't I write? Was it that I didn't know how to make "This is not a social attack" credible? Maybe because ... it wasn't true?? I was afraid that telling a story about our leader being intellectually dishonest was "the nuclear option" in a way that I couldn't credibly cancel with "But I'm just telling a true story about a thing that was important to me that actually happened" disclaimers. If you're slowly-but-surely gaining territory in a conventional war, _suddenly_ escalating to nukes seems pointlessly destructive. This metaphor is horribly non-normative ([arguing is not a punishment!](https://srconstantin.github.io/2018/12/15/argue-politics-with-your-best-friends.html); carefully telling a true story _about_ an argument is not a nuke!), but I didn't know how to make it stably go away.
A more motivationally-stable compromise would be to try to split off whatever _generalizable insights_ that would have been part of the story into their own posts that don't make it personal. ["Heads I Win, Tails?—Never Heard of Her"](https://www.lesswrong.com/posts/DoPo4PDjgSySquHX8/heads-i-win-tails-never-heard-of-her-or-selective-reporting) had been a huge success as far as I was concerned, and I could do more of that kind of thing, analyzing the social stuff I was worried about, without making it personal, even if, secretly, it actually was personal.
playing on a different chessboard
people reading funny GPT-2 quotes
Tsvi said it would be sad if I had to leave the Bay Area
+
+A MIRI researcher
+
motivation deflates after Christmas victory
5 Jan memoir as nuke
]
It would seem that in the current year, that culture is dead—or at least, if it does have any remaining practitioners, they do not include Eliezer Yudkowsky.
-At this point, some people would argue that I'm being too uncharitable in my interpretation of the "not liking to be tossed into a [...] Bucket" paragraph. The same post does also explicitly say that "[i]t's not that no truth-bearing propositions about these issues can possibly exist." I agree that there are some interpretations of "not lik[ing] to be tossed into a Male Bucket or Female Bucket" that make sense, even though biological sex denialism does not make sense. Given that the author is Eliezer Yudkowsky, should I not assume that he "really meant" to communicate the reading that does make sense, rather than the one that doesn't make sense?
-
-I reply: _given that the author is Eliezer Yudkowsky_, no, obviously not. Yudkowsky is just _too talented of a writer_ for me to excuse his words as an artifact of unclear writing. Where the text is ambiguous about whether biological sex is a real thing that people should be able to talk about, I think it's _deliberately_ ambiguous. Or at least—_optimizedly_ ambiguous. The point of the post is to pander to the biological sex denialists in his robot cult, without technically saying anything unambiguously false that someone could point out as a "lie."
+At this point, some people would argue that I'm being too uncharitable in harping on the "not liking to be tossed into a [...] Bucket" paragraph. The same post does also explicitly say that "[i]t's not that no truth-bearing propositions about these issues can possibly exist." I agree that there are some interpretations of "not lik[ing] to be tossed into a Male Bucket or Female Bucket" that make sense, even though biological sex denialism does not make sense. Given that the author is Eliezer Yudkowsky, should I not assume that he "really meant" to communicate the reading that does make sense, rather than the one that doesn't make sense?
-If Yudkowsky was playing dumb (consciously or not) and his comments can't be taken seriously, what was _actually_ going on here? When smart people act dumb, [it's usually wisest to assume that their behavior represents _optimized_ stupidity](https://www.lesswrong.com/posts/sXHQ9R5tahiaXEZhR/algorithmic-intent-a-hansonian-generalized-anti-zombie)—apparent "stupidity" that achieves a goal through some other channel than their words straightforwardly reflecting the truth. Someone who was _actually_ stupid wouldn't be able to generate text with a specific balance of insight and selective stupidity fine-tuned to reach a gender-politically convenient conclusion without explicitly invoking any controversial gender-political reasoning.
+I reply: _given that the author is Eliezer Yudkowsky_, no, obviously not. Yudkowsky is just _too talented of a writer_ for me to excuse his words as an artifact of unclear writing. Where the text is ambiguous about whether biological sex is a real thing that people should be able to talk about, I think it's _deliberately_ ambiguous. When smart people act dumb, [it's often wise to conjecture that their behavior represents _optimized_ stupidity](https://www.lesswrong.com/posts/sXHQ9R5tahiaXEZhR/algorithmic-intent-a-hansonian-generalized-anti-zombie)—apparent "stupidity" that achieves a goal through some other channel than their words straightforwardly reflecting the truth. Someone who was _actually_ stupid wouldn't be able to generate text with a specific balance of insight and selective stupidity fine-tuned to reach a gender-politically convenient conclusion without explicitly invoking any controversial gender-political reasoning. The point of the post is to pander to the biological sex denialists in his robot cult, without technically saying anything unambiguously false that someone could point out as a "lie."
-Fortunately, Yudkowsky graciously grants us a clue in the form of [a disclaimer comment](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228):
+Consider the implications of Yudkowsky giving us a clue as to the political forces at play, in the form of [a disclaimer comment](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228):
> It unfortunately occurs to me that I must, in cases like these, disclaim that—to the extent there existed sensible opposing arguments against what I have just said—people might be reluctant to speak them in public, in the present social atmosphere. That is, in the logical counterfactual universe where I knew of very strong arguments against freedom of pronouns, I would have probably stayed silent on the issue, as would many other high-profile community members [...]
>
with internet available—
-_ examples of snarky comments about "the rationalists"
_ Discord logs before Austin retreat
+_ examples of snarky comments about "the rationalists"
_ screenshot Rob's Facebook comment which I link
_ 13th century word meanings
_ compile Categories references from the Dolphin War Twitter thread
_ 2019 Discord discourse with Alicorner
_ edit discussion of "anti-trans" side given that I later emphasize that "sides" shouldn't be a thing
_ first appearance of "Caliphate"
-_ the right way to explain how I'm respecting Yudkowsky's privacy
_ explain the adversarial pressure on privacy norms
_ first EY contact was asking for public clarification or "I am being silenced" (so Glomarizing over "unsatisfying response" or no response isn't leaking anything Yudkowsky cares about)
_ Nov. 2018 continues thread from Oct. 2016 conversation
_ maybe quote Michael's Nov 2018 texts?
_ clarify sequence of outreach attempts
_ clarify existence of a shadow posse member
-_ mention Nov. 2018 conversation with Ian somehow
+_ mention Nov. 2018 conversation with Ian somehow; backref on bidding for attention again; subject line from Happy Price 2016
_ Said on Yudkowsky's retreat to Facebook being bad for him
_ explain first use of "rationalist"
_ explain first use of Center for Applied Rationality
> the massive correlation between exposure to Yudkowsky's writings and being a trans woman (can't bother to do the calculations but the connection is absurdly strong)
Namespace's point about the two EYs
-[stonewalling](https://www.lesswrong.com/posts/wqmmv6NraYv4Xoeyj/conversation-halters)
+
The level above "Many-worlds is obviously correct, stop being stupid" is "Racial IQ differences are obviously real; stop being stupid"
> I suspect Scott is calling the wrong side monastic, though - we basically believe it can be done by lay people, he doesn't. I'll be pleasantly surprised if he gets the sides right, though.
-• at least Sabbatai Zevi had an excuse: his choices were to convert to Islam or be impaled https://en.wikipedia.org/wiki/Sabbatai_Zevi#Conversion_to_Islam
+
Really, self-respecting trans people who care about logical consistency should abhor Scott and Eliezer's opinions—you should want people to use the right pronouns _because_ of your gender soul or _because_ your transition actually worked, not because categories are flexible and pronouns shouldn't imply gender
-https://twitter.com/ESYudkowsky/status/1435605868758765568
-> Because it was you, I tried to read this when it came out. But you do not know how to come to a point, because you are too horrified by the thought that a reader might disagree with you if you don't write even more first; so then I started skimming, and then I gave up.
-But I wasn't always this way. It's adaptive. The reason I write even more to get out ahead of the objections I can forsee, is because I've been _at this for six years_. I tried being concise _first_.
+> Because it was you, I tried to read this when it came out. But [...]
+
+
+Yudkowsky complains—not entirely without justification—that I ["do not know how to come to a point, because [I am] too horrified by the thought that a reader might disagree with [me] if [I] don't write even more first."](https://twitter.com/ESYudkowsky/status/1435605868758765568)
+
+But I wasn't always this way. It's an adaptive response to years of trolling. The reason I write even more to get out ahead of the objections I can foresee is because I've been _at this for six years_. I tried being concise _first_.
You want concise? Meghan Murphy got it down to four words (which could have been three): "Men aren't women tho."
+
+
+
+
+
> If you think you can win a battle about 2 + 3 = 5, then it can feel like victory or self-justification to write a huge long article hammering on that; but it doesn't feel as good to engage with how the Other does not think they are arguing 2 + 3 = 6, they're talking about 2 * 3.
https://twitter.com/ESYudkowsky/status/1435618825198731270
https://twitter.com/ESYudkowsky/status/1404821285276774403
> It is not trans-specific. When people tell me I helped them, I mostly believe them and am happy.
]
+
+
+https://www.lesswrong.com/posts/cyzXoCv7nagDWCMNS/you-re-calling-who-a-cult-leader#35n
+> In fact, I would say that by far the most cultish-looking behavior on Hacker News is people trying to show off how willing they are to disagree with Paul Graham
+I'm totally still doing this
+
+> it's that it's hard to get that innocence back, once you even start thinking about whether you're _independent_ of someone