From 69c168e506288ff297ff588ad588a5e2d61a20c9 Mon Sep 17 00:00:00 2001
From: "M. Taylor Saotome-Westlake"
Date: Fri, 28 Jan 2022 10:30:34 -0800
Subject: [PATCH] Friday morning drafting "Challenges": bad faith

---
 ...s-to-yudkowskys-pronoun-reform-proposal.md | 40 ++++++-------------
 notes/challenges-notes.md | 7 ++++
 2 files changed, 20 insertions(+), 27 deletions(-)

diff --git a/content/drafts/challenges-to-yudkowskys-pronoun-reform-proposal.md b/content/drafts/challenges-to-yudkowskys-pronoun-reform-proposal.md
index 867f6e3..6562c7a 100644
--- a/content/drafts/challenges-to-yudkowskys-pronoun-reform-proposal.md
+++ b/content/drafts/challenges-to-yudkowskys-pronoun-reform-proposal.md
@@ -159,6 +159,8 @@ I would bet at very generous odds at some point in his four decades on Earth, El
Conversely, I would also bet at very generous odds that in his four decades on Earth, Eliezer Yudkowsky has very rarely if ever assumed what someone's name is on the basis of their appearance without being told. Because _no native English speakers do this_ (seriously, rather than as a joke or a troll). If you doubt this, try to explain what algorithm you would use to infer that someone's name is "Oliver" based on how they look. What are the "secondary Oliver characteristics", specifically? People for whom it was _actually true_ that names map to appearances the way pronouns map to sex, should not have trouble answering this question!
+[TODO: this is why I feel comfortable saying the commenter who introduced "Oliver" was trolling in her teenage memory]
+
If there _were_ a substantial contingent of native English speakers who don't interpret pronouns as conveying sex category information, one would expect this to show up in our cultural corpus more often—and yet, I'm actually not aware of any notable examples of this (if you know of any, let me know in the comments).
In contrast, it's very easy to find instances of speakers treating pronouns and sex as synonymous. As an arbitrarily chosen example, in [one episode](https://theamazingworldofgumball.fandom.com/wiki/The_Nest) of the animated series [_The Amazing World of Gumball_](https://tvtropes.org/pmwiki/pmwiki.php/WesternAnimation/TheAmazingWorldOfGumball) featuring the ravenous spawn of our protagonists' evil pet turtle, the anthropomorphic-rabbit [Bumbling Dad](https://tvtropes.org/pmwiki/pmwiki.php/Main/BumblingDad) character [says, "Who's to say this pregnant turtle is a _her_?" and everyone gives him a look](https://www.youtube.com/watch?v=5N2Msnrq7wU&t=14s). The joke, you see, is that bunny-father is unthinkingly applying the stock question "Who's to say _X_ is a he/she?" (which makes sense when _X_ is, _e.g._, "the nurse") in a context where there's an obvious answer—namely, that the referents of "her" pronouns are female and only females get pregnant—but the character is too stupid to notice this, and we enjoy a laugh at his expense.
@@ -416,18 +418,11 @@ Yudkowsky's pretension to merely have been standing up for the distinction betwe
(And in this case, the empirical facts are _so_ lopsided, that if we must find humor in the matter, it really goes the other way. Lia Thomas trounces the entire field by _4.2 standard deviations_ (!!), and Eliezer Yudkowsky feels obligated to _pretend not to see the problem?_ You've got to admit, that's a _little_ bit funny.)
-[TODO—another example from the November 2018 thread—
-
-> I find the "(chromosomes?)" here very amusing. 
I am also a Yudkowskian, Eliezer; "female human" is a cluster in thingspace :)
-https://twitter.com/EnyeWord/status/1068983389716385792
-
-]
-
-[TODO outlining remainder of coda—]
+----
-Still, having analyzed the _ways_ in which Yudkowsky is playing dumb here, what's still not entirely clear is _why_. Presumably he cares about maintaining his credibility as an insightful and fair-minded thinker. Why tarnish that by putting out this haughty performance?
+Having analyzed the _ways_ in which Yudkowsky is playing dumb here, what's still not entirely clear is _why_. Presumably he cares about maintaining his credibility as an insightful and fair-minded thinker. Why tarnish that by putting out this haughty performance?
-Of course, presumably he _doesn't_ think he's tarnishing it—but why not? [He explains in the Facebook comments](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228&reply_comment_id=10159421901809228):
+Of course, presumably he _doesn't_ think he's tarnishing it—but why not? [He graciously explains in the Facebook comments](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228&reply_comment_id=10159421901809228):
> it is sometimes personally prudent and not community-harmful to post your agreement with Stalin about things you actually agree with Stalin about, in ways that exhibit generally rationalist principles, especially because people do _know_ they're living in a half-Stalinist environment [...] I think people are better off at the end of that.
@@ -435,13 +430,9 @@ Ah, _prudence_! He continues:
> I don't see what the alternative is besides getting shot, or utter silence about everything Stalin has expressed an opinion on including "2 + 2 = 4" because if that logically counterfactually were wrong you would not be able to express an opposing opinion.
-[TODO: type out these five lines of rebuttal and then stitch them together somehow]
-
----
-
The problem with trying to "exhibit rationalist principles" in a line of argument that you're constructing to be prudent and not community-harmful, is that you're thereby necessarily _not_ exhibiting the central rationalist principle that what matters is the process that _determines_ your conclusion, not the reasoning you present to _reach_ your presented conclusion, after the fact.
-The best explanation of this I know was authored by Yudkowsky himself in 2007, in a post titled ["A Rational Argument"](https://www.lesswrong.com/posts/9f5EXt8KNNxTAihtZ/a-rational-argument). It's worth quoting at length. Yudkowsky invites us to consider the plight of a political campaign manager:
+The best explanation of this I know was authored by Yudkowsky himself in 2007, in a post titled ["A Rational Argument"](https://www.lesswrong.com/posts/9f5EXt8KNNxTAihtZ/a-rational-argument). It's worth quoting at length. The Yudkowsky of 2007 invites us to consider the plight of a political campaign manager:
> As a campaign manager reading a book on rationality, one question lies foremost on your mind: "How can I construct an impeccable rational argument that Mortimer Q. Snodgrass is the best candidate for Mayor of Hadleyburg?"
>
@@ -478,29 +469,24 @@ I remember this being pretty shocking to read back in 'aught-seven. What an alie
This is a shockingly high standard for anyone to aspire to live up to—but what made Yudkowsky's Sequences so life-changingly valuable, was that they articulated the _existence_ of such a standard. For that, I will always be grateful.
-... 
which is why it's so _bizarre_ that the Yudkowsky of the current year acts like he's never heard of it. If your _actual_ bottom line is that it is sometimes personally prudent and not community-harmful to post your agreement with Stalin, then sure, you can _totally_ find something you agree with to write on the lines above! Probably something that "exhibits generally rationalist principles", even!
-
-"I don't see what the alternative is besides getting shot," Yudkowsky muses (where presumably, 'getting shot' is a metaphor for a large negative utility, like being unpopular with progressives). An astute observation! And _any other partisan hack could say exactly the same_, for the same reason. Why does the campaign manager withhold the results of the 11th question? Because he doesn't see what the alternative besides getting shot.
-
-If the idea of being fired from the Snodgrass campaign or being unpopular with progressives is so terrifying to you that it seems analogous to getting shot
-
+... which is why it's so _bizarre_ that the Yudkowsky of the current year acts like he's never heard of it! If your _actual_ bottom line is that it is sometimes personally prudent and not community-harmful to post your agreement with Stalin, then sure, you can _totally_ find something you agree with to write on the lines above! Probably something that "exhibits generally rationalist principles", even! It's just that any rationalist who sees the game you're playing has no reason to give a shit about what you say.
+"I don't see what the alternative is besides getting shot," Yudkowsky muses (where presumably, 'getting shot' is a metaphor for a large negative utility, like being unpopular with progressives). Yes, an astute observation! And _any other partisan hack could say exactly the same_, for the same reason. Why does the campaign manager withhold the results of the 11th question? Because he doesn't see what the alternative is besides getting shot.
+Yudkowsky [sometimes](https://www.lesswrong.com/posts/K2c3dkKErsqFd28Dh/prices-or-bindings) [quotes](https://twitter.com/ESYudkowsky/status/1456002060084600832) _Calvin and Hobbes_: "I don't know which is worse, that everyone has his price, or that the price is always so low."
+If the idea of being fired from the Snodgrass campaign or being unpopular with progressives is _so_ terrifying to you that it seems analogous to getting shot, then, if those are really your true values, sure—say whatever you need to say to keep your job and your popularity. But if the price you put on the intellectual integrity of your so-called "rationalist" community is similar to that of the Snodgrass for Mayor campaign, you shouldn't be surprised if intelligent, discerning people accord the same level of trust to the two groups' output.
+I often see the phrase "bad faith" thrown around without adequate appreciation of what it means. It's more specific than "dishonest"; it means [adopting the surface appearance of being moved by one set of motivations, while actually acting from another](https://en.wikipedia.org/wiki/Bad_faith). 
+For example, an insurance company employee who goes through the motions of investigating your claim while privately intending to deny it from the very start might never tell an explicit "lie", but is definitely acting in bad faith: they're asking you questions, demanding evidence, _&c._ in order to _make it look like_ there's some relationship between you proving that the loss occurred, and you getting paid—whereas in reality, you're just not going to be paid.
------
-
-You sometimes hear the phrase "bad faith" thrown around,
-
-This is what https://en.wikipedia.org/wiki/Bad_faith means
["Everybody knows" https://thezvi.wordpress.com/2019/07/02/everybody-knows/ ]
["People are better off at the end of that"— _who_ is better off? I'm not better off ]
[Agreeing with Stalin that 2+2=4 is fine; the problem is a sustained pattern of _selectively_ bringing up pro-Party points while ignoring anti-Party facts that would otherwise be relevant to the topic of interest, including stonewalling commenters who try to point out relevance; ]
-------

diff --git a/notes/challenges-notes.md b/notes/challenges-notes.md
index 8907576..66df659 100644
--- a/notes/challenges-notes.md
+++ b/notes/challenges-notes.md
@@ -18,8 +18,15 @@ Fit in somewhere—
* need introductory sentence before first reference to "we" or "the community"
+ * it is merited to touch on the nearest-unblocked strategy history somewhere in this piece, even if I may also need to write a longer "A Hill of Validity"
+ * also need a short statement of what I'm fighting for (AGPs are factually not women, and a culture that insists that everyone needs to lie to protect our feelings is bad for our own intellectual development; I want the things I said in "Sexual Dimorphism" to be the standard story, rather than my weird heresy)
+
+ * my "self-ID is a Schelling Point" and "On the Argumentative Form" show that I'm not a partisan hack (maybe also publish a brief version of )
+
4 levels of intellectual conversation https://rationalconspiracy.com/2017/01/03/four-layers-of-intellectual-conversation/
+> I find the "(chromosomes?)" here very amusing. I am also a Yudkowskian, Eliezer; "female human" is a cluster in thingspace :)
+https://twitter.com/EnyeWord/status/1068983389716385792
> But Twitter is at least not *ontologically confused* if they say that using preferred pronouns is courtesy, and claim that they're enforcing a courtesy standard. Replying "That's a lie! I will never lie!" is confused.
https://twitter.com/ESYudkowsky/status/1067302082481274880
-- 
2.17.1