From 7deba5d700221c598f450955b14b1ea1afe5b267 Mon Sep 17 00:00:00 2001
From: "M. Taylor Saotome-Westlake"
Date: Fri, 30 Sep 2022 12:00:48 -0700
Subject: [PATCH] memoir poking and shoveling; incl. "out of patience" email excerpt

---
 ...-hill-of-validity-in-defense-of-meaning.md |  77 ++++++++-
 ...ion-overlap-and-cancellable-stereotypes.md |   2 +-
 notes/a-hill-of-validity-sections.md          | 155 ++++++------------
 notes/blanchards-dangerous-idea-sections.md   |   3 +
 4 files changed, 129 insertions(+), 108 deletions(-)

diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
index 0c9752a..c2ad12e 100644
--- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
+++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md
@@ -206,7 +206,7 @@ I think commonsense privacy-norm-adherence intuitions actually say _No_ here: th

In part of the Dumb Story that follows, I'm going to describe several times when I and others emailed Yudkowsky to try to argue with what he said in public, without saying anything about whether Yudkowsky replied, or what he might have said if he did reply. I maintain that I'm within my rights here, because I think commonsense judgement will agree that me talking about the arguments _I_ made, does not in this case leak any sensitive information about the other side of a conversation that may or may not have happened: I think the story comes off relevantly the same whether Yudkowsky didn't reply at all (_e.g._, because he was busy with more existentially important things and didn't check his email), or whether he replied in a way that I found sufficiently unsatisfying as to occasion the further emails with followup arguments that I describe. (Talking about later emails _does_ rule out the possible world where Yudkowsky had said, "Please stop emailing me," because I would have respected that, but the fact that he didn't say that isn't "sensitive".)

-It seems particularly important to lay out these principles of adherence to privacy norms in connection to my attempts to contact Yudkowsky, because part of what I'm trying to accomplish in telling this Whole Dumb Story is to deal reputational damage to Yudkowsky, which I claim is deserved. (We want reputations to track reality. If you see Carol exhibiting a pattern of intellectual dishonesty, and she keeps doing it even after you try talking to her about it privately, you might want to write a blog post describing the pattern in detail—not to _hurt_ Carol, particularly, but so that everyone else can make higher-quality decisions about whether they should believe the things that Carol says.) Given that motivation of mine, it seems important that I only try to hang Yudkowsky with the rope of what he said in public, where you can click the links and read the context for yourself. In the Dumb Story that follows, I _also_ describe some of my correspondence with Scott Alexander, but that doesn't seem sensitive in the same way, because I'm not particularly trying to deal reputational damage to Alexander in the same way. (Not because Scott performed well, but because one wouldn't really have _expected_ Scott to perform well in this situation.)
+It seems particularly important to lay out these principles of adherence to privacy norms in connection to my attempts to contact Yudkowsky, because part of what I'm trying to accomplish in telling this Whole Dumb Story is to deal reputational damage to Yudkowsky, which I claim is deserved.
(We want reputations to track reality. If you see Carol exhibiting a pattern of intellectual dishonesty, and she keeps doing it even after you try talking to her about it privately, you might want to write a blog post describing the pattern in detail—not to _hurt_ Carol, particularly, but so that everyone else can make higher-quality decisions about whether they should believe the things that Carol says.) Given that motivation of mine, it seems important that I only try to hang Yudkowsky with the rope of what he said in public, where you can click the links and read the context for yourself. In the Dumb Story that follows, I _also_ describe some of my correspondence with Scott Alexander, but that doesn't seem sensitive in the same way, because I'm not particularly trying to deal reputational damage to Alexander in the same way. (Not because Scott performed well, but because one wouldn't really have _expected_ Scott to perform well in this situation; Scott's reputation isn't so direly in need of correction.)

In accordance with the privacy-norm-adherence policy just described, I don't think I should say whether Yudkowsky replied to Michael's and my emails, nor ([again](/2022/TODO/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer/#cheerful-price-privacy-constraint)) whether he accepted the cheerful price money, because any conversation that may or may not have occurred would have been private. But what I _can_ say, because it was public, is that we saw [this addition to the Twitter thread](https://twitter.com/ESYudkowsky/status/1068071036732694529):

@@ -280,7 +280,7 @@ It was also around this time that our posse picked up a new member, who would pr

On 5 January, I met with Michael and his associate Aurora in San Francisco to attempt mediated discourse with [Ziz](https://sinceriously.fyi/) and [Gwen](https://everythingtosaveit.how/), who were considering suing CfAR for discriminating against trans women. Michael hoped to dissuade them from a lawsuit—not because Michael approved of CfAR's behavior, but because involving lawyers makes everything worse.

-Ziz recounted [her story](https://sinceriously.fyi/net-negative) story of Anna's alleged discrimination, engaging in [conceptual warfare](https://sinceriously.fyi/intersex-brains-and-conceptual-warfare/) to portray Ziz as a predatory male. I was unimpressed: in my worldview, I didn't think Ziz had the right to say "I'm not a man," and expect people to just believe that. (I remember at one point, Ziz answered a question with, "Because I don't run off masochistic self-doubt like you." I replied, "That's fair.") But I did respect that Ziz actually believed in an intersex brain theory: in Ziz and Gwen's worldview, people's genders were a _fact_ of the matter, not just a manipulation of consensus categories to make people happy.
+Ziz recounted [her](/2019/Oct/self-identity-is-a-schelling-point/) [story of Anna's alleged discrimination](https://sinceriously.fyi/net-negative), engaging in [conceptual warfare](https://sinceriously.fyi/intersex-brains-and-conceptual-warfare/) to portray Ziz as a predatory male. I was unimpressed: in my worldview, I didn't think Ziz had the right to say "I'm not a man," and expect people to just believe that. (I remember at one point, Ziz answered a question with, "Because I don't run off masochistic self-doubt like you."
I replied, "That's fair.") But I did respect that Ziz actually believed in an intersex brain theory: in Ziz and Gwen's worldview, people's genders were a _fact_ of the matter, not just a manipulation of consensus categories to make people happy. Probably the most ultimately significant part of this meeting for future events was Michael verbally confirming to Ziz that MIRI had settled with a disgruntled former employee who had put up a website slandering them. I don't actually know the details of the alleged settlement. (I'm working off of [Ziz's notes](https://sinceriously.fyi/intersex-brains-and-conceptual-warfare/) rather than particularly remembering that part of the conversation clearly; I don't know what Michael knew.) @@ -428,7 +428,7 @@ I supposed that, in Michael's worldview, aggression is more honest than passive- Was the answer just that I needed to accept that there wasn't such a thing in the world as a "rationalist community"? (Sarah had told me as much two years ago, at BABSCon, and I just hadn't made the corresponing mental adjustments.) -On the other hand, a possible reason to be attached to the "rationalist" brand name and social identity that wasn't just me being stupid was that _the way I talk_ had been trained really hard on this subculture for _ten years_. Most of my emails during this whole campaign had contained multiple Sequences or _Slate Star Codex_ links that I could just expect people to have read. I could spontaneously use [the phrase "Absolute Denial Macro"](https://www.lesswrong.com/posts/t2NN6JwMFaqANuLqH/the-strangest-thing-an-ai-could-tell-you) in conversation and expect to be understood. That's a massive "home field advantage." If I just gave up on the "rationalists" being a thing, and went out into the world to make friends with _Quillette_ readers or arbitrary University of Chicago graduates, then I would lose all that accumulated capital. +On the other hand, a possible reason to be attached to the "rationalist" brand name and social identity that wasn't just me being stupid was that _the way I talk_ had been trained really hard on this subculture for _ten years_. Most of my emails during this whole campaign had contained multiple Sequences or _Slate Star Codex_ links that I could just expect people to have read. I could spontaneously use [the phrase "Absolute Denial Macro"](https://www.lesswrong.com/posts/t2NN6JwMFaqANuLqH/the-strangest-thing-an-ai-could-tell-you) in conversation and expect to be understood. That's a massive "home field advantage." If I just gave up on the "rationalists" being a thing, and went out into the world to make friends with _Quillette_ readers or arbitrary University of Chicago graduates, then I would lose all that accumulated capital. Here, I had a _massive_ home territory advantage because I could appeal to Yudkowsky's writings about the philosophy of language from 10 years ago, and people couldn't say, "Eliezer _who?_ He's probably a Bad Man." The language I spoke was _mostly_ educated American English, but I relied on subculture dialect for a lot. My sister has a chemistry doctorate from MIT (and so speaks the language of STEM intellectuals generally), and when I showed her ["... 
To Make Predictions"](/2018/Feb/the-categories-were-made-for-man-to-make-predictions/), she reported finding it somewhat hard to read, likely because I casually use phrases like "thus, an excellent [motte](https://slatestarcodex.com/2014/11/03/all-in-all-another-brick-in-the-motte/)", and expect to be understood without the reader taking 10 minutes to read the link. That essay, which was me writing from the heart in the words that came most naturally to me, could not be published in _Quillette_. The links and phraseology were just too context-bound. @@ -644,7 +644,7 @@ Here I again need to make a digression about privacy norms. Adherence to norms i [TODO: pandemic starts] -[TODO: "Autogenderphilia Is Common"] +[TODO: "Autogenderphilia Is Common" https://slatestarcodex.com/2020/02/10/autogenderphilia-is-common-and-not-especially-related-to-transgender/] [TODO: help from Jessica for "Unnatural Categories"] @@ -652,9 +652,63 @@ Here I again need to make a digression about privacy norms. Adherence to norms i https://slatestarcodex.com/2020/09/11/update-on-my-situation/ ] -[TODO: "out of patience" email] +[TODO: "out of patience" email +> To: Eliezer Yudkowsky <[redacted]> +> Cc: Anna Salamon <[redacted]> +> Date: 13 September 2020 2:24 _a.m._ +> Subject: out of patience +> +>> "I could beg you to do it in order to save me. I could beg you to do it in order to avert a national disaster. But I won't. These may not be valid reasons. There is only one reason: you must say it, because it is true." +>> —_Atlas Shrugged_ by Ayn Rand +> +> Dear Eliezer (cc Anna as mediator): +> +> Sorry, I'm getting _really really_ impatient (maybe you saw my impulsive Tweet-replies today; and I impulsively called Anna today; and I've spent the last few hours drafting an even more impulsive hysterical-and-shouty potential _Less Wrong_ post; but now I'm impulsively deciding to email you in the hopes that I can withhold the hysterical-and-shouty post in favor of a lower-drama option of your choice): **is there _any_ way we can resolve the categories dispute _in public_?! Not** any object-level gender stuff which you don't and shouldn't care about, **_just_ the philosophy-of-language part.** +> +> My grievance against you is *very* simple. [You are *on the public record* claiming that](https://twitter.com/ESYudkowsky/status/1067198993485058048): +> +>> you're not standing in defense of truth if you insist on a word, brought explicitly into question, being used with some particular meaning. +> +> I claim that this is _false_. **I think I _am_ standing in defense of truth when I insist on a word, brought explicitly into question, being used with some particular meaning, when I have an _argument_ for _why_ my preferred usage does a better job of "carving reality at the joints" and the one bringing my usage into question doesn't have such an argument. And in particular, "This word usage makes me sad" doesn't count as a relevant argument.** I [agree that words don't have intrinsic ontologically-basic meanings](https://www.lesswrong.com/posts/4hLcbXaqudM9wSeor/philosophy-in-the-darkest-timeline-basics-of-the-evolution), but precisely _because_ words don't have intrinsic ontologically-basic meanings, there's no _reason_ to challenge someone's word usage except _because_ of the hidden probabilistic inference it embodies. +> +> Imagine one day David Gerard of /r/SneerClub said, "Eliezer Yudkowsky is a white supremacist!" And you replied: "No, I'm not! That's a lie." And imagine E.T. 
Jaynes was still alive and piped up, "You are _ontologically confused_ if you think that's a false assertion. You're not standing in defense of truth if you insist on words, such as _white supremacist_, brought explicitly into question, being used with some particular meaning." Suppose you emailed Jaynes about it, and he brushed you off with, "But I didn't _say_ you were a white supremacist; I was only targeting a narrow ontology error." In this hypothetical situation, I think you might be pretty upset—perhaps upset enough to form a twenty-one-month grudge against someone whom you used to idolize?
+>
+> I agree that pronouns don't have the same function as ordinary nouns. However, **in the English language as actually spoken by native speakers, I think that gender pronouns _do_ have effective "truth conditions" _as a matter of cognitive science_.** If someone said, "Come meet me and my friend at the mall; she's really cool and you'll like her", and then that friend turned out to look like me, **you would be surprised**.
+>
+> I don't see the _substantive_ difference between "You're not standing in defense of truth [...]" and "I can define a word any way I want." [...]
+>
+> [...]
+>
+> As far as your public output is concerned, it *looks like* you either changed your mind about how the philosophy of language works, or you think gender is somehow an exception. If you didn't change your mind, and you don't think gender is somehow an exception, is there some way we can _get that on the public record **somewhere**?!_
+>
+> As an example of such a "somewhere", I had asked you for a comment on my explanation, ["Where to Draw the Boundaries?"](https://www.lesswrong.com/posts/esRZaPXSHgWzyB2NL/where-to-draw-the-boundaries) (with non-politically-hazardous examples about dolphins and job titles) [... redacted ...] I asked for a comment from Anna, and at first she said that she would need to "red team" it first (because of the political context), and later she said that she was having difficulty for other reasons. Okay, the clarification doesn't have to be on _my_ post. **I don't care about credit! I don't care whether or not anyone is sorry! I just need this _trivial_ thing settled in public so that I can stop being in pain and move on with my life.**
+>
+> As I mentioned in my Tweets today, I have a longer and better explanation than "... Boundaries?" mostly drafted. (It's actually somewhat interesting; the logarithmic score doesn't work as a measure of category-system goodness because it can only reward you for the probability you assign to the _exact_ answer, but we _want_ "partial credit" for almost-right answers, so the expected squared error is actually better here, contrary to what you said in [the "Technical Explanation"](https://yudkowsky.net/rational/technical/) about what Bayesian statisticians do). [... redacted]
+>
+> The *only* thing I've been trying to do for the past twenty-one months is make this simple thing established "rationalist" knowledge:
+>
+> (1) For all nouns _N_, you can't define _N_ any way you want, [for at least 37 reasons](https://www.lesswrong.com/posts/FaJaCgqBKphrDzDSj/37-ways-that-words-can-be-wrong).
+>
+> (2) *Woman* is such a noun.
+>
+> (3) Therefore, you can't define the word *woman* any way you want.
+>
+> (Note, **this is _totally compatible_ with the claim that trans women are women, and trans men are men, and nonbinary people are nonbinary!** It's just that **you have to _argue_ for why those categorizations make sense in the context you're using the word**, rather than merely asserting it with an appeal to arbitrariness.)
+>
+> This is **literally _modus ponens_**. I don't understand how you expect people to trust you to save the world with a research community that _literally cannot perform modus ponens._
+>
+> [redacted ...] See, I thought you were playing on the chessboard of _being correct about rationality_. Such that, if you accidentally mislead people about your own philosophy of language, you could just ... issue a clarification? I and Michael and Ben and Sarah and [redacted] _and Jessica_ wrote to you about this and explained the problem in _painstaking_ detail [... redacted ...] Why? **Why is this so hard?!**
+>
+> [redacted]
+>
+> No. The thing that's been driving me nuts for twenty-one months is that I expected Eliezer Yudkowsky to tell the truth. I remain,
+>
+> Your heartbroken student,
+
[TODO: also excerpt out-of-patience followup email?]

[TODO: Sep 2020 categories clarification from EY—victory?!
https://www.facebook.com/yudkowsky/posts/10158853851009228

@@ -683,6 +737,10 @@ https://astralcodexten.substack.com/p/statement-on-new-york-times-article
https://reddragdiva.tumblr.com/post/643403673004851200/reddragdiva-topher-brennan-ive-decided-to-say
https://www.facebook.com/yudkowsky/posts/10159408250519228
+Scott Aaronson on the Times's hit piece of Scott Alexander—
+https://scottaaronson.blog/?p=5310
+> The trouble with the NYT piece is not that it makes any false statements, but just that it constantly insinuates nefarious beliefs and motives, via strategic word choices and omission of relevant facts that change the emotional coloration of the facts that it does present.
+
]

... except that Yudkowsky reopened the conversation in February 2021, with [a new Facebook post](https://www.facebook.com/yudkowsky/posts/10159421750419228) explaining the origins of his intuitions about pronoun conventions and concluding that, "the simplest and best protocol is, '"He" refers to the set of people who have asked us to use "he", with a default for those-who-haven't-asked that goes by gamete size' and to say that this just _is_ the normative definition. Because it is _logically rude_, not just socially rude, to try to bake any other more complicated and controversial definition _into the very language protocol we are using to communicate_."

@@ -773,7 +831,7 @@ It would seem that in the current year, that culture is dead—or at least, if i

At this point, some people would argue that I'm being too uncharitable in harping on the "not liking to be tossed into a [...] Bucket" paragraph. The same post does _also_ explicitly say that "[i]t's not that no truth-bearing propositions about these issues can possibly exist." I _agree_ that there are some interpretations of "not lik[ing] to be tossed into a Male Bucket or Female Bucket" that make sense, even though biological sex denialism does not make sense. Given that the author is Eliezer Yudkowsky, should I not give him the benefit of the doubt and assume that he "really meant" to communicate the reading that does make sense, rather than the one that doesn't make sense?

-I reply: _given that the author is Eliezer Yudkowsky_, no, obviously not.
Yudkowsky is just _too talented of a writer_ for me to excuse his words as an artifact of unclear writing. Where the text is ambiguous about whether biological sex is a real thing that people should be able to talk about, I think it's _deliberately_ ambiguous. When smart people act dumb, [it's often wise to conjecture that their behavior represents _optimized_ stupidity](https://www.lesswrong.com/posts/sXHQ9R5tahiaXEZhR/algorithmic-intent-a-hansonian-generalized-anti-zombie)—apparent "stupidity" that achieves a goal through some other channel than their words straightforwardly reflecting the truth. Someone who was _actually_ stupid wouldn't be able to generate text with a specific balance of insight and selective stupidity fine-tuned to reach a gender-politically convenient conclusion without explicitly invoking any controversial gender-political reasoning. I think the point of the post is to pander to the biological sex denialists in his robot cult, without technically saying anything unambiguously false that someone could point out as a "lie."
+I reply: _given that the author is Eliezer Yudkowsky_, no, obviously not. I have been ["trained in a theory of social deception that says that people can arrange reasons, excuses, for anything"](https://www.glowfic.com/replies/1820866#reply-1820866), such that it's informative ["to look at what _ended up_ happening, assume it was the _intended_ result, and ask who benefited."](https://www.hpmor.com/chapter/97) Yudkowsky is just _too talented of a writer_ for me to excuse his words as an accidental artifact of unclear writing. Where the text is ambiguous about whether biological sex is a real thing that people should be able to talk about, I think it's _deliberately_ ambiguous. When smart people act dumb, it's often wise to conjecture that their behavior represents [_optimized_ stupidity](https://www.lesswrong.com/posts/sXHQ9R5tahiaXEZhR/algorithmic-intent-a-hansonian-generalized-anti-zombie)—apparent "stupidity" that achieves a goal through some other channel than their words straightforwardly reflecting the truth. Someone who was _actually_ stupid wouldn't be able to generate text with a specific balance of insight and selective stupidity fine-tuned to reach a gender-politically convenient conclusion without explicitly invoking any controversial gender-political reasoning. I think the point of the post is to pander to the biological sex denialists in his robot cult, without technically saying anything unambiguously false that someone could point out as a "lie."

Consider the implications of Yudkowsky giving a clue as to the political forces at play in the form of [a disclaimer comment](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228):

@@ -1148,6 +1206,13 @@ I don't doubt Yudkowsky could come up with some clever casuistry why, _technical

[TODO: the Death With Dignity era

+"Death With Dignity" isn't really an update; he used to refuse to give a probability, and now he says the probability is ~0
+
+https://twitter.com/esyudkowsky/status/1164332124712738821
+> I unfortunately have had a policy for over a decade of not putting numbers on a few things, one of which is AGI timelines and one of which is *non-relative* doom probabilities. Among the reasons is that my estimates of those have been extremely unstable.
+ + + /2017/Jan/from-what-ive-tasted-of-desire/ ] diff --git a/content/drafts/subspatial-distribution-overlap-and-cancellable-stereotypes.md b/content/drafts/subspatial-distribution-overlap-and-cancellable-stereotypes.md index c28da01..7134323 100644 --- a/content/drafts/subspatial-distribution-overlap-and-cancellable-stereotypes.md +++ b/content/drafts/subspatial-distribution-overlap-and-cancellable-stereotypes.md @@ -48,7 +48,7 @@ This is a severe misreading of the sex-realist position. No one wants to _define _One_ of the _many_ distinctions people sometimes want to make when thinking about the multivariate distribution of bodies and minds in the world, is that between the sexes. But sex is by no means the only way in which people differ! In many situations, you might want to categorize or describe people in many different ways, some more or less discrete _versus_ categorical, or high- _versus_ low-dimensional: age or race or religion or subculture or social class or intelligence or agreeableness. -It's quite possible that the categories that are salient in a particular culture ought to be revised in order to fit the world better: maybe we _should_ talk about categories like "masculine people" (including both typical men, and butch lesbians) more often! But the typical trans advocate shell game of just replacing "sex" with "gender" and letting people choose their "gender" isn't going to fly, because sex actually exists and we have a need for language to talk about it—or maybe, the fact that we have a need for language to talk about it (the fact that the information we observe admits compression) is what it means for sex to "actually" "exist". +It's possible that the categories that are salient in a particular culture ought to be revised in order to fit the world better: maybe we _should_ talk about categories like "masculine people" (including both typical men, and butch lesbians) more often! But the typical trans advocate shell game of just replacing "sex" with "gender" and letting people choose their "gender" isn't going to fly, because sex actually exists and we have a need for language to talk about it—or maybe, the fact that we have a need for language to talk about it (the fact that the information we observe admits compression) is what it means for sex to "actually" "exist". If trans advocates go astray in disbelieving that diff --git a/notes/a-hill-of-validity-sections.md b/notes/a-hill-of-validity-sections.md index ae698ba..b2a3cb1 100644 --- a/notes/a-hill-of-validity-sections.md +++ b/notes/a-hill-of-validity-sections.md @@ -1,14 +1,18 @@ on deck— -_ excerpt "out of patience" email +_ the egregore doesn't care about the past +_ § about privacy norms, and my secret research project (research report on gout) _ finish and polish § on reaching out a fourth time -_ § about privacy norms, and my secret research project _ talk about the 2019 Christmas party _ Let's recap _ If he's reading this ... 
_ Perhaps if the world were at stake
+_ ¶ about body odors
+_ regrets and wasted time
+_ excerpt 2nd "out of patience" email

with internet available—
+_ "look at what ended up happening"—look up whether that exact quote is from http://www.hpmor.com/chapter/47 or https://www.hpmor.com/chapter/97
_ Galileo "And yet it moves"
_ Discord logs before Austin retreat
_ examples of snarky comments about "the rationalists"
@@ -16,6 +20,10 @@ _ screenshot Rob's Facebook comment which I link
_ 13th century word meanings
_ compile Categories references from the Dolphin War Twitter thread
_ weirdly hostile comments on "... Boundaries?"
+_ more examples of Yudkowsky's arrogance
+_ "The Correct Contrarian Cluster" and race/IQ
+_ taqiyya
+_ refusing to give a probability (When Not to Use Probabilities? Shut Up and Do the Impossible?)

far editing tier—

@@ -150,6 +158,9 @@ It makes sense for public figures to not want to commit political suicide! Even

I'm not optimistic about the problem being fixable, either. Our robot cult _already_ gets a lot of shit from progressive-minded people for being "right-wing"—not because we are in any _useful_, non-gerrymandered sense, but because [attempts to achieve the map that reflects the territory are going to run afoul of ideological taboos for almost any ideology](https://www.lesswrong.com/posts/DoPo4PDjgSySquHX8/heads-i-win-tails-never-heard-of-her-or-selective-reporting).
+
+----
+
Because of the particular historical moment in which we live, we end up facing pressure from progressives, because—whatever our _object-level_ beliefs about (say) [sex, race, and class differences](/2020/Apr/book-review-human-diversity/)—and however much many of us would prefer not to talk about them—on the _meta_ level, our creed requires us to admit _it's an empirical question_, not a moral one—and that [empirical questions have no privileged reason to admit convenient answers](https://www.lesswrong.com/posts/sYgv4eYH82JEsTD34/beyond-the-reach-of-god).

I view this conflict as entirely incidental, something that [would happen in some form in any place and time](https://www.lesswrong.com/posts/cKrgy7hLdszkse2pq/archimedes-s-chronophone), rather than having to do with American politics or "the left" in particular. In a Christian theocracy, our analogues would get in trouble for beliefs about evolution; in the old Soviet Union, our analogues would get in trouble for [thinking about market economics](https://slatestarcodex.com/2014/09/24/book-review-red-plenty/) (as a [positive technical discipline](https://en.wikipedia.org/wiki/Fundamental_theorems_of_welfare_economics#Proof_of_the_first_fundamental_theorem) adjacent to game theory, not yoked to a particular normative agenda).

@@ -194,6 +205,8 @@ I'm trying to _get the theory right_. My main victory condition is getting the t

It worked once, right?

+-----
+
> An extreme case in point of "handwringing about the Overton Window in fact constituted the Overton Window's implementation" OK, now apply that to your Kolmogorov cowardice
https://twitter.com/ESYudkowsky/status/1373004525481598978

https://www.lesswrong.com/posts/ASpGaS3HGEQCbJbjS/eliezer-s-sequences-and-mainst
> The actual real-world consequences of a post like this when people actually read it are what bothers me, and it does feel frustrating because those consequences seem very predictable (!!)
-http://www.hpmor.com/chapter/47
-https://www.hpmor.com/chapter/97
-> one technique was to look at what _ended up_ happening, assume it was the _intended_ result, and ask who benefited.
-
-> This about dath ilani, they are trained in a theory of social deception that says that people can arrange reasons, excuses, for anything, and so at the end of it all you look at what happened and try to build an explanation around that alone.
-https://www.glowfic.com/replies/1820866#reply-1820866
-
-
-
-> At least, I have a MASSIVE home territory advantage because I can appeal to Eliezer's writings from 10 years ago, and ppl can't say "Eliezer who? He's probably a bad man"

> Makes sense... just don't be shocked if the next frontier is grudging concessions that get compartmentalized

> Stopping reading your Tweets is the correct move for them IF you construe them as only optimizing for their personal hedonics
https://twitter.com/zackmdavis/status/1224433237679722500

-> I aspire to make sure my departures from perfection aren't noticeable to others, so this tweet is very validating.
-https://twitter.com/ESYudkowsky/status/1384671335146692608
-
"assuming that it was a 'he'"—people treating pronouns as synonymous with sex
https://www.youtube.com/watch?v=mxZBrbVqZnU

I realize it wasn't personal—no one _consciously_ thinking "I'm going to trick autogynephilic men into cutting their dicks off", but

-whom I have variously described as having "taught me everything I know" and "rewritten my personality over the internet"
-
-* the moment in October 2016 when I switched sides http://zackmdavis.net/blog/2016/10/late-onset/ http://zackmdavis.net/blog/2017/03/brand-rust/
-https://www.lesswrong.com/posts/jNAAZ9XNyt82CXosr/mirrors-and-paintings
-
> The absolute inadequacy of every single institution in the civilization of magical Britain is what happened! You cannot comprehend it, boy! I cannot comprehend it! It has to be seen and even then it cannot be believed!
http://www.hpmor.com/chapter/108

EGS??

(If the world were smaller, you'd never give different people the same name; if our memories were larger, we'd give everyone a UUID.)

-* papal infallability / Eliezer Yudkowsky facts
-https://www.lesswrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts?commentId=Aq9eWJmK6Liivn8ND
-Never go in against Eliezer Yudkowsky when anything is on the line.
-https://en.wikipedia.org/wiki/Chuck_Norris_facts
-
how they would actually think about the problem in dath ilan
https://www.reddit.com/r/TheMotte/comments/myr3n7/culture_war_roundup_for_the_week_of_april_26_2021/gw0nhqv/?context=3

https://arbital.greaterwrong.com/p/domain_distance?l=7vk

-I'm writing to you because I'm afraid that marketing is a more powerful force than argument. Rather than good arguments propogating through the population of so-called "rationalists" no matter where they arise, what actually happens is that people like Eliezer and you rise to power on the strength of good arguments and entertaining writing (but mostly the latter), and then everyone else sort-of absorbs most of their worldview (plus noise and [conformity with the local environment](https://thezvi.wordpress.com/2017/08/12/what-is-rationalist-berkleys-community-culture/)).
So for people who _didn't_ [win the talent lottery](http://slatestarcodex.com/2015/01/31/the-parable-of-the-talents/) but think they see a flaw in the _Zeitgeist_, the winning move is "persuade Scott Alexander".

https://www.facebook.com/yudkowsky/posts/10159611207744228?comment_id=10159611208509228&reply_comment_id=10159613820954228
> In the circles I run in, being poly isn't very political, just a sexual orientation like any other—it's normalized the way that LGBT is normalized in saner circles, not political the way that LGBT is political in crazier circles.

https://archive.is/7Wolo
> the massive correlation between exposure to Yudkowsky's writings and being a trans woman (can't bother to do the calculations but the connection is absurdly strong)

Namespace's point about the two EYs

+The level above "Many-worlds is obviously correct, stop being stupid" is "Racial IQ differences are obviously real; stop being stupid"— link to the correct contrarian cluster
-The level above "Many-worlds is obviously correct, stop being stupid" is "Racial IQ differences are obviously real; stop being stupid"
-
-
-Anyway, four years later, it turns out that this whole "rationality" subculture is completely fake. The thing that convinced me of this was not _even_ the late-onset-gender-dysphoria-in-males-is-not-an-intersex-condition thesis that I was originally trying to talk about. Humans are _really complicated_: no matter how "obvious" something in psychology or social science to me, I can't write someone off entirely simply for disagreeing, because the whole domain is so complex that I always have to acknowledge that, ultimately, I could just be wrong.
-
But in the _process_ of trying to _talk about_ this late-onset-gender-dysphoria-in-males-is-not-an-intersex-condition thesis, I noticed that my conversations kept getting _derailed_ on some variation of "The word _woman_ doesn't necessarily mean that." _That_ part of the debate, I knew I could win.

-what the math actually means in the real world from "Reply to Holden"

I guess I feel pretty naïve now, but—I _actually believed our own propaganda_. I _actually thought_ we were doing something new and special of historical and possibly even _cosmological_ significance.

I got a pingback to "Optimized Propaganda" from an "EDIT 5/21/2021" on https://www.lesswrong.com/posts/qKvn7rxP2mzJbKfcA/persuasion-tools-ai-takeover-without-agi-or-agency after Scott Alexander linked it—evidence for Scott having Power to shape people's attention

-https://slatestarcodex.com/2020/02/10/autogenderphilia-is-common-and-not-especially-related-to-transgender/
https://twitter.com/HiFromMichaelV/status/1221771020534788098
"Rationalism starts with the belief that arguments aren't soldiers, and ends with the belief that soldiers are arguments."

@@ -312,6 +294,8 @@ The robot cult is "only" "trying" to trick me into cutting my dick off in the se

> the problem with taqiyya is that your sons will believe you
https://twitter.com/extradeadjcb/status/1397618177991921667

+the second generation doesn't "get the joke"; young people don't understand physical strength differences anymore
+
> I've informed a number of male college students that they have large, clearly detectable body odors. In every single case so far, they say nobody has ever told them that before.
https://www.greaterwrong.com/posts/kLR5H4pbaBjzZxLv6/polyhacking/comment/rYKwptdgLgD2dBnHY

@@ -337,7 +321,6 @@ https://www.jefftk.com/p/an-update-on-gendered-pronouns

> Still think this was a perfectly fine tweet btw. Some people afaict were doing the literal ontologically confused thing; seemed like a simple thing to make progress on. Some people wanted to read it as a coded statement despite all my attempts to narrow it, but what can you do.
https://twitter.com/ESYudkowsky/status/1356535300986523648
-  If you were actually HONESTLY tring to narrow it, you would have said, "By the way, this is just about pronouns, I'm not taking a position on whether trans women are women"

https://www.gingersoftware.com/content/grammar-rules/adjectives/order-of-adjectives/

https://www.unqualified-reservations.org/2008/01/how-to-actually-defeat-us-government/

https://www.unqualified-reservations.org/2007/12/explanation-of-democratic-centrism/

-the second generation doesn't "get the joke"; young people don't understand physical strength differences anymore
+

that's the thing; I read as lefty because I am morally lefty (in contrast to Real Men Who Lift &c.); it's just that I had the "bad luck" of reading everything I could about race and IQ after the James Watson affair in 'aught-seven, and all my leftness is filtered through ten years of living with inconvenient hypotheses

@@ -384,9 +367,6 @@ https://twitter.com/fortenforge/status/1402057829142302721

uncritically (uncharacteristically uncritically) taking the newly-ascendant gender-identity theory for granted ("lots of women wouldn't be particularly disturbed if they had a male body; the ones we know as 'trans' are just the ones with unusually strong female gender identities"), without considering the obvious-in-retrospect hypothesis that "guy who speculates about his female analogue on a transhumanist mailing list in 2004" and "guy who thinks he might be a trans woman in Berkeley 2016" are the same guy.

-https://arbital.greaterwrong.com/p/logical_dt/?l=5gc
-It even leaked into Big Yud!!! "Counterfactuals were made for humanity, not humanity for counterfactuals."
-
If hiring a community matchmaker was worth it, why don't my concerns count, too?

When I protested that I wasn't expecting a science fictional Utopia of pure reason, but just for people to continue to be right about things they already got right in 2008, he said, "Doesn't matter—doesn't _fucking_ matter."

@@ -400,20 +380,6 @@ The boundary must be drawn here! This far, and no further!

People change their opinion with the party line—similarly with Scott and Yud
https://onlinelibrary.wiley.com/doi/abs/10.1111/ajps.12550?campaign=woletoc

-"No one begins to truly search for the Way until their parents have failed them, their gods are dead, and their tools have shattered in their hand."
-https://twitter.com/zackmdavis/status/1107874587822297089
-
-Robert Stadler
-
----------
-
-
-
-----------
-
-Really, it's impressive how far we got in establishing a cult of Bayesians in a world where "discimination" is a transgression against the dominant ideology! I can't be the only one who got the joke
-
https://stefanfschubert.com/blog/2020/12/22/legitimate-epistocracy

prevaricate

back in 'aught-nine, Anna commented that no one in our circle was that old, as i

> [Anna] seemed to disapprove of our putting pressure on Scott, because the fact that Scott has done a lot of great work is exactly what made him a target for our pressure.
-> Those who are savvy in high-corruption equilibria maintain the delusion that high corruption is common knowledge, to justify expropriating those who naively don't play along, by narratizing them as already knowing and therefore intentionally attacking people, rather than being lied to and confused

> I told Anna about Michael's "enemy combatants" metaphor, and how I originally misunderstood the point of the analogy. War metaphors sound Scary and Mean—I don't want to shoot my friends! But the point of the analogy (which Michael had explained at the time, but I wasn't ready to hear until I did a few more weeks of emotional processing) was specifically that soldiers on the other side of a war aren't particularly morally blameworthy as individuals: their actions are just being controlled by the Power they're embedded in. And Anna was like, "But you could still be friends with someone on an animal level, like with a dog", and I was like, "Yeah, that's basically what Michael said."

> He says he likes "monastic rationalism vs. lay rationalism" as a frame for the schism Ben is proposing.

> I suspect Scott is calling the wrong side monastic, though - we basically believe it can be done by lay people, he doesn't. I'll be pleasantly surprised if he gets the sides right, though.

Really, self-respecting trans people who care about logical consistency should abhor Scott and Eliezer's opinions—you should want people to use the right pronouns _because_ of your gender soul or _because_ your transition actually worked, not because categories are flexible and pronouns shouldn't imply gender

-> Because it was you, I tried to read this when it came out. But
-

Yudkowsky complains—not entirely without justification—that I ["do not know how to come to a point, because [I am] too horrified by the thought that a reader might disagree with [me] if [I] don't write even more first."](https://twitter.com/ESYudkowsky/status/1435605868758765568)

But I wasn't always this way. It's an adaptive response to years of trolling. The reason I write even more to get out ahead of the objections I can foresee, is because I've been _at this for six years_. I tried being concise _first_.

-You want concise? Meghan Murphy got it down to four words (which could have been three): "Men aren't women tho."
-
-
-
-
-
-
+You want concise? Meghan Murphy got it down to four words: "Men aren't women tho."

> If you think you can win a battle about 2 + 3 = 5, then it can feel like victory or self-justification to write a huge long article hammering on that; but it doesn't feel as good to engage with how the Other does not think they are arguing 2 + 3 = 6, they're talking about 2 * 3.

@@ -468,7 +421,7 @@ Speaking of narcissism and perspective-taking, "deception" isn't about whether y

I really appreciated Anatoly Vorobey's comments:

-> to provide context how it may (justifiably?) seem like over the last 7-8 years the rat. community largely fell *hard* for a particular gender philosophy
+> to provide context how it may (justifiably?) seem like over the last 7-8 years the rat. community largely fell *hard* for a particular gender philosophy

> ... fell for it in ways that seemed so spectacularly uncritical, compared to other beliefs carefully examined and dissected, and more so, even justified with a veneer of "rationality" (as in Scott's Categories post) that beautifully coincided with the tumblr dogma of the time...
@@ -487,22 +440,6 @@ example of hero-worship, David Pearce writes— https://www.facebook.com/algekalipso/posts/4769054639853322?comment_id=4770408506384602 > recursively cloning Scott Alexander—with promising allelic variations - and hothousing the “products” could create a community of super-Scotts with even greater intellectual firepower -https://twitter.com/ESYudkowsky/status/1434906470248636419 -> Anyways, Scott, this is just the usual division of labor in our caliphate: we're both always right, but you cater to the crowd that wants to hear it from somebody too modest to admit that, and I cater to the crowd that wants somebody out of that closet. - -Okay, I get that it was meant as humorous exaggeration. But I think it still has the effect of discouraging people from criticizing Scott or Eliezer because they're the leaders of the caliphate. I spent three and a half years of my life explaining in exhaustive, exhaustive detail, with math, how Scott was wrong about something, no one serious actually disagrees, and Eliezer is still using his social power to boost Scott's right-about-everything (!!) reputation. That seems really unfair, in a way that isn't dulled by "it was just a joke." - -Or as Yudkowsky put it— - -https://www.facebook.com/yudkowsky/posts/10154981483669228 -> I know that it's a bad sign to worry about which jokes other people find funny. But you can laugh at jokes about Jews arguing with each other, and laugh at jokes about Jews secretly being in charge of the world, and not laugh at jokes about Jews cheating their customers. Jokes do reveal conceptual links and some conceptual links are more problematic than others. - -It's totally understandable to not want to get involved in a political scuffle because xrisk reduction is astronomically more important! But I don't see any plausible case that metaphorically sucking Scott's dick in public reduces xrisk. It would be so easy to just not engage in this kind of cartel behavior! - -An analogy: racist jokes are also just jokes. Alice says, "What's the difference between a black dad and a boomerang? A boomerang comes back." Bob says, "That's super racist! Tons of African-American fathers are devoted parents!!" Alice says, "Chill out, it was just a joke." In a way, Alice is right. It was just a joke; no sane person could think that Alice was literally claiming that all black men are deadbeat dads. But, the joke only makes sense in the first place in context of a culture where the black-father-abandonment stereotype is operative. If you thought the stereotype was false, or if you were worried about it being a self-fulfilling prophecy, you would find it tempting to be a humorless scold and get angry at the joke-teller. - -Similarly, the "Caliphate" humor only makes sense in the first place in the context of a celebrity culture where deferring to Scott and Eliezer is expected behavior. (In a way that deferring to Julia Galef or John S. Wentworth is not expected behavior, even if Galef and Wentworth also have a track record as good thinkers.) I think this culture is bad. Nullius in verba. - Respect needs to be updateable. No one can think fast enough to think all their own thoughts. I have a draft explaining the dolphins thing, about why Nate's distaste for paraphyly is wrong. In Nate's own account, he "suspect[s] that ['[...] Not Man for the Categories'] played a causal role in [...] starting the thread out on fish." Okay, where did Scott get it from, then? 
I don't have access to his thoughts, but I think he pulled it out of his ass because it was politically convenient for him. I suspect that if you asked him in 2012 whether dolphins are fish, he would have said, "No, they're mammals" like any other educated adult. Can you imagine "... Not Man for the Categories" being as popular as it is in our world if it just cut off after section III? Me neither. @@ -555,7 +492,6 @@ It would be nice if children in rationalist Berkeley could grow up correctly congrats after Whale and Sawyer chimed in: https://twitter.com/ESYudkowsky/status/1435706946158432258 -the "It is sometimes personally prudent to be seen to agree with Stalin" attitude behaves like a conspiracy, even if I feel I've outlived myself https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4166378/ @@ -570,10 +506,6 @@ https://www.lesswrong.com/posts/PG8i7ZiqLxthACaBi/do-fandoms-need-awfulness https://graymirror.substack.com/p/the-journalist-rationalist-showdown?s=r -Scott Aaronson on the Times's hit piece of Scott Alexander— -https://scottaaronson.blog/?p=5310 -> The trouble with the NYT piece is not that it makes any false statements, but just that it constantly insinuates nefarious beliefs and motives, via strategic word choices and omission of relevant facts that change the emotional coloration of the facts that it does present. - contrast to masochism being an infohazard in dath ilan: in real life, when your sexuality is considered an infrohazard (supposedly for the benefit of people with your sexuality), you don't take it lying down Keltham contradicts himself inside of a single tag! Using the words "shape" and "covert" both times!! @@ -582,8 +514,6 @@ Scott has the power to set narratives, as evidenced by his attack on Michael hij maybe he should have some sympathy for Stephen J. Gould's intellectual crimes -Feb. 2016 happy price schedule https://www.facebook.com/yudkowsky/posts/10153956696609228 - one of the new collaborators on _Mad Investor Chaos_ is a _Catholic_ When my chat with EY at the independence day party degenerated into "I'm sorry for being dumb", he said if Zack Davis was too dumb, we're in trouble @@ -605,7 +535,7 @@ in this sense, I keep winning battles, but I've basically conceded the war Rubices— > with Zack in 2017 in particular, I don't know if it's still true now, there was also a lot of "women are brilliant and perfect and pure, and it would do damage to something beautiful for a man to pretend to be one" -gonna mostlky bite the bullet on this one +gonna mostly bite the bullet on this one https://dilbert.com/strip/2022-04-09 @@ -615,7 +545,6 @@ The time E.Y. recommended dark side epistemology as shortform still feels tellin https://discord.com/channels/401181628015050773/471045466805633029/934929649421672488 https://www.lesswrong.com/posts/fYC3t6QQDvdBBwJdw/plutonic_form-s-shortform?commentId=dghPKBjgiA6pTcCKz -"Death With Dignity" isn't really an update; he used to refuse to give a probability, and now he says the probability is ~0 https://www.foxylists.com/etiquette > 6. Do not ask for additional pictures, selfies or services they have not already agreed upon. @@ -1139,10 +1068,6 @@ https://www.greaterwrong.com/posts/FBgozHEv7J72NCEPB/my-way - - - - [TODO: https://twitter.com/ESYudkowsky/status/1404697716689489921 @@ -1169,10 +1094,6 @@ I'm totally still doing this If Scott's willing to link to A. Marinos, maybe he'd link to my memoir, too? 
https://astralcodexten.substack.com/p/open-thread-242
My reaction to Marinos is probably similar to a lot of people's reaction to me: geez, putting in so much effort to correct Scott's mistake is lame, what a loser, who cares

-https://twitter.com/esyudkowsky/status/1164332124712738821
-> I unfortunately have had a policy for over a decade of not putting numbers on a few things, one of which is AGI timelines and one of which is *non-relative* doom probabilities. Among the reasons is that my estimates of those have been extremely unstable.
-
-

--------

Still citing it (31 Jul 22): https://www.reddit.com/r/slatestarcodex/comments/wb

Still citing it (19 Sep 22): https://twitter.com/ingalala/status/1568391691064729603

+https://arbital.greaterwrong.com/p/logical_dt/?l=5gc
+It even leaked into Big Yud!!! "Counterfactuals were made for humanity, not humanity for counterfactuals."
+
------

If you _have_ intent-to-inform and occasionally end up using your megaphone to say false things (out of sloppiness or motivated reasoning in the passion of the moment), it's actually not that big of a deal, as long as you're willing to acknowledge corrections. (It helps if you have critics who personally hate your guts and therefore have a motive to catch you making errors, and a discerning audience who will only reward the critics for finding real errors and not fake errors.) In the long run, the errors cancel out.

comment on pseudo-lies post in which he says its OK for me to comment even thoug

bitter comments about rationalists—
https://www.greaterwrong.com/posts/qXwmMkEBLL59NkvYR/the-lesswrong-2018-review-posts-need-at-least-2-nominations/comment/d4RrEizzH85BdCPhE
+
+(If you are silent about your pain, _they'll kill you and say you enjoyed it_.)
+
+------
+
+Yudkowsky's hyper-arrogance—
+> I aspire to make sure my departures from perfection aren't noticeable to others, so this tweet is very validating.
+https://twitter.com/ESYudkowsky/status/1384671335146692608
+
+* papal infallibility / Eliezer Yudkowsky facts
+https://www.lesswrong.com/posts/Ndtb22KYBxpBsagpj/eliezer-yudkowsky-facts?commentId=Aq9eWJmK6Liivn8ND
+Never go in against Eliezer Yudkowsky when anything is on the line.
+https://en.wikipedia.org/wiki/Chuck_Norris_facts
+
+https://twitter.com/ESYudkowsky/status/1434906470248636419
+> Anyways, Scott, this is just the usual division of labor in our caliphate: we're both always right, but you cater to the crowd that wants to hear it from somebody too modest to admit that, and I cater to the crowd that wants somebody out of that closet.
+
+Okay, I get that it was meant as humorous exaggeration. But I think it still has the effect of discouraging people from criticizing Scott or Eliezer because they're the leaders of the caliphate. I spent three and a half years of my life explaining in exhaustive, exhaustive detail, with math, how Scott was wrong about something, no one serious actually disagrees, and Eliezer is still using his social power to boost Scott's right-about-everything (!!) reputation. That seems really unfair, in a way that isn't dulled by "it was just a joke."
+
+Or as Yudkowsky put it—
+
+https://www.facebook.com/yudkowsky/posts/10154981483669228
+> I know that it's a bad sign to worry about which jokes other people find funny. But you can laugh at jokes about Jews arguing with each other, and laugh at jokes about Jews secretly being in charge of the world, and not laugh at jokes about Jews cheating their customers.
Jokes do reveal conceptual links and some conceptual links are more problematic than others. + +It's totally understandable to not want to get involved in a political scuffle because xrisk reduction is astronomically more important! But I don't see any plausible case that metaphorically sucking Scott's dick in public reduces xrisk. It would be so easy to just not engage in this kind of cartel behavior! + +An analogy: racist jokes are also just jokes. Alice says, "What's the difference between a black dad and a boomerang? A boomerang comes back." Bob says, "That's super racist! Tons of African-American fathers are devoted parents!!" Alice says, "Chill out, it was just a joke." In a way, Alice is right. It was just a joke; no sane person could think that Alice was literally claiming that all black men are deadbeat dads. But, the joke only makes sense in the first place in context of a culture where the black-father-abandonment stereotype is operative. If you thought the stereotype was false, or if you were worried about it being a self-fulfilling prophecy, you would find it tempting to be a humorless scold and get angry at the joke-teller. + +Similarly, the "Caliphate" humor only makes sense in the first place in the context of a celebrity culture where deferring to Scott and Eliezer is expected behavior. (In a way that deferring to Julia Galef or John S. Wentworth is not expected behavior, even if Galef and Wentworth also have a track record as good thinkers.) I think this culture is bad. _Nullius in verba_. diff --git a/notes/blanchards-dangerous-idea-sections.md b/notes/blanchards-dangerous-idea-sections.md index 6d3946f..a4fc5b7 100644 --- a/notes/blanchards-dangerous-idea-sections.md +++ b/notes/blanchards-dangerous-idea-sections.md @@ -296,3 +296,6 @@ Idle anecdote on zero-sum intutions: I tend to invent fake conservation laws whe If my so-called "friends" are going to selectively pretend not to know basic Sequences shit in a way that has clearly been optimized for tricking me into cutting my dick off (independently of the empirical evidence determining whether or not we live in one of the possible worlds where cutting my dick off is a good idea), and still won't correct the mistake after I (and Michael, and Ben, and Sarah, and Zvi—and Jessica, who can be presumed to not share my object-level-politically-derived biases, because she seems pretty happy about having had her dick cut off) spend thousands of words patiently explaining the problem ... how do I know these "friends" aren't already secretly in the pay of the mine dwarves? They're certainly acting like it. A distributed intelligence that's using your former friends for processing power and actuators is trying to confuse you into cutting your dick off. Another former friend says, "Oh, that's just an exception; the distributed intelligence isn't usually that crazy. You shouldn't stop believing its story about how you need to give it your lunch money so that it can save the world." You think for a moment. "I guess that makes sense," you say, and hand over your lunch money. + +* the moment in October 2016 when I switched sides http://zackmdavis.net/blog/2016/10/late-onset/ http://zackmdavis.net/blog/2017/03/brand-rust/ +https://www.lesswrong.com/posts/jNAAZ9XNyt82CXosr/mirrors-and-paintings -- 2.17.1