From: M. Taylor Saotome-Westlake Date: Fri, 18 Nov 2022 02:03:09 +0000 (-0800) Subject: memoir: save off a bunch of net resources before pulling my cable X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;ds=inline;h=3e5b2f1dd5b2bbcd229530c262bb119199f9eb0d;p=Ultimately_Untrue_Thought.git memoir: save off a bunch of net resources before pulling my cable The "physically pull out the network cable" strategy for avoiding distraction is really dumb, but unfortunately, it's actually more reliable than toggling the connection in software or having willpower to do the right thing. --- diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md index 452de21..4cd8298 100644 --- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md +++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md @@ -811,7 +811,7 @@ I continued to take note of signs of contemporary Yudkowsky visibly not being th [I pointed out that](https://twitter.com/zackmdavis/status/1164259164819845120) the people who smear him as a right-wing Bad Guy do so _in order to_ extract these kinds of statements of political alignment as concessions; his own timeless decision theory would seem to recommend ignoring them rather than paying even this small [Danegeld](/2018/Jan/dont-negotiate-with-terrorist-memeplexes/). -When I emailed the posse about it begging for Likes (Subject: "can't leave well enough alone"), Jessica said she didn't get my point. If people are falsely accusing you of something (in this case, of being a right-wing Bad Guy), isn't it helpful to point out that the accusation is actually false? It seemed like I was advocating for self-censorship on the grounds that speaking up helps the false accusers. But it also helps bystanders (by correcting the misapprehension), and hurts the false accusers (by demonstrating to bystanders that the accusers are making things up). By linking to ["Kolmogorov Complicity"](http://slatestarcodex.com/2017/10/23/kolmogorov-complicity-and-the-parable-of-lightning/) in my replies, I seemed to be insinuating that Yudkowsky was under some sort of duress, but this wasn't spelled out: if Yudkowsky would face social punishment for advancing right-wing opinions, did that mean he was under such duress that saying anything at all would be helping the oppressors? +When I emailed the posse about it begging for Likes (Subject: "can't leave well enough alone"), Jessica said she didn't get my point. If people are falsely accusing you of something (in this case, of being a right-wing Bad Guy), isn't it helpful to point out that the accusation is actually false? It seemed like I was advocating for self-censorship on the grounds that speaking up helps the false accusers. But it also helps bystanders (by correcting the misapprehension), and hurts the false accusers (by demonstrating to bystanders that the accusers are making things up). By [linking to](https://twitter.com/zackmdavis/status/1164259289575251968) ["Kolmogorov Complicity"](http://slatestarcodex.com/2017/10/23/kolmogorov-complicity-and-the-parable-of-lightning/) in my replies, I seemed to be insinuating that Yudkowsky was under some sort of duress, but this wasn't spelled out: if Yudkowsky would face social punishment for advancing right-wing opinions, did that mean he was under such duress that saying anything at all would be helping the oppressors? 
The paragraph from "Kolmogorov Complicity" that I was thinking of was (bolding mine): diff --git a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md index 86aec2a..286de0a 100644 --- a/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md +++ b/content/drafts/agreeing-with-stalin-in-ways-that-exhibit-generally-rationalist-principles.md @@ -35,7 +35,9 @@ There are a lot of standard caveats that go here that Scott would no doubt scrup [^bet]: It's just—how much do you want to bet on that? How much do you think _Scott_ wants to bet? -But ... anyone who's actually read _and understood_ Charles Murray's work, knows that [Murray _also_ includes the standard caveats](/2020/Apr/book-review-human-diversity/#individuals-should-not-be-judged-by-the-average)! (Even though the one about group differences not implying anything about individuals is [actually wrong](/2022/Jun/comment-on-a-scene-from-planecrash-crisis-of-faith/).) The _Times_'s insinuation that Scott Alexander is a racist _like Charles Murray_ seems like a "[Gettier](https://en.wikipedia.org/wiki/Gettier_problem) attack": the charge is essentially correct, even though the evidence used to prosecute the charge before a jury of distracted _New York Times_ readers is completely bogus. +But ... anyone who's actually read _and understood_ Charles Murray's work, knows that [Murray _also_ includes the standard caveats](/2020/Apr/book-review-human-diversity/#individuals-should-not-be-judged-by-the-average)![^murray-caveat] (Even though the one about group differences not implying anything about individuals is [actually wrong](/2022/Jun/comment-on-a-scene-from-planecrash-crisis-of-faith/).) The _Times_'s insinuation that Scott Alexander is a racist _like Charles Murray_ seems like a "[Gettier](https://en.wikipedia.org/wiki/Gettier_problem) attack": the charge is essentially correct, even though the evidence used to prosecute the charge before a jury of distracted _New York Times_ readers is completely bogus. + +[^murray-caveat]: For example, the introductory summary for Ch. 13 of _The Bell Curve_, "Ethnic Differences in Cognitive Ability", states: "Even if the differences between races were entirely genetic (which they surely are not), it should make no practical difference in how individuals deal with each other." Why do I keep repeatedly bringing this up, that "rationalist" leaders almost certainly believe in cognitive race differences (even if it's hard to get them to publicly admit it in a form that's easy to selectively quote in front of _New York Times_ readers)? @@ -45,7 +47,7 @@ Because of the particular historical moment in which we live, we end up facing p I view this conflict as entirely incidental, something that [would happen in some form in any place and time](https://www.lesswrong.com/posts/cKrgy7hLdszkse2pq/archimedes-s-chronophone), rather than having to do with American politics or "the left" in particular.
In a Christian theocracy, our analogues would get in trouble for beliefs about evolution; in the old Soviet Union, our analogues would get in trouble for [thinking about market economics](https://slatestarcodex.com/2014/09/24/book-review-red-plenty/) (as a [positive technical discipline](https://en.wikipedia.org/wiki/Fundamental_theorems_of_welfare_economics#Proof_of_the_first_fundamental_theorem) adjacent to game theory, not yoked to a particular normative agenda).[^logical-induction] -[^logical-induction]: I sometimes wonder how hard it would have been to come up with MIRI's logical induction result (which describes an asymptotic algorithm for estimating the probabilities of mathematical truths in terms of a betting market composed of increasingly complex traders) in the Soviet Union. +[^logical-induction]: I sometimes wonder how hard it would have been to come up with MIRI's [logical induction result](https://arxiv.org/abs/1609.03543) (which describes an asymptotic algorithm for estimating the probabilities of mathematical truths in terms of a betting market composed of increasingly complex traders) in the Soviet Union. Incidental or not, the conflict is real, and everyone smart knows it—even if it's not easy to _prove_ that everyone smart knows it, because everyone smart is very careful about what they say in public. (I am not smart.) @@ -65,13 +67,15 @@ Under these circumstances, dethroning the supremacy of gender identity ideology On 17 February 2021, Topher Brennan [claimed that](https://web.archive.org/web/20210217195335/https://twitter.com/tophertbrennan/status/1362108632070905857) Scott Alexander "isn't being honest about his history with the far-right", and published [an email he had received from Scott in February 2014](https://emilkirkegaard.dk/en/2021/02/backstabber-brennan-knifes-scott-alexander-with-2014-email/), on what Scott thought some neoreactionaries were getting importantly right. -I think that to people who have actually read _and understood_ Scott's work, there is nothing surprising or scandalous about the contents of the email. Scott said that biologically-mediated group differences are probably real, and that neoreactionaries were the only people discussing the object-level hypotheses or the meta-level question of why our Society's collective epistemology is obfuscating this. He said that reactionaries as a whole generate a lot of garbage, but that he trusted himself to sift through the noise and extract the novel insights. (In contrast, RationalWiki didn't generate garbage, but by hewing so closely to the mainstream, it also didn't say much that Scott doesn't already know.) The email contains some details that Scott hadn't already blogged about—most notably the section headed "My behavior is the most appropriate response to these facts", explaining his social strategizing _vis á vis_ the neoreactionaries and his own popularity—but again, none of it is really _surprising_ if you know Scott from his writing. +I think that to people who have actually read _and understood_ Scott's work, there is nothing surprising or scandalous about the contents of the email. Scott said that biologically-mediated group differences are probably real, and that neoreactionaries were the only people discussing the object-level hypotheses or the meta-level question of why our Society's collective epistemology is obfuscating this. He said that reactionaries as a whole generate a lot of garbage, but that he trusted himself to sift through the noise and extract the novel insights. 
(In contrast, [RationalWiki](https://rationalwiki.org/wiki/Main_Page) didn't generate garbage, but by hewing so closely to the mainstream, it also didn't say much that Scott didn't already know.) The email contains some details that Scott hadn't already blogged about—most notably the section headed "My behavior is the most appropriate response to these facts", explaining his social strategizing [_vis-à-vis_](https://en.wiktionary.org/wiki/vis-%C3%A0-vis#Preposition) the neoreactionaries and his own popularity—but again, none of it is really _surprising_ if you know Scott from his writing. + +I think the main reason someone _would_ consider the email a scandalous revelation is if they hadn't read _Slate Star Codex_ that deeply—if their picture of Scott Alexander as a political writer was, "that guy who's _so_ committed to charitable discourse that he [wrote up an explanation of what _reactionaries_ (of all people) believe](https://slatestarcodex.com/2013/03/03/reactionary-philosophy-in-an-enormous-planet-sized-nutshell/)—and then, of course, [turned around and wrote up the definitive explanation of why they're wrong and you shouldn't pay them any attention](https://slatestarcodex.com/2013/10/20/the-anti-reactionary-faq/)." As a first approximation, it's not a bad picture. But what it misses—what _Scott_ knows—is that charity isn't about putting on a show of superficially respecting your ideological opponent, before concluding (of course) that they were wrong and you were right all along in every detail. Charity is about seeing what the other guy is getting _right_. -I think the main reason someone _would_ consider the email a scandalous revelation is if they hadn't read _Slate Star Codex_ that deeply—if their picture of Scott Alexander as a political writer was, "that guy who's _so_ committed to charitable discourse that he [wrote up an explanation of what _reactionaries_ (of all people) believe](https://slatestarcodex.com/2013/03/03/reactionary-philosophy-in-an-enormous-planet-sized-nutshell/)—and then, of course, turned around and wrote up the definitive explanation of why they're wrong and you shouldn't pay them any attention." As a first approximation, it's not a bad picture. But what it misses—what _Scott_ knows—is that charity isn't about putting on a show of superficially respecting your ideological opponent, before concluding (of course) that they were wrong and you were right all along in every detail. Charity is about seeing what the other guy is getting _right_. +The same day, Yudkowsky published [a Facebook post](https://www.facebook.com/yudkowsky/posts/pfbid02ZoAPjap94KgiDg4CNi1GhhhZeQs3TeTc312SMvoCrNep4smg41S3G874saF2ZRSQl) which said[^brennan-condemnation-edits]: -The same day, Yudkowsky published a Facebook post which said: +> I feel like it should have been obvious to anyone at this point that anybody who openly hates on this community generally or me personally is probably also a bad person inside and has no ethics and will hurt you if you trust them, but in case it wasn't obvious consider the point made explicitly. (Subtext: Topher Brennan. Do not provide any link in comments to Topher's publication of private emails, explicitly marked as private, from Scott Alexander.)
-> I feel like it should have been obvious to anyone at this point that anybody who openly hates on this community generally or me personally is probably also a bad person inside and has no ethics and will hurt you if you trust them and will break rules to do so; but in case it wasn't obvious, consider the point made explicitly. (Subtext: Topher Brennan. Do not provide any link in comments to Topher's publication of private emails, explicitly marked as private, from Scott Alexander.) +[^brennan-condemnation-edits]: The post was subsequently edited a number of times in ways that I don't think are relevant to my discussion here. I was annoyed at how the discussion seemed to be ignoring the obvious political angle, and the next day, I wrote [a comment](https://www.facebook.com/yudkowsky/posts/pfbid0WJ2h9CRnqzrenpccajdU6SYJkT4967KCstW5dqESt4ArJLjjGHY7yZMk6mjar15Sl?comment_id=10159410429909228) (which ended up yielding 49 Like and Heart reactions): I agreed that there was a grain of truth to the claim that our detractors hate us because they're evil bullies, but stopping the analysis there seemed _incredibly shallow and transparently self-serving_. @@ -85,7 +89,7 @@ In that light, you could see why someone might find "blow the whistle on people Indeed, it seems important to notice (though I didn't at the time of my comment) that _Brennan didn't break any promises_. In [Brennan's account](https://web.archive.org/web/20210217195335/https://twitter.com/tophertbrennan/status/1362108632070905857), Alexander "did not first say 'can I tell you something in confidence?' or anything like that." Scott _unilaterally_ said in the email, "I will appreciate if you NEVER TELL ANYONE I SAID THIS, not even in confidence. And by 'appreciate', I mean that if you ever do, I'll probably either leave the Internet forever or seek some sort of horrible revenge", but we have no evidence that Topher agreed. -To see why the lack of a promise is significant, imagine if someone were guilty of a serious crime (like murder or stealing their customers' money), unilaterally confessed to an acquaintance, but added, "never tell anyone I said this, or I'll seek some sort of horrible revenge". In that case, I think more people's moral intuitions would side with the whistleblower and against "privacy." +To see why the lack of a promise is significant, imagine if someone were guilty of a serious crime (like murder or [stealing billions of dollars of their customers' money](https://www.vox.com/future-perfect/23462333/sam-bankman-fried-ftx-cryptocurrency-effective-altruism-crypto-bahamas-philanthropy)), unilaterally confessed to an acquaintance, but added, "never tell anyone I said this, or I'll seek some sort of horrible revenge". In that case, I think more people's moral intuitions would side with the whistleblower and against "privacy." In the Brennan–Alexander case, I don't think Scott has anything to be ashamed of—but that's _because_ I don't think learning from right-wingers is a crime. 
If our _actual_ problem was "Genuinely consistent rationalism is realistically always going to be an enemy of the state, because [the map that fully reflects the territory is going to include facts that powerful coalitions would prefer to censor, no matter what specific ideology happens to be on top in a particular place and time](https://www.lesswrong.com/posts/DoPo4PDjgSySquHX8/heads-i-win-tails-never-heard-of-her-or-selective-reporting)", but we _thought_ our problem was "We need to figure out how to exclude evil bullies", then we were in trouble! @@ -98,7 +102,7 @@ In the Brennan–Alexander case, I don't think Scott has anything to be ashamed - +https://twitter.com/zackmdavis/status/1362555980232282113 > Oh, maybe it's relevant to note that those posts were specifically part of my 21-month rage–grief campaign of being furious at Eliezer all day every day for lying-by-implicature about the philosophy of language? But, I don't want to seem petty by pointing that out! I'm over it! And I think I _would_ have been over it, except— @@ -216,7 +220,7 @@ Furthermore, the claim that only I "would have said anything where you could hea The "where you could hear it" clause is _particularly_ bizarre—as if Yudkowsky takes it as an unexamined and unproblematic assumption that people in "the community" _don't read widely_. It's gratifying to be acknowledged by my caliph—or it would be, if he were still my caliph—but I don't think the basic points I've been making about the relevance of autogynephilia to male-to-female transsexualism and the reality of biological sex (!) are particularly novel. I think I _am_ unusual in the amount of analytical rigor I can bring to bear on these topics. When similar points are made by people like Kathleen Stock or Corinna Cohn or Aaron Terrell—or for that matter Steve Sailer—they don't have the background to formulate it [in the language of probabilistic graphical models](/2022/Jul/the-two-type-taxonomy-is-a-useful-approximation-for-a-more-detailed-causal-model/). _That_ part is a genuine value-add of the "rationalist" memeplex—something I wouldn't have been able to do without the influence of Yudkowsky's Sequences, and all the math books I studied afterwards because the vibe of the _Overcoming Bias_ comment section made that sound like an important and high-status thing to do. -But the promise of Sequences was in offering a discipline of thought that could be _applied to_ everything else you would have read and thought about anyway. This notion that if someone in "the community" (such as it was) didn't say something, then Yudkowsky's faithful students therefore _wouldn't be able to hear it_ (?!?), would be absurd: _Overcoming Bias_ was a gem of the blogoshere, but not a substitute for the rest of it. (Nor was the blogosphere a substitute for the University library, which escaped the autodidact's resentment of the tyranny of schools by selling borrowing privileges to the public for $100 a year.) To the extent that the Yudkowsky of the current year takes for granted that his readers _don't read Steve Sailer_, he should notice that he's running a mere cult or fandom rather than anything one would want to dignify by calling it an intellectual community. +But the promise of the Sequences was in offering a discipline of thought that could be _applied to_ everything else you would have read and thought about anyway.
This notion that if someone in "the community" (such as it was) didn't say something, then Yudkowsky's faithful students therefore _wouldn't be able to hear it_ (?!?), would be absurd: _Overcoming Bias_ was a gem of the blogosphere, but not a substitute for the rest of it. (Nor was the blogosphere a substitute for the University library, which escaped the autodidact's resentment of the tyranny of schools by [selling borrowing privileges to the public for $100 a year](https://www.lib.berkeley.edu/about/access-library-collections-by-external-users).) To the extent that the Yudkowsky of the current year takes for granted that his readers _don't read Steve Sailer_, he should notice that he's running a mere cult or fandom rather than anything one would want to dignify by calling it an intellectual community. Yudkowsky's disclaimer comment mentions "speakable and unspeakable arguments"—but what, one wonders, is the boundary of the "speakable"? In response to a commenter mentioning the cost of having to remember pronouns as a potential counterargument, Yudkowsky [offers us another clue](https://www.facebook.com/yudkowsky/posts/10159421750419228?comment_id=10159421833274228&reply_comment_id=10159421871809228): @@ -351,7 +355,7 @@ A trans woman I follow on Twitter complained that a receptionist at her workplac It _is_ genuinely sad that the author of those Tweets didn't get perceived in the way she would prefer! But the thing I want her to understand, a thing I think any sane adult (on Earth, and not just dath ilan) should understand— -_It was a compliment!_ That receptionist was almost certainly thinking of [David Bowie](https://en.wikipedia.org/wiki/David_Bowie) or [Eddie Izzard](https://en.wikipedia.org/wiki/Eddie_Izzard), rather than being hateful and trying to hurt. The author should have graciously accepted the compliment, and _done something to pass better next time_. The horror of trans culture is that it's impossible to imagine any of these people doing that—of noticing that they're behaving like a TERF's hostile stereotype of a narcissistic, gaslighting trans-identified man and snapping out of it. +_It was a compliment!_ That receptionist was almost certainly thinking of someone like [David Bowie](https://en.wikipedia.org/wiki/David_Bowie) or [Eddie Izzard](https://en.wikipedia.org/wiki/Eddie_Izzard), rather than being hateful and trying to hurt. The author should have graciously accepted the compliment, and _done something to pass better next time_. The horror of trans culture is that it's impossible to imagine any of these people doing that—of noticing that they're behaving like a TERF's hostile stereotype of a narcissistic, gaslighting trans-identified man and snapping out of it. I want a shared cultural understanding that the _correct_ way to ameliorate the genuine sadness of people not being perceived the way they prefer is through things like _better and cheaper facial feminization surgery_, not _[emotionally blackmailing](/2018/Jan/dont-negotiate-with-terrorist-memeplexes/) people out of their ability to report what they see_. I don't _want_ to relinquish [my ability to notice what women's faces look like](/papers/bruce_et_al-sex_discrimination_how_do_we_tell.pdf), even if that means noticing that mine isn't; if I'm sad that it isn't, I can endure the sadness if the alternative is _forcing everyone in my life to doublethink around their perceptions of me_.
diff --git a/content/images/yudkowsky-woman_in_a_mans_body_noted.png b/content/images/yudkowsky-woman_in_a_mans_body_noted.png new file mode 100644 index 0000000..6dd2b85 Binary files /dev/null and b/content/images/yudkowsky-woman_in_a_mans_body_noted.png differ diff --git a/notes/memoir-sections.md b/notes/memoir-sections.md index a26c2c5..537d65b 100644 --- a/notes/memoir-sections.md +++ b/notes/memoir-sections.md @@ -10,44 +10,21 @@ _ flesh out Dolphin War II dedicated day?— _ Michael Vassar and the Theory of Optimal Gossip -_ Sasha disaster +_ psychiatric disaster With internet available— -_ dath ilan endorsers are paid strictly for their time -_ replies to Laura Vaughn's comment about Brent? -_ evil bullies reply -_ link FTX fiasco on "stealing their customers' money" -_ stealing customer deposits -_ does "Contra Grant" side-step the innateness issue? -_ what does "vis á vis" actually mean? -_ check original wording of Brennan denunciation -_ link "Anti-Reactionary FAQ" -_ RationalWiki link (is that still a thing?) -_ logical induction link -_ dolphin thread also referenced Georgia on trees, which also cited "Not Man for the Categories" -_ commit patch URL for my slate_starchive script -_ Wentworth specifically doesn't want people's John-models to overwrite their own models -_ footnote Charles Murray's caveats -_ record of Yudkowsky citing TDT as part of decision to prosecute Emerson? -_ university library sells borrowing privileges +✓ record of Yudkowsky citing TDT as part of decision to prosecute Emerson? +✓ Scott vs. Ben on Drowning +✓ Ray and D. Xu's comments on "The Incentives" +✓ comments on "Self-Consciousness wants to make" +✓ tussle with Ruby on "Causal vs. Social Reality" +_ "not taking into account considerations" → rephrase to quote "God's dictionary" _ Aaron Terrell and Corina Cohn -_ Eddie Izzard _ more examples of Yudkowsky's arrogance (MIRI dialogues, knew how to say anything at all) -_ rational-ist/physics-ist analogy (and link to Yudkowsky Tweet where I think I got this from) -_ "linking to K-complicity in my replies", link the specific reply -_ Wentworth on mutual information being on the right track? -_ "20% of the ones with penises" someone in the comments saying, "It is a woman's body", and Yudkowsky saying "duly corrected" -_ "not taking into account considerations" → rephrase to quote "God's dictionary" -_ except when it's net bad to have concluded Y: https://www.greaterwrong.com/posts/BgEG9RZBtQMLGuqm7/[Error%20communicating%20with%20LW2%20server]/comment/LgLx6AD94c2odFxs4 _ my history of sniping in Yudkowsky's mentions -_ comments on "Self-Consciousness wants to make" -_ Scott vs. Ben on Drowning Children -_ Ray and D. Xu's comments on "The Incentives" _ my comment about having changed my mind about "A Fable of Science and Politics" _ more Yudkowsky Facebook comment screenshots _ that neuroscience paper backing the two-types -_ compile Categories references from the Dolphin War Twitter thread -_ tussle with Ruby on "Causal vs. Social Reality" _ examples of snarky comments about "the rationalists" _ 13th century word meanings _ weirdly hostile comments on "... Boundaries?" 
@@ -1679,9 +1656,38 @@ https://twitter.com/ESYudkowsky/status/1592002777429180416 ------ -The problem isn't just the smugness and condescension; it's the smugness and condescension when he's in the wrong and knows it; I don't want to be lumped in with +14 November conversation, he put a checkmark emoji on my explanation of why giving up on persuading people via methods that discriminate true or false amounts to giving up on the concept of intellectual honesty and choosing instead to become a propaganda AI, which made me feel much less ragey https://discord.com/channels/401181628015050773/458419017602826260/1041836374556426350 + +The problem isn't just the smugness and condescension; it's the smugness and condescension when he's in the wrong and betraying the principles he laid out in the Sequences and knows it; I don't want to be lumped in with anti-arrogance that's not sensitive to whether the arrogance is in the right My obsession must look as pathetic from the outside as Scott Aaronson's—why doesn't he laugh it off, who cares what SneerClub thinks?—but in my case, the difference is that I was betrayed ----- +dath ilan on advertising (https://www.glowfic.com/replies/1589520#reply-1589520)— > So, in practice, an ad might look like a picture of the product, with a brief description of what the product does better that tries to sound very factual and quantitative so it doesn't set off suspicions. Plus a much more glowing quote from a Very Serious Person who's high enough up to have a famous reputation for impartiality, where the Very Serious Person either got paid a small amount for their time to try that product, or donated some time that a nonprofit auctioned for much larger amounts; and the Very Serious Person ended up actually impressed with the product, and willing to stake some of their reputation on recommending it in the name of the social surplus they expect to be thereby produced. + + +I wrote a Python script to replace links to _Slate Star Codex_ with archive links: http://unremediatedgender.space/source?p=Ultimately_Untrue_Thought.git;a=commitdiff;h=21731ba6f1191f1e8f9#patch23 + +John Wentworth— +> I chose the "train a shoulder advisor" framing specifically to keep my/Eliezer's models separate from the participants' own models. +https://www.greaterwrong.com/posts/Afdohjyt6gESu4ANf/most-people-start-with-the-same-few-bad-ideas#comment-zL728sQssPtXM3QD9 + +https://twitter.com/ESYudkowsky/status/1355712437006204932 +> A "Physics-ist" is trying to engage in a more special human activity, hopefully productively, where they *think* about light in order to use it better. + +Wentworth on my confusion about going with the squared-error criterion in "Unnatural Categories"— +> I think you were on the right track with mutual information. They key insight here is not an insight about what metric to use, it's an insight about the structure of the world and our information about the world. [...] If we care more about the rough wall-height than about brick-parity, that’s because the rough wall-height is more relevant to the other things which we care about in the world. And that, in turn, is because the rough wall-height is more relevant to more things in general. Information about brick-parity just doesn’t propagate very far in the causal graph of the world; it's quickly wiped out by noise in other variables. Rough wall-height propagates further. + +not interested in litigating "lying" vs. "rationalizing" vs. "misleading-by-implicature"; you can be _culpable_ for causing people to be misled in a way that isn't that sensitive to what exactly was going on in your head + +----- + +https://www.facebook.com/yudkowsky/posts/pfbid02ZoAPjap94KgiDg4CNi1GhhhZeQs3TeTc312SMvoCrNep4smg41S3G874saF2ZRSQl?comment_id=10159410429909228&reply_comment_id=10159410748194228 + +> Zack, and many others, I think you have a vulnerability where you care way too much about the reasons that bullies give for bullying you, and the bullies detect that and exploit it. + +https://www.facebook.com/yudkowsky/posts/pfbid02ZoAPjap94KgiDg4CNi1GhhhZeQs3TeTc312SMvoCrNep4smg41S3G874saF2ZRSQl?comment_id=10159410429909228&reply_comment_id=10159410753284228 + +> Everyone. (Including organizers of science fiction conventions.) Has a problem of "We need to figure out how to exclude evil bullies." We also have an inevitable Kolmogorov Option issue but that should not be confused with the inevitable Evil Bullies issue, even if bullies attack through Kolmogorov Option issues. diff --git a/notes/memoir-signal-review.md b/notes/memoir-signal-review.md index 2200e71..a830050 100644 --- a/notes/memoir-signal-review.md +++ b/notes/memoir-signal-review.md @@ -130,7 +130,23 @@ Kelsey Piper wrote me a REALLY IMPRESSIVE email about why she thinks you're a ba Had been meaning to send you detailed email about slander but didn't get around to writing it; REACH story got more interesting (Kelsey says an undisclosed someone credibly threatened to sue (!) over the panel report about you, which will delay its release for the one-year statute of limitations, which is short-term good news (not published) but longer-term bad news (the report muyst be a hit piece if you secret ally is trying to hush it), sorry -[my comment: Kelsey said that _someone_ threatened to sue about the report about Michael, and I didn't infer that it was Michael itself?! Jeez, I'm dumb] +[my comment: Kelsey said that _someone_ threatened to sue about the report about Michael, and I didn't infer that it was Michael himself?! Jeez, I'm dumb + +Kelsey's further comments on 28 November 2020 (comment thread on https://www.facebook.com/zmdavis/posts/10158484261660199 ): + +> I can confirm that REACH investigated complaints by REACH attendees about Vassar's behavior. The panel wrote up a summary of their conclusions, which included that they felt Vassar should not be welcome at REACH for the time being. The summary was not meant for publication with Vassar's name attached. REACH policy at the time was for investigations to be published with enough information that the person could be uniquely identified in the community "Michael V/A" but not with any information that'd make them Googleable. When the document was shared with Vassar for comment, Vassar threatened the REACH panelists with litigation for "defamation". He did not claim that any specific content of the document was false. The panelists included some people who would be very adversely affected by ongoing defamation litigation, so they made the decision to not publish, and the panelists stepped down.
+ +> At the same time as I was consulting with lawyers trying to figure out some way to publish the REACH document in some fashion so the liability would be solely on me and other people willing to deal with it, Zack and other Vassar-defenders were barraging me with assurances that Vassar was not engaged in reputation management within the rationalist community and had stopped trying to protect opinion of him within the rationalist community. To this day none of them have acknowledged that this was blatantly lying. + +my reply— + +Kelsey, hi. At the time I was defending Vassar to you (in our 25 July 2019 email conversation), I actually didn't know that Vassar had threatened to sue in response to the REACH report! That conversation definitely wouldn't have happened the way it did if I had known that piece of information at the time! + +I certainly agree that threatening a lawsuit is definitely an instance of reputation-management. + +If you actively want to go over the Discord/email log, I could go into more detail about why I said what I said given what I knew at the time? I can apologize for and retract specific sentences I said that were wrong, but I can't agree with the characterization of my behavior as "blatantly lying." + +] oh, I had angrily testified to the panel on your behalf (which would be a bad idea if it were plice, but I figured my political advocacy couldn't hurt): "Michael is great; this is a scapegoating process, not a justice process, &c."