From: M. Taylor Saotome-Westlake Date: Sat, 3 Jun 2023 19:38:32 +0000 (-0700) Subject: memoir: editing sweep ... X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=9c01d4240b9bd8716fa0faa02786ce03fa93d43e;p=Ultimately_Untrue_Thought.git memoir: editing sweep ... --- diff --git a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md index 4b6a8ea..eaa081f 100644 --- a/content/drafts/a-hill-of-validity-in-defense-of-meaning.md +++ b/content/drafts/a-hill-of-validity-in-defense-of-meaning.md @@ -203,19 +203,19 @@ As for the attempt to intervene on Yudkowsky—here I need to make a digression (If you are silent about your pain, _they'll kill you and say you enjoyed it_.) -Unfortunately, a lot of _other people_ seem to have strong intuitions about "privacy", which bizarrely impose constraints on what _I'm_ allowed to say about my own life: in particular, it's considered unacceptable to publicly quote or summarize someone's emails from a conversation that they had reason to expect to be private. I feel obligated to comply with these widely-held privacy norms, even if _I_ think they're paranoid and [anti-social](http://benjaminrosshoffman.com/blackmailers-are-privateers-in-the-war-on-hypocrisy/). +Unfortunately, a lot of _other people_ seem to have strong intuitions about "privacy", which bizarrely impose constraints on what _I'm_ allowed to say about my own life: in particular, it's considered unacceptable to publicly quote or summarize someone's emails from a conversation that they had reason to expect to be private. I feel obligated to comply with these widely-held privacy norms, even if _I_ think they're paranoid and [anti-social](http://benjaminrosshoffman.com/blackmailers-are-privateers-in-the-war-on-hypocrisy/). (This secrecy-hating trait probably correlates with the autogynephilia blogging; someone otherwise like me who believed in privacy wouldn't be telling you this Whole Dumb Story.) -So I would _think_ that the commonsense privacy-norm-compliance rule I should hold myself to while telling this Whole Dumb Story is that I obviously have an inalienable right to blog about _my own_ actions, but that I'm not allowed to directly refer to private conversations in cases where I don't think I'd be able to get the consent of the other party. (I don't think I'm required to go through the ritual of asking for consent in cases where the revealed information couldn't reasonably be considered "sensitive", or if I know the person doesn't have hangups about this weird "privacy" thing.) In this case, I'm allowed to talk about _me_ emailing Yudkowsky (because that was _my_ action), but I'm not allowed to talk about anything he might have said in reply, or whether he replied. +So I would _think_ that the commonsense privacy-norm-compliance rule I should hold myself to while telling this Whole Dumb Story is that I obviously have an inalienable right to blog about _my own_ actions, but that I'm not allowed to directly refer to private conversations with named individuals in cases where I don't think I'd be able to get the consent of the other party. (I don't think I'm required to go through the ritual of asking for consent in cases where the revealed information couldn't reasonably be considered "sensitive", or if I know the person doesn't have hangups about this weird "privacy" thing.) 
In this case, I'm allowed to talk about _me_ emailing Yudkowsky (because that was _my_ action), but I'm not allowed to talk about anything he might have said in reply, or whether he replied.

Unfortunately, there's a potentially serious loophole in the commonsense rule: what if some of my actions (which I would have _hoped_ to have an inalienable right to blog about) _depend on_ content from private conversations? You can't, in general, only reveal one side of a conversation.

-Supppose Alice messages Bob at 5 _p.m._, "Can you come to the party?", and also, separately, that Alice messages Bob at 6 _p.m._, "Gout isn't contagious." Should Alice be allowed to blog about the messages she sent at 6 _p.m._ and 7 _p.m._, because she's only describing her own messages, and not confirming or denying whether Bob replied at all, let alone quoting him?
+Suppose Alice messages Bob at 5 _p.m._, "Can you come to the party?", and also, separately, that Alice messages Bob at 6 _p.m._, "Gout isn't contagious." Should Alice be allowed to blog about the messages she sent at 5 _p.m._ and 6 _p.m._, because she's only describing her own messages, and not confirming or denying whether Bob replied at all, let alone quoting him?

I think commonsense privacy-norm-adherence intuitions actually say _No_ here: the text of Alice's messages makes it too easy to guess that sometime between 5 and 6, Bob probably said that he couldn't come to the party because he has gout. It would seem that Alice's right to talk about her own actions in her own life _does_ need to take into account some commonsense judgement of whether that leaks "sensitive" information about Bob.

-In part of the Dumb Story that follows, I'm going to describe several times when I and others emailed Yudkowsky to try to argue with what he said in public, without saying anything about whether Yudkowsky replied, or what he might have said if he did reply. I maintain that I'm within my rights here, because I think commonsense judgement will agree that me talking about the arguments _I_ made, does not in this case leak any sensitive information about the other side of a conversation that may or may not have happened: I think the story comes off relevantly the same whether Yudkowsky didn't reply at all (_e.g._, because he was too busy with more existentially important things to check his email), or whether he replied in a way that I found sufficiently unsatisfying as to occasion the futher emails with followup arguments that I describe; I don't think I'm leaking any sensitive bits that aren't already easy to infer from what's been said (and not said) in public. (Talking about later emails _does_ rule out the possible world where Yudkowsky had said, "Please stop emailing me," because I would have respected that, but the fact that he didn't say that isn't "sensitive".)
+In the substory (of my Whole Dumb Story) that follows, I'm going to describe several times when I and others emailed Yudkowsky to try to argue with what he said in public, without saying anything about whether Yudkowsky replied, or what he might have said if he did reply. 
I maintain that I'm within my rights here, because I think commonsense judgement will agree that me talking about the arguments _I_ made does not in this case leak any sensitive information about the other side of a conversation that may or may not have happened: I think the story comes off relevantly the same whether Yudkowsky didn't reply at all (_e.g._, because he was too busy with more existentially important things to check his email), or whether he replied in a way that I found sufficiently unsatisfying as to occasion the further emails with followup arguments that I describe; I don't think I'm leaking any sensitive bits that aren't already easy to infer from what's been said (and not said) in public. (Talking about later emails _does_ rule out the possible world where Yudkowsky had said, "Please stop emailing me," because I would have respected that, but the fact that he didn't say that isn't "sensitive".)

-It seems particularly important to lay out these principles of adherence to privacy norms in connection to my attempts to contact Yudkowsky, because part of what I'm trying to accomplish in telling this Whole Dumb Story is to deal reputational damage to Yudkowsky, which I claim is deserved. (We want reputations to track reality. If you see Carol exhibiting a pattern of intellectual dishonesty, and she keeps doing it even after you try talking to her about it privately, you might want to write a blog post describing the pattern in detail—not to _hurt_ Carol, particularly, but so that everyone _else_ can make higher-quality decisions about whether they should believe the things that Carol says.) Given that motivation of mine, it seems important that I only try to hang Yudkowsky with the rope of what he said in public, where you can click the links and read the context for yourself. In the Dumb Story that follows, I _also_ describe some of my correspondence with Scott Alexander, but that doesn't seem sensitive in the same way, because I'm not particularly trying to deal reputational damage to Alexander in the same way. (Not because Scott performed well, but because one wouldn't really have _expected_ Scott to perform well in this situation; Alexander's reputation isn't so direly in need of correction.)
+It seems particularly important to lay out these judgements about privacy norms in connection to my attempts to contact Yudkowsky, because part of what I'm trying to accomplish in telling this Whole Dumb Story is to deal reputational damage to Yudkowsky, which I claim is deserved. (We want reputations to track reality. If you see Carol exhibiting a pattern of intellectual dishonesty, and she keeps doing it even after you try talking to her about it privately, you might want to write a blog post describing the pattern in detail—not to _hurt_ Carol, particularly, but so that everyone _else_ can make higher-quality decisions about whether they should believe the things that Carol says.) Given that motivation of mine, it seems important that I only try to hang Yudkowsky with the rope of what he said in public, where you can click the links and read the context for yourself. In the substory that follows, I _also_ describe some of my correspondence with Scott Alexander, but that doesn't seem sensitive in the same way, because I'm not particularly trying to deal reputational damage to Alexander in the same way. 
(Not because Scott performed well, but because one wouldn't really have _expected_ Scott to perform well in this situation; Alexander's reputation isn't so direly in need of correction.)

In accordance with the privacy-norm-adherence policy just described, I don't think I should say whether Yudkowsky replied to Michael's and my emails, nor (again) whether he accepted the cheerful price money, because any conversation that may or may not have occurred would have been private. But what I _can_ say, because it was public, is that we saw [this addition to the Twitter thread](https://twitter.com/ESYudkowsky/status/1068071036732694529):

diff --git a/content/drafts/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer.md b/content/drafts/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer.md
index 9c8ab04..c3c6a0a 100644
--- a/content/drafts/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer.md
+++ b/content/drafts/blanchards-dangerous-idea-and-the-plight-of-the-lucid-crossdreamer.md
@@ -91,7 +91,7 @@ Between the reading, and a series of increasingly frustrating private conversati

(With the caveated understanding that psychology is complicated and there's [a lot to be said about what "as a first approximation" is even supposed to mean](/2022/Jul/the-two-type-taxonomy-is-a-useful-approximation-for-a-more-detailed-causal-model/), but I need a few paragraphs to first talk about the _simple_ version of the theory that makes _pretty good_ predictions on _average_, as a prerequisite for more complicated theories that might make even better predictions including on cases that diverge from average.)

-The idea is that male-to-female transsexualism isn't actually one phenomenon; it's two completely different phenomena that don't actually have anything to do with each other, except for the (perhaps) indicated treatments of hormone therapy, surgery, and social transition. (Compare to how different medical conditions might happen to respond to the same drug.)
+The theory was put forth by Blanchard in a series of journal articles in the late 'eighties and early 'nineties, but notably popularized (to some controversy) by J. Michael Bailey in the popular-level book _The Man Who Would Be Queen_ in 'aught-three. The idea is that male-to-female transsexualism isn't actually one phenomenon; it's two completely different phenomena that don't actually have anything to do with each other, except for the (perhaps) indicated treatments of hormone therapy, surgery, and social transition. (Compare to how different medical conditions might happen to respond to the same drug.)

In one taxon, the "early-onset" type, you have same-sex-attracted males who have just been extremely feminine (in social behavior, interests, _&c._) their entire lives going back to early childhood, in a way that's salient to other people and causes big social problems for them—the far tail of effeminate gay men who end up fitting into Society better as straight women. _That's_ where the "woman trapped inside a man's body" trope comes from. [This one probably _is_ a brain-intersex condition.](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3180619/)

@@ -397,43 +397,43 @@ When a political narrative is being pushed for _your_ alleged benefit, it's much

While I was in this flurry of excitement about my recent updates and the insanity around me, I thought back to that Yudkowsky post from back in March that had been my wake-up call to all this. 
("I think I'm over 50% probability at this point that at least 20% of the ones with penises are actually women"!!) What _was_ going on with that? -I wasn't _friends_ with Yudkowsky, obviously; I didn't have a natural social affordance to _just_ ask him the way you would ask a dayjob or college acquaintance something. But ... he _had_ posted about how he was willing to accept money to do things he otherwise wouldn't in exchange for enough money to feel happy about the trade—a Happy Price, or [Cheerful Price, as the custom was later termed](https://www.lesswrong.com/posts/MzKKi7niyEqkBPnyu/your-cheerful-price)—and his [schedule of happy prices](https://www.facebook.com/yudkowsky/posts/10153956696609228) listed $1,000 as the price for a 2 hour conversation. I had his email address from previous contract work I had done for MIRI back in '12, so on 29 September 2016, I wrote him offering $1,000 to talk about what kind of _massive_ update he made on the topics of human psychological sex differences and MtF transsexuality sometime between [January 2009](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions) and [March of the current year](https://www.facebook.com/yudkowsky/posts/10154078468809228), mentioning that I had been "feeling baffled and disappointed (although I shouldn't be) that the rationality community is getting this _really easy_ scientific question wrong." +I wasn't _friends_ with Yudkowsky, obviously; I didn't have a natural social affordance to _just_ ask him the way you would ask a dayjob or college acquaintance something. But ... he _had_ posted about how he was willing to accept money to do things he otherwise wouldn't in exchange for enough money to feel happy about the trade—a Happy Price, or [Cheerful Price, as the custom was later termed](https://www.lesswrong.com/posts/MzKKi7niyEqkBPnyu/your-cheerful-price)—and his [schedule of happy prices](https://www.facebook.com/yudkowsky/posts/10153956696609228) listed $1,000 as the price for a 2 hour conversation. I had his email address from previous contract work I had done for MIRI back in 'twelve, so on 29 September 2016, I wrote him offering $1,000 to talk about what kind of _massive_ update he made on the topics of human psychological sex differences and MtF transsexuality sometime between [January 2009](https://www.lesswrong.com/posts/QZs4vkC7cbyjL9XA9/changing-emotions) and [March of the current year](https://www.facebook.com/yudkowsky/posts/10154078468809228), mentioning that I had been "feeling baffled and disappointed (although I shouldn't be) that the rationality community is getting this _really easy_ scientific question wrong." At this point, any _normal people_ who are (somehow?) reading this might be thinking, isn't that weird and kind of cultish? Some blogger you follow posted something you thought was strange earlier this year, and you want to pay him _one grand_ to talk about it? To the normal person I would explain thusly— -First, in our subculture, we don't have your weird hangups about money: people's time is valuable, and paying people money in exchange for them using their time differently from how they otherwise would is a perfectly ordinary thing for microeconomic agents to do. Upper-middle–class normal people don't blink at paying a licensed therapist $100 to talk for an hour, because their culture designates that as a special ritualized context in which paying money to talk to someone isn't weird. 
In my culture, we don't need the special ritualized context; Yudkowsky just had a somewhat higher rate than most therapists.
+First, in our subculture, we don't have your weird hangups about money: people's time is valuable, and paying people money in exchange for them using their time differently from how they otherwise would is a perfectly ordinary thing for microeconomic agents to do. Upper-middle–class normal people don't blink at paying a licensed therapist $100 to talk for an hour, because their culture designates that as a special ritualized context in which paying money to talk to someone isn't weird. In my culture, we don't need the special ritualized context; Yudkowsky just had a higher rate than most therapists.

Second, $1,000 isn't actually real money to a San Francisco software engineer.

Third—yes. Yes, it _absolutely_ was kind of cultish. There's a sense in which, _sociologically and psychologically speaking_, Yudkowsky is a religious leader, and I was—am—a devout adherent of the religion he made up.

-By this I don't mean that the _content_ of Yudkowskian rationalism is much comparable to Christianity or Buddhism. But whether or not there is a God or a Divine (there is not), the _features of human psychology_ that make Christianity or Buddhism adaptive memeplexes are still going to be active. If the God-shaped whole in my head can't not be filled by _something_, it's better to fill it with a "religion" _about good epistemology_, one that can _reflect_ on the fact that beliefs that are adaptive memeplexes are not therefore true, and Yudkowsky's writings on the hidden Bayesian structure of the universe were a potent way to do that. It seems fair to compare my tendency to write in Sequences links to a devout Christian's tendency to quote Scripture by chapter and verse; the underlying mental motion of "appeal to the holy text" is probably pretty similar. My only defense is that _my_ religion is _actually true_ (and that my religion says you should read the texts and think it through for yourself, rather than taking anything on "faith").
+By this I don't mean that the _content_ of Yudkowskian rationalism is much comparable to (say) Christianity or Buddhism. But whether or not there is a God or a Divine (there is not), the _features of human psychology_ that make Christianity or Buddhism adaptive memeplexes are still going to be active. [If the God-shaped hole in my head can't not be filled by _something_](http://zackmdavis.net/blog/2013/03/religious/), it's better to fill it with a "religion" _about good epistemology_, one that can _reflect_ on the fact that beliefs that are adaptive memeplexes are often false. It seems fair to compare my tendency to write in Sequences links to a devout Christian's tendency to quote Scripture by chapter and verse; the underlying mental motion of "appeal to the holy text" is probably pretty similar. My only defense is that _my_ religion is _actually true_ (and that my religion says you should read the texts and think it through for yourself, rather than taking anything on "faith").

-That's the context in which my happy-price email thread ended up including the sentence, "I feel awful writing _Eliezer Yudkowsky_ about this, because my interactions with you probably have disproportionately more simulation-measure than the rest of my life, and do I _really_ want to spend that on _this topic_?" 
(Referring to the idea that, in a sufficiently large universe where many subjectively-indistinguishable copies of everyone exists, including inside of future superintelligences running simulations of the past, there would plausibly be _more_ copies of my interactions with Yudkowsky than of other moments of my life, on account of that information being of greater decision-relevance to those superintelligences.)
+That's the context in which my happy-price email thread ended up including the sentence, "I feel awful writing _Eliezer Yudkowsky_ about this, because my interactions with you probably have disproportionately more simulation-measure than the rest of my life, and do I _really_ want to spend that on _this topic_?" (Referring to the idea that, in a sufficiently large universe where many subjectively-indistinguishable copies of everyone exist, including [inside of future superintelligences running simulations of the past](https://www.simulation-argument.com/), there would plausibly be _more_ copies of my interactions with Yudkowsky than of other moments of my life, on account of that information being of greater decision-relevance to those superintelligences.)

I say all this to emphasize just how much Yudkowsky's opinion meant to me. If you were a devout Catholic, and something in the Pope's latest encyclical seemed wrong according to your understanding of Scripture, and you had the opportunity to talk it over with the Pope for a measly $1,000, wouldn't you take it? Of course you would!

-Anyway, I can't talk about the results of my cheerful price inquiry (whether he accepted the offer and a conversation occured, or what was said if it did occur), because I think the rule I should follow for telling this Whole Dumb Story is that while I have complete freedom to talk about _my_ actions and things that happened in public, I'm not allowed to divulge information about what Yudkowsky may or may not have said in private conversations that may or may not have occured, because even without an explicit secrecy promise, people might be less forthcoming in private conversations if they knew that you might blog about them later.Personally, I think most people are _way_ too paranoid about this; I often wish I could just say what relevant things I know without worrying about whether it might infringe on someone's "privacy". (This secrecy-hating trait probably correlates with the autogynephilia blogging; someone otherwise like me who believed in privacy wouldn't be telling you this Whole Dumb Story.) But I feel morally obligated to cooperate with widely-held norms even if I personally think they're dumb.
+Anyway, I don't think I should talk about the results of my cheerful price inquiry (whether he accepted the offer and a conversation occurred, or what was said if it did occur), because any conversation that _did_ occur would be protected by the privacy-norm-adherence rules that I'm holding myself to in telling this Whole Dumb Story.

(Incidentally, it was also around this time that I snuck a copy of _Men Trapped in Men's Bodies_ into the [MIRI](https://intelligence.org/) office library, which it was sometimes possible for community members to visit. It seemed like something Harry Potter-Evans-Verres would do—and ominously, I noticed, not like something Hermione Granger would do.)

------

-If I had to pick a _date_ for my break with progressive morality, it would be 7 October 2017. 
Over the past couple days, I had been having a frustrating Messenger conversation with some guy, which I would [later describe as feeling like I was talking to an AI designed to maximize the number of trans people](/2018/Jan/dont-negotiate-with-terrorist-memeplexes/).
+If I had to pick a _date_ for my break with progressive morality, it would be 7 October 2016. Over the past couple days, I had been having a frustrating Messenger conversation with some guy, which I would [later describe as feeling like I was talking to an AI designed to maximize the number of trans people](/2018/Jan/dont-negotiate-with-terrorist-memeplexes/).

He didn't even bother making his denials cohere with each other, insisting with no or minimal argument that my ideas were wrong _and_ overconfident _and_ irrelevant _and_ harmful to talk about. Over the previous weeks and months, I had been frustrated with the _Zeitgeist_, but I was trying not to be loud or obnoxious about it, because I wanted to be a good person and not hurt anyone's feelings and not lose any more friends. ("Helen" had rebuffed my last few requests to chat or hang out. "I don't fully endorse the silence," she had said, "just find talking vaguely aversive.")

-The conversation made it very clear to me that I could have no peace with the _Zeitgeist_. It wasn't the mere fact that some guy in my social circle was being dumb and gaslighty about it. It was that fact that his gaslighting was an unusually pure distillation of _socially normative_ behavior in Berkeley 2016. There were more copies of him than there were of me.
+This conversation made it very clear to me that I could have no peace with the _Zeitgeist_. It wasn't the mere fact that some guy in my social circle was being dumb and gaslighty about it. It was the fact that his performance was an unusually pure distillation of _socially normative_ behavior in Berkeley 2016; there were more copies of him than there were of me.

-It was a Huckleberry Finn moment for me: opposing this was worth losing friends, worth hurting feelings, and, actually, worth the other thing. I posted on Facebook in the morning and [on my real-name blog](http://zackmdavis.net/blog/2016/10/late-onset/) in the evening:
+Opposing this was worth losing friends, worth hurting feelings—and, actually, worth the other thing. I posted on Facebook in the morning and [on my real-name blog](http://zackmdavis.net/blog/2016/10/late-onset/) in the evening:

> the moment of liberating clarity when you resolve the tension between being a good person and the requirement to pretend to be stupid by deciding not to be a good person anymore 💖

-Former MIRI president Michael Vassar emailed me about it, and we ended up meeting once. (I had also emailed him back in August, when I had heard from my friend Anna Salamon that he was also skeptical of the transgender movement (Subject: "I've heard of fake geek girls, but this is ridiculous").)
+Former MIRI president Michael Vassar emailed me about the Facebook post, and we ended up meeting once. (I had also emailed him back in August, when I had heard from my friend Anna Salamon that he was also skeptical of the transgender movement (Subject: "I've heard of fake geek girls, but this is ridiculous").)

------

-I wrote about my frustrations to Scott Alexander of _Slate Star Codex_ fame (Subject: "J. Michael Bailey did nothing wrong"). 
The immediate result of this is that he ended up including a link to one of Kay Brown's study summaries (and expressing surprise at the claim that non-androphilic trans woman have very high IQs) in his [November links post](https://slatestarcodex.com/2016/11/01/links-1116-site-unseen/), and he [got some pushback even for that](https://slatestarscratchpad.tumblr.com/post/152736458066/hey-scott-im-a-bit-of-a-fan-of-yours-and-i).
+I wrote about my frustrations to Scott Alexander of _Slate Star Codex_ fame (Subject: "J. Michael Bailey did nothing wrong"). The immediate result of this was that he ended up including a link to one of Kay Brown's study summaries (and expressing surprise at the claim that non-androphilic trans women have very high IQs) in his [November 2016 links post](https://slatestarcodex.com/2016/11/01/links-1116-site-unseen/). He [got some pushback even for that](https://slatestarscratchpad.tumblr.com/post/152736458066/hey-scott-im-a-bit-of-a-fan-of-yours-and-i).

------

diff --git a/notes/memoir-sections.md b/notes/memoir-sections.md
index f2b7736..555c403 100644
--- a/notes/memoir-sections.md
+++ b/notes/memoir-sections.md
@@ -28,31 +28,33 @@ _ security code drama?? ("Although, speaking of documenting" 2 March)
_ integrate 5150 scene scraps into a coherent section

near editing tier—
-_ "maximize the number of trans ppl" conversation should briefly sample the guy's bogus conjunction of arguments
_ mention my "trembling hand" history with secrets, not just that I don't like it
_ NRx explanation should include the "there's only what won" line
_ It's not clear anyone he usually respects was making this mistake; it seems likely that the original thread was subtweeting Eric Weinstein, who was not making this mistake
-_ explain who Michael Bailey is earlier

people to consult specifically before pt. 1–3:
-_ Tail (AGP discussion)
-_ "Thomas" (privacy negotiations)
-_ Anna
-_ "Noreen"
-_ "Bill"
-_ "Rebecca" (consent, pseudonym choice)
-_ Alicorn: briefly, and for Melkor Glowfic reference link
-_ Sarah (name mention, whether to name conversation)
-_ Ben/Jessica (Michael)
-_ Scott
-_ Lex
-_ Divia? (ask if she wants to be mentioned?)
-_ hostile prereader? (first-choice: April)
+_ hostile prereader? (first-choice: April)
_ professional editor?
+_ Tail (AGP discussion) [pt. 1]
+_ "Thomas" [blanket]
+_ Anna [blanket]
+_ Sarah (name mention, whether to name conversation) [blanket]
+_ Ben/Jessica (Michael) [blanket]
+_ Scott [blanket]
+_ Divia? (ask if she wants to be mentioned?) [blanket]
+_ "Bill" [blanket]
+_ Sophia [pt. 1-2]
+
+_ Alicorn: briefly, and for Melkor Glowfic reference link [pt. 2-3]
+_ "Noreen" [pt. 2]
+_ Lex [pt. 2]
+_ "Rebecca" (consent, pseudonym choice) [pt. 2, not before 15 June]
+
+

--------------

-from the top editing session— bookmark phrase "At this point, any _normal people_"
+from the top editing session— bookmark phrase "A trans woman named Sophia"

----------------

@@ -76,7 +78,10 @@ _ Dolphin War finish

------

With internet available—
-_ Trump and Brexit and the summer of George Floyd
+_ link specifically to Sophia's comment on "Wicked Transcendence"
+_ "(to some controversy)" link
+_ what is an "encyclical"
+_ Trump and Brexit and the summer of George Floyd link?
_ links to PA hospitals
_ link to my Facebook posts on Azkaban, prematurely conceding the bet, and correcting the bet record
_ Scott on "right to waive your rights" and wrongful commitment