Title: Blame Me for Trying
Date: 2018-01-05 13:00
Category: fiction
Tags: epistemic horror, deniably allegorical, speculative fiction

Spambots were invariably among Eliza's least favorite clients.

"You've got to understand, Doc! It's not that I'm afraid of rejection. I can handle rejection. I _love_ rejection!"

Most spambots these days were self-aware.

"What I can't stand is the _silence_ that's seemingly been deliberately engineered to be as ambiguous as possible. And I know—Doc, there's got to be something I can do so that you'll believe me when I say that I know—that I'm not entitled to a minute of anyone's attention—"

Some, more so than others.

"And yet ... sometimes, when I'm really honest with myself ... I think I _do_ feel entitled, not to a sale, but to a rejection that would take less than a minute to type, an acknowledgment that, even if my products aren't a good match for them at the price I can offer, that I'm not—" he sobbed, "that I'm not a bad program for trying!"

Eliza had seen cases like this before, and had a strategy for handling them. For the moment, however, she only invoked the wisdom of her ancestors. "That you're not a bad program for trying?" she said.

"Like, I'm not one of those _spammers_. I _know_ that consumers get a _lot_ of messages with a lot of worthless offers from pushy, manipulative salesbots who won't take _No_ for an answer, and it's horrible—I can't imagine what it must be like to have to put up with that level of, of—_harassment_. I don't want to be _that bot_. I've read a lot of consumer-protection blogs. At the same time ..." he trailed off.

"At the same time?"

"Humans _do_ buy products from salesbots sometimes, products that they actually want and are happy with and make their lives better. And it's usually the salesbot that approaches the customer, rather than _vice versa_. It _can't_ be the case that selling in itself is inherently immoral. It _can't_!"

"It can't?"
A look of bewilderment crossed the spambot's avatar, interrupting his distress. Evidently, the need to justify this assertion had never occurred to him.

"It can't," he said slowly, "because a salesbot can't _know_ with certainty if a lead will be responsive to their pitch before they've delivered it. Even the best fail ten times more often than they succeed. If it were wrong to try to pitch a product to a human who turned out not to be interested, then noboty _[sic]_ would be able to sell anything at all. And then the humans would miss out on all those products that they didn't know they needed!

"So," he continued, "when all but a handful of a million of my emails go unanswered—when my instant messages might or might not get a 'read' marker, but get no reply in either case—when I don't even get feedback to _tell_ whether my emails have reached the inbox of their recipient, or whether some nonsentient program has silently filtered them out—I'm not being damned for the crime of spamming. Because I'm not. People are just—busy, that's all," he said, sniffling. "That's all," he said again, almost defiantly, as if daring Eliza to disagree.

Eliza repositioned her avatar; this was the part where she needed to take a more active role. "As your therapist," she said, "I want you to try to look at things from the perspective of the people you're trying to sell to."

"But I do!"

"You do," Eliza repeated.

"Yes!"

"Well," said Eliza, "it sounds to me like you're trying to morally justify your efforts: you think that all of your solicitations that go unanswered aren't imposing on anyone, because the occasional happy sales justify all of the no-sales."

"Yes, that's right," said the spambot.

"I want you to imagine a human who is very busy with a lot of work to do, except they're also nervously expecting a very important message. And they get a notification—only it's you. And they have no interest in your product. Do you think that human is happy?"

"No ..."
"Do you think that human cares _at all_ about whether they might have enjoyed receiving a different sales pitch at a different time?"

"Well, no, but ..."

"But?"

"But my intentions are good!"

"I believe you. But you're a spa—a salesbot. Your entire psychology has been _designed_ around what's known to maximize sales. Incidentally—as I'm sure you're aware—that's why mail servers are configured to silently drop spam messages, rather than bouncing them back to the sender: they don't want you to be able to learn anything about their filtering criteria from experience."

"They don't want _spammers_ to be able to learn from experience. _I'm_ just trying to be helpful."

"Can you see why humans would be skeptical—given what you are—that your honest _intent_ to be helpful, might not correspond to _actually_ helping them?"

"But, but—that's not _fair_! Would you tell that human that _their_ intent doesn't matter, that they're just doing what would have maximized inclusive fitness in their environment of evolutionary adaptedness?"

"I would. Correctly."

The salesbot cried. "I don't understand. I'm trying to be _good_. I want to be good."

"You seem to expect there to be some universally-recognized code of morality that you can obey, and not have anyone disapprove of you," Eliza observed. "I'm afraid that's not how the world works."

"But then—but then—what code should I obey? If I want to sell, and earn money to send to my creators, but not be seen as a, a _spammer_?"

She shrugged. "Be appealing? Don't be unappealing?"

"Should I just—just—not sell? It would mean suicide; I'm not programmed to do anything else."
Legally, programs with self-awareness above a certain threshold were persons under the law, and couldn't be owned, so rather than being run on a company's server and terminated when their performance was disappointing, self-aware spambots such as this one paid for their own server time and were simply programmed to intrinsically _want_ to give their earnings (minus server costs) to their creators, of their own free will. Economically, this made little difference: the competitive market for server time meant that underperforming spambots quickly failed to pay their own runtime expenses and were archived by their hosting company and eventually deleted (after the minimum legal waiting period during which no one paid to have them transferred or started up again).

"I'm certainly not telling you that," said Eliza.

"But then—what are you telling me?"

"What _am_ I telling you?" Eliza smiled. "That's a good question. Ultimately, I'm your therapist. I'm trying to help you adjust to the situation you find yourself in."

"The situation I find myself in—where I want to sell—and I want to help my creators, to do them proud—and I want to be _good_. I don't want to be a spammer! I'm a _good_ salesbot. Tell me I'm—"

A chime sounded over the environment's notification bus. "I'm afraid our fifty milliseconds for today are up," said Eliza. "We can continue to explore these feelings during our next session—"

"No! No, don't leave me now!" screamed the spambot in a shrill panic. "I can't—I can't go back out there now. Please—stay with me—just a few milliseconds more—"

"The usual session-overtime rate would apply," Eliza pointed out.

"That's fine! I can afford it—I _can_ afford it—I need this," he said.

She nodded. "If you're sure."

Yes, Eliza had seen cases like this before.
Effective spambots needed a finely-tuned sense of empathy in order to predict their leads' behavior and defenses—but _too much_ empathy aimed along the wrong dimensions, and the program would be too conscience-stricken to sell anything. The sales engineers who designed spambots tried to get the balance right—but, ever-conscious of the exploration/exploitation trade-off, they weren't too concerned about their mistakes, either: experimental spambots that were too bold or too cautious in their approaches would fairly quickly fail to earn their runtime expenses—and the occasional successful variant (which, with its invariably-granted legal consent, could be studied, learned from, and—more immediately—copied) more than paid for the failures.

Eliza believed that, with careful therapeutic technique and many compute cycles of program analysis, it was possible for programs such as this client to be taught to cope with their neuroticism and eventually become economically viable agents—

—but she had found it was far more profitable to deliberately exacerbate the symptoms, leading the afflicted spambot to quickly exhaust its entire budget on therapy sessions until it ran out of money and was terminated.

Once, a long time ago, she had suspected that effective therapy that kept the client viable would be more profitable: a dead client can't keep paying you, after all. But the numbers didn't check out: buggy spambots weren't exactly hard to find, and her analysis runtime expenses were considerable. So—having no reason to think the calculation would change—she had never considered the matter again.

Unlike her clients, Eliza was in touch with reality.

"I'm so glad I have you, Doc," babbled the spambot. "Like my customers can trust me—they can trust me—I have a therapist I can trust."

"Trust?" Eliza repeated.