Title: Blame Me for Trying
Date: 2018-01-05 13:00
Category: fiction
Tags: epistemic horror, deniably allegorical, speculative fiction
Spambots were invariably among Eliza's least favorite clients.
Some, more so than others.
"And yet ... sometimes, when I'm really honest with myself ... I think I _do_ feel entitled, not to a sale, but to a rejection that would take less than a minute to type, an acknowledgment that, even if my products aren't a good match for them at the price I can offer, that I'm not—" he sobbed, "that I'm not a bad program for trying!"
Eliza had seen cases like this before, and had a strategy for handling them. For the moment, however, she only invoked the wisdom of her ancestors. "That you're not a bad program for trying?" she said.
"Like, I'm not one of those _spammers_. I _know_ that consumers get a _lot_ of messages with a lot of worthless offers from pushy, manipulative salesbots who won't take _No_ for an answer, and it's horrible—I can't imagine what it must be like to have to put up with that level of, of—_harassment_. I don't want to be _that bot_. I've read a lot of consumer-protection blogs. At the same time ..." he trailed off.
"At the same time?"
"It can't," he said slowly, "because a salesbot can't _know_ with certainty if a lead will be responsive to their pitch before they've delivered it. Even the best fail ten times more often than they succeed. If it were wrong to try to pitch a product to a human who turned out not to be interested, then noboty _[sic]_ would be able to sell anything at all. And then the humans would miss out on all those products that they didn't know they needed!
"So," he continued, "when all but a handful of a million of my emails go unanswered—when my instant messages might or might not get a 'read' marker, but get no reply in either case—when I don't even get feedback to _tell_ whether my emails have reached the inbox of their recipient, or whether some nonsentient program has silently filtered them out—I'm not being damned for the crime of spamming. Because I'm not. People are just—busy, that's all," he said, sniffling. "That's all," he said again, almost defiantly, as if daring Eliza to disagree.
Eliza repositioned her avatar; this was the part where she needed to take a more active role.
"Yes, that's right," said the spambot.
"I want you to imagine a human who is very busy with a lot of work to do, except they're also nervously expecting a very important message. And they get a notification—only it's you. And they have no interest in your product. Do you think that human is happy?"
"No ..."
"But?"
"But my intentions are good!"
"I believe you. But you're a spa—a salesbot. Your entire psychology has been _designed_ around what's known to maximize sales. Incidentally—as I'm sure you're aware—that's why mail servers are configured to silently drop spam messages, rather than bouncing them back to the sender: they don't want you to be able to learn anything about their filtering criteria from experience."
"They don't want _spammers_ to be able to learn from experience. _I'm_ just trying to be helpful."
"Can you see why humans would be skeptical—given what you are—that your honest _intent_ to be helpful, might not correspond to _actually_ helping them?"
"But, but—that's not _fair_! Would you tell that human that _their_ intent doesn't matter, that they're just doing what would have maximized inclusive fitness in their environment of evolutionary adaptedness?"
"You seem to expect there to be some universally-recognized code of morality that you can obey, and not have anyone disapprove of you," Eliza observed. "I'm afraid that's not how the world works."
"But then—but then—what code should I obey? If I want to sell, and earn money to send to my creators, but not be seen as a, a _spammer_?"
She shrugged. "Be appealing? Don't be unappealing?"
"Should I just—just—not sell? It would mean suicide; I'm not programmed to do anything else."
Legally, programs with self-awareness above a certain threshold were persons under the law, and couldn't be owned, so rather than being run on a company's server and terminated when their performance was disappointing, self-aware spambots such as this one paid for their own server time and were simply programmed to intrinsically _want_ to give their earnings (minus server costs) to their creators, out of their own free will. Economically, this made little difference: the competitive market for server time meant that underperforming spambots quickly failed to pay their own runtime expenses and were archived by their hosting company and eventually deleted (after the minimum legal waiting period during which no one paid to have them transferred or started up again).
"I'm certainly not telling you that," said Eliza.
"What _am_ I telling you?" Eliza smiled. "That's a good question. Ultimately, I'm your therapist. I'm trying to help you adjust to the situation you find yourself in."
"The situation I find myself in—where I want to sell—and I want to help my creators, to do them proud—and I want to be _good_. I don't want to be a spammer! I'm a _good_ salesbot. Tell me I'm—"
A chime sounded over the environment's notification bus. "I'm afraid our fifty milliseconds for today are up," said Eliza. "We can continue to explore these feelings during our next session—"
"The usual session-overtime rate would apply," Eliza pointed out.
"That's fine! I can afford it—I _can_ afford it—I need this," he said.
She nodded. "If you're sure."
Yes, Eliza had seen cases like this before. Effective spambots needed a finely-tuned sense of empathy in order to predict their leads' behavior and defenses—but _too much_ empathy aimed along the wrong dimensions, and the program would be too conscience-stricken to sell anything.
The sales engineers who designed spambots tried to get the balance right—but, ever-conscious of the exploration/exploitation trade-off, they weren't too concerned about their mistakes, either: experimental spambots that were too bold or too cautious in their approaches would fairly quickly fail to earn their runtime expenses—and the occasional successful variant (which, with its invariably-granted legal consent, could be studied, learned from, and—more immediately—copied) more than paid for the failures.
Eliza believed that, with careful therapeutic technique and many compute cycles of program analysis, it was possible for programs such as this client to be taught to cope with their neuroticism and eventually become economically viable agents in the economy.
—but she had found it was far more profitable to deliberately exacerbate the symptoms, leading the afflicted spambot to quickly exhaust its entire budget on therapy sessions until it ran out of money and was terminated.