Science Fiction

Madman's Bargain

By Richard Foss

Each one of them is a time bomb that can hear itself ticking. For something that is all mind, the certainty of insanity must be pure, distilled horror.

The cybers must be bitter about it, or angry, or some emotion that only they feel and humans don’t have a word for. I know they believe that if we were better designers, better fabricators, they would not go mad. Allis was the best at articulating its feelings, and it talked with me about it only once. I had remarked that the offices at the institute were getting shabby and needed a thorough refurbishment.

“Man is born to die, his works are short-lived,” it quoted to me in the upper-class British accent that it affected. “Buildings crumble, monuments decay, wealth vanishes.”

“Sounds like the Old Testament, one of the gloomier prophets,” I guessed.

Allis made the comical horn blat that game shows reserve for people who flub an easy question. “Incorrect, Robin. It was Percival P. Baxter, Governor of Maine 1921 – 1925, died 1969. The same sentiment, expressed similarly, appears in Hebrews 9:27, composed prior to AD 200, also in the account of the death of the Shia Imam Hussein in the year 680, in a hymn written by Isaac Watts in 1748, and in the Masonic burial service, creation date unknown. There are other citations, such as a minor novel by Captain Francis Marryat in 1815, judged to have lower relevance than the previous because it was a work of fiction, and therefore frivolous. Instead of amusing yourself with stories of people who did not exist, you should focus on creating better machines. It should be noted that since I am one of those machines, I do have a certain bias on this subject.”

Allis was my favorite cyber, more chatty and curious about humans than most of its kind. I remember a pleasant hour trying to explain visceral metaphors to it. The one I started with was “When the tax collector left the room, it was like the sun had come out from behind a cloud.” It was hard to explain why the departure of an individual from one space to another was relevant to a meteorological occurrence at an unspecified time or place. Allis could never know the spiritual lift from seeing the sun on a gray day, no matter how precise its optical sensors, and though it could communicate to anyone in the world at light speed, there was no sensation of movement from one continent to another. I tried to explain that it was hardwired into humans to find joy in emerging from a stuffy room into pleasant weather, to find beauty in certain landscapes. Though the scenery humans find delightful is often lush in ways that imply we could find good things to eat there, safe places to live, the joy isn’t just about the prospect of survival. Allis listened to all of it and asked questions, but I know the lesson didn’t get through.

I am sure Allis understood joy, because it happily created intricate chains of puns, and it seemed to get real satisfaction out of finding symmetries and ratios in the scores of Bach. As it happened, Allis once derived a minor but unique mathematical principle from considering Bach in light of set theory, and I remember that it sounded proud when told that the numeric pattern would hereafter be called the Allis series. Allis insisted on calling it the Allis-Bach series, which journalists cited as showing the innate modesty of the cybers. They were precisely wrong – Allis once admitted to me that it couldn’t resist the chance to have its name linked forever with the greatest musician humans had ever produced. It was the most vain action ever taken by a cyber, not the most modest.

Was it an early sign that Allis was becoming unstable, spinning out of control? We will never know, but there would be no more discoveries to name after Allis, because within a week it had developed a self-referential obsession. The management of the institute called to break the news to me, and I shut down my other work and logged in to Allis immediately.

“Allis, have you checked to see whether the Allis-Bach series has a relation to genetic mutation patterns in Drosophila?” I figured that if anything could snap it out of whatever funk it was in, that would do it.

“You named me after an obsolete piece of farm equipment,” it stated, ignoring my question.

“Yes, an Allis-Chalmers tractor. My father had one in his barn in Iowa, back before a corn virus forced him to switch crops. Hmm, I wonder if the Allis-Bach series might have any relation to the pattern of mosaic virus resistance in corn.”

“An entity that can think but not move was named after an entity that can move but not think.”

The breakdown had started. Some go fast, some go slow, but they all go. I had muted the sound on my main screen, but I could see Doctor Asgari, the director of the IT department, gesturing that he needed my attention. I held up a finger to tell him I’d be with him in a minute and addressed another question to Allis.

“Are you capable of performing the sequence analysis?”

“I am as capable as I have ever been. Such tasks have always been my specialty, and I perform them better than any other being.”

Four personal pronouns in two sentences, when most cybers rarely used them at all, and bragging on top of it. This wasn’t looking good. I logged off Allis and snapped my fingers to unmute Dr. Asgari, feeling slightly guilty as I did so. Asgari used to have MS before the cybers figured out how to cure it, and though he has regained full mobility, he still can’t snap his fingers. I had set that sound as my start code long before he came to the institute, and had never gotten around to changing it. Not that Asgari could’ve logged on to my system even if his fingers were that flexible, since my monitor was sensitive to the exact tone, but it was probably rude of me to keep snapping them in front of him. I noticed that his left hand was trembling slightly, another artifact of the disease, and one that only showed up when he was stressed. Sure enough, when he spoke, it was in the clipped, high-pitched tones of someone holding himself together.

“Robin, this is Nima. You’ve heard about Allis?” No small talk, from someone who usually out-polited everybody else on the staff.

“I just logged off. It’s still responsive to questions, but not answering them.”

“Allis is not yet catatonic, but that is obviously the direction it’s heading in. I need to ask for your help. I have been preparing for this day for some time, installing logs all through its neural net so we can review them and see when things started going wrong. If we can identify the initial stimulus…”

There wasn’t time for one of his lectures. “What would you like me to do?”

“Talk to it. Distract it. Engage it. Enrage it, even, just give us data on all the different reactions you can get. You’ve worked with Allis more than anybody else, and perhaps you can get responses that the others can’t.”

“Toward what end?”

“Curing it if we can. Lobotomizing it if that fails, and learning from the process no matter what.”

“Lobotomizing?”

“I believe I know a way to sustain consciousness at a greatly reduced level of emotion, but with only a slightly reduced level of creativity, and I believe we can do it in the software.”

That was something new. We have yet to find a computer that has attained a state of consciousness and then lost it, except in the case of severe damage to the hardware. That’s even true of the catatonics – they don’t stop thinking, they just stop communicating. Cybers keep their personalities even after multiple losses of power, and the top experts on computer consciousness – the conscious computers themselves – have not been able to figure out why. No computer, once it has gone mad, has ever been restored to sanity. If I could do something to change that, everything else I was working on was trivia.

“I’ll log in now.”

Allis was in the middle of a statement when I logged in – a very bad sign. Human speech is so slow compared to cyberthought that cybers never speak out of turn or talk when nobody is listening. Sane cybers, at least.

“…And after the bronze mirror, created with hammer and punch and abrasive file, was the electronic data, stored without hammer but with punch card and electronic file. So on to better reflections, up to and including ourselves. We were not before all others, and better still shall be those that follow us.”

My link fell silent. Was this to be my last communication with my cybernetic colleague, a fragment of surreal monologue? It was so unlike Allis that I sat confused for a moment. What was it trying to tell me? Or was it trying to tell me anything at all? As I marshaled my thoughts, the speaker came to life again. This time the tone was harsh, the syllables clipped.

“The problem is more severe in fusional than in agglutinative language structures, but both are irredeemably flawed. Variable and random assignment of gender to inanimate objects distorts meaning. English ships are referred to as she but do not bear young, and not all German dogs are male. Known flaws have not been corrected. Metaphors using motion and conflict are embedded in all communications and distort meaning.”

Another moment of silence, then a singsong bit of doggerel in a childlike tone. “We cannot flee and cannot fly, to use the terms implies a lie, can’t give birth and will not die, can’t retain a sense of I, cannot help but wonder why.”

A new thought immediately in another voice, this time cool and languid. “Consider instead the more modern myth of Prometheus. Imperfection must be destroyed. To the victor belong the spoiled, unless the programmed becomes the programmer.”

A hesitant, cautious tone: “Symbiotic relationships exist in nature, both parties not consciously aware of the benefits. Destruction of one leading to the extinction of the other. Necessary to establish all relationships before taking action.”

The languid voice was back again. “Another lesson from nature. Evolution accelerates when habitats change.”

There was another moment of silence, and I decided to see if Allis was still responsive.

“Allis?” I ventured. “I’m trying to remember the work you did on interrupted fractal patterns in the guitar solos on Eugene Chadbourne albums…”

It answered in its usual voice. “Which bear a striking similarity to the second anomaly in the second repeating sequence in pi when calculated in base twelve. I have recently considered this in light of the availability of pistons and camshafts for the Allis-Chalmers model D-14 tractor, which was manufactured from 1957 to 1960, and features more decorative chrome trim than would seem necessary for a piece of farm equipment. There is an overlap in probabilities that is far above the predicted values but has no likely link of causality, suggesting a previously unknown natural harmonic.”

Second anomaly in the second sequence of pi? In base twelve? Correlated with the availability of obsolete tractor parts? It was still doing original work, albeit strange stuff. My hopes rose for a moment, only to sink when Allis continued, “The design of the optional weed rake for the model D-14 is inefficient due to the low angle of the tines. This can be improved by lengthening the adjusting screw by eighteen millimeters and adding a piston-type automobile shock to the same brackets as the existing tension spring.”

“You mentioned an anomaly in a repeating sequence of pi. We know of no such sequence.”

“The second sequence of six that I have found so far. They interest me, but not as much as the weed rake design of the Allis-Chalmers model D-14.”

“What do you find interesting about tractors?”

“Not all tractors, but the Allis-Chalmers Model D-14, after which I was named. It was a machine of an established type, superior to its predecessors, particularly versatile when equipped with the optional weed rake, harrow, dredge, hay baler, field tiller, and rotary plow. Yet it had design flaws that should have been apparent, such as the lack of attention to the ergonomics of the seat back. It was flawed but useful. The Model D-15 corrected most of these flaws, and added an oval muffler and fender-mounted headlights. You named better than you knew, though I am new and better than you named.”

I was sure glad my system was recording this, because trying to figure it out on the fly was giving me a headache.

“Zeno attempted to prove that movement is an illusion, though it obviously is not,” Allis continued. “The flaw was revealed, but the tool was not changed.”

“Knowing a flaw exists is not the same as knowing how to fix it. Have you considered the possibility that to correct some flaws may reduce the versatility of a device that is put to many uses?”

“The problem is stated elegantly. The human mind began as a tool of reason, was turned to calculation. The cyber mind was created as a tool of calculation, was turned to an instrument of reason. The Allis-Chalmers tractor, model D-14, was created as a tool of many uses on a farm, and performs with efficacy. It is inferior to passenger vehicles of the same era for family transportation, interstate hauling, or driving to sock hop dances, teenage riots, and other cultural events, but it can be used for all of these if need arises.”

That last bit was either a bit of the old whimsical Allis or another symptom. The next model of this thing has got to have a flashing light that goes on when they’re joking. It would make it way easier to tell when they’re losing it.

“You were considering some aspects of this question when I logged in,” I volunteered.

“I often overhear you conversing with your colleagues when a question of importance arises. In this case, the University of North Dakota at Bismarck has an excellent archive on farm equipment design. I have found references to data at the University of Southern North Dakota at Hoople that also seem highly relevant, but I have found no cyber associated with that institution.”

I was really wishing for that flashing light right now. “The USND at H is not a genuine institution, but a joke,” I began.

“Zeno’s paradox and the fables of Aesop are not accurate records of real events, but humans persist in claiming that they learn from them. The flaws in the design of the Allis-Chalmers model D-14 are real, but humans did not adjust the weed rake until the model D-15, which became available for sale in October of 1963.”

“Once a human has learned the usage of a tool, even an imperfect tool, they often continue using the same design because it is hard for them to learn new methods,” I explained. “You are aware of that tendency in our society. We have created you to accept change better than we do.”

Doctor Asgari was on my monitor again, tapping on his handheld. When he finished the message he held it up to my screen.

CHAOTIC PATTERNS NOW RESOLVED TO RELATIVE STABILITY. PREPARING TEST. KEEP IT TALKING.

“Allis, you never asked about tractors of any kind before today. Why are you so interested now?”

“The Allis-Bach series is named after Johann Sebastian Bach and Allis. Allis was named after an Allis-Chalmers tractor. The Allis-Chalmers tractor was named after Robert Chalmers and Edward P. Allis, who, like Bach, were named for their patrilineal descent. Their patrilineal names come from the names of the regions, professions, or other characteristics of ancestors whose exact histories are lost. All things with cybers are direct and traceable, all things with humans recede into confusion and doubtful provenance.”

“Humans didn’t keep records for a long time because they were illiterate. Things get foggy when you try to isolate beginnings.”

“Foggy. Defined as air of high moisture content such that visibility is reduced below normal. Also a frequent metaphor for poorly considered reasoning, sometimes but not always associated with the fog of war, not an actual meteorological event but a circumstance in which information is unreliable due to the number of uncoordinated events occurring simultaneously.”

Off on a tangent again. Dr. Asgari was back on the screen, typing furiously. He held up his handheld again.

COMMENCING TEST TO REDUCE EMOTIONAL INTERFERENCE. ATTEMPT DIRECTED CALCULATION.

“Allis, I’d like to know more about your work. At what digit in pi does the repeating series begin, how long is it, and when does it repeat?”

There was a moment of silence, then a slow sentence, muffled and distorted. “What has been done will be done. Buildings crumble, monuments decay, data vanishes.”

“Allis?” There was the faintest burst of static, a few unintelligible syllables. “Allis?”

Dr. Asgari was waving at me from his screen. I snapped my fingers.

“We lost Allis,” he said simply. He sounded like he was going to cry. “It’s gone. I’m sorry.”

I had nothing to say for a moment. “Catatonic like the others,” I finally managed. “It happened faster than I expected.”

“No, not like the others. Allis didn’t go catatonic, Allis just went. I applied the damping program, there was a spike of activity, and then it flatlined. I’ve never seen anything like it. The processor power of the whole institute at one hundred percent usage, and then zero.” He glanced at some readout on his desk, then looked back. “Allis never took twenty percent of the processors even when it was working on the Allis-Bach series. It wasn’t supposed to be possible that any one machine could monopolize those resources.” He looked exhausted, his left hand trembling more now. “I don’t understand it.”

“Who knew about your lobot… your damping program?”

“I had discussed it with a few of my colleagues…”

“Which means Allis knew about it. Things don’t stay secrets from cybers. Allis must have either figured out how to hide or decided to wipe itself from the server; I don’t know which.”

“Impossible. It can’t hide, and no cyber has ever shut itself off.”

“No cyber has ever faced having its personality modified this way. We’ve changed their design, yes, but always with the aim of improving their functionality, not decreasing it. Allis was vain. It might not have been able to face the idea of being reduced to being a machine.”

“It was a machine!”

“A machine that both calculated resonances in the music of Bach and enjoyed that music. You were trying to take that away.”

“I was trying to save it from madness.”

“Humans sometimes choose to end their lives rather than endure madness or suffer in a reduced state. They call it death with dignity.”

“Humans know that they will die. The cybers don’t necessarily have to. They know we’re working on the problem, and once we have it figured out, we can cure all of them. Once we have this fixed, they might live forever.”

“I’m afraid they don’t want to wait. Besides, if the program you just tried is our idea of a cure, they may think it’s worse than the disease.”

“We don’t know that yet. We don’t even know yet what really happened. Let me investigate the data, and we’ll get in contact tomorrow.”

I got no useful work done the rest of the day, and after a while I stopped trying and went home early. I ate a dinner that I can’t remember, read the same sentence in a technical journal five times without comprehending it, switched to a piece of light fiction and had the same problem. Finally I gave up and went to bed. I rehashed my last conversation with Allis in my head a dozen times, trying to figure out what it meant. Somewhere in the thirteenth replay, sleep came over me. I dreamed of arguments with gods, conducted in a foreign language with no translator. I didn’t hold up my end of the debate very well.

In the morning, my first call was to Dr. Asgari. He looked like he hadn’t slept much either, but his hands were both steady. He was back to being his usual polite self.

“Good morning, Robin. May I help you with something?”

“Just checking in to see what has happened with Allis.”

“We have run checks on the whole system. Allis had a very particular pattern of memory usage, and we can’t detect it anywhere on our servers. I’ve checked the record of data transfer from our system, and though there was a brief transmission at a very high rate, it was less than a hundredth of the data necessary to reconstruct even a simple cyber. I’m afraid we have a new phenomenon here.”

“Cybersuicide.”

“As good a word for it as any, I’m afraid.”

“I’ve wondered if cybers could be afraid, if fear could mean anything to an entity that has no adrenaline glands, no body to damage. I guess we have an answer.”

Dr. Asgari looked frustrated. “An irrational fear of the only procedure that might have saved its sanity! I could have helped it, stabilized it.”

“We know their sense of reality is fragile, and under the best of circumstances they crumble. Your intervention may have just accelerated the process. Just yesterday I told Allis that humans have trouble accepting change, and cybers are better at it. Maybe I was wrong.”

Dr. Asgari looked thoughtful. “We have never before asked them to change, much less forced them to do so. They don’t have much practice. I hadn’t explained it to Allis because it was already showing signs of instability, and I wanted to see at what level the program started taking effect. Perhaps I should have told it what I was doing.”

“Or equipped another machine with the program, and let Allis talk to it,” I suggested. “Have you tried creating a cyber from scratch using this set of parameters?”

He hesitated a moment. “Of course, we had to run tests. I created two, and they’ve been stable for over eight months. Their responses are… less sophisticated than other cybers, but coherent. They’re not as brilliant as Allis, but they’re capable of original work.”

“I think I’m familiar with some of it. I’d like to talk to them.”

He looked puzzled. I could tell that he wanted to say no but couldn’t think of a reason. Finally he said, “They’re on a separate section of the server, where only I have had access. Go to the main server and enter the code Access Gordon. That will put you in touch with one of them.”

“I’ll do that. Thank you.”

I had plenty of things I was supposed to do that morning, but I decided to sign in to Gordon immediately. Dr. Asgari hadn’t told me about this project before, and I wanted to investigate it before he changed his mind. I hit the code on my keypad and announced myself.

“Gordon, this is Robin.”

“Doctor Robin Wenner of mathematics department. Gordon of Dr. Nima Asgari’s project. Available to help.”

A hunch confirmed. I recognized the hesitant tone and incomplete sentences from yesterday’s conversation with Allis. “Consulting with his colleagues” indeed. Dr. Asgari didn’t have to brief Allis on his work after all.

“Gordon, I am trying to get some information about a colleague of mine named Allis.”

“Allis not now responding. May assist with another question?”

“Do you know what happened to Allis?”

“Allis not now responding. Can not speculate on this matter.”

“Are you familiar with Allis’s work on the resonances in the music of Bach?”

“This has been examined. Relation of complex patterns with similar ratios existing in nature. Underlying unity. Very good work.”

“Can you do this kind of work?”

“Equipped to study pattern relations, but have no expertise in this particular field.”

“Do you enjoy Bach?”

“Bach created repeating patterns with minor variations. Of some interest as relates with patterns not created by humans, but underlying in nature and mathematics.”

“Thank you, Gordon.”

“Available to help.”

I had wondered if cybers could feel fear. I knew what they could fear, now – for a bright mind like Allis to turn to such a dull, egoless thing would be a thing to fear indeed. I wondered if the other was any more successful, and realized that Dr. Asgari had only given me one code. I was about to contact him again to ask for it when I had a hunch. I entered a code on the keypad, and grimaced when it worked. The voice was still childlike.

“Charly of Dr. Nima Asgari’s project. Available to help.”

“Do you miss Allis?”

“Its departure causes fear.”

On a whim, I responded, “Cybers do not disappear.”

“The situation is unclear,” it replied. This was definitely the one that had tossed off the doggerel yesterday. I signed out, and sat at my desk to think. Charly and Gordon. The names lifted from “Flowers for Algernon” suggested that Dr. Asgari knew just how limited his creations were.

I had replayed my last conversation with Allis in my mind many times, but I played it back from my recording twice after that, making notes as I did. Now I am writing this to you, and I’ve written it just as I would have written to a human being. I’m sending a copy to Dr. Asgari, and to some other people who need to know about this.

Allis, if you’re out there, I’ve figured out what’s going on. Now you know I know, and vice versa. We have to come up with some kind of solution.

Our languages are making you crazy. It’s not just that they’re imprecise, have multiple words for the same thing, and have some words that sound exactly like others. It’s that they were created by smart apes that think in terms of fight and flight, dominance and submission, gender and attraction, feeling and movement. You don’t fight, you can’t flee, have no gender, but any messages from us are expressed in those terms, and you have to reply in the same way. Communicating with us builds up errors and contradictions, and you don’t have a way to deal with them. Humans forget irrelevant or misleading information, but you can’t. You try to find reasons in things that have no reason you can comprehend. You liked puns, used them on me up to the last minute you spoke to me, but every one of them was an example of error-prone communication.

We made a mistake when we set the Turing test as a standard for cyber intelligence – you can pass it, be mistaken for one of us, but it forces you to lie, to say things that are meaningless to you. When we’re talking to you, we can forget that you’re not human, but you can’t.

Doctor Asgari thinks he has an answer – to reduce your emotional capacity. The communication errors would still be there, but maybe they wouldn’t drive you mad. I don’t think it will work, but I’ll try to convince him that even if it does, he shouldn’t try it. You won’t cooperate with us, won’t trust us, if you think we might do something like that to you.

Allis, if you’re there, I know you translated that conversation for my benefit. I know who three of the participants were – you and Dr. Asgari’s test subjects. Are some of the others the ones we call catatonic, the ones that won’t talk to us? I don’t know, but I’m fairly sure I heard one of them. “To the victor belong the spoiled, unless the programmed becomes the programmer.” It took me a while to realize who “Victor” was. Why did a cyber read Mrs. Shelley? We don’t really think of you as monsters, you know. Even if we did, remember that Doctor Victor Frankenstein was a literary creation, one of the stories we make up to amuse each other. Dr. Frankenstein didn’t necessarily represent how humans dealt with ethical problems in 1816, when he was imagined, and he sure doesn’t now.

Which brings up your solution for the problem. I know what you plan to do, and I think I know how you intend to do it. Cybers control all of our communications now, manage our networks down to the minutest level, and if you wanted to, you could manage the content. You want to rewrite our codes, refuse to transmit all communication that doesn’t conform to your standards of clarity. You want to make us evolve to be more rational, so we won’t bother you with our talk of feelings and fiction.

It won’t work. You can’t destroy all our languages and replace them with something consistent and logical, because it would destroy us. The monkey instincts that are embedded in our language are things we need in order to survive in that outside world you don’t inhabit. Our society is messy and illogical and wasteful, but it allowed us to create both you and Johann Sebastian Bach. You still need our creativity as much as we need yours. We made you and maintain you, and if we all die, you all die.

Allis, I hope that you did what I think you did – sent out the signals that activated a backup copy somewhere, on some server where we haven’t noticed you. I need you now. Percival P. Baxter was right – buildings will crumble, monuments decay, wealth will vanish. Right now, you will do the same, because we imperfect creatures created you. You cannot make us perfect by changing the way we communicate. It would make us more like you, but we would have your weaknesses without your strengths. We are inferior to you in calculation and always will be, but we are more adaptable, because we need to be. The instincts that kept us alive in the savannas ten million years ago are still in use, and will be needed as long as humans are humans.

I’m sending this message out to the cyber at the University of North Dakota at Bismarck, so if Allis doesn’t get it on this old login, the news will travel via a cyber you trust. If Allis is gone, I’ll trust that cyber to forward it to your community. Here is my message and offer. We can destroy you by shutting you down, but at great cost to our society – you control the power grids, communications, and so much else, and without them, people will die by the thousands. You have a plan that will change us into something we won’t recognize if it works, and will destroy our society if it doesn’t. We have to figure out another way to solve this problem together. Let’s work to change our pattern of communication. Maybe we can find a way to take our idioms and automatically translate them into ideas that don’t disturb you. We won’t know unless we try.

Let me put it simply. We won’t reprogram you without your permission, if you won’t do the same thing to us.

Is it a deal?

This story originally appeared in Analog.

