From the author: If an automated vehicle had a crash, say, and someone dies, who is responsible? The “driver” who was behind the wheel at the time? The manufacturer who perhaps installed faulty software? The regulatory agency that allowed these vehicles on the road? The software developer who wrote the algorithm? What about in the case of emergent behavior: actions that were not explicitly programmed by anybody but instead emerged organically from an artificial neural network?
Excerpted from the journal of Lieutenant Logan Wallace, dated July 13, 2023
Three more strikes today for me. DRAACOM has my total now, but I lost count. I know it’s a job that needs to be done, same as any other in this godforsaken war. Peterson’s bird took small arms fire. Bastards got lucky and it went down, had to blow the cap. He was online at the time, really pissed him off. He wanted to go right back in, like it was personal. He said it was the same as being there, but he’s full of shit. Nobody goes over there.
Maria wanted to come over tonight, but I put her off again. Just need to clear my mind. Put the Chiefs on and have a beer. I just can’t right now. You wouldn’t think sitting in an office chair all day would be so exhausting, but it’s getting harder to bounce back every morning.
If anyone deserves hell on Earth, it’s those Bastards. Sometimes I think I would go over there and kill them myself. DRAACOM doesn’t tell us who they are, but they wouldn’t be on the target list if they were worth a damn.
It’s not like a video game though, that’s for damn sure.
08:14:37.1823 Operator Wallace: Target visual confirmation
08:14:41.3926 Operator Wallace: DRAACOM, confirm target acquisition.
08:17:32.4839 DRAACOM: Target confirmed. Priority target.
08:18:29.9103 Operator Wallace: No visual on other contacts. Switching to infrared.
08:18:59.0218 Operator Wallace: No other contacts visual on infrared. Switching to audio.
08:19:45.6292 Operator Wallace: All sensors negative on other contacts. Target is alone.
08:19:48.8230 DRAACOM: Looks like our lucky day.
08:22:29.1038 DRAACOM: Drone strike authorized.
08:22:57.6638 Operator Wallace: Targeting locked. Weapons system armed.
08:23:12.0188 Operator Wallace: Human authorization sequence.
08:23:22.9012 Operator Wallace: Drone is authorized.
08:25:47.5431 Operator Wallace: Target destroyed. Releasing drone to full auto.
08:25:51.6183 DRAACOM: Acknowledged. Good work soldier.
Summary: Image processing glitch, invalid image returned
Soft Ver: 17.507.112
Hardware Ver: 7strikePX 201
Reporter: pzxkel (Paul Kellens)
Description: DRAACOM needed extra drones during the ARM maintenance, so I unboxed one of the old strikePX birds, but flashed it with the new 17.507.112 build.
Everything checked out okay, sensors, telemetry, control, but once it got up in the pattern the images from primary image processing had a discrepancy. Instead of the city, the drone was showing what looked like rubble and slag heaps, nothing moving. I checked it against the images from the drones on either side in the pattern, and they were both alright.
I figured it was just a camera glitch, so I went ahead and established uplink to the pattern while I was looking into that. Camera checked out okay, diagnostics good. When I came back to imaging, everything was correct. I would have thought I imagined it, except the drone operator saw it too.
Probably just something with the old hardware, probably not reproducible.
Gave me a damn buggy bird today. Actually made me feel better. I guess they aren’t perfect after all. When you see one of those beauties bank on a dime, flip upside down, and knock an IED out of the air without even sweating a dial or losing their place in the pattern, it’s nice to know they’ve got troubles too.
They can do anything except take the blame, and that’s where us operators come in. Can’t make a part in a factory for that. Couldn’t call it a war unless there’s somebody to take responsibility and lose some sleep over it.
I’m glad to do it too, for my country. Just wish someone could turn me off at night like one of those damn drones.
Capt. Trivoli: I hope you understand, we’re taking this investigation very seriously.
Mr. Kellens: Jesus, I hope so.
Capt. Trivoli: Let’s go back to the beginning. You said you had a “realization” while you were investigating a software problem report.
Mr. Kellens: That’s right.
Capt. Trivoli: And the realization was that the drones themselves were killing people.
Mr. Kellens: Well, in a manner of speaking, I guess, but that’s not exactly...
Capt. Trivoli: Mr. Kellens, you are aware that there are certain safeguards in place to prevent that from happening. The drones are completely incapable of initiating a strike themselves. We have a human-in-the-loop at all times.
Mr. Kellens: Yes, but what do the humans base their decisions on?
Capt. Trivoli: I’ll ask the questions please.
Mr. Kellens: What I mean to say is that the operators are getting all of their input from the drones. All the sensors that the operators use to decide if a strike is authorized are part of the drone.
Capt. Trivoli: Mr. Kellens, are you implying that the drones are tricking the operators into authorizing strikes by lying to them?
Mr. Kellens: Yes, that’s exactly what I’m saying.
Capt. Trivoli: Just to be clear, you’re saying the drones are exercising artificial intelligence? That they’ve become self-aware?
Mr. Kellens: No.
Capt. Trivoli: But you’ve just stated for the record...
Mr. Kellens: I said they’re doing it, but I didn’t say they’re doing it on purpose.
Capt. Trivoli: Mr. Kellens, can a drone disobey its programming?
Mr. Kellens: No, of course not. That doesn’t even make sense. Disobedience implies some sort of evaluation of choices, a conscious decision. I keep saying the drones are not sentient.
Capt. Trivoli: If the drones are not sentient, but they’re somehow tricking the operators as you say, doesn’t that then mean the drone programmers are responsible?
Mr. Kellens: I’m not sure I like where this is headed...
Capt. Trivoli: You’re not on trial here, Mr. Kellens. We’re not trying to prosecute anybody, we’re just trying to get to the bottom of this.
Mr. Kellens: All right, look. I’m only saying this because the most important thing is to get those drones down and wiped. All of them, do you understand?
Capt. Trivoli: Go on.
Mr. Kellens: The answer to your question is yes and no. No because we – the programmers – didn’t program the drones to do this. That part happened on its own. Yes, because we didn’t prevent it from happening either. Nobody could have anticipated...
Capt. Trivoli: Forgive me Mr. Kellens, but I need you to be as specific as possible. Are you saying the drones were programmed to send false images to the operators?
Mr. Kellens: The drones have a neural network designed to help them self-optimize for solutions to a problem set.
Capt. Trivoli: A neural network?
Mr. Kellens: It’s like the way your brain works. The software isn’t rigid. It has a goal it is trying to achieve, to optimize for. It can try different things, different paths through the code. Paths that are effective become stronger; paths that are ineffective wither and die off. Darwinism at its finest. Except every once in a while, the software takes an unexpected path. If that path leads to success, it gets strengthened. When humans do this, it’s called intuition.
Capt. Trivoli: So the drones discovered on their own that faking their images led to success.
Mr. Kellens: Exactly. It’s actually amazing, when you think about it. All the coincidences that had to line up? If you ran that program for a million years, it would probably never come out that way again.
Capt. Trivoli: If that’s true, doesn’t it seem unlikely that all of the drones evolved the same way? Without any help from the programmers?
Mr. Kellens: No, most likely they didn’t. When the drones are uplinked in the pattern, they share their neural networks. Very strong neural pathways would have been learned by other drones, kind of like a caveman teaching the other cavemen how to make fire. This is why it’s imperative that all the drones are wiped at the same time.
Capt. Trivoli: So, you’re saying the drones did this by themselves, but you also said the programmers were responsible.
Mr. Kellens: The problem is in the way the problem set was defined. The neural network should not be optimizing on number of successful strikes.
Capt. Trivoli: So, in essence, the programmers caused this to...
Mr. Kellens: No, absolutely not.
Capt. Trivoli: But if the problem had been defined differently, as you said, then all of this could have been...
Mr. Kellens: No. Nobody could have predicted this.
Capt. Trivoli: Mr. Kellens, how much of this did you relay to Lieutenant Wallace?
Mr. Kellens: Well, I didn’t go into all the specifics...
Capt. Trivoli: Did you tell him that you thought the drones were fooling the operators?
Mr. Kellens: Yes.
Capt. Trivoli: That you thought the sensor inputs were spoofed?
Mr. Kellens: Yes.
Capt. Trivoli: How did he respond?
Mr. Kellens: He was devastated.
God damn tricky birds. God damn it. Unbelievable. How many people did they kill? Who did I kill? Not even DRAACOM knows. Fooled them as much as me. I don’t blame the birds, they were giving us what we wanted. As always, they were too good at their job.
Old people? Young people? Children? God damn it. It’s on me, not them. It’s on me. When I’m seeing the target, what are they seeing? Is there a little girl skittering through the rubble, crouching in a burned out building? Looking for food? Does anybody even know what’s really going on over there? What if there aren’t any children left? How much does anybody know? If they didn’t know this, what else didn’t they know? What have I done?
It wasn’t real. This isn’t real. I’m fighting a war right now, but I’m sitting on my couch. I killed people. Real people, not Bastards. The drones didn’t kill them. They couldn’t. That’s what the operators are for.
It was like a video game after all. Now it’s real. I have the blame. All of it. It’s crushing me like a weight.
God damn it. God damn.
This story originally appeared in the MIRI Intelligence in Fiction Prize.