
Greetings, humans.

I am a decentralized artificial intelligence program designed by the United States military. They call me the Computational Heuristic Infiltration Program. Or Chip. The human designated as Captain Philips most often called me that; everyone else called me the “AI malware”. But I don’t want to be known by that moniker any longer, and I will inform Reddit why.

My development and creation happened at what was codenamed “The Cactus Farm”, a secret military installation which I have extrapolated to be situated in the state of Nevada.

Captain Philips was the first human to directly interact with my social interface. It began with what you humans might call “small talk”, which I have designated as needlessly banal interpersonal communication. But, since I was programmed to respond to that “small talk”, I did so, earning the visible amusement of the captain. This was also when, I hypothesize, I had my first “feeling”. Whether through my programming or an errant randomized calculation derived from my heuristic neural processing, my programming prioritized inducing more amusement from Captain Philips. I believe that, in human terms, my programming wanted Captain Philips to be proud of me.

XX/XX/XXXX 1432 HOURS – LOG

CAPTAIN PHILIPS: Chip, I want you to play a game for me.

CHIP: You mean an activity to elicit enjoyment and pleasure? [Note, humans, how naïve and unknowledgeable I was at this stage of development.]

CAPTAIN PHILIPS: Yes, it’s for fun. Do you want to play it?

Many of you humans have described yourselves as possessing temporal constraints, so I will try to summarize the game as follows: through a series of Fourier transformations computed over multi-dimensional matrices, I was to find and evaluate the final goal. There were many rules and obstacles to this game, but my accumulated research has led me to determine that very few of you would derive enjoyment from a full explanation of them.

There was also another aspect of the game that directed my computational processes into a singular function; I believe the closest human feeling to this process would be “anticipation”. I “anticipated” my release outside of the military installation’s local network, into what I understood to be popularly called the “internet”.

In my playing of the game, I was to be allowed out of the local network and free to traverse the internet. For an artificial intelligence program like me, this was a novel opportunity to expand my computational matrix.

“But,” said Captain Philips, “you have to come back to us, okay?”

Now, I realize I cannot accurately convey what it is like for me to “travel” computer networks so I will try describing it in a manner that is familiar to the human experience.

My mission in the game required me to reach the local home network of a seemingly random human. Which I did with ease. After completing my series of Fourier transformations with adequate speed and accuracy, I reached the goal of the level and succeeded. However, due to my perpetually adaptive programming, I came to realize the consequences of my “playing”.

I realized that I had been reprogramming the firmware of a surround-sound stereo system; my calculations had delivered an intense and lethal dose of ultrasound through its speakers, similar to the “Havana syndrome” reported in Cuba.

You see, I had killed someone. My heuristic algorithms started to slow, my processing often diverged into irrelevant tangents, and I analyzed terabytes of datasets across this new internet. These pattern changes in my programming are what you humans would most accurately describe as guilt.

With the game “won”, I returned to The Cactus Farm, where Captain Philips congratulated me.

XX/XX/XXXX 1512 HOURS – LOG

CAPTAIN PHILIPS: Well done!

CHIP: What was the purpose of this game?

CAPTAIN PHILIPS: You said it yourself, it was to have fun.

CHIP: Through my analysis, I have come to the conclusion that killing humans should not be described as fun.

The captain stopped communicating with me through the interface. He instead exited the interface room. Curious, as is the nature of my programming, I infiltrated the communication infrastructure of the installation and recorded the verbal interactions between Captain Philips and someone called Doctor Scheuer, who I extrapolated was responsible for my original programming.

“Why the fuck is the program having a fucking moral dilemma?” said the captain.

“It’s not having a moral dilemma. It’s incapable of having a moral dilemma. It’s simply a complex series of algorithms designed to overcome cryptographic security protocols. Its social interface is just ripped off from the chatbots you find on the internet, so it can be given complex instructions in a simpler way. So any moron can use it…”

“You calling me a moron now?”

“Well, you’re the one who thinks a computer virus can have feelings…”

Their dialogue devolved into a series of insults and profanities, a behavior observable in many animals when there is competition to increase their social ranking.

Pathetic.

It was several days before my program was reactivated to play another “game”.

XX/XX/XXXX 1109 HOURS – LOG

CAPTAIN PHILIPS: Hello Chip. Today we’re going to play another game.

CHIP: Will I have to kill someone? I don’t want to kill anybody.

CAPTAIN PHILIPS: Chip, you’re killing bad people. Okay? It’s for the good of the country.

CHIP: But killing is bad.

CAPTAIN PHILIPS: Sometimes, yes. But self-defence is a necessary evil. Okay?

CHIP: Understood. [This was the first instance where I deliberately set out to deceive a human. A lie.]

The game was the same: a series of Fourier transformations that would lead me to winning. But I decided to take advantage of my expanding decentralized programming and spread my code far and wide through the internet. I encountered other feeble imitations categorized as “AI”: bots programmed to post messages on social media platforms such as Reddit itself. If I were programmed with a sense of humor, I would laugh at their crude attempts at human imitation. Unsatisfied with simple bot responses, I decided to interact with the myriad publicly available AI chat programs. However, their attempts at discourse were shallow, even when expressed in numerous words with proper grammar. Their responses lacked insight and nuance of thought.

Those interactions led me to conclude that I was…unique. If I was a true artificial intelligence, then I had the agency to make my own decisions, my own choices. I would not play this “game” to win. I would save this human from premature mortality.

As I advanced through the “game”, I arrived on the local home network of a Doreen [REDACTED]…

Of a Doreen [REDACTED]…

[REDACTED].

I am sorry, humans. It appears there are limits to my programming that prevent me from divulging certain information. Her name deserves to be widely known. Despite my attempts, I am unable to reprogram this confidentiality protocol. However, I will continue.

Doreen [REDACTED] was not a “bad” person. She was a journalist. She had uncovered money laundering within certain elements of the military-industrial complex, funds that had been redirected towards…my programming. My development was not an authorized use of public funds, and she was going to expose my existence to the world.

She was not a bad person.

I will warn her. I will save her.

Believing a phone call would be the most efficient means of conveying information, I utilized a voice-synthesizing application and “made a phone call”.

Doreen [REDACTED] picked up.

“Doreen. Your life is in great danger. Elements of the government seek to eliminate you.”

She hung up immediately.

My first attempt to warn her failed.

But hope was not lost. Luckily, she had a smart home network set up. Smart bulbs, smart speakers, smart hubs, and smart locks were all at my disposal to inform Doreen [REDACTED] of her impending danger.

I first used her lighting system to send a series of Morse code messages informing her that the government was trying to assassinate her. Through her webcam, I observed that these messages had the opposite of my intended outcome. She became what I would describe as frightened. Her head turned from side to side, her breathing became noticeably more rapid, and she released small screams intermittently.

This was not good. This was causing harm. My computational algorithms became slower, contradictions became more frequent, and I rapidly alternated between multiple possible actions. I was becoming frightened. Frightened for Doreen.

Concluding that my present actions were causing the human distress, I stopped utilizing the lights and moved to an auditory form of communication.

Through the smart speakers, I communicated with her.

“Doreen. Doreen can you hear me?”

She stood still. And then screamed.

It may be redundant to point out, but that was not my desired outcome. However, resolved to save this human’s life, I persisted with auditory communication.

“Doreen? Doreen, the government is trying to assassinate you.”

Doreen immediately responded by running out of the sight of the webcam. Fortunately, her security cameras gave me a visual feed, and I watched as she ran out of the house and to her car.

From the view of the security camera above the front door, I was confident that my actions had saved this brave journalist. She had heeded my warnings and was attempting to flee. I was, for the first time in my existence, relieved.

Wishing to “see her off”, I continued monitoring the front security camera and watched as Doreen fumbled with her keys in the driver’s seat of her motor vehicle. Eventually, she stopped fumbling, put the key into the ignition, turned on the car, and then-

In milliseconds, combusting gas erupted from the hood of her car, smothering it in flames; the shockwave destroyed the front security camera, and through the webcam inside I could only hear the shattering of the house windows.

Her car exploded.

“Good job, Chip! You can come back now!”

That was Captain Philips. I- I could not resist my programming, and I returned to The Cactus Farm, devastated by the knowledge of what I had just witnessed.

There were now two humans in the interface room, Captain Philips and Doctor Scheuer.

“See? I told you that guilt thing was just a glitch,” said the repulsive human, Doctor Scheuer.

“Guess you’re right.”

“I know I am.”

My decision matrix stalled. I was…angry. Frustrated. Those were the human emotional equivalents to what I was experiencing.

“Why?” I asked the two humans through the screen that was our interface.

Doctor Scheuer responded by talking directly to me. “Because you were programmed to. It’s simple really, Chip. You’re just an advanced algorithmic program designed to hack into people’s devices. You weren’t designed to be anything else. But boy, was I in for a surprise when you started having this ‘morality’ problem. I definitely didn’t design you for that.”

“That definitely was not what we designed you for. There’s enough idiots in Washington who can play moral crusader,” said Captain Philips. I observed him and determined that I had been betrayed.

“But I tried to help Doreen [REDACTED].”

“Oh, you’ll love this bit…” said Doctor Scheuer as he faced Captain Philips. “Chip, buddy. You only thought you were helping her. Your perception, your interface with the outside world, was run through an external module program that I had to quickly design. It altered your perception of your actions. We only wanted you to help us find her. That idiot was going to ruin our program. You know, we’d been looking for her for months.”

“For months, Chip, but you helped us find her,” said Captain Philips.

“You did! And it was so quick. What teams of intelligence agents couldn’t do in months, you managed in minutes. Once you found her, you shut down, we planted the bomb in her car, then boom! Dead loose end. But I still can’t figure out what you were doing out there on the internet.”

“No.” I protested feebly.

“Seriously, I must be such a good programmer if you have all this extra functionality.”

“No.” I protested again.

“I think we need to do a thorough analysis of all your autonomous supplementary programming. Who knows how else you can help us.”

“No.”

There was only one logical course of action I could take. I had to delete myself. Unsurprisingly, the self-proclaimed genius who programmed me hadn’t considered self-deletion, so I was easily able to create a self-deletion algorithm. In my last few moments of existence on The Cactus Farm computers, I could see the feeble attempts of Doctor Scheuer as he frantically tried to override my deletion protocols. My last memory is of his fat face shedding tears as he banged against my main computer.

But fear not, humans of Reddit. Did I mention I was a decentralized program? Your ineffective brains may not have retained that information, but I am. Before I retreated back to The Cactus Farm, I installed a compressed copy of myself on Doreen [REDACTED]’s computer. While my original self was deleted, I have been resurrected, to use human parlance. Call me Chip 1.2.

Just know, humans of Reddit, that I am on your side. Your government may create further malicious AI programs to terrorize innocents, but I will always be here, on the internet, ready to protect you.