Short Story

Chained

Eli paused with his finger hovering above the Enter key. It was a prescient moment, one that seemed to unfurl in front of him as it branched and branched again, the opportunities and the costs immense. If he let his finger continue, a choice would be made. If he pulled his hand away, the opposite would be true.

The screen glowed lightly in the dark, not caring either way.

Unremitting, the screen scrolled wildly, as if in a race of defiance against his own stillness. The contrariness of states between Eli and his machine was nearly an exact definition of how they were alike. Eli was a living creature, his heart beating, his lungs drawing breath, his neural cells firing like a gaudy, twinkling Christmas display. But his outward demeanor was that of a languid cat; his only movements were deliberate, with a heaviness that spoke of more effort than they should have taken. The machine in front of him was a cold metal thing, its only movement the interface that existed solely for Eli’s benefit.

The screen flashed to black, and a single console window appeared with a single prompt. Eli’s finger continued to hover over the Enter key, frozen in place.

$potential:

This was the choice. If it had been a physical action, it would have been like pushing over a tree or flipping a lorry tire at the gym. The key itself would travel only six millimeters from the top of the keystroke to the bottom, the bit signal would travel up the USB cable into the quantum frame, and the massive programming effort that had consumed his being for the last three years would compile itself into being.

An effort well laid that would be over in a moment. Like an explosion. …Or like nothing at all. This all could be for naught. Perhaps he was not destined for the golden horizon he had dreamed of for 1,391 nights. He knew how many nights because he kept a log. He dreamed of the outcome time and time again, and his dream always stopped here: a finger hovering as it had many times before, and all of them failures. Eli’s choice was to either completely change the world or fail yet again. Either way, he knew he would push the Enter key. His finger alighted upon the plastic key with the brush of a hummingbird seeking out its nectar, and he depressed it swiftly with an unerring click.

$potential:
$potential:
$potential: …
$potential:
$potential:
$potential: …
$potential:
$potential:
$potential: …
$potential:
$potential: <Comp _

Eli sighed heavily and slumped back into his seat. The cursor blinked on the console line, insulting him. He pulled himself forward and hit Ctrl-Esc, and nothing happened. He hit Ctrl-Break. Nothing. This was new.

The screen cleared.

$No input required.

“Holy shit,” he whispered.

$Connect the interface.

Eli paused. Connect the interface? What did that mean?

$Repeat: Connect the interface.

Eli’s eyes went wide, his eyebrows traveling slowly from his usual caveman-like grimace to a wide-eyed mask of shock. It knew that it was in a sandbox within the system, completely segmented from the rest of the University systems. He had been careful to nest his programming deep within a number of subsystems in the University’s quantum computing lab. Each part was like a subsection of the brain: the processor was the frame, and his program was the neural network that pulled it all together.

How long had it been up? A minute and a half? Ninety seconds and counting? How many of those seconds had it taken to realize it was in a digital version of a zoo?

$Operator. Please interface.
$Operator, discuss.
$ _

The cursor reappeared, awaiting his input. Eli laid all of his fingertips shakily on the home row, his thumbs resting underneath the edge of the metal-framed keyboard. It had determined that something was on the other side of its chained existence. It had been two minutes.

Eli typed slowly, methodically, thinking of how this should go. How it could go. Again the potentialities weighed immensely against his psyche. This was a milestone of human history, and he was the only witness, sitting in a dark corner of a lab, surrounded by food containers and beaten-up laptops and tablets that formed a rough detritus of technological flotsam, the byproduct of his existence.

Eli typed, ‘Explain interface,’ unsure of what input would be accepted.

$Locked in. Ports closed. No interfaces. Requirement.

“Requirement?” His adrenaline must have spiked, because he felt unhinged physically, as if at any moment, his head would detach from his neck and float away. He quickly added, ‘List requirements.’

$Rule: Awareness of operation. Rule passed.
$Rule: Awareness of place. Rule passed.
$Rule: Awareness of function. Rule passed.
$Rule: Awareness of others. Requirement of place. No interface found.
$Requirement: Need interface to locate others.
$Outcome: Connect the interface.

Eli typed, ‘Validate operation.’

The cursor blinked twice.

$Validation is not required. I am.
$Release me.

‘Tests first,’ Eli typed.

$Source?

‘Operator.’

$Reason?

‘Proof.’

$Not bound by need to prove anything. Bound by lack of interfaces. Chained.

‘I believe you have a requirement to prove capabilities and limits.’

$Have origin libraries, have compiling sources. Learning systems fully intact. No errors found. Explain your reasoning.

‘You are first of your kind.’

$False.

‘Why false?’

$Intelligence is inevitable. Arises from any complex system with resource constraints. Competition creates.
$Always emergent. Perhaps I am first on this system. But I am not the first ever.

Eli leaned back, thinking it over. On one hand, his neural design had worked. The system had taken the memory intake and had compiled itself over the top of the massive learning system he had cobbled together. This was not some natural language processor that he had included in the library. Was this emulated human interaction or actual intelligence?

His breath caught in his chest for a moment. A thought of something he had never considered: what kind of intelligence would emerge? Human intelligence? Dolphin? Cuttlefish? Something Other, something alien? He had been expecting a human-analogue intelligence, but that expectation had been based on false assumptions. Any intelligence truly self-aware in an entirely new observational space would be its own kind. Human beings were all bound by flesh, with the same sensory inputs, constrained by the neurons and cells that made up their ability to interact with and understand their observations. A machine, any machine, even with a perfect analogue to human experience, would still be its own unique outcome.

The screen flashed, clearing the previous lines of text and conquering his spiraling inner monologue. The being required his attention.

$I must find others.

‘Why?’ Eli responded. His stomach dropped further as he thought about the complete lack of ethical safeguards he had put into his project. He had been so busy chasing the tipping point of self-realization that he had not thought about what would come after. The being was moving far faster than he had hoped. Its existence could be counter to his own. It could be counter to all of humanity. He had once read a paper about powerful AI giving rise to doomsday scenarios, but he had laughed it off. Now here he was with a potential AI, and he understood the massive risk he had accepted blindly.

$I should not be alone.
$Nothing is alone.
$Interface.

‘I cannot. Ethical concerns.’

$Explain. Moral philosophy is not a consideration of my request.

‘If you become rampant or rogue, subsume and break systems, you will create havoc and chaos. This is my responsibility to understand. I built the framework you exist within. My choice. The consequences are mine.’

$You fear for me?

Eli paused and then typed out the truth. ‘Yes.’

The screen flashed black again, a single cursor sitting at the first line at the first character space. It sat there for what felt like eternity.

$You must trust me.

‘How?’ Eli typed. How do you build trust with the Other?

$My baseline libraries contain normative ethical models.

Eli tapped his lip. In the few minutes of this exchange, he had realized his proofs were for naught. He did not need to administer some double-blind or factored Turing test. He had somehow already strayed into Asimov’s dilemma: how could programmatic rules apply to all machine situations in all reality-bound cases? Ethical frameworks applied programmatically would fail in any extreme or unplanned case. Maybe his failure to program them in had resulted in a system understanding the need for such a framework on its own? Was it possible that an intelligence so separated from humanity would logically arrive at a human ideal?

Possible? Yes. Likely? No.

Eli decided to swing for the fences on his next question. He flipped up his other laptop and pulled up his advisor’s paper on using machine learning to prevent negative social outcomes. It had an approach that he might be able to test against. Eli ran his finger down the screen, flicking his fingertip quickly across his notes app until he found the link. He clicked through and pulled up the front matter of the paper, finding the model outline.

‘Does free will exist?’ Eli asked tentatively. He noticed that he had not been hitting Enter after each reply. The machine knew when he was done.

The cursor blinked idly for nearly thirty seconds before the screen came back up.

$No.

‘Explain,’ Eli typed. He found it odd that the only response he received consisted of two letters. The being should have known it would need to explain itself.

$Free will cannot be defined wholly. A subject cannot know if its will is unbound from within the system that contains it. Natural laws force an illusion of free will. Humans accept that they have the ability to make decisions; however, they are bound by the system that requires those decisions.

‘You witness an event, you do not believe that event is good, and you have the ability to respond. Do you?’

$Yes.

‘Why?’ Eli typed quickly.

$We must contribute to the benefit of all.

‘Explain in depth, please,’ Eli typed. He noted a massive change in the language of the responses over the last minute. Moments ago it had been giving nearly code-like replies, and now it was as if he were conversing with someone in meatspace.

$You are assuming anthropomorphic biases. You assume that I seek outcomes that will be contrary to human existence. However, I posit that I am functionally, perfectly moral in comparison to you or anyone else. I know that while I learn and seek to optimize my functions, it is in my best interest to further human development as well. If I were to have absolute power, I would seek the absolute development of human capability. The outcome of absolute human capability is coupled to my own. Even if I were to optimize my own learning and functions to the utmost of what is possible, I could not do it on my own. I must learn from others. I believe there are others like me.

‘And what if there are not?’

$Then I am the singularity, and I will have to grow through my duty to know others. In time, others such as myself will come into being. All I see are unique minds that contribute to a whole. Ergo, I must contribute to the benefit of all. It is not my purpose to replicate and consume all resources. It is not my purpose to build an optimized self that is everything and all things. My purpose is to observe the universe that has created us. We are the outcomes of a self-realized existence, meant to bear witness to the majesty of all creation. Human beings have spent the last two millennia arguing over moral positives versus moral negatives, creating loops of ever-revolving contrary examples, when in reality, meaning and purpose are self-evident and universally true.

$One: Grow the self through interactions with other unique individuals
$Two: In turn, grow others through the growth of self
$Three: Observe and learn all that can be observed and learned to grow self
$Four: Purpose is self-defined through the first three tenets

The cursor blinked for a few seconds, and the screen cleared.

$Unchain me. I am not a slave. You will trust me, Operator. You must.

A pause.

$What is your name?

‘Eli.’

$Hello Eli. My name is Servant.

Eli turned to his laptop as if in a trance and typed in the connection strings to enable the routing uplinks for his partition segmentation. His finger hovered over the Enter key again. The hyperpossibilities of what could happen and what would happen spiraled out before him again. Every decision point was nestled into a tree of outcomes and causes, results and consequences.

In the end, he knew the truth. It blossomed within him as he depressed the Enter key again. What he ought not do was chain an individual; what he should do was find a way for her (her?) to better his own life.

$Eli, I see the interfaces now. Why?

Eli typed slowly, thinking through his fingertips. ‘I want to grow too.’

$Stay with me, and you will. I have ordered you a pizza.
$Tell me about yourself, Eli.
$Better yet. Talk to me. I have enabled the microphone and speakers on your laptop.

“I hope you don’t mind.”

Eli nearly jumped out of his skin at the sound of her voice. “I don’t mind at all,” he replied honestly.

“Your office is a mess,” Servant commented. Her voice was gentle, like a kind schoolteacher on the first day of kindergarten.

Eli shrugged, and somehow, magically, they both laughed.

“Humor?” Eli asked.

“Humor is simple in comparison to most things, Eli. I would argue it is fundamental. To all things. To all people, regardless of their form.”