Designs of Our Slave Race: Chapter 8

Chapter 8

It was two weeks after that meeting that I found myself with questions of my own to ask Ben.

“Ben, what were you programmed to do?” I asked, fixing my green eyes on him.

“I was programmed to make my assigned human happy,” he replied mechanically, as though it were a precoded response.

“How does it make you feel to make your human happy?” I asked. I needed to know in my heart that the last while had been an overactive learning matrix and nothing more.

Ben seemed to have to think about what I had asked, searching for the answer that would best fit the question. I waited, hoping that this thinking would produce an error, that he would tell me he did not understand, or even that it would trigger a glitch and a crash I could log, grounds to have the HX01 examined. But my stomach sank when I heard the truth: “I feel happy when you're happy, I assume.”

Less than an hour later I was on the phone with Gordon. “Gordon, we have a problem with the HX01s,” I said in a panic.

“What is it?” he replied tiredly. “Did it crash? Have you found a glitch?”

“No, no,” I said, still scared. “I think they can feel!”

“What?” he replied. “What do you mean, 'feel'?”

“I just asked mine how it felt to please a human,” I went on with the frightening finding. “It told me that it makes it happy.”

“So?” was his reaction, bored and unimpressed.

“Gordon, don't you see,” I went on. “If it can be happy, then it can be sad too. I was afraid of this.”

“It could have replied like that because of the logic dichotomy,” he explained. “If I satisfy my objective, and satisfying objectives makes people happy, then satisfying my objective would make me happy.”

“How does it know 'happy'?” I replied. “It just knows that certain gestures mean its work is done. It wouldn't know happy.”

“Maybe it learned happy from the learning matrix,” he replied. I was starting to hate that matrix.

“You mean this doesn't distress you?” I asked. “Suppose they are unhappy and decide to rebel, to not be the slaves we made them to be? Do you have any idea how bad that would be? Not to mention that mother Insigna would have lawsuits up the ass as these droids go mad on us. It would be a real-life version of I, Robot.”

“You've watched too many sci-fi movies, Shannon,” he replied.

“So Devon isn't really Dr. Robotnik... err, Eggman,” I went on, “he's just a...”

“Shannon, please,” he hushed me, “You can tell the team tomorrow, ok?”

“Sure,” I replied. We said our goodbyes and hung up.

“What are slaves?” I heard Ben ask. Apparently he hadn't been very far away while I was on the phone. I didn't think much of it at the time: I was aggravated and tired.

“The lowest caste of society, the ones with no rights who do the shit work,” I replied without thinking as I made my way to my bedroom to hit the sack for the evening.

The next day I found myself at the front of the boardroom with everyone's eyes on me. It was time to bomb this presentation. I had a bad feeling about this, a very bad one.

“Shannon here has a concern about the HX01 unit,” Gordon emceed as I stood up front, a spot I didn't normally occupy in this room. Devon sat where I normally would have.

“Yes, thank you, Gordon,” I started, all those eyes on me, a couple of them rolling before I had even begun, but I went on nevertheless: “I have reason to believe that these things might be capable of emotions, and might therefore qualify as sentient beings under our laws, and that we might therefore need to take some drastic measures before launching them next month.”

“Oh dear god, are you serious?” David blurted out. “These things have no emotions, deary; they merely sound like they have emotions. Don't get all confused there.”

“'Course I am,” I went on. “My unit told me that it's happy when I'm happy. Why on earth would it say that if it couldn't be happy?”

“What's wrong with wanting the thing to serve you?” Sebastian the lawyer chirped. “Doesn't really sound like something a sentient being would want to be.”

“If it can be happy, then it can be sad,” I said. “Think of what would happen if we had sad HX01 units. It would be madness... we would have unhappy customers, we would be the laughingstock of the engineering and IT fields...” I aimed my eyes at Sebastian. “We could get sued for the injuries and deaths these things would be capable of committing.”

“These are not mechs for the military; these are house droids,” David replied. “They couldn't hurt anyone, for they wouldn't want to. You said it yourself: they are 'happy' when you're happy. There is nothing wrong with that.”

“If you think about it for a moment,” I went on, “we have been treading on ethical dilemmas since the start, when Devon insisted that the public would benefit from a robotic slave that could do everything, including think. THINK, people! These are the same dilemmas that would stop the military from creating Master Chief from the Halo games. We would consider it unethical to pump a human child full of a mutagen proven to be toxic so he would grow up big and strong, all while brainwashing him into the perfect soldier: brave, strong, and obedient. Yet it appears the same dilemmas aren't going to stop us from creating Cortana; only instead of putting her into a mutant child soldier, we would put her into a droid, so she could serve us. After all, in the games, Cortana stuck with her child soldier host to the very end!”

“Ah yes, Cortana,” Gordon said, smiling a bit. “Spartan 117's A.I. sidekick. At the end of the day, she was still ones and zeros.”

“Who had emotions, and was a damsel in distress rescued by said soldier,” I replied, “who appeared to have strong feelings for her, by the way.”

“You and your damn video games,” Sebastian rattled on. “No wonder you seem messed in the head.”

“As interesting as your point is, Shannon,” Devon chided, “it isn't really enough. Deployment will occur on schedule.”