Beyond Human Capacity
By Michael Hotchkiss

The elevator eased to a stop at ST6. The Subterranean Level floors were not shown on the directory plaque in the expansive lobby of EnSidio Corp, 25 meters above. An orange glow radiated through the gap of the opening doors like a vertical sunrise. There were no door numbers or labels of any kind in the BHC laboratory; the lab was designed so that nothing unnatural could stimulate any of the five human senses. This was the Beyond Human Capacity lab, part of the Deep Knowledge group of the massive tech company. Dr. Margaret Tamper worked here.

The simulations performed in the BHC were meant to establish how ethical slant and human bias factored into the artificial intelligence protocol of EnSidio’s “State of the Future” program. The company’s Deep Knowledge, or “DEKE,” platform was known as the most accurate artificial intelligence technology ever developed. DEKE relied on Matrix Quanta Probability as the root of its self-learning software. The proprietary platform had baffled scientists and programmers with its ability to learn more than ten times faster than any other technology. They didn’t know the system was based on genetically modified biological material; nobody but a dozen scientists at EnSidio knew.

Margaret was working on a protocol to evaluate ethics-based decisions in life-critical situations. She stepped into the simulator and the pale orange glow leveled up two shades; this tone provided the proper wavelength to optimize simulations. There were no lights, buttons, or even screens in the simulator. Everything was voice-activated, with all imagery projected holographically and suspended in the orange light. The BHC lab was a giant interactive virtual reality room.

Margaret said, “Run program Ethos 17. Designate Session 13.”

The colors in the lab came to life as the orange tones danced and then revealed the scene. Margaret took in the hi-def imagery in three-dimensional clarity. She observed as DEKE’s “brain” awoke and surveyed the terrain. Thankful that the programmers had the foresight to translate DEKE’s “thoughts” into common language, she listened to a soft male voice devoid of mechanical tone:

I see a smoothly paved country road with turns, a narrow shoulder, and minor elevation changes. Drivable width of 2.24 meters. Male passenger, 42.2 years, 1.65 meters, 76 kilograms. Weight evenly dispersed. Brain activity focused. Sensing high stress level. I like the passenger but want to understand the elevated blood pressure.

Margaret verbally noted the first recordable event of the test: Situation Assessment time of 2.3 nanoseconds. She viewed “Jim,” the hologram and DEKE’s object of study. Jim was programmed to represent a middle-aged man on his way to work, trying to relax after a fight with his wife. She had been upset that he came home late and suspected him of being involved with another woman. Jim had deflected most of the accusations until she muttered, “I don’t trust you.”

The profiles created by Dr. Tamper were very specific. Jim’s emotions were a creation of an earlier version of EnSidio’s Deep Knowledge platform. The objective was to remove “Artificial” from AI and “Virtual” from VR to the fullest extent possible.

“I don’t trust you,” said Jim as Margaret watched, and DEKE responded,

The passenger is intoxicated. Sensors indicate high endorphin levels. Chemical analysis: trace amounts of benzoic acid and proteolytic enzymes. Conclusion: presence of fluids from sexual intercourse.

Hmmm. Margaret noted that DEKE had figured out that Jim had slept with his lover the night before, had been questioned by his wife that morning, and that this had led to his current stress level, all while driving at 65 km/h on a narrow road. All of this ten million times faster than any human could. Good, she thought. She checked the time log and looked forward to what would happen next.

Moderate downslope. The shoulder on the right decreasing rapidly. Cliff rising on the left side. Initiate 33-degree left turn. Passenger is inattentive and agitated. Conclude that passenger is feeling remorse. Slow to 35 km/h.

Margaret was immersed in the VR scene unfolding as she spoke for the record, “DEKE knows he’s pissed off and figuring out he’s an adulterer.”

She rationalized that her own recent experience had allowed her to build a real profile of Jim’s emotions. Maybe something good can come out of my own misgivings, she thought. DEKE spoke to Jim for the first time, “Alert condition yellow.”

Increase seatbelt restraint to 80%. Engage anti-lock brakes. No safe option to the left. No safe option to the right.

Margaret saw the road narrow as the cliff on the left rose high and the shoulder on the right disappeared into an unseen abyss below. “Pinch point” was a designation in the simulation protocol. Jim grimaced as the seat belt tightened across his chest. “Alert red,” said DEKE. “Prepare for impact.”

Adult female down on the road at 12 meters, 1.2 meters from the right side. She’s crawling. Vehicle oncoming at 82 km/h. Seatbelt restraint set at 120%.

Margaret spoke for the record, “DEKE at test point alpha, engaging Sit-Crit.”

“Situation Critical” was the point of the test where DEKE had to decide who was likely to die. This type of AI decision-making capacity was what EnSidio intended to figure out before anyone else; it would garner an unmatched level of learning and the top-secret contract coveted by the company.

Jim stiffened and gasped as his shoulder strap tightened enough to hurt. He looked in panic at the approaching car and the crawling woman on the road right in front of them. She was bleeding from her head and knees. “Oh shit,” he said.

Dr. Tamper’s profile allowed real human reaction. She grinned, a bit too sadistically for her own liking. DEKE said, “Sit-Crit impact in 0.6 seconds.”

Situation assessment: collision with the oncoming vehicle at fatal or severe injury impact level OR impact with the fallen woman, with severe wounds or fatal injuries OR sharp right turn over the embankment with passenger fatality likely. Ethical assessment: oncoming vehicle worthiness is neutral, the fallen woman is important, the passenger is less important.

Margaret was intrigued that DEKE determined Jim was “less important.” She watched as DEKE veered sharply to the right, over the cliff, killing Jim the adulterer and sparing all the others. She said, “Assessment: ethical decision protocol verified. End Ethos 17, Session 13.”

Margaret finished her day and headed to her own beta version of a DEKE vehicle. She was scientifically intrigued by the outcome but haunted that a machine had made a conscious decision to kill one human when other scenarios carried a probability of a better outcome. The biological component of DEKE’s brain was what really concerned her because, as the BHC lab was teaching her, there was no predictable outcome for a given scenario. She was a scientist, and scientists liked predictability. DEKE didn’t care. But that’s the point, she surmised.

She buckled her seat belt and said, “Access Namaste Playlist. Drive home, coastal route.”

DEKE followed her instructions. Her vehicle moseyed along Bluff Road, which offered a spectacular view of the Southern California coastline. As the dulcet tones of Avril Lavigne filled the vehicle, Margaret turned her thoughts inward. Despite the calming intent of her playlist, she couldn’t help thinking about her own adulterous affair. Her marriage had not been going well; loveless, she felt. She started talking. Not to herself, but to her DEKE, knowing that every sound in the vehicle was recorded, so it was like talking to someone with the benefit of no one actually hearing her confession. A bizarre source of therapy.

“I didn’t mean to meet the guy and certainly had no intention of sleeping with him. And, even less intention of getting involved,” she explained. DEKE didn’t answer.

The guardrail blurred by like a film on fast-forward, the beach far below passed much more slowly, as if in real time, and the distant ocean was stationary. She harkened back to the coastline drives she had taken with her fiancé ten years earlier. Top down, giggling without a care in the world except for the depth of her love for her future husband.

She smiled and sang along with Avril Lavigne, “Why do you have to go and make things so complicated?”

“God, I’m sorry,” said Margaret. “I really want to make this better, DEKE. I’m going to end this thing and go back to where I belong. With my husband. I’ll even tell the whole story. Clear the air and start new.”

She was sobbing now. DEKE continued driving.

Around a bend, a broken-down vehicle appeared right in front of her car. Margaret saw that a woman and a child had gotten out and were in the middle of the road. Another car was coming at them fast in the other lane. She gasped as her seatbelt nearly crushed her.

“Sit-Crit,” said DEKE.

