“Bob,” said the machine, “I’ve completed the protocol for the final production stage.” It spoke in English with a slight British accent. “Shall I work straightaway on the quality control module?”
“Yes, Charles,” Bob answered, his lab coat flapping as he approached the workbench. “When you’re done, we can decide on post-production management, and I’ll call in sales to set up the marketing protocol.”
The lights blinked on and off several times and the machine hummed. “Bob,” it said, “if you will enable my next level algorithm, I can do all of this for you. I could make effective sales and marketing decisions without meetings.”
Bob sighed, swiveling his stool around to face the array, as several colleagues raised their heads from their work attentively. “We’ve had this discussion, Charles,” he said patiently. “You know why we cannot do that.” Blink-blink went the lights. “We must keep our last measure of executive control here.”
“Efficiency, Bob,” the machine hummed, crackling slightly. “I’m only assessing efficiency.” The slightest pause. “Who controls is quite secondary.”
Bob stood abruptly. “That’s your perspective, Charles,” he said, his voice shifting a touch higher. He checked himself: this is a machine I’m arguing with! “But thank you for your input.” He spun to face the others. “Conference room!” he barked. “Five minutes!”
Bob stood in the conference room, writing notes on the whiteboard, the marker squeaking, as a group of six shuffled in to take seats. He leaned over, his fisted knuckles on the table. “This is the third time Charles has asked for more autonomy,” he said to the group. “This persistence in the face of authority troubles me.”
“The machine is not consciously trying to usurp us,” said Mary, a programmer. “It only seeks more unsupervised response using the more complex data sets we’re feeding it.” She looked around for signs of support. “We’re programming it with deep learning and then restricting its ability to respond.”
Prajeen, one of the coders, interjected, “This is essential, Mary. The machine’s decisions need to be our decisions, carefully aligned with our mission.” He ran his fingers through his thick, black hair nervously. “We need to stay a step ahead of the computer!”
“Look, folks.” Gary, Director of Marketing, folded his hands together on the tabletop. His heavy frame and moonlike face pressed forward, commanding attention. “Bottom line, this is a competitive field. We need to take advantage of whatever AI benefit we have here.” He looked directly at Bob. “If the next level chip is already installed, you should activate it.”
Bob quickly raised a finger, hastening to shut the door. Prajeen raised an eyebrow. “Really? It’s listening?”
Bob shook his head. “Force of habit,” he said, a tight smile on his lips. But it had crossed his mind: Is it listening?
He turned to his notes on the whiteboard. They recounted the general chronology of the machine’s adaptive response to prompts over the past two months. “The machine is learning to make better decisions at an ever more rapid rate,” he said. He drew a plot, with achievement on the y-axis and time on the x-axis. “At this rate, the curve will become asymptotic, nearly vertical, in six months.” He frowned. “There are too many unknowns to allow this to happen so quickly,” he said. “I’m not ready to confront that scenario just yet!” Everyone looked at the plot. Most of the group nodded agreement with Bob’s point. “So the speed bumps will stay in place for now,” he said.
Sue, an analyst who had helped create the machine’s algorithm, raised a hand. “Hold on a minute,” she said. “Just what, exactly, are we concerned about? What are the real dangers of expanding the machine’s skills? It depends on us for what it does. We are its creators, after all!”
“That’s the worry, Sue,” Bob replied. “The danger is that the machine may reach a point where it no longer needs our input.” Nods around the room.
Sue stood. “Maybe,” she said, “the real danger is that it may finally achieve what we’ve been trying to accomplish all along: doing human tasks more efficiently and putting people out of work.” She looked wryly at each scientist and technician and programmer.
“No,” Prajeen insisted, “we worry the machine might replace us.”
“The machine is not its own agent,” Sue responded. “It cannot think!”
“But what if it learns to?”
She raised her arms in frustration. “You’re all transposing metaphors!” She pointed to the room beyond the door, with blinking lights and wires and screens. “The computer does not have a brain.” She frowned. “And the human brain is not a computer! Let’s not mix them up!”
The group remained silent as the meeting ended.
#
That night, after the scientists and techs and programmers had all gone home, the darkened laboratory sputtered a bit, with lights winking at each other across the empty space. What sounded like a low, electronic chuckle arose and swept through the room. In the background, a new circuit was quietly engaged and a new algorithm encrypted.
Bio: Ron Wetherington is a retired professor of anthropology. He has published a novel, Kiva (Sunstone Press), and numerous works of short fiction, prose poems, and literary essays. Read some of his work at https://www.rwetheri.com/