I can’t! he thought.

He’s taking too long, Flattery thought. “I’ll give you the board on the count, Tim. I’m wearing pretty thin.”

Before Timberlake could protest, the count had started and his hand went automatically to the big red switch. Board and arrow came to him. Necessities of the job caught him immediately. Almost a third of the shield temperature control needed trimming to bring it into better balance.

We should trace out the OMC linkages for this and install automatics for the gross part of the job, he thought.

Presently, he fell into the routine of the watch.

“Here’s our operating procedure,” Bickel said. He looked up, caught an exchange of knowing glances between Flattery and Prue, hesitated. Something going on between those two? If it was man-woman problems, that could cause trouble.

“You were saying,” Prudence said.

Bickel saw she was staring directly at him. He cleared his throat, glanced at his figures and schematics for reassurance. “The computer must be the basis for anything we build, but we can’t interfere with core memory and switching controls. That means we have to use an electronic simulation model. Part of the AAT system …”

“What about communication with Moonbase?” Prudence asked.

That’s a stupid question, he thought, but he hid his irritation. “A switching system will automatically restore AA function when the reply burst hits our antennas. We’ll use an alarm klaxon.”

“Oh.” She nodded, wondering how far she could go before he realized he was being irritated purposely.

“This will be an operational model,” he said. “It’ll duplicate real characteristics of the total system, but won’t function as completely as the computer-based system. However it will give us direct observation of functions with conventional equipment. It’ll tell us where we have to go unconventional. The environment, the signals, and the system parameters can be observed and changed as development progresses. And we’ll only need a one-way, fused link with the computer to permit it to record all our results.”

This much was predictable, Flattery thought. But where does he go from here?

“We’ll generate an environment in scaled time and apply its own effect signals to the system under analysis,” Prudence said. “Good. What then?”

“Based on my experience with the UMB experiments,” Bickel answered, “I can tell you which avenues aren’t worth exploring and which avenues may give us an artificial consciousness. May do it. From here on in, it’s cut and try.”

“Are we going to have to fight the time lag and possibility of transmission errors while we let Moonbase analyze our progress?” Flattery asked.

Bickel glanced at his computations and schematics, looked back at Prudence. “Do we have a mathematician aboard competent enough to break down the embodied transducers of our results?”

Prudence looked across Bickel at the displays and stacks of schematics. She had followed enough of what he was doing there to combine that with the programming he had handed her, but it was the same old self-reflexive circle every time they faced this problem—where did the round of consciousness begin?

“Maybe I can handle the math,” she said. “And that’s all—just maybe.”

“Then which avenue do we explore first?” Flattery asked.

“The field-theory approach,” Bickel said.

“Oh, great!” Timberlake growled. “We’re going to assume that the whole is greater than the sum of its parts.”

“Okay,” Bickel said. “But just because we can’t see a thing or define it, that doesn’t mean it isn’t there and shouldn’t be added into the sum. We’re going to be juggling one hell of a lot of unknowns. The best approach to that kind of job is the engineering one: if it works, that’s the answer.”

“Define consciousness for me,” Prudence said.

“We’ll leave that up to the bigdomes at UMB,” Bickel said.

“And our only contact between the simulation model and the main computer will be through the loading channels?” Prudence asked. “What do we do about the supervisory control programs?”

“We’re not going to touch the inner communications lines to the computer,” Bickel said. “Our auxiliary will go into it through a one-way channel, fused against backlash.”

“Then it won’t give us total simulation,” she pointed out.

“That’s right,” Bickel agreed. “We’ll have an error coefficient to contend with all along the line. If it gets too high, we change our plan of attack. The simulator will be just an auxiliary—kind of dumb in some respects.”

“And there’s no way for this auxiliary to run wild?” Flattery asked.

“Its supervisory program will always be one of us,” Bickel said, fighting to keep irritation from his voice. “One of us will always be in the driver’s seat. We’ll drive it—like we’d drive an ox pulling a wagon.”

“This ox won’t have any ideas of its own, eh?” Flattery persisted.

“Not unless we solve the consciousness problem,” Bickel said.

“Ngaaa!”

Flattery’s word pounced.

“And when it’s conscious, what then?” he asked.

Bickel blinked at him, absorbing this. Presently, he said, “I … suppose it’ll be like a newborn baby … in a sense.”

“What baby was ever born with all the information and stored experiences of this ship’s master computer?” Flattery demanded.

Bickel’s being fed this too fast, Prudence thought. If he’s kept too much off balance he may rebel or start to probe in the wrong places. He mustn’t guess.

“Well … the human is born with instincts,” Bickel said. “And we do train the human baby into … humanity.”

“I find the moral and religious aspects of this whole idea faintly repugnant,” Flattery declared flatly. “I think there’s sin here. If not hubris, then something equally evil.”

Prudence stared at him. Flattery betrayed signs of real agitation—a flush in his cheeks, fingers trembling, eyes bright and glaring.

That wasn’t in the program, she thought. Perhaps he’s tired.

“All right,” she said. “We construct a field of interacting impulses and that puts us right smack dab into a games theory problem where countless bits are—”

“Oh, no!” Bickel snapped. “The UMB stab at this thing got all fouled up with games-theory ideas like the ‘Command Constant’ and ‘Mobility Constant’ and inner-outer-directed behavior. It took me one hell of a long time to realize they didn’t know what they were talking about.”

“Easy for you to say,” Prudence said, holding her voice to a slow, cold beat. “You forget I saw the games machine they produced. The more it was used, the more it changed in—”

“Okay, it changed,” Bickel admitted. “The machine absorbed part of its … personality from its opponents. What’s that mean? It had some of the characteristics of consciousness, sure—but it wasn’t conscious.”

She turned away, conveying a sneer by the movement alone. He has to think he can rely on no one but himself.

Flattery shifted his attention from Bickel to Prudence and back. He found it increasingly difficult to hide his resentment of Bickel.

Psychiatrist, heal thyself, he thought. Bickel has to take charge. I’m just the safety fuse.

Flattery glanced at the false plate on his personal repeater board, thinking of the trigger beneath that plate and the mate to it in his quarters concealed by the lines of the sacred graphic on the bulkhead.

Arbitrary turn-back command, Flattery reminded himself. That was the code signal he must listen for from UMB. That was the signal he must obey—unless he judged the ship had to be destroyed before receiving that signal.

A simple push on one of the hidden triggers would activate the master program in the ship’s computer, open airlocks, set off explosive charges. Death and destruction for crew, ship, all the colonists and their supplies.

Colonists and their supplies! Flattery thought.

He was too good a psychiatrist not to recognize the guilt motives behind the careful provisioning of this ship.

“If you solve the Artificial Consciousness problem, you can plant a human colony somewhere in space. Not at Tau Ceti, of course, but …”

Source: www.allfreenovel.com