than most of us. Do you have any light to shed on this, Mister Vaneski?”

Mike grinned to himself without letting it show on his face. The skipper was letting the boot ensign redeem himself after the faux pas he’d made.

Vaneski started to stand up, but Quill made a slight motion with his hand and the boy relaxed.

“It’s only a guess, sir,” he said, “but I think it’s because the robot knows too much.”

Quill and the others looked blank, but Mike narrowed his eyes imperceptibly. Vaneski was practically echoing Mike’s own deductions.

“I mean—well, look, sir,” Vaneski went on, a little flustered, “they started to build that thing ten years ago. Eight years ago they started teaching it. Evidently they didn’t see any reason for building it off Earth then. What I mean is, something must’ve happened since then to make them decide to take it off Earth. If they’ve spent all this much money to get it away, that must mean that it’s dangerous somehow.”

“If that’s the case,” said Captain Quill, “why don’t they just shut the thing off?”

“Well—” Vaneski spread his hands. “I think it’s for the same reason. It knows too much, and they don’t want to destroy that knowledge.”

“Do you have any idea what that knowledge might be?” Mike the Angel asked.

“No, sir, I don’t. But whatever it is, it’s dangerous as hell.”

The briefing for the officers and men of the William Branchell—the Brainchild—was held in a lecture room at the laboratories of the Computer Corporation of Earth’s big Antarctic base.

Captain Quill spoke first, warning everyone that the project was secret and asking them to pay the strictest attention to what Dr. Morris Fitzhugh had to say.

Then Fitzhugh got up, his face ridged with nervousness. He assumed the air of a university professor, launching himself into his speech as though he were anxious to get through it in a given time without finishing too early.

“I’m sure you’re all familiar with the situation,” he said, as though apologizing to everyone for telling them something they already knew—the apology of the learned man who doesn’t want anyone to think he’s being overly proud of his learning.

“I think, however, we can all get a better picture if we begin at the beginning and work our way up to the present time.

“The original problem was to build a computer that could learn by itself. An ordinary computer can be forcibly taught—that is, a technician can make changes in the circuits which will make the robot do something differently from the way it was done before, or even make it do something new.

“But what we wanted was a computer that could learn by itself, a computer that could make the appropriate changes in its own circuits without outside physical manipulation.

“It’s really not as difficult as it sounds. You’ve all seen autoscribers, which can translate spoken words into printed symbols. An autoscriber is simply a machine which does what you tell it to—literally. Now, suppose a second computer is connected intimately with the first in such a manner that the second can, on order, change the circuits of the first. Then, all that is needed is....”

Mike looked around him while the roboticist went on. The men were looking pretty bored. They’d come to get a briefing on the reason for the trip, and all they were getting was a lecture on robotics.

Mike himself wasn’t so much interested in the whys and wherefores of the trip; he was wondering why it was necessary to tell anyone—even the crew. Why not just pack Snookums up, take him to wherever he was going, and say nothing about it?

Why explain it to the crew?

“Thus,” continued Fitzhugh, “it became necessary to incorporate into the brain a physical analogue of Lagerglocke’s Principle: ‘Learning is a result of an inelastic collision.’

“I won’t give it to you symbolically, but the idea is simply that an organism learns only if it does not completely recover from the effects of an outside force imposed upon it. If it recovers completely, it’s just as it was before. Consequently, it hasn’t learned anything. The organism must change.”

He rubbed the bridge of his nose and looked out over the faces of the men before him. A faint smile came over his wrinkled features.

“Some of you, I know, are wondering why I am boring you with this long recital. Believe me, it’s necessary. I want all of you to understand that the machine you will have to take care of is not just an ordinary computer. Every man here has had experience with machinery, from the very simplest to the relatively complex. You know that you have to be careful of the kind of information—the kind of external force—you give a machine.

“If you aim a spaceship at Mars, for instance, and tell it to go through the planet, it might try to obey, but you’d lose the machine in the process.”

A ripple of laughter went through the men. They were a little more relaxed now, and Fitzhugh had regained their attention.

“And you must admit,” Fitzhugh added, “a spaceship which was given that sort of information might be dangerous.”

This time the laughter was even louder.

“Well, then,” the roboticist continued, “if a mechanism is capable of learning, how do you keep it from becoming dangerous or destroying itself?

“That was the problem that faced us when we built Snookums.

“So we decided to apply the famous Three Laws of Robotics propounded over a century ago by a brilliant American biochemist and philosopher.

“Here they are:

“‘One: A robot may not injure a human being, nor, through inaction, allow a human being to come to harm.

“‘Two: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

“‘Three: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.’”

Fitzhugh paused to let his words sink in, then: “Those are the ideal laws, of course. Even their propounder pointed out that they would be extremely difficult to put into practice. A robot is a logical machine, but it becomes somewhat of a problem even to define a human being. Is a five-year-old competent to give orders to a robot?

“If you define him as a human being, then he can give orders that might wreck an expensive machine. On the other hand, if you don’t define the five-year-old as human, then the robot is under no compulsion to refrain from harming the child.”

He began delving into his pockets for smoking materials as he went on.

“We took the easy way out. We solved that problem by keeping Snookums isolated. He has never met any animal except adult human beings. It would take an awful lot of explaining to make him understand the difference between, say, a chimpanzee and a man. Why should a hairy pelt and a relatively low intelligence make a chimp non-human? After all, some men are pretty hairy, and some are moronic.

“Present company excepted.”

More laughter. Mike’s opinion of Fitzhugh was beginning to go up. The man knew when to break pedantry with humor.

“Finally,” Fitzhugh said, when the laughter had subsided, “we must ask what is meant by ‘protecting his own existence.’ Frankly, we’ve been driven frantic by that one. The little humanoid, caterpillar-track mechanism that we all tend to think of as Snookums isn’t really Snookums, any more than a human being is a hand or an eye. Snookums wouldn’t actually be threatening his own existence unless his brain—now in the hold of the William Branchell—is destroyed.”

As Dr. Fitzhugh continued, Mike the Angel listened with about half an ear. His attention—and the attention of every man in the place—had been distracted by the entrance of Leda Crannon. She stepped in through a side door, walked over to Dr. Fitzhugh, and whispered something in his ear. He nodded, and she left again.

Fitzhugh, when he resumed his speech, was rather more hurried in his delivery.

“The whole thing can be summed up rather quickly.

“Point One: Snookums’ brain contains the information that eight years of hard work have laboriously put into it. That information is more valuable than the whole cost of the William Branchell; it’s worth billions. So the robot can’t be disassembled, or the information would be lost.

“Point Two: Snookums’ mind is a strictly logical one, but it is operating in a more than logical universe. Consequently, it is unstable.

“Point Three: Snookums was built to conduct his own experiments. To forbid him to do that would be similar to beating a child for acting like a child; it would do serious harm to the mind. In Snookums’ case, the randomity of the brain would exceed optimum, and the robot would become insane.

“Point Four: Emotion is not logical. Snookums can’t handle it, except in a very limited way.”

Fitzhugh had been making his points by tapping them off on his fingers with the stem of his unlighted pipe. Now he shoved the pipe back in his pocket and clasped his hands behind his back.

“It all adds up to this: Snookums must be allowed the freedom of the ship. At the same time, every one of us must be careful not to ... to push the wrong buttons, as it were.

“So here are a few don’ts. Don’t get angry with Snookums. That would be as silly as getting sore at a phonograph because it was playing music you didn’t happen to like.

“Don’t lie to Snookums. If your lies don’t fit in with what he knows to be true—and they won’t, believe me—he will reject the data. But it would confuse him, because he knows that humans don’t lie.

“If Snookums asks you for data, qualify it—even if you know it to be true. Say: ‘There may be an error in my knowledge of this data, but to the best of my knowledge....’

“Then go ahead and tell him.

“But if you absolutely don’t know the answer, tell him so. Say: ‘I don’t have that data, Snookums.’

“Don’t, unless you are....”

He went on, but it was obvious that the officers and crew of the William Branchell weren’t paying the attention they should. Every one of them was thinking dark gray thoughts. It was bad enough that they had to take out a ship like the Brainchild, untested and jerry-built as she was. Was it necessary to have an eight-hundred-pound, moron-genius child-machine running loose, too?

Evidently, it was.

“To wind it up,” Fitzhugh said, “I imagine you are wondering why it’s necessary to take Snookums off Earth. I can only tell you this: Snookums knows too much about nuclear energy.”

Mike the Angel smiled grimly to himself. Ensign Vaneski had been right; Snookums was dangerous—not only to individuals, but to the whole planet.

Snookums, too, was a juvenile delinquent.


10

The Brainchild lifted from Antarctica at exactly 2100 hours, Greenwich time. For three days the officers and men of the ship had worked as though they were the robots instead of their passenger—or cargo, depending on your point of view.

Supplies were loaded, and the great engine-generators checked and rechecked. The ship was ready to go less than two hours before take-off time.

The last passenger aboard was Snookums, although, in a more proper sense, he had always been aboard. The little robot rolled up to the elevator on his treads and was lifted into the body of the ship. Miss Crannon was waiting for him at the air lock, and Mike the Angel was standing by. Not that he had any particular interest in watching Snookums come aboard, but he did have a definite interest in Leda Crannon.

“Hello, honey,” said Miss Crannon as Snookums rolled into the air lock. “Ready for your ride?”

“Yes, Leda,” said Snookums in his contralto voice. He rolled up to her and took her hand. “Where is my room?”

“Come along; I’ll show you in a minute. Do you remember Commander Gabriel?”

Snookums swiveled his head and regarded Mike.

“Oh yes. He tried to help me.”

“Did you need help?” Mike growled in spite of himself.

“Yes. For my experiment. And you offered help. That was very nice. Leda says it is nice to help people.”

Mike the Angel carefully refrained from asking Snookums if he thought he was people. For all Mike knew, he did.

Mike followed Snookums and Leda Crannon down the companionway.

“What did you do today, honey?” asked Leda.

“Mostly I answered questions for Dr. Fitzhugh,” said Snookums. “He asked me thirty-eight questions. He said I was a great help. I’m nice, too.”

“Sure you are, darling,” said Miss Crannon.

“Ye gods,” muttered Mike the Angel.

“What’s the trouble, Commander?” the girl asked, widening her blue eyes.

“Nothing,” said Mike the Angel, looking at her innocently with eyes that were equally blue. “Not a single solitary thing. Snookums is a sweet little tyke, isn’t he?”

Leda Crannon gave him a glorious smile. “I think so. And a lot
