“I would like to find out exactly what happened,” said Susan Calvin. “I was not present at the time and I would like to know exactly what the Randow boy was doing in my laboratories without my permission.”

“The important thing that happened,” said Lanning, “is that your robot struck a man.”

“It does not lack First Law. I have studied its brainpaths and know it does not lack it.”

“Then how could it strike a man?” Desperation turned him to sarcasm. “Ask Lenny.”

Susan Calvin’s cheeks flushed a painful pink. She said, “I prefer to interview the victim. And in my absence, Alfred, I want my offices sealed tight, with Lenny inside.”

“Will you agree to its destruction, if it has broken First Law?”

“Yes,” said Susan Calvin, “because I know it hasn’t.”

Charles Randow lay in bed with his arm set and in a cast. His major suffering was still from the shock of those few moments in which he thought a robot was advancing on him with murder in its positronic mind. No other human had ever had such reason to fear direct robotic harm as he had had just then. He had had a unique experience.

Susan Calvin and Alfred Lanning stood beside his bed.

Susan Calvin said, “Now-what happened?”

Randow was daunted. He muttered, “The thing hit me in the arm. It was coming at me.”

Calvin said, “Move further back in the story. What were you doing in my laboratory without authorization?”

The young computer swallowed, and the Adam’s apple in his thin neck bobbed noticeably. He was high-cheekboned and abnormally pale. He said, “We all knew about your robot. The word is you were trying to teach it to talk like a musical instrument. There were bets going as to whether it talked or not. Some said-uh-you could teach a gatepost to talk.”

“I suppose,” said Susan Calvin, freezingly, “that is meant as a compliment. What did that have to do with you?”

“I was supposed to go in there and settle matters-see if it would talk, you know. We swiped a key to your place and I waited till you were gone and went in. We had a lottery on who was to do it. I lost.”

“Then?”

“I tried to get it to talk and it hit me.”

“What do you mean, you tried to get it to talk? How did you try?”

“I-I asked it questions, but it wouldn’t say anything, and I had to give the thing a fair shake, so I kind of-yelled at it, and-”

“And?”

There was a long pause. Under Susan Calvin’s unwavering stare, Randow finally said, “I tried to scare it into saying something.” He added defensively, “I had to give the thing a fair shake.”

“How did you try to scare it?”

“I pretended to take a punch at it.”

“And it brushed your arm aside?”

“It hit my arm.”

“Very well. That’s all.” She turned to Lanning.

At the doorway, she turned back to Randow. “I can settle the bets going around, if you are still interested. Lenny can talk.”

They said nothing until they were in Susan Calvin’s office. Its walls were lined with her books, some of which she had written herself. It retained the patina of her own frigid, carefully ordered personality. It had only one chair in it and she sat down. Lanning and Bogert remained standing.

She said, “Lenny did not break First Law.”

“Except,” said Lanning, “that it broke the boy’s arm.”

“Nor did it,” shot back Calvin, “knowingly. Lenny did not know its own strength.”

Bogert interrupted, soothingly, “Now, Susan, we don’t blame. We understand that Lenny meant no harm. Still, the whole affair is unfortunate for the company.”

“Quite the opposite. If you had the brains of a flea, Peter, you would see that this is the opportunity U. S. Robots is waiting for. That this will solve its problems.”

Lanning said, “What problems?”

“Isn’t the corporation concerned about maintaining our research personnel?”

“We certainly are.”

“Well, what are you offering prospective researchers? Excitement? Novelty? The thrill of piercing the unknown? No! You offer them salaries and the assurance of no problems.”

Bogert said, “How do you mean, no problems?”

“Are there problems?” shot back Susan Calvin. “What kind of robots do we turn out? Fully developed robots, fit for their tasks. An industry tells us what it needs; a computer designs the brain; machinery forms the robot; and there it is, complete and done. Peter, some time ago, you asked me with reference to Lenny what its use was.”

“Do you want a versatile robot?” asked Lanning.

“Well?”

“Suppose you started with a positronic brain that had all the basic pathways carefully outlined but none of the secondaries. Suppose you then started creating secondaries. You could sell basic robots designed for instruction; robots that could be modeled to a job, and then modeled to another, if necessary. Robots would become as versatile as human beings. Robots could learn!”

They stared at her. She said, impatiently, “You still don’t understand, do you?”

“I understand what you are saying,” said Lanning.

“Don’t you understand that with a completely new field of research and completely new techniques to be developed, with a completely new area of the unknown to be penetrated, youngsters will feel a new urge to enter robotics? Try it and see.”

“May I point out,” said Bogert, smoothly, “that this is dangerous. Beginning robots, incompletely trained, would be unpredictable, as Lenny was.”

“Exactly. Advertise the fact.”

“Advertise it!”

“Of course. Broadcast the danger. Explain that you will set up a new research institute on the moon, if Earth’s population chooses not to allow this sort of thing to go on upon Earth, but stress the danger to the possible applicants by all means.”