

“The New Third Law is the same as the old third-but relative to the weakened First and Second Laws, it is proportionately stronger. A so-called New Law robot will all but inevitably value its own existence far more than any true robot-to the detriment of the humans around it, who should be under its protection.

“As for the New Fourth Law, which says a robot ‘may’ do whatever it likes, the level of contradiction inherent in that statement is remarkable. What does it mean? I grant that the verbal expression of robotic laws is far less exact than the underlying forms of them as structured in a robot’s brain, but even the mathematical coding of the Fourth Law is uncertain.”

“I meant it to be vague,” Fredda said. “That is, I mean there to be a high level of uncertainty. I grant there is a basic contradiction in a compulsory instruction to act with free will, but I was forced to deal within the framework of the compulsory, hierarchical nature of the first three of the New Laws.”

“But even so,” Donald said. “The Fourth New Law sets up something utterly new in robotics-an intralaw conflict. The original Three Laws often conflict with each other, but that is one of their strengths. Robots are forced to balance the conflicting demands; for example, a human gives an order for some vitally important task that involves a very slight risk of minor harm to the human. A robot that is forced to deal with such conflicts and then resolve them will act in a more balanced and controlled fashion. More importantly, perhaps, it can be immobilized by the conflict, thus preventing it from acting in situations where any action at all would be dangerous.

“But the Fourth New Law conflicts with itself; and I can see no possible benefit in that. It gives semi-compulsory permission for a robot to follow its own desires-although a robot has no desires. We robots have no appetites, no ambitions, no sexual urges. We have virtually no emotion, other than a passion for protecting and obeying humans. We have no meaning in our lives, other than to serve and protect humans-nor do we need more meaning than that.

“The Fourth Law in effect orders the robot to create desires, though a robot has none of the underlying urges from which desires spring. The Fourth Law then encourages-but does not require-the robot to fulfill these synthetic desires. In effect, by not compelling a New Law robot to fulfill its needs at all times, the Fourth Law tells a robot to fulfill its spurious needs part of the time-and thus, it will not fulfill them at other times. It is compelled, programmed, to frustrate itself from time to time.

“A true robot, a Three-Law robot, left to its own devices, without orders or work or a human to serve, will do nothing, nothing at all-and be not at all disturbed by its lack of activity. It will simply wait for orders, and be alert for danger to humans. A New Law robot without orders will be a mass of conflicted desires, compelled to want things it does not need, compelled to seek satisfaction only part of the time.”

“Very eloquent, Donald,” Kresh said. “I don’t like New Law robots any better than you do-but what does it have to do with the case?”

“A great deal, sir. New Law robots want to stay alive-and they know that it is not by any means certain they will do so. Prospero in particular knew that Grieg was considering extermination as a possibility. They might well have decided to act in a misguided form of self-defense. The New Laws would permit them to cooperate with humans and assist in a murder, so long as they did not actually do the killing themselves. Caliban, of course, has no Laws whatsoever. There are no limits to what he might do. There is nothing in robotics to prevent him actually pulling the trigger.”

“A rather extreme view, Donald,” Fredda said, quite surprised by the vehemence of Donald’s arguments.

“It is a rather extreme situation, Dr. Leving.”

“Do you have any evidence for all of this, aside from elaborate theory-spinning?”

“I have their confession,” Donald said.

“Their what?” Fredda almost shouted.

Donald held up a cautionary hand. “They confessed to blackmail, not murder. However, it is a frequent tactic of criminals to confess to a lesser charge in order to avoid a graver one.”

“Blackmail?” Kresh asked. “What the devil were they going to blackmail Grieg with?”

“Everything,” Donald said. “It has been an open secret for some time that Prospero has been in league with the rustbackers, seeking to get as many New Law robots as possible off Purgatory. In that capacity, he has accumulated a great deal of information on all the people-some of them quite well known-involved in the rustbacking business, and has made it his business to collect confidential information-preferably negative information-about virtually every public figure on this planet. Prospero told me that he had threatened Grieg with the release of all of it if the New Law robots were exterminated. The ensuing scandals would paralyze society, at the very least. He was, in effect, blackmailing the office, not the man. Do what I say or I ruin your society. It is a tribute to the Governor’s integrity that Prospero was forced to such a tactic.”

“In what way?” Kresh asked.

“Clearly, Prospero would not have needed to offer the threat he did if he had been able to learn a few unpleasant details about Governor Grieg himself. Since he could not locate any such information, he was forced into the far more difficult task of accumulating enough scurrilous information on everyone else that Grieg would not dare have it all get out.”

“So Prospero was willing to blackmail Grieg. What about Caliban?”

“My interrogation of the two of them was necessarily rather brief, but it was my impression that it was Prospero making the threats, perhaps without Caliban’s foreknowledge. Caliban, I must confess, seemed most unhappy to be involved in the whole affair.”

“But you think the whole blackmail story is a hoax,” Fredda said, “a cover story that will divert us from thinking they were there to murder the Governor, or at least assist in the Governor’s murder.”

“I think we must consider the possibility,” Donald said. “And, one last point I must make. Both Caliban and Prospero are capable of lying. Three-Law robots, of course, cannot-”

“But wait a second,” Devray protested. “What could Caliban and Prospero do that wasn’t being done already? We’ve got Bissal in the basement with the rigged SPRs. He’s the triggerman. Why do we need blackmailing robots wandering around?”

“I admit that there is strong circumstantial evidence to suggest that Bissal pulled the trigger,” Donald said. “Why else would he have been in the storeroom? But we have no concrete evidence. All we know for certain is that he was hiding in a storeroom closet during the party.”

“Donald, you’re on a fishing expedition, looking for things to blame on Caliban and Prospero,” Fredda said. “Do you think Bissal went down there and hid because he was shy? If Caliban and Prospero did it, what did they need Bissal for? You don’t make the sort of effort the plotters made to get Bissal in position if you already have someone else ready to do the killing.”

“Nonetheless, Fredda, Donald has a point,” Kresh said. “The two robots did have motives, means, and opportunity-and they have confessed to a lesser crime. There’s certainly enough there to justify further investigation. But let’s move on. Devray?”

“In any event,” Devray went on, “the plotters staged a fight. It seems to me there’s no need to assume Welton and the robots were part of the plot because they were there, but in any event, the fight served its purpose by allowing Bissal to get into the storeroom unobserved. Soon thereafter-also as a result of the fight-the robots were deployed. No one wanted robots around during the party, remember. Bad publicity. The plan was that the SPRs only be brought out if needed.