Thirteen

ANDREW EXPERIENCED a sensation of discomfort after Little Miss's death that would not leave him for weeks. To call it grief might be a little too strong, he thought, for he suspected that there was no place in his positronic pathways for any feeling that corresponded exactly to the human emotion known as grief.

And yet there was no question but that he was disturbed in some way that could only be traced to the loss of Little Miss. He could not have quantified it. A certain heaviness about his thoughts, a certain odd sluggishness about his movements, a perception of general imbalance in his rhythms-he felt these things, but he suspected that no instruments would be able to detect any measurable change in his capacities.

To ease this sensation of what he would not let himself call grief he plunged deep into his research on robot history, and his manuscript began to grow from day to day.

A brief prologue sufficed to deal with the concept of the robot in history and literature-the metal men of the ancient Greek myths, the automata imagined by clever storytellers like E. T. A. Hoffmann.

And so Andrew moved swiftly to the year 1982 and the incorporation of United States Robots and Mechanical Men by its visionary founder, Lawrence Robertson. He felt almost as though he were reliving the story himself, as he told of the early years of struggle in drafty converted-warehouse rooms and the first dramatic breakthrough in the construction of the platinum-iridium positronic brain, after endless trial-and-error. He told, too, of the conception and development of the indispensable Three Laws, and of the work of research director Alfred Lanning.

And then Andrew turned to something much more troublesome for him to describe: the period of negative human reaction which followed, the hysteria and downright terror that the new robots engendered, the worldwide outburst of legislation prohibiting the use of robot labor on Earth. Because miniaturization of the positronic brain was still in the development stage then and the need for elaborate cooling systems was great, the early mobile speaking units had been gigantic-nearly twelve feet high, frightful lumbering monsters that had summoned up all of humanity's fears of artificial beings-of Frankenstein's monster and the Golem and all the rest of that assortment of nightmares.

Andrew's book devoted three entire chapters to that time of extreme robot-fear. They were enormously difficult chapters to write, for they dealt entirely with human irrationality, and that was a subject almost impossible for Andrew to comprehend.

He grappled with it as well as he could, striving to put himself in the place of human beings who-though they knew that the Three Laws provided foolproof safeguards against the possibility that robots could do harm to humans-persisted in looking upon robots with dread and loathing. And after a time Andrew actually succeeded in understanding, as far as he was able, how it had been possible for humans to have felt insecure in the face of such a powerful guarantee of security.

For what he discovered, as he made his way through the archives of robotics, was that the Three Laws were not as foolproof a safeguard as they seemed. They were, in fact, full of ambiguities and hidden sources of conflict. And they could unexpectedly confront robots-straightforward literal-minded creatures that they were-with the need to make decisions that were not necessarily ideal from the human point of view.

The robot who was sent on a dangerous errand on an alien planet, for example-to find and bring back some substance vital to the safety and well-being of a human explorer-might feel such a conflict between the Second Law of obedience and the Third Law of self-preservation that he would fall into a hopeless equilibrium, unable either to go forward or to retreat. And by such a stalemate the robot-through inaction-thus could create dire jeopardy for the human who had sent him on his mission, despite the imperatives of the First Law that supposedly took precedence over the other two. For how could a robot invariably know that the conflict he was experiencing between the Second and Third Laws was placing a human in danger? Unless the nature of his mission had been spelled out precisely in advance, he might remain unaware of the consequences of his inaction and never realize that his dithering was creating a First Law violation.

Or the robot who might, through faulty design or poor programming, decide that a certain human being was not human at all, and therefore not in a position to demand the protection that the First and Second Laws were supposed to afford.

Or the robot who was given a poorly phrased order, and interpreted it so literally that he inadvertently caused danger to humans nearby.

There were dozens of such case histories in the archives. The early roboticists-most notably the extraordinary robopsychologist, Susan Calvin, that formidable and austere woman-had labored long and mightily to cope with the difficulties that kept cropping up.

The problems had become especially intricate as robots with more advanced types of positronic pathways began to emerge from the workshops of U. S. Robots and Mechanical Men toward the middle of the Twenty-First Century: robots with a broader capacity for thought, robots who were able to look at situations and perceive their complexities with an almost human depth of understanding. Robots like-though he took care not to say so explicitly-Andrew Martin himself. The new generalized-pathway robots, equipped with the ability to interpret data in much more subjective terms than their predecessors, often reacted in ways that humans were not expecting. Always within the framework of the Three Laws, of course. But sometimes from a perspective that had not been anticipated by the framers of those laws.

As he studied the archives, Andrew saw the same pattern again and again: humans issuing careless, contradictory, or poorly considered orders, and robots obeying them to the letter.

So it was humans themselves who sometimes led robots into violations of one or another of the Three Laws-and then, in their illogical way, often would blame the robots themselves for having done something undesirable which in fact they had actually been ordered to do by their human masters.

Andrew handled these chapters with the utmost care and delicacy, revising and revising them to eliminate any possibility of bias. It was not his intention to write a diatribe against the flaws of mankind. His prime goal, as always, was to serve the needs of mankind.

The original purpose of writing his book might have been to arrive at a deeper understanding of his own relationship to the human beings who were his creators-but as he proceeded with it he saw that, if properly and thoughtfully done, the book could be an invaluable bridge between humans and robots, a source of enlightenment not only for robots but for the flesh-and-blood species that had brought them into the world. Anything that enabled humans and robots to get along better would permit robots to be of greater service to humanity; and that, of course, was the reason for their existence.

When he had finished half his book, Andrew asked George Charney to read what he had written and offer suggestions for its improvement. Several years had passed since the death of Little Miss, and George himself seemed unwell now, his once robust frame gaunt, his hair nearly gone. He looked at Andrew's bulky manuscript with an expression of barely masked discomfort and said, "I'm not really much of a writer myself, you know, Andrew."

"I'm not asking for your opinion of my literary style, George. It's my ideas that I want you to evaluate. I need to know whether there's anything in the manuscript that might be offensive to human beings."

"I'm sure there isn't, Andrew. You have always been the soul of courtesy."

"I would never knowingly give offense, that is true. But the possibility that I would inadvertently-"

George sighed. "Yes. Yes, I understand. All right, I'll read your book, Andrew. But you know that I've been getting tired very easily these days. It may take me a while to plow all the way through it."

"There is no hurry," said Andrew.

Indeed George took his time: close to a year. When he finally returned the manuscript to Andrew, though, there was no more than half a page of notes attached to it, the most minor factual corrections and nothing more.

Andrew said mildly, "I had hoped for criticisms of a more general kind, George."

"I don't have any general criticisms to make. It's a remarkable work. Remarkable. It's a truly profound study of its subject. You should be proud of what you've done."

"But where I touch on the topic of how human irrationality has often led to Three Laws difficulties-"

"Absolutely on the mark, Andrew. We are a sloppy-minded species, aren't we? Brilliant and tremendously creative at times, but full of all sorts of messy little contradictions and confusions. We must seem like a hopelessly illogical bunch to you, don't we, Andrew?"

"There are times that it does seem that way to me, yes. But it is not my intention to write a book that is critical of human beings. Far from it, George. What I want to give the world is something that will bring humans and robots closer together. And if I should seem to be expressing scorn for the mental abilities of humans in any way, that would be the direct opposite of what I want to be doing. Which is why I had hoped that you would single out, in your reading of my manuscript, any passages that might be interpreted in such a way that-"

"Perhaps you should have asked my son Paul to read the manuscript instead of me," George said. "He's right at the top of his profession, you know. So much more in touch with all these matters of nuance and subtle inference than I am these days."

And Andrew finally understood from that statement that George Charney had not wanted to read his manuscript at all-that George was growing old and weary, that he was entering the final years of his life, that once again the wheel of the generations had turned and that Paul was now the head of the family. Sir had gone and so had Little Miss and soon it was going to be George's turn. Martins and Charneys came and went and yet Andrew remained-not exactly unchanging (for his body was still undergoing occasional technological updating and it also seemed to him that his mental processes were constantly deepening and growing richer as he allowed himself to recognize fully his own extraordinary capabilities), but certainly invulnerable to the ravages of the passing years.