"Noted. It'll be done. But ... we have doubts that the memory insertion will deceive the Imago."

"I don't know if it will be fooled or it won't be," Jack said. "It makes no difference. We have to try. And we'd better get cracking very soon."

He started to say something more on the subject, but no sound came from his mouth. His lips were open, and his jaw hung down.

Then he snapped it shut and frowned.

The AI waited patiently for him to speak.

"All of a sudden," Jack murmured, "all of a sudden. .

"What?" the AI said.

"It struck me that I've got an ethical problem! I haven't asked Tappy if it's okay if we mess around with her mind! It's a terrible thing to do that and not even ask her if we can! Yet, the situation is such that we can't ask her if she'll cooperate! To do that would negate the plan from the start!"

"The larger ethical issue overrides the smaller," the AI said.

"Our data makes that clear."

"You have no intuition about ethics," Jack said. "You rely on data. We humans do, too, but we also have feelings. Mine tell me that we are si

"We know the definition of 'sin'," the AI said. "It's a philosophical and theological concept which has no relation to realityexcept as it governs the behavior of Homo sapiens ... and some other sentients."

"How about the Imago's concept of sin, its ethical standards?"

"I have no direct knowledge of that. But it always works for the general good of sentient groups who are also ethical."

Jack thought that no group, or individual, for that matter, believed that it was doing evil. Did Hitler or Stalin or Mao believe that he was evil? No. What they did was for the good of the group they ruled. Or so they believed. Apparently, though, the Imago could perceive what and who was truly good.

"Go away," Jack said. "Let me think."

"The larger does not always outweigh the smaller," the AI said.

"But, in this case, it does."

It turned and walked out of the tent and around the doorway.

Jack paced back and forth. Presently, he heard the tinkling of the little bell which he had gotten from the AI and then placed on the table near Tappy's bed. She could not call out to him if she wanted him, but the bell could be heard throughout the tent and some distance away from it. He went to the bedroom, where she was now sitting on a pile of pillows near the bed.

"What do you want?" he said.

She held up her recorder. He went to her and read what she had printed on it. By now, he was becoming fairly proficient in reading the Gaol alphabet. He only had to refer to his equivalence list twice.

She had written: What is happening?

"I've been busy with the AI," he said. He hesitated, then said, lying, "We're going to put you under hypnosis and try to break through your memory barrier. Maybe, if we're lucky, we can find out what happened before the plane crash."

Suddenly, he had known what he must do to her. That knowledge was making him lie to her, because the most important thing, the only really important thing, was to develop that entity inside her to the Imago phase.

God help her! God help him! They were, from the cosmic viewpoint, only agents. In some respects, their fate was no more important than the AI's. But it mattered greatly to Tappy and him.

They were not unfeeling robotic AI.

Tappy looked anything but happy. In fact, her left hand was gripping her right hand tightly, and she was biting her lip.

"What's the matter?" he said.

She shook her head.

He said, "You are troubled. Don't deny it. I have to know what it is."

He picked up the recorder from the little table and nudged her shoulder with it. After she had taken it, he handed her the stylus.

"Tell me," he said.

She wrote: I reely don't know but I get paniky, sick at my stomik, feel cold as ice, when I think of being hipnutizd.

She added: Please don't make me do it.

She was terrified. Why? Because, he was sure, whoever had installed that block had also put in a command to make her resist fiercely any attempt to remove it. Since he did not intend to have her hypnotized, he found it easy to reassure her.

He said, "Don't be afraid. We won't do it. You're safe from that. I swear."

She relaxed at once and smiled, though shakily.

Now, though, she would fight against anything she could interpret as an attempt to probe her mind. The only thing to do was to sedate her while she was asleep and then have the AI insert the false memories. He hated the idea. Nevertheless, it had to be done.

He pulled her up from the chair and held her tightly. She was still trembling and did not stop until several minutes had passed.

He spoke soothingly and told her that, somehow, things would work out well. Though she probably did not believe him, she may have found some comfort in his words. Perhaps, she was interpreting his embrace and his concern as an expression of his love for her.

That made him feel even more traitorous.

What a Judas he was!

Finally, he released her. It was evident that she did not want him to do that, but he held her at arm's length, one hand resting lightly on her shoulder.

' "I have to talk to the AI," he said. "I'll be back. First, though, is there anything I can get them to get for you? They can probably provide anything you'll want."

Except safety and peace of mind and my love, he thought.

She wrote: Id love a big reel big mug of hot coco with a marshmello.

The child's spelling caused him to be engulfed with tenderness.

She was a child, and she had been terribly wronged. And now he was wronging her.

"I'll do that," he said. "Be back shortly."

He started to withdraw his hand. She grabbed it and held on. Then she made signs with one hand that she wanted to go with him.

"I'm very sorry," he said. "I just can't do that."

He gently pulled his hand away and walked out of the room.

By the time he got to the fountain, an AI, a female, was waiting for him.

He told it about Tappy's request for cocoa.

"It'll be ready for you when you go back," the AI said.

"Put a sedative in it," Jack said. "She needs to sleep a long time while we're pla

And when we're ready to insert them, she'll need something to make her unconscious before she's put wherever you plan to put her during the operation.

"It will be done. She must be very disturbed. We received impressions of great fear from her."

"Do you blame her?" Jack said.

"We don't blame or praise," the AI said.

"You just do the job you were made for, right? Give me the cocoa. I'll take it to her and stay with her until she falls asleep."

He was startled, though he should have been prepared for something like it, when the waves appeared behind the AI. They suddenly cleared to reveal another AI, a male. On a tray it held a mug with at least a quart of steaming cocoa and a huge marshmallow floating on it. He took it. About six minutes later, he returned. The male was gone.

"It didn't take long," he said. "She fell asleep before she'd drunk a quarter of the cocoa."

He had thought that he would be taken to the city-ship for the conference. But the female AI, Candy, was the only one he saw, and they stayed in the entrance room. He sat down on a pile of huge pillows and made notes on the recorder while they talked.

Candy stood in one place and moved only its lips. Its lack of the gestures and twitchings and slight shiftings of the bodies all humans make while talking bothered Jack. It also lessened his feeling that the AI were human. Though he had known they were andro'ds, he had clothed them with humanity, with real life. Now they were naked of these. They were just machines. So, what was he doing talking with machines?