"I don't think you did."

"He was strange. I knew that. I was about fourteen. He was an American, and he took an interest in me. He got me a nice apartment in Saigon. And he put me in school."

She was studying me, looking for a reaction. I didn't give her one.

"He was surely a pedophile, and probably had homosexual tendencies, since I looked so much like a ski

Again the wait. This time she smiled.

"He was good to me. I learned to read well. From there on, anything is possible."

"I didn't actually ask you about your Captain. I asked why you got interested in computers."

"That's right. You did."

"Is it just a living?"

"It started that way. It's the future, Victor."

"God knows I've read that enough times."

"It's true. It's already here. It's power, if you know how to use it. You've seen what Kluge was able to do. You can make money with one of these things. I don't mean earn it, I mean make it, like if you had a printing press. Remember Osborne mentioned that Kluge's house didn't exist? Did you think what that means?"

"That he wiped it out of the memory banks."

"That was the first step. But the lot exists in the county plat books, wouldn't you think? I mean, this country hasn't entirely given up paper.''

"So the county really does have a record of that house."

"No. That page was torn out of the records."

"I don't get it. Kluge never left the house."

"Oldest way in the world, friend. Kluge looked through the L.A.P.D. files until he found a guy known as Sammy. He sent him a cashier's check for a thousand dollars, along with a letter saying he could earn twice that if he'd go to the hall of records and do something. Sammy didn't bite, and neither did McGee, or Molly Unger. But Little Billy Phipps did, and he got a check just like the letter said, and he and Kluge had a wonderful business relationship for many years. Little Billy drives a new Cadillac now, and hasn't the faintest notion who Kluge was or where he lived. It didn't matter to Kluge how much he spent. He just pulled it out of thin air."

I thought that over for a while. I guess it's true that with enough money you can do just about anything, and Kluge had all the money in the world.

"Did you tell Osborne about Little Billy?"

"I erased that disc, just like I erased your seven hundred thousand. You never know when you might need somebody like Little Billy."

"You're not afraid of getting into trouble over it?"

"Life is risk, Victor. I'm keeping the best stuff for myself. Not because I intend to use it, but because if I ever needed it badly and didn't have it, I'd feel like such a fool."

She cocked her head and narrowed her eyes, which made them practically disappear.

"Tell me something, Yank. Kluge picked you out of all your neighbors because you'd been a Boy Scout for thirty years. How do you react to what I'm doing?"

"You're cheerfully amoral, and you're a survivor, and you're basically decent. And I pity anybody who gets in your way."

She grinned.

" 'Cheerfully amoral.' I like that." She sat beside me, making a great sloshing in the bed. "You want to be amoral again?"

"In a little bit." She started rubbing my chest. "So you got into computers because they were the wave of the future. Don't you ever worry about them… I don't know, I guess it sounds corny… do you think they'll take over?"

"Everybody thinks that until they start to use them," she said. "You've got to realize just how stupid they are. With­out programming they are good for nothing, literally. Now, what I do believe is that the people who run the computers will take over. They already have. That's why I study them."

"I guess that's not what I meant. Maybe I can't say it right."

She frowned. "Kluge was looking into something. He'd been eavesdropping in artificial intelligence labs, and reading a lot of neurological research. I think he was trying to find a common thread."

"Between human brains and computers?"

"Not quite. He was thinking of computers and neurons. Brain cells." She pointed to her computer. "That thing, or any other computer, is light-years away from being a human brain. It can't generalize, or infer, or categorize, or invent. With good programming it can appear to do some of those things, but it's an illusion.

"There's an old speculation about what would happen if we finally built a computer with as many transistors as the human brain has neurons. Would there be a self-awareness? I think that's baloney. A transistor isn't a neuron, and a quintil-lion of them aren't any better than a dozen.

"So Kluge-who seems to have felt the same way-started looking into the possible similarities between a neuron and an 8-bit computer. That's why he had all that consumer junk sitting around his house, those Trash-80's and Atari's and TI's and Sinclair's, for chrissake. He was used to much more powerful instruments. He ate up the home units like candy."

"What did he find out?"

"Nothing, it looks like. An 8-bit unit is more complex than a neuron, and no computer is in the same galaxy as an organic brain. But see, the words get tricky. I said an Atari is more complex than a neuron, but it's hard to really compare them. It's like comparing a direction with a distance, or a color with a mass. The units are different. Except for one similarity."

"What's that?"

"The co

"That's what Kluge was interested in. The old 'critical mass computer' idea, the computer that becomes aware, but with a new angle. Maybe it wouldn't be the size of the computer, but the number of computers. There used to be thousands of them. Now there's millions. They're putting them in cars. In wristwatches. Every home has several, from the simple timer on a microwave oven up to a video game or home terminal. Kluge was trying to find out if critical mass could be reached that way."

"What did he think?"

"I don't know. He was just getting started." She glanced down at me. "But you know what, Yank? I think you've reached critical mass while I wasn't looking."

"I think you're right." I reached for her.

Lisa liked to cuddle. I didn't, at first, after fifty years of sleeping alone. But I got to like it pretty quickly.

That's what we were doing when we resumed the conversation we had been having. We just lay in each other's arms and talked about things. Nobody had mentioned love yet, but I knew I loved her. I didn't know what to do about it, but I would think of something.

"Critical mass," I said. She nuzzled my neck, and yawned.

"What about it?"

"What would it be like? It seems like it would be such a vast intelligence. So quick, so omniscient. God-like."

"Could be."

"Wouldn't it… run our lives? I guess I'm asking the same questions I started off with. Would it take over?"

She thought about it for a long time.

"I wonder if there would be anything to take over. I mean, why should it care? How could we figure what its concerns would be? Would it want to be worshipped, for instance? I doubt it. Would it want to 'rationalize all human behavior, to eliminate all emotion,' as I'm sure some sci-fi film computer must have told some damsel in distress in the 'fifties.