
In that case, we would now have two effects. The first effect would be a tendency toward randomness which would favor high frequencies and increase radiation as frequency was increased. Second, there was the new Planck effect of decreasing probability of radiation as frequency went up. This would favor low frequencies and decrease radiation as frequency was increased.

In the low-frequency range the first effect is dominant, but in the high-frequency range the second effect increasingly overpowers the first. Therefore, in black-body radiation, as one goes up the frequencies, the amount of radiation first increases, reaches a peak, then decreases again exactly as is observed.

Next, suppose the temperature is raised. The first effect can't be changed, for randomness is randomness. But suppose that as the temperature is raised, the probability of emitting high-frequency radiation increases. The second effect, then, is steadily weakened as the temperature goes up. In that case, the radiation continues to increase with increasing frequency for a longer and longer time before it is overtaken and repressed by the gradually weakening second effect. The peak radiation, consequently, moves into higher and higher frequencies as the temperature goes up, precisely as Wien had discovered.

On this basis, Planck was able to work out an equation that described black-body radiation very nicely both in the low-frequency and high-frequency range.

However, it is all very well to say that the higher the frequency the lower the probability of radiation, but why?

There was nothing in the physics of the time to explain that, and Planck had to make up something new.

Suppose that energy did not flow continuously, as physicists had always assumed, but was given off in pieces.

Suppose there were "energy atoms" and these increased in size as frequency went up. Suppose, still further, that light of a particular frequency could not be emitted unless enough energy had been accumulated to make up an "energy atom" of the size required by that frequency.

The higher the frequency the larger the "energy atom" and the smaller the probability of its accumulation at any given instant of time. Most of the energy would be lost as radiation of lower frequency, where the "energy atoms" were smaller and more easily accumulated. For that reason, an object at a temperature of 400° C. would radiate its heat entirely in the infrared. So few "energy atoms" of visible-light size would be accumulated that no visible glow would be produced.

As temperature went up, more energy would be generally available and the probabilities of accumulating a high-frequency "energy atom" would increase. At 6000° C. most of the radiation would be in "energy atoms" of visible light, but the still larger "energy atoms" of ultraviolet would continue to be formed only to a minor extent.

But how big is an "energy atom"? How much energy does it contain? Since this "how much" is a key question, Planck, with admirable directness, named the "energy atom" a quantum, which is Latin for "how much?" The plural is quanta.

For Planck's equation for the distribution of black-body radiation to work, the size of the quantum had to be directly proportional to the frequency of the radiation. To express this mathematically, let us represent the size of the quantum, or the amount of energy it contains, by e (for energy). The frequency of radiation is invariably represented by physicists by means of the Greek letter nu (v).

If energy (e) is proportional to frequency (v), then e must be equal to v multiplied by some constant. This constant, called Planck's constant, is invariably represented as h. The equation, giving the size of a quantum for a particular frequency of radiation, becomes: e = hv (Equation 1)
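For readers who like to see the arithmetic, Equation 1 can be tried out numerically. The sketch below is only illustrative: the frequency used (5 × 10¹⁴ per second, roughly that of visible light) is an assumption chosen for the example, and the value of h is the one quoted later in this chapter.

```python
# A numerical sketch of Equation 1, e = hv.
# The frequency is an assumed figure, roughly that of visible light.
h = 6.6256e-27   # Planck's constant, in erg-seconds
nu = 5.0e14      # frequency, in 1/second (assumed for this example)

e = h * nu       # size of one quantum of that frequency, in ergs
print(e)         # about 3.3e-12 erg: a very small "energy atom"
```

Doubling the frequency doubles the size of the quantum, which is all the proportionality in Equation 1 says.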

It is this equation, presented to the world in 1900, which is the Continental Divide that separates Classical Physics from Modern Physics. In Classical Physics, energy was considered continuous; in Modern Physics it is considered to be composed of quanta. To put it another way, in Classical Physics the value of h is considered to be 0; in Modern Physics it is considered to be greater than 0.

It is as though there were a sudden change from considering motion as taking place in a smooth glide, to motion as taking place in a series of steps.

There would be no confusion if steps were long galumphing strides. It would be easy, in that case, to distinguish steps from a glide. But suppose one minced along in microscopic little tippy-steps, each taking a tiny fraction of a second. A careless glance could not distinguish that from a glide. Only a painstaking study would show that your head was bobbing slightly with each step. The smaller the steps, the harder to detect the difference from a glide.

In the same way, everything would depend on just how big individual quanta were; on how "grainy" energy was.

The size of the quanta depends on the size of Planck's constant, so let's consider that for a while.

If we solve Equation 1 for h, we get: h = e/v (Equation 2) Energy is very frequently measured in ergs (see Chapter 13). Frequency is measured as "so many per second" and its units are therefore "reciprocal seconds" or "1/second."



We must treat the units of h as we treat h itself. We get h by dividing e by v; so we must get the units of h by dividing the units of e by the units of v. When we divide ergs by 1/second we are multiplying ergs by seconds, and we find the units of h to be "erg-seconds." A unit which is the result of multiplying energy by time is said, by physicists, to be one of "action." Therefore, Planck's constant is expressed in units of action.
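This bookkeeping with units can itself be mechanized. The toy sketch below is not any real units library; it simply records each base unit's exponent in a dictionary and subtracts exponents on division, which is exactly the rule used in the paragraph above.

```python
# A toy unit-bookkeeping sketch (not a real physics library):
# a unit is a dict mapping a base-unit name to its exponent.

def divide_units(a, b):
    """Divide unit dict a by unit dict b by subtracting exponents."""
    out = dict(a)
    for unit, power in b.items():
        out[unit] = out.get(unit, 0) - power
        if out[unit] == 0:
            del out[unit]        # drop units that cancel out
    return out

erg = {"erg": 1}                 # units of energy, e
per_second = {"second": -1}      # units of frequency, v

# Dividing ergs by 1/second leaves erg-seconds, the units of "action".
print(divide_units(erg, per_second))  # {'erg': 1, 'second': 1}
```

Dividing by a reciprocal second adds a second to the numerator, which is why the answer comes out as erg-seconds.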

Since the nature of the universe depends on the size of Planck's constant, we are all dependent on the size of the piece of action it represents. Planck, in other words, had sought and found the piece of the action. (I understand that others have been searching for a piece of the action ever since, but where's the point since Planck has found it?)

And what is the exact size of h? Planck found it had to be very small indeed. The best value, currently accepted, is: 0.0000000000000000000000000066256 erg-seconds, or 6.6256 × 10⁻²⁷ erg-seconds.

Now let's see if I can find a way of expressing just how small this is. The human body, on an average day, consumes and expends about 2500 kilocalories in maintaining itself and performing its tasks. One kilocalorie is equal to 1000 calories, so the daily supply is 2,500,000 calories.

One calorie, then, is a small quantity of energy from the human standpoint. It is 1/2,500,000 of your daily store. It is the amount of energy contained in 1/113,000 of an ounce of sugar, and so on.

Now imagine you are faced with a book weighing one pound and wish to lift it from the floor to the top of a bookcase three feet from the ground. The energy expended in lifting one pound through a distance of three feet against gravity is just about 1 calorie.
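That "just about 1 calorie" can be checked with standard conversion factors (one pound-force is about 4.448 newtons, three feet is 0.9144 meters, and one calorie is about 4.185 joules); the sketch below simply multiplies them out.

```python
# Checking the book-lifting figure with standard conversions.
pound_force_N = 4.448        # one pound-force, in newtons
three_feet_m = 0.9144        # three feet, in meters
joules_per_calorie = 4.185   # one calorie, in joules

work_J = pound_force_N * three_feet_m     # work done, about 4.07 joules
work_cal = work_J / joules_per_calorie    # about 0.97 calories
print(round(work_cal, 2))                 # ~0.97, "just about 1 calorie"
```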

Suppose that Planck's constant were of the order of a calorie-second in size. The universe would be a very strange place indeed. If you tried to lift the book, you would have to wait until enough energy had been accumulated to make up the tremendously sized quanta made necessary by so large a piece of action. Then, once it was accumulated, the book would suddenly be three feet in the air.

But a calorie-second is equal to 41,850,000 erg-seconds, and since Planck's constant is such a minute fraction of one erg-second, a single calorie-second equals 6,316,400,000,000,000,000,000,000,000,000,000 Planck's constants, or 6.3164 × 10³³ Planck's constants, or about six and a third decillion Planck's constants. However you slice it, a calorie-second is equal to a tremendous number of Planck's constants.
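The division behind that figure is easy to reproduce, using the chapter's own values of 41,850,000 ergs per calorie and 6.6256 × 10⁻²⁷ erg-seconds for h.

```python
# One calorie-second expressed in Planck's constants,
# using the values given in this chapter.
ergs_per_calorie = 4.185e7    # 41,850,000 ergs in one calorie
h_erg_seconds = 6.6256e-27    # Planck's constant, in erg-seconds

ratio = ergs_per_calorie / h_erg_seconds
print(f"{ratio:.2e}")         # about 6.3e+33: six and a third decillion
```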

Consequently, in any action such as the lifting of a one-pound book, matters are carried through in so many trillions of trillions of steps, each one so tiny, that motion seems a continuous glide.

When Planck first introduced his "quantum theory" in 1900, it caused remarkably little stir, for the quanta seemed to be pulled out of midair. Even Planck himself was dubious, not over his equation describing the distribution of black-body radiation, to be sure, for that worked well, but about the quanta he had introduced to explain the equation.

Then came 1905, and in that year a 26-year-old theoretical physicist, Albert Einstein, published five separate scientific papers on three subjects, any one of which would have been enough to establish him as a first-magnitude star in the scientific heavens.

In two, he worked out the theoretical basis for "Brownian motion" and, incidentally, produced the machinery by which the actual size of atoms could be established for the first time. It was one of these papers that earned him his Ph.D.

In the third paper, he dealt with the "photoelectric effect" and showed that although Classical Physics could not explain it, Planck's quantum theory could.

This really startled physicists. Planck had invented quanta merely to account for black-body radiation, and here it turned out to explain the photoelectric effect, too, something entirely different. For quanta to strike in two different places like this, it seemed suddenly very reasonable to suppose that they (or something very like them) actually existed.

(Einstein's fourth and fifth papers set up a new view of the universe which we call "The Special Theory of Relativity." It is in these papers that he introduced his famous equation e = mc²; see Chapter 13.

These papers on relativity, expanded into a "General Theory" in 1915, are the achievements for which Einstein is known to people outside the world of physics. Just the same, in 1921, when he was awarded the Nobel Prize for Physics, it was for his work on the photoelectric effect and not for his theory of relativity.)

The value of h is so incredibly small that in the ordinary world we can ignore it. The ordinary gross events of everyday life can be considered as though energy were a continuum. This is a good "first approximation."

However, as we deal with smaller and smaller energy changes, the quantum steps by which those changes must take place become larger and larger in comparison. Thus, a flight of stairs consisting of treads 1 millimeter high and 3 millimeters deep would seem merely a slightly roughened ramp to a six-foot man. To a man the size of an ant, however, the steps would seem respectable individual obstacles to be clambered over with difficulty. And to a man the size of a bacterium, they would be mountainous precipices. In the same way, by the time we descend into the world within the atom the quantum step has become a gigantic thing. Atomic physics ca