Thursday, June 25, 2009

Don't Stand Too Close to the TV!

I don’t know about you, but I grew up constantly being told by my parents not to stand too close to things. Don’t stand too close to the TV because it’ll hurt your eyes. Don’t stand too close to the microwave because it will give you cancer. I believed the TV thing so wholeheartedly that when my eyesight began to get bad in 6th grade and I had to get glasses, I was convinced it was related to my rebellious tendency to stare at the TV from close range when my parents weren’t looking.

I’m sure if I had developed leukemia or something at 13, I would have been convinced it was from the microwave.

But is it actually dangerous to stand too close to a TV or a microwave? Does doing so expose one’s body to dangerous radiation that can do biological damage to eyes, body, and soul?

To figure out the answer to those questions, it is important to understand what radiation is. Most people think of radiation as some mysterious death ray that emanates from radioactive elements and atomic bombs.

To be sure, those things give off radiation, but so do microwaves, televisions, and radios. Does that mean that radiation coming from small electronics is the same as radiation coming from uranium, Hiroshima, or Three Mile Island?

Well, not exactly.

All electromagnetic radiation falls on a scale that physicists refer to as the Electromagnetic Spectrum. This spectrum categorizes radiation by its level of energy. The higher the energy of a radiation source, the higher on the EM Spectrum it goes.

There are seven categories of electromagnetic radiation. In order from lowest energy to highest energy, they are: radio waves, microwaves, infrared light, visible light, ultraviolet light, X-rays, and gamma rays. While each of these categories of “rays” has a different name, it is only the energy level that makes any difference between them. In other words, a radio wave is only different from a gamma ray because of a difference in energy level. The “ray” or “wave” itself is the same – it’s called a “photon” by physicists. It’s like the difference between boiling water and ice. Both are still just water; the only difference is the temperature. Similarly, both a gamma ray and a radio wave are electromagnetic photons, the only difference being the energy.

The energy difference between the various forms of EM radiation is enormous. Radio waves are the lowest on the list, and gamma rays are the highest – with six steps of separation between them. But that doesn’t mean gamma rays are merely six times more energetic than radio waves. In fact, a typical radio wave has a frequency of around 10,000 cycles per second, while a gamma ray, on the other hand, has around 10-to-the-20th cycles per second. That would be 1 with 20 zeros behind it. To put that in perspective, many scientists figure there are on the order of 10-to-the-18th grains of sand on earth. That’s 1 with only 18 zeros behind it.
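Since a photon’s energy is proportional to its frequency (Planck’s relation, E = h × f – a standard physics fact, not something stated in the post), a quick back-of-the-envelope sketch shows just how lopsided the comparison is:

```python
# Back-of-the-envelope check of the energy gap between a radio photon
# and a gamma photon, using Planck's relation E = h * f.
PLANCK_H = 6.626e-34  # Planck's constant, in joule-seconds

radio_freq = 1e4   # cycles per second (the radio-wave example above)
gamma_freq = 1e20  # cycles per second (the gamma-ray example above)

radio_energy = PLANCK_H * radio_freq  # joules carried by one photon
gamma_energy = PLANCK_H * gamma_freq

ratio = gamma_energy / radio_energy
print(f"One gamma photon carries {ratio:.0e} times "
      f"the energy of one radio photon")  # -> 1e+16
```

Ten quadrillion to one – which is why the categories get different names even though the underlying photon is the same kind of object.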

Furthermore, the higher you get on the scale, the closer together the categories become. For instance, visible light, the middle category, has a frequency of about 10-to-the-15th cycles per second – 11 powers of ten higher than radio waves, but only 5 powers of ten below gamma rays.

The energy determines how dangerous the photon is. The higher it is, the more dangerous it is. But it is not just energy that ultimately determines the risk to humans. Two other factors are also involved: quantity and duration.

A very high energy source of EM radiation may be harmless if the quantity of radiation is sufficiently low and the duration of exposure is sufficiently short. A single gamma ray photon – though highly energetic – is not likely to hurt you.

Similarly, a very low energy source of radiation could carry risks with high enough quantity and duration. In other words, radio waves – those photons that make your radio work – could theoretically kill someone if the quantity and duration were great enough, despite having inherently very low energy.

But that’s just in theory. The fact is, no radiation below the level of ultraviolet is energetic enough to ionize atoms – that is, to knock electrons loose and break chemical bonds, which is the kind of damage that mutates cells. (Lower-energy radiation can still heat tissue – that’s exactly how a microwave oven cooks – but heating is a matter of quantity and duration, not of the individual photon being dangerous.) And in the case of microwaves and the ovens that use them, microwaves are two steps below visible light on the electromagnetic spectrum. That means visible light – the light given off by light bulbs and the sun – is of higher energy than microwaves. That means light bulbs give off a more energetic form of radiation than microwave ovens. And no one, of course, supposes that it is dangerous to stand beneath a light bulb.
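To put rough numbers on that claim, we can compare an oven’s microwave photons against visible and ultraviolet photons, and against the few electron-volts it takes to start breaking chemical bonds. The 2.45 GHz oven frequency and the ~3 eV bond-breaking threshold below are standard reference ballparks, not figures from the post:

```python
# Photon energies in electron-volts, via E = h * f.
PLANCK_H_EV = 4.136e-15  # Planck's constant, in eV-seconds

microwave = PLANCK_H_EV * 2.45e9    # oven photon (2.45 GHz): ~1e-5 eV
visible = PLANCK_H_EV * 5.5e14      # green-ish light photon: ~2.3 eV
uv = PLANCK_H_EV * 1.5e15           # ultraviolet photon: ~6 eV

BOND_ENERGY = 3.0  # rough eV needed to break a typical chemical bond

for name, e in [("microwave", microwave), ("visible", visible), ("UV", uv)]:
    verdict = "can" if e >= BOND_ENERGY else "cannot"
    print(f"{name:9s} photon: {e:.1e} eV -> {verdict} break chemical bonds")
```

A microwave photon falls short of the bond-breaking threshold by roughly five powers of ten – which is the sense in which it is “safer” than the visible light from a lamp, photon for photon.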

A deadly source of radiation?

Did Mother get sterilized by the Oven o' Death?

The difference, however, between a light bulb and a microwave is not just about energy. Remember that quantity and duration matter too. A microwave oven puts out enormous amounts of microwave radiation, far more radiation than is given out by a light bulb. But since microwave radiation has far lower energy than visible light, and since the duration is usually only a minute or two, standing in front of a microwave for a few minutes is probably no more dangerous than sitting underneath a light bulb for a few hours.

But what about televisions? Does sitting too close to the TV cause eye problems?

In the modern age, you first have to distinguish what kind of TV one is discussing. Old TVs – the so-called “tube” TVs – give off radio waves, visible light, and X-rays.

Modern TVs – flat-screen LCD and plasma televisions – don’t give off any form of radiation at all except for visible light. There are no radio waves or X-rays being given off.

Older TVs receive their signal as radio waves through an antenna and use a cathode ray tube to turn that signal into a picture. The cathode ray tube, as a side effect of firing electrons at the screen, emits X-rays. Those X-rays definitely have an energy level that falls into the “dangerous” realm. But the quantity of X-rays produced in old televisions is very low, and those televisions also have leaded glass in the screen that absorbs the few X-rays that are being produced. The visible light and radio waves emitted from the television are so low in energy and quantity that they would not do biological damage to the body or the eyes.
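For a sense of scale, the maximum energy of an X-ray photon a CRT can produce is capped by its anode voltage: an electron accelerated through V volts carries at most V electron-volts, and no emitted photon can exceed that. The ~25 kV figure below is a typical color-CRT anode voltage – an assumption for illustration, not a number from the post:

```python
# Upper bound on CRT X-ray photon energy. An electron accelerated
# through the anode voltage gains exactly that many electron-volts,
# so no X-ray photon it produces can carry more than that.
anode_voltage = 25_000  # volts (typical color CRT - assumed figure)

max_xray_ev = anode_voltage  # eV, by definition of the electron-volt
dental_xray_ev = 60_000      # typical dental X-ray tube peak, in eV

print(f"CRT X-rays top out near {max_xray_ev / 1000:.0f} keV, "
      f"vs ~{dental_xray_ev / 1000:.0f} keV for a dental X-ray")
```

So even before the leaded glass absorbs them, the few X-rays a tube TV makes are weaker than those from a deliberate medical X-ray exposure.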

Modern televisions, because they only give off visible light waves, are not dangerous either.

So what we are left with is an example of an old myth dying hard. Televisions never were dangerous for the eyes, and they especially aren’t dangerous in modern, digital versions. In addition, microwaves give off forms of radiation that are very low energy, and despite the high quantities of radiation involved in running a microwave oven, the duration and energy are low enough that there is no legitimate danger in standing in front of a microwave. It is also important to remember that most of the microwave radiation produced inside the oven is absorbed or reflected by the metal casing and the mesh screen in the door, so very little of it ever reaches someone standing in front of it anyway.

In the end, I wouldn’t worry much about standing in front of microwaves or sitting too close to the TV. You might want to turn that lamp off on your desk though.


Astronomer Royal said...

Excellent analysis! The older picture tubes did produce x-rays and there probably was shielding. Since we kept them longer, some of the shielding might have been worn away but I don't know if that is true. Still, today, LCD flat-screen TVs should not have those issues.

However, our vision can be affected by concentrated doses of even visible light. Intensity is the key. Staring at a very bright light for a long time probably can hurt your eyesight. So putting your face against a TV screen for an hour is probably not a good plan for keeping good eyesight.

Now one thinks immediately of the damage caused by the sun if you look at it. But that damage may be mostly from the UV component of the sun's spectrum.

Anyway...just some thoughts stimulated by a very good assessment of another part of our cultural mythology. Like many myths, it starts with some reality, and then quickly gets inflated to all sorts of nonsense.

Leisurely said...

So you're telling me that Gamma Rays will not turn me into a large green beast?? I don't think I'm buying that. Next you'll tell me that men really did go to the moon!! What a maroon.

Scott said...

It's all about energy, quantity, and duration. Visible light, as the Astronomer has pointed out, can damage eyes if there is enough quantity and duration (such as staring for long periods of time at a light bulb). The same would be true for a TV. However, it would take many, many hours of nonstop staring, I would think.

As for gamma rays...again, it's energy, quantity, and duration. A single gamma ray photon would not hurt you, no. But gamma rays are the most energetic form of EM radiation, and you wouldn't want to get exposed to them if you can help it. That's what makes radioactive elements dangerous - they give off gamma rays as a byproduct, as well as alpha and beta particle radiation - which is a different kind of radiation altogether.
