In this essay, we will explore the roles played by the various types of radiation in causing cancer.
Essay on Radiation and Cancer
Essay # 1. Sunlight, Ultraviolet Radiation and Cancer:
Several types of radiation, differing from one another in source, energy, and wavelength, are known to cause cancer. Among these, sunlight is responsible for the most cancer cases. In the United States, about half of all newly diagnosed cancers—roughly 1.3 million new cases each year—are skin cancers caused by the sun, giving sunlight the dubious distinction of causing more people to develop cancer than all other carcinogens combined.
Skin Cancer Risk is Related to Sunlight Exposure and Intensity:
The idea that sunlight can trigger cancer initially came from the discovery that skin cancer rates are elevated in people who spend long hours in the sun, especially in areas of the world where the sunlight is intense. A striking example involves the population of Australia, where skin cancer rates are high for reasons based on a quirk of eighteenth-century history.
During the 1780s, the British House of Commons decided to deal with overcrowding in British jails by banishing criminals to the (then) remote island of Australia. Within a few decades, the east coast of Australia came to be inhabited by light-skinned British men and women whose descendants now represent a large part of the Australian population.
The white skin and fair complexion of these people makes them particularly vulnerable to the intense Australian sunlight, and as a result, the white population of Australia has the highest skin cancer rate of any people in the world. Such high rates cannot be explained by hereditary factors because in England, where the sun is weaker and often covered by clouds, this same group of people had low skin cancer rates.
The Australian episode is particularly striking because it created the world’s biggest skin cancer epidemic, but similar associations between sunlight and skin cancer have been observed in other contexts.
For example, skin cancer rates vary in different regions of the United States, depending on the average amount of sunlight received. In southern states with intense sunlight, such as Texas and New Mexico, skin cancer is more frequent than in northern states such as Iowa and Michigan, where it is cloudier and the sunlight is less intense (Figure 1).
As would be expected if sunlight were responsible, skin cancers arise most frequently on parts of the body that are routinely exposed to the sun, such as the face, neck, arms, and hands. Skin cancer rarely affects areas of the body that are normally covered by clothing, where the sun cannot reach.
Because skin cancers are located on the outside of the body, they tend to be noticed and diagnosed at earlier stages than other cancers. As a result, most skin cancers can be removed before invasion and metastasis have occurred and cure rates for skin cancer are very high (around 99%). Nonetheless, some forms of skin cancer do represent a significant hazard.
Of the three forms of skin cancer that are routinely distinguished from one another, basal cell carcinomas are the most common. Basal cell carcinomas account for 75% of all skin cancers but cause few deaths because the tumors metastasize in less than one patient per thousand.
The next most frequent type of skin cancer is squamous cell carcinoma, which accounts for about 20% of all skin cancers. Squamous cell carcinomas are more serious than basal cell carcinomas because they metastasize more frequently, although metastasis still occurs in only one out of twenty patients.
The most serious form of skin cancer is melanoma, a type of cancer that arises from pigment cells called melanocytes. Melanomas account for only 5% of all skin cancers but are the most dangerous because they frequently metastasize, often before the tumor has even been noticed.
For this reason, melanomas account for the vast majority of skin cancer fatalities (Figure 2). Basal and squamous cell carcinomas, being less hazardous than melanomas, are often lumped together and referred to as nonmelanoma skin cancers.
Sunlight has been implicated in causing both melanoma and nonmelanoma skin cancers, although the nature of the connection differs for the two groups of cancers. Nonmelanoma skin cancers arise mainly in individuals who are exposed to the sun on a regular basis and cancer tends to appear in locations that receive the most sunlight, such as the face, arms, and hands.
In contrast, melanomas often arise in areas of the body that are not routinely exposed to the sun, such as the legs and back, and tend to occur in people who work indoors but have intense periodic exposures to sunlight on weekends or vacations or who had intense sunburn episodes when they were young.
Sunlight Contains Several Classes of UV Radiation:
To explain how sunlight causes cancer, we need to describe the types of radiation given off by the sun. The sunlight that reaches the earth contains several forms of electromagnetic radiation, which is defined as waves of electric and magnetic fields that are propagated through space at the speed of light. Electromagnetic radiation occurs in a variety of forms that differ in wavelength and energy content (Figure 3).
Wavelength and energy are inversely related to each other—that is, radiation of shorter wavelength possesses more energy than radiation of longer wavelength. The longest-wavelength component of sunlight is infrared radiation, which creates the warmth we feel from the sun.
Next comes visible light, which is of shorter wavelength than infrared radiation and provides the illumination that allows us to see colors. Finally, ultraviolet radiation (UV) is the shortest-wavelength component of sunlight and possesses the greatest energy, making it capable of inflicting damage on human tissues.
The ultraviolet radiation in sunlight is in turn subdivided into three classes—A, B, and C—in order of decreasing wavelength (Table 1). UVA has the longest wavelength and the least energy. Defined as the portion of the UV spectrum whose wavelength falls between 315 and 400 nanometers (nm), UVA is the predominant type of ultraviolet radiation to reach the earth because it is not filtered out by the earth’s atmosphere.
UVA was once thought to be harmless because of its lower energy content, but long-term exposure to UVA is now known to cause aging of the skin and to act as a promoting agent for skin cancer by stimulating cell proliferation.
UVB radiation is of higher energy than UVA, falling in the wavelength range of 280 to 315 nm. Animal studies have shown that UVB is largely responsible for the carcinogenic properties of sunlight. More than 90% of the UVB radiation emitted by the sun is absorbed by ozone molecules present in the upper atmosphere, but enough UVB passes through to the earth’s surface to cause sunburn, tanning, aging of the skin, and skin cancer.
During the latter half of the twentieth century, the earth’s ozone layer was partially destroyed by chemicals called chlorofluorocarbons (CFCs), which were used as refrigerants and for several industrial purposes. When they escape into the atmosphere, CFCs react with and destroy ozone molecules.
Although the production of CFCs is now being phased out, the ozone layer will take several decades to recover, and ozone depletion may be causing skin cancer rates to rise because more UVB radiation is currently reaching the earth.
Finally, UVC falls in the wavelength range of 100 to 280 nm and is the most energetic type of UV radiation emitted by the sun. This high-energy, short-wavelength form of UV radiation can cause severe burns, but it is completely absorbed by the upper layers of the atmosphere before reaching the earth. UVC radiation is generally encountered only from artificial light sources, such as the germicidal lamps that use UVC to destroy bacteria when sterilizing medical and scientific equipment.
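To make the inverse relationship between wavelength and energy concrete, the short Python sketch below (an illustration added here, with the band boundaries taken from Table 1 and standard physical constants assumed) classifies a wavelength into its UV band and computes the corresponding photon energy from E = hc/λ.

```python
# A minimal sketch (added illustration): classify a wavelength into the UV
# bands of Table 1 and compute its photon energy from E = hc / wavelength.

PLANCK_H = 6.626e-34           # Planck's constant, joule-seconds
LIGHT_C = 2.998e8              # speed of light, meters per second
JOULES_PER_EV = 1.602e-19      # one electron volt, in joules

def uv_band(wavelength_nm: float) -> str:
    """Return the UV class for a wavelength, using the Table 1 boundaries."""
    if 315 <= wavelength_nm <= 400:
        return "UVA"
    if 280 <= wavelength_nm < 315:
        return "UVB"
    if 100 <= wavelength_nm < 280:
        return "UVC"
    return "not ultraviolet"

def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy in electron volts; shorter wavelength -> more energy."""
    return PLANCK_H * LIGHT_C / (wavelength_nm * 1e-9) / JOULES_PER_EV

for nm in (400, 315, 280, 100):
    print(f"{nm} nm: {uv_band(nm)}, {photon_energy_ev(nm):.1f} eV")
```

Running it shows, for example, that a 100 nm UVC photon carries roughly four times the energy of a 400 nm UVA photon.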
UVB Radiation Creates Pyrimidine Dimers in DNA:
UVB is the highest-energy component of sunlight to reach the earth, but its energy level is still relatively low and thus it cannot penetrate very far into the body. Instead, UVB is absorbed by cells located in the outer layers of the skin, which explains why sunlight rarely causes any type of malignancy other than skin cancer. The damaging effects of UVB on skin cells often precede the development of cancer by many years.
For example, consider what happens to people who move from England, with its weak sunlight and cloudy skies, to the intensely sunny climate of Australia. Those who move to Australia when they are young develop skin cancer at high rates when they reach middle age, whereas those who move to Australia later in life retain the low skin cancer rates that are typical of people who remain in England. Such observations suggest that skin cancers observed later in life are the result of sunlight damage that occurred many years earlier.
Such a pattern is reminiscent of the initiation phase of chemical carcinogenesis, in which carcinogens trigger DNA mutations that persist for many years, passed from one cell generation to the next as genetically damaged cells proliferate and give rise to tumors. By analogy, researchers have looked to see whether sunlight causes skin cell mutations early in life that can be linked to the later development of cancer.
This is a complicated task because even if mutations are discovered in skin cancer cells, how can you be certain that sunlight caused them? A useful clue has come from studying the interactions of UVB—the main carcinogenic component of sunlight—with different kinds of cells and viruses. The shorter wavelengths of UVB (near 280 nm) are absorbed by the DNA bases, imparting enough energy to alter chemical bonds.
The most common reaction occurs in regions containing the bases cytosine (C) and thymine (T), a class of single-ring bases known as pyrimidines. In locations where two of these pyrimidine bases lie next to each other, absorption of UVB radiation triggers the formation of covalent bonds between the adjacent bases, creating a distinctive type of DNA lesion called a pyrimidine dimer. All four combinations of two adjacent pyrimidines—that is, CC, CT, TC, and TT—are frequently converted into covalently linked dimers by UVB radiation.
Although cells can repair pyrimidine dimers, repair needs to occur before DNA replication creates a permanent, non-correctable mutation. Figure 4 illustrates how such a permanent mutation could arise, using a CC dimer as an example. During DNA replication, the DNA strand in the region of the CC dimer is distorted and therefore tends to pair improperly with bases in the newly forming DNA strand.
Instead of pairing correctly with its complementary base G, the base C in a CC dimer often pairs incorrectly with the base A (Figure 4, step 2). During the next round of DNA replication, the incorrectly inserted A will then form a base pair with its normal complementary base, T, creating an AT base pair (Figure 4, step 3). This AT base pair now looks normal to the cellular DNA repair machinery and so will continue to be replicated as if no error had been introduced.
Because the base T resides where the base C had been located in the original DNA molecule, the preceding type of mutation is called a C→T substitution. In some cases, both C’s of the dimer are replaced by the same mechanism, creating a CC→TT mutation.
At this point the initial CC dimer in the original DNA strand could be repaired, but the C→T or CC→TT substitution in the newly replicated DNA will be permanent. Such base substitution patterns involving adjacent pyrimidines are unique to UV radiation and are therefore used as a distinctive “signature” to identify mutations caused by sunlight.
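The two-replication logic of Figure 4 can be expressed as a toy simulation. The hypothetical Python below deliberately ignores strand orientation and repair, and simply applies the mispairing rule described above (a C inside an unrepaired dimer pairs with A) during one round of copying.

```python
# A toy simulation (hypothetical code) of Figure 4: an unrepaired CC dimer
# becomes a permanent CC -> TT substitution after two rounds of replication.
# Strand orientation and repair are deliberately ignored for clarity.

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def replicate(template: str, dimer_positions=()) -> str:
    """Copy a strand; a C inside an unrepaired dimer mispairs with A."""
    copy = []
    for i, base in enumerate(template):
        if i in dimer_positions and base == "C":
            copy.append("A")               # step 2: dimer C pairs with A, not G
        else:
            copy.append(COMPLEMENT[base])  # normal Watson-Crick pairing
    return "".join(copy)

original = "GACCTG"                        # the CC at positions 2-3 is a dimer
daughter = replicate(original, dimer_positions=(2, 3))
granddaughter = replicate(daughter)        # step 3: A now pairs normally with T
print(original, "->", daughter, "->", granddaughter)
# GACCTG -> CTAAAC -> GATTTG
```

Starting from GACCTG with a dimer at the central CC, two rounds of copying yield GATTTG, reproducing the CC→TT “signature” substitution.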
Mutations in the p53 Gene Triggered by UVB Radiation can Lead to Skin Cancer:
After scientists discovered that UV radiation selectively induces the formation of pyrimidine dimers, the next task was to determine whether these mutations are associated with skin cancer. Among the first genes to be examined for the presence of UV-induced mutations was the p53 gene, a gene chosen for study because it is known to be mutated in many kinds of human cancer.
When skin cancer cells are examined for the presence of p53 mutations, nonmelanoma skin cancers are routinely found to exhibit p53 mutations with the distinctive UV “signature”—that is, C→T or CC→TT substitutions at dipyrimidine sites. In contrast, the p53 mutations arising in cancers of internal body organs do not generally exhibit this UV-specific pattern (Figure 5).
The preceding observations indicate that the p53 mutations seen in nonmelanoma skin cancer cells are triggered by sunlight, but do these mutations actually cause cancer to arise, or are they simply an irrelevant byproduct of long-term exposure to sunlight? This question can be resolved by looking at the precise location of UV-induced mutations within the p53 gene.
The DNA base sequence of most genes is arranged in a series of three-base units called codons, each of which specifies a particular amino acid in the protein encoded by the gene. Typically, the first two bases of a codon are more important in determining the amino acid than is the third. For example, the codons GAA and GAG both specify the same amino acid (glutamic acid), so changing the third base from A to G in this codon does not change the amino acid. A similar principle applies to the codons for many other amino acids.
If the p53 mutations seen in non-melanoma skin cancers were simply a random by-product of sunlight exposure, mutations in a codon’s third base (which usually do not change the amino acid) should be as frequent as mutations in the first or second base (which usually do). In fact, DNA sequencing has revealed that p53 mutations are not randomly distributed but instead preferentially involve base changes that alter amino acids.
In other words, the p53 mutations seen in non-melanoma skin cancers alter the amino acid sequence of the protein encoded by the p53 gene, as would be expected if these mutations are involved in the mechanism by which sunlight causes cancer. The p53 gene is not, however, the only mutant gene to be involved in non-melanoma skin cancers, nor is it as frequently mutated in melanomas.
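As a sketch of this reasoning, the snippet below uses a hand-picked fragment of the standard genetic code (an illustrative dictionary, not a complete codon table) to test whether a given base change is “silent,” that is, leaves the encoded amino acid unchanged.

```python
# A sketch using a hand-picked fragment of the standard genetic code
# (this dictionary is illustrative, not a complete codon table).

CODON_TABLE = {
    "GAA": "Glu", "GAG": "Glu",                              # glutamic acid
    "GGA": "Gly", "GGC": "Gly", "GGG": "Gly", "GGT": "Gly",  # glycine
    "AAA": "Lys",                                            # lysine
}

def is_silent(codon: str, mutant: str) -> bool:
    """True if the mutation leaves the encoded amino acid unchanged."""
    return CODON_TABLE[codon] == CODON_TABLE[mutant]

print(is_silent("GAA", "GAG"))  # True: third-base change, same amino acid
print(is_silent("GAA", "AAA"))  # False: first-base change, Glu -> Lys
```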
The p53 Protein Protects against Skin Cancer by Preventing Cells with Damaged DNA from Proliferating:
Given that UV radiation triggers mutations that alter the p53 protein of skin cells, the question arises as to how these p53 abnormalities lead to cancer. The normal function of the p53 protein is to stop cells with damaged DNA from proliferating. In the presence of damaged DNA, the p53 protein accumulates and activates a pathway that halts the cell cycle, thereby allowing time for DNA repair.
If the DNA damage is too severe to be repaired, the p53 protein eventually triggers cell suicide by apoptosis. The net result is that cells containing mutations that might lead to cancer are not allowed to proliferate.
But what happens if sunlight happens to trigger a mutation that renders the p53 protein nonfunctional? In such a case, DNA damage caused by subsequent exposure to sunlight will not be able to trigger p53-mediated apoptosis, even if the DNA is damaged beyond repair. Damaged DNA will therefore be passed on to future cell generations during ensuing cell divisions, creating conditions that can lead to the development of cancer (Figure 6).
Sunlight-induced mutation of the p53 gene is thus comparable to the initiation stage of chemical carcinogenesis, in which an initial mutation creates a precancerous cell that is later converted into a tumor by a promotion phase involving sustained cell proliferation.
In addition to initiating carcinogenesis by mutating the p53 gene, sunlight also affects tumor promotion through an indirect mechanism involving neighboring cells. An initiated, precancerous skin cell that has incurred a p53 mutation will be surrounded by neighboring cells in which the p53 gene is normal.
If these surrounding cells sustain extensive DNA damage during subsequent episodes of sunlight exposure, the normal operation of the p53 protein will cause them to commit suicide by apoptosis. The dying cells leave behind a space that needs to be filled, thereby creating conditions that allow the uncontrolled proliferation of p53-mutated cells to replace the dying cells. Sunlight therefore acts as a complete carcinogen, functioning in both the initiation (mutation) and tumor promotion stages.
The preceding model provides an interesting perspective on the phenomenon of sunburn, the reddening and peeling of the skin that is commonly observed after intense sunlight exposure. Microscopic and biochemical examination of sunburned, peeling skin cells reveals that they look like cells being destroyed by apoptosis.
Does that mean that a sunburn is simply the reflection of p53-mediated apoptosis triggered by sunlight-induced DNA damage? An answer has come from studies involving a mutant strain of mice possessing a defective p53 gene. When the skin of such mice is exposed to intense UV radiation, fewer sunburned, apoptotic cells are observed than in normal mice exposed to the same radiation.
It therefore appears that sunburn merely reflects the p53-mediated destruction of cells containing damaged DNA. So despite the pain, a sunburn is a protective mechanism, a deliberate effort to avert the development of cancer by destroying cells that have been damaged by UV radiation.
Clothing, Sunscreens, Skin Pigmentation, and Avoiding Strong Sunlight are all helpful in Decreasing Skin Cancer Risk:
Because sunlight is its main cause, skin cancer is one of the easiest kinds of cancer to prevent. The best protection is afforded by simply staying out of the sun, especially at midday when sunlight is most intense.
People living in tropical climates have traditionally taken a siesta at this time of day, a practice born of common sense and immortalized in a Noel Coward song whose lyrics state that only “mad dogs and Englishmen go out in the midday sun.” (The English, of course, can safely go outdoors at midday when living in England because the sun is relatively weak and often covered by clouds; going out at midday only creates a significant cancer problem for the English when they travel to tropical climates.)
Although staying out of the sun provides the best protection, it is often impractical advice because occupational or recreational activities may require people to spend prolonged hours outdoors. Since only exposed areas of the body tend to be affected, the risk of developing skin cancer can be minimized by wearing protective clothing such as hats, long-sleeved shirts, and pants. Recently developed lightweight fabrics that exhibit enhanced sun-blocking properties, even when wet, are especially useful for this purpose.
Protection can also be provided by sunscreen lotions, which contain substances that prevent UV radiation from reaching skin cells. Sunscreens are of two different types: physical sunscreens that reflect UV radiation and chemical sunscreens that absorb it. The most popular physical sunscreens, zinc oxide and titanium dioxide, reflect both UVA and UVB radiation.
Early lotions containing zinc oxide or titanium dioxide had the disadvantage of coloring the skin white, but these substances are now manufactured with particles so small that the white coloring is almost invisible. Unlike physical sunscreens, chemical sunscreens preferentially absorb either UVA or UVB radiation.
Para-aminobenzoic acid (PABA), which absorbs UVB radiation, was introduced in the early 1970s and became the first widely used chemical sunscreen. However, PABA was subsequently found to elicit allergic reactions and has been largely replaced with other UVB-absorbing molecules, such as cinnamates, salicylates, and octocrylene. For absorbing UVA radiation, benzophenones and avobenzone are the most common chemical sunscreens.
Modern “broad-spectrum” sunscreen lotions usually contain a mixture of the preceding ingredients to provide maximum protection against both UVB radiation, the main cause of DNA mutation and skin cancer, and UVA radiation, which stimulates cell proliferation and can therefore promote tumor development. Sunscreens clearly diminish the risk of sunburn, but their effectiveness in decreasing skin cancer risk has been more difficult to assess.
One complication is that skin cancer can take 15 to 30 years to develop. Because the use of broad-spectrum sunscreens is a relatively recent phenomenon, it may take several decades before epidemiological studies will detect any effects on skin cancer rates. In addition, the use of sunscreen lotions often lets people stay outdoors longer, so the protective effects of sunscreens might be offset by extra time spent in the sun.
Although it may therefore take many years before the effects of sunscreens on skin cancer rates are precisely quantified, their ability to block UV radiation and prevent sun-burning justifies their continued use. (Note: The same cannot be said for so-called sunless tanning lotions, which contain a chemical that directly imparts a bronze-like color to the skin; such products do not absorb UV radiation and thus afford no protection against skin cancer.)
Like broad-spectrum sunscreens, the natural pigments found in the skin also absorb UV radiation. As a result, people with darkly pigmented skin (e.g., Africans, Hispanics, and Australian Aborigines) have lower skin cancer rates than people with lightly pigmented skin. Skin pigmentation comes from melanin, a family of brown pigments synthesized by melanocytes present in the skin.
Regular exposure to sunlight stimulates melanin production, thereby causing a darkening of the skin or “suntan.” A suntan helps protect the skin during subsequent exposures to sunlight because the melanin molecules absorb UV radiation. For this reason, some people use artificial tanning devices—such as sunlamps or commercial tanning beds—to start a suntan before vacationing in a sunny climate.
Such tanning devices have even been claimed to be “safer” than sunlight because most of the radiation is UVA rather than UVB. However, people who use such devices develop skin cancer at higher rates than people who do not, even when their total sun exposure is equivalent.
The popularity of artificial tanning devices is therefore of considerable concern, especially in view of reports that 50% of high school-aged girls in certain areas of the northern United States use commercial tanning beds on a regular basis.
Essay # 2. Ionizing Radiation and Cancer:
Although UV radiation is responsible for more cases of cancer than all other carcinogens combined, its inability to penetrate very far into the body means that it only causes skin cancer, which is often easy to cure.
We now turn our attention to higher-energy forms of radiation that penetrate into the body and can therefore cause cancer to arise in internal organs. This type of radiation is called ionizing radiation because it removes electrons from biological molecules, thereby generating highly reactive ions that damage DNA in various ways.
X-Rays Penetrate through Body Tissues and Cause Cancers of Internal Cells and Organs:
In 1895, the German physicist Wilhelm Roentgen accidentally discovered the first form of ionizing radiation that would turn out to cause cancer in humans. Roentgen was passing an electric current through a partially evacuated glass tube, called a cathode-ray tube, when he noticed that a fluorescent screen located across the room began to glow.
Even after he covered the cathode-ray tube with black paper and moved it to another room, the screen glowed when the cathode-ray tube was turned on. Most astonishing, however, was the discovery that an image of the bones in Roentgen’s hand appeared on the screen when he placed his hand between the cathode-ray tube and the screen.
Radiation exhibiting such unusual properties was completely unknown at that time, so Roentgen named it X-rays. In recognition of the importance of this discovery, Roentgen was awarded the first Nobel Prize in Physics in 1901.
X-rays are a type of electromagnetic radiation exhibiting a wavelength shorter than that of UV radiation (see Figure 3). Because of their short wavelength, X-rays are highly energetic and will pass through many materials that cannot be penetrated by UV radiation, visible light, or other weaker forms of electromagnetic radiation. This is the property that allows X-rays to be used for viewing the inside of objects such as the human body.
Shortly after X-rays were discovered in 1895, newspaper headlines proclaimed “new light sees through flesh to bones!” and X-ray studios were opened around the country so that people could have “bone portraits” taken of themselves, even if they had no health problems! And doctors, of course, quickly embraced the new tool, which was to revolutionize many aspects of medical diagnosis and treatment.
Unfortunately, medical practitioners and researchers were slow to recognize the hazards of X-rays. An early danger signal came from the laboratory of Thomas Edison, whose research technician routinely tested X-ray equipment by using it to take pictures of his own hands. The technician soon developed severe radiation burns and cancer arose in the burned tissue. Although both his arms were subsequently amputated, he died of metastatic cancer in 1904, the first cancer fatality attributed to X-rays.
More cancers appeared in the next few decades as doctors specializing in the use of X-rays (radiologists) began to develop leukemia at rates several times higher than normal. The suspicion that X-rays were causing these cancers was eventually confirmed by animal studies, which showed that animals exposed to X-rays develop cancer at rates that are directly proportional to the dose of radiation received (Figure 7). The risk of leukemia is especially elevated, but X-rays pose a cancer threat to almost every tissue of the body.
Human exposure to such hazards could be virtually eliminated if X-rays served no useful purpose, but in many situations the health benefits to be gained from medical X-rays far outweigh the risk of inducing cancer.
That is not the case in all situations, however. Between the 1920s and 1950s, some doctors used high-dose X-rays to treat children with superficial skin conditions of the head and neck, such as ringworm and acne. Later in life, these individuals developed thyroid cancer at much higher rates than normal. Thus, the medical benefits to be gained from every X-ray procedure need to be prudently weighed against the increased cancer risk.
Radioactive Elements Emit Alpha, Beta, and Gamma Radiation:
A year after the discovery of X-rays by Roentgen in 1895, the French physicist Henri Becquerel discovered another form of radiation called radioactivity, which is emitted by chemical elements that are intrinsically unstable.
Some elements are naturally radioactive, whereas others are created artificially. The radiation emitted by a radioactive element emanates from an unstable atomic nucleus and is therefore referred to as nuclear radiation.
There are three main forms of nuclear radiation, known as alpha (α), beta (β), and gamma (γ) radiation. Alpha and beta radiation both involve streams of charged particles of matter. Alpha particles are positively charged entities composed of two neutrons plus two protons (the nucleus of one helium atom); beta particles are electrons and therefore exhibit a negative charge.
In addition to these particulate forms of nuclear radiation, some radioactive elements emit gamma rays, which are a type of electromagnetic radiation and therefore exhibit no mass or charge. The wavelength of gamma rays is shorter than that of X-rays, making them the most energetic form of electromagnetic radiation (see Figure 3).
One of the first scientists to work with radioactivity was Marie Curie, co-discoverer of two naturally radioactive elements, polonium and radium. Awarded two Nobel Prizes (one in physics and one in chemistry) for her pioneering work on radioactivity, she suffered severe radiation burns and eventually died of leukemia, presumably caused by extensive exposure to nuclear radiation.
Marie Curie’s daughter, Irene Joliot-Curie, followed in her mother’s footsteps and eventually received a Nobel Prize in Chemistry for showing that radioactive elements can be artificially created in the laboratory by bombarding chemically stable elements with high-energy radiation. Like dozens of other early workers who handled radioactive materials, Irene Joliot-Curie also died of cancer.
Radiation Dose is measured in Grays and Sieverts to Account for Differences in Tissue Absorption and Damage:
As with other carcinogens, the cancer risk associated with nuclear radiation is directly related to the dose received. Measuring radiation dosage is complicated because various types of radiation differ both in their energy content and in their ability to penetrate and damage biological tissues. To take these variables into account, several measurement units are used when describing radiation doses (Table 2).
The basic unit of radiation energy is the electron volt (eV), which expresses the total amount of energy present. However, such a measurement does not reflect how much energy is absorbed when a given type of radiation interacts with biological tissue. An additional unit of measurement, called the gray (Gy), is therefore used to describe how much energy is actually absorbed.
Even when two types of radiation are absorbed in equal amounts, they may differ in their ability to damage biological tissue. To account for such differences, the absorbed dose (measured in grays) is commonly multiplied by a correction factor called the relative biological effectiveness (RBE) to yield a biologically equivalent dose (or simply equivalent dose) measured in units called sieverts (Sv).
As a standard of reference, X-rays are defined as having an RBE = 1; the tissue-damaging potencies of other forms of radiation are then compared with that of X-rays, and each type of radiation is assigned an RBE value that reflects its relative effectiveness.
For example, alpha particles have an RBE = 20, which means that alpha particles are 20 times more effective than X-rays in causing tissue damage. Therefore, exposure to 1 Gy of alpha particles corresponds to an equivalent dose of 20 Sv (the absorbed dose in grays multiplied by RBE = 20).
In contrast, exposure to 1 Gy of X-rays corresponds to an equivalent dose of only 1 Sv (the absorbed dose in grays multiplied by RBE = 1). The reason for expressing radiation dose in sieverts rather than grays is that it provides a better indication of how much biological damage a given exposure to radiation will cause. (Note: The gray and sievert replace older historical units of radiation exposure called rads [= 0.01 Gy] and rems [= 0.01 Sv], respectively.)
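The gray-to-sievert conversion is simple enough to express directly. In the sketch below, the RBE values for X-rays and alpha particles come from the text; assigning gamma rays an RBE of 1 is a conventional assumption added here, not a figure from the essay.

```python
# A minimal sketch: equivalent dose (Sv) = absorbed dose (Gy) x RBE.

RBE = {
    "x_rays": 1,            # X-rays define the reference, RBE = 1
    "gamma_rays": 1,        # assumption: photons are conventionally RBE = 1
    "alpha_particles": 20,  # 20 times more damaging than X-rays (see text)
}

def equivalent_dose_sv(absorbed_dose_gy: float, radiation: str) -> float:
    """Convert an absorbed dose in grays to an equivalent dose in sieverts."""
    return absorbed_dose_gy * RBE[radiation]

print(equivalent_dose_sv(1.0, "x_rays"))           # 1.0 Sv
print(equivalent_dose_sv(1.0, "alpha_particles"))  # 20.0 Sv
```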
Radon, Polonium, and Radium Emit Alpha Particles that can Cause Cancer in Humans:
Of the three main types of nuclear radiation (alpha, beta, and gamma), alpha particles are potentially the most hazardous because they are highly damaging to biological tissues and because many radioactive elements emit them. Nonetheless, external exposure of the body to alpha emitters involves little danger: alpha particles are relatively large and are easily blocked by most materials, so they cannot penetrate very far into biological tissue.
From outside the body, alpha particles penetrate no more than about 50 micrometers into the skin, which does not carry them beyond the outermost layer of dead skin cells.
The situation is different when an alpha-emitting radioactive substance is inhaled or ingested, bringing the radioactivity into direct contact with living cells. That is what happens with radon, a radioactive gas produced during the spontaneous breakdown of radium (which is itself produced from the spontaneous breakdown of uranium).
In regions of the country where large amounts of radium are present in underground rock formations (Figure 8), the radium gives rise to radioactive radon gas that seeps out of the earth and can accumulate in buildings if the ventilation is inadequate. When a radioactive atom of radon emits an alpha particle, the atom is converted into polonium, a radioactive metal that forms tiny particles that may be inhaled and become lodged in a person’s lungs.
The subsequent radioactive decay of polonium produces more alpha particles and a series of additional radioactive elements, which emit yet more radiation. The alpha particles released during these events enter the cells lining the inner surface of the lung, causing DNA mutations that can initiate the development of cancer.
As a consequence, increased lung cancer rates are observed in people who have been exposed to high levels of environmental radon, especially in those individuals who also smoke cigarettes.
Exposure to environmental radon is not the only explanation for radioactivity that might be present in a person’s lungs. Cigarette smoke is also radioactive, in this case because of the fertilizers that are used for growing commercial tobacco.
Most of these fertilizers contain phosphate derived from crushed rocks that contain trapped radon gas, which decays to form radioactive polonium; the radioactive polonium in turn appears both in cigarette smoke and in high concentration in the lungs of cigarette smokers. Animal studies indicate that radioactive polonium is one of the more potent carcinogens present in tobacco smoke.
When radioactive radon or polonium is inhaled, the lung is the initial organ encountered by the inhaled material and is therefore the main site where cancer appears. For other radioactive elements, specific chemical properties may cause the inhaled or ingested radioactivity to become concentrated in a different location.
A striking illustration of this principle dates back to the 1920s and involves a group of women working in a New Jersey factory that produced watch dials that glow in the dark. The luminescent paint used in painting the dials contained radium, a radioactive element that resembles calcium in some of its chemical properties.
The radium paint was applied using a fine-tipped brush that the employees frequently wetted with their tongues. As a result, minute quantities of radium were inadvertently ingested and, like calcium, the radium became concentrated in bone tissue. Many of these women subsequently developed bone cancer caused by the radioactive radium that had become concentrated in their bones.
Nuclear Explosions have Exposed People to Massive Doses of Ionizing Radiation:
The most dramatic episode of large-dose exposure to ionizing radiation involved the atomic bombs that were exploded over the Japanese cities of Hiroshima and Nagasaki at the end of World War II.
People who survived the initial blast and the short-term toxic effects of the massive amounts of ionizing radiation released by the explosions later developed cancer at higher-than-normal rates. Leukemia initially appeared to be the predominant cancer, but it is now clear that many other kinds of cancer were caused as well.
The ability of ionizing radiation to cause leukemia versus other types of cancer tends to be overestimated for two reasons. First, leukemia is not as common as many other kinds of cancer, so increases in leukemia rates caused by radiation exposure are easier to spot. Second, the latent period is generally shorter for leukemia than for other cancers, so leukemias are the first cancers to be seen.
Leukemia rates began to increase within the first 2 years after the atomic explosions in Japan, whereas other cancer rates did not rise until 10 to 15 years later. As a result, initial reports tended to emphasize the effects of radiation on leukemia rates. The long-term data now make it clear that leukemias accounted for only about 15% of the total number of cancer deaths caused by the two atomic explosions (Figure 9).
Many of the more common types of cancer—including those of the breast, lung, colon, stomach, ovary, and uterus—also increased in response to the massive doses of radiation received by the citizens of Hiroshima and Nagasaki.
Individuals who survived the initial impact of the atomic bombs were located at varying distances from the explosions and therefore received differing doses of radiation. Using such information to estimate the doses received by different individuals, scientists have constructed dose-response curves for radiation exposure versus subsequent cancer risk (Figure 10).
As expected, the data reveal that cancer rates go up as the radiation dose is increased, both for leukemia and for other types of cancer. However, a transient decline in cancer rates occurs at very high doses of radiation, which is thought to reflect the fact that at higher doses, radiation kills cells in addition to causing mutations that can lead to cancer.
The pattern of increasing cancer risk may therefore be disrupted at high radiation doses because the probability that a cell will be killed by radiation may be as great as (or greater than) the probability that it will sustain a cancer-inducing mutation.
Japan is not the only place where nuclear explosions have triggered cancer outbreaks. During the 1950s, the United States tested its nuclear weapons by exploding them in the open deserts of Nevada. In contrast to the situation in Japan, where a large population was exposed to whole-body radiation from gamma rays and neutrons emitted by the initial explosion, relatively few people were close enough to the Nevada test sites to receive much direct radiation.
However, radioactive particles created during the explosions were carried by the prevailing winds into southwestern Utah, where individuals exposed to the radioactive fallout suffered an increased incidence of leukemia several decades later. Recent reports suggest that most people living in the United States have been exposed to at least a little bit of the radioactive fallout produced by these tests, although the exact impact on cancer rates has been difficult to estimate.
Another nuclear radiation incident happened in 1986 at the Chernobyl nuclear power plant in the former Soviet Union (now Ukraine). During a routine test, one of the plant’s reactors went out of control and exploded, discharging several hundred Hiroshima bombs’ worth of radioactive fallout across a large portion of Eastern Europe.
Although many radioactive chemicals were released by the Chernobyl explosion, beta-particle-emitting forms of iodine represented a large fraction. When iodine is ingested, it becomes concentrated in the thyroid gland so efficiently that the radiation dose experienced by the thyroid is 1000 to 2000 times higher than the average body dose.
It is therefore not surprising that, a few years after the accident, thyroid cancer rates in children were almost one hundred times higher than normal in regions receiving the largest amounts of radioactive fallout. These thyroid cancers represent the largest number of cancers of one particular type ever triggered by a single event.
Ionizing Radiation Initiates Carcinogenesis by Causing DNA Damage:
As was the case for carcinogenic chemicals and UV radiation, DNA damage lies at the heart of the mechanism by which ionizing radiation causes cancer. The ability of ionizing radiation to trigger mutations was first described in the 1920s by Hermann Muller in studies involving fruit flies. When the mutation rate is plotted against the dose of ionizing radiation, the dose-response curve appears to be linear over a wide range of radiation doses.
In contrast to UV radiation, which creates a distinctive type of DNA mutation (pyrimidine dimers), ionizing radiation damages DNA in a variety of ways (Figure 11). By definition, ionizing radiation strips away electrons from molecules, generating highly unstable ions that rapidly undergo chemical changes and break chemical bonds. Because roughly 80% of the mass of a typical cell is accounted for by water molecules, many of the bonds broken by ionizing radiation reside in water.
The disruption of water molecules produces highly reactive fragments called free radicals, a general term that refers to any atom or molecule containing an unpaired electron. The presence of an unpaired electron makes free radicals extremely reactive.
One of the free radicals produced when ionizing radiation interacts with water is the hydroxyl radical (OH), which readily attaches itself to DNA bases. The presence of these added hydroxyl groups alters the base-pairing properties of the bases during DNA replication, leading to various mutations.
In addition to generating water-derived free radicals, ionizing radiation also attacks DNA directly, stripping away electrons and breaking bonds. Such reactions cleave the bonds that join bases to the DNA backbone, thereby causing individual bases to be lost; ionizing radiation also attacks the DNA backbone itself, creating single- or double-strand breaks in the DNA double helix.
Fortunately, it is relatively easy to repair single-strand breaks or the loss of individual bases because the opposite DNA strand of the double helix remains intact and serves as a template for fixing the defective strand by normal repair mechanisms.
Double-strand breaks are more difficult to fix, and imperfect attempts at repair may create localized mutations in the region of the break or larger-scale alterations, such as major deletions or sequence rearrangements.
If double-strand breaks occur in more than one chromosome, DNA derived from two different chromosomes may be mistakenly joined together. The result is a chromosomal translocation in which a segment of one chromosome is physically joined to another chromosome.
It usually takes many years for cancer to arise following radiation-induced DNA damage. Radiation is thus acting in the initiation phase of carcinogenesis, playing a role comparable to that of mutagenic chemicals in the initiation of chemical carcinogenesis. As would be expected, treating radiation-exposed cells with promoting agents, such as phorbol esters, increases the rate at which tumors appear.
Cells that have been initiated by exposure to ionizing radiation often exhibit a persistent elevation in the rate at which new mutations and chromosomal abnormalities arise. This condition, called genetic instability, creates conditions favorable for accumulation of the subsequent mutations that are required in the stepwise progression toward malignancy.
Most Human Exposure to Ionizing Radiation comes from Natural Background Sources:
Although the ability of ionizing radiation to cause cancer is well established, many of the examples we have considered—such as nuclear explosions or occupational exposures to X-rays or radioactivity—are not particularly relevant to most people. So how much of a hazard do the various types of ionizing radiation actually pose for the typical citizen?
Figure 12 summarizes the main sources of exposure to ionizing radiation for the average person in the United States, who typically receives an annual radiation dose of about 3.6 mSv. Most of this exposure comes from natural sources of background radiation such as radioactive radon, which continually seeps out of the earth’s crust and accounts for roughly 2.0 mSv per year.
Additional sources of natural radiation include other radioactive elements present in the earth’s crust and in our bodies, plus cosmic rays, which are high-energy charged particles that bombard the earth from outer space. Exposure to cosmic rays is greater at higher altitudes where the atmosphere is thinner. For example, a round-trip transatlantic airplane flight at an altitude of 35,000 feet exposes passengers to almost as much ionizing radiation as a chest X-ray.
Taken together, the various sources of natural radiation account for about 80% of our annual exposure to ionizing radiation. The remaining 20% comes from artificial sources of radiation generated by human activities. Some of this ionizing radiation is emitted by consumer products, such as television sets and smoke detectors, but most comes from medical procedures, mainly diagnostic X-rays.
The benefits of medical X-rays almost always outweigh the small risks involved, but people sometimes express concerns about being X-rayed because they have heard that radiation causes cancer. Such individuals usually lack scientific training and are not likely to be reassured when told that a chest X-ray involves only 0.08 mSv of radiation.
It would be easier for the average person to understand how much radiation is involved in a medical X-ray if it were compared to the dose of radiation we all receive every day from natural sources of background radiation. For this reason, scientists have developed an alternative measurement unit for expressing radiation exposure known as the BERT (for Background Equivalent Radiation Time) value.
The BERT value converts a given dose of ionizing radiation into the amount of time it would take a person to receive that same dose from the natural background radiation that we are all exposed to every day. Table 3 lists the BERT values for several types of radiation.
It shows, for example, that the BERT value of a typical chest X-ray is roughly ten days. In other words, the amount of radiation received during a routine chest X-ray is equivalent to the amount of natural background radiation we all receive every ten days. Thus a chest X-ray represents a small fraction of the natural background radiation we are exposed to each year.
Living near a nuclear power plant for an entire year has a BERT value of less than one day, indicating that it would add an extremely small amount to the annual radiation dose we already receive. In contrast, smoking a pack of cigarettes per day for one year has a BERT value of about ten years, which means that such a smoker would receive ten times more ionizing radiation each year from the radioactive polonium in tobacco smoke than they do from natural background radiation.
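Computing a BERT value requires nothing more than dividing a dose by the daily background dose. The sketch below uses the 3.6 mSv annual background figure cited above (Figure 12) and the 0.08 mSv chest X-ray dose mentioned earlier; it reproduces the roughly-ten-days estimate.

```python
# A sketch of the BERT (Background Equivalent Radiation Time) calculation,
# using the average annual US background dose of 3.6 mSv cited above.

ANNUAL_BACKGROUND_MSV = 3.6
DAILY_BACKGROUND_MSV = ANNUAL_BACKGROUND_MSV / 365

def bert_days(dose_msv: float) -> float:
    """Days of natural background radiation delivering the same dose."""
    return dose_msv / DAILY_BACKGROUND_MSV

# A 0.08 mSv chest X-ray works out to about 8 days of background radiation,
# consistent with the "roughly ten days" figure in Table 3.
print(f"chest X-ray (0.08 mSv): about {bert_days(0.08):.0f} days of background")
```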
The Cancer Dangers Associated with Typical Exposures to Ionizing Radiation are Relatively Small:
The BERT concept is useful in helping people understand how much additional radiation exposure (compared to background levels) is associated with various activities, but it does not provide any direct information about cancer risk. Assessing cancer risk is a difficult task because epidemiological data cannot reliably quantify the small numbers of cancer cases that would be expected when individuals are exposed to low levels of ionizing radiation.
Scientists must therefore extrapolate from higher-dose human exposures, such as those experienced by atomic bomb survivors and radiation workers, to estimate human cancer risk at lower exposures.
Although such extrapolations are inherently imprecise, the data suggest that a person receiving a single whole-body exposure to 100 mSv of X-rays would incur about a 1% lifetime risk of developing a fatal cancer as a result of this exposure.
Since 100 mSv is roughly one thousand times higher than the dose from a routine chest X-ray and 30 times higher than a person’s typical annual exposure to all sources of ionizing radiation (3.6 mSv), it can be concluded that the cancer risks associated with typical exposures to ionizing radiation are quite small.
Such calculations, however, are for whole-body radiation. If a radioactive element is inhaled and becomes permanently lodged in the lungs, as is the case for the radioactive polonium in tobacco smoke, the cells lining each lung are subjected to radiation levels that are much higher than those experienced by the body as a whole.
Another variable to be considered is the effect of ongoing exposure to low doses of radiation spread over long periods of time. For example, consider a workplace environment that exposes workers to an annual radiation dose of 15 mSv, which is well within acceptable governmental guidelines.
A 30-year employee would acquire an accumulated dose of 30 years × 15 mSv/year = 450 mSv of radiation. Since it was indicated in the preceding paragraph that 100 mSv increases the risk of developing a fatal cancer by about 1%, a worker with an accumulated dose of 450 mSv would have incurred a 4.5% added risk of dying of cancer.
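The arithmetic behind this estimate amounts to a linear, no-threshold extrapolation, sketched below under that stated assumption (about 1% added lifetime risk per 100 mSv, from the preceding paragraph).

```python
# A sketch of the linear, no-threshold extrapolation used above: about 1%
# added lifetime risk of fatal cancer per 100 mSv. This is an assumption
# adopted for illustration, and it likely overestimates the true risk.

RISK_PER_MSV = 0.01 / 100   # 1% per 100 mSv

def added_lifetime_risk(annual_dose_msv: float, years: float) -> float:
    """Added lifetime risk of fatal cancer from an accumulated dose."""
    return annual_dose_msv * years * RISK_PER_MSV

print(f"{added_lifetime_risk(15, 30):.1%}")  # 30 yr at 15 mSv/yr -> 4.5%
```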
Even this modest estimate of increased risk is probably an overestimate. First of all, radiation exposures that are accumulated gradually over a period of many years are less damaging to tissue than the same total dose of radiation administered all at once, so the preceding calculation for workplace exposure is likely to overestimate the long-term cancer risk.
In addition, if radiation-induced mutations only begin to accumulate after the capacity of DNA repair pathways has been exceeded by higher radiation doses, the dose-response curve for radiation-induced cancers may exhibit a threshold. If such a threshold were to exist, the actual cancer risk from low-dose exposures to radiation would be even smaller than suggested by current estimates, which are already quite low.
Given the relatively small cancer threat likely to be associated with typical human exposures to ionizing radiation, it is worth noting that people’s perceptions of risk are often quite different from actual risks.
When individuals are asked to rank their perceived risk of dying from various activities, nuclear power tends to be ranked ahead of much riskier activities, such as smoking cigarettes or driving a car. In reality, the risk of dying from a radiation-induced cancer is quite small compared with most of the other risks associated with everyday life (Table 4).
Essay # 3. Cell Telephones and Cancer:
Current Evidence does not Suggest that Cell Phones or Power Lines are Significant Cancer Hazards:
Within the past decade, a substantial portion of the world’s industrialized population has adopted a new means of communication: the cell phone. This familiar device transmits and receives radiofrequency (RF) waves located near the microwave region of the electromagnetic spectrum (see Figure 3).
A cell phone RF wave, which is a bit shorter than an FM radio wave and somewhat longer than the microwaves used in a microwave oven, contains billions of times less energy than an X-ray and is therefore a nonionizing type of radiation.
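That “billions of times” claim is easy to sanity-check, since photon energy is proportional to frequency (E = hf). The values below for a cell-phone band and a diagnostic X-ray are representative assumptions added here, not figures from the essay.

```python
# A back-of-the-envelope check (representative values assumed here):
# photon energy is proportional to frequency, E = hf.

CELL_PHONE_HZ = 1.9e9   # assumed: a typical ~1.9 GHz cell-phone band
XRAY_HZ = 3.0e18        # assumed: a representative diagnostic X-ray frequency

print(f"X-ray / RF photon energy ratio: {XRAY_HZ / CELL_PHONE_HZ:.1e}")
# ~1.6e9 -- billions of times less energy per RF photon, as stated above
```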
Because cell phones are placed directly against a person’s head, concerns have been expressed that the emitted RF waves might cause brain cancer. Support for this idea seemed to come from early reports of DNA damage in animals exposed to RF waves, but most studies have failed to confirm such results or demonstrate any carcinogenic activity of RF waves in laboratory animals.
In humans, lag times of 20 to 30 years between exposure to a carcinogen and the development of cancer are not uncommon, so it is too early to know whether any long-term cancer hazards exist. However, the epidemiological evidence has thus far failed to reveal any consistent relationship between cell phone usage and brain cancer rates.
The low-energy nature of RF waves and the failure to detect any consistent carcinogenic effects suggest that if any cancer risk exists at all, it will probably turn out to be rather small. At present the only well-established health risk associated with cell phone usage is an increased risk of dying in a car accident when the driver is talking on the phone.
High-voltage power lines, which emit electric and magnetic fields of extremely long wavelength, are another source of electromagnetic energy that has been claimed to pose a cancer risk. The electromagnetic energy emitted by power lines, called ELF (for Extremely Low Frequency), is at the low-energy end of the radiofrequency region of the electromagnetic spectrum (see Figure 3).
Concerns about the possible health hazards of ELF fields were initially raised by several reports suggesting that children living near high-voltage power lines have elevated rates of leukemia, brain cancer, and other childhood cancers. The observed increase in cancer rates is rather small, however, and subsequent studies have failed to consistently detect such effects. Moreover, ELF fields have insufficient energy to damage DNA directly.