Late Effects of Radiation
EARLY EFFECTS of radiation exposure are produced by high radiation
doses. Late effects of radiation exposure are the result of low doses
delivered over a long period.
Radiation exposures experienced by personnel in diagnostic imaging
are low dose and low linear energy transfer (LET). In addition, the
exposures in diagnostic imaging are delivered intermittently over long
periods.
The principal late effects of low-dose radiation over long periods consist
of radiation-induced malignancy and genetic effects. Life span
shortening and effects on local tissues also have been reported as late
effects, but these are not considered significant. Radiation protection
guides are based on suspected or observed late effects of radiation
and on an assumed linear, nonthreshold dose-response relationship.
Most late effects are also known as stochastic effects.
Stochastic effects of radiation exposure exhibit an increasing incidence of response—not severity—with increasing dose. No dose threshold has been established for a stochastic response.
Our radiation protection guides are based on the late effects of radiation and on linear, nonthreshold dose-response relationships.
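As a minimal quantitative sketch of what "linear, nonthreshold" means (the proportionality constant α is illustrative, not a published risk coefficient), the excess stochastic risk E at dose D can be written

    E(D) = α × D,   with E(0) = 0,

whereas a threshold relationship would instead set E(D) = 0 for all doses below some threshold dose Dₜ. Under the linear, nonthreshold model, no dose is presumed to be entirely without risk.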
Studies of large numbers of people exposed to a toxic substance require considerable statistical analysis. Such studies, called epidemiologic studies, are required when the number of persons affected is small.
Epidemiologic studies of people exposed to radiation are difficult because:
(1) the dose usually is not known but is presumed to be low, and
(2) the frequency of response is very low.
Consequently, the results of radiation epidemiologic studies do not
convey the statistical accuracy associated with observations of early
radiation effects.
LOCAL TISSUE EFFECTS
Skin
In addition to the early effects of erythema and desquamation and
late-developing carcinoma, chronic irradiation of the skin can result
in severe nonmalignant changes. Early radiologists who performed
fluoroscopic examinations without protective gloves developed a
very callused, discolored, and weathered appearance to the skin of
their hands and forearms. In addition, the skin would be very tight
and brittle and sometimes would severely crack or flake.
This late effect was observed many years ago in radiologists and is
called radiodermatitis. The dose necessary to produce such an
effect is very high. No such effects occur in the current practice of radiology.
Chromosomes
Irradiation of blood-forming organs can produce hematologic depression as an
early response or leukemia as a late response. Chromosome damage in the
circulating lymphocytes can be produced as both an early and a late response.
The types and frequency of chromosome aberrations have been described
previously; however, even a low dose of radiation can produce chromosome
aberrations that may not be apparent until many years after radiation exposure.
For example, individuals irradiated accidentally with rather high radiation doses
continue to show chromosome abnormalities in their peripheral lymphocytes for
as long as 20 years.
This late effect presumably occurs because of radiation damage to the
lymphocytic stem cells. These cells may not be stimulated into replication and
maturation for many years.
Cataracts
In 1932, E.O. Lawrence of the University of California developed the first
cyclotron, a 5-inch-diameter device capable of accelerating charged particles to
very high energies. These charged particles are used as “bullets” that are shot at
the nuclei of target atoms in the study of nuclear structure. By 1940, every
university physics department of any worth had built its own cyclotron and was
engaged in what has become high-energy physics.
The modern cyclotron is used principally to produce radionuclides for use in nuclear medicine, especially fluorine-18 for positron emission tomography (PET). The largest particle accelerators in the world are located at the Fermi National Accelerator Laboratory in the United States and at CERN in Switzerland. These accelerators are used to discover the ultimate fine structure of matter and to describe exactly what happened at the moment of creation of the universe.
Early cyclotrons were located in one room and a beam of high-energy particles was
extracted through a tube and steered and focused by electromagnets onto the
target material in the adjacent room. At that time, sophisticated electronic equipment
was not available for controlling this high-energy beam. Cyclotron physicists used a
tool of the radiologic technologist, the radiographic intensifying screen, to aid them
in locating the high-energy beam. Unfortunately, in so doing, these physicists
received high radiation doses to the lens of the eye because they had to look
directly into the beam.
In 1949, the first paper reporting cataracts in cyclotron physicists appeared. By 1960, several hundred such cases of radiation-induced cataracts had been reported. This was particularly tragic because there were few high-energy physicists.
Radiation-induced cataracts occur on the
posterior pole of the lens.
On the basis of these observations and animal experimentation,
several conclusions can be drawn regarding radiation-induced
cataracts. The radiosensitivity of the lens of the eye is age dependent.
As the age of the individual increases, the radiation effect becomes
greater and the latent period becomes shorter.
Latent periods ranging from 5 to 30 years have been observed in
humans, and the average latent period is approximately 15 years.
High-LET radiation, such as neutron and proton radiation, has a high
relative biologic effectiveness (RBE) for the production of cataracts.
The dose-response relationship for radiation-induced cataracts is nonlinear, threshold.
If the lens dose is high enough, in excess of approximately 1000 rad (10
Gyt), cataracts develop in nearly 100% of those who are irradiated. The
precise level of the threshold dose is difficult to assess.
Most investigators would suggest that the threshold after an acute x-ray
exposure is approximately 200 rad (2 Gyt). The threshold after fractionated
exposure, such as that received in radiology, is probably in excess of 1000
rad (10 Gyt). Occupational exposures to the lens of the eye are too low to
require protective lens shields for radiologic technologists. It is nearly
impossible for a medical radiation worker to reach the threshold dose.
Radiation administered to patients who are undergoing head and neck
examination by fluoroscopy or computed tomography can be significant. In
computed tomography, the lens dose can be 5 rad (50 mGyt) per slice. In
this situation, however, usually no more than one or two slices intersect the
lens. In either case, protective lens shields are not normally required.
However, in computed tomography, it is common to modify the examination to reduce the dose to the eyes.
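As a rough worked example using the figures just given (assuming the two-slice case for illustration):

    lens dose ≈ 2 slices × 5 rad/slice = 10 rad (100 mGyt),

which is well below the approximately 200 rad (2 Gyt) acute threshold for cataract formation cited earlier.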
LIFE-SPAN SHORTENING
Many experiments have been conducted with animals after both acute and chronic exposures, and they show that irradiated animals die young. The figure below, which has been redrawn from several such representative experiments, shows that the relationship between life span shortening and dose is apparently linear, nonthreshold. When all animal data are considered collectively, however, it is difficult to make a meaningful extrapolation to humans.
At worst, humans can expect a reduced
life span of approximately 10 days for
every rad.
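As an illustrative calculation under this worst-case estimate (the 1.2-rad career dose is assumed here purely for the example):

    1.2 rad × 10 days/rad = 12 days of life lost,

which matches the 12-day figure listed for radiation workers in the risk table below.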
Radiation-induced life span shortening is nonspecific, that is, no
characteristic diseases are associated with it, and it does not include
late malignant effects. It occurs simply as accelerated premature
aging and death.
One investigator has evaluated the death records of radiologic
technologists who operated field x-ray equipment during World War
II. These imaging systems were poorly designed and inadequately
shielded, so that technologists received higher-than-normal
exposures. Seven thousand such technologists have been studied,
and no radiation effects have been observed.
Risk of Life Span Shortening as a Consequence of Occupation, Disease, or Various Other Conditions

Risky Condition                            Expected Days of Life Lost
Being male rather than female              2800
Heart disease                              2100
Being unmarried                            2000
One pack of cigarettes a day               1600
Working as a coal miner                    1100
Cancer                                     980
30 pounds overweight                       900
Stroke                                     520
All accidents                              435
Service in Vietnam                         400
Motor vehicle accidents                    200
Average occupational accidents             74
Speed limit increase from 55 to 65 mph     40
Radiation worker                           12
Airplane crashes                           1
Observations on human populations have not been totally
convincing. No life span shortening has been observed among
atomic bomb survivors, although some received rather substantial
radiation doses. Life span shortening in radium watch-dial
painters, x-ray patients, and other human radiation-exposed
populations has not been reported.
American radiologists have been fairly extensively studied, and early
radiologists appeared to have a reduced life span. Such research has many
shortcomings, not the least of which is its retrospective nature. Figure
below shows the results obtained when the age at death for radiologists
was compared with the age at death for the general population.
Radiologists dying in the early 1930s were, on average, approximately 5 years younger at death than members of the general population.
However, this difference in age at death had shrunk to zero by 1965.
The theory of radiation hormesis suggests that very low
radiation doses are beneficial.
Some evidence supports the principle of radiation
hormesis. Radiation hormesis suggests that low levels of
radiation—less than approximately 10 rad (100 mGyt)—are
good for you! Such low doses may provide a protective
effect by stimulating molecular repair and immunologic
response mechanisms. Nevertheless, radiation hormesis
remains a theory at this time, and until it has been proved,
we will continue to practice ALARA—as low as reasonably
achievable.
RADIATION-INDUCED MALIGNANCY
All the late effects, including radiation-induced malignancy, have
been observed in experimental animals, and on the basis of these
animal experiments, dose-response relationships have been
developed. At the human level, these late effects have been
observed, but often, data are insufficient to allow precise
identification of the dose-response relationship. Consequently,
some of the conclusions drawn regarding human responses are
based in part on animal data.
Leukemia
When one considers radiation-induced leukemia in laboratory
animals, there is no question that this response is real and that
the incidence increases with increasing radiation dose. The form
of the dose-response relationship is linear and nonthreshold. A
number of human population groups have exhibited an elevated
incidence of leukemia after radiation exposure—atomic bomb
survivors, American radiologists, radiotherapy patients, and
children irradiated in utero, to name a few.
Radiation-induced leukemia follows a linear,
nonthreshold dose-response relationship.
Radiation-induced leukemia is considered to have a latent period of
4 to 7 years and an at-risk period of approximately 20 years.
The at-risk period is that time after irradiation during which one might
expect the radiation effect to occur. The at-risk period for radiation-induced cancer is lifetime.
Data from atomic bomb survivors show without a doubt that radiation
exposure to those survivors caused the later development of
leukemia.
Radiologists
By the second decade of radiology, reports of pernicious anemia
and leukemia in radiologists began to appear. In the early 1940s,
several investigators reviewed the incidence of leukemia in
American radiologists and found it alarmingly high. These early
radiologists functioned without the benefit of modern radiation
protection devices and procedures, and many served as both
radiation oncologists and diagnostic radiologists.
It has been estimated that some of these early radiologists
received doses exceeding 100 rad/yr (1 Gyt/yr). Currently,
American radiologists do not exhibit an elevated incidence of
leukemia compared with other physician specialists.
Ankylosing Spondylitis Patients
In the 1940s and 1950s, particularly in Great Britain, it was common practice to treat patients with ankylosing spondylitis with radiation. Ankylosing spondylitis is an arthritis-like condition of the vertebral column.
Patients cannot walk upright or move except with great difficulty. For relief, they
would be given fairly high doses of radiation to the spinal column, and the treatment
was quite successful. Patients who previously had been hunched over were able to
stand erect.
Radiation therapy was a permanent cure and remained the treatment of choice for
approximately 20 years, until it was discovered that some who had been cured by
radiation were dying from leukemia.
During the period from 1935 to 1955, 14,554 male patients were treated at 81
different radiation therapy centers in Great Britain. Review of treatment records
showed that the dose to the bone marrow of the spinal column ranged from 100 to 4000 rad (1 to 40 Gyt).
Fifty-two cases of leukemia occurred in this population. When this incidence of
leukemia is compared with that of the general population, the relative risk is 10:1.
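Relative risk is the ratio of the incidence of disease observed in an irradiated population to the incidence expected in a comparable unexposed population:

    relative risk = observed incidence / expected incidence.

A relative risk of 10:1 for the 52 observed leukemia cases therefore implies that only about 5 cases would have been expected in this population in the absence of irradiation (the expected count is inferred here from the stated ratio, not reported directly).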
Thyroid Cancer
Thyroid cancer has been shown to develop in three groups of patients
whose thyroid glands were irradiated in childhood.
The first two groups, called the Ann Arbor series and the Rochester series,
consisted of individuals who, in the 1940s and early 1950s, were treated
shortly after birth for thymic enlargement. The thymus is a gland lying just
below the thyroid gland that can enlarge shortly after birth in response to
infection.
At these facilities, radiation was often the treatment of choice. After a dose
of up to 500 rad (5 Gyt), the thymus gland would shrink so that all
enlargement disappeared. No additional problems were evident until up to
20 years later, when thyroid nodules and thyroid cancer began to develop in
some of these patients.
Another group included 21 children who were natives of the
Rongelap Atoll in 1954; they were subjected to high levels of fallout
during a hydrogen bomb test. The winds shifted during the test,
carrying the fallout over an adjacent inhabited island rather than one
that had been evacuated. These children received radiation doses to
the thyroid gland from both external exposure and internal ingestion
of approximately 1200 rad (12 Gyt).
Data are just now becoming available on the nearly 100,000
persons exposed to radiation from the 1986 Chernobyl
incident. No excess leukemia or cancer has been observed
in this population, although a small increase in thyroid
nodularity has been noted.
Bone Cancer
Two population groups have contributed an enormous quantity of
data showing that radiation can cause bone cancer. The first group
consists of radium watch-dial painters.
In the 1920s and 1930s, various small laboratories hired employees,
most often female, who worked at benches painting watch dials with
paint laden with radium sulfate. To prepare a fine point on the
paintbrushes, the employees would touch the tip of the brush to the
tongue. In this manner, substantial quantities of radium were
ingested.
Radium salts were used because the emitted radiation, principally alpha and
beta particles, would continuously excite the luminous compounds so the
watch dial would glow in the dark. Current technology uses harmlessly low
levels of tritium (³H) and promethium (¹⁴⁷Pm) for this purpose.
When ingested, the radium would behave metabolically in a manner similar to calcium and deposit in bone. Because of radium's long half-life (1620 years) and alpha emission, these employees received radiation doses to bone of up to 50,000 rad (500 Gyt).
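A back-of-the-envelope sketch shows why so long a half-life matters: the decay constant of radium is

    λ = ln 2 / T½ = 0.693 / 1620 yr ≈ 4.3 × 10⁻⁴ per year,

so less than about 2% of the ingested radium decays away over a 50-year span (1 − e^(−4.3 × 10⁻⁴ × 50) ≈ 0.02). The activity deposited in bone therefore remains essentially constant, and the alpha dose accumulates for the rest of the individual's life.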
Seventy-two bone cancers in approximately 800 persons have been observed
during a follow-up period in excess of 50 years.
Another population in whom excess bone cancer developed consisted of
patients treated with radium salts for a variety of diseases, from arthritis to
tuberculosis. Such treatments were common practice in many parts of the
world until about 1950.
Skin Cancer
Skin cancer usually begins with the development of a radiodermatitis.
Significant data have been developed from several reports of skin cancer
induced in radiation therapy recipients treated with orthovoltage (200 to 300
kVp) or superficial x-rays (50 to 150 kVp).
Radiation-induced skin cancer follows a
threshold dose-response relationship.
Breast Cancer
Controversy is ongoing regarding the risk of radiation-induced breast cancer,
with implications for breast cancer detection by x-ray mammography.
Concern over such risk first surfaced in the mid-1960s, after reports were
published of breast cancer developing in patients with tuberculosis.
Tuberculosis was for many years treated by isolation in a sanitarium. During
the patient's stay, one mode of therapy was to induce a pneumothorax in the
affected lung; this was done under non–image-intensified fluoroscopy. Many
patients received multiple treatments and up to several hundred fluoroscopic
examinations.
Precise dose determinations are not possible, but levels of several hundred
rad would have been common. In some of these patient populations, the
relative risk for radiation-induced breast cancer was shown to be as high as
10:1.
Lung Cancer
Early in the 20th century, it was observed that approximately 50% of workers in
the Bohemian pitchblende mines died of lung cancer. Lung cancer
incidence in the general population was negligible by comparison. The dusty mine
environment was considered to be the cause of this lung cancer. Now it is known
that radiation exposure from radon in the mines contributed to the incidence of
lung cancer in these miners.
Observations of American uranium miners active in the Colorado plateau in the
1950s and 1960s have also shown elevated levels of lung cancer. The peak of
this activity occurred in the early 1960s, when approximately 5000 miners were
active in nearly 500 underground mines and 150 open-pit mines. Most of the
mines were worked by fewer than 10 men; therefore, for such a small operation,
one could expect a lack of proper ventilation.
The radiation exposure in these mines occurred because of the high concentration of uranium ore. Uranium, which is radioactive with a very long half-life of approximately 10⁹ years, decays through a series of radioactive nuclides by successive alpha and beta emissions, each accompanied by gamma radiation.
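For illustration, the early portion of the uranium-238 decay series runs

    ²³⁸U → ²³⁴Th → ²³⁴Pa → ²³⁴U → ²³⁰Th → ²²⁶Ra → ²²²Rn,

and it is the inhalation of the radioactive gas radon-222 and its short-lived decay products, accumulating in poorly ventilated mines, that delivers the dose to lung tissue.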
To date, more than 4000 uranium miners have been observed, and they have
received estimated doses to lung tissue as high as 3000 rad (30 Gyt); on this
basis, the relative risk was approximately 8:1. It is interesting to note that smoking
uranium miners have a relative risk of approximately 20:1.
Liver Cancer
Thorium dioxide (ThO₂) in a colloidal suspension known as Thorotrast was widely used in diagnostic radiology between 1925 and 1945 as a contrast agent for angiography. Thorotrast was approximately 25% ThO₂ by weight, and it
contained several radioactive isotopes of thorium and its decay products.
Radiation that was emitted produced a dose in the ratio of approximately
100:10:1 of alpha, beta, and gamma radiation, respectively.
The use of Thorotrast has been shown to be responsible for several types of
carcinoma after a latent period of approximately 15 to 20 years. After
extravascular injection, it is carcinogenic at the site of the injection. After
intravascular injection, ThO₂ particles are deposited in phagocytic cells of the reticuloendothelial system and are concentrated in the liver and spleen. Thorium's long half-life and high alpha radiation dose have resulted in many cases of cancer in these organs.