Stanford study drives stake through claims that garlic lowers cholesterol levels
STANFORD, Calif. -- When it comes to lowering cholesterol levels, garlic stinks, according to a new study from the
Stanford University School of Medicine.
Despite decades of conflicting studies about the pungent herb's ability to improve heart health, the researchers
say their study provides the most rigorous evidence to date that consuming garlic on a daily basis - in the form of
either raw garlic or two of the most popular garlic supplements - does not lower LDL cholesterol levels among adults
with moderately high cholesterol levels.
"It just doesn't work," said senior author Christopher Gardner, PhD, assistant professor of medicine at the
Stanford Prevention Research Center. "There's no shortcut. You achieve good health through eating healthy food.
There isn't a pill or an herb you can take to counteract an unhealthy diet."
Gardner said the study, which will be published in the Feb. 26 issue of the Archives of Internal Medicine, is the
first independent, long-term, head-to-head assessment of raw garlic and garlic supplements. The study also drew
on the expertise of two of the nation's foremost garlic experts - Larry Lawson, PhD, of the Plant Bioactives Research
Institute in Utah, and Eric Block, PhD, professor of chemistry at the University at Albany, State University of New
York - who have devoted much of their careers to understanding the biochemical properties of the herb and who
ensured the quality and stability of the garlic consumed in the study.
"If garlic was going to work, in one form or another, then it would have worked in our study," Gardner said. "The
lack of effect was compelling and clear. We took cholesterol measurements every month for six months and the
numbers just didn't move. There was no effect with any of the three products, even though fairly high doses were
used."
Most of the medicinal claims about garlic revolve around the sulfur-containing substance allicin, which is
produced when raw garlic is chopped or crushed. Allicin has been shown to inhibit the synthesis of cholesterol in
test tubes and in animal models, but there is conflicting clinical evidence about its ability to react inside the human
body the same way it does in a lab dish.
"In lab tests, you can apply the garlic compounds directly to cells," Gardner said. "But in real people you don't
know whether allicin would actually get directly to cells if someone ate garlic. You still have to do the human clinical
trial to see if it really works, and the previous clinical trials left people confused."
Indeed, the fact that allicin had positive results in some lab tests and animal studies made it possible for
supplement makers to tout garlic as a natural remedy for high LDL cholesterol levels. LDL cholesterol is known as
the "bad cholesterol" because when too much of the substance circulates in the blood, it can build up and clog
arteries, increasing the risk of heart attack or stroke. LDL levels of less than 130 mg/dl are considered optimal, while
levels greater than that are considered high.
For the study, the researchers recruited 192 patients with moderately elevated LDL cholesterol levels, with an
average level of about 140 mg/dl. "These are the people who are the most likely to use supplements," Gardner said.
"If their cholesterol were higher, then their doctors would be putting them on statins or some other prescription
medication."
The study participants were then randomly assigned to ingest either raw garlic, an aged garlic supplement, a
powdered garlic supplement or a placebo six days a week for six months. For those assigned to take either raw or
supplemental forms of garlic, each participant consumed the equivalent of a 4-gram clove of garlic per day (which
the researchers determined was the average size of a clove of garlic). For those assigned to the supplements, this
meant taking somewhat more than the dosage recommended on the packaging instructions for both supplements.
The garlic supplements used in the study are two of the top sellers, but are manufactured in very different ways.
Gardner said that the manufacturer of the aged garlic extract calls it "the 'sociable' garlic because they say the
aging process takes away the bad-breath aspect." Extensive chemical analyses of the three garlic products
confirmed that the daily doses represented similar amounts of the original starting material and that all three
remained stable over the course of the study.
All of the study participants were given tablets as well as sandwiches prepared by Gardner's team. For those
assigned to consume raw garlic, the garlic was mixed into the sandwich condiments, and the pills were placebos.
For those assigned to take supplements, the condiments were garlic-free. In all, the research team made more than
30,000 heart-healthy gourmet sandwiches for the six-month study.
Participants were closely monitored throughout the study to ensure that they didn't gain or lose weight, which
might have affected their cholesterol readings. Additionally, blood samples were taken monthly from the study
participants.
When the researchers tested the blood samples, they found that the LDL cholesterol readings remained nearly
identical from start to finish.
"Our study had the statistical power to see any small differences that would have shown up, and we had the
duration to see whether it might take a while for the effect of the garlic to creep in. We even looked separately at
the participants with the highest vs. the lowest LDL cholesterol levels at the start of the study, and the results were
identical," Gardner said. "Garlic just didn't work."
One potential reason for the confusion surrounding garlic's reputed health benefits is that the supplement makers
themselves funded many of the previous studies claiming that garlic lowered cholesterol. Gardner's funding came
from the National Institutes of Health.
Gardner said garlic may still have an effect on other health and disease processes that were not addressed in this
study, such as inflammation, immune function or cancer. But, he added, those potential benefits also need to be
studied in rigorously controlled trials.
He also said that garlic can still be a valuable part of the diet if it's used to increase the consumption of healthy
dishes, such as a stir fry or a Mediterranean salad. "But if you choose garlic fries as a cholesterol-lowering food,
then you blew it. The garlic doesn't counteract the fries," Gardner said.
Aspirin reduces esophageal-cancer risk in people with most-aggressive form of Barrett's
esophagus
Study also identifies biomarkers that predict which patients are most likely to get cancer
SEATTLE -- Researchers at Fred Hutchinson Cancer Research Center have found that people with the most-aggressive
form of Barrett’s esophagus, a precancerous condition that can lead to esophageal cancer, may benefit the most
from preventive therapy with aspirin, ibuprofen and other nonsteroidal anti-inflammatory drugs, or NSAIDs. The
researchers also identified a cluster of four known cancer biomarkers, or genetic abnormalities, in people with
Barrett’s that significantly increases their risk of developing esophageal cancer.
These findings, by lead authors Patricia C. Galipeau and Xiaohong Li, senior author Brian J. Reid, M.D., Ph.D., and
colleagues in the Hutchinson Center-based Seattle Barrett’s Esophagus Program, will be published in the Feb. 27 issue of
PLoS Medicine, a freely available online journal. Researchers from Virginia Mason Medical Center, Harvard Medical School
and The Wistar Institute collaborated on the study, which was funded by the National Institutes of Health and the
Hutchinson Center.
The researchers found that those with three or more of the cancer biomarkers upon enrollment in the study who
also used aspirin or other NSAIDs had a 30 percent risk of esophageal cancer after 10 years, while those with the
same biomarkers who did not use NSAIDs had a 79 percent risk of developing cancer within a decade of joining the
study.
"This is the first prospective, longitudinal study in patients with Barrett’s esophagus – or any other pre-malignant
condition, for that matter – to link somatic genetic biomarkers for cancer-risk prediction with candidate interventions
such as NSAIDs to prevent cancer," said Galipeau, a research technician in Reid’s laboratory, which is based in the
Hutchinson Center’s Human Biology Division.
The researchers also found that Barrett’s patients whose esophageal tissue had no such genetic abnormalities, or
biomarkers, upon joining the study had a 12 percent risk of developing esophageal cancer after 10 years, whereas
those with three or more of the abnormalities at baseline had a nearly 80 percent risk of developing such cancer
within a decade.
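For readers who want the quoted numbers in one place, the sketch below is a minimal Python lookup of the 10-year risks reported above. It is only a summary of the press-release figures, not the study's survival model; the function name is made up for illustration, and the release gives no figure for patients with one or two baseline abnormalities.

```python
# Ten-year esophageal-cancer risks as quoted in the release (illustrative lookup only).
# "0" = no baseline biomarker abnormalities (not broken out by NSAID use in the release);
# "3+" = three or more baseline abnormalities.
TEN_YEAR_RISK = {
    ("0", True): 0.12,
    ("0", False): 0.12,
    ("3+", True): 0.30,   # >=3 biomarkers, aspirin/NSAID user
    ("3+", False): 0.79,  # >=3 biomarkers, non-user
}

def reported_risk(n_biomarkers: int, uses_nsaids: bool):
    """Return the quoted 10-year risk for this group, or None where the release gives no figure."""
    if n_biomarkers >= 3:
        group = "3+"
    elif n_biomarkers == 0:
        group = "0"
    else:
        return None  # one or two abnormalities: not reported in the release
    return TEN_YEAR_RISK[(group, uses_nsaids)]

print(reported_risk(4, uses_nsaids=False))  # 0.79
print(reported_risk(4, uses_nsaids=True))   # 0.30
```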
"Several studies have suggested that individual genetic abnormalities may identify Barrett’s patients who are at
increased risk of progression toward esophageal cancer, but this is the first study to evaluate the combined
contribution of genetic abnormalities for esophageal cancer-risk prediction," said Reid, director of the Seattle
Barrett’s Esophagus Program.
The study followed 243 people with Barrett’s esophagus for 10 years (189 male, 54 female, mean age 62 upon
joining the study). The participants were interviewed about their medical history, diet and medication use and were
closely monitored for signs of disease progression through regular endoscopies and tissue biopsies.
Their Barrett’s-related esophageal tissue was evaluated at the initial study visit for a variety of known cancer
biomarkers, but the genetic abnormalities that were most strongly predictive of progression toward cancer were:
* Loss of heterozygosity (LOH) at 9p and 17p – loss on the short arms of chromosomes 9 and 17. Such
chromosomal abnormalities inactivate tumor-suppressor genes that are critical for controlling cell growth.
* DNA content abnormalities (aneuploidy and tetraploidy) – the accumulation of cells with grossly abnormal
amounts of DNA, which indicates substantial genetic damage and heralds advanced progression toward cancer.
Ultimately, the researchers hope, these biomarkers one day could be used in a clinical setting to identify which
Barrett’s patients are most likely to develop esophageal cancer and therefore benefit from aggressive cancer
surveillance via endoscopy and chemoprevention with aspirin and other NSAIDs. Galipeau and colleagues are in the
process of developing such a standardized, biomarker-based screening test. The test would evaluate DNA from
esophageal-tissue biopsies, but significantly fewer tissue samples would need to be collected as compared to
current endoscopic-surveillance methods.
"Once such a test is available, it could be a major factor in guiding the development of clinical trials to identify
high-risk patients and definitively determine the value of NSAIDs in preventing the progression of Barrett’s
esophagus toward cancer," Reid said.
It is hypothesized that aspirin and other NSAIDs may fight cancer by reducing chronic inflammation, which is a
driving force behind the development of many cancers and other diseases. Specifically, NSAIDs have been shown to
inhibit the production of the cyclooxygenase-2 (COX-2) enzyme. Disruption of this pathway slows the growth of
abnormal cells and facilitates the normal process of programmed cell death, or apoptosis, both of which can thwart
cancer development. NSAIDs are also believed to decrease proliferation of cells and decrease the growth of blood
vessels that supply blood to tumors, Reid said.
The annual incidence of esophageal cancer among Barrett’s esophagus patients is about 1 percent – most
patients never get the cancer. However, the outlook is grim if the cancer is not diagnosed early, with an overall
survival rate of only 13.7 percent. For this reason, Barrett’s patients must undergo frequent endoscopic surveillance.
"Many Barrett’s patients are subjected to overdiagnosis of risk and overtreatment," Reid said. "These findings
ultimately may help us identify high-risk patients who truly require frequent surveillance and low-risk patients who
need no or less-frequent surveillance. For example, low-risk patients with no biomarker abnormalities at baseline
had a zero percent cumulative risk of developing esophageal cancer for nearly eight years," he said. "These findings
also may help us determine which Barrett’s patients may benefit most from a very cost-effective, noninvasive
therapy in the form of aspirin or NSAIDs."
Currently, the recommended endoscopic screening frequency for Barrett’s esophagus ranges from once every
three to six months to once every two to three years, depending on the amount of affected tissue and the degree of
dysplasia, or cellular abnormality, detected upon examining a tissue sample under a microscope. While cellular
dysplasia is the most common method for determining the severity, or grade, of Barrett’s, several recent studies
have found that the technique is not truly predictive of cancer risk. Longitudinal studies of high-risk Barrett’s
patients (as determined by degree of cellular dysplasia) have found cancer-incidence rates ranging from 8 percent
to 59 percent. "Clearly we need a new, more consistent and reliable way to predict risk that doesn’t rely on the
subjective interpretation of a pathologist," said Reid, also a professor of gastroenterology and medicine and an
adjunct professor of genome sciences at the University of Washington School of Medicine.
---------------------BACKGROUND INFORMATION
About Barrett’s esophagus
An estimated 20 million Americans experience chronic heartburn; about 2 million of these people have Barrett’s
esophagus, a premalignant condition of the tube that carries food from the mouth to the stomach. While the condition is
most prevalent in middle-aged white men, the incidence of esophageal adenocarcinoma, the cancer associated with
Barrett’s esophagus, is rising in women and African Americans. A physician may suspect the condition is present if part of
the inner lining, or epithelium, of the esophagus is red rather than the usual light pink. This is determined through a
procedure called endoscopy, in which a tube-like instrument is used to view the esophageal lining. A definite diagnosis
cannot be made unless small samples of the red lining are biopsied, or removed and examined under a microscope, and
found to have cellular changes typical of this disorder. Barrett’s-related esophageal cancer strikes about 10,000 Americans
a year, and for unknown reasons, the incidence is rising faster than that of any other cancer in the United States.
Barrett’s-related cancers increased fourfold between 1973 and 2002, with a nearly fivefold increase in white males. More
than 90 percent of patients with invasive esophageal adenocarcinoma die within five years of diagnosis.
Opening windows may be the best way of preventing transmission of airborne infection
A study of eight hospitals in Peru has shown that opening windows and doors provided ventilation more than
double that of mechanically ventilated negative-pressure rooms and 18 times that of rooms with windows and doors
closed.
The researchers, led by Rod Escombe from Imperial College London, compared the airflow in 70 naturally
ventilated clinical rooms such as respiratory isolation rooms, TB wards, respiratory wards, general medical wards,
outpatient consulting rooms, waiting rooms, and emergency departments with 12 mechanically ventilated negative-pressure respiratory isolation rooms built after 2000.
Even at the lowest of wind speeds, natural ventilation exceeded mechanical ventilation. Facilities built more than
50 years ago, characterized by large windows and high ceilings, had greater ventilation than modern naturally
ventilated rooms. The authors went on to calculate what these results might mean for transmission of infection and
estimated that in mechanically ventilated rooms 39% of susceptible individuals would become infected following 24
h of exposure to untreated TB patients compared with 33% in modern and 11% in pre-1950 naturally ventilated
facilities with windows and doors open.
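The release does not say which model the authors used to convert ventilation rates into infection risks, but estimates of this kind are typically made with the Wells-Riley equation, in which the probability of infection is 1 - exp(-Iqpt/Q). The Python sketch below illustrates that approach only; the quanta rate, breathing rate, room volume and air-change values are placeholder assumptions, not the study's inputs, so it will not reproduce the 39%, 33% and 11% figures exactly.

```python
import math

def wells_riley_risk(infectors: int, quanta_per_hour: float,
                     breathing_m3_per_hour: float, hours: float,
                     room_ventilation_m3_per_hour: float) -> float:
    """Wells-Riley estimate of the probability that a susceptible occupant is
    infected over the exposure period, assuming a well-mixed room."""
    inhaled_quanta = (infectors * quanta_per_hour * breathing_m3_per_hour * hours
                      / room_ventilation_m3_per_hour)
    return 1 - math.exp(-inhaled_quanta)

ROOM_VOLUME_M3 = 100  # hypothetical ward volume
for label, air_changes_per_hour in [("mechanical ventilation", 12),
                                    ("modern natural ventilation", 17),
                                    ("pre-1950 natural ventilation", 40)]:
    ventilation = air_changes_per_hour * ROOM_VOLUME_M3  # m3 of fresh air per hour
    risk = wells_riley_risk(infectors=1, quanta_per_hour=13,
                            breathing_m3_per_hour=0.6, hours=24,
                            room_ventilation_m3_per_hour=ventilation)
    print(f"{label}: {risk:.0%} risk after 24 h")
```

Whatever the exact parameter values, the calculation captures the study's central point: infection risk falls as the supply of fresh air rises.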
The authors conclude that opening windows and doors maximizes natural ventilation and that the risk of airborne
contagion is lower than with mechanical ventilation systems. Old-fashioned clinical areas with high ceilings and large
windows provided the greatest protection.
In a related perspective article Peter Wilson, from University College London, concludes that although “natural
ventilation is not an easy solution for patients in countries where winters are cold … the current practice of sealing
in the local environment is probably the wrong route for hospital wards”.
Early Europeans unable to stomach milk
The first direct evidence that early Europeans were unable to digest milk has been found by scientists at UCL
(University College London) and Mainz University.
In a study, published in the journal 'PNAS', the team shows that the gene that controls our ability to digest milk
was missing from Neolithic skeletons dating to between 5840 and 5000 BC. However, through exposure to milk,
lactose tolerance evolved extremely rapidly, in evolutionary terms. Today, it is present in over ninety per cent of the
population of northern Europe and is also found in some African and Middle Eastern populations but is missing from
the majority of the adult population globally.
Dr Mark Thomas, UCL Biology, said: "The ability to drink milk is the most advantageous trait that's evolved in
Europeans in the recent past. Without the enzyme lactase, drinking milk in adulthood causes bloating and diarrhoea.
Although the benefits of milk tolerance are not fully understood yet, they probably include: the continuous supply of
milk compared to the boom and bust of seasonal crops; its nourishing qualities; and the fact that it's
uncontaminated by parasites, unlike stream water, making it a safer drink. All in all, the ability to drink milk gave
some early Europeans a big survival advantage."
The team carried out DNA tests on Neolithic skeletons from some of the earliest organised farming communities
in Europe. Their aim was to find out whether these early Europeans from various sites in central, northeast and
southeast Europe, carried a version of the lactase gene that controls our ability to produce the essential enzyme
lactase into adulthood. The team found that it was absent from their ancient bone DNA. This led the researchers to
conclude that the consumption and tolerance of milk would have been very rare or absent at the time.
Scientists have known for decades that at some point in the past all humans were lactose intolerant. What was
not known was just how recently lactose tolerance evolved.
Dr Thomas said: "To go from lactose tolerance being rare or absent seven to eight thousand years ago to the
commonality we see today in central and northern Europeans just cannot be explained by anything except strong
natural selection. Our study confirms that the variant of the lactase gene appeared very recently in evolutionary
terms and that it became common because it gave its carriers a massive survival advantage. Scientists have inferred
this already through analysis of genes in today's population but we've confirmed it by going back and looking at
ancient DNA."
This study challenges the theory that certain groups of Europeans were lactose tolerant and that this inborn
ability led the community to pursue dairy farming. Instead, they actually evolved their tolerance of milk within the
last 8000 years due to exposure to milk.
Dr Thomas said: "There were two theories out there: one that lactose tolerance led to dairy farming and another
that exposure to milk led to the evolution of lactose tolerance. This is a simple chicken or egg question but one that
is very important to archaeologists, anthropologists and evolutionary biologists. We found that the lactose tolerance
variant of the lactase gene only became common after dairy farming, which started around 9 thousand years ago in
Europe.
"This is just one part of the picture researchers are gathering about lactose tolerance and the origins of
Europeans. Next on the list is why there is such disparity in lactose tolerance between populations. It's striking, for
example, that today around eighty per cent of southern Europeans cannot tolerate lactose even though the first
dairy farmers in Europe probably lived in those areas. Through computer simulations and DNA testing we are
beginning to get glimpses of the bigger early European picture."
The influence of the menstrual cycle on the female brain
What influence does the variation in estrogen level have on the activation of the female brain? Using functional
Magnetic Resonance Imaging, Jean-Claude Dreher, a researcher at the Cognitive Neuroscience Center
(CNRS/Université Lyon 1), in collaboration with an American team from the National Institute of Mental Health
(Bethesda, Maryland) directed by Karen Berman, has identified, for the first time, the neural networks involved in
processing reward-related functions modulated by female gonadal steroid hormones. This result, which was
published online on January 29, 2007 on the PNAS website, is an important step toward a better understanding of certain
psychiatric and neurological pathologies.
The human brain has a reward system that predicts different types of reward (food, money, drugs, etc.). The
normal functioning of this system plays a fundamental role in many cognitive processes such as motivation and
learning. This reward system, composed of dopaminergic neurons1 situated in the mesencephalon (a very deep
region of the brain) and their projection sites2, is crucial for neural coding of rewards. Its dysfunction can result in
disorders such as addictions and is also implicated in various psychiatric and neurological pathologies, such as
Parkinson's disease and schizophrenic disorders. Many animal studies show that the dopaminergic3 system is
sensitive to gonadal steroid hormones (estrogen, progesterone). For example, female rats self-administer cocaine (a
drug that acts on the dopamine system) in higher doses after estrogens have been administered to them. The
influence of gonadal steroid hormones on the activation of the reward system remained to be studied in humans. A
better knowledge of this influence should make for better understanding of the differences between men and
women, particularly as observed in the prevalence of certain psychiatric pathologies and in vulnerability to drugs
(for which the dopaminergic system plays an important role). It is known, for example, that the female response to
cocaine is greater in the follicular phase of the menstrual cycle4 than in the luteal phase5. Moreover, schizophrenia
tends to appear later in women than in men.
Notes:
1. Dopamine is a neurotransmitter, more specifically a molecule that modulates neuron activity in the brain.
Dopaminergic neurons use dopamine as a neurotransmitter/neuromodulator.
2. Structures including the ventral striatum, the anterior cingulate cortex, and the orbitofrontal cortex.
3. Dopaminergic system: all the brain structures innervated by dopaminergic neurons.
4. Follicular phase: the first part of the menstrual cycle starting from the first day of the period.
5. Luteal phase: the second part of the menstrual cycle that begins after ovulation and ends on the last day of the
period.
6. Gonadal neurosteroids: steroidal hormones produced by the gonads (ovaries and testicles), which interact with
estrogen receptors, progesterone or androgens.
7. Luteinizing hormone (LH) is a hormone produced by the pituitary gland. Its main role is to trigger ovulation,
which occurs between 36 and 48 hours after the LH peak.
[Figure: Increased brain activity during anticipation of uncertain monetary rewards. During the follicular phase, this increase is seen in the amygdala and the orbitofrontal cortex; higher T values (colors shifting from red toward yellow) indicate statistically stronger activation.]
Estrogens and progesterone are not just sex hormones that influence ovulation and reproduction; they also affect
a large number of cognitive and affective functions.
These two observations show that gonadal neurosteroids6 modulate the female dopaminergic system, but the
question remains as to whether these hormones modulate the reward system neuron network.
In order to answer this question, the team developed an experiment using functional Magnetic Resonance
Imaging (fMRI). The brain activity of a group of women was examined twice during their menstrual cycle. Each time
they went into the MRI, they were presented with virtual slot machines showing different probabilities of winning.
When women anticipate uncertain rewards, they activate the brain regions involved in processing emotions,
particularly the amygdala and the orbitofrontal cortex, to a greater extent during the follicular phase (4 to 8 days
after the start of the period) than during the luteal phase (6 to 10 days after the LH7 hormone surge). These results
demonstrate increased reactivity of the female reward system during the follicular phase, which is also the
phase in which estrogen is not opposed by progesterone. In order to determine the gender-related differences
of reward system activation, the same experiment was carried out on a male group. Result: when men anticipate
rewards, they mainly activate a region involved in motivation for obtaining rewards, the ventral striatum, whereas in
women, it is a region dealing with emotions, the amygdalo-hippocampal region, which is the most highly activated.
These conclusions could be applied to rewards other than monetary. Take receptiveness and desire, for example,
two qualities that are supposed to facilitate procreation and are seen during the period of ovulation. It could be
envisaged that the increase in activity of certain regions of the female brain during the follicular phase would
modulate behavior linked to obtaining rewards, such as approach behavior during reward anticipation and hedonistic
behavior when the reward is received.
These results, at the interface of neuroendocrinology and neuroscience, provide a better understanding of
the fundamental role of gonadal steroid hormones in reward processing, particularly in behavioral processes such
as motivation and learning. They are also important in understanding the dysfunction of the reward system observed
particularly in cases of Parkinson's disease, schizophrenia, normal ageing and drug and gambling addictions.
Manual Dishwashing Study Digs up Dirt on Dish Cleanliness
COLUMBUS, Ohio – New research at Ohio State University answers an infectious question about eating at restaurants:
How clean are manually washed dishes?
Jaesung Lee and Melvin Pascall found that even when they washed dishes in cooler-than-recommended water,
numbers of bacteria on the dishware dropped to levels accepted in the Food and Drug Administration's Food Code.
They also found that certain foods—especially cheese and milk—can be safe havens for bacteria when dried onto
dishware. Lipstick, however, proved to be dangerous to bacteria.
“After washing, there were lipstick stains still left on a few glasses, but it was the least hospitable substance for
bacteria,” Pascall said. “It seems to have antimicrobial properties, which was a big surprise to us.”
Lee, a research associate, and Pascall, an assistant professor, both in food science and technology, published
their findings in the Journal of Food Engineering.
When restaurants manually wash dishes, they follow a three-step process: Dishes are washed and scrubbed in
soapy water, rinsed with clean water, and finally soaked in water containing germ-killing sanitizers. But employees
often use water that is cooler than 110 degrees Fahrenheit—the minimum washing temperature recommended by
the FDA—because it is uncomfortably hot. The FDA also requires that washing cause a 100,000-fold drop in
amounts of bacteria on those dishes.
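The FDA's 100,000-fold requirement is the same thing as a 5-log10 reduction in bacterial counts. A minimal sketch of that arithmetic, using hypothetical before-and-after plate counts rather than data from the study:

```python
import math

def log_reduction(cfu_before: float, cfu_after: float) -> float:
    """Log10 reduction in colony-forming units achieved by washing."""
    return math.log10(cfu_before / cfu_after)

def meets_food_code(cfu_before: float, cfu_after: float) -> bool:
    """A 100,000-fold drop corresponds to at least a 5-log10 reduction."""
    return log_reduction(cfu_before, cfu_after) >= 5.0

# Hypothetical counts: 2,000,000 CFU on a soiled plate, 15 CFU after washing.
print(round(log_reduction(2_000_000, 15), 1))  # about 5.1 logs
print(meets_food_code(2_000_000, 15))          # True
```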
To investigate effective lower-temperature dishwashing tactics, the researchers coated dishes individually with
cheese, eggs, jelly, lipstick, and milk, and then added Escherichia coli and Listeria innocua bacteria. Contaminants
like E. coli and L. innocua can survive for long periods of time if they make their way into food dried onto dishes. If
those dishes aren't thoroughly washed, they can sometimes cause food-borne disease outbreaks.
After letting the food dry on to the dishes for an hour—a plausible wait in a busy restaurant dish room—they
gave each utensil a few scrubs per side and measured the amount of microscopic organisms still clinging to the
dishes.
Lee and Pascall discovered that washing dishes in hot dish water, followed by soaking in extra sanitizers,
eliminated almost all of the bacteria on them, even when coated with dried-on cheese. But dishes washed in soapy
room-temperature water, rinsed, and then weakly sanitized with ammonium-based chemicals also achieved FDA-acceptable results.
The find is important because acceptable sanitization can be achieved with cooler dish-washing water, as dishes
washed in room-temperature water and then rinsed in more-concentrated sanitizers achieved results comparable to
higher-temperature alternatives.
“We wanted to show that employees could use a more comfortable washing technique and still get clean dishes,”
Pascall said. “We were able to do that, and we did it by using different combinations of washing, rinsing, and
sanitizing.”
But all dishes are not created equal. Compared to ceramic plates, steel knives, spoons, and plastic trays, steel
forks seemed to be the best home for bacterial contaminants.
“The prongs of forks actually shield food from the action of scrubbing,” Pascall said. “Taking extra time to wash
forks is a good idea, especially those covered with sticky foods like cheese.”
Although cheesy forks were the most problematic utensil, milk dried onto glasses protected bacteria more than
any other food. Pascall explained that milk is a good growth medium in the laboratory, but why it adheres to glass
so well isn't clearly understood.
“Milk is an area of research we'd like to explore further,” Pascall said. “We want to find ways to safely and quickly
remove milk dried on glasses.”
The research aimed to explore restaurant dishwashing conditions, but Pascall explained that homeowners can
benefit from the findings, too.
“Leaving food on eating utensils and dishes could easily cause bacteria to grow on them, especially if it's moist,”
Pascall said. “The best thing you can do is wash your dishes off right away, before the food dries. It saves washing
time and gets rid of places where bacteria can survive drying and washing.”
Pascall and Lee conducted the study with Richard Cartwright and Tom Grueser of the Hobart Corporation in Troy, Ohio.
Funding was supplied by the Center for Innovative Food Technologies, a USDA-funded institution, and a manual
dishwashing sink for the project was provided by the Hobart Corporation.
Usefulness of cardiovascular disease test questioned
Dartmouth researchers say the test identifies too many at low risk
Researchers with Dartmouth Medical School and the Veterans Affairs Outcomes Group at the White River
Junction (Vt.) VA Medical Center are questioning the usefulness of the C-Reactive Protein (CRP) test for guiding
decisions about the use of cholesterol-lowering medication.
The researchers show that adding CRP testing to routine assessments would increase the number of Americans
eligible for cholesterol-lowering treatment by about 2 million if used judiciously, and by over 25 million if used
broadly—with most of these people being at low risk for heart attacks or heart disease. The authors argue that the
medical community should focus energies on treating high-risk patients before expanding the pool to include so
many low-risk patients. Their study was published in the February issue of the Journal of General Internal Medicine.
"There is a push to use this test, and that probably doesn't make much sense," says Steven Woloshin, one of the
authors on the paper and an associate professor of community and family medicine at Dartmouth Medical School
(DMS).
According to co-author Lisa Schwartz, associate professor of medicine at DMS, "A general population use of the
test would identify millions of low-risk people, and we don't know if exposing them to cholesterol medications will do
more good than harm. Plus, focusing on low-risk people seems misplaced since over half of high-risk people, who
we know are helped by treatment, remain untreated."
Woloshin and Schwartz's co-authors are H. Gilbert Welch and Kevin Kerin, all of whom are affiliated with
Veterans Affairs Outcomes Group and Dartmouth Medical School. Woloshin, Schwartz, and Welch are also part of
Dartmouth's Center for Evaluative Clinical Sciences.
For this study, the team analyzed nationally representative data from more than 2,500 people who participated in
the 1999-2002 National Health and Nutrition Examination Survey (NHANES). They discovered that adding a broadly
applied CRP strategy to current cholesterol-based guidelines would make over half the adults age 35 and older in
the United States eligible for lipid-lowering therapy.
"Before expanding treatment criteria to include more low-risk patients—for whom treatment benefit is not
established—we should ensure the treatment of high-risk patients where the benefit of therapy is clear," says
Woloshin.
The authors note that their study has several limitations. Since the NHANES data did not include some
cardiovascular risk factors (e.g., presence of an aortic abdominal aneurysm or peripheral artery disease), the
number of people in the highest risk groups may be underestimated. Second, not every patient made eligible for
cholesterol-lowering therapy by the CRP test will get treated; some will try lifestyle measures first (although data
from other studies suggest these measures will only succeed for a minority).
Gene Therapy Shows Promise as Treatment for Diseased Limbs
COLUMBUS, Ohio – New research suggests that gene therapy is a safe treatment method to explore in patients whose
lower limbs are at risk for amputation because of poor circulation caused by blocked blood vessels.
In a Phase I clinical trial, almost half the patients receiving gene therapy reported complete resolution of chronic
pain one year after treatment and more than a quarter of patients with chronic wounds experienced complete
healing of those ulcers in the same time frame. The results appear online and are scheduled for publication in the
March 13 issue of the journal Circulation.
The researchers are the first to report on testing of the effects of the hypoxia-inducible factor-1 alpha (HIF-1a)
gene as the basis of treatment for limbs damaged by compromised blood flow. Though the trial largely focused on
the therapy’s safety, “the bottom line is that 34 patients improved to varying degrees with this treatment,” said Dr.
Sanjay Rajagopalan, section director of vascular medicine at Ohio State University Medical Center and first author of
the Circulation article.
The treatment is currently being tested in a major Phase II clinical trial in the United States and Europe.
“If this gene therapy approach were to prove safe and effective after exhaustive testing in Phase III studies, it
would provide clinicians with an alternative approach to treating patients with serious blood flow problems in their
lower limbs,” Rajagopalan said.
Physicians are seeking better options to treat what is called critical limb ischemia because the condition often
results in amputation and is characterized by chronic wounds that resist healing. The compromised blood flow is
caused by severe blockages of peripheral arteries, or peripheral arterial disease (PAD), which occurs when
cholesterol or plaque builds up in the arteries outside the heart – typically in the legs or pelvis. The one-year
mortality rate in patients with critical limb ischemia is between 25 percent and 45 percent, and the risk of death
increases with amputation.
An estimated 3 percent of Americans younger than age 60 have some degree of PAD, although they may not be
aware of it because the early symptoms are often subtle. The frequency of the disorder increases with age, with
more than 20 percent of individuals older than age 70 suffering from some form of it.
The current standard of care for critical limb ischemia includes the use of stents or balloons or bypass grafting to
improve blood flow for patients who can tolerate such procedures. For many, these treatments offer only short-term
benefits.
HIF-1a is considered a “master switch” gene produced naturally in the body that triggers the growth of blood
vessel cells in patients with critical limb ischemia. The study drug, Ad2/HIF-1a/VP16, is a genetically modified form
of the gene, intended to boost its active properties. The scientists theorized that HIF-1a may normalize oxygen
levels in the cells by increasing interactions among multiple cytokines and genes that contribute to cell growth and
facilitate survival of tissue damaged by reduced blood flow.
“Previous attempts to test gene-based formulations for critical limb ischemia have been disappointing, and that
could be because the formulations tested were all single modality approaches that simply did not have the ability to
induce the coordinated series of events required to induce vessel growth,” said Rajagopalan, also associate director
of vascular research for Ohio State’s Davis Heart and Lung Research Institute.
He and colleagues administered the therapy through injections into the damaged limbs. The researchers
completed two studies – a randomized, double-blind, placebo-controlled study, as well as an extension study in
which participants, including some who had been receiving placebo, received the active treatment. Patients enrolled
at five U.S. centers had at least one main artery with complete or nearly complete blockage and no other options to
fix the vessels.
Though 21 of 38 patients completing the trials experienced adverse effects, all negative events were attributed to
their illness and not the study treatment. Because the study group was small, no effect was attributed to different
doses of the therapy.
At one year, 14 of 32 patients experienced complete resolution of pain while at rest, and five of 18 patients
reported complete healing of chronic wounds. Other benefits included an increase in the number and size of visible
vessels in affected legs.
Repeat administration of the study treatment is not considered an option because the body typically generates an
immune response to gene therapy.
The study also confirmed the high rate of disease progression in patients with advanced circulation problems
even with the study treatment. One year after the study began, 26 percent of all participating patients experienced
amputation and 13 percent of patients died; of those receiving the study therapy, 29 percent experienced
amputation and 9 percent died.
This research was funded by the Genzyme Corp., which manufactures Ad2/HIF-1a/VP16. Rajagopalan has received
grant support from the company.
Genes and genius: Researchers confirm association between gene and intelligence
If you're particularly good with puzzles or chess, the reason may be in your genes.
A team of scientists, led by psychiatric geneticists at Washington University School of Medicine in St. Louis, has
gathered the most extensive evidence to date that a gene that activates signaling pathways in the brain influences
one kind of intelligence. They have confirmed a link between the gene, CHRM2, and performance IQ, which involves
a person's ability to organize things logically.
"This is not a gene FOR intelligence," says Danielle M. Dick, Ph.D., assistant professor of psychiatry and lead
author on the study. "It's a gene that's involved in some kinds of brain processing, and specific alterations in the
gene appear to influence IQ. But this single gene isn't going to be the difference between whether a person is a
genius or has below-average intelligence."
Dick's team comprehensively studied the DNA along the gene and found that several variations within the CHRM2
gene could be correlated with slight differences in performance IQ scores, which measure a person's visual-motor
coordination, logical and sequential reasoning, spatial perception and abstract problem solving skills. When people
had more than one positive variation in the gene, the improvements in performance IQ were cumulative. The
study's findings are available online in Behavior Genetics and will appear in an upcoming print issue of that journal.
IQ tests also measure verbal skills and typically include many subtests. For this study, subjects took five verbal
subtests and four performance subtests, but the genetic variations influenced only performance IQ scores.
"One way to measure performance IQ may be to ask people to order pictures correctly to tell a story," Dick
explains. "A simple example might be pictures of a child holding a vase, the vase broken to bits on the floor and the
child crying. The person taking the test would have to put those pictures into an order that tells the story of how the
child dropped the vase and broke it and then cried."
The researchers studied DNA gathered as part of the Collaborative Study on the Genetics of Alcoholism (COGA).
In this multi-center study, people who have been treated for alcohol dependence and members of their families
provide DNA samples to researchers, who identify DNA regions related to alcohol abuse and dependence, as well as
a variety of other outcomes.
Some of the participants in the study also took the Wechsler Adult Intelligence Scale-Revised, a traditional IQ
test. In all, members of 200 families, including more than 2,150 individuals, took the Wechsler test, and those
results were matched to differences in individuals' DNA.
By comparing individual differences embedded in DNA, the team zeroed in on CHRM2, the neuronal receptor
gene on chromosome 7. The CHRM2 gene activates a multitude of signaling pathways in the brain involved in
learning, memory and other higher brain functions. The research team doesn't yet understand how the gene exerts
its effects on intelligence.
Intelligence was one of the first traits that attracted the attention of people interested in the interplay of genes
and environmental influences. Early studies of adopted children, for example, showed that when children grow up
away from their biological parents, their IQs correlate more closely with those of their biological parents, with whom
they share genes, than with those of their adoptive parents, with whom they share an environment.
But in spite of the association between genes and intelligence, it has been difficult to find specific variations that
influence intelligence. The genes identified in the past were those that had profoundly negative effects on
intelligence — genes that cause mental retardation, for example. Those that contribute to less dramatic differences
have been much harder to isolate.
Dick's team is not the first to notice a link between intelligence and the CHRM2 gene. In 2003, a group in
Minnesota looked at a single marker in the gene and noted that the variation was related to an increase in IQ. A
more recent Dutch study looked at three regions of DNA along the gene and also noticed influences on intelligence.
In this new study, however, researchers tested multiple genetic markers throughout the gene.
"If we look at a single marker, a DNA variation might influence IQ scores between two and four points,
depending on which variant a person carries," Dick explains. "We did that all up and down the gene and found that
the variations had cumulative effects, so that if one person had all of the 'good' variations and another all of the
'bad' variations, the difference in IQ might be 15 to 20 points. Unfortunately, the numbers of people at those
extremes were so small that the finding isn't statistically significant, but the point is we saw fairly substantial
differences in our sample when we combined information across multiple regions of the gene."
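The arithmetic Dick describes is additive: a few IQ points per favorable variant, summed across markers. The sketch below only illustrates that bookkeeping; the marker names and per-variant effects are hypothetical placeholders chosen to fall in the two-to-four-point range she quotes, not estimates from the study.

```python
# Hypothetical per-variant effects on performance IQ (points); NOT values from the paper.
MARKER_EFFECTS = {
    "marker_A": 3,
    "marker_B": 2,
    "marker_C": 4,
    "marker_D": 3,
    "marker_E": 4,
}

def cumulative_iq_shift(favorable_markers: set) -> int:
    """Sum the effects of the favorable variants a person carries."""
    return sum(points for marker, points in MARKER_EFFECTS.items()
               if marker in favorable_markers)

all_favorable = set(MARKER_EFFECTS)
print(cumulative_iq_shift(all_favorable) - cumulative_iq_shift(set()))  # 16-point spread
```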
Dick says the next step is to look at the gene and its numerous variants to learn what is going on biologically that
might affect cognitive performance. Presently, she says it's too early to predict how small changes in the gene might
be influencing communication in the brain to affect intelligence, and she says it's nearly certain CHRM2 is not the
only gene involved.
"Perhaps as many as 100 genes or more could influence intelligence," she says. "I think all of the genes involved
probably have small, cumulative effects on increasing or decreasing I.Q., and I expect overall intelligence is a
function of the accumulation of all of these genetic variants, not to mention environmental influences ranging from
socio-economic status to the value that's placed on learning when children are growing up."
Dick DM, et al. Association of CHRM2 with IQ: Converging Evidence for a Gene Influencing Intelligence. Behavior
Genetics, DOI 10.1007/s10519-006-9131-2.
New UD technology removes viruses from drinking water
Tracey Bryant
University of Delaware researchers have developed an inexpensive, nonchlorine-based technology that can remove
harmful microorganisms, including viruses, from drinking water.
UD's patented technology, developed jointly by researchers in the College of Agriculture and Natural Resources
and the College of Engineering, incorporates highly reactive iron in the filtering process to deliver a chemical “knockout punch” to a host of notorious pathogens, from E. coli to rotavirus.
The new technology could dramatically improve the safety of drinking water around the globe, particularly in
developing countries. According to the World Health Organization (WHO), over a billion people--one-sixth of the
world's population--lack access to safe water supplies.
Four billion cases of diarrheal disease occur worldwide every year, resulting in 1.8 million deaths, primarily
infants and children in developing countries. Eighty-eight percent of this disease is attributed to unsafe water
supplies, inadequate sanitation and hygiene.
In the United States, viruses are the target pathogenic microorganisms in the new Ground Water Rule under the
Environmental Protection Agency's Safe Drinking Water Act, which took effect on Jan. 8.
“What is unique about our technology is its ability to remove viruses--the smallest of the pathogens--from water
supplies,” Pei Chiu, an associate professor in UD's Department of Civil and Environmental Engineering, said.
Chiu collaborated with Yan Jin, a professor of environmental soil physics in UD's plant and soil sciences
department, to develop the technology. They then sought the expertise of virologist Kali Kniel, an assistant
professor in the animal and food sciences department, who has provided critical assistance with the testing phase.
“A serious challenge facing the water treatment industry is how to simultaneously control microbial pathogens,
disinfectants such as chlorine, and toxic disinfection byproducts in our drinking water, and at an acceptable cost,”
Chiu noted.
Viruses are difficult to eliminate in drinking water using current methods because they are far smaller than
bacteria, highly mobile, and resistant to chlorination, which is the dominant disinfection method used in the United
States, according to the researchers.
Of all the inhabitants of the microbial world, viruses are the smallest--as tiny as 10 nanometers. According to the
American Society for Microbiology, if a virus could be enlarged to the size of a baseball, the average bacterium
would be the size of the pitcher's mound, and a single cell in your body would be the size of a ballpark.
“By using elemental iron in the filtration process, we were able to remove viral agents from drinking water at
very high efficiencies. Of a quarter of a million particles going in, only a few were going out,” Chiu noted.
The elemental or “zero-valent” iron (Fe) used in the technology is widely available as a byproduct of iron and
steel production, and it is inexpensive, currently costing less than 40 cents a pound (~$750/ton). Viruses are either
chemically inactivated by or irreversibly adsorbed to the iron, according to the scientists.
Technology removes 99.999 percent of viruses
The idea for the UD research sprang up when Jin and Chiu were discussing their respective projects over lunch
one day.
Since joining UD in 1995, Jin's primary research area has been investigating the survival, attachment and
transport behavior of viruses in soil and groundwater aquifers. One of the projects, which was sponsored by the
American Water Works Association Research Foundation, involved testing virus transport potential in soils collected
from different regions across the United States. Jin's group found that the soils high in iron and aluminum oxides
removed viruses much more efficiently than those that didn't contain metal oxides.
“We knew that iron had been used to treat a variety of pollutants in groundwater, but no one had tested iron
against biological agents,” Chiu said. So the two researchers decided to pursue some experiments.
With partial support from the U.S. Department of Agriculture and the Delaware Water Resources Center, through
its graduate fellowship program, the scientists and their students began evaluating the effectiveness of iron
granules in removing viruses from water under continuous flow conditions and over extended periods. Two
bacteriophages--viruses that infect bacteria--were used in the initial lab studies.
Since then, Kniel has been documenting the technology's effectiveness against human pathogens including E. coli
O157:H7, hepatitis A, norovirus and rotavirus. Rotavirus is the number-one cause of diarrhea in children, according
to Kniel.
“In 20 minutes, we found 99.99 percent removal of the viruses,” Chiu said. “And we found that removal of the
viruses got even better than that with time, to more than 99.999 percent.”
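Those percentages correspond to the log removal values (LRVs) commonly used in water treatment. The conversion below is standard arithmetic; the particle counts simply echo Chiu's quarter-of-a-million example rather than coming from the published data.

```python
import math

def log_removal_value(percent_removed: float) -> float:
    """Convert a percent-removal figure into a log10 removal value (LRV)."""
    fraction_remaining = 1.0 - percent_removed / 100.0
    return -math.log10(fraction_remaining)

for pct in (99.99, 99.999):
    print(f"{pct}% removal -> {log_removal_value(pct):.0f}-log reduction")

# What ">99.999% removal" means in particle terms:
particles_in = 250_000                        # "a quarter of a million particles going in"
particles_out = particles_in * (1 - 0.99999)
print(f"{particles_in} in -> about {particles_out:.1f} out")
```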
The elemental iron also removed organic material, such as humic acid, that naturally occurs in groundwater and
other sources of drinking water. During the disinfection process, this natural organic material can react with chlorine
to produce a variety of toxic chemicals called disinfection byproducts.
“Our iron-based technology can help ensure drinking-water safety by reducing microbial pathogens and
disinfection byproducts simultaneously,” Chiu noted.
Applications in agriculture and food safety
Besides helping to safeguard drinking water, the UD technology may have applications in agriculture.
Integrated into the wash-water system at a produce-packing house, it could help clean and safeguard fresh and
“ready to eat” vegetables, particularly leafy greens like lettuce and spinach, as well as fruit, according to Kniel.
“Sometimes on farms, wash-water is recirculated, so this technology could help prevent plant pathogens from
spreading to other plants,” she said.
This UD research underscores the importance of interdisciplinary study in solving problems.
“There are lots of exciting things you can discover working together,” Jin said, smiling. “In this project, we all
need each other. Pei is the engineer and knows where we should put this step and how to scale it up. I study how
viruses and other types of colloidal particles are transported in water, and Kali knows all about waterborne
pathogens.
“Our hope is that the technology we've developed will help people in our country and around the world,
especially in developing countries,” Jin noted.
Currently, the Centre for Affordable Water and Sanitation Technology in Calgary, Canada, is exploring use of the
UD technology in a portable water treatment unit. Since 2001, the registered Canadian charity has provided
technical training in water and sanitation to more than 300 organizations in 43 countries of the developing world,
impacting nearly a million people.
The University of Delaware is pursuing commercialization opportunities for the research. Patents have been filed in the
United States, Canada, France, Germany and Switzerland. For more information, contact Bruce Morrissey, UD director of
technology development, Office of the Vice Provost for Research and Graduate Studies, at [[email protected]] or (302)
831-4230.
It pays to be well hung, if you're a rat
Well-hung males may enjoy an evolutionary advantage over their less well-endowed competitors - in certain
rodents, anyway. The finding may help answer the vexing question of why penis size is so variable among mammals.
Steven Ramm, an evolutionary biologist at the University of Liverpool, UK, gathered published measurements of
the length of the penis bone in four orders of mammals: rodents, primates, bats and carnivores, then corrected for
the fact that related species tend to have similar-sized penises. He compared these adjusted lengths with body
weight and testis size, which is a good indicator of a species' promiscuity and so of the amount of competition a
male will face for fertilisation.
Rodents with relatively large testes also tended to have relatively long penises, Ramm found (The American
Naturalist, vol 169, p 360). The advantage this confers on rodents is unknown, but a generously proportioned organ
may deposit a male's sperm further up the female reproductive tract, giving them a head start in the race to the egg,
Ramm speculates.
A similar, but weaker pattern occurs for carnivores. However, Ramm found no evidence that the correlation
exists in either bats or primates.
Study Points to Genetics in Disparities in Preterm Births
By NICHOLAS BAKALAR
Black women have significantly higher rates of premature birth than white women, and a new study suggests
there may be underlying genetic factors even when other known risks are taken into account.
The researchers, who published their findings this month in The American Journal of Obstetrics & Gynecology,
say that even though preterm birth is not a desirable outcome, it may provide some advantage, perhaps protection
against diseases — in somewhat the same way the gene for sickle cell confers protection against malaria.
“We have to think of everything in the context of what’s been evolutionarily advantageous,” said Dr. Louis J.
Muglia, a professor of pediatrics at Washington University in St. Louis, who was the senior author of the study.
Dr. Muglia noted that during a normal pregnancy, certain immune responses are suppressed, and that cytokines,
the molecules involved in healthy immune response, are heavily involved in preterm birth.
“The same things that select for a robust immune response,” he said, “may also confer a risk for giving birth
early.”
Some experts remain skeptical. Neil J. Risch, director of the Institute for Human Genetics at the University of
California, San Francisco, said he was not impressed with the quality of the evidence.
“They’re inferring something is genetic by elimination of other factors,” he said. “But geneticists believe that to
implicate something as genetic requires direct evidence, as opposed to evidence by absence. One should use high
standards of evidence before trying to directly implicate genetics in group differences. There could be a genetic
contribution, but they haven’t shown it.”
That black women give birth prematurely more often than white women has been demonstrated in numerous
studies, and there is wide agreement on many of the risks. The mother’s age, prenatal care, socioeconomic status,
education level, body-mass index, smoking and maternal diabetes have all been shown to affect the likelihood of
preterm birth.
But the new study found that black women still had higher rates of preterm birth even after accounting for all
these variables.
The researchers used the Missouri Department of Health and Senior Services’ database of more than 711,000 live
births from 1989 to 1997. They found that compared with white women, black women were twice as likely to give
birth from 20 to 34 weeks; that their rate of recurrent preterm birth was higher; and that the median time for
recurrence was two weeks earlier — 31 weeks for black mothers, compared with 33 for whites. (Normal gestation is
37 to 41 weeks.)
The researchers found that race was an even more powerful predictor than lack of prenatal care, which is itself
one of the strongest risk factors for prematurity. For black women, the risk of recurrent preterm delivery is four
times as great as it is for white women.
The study also found that for both blacks and whites, the most likely time for the birth of a second preterm child
is the same week as the first preterm delivery. These findings suggest, but do not prove, that there is a genetic
contribution to premature birth.
A co-author of the paper, Dr. F. Sessions Cole, also a professor of pediatrics at Washington University,
acknowledged that the study was not a genetic analysis. But he said that should be the next step.
“There is no specific candidate gene or pathway identified here,” Dr. Cole said, “but this is a sufficiently large and
robust cohort that inferences about genetic contribution can be made. The study provides momentum for a genomic
approach to understanding the racial disparity between blacks and whites in rates of prematurity.”
Dr. Muglia acknowledges that because many people are genetically mixed, critics say it is impossible to associate
race with any specific genes. But he added, “There have been enough studies to show that when you look at
people’s self-reported race, it does track with specific markers of ethnicity.”
Previous studies have shown that the tendency to give birth prematurely is inherited, and that women who were
themselves born prematurely are significantly more likely to give birth to preterm infants. One study found that
women born at 30 weeks of gestation or less are more than twice as likely as women born after 36 weeks to have
their own babies prematurely.
Dr. Muglia said that genes and environment interact. “There are likely genetic variations that will require certain
environmental exposures to result in preterm birth,” he said, “and those environmental contributors vary as a
function of race as well. This complicates the overall analysis of ascribing risk to specific variables.”
Dr. Jerome F. Strauss, dean of the Virginia Commonwealth University School of Medicine, said that the findings
were “new and significant” and that “they add support to the idea that genetic factors contribute to the disparity
between African-Americans and Americans of European descent” in rates of prematurity. Dr. Strauss was not
involved in the study.
Although no one has identified a specific gene for prematurity, Dr. Strauss and his colleagues have identified a
gene variant, much more common in people of African descent, that confers an increased risk of premature rupture
of the fetal membranes. Still, Dr. Muglia said, even that finding gives no indication of the overall contribution of
genetics to the likelihood of giving birth prematurely.
The authors emphasize that their analysis does not prove that the disparity in preterm births has a genetic
component, because unknown variables that go along with black race may also contribute.
“Many moms feel they’ve done something wrong if they have a preterm infant,” Dr. Muglia said. “But there are
still other risk factors that aren’t in their control, like their genetic makeup, that we need to know more about — for
both whites and blacks.”
Cases
A Mystery Ailment, but Not for the Right Doctor
By CLAIRE PANOSIAN DUNAVAN, M.D.
This is the story of a sore foot. In the annals of illness, a sore foot is a
humble woe unless it is your foot, and if searching out its cause has led
you nowhere.
Not long ago I met a retired Silicon Valley engineer whose doctors
could not figure out why his foot had been red, hot and painful for three
months. He sought my opinion as a tropical medicine specialist: could he
have the mosquito-borne virus called chikungunya?
The engineer had done his homework. Although native to Africa,
chikungunya virus had recently spread to southern India — where my patient, an avid globetrotter, had gone lame
after being bitten by mosquitoes.
But while that diagnosis sounded plausible, his symptoms did not really fit. He never had the virus’s signature
full-body ache. Nor had he suffered its typical fever, headache, rash, nausea and fatigue.
I specialize in exotic infections, but I still subscribe to some time-honored advice to medical rookies: “When you
hear hoofbeats, think horses, not zebras.” If doctors placed bets on diagnoses, those sticking to classic diseases
with quirky symptoms, as opposed to quirky diseases with classic symptoms, would win 9 times out of 10.
So I scratched my head for another answer. The engineer’s uric acid level was mildly elevated; gout was a
possibility. A month earlier, a podiatrist had the same idea, but a test of fluid from the foot came up negative.
My next thought was osteomyelitis — a smoldering infection of bone found most often in diabetics but
occasionally in otherwise healthy people. At one point during his overseas trip, my patient had a draining blister on
his toe. Eureka! I said to myself. There’s a perfect way in for a bad bug like staph. Maybe I should order imaging
studies and follow up with a bone biopsy.
The retired engineer was a good sport. Having cast aside his own theory, he now agreed to another round of
foot X-rays and, if necessary, a costly M.R.I. scan. I swabbed his nose for staph. At my urging, he even promised
to resume the anti-inflammatory drugs he had recently quit in favor of acupuncture. He knew I was bent on solving
this puzzle, and he was grateful and cooperative. I had won his trust.
Fortunately, a day later I ran into a bona fide bone and joint expert. It didn’t take him long to burst my balloon
and make me face the truth: I was the wrong doctor for the man with the sore foot. Yes, he and I had rapport. But
what the patient really needed was expertise I didn’t have.
And that was what cracked the case. My colleague saw the patient and diagnosed Reiter’s syndrome, an
inflammatory disease touched off, in this instance, by a prior prostate infection. Subtle erosions on the patient’s foot
X-ray — abnormalities that only a rheumatologist might spot — were the telltale clue.
My doctor friend added that by the time I saw the patient, the infection was gone. It was then I remembered the
antibiotics my patient had downed a month earlier. Prescribed by yet another doctor, they had blotted out the one
clue that might have nudged me in the right direction.
In a way, I felt a kinship with my unknown predecessor who proffered a fistful of pills while failing to grasp the
bigger picture. Like me, it seems, he simply wasn’t meant to solve the mystery.
Last week, I called my patient to ask how he was doing. He judged his foot to be “85 percent better.” He also
told me he was seeing a new doctor for his troublesome prostate — the domino that set this whole painful episode
in motion.
And how does an infection in a walnut-sized gland near the bladder cause violent inflammation in someone’s
foot? Autoimmunity — a reaction usually found in people with a certain genetic makeup. In the engineer’s case, his
body’s immune system was jolted into action by an invading micro-organism, then mistakenly continued to attack
his foot long after the original culprit was gone. In classic cases of Reiter’s syndrome, there are three remote sites
of attack after the initial intestinal or genitourinary infection: joints, eyes and urethra.
On one hand, I’m glad my patient was spared those other miseries. Then again, if his disease had played by the
rules, I wonder: with the help of the Internet, would he have simply diagnosed his own case?
Dr. Claire Panosian Dunavan is a professor of medicine and infectious diseases at the University of California, Los Angeles.
Personal Health
A Mix of Medicines That Can Be Lethal
By JANE E. BRODY
The death of Libby Zion, an 18-year-old college student, in a New York hospital on March 5, 1984, led to a highly
publicized court battle and created a cause célèbre over the lack of supervision of inexperienced and overworked
young doctors. But only much later did experts zero in on the preventable disorder that apparently led to Ms. Zion’s
death: a form of drug poisoning called serotonin syndrome.
Ms. Zion, who went to the hospital with a fever of 103.5, had been taking a prescribed antidepressant,
phenelzine (Nardil). The combination of phenelzine and the narcotic painkiller meperidine (Demerol) given to her at
the hospital could raise the level of circulating serotonin to dangerous levels. When she became agitated, a
symptom of serotonin toxicity, and tried to pull out her intravenous tubes, she was restrained, and the resulting
muscular tension is believed to have sent her fever soaring to lethal heights.
Now, with the enormous rise in the use of serotonin-enhancing antidepressants, often taken in combination with
other drugs that also raise serotonin levels, emergency medicine specialists are trying to educate doctors and
patients about this not-so-rare and potentially life-threatening disorder. In March 2005, two such specialists, Dr.
Edward W. Boyer and Dr. Michael Shannon of Children’s Hospital Boston, noted that more than 85 percent of
doctors were “unaware of the serotonin syndrome as a clinical diagnosis.”
In their review in The New England Journal of Medicine, Dr. Boyer and Dr. Shannon cited a report based on calls
to poison control centers around the country in 2002 showing 7,349 cases of serotonin toxicity and 93 deaths. (In
2005, the last year for which statistics are available, 118 deaths were reported.)
The experts fear that failure to recognize serotonin syndrome in its mild or early stages can result in improper
treatment and an abrupt worsening of the condition, leading to severe illness or death. Even more important, in
hopes of preventing it, they want doctors — and patients — to know just what drugs and drug combinations can
cause serotonin poisoning.
A Diagnostic Challenge
Serotonin syndrome was first described in medical literature in 1959 in a patient with tuberculosis who was
treated with meperidine. But it wasn’t given its current name until 1982.
Recognizing the early signs is tricky because the syndrome produces varying symptoms that can easily be confused with less
serious conditions, including tremor, diarrhea, high blood pressure, anxiety and agitation. The examining physician
may regard early symptoms as inconsequential and may not think to relate them to drug therapy, Dr. Boyer and Dr.
Shannon noted.
In its classic form, serotonin syndrome involves three categories of symptoms:
¶Cognitive-behavioral symptoms like confusion, disorientation, agitation, irritability, unresponsiveness and anxiety.
¶Neuromuscular symptoms like muscle spasms, exaggerated reflexes, muscular rigidity, tremors, loss of
coordination and shivering.
¶Autonomic nervous system symptoms like fever, profuse sweating, rapid heart rate, raised blood pressure and
dilated pupils.
Widespread ignorance of the syndrome is another diagnostic impediment. But even when doctors know about it,
the strict diagnostic criteria may rule out “what are now recognized as mild, early or subacute stages of the
disorder,” Dr. Boyer and Dr. Shannon wrote.
Perhaps adding to the diagnostic challenge is the fact that a huge number of drugs — prescription, over the
counter, recreational and herbal — can trigger the syndrome. In addition to selective serotonin reuptake inhibitors
like Zoloft, Prozac and Paxil and serotonin/norepinephrine reuptake inhibitors like Effexor, the list includes tricyclic
antidepressants and MAOIs (for monoamine oxidase inhibitors); narcotic painkillers like fentanyl and tramadol;
over-the-counter cough and cold remedies containing dextromethorphan; the anticonvulsant valproate; triptans like
Imitrex used to treat and prevent migraines; the antibiotic Zyvox (linezolid); antinausea drugs; the anti-Parkinson’s
drug L-dopa; the weight-loss drug Meridia (sibutramine); lithium; the dietary supplements tryptophan, St. John’s
wort and ginseng; and several drugs of abuse, including ecstasy, LSD, amphetamines, the hallucinogens foxy
methoxy and Syrian rue.
Although serotonin poisoning can be caused by an antidepressant overdose, it more often results from a
combination of an S.S.R.I. or MAOI with another serotonin-raising substance. Patients at particular risk, some
experts say, are those taking combinations of antidepressant and antipsychotic drugs sometimes prescribed to treat
resistant depression. All it may take is a small dose of another serotonin-inducing drug to cause the syndrome.
One patient, a 45-year-old Bostonian, had been taking four drugs to treat depression when he had surgery on an
ankle last December. He developed several classic signs of serotonin syndrome while in the recovery room, where
he had been given fentanyl when the anesthetic wore off.
As described by his wife, he suddenly developed tremors and violent shaking and started cracking his teeth. He
was moved to the intensive care unit, where he thrashed and flailed, was oblivious to those around him, and had to
be restrained to keep from pulling out his tubes. Two weeks later, he was still in intensive care and still very
confused, despite being taken off all medications that could have caused his symptoms.
Serotonin syndrome can occur at any age, including in the elderly, in newborns and even in dogs. Since 1998,
the poison control center at the American Society for the Prevention of Cruelty to Animals has gotten more than a
thousand reports of the ingestion of antidepressant medications by dogs, which can develop symptoms rapidly and
die. The syndrome can also occur weeks after a serotonin-raising drug has been discontinued. Some drugs remain
active in the body for weeks, and the MAOIs disable an enzyme involved in serotonin metabolism that does not
recover until weeks after the drugs are stopped.
Prevention and Treatment
Most cases of serotonin syndrome are mild and resolve within 24 hours. But if the doctor fails to recognize them
and prescribes either a larger dose of a serotonin enhancer or another serotonin-raising drug, the consequences can
be rapid and severe.
Most important to preventing the syndrome is for patients to give each of their doctors a complete list of drugs
they regularly take — including prescriptions, over-the-counter medication, dietary supplements and recreational
drugs — before a doctor prescribes something new.
Indeed, if you are taking any of the drugs described above, you might ask whether a new prescription is safe.
And when filling a new prescription, it’s not a bad idea to also ask the pharmacist whether the medication, or an
over-the-counter remedy you are considering, is safe to combine with any other drugs you take.
Once the syndrome develops, the first step is to stop the offending drugs. It is crucial to seek immediate care,
preferably in a hospital. Most cases require only treatment of symptoms like agitation, elevated blood pressure and
body temperature, and a tincture of time.
More severe cases are treated with drugs that inhibit serotonin and chemical sedation. Dr. Boyer and Dr.
Shannon cautioned against using physical restraints to control agitation because they could enforce isometric
muscle contractions that cause a severe buildup of lactic acid and a life-threatening rise in body temperature.
Really?
The Claim: Duct Tape Removes Warts
By ANAHAD O’CONNOR
THE FACTS A small study in 2002 gave credence to an old remedy for an ugly
problem when it stated that duct tape, that ever-popular emblem of inventiveness and
quick fixes, was a highly effective treatment for warts.
The practice is supposed to work because the tape, if left on long enough, irritates
the skin, thereby causing an immune reaction that clears up the infections responsible
for warts, which are most common in children. The 2002 study found that if applied
for six days, duct tape worked more often than the usual technique: cryotherapy, or
freezing.
But many critics questioned the size of the study and its methodology, and this
month they gained ammunition from a study in The Journal of Family Practice.
In the study, tape was applied to problem areas for seven days and then the spots were soaked in warm water
and rubbed with pumice stones. This technique worked about 16 percent of the time, about the same as applying
corn pads overnight with once-weekly soaks and rubs. The researchers looked at 103 children — twice the number
of subjects in the 2002 study.
Last year, another study went a step further by compiling and analyzing data from 60 studies that had looked at
various removal methods. That study, published in the Cochrane Database of Systematic Reviews, found that simple
skin treatments with salicylic acid were probably the best option. Applied regularly, the study found, they had a cure
rate of about 73 percent.
THE BOTTOM LINE Duct tape may not be as effective as once thought.
It Seems the Fertility Clock Ticks for Men, Too
By RONI RABIN
When it comes to fertility and the prospect of having normal babies, it has always been assumed that men have
no biological clock — that unlike women, they can have it all, at any age.
But mounting evidence is raising questions about that assumption, suggesting that as men get older, they face
an increased risk of fathering children with abnormalities. Several recent studies are starting to persuade many
doctors that men should not be too cavalier about postponing marriage and children.
Until now, the problems known to occur more often with advanced paternal age were so rare they received scant
public attention. The newer studies were alarming because they found higher rates of more common conditions —
including autism and schizophrenia — in offspring born to men in their middle and late 40s. A number of studies
also suggest that male fertility may diminish with age.
“Obviously there is a difference between men and women; women simply can’t have children after a certain
age,” said Dr. Harry Fisch, director of the Male Reproductive Center at New York-Presbyterian Hospital/Columbia
University Medical Center and the author of “The Male Biological Clock.”
“But not every man can be guaranteed that everything’s going to be fine,” Dr. Fisch said. “Fertility will drop for
some men, others will maintain their fertility but not to the same degree, and there is an increased risk of genetic
abnormalities.”
It’s a touchy subject. “Advanced maternal age” is formally defined: women who are 35 or older when they deliver
their baby may have “A.M.A.” stamped on their medical files to call attention to the higher risks they face. But the
concept of “advanced paternal age” is murky. Many experts are skeptical about the latest findings, and doctors
appear to be in no rush to set age guidelines or safety parameters for would-be fathers, content instead to issue
vague sooner-rather-than-later warnings.
“The problem is that the data is very sparse right now,” said Dr. Larry Lipschultz, a specialist in the field of male
infertility and a past president of the American Society for Reproductive Medicine. “I don’t think there’s a consensus
of what patients should be warned about.”
And many men maintain their fertility, said Dr. Rebecca Z. Sokol, president of the Society for Male Reproduction
and Urology.
“If you look at males over 50 or 40, yes, there is a decline in the number of sperm being produced, and there
may be a decline in the amount of testosterone,” Dr. Sokol said. But by and large, she added, “the sperm can still
do their job.”
Some advocates, however, welcome the attention being paid to the issue of male fertility, saying it is long
overdue and adding that it could level the playing field between men and women in the premarital dating game.
“The message to men is: ‘Wake up and smell the java,’ ” said Pamela Madsen, executive director of the American
Fertility Association, a national education and advocacy group. “ ‘It’s not just about women anymore, it’s about you
too.’ ”
“It takes two to make a baby,” she said, “and men who one day want to become fathers need to wake up, read
what’s out there and take responsibility.
“I don’t see why everyone is so surprised,” Ms. Madsen added. “Everyone ages. Why would sperm cells be the
only cells not to age as men get older?”
Analyses of sperm samples from healthy men have found changes as men age, including increased
fragmentation of DNA, and some studies outside the United States have noted increased rates of some cancers in
children of older fathers.
Geneticists have been aware for decades that the risk of certain rare birth defects increases with the father’s age.
One of the most studied of these conditions is a form of dwarfism called achondroplasia, but the list also includes
neurofibromatosis, the connective-tissue disorder Marfan syndrome, skull and facial abnormalities like Apert
syndrome, and many other diseases and abnormalities.
“We have counseled for quite a long time that as paternal age increases, there is an increased frequency in new
mutations,” said Dr. Joe Leigh Simpson, president-elect of the American College of Medical Genetics.
Some studies suggest that the risk of sporadic single-gene mutations may be four to five times higher for fathers
who are 45 and older, compared with fathers in their 20s, Dr. Simpson said. Over all, having an older father is
estimated to increase the risk of a birth defect by 1 percent, against a background 3 percent risk for a birth defect,
he said.
Even grandchildren may be at greater risk for some conditions that are not expressed in the daughter of an older
father, according to the American College of Medical Genetics. These include Duchenne muscular dystrophy, some
types of hemophilia and fragile-X syndrome.
A recent study on autism attracted attention because of its striking findings about a perplexing disorder.
Researchers analyzed a large Israeli military database to determine whether there was a correlation between
paternal age and the incidence of autism and related disorders. The study found that children of men who became
fathers at 40 or older were 5.75 times as likely to have an autism disorder as those whose fathers were younger than 30.
“Until now, the dominant view has been, ‘Blame it on the mother,’ ” said Dr. Avi Reichenberg, the lead author of
the study, published in September in The Archives of General Psychiatry. “But we found a dose-response
relationship: the older the father, the higher the risk. We think there is a biological mechanism that is linked to
aging fathers.”
The study controlled for the age of the mother, the child’s year of birth and socioeconomic factors, but
researchers did not have information about autistic traits in the parents.
A study on schizophrenia found that the risk of illness was doubled among children of fathers in their late 40s
when compared with children of fathers under 25, and increased almost threefold in children born to fathers 50 and
older. This study was also carried out in Israel, which maintains the kind of large centralized health databases
required for such research. In this case, the researchers used a registry of 87,907 births in Jerusalem between 1964
and 1976, and linked the records with an Israeli psychiatric registry.
Researchers controlled for the age of the mother but did not have information on family psychiatric history.
According to the study’s calculations, the risk of schizophrenia was 1 in 141 in children of fathers under 25 years,
1 in 99 for fathers 30 to 35, and 1 in 47 for fathers 50 and older. The study found no association between older
fathers and any other psychiatric conditions.
“When our paper came out, everyone said, ‘They must have missed something,’ ” said an author of the study, Dr.
Dolores Malaspina, chairwoman of the psychiatry department at New York University Medical Center. (Dr. Malaspina
was also involved in the autism study.)
But studies elsewhere had similar findings, she said: a threefold increase in schizophrenia among offspring of
older fathers.
“The fact it’s so similar around the world suggests it’s due to biological aging,” she said. “As we age we do things
less well, and things break down, and that includes the making of the sperm.”
Unlike women, who are born with a lifetime supply of eggs, men are constantly making new sperm. But the
spermatogonia — the immature stem cells in the testes that replenish sperm — are constantly dividing and
replicating, with each round of division creating another possibility for error.
While women have only about 24 divisions in the cells that produce their eggs, the cells that create sperm go
through about 30 rounds of mitosis before puberty and through roughly 23 replications a year from puberty onward.
By the time a man reaches 50, the cells that create his sperm have gone through more than 800 rounds of division
and replication.
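That figure can be checked with simple arithmetic; a rough sketch, assuming puberty at about age 15 (the article does not give an exact age):
\[
\underbrace{30}_{\text{divisions before puberty}} \;+\; \underbrace{23 \times (50 - 15)}_{\text{ages 15 to 50}} \;=\; 30 + 805 \;=\; 835,
\]
in line with the “more than 800 rounds” cited above; assuming an earlier puberty age only pushes the total higher.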
“It’s like a light-bulb factory,” said Dr. Reichenberg, the author of the autism study. “You can manufacture a
billion light bulbs, but some fraction are going to be impaired. When you’re manufacturing something so frequently,
in such large quantities, the chances of an error are very high.”
Skeptics say the studies find an association but do not prove a causal relationship between an older father’s
genetic material and autism or schizophrenia, and note that other factors related to having an older father could be
at play, including different parenthood styles. Another possibility is that the father’s own mental illness or autistic
tendencies are responsible both for the late marriage and for the effect on the child.
But other findings suggest implications for older fathers. Another study by Dr. Malaspina and Dr. Reichenberg,
also using Israeli army data, found a correlation between having an older father and lower scores on nonverbal, or
performance, I.Q. tests.
Dr. Fisch, author of “The Male Biological Clock,” analyzed a New York State database of births and found that
older fathers added to the risk of having a baby with Down syndrome if the mother was over 35. (The father’s age
seemed to have no effect if the mother was younger; younger women may have compensated for any problems of
the older male.) The paper concluded that the father’s age was a contributing factor in 50 percent of Down
syndrome cases in babies born to women over 40.
Meanwhile, scientists have reported that sperm counts decline with age, and that sperm begin to lose motility
and the ability to swim in a straight line. The researchers also reported a steady increase in sperm DNA
fragmentation as men grew older, with a 2 percent increase each year in the gene mutation associated with
achondroplasia, the dwarfism syndrome. They found no correlation between advanced age and the kinds of
chromosomal changes that cause Down syndrome, but suggested that a small proportion of older fathers may be at
increased risk for transmitting multiple genetic and chromosomal defects.
The changes are gradual, rather than precipitous, said Brenda Eskenazi, director of the Center for Children’s
Environmental Health Research at the School of Public Health at the University of California, Berkeley. Some
scientists have suggested that unlike women’s biological clocks, which come to a dead stop when fertility ends at
menopause, older men’s clocks might be described as running slow and losing time.
So what’s a guy to do?
“I think what we’re saying is that men, too, need to be concerned about their aging,” Dr. Eskenazi said. “We
don’t really know what the complete effects are of men’s age on their ability to produce viable, healthy offspring.”
Dr. Fisch says healthy habits, regular exercise and a balanced diet may help preserve fertility. He advises against
smoking and using anabolic steroids and hot tubs, all of which can damage sperm.
If pressed, he said, “I would tell people, ‘If you’re going to have kids, have them sooner rather than later.’ ”
“No matter what happens,” he added, “the biological clock ticks.”
2 New Drugs Offer Options in H.I.V. Fight
By LAWRENCE K. ALTMAN and ANDREW POLLACK
LOS ANGELES — Two new AIDS drugs, each of which works in a novel way, have proved safe and highly successful in
large studies, a development that doctors said here on Tuesday would significantly expand treatment options for
patients.
The two drugs, which could be approved for marketing later this year, would add two new classes of drugs to the
four that are available to battle H.I.V., the AIDS virus. That would be especially important to tens of thousands of
patients in the United States whose treatment is failing because their virus has become resistant to drugs already in
use.
“This is really a remarkable development in the field,” Dr. John W. Mellors of the University of Pittsburgh said at a
news conference here at the 14th Annual Conference on Retroviruses and Opportunistic Infections.
Dr. Mellors, who was not involved in the studies but has been a consultant to the manufacturers of the drugs,
said he “wouldn’t be going out on a limb” to say the new results were as exciting as those from the mid-1990s,
when researchers first discovered that cocktails of drugs could significantly prolong lives.
Dr. Scott Hammer, chief of infectious diseases at Columbia University Medical Center, who also was not involved
in the studies but has been a consultant to the manufacturers, agreed that the new drugs “will provide extended
years of meaningful survival to patients.”
One drug, maraviroc, was developed by Pfizer, which has already applied for approval to sell it. The Food and
Drug Administration has scheduled an advisory committee meeting on April 24 to discuss the application.
The other drug, raltegravir, was developed by Merck, which has said it will apply in the second quarter for
approval.
Experts said the new drugs would be used in combination with older drugs. Both drugs stem from scientific
findings made a decade or more ago that have peeled back the intricate molecular process used by H.I.V. to infect
human immune system cells and to replicate itself.
While there are now more than 20 approved drugs to treat H.I.V. and AIDS, there are only four different
mechanisms by which the drugs work. In many patients, the virus develops resistance to one or more drugs, usually
because patients do not take their drugs on time as prescribed.
And if the virus develops resistance to one drug in a class, it often becomes resistant to others in that class and
sometimes in other classes. So AIDS experts have said there is an urgent need for drugs that work by new
mechanisms.
The two new drugs would represent the first new classes since 2003, when an injectable drug called Fuzeon was
approved. They would be the first new classes of oral H.I.V. drugs in a decade.
Merck’s drug works by inhibiting the action of integrase, an enzyme produced by the virus that incorporates the
virus’s genetic material into the DNA of a patient’s immune cell. Once incorporated, the viral DNA commandeers the
cell to make more copies of the virus.
In two Merck studies involving a total of 700 patients, virus levels dropped to below 50 copies per milliliter of
blood, an amount considered undetectable, in about 60 percent of patients who received raltegravir. That compared
with about 35 percent of those who received a placebo.
The patients in the two Phase 3 trials, typically the last stage of testing before approval, were resistant to at least
one drug in each of three classes of antiretroviral drugs. All the patients also received a combination of older drugs
that their doctors deemed to be the most appropriate. The results reported here were after 16 weeks, in a study that
is continuing, so it is possible that longer-term side effects might yet arise.
Other integrase inhibitors, like one from Gilead Sciences, are also under development. Gilead’s drug is 18 months
to 2 years behind Merck’s.
Pfizer’s drug works by blocking a protein on human immune system cells that H.I.V. uses as a portal to enter and
infect the cell. It would be the first drug that targets the human body rather than the virus.
The portal, known as CCR5, was discovered in 1996 by several groups of scientists, and there has been a race to
develop drugs to block it.
In two Phase 3 studies sponsored by Pfizer involving 1,049 patients, more than 40 percent of patients who
received maraviroc had undetectable levels of virus after 24 weeks of a 48-week study. That was about twice the
rate of those who received placebo. As in the Merck trials, patients were resistant to three classes of drugs and
were receiving an optimized combination of older drugs.
Some experts said they were a bit cautious about maraviroc, in part because it blocks a human protein instead of
a viral one, with possible unknown long-term effects. One CCR5 inhibitor that was being developed by
GlaxoSmithKline was dropped because it caused liver toxicity, and a second being developed by Schering-Plough
appeared to possibly raise the risk of blood cancers.
But in Pfizer’s study there was no increased incidence of cancers. In one study there was a higher rate of death
among those who took the drug, but Pfizer said the deaths were not associated with the drug.
Experts are also encouraged that about 1 percent of Caucasians have a particular mutation in both copies of their
CCR5 gene that knocks out its function. These people are resistant to H.I.V. infection and apparently live otherwise
normal lives.
Yet another issue is that some viruses use a different entry portal called CXCR4. Before getting maraviroc,
patients will have to be tested to see which portal their virus uses, which would make the drug an early example of
“personalized medicine” tailored to the patient.
The test, which will probably take two weeks for results, was developed by Monogram Biosciences of South San
Francisco, Calif. It is expected to cost $1,000 or more.
About 85 percent of newly infected patients have a virus that uses CCR5 while only about half of highly
drug-resistant viruses use that portal. There has been some concern that blocking CCR5 would encourage the
development of viruses that use the alternative portal — and those viruses seem to be associated with worse
outcomes.
But that has not proven so far to be a big problem, according to Edward A. Berger of the National Institute of
Allergy and Infectious Diseases, who played a key role in the discovery of the two portals.
Government, academic and industry experts said there was no reliable estimate of the number of people who
would need one of the new drugs. But the number is declining as more and better AIDS drugs become available.
“The numbers are not what they used to be six years ago,” said Norbert Bischofberger, executive vice president
for research and development at Gilead, which makes some widely used AIDS drugs.
Both Merck and Pfizer say they are conducting studies testing their drugs for use as initial treatments. They
would not say how much their drugs would cost.
Project Curbs Malaria in Ugandan Group
By LAWRENCE K. ALTMAN
LOS ANGELES — A simple, inexpensive and surprisingly powerful combination of treatments all but wiped out
malaria in a group of H.I.V.-positive children in a study in Uganda, scientists are reporting.
The combination — taking one inexpensive antibiotic pill each day and sleeping under an insecticide-treated
mosquito net — reduced the incidence of malaria by 97 percent compared with a control group, Dr. Anne Gasasira,
an AIDS researcher at Makerere University in Kampala, Uganda, said at a medical conference here on Wednesday.
She said the findings had already changed medical practice there. But scientists said they had not yet determined
whether the treatment would be as effective in H.I.V.-negative children with malaria.
The antibiotic is known as cotrimoxazole, which is sold in the United States as Bactrim and Septra. It is used to
prevent infections that are common complications of AIDS. The drug also has known benefits against the parasite
that causes malaria.
“The findings were shockingly dramatic,” said Dr. Elaine Abrams, a professor of pediatrics and epidemiology at
Columbia University. Dr. Abrams was not connected with the Uganda study and moderated the session where the
findings were reported, at the 14th Conference on Retroviruses and Opportunistic Infections.
Dr. Abrams and other experts said the findings had implications for better controlling malaria in Africa. Malaria is
the leading cause of illness and death among children under 5 in Uganda, Dr. Gasasira said.
Although the World Health Organization does not recommend the combination, the United Nations agency does
recommend each measure separately — cotrimoxazole for H.I.V. in Africa and bed nets to prevent malaria in
infected areas.
Generic forms of cotrimoxazole cost less than $10 per patient each year in Uganda and the bed nets about $5, Dr.
Gasasira said in an interview.
Because there are known interactions involving AIDS, malaria and other diseases, researchers have broadened
their studies in Africa to find better ways to treat and prevent them.
The findings also extend those of an earlier study that found a reduced frequency of malaria among H.I.V.-infected
adults in Uganda who took the antibiotic and slept under bed nets. Dr. Jonathan Mermin of the Centers for Disease
Control and Prevention in Atlanta led the adult study, which was published in The Lancet last year. But Dr. Gasasira
said that because the adult and pediatric studies used different methodologies, the findings could not be directly
compared.
Her study found that among 561 healthy children who were assumed not to be H.I.V.-infected and who neither
took the antibiotic nor slept under bed nets, there were 356 episodes of malaria. This compared with four episodes
among 300 children who were known to be H.I.V.-infected.
The 97 percent reduction was calculated by including other factors like the time they were under observation in
the study that began in October 2005. The data were measured as of last August, but the study is continuing to
determine if the findings hold up over a longer period. Additional studies are planned.
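As a rough, unadjusted check of that figure (ignoring the differences in observation time that the full analysis accounted for), the raw episode counts point the same way:
\[
\frac{4/300}{356/561} \;\approx\; \frac{0.013}{0.635} \;\approx\; 0.02,
\]
a crude reduction of roughly 98 percent, broadly consistent with the adjusted 97 percent.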
The infected children also received antiretroviral therapy, Dr. Gasasira said. Her team included researchers from
the University of California, San Francisco, and the study was paid for by the National Institute of Allergy and
Infectious Diseases in Bethesda, Md.
The study came up with another important finding on the relationship between fever and malaria. Only 4 percent
of fevers among children who received the combination therapy were from malaria.
In the past, doctors assumed that a child who came to a clinic for fever in Uganda had malaria until it was
proved otherwise. But because malaria was far less common among the participants who received the combination
therapy, Dr. Gasasira said, doctors now assume that any fever in a young child must be investigated for a cause
other than malaria.
Dr. Abrams, the Columbia expert, said in an interview that the Uganda findings had additional implications for
treating H.I.V.-infected children in malarious areas. Because pediatricians are concerned that prolonged use of
cotrimoxazole could lead to resistant malaria, they often stop the drug among AIDS patients when tests show
significant improvement in the health of their immune system after antiretroviral therapy.
“This data will make us reconsider whether to stop cotrimoxazole” in such circumstances, she said.
Stephen Hawking Plans Prelude to the Ride of His Life
By DENNIS OVERBYE
Stephen Hawking, the British cosmologist, Cambridge professor and best-selling author who has spent his career
pondering the nature of gravity from a wheelchair, says he intends to get away from it all for a little while.
On April 26, Dr. Hawking, surrounded by a medical entourage, is to take a zero-gravity ride out of Cape
Canaveral on a so-called vomit comet, a padded aircraft that flies a roller-coaster trajectory to produce periods of
weightlessness. He is getting his lift gratis, from the Zero Gravity Corporation, which has been flying thrill seekers
on a special Boeing 727-200 since 2004 at $3,500 a trip.
Peter H. Diamandis, chief executive of Zero G, said that “the idea of giving the world’s expert on gravity the
opportunity to experience zero gravity” was irresistible.
In some ways, this is only a prelude. Dr. Hawking announced on his 65th birthday, in January, that he hoped to
take a longer, higher flight in 2009 on a space plane being developed by Richard Branson’s company Virgin Galactic,
which seeks to take six passengers to an altitude of 70 miles.
Dr. Hawking says he wants to encourage public interest in spaceflight, which he believes is critical to the future
of humanity.
“I also want to show,” he said in an e-mail interview, “that people need not be limited by physical handicaps as
long as they are not disabled in spirit.”
Coming at a time when human spaceflight is at a crossroads, his trip into space is likely to shine a giant light on
the burgeoning and hopeful industry of space tourism.
NASA has redesigned the space program around finishing the International Space Station and sending people to
the Moon again and then to Mars, much to the unhappiness of many scientists who fear that the growing costs of
human flight will squeeze science out of the program.
Some voices, including Martin Rees, Dr. Hawking’s old friend and president of the Royal Society, have been
saying that space may be explored more economically and faster by private entrepreneurs, who can take risks and
weather the occasional disaster without having to worry about a Congressional cancellation of financing.
Last summer, at a news conference in Hong Kong, Dr. Hawking said humanity’s ultimate survival depended on
colonizing the solar system and beyond.
“Life on Earth,” he said, “is at the ever-increasing risk of being wiped out by a disaster, such as sudden global
nuclear war, a genetically engineered virus or other dangers we have not yet thought of.”
At an age when many of his contemporaries are thinking about retirement, Dr. Hawking seems determined to
add yet another chapter to a tale of already legendary adventurousness and determination, not to mention scientific
achievement.
He was only a graduate student at Cambridge University in the 1960s when he was found to have amyotrophic
lateral sclerosis, or Lou Gehrig’s disease, which usually kills its victims in two to five years. He persevered to get his
degree and become the world’s reigning expert on black holes, the bottomless pits in which gravity has crushed
dead stars, space and time out of existence.
Along the way he has married twice, fathered three children (he is now a grandfather), written the best-selling “A
Brief History of Time” among other books, traveled the world and appeared as a guest on “Star Trek: The Next
Generation” and “The Simpsons.”
Dr. Hawking has been to the White House, the Great Wall of China and Antarctica, met the Dallas Cowboys
Cheerleaders and been lowered into the pit of an underground particle accelerator. Lawrence M. Krauss, a
cosmologist from Case Western Reserve University, who once took him down in a submarine, said, “Stephen is a
dreamer and an adventurer who enjoys the opportunities his celebrity brings in a way that happily perhaps
compensates, although only minuscule-ly, for his physical affliction.”
The image of him floating through the stars in his wheelchair has become a symbol of humanity’s restless
curiosity and wonder.
Now it seems that the symbol is about to become the real thing, sans wheelchair.
Dr. Diamandis, a space entrepreneur who is a founder of the $10 million Ansari X Prize, awarded in 2004 for the
world’s first private spacecraft, on which the Branson craft is based, said he had offered Dr. Hawking a ride after
hearing him express enthusiasm for spaceflight.
There followed long discussions between Dr. Hawking’s doctors and the company’s to make sure that it would be
safe. Almost completely paralyzed, and frail after decades in a wheelchair, Dr. Hawking long ago lost the power of
speech and communicates with a computerized voice synthesizer that is controlled by his eye movements.
Zero Gravity, founded in 1993 by Dr. Diamandis and Byron K. Lichtenberg, a former astronaut, has flown some
2,500 people, only 1 or 2 percent of whom, Dr. Diamandis said, have become spacesick.
The aircraft has about 35 seats. Once the plane reaches some 24,000 feet, he said, the passengers leave their
seats and lie in a large padded open area. As the plane flies its roller-coaster trajectory, they experience repeated
swings between feeling heavier than normal, at the dip, and then weightless, at the peak, where they drift gently
off the floor in what Dr. Diamandis, who has been on 40 or 50 such flights, described as “really a joyous
experience, almost Zen-like,” lasting about half a minute.
Dr. Hawking’s flight will probably be even shorter, Dr. Diamandis said, with the pilots consulting with Dr. Hawking
and his doctors after each cycle.
In the e-mail interview, Dr. Hawking said, “I’m not worried about the zero gravity section, but the high-G part will
be difficult.”
Asked what his family thought of the adventure, he replied, “My family say ‘Good on you!’ ”
New DNA study helps explain unique diversity among Melanesians
Small populations of Melanesians — among the most genetically diverse people on the planet — have significant
differences in their mitochondrial DNA that can be linked to where they live, the size of their home island and the
language they speak, according to a study being published in the new online journal, Public Library of Science ONE
(http://www.plosone.org).
The study, "Melanesian mtDNA complexity," was lead by Jonathan Friedlaender, emeritus professor of
anthropology at Temple University. The study appears in the Feb. 28 issue.
Friedlaender and his collaborators from Binghamton University, the Institute for Medical Research in New Guinea
and the University of Pennsylvania examined mitochondrial DNA sequences from 32 diverse populations on four
islands in Melanesia, an island chain north and northeast of Australia that includes Fiji, New Caledonia, Vanuatu, the
Solomon Islands, and New Guinea. The islands that were intensively covered were Bougainville, New Ireland, New
Britain and New Guinea. "Mitochondrial DNA has been a focus of analysis for about 15 years," says Friedlaender. "It
is very interesting in that it is strictly maternally inherited as a block of DNA, so it really allows for the construction
of a very deep family tree on the maternal side as new mutations accumulate over the generations on ancestral
genetic backgrounds.
"In this part of the world, the genealogy extends back more than 35,000 years, when Neanderthals still occupied
Europe," he adds. "These island groups were isolated at the edge of the human species range for an incredible
length of time, not quite out in the middle of the Pacific, but beyond Australia and New Guinea. During this time
they developed this pattern of DNA diversity that is really quite extraordinary, and includes many genetic variants
that are unknown elsewhere, that can be tied to specific islands and even specific populations there. Others suggest
very ancient links to Australian Aborigines and New Guinea highlanders."
Friedlaender also says that the study gives a different perspective on the notion of the "apparent distinctions
between humans from different continents, often called racial differences. In this part of the Pacific, there are big
differences between groups just from one island to the next — one might have to name five or six new races on this
basis, if one were so inclined. Human racial distinctions don’t amount to much."
Use of some antioxidant supplements may increase mortality risk
Contradicting claims of disease prevention, an analysis of previous studies indicates that the antioxidant
supplements beta carotene, vitamin A, and vitamin E may increase the risk of death, according to a meta-analysis
and review article in the February 28 issue of JAMA.
Many people take antioxidant supplements, believing they improve their health and prevent diseases. Whether
these supplements are beneficial or harmful is uncertain, according to background information in the article.
Goran Bjelakovic, M.D., Dr.Med.Sci., of the Center for Clinical Intervention Research, Copenhagen University
Hospital, Copenhagen, Denmark, and colleagues conducted an analysis of previous studies to examine the effects of
antioxidant supplements (beta carotene, vitamins A and E, vitamin C [ascorbic acid], and selenium) on all-cause
death of adults included in primary and secondary prevention trials. Using electronic databases and bibliographies,
the researchers identified and included 68 randomized trials with 232,606 participants in the review and
meta-analysis. The authors also classified the trials according to the risk of bias based on the quality of the methods used
in the study, and stratified trials as "low-bias risk" (high quality) or "high-bias risk" (low quality).
In an analysis that pooled all low-bias risk and high-bias risk trials, there was no significant association between
antioxidant use and mortality. In 47 low-bias trials involving 180,938 participants, the antioxidant supplements were
associated with a 5 percent increased risk of mortality. Among low-bias trials, use of beta carotene, vitamin A, and
vitamin E was associated with 7 percent, 16 percent and 4 percent, respectively, increased risk of mortality,
whereas there was no increased mortality risk associated with vitamin C or selenium use.
"Our systematic review contains a number of findings. Beta carotene, vitamin A, and vitamin E given singly or
combined with other antioxidant supplements significantly increase mortality. There is no evidence that vitamin C
may increase longevity. We lack evidence to refute a potential negative effect of vitamin C on survival. Selenium
tended to reduce mortality, but we need more research on this question," the authors write.
"Our findings contradict the findings of observational studies, claiming that antioxidants improve health.
Considering that 10 percent to 20 percent of the adult population (80-160 million people) in North America and
Europe may consume the assessed supplements, the public health consequences may be substantial. We are
exposed to intense marketing with a contrary statement, which is also reflected by the high number of publications
per included randomized trial found in the present review."
"There are several possible explanations for the negative effect of antioxidant supplements on mortality.
Although oxidative stress has a hypothesized role in the pathogenesis of many chronic diseases, it may be the
consequence of pathological conditions. By eliminating free radicals from our organism, we interfere with some
essential defensive mechanisms. Antioxidant supplements are synthetic and not subjected to the same rigorous
toxicity studies as other pharmaceutical agents. Better understanding of mechanisms and actions of antioxidants in
relation to a potential disease is needed," the researchers conclude.
Eating ice cream may help women to conceive, but low-fat dairy foods may increase infertility
risk
Drinking whole fat milk and eating ice cream appears to be better for women trying to become pregnant than a
diet consisting of low-fat dairy products such as skimmed milk and yoghurt, according to new research published in
Europe's leading reproductive medicine journal, Human Reproduction, today (28 February). [1]
Researchers in the United States have found a link between a low-fat dairy diet and increased risk of infertility
due to lack of ovulation (anovulatory infertility). Their study showed that if women ate two or more servings of
low-fat dairy foods a day, they increased their risk of ovulation-related infertility by more than four fifths (85%)
compared to women who ate less than one serving of low-fat dairy food a week. On the other hand, if women ate
at least one serving of high-fat dairy food a day, they reduced their risk of anovulatory infertility by more than a
quarter (27%) compared to women who consumed one or fewer high-fat dairy servings a week.
Lead author of the study, Dr Jorge Chavarro, who is a research fellow in the Department of Nutrition at Harvard
School of Public Health, Boston, Massachusetts, USA, said that, given the scarcity of information in this area, it was
important that more research should be carried out into the association between low-fat dairy foods and
anovulatory infertility in order to confirm or refute the findings.
"Clarifying the role of dairy foods intake on fertility is particularly important since the current Dietary Guidelines
for Americans recommend that adults consume three or more daily servings of low-fat milk or equivalent dairy
products: a strategy that may well be deleterious for women planning to become pregnant as it would give them an
85% higher risk of anovulatory infertility according to our findings."
In the meantime, he said that his advice to women wanting to conceive would be to change their diet. "They
should consider changing low-fat dairy foods for high-fat dairy foods; for instance, by swapping skimmed milk for
whole milk and eating ice cream, not low fat yoghurt." However, he said that it was important that women did this
within the constraints of maintaining their normal calorie intake and limiting their overall consumption of saturated
fats in order to maintain general good health. "Once they have become pregnant, then they should probably switch
back to low-fat dairy foods as it is easier to limit intake of saturated fat by consuming low-fat dairy foods," he said.
In their prospective study, Dr Chavarro and his colleagues identified 18,555 women, aged between 24 and 42,
without a history of infertility, who had tried to become pregnant or had become pregnant between 1991 and 1999.
The women were part of a much larger study of 116,000 women in The Nurses' Health Study II.
Every two years the women completed a questionnaire that asked if they had tried to become pregnant for more
than a year without success, and what the cause was if they had been unable to conceive. The women also supplied
information on how often, on average, they had consumed certain foods and drinks during the previous year.
During the eight years, 438 healthy women reported infertility due to an ovulatory disorder.
After adjusting for various factors such as age, parity, body mass index, total calorie intake, physical activity,
smoking, drinking and contraceptive use, the researchers found an 85% increased risk of anovulatory infertility in
women eating two or more servings of low-fat dairy food a day compared to women eating one or fewer servings a
week, and a 27% decreased risk of infertility for women eating high-fat dairy food one or more times a day
compared to women eating a serving one or fewer times a week.
Dr Chavarro said: "Intake of total dairy foods was not associated with the risk of anovulatory infertility, but when
the low-fat and high-fat foods were considered separately, we found a positive association between low-fat dairy
food intake above five servings a week and risk of anovulatory infertility, and an inverse association between
high-fat dairy food intake and risk of developing this condition."
Further analysis of the findings in which specific foods were investigated, showed that an extra serving per day of
a low-fat dairy food such as yoghurt, appeared to increase the risk of anovulatory infertility by 11%, if the total daily
intake of calories was unchanged. In contrast, an extra daily serving of a high-fat dairy food such as whole fat milk
was associated with a 22% lower risk (with an unchanged calorie intake). The study showed that the more ice
cream the women ate, the lower was their risk, so that a woman eating ice cream two or more times a week had a
38% lower risk compared to a woman who consumed ice cream less than once a week.
The researchers believe that the presence of a fat-soluble substance, which improves ovarian function, might
explain the lower risk of infertility from high-fat dairy foods. "The intake of dairy fat, or a fat-soluble substance
present in dairy foods, may partly explain the inverse association between high-fat dairy foods and anovulatory
infertility," said Dr Chavarro.
Previous studies had suggested that lactose (a sugar found in milk) might be associated with anovulatory
infertility, but Dr Chavarro's study found neither a positive nor a negative association for this, nor was there any
association between intake of calcium, phosphorus or vitamin D and anovulatory infertility.
Innovative treatment for migraines combines Botox and surgery
DALLAS – Five years ago, Sharon Schafer Bennett suffered from migraines so severe that the headaches disrupted
her life, kept her from seeking a job and interfered with participation in her children's daily activities.
Now, thanks to an innovative surgical technique performed by a UT Southwestern Medical Center plastic surgeon
who helped pioneer the procedure, the frequency and intensity of Mrs. Bennett's migraines have diminished
dramatically – from two to three per week to an occasional one every few months.
The technique – performed by a handful of plastic surgeons in the U.S. – includes using the anti-wrinkle drug
Botox to pinpoint which of several specific muscles in the forehead, back of the head or temple areas may be
serving as "trigger points" to compress, irritate or entrap nerves that could be causing the migraine. Because Botox
temporarily paralyzes muscles, usually for about three months, it can be used as a "litmus test" or "marker" to see if
headaches go away or become less intense while the Botox's effects last, said Dr. Jeffrey Janis, assistant professor
of plastic surgery.
If the Botox is successful in preventing migraines or lessening their severity, then surgery to remove the targeted
muscle is likely to accomplish the same result, but on a more long-term and possibly permanent basis, he said.
For Mrs. Bennett, the surgery proved to be life-altering.
"I can't even begin to tell you what a change this has made in my life," said Mrs. Bennett, 45, a Houston-area
resident. "For the first time in years, I can live like a normal human being and do all the normal 'mom' and 'wife'
things that the migraines physically prevented me from doing. My family thinks it's great because they don't have to
put their lives on hold numerous times a week because of my migraines. I'm also going back to school to get a
second degree, something I could never have considered before."
Dr. Janis said: "Many neurologists are using Botox to treat migraines, but they are making the injections in a
'headband-like' circle around the forehead, temple and skull. They are not looking at finding the specific location of
the headache's trigger point. While patients may get temporary relief, after the Botox wears off they will have to go
back and get more injections or continue medications for migraines.
"It's like a math equation. I will inject the Botox into one trigger point at a time and leave the others alone. The
Botox is used as a diagnostic test to determine what trigger point is causing the problem. If patients get a benefit
from the Botox, they likely will get a benefit from the surgery. If there's no benefit from the Botox, then there won't
be a benefit from the surgery."
Dr. Janis began collaborating more than five years ago with Dr. Bahman Guyuron, a plastic surgeon at Case
Western Reserve University and the first to explore using surgery to relieve migraines, following the revelation by
several of his patients that their migraines had disappeared after they had cosmetic brow lifts. Dr. Janis has assisted
his colleague by performing anatomical studies on cadavers to explore the nerves and pathways that might cause
migraines. Together they have identified four specific trigger points and developed a treatment algorithm that
includes using Botox prior to deciding whether to perform surgery.
During the past several years, numerous peer-reviewed articles have been published in Plastic & Reconstructive
Surgery detailing their research efforts and the researchers have presented the technique at professional meetings
of plastic surgeons.
Approximately 28 million Americans, 75 percent of them women, suffer from migraines, according to the
National Institutes of Health. For employers, that translates into an estimated 157 million lost workdays annually.
"A migraine is something you can't explain to someone who hasn't had one," said Mrs. Bennett, who began
suffering monthly migraines as a teenager. As she grew older, the headaches became more frequent and
unpredictable. "They were messing up my life. I couldn't make any commitments or plan activities for my kids. This
surgery has made a huge difference in my life. It's awesome."
Dr. Janis only sees patients who have been diagnosed with recurring migraines by a neurologist and have tried
other treatments that have failed.
"Plastic surgeons are not in the business of diagnosing and treating headaches," he said. "This is a novel method
of treatment that is proving to be effective and potentially more long lasting than other things used before. But it is
still in its infancy."
Brain works more chaotically than previously thought
Information is not only transferred at synapses
The brain appears to process information more chaotically than has long been assumed. This is demonstrated by
a new study conducted by scientists at the University of Bonn. The passing on of information from neuron to neuron
does not, they show, occur exclusively at the synapses, i.e. the junctions between the nerve cell extensions. Rather,
it seems that the neurons release their chemical messengers along the entire length of these extensions and, in this
way, excite the neighbouring cells. The findings of the study are of huge significance since they explode
fundamental notions about the way our brain works. Moreover, they might contribute to the development of new
medical drugs. The study is due to appear shortly in the prestigious academic journal "Nature Neuroscience" and
has already been posted online (doi:10.1038/nn1850).
Until now everything seemed quite clear. Nerve cells receive their signals by means of little "arms", known as
dendrites. Dendrites pass on the electrical impulses to the cell body, or soma, where they are processed. The
component responsible for "distributing" the result is the axon. Axons are long cable-like projections of the cell
along which the electrical signals pass until they meet, at a synapse, the dendritic arm of another neuron. The
synapse presents an insurmountable barrier to the neuron's electrical pulses. The brain overcomes this obstruction
by means of an amazing signal conversion: the synapse releases chemical messengers, known as neurotransmitters,
which diffuse to the dendrites. There, they dock onto specific receptors and generate new electrical impulses. "It
was previously thought that neurotransmitters are only released at synapses," points out Dr. Dirk Dietrich at Bonn
University. "But our findings indicate that this is not the case."
The messenger attracts insulating cells
Together with his colleagues Dr. Maria Kukley and Dr. Estibaliz Capetillo-Zarate, Dietrich has conducted a careful
examination of the "white matter" in the brain of rats. This substance contains the "cable ducts" linking the right
and left halves of the brain. They consist essentially of axons and ancillary cells. There are no dendrites or even
synapses here. "So it is not a place where we would expect to see the release of messengers," the neuroscientist
explains.
Yet it is in the white matter that the scientists have made a remarkable discovery. As soon as an electrical
impulse runs through an axon cable, tiny bubbles containing glutamate travel to the axon membrane and release
their content into the brain. Glutamate is one of the most important neurotransmitters, being released when signal
transmission occurs at synapses. The researchers were able to demonstrate that certain cells in the white matter
react to glutamate: the precursors of what are known as oligodendrocytes. Oligodendrocytes are the brain's
"insulating cells". They produce the myelin, a sort of fatty layer that surrounds the axons and ensures rapid
retransmission of signals. "It is likely that insulating cells are guided by the glutamate to locate axons and envelop
them in a layer of myelin," says Dirk Dietrich.
As soon as the axons leave the white "cable duct" they enter the brain's grey matter where they encounter their
receptor dendrites. Here, the information is passed on at the synapses to the receptor cells. "We think, however,
that on their way through the grey matter the axons probably release glutamate at other points apart from the
synapses," Dietrich speculates. "Nerve cells and dendrites are closely packed together here. So the axon could not
only excite the actual receptor but also numerous other nerve cells."
If this hypothesis is correct, the accepted scientific understanding of the way neurons communicate, which has
prevailed for over a hundred years, will have to be revised. In 1897 Sir Charles Sherrington first put forward the
idea that chemical messengers are only released at "synapses", a term he coined. According to the founder of
modern neurophysiology this means that nerve cells can only communicate with a small number of other nerve cells,
i.e. only with those with which they are connected via synapses. This concept is the basis of the notion that
neuronal information in the brain, somewhat like electricity in a computer, only spreads directionally in the brain,
following specific ordered circuits.
Too much glutamate is the death of cells
There is, however, also an aspect to the research team's discovery that is of considerable medical interest. It has
long been known that in the event of oxygen deficiency or a severe epileptic fit, large numbers of insulating cells in
the white matter are destroyed. The trigger for this damage is our old friend, the neurotransmitter glutamate.
"Nobody knew until now where the glutamate actually comes from," says Dr. Dietrich. "Our results might open the
door to totally new therapeutic options." After all, drugs have already been developed that prevent glutamate
bubbles from discharging their load into the brain. Indeed, Bonn's neuroscientists now know precisely which
receptors of the insulating cells are stimulated by the neurotransmitter – another starting point for developing new
drugs.
Yet, why can glutamate sometimes be so dangerous? When an epileptic fit occurs, the nerve cells "fire" very
rapidly and fiercely. In this event so many impulses run through the axons that large quantities of glutamate are
released all at once. "In these concentrations the neurotransmitter damages the insulating cells," says Dietrich. "It's
the dosage that makes it harmful."
Vitamin D deficiency widespread during pregnancy
Significant racial disparities also noted despite use of prenatal multivitamin supplements
PITTSBURGH – Even regular use of prenatal multivitamin supplements is not adequate to prevent vitamin D
insufficiency, University of Pittsburgh researchers report in the current issue of the Journal of Nutrition, the
publication of the American Society for Nutrition. A condition linked to rickets and other musculoskeletal and health
complications, vitamin D insufficiency was found to be widespread among women during pregnancy, particularly in
the northern latitudes.
"In our study, more than 80 percent of African-American women and nearly half of white women tested at
delivery had levels of vitamin D that were too low, even though more than 90 percent of them used prenatal
vitamins during pregnancy," said Lisa Bodnar, Ph.D., M.P.H., R.D., assistant professor of epidemiology at the
University of Pittsburgh Graduate School of Public Health (GSPH) and lead author of the study. "The numbers also
were striking for their newborns – 92.4 percent of African-American babies and 66.1 percent of white infants were
found to have insufficient vitamin D at birth."
Vitamin D is closely associated with bone health, and deficiency early in life is linked to rickets – a
disorder characterized by soft bones and thought to have been eradicated in the United States more than 50 years
ago – as well as increased risk for type 1 diabetes, asthma and schizophrenia.
"A newborn's vitamin D stores are completely reliant on vitamin D from the mother," observed Dr. Bodnar, who
also is an assistant investigator at the university-affiliated Magee-Womens Research Institute (MWRI). "Not
surprisingly, poor maternal vitamin D status during pregnancy is a major risk factor for infant rickets, which again is
becoming a major health problem."
For their study, Dr. Bodnar and her colleagues evaluated data that was collected on 200 black women and 200
white women who were randomly selected from more than 2,200 women enrolled in the MWRI's Pregnancy
Exposures and Preeclampsia Prevention Study between 1997 and 2001. Samples of maternal blood were collected
prior to 22 weeks of pregnancy and again just before delivery. Samples of newborn umbilical cord blood also were
tested for 25-hydroxyvitamin D, an indicator of vitamin D status. Finding vitamin D insufficiency to be so widespread
in spite of prenatal multivitamin use is troubling, she noted, suggesting that higher dosages, differing
vitamin formulations or a moderate increase in sunlight exposure might be necessary to boost vitamin D stores to
healthier levels.
"In both groups, vitamin D concentrations were highest in summer and lowest in winter and spring," said senior
author James M. Roberts, M.D., MWRI director and professor and vice chair of research in the department of obstetrics,
gynecology and reproductive sciences at the University of Pittsburgh School of Medicine. "But differences were smaller
between seasons for African-American mothers and babies, whose vitamin D deficiency remained more constant."
Since vitamin D is made by the body in reaction to sunlight exposure, it has long been known that vitamin D
deficiency is more common among darker-skinned individuals, particularly in more northern latitudes, where less
ultraviolet radiation reaches the Earth. Indeed, vitamin D deficiency is more than three times as common in winter
as in summer for all women of childbearing age in the United States. Even so, the Pittsburgh researchers' study is
cause for concern.
"This study is among the largest to examine these questions in this at-risk population," Marjorie L. McCullough,
Sc.D., senior epidemiologist at the American Cancer Society, wrote in an accompanying editorial. "By the end of
pregnancy, 90 percent of all women were taking prenatal vitamins and yet deficiency was still common."
Vitamin D is found naturally in fatty fish but few other foods. Primary dietary sources include fortified foods such
as milk and some ready-to-eat cereals and vitamin supplements. Sun exposure for skin synthesis of vitamin D also
remains critical.
"Our study shows that current vitamin D dietary intake recommendations are not enough to meet the demands
of pregnancy," Dr. Bodnar said. "Improving vitamin D status has tremendous capacity to benefit public health."
Ancient Egypt Meds: Prayer, Laxatives
Jennifer Viegas, Discovery News
"Feeling irregular?" might have been a common question in ancient Egypt, since laxatives appear to have
dominated their pharmaceuticals, suggests ongoing research on medicine in the time of the Pharaohs.
The investigation — one of the largest studies of its kind — represents a partnership between England's
University of Manchester and the Egyptian Medicinal Plant Conservation Project in St. Katherine's, Sinai.
Although findings are preliminary, it appears that treating constipation preoccupied early doctors.
"The ancient Egyptians used a diverse range of plants for an equally diverse range of medical conditions," lead
researcher Ryan Metcalfe told Discovery News. "Laxatives dominated the field, with bulk laxatives, such as figs, bran
and dates in common use."
Metcalfe, a scientist in the university's School of Medicine, added that the Egyptians used bowel stimulants such
as the bitter fruit colocynth and castor oil, "which remained in clinical use until about 40 years ago."
One ancient remedy, believed to relieve excess gas and indigestion, consisted of cumin, a hefty portion of
goosefat and milk. All were boiled together, strained and consumed.
Metcalfe and his team are currently studying ancient papyrus records on the medical practices of people from
Egypt and the surrounding region.
At the same time, they are conducting genetic and chemical analysis on plant remains and resins, with the goal
of identifying trade routes, which species were used and how these plants might have been cultivated outside their
natural growing ranges.
"Around 50 percent of the plants used in ancient Egypt remained in clinical use up to the mid 20th century, and
some are still in use today," Metcalfe said, adding that researchers are even discovering "new" cures based on old
remedies, such as eating celery to help curb inflammation associated with arthritis.
The early Egyptians also seem to have recognized that stress could contribute to illness. They established
sanitariums where people would undergo "dream therapy" and treatments with "healing waters."
The scientists believe Egyptians obtained their medical knowledge from nomadic tribes that united to form
ancient Egypt, as well as from people in Mesopotamia and Nubia. Current medical practices by the Bedouin in the
Sinai region and by some groups in parts of Egypt show similarities to Pharaonic medicine.
"For example, acacia was used to treat coughs and eye complaints in ancient times and is still used for that to
this day," explained Metcalfe. "Colic was treated with anti-spasmodics, such as hyoscyamus, cumin and coriander,
still in vogue today."
John Taylor, assistant keeper of antiquities at the British Museum, supports the research. He recently provided
Metcalfe and colleagues Jackie Campbell and Jude Seath access to all of the medicinal plant evidence in the
museum's collection.
Taylor believes the ancient Egyptians mixed their medical knowledge with spiritual healing techniques, such as
incantations and rituals.
Metcalfe agreed, and said the Egyptians often prayed for healing, although they believed the gods were not
always on their side.
"Some illnesses were thought to be the result of evil spirits or a god's displeasure," Metcalfe explained "and in
these cases it may have seemed more sensible to use magic-religious techniques to treat the patient."
In addition to revealing information about Egypt's past, the researchers hope to preserve the biodiversity of the
country and surrounding region by identifying useful native plants and promoting their growth in the area.
Tooth implant 'to release drugs'
Forgetting to take medicine may be a thing of the past as researchers close in on creating an artificial tooth
which automatically releases medicine.
The Intellidrug device is small enough to fit inside two artificial molars in the jaw,
the Engineer journal said.
European Commission researchers also believe it will benefit patients, such as
those with diabetes and high blood pressure, who need doses in the night.
If human trials prove successful, the device could be available in 2010.
Dr Thomas Velten, from the Fraunhofer Institute for Biomedical Technology in
Germany, one of the 15 research bodies involved in the project, said: "It is
important for some conditions that there is a constant level of drug in the blood.
"With this system, we can time the dosage to take place - even when the
patient is sleeping.
"We can easily adjust the dosage in line with the patient's needs, dependent on sex or weight."
Intellidrug works by holding the drug in tablet form in a reservoir. The implant is held in place through a
combination of clips and dental fixative.
Reservoir
Once in place, saliva in the mouth enters the reservoir via a membrane and dissolves the solid drug, forming a
solution. When the system is triggered by the electrical timing mechanism, a valve opens and allows a controlled
amount of the solution to flow into the mouth where it is absorbed into the body.
The device is fitted with two sensors. The first is a fill-level sensor that measures the concentration of the drug in
the reservoir. It alerts the patient when the concentration of the drug falls below a certain level. At the moment
enough medication can be contained for up to two weeks. The second sensor monitors how much drug solution
has been administered and a remote control allows the doctor to increase the dose of medication if necessary.
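To make the described behaviour easier to follow, the sketch below restates the dosing logic in code: a timer-triggered valve, a fill-level sensor that warns when the reservoir runs low, and a remotely adjustable dose. It is purely illustrative; every name and number is a hypothetical stand-in, not taken from the actual device.

# Illustrative sketch of the Intellidrug dosing logic as described above.
# All names and values are hypothetical; nothing here comes from the real firmware.
from dataclasses import dataclass

@dataclass
class DosingSketch:
    dose_ml: float = 0.05        # solution released per scheduled dose (hypothetical)
    reservoir_ml: float = 0.18   # solution currently in the reservoir (hypothetical)
    low_level_ml: float = 0.10   # below this, the fill-level sensor alerts the patient

    def set_dose(self, dose_ml: float) -> None:
        # Remote adjustment of the per-dose volume, e.g. for the patient's sex or weight.
        self.dose_ml = dose_ml

    def on_timer(self) -> str:
        # Called by the electrical timing mechanism at each scheduled dose.
        if self.reservoir_ml < self.dose_ml:
            return "reservoir empty - no dose released"
        self.reservoir_ml -= self.dose_ml   # valve opens, solution flows into the mouth
        if self.reservoir_ml < self.low_level_ml:
            return "dose released - refill warning"
        return "dose released"

device = DosingSketch()
print([device.on_timer() for _ in range(4)])
# ['dose released', 'dose released - refill warning',
#  'dose released - refill warning', 'reservoir empty - no dose released']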
Matt Griffiths, prescribing and medicines management adviser at the Royal College of Nursing, said: "Cost is an issue
as to whether this would become widely available, but there is a cost benefit to improving medicines concordance.
"About 50% of people with chronic conditions do not take their medicines correctly and that in turn costs the
health service money."
Scientists probe 'hole in Earth'
Scientists are to sail to the mid-Atlantic to examine a massive "open wound" on the Earth's surface.
Dr Chris MacLeod, from Cardiff University, said the Earth's crust appeared to be
completely missing across an area thousands of square kilometres in extent. The hole in the crust
is midway between the Cape Verde Islands and the Caribbean, on the Mid-Atlantic
Ridge.
The team will survey the area, up to 5km (3 miles) under the surface, from ocean
research vessel RRS James Cook. The ship is on its inaugural voyage after being
named in February.
Dr MacLeod said the hole in the Earth's crust was not unique, but was recognised
as one of the most significant. He said it was an "open wound on the surface of the
Earth", where the oceanic crust, usually 6-7km thick (3.7-4.3 miles), was simply not there.
"Usually the plates are pulled apart and to fill the gap the mantle underneath has to rise up. As it comes up it
starts to melt. That forms the magma," he said. "That's the normal process. Here it has gone awry for some
reason. "The crust does not seem to be repairing itself."
Dr MacLeod said the research could lead to a "new way of understanding" the process of plate tectonics.
The scientist will test theories he developed after visiting the area in 2001 - including the possibility the missing
crust was caused by a "detachment fracture".
"Effectively it's a huge rupture - one side is being pulled away from the other. It's created a rupture so big it's
actually pulled the entire crust away. We also think the mantle did not melt as much as usual and that the normal
amount of mantle was not produced."
As a result, the mantle is exposed to seawater, creating a rock called serpentinite.
The survey voyage, costing $1m (£510,000), will be led by marine geophysicist Professor Roger Searle, from
Durham University. Dr Bramley Murton, from the National Oceanography Centre, Southampton, is the third expert
taking part.
They will set sail from Tenerife on Monday and return in April.
The team intends to use sonar to build up an image of the seafloor and then take rock cores using a robotic
seabed drill developed by the British Geological Survey in conjunction with Dr MacLeod.
The progress of the voyage can be followed online.
Nectar is not a simple soft drink
The sugar-containing nectar secreted by plants and consumed by pollinators shares a number of similarities with
fitness drinks, including ingredients such as amino acids and vitamins. In addition to these components, nectar can
also contain secondary metabolites such as the alkaloid nicotine and other toxic compounds. Scientists Danny
Kessler and Ian Baldwin from the Max Planck Institute for Chemical Ecology in Jena, Germany, recently addressed
the question, why would plants risk poisoning the insects and birds that provide pollination services? Their findings
have been published in The Plant Journal.
Kessler and Baldwin examined the nectar of a wild tobacco species, Nicotiana attenuata, and discovered that it is
flavoured with 35 secondary compounds. The researchers then tested 16 of these in cafeteria-style bioassays with
three groups of native visitors - hawkmoths, hummingbirds (both pollinators) and ants ('nectar thieves'). Some
compounds were attractive and others were not. Certain nectar blends seem to increase a flower's chances of being
visited by useful pollinators while discouraging nectar thieves.
Nicotine, the most abundant repellent found, affected both pollinators and nectar thieves in the same way. The
visitors removed less nectar per visit when nicotine was present. To determine if nicotine was repellent in the real
world, the researchers genetically transformed N. attenuata plants to create nicotine-free plants, which were
planted into a natural population and nectar removal rates were measured. Native floral visitors removed much
more nectar from the plants that had no nicotine than from the normal nicotine-containing plants. Why would a
plant produce nectar that repels pollinators? Data from the bioassays provided a hypothesis: when nectar contains
nicotine, the amount of nectar consumed per visit decreases but the number of visitations increases. Increasing the
number of visitors might increase the genetic diversity of the offspring produced. The researchers are planning to
test this hypothesis in the upcoming field season.
Dissecting the function of this secret formula of nectar, thought to be nature's soft drink, has instead shown it to
be quite 'hard'.
Unique Tomatoes Tops in Disease-Fighting Antioxidants
COLUMBUS, Ohio – Deep red tomatoes get their rich color from lycopene, a disease-fighting antioxidant. A new study,
however, suggests that a special variety of orange-colored tomatoes provides a different form of lycopene, one that
our bodies may more readily use.
Researchers found that eating spaghetti covered in sauce made from these orange tomatoes, called Tangerine
tomatoes, caused a noticeable boost in this form of lycopene in participants' blood.
“While red tomatoes contain far more lycopene than orange tomatoes, most of it is in a form that the body doesn't
absorb well,” said Steven Schwartz, the study's lead author and a professor of food science and technology at Ohio
State University.
“The people in the study actually consumed less lycopene when they ate sauce made from the orange tomatoes,
but they absorbed far more lycopene than they would have if it had come from red tomatoes,” he said. “That's what
is so dramatic about it.”
The tomatoes used for this work were developed specifically for the study – these particular varieties aren't
readily available in grocery stores. The researchers suggest that interested consumers seek out orange- and gold-colored heirloom tomatoes as an alternative to Tangerine tomatoes, but caution that they haven't tested how much
or what kind of lycopene these varieties contain.
Lycopene belongs to a family of antioxidants called the carotenoids, which give certain fruits and vegetables their
distinctive colors. Carotenoids are thought to have a number of health benefits, such as reducing the risk of
developing cancer, cardiovascular disease and macular degeneration.
“The tomato is a wonderful biosynthetic factory for carotenoids, and scientists are working on ways to enhance
the fruit's antioxidant content and composition,” Schwartz continued.
The findings appear in a recent issue of the Journal of Agricultural and Food Chemistry.
Lycopene is a carotenoid that contains a variety of related compounds called isomers. Isomers share the same
chemical formula, yet differ in chemical structure. In the case of tomatoes, the different lycopene isomers play a
part in determining the color of the fruit.
Several years ago, Schwartz and his colleagues discovered the abundance of several of these isomers, called cis-lycopenes, in human blood. But most of the tomatoes and tomato-based products we currently consume are rich in
all-trans-lycopene.
“We don't know why our bodies seem to transform lycopene into cis-isomers, or if some isomers are more
beneficial than others,” Schwartz said.
The researchers don't know if tomatoes rich in cis-lycopene would provide greater health benefits to humans, but
the study's results suggest that tomatoes can be used to increase both the intake and absorption of the health-beneficial compounds.
The researchers made spaghetti sauce from two tomato varieties – tangerine tomatoes, which get their name
from their orange skin and are high in cis-lycopene, and a tomato variety chosen for its rich beta carotene content.
The tomatoes were grown at an Ohio State-affiliated agricultural research station in northwestern Ohio. Following
harvest, both tomato varieties were immediately processed into canned tomato juice and concentrated. Italian
seasoning was added for taste.
The 12 adults participating in the study ate two spaghetti test meals – one included sauce made from tangerine
tomatoes, while the other featured sauce made from the tomatoes high in beta carotene. The participants were
asked to avoid tomato and beta carotene-rich foods for 13 days before eating each test meal.
Researchers drew blood right before each participant ate and again every hour or two up to 10 hours after the
meal. They analyzed the blood samples for lycopene and beta carotene content.
Lycopene absorption from the tangerine tomatoes was 2.5 times higher than that absorbed from the beta
carotene-rich tomatoes and, Schwartz said, from typical red tomato varieties. Cis-lycopene levels spiked around five
hours after eating the tangerine tomato sauce, and at this point during absorption the levels were some 200 times
greater than those of trans-lycopene, which were nearly non-existent. While cis-lycopene is by far the most
abundant isomer in these tomatoes, they do contain trace amounts of trans-lycopene.
The participants' bodies also readily absorbed beta carotene from the beta carotene-rich tomatoes.
“Right now, only carrots and sweet potatoes are a more readily available, richer source of beta carotene,”
Schwartz said. “And this carotenoid is a major source of vitamin A for a large proportion of the world's population.
Its deficiency is a serious health problem in many developing countries.
“Our study showed that a tomato can also increase beta carotene levels in the blood,” Schwartz said. While these
special tomatoes were grown just for this study, the researchers have pre-commercial lines of both varieties
available.
He conducted the study with Ohio State colleagues David Francis, an associate professor of horticulture and crop
science; Steven Clinton, an associate professor of hematology and oncology and human nutrition; Nuray Unlu, a former
postdoctoral researcher in food science; and Torsten Bohn, a former postdoctoral fellow in food science at Ohio State.
Funding for this work was provided by the Ohio Agricultural Research and Development Center in Wooster; the U.S.
Department of Agriculture's IFAFS program; the National Center of Research Resources of the National Institutes of Health; and
the National Cancer Institute.
Brain maps online
February 27th, 2007 @ 1:33 pm by Andy
Digital atlases of the brains of humans, monkeys, dogs, cats, mice, birds and other animals have been created
and posted online by researchers at the UC Davis Center for Neuroscience.
BrainMaps.org features the highest resolution whole-brain atlases ever constructed, with over 50 terabytes of
brain image data directly accessible online. Users can explore the brains of humans and a variety of other species at
an unprecedented level of detail, from a broad view of the brain to the fine details of nerves and connections. The
website also includes a suite of free, downloadable tools for navigating and analyzing brain data.
“Many users have described it as a ‘Google Maps’ of the brain,” said Shawn Mikula, a postdoctoral researcher at
UC Davis who is first author on a paper describing the work.
The high-resolution maps will enable researchers to use “virtual microscopy” to compare healthy brains with
others, looking at structure, gene expression and the distribution of different proteins. They will enable better
understanding of the organization of normal brains, and could help researchers in identifying fine morphological and
chemical abnormalities underlying Alzheimer’s, Parkinson’s and other neurological diseases, Mikula said.
To make the maps, the researchers started with sections of brain mounted on microscope slides.
Those slides were scanned to create image files or “virtual slides,” and assembled like tiles into composite images.
The maps have a resolution of better than half a micrometer per pixel, or 55,000 dots per inch, with virtual slides
approaching 30 gigabytes in size each.
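Those two figures are easy to sanity-check with a little arithmetic. The quick calculation below is not from the paper; the pixel size and slide dimensions are assumed values chosen only to show how the numbers relate.

# Rough sanity check of the resolution and file-size figures quoted above.
# The pixel size and slide dimensions are illustrative assumptions, not from the paper.
UM_PER_INCH = 25_400                      # micrometres per inch

pixel_um = 0.46                           # "better than half a micrometre per pixel" (assumed)
dpi = UM_PER_INCH / pixel_um
print(f"{pixel_um} um/pixel is about {dpi:,.0f} dots per inch")   # ~55,000 dpi

# A hypothetical 5 cm x 4 cm brain section scanned at that resolution:
width_px  = 50_000 / pixel_um             # 5 cm = 50,000 um
height_px = 40_000 / pixel_um             # 4 cm = 40,000 um
raw_gb = width_px * height_px * 3 / 1e9   # 3 bytes per pixel (24-bit colour), uncompressed
print(f"~{width_px * height_px / 1e9:.1f} gigapixels, ~{raw_gb:.0f} GB uncompressed")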
The paper is published in the March edition of the journal NeuroImage. The other authors on the paper are Issac
Trotts and James Stone, both researchers at the Center for Neuroscience, and Edward (Ted) Jones, director of the
center and a professor of psychiatry at UC Davis. The work was funded by the National Institutes of Health.
Manchester physicists pioneer new super-thin technology
Researchers have used the world's thinnest material to create a new type of technology, which could be used to
make super-fast electronic components and speed up the development of drugs.
Physicists at The University of Manchester and The Max-Planck Institute in Germany have created a new kind of
membrane that is only one atom thick.
It's believed this super-small structure can be used to sieve gases, make ultra-fast electronic switches and image
individual molecules with unprecedented accuracy.
The findings of the research team are published today (Thursday 1 March 2007) in the journal Nature.
Two years ago, scientists discovered a new class of materials that can be viewed as individual atomic planes
pulled out of bulk crystals.
These one-atom-thick materials and in particular graphene – a gauze of carbon atoms resembling chicken wire –
have rapidly become one of the hottest topics in physics.
However, it has remained doubtful whether such materials can exist in the free state, without being placed on
top of other materials.
Now an international research team, led by Dr Jannik Meyer of The Max-Planck Institute in Germany and
Professor Andre Geim of The University of Manchester, has managed to make free-hanging graphene.
The team used a combination of microfabrication techniques employed, for example, in the manufacture of
microprocessors.
A metallic scaffold was placed on top of a sheet of graphene, which was placed on a silicon chip. The chip was
then dissolved in acids, leaving the graphene hanging freely in air or a vacuum from the scaffold.
The resulting membranes are the thinnest material possible and maintain a remarkably high quality.
Professor Geim – who works in the School of Physics and Astronomy at The University of Manchester – and his
fellow researchers have also found the reason for the stability of such atomically-thin materials, which were
previously presumed to be impossible.
They report that graphene is not perfectly flat but instead gently crumpled out of plane, which helps stabilise
otherwise intrinsically unstable ultra-thin matter.
Professor Geim and his colleagues believe that the membranes they have created can be used like sieves, to filter
light gases through the atomic mesh of the chicken wire structure, or to make miniature electro-mechanical
switches.
It's also thought it may be possible to use them as a non-obscuring support for electron microscopy to study
individual molecules.
This has significant implications for the development of medical drugs, as it will potentially allow the rapid
analysis of the atomic structures of bio-active complex molecules.
"This is a completely new type of technology – even nanotechnology is not the right word to describe these new
membranes," said Professor Geim.
"We have made proof-of-concept devices and believe the technology transfer to other areas should be
straightforward. However, the real challenge is to make such membranes cheap and readily available for large-scale
applications."
University of Nevada scientists gauge earthquake hazards through study of precariously
balanced rocks
Research by Nevada professors pinpoints certain seismic hazards, past and present, through California
rock formations
RENO, Nev. – A seismological research team from the University of Nevada, Reno is finding ways to make precariously
balanced rocks talk. In so doing, they are unlocking valuable scientific information in assessing seismic hazards in
areas throughout the West.
Their findings are shared in the January-February issue of American Scientist magazine. Scientists believe that
zones of precarious rocks – rocks that have come close but haven't tipped over in the wake of a major seismic event
– provide important information about seismic risk, its magnitude and its frequency. For a look at the article, click
on: http://www.americanscientist.org/template/AssetDetail/assetid/54437;jsessionid=baa9...?fulltext=true#54485
"There's really no long-term data to test seismic hazards other than precarious rocks," said Matthew Purvance, a
postdoctoral scholar in geophysics at the University, who authored the article along with James Brune, professor in
the Department of Geological Sciences and past director of the Nevada Seismological Laboratory, and Rasool
Anooshehpoor, research professor in the Nevada Seismological Laboratory.
"By studying precariously balanced rocks, it can serve as an indicator that an earthquake of a sufficient size to
topple a tippy rock has not occurred … at least for a very long time. We think this is a fundamental story that gives
fundamental information on seismic hazards that has never been done before."
The data from the study is important, as it not only tests ground-motion probability, but can help further refine
United States Geological Survey hazard methodologies that are used by engineers to formulate building codes.
Purvance explained that seismologists and engineers since the late 1960s have increasingly followed a method
known as probabilistic seismic-hazard analysis in trying to get a firmer grasp on earthquake probability. This
analysis allows researchers to determine the number and magnitude of earthquakes on relevant faults. The study of
precarious rocks, which act as "witnesses" to strong seismic events throughout history, has provided scientists an
important research window to test the predictions of probability, Purvance said.
The team tested massive rocks of up to 1,000 pounds and more than 10,000 years old, measuring the force and
angle it would take to tip them over. One of the more interesting aspects of the study was a technique used by
Anooshehpoor, which measured the restoring force that has allowed the rock to remain upright through centuries of
wear and the force of past strong seismic events.
Anooshehpoor's technique allowed the team to measure a tipping boulder's restoring force with a digital load cell
and the rock's tilt with an inclinometer. The work wasn't easy. By pushing and pulling on the massive, bus-sized
rocks with a series of wire cables, nylon straps, chains, pulleys, winches, hydraulic pistons, ground anchors and 4 by
4 blocks of wood, the team was able to record data for precarious rocks that had never been tested before.
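The quasi-static idea behind such a pull test can be sketched in a few lines: a rock starts to tip once the horizontal ground acceleration exceeds roughly g·tan(α), where α is the angle from vertical between the rocking point and the rock's centre of mass, and a horizontal pull applied at centre-of-mass height gives tan(α) as the ratio of the pull force to the rock's weight at the point of tipping. The numbers below are invented for illustration, and this is a deliberately simplified picture of the team's analysis.

# Simplified quasi-static toppling estimate for a precariously balanced rock.
# Values are invented for illustration; the real field analysis is far more detailed.
import math

g = 9.81  # m/s^2

def alpha_from_pull_test(pull_force_n: float, weight_n: float) -> float:
    # At the point of tipping under a horizontal pull at centre-of-mass height,
    # tan(alpha) ~ F_pull / W, where alpha is measured from the vertical.
    return math.degrees(math.atan(pull_force_n / weight_n))

def toppling_acceleration(alpha_deg: float) -> float:
    # Quasi-static horizontal acceleration needed to start the rock rocking.
    return g * math.tan(math.radians(alpha_deg))

weight_n = 450 * g          # hypothetical 450 kg (~1,000 lb) boulder
pull_n   = 0.25 * weight_n  # pull recorded at the onset of tipping (assumed)

alpha = alpha_from_pull_test(pull_n, weight_n)
print(f"alpha ~ {alpha:.1f} degrees; toppling acceleration ~ {toppling_acceleration(alpha)/g:.2f} g")
# alpha ~ 14.0 degrees; toppling acceleration ~ 0.25 g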
"It gives us very useful information about the precarious rocks and further adds to the knowledge of gauging
earthquake hazards," Purvance said, noting that it was work by Brune in the early 1990s with precarious rocks in
southern California that led to the rocks becoming more widely recognized as an accurate barometer of seismic
force and occurrence. "These measurements help better explain the story of how the rock has managed to
withstand some of the forces of time and nature."
Added Anooshehpoor: "The rocks that we have studied are from large earthquakes and are so rare. If
throughout history the world had tons of instruments and recorded many of these earthquakes, we probably
wouldn't have the need to study precarious rocks. The lack of data has been a major problem in estimating ground
motion. With this study, we've been provided with another opportunity to give the engineers the right information
they need."
Research on the color red shows definite impact on achievement
The color red can affect how people function: Red means danger and commands us to stop in traffic.
Researchers at the University of Rochester have now found that red also can keep us from performing our best on
tests.
If test takers are aware of even a hint of red, performance on a test will be affected to a significant degree, say
researchers at Rochester and the University of Munich. The researchers’ article in the February issue of the Journal
of Experimental Psychology: General on the effect of red on intellectual performance reveals that color associations
are so strong and embedded so deeply that people are predisposed to certain reactions when they see red.
Andrew J. Elliot, lead author and professor of psychology at the University of Rochester, and his co-authors found
that when people see even a flash of red before being tested, they associate the color with mistakes and failures. In
turn, they do poorly on the test. Red, of course, is traditionally associated with marking errors on school papers.
"Color clearly has aesthetic value, but it can also carry specific meaning and convey specific information," says
Elliot. "Our study of avoidance motivation is part and parcel of that."
Four experiments demonstrated that the brief perception of red prior to an important test—such as an IQ test or
a major exam—actually impaired performance. Two further experiments also established the link between red and
avoidance motivation when task choice and psychophysiological measures were applied.
The findings show that "care must be taken in how red is used in achievement contexts," the researchers
reported, "and illustrate how color can act as a subtle environmental cue that has important influences on
behavior."
Elliot and his colleagues didn’t use just any color of red. He assessed the colors using guidelines for hue,
saturation, and brightness, and purchased a high-quality printer and a spectrophotometer for the research. He was
stunned to learn that results from earlier work on color psychology by others didn’t control for saturation and
brightness.
The article’s hypothesis is based on the idea that color can evoke motivation and have an effect without the
subject being aware of it. "It leads people to do worse without their knowledge," says Elliot, when it comes to
academic achievement. In one of the six tests given, for example, people were allowed a choice of questions to
answer. Most of them chose to answer the easiest question, a classic example of how to avoid failure.
The researchers believe that "color carries different meanings in different contexts." If the context changes, the
implications do, too. Elliot’s next study will focus on physical attractiveness.
Green tea and COX-2 inhibitors combine to slow growth of prostate cancer
PHILADELPHIA -- Drinking a nice warm cup of green tea has long been touted for its healthful benefits, both real and
anecdotal. But now researchers have found that a component of green tea, combined with low doses of a COX-2
inhibitor, could slow the spread of human prostate cancer.
In the March 1 issue of Clinical Cancer Research, researchers from University of Wisconsin-Madison demonstrate
that low doses of the COX-2 inhibitor celecoxib, administered with a green tea polyphenol called epigallocatechin-3-gallate (EGCG), can slow the growth of human prostate cancer. Their experiments were performed in cell cultures
and in a mouse model for the disease.
“Celecoxib and green tea have a synergistic effect -- each triggering cellular pathways that, combined, are more
powerful than either agent alone,” said Hasan Mukhtar, Ph.D., professor of dermatology at the University of
Wisconsin and member of Wisconsin’s Paul Carbone Comprehensive Cancer Center. “We hope that a clinical trial
could lead to a preventative treatment as simple as tea time.”
Previous research has linked the cyclooxygenase-2 enzyme, commonly known as COX-2, to many cancer types,
including prostate cancer, said Mukhtar. Mukhtar and his colleagues have previously shown COX-2 inhibitors like
celecoxib (known under the brand name Celebrex™) suppress prostate cancer in animal models. COX-2 inhibitors
also have been shown to cause adverse cardiovascular effects when administered at high doses over long durations.
In 2004, Mukhtar and his colleagues demonstrated that green tea polyphenol EGCG has cancer-fighting abilities
of its own. Their study, published in Cancer Research, showed that EGCG can modulate the insulin-like growth
factor-1 (IGF-1)-driven molecular pathway in a mouse model for human prostate cancer, pushing the cells toward
programmed cell death (apoptosis).
“We believed that COX-2 inhibitors may still prove beneficial if used in combination with complementary agents,”
Mukhtar said. “Our studies showed that the additive effect of green tea enables us to utilize the cancer-fighting
abilities of COX-2 inhibitors, but at lower, safer doses.”
In this latest research, Mukhtar and his colleagues looked at the effects of the two substances on cultured human
prostate cancer cells. Alone, both EGCG and NS-398, a COX-2 inhibitor similar to celecoxib, demonstrated the ability
to slow cancer cell growth and limit the presence of known cancer-promoting proteins within the cell samples.
Together, EGCG and NS-398 suppressed cell growth by an additional 15 to 28 percent.
The researchers repeated the experiment in mouse models of prostate cancer, using celecoxib and an oral
suspension of the decaffeinated green tea polyphenol. By using pharmacy-grade celecoxib and actual tea, they had
hoped to replicate real-life conditions. “The idea is that it would be easier to get people to drink green tea than it
would be to take an additional dietary supplement,” Mukhtar said.
In mice that were not treated with either substance, the tumor volume averaged 1,300 cubic millimeters,
whereas mice given either the tea or celecoxib had tumors averaging 835 cubic millimeters and 650 cubic
millimeters, respectively. Tumors taken from mice given both agents, however, measured on average a volume of
350 cubic millimeters.
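Put in relative terms, those averages correspond to roughly a 36 percent reduction for green tea alone, 50 percent for celecoxib alone and 73 percent for the combination, as the short calculation below (simple arithmetic on the reported figures) shows.

# Percent reduction in mean tumour volume versus untreated controls,
# computed directly from the averages reported above.
control_mm3 = 1300
groups = {"green tea alone": 835, "celecoxib alone": 650, "combination": 350}

for name, volume in groups.items():
    reduction = 100 * (control_mm3 - volume) / control_mm3
    print(f"{name}: {volume} mm^3, ~{reduction:.0f}% smaller than control")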
In parallel to tumor growth inhibition, mice that received a combination of green tea and celecoxib registered a
greater decrease in prostate-specific antigen (PSA) levels than animals treated with celecoxib alone or green tea
alone. PSA is a protein produced by the cells of the prostate and is used as a marker for detection and
progression of prostate cancer. These results, combined with a marked decrease in the presence of cancer-promoting proteins, offered clear indications that green tea and celecoxib, combined, could be useful in slowing
prostate cancer growth, Mukhtar said.
“Prostate cancer typically arises from more than one defect in the cellular mechanics, which means that a single
therapeutic might not work fighting a particular cancer long-term,” Mukhtar said. “If tests in human trials replicate
these results, we could see a powerful combined therapy that is both simple to administer and relatively cost
effective.”
Researchers wake up viruses inside tumors to image and then destroy cancers
PHILADELPHIA -- Researchers have found a way to activate Epstein-Barr viruses inside tumors in order to identify
patients whose infection can then be manipulated to destroy their tumors. They say this strategy could offer a novel
way of treating many cancers associated with Epstein-Barr, including at least four different types of lymphoma and
nasopharyngeal and gastric cancers.
In the March 1 issue of Clinical Cancer Research, a team of radiologists and oncologists from Johns Hopkins Medical
Institutions describe how they used two agents already on the market – one of which is the multiple myeloma drug
Velcade – to light up tumor viruses on a gamma camera. The technique is the first in the new field of in vivo
molecular-genetic imaging that doesn't require transfecting tumors with a "reporter" gene, the scientists say.
"The beauty of this is that you don't have to introduce any reporter genes into the tumor because they are
already there," says radiologist Martin G. Pomper, M.D., Ph.D. "This is the only example we know of where it is
possible to image activated endogenous gene expression without having to transfect cells."
A variety of blood and solid cancers are more likely to occur in people who have been infected with the Epstein-Barr virus (EBV), but not everyone with these cancers has such infections. For those who do, researchers, such as
Hopkins oncologist and co-author Richard F. Ambinder, M.D., Ph.D., have been working on ways to activate the
reproductive, or "lytic" cycle, within the virus to make it replicate within the tumor cell. When enough viral particles
are produced, the tumor will burst, releasing the virus. In animal experiments, this experimental therapy, called lytic
induction therapy, results in tumor death.
As the first step in this study, researchers screened a wide variety of drugs to see if any of them could reawaken
the virus. They were fortunate in that one of the genes that is expressed upon viral lytic induction is EBV's
thymidine kinase (EBV-TK), an enzyme that helps the virus begin to reproduce. This kinase is of interest because
researchers know its "sister" kinase, the one produced by the herpes simplex virus, can be imaged by an injected
radiolabeled chemical (FIAU), which can then be imaged using a gamma camera.
"To perform molecular-genetic imaging, we have always had to infect cells with active herpes simplex virus so
that they can replicate, express TK, and only then could we use the FIAU tracer to make the cells light up," Pomper
says. "So we were hoping to find a way to turn latent Epstein-Barr virus on in these cancers, and use the thymidine
kinase it then produces to enable us to see the virus-associated tumors with radiolabeled FIAU."
The researchers screened 2,700 agents until they hit upon Velcade, a targeted chemotherapy drug already
approved for use in multiple myeloma. "We were both surprised and lucky," he says. "Velcade is a proteasome
inhibitor, but it also induces the lytic cycle thereby activating the TK in the Epstein-Barr virus. Once the TK is
activated, we can image the tumors."
To test their findings, the researchers used mice carrying human Burkitt's lymphoma, a cancer often associated
with Epstein-Barr viral infection. Tumors glowed in mice given Velcade followed by an injection of FIAU, but not in
mice that weren't given Velcade. Mice whose Burkitt's lymphoma did not contain Epstein-Barr virus also did not
respond to either Velcade or FIAU, according to researchers.
"Velcade woke up the virus in the tumors, which increased viral load by 12-fold, all the while cranking out TK,"
Pomper says. "An injection of FIAU made it easy to image the tumors with virus in them."
The method is highly sensitive, he says: as few as five percent of the cells within the tumor mass needed to be
induced into the lytic cycle in order to be detected.
Not only can FIAU light up the tumors, it can also potentially kill them, Pomper says. For imaging purposes, FIAU
can carry a radionuclide that emits a low energy gamma photon, but it can also be engineered to carry therapeutic
radionuclides, which are lethal to cells in which TK is activated.
Results of this study suggest that this strategy could be applied to other viruses associated with tumors, and
that other drugs may potentially be used to activate these viruses, Pomper says. "Velcade is only one of an array of
new, as well as older agents, that can induce lytic infection, and a particular agent could be tailored for use in a
specific patient through imaging," he says.
Sweat may pass on hepatitis B in contact sports
Sweat may be another way to pass on hepatitis B infection during contact sports, suggests research published
ahead of print in the British Journal of Sports Medicine.
Hepatitis B virus attacks the liver and can cause lifelong infection, cirrhosis (scarring) of the liver, liver cancer,
liver failure, and death.
The research team analysed blood and sweat samples from 70 male Olympic wrestlers for evidence of hepatitis B
virus (HBV) infection.
The wrestlers, who were all aged between 18 and 30, were all asked about injuries, as blood-borne infection is a
common route of transmission.
Over a third said they had had bleeding or weeping wounds during training and competition. And almost half said
that they had had an episode of bleeding during other activities.
None of the wrestlers had active HBV infection, as evidenced by a lack of antibodies to the virus.
Nevertheless, the virus itself was found in the blood of nine (13%), suggesting that they had hidden or occult
infection, says the author. This is perfectly plausible, given that intense training temporarily suppresses a normal
immune response, she says.
Eight (11%) also had particles of the virus present in their sweat, and levels of the virus found in the blood
closely matched those found in the sweat.
The findings prompt the author to suggest that sweat, like open wounds and mucous membranes, could be
another way of transmitting the infection.
Some sporting bodies have ruled that HIV testing should be mandatory for all contact sport competitors, but no
such recommendations have been made for HBV, says the author.
Yet HBV is far more transmissible, because much higher levels of the virus are found in the blood and it is not as
fragile as HIV, she says, calling for HBV testing and vaccination for all wrestlers at the start of their career.
Universal rules needed for medics responding to calls for help in public
Universal rules are needed for doctors playing the "Good Samaritan" to members of the public who fall ill outside
hospital, says an experienced medic.
Dr Rubin is a paediatrician by training, who has responded to some two dozen pleas over the past 25 years to
help a member of the public who sustained injuries or became critically ill.
Doctors may not always be suitably qualified to take on all manner of public emergencies, he suggests. But no
one ever calls out: "Is there a paramedic in the house?" he says.
Many specialists are not used to dealing with the kinds of emergencies that occur on the street. A person in a
road traffic collision would be better served by a lifeguard than a dermatologist long out of basic training, for
example.
Aside from the possibility of mouth to mouth resuscitation, which, research shows, deters some doctors from
responding to calls for medical assistance in public places, legal implications increasingly play a part in their
discomfort at getting involved, he suggests.
The ‘Good Samaritan’ law, which supposedly affords doctors legal protection from subsequently disgruntled
patients and their families "is not exactly airtight," says Dr Rubin.
It does not guarantee that someone will not sue, nor does it provide immunity from malpractice claims, he says.
Yet calls for public help are likely to become more frequent, given increasingly ageing populations and higher
rates of chronic illness, he says.
"Are we comfortable relying on random physician expertise, availability, and willingness to meet our emergency
needs in the air or on the ground?" he asks. "It is time to figure this out," he concludes.
Using morphine to hasten death is a myth, says doctor
Letter: Double effect is a myth leading a double life, BMJ Volume 334, p440
Using morphine to end a person's life is a myth, argues a senior doctor in a letter to this week's BMJ.
It follows the case of Kelly Taylor, a terminally ill woman who went to court earlier this month for the right to be
sedated into unconsciousness by morphine, even though it will hasten her death.
Mrs Taylor's request to use morphine to make her unconscious under the principle of double effect is a puzzling
choice, writes Claud Regnard, a consultant in palliative care medicine. The principle of double effect allows a doctor
to administer treatment that hastens death, providing the intention is to relieve pain rather than to kill.
Evidence over the past 20 years has repeatedly shown that, used correctly, morphine is well tolerated and does
not shorten life or hasten death, he explains. Its sedative effects wear off quickly (making it useless if you want to
stay unconscious), toxic doses can cause distressing agitation (which is why such doses are never used in palliative
care), and it has a wide therapeutic range (making death unlikely).
The Dutch know this and hardly ever use morphine for euthanasia, he writes.
Palliative care specialists are not faced with the dilemma of controlling severe pain at the risk of killing the patient - they manage pain with drugs and doses adjusted to each individual patient, while at the same time helping to relieve fear, depression and spiritual distress, he adds.
And he warns that doctors who act precipitously with high, often intravenous, doses of opioids are being misled
into bad practice by the continuing promotion of double effect as a real and essential phenomenon in end of life care.
Using double effect as a justification for patient assisted suicide and euthanasia is not tenable in evidence-based
medicine, he says. In end of life care, double effect is a myth leading a double life.
Murder and the Operations
Veteran Analyst Looks Back on Innovations in Studying Criminal Justice in America
Hanover, MD – The criminal justice system, often the subject of political controversy, gains major insights from the
unbiased analytical tools that operations researchers introduced beginning with the President’s Crime Commission in
the 1960s, according to a career retrospective by the winner of the Stockholm Prize in Criminology.
The paper, “An O.R. Missionary’s Visits to the Criminal Justice System,” by Alfred Blumstein, appears in the February
issue of Operations Research, the flagship journal of the Institute for Operations Research and the Management Sciences
(INFORMS®).
“By bringing their analytical skills and system perspectives and without being constrained by the traditional
presumptions that permeate all fields—perhaps to an extreme in criminal justice because of the strong ideological
perspectives that pervade it—operations researchers bring new insights, new questions, and new challenges,” writes
Professor Blumstein of the Heinz School at Carnegie Mellon University.
Professor Blumstein, a pioneer in operations research, has been named a recipient of the prestigious 2007
Stockholm Prize in Criminology for his research into how criminals’ activities vary over the course of their criminal
careers.
Operations research, says the professor, has changed the way that government and experts view the spike in
murder and drug-related crimes in the nineties, the jump in imprisonment rate that began with the introduction of
mandatory minimum sentencing, and the extent that removing criminals from the streets really helps prevent crime.
In blunt remarks, Prof. Blumstein characterizes the criminal justice system as primitive for its continued slowness
to adopt techniques of quantitative modeling, system perspective, and planning that are used in other policy areas.
Professor Blumstein and colleagues have brought operations research to
* analyzing the counterproductive effects of arresting older drug dealers, who were replaced by younger, more
violent offenders
* assessing how much the imprisonment of criminals prevents crime
* reviewing trends in incarceration and factors contributing to those trends
* examining the interaction of incarceration and drug markets
Prof. Blumstein startlingly observes that crime-fighting efforts aimed at deterring drug use in the 1980s and
1990s actually spurred a rise in murder and drug-related crime. He determined that during the crack cocaine
epidemic, imprisoning less violent drug dealers in their twenties led to the recruitment of younger teenage boys,
who are more prone to resolve arguments with violence. These teens began obtaining handguns for self-defense
and that stimulated others to get their own guns for their own defense and to achieve status among their peers. As
a result, he observes, crime rates for this age category soared. Murder and drug arrests dropped in the mid-1990s,
but no thanks to law enforcement, he maintains. Instead, the crime rate fell precipitously when people in drug-ridden areas realized how badly crack cocaine was damaging their parents and older siblings and turned away from
the drug. A reduced need for teenage drug sellers coincided with a robust economy, so these young people could
leave the underground economy for regular jobs.
Professor Blumstein’s paper also looks at the way that operations researchers have shed new light on the path and length of criminal careers, as well as the success of imprisonment in “incapacitating” offenders by averting the crimes they would otherwise have committed while in prison. He outlines a controversial debate among criminologists about the ability to
identify criminals who are more likely to commit crimes, and whether these criminals’ sentences should be
lengthened based on a projection of crimes they might commit in the future.
Prof. Blumstein’s paper also looks at a jump in imprisonment since the 1970s from 110 per 100,000 to 500 per
100,000 that has made the United States the world leader in incarceration, now ahead of even Russia.
The change, his research shows, is a result of the political system pushing aside the criminal justice system in
addressing crime in America.
He writes, “the results of those analyses make it clear that more crime has not been a major driver and that
arrests per crime has been astonishingly flat over the period.”
He adds, “The 30-second sound bite that featured a cell door slamming provided much more powerful rhetorical
appeal than mulling over the trade-offs among incarceration, community treatment, rehabilitation, and the other
complexities in decisions on appropriate sentences.”
Goooal! New study shows goalie may influence direction of penalty kick in soccer
A penalty kick places a goalkeeper at such a disadvantage that only approximately 18% of penalty kicks are
saved. However, some soccer fans think goalkeepers might save penalty kicks more easily by standing marginally to
the left or right.
It turns out they're right! In an article published in the March issue of Psychological Science, Professors Rich
Masters, John van der Kamp and Robin Jackson of the Institute of Human Performance at the University of Hong
Kong found that penalty takers are more likely to direct the football to the side with more space.
After observing 200 video clips of penalty kicks, including those in World Cup and African Nations Cup matches,
European Championships, and Union of European Football Association (UEFA) Champions League matches, the
researchers found that goalkeepers stood marginally left or right of goal centre 96% of the time. Although the mean displacement of the goalkeepers was 9.95 cm, there was no association between the side on which the goalkeepers stood and the direction in which they dived (94 out of 190 dives were to the side with less space). So goalkeepers weren't standing off-centre as a strategy.
Remarkably, despite all of the factors that can influence the direction of a penalty kick, more penalty kicks were
directed to the side with more space.
After conducting experimental studies and carefully evaluating the results, Professor Masters and his team
concluded that it is feasible for a goalkeeper to influence perceptions of space and consequently the direction of
penalty kicks by standing marginally to one side or another of the goal centre. The goalkeeper can then strategically
dive to the side with more space.
Extrapolation of their data indicates that the optimum displacement of the goalkeeper in real life is from 6 to 10
cm. Their results suggest that the penalty taker is unlikely to notice a displacement in this range, but is at least 10%
more likely to direct the penalty kick to the side with more space than to the side with less space.
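To see why even a small, unnoticed bias in kick direction could matter, the minimal sketch below works through the arithmetic for a keeper who commits to diving toward the larger side. All of the probabilities in it are assumptions chosen purely for illustration, not figures from the study, apart from the roughly 10% shift in kick direction reported above.

```python
# Back-of-the-envelope sketch of why a small, unnoticed off-centre stance could
# pay off for a goalkeeper who then dives to the side with more space.
# All probabilities are illustrative assumptions, not figures from the study,
# except the roughly 10% shift in kick direction reported by the researchers.

P_SAVE_IF_DIVE_MATCHES = 0.30  # assumed save chance when the dive matches the kick side
P_SAVE_IF_DIVE_WRONG = 0.05    # assumed save chance when the dive goes the wrong way

def save_probability(p_kick_to_big_side: float) -> float:
    """Expected save rate for a keeper who always dives to the side with more space."""
    return (p_kick_to_big_side * P_SAVE_IF_DIVE_MATCHES
            + (1 - p_kick_to_big_side) * P_SAVE_IF_DIVE_WRONG)

# Centred keeper: assume kicks split 50/50. Displaced keeper: a 10% relative shift to 0.55.
print(f"centred stance:   {save_probability(0.50):.3f}")
print(f"displaced stance: {save_probability(0.55):.3f}")
```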
Peruvian citadel is site of earliest ancient solar observatory in the Americas
New Haven, Conn. — Archeologists from Yale and the University of Leicester have identified an ancient solar observatory at Chankillo, Peru as the oldest in the Americas with alignments covering the entire solar year, according to an article in the March 2 issue of Science.
Recorded accounts from the 16th century A.D. detail practices of state-regulated sun worship during Inca times, and related social and cosmological beliefs. These speak of towers being used to mark the rising or setting position of the sun at certain times in the year, but no trace of the towers has ever been found. This paper reports the earliest structures that support those writings.
At Chankillo, not only were there towers marking the sun's position throughout the year, but they remain in place, and the site was constructed much earlier – in approximately the 4th century B.C.
[Figure: Simplified diagram of how the solar observatory would have worked. Courtesy of Ivan Ghezzi]
"Archaeological research in Peru is constantly pushing back the origins of civilization in the Americas," said Ivan
Ghezzi, a graduate student in the department of Anthropology at Yale University and lead author of the paper. "In
this case, the 2,300 year old solar observatory at Chankillo is the earliest such structure identified and unlike all
other sites contains alignments that cover the entire solar year. It predates the European conquests by 1,800 years and even precedes, by about 500 years, the monuments of similar purpose constructed by the Mayans in Central
America."
Chankillo is a large ceremonial center covering several square kilometers in the coastal Peruvian desert. It was
better known in the past for a heavily fortified hilltop structure with massive walls, restricted gates, and parapets.
For many years, there has been a controversy as to whether this part of Chankillo was a fort or a ceremonial center. But the purpose of a 300-meter-long line of Thirteen Towers lying along a small hill nearby had remained a mystery.
The new evidence now identifies it as a solar observatory. When viewed from two specially constructed observing points, the thirteen towers are strikingly visible on the horizon, resembling large prehistoric teeth. Around the observing points are spaces where artifacts indicate that ritual gatherings were held.
The current report offers strong evidence for an additional use of the site at Chankillo: as a solar observatory. It is remarkable as the earliest known complete solar observatory in the Americas that defines all the major aspects of the solar year.
[Figure: The fortified stone temple at Chankillo. Courtesy of Peru's National Aerophotographic Service (SAN)]
"Focusing on the Andes and the Incan empire, we have known for decades from archeological artifacts and
documents that they practiced what is called solar horizon astronomy, which uses the rising and setting positions of
the sun in the horizon to determine the time of the year," said Ghezzi. "We knew that Inca practices of astronomy
were very sophisticated and that they used buildings as a form of 'landscape timekeeping' to mark the positions of
the sun on key dates of the year, but we did not know that these practices were so old."
According to archival texts, "sun pillars" standing on the horizon near Cusco were used to mark planting times and regulate seasonal observances, but they have since vanished and their precise location remains unknown. In this report, the model of Inca astronomy, based almost exclusively on the texts, is fleshed out with a wealth of archaeological and archaeo-astronomical evidence.
Ghezzi was originally working at the site as a Yale graduate student conducting thesis work on ancient warfare in
the region, with a focus on the fortress at the site.
Noting the configuration of 13 monuments, in 2001, Ghezzi wondered about a proposed relationship to
astronomy. "Since the 19th century there was speculation that the 13-tower array could be solar or lunar
demarcation — but no one followed up on it," Ghezzi said. "We were there. We had extraordinary support from the
Peruvian Government, Earthwatch and Yale University. So we said, 'Let's study it while we are here!'"
To his great surprise, within hours they had measurements indicating that one tower aligned with the June
solstice and another with the December solstice. But, it took several years of fieldwork to date the structures and
demonstrate the intentionality of the alignments. In 2005, Ghezzi connected with co-author Clive Ruggles, a leading
British authority on archeoastronomy. Ruggles was immediately impressed with the monument structures.
"I am used to being disappointed when visiting places people claim to be ancient astronomical observatories."
said Ruggles. "Since everything must point somewhere and there are a great many promising astronomical targets,
the evidence — when you look at it objectively — turns out all too often to be completely unconvincing."
"Chankillo, on the other hand, provided a complete set of horizon markers — the Thirteen Towers — and two
unique and indisputable observation points," Ruggles said. "The fact that, as seen from these two points, the towers
just span the solar rising and setting arcs provides the clearest possible indication that they were built specifically to
facilitate sunrise and sunset observations throughout the seasonal year."
What they found at Chankillo was much more than the archival records had indicated. "Chankillo reflects well-developed astronomical principles, which suggests the original forms of astronomy must be quite older," said Ghezzi, who is also the Director of Archaeology of the National Institute of Culture in Lima, Peru.
The researchers also knew that Inca astronomical practices in much later times were intimately linked to the
political operations of the Inca king, who considered himself an offspring of the sun. Finding this observatory
revealed a much older precursor where calendrical observances may well have helped to support the social and
political hierarchy. They suggest that this is the earliest unequivocal evidence, not only in the Andes but in all the
Americas, of a monument built to track the movement of the sun throughout the year as part of a cultural
landscape.
According to the authors, these monuments were statements about how the society was organized; about who
had power, and who did not. The people who controlled these monuments "controlled" the movement of the sun.
The authors posit that this knowledge could have been translated into the very powerful political and ideological
statement, "See, I control the sun!"
"This study brings a new significance to an old site," said Richard Burger, Chairman of Archeological Studies at
Yale and Ghezzi's graduate mentor. "It is a wonderful discovery and an important milestone in Andean observations
of this site that people have been arguing over for a hundred years."
"Chankillo is one of the most exciting archaeoastronomical sites I have come across," said Ruggles. "It seems
extraordinary that an ancient astronomical device as clear as this could have remained undiscovered for so long."
Size of brain areas does matter -- but bigger isn't necessarily better
LA JOLLA, CA – The ability to hit a baseball or play a piano well is part practice and part innate talent. The innate side of the equation has its roots in the architecture of the brain, which is genetically determined before birth, say scientists at the Salk Institute for Biological Studies. Practice takes no explaining, just persistence.
In this week's early online edition of the Proceedings of the National Academy of Sciences, the Salk researchers
report that in mice, functional subdivisions of the cortex – the brain's powerful central processing unit responsible
for higher functions – must be just the right size relative to other brain architecture, or mice will underperform in
tests of their skill at the relevant behaviors.
These functionally-specialized subdivisions are known as areas, and are responsible for sensory perception,
movements, and the coordination of these and other complex phenomena. The same area of the cortex can vary
two-fold in size among normal humans, but the question of whether such variations in area size can influence
behavioral performance has been left unanswered. Now, Salk investigators have answered this question by
genetically manipulating area sizes in mice and testing the effect on behavioral performance.
They find that if areas of the cortex involved in body sensations and motor control are either smaller or larger
than normal, mice will not be able to run an obstacle course, keep from falling off a rotating rod, or perform other
tactile and motor behaviors that require balance and coordination as well as other mice can.
"It has been assumed that if a cortical area is larger, it would be more effective in processing information," says
the senior author, Dennis O'Leary, Ph.D., professor in the Molecular Neurobiology Laboratory at the Salk Institute.
"However", adds Axel Leingartner, Ph.D., co-first author together with Sandrine Thuret, Ph.D., "our findings suggest
that the area size that gives optimal performance is the one that is best tuned to the context of the neural system
within which that area functions."
In other words, a cortical area needs to fit the functional profile of its "pipeline" of information, the "read-outs" of body sensations and of peripheral sensory structures such as the eye, that is taken in by brain neurons and sent to the cortex for processing and, ultimately, a behavioral response.
Thuret, formerly in the Laboratory of Genetics at the Salk and now at the Centre for the Cellular Basis of
Behaviour at King's College in London, concludes that "If cortical areas are not properly sized, the information will
not be processed effectively, resulting in diminished performance."
This study built upon a previous discovery by O'Leary and colleagues, that Emx2, a gene common to mice and
men, controls how the cortex in mice is divided during embryonic development into its functionally specialized areas.
The researchers wanted to know what would happen to behavioral performance if they altered area sizes by
changing the levels of Emx2. Leingartner engineered one group of mice to express too much Emx2, which resulted in
reductions in the sizes of "sensorimotor" areas of the cortex. These mice exhibited significant deficiencies in tactile
and motor behaviors. But surprisingly, tactile and motor behaviors were also diminished in a second group of mice
that had too little Emx2, resulting in an expansion of the sensorimotor areas.
In a final critical experiment, the first group of mice was bred to the second to perform a "genetic rescue." The
scientists found that levels of Emx2, area sizes, and behavioral performance all returned to normal. "To us this
rescue experiment was compelling, and even a bit shocking, because the offspring that performed normally were
the progeny of the two lines of mice that performed poorly," O'Leary says. "Findings from the first two lines of mice
tested show a correlation between area size and performance, but the genetic rescue proves the relationship
between area size and performance."
The Salk scientists say that the two-fold variability in cortical area size likely explains at least in part variability in
human performance and behavior and could also provide insight into developmental cognitive disorders. O'Leary
says that establishing such a correlation between cortical area size and behavior in humans is possible by
combining, in the same individuals, tests of behavioral performance with functional MRI that can be used to
measure the size of a cortical area based on neural activity.
"There is no doubt that people vary considerably in performance in everything from hitting a baseball to playing a
piano to even a simple measure such as visual acuity–indeed the full spectrum of sensory, motor, and cognitive
function," he says. "Just as the size of a cortical area can vary considerably between people, so can human
behavioral performance. Our studies in mice lead us to conclude that in humans, variations in cortical area size figure prominently in explaining variations in behavioral performance."
Alterations in the size and shape of cortical areas could also underlie some cognitive strengths and weaknesses,
the researchers say, for example those associated with the genetically based disorder, Williams Syndrome, as
recently reported by Salk professor Ursula Bellugi and her colleagues.
O'Leary stresses though that he is making no statements about variability in intelligence. "Neuroscientists have
yet to develop an understanding of the biological underpinnings of intelligence. The behaviors we have studied are
based on sensory and motor modalities. However, for most issues in biology, in the end, researchers conclude that
both an environmental component and a genetic component contribute to the final outcome."
The study was supported by a Javits Award from the National Institutes of Health. The experimental work was done
primarily by the co-first authors Axel Leingartner, Ph.D., a postdoctoral fellow in the O'Leary lab and Sandrine Thuret, Ph.D., a
former postdoctoral fellow in the Gage lab, with contributions from Todd T. Kroll, Ph.D. and Shen-ju Chou, Ph.D., postdoctoral
fellows in the O'Leary lab, Leigh Leasure, Ph.D., a postdoctoral fellow in the Gage lab and Fred H. Gage, Ph.D., a professor in
the Laboratory of Genetics.
Whole-grain breakfast cereal associated with reduced heart failure risk
ORLANDO, Fla. -- Eating whole-grain breakfast cereals seven or more times per week was associated with a lower risk
of heart failure, according to an analysis of the observational Physicians’ Health Study. Researchers presented
findings of the study today at the American Heart Association’s 47th Annual Conference on Cardiovascular Disease
Epidemiology and Prevention. For the present study, breakfast cereals that contain at least 25 percent oat or bran
content were classified as whole grain cereals.
The analysis shows that those who ate a whole-grain breakfast cereal seven or more times per week were less
likely (by 28 percent) to develop heart failure over the course of the study than those who never ate such cereal.
The risk of heart failure decreased by 22 percent in those who ate a whole-grain breakfast cereal from two to six
times per week and by 14 percent in those who ate a whole-grain breakfast cereal up to once per week.
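Read as relative risks against physicians who never ate whole-grain cereal, those percentage reductions translate roughly as in the sketch below. This is illustrative arithmetic only; the study's actual hazard ratios and confidence intervals are not quoted in this summary.

```python
# Convert the reported "X percent lower risk of heart failure" figures into
# approximate relative risks versus physicians who never ate whole-grain cereal.
# Illustrative arithmetic only; the study's adjusted estimates are not quoted here.

reported_reductions = {
    "7+ servings/week": 0.28,
    "2-6 servings/week": 0.22,
    "up to 1 serving/week": 0.14,
}

for intake, reduction in reported_reductions.items():
    relative_risk = 1.0 - reduction
    print(f"{intake}: roughly {relative_risk:.2f} times the risk of never-eaters")
```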
According to researchers, if these data are confirmed by other studies, a healthy diet including whole-grain breakfast
cereals along with other measures may help reduce the risk of heart failure.
"There are good and powerful arguments for eating a whole-grain cereal for breakfast," said Luc Djoussé, M.D.,
M.P.H., D.Sc., lead author of the study and assistant professor of medicine in the Division of Aging at Brigham &
Women’s Hospital and Harvard Medical School in Boston, Mass. "The significant health benefits of whole-grain
cereal are not just for kids, but also for adults. A whole-grain, high-fiber breakfast may lower blood pressure and
bad cholesterol and prevent heart attacks."
Djoussé urges the general public to consider eating a regular whole-grain, high fiber breakfast for its overall
health benefits.
In the Physicians’ Health Study, the majority of physicians ate whole-grain cereals rather than
refined cereals. Whole grains are rich in vitamins, minerals, and anti-oxidants and have a high fiber content. Of
10,469 physicians reporting cereal consumption at baseline, 8,266 (79 percent) ate whole-grain cereals compared to
2,203 (21 percent) who ate refined cereals.
Among the physicians who ate whole-grain breakfast cereals, 2,873 (35 percent) said they ate them seven or
more times per week; 3,240 (39 percent) said two to six times per week; and 2,153 (26 percent) said they ate up to
one cereal serving per week.
The findings reported here were based on annual detailed questionnaires about major heart events and reported
breakfast cereal consumption at baseline. However, the results did not change when possible changes in cereal
consumption over time (assessed at 18 weeks; two years; four years; six years; eight years; and ten years) were
taken into account. Researchers conducted the study from 1982 to 2006. The average age of physicians in the study
at baseline was 53.7 years. Djoussé hopes the findings of the Physicians’ Health Study will encourage the general
population to eat heart-healthy diets.
"The Physicians’ Health Study shows that even in a population with overall healthy behavior, it is possible to see
less heart failure in those who eat a whole-grain cereal breakfast," Djoussé said.
In the United States, foods considered "whole grain" contain 51 percent or more whole grain ingredients by
weight per reference amount customarily consumed.
Second-born twin faces doubled risk of death
* 11:11 02 March 2007
* NewScientist.com news service
* Roxanne Khamsi
Second-born twins are more than twice as likely as their first-born siblings to die during or shortly after vaginal delivery, a new study reveals.
The analysis of nearly 1400 twin deliveries found that the second-born babies died more frequently from
complications such as breech birth and premature detachment of the umbilical cord. Based on this finding, doctors
suggest that delivering more twins by planned caesarean section might help reduce the risk of perinatal death.
Gordon Smith at the University of Cambridge in the UK, and colleagues, analysed data from 1377 twin
pregnancies between 1994 and 2003 in which one of the infants died during or immediately after birth.
They discovered a significant difference in risk among twins born at term – after 36 weeks of gestation. Among
those born at term, 53 first twins did not survive, compared with 124 second-born twins. The analysis showed that
the latter group had 2.3 times the risk of dying during delivery.
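As a rough check on the reported figure, the sketch below derives an unadjusted risk ratio from the two death counts quoted above. Because every term twin delivery contributes one first-born and one second-born infant, the denominators are equal and cancel out; the study's published estimate of 2.3 also reflects adjustments not reproduced here.

```python
# Unadjusted relative risk of perinatal death for second- versus first-born twins
# delivered at term, using the raw counts quoted above. The denominator is the same
# for both birth orders (each delivery has one first and one second twin), so its
# exact value cancels; 1,000 is an arbitrary placeholder.

FIRST_TWIN_DEATHS = 53
SECOND_TWIN_DEATHS = 124
TWINS_AT_RISK = 1_000  # arbitrary, identical for both birth orders

risk_first = FIRST_TWIN_DEATHS / TWINS_AT_RISK
risk_second = SECOND_TWIN_DEATHS / TWINS_AT_RISK
print(f"unadjusted relative risk: {risk_second / risk_first:.2f}")  # ~2.34, close to the reported 2.3
```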
Complications such as breech birth and infection were partly to blame for this increased risk, says Smith.
Premature twins
Smith adds that second-born twins appear four times as likely as their first-born counterparts to die from oxygen deprivation during birth. In some cases, the umbilical cord connecting the second twin to the oxygen-supplying placenta detaches and is delivered before the baby, Smith explains.
Previous research has shown that second-born twins delivered at term have a one in 250 chance of dying during
or shortly after labour.
Smith’s team found that birth order had no effect on the risk of survival among prematurely born twins.
According to Smith this is because premature twins have an extremely high mortality risk during delivery, which
made it statistically unlikely that birth order would stand out as a major cause of death.
Delivery at around full term by planned caesarean section could help prevent second twins from dying, Smith suggests. Preliminary studies have indicated that twins delivered in this way have a slightly better chance of survival.
Journal reference: BMJ (DOI: 10.1136/bmj.39118.483819.55)
New medical finding: Treatment for gum disease could also help the heart
Scientists at University College London (UCL) have conducted the first clinical trial to demonstrate that an
intensive treatment for periodontitis (gum disease) directly improves the health of blood vessels. This study,
conducted in conjunction with Professor Maurizio Tonetti (University of Connecticut, USA), and reported in the latest
edition of the New England Journal of Medicine, may have relevance for the prevention of heart attacks and stroke.
Periodontitis is a common inflammatory disease of the gums, affecting up to 40 per cent of the world's adult
population. It is a bacterial infection of the tissue that supports the teeth in the mouth. If untreated, it can cause
progressive bone loss around the teeth, and eventual tooth loss.
There is already established scientific evidence linking inflammation, the body's natural response to infection or
injury, with the arterial changes that underlie stroke and heart attack. However, this is the first clinical trial to
demonstrate that relief of inflammation in the mouth, through intensive treatment of periodontitis, results in
improved function of the arteries.
Dr Francesco D'Aiuto, project leader and therapist, UCL Eastman Dental Institute, explained the method behind
the research: "Middle-aged subjects with severe periodontitis, but no evidence of cardiovascular disease, were
randomly allocated to dental treatments of two levels of intensity. After six months, those who received the more
intensive periodontitis treatment, which resulted in a marked improvement in their gum disease, also demonstrated
a significant restoration of blood vessel function.
"The intensive treatment involved removal of plaque through scaling and root planning techniques, as well as
extraction of teeth that could not be saved. This initially resulted in some inflammation and dysfunction of the blood
vessels and arteries. However, that was short-lived and six months later the treatment led to an improvement in
both oral health and arterial function."
Professor John Deanfield, senior author, UCL Institute of Child Health, added: "Previous studies have shown an
association between periodontitis and blood vessel dysfunction, heart attack and stroke. However, a clinical trial was
required to test whether these links could be causal. This is the first time that a direct link has been made between
treatment for gum disease and improved circulatory function, which is relevant to some of the UK's biggest killers:
heart attack and stroke."
Dr Aroon Hingorani, UCL Division of Medicine, a co-author on the study, set the findings in context: "Elevations in
blood pressure and cholesterol, as well as smoking and diabetes, are recognised as the main risk factors for
cardiovascular disease, and these can be effectively treated. Nevertheless, heart attacks and stroke remain a major
cause of disability and death. Intriguing links have emerged between inflammation and heart disease and so it is
important to better understand the nature of this connection, and whether it could lead to the development of new
treatments. The current study points to disease of the gums as a potential source of this inflammation."
Professor Deanfield concluded: "This finding therefore has potential implications for public health, but further
studies are now required to determine whether the treatment of severe periodontitis could directly contribute to the
prevention of disease of the arteries (atherosclerosis), stroke and heart attacks."
The mechanism by which periodontitis affects endothelial function in the body is still uncertain. The gum disease
involves a bacterial infection that invades the tissue around the teeth. One possibility is that the bacteria disturb
endothelial function directly, since some bacteria can enter the bloodstream. Alternatively, the periodontitis might
trigger a low grade inflammatory response throughout the body that has a detrimental effect on the vascular wall.
Antidepressants improve post-stroke 'thinking outside the box'
Effect is independent of any changes in depression
Antidepressant treatment appears to help stroke survivors with the kind of complex mental abilities often referred
to as "thinking outside the box," according to a University of Iowa study.
The antidepressants' effects on study participants' abilities were independent of any changes in depression. In
addition, the improvements in complex mental abilities were not seen immediately but during the course of 21
months after the treatment ended. The study results appear in the March 2007 issue of the British Journal of
Psychiatry.
Antidepressant treatment already was known to improve mood in depressed post-stroke patients, but its effect on executive function in people with clinically diagnosed stroke had not been examined, said Sergio Paradiso, M.D., Ph.D., the study's corresponding author and assistant professor of psychiatry at the UI Carver College of Medicine.
"We found that people diagnosed with stroke who often have a decline in 'executive function', that is, those
mental abilities that enable us to respond appropriately to unfamiliar or complex situations, and support several
cognitive, emotional and social capacities, showed improvement after receiving a 12-week treatment with
antidepressants," Paradiso said.
Executive functions come into play, for instance, when we plan to take an alternative route home due to
unexpected detours. This brain function involves stopping ingrained behavior, such as trying to take your usual
route home. People with stroke often show impairments in executive function and may not be able to respond well
to non-routine situations. This impairment may affect rehabilitation efforts.
The UI team included Kenji Narushima, M.D., Ph.D., UI resident physician in psychiatry, who contributed
significantly to the study.
The study began with 47 patients who had had a stroke during the previous six months. These individuals were
divided into three groups and randomly assigned (with the exception of those with certain medical conditions) to
take the antidepressant fluoxetine (Prozac), the antidepressant nortriptyline (Aventyl or Pamelor) or a placebo
(inactive substance).
Their executive functions were assessed using standard neuropsychological tasks at the end of 12 weeks of
treatment, and again two years after the study had started. A total of 36 patients completed all the evaluations.
No significant differences were found between the antidepressant and placebo groups at the end of treatment.
However, 21 months after the treatment ended, the placebo group showed continued worsening of the executive
functions, whereas the group treated with antidepressants had clear and significant improvement, regardless of how
their depressive symptoms changed.
"We were somewhat surprised to initially not find any difference after the first 12 weeks of treatment. It took
another 21 months after the initial treatments for the antidepressants to have a detectable effect," Paradiso said.
The investigators hypothesize that antidepressants may foster recovery of neural tissue not directly destroyed by
the stroke, yet because the process is slow, it takes months.
"Drugs such as antibiotics start working right away to kill germs. However, antidepressants may be reorganizing
brain structure and re-establishing neuronal connections that were lost because of the death of neurons due to the
stroke," Paradiso said. "We expect this regeneration to happen in longer, rather than brief, periods of time.
"We really appreciate the patients who made the commitment to participate in this two-year-long study while
they were in their post-stroke recovery. The information we've learned will help us develop new studies," he added.
The researchers plan to examine individuals who responded favorably to the antidepressants and look
noninvasively for brain changes.
"We can do functional and structural brain imaging studies using different technologies, including relatively new
techniques that quantify chemicals in the brain," Paradiso said.
In addition to Paradiso and Narushima, the study involved UI researchers in psychiatry: David Moser, Ph.D., associate
professor; Ricardo Jorge, M.D., assistant professor; and Robert G. Robinson, M.D., department head and the Paul W.
Penningroth Professor of Psychiatry.
Spiky oddball prowled ocean half billion years ago
By Will Dunham
WASHINGTON (Reuters) - A spectacularly quirky creature with long, curved spines protruding from its armored body prowled the ocean floor half a billion years ago near the dawn of complex life forms on Earth, scientists said.
In research appearing in Friday's edition of the journal Science, scientists identified an ancient invertebrate they named Orthrozanclus reburrus from 11 complete fossils retrieved from Canada's fossil-rich Burgess Shale rock formation.
"It's a tiny beast," Jean-Bernard Caron of the Royal Ontario Museum in Toronto, who described the newly identified species along with Simon Conway Morris of the University of Cambridge in Britain, said in an interview.
Orthrozanclus, about half an inch (one centimetre) long, lived about 505 million years ago during the Cambrian Period. The Cambrian was an important moment in the history of life on Earth and a time of radical evolutionary experimentation when many major animal groups first appeared in the fossil record. This proliferation of life is dubbed the "Cambrian Explosion" because of the relatively brief time span in which this diversity of forms arose.
[Figure: A handout drawing shows an artist's impression of the newly identified invertebrate Orthrozanclus reburrus that lived 505 million years ago during the Cambrian Period, a time of radical experimentation in body forms at the dawn of complex life on Earth. Credit: REUTERS/AAAS/Science 2007 - Drawing by Marianne Collins]
Orthrozanclus had no eyes and no limbs and apparently moved along the ocean floor with a muscular foot, like a
snail does, while dining on bacterial growths, the researchers said.
Orthrozanclus seems to have been built to prevent predators from turning it into a quick snack. It was covered in
a shell and had almost three dozen long, pointy, curved spines sticking out from the edge of its body, and many
smaller ones, too.
"You probably don't want to have them in your slippers. They're kind of spiky," Caron said.
EARLY ANIMAL EVOLUTION
The newly identified invertebrate helps clarify early animal evolution, the scientists said.
The scientists think Orthrozanclus may belong to a newly identified group of organisms characterized by a similar
type of body armor, and that this group was related to present-day snails, earthworms and mollusks, which include
snails, clams, squid and octopuses.
The researchers described the animal based on complete and beautifully preserved fossils -- nine at the Royal
Ontario Museum and two at the Smithsonian Institution in Washington.
The Burgess Shale, an important rock layer from the Cambrian in the Canadian Rockies of southeastern British
Columbia, has yielded a treasure trove of fossils from this critical period in the history of life.
These include such weirdos as Hallucigenia, a spiky animal so unusual that the scientists who named it seemed
to think it was a hallucination, and the predator Anomalocaris with large, grasping limbs, the largest animal found in
the Burgess Shale.
Some creatures found as fossils in the Burgess Shale are ancestors of animals alive today, while others have long
since gone extinct and are not like any existing living thing.