Novel computed imaging technique uses blurry images to enhance view
CHAMPAIGN, Ill. -- Researchers at the University of Illinois at Urbana-Champaign have developed a novel
computational image-forming technique for optical microscopy that can produce crisp, three-dimensional
images from blurry, out-of-focus data.
Called Interferometric Synthetic Aperture Microscopy, ISAM can do for optical microscopy what magnetic
resonance imaging did for nuclear magnetic resonance, and what computed tomography did for X-ray imaging,
the scientists say.
"ISAM can perform high-speed, micron-scale, cross-sectional imaging without the need for time-consuming
processing, sectioning and staining of resected tissue," said Stephen Boppart, a professor of electrical and
computer engineering, of bioengineering, and of medicine at the U. of I., and corresponding author of a paper
accepted for publication in the journal Nature Physics, and posted on its Web site.
Developed by postdoctoral research associate and lead author Tyler Ralston, research scientist Daniel
Marks, electrical and computer engineering professor P. Scott Carney, and Boppart, the imaging technique
utilizes a broad-spectrum light source and a spectral interferometer to obtain high-resolution, reconstructed
images from the optical signals based on an understanding of the physics of light-scattering within the sample.
"ISAM has the potential to broadly impact real-time, three-dimensional microscopy and analysis in the fields
of cell and tumor biology, as well as in clinical diagnosis where imaging is preferable to biopsy," said Boppart,
who is also a physician and founding director of the Mills Breast Cancer Institute at Carle Foundation Hospital
in Urbana, Ill.
While other methods of three-dimensional optical microscopy require the instrument's focal plane to be
scanned through the region of interest, ISAM works by utilizing light from the out-of-focus image planes,
Ralston said. "Although most of the image planes are blurry, ISAM descrambles the light to produce a fully
focused, three-dimensional image."
ISAM effectively extends the region of the image that is in focus, using information that was discarded in
the past.
"We have demonstrated that the discarded information can be computationally reconstructed to quickly
create the desired image," Marks said. "We are now applying the technique to various microscopy methods
used in biological imaging."
In their paper, the researchers demonstrate the usefulness of computed image reconstruction on both
phantom tissue and on excised human breast-tumor tissue.
"ISAM can assist doctors by providing faster diagnostic information, and by facilitating the further
development of image-guided surgery," Boppart said. "Using ISAM, it may be possible to perform micron-scale
imaging over large volumes of tissue rather than resecting large volumes of tissue."
The versatile imaging technique can be applied to existing hardware with only minor modifications.
Activation of brain region predicts altruism
DURHAM, N.C. -- Duke University Medical Center researchers have discovered that activation of a particular brain
region predicts whether people tend to be selfish or altruistic.
"Although understanding the function of this brain region may not necessarily identify what drives people
like Mother Teresa, it may give clues to the origins of important social behaviors like altruism," said study
investigator Scott A. Huettel, Ph.D., a neuroscientist at the Brain Imaging and Analysis Center.
Results of the study appear Sunday, Jan. 21, in the advance online edition of Nature Neuroscience and will
be published in the February 2007 print issue of the journal. The work was funded by the National Institutes
of Health.
Altruism describes the tendency of people to act in ways that put the welfare of others ahead of their own.
Why some people choose to act altruistically is unclear, says lead study investigator Dharol Tankersley, a
graduate student in Huettel's laboratory.
In the study, researchers scanned the brains of 45 people while they either played a computer game or
watched the computer play the game on its own. In both cases, successful playing of the game earned money
for a charity of the study participant's choice.
The researchers scanned the participants' brains using a technique called functional magnetic resonance
imaging (fMRI), which uses harmless magnetic pulses to measure changes in oxygen levels that indicate nerve
cell activity.
The scans revealed that a region of the brain called the posterior superior temporal sulcus was activated to
a greater degree when people perceived an action -- that is, when they watched the computer play the game
-- than when they acted themselves, Tankersley said. This region, which lies in the top and back portion of the
brain, is generally activated when the mind is trying to figure out social relationships.
The researchers then characterized the participants as more or less altruistic, based on their responses to
questions about how often they engaged in different helping behaviors, and compared the participants' brain
scans with their estimated level of altruistic behavior. The fMRI scans showed that increased activity in the
posterior superior temporal sulcus strongly predicted a person's likelihood for altruistic behavior.
According to the researchers, the results suggest that altruistic behavior may originate from how people
view the world rather than how they act in it.
"We believe that the ability to perceive other people's actions as meaningful is critical for altruism,"
Tankersley said.
The scientists suggest that studying the brain systems that allow people to see the world as a series of
meaningful interactions may ultimately help further understanding of disorders, such as autism or antisocial
behavior, that are characterized by deficits in interpersonal interactions.
The researchers are now exploring ways to study the development of this brain region early in life,
Tankersley said, adding that such information may help determine how the tendencies toward altruism are
established.
The floral network -- what determines who pollinates whom
A field of spring wildflowers, abuzz with busy insects seeking nectar and spreading pollen, may look like a
perfect model of random interaction. But ecologists have discovered order within this anarchy. For instance, as
the number of species grows, the number of interactions does too, while the connectivity (the fraction of
possible interactions that actually occur) and the nestedness (the relative importance of generalist species as
mutualistic partners of specialist species) shrink. Study of such networks of species is still in its youth, and
the rules that generate these patterns of interaction are still being worked out. In a new study, Luis
Santamaría and Miguel Rodríguez-Gironés propose that two key mechanisms, trait complementarity and
barriers to exploitation, go a long way in explaining the structure of actual networks of plants and their many
pollinators.
The two mechanisms each arise from fundamental aspects of the interaction between species. An insect
will be unable to reach nectar in floral tubes longer than its proboscis; the tube length sets up a barrier to
some species, but not to others. Each plant species also has a given flowering period. The specific activity
period of each insect species will complement the flowering of some plant species more than others. Other
barriers and other complementary traits have been described for a variety of plant–pollinator pairs. To explore
the significance of these mechanisms, the authors modeled plant–pollinator interaction networks using a few simple
rules, and compared their results to data from real networks in real plant communities. The models incorporated from
one to four barrier or complementary traits, or a combination of two of each. They also tested two variations of a
"neutral" interaction model, in which species interact randomly, based simply on their relative abundance.
Different models did better at mimicking different aspects of real networks, but the two that performed best overall
were the combination model and one of the neutral models. The authors argue that the neutral model, despite its
appealing simplicity, can be discounted because it requires key assumptions regarding species abundances and random
interaction that conflict with empirical observations of real communities. In contrast, the model combining barriers
and complementary traits matches well with observed plant–pollinator interactions. Barriers alone would mean that
pollinators with the longest proboscis would be supreme generalists, able to feed on any flower, causing perfect
network nestedness; while complementarity alone would mean that specialist pollinators do not interact primarily with
generalist plants, causing unrealistically low network nestedness. Instead, the authors suggest, a combination of
barriers and complementary traits accounts for the pattern of specialists and generalists seen in real pollination
networks.
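To make the modelling approach concrete, here is a minimal sketch (in Python) of how one barrier trait and one
complementarity trait can be combined to generate a plant–pollinator interaction matrix. This is not the authors'
actual model: the trait names (tube depth, proboscis length, flowering and activity periods), the 0.2 overlap
threshold and all random values are invented purely for illustration, and only the simplest statistic, connectance,
is computed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical traits on a 0-1 scale (illustration only, not data from the study).
n_plants, n_pollinators = 30, 40
tube_depth = rng.uniform(0, 1, n_plants)        # barrier trait of plants
proboscis  = rng.uniform(0, 1, n_pollinators)   # barrier trait of pollinators
flowering  = rng.uniform(0, 1, n_plants)        # complementarity trait of plants
activity   = rng.uniform(0, 1, n_pollinators)   # complementarity trait of pollinators

# Barrier rule: a pollinator can only use flowers whose tube is no deeper than its proboscis.
barrier_ok = proboscis[None, :] >= tube_depth[:, None]               # plants x pollinators
# Complementarity rule: flowering and activity periods must overlap (arbitrary 0.2 window).
complement_ok = np.abs(flowering[:, None] - activity[None, :]) < 0.2
links = barrier_ok & complement_ok

n_interactions = int(links.sum())
connectance = n_interactions / links.size        # fraction of possible interactions realized
print(f"{n_interactions} interactions out of {links.size} possible; connectance = {connectance:.2f}")
```

Measuring nestedness and fitting such a model to real pollination networks, as in the study, would of course require
considerably more machinery than this toy example.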
Researchers are just beginning to understand the mechanisms governing the complex network
interactions between plants and pollinators, such as hummingbirds, shown in this illustration from
Ernst Haeckel's Kunstformen der Natur (1904). Maravall et al.
The superiority of the combination model also has implications for understanding floral evolution. A
common principle has been that plants coevolve with their most-efficient pollinator to strengthen the
complementarity of their matching adaptations. Barriers, however, while reducing exploitation by inefficient
pollinators, may also interfere with pollination by efficient ones. Nonetheless, the results of the present study
indicate that barriers are likely to play an important role in pollinator networks, suggesting that coevolution
with the most-efficient pollinator is not the sole factor governing floral evolution.
Detaining patients is justified to contain deadly TB strain in South Africa, say experts
A team of medical ethics and public health experts say tough isolation measures, involuntary if need be, are
justified to contain a deadly, contagious, drug-resistant strain of TB in South Africa and to prevent "a
potentially explosive international health crisis."
In a policy paper in the international health journal PLoS Medicine, Dr Jerome Singh of the Centre for the
AIDS Programme of Research in Durban, South Africa (who is also an Adjunct Professor at the Joint Centre for
Bioethics, University of Toronto) and colleagues say that "the forced isolation and confinement of extensively
drug resistant tuberculosis (XDR-TB) and multiple drug resistant tuberculosis (MDR-TB) infected individuals
may be a proportionate response in defined situations given the extreme risk posed."
On September 01, 2006, the World Health Organisation announced that a deadly new strain of XDR-TB had
been detected in Tugela Ferry, a rural town in the South African province of KwaZulu-Natal, the epicentre of
South Africa's HIV/AIDS epidemic. Of the 544 patients studied in the area in 2005, 221 had MDR-TB
(Mycobacterium tuberculosis resistant to at least rifampicin and isoniazid). Of these 221 cases, 53 were
identified as XDR-TB (i.e. MDR-TB plus resistance to at least three of the six classes of second line drug
treatments). Of the 53, 44 were tested for HIV and all were HIV infected.
This strain of XDR-TB in Kwazulu-Natal proved to be particularly deadly: 52 of the 53 patients died (within
a median of 16 days of the initial collection of sputum for diagnostic purposes).
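Put as proportions, the case counts above work out roughly as follows (a simple back-of-the-envelope calculation from
the figures already quoted, not additional data from the paper):

```python
# Case counts reported for Tugela Ferry, KwaZulu-Natal, 2005.
studied, mdr, xdr, xdr_deaths = 544, 221, 53, 52

print(f"MDR-TB among patients studied: {mdr / studied:.0%}")     # about 41%
print(f"XDR-TB among MDR-TB cases:     {xdr / mdr:.0%}")         # about 24%
print(f"Deaths among XDR-TB cases:     {xdr_deaths / xdr:.0%}")  # about 98%
```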
But the authors say that there have been a number of obstacles in the way of dealing effectively with the
crisis. "The South African government's initial lethargic reaction to the crisis," they say, "and uncertainty
amongst South African health professionals concerning the ethical, social and human rights implications of
effectively tackling this outbreak highlights the need to address these issues as a matter of urgency lest doubt
and inaction spawns a full-blown XDR-TB epidemic in South Africa and beyond."
Daily use of antidepressants associated with increased risk of fracture in older adults
Daily use of the antidepressant medications known as selective serotonin reuptake inhibitors (SSRIs) by
adults 50 years and older is associated with a doubled risk of some fractures, according to a report in the
January 22, 2007 issue of Archives of Internal Medicine, one of the JAMA/Archives journals.
Depression affects about 10 percent of primary care patients in the United States, according to background
information in the article. The use of SSRIs for the treatment of depressive symptoms is widespread due to
the medication's presumed favorable adverse effect profile. Past studies have found the use of these
antidepressants to be associated with an increased risk of clinical fragility fracture (fractures due to falling
from bed, chair or standing height), but did not reliably examine such factors as falls and bone mineral density,
the authors note.
J. Brent Richards, M.D., of McGill University, Montreal, Quebec, and colleagues evaluated 5,008 community-dwelling
adults 50 years and older who were followed up for over five years for incident fractures. Researchers
examined the relationships between SSRI use, bone mineral density (BMD) and falls. Participants who used
the medication at the beginning of the study and at year five were considered to be recurrent users. BMD of
the lower spine and hip were measured at the beginning of the study. Patients were then sent a yearly
questionnaire to determine if they had experienced clinical fragility fractures and all reported fractures were
confirmed radiographically. Other factors such as demographic information, history of falls and medication use
were all assessed.
Daily use of SSRIs was reported by 137 participants with an average age of 65.1 years. The researchers
found that "daily SSRI use remained associated with a two-fold increased risk of incident clinical fragility
fracture even after adjustment for many potential confounding variables." These fractures occurred at the
forearm (40 percent), ankle and foot (21 percent), hip (13 percent), rib (13 percent), femur (9 percent) and
back (4 percent). Participants who used SSRIs at the beginning of the study had similar increased risks of
fracture to those who used them at follow-up.
During the initial interview, the daily use of SSRIs was associated with an increased risk of falling. The effect
was dose-dependent: each doubling of the daily SSRI dose was associated with 1.5-fold higher odds of having fallen
during the previous month. Daily use of SSRIs was also associated with a 4 percent decrease in BMD at the total hip and a
2.4 percent decrease at the lumbar spine.
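The dose-response figure can be made concrete with a small sketch. Assuming, purely for illustration, that the
reported 1.5-fold increase in falling odds per dose doubling compounds multiplicatively across doublings (the article
does not spell this out), the implied odds multiplier for an arbitrary dose ratio is:

```python
import math

def falling_odds_multiplier(dose_ratio: float, per_doubling: float = 1.5) -> float:
    """Odds multiplier implied by a 1.5-fold increase per doubling of the daily dose.

    Assumes the effect compounds multiplicatively across doublings; this is an
    illustrative assumption, not a result reported by the study.
    """
    return per_doubling ** math.log2(dose_ratio)

print(falling_odds_multiplier(2))   # 1.5   (twice the dose)
print(falling_odds_multiplier(4))   # 2.25  (four times the dose)
```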
"Our results suggest that BMD and falls may be affected adversely by daily SSRI use but that fracture rates
remain elevated despite adjustment for these two risk factors, indicating that other pathways, such as
impaired bone quality leading to reduced bone strength, may be of particular relevance," the authors conclude.
"In light of the high rate of SSRI use among the general population, and among elderly persons in particular,
further studies that include controlled prospective trials are needed to confirm our findings."
A spoonful of sugar makes the medicine go to work
There will soon be no more bitter pills to swallow, thanks to new research by University of Leeds scientists
(UK): a spoonful of sugar will be all we need for our bodies to make their own medicine.
Professor Simon Carding of Leeds' Faculty of Biological Sciences has adapted a bacterium in our own bodies
to make it produce a treatment for Inflammatory Bowel Disease (IBD). Bacteria and viruses have been used
before to deliver drugs in this way, but Professor Carding has solved the major problem with this kind of
treatment: he uses a sugar to 'switch' the bacteria on and off. By eating the sugar, a patient will set the
medicine to work and then can end the treatment simply by stopping consumption of the sugar.
"Current bacteria and virus delivery systems produce their drugs non-stop, but for many treatments there is
a narrow concentration range at which drugs are beneficial," said Professor Carding. "Outside of this, the
treatment can be counterproductive and make the condition worse. It's vitally important to be able to control
when and how much of the drug is administered and we believe our discovery will provide that control."
Professor Carding has modified one of the trillions of bacteria in the human gut so that it will produce
human growth factors which help repair the layer of cells lining the colon, so reducing inflammation caused by
IBD. But he's also adapted the bacteria so it only activates in the presence of a plant sugar called xylan that is
found in tree bark. Xylan is naturally present in food in low concentrations, so by taking it in higher quantities,
a patient will be able to produce their own medicine as and when they need it.
"The human gut has a huge number of bacteria, and this treatment simply adapts what's there naturally to
treat the disease," said Professor Carding. "We're already looking at using the same technique for colorectal
cancer, as we believe we could modify the bacteria to produce factors that will reduce tumour growth.
Treatment of diseases elsewhere in the body might also be possible as most things present in the gut can get
taken into the blood stream."
Vital Signs
Consequences: Gun Ownership Linked to Higher Homicide Rates
By ERIC NAGOURNEY
States with the greatest number of guns in the home also have the highest rates of homicide, a new study
finds.
The study, in the February issue of Social Science and Medicine, looked at gun ownership in all 50 states
and then compared the results with the number of people killed over a three-year period.
The research, the authors said, “suggests that household firearms are a direct and an indirect source of
firearms used to kill Americans both in their homes and on the streets.”
The researchers, led by Matthew Miller of the Harvard School of Public Health, drew on data gathered by
the federal Centers for Disease Control and Prevention. In 2001, the agency surveyed more than 200,000
people and asked them, among other questions, whether they had a gun in or near the home.
In states in the highest quarter of gun ownership, the study found, the overall homicide rate was 60
percent higher than in states in the lowest quarter. The rate of homicides involving guns was more than twice
as high.
Among the possible explanations for the higher homicide rates, the study said, is that states with high gun
ownership tend to make it easier to buy guns. There are also more guns that can be stolen. And the presence
of a gun may allow arguments and fights to turn fatal.
The researchers said they could not prove that the guns caused the increase in homicides, only that there
was a link. It may be, they said, that people are more likely to buy guns in states where violence is already
high. But they said that explanation did not appear to be supported by their findings.
The Consumer
An Old Cholesterol Remedy Is New Again
By MICHAEL MASON
Perhaps you heard it? The wail last month from the labs of heart researchers
and the offices of Wall Street analysts?
Pfizer Inc., the pharmaceutical giant, halted late-stage trials of a cholesterol
drug called torcetrapib after investigators discovered that it increased heart
problems — and death rates — in the test population.
Torcetrapib wasn’t just another scientific misfire; the drug was to have been
a blockbuster heralding the transformation of cardiovascular care. Statin drugs like simvastatin (sold as Zocor)
and atorvastatin (Lipitor) lower blood levels of LDL, the so-called bad cholesterol, thereby slowing the buildup
of plaque in the arteries.
But torcetrapib worked primarily by increasing HDL, or good cholesterol. Among other functions, HDL
carries dangerous forms of cholesterol from artery walls to the liver for excretion. The process, called reverse
cholesterol transport, is thought to be crucial to preventing clogged arteries.
Many scientists still believe that a statin combined with a drug that raises HDL would mark a significant
advance in the treatment of heart disease. But for patients now at high risk of heart attack or stroke, the news
is better than it sounds. An effective HDL booster already exists.
It is niacin, the ordinary B vitamin.
In its therapeutic form, nicotinic acid, niacin can increase HDL as much as 35 percent when taken in high
doses, usually about 2,000 milligrams per day. It also lowers LDL, though not as sharply as statins do, and it
has been shown to reduce serum levels of artery-clogging triglycerides as much as 50 percent. Its principal
side effect is an irritating flush caused by the vitamin’s dilation of blood vessels.
Despite its effectiveness, niacin has been the ugly duckling of heart medications, an old remedy that few
scientists cared to examine. But that seems likely to change.
“There’s a great unfilled need for something that raises HDL,” said Dr. Steven E. Nissen, a cardiologist at
the Cleveland Clinic and president of the American College of Cardiology. “Right now, in the wake of the
failure of torcetrapib, niacin is really it. Nothing else available is that effective.”
In 1975, long before statins, a landmark study of 8,341 men who had suffered heart attacks found that
niacin was the only treatment among five tested that prevented second heart attacks. Compared with men on
placebos, those on niacin had a 26 percent reduction in heart attacks and a 27 percent reduction in strokes.
Fifteen years later, the mortality rate among the men on niacin was 11 percent lower than among those who
had received placebos.
“Here you have a drug that was about as effective as the early statins, and it just never caught on,” said
Dr. B. Greg Brown, professor of medicine at the University of Washington in Seattle. “It’s a mystery to me. But
if you’re a drug company, I guess you can’t make money on a vitamin.”
By and large, research was focused on lowering LDL, and the statins proved to be remarkably effective.
The drugs can slow the progress of cardiovascular disease, reducing the risk of heart attack or other adverse
outcomes by 25 percent to 35 percent.
But recent studies suggest that the addition of an HDL booster like niacin may afford still greater protection.
After analyzing data from more than 83,000 heart patients who participated in 23 different clinical trials,
researchers at the University of Washington calculated this month that a regimen that increased HDL by 30
percent and lowered LDL by 40 percent in the average patient would reduce the risk of heart attack or stroke
by 70 percent. That is far more than can be achieved by reducing LDL alone.
Other small studies have produced similarly encouraging results, but some experts caution that the data on
increased HDL and heart disease are preliminary.
Researchers at 72 sites in the United States and Canada are recruiting 3,300 heart patients for a study, led
by Dr. Brown and financed by the National Institutes of Health, comparing those who take niacin and a statin
with those who take only a statin. This large head-on comparison should answer many questions about the
benefits of combination therapy.
Many cardiologists see no reason to wait for the results. But niacin can be a bitter pill; in rare instances, the
vitamin can cause liver damage and can impair the body’s use of glucose. High doses should be taken only
under a doctor’s supervision.
A more frequent side effect is flushing. It becomes less pronounced with time, and often it can be avoided
by taking the pills before bed with a bit of food. Doctors also recommend starting with small doses and
working up to larger ones.
Extended-release formulations of the vitamin, taken once daily, are now available by prescription, and in
many patients they produce fewer side effects. And a new Merck drug to counteract niacin-induced flushing is
being tested in Britain. If it works, the company plans to bundle the drug with its own extended-release niacin
and with Zocor, its popular statin.
Until then, consider this: If it means preventing a heart attack, maybe it is better to put up with flushing
than to wait for the next blockbuster.
“If you can just get patients to take niacin, HDL goes up substantially,” said Dr. Nissen of the Cleveland
Clinic. “Most of the evidence suggests they’ll get a benefit from that.”
Study Says Tapping of Granite Could Unleash Energy Source
By ANDREW C. REVKIN
The United States could generate as much electricity by 2050 as that flowing today from all of the country’s
nuclear power plants by developing technologies that tap heat locked in deep layers of granite, according to a
new study commissioned by the Energy Department.
There are already dozens of power plants worldwide that have long exploited hot spots of geothermal
energy to drive steam turbines, but they are restricted to a few areas.
The new report, published online yesterday, focuses on a process that it said could affordably harvest heat locked in
deep layers of granite that exist almost everywhere on earth. The technique, called enhanced geothermal, involves
drilling several holes — some two to three miles deep — into granite that has been held at chicken-roasting
temperatures, around 400 degrees or more, by insulating layers of rock above.
In the right geological conditions, pressurized water can be used to widen natural mazelike arrays of cracks in the
granite, creating a vast, porous subterranean reservoir.
In a typical setup, water pumped down into the reservoir through one hole absorbs heat from the rock and flows up
another hole to a power plant, giving up its heat to generate steam and electricity before it is recirculated in the
rock below.
There are successful plants harvesting heat from deep hot rock in Australia, Europe and Japan, the report noted,
adding that studies of the technology largely stopped in the United States after a brief burst of research during the
oil crises of the 1970s.
The report’s 18 authors, from academia, government and industry, said that a public investment of less
than $1 billion spread over 15 years would probably be enough to overcome technical hurdles and do initial
large-scale deployment of the technology.
The generating capacity by 2050 could be 100 billion watts, about 10 percent of the country’s current
generating capacity.
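As a quick sanity check on those figures (simple arithmetic, not information from the report): 100 billion watts
being "about 10 percent" of current capacity implies a present-day US generating capacity on the order of 1,000
gigawatts, which is roughly right.

```python
# Quick consistency check of the capacity figures quoted above.
projected_geothermal_watts = 100e9   # 100 billion watts by 2050 (from the report)
share_of_current_capacity = 0.10     # "about 10 percent" of current capacity

implied_current_capacity_gw = projected_geothermal_watts / share_of_current_capacity / 1e9
print(f"Implied current US generating capacity: {implied_current_capacity_gw:.0f} GW")  # ~1,000 GW
```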
David Keith, an expert on energy technologies at the University of Calgary who was not involved with the
study, said there were significant, but surmountable, hurdles to doing such operations at large scale.
Among them, Professor Keith said, are cutting the costs of drilling deep holes and increasing the efficiency
of systems that can generate electricity from relatively low-temperature sources of heat like deep rock.
“There’s no question there’s a lot of heat down there,” he said. “It’s about the cost of access, and about
the value of low-grade heat.”
Jefferson W. Tester, the lead author of the study and a chemical engineer at the Massachusetts Institute of
Technology, said there were many new justifications for aggressively pursuing this kind of energy option.
“Back then, we weren’t worried about carbon dioxide and climate, we weren’t running short of natural gas,
and now energy is a national security issue in the long run,” Dr. Tester said. “While there’s no guarantee it’s
going to work, this is not an unreasonable investment and it’s a good bet on the future.”
A Radical Step to Preserve a Species: Assisted Migration
By CARL ZIMMER
The Bay checkerspot butterfly’s story is all too familiar. It was once a common sight in the San Francisco
Bay area, but development and invasive plants have wiped out much of its grassland habitat.
Conservationists have tried to save the butterfly by saving the remaining patches where it survives. But
thanks to global warming, that may not be good enough.
Climate scientists expect that the planet will become warmer in the next century if humans continue to
produce greenhouse gases like carbon dioxide. The California Climate Change Center projects the state’s
average temperature will rise 2.6 to 10.8 degrees Fahrenheit. Warming is also expected to cause bigger
swings in rainfall.
Studies on the Bay checkerspot butterfly suggest that this climate change will push the insect to extinction.
The plants it depends on for food will shift their growing seasons, so that when the butterfly eggs hatch, the
caterpillars have little to eat. Many other species may face a similar threat, and conservation biologists are
beginning to confront the question of how to respond. The solution they prefer would be to halt global
warming. But they know they may need to prepare for the worst.
One of the most radical strategies they are considering is known as assisted migration. Biologists would
pick a species up and move it hundreds of miles to a cooler place.
Assisted migration triggers strong, mixed feelings from conservation biologists. They recognize that such a
procedure would be plagued by uncertainties and risk. And yet it may be the only way to save some of the
world’s biodiversity.
“Some days I think this is absolutely, positively something that has to be done,” said Dr. Jessica Hellmann
of the University of Notre Dame. “And other days I think it’s a terrible idea.”
Conservation biologists are talking seriously about assisted migration because the effects of climate change
are already becoming clear. The average temperature of the planet is 1.6 degrees Fahrenheit higher than it
was in 1880. Dr. Camille Parmesan, a biologist at the University of Texas, reviewed hundreds of studies on the
ecological effects of climate change this month in the journal Annual Review of Ecology, Evolution, and
Systematics. Many plant species are now budding earlier in the spring. Animals migrate earlier as well. And
the ranges of many species are shifting to higher latitudes, as they track the climate that suits them best.
This is hardly the first time that species have moved in response to climate change. For over two million
years, the planet has swung between ice ages and warm periods, causing some species to shift their ranges
hundreds of miles. But the current bout of warming may be different. The earth was already relatively warm
when it began. “These species haven’t seen an earth as warm as this one’s going to be in a long, long time,”
said Dr. Mark Schwartz, a conservation biologist at the University of California, Davis.
It’s also going to be more difficult for some species to move, Dr. Schwartz added. When the planet warmed
at the end of past ice ages, retreating glaciers left behind empty landscapes. Today’s species will face an
obstacle course made of cities, farms and other human settlements.
Animals and plants will also have to move quickly. If a species cannot keep up with the shifting climate, its
range will shrink. Species that are already limited to small ranges may not be able to survive the loss.
In 2004, an international team of scientists estimated that 15 percent to 37 percent of species would
become extinct by 2050 because of global warming. “We need to limit climate change or we wind up with a
lot of species in trouble, possibly extinct,” said Dr. Lee Hannah, a co-author of the paper and chief climate
change biologist at the Center for Applied Biodiversity Science at Conservation International.
Some scientists have questioned that study’s methods. Dr. Schwartz calls it an overestimate. Nevertheless,
Dr. Schwartz said that more conservative estimates would still represent “a serious extinction.”
Many conservation biologists believe that conventional strategies may help combat extinctions from global
warming. Bigger preserves, and corridors connecting them, could give species more room to move.
Conservation biologists have also been talking informally about assisted migration. The idea builds on past
efforts to save endangered species by moving them to parts of their former ranges. The gray wolf, for
example, has been translocated from Canada to parts of the western United States with great success.
When Dr. Jason McLachlan, a Notre Dame biologist, gives talks on global warming and extinction,
“someone will say, ‘It’s not a problem, since we can just FedEx them to anywhere they need to go,’ ” he said.
No government or conservation group has yet begun an assisted migration for global warming. But
discussions have started. “We’re thinking about these issues,” said Dr. Patrick Gonzalez, a climate scientist at
the Nature Conservancy.
The conservancy is exploring many different ways to combat extinctions from global warming, and Dr.
Gonzalez says that assisted migration “could certainly be one of the options.” For now, the conservancy has
no official policy on assisted migration.
As Dr. McLachlan began hearing about assisted migration more often, he became concerned that
conservation biologists were not weighing it scientifically. He joined with Dr. Schwartz and Dr. Hellmann to lay
out the terms of the debate in a paper to be published in the journal Conservation Biology.
Dr. McLachlan and his colleagues argue that assisted migration may indeed turn out to be the only way to
save some species. But biologists need to answer many questions before they can do it safely and effectively.
The first question would be which species to move. If tens of thousands are facing extinction, it will
probably be impossible to save them all. Conservation biologists will have to make the painful decision about
which species to try to save. Some species threatened by climate change, including polar bears and other
animals adapted to very cold climates, may have nowhere to go.
The next challenge will be to decide where to take those species. Conservation biologists will have to
identify regions where species can survive in a warmer climate. But to make that prediction, scientists need to
know how climate controls the range of species today. In many countries, including the United States, that
information is lacking.
“We don’t even know where species are now,” Dr. McLachlan said.
Simply moving a species is no guarantee it will be saved, of course. Many species depend intimately on
other species for their survival. If conservation biologists move the Bay checkerspot butterfly hundreds of
miles north to Washington, for example, it may not be able to feed on the plants there. Conservation
biologists may have to move entire networks of species, and it may be hard to know where to draw the line.
Assisted migration is plagued not only with uncertain prospects of success, but potential risks as well. A
transplanted species would, in essence, be an invasive one. And it might thrive so well that it would start to
harm other species. Invasive species are among the biggest threats to biodiversity in some parts of the world.
Many were accidentally introduced but some were intentionally moved with great confidence that they would
do no harm. Cane toads were introduced in Australia to destroy pests on sugar plantations, and they
proceeded to wipe out much of the continent’s wildlife.
“If you’re trying to protect a community of species, you’re not going to want someone to introduce some
tree from Florida,” Dr. Hellmann said. “But if you’re someone watching that tree go extinct, you’re going to
want to do it.”
Dr. Hellmann and her colleagues do not endorse or condemn assisted migration in their new paper. Instead,
they call for other conservation biologists to join in a debate. They hope to organize a meeting this summer to
have experts share their ideas.
“There really needs to be a clear conversation about this, so that we can lay all the chips on the table,” Dr.
Schwartz said.
Other experts on global warming and extinctions praised the new paper for framing the assisted migration
debate. “It’s certainly on everybody’s mind, and people are discussing it quite a lot,” Dr. Hannah said. “This
paper’s a breakthrough in that sense.”
Dr. Hannah for one is leery of moving species around. “I’m not a huge fan of assisted migration, but there’s
no question we’ll have to get into it to some degree,” he said. “We want to see it as a measure of last resort,
and get into it as little as possible.”
It is possible that conservation biologists may reject assisted migration in favor of other strategies, Dr.
McLachlan said. But the hard questions it raises will not go away. As species shift their ranges, some of them
will push into preserves that are refuges for endangered species.
“Even if we don’t move anything, they’re going to be moving,” Dr. McLachlan said. “Do we eradicate
them? All of these issues are still relevant.”
Do You Believe in Magic?
By BENEDICT CAREY
A graduate school application can go sour in as
many ways as a blind date. The personal essay
might seem too eager, the references too casual.
The admissions officer on duty might be nursing a
grudge. Or a hangover.
Rachel Riskind of Austin, Tex., nonetheless has a
good feeling about her chances for admittance to
the University of Michigan’s exclusive graduate
program in psychology, and it’s not just a matter of
her qualifications.
On a recent afternoon, as she was working on the admissions application, she went out for lunch with coworkers. Walking from the car to the restaurant in a misting rain, she saw a woman stroll by with a Michigan
umbrella.
“I felt it was a sign; you almost never see Michigan stuff here,” said Ms. Riskind, 22. “And I guess I think
that has given me a kind of confidence. Even if it’s a false confidence, I know that that in itself can help
people do well.”
Psychologists and anthropologists have typically turned to faith healers, tribal cultures or New Age
spiritualists to study the underpinnings of belief in superstition or magical powers. Yet they could just as well
have examined their own neighbors, lab assistants or even some fellow scientists. New research demonstrates
that habits of so-called magical thinking — the belief, for instance, that wishing harm on a loathed colleague
or relative might make him sick — are far more common than people acknowledge.
These habits have little to do with religious faith, which is much more complex because it involves large
questions of morality, community and history. But magical thinking underlies a vast, often unseen universe of
small rituals that accompany people through every waking hour of a day.
The appetite for such beliefs appears to be rooted in the circuitry of the brain, and for good reason. The
sense of having special powers buoys people in threatening situations, and helps soothe everyday fears and
ward off mental distress. In excess, it can lead to compulsive or delusional behavior. This emerging portrait of
magical thinking helps explain why people who fashion themselves skeptics cling to odd rituals that seem to
make no sense, and how apparently harmless superstition may become disabling.
The brain seems to have networks that are specialized to produce an explicit, magical explanation in some
circumstances, said Pascal Boyer, a professor of psychology and anthropology at Washington University in St.
Louis. In an e-mail message, he said such thinking was “only one domain where a relevant interpretation that
connects all the dots, so to speak, is preferred to a rational one.”
Children exhibit a form of magical thinking by about 18 months, when they begin to create imaginary
worlds while playing. By age 3, most know the difference between fantasy and reality, though they usually still
believe (with adult encouragement) in Santa Claus and the Tooth Fairy. By age 8, and sometimes earlier, they
have mostly pruned away these beliefs, and the line between magic and reality is about as clear to them as it
is for adults.
It is no coincidence, some social scientists believe, that youngsters begin learning about faith around the
time they begin to give up on wishing. “The point at which the culture withdraws support for belief in Santa
and the Tooth Fairy is about the same time it introduces children to prayer,” said Jacqueline Woolley, a
professor of psychology at the University of Texas. “The mechanism is already there, kids have already spent
time believing that wishing can make things come true, and they’re just losing faith in the efficacy of that.”
If the tendency to think magically were no more than self-defeating superstition, then over the pitiless
history of human evolution it should have all but disappeared in intellectually mature adults.
Yet in a series of experiments published last summer, psychologists at Princeton and Harvard showed how
easy it was to elicit magical thinking in well-educated young adults. In one instance, the researchers had
participants watch a blindfolded person play an arcade basketball game, and visualize success for the player.
The game, unknown to the subjects, was rigged: the shooter could see through the blindfold, had practiced
extensively and made most of the shots.
On questionnaires, the spectators said later that they had probably had some role in the shooter’s success.
A comparison group of participants, who had been instructed to visualize the player lifting dumbbells, was far
less likely to claim such credit.
In another experiment, the researchers demonstrated that young men and women instructed on how to
use a voodoo doll suspected that they might have put a curse on a study partner who feigned a headache.
And they found, similarly, that devoted fans who watched the 2005 Super Bowl felt somewhat responsible for
the outcome, whether their team won or lost. Millions in Chicago and Indianapolis are currently trying to
channel the winning magic.
“The question is why do people create this illusion of magical power?” said the lead author, Emily Pronin,
an assistant professor of psychology and public affairs at Princeton. “I think in part it’s because we are
constantly exposed to our own thoughts, they are most salient to us” — and thus we are likely to overestimate
their connection to outside events.
The brain, moreover, has evolved to make snap judgments about causation, and will leap to conclusions
well before logic can be applied. In an experiment presented last fall at the Society for Neuroscience meeting,
Ben Parris of the University of Exeter in England presented magnetic resonance imaging scans taken from the
brains of people watching magic tricks. In one, the magician performed a simple sleight of hand: he placed a
coin in his palm, closed his fingers over it, then opened his hand to reveal that the coin was gone.
Dr. Parris and his colleagues found spikes of activity in regions of the left hemisphere of the brain that
usually become engaged when people form hypotheses in uncertain situations.
These activations occur so quickly, other researchers say, that they often link two events based on nothing
more than coincidence: “I was just thinking about looking up my high school girlfriend when out of the blue
she called me,” or, “The day after I began praying for a quick recovery, she emerged from the coma.”
For people who are generally uncertain of their own abilities, or slow to act because of feelings of
inadequacy, this kind of thinking can be an antidote, a needed activator, said Daniel M. Wegner, a professor of
psychology at Harvard. (Dr. Wegner was a co-author of the voodoo study, with Kimberly McCarthy of Harvard
and Sylvia Rodriguez of Princeton.)
“I deal with students like this all the time and I say, ‘Let’s get you overconfident,’ ” Dr. Wegner said. “This
feeling that your thoughts can somehow control things can be a needed feeling” — the polar opposite of the
helplessness, he added, that so often accompanies depression.
Magical thinking is most evident precisely when people feel most helpless. Giora Keinan, a professor at Tel
Aviv University, sent questionnaires to 174 Israelis after the Iraqi Scud missile attacks of the 1991 gulf war.
Those who reported the highest level of stress were also the most likely to endorse magical beliefs, like “I
have the feeling that the chances of being hit during a missile attack are greater if a person whose house was
attacked is present in the sealed room,” or “To be on the safe side, it is best to step into the sealed room right
foot first.”
“It is of interest to note,” Dr. Keinan concluded, “that persons who hold magical beliefs or engage in
magical rituals are often aware that their thoughts, actions or both are unreasonable and irrational. Despite
this awareness, they are unable to rid themselves of such behavior.”
On athletic fields, at the craps table or out sailing in the open ocean, magical thinking is a way of life.
Elaborate, entirely nonsensical rituals are performed with solemn deliberation, complete with theories of
magical causation.
“I am hoping I do not change my clothes for the rest of the season, that I really start to stink,” said Tom
Livatino, head basketball coach at Lincoln Park High School in Chicago, who wears the same outfit as long as
his team is winning. (And it usually does.)
The idea, Mr. Livatino said, is to do as much as possible to recreate the environment that surrounds his
team’s good play. He doesn’t change his socks; he doesn’t empty his pockets; and he works the sideline with
the sense he has done everything possible to win. “The full commitment,” he explained. “I’ll do anything to
give us an edge.”
Only in extreme doses can magical thinking increase the likelihood of mental distress, studies suggest.
People with obsessive-compulsive disorder are often nearly paralyzed by the convictions that they must
perform elaborate rituals, like hand washing or special prayers, to ward off contamination or disaster. The
superstitions, perhaps harmless at the outset, can grow into disabling defense mechanisms.
Those whose magical thoughts can blossom into full-blown delusion and psychosis appear to be a
fundamentally different group in their own right, said Mark Lenzenweger, a professor of clinical science,
neuroscience and cognitive psychology at Binghamton, part of the State University of New York. “These are
people for whom magical thinking is a central part of how they view the world,” not a vague sense of having
special powers, he said. “Whereas with most people, if you were to confront them about their magical beliefs,
they would back down.”
Reality is the most potent check on runaway magical thoughts, and in the vast majority of people it
prevents the beliefs from becoming anything more than comforting — and disposable — private rituals. When
something important is at stake, a test or a performance or a relationship, people don’t simply perform their
private rituals: they prepare. And if their rituals start getting in the way, they adapt quickly.
Mr. Livatino lives and breathes basketball, but he also recently was engaged to be married.
“I can tell you she doesn’t like the clothes superstition,” he said. “She has made that pretty clear.”
Did the dinosaurs invent biplane technology?
* 22:00 22 January 2007
* NewScientist.com news service
* Jeff Hecht
Microraptor gui, a little dinosaur with four
feathered limbs, may have glided through the air
like a biplane, with its wings paired in parallel,
say palaeontologists.
If the hypothesis is correct, it would be the
only known example of a living creature
employing such a flight mechanism. The
Microraptor fossil was found in China (see Four-winged dinosaur makes feathers fly) and
measures just 77 centimetres from the nose to
the tip of its long tail. It dates from 125 million
years before biplanes were invented.
Researchers originally suggested that
Microraptor spread both arms and legs to the
sides of its body to form two pairs of gliding
wings. But palaeontologist Sankar Chatterjee of
Texas Tech University in Lubbock, US, says that would have been aerodynamically inefficient and that
Microraptor's legs could not be splayed sideways.
Microraptor gui from China compared with the Wright 1903 Flyer (Image: Jeff Martz)
No choice
He says the dinosaur instead folded its legs under its body like modern raptors catching prey, with its long
leg feathers sticking out to the side. The asymmetric foot feathers must have had their narrow side facing
forward to smooth the flow of air around the leg.
"Once you do this, you have no other choice but the biplane design," Chatterjee told New Scientist. This
creates a second set of wings below the body and behind the arm wings, which is "a more anatomically and
aerodynamically stable configuration", he believes.
Gregory Paul, a palaeontologist not involved in the study, is unconvinced by the theory, although he allows that
the model is plausible. Although fossils clearly show flight feathers, their orientation – critical to understanding
their role in flight – remains unclear.
"We don't know what they're doing with those hind feathers," Paul told New Scientist. He thinks
Microraptor used its hind wings only for gliding, folding them out of the way for powered flight. Clearer fossil
finds may provide a definitive answer.
Journal reference: Proceedings of the National Academy of Sciences (DOI: 10.1073/pnas.0609975104)
Researchers: Microwave oven can sterilize sponges, scrub pads
Filed under Research, Health, Engineering on Monday, January 22, 2007.
PLEASE NOTE: To guard against the risk of fire, people who wish to sterilize their sponges at home
must ensure the sponge is completely wet. Two minutes of microwaving is sufficient for most
sterilization. Sponges should also have no metallic content. Last, people should be careful when
removing the sponge from the microwave as it will be hot.
GAINESVILLE, Fla. — Microwave ovens may be good for more than just zapping the leftovers; they may also help
protect your family.
University of Florida engineering researchers have found that microwaving kitchen sponges and plastic
scrubbers — known to be common carriers of the bacteria and viruses that cause food-borne illnesses –
sterilizes them rapidly and effectively.
That means that the estimated 90-plus percent of Americans with microwaves in their kitchens have a
powerful weapon against E. coli, salmonella and other bugs at the root of increasing incidents of potentially
deadly food poisoning and other illnesses.
“Basically what we find is that we could knock out most bacteria in two minutes,” said Gabriel Bitton, a UF
professor of environmental engineering. “People often put their sponges and scrubbers in the dishwasher, but
if they really want to decontaminate them and not just clean them, they should use the microwave.”
Bitton, an expert on wastewater microbiology, co-authored a paper about the research that appears in the
December issue of the Journal of Environmental Health, the most recent issue. The other authors are Richard
Melker, a UF professor of anesthesiology, and Dong Kyoo Park, a UF biomedical engineering doctoral student.
Food-borne illnesses afflict at least 6 million Americans annually, causing at least 9,000 deaths and $4
billion to $6 billion in medical costs and other expenses. Home kitchens are a common source of
contamination, as pathogens from uncooked eggs, meat and vegetables find their way onto countertops,
utensils and cleaning tools. Previous studies have shown that sponges and dishcloths are common carriers of
the pathogens, in part because they often remain damp, which helps the bugs survive, according to the UF
paper.
Bitton said the UF researchers soaked sponges and scrubbing pads in raw wastewater containing a witch’s
brew of fecal bacteria, viruses, protozoan parasites and bacterial spores, including Bacillus cereus spores.
Like many other bacterial spores, Bacillus cereus spores are quite resistant to radiation, heat and toxic
chemicals, and they are notoriously difficult to kill. The UF researchers used the spores as surrogates for cysts
and oocysts of disease-causing parasitic protozoa such as Giardia, the infectious stage of the protozoa. The
researchers used bacterial viruses as a substitute for disease-causing food-borne viruses, such as noroviruses
and hepatitis A virus.
The researchers used an off-the-shelf microwave oven to zap the sponges and scrub pads for varying
lengths of time, wringing them out and determining the microbial load of the water for each test. They
compared their findings with water from control sponges and pads not placed in the microwave.
The results were unambiguous: Two minutes of microwaving on full power mode killed or inactivated more
than 99 percent of all the living pathogens in the sponges and pads, although the Bacillus cereus spores
required four minutes for total inactivation.
Bitton said the heat, rather than the microwave radiation, likely is what proves fatal to the pathogens.
Because the microwave works by exciting water molecules, it is better to microwave wet rather than dry
sponges or scrub pads, he said.
“The microwave is a very powerful and an inexpensive tool for sterilization,” Bitton said, adding that
people should microwave their sponges according to how often they cook, with every other day being a good
rule of thumb.
Spurred by the trend toward home health care, the researchers also examined the effects of microwaving
contaminated syringes. Bitton said the goal in this research was to come up with a way to sterilize syringes
and other equipment that, at home, often gets tossed in the household trash, winding up in standard rather
than hazardous waste landfills.
The researchers also found that microwaves were effective in decontaminating syringes, but that it
generally took far longer, up to 12 minutes for Bacillus cereus spores. The researchers also discovered they
could shorten the time required for sterilization by placing the syringes in heat-trapping ceramic bowls.
Bitton said preliminary research also shows that microwaves might be effective against bioterrorism
pathogens such as anthrax, used in the deadly, still-unsolved 2001 postal attacks.
Using a dose of Bacillus cereus dried on an envelope as a substitute for mail contaminated by anthrax
spores, Bitton said he found he could kill 98 percent of the spores in 10 minutes by microwaving the paper –
suggesting, he said, one possible course of action for people who fear mail might be contaminated. However,
more research is needed to confirm that this approach works against actual anthrax spores, he said.
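For convenience, the exposure times reported across these experiments can be gathered in one place (a summary of the
figures quoted above, not new measurements; times are for a full-power household microwave and, per the note above,
wet, metal-free items):

```python
# Microwave exposure times reported in the UF study (minutes, full power).
exposure_minutes = {
    "sponges/scrub pads: >99% of pathogens killed or inactivated": 2,
    "sponges/scrub pads: Bacillus cereus spores fully inactivated": 4,
    "contaminated syringes: B. cereus spores (up to)": 12,
    "B. cereus spores dried on an envelope: 98% killed": 10,
}

for item, minutes in exposure_minutes.items():
    print(f"{minutes:>2} min  {item}")
```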
Families do not cause anorexia nervosa
Eating disorders researchers counter Bundchen's blunder
PITTSBURGH, Jan. 22 -- Misstatements and ignorant claims that families "cause" eating disorders are like
blaming parents for diabetes or asthma or cancer, says an international group of eating disorders researchers.
Recent damaging statements by fashion model Gisele Bundchen stating that unsupportive families cause
anorexia nervosa only perpetuate misconceptions and further stigmatize eating disorders. Contrary to her
claim, there is no scientific evidence that families cause anorexia nervosa. In fact, the researchers are finding
that anorexia nervosa is far more complex than simply wanting to be slim to achieve some fashionable slender
ideal. The data show that anorexia nervosa has a strong genetic component that may be the root cause of this
illness.
"An uninformed opinion such as Bundchen's causes harm on a number of levels. By contributing to the
stigma, it drives sufferers underground and creates obstacles to seeking help. It damages attempts at
advocacy and hurts parents who are desperately fighting for their child's recovery," said Allan S. Kaplan, M.D.,
Loretta Anne Rogers Chair in Eating Disorders at the University of Toronto. "Such thinking also misinforms
third party payors who may not want to pay for the treatment of these biologically-based illnesses if they think
its primary cause is family dysfunction."
Dr. Kaplan is a member of the international group of researchers attempting to find which genes contribute
to anorexia nervosa through a National Institute of Mental Health-funded study of families with a history of
anorexia nervosa. The current study, which is being conducted at 10 sites across the world, hopes to further
clarify which genes play a role in anorexia nervosa. The study builds on data from ten years of
groundbreaking research on the genetics of eating disorders sponsored by the Price Foundation.
"We often hear that societal pressures to be thin cause many young women and men to develop an eating
disorder. Many individuals in our culture, for a number of reasons, are concerned with their weight and diet.
Yet less than half of 1 percent of all women develop anorexia nervosa, which indicates to us that societal
pressure alone isn't enough to cause someone to develop this disease," said Walter H. Kaye, M.D., professor
of psychiatry, University of Pittsburgh School of Medicine. "Our research has found that genes seem to play a
substantial role in determining who is vulnerable to developing an eating disorder. However, the societal
pressure isn't irrelevant; it may be the environmental trigger that releases a person's genetic risk. Families
should not be blamed for causing anorexia. In fact, they are often devastated and suffer from the
consequences of this illness."
Anorexia nervosa is a serious and potentially lethal illness, with a mortality rate greater than 10 percent. It
is characterized by the relentless pursuit of thinness, emaciation and the obsessive fear of gaining weight.
Anorexia nervosa commonly begins during adolescence but can strike throughout the lifespan, and it is nine times
more common in females than in males. Personality traits, such as perfectionism, anxiety and obsessionality,
are often present in childhood before the eating disorder develops and may contribute to the risk of
developing this disorder.
"We need to understand all the factors that influence eating disorders, both genetic and environmental,
and find ways to address them in order to prevent people from developing these potentially deadly
conditions," said Cynthia Bulik, Ph.D., William and Jeanne Jordan Distinguished Professor of Eating Disorders,
University of North Carolina at Chapel Hill. "Understanding how genes and environment interact both to
increase risk for eating disorders and to protect those who are genetically vulnerable from developing the
disorder will require the cooperation of professionals in the eating disorders field, the media, and the fashion
and entertainment industries. Only cooperatively will we be able to move the field forward toward the
elimination of this disease."
"Anorexia nervosa has the highest death rate of any mental illness, yet so few dollars are dedicated to the
cure," stated Lynn Grefe, CEO of the National Eating Disorders Association. "These scientific advances
demonstrating a genetic component are significant and so meaningful to our families, wiping away the myths
and emphasizing the need for even more research to help the next generation."
PITTSBURGH, Jan. 22 --
Dental Researchers Test No-Needle Anesthesia, No-Drilling Cavity Care
Imagine having a decayed tooth repaired, painlessly, without drilling or shots of anesthesia to
numb the area.
Wishful thinking? Not if two studies being conducted at the University at Buffalo's School of Dental Medicine
show positive results.
BUFFALO, N.Y. --
In one study, funded by a $100,000 grant from Apollonia, LLC, researchers in the school's Center for Dental
Studies are testing a nasal spray that numbs the upper teeth.
"If this study is successful," said Sebastian Ciancio, D.D.S., principal investigator on the study, "it may
mean the end of dental injections when dentists are performing procedures on the upper arch."
The second study, set to begin in coming months, will test the use of ozone to kill bacteria in a decayed
tooth and its potential to eliminate the need for the dreaded drill, at least to repair simple cavities.
Researchers at UB and two other U.S. dental schools will conduct the research, which is funded by a $1.5
million grant from Curozone, Inc. and Kavo Dental Manufacturing Co. UB's portion is $400,000.
Ciancio, who also is the UB principal investigator on this study, said the ozone delivery device currently is
being used in Europe. "If the U.S. studies are successful, it should be available in this country in about two
years," he said.
The nasal spray study is testing the effectiveness in dental procedures of a topical anesthetic normally used
by ear, nose and throat physicians when they operate on the nose. Patients who received this anesthetic for
that purpose reported it also numbed their upper teeth, sparking interest in using it for dental procedures.
"We currently are testing to determine what the optimal dose is for this spray when used as an anesthetic
agent for the maxillary (upper) teeth," said Ciancio. "The current study includes 85 patients and should be
completed by the end of January and will be followed by a second study in March. Once we know the results,
we'll then test it in a broader population."
Co-investigators, all from the UB dental school, are Eugene Pantera, D.D.S., Sandra Shostad, D.D.S., and Joseph
Bonavilla, D.D.S.
The ozone study will evaluate the effectiveness of the ozone delivery device, which fits over a tooth and
forms an airtight seal, in arresting tooth decay. The study will enroll 125 participants and will last 18 months.
"Following application of the ozone, patients will use a remineralizing solution, which strengthens the
weakened tooth structure and, in many cases, eliminates the need for any dental drilling," said Ciancio.
Additional investigators on this study are Othman Shibly, D.D.S., Jude Fabiano, D.D.S., Benita Sobieroj, D.D.S.,
Maureen Donley, D.D.S., and Nina Kim, D.D.S., all from the UB dental school faculty.
Who laid the first egg? An update
Scientists move a step closer to linking embryos of earth's first animals and their adult form
Blacksburg, Va., January 23, 2007 -- A decade ago, Shuhai Xiao, associate professor of geosciences at Virginia Tech,
and his colleagues discovered thousands of 600-million-year-old embryo microfossils in the Doushantuo
Formation, a fossil site near Weng'an, South China. In 2000, Xiao's team reported the discovery of a tubular
coral-like animal that might be a candidate for parenthood.
In the February issue of Geology, the journal of the Geological Society of America, Xiao will report
discoveries about the intermediary stage that links the embryo to the adult. (Cover story "Rare helical
spheroidal fossils from the Doushantuo Lagerstatte: Ediacaran animal embryos come of age?" by Xiao, James
W. Hagadorn of Amherst, and Chuanming Zhou and Xunlai Yuan of Nanjing Institute of Geology and
Paleontology.)
While there are thousands of early-stage embryos, only 80 have been recovered that have
advanced to an intermediary stage of development. The intermediary-stage embryos have an envelope similar
to that of the earlier embryonic stage, and they have a coiled tubular embryo within
the envelope. Their envelope has a groove on the surface, consisting of three
clockwise coils. Using microfocus X-ray computed tomography (microCT) imaging,
the scientists virtually peeled off the envelope and exposed the embryo inside.
The tubular embryo is also coiled, with three clockwise coils. In some specimens,
the scientists found signs of uncoiling. "This is further evidence that these
embryos would have grown into the tubular organisms," Xiao said.
In the article, the researchers state, "… if this possibility holds up to further
testing, the new fossils may bridge the developmental gap between two
previously described Doushantuo forms."
"The discovery of additional intermediary stages and even more advanced
specimens would be the ultimate test," Xiao said. But the conditions that
preserved the ancient embryos may not have been favorable for preserving or
fossilizing more developed life forms, the researchers note. Connecting the first
moments of animal evolution will likely require more use of advanced imaging
techniques.
Shown are scanning electron photomicrographs of two fossil embryo specimens from the 600-million-year-old Doushantuo Formation in South China. The soccer-ball-shaped specimen is interpreted as an early
stage (blastula) embryo, and the baseball-shaped specimen is interpreted as an intermediate-stage helical
embryo consisting of three clockwise coils. Each embryo used to be enclosed in an envelope, which was
removed (some piece still remains in the soccer-ball-shaped specimen) so that the embryo itself is
exposed. Embryos are about 0.55-0.75 millimeter in diameter. Background shows the Doushantuo rocks
from which the embryos were extracted. Shuhai Xiao
The top one does look like a soccer ball, but doesn’t that bottom one look like a baseball? Something is a bit fishy here…
How fish conquered the ocean
Sequence analyses of duplicated yolk genes of bony fishes yield new insights for their successful
radiation in the oceans during the early Paleogene period
Scientists at the University of Bergen, Norway have deduced how bony fishes conquered the oceans by
duplicating their yolk-producing genes and filling their eggs with the water of life – the degradation of yolk
proteins from one of the duplicated genes causes the eggs to fill with vital water and float. This is the major
solution realized by extant marine teleosts that showed an unprecedented radiation during the late Cretaceous
and early Paleogene Periods. The work is a unique hypothesis that integrates the cellular and molecular
physiology of teleost reproduction with their evolutionary and environmental history.
"The oceans have not always been filled with fishes as nowadays" says researcher Dr. Roderick Nigel Finn
at the Department of Biology, University of Bergen, Norway. "To the contrary", Dr Finn says, "the fossil record
shows that the ancestors of the bony fishes (teleosts) inhabited fresh water environments for at least 150
million years before they entered the oceans".
"Apparently, it was not until the Eocene epoch (about 55 million years ago) that an unparalleled and rapid
burst of thousands of new marine teleost species took place as evidenced by their sudden appearance in the
fossil records of marine sediments. The basis for this successful radiation is unexplained and has intrigued
biologists for many years", says Dr. Finn and adds, "Our paper in PLoS ONE relates to the molecular solutions
evolved among the teleost ancestors and provides a compelling hypothesis of when, how and why the teleosts
succeeded in the oceanic environment. It is common knowledge that water is essential for life," continues Dr.
Finn, "so it seems a surprising paradox that fishes that live in water should have a problem acquiring it. Yet it
was this paradox that provided the trail of clues for us to follow".
"The physiological problems of reverting from a fresh water environment to the saline seawater is
demanding for the water balance of fishes", says professor Hans Jørgen Fyhn, a colleague of Dr. Finn, and
adds, "This is especially so for their newly spawned eggs since they lack the adult organs and mechanisms
responsible for coping with these problems. For years we studied various aspects of the physiological
adaptations of the fish egg to the marine environment. It is most satisfying that Dr. Finn has been able to tie
the threads together in molecular and evolutionary terms with their impressive, comparative sequence
alignment study of the involved yolk genes and proteins as published in PLoS ONE".
In the paper the authors (RN Finn & BA Kristoffersen) have used Bayesian analysis to examine the
evolution of vertebrate yolk protein (vitellogenin) genes in relation to the "Three round hypothesis" of whole
genome duplication among vertebrates, and the functional end points of the vitellogenin fractional
degradation during the final stages of oogenesis, a period that prepares the egg for spawning and fertilization.
They show that teleost vitellogenins have undergone a post-R3 lineage-specific gene duplication to form
paralogous clusters that correlate with the pelagic and benthic character of the eggs. The alteration in the
function (neo-functionalization) of one of the duplicated genes (paralogues) allowed its yolk protein products
to be broken down to free amino acids and thus drive hydration of the maturing eggs. The timing of these
events matches the appearance of the vast numbers of marine acanthomorph teleosts in the fossil record. The
authors propose that the neo-functionalization of duplicated vitellogenin genes was a key event in the
evolution and success of the acanthomorph teleosts in the oceanic environment.
"This study is an exciting part of our research focus in Developmental Biology of Fishes, and the work
published in PLoS ONE is clearly a high point of these efforts" says professor Jarl Giske, head of the
Department of Biology at the University of Bergen."It is stimulating to both students and staff at the
department when our researchers are able to contribute to solving great evolutionary problems."
New dopamine brain target discovered
Potential breakthrough for schizophrenia treatment
Toronto, ON, January 23, 2007 -- A team of Canadian researchers, led by Dr. Susan George and Dr. Brian O'Dowd
at the Centre for Addiction and Mental health (CAMH), discovered a distinct dopamine signalling complex in
the brain. Composed of two different types of dopamine receptors, this novel target may have a significant
role in understanding and treating schizophrenia.
Published in the Proceedings of the National Academy of Sciences USA (Rashid et al., 2007), this important
discovery demonstrates the existence of a Gq/11-coupled signalling unit that triggers a calcium signal, which
is turned on by stimulating D1 and D2 dopamine receptors. Unlike other dopamine receptors, this novel unit
will only create brain signals when both receptors are stimulated at the same time.
Using animal models, Drs. George and O'Dowd and their team identified this complex by its unique reaction
to dopamine or specific drug triggers. Strikingly, stimulating this target with dopamine or specific drugs
triggered a rise in calcium in the brain. As calcium has a profound effect on almost all brain function, this rise
in calcium causes a cascade of events in the brain. This is the first time that a direct connection between
dopamine and calcium signals has been reported.
"This distinct unit provides a novel signalling pathway through which dopamine can impact the function of
brain cells", said Dr. George. "This is significant because signalling through calcium release is a major
mechanism regulating many important functions in the brain and we have provided the first direct mechanism
by which dopamine can activate a calcium signal."
This data has significant implications for schizophrenia. Research tells us that people with schizophrenia
may have disordered calcium signals, and the major treatments for this disease target the dopamine system.
Drs. George and O'Dowd state, "our data links these two pieces of evidence, creating better understanding of
the disease and opening the door for a new generation of highly specific drugs that may help alleviate the
devastating symptoms of schizophrenia."
Paleontologists discover most primitive primate skeleton
New Haven, Conn. -- The origins and earliest branches of primate evolution are clearer and more ancient by 10
million years than previous studies estimated, according to a study featured on the cover of the Jan. 23 print
edition of the Proceedings of the National Academy of Sciences.
The paper by researchers at Yale, the University of Winnipeg, Stony Brook University, and led by University
of Florida paleontologist Jonathan Bloch reconstructs the base of the primate family tree by comparing skeletal
and fossil specimens representing more than 85 modern and extinct species. The team also discovered two
56-million-year-old fossils, including the most primitive primate skeleton ever described.
In the two-part study, an extensive evaluation of skeletal structures provides evidence that plesiadapiforms,
a group of archaic mammals once thought to be more closely related to flying lemurs, are the most primitive
primates. The team analyzed 173 characteristics of modern primates, tree shrews and flying lemurs, along with
plesiadapiform skeletons, to determine their evolutionary relationships. High-resolution CT scanning made fine
resolution of inaccessible structures inside the skulls possible.
"This is the first study to bring it all together," said co-author Eric Sargis, associate professor of
anthropology at Yale University and Assistant Curator of Vertebrate Zoology at Yale's Peabody Museum of
Natural History. "The extensive dataset, the number and type of characteristics we were able to compare, and
the availability of full skeletons, let us test far more than any previous study."
At least five major features characterize modern primates: relatively large brains, enhanced vision and eyes
that face forward, a specialized ability to leap, nails instead of claws on at least the first toes, and specialized
grasping hands and feet. Plesiadapiforms have some but not all of these traits. The article argues that these
early primates may have acquired the traits over 10 million years in incremental changes to exploit their
environment.
While the study did not include a molecular evaluation of the samples, according to Sargis, these results
are consistent with molecular studies on related living groups.
Compatibility with the independent molecular data increases the
researchers' confidence in their own results.
Bloch discovered the new plesiadapiform species, Ignacius
clarkforkensis and Dryomomys szalayi, just outside Yellowstone
National Park in the Bighorn Basin with co-author Doug Boyer, a
graduate student in anatomical sciences at Stony Brook. Previously,
based only on skulls and isolated bones, scientists proposed that
Ignacius was not an archaic primate, but instead a gliding mammal
related to flying lemurs. However, analysis of a more complete and
well-preserved skeleton by Bloch and his team altered this idea.
Composite (left) and reconstructed (right) skeletons of D. szalayi, the oldest known ancestor of
primates. (Bloch, et al./ PNAS)
"These fossil finds from Wyoming show that our earliest primate ancestors were the size of a mouse, ate
fruit and lived in the trees," said study leader Jonathan Bloch, a vertebrate paleontology curator at the Florida
Museum of Natural History. "It is remarkable to think we are still discovering new fossil species in an area
studied by paleontologists for over 100 years."
Researchers previously hypothesized plesiadapiforms as the ancestors of modern primates, but the idea
generated strong debate within the primatology community. This study places the origins of Plesiadapiforms in
the Paleocene, about 65 million to 55 million years ago, in the period between the extinction of the dinosaurs
and the first appearance of a number of undisputed members of the modern orders of mammals.
"Plesiadapiforms have long been one of the most controversial groups in mammalian phylogeny," said
Michael J. Novacek, curator of paleontology at the American Museum of Natural History. "First, they are
somewhere near primates and us. Second, historically they have offered tantalizing, but very often incomplete,
fossil evidence. But the specimens in their study are beautifully and spectacularly preserved."
"The results of this study suggest that plesiadapiforms are the critical taxa to study in understanding the
earliest phases of human evolution. As such, they should be of very broad interest to biologists,
paleontologists, and anthropologists," said co-author Mary Silcox, professor of anthropology at the University
of Winnipeg.
"This collaboration is the first to bring together evidence from all regions of the skeleton, and offers a wellsupported perspective on the structure of the earliest part of the primate family tree," Bloch said.
Be afraid, be very afraid, if you learned to
Study on fear responses suggests new understanding of anxiety disorders
WASHINGTON, DC January 23, 2007 – A new study on rats has identified a part of the brain's cortex that controls
learned but not innate fear responses.
The results suggest that hyperactivity in a region of the prefrontal cortex might contribute to disorders of
learned fear in humans, such as post-traumatic stress disorder and other anxiety disorders, say authors Kevin
A. Corcoran, PhD, and Gregory Quirk, PhD, of the Ponce School of Medicine in Puerto Rico. Their report
appears in the January 24 issue of The Journal of Neuroscience.
While building on previous findings, this study contradicts prior thinking that the amygdala, which plays a
central role in emotional learning, is sufficient for processing and expressing fear, and it opens the potential
for new avenues of treatment, the researchers say.
"This is the first paper demonstrating that a region of the cortex is involved in learned fear but not in innate
fear," says Markus Fendt, PhD, of the Novartis Institutes for Biomedical Research in Basel, Switzerland, who is
not connected with the study.
In their study, Corcoran and Quirk taught rats to associate a 30-second tone with a shock to the foot at the
end of the tone. Upon hearing the same tone the next day, rats spent nearly 70 percent of the tone's duration
frozen, a typical fear response.
In another group of rats, the researchers chemically blocked activity in the prelimbic cortex, which is
located near the front of the brain and close to the midline between the two hemispheres. These rats spent
only 14 percent of the time freezing to the sound of the tone.
Yet the rats' innate, or natural, fears seemed unaffected by blocking the prelimbic cortex; they froze as
much in response to seeing a cat or being placed in a large open area as they did to hearing the tone.
Furthermore, when the team trained rats with the tone after chemically inactivating the prelimbic cortex, and
then tested them drug-free the next day, the rats showed a normal fear response, indicating that inactivating
the prelimbic cortex did not prevent them from learning to fear the tone.
The prelimbic cortex is connected to the amygdala, and, based on their findings, Corcoran and Quirk
speculate that "by modulating amygdala activity, the prelimbic cortex is important for determining the
circumstances in which it is appropriate to convey learned fears." In contrast, they propose that fear
responses to innate threats are automatic and do not require cortical involvement.
"Corcoran and Quirk's work raises the question of whether learned fear is more controllable--for example,
by higher brain functions--than innate fear," says Fendt.
Molecular Biology
Brown Team Finds Crucial Protein Role in Deadly Prion Spread
Brown University biologists have made another major advance toward understanding the deadly
work of prions, the culprits behind fatal brain diseases such as mad cow and their human
counterparts. In new work published online in PLoS Biology, researchers show that the protein
Hsp104 must be present and active for prions to multiply and cause disease.
PROVIDENCE, R.I. [Brown University] — A single protein plays a major role in deadly prion diseases by smashing up
clusters of these infectious proteins, creating the “seeds” that allow fatal brain illnesses to quickly spread, new
Brown University research shows.
The findings are exciting, researchers say, because they might reveal a way to control the spread of prions
through drug intervention. If a drug could be made that inhibits this fragmentation process, it could
substantially slow the spread of prions, which cause mad cow disease and scrapie in animals and, in rare
cases, Creutzfeldt-Jakob disease and kuru in humans.
Because similar protein replication occurs in Alzheimer’s and Parkinson’s diseases, such a drug could slow
the progression of those diseases as well.
“The protein fragmentation we studied has a big impact on how fast prion diseases spread and may also
play a role in the accumulation of toxic proteins in neurodegenerative diseases like Parkinson’s,” said Tricia
Serio, an assistant professor in Brown’s Department of Molecular Biology, Cell Biology and Biochemistry and
lead researcher on the project.
The findings from Serio and her team, which appear online in PLoS Biology, build on their groundbreaking
work published in Nature in 2005. That research showed that prions – strange, self-replicating proteins that
cause fatal brain diseases – convert healthy protein into abnormal protein through an ultrafast process.
This good-gone-bad conversion is one way that prions multiply and spread disease. But scientists believe
that there is another crucial step in this propagation process – fragmentation of existing prion complexes.
Once converted, the thinking goes, clusters of “bad” or infectious protein are smashed into smaller bits, a
process that creates “seeds” so that prions multiply more quickly in the body. Hsp104, a molecule known to be
required for prion replication, could function as this protein “crusher,” Serio thought.
To test these ideas, Serio and members of her lab studied Sup35, a yeast protein similar to the human
prion protein PrP. They put Sup35 together with Hsp104, then activated and deactivated Hsp104. They found
that the protein does, indeed, chop up Sup35 complexes – the first direct evidence that this process occurs in
a living cell and that Hsp104 is the culprit.
“To understand how fragmentation speeds the spread of prions, think of a dandelion,” Serio said. “A
dandelion head is a cluster of flowers that each carries a seed. When the flower dries up and the wind blows,
the seeds disperse. Prion protein works the same way. Hsp104 acts like the wind, blowing apart the flower
and spreading the seeds.”
Serio said that prions still multiply without fragmentation. However, she said, they do so at a much slower
rate. So a drug that blocked the activity of Hsp104 could seriously slow progression of prion-related diseases.
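To make the seeding argument concrete, the following short Python sketch (a toy model written for this summary, not the authors' model) assumes that every aggregate converts a fixed amount of healthy protein per time step and that an Hsp104-like activity splits any aggregate above a hypothetical threshold size into two new seeds. With fragmentation off, the converted protein grows roughly linearly; with it on, growth is roughly exponential because each split creates another growing seed.

# Toy model of prion spread, for illustration only; the growth rate and
# split threshold below are hypothetical, not values from the study.
GROWTH_PER_STEP = 2   # healthy proteins converted per aggregate per step
SPLIT_SIZE = 10       # aggregates at or above this size are split in two
STEPS = 30

def simulate(fragmentation):
    aggregates = [1]          # start from a single small seed
    for _ in range(STEPS):
        # each aggregate converts healthy protein into the prion form
        aggregates = [size + GROWTH_PER_STEP for size in aggregates]
        if fragmentation:
            # an Hsp104-like "crusher" smashes large aggregates into two seeds
            split = []
            for size in aggregates:
                if size >= SPLIT_SIZE:
                    split.extend([size // 2, size - size // 2])
                else:
                    split.append(size)
            aggregates = split
    return sum(aggregates)    # total converted (prion-form) protein

print("converted protein without fragmentation:", simulate(False))
print("converted protein with fragmentation:   ", simulate(True))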
Former graduate student Prasanna Satpute-Krishnan and research associate Sara Langseth, also in Brown’s
Department of Molecular Biology, Cell Biology and Biochemistry, conducted the work with Serio.
The National Cancer Institute, the National Institute of General Medical Sciences, and the Pew Scholars Program
in the Biomedical Sciences funded the research.
Rx for wrong-site surgery -- 2 minutes of conversation
A study of Johns Hopkins surgeons, anesthesiologists and nurses suggests that hospital policies requiring a
brief preoperation "team meeting" to make sure surgery is performed on the right patient and the right part of
the body could decrease errors.
In the study, which will appear in the February issue of the Journal of the American College of Surgeons,
Hopkins OR personnel were "very positive" about the briefings, according to surgeon Martin Makary, M.D.,
M.P.H., director of the Johns Hopkins Center for Surgical Outcomes Research and lead author of the study.
"Although we lack systems for uniform reporting of wrong-site surgeries to understand the extent of the
problem, we observed that team meetings increase the awareness of OR personnel with regard to the site and
procedure and their perceptions of operating room safety," says Makary. He stressed that wrong-site surgery
is exceptionally rare but entirely preventable.
A study published last year in the Archives of Surgery that looked at 2.8 million operations in Massachusetts
over a 20-year period suggests that the rate of "wrong-site" surgery anywhere other than the spine is 1 in
every 112,994 operations. The study excluded the spine because researchers defined wrong-site surgeries as
operations conducted on a different organ or body part than intended by the surgeon and patient. Since the
spine is one body part, even though a surgeon may have operated on the wrong part of the spine, technically
it is still the right part of the body.
The Joint Commission, which evaluates and accredits nearly 15,000 health care organizations and programs
in the United States, requires hospitals to have a presurgical conversation in the OR before every surgery.
Although Makary says no national standard was set by the Joint Commission, he and others led efforts at
Hopkins to enforce the mandate, developing a standardized OR briefing program that became Hopkins
Hospital policy in June 2006. Since then, he has collaborated with Rochester University, Yale, Columbia and
Cornell and the World Health Organization to broaden the use and reach of the Hopkins program.
The briefing consists of a two-minute meeting during which all members of the OR team state their name
and role, and the lead surgeon identifies and verifies such critical components of the operation as the patient's
identity, the surgical site and other patient safety concerns. The briefing is performed after anesthesia is
administered and prior to incision.
A survey of 147 surgeons, 59 anesthesiologists, 187 nurses and 29 other OR staff was given twice: before the policy was implemented and after it had been in effect for three months.
After training, the proportion of OR personnel who believed the policy would be effective rose by 13.2
percent. More than 90 percent agreed that "a team discussion before a surgical
procedure is important for patient safety."
"The Joint Commission identified communication breakdowns as the most common root cause of wrong-site
surgeries," says Makary. "Our research indicates that OR personnel see presurgical briefings as a useful tool to
help prevent such errors."
Before the new policy was implemented, Makary notes, many surgeons would walk into the OR and start
working without a conversation of any kind and without even knowing the names of the nurses and other staff
who were assisting them.
A recently licensed nicotine receptor stimulant trebles the odds of stopping smoking
The new anti-smoking drug varenicline was first licensed for use in the UK on 5th December 2006. An early
Cochrane Review¹ of its effectiveness shows that it can give a three-fold increase in the odds of a person
quitting smoking. Varenicline is the first new anti-smoking drug in the last ten years, and only the third, after
NRT and bupropion, to be licensed in the USA for smoking cessation.
People become addicted to smoking tobacco partly because nicotine in the smoke stimulates receptors in
the nervous system that cause a release of the feel-good hormone dopamine. Varenicline partially stimulates
these nicotine receptors and enables a low-level release of dopamine, which reduces withdrawal symptoms. It
also partially blocks nicotine from being absorbed by the receptors, making continued smoking less satisfying.
This reduces a person’s need to smoke, and may help them to quit completely.
This conclusion was drawn by a group of Cochrane researchers after they studied data from six trials that
compared the effects of giving people either varenicline or a placebo. Together the trials involved 2451 people
on varenicline and 2473 people on placebos.
Pooling the data showed that people taking varenicline approximately trebled their odds of quitting for 12
months or longer compared with those on a placebo.
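For readers unfamiliar with odds ratios, the short Python sketch below shows how a pooled estimate of roughly three can be obtained from quit and non-quit counts; the numbers are invented purely for illustration and are not the trial data.

# Hypothetical counts, for illustration only (not the actual trial results).
# Each entry gives (quitters, non-quitters) for the two arms of a trial.
trials = [
    {"varenicline": (80, 320), "placebo": (30, 370)},
    {"varenicline": (95, 305), "placebo": (35, 365)},
]

def odds(quitters, non_quitters):
    return quitters / non_quitters

# Crude pooled estimate: sum the counts across trials, then compare the odds.
# (Real meta-analyses weight each trial, e.g. with Mantel-Haenszel methods.)
v_q = sum(t["varenicline"][0] for t in trials)
v_n = sum(t["varenicline"][1] for t in trials)
p_q = sum(t["placebo"][0] for t in trials)
p_n = sum(t["placebo"][1] for t in trials)

print("pooled odds ratio:", round(odds(v_q, v_n) / odds(p_q, p_n), 2))
# prints about 3.2 with these made-up numbers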
Data from some of the trials also showed that people given varenicline increased their odds of quitting
more than 1½-fold compared with those given bupropion, an antidepressant drug that roughly doubles a
person’s chance of stopping smoking (see the next press release).
"What we need now are some trials that make direct comparisons between varenicline and nicotine
replacement therapy" says Lead Review Author Kate Cahill, who works in the Department of Primary Health
Care at Oxford University.
New evidence boosts the conclusion that some antidepressants can double a smoker’s chance of
quitting
The most recent Cochrane Review² concluded that the antidepressants bupropion (Zyban) and nortriptyline double
a person’s chances of giving up smoking and have few side-effects, but selective serotonin reuptake inhibitors
(SSRIs) such as fluoxetine (Prozac) are not effective.
Although nicotine medications are known to help people quit smoking, not everyone is helped by them or
wants to use them. One possible alternative is to use antidepressants. The rationale for this is that some
people may smoke to combat depression and that stopping smoking could trigger depressive symptoms in
some smokers.
A Cochrane review first published in 1997 (and last updated in 2004) showed that the antidepressants
bupropion and nortriptyline increase a person’s chances of giving up smoking, but selective serotonin reuptake
inhibitors such as fluoxetine (Prozac) do not. An updated version of the review, published in January 2007,
adds 17 more trials to the dataset and shows that bupropion and nortriptyline double a person’s chance
of quitting; again, SSRIs have no effect.
"Since bupropion and nortriptyline appear to work as well in non-depressed as depressed persons, this
suggests they help smokers quit in some way other than as antidepressants," says John Hughes, a Professor
in the Department of Psychiatry at the University of Vermont, Burlington, USA.
Genes reveal West African heritage of white Brits
* 11:24 24 January 2007
* NewScientist.com news service
* Roxanne Khamsi
Gene tests on a sample of “indigenous” Englishmen have thrown up a surprise black ancestry, providing
new insight into a centuries-old African presence in Britain.
The research, funded by the Wellcome Trust, identified a rare West African Y chromosome in a group of
men from Yorkshire who share a surname that dates back at least as far as the mid-14th century and have a
typical European appearance. They owe their unusual Y chromosome to an African man living in England at
least 250 years ago and perhaps as early as Roman times, the researchers say.
Mark Jobling at the University of Leicester, UK, and colleagues recruited 421 men who described
themselves as British and analysed their genes as part of a survey of British Y chromosome diversity. To the
researchers’ surprise, they found that one individual in the study carried a very rare Y chromosome, called
hgA1.
This particular variant has previously been identified in only 26 people worldwide, three African Americans
and 23 men living in West African countries such as Guinea-Bissau and Senegal. “It’s so distinctive, it really
sticks out like a sore thumb,” Jobling says of the chromosome’s unique sequence. He adds that it is virtually
impossible for this sequence to have coincidentally evolved in Britain.
The white British subject with the hgA1 variant, however, knew of no African family connection.
Father to son
To explore the mysterious origin of his Y chromosome, scientists recruited 18 other men who shared his rare
surname, which dates back to the first use of surnames, hundreds of years ago, and was first recorded in the
county of Yorkshire, in northern England. The researchers have not disclosed the surname to maintain the
men’s privacy.
The team hoped that this would help them pinpoint when the hgA1 variant had entered the lineage, since Y
chromosomes, like surnames, are passed from father to son.
Of the 18 men with the Yorkshire surname, six carried the hgA1 Y chromosome – including one
man in the US, whose ancestors had migrated from England in 1894.
Genealogical records linked these men to two family trees, both dating back to the 1780s in Yorkshire.
Jobling believes that these two genealogies are connected by a common male ancestor of West African
descent living in England at least 250 years ago.
Viking capture
The British men carry an hgA1 Y chromosome that closely matches the one identified in men presently
living in West Africa. This suggests that the former group’s black ancestor arrived in Britain within the past few
thousand years. Had their hgA1 Y chromosome been introduced many thousands of years earlier, when humans
first migrated from Africa to Europe, its sequence would have shown greater divergence from the one
currently found in West Africa.
The hgA1 Y chromosome could perhaps have entered the gene pool in northern England 1800 years ago
when Africans fought there as Roman soldiers, Jobling says. It also might have been introduced in the 9th
century, when Vikings brought captured North Africans to Britain, according to some historians.
But scientists note that the majority of black men with the hgA1 variant currently live in Guinea-Bissau and
nearby countries in West Africa. Because many slaves from this area came to Britain beginning in the mid-16th century, it is likely that the white men with the hgA1 variant have a black ancestor that arrived this way,
researchers say.
This ancestor could have been a first-generation immigrant African or one whose family had lived in Britain
for generations.
Famed writer
Jobling says his study provides the first evidence of a long-lived African presence in Britain. He adds that it
raises the possibility that relationships between black and white people were historically more
acceptable in Britain than some people might believe.
Vincent Brown of Harvard University in Cambridge, Massachusetts, US, agrees and points to the example of
Olaudah Equiano, a black man who bought his freedom in Britain in the mid-18th century and achieved fame
for his writing. Equiano claimed to be a slave from west Africa, though some argue that he had arrived from
colonial America. He lived in London and eventually married a white woman, notes Brown.
The new findings are unusual because they reveal the hidden African ancestry of white men, Jobling says.
He notes that it is much more common for studies to discover or confirm the reverse. For example, gene tests
gave strong evidence that the black descendants of the slave Sally Hemings could also trace their ancestry
to her "owner", the third US president, Thomas Jefferson (Nature, vol 396, p 27).
And several years ago, Jobling’s team found that more than a quarter of British African-Caribbean men
have a Y chromosome which traces back to Europe rather than Africa.
Journal reference: European Journal of Human Genetics (DOI: 10.1038/sj.ejhg.5201771)
Canadian researchers first to complete the human metabolome
Researchers at the University of Alberta, in Edmonton, Canada, have announced the completion of the first
draft of the human metabolome, the chemical equivalent of the human genome.
The metabolome is the complete complement of all small molecule chemicals (metabolites) found in or
produced by an organism. By analogy, if the genome represents the blueprint of life, the metabolome
represents the ingredients of life.
The scientists have catalogued and characterized 2,500 metabolites, 1,200 drugs and 3,500 food
components that can be found in the human body.
The research is published in the journal Nucleic Acids Research.
The researchers believe that the results of their work represent the starting point for a new era in
diagnosing and detecting diseases.
They believe that the Human Metabolome Project (HMP), which began in Canada in 2004, will have a more
immediate impact on medicine and medical practices than the Human Genome Project, because the
metabolome is far more sensitive to the body's health and physiology.
"Metabolites are the canaries of the genome," says Project Leader Dr. Wishart, professor of computing
science and biological sciences at the University of Alberta and Principal Investigator at NRC, National Institute
for Nanotechnology. "A single base change in our DNA can lead to a 100,000X change in metabolite levels."
This $7.5-million project, funded by Genome Canada through Genome Alberta, the Canada Foundation for
Innovation (CFI), the Alberta Ingenuity Centre for Machine Learning, and the University of Alberta, will have
far-reaching benefits for patient care.
"The results of this research will have a significant impact on the diagnosis, prediction, prevention and
monitoring of many genetic, infectious and environmental diseases," stated Dr. David Bailey, President and
CEO of Genome Alberta.
The metabolome is exquisitely sensitive to what a person eats, where they live, the time of day, the time of
year, their general health and even their mood. The HMP is aimed at allowing doctors to better diagnose and
treat diseases.
"Most medical tests today are based on measuring metabolites in blood or urine," Wishart says.
"Unfortunately, less than 1% of known metabolites are being used in routine clinical testing. If you can only
see 1% of what's going on in the body, you're obviously going to miss a lot."
By measuring or acquiring chemical, biological and disease association data on all known human
metabolites, the HMP Consortium, which consists of some 50 scientists based at the University of Alberta and
the University of Calgary, has spent the past two and a half years compiling the remaining 95% of all known
metabolites in the human metabolome. Detailed information about each of the 2500 metabolites identified so
far can be found on the Human Metabolome Database (HMDB) at http://www.hmdb.ca.
"With the data in the HMDB, anyone can find out what metabolites are associated with which diseases,
what the normal and abnormal concentrations are, where the metabolites are found or what genes are
associated with which metabolites," Wishart says.
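As a rough illustration of how reference data of this kind might be used in a test, the short Python sketch below flags measured metabolite concentrations that fall outside a normal range; every metabolite name, unit and range here is an invented placeholder, not a value from the HMDB.

# Sketch only: flag metabolite levels outside a reference range.
# All metabolite names, units and ranges are hypothetical placeholders.
REFERENCE_RANGES = {
    "metabolite_A": (0.5, 2.0),    # arbitrary units
    "metabolite_B": (10.0, 40.0),
    "metabolite_C": (0.1, 0.8),
}

def flag_abnormal(measurements):
    """Label each measured metabolite as 'low', 'normal' or 'high'."""
    labels = {}
    for name, value in measurements.items():
        low, high = REFERENCE_RANGES[name]
        labels[name] = "low" if value < low else "high" if value > high else "normal"
    return labels

sample = {"metabolite_A": 3.1, "metabolite_B": 22.0, "metabolite_C": 0.05}
print(flag_abnormal(sample))
# {'metabolite_A': 'high', 'metabolite_B': 'normal', 'metabolite_C': 'low'}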
"It's the first time that this sort of data has been compiled into one spot. By decoding the human
metabolome, we can identify and diagnose hundreds of diseases in a matter of seconds at a cost of pennies,"
Wishart added.
UCLA, Caltech chemists report important step toward building molecular computers
A team of UCLA and California Institute of Technology chemists reports in the Jan. 25 issue of the journal
Nature the successful demonstration of a large-scale, "ultra-dense" memory device that stores information
using reconfigurable molecular switches. This research represents an important step toward the creation of
molecular computers that are much smaller and could be more powerful than today’s silicon-based computers.
The 160-kilobit memory device uses interlocked molecules manufactured in the UCLA laboratory of J. Fraser
Stoddart, director of the California NanoSystems Institute (CNSI), who holds UCLA’s Fred Kavli Chair in
Nanosystems Sciences and who was awarded a knighthood by Queen Elizabeth II less than a month ago.
A bit, or binary digit, is the basic unit of information storage and communication in digital computing. A
kilobit is equal to 1,000 bits and is commonly used for measuring the amount of data that is transferred in one
second between two telecommunication points.
The research published in Nature describes the fabrication and operation of a memory device. The memory
is based on a series of perpendicular, crossing nanowires, similar to a tic-tac-toe board, with 400 bottom wires
and another 400 crossing top wires. Sitting at each crossing of the tic-tac-toe structure and serving as the
storage element are approximately 300 bistable rotaxane molecules. These molecules may be switched
between two different states, and each junction of a crossbar can be addressed individually by controlling the
voltages applied to the appropriate top and bottom crossing wires, forming a bit at each nanowire crossing.
The 160-kilobit molecular memory was fabricated at a density of 100,000,000,000 (10¹¹) bits per square
centimeter, "a density predicted for commercial memory devices in approximately 2020," Stoddart said.
A rotaxane is a molecule in which a dumbbell-shaped component, made up of a rod section and terminated
by two stoppers, is encircled by a ring. It has the potential to be a molecular abacus. The bistable rotaxanes
behave as switches by incorporating two different recognition sites for the ring, and the ring sits preferentially
at one of the two, said Stoddart, leader of the UCLA team. The molecule can act as a switch provided the ring
can be induced to move from one site to the other site and then reside there for many minutes. The bistable
rotaxane molecules used in the crossbar memory can be switched at very modest voltages from an "off" (low
conductivity) to an "on" (high conductivity) state. The stoppers for the rotaxane molecules are designed to
allow the molecules to be organized into single-molecule-thick layers, after which they are incorporated into
the memory device, Stoddart said.
"Fraser Stoddart is the ‘Maestro of Molecules,’ " said Patricia O'Brien, executive dean of UCLA’s College of
Letters and Science. "This is highly significant research."
"For this commercial dream to be realized, many fundamental challenges of nano-fabrication must be
solved first," Stoddart said. "The use of bistable molecules as the unit of information storage promises
scalability to this density and beyond. However, there remain many questions as to how these memory
devices will work over a prolonged period of time. This research is an initial step toward answering some of
those questions.
"Using molecular components for memory or computation or to replace other electronic components holds
tremendous promise," Stoddart said. "This research is the best example — indeed one of the only examples —
of building large molecular memory in a chip at an extremely high density, testing it and working in an
architecture that is practical, where it is obvious how information can be written and read.
"We have shown that if a wire is broken or misaligned, the unaffected bits still function effectively; thus,
this architecture is a great example of ‘defect tolerance,’ which is a fundamental issue in both nanoscience and
in solving problems of the semiconductor industry. This research is the culmination of a long-standing dream
that these bistable rotaxane molecules could be used for information storage," said Stoddart, whose areas of
expertise include nanoelectronics, mechanically interlocked molecules, molecular machines, molecular
nanotechnology, self-assembly processes and molecular recognition, among many other fields of chemistry.
"Our goal here was not to demonstrate a robust technology; the memory circuit we have reported on is
hardly that," said James R. Heath, Caltech’s Elizabeth W. Gilloon Professor of Chemistry and a co-author of the
Nature paper. "Instead, our goal was to demonstrate that large-scale, working electronic circuits could be
constructed at a density that is well-beyond — 10 to 15 years — where many of the most optimistic
projections say is possible."
Caltech chemists and chemical engineers, led by Heath, are the world leaders at making nanowires,
according to Stoddart. "Nobody can equal them in terms of the precision with which they carry this research
out," he said. The memory device’s top and bottom nanowires, each 16 nanometers wide, were fabricated
using a method developed by Heath’s group.
Stoddart’s research team is widely considered the world’s leader in making molecular switches, an area in
which Stoddart and his colleagues have conducted 25 years of research that has laid the foundation for this
current work. Stoddart’s group designs and manufactures intricate interlocked molecules in which the relative
motions of the interlocked components can be switched in controlled ways.
Stoddart and Heath are pioneers in molecular electronics — using nanoscale molecules as key components
in computers and other electronic devices. Stoddart harnesses molecular self-assembly in sophisticated ways,
designing molecules that can be made by "templation methods" and that orient themselves such that they can
be incorporated into solid-state devices in predictable ways.
A variety of molecular electronic components have been demonstrated, said Stoddart, lead authors
Jonathan E. Green and Jang Wook Choi of Heath’s Caltech laboratory, and Heath, who is a member of CNSI’s
scientific board. For example, logic gates, memory circuits, sensors and many other fundamental components
have been reported.
"However, few of these components have been demonstrated to work in practical, highly dense device
arrays before," Stoddart said.
"One of the most exciting features of this research is that it moves beyond the testing of molecular
electronic components in individual, non-scalable device formats and demonstrates a large, integrated array of
working molecular devices," said William R. Dichtel, a researcher who is a member of both Stoddart’s and
Heath’s research teams. "In targeting a large memory array, many fundamental issues of how information is
stored and retrieved had to be addressed."
While this research could affect the computer industry dramatically, it also may have a significant impact on
very different uses of information technologies as well, said Heath and Stoddart, whose research is funded
primarily by the Defense Advanced Research Projects Agency, the central research and development
organization for the U.S. Department of Defense, with additional funding by the National Science Foundation.
"Molecular switches will lead to other new technologies beyond molecular electronic computers." Stoddart
said. "It is too soon to say precisely which ones will be the first to benefit, but they could include areas such
as health care, alternative energy and homeland security.
"In 1959, physicist Richard Feynman said it should be possible some day to store all of the Encyclopedia
Britannica on the tip of a needle," Stoddart noted. "We’re not there yet, but we’re not far off."
The CNSI, a joint enterprise between UCLA and the University of California, Santa Barbara, is exploring the
power and potential of organizing and manipulating matter to engineer "new integrated and emergent
systems and devices, by starting down at the nanoscale level, that will aid and abet information technology,
energy production, storage and saving, environmental well-being, and the diagnosis, prevention and
treatment of chronic and degenerative diseases with an impact that far outstretches our comprehension of life
to date," Stoddart said.
Nanosystems-related research is performed on a size scale ranging from 1 nanometer — about one-billionth of a meter — to a few hundred nanometers. The DNA molecule is 2 nanometers wide, roughly 1,000
times smaller than a red blood cell and 10,000 times smaller than the diameter of a human hair.
Beyond nature vs. nurture: Williams syndrome across cultures
Nobody questions that the color of our eyes is encoded in our genes. When it comes to behavior,
the concept of "DNA as fate" quickly breaks down; it has long been accepted that both genes and the
environment shape human behavior. But just how much sway the environment holds over our genetic destiny
has been difficult to untangle.
Scientists at the Salk Institute for Biological Studies have found a clever way to sort one from the other:
They compared the social behavior of children with Williams syndrome — known for their innate drive to
interact with people — across cultures with differing social mores. Their study, published in a forthcoming
issue of Developmental Science, demonstrates the extent of culture's stamp on social behavior.
"Overall, a consistent result has emerged from our research," summarizes lead author Ursula Bellugi,
director of the Laboratory for Cognitive Neuroscience at the Salk. "Regardless of age, language or cultural
background, the Williams syndrome social phenotype is shaped by both genes and interactions with the
environment."
The current research is just one piece in a puzzle that a large collaboration of scientists under the umbrella
of a long-running Program Project from the National Institute of Child Health and Human Development has
been trying to piece together over the last decade. Led by Bellugi, the researchers are looking to Williams
syndrome to provide clues to some of the mysteries of the genetic basis of behavior. Much of the research
revolves around the work of molecular geneticist Julie R. Korenberg, a professor in the Department of
Pediatrics at UCLA and an adjunct professor at the Salk Institute, who has been studying the genetic basis of
Williams syndrome for the last decade.
Virtually everyone with Williams syndrome is missing exactly the same small set of genes from one copy of
chromosome 7, but some rare cases with different-sized deletions sparked the interest of
researchers. One unusually shy and introverted little girl retained at least one gene from the GTF2i family that
most people with the disorder have lost. This finding convinced Korenberg and her collaborators that this short
stretch of DNA may contain the gene (or genes) responsible for the hypersociability among children with
Willliams syndrome.
"Although a certain amount of variability exists with the Williams syndrome population, the clear genetic
basis presents an unusual opportunity to search for the genetic underpinnings of human social behavior and
social characteristics, such as trust and over-friendliness," explains Bellugi.
Identified more than 40 years ago, Williams syndrome occurs in an estimated one in 20,000 births
worldwide. It arises from a faulty recombination event during the development of sperm or egg cells. As a
result, almost invariably the same set of about 20 genes surrounding the gene for elastin is deleted from one
copy of chromosome seven, catapulting the carrier of the deletion into a world where people make much more
sense than objects do. Despite myriad health problems and a generally low IQ, children with Williams
syndrome are loquacious, sociable, and irresistibly drawn to strangers.
To determine the extent to which this behavioral profile is universal across cultures, the researchers settled
on two vastly differing environments: the United States and Japan, whose cultural differences are said to be
aptly summarized in two proverbs: In America, "The squeaky wheel gets the grease," while in Japan, "The nail
that stands out gets pounded down."
Using a questionnaire developed by Salk researchers, Bellugi and first author Carol Zitzer-Comfort, a
professor at California State University in Long Beach, asked parents in the U.S. and Japan to rate the
tendency of their child to approach others, their general behavior in social situations, their ability to
remember names and faces, their eagerness to please other people, their tendency to empathize with others'
emotional states, and the tendency for other people to approach their child.
Despite the differences in upbringing, in both countries children with Williams syndrome were rated
significantly higher in global sociability and their tendency to approach strangers than were their typically
developing counterparts. But cultural expectations clearly influenced social behavior, since the sociability of
normal American kids was on par with Japanese Williams syndrome kids, whose social behavior is considered
out of bounds in their native country.
Says Zitzer-Comfort: "It really is an intriguing illustration of the interaction between nature and nurture,"
but notes that there might be alternative explanations. Japanese parents, for one, rated their children
generally lower on the 7-point scale of the questionnaire. "Perhaps the stigma of having a 'different' child in
Japan affected the ways in which parents ranked their child's degree of sociability," speculates the scientist.
In an earlier study, published last year, Bellugi and her colleagues collected oral narratives from children
and adolescents with Williams syndrome in the U.S., France, and Italy and came to a similar conclusion. Not
only are Williams syndrome kids natural-born storytellers who hook their audiences with expressive and
affective narratives, but, no matter where they grew up, they significantly outdid their typically developing countrymen.
LA JOLLA, CA –
New research is first to explore regional differences in US serial killings
Study found in the most recent issue of Homicide Studies
Did you know that people living in the Western region of the United States are more likely to become
victims of a serial killer than people living in the Northeast? The February issue of Homicide Studies, published
by SAGE, is the first to explore research looking at the considerable interstate and regional differences in serial
killer activity.
The study, led by University of Connecticut Emeritus Sociology Professor James DeFronzo, examined male
serial killers in the United States from 1970 to 1992 using sociological perspectives long used to understand
other crimes.
The study found that social structural factors, such as the percentage of a state's urban population,
divorced residents, one-person households and unemployed residents, all helped to explain why some states
and regions are home to more male serial killers. The study also found that cultural factors, such as a high
ratio of executions to homicides and classification as a southern state, correlated with a higher rate of serial
killers.
"Experts traditionally have used psychiatric analyses to understand male serial killer activity, but that
approach has not been able to explain the considerable geographic differences that exist with serial killings,"
said DeFronzo, who led a team of researchers from UConn, Northeastern University, Villanova University and
Massey University. "This appears to be the first study to show that both cultural and social structural factors
play a role."
Caverns give up huge fossil haul
An astonishing collection of fossil animals from southern Australia is reported by scientists.
The creatures were found in limestone caves under Nullarbor Plain and date from about 400,000-800,000
years ago.
The palaeontological "treasure trove" includes 23 kangaroo species, eight of which are entirely new to
science.
Researchers tell Nature magazine that the caves also yielded a complete specimen of Thylacoleo carnifex, an extinct marsupial lion.
'Took my breath away'
It appears the unsuspecting creatures fell to their deaths through pipes in
the dusty plain surface that periodically opened and closed over millennia.
Most of the animals were killed instantly but others initially survived the
20m drop only to crawl off into rock piles to die from their injuries or from
thirst and starvation.
The preservation of many of the specimens was remarkable, said the Nature
paper's lead author, Dr Gavin Prideaux.
"To drop down into these caves and see the Thylacoleo lying there just as it
had died really took my breath away," the Western Australian Museum
researcher told the BBC's Science In Action Programme.
"Sitting in the darkness next to this skeleton, you really got the sense of the
animal collapsing in a heap and taking its last breath. It was quite poignant.
"Everywhere we looked around the boulder piles, we found more and more
skeletons of a very wide array of creatures."
In total, 69 vertebrate species have been identified in three chambers the scientists now call the Thylacoleo Caves. These include mammals, birds and reptiles. The kangaroos range from rat-sized animals to 3m (nearly 10ft) giants. The caves and their contents were first discovered in 2002.
The team even found an unusual wallaby with large brow ridges.
"When we first glanced at the animal, we thought they were horns; but on closer inspection we realised
they must have performed some sort of protective function," Dr Prideaux explained.
"The beast must have been sticking its head into spiny bushes and browsing on leaves."
The 'Ancient Dry'
The scientists' investigations indicate the ancient Nullarbor environment was very similar to that of today: an arid landscape that received little more than 200mm of rainfall a year.
What has changed significantly is the vegetation. Whereas the Thylacoleo Caves' animals would have seen
trees on the plain, the modern landscape is covered in a fire-resistant chenopod shrub.
This observation goes to the heart of a key debate in Australian palaeontology, the team believes.
The continent was once home to a remarkable and distinctive collection of giant beasts.
These megafauna, as researchers like to call them, included an immense wombat-like animal (Diprotodon optatum) and a 400kg lizard (Megalania prisca).
But all - including the marsupial lion - had disappeared by the end
of the Pleistocene Epoch (11,500 years ago).
Some scientists think the significant driver behind these extinctions
was climate change - large shifts in temperature and precipitation.
But Dr Prideaux and colleagues argue the Thylacoleo Caves'
animals give the lie to this explanation because they were already
living in an extremely testing environment.
"Because these animals were so well adapted to dry conditions, to
say that climate knocked them out just isn't adequate. These animals
survived the very worst nature could throw at them, and they came
through it," co-author Professor Bert Roberts told BBC News.
"If you look at the last four or five glacial cycles, where the ice
ages come and go, the animals certainly suffered but they didn't go
extinct - they suffered but survived," the University of Wollongong
scientist said.
This assessment would be consistent with the other favoured
extinction theory - extermination by humans, either directly by
hunting or indirectly by changing the landscape through burning.
The discovery of the complete Thylacoleo skeleton attracted international media interest when it was
first announced in 2002.
Railway construction unearths ancient artifacts in Germany
By Colin Nickerson, Globe Staff | January 21, 2007
COLOGNE, Germany -- Genialinius Gennatus was one fine duck hunter.
In the third century, he recorded his prowess in high Latin on a stone tablet that he dedicated to Jupiter.
That and a hefty donation probably ensured that the tablet won display in the temple to the Roman god in the
settlement then called Colonia.
Five or six centuries later, Cologne's early Christians, perhaps offended by the tablet dedicated to a pagan god, chucked it into the silting channel between the Rhine river port and a small island in the Rhine,
unknowingly ensuring the hunter's immortality.
Historians now know the ordinary man named Gennatus hunted ducks and prayed to Jupiter because of
Cologne's decision to punch 2 1/2 miles of new north-south light railway tunnel through the silt and sediment
that lie beneath one of Germany's oldest cities.
"It would not have seemed valuable to anyone at the time," said Bernhard Irmler, one of scores of
researchers mucking through damp tunnels beneath Cologne in Europe's largest ongoing archeological dig.
"But for us it's another small window into a long-ago time."
The $1 billion cost of the rail project includes $194 million for 100 archeologists to dig, sift, and probe the
depths in front of the giant boring machines and other equipment that will chew out the subway tubes.
And what a fine mess archeologists and diggers alike are making. Great swaths of downtown Cologne are
cordoned off for the scientific sleuths working against construction deadlines -- the dig started two years ago
and subway trains are supposed to be zipping from Breslauer Platz to Market Strasse in 2010.
In a sense, that's lightning speed by local standards: the landmark Cologne Cathedral was more than 630
years in the making, from conception in 1248 to consecration in 1880.
"Modern Germans are a bit more impatient," Irmler, an associate archeologist with Cologne's famed
Romano-Germanic Museum, said in an interview at the site. "Almost the instant the archeologists finish
[searching] a section, the construction crews are right behind us."
More than 10,000 artifacts have been unearthed from the site, from the duck hunter's tribute to Jupiter to
lumpy-looking rejects from a 15th-century pottery maker.
"The finer flagons would have been exported to centers across Europe," Irmler said. "The ones we are
finding are flawed vessels that were probably sold cheap locally for use as chamber pots."
At an average depth of 50 feet below the surface, the tunnel is too deep to disturb the ruins and relics
enshrined beneath Cologne, even though the line will pass directly under the center of the old city.
But the eight entrances to planned subway stops will plunge through more than 20 centuries of history.
That's why Germany sent in the archeological brigades first -- to ferret out what can be saved and to
record what will be destroyed.
Not everything in the path of the construction can be preserved.
For example, the rediscovered foundations of all-but-forgotten St. Katherine's Church -- a medieval house
of worship that according to legend was coated with solid gold -- will be blasted away to make room for the
Severin Strasse station.
But fragments from the church's ornate columns have been rescued.
"Look closely, and you can see flecks of gold," Irmler said.
None of the discoveries so far will set archeology on its ear. Cologne's history is well recorded. But
researchers are excited to find substantial remnants of the ancient Roman harbor wall, constructed of thick
oak timbers almost perfectly preserved through the centuries.
The harbor lay in a long-lost channel between Cologne and the small island in the Rhine, both covered for
more than 1,000 years by the expanding city.
"Every day we find something that may not change history, but helps us better understand the past of this
specific place," said Irmler .
Among the other yields: amphorae, two-handled jars with a narrow neck used by the Romans to carry wine
or oil, strewn everywhere.
Shells from oysters carried from Normandy as a delicacy. Burial urns. Old fortifications and sewage systems.
Hair combs made of wood. Intriguing scraps from a workshop where crystal minerals from distant mountains
were carved into religious displays for the cathedral.
"We find the stories of the city written in debris," said the archeologist, noting that scientific crews will
scrutinize more than 20,000 cubic meters, about 706,000 cubic feet, of excavated material. "Every scoop is a
new page from the past."
Ancient Iraqi Art Determined Poisonous
Jennifer Viegas, Discovery News
Jan. 22, 2007 ― Some ninth-century Iraqi artists may have literally died for their art, suggests new analysis of
Iraqi stucco fragments from this period. A fragment, taken from the ancient palace-city of Samarra, contains
three arsenic-based pigments that are known to be poisonous and may cause cancer upon exposure.
Although the findings will not be published until May in the Journal of Archaeological Science, curators at
London's Victoria and Albert Museum, where the fragments are housed, have already taken special handling
precautions.
"The fragments are stored in a locked cabinet and only handled as little as
possible by curators in the Museum’s Middle Eastern section who wear nitrile
(special sturdy rubber) gloves," Mariam Rosser-Owen, curator of the Middle East
collections at the museum, told Discovery News.
Lucia Burgio, a conservation scientist at the museum, added that researchers
also might wear face masks and work in a "fume cupboard." If the object should
go on display, it would be placed in a special case "to avoid any accidental
contamination of members of the public."
For the study, Burgio, Rosser-Owen and colleague Robin Clark used a noninvasive, high-tech process called Raman microscopy, which scanned a grid
pattern over the surface of the fragments to construct maps of chemical information. These maps revealed
that an otherwise innocuous-looking stucco fragment of colorful stripes contained the toxic pigments orpiment,
pararealgar, and another related substance.
These orange-yellow minerals are toxic arsenic sulphides. Orpiment was even once used to coat the tips of
poison arrows.
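The grid-scan-to-map workflow lends itself to a simple illustration. The Python sketch below is purely hypothetical: the reference band positions, matching tolerance, and function names are placeholder assumptions for illustration only, not values or software from the Victoria and Albert Museum study. It shows the general idea of assigning a pigment label to each grid point by comparing measured Raman peaks against reference signatures.

# Illustrative sketch only: placeholder reference bands and a toy matching rule,
# not the actual data or analysis from the Samarra fragment study.
REFERENCE_BANDS = {                 # hypothetical characteristic peaks, in cm^-1
    "orpiment": [310.0, 355.0],
    "pararealgar": [230.0, 340.0],
}

def identify_pigment(spectrum_peaks, tolerance=10.0):
    # Return the reference pigment whose bands best match the measured peaks.
    best, best_hits = "unassigned", 0
    for pigment, bands in REFERENCE_BANDS.items():
        hits = sum(any(abs(p - b) <= tolerance for p in spectrum_peaks) for b in bands)
        if hits > best_hits:
            best, best_hits = pigment, hits
    return best

def build_map(grid_of_peak_lists):
    # Turn a 2-D grid of per-point peak lists into a map of pigment labels.
    return [[identify_pigment(peaks) for peaks in row] for row in grid_of_peak_lists]

# A 1 x 2 "scan": the first point shows orpiment-like peaks, the second nothing known.
print(build_map([[[309.0, 356.0], [120.0]]]))   # [['orpiment', 'unassigned']]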
The ancient Iraqis, however, probably did not realize the minerals were poisonous, although some artists
may have died for their craft.
"People died young until a couple of centuries ago, and I guess other illnesses were causing artists to die
before they got poisoned to death by the materials they were using," explained Burgio. "What happened to
their apprentices, who ground and prepared the pigments on a routine basis, I don’t know."
The fragments were once colorful wall paintings on a fine gypsum surface that decorated mosques and
palaces at Samarra, which is just over 77 miles away from Baghdad.
Construction of this massive, ancient city created "an early golden age for architectural decoration,"
according to the researchers. While small, the fragments show beautifully rendered decorations based on
plant forms, animals and courtly activities, such as people enjoying wine and dancing. The style is uniquely
Arabic, but was possibly influenced by Central Asian artwork.
Clark, a professor in the Christopher Ingold Laboratories at University College London, said the toxic
pigments were also "well known in Western Europe." Shades of green, including emerald green, are also
sometimes poisonous elements of certain early European art, due to the presence of arsenic-containing
copper arsenite and copper acetoarsenite.
Alastair Northedge, professor of art and Islamic archaeology at the University of Paris, is one of the world's
leading western experts on Samarra. He recently authored the book, "Historical Topography of Samarra."
Northedge told Discovery News that he is "sure the conclusions are correct" in the recent study.
"It was interesting to see the painters were poisoning themselves with arsenic," he said.
Toxins aside, the remains of Samarra, once a capital of the Abbasid Caliphate in Iraq, comprise a site of major archaeological importance.
"The Abbasid Caliphate was one of the high points of world civilization," said Northedge, "but it has been
more or less inaccessible because of Saddam, and now the war."
A new international project, www.samarrafinds.info, has been set up to better understand the site and
what its art and architecture would have looked like during its golden age.
Some brain-damaged patients quit smoking with ease, researchers report in Science
A silver dollar-sized region deep in the brain called the insula is intimately involved in smoking addiction,
and damage to this structure can completely erase the body's urge to smoke, researchers have discovered.
The findings appear in the 26 January 2007 issue of the journal Science, published by AAAS, the nonprofit
science society.
Obviously brain damage is not a treatment option for nicotine addiction, but the new results may offer
leads for therapies to help smokers kick the habit or for monitoring smokers' progress while using existing
therapies.
The study was largely inspired by a patient who had smoked around 40 cigarettes a day before his insula
was damaged by a stroke and then quit immediately after. He told the researchers that his body "forgot the
urge to smoke."
The insula receives information from other parts of the body and is thought to help translate those signals
into something we subjectively feel, such as hunger, pain, or craving for a drug. Compared to other brain
regions, the insula has not attracted very much attention in drug addiction research until now, but some
imaging studies have shown that this region is activated by drug-associated cues, such as the sight of people
doing drugs or drug paraphernalia.
"One of the most difficult problems in any form of addiction is the difficulty in stopping the urge to smoke,
to take a drug, or to eat for that matter. Now we have identified a brain target for further research into
dealing with that urge," said study author Antoine Bechara of the University of Southern California and the
University of Iowa.
"This kind of study is quite forward-looking. In addition to investigating a basic scientific mechanism
underlying drug addiction, these authors have come up with innovative ideas about how we may be able to
treat addiction and prevent relapse," said Science senior editor Peter Stern.
Though intriguing, the possibility of insula-targeting drugs that might help smokers quit is still a long way
off. More immediately, it may be possible to monitor the success of current smoking cessation therapies by
measuring the activity within this brain region.
Bechara and his colleagues are affiliated with a large patient registry at the University of Iowa that allows
researchers to study the effects of brain damage. To investigate whether the insula plays a major role in
smoking addiction, the authors studied some of the patients enrolled in the registry, all of whom had been
smoking more than five cigarettes per day for more than two years when their brain damage occurred.
Bechara and his colleagues studied 69 patients with brain damage who had been smokers before the
damage occurred. Nineteen of these patients had brain damage that included the insula.
Thirteen of the insula-damaged patients had quit smoking, and 12 of them had done so quickly and easily,
reporting that they had felt no urges to smoke since quitting. The authors don't know why the other six
patients did not quit smoking.
Some of the patients with other forms of brain damage also stopped smoking without effort, but, overall,
patients who had quit easily were much more likely to have damage to the insula rather than anywhere else in
the brain.
At the time of the study, the patients had quit smoking for at least one year.
Because the patients reported losing the urge to smoke so suddenly and without difficulty or relapse,
Bechara and his colleagues concluded that insula damage reduced the patients' actual urge to smoke rather
than reducing the pleasurable experience, or "reward," associated with smoking. Bechara says these findings
don't contradict the importance of the reward system in addiction; rather, they add another piece to the
picture.
The authors were also curious about whether insula damage disrupts other behaviors. They couldn't study
other forms of drug addiction, since patients with these addictions weren't allowed to enroll in the registry.
In a follow-up survey, the researchers found that insula damage didn't seem to affect patients' desire to eat
or their food intake. Because eating is so vital for survival, multiple brain regions may produce the urge to eat,
according to Bechara.
He noted that another possible approach to treating smoking addiction might be to use a technique called
transcranial magnetic stimulation, which involves inducing weak electrical currents in the brain tissue, to
disrupt the insula's activity. (Currently this technique doesn't penetrate deep enough to reach the insula,
however.)
"The insula also carries out lots of normal everyday functions so we would want to make sure we only
interfere with functions that disrupt bad habits like smoking but not something vital like eating," cautioned
Bechara.
Researchers propose reason for severe side-effects of Northwick Park clinical trial
A possible reason why the Northwick Park clinical trial of the drug TGN1412 in the UK caused multiple
organ failure in human volunteers is revealed in research presented today at a conference near Paris.
The research shows that stimulating the molecule CD28 on cells that mediate the immune response, known
as T cells, can have an adverse effect if these immune cells have been activated and altered by infection or
illness in the past.
The scientists found that when they artificially stimulated CD28 on these previously activated 'memory' T
cells, this caused the cells to migrate from the blood stream into organs where there was no infection, causing
significant tissue damage. CD28 is an important molecule for activating T cell responses and the TGN1412
drug tested on the human volunteers strongly activates CD28.
Around 50% of adult human T cells are memory cells, having been activated by infections and illnesses
during the course of a person's life. However, animal models, such as those used to test TGN1412 before tests
were carried out on humans, do not have many memory T cells because they are deliberately kept in a sterile
environment where they are shielded from infections.
The research, by scientists from Imperial College London, King's College London, and the Babraham
Institute, is presented today at the Club de la Transplantation conference in Cernay la Ville, near Paris.
Dr Federica Marelli-Berg, lead author of the research from the Department of Immunology at Imperial
College London, explained: "The drug TGN1412 appeared to be relatively safe when it was tested in animal
models. However, when the drug was tested on human volunteers, some experienced very severe side-effects.
"Our research suggests that this is because the human subjects' memory T-cells lost their sense of
direction and started migrating into several areas of the body where they were not supposed to go, and
caused damage."
The researchers reached their conclusions after memory T cells in which CD28 had been previously
stimulated were injected into healthy mice. These cells immediately migrated from the blood into many organs
including the kidney, the heart and the gut, where they are not normally found unless there is an infection.
TGN1412 was developed to treat leukaemia and chronic inflammatory conditions such as rheumatoid arthritis and multiple sclerosis, in which the body's immune system attacks itself. It was thought that by targeting CD28, the drug could over-stimulate the rogue T cells, making them burn out and die.
Stem cells cultured from human bone marrow behave like those derived from brain
tissue
LOS ANGELES (Jan. 25, 2007) – Stem cells taken from adult human bone marrow have been manipulated by
scientists at the Maxine Dunitz Neurosurgical Institute at Cedars-Sinai Medical Center to generate aggregates
of cells called spheres that are similar to those derived from neural stem cells of the brain.
In addition, the bone marrow-derived stem cells, which could be differentiated into neurons and other cells
making up the central nervous system, spread far and wide and behaved like neural stem cells when
transplanted into the brain tissue of chicken embryos.
Results of the experiments, described in the February 2007 issue of the Journal of Neuroscience Research,
support the concept of using bone marrow-derived stem cells to create therapies to treat brain tumors,
strokes and neurodegenerative diseases. A similar study using bone marrow-derived stem cells of rats
appeared as the cover article of the December 2002 issue of Experimental Neurology.
"These findings reinforce the data that came from our study of rat bone marrow-derived stem cells," said
John S. Yu, M.D., neurosurgeon, co-director of the Comprehensive Brain Tumor Program, and senior author of
both articles. "Using two methods, we show evidence for the bone marrow-derived stem cells being neural
cells, and we demonstrate that it is feasible to grow the cells in large numbers. We also document that these
cells function electrophysiologically as neurons, using similar voltage-regulating mechanisms."
Progressing from the rat study to experiments with human cells and transplantation into mammal brain
tissue, the research team continues to build a foundation for translating laboratory research into human
clinical trials.
"Based on our studies to date, a patient's own bone marrow appears to offer a viable and renewable source
of neural stem cells, allowing us to avoid many of the issues related to other types of stem cells," said Keith L.
Black, M.D., director of the Maxine Dunitz Neurosurgical Institute and chairman of Cedars-Sinai's Department
of Neurosurgery.
The replacement of damaged brain cells with healthy cells cultured from stem cells is considered to
potentially be a promising therapy for the treatment of stroke, neurodegenerative disorders and even brain
tumors, but finding a reliable source for generating neural cells for transplantation has been a challenge. The
use of embryonic and fetal tissue has raised ethical questions among some, and brings with it the possibility of
immune rejection. And while neural stem cells can be taken from brain tissue, the removal of healthy tissue
from a patient's brain introduces a new set of safety, practicality and ethical issues.
In their recent work, the Cedars-Sinai researchers documented that several genes that speed up and
control the proliferation process could be used to rapidly expand the supply of marrow-derived neural stem
cells, writing in the article that "this novel method of expansion … may prove to be useful in the design of
novel therapeutics for the treatment of brain disorders, including tumors."
Role of anesthetics in Alzheimer's disease: Molecular details revealed
Inhaled anesthetics commonly used in surgery are more likely to cause the aggregation of
Alzheimer's disease-related plaques in the brain than intravenous anesthetics, say University of Pittsburgh School of Medicine researchers in a journal article published in the Jan. 23 issue of Biochemistry. This is the first report to use a state-of-the-art nuclear magnetic resonance (NMR) spectroscopic technique to explain the
detailed molecular mechanism behind the aggregation of amyloid β (Aβ) peptide due to various anesthetics.
Aβ plaques are found in the brains of people with Alzheimer's disease. Many believe that the uncontrolled
clumping of Aβ is the cause of Alzheimer's disease and that the similar aggregation of peptides and proteins plays a role in the development of other neurodegenerative diseases such as Parkinson's disease.
"Many people know of or have heard of an elderly person who went into surgery where they received
anesthesia and when they woke up they had noticeable memory loss or cognitive dysfunction," said Pravat K.
Mandal, Ph.D., assistant professor of psychiatry, University of Pittsburgh School of Medicine and lead author of
the study. Previous studies by the Pittsburgh researchers found that the inhaled anesthetics halothane and
isoflurane and the intravenous anesthetic propofol encouraged the growth and clumping of Aβ in a test tube
experiment.
"Our prior research had shown in molecular models that anesthetics may play a role by causing amyloid
peptides to clump together—something that is thought to signal the advancement of Alzheimer's disease. In
this study, we set out to see why this was happening and to determine if any one form of anesthesia might be
a safer option than another," said Dr. Mandal.
In this study the researchers used NMR spectroscopy to determine how the inhaled anesthetics halothane
and isoflurane and the intravenous anesthetics propofol and thiopental interact with Aβ, influencing its aggregation into forms commonly found in the brains of people with Alzheimer's disease. The results were strikingly different between the inhaled and injected anesthetics. The inhaled halothane and isoflurane had the most potent interaction with Aβ peptides, causing the highest levels of Aβ aggregation. The injected anesthetic
propofol only interacted and caused aggregation at high concentrations—interaction was not evident at lower
concentrations. The intravenous thiopental did not cause the clustering of Aβ peptides even at high
concentrations. Additionally, the molecular details for the interaction of these anesthetics with Aβ peptide were
revealed.
Dr. Mandal noted that if the same thing occurs in humans, anesthetics could lead to more amyloid plaques
which may lead to earlier memory problems, warranting further studies of anesthetics with Aβ both in
laboratory and clinical settings.
PITTSBURGH, Jan. 25 –
MRI contrast agent linked to rare disease
OAK BROOK, Ill. (January 25, 2007) – New research has shown a possible association between a popular magnetic
resonance imaging (MRI) contrast agent and the incidence of a rare disease called nephrogenic systemic
fibrosis (NSF) in patients with kidney disease, according to an editorial appearing in the March issue of
Radiology.
"We recommend avoiding the use of gadodiamide in patients with any degree of renal disease," said Phillip
H. Kuo, M.D., Ph.D., assistant clinical professor of diagnostic radiology at Yale University School of Medicine in
New Haven, Conn. "At this point, the data clearly show the vast majority of NSF cases are associated with the
use of gadodiamide."
NSF, an emerging systemic disorder characterized by widespread tissue fibrosis, has been diagnosed in
patients who were previously administered gadodiamide (Omniscan) and other gadolinium-based MRI contrast
agents. While the precise cause of NSF is unknown, the disorder has only been observed in patients with
kidney disease, especially those requiring dialysis.
"So far, NSF has only been reported in patients with renal failure," Dr. Kuo said. "Gadolinium contrast
agents do not appear to cause NSF in patients with normal kidney function."
Patients with NSF experience an increase of collagen in the tissues, causing thickening and hardening of
the skin of the extremities and often resulting in immobility and tightening or deformity of the joints. NSF can
develop rapidly and may result in patients becoming wheelchair-bound within just a few weeks. In some cases,
there is involvement of other tissues, including the lungs, heart, diaphragm, esophagus and skeletal muscle.
No consistently effective therapy exists.
Approximately 400 cases of NSF have been reported worldwide. While gadolinium-based agents have not
been definitively shown to cause NSF, as many as 90 percent of known NSF patients had previously received
gadodiamide, and a recent survey of approximately 100 NSF patients revealed that more than 95 percent
were exposed to a gadolinium agent within two to three months prior to disease onset. Other evidence linking
gadolinium with NSF includes residual gadolinium in a skin biopsy of an NSF patient 11 months after the
contrast agent was administered.
Studies investigating the relationship between NSF and gadolinium are currently underway at Yale, as well
as the Centers for Disease Control, U.S. Food and Drug Administration (FDA) and the medical regulatory
agencies of the European Union. In the meantime, the FDA advises caution in the use of all gadolinium-based
contrast agents in patients with moderate to advanced renal disease.
"While I appreciate the conservative approach of the FDA," Dr. Kuo said, "my colleagues and I are
concerned that expanding the warning to millions of patients with only moderate renal disease might have a
negative impact on patient care."
Dr. Kuo noted that only three percent of patients with renal failure who are given gadolinium agents will
develop NSF, and that an overwhelming majority of the reported cases of NSF are tied specifically to
gadodiamide. "That leaves a large percentage of patients who can gain the benefits of a contrast-enhanced
scan without developing NSF," he said.
Dr. Kuo and colleagues recommend not using gadodiamide in patients with kidney disease, but he pointed
out that there are circumstances where the benefits of other gadolinium-based agents outweigh the risks.
"MRI with contrast is simply the best exam in many situations," Dr. Kuo said. "One has to wonder if
excluding large numbers of patients with moderate renal failure from the best exam would do more harm than
good."
The editorial is published online at http://radiology.rsnajnls.org/cgi/content/full/2423061640v1.
No one strategy is best for teaching reading, FSU professor shows
TALLAHASSEE, Fla. -- For decades, a debate has simmered in the educational community over the best way to
teach children how to read. Proponents of phonics, the "whole language and meaning" approach and other
teaching methods long have battled for dominance, each insisting that theirs is the superior strategy.
Now, a Florida State University researcher has entered the fray with a paper in the prestigious journal
Science that says there is no one "best" method for teaching children to read.
Carol M. Connor is an assistant professor in the FSU College of Education and a researcher with the Florida
Center for Reading Research. Along with colleagues from FSU and the University of Michigan, she wrote
"Algorithm-Guided Individualized Reading Instruction," published in Science’s Jan. 26 issue. (The magazine is
available online to subscribers at www.sciencemag.org.) Connor’s paper shows that lots of individualized
instruction, combined with the use of diagnostic tools that help teachers match each child with the amounts
and types of reading instruction that are most effective for him or her, is vastly preferable to the standard
"one size fits all" approach to reading education that is prevalent in many American elementary schools.
"There is too much of a tendency in education to go with what ‘sounds’ really good," Connor said of various
educational trends that come into and fall out of fashion. "What we haven’t done very well is conduct
comprehensive field trials and perform the rigorous research that are the norm in other fields of science. With
this study, we sought to do just that — to take a systematic approach to what works, what doesn’t, and why"
when teaching students to read.
The researchers found that "the efficacy of any particular instructional practice may depend on the skill
level of the student. Instructional strategies that help one student may be ineffective when applied to another
student with different skills." The trick, then, is to more precisely determine the reading skill level of each child
and then find a way to tailor the curriculum to each student's individual needs.
"Instead of viewing the class as an organism, we’re trying to get teachers to view the students as
individuals," Connor said.
While that may sound daunting to the typical first- or second-grade teacher, Connor has turned to
technology to offer a helping hand. She, Frederick J. Morrison and Barry Fishman, professors at the University
of Michigan, have developed "Assessment to Instruction," or A2i, a Web-based software program. A2i uses
students’ vocabulary and reading scores and their desired reading outcome (i.e. their grade level by the end of
first grade) to create algorithms that compute the recommended amounts and types of reading instruction for
each child in the classroom. The software then groups students based on learning goals and allows teachers
to regularly monitor their progress and make changes to individual curricula as needed.
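The paper describes what A2i computes but not its internal formulas, so the Python sketch below is a hypothetical illustration only: the Student fields, the rule mapping a score gap to minutes of code-focused versus meaning-focused instruction, and the grouping step are all assumptions chosen to mirror the description above, not the published A2i algorithms.

# Hypothetical sketch of score-driven instruction recommendations (not A2i itself).
from dataclasses import dataclass

@dataclass
class Student:
    name: str
    vocab_score: float      # standardised vocabulary score (assumed scale)
    reading_score: float    # standardised word-reading score (assumed scale)
    target_score: float     # desired outcome, e.g. a grade-level benchmark

def recommend_minutes(s: Student, total_minutes: int = 60) -> dict:
    # Assumed rule of thumb: the further a child's scores fall below the target
    # outcome, the more teacher-managed, code-focused time is recommended.
    current = (s.vocab_score + s.reading_score) / 2.0
    gap = max(0.0, s.target_score - current)
    code_focused = min(total_minutes, int(round(10 + 4 * gap)))
    return {"code_focused": code_focused,
            "meaning_focused": total_minutes - code_focused}

def group_students(students, n_groups=3):
    # Group children whose recommendations are similar, so a teacher can work
    # with each small group on the same kind and amount of instruction.
    ranked = sorted(students, key=lambda s: recommend_minutes(s)["code_focused"])
    size = max(1, len(ranked) // n_groups)
    return [ranked[i:i + size] for i in range(0, len(ranked), size)]

students = [Student("A", 95, 90, 100), Student("B", 80, 75, 100)]
for s in students:
    print(s.name, recommend_minutes(s))
print([[s.name for s in g] for g in group_students(students, n_groups=2)])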
A2i currently is being tested by about 60 elementary-school teachers in one Florida county. However, "right
now A2i is just a research tool," Connor said. "Hopefully we’ll be able to make it available more widely as time
goes on."
By Barry Ray Jan. 25, 2007
Go to http://know.umich.edu/A2i/login.asp to view an A2i demonstration page. Use the username "a2idemo" and the
password "isi06!".
In addition to Connor, Morrison and Fishman, other co-authors of the Science paper were Associate Professor Christopher
Schatschneider of FSU’s department of psychology and Phyllis Underwood, a doctoral student in the FSU College of
Education.
New approach could lower antibiotic requirements by 50 times
Antibiotic doses could be reduced by up to 50 times using a new approach based on bacteriophages.
Steven Hagens, previously at the University of Vienna, told Chemistry & Industry, the magazine of the SCI,
that certain bacteriophages, a type of virus that infects bacteria, can boost the effectiveness of the antibiotics gentamicin, gramicidin or tetracycline.
It is the phages' ability to channel through bacterial cell membranes that boosts antibiotic effectiveness.
'Pseudomonas bacteria for example are particularly multi-resistant to antibiotics because they have efflux
pump mechanisms that enable them to throw out antibiotics. A pore in the cell wall would obviously cancel the
efflux effect,' Hagens explains.
Pseudomonas bacteria cause pneumonia and are a common cause of hospital-acquired infections.
Experiments in mice revealed that 75% of those infected with a lethal dose of Pseudomonas survived if the
antibiotic gentamicin was administered in the presence of bacteriophages. None survived without the phages
(Microb. Drug Resist., 2006, 12 (3), 164).
The bacteriophage approach would also be particularly useful for treating cases of food poisoning, because
the lower doses of antibiotic needed would not disrupt the friendly bacteria in the gut - a big problem with
conventional antibiotic treatments.
'The prospect of using such treatments to prolong the life of existing agents and delay the onset of
widespread resistance is to be welcomed,' said Jim Spencer, a lecturer in microbial pathogenesis at the
University of Bristol.
The overuse of antibiotics since the 1940s has slowly created a host of infections that are resistant to antibiotics. MRSA (methicillin-resistant Staphylococcus aureus), for example, is rapidly spreading through
hospitals, affecting more than 8,000 people in the UK every year. MRSA infection can lead to septic shock and
death. Chemistry & Industry http://www.chemind.org
100 percent juices found as beneficial to health as fruits and vegetables
When it comes to some of today’s health issues, 100 percent fruit and vegetable juices do help reduce risk
factors related to certain diseases.
This conclusion is the result of a European study designed to question traditional thinking that 100 percent
juices play a less significant role in reducing risk for both cancer and cardiovascular disease than whole fruits
and vegetables.
Juices are comparable to their whole fruit and vegetable counterparts in their ability to reduce risk, say several researchers in the United Kingdom who conducted the literature review. The researchers analyzed a variety of studies that looked at risk reduction attributed to the effects of both fiber and antioxidants. As a result, they determined that the positive impact fruits and vegetables offer comes not just from the fiber but also from antioxidants, which are present in both the juice and the whole fruit or vegetable.
This 2006 review of the literature states, “When considering cancer and coronary heart disease prevention,
there is no evidence that pure fruit and vegetable juices are less beneficial than whole fruit and vegetables.”
The researchers add that the positioning of juices as being nutritionally inferior to whole fruits and vegetables
in relationship to chronic disease development is “unjustified” and that policies which suggest otherwise about
fruit and vegetable juices should be re-examined.
The researchers who authored the paper “Can pure fruit and vegetable juices protect against cancer and
cardiovascular disease, too? A review of the evidence" suggest that more studies in certain areas are needed to
bolster their findings. The study was published in the International Journal of Food Science and Nutrition
(2006).
“Although this independent review of the literature is not designed to focus on any particular 100 percent
juice, it does go a long way in demonstrating that fruit and vegetable juices do play an important role in
reducing the risk of various diseases, especially cancer and cardiovascular heart disease,” says Sue Taylor, RD,
with the Juice Products Association, a non-profit organization not associated with this research. She adds that
appropriate amounts of juices should be included in the diet of both children and adults, following guidelines
established by leading health authorities.
Taylor also points to a large epidemiological study, published in the September 2006 issue of the American Journal of Medicine, which found that consumption of a variety of 100 percent fruit and vegetable juices was associated
with a reduced risk for Alzheimer’s disease. In fact, that study found that individuals who drank three or more
servings of fruit and vegetable juices per week had a 76 percent lower risk of developing Alzheimer’s disease
than those who drank juice less than once per week.
Hints of huge water reservoirs on Mars
* 19:00 25 January 2007
* NewScientist.com news service
* David Shiga
Mars is losing little water to space, according to new research, so much of its ancient abundance may still
be hidden beneath the surface.
Dried-up riverbeds and other evidence imply that Mars once had enough water to fill a global ocean more
than 600 metres deep, together with a thick atmosphere of carbon dioxide that kept the planet warm enough
for the water to be liquid. But the planet is now very dry and has a thin atmosphere.
Some scientists have proposed that the Red Planet lost its water and CO2 to space as the solar wind
stripped molecules from the top of the planet's atmosphere. Measurements by Russia's Phobos-2 probe to
Mars in 1989 hinted that the loss was quite rapid.
Now the European Space Agency's Mars Express spacecraft has revealed that the rate of loss is much lower.
Stas Barabash of the Swedish Institute of Space Physics in Kiruna led a team that used data from Mars
Express's ASPERA-3 instrument (Analyzer of Space Plasmas and Energetic Atoms).
Its measurements suggest the whole planet loses only about 20 grams per second of oxygen and CO2 to
space, only about 1% of the rate inferred from Phobos-2 data.
If this rate has held steady over Mars's history, it would have removed just a few centimetres of water, and
a thousandth of the original CO2.
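A rough back-of-the-envelope check shows why 20 grams per second is so small. The Python snippet below assumes an escape duration of about 3.5 billion years and, as an upper bound, treats all of the escaping mass as if it were water spread evenly over the Martian surface; both assumptions are mine for illustration, not figures from the Mars Express team.

# Back-of-the-envelope check (assumed duration and water-equivalent treatment).
SECONDS_PER_YEAR = 3.156e7
loss_rate_kg_per_s = 0.020           # about 20 grams per second (ASPERA-3 estimate)
years = 3.5e9                        # assumed duration of atmospheric escape
total_kg = loss_rate_kg_per_s * SECONDS_PER_YEAR * years

mars_area_m2 = 1.44e14               # surface area of Mars
water_density = 1000.0               # kg per cubic metre
layer_m = total_kg / (water_density * mars_area_m2)
print(f"mass lost: {total_kg:.2e} kg, water-equivalent layer: {layer_m * 100:.1f} cm")
# Prints roughly 2.2e15 kg and about 1.5 cm, consistent with "just a few centimetres".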
Huge amounts
Either some other process removed the water and CO2 or they are still present and hidden somewhere on
Mars, probably underground, Barabash says. "We are talking about huge amounts of water," he told New
Scientist. "To store it somewhere requires a really big, huge reservoir."
Barabash is not sure what form this reservoir – or reservoirs – would take, but he points to findings from NASA's now-lost Mars Global Surveyor (MGS). Its data provided evidence that water had gushed down slopes
on Mars in recent years, possibly originating from beneath the surface (see Water flows on Mars before our
very eyes). "So there might be some possibilities for water existing in liquid form even now," he says.
"If water is there, I think it will put all ideas about human missions to Mars on a completely different level,"
he says. "It's not only water to support [astronauts], but also a
potential fuel." Hydrogen and oxygen for rocket fuel can be
produced from water.
Stormy weather
However, the researchers point out that other mechanisms
might have removed water and CO2 from Mars, such as asteroid
and comet impacts. Or the solar wind might have sheared off whole chunks of atmosphere rather than individual molecules.
Another possibility is suggested by Mars atmosphere expert
David Brain at the University of California in Berkeley, US. He points
out that magnetic storms might boost the rate at which the solar
wind strips molecules from the atmosphere.
"We believe that solar storms were frequent and more intense early on in the solar system's history," he
told New Scientist. Even so, Brain thinks that some of Mars's ancient water and CO2 is still stored in hidden
reservoirs.