Oilfield Review, Autumn 2001
Global Warming and the E&P Industry
The question of how far man-made emissions of greenhouse gases may be driving climate change has stirred intense debate around the world. Continued shifts in the Earth's temperature, predicted by many scientists, could dramatically affect the way we live and do business. This article examines the evidence and the arguments, and describes some of the mitigating actions being taken by the exploration and production (E&P) industry.
Melvin Cannell
Centre for Ecology and Hydrology
Edinburgh, Scotland
Jim Filas
Rosharon, Texas, USA
John Harries
Imperial College of Science,
Technology and Medicine
London, England
Geoff Jenkins
Hadley Centre for Climate
Prediction and Research
Berkshire, England
Martin Parry
University of East Anglia
Norwich, England
Paul Rutter
BP
Sunbury on Thames, England
Lars Sonneland
Stavanger, Norway
Jeremy Walker
Houston, Texas
Scientists use language cautiously. They tend to
err on the side of understatement. During the
mid-1990s, in the Second Assessment Report of
the Intergovernmental Panel on Climate Change
(IPCC), leading scientists from around the world
expressed a consensus view that “the balance of
evidence suggests a discernible human influence
on global climate.” In July 2001, for the IPCC
Third Assessment Report, experts took this conclusion a step further. Considering new evidence,
and taking into account remaining uncertainties,
the panel stated “most of the observed warming
over the last 50 years is likely to have been due
to the increase in greenhouse-gas concentrations.”1 The word ‘likely’ is defined by the IPCC as
a 66 to 90% probability that the claim is true.
An important and influential segment of the
global scientific community firmly believes that
human activity has contributed to a rise in the
Earth’s average surface temperature and a resulting worldwide climate change. They contend that
such activity may be enhancing the so-called
‘greenhouse effect.’ Other distinguished scientists disagree, some dismissing the IPCC view
as simplistic.
The Greenhouse and Enhanced
Greenhouse Effects
The greenhouse effect is the name given to the
insulating mechanism by which the atmosphere
keeps the Earth’s surface substantially warmer
than it would otherwise be. The effect can be
illustrated by comparing the effects of solar
radiation on the earth and the moon. Both are
roughly equidistant from the sun, which supplies
the radiation that warms them, and both receive
about the same amount of heat energy per
square meter of their surfaces. Yet, the earth is
much warmer—a global average temperature of
15°C [59°F] compared with that of the moon,
-18°C [-0.4°F]. The difference is largely due to the
fact that the moon has almost no atmosphere
while the Earth’s dense atmosphere effectively
traps heat that would otherwise escape into space.
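The bracketed temperature conversions used throughout this article follow the standard Celsius-to-Fahrenheit formula; a minimal sketch confirms the two figures quoted above:

```python
def c_to_f(celsius):
    """Convert an absolute temperature from degrees Celsius to Fahrenheit."""
    return celsius * 9.0 / 5.0 + 32.0

print(round(c_to_f(15), 1))   # Earth's global average: 59.0 F
print(round(c_to_f(-18), 1))  # Moon's global average: -0.4 F
```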
Climatologists use a physical greenhouse
analogy to explain how warming occurs. Energy
from the sun, transmitted as visible light, passes
through the glass of a greenhouse without hindrance, is first absorbed by the floor and contents, and then reemitted as infrared radiation.
For help in preparation of this article, thanks to David
Harrison, Houston, Texas, USA; Dwight Peters, Sugar Land,
Texas; and Thomas Wilson, Caracas, Venezuela. Special
thanks to the Hadley Centre for Climate Prediction and
Research for supplying graphics that were used as a basis
for some of the figures appearing in this article.
1. Climate Change 2001: The Scientific Basis: The
Contribution of Working Group I to the Third Assessment
Report of the Intergovernmental Panel on Climate
Change. New York, New York, USA: Cambridge University
Press (2000): 10.
Because infrared radiation cannot pass through
the glass as readily as sunlight, some of it is
trapped, and the temperature inside the greenhouse rises, providing an artificially warm environment to stimulate plant growth (right).
In the natural greenhouse effect, the Earth’s
atmosphere behaves like panes of glass. Energy
coming from the sun as visible short-wavelength
radiation passes through the atmosphere, just as
it does through greenhouse glass, and is
absorbed by the surface of the earth, which then
reemits it as long-wavelength infrared radiation.
Infrared radiation is absorbed by naturally occurring gases in the atmosphere—water vapor,
carbon dioxide [CO2], methane, nitrous oxide,
ozone and others—and reradiated. While some
energy goes into outer space, most is reradiated
back to earth, heating its surface.2
An enhanced greenhouse effect occurs when
human activities increase the levels of certain
naturally occurring gases. If the atmosphere is
pictured as a translucent blanket that insulates
the earth, adding to the concentration of these
greenhouse gases is analogous to increasing the
thickness of the blanket, improving its insulating
qualities (below).
> The greenhouse analogy. Visible energy from the sun passes through the glass, heating the ground, while some reemitted infrared radiation is reflected by the glass and trapped inside. A greenhouse thus traps a portion of the sun's energy impinging on it, raising the interior temperature and creating an artificially warm growing environment.
> Natural and enhanced greenhouse effects. In the natural greenhouse effect, indigenous atmospheric gases contribute to heating of the Earth's surface by absorbing some of the outgoing long-wavelength radiation and reradiating part of it back to earth, while the rest is reradiated into space. In the enhanced greenhouse effect, increased gas concentrations, resulting from human activity, improve the atmosphere's insulating qualities.
Atmospheric constituent | Man-made sources | Effective lifetime
Carbon dioxide | Combustion of fossil fuels and wood; land-use changes | 100 years
Methane | Production and transport of fossil fuels; decomposing waste; agriculture; dissociation of gas hydrates | 10 years
Nitrous oxide | Combustion of fossil fuels; combustion of waste | 150 years
Chlorofluorocarbons | Production | 100 years
Ground-level ozone | Transport; industrial emissions | 3 months
Aerosols | Power generation; transport | 2 weeks

> Man-made emission sources and lifetimes for greenhouse gases. Various gases and aerosols are emitted daily in commercial, industrial and residential activities. Carbon dioxide is the most important because of its abundance and its effective lifetime in the atmosphere of about 100 years.
Man-made emissions of greenhouse gases
occur in a number of ways. For example, carbon
dioxide is released to the atmosphere when solid
waste, wood and fossil fuels—oil, natural gas
and coal—are burned. Methane is emitted
by decomposing organic wastes in landfill sites,
during production and transportation of fossil
fuels, by agricultural activity and by dissociation of gas hydrates. Nitrous oxide is vented
during the combustion of solid wastes and fossil
fuels (above left).
Carbon dioxide is the most important, principally because it is the most abundant and has an effective lifetime in the atmosphere of about 100 years. Every year, more than 20 billion tons are emitted when fossil fuels are
burned in commercial, residential, transportation
and power-station applications. Another 5.5 billion tons are released during land-use changes,
such as deforestation.3 The concentration of CO2
in the atmosphere has increased by more than
30% since the start of the Industrial Revolution.
[Pie chart: carbon dioxide 63%, methane 24%, nitrous oxide 10%, others 3%.]
> Relative warming projected from different greenhouse gases during this century. Of the various greenhouse gases, carbon dioxide is predicted to have the greatest capacity for causing additional global warming, followed by methane and nitrous oxide.
Analysis of air trapped in Antarctic ice caps
shows that the level of carbon dioxide in the
atmosphere in pre-industrial days was about 270
parts per million (ppm). Today, readings taken at
the Mauna Loa Observatory in Hawaii, USA,
place the concentration at about 370 ppm.4
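The "more than 30%" rise in CO2 concentration cited below follows directly from these two readings; a quick arithmetic check:

```python
# Concentrations quoted in the text, in parts per million.
CO2_PREINDUSTRIAL = 270.0  # from air trapped in Antarctic ice
CO2_TODAY = 370.0          # Mauna Loa Observatory reading

increase = (CO2_TODAY - CO2_PREINDUSTRIAL) / CO2_PREINDUSTRIAL
print(f"CO2 increase since pre-industrial days: {increase:.0%}")  # 37%
```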
Concentrations of methane and nitrous oxide,
which have effective lifetimes of 10 and
150 years, respectively, also have increased—
methane more than doubling and nitrous oxide
rising by about 15% over the same time span.
Both are at much lower levels than CO2—
methane at 1.72 ppm and nitrous oxide at
0.3 ppm—but they exert a significant influence
because of their effectiveness in trapping heat.
Methane is 21 times more effective in this regard
than CO2, while nitrous oxide is 310 times more
effective, molecule for molecule.5
The global-warming potential of a gas is a
measure of its capacity to cause global warming
over the next 100 years. The warming effect of
an additional 1-kg [2.2-lbm] emission of a greenhouse gas discharged today—relative to 1 kg of
CO2—will depend on its effective lifetime, the
amount of extra infrared radiation it will absorb,
and its density. On this basis, experts calculate
that, during this century, CO2 will be responsible
for about two-thirds of predicted future warming,
methane a quarter and nitrous oxide around a
tenth (above right).6
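Global-warming potentials like those quoted above (methane 21, nitrous oxide 310, relative to CO2) are commonly applied by weighting each gas's emissions into a CO2-equivalent total. A minimal sketch; the emission tonnages below are hypothetical round numbers for illustration, not figures from this article:

```python
# 100-year global-warming potentials relative to CO2, as quoted in the text.
GWP = {"CO2": 1, "CH4": 21, "N2O": 310}

def co2_equivalent(emissions_tons):
    """Weight each gas's emissions by its GWP for a CO2-equivalent total."""
    return sum(tons * GWP[gas] for gas, tons in emissions_tons.items())

# Hypothetical emission amounts in tons, chosen purely for illustration.
emissions = {"CO2": 1000.0, "CH4": 20.0, "N2O": 1.0}
total = co2_equivalent(emissions)
for gas, tons in emissions.items():
    print(f"{gas}: {tons * GWP[gas] / total:.1%} of CO2-equivalent warming")
```

Even small tonnages of methane and nitrous oxide carry disproportionate weight in the total, which is why both gases matter despite their low atmospheric concentrations.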
2. The description above is a simplification. In fact, about 25% of incoming solar radiation is reflected back into space by clouds, molecules and particles before it reaches the Earth's surface, and another 5% is reflected back by the surface itself. A further 20% is absorbed on the way down by water vapor, dust and clouds. It is the remainder, just over half of the incoming solar radiation, that is absorbed by the Earth's surface. The greenhouse analogy, although widely used, is also only partly accurate: greenhouses work mainly by preventing the natural process of convection.
3. Jenkins G, Mitchell JFB and Folland CK: “The Greenhouse
Effect and Climate Change: A Review,” The Royal Society
(1999): 9-10.
4. Reference 1: 12.
5. “The Greenhouse Effect and Climate Change: A Briefing
from the Hadley Centre,” Berkshire, England: Hadley
Centre for Climate Prediction and Research (October
1999): 7.
6. Reference 5: 7.
[Flow diagram: a climate-system model drives a computer simulation whose predicted behavior is compared with observed behavior; the comparison and validation step feeds back to update and refine the model.]
> Climate simulations. Scientists use sophisticated models and computer simulations of the Earth's climate system to confirm historical temperature changes and to predict future ones. Results are validated by comparison with actual temperature measurements. Such analyses form a basis for updating and refining the simulations.
[Three panels: temperature anomalies, °C, 1850 to 2000, comparing model with observations for natural factors only, human factors only, and human and natural factors combined.]
> Observed and simulated global warming. Neither natural nor man-made effects alone account for the evolution of the Earth's climate during the 20th century. By combining the two, however, the observed pattern is reproduced with reasonable accuracy.
Measuring and Modeling Climate Change
IPCC scientists believe that we are already experiencing an enhanced greenhouse effect.
According to their findings, the Earth’s global
average surface temperature increased by about
0.6°C [1.1°F] during the last century. They maintain that this increase is greater than can be
explained by natural climatic variations. The
panel believes there is only a 1 to 10% probability that inherent variability alone accounts for this
extent of warming. Most studies suggest that,
over the past 50 years, the estimated rate and
magnitude of warming due to increasing concentrations of greenhouse gases alone are comparable to, or larger than, the observed warming.7
To better understand the physical, chemical
and biological processes involved, scientists
investigating climate variations construct complex
mathematical models of the Earth’s weather system. These models are then used to simulate past
changes and predict future variations. The more
closely that simulations match historical climate
records built from direct observations, the more
confident scientists become in their predictive
capabilities (left).
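The compare, validate and refine cycle described above can be sketched in miniature. Everything here is a toy stand-in for real climate modeling: the "model" is a one-parameter linear trend, the "observations" are synthetic, and refinement is a simple parameter search:

```python
import random

random.seed(0)

# Synthetic 'observations': a warming trend plus weather-like noise.
# This is a stand-in for the historical record, not real data.
years = list(range(100))
TRUE_TREND = 0.006  # degrees C per year, chosen arbitrarily
observations = [TRUE_TREND * y + random.gauss(0, 0.05) for y in years]

def simulate(trend):
    """Toy climate 'model': the anomaly grows linearly with time."""
    return [trend * y for y in years]

def rmse(a, b):
    """Root-mean-square misfit between two series of equal length."""
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

# Validation loop: score candidate models against observations, keep the
# best fit. Refining a real climate model is vastly harder, but the
# compare-and-update cycle is the same in principle.
best = min((rmse(simulate(t / 1000.0), observations), t / 1000.0)
           for t in range(20))
print(f"best-fit trend: {best[1]:.3f} C/yr (misfit {best[0]:.3f} C)")
```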
Greater emphasis on diagnosing and predicting the impact of global warming has resulted in
increasingly sophisticated simulations. For example, a state-of-the-art, three-dimensional (3D)
ocean-atmosphere model developed at the
Hadley Centre for Climate Prediction and
Research in Berkshire, England, appears to replicate—with reasonable precision—the evolution
of global climate during the late 19th and 20th
centuries. This simulation matches records that
clearly show that the global mean surface air
temperature has increased by 0.6°C ± 0.2°C
[1.1°F ± 0.4°F] since 1860, but that the progression has not been steady. Most of the warming
occurred in two distinct periods—from 1910 to
1945, and since 1976—with little change in the
intervening three decades.
When factors that impact the Earth’s climate
vary—concentrations of greenhouse gases, but
also heat output from the sun, for example—
they exert a ‘forcing’ on climate (see “Increases
in Greenhouse Forcing,” next page). A positive
forcing causes warming, a negative one results
in cooling. When researchers at the Hadley
Centre and the Rutherford Appleton Laboratory,
near Oxford, England, simulated the evolution
of 20th century climate, they concluded that,
by themselves, natural forcings—changes in
volcanic aerosols, solar output and other
phenomena—could not account for warming
[Maps: observed (top) and simulated (bottom) surface air temperature change, plotted from 90°S to 90°N and 180°W to 180°E on a color scale running from -1 (blue) to 2 (red).]
> Observed and simulated surface air temperature changes. Computer models closely reproduce the global temperature signature obtained from measurements of the change in air temperature. Values increase from negative to positive as the color scale moves from blue to red.
in recent decades. They also concluded that
anthropogenic, or man-made, forcings alone
were insufficient to explain the warming from
1910 to 1945, but were necessary to reproduce
the warming since 1976. However, by combining
the two simulations, researchers were able to
reproduce the pattern of temperature change
with reasonable accuracy. Agreement between
observed and simulated temperature variations supports the contention that 20th century warming resulted from a combination of natural and anthropogenic factors (previous page, bottom).8
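The notion of forcing can be made concrete with a zero-dimensional energy-balance model, the simplest relative of the coupled models discussed here. The heat capacity, feedback parameter and forcing series below are illustrative round numbers, not values from the Hadley Centre study:

```python
# Zero-dimensional energy-balance model: C * dT/dt = F(t) - lam * T, where
# T is the global-mean temperature anomaly and F the radiative forcing.
HEAT_CAPACITY = 8.0  # W yr m^-2 K^-1, roughly an ocean mixed layer
LAM = 1.2            # W m^-2 K^-1, illustrative climate feedback parameter

def integrate(forcings, dt=1.0):
    """Step the temperature anomaly forward one year per forcing value."""
    temp, history = 0.0, []
    for f in forcings:
        temp += dt * (f - LAM * temp) / HEAT_CAPACITY
        history.append(temp)
    return history

# A crude stand-in for growing greenhouse forcing: flat, then a linear ramp.
forcing = [0.0] * 50 + [0.02 * i for i in range(100)]
anomalies = integrate(forcing)
print(f"anomaly after ramp: {anomalies[-1]:.2f} K")  # positive forcing warms
```

A positive forcing drives the anomaly upward toward a new equilibrium, a negative one drives it downward; the heat capacity sets how long the system lags behind the forcing.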
In addition to examining the global mean temperature, researchers at the Hadley Centre also
compared geographic patterns of temperature
change across the surface of the earth. They
used models to simulate climate variations
driven by changes in greenhouse-gas concentrations and compared the ‘fingerprint’ produced
with patterns of change that emerge from observation. Striking similarities are evident between
the fingerprint generated by a simulation of the
last 100 years of temperature changes and the
patterns actually observed over that period (above).
Despite many advances, climate modeling remains an inexact science. There is concern that, at present, simulations may not adequately represent certain feedback mechanisms, especially those involving clouds. Researchers, like those at Hadley, do not claim that close agreement between observed and simulated temperature changes implies a perfect climatic model, but if today's sophisticated simulations of climate-change patterns continue to closely match observations, scientists will rely to a greater extent on their predictive capabilities.

Increases in Greenhouse Forcing
Early this year, scientists at the Imperial College of Science, Technology and Medicine in London, England, provided the first experimental observation of a change in the greenhouse effect. Previous studies had been largely limited to theoretical simulations.1 Changes in the Earth's greenhouse effect can be detected from variations in the spectrum of outgoing long-wavelength radiation, a measure of how the earth gives off heat into space that also carries an imprint of the gases responsible for the greenhouse effect.
From October 1996 until July 1997, an instrument on board the Japanese ADEOS satellite measured the spectra of long-wavelength radiation leaving the earth. The Imperial College group compared the ADEOS data with data obtained 27 years earlier by a similar instrument aboard the National Aeronautics and Space Administration (NASA) Nimbus 4 meteorological satellite. The comparison of the two sets of clear-sky infrared spectra provided direct evidence of a significant increase in the atmospheric levels of methane, carbon dioxide, ozone and chlorofluorocarbons since 1970. Simulations show that these increases are responsible for the observed spectra.
1. Harries JE, Brindley HE, Sagoo PJ and Bantges RJ: "Increases in Greenhouse Forcing Inferred from the Outgoing Longwave Radiation Spectra of the Earth in 1970 and 1997," Nature 410, no. 6832 (March 15, 2001): 355-357.
7. Reference 1: 10.
8. Stott PA, Tett SFB, Jones GS, Allen MR, Mitchell JFB
and Jenkins GJ: “External Control of 20th Century
Temperature by Natural and Anthropogenic Forcings,”
Science 290, no. 5499 (December 15, 2000): 2133-2137.
[Diagram: radiation from the Earth's surface interacts with soot and aerosol either as separate constituents (external mixing) or as coalesced constituents (internal mixing) before radiating into space.]
> Impact of aerosols and soot. Temperature simulations that take into account an internally mixed, or coalesced, accumulation of aerosols and soot are more consistent with observations than those assuming separate, or externally mixed, accumulations.
[Summary box: global-average surface temperature change, 1900 to 2000: +0.6°C. Results: 10% decrease in snow cover since the late 1960s; annual ice cover two weeks shorter; 0.1- to 0.2-m sea-level rise; 0.5 to 1% increase in precipitation per decade in the Northern Hemisphere.]
> Observed impact of global warming. The 0.6°C temperature rise observed during the last 100 years has been postulated as the cause of decreased snow and ice cover, higher sea levels and increased precipitation.
The Opposing View
Not all scientists accept the IPCC findings.
Many distinguished researchers argue that the
panel’s approach is too simplistic. For instance,
Dr Richard Lindzen, Alfred P. Sloan Professor of
Meteorology at the Massachusetts Institute of
Technology (MIT) in Cambridge, USA, suggests
that clouds over the tropics act as an effective
thermostat and that any future warming because
of increased carbon dioxide concentration in the
atmosphere could be significantly less than current models predict.
Some scientists object that even sophisticated circulation models do not adequately describe the complexity of the mechanisms at work. A group of researchers at the
Harvard-Smithsonian Center for Astrophysics in
Cambridge, Massachusetts, for example, claims
there are too many unknowns and uncertainties
in climate modeling to have confidence in the
accuracy of today’s predictions. The group argues
that even if society had complete control
over how much CO2 was introduced into the
atmosphere, other variables within the climate
system are not sufficiently well-defined to produce reliable forecasts. The researchers are not
trying to disprove a significant man-made contribution, but rather contend that scientists do not
know enough about the complexity of climate
systems, and should be careful in ascribing too
much relevance to existing models.9
New scientific studies are shedding more
light on the problem. For example, previous
investigations have concluded that the Earth’s
climate balance is upset not only by emissions of
man-made greenhouse gases during processes
such as the combustion of fossil fuels, but also
by small particles called aerosols, such as those
formed from sulfur dioxide, which cool the Earth’s
surface by bouncing sunlight back into space.
But, new findings suggest that things may not be
that simple. A researcher at Stanford University,
California, USA, states that black carbon, or soot,
emissions from the burning of biomass and fossil
fuels are interfering with the reflectivity of
aerosols, darkening their color so that they
absorb more radiation. This reduces the cooling
effect, and could mean that black carbon is a
major cause of global warming, along with carbon dioxide and other greenhouse gases.
Atmospheric computer simulations usually
assume that aerosols and soot particles are separate, or externally mixed. An internally mixed
state—in which aerosols and soot coalesce—
also exists, but no one has yet successfully determined the relative proportions of the two states.
The Stanford researcher ran a simulation in
which black carbon was substantially coalesced
with aerosols. His results were more consistent
with observations than simulations that assumed
mainly external mixing. Although this could mean
that black carbon is a significant contributor to
warming, there is a bright side to the discovery.
Unlike the extended lifetime of carbon dioxide,
black carbon disappears much more rapidly. If
such emissions were stopped, the atmosphere
would be clear of black carbon in only a matter of
weeks (left).10
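The contrast drawn above between carbon dioxide and black carbon comes down to atmospheric lifetime. Assuming simple first-order removal (an idealization), with the roughly 100-year CO2 lifetime quoted earlier and a soot lifetime of about two weeks:

```python
import math

def fraction_remaining(t_years, lifetime_years):
    """Fraction of an emitted pulse still airborne after t_years, assuming
    first-order (exponential) removal with the given e-folding lifetime."""
    return math.exp(-t_years / lifetime_years)

CO2_LIFETIME = 100.0        # years, as quoted in the article
SOOT_LIFETIME = 2.0 / 52.0  # about two weeks, expressed in years

print(f"CO2 left after 1 year:  {fraction_remaining(1.0, CO2_LIFETIME):.1%}")
print(f"soot left after 1 year: {fraction_remaining(1.0, SOOT_LIFETIME):.1e}")
```

Under this idealization, a pulse of CO2 is still almost entirely airborne a year later, while an equivalent pulse of black carbon has effectively vanished within weeks of emissions stopping.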
9. Soon W, Baliunas S, Idso SB, Kondratyev KY and
Postmentier ES: “Modelling Climatic Effects of
Anthropogenic Carbon Dioxide Emissions: Unknowns
and Uncertainties.” A Center for Astrophysics preprint.
Cambridge, Massachusetts, USA: Harvard-Smithsonian
Center for Astrophysics (January 10, 2001): to appear as
a review paper in Climate Research.
10. Jacobson M: “Strong Radiative Heating due to the
Mixing State of Black Carbon in Atmospheric Aerosol,”
Nature 409, no. 6821 (2001): 695-697.
11. Reference 1: 2-4.
12. Reference 1: 12-13.
13. Climate Change 2001: Impacts, Adaptation and
Vulnerability: Contribution of Working Group II to the
Third Assessment Report of the Intergovernmental
Panel on Climate Change. New York, New York, USA:
Cambridge University Press (2001): 5.
[Map callouts: greater exposure to disease; increase in frequency and intensity of severe weather; decreased food supply; water shortages; increased flooding.]
> Future impact of global warming. IPCC scientists predict a number of consequences if climate changes track the latest simulations, ranging from water shortages to flooding and decreased food supply.
Predicting the Future Impact of
Global Warming
The IPCC has described the current state of scientific understanding of the global climate system, and has suggested how this system may
evolve in the future. As discussed, the panel confirmed that the global-average surface temperature of the earth increased by about 0.6°C during
the last 100 years. Analyses of proxy data from
the Northern Hemisphere indicate that it is likely
the increase was the largest of any century in the
past millennium. Because of limited data, less is
known about annual averages prior to the year
1000, and for conditions prevailing in most of the
Southern Hemisphere prior to 1861.
The IPCC report states that temperatures
have risen during the past four decades in the
lowest 8 km [5 miles] of the atmosphere; snow
cover has decreased by 10% since the late
1960s; the annual period during which rivers and
lakes are covered by ice is nearly two weeks
shorter than at the start of the century; and average sea levels rose by 0.1 to 0.2 m [0.3 to 0.7 ft]
during the 1900s. The report further states that,
during the last century, precipitation increased by
0.5 to 1% per decade over most middle and high
latitudes of Northern Hemisphere continents,
and by 0.2 to 0.3% per decade over tropical land
areas (previous page, bottom).11
While these changes may appear to be modest, predicted changes for this century are much
larger. Simulations of future atmospheric levels of
greenhouse gases and aerosols suggest that the
concentration of CO2 could rise to between 540
and 970 ppm. For all scenarios considered by the
IPCC, both global-average temperature and sea
level will rise by the year 2100—temperature by
1.4°C to 5.8°C [2.5°F to 10.4°F] and sea level by
0.09 to 0.9 m [0.3 to 2.7 ft]. The predicted temperature rise is significantly greater than the 1°C
to 3.5°C [1.8°F to 6.3°F] estimated by the IPCC
five years ago. Precipitation is also forecasted to
increase. Northern Hemisphere snow cover is
expected to decrease further, and both glaciers
and ice caps are expected to continue to retreat.12
If climate changes occur as predicted, serious
consequences could result, both with respect to
natural phenomena, such as hurricane frequency
and severity, and to human-support systems. The
IPCC Working Group II, which assessed impacts,
adaptation and vulnerability, stated that if the
world continues to warm, we could expect water
shortages in heavily populated areas, particularly
in subtropical regions; a widespread increase in
the risk of flooding as a result of heavier rainfall
and rising sea levels; greater threats to health
from insect-borne diseases, such as malaria, and
water-borne diseases, such as cholera; and
decreased food supply as grain yields drop
because of heat stress. Even minimal increases in
temperature could cause problems in tropical
locations where some crops are already near their
maximum temperature tolerance (above).13
Sea-level rises could threaten five parts of
Africa that have large coastal population centers—the Gulf of Guinea, Senegal, Gambia,
Egypt and the southeastern African coast. Even a
somewhat conservative scenario of a 40-cm
[15.8-in.] sea-level rise by the 2080s would add
75 to 200 million people to the number currently
at risk of being flooded by coastal storm surges,
with associated tens of billions of dollars in property loss per country.14
Africa, Latin America and the developing
countries of Asia may have a two-fold problem,
being both more susceptible to the adverse
effects of climate change and lacking the infrastructure to adjust to the potential social and
economic impacts.
The IPCC Working Group II has ‘high confidence’ that:
• Increases in droughts, floods and other
extreme events in Africa would add to stresses
on water resources, food-supply security,
human health and infrastructures, and constrain further development.
• Sea-level rise and an increase in the intensity
of tropical cyclones in temperate and tropical
Asia would displace tens of millions of people
in low-lying coastal areas, while increased
rainfall intensity would heighten flood risks.
• Floods and droughts would become more
frequent in Latin America, and flooding
would increase sediment loads and degrade
water quality.
The Working Group has ‘medium confidence’
that:
• Reductions in average annual rainfall, runoff
and soil moisture would increase the creation
of deserts in Africa, especially in southern,
northern and western Africa.
• Decreases in agricultural productivity and
aquaculture due to thermal and water stress,
sea-level rise, floods, droughts and tropical
cyclones would diminish the stability of food
supplies in many countries in the arid, tropical
and temperate parts of Asia.
• Exposure to diseases such as malaria,
dengue fever and cholera would increase in
Latin America.15
Not all impacts would be negative, however.
Among projected beneficial effects are higher
crop yields in some mid-latitude regions; an
increase in global timber supply; increased water
availability for people in some regions, like parts
of Southeast Asia, which currently experience
water shortages; and lower winter death rates in
mid- to high-latitude countries.16
[Map callouts by region: retreating glaciers; thawing of permafrost; melting of sea ice; floods; increased rainfall; intense cyclones; rising sea levels; higher heat index; hotter summers; reduced water supply; increase in forest fires; deteriorating air quality; degraded water quality; droughts; decreased food supply; expanding deserts; sea-level rise.]
> Impact of global warming by region. All continents will be affected significantly if global warming continues. The type and severity of specific impacts will vary, as will each continent's or country's capacity to use infrastructure and technology to cope with change.
Other studies—such as the US Global
Research Program’s report “Climate Change
Impacts on the United States,” and the European
Community-funded ACACIA (A Consortium for
the Application of Climate Impact Assessments)
Project report—are consistent with future IPCC
forecasts, and provide a more detailed picture for
particular regions.
According to the US study, assuming there are
no major interventions to reduce continued growth
of world greenhouse-gas emissions, temperatures
in the USA can be expected to rise by about 3°C to
5°C [5.4°F to 9°F] over the next 100 years, compared with the worldwide range of 1.4°C to 5.8°C
[2.5°F to 10.4°F] suggested by the IPCC.17
Assuming there are no major interventions,
other predictions include the following:
• Rising sea levels could put coastal areas at
greater risk of storm surges, particularly in the
southeast USA.
• Large increases in the heat index, the combination of temperature and humidity, and in the
frequency of heat waves could occur, particularly in major metropolitan areas.
• Continued thawing of permafrost and melting
of sea ice in Alaska could further damage
forests, buildings, roads and coastlines.
In Europe, negative climate changes are
expected to impact the south more than the
north. Sectors such as agriculture and forestry
will be affected to a greater extent than sectors
such as manufacturing and retailing, and
marginal and poorer regions will suffer more
adverse effects than wealthy ones.
The ACACIA report, which provided the basis
for the IPCC findings on impacts in Europe, makes
the following predictions for southern Europe:
• Long, hot summers will double in frequency by 2020, with a five-fold increase in
southern Spain, increasing the demand for
air conditioning.
• Available water volumes will decrease by 25%,
reducing agricultural potential. Careful planning will be essential to satisfy future urban
water needs.
• Desertification and forest fires will increase.
• Deteriorating air quality in cities and excessive
temperatures at beaches could reduce recreational use and associated tourist income.
Predictions for northern Europe include the
following:
• Cold winters will be half as frequent by 2020.
• Northern tundra will retreat and there could be
a loss of up to 90% of alpine glaciers by the
end of the century.
• Conversely, climate changes could increase
agricultural and forest productivity and water
availability, although the risk of flooding could
increase (above).18
Oilfield Review
The Sociopolitical Debate and Its Impact
on Process and Technology
On balance, the potential dangers and adverse
effects of global warming far outweigh any possible benefits. Both legislative and technical
options are being explored to mitigate the
impacts of future climate change.
Because CO2 has an effective atmospheric lifetime of about 100 years, its concentration is slow to respond
to any cut in emissions. If nothing is done to
reduce emissions, the concentration would more
than double over the next century. If emissions
are lowered to 1990 levels, the concentration
would still rise, probably to more than 500 ppm.
Even if emissions were slashed to half that level
and held there for 100 years, there would still be
a slow rise in concentration. Best estimates suggest it would take a reduction of 60 to 70% of the
1990 emission levels to stabilize the concentration of CO2 at the 1990 levels.19
Against this backdrop, there have been political attempts to grapple with the problem for
nearly a decade. These have achieved, at best,
modest results. Although an in-depth discussion
of global-warming politics is beyond the scope of
this technically focused article, conferences held
to date and their resulting protocols illustrate the
challenges that will be faced by new-generation
oilfield processes and technologies, and by business and industry in general (below).
The political movement toward global consensus began in 1992 at the United Nations
Conference on Environment and Development
held in Rio de Janeiro, Brazil. This conference
resulted in the United Nations Framework
Convention on Climate Change (UNFCCC), a
statement of intent on the control of greenhouse-gas emissions, signed by an overwhelming
majority of world leaders. Article II of the convention, which came into force in 1994, said the
signatories had agreed to “achieve stabilization
of greenhouse-gas concentrations in the atmosphere at a level that would prevent dangerous
anthropogenic interference with the climate system…within a time frame sufficient to allow
ecosystems to adapt naturally to climate change,
to ensure that food production is not threatened,
and to enable economic development to proceed
in a sustainable manner.” The developed nations
taking part also committed themselves to reduce
their emissions of greenhouse gases in the year
2000 to 1990 levels.
Autumn 2001

Year   Conference                   Outcome
1992   Rio de Janeiro, Brazil       Statement of intent on control of greenhouse gases
1997   Kyoto, Japan                 Protocol on reduction levels for specific commitment period
2000   The Hague, The Netherlands   Collapse of implementation plan for Kyoto Protocol
2001   Bonn, Germany                Broad agreement on rulebook for implementing Kyoto Protocol (except USA)

> Major international global warming conferences. A concerted effort at addressing the sociopolitical implications of global warming in a forum of nations began in 1992 in Rio de Janeiro, Brazil. The most recent conference, held in July 2001 in Bonn, Germany, was the latest attempt to reach some type of formalized agreement on reducing greenhouse-gas emissions.

A more ambitious target was set in 1997 in the Kyoto Protocol, an agreement designed to
commit the world’s 38 richest nations to reduce
their greenhouse-gas emissions by an average of
at least 5% below 1990 levels in the period from
2008 to 2012.20 The Kyoto Protocol put most of
the burden on developed countries, which, as a
group, had been responsible for the majority
of greenhouse gases in the atmosphere. It
excluded more than 130 developing countries,
even though many poorer nations were adding to
the problem in their rush to catch up with the
developed world. European Union (EU) countries
agreed to a reduction of 8%, and the USA
promised a 7% cutback, based on 1990 levels. To
take effect, it was agreed that the Protocol must
be ratified by at least 55 countries, including
those responsible for at least 55% of 1990 CO2
emissions from developed countries.
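The entry-into-force condition is a double threshold, which can be sketched as a simple test. The emission shares below are illustrative round numbers, not official UNFCCC figures:

```python
# Sketch of the Kyoto Protocol's double entry-into-force test described
# above: at least 55 ratifying parties, AND ratifying developed countries
# together accounting for at least 55% of 1990 developed-country CO2
# emissions. Shares below are hypothetical, for illustration only.
def enters_into_force(ratifiers, shares_1990):
    """ratifiers: country names; shares_1990: developed-country name
    -> percent of 1990 developed-country CO2 emissions."""
    covered = sum(shares_1990.get(c, 0.0) for c in ratifiers)
    return len(ratifiers) >= 55 and covered >= 55.0

shares = {"USA": 36.0, "EU": 24.0, "Russia": 17.0, "Japan": 8.5}
without_usa = ["EU", "Russia", "Japan"] + [f"dc{i}" for i in range(52)]
with_usa = ["USA"] + without_usa

print(enters_into_force(without_usa, shares))  # False: only ~49.5% covered
print(enters_into_force(with_usa, shares))     # True
```

The sketch shows why the two thresholds interact: reaching 55 parties is easy, but without the largest emitters the 55%-of-emissions test can still fail.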
The targets set in Kyoto are more rigorous
than they might first appear since many developed economies have, until very recently, been
growing rapidly and are emitting greater volumes
of greenhouse gases. In 1998, for example, the
US Department of Energy forecasted that US
emissions in the year 2010 would exceed the
Kyoto target by 43%.
The November 2000 talks in The Hague on
implementing the Kyoto Protocol collapsed when
the EU rejected a request that the estimated
310 million tons of CO2 soaked up by forests in
the USA be set against its 7% commitment. The
EU suggested instead that the USA be allocated
a 7.5-million ton offset.
In July 2001, 180 members of the UNFCCC
finally reached broad agreement on an operational rulebook for the Kyoto Protocol at a meeting in Bonn, Germany. The USA rejected the
agreement. If the Protocol is to go forward, the
next step would be for developed-country
governments to ratify it so that measures could
be brought into force as soon as possible, possibly by 2002.
One issue resolved at the Bonn meeting was
how much credit developed countries could
receive towards their Kyoto targets through the
use of ‘sinks’ that absorb carbon from the atmosphere. It was agreed that qualifying activities include revegetation and the management of forests, croplands and grazing lands. Individual country quotas
were set so that, in practice, sinks will account
only for a fraction of the emission reductions that
can be counted towards the target levels.
Similarly, storage options exist for carbon dioxide
that offer attractive alternatives to sinks under
certain conditions (see “Mitigating the Impact of
Carbon Dioxide: Sinks and Storage,” page 54).
The conference also adopted rules governing the
so-called Clean Development Mechanism (CDM)
through which developed countries can invest in
climate-friendly projects in developing countries
and receive credit for emissions thereby avoided.
14. Reference 13: 13-14.
15. Reference 13: 14-15.
16. Reference 13: 6.
17. Climate Change Impacts on the United States, The
Potential Consequences of Climate Variability and
Change: Foundation Report, US Global Change Research
Program Staff. New York, New York, USA: Cambridge
University Press (2001): 6-10.
18. Parry ML (ed): Assessment of Potential Effects and
Adaptations for Climate Change in Europe. Norwich,
England: Jackson Environment Institute, University of
East Anglia, 2000.
19. Jenkins et al, reference 3: 10.
20. Kyoto Protocol, Article 31, available at Web site:
http://www.unfccc.de/resource/docs/convkp/kpeng.html
Mitigating the Impact of Carbon Dioxide: Sinks and Storage
In the short to medium term, the world will
continue to depend upon fossil fuels as cheap
energy sources, so there is growing interest in
methods to control carbon dioxide emissions—
for example, the creation of carbon sinks and
storage in natural reservoirs underground or in
the oceans.1
Carbon sinks—Carbon sinks are newly
planted forests where trees take CO2 from the
atmosphere as they grow and store it in their
branches, trunks and roots. If too much CO2 is
being pumped into the atmosphere by burning
fossil fuels, discharge levels can be compensated for, to some extent, by planting new trees
that soak up and store CO2.
In 1995, the IPCC estimated that some
345 million hectares [852 million acres] of new
forests could be planted between 1995 and 2050
that would sequester nearly 38 gigatons of carbon. These actions would offset about 7.5% of
fossil-fuel emissions. The IPCC added that other
measures, like slowing tropical deforestation,
could sequester another 20 to 50 gigatons.
Taken together, new forests, agroforestry, regeneration and slower deforestation might offset 12
to 15% of fossil-fuel emissions by the year 2050.
An attractive feature of this approach is that, if
implemented globally, it buys time during which
longer term solutions can be sought to meet
world energy needs without endangering the
climate system.
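Taking the quoted figures at face value, a back-of-envelope check shows the scale of fossil-fuel emissions they imply:

```python
# Sanity check on the IPCC sink figures quoted above: 38 Gt sequestered
# by new forests is said to offset about 7.5% of fossil-fuel emissions
# over 1995-2050, which implies the cumulative emissions being offset.
sequestered_gt = 38.0      # carbon stored by new forests, Gt
offset_fraction = 0.075    # stated share of fossil-fuel emissions
years = 2050 - 1995

implied_emissions_gt = sequestered_gt / offset_fraction
print(f"Implied cumulative fossil-fuel emissions: {implied_emissions_gt:.0f} Gt")
print(f"Average rate over {years} years: {implied_emissions_gt / years:.1f} Gt/yr")
```

The implied average of roughly 9 Gt of carbon per year underlines why forestry alone can only buy time rather than solve the emissions problem.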
There are, however, other factors that must
be considered, such as how to quantify the
amount of carbon being sequestered, how to
verify sequestration claims and how to deal with
‘leakage.’ Leakage occurs when actions to
increase carbon storage in one place promote
activities elsewhere that cause either a
decrease in carbon storage (negative leakage) or an increase in carbon storage (positive leakage).
Preserving a forest for carbon storage may, for
instance, produce deforestation elsewhere (negative leakage) or stimulate tree planting elsewhere to provide timber (positive leakage). The
carbon-sink process is reversible. At some
future date, some forests could become unsustainable, leading to a rise in CO2 levels.
> Sleipner field location. The Sleipner East and West fields lie in the Norwegian sector of the North Sea, west of Stavanger, among fields that include Statfjord, Gullfaks, Frigg, Heimdal, Ula and Ekofisk.

Carbon storage—Carbon dioxide is produced as a by-product in many industrial processes, usually in combination with other gases. If the
CO2 can be separated from the other gases—at
present, an expensive process—it can be stored
rather than released to the atmosphere. Storage
could be provided in the oceans, deep saline
aquifers, depleted oil and gas reservoirs, or on
land as a solid. Oceans probably have the greatest potential storage capacity. While there
are no real engineering obstacles to overcome,
the environmental implications are not adequately understood.
For years, carbon dioxide has been injected
into operating oil fields to enhance recovery,
and normally remains in the formation. The use
of depleted oil or gas reservoirs for CO2 storage,
however, has a further advantage in that the
geology is well-known, so disposal takes place in
areas where formation seals can contain the gas.
The first commercial-scale storage of CO2 in
an aquifer began in 1996 in the Sleipner natural
gas field belonging to the Norwegian oil company Statoil. The project is named SACS (Saline
Aquifer CO2 Storage) and is sponsored by the
EU research program Thermie. About a million tons a year of CO2 are removed from the
natural gas stream using a solvent-absorption
process and then reinjected into the Utsira
reservoir, 900 m [2950 ft] below the floor of
the North Sea (above). According to a report by
the Norwegian Ministry of Petroleum and
Energy, the Utsira formation is widespread
and about 200 m [660 ft] thick, so it can theoretically accommodate 800 billion tons of
CO2—equivalent to the emissions from all
northern European power stations and major
industrial establishments for centuries to come
(below).
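The quoted capacity can be put on a timescale with a rough check. The Sleipner rate is from the text; the combined northern European emission rate is a hypothetical round number, for illustration only:

```python
# Scale check on the quoted Utsira capacity. The 800-billion-ton figure
# is from the ministry report cited above; the regional emission rate is
# an assumed round number, not a measured value.
capacity_t = 800e9        # quoted theoretical capacity, tons of CO2
sleipner_rate = 1e6       # tons/yr removed and injected at Sleipner
regional_rate = 1e9       # assumed N. European industrial emissions, tons/yr

print(f"Years to fill at the Sleipner rate: {capacity_t / sleipner_rate:,.0f}")
print(f"Years to fill at the regional rate: {capacity_t / regional_rate:,.0f}")
```

Even at a rate a thousand times Sleipner's, the quoted capacity would last on the order of 800 years, consistent with "centuries to come."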
To monitor the CO2-injection area,
Schlumberger is conducting four-dimensional
(4D), or time-lapse, seismic studies that compare seismic surveys performed before and during injection. A survey acquired in 1994, two
years before injection began, served as the baseline for comparison with a 1999 survey acquired
after about 2 million tons of CO2 had been
injected. Higher seismic amplitudes in the 1999
survey show the location where gas has displaced brine in the Utsira formation. Another
4D survey is scheduled for late 2001 (below).
The Sleipner CO2 sequestration project
has already inspired other oil and gas companies to consider or plan similar efforts in Southeast Asia, Australia and Alaska.
> Seismic responses due to carbon dioxide injection. A 1994 seismic survey (left) served as a baseline for a 1999 survey (right), acquired after injection of 2 million tons of CO2 since 1996, that showed the pattern of brine displacement by carbon dioxide. The velocity push-down beneath the CO2 cloud lies below the top of the Utsira formation; no change appears above that level.

> Carbon dioxide injection well in Utsira. The Utsira formation is about 200 m [660 ft] thick and can hold the equivalent of all carbon dioxide emissions from all northern European power stations and industrial facilities for centuries to come.

1. Cannell M: Outlook on Agriculture 28, no. 3: 171-177.
BP Emissions-Reduction Program
• Capture and reuse emissions.
• Stop deliberate venting of carbon dioxide and methane.
• Improve energy efficiency.
• Eliminate routine flaring.
• Develop technologies to separate carbon dioxide from gas mixtures.
> Cutting emission levels. BP has undertaken an aggressive, multifaceted program to reduce emissions, ranging from improved energy efficiency to elimination of routine gas flaring.
The Kyoto Protocol includes a compliance
mechanism. For every ton of gas that a country
emits over its target, it will be required to reduce
an additional 1.3 tons during the Protocol’s second commitment period, which starts in 2013.
Some reports contend that concessions made at
the conference reduced emissions cuts required
by the Protocol from 5.2% to between 0 and 3%
in 2010. The UNFCCC is more cautious in its
statements. As of August 2001, its secretariat had not calculated how the Bonn agreements might affect developed-country emission
reductions under the Kyoto Protocol, and indicated that this would not be known with any precision until the 2008-2012 target period.
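The 1.3-to-1 carry-over rule described above is simple arithmetic; a sketch, with hypothetical tonnages:

```python
# Kyoto compliance rule quoted above: each ton emitted over target in the
# first commitment period adds a 1.3-ton reduction obligation in the
# second period (from 2013). Figures below are hypothetical.
PENALTY_RATE = 1.3

def second_period_obligation(target_t, actual_t):
    """Extra tons to reduce in the second period; zero if target was met."""
    return max(0.0, (actual_t - target_t) * PENALTY_RATE)

print(second_period_obligation(1000.0, 1100.0))  # 130.0
```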
E&P Company Initiatives
Today, many oil and gas companies are taking
global warming seriously, convinced that it is sensible to adopt a precautionary approach. Others
have taken a more conservative stance: they
agree that climate change may pose a legitimate
long-term risk, but argue that there is still insufficient scientific understanding to make reasonable
predictions and informed decisions, or to justify
drastic measures. All agree that a combination of
process changes and advanced technologies will
be required within the industry to meet the types
of emission standards being proposed.
BP and Shell have implemented strategies
based on a judgment that while the science of
climate change is not yet fully proven, it is prudent to behave as though it were. Both companies
have established ambitious internal targets for
reduction of their own emissions. The Kyoto
Protocol calls for an overall reduction of greenhouse-gas emissions of at least 5% by 2008 to
2012, compared with 1990. BP has undertaken to
reduce its greenhouse-gas emissions by 10% by
the year 2010, against the 1990 baseline. Shell
intends to reduce emissions by 10%, against the
same baseline, by 2002.
Companies are choosing to cut emissions in
several different ways. The BP emissions reduction program, for instance, includes ambitious
commitments:
• Ensure that nothing escapes into the environment that can be captured and, ideally, used
elsewhere. BP intends to stop the deliberate
venting of methane and carbon dioxide wherever possible. This may involve redesigning or
replacing equipment, and identifying and eliminating leaks.
• Improve energy efficiency. Engineers are examining all energy-generating equipment to
ensure that the company is making the best
possible use of hydrocarbon fuels and the heat
that is a by-product of energy generation.
• Eliminate routine flaring. It is better to flare gas
than vent it directly to the atmosphere, but it
is still a waste of hydrocarbons—although
some flaring may still be necessary for
safety reasons.
• Develop technology to separate carbon dioxide
from gas mixtures, then reuse it for enhanced
oil recovery or store it in oil and gas reservoirs
that are no longer in use, or in saline formations (above).
Integrated oil companies also are trying to
help customers reduce greenhouse-gas emissions
by increasing the availability of fuels with lower
carbon content and offering renewable energy
alternatives, like solar and wind-driven power.
Some companies, including BP and Shell,
have introduced internal greenhouse-gas emissions trading systems. The attraction of emissions
trading is that it allows reductions to be achieved
at the lowest cost; companies for whom emissions reductions are cheap can lower their
emissions and sell emission rights to firms that
would have to pay more to decrease emissions.
The BP emissions trading system is based on
a cap-and-trade concept, and was primarily
designed to provide BP with practical experience
dealing with an emissions trading market and to
learn about its complexities. At its simplest level,
a cap is set each year to steer the group toward
the most efficient use of capital to meet its 2010
target of 10%. Say, for example, increased production is planned from an offshore platform,
thereby causing emissions above its allocated
allowance. If the platform’s on-site abatement
costs are higher than the market price of CO2, the
company may decide to purchase CO2
allowances for that unit. Similarly, if a downstream unit has upgraded its refinery and emits
less CO2 than its allowances cover, it is economically desirable for both units if the latter sells its allowances to the former (below).
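The decision described above—abate on-site or buy allowances, whichever is cheaper—can be sketched as a minimal cost comparison. The allocation follows the accompanying figure; the per-ton prices and tonnages are hypothetical:

```python
# Minimal cap-and-trade sketch: a unit over its cap either abates on-site
# or buys allowances, whichever is cheaper. Prices are hypothetical.
ALLOCATION = 50.0   # tons allocated to each unit, as in the figure

def cheapest_compliance(expected_t, abatement_usd_t, market_usd_t):
    """Return (action, cost) for meeting the cap at least cost."""
    excess = expected_t - ALLOCATION
    if excess <= 0:
        return ("sell surplus", 0.0)    # under cap; surplus can be sold
    if market_usd_t < abatement_usd_t:
        return ("buy allowances", excess * market_usd_t)
    return ("abate on-site", excess * abatement_usd_t)

# Offshore platform expects 60 tons; abatement costs $25/ton on-site and
# allowances trade at $10/ton, so it buys the refinery's 10-ton surplus.
print(cheapest_compliance(60.0, 25.0, 10.0))   # ('buy allowances', 100.0)
print(cheapest_compliance(40.0, 25.0, 10.0))   # ('sell surplus', 0.0)
```

This is the whole attraction of trading: the group's total emissions stay under the combined cap, but each ton is removed wherever it is cheapest to remove.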
The operation of these systems will be
closely followed not only by other oil and gas
companies but also by governments, since the
principles behind emissions trading are broadly
the same whether trading takes place within a
single company, among companies within a single country, among companies internationally or
between nations.
Oilfield Technology Development
and Application
Working with oil and gas companies, major oilfield service suppliers have been at the forefront
in addressing a range of health, safety and environmental issues—from reducing personnel
exposure to risks at the wellsite, to application of
‘green’ chemicals that provide equal or enhanced
performance while decreasing ecological impact,
and to methods for cutting or eliminating emissions resulting from processes such as burning
oil and flaring gas during well-testing operations.
> Emissions trading system. This process strives to reduce emissions at the lowest cost by permitting the buying and selling of emissions rights between various units within a given company or between companies. In the example shown, each company initially is allocated 50 permits to emit 50 tons; Company A buys 10 units and Company B sells 10, giving emission limits after trading of 60 and 40 tons.
> Three-stage program to eliminate flaring. A Schlumberger team in the Middle East committed to first reduce and then fully eliminate flaring of gas and
burning of oil and, at the same time, generate greater revenue for the operator by increasing pipeline throughput.
Solutions to eliminate flaring—Burning oil
and flaring natural gas during testing operations
not only are costly due to lost revenue, but also
produce large quantities of carbon dioxide. Small
amounts of toxic gases, soot and unburned
hydrocarbons are also released. Eliminating oil
burning and, ultimately, gas flaring not only creates a safer working environment, but also helps
reduce the key constituent, carbon dioxide,
thought to be associated with global warming.
Recently, a Schlumberger team in the Middle
East, working closely with a major operator in the
region, addressed the flaring problem for production testing where an existing export pipeline
was available. Considering the nature of the testing program, there were several key challenges
that had to be overcome. Wells are typically
highly deviated or horizontal, and penetrate massive carbonate formations. Large quantities of
acid are used to treat the zones, giving rise to
long cleanup periods and an erratic initial flow of
mixtures of spent acid, emulsions, oil and gas.
Traditionally, the wells were flowed until sufficient oil was produced at sufficient pressure to
go directly into the production pipeline, requiring
burning of oil in the interim. Care had to be taken
that the fluid’s pH was high enough so as not to
cause corrosion problems.
A three-stage program to eliminate flaring
and simultaneously solve associated well-testing
problems was undertaken. In the first stage,
beginning in 1998, the goal was to pump separated oil into the pipeline from the outset,
instead of burning it. This required the design of
specialized, dual-packing centrifugal pumps that
were run in series to achieve the required pressure for oil injection into the pipeline. Natural
gas was still flared, and separated water discarded. Residual oil and water emulsions
remained a problem, since a single separator
was insufficient to break them.
In the second stage of the project, a neutralizer and breaker system was designed for treatment of the emulsion phase prior to entering the
main separator. Remaining gas and oil were then
flowed through the separator. A skimmer and
chemical injection system were employed to
reduce the oil content in the water underflow
stream from 3000 ppm to less than 80 ppm,
allowing safe disposal of all residual water. Oil
produced through emulsion breaking was
pumped into a surge tank and then into the production pipeline, saving additional oil that would
have otherwise been discarded.
In the third stage, currently under way, the
goal is for complete elimination of flaring by
using advanced multiphase pumping technology
with multiphase metering. When the wellhead
pressure is insufficient to route gas back through
the line after the multiphase meter, a variable-drive multiphase pump—that can handle a variety of flow rates and pressures—would be
introduced so that both oil and gas can be
injected into the production pipeline (above).
In the first year of implementation of the initial
stages of the project, the operator was able to sell
an additional 375,000 barrels [59,600 m3] of oil
that otherwise would have been burned, generating more than $11 million in increased revenues.21
Zero-emission testing—The next frontier is a
generalized solution for zero-emission testing for
exploration and appraisal wells where an export
pipeline is not available. Here, the challenge is to
take a quantum step beyond improved burner
technology. The goal is elimination of all emissions by keeping produced hydrocarbons contained either below surface or the mudline, or in
special offshore storage vessels. Through the use
of advanced downhole measurements and tools,
high-quality test data and samples could still
be captured.
There are several approaches to downhole
containment. In particular, three options are
currently undergoing intensive investigation. The
first is closed-chamber testing. Here, test fluids
flow from the formation into an enclosed portion
of a tool or pipestring. A short flow period is
achieved as the chamber fills and its original contents become compressed. Flow stops as the
chamber reaches equilibrium, allowing analysis
of the subsequent buildup. This method, applicable to both oil and gas wells, is simple, and the
short test duration limits rig time compared with
a conventional test. But, there are drawbacks.
With only a small flowed volume due to capacity
limitations of the test string or wellbore, only a
limited radius of investigation near the wellbore
can be evaluated. Lack of thorough cleanup after
perforating can potentially affect the quality of
collected samples. If the formation is not well consolidated, hole damage or collapse may occur because of high inflow rates (below).
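The chamber-fill behavior described above can be sketched with an idealized isothermal gas-cushion model; all volumes and pressures are hypothetical:

```python
# Idealized closed-chamber fill: a gas cushion of volume v0 at pressure
# p0 is compressed isothermally (Boyle's law) until it reaches formation
# pressure pf, at which point inflow stops. The influx volume equals the
# change in cushion volume. All values are hypothetical.
def influx_at_equilibrium(v0_m3, p0, pf):
    """Formation-fluid volume admitted before flow stops."""
    if pf <= p0:
        return 0.0                     # no drawdown, no inflow
    return v0_m3 * (1.0 - p0 / pf)     # v0 minus compressed gas volume

# A 2-m3 chamber charged to 10 bar against 40-bar formation pressure
# admits only 1.5 m3 of fluid before equilibrium.
print(influx_at_equilibrium(2.0, 10.0, 40.0))  # 1.5
```

This idealization makes the main drawback concrete: the flowed volume is bounded by the chamber size, which is why the radius of investigation stays small.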
> Closed-chamber testing. Test fluids from the formation enter an enclosed space until the contents compress and reach equilibrium. This brief flow period is then followed by a second stage of pressure buildup.

A second method is production from one zone and reinjection into the same zone, known as harmonic testing. Here, fluid is alternately withdrawn into a test string and then pumped back into the reservoir at a given periodic frequency. The reservoir signature is determined point-by-point as a function of frequency by varying the frequency during testing. The advantage is that a separate zone for disposal of the produced fluid is not needed, but defining the pressure-response curve would require more time than for a conventional test and may not be cost-effective. Advanced signal processing may be able to reduce the time required, but still may not make the process economically viable.

> Continuous production and reinjection. A specially designed tool allows produced fluid from one zone to be continuously injected into another using a downhole pump to provide a prolonged testing period. Samples can be retrieved, and flow and pressure data are measured downhole for subsequent analysis.

> Offshore storage-module concept. A vessel for storing and offloading fluids collected in closed modules during testing operations might offer an approach to eliminate the need for flaring while generating increased revenues.

21. The team that spearheaded this project won the Performed by Schlumberger Chairman’s Award 2000, the top award in a company-wide program to strengthen the Schlumberger culture of excellence. Client team members included Abdullah Faddaq, Suishi Kikuchi, Mahmoud Hassan, Eyad Al-Assi, Jean Cabillic, Graham Beadie, Ameer El-Messiri and Simon Cossy. Schlumberger team members included Jean-Francois Pithon, Abdul Hameed Mohsen, Mansour Shaheen, Thomas F. Wilson, Nashat Mohammed, Aouni El Sadek, Karim Mohi El Din Malash, Akram Arawi, Jamal Al Najjar, Basem Al Ashab, Mohammed Eyad Allouch, Jacob Kurien, Alp Tengirsek, Mohamed Gamad and Thomas Koshy.
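The point-by-point frequency response described above amounts to standard lock-in signal processing: impose a sinusoidal rate, then extract the amplitude ratio and phase lag of the pressure response at that frequency. A self-contained sketch on synthetic data (all rates, periods and the 0.8/0.6 response values are hypothetical):

```python
# Hypothetical lock-in sketch of harmonic testing: impose a sinusoidal
# rate q(t) at angular frequency w, measure pressure p(t), and extract
# the reservoir's amplitude ratio and phase lag at that frequency.
import cmath
import math

def lock_in(t, signal, w):
    """Complex amplitude of `signal` at angular frequency w."""
    n = len(t)
    acc = sum(s * cmath.exp(-1j * w * ti) for ti, s in zip(t, signal))
    return 2 * acc / n

w = 2 * math.pi / 3600.0                 # one cycle per hour (hypothetical)
t = [i * 10.0 for i in range(3600)]      # 10-s samples over ten full cycles
q = [math.sin(w * ti) for ti in t]                # imposed rate
p = [0.8 * math.sin(w * ti - 0.6) for ti in t]    # synthetic response

z = lock_in(t, p, w) / lock_in(t, q, w)  # complex transfer at frequency w
print(round(abs(z), 3), round(-cmath.phase(z), 3))  # 0.8 0.6
```

Sweeping `w` and repeating the extraction traces out the pressure-response curve point by point, which is why the method is slow: each frequency needs its own run of several full cycles.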
The third method is to continually produce
from one zone and inject the produced fluid into
another zone. Reservoir fluids are never brought
to surface, but are reinjected using a downhole
pump. Drawdown is achieved by pumping from
the production zone into the disposal zone.
Buildup is provided by simultaneously shutting in
the production zone and stopping the downhole
pump. If injectivity can be maintained, this continuous process emulates a full-scale well test. A
larger radius of investigation is possible due to
larger flow volumes, with the potential to investigate compartmentalization or even reservoir
limits. A longer flow period improves cleanup
prior to sampling. Flow and pressure are measured downhole and analyzed with conventional
methods for radial flow. It is possible to capture
small pressure-volume-temperature (PVT)-quality
samples and larger dead-oil samples downhole.
Drawbacks include a somewhat complex tool
string, an inability to handle significant quantities
of gas and no time-saving over a conventional
well test. The key factor is having a suitable
injection zone that provides sufficient isolation
(above).
Two joint industry programs have been established to investigate each of the three methods in
detail, with participation by BP, Chevron, Norsk
Hydro and Schlumberger. The first, conducted by
Schlumberger, is assessing downhole tool design
and capability requirements. The second, a three-year program at Imperial College in London,
England, is defining the interpretation packages
and procedures that would be required to capture
the maximum amount of reliable information
from the data.
Once the selection of the preferred method is
finalized, the next step will be a proof-of-concept
field experiment that mirrors the requirements of
a variety of well-test conditions. Currently, the
continuous production-reinjection option looks
most promising.
Modules mounted on the deck or in the hold
of a suitable floating vessel are being investigated for storing fluids collected offshore during
testing. Fluid-processing facilities also would be
provided onboard. Large discoveries, marginal
fields and deepwater prospects are targeted
applications. Equipment would be designed to
handle a broad range of testing conditions and
durations. The vessel would receive and store
gas and liquids, and offload the contents at the
end of the well test or at intervals during the test.
This concept could completely eliminate the need
for flaring, and generate revenues from sale of
produced fluids that would otherwise be lost. The
procedures for handling and storing liquids have
already been successfully demonstrated in
extended well tests in fields such as BP’s
Machar—proving both the feasibility and financial viability of the approach. Gas handling and
storage, however, pose additional challenges
that would probably require compression
and transfer facilities to create compressed
natural gas. This is a costly proposition and
may not be economically viable at current
gas prices (above).
With growing emphasis on eliminating all
types of gas emissions, particularly carbon dioxide, these areas of investigation are expected to
continue to receive close attention and significant industry funding.
Future Challenges
In the near future, governments around the world
will receive the IPCC Synthesis Report which will
attempt to answer, as clearly and simply as possible, 10 policy-relevant scientific questions.
Perhaps the pivotal question, as stated by the
IPCC, is: “How does the extent and timing of the
introduction of a range of emissions-reduction
actions determine and affect the rate, magnitude
and impacts of climate change, and affect global
and regional economies, taking into account historical and current emissions?”
In another five years, the IPCC is expected to
publish its Fourth Assessment Report. By then,
climatologists may have resolved some of the
uncertainties that limit today’s climate models.
They should, for example, be able to provide a
better description of the many feedback systems
associated with climatic phenomena, particularly
clouds. Greater understanding could lead to
reduced uncertainty about a causal connection
between increased greenhouse-gas concentrations and global warming. This would be a major
step forward.
In the interim, oil and gas companies,
working closely with oilfield service companies,
will continue to be proactive in developing
technologies and operational procedures for
reducing emissions.
—MB/DEO