PART II
THE NRM CONTEXT FOR THE AEH STUDY
_______________________________________________
Preface
Any study of learning from the past requires a context. In this thesis the context chosen is the
natural resource management (NRM) problem of irrigation salinity. The technical nature of
salinity makes it a good context for study. Because its effects are delayed, long timescales are
needed for feedback to occur and to become visible, and the rate of learning by policy makers
will be much slower than the rate at which the problem develops. Experiences from other places
and other times, accessible through history, are therefore essential.
Since ancient times, irrigation has been an essential component in the development of
agriculture. Irrigation created new relationships between nature and human beings, which were
initially beneficial to society. But ‘the irrigated base upon which everything else rested required
constant vigilance; if it was neglected, a cascade of devastating effects could follow’ (Postel
1999:5). Neglect brought environmental degradation in the form of rising watertables,
waterlogged soils, and salinised soils and groundwater.
Despite long experience with irrigation, and awareness of the dangers of salinisation, societies
continue to use irrigation to support intensive agriculture. In the nineteenth century irrigation
was promoted enthusiastically in the semi-arid areas of British India, the western United States,
and the colonies of south-eastern Australia. In Victoria the Australian politician Alfred Deakin
regarded irrigation as 'this antique aid to husbandry' that 'has had a new spirit breathed into it by
modern engineering' (Deakin 1893:141).
Part II (Chapters 6-7) comprises background material that is necessary for the exploratory AEH
study. Chapter 6 provides a description of the bio-physical context of salinisation. Chapter 7
covers the principal policy initiatives that governments in Australia have taken since the late
1960s, when irrigation salinity in the southern Murray-Darling Basin was recognised as a
serious environmental and social problem. These two chapters are based on a wide reading of
contemporary sources, and contain a summary of the literature.
[Plate: Irrigation in ancient Egypt; drawing of date palms along a water storage pond (Hyams 1971).]

[Plate: Construction of an irrigation canal, Murrumbidgee Irrigation Area, New South Wales, 1913 (Water Conservation and Irrigation Commission).]
CHAPTER 6
SALINITY: THE BIO-PHYSICAL PROBLEM
_______________________________________________
6.1 Introduction
Salt occurs naturally on Earth in various forms and as different chemical compounds. It appears
in landscape features such as salt pans, lakes, scalds, springs and plugs. The high salt content of
some regions is reflected in their geographic names, such as the Dead Sea and Salt Lake City,
and numerous salt rivers and creeks. There are botanical examples: saltbush (Atriplex spp.)
thrives in dry alkaline soil, and salt grasses are native to markedly saline habitats. The salt tree,
or East Indian Tamarisk (Tamarix orientalis), is so named because its twigs are frequently salt-encrusted. Biologists differentiate between salt-water and fresh-water species. Salt marshes
support halophytic plants and salt-adapted insects and fish. Natural salt outcrops provide a
source of mineral salts for herbivorous animals, particularly where the soils of grazing lands
have been depleted. For example, grazing animals, including elephants, are known to
supplement their salt intake at the salt caves of Mount Elgon in Kenya.
Archaeology and history provide evidence of the important role of salt in past societies. In
modern Austria the richly furnished Celtic graves at Hallstatt belong to the Iron Age culture that
flourished from the eighth to the fifth century BCE. The name Hallstatt indicates that Celtic
settlements grew up around and exploited the salt mines near modern Salzburg. The Celts traded
salt with other parts of Europe along the Salzweg (salt way). Trade in salt also prompted the
appearance of settlements in remote areas, such as Chinguetti in Mauritania, which developed as a stop-over to service the caravans crossing the Western Sahara.
Throughout history societies have taken advantage of the beneficial uses of salt for curing and
preserving. Yet salt can have injurious effects on agriculture as a result of salinisation.
Salinisation is a hydrological process that mobilises natural salts deep in the groundwater and
soils, and brings them to the surface by capillary attraction, where they accumulate in the root
zone of crops. Increased salt concentrations in the upper soil layers can adversely affect normal
crop growth.
Primary salinisation is the result of bio-physical conditions and cycles that operate over
geological timescales to produce landscapes with naturally occurring salts. These landscapes
have primary salinised soils. Secondary salinisation is the result of human agency and activities
conducted over very much shorter timescales. It can produce both dryland salinity and irrigation
salinity. Human activities that have contributed to secondary salinisation are the clearing of
native deep-rooted vegetation, the planting of shallow-rooted annual crops, irrigating unsuitable
soils, and poor irrigation practices, especially over-watering. Figure 6.1 shows in summary the
effects of salinity in its two forms.
[Figure 6.1: Some of the injurious effects of salinisation. The figure contrasts primary salinisation (natural conditions and cycles operating over geological time) with secondary salinisation (human activities operating over decades, producing dryland salinity and irrigation salinity), and groups the effects as universal problems (rising watertables, waterlogged soils, salt in soil and groundwater, soil erosion), rural problems (reduced agricultural productivity, increased salt in river systems and wetlands, degradation of aquatic ecosystems, loss of riparian vegetation, stream bank erosion, nutrient enrichment of inland waters) and urban problems (deterioration of building foundations, roads and municipal infrastructure).]
In the longer term, salinisation in either form can result in soil degradation, increased salinity in
groundwater, river systems and wetlands, reduced agricultural productivity and degradation of
aquatic ecosystems. The problems associated with salinised soils and water have major
significance in attempts to establish sustainable agriculture by communities around the world.
Although salinisation is a problem of great antiquity, its extent has grown enormously in
recent decades. Today the problems associated with both dryland and irrigation salinity have
captured the attention and concern of rural and urban societies alike. These issues now require
policy and management initiatives instituted at the level of national governments (Australia
2001).
The following discussion of salinity provides the necessary background to understand the
historical study presented in Part III. It also provides support for the salinity examples used in
the theoretical discussion in §6.4. My account is not intended to be a comprehensive review of
the bio-physical aspects of salinity. The scientific literature, which began to expand rapidly in
the 1970s in Australia, now contains countless studies, reports, reviews and scholarly texts. I
have drawn on some of the most recent and comprehensive of these: Ghassemi et al (1995),
Smith (1998), the report of the Australian State of the Environment Committee (2001a), the
report on dryland salinity by the National Land and Water Resources Audit (2001), and
publications by the Murray-Darling Basin Commission listed in the bibliography. I have also
drawn generally on the classic irrigation text of Hagan, Haise and Edminster (1967).
6.2 The Bio-physical System and Salinity
The basic process of salinisation operates under particular conditions of climate and landform. It
is a result of natural interactions between water, soil, salt and vegetation. Saline soils are found
particularly in the arid and semi-arid zones of the world. Such landscapes are characterised by
large expanses of flat terrain that provide suitable conditions for salts to accumulate. Their very
low gradients prevent accumulating salts from being leached out of the soil.
6.2.1 Climate
Most arid and semi-arid lands lie in the broad regions between latitudes 10º and 40º N and 10º
and 40º S. This land distribution is primarily the result of global circulation of air between the
equator and the poles. As warm air at the equator expands and rises, it creates surface air flows
towards the equator. The rising air cools and its water vapour condenses and falls as tropical
rain. High-level air currents then move towards the poles to replace the air moving towards the
equator near the surface. When the high-level currents descend, they warm and absorb water
vapour, resulting in essentially continuous drought in these subtropical latitudes (James et al
1982:10). Such lands constitute a large proportion of the Australian continent. They also
characterise parts of the south-western United States and South America, North Africa and western South Africa, the
Middle East and Pakistan, parts of India, parts of the former Soviet Union in Central Asia,
northern and central China, and Mongolia (Ghassemi et al 1995).
While not all soils in these dry landscapes are salt-affected, most salt-affected soils are
associated with climates that produce permanently arid or semi-arid conditions. During summer
the evaporation and transpiration of plants greatly exceed precipitation, and essentially no water
percolates through the soil under natural conditions. The lack of 'flushing' allows natural salts to
remain undisturbed in the soil.
In some semi-arid regions, particularly in parts of the Mediterranean, rain falls during the winter
months when evaporation and transpiration are low. Rain is scarce during the warmer season
when evapo-transpiration and temperatures are correspondingly high. This so-called
Mediterranean climate also occurs in southern Australia, California, the South African Cape and
central Chile.
6.2.2 Water
To understand salinisation it is necessary to understand the movement of water through the
landscape by means of various bio-physical processes, forming what scientists call the
hydrologic cycle. Precipitation, infiltration and groundwater recharge are input processes in the
cycle. Evapo-transpiration, runoff, interflow and return flow are the output processes. Once
precipitation (as rain or snow) has occurred water passes into the unsaturated soil zone by
infiltration. As it infiltrates through soil layers it may enter the plant root system and return to
the atmosphere by transpiration. If it infiltrates below the root zone it ultimately recharges the
groundwater system. Where the horizontal permeability of the soil exceeds the vertical
permeability of an underlying layer, water may not continue downwards but will flow sideways
instead. This so-called interflow is an important waterlogging mechanism in duplex-structured
soils. The shape of the landscape also affects the downward flow of water: short steep slopes
allow water to discharge quickly, whereas on long gentle slopes the flow is slower and
waterlogging can occur. Where surface discharge is slow, dissolved salts in the soil concentrate
more readily under evaporation (Williams 1991:95).
The watertable is the level below which soils and sediments are saturated. Its depth varies across
a landscape and under seasonal conditions. In the saturated zone groundwater can accumulate
in beds of coarse sediments known as aquifers. Deep aquifers may extend beneath surface
catchments over large areas to form regional aquifers. Perched aquifers, located closer to the
surface, are small discontinuous beds that can affect individual farms. Both types of aquifers can
act as parts of the mechanisms that produce waterlogged and salinised soils on different spatial
scales.
Under natural conditions water that does not penetrate into the subsoil flows into natural surface
drainage systems as runoff. On lands cleared for agriculture, additional surface and sub-surface
drainage is frequently necessary to remove irrigation water and rainfall in excess of needs.
This water may be directed into rivers or groundwater. Surface runoff in the form of irrigation
tailwater contains increased concentrations of soluble salts. Where it has drained into rivers and
streams it can raise river salinity to levels which are unacceptable to downstream users, and
harmful to aquatic ecosystems.
Water may finally complete the hydrologic cycle as return flow through incised stream beds,
where the watertable has risen to meet the base flow of a river.
6.2.3 Soils and salts
Saline soils occur naturally in the landscape as a result of different geological processes.
Weathering, the decomposition of parent rock under the action of air and water, accounts for
most saline soils. Over geological time salts accumulate in the rocks, soils and groundwater.
the rocks, soils and groundwater. As a result of geological movement of the Earth, marine
deposits have been uplifted and served as parent material for large areas of soils in arid and
semi-arid regions. Much of the desert interior of the Australian continent has been formed under
these conditions.
Primary saline soils also occur in places where sea salt has entered the hydrologic system. Salt
carried inland in rainfall has been accumulating in the landscape since earlier geological
ages. Like other natural salts, these airborne salts find their way into, and are stored in, the soils
or groundwater. Because this salt eventually completes its cycle by returning to the ocean
through the groundwater, it is known as cyclic salt. It is the principal source of salinity in
Western Australia. Prevailing winds in Australia deposit salt at a rate of 30-100 kg/ha/year
(Hamblin 2001:77).
Soils are classified as saline, sodic and saline-sodic. pH is the measure of soil alkalinity or
acidity: alkaline soils have a pH above about 8, neutral soils between about 6.5 and 8, and
acidic soils below about 6.5. Saline soils are often recognised by the presence of a white
efflorescence on the surface, which accounts for the description ‘white alkali’ found in early
literature. They have a relatively low level of adsorbed sodium. Chloride, bicarbonate and
sulphate are the principal negative ions (anions), while sodium, calcium and magnesium are the
principal positive ions (cations). The high salt content keeps the soil in a flocculated state,
and water permeability is high. Sodic soils have low salinity. They have been called ‘black
alkali’ because dissolved soil organic matter is deposited on the surface by evaporation
to give a characteristic darkened surface layer. They contain a greater amount of exchangeable
sodium cations than saline soils do, as well as carbonate and bicarbonate anions. They
form a dispersed clay system which prevents water and air from moving easily through the soil.
Saline-sodic soils have chemical properties of both saline and sodic soils.
In Australia the salts present in the soil are predominantly sodium chloride (NaCl), but other
sodium salts, and salts of calcium and magnesium, may be present. Sulphates are common in
weathered sedimentary rocks of marine origin. Carbonates and bicarbonates are also found in
some soils. Salt content is measured in terms of the electrical conductivity (EC) of a water
extract of the soil. A thousand EC units are equivalent to about 600 milligrams of salt dissolved
in one litre of water.1 Salt concentrations can vary from as low as 50 EC (the value of the
domestic water supply of Canberra), to 5000 EC, the upper limit for salt-tolerant irrigated crops,
such as barley and cotton. Other EC values are given in Table 6.1.
1. This method is now more commonly applied than the previous measure of total dissolved salts (TDS) in parts per million (ppm).
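The conversion quoted above is straightforward to apply. The following Python sketch implements the rule of thumb that 1000 EC units correspond to about 600 mg of dissolved salt per litre; the function name and the example calls are illustrative choices of mine, not from the thesis.

    def ec_to_mg_per_litre(ec_units):
        """Approximate total dissolved salts (mg/L) from electrical
        conductivity in EC units (microsiemens per cm), using the rule
        of thumb that 1000 EC units correspond to about 600 mg/L."""
        return 0.6 * ec_units

    # Values quoted in the text: Canberra's domestic supply and the
    # upper limit for salt-tolerant irrigated crops.
    print(ec_to_mg_per_litre(50))    # -> 30.0 mg/L
    print(ec_to_mg_per_litre(5000))  # -> 3000.0 mg/L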
Table 6.1: Values for electrical conductivity (EC) of water with different salt concentrations

    EC units (µS/cm)   Description
    45,000             Sea water
    5000               Value that divides fresh water from saline water
    5000               Upper limit for salt-tolerant irrigated crops (e.g., barley, cotton, wheat, canola, sunflower) on well-drained soils
    1500               Direct adverse biological effects on many plant species and freshwater ecosystems
    1500               Upper limit of water for stock
    833                Australian Drinking Water Guideline
    800                WHO upper limit for desirable drinking water
    800                Upper limit for irrigated salt-sensitive crops, e.g., citrus
    250                In-stream salinity of Murrumbidgee River
    50                 Canberra domestic water supply
Secondary saline soils appear after societies have introduced changes in land-use practices that
fundamentally disturb the natural hydrologic system. In the case of dryland salinity, saline soils
are associated with a rise in the watertable following the clearing of native trees and deep-rooted
woody vegetation. In the case of irrigation salinity, saline soils follow a rise in the watertable as
a result of intensive irrigation, where schemes have been introduced without provision for
adequate drainage, where poor-quality irrigation water is used, and where inexperienced
irrigators over-water. Frequently all three problems exist in the same place.
The presence of dissolved salts in the soil manifests itself in different ways. The osmotic
pressure of the soil-water solution can cause the soil to hold moisture so tightly that plants are
unable to obtain sufficient moisture by means of osmosis. Under such conditions normal growth
is not possible. As the concentration of salts increases, plants can show all the signs of stress
common under drought conditions, and may die even though the soil is moist. Salt containing
specific minerals can have a toxic effect on particular plant species: chloride ions, for example,
have an injurious effect on stone fruit and citrus trees, and certain fruit crops cannot tolerate
high concentrations of sodium. Furthermore, some soils are affected by sodium itself, which
breaks down the structure of the soil, thereby restricting the passage of water and inhibiting
growth in the plant’s root zone.
On the soil surface bare patches of various sizes may appear interspersed among healthy ground
cover. As soil salinity progresses, larger areas appear with no vegetation or with dead trees and
bushes, or are replaced with salt-tolerant plants. In the most obvious cases salt crystals or crusts
form, leaving a white saline efflorescence over the ground when the surface has dried out. Once
bare patches appear soil erosion follows.
6.2.4 Vegetation
Deep-rooted woody perennial vegetation and native grasses can keep a saline landscape healthy,
by acting as pumps to maintain a low watertable. Such plants use large amounts of water, and
ensure that potentially mobile salts remain deep in the soil profile.
Secondary salinisation occurs where deep-rooted species have been cleared to make way for
introduced shallow-rooted annual crops and pastures. Most cereal crops are selected because
they begin their growth cycle from seed at the onset of winter rains. They complete this cycle
(and seed development) before the soil moisture is exhausted under summer temperatures. In the
case of both cereals and pastures evapo-transpiration is reduced, and excess water remains for
sub-surface flow within the soil and for runoff to lower ground.
In Australia many native plant species have adapted to the saline environment of the arid and
semi-arid regions. Saltbush (Atriplex spp.), mulga (Acacia aneura) and mallee (e.g., Eucalyptus
oleosa) are found widely across south-eastern Australia, while brigalow (Acacia harpophylla) is
common in southern Queensland. Grasses such as sea barley (Hordeum hystrix) and curly rye
grass (Parapholis incurva) commonly colonise salt-affected land.
6.3 Irrigation Salinity
Irrigation salinity occurs as a result of human-induced changes to land use. The water from
irrigation causes the groundwater to rise, mobilises salts held deep within the soils and
groundwater, and can bring these salts to the surface.
In Figure 6.2 the top panel represents a landscape where the natural groundwater level lies under
deep-rooted trees or woody vegetation, and evapo-transpiration rates are high. The middle
panel shows land cleared and planted with shallow-rooted crops or pasture. The purpose of
irrigation is to provide optimal soil moisture in the root zone. In healthy pastoral or agricultural
lands, where the natural hydrologic balance is maintained, soluble salts are contained deep
within the soil and do not adversely affect plant growth. When intensive irrigation is introduced
to support agriculture in arid and semi-arid zones the soils can become saturated, water is added
to the natural groundwater, and the salts are mobilised through the soil profile. Poor irrigation
practices, such as over-watering, inadequate drainage and leakage from unlined channels,
contribute to rising watertables and waterlogged soil. The watertable falls as seasonal demand
for water declines, but over the years it remains high even after the irrigation season has ended.
[Figure 6.2, top panel: groundwater under deep-rooted trees, with high evapo-transpiration; the watertable lies 2-5 m deep, with a capillary fringe above the groundwater and its stored salts. Middle panel: groundwater after tree-clearing and cultivation of irrigated crops or pasture; seepage from the irrigation channel raises the watertable and the stored salts begin to move. Lower panel: groundwater after many irrigation seasons; permanently high watertables, a waterlogged root zone, continued movement of stored salts, and a decline in crop health and yield.]
Figure 6.2: Stages in the process towards secondary salinisation under irrigation
The lower panel shows the same land after repeated cycles of irrigating and drying out. Under
hot summer conditions with low humidity, when irrigation is most in demand, soluble salts in
the soil and groundwater are brought to the surface by capillary action. The salts are
concentrated by evapo-transpiration in the plant’s root zone. This change in soil quality
ultimately makes the land unfit for normal plant growth. In some parts of the world the
irrigation water itself can be a source of salts, thereby exacerbating the problem. Applying
excess water can leach salts out of the root zone, but this process is only effective if good
drainage ensures that the leached salts are removed from the area under cultivation. In many
cases the removal of leached salts simply transfers the problem downstream in the catchment,
and its effects there may take a very long time to appear.
In the case of both rain-fed and irrigated agriculture, an increase in salinity is symptomatic of
land uses that have replaced natural systems, resulting in a massive hydrological imbalance
(Pels and Stannard 1977:178).
Figure 6.2 is also a model for dryland salinity in its secondary form. The removal of deep-rooted
trees or woody species can lead to dryland salinity without the addition of irrigation, and this
process triggers much of the surface salt seen in south-eastern Australia.
A significant effect of increased saline runoff from irrigation areas is increased salinity of
groundwater, streams and rivers. In Australia it has been found that excess irrigation water finds
its way into the groundwater up to ten times faster than under natural conditions (Australian
SEC 2001a:13). Particularly in regions with shallow watertables, saline groundwater readily
seeps into natural and artificial drainage systems or directly into the principal waterways of the
catchment. In this way terrestrial and aquatic ecosystems are degraded. Salinity levels of rivers
and streams, which vary seasonally, are important for monitoring salt movement in the
landscape. During droughts when water levels are very low, salt concentrations are high.
Conversely, floods dilute the salts to produce low salinity levels.
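Because concentration rises when flows fall, salt movement in a river is usually tracked as a load rather than as a concentration alone. A minimal sketch of the arithmetic, using the 1000 EC to 600 mg/L conversion from §6.2.3 and the fact that 1 mg/L equals 1 kg/ML; the function name and the example flow are illustrative assumptions of mine:

    def salt_load_tonnes_per_day(flow_ml_per_day, ec_units):
        """Approximate salt load passing a gauging point, in tonnes/day.
        Flow is in megalitres per day; concentration (mg/L) is taken as
        0.6 * EC, and 1 mg/L equals 1 kg/ML."""
        concentration_kg_per_ml = 0.6 * ec_units
        return flow_ml_per_day * concentration_kg_per_ml / 1000.0

    # A flow of 10,000 ML/day at 250 EC (the Murrumbidgee figure in
    # Table 6.1) carries roughly 1500 tonnes of salt per day.
    print(salt_load_tonnes_per_day(10_000, 250))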
6.4 Over-watering

6.4.1 Technical perspective
Over-watering has long been recognised as a serious problem in irrigated agriculture, and is the
driving force in irrigation salinisation. To illustrate some effects of over-watering and the
principles already outlined in the process of irrigation salinisation, a dynamical model of an
early-1900s irrigation development has been built using Stella® software. The Conceptual
Model of Irrigation Development (CMID) is described fully in Appendix A. Technical terms
are described in Chapter 3.
CMID is designed to illustrate the possible behaviour of a small irrigation scheme in the early
phase of its development. According to the scenario modelled, the development is located in
south-eastern Australia at the beginning of the twentieth century. It is assumed that the scheme
comprises 80 small (20 ha) blocks worked by settler families, with new blocks steadily
being brought into the system as the network of subsidiary channels expands. The model
represents the average behaviour of the whole district and its activities, not that of individual
farms. The results are either average values, or aggregated values, as appropriate to each
variable (see Appendix A for details).
In CMID we are not interested in the effects of seasonal variation; it is therefore assumed that
the rainfall of 300 mm per year is distributed uniformly throughout the year. Little knowledge
exists of the geological characteristics of the area, and in the early twentieth century the farmers
pay no attention to issues such as watertable depth or salt concentration in the soils and
groundwater. There is also poor understanding of how much water to apply, and farmers
generally apply too much rather than too little, particularly in the early years of the scheme
when water delivery is irregular (see Chapters 9-10).
It is assumed that irrigation has not been applied in this part of the country before, and it is
necessary for farmers to experiment with a variety of crops and agricultural products. Most of
the settlers have no experience in farming of any kind, a few have been involved with sheep
grazing, and others have come from towns. All must learn about farming using irrigation.
Therefore, different farming activities take place across the irrigation area, and crops and
products are in different stages of growth. Initially, blocks are developed for irrigated pasture,
because dairying can provide an income quickly. Then horticulture, which needs a longer
development time, is established. Each year more fruit trees are planted, and the
horticultural farms approach full production by about year 6.
Most crops are susceptible to damage from over-watering, and die if the soil has become
waterlogged for more than 5-10 days. Modern soil science has provided methods for measuring
soil moisture accurately, and for developing an understanding of the soil moisture tolerances of
different crops in a variety of soil types (Veihmeyer and Hendrickson 1931).2 Two technical
terms are important in any discussion of the interaction between soil, soil moisture and crops:
field capacity and permanent wilting point (Figure 6.3). The term field capacity refers to the
capacity of a soil to retain moisture. It is measured as the percentage of water (weight for weight
(w/w)) remaining in a soil 2-3 days after it has been saturated and after free drainage has
practically ceased. Water within soil is under negative pressure (suction), which decreases as
soil texture changes from medium-textured soils to sand. As a soil dries out a plant becomes
unable to meet its water demands, and will wilt permanently. The term permanent wilting point
refers to the amount of water in the soil below which plants are unable to obtain further
supplies, and will therefore wilt and die (Russell 1973). In CMID it is assumed that the
dominant soil type in the area is loam, that the field capacity of the soil in the whole area is
30% w/w, that the upper limit at which soil becomes saturated is 40% w/w, and that the
permanent wilting point of the crops is 13% w/w (Charman and Murphy 1991, Figure 10.9 and
Table 10.6).

2. Willard Gardner (1883-1964) was a pioneer of soil physics at Utah State University. In the 1920s, with Lorenzo Richards, he made significant progress in understanding the matric or capillary potential of soil water (Or 2001).
Figure 6.3: Dependence of crop yield on soil moisture content M as assumed in CMID. Soil
moisture is measured as the weight of water contained in a unit weight of soil and expressed as
a percentage (weight for weight (w/w)). The permanent wilting point of the crops has been
assumed to lie at 13% w/w, and the average field capacity of the soils is taken to be 30% w/w.
The maximum yield is achieved at 37% w/w, but it runs perilously close to the saturation
(waterlogging) point which is taken to be 40% w/w. These values are typical of loam soils
(Charman and Murphy 1991).
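The curve in Figure 6.3 is a simple piecewise-linear relationship and can be written down directly. The Python sketch below reconstructs it from the break-points given in the caption (13, 30, 37 and 40% w/w); the straight-line segments between those points are my assumption about the curve's exact shape.

    WILTING_POINT = 13.0    # % w/w: below this, crops wilt and die
    FIELD_CAPACITY = 30.0   # % w/w: moisture retained after free drainage
    PEAK_MOISTURE = 37.0    # % w/w: maximum yield
    SATURATION = 40.0       # % w/w: waterlogged soil, yield collapses

    def yield_factor(m):
        """Relative crop yield (0 to 1) as a function of soil moisture
        m (% w/w), after Figure 6.3."""
        if m <= WILTING_POINT or m >= SATURATION:
            return 0.0
        if m <= PEAK_MOISTURE:
            return (m - WILTING_POINT) / (PEAK_MOISTURE - WILTING_POINT)
        return (SATURATION - m) / (SATURATION - PEAK_MOISTURE)

For example, yield_factor(30.0) returns about 0.71: moisture above field capacity continues to raise yield toward the peak at 37% w/w, but a small overshoot collapses it at saturation, which is the 'perilously close' margin noted in the caption.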
In CMID the control parameter Average Irrigation Rate is the number of megalitres of water
applied to each 20-ha farm per year.
The model has five state variables: average soil moisture, aggregate product, accumulated
aggregate wealth, average watertable height, and average root zone salt (see Table 6.2). The
flows that affect these variables are described in Appendix A.
Table 6.2: State Variables of the Conceptual Model of Irrigation Development (CMID)

    State Variable                 Symbol   Units
    Average soil moisture          M        % water w/w
    Aggregate product              P        bushels
    Accumulated aggregate wealth   W        £ (pounds)
    Average watertable height      H        metres
    Average root zone salt         S        deci-Siemens/metre
The variable M represents the moisture content of the soils in the development, averaged over
all farms. Thus, while the soil moisture of a single farm will change rapidly when irrigation
begins, the M value will vary relatively slowly as more and more farms begin irrigation.
Similarly, if the agricultural endeavour fails (and farms turn off the water) M will decrease
slowly, at a rate determined by the rate at which farms fail rather than the rate at which a given
farm dries out.
The aggregate product P represents the total amount of produce in the district at any given time
t. It includes all produce, whether as standing crops or harvested and in storage, summed over
all farms in the district.
The accumulated aggregate wealth W represents the profit, above production and living costs,
summed over all farms in the development, and accumulated over time. This variable is
therefore taken to represent the total 'wealth' in the district in whatever form it resides (cash,
capital equipment, etc).
The variable H represents the height of the groundwater above an impervious layer which is
assumed to lie 15 metres below ground level. Therefore, H can vary between 0 (no water above
the impervious layer) and 15 (watertable at the surface). The value computed represents an
average over the area of development. Given that the land surface is not flat, the impervious
layer will lie at various depths below the surface across the development. A specific value of H
will represent a situation where some farms will still be operating fully, while others are out of
action because of waterlogging and salinisation.
Finally, the variable S represents the average level of salinisation in the soils of the area. Again,
a given S value will correspond to a situation where some farms are still operating, while others
have turned off the water as a result of salinisation.
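The Stella® equations themselves are given in Appendix A. As an indication of how a stock-and-flow model of this kind can be stepped forward in time, the Python sketch below integrates simplified versions of the five state variables with Euler's method. Everything here beyond the quantities quoted in the text (rain-fed equilibrium M of 20% w/w, field capacity 30% w/w, impervious layer at 15 m, 80 farms) is an illustrative assumption of mine: the flow equations and rate constants are chosen only to reproduce the qualitative behaviour described below, they are not the CMID equations, and the farmers' decisions to turn irrigation off are not modelled.

    FIELD_CAPACITY = 30.0  # % w/w (from text)
    AQUIFER_BASE = 15.0    # m: impervious layer below the surface (from text)
    FARMS = 80             # 20-ha blocks (from text)

    def yield_factor(m):
        # Piecewise curve of Figure 6.3 (see the earlier sketch).
        if m <= 13.0 or m >= 40.0:
            return 0.0
        return (m - 13.0) / 24.0 if m <= 37.0 else (40.0 - m) / 3.0

    def simulate(irrigation_rate, years=8.0, dt=0.02):
        """Step simplified CMID-like state variables forward in time.
        irrigation_rate is in ML/farm/yr; returns (t, M, P, W, H, S) tuples."""
        m, p, w, h, s = 20.0, 0.0, 0.0, 0.0, 0.0  # M starts at its rain-fed equilibrium
        out, t = [], 0.0
        while t < years:
            # Soil moisture: irrigation and rain in, evapo-transpiration out;
            # moisture above field capacity recharges the groundwater.
            recharge = max(0.0, m - FIELD_CAPACITY)
            dm = 0.06 * irrigation_rate + 3.0 - 0.15 * m - recharge
            if h > 14.0:                    # a near-surface watertable waterlogs
                dm += 6.0 * (h - 14.0)      # the root zone and feeds moisture back
            dh = 8.0 * recharge - h         # rise with recharge, lateral outflow
            ds = 2.5 * max(0.0, h - 12.0) - 0.1 * s  # capillary rise of salt, slow leaching
            stress = max(0.0, 1.0 - s / 5.0)         # salt stress on yield
            production = FARMS * 400.0 * yield_factor(m) * stress  # bushels/yr
            dp = production - p             # produce sold within about a year
            dw = 2.0 * production - 15000.0 # district profit above fixed costs, £/yr
            m += dm * dt
            h = min(AQUIFER_BASE, max(0.0, h + dh * dt))
            s = max(0.0, s + ds * dt)
            p = max(0.0, p + dp * dt)
            w += dw * dt
            out.append((t, m, p, w, h, s))
            t += dt
        return out

With these assumed parameters an irrigation rate of 45 ML/farm/yr settles to a watertable well below the surface and salt never reaches the root zone, while 60 ML/farm/yr drives the watertable to the surface after a few years and salt then collapses production, mirroring the contrast between Figures 6.5 and 6.6.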
In the simulation results presented below it is assumed that farmers who take up land in the new
development spend the first few years trying to discover the optimum irrigation rate largely by
trial and error. It is also assumed that they define the optimum rate to be that which maximises
their profits derived from the irrigated blocks over three years. During this period different
farmers set different irrigation rates in the range from 20 to 80 ML/farm/yr. The results of these
experiments are shown in Figure 6.4. After three years the community concludes that the
optimum irrigation rate is 60 ML/farm/yr. Confident in the information from their trials, the
farmers plan to set the irrigation rate at 60 ML/farm/yr for the future. The results (shown in
Figure 6.5) take them by surprise.
[Figure 6.4: results from 3 years of observations; w3 (£ per farm) plotted against irrigation rate (ML/farm/yr).]
Figure 6.4: Results of farmers' experiments to determine the optimum irrigation rate. The
horizontal axis shows the irrigation rate in megalitres per farm per year. The vertical axis shows
w3, the accumulated wealth per farm in 3 years. These results were generated by running CMID
14 times with the irrigation rate set at values between 20 and 80 ML/farm/yr. The computations
were intended to represent a situation where different farmers irrigated at different rates in an
attempt to determine the dependence of crop yield on irrigation rate. The simulation shows that
farmers would have generated maximum profits when the irrigation rate was in the vicinity of
60 ML/farm/yr. The steep fall-off in w3 that is observed for irrigation rates in excess of this
figure is the result of waterlogging. Such results would have convinced farmers that 60
ML/farm/yr was the optimum irrigation rate for the district. Subsequent events (see Figure 6.5)
would reveal the difficulty of learning from experience in dynamically complex situations.
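The farmers' three-year trial can be mimicked with the sketch above by sweeping the irrigation rate and recording the accumulated wealth at year 3. With my assumed parameters the sweep reproduces the qualitative shape of Figure 6.4 (a broad peak followed by a steep fall-off once waterlogging sets in), though not the thesis's pound values.

    # Mimic the Figure 6.4 experiment: wealth per farm after 3 years.
    for rate in range(20, 81, 5):
        history = simulate(rate, years=3.0)
        w3_per_farm = history[-1][3] / FARMS  # W is aggregated over all farms
        print(f"{rate:>2} ML/farm/yr -> w3 = {w3_per_farm:.0f} pounds")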
In the case shown in Figure 6.5, P reaches its maximum at the end of year 2. But farmers can
sustain this level of production for only one year. P then declines quickly, reflecting the first
effect of waterlogging and salinisation in the root zone, and the reduction in the number of
farms using irrigation. By year 8 most crops have died, and production is all but ended.
Figure 6.5: State variables of irrigation scheme. The results of farmers having set the irrigation
rate at 60 ML/farm/year. The horizontal axis shows elapsed time in years from the inception of
the scheme. The vertical axis shows the five state variables on different scales:
(1) Average soil moisture M on a scale of 0 to 50% w/w,
(2) Aggregate product P on a scale of 0 to 35,000 bushels (0 to 1233 m3),
(3) Accumulated aggregate wealth W on a scale of 0 to £300,000,
(4) Average watertable height H on a scale of 0 to 15 metres above impervious layer,
(5) Average root zone salt S on a scale of 0 to 5 dS/m.
See text for a detailed discussion. Graph produced from Stella® simulation run.
In Figure 6.5, the variable W has an initial value of zero, and increases steadily for 6 years until
growth is inhibited, first by the rising watertable and then by salt. At 6.6 years W starts to drop
slowly, and maintains a steady decline. This decline represents the erosion of the
accumulated wealth of the district as agricultural operations collapse. In CMID it is assumed
that the farmers are concerned primarily with production because the community is ignorant of
the bio-physical processes. However, for a better understanding of the farmers' predicament it is
necessary to understand the behaviour of the other three variables, of which they were ignorant.
The first of these variables is the average soil moisture M. In CMID it is assumed that the
average rainfall is 300 mm per year. This results in an equilibrium value for M of 20% w/w.
With the irrigation rate set at 60 ML/farm/yr, M rises quickly and peaks after one year. Then, as
crops grow and transpiration rates increase, it falls back gently to a level around 31% w/w, just
above the field capacity. By year 4 the value of M starts to rise as the watertable approaches the
soil surface. Soon after, it falls rapidly as farmers turn off the irrigation water permanently. By
year 7 M has reached a new level, close to its initial value of 20% w/w.
For the first six months of irrigation the average watertable height H remains at or near zero
because the soil moisture is less than the field capacity. It then climbs very steeply as the soil
moisture rises past the field capacity, and irrigation water begins to seep into the groundwater.
As H approaches its maximum at year 4, the soil becomes saturated and crops begin to die. At
this point farmers begin to turn the irrigation off. H remains high for half a year, then begins to
steadily decline as the groundwater diffuses horizontally out of the area, and as the soil moisture
returns to its initial value of 20% w/w.
The variable S has an initial value of zero. For the first 3.6 years of operations no salt appears in
the root zone of the crops. Then, as the watertable height reaches its maximum, S increases
rapidly as salt diffuses into the root zone. Thereafter, as the crops die from salinisation as well
as over-watering and more and more farmers turn off irrigation, the watertable falls and S
declines slowly as salts leach from the root zone.
The farmers' misfortune is the result of over-watering. From an agricultural point of view M has
an optimum value a little less than 30% w/w (the field capacity of the local soils). To have an
irrigation operation that is more sustainable than the one illustrated in Figure 6.5, they must
reduce the water-usage rate considerably to hold M below this value. For example, running
CMID with an irrigation rate set at 45 ML/farm/yr produces the behaviour over 18 years shown
in Figure 6.6. After about 2.5 years M, H and P have each reached a stable state. S remains at
zero because salts have not been mobilised by over-watering, and W maintains a steady growth.
Compared with the initial profit level attained when the irrigation rate was set at 60 ML/farm/yr
(Figure 6.5), the financial returns to the farmers are lower. Nevertheless, they will continue
indefinitely, compared with only 4 years of peak operation when the irrigation rate was set at 60
ML/farm/yr.
Figure 6.6: State variables of irrigation scheme. The results of farmers having set the irrigation
rate at 45 ML/farm/year. The horizontal axis shows elapsed time in years from the inception of
the scheme. The vertical axis shows the five state variables on different scales:
(1) Average soil moisture M on a scale of 0 to 50% w/w,
(2) Aggregate product P on a scale of 0 to 35,000 bushels (0 to 1233 m3),
(3) Accumulated aggregate wealth W on a scale of 0 to £300,000,
(4) Average watertable height H on a scale of 0 to 15 metres above impervious layer,
(5) Average root zone salt S on a scale of 0 to 5 dS/m.
See text for a detailed discussion. Graph produced from Stella® simulation run.
6.4.2 Historical perspective
Irrigation has been an essential process in the development of agriculture since prehistory. It has
allowed societies to develop in marginal environments (with less than 200 millimetres of rain a
year) where rain-fed agriculture could not survive. At the end of the last Ice Age, as crops were
domesticated, hunter-gatherer communities changed to settled agricultural societies. This shift
took place c.8000 BCE in the southern Fertile Crescent on the central floodplain of the Tigris
and Euphrates Rivers in Mesopotamia (Scarre 1999). In this arid region ancient societies first
discovered the usefulness of irrigation for agriculture.
Nevertheless, within a relatively short time the new system of water delivery had disturbed the
natural equilibrium (Adams 1981). When practised on large scales, irrigation could interrupt
natural drainage patterns. Additional flows, made possible by the complex of canals, hastened
the loss of arable land to salinity by raising the watertables. Although ancient societies appear to
have recognised the importance of allowing fields to lie fallow, archaeologists Jacobsen and
Adams (1958:2) believed that increased agricultural production was only attained at the cost of
greater ecological damage.
Many sites in the mid-Tigris region from the Samarran period (6000-5500 BCE) were located
beyond the 200-millimetre isohyet limit for rain-fed agriculture. In these areas ancient
cultivators developed the first simple irrigation techniques. The earliest evidence comes from
the site of Choga Mami, where archaeologists found remains of a disused canal system along
with evidence of irrigated hybrid crops: barley, wheat and flax (Scarre 1999:99). From c.5500
BCE settlements also appeared on the arid southern plains of Mesopotamia, where irrigation
was necessary. At sites such as Eridu, Uruk and Ur cultivators harnessed the spring floods of the
Euphrates to improve crop yields. By this means centres grew that were capable of supporting
larger populations. By the late fourth millennium the first cities of Mesopotamia appeared, the
most famous of which was Ur.
The archaeological record shows that the importance of good irrigation practices was recognised
in Mesopotamia. An inscribed clay tablet from c.1550 BCE was excavated at the ancient site of
Nippur, near modern Baghdad (Kramer 1951), and it records instructions to farmers on the
careful use of irrigation water:
In days of yore a farmer gave these instructions to his son: When you are about to
cultivate your field, take care to open the irrigation works so that their water does not rise
too high in the field. When you have emptied it of water, watch the field’s wet ground
that it stays even; let no wandering ox trample it. . .
Written accounts of good irrigation practices come from tablets dating from 1775-1760 BCE
discovered at Mari, a site on the Euphrates in northern Mesopotamia. These tablets refer to the
care of the canals and equitable distribution of irrigation waters (Scarre 1999:132).
In the Tigris-Euphrates environment settlements were in constant danger from flooding, and, in
time, their agricultural system came under threat from the accumulation of salts in the soil.
Poor natural drainage also made it difficult to leach salts from the fields. Archaeologists have
found evidence of increased salinity and declining yield in southern Mesopotamia between 2400
and 1700 BCE. By the end of the third millennium southern Mesopotamia had shifted from
wheat to an overwhelming reliance on barley, a more salt-tolerant crop (Jacobsen and Adams
1958:1252).3
3. Another interpretation for the change in the dominant crop comes from Powell (1987), who noted that barley is also a preferred fodder for sheep. In ancient Attica (Greece), where salinity was not a problem for agriculture, barley was the predominant crop over wheat.
Nevertheless, scholars agree that environmental degradation caused by salinisation of the land,
as a result of human activity, played a significant role in the decline of the Sumerian civilisation
in Mesopotamia (Jacobsen and Adams 1958, Hughes 1994). Jacobsen and Adams (1958:2) have
linked an increase in salinity in southern Mesopotamia with the cutting of a new canal in c.2400
BCE. They claimed that by 2100 BCE salinity had spread west and the area never recovered.
Scholars associated with the Diyala Basin Archaeological Projects in southern Iraq in the 1950s
have identified ancient cuneiform records which refer to saline soils from as early as c.2400
BCE (Jacobsen 1982).
In Egypt the annual flooding of the Nile and the deposition of rich alluvial soils have made
sustainable agriculture possible for some 5000 years. Agriculturalists in the Nile Valley used a
system of embankments to hold floodwaters, and dug channels to inundate large areas during
the river’s peak flooding period (Scarre 1999:104). Reliably high crop yields gave Egypt its
reputation as the grain-house of the Mediterranean during the Roman period. The Nile floods
had the beneficial effect of leaching salts from the soil, making salinisation a lesser problem
than in Mesopotamia. Nevertheless, salt deposits did occur in irrigated areas above the flood
line, and were serious in the northern Fayuum area, located below sea level (Hughes 1994:40).
Hillel (1994) has taken up the more recent history of water in the Middle East. He describes
how the rigours of survival in environments that have long been undergoing slow and
widespread destruction have contributed to current social and political crises in the region.
In South Asia agriculture appeared around 6000 BCE, as societies became more highly
organised. Settlements based on irrigated agriculture appeared on the alluvial plains of the Indus
Valley from the fourth millennium BCE. In addition to well irrigation, cultivators used
inundation channels to improve agricultural productivity and support larger populations. The
Indus civilisation reached its peak between 2300 and 1750 BCE, with the rise of large centres
such as Harappa, Mohenjo-daro and Chanhu-daro (McIntosh 2002). Settlement of the Ganges
Valley followed during the first millennium BCE. However, systems of intensive canal
irrigation did not appear in India until the Mogul period (1500-1700 CE). For this reason
salinity of irrigated areas in modern India is a problem of comparatively recent origin.
The cultivators of ancient China employed simple irrigation techniques some 4000-5000 years
ago, using water from the Hwang Ho (Yellow) River on the North China Plain. During the
Shang Period (3000-1000 BCE) prosperous farming communities developed on the Hwang Ho
where millet was the primary crop (Scarre 1999:146). Later, large-scale irrigation projects
developed, such as the Hsingan Canal between the Yangtze and Pearl Rivers in the third century
BCE, the Grand Canal built from 486 BCE, and irrigation works on the Min River in c.250 BCE
(Framji et al 1981:226). As in other places, salinisation problems followed Chinese irrigation.
Postel (1999:99) has noted that a prescription for salinity control in China appeared in a book
written in 1617 CE.
Irrigated agriculture also existed in the cultures and environments of the New World in pre-Columbian times. It appeared in the American south-west 1300-1400 years ago, and supported
societies in the central Andes of South America and the highlands of Mesoamerica. Salinisation
in the American south-west became a serious problem in the recent past, under intensive
European-style agriculture. In South America and Mesoamerica the extent and complexity of
irrigation responded to the rise and fall of successive civilisations over many centuries. Here the
environmental conditions, agricultural practices and population dynamics have not produced
evidence that land degradation from salinity was a significant problem (Gulhati and Smith
1967). However, there is considerable disagreement about the causes of the abandonment of
prehistoric irrigation canals in Peru, and salinisation may have been one of them (Denevan
2002).
The operation of intensive irrigation in arid environments continues to challenge contemporary
societies. As in ancient times, sustainable practices require agriculturalists to understand the
systems that they are trying to manage. Yet modern communities commonly fail to recognise
that irrigation ‘carries with it certain technological and social imperatives the ignorance of
which may lead to disaster’ (Gulhati and Smith 1967:3).
Widespread salinity in the semi-arid regions of the world is now a matter of concern to
governments, communities and scientists (ISRIC 1990, Ghassemi et al 1995, Glazovsky 1995,
Postel 1999, WCD 2000). It is a serious problem in Argentina, Australia, China, Egypt, India,
Iran, Pakistan, South Africa, Thailand, the USA, and several independent states of the former
Soviet Union located in Central Asia. It is difficult to calculate accurately the extent of the
problem. Salinisation was one of the forms of soil degradation considered in the Global
Assessment of Human Induced Soil Degradation (GLASOD) in 1990. That survey was
conducted by the International Soil Reference and Information Centre (ISRIC), and
commissioned by the United Nations Environment Programme. While the study had broad
categories to describe the major causative factors, such as agricultural mismanagement or
overgrazing, it did not specify whether the degradation was the result of irrigation or of other
causes (ISRIC 1990). However, the World Commission on Dams regards the control of salinity
and the reclaiming of saline land as an urgent priority, in order to increase productivity of
existing land and to make better use of irrigation (WCD 2000).
In a study of eight major irrigation countries Postel (1999) has shown that 22 per cent of the
irrigated land in these countries is affected by salt (Figure 6.7). As salt claims more land
worldwide, more prime agricultural land is being lost while productivity declines. She calls for
a radical reform of irrigation practices.
[Figure 6.7: bar chart of the percentage of total irrigated land affected by salt in eight major irrigation countries. Bar labels give the total irrigated land affected by salt in millions of hectares: China 6.7; India 7.0; United States 4.2; Pakistan 4.2; Iran 1.7; Egypt 0.9; Uzbekistan 2.4; Turkmenistan 1.0; sub-total 28.1; world estimate 47.7.]
Figure 6.7: Salt-affected irrigated land in eight major irrigation countries in the late 1980s (data
taken from Postel 1999, Table 5-1). The two extreme cases (Uzbekistan and Turkmenistan) are
located in Central Asia. The plight of these and other states in the Aral Sea Basin is the result of
government policy in the 1950s to divert the Amu Darya and Syr Darya Rivers in order to
develop irrigated land for rice and cotton production.
CHAPTER 7
SALINITY IN SOUTH-EASTERN AUSTRALIA
_______________________________________________
7.1 Introduction
This overview of salinity in the southern Murray-Darling Basin provides a context for the
historical study that follows in Chapters 8, 9 and 10. It shows the magnitude of a problem that
could have been foreseen, if policy makers had been alert to the experiences available from
history.
7.2 Australian Experience
As the driest inhabited continent, Australia has an average rainfall of only 455 millimetres,
compared with an average of 660 millimetres for the land surface of the whole world. Of
Australia’s average rainfall 88 per cent is lost in evapo-transpiration, while surface runoff
accounts for 11 per cent (into rivers and the sea), and one per cent finds it way into the
groundwater (Smith 1998:8). Along with southern Africa, Australia also has the world’s most
variable rainfall and runoff. Spatial and temporal distributions of water across the continent in
its 12 drainage divisions are highly uneven and make long-term averages of limited value even
at regional levels (Smith 1998:16).
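In round numbers these proportions are easy to check. A trivial sketch (the helper name is mine; the percentages are the Smith (1998:8) figures quoted above):

    def partition_rainfall(rain_mm=455.0):
        """Split Australia's average rainfall (mm) into the approximate
        fates quoted from Smith (1998:8)."""
        return {
            "evapo-transpiration": 0.88 * rain_mm,  # ~400 mm
            "surface runoff": 0.11 * rain_mm,       # ~50 mm
            "groundwater": 0.01 * rain_mm,          # ~4.5 mm
        }

    print(partition_rainfall())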
The Australian continent has vast areas of natural salt-affected land, where the flat terrain has
prevented the salts from being leached out of the soils. Salt is also present as a result of sea salts
blown from exposed continental shelves during periods of low sea level, and from salts
deposited in rainfall. In the east of the continent the occurrence of salt is associated with old
residual landscapes of the Tertiary Period (62 to 3 million years ago) and the weathered mantle
that once covered most of the continent (Gunn and Richardson 1979:210). A geological history
of successive sea-level rises before this period resulted in salt accumulations in many of these
landscapes. In the interior, salt lakes and salt-tolerant native vegetation indicate this aspect of
the continent’s geological history. Explorer-surveyors in the nineteenth century commented on
the presence of salt, and the first settlers who took their sheep onto the inland plains in the
south-east found the environment well suited to pastoralism. Since 1788 the whole continent has
been undergoing major changes in land use. In the south-west of Western Australia European
cereals have replaced native vegetation, and a serious dryland salinity problem has developed
along with the State’s wheat industry.
Despite many early warning signs of the presence of salt in Australian soils and groundwater,
modern scientific investigations of salinity are relatively recent. While the knowledge of how
irrigation salinity occurred and how to deal with it already existed in the nineteenth century, it
has taken scientists a long time to distinguish between primary and secondary salinisation.
Discussions recorded in the nineteenth century frequently exposed confusion between these two
forms of the problem.
Before salinity could be adequately understood in Australia, comprehensive data on regional
bio-physical environments were necessary. Scientists and policy makers needed data from many
scientific fields, particularly geology, geomorphology, hydrology and meteorology, to
understand soils and groundwater conditions, vegetation cover, flood and drought occurrences,
and river discharge patterns. These data were not available in appropriate detail, across broad
temporal and spatial scales, until the development of the continent’s water resources moved
forward. The impetus for this development on a national level came after 1945, during the
period of post-war reconstruction.
The Commonwealth conducted the first review of Australia’s water resources in 1963 (AWRC
1965). The review examined the occurrence and quantitative use of stream flows and
groundwater. It gave a broad indication of available quantities, their location, annual variation,
and amount of water regarded as committed to use. Although the review was based on
incomplete data, it played the important role of identifying significant gaps in current
knowledge and inaccuracies in existing data. The review introduced the river basin as the
primary unit for assessing surface water, and grouped the basins into 12 drainage divisions
containing 197 river basins. These divisions remain a fundamental way in which we define
water distribution in Australia.
The review also examined the location and extent of underground water, and recognised the
need for combined knowledge and skills from geology, geophysics, engineering and chemistry
in groundwater development. It emphasised the inter-relation, and often direct inter-connection,
between surface and groundwater. In some places groundwater drains into streams and
maintains their base flow, while in other places streams replenish underground supplies.
Therefore, as the exploitation of one resource can influence the other, the review recommended
that conservation and utilisation of both resources should be planned jointly. It was clear that
understanding the critical inter-dependence of elements in the whole hydrologic system would
improve as surveys of Australia’s water resources proceeded. Other Commonwealth reviews of
water resources followed in 1976, 1981, 1983 and 1997-2002.
In 2001 the Australian State of the Environment Committee noted that increasing salinity of
soils and water in Australian catchments was one of the most significant threats to the health of
aquatic ecosystems and water resources for irrigation and human consumption (Ball et al
2001:44). Since the 1996 State of the Environment Report, an estimated additional 0.5
million hectares of secondary salinity had become evident. The common cause was identified as
hydrologic imbalance. It is predicted that the area of land affected by dryland salinity will triple
in the next 50 years. Because irrigation salinity is more localised, it can be managed and reduced
more easily with efficient drainage and under sound irrigation practices.
Despite heightened public awareness of salinity in south-eastern Australia in recent years,
primary salinisation of land and water on the Australian continent is, in fact, a very old problem.
To understand the complexities of the degradation of irrigated lands in Australia over the last
half-century, it is necessary to understand more than just the science that can explain the
phenomenon. More than ten years ago Barr and Cary (1992) were stressing the need for
appreciation of the whole human activity system that continues to exacerbate the underlying
bio-physical problem. Government policies, institutional roles, politics, social and
environmental history, changed land uses and social learning all play a part. Governments in
Australia have been slow to recognise the multiple dimensions of the problem of salinity
(MDBC 1999a, LWA 2000).
7.3
The Murray-Darling Basin
The Murray-Darling is Australia’s largest river system (Figure 7.1). Its drainage basin covers 14
per cent of the continent, an area of over 100 million hectares with 20 major rivers. It extends
over the four states of New South Wales, Victoria, South Australia and Queensland, and the
Australian Capital Territory. Here Australia’s main agricultural industries took root and
prospered, after passing through an experimental stage in the Sydney region. Today, the water
resources of the Basin are intensively developed and regulated, with the result that the Basin
supports 41 per cent of Australia’s total gross value of agricultural production. It contains 44.6
per cent of the Australian crop area, 96 per cent of raw cotton production, and 56 per cent of
fruit production. Food processing industries contribute 60 per cent of the region’s
manufacturing turnover.
The Murray-Darling Basin (MDB) has large areas of naturally saline soils and groundwater:
along and to the west of the Darling River, along the Lachlan River, and across the Riverine
Plains and Mallee Zone of the Murrumbidgee and Murray Rivers in the south.
Despite the Basin’s geological characteristics and climate, which make salinisation a natural
phenomenon, Australian irrigation developed on its southern riverine plains. Today, more than
95 per cent of the water diverted from the rivers of the Murray-Darling system is used for
irrigation. This water supports over 71 per cent (1.47 million ha) of the total area of irrigated
crops and pastures in Australia. Irrigation in the Basin uses some 70 per cent of all water
consumed in Australia. Production from agriculture and horticulture is most concentrated under
intense irrigation from the Murrumbidgee, the Murray and its tributaries (Crabb 1997:98).
Figure 7.1: The Murray-Darling Basin
7.3.1
Southern MDB landscapes
The Murrumbidgee and Murray share common geology and geomorphology which provide the
preconditions for primary and secondary salinity to occur. In the east, the riverine plains have
been built up by slow sedimentation from westward-flowing streams during periods of
heightened river flow. Most of the alluvial sediments come from two sets of palaeo-channels,
identified by
Butler (1950) as prior-stream channels and ancestral-river channels, in two periods of deposition
during the Quaternary period. Langford-Smith (1960) later showed that the Murrumbidgee
broke up into distributaries that spread over the plains in all directions. Few of these streams
reached the sea because they dissipated through seepage and evaporation. The western portion
of the Murrumbidgee and Murray areas was exposed about a million years ago, as the sea began
retreating westwards from the vicinity of Swan Hill. The present mallee zone is the highly
saline area left behind when the sea withdrew. It is frequently covered by species of salt-tolerant
mallee (Eucalyptus oleosa and E. dumosa). More recently, Page and Nanson (1996) have
replaced the earlier prior-stream and ancestral-stream models of fluvial deposition with
aggradational and migrational palaeo-channel models. Their work has significantly extended
understanding of the geomorphology of the riverine plains in the Late Quaternary period.
Shallow watertables underlie most of the southern MDB in both the riverine plains and mallee
zone. Yet throughout the twentieth century the governments of New South Wales and Victoria
established an increasing number of irrigation areas in this part of the Basin. In the Murray
Valley there is intensive irrigation on the riverine plains in the regions of Kerang, Wakool,
Deniliquin and Shepparton, and in the mallee zone in Sunraysia, Riverland and Lower Murray
regions. Irrigation from the Murrumbidgee is concentrated in the Murrumbidgee and
Coleambally Irrigation Areas. On the riverine plains the deep sediments are highly saline, while
in the mallee zone the natural groundwater is highly saline and shallow watertables occur where
natural drainage conditions are poor.
7.3.2
Salinity: impulses and responses
Communities in the irrigation areas of the Murrumbidgee and Murray became aware of an
increasing salinity problem after wet winters in 1931, 1939, 1942 and 1956 (WCIC Annual
Reports). Although rising watertables were apparent in the Murrumbidgee Irrigation Areas
(MIA) as early as 1914, salinity first appeared at Yenda after severe floods in May-June 1931
submerged the village. Again in 1939, flooding raised watertables to dangerous levels and
caused loss of fruit trees in the MIA. The conditions of 1956 seriously affected the MIA citrus
crops. Similar problems existed in areas served by the Murray. (See map in Figure 10.3)
In 1967 the River Murray Commission¹ initiated the first investigation of environmental
conditions and other factors which contribute to irrigation salinity in the Murray Valley. The
comprehensive report by Gutteridge, Haskins and Davey (1970; hereinafter GHD) clearly
identified the two basic salinity problems: high watertables and land salinisation in the riverine
plains, and high river salinity levels from irrigation in the mallee zone. It outlined the extent of
these two tightly coupled problems, their effect on agricultural and horticultural production, and
the contribution from poor irrigation practices. The report recommended measures to deal with
the problems, including immediate and long-term construction works and investigations. The
report became the standard reference and main source book for a whole range of salinity and
drainage studies.
A Commonwealth review of water resources in 1976 (AWRC 1976) principally investigated
water quantities, but looked broadly at salinity of rivers and groundwater as a water-quality
issue. The review reiterated the community concerns over increasing salinity levels of the
Murray River. However, the concerns were based on findings published in 1970 (from data
taken in 1966/67). Clearly, more recent measurements were needed to improve on data now a
decade old.

¹ The River Murray Commission (RMC) was established in 1915 as a result of an agreement between the
Commonwealth, New South Wales, South Australia and Victoria to develop and share equitably the
water resources of the Murray among the three States. In 1987 a new agreement was signed to meet the
emerging problems in the Basin, and the RMC became the Murray-Darling Basin Commission. Later
Queensland (1992) and the Australian Capital Territory (1998) joined the partnership.
Later, the Murray Valley Salinity Study Steering Committee commissioned a new report on the
area’s salinity problems (Maunsell and Partners 1979). The large scale and high capital cost of
many of the remediation works proposed by GHD in 1970 resulted in few projects progressing
beyond the major proposal stage in the State water agencies. The history of the River Murray
has shown that many problems concerning the river involve multiple jurisdictions. The salinity
problem required co-ordination between the States and the Commonwealth. Therefore, the
Maunsell study assessed the existing and potential problems, reviewed proposals to deal with
them, determined project priorities and prepared a co-ordinated action plan. The study
concluded that the drainage and salinity effects of any proposed expansion or intensification of
irrigation should be thoroughly investigated before adoption. It also regarded the problems of
salinity and drainage as having broad social and economic implications, and their solutions had
to be integrated with other government policies.
In 1981 the Commonwealth initiated a study of national water-resources needs and problems to
the end of the century. Previous studies had focused on supply and quality, rather than use,
hence this study was called The First National Survey of Water Use in Australia (AWRC 1981).
It was seen as identifying water resource development and management issues that might
impede plans for national development. Salinity was one of 13 issues addressed by the study,
and a report on the effects of human beings on salinity in Australia appeared later (Peck et al
1983). The salinity study recognised that the most serious salinity problems occur in areas with
shallow watertables (i.e., less than 2 metres below ground surface), and estimated that 5240 km²
of the Murray-Darling Basin suffered from this problem. The study estimated that salinity levels
of the River Murray had increased by about 84 per cent over the period 1938-1981. The trend
was expected to continue. A high proportion of irrigated land would develop shallow
watertables without artificial drainage. The study noted that no established methods were
practised to prevent saline seepage or to reclaim salinised land in dryland agricultural regions.
The data presented for the Murray-Darling Basin confirmed the trend whereby seepage salting
was affecting both irrigated and non-irrigated soils, and streams. Data gathering over a wider
area and better sampling methods had contributed to a growing understanding of the extent and
complexity of the problem.
Earlier interest in soil conservation at the State level had resulted in the Commonwealth creating
a Standing Committee on Soil Conservation in 1946. This body set up a Working Party in 1978
to examine dryland salinity in Australia (SCSC 1982). It was particularly concerned with land
degradation arising from scalding and seepage salting, the cause of which was seen as changed
land-use practices since European settlement.
In 1984, under the Australian Environment Statistics Project, the Commonwealth compiled a
report on land degradation from surveys conducted before 1977 (Woods 1984). This project
recognised the absence of information on the Australian environment, and a vital need for
collection and wide dissemination of such data. On salinity, the report noted that the problem
was increasing and complex, but not even experts in the field fully understood the extent and
significance of this form of land degradation. It recognised that land salinisation was one of the
most expensive forms of degradation to treat (e.g., tile drainage in irrigation areas).
A review of Australia’s water resources and water use in 1985 showed that water use in the
Basin was close to the sustainable maximum (AWRC 1988, Smith 1998:85). This situation set
in train the development of a new policy by the Murray-Darling Basin Ministerial Council
(MDBMC) and resulted in the 1995 moratorium (known as The Cap) on further water
diversions. Extractions were capped at the 1995 level.
In 1989 the MDBMC adopted a Salinity and Drainage Strategy for the Murray and
Murrumbidgee valleys (MDBC 1987). The Strategy was prompted by the results of previous
studies that showed how some 96,000 hectares of irrigated land in the Basin were showing
visible signs of salinisation. It was also prompted by a fear that the area of high watertables
could increase from 559,000 hectares in 1985 to 869,000 hectares in 2015. The Strategy set a
specific salinity-reduction target of 80 EC against benchmark conditions, in particular to reduce
average salinity at the off-take for water supply to Adelaide, South Australia’s capital city. It
envisaged the three Basin States earning credits from salinity-reduction schemes beyond the
80 EC target; these salinity credits could be offset against activities which increase salinity
levels. (See §6.2.3 for discussion of EC units.)
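The credit mechanism can be pictured as a simple ledger in which works that reduce salinity at
the benchmark site earn credits, and actions that raise it incur debits. The sketch below is purely
illustrative: the scheme names and EC values are hypothetical, not figures from the Strategy.

    # Illustrative ledger for the Strategy's salinity credit mechanism.
    # Scheme names and EC effects are hypothetical. Positive values are
    # credits (salinity reduced at the benchmark site); negative values
    # are debits (salinity increased) that must be offset.
    actions = {
        "salt interception scheme A": +12.0,
        "drainage diversion works B": +8.5,
        "new irrigation development C": -6.0,
    }

    net_effect_ec = sum(actions.values())
    print(f"Net change at benchmark site: {net_effect_ec:+.1f} EC")

    target_reduction_ec = 80.0  # Strategy target against benchmark conditions
    print(f"Still required to meet target: {target_reduction_ec - net_effect_ec:.1f} EC")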
A review of the Strategy a decade later (MDBC 1999b) examined the effects of schemes that
provided cost-effective salt interception and drainage diversion to rehabilitate waterlogged and
salinised irrigation areas. Salt interception works are also used to control river salinity. Over ten
years the Strategy achieved a net reduction of 57 EC units in the Lower Murray. In 1999 the
MDBMC produced The Salinity Audit, a report on salinity levels and future trends in the
principal river valleys of the Basin. It showed that, in the absence of new management
interventions, salinity would increase across the landscape. It estimated that it would take
centuries to
stabilise the immense hydrologic imbalance that has occurred over decades in some areas.
Since the 1970s Government attention to salinisation in the Murray-Darling Basin has been
growing steadily with the problem itself, and as scientists have developed improved methods to
measure and understand it. Interest in salinity can be assessed by activities from a multitude of
perspectives and from government agencies across the Basin. There has been rapid growth in
the number of working parties, experts groups, regional and local studies, monitoring programs,
computer modelling projects, technical reports, scientific publications, management plans, and
public works programs. These activities are improving the scientific knowledge base, and
address such issues as salt-loads, risks and impacts on agriculture, salt-reduction targets, costs to
the community, and integrated catchment management.
In 2000 the Council of Australian Governments endorsed The National Action Plan for Salinity
and Water Quality, a measure which targets some of the worst affected areas in the country. The
Plan provides some $1.4 billion over seven years for applying regional solutions to salinity and
water quality problems. It aims to bring together people from all levels of government with
community groups, individual land managers and local businesses.
The Basin Salinity Management Strategy 2001-2015 for the Murray-Darling Basin (MDBMC
2001) has identified the need for three levels of approach in dealing with the problem. First,
communities can get short-term relief from salinity by means of engineering works that
intercept the saline groundwater. Second, interception schemes allow communities to put in
place longer-term measures for treating the causes by reducing groundwater discharge. Third, in
some parts of the Basin it will also be necessary for communities to adapt and co-exist with
more salt in the landscape.
7.3.3
Salinity trends
The systematic measuring of salinity is a relatively recent practice, and studies have only
adopted standard units within the last ten years. It is therefore difficult to establish reliable
long-term trends in salinity, and it is clear that many past estimates seriously under-estimated
the problem. Nevertheless, the figures in Table 7.1 give an indication of some of the key
changes across the landscape.
The critical elements in salinity are now recognised as the following (a minimal data-record
sketch follows the list):
• The salt load mobilised in the landscape, measured in tonnes per year
• The area of land with rising groundwater, whether saline or not, measured in hectares
• The salt exported from the landscape through rivers, measured in tonnes per year
• The salinity levels of rivers, measured in EC units.
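One way to see how these four indicators work together is as a single annual record per region.
The sketch below is a minimal illustration; the field names are my own, not an official schema,
and the example values are indicative figures drawn from Table 7.1 (the EC reading is
hypothetical).

    from dataclasses import dataclass

    @dataclass
    class SalinityRecord:
        """One year's salinity indicators for a region (illustrative schema)."""
        region: str
        year: int
        salt_mobilised_t_per_yr: float   # salt load mobilised in the landscape
        rising_groundwater_ha: float     # area of land with rising groundwater
        salt_exported_t_per_yr: float    # salt exported through rivers
        river_salinity_ec: float         # river salinity in EC units

    # Indicative Basin-scale figures (cf. Table 7.1); the EC value is hypothetical.
    record = SalinityRecord("Murray-Darling Basin", 1998,
                            salt_mobilised_t_per_yr=5e6,
                            rising_groundwater_ha=16e6,
                            salt_exported_t_per_yr=2e6,
                            river_salinity_ec=750.0)
    print(record)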
At the rate at which groundwater levels are rising, most irrigation areas in the southern part of
the Basin will have watertables within two metres of the surface by 2020 (MDBMC 1999).
Table 7.1: Extent of Salinity in the Murray-Darling Basin

  Indicator                                              Extent            Year
  High watertables                                       559,000 ha        1985
  Visible signs of salinisation                          96,000 ha         mid-1980s
  Salt-affected land                                     300,000 ha        1996
  Salt mobilised to the land surface                     5 million t/yr    1998
    - exported from the system                           2 million t/yr
    - retained in soils and groundwater                  3 million t/yr
  Rising groundwater in a study area of 21 million ha    16 million ha     1998
  Land at risk of dryland salinity*                      5 million ha      2001

Data from Murray-Darling Basin Commission (various sources).
*Recently the Bureau of Rural Sciences reported that salt stores in eastern Australia are much more
localised in the landscape than previously thought, and represent a salinity risk only if they are likely to
be mobilised by water (Baker and Barson 2004).
Over recent years downstream sites on the Lower Murray have experienced increasing salinity
levels, as increased salt loads from both tributaries and seepage in the mallee zone enter the
river. River salinity levels recorded during 1975-1985 provide the benchmark for acceptable
salinity in the Lower Murray. The town of Morgan in South Australia has become a key
monitoring station because it is located upstream of the off-takes for water supply to Adelaide
and other South Australian districts. While there is little information about the salinity of the
Murray before about 1965, what there is shows a small but perceptible increase since the 1930s
(GHD 1970, 3:37). One study has shown salinity levels for the period 1969/70 to 1972/73 on
average to be 520 EC (Peck 1983:15). During 1975-1985 river salinity at Morgan exceeded 800
EC (the threshold for desirable drinking water quality) 42 per cent of the time. During 1993-
1999, following implementation of the Salinity and Drainage Strategy for the Basin, the level
exceeded 800 EC for only 8 per cent of the time. The level at Morgan now provides the key
performance indicator for the Murray’s water salinity. In comparison, the Murrumbidgee River
maintains a relatively low salinity level. In its middle course, through the irrigation areas,
readings are between 200 and 400 EC.
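Figures such as ‘42 per cent of the time’ are exceedance frequencies computed from a
monitoring series. A minimal sketch of the calculation follows; the sample readings are
invented for illustration, whereas real Morgan records are long continuous time series.

    # Exceedance frequency: the share of readings above a salinity threshold.
    # The sample series below is invented for illustration only.
    readings_ec = [640, 725, 810, 905, 780, 830, 760, 690, 850, 770]
    threshold_ec = 800  # threshold for desirable drinking water quality

    exceedance = sum(r > threshold_ec for r in readings_ec) / len(readings_ec)
    print(f"Above {threshold_ec} EC {exceedance:.0%} of the time")  # 40% here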
Much of Australia’s native ecology is adapted to occasional fluxes of salt in primary saline
areas. However, scientists have only recently begun to understand the effect of secondary
salinity on aquatic ecosystems. As salinity levels increase ‘the native life of floodplains now
suffers at least as much from rising salt as do crops and horticulture, and probably more so’
(Mussared 1997:98). The MDB supports a number of major wetlands: the Macquarie Marshes,
Great Cumbung Swamp (NSW), Avoca Marshes and Chowilla Floodplain (Victoria). While
current knowledge prevents a clear assessment of the adverse effects of salinity, there is no
doubt that these wetlands will suffer from rising salinity. Some 80 wetlands have already been
affected or are at risk from salinity across Australia (NLWRA 2001:13). In the past, the use of
wetlands (in the South Australian Riverland district and in Victoria’s Kerang region) as salt
evaporation basins for the disposal of saline drainage water from irrigation areas has been
highly destructive of aquatic ecosystems.
Figure 7.2: Watertable levels for four irrigated districts in the NSW Murray area of the
Murray-Darling Basin. (Original data from MDBC Draft Salinity and Drainage Strategy 1988,
graph reproduced in Ghassemi et al 1995.)
The future prospects for Australia in an environment under the increasing risk and impact from
dryland salinity are summarised in the National Land and Water Resources Audit (NLWRA
2001:v); the sketch after the list restates these projections as growth factors:
• Approximately 5.7 million hectares are within regions mapped to be at risk or affected by
dryland salinity. It has been estimated that in 50 years’ time the area of regions with a high
risk may increase to 17 million hectares.
• Some 20,000 km of major road and 1600 km of railways occur in regions mapped to have
areas of high risk. Estimates suggest these could be 52,000 km and 3600 km respectively by
the year 2050.
• Up to 20,000 km of streams could be significantly salt-affected by 2050.
• Areas of remnant native vegetation (630,000 hectares) and associated ecosystems are within
regions with areas mapped to be at risk. These areas are projected to increase by up to 2
million hectares over the next 50 years.
• Australian rural towns are not immune: over 200 towns could suffer damage to
infrastructure and other community assets from dryland salinity by 2050.
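Taken together, the Audit’s 2050 projections imply roughly a tripling of the land area at risk
and more than a doubling of exposed road and rail. The short sketch below simply restates the
figures quoted above as growth factors:

    # Growth factors implied by the NLWRA (2001) projections to 2050,
    # using only the figures quoted in the list above.
    projections = {
        # asset: (current extent, projected 2050 extent, unit)
        "land at risk":  (5.7e6, 17e6, "ha"),
        "major roads":   (20_000, 52_000, "km"),
        "railways":      (1_600, 3_600, "km"),
    }

    for asset, (now, in_2050, unit) in projections.items():
        print(f"{asset}: {now:,.0f} -> {in_2050:,.0f} {unit} "
              f"(about {in_2050 / now:.1f} times)")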
7.4
The Murrumbidgee River
Intensive irrigation in New South Wales began with the Murrumbidgee Irrigation Scheme in
1912. Its history is described in Part III. Today the MIA is an important agricultural district of
some 3,624 square kilometres. Producers on Large Areas (properties from 200 to 320 hectares)
grow rice, corn, wheat and vegetables, and raise prime lamb, wool and beef cattle.
Horticulturists (on farms from 16 to 20 hectares) grow permanent crops such as grapes, oranges,
lemons, peaches, apricots, grapefruit, cherries, prunes and plums. The MIA produces almost 90
per cent of the NSW citrus crop (35 per cent of total Australian production), and 70 per cent of
NSW wine grapes (20 per cent of total Australian production). The gross value of farm
production in the MIA is estimated at about $700 million (Murrumbidgee Irrigation Limited,
c.2002).
Rising groundwater from irrigation was evident in the first decade of the MIA's operation.
Salinity first appeared in the 1930s, and occurred increasingly during wet seasons over the
following decades. As irrigation expanded from the original areas of Mirrool and Yanco the
problem spread. In the Benerembah Irrigation District, where rice growing was introduced in
1942, shallow watertables and salinity appeared over large areas in 1984. In the Coleambally
Irrigation Area, on the southern side of the Murrumbidgee, the watertable rose significantly in
the period 1981-91. By 1984 salinity was evident as the watertable approached levels less than
two metres from the surface.
In 1992 the NSW Government adopted the State Rivers and Estuaries Policy in response to
the community’s growing concern about the condition of the waterways in New South Wales.
The NSW Department of Land and Water Conservation began a data-gathering program as a
step
towards determining future management of the rivers and their catchments. In the report on the
Murrumbidgee, waterlogging and salinity in the irrigated areas were identified as major
problems (NSW DLWC 1995). The report also said: 'Salinity is likely to become an
increasingly important issue in the Murrumbidgee region over the next five to ten years.'
The Murray-Darling Basin Salinity Audit (MDBMC 1999) indicated that waterlogging and
salinity had been less serious in the Murrumbidgee catchment than in other major NSW river
systems. Although salt loads exported from the whole Murrumbidgee valley were the highest in
the State, the NSW study that provided data for the Salinity Audit found that salinity in the
Murrumbidgee itself remained the lowest in the State, as a result of dilution by fresh water from
the Snowy Mountains catchments (NSW DLWC 2000). The NSW study also found that, of the
17 tributary sub-catchments above Wagga Wagga in the Murrumbidgee valley, several were
contributing the highest average annual salt loads per unit area found anywhere in the NSW
portion of the Murray-Darling Basin. Salt loads ranged from 4 to 48 tonnes per km² per year,
and were predicted to increase to 6 to 74 tonnes per km² per year by the end of the twenty-first
century. In 1998 the average salinity level recorded for the Murrumbidgee was only 140 EC at
Wagga Wagga above the scheme, and 250 EC below the scheme at Balranald (cf. Table 6.1).
As well as diversions for irrigation below Wagga Wagga, the Murrumbidgee provides
municipal water supplies to centres from the catchment above Burrinjuck Dam and downstream
as far as Balranald. It also supplies water to significant wetlands in the lower reaches of the
system. Of the average annual flow of 4592 GL under current conditions at Wagga Wagga,
2424 GL is diverted for irrigation, 19 GL is used for stock and domestic water supply, 799 GL
is deposited in wetlands and 1350 GL flows out to the Murray River at the end of the system.
The estimated salt loads are summarised in Table 7.2 (NSW DLWC 2000).
Table 7.2: Estimated salt loads for the Murrumbidgee (average tonnes per year)

  Year    River at       Diversions        Wetlands and    Stock and          End of
          Wagga Wagga    for irrigation    other losses    domestic supply    system
  1998    401 800        162 700           53 600          1300               184 200
  2020    482 300        189 050           62 300          1500               229 450
  2050    529 100        208 100           68 600          1650               250 750
  2100    607 400        240 000           79 100          1900               286 400
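A useful property of Table 7.2 is that the component loads in each row sum exactly to the river
load at Wagga Wagga. The check below, using only figures copied from the table, verifies that
mass balance:

    # Mass-balance check on Table 7.2: irrigation diversions + wetlands and
    # other losses + stock and domestic supply + end-of-system load should
    # equal the river load at Wagga Wagga in every year.
    rows = {  # year: (wagga_total, irrigation, wetlands, stock_domestic, end_of_system)
        1998: (401_800, 162_700, 53_600, 1_300, 184_200),
        2020: (482_300, 189_050, 62_300, 1_500, 229_450),
        2050: (529_100, 208_100, 68_600, 1_650, 250_750),
        2100: (607_400, 240_000, 79_100, 1_900, 286_400),
    }

    for year, (total, *components) in rows.items():
        assert sum(components) == total, f"imbalance in {year}"
        print(f"{year}: components sum to {sum(components):,} t/yr (equals river load)")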
Water salinity for downstream uses is expected to reflect the same concentration and variability
as at Wagga Wagga, where the range of average monthly salinities predicted for the coming
century lie well below the seasonal threshold values of 800 EC and 1600 EC (cf. Table 6.1).
Tributary catchments such as Muttama Creek and Jugiong Creek already exceed 800 EC in 35%
and 77% of months, and 1600 EC in 20% and 33% of months, respectively (NSW DLWC
2000). River salinity is particularly significant to downstream users of the Murrumbidgee and to
those below Wentworth on the Murray River. The aim of the Basin Salinity Management
Strategy 2001-2015 (MDBMC 2001) is to ensure that salinity measured at Morgan remains
below 800 EC for 95% of the time over the 15-year period.