Groundwater remediation: clean or no clean?
Groundwater remediation has never been so important, says Ian Grant, but a trend towards looking at its cost effectiveness is worrying.
Groundwater is the largest available reservoir of fresh water. Some 97.5 per cent of the Earth's water is salt water, and approximately 70 per cent of the remaining fresh water is locked up in polar ice. Less than one per cent of fresh water is in rivers and lakes, leaving around 0.75 per cent of the Earth's water to share between humans, animals and plants.
But our groundwater usage is increasing. Of that 0.75 per cent, humans currently use around 40 per cent; the projection for 2050 is 90 per cent, which would leave just 10 per cent for animals and plants.
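As a rough, back-of-the-envelope check of how those figures chain together (the percentages are the approximate ones quoted above, not independent data), the arithmetic works out as follows:

```python
# Rough check of the water-budget percentages quoted in the article.
# All inputs are the approximate figures above, not measured data.

total_water = 100.0                      # Earth's water, expressed as a percentage
fresh_water = total_water * 0.025        # ~2.5% of all water is fresh
polar_ice   = fresh_water * 0.70         # ~70% of fresh water is polar ice
accessible  = fresh_water - polar_ice    # what is left to share (~0.75% of all water)

human_share_now  = accessible * 0.40     # ~40% used by humans today
human_share_2050 = accessible * 0.90     # projected ~90% by 2050

print(f"Accessible fresh water: {accessible:.2f}% of all the Earth's water")
print(f"Human use now:  {human_share_now:.2f}% of all water "
      f"({human_share_now / accessible:.0%} of the accessible share)")
print(f"Human use 2050: {human_share_2050:.2f}% of all water "
      f"({human_share_2050 / accessible:.0%} of the accessible share)")
```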
And a major problem, according to Jeremy Birnstingl, Managing Director of remediation company Regenesis, is that the easiest groundwater to get to - shallow aquifers as opposed to deeper fossil water - is also the easiest groundwater to pollute. And a shortage of replenishable shallow aquifer water is now leading to deep water mining of a non-renewable resource.
Birnstingl says that some 30 per cent of US groundwater is used for irrigation. But with a depletion rate of 12 cubic km per year, he wonders if there will be anything left in 25 years.
Indeed, net depletion of aquifers is recorded not only in the US, China and India, but also in Iran, Israel, Jordan, Mexico, Morocco, Pakistan, Saudi Arabia, South Korea, Spain, Syria, Tunisia and Yemen. These countries account for more than half the world's population.
UK pressures
Groundwater shortage is not just a developing world issue; it is also a serious problem facing the UK. By 2033, consumption is predicted to reach 1,074 litres each day.
But besides the abstraction stress facing shallow aquifers, there are also contamination threats which range from hydrocarbons to arsenic, fluoride, and saline intrusions. Diffuse pollution dangers include nitrates and pesticides.
Fortunately, EC legislation is tight. For example, the legal limit on perchloroethene (PCE), a dry-cleaning solvent, is the equivalent of one teaspoon in an Olympic-sized swimming pool. It makes the common pre-seventies practice of pouring cans of solvent away a rather expensive behaviour.
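Taking that analogy at face value, a quick calculation shows the order of magnitude involved. The figures below are assumptions chosen for illustration only - a 2,500 cubic metre pool, a 5 ml teaspoon and a PCE density of roughly 1.6 g/ml - and the result lands in the same few-micrograms-per-litre range as the EU drinking water standard of 10 µg/l for tetrachloroethene and trichloroethene combined.

```python
# Illustrative only: what one teaspoon of PCE in an Olympic pool works out to.
# Pool volume, teaspoon volume and PCE density are assumed round numbers.

pool_volume_l   = 2_500_000        # Olympic pool, ~2,500 m^3 expressed in litres
teaspoon_ml     = 5.0              # one teaspoon, in millilitres
pce_density_gml = 1.6              # PCE is denser than water, roughly 1.6 g/ml

pce_mass_ug = teaspoon_ml * pce_density_gml * 1_000_000   # grams -> micrograms
concentration_ug_per_l = pce_mass_ug / pool_volume_l

print(f"One teaspoon of PCE in an Olympic pool is about "
      f"{concentration_ug_per_l:.1f} µg/l")
# For comparison: the EU drinking water standard is 10 µg/l for
# tetrachloroethene and trichloroethene combined.
```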
Indeed, it was in the mid-to-late sixties that resource mapping was uppermost. Then, groundwater pollution was classified in clunky generalisations - oils or metals. But with technological breakthroughs such as capillary gas chromatography, oil could be speciated into individual chemicals and compounds such as benzene and toluene, and polyaromatic hydrocarbons into some 16 types.
The industry started to ramp up to deal with these ‘new contaminants' and there was an explosion of technologies. The armoury included pump and treat, air sparge, soil vapour extraction, surfactant flushing and pumping, and bioremediation.
The Dutch decided that groundwater should be cleaned up to multi-functional (very high) standards, which had a big influence on the industry. But by the early nineties the honeymoon had ended. Groundwater remediation techniques were, on the whole, never going to totally clean up aquifers, and the limitations of the technologies were realised. The recession didn't help. Dr Birnstingl likens the problem to spilling coffee on the carpet: it takes a lot more energy to get it out than it took to put it there, and the stain and smell often remain to some degree despite energetic efforts.
Cost efficiency
So from the mid-nineties, risk assessment came to the fore, along with a new-found pragmatism and more practical clean-up levels. The trend has continued through the noughties with advances in instrumentation and practical technologies - chemical oxidation, enhanced in situ bioremediation and enhanced monitored natural attenuation.
But even with advanced technologies, groundwater pollution remains ‘unseen' and ‘underground': often complex, three-dimensional and subject to ever-changing dynamic forces. Remediators have to rely on limited data that represent only snapshots in time.
Mike Rivett, lecturer in Earth Sciences at the University of Birmingham, says the issue is sorting out which ‘heterogeneities' to be bothered about. He also says that guidance can be highly complex. Ecological risk assessments are thorough - but some industry commentators say that following them can open up a can of worms.
One area of debate is whether the right chemicals are being analysed. In 2009 the US National Research Council stated: "Chemicals that have not been examined sufficiently in epidemiologic or toxicologic studies are often insufficiently considered in or are even excluded from risk assessments; because no description of their risks is included in the risk characterisation, they carry no weight in decision-making. That occurs in superfund-site and other risk assessments, in which a relatively short list of chemicals on which there are epidemiologic and toxicologic data tends to drive the exposure and risk assessments".
In addition, a number of emerging contaminants may prove to be a challenge both from a remediation and analytical point of view for consultants and contract laboratories, according to Dr Cecilia Macleod, technical director of Arcadis, an environmental consultancy.
The contaminants include perfluorinated sulfonates and their breakdown products, 1,4-dioxane, pharmaceutical compounds including penicillin, acetaminophen and oestrogen (among others), perchlorates, and phosphate-based flame retardants including tris(chloroisopropyl) phosphate.
"These compounds are not yet part of a routine water quality or soil analytical suite and yet are known to be a problem in the environment. The challenge of detecting, delineating and remediating these compounds is one we will have to face as we move into the next decade," she says.
Professor Phil Morgan, associate director of the Sirius Group, says that contaminant behaviour is really hard to judge, with tailing and rebound, inadequate site investigation, geological variability, the physical and chemical properties of contaminants, biological effects and poor verification design all adding to the problem - along with time and budget pressures.
He adds: "Components behave very differently from each other and can affect the behaviour of others in terms of solubility, biodegradation, sorption and vapour composition. Some components within the mixture may be degraded in preference to others due to degradation kinetics, higher energy yield or inhibition effects."
Data and tools
The data used when looking at contamination are also the subject of debate. There has been much discussion of the heterogeneity of aquifer sediments and lateral dispersivity, as well as issues of scale and the correct identification of groundwater flow direction. Scale, for example, is an important factor in understanding contaminant behaviour: the error is often made of looking at plumes over a large scale when in fact small-scale changes can greatly affect the success or failure of a remediation system.
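One way to see why these parameters matter so much is a simple one-dimensional transport calculation. The sketch below uses a commonly used simplified form of the Ogata-Banks solution for a continuous source, C/C0 = 0.5 erfc((x - vt) / (2 sqrt(Dt))), with entirely hypothetical values for velocity, time and source concentration. Only the longitudinal dispersivity is varied, yet the predicted concentration at the compliance point changes by roughly two orders of magnitude.

```python
import math

# Illustrative 1D advection-dispersion estimate ahead of a plume front.
# Simplified continuous-source solution:
#   C/C0 = 0.5 * erfc((x - v*t) / (2*sqrt(D*t)))
# All values are hypothetical, chosen only to show sensitivity to dispersivity.

c0 = 1000.0    # source concentration, µg/l
v  = 0.1       # groundwater velocity, m/day
t  = 3650.0    # elapsed time, days (10 years)
x  = 450.0     # distance to a compliance point, m (ahead of the advective front)

for alpha_l in (1.0, 10.0):        # longitudinal dispersivity, m
    d = alpha_l * v                # dispersion coefficient, m^2/day
    c = c0 * 0.5 * math.erfc((x - v * t) / (2.0 * math.sqrt(d * t)))
    print(f"dispersivity {alpha_l:4.1f} m -> "
          f"predicted concentration {c:8.2f} µg/l at {x:.0f} m")
```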
Tools available for on-site measurement range from qualitative to semi-quantitative. Qualitative methods include geophysical techniques such as electrical resistivity tomography, ground-penetrating radar, electromagnetic surveys and shallow seismic investigations. These can be used to build up 3D models of the subsurface, identifying particular horizons or stratigraphic layers showing disturbance, the presence or absence of a shallow groundwater table, and changes in properties that can indicate zones of contamination.
Semi-quantitative tools include the membrane interface probe, laser-induced fluorescence, rapid on-site toxicity tests for assessing soil or groundwater toxicity, PetroFLAG for assessing petroleum hydrocarbons, immunoassay screening for dioxins, PCBs and PAHs, and x-ray fluorescence screening for metals in soils.
In addition, a number of consultancies and the British Geological Survey have set up mobile laboratory facilities equipped with gas chromatographs (with mass spectrometry and flame ionisation, photoionisation or electrical conductivity detection), colorimetric testing equipment, x-ray fluorescence screening and other analytical tools.
Other tools that are becoming more sophisticated and are very useful for analysing and presenting data include geographical information systems, which can act as the master database for site information, holding location, descriptive and quantitative data. Statistical analysis is also more widely used to test the significance and representativeness of both investigation and remediation data.
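As a flavour of what that statistical testing might look like in practice, the sketch below compares hypothetical concentrations from a single monitoring well before and after treatment using the scipy library's Mann-Whitney U test, a non-parametric test often preferred for skewed concentration data. The well, the values and the significance threshold are all assumptions for illustration.

```python
from scipy.stats import mannwhitneyu

# Hypothetical quarterly PCE concentrations (µg/l) at one monitoring well.
before_treatment = [85.0, 92.0, 78.0, 110.0, 95.0, 88.0, 101.0, 97.0]
after_treatment  = [60.0, 55.0, 71.0, 48.0, 66.0, 52.0, 58.0, 63.0]

# One-sided test: are post-treatment concentrations significantly lower?
stat, p_value = mannwhitneyu(before_treatment, after_treatment,
                             alternative="greater")

print(f"U statistic = {stat:.1f}, p-value = {p_value:.4f}")
if p_value < 0.05:
    print("Reduction is statistically significant at the 5% level.")
else:
    print("No statistically significant reduction - more data "
          "(or more treatment) needed.")
```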
In the wake of new challenges, then, sharing data and findings is going to be key to improving the effectiveness of remediation techniques.
But Birnstingl, for one, believes it will be a decade of austerity, with the focus shifting to looking for the significant possibility of significant harm rather than just the possibility of significant harm. He points to examples in the United States where authorities are saying that, in some cases, it is better to have no abstraction from the water body than to go through the expense of cleaning it up.
And in some developments, the solution is to put tougher membranes down, capping the pollution rather than cleaning it up. He warns that trends in the US are often reflected in the UK.
Clearly, then, despite all the advances in technology, the fact remains that in some instances the levels of pollution are such that it is more cost effective and practical not to clean up. But is that acceptable in an age when water shortages are rife and the value of water as a resource is increasing all the time?