

Highlights from SETAC 2010: Deepwater Horizon oil spill and Global Climate Change

11.15.10

The Society of Environmental Toxicology and Chemistry (SETAC) is a non-profit, worldwide professional society composed of individuals and institutions engaged in: 1) the study, analysis, and solution of environmental problems, 2) the management and regulation of natural resources, 3) environmental education, and 4) research and development. The 2010 annual meeting was a four-day event in Portland, Oregon, America’s top green city. The program was diverse, with 14 parallel sessions and hundreds of posters displayed every day, catering to everyone’s interests. This was my second time attending the conference, following a poster presentation at SETAC 2009 in New Orleans.

The highlights of this year’s meeting were twofold: 1) several oral and poster sessions devoted to the Deepwater Horizon oil spill in the Gulf of Mexico, and 2) the first session on global climate change ever organized by SETAC. The largest oil spill in US history, which spanned 87 days in the Gulf of Mexico, will pose environmental threats for decades to come. Many lessons were learned during the 20 years of research following another environmental disaster, the spill from the tanker Exxon Valdez. Yet the BP spill is different in many respects: a light, sweet crude discharged into warm Gulf waters behaves very differently from a heavy crude oil discharged into cold Alaskan waters. These characteristics, combined with the unprecedented use of chemical dispersants to break up the oil, put environmental scientists in uncharted territory. The U.S. EPA conducted extensive monitoring of air, water, and sediments to assess the impacts of crude oil and dispersant chemicals on human health, including shoreline communities, and on the aquatic environment. Research on the Gulf spill is clearly high profile, given the level of media attention and the politicization of the incident.

The technical and policy debate on global climate change (GCC) is another topic familiar to the general public. Climate change influences sea levels, ocean acidification, the severity and frequency of extreme weather, the balance of ecosystems, and other phenomena of importance to natural and man-made systems. While significant resources are being directed at predicting the potential consequences of climate change, we also need rational approaches to guide decision-making under uncertainty, along with methods for developing and comparing the performance of alternative adaptive strategies within an overall adaptive management framework. As one speaker rightly stated, there is no time to waste: we must start developing plans that reduce the risks climate change poses to humans, infrastructure, and ecosystems, even if those risks are not yet perfectly understood.

My presentation was part of a platform session entitled “Effects of Spatial Uncertainty on Site Management Decisions at Superfund Mega Sites – Protectiveness, Remedial Effectiveness and Cost Efficiency”. In the environmental industry, remediation projects often involve dredging of contaminated sediments. The volume of sediment targeted for removal must be estimated before the project begins because it is often used in remedial contracts to develop and monitor project goals and to assess payment. Over- or under-estimating these volumes can have significant adverse impacts on project scope, budget, and schedule. It is therefore imperative to estimate the sediment volumes targeted for removal as accurately as possible. Spatial interpolations of contaminant data form the basis of site management decisions throughout the Remedial Investigation/Feasibility Study process at contaminated Superfund sites.
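To make the volume-estimation step concrete, here is a minimal sketch of the simplest workflow described above: interpolate core concentrations onto a grid (with inverse distance weighting, one of the methods named later in this post) and convert the cells exceeding an action level into a dredge volume. All specifics here are hypothetical illustrations, not data from any actual site: the core locations, the concentrations, the 5 mg/kg action level, and the assumption of a uniform 0.5 m cut depth.

```python
import numpy as np

def idw(xy_obs, z_obs, xy_grid, power=2.0):
    """Inverse-distance-weighted interpolation of point concentrations."""
    # Pairwise distances: (n_grid, n_obs)
    d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)  # avoid division by zero at sample locations
    w = 1.0 / d ** power
    return (w * z_obs).sum(axis=1) / w.sum(axis=1)

# Hypothetical data: 5 sediment cores with contaminant concentrations (mg/kg)
xy_obs = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 5]], dtype=float)
z_obs = np.array([2.0, 8.0, 1.0, 12.0, 6.0])

# Interpolate onto a 1 m grid over the site
gx, gy = np.meshgrid(np.arange(0, 11), np.arange(0, 11))
xy_grid = np.column_stack([gx.ravel(), gy.ravel()]).astype(float)
z_hat = idw(xy_obs, z_obs, xy_grid)

# Dredge volume: cells above the action level, times area and cut depth
cell_area = 1.0     # m^2 per grid cell
dredge_depth = 0.5  # assumed uniform removal depth, m
volume = (z_hat > 5.0).sum() * cell_area * dredge_depth
```

Note that this deterministic estimate yields a single number with no uncertainty attached, which is precisely the limitation that motivates a geostatistical treatment: a simulation-based approach would instead produce a distribution of plausible volumes.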

Based on the scientific literature, geostatistics is regarded as a well-accepted methodology for site characterization in environmental studies. Stepping out of the classroom and academic conferences, it might thus be shocking to realize that most remediation projects in the United States still rely on estimates obtained using Thiessen polygons or inverse distance methods, with little attention to uncertainty or accuracy assessment. Common issues that complicate the geostatistical modeling of contaminated sites include: 1) preferential sampling of the most contaminated areas, leading to biased sample statistics and unstable variogram estimates; 2) the complex geometry of the site (e.g., meandering rivers), which prohibits the use of Euclidean distances; 3) large proportions of non-detects and strongly asymmetric histograms that deviate from the lognormal model; 4) the wide range of core lengths (measurement support), which calls into question the equal weighting of observations in the analysis; 5) the orders-of-magnitude difference between the spatial support of the data (e.g., a 1-foot core) and the spatial support used for remediation decisions (e.g., cubic-meter blocks); and 6) the nature of the remediation process, which requires processing three-dimensional concentration models to estimate the maximum depth at which contaminant concentrations exceed the thresholds of concern.
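The first issue, preferential sampling of hot spots, is classically mitigated by cell declustering before computing summary statistics or fitting variograms: each sample is weighted inversely to the number of samples sharing its grid cell, so densely sampled areas do not dominate. The sketch below uses hypothetical core coordinates and concentrations, and a cell size chosen purely for illustration.

```python
import numpy as np

def cell_declustering_weights(xy, cell_size):
    """Cell-declustering weights: down-weight samples that share a grid
    cell with many others, so clustered (preferentially sampled) areas
    do not bias the sample statistics. Weights sum to 1."""
    cells = np.floor(xy / cell_size).astype(int)
    # Count how many samples fall in each occupied cell
    _, inv, counts = np.unique(cells, axis=0,
                               return_inverse=True, return_counts=True)
    w = 1.0 / counts[inv]
    return w / w.sum()

# Hypothetical layout: 4 cores clustered in a hot spot plus 2 isolated cores
xy = np.array([[1.0, 1.0], [1.2, 1.1], [0.8, 0.9], [1.1, 0.8],
               [10.0, 10.0], [20.0, 5.0]])
z = np.array([50.0, 60.0, 55.0, 45.0, 2.0, 3.0])  # concentrations (mg/kg)

w = cell_declustering_weights(xy, cell_size=5.0)
naive_mean = z.mean()              # biased high by the clustered hot spot
declustered_mean = (w * z).sum()   # down-weights the 4 clustered cores
```

The declustered mean falls well below the naive mean because the four hot-spot cores, which all land in one cell, collectively receive the same weight as a single isolated core. The result is sensitive to the cell size, which in practice is varied over a range and chosen where the declustered mean stabilizes.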

In my paper, “Geostatistical Estimation of Contaminated Sediment Volumes: Review of Common Challenges and Solutions”, I shared my recent experience as a consultant for environmental firms and discussed the main challenges associated with the use of geostatistics in that industry.

© 2018 BioMedware | P.O. Box 1577, Ann Arbor, MI 48106 | Phone & Fax: (734) 913-1098