Marine Pollution Analysis: The Deepwater Horizon Oil Spill

Article

The Column

20 February 2014
Volume 10
Issue 3

Chris Reddy from the Woods Hole Oceanographic Institution (WHOI) spoke to Alasdair Matheson of The Column about the role of chromatography in the ongoing environmental analysis of the Deepwater Horizon oil spill, how GC×GC works in practice, and why this oil spill led to the return of thin layer chromatography (TLC) to his laboratory.

Q: Tell us about your group's involvement in the work at the Deepwater Horizon disaster site, which has attracted widespread attention in the media. What were the objectives?

A: In April of 2010, the Deepwater Horizon (DWH) drilling rig exploded and released approximately 200 million gallons of crude oil along with a large quantity of methane, ethane, and propane.


Photo courtesy of Chris Reddy, Woods Hole Oceanographic Institution

The spill went on for 87 days, and we continue to find oil residues along the Gulf beaches as recently as January 2014. We have studied - and continue to study - a wide range of research questions, from determining the flow rate to analyzing how nature breaks down or "weathers" the oil, and fingerprinting oiled samples to confirm that they came from the DWH disaster.

Our field work has ranged from collecting samples with a robot right where the oil was coming out of the pipe - which you may have seen on TV at the time - to walking many miles of the Gulf of Mexico coastline, even 300 or 400 miles away from the explosion. So we have gone from analyzing oil samples a foot away from the source of the spill to hundreds of miles away.

I expect to be working at the site for the next 10 years, alongside some other oil spills and projects.

Q: One of the main techniques you used was comprehensive two-dimensional gas chromatography (GC×GC). Why did you decide to use a GC×GC method and what are the advantages of this technique compared to other methods?

A: My team has extensive experience in tackling some interesting research questions with GC×GC by studying numerous oil spills as well as natural oil seeps. What makes GC×GC so powerful is its capacity to resolve and detect many more compounds than traditional analytical techniques, such as gas chromatography with mass spectrometry (GC–MS).

Now, I want to be very clear here. A lot of people hear me say GC×GC can do more than GC–MS, and they immediately assume that GC×GC is a replacement for GC–MS, but it is not. It is just another tool in the laboratory that allows us to address some specific questions where a regular benchtop GC–MS cannot.

On the other hand, for polycyclic aromatic hydrocarbons (PAHs), I don't think GC×GC can do any better than a benchtop GC–MS, and the GC–MS software is much, much more user-friendly. And so, in my lab, we don't quantify PAHs by GC×GC. There's no point. It's easier and faster to do so with a GC–MS.

One of the main factors that makes GC×GC so powerful - one that I think a lot of people miss - is that when you look at a chromatogram in two-dimensional GC space you are not only able to identify and measure compounds (many, many more compounds than with traditional techniques), but you can also convert retention times in the first and second dimensions into vapour pressure and water solubility.

If you're interested in the fate of oil, there are two key questions you want to answer: What is the vapour pressure of a compound (how likely is it to evaporate?), and what is its water solubility (how likely is it to dissolve in water?)

Now, using some newly developed algorithms, we can allocate how much of a compound evaporated versus how much dissolved in water. That is the real major leap in my mind: GC×GC allows us to discover where the compounds are going. It's beyond just making your Excel spreadsheet bigger and identifying many more compounds. It allows us to say where a compound is going, or where it has gone.
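
The conversion from retention times to physical properties, and the split of observed losses between evaporation and dissolution, can be sketched in code. The calibration coefficients, function names, and the proportional allocation rule below are hypothetical placeholders rather than the published algorithms Reddy's group uses; the sketch is only meant to show the shape of the calculation.

```python
# A minimal sketch, assuming linear empirical calibrations between GC×GC
# retention times and physical properties. Coefficients and the allocation
# rule are illustrative assumptions, not the published algorithms.

def log_vapour_pressure(rt1_min, a=-0.12, b=2.5):
    """Estimate log10 vapour pressure from the first-dimension retention time
    (minutes), using a linear fit that would be calibrated on reference compounds."""
    return a * rt1_min + b

def log_water_solubility(rt1_min, rt2_s, c=-0.05, d=-0.9, e=1.0):
    """Estimate log10 aqueous solubility from both retention dimensions
    (first dimension in minutes, second in seconds); coefficients are placeholders."""
    return c * rt1_min + d * rt2_s + e

def allocate_loss(fraction_lost, log_pvap, log_sw):
    """Split a compound's observed fractional loss between evaporation and
    dissolution in proportion to its volatility and solubility (a simplification)."""
    w_evap = 10.0 ** log_pvap
    w_diss = 10.0 ** log_sw
    total = w_evap + w_diss
    return fraction_lost * w_evap / total, fraction_lost * w_diss / total

# Example: a compound eluting at 35 min (first dimension) and 2.1 s (second
# dimension) that shows a 40% loss relative to the unweathered source oil.
pv = log_vapour_pressure(35.0)
sw = log_water_solubility(35.0, 2.1)
evaporated, dissolved = allocate_loss(0.40, pv, sw)
print(f"evaporated: {evaporated:.2f}, dissolved: {dissolved:.2f}")
```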

Q: Can you describe an interesting example of the use of GC×GC on the Deepwater Horizon project?

A: There was a question about some oil residues that were found on corals on the seafloor, not too far from where the well blew out. It was possible that the oiling came from natural seeps in the area rather than from the DWH. To understand the impacts of the spill, it was critical that we could accurately fingerprint these oil residues on the coral. We were able to show - with much more confidence than you could achieve with traditional tools - that the oil was from the Macondo well.
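
One common way such fingerprinting works is by comparing diagnostic ratios of recalcitrant biomarkers (for example, hopanes and steranes) between a field sample and a candidate source oil. The sketch below illustrates that idea; the ratio names, values, and tolerance are hypothetical and are not the actual criteria used in the coral study.

```python
# A minimal sketch of source fingerprinting by diagnostic biomarker ratios.
# Ratio names, values, and tolerance are illustrative assumptions only.

def matches_source(sample_ratios, source_ratios, rel_tolerance=0.05):
    """Return True if every diagnostic ratio in the sample agrees with the
    candidate source oil within the given relative tolerance."""
    for name, source_value in source_ratios.items():
        if abs(sample_ratios[name] - source_value) > rel_tolerance * abs(source_value):
            return False
    return True

# Hypothetical diagnostic ratios for the source oil and a coral residue.
macondo = {"C29/C30 hopane": 0.48, "Ts/Tm": 1.10, "C27/C29 sterane": 0.95}
coral_residue = {"C29/C30 hopane": 0.47, "Ts/Tm": 1.12, "C27/C29 sterane": 0.96}

print(matches_source(coral_residue, macondo))  # True: consistent with the candidate source
```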

Q: Are there any particular difficulties you have to overcome when developing GC×GC methods?

A: I think that the reason that GC×GC is not more popular and has not fully matured, despite being around for decades, is that it is hard to get it up and running day in and day out. This is counter to your typical scientist who might start a job in a new laboratory, buy a benchtop GC–MS, and be obtaining publication-quality data the next day.

The cost of GC×GC with a flame ionization detector (FID) is comparable to a GC–MS, but it might take you months to consistently achieve adequate and reproducible chromatograms.

Even then, the basic maintenance is a little tricky because there is a lot more potential for leaks, so the learning curve is a lot steeper - what's more, there aren't many manuals available on troubleshooting. This sounds negative, but I think it's just that the technology hasn't matured; it's probably where GC–MS was in the late 1960s.

To me, maturity will come from more users contributing to the field and manufacturers developing better and easier-to-use hardware and software. The more users there are, the more everybody learns the tricks, and the more we can cross-pollinate ideas so that the technique becomes more user-friendly.

Q: So, you see it evolving and being easier to use as time progresses? Do you think it could be used for routine analysis in the future?

A: Absolutely. It would be in the best interest of the manufacturers to help it evolve faster, and I know they are working hard on this at the moment. I truly hope it becomes a mainstream rather than a boutique tool. I have been really lucky to have a colleague, Bob Nelson, working in my lab for the last 12 years, and all he has done is GC×GC.

For routine analysis, the one thing that needs to evolve is the software to interpret the data. It's a case of "be careful what you wish for: you might get it!" You get a lot of resolved peaks and a lot of information, but the developments in hardware are way ahead of the software - integrating peaks and similar data analysis is therefore not as easy as it is with a regular benchtop GC–MS. The software exists but it needs to be refined, and that's what I also tell potential users - give it time and look for ways to improve it. I think there is a lot of low-hanging fruit, and a lot of talented people can make this technology even better.

Q: Is there anything you would like to add about the use of GC×GC in your group?

A: There's one more powerful thing about GC×GC that we use a lot: we can do algebraic operations - mathematical operations - on a whole chromatogram. So let's make-believe you have a sample of the oil that was in a ship before it got spilled. Then we might collect a slick sample a few days later, a couple of kilometres away. We can analyze each sample with the exact same method, align the chromatograms, and then subtract one from the other. We get a "difference" chromatogram, and that "difference" chromatogram can tell you, across the whole gas chromatographic plane, what compounds have already been lost to the environment.
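
As a concrete, if simplified, picture of that subtraction, here is a minimal sketch assuming each run has already been exported as a 2D intensity array and normalized to a conserved marker. The array shapes and the crude shift-based alignment are illustrative assumptions, not the group's actual processing pipeline.

```python
import numpy as np

# A minimal "difference" chromatogram sketch. Rows = second-dimension retention
# time, columns = first-dimension retention time; both runs are assumed to be
# normalized to a conserved internal marker before subtraction.

def difference_chromatogram(source, weathered, shift_cols=0):
    """Subtract an aligned weathered-sample chromatogram from the source-oil
    chromatogram; positive values mark compounds lost to the environment."""
    aligned = np.roll(weathered, shift_cols, axis=1)  # crude first-dimension alignment
    return source - aligned

# Example with synthetic chromatograms (400 modulations x 1000 points each).
rng = np.random.default_rng(0)
source = rng.random((400, 1000))
weathered = source.copy()
weathered[:, :200] *= 0.3  # pretend the volatile, early-eluting region evaporated

diff = difference_chromatogram(source, weathered)
print(diff[:, :200].mean() > diff[:, 200:].mean())  # True: losses concentrated early
```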

Q: Any advice for chromatographers who are embarking on using GC×GC?

A: My advice, if you buy one of these instruments, is to be patient and willing to invest the time to get it going. You're not going to see a return in a day, but that doesn't mean you won't. You just have to recognize that this is an instrument that can sometimes be a little counter-intuitive and has its own challenges, but they are no different from the challenges of other techniques that haven't yet gone mainstream. I think the best advice for a good analytical chemist is to give it a little bit more time - and to ask their bosses for a little bit more time when they get the technology - and they will reap the benefits. Patience is a virtue.

Q: Are there any other chromatographic techniques that you think are particularly interesting in environmental analysis at the moment?

A: You know, it's funny. One of the pieces of technology that we have found to be incredibly powerful is thin layer chromatography with a flame ionization detector (TLC–FID). This is an automated instrument that has been around for decades. It has been invaluable for looking at compounds that are not GC-amenable - compounds that are either thermally unstable or, as we say, too sticky to go through a gas chromatograph.

This approach has become useful for the DWH project because we believe a lot of biology or photochemistry has oxidized some of the oil samples, making them difficult to analyze by any type of GC, but easy to separate and quantify on a TLC–FID. So I think there's some irony that we have resurrected a piece of technology that was sitting in my building, and now there's a queue of scientists waiting to use it. In fact, we're thinking about buying another one. So there's an interesting spin on using something that's been around for a long time that we really think has a lot of power. It's very easy to use and the learning curve is gentle. We have already published several papers using it. So it's an interesting end of the spectrum from a chromatographic perspective.

Q: It seems simplicity sometimes has a lot to offer.

A: Indeed. It's really funny - I only knew how to use it because my PhD advisor, Jim Quinn, was formally trained in lipid analysis and did a lot of work on pollution. He's still alive and a great guy. He had a project going on where somebody was looking at lipid chemistry, so when I was in the lab I got to see people using it. When we started to think about how we might elucidate what was going on at the Deepwater Horizon site, I wondered if we could use TLC–FID.

We found one in our building - we have a lot of chemists and biochemists, so naturally somebody had one on a shelf. And lo and behold, we bought some rods and development tanks and we were in business.

Q: Interesting stuff! Your Institution does a lot of brilliant work. Is there anything you'd like to say about the role of separation science at your institute?

A: We do a wide range of chromatography in a relatively small building. We have about six investigators - six professors - and I'd wager that we have maybe 25 mass spectrometers and probably 35 different types of chromatographs.

The analyses run the whole gamut, from looking at methane to inspecting large biopolymers with much, much bigger molecular weights - just about every possible use we have in our building. And it's a lot of fun. We've had a lot of people come to our lab and say, "I've never seen so many chromatographs."

Q: So many professors in one laboratory!

A: Yeah, it's really amazing. To be honest with you, one of our biggest problems is that we're about 100 kilometres away from Boston. Our power supply can sometimes have issues, so we have a lot of problems making sure that our instruments can switch to emergency power. The other challenge that we face, and we're working on it, believe it or not, is the sheer amount of gas and liquid gases that we consume. Right now, we have been bringing in standard gas bottles and liquid nitrogen tanks - to the point where our hallway is lined with tanks and it's a fire hazard.

We also only get gas deliveries on Mondays, Tuesdays, and Fridays. So if you run out of gas at four o'clock on a Friday afternoon and you're trying to run a batch of samples over the weekend, you can't.

That's a huge problem for us. So we've been working with our administration to install a permanently plumbed gas supply that would let us remove all these tanks. It's actually a lot cheaper and safer, and it will allow us to be more efficient. So really, in many respects, our two biggest day-to-day operational challenges are reliable power and gas supply.

Q: How is this perceived from a cost perspective?

A: Good point. We have been working with our administration, and the upfront costs are not cheap. We're in financially tough times. So it is important for us to be able to say, "Look, the return on the investment in this plumbing will come in only a couple of years or so, and it's going to be safer." We're not going to have so many bottles in the hallway. We're not going to be relying on all this traffic. Big diesel trucks have to deliver these tanks from a service depot about 80 miles away, so there is a lot of traffic to get them here, and our building is small. So there are a lot of challenges in getting these instruments working day in and day out with the bare necessities of electricity and gas - as well as the separation science itself.

Chris Reddy is a senior scientist at Woods Hole Oceanographic Institution, Woods Hole, Massachusetts, USA. He is currently studying the short- and long-term fate of oil seeping off the coast of Santa Barbara, California, and in the Gulf of Mexico; World War II wrecks in the South Pacific; spills that occurred in 1969, 1974, 1996, and 2003 in New England; two that occurred in 2007 in San Francisco Bay and South Korea; the Exxon Valdez; the 2002 Prestige spill along the Spanish coastline; and the Deepwater Horizon.


Chris Reddy

According to a 2010 survey by Thomson Reuters, he is one of the most-cited and most-published scientists studying oil spill effects, remediation methods, and petroleum microbiology. He has extensive experience with the Deepwater Horizon, including serving as the academic liaison at the Unified Area Command, where he interacted with and provided guidance to state, federal, and BP officials.

He has led or participated in two major research cruises on the DWH, many small boat operations, and overflights, and has sampled the beaches of the Gulf from Pensacola, Florida, to Port Fourchon, Louisiana, countless times.

He received his BS in chemistry from Rhode Island College, Providence, Rhode Island, USA, and his PhD in chemical oceanography from the Graduate School of Oceanography at the University of Rhode Island (Narragansett, Rhode Island, USA).

E-mail: creddy@whoi.edu

Website: http://www.whoi.edu
