An analysis of water, sediment and seafood samples taken in 2010 during and after the oil spill in the Gulf of Mexico has found higher contamination levels in some cases than previous studies by federal agencies did, casting doubt on some of the earlier sampling methods.
The lead author, Paul W. Sammarco of the Louisiana Universities Marine Consortium, said that dispersants used to break up the oil might have affected some of the samples. He said that the greater contamination called into question the timing of decisions by the National Oceanic and Atmospheric Administration to reopen gulf fisheries after the spill and that "it might be time to review the techniques that are used to determine" such reopenings.
Eleven workers died and roughly 200 million gallons of crude oil gushed into the gulf after a blowout at an exploratory well owned by BP caused the Deepwater Horizon drilling rig to explode on April 20, 2010. Nearly two million gallons of Corexit, a dispersant, were sprayed on the surface or injected into the oil plume near the wellhead.
In all, more than 88,000 square miles of federal waters were closed to commercial and recreational fishing. Some areas were reopened before the well was capped three months after the blowout; the last areas were reopened a year after the disaster.
Like other studies after the spill, the new analysis, published last week in the journal Marine Pollution Bulletin, found that components of oil were distributed along the Gulf Coast as far west as Galveston, Tex. — about 300 miles from the well site — and southeast to the Florida Keys.
But the study found higher levels of many oil-related compounds than earlier studies by NOAA scientists and others, particularly in seawater and sediment. The compounds studied included polycyclic aromatic hydrocarbons, some of which are classified as probably carcinogenic, and volatile organic compounds, which can affect the immune and nervous systems.
"When the numbers first started coming in, I thought these looked awfully high," Dr. Sammarco said, referring to the data he analyzed, which came from samples that he and other researchers had collected. Then he looked at the NOAA data. "Their numbers were very low," he said, "I thought what is going on here? It didn't make sense."
Read the full story at The New York Times >>