An analysis of water, sediment and seafood samples taken in 2010 during and after the oil spill in the Gulf of Mexico has found higher contamination levels in some cases than previous studies by federal agencies did, casting doubt on some of the earlier sampling methods.
The lead author, Paul W. Sammarco of the Louisiana Universities Marine Consortium, said that dispersants used to break up the oil might have affected some of the samples. He said that the greater contamination called into question the timing of decisions by the National Oceanic and Atmospheric Administration to reopen gulf fisheries after the spill and that "it might be time to review the techniques that are used to determine" such reopenings.
Eleven workers died and roughly 200 million gallons of crude oil gushed into the gulf after a blowout at an exploratory well owned by BP caused the Deepwater Horizon drilling rig to explode on April 20, 2010. Nearly two million gallons of Corexit, a dispersant, were sprayed on the surface or injected into the oil plume near the wellhead.
In all, more than 88,000 square miles of federal waters were closed to commercial and recreational fishing. Some areas were reopened before the well was capped three months after the blowout; the last areas were reopened a year after the disaster.
Like other studies after the spill, the new analysis, published last week in the journal Marine Pollution Bulletin, found that components of oil were distributed along the Gulf Coast as far west as Galveston, Tex. — about 300 miles from the well site — and southeast to the Florida Keys.
But the study found higher levels of many oil-related compounds than earlier studies by NOAA scientists and others, particularly in seawater and sediment. The compounds studied included polycyclic aromatic hydrocarbons, some of which are classified as probably carcinogenic, and volatile organic compounds, which can affect the immune and nervous systems.
"When the numbers first started coming in, I thought these looked awfully high," Dr. Sammarco said, referring to the data he analyzed, which came from samples that he and other researchers had collected. Then he looked at the NOAA data. "Their numbers were very low," he said. "I thought, what is going on here? It didn't make sense."
Read the full story at The New York Times.
National Fisherman Live: 9/9/14
In this episode:
Seafood Watch upgrades status of 21 fish species
Calif. bill attacking seafood mislabeling approved
Ballot item would protect Bristol Bay salmon
NOAA closes cod, yellowtail fishing areas
Pacific panel halves young bluefin harvest
National Fisherman Live: 8/26/14
In this episode, National Fisherman Publisher Jerry Fraser talks about his early days dragging for redfish on the Vandal.
More than a dozen higher education institutions and federal and local fishery management agencies and organizations in American Samoa, Guam, the Commonwealth of the Northern Mariana Islands and Hawaii have signed a memorandum of understanding aimed at building the capacity of the U.S. Pacific Island territories to manage their fisheries and fishery-related resources.