Christopher Reddy, Wired Science, 20 Apr 2012
When the Deepwater Horizon drilling rig exploded two years ago in the Gulf of Mexico, many scientists, including me, stepped outside of the Ivory Tower to study what was an unprecedented — and unintended — environmental experiment. We succeeded in gathering mountains of data, learning all sorts of new things, and advancing science.
But we also failed.
Academic scientists chose the research that most interested us, rather than what may have been most important to the immediate disaster response. We failed to grasp the mechanics of the media. And we struggled with how our data was vetted and whom we could trust with it. Simply put, problems arose when academia did not appreciate the cultures of the other players responding to the spill.
To add to these challenges, we were very much in the fog of war, literally and figuratively. The smell of oil, the sea of orange-brown crude around us, the roaring jets of burning oil, and the hundreds of boats were overwhelming. And on land, the press just kept calling.
Opportunities were missed when others did not understand the academic culture, too.
Unlike most previous oil spills, the ruptured Macondo well spewed oil and gas nearly a mile beneath the surface of the Gulf of Mexico. That was aqua incognita to the oil industry and federal responders, but it was a familiar neighborhood for oceanographers who had been studying the deep sea for decades.
BP and federal officials were under enormous pressure and did little to enlist outside help. Few were aware of what academic scientists could contribute. Nor did they communicate what research would be most useful for them, or provide funds to do it. A month passed before government officials invited academic leaders to a meeting in Washington, D.C., about the spill.
Many scientists were keen to help but did not know whom to contact. In the initial days, they forged ahead without outside direction, and many were awarded rapid-response grants from the National Science Foundation. But they were guided solely by their scientific instincts and information they gleaned on their own and not by what could have helped the overall effort.
We were trying to find Atlantis instead of contributing to solving problems.
Our academic training did not prepare us for the media attention we received, and sometimes liked too much. We did not recognize that the media’s mission to provide immediate, definitive information about unfolding events to an anxious public can limit its ability to be comprehensive and complex. Academia provides us the luxury to move slowly with the goal of perfection. So we had problems explaining uncertainties, and we did not understand the ramifications of our statements to the media.
Time, more than anything else, separated us. The media has hours to make a deadline. We have five to eight years to get tenure.
An example of how this played out was the reporting of oil plumes flowing from the well deep underwater.
Oil generally floats, so in the early days of the spill, scientists were startled to find high levels of hydrocarbons deep in the Gulf and relayed their findings to the media. The scientists hypothesized that high pressure at the depth where the leak occurred was causing some hydrocarbons to flow horizontally away from the well, rather than up to the surface.
The resulting news reports gave the impression that rivers of oil were flowing at the bottom of the sea, potentially killing shrimp and fish that supported the local economy and harming the ecosystem. Government responders and industry had to respond to the press about the plumes, rather than focusing on higher priorities such as capping the well. And the public had to respond to these reports, too. I recall one Gulf resident asking me if he should sell his house and move away.
Many academics, including me, were hard on the scientists who reported the presence of plumes. We thought they had veered from the standards of good science. Their findings were not peer-reviewed. In their communications with the public, they seemed susceptible to the lure of limelight.
But I now recognize the upside. Those scientists awakened the public, and me, to an important and unrecognized phenomenon that needed further study. Soon I was out in the Gulf with cutting-edge technology and a team that, just a few months earlier, had successfully mapped oil and gas seeping naturally from the seafloor near Santa Barbara.
I wish I could say I wasn’t thinking about scooping my peers, confirming the plume, and publishing a top-notch science paper, but that wouldn’t be true. In fact, I called an editor of a journal from the bow of a boat asking him if he was interested in our findings.
A month after the well was capped, we published a study in the journal Science confirming a subsurface plume more than a mile wide and 600 feet high that flowed for miles from the Macondo well at a depth of 3,600 feet. However, this plume was not a river of oil, but rather a layer in the ocean that was enriched in hydrocarbons. Water samples taken from within the plume were crystal clear.
We had just mapped an underwater plume with a one-of-a-kind underwater vehicle carrying a state-of-the-art mass spectrometer. It could be the greatest scientific contribution of my career. But the media wasn’t that interested. They were more concerned with whether the plume was toxic.
We were confused and said to them, “You need to know where the plume is before you can consider harmful effects.” It seemed so simple to us, but it was only newsworthy if the plume, at that time, could harm marine life or the environment.
We had published the study a little more than two months after gathering the data — lightning fast for a scientific paper. But when I was the academic liaison at the oil spill’s headquarters the following month, I learned that those on the front line weren’t impressed by the publication of a paper a month after the crisis was over. Crisis responders often must make decisions on the spot, with imperfect information, even if it is risky.
During a crisis, “peer review is the biggest problem with academia,” Juliette Kayyem, who was an assistant secretary at the Department of Homeland Security during the Deepwater Horizon spill and teaches crisis response at Harvard, told me.
But to release unvetted data is a leap of faith. I observed a very talented junior scientist struggle with this. He was afraid he might not be 100 percent correct, word would get out, and it would affect his tenure decision.
The good news is that most of these problems are avoidable. The many stakeholders involved did not share a common language, timeframe, set of values, or pre-existing relationships. We can take a lesson from Deepwater Horizon and start opening new lines of communication before the next disaster. For example, I have asked around, and many oil spill responders said they would be glad to visit campuses to explain their world.
It’s time for academia to embrace a maxim in crisis management that “a crisis is no time to start exchanging business cards.”