Developing the Report
The process for preparing condition reports combines accepted techniques for collecting and interpreting information gathered from subject matter experts. The approach varies somewhat from sanctuary to sanctuary to accommodate differing styles of working with partners. The Channel Islands National Marine Sanctuary approach was closely related to the Delphi Method, a technique designed to organize group communication among a panel of geographically dispersed experts using questionnaires, ultimately facilitating the formation of a group judgment. This method can be applied when decision-makers must combine the testimony of a group of experts, whether in the form of facts, informed opinion, or both, into a single useful statement.
The Delphi Method relies on repeated interactions with experts who respond to questions with a limited number of choices to arrive at the best-supported answers. Feedback to the experts allows them to refine their views, gradually moving the group toward consensus. For condition reports, the Office of National Marine Sanctuaries (ONMS) uses 17 questions related to the status and trends of sanctuary resources, each with an accompanying description and five possible choices that describe resource conditions.
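For readers who prefer a concrete picture of this structure, the minimal sketch below models one question with a fixed set of status choices in Python. The five labels, field names, and validation rule are illustrative assumptions only; the authoritative wording of the questions and ratings is given in Appendix A.

```python
from dataclasses import dataclass, field

# Illustrative assumption: five status choices spanning "good" to "poor".
# The authoritative labels and standardized descriptions are defined in
# Appendix A of each condition report.
STATUS_CHOICES = ("good", "good/fair", "fair", "fair/poor", "poor")

@dataclass
class ConditionQuestion:
    """One of the 17 standardized status-and-trend questions (hypothetical model)."""
    number: int          # question number, 1-17
    text: str            # the question as posed to the experts
    responses: list = field(default_factory=list)  # one status rating per expert

    def add_response(self, status: str) -> None:
        """Record one expert's status rating, rejecting unknown labels."""
        if status not in STATUS_CHOICES:
            raise ValueError(f"unknown status rating: {status!r}")
        self.responses.append(status)
```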
To address the 17 questions, sanctuary staff selected and consulted outside experts familiar with water quality, living resources, habitat, and maritime archaeological resources. Experts represented various affiliations including California Department of Fish and Game, Carter Biological Consulting, Cascadia Research, Channel Islands National Marine Sanctuary, Channel Islands National Park, Partnership for Interdisciplinary Studies of Coastal Oceans, Reef Environmental Education Foundation, Santa Barbara Channelkeeper, Southern California Coastal Water Research Project, and University of California Santa Barbara.
Expert opinion was solicited electronically and through one-on-one contact via phone calls and/or emails. Background material was provided to the experts in order to develop a consistent understanding of the project and the questions. Experts were asked to use Appendix A, which accompanies every report, to guide their responses. Appendix A clarifies the set of questions and presents standardized statements that are used to describe the status and assign a corresponding color code on a scale from “good” to “poor.” These statements are customized for each question.
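As a companion to the sketch above, the following illustrates the kind of status-to-color mapping that Appendix A describes. The intermediate labels and the hex values are placeholders assumed for demonstration; the actual statements and color codes are specified in the appendix itself.

```python
# Illustrative assumption: each standardized status rating maps to a
# color code on a scale from "good" to "poor". The hex values below are
# placeholders; the real codes are specified in Appendix A.
STATUS_COLORS = {
    "good":      "#008000",  # green end of the scale
    "good/fair": "#9acd32",
    "fair":      "#ffff00",
    "fair/poor": "#ffa500",
    "poor":      "#ff0000",  # red end of the scale
}

def color_for(status: str) -> str:
    """Return the (assumed) color code for a standardized status rating."""
    return STATUS_COLORS[status.lower()]
```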
During the initial request for responses to the questions, a total of 28 experts were contacted and 13 responded. They were asked to rate resource status and trends based on the guidance provided and to submit supplemental comments, data, graphics, literature citations, website links and other relevant information.
The combined input of all experts was considered by a writing team composed of individuals from the sanctuary and the national office. The team tallied and discussed the ratings and accompanying comments, and summarized the input in a written draft that included a proposed status rating and a proposed trend for each question. The initial ratings represented agreement by the writing team based on interpretation of quantitative and, when necessary, non-quantitative expert input, as well as other available information, such as assessments and observations of scientists, managers and users. In some cases, input was not used because it was either not relevant to the question it accompanied or too narrowly focused to address it. Nevertheless, the ratings and text are intended to summarize the opinions and uncertainty expressed by the experts, who based their input on knowledge and perceptions of local conditions. Comments and citations received from the experts were included, as appropriate, in the text supporting the ratings.
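The report describes this tally as a deliberative starting point rather than a fixed formula, so the sketch below shows only one simple way a writing team might summarize expert responses, taking the most common rating as the initial proposal. The plurality rule and the function name are assumptions, not the sanctuary’s actual procedure.

```python
from collections import Counter

def tally_responses(responses: list[str]) -> tuple[str, Counter]:
    """Summarize expert status ratings as a starting point for discussion.

    Returns the most common rating and the full tally. This plurality
    rule is an illustrative assumption; in practice the writing team
    weighed comments, data, and relevance, not just vote counts.
    """
    tally = Counter(responses)
    proposed, _count = tally.most_common(1)[0]
    return proposed, tally

# Six hypothetical expert responses to a single question.
proposed, tally = tally_responses(
    ["fair", "fair", "fair/poor", "good/fair", "fair", "fair/poor"]
)
print(proposed)  # fair -- the proposed status rating for writing-team review
print(tally)     # Counter({'fair': 3, 'fair/poor': 2, 'good/fair': 1})
```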
This draft document was sent back to the subject experts for what was called an initial review, a 21-day period during which they could ensure that the report accurately reflected their input, identify information gaps, and provide comments or suggest revisions to the ratings and text. Upon receiving those comments, the writing team revised the text and ratings as it deemed appropriate. The final interpretation, ratings, and text in the draft condition report were the responsibility of sanctuary staff, with final approval by the sanctuary manager. To emphasize this important point, authorship of the report is attributed to the sanctuary alone. Subject experts were not authors, though their efforts and affiliations are acknowledged in the report.
The second phase of review, called invited review, involved particularly important partners in research and resource management, including the sanctuary’s Research Activities Panel, the Sanctuary Advisory Council, NOAA’s Marine Debris Program, and NOAA’s National Marine Fisheries Service. These bodies were asked to review the technical merits of the resource ratings and accompanying text, and to point out any omissions or factual errors. The comments and recommendations of invited reviewers were received, considered by sanctuary staff, and incorporated, as appropriate, into a final draft document.
A draft final report was then sent to John Butler, NOAA’s National Marine Fisheries Service; Don Morris, National Park Service (retired); and Jen Smith, Scripps Institution of Oceanography, University of California San Diego, who served as external peer reviewers.

This external peer review is a requirement that dates to December 2004, when the White House Office of Management and Budget (OMB) issued a Final Information Quality Bulletin for Peer Review (OMB Bulletin) establishing peer review standards intended to enhance the quality and credibility of the federal government’s scientific information. Among other categories of information, these standards apply to Influential Scientific Information, defined as information that can reasonably be determined to have a “clear and substantial impact on important public policies or private sector decisions.” Because condition reports are considered Influential Scientific Information, they are subject to the review requirements of both the Information Quality Act and the OMB Bulletin. Accordingly, each condition report is reviewed by a minimum of three individuals who are experts in their field, were not involved in developing the report, and are not ONMS employees. Comments from these peer reviews were incorporated into the final text of the report. OMB Bulletin guidelines also require that reviewer comments, names, and affiliations be posted on the agency website; comments, however, are not attributed to specific individuals. Reviewer comments are posted at the same time as the formatted final document.