The term I use for bad data interpreted badly is zombie data, and there is a lot of it around here. Zombie data is dead on arrival, but for some reason, it is able to wander around un-dead for years. And it usually ends up eating people and their property before meeting its ultimate demise. Unfortunately for us, there is a cottage industry of non-profits and quangos who generate hordes of this crazy stuff, usually funded by grants. Data collection is great, but it has to be done by people who know what they're doing, and then the collection methods and interpretations need impartial peer review. We should take the approach that all data and interpretations are suspect, not definitive, until/unless subjected to the highest standards of independent quality review (not consensus-building review, but authentic independent critique).
Our Best Available Science (BAS) and BAS synthesis are so full of zombie data that we might as well call it zomBAS. This doesn't stop our Council members from believing it, however. They can't tell dead data from living data. Others can. Below are some excerpts of critiques of our zomBAS.
Email Excerpts -- Barbara Bentley PhD to the Council about BAS
Please re-read all these references again to make sure that the data actually support their conclusions. Most of them are unpublished in-house reports without peer review. From my reading, even “anecdotal” is a stretch for some of them.
... The documents of greatest concern are those produced by Kwiaht. I have worked with Russel in the field as well as read through those documents that I can get a copy of. Frankly, I’m NOT impressed with the methods, the data, the analyses, or the unfounded conclusions. His 2008 and 2010 “papers” are unpublished manuscripts, and both have very serious flaws, which I described in my letter to the Council; I would be willing to talk with you in more detail if you wish. Others in the same category include Brennan, Moulton, Wyllie-Echeverria (2008).
... I also have some concerns about oral presentations. Details of methods and analyses are often not part of a PowerPoint presentation, and thus the validity of the data cannot be assessed. These include Beamer, Canning, Whitman, Wyllie-Echeverria.
... And finally, it makes me more than a bit nervous to have “circular citations” --- BAS reports citing other BAS reports, such as Herrera, Adamus, and The Watershed Company all citing BAS reports by Herrera.
Excerpts from Tim Verslycke PhD Report about BAS
The buffer sizing procedure proposed in the Wetland and FWHCA Regulations includes two components: a Water Quality-Sensitivity Buffer and a Habitat Buffer. The Water Quality-Sensitivity Buffer sizing procedure is based on whether runoff will flow above- or below-ground, and then relies on an analysis published by Mayer et al. (2007) to determine appropriate buffer widths for a given level of pollutant removal. There are significant technical issues with the proposed procedure:
- The meta-analysis presented in Mayer et al. (2007) found only weak correlations between riparian buffer width and nitrogen removal, indicating that factors other than buffer width influence buffer effectiveness. Furthermore, Mayer et al. (2007) found that only herbaceous buffers were more effective at removing nitrogen when wider, whereas no statistically significant relationship was found between nitrogen removal and width of forested, forested/wetland, and wetland buffers. As a result, the proposed buffer sizing procedure rests on weak correlations between buffer width and removal of a single pollutant (nitrogen), observed only for one buffer type (herbaceous); it is therefore not reflective of best available science; and
... As an example, two studies by Barsh et al. are cited throughout the Best Available Science Synthesis document (San Juan County, 2011) as providing evidence of chemical contamination (e.g., pesticides, surfactants) in San Juan County surface waters (Barsh et al., 2008) and a localized freshwater fish population (Barsh et al., 2010). It is my opinion that both studies are of limited value and should not be used to infer the presence of ecological risks. Both studies rely on commercial kits to analyze chemical concentrations. While these kits are useful screening tools and may inform the need for further study, they are no substitute for properly conducted monitoring studies using standard analytical methodologies and regulatory field sampling protocols (including appropriate data quality controls, such as field replicates, field blanks, matrix spikes, etc.).
- The statement that "nitrogen is one of the most difficult contaminants to remove, and if buffers will adequately remove nitrogen it is likely to remove many other contaminants" (San Juan County Council, 2012a, p. 7) is a significant oversimplification of the current state-of-the-science of chemical fate and transport and buffer effects on water quality (e.g., as summarized in San Juan County, 2011, pp. 58-60) and is not reflective of best available science.
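The weak-correlation objection above has a simple mechanical core: if buffer width only weakly correlates with pollutant removal, then width explains very little of the variance in removal, and a sizing table keyed to width is built on sand. The sketch below illustrates this with entirely hypothetical numbers (not values from Mayer et al. 2007); the widths and removal percentages are invented solely to show what a weak correlation looks like arithmetically.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical buffer widths (m) and nitrogen-removal efficiencies (%).
# Invented for illustration -- NOT data from any cited study.
widths = [10, 20, 30, 50, 75, 100]
removal = [55, 40, 70, 45, 80, 50]

r = pearson_r(widths, removal)
print(f"r = {r:.2f}, r^2 = {r * r:.2f}")
# With scatter like this, width explains only a few percent of the
# variance in removal -- a poor basis for prescribing specific widths.
```

The quantity to watch is r squared, the fraction of variance in removal explained by width; when it is near zero, any procedure that maps a desired removal level to a required width is attributing to width an effect the data do not support.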