[Updates, Feb. 23-24: I have added extensive discussion “below the fold”, starting with the section entitled GMU Process. The summary has been updated with additional links to side-by-side comparisons to enable readers to make their own judgments.]
Dan Vergano of USA Today reports on an “all faculty” announcement from George Mason University concerning the outcome of two faculty committee investigations of plagiarism charges against GMU statistics professor Edward Wegman.
One investigation concerned a 2008 article by Wegman protege Yasmin Said, Wegman himself and two others in Computational Statistics & Data Analysis (CSDA). The committee upheld CSDA's previous plagiarism finding; as “team leader”, Wegman was found to bear responsibility and has been asked to retract the article and apologize to CSDA’s editor. GMU has also issued an official letter of reprimand confirming that finding of research misconduct.
A separate GMU committee investigated the 2006 congressional report commonly known as the Wegman Report, a critique of the Mann-Bradley-Hughes “hockey stick” reconstruction. That investigation held that “no scientific misconduct was involved”, only “extensive paraphrasing of another work” that was “referenced repeatedly”. [That finding holds that there was no plagiarism in Wegman Report background material derived from Raymond Bradley’s Paleoclimatology; readers may judge side-by-side comparisons of the passages on tree-rings and ice core and coral proxies for themselves]. However, in a bizarre twist, it appears that the committee did not even consider side-by-side comparison of the Wegman Report’s long and unreferenced background section on social network analysis, part of which was reused in the later CSDA article and gave rise to the plagiarism finding in the other GMU case!
The two Wegman misconduct cases originated with a confidential complaint from “hockey stick” co-author Raymond Bradley in March 2010. It was based on my revelations concerning apparent copying from Bradley’s own work in Wegman Report background subsections on tree-ring and ice core temperature proxies.
A month later, Bradley updated GMU with my additional evidence concerning the report’s lengthy background section on social networks, which appeared to be copied from unattributed sources including two textbooks (among them Wasserman and Faust’s classic Social Network Analysis), as well as the Wikipedia article on the subject. Bradley also pointed out my discovery that some of that material was reused in the subsequent 2008 CSDA article by Wegman, Said and two other GMU co-authors.
The GMU statement from provost Peter Stearns says the case “received wide publicity … inappropriately”. He also defended long delays as a result of both “federal requirements” and “due process”. However, Bradley went public only after GMU missed deadline after deadline. And the initial inquiry phase even appears to have lasted beyond May 2011, when the CSDA retraction of Said et al 2008 was announced. That occasioned a Nature editorial lambasting GMU for the unconscionable delay. All of that seems a flagrant violation of federally mandated timelines, which require the inquiry phase to conclude within a few months.
The decision to separate the initial inquiry into two separate inquiries is also highly questionable, since the CSDA case involved a virtually identical subset (less than half, in fact) of the unattributed social network material in the Wegman Report. [In a comment, “Rob” points out that the separate inquiries could make sense, especially as the CSDA article was also reported in a separate complaint to the NIH Office of Research Integrity, and was not immediately passed on to GMU. It is not known when the CSDA inquiry began.]
Even worse, as we shall see, there is strong evidence that the Wegman Report inquiry did not even consider this social network analysis material in its deliberations. If so, the GMU process went off the rails almost at its very start.
The finding of research misconduct in the CSDA article was obvious, especially as the investigation followed the journal’s own retraction and finding of plagiarism nine months ago. GMU has attempted to minimize the seriousness of the finding, referring to plagiarism occurring in a “contextual section” as a result of “poor judgment”. Despite that, this is an indelible black mark and will no doubt have further repercussions for Wegman and indeed GMU’s overall reputation. Much will depend on the subsequent automatic review by the Office of Research Integrity, which oversees much federally funded work.
The summary of the decision in the Wegman Report case is worth quoting in full.
The committee investigating the congressional report has concluded that no scientific misconduct was involved. Extensive paraphrasing of another work did occur, in a background section, but the work was repeatedly referenced and the committee found that the paraphrasing did not constitute misconduct. This was a unanimous finding.
The reference to “extensive paraphrasing of another work” that was referenced “repeatedly” inevitably implies that the committee considered only the issue of whether passages derived from Bradley’s 1999 text, Paleoclimatology: Reconstructing Climates of the Quaternary, constituted plagiarism. Even this narrow consideration of a smaller part of the evidence is problematic. The first long paragraph in the tree-ring passage is virtually identical to Bradley’s text, and the citation at the end of the third paragraph – a page away – does not attribute the preceding text and only applies to the calibration step in any event. Moreover, the sub-section on ice cores and corals also contains many phrases identical to Bradley, and has no attribution whatsoever.
But all this pales beside the apparent failure to evaluate another background section, that on social network analysis. As I showed in my discussion of this part of the Wegman Report (a section that ran a full five pages), almost all of that material was virtually identical to three antecedent, and completely unattributed, sources. Those sources are:
- Wikipedia article – Social Networks (January 2, 2006 version) – Available online at Archive.org
- Stanley Wasserman and Katherine Faust, Social Network Analysis: Methods and Applications. New York, Cambridge University Press, 1994.
- Wouter de Nooy, Andrej Mrvar and Vladimir Batagelj, Exploratory Social Network Analysis with Pajek. New York, Cambridge University Press, 2005.
The evidence in the side-by-side comparison is overwhelming. And apparently the other GMU committee agrees (not to mention CSDA itself), even though the CSDA article used less than half of the social network material in the Wegman Report. For greater certainty on this point, here is the side-by-side comparison of the social network material used in the CSDA article with its antecedents. Unlike John Mashey, I didn’t do a three-way comparison and only used one column for the derived text. Why? Because the differences between the CSDA text and the corresponding text in the Wegman Report were so few that the trivial exceptions were easy to identify and note along the way.
The inevitable question, then, is how this investigation committee could possibly have failed to consider this part of the Wegman Report. The most plausible explanation is that the committee was never even asked to consider it, and that the additional information provided by Bradley in April 2010 was not even incorporated in the preceding inquiry report.
And it should be also mentioned that there are still other problematic sections in the Wegman Report, including the remaining background section on PCA and noise models, and several of the summaries in the appendix (the latter analysed by John Mashey). But at least GMU could reasonably claim to be unaware of those problems, unlike the social networks material.
It’s also worth noting that the Wegman Report’s central analysis of the MBH “hockey stick” reconstruction is also deficient, but that’s a story to be continued another time.
Much as GMU would like to “move on” and consider the matter closed, that won’t be possible. For one thing, since the CSDA article was supported by federal funding from the National Institutes of Health, that investigation will necessarily be reviewed by the Office of Research Integrity, which may elect to impose further sanctions.
As for the GMU investigation of misconduct in the Wegman Report, it is not clear (to me, at any rate) whether that would fall under federal review, as the congressional report was not peer-reviewed and was not supported by federal funding.
However, the clear failure to consider evidence presented by the complainant himself, along with the failure to adhere to federally mandated timelines in the inquiry phase, constitutes evidence of a major breach in the GMU response to this complaint. These clear failures of process are more than enough to warrant a complete investigation of the matter by the ORI, or at the very least to send the matter back to GMU for a reconsideration of the evidence. All of it, this time.
And the problems don’t end there. There still has been no apparent consideration of problems in recent PhD dissertations within Wegman’s group. [In a comment, “Rob” notes that possible plagiarism in three dissertations was reported to the GMU Provost in October 2010. GMU has not provided updates, since it is being treated as a “personnel matter”.]
Nor have palpable problems in other work by Wegman and Said been addressed. That list includes two long review articles in the journal they co-edit (along with Wegman Report co-author David Scott), WIREs Computational Statistics, as previously detailed in the following posts:
- Said and Wegman 2009: Suboptimal Scholarship [side-by-side]
- Wegman and Said 2011, part 1 [side-by-side]
- Wegman and Said 2011, part 2
My understanding is that Wiley (the publisher of the WIREs series) is well aware of those issues, but it is not clear at present to what extent they have been addressed. The 2009 WIREs article Roadmap to Optimization, and its apparent unattributed reliance on 13 Wikipedia articles (!) and two other online pieces, was even covered by USA Today’s Vergano in October of last year.
And there is yet more evidence of problems involving Wegman and Said in two separate chapters in the Handbook of Statistics: Data Mining and Data Visualization, edited by Rao et al (and co-edited by Wegman himself). That was published by Elsevier in 2005, thus even predating the Wegman Report.
So while the GMU decisions issued yesterday represent an important milestone along the way, this saga is far from over.