Author Archives: Deep Climate

Open Thread #7

By Deep Climate

Yes, Wegmania continues on and on, but readers no doubt would like to discuss other topics as well.

The return of the Daily Mail’s David Rose and his claims that global warming “has halted” [h/t MapleLeaf] has brought forth a scathing rebuttal from the Guardian’s George Monbiot, apparently with the able assistance of the Climate Science Rapid Response Team. For background on Rose, see my post on Rose’s first coverage of climategate and Mojib Latif’s supposed twenty-year global-cooling prediction, as well as the later gullible regurgitation of Steve McIntyre’s “hid the decline” falsehoods, complete with fake graphs.

Also from the Guardian come tales of Christopher Monckton’s shenanigans [h/t Holly Stick] at the climate conference in Cancun (where the government of Canada has been earning “Fossil of the Day” awards – five so far – and no doubt will garner the overall “Fossil of the Year” award). Meanwhile, the UNFCCC process appears to be in deep trouble, as the political will to support effective action on climate change seems weaker than ever.

Wegman et al miscellany

By Deep Climate

John Mashey has suggested a new thread for general talk about various aspects of the Wegman Report, and I’m happy to oblige. Of course, the immediately preceding Replication and Due Diligence, Wegman Style will remain open for discussion of Wegman et al’s, ahem, statistical analysis. But other Wegman Report discussion should happen here for now, pending further posts (and there are a few in the pipeline).

To get us started, here are excerpts from some interesting comments that came in over the last few days – comments which clearly show that the emerging expert assessments of plagiarism in the Wegman Report have revealed just the tip of the iceberg (sounds like a good title for a future post).

Replication and due diligence, Wegman style

By Deep Climate

Today I continue my examination of the key analysis section of the Wegman report on the Mann et al “hockey stick” temperature reconstruction, which uncritically rehashed Steve McIntyre and Ross McKitrick’s purported demonstration of the extreme biasing effect of Mann et al’s “short-centered” principal component analysis.

First, I’ll fill in some much needed context as an antidote to McIntyre and McKitrick’s misleading focus on Mann et al’s use of principal components analysis (PCA) in data preprocessing of tree-ring proxy networks. Their problematic analysis was compounded by Wegman et al’s refusal to even consider all subsequent peer reviewed commentary – commentary that clearly demonstrated that correction of Mann et al’s “short-centered” PCA had minimal impact on the overall reconstruction.
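For readers unfamiliar with the terminology, the difference between the two centering conventions can be sketched in a few lines of Python. This is a toy illustration only – not Mann et al’s actual code, nor McIntyre and McKitrick’s simulation – and the network size, the AR(1) parameter, and the crude “hockey stick index” are my own choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_proxies, blade = 600, 50, 79

# Toy pseudo-proxy network: persistent AR(1) "red noise", no climate signal.
phi = 0.9
noise = rng.standard_normal((n_years, n_proxies))
proxies = np.zeros_like(noise)
for t in range(1, n_years):
    proxies[t] = phi * proxies[t - 1] + noise[t]

def pc1(data, rows):
    """Leading principal component after centering each series on
    its mean over the rows in `rows` only."""
    centered = data - data[rows].mean(axis=0)
    u, s, _ = np.linalg.svd(centered, full_matrices=False)
    return u[:, 0] * s[0]

full_pc1 = pc1(proxies, np.arange(n_years))                    # conventional centering
short_pc1 = pc1(proxies, np.arange(n_years - blade, n_years))  # "short-centered"

def hsi(pc):
    """Crude hockey-stick index: |blade mean - shaft mean| / shaft sd."""
    shaft, tip = pc[:-blade], pc[-blade:]
    return abs(tip.mean() - shaft.mean()) / shaft.std()

print(f"full-centered HSI:  {hsi(full_pc1):.2f}")
print(f"short-centered HSI: {hsi(short_pc1):.2f}")
```

Conventional PCA centers each series on its full-period mean; the “short-centered” variant centers only on the calibration-era mean, so series whose recent mean happens to depart from the long-term mean are preferentially loaded into PC1 – the mechanism at the heart of the dispute.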

Next, I’ll look at Wegman et al’s “reproduction” of McIntyre and McKitrick’s simulation of Mann et al’s PCA methodology, published in the pair’s 2005 Geophysical Research Letters article, “Hockey sticks, principal components, and spurious significance”. It turns out that the sample leading principal components (PC1s) shown in two key Wegman et al figures were in fact rendered directly from McIntyre and McKitrick’s original archive of simulated “hockey stick” PC1s. Even worse, though, is the astonishing fact that this special collection of “hockey sticks” is not even a random sample of the 10,000 pseudo-proxy PC1s originally produced in the GRL study. Rather, it expressly contains the very top 100 – one percent – having the most pronounced upward blade. Thus, McIntyre and McKitrick’s original Fig 1-1, mechanically reproduced by Wegman et al, shows a carefully selected “sample” from the top 1% of simulated “hockey sticks”. And Wegman’s Fig 4-4, which falsely claimed to show “hockey sticks” mined from low-order, low-autocorrelation “red noise”, contains another 12 from that same 1%!
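The selection effect described above is easy to demonstrate. In this hypothetical sketch (the scoring function and variable names are mine, not McIntyre’s, and plain white noise stands in for the archived PC1s), sorting 10,000 null series by a hockey-stick score and keeping the top 100 guarantees that any “sample” drawn from that subset shows a pronounced blade:

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims, n_years = 10_000, 581   # MM05 archived 10,000 simulated PC1s

# Stand-in null "PC1s" (white noise; the real ones came from M&M's simulation).
sims = rng.standard_normal((n_sims, n_years))

def hockey_stick_index(series, blade_len=79):
    """Blade-vs-shaft contrast, scaled by the shaft's spread."""
    shaft, blade = series[:-blade_len], series[-blade_len:]
    return abs(blade.mean() - shaft.mean()) / shaft.std()

scores = np.array([hockey_stick_index(s) for s in sims])

# An unbiased figure would draw its panels at random ...
random_sample = rng.choice(n_sims, size=12, replace=False)
# ... whereas sorting and keeping the top 100 (1%) guarantees that any
# "sample" drawn from that subset shows a pronounced upward blade.
top_one_percent = np.argsort(scores)[-100:]

print(f"median HSI, all 10,000: {np.median(scores):.2f}")
print(f"median HSI, random 12:  {np.median(scores[random_sample]):.2f}")
print(f"median HSI, top 1%:     {np.median(scores[top_one_percent]):.2f}")
```

By construction, every member of the top-1% subset scores at or above the 99th percentile of the full distribution – which is why presenting panels drawn from it as a representative “sample” is so misleading.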

Finally, I’ll return to the central claim of Wegman et al – that McIntyre and McKitrick had shown that Michael Mann’s “short-centered” principal component analysis would mine “hockey sticks”, even from low-order, low-correlation “red noise” proxies. But both the source code and the hard-wired “hockey stick” figures clearly confirm what physicist David Ritson pointed out more than four years ago, namely that McIntyre and McKitrick’s “compelling” result was in fact based on a highly questionable procedure that generated null proxies with very high autocorrelation and persistence. All these facts are clear from even a cursory examination of McIntyre’s source code, demonstrating once and for all the incompetence and lack of due diligence exhibited by the Wegman report authors.

The Wegman report sees red (noise)

By Deep Climate

The recent focus on George Mason University’s investigation into plagiarism allegations concerning the Wegman “hockey stick” report and related scholarship has led to some interesting reactions in the blogosphere. Apparently, this involves mere trifling attribution problems in one or two paragraphs, even though the allegations now touch on no fewer than 35 pages of the Wegman report, as well as the federally funded Said et al 2008. Not to mention that subsequent editing has also introduced numerous errors and even distortions.

But we are also told that none of this “matters”, because the allegations and incompetence do not directly touch on the analysis or the findings of the Wegman report. So, given David Ritson’s timely intervention and his renewed complaints about Edward Wegman’s lack of transparency, perhaps it is time to re-examine Wegman report section 4, entitled “Reconstructions and Exploration Principal Component Methodologies”. For Ritson’s critique of the central Wegman analysis itself remains as pertinent today as it was four years ago, when he expressed his concerns directly to the authors less than three weeks after the release of the Wegman report.

Ritson pointed out a major error in Wegman et al’s exposition of the supposed tendency of “short-centered” principal component analysis to exclusively “pick out” hockey sticks from random pseudo-proxies. Wegman et al claimed that Steve McIntyre and Ross McKitrick had used a simple auto-regressive model to generate the random pseudo-proxies – the same procedure used by paleoclimatologists to benchmark reconstructions. But, in fact, McIntyre and McKitrick clearly used a very different – and highly questionable – noise model, based on a “persistent” autocorrelation function derived from the original set of proxies. As a result of this gross misunderstanding, to put it charitably, the Wegman report failed utterly to analyze the actual scientific and statistical issues. And to this day, no one – not Wegman, nor any of his defenders – has addressed or even mentioned this obvious and fatal flaw at the heart of the Wegman report.
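The distinction at issue can be illustrated with a toy comparison. The AR(1) coefficients below are my own choices, and a high-coefficient AR(1) is only a crude stand-in for the persistent, full-ACF-based noise model actually used in the GRL study – the point is simply that a simple low-order model loses memory almost immediately, while a persistent one retains substantial correlation at long lags:

```python
import numpy as np

rng = np.random.default_rng(1)

def ar1(phi, n, rng):
    """Simulate an AR(1) series: x[t] = phi * x[t-1] + e[t]."""
    x = np.zeros(n)
    e = rng.standard_normal(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t]
    return x

def acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return (x[:-lag] @ x[lag:]) / (x @ x)

n = 5000
low = ar1(0.2, n, rng)    # simple low-autocorrelation "red noise"
high = ar1(0.95, n, rng)  # crude stand-in for a highly persistent model

for lag in (1, 20):
    print(f"lag {lag:2d}:  AR(0.2) acf = {acf(low, lag):+.3f}   "
          f"AR(0.95) acf = {acf(high, lag):+.3f}")
```

Which noise model is used matters enormously: the more persistent the null proxies, the more spurious “hockey stick” structure short-centered PCA can extract from them.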

David Ritson speaks out

By Deep Climate

David Ritson, emeritus professor of physics at Stanford University, has updated Steve McIntyre and the rest of the world on a key controversy concerning the Wegman Report, namely Edward Wegman’s ongoing failure to release supporting material related to the analysis within the Wegman report, more than four years after his promises to do so.

Here is Ritson’s complete comment, addressed to McIntyre at ClimateAudit.

Wegman under investigation by George Mason University

By Deep Climate

[Update, Oct. 11: George Mason University spokesperson Doug Walsch has clarified that the complaint against Wegman has moved past the preliminary “inquiry” phase and is now under formal investigation. ]

[Update, Oct. 15, 19: I have added pointers to my previous discussions and updated side-by-side comparisons relevant to allegations of plagiarism forwarded to George Mason University last March and April. The allegations concern not only the Wegman report, but also the federally funded Said et al 2008 (published in Computational Statistics and Data Analysis, with Wegman and two other Wegman protégés as co-authors). ]

George Mason University has acknowledged that statistics professor Edward Wegman is under investigation for plagiarism. As related in USA Today, the investigation followed a formal complaint by paleoclimatologist Raymond Bradley, co-author of the seminal (and controversial) 1998 and 1999 “hockey stick” temperature reconstructions.

But a letter from Roger Stough, GMU’s vice-president responsible for research, indicates that the pace of the initial inquiry has been slow. And it appears that a promised date for resolution of the inquiry phase of the proceeding has been missed.

Open Thread #6

Here is a new open thread for general climate science discussion.

John Mashey on Strange Scholarship in the Wegman Report

Guest post by John Mashey

Strange Scholarship in the Wegman Report (SSWR)
A Facade for the Climate Anti-Science PR Campaign

This report offers a detailed study of the “Wegman Report”: Edward J. Wegman, David W. Scott, Yasmin H. Said, “AD HOC COMMITTEE REPORT ON THE ‘HOCKEY STICK’ GLOBAL CLIMATE RECONSTRUCTION”(2006).

It has been a key prop of climate anti-science ever since. It was promoted to Congress by Representatives Joe Barton and Ed Whitfield as “independent, impartial, expert” work by a team of “eminent statisticians.” It was none of those.

A Barton staffer provided much of the source material to the Wegman team. The report itself contains numerous cases of obvious bias, as do the process, testimony, and follow-on actions. Of its 91 pages, 35 are mostly plagiarized text, often injected with errors, bias, and changes of meaning. Its bibliography is mostly padding: 50% of the references are uncited in the text. Many references are irrelevant or dubious. The team relied heavily on a long-obsolete sketch and very likely on various uncredited sources. Much of the work was done by Said (then less than 1 year post-PhD) and by students several years pre-PhD. The (distinguished) second author, Scott, wrote only a 3-page standard mathematical appendix. Some commenters were surprised to be later named as serious “reviewers.” Comments were often ignored anyway. People were misused.

The Wegman Report claimed two missions: #1 evaluate statistical issues of the “hockey stick” temperature graph, and #2 assess potential peer review issues in climate science. For #1, the team might have been able to do a peer-review-grade statistical analysis, but in 91 pages managed not to do so. For #2, a credible assessment needed a senior, multidisciplinary panel, not a statistics professor and his students, who were demonstrably unfamiliar with the science and, as a team, unqualified for the task. Instead, they made an odd excursion into “social network analysis,” a discipline in which they lacked experience and which they used poorly to make baseless claims of potential wrongdoing.

In retrospect, the real missions were: #1 claim the “hockey stick” broken and #2 discredit climate science as a whole. All this was a facade for a PR campaign well-honed by Washington, DC “think tanks” and allies, underway for years.

Most people can just read the 25-page main discussion, but 200+ pages of backup text are included to provide the necessary documentation, as some issues are potentially quite serious.

For a quick download, read the Executive Summary (first six pages). Then, here is the complete report, including the main discussion and 200+ pages of appendices.

Wegman report update, part 2: GMU dissertation review

By Deep Climate

Several posts in past months have highlighted highly questionable scholarship in the 2006 Wegman report on the “hockey stick” temperature reconstruction (and revelations of much more will come soon, with the imminent release of John Mashey’s massive analysis). Today I present yet another analysis of background material of “striking similarity” to antecedents, this time found in a trio of dissertations by recent George Mason University PhD students under the supervision of Edward Wegman.

Wegman Report co-author Yasmin Said’s 2005 dissertation on the “ecology” of alcohol consumption appears to presage some of the questionable scholarship techniques employed in the Wegman Report. And later dissertations from two other Wegman protégés, Walid Sharabati (2008) and Hadi Rezazad (2009), both have extensive passages that closely follow the Wegman Report’s social networks background section, which in turn is based on unattributed material from Wikipedia and two widely used textbooks. Thus, as in the case of Donald Rapp, there appears to be serial propagation of unattributed, “strikingly similar” material. Astonishingly, all three Wegman acolytes were honored with an annual GMU award for outstanding dissertations in statistics and computational science. However, a closer look betrays not only scholarship problems in the work, but also a clear failure in the PhD supervision process itself.

It may also be that some heat is being felt behind the scenes. For one thing, Said’s 2005 dissertation was recently deleted from the George Mason University website. And around the same time, most traces of Said’s eye-opening presentation on the Wegman panel process [PDF] were also deliberately removed. That appears to be a clumsy attempt to cover up embarrassing details about the U.S. House Energy and Commerce Committee 2005-2006 climate investigation, including the key role of Republican staffer Peter Spencer, Representative “Smoky” Joe Barton’s long-time point man on climate change issues. (These disappearances were pointed out to me by the ever-vigilant John Mashey.)

McShane and Wyner 2010

Over at ClimateAudit and WUWT they’ve broken out the champagne and are celebrating (once again) the demise, nay, the shattering into 1209 tiny splinters, of the Mann et al “hockey stick” graph, both the 1998 and 2008 editions. The occasion of all the rejoicing is a new paper by statisticians Blakely McShane and Abraham Wyner, entitled A Statistical Analysis of Multiple Temperature Proxies: Are Reconstructions of Surface Temperatures Over the Last 1000 Years Reliable? [PDF]. The paper, in press at the Annals of Applied Statistics, purports to demonstrate that randomly generated proxies of various kinds can produce temperature “reconstructions” that perform on validation tests as well as, or even better than, the actual proxies.
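To see why such validation comparisons are delicate, here is a toy sketch of the general kind of exercise – not McShane and Wyner’s actual experiment; the synthetic “temperature”, the proxy noise model, and the window choices are all my own. With enough random predictors, a regression can fit the calibration period closely even though the predictors carry no real information, and the holdout block exposes the lack of predictive content:

```python
import numpy as np

rng = np.random.default_rng(7)
n_years, n_proxies = 150, 50          # ~instrumental-era length

# Synthetic "temperature": a warming trend plus observational noise.
temp = np.linspace(0.0, 1.0, n_years) + 0.2 * rng.standard_normal(n_years)

# Pseudo-proxies: pure AR(1) noise carrying no temperature information.
proxies = np.zeros((n_years, n_proxies))
e = rng.standard_normal((n_years, n_proxies))
for t in range(1, n_years):
    proxies[t] = 0.4 * proxies[t - 1] + e[t]

calib = slice(0, 120)      # calibration window
holdout = slice(120, 150)  # validation block

# Ordinary least squares on the calibration window (with intercept).
X = np.column_stack([np.ones(n_years), proxies])
beta, *_ = np.linalg.lstsq(X[calib], temp[calib], rcond=None)
pred = X @ beta

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

print(f"calibration RMSE: {rmse(pred[calib], temp[calib]):.3f}")
print(f"holdout RMSE:     {rmse(pred[holdout], temp[holdout]):.3f}")
```

The in-sample fit flatters the noise proxies; what distinguishes competing methods is precisely how the out-of-sample comparison is set up, which is why the methodological choices discussed below matter so much.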

My discussion of McShane and Wyner is divided into two parts. First, I’ll look at the opening background sections. Here we’ll see that the authors have framed the issue in surprisingly political terms, citing a number of popular references not normally found in serious peer-reviewed literature. Similarly, the review of the “scientific literature” relies inordinately on grey literature such as Steve McIntyre and Ross McKitrick’s two Energy and Environment articles and the (non-peer-reviewed) Wegman report. Even worse, that review contains numerous substantive errors, some of which appear to have been introduced by a failure to consult cited sources directly, notably in a discussion of a key quote from Edward Wegman himself.

With regard to the technical analysis, I have assumed that McShane and Wyner’s applications of statistical tests and calculations are sound. However, here too, there are numerous problems. The authors’ analysis of the performance of various randomly generated “pseudo proxies” is based on several questionable methodological choices. Not only that, but a close examination of the results shows clear contradictions with the findings in the key reconstruction studies cited. Yet the authors have not even mentioned these contradictions, let alone explained them.
