Wegman and Said on social networks: More dubious scholarship

Today I continue my exploration of the dubious scholarship in the contrarian touchstone known as the Wegman report, this time focusing on the report’s background section on social network analysis. As many readers may recall, Wegman et al used a simplistic analysis of co-author relationships to speculate about a supposed lack of independence among researchers in paleoclimatology, accompanied by lapses of rigour in the peer review process. This, of course, echoed similar accusations by self-styled climate auditor Steve McIntyre.

In both the original Wegman report and a subsequent follow-up paper by Yasmin Said, Wegman and two others, the background sections on social network research show clear and compelling instances of apparent plagiarism. The three main sources, used almost verbatim and without attribution, have now been identified. These include a Wikipedia article and a classic sociology textbook by Wasserman and Faust. But the papers rely even more on the third source, a hands-on textbook that explores social network concepts via the Pajek analysis software package – the same tool used by the Wegman team to analyze “hockey stick” author Michael Mann’s co-author network.

Not only that, but the later Said et al paper acknowledges support from the National Institute on Alcohol Abuse and Alcoholism, as well as the Army Research Laboratory, raising a host of new issues and questions. And chief among those questions is this: Will George Mason University now finally do the right thing and launch a complete investigation of the actions and scholarship of Wegman and Said?

I first explored apparent plagiarism in the Wegman report late last year. At that time I analyzed wholesale cribbing from, and distortion of, material from “hockey stick” co-author Raymond Bradley. But I also pointed out that at least some of the background material on social network analysis appeared to be taken from other sources, without attribution.

I have now done a complete analysis of section 2.3 of the Wegman Report, which runs close to five pages. As we shall see, a careful sentence-by-sentence analysis shows that almost all of the material can be traced to one of the three antecedents named above.

Said et al 2008

Before turning to the details of that analysis, I’ll touch on a little-known follow-up article, namely Social Networks of Author–Coauthor Relationships, published in July 2008 in the journal Computational Statistics and Data Analysis (first appearing online in August 2007, a year after the Wegman report).

The four authors were all affiliated with George Mason University and are all connected to the Wegman panel and report. They include two of the Wegman report authors, namely Wegman protege Yasmin Said and Wegman himself. The third author, John Rigsby, was acknowledged as a contributor to the Wegman report (in fact he performed the block model analysis of Michael Mann’s coauthor network) and is now apparently working on his PhD under Wegman at GMU. The fourth author, Walid Sharabati, did not participate in the report, but did supply an analysis of Wegman’s own co-author network for Wegman’s written response to questions posed by Rep. Bart Stupak in connection with the “hockey stick” congressional hearings [PDF 2.7 Mb]. Like Said, he was a Wegman PhD student; he’s now at Purdue University.

Said et al describe the genesis of their paper this way:

Wegman et al. (2006) [i.e. the Wegman report] undertook a social network analysis of a segment of the paleoclimate research community. This analysis met with considerable criticism in some circles, but it did clearly point out a style of co-authorship that led to intriguing speculation about implications of peer review. Based on this analysis and the concomitant criticism, we undertook to examine a number of author–coauthor networks in order to see if there are other styles of authorship. Based on our analysis we identify four basic styles of co-authorship, which we label, respectively, solo, entrepreneurial, mentor, and laboratory. The individuals we have chosen to represent the styles of co-authorship all have outstanding reputations as publishing scholars. Because of potential for awkwardness in social relationships, we do not identify any of the individuals or their co-authors.

Of course, the “entrepreneurial” style network was already identified in Wegman et al as that of “hockey stick” paleoclimatologist Michael Mann. Perhaps more “awkward” is the fact that the mentor style network described is that of Wegman himself and is obviously based on Sharabati’s earlier analysis of his own mentor’s coauthor network.

Clearly, any analysis of Wegman et al’s speculation on peer review in paleoclimatology must also cover the Stupak response and the Said et al paper, a task I’ll leave to another time. For now, I’ll note that it’s hard to imagine a more self-serving approach to scholarship concerning peer review issues than the one emerging here.

Wegman’s sources

Let us now turn to the analysis of the antecedents of the Wegman et al background section on social networks in greater detail. (I’ll follow that with an analysis of the introduction section of Said et al 2008, which is essentially the same lifted material in reduced form). If you want to follow along, here is a side-by-side comparison of Wegman et al and the sources I’ve identified.

Right off the top, Wegman et al quote the favourite source of scholars in a hurry: Wikipedia (more precisely, the 2006 version of the Wikipedia article on social networks).

Here’s the opening of Section 2.3 in Wegman:

A social network is a mathematical structure made of nodes, which are generally taken to represent individuals or organizations. … Social network analysis (also called network theory) has emerged as a key technique and a topic of study in modern sociology, anthropology, social psychology and organizational theory.

And here’s Wikipedia:

A social network is a social structure between actors, mostly individuals or organizations… Social network analysis (also sometimes called network theory) has emerged as a key technique in modern sociology, anthropology, Social Psychology and organizational studies, as well as a popular topic of speculation and study.

It’s quite close. But the changes don’t even make sense. For a start, Wegman et al seem to misunderstand the difference between a structure and its representation (in this case, as a graph). The second sentence has been shortened – but also mangled in a way that loses the original sense of social network analysis as both a key social science technique and a topic of study in its own right.

Continuing on, there are fewer and fewer changes (the occasional added or removed word is shown in bold or struck out):

Research in a number of academic fields have has demonstrated that social networks, operating on many levels, from families up to the level of nations, and play a critical role in determining the way problems are solved, organizations are run, and the degree to which individuals succeed in achieving their goals. The shape of the social network helps determine a network’s usefulness to its individuals …

And so on for several more sentences, pausing only to skip over the Wikipedia article’s table of contents. And that’s just the first paragraph.

Then it’s on to paragraph two – and the most used (or abused) source for this section: Exploratory Social Network Analysis with Pajek, by Wouter de Nooy, Andrej Mrvar and Vladimir Batagelj. This book has the undoubted attraction of serving both as a “how to” guide for the Pajek software package used by Wegman et al to generate various diagrams, and as an overview of social network concepts. (Not to mention that it appears to be freely available as a full PDF and so easily copied).

De Nooy et al have this apparently irresistible nugget:

… Social network analysts assume that interpersonal ties matter, as do ties among organizations or countries, because they transmit behavior, attitudes, information, or goods…

That has been lightly massaged to read:

Social network analysis assumes that interpersonal ties matter, whether they exist among individuals, organizations or countries. Interpersonal connections matter because they are conduits for the transmission of information, goods, behavior and attitudes.

It’s not clear why the order of transmitted items has been changed or why Wegman et al felt compelled to complicate the phrasing unnecessarily. But certainly small changes like “analysts” into “analysis” do make it harder to detect possible plagiarism.

At paragraph three, Wegman et al lift a long series of key definitions from the classic 1994 text, Social Network Analysis: Methods and Applications, by Stanley Wasserman and Katherine Faust. These concepts include “actor”, “dyad”, “relational tie” and so on.

For example, here is the Wasserman and Faust definition of “relational tie”:

Relational tie. Actors are linked to one another by social ties. …[T]he range and type of ties can be quite extensive. The defining feature of a tie is that it establishes a linkage between a pair of actors. Some of the more common examples of ties employed in network analysis are:

  • Evaluation of one person by another (for example expressed friendship, liking, or respect)
  • Transfers of material resources (for example business transactions, lending or borrowing things)
  • Association or affiliation (for example jointly attending a social event, or belonging to the same social club)
  • Behavioral interaction (talking together, sending messages)
  • Movement between places or statuses (migration, social or physical mobility)
  • Physical connection (a road, river, or bridge connecting two points)
  • Formal relations (for example authority)
  • Biological relationship (kinship or descent)

Wegman et al cut out a few words and replaced the bullet points with a long list:

Relational Tie: Social ties link actors to one another. The range and type of social ties can be quite extensive. A tie establishes a linkage between a pair of actors. Examples of ties include the evaluation of one person by another (such as expressed friendship, liking, respect), transfer of material resources (such as business transactions, lending or borrowing things), association or affiliation (such as jointly attending the same social event or belonging to the same social club), behavioral interaction (talking together, sending messages), movement between places or statues (migration, social or physical mobility), physical connection (a road, river, bridge connecting two points), formal relations such as authority and biological relationships such as kinship or descent.

So it goes for another two pages.

Finally, Wegman et al cover “computational aspects”; for that it’s back to de Nooy et al to cover such graph-related concepts as partitioning and clustering. This section includes a melange of general concepts such as brokerage, as well as their graphic analogues such as vertex centrality. As the material is lifted from the introductions to various book sections, the whole feel of this section is oddly unbalanced, as the discussion jumps from topic to topic with little apparent logic.

As the section lurches to a close, the apparent plagiarism becomes clearer and clearer as there are only desultory changes, or none at all. For example, at page 22 of Wegman et al we have:

The concepts of vertex centrality and network centralization are best understood by considering undirected communication networks. If social relations are channels that transmit information between people, central people are those people who either have quick access to information circulating in the network or who may control the circulation of information.

Except for the indicated addition of the third “people” and the removal of two words, that’s identical to the paragraph found at section 6.5 (p. 133) of de Nooy et al.
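
As an aside, for anyone who wants to see what degree centrality actually measures, here is a minimal illustrative sketch of my own – a toy undirected co-author network in Python, using the networkx library. The author labels and the choice of library are mine; nothing here is taken from Wegman et al or de Nooy et al.

    # Illustrative only: a toy undirected co-author network (invented labels).
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([
        ("A", "B"), ("A", "C"), ("A", "D"),  # "A" co-authors with three others
        ("B", "C"),                          # "B" and "C" also write together
        ("D", "E"),                          # "E" is tied in only through "D"
    ])

    # Degree centrality: the fraction of the other vertices each vertex is tied to.
    # A highly central vertex has "quick access" to the rest of the network.
    print(nx.degree_centrality(G))
    # {'A': 0.75, 'B': 0.5, 'C': 0.5, 'D': 0.5, 'E': 0.25}

In this toy example, “A” is the most central author – the network analogue of the “central people” in the passage above.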

Indeed, easily ninety percent of the material was lifted and lightly edited; the only significant original material came from a paper by John Rigsby on the concept of allegiance.

And all of this, remember, was done without any attribution whatsoever.


Said et al rework it

Now let’s look at the sources for the introduction of Said et al, a comparison that, it turns out, also provides clues as to how the material was lifted. Once again, you can follow along with a detailed side-by-side comparison, which shows the same three sources as for Wegman et al.

Indeed, as mentioned previously, the Said et al introduction is largely a condensation of the five-page Wegman et al section. But it starts out a little differently:

A social network is an emerging tool frequently used on quantitative social science to understand how individuals or organizations are related. The basic mathematical structure for visualizing the social network is a graph. A graph is a pair (V ,E) where V is a set of nodes or vertices and E is a set of edges or links.

The first part is barely comprehensible English and so is probably from the authors; once again, there is rampant confusion between a structure and its representation. Presumably they mean that social network analysis, not “a social network”, is an “emerging tool”. And surely they also want to say it is a tool used in quantitative social science rather than “on” it (whatever that means).
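
The distinction the authors keep blurring is easy to state concretely: the social network itself is the set of actors and the ties among them, while the graph (V, E) is just one way of representing that structure. Here is a minimal sketch of my own (Python, with invented author names) of what such a representation looks like:

    # Illustrative only: the graph *representation* of a tiny co-author network.
    # The underlying social structure is the people and their collaborations;
    # the pair (V, E) is merely a mathematical stand-in for it.
    V = {"Author1", "Author2", "Author3"}           # vertices: the actors
    E = {("Author1", "Author2"),                    # edges: co-authorship ties
         ("Author2", "Author3")}

    # Every edge must connect two known actors.
    assert all(u in V and v in V for (u, v) in E)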

Said et al quickly return to the safe ground of Wikipedia, however, with the same text as rendered in Wegman et al:

Social network analysis (also called network theory) has emerged as a key technique and a topic of study in modern sociology, anthropology, social psychology and organizational theory. The shape of the social network helps determine a network’s usefulness to its individuals. Smaller, tighter networks can be less useful to their members than networks with lots of loose connections (weak ties) to individuals outside the main network. More “open” networks, with many weak ties and social connections, are more likely to introduce new ideas and opportunities to their members than closed networks with many redundant ties. See Granovetter (1973).

In this case, however, the authors have cut out the five subsequent Wikipedia sentences that appeared in the Wegman et al version. Another change is the addition of a reference to Granovetter, the one and only citation in the whole introduction; it’s reasonable enough, but that is clearly not the true source.

Presumably in the interest of brevity, Said et al then skip to the definitions found in Wasserman and Faust. This time, the headings have been removed altogether.

Social network analysis is concerned with understanding the linkages among social entities and the implications of these linkages. The social entities are referred to as actors that are represented by the vertices of the graph.

Once again this is taken straight from Wasserman and Faust’s actor definition, with the additional information that actors are represented by graph vertices (the sentence should presumably read “which are represented”, rather than “that”).

The exact same passage on relational ties is also given as in Wegman et al, but again without the heading.

Well, it’s almost the same. Wasserman and Faust refer to a type of linkage they call “movement between places or statuses (migration, social or physical mobility)”. In Wegman et al, this became “movement between places or statues“! You read that right – “statues”.

In Said et al, this was corrected, sort of. It now reads “movement between places or states”, which undoubtedly makes more sense, even if it’s not exactly what Wasserman and Faust meant.

Beyond the amusing “typo”, though, this is a very telling detail. It implies that Wegman et al were likely working from a scanned version of the introductory chapter of Wasserman and Faust, converted to text using OCR. And it also implies that Said et al simply started from the earlier Wegman et al version “correcting” and condensing as they went.

In the final passages of the introduction, much of the de Nooy et al material makes a reappearance. The sections on partitions and clustering have been reduced, though, and the concepts of cohesion, brokerage and affiliation have been omitted altogether. But the section on centrality has been retained, once again virtually identical to the de Nooy et al passages discussed above.

And, once again there is no attribution to the actual sources used. Wasserman and Faust are cited, but only in a subsequent section of the paper, so this is not helpful to the cause of Said et al.

It is also worth noting an oft-quoted definition from Wasserman and Faust, used verbatim in both Wegman et al and Said et al (and a number of other sources).

A social network consists of a finite set or sets of actors and the relation or relations defined on them.

Now look at this Google Scholar search on that exact sentence. In each and every case, a direct citation of Wasserman and Faust is given, with the sentence usually rendered in quotes.

Every case, that is, except one – Said et al.

Before leaving this piece of shoddy scholarship, I should mention the acknowledgments.

The work of Dr. Yasmin Said was supported in part by the National Institutes on Alcohol Abuse and Alcoholism under grant 1 F32 AA015876-01A1. The work of Dr. Edward Wegman was supported in part by the Army Research Office under contract W911NF-04-1-0447. The work of Dr. Said and Dr. Wegman was also supported in part by the Army Research Laboratory under contract W911NF-07-1-0059…

To be sure, the distortions and problems with the lifted material fall somewhat short of those I identified in the case of Wegman et al’s treatment of paleoclimatology. Nevertheless, as usual in cases of wholesale appropriation, even a cursory examination of the social network material betrays a shocking lack of understanding of social network analysis, accompanied by a complete failure to tie the background material to any meaningful analysis. The analysis itself consists of speculation based on a simplistic matrix of the analysis target’s co-authors. That this incompetent and deceptive work appears to have been subsidized by the U.S. government is outrageous. But that fact may have a silver lining – for it provides additional impetus for a long overdue investigation.

Implications for peer review

Indeed, the work of Wegman and his young team has raised some interesting questions about peer review and social networks. But ironically those questions point back, not to the scientific community, but to Wegman and his own social network of proteges and colleagues.

A full exploration of peer review issues in both the Wegman report and Said et al will have to wait for a subsequent post. But clearly there are many problems with Wegman’s hastily convened ensemble of reviewers; for example, William Wieczorek was on Yasmin Said’s PhD advisory committee. And Said et al sailed through peer review in a mere six days from submission to acceptance.

In both cases, there appears to have been no peer review from experts with relevant expertise in social network analysis or paleoclimatology, which explains the complete failure to notice the paucity of citations in key background sections. That was an obvious “red flag” that should have given any reasonable reviewer pause. (There are also numerous problems with Wegman et al’s paleoclimate material, such as confusion between the Northern Hemisphere 20th century temperature rise described by Mann et al, and the global rise discussed in the NAS report).

I have now demonstrated that two key background sections in Wegman et al (as well as the introduction to the follow-up paper from Said et al) are riddled with apparent plagiarism and other problems. On top of that, of course, there is every indication that the Barton investigation and the Wegman panel were nothing more than a politically motivated attack on climate science and scientists from the start.

The flimsy accusations of “climategate” have been addressed by no fewer than three investigations in the U.K., and found to be groundless by the two completed so far. What will it take for responsible media to finally focus on the real scandal here? And how can a supposedly responsible academic institution continue to ignore the obvious?

For starters, George Mason University must initiate a complete, independent investigation of the activities and scholarship of Edward Wegman and his proteges. Now.

=============================

References [to be updated]

108 responses to “Wegman and Said on social networks: More dubious scholarship”

  1. Absolutely brilliant. Beautiful work.

    It is so nice to see the fight being taken to the denialists for a change. Keep it up.

  2. DC,
    This may be of interest:
    http://universitypolicy.gmu.edu/4007res.html

    I can see you may not be willing to give up your identity, but you level quite serious charges. Charges that may need to be investigated further. You could, of course, inform certain journalists of your findings, but since you did the work, you may want to consider taking it to where this needs to go: the VP of research and economic development of GMU. Note that they assure confidentiality of the complainant, but be prepared that that’s not cast in stone.
    In addition, or alternatively, you could inform the journal editors.
    However, make sure to keep it to the point, and not add too many allegations and issues that dilute the important point: apparent plagiarism. This is considered a very serious charge in the academic community.

  3. So, maybe someone with some spare time can email the 5 authors (Wasserman @ Indiana U, Faust @ UC Irvine, I think, and the other 3 in Europe, but likely findable) and point them at this. Whether they want to do anything or not, they at least ought to know.

  4. DC, I concur with Marco and John. Plagiarism is a serious problem in education and exposing important examples is quite instructive to students. Hockey stick or no hockey stick, this exposure would be a great lesson for science students.

    Scott A. Mandia, Professor of Physical Sciences
    Selden, NY
    Global Warming: Man or Myth?
    My Global Warming Blog
    Twitter @AGW_Prof
    “Global Warming Fact of the Day” Facebook Group

  5. We are a new climate denial skeptics group from Calgary that will take care of this problem:
    http://friendsofginandtonic.org/

    [DC: The upper right hand corner says it all.]

    • “The upper right hand corner says it all”… But only if you read the small print.

      Looking forward to seeing the responses to your letters FoGT. I wonder if anyone other than Singer will respond.

  6. The upper right hand corner says it all.

    The site’s a bit juvenile but funny … actually “sea levels are rising because Jesus is crying” isn’t bad at all. They could popularize that among any number of southern Baptist congregations!

    [DC: Yes, I thought it was funny. And they have pulled together some good info on Friends of Science (some from SourceWatch apparently).

    For the record, I don’t see how anyone could reconcile a belief in creationism with scientific inquiry. But that doesn’t mean that religion and science are mutually exclusive. I imagine religious scientists would be definitely in the minority, but there must be a significant number of them.

    However, I don’t want to get sidetracked into a discussion of the relationship of religion and science. That’s what the Open Thread is for!]

  7. In Hell's Kitchen

    GMU is a right-wing idea-laundering operation. If you reveal yourself to any of its “officers” you’re liable to receive the kind of treatment Lance Baxter received.

    It appears to me that these GMU “researchers” are just rubber-stamping documents prepared by other parties. That the rubber-stamped submissions get through the review process is the outrage here.

  8. Well done DC – it was worth the wait; quality not quantity, I say (hi there, Watts and McIntyre).

    So it took DC to audit the Wegman report and discover plagiarism. Why did McIntyre not audit the Wegman report? 😉

    [Also posted at Deltoid]

  9. Nice, painstaking work. So much for Wegman’s credibility as an expert witness on the academic shortcomings of others.

    This brings to mind Tom Lehrer’s old classic, “Lobachevsky”:

    “Every chapter I stole from someone else. Index I copied from old Vladivostok telephone directory.”

    (http://www.youtube.com/watch?v=UQHaGhC7C2E)

    But where are the block matrix graphs? We want block matrices! You could have one showing a textual network analysis with paragraphs from the three sources along the x axis, paragraphs from Wegman and friends along the y axis, and a nice diagonal running down the middle…

    [DC: An overall diagram of the sources is a good idea. But first, I have to find all the unattributed lifted material in chapter 2 of the Wegman report. Section 2.2 (on PCA) undoubtedly has less than the other two sections, but it has at least one part straight from Wikipedia. More to come … ]

  10. In discussion with a knowledgeable friend (who has done expert witnessing), I learned the proper lawyer-speak for most of this, which is “striking similarity.”

  11. Research funded by federal agencies is subject to federal policies on research misconduct, which includes plagiarism in proposing, reviewing or reporting research.

    Said was supported as an NIH Postdoctoral Fellow (F32 grant) and Wegman had Army money. Both agencies REQUIRE George Mason University to investigate any allegation of research misconduct. If an allegation is filed, there is a federal requirement that the allegation be fully investigated. The allegation must be made in good faith, which means that the complainant (person bringing the allegation) has a belief in the truth of the allegation, that a “reasonable person” in the complainant’s position would similarly have.

    The NIH handles this process through the Office of Research Integrity at:

    http://ori.dhhs.gov/

    The ORI policy states:

    “it is the responsibility of the investigative body [George Mason University] and ORI, not the complainant [the person who brings allegations of research misconduct to the attention of ORI or George Mason], to ensure that the allegation is thoroughly and competently investigated to resolution. Therefore, once the allegation is made, the complainant assumes the role of a possible witness in any subsequent inquiry, investigation, or hearing.”

    Confidentiality is required to the extent that federal law 93.108 delineates, as follows:

    “(a) Disclosure of the identity of respondents and complainants in research misconduct proceedings is limited, to the extent possible, to those who need to know, consistent with a thorough, competent, objective and fair research misconduct proceeding, and as allowed by law. Provided, however, that:

    (1) The institution must disclose the identity of respondents and complainants to ORI pursuant to an ORI review of research misconduct proceedings under §93.403.

    (2) Under §93.517(g), HHS administrative hearings must be open to the public.

    (b) Except as may otherwise be prescribed by applicable law, confidentiality must be maintained for any records or evidence from which research subjects might be identified. Disclosure is limited to those who have a need to know to carry out a research misconduct proceeding.”

    Breach of confidentiality is a serious issue in a research misconduct investigation, and can have serious consequences for the breacher.

    The information provided by DC in this post is sufficiently detailed and documented to stand as an allegation of research misconduct.

    The research misconduct policy of George Mason University can be found at:

    http://universitypolicy.gmu.edu/4007res.html

  12. DC,

    These are very serious allegations that you are making. I am surprised that Wegman and his cohorts have not come forward to defend themselves.

    How should we interpret their silence?
    Is there any chance that someone like Joe Romm could use his contacts to force Wegman et al. to appear before Congress or the Senate?
    And while they are at it, they could also cross-examine McIntyre and McKitrick.

    Surely the plagiarism that you have documented is grounds for pursuing that line of action rather than going to GMU, which sounds like a very likely dead end, at least according to what others have stated here.

  13. DC, there’s a few plagiarism-detection tools available. Most cost money, but you could try
    http://copytracker.ec-lille.fr/
    http://www.plagium.com/
    http://plagscan.com/seesources/
    http://chimpsky.uwaterloo.ca/
    or shareware like
    http://www.siberiasoft.info/

    Several also provide percentage similarity. It may be necessary to have electronic copies of the papers+books that may be plagiarised, meaning you may need full-text access to journals (books are a bit more tricky). But who knows what else you might run into?

  14. DC, tried a post with too many links (I guess), so this is a ‘condensed’ version:

    Please check the available software for plagiarism detection on Wikipedia:
    http://en.wikipedia.org/wiki/Plagiarism_detection#cite_note-0
    Several are free and even give you a percentage similarity. It’s a bit better than your current manual approach, and who knows what else pops up…

    [DC: Fished the previous one out of the spam filter, so I guess it was the links.]

  15. MapleLeaf,

    If an allegation of plagiarism is made to the appropriate person at George Mason University and that allegation is copied to the NIH ORI, then the NIH will REQUIRE a thorough investigation of the allegations. I don’t know why you say that this would be a dead end. Despite what you may think of GMU, they do receive federal funds and are required by federal law to abide by the ORI regulations. I know GMU is a lower-tier university (it isn’t even a Research I school), but if they want to keep getting federal funding, they have no choice but to investigate the allegations and make a report to the NIH ORI.

  16. If we condensed every plagiarized line down to a proper reference, how many pages would be left in the Wegman report?

  17. Hi Sam,

    OK, I stand corrected. I was swayed by “In hell’s kitchen’s” comments, and comments I have read elsewhere about GMU.

    Regardless, why not also force Wegman et al. to appear before the Senate or Congress? If Mann can be investigated by them over an inconsequential glitch in the code then, by comparison, Wegman et al. have a lot more to answer for.

    It is my understanding that DC has not perused the entire document; this could be just the tip of the iceberg.

  18. Mark Shapiro

    Dear DC,

    Thank you.

    When the Wegman report came out in 2006, I remember wondering where they got the climate and social networking background. I just assumed that either they attributed it or they got a pass, since it was an ad hoc report for a govt committee as opposed to an academic work. So thank you for exposing that it is not only plagiarism, but that they changed the source texts, and then it was “replagiarized” by Rapp.

    I hope this at least gets exposed to the point where Wegman has to acknowledge and apologize — as publicly as possible.

  19. re: CM
    I have the chart, although not quite the form you suggest….
    but DC makes my life hard by continually finding new material, so I’ve had to change it several times 🙂

    MapleLeaf:
    You might want to reread Crescendo to Climategate Cacophony and see if you find any useful hints, like italicized questions to ask people.

    There will be a V2.0 within a week to finish incorporating DC’s newest findings, the recent Greenpeace report on Koch, and some other things.

    GMU really is the place to start, especially because this is already out in public and so, unlike some internal matters, is fairly hard to sweep under the rug. In addition, it is worth studying Yasmin Said’s 2007 talk, p.23, under “Bad Reactions”. GMU is *not* a dead end, even if it might be just step 1. If you look at the GMU procedures, as best I can tell, there would probably be a 100-120 day lag from when a complaint was filed to when they might say something, and that process would likely (and quite fairly) be confidential during that time.

    But don’t forget there are a bunch of other folks who might be contacted (*’d ones added by this latest)

    Rice(Scott)
    Purdue (Sharabati)
    *Computational Statistics & Data Analysis journal
    (which has a simple, regular procedure)
    Govt agencies (as Sam kindly notes)
    *Naval Surface Weapons Center (Rigsby, sometimes)

    *Wasserman & Faust (who might well talk to their publisher, Cambridge)
    *de Nooy, Mrvar, Bategelj (who might talk to their publisher, …Cambridge)
    Bradley (who might talk to his publisher, Elsevier)

    But note:
    1) We know who signed off on the Wegman Report (Wegman, Scott, Said).
    2) We know who authored the paper: Said, Wegman, Sharabati, and Rigsby.

    We do not know who actually did what and who knew about it.

    • “We do not know who actually did what and who knew about it.”

      That’s an interesting point. If one author was responsible for all the “striking similarities”, perhaps the others could escape the most serious consequences.

      But I think this is an unlikely scenario for *all* the material in question (paleo and social network).

    • “We do not know who actually did what and who knew about it”

      Ah, the “wir haben es nicht gewusst” rebuttal. I think here some major weight will be put on long paragraphs without a single citation. This should raise red flags for anyone in academia. I don’t think all will be able to escape the most serious consequences (whatever they are going to be).

      [DC: I agree that the lead author in each case will have to bear some responsibility, no matter “who did what”.

      By the way, Sharabati’s PhD dissertation also has an introductory section on social networks with nary a citation (about one page in this case, with a “striking similarity” to Said et al). Wegman and Said were “Dissertation Co-Directors”. Maybe it’s the GMU way. ]

    • Please forgive my ignorance but surely if a paper has more than one author, all co-authors “sign off” on the work as a whole? It is a joint work and all authors take joint responsibility for all contents, i.e., no author can claim “that bit was nothing to do with me”.

  20. Thanks John Mashey,

    I downloaded your report a while ago, just have to make the time to read it!

    What is not clear to me is exactly who is going to report this to GMU. No need to name names, I’m just curious to know whether or not someone has been tasked with this and will follow through soon.

    Any thoughts on getting a Congressional hearing into this and other antics of the denial machine?

    Thanks.

  21. The thing that cracks me up is that social network theory is far better applied to showing the tight interconnections between the most vocal denialists.

  22. GFW:
    yes, and if you track the 3 large reports I posted at DeSmogBlog:
    The first had some of this, which is what put me on to it – the last page is an example (affiliation matrix) although there is some on the Internet as well (2007-2008)
    http://www.desmogblog.com/skeptics-journal-publishes-plagiarized-paper

    The second was mostly a study in inferring the network from the order of petition -signing, coauthor nets, institutional affiliations, etc…. i.e., a diffusion/contagion model:
    http://www.desmogblog.com/another-silly-climate-petition-exposed (2009)

    The third (CCC) really got serious. (2010)

    It’s not accidental: the hypothesis from the first one was that these folks were very tightly tied together, and that one should patiently gather data about the people and their connections, and hope they made a mistake that might allow a conspiracy investigation … because then, a tight social network, instead of being efficient, can be deadly…

  23. It’s very difficult to see on the PDF, but the clique diagrams on p2 & p3 appear to be generated differently for Mann and Wegman. On Wegman’s they left out the diagonal elements, which conveniently makes it look less dark and … well, less cliquey. I haven’t read the whole PDF so there may be a good explanation – but at first glance it looks like “how to distort with charts 101”.

    Any thoughts?

    [DC: If you’re referring to the version in the Stupak response, there’s a combination of factors. The second (Wegman) chart was a work in progress by Sharabati, and there are many different ways to organize the matrix (in the Appendix, Sharabati goes through a few of them). I’d say the best one to look at would be Fig. 17, rather than the one on p. 2.

    Said et al is probably the best one for a side by side comparison of the two charts. There the diagonals are all filled in. Still, Wegman has a lot more co-authors (much longer career and he basically writes with each and every one of his PhD students).

    So I wouldn’t make too much of the diagonal, which represents the reflexive pairs (i.e. each author paired with him- or herself); a small numerical sketch of the effect follows this exchange. Leaving the diagonal out is defensible and may well be an option in the software.

    Having said that, I don’t see much compelling in the analysis. Sharabati actually claims:

    “We can also argue on the quality versus quantity of the publications. Wegman favored quality of the work rather than quantity. This observation can be concluded by the many coauthors connected to him with few publications.” (p. 28)

    But the mentor style implies that many of the co-authors will be relatively young and not have a long publication record. Also the methodology does not appear to capture papers written by co-authors with others outside Wegman’s co-author network. ]

    • Leaving the diagonal out is defensible and may well be an option in the software.

      It’s certainly defensible as a general choice…but if you’re presenting two diagrams on successive pages ostensibly for the purpose of illustrating comparative differences, one with diagonals and one without…well, it certainly looks like an attempt to bias perceptions. Not as bad as some of Jo Nova’s worst graph juxtapositions…

      I agree with you that there are several likely more substantive issues, but this also looks shonky (albeit in a comparatively minor way) to me.

      [DC: It was a self-serving rush job, no doubt about it. Not a good combination. ]
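
    [DC: To make the diagonal point concrete, here is a small numerical sketch of my own – a made-up four-author co-authorship count matrix in Python/numpy, not the actual Mann or Wegman data:

        # Illustrative only: invented co-authorship counts for four authors.
        # Entry (i, j) = papers authors i and j wrote together; the diagonal
        # entry (i, i) counts each author's own papers - the "reflexive pairs".
        import numpy as np

        M = np.array([
            [12, 5, 4, 3],
            [ 5, 8, 2, 1],
            [ 4, 2, 6, 0],
            [ 3, 1, 0, 5],
        ])

        M_no_diag = M.copy()
        np.fill_diagonal(M_no_diag, 0)   # the same chart with the diagonal suppressed

        # The diagonal entries are the largest in each row, so dropping them makes
        # the picture look much less dense, even though no co-author tie changes.
        print(M.sum(), M_no_diag.sum())  # 61 vs 30

    Including or omitting those reflexive pairs changes how dark and “cliquey” the chart looks without changing a single co-author tie. ]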

  24. The reason it should be reported to ORI and George Mason University is that this will place the authors in deep doodoo (that is the official term, I think I read it in GMU’s misconduct policy). They will face a hearing and probably be forced to retract or acknowledge the plagiarism. It will be publicly announced by ORI (look at their webpage).

    How the heck do you guys think you can “force” Wegman to testify before Congress? Unless you have some very highly placed friends, you’re dreaming.

  25. Sam Cohen: good input above.

    You really, really should look at:
    Crescendo to Climategate Cacophony, at least the first page of text, the ToC, and the next to last page, on Possible Legal Issues.
    This may provide further context and food for thought as this is much bigger than the plagiarism.

    After that, if you are in the US, and if your local Representative would seem amenable, you might write to them urging action. It can’t hurt, especially if you happen to be located in a few districts pulled out of the hat, like CA-30, MA-07, or WA-01, but not TX-06.

  26. Wasserman and Faust refer to a type of linkage they call “movement between places or statuses (migration, social or physical mobility)”. In Wegman et al, this became “movement between places or statues“!

    You gotta admit, the image of a scientist “moving between statues” is very intriguing.

  27. Note to bluegrue: The spam filter ate your last comment as I realized a fraction of a second too late it wasn’t spam – feel free to repost it (I hope it wasn’t too long).

  28. Lotharsson: re graphs

    Graphical presentation is always interesting …
    But it is also misdirection.
    Wegman & co seemingly tried to establish a narrative in which the only relevant social network was coauthorship. Now, despite Sharabati’s claim (in reply to Stupak):

    “Of all the work that has been done on social networks, very few investigators have considered coauthorship network. Therefore, what we are about to observe in this paper is a brand new approach in the social networks field.”

    This is absurd: coauthorship networks have long been studied, because they are one of the most accessible bodies of data you can find. Try:
    Google: co authorship social network scientific publishing
    or look up Erdos Number, or…

    On the other hand, nobody in this turf thinks these are the *only* relationship types.

    The dissertation advisor – advisee relationship is a strong one; see for example the Mathematics Genealogy Project (http://genealogy.math.ndsu.nodak.edu/id.php?id=41964).

    Then there are:
    Editor: reviewer
    Book editor: chapter author
    Committee membership
    Conference session organizer: invited speaker
    Just for example.

    It is really, really easy for people with mathematical tools to attack problems, get entangled in graphs and matrices and statistical measures, and forget/ignore the fact that (especially in social sciences) the models may not capture all the relevant relationships.

    This is less true in computer/communication networks. A good analogy might be that if computer nets were social nets, the Dell servers in a computer room might dislike talking to the HP servers, forming two cliques, which would not be well captured by the physical network topology.

    So far, it is hard to find evidence that in 2005/2006, Wegman’s crew really had much experience with (human) social networks or the literature thereof, beyond a few textbooks.

    • Yes, at first glance I have problems with drawing such firm conclusions from this type of analysis, but I hadn’t had time to think them through when I commented on the graphical manipulation.

      I think you hit some of the points I was half way to intuiting – particularly the framing that the only relationship that matters was co-authorship.

  29. Marion Delgado

    Lotharsson, we saved the diagonal lines to connect you alarmists …. to HITLER!

    We win!

    — The Skeptics

    [DC: … including Tim Ball and Lord Christopher Monckton].

  30. @DC: I had feared as much. It happens.

    I’ve looked a bit into the history of the Wikipedia article on social networks in order to see whether perhaps a Wikipedia editor lifted the passages from another source, which could then also have been Wegman’s source. It turns out that this does not seem to be the case. Key phrases grew over the history of the article, making this scenario unlikely. They also date back quite a bit earlier than the Wegman report.

    The “… has emerged as a key technique …” part was introduced on 30 October 2004 as part of a complete overhaul.

    http://en.wikipedia.org/w/index.php?title=Social_network&oldid=6975947

    The beginning sequence “… social structure between actors, mostly individuals or organizations …” was basically introduced on 27 August 2005 and only slightly rephrased later on.

    http://en.wikipedia.org/w/index.php?title=Social_network&oldid=21973192

    So the Wikipedia source is actually genuinely new text, not a copy of any third source. This history also rules out the possibility that Wikipedia quoted Wegman et al, or that they contributed to the article as authors.

    [DC: I found many of the “striking similarities” only by going back to 2006; the version linked to above is from January 2006. I have since found another passage (in a different Wegman section, taken from a different Wikipedia article – yep, there’s more) that comes from no earlier than April 2006.

    At one point I identified the author of most of that version and section of the social network Wikipedia article, but I forget his handle now. He did seem to have a good knowledge of the field, even though he didn’t include as many citations as he might have.

    (P.S. another spam delay, I’m afraid).]

  31. DC,

    Have you looked at other sections of the Wegman report and compared them to text and information posted on the original McIntyre website that pre-dates CA, and have done any comparisons to the MM papers?

    [DC: I discussed the pre-CA posts a while back. Those were folded into CA at its launch in early 2005. There’s very little there though. ]

  32. Ted Kirkpatrick

    Great work, DC. In addition to notifying GMU, another good pressure point is the journal, “Computational Statistics and Data Analysis”. This is a real journal, with substantial names on both its Advisory Board and its list of Associate Editors. It’s published by Elsevier, an enormous force in scientific publishing. Both editors and publisher will care about the reputation of their journal, with the editors most directly concerned.

    And they ought to be concerned, for the Said et al. (2008) article is fishy in a number of ways. It purports to be about an “emerging tool”, yet the only references it gives are two unrefereed articles by the authors of this paper, a 34-year-old article in a sociology journal, and a 13-year-old textbook. Not a single citation from a statistical forum of a recent *refereed* article on this “key technique”.

    Furthermore, the writing of the article is weak. Consider the sentence, “The presence of relational information is a significant feature of a social network”. Um, no. It’s a *defining* feature—how can you construct a network without relational information? Lo and behold, the phrase “critical and defining” is used in Wasserman and Faust’s formulation of the concept.

    One further example of weak writing. Paragraph 4 lurches from “centrality” to “centralized” in a sentence that is not even grammatical: “The network is centralized from socio-centered perspective”. Two new ideas (“centralized network” and “socio-centered perspective”) introduced without preparation. Contrast with the smoother transition in De Nooy, Mrvar, and Batagelj, where they explicitly note the change of viewpoint and then introduce the concept of a centralized network: “Viewed from a sociocentered perspective, the network as a whole is more or less centralized”.

    Yet when this lightly-referenced, weakly-written article was submitted, it was accepted without revision in just six days. Contrast this with other articles in the same issue of that journal. Selecting five at random, all had considerably more references, all had recent references from the refereed statistical literature, and all required revision before acceptance. What was so special about this article?

    Now look more closely at the editorial board for the journal. Edward Wegman is on the Advisory Board and Yasmin Said is an Associate Editor. (Wegman remains listed on the Board for the August 2010 issue but Said is no longer listed as an Editor.) Reviewing their lists of interests, Wegman’s list is appropriately diverse for a scholar of his stature and seniority.

    Said’s list, however, is an absolute grab-bag: “Biostatistics, epidemiology, public health, statistical modeling and graphics, adaptive design, social network theory, data mining, time series analysis, computer intrusion detection, climatology, metadata”. Some of those topics are application areas of statistics rather than statistics proper (public health, computer intrusion detection, climatology, metadata). More importantly, this is far too diverse a range of topics for a new scholar (two years out from her Ph.D.). Bear in mind that topic lists for Associate Editors are given to indicate areas where they have sufficient expertise to select referees, evaluate their reports, and make a final publication decision. This is a higher level of expertise than required for listing as an interest on one’s professional Web page. It defies likelihood that any young scholar could already have sufficient expertise to act as Associate Editor for such a diverse range of topics.

    Said is the only Associate Editor at the time who listed social networks—or “social” anything—in her interest list. Presumably, her article was given to some other editor to handle for review, but it’s not clear that any of the possible editors knew the topic area well.

    I will give the paper credit for one well-crafted sentence, though. The last line of the abstract reads, “We conjecture that certain styles of co-authorship lead to the possibility of group-think, reduced creativity, and the possibility of less rigorous reviewing processes.” Indeed.

    [DC: Well, well. You pointed out many of the items that I was going to highlight in the upcoming post on peer review (along with some additional details).

    The current editorial board is here, and a cache with Yasmin Said as associate editor is here. Interestingly, though, Wegman’s resume only lists him as an associate editor for CSDA. His presence on the advisory board may explain what appears to be the “less rigorous reviewing process” evidenced here (to say the least).

    As for the writing, there is no doubt that the authors run into trouble every time the text was changed from the “strikingly similar” antecedents. And it’s not just poor English; the changes also reflect poor understanding of the source material. ]

  33. Ted: good sleuthing

    A few more items for you:
    1) I went through the abstracts for all the articles in that issue, recording Received, Accepted, and Published dates.
    The sorted set of (Accepted-Received) intervals, in days, was: {0, 1, 6*, 61, 71, 101, 111, 114, 138, 158, 165, 172, 176, 181, 193, 202, 204, 210, 232, 240, 272, 291, 294, 298, 327, 330, 344, 427, 428, 451, 470, 711, 1018, and one unknown}.
    6* = SWSR.

    2) The July 05, 2007 CS&DA Editors list shows (as you note) Wegman as Advisor, Said as Associate Editor. Another Assoc. editor is:

    C. E. Priebe (Johns Hopkins), who did his PhD in 1993 under Wegman, and has coauthored at least 3 papers with him.

    Of course, this may all be coincidence, as two other papers got through with no real review. But it was probably unwise to publish a paper this way casting doubt on another discipline’s peer review, using plagiarized material, poor cites, and government support…

    Perusal of Wegman C.V., Feb 2010 can be instructive – much to mine there.

    As for social networks:
    3) I’ve already pointed out the amazing belief that little work has been done in coauthor networks.

    4) Elsevier *has* a journal called Social Networks, with a higher impact factor than CS&DA.

    5) Searching within CS&DA for {social network} or {social networks} yields a few hits, and I haven’t examined all of them … but the bottom line is:

    CS&DA is *not* where people publish social networks research…

    6) As a quiz, there is another journal called Computational Statistics. What’s Wegman’s connection with *that* one?

    IMPORTANT NOTE:
    If one doesn’t do silly things like claiming only coauthorship counts as a relationship, senior academics tend to have big webs of relationships. It is not an *a priori* indication of wrongdoing to submit an article to a journal where you know somebody. What’s bizarre is to attack another field with no basis, which Wegman and co have now done repeatedly. It shows up in the Wegman Report, the reply to Stupak, Sharabati’s dissertation, and I think several more places.

    CS&DA seems an OK journal and has not generally been hijacked.

  34. Oops, I mentioned it in CCC, but there is also Wiley Interdisciplinary Reviews: Computational Statistics, a new journal whose 3 editors are Wegman, Said, and Scott, including the inexplicable label of Said as a Professor at Oklahoma State U (she isn’t). Somebody might ask Wiley.

  35. Of course, this may all be coincidence, as two other papers got through with no real review.

    Or maybe a look at the social network connections between the authors of those papers and the editorial board might reveal something interesting.

    But it was probably unwise to publish a paper this way casting doubt on another discipline’s peer review, using plagiarized material, poor cites, and government support…

    Those who live in glass houses shouldn’t cast stones … especially from inside the house.

  36. I looked quickly at the Timmerman paper, one of the two other “instant” acceptance papers. It seems reasonable enough with no obvious link to the journal’s board. Lots of references for what it’s worth.

  37. Ted Kirkpatrick

    Thanks for the encouragement! The more I think about this paper, the more oddities appear. I’ll add just some short comments to what I said above.

    First, I emphasize that my comments about the problems with the writing and references in Said et al. (2008) are in addition to and independent of the question of “striking similarities” of wording. I was considering what the published paper demonstrated about the quality of its review process. I think the distinction is important because referees are not asked to check a submission for similarities to the wording of other papers, but they are asked to check a submission for the precision of its key definitions, the clarity and potential impact of its contribution, and how well its argument is grounded in the ongoing discussions of the field, amongst other things.

    In line with that, I started reading the article as though I were reviewing a submission. By the second page, I’d already had enough “huh?” moments that I would have recommended the paper be revised before acceptance. In my comment I only listed two examples, but there were many others I could have given. Thus analyzing the paper using *only the terms of reference for referees* (and not even looking for problems in their technical argument) revealed serious problems with the reviewing, even without considering similarities to other documents.

    But despite the length of my comment, I missed the most substantial reviewing lapse of all: The paper never states its claimed contribution. Read the paper’s introduction, then ask yourself the classic questions: What is the question the authors are asking? How are they going to answer it? Why is their question important?

    The authors never say. (They partially answer the questions in their abstract, but the body of the article ought to make the case more fully and back it up with relevant citations.) And I think this fuzziness of purpose was central to getting the article through review. If the authors had clearly stated, “We propose and validate a new approach to clustering social networks”, the reviewers might well have replied, “That’s in the purview of CS&DA, but we don’t see anything like a validation of your new clustering method—perform a validation and resubmit”. Had the authors instead stated, “We propose a taxonomy of coauthorship networks”, the reviewers *should* have replied, “That’s a sociology of science contribution. Not CS&DA’s field—submit somewhere else, like ‘Social Networks'”.

    The authors never explicitly made either claim, and the referees apparently never asked them to clarify. Another big sign that the submission was inadequately reviewed, to the point of not asking how the submission was related to the journal’s mission.

    [DC: Good points all – a very cogent analysis. I was hinting at that problem with my observation about jumping back and forth between a particular domain analysis and the methods, or between social network structures and their implementation. But you’ve summed up arguably the basic problem at the heart of the paper.]

  38. It looks like Said is an Assistant Professor at Oklahoma State. She isn’t listed here:

    http://statistics.okstate.edu/people/faculty.htm

    but this hasn’t been updated for a long time.

    However, she is listed here:

    http://www.okstate.edu/registrar/Catalogs/E-Catalog/2009-2010/Faculty.html#STAT

    if this is the same person.

    FWIW

    I wouldn’t want to be an Assistant Professor if my postdoctoral advisor had plagiarized a paper reporting my research. Not good.

    [DC: As has been remarked already, it’s hard to say who did what exactly. I’m sure it will all come out eventually.

    Anyway, it seems the Oklahoma State appointment did not go through for some reason and Said ended up staying at GMU as Assistant Research Professor in Computational & Data Sciences.

    http://peoplefinder.gmu.edu/index.php?search=yasmin+said&group=all&x=62&y=13 ]

  39. My first comment since your clipping of my posts which suggested that climate scientologists may have filtered endpoints of temp plots to ‘hide the decline’. Actually, you seem to be on target on some points here.

    It is extremely convincing that some wording was taken from other sources. Which came from whom, I don’t know.

    Of course, the main issue is whether you think Mann 98 is a reasonably correct method or not. Was Wegman right or not.

    Is it your contention that Wegman was incorrect in his interpretation that the use of decentered PCA was bogus?

    • Gavin's Pussycat

      Of course, the main issue is whether you think Mann 98 is a
      reasonably correct method or not. Was Wegman right or not.

      No, that is not the main issue. The main issue is plagiarism. For starters.

    • “…climate scientologists…”. So amusing. I think we can assume that this represents your opinion of climate scientists. Would that be all scientists in that field or only those with whom you disagree?

  40. Sam Cohen:
    You’re looking in the right places! But you really might want to read my CC document, which says:

    “[Thanks to DC] – very strange connection(?) has appeared with Oklahoma State University (??)
    http://www.okstate.edu/registrar/Catalogs/E-Catalog/2009-2010/Faculty.html,
    and the associated PDF, created 08/05/09 both list Yasmin H. Said as an Assistant Professor in Statistics.
    statistics.okstate.edu/people/faculty.htm But the OSU Statistics Department does not.
    There may have been some period when both Said and OSU thought she was coming there.
    In any case, many joint papers are found 2005-2009 via:
    Google Scholar: EJ Wegman YH Said
    Ironically, one paper was “Text Mining with Application to Fraud Discovery”
    What was going on? OSU seems a very unusual choice for Said. It is difficult to think of any connection
    except possibly Inhofe, but he is more involved with U of Oklahoma.
    Can OSU Statistics say more? Why was she listed?”

    I.e., this was out a while ago. It’s one of the interesting dangling questions.

  41. “Of course, the main issue is whether you think Mann 98 is a reasonably correct method or not. Was Wegman right or not. ”

    I would have thought the main issue was whether Mann’s results were accurate or not.

    Are you saying that what matters is not Wegman’s methodology but his result, a result that criticised Mann’s methodology while ignoring his result?

  42. Oklahoma State is neither an obvious nor an unusual choice for Said. If she is looking for an academic job, she would take what she could get. Some people get multiple offers, some get none. If this was the only offer she got, or if this was the best offer she got, then her choice was to take it or leave it.

    I do, however, wonder what happened. The site that lists her as a Statistics faculty member was the Registrar’s website, which would only list what departments told them to list.

    The Department of Statistics at Oklahoma State has an opening for an Assistant Professor, to begin August 2010.

    [DC: It’s a mystery, all right. I suppose it’s not unusual for an offer to be rejected. But surely it is unusual for the rejection to happen *after* the lists are published? Perhaps she got a “better” offer very late, or there was miscommunication of some sort. Said also spent a year teaching at Johns Hopkins in 2006. I’m not sure what happened there either or what kind of position she had. ]

    • DC,

      If she got a better offer later, then why is she only Research Assistant Professor at GMU? This is not a tenure track position and could hardly be considered a better offer. Clearly, she must have accepted an offer at OK State, they listed her as a faculty member, then she never took the position. Did she decline an offer she had already accepted or was the offer withdrawn after she had accepted it? Either way, she would have had to accept the offer for them to list her as a faculty member.

      I would be interested to know if this issue of plagiarism in Said, et al., has already been reported to GMU.

      With respect to the Johns Hopkins position, this was before she became a postdoc at GMU. Her F32 grant went from 26 May 2006 to 25 May 2009. The grant was an O1A1, which means it was a second (revised) submission. I believe she got her Ph.D. in 2004 or 2005, as she won an award for outstanding dissertation in 2005 for the period 1 July to 30 June. Perhaps she took a temporary position there before her NIH postdoc was funded. A May 2006 start date for an F32 grant would have meant a 2005 submission date on the revised application, and either an early 2005 or late 2004 submission date on the original application.

  43. Sam: thanks, new-to-me info on the grants.

    Wegman’s C.V., updated Feb 2010, is worth perusing.

    1) Said’s PhD is dated 2005.

    2) Carey Priebe is a Professor at JHU, was one of Wegman’s students, and has coauthored at least 6 papers with him from 1997-2005.

    3) Wegman has a huge social network. I would certainly guess he knows the Stat Dept. head at OSU, Abe Ahmad, so maybe that’s a connection. But given how frequently he has worked with Said, one might have expected her to find something closer than Stillwater, OK … which is a 90-minute drive from Oklahoma City, and air connections aren’t that great, especially for getting to Europe or the Mid-East.
    Stillwater does have a few mosques, but compared to places like Washington, New York City, or the San Francisco Bay Area, it just wouldn’t be an obvious early choice. Of course, if nothing else were available, so be it, but usually somebody of Wegman’s experience helps find good positions for his students.

    4) But obviously, if someone cares, they might write to both OSU and Wiley-Interscience and ask them why Wiley thinks she’s a Professor @ OSU.

    • John,

      My guess is that Said, et al., have far bigger things to worry about than a mistaken affiliation, or at least they will in the near future. The NIH does not take Research Misconduct lightly, and Said was funded by the NIH when this paper was written and published. She is the first author, so it might be difficult to explain away why a good portion of a key paper from her postdoc was plagiarized. Perhaps Wegman wrote the paper and put her name first. No matter who wrote it, there will necessarily be some explaining to do.

      DC, it might prove enlightening to take a look at Said’s doctoral dissertation from GMU. Perhaps some clever person on this blog could see if they can obtain a copy of it.

      [DC: Both Said’s and Sharabati’s PhD dissertations are available online at GMU (although I don’t have the exact links handy right now). I have not examined Said’s dissertation in detail, but as I recall it did not touch on the above social network analysis background. I did note above that William Wieczorek, one of the reviewers of Wegman et al, is also listed as one of Said’s advisors.

      The Sharabati dissertation does contain unattributed background material on social networks that is strikingly similar to the Said et al introduction, which in turn is … well you get the idea. ]

  44. To add to your Wegman collection: here’s some stuff on the basic science that he gets wrong: compare Wegman’s statement:

    “Both Esper et al. (2002) and Moberg et al. (2005) indicate that current global temperatures are not warmer that the medieval warm period.”

    With Esper (2002):
    “annual temperatures up to AD 2000 over extra-tropical NH land areas have probably exceeded by about 0.3 C the warmest previous interval over the past 1162 years. ”

    and Moberg (2005):
    “We find no evidence for any earlier periods in the last two millennia with warmer conditions than the post-1990 period—in agreement with previous similar studies”

    • @Dave.

      Oh boy. And how many times a week does someone try to argue in msm comments that the MWP was warmer than today, in tandem with another obligatory “cynic” throwing Wegman into the mix? Plenty enough. Many thanks for the corrective references and quotes.

  45. DC wrote:

    “For starters, George Mason University must initiate a complete, independent investigation of the activities and scholarship of Edward Wegman and his proteges. Now.”

    Since I couldn’t see any confirmation that you’d initiated the relevant complaint procedure I assumed you’re “willing to wound but afraid to strike”.

    Consequently, I’ve lodged a complaint of research misconduct with GMU on your behalf.

    I wouldn’t be too optimistic about the outcome.

    This is what the Office of Research Integrity says about allegations of plagiarism:

    “As a general working definition, ORI considers plagiarism to include both the theft or misappropriation of intellectual property and the substantial unattributed textual copying of another’s work. It does not include authorship or credit disputes.

    The theft or misappropriation of intellectual property includes the unauthorized use of ideas or unique methods obtained by a privileged communication, such as a grant or manuscript review.

    Substantial unattributed textual copying of another’s work means the unattributed verbatim or nearly verbatim copying of sentences and paragraphs which materially mislead the ordinary reader regarding the contributions of the author. ORI generally does not pursue the limited use of identical or nearly-identical phrases which describe a commonly-used methodology or previous research because ORI does not consider such use as substantially misleading to the reader or of great significance.”

    Presumably, this is the view GMU will take.

    [DC: I believe this case goes beyond simple plagiarism, and thus requires a broader investigation in the public interest.

    So the question becomes: is this “limited use” or “substantial unattributed textual copying”? By my count we have about 7+ pages of “strikingly similar” material here.

    By the way, one reason (not the only one) I have not initiated a complaint myself is that the GMU complaint procedure, like most, has a confidentiality requirement, which is obviously impossible for me to fulfill, as I have already disseminated many of the relevant facts. Again, I strongly believed it was in the public interest to do so.

    Here one GMU prof gives various examples of plagiarism using four sentences (two paragraphs) of his own work as source material.

    Here is a key excerpt from the University’s policy on research misconduct:

    “Research misconduct” means fabrication, falsification, or plagiarism in proposing, performing, or reviewing research, or in reporting research results. Research misconduct does not include honest error or differences of opinion.

    (a) Fabrication is making up data or results and recording or reporting them.

    (b) Falsification is manipulating research materials, equipment, or processes, or changing or omitting data or results such that the research is not accurately represented in the research record.

    (c) Plagiarism is the appropriation of another person’s ideas, processes, results, or words without giving appropriate credit.
    ]

    • I disagree. The NIH will be following the investigation, and the ORI does not let this type of investigation fall through the cracks. If ORI does not already know about the misconduct, GMU is required to notify them of the allegation you made. Remember, Said was an NIH postdoc fellow during this period. The NIH will be very interested in the proceedings and outcome.

  46. Steven mosher

    Excellent work:

    What is the group opinion of a scientist (call him A) who writes a major report and does the following:

    1. Takes text supplied by another scientist (call him B) from a paper written by B and C, when he has been directed not to.

    2. Incorporates that text into his document without citing B or C.

    3. Asks B to review his incorporation to see if anybody can tell that he took the work from B and C’s work.

    4. Gives B privileges (as a reviewer) that he is not authorized to grant, and fails to disclose this to other reviewers.

    5. Violates the rights of a citizen who requests documentation of his misdeed.

    Hypothetically of course

    What would your opinion be about such an act?
    hypothetically, of course.

    [DC: Hypothetically, you appear to be referring to the inclusion of Wahl and Ammann in AR4 and your comment at RC.

    http://www.realclimate.org/?comments_popup=3846#comment-171303

    I think Gavin answered your nonsense very well. Nice try, though.

    Of course, further discussion of this issue is off-topic, although you may continue at the Open Thread. Thanks!

    Returning to Wegman et al, relevant discussion might center on the exclusion of Wegman et al from IPCC AR4 WG1 (justified by the poor scholarship and lack of proper peer review).

    Also pertinent is the exclusion of Wahl and Ammann from substantive consideration by Wegman et al. Wegman et al’s reasoning for this simply does not hold up. ]

  47. I used to deal with much simpler plagiarism, i.e., by students copying computer programs, via rummaging in the previous term’s punchcard decks (long ago!). I always warned them upfront, and some never realized how easy it was to find.

    That was just laziness in not doing the work …

    In a paper, one can:
    (1) Just point at a standard source.
    (2) Or if the intro material is deemed necessary, just quote it, hopefully getting permission.

    Either of those establishes:
    (a) The authors know of the standard material. That is valuable because researchers are supposed to know that.

    (b) The ability of the reader to go look at the original source and check that:
    (b1) It is correctly quoted.
    (b2) It is in context, not cherry-picked.

    but it does *not* establish that:
    (c) The authors are truly knowledgeable in this turf, the way original text might do.
    [This is why I really like to attend live talks and be able to ask questions. An expert can give a talk, then handle question after tough question well. Someone else could present the identical talk, and then fall apart in Q&A, and it’s really easy to tell the difference, live.]

    So, of the instances of “striking similarity” DC has found:

    A) Some just seem like laziness, like using a few paragraphs from Wikipedia. I doubt anyone would get too excited about that by itself, but it is part of the pattern.

    B) Some (like the social-networking texts) were just using standard material, perhaps from laziness, or perhaps to impress readers with expertise … that wasn’t really there. After all, *most* of those pages of text had relatively little to do with their analysis. To pick an easy case, the Wegman Report has a whole paragraph on “Triad” (similar to Wasserman & Faust) … and then it’s never mentioned again. But it looks impressive.

    C) But the tree-ring material seems to violate b1) and b2) rather strongly. Had they cited Bradley, the problems in their interpretation would have been instantly obvious. Likewise, search for “confounding” in the WR. Of the 7 hits:
    1 is unrelated
    5 emphasize confounding factors against tree-rings
    1 (page 27 by their page #, 26 of PDF) says:

    “The variables affecting earth’s climate and atmosphere are most likely to be numerous and confounding. Making conclusive statements without specific findings with regard to atmospheric forcings suggests a lack of scientific rigor and possibly an agenda.”
    One should read the preceding paragraphs of that for context … but this mostly seems an argument from ignorance.

    After all, as Judith Curry recently wrote to defend Wegman in #125 @ collide-a-scape:

    ” When asked to explain the greenhouse effect, he really didn’t know anything about the physics of how it worked. So I don’t think you could have gotten a more unbiased person to do this review. ”

    Bradley’s book is 600 pages and carefully explains the various factors and how they are dealt with. Really, the emphasis on “confounding” is like starting with a good book on software engineering & testing and spinning a narrative emphasizing that because there can be bugs, software cannot work … despite the book’s descriptions of the techniques used to avoid, find, fix and regression-test bugs.

    Plagiarism can take huge work to find, but once found, can be very, very clear, even to people unfamiliar with the topics.

    But often, plagiarism is a hint that further investigation may find more important issues, which I think has happened here. {18USC1001, 18USC371, 18USC4}, should they come into play, would be nightmares compared to mere plagiarism.

  48. Pingback: Judith Curry doesn’t let up [The Island of Doubt ]

  49. It occurs to me that statuses=>statues could have been a result of sloppy spellchecking, “statuses” being a common pluralization but not the correct one per Latin grammar.

  50. It is common knowledge that the Wegman Report was instigated by McIntyre supporters (or at least MBH “sceptics”) as a counter to the NRC report.

    Who could have imagined that it wasn’t just biased but also of such poor quality? DC has been doing a great job but I wonder why the report wasn’t examined more closely by its targets?

    I sometimes think that scientists ignore too much contrarian work, especially when it has such an impact. Surely Ray Bradley, for instance, would have spotted the plagiarism of his own work if he had read it?

    [DC: If you read my previous account here, you will see that the Wegman panel was formed in September 2005. Boehlert’s request to the NAS came later, likely after he got wind of this second phase of the partisan Barton investigation. ]

    • Thanks. I know that Wegman was not a “response” in terms of chronology. I thought they were more-or-less in parallel. Was the NRC report set up only when the Energy and Commerce Committee refused the offer of a proper scientific review from the National Academy of Sciences?

  51. Just to add (from your article).

    The Science Committee hearings (and National Academy of Sciences report) would come in March 2006, while Barton shot back with a report by a panel led by Edward Wegman and his own set of hearings in July 2006.

    This explains my misconception.

    [DC: The sequence of events was indeed confusing; it took me a while to unravel it all. I’ll look back and see if I can easily clarify further. ]

  52. TrueSkeptic:

    “Surely Ray Bradley…” is a bad assumption.

    I can easily imagine how he’d missed it.
    Suppose he started skimming the introductory material. He could easily think “pretty standard stuff, looks OK” and then skip to the real arguments, easily missing the weakenings and reversal buried towards the end of that section. People who publish a lot don’t necessarily remember every word they wrote, especially if uncontroversial.

    And after all, who would ever be looking for plagiarism in such a high-profile report?

    • John,

      I’m sure you’re right. I thought that just maybe it would be a case of “oh, this looks familiar…I wonder why?”.

  53. Here is an excerpt from Wegman et al section 2.2 (p. 16) that bears a striking similarity to the Wikipedia article on Colors of Noise as of April 12, 2006.

    The color names for these different types of sounds are derived from an analogy between the spectrum of frequencies of sound wave present in the sound (as shown in the blue diagrams) and the equivalent spectrum of light wave frequencies. That is, if the sound wave pattern of “blue noise” were translated into light waves, the resulting light would be blue, and so on.

    The material is short and arguably definitional, although citations should still be given for background material (there are none at all in this section, just as in section 2.3 on social network analysis).

    However, this excerpt is interesting for two reasons.

    1) This passage first appeared in Wikipedia on April 12; the previous version from April 10 does not have it. Presumably then the section was written in April 2006 or later, which helps to solidify the timelines.

    2) The context of discussion in Wegman et al was signals; yet, the odd reference to sound was left in. It would have been easy (and more logical) to simply change “sound” to “signal”. It does suggest that whoever worked on this passage did not fully understand the material or was working in great haste. Or both.

  54. Doug Bostrom

    If Wegman et al intended to cast doubt on the reliability of peer-reviewed literature, they’ve certainly accomplished that goal, ironically speaking.

  55. This issue has an analogy in the anti-vax arena. The anti-vaxers finally get someone to publish something contrary in a peer-reviewed journal, and they trumpet it to high heaven, claiming the peer-reviewed science supports their position. Then, oops, it’s retracted or fraudulent, and they therefore claim there is a conspiracy to keep their side of the issue out of the literature.

    Here we have someone who writes a paper supporting the theory that there exists a conspiracy among climate scientists to only publish research that supports their “side” due to their incestuous interrelationships, but oops, the paper actually doesn’t support that at all, and is found to be fraudulent (plagiarized). If Wegman is forced to retract the paper, then this will be proof of the conspiracy to keep the deniers from publishing in the peer-reviewed literature.

    In reality, in both cases, it’s just s**tty science.

  56. Usually, a bad paper gets refuted elsewhere.
    The key message of Said, Wegman, Sharabati, Rigsby seemed to be that Mann-style “entrepreneurial” networks could be more prone to peer review problems than Wegman-style “mentor” nets.

    Hence, it seems a “self-refuting” paper: it lacks evidence for its own hypothesis, but is itself evidence against it.

    [DC: Well put! ]

  57. Apparently Tom Fuller believes that what DC is doing (exposing Wegman’s plagiarism) is sleazy, yet what he did (using STOLEN personal emails and profiting by writing a book with Mosher called Climategate: The CRUtape Letters) is not! Yikes!

    [DC: Your point is well taken, but I hope we don’t have to discuss Fuller at any length. ]

  58. …”(the sentence should presumably read “which are represented”, rather than “that”).”
    It is ridiculous to throw in, alongside deadly serious accusations of plagiarism, a nitpicking appeal to a pseudo-rule invented by ignorant prescriptivists in defiance of longstanding English usage. To quote Geoffrey Pullum, co-author of the Cambridge Grammar of the English Language:
    “Excellent writers who have not been forced to submit to American copy editors on this point tend to use which and that in roughly equal proportions at the beginnings of their integrated relative clauses.”
    http://languagelog.ldc.upenn.edu/nll/?p=1689#more-1689

    [DC: You have misunderstood my point, which was that this addition was one more confusing change (among many) introduced by the authors. I had to reread it to understand it, so the comment also served to help others who may have had the same “huh?” reaction that I had. The confusion was a result of the choice of conjunction, together with the hazy relationship of the clause to the preceding statement.

    Now suppose I had written:

    “You have misunderstood my point, that was that this addition was one more confusing change (among many) introduced by the authors.”

    Surely some readers would have been confused.

    But I am no “prescriptivist” – far from it. It all comes down to cases and clarity, ultimately.]

  59. > journalist should do

    But, but, Wegman is Too Big To Fail.
    You’d bring down the entire US economic system!

    Ironically, Wegman et al. are experts in data mining. Wouldn’t you think their writing tools would include something like Zotero (also from GMU) to track sources and attributions properly? http://www.zotero.org/
    (I continue to use and like Zotero as a way of collecting snippets-with-citations that I think may later come in handy, though I do wonder who else at GMU might have access to its files.)

  60. James Annan doesn’t do pingbacks, so I’ll just point to his post on Curry (with an appearance from yours truly).

    http://julesandjames.blogspot.com/2010/04/curried-leftovers.html

  61. Moberg and Esper revisited – Wegman et al refute themselves

    Wegman et al (p. 47)

    Both Esper et al. (2002) and Moberg et al. (2005) indicate that current global temperatures are not warmer that the medieval warm period.

    Wegman et al (p. 83) [Summary of Moberg et al 2005]

    This study finds no evidence for any earlier periods in the past two millennia with warmer conditions than the post-1990 period.

    Wegman et al (p. 86) [Summary of Esper et al 2002]

    Additionally, using RCS methods, climate variability of the Medieval Warming Period (MWP) can be reconstructed, and it approaches the magnitude of 20th-century warming in the Northern Hemisphere up to 1990.

    The original is clearer

    In so doing, evidence for a large-scale MWP (sensu lato) has been reconstructed, and it approaches the magnitude of 20th-century warming in the NH up to 1990.

    • The Wegman et al quote above from p. 47 about “current global temperatures” is found right after a figure from d’Arrigo et al 2006 showing various Northern Hemisphere reconstructions in separate panels, including the Moberg and Esper reconstructions.

      These do not show the instrumental record and so do not show “current” temperatures at all. Moberg for example appears to stop in 1980. This likely accounts for the error.

  62. And then there is Cook, Esper and d’Arrigo (2004), with the quote highlighted by Dave above, revisiting Esper, Cook and Schweingruber (i.e. updating their own work):

    The temperature signal in the ECS reconstruction is shown to be restricted to periods longer than 20 years in duration. After re-calibration to take this property into account, annual temperatures up to AD 2000 over extra-tropical NH land areas have probably exceeded by about 0.3 °C the warmest previous interval over the past 1162 years.

    I guess Barton staffer and climate science gatekeeper Peter Spencer missed that one somehow.

    [Hat-tip to Robert Murphy at Rabett for the 2004 reference.]

    (Yes, there’ll be a post on this and other “bogosities”, as James Annan puts it, in Wegman et al’s summary of climate science)

  63. One final note (for now). Eagle-eyed readers will have noted that Wegman et al refer to global temperatures, even though all the reconstructions discussed are Northern Hemisphere (some extra-tropical).

    This is a persistent confusion, as in the following comparison of NH temperature since 1850 and global temperature since 1900:

    “We do not assume any position with respect to global warming except to note in our report that the instrumented record of global average temperature has risen since 1850 according to the MBH99 chart by about 1.2 degrees Centigrade, and in the NAS panel report chaired by Dr. North, about six-tenths of a degree Centigrade in several places in that report.”

    Wegman testimony at Energy and Commerce Committee “hockey stick” hearings (July 19, 2006), p. 18.

  64. John Mashey

    One of the things that struck me was in Wegman’s response to Stupak, p.17, in which Wegman’s student Walid Sharabati writes:

    “It is worth mentioning that no one has yet analyzed Wegman’s author-coauthorship social network on any level. This work will provide an independent source to view the network on the different levels and examine how the actors interact with each other.”

    The use of the word “independent” is interesting.

    On p.18, he writes:

    “Of all the work that has been done on social networks, very few investigators have considered coauthorship network. Therefore, what we are about to observe in this paper is a brand new approach in the social networks field.”

    When I saw that, I was astonished…
    Now, I have never done formal research in social networks, but I have taken several courses in graph theory (including one from Claude Berge) and worked at Bell Labs, which of course developed and used much network & statistical theory. I worked closely with some of the canniest managers at Bell Labs, often in technology diffusion efforts (which sometimes relied on a practical understanding of informal “technology gatekeeper networks”). Later, I worked with some of the best computer salespeople in the business, and whenever I was going to help on a sales call, I expected a pre-briefing on the relevant social nets, although the terminology wasn’t academic.

    But I’ve seen coauthorship and citation analyses long ago, and of course, in mathematics, one’s Erdos number is famous. (I don’t have one, but I know people who have fairly low ones.)

    Similarly, there has long been a mathematics genealogy that tracks PhD descent.

    IDEA: coauthorship isn’t the only relationship that matters. Serious people track all sorts of others. (In one major sales deal, it mattered that the CEO of our prospect had bought his house from the CEO of our competitor…)
    When I did the study of the Petition to APS, I showed a wide range of relationships. It’s hard to believe that anyone familiar with the social networks literature would ignore that. It’s also hard to believe they would restrict their view to a single hop rather than the more general multi-hop case.
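
    To make the single-hop versus multi-hop distinction concrete, here is a minimal sketch (in Python with the networkx package, using an entirely made-up list of papers and one-letter author names, purely for illustration) of how one might build a coauthorship graph and compare an author’s direct coauthors with the full multi-hop collaboration neighbourhood:

    ```python
    # Hypothetical illustration only: a tiny coauthorship graph.
    import itertools
    import networkx as nx

    # Each entry is the author list of one (made-up) paper.
    papers = [
        ["A", "B", "C"],   # A writes with B and C
        ["B", "D"],        # B also writes with D
        ["D", "E"],        # D also writes with E
        ["F"],             # F publishes solo
    ]

    G = nx.Graph()
    for authors in papers:
        G.add_nodes_from(authors)
        # every pair of coauthors on a paper gets an edge
        G.add_edges_from(itertools.combinations(authors, 2))

    # Single hop: A's direct coauthors
    print(sorted(G.neighbors("A")))                      # ['B', 'C']

    # Multi-hop: everyone reachable through chains of coauthorship
    print(sorted(nx.node_connected_component(G, "A")))   # ['A', 'B', 'C', 'D', 'E']

    # Erdos-number-style collaboration distance from A to E
    print(nx.shortest_path_length(G, "A", "E"))           # 3 (A-B-D-E)
    ```

    The toy example just shows that the one-hop view (B and C) and the multi-hop view (B, C, D and E) of the same network can differ substantially, which is exactly the distinction at issue here.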

    But the idea that few had looked at coauthorship networks … was bizarre.

    Try Google Scholar: coauthorship network

    There is a *huge* literature, in part because coauthorship of academic papers is easily available without doing exhaustive surveys of people in the field.

    Try: Google Scholar: barabasi coauthorship and look at the (incredible) citation counts. (H/T to Garry Robins, i.e., somebody who actually does research in this field.)

    SUMMARY: Wegman’s group seems ill-informed about (human) social research. Wegman later seems to claim his social network is only about 15 people, based somehow on who he has coauthored with, even though Sharabati identified a much bigger network of coauthors alone, not counting the other relevant relationships.

    One has to wonder how far Said, Wegman, Sharabati and Rigsby would have gotten had they submitted to Social Networks, or to other journals in the scientometrics field.

  65. So, um, John, who has the lower Erdos number, climate change denialism, or climate change study?

  66. Pingback: Wegman again [Stoat]

  67. A contemporary example of the kind of work John Mashey describes:

    http://scienceblogs.com/christinaslisrant/2010/04/review_of_an_article_using_bib.php?utm_source=editorspicks

    “The authors combine network analysis of the co-authorship network with qualitative interviews with the scientists to look at intergroup collaboration, migrations, and exchange of services or samples.”

    The citation:

    Velden, T., Haque, A., & Lagoze, C. (2010). A new approach to analyzing patterns of collaboration in co-authorship networks: mesoscopic analysis and interpretation Scientometrics DOI: 10.1007/s11192-010-0224-6 (pre-print available at: http://arxiv.org/abs/0911.4761 )

  68. This is as bad as your last analysis. You take common definitions and statements that could be attributed to dozens of authors, attribute them to one, and call it plagiarism.

    [DC: No. If you look at the side-by-side comparison I provide – which covers all five pages of Wegman et al section 2.3 – there can be no doubt that the antecedents are the Wikipedia article and two text books I have identified. ]

  69. Oops, lose the trailing close parenthesis:
    http://arxiv.org/abs/0911.4761

  70. John Mashey

    Another tipoff is the inclusion of definitions that don’t really seem necessary. For example, search for “triad” in the Wegman PDF. They spend a paragraph defining it, then don’t really use it. Likewise, references that are listed but never cited are odd (although most are not as odd as Tom Valentine’s :-))

  71. Ted Kirkpatrick

    Speaking of social networks at GMU: Virginia Attorney-General Ken Cuccinelli, who is suing the University of Virginia for records related to Michael Mann … is a 1995 graduate of the GMU Law School.

  72. FYI, I just received this email from the NIH Office of Research Integrity. Relevant here, I believe.

    ABSTRACT BEING ACCEPTED FOR QUEST FOR RESEARCH EXCELLENCE CONFERENCE
    ORI is accepting abstracts on Research on Research Integrity of the Quest for Research Excellence Conference 2010. If interested, please complete the application form and submit it by August 1. Decisions will be made within 1 month of receiving applications.
    http://ori.hhs.gov/documents/Q4RE_AbstractSubmission.doc

    NEW MISCONDUCT CASE: SCOTT J. BRODIE
    ORI made fifteen findings of misconduct in science based on evidence that Dr. Scott Brodie knowingly and intentionally fabricated and falsified data reported in nine PHS grant applications and progress reports and several published papers, manuscripts, and PowerPoint presentations.
    http://ori.hhs.gov/misconduct/cases/Brodie_Scott.shtml

    NEW MISCONDUCT FINDING: BORIS CHESKIS
    Boris Cheskis engaged in research misconduct in grant applications 1 R01 DK072026-01 and 1 R01 DK072026-01A2 submitted to the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), NIH.
    http://ori.hhs.gov/misconduct/cases/Cheskis_Boris.shtml

    NEW MISCONDUCT FINDING: EMILY HORVATH
    Emily Horvath admitted to falsifying the original research data when entering values into computer programs for statistical analysis with the goal of reducing the magnitude of errors within groups, thereby gaining greater statistical power.
    http://ori.hhs.gov/misconduct/cases/Horvath_Emily.shtml

    CALL FOR APPLICATIONS
    Applications are invited for a two-day workshop on teaching research ethics using the OpenSeminar in Research Ethics. Travel stipends available. July 12-13, 2010, at North Carolina State University. Eligibility: those teaching or planning 1-credit graduate level research ethics courses, with a preference for faculty in tenured or tenure-stream positions in philosophy departments.
    http://openseminar.org/ethics/

    USA SCIENCE & ENGINEERING FESTIVAL
    The Inaugural USA Science & Engineering Festival will be the country’s first national science festival and will take place in Washington, D.C. on October 10 – 24, 2010. The Festival promises to be the ultimate multi-cultural, multi-generational and multi-disciplinary celebration of science in the United States.
    http://www.usasciencefestival.org/

    ORI BLOG: ADVISOR STEALS STUDENT’S WORK. WHAT WOULD YOU DO?
    A graduate student prepares a research proposal as part of her dissertation requirements. Her faculty advisor reviews the proposal but otherwise provides only minimal assistance in developing the concept. The student later learns that her advisor has paraphrased sections of her proposal and incorporated them into his own application to a different funding agency. How should the student respond?
    http://ori.hhs.gov/blog/!0510-2

    ORI BLOG: BEING RECRUITED BY THE COMPETITOR.
    A graduate student completes a dissertation based on research that was partly funded by a corporate sponsor. After graduation, the student is offered a job working for another corporation that competes with the first. There can be no doubt that the dissertation and experience with a competitor’s research played a role in this offer. Is there a conflict? Does it matter whether the student personally received any funding from the corporate sponsor?
    http://ori.hhs.gov/blog/!0510-1

  73. Looks like some careers have come to an end … the misconduct findings make for some interesting readings.

  74. Pingback: The “Hockey Stick” evolution « Our Clouded Hills

  75. PolyisTCOandbanned

    1. GMU is a state school located in VA. Not THAT strange that some of its graduates would be in state government political positions.

    2. It does have a reputation, especially in economics, for having some world-class conservative professors. So…maybe conservatives gravitate there and there is some conservative network. But given that professors are in general FAR more conservative than either the general population or the general student population, is this so bad? I mean y’all got Harvard and all kinds of other places for lib’ruls.

  76. PolyisTCOandbanned

    I mean profs are more librul than the norm.

  77. “I mean profs are more librul than the norm.”

    More intelligent, too. Interesting correlation, no?

  78. Pingback: Wegman Report update, part 1: More dubious scholarship in full colour « Deep Climate

  79. Pingback: Replication and due diligence, Wegman style | Deep Climate

  80. Pingback: Wegman and Said 2011: Dubious Scholarship in Full Colour, part 1 | Deep Climate

  81. Pingback: Retraction of Said, Wegman et al 2008, part 1 | Deep Climate

  82. Pingback: Nielsen-Gammon interviews North and others on Wegman – plagiarism may be related to a cultural misunderstanding by foreign exchange student « Wott's Up With That?