Said and Wegman 2009: Suboptimal Scholarship

Today I present an analysis of a 2009 article by Yasmin Said and Edward Wegman of George Mason University. “Roadmap for Optimization” was published in the inaugural issue of WIREs Comp Stat, one of a new family of Wiley publications conceived as a “serial encyclopedia”. Wegman and Said, along with David Scott of Rice University, are also editors of the journal; the three are best known as co-authors of the 2006 “hockey stick” report to Congress, commissioned by Rep. Joe Barton.

As the title implies, the article was meant to provide a broad overview of mathematical optimization and set the stage for subsequent articles detailing various optimization techniques. However, my analysis, entitled Suboptimal Scholarship: Antecedents of Said and Wegman 2009, demonstrates the highly problematic scholarship of the “Roadmap” article.

  • No fewer than 15 likely online antecedent sources, all unattributed, have been identified, including 13 articles from Wikipedia and two others from Prof. Tom Ferguson and Wolfram MathWorld.
  • Numerous errors have been identified, apparently arising from mistranscription, faulty rewording, or omission of key information.
  • The scanty list of references appears to have been “carried along” from the unattributed antecedents; thus, these references may well constitute false citations.

First, I’ll present an abridged version of the Suboptimal Scholarship summary as an overview of the analysis. Then I’ll take a look at a few examples showing the derivation of “Roadmap” from its antecedents, including some remarkable errors introduced in the process. And finally I’ll place this latest embarrassment in the context of the pattern of dubious scholarship evidenced by Wegman and Said over the last several years.

Overview of Suboptimal Scholarship: Antecedents of Said and Wegman 2009

Here is the full citation for “Roadmap for Optimization”:

Yasmin H. Said and Edward J. Wegman, “Roadmap for Optimization”, Wiley Interdisciplinary Reviews: Computational Statistics [WIREs Comp Stat], Volume 1, Issue 1, pages 3-11, July/August 2009. Online July 13, 2009.

The abstract reads:

This article focuses broadly on the area known as optimization. The intent is to provide in broad brush strokes a perspective on the area in order to orient the reader to more detailed treatments of specific subdisciplines of optimization throughout WIREs: Computational Statistics. In this article we provide background on mathematical programming, Lagrange multipliers, Karush-Kuhn-Tucker Conditions, numerical optimization methods, linear programming, dynamic programming, the calculus of variations, and metaheuristic algorithms.

Thus, this article was conceived as a key element in realizing the “serial encyclopedia” vision of WIREs Comp Stat.

As can be seen in the following table, apparent unattributed antecedents have been identified in each of the sections corresponding to the aforementioned subjects. For the most part, the antecedents’ use appears sequential, and shows little of the interspersing of sources found in the pair’s other WIREs overview article, Wegman and Said 2011 “Color Theory and Design” (previously analyzed).

Sect.  Source (with version date)
1      1. Wikipedia – Mathematical Optimization (Jan. 2009)
2      2. Wikipedia – Lagrange Multiplier (Jan. 2009)
       3. Wikipedia – Karush-Kuhn-Tucker conditions (Jan. 2009)
3      4. Wikipedia – Gradient Descent (Nov. 2008)
       5. Wikipedia – Conjugate Gradient Method (Jan. 2009)
4      6. Wikipedia – Linear Programming (Jan. 2009)
       7. Tom Ferguson – Linear Programming
       8. Wikipedia – Simplex Algorithm (Apr. 2009)
       9. Wikipedia – Karmarkar’s Algorithm (Jan. 2009)
       10. Wikipedia – Interior Point Methods (Dec. 2008)
5      11. Wikipedia – Dynamic Programming (Jan. 2009)
6      12. Wikipedia – Calculus of Variations (Dec. 2008)
       13. Wolfram MathWorld – Calculus of Variations
       14. Wikipedia – Fundamental Lemma of Calculus of Variations (Jan. 2009)
7      15. Wikipedia – Simulated Annealing (Jan. 2009)

Section 2 (p. 3-5) of Suboptimal Scholarship breaks down Said and Wegman 2009 page by page and gives identified antecedents, if any, for each sub-section, paragraph by paragraph. This section also contains a brief selection of identified errors, most of which have been introduced through various changes to the antecedents, reflecting apparent mistranscriptions or misunderstandings of subject area details. Some key omissions are also identified.

Section 3 (p. 6-30) begins with a detailed breakdown of the antecedents, paragraph by paragraph. That is followed by a complete textual side-by-side comparison of Said and Wegman and its identified antecedents, with identical and trivially similar text highlighted in the now familiar cyan (identical) and yellow (trivially different) highlighting scheme.

Finally, section 4 (p. 31-33) has a brief analysis of Said and Wegman’s references and suggested reading list. The seven references, including two for different editions of Bellman’s Dynamic Programming, can all be found in the corresponding Wikipedia antecedents. Similarly, the extensive “Further Reading” list also appears to be derived from the various antecedents.

Example 1: “Mathematical programming is the simplest case of optimization” [SoS p. 4, p. 10]

Note: in this and the following examples, I’ve given the relevant page numbers from Suboptimal Scholarship (SoS), in case you want to follow along. The first page number is the page analyzing the error(s) and the second is the location of the detailed side-by-side comparison.

Almost a year ago, reader “Amoeba” tried to point out that parts of “Roadmap” had “striking similarities” to the Wikipedia article on Mathematical Optimization. I didn’t get it at the time, but in my defence the correspondence is quite a bit clearer if you go back to the version from early 2009.

In mathematics, the simplest case of optimization, or mathematical programming, refers to the study of problems in which one seeks to minimize or maximize a real function by systematically choosing the values of real or integer variables from within an allowed set. This (a scalar real valued objective function) …

Now let’s omit certain parts:

… the simplest case of optimization, or  mathematical programming, … real … variables … a scalar real valued objective function

And rearrange to get the second paragraph of Said and Wegman:

The simplest case of mathematical optimization is mathematical programming. Mathematical programming considers as its objective function a scalar real-valued function of real variables.

The rest of the section features a similar rearrangement of the Wikipedia antecedent, but with so much common material as to make the source unmistakeable.

However, the real howler is that Said and Wegman have failed to understand that “mathematical programming” is used here as a synonym for mathematical optimization (or just plain optimization); it is patent nonsense to claim that mathematical programming is the “simplest case” of mathematical optimization.

If only they had waited a year, they could have avoided this particular fiasco. The Wikipedia opening now reads:

In mathematics and computational science, mathematical optimization (alternatively, optimization or mathematical programming) refers to the selection of a best element from some set of available alternatives.

In the simplest case, this means solving problems in which one seeks to maximize (or to minimize) a real function by systematically choosing the values of real or integer variables from within an allowed set.

Enough said.
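As an aside, the corrected Wikipedia definition is concrete enough to illustrate in a few lines. The sketch below is not anything from the “Roadmap” article; the objective function and the integer grid are hypothetical, chosen only to show what “systematically choosing the values of … integer variables from within an allowed set” means in the simplest case.

```python
# A small illustration of the (corrected) definition: optimization selects a
# best element from a set of available alternatives, here by systematically
# checking every integer point in an allowed set.

from itertools import product

def f(x, y):
    """A simple objective with its minimum at (2, -1)."""
    return (x - 2) ** 2 + (y + 1) ** 2

# The "allowed set": integer pairs in [-5, 5] x [-5, 5]
allowed = product(range(-5, 6), repeat=2)

# "Systematically choosing the values ... from within an allowed set"
best = min(allowed, key=lambda p: f(*p))
print(best)  # (2, -1)
```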

Example 2: 2d (or, not 2d) [SoS p. 4, p. 22-24]

The long section on linear programming contains some of the most egregious lifting of material; check out the “sea of cyan” in the side-by-side comparison.

Here is a short excerpt from a longer passage that is strikingly similar to Tom Ferguson’s handy Linear Programming: A Concise Introduction. The highlighted portions are identical, with the slight changes shown with strikeout.

A vector~~,~~ x ~~for~~ in the standard maximum problem or y ~~for~~ in the standard minimum problem~~,~~ is said to be feasible if it satisfies the corresponding constraints. The set of feasible vectors is called the constraint set. A linear programming problem is said to be feasible if the constraint set is not empty; otherwise it is said to be infeasible. A feasible maximum (resp. minimum) problem is said to be unbounded if the objective function can assume arbitrarily large positive (resp. negative) values at feasible vectors~~; otherwise, it~~. If a problem is not unbounded, it is said to be bounded.

And so it goes, for several more sentences. But even these minimal changes have introduced confusion; the original commas (and the word “for”) are there for a reason, to emphasize that x and y are used for the standard maximum and standard minimum problems respectively, and that the rest of the statement applies to “a vector” in general. Then the removal of “resp.” (i.e., “respectively”) makes the parallel construction incomprehensible.
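For readers who want the definitions rather than the mangled version, the setup Ferguson is stating is easy to sketch: for the standard maximum problem (maximize c·x subject to Ax ≤ b and x ≥ 0), a vector x is feasible when it satisfies all the constraints, and the problem is feasible when the constraint set is non-empty. The constraint matrix and vectors below are hypothetical, chosen only for illustration.

```python
# Minimal feasibility check for the standard maximum problem:
# x is feasible iff A x <= b and x >= 0, componentwise.

def is_feasible(A, b, x):
    """Return True iff x satisfies A x <= b and x >= 0."""
    if any(xi < 0 for xi in x):
        return False
    return all(sum(a * xi for a, xi in zip(row, x)) <= bi
               for row, bi in zip(A, b))

A = [[1, 1], [2, 1]]   # constraints: x1 + x2 <= 4 and 2*x1 + x2 <= 5
b = [4, 5]

print(is_feasible(A, b, [1, 2]))   # True: both constraints hold
print(is_feasible(A, b, [3, 2]))   # False: 3 + 2 > 4
print(is_feasible(A, b, [-1, 0]))  # False: violates non-negativity
```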

One of Said and Wegman’s particular enhancements to their Wikipedia antecedents is to change the notation for variable counts from n to d, perhaps to emphasize the geometric aspect of some optimization techniques. So a long passage on the simplex algorithm, apparently originating in Wikipedia – Simplex Algorithm (Apr. 2009), contains several such switches. For example, these two sentences require five such switches:

Suppose in the standard form of the problem there are ~~n~~ d variables and m constraints, not counting the n non-negativity constraints. Generally, a vertex of the simplex corresponds to making ~~n~~ d of the m + ~~n~~ d total constraints tight, while adjacent vertices share ~~n~~ d − 1 tight constraints.

Unfortunately, the authors only made four of the changes, omitting to adjust the count of non-negativity constraints, which should of course match the number of variables. (And, no, n makes no other appearance in this section).

A few sentences later, Said and Wegman break new mathematical ground in a howler that has already been excoriated by Andrew Gelman again and again:

… the simplex method visits all 2d vertices before arriving at the optimal vertex.

The original has:

the simplex method … visits all 2^n vertices before arriving at the optimal vertex.

And all this time I thought a cube had eight vertices, not six. Who knew!
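The arithmetic behind the joke is worth spelling out. The worst-case claim is about 2^n (or, with the notation switch, 2^d) vertices, because a d-dimensional cube has 2^d corners; “2d” read literally is 2·d. A two-line check, with the cube represented in the usual way as the set of {0,1}-vectors:

```python
# Vertices of the d-dimensional unit cube: all {0,1}-vectors of length d,
# so there are 2**d of them -- not 2*d.

from itertools import product

def cube_vertices(d):
    """All corners of the d-dimensional unit cube."""
    return list(product([0, 1], repeat=d))

print(len(cube_vertices(3)))  # 8, i.e. 2**3 -- not 2*3 = 6
```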

Example 3: Simulated annealing breaks down [SoS p. 5, p. 30]

Here is the opening from the Wikipedia 2009 article on Simulated Annealing:

Simulated annealing (SA) is a generic probabilistic meta-algorithm for the global optimization problem, namely locating a good approximation to the global minimum of a given function in a large search space.

The Said and Wegman version is less clear, but at least does not totally mangle the original.

Simulated annealing is a probabilistic metaheuristic global optimization algorithm for locating a good approximation to the global minimum of a given function in a large search space.

Here’s the Wikipedia explanation of the role of the global “temperature” parameter T, and its effect on solution search.

By analogy with this physical process, each step of the SA algorithm replaces the current solution by a random “nearby” solution, chosen with a probability that depends on the difference between the corresponding function values and on a global parameter T (called the temperature), that is gradually decreased during the process. The dependency is such that the current solution changes almost randomly when T is large, but increasingly “downhill” as T goes to zero. The allowance for “uphill” moves saves the method from becoming stuck at local minima—which are the bane of greedier methods.

Now look at the Said and Wegman version, which effectively conflates the global “temperature” parameter that modulates the solution search with the minimum solution itself, as well as omitting key details such as the selection of successive candidate solutions “nearby”.

During each step of the algorithm, the variable that will eventually represent the minimum is replaced by a random solution that is chosen according to a temperature parameter, T. As the temperature of the system decreases, the probability of higher temperature values replacing the minimum decreases, but it is always non-zero. The decrease in probability ensures a gradual decrease in the value of the minimum. However, the non-zero stipulation allows for a higher value to replace the minimum. Though this may sound like a flaw in the algorithm, it makes simulated annealing very useful because it allows for global minimums to be found rather than local ones. If during the course of the implementation of the algorithm a certain location (local minimum) has a lower temperature than its neighbors yet much higher than the overall lowest temperature (global minimum), this non-zero probability stipulation will allow for the value of the minimum to back track in a sense and become unstuck from local minima.

This is plain wrong – frighteningly so, in fact.
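For contrast, the process the Wikipedia passage actually describes is straightforward: each step proposes a random “nearby” candidate and accepts it with a probability depending on the change in the objective value and on the temperature T, which is gradually decreased; “uphill” moves are what let the search escape local minima. The sketch below is only an illustration of that description, not anyone’s published code; the objective function, step size, and cooling schedule are all hypothetical.

```python
# Minimal simulated annealing sketch following the Wikipedia description:
# replace the current solution by a random "nearby" one, accepted with a
# probability depending on the objective change and a decreasing temperature.

import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995, iters=5000):
    x, fx = x0, f(x0)
    best, f_best = x, fx          # track the best solution seen so far
    t = t0
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)   # a "nearby" solution
        delta = f(candidate) - fx
        # Downhill moves are always accepted; "uphill" moves are accepted
        # with probability exp(-delta/T), which is what lets the search
        # become unstuck from local minima.
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x, fx = candidate, fx + delta
            if fx < f_best:
                best, f_best = x, fx
        t *= cooling              # gradually decrease the temperature
    return best, f_best

# A function with a local minimum near x = 1.4 and a deeper one near x = -1.5
f = lambda x: x ** 4 - 4 * x ** 2 + x
random.seed(0)
x_best, f_best = simulated_annealing(f, x0=2.0)
assert f_best <= f(2.0)   # never worse than the starting point
```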

Concluding thoughts

This paper is the fifth major work that I have analyzed from Wegman and Said. From the 2006 Wegman report to Congress up to this year’s “Color Theory and Design”, so much of Wegman and Said’s recent work demonstrates extreme reliance on unattributed antecedents, as well as numerous errors and incompetent analysis. Here is the list of the main works that have been found to be highly questionable (together with links to discussion and analysis for each).

  1. 2005: Chapter 1 (Wegman, Solka and Rao), and Chapter 13 (Said) in Handbook of Statistics: Data Mining and Data Visualization, Rao et al eds., Elsevier [Discussion]
  2. 2006: Wegman, Scott and Said; Ad Hoc Committee Report on the ‘Hockey Stick’ Global Climate Reconstruction [Discussion 1, 2, 3, 4 – Analysis 1, 2, 3, 4 – Statistical analysis discussion]
  3. 2008: Said, Wegman et al, “Social Networks of Author–Coauthor Relationships”, Computational Statistics and Data Analysis [Discussion  – Analysis – Retraction part1, part 2]
  4. 2009: Said and Wegman, “Roadmap for Optimization”, WIREs Comp Stat [Discussion  (this piece) –  Analysis]
  5. 2011: Wegman and Said, “Color Theory and Design”, WIREs Comp Stat [Discussion part 1, part 2 –  Analysis]

It’s a dismal chronology. Yet George Mason University is stuck in a never-ending research misconduct inquiry that apparently still has not reached a recommendation to proceed to a full-blown investigation, even after 18 long months’ consideration of an overwhelming mountain of evidence. Meanwhile, Wiley is incapable of facing up to the serious problems at WIREs Comp Stat and seems prepared to let the reputation of the journal spiral downward.

Perhaps it’s time for David Scott to step up and do the right thing.


38 responses to “Said and Wegman 2009: Suboptimal Scholarship”

  1. One must ask: with 2 of the 3 co-editors as authors, that only leaves David Scott to have managed the review. Did he? Or did Said and Wegman just write the article and publish it?

    However, GMU *is* in an investigation … they just haven’t ever managed to provide an inquiry report.

  2. A quick editing question:

    Under “Concluding thoughts”, list # 2, analysis 1 and 2 both link to the same post, namely wegman-bradley-tree-rings-v20.pdf.

    Should one of them link to a different post?

    • Thanks for pointing that out; it should be fixed now. The discussion (blog post) and analysis (side-by-side comparison) are supposed to cover four topics in the Wegman Report in parallel (all in the “background” section 2):
      1. Tree ring background
      2. Ice core and corals
      3. Noise models and PCA
      4. Social Network Analysis

  3. DC, you state that Wiley is incapable of facing up. Does this mean you guys (I include John Mashey there) have contacted them? If so, what did they say?

    And did you contact David Scott and ask his opinion?

    • I haven’t contacted Wiley myself, but I have heard rumours of decidedly one-sided interactions. I’m not aware of the details, though.

      As for Scott, I feel sure he is well aware of the situation, but I don’t know of any direct contacts.

  4. I heavily rely on the simulated annealing algorithm in designing particle transport systems. The Said and Wegman paragraph on this topic is a real stunner. It’s not just that it is garbled, but the little sense it makes is entirely wrong. It mixes up parameters, the function to be minimized, and temperature; all three. I was attempting to count all the errors in it, but gave up. It’s the type of garbled regurgitation I would expect from a reporter with no background in physics or math. What a travesty!

  5. This has become so embarrassing that it’s almost unbearable. Why won’t GMU do anything?

    • Heck, I’m embarrassed too – I go to this school.

      From what I know of the relevant departments that actually STUDY global warming, I suspect they are angry and frustrated but have good reason to not speak out on this – knowing GMU’s administration and how poorly it already treats its professors (this ‘term assistant/associate professor’ bulls**t is just another word for ‘adjunct’; they’re not tenure track and can be terminated at any point), I think they would suffer serious repercussions if they tried. I have said time and time again that this school is a libertarian hovel. Remember, too, that this state is home to three cult houses – Liberty, Regent, and Patrick Henry; I won’t dignify those institutions with the word ‘university’ or ‘college’.

      I concur with what John Mashey told me in a comment on another website (Deltoid, I think) that even mentioning it in the student newspaper would take an extraordinarily gonad-endowed editor. I am afraid of the possible repercussions that might be brought against a student who mentioned this, let alone a faculty member.

      GMU is not an institution known for the candor of its students, or for their intelligence, for that matter, either – I wager that there aren’t too many students here who would even BOTHER to learn about this scandal. Hell, I would have gone to Virginia Tech instead if it wasn’t uncomfortably far away.

      [DC:Slightly edited for strong language.]

    • K. said:

      From what I know of the relevant departments that actually STUDY global warming, I suspect they are angry and frustrated but have good reason to not speak out on this

      There are indications that same anger and frustration may be present in certain quarters of the GMU statistics department itself.

    • I suspect GMU’s statistics department in general is taking a hit.

  6. Neven, I would tender that one answer to your question might simply involve rearranging it… to wit:

    Why won’t GMU do anything? This has become so embarrassing that it’s almost unbearable.

  7. With the help of various other folks:
    1) Wegman & Said(2011): Wiley was informed in March..

    2) Yasmin Said’s false affiliation with Oklahoma State.
    This was verified by early April, and reported to Wiley later that month.

    3) By late April, Wiley had been informed about Said&Wegman(2009), first with a few examples, then more detailed analysis. Other issues of potential concern were raised at that time.

    Most of the articles in WIREs:CS seem OK, written by people with good publication records in the area and no criticism whatsoever is implied.

    Note that David Scott’s C.V., as of Jan 2010 lists (p.12) 5 WIREs:CS papers as peer-reviewed.
    These all looked OK, but one has to wonder about the peer-review tag.

  8. As a layman following this saga, I’m getting the distinct impression from this aspect that Wegman’s data mining techniques are proving to be worse than useless and wronger than wrong in practice, given the elementary level of the generated mistakes revealed that weren’t picked up and corrected by either him or his team.

    With Wegman being (previously) perceived as an ‘authority’ in that field and the US Naval military funding pumped into the research, it doesn’t inspire a lot of confidence in what Westworld’s security state might have on file in what they like to think are ‘records’ on you and me.

  9. Dan Vergano weighs in with “More Wikipedia copying from climate critics”.

    Once again, GMU spokesperson Walsch fails to cover himself with glory.

  10. Pingback: W’man < W’pedia, again « Statistical Modeling, Causal Inference, and Social Science

  11. Reading somewhat between the lines of the 2007-2009 COLA progress report (mostly the recommendations from the scientific advisory council (SAC) report), it sounds to me like GMU had set up an effort akin to tobacco’s Indoor Air Quality research.
    (although COLA reports it is NOAA/NSF/NASA funded)

    Perhaps someone who knows the field better than me could weigh in?

    • COLA is a good team of climate scientists. They are oriented to prediction of interannual variability (such as ENSO) and predictability of decadal-scale variability, rather than global warming. But scientists who study the dynamics of global warming recognize their relevance. Also, the leader, Jagadish Shukla, is known to be straight about global warming. I have just found a web page at GMU written in 2007. It does not necessarily mean that Shukla still thinks exactly like this, but I cannot imagine him turning skeptic on global warming. COLA belongs to IGES, which seems to be a private non-profit corporation managed by the Shukla family and supported by research funding from the government. It seems that GMU outsourced climate research to IGES. It may be a problem that GMU does not seem to have a good group of physical climate scientists inside (though it does have climate impact scientists).

    • Thank you. (Related, COLA’s Barry Klinger gave what looks to be a fine presentation (pdf) on the science, for the global warming teach-in in 2008.)

      But (AFAIK) the IAQ researchers were doing good research too; it just happened to be a research area where activity benefited tobacco interests. This GMU-affiliated group seems to be largely focused on researching questions at the interface (thus blurring the distinction) between noise (weather & short-term variability) and signal (climate).

      And I am haunted (short term) by a COLA director’s recent, long CSPAN “climatology and climate dynamics” interview ( ), which studiously avoids addressing our big picture (AFAIK; perhaps I missed something?) except for saying, once (~14:55), “climate change is happening, it is very likely to be associated with human activity”, with a strong spoken emphasis on the word “likely”. (Try saying it that way, see what it conveys.)

  12. 1) Following may answer a few questions for those who watch trainwrecks.

    2) Wiley WIREs For authors says:
    “Your review will be published alongside other world-class contributions from leading researchers in the field.

    All WIREs article topics and authors are selected by an internationally renowned Editorial Board, and all content is rigorously peer reviewed by experts.
    Your article will have the highest possible visibility and usage.

    All WIREs titles will initially be made available for free to online users.
    Your review will attract full scientific and professional credit.

    The WIREs are serial publications that qualify for full Abstracting and Indexing and an Impact Factor/ISI Ranking.”

    “The WIREs adhere strongly to the guidelines of the Committee on Publication Ethics (COPE). All instances of publishing misconduct, including, but not limited to, plagiarism, data fabrication, image/data manipulation to falsify/enhance results, etc. will result in rejection/retraction of the manuscript in question.”

    SO: WIRES IS RIGOROUSLY PEER REVIEWED. The article is written by 2 of the 3 Editors-in-Chief. Was there *any* peer review, and who managed it?

    Wegman’s Feb 2010 CV (p.23) listed this as #197, in PUBLICATIONS – BOOKS, SPECIAL ISSUES OF JOURNALS, AND SOFTWARE, not under INVITED PAPERS. (There was no REFEREED PAPERS category.)

    3) Even without the plagiarism, the paper was obvious junk, to anyone with the slightest background.
    Many articles seem fine, by obvious experts, but given these examples, would everybody trust the review process in general?

    The best analog might be from Billy Madison(1995):
    “Mr. Madison, what you’ve just said … is one of the most insanely idiotic things I have ever heard. At no point in your rambling, incoherent response were you even close to anything that could be considered a rational thought. Everyone in this room is now dumber for having listened to it.”

    a) My modest formal O.R. background was long ago, and the article was obviously a weird mishmash on first read, even to me.

    b) I showed it to someone whose PhD was in optimization decades ago, whose first reaction was “some poor rehash of Hillier&Lieberman?”

    c) I sent it to a real O.R. expert at a leading university, who found so many problems in the first few pages that he stopped there.


    Wiley was informed of the Wegman&Said(2011) problem in MARCH, 6.5 months ago. DC had already documented that in detail.

    Wiley was informed of the Said false title and affiliation (Professor, Oklahoma State University) in APRIL, 5.5 MONTHS AGO.
    That finally got fixed in September: Professor OSU to Professor GMU to Assistant Professor GMU. (She’s a Research Assistant Professor, so this is correct, but in some places, it makes a difference.)

    Wiley was briefly informed in April about Said&Wegman(2009):

    “2) PROBLEM: FURTHER PLAGIARISM: WIRES:CS Vol 1, Issue 1, Said and Wegman ,“Roadmap for optimization” (SW2009)

    Part of this article seemed to have come from Wikipedia, but more has been found since:

    I think a thorough comparison document will be prepared by an associate in next week or two, but a few hours’ efforts sufficed to find Wikipedia pages, circa mid-2009, all of which have text with striking similarities, although SW2009 occasionally has extra errors.

    For example, here is a cut-and-paste with minimal trivial edits, a plagiarism style seen often involving Said:

    Said and Wegman, p.9, Simulated annealing (zero citations):
    “Simulated annealing is a probabilistic metaheuristic global optimization algorithm for locating a good approximation to the global minimum of a given function in a large search space. For many problems, simulated annealing may be more effective than exhaustive enumeration provided that the goal is to find an acceptably good solution in a fixed amount of time, rather than the best possible solution.” (July 2009)

    “Simulated annealing (SA) is a generic probabilistic metaheuristic for the global optimization problem of applied mathematics, namely locating a good approximation to the global minimum of a given function in a large search space. … For certain problems, simulated annealing may be more effective than exhaustive enumeration — provided that the goal is merely to find an acceptably good solution in a fixed amount of time, rather than the best possible solution.”

    One might ask if anyone actually reviewed this paper, as it has problems beyond plagiarism. The approach seems to take uncited Wikipedia pages, copy a few of the references found in Wikipedia, but often detached as “further reading” or equivalent.”

    Shortly thereafter, another person sent them a more detailed analysis, and then DC has followed up with the extremely thorough discussion shown here.

    But, Wiley knew most of this by late APRIL.

  13. Pingback: Yet more Wegman plagiarism : Deltoid

  14. Wegman as a subject seems to belong in the same slough as Alger Hiss studies. Yes, it’s fertile for a while, but there’s a danger in expending valuable intelligence upon a subject that doesn’t really demand it. Wegman appears to be done.

    His university’s willingness to stonewall, though, is another genuine scandal.

  15. Would someone add a notice of this to the Wikipedia article on Wegman?
    I tried, but apparently don’t know how to do it properly.

  16. Jeffrey: see this
    1) GMU, for almost all, no clear action.
    2) Wiley, for (i and r)
    3) Springer (s)
    4) NSWC for (l, Solka) and (n, Rigsby) and maybe the Ritson-related code issue.
    5) US Army: 2011 says sponsoring organization = IFNA (Wegman), i.e., presumably they get money to organize this.

    6) Various: Interface 2011.
    “Grant Support SAS Institute
    ASA Section on Statistical Computing
    ASA Section on Statistical Graphics

    7) JSM 2011.
    Still giving talks.

  17. Pingback: John Quiggin » Wegman plagiarism case: GMU jury out to permanent lunch

  18. Anna: Yasmin gave the talk, but after all, she’s a coauthor with Wegman on many of the problems.

    re: image: a newer version is coming, to point at DC’s post on this, among other things.

  19. An updated plagiarism-chain chart is here.

  20. BTW, I’d mentioned some history over at Andrew Gelman’s, and earlier Said’s bio in the WIREs author guide, i.e., from this.

    You can read the original there, or an annotated version here, which Wiley got in April.

    “Dr. Yasmin H. Said is a Visiting Fellow(1) at the Isaac Newton Institute for Mathematical Sciences at the University of Cambridge in England
    and is a National Research Fellow(2) from the National Institutes of Health. She earned her A.B. in pure mathematics, her M.S. in computer science and information systems, and Ph.D. in computational statistics. She does alcohol modeling, agent-based simulation modeling, social network analysis, text, image, and data mining, and major public policy work trying to minimize negative acute outcomes, including HIV/AIDS, related to alcohol consumption. Dr. Said is also the Statistical Methodology Director of the Innovative Medical Institute, LLC(3), and Co-Director(4) of the Center for Computational Data Sciences in the College of Science at George Mason University. She is the editor of Computing Science and Statistics(5), is an associate editor of the journal, Computational Statistics and Data Analysis(6), serves on the board of the Washington Statistical Society(7), and serves on the American Statistical Association Presidential Task Force on Science Policy. Dr. Said is an elected member of the International Statistical Institute, an elected member of the Research Society on Alcoholism, and an elected member of Sigma Xi, the Scientific Research Society. She is currently writing a book, Controversies in Global Warming(9) and another, Statisticians of the Twentieth Century. She has published a book, Intervention to Prevention: A Policy Tool for Alcohol Studies. With colleagues she has developed testimonies on global warming for the House Committee on Energy and Commerce and to the House Subcommittee on Oversight and Investigations.(10) She has also taught probability and statistics at The Johns Hopkins University in Baltimore, MD.(11)”

    (1) She and Dr Wegman were at Cambridge during part of 2008.
    (2) Kirschstein Fellowship described above. As best as I can tell, that ended Fall 2008.
    (3) Innovative Medical Institute: little or no trace of this, whatever it is/was.
    (4) lists no such center. Oddly, Said isn’t listed in .
    (5) That sounds like a journal, but is actually the proceedings of the Interface Symposia, organized by Interface, which has long been run by Wegman; for some of its history see p.79.
    (6) She is no longer an Associate Editor at CS&DA, and was not in December 2010.
    (7) She was a non-voting member, on the Social Arrangements Committee.
    I don’t think this still exists, and the term “Presidential” is curious.
    (9) This is claimed by Amazon to be a 288-page book published by Wiley in 2007, but only by Wegman.
    Some of us have tried to order this book. Oddly, at the same time different booksellers claimed it was not yet printed or available within 2-3 days. Perhaps someone is still working on it. I would be curious if someone at Wiley can confirm the existence or non-existence of this book, noting that if it actually exists, and if it includes much of the Wegman Report, there will likely be copyright actions.
    (10) This was the Wegman Report, of which 35/91 pages have obvious plagiarism, among many other problems, including a key statistical claim based on a 1% cherry-pick of desired results.

    [strange-scholarship-v1-02.pdf]

    (11) Yes, this was true, for 2005-2006 school year. Then she returned to GMU.

  21. For those with computer networking expertise, take a look at Rezazad(2011), currently free.
    Don’t worry particularly about plagiarism, I’m more interested in:

    a) Technical opinions on the techniques.

    b) How this works as a *review* article.

    • How is this any different than the travelling salesman problem which has been around for years, and for which genetic (real DNA) algorithms have been tried?
      Seems more of a plug for “netEnhancer” whatever that is….

  22. Harvey:
    1) The extent to which NetEnhancer is actually useful in the real world.

    It is akin to the TSP in using graph models, but TSP tries for a minimal-distance/cost tour that visits all nodes.

    NetEnhancer wants to change the graph, and it’s not a tour. It’s slightly more related to use of Minimal Spanning Trees in network design, a topic of interest at Bell Labs (where both Prim and Kruskal MST algorithms arose). Of course, a MST gets disconnected upon any cut, as it by definition has no redundant paths.
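    [DC: for concreteness, here is a minimal sketch of Prim’s MST algorithm, one of the two Bell Labs algorithms mentioned above; the weighted graph is hypothetical. Note the point about redundancy: the tree has exactly n − 1 edges, so cutting any one of them disconnects it.]

```python
# Prim's algorithm: grow a minimum spanning tree from a start node, always
# adding the cheapest edge that reaches a new node.

import heapq

def prim_mst(graph, start):
    """graph: {u: [(weight, v), ...]} (undirected). Returns MST edge list."""
    visited = {start}
    heap = [(w, start, v) for w, v in graph[start]]
    heapq.heapify(heap)
    mst = []
    while heap and len(visited) < len(graph):
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue
        visited.add(v)
        mst.append((u, v, w))
        for w2, v2 in graph[v]:
            if v2 not in visited:
                heapq.heappush(heap, (w2, v, v2))
    return mst

# A hypothetical 4-node network with edge weights
graph = {
    "A": [(1, "B"), (4, "C")],
    "B": [(1, "A"), (2, "C"), (6, "D")],
    "C": [(4, "A"), (2, "B"), (3, "D")],
    "D": [(6, "B"), (3, "C")],
}
mst = prim_mst(graph, "A")
print(len(mst))  # 3 edges for 4 nodes: cutting any one disconnects the tree
```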

  23. Sigma Xi? Seriously, no one puts that on their resume except a graduate student with little else to include. Anyone who has done the least iota of scientific research can be a member. Surprised she didn’t list that she was an “elected” member of AAAS, elected by the person who entered her information into the database and cashed her check.

  24. Pingback: Wiley coverup: The great Wegman and Said “redo” to hide plagiarism and errors | Deep Climate