Wednesday, November 2, 2011

Scientific fraud: do you want your data cooked or fabricated?

Word is out about a significant scientific (well, social-scientific) fraud by Dutch social psychologist Diederik Stapel. Scientific American reports that a preliminary investigation released on Oct. 31 found that Stapel fabricated data used in at least thirty publications.
Stapel's eye-catching studies on aspects of social behaviour such as power and stereotyping garnered wide press coverage. For example, in a recent Science paper (which the investigation has not identified as fraudulent), Stapel reported that untidy environments encouraged discrimination (Science 332, 251-253; 2011).
"Somebody used the word 'wunderkind'," says Miles Hewstone, a social psychologist at the University of Oxford, UK. "He was one of the bright thrusting young stars of Dutch social psychology -- highly published, highly cited, prize-winning, worked with lots of people, and very well thought of in the field."
In early September, however, Stapel was suspended from his position as dean of the Tilburg School of Social and Behavioral Sciences over suspicions of research fraud. In late August, three young researchers under Stapel's supervision had found irregularities in published data and notified the head of the social-psychology department, Marcel Zeelenberg. The Tilburg investigating committee, chaired by psycholinguist Willem Levelt, joined up with sister committees at the universities of Groningen and Amsterdam, where Stapel had also worked, to produce the report. They are now combing through his publications and their supporting data, and interviewing collaborators, to map out the full extent of the misconduct.


Science Insider provides additional details:

Stapel's work encompassed a broad range of attention-catching topics, including the influence of power on moral thinking and the reaction of psychologists to a plagiarism scandal. The committee, which interviewed dozens of Stapel's former students, postdoctoral researchers, co-authors, and colleagues, found that Stapel alone was responsible for the fraud. The panel reported that he would discuss experimental designs in detail, including drafting questionnaires, and would then claim to conduct the experiments at high schools and universities with which he had special arrangements. The experiments, however, never took place, the universities concluded. Stapel made up the data sets, which he then gave to a student or collaborator for analysis, investigators allege. In other instances, the report says, he told colleagues that he had an old data set lying around that he hadn't yet had a chance to analyze. When Stapel did conduct actual experiments, the committee found evidence that he manipulated the results.
Many of Stapel's students graduated without having ever run an experiment, the report says. Stapel told them that their time was better spent analyzing data and writing. The commission writes that Stapel was "lord of the data" in his collaborations. It says colleagues or students who asked to see raw data were given excuses or even threatened and insulted. 
At least two earlier groups of whistleblowers had raised questions about Stapel's work, the commission found. No one followed up on their concerns, however. Stapel's fabrications weren't particularly sophisticated, the committee says, and on careful inspection many of the data sets have improbable effect sizes and other statistical irregularities. His colleagues, when they failed to replicate the results, tended to blame themselves, the report says. Among Stapel's colleagues, the description of data as too good to be true "was a heartfelt compliment to his skill and creativity," the report says. 
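To make the "too good to be true" point a little more concrete, here is a minimal sketch of the intuition (my own illustration with invented numbers, not the committee's method): honest replications should scatter by at least as much as sampling error predicts, so a set of reported effect sizes that cluster far more tightly than that is a statistical red flag.

import numpy as np

rng = np.random.default_rng(0)

def honest_effect_spreads(true_d=0.5, n_per_group=40, k_studies=10, reps=5000):
    """Simulate k two-group studies with the given true effect, and return
    the spread (max minus min) of their observed Cohen's d values, repeated
    `reps` times to build up a reference distribution."""
    spreads = np.empty(reps)
    for r in range(reps):
        ds = np.empty(k_studies)
        for i in range(k_studies):
            treatment = rng.normal(true_d, 1.0, n_per_group)
            control = rng.normal(0.0, 1.0, n_per_group)
            pooled_sd = np.sqrt((treatment.var(ddof=1) + control.var(ddof=1)) / 2)
            ds[i] = (treatment.mean() - control.mean()) / pooled_sd
        spreads[r] = ds.max() - ds.min()
    return spreads

spreads = honest_effect_spreads()

# Ten invented "reported" effect sizes, all implausibly close to d = 0.50.
reported = np.array([0.49, 0.51, 0.50, 0.48, 0.52, 0.50, 0.49, 0.51, 0.50, 0.50])
reported_spread = reported.max() - reported.min()

print(f"Median spread across 10 honest replications: {np.median(spreads):.2f}")
print(f"Spread of the invented 'reported' effects:   {reported_spread:.2f}")
print(f"Fraction of honest simulations that tight:   {np.mean(spreads <= reported_spread):.4f}")

With 40 subjects per group, honest replications of an effect around d = 0.5 typically spread over most of a full effect-size unit, so ten reported effects all within 0.04 of one another would essentially never happen by chance; over-consistency of that kind is roughly the sort of "statistical irregularity" the report alludes to.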

Scientific fraud is a sadly familiar story, infecting the physical and biological sciences as well. A notorious case involved cancer researcher William Summerlin in 1974, as recounted by Gerald Weissmann, editor of the journal of the Federation of American Societies for Experimental Biology, who in 2006 reviewed the recent history of scientific fraud and came to some conclusions:



1) Some scientists will cheat and we’ll probably never know how often this happens. Young or old, MD or PhD, average or distinguished, male or female, black or white or khaki, some scientists will try to pull wool over the eyes of their colleagues, their reviewers, and their editors.
2) The culture of science is based on trust, not suspicion. Great scientists have had fraud committed under their noses, and good editors have published fabrication. Reviewers and editors must have a keen nose for swindle, but cannot engage in criminal investigation. As in political life, where I tend to side with liberty over security, in science, I’d go for trust over suspicion every time.
3) Therefore, we ought not to rely on machines or algorithms of image or text analysis, but rely on the judgment of our editors, our editorial boards, and our reviewers as to whether a manuscript looks as if the data had been cooked.
4) No measure of "quality assurance", no affidavit of authorship, no oath of responsibility or percent effort can stop a soul hell-bent on self-destruction. And fraud in science, if not in politics, is always self-destructive. Since the name of the game is confirmation, science is self-cleansing: flawed work is soon forgotten and remains uncited.
Science, maybe even social science, is a reflexively critical, self-correcting enterprise. Still, when the incentives for researchers to produce results are too strong, fraud will increase. For a literary take on the ambiguities of research and fraud, I recommend Allegra Goodman's excellent novel, Intuition.
