I am fascinated by the psychology of scientific fraudsters. What drives these people? If you are smart enough to fake results, surely you have the ability to do research properly? You should also be clever enough to realise that one day you will get caught. And you should know that fabricating results is a worthless exercise that runs completely counter to the spirit of enquiry. Why would anyone pervert their science with fakery?
The reasons why some scientists succumb to corruption have no doubt also intrigued psychologists but of late you could be forgiven for suspecting them of being more preoccupied with committing fraud than analysing it. Psychology is not the only field of enquiry tarnished by incidents of dishonesty — let’s not forget physicist Jan Hendrik Schön, stem cell researcher Hwang Woo-suk or crystallographer HM Krishna Murthy — but its practitioners may be better placed than most to analyse the origins of the problem.
Indeed one of the most prominent recent transgressors has provided some useful insights. In 2011 Diederik Stapel, a professor of social psychology, was suspended from his job at Tilburg University because of suspected fraud; a subsequent investigation found that over a number of years he had fabricated data affecting more than 55 of his publications. Interviewed in the New York Times by Yudhijit Bhattacharjee, the disgraced psychologist was candid about where he had gone wrong:
Stapel did not deny that his deceit was driven by ambition. But it was more complicated than that, he told me. He insisted that he loved social psychology but had been frustrated by the messiness of experimental data, which rarely led to clear conclusions. His lifelong obsession with elegance and order, he said, led him to concoct sexy results that journals found attractive. “It was a quest for aesthetics, for beauty — instead of the truth,” he said. He described his behavior as an addiction that drove him to carry out acts of increasingly daring fraud, like a junkie seeking a bigger and better high.
There’s a fair bit to unpack in those few lines. In part the problem is systemic. Stapel’s allusion to journals’ demands for ‘sexy results’ is a nod to one of the corrosive effects on researchers of the construction of journal hierarchies on the shifting and unreliable sands of impact factors. Stapel elaborates later on in the interview:
What the public didn’t realize, he said, was that academic science, too, was becoming a business. “There are scarce resources, you need grants, you need money, there is competition,” he said. “Normal people go to the edge to get that money. Science is of course about discovery, about digging to discover the truth. But it is also communication, persuasion, marketing. I am a salesman.”
Competition for finite resources is no bad thing, helping to ensure that grants and promotions are awarded to the applicants doing the highest quality science, but the process has been undermined by over-reliance on journal impact factors as a measure of achievement. A paper in a ‘top’ journal is now often seen as a more important goal than the publication of the very best science because busy reviewers rely too readily on the names of the journals in which applicants’ papers are published rather than on the work that they report. Although many ‘normal people go to the edge’, it is clear that Stapel went well beyond it. At some point the self-promoting salesman overtook the discoverer of truth.
Unfortunately the issue of publication pressures leading to poor scientific practice is hardly news. A decade ago Peter Lawrence — always worth reading on the conduct of science and scientists — analysed the ‘politics of publication’ and lamented that “when we give the journal priority over the science, we turn ourselves into philistines in our own world.” Lawrence’s gloomy prognosis has been borne out by Fang and Casadevall’s revelation that retraction rates are strongly correlated with impact factors. Stapel’s unmasking continues that sorry trend, one that will not be reversed until we can break our dependency on statistically dubious methods of assessment.
Problems of dubious practice (of varying degrees of severity) are more widespread than most realise, but it is still true that most scientists live with the stress of competition without relinquishing their ethics. So what pushed Stapel over the edge? Good mentorship of junior scientists is recognised as a valuable corrective but the Dutch researcher’s training is not discussed in detail in the New York Times interview. He himself seems to think that it was the interaction of his personality traits with the highly tensioned system of publication and reward that led to impropriety. His “lifelong obsession with elegance and order” appears to have been at the root of his frustration with “the messiness of experimental data, which rarely led to clear conclusions”.
Stapel is hardly alone in his desire for elegance. Many scientists will have felt the deep satisfaction of conceiving a theory that brings a graceful simplicity to unruly data or of executing an experiment that confirms a new hypothesis. There is an almost visceral pleasure in such instances of congruence, and aggravation in equal measure when experiment and theory collide abortively. Thomas Henry Huxley identified the tragedy of science more than a century ago — “the slaying of a beautiful hypothesis by an ugly fact” — but it was for him something you simply had to live with.
However, Huxley’s aphorism belies a more complex truth because science is a messy business and it is not always clear when a fact is truly ugly enough to bring down a hypothesis. The judgement can be a fine one and observations are sometimes set aside quite properly as part of plotting an intuitive path to a new insight; but the process is clouded by the degree of conviction that the scientist has in their cherished hypothesis, so the handling of inconvenient truths can shade into malpractice.
Crick and Watson were up-front about the need to discount some of the data that they worked with en route to the structure of DNA — ‘some data was bound to be misleading if not plain wrong’, wrote Watson — but others have dissembled*. Mendel, Millikan and Eddington, for example, all discarded observations that famously conflicted with their respective conclusions on heredity, the charge on the electron and the veracity of Einstein’s general theory of relativity (but see update below with regard to Eddington). As Michael Brooks has pointed out in Free Radicals, his entertaining book on rule-breaking researchers, these renowned scientists may have been vindicated by history but their shady practices were hardly justifiable at the time. Stapel’s misdemeanours of fabricating data to support his hypotheses are more extreme — he also loses out because his theories of psychological priming have been undermined by his unmasking — but nevertheless lie on a continuum of fraudulent practice with his scientific forebears. They all share the belief that they were right.
Even so, I can’t quite get the measure of Stapel’s behaviour. Perhaps the success that flowed from his synthetic results, given the seal of approval by peer reviewers and editors when published in prestigious journals, validated an approach that he must have known was scientifically dubious. The New York Times interview conveys a sense of regret now that he has been found out — a regret sharpened by the reaction of his wife, children and parents, forced to look anew at a man they knew so well — but why did he never question himself during the years of fabrication?
In my mind I keep returning to Stapel’s dissatisfaction with the untidiness of experimental data. I think that might be because I have just published one of the messiest papers ever to come out of my lab and am rather pleased with it for precisely that reason. I offer this story as a counter-anecdote to the case of the errant psychologist, not as a holier-than-thou pose, but simply to give a sense of what it feels like to wrestle with real data.
Our paper reports the structure of a norovirus protein called VPg. Though long supposed to be ‘intrinsically disordered’, our work shows that the central portion of VPg’s chain of amino acids folds up into a compact structure consisting of two helices packed tightly against one another; the two ends of the protein remain flexible. It’s nice to confound the prevailing viewpoint on VPg but that’s not the interesting bit about our new results.
The interesting bit is that our structure doesn’t make sense. Not yet at any rate. Usually, working out the structure of a protein is an enormously helpful step towards figuring out how it works but that’s not the case with VPg. Our structure is a bit baffling.
The protein plays a key role in virus replication, the process of reprogramming infected cells to make the components — proteins and copies of the viral RNA genome — needed to assemble thousands of new virus particles. That’s what infection is all about, at least as far as the virus is concerned (though the infected host often has a different perspective).
VPg acts as a seed point for the initiation of the synthesis of new viral RNA genomes. To do this it is bound by the viral polymerase, an enzyme or nanomachine that catalyses the chemical attachment of an RNA base to a specific point — a tyrosine side chain — on the surface of the protein. In turn this RNA base becomes the point of attachment for the next one and so on until the whole RNA chain — all 7500 bases — is complete.
From our structure we can see that the tyrosine anchor point on VPg lies on the first helix of the core structure but the problem is that the core is too big to fit into the cavity within the polymerase where the chemistry of RNA attachment occurs. So at first sight, VPg appears to have a structure that interferes with one of its most important functions. To solve this apparent contradiction, we came up with what I thought was a rather lovely hypothesis: we guessed that the VPg structure has to unfold to interact properly with the polymerase, supposing there might be just enough room for a single helix to get into the active site but not a tightly associated pair.
VPg: too bloody big to fit in the polymerase active site!
We tested this idea by mutating our VPg to introduce amino acid changes that would destabilise its core structure, reasoning that this would make it easier for the polymerase to grab on to the protein, so increasing the rate at which it could add RNA bases. But although the changes duly disrupted the protein structure, they almost invariably also reduced the efficiency of the polymerase reaction. The experiment succeeded only in generating an ugly fact to disfigure our hypothesis.
Except it’s not dead yet — not to me. I can make excuses. The method we used to measure the rate of addition of RNA to VPg by the polymerase was less than optimal. We couldn’t work with purified components in a test tube, and so had to monitor the reaction inside living cells using an indirect readout for elongation of the RNA chain. It remains possible that this assay is confounded by the effects of other molecules in the cell. Plus, we haven’t yet been able to analyse the structure of the viral polymerase with VPg bound to it — caught in the act of adding RNA bases. Like Thomas, until I can really see evidence that conflicts with my supposition, I’m not ready to give up on the hypothesis that VPg has to unfold to interact properly with the polymerase.
But it could take quite a while to develop the reagents and the techniques to do these more probing experiments and since we had already spent quite a number of years getting to this point, we wanted to publish the results. The story we had to tell in the paper is unfinished. To some eyes it might look like a bit of a mess and I was certainly concerned that the reviewers of the Journal of Virology, where we eventually submitted the manuscript for publication, might insist that we go back to the lab to get the data to fill in the gaps. We had an interesting new structure to report but our experimental analyses had only managed to confirm that we don’t yet know what the structure is for. We were asked some searching questions and the manuscript was improved by the subsequent editing but happily the reviewers — and the editor — still understood that progress in science is more often made in small steps than in giant leaps.
We haven’t tied off the whole story of how VPg functions in norovirus RNA replication but that’s OK. Now that we have given an honest account of our puzzling structure, others can also apply their minds to the problem. Indeed the publication has already sparked a couple of interesting email exchanges. The situation might still be messy but it’s far from messed up.
Update, May 12: As pointed out by Cormac in two comments below and by Peter Coles on Twitter (see my reply below), there appear to be strong arguments for not including Eddington in this list of dissemblers. It is ironic perhaps that a blog post on messiness in science should itself become rather messy but I prefer to think it merely shows the value of open discussion.
*Of course, Crick and Watson famously benefitted from not entirely proper access to Franklin’s and Gosling’s X-ray diffraction images of DNA.