Publishing bias

Publication bias. Grant bias.

All academics who write grants will tell you this: if you want to be successful when applying for a thematic research grant call, you must tick all of the boxes.

Now, imagine that you are a physicist, an expert in quantum mechanics. A major funding opportunity arises, exactly matching your interests and track record. That is great news. Obviously you will apply. One difficulty, however, is that, amongst other things, the call specifies that your project should lead to the “development of highly sensitive approaches enabling the simultaneous determination of the exact position and momentum of a particle”.
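For reference, the reason that requirement is impossible is Heisenberg's uncertainty relation, which puts a hard lower bound on how precisely position and momentum can be determined at the same time:

```latex
% Heisenberg's uncertainty relation: the product of the standard
% deviations of position (x) and momentum (p) cannot fall below hbar/2.
\[
  \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
\]
% No measurement technique, however "highly sensitive", can make both
% uncertainties arbitrarily small simultaneously.
```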

At that point, you have three options. The first is to write a super sexy proposal that somehow ignores the Heisenberg principle. The second is to write a proposal that addresses the other priorities but fudges around that particular specification, maybe even alluding to the Heisenberg principle. The third is to walk away.

The first option is dishonest. The second is more honest but, in effect, not so different from the third: your project is unlikely to get funded if you do not stick to the requirements of the call, as noted above. The third option demonstrates integrity but won’t help your career, nor, more importantly, help you do any research at all.

And so, there you have it. Thematic grant calls that ask for impossible achievements, nourished by publication bias and hype, further contribute to the distortion of science.

OK, I’ll confess: I have had a major grant rejected. It was a beautiful EU project (whether Brexit is partly to blame, I do not know). It was not about quantum mechanics but about cell tracking. The call asked for the simultaneous “detection of single cells and cell morphologies” and “non-invasive whole body monitoring (magnetic, optical) in large animals”, which is just about as impossible as breaking the Heisenberg principle, albeit for less fundamental reasons. We went for option 2. We had a super strong team.

How many people are using the #SmartFlares? Freedom of Information requests provide insights

A quick summary of previous episodes for those who have not been following the saga: a few years ago, Chad Mirkin’s group developed a technology to detect mRNAs in live cells, the nano-flares. That technology is currently commercialised by Merck under the name SmartFlares. For a number of reasons (detailed here), I was unconvinced by the publications. We bought the SmartFlares, studied their uptake in cells as well as their fluorescent signal, and concluded that they do not (and in fact cannot) report on mRNA levels. We published our results as well as all of the raw data.

This question – how many people are using the SmartFlares? – is interesting because surely, if a multinational company such as Merck develops, advertises and sells products to scientists worldwide, those products have to work. As Chad Mirkin himself said today at the ACS National Meeting in Philadelphia, “Ultimate measure of impact is how many people are using your technologies”.

So, we must be wrong. SmartFlares must work.

But our data say otherwise, so what is going on?

One hint is the very low number of publications using the SmartFlares, and the fact that some of those are not independent investigations. This, however, does not tell us how many groups in the world are using the SmartFlares.

Here is a hypothesis: maybe lots of groups worldwide are spending public money on probes that don’t work… and then not reporting the results, since the probes don’t work. That hypothesis is not as far-fetched as it may seem: it is called negative bias in science publishing, and it is one of the causes of the reproducibility crisis.

To test this hypothesis, we would need to know how many research groups worldwide have bought the SmartFlares, information that I suspected Merck was not going to volunteer. So, instead, I made Freedom of Information requests to (nearly) all UK research-intensive universities (the Russell Group), asking whether they had evidence of SmartFlare purchase orders.

Some universities (6) declined because it would have been too much work to retrieve the information, but most (14) obliged. The detailed results are available here. They show that a minimum of 76 different purchases were made between the launch of the product and June 2016. The money spent is about £38k, representing 0.0013% of these UK universities’ research income. As far as I can see, none has resulted in a publication so far.
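As a quick sanity check on those figures (a minimal sketch: the £38k and 0.0013% values are rounded in the post, so the derived numbers below are approximate):

```python
# Back-of-the-envelope check of the FOI figures quoted above.
# Assumption: "£38k" is taken as exactly £38,000 and "0.0013%" as exact;
# both are rounded, so the implied totals are approximate.
spent = 38_000            # £ spent on SmartFlares by the 14 responding universities
fraction = 0.0013 / 100   # 0.0013% expressed as a fraction
purchases = 76            # minimum number of distinct purchases reported

# Combined research income implied by the two quoted figures
implied_income = spent / fraction
print(f"Implied combined research income: £{implied_income / 1e9:.1f}bn")  # £2.9bn

# Average spend per purchase order
print(f"Average spend per purchase: £{spent / purchases:.0f}")  # £500
```

The implied combined research income (roughly £3bn over the period covered) is in a plausible range for a group of large research-intensive universities, which suggests the quoted percentage is internally consistent with the £38k total.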

All I can say is that these data do not falsify our hypothesis.

And if after reading this, you are still unconvinced of the need to publish negative data, check the upturnedmicroscope cartoon (warning: scene of violence).