The Crisis Of Reproducibility Of Scientific Experiments - Alternative View


By chance, in a stream of news, I came across an article in Nature presenting data from a survey of 1,500 scientists on the reproducibility of research results. This problem had previously been raised mainly for biological and medical research, where it is, on the one hand, explainable (spurious correlations, the sheer complexity of the systems under study, sometimes even the scientific software gets the blame) and, on the other hand, partly phenomenological in character (for example, mice tend to behave differently with scientists of different genders (1 and 2)).

However, not everything is smooth in the "harder" natural sciences either, such as physics, engineering, chemistry, and ecology. These are the very disciplines supposedly built on "absolutely" reproducible experiments conducted under tightly controlled conditions. Alas, the survey result is staggering, in every sense of the word: up to 70% of researchers have encountered non-reproducible experiments and results, obtained not only by other groups of scientists but also by the authors and co-authors of published works themselves!

Does every sandpiper praise its own swamp?

Although 52% of respondents point to a reproducibility crisis in science, fewer than 31% consider the published data fundamentally incorrect, and the majority indicated that they still trust published work.

Question: Is there a crisis of reproducibility of results?

Of course, one should not lash out rashly and lynch all of science on the basis of this survey alone: half of the respondents were scientists associated, in one way or another, with the biological disciplines. As the authors note, in physics and chemistry the level of reproducibility and of confidence in published results is much higher (see the chart below), though still not 100%. Medicine, by contrast, fares very badly.


An anecdote comes to mind:

Marcus Munafo, a biological psychologist at the University of Bristol, England, has a longstanding interest in the reproducibility of scientific data. Recalling his student days, he says:

Question: How many published works in your field are reproducible?

The breadth and depth of the problem

Imagine that you are a scientist. You come across an interesting article, but its results and experiments cannot be reproduced in your laboratory. The logical step is to write to the authors of the original article, ask for advice, and pose clarifying questions. According to the survey, fewer than 20% of respondents have ever done this at any point in their scientific careers!

The authors of the study suggest that such contacts and conversations may simply be too uncomfortable for the scientists themselves, because they either expose their incompetence on certain issues or reveal too many details of an ongoing project.

Moreover, only a vanishing minority of scientists have attempted to publish a refutation of irreproducible results, and those who did faced opposition from editors and reviewers who demanded that comparisons with the original study be played down. Is it any wonder, then, that the chance of a failed replication actually being reported is only about 50%?

First question: Have you tried to reproduce the results of an experiment?

Second question: Have you tried to publish your attempt to reproduce the results?

Perhaps, then, reproducibility could at least be tested inside the laboratory? The saddest part is that a third of respondents have NEVER even thought about establishing procedures for checking their data for reproducibility. Only 40% indicated that they use such techniques regularly.

Q: Have you ever developed special techniques / technical processes to improve the reproducibility of results?

In another example, a biochemist from the United Kingdom, who asked not to be named, says that attempts to repeat and reproduce the work in her laboratory's project would simply double the time and material costs without adding anything new. Additional checks are carried out only for innovative projects and unusual results.

And, of course, the eternal Russian questions have now begun to torment our foreign colleagues as well: who is to blame, and what is to be done?

Who is to blame?

The authors of the work identified three main causes of irreproducible results:

  • Pressure from superiors to get the work published on time
  • Selective reporting (apparently meaning the suppression of data that "spoil" the overall picture)
  • Insufficient (including statistical) data analysis

Question: What factors are responsible for irreproducible scientific results?

Answers (from top to bottom):

  • Selective reporting
  • Pressure from superiors
  • Poor analysis / statistics
  • Insufficient in-lab replication of the experiment
  • Insufficient supervision
  • Lack of methodology or code
  • Poor experimental design
  • Raw data unavailable from the original laboratory
  • Fraud
  • Insufficient checking by experts / reviewers
  • Problems with replication attempts themselves
  • Technical expertise required to reproduce
  • Variability of standard reagents
  • "Bad luck"

What to do?

Of the 1,500 surveyed, more than 1,000 specialists favored better statistics in data collection and processing, better supervision by superiors, and more rigorous planning of experiments.

Q: What factors will help improve reproducibility?

Answers (from top to bottom):

  • Better understanding of statistics
  • Stricter supervision
  • Better experimental design
  • More training
  • Internal laboratory review
  • Improved practical skills
  • Incentives for formal data cross-checking
  • Inter-laboratory review
  • More time for project management
  • Higher standards at scientific journals
  • More time for working with laboratory records

Conclusion and some personal experience

First, even for me as a scientist the results are staggering, although I have already grown accustomed to a certain degree of irreproducibility. It is especially noticeable in work performed by Chinese and Indian groups without a third-party "audit" in the form of American or European professors. It is good that the problem has been recognized and that solutions are being considered. About Russian science, in light of the recent scandal, I will tactfully keep silent, although many do their jobs honestly.

Second, the article ignores (or rather, does not consider) the role of scientometrics and peer-reviewed journals in the emergence and growth of the irreproducibility problem. In the pursuit of publication speed and frequency (read: higher citation indices), quality drops sharply, and no time is left for additional verification of results.

As they say, all characters are fictional, but based on real events. A certain student once had occasion to review an article: not every professor has the time and energy to read submissions thoughtfully, so the opinions of two to four students and doctoral researchers are collected, and the review is assembled from them. A review was written pointing out that the results could not be reproduced by the method described in the article, and this was clearly demonstrated to the professor. But in order not to spoil relations with "colleagues" who succeed at everything, the review was "corrected". And two or three such articles have been published.

The result is a vicious circle. A scientist submits an article to a journal editor, listing "desired" and, more importantly, "undesired" reviewers, effectively leaving only those favorably disposed toward the group of authors. They review the work, and since they cannot openly trash it, they choose the lesser of two evils: here is a list of questions to answer, and then we will publish the article.

Another example, which an editor of Nature spoke about just a month ago, concerns Grätzel (dye-sensitized) solar cells. Because of the enormous interest in this topic in the scientific community (after all, everyone still wants an article in Nature!), the editors had to create a special questionnaire requiring authors to report many parameters and to provide equipment calibrations, certificates, and so on, in order to confirm that their method of measuring panel efficiency conforms to common principles and standards.

And third, the next time you hear about a miracle vaccine that conquers all, a new "Steve Jobs in a skirt", new batteries, or the dangers or benefits of GMOs or smartphone radiation, especially if the story was pushed by tabloid journalists, be patient and do not jump to conclusions. Wait for the results to be confirmed by other groups of scientists and for the data samples to accumulate.