New metric takes a crack at solving science’s credibility problem
A newly proposed, citation-based metric assesses the veracity of scientific claims by evaluating the outcomes of subsequent replication attempts. The metric, dubbed the R-factor, was introduced in an August bioRxiv preprint by researchers at the startup Verum.
R-factors are calculated from what Verum director of research Sean Rife and his colleagues call golden citations, which reference manuscripts that directly replicate a particular study. (The vast majority of citations—about 95%—just mention other papers.) A paper’s R-factor is the number of confirmatory golden citations divided by the sum of confirmatory and refuting golden citations. The more reproducible a study is, the closer its R-factor is to 1.
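The arithmetic described above is simple enough to sketch in a few lines of Python. This is an illustrative sketch, not code from the preprint; the function name and the example counts are invented here:

```python
def r_factor(confirming: int, refuting: int) -> float:
    """Compute an R-factor: confirming golden citations divided by
    the total of confirming plus refuting golden citations."""
    total = confirming + refuting
    if total == 0:
        raise ValueError("no golden citations; R-factor is undefined")
    return confirming / total

# Hypothetical study with 8 confirming and 2 refuting replications:
print(r_factor(8, 2))  # 0.8
```

Note that ordinary "mentioning" citations, the roughly 95% majority, never enter the calculation; only golden citations count.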
In their paper, Rife and colleagues calculated R-factors for three cancer papers that were recently evaluated in the Reproducibility Project: Cancer Biology.
Additionally, the authors scanned more than 12 000 excerpts that cite other papers, classifying each as confirming, refuting, or merely mentioning the cited work, or as unclear. Ultimately the aim is to build a database of around 200 000 sorted texts, including physics and math preprints from the arXiv, that would be used to train an algorithm to perform the classification automatically.
Although it takes on a critical flaw in modern science, the new metric has drawn plenty of criticism. Pseudonymous science blogger Neuroskeptic, who was one of the first to report on R-factors, is among those critics.
Another caveat is the tool’s simplicity, says Adam Russell, an anthropologist and program manager at the Defense Advanced Research Projects Agency who has called for solutions to science’s reproducibility problem.
Marcel van Assen, a statistician at Tilburg University in the Netherlands, says the R-factor approach is similar to a procedure in meta-analyses called vote counting, which has “long been discarded because it is suboptimal and misleading.” He concludes that the R-factor “is more like two steps backward rather than one forward.”
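Van Assen’s objection can be illustrated with a toy example (all numbers here are hypothetical, not drawn from any real studies): vote counting gives every study one vote regardless of its size or precision, so a single large, well-powered confirmation can be outvoted by a couple of tiny, noisy null results:

```python
# Hypothetical evidence base: one large study finds a clear effect;
# two tiny studies scatter around zero.
studies = [
    {"effect": 0.50, "n": 1000},  # large, precise study
    {"effect": -0.05, "n": 20},   # small study, noise around zero
    {"effect": -0.02, "n": 25},   # small study, noise around zero
]

# Vote counting: each study is one vote, sign of the effect decides it.
votes_confirm = sum(s["effect"] > 0 for s in studies)
votes_refute = sum(s["effect"] <= 0 for s in studies)
vote_count_score = votes_confirm / (votes_confirm + votes_refute)

# A crude stand-in for meta-analytic pooling: weight effects by sample size.
pooled = sum(s["effect"] * s["n"] for s in studies) / sum(s["n"] for s in studies)

print(round(vote_count_score, 2))  # 0.33 -- vote counting says the claim mostly fails
print(round(pooled, 3))            # 0.477 -- pooling says the effect is clearly positive
```

The vote-count score here plays the role of an R-factor, which is what van Assen means when he says the approach discards information that meta-analysis would keep.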
Sabine Hossenfelder, a theoretical physicist at the Frankfurt Institute for Advanced Studies, is similarly skeptical that the metric will translate to fields like physics.
Thumbnail photo credit: Amitchell125, CC BY-SA 3.0