
Ira Basen is a Toronto-based radio documentary producer and writer.

In June, 1936, Albert Einstein submitted a paper to the journal Physical Review. The famous physicist argued that contrary to the accepted scientific wisdom of the day, gravitational waves did not exist.

Six weeks later, Einstein received a note from the journal’s editor, informing him that the paper had been reviewed by an unnamed expert who had found some mistakes. The editor asked Einstein if he wanted to revise the paper based on the reviewer’s report.

Einstein replied that he had not authorized anyone to review the paper before publication. Accordingly, “I see no reason to address the – in any case erroneous – comments of your anonymous expert.”

Einstein proceeded to publish his work in another journal, and never again submitted a paper to Physical Review.

The process that Einstein found so offensive was still a novelty in 1936. It’s now called “peer review,” and it is the foundation upon which all academic publishing rests. It is the stamp of approval that allows other scientists, journalists and the public to trust the quality of published science. Publishing in a peer-reviewed journal is critical for researchers to gain access to grants, tenure and academic promotions.

But today, that foundation is teetering, collapsing from the weight of its own contradictions. It has been attacked for being too slow, too opaque, too biased, too accepting of the status quo, and too likely to fail in its most important purpose: weeding out bad science.

The idea is simple enough: A journal editor solicits a few experts in the field to help guide the editor’s decision about whether or not to publish a paper. The names of reviewers are generally not revealed to the authors, and reviewers are not paid.

In some highly specialized areas, where the pool of potential reviewers is fairly small, it’s not unusual for senior faculty members to be asked to review hundreds of articles a year. But reviewing is time-consuming, and without compensation or recognition, they’ll typically say yes to only a handful of requests.

That means journals are constantly scrambling to find qualified reviewers. In 2018, the editor of Pharmacy Practice sent invitations to 879 potential peer reviewers. Only 198 accepted, and of those, 15 never completed their review.

This has led to significant backlogs. Researchers often wait a year or more from submission to publication. That can have serious consequences if the article advances an important area of drug or medical research.

Frustrated by publication delays during the pandemic, thousands of researchers chose to bypass the peer-review process altogether by publishing their papers directly to the web as “preprints,” on public servers such as bioRxiv and medRxiv.

Most preprint authors eventually submit their papers to peer-reviewed journals, and recent studies have shown that most get published with only minimal changes.

These authors argue that preprints can foster collaborations between researchers in ways that are not possible with peer review. They can get timely feedback from a worldwide community of experts, rather than two or three overworked, unpaid, anonymous reviewers. That sounds good in theory, but the reality is that most preprints receive few if any comments.

Journals have recently been trying to accelerate their peer-review process, but the length of time it takes to make a decision remains a source of frustration. One large publishing company, the Taylor & Francis Group, recently offered an “Accelerated Publication” deal to writers: a guaranteed decision on the article within three to five weeks of submission, but the cost for that service is US$7,000.

If a paper is rejected, or if reviewers or editors demand major changes, authors will frequently do what Albert Einstein did, and shop the paper around to other journals where they might get a more positive response. Or, if they’re desperate, they might pay several thousand dollars to publish in one of what the academic community calls “predatory” journals, which number in the thousands and claim to be peer-reviewed but are not.

The anonymity of the review process opens it up to potential abuse. Critics claim that, too often, reviewers will reject a paper if it challenges their own research, or if it’s written by an author the reviewer may not know or like. Women and minorities have historically been seriously underrepresented as editors, reviewers and authors.

Peer review is unquestionably a good way of catching research errors, but it’s far from perfect. Reviewers are asked to evaluate a paper on its validity (were the research design and methodology appropriate?), its significance (is it an important finding?) and its originality (is there something new here?). It is not the reviewers’ job to check the accuracy of the data, and they don’t usually try to replicate experiments. Peer review is not a guarantee that the science is right, just that it was done properly.

Errors are more likely to be exposed after publication. According to the oversight group Retraction Watch, more than 3,000 papers were retracted in 2021 because of errors or outright fraud, including more than 300 COVID-related studies.

Several solutions to the problems confronting peer review are currently under consideration, and one of the most innovative is coming from two biologists at the University of Guelph. They’re trying to reinvent peer review by taking the process out of the hands of journal editors and publishers and placing it squarely with reviewers and authors.

In March, Dr. Andreas Heyland and Dr. Terry Van Raay launched Peer Premier, which promises a faster, less costly, more transparent version of peer review. Authors will pay US$1,100 to have their papers reviewed by three reviewers. An AI algorithm helps select the reviewers to ensure they are qualified and there are no conflicts of interest.

Reviewers will be paid US$300 each. Their reviews will be based on a standardized rubric developed by the two co-founders, and they will be expected to complete their reviews within one week.

At that point, the authors can choose to submit their article to a journal, or they could post it online on a public server. But unlike preprints, these papers will come with the Peer Premier stamp of approval, which means they will have been subjected to peer review that was fair, independent and done by qualified reviewers. That’s a promise that even the most prestigious journals can’t always make today.

One last point – it turns out that the expert who reviewed Einstein’s paper was right. The great scientist had indeed made a computational error in the paper he submitted to Physical Review. Einstein eventually acknowledged the error, and corrected it when he submitted the paper to the next journal.
