Posted on : November 16th 2021
Posted by : Sithara Chandran
Over the last decade, the scholarly publishing industry has seen an increase in innovation and experimentation in peer review. The development of digital technology has fuelled this, and the fact that there is significant room for improvement in the peer review process is becoming increasingly evident.
A series of technological breakthroughs, beginning in the 1990s, set the ground for innovation in the peer review system and opened up opportunities for a range of new review models. The technological infrastructure that supports review is arguably the most important shift brought about by digitalization: access to researcher webpages and email allowed faster distribution of submissions and review reports, considerably boosting the efficiency and speed of the review process.
Two new forms of peer review have emerged in the last two decades: post-publication peer review, in which manuscripts are evaluated after publication, and registered reports, in which study protocols are reviewed before the research is carried out.
Post-publication review has gone mainstream among journals and publishers, as well as on preprint servers. It was introduced largely to enable faster knowledge transfer, and a number of journals now use it.
First introduced in 2013, the registered reports system remains confined mainly to the psychological and medical disciplines. Manuscripts are reviewed in two stages: the methods and proposed analyses are peer-reviewed before data collection and analysis begin (Stage 1), and once the study is complete, the authors finalise the article to include the results and discussion (Stage 2). The entire paper is then peer-reviewed again to verify that there is no unreasonable divergence from the pre-registered protocol.
In addition to enabling a system of preprint archives, the internet and large databases allowed journals to publish a vast number of articles. The scholarly community witnessed a sudden surge in the number of manuscripts published in outlets using this less restrictive review model, a surge attributable primarily to the lighter review procedure, and it brought new challenges to the publication process. Finding enough qualified reviewers to handle submissions is one of the biggest. The sheer volume of published papers also raised concerns that the scientific literature could become unmanageable, increasing the need for further screening so that academics can cope with the enormous number of potentially relevant papers.
Over the last few decades, new players have entered the review process and several models have evolved. Platforms such as Axios Review, Peerage of Science, and RUBRIQ offer tools and services for conducting reviews and submitting manuscripts with referee reports to journals. Adopting 'cascading peer review' is another way to ease the burden on reviewers. This model, originally introduced at the beginning of the twenty-first century, is now widely used among publishers: rather than simply rejecting a paper, the publisher redirects it to a possibly more suitable journal within its portfolio, cutting costs and boosting efficiency. These methods are meant to reduce the number of rounds of peer review a single manuscript requires.
There is growing public interest in the reproducibility of research data. Both funding and publication policies have encouraged data sharing, and the number of titles that mandate it in some form is growing fast. Springer Nature, AGU, PLOS, and the American Economic Association, among others, have introduced data-sharing policies in recent years, and data access has been the focus of other initiatives such as the COPDESS Statement of Commitment. Funding organizations such as the Gates Foundation, the Arnold Foundation, and the Wellcome Trust have made data sharing a cornerstone of their funding policies, and several US government agencies are covered by the 2013 OSTP memo on improving access to federally funded research.
Even though we are in a digital era where fast-track publication is the norm, the principle behind peer review remains the same. The internet has transformed our expectations of how communication works, and new technologies continue to change how we communicate and connect online.
Download our whitepaper to learn more about integrating new technology-mediated communication standards into successful, widely recognized peer review models.