
Content Repurposing – Digital Transformation of Scientific Publishing

Posted on: June 8, 2021

Posted by: Sithara Chandran

STM publishing continues to evolve, serving a wide array of academic and scientific communities. The rise of open access, the impact of mobile technology, and the shifting demand for online content are shaping publishers' business strategies.

Facilitating quality research is just the tip of the iceberg.

It is true that STM publishing has been deeply impacted by open access policies and digital technologies, but it would be unfair to suggest that these forces have disrupted the core publishing functions of registration, certification, dissemination, and preservation. Rather, they have enabled wider and more intelligent distribution of scientific discovery and insight among all participants in the scientific community, not just publishers. This ease of access, however, does not compromise the integrity of the STM publishing process. Before an article is published, an editor screens it and, drawing on feedback from a panel of scientific experts in the field, suggests changes to the author's manuscript. This collaboration matters: it is this multi-level, impartial scrutiny that gives academic content its credibility.
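The screening-and-review process described above can be thought of as a state machine: a manuscript moves from submission through editorial screening and peer review toward a decision. The sketch below illustrates that idea; the state names and transitions are illustrative assumptions, not drawn from any specific submission system.

```python
from dataclasses import dataclass, field

# Illustrative editorial workflow states (hypothetical names).
# Editor screening gates entry to peer review, and reviewer
# feedback drives either revision or a final decision.
TRANSITIONS = {
    "submitted": {"editor_screening"},
    "editor_screening": {"under_review", "rejected"},
    "under_review": {"revision_requested", "accepted", "rejected"},
    "revision_requested": {"under_review"},
}

@dataclass
class Manuscript:
    title: str
    state: str = "submitted"
    history: list = field(default_factory=list)

    def advance(self, new_state: str) -> None:
        """Move to a new state, refusing transitions the workflow forbids."""
        allowed = TRANSITIONS.get(self.state, set())
        if new_state not in allowed:
            raise ValueError(f"cannot move from {self.state} to {new_state}")
        self.history.append(self.state)
        self.state = new_state
```

The point of the model is that no path reaches "accepted" without passing through both editorial screening and peer review, which is exactly the multi-level scrutiny the text credits for the credibility of academic content.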

With original research freely accessible on the web (largely free of copyright and licensing restrictions on reuse), alongside globalization and the growth of emerging regions, the development of powerful aggregation platforms is inevitable. But the challenge to publishers goes well beyond economics.

Open platforms for accessing journals and articles, for example, are pushing publishers to develop new solutions for linking publications to research data, enabling data mining, and managing data so it can be segmented for specific uses. Data repositories have multiplied, and the Scholix initiative was introduced to systematically establish interoperability between scientific literature and research data. As a result, publishers are coming together to develop standardized data deposit and sharing policies for journals (including data citation policies) and are launching pioneering data journals and services while investing in data discovery tools.
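To make the Scholix idea concrete, the sketch below builds a minimal article-to-dataset link record. The top-level field names (RelationshipType, Source, Target, LinkProvider) follow the general shape of the Scholix link information package, but this is a simplified assumption of the schema, and all identifiers and names are placeholders.

```python
import json

def scholix_link(article_doi: str, dataset_doi: str, provider: str) -> dict:
    """Build a minimal Scholix-style record linking an article to a
    dataset. Field names approximate the Scholix metadata framework;
    the identifiers here are placeholders, not real DOIs."""
    return {
        "RelationshipType": {"Name": "References"},
        "Source": {
            "Identifier": {"ID": article_doi, "IDScheme": "doi"},
            "Type": {"Name": "publication"},
        },
        "Target": {
            "Identifier": {"ID": dataset_doi, "IDScheme": "doi"},
            "Type": {"Name": "dataset"},
        },
        "LinkProvider": [{"Name": provider}],
    }

# Example: a hypothetical article referencing a hypothetical dataset.
record = scholix_link("10.1000/example-article",
                      "10.5061/example-dataset",
                      "Example Link Provider")
print(json.dumps(record, indent=2))
```

Records of this shape are what let aggregators answer questions like "which datasets underpin this article?" across publishers and repositories, which is the interoperability Scholix was created to standardize.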

While we are on the subject, a mention must be made of repurposing or double-dipping one’s work. The line dividing its acceptance or rejection is subjective and depends largely on the author, the institution they are associated with and around issues of copyright, and where the work was first published. One must also consider the repurposing format; an oral presentation may not shape well into a publication, the subject may not fit the repurposing use, or may have been already extensively worked upon by others. The process is viable if there is new information or a fresh perspective on the subject; at any rate, any repurposing must expand on the topic.

Text and data mining is transforming how scientists use STM literature. Technology now aggregates and filters content on a single platform according to a researcher's enquiry, eliminating rework and making citation seamless at low cost. Add speed to this, and we see yet another change in the traditional way STM content is handled and disseminated.
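At its simplest, the aggregation-and-filtering described above is query matching plus term counting over a corpus. The toy sketch below shows both steps; the three abstracts are invented stand-ins for aggregated STM content, and a real pipeline would pull from publisher APIs or repositories rather than a hard-coded list.

```python
import re
from collections import Counter

# Toy corpus standing in for aggregated STM abstracts (invented text).
ABSTRACTS = [
    "Open access policies accelerate data sharing in genomics research.",
    "Machine learning methods for mining scientific literature at scale.",
    "Data citation practices improve reproducibility of published research.",
]

def search(corpus: list, query: str) -> list:
    """Return items containing every query term (case-insensitive),
    i.e. filter aggregated content down to a researcher's enquiry."""
    terms = query.lower().split()
    return [text for text in corpus
            if all(term in text.lower() for term in terms)]

def term_frequencies(corpus: list) -> Counter:
    """Count word occurrences across the corpus, a first step toward
    surfacing recurring topics in the mined literature."""
    words = re.findall(r"[a-z]+", " ".join(corpus).lower())
    return Counter(words)
```

Even this minimal version shows why mining at scale matters: one pass over the corpus answers a query and profiles its vocabulary, work a researcher would otherwise repeat by hand for every enquiry.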

Research funders are turning publisher in a bid to make their research spending go further, faster. A good example is Gates Open Research, which promises to publish manuscripts within a week of finalization, complete with open peer reviews and all research outputs, including data and software, all of it citable. This feeds into the funder's larger vision: helping researchers make their work as widely available to the scientific community as possible, so it can be read, validated, critiqued, and built upon. Here, the combination of technology and open access has made the process remarkably efficient.

While open access is addressing many challenges around irreproducible research and the optimal use of STM content, publishers can help further by reformatting the content itself (journals published as data sheets are one example) and by supporting quality assurance with benchmarked peer review modules.

Above all, Gen Z, a generation of digital natives, is here to stay.

For these digital natives, seamless, customized solutions that flow from one secure, certified environment into another are critical. They expect STM outputs to be tailored and delivered when they want them; open access, transparent peer review, and AI-driven efficiency are simply a given.

The publisher's role has become more accountable and more challenging. Content repurposing is both the goal and, eventually, the means. With these trends, the industry has made paradigm shifts in how it functions and perceives its responsibilities. What we are looking at is not just an opportunity to grow but a leap to evolve.

STM Tech Trends 2024: FOCUS ON THE USER – Connect the dots

Source: www.stm-assoc.org
