APE 2022
The Future of the Permanent Record
The First Event in the New Year!

Academic Publishing in Europe No. 17, 11-13 January 2022. Please note: three-day online conference


APE 2022: The Future of the Permanent Record - Online Conference

For the second time, the APE conference will be held online. (Thanks to our sponsor Morressier!) The recordings will remain available long after the conference as part of the Permanent Record.


DAY ONE: Tuesday, 11 January 2022

Please note: all times indicated are based on Central European Time (CET)


Greetings and Opening

  • Eric Merkel-Sobotta, Managing Director, Berlin Institute for Scholarly Publishing (BISP), Berlin

  • Dr. Caroline Sutton, Incoming (February 2022) CEO, The International Association of Scientific, Technical and Medical Publishers (STM) and Director of Open Research, Taylor & Francis Group, Oxford



  • Introduction: Liz Ferguson, Senior Vice President, Research Publishing, Wiley

Keynote 1. Quality and Equity in Academic Publishing

  • Prof. Dr. Maria Leptin, President, European Research Council (ERC), Brussels

Academic publishing serves several functions: dissemination of research results, quality control, and long-term preservation of the published work. In many disciplines, preprints and publishing platforms using open peer review are becoming popular, often involving the repeated posting of a revised version. As some experts put it, the traditional ‘version of record’ is being replaced by a ‘record of versions’.
Whether journals or books and regardless of discipline, all forms of high-quality academic publishing have one aspect in common: quality control that deserves the name comes at a cost. The burden is usually shared between the academic community (providing peer review and acting as external editors) and professional publishers. The latter organize the services provided by peers and carry out numerous additional steps, such as checks for plagiarism and image manipulation, copy-editing, or verification of data deposition according to community standards. Logically, these costs should be reflected in the price that subscribers or, in the case of open access, authors may be asked to pay.
However, in the absence of relevant information, the link between costs and price remains largely obscure, making it difficult for authors to judge whether prices are justified. Willingness to pay may be primarily determined by the perceived prestige of the publishing venue, which can lead to prices being as high as the market will allow.
Partly as a consequence of funder mandates, publishing is gradually shifting to an open access system, mostly financed by article processing charges. Publishers have used hybrid models as a tool to support the transition, but it is unclear how and at what price a transition will eventually take place, especially for publication venues that are selective and provide high quality services. Alternative approaches, e.g. financial support through consortia of institutions or funders, are rare.
The subscription model excludes researchers and other citizens from fast and easy access to knowledge if they are not affiliated with an institution that can pay for it, but the new model has introduced new inequalities. Researchers who do not have the necessary funding to pay high open access fees are either forced to publish behind a paywall, or have to choose a cheaper venue. This could exclude them not only from the most prestigious publishing outlets but also from those offering high-class quality control and additional services, pushing them towards predatory venues.
In my talk, I will look into the role that funders can and should play in this context.

Keynote 2. Building a Framework for the Future of the Record of Science

  • Todd Carpenter, Executive Director, NISO (National Information Standards Organization), Washington, DC

For centuries, the “Record of Science” consisted of printed books and journals. These were lovingly crafted by authors and publishers and curated by librarians. In the mid-20th century, abstracting and indexing services, catalogues and a few basic metadata structures all supported user discovery. Eventually, identifier systems were developed for books (ISBN) and journals (ISSN), which began to move processing, ordering and discovery online; this would later be supported by the Digital Object Identifier (DOI) at the dawn of the 21st century. People could discover, navigate, purchase, and catalogue content more efficiently. Over the past three decades, scholarly publishing has moved from a primarily print-focused endeavor to one that is primarily, if not almost exclusively, digital.

But this shift to electronic publishing and the digital dissemination of information has more than sped up the process and made it more efficient. In many ways it has fundamentally changed and widened this landscape. Traditional publishers with their content forms of books and journals still play a dominant role, but interest is rapidly growing in underlying data, procedural processes, workflows, software, data visualization, and computer simulation. The “article of the future” is so much more than a single object: it is a network of inter-related content forms.

In this new environment, traditional systems of review and publication are being disrupted by preprints, by open science, and by open data. How scholars are being assessed and the metrics upon which that assessment is being done are also shifting. The financial models that supported this ecosystem are also being tested and reconfigured. The trust that is a cornerstone of scholarly publishing is being eroded as fraud, plagiarism, technological manipulation, and paper mills all seed doubt in scholarly output, even as more and more of the world gains access to content via OA.

More than ever, we need greater consistency in community behavior to regain trust. Alongside this, there is an ever-increasing demand for interoperability and interaction with content. To complicate this work, the information landscape we function in is significantly more complex technologically, internationally, and legally. As the expectations change for what is included in the “Record of Science,” the systems that support that ecosystem need to evolve. New sets of rules for how content is created, distributed, and managed need to be forged and adopted. It took hundreds of years for norms for sharing print content to develop. In our current environment, we don’t have that long. We need to get to work.


Breakout Rooms – time to connect


Paper: Metadata Quality in Time of Diverse Research Outputs

  • Martyn Rittman, Product Manager, Crossref

Putting more research into the public domain has significant benefits, but can lead to confusion as the number and types of outputs proliferate. In the print era, the journal article or book was the focal point, whereas now there are preprints, datasets, preregistered methods, online commentary, and much more, each with a unique contribution. How can we track and navigate these works across multiple sites, and what happens when one is modified or retracted? How can trustworthiness (or a lack of it) be established and reported? What role can Crossmark metadata play? A number of years ago, we established Crossmark to report the current status of content and whether an update is available. Now we are looking at broader reporting, including linking versions and reporting relationships. This talk explores these issues and looks at the impact of and challenges for Crossref services such as Crossmark and Event Data.


Introduction: The Future is waiting for us - the Role of the Permanent Record in an enriched & dynamic Publishing Ecosystem

  • Dr. Liz Marchant, Global Portfolio Director (Life, Earth & Environmental Sciences) at Taylor & Francis Group, Oxford

For many years we have recognised the value of the permanent record – a record of research undertaken and conclusions made that has been peer reviewed, archived and, on the whole, is not altered or deleted. This makes the content model conservative even though there has been plenty of invention around connecting and accessing the content in different ways through different models. Is everyone still committed to the Version of Record (VOR) – and in what ways? What are the essential elements of the VOR? Is it the predictable content structure? Many want to publish outputs in different ways. Is it confirmation a good Editor has seen it? New models argue this is not necessary. That it has been peer reviewed? Most people say yes, but it is a fallible system. That you can stand behind the ethics of it? Vital. That it is being hosted somewhere reliable, that it feeds metadata so it can be connected, plumbed into the infrastructure so it can be found and used – most would agree. That it has longevity. The permanent record cannot and should not be an inhibitor of change, but an enabler of invention and modernisation.

Session: The Version of Record under Attack! The Dark Side of the Scholarly Publishing Universe

  • Moderation: Dr. Liz Marchant

Can society trust the Version of Record? Fraud is now applied at scale. There are many new actors publishing research outputs. How much more investment does it take to maintain that trust? The rise of the paper mill. A series of lightning talks will give us rapid insights into the risks around the VOR and what can be done to protect it, before we move into thinking about what value it brings after the break.

The Need for Trust in Science and how to (try to) preserve it

  • Prof. Dr. Christian Behl, Editor-in-Chief, Journal of Cellular Biochemistry and Institute for Pathobiochemistry, University Medical Center of Johannes-Gutenberg-University, Mainz

As experienced during the still-ongoing Corona pandemic, trust in scientific data, and trust in scientists and their publications and statements, is essential. We experience at first hand that, for instance, to solve a medical problem we need to be able to believe what the experts say. Published scientific data need to be constantly challenged, confirmed, or refuted by the appropriate methodology as the driving force of the development of hypotheses. Discussion and controversy are at the heart of science and scientific progress.
This may lead to rapid success, such as we experienced with the vaccination options during the Corona pandemic. It may also lead to cliffhanging, hypothesis-driven long-term research and dispute, as we have been experiencing in the search for full comprehension of devastating diseases such as Alzheimer’s disease. The absolutely non-negotiable prerequisite for the general acceptance of scientific facts is their integrity. It appears that in recent years, fake information and fake data, as well as less stringent attention and effort in the interpretation of scientific data and manuscripts, have taken up more and more room. This is indicated by the constantly increasing number of corrected, withdrawn, or retracted publications, as well as the increasing popularity of web-based organisations that draw potential data flaws and potentially fake data in published papers into the public eye and to the attention of journals.
Scientific mistakes, misconduct, data manipulations, and fraud are not a recent phenomenon; they have been around as long as scientific publishing has.

However, in recent years, numerous journals have been the target of multiple attempts of fraud, partly orchestrated by so called “paper mill offices”, which make a living out of manipulating or constructing data and mass-producing fraudulent manuscripts for submission. While this expansive phenomenon is being further uncovered, expert referees and journal editors have reached a state of high alert, trying to avoid the fake-data-trap. It is almost impossible for the expert science referee to identify all misconduct, especially when it comes to manipulations that have been realized employing professional image editing software. More and more journals are implementing state-of-the-art technology in order to carefully sort out potentially problematic publications – a costly and tedious effort. Regardless, there is no other viable way than to stay alert and to constantly adjust the toolbox to identify and investigate problematic cases. After all, this is for the sake of scientific integrity, and bringing back and protecting trust in science. Ultimately, this effort – at so many levels – is of utmost importance for us as scientists who want to publish data.

Paper Mills: global Knowledge Contamination by industrial-style Fake Science Publishing

  • Prof. Dr. Ph.D. Bernhard Sabel, Editor-in-Chief, Restorative Neurology and Neuroscience and Medical Faculty, Otto-von-Guericke University of Magdeburg

Criminal science publishing gangs – also known as “paper mills” – contaminate the scientific literature with fake publications (FPs) which are of sufficient quality to pass peer review. Because FPs are on the rise in an unprecedented manner, we carried out a bibliographical/statistical analysis to estimate their number and the associated revenues of the paper-mill business model. First, we scanned the 149 publications from 2017-2021 in the journal RESTORATIVE NEUROLOGY AND NEUROSCIENCE for telltale signs of fakery and identified an embarrassing 43 “suspect” FPs (28.7%). This prompted us to collect questionnaires from all corresponding authors to calculate a fake factor (FF), which identified 23 (15.3%) “probable” FPs. Next, we screened a random mix of 3,500 publications from 35 basic and clinical neuroscience journals and identified 376 “suspects” (10.7%). When extrapolating to all PubMed publications (6.27 million over 5 years), we estimated 450,000 medical FPs and 1.4 million FPs among all 14 million science/technology publications (Web of Science). For all of us in search of generating true knowledge – and not generating fiction – the impact of FPs on the permanent scientific record can no longer be ignored.
The FP business is an attractive commercial opportunity for paper mills. Assuming FP authors pay 5,000 EUR, the global FP business model revenue over 5 years amounts to 5-7 billion EUR (“suspected”/“probable” FPs), i.e. up to 1.0-1.4 billion EUR in revenue annually. This magnitude of knowledge contamination is frightening and of growing concern. Unless all stakeholders counter this massive attack and safeguard the public’s trust in, and the reputation of, scientific integrity, this knowledge contamination will have a (delayed) ripple effect on global public health, science, technological and economic development, and the publishing field itself. Stakeholders at all levels are called to action: journal editors, reviewers, science publishers, national/regional governments, administration bodies, funding agencies, and employers. It is time not only to re-think the incentives and rewards of scientists and publishers, but also to immediately and decisively stop the business model of criminal science publishing gangs, filtering out fraudulent papers at the pre-publication stage and retracting FPs from the permanent scientific record.
It may be a bitter pill to swallow, but swift global action is worth the pain to restore the “health” of the scientific record and to prevent the erosion of trust in science. All nations depend on the truth of the knowledge laid down in the permanent scientific record. No one will lose face by facing this new and fundamental problem. Science and “true love” have two things in common: both are fueled by passion, and both rely on trust. If trust is lost, it is very hard to go back.

Time is a Thief of Memory

  • Dr. Alicia Wise, Executive Director, CLOCKSS, Stanford, CA

Reputable publishers have a crucial role to play if we are to preserve digital scholarship over the coming centuries. This challenge is increasingly complex as research and technology continue to evolve at breakneck speeds, and as we collectively rise to the challenges of climate change.

The Research Integrity Arms Race and the Role of Specialists to maintain the Integrity of the published Record

  • Tim Kersjes, Research Integrity Manager, Springer Nature, Dordrecht

Springer Nature and its journals have seen an increase in the number of submissions that appear to originate from paper mills, and we have steadily increased the number of (automated) checks on submitted manuscripts to try to avoid sending these manuscripts through peer review and into the published record. However, paper mill submissions are becoming more and more sophisticated, making it harder to detect them at the submission stage. A journal can try to plug a hole here; a paper mill finds another hole somewhere else. In short, the fight for integrity is an arms race. This is leading to an increasing demand for reliable and automated solutions to stop paper mill submissions from polluting the publishing ecosystem, but the technology paper mills use is also getting better. Additionally, when these submissions make it to the published record, cleaning and correcting the literature also benefits from increasing levels of automation. However, we must not lose sight of the role and importance of human research integrity specialists. Correcting the literature is a slow, sometimes too slow, process. Technology and automation can help, but a speedier correction of the literature also relies on investing in specialists and streamlining the processes that lead to the appropriate correction of the literature. In this talk, we will focus on the joint effort between technology-backed prevention and human specialist intervention, and how feedback between the two will hopefully keep publishers one step ahead in the integrity arms race.

Tackling Misconduct through Technology and Collaboration

  • Dr. Joris van Rossum, Product Director, STM Solutions, and Director of Research Integrity, STM, Amsterdam

Recent years have seen a significant increase in research integrity issues. A striking example is the so-called ‘paper mill’: a fraudulent organization that produces counterfeit manuscripts for submission to academic journals. Increasingly advanced technologies are used in the process to fabricate, plagiarize, and manipulate text, images, and research data. This is placing a large burden on publishers, editors, and reviewers, who call for reliable, state-of-the-art technology solutions to detect misconduct like this and support the editorial decision-making process.
STM Solutions recently announced it is developing a powerful new platform to detect integrity issues in manuscripts submitted for publication to scholarly journals. The collaboration platform will be built on an architecture that ensures that publishers maintain full control over the content, safeguarding privacy and confidentiality. Critically, by working with content from multiple publishers, the system will be able to detect issues that transcend single submissions, journals, or publishers.
In this talk, Joris van Rossum, Product Director of the collaboration hub, will explain more about the plans and next steps.


The APE Lecture: Life in a Liminal Space

  • Dr. David Crotty, Chef Cook of ‘The Scholarly Kitchen’ and Senior Consultant, Clarke & Esposito

A liminal space is the time between the ‘what was’ and the ‘next’. It is a period of transition, uncertainty, and multiple paths forward. We close the first day of our meeting with some thoughts on those paths, where they are heading and where they diverge. The first wave of open access is upon us, driven by the APC model, moving us to favor quantity over quality, and resulting in massive consolidation in many areas of the market. What comes next?


End of Day One

DAY TWO: Wednesday, 12 January 2022


Recap of Day One


Introduction: Why is the VOR worth it for the Research Community?

  • Prof. Lisa Janicke Hinchliffe, University Library, University of Illinois, Urbana

The "version of record" is a touchstone concept in scholarly communications that serves as an organizing structure for academic publishing. Nonetheless, the version of record is also increasingly de-coupled from many of the core functions of academic publishing: registration, certification, dissemination, and preservation, which increasingly considers a "record of versions" and not only the version of record. What then of the "version of record"? Is its importance decreased, increased, or unaffected? Does context and use case matter? Does the research community need the version of record? Prof. Hinchliffe will present opening remarks that will then frame an exploratory dialogue among the panelists.

Panel: The Version of Record - an Anchor for Innovation Upstream and Downstream

  • Moderation: Anne Kitson, Managing Director and Senior Vice President, Cell Press and The Lancet, Elsevier, Oxford

  • Prof. Lisa Janicke Hinchliffe, University Library, University of Illinois, Urbana

  • Dr. Niamh O'Connor, Chief Publishing Officer, PLOS

  • Prof. Dr. Ulrich Dirnagl, Director, Quest Center for Transforming Biomedical Research and Dept. of Experimental Neurology, Berlin Institute of Health, Charité and MDC, Berlin

  • Prof. Dr. Christian Behl, Editor-in-Chief, Journal of Cellular Biochemistry and Institute for Pathobiochemistry, University Medical Center of Johannes-Gutenberg-University, Mainz

  • Dr. Bernd Pulverer, Chief Editor, EMBO Reports, Head of Scientific Publications at EMBO Press, Heidelberg


Breakout Rooms – time to connect


Concurrent Sessions

Towards an IDEAL World: How to foster Inclusion, Diversity and Equity in Scholarly Communication

  • Moderation: Dr. Christene Smith, Chair of the Editorial IDEAL Committee at De Gruyter, Berlin

The world is a fabulously diverse place and we need to make room for all voices in our work. However, publishing does not always reflect this diversity, and we as publishers have perhaps been somewhat slow to equip our organizations with the knowledge and best practices to celebrate the people we are and the stakeholders we serve. Utilizing IDEAL (Inclusion, Diversity, Equity and Anti-Racism Lived) concepts will enrich our industry with new ideas, new perspectives and new voices.
Gender and ethnic inequalities, systematic racism and other biases affect scholarly research and communication in different forms and on several levels. This session will investigate how our industry can improve equity, diversity and inclusion. What constraints hinder us? What opportunities open up? How do we celebrate diversity as part of being better and therefore even more successful organizations?
We will introduce initiatives and approaches that aim to accelerate a cultural change in scholarly publishing. By emphasizing collaborative approaches and practical aspects, experts from editorial, human resources and data analysis will highlight both the opportunities and the challenges that publishers face in assessing the status quo, defining objectives and implementing improvements.

Collective Action to make Things better, together

  • Dr. Nicola Nugent, Publishing Manager, Quality and Ethics, Royal Society of Chemistry, London

The Royal Society of Chemistry has brought together 47 publishing organisations to set a new standard to ensure a more inclusive and diverse culture within scholarly publishing. The Joint commitment for action on inclusion and diversity in publishing was launched in June 2020. It was developed following a workshop in which we shared our Framework for action – a practical guide to reducing bias in our own publishing activities – with other publishers. In order to deliver on our commitments, we have formed a working group of representatives from each signatory organisation, with sub-groups taking forward specific areas of action under each of the four points made in the original commitment. More than a year on, we can celebrate a number of achievements, including the launch of a set of minimum standards for inclusion and diversity in scholarly publishing.

Why Data is the only Way to really improve Diversity

  • Nancy Roberts, Founder and CEO of Umbrella Analytics, Bury St Edmunds, Suffolk

The increased focus on improving diversity in our industry in recent years has led to companies wasting a lot of time and energy on well-intentioned - but ultimately unsuccessful - interventions. It’s time to ditch the feel-good approach and start using the data. This session will give an overview of why data works, how to collect and use it, and how to make measurable improvements in publishing.

Inclusive Research and the Global South

  • Ylann Schemm, Director, Elsevier Foundation and Corporate Responsibility, Elsevier, Amsterdam and Chair of the Research4Life Executive Council

Over the past 20 years, nearly 200 publishers have worked together through Research4Life to provide free and low-cost content to researchers in developing countries. But with growth of OA and the launch of the UN Sustainable Development Goals, Research4Life and the publishing industry at large have witnessed a paradigm shift: researchers in developing countries moving from consumers of knowledge to producers of research. To foster this critical development, publishers must examine their own role and take measures to develop equitable OA policies, create editorial boards with geographic diversity, recruit reviewers from low and middle income countries and publish robust research from the South as well as the North. How can we learn from our experience with gender, race and ethnicity, to build a more inclusive research ecosystem with the Global South?

Panel: Ensuring Public Access to Research Data

  • Introduction and Moderation: Nick Campbell, Vice President Academic Affairs, Springer Nature


  • Rick Anderson, University Librarian, Brigham Young University

  • Michael Levine-Clark, Dean of Libraries, University of Denver

  • Judith C. Russell, Dean of University Libraries, University of Florida

In this session, three academic research librarians will provide an overview of CHORUS; outline some of the challenges faced by research institutions of different sizes in managing data; articulate the differing roles of libraries in this process at these universities; and describe how these institutions might make use of CHORUS or similar tools to manage data more effectively and link it to related publications. They will also seek input from, and potential partnerships among, the academic research and publishing communities to further our common objectives.


Paper: Publishing as a Process: Open by Design

  • Dr. Niamh O’Connor, Chief Publishing Officer, PLOS

This presentation will examine the ways in which publishing can become a truly integral part of the research process. How moving away from ‘publishing as an event’ gives us the freedom to re-examine the role of the article in enabling the sharing and discovery of information, in research assessment, in the way in which we archive the research record and in publisher business models that are built on the ‘Version of Record’. And how it allows us to reimagine scholarly communications in ways that accelerate discovery and innovation, making the scientific process more inclusive and accessible to society as a whole.


End of Day Two

DAY THREE: Thursday, 13 January 2022


Recap of Day Two


Panel: How can we foster Entrepreneurship and Innovation in Scholarly Communications?

  • Introduction and Moderation: David Worlock, Chief Research Fellow, Outsell Inc., London
  • Sami Benchekroun, Co-founder and CEO of Morressier
  • Ijad Madisch, Co-founder and CEO of ResearchGate

From the way we bank to the way we shop: Technology startups are the driving force for innovation throughout many global industries. Scholarly communications, however, has traditionally provided less fertile ground for venture-backed companies, despite the essential role they play in facilitating progress for the entire community.

As we look to the future, how can we build a more innovation-friendly environment and encourage more startups to tackle the major challenges within scholarly communications? And what role does public and private investment play in supporting such companies?

In this panel, the founders of three leading venture-capital-funded technology companies (Jan Reichelt for Mendeley, Sami Benchekroun for Morressier and Ijad Madisch for ResearchGate) come together to discuss these questions and reflect on their learnings from breaking into the scholarly communications sector.


Breakout Rooms – time to connect


Session: New Dotcoms to Watch

  • presented by Drs. Eefke Smit, STM Director of Standards and Technology, and Dr. Joris van Rossum, Product Director, STM Solutions, and Director of Research Integrity, STM, Amsterdam

Are the start-ups of today the players of tomorrow? In this very popular Dotcoms-to-Watch session, we shall again present a stellar line-up of start-ups that we have identified as potential movers and shakers in scholarly communication.

Don’t miss it! And join the popularity vote.


End of APE 2022 Conference

Auf Wiedersehen! Goodbye!

Many thanks to the Program Committee, Chairpersons and Speakers for realizing this program. We would also like to thank our Sponsors and Partners, without whom we could not make the APE Conference happen. See you all next year!

Please note: APE 2023, 10-11 January 2023.