As biotechnology advances, the risk of accidental or deliberate misuse of biological research, such as viral engineering, increases. At the same time, "open science" practices, such as the public sharing of data and research protocols, are becoming widespread. An article published on April 14th in the open access journal PLOS Biology by James Smith and Jonas Sandbrink of the University of Oxford, UK, examines how open science practices and misuse risks intersect and offers solutions to the problems identified.
The authors tackle a crucial question that first arose with the advent of nuclear physics: how should the scientific community react when two values – safety and transparency – come into conflict? They argue that in the context of viral engineering, open code, data, and materials may increase the risk of the release of enhanced pathogens. Freely available machine learning models, for instance, could reduce the time needed in the laboratory and thereby facilitate pathogen engineering.
To mitigate such catastrophic misuse, the authors argue, mechanisms to ensure responsible access to hazardous research materials should be explored. In particular, to prevent the misuse of computational tools, it may be necessary to control access to software and data.
Preprints, which have become widespread during the pandemic, make it difficult to prevent the dissemination of risky information at the publication stage. In response, the authors argue that monitoring needs to happen earlier in the research life cycle. Finally, Smith and Sandbrink point out that pre-registration of research, a practice promoted by the open science community to increase research quality, can provide an opportunity to review and mitigate research risks.
“In the face of increasingly accessible methods for the creation of potential pandemic pathogens, the scientific community must take steps to mitigate catastrophic misuse,” Smith and Sandbrink say. “Risk mitigation measures must be integrated into developing practices to ensure open, high-quality, and reproducible scientific research. To make progress on this important issue, experts in open science and biosafety must work together to develop mechanisms to ensure responsible research with maximum societal benefit.”
The authors propose several such mechanisms and hope that their work will spur innovation in this critically important yet much-neglected area. They show that science cannot simply be open or closed: there are intermediate states that must be explored, and difficult trade-offs affecting core scientific values may be necessary. “Contrary to the strong narrative in favor of open science that has emerged in recent years, maximizing the societal benefits of scientific work can sometimes mean preventing, rather than encouraging, its spread,” they conclude.
In your coverage, please use this URL to provide access to the article available for free in PLOS Biology: http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3001600
Citation: Smith JA, Sandbrink JB (2022) Biosecurity in the Age of Open Science. PLoS Biol 20(4): e3001600. https://doi.org/10.1371/journal.pbio.3001600
Author countries: UK
Funding: JAS has received support from the Effective Altruism Funds program through the Long-Term Future Fund (https://funds.effectivealtruism.org/funds/far-future). JAS's postdoctoral position is funded by the National Institute for Health Research Oxford Biomedical Research Centre (https://oxfordbrc.nihr.ac.uk/). JBS's doctoral research is funded by Open Philanthropy (https://www.openphilanthropy.org/). The funders had no role in study design, data collection and analysis, the decision to publish, or the preparation of the manuscript.
Conflict of Interest Statement
Competing interests: I have read the journal policy and the authors of this manuscript have the following competing interests: In 2017, James Smith was a consultant for Biolacuna Ltd. He is currently a consultant for Alvea LLC.
Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.