Why do labs keep making dangerous viruses?

A controversial new study involving an engineered version of Covid’s omicron variant raises fresh questions about research oversight.

Boston University’s National Emerging Infectious Diseases Laboratories.
MediaNews Group via Getty Images

The omicron variant of Covid-19 is more transmissible than earlier variants of the virus, which has helped drive so many infections that health officials have essentially lost count. But omicron is also less deadly than past variants, even controlling for patient demographics and vaccination status. That’s a very good thing, because a more transmissible and more deadly version of a virus that has already killed more than 6.5 million people around the world would be a true nightmare.

We’re very fortunate evolution hasn’t dealt us that card yet. Which is why it’s rather strange that scientists at Boston University decided to see whether they could engineer a new Covid virus in the lab: one, called the omicron S-bearing virus, that was as contagious as omicron (that is, extremely contagious) but more likely to cause severe disease.

It turns out they could. Which we know because they published the details over the weekend.

“The Omicron S-bearing virus robustly escapes vaccine-induced humoral immunity ... yet unlike naturally occurring Omicron, efficiently replicates in cell lines and primary-like distal lung cells,” their just-released preprint paper announces. “In K18-hACE2 mice [a kind of mouse specifically engineered to be vulnerable to Covid], while Omicron causes mild, non-fatal infection, the Omicron S-carrying virus inflicts severe disease with a mortality rate of 80%.”

What that means is that their new virus seems substantially more dangerous than the original omicron variant, though still less deadly in their mouse population than the original strain of Covid-19. Taking the mouse models at face value, the researchers likely created a virus with omicron’s infectivity and a lethality somewhere between that of omicron and that of earlier strains of Covid. That’s worrying, to say the least.

Mistakes can be made

The researchers behind this work were doubtless trying to help the world, but you don’t have to look very far back in history to imagine what could have gone wrong here. Last November, in Taiwan, a lab assistant working with Covid-infected mice was bitten, caught Covid (almost certainly from the lab, as the virus wasn’t circulating in Taiwan at the time), and went on to expose 110 people.

As with the new omicron research, the Taiwan work took place at a BSL-3 (biological safety level 3) lab, one level below the BSL-4 labs reserved for research on the most dangerous and exotic pathogens, like Ebola. While in theory BSL-3 labs have extensive safety precautions to protect researchers and the public, in practice there’s a lot of room for error, especially in experiments with animal subjects — which have a disease-spreading tendency to bite.

That means every time scientists engineer a more contagious or deadlier version of a virus, they run some risk of infecting someone in the lab, who could then carry that pathogen out into the public and potentially unleash the very pandemic the research was meant to study and prevent.

That possibility has led many scientists to warn we need to be more thoughtful about research into making diseases more transmissible or more lethal — sometimes called “gain of function research,” though the exact boundaries of that term are hotly disputed, including in this work. “If a gain-of-function pathogen should reach the public via an infected lab worker, it could have a catastrophic outcome,” Imke Schroeder, a microbiologist at the University of California, Los Angeles, who studies lab safety, told Science after the Taiwan incident.

Doing biology better

There is genuine scientific value to studying and altering viruses in a lab, whether to trace the origins of an outbreak, to figure out how a pathogen spreads, or to develop potential medical countermeasures. But you can get most of the scientific value far more safely while refraining from research that creates new deadly pathogens with pandemic potential or artificially enhances existing pandemic-potential pathogens.

The new Boston University research is a bit of a borderline case here. They weren’t trying to make omicron more transmissible, or make Covid-classic more lethal. The researchers behind the work say that they were trying to determine if omicron really was less virulent than original Covid, and what part of the virus affected the severity of disease. But the virus they created — with omicron’s transmissibility but closer to Covid-classic’s lethality — could potentially kill a lot of people if it were to become the dominant strain of Covid in the world.

Opponents of gain of function research on pandemic-potential pathogens point out that there’s plenty of cutting-edge biology work that can make us safer rather than putting us in more danger. On Monday, the Intelligence Advanced Research Projects Activity (IARPA), a research agency within the US intelligence community, along with the biotech company Ginkgo Bioworks and the nonprofit engineering firm Draper, announced new research into identifying whether DNA has been engineered. Down the road, that could help us determine whether engineered bioweapons had been deployed or an engineered pathogen had escaped in a lab accident.

But the recent research into a more virulent omicron is past where I’d like us to draw the line. The risk of developing a more transmissible and more deadly version of Covid isn’t that it’s highly likely to kill a specific person; it’s that, as unlikely as an accident may be, if it spread across the planet, it could kill millions. Our existing systems for cost-benefit analysis in scientific research aren’t set up to evaluate risks of that type.

Playing viral Russian roulette

Under the Obama administration, the National Institutes of Health, which funds a large share of the world’s biomedical research, imposed a moratorium on funding so-called gain of function research.

But in 2017, the moratorium was lifted and replaced with the P3CO framework for evaluating the risks and benefits of research on pathogens with pandemic potential. In the case of the omicron research, however, the framework wasn’t used, and the National Institute of Allergy and Infectious Diseases (NIAID), which funded the work, isn’t exactly pleased about that. NIAID division director Emily Erbelding told Stat that the agency learned from the news, not from the researchers, that the grant had been used to create a hybrid virus, which meant there was no opportunity to evaluate the experiment beforehand.

That points to a big challenge for the effort to ensure adequate oversight of risky research: Agencies count on researchers to say if their intended research could be dangerous, but no one is inclined to think their own research is risky.

Boston University, for example, maintained in a statement to Stat that the research can’t be counted as gain of function, because the virus created was less deadly than the original Covid-19. The university also said that the research had been approved by the Institutional Biosafety Committee, which includes both scientists and local community members.

But the NIH says it will examine whether the research should have triggered a federal review before it was undertaken. And it’s worth noting that a virus as contagious as omicron and only slightly less deadly than original Covid is one that, if it were released into the world, could kill more people than either.

That kind of problem — where oversight mechanisms fail to function because researchers don’t have perfect communication with funders, and no one knows what counts as dangerous — comes up frequently. One of the original drivers of speculation that Covid itself could have begun in a lab accident was the revelation that the Wuhan Institute of Virology was conducting much of its coronavirus research in BSL-2 labs (which aren’t rated to prevent coronaviruses from escaping containment).

And while grants to researchers studying coronaviruses at the Wuhan Institute of Virology specified that the money could not be used for gain of function research, the funds ended up being used to alter coronaviruses in ways that most scientists consider to be gain of function research.

This is, to be clear, not research that sparked the pandemic: the altered coronaviruses studied in these specific experiments are too different from Covid for there to be any direct relationship between them. But the episode is suggestive of a problematic safety culture, one in which funders would have no idea whether their money was being used to make viruses more dangerous, and these revelations have made people a lot warier, and a lot angrier, about research that appears similarly risky.

Lab safety in a pandemic-wary world

According to research published earlier this year by Gregory Koblentz and Filippa Lentzos, there are nearly 60 BSL-4 labs in operation, under construction, or planned in 23 countries around the world, more than 75 percent of which are located in urban centers near plenty of potential human hosts.

While that lab building boom represents a renewed interest in preparing for the next pandemic, there is currently “no requirement to report these facilities internationally, and no international entity is mandated to collect such information and provide oversight at a global level,” Koblentz and Lentzos wrote in a policy brief.

Doing risky pathogen research is akin to playing Russian roulette: the more labs take on such research, the more times we pull the trigger, until one day the loaded chamber comes up. We can reduce the risk by studying diseases that don’t trigger pandemics in humans or closely related animals, by greatly improving lab safety, and by ensuring all risky research gets oversight, even if the scientists conducting it aren’t worried about the risks and so don’t report it.
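
To make the arithmetic behind that analogy concrete, here’s a back-of-the-envelope sketch in Python. Every number in it is an illustrative assumption, not an estimate for any real lab; the point is only that small per-lab risks compound quickly as lab-years accumulate.

    # Back-of-the-envelope sketch: the cumulative chance of at least one
    # accidental release, assuming each lab-year carries a small,
    # independent release probability p.
    # Both figures below are illustrative assumptions, not real estimates.
    p = 0.002            # assumed per-lab, per-year release probability (0.2%)
    lab_years = 60 * 10  # assumed: ~60 labs, each operating for a decade

    cumulative_risk = 1 - (1 - p) ** lab_years
    print(f"Chance of at least one release: {cumulative_risk:.0%}")  # ~70%

Under those toy assumptions, a decade of work across 60 labs yields roughly a 70 percent chance of at least one release. The specific inputs matter far less than the shape of the curve, which climbs steeply as the number of trigger pulls grows.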

If we don’t do better, then someday, we’ll get unlucky like Taiwan did less than a year ago, and our efforts to engineer more dangerous and contagious variants of Covid will succeed beyond our wildest dreams.

A version of this story was initially published in the Future Perfect newsletter.
