Bent Over in Pain

Student Infected With Debilitating Virus in Undisclosed Biolab Accident

Illustration: Alex Williamson for The Intercept

The graduate student was alone in the lab on a Saturday, handling a mouse infected with a debilitating virus, when the needle slipped. She wore two gowns, two pairs of shoe covers, a hair net, a face mask, and two pairs of gloves. Gingerly, she had pointed the needle at the mouse’s abdomen and injected the antibody. The animal was infected with a recombinant strain of Chikungunya virus, a mosquito-borne pathogen that has sparked epidemics in Africa and the Caribbean. Chikungunya can wreak havoc in other regions when the right kind of mosquito is present; in 2007 and 2017 there were outbreaks in Italy, and in 2014 the virus hit Florida, infecting 11 people who had not recently traveled abroad. In January 2016, nine months before the researcher stood in the lab that weekend, a locally acquired infection was diagnosed in Texas.

Chikungunya, which means “bent over in pain” in the Makonde language, can lead to chronic arthritis, and its spread through the Americas had made studying it more urgent. The researcher’s team at Washington University School of Medicine in St. Louis, Missouri, was studying the virus in the hope of discovering possible treatments or developing a vaccine. The graduate student was working in a biosafety level 3 lab, a level that often includes a completely sealed perimeter, directional airflow, and full personal protective equipment. But accidents still happened. The team’s experiments were set back when, after withdrawing the needle from the mouse’s belly, the graduate student grazed a finger on her left hand.

The needle pierced through both sets of gloves, but the student saw no blood, so she washed her hands, removed her safety equipment, and left the lab without telling anyone what had happened. Four days later, she ran a fever, and her body ached and convulsed in chills. The next morning, her skin was flecked with discolored spots. They multiplied over the course of the day, so she went to the emergency room, where the doctors kept her overnight for observation. A nurse drew her blood and sent it off to a state lab. She tested positive for Chikungunya. Only after getting sick did the student tell her supervisor about the slipped needle.

“That’s not a good situation,” said Scott Weaver, director of the Institute for Human Infections and Immunity at the University of Texas Medical Branch at Galveston and an expert on Chikungunya virus. “If that person knew they had a needlestick and they were working with Chikungunya, they should have reported it immediately. And then whatever health care people saw them should have recognized that there was a very small — but not zero — risk of them transmitting the virus.”

After the student told her supervisor about the accident in September 2016, Washington University reported it to the National Institutes of Health, but until now, the event has remained out of public view. So have hundreds of other incidents in U.S. labs, including four other needle injuries at Washington University.

An Intercept investigation based on over 5,500 pages of NIH documents obtained under the Freedom of Information Act has uncovered a litany of mishaps: malfunctioning equipment, spilled beakers, transgenic rodents running down the hall, a sedated macaque waking up and biting a researcher hard enough to lacerate their hand. Many of the incidents involved less dangerous pathogens that can be handled with basic safety equipment, and most did not lead to infection. But several accidents happened while scientists were handling deadly or debilitating viruses in highly secure labs, and a few, like the Chikungunya virus slip-up, did lead to illness.

“People have it in their minds that lab accidents are very, very rare, and if they happen, they happen only in the least well-run overseas labs,” said Richard Ebright, a molecular biologist at Rutgers University and an advocate for better biosafety standards. “That simply isn’t true.”

Key Takeaways
  • The Intercept obtained over 5,500 pages of NIH documents, including 18 years of laboratory incident reports, detailing hundreds of accidents.
  • Documents show that accidents happen even in highly secure BSL3 and BSL4 labs, and that in some cases they lead to infection.
  • After pricking her finger with a needle, a graduate student at Washington University School of Medicine contracted the debilitating Chikungunya virus.
  • The documents also reveal the infection of a researcher working with MRSA, an antibiotic-resistant bacterium, in a Food and Drug Administration lab.

The United States has a patchwork of regulations and guidelines covering lab biosafety. Safety training can vary widely from one institution to the next. Experiments involving certain pathogens and some research funded by the U.S. government are subject to oversight, but critics say that other areas are like the Wild West. Unless they work with the most dangerous pathogens, biolabs don’t have to register with the U.S. government. As a result, there is little visibility into the biosafety of experiments carried out by private companies or foundations.

“Your favorite tech billionaire could, with their own money, do basically whatever the hell they want with any pathogen,” said Rocco Casagrande, managing director of Gryphon Scientific, a biosafety advisory firm that has advised NIH on biosafety standards. “They could take the measles virus and intentionally try to make it vaccine-resistant and more pathogenic in their garage. If they’re doing it for legitimate research purposes in their own minds, they can do so wildly, unsafely, and no one can stop them.”

As policymakers scramble to prevent future pandemics, those gaps have been thrust into the spotlight. A Senate subcommittee held a hearing in August on the oversight of dangerous pathogens, and NIH, the world’s largest funder of biomedical research, has convened an advisory panel to consider how the agency vets proposals for risky science. In September, the World Health Organization published guidance aimed at “preventing the accidental and deliberate misuse of biology and other life sciences” around the world. That same month, the White House issued an executive order tasking the secretaries of Health and Human Services and Homeland Security with devising a plan to improve biosafety research, noting a need to prevent biotechnology from leading to “accidental or deliberate harm to people, animals, or the environment.” (The Department of Health and Human Services oversees NIH.) In October, the White House unveiled a broader biodefense plan that includes a pledge to strengthen lab biosafety and biosecurity.

Since the question of how to prevent future pandemics is related to the still unsettled question of how the current pandemic started, the policy discussions have been shot through with politics. In Congress, the issue of biosafety regulation has been pushed almost exclusively by Republicans, the very same party that helped usher in the expansion of the U.S. biolab network after September 11. (Many Democrats also supported the effort at the time.) The NIH advisory panel’s members are installed by agency leadership, making them unlikely to buck the status quo. Broader discussions about biosafety, meanwhile, have devolved into bitter Twitter fights.

Biosafety proponents maintain that regardless of what caused the SARS-CoV-2 outbreak in China in 2019, the fact that a lab accident could spark a pandemic is reason alone for better oversight. Many virologists, meanwhile, contend that more regulation is unnecessary and that the benefits of their research outweigh the risks. “There’s a whole community of scientists who downplay the fact that things can be acquired in the lab,” said Stuart Newman, a cell biologist at New York Medical College who sits on his university’s institutional biosafety committee.

The documents show that the threat is real.

The Intercept obtained 18 years of lab incident reports submitted to NIH, which both funds and oversees research. Some of these were obtained directly, through a Freedom of Information Act request. Others were obtained by Edward Hammond, former director of the Sunshine Project, and Lynn Klotz, senior science fellow at the Center for Arms Control and Non-Proliferation, who separately requested the reports using FOIA and then provided them to The Intercept. The reports span 2004 to 2021. Institutions funded by NIH are required to report any mishaps involving recombinant DNA to the agency’s Office of Science Policy.

Among the accidents revealed in the documents:

  • In 2010, a machine in a University of California, Irvine lab malfunctioned while decontaminating waste from experiments with the SARS virus. The machine, called an autoclave, leaked steam and water, potentially exposing eight people to the virus, which could spark a pandemic. The risk of an outbreak was mitigated by a quirk of timing: The machine had already reached a high temperature — likely enough to kill the virus — before malfunctioning. University of California, Irvine spokesperson Tom Vasich wrote in an email, “The incident was quickly addressed. … Released materials were contained in our BSL3 laboratory. Exposed lab workers were wearing proper personal protective gear. No transmission of the virus was detected.”
  • In 2013, a researcher at Kansas State University in Manhattan, Kansas, pricked their finger while drawing blood from a chicken infected with H5N1 avian influenza. The scientist had handed a used syringe to an assistant while trying to get a better grasp of the chicken’s jugular vein. The assistant returned it needle side out, piercing through the scientist’s gloves. The researcher was prescribed Tamiflu for one week and told to immediately report a fever. Kansas State University did not respond to a request for comment.
  • Between April 2013 and March 2014, the University of North Carolina at Chapel Hill reported five mouse escapes, including that of an animal that had been infected with SARS four days earlier. In a letter to NIH, a biosafety specialist argued that the frequency of escapes was due to the “complex research taking place at our institute” rather than a failure of training, noting that several teams at the university use a breed of transgenic mouse known for its unpredictable behavior. After the SARS-infected mouse darted under lab equipment, researchers cornered it with a broom and returned it to its cage. The University of North Carolina did not respond to a request for comment.
  • In 2018, a researcher at the Food and Drug Administration’s Center for Biologics Evaluation and Research in Silver Spring, Maryland, contracted an infection after working in the lab with MRSA, an antibiotic-resistant bacterium that can cause severe illness if left untreated. The researcher could not recall any mishaps that would have led to infection, a situation that experts say is common with laboratory-acquired infections. The FDA center did not respond to a request for comment.
  • In early 2020, amid the shortage of respirators and masks brought on by the pandemic, a lab at Tufts University conducted low-risk experiments with the H3N2 flu virus without proper equipment. A student spilled a test tube containing a small amount of virus, potentially exposing five people. None were initially wearing masks. (Two later put them on to clean up the spill.) H3N2 is a seasonal flu virus and not considered a dangerous pathogen, but in an email to Tufts, an administrator at NIH highlighted a series of omissions and errors. These included the lab’s failure to provide personal protective equipment, a lack of proper safety signage, and the failure of researchers to seek appropriate medical care after being exposed to the virus. The NIH administrator also recommended that the principal investigator be retrained. Tufts declined to comment.

In an analysis of incident reports filed with the Office of Science Policy between 2004 and 2017, Klotz found seven lab infections that initially went undetected or unreported, in addition to the Chikungunya case. Critics say that even a few laboratory-related infections are too many, because they are often avoidable. Lab accidents are typically the result of “cascading errors,” said Casagrande. “Some physical mistake happens that then takes advantage of vulnerabilities introduced by someone’s carelessness or mental mistake or happenstance. Someone spilled something when the backup fan happens to be knocked out by a power outage, or someone spilled something on the day that their lab coat was at the cleaner.”

Cascading Errors

The Washington University case shows how errors can multiply. If the graduate student had promptly reported the needle prick instead of waiting until after she got sick, she could have stayed inside, preventing mosquitoes from feasting on her blood and potentially sparking an outbreak of Chikungunya. According to Weaver, people infected with Chikungunya have the most virus in their blood one to four days after transmission, the very period during which the student went about her life without knowing that she was infected. And while Americans’ habits of spending long hours indoors drastically reduced the chance of local transmission — “Because of our culture, we just don’t get very many mosquito bites,” Weaver said — it’s likely that a vector for transmission was present. In 2016, a survey of Missouri’s mosquito population found that a species that the World Health Organization says is implicated in the spread of Chikungunya, Aedes albopictus, was “very abundant” in southern Missouri.

In the 2016 report to NIH’s Office of Science Policy, Washington University biological safety officer Susan Cook did not name the principal investigator who oversaw the graduate student. The report also omits the name of the infected graduate student, as is standard practice for such documents. (Biosafety experts stress that while accidents can reflect problems with a lab’s culture or training, they should not be seen as an indictment of one researcher’s behavior.)

The Intercept sent a detailed list of questions to Cook along with Deborah Lenschow and Michael S. Diamond, who separately oversee labs that work on Chikungunya virus at Washington University. All three referred questions to a spokesperson, who sent a statement that she said was authored by Cook.

“As a major research institution, the safety of graduate students and scientists working in BSL3 labs is of paramount importance to us,” the statement reads. “We continually evaluate our laboratory safety policies, procedures and training materials and look for ways to incorporate new technologies and tools so that our labs remain safe and our students and researchers can continue their critical infectious diseases research.” The graduate student recovered within a few days and did not suffer prolonged symptoms, the statement says.

In her 2016 report to NIH, Cook wrote that after the infection, the lab’s principal investigator called a meeting about safety standards, and the university added training materials about needle injuries. She added that at its October 2016 meeting, the university’s institutional biological and chemical safety committee would discuss how to minimize injuries from needle pricks. Minutes from that meeting do not show that the infection was discussed there. Cook wrote The Intercept that “most discussions of specific injury/illness reports are too granular to be captured in the IBC minutes.”

An administrator with NIH’s Office of Science Policy responded by admonishing the institution. “We are concerned that an exposure incident occurred in a BL3 laboratory and went unreported for four days,” he wrote in a letter. He asked Washington University staff to conduct a thorough investigation, explore using different needles, better train researchers, and emphasize that exposures in high-containment labs needed to be reported immediately, not days after they happened. But after that, the correspondence chain ended.

Ryan Bayha, a spokesperson for NIH’s Office of Science Policy, would not comment directly on whether the agency continued the discussion, writing, “Washington University and OSP worked together to successfully resolve the issue involved in the Washington University report.”

“There doesn’t seem to be a lot of enforcement or follow-up actions, and there doesn’t seem to be any real accumulation of learning,” said Greg Koblentz, director of the Biodefense Graduate Program at George Mason University’s Schar School of Policy and Government, after reading the Washington University report and NIH’s response. “It helps demonstrate why we need to have a dedicated organization for biosafety and biosecurity in the United States.”

Biosafety protective suits for handling viral diseases are hung up in a biosafety level 4 training facility at U.S. Army Medical Research and Development Command at Fort Detrick in Frederick, Md., on March 19, 2020.

Photo: Andrew Harnik/AP

“No Standard”

The United States has the most robust biomedical funding in the world, and controversial breakthroughs in science often come from American labs. Yet the United States lacks a central framework for lab oversight. Canada’s Centre for Biosecurity oversees all pathogen research, setting standards and training regimens for labs and enforcing them as well. The United Kingdom has centralized reporting for infections acquired in the lab. When it comes to U.S. regulations, “There are some significant holes,” said Filippa Lentzos, an expert on biosecurity and biological threats at King’s College London. Biosafety protocols are “not embedded in statutory law. It’s tied to funding.”

Policies governing the use of so-called select agents and dual-use research are limited to specific toxins and types of experiments, leaving out much work on synthetic DNA. Another crucial set of federal guidelines covers research funded by NIH, the world’s largest biomedical funder. But a host of other entities work with or fund research with pathogens, with varying degrees of oversight: the Defense Advanced Research Projects Agency, the United States Agency for International Development, the Bill and Melinda Gates Foundation, and private companies. “There’s nothing out there that says, if you want to fund research, here’s what you should think about,” said Casagrande. “That doesn’t exist, period.”

There’s reason to worry. The 1977 outbreak of H1N1 influenza in the Soviet Union and China is believed to have been accidentally introduced by scientists, either through a lab accident or through a live-vaccine trial gone awry. In 2003 and 2004, the first SARS virus is suspected to have escaped four times from labs in China, Taiwan, and Singapore. In 2007, wastewater containing live virus leaked out of pipes near a highly secure biolab in Surrey, England, sickening animals in the area with foot-and-mouth disease. Accidents regularly happen at even the world’s top labs. In 2019, the Centers for Disease Control and Prevention ordered the U.S. Army Medical Research Institute of Infectious Diseases to temporarily halt work at a lab in Fort Detrick, Maryland, after identifying biosafety issues there. In addition to the MRSA infection at the FDA lab, the documents obtained by The Intercept include records of accidents at labs operated by the CDC and NIH. (In those two cases, researchers were exposed but not infected.)

Biosafety proponents worry most about accidents with what are called “potential pandemic pathogens”: bacteria, viruses, and other microorganisms that, either through handling or through modification, could set off another pandemic. Some are also concerned about accidents with pathogens like Chikungunya virus, seeing them as sentinel events that reveal broader problems. Because those incidents are more common, they can give insight into the daily workings of biolabs. And some pathogens that don’t pose a significant threat in the United States might ravage populations in other parts of the world, if a researcher were to travel after getting infected.

NIH-funded institutions that conduct research on recombinant DNA have to get experiments approved by an institutional biosafety committee, or IBC. If that work is extensive or done in a BSL-3 or BSL-4 lab, they are also required to appoint a biosafety officer to oversee lab work. But there is broad variation in how both rules are applied.

“There’s no standard for how many biosafety officers you need and indeed, for many types of institutions, whether you need a full-time monitor at all,” said Casagrande. “Sometimes there’s a part-time person, like you’re the biosafety officer and the animal use officer and the prime minister of bagels.”

At Washington University, the accident went unreported for four days. In other cases, accidents went unreported for months or even years, either because the affected researchers stayed quiet or because staff overlooked the incidents. In 2015, a University of Minnesota vice president for research wrote NIH’s Office of Science Policy to say that an employee had failed to report to the agency four incidents, one of which dated back to 2013. (None of the incidents apparently resulted in infection, though in a response letter, NIH noted that in two cases employees failed to get prompt medical attention.) The university discovered the accidents only after a journalist reached out to the institutional biosafety committee to ask for information. “After having questioned why these reports were not made, I have received a note of apology from the person whose responsibility it was to insure [sic] that this reporting was done,” a human resources administrator wrote NIH in an email sent the same day as the vice president’s letter. “She is no longer in the role.” A University of Minnesota spokesperson wrote that they could not comment on the affair because of “laws designed to protect employee privacy” but that since 2015, the university has improved biosafety procedures, training, and reporting and added resources for the institutional biosafety committee.

In another case, an institutional biosafety committee chair reported to NIH a biosafety infraction that had occurred six years earlier.

In the documents obtained by The Intercept, biosafety officers sometimes appear overly credulous. In 2019, an undergraduate student at the University of Illinois Urbana-Champaign who worked with salmonella contracted salmonellosis. She told a staffer that she thought her illness was caused by eating undercooked turkey, not by exposure to the bacteria in the lab. A biosafety officer appeared to accept this as a possible explanation, noting it in an initial email to NIH. (In a later formal report, the officer made clear that the student likely had a laboratory-acquired infection.) The student’s supervisor only learned that the student was sick after she visited the campus health center.

Even basic concepts, like how to train researchers in biosafety, vary widely from one lab to the next. At some labs, researchers are expected to do dry runs of experiments when learning safety techniques. At other places, said Casagrande, training consists mainly of slideshows.

Slides did a lot of work at Washington University too. In her report to NIH on the Chikungunya infection, Cook, the biosafety officer, noted that staff would add slides about working with needles and other sharp objects to an annual lab training presentation.

A scientist shows a laboratory mouse used for experimentation in the Ceinge Laboratory of Advanced Biotechnology in Naples, Italy, on March 10, 2019.

Photo: Salvatore Laporta/LightRocket via Getty Images

A Missed Opportunity

Needlesticks, as scientists call needle injuries, were for decades seen as rare. When they did happen, they were believed to rarely lead to infection. Only recently have biosafety experts begun to challenge those assumptions. “Everyone who works with needles needs an emergency plan for when they stick themselves,” said Casagrande. “Anecdotally, people think of it as a once-in-a-career injury, but the data suggests it should be expected on any R01 grant,” he added, referring to a type of five-year research grant provided by NIH.

In the wake of the Chikungunya infection, Washington University doubled down on education about the safe use of needles in the lab. But over the next 14 months, needlesticks happened twice more: In April 2017 and November 2017, researchers at Washington University pricked themselves while working with mice infected with Chikungunya.

In the statement sent by the spokesperson, Cook cited the incidents as a success because the lab workers immediately reported them and did not contract the virus.

Staff at the Office of Science Policy disagreed. After the April incident, an administrator noted that the needlestick had happened in the same lab that had the Chikungunya infection. But the response was otherwise muted. They again recommended more training, this time adding the word “strongly.”

“At the same facility within the span of a year, you had two incidents, and they’re like, ‘Well, do better,’” said Koblentz, referring to NIH.

In a perfect world, he said, the graduate student’s illness would have been used to teach other labs. “Ideally, these kind of incident reporting systems are a preventive measure. If you could learn from the accidents and then tell people, ‘OK, here’s how to avoid them,’ that’s great.”

Because accidents only come to light through attention from the press or civil society groups, there is little data on how frequently specific breaches occur. “There’s no central repository of accidents,” said Lentzos. “The reporting is very opaque.”

Bayha wrote in an email to The Intercept that NIH often develops “guidance documents” following notable lab incidents but conceded that did not happen in the Washington University case. “There was no feedback to the broader community,” said Koblentz. “It’s a missed opportunity.”
