The risks and benefits of gain-of-function studies on influenza A have been widely debated since 2012, when the methods used to create two respiratory-transmissible H5N1 mutant isolates were published. Opponents of gain-of-function studies argue that the biosecurity risk is unacceptable, while proponents cite potential uses in pandemic surveillance, preparedness and mitigation. In this commentary, we provide an overview of the background and applications of gain-of-function research and argue that the anticipated benefits have yet to materialize while the significant risks remain.
Adam, Dillon C. et al. “Does Influenza Pandemic Preparedness and Mitigation Require Gain‐of‐function Research?” Influenza and Other Respiratory Viruses 11.4 (2017): 306–310.
After the largest Ebola virus outbreak in history, experts have attempted to explain how the Zaire ebolavirus species emerged in West Africa and caused chains of human-to-human transmission. The widespread and untimely infection of health care workers (HCW) in the affected countries accelerated the spread of the virus within the community. Among the reasons for this trend, it must be considered that HCW were exposed to the virus in their occupational environment. The contribution of environmental conditions to the spread of Ebola in West Africa was examined by investigating the effect of temperature and humidity on the virus's environmental persistence and by modeling whether saturation (liquid stress) allows penetration of Ebola virus through personal protective equipment (PPE). Ebola-Makona virus persisted on PPE and materials found in outbreak settings for less than 72 hours at 27 °C and 80% relative humidity (RH). A difference in virus penetration was observed between dry (5%, 1/21 tests) and saturated (33%, 7/21 tests) samples of PPE. Infectious virus particles penetrated through saturated coupons of Tyvek Micro Clean, Tychem QC, whole surgical masks and N95 respirators. These findings suggest that saturation, or a similar liquid-stress simulation, should be included in protective equipment testing standards.
Nikiforuk, Aidan M. et al. “Challenge of Liquid Stressed Protective Materials and Environmental Persistence of Ebola Virus.” Scientific Reports 7 (2017): 4388.
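The dry-versus-saturated penetration counts above (1/21 vs 7/21 tests) are small samples, so the reported percentages carry wide uncertainty. As a minimal illustration, not part of the paper's analysis, one might attach Wilson 95% confidence intervals to those observed proportions (the function name and layout here are my own):

```python
import math

def penetration_rate(hits, trials, z=1.96):
    """Observed penetration proportion with a Wilson 95% confidence interval."""
    p = hits / trials
    denom = 1 + z**2 / trials
    centre = (p + z**2 / (2 * trials)) / denom
    half = z * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2)) / denom
    return p, centre - half, centre + half

# Counts reported in the abstract: 1/21 dry vs 7/21 saturated PPE coupons.
for label, hits in (("dry", 1), ("saturated", 7)):
    p, lo, hi = penetration_rate(hits, 21)
    print(f"{label}: {p:.0%} (95% CI {lo:.0%} to {hi:.0%})")
```

With only 21 coupons per condition, the intervals are broad, which is one reason the abstract frames the dry/saturated contrast as an observed difference rather than a precise effect size.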
In response to the Ebola virus disease outbreak in West Africa, the National Institute for Communicable Diseases in South Africa established a modular high-biosafety field Ebola diagnostic laboratory (FEDL) near Freetown, Sierra Leone. For several weeks this was the sole diagnostic capacity available to respond to the overwhelming demand for Ebola diagnosis in the Western Area of Sierra Leone. The deployment of the FEDL contributed to the overall international effort to bring the Ebola outbreak in West Africa under control.
Infectious waste is a potential source of pathogenic microorganisms, which may pose a risk to the professionals who manage it. In this study we aimed to characterize the infectious bacteria present in dental waste and on waste workers. The dental waste produced over 24 hours was collected, and waste workers were sampled by swabbing. Isolate resistance profiles were characterized by Vitek® and PCR, and biofilm formation by Congo red agar, the string test and microtiter assays. To assess similarity between the waste and the workers' samples, a random amplified polymorphic DNA (RAPD) test was used. Twenty-eight bacteria were identified as clinically relevant. The most frequent gene was blaTEM, present in five Gram-negative microorganisms, and one blaSHV was found in K. pneumoniae. All P. aeruginosa isolates were positive for extracellular polymeric substance formation, except one isolated from a worker. K. pneumoniae had negative results in the string test. P. aeruginosa showed the best adherence at 25°C after 48 hours of incubation, and K. pneumoniae formed biofilm best at the same temperature after 24 hours. The similarity between P. aeruginosa recovered from dental waste and from workers was low; however, it is important to note that a pathogen was found on a worker's hands and that improvements in biosafety are required.
The present WHO Guidelines on viral inactivation and removal procedures intended to assure the viral safety of human blood plasma products were developed to complement the WHO Requirements for the collection, processing and quality control of blood, blood components and plasma derivatives (1), in response to the above requests.
These Guidelines pertain to the validation and assessment of the steps for viral inactivation and removal employed in the manufacture of human blood plasma derivatives and virally inactivated plasma for transfusion, prepared either from plasma pools or from individual donations. It is hoped that this document, by summarizing current experience with well recognized methods, will help set expectations, serve as a guide to speed implementation, and ensure that implementation is appropriate.
Inevitably, individual countries may formulate different policies, not only in relation to procedures for validation and control, but also regarding donor selection and methods of blood screening. These Guidelines do not replace the requirements of regulatory authorities in various parts of the world (2–4); rather, they are primarily intended to assist those national regulatory authorities and manufacturers that are less familiar with viral decontamination processes.
The document does not address products of animal origin or those manufactured by recombinant techniques.
REFERENCE: Guidelines on viral inactivation and removal procedures intended to assure the viral safety of human blood plasma products. World Health Organization, WHO Technical Report, Series No. 924, 2004.
BACKGROUND: Healthcare workers are at risk of acquiring viral diseases such as hepatitis B, hepatitis C and HIV through exposure to contaminated blood and body fluids at work. Most often infection occurs when a healthcare worker inadvertently punctures the skin of their hand with a sharp implement that has been used in the treatment of an infected patient, thus bringing the patient's blood into contact with their own. Such occurrences are commonly known as percutaneous exposure incidents.
OBJECTIVES: To determine the benefits and harms of extra gloves for preventing percutaneous exposure incidents among healthcare workers versus no intervention or alternative interventions.
SEARCH METHODS: We searched CENTRAL, MEDLINE, EMBASE, NHSEED, Science Citation Index Expanded, CINAHL, NIOSHTIC, CISDOC, PsycINFO and LILACS until 26 June 2013.
SELECTION CRITERIA: Randomised controlled trials (RCTs) with healthcare workers as the majority of participants, extra gloves or special types of gloves as the intervention, and exposure to blood or bodily fluids as the outcome.
DATA COLLECTION AND ANALYSIS: Two authors independently assessed study eligibility and risk of bias, and extracted data. We performed meta-analyses for seven different comparisons.
MAIN RESULTS: We found 34 RCTs that included 6890 person-operations as participating units and reported on 46 intervention-control group comparisons. We grouped interventions as follows: increased layers of standard gloves; gloves manufactured with special protective materials, or thicker gloves; and gloves with puncture indicator systems. Indicator gloves show a coloured spot when they are perforated. Participants were surgeons in all studies, and they used at least one pair of standard gloves as the control intervention. Twenty-seven studies also included other surgical staff (e.g. nurses). All but one study used perforations in gloves as an indication of exposure. The median control group rate was 18.5 perforations per 100 person-operations. Seven studies reported blood stains on the skin and two studies reported self-reported needlestick injuries. Six studies reported dexterity as visual analogue scale scores for the comparison of double versus single gloves, and 13 studies reported outer glove perforations. We judged the included studies to have a moderate to high risk of bias. We found moderate-quality evidence that double gloves compared to single gloves reduce the risk of glove perforation (rate ratio (RR) 0.29, 95% confidence interval (CI) 0.23 to 0.37) and the risk of blood stains on the skin (RR 0.35, 95% CI 0.17 to 0.70). Two studies with a high risk of bias also reported the effect of double compared to single gloves on needlestick injuries (RR 0.58, 95% CI 0.21 to 1.62). We found low-quality evidence in one small study that the use of three gloves compared to two gloves reduces the risk of perforation further (RR 0.03, 95% CI 0.00 to 0.52). There was similar low-quality evidence that the use of one fabric glove over one normal glove reduces perforations compared to two normal gloves (RR 0.24, 95% CI 0.06 to 0.93). There was moderate-quality evidence that this effect was similar for the use of one special material glove between two normal material gloves.
Thicker gloves did not perform better than thinner gloves. There was moderate to low-quality evidence in two studies that an indicator system does not reduce the total number of perforations during an operation, even though it reduces the number of perforations per glove used. There was moderate-quality evidence that double gloves have a similar number of outer glove perforations as single gloves, indicating that there is no loss of dexterity with double gloves (RR 1.10, 95% CI 0.93 to 1.31).
AUTHORS' CONCLUSIONS: There is moderate-quality evidence that double gloving compared to single gloving during surgery reduces perforations and blood stains on the skin, indicating a decrease in percutaneous exposure incidents. There is low-quality evidence that triple gloving and the use of special gloves can further reduce the risk of glove perforations compared to double gloving with normal material gloves. The preventive effect of double gloves on percutaneous exposure incidents in surgery does not need further research. Further studies are needed to evaluate the effectiveness and cost-effectiveness of special material gloves and triple gloves, and of gloves in other occupational groups.
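The rate ratios quoted above (e.g. RR 0.29 for perforation with double versus single gloves) are ratios of event rates per person-operation, with confidence intervals computed on the log scale. A minimal sketch of that calculation, using hypothetical counts: the 185-per-1000 control rate mirrors the 18.5 per 100 figure reported above, but the treatment count is invented purely for illustration and is not from the review:

```python
import math

def rate_ratio_ci(events_a, time_a, events_b, time_b, z=1.96):
    """Rate ratio of group A vs group B with a Wald 95% CI on the log scale."""
    rr = (events_a / time_a) / (events_b / time_b)
    se = math.sqrt(1 / events_a + 1 / events_b)  # SE of log(RR) under a Poisson model
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical: 60 perforations in 1000 double-gloved person-operations
# vs 185 in 1000 single-gloved (matching the 18.5 per 100 control rate).
rr, lo, hi = rate_ratio_ci(60, 1000, 185, 1000)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

An RR below 1 with a CI that excludes 1 indicates fewer events in the first group; the actual pooled estimates in the review come from meta-analysis across studies, not from a single 2x2 count like this.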
Although bioweapons have not been used in modern warfare, and bioterror events are rare, it's an open question whether the norms that prohibit the use of biological weapons have an expiration date. Biological techniques and equipment that could be used to create new bioweapons are available and inexpensive, pathogens are plentiful, and some can even be made de novo. In his new book, Biosecurity Dilemmas, Christian Enemark describes the challenges that nations face in providing biosecurity today.
Dreaded Diseases, Ethical Responses, and the Health of Nations
Biosecurity Dilemmas examines conflicting values and interests in the practice of "biosecurity," the safeguarding of populations against infectious diseases through security policies. Biosecurity encompasses both the natural occurrence of deadly disease outbreaks and the use of biological weapons. Christian Enemark focuses on six dreaded diseases that governments and international organizations give high priority for research, regulation, surveillance, and rapid response: pandemic influenza, drug-resistant tuberculosis, smallpox, Ebola, plague, and anthrax. The book is organized around four ethical dilemmas that arise when fear causes these diseases to be framed in terms of national or international security: protect or proliferate, secure or stifle, remedy or overkill, and attention or neglect. For instance, will prioritizing research into defending against a rare event such as a bioterrorist attack divert funds away from research into commonly occurring diseases? Or will securitizing a particular disease actually stifle research progress owing to security classification measures? Enemark provides a comprehensive analysis of the ethics of securitizing disease and explores ideas and policy recommendations about biological arms control, global health security, and public health ethics.
The objective of this document is to provide guidance for selecting the most appropriate methods for safely managing solid waste generated at Primary Health-Care centres (PHCs) in developing countries. The main tool of this guide is a set of six decision-trees aimed at assisting the user in identifying appropriate waste management methods. The guide takes into consideration the most relevant local conditions and the safety of workers and of the general public, as well as environmental criteria.
This guide is composed of the following parts:
Basic risks associated with poor management of health-care waste
Basic elements for safe health-care waste management (HCWM)
Parameters to assess before selecting HCWM options
Technical annexes describing HCWM options
Estimation of costs of the various options
Decision-trees, assisting the selection of HCWM options
This guide may also be used to evaluate existing practices related to health-care waste management. More detailed sources of information on handling and storage practices, technical options for treatment and disposal of wastes, training and personal protection, and assessment of a country’s situation, are presented in Annex A.
The waste produced in the course of health-care activities, from contaminated needles to radioactive isotopes, carries a greater potential for causing infection and injury than any other type of waste, and inadequate or inappropriate management is likely to have serious public health consequences and deleterious effects on the environment. This handbook – the result of extensive international consultation and collaboration – provides comprehensive guidance on safe, efficient, and environmentally sound methods for the handling and disposal of health-care wastes in normal situations and emergencies. Future issues such as climate change and the changing patterns of diseases and their impacts on health-care waste management are also discussed. For health-care settings in which resources are severely limited, the handbook pays particular attention to basic processes and technologies that are not only safe, but also affordable, sustainable, and culturally appropriate. The guide is aimed at public health managers and policy-makers, hospital managers, environmental health professionals, and all administrators with an interest in and responsibility for waste management. Its scope is such that it will find application in developing and developed countries alike.