Background: Guidance on the use of fluid-resistant surgical masks (FRSM) and filtering face-piece (FFP3) masks by healthcare staff in England is produced nationally and applied locally by hospital trusts. In April 2022, national infection prevention and control (IPC) guidance was updated with reference to the importance of local risk assessment when considering use of FFP3 masks.
Aim: Our aim was to evaluate local hospital policies for use of face masks and risk assessment for healthcare staff.
Methods: We conducted a cross-sectional online survey (February to March 2023) of NHS trusts in England. Responses were analysed using Fisher's exact tests and the framework approach.
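To illustrate the statistical approach described above, here is a minimal sketch of a Fisher's exact test on a 2x2 cross-tabulation of survey responses; the trust categories and counts are hypothetical illustrations, not data from the study.

```python
# Minimal sketch of a Fisher's exact test on a hypothetical 2x2 table of
# survey responses (counts are invented for illustration only).
from scipy.stats import fisher_exact

# Rows: hypothetical trust type; columns: FFP3 required vs. FRSM only.
table = [[10, 45],   # acute trusts
         [4, 50]]    # specialist trusts

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```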
Results: Fifty-nine percent (109/186) of eligible hospital trusts responded. All trusts required staff to wear an FRSM or FFP3 mask when providing direct care to patients with suspected respiratory viral infection (RVI): 87% (95/109) specified an FRSM and 13% (14/109) an FFP3 mask. FFP3 masks were also required by 9% of trusts (10/109) when staff were present in a bay/ward with patients with suspected RVI. Over half of trusts used locally developed risk assessment tools.
Conclusions: There was clear variation in policies for use of face masks and use of workplace and individual risk assessments across hospital trusts. There was also variation in application of mask use, fit testing and audit of adherence. Further work is required to explore whether development of further guidance and national implementation tools could reduce unwarranted variation.
Background: Access to safe drinking water is critical for patient care and infection prevention in healthcare facilities (HCFs). In Sindh, Pakistan, limited monitoring data exist despite widespread reports of contamination.
Objective: To evaluate the physicochemical and microbiological quality of drinking water supplied to HCFs across Sindh and assess associated patient safety risks to inform infection prevention and control (IPC) strategies and guide water quality interventions.
Methods: A total of 280 water samples were collected from 136 HCFs across 26 districts and analysed for key physicochemical parameters and microbial contamination indicators (total coliforms, Escherichia coli) following APHA standard methods. Data were interpreted against WHO drinking-water guidelines. Multivariate statistics and hydrochemical facies interpretation were applied to identify contamination sources and controls.
Results: Contamination patterns were highly variable spatially, with groundwater sources contributing primarily to salinity, hardness and sodium exceedances, whereas surface water sources were associated with turbidity and microbial risks. Filtration plants demonstrated variable performance. District-level exceedances identified clear contamination hotspots that require targeted intervention rather than uniform policy responses. TDS exceeded WHO limits in 30% of samples, particularly in NFR, SHK, SNG and UMK. Turbidity exceeded permissible values in 20.7% of samples, mainly in THA, SUJ and SUK. Chloride and hardness exceeded guideline limits in 22.1% and 16.1% of samples, respectively, predominantly in groundwater. Sodium exceeded limits in 25% of samples. Fluoride and arsenic contamination remained localized. Microbiological contamination was widespread, with total coliforms detected in 76.3% and E. coli in 18.6% of samples. Multivariate analyses provided further insight: PCA identified mineralization (PC1, 49.45% of variance) and carbonate equilibrium (PC2, 10.63%) as key controls, while hydrochemical facies analysis distinguished precipitation-dominated Ca-Mg-HCO3 waters, rock-dominated Na enrichment, and evaporation-driven Na-Cl-SO4 salinization.
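To illustrate the multivariate step reported above, a minimal PCA sketch over standardised water-quality parameters follows; the parameter names and the simulated values are assumptions for illustration, not the study's dataset.

```python
# Sketch of PCA over standardised water-quality parameters, the kind of
# analysis used to attribute variance to mineralization (PC1) and
# carbonate equilibrium (PC2). Columns and data are illustrative only.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
cols = ["TDS", "EC", "Cl", "Na", "SO4", "HCO3", "Ca", "Mg", "pH", "turbidity"]
df = pd.DataFrame(rng.normal(size=(280, len(cols))), columns=cols)  # stand-in data

X = StandardScaler().fit_transform(df)      # z-score each parameter
pca = PCA(n_components=2).fit(X)
print("explained variance (%):", np.round(pca.explained_variance_ratio_ * 100, 2))
print("PC1 loadings:", dict(zip(cols, np.round(pca.components_[0], 2))))
```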
Conclusions: A substantial proportion of drinking water in Sindh HCFs does not meet WHO standards, presenting significant microbiological and chemical risks. Strengthened monitoring, effective disinfection, and Water Safety Plans are urgently required to safeguard IPC and patient health in line with global IPC priorities and Sustainable Development Goal 6.
Blood culture contamination (BCC) is a frequent and costly challenge in clinical diagnostics. BCC leads to extended hospital stays, unnecessary antimicrobial therapy, diagnostic delays, and increased healthcare costs, sometimes exceeding $100,000 per case depending on the scope of analysis. It also contributes to environmental waste and reputational harm. Blood culture diversion (BCD), particularly via blood culture diversion devices (BCDDs), has emerged as a promising strategy to reduce BCC. BCDDs divert initial blood flow likely contaminated with skin flora, thereby improving diagnostic accuracy. This scoping review analysed 23 studies, including randomized controlled trials and observational designs. BCD was an effective way to reduce the rate of BCC. BCDDs consistently outperformed open diversion methods in reducing BCC rates. However, findings on their impact on antimicrobial usage, hospital length of stay, and cost-effectiveness varied. Some studies reported significant cost savings and reduced vancomycin use, while others showed minimal change. Barriers to BCDD adoption include financial constraints, inconsistent definitions of BCC, and variable staff compliance. Enablers include positive user feedback, targeted training, and integration into national surveillance frameworks. Evidence gaps remain in comparative effectiveness, sustainability metrics, and behavioural factors influencing implementation. The review recommends broader adoption of BCDDs, particularly in high-risk settings, emphasising the need for local data to identify where implementation will be most effective. It also calls for standardized definitions, improved surveillance, and further research into broader clinical, economic, and environmental outcomes.
Background: Non-ventilator hospital-acquired pneumonia (NV-HAP), a subset of hospital-acquired pneumonia (HAP), is common and significantly increases patient mortality and length of hospital stay. However, no systematic review has been undertaken to synthesise the impact of NV-HAP on these outcomes.
Aim: To undertake a review of the evidence on the impact of NV-HAP on mortality and additional length of stay in adults admitted to an acute care hospital.
Methods: We performed a systematic search to identify research evaluating the impact of NV-HAP on mortality and additional length of stay in adults admitted to an acute care hospital. The electronic databases MEDLINE and CINAHL were searched for peer-reviewed articles published between January 2004 and August 2025. Study quality and risk of bias of included articles were assessed using the ROBINS-E and ROBINS-I tools.
Findings: A total of 6324 records were initially identified, of which 49 articles were included following screening and full-text review. Twenty-six papers reported both mortality and additional length of stay, 21 reported mortality only, and two reported additional length of stay only. Inpatient mortality following NV-HAP ranged from 3.1% to 73.9%. Additional length of stay associated with NV-HAP ranged from 10 to 47.5 days.
Conclusions: This systematic review highlights the impact of NV-HAP on patients admitted to hospital. NV-HAP was associated with increased patient mortality and additional length of stay. The results will inform a larger planned programme of research.
Background: Surgical site infections (SSIs) are a significant cause of morbidity and mortality among healthcare-associated infections and also impose substantial economic and environmental costs.
Aim: This study aimed to determine the environmental impacts of surgical site infections and the resulting carbon footprint.
Methods: This descriptive study was conducted with 553 patients who underwent surgery at a university hospital and a city hospital in Central Anatolia between March and June 2025. Data were collected using a sociodemographic information form, the Surgical Wound Assessment Form based on the criteria of the European Centre for Disease Prevention and Control (ECDC), and the Carbon Footprint Calculation Tool based on the calculation tool of the Sustainable Healthcare Coalition.
Findings: A total of 91 patients (16.5%) developed an SSI. The total carbon footprint of the follow-up and treatment of these 91 patients was calculated as approximately 1,735 kg CO2 equivalent (CO2e). The largest source of emissions was hospitalization (clinic and intensive care: 1,133 kg CO2e), accounting for approximately 65% of the total. This was followed by patient transport (142.4 kg CO2e) and magnetic resonance imaging (MRI; 108.0 kg CO2e). On average, the development of an SSI imposed an additional carbon burden of 16.8 kg CO2e per patient compared with a standard surgical procedure.
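As a worked breakdown of the figures reported above, the short sketch below computes each named source's share of the reported 1,735 kg CO2e total; the residual "other" category is inferred from the abstract's figures rather than separately reported.

```python
# Breakdown of the reported SSI emission sources against the reported
# total of ~1,735 kg CO2e. The "other (residual)" entry is inferred by
# subtraction, not a figure reported in the abstract.
sources = {
    "hospitalization (clinic + ICU)": 1133.0,
    "patient transport": 142.4,
    "MRI imaging": 108.0,
}
total = 1735.0
sources["other (residual)"] = total - sum(sources.values())

for name, kg in sources.items():
    print(f"{name}: {kg:.1f} kg CO2e ({kg / total:.1%})")
# hospitalization works out to ~65% of the reported total
```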
Conclusions: This study has quantitatively demonstrated that SSIs have a measurable and significant environmental burden in addition to their known clinical and economic burden. Preventing SSIs is a critical strategy for ensuring both patient safety and economic and ecological sustainability in surgery.
Background: Interpretation of microbial tolerance and resistance to disinfectants has long been inconsistent, with heterogeneous definitions and no clinically meaningful threshold. We propose the concept of Replication Capacity After Use (RCAU) as a practical endpoint to assess whether microbial survival after disinfectant exposure constitutes a clinically relevant phenomenon under recommended use conditions. RCAU is defined as the ability of microorganisms to replicate after exposure to a disinfectant at its recommended application concentration and exposure time. A critical RCAU corresponds to failure of a standardised quantitative suspension test.
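As a hedged illustration of the RCAU endpoint, the sketch below flags a critical RCAU when a quantitative suspension test at use conditions fails its required log10 reduction; the 5-log10 pass criterion and the CFU values are illustrative assumptions, not figures from this paper.

```python
# Illustrative check for a "critical RCAU": the disinfectant, tested at
# its recommended use concentration and exposure time, fails to achieve
# the suspension test's required log10 reduction, so survivors capable
# of replication remain. The 5-log10 criterion and counts are assumed.
import math

def log10_reduction(cfu_initial: float, cfu_after: float) -> float:
    """Log10 reduction factor; recovered count floored at 1 CFU/mL."""
    return math.log10(cfu_initial / max(cfu_after, 1.0))

def critical_rcau(cfu_initial: float, cfu_after: float,
                  required_log_reduction: float = 5.0) -> bool:
    """True when the suspension test fails at use conditions."""
    return log10_reduction(cfu_initial, cfu_after) < required_log_reduction

print(critical_rcau(1e8, 1e2))  # False: 6-log reduction, test passed
print(critical_rcau(1e8, 1e4))  # True: only 4-log reduction, test failed
```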
Methods: We reassessed published evidence across the most common disinfectant substances listed by the German Association for Applied Hygiene (VAH). Reported findings on survival, tolerance and resistance were re-evaluated against the RCAU definition, with particular attention to whether testing was performed using quantitative suspension methods at application concentration.
Results: No disinfectant group has demonstrated a critical RCAU under application conditions in standardised suspension testing. Reports of reduced susceptibility or microbial survival exist, but many were not based on suspension tests at use concentrations, making interpretation with respect to RCAU uncertain. Transient or reversible adaptations have been described, yet without evidence of a critical RCAU. Only triclosan and silver compounds show established resistance mechanisms, though even here no critical RCAU has been confirmed under standardised testing.
Conclusions: RCAU provides a transparent, use-condition-anchored framework to differentiate non-critical survival from clinically relevant resistance development. Applied across disinfectant classes, it shows that no critical failures have occurred at use concentrations, although many reported findings were not assessed by standardised suspension tests.

