The use of residual streams from agricultural production and food consumption containing animal proteins entails the risk of disease transmission, as illustrated by the epidemics of bovine spongiform encephalopathy (BSE) and African swine fever. To combat this risk, the use of animal proteins in livestock feed was banned in the European Union, resulting in a drain of valuable proteins from the agricultural system. With an increasing call for a circular food system, the use of residual streams as a feed ingredient needs to be reconsidered, with the associated disease risks assessed and mitigated where needed. In this study, we assessed the BSE risk of bovine spray-dried red blood cells (SDRBC) as an ingredient of aquafeed. Feeding bovine SDRBC to fish could indirectly expose ruminants to BSE infectivity, because one of the exemptions from the feed ban is the use of fishmeal as an ingredient in calf milk replacers. A quantitative risk model was built to evaluate the BSE infectivity present in blood sourced from a slaughtered BSE-infected cow and the reduction of infectivity due to processing steps along the production chain. The end point of the model was the BSE infectivity, expressed in cattle oral ID50 (CoID50), reaching calves fed calf milk replacer containing fishmeal, and the corresponding probability that this will result in at least one new BSE infection.
The expected BSE infectivity in blood from a BSE-infected cow at the clinical end state of infection is 0.75 CoID50 (median value). Infectivity in blood mainly results from cross-contamination with brain tissue during stunning at the slaughterhouse. The initial infectivity is reduced along the pathway from slaughtered cow to calf milk replacer, with the largest reduction achieved by clearance of infectivity in fish fed bovine SDRBC as an ingredient of aquafeed, although this parameter has high uncertainty. The final infectivity reaching calves via inclusion of fishmeal in calf milk replacer is estimated to be very low (median value: 1.1 × 10⁻⁵ CoID50). Assuming an exponential dose-response model, this corresponds to an expectation that fewer than 10 out of a million slaughtered BSE-infected cows will result in new BSE infections, which is far below the threshold value of 1 for the basic reproduction number (R0) needed to initiate a new epidemic. We thus conclude that it is very unlikely that the use of bovine SDRBC as an ingredient of aquafeed will result in a new BSE epidemic in cattle. What-if analysis indicated that this conclusion is robust, despite the high uncertainty of some input parameters.
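As a rough check of the order of magnitude, and assuming the exponential dose-response model is calibrated so that a dose of 1 CoID50 gives a 50 % probability of infection (the calibration is not stated explicitly above), the final figures can be reproduced as follows:

\[
P_{\mathrm{inf}}(d) = 1 - e^{-r d}, \qquad P_{\mathrm{inf}}(1\ \mathrm{CoID}_{50}) = 0.5 \;\Rightarrow\; r = \ln 2 \approx 0.693
\]
\[
P_{\mathrm{inf}}(1.1 \times 10^{-5}\ \mathrm{CoID}_{50}) \approx r\,d \approx 7.6 \times 10^{-6}
\]

i.e. on the order of 8 expected new infections per million slaughtered BSE-infected cows, consistent with the "fewer than 10 per million" figure and far below the R0 = 1 threshold.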
This study presents a procedure for surveillance-data-driven risk assessment, which can be used to inform inter-sectoral, risk-based Campylobacter control, e.g. within National Action Plans and One Health (OH) systems. Campylobacter surveillance data (2019 to 2022) and a published quantitative microbial risk assessment (QMRA) model were used to demonstrate the procedure. In addition, an interface tool was developed in Excel to display descriptive statistics on the measured apparent flock prevalence (AP) and concentrations on the meat (colony forming units per gram, cfu/g), together with the related QMRA outputs. Currently (mid-2024), Danish fresh broiler meat is produced by four slaughterhouse companies (A, B, C and D), and approximately 30 % of the annually slaughtered broiler flocks are randomly culture tested, using one leg skin (LS) sample per flock taken from chilled carcasses. Data variables were: date of sampling, farm-ID, within-farm house-ID, flock-ID, slaughterhouse name, sample-ID, and Campylobacter concentration. Flocks were classified as carcass positive at a concentration ≥ 10 cfu/g. The data were fed into the QMRA model to assess: a) the average risk of human campylobacteriosis per serving (during a month or year), and b) the monthly/annual risk in 2022 relative (RR) to the baseline (average) risk of the previous three years. The descriptive statistics and the risk assessment (RA) were carried out at national level and for each slaughterhouse. In 2022, the national RR was 1.03, implying that the average annual risk increased by approximately 3 % compared to the baseline. Nevertheless, for slaughterhouses A, B and D the annual risk decreased by ≈ 22 %, 21 % and 43 %, respectively, whereas for slaughterhouse C it increased by 48 %. Monthly risk estimates showed seasonal variation, in line with the visualized changes in AP and meat contamination. The national monthly RR was > 1 in July and from September to December. During those months, slaughterhouse C always had RR > 1, slaughterhouse A showed a relative increase in risk in July, slaughterhouse B in July and November, and slaughterhouse D in October and December. The procedure and tools used in this study allow the impact of seasonality and food-chain stages (i.e. slaughterhouses and their broiler-sourcing farms) on the risk per serving to be identified, so that risk-based Campylobacter control can be implemented accordingly, from farm to fork, across consecutive surveillance periods. The same principles could be applied in other countries, food chains, and/or to other foodborne pathogens when similar data and QMRA models are available.
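As a hedged illustration of how the relative risk (RR) summaries above can be derived from such surveillance-driven QMRA outputs (the study itself used a published QMRA model and an Excel interface, not this code), the following Python sketch computes a per-slaughterhouse RR as the mean risk per serving in 2022 divided by the mean over the 2019-2021 baseline; the column names are assumptions for illustration only.

```python
import pandas as pd

def relative_risk(df: pd.DataFrame, target_year: int = 2022,
                  baseline_years=(2019, 2020, 2021)) -> pd.Series:
    """Relative risk per slaughterhouse: mean risk per serving in the target
    year divided by the mean over the baseline years (RR > 1 means the risk
    increased relative to the baseline).

    Expects columns (hypothetical names): 'slaughterhouse', 'year' and
    'risk_per_serving', where the latter is the QMRA output per tested flock,
    driven by the measured leg-skin concentration (cfu/g).
    """
    yearly = (df.groupby(["slaughterhouse", "year"])["risk_per_serving"]
                .mean()
                .unstack("year"))
    baseline = yearly[list(baseline_years)].mean(axis=1)
    return yearly[target_year] / baseline
```

Monthly RRs follow the same pattern, with the grouping done per month within each year.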
Consumption of drinking water containing pathogenic microorganisms may pose serious health risks from waterborne diseases. Quantifying such risks is essential for guiding interventions and policy decisions. Quantitative microbial risk assessment (QMRA) is a very useful method for estimating the public's risk of infection from disease-causing microorganisms in water sources. QMRA of drinking water production processes is limited worldwide, and so far no such QMRA study has been conducted in Bangladesh. Moreover, climate and socio-economic changes may affect waterborne pathogens and the associated health risks, but to what extent remains unclear, because a comprehensive QMRA accounting for the combined impact of climatic and socio-economic factors has not yet been carried out anywhere. In this study, the Swedish QMRA tool was applied to evaluate the public health risk from the drinking water production process in Dhaka, Bangladesh, as a case study. First, the current risk was quantified, and then the potential future risk was projected by taking climate and socio-economic factors into account. The results revealed that the annual infection risks under the current (2020s) baseline condition were below the acceptable risk threshold of 10⁻⁴ infections per person per year (as proposed by several USEPA scientists) for all three pathogens (Salmonella, norovirus and Giardia). However, after extreme events with sewer overflow and agricultural runoff, norovirus exceeded the acceptable risk threshold, and the risks for Salmonella and Giardia were borderline. The selected sustainable future scenario showed some improvement in terms of annual infection risks, while the uncontrolled scenario resulted in substantially higher infection risks in both the near and far future compared to the current scenarios. Installation of a UV treatment step as an additional treatment barrier resulted in a significant reduction of the infection risk. According to the sensitivity analysis, socio-economic factors such as human population, livestock, and pathogen removal in wastewater had a greater influence on the infection risks than climate change. The study can help policy makers and water managers identify interventions to reduce the burden of disease on the population. The tool can be used to assess the health risk associated with drinking water production processes in other areas of the world with similar characteristics.
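For reference, the 10⁻⁴ infections per person per year benchmark is compared against an annual risk that, in standard QMRA practice, is aggregated from per-exposure risks; a minimal formulation (the Swedish QMRA tool's exact aggregation may differ), assuming independent daily consumption events, is:

\[
P_{\mathrm{annual}} = 1 - \prod_{i=1}^{n}\bigl(1 - P_{\mathrm{event},i}\bigr) \approx 1 - \bigl(1 - P_{\mathrm{event}}\bigr)^{365}
\]

where \(P_{\mathrm{event}}\) is the infection probability per consumption day, obtained from the ingested dose and the pathogen-specific dose-response model.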
There is a current trend towards plant-based diets in Western countries. Since changes in diet imply possible changes in exposure to foodborne pathogens, there is an increasing need to assess the microbiological risks associated with these diets. This study aims to assess the microbiological risks for French adults associated with Bacillus cereus group III and group IV in hot, homemade cereal- and lentil-based dishes. A probabilistic retail-to-fork risk assessment model was developed, considering cooking, cooling at ambient temperature, and storage under chilled conditions. Data came from a representative national survey, public databases and the literature. The model was developed in R, and uncertainty and variability were separated using second-order Monte Carlo simulations. Because not all consumers have the same storage and cooling practices, the results were expressed as probability distributions stratified by storage time. The mean concentration of Bacillus cereus in portions at the time of consumption after 72 h of storage was 1.2 log CFU·g⁻¹ for cereal-based dishes and 3.4 log CFU·g⁻¹ for lentil-based dishes. After 72 h of storage under chilled conditions, the risk per portion, defined as the probability of contamination exceeding 5 log CFU·g⁻¹, was 0 (95 % CI: 0 - 0) for cereal-based dishes and 7.95 × 10⁻⁴ (95 % CI: 5.55 × 10⁻⁴ - 1.12 × 10⁻³) for lentil-based dishes. However, if the cooling time at room temperature reached 24 h, the risk for cereal- and lentil-based dishes increased to 2.39 × 10⁻³ (95 % CI: 1.15 × 10⁻³ - 4.90 × 10⁻³) and 4.66 × 10⁻¹ (95 % CI: 3.16 × 10⁻¹ - 6.07 × 10⁻¹), respectively. The sensitivity analysis indicated that the initial prevalence and level of contamination were the key factors limiting the risk, ranking before cooling time or refrigeration conditions. Moreover, the scenario analysis revealed an influence of consumer behaviour regarding cooling and storage time on the risk per portion. The environmental trend towards plant-forward diets, combined with the emerging no-food-waste and batch-cooking practices in France, will likely favour new consumption patterns and increase the risk associated with Bacillus cereus. Our model will help quantify this extra burden.
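The authors' model was implemented in R; purely to illustrate the second-order Monte Carlo structure that separates uncertainty (outer loop) from variability (inner loop), a minimal Python sketch is given below. All distributions and parameter values are placeholders, not the study's inputs; only the 5 log CFU/g risk criterion is taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
N_UNC, N_VAR = 100, 10_000      # outer (uncertainty) and inner (variability) iterations
RISK_THRESHOLD = 5.0            # log CFU/g at consumption, as in the abstract

risk_per_portion = []
for _ in range(N_UNC):
    # Outer loop: draw uncertain model parameters (placeholder distributions).
    mean_initial = rng.normal(-1.0, 0.5)     # mean initial contamination, log CFU/g
    growth_rate = rng.uniform(0.05, 0.15)    # growth during ambient cooling, log CFU/g per h

    # Inner loop: variability across portions and consumer practices.
    initial = rng.normal(mean_initial, 1.0, N_VAR)
    cooling_h = rng.uniform(1, 24, N_VAR)    # cooling time at room temperature, hours
    final = initial + growth_rate * cooling_h

    # Variability-averaged risk per portion for this uncertainty draw.
    risk_per_portion.append(np.mean(final > RISK_THRESHOLD))

# Uncertainty interval around the risk per portion (cf. the 95 % CIs reported above).
print(np.percentile(risk_per_portion, [2.5, 50, 97.5]))
```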
This article presents the outcomes of a scientific review and microbiological risk ranking of fresh, frozen, processed, and preserved fruit and vegetables imported into New Zealand. The study was undertaken by New Zealand Food Safety to help in the prioritisation of imported food safety issues for risk management action and ensure that regulatory resources are appropriately focused on food products that represent the highest public health risk.
Risk ranking, also sometimes called comparative risk assessment, is a methodology in which the most significant risks associated with specific hazards and foods are identified, characterised, and then compared. The output is a list of pathogen-food combinations ranked according to their relative level of risk, from highest to lowest.
This study involved the development of a New Zealand risk ranking model based on two multicriteria analysis models developed separately by the United States Food and Drug Administration and the European Food Safety Authority (BIOHAZ Panel) for similar risk ranking applications. The New Zealand model uses nine criteria that have been adapted to New Zealand data and circumstances.
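The nine criteria and their weights are not listed in this summary; purely as an illustration of how a multicriteria risk-ranking score is typically combined, the sketch below uses hypothetical criteria, scores, and equal weights (not the New Zealand model's actual inputs).

```python
# Hypothetical multicriteria ranking sketch: each pathogen-produce combination is
# scored per criterion (e.g. on a 1-5 scale) and ranked by its weighted sum.
CRITERIA_WEIGHTS = {                 # placeholder criteria and equal weights
    "severity_of_illness": 1.0,
    "likelihood_of_contamination": 1.0,
    "growth_potential_during_storage_and_transport": 1.0,
    "import_volume_and_consumption": 1.0,
}

def risk_score(scores: dict) -> float:
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

combinations = {
    ("pathogenic E. coli", "lettuce"): {
        "severity_of_illness": 4,
        "likelihood_of_contamination": 4,
        "growth_potential_during_storage_and_transport": 3,
        "import_volume_and_consumption": 3,
    },
    ("Salmonella spp.", "melons"): {
        "severity_of_illness": 4,
        "likelihood_of_contamination": 3,
        "growth_potential_during_storage_and_transport": 4,
        "import_volume_and_consumption": 2,
    },
}

ranking = sorted(combinations, key=lambda c: risk_score(combinations[c]), reverse=True)
```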
The eight top-ranking pathogen-produce combinations identified using the New Zealand model were pathogenic E. coli in lettuce, spinach and other leafy greens, and Salmonella spp. in lettuce, other leafy greens, tomatoes, melons, and other Cucurbitaceae (e.g. cucumbers, gourds, squashes, pumpkins). Produce categories were also ranked based on the overall risk from the various pathogens associated with each produce category. The top-ranked produce categories, in decreasing order of rank, were: other leafy greens, pods, legumes and grains, tropical fruits, berries, herbs and spinach.
The risk ranking lists provide a starting point and basis for risk management considerations and prioritisation of resources. They will need to be regularly updated to ensure they remain relevant by incorporating the latest epidemiological, hazard, and import volume data. Updates should also consider the availability of new modelling tools and analytical methods for emerging or less common pathogens.
This study assessed the influence of preparing iceberg lettuce salads at home on the risk of Escherichia coli O157:H7, Salmonella Typhimurium, Listeria monocytogenes, and Campylobacter jejuni by conducting quantitative microbial risk assessments (QMRAs) for distribution, retail, domestic storage, and cross-contamination. The QMRA simulated pathogen behaviors in lettuce and meat in farm-to-fork environments. The order of food preparation, hand washing, and lettuce washing were assessed in domestic lettuce salad and raw meat processes. Scenario and sensitivity analyses were performed to compare the importance of the process factors. The QMRA simulation revealed that factors related to initial contamination and at-home preparation of foods were more critical than those related to the time-temperature environment during distribution and storage. The risk of L. monocytogenes infection decreased by only 1 % even in the absence of cross-contamination. Similarly, the risk of C. jejuni hardly decreased (0.91-fold) even in the absence of lettuce contamination. When the lettuce was not washed, the risk of L. monocytogenes was relatively higher (1.92-fold) than that of the other pathogens (E. coli O157:H7, 1.44-fold; S. Typhimurium, 1.38-fold; and C. jejuni, 1.36-fold). The risks of E. coli O157:H7 (2.60-fold), S. Typhimurium (2.18-fold), and C. jejuni (2.67-fold) increased when hands were not washed before lettuce preparation, whereas the risk of L. monocytogenes did not increase (1.07-fold). The importance of avoiding cross-contamination through an appropriate order of food preparation and hand washing during lettuce salad preparation was quantitatively demonstrated in the present study, providing essential information for food safety education at home.
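To make the fold-change comparisons above concrete, a minimal sketch of the cross-contamination step that such a QMRA typically includes is given below; the transfer fractions, meat contamination level, and dose-response parameter are hypothetical placeholders, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Placeholder pathogen load on raw meat handled before the lettuce salad (CFU).
meat_load = 10 ** rng.normal(2.0, 1.0, N)

# Hypothetical transfer fractions along the meat -> hands -> lettuce route.
t_meat_hands = rng.uniform(0.01, 0.10, N)
t_hands_lettuce = rng.uniform(0.01, 0.10, N)
handwash_reduction = 10 ** -rng.uniform(1, 3, N)   # 1-3 log reduction from hand washing

dose_washed = meat_load * t_meat_hands * handwash_reduction * t_hands_lettuce
dose_unwashed = meat_load * t_meat_hands * t_hands_lettuce

# Convert dose to infection risk with an exponential dose-response model and
# report the scenario effect as a fold-change of the mean risk per serving.
r = 1e-3                                           # placeholder dose-response parameter
risk_washed = np.mean(1 - np.exp(-r * dose_washed))
risk_unwashed = np.mean(1 - np.exp(-r * dose_unwashed))
print(risk_unwashed / risk_washed)                 # fold-change when hands are not washed
```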