Connecting Organizations for Regional Disease Surveillance (CORDS) is an international non-governmental organization focused on information exchange between disease surveillance networks in different areas of the world. By linking regional disease surveillance networks, CORDS builds a trust-based social fabric of experts who share best practices, surveillance tools and strategies, training courses, and innovations. CORDS exemplifies the shifting patterns of international collaboration needed to prevent, detect, and counter all types of biological dangers: not just naturally occurring infectious diseases, but also terrorist threats. CORDS represents a network-of-networks approach; its mission is to link regional disease surveillance networks to improve global capacity to respond to infectious diseases. CORDS is an informal governance cooperative with six founding regional disease surveillance networks, with plans to expand; it works cooperatively with, and in complement to, the World Health Organization (WHO), the World Organisation for Animal Health (OIE), and the Food and Agriculture Organization of the United Nations (FAO). As described in detail elsewhere in this special issue of Emerging Health Threats, each regional network is an alliance of a small number of neighboring countries working across national borders to tackle emerging infectious diseases that require unified regional efforts. Here we describe the history, culture, and commitment of CORDS, and the novel and necessary role that CORDS serves in the existing international infectious disease surveillance framework.
The East African Integrated Disease Surveillance Network (EAIDSNet) was formed in response to the increasing frequency of cross-border malaria outbreaks in the 1990s and a growing recognition that fragmented disease interventions, coupled with weak laboratory capacity, were making it difficult to respond in a timely manner to outbreaks of malaria and other infectious diseases. The East African Community (EAC) partner states, with financial support from the Rockefeller Foundation, established EAIDSNet in 2000 to develop and strengthen the communication channels necessary for integrated cross-border disease surveillance and control efforts. The objective of this paper is to review the regional EAIDSNet initiative and highlight achievements and challenges in its implementation. Major accomplishments of EAIDSNet include influencing the establishment of a Department of Health within the EAC Secretariat to support a regional health agenda; successfully completing a regional field simulation exercise in pandemic influenza preparedness; and piloting a web-based portal for linking animal and human health disease surveillance. The strategic direction of EAIDSNet was shaped, in part, by lessons learned during a visit to the more established Mekong Basin Disease Surveillance (MBDS) regional network. Looking to the future, EAIDSNet is collaborating with the East, Central and Southern Africa Health Community (ECSA-HC), EAC partner states, and the World Health Organization to implement the World Bank-funded East Africa Public Health Laboratory Networking Project (EAPHLNP). The network has also begun lobbying East African governments for funding to support EAIDSNet activities.
The Asia Partnership on Emerging Infectious Diseases Research (APEIR) was initiated in 2006 to promote regional collaboration in avian influenza research. In 2009, the partnership expanded its scope to include all emerging infectious diseases. APEIR partners include public health and animal health researchers, officials, and practitioners from Cambodia, China, Lao PDR, Indonesia, Thailand, and Vietnam. APEIR has made major achievements in three key areas of activity: (i) knowledge generation (i.e., through research); (ii) research capacity building (e.g., by developing high-quality research proposals, planning and conducting joint research projects, and adopting a broader EcoHealth/One Health approach); and (iii) policy advocacy (e.g., by disseminating research results to policy makers). This paper describes these achievements, with a focus on the partnership's five major areas of emerging infectious disease research: wild migratory birds, backyard poultry systems, socio-economic impact, policy analysis, and control measures. We highlight two case studies illustrating how the partnership's research results are being used to inform policy. We also highlight lessons learned from five years of building our partnership, and the value added by a multi-country, multi-sectoral, multi-disciplinary research partnership like APEIR.
There are currently no widely accepted animal surveillance guidelines for human Ebola hemorrhagic fever (EHF) outbreak investigations aimed at identifying potential sources of Ebolavirus (EBOV) spillover into humans and other animals. Animal field surveillance during and following an outbreak serves several purposes, from helping identify the specific animal source of a human case to guiding control activities by describing the spatial and temporal distribution of wild circulating EBOV, informing public health efforts, and contributing to broader EHF research questions. Since 1976, researchers have sampled over 10,000 individual vertebrates from areas associated with human EHF outbreaks and tested them for EBOV or anti-EBOV antibodies. Using field surveillance data associated with EHF outbreaks, this review provides guidance for resource-limited outbreak situations on animal sampling, target species, and, in some cases, which diagnostics should be prioritized to rapidly assess the presence of EBOV in animal reservoirs. In brief, EBOV detection was 32.7% (18/55) for carcasses (animals found dead) and 0.2% (13/5309) for live-captured animals. Our review indicates that, for the purposes of identifying potential sources of transmission from animals to humans and isolating suspected virus from an animal in outbreak situations, (1) surveillance of free-ranging non-human primate morbidity and mortality should be a priority; (2) any wildlife morbidity or mortality event should be investigated, as such events may hold the most promise for locating virus or viral genome sequences; (3) surveillance of some bat species is worthwhile for isolating virus and detecting evidence of exposure; and (4) morbidity, mortality, and serology studies of domestic animals should prioritize dogs and pigs and include testing for virus and previous exposure.
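The headline detection proportions above are simple binomial estimates. The short Python sketch below reproduces them and attaches exact (Clopper-Pearson) confidence intervals; the counts come from the abstract, but the interval method is our illustrative choice, not something reported by the review.

```python
# Minimal sketch (ours, not the review's analysis code): reproduce the
# reported EBOV detection proportions and add exact binomial confidence
# intervals. Counts are taken from the abstract above.
from scipy.stats import beta

def detection_rate(positives: int, sampled: int, alpha: float = 0.05):
    """Point estimate and Clopper-Pearson 95% interval for a proportion."""
    lower = 0.0 if positives == 0 else beta.ppf(alpha / 2, positives, sampled - positives + 1)
    upper = 1.0 if positives == sampled else beta.ppf(1 - alpha / 2, positives + 1, sampled - positives)
    return positives / sampled, (lower, upper)

for label, k, n in [("carcasses", 18, 55), ("live captures", 13, 5309)]:
    rate, (lo, hi) = detection_rate(k, n)
    print(f"{label}: {rate:.1%} ({k}/{n}), 95% CI {lo:.1%}-{hi:.1%}")
```

The wide gap between the two estimates (32.7% versus 0.2%) is what motivates recommendation (2): carcasses are far more likely than live captures to yield virus or viral genome sequences.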
Background: Emergency physicians see many people who present to the emergency department stating that they are immunized against tetanus when, in fact, they are not. The patient history is not dependable for determining true tetanus status, and simple patient surveys do not establish actual prevalence. The objective of this study was to determine the prevalence of tetanus protection by antibody titer seropositivity and to quantify seropositivity among patients reporting that their tetanus immunization was up to date.
Methods: This study was a single-center prospective convenience sample of patients 12 years of age or older presenting to the emergency department. Patients deemed study candidates who were willing to participate completed an eight-question survey that included the question 'Is your tetanus shot up to date?'. A blood sample was then drawn for a tetanus antibody titer, which was classified as protective or non-protective according to a predetermined cutoff.
Results: A total of 163 patients were enrolled. Of the patients responding yes to the query 'Is your tetanus shot up to date?', 12.8% (N=5) were not seropositive. All 26 patients who were seronegative had been to a doctor in the past year, and 88.5% (N=23) had been to their family physician.
Conclusion: The study suggests that the tetanus immunization history given by patients presenting to the emergency department may not be reliable. The study also observed that a large percentage of seronegative patients had been seen by a primary care physician and had not received a needed tetanus immunization.
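To make the arithmetic behind these figures explicit, here is a minimal Python sketch under stated assumptions: the 0.1 IU/mL protective cutoff is a common serologic convention and not necessarily the study's, and the denominator for the 12.8% figure is back-calculated from the reported values rather than stated in the abstract.

```python
# Minimal sketch (ours, not the study's analysis code) of the titer
# classification and the abstract's headline percentages.

PROTECTIVE_CUTOFF_IU_ML = 0.1  # assumed cutoff; the study's actual value may differ

def is_seropositive(titer_iu_ml: float) -> bool:
    """Classify a tetanus antibody titer against the protective cutoff."""
    return titer_iu_ml >= PROTECTIVE_CUTOFF_IU_ML

# Counts reported in the abstract:
enrolled, seronegative = 163, 26
print(f"overall seroprevalence: {(enrolled - seronegative) / enrolled:.1%}")  # 84.0%

# Among patients answering yes, 12.8% (N=5) were seronegative; the denominator
# (about 39) is inferred from those two figures, not given directly.
yes_seronegative = 5
yes_total = round(yes_seronegative / 0.128)  # ~39
print(f"'up to date' but seronegative: {yes_seronegative / yes_total:.1%}")
```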
Background: Calls for disaster medical assistance teams (DMATs) are likely to continue in response to international disasters. As part of a national survey, this study was designed to evaluate Australian DMAT experience in relation to the human resources issues associated with deployment.
Methods: Data were collected via an anonymous mailed survey distributed via State and Territory representatives on the Australian Health Protection Committee, who identified team members associated with Australian DMAT deployments following the 2004 South East Asian Tsunami disaster.
Results: The response rate for this survey was 50% (59/118). Most personnel had deployed to the Asian Tsunami-affected areas, and DMAT members had significant clinical and international experience. While all except one respondent stated they received a full orientation prior to deployment, only 34% of respondents (20/59) felt their role was clearly defined pre-deployment. Approximately 56% (33/59) felt their actual role matched their intended role and that their clinical background was well suited to their tasks. The most commonly nominated period of availability for deployment was 1 month (34%, 20/59). The most common period of notice needed to deploy was 6-12 hours (29%, 17/59), followed by 12-24 hours (24%, 14/59). The preferred period of overseas deployment was 14-21 days (46%, 27/59), followed by 1 month (25%, 15/59), and the optimum shift length was felt to be 12 hours by 66% (39/59). The majority felt that there was both adequate pay (71%, 42/59) and adequate indemnity (66%, 39/59). Almost half (49%, 29/59) stated it was better to work with people from the same hospital, and, while most felt their deployment could be easily covered by staff from their workplace (56%, 33/59) and caused an inconvenience to their colleagues (51%, 30/59), few felt it interrupted service delivery in their workplace (10%, 6/59) or caused an inconvenience to patients (9%, 5/59). Deployment was felt to benefit the affected community by nearly all (95%, 56/59), while fewer (42%, 25/59) felt that there was a benefit for their own local community. Nearly all felt their role was recognised on return (93%, 55/59) and an identical number (93%, 55/59) enjoyed the experience. All stated they would volunteer again, with 88% strongly agreeing with this statement.
Conclusions: This study of Australian DMAT members provides significant insights into a number of human resources issues and should help guide future deployments. The preferred 'on call' arrangements, notice to deploy, period of overseas deployment and shift length are all identified. Extended periods of operations need to be supported by planning and provision of rest cycles, food, temporary accommodation and rest areas for staff. The study also suggests that more emphasis should be placed on team selection and clarification of roles. While the majority felt
Rapid developments in nano-technology are likely to confer significant benefits on mankind. But, as with perhaps all new technologies, these benefits are likely to be accompanied by risks, some of them new. Nano-toxicology is developing in parallel with nano-technology and seeks to define the hazards and risks associated with nano-materials: only when risks have been identified can they be controlled. This article discusses the reasons for concern about the potential effects on health of exposure to nano-materials and relates these to the evidence of the effects on health of the ambient aerosol. A number of hypotheses are proposed, and the dangers of adopting unsubstantiated hypotheses are stressed. Nano-toxicology presents many challenges and will need substantial financial support if it is to develop at a rate sufficient to keep pace with developments in nano-technology.
Introduction: Extended-spectrum beta-lactamase (ESBL) producing bacteria have been increasingly reported as causal agents of nosocomial infection worldwide. Resistance patterns vary internationally, and even locally from one institution to another. We investigated clinical isolates positive for ESBL-producing bacteria at our institution, a tertiary care hospital in Madrid (Spain), over a 2-year period (2007-2008).
Methods: Clinical and microbiological data were retrospectively reviewed. Two hundred and nineteen patients were included in the study.
Results: Advanced age, diabetes, use of catheters, previous hospitalization, and previous antibiotic treatment were among the risk factors found in these patients. Escherichia coli was the most frequent isolate, and the urinary tract was the most common site of isolation. Internal Medicine, the Intensive Care Unit (ICU), and General Surgery had the highest numbers of isolates. There were no outbreaks during the study period. Antibiotic susceptibility patterns showed high rates of resistance to quinolones across all isolates. All isolates (100%) were sensitive to carbapenems.
Conclusion: Carbapenems remain the treatment of choice for infections caused by ESBL-producing bacteria. Infection control measures are of great importance in preventing the spread of these nosocomial infections.
Background: Studies investigating the effect of power frequency (50-60 Hz) electromagnetic fields (EMF) on melatonin synthesis in rats have been inconsistent: several show suppression of melatonin synthesis, others show no effect, and a few demonstrate small increases. Scant research has focused on the ensuing sleep patterns of EMF-exposed rats. The present study was designed to examine the effects of power frequency EMF on the production of melatonin and the subsequent sleep structure in rats.
Methods: Eighteen male Sprague-Dawley rats were exposed to a 1000 milligauss (mG) magnetic field for 1 month. Urine was collected over the final 3 days of the exposure period for analysis of 6-sulphatoxymelatonin, the major urinary metabolite of melatonin. Subsequent sleep was analyzed over a 24-hour period.
Results: Melatonin production was mildly increased in exposed animals. Although there were no statistically significant changes in sleep structure, exposed animals showed slight decreases in REM (rapid eye movement) sleep as compared to sham (non-exposed) animals.
Conclusions: Power frequency magnetic fields induced a marginally statistically significant increase in melatonin levels in exposed rats compared to controls. Subsequent sleep analysis indicated little effect on sleep architecture, at least within the first day after 1 month of continuous exposure. Conflicting results in the literature are discussed and directions for future research are suggested.