Cardiopulmonary resuscitation (CPR) quality is crucial for improving patient survival after cardiac arrest. This study investigated the usefulness of femoral artery collapse ratio (systolic diameter/diastolic diameter) measurement using M-mode ultrasound versus end-tidal carbon dioxide (ETCO2) for assessing high-quality CPR in a porcine cardiac arrest model. A total of 10 male mongrel pigs (age range, 16–20 weeks; weight, 45–50 kg) were used. After anesthesia, the carotid artery was dissected and exposed, and an arterial catheter was placed in the exposed carotid artery to monitor arterial blood pressure. Cardiac arrest was induced by injecting potassium chloride (KCl, 40 equivalents of weight). The animals underwent chest compression with a mechanical device, and chest compression depth and ETCO2 were measured using a defibrillator. To obtain hemodynamic information, two investigators performed ultrasound examinations of both femoral arteries: one measured the femoral peak systolic velocity (PSV), while the other measured the systolic and diastolic diameters of the femoral artery in transverse or longitudinal views using the M-mode of a linear ultrasound probe. As compression depth increased, ETCO2, femoral artery diameter, collapse ratio (systolic diameter/diastolic diameter) and blood flow increased, whereas PSV decreased. ETCO2 and the collapse ratio were positively correlated. The femoral artery collapse ratio, measured using M-mode ultrasound, could be a simple alternative method for evaluating high-quality CPR.
Femoral artery collapse ratio as an indicator of chest compression quality during cardiopulmonary resuscitation in a porcine cardiac arrest model. Signa Vitae, 2023. doi: 10.22514/sv.2023.102
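The collapse ratio described above is a simple quotient of paired M-mode diameter measurements. A minimal sketch in Python (the measurement values are hypothetical, not the study's data):

```python
def collapse_ratio(systolic_mm, diastolic_mm):
    """Femoral artery collapse ratio: systolic / diastolic diameter.

    A ratio further above 1.0 indicates greater arterial expansion
    during the compression (systolic) phase.
    """
    if diastolic_mm <= 0:
        raise ValueError("diastolic diameter must be positive")
    return systolic_mm / diastolic_mm

# Hypothetical M-mode diameter pairs (mm) at increasing compression depths.
measurements = [(4.2, 4.0), (4.8, 4.1), (5.5, 4.2)]
ratios = [round(collapse_ratio(s, d), 2) for s, d in measurements]
print(ratios)  # the ratio rises as compression depth increases
```

In the study's framing, a rising ratio tracks rising ETCO2, so either could serve as a compression-quality signal.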
Fibrinogen function is evaluated as the maximum amplitude (MA) of the citrated functional fibrinogen (CFF) assay in TEG6s®; however, CFF-MA requires a long time to obtain results. The amplitude at 10 minutes (CFF-A10) allows more rapid decisions, but no studies have evaluated the correlation between CFF-A10 levels and fibrinogen concentration. This study aimed to assess the correlation between CFF-A10 and blood fibrinogen levels measured using the dry hematology method after cardiopulmonary bypass (CPB). This retrospective study was conducted in a single university hospital and enrolled 192 patients of all ages who underwent cardiovascular surgery with CPB between 01 March 2020 and 05 November 2021. CFF-A10 and CFF-MA levels were measured using the TEG6s® global hemostasis assay, and blood fibrinogen levels were measured using the Fibcare® DRIHEMATO Fib-HSII after CPB. Simple linear regression analysis was used to evaluate the relationship between TEG6s® parameters and fibrinogen concentration. Furthermore, the patients were classified into four groups based on cut-off values of fibrinogen (150 mg/dL) and CFF-A10, and the background factors for each group were analyzed. CFF-A10 and blood fibrinogen levels were correlated by linear regression (p < 0.0001, R2 = 0.37), as were CFF-MA and fibrinogen levels (p < 0.0001, R2 = 0.40). The optimal cut-off value of CFF-A10 for predicting low fibrinogen levels below 150 mg/dL, defined as the value maximizing the sum of sensitivity and specificity, was 8.4 mm (sensitivity 80.7%, specificity 67.9%); that of CFF-MA was 9.2 mm (sensitivity 76.3%, specificity 69.8%). Despite sufficient blood fibrinogen levels, patients with low CFF-A10 levels experienced more postoperative bleeding. CFF-A10 predicted fibrinogen loss faster than, and as accurately as, CFF-MA. Low CFF-A10 levels, despite sufficient fibrinogen levels, may be associated with increased blood loss following CPB.
Evaluation of fibrinogen function by CFF-A10 in cardiac surgery. Signa Vitae, 2023. doi: 10.22514/sv.2023.072
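The "optimal cut-off maximizing sensitivity and specificity" used above is the threshold with the highest Youden index (sensitivity + specificity - 1). A sketch of that selection on made-up amplitude/label pairs (not the study's data; a low amplitude is treated as a positive test for fibrinogen below 150 mg/dL):

```python
def youden_cutoff(values, is_low_fibrinogen):
    """Return (threshold, Youden index) maximizing sensitivity + specificity - 1.

    values: assay amplitudes (e.g., CFF-A10 in mm); an amplitude at or
    below the threshold is called positive for low fibrinogen.
    """
    positives = sum(is_low_fibrinogen)
    negatives = len(values) - positives
    best = (None, -1.0)
    for cut in sorted(set(values)):
        tp = sum(1 for v, low in zip(values, is_low_fibrinogen) if v <= cut and low)
        tn = sum(1 for v, low in zip(values, is_low_fibrinogen) if v > cut and not low)
        j = tp / positives + tn / negatives - 1
        if j > best[1]:
            best = (cut, j)
    return best

# Hypothetical amplitudes (mm) and low-fibrinogen (<150 mg/dL) labels.
amps = [6.0, 7.5, 8.0, 8.4, 9.0, 10.5, 12.0, 13.0]
low = [True, True, True, True, False, False, False, True]
cut, j = youden_cutoff(amps, low)
print(cut, round(j, 2))
```

Real analyses would typically derive the cut-off from a full ROC curve (e.g., with scikit-learn), but the selection rule is the same.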
Magnesium therapy may reduce migraine in children by reducing cortical spreading depression and activation of the trigeminovascular complex. It is being used increasingly in Emergency Departments for migraine, so we report a case series of children with migraine treated with intravenous (IV) magnesium sulphate. Electronic records were used to identify cases of migraine at our institution from May 2012 to September 2013. Patient records were reviewed to identify those with accurate migraine diagnoses and treatment with IV magnesium sulphate. Eighteen encounters were identified, involving 9 children. There was a good clinical response in 16 of these encounters, with an average time to response of 2.3 hours. Discharge from the Emergency Department (ED) occurred in 10 of the 12 encounters in which patients were administered IV magnesium sulphate in the ED. Why should an Emergency Physician be aware of this? When oral non-steroidal anti-inflammatories and triptans are not successful for Emergency presentations of migraine, there is a range of therapeutic options with limited evidence. Some of those options have well-known risks, for example, extrapyramidal side effects with prochlorperazine and excessive sedation with propofol. Intravenous magnesium sulphate has a good safety profile, minimal side effects and is familiar to most medical and nursing staff. It is a good option, as the infusion is brief and the clinical response is timely.
Intravenous magnesium sulphate for treatment of pediatric migraine: case series. Signa Vitae, 2023. doi: 10.22514/sv.2023.114
Over the last 50 years, the recommended chest compression for cardiopulmonary resuscitation (CPR) has become faster and deeper, but maintaining deep compressions may be difficult at higher rates. Our study aimed to determine whether adequate compression (chest compression at an appropriate depth and rate) is being performed in emergency departments (EDs). We also investigated the effect of adequate compression on the return of spontaneous circulation (ROSC). This prospective observational study was conducted at the EDs of two urban academic medical centers. We included adult patients (age ≥18 years) with cardiac arrest who underwent CPR in the ED between May and November 2020. We excluded patients with cardiac arrest related to trauma, repeated arrests after the first, and those for whom a monitor-defibrillator (ZOLL X-series) was not used. The following data were obtained from the monitor-defibrillator devices: compression depth, rate, chest compression fraction, CPR time, and the percentages of compressions at the recommended rate, at the recommended depth, above and below the recommended rate and depth, and at the appropriate depth and rate. Our study included 50 patients, from whom 441 chest compression sequences were obtained and analyzed. The mean compression depth, rate, and fraction were 6.48 ± 0.87 cm, 117 ± 5/min, and 92.1 ± 3.70%, respectively. As the compression rate increased, the depth decreased, and most compressions were over-depth. Adequate compression (appropriate depth at the recommended rate) was observed in 97 of the 441 compression sequences (21.9%). Below-depth and below-rate percentages were higher in the deceased group than in the ROSC group (9.7 ± 15.2% vs. 3.3 ± 3.5%, p = 0.27; 2.7 ± 2.6% vs. 1.2 ± 0.9%, p = 0.06). Overall, chest compressions showed low compliance with the recommended rate and depth, even when performed by skilled ED staff.
Cardiopulmonary resuscitation: difficulty in maintaining sufficient compression depth at the appropriate rate. Signa Vitae, 2023. doi: 10.22514/sv.2023.104
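A sketch of the adequacy classification described above. The depth and rate windows (roughly 5–6 cm at 100–120/min) are assumptions taken from widely cited resuscitation guidelines; the study's exact criteria may differ:

```python
# Assumed guideline windows (illustrative, not the study's exact criteria).
DEPTH_RANGE = (5.0, 6.0)   # cm
RATE_RANGE = (100, 120)    # compressions/min

def classify(depth_cm, rate_per_min):
    """Label one compression sequence: 'adequate' only when depth AND
    rate are both within their recommended windows."""
    depth_ok = DEPTH_RANGE[0] <= depth_cm <= DEPTH_RANGE[1]
    rate_ok = RATE_RANGE[0] <= rate_per_min <= RATE_RANGE[1]
    if depth_ok and rate_ok:
        return "adequate"
    if depth_cm > DEPTH_RANGE[1]:
        return "over-depth"
    if depth_cm < DEPTH_RANGE[0]:
        return "below-depth"
    return "rate out of range"

# Hypothetical sequences (depth cm, rate/min); note the deep, fast first one.
sequences = [(6.5, 117), (5.5, 110), (4.4, 125)]
labels = [classify(d, r) for d, r in sequences]
print(labels)
```

The study's finding that most sequences were over-depth corresponds to the first branch firing most often.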
Postoperative cognitive dysfunction (POCD) is a devastating complication with long-term consequences, and new therapeutic targets and drugs are still needed for its treatment. Sestrins are a family of stress-inducible proteins that regulate cellular metabolic networks; however, their possible effects on POCD remain unclear. This study aimed to investigate the effects of Sestrin 1 (SESN1) in a POCD cell model and to reveal its mechanism. We constructed an in vitro model of POCD by treating primary rat hippocampal neurons with sevoflurane. We found that SESN1 enhanced the viability of sevoflurane-treated cells and alleviated sevoflurane-induced inflammation. We further found that SESN1 reduced sevoflurane-induced reactive oxygen species (ROS) production and inhibited apoptosis. Mechanistically, SESN1 restrained NOD-like receptor family pyrin domain-containing 3 (NLRP3) inflammasome activation and thereby suppressed POCD. In conclusion, SESN1, as a potential target for postoperative cognitive dysfunction, attenuates sevoflurane-induced neuronal cell damage in the hippocampus. These findings will provide guidance for mechanistic studies of POCD and future drug development for its treatment.
SESN1, as a potential target for postoperative cognitive dysfunction, attenuates sevoflurane-induced neuronal cell damage in the hippocampus. Signa Vitae, 2023. doi: 10.22514/sv.2023.107
In older patients, ground-level falls are the most common cause of injury. Many intrinsic and extrinsic factors influence ground-level fall injuries. However, the characteristics and severity of these injuries have not been compared according to activity level. We compared the characteristics of ground-level fall injuries by activity level to establish a preventive strategy for such injuries in older patients. We retrospectively reviewed the records of older patients who were admitted to six university hospitals for ground-level fall injuries from 2011 to 2020. The patients were classified into active and inactive groups; active activities were defined as paid work, exercise and leisure activities. General and clinical characteristics of both groups were analyzed. Propensity score matching analysis (1:1) was performed for baseline characteristics (sex, age and alcohol consumption). A total of 33,924 patients were enrolled, of whom 4887 (14.4%) were classified into the active group. Injury severity did not differ between the active and inactive groups. The main factors significantly associated with ground-level fall injuries during activity in elderly patients were male sex, age 65 to 74 years or 75 to 84 years compared with over 85 years, an injury time other than 00:00–05:59, alcohol consumption, a sloping floor, and a floor type other than concrete. After propensity score matching, the factors associated with ground-level fall injuries in active older patients were a time of injury of 06:00–17:59 compared with 00:00–05:59, a slippery floor, a slope, the absence of obstacles and a floor type other than concrete. Preventive strategies for reducing ground-level fall injuries in older patients during activity should be established, and could include wearing suitable footwear, taking caution on sloping areas, and maintaining unpaved roads.
Risk factors for ground-level fall injuries during active activity in older patients. Signa Vitae, 2023. doi: 10.22514/sv.2023.099
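The 1:1 propensity score matching used above pairs each active-group patient with the inactive-group patient whose propensity score is closest. A minimal greedy sketch (the IDs, scores and caliper are hypothetical; real analyses typically use dedicated statistical packages):

```python
def greedy_match(active, inactive, caliper=0.05):
    """1:1 greedy nearest-neighbour matching on propensity scores.

    active / inactive: lists of (patient_id, propensity_score).
    Active patients with no inactive candidate within the caliper
    remain unmatched and are dropped from the matched analysis.
    """
    pool = dict(inactive)  # remaining inactive candidates
    pairs = []
    for pid, score in sorted(active, key=lambda t: t[1]):
        if not pool:
            break
        nearest = min(pool, key=lambda k: abs(pool[k] - score))
        if abs(pool[nearest] - score) <= caliper:
            pairs.append((pid, nearest))
            del pool[nearest]  # match without replacement
    return pairs

# Hypothetical patients: (id, propensity score of being in the active group).
active = [("a1", 0.30), ("a2", 0.55)]
inactive = [("i1", 0.29), ("i2", 0.52), ("i3", 0.90)]
pairs = greedy_match(active, inactive)
print(pairs)  # [('a1', 'i1'), ('a2', 'i2')]
```

Matching without replacement and a caliper are common choices, but the study may have used a different algorithm.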
The occurrence of unexpected difficult airway management (DAM) during endotracheal intubation (ETI) attempts represents a life-threatening scenario. The management of such challenges may improve with training in simulated DAM scenarios. Moreover, simulation allows investigation of the potential value of new devices and techniques for DAM. The combined use of laryngoscopy with a fiberoptic bronchoscope (CLBI) has been proposed in this regard, but its performance by novices facing DAM remains unexplored. We performed a randomized crossover simulation study evaluating the performance of ninety-six anesthesiology residents during ETI with four approaches: direct laryngoscopy (DL), Glidescope®, McGrath® and CLBI. Increased difficulty was produced by placement of a cervical collar. Residents had a maximum of 3 attempts per device/technique (up to 60 seconds per attempt). The main outcomes were success rate (SR) and corrected time-to-intubation (cTTI, with 60 seconds added for each failed attempt). Subgroup analyses were performed separating residents according to their experience (junior, n = 60; senior, n = 36). The CLBI had a significantly lower SR at both the 1st and 3rd attempts (31% and 64%, respectively) compared with DL (93% and 98%), Glidescope® (70% and 86%) and McGrath® (58% and 84%), all p < 0.001. Moreover, CLBI had a significantly longer cTTI (158.5 seconds (54.3; 180)) than the other devices: Glidescope® (37.6 seconds (24.7; 88.2)), McGrath® (39.3 seconds (20.6; 105.1)) and DL (19 seconds (15.4; 27.2)), all p < 0.002. CLBI and McGrath® were the only approaches that performed better in senior than in junior residents. In a simulated DAM setting, anesthesiology residents had a lower SR and longer cTTI with CLBI compared with direct and video-laryngoscopy.
Use of combined laryngo-bronchoscopy intubation approach in a simulated difficult airway scenario with cervical stabilization. Signa Vitae, 2023. doi: 10.22514/sv.2023.073
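The corrected time-to-intubation (cTTI) defined above adds a 60-second penalty for each failed attempt before the successful one. A small sketch (illustrative timings, not the study's raw data):

```python
MAX_ATTEMPT_S = 60  # each attempt was capped at 60 seconds

def corrected_tti(attempt_times_s, success_on_last):
    """cTTI: duration of the successful attempt plus 60 s for every
    failed attempt before it; an all-failed series counts 60 s per attempt."""
    if not success_on_last:
        return MAX_ATTEMPT_S * len(attempt_times_s)
    *failed, final = attempt_times_s
    return MAX_ATTEMPT_S * len(failed) + final

# First-pass success at 19 s vs. success on the third attempt at 40 s:
print(corrected_tti([19], True), corrected_tti([60, 60, 40], True))  # 19 160
```

This penalty is why devices with low first-attempt success, like CLBI here, accumulate much longer cTTI values than their raw attempt times suggest.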
Multiple Chemical Sensitivity (MCS) is a pathological condition of which we do not yet have a clear etiological and clinical understanding. The underlying mechanisms of the disorder are still being investigated, and the symptoms most frequently reported by patients are malaise, fatigue, headache, arthralgia, insomnia and dermatitis. Although this condition may entail a real risk of adverse reactions following exposure to many substances, often inhaled, or the taking of drugs, the medical/scientific literature provides little information on the safest course of action when patients affected by MCS need to undergo anesthesia. For this reason, an electronic search of the existing literature was conducted, using PubMed and Scopus as primary sources, to find articles about patients affected by MCS who had undergone anesthesia. The time frame considered was January 2000–December 2022. The search identified only 13 articles dealing with anesthesia in patients with multiple chemical sensitivity in the years in question. Only 6 works, all case reports, described the drugs used to perform anesthesia: five cases involved general anesthesia and one a subarachnoid block. No major complications related to anesthesiological practice were reported in any of the cases. The limited data do not enable identification of the anesthesiological practices and anesthetic drugs that can be used most safely in MCS patients, but the absence of serious adverse reactions in the case reports described, and in the literature in general, is reasonably reassuring that anesthesia can be performed in MCS patients without serious complications by implementing easily achievable measures.
Anesthesia in patients with multiple chemical sensitivity: current understanding. Signa Vitae, 2023. doi: 10.22514/sv.2023.096
The effects of synbiotics on the gut microbiota have not been thoroughly clarified in critically ill patients with sepsis. In the present study, we aimed to evaluate the effects of synbiotics in a commercial diet on the gut microbiota of mechanically ventilated septic patients. This double-blind, randomized controlled clinical trial was conducted on septic patients under mechanical ventilation in a university-affiliated hospital in southern Thailand from February 2019 to March 2021. The patients were randomly divided into two groups stratified by sepsis stage and given commercial enteral feeding with synbiotics or standard commercial feeding for 7 days. The primary outcome was fecal microbial diversity, measured as alpha and beta diversity. The secondary outcomes included ventilator-associated pneumonia, nosocomial diarrhea, ventilator days, length of hospital stay, and mortality. Twenty-four patients, 12 on a synbiotic diet and 12 on a non-synbiotic diet, completed this study. On day 3 of feeding, no significant difference was observed in alpha fecal microbial diversity. However, significantly greater beta diversity was observed in the non-synbiotic group compared with the synbiotic group (Bray-Curtis distance, p = 0.001; Jaccard's distance, p = 0.001; unweighted UniFrac, p = 0.001; weighted UniFrac, p = 0.029). The secondary outcomes were not significantly different between the two groups. In critically ill septic patients, feeding with a commercial diet containing synbiotics did not significantly improve fecal microbial diversity. Due to the small sample size, further study is required.
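The Bray-Curtis and Jaccard distances reported above are standard beta-diversity measures comparing taxon profiles between samples. A minimal sketch of both, using made-up taxon-count vectors purely for illustration (the study itself would have used a standard microbiome pipeline on sequencing data):

```python
def bray_curtis(x, y):
    """Bray-Curtis dissimilarity: sum|xi - yi| / sum(xi + yi); 0 = identical, 1 = disjoint."""
    num = sum(abs(a - b) for a, b in zip(x, y))
    den = sum(a + b for a, b in zip(x, y))
    return num / den if den else 0.0

def jaccard(x, y):
    """Binary Jaccard distance on presence/absence of taxa."""
    present_x = {i for i, a in enumerate(x) if a > 0}
    present_y = {i for i, b in enumerate(y) if b > 0}
    union = present_x | present_y
    if not union:
        return 0.0
    return 1 - len(present_x & present_y) / len(union)

# Hypothetical taxon counts for two fecal samples (illustrative values only)
sample_a = [10, 0, 5, 3]
sample_b = [8, 2, 0, 3]
print(bray_curtis(sample_a, sample_b))  # abundance-weighted dissimilarity
print(jaccard(sample_a, sample_b))      # presence/absence dissimilarity
```

Note the design difference the abstract exploits: Bray-Curtis weights shared taxa by abundance, whereas Jaccard only asks whether a taxon is present, so reporting both (plus phylogeny-aware UniFrac) probes different aspects of community structure.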
{"title":"Effects of diets containing synbiotics on the gut microbiota of critically ill septic patients: a pilot randomized controlled trial","authors":"","doi":"10.22514/sv.2023.080","DOIUrl":"https://doi.org/10.22514/sv.2023.080","url":null,"abstract":"The effects of synbiotics on gut microbiota have not been thoroughly clarified in critically ill patients with sepsis. In this present study, we aimed to evaluate the effects of synbiotics in a commercial diet on the gut microbiota of mechanically ventilated septic patients. This double-blind, randomized controlled clinical trial was conducted on septic patients under mechanical ventilation in a university-affiliated hospital in southern Thailand from February 2019 to March 2021. The patients were randomly divided into 2 groups stratified by sepsis stages and given commercial enteral feeding with synbiotics or standard commercial feeding for 7 days. The primary outcome was fecal microbial diversity measured as alpha and beta diversity. The secondary outcomes included ventilator-associated pneumonia, nosocomial diarrhea, ventilator days, length of hospital stay, and mortality. Twenty-four patients, 12 on a synbiotic diet and 12 on a non-synbiotic diet, completed this study. On day 3 of feeding, no significant difference was observed in their alpha fecal microbial diversity. However, significantly greater beta diversity was observed in the non-synbiotics group compared with the synbiotic group (Bray Curtis distance, p = 0.001; Jaccard’s distance, p = 0.001; unweighted UniFrac, p = 0.001; weighted UniFrac, p = 0.029). The secondary outcomes were not significantly different between the two groups. In critically ill septic patients, feeding with a commercial diet containing synbiotics did not significantly improve fecal microbial diversity. 
Due to the small sample size, further study is required.","PeriodicalId":49522,"journal":{"name":"Signa Vitae","volume":"2013 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136298880","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Positive fluid balance is associated with acute kidney injury (AKI) following cardiac surgery in a dose-dependent manner. Although diuresis is a common intervention for fluid overload, the optimal timing of diuretic administration for preventing AKI after cardiac surgery remains unclear. We aimed to investigate whether early administration of diuretics after cardiac surgery is associated with subsequent AKI progression. This was a post-hoc analysis of a multicenter retrospective cohort study that included adult patients admitted to 14 intensive care units (ICUs) after elective cardiac surgery between January and December 2018. The exposure variable was the administration of intravenous diuretics during the initial 24 hours after ICU admission. The primary outcome was AKI progression, defined as an increase of one or more AKI stages, according to the Kidney Disease: Improving Global Outcomes creatinine and urine output criteria, between 24 and 72 hours compared with the worst stage during the first 24 hours. We used multivariable logistic regression analyses to assess the association between early administration of diuretics and AKI progression. Among the 718 patients analyzed, 335 (47%) received intravenous diuretics within the first 24 hours, and AKI progression occurred in 115 patients (16%). In the multivariable analyses, early diuresis was not associated with AKI progression (odds ratio, 1.12; 95% confidence interval, 0.74–1.69), a finding confirmed by sensitivity analyses. Early administration of intravenous diuretics was not associated with a lower risk of AKI progression after cardiac surgery.
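The odds ratio and 95% confidence interval quoted above come from exponentiating a logistic-regression coefficient and its Wald interval. A minimal sketch of that conversion; the standard error below is a hypothetical value back-calculated to roughly reproduce the abstract's interval, not the study's actual model output:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Exponentiate a log-odds coefficient and its Wald 95% CI into an odds ratio."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

beta = math.log(1.12)  # log-odds coefficient implied by the reported OR of 1.12
se = 0.21              # hypothetical standard error chosen for illustration
or_, lo, hi = odds_ratio_ci(beta, se)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Because the interval (roughly 0.74 to 1.69) straddles 1.0, the association is not statistically significant, which is exactly the abstract's conclusion that early diuresis was not associated with AKI progression.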
{"title":"The effect of early diuretics administration on acute kidney injury progression after cardiac surgery: a post-hoc analysis of a multicenter retrospective cohort study (BROTHER study)","authors":"","doi":"10.22514/sv.2023.112","DOIUrl":"https://doi.org/10.22514/sv.2023.112","url":null,"abstract":"Positive fluid balance is associated with acute kidney injury (AKI) following cardiac surgery in a dose-dependent manner. Although diuresis is a common intervention for fluid overload, the optimal timing of diuretic administration for preventing AKI after cardiac surgery remains unclear. We aimed to investigate whether early administration of diuretics after cardiac surgery is associated with subsequent AKI progression. This was a post-hoc analysis of a multicenter retrospective cohort study that included adult patients admitted to 14 intensive care units (ICUs) after elective cardiac surgery between January and December 2018. The exposure variable was the administration of intravenous diuretics during the initial 24 hours after ICU admission. The primary outcome was AKI progression, defined as one or more AKI stages using Kidney Disease: Improving Global Outcomes creatinine and urine output criteria between 24 and 72 hours compared with the worst stage during the first 24 hours. We used multivariable logistic regression analyses to assess the association between early administration of diuretics and AKI progression. Among the 718 patients analyzed, 335 (47%) received intravenous diuretics within the first 24 hours, and AKI progression occurred in 115 patients (16%). In the multivariable analyses, early diuresis was not associated with AKI progression (odds ratio, 1.12; 95% confidence interval, 0.74–1.69), confirmed by sensitivity analyses. 
Early administration of intravenous diuretics was not associated with a lower risk of AKI progression after cardiac surgery.","PeriodicalId":49522,"journal":{"name":"Signa Vitae","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135559781","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}