Legal and Regulatory Aspects of Medical Cannabis in the United States
Pub Date: 2023-12-15 | DOI: 10.1213/ane.0000000000006301
Genewoo Hong, Alexandra Sideris, Seth Waldman, Joe Stauffer, Christopher L. Wu
…have been removed from schedule I. At the state level, a majority of states have passed laws legalizing cannabis in some form, although these laws vary from state to state in terms of the extent to which use is permitted, approved medical uses, and the types of regulation placed on commercial activity and quality control. This inconsistency has contributed to uncertainty among medical providers and their patients. In this review, we provide a brief account of the evolution and current state of federal and state laws and regulatory agencies involved in overseeing medical cannabis use in the United States. …
{"title":"Legal and Regulatory Aspects of Medical Cannabis in the United States","authors":"Genewoo Hong, Alexandra Sideris, Seth Waldman, Joe Stauffer, Christopher L. Wu","doi":"10.1213/ane.0000000000006301","DOIUrl":"https://doi.org/10.1213/ane.0000000000006301","url":null,"abstract":" have been removed from schedule I. At the state level, a majority of states have passed laws legalizing cannabis in some form, although these laws vary from state to state in terms of the extent to which use is permitted, approved medical uses, and the types of regulation placed on commercial activity and quality control. This inconsistency has contributed to uncertainty among medical providers and their patients. In this review, we provide a brief account of the evolution and current state of federal and state laws and regulatory agencies involved in overseeing medical cannabis use in the United States....","PeriodicalId":7799,"journal":{"name":"Anesthesia & Analgesia","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2023-12-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138657505","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Thoughtfully Integrating Cannabis Products Into Chronic Pain Treatment
Pub Date: 2023-12-15 | DOI: 10.1213/ane.0000000000005904
Kevin F. Boehnke, Christopher L. Wu, Daniel J. Clauw
…mechanistic plausibility that CPs and CBMs may be useful for pain management, the clinical trial literature is limited and neither refutes nor supports the use of CBMs for pain management. Complicating matters, a large and growing body of observational literature shows that many people use CPs for pain management and in place of other medications. However, products and dosing regimens in existing trials are not generalizable to the current cannabis market, making it difficult to compare and reconcile these 2 bodies of literature. Given this complexity, clinicians need clear, pragmatic guidance on how to appropriately educate and work with patients who are using CBMs for pain management. In this review, we narratively synthesize the evidence to enable a clear view of the current landscape and provide pragmatic advice for clinicians to use when working with patients. This advice revolves around 3 principles: (1) maintaining the therapeutic alliance; (2) harm reduction and benefit maximization; and (3) pragmatism, principles of patient-centered care, and use of best clinical judgment in the face of uncertainty. Despite the lack of certainty surrounding CPs and their use in chronic pain management, we believe that following these principles can make the most of the clinical opportunity presented by discussions around CPs and also enhance the likelihood of clinical benefit from CPs. …
{"title":"Thoughtfully Integrating Cannabis Products Into Chronic Pain Treatment","authors":"Kevin F. Boehnke, Christopher L. Wu, Daniel J. Clauw","doi":"10.1213/ane.0000000000005904","DOIUrl":"https://doi.org/10.1213/ane.0000000000005904","url":null,"abstract":"mechanistic plausibility that CPs and CBMs may be useful for pain management, the clinical trial literature is limited and does not refute or support the use of CBMs for pain management. Complicating matters, a large and growing body of observational literature shows that many people use CPs for pain management and in place of other medications. However, products and dosing regimens in existing trials are not generalizable to the current cannabis market, making it difficult to compare and reconcile these 2 bodies of literature. Given this complexity, clinicians need clear, pragmatic guidance on how to appropriately educate and work with patients who are using CBMs for pain management. In this review, we narratively synthesize the evidence to enable a clear view of current landscape and provide pragmatic advice for clinicians to use when working with patients. This advice revolves around 3 principles: (1) maintaining the therapeutic alliance; (2) harm reduction and benefit maximization; and (3) pragmatism, principles of patient-centered care, and use of best clinical judgment in the face of uncertainty. Despite the lack of certainty CPs and chronic pain management use, we believe that following these principles can make most of the clinical opportunity presented by discussions around CPs and also enhance the likelihood of clinical benefit from CPs....","PeriodicalId":7799,"journal":{"name":"Anesthesia & Analgesia","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2023-12-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138657530","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pro-Con Debate: Do We Need Quantitative Neuromuscular Monitoring in the Era of Sugammadex?
Pub Date: 2022-06-16 | DOI: 10.1213/ANE.0000000000005925
M. Blobner, M. Hollmann, M. Luedi, Ken Johnson
In this Pro-Con article, we debate the merits of quantitative neuromuscular blockade monitoring. Consensus guidelines recommend its use to guide the administration of nondepolarizing neuromuscular blocking and reversal agents. A major impediment to following this guideline is that, until recently, reliable quantitative neuromuscular blockade monitors have not been widely available. Without them, anesthesia providers have been trained with, and are adept at using, a variety of qualitative neuromuscular blockade monitors, otherwise known as peripheral nerve stimulators. Although these devices are perhaps less accurate, anesthesia providers find them reliable and easy to use. They have a long track record of using them, with the perception that their use leads to effective neuromuscular blockade reversal and minimizes clinically significant adverse events from residual neuromuscular blockade. In the recent past, 2 disruptive developments have called on anesthesia care providers to reconsider their practice in neuromuscular blockade administration, reversal, and monitoring: (1) commercialization of more reliable quantitative neuromuscular monitors and (2) widespread use of sugammadex, a versatile reversal agent of neuromuscular blockade. Sugammadex appears to be so effective at rapidly reversing even the deepest neuromuscular blockades that it has left anesthesia providers wondering whether quantitative monitoring is necessary or whether conventional, familiar, and less expensive qualitative monitoring will suffice. This Pro-Con debate contrasts anesthesia provider perceptions with the evidence surrounding the use of quantitative neuromuscular blockade monitors to explore whether quantitative neuromuscular monitoring (NMM) is just another technology solution looking for a problem or a significant advance in NMM that will improve patient safety and outcomes.
{"title":"Pro-Con Debate: Do We Need Quantitative Neuromuscular Monitoring in the Era of Sugammadex?","authors":"M. Blobner, M. Hollmann, M. Luedi, Ken Johnson","doi":"10.1213/ANE.0000000000005925","DOIUrl":"https://doi.org/10.1213/ANE.0000000000005925","url":null,"abstract":"In this Pro-Con article, we debate the merits of using quantitative neuromuscular blockade monitoring. Consensus guidelines recommend their use to guide the administration of nondepolarizing neuromuscular blockade and reversal agents. A major impediment to this guideline is that until recently, reliable quantitative neuromuscular blockade monitors have not been widely available. Without them, anesthesia providers have been trained with and are adept at using a variety of qualitative neuromuscular blockade monitors otherwise known as peripheral nerve stimulators. Although perhaps less accurate, anesthesia providers find them reliable and easy to use. They have a long track record of using them with the perception that their use leads to effective neuromuscular blockade reversal and minimizes clinically significant adverse events from residual neuromuscular blockade. In the recent past, 2 disruptive developments have called upon anesthesia care providers to reconsider their practice in neuromuscular blockade administration, reversal, and monitoring. These include: (1) commercialization of more reliable quantitative neuromuscular monitors and (2) widespread use of sugammadex, a versatile reversal agent of neuromuscular blockade. Sugammadex appears to be so effective at rapidly and effectively reversing even the deepest of neuromuscular blockades, and it has left anesthesia providers wondering whether quantitative monitoring is indeed necessary or whether conventional, familiar, and less expensive qualitative monitoring will suffice? This Pro-Con debate will contrast anesthesia provider perceptions with evidence surrounding the use of quantitative neuromuscular blockade monitors to explore whether quantitative neuromuscular monitoring (NMM) is just another technology solution looking for a problem or a significant advance in NMM that will improve patient safety and outcomes.","PeriodicalId":7799,"journal":{"name":"Anesthesia & Analgesia","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2022-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79540796","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
It’s Not Just the Prices: Time-Driven Activity-Based Costing for Initiation of Veno-Venous Extracorporeal Membrane Oxygenation at Three International Sites—A Case Review
Pub Date: 2022-06-01 | DOI: 10.1213/ANE.0000000000006074
M. Nurok, V. Pellegrino, M. Pineton de Chambrun, J. Warsh, M. Young, E. Dong, N. Parrish, S. Shehab, A. Combes, R. Kaplan
The United States spends more on intensive care units (ICUs) than other high-income countries do. We used time-driven activity-based costing (TDABC) to analyze ICU costs for initiation of veno-venous extracorporeal membrane oxygenation (VV ECMO) for respiratory failure, to estimate how much of the higher ICU costs at 1 US site can be attributed to the higher prices paid to ICU personnel and how much is caused by the US site’s use of a higher-cost staffing model. We accompanied our TDABC approach with a narrative review of the ECMO programs at Cedars-Sinai (Los Angeles), Hôpital Pitié-Salpêtrière (Paris), and The Alfred Hospital (Melbourne) from 2017 to 2019. Our primary outcome was daily ECMO cost, and we hypothesized that cost differences among the hospitals could be explained by the efficiencies and skill mix of the involved clinicians and the prices paid for personnel, equipment, and consumables. Our results are presented relative to Los Angeles’ total personnel cost per VV ECMO patient day, indexed at 100. Los Angeles’ total indexed daily cost of care was 147 (personnel: 100, durables: 5, and disposables: 42). Paris’ total cost was 39 (26% of Los Angeles; personnel: 12, durables: 1, and disposables: 26). Melbourne’s total cost was 53 (36% of Los Angeles; personnel: 32, durables: 2, and disposables: 19) (rounded). The higher personnel prices at Los Angeles explained only 26% of its much higher personnel costs relative to Paris and 21% relative to Melbourne. Los Angeles’ higher staffing levels accounted for 49% (36%), and its costlier mix of personnel accounted for 12% (10%), of its higher personnel costs relative to Paris (Melbourne). Unadjusted discharge rates for ECMO patients were 46% in Los Angeles, 56% in Paris, and 52% in Melbourne. We found that personnel salaries explained only 30% of the higher personnel costs at 1 Los Angeles hospital. Most of the cost differential was caused by personnel staffing intensity and mix. This study demonstrates how TDABC may be used in ICU administration to quantify the savings that 1 US hospital could achieve by delivering the same quality of care with a smaller and less costly mix of clinicians, compared with a French and an Australian site. Narrative reviews contextualized how the care models evolved at each site and helped identify potential barriers to change.
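The indexing convention in this abstract (all components expressed relative to Los Angeles’ personnel cost per patient-day, set to 100) can be illustrated with a minimal sketch. The raw dollar figures below are hypothetical placeholders chosen only so that the resulting indices match the values reported in the abstract; they are not data from the study.

```python
# Minimal sketch of the cost-indexing arithmetic described in the abstract.
# Raw per-patient-day costs are hypothetical placeholders, not study data.
raw_daily_costs = {
    "Los Angeles": {"personnel": 10000.0, "durables": 500.0, "disposables": 4200.0},
    "Paris":       {"personnel": 1200.0,  "durables": 100.0, "disposables": 2600.0},
    "Melbourne":   {"personnel": 3200.0,  "durables": 200.0, "disposables": 1900.0},
}

# Los Angeles' personnel cost per VV ECMO patient-day is the reference, indexed at 100.
reference = raw_daily_costs["Los Angeles"]["personnel"]

for site, components in raw_daily_costs.items():
    indexed = {name: round(cost / reference * 100) for name, cost in components.items()}
    total = sum(indexed.values())
    print(site, indexed, "total:", total)
```

Run as written, this reproduces the indexed totals quoted above (147, 39, and 53).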
{"title":"It’s Not Just the Prices: Time-Driven Activity-Based Costing for Initiation of Veno-Venous Extracorporeal Membrane Oxygenation at Three International Sites—A Case Review","authors":"M. Nurok, V. Pellegrino, M. Pineton de Chambrun, J. Warsh, M. Young, E. Dong, N. Parrish, S. Shehab, A. Combes, R. Kaplan","doi":"10.1213/ANE.0000000000006074","DOIUrl":"https://doi.org/10.1213/ANE.0000000000006074","url":null,"abstract":"The United States spends more for intensive care units (ICUs) than do other high-income countries. We used time-driven activity-based costing (TDABC) to analyze ICU costs for initiation of veno-venous extracorporeal membrane oxygenation (VV ECMO) for respiratory failure to estimate how much of the higher ICU costs at 1 US site can be attributed to the higher prices paid to ICU personnel, and how much is caused by the US site’s use of a higher cost staffing model. We accompanied our TDABC approach with narrative review of the ECMO programs, at Cedars-Sinai (Los Angeles), Hôpital Pitié-Salpêtrière (Paris), and The Alfred Hospital (Melbourne) from 2017 to 2019. Our primary outcome was daily ECMO cost, and we hypothesized that cost differences among the hospitals could be explained by the efficiencies and skill mix of involved clinicians and prices paid for personnel, equipment, and consumables. Our results are presented relative to Los Angeles’ total personnel cost per VV ECMO patient day, indexed at 100. Los Angeles’ total indexed daily cost of care was 147 (personnel: 100, durables: 5, and disposables: 42). Paris’ total cost was 39 (26% of Los Angeles) (personnel: 12, durables: 1, and disposables: 26). Melbourne’s total cost was 53 (36% of Los Angeles) (personnel: 32, durables: 2, and disposables: 19) (rounded). The higher personnel prices at Los Angeles explained only 26% of its much higher personnel costs than Paris, and 21% relative to Melbourne. Los Angeles’ higher staffing levels accounted for 49% (36%), and its costlier mix of personnel accounted for 12% (10%) of its higher personnel costs relative to Paris (Melbourne). Unadjusted discharge rates for ECMO patients were 46% in Los Angeles (46%), 56% in Paris, and 52% in Melbourne. We found that personnel salaries explained only 30% of the higher personnel costs at 1 Los Angeles hospital. Most of the cost differential was caused by personnel staffing intensity and mix. This study demonstrates how TDABC may be used in ICU administration to quantify the savings that 1 US hospital could achieve by delivering the same quality of care with fewer and less-costly mix of clinicians compared to a French and Australian site. Narrative reviews contextualized how the care models evolved at each site and helped identify potential barriers to change.","PeriodicalId":7799,"journal":{"name":"Anesthesia & Analgesia","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2022-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78537995","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Oddities in the Evolution of Syringes in Anesthesia
Pub Date: 2022-05-17 | DOI: 10.1213/ANE.0000000000006084
F. Wiepking, A. Van Zundert
Many procedures in science and medicine involve the use of a syringe, and its invention is a key milestone in the history of general and regional anesthesia. The end of the 19th century brought major changes in syringe production. An industry that initially crafted syringes by hand to individual physicians’ instructions saw the introduction of a large variety of syringes, sometimes with odd and unique modifications. For many of these unique syringes, there was no evidence that the modifications were effective or safe to use. This article provides examples of “odd” syringe designs for use in medicine, general anesthesia, and regional anesthesia. Some designs proved functional and have stood the test of time; others quickly disappeared and ended up in dusty collections.
{"title":"Oddities in the Evolution of Syringes in Anesthesia","authors":"F. Wiepking, A. Van Zundert","doi":"10.1213/ANE.0000000000006084","DOIUrl":"https://doi.org/10.1213/ANE.0000000000006084","url":null,"abstract":"Many procedures in science and medicine involve the use of a syringe, and its invention is a key milestone in general and regional anesthesia history. The end of the 19th century brought major changes in syringe production. An industry that initially manually crafted syringes to individual physicians’ instructions saw the introduction of a large variety of syringes, sometimes with odd and unique modifications. For many of these unique syringes, there was no proven evidence that these modifications were effective or safe to use. This article provides examples of “odd” syringe designs for use in medicine, general anesthesia, and regional anesthesia. Some designs proved functional and have stood the test of time; others quickly disappeared and ended up in dusty collections.","PeriodicalId":7799,"journal":{"name":"Anesthesia & Analgesia","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2022-05-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83201310","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Gabapentinoid Use Is Associated With Reduced Occurrence of Hyperactive Delirium in Older Cancer Patients Undergoing Chemotherapy: A Nationwide Retrospective Cohort Study in Japan
Pub Date: 2022-05-13 | DOI: 10.1213/ANE.0000000000006093
H. Abe, M. Sumitani, H. Matsui, R. Inoue, Mitsuru Konishi, K. Fushimi, K. Uchida, H. Yasunaga
BACKGROUND: It is unclear whether gabapentinoids affect the development of delirium. We aimed to determine the association between gabapentinoid use and hyperactive delirium in older cancer patients undergoing chemotherapy. METHODS: We conducted propensity score-matched analyses using data from a nationwide inpatient database in Japan. We included cancer patients aged ≥70 years with pain who were undergoing chemotherapy between April 2016 and March 2018. Patients receiving gabapentinoids were matched with control patients using propensity scores. The primary outcome was occurrence of hyperactive delirium during hospitalization, and the secondary outcomes were length of hospital stay, in-hospital fractures, and in-hospital mortality. Hyperactive delirium was identified by antipsychotic use or by discharge diagnoses from the International Classification of Diseases, 10th Revision. RESULTS: Among 143,132 identified patients (59% men; mean age, 76.3 years), 14,174 (9.9%) received gabapentinoids and 128,958 (90.1%) did not (control group). After one-to-one propensity score matching, 14,173 patients were included in each group. The occurrence of hyperactive delirium was significantly lower (5.2% vs 8.5%; difference in percent, −3.2% [95% confidence interval, −3.8 to −2.6]; odds ratio, 0.60 [0.54–0.66]; P < .001), the median length of hospital stay was significantly shorter (6 days [interquartile range, 3–15] vs 9 days [4–17]; subdistribution hazard ratio, 1.22 [1.19–1.25]; P < .001), and the occurrence of in-hospital mortality was significantly lower in the gabapentinoid group than in the control group (1.3% vs 1.8%; difference in percent, −0.6% [−0.9 to −0.3]; odds ratio, 0.69 [0.57–0.83]; P < .001). Gabapentinoid use was not significantly associated with the occurrence of in-hospital fractures (0.2% vs 0.2%; difference in percent, 0.0% [−0.1 to 0.1]; odds ratio, 1.07 [0.65–1.76]; P = .799). The results of sensitivity analyses using stabilized inverse probability of treatment weighting were consistent with the results of the propensity score-matched analyses. CONCLUSIONS: Our findings suggest that gabapentinoid use is associated with reduced hyperactive delirium in older cancer patients undergoing chemotherapy, with no evidence of an increase in the fracture rate, length of hospital stay, or in-hospital death.
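For readers unfamiliar with the matching step described in the Methods, the sketch below shows one common way to perform one-to-one propensity score matching: a logistic regression propensity model followed by greedy nearest-neighbor matching without replacement. It is not the authors' code; the covariates, the caliper, and the simulated data are illustrative assumptions only.

```python
# Minimal sketch of 1:1 propensity score matching (logistic model + greedy
# nearest-neighbor matching within a caliper). Data and settings are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
# Hypothetical covariates: age, sex, and a baseline severity score.
X = np.column_stack([rng.normal(76, 6, n), rng.integers(0, 2, n), rng.normal(0, 1, n)])
treated = rng.integers(0, 2, n)  # 1 = received gabapentinoids (simulated exposure)

# 1) Estimate each patient's propensity score.
ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

# 2) Greedy 1:1 matching on the logit of the propensity score, without replacement,
#    within a caliper of 0.2 SD of the logit (a common, but assumed, choice).
logit = np.log(ps / (1 - ps))
caliper = 0.2 * logit.std()
available_controls = {i for i in range(n) if treated[i] == 0}
pairs = []
for i in np.where(treated == 1)[0]:
    if not available_controls:
        break
    j = min(available_controls, key=lambda c: abs(logit[i] - logit[c]))
    if abs(logit[i] - logit[j]) <= caliper:
        pairs.append((i, j))
        available_controls.remove(j)

print(f"matched pairs: {len(pairs)}")
```

Outcomes (here, hyperactive delirium) would then be compared within the matched pairs, as the abstract describes.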
{"title":"Gabapentinoid Use Is Associated With Reduced Occurrence of Hyperactive Delirium in Older Cancer Patients Undergoing Chemotherapy: A Nationwide Retrospective Cohort Study in Japan","authors":"H. Abe, M. Sumitani, H. Matsui, R. Inoue, Mitsuru Konishi, K. Fushimi, K. Uchida, H. Yasunaga","doi":"10.1213/ANE.0000000000006093","DOIUrl":"https://doi.org/10.1213/ANE.0000000000006093","url":null,"abstract":"BACKGROUND: It is unclear whether gabapentinoids affect the development of delirium. We aimed to determine the association between gabapentinoid use and hyperactive delirium in older cancer patients undergoing chemotherapy. METHODS: We conducted propensity score-matched analyses using data from a nationwide inpatient database in Japan. We included cancer patients with pain ≥70 years of age undergoing chemotherapy between April 2016 and March 2018. Patients receiving gabapentinoids were matched with control patients using propensity scores. The primary outcome was occurrence of hyperactive delirium during hospitalization, and the secondary outcomes were length of hospital stay, in-hospital fractures, and in-hospital mortality. Hyperactive delirium was identified by antipsychotic use or discharge diagnoses from the International Classification of Diseases, 10th Revision. RESULTS: Among 143,132 identified patients (59% men; mean age, 76.3 years), 14,174 (9.9%) received gabapentinoids and 128,958 (90.1%) did not (control group). After one-to-one propensity score matching, 14,173 patients were included in each group. The occurrence of hyperactive delirium was significantly lower (5.2% vs 8.5%; difference in percent, −3.2% [95% confidence interval, −3.8 to −2.6]; odds ratio, 0.60 [0.54–0.66]; P < .001), the median length of hospital stay was significantly shorter (6 days [interquartile range, 3–15] vs 9 days [4–17]; subdistribution hazard ratio, 1.22 [1.19–1.25]; P < .001), and the occurrence of in-hospital mortality was significantly lower in the gabapentinoid group than in the control group (1.3% vs 1.8%; difference in percent, −0.6% [−0.9 to −0.3]; odds ratio, 0.69 [0.57–0.83]; P < .001). Gabapentinoid use was not significantly associated with the occurrence of in-hospital fractures (0.2% vs 0.2%; difference in percent, 0.0% [−0.1 to 0.1]; odds ratio, 1.07 [0.65–1.76]; P = .799). The results of sensitivity analyses using stabilized inverse probability of treatment weighting were consistent with the results of the propensity score-matched analyses. CONCLUSIONS: Our findings suggest that gabapentinoid use is associated with reduced hyperactive delirium in older cancer patients undergoing chemotherapy, with no evidence of an increase in the fracture rate, length of hospital stay, or in-hospital death.","PeriodicalId":7799,"journal":{"name":"Anesthesia & Analgesia","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2022-05-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83191186","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Comparison of Antinociceptive Properties Between Sevoflurane and Desflurane Using Pupillary Dilation Reflex Under Equivalent Minimum Alveolar Concentration: A Randomized Controlled Trial
Pub Date: 2022-05-12 | DOI: 10.1213/ANE.0000000000006079
Soo Yeon Kim, Ji-Yoon Kim, Jonghae Kim, S. Yu, Kwang Hyun Lee, Hyeon Seok Lee, M. S. Oh, Eugene Kim
BACKGROUND: The pupillary dilation reflex (PDR), the change in pupil size after a nociceptive stimulus, has been used to assess antinociception during anesthesia. The aim of this study was to compare the antinociceptive properties of sevoflurane and desflurane by measuring the PDR amplitude. METHODS: Seventy patients between 20 and 55 years of age were randomly allocated to receive either sevoflurane or desflurane. The PDR amplitude after an electrical standardized noxious stimulation (SNT) was measured using an infrared pupillometer under 1.0 minimum alveolar concentration (MAC). The pupil diameter was measured from 5 seconds before to 5 minutes after the SNT. The mean arterial pressure (MAP), heart rate (HR), and bispectral index (BIS) were also measured immediately before and after SNT as well as 1 minute and 5 minutes after SNT. The primary outcome was the maximum percent increase from the prestimulation value of the pupil diameter, and the secondary outcomes were the maximum percent increase from the prestimulation value of the MAP, HR, and BIS after SNT. RESULTS: The maximum percent increase of the pupil diameter after SNT was not different between the 2 groups (median [first quartile to third quartile], 45.1 [29.3–80.3] vs 43.4 [27.0–103.1]; median difference, −0.3 [95% confidence interval, −16.0 to 16.5]; P = .986). Before SNT, the MAP was higher under 1.0 MAC of sevoflurane than desflurane; however, the maximum percent increase of MAP, HR, and BIS was not different between the 2 groups. CONCLUSIONS: The amount of change in the PDR amplitude, MAP, and HR after SNT was not different between sevoflurane and desflurane anesthesia. This result might suggest that sevoflurane and desflurane may not have different antinociceptive properties at equivalent MAC.
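The primary outcome is a simple ratio: the maximum percent increase in pupil diameter relative to its prestimulation value. The sketch below shows that arithmetic under stated assumptions (a mean of the 5-second prestimulation samples as baseline and the peak of the poststimulation samples); the sample values are hypothetical and not study data.

```python
# Minimal sketch of the "maximum percent increase from the prestimulation value"
# calculation for the pupillary dilation reflex. Sample series are hypothetical.
import numpy as np

def max_percent_increase(baseline_values, post_stimulus_values):
    """Maximum percent increase of pupil diameter over the prestimulation baseline."""
    baseline = np.mean(baseline_values)   # e.g., mean diameter over the 5 s before SNT
    peak = np.max(post_stimulus_values)   # largest diameter within 5 min after SNT
    return (peak - baseline) / baseline * 100.0

pre = [2.1, 2.0, 2.1, 2.2]        # mm, hypothetical prestimulation samples
post = [2.3, 2.9, 3.1, 2.8, 2.5]  # mm, hypothetical poststimulation samples
print(f"{max_percent_increase(pre, post):.1f}% increase")
```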
{"title":"Comparison of Antinociceptive Properties Between Sevoflurane and Desflurane Using Pupillary Dilation Reflex Under Equivalent Minimum Alveolar Concentration: A Randomized Controlled Trial","authors":"Soo Yeon Kim, Ji-Yoon Kim, Jonghae Kim, S. Yu, Kwang Hyun Lee, Hyeon Seok Lee, M. S. Oh, Eugene Kim","doi":"10.1213/ANE.0000000000006079","DOIUrl":"https://doi.org/10.1213/ANE.0000000000006079","url":null,"abstract":"BACKGROUND: The pupillary dilation reflex (PDR), the change in pupil size after a nociceptive stimulus, has been used to assess antinociception during anesthesia. The aim of this study was to compare the antinociceptive properties of sevoflurane and desflurane by measuring the PDR amplitude. METHODS: Seventy patients between 20 and 55 years of age were randomly allocated to receive either sevoflurane or desflurane. The PDR amplitude after an electrical standardized noxious stimulation (SNT) was measured using an infrared pupillometer under 1.0 minimum alveolar concentration (MAC). The pupil diameter was measured from 5 seconds before to 5 minutes after the SNT. The mean arterial pressure (MAP), heart rate (HR), and bispectral index (BIS) were also measured immediately before and after SNT as well as 1 minute and 5 minutes after SNT. The primary outcome was the maximum percent increase from the prestimulation value of the pupil diameter, and the secondary outcomes were the maximum percent increase from the prestimulation value of the MAP, HR, and BIS after SNT. RESULTS: The maximum percent increase of the pupil diameter after SNT was not different between the 2 groups (median [first quartile to third quartile], 45.1 [29.3–80.3] vs 43.4 [27.0–103.1]; median difference, −0.3 [95% confidence interval, −16.0 to 16.5]; P = .986). Before SNT, the MAP was higher under 1.0 MAC of sevoflurane than desflurane; however, the maximum percent increase of MAP, HR, and BIS was not different between the 2 groups. CONCLUSIONS: The amount of change in the PDR amplitude, MAP, and HR after SNT was not different between sevoflurane and desflurane anesthesia. This result might suggest that sevoflurane and desflurane may not have different antinociceptive properties at equivalent MAC.","PeriodicalId":7799,"journal":{"name":"Anesthesia & Analgesia","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2022-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74724866","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Society of Cardiovascular Anesthesiologists Clinical Practice Update for Management of Acute Kidney Injury Associated With Cardiac Surgery
Pub Date: 2022-05-12 | DOI: 10.1213/ANE.0000000000006068
Ke Peng, D. McIlroy, B. Bollen, F. Billings, A. Zarbock, W. Popescu, A. Fox, L. Shore-lesserson, Shaofeng Zhou, M. Geube, Fuhai Ji, Meena Bhatia, N. Schwann, A. Shaw, Hong Liu
Cardiac surgery-associated acute kidney injury (CS-AKI) is common and is associated with increased risk for postoperative morbidity and mortality. Our recent survey of the Society of Cardiovascular Anesthesiologists (SCA) membership identified 6 potentially renoprotective strategies for which clinicians would most value an evidence-based review (ie, intraoperative target blood pressure, choice of specific vasopressor agent, erythrocyte transfusion threshold, use of alpha-2 agonists, goal-directed oxygen delivery on cardiopulmonary bypass [CPB], and the “Kidney Disease Improving Global Outcomes [KDIGO] bundle of care”). Thus, the SCA’s Continuing Practice Improvement Acute Kidney Injury Working Group aimed to provide a practice update for each of these strategies in cardiac surgical patients based on the evidence from randomized controlled trials (RCTs). PubMed, EMBASE, and Cochrane Library databases were comprehensively searched for eligible studies from inception through February 2021, with search results updated in August 2021. A total of 15 RCTs investigating the effects of the above-mentioned strategies on CS-AKI were included for meta-analysis. For each strategy, the level of evidence was assessed using the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) methodology. Across the 6 potentially renoprotective strategies evaluated, current evidence for their use was rated as “moderate,” “low,” or “very low.” Based on eligible RCTs, our analysis suggested using goal-directed oxygen delivery on CPB and the “KDIGO bundle of care” in high-risk patients to prevent CS-AKI (moderate level of GRADE evidence). Our results suggested considering the use of vasopressin in vasoplegic shock patients to reduce CS-AKI (low level of GRADE evidence). The decision to use a restrictive versus liberal strategy for perioperative red cell transfusion should not be based on concerns for renal protection (a moderate level of GRADE evidence). In addition, targeting a higher mean arterial pressure during CPB, perioperative use of dopamine, and use of dexmedetomidine did not reduce CS-AKI (a low or very low level of GRADE evidence). This review will help clinicians provide evidence-based care, targeting improved renal outcomes in adult patients undergoing cardiac surgery.
{"title":"Society of Cardiovascular Anesthesiologists Clinical Practice Update for Management of Acute Kidney Injury Associated With Cardiac Surgery","authors":"Ke Peng, D. McIlroy, B. Bollen, F. Billings, A. Zarbock, W. Popescu, A. Fox, L. Shore-lesserson, Shaofeng Zhou, M. Geube, Fuhai Ji, Meena Bhatia, N. Schwann, A. Shaw, Hong Liu","doi":"10.1213/ANE.0000000000006068","DOIUrl":"https://doi.org/10.1213/ANE.0000000000006068","url":null,"abstract":"Cardiac surgery-associated acute kidney injury (CS-AKI) is common and is associated with increased risk for postoperative morbidity and mortality. Our recent survey of the Society of Cardiovascular Anesthesiologists (SCA) membership showed 6 potentially renoprotective strategies for which clinicians would most value an evidence-based review (ie, intraoperative target blood pressure, choice of specific vasopressor agent, erythrocyte transfusion threshold, use of alpha-2 agonists, goal-directed oxygen delivery on cardiopulmonary bypass [CPB], and the “Kidney Disease Improving Global Outcomes [KDIGO] bundle of care”). Thus, the SCA’s Continuing Practice Improvement Acute Kidney Injury Working Group aimed to provide a practice update for each of these strategies in cardiac surgical patients based on the evidence from randomized controlled trials (RCTs). PubMed, EMBASE, and Cochrane library databases were comprehensively searched for eligible studies from inception through February 2021, with search results updated in August 2021. A total of 15 RCTs investigating the effects of the above-mentioned strategies on CS-AKI were included for meta-analysis. For each strategy, the level of evidence was assessed using the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) methodology. Across the 6 potentially renoprotective strategies evaluated, current evidence for their use was rated as “moderate,” “low,” or “very low.” Based on eligible RCTs, our analysis suggested using goal-directed oxygen delivery on CPB and the “KDIGO bundle of care” in high-risk patients to prevent CS-AKI (moderate level of GRADE evidence). Our results suggested considering the use of vasopressin in vasoplegic shock patients to reduce CS-AKI (low level of GRADE evidence). The decision to use a restrictive versus liberal strategy for perioperative red cell transfusion should not be based on concerns for renal protection (a moderate level of GRADE evidence). In addition, targeting a higher mean arterial pressure during CPB, perioperative use of dopamine, and use of dexmedetomidine did not reduce CS-AKI (a low or very low level of GRADE evidence). This review will help clinicians provide evidence-based care, targeting improved renal outcomes in adult patients undergoing cardiac surgery.","PeriodicalId":7799,"journal":{"name":"Anesthesia & Analgesia","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2022-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"72840753","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Platelet Transfusion and Outcomes After Massive Transfusion Protocol Activation for Major Trauma: A Retrospective Cohort Study
Pub Date: 2022-05-06 | DOI: 10.1213/ANE.0000000000005982
Pudkrong Aichholz, S. A. Lee, Carly K Farr, Hamilton C Tsang, M. Vavilala, L. Stansbury, J. Hess
BACKGROUND: Incorporation of massive transfusion protocols (MTPs) into acute major trauma care has reduced hemorrhagic mortality, but the threshold and timing of platelet transfusion in MTP are controversial. This study aimed to describe early (first 4 hours) platelet transfusion practice in a setting where platelet counts are available within 15 minutes and the effect of early platelet deployment on in-hospital mortality. Our hypothesis was that platelet transfusion in resuscitation of severe trauma can be guided by rapid-turnaround platelet counts without excess mortality. METHODS: We examined MTP activations for all admissions with full trauma team activation to a Level 1 regional trauma center from October 2016 to September 2018. We characterized platelet transfusion practice by demographics, injury severity, admission vital signs (as shock index: heart rate/systolic blood pressure), and laboratory results. A multivariable model assessed the association between early platelet transfusion and mortality at 4 hours, 24 hours, and overall in-hospital, with statistical significance defined as P < .001. RESULTS: Of the 11,474 new trauma patients admitted over the study period, 469 (4.0%) were massively transfused (defined as ≥10 units of red blood cells [RBCs] in 24 hours, ≥5 units of RBC in 6 hours, ≥3 units of RBC in 1 hour, or ≥4 units of total products in 30 minutes). A total of 250 patients (53.0%) received platelets in the first 4 hours, and most early platelet transfusions occurred in the first hour after admission (175, 70.0%). Platelet recipients had higher injury severity scores (mean ± standard deviation [SD], 35 ± 16 vs 28 ± 14), lower admission platelet counts (189 ± 80 × 10⁹/L vs 234 ± 80 × 10⁹/L; P < .001), higher admission shock index (heart rate/systolic blood pressure; 1.15 ± 0.46 vs 0.98 ± 0.36; P < .001), and received more units of red cells in the first 4 hours (8.7 ± 7.7 vs 3.3 ± 1.6 units), 24 hours (9 ± 9 vs 3 ± 2 units), and in-hospital (9 ± 8 vs 3 ± 2 units) than nonrecipients (all P < .001). We saw no difference in 4-hour (8% vs 7.8%; P = .4), 24-hour (16.4% vs 10.5%; P = .06), or in-hospital mortality (30.4% vs 23.7%; P = .1) between platelet recipients and nonrecipients. After adjustment for age, injury severity, head injury, and admission physiology/laboratory results, early platelet transfusion was not associated with 4-hour, 24-hour, or in-hospital mortality. CONCLUSIONS: In an advanced trauma care setting where platelet counts are available within 15 minutes, approximately half of massively transfused patients received early platelet transfusion. Early platelet transfusion guided by protocol-based clinical judgment and rapid-turnaround platelet counts was not associated with increased mortality.
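Two definitions quoted in this abstract lend themselves to a short worked example: the shock index (heart rate divided by systolic blood pressure) and the massive-transfusion criteria (≥10 RBC units in 24 hours, ≥5 in 6 hours, ≥3 in 1 hour, or ≥4 total products in 30 minutes). The sketch below simply encodes those definitions; the field names and the example record are illustrative assumptions, not study data.

```python
# Minimal sketch of the shock index and massive-transfusion definitions from the abstract.

def shock_index(heart_rate: float, systolic_bp: float) -> float:
    """Shock index = heart rate / systolic blood pressure."""
    return heart_rate / systolic_bp

def is_massive_transfusion(rbc_24h: int, rbc_6h: int, rbc_1h: int,
                           total_products_30min: int) -> bool:
    """Apply the four alternative thresholds quoted in the abstract."""
    return (rbc_24h >= 10 or rbc_6h >= 5 or rbc_1h >= 3
            or total_products_30min >= 4)

print(shock_index(heart_rate=115, systolic_bp=100))  # 1.15, the platelet-group mean reported above
print(is_massive_transfusion(rbc_24h=4, rbc_6h=4, rbc_1h=2, total_products_30min=5))  # True via the 30-min rule
```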
{"title":"Platelet Transfusion and Outcomes After Massive Transfusion Protocol Activation for Major Trauma: A Retrospective Cohort Study","authors":"Pudkrong Aichholz, S. A. Lee, Carly K Farr, Hamilton C Tsang, M. Vavilala, L. Stansbury, J. Hess","doi":"10.1213/ANE.0000000000005982","DOIUrl":"https://doi.org/10.1213/ANE.0000000000005982","url":null,"abstract":"BACKGROUND: Incorporation of massive transfusion protocols (MTPs) into acute major trauma care has reduced hemorrhagic mortality, but the threshold and timing of platelet transfusion in MTP are controversial. This study aimed to describe early (first 4 hours) platelet transfusion practice in a setting where platelet counts are available within 15 minutes and the effect of early platelet deployment on in-hospital mortality. Our hypothesis in this work was that platelet transfusion in resuscitation of severe trauma can be guided by rapid turnaround platelet counts without excess mortality. METHODS: We examined MTP activations for all admissions from October 2016 to September 2018 to a Level 1 regional trauma center with a full trauma team activation. We characterized platelet transfusion practice by demographics, injury severity, and admission vital signs (as shock index: heart rate/systolic blood pressure) and laboratory results. A multivariable model assessed association between early platelet transfusion and mortality at 4 hours, 24 hours, and overall in-hospital, with P <.001. RESULTS: Of the 11,474 new trauma patients admitted over the study period, 469 (4.0%) were massively transfused (defined as ≥10 units of red blood cells [RBCs] in 24 hours, ≥5 units of RBC in 6 hour, ≥3 units of RBC in 1 hour, or ≥4 units of total products in 30 minutes). 250 patients (53.0%) received platelets in the first 4 hours, and most early platelet transfusions occurred in the first hour after admission (175, 70.0%). Platelet recipients had higher injury severity scores (mean ± standard deviation [SD], 35 ± 16 vs 28 ± 14), lower admission platelet counts (189 ± 80 × 109/L vs 234 ± 80 × 109/L; P < .001), higher admission shock index (heart rate/systolic blood pressure; 1.15 ± 0.46 vs 0.98 ± 0.36; P < .001), and received more units of red cells in the first 4 hours (8.7 ± 7.7 vs 3.3 ± 1.6 units), 24 hours (9 ± 9 vs 3 ± 2 units), and in-hospital (9 ± 8 vs 3 ± 2 units) than nonrecipients (all P < .001). We saw no difference in 4-hour (8% vs 7.8%; P = .4), 24-hour (16.4% vs 10.5%; P = .06), or in-hospital mortality (30.4% vs 23.7%; P = .1) between platelet recipients and nonrecipients. After adjustment for age, injury severity, head injury, and admission physiology/laboratory results, early platelet transfusion was not associated with 4-hour, 24-hour, or in-hospital mortality. CONCLUSIONS: In an advanced trauma care setting where platelet counts are available within 15 minutes, approximately half of massively transfused patients received early platelet transfusion. 
Early platelet transfusion guided by protocol-based clinical judgment and rapid-turnaround platelet counts was not associated with increased mortality.","PeriodicalId":7799,"journal":{"name":"Anesthesia & Analgesia","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2022-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91081367","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Association Between Left Ventricular Relative Wall Thickness and Acute Kidney Injury After Noncardiac Surgery
Pub Date: 2022-04-25 | DOI: 10.1213/ANE.0000000000006055
L. Goeddel, Samuel Erlinger, Zachary R. Murphy, Olive Tang, Jules Bergmann, Shaun C. Moeller, Mohammad Hattab, Sachinand Hebbar, Charlie Slowey, T. Esfandiary, D. Fine, N. Faraday
BACKGROUND: Acute kidney injury (AKI) after major noncardiac surgery is commonly attributed to cardiovascular dysfunction. Identifying novel associations between preoperative cardiovascular markers and kidney injury may guide risk stratification and perioperative intervention. Increased left ventricular relative wall thickness (RWT), routinely measured on echocardiography, is associated with myocardial dysfunction and long-term risk of heart failure in patients with preserved left ventricular ejection fraction (LVEF); however, its relationship to postoperative complications has not been studied. We evaluated the association between preoperative RWT and AKI in high-risk noncardiac surgical patients with preserved LVEF. METHODS: Patients ≥18 years of age having major noncardiac surgery (high-risk elective intra-abdominal or noncardiac intrathoracic surgery) between July 1, 2016, and June 30, 2018, who had transthoracic echocardiography in the previous 12 months were eligible. Patients with preoperative creatinine ≥2 mg/dL or reduced LVEF (<50%) were excluded. The association between RWT and AKI, defined as an increase in serum creatinine by 0.3 mg/dL from baseline within 48 hours or by 50% within 7 days after surgery, was assessed using multivariable logistic regression adjusted for preoperative covariates. An additional model adjusted for intraoperative covariates, which are strongly associated with AKI, especially hypotension. RWT was modeled continuously, estimating the change in odds of AKI for each 0.1 increase in RWT. RESULTS: The study included 1041 patients (mean ± standard deviation [SD] age 62 ± 15 years; 59% female). A total of 145 subjects (13.9%) developed AKI within 7 days. For RWT quartiles 1 through 4, respectively, 20 of 262 (7.6%), 40 of 259 (15.4%), 39 of 263 (14.8%), and 46 of 257 (17.9%) developed AKI. Log-odds and the proportion with AKI increased across the observed RWT values. After adjusting for confounders (demographics, American Society of Anesthesiologists [ASA] physical status, comorbidities, baseline creatinine, antihypertensive medications, and left ventricular mass index), each RWT increase of 0.1 was associated with an estimated 26% increased odds of developing AKI (odds ratio [OR], 1.26; 95% confidence interval [CI], 1.09–1.46; P = .002). After adjusting for intraoperative covariates (length of surgery, presence of an arterial line, intraoperative hypotension, crystalloid administration, transfusion, and urine output), RWT remained independently associated with the odds of AKI (OR, 1.28; 95% CI, 1.13–1.47; P = .001). Increased RWT was also independently associated with hospital length of stay (adjusted hazard ratio [HR], 0.94; 95% CI, 0.89–0.99; P = .018). CONCLUSIONS: Left ventricular RWT is a novel cardiovascular factor associated with AKI within 7 days after high-risk noncardiac surgery among patients with preserved LVEF. Application of this commonly available measurement for risk stratification or perio…
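The AKI definition stated in the Methods (serum creatinine rising by 0.3 mg/dL from baseline within 48 hours, or by 50% within 7 days after surgery) reduces to a simple rule over a series of postoperative creatinine measurements. The sketch below encodes that rule; the record layout (hour offsets paired with values) and the example values are assumptions for illustration, not study data.

```python
# Minimal sketch of the creatinine-based AKI criterion quoted in the abstract.

def meets_aki_definition(baseline_cr: float,
                         postop_cr: list[tuple[float, float]]) -> bool:
    """postop_cr: list of (hours_after_surgery, creatinine_mg_dL) measurements."""
    for hours, cr in postop_cr:
        # Absolute rise of >=0.3 mg/dL within 48 hours
        if hours <= 48 and cr - baseline_cr >= 0.3:
            return True
        # Relative rise of >=50% within 7 days
        if hours <= 24 * 7 and cr >= 1.5 * baseline_cr:
            return True
    return False

# Example: baseline 1.0 mg/dL; a rise to 1.4 mg/dL at 36 h meets the 48-hour criterion.
print(meets_aki_definition(1.0, [(12, 1.1), (36, 1.4), (120, 1.2)]))  # True
```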
{"title":"Association Between Left Ventricular Relative Wall Thickness and Acute Kidney Injury After Noncardiac Surgery","authors":"L. Goeddel, Samuel Erlinger, Zachary R. Murphy, Olive Tang, Jules Bergmann, Shaun C. Moeller, Mohammad Hattab, Sachinand Hebbar, Charlie Slowey, T. Esfandiary, D. Fine, N. Faraday","doi":"10.1213/ANE.0000000000006055","DOIUrl":"https://doi.org/10.1213/ANE.0000000000006055","url":null,"abstract":"BACKGROUND: Acute kidney injury (AKI) after major noncardiac surgery is commonly attributed to cardiovascular dysfunction. Identifying novel associations between preoperative cardiovascular markers and kidney injury may guide risk stratification and perioperative intervention. Increased left ventricular relative wall thickness (RWT), routinely measured on echocardiography, is associated with myocardial dysfunction and long-term risk of heart failure in patients with preserved left ventricular ejection fraction (LVEF); however, its relationship to postoperative complications has not been studied. We evaluated the association between preoperative RWT and AKI in high-risk noncardiac surgical patients with preserved LVEF. METHODS: Patients ≥18 years of age having major noncardiac surgery (high-risk elective intra-abdominal or noncardiac intrathoracic surgery) between July 1, 2016, and June 30, 2018, who had transthoracic echocardiography in the previous 12 months were eligible. Patients with preoperative creatinine ≥2 mg/dL or reduced LVEF (<50%) were excluded. The association between RWT and AKI, defined as an increase in serum creatinine by 0.3 mg/dL from baseline within 48 hours or by 50% within 7 days after surgery, was assessed using multivariable logistic regression adjusted for preoperative covariates. An additional model adjusted for intraoperative covariates, which are strongly associated with AKI, especially hypotension. RWT was modeled continuously, associating the change in odds of AKI for each 0.1 increase in RWT. RESULTS: The study included 1041 patients (mean ± standard deviation [SD] age 62 ± 15 years; 59% female). A total of 145 subjects (13.9%) developed AKI within 7 days. For RWT quartiles 1 through 4, respectively, 20 of 262 (7.6%), 40 of 259 (15.4%), 39 of 263 (14.8%), and 46 of 257 (17.9%) developed AKI. Log-odds and proportion with AKI increased across the observed RWT values. After adjusting for confounders (demographics, American Society of Anesthesiologists [ASA] physical status, comorbidities, baseline creatinine, antihypertensive medications, and left ventricular mass index), each RWT increase of 0.1 was associated with an estimated 26% increased odds of developing AKI (odds ratio [OR]; 95% confidence interval [CI]) of 1.26 (1.09–1.46; P = .002). After adjusting for intraoperative covariates (length of surgery, presence of an arterial line, intraoperative hypotension, crystalloid administration, transfusion, and urine output), RWT remained independently associated with the odds of AKI (OR; 95% CI) of 1.28 (1.13–1.47; P = .001). Increased RWT was also independently associated with hospital length of stay and adjusted hazard ratio (HR [95% CI]) of 0.94 (0.89–0.99; P = .018). CONCLUSIONS: Left ventricular RWT is a novel cardiovascular factor associated with AKI within 7 days after high-risk noncardiac surgery among patients with preserved LVEF. 
Application of this commonly available measurement of risk stratification or perio","PeriodicalId":7799,"journal":{"name":"Anesthesia & Analgesia","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2022-04-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88113879","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}