Leah Feulner, Thanpicha Sermchaiwong, Nathan Rodland, David Galarneau
Background: Anxiety disorders are commonly diagnosed and cause substantial functional impairment. A mixture of pharmacologic and psychosocial treatments currently exists, but these treatments are not always tolerable or effective. For patients with anxiety resistant to standard therapy, psychedelics may be a promising alternative. This review assesses the therapeutic benefits and safety of psychedelics in treating anxiety disorders. Methods: We searched PubMed, Embase, PsycInfo, and CINAHL for clinical trials investigating psychedelics in patients with clinician-diagnosed generalized anxiety disorder, social anxiety disorder, specific phobia, separation anxiety disorder, selective mutism, panic disorder, agoraphobia, and anxiety attributable to another medical condition. We analyzed data from 9 independent psychedelic-assisted trials testing ayahuasca (1 study), ketamine (4 studies), lysergic acid diethylamide (LSD) (2 studies), 3,4-methylenedioxymethamphetamine (MDMA) (1 study), and psilocybin (1 study). Efficacy was assessed by measuring changes from baseline in outcome measures and quality of life. Results: The reviewed studies demonstrated encouraging efficacy in reducing anxiety symptoms, increasing self-perception, and increasing social function in patients with generalized anxiety disorder, social anxiety disorder, or anxiety attributable to another medical condition while establishing feasibility and evidence of safety. For many patients, the therapeutic effects of the psychedelic treatment lasted weeks, and no severe adverse events were reported. Conclusion: Based on the evidence of symptom reduction and safety, the current literature (2011 to 2021) shows that psychedelics could be considered for treating clinician-diagnosed anxiety disorders. Psychedelics may provide an alternative therapeutic option for patients resistant to current standard treatments.
Efficacy and Safety of Psychedelics in Treating Anxiety Disorders. Ochsner Journal. 2023;23(4):315-328. doi:10.31486/toj.23.0076
Maria C Mejia, Adedamola Adele, Robert S Levine, Charles H Hennekens, Panagiota Kitsantas
Background: Cigarette smoking remains the leading avoidable cause of premature death in the United States, accounting for approximately 500,000, or 1 in 5, deaths annually. We explored trends in cigarette smoking among US adolescents. Methods: We used data for adolescents in grades 9 through 12 from 1991 to 2021 from the Youth Risk Behavior Survey provided by the US Centers for Disease Control and Prevention. We explored trends overall as well as by sex, race/ethnicity, and school grade. Results: All cigarette use (assessed as ever, occasional, frequent, or daily) among adolescents declined markedly from 1991 to 2021. Specifically, ever use significantly decreased from 70.1% in 1991 to 17.8% in 2021 (P<0.05), an almost 4-fold decline. Occasional use significantly decreased from 27.5% in 1991 to 3.8% in 2021 (P<0.05), a greater than 7-fold decline. Frequent use significantly decreased from 12.7% to 0.7%, a greater than 18-fold decline. Daily use declined from 9.8% in 1991 to 0.6% in 2021, a greater than 16-fold decline. Cigarette smoking significantly decreased from 1999 to 2021 across sex, race/ethnicity, and school grade (P<0.05). In 2021, daily use was higher in boys vs girls; Hispanic/Latino and White youth vs Black and Asian youth; and 12th graders vs 9th, 10th, and 11th graders. Conclusion: These data show large and significant decreases in cigarette use among US adolescents in high school grades 9 through 12 from 1991 to 2021. Nonetheless, the data also suggest residual clinical and public health challenges that will require targeted interventions.
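The fold-decline figures quoted above follow directly from the reported prevalence rates. A quick sketch verifying that arithmetic (the percentages are copied from the abstract; the variable names are illustrative):

```python
# Fold declines in adolescent cigarette use, 1991 vs 2021,
# computed from the prevalence rates reported in the abstract.
rates = {
    "ever":       (70.1, 17.8),  # almost 4-fold decline
    "occasional": (27.5, 3.8),   # greater than 7-fold
    "frequent":   (12.7, 0.7),   # greater than 18-fold
    "daily":      (9.8, 0.6),    # greater than 16-fold
}

for use, (pct_1991, pct_2021) in rates.items():
    print(f"{use:>10}: {pct_1991 / pct_2021:.1f}-fold decline")
```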
Trends in Cigarette Smoking Among United States Adolescents. Ochsner Journal. 2023;23(4):289-295. doi:10.31486/toj.23.0113
Background: In epidemiologic investigations of disease outbreaks, multivariable regression techniques with adjustment for confounding can be applied to assess the association between exposure and outcome. Traditionally, logistic regression has been used in analyses of case-control studies to determine the odds ratio (OR) as the effect measure. For rare outcomes (incidence of 5% to 10%), an adjusted OR can be used to approximate the risk ratio (RR). However, concern has been raised about using logistic regression to estimate RR because how closely the calculated OR approximates the RR depends largely on the outcome rate. The literature shows that when the incidence of outcomes exceeds 10%, ORs greatly overestimate RRs. Consequently, in addition to logistic regression, other regression methods to accurately estimate adjusted RRs have been explored. One method of interest is Poisson regression with robust standard errors. This generalized linear model estimates RR directly vs logistic regression that determines OR. The purpose of this study was to empirically compare risk estimates obtained from logistic regression and Poisson regression with robust standard errors in terms of effect size and determination of the most likely source in the analysis of a series of simulated single-source disease outbreak scenarios. Methods: We created a prototype dataset to simulate a foodborne outbreak following a public event with 14 food exposures and a 52.0% overall attack rate. Regression methods, including binary logistic regression and Poisson regression with robust standard errors, were applied to analyze the dataset. To further examine how these two models led to different conclusions of the potential outbreak source, a series of 5 additional scenarios with decreasing attack rates were simulated and analyzed using both regression models. 
Results: For each of the explanatory variables (sex, age, and food types), in both univariable and multivariable models, the ORs obtained from logistic regression were farther from 1.0 than the corresponding RRs estimated by Poisson regression with robust standard errors. In the simulated scenarios, the Poisson regression models demonstrated greater consistency in identifying one food type as the most likely outbreak source. Conclusion: Poisson regression with robust standard errors proved to be a decisive and consistent method for estimating the risk associated with a single source in an outbreak when a cohort data collection design was used.
Chanapong Rojanaworarit, Jason J Wong. Investigating the Source of a Disease Outbreak Based on Risk Estimation: A Simulation Study Comparing Risk Estimates Obtained From Logistic and Poisson Regression Applied to a Dichotomous Outcome. Ochsner Journal. 2019;19:220-226. doi:10.31486/toj.18.0166
Paul M Ndunda, S. Srinivasan, Mohinder R. Vindhyal, Tabitha Muutu, Rachel R. Vukas, Zaher Fanari
Background: Chronic liver disease increases cardiac surgical risk, with 30-day mortality ranging from 9% to 52% in patients with Child-Pugh class A and C, respectively. Data comparing the outcomes of transcatheter aortic valve replacement (TAVR) and surgical aortic valve replacement (SAVR) in patients with liver disease are limited. Methods: We searched PubMed, Cochrane Library, Web of Science, and Google Scholar for relevant studies and assessed risk of bias using the Risk of Bias in Non-Randomized Studies – of Interventions (ROBINS-I) Cochrane Collaboration tool. Results: Five observational studies with 359 TAVR and 1,872 SAVR patients were included in the analysis. Overall, patients undergoing TAVR had a statistically insignificant lower rate of in-hospital mortality (7.2% vs 18.1%; odds ratio [OR] 0.67; 95% confidence interval [CI] 0.25, 1.82; I2=61%) than patients receiving SAVR. In propensity score–matched cohorts, patients undergoing TAVR had lower rates of in-hospital mortality (7.3% vs 13.2%; OR 0.51; 95% CI 0.27, 0.98; I2=13%), blood transfusion (27.4% vs 51.1%; OR 0.36; 95% CI 0.21, 0.60; I2=31%), and hospital length of stay (10.9 vs 15.7 days; mean difference –6.32; 95% CI –10.28, –2.36; I2=83%) than patients having SAVR. No significant differences between the 2 interventions were detected in the proportion of patients discharged home (65.3% vs 53.9%; OR 1.3; 95% CI 0.56, 3.05; I2=67%), acute kidney injury (10.4% vs 17.1%; OR 0.55; 95% CI 0.29, 1.07; I2= 0%), or mean cost of hospitalization ($250,386 vs $257,464; standardized mean difference –0.07; 95% CI –0.29, 0.14; I2=0%). Conclusion: In patients with chronic liver disease, TAVR may be associated with lower rates of in-hospital mortality, blood transfusion, and hospital length of stay compared with SAVR.
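The pooled ORs and 95% CIs above come from inverse-variance weighting of study-level log odds ratios. A minimal fixed-effect sketch of that calculation; the three (OR, CI) triples below are hypothetical values, not the five studies in this metaanalysis:

```python
# Inverse-variance (fixed-effect) pooling of odds ratios.
# Each study contributes its log OR, weighted by the inverse of the
# variance backed out from its 95% confidence interval.
import math

# (OR, lower 95% CI, upper 95% CI) per study -- hypothetical values
studies = [(0.45, 0.20, 1.01), (0.60, 0.30, 1.20), (0.50, 0.22, 1.14)]

log_ors, weights = [], []
for or_, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of log OR from the CI
    log_ors.append(math.log(or_))
    weights.append(1.0 / se**2)                      # inverse-variance weight

pooled_log_or = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

pooled_or = math.exp(pooled_log_or)
ci = (math.exp(pooled_log_or - 1.96 * pooled_se),
      math.exp(pooled_log_or + 1.96 * pooled_se))
print(f"pooled OR = {pooled_or:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")
```

A random-effects model would additionally widen the weights by a between-study variance term when heterogeneity (I2) is substantial, as it is for some outcomes reported above.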
Clinical Outcomes of Transcatheter vs Surgical Aortic Valve Replacement in Patients With Chronic Liver Disease: A Systematic Review and Metaanalysis. Ochsner Journal. 2019;19:241-247. doi:10.31486/toj.18.0178
Barley R. Halton, Jill N. T. Roberts, G. Dodd Denton
Background: Pre-exposure prophylaxis (PrEP) with emtricitabine/tenofovir disoproxil fumarate (Truvada) is highly effective at preventing human immunodeficiency virus (HIV) transmission in high-risk populations, including in men who have sex with men (MSM). In 2019, the US Preventive Services Task Force released an A recommendation to offer PrEP to persons at high risk of HIV acquisition. Despite the demonstrated efficacy of PrEP, areas with high HIV incidence, such as Louisiana, have historically had low PrEP prescription rates. The objective of this study was to determine the factors associated with whether providers in the Ochsner Health System (OHS) discussed PrEP with HIV-negative MSM patients. Methods: Investigators extracted electronic medical record data on all HIV-negative MSM patients who had at least one outpatient visit at OHS between July 1, 2012 and July 1, 2016 and manually reviewed a random sample of 115 charts. Results: Subjects were predominantly Caucasian (75.7%) with a mean age of 37.6 years. A PrEP discussion was documented for 34 (29.6%) patients. Multivariate modeling showed that having a PrEP discussion was associated with 3 factors: being assigned to a primary care provider known to specialize in MSM care (odds ratio [OR] 5.05, 95% confidence interval [CI] 1.81-14.10; P=0.002), having a documented history (positive or negative) of sexually transmitted infection vs no documentation (OR 5.41, 95% CI 1.80-16.23; P=0.003), and having documentation of condom use (consistent or inconsistent) vs no documentation (OR 3.32, 95% CI 1.27-8.74; P=0.015). Conclusion: Despite evidence that PrEP significantly reduces sexual transmission of HIV in MSM, PrEP discussions with MSM across OHS were undesirably low. Additional resources need to be aimed at increasing PrEP uptake and should focus on providing skills-based training and education in PrEP and MSM care to healthcare providers. 
With increased knowledge of and familiarity with PrEP prescribing guidelines, more providers will be better equipped to identify at-risk patients and to discuss prevention options such as PrEP.
Factors Associated With Discussions of Human Immunodeficiency Virus Pre-Exposure Prophylaxis in Men Who Have Sex With Men. Ochsner Journal. 2019;19:188-193. doi:10.31486/toj.19.0004
D. Yerrapragada, C. R. Rao, Kavinya Karunakaran, Henry Seow Ern Lee
Background: Type 2 diabetes mellitus is a chronic metabolic disease characterized by hyperglycemia that affects various body systems. Elevated blood glucose levels cause brain malfunction, sorbitol-induced blood vessel damage, and degeneration of the nerves that can lead to dementia or cognitive impairment. Cognitive impairment can result in nonadherence of patients to diabetes treatment, such as diet, medication, and exercise. Methods: We used a cross-sectional design to individually interview 194 patients with type 2 diabetes in a rural field practice area in India. A questionnaire was used to collect sociodemographic and diabetes disease characteristics; anthropometric measurements were also collected. Cognitive dysfunction was assessed with the Kannada version (local language) of the Montreal Cognitive Assessment (MoCA) tool. Blood pressure was measured for all subjects using a standardized sphygmomanometer on the right arm with the patient in a sitting position. Results: Among the 194 diabetic subjects interviewed, 98 (50.5%) were cognitively impaired. More than half of the subjects (56.2%) were ≥65 years, and female participants (53.6%) outnumbered males (46.4%). The majority of patients (62.4%) had had diabetes for <10 years. The sociodemographic characteristics age, sex, education, occupation, and socioeconomic status and the anthropometric measurement of waist-to-hip ratio were significantly associated (P<0.05) with cognitive impairment. Disease characteristics, religion, and blood pressure showed no significant association with cognitive impairment. Conclusion: One in two individuals with type 2 diabetes mellitus in our study population had mild cognitive impairment. Older individuals in the low socioeconomic strata and with low levels of education were identified to be at high risk of cognitive impairment. Hence, screening and appropriate care need to be provided.
Cognitive Dysfunction Among Adults With Type 2 Diabetes Mellitus in Karnataka, India. Ochsner Journal. 2019;19:227-234. doi:10.31486/toj.18.0160
Michael A. Nammour, B. Desai, Michael Warren, Brian M. Godshaw, M. Suri
Background: The trapezoid is the least commonly fractured carpal bone, comprising 4% of all carpal fractures. To date, few articles have been published on isolated trapezoid fractures. Mechanisms of injury have typically been reported as an axial load, with or without forced wrist flexion/extension, that is transmitted from the second metacarpal indirectly to the trapezoid. Case Reports: Two patients presenting with symptoms of nonspecific wrist pain after acute trauma were initially worked up with plain film x-rays. Physical examinations identified nonspecific wrist pain in both patients. Mechanisms of injury involved direct trauma and an axial force transmitted through the scaphoid region of an extended wrist in each patient. Plain x-rays were negative for trapezoid fracture in both patients. Computed tomography and magnetic resonance imaging revealed the diagnoses. Conservative management consisted of splinting and immobilization, with full recovery reported at 2.5- and 3-month follow-up. Conclusion: Isolated fractures of the trapezoid require a high index of suspicion as they are rare, and localizing signs and symptoms are typically vague and may mimic those of scaphoid fractures. When athletes present with dorsal wrist pain, swelling, and snuffbox tenderness in the setting of negative plain x-rays, the most likely mechanisms of injury are associated with athletic activity. Treatment depends on the degree of displacement and other associated injuries and ranges from activity modification or immobilization to open reduction with internal fixation.
Approach to Isolated Trapezoid Fractures. Ochsner Journal. 2019;19:271-275. doi:10.31486/toj.18.0157
Marvin Kajy, N. Blank, M. Alraies, Jyothsna Akam-Venkata, S. Aggarwal, Amir Kaki, T. Mohamad, Mahir D Elder, T. Schreiber
Background: The clinical presentation of hereditary spherocytosis varies from no symptoms to severe hemolytic anemia requiring splenectomy. Splenectomy confers a risk of hypercoagulability and acute pulmonary embolism. Catheter-directed thrombolysis is an established treatment for submassive pulmonary embolism in adults. However, the literature regarding its use in children is limited. Case Report: We present the case of a 12-year-old male with hereditary spherocytosis who was diagnosed with pulmonary embolism and successfully treated with catheter-directed thrombolysis. The patient was initially treated with 10.5 mg of recombinant tissue plasminogen activator (r-tPA) delivered over 8 hours. However, because of minimal clinical and hemodynamic improvement, a second course of thrombolytic was administered for an additional 24 hours (25 mg of r-tPA), and the treatment resulted in marked clinical and hemodynamic improvement. Clot resolution was confirmed via angiography. The patient was discharged on enoxaparin with regular follow-up. One year later, the patient remained asymptomatic on enoxaparin. Conclusion: This case demonstrates that catheter-based treatment of submassive pulmonary embolism restores hemodynamic stability and thus is an alternative to surgery or systemic thrombolysis, even in the pediatric setting. While catheter-directed thrombolysis is a safe and effective alternative to systemic thrombolysis, further research is needed to establish appropriate dosing and indications in the adolescent population.
Kajy M, Blank N, Alraies M, Akam-Venkata J, Aggarwal S, Kaki A, Mohamad T, Elder MD, Schreiber T. Treatment of a Child With Submassive Pulmonary Embolism Associated With Hereditary Spherocytosis Using Ultrasound-Assisted Catheter-Directed Thrombolysis. Ochsner Journal. 2019;19:264-270. doi:10.31486/toj.18.0147
Background: Disordered metabolism of bone and minerals is a problem frequently encountered in patients with chronic kidney disease. Early biochemical changes include altered calcium and phosphate balance, while advanced disease produces reduced bone strength and extraskeletal calcification. The syndrome describing this constellation of findings is termed chronic kidney disease mineral and bone disorder. Case Report: This report details a rare and extreme manifestation of chronic kidney disease mineral and bone disorder in a patient on long-term hemodialysis for end-stage renal failure. Progressive abnormalities of the thoracic skeleton were ultimately severe enough to produce restrictive lung physiology and symptomatic respiratory failure. Conclusion: Cases of chronic kidney disease mineral and bone disorder with pronounced clinical sequelae occur uncommonly in contemporary practice because of early detection and effective therapies. To our knowledge, this report is the first case in the literature of severe thoracic involvement manifesting as respiratory failure.
Yaxley J, Scott T. Respiratory Failure: A Rare Complication of Chronic Kidney Disease Mineral and Bone Disorder. Ochsner Journal. 2019;19:282-285. doi:10.31486/toj.18.0177