{"title":"Nutrition and Metabolism Research Oral Paper Session Abstracts","authors":"","doi":"10.1002/jpen.2601","DOIUrl":null,"url":null,"abstract":"<p>Sunday, March 3, 2024</p><p>SU30 Parenteral Nutrition Therapy</p><p>SU31 Enteral Nutrition Therapy</p><p>SU32 Malnutrition, Obesity, Nutrition Practice Concepts, and Issues</p><p>SU33 Critical Care and Critical Health Issues</p><p>SU34 GI and Other Nutrition and Metabolic-Related Topics</p><p>SU35 Pediatric, Neonatal, Pregnancy, and Lactation</p><p>Ji Seok Park, MD, MPH; Mohamed Tausif Siddiqui, MD; Kristin Izzo, RD; Sara Yacyshyn, MD; Allison Doriot, RD; Aje Kent, MD; Elizabeth Gallant, RD; Miguel Salazar, MD; Eileen Hendrickson, PharmD; Adriana Panciu, PharmD; Basma Rizk, PharmD; Ann Dugan, RN; James Bena, MS; Shannon Morrison, MS; Ruishen Lyu, MS; Anil Vaidya, MD; Gail Cresci, PhD, RD, LD, FASPEN; Donald F. Kirby, MD, FACP, FACN, FACG, AGAF, FASPEN, CNSC, CPNS</p><p>Cleveland Clinic, Cleveland, OH</p><p><b>Financial Support</b>: Cleveland Clinic Center for Human Nutrition Morrison Research and Development Funding.</p><p><b>Background</b>: Preventing catheter-related bloodstream infection (CRBSI) is an essential component in managing patients with chronic intestinal failure dependent on home parenteral nutrition (HPN). Ethanol lock therapy is an effective evidence-based strategy used to decrease the risk of CRBSI, however, it has become less available due to supply chain issues thus other strategies are needed. SQ53 wipe is a novel antimicrobial wipe based on a proprietary compound that has residual efficacy beyond 24 hours. It is registered under the European Union Biocidal Product Regulation but not under the U.S. Food and Drug Administration. This study aimed to evaluate the effectiveness of the SQ53 wipe in preventing CRBSI in patients receiving HPN. The study was registered in ClinicalTrials.gov (NCT 04822467).</p><p><b>Methods</b>: A single-blinded, randomized, placebo-controlled trial was designed. About 200 patients meeting pre-defined criteria were contacted. A total of 60 patients were recruited to the study between December 10, 2021, and June 3, 2022, per sample size calculation. Patients were randomized into a treatment group (SQ53 wipe) and a control group (alcohol wipe). A stratified randomization was done based on the CRBSI risk category (low, high, new) and the types of central venous catheter (CVC; tunneled, non-tunneled). Patients were instructed to use the appropriate type of wipe to clean their CVCs before and after HPN infusion per specific instructions. An interim analysis for both efficacy and futility was planned to occur when the last patient reached 6 months post-randomization. Analyses were performed using Poisson regression for the comparisons of all CRBSI (confirmed and suspected), confirmed CRBSI and CVC exchanges between the two groups. Additional analyses were performed to compare the outcomes between the 6 months prior to the study and the time in the study, using each patient as their own historical control. Both the intention to treat (ITT) and per-protocol (PP) (>90% adherence) analyses were used.</p><p><b>Results</b>: Fifty-nine patients were randomized into the study. When the two groups were compared in parallel, both the ITT and PP analyses did not show statistically significant superiority of using SQ53 wipe over alcohol wipe in decreasing all CRBSI, confirmed CRBSI or CVC exchanges. 
However, the PP analysis suggested that event rates may be lower in the SQ53 group, which had a 34% lower risk of all CRBSI (<i>P</i> = 0.43), a 53% lower risk of confirmed CRBSI (<i>P</i> = 0.52), and a 30% lower risk of CVC exchanges (<i>P</i> = 0.58). Interestingly, when each patient's CRBSI rate during the trial was compared with their previous CRBSI rate, the SQ53 wipe group showed a 74% lower risk of all CRBSI (<i>P</i> = 0.005) in the PP analysis. Every randomized patient in the high-risk category had a decreased CRBSI rate compared with their previous experience. All patients tolerated the SQ53 wipe well, with no predefined adverse events.</p><p><b>Conclusion</b>: Patients who used the SQ53 wipe more than 90% of the time, following the specific instructions, had 74% lower CRBSI rates compared with their previous experience. The SQ53 wipe did not show a statistically significant benefit over the alcohol wipe in this study, likely due to the augmented catheter hygiene in the control group and the insufficient sample size.</p><p><b>Abstract of Distinction</b></p><p>Theresa A. Fessler, MS, RDN, CNSC<sup>1</sup>; Mary B. Crandall, PhD, RN<sup>2</sup>; David N. Martin, PhD<sup>2</sup></p><p><sup>1</sup>Morrison Healthcare, University of Virginia Health System, Charlottesville, VA; <sup>2</sup>University of Virginia Health System, Charlottesville, VA</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Catheter-related bloodstream infection (CRBSI) is a serious complication for patients receiving home parenteral nutrition (HPN). The literature is not consistent as to whether there are significant differences in infection risk between central venous catheter (CVC) types, and assessment is complicated by potential alternate infection sources and two different evaluation methods: CRBSI and central line-associated bloodstream infection (CLABSI). The goals of this project were to determine whether significant differences in infection rates exist between peripherally inserted central venous catheters (PICC), tunneled central venous catheters (TCVC), and implanted ports, or between single-lumen (SL) and multi-lumen (ML) catheters used for HPN, and to identify rates of CVC removal for other complications.</p><p><b>Methods</b>: A prospective, observational quality improvement project was conducted for adults who received HPN provided by the University of Virginia Continuum Home Infusion Pharmacy from February 2019 through December 2022, with follow-up ending July 31, 2023. Data were collected for 141 CVCs used for 89 patients and included the number of HPN days, indications for HPN (Figure 1), reasons for CVC removal, blood draws, and microbiologic results. CRBSI and CLABSI were determined by the criteria described in Table 1. Figure 2 shows the number of peripheral and CVC blood and catheter tip tests done for the CVCs with suspected infection.</p><p><b>Results</b>: Of the CVCs used for HPN, 63% were PICC, 27% TCVC, and 10% ports, with a total of 15,474 HPN catheter days. The CVCs were 42% SL, 55% double-lumen, and 2% triple-lumen. CRBSI rates were 0.97 episodes per 1000 HPN catheter days overall, with 1.54 for PICC, 0.64 for TCVC, and 0.0 for ports. CLABSI rates were 1.74 episodes per 1000 HPN catheter days overall, with 3.07 for PICC, 0.89 for TCVC, and 0.0 for ports. No significant differences were found between PICC and TCVC in CRBSI; however, PICCs had a significantly higher CLABSI rate per 1000 HPN catheter days than did TCVCs (<i>p</i> = 0.005).
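</p><p>The rates above follow the standard convention of episodes per 1000 catheter days; as a worked check in Python (the episode count is back-calculated for illustration, since the abstract reports rates rather than raw counts):</p><pre>
# Rate per 1000 catheter-days: episodes / catheter-days * 1000.
def rate_per_1000(episodes, catheter_days):
    return episodes / catheter_days * 1000

# With the reported 15,474 HPN catheter days, the overall CRBSI rate of
# 0.97 corresponds to roughly 15 episodes: 15 / 15474 * 1000 = 0.97.
print(round(rate_per_1000(15, 15474), 2))  # 0.97
</pre><p>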
After a second analysis in which 9 cases of catheter infection were not counted due to undetermined alternate infection sources, overall CRBSI and CLABSI rates were reduced to 0.78 and 1.16 per 1000 HPN catheter days, respectively. The second analysis showed CRBSI rates of 1.23 for PICC and 0.51 for TCVC, and CLABSI rates of 2.0 for PICC and 0.64 for TCVC, with no significant differences in CRBSI and a significantly higher rate of CLABSI per 1000 HPN catheter days for PICC lines (<i>p</i> = 0.04). Table 2 shows a statistical analysis of CRBSI and CLABSI rates. In the initial analysis, CRBSI was 1.24 for ML and 0.68 for SL CVCs, and CLABSI was 2.1 for ML and 1.36 for SL CVCs, per 1000 HPN catheter days; however, the differences were not statistically significant. Other problems that necessitated CVC removal were occlusion, malposition, accidental removal, leak, and thrombosis. The removal rate for other complications was 2.0 per 1000 HPN catheter days overall, with 1.78 for TCVC and 2.61 for PICCs; the differences were not statistically significant.</p><p><b>Conclusion</b>: We found no significant differences in CRBSI between PICC and TCVC, significantly more CLABSIs for PICCs than for TCVCs, and no infections with ports. Although rates of other catheter problems were higher for PICCs, and infection rates were higher for ML than for SL catheters, neither difference reached statistical significance. We illustrate the variation in results between CRBSI and CLABSI and show that undetermined alternate infection sources complicate reporting. Our results show the need for further study and support being more open to the use of ports and choosing SL TCVCs when feasible for long-term HPN.</p><p>Haruka Takayama, RD, PhD<sup>1,2</sup>; Kazuhiko Fukatsu, MD, PhD<sup>1,3</sup>; Midori Noguchi, BA<sup>1</sup>; Kazuya Takahashi, MD, PhD<sup>4</sup>; Nana Matsumoto, RD, MS<sup>3</sup>; Tomonori Narita, MD<sup>4</sup>; Satoshi Murakoshi, MD, PhD<sup>1,5</sup></p><p><sup>1</sup>Surgical Center, The University of Tokyo Hospital, Bunkyo-ku, Tokyo, Japan; <sup>2</sup>Department of Nutrition, St. Luke's International Hospital, Chuo-ku, Tokyo, Japan; <sup>3</sup>Operating Room Management and Surgical Metabolism, Graduate School of Medicine, The University of Tokyo, Bunkyo-ku, Tokyo, Japan; <sup>4</sup>Gastrointestinal Surgery, The University of Tokyo Hospital, Bunkyo-ku, Tokyo, Japan; <sup>5</sup>Nutrition and Dietetics, Kanagawa University of Human Services, Yokosuka City, Kanagawa, Japan</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Our previous study clarified that the addition of beta-hydroxy-beta-methylbutyrate (HMB) to TPN partially restores the gut-associated lymphoid tissue (GALT) atrophy caused by the lack of enteral nutrition. Because HMB is a metabolite of the amino acid leucine, the recovery effect might derive from the increased amount of amino acids in the TPN solution. Alternatively, an increased amino acid content might not restore GALT atrophy by itself, while the amino acid increase together with HMB addition might further prevent the atrophy. Herein, we performed two experiments to answer these questions using a murine TPN feeding model.</p><p><b>Methods</b>: Experiment 1: Six-week-old male Institute of Cancer Research (ICR) mice were divided into A+ (n = 10) and A++ (n = 10) groups. A catheter was inserted into the right jugular vein, and mice were continuously administered 0.2 mL/h of normal saline solution for 2 days while allowed chow and water <i>ad libitum</i>.
Then, the mice received isocaloric PN solution with an NPC/N ratio of 284 (A+) or 135 (A++) without oral food intake for 5 days. After the dietary manipulation, all mice were euthanized by cardiac puncture under general anesthesia, and the whole small intestine was harvested for GALT cell isolation. GALT cell numbers and phenotypes (B cell, CD4+, CD8+, αβTCR+, and γδTCR+) were evaluated in each tissue (Peyer's patches, PPs; intraepithelial space, IE; and lamina propria, LP). Nasal washings, bronchoalveolar lavage fluid (BALF), and intestinal washings were collected for IgA level measurement by ELISA. Experiment 2: Mice were randomized to A+H+ (n = 10) and A++H+ (n = 9) groups. The A+H+ mice received PN solution with NPC/N 284 and 2,000 mg/kg BW of Ca-HMB, while the A++H+ animals were given PN solution with NPC/N 135 and 2,000 mg/kg BW of Ca-HMB. After 5 days of PN feeding, the same parameters as in Experiment 1 were evaluated. The Wilcoxon test was used for all parameter analyses, and the significance level was set at less than 5%.</p><p><b>Results</b>: There were no significant differences between the A+ and A++ groups in GALT cell numbers (Table 1), phenotypes (Table 2), or mucosal IgA levels. However, the A++H+ group showed higher LP cell numbers (Table 1) and a higher CD4+ cell percentage (Table 2) in the IE space than the A+H+ group, without significant differences in IgA levels at any mucosal site.</p><p>Anam Bashir, MBBS; Lauren L. Karel, BCPS; Margaret Begany, RD, CSPCC, LDN, CNSC; Jennifer Panganiban, MD</p><p>Children's Hospital of Philadelphia, Philadelphia, PA</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Fish oil-based lipid emulsion (FOLE) is FDA-approved at 1 g/kg/day for the treatment of parenteral nutrition-associated cholestasis (PNAC). Because fat provision is limited at 1 g/kg/day of FOLE, caloric provision, especially in the neonatal population, is skewed toward dextrose, with higher-than-desired glucose infusion rates (GIR) needed to support weight gain and growth. There is limited published information on the use of FOLE at doses higher than 1 g/kg/day, and concerns about possible essential fatty acid deficiency on 1 g/kg/day have been raised. Thus, we aim to describe patients who received 1.5 g/kg/day of FOLE at our institution.</p><p><b>Methods</b>: A retrospective IRB-approved chart review was conducted on patients who received parenteral nutrition (PN) at Children's Hospital of Philadelphia between January 2020 and August 2023. The inclusion criteria were children on PN, ages 0 to 18 years, receiving FOLE at a dose of more than 1 g/kg/day for at least 14 days. Cholestasis progression, essential fatty acid deficiency (EFAD), clinically severe post-procedure hemorrhage, and hypertriglyceridemia were the clinical outcomes of interest (Table 1). The progression of cholestatic disease was monitored by conjugated bilirubin levels. A triene to tetraene (T:T) ratio of greater than 0.046 was used to define EFAD based on Associated Regional and University Pathologists, Inc. (ARUP) normative laboratory values. Mead acid, linoleic acid, and alpha-linolenic acid levels were also collected to reflect essential fatty acid stores (normative values in Table 2). Invasive procedures were defined as those requiring entry into the body through an incision, tunneling, and/or a cutting technique for vascular procedures.
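</p><p>As a minimal sketch of the EFAD screen described above (the triene is Mead acid and the tetraene is arachidonic acid; the example levels are hypothetical):</p><pre>
# T:T ratio screen for essential fatty acid deficiency (EFAD): values above
# the 0.046 cutoff used in this review flag possible EFAD.
EFAD_CUTOFF = 0.046

def efad_suspected(mead_acid, arachidonic_acid):
    # Both levels must be in the same units (e.g., nmol/mL).
    return mead_acid / arachidonic_acid > EFAD_CUTOFF

print(efad_suspected(mead_acid=18.0, arachidonic_acid=600.0))  # False (ratio 0.03)
</pre><p>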
For children younger than 1 year, hypertriglyceridemia was defined as a triglyceride level greater than 200 mg/dL, and for older children, greater than 400 mg/dL.</p><p><b>Results</b>: Nine patients [5 males; mean age 2.6 y (range 2 mo–12.9 y)] with PNALD (defined by a serum conjugated bilirubin ≥ 2 mg/dL and exclusion of other causes of liver disease) were started on FOLE at 1.5 g/kg/day. The purpose of initiating the higher-dose FOLE was to decrease the GIR and/or give additional calories when weight gain was suboptimal on 1 g/kg/day of FOLE. None of the patients developed hypertriglyceridemia. Four patients had improvement of cholestasis, with levels decreasing by more than 2 mg/dL, and four patients continued to have no evidence of cholestasis after prior normalization while on 1 g/kg dosing. One patient experienced an increase in conjugated bilirubin of more than 2 mg/dL, after which the FOLE was decreased to 1 g/kg/day with resolution of cholestasis over three months. Seven patients had an essential fatty acid panel collected, and the T:T ratio was within normal limits, although five patients had less than optimal levels of linoleic acid. Seven patients had an invasive procedure performed, and only one patient had more-than-expected bleeding, after circumcision. This patient had a low fibrinogen level (70 mg/dL) and required fresh frozen plasma and packed red blood cell transfusion, with no significant bleeding event thereafter (Table 1).</p><p>Diana Mulherin, PharmD, BCNSP, BCCCP, FCCM; Sarah Cogle, PharmD, BCNSP, BCCCP; Vanessa Kumpf, PharmD, BCNSP, FASPEN; Edward Woo, PharmD; David Mulherin, PharmD, BCPS; Madeleine Hallum, MSHS, RDN, CSG, LDN; Ankita Sisselman, MD; Dawn Adams, MD, MS, CNSC</p><p>Vanderbilt University Medical Center, Nashville, TN</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Copper (Cu) deficiency can lead to poor wound healing, myeloneuropathy, anemia, and cardiac arrhythmias. Deficiency results from poor intake or high losses, which may be seen in adult patients requiring parenteral nutrition (PN), including those with severe malnutrition or large burns, those requiring continuous renal replacement therapy (CRRT), and those with a history of bariatric surgery or malabsorption. A previous formulation of multi-trace elements (MTE) contained Cu 1 mg per dose, and in combination with Cu contamination from other PN ingredients, an increased incidence of hypercupremia was observed in patients requiring long-term PN. As of 2020, the only MTE product for use in adults in the U.S. contains 0.3 mg of Cu. For patients with significant cholestasis or hepatic dysfunction, ASPEN recommends withholding or decreasing Cu doses in PN. Due to a lack of standardized practice, a quality improvement project was initiated to describe practices for ordering Cu in PN and Cu status in acutely ill, hospitalized patients with severe hyperbilirubinemia.</p><p><b>Methods</b>: This was a retrospective evaluation of the PN ordering practices of a multidisciplinary nutrition support team (NST) at a large academic medical center between July 1, 2021, and August 31, 2023. PN encounters (a course of PN treatment during a single inpatient admission) in patients ≥ 18 years of age with severe hyperbilirubinemia (total bilirubin ≥ 10 mg/dL or direct bilirubin ≥ 2 mg/dL) within 5 days before or at any time during the PN encounter were included.
Patient demographics, frequency of Cu provision in PN, Cu and C-reactive protein (CRP) levels, and CRRT status were assessed using descriptive statistics.</p><p><b>Results</b>: A total of 15,739 PN orders were entered for 1068 patients during the study period. Of those, 155 PN encounters occurred in 144 individual patients with severe hyperbilirubinemia. Baseline demographics are provided in Table 1. A summary of Cu sources (either from the MTE product or as a cupric chloride additive) for each PN encounter is provided in Figure 1. Cu status was assessed in 53 (34%) PN encounters, with a mean concentration of 76.9 (±34.3) mcg/dL. CRP was obtained concurrently with only 58% (n = 31) of Cu levels, with a mean concentration of 125.7 (±95.4) mg/L. CRRT was provided in 44 (28.4%) encounters (Table 2).</p><p><b>Figure 1</b>. Copper sources in PN orders.</p><p>Brittney Patterson, MS, RD-AP, CNSC<sup>1</sup>; Ranna Modir, MS, RD, CNSC, CDE, CCTD<sup>1</sup>; Jack McKeown<sup>1</sup>; Rachel Aubyrn<sup>1</sup>; Javier Lorenzo, MD, FCCM<sup>2</sup></p><p><sup>1</sup>Stanford Health Care, Stanford, CA; <sup>2</sup>Stanford University School of Medicine, Stanford, CA</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: The use of safety alerts in electronic medical records (EMR) aims to improve patient safety, with most alerts directed at medication and nursing workflows. Stanford Health Care (SHC) has added tube feeding regimens (TFR) to the medication administration record (MAR) to further improve patient safety. For critically ill (ICU) patients who are at high risk for gastrointestinal (GI) complications, the ASPEN/SCCM 2016 guidelines recommend using near-isotonic, fiber-free TFR. A retrospective analysis of 2014-2016 data at SHC found an association between severe GI complications and high-risk tube feeding regimens (HRTFR), defined as hyperosmolar or high-fiber tube feeding formulas and/or fiber supplements, in ICU patients. To ensure the ASPEN/SCCM guidelines were implemented at SHC, many interventions were put in place, including designing order sets with HRTFR listed toward the bottom; specific TFR order sets with HRTFR removed; education during new-resident orientation, team rounds, and monthly in-services; and tube feeding order-writing privileges for Registered Dietitians (RDs). Despite these interventions, HRTFR were still being ordered, mostly outside normal RD working hours (8 am to 4 pm). To educate and guide providers in selecting safe TFR for ICU patients, we aimed to create a novel nutrition support-specific order validation pop-up in the EMR.</p><p><b>Methods</b>: A team of RDs, critical care attendings, and Epic analysts collaborated to create a nutrition support-specific order validation pop-up. For the purposes of the alert, ICU patients were defined as those requiring vasopressor support with norepinephrine, epinephrine, vasopressin, and/or phenylephrine. HRTFR were defined as hyperosmolar or high-fiber tube feeding formulas and/or fiber supplements. The order validation pop-up was built to trigger under the following three scenarios: (1) vasopressors were already running and a HRTFR was ordered, (2) a HRTFR was already running and a vasopressor was ordered, or (3) both orders were placed simultaneously. The pop-up displayed the reason for the alert and the importance of avoiding a HRTFR, provided safer TFR options, and recommended contacting the RD for guidance.
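</p><p>A minimal sketch of the three trigger scenarios just described (the function and flag names are hypothetical, not the actual Epic build):</p><pre>
# Pop-up fires whenever a HRTFR and a qualifying vasopressor would be
# active together, per the three scenarios described above.
def popup_triggers(pressor_active, hrtfr_active,
                   ordering_pressor, ordering_hrtfr):
    return ((pressor_active and ordering_hrtfr)        # scenario 1
            or (hrtfr_active and ordering_pressor)     # scenario 2
            or (ordering_pressor and ordering_hrtfr))  # scenario 3

print(popup_triggers(True, False, False, True))  # True: pressor on, HRTFR ordered
</pre><p>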
To preserve individualization of patient care, the order validation was overridable, as a HRTFR may be appropriate for patients on lower vasopressor doses. After the order validation pop-up was implemented, a chart review was completed between March 2023 and May 2023 to assess the incidence of, and actions following, the triggered order validation pop-up.</p><p><b>Results</b>: Between March 2023 and May 2023, the order validation pop-up triggered 220 times for a total of 59 patients. Based on the instructions in the pop-up, 42 (19%) of the 220 triggers resulted in a changed or discontinued order, or in the HRTFR not being ordered. Of those 42 triggers that resulted in a properly adjusted HRTFR, 26 (61%) occurred outside normal RD hours. The remaining triggers, for which no changes were made, involved low-dose vasopressors, vasopressors listed on the MAR but not actively in use, or a HRTFR ordered on the MAR but held per nursing communication orders.</p><p><b>Conclusion</b>: The creation of a novel nutrition support-specific order validation pop-up provided education and guidance to ordering providers. With this additional layer of safety, 42 ICU patients between March 2023 and May 2023 were placed on safer TFR, with most of the impact occurring outside of RD working hours.</p><p><b>Best of ASPEN - Enteral Nutrition Therapy</b></p><p><b>1627 - Victory for Volume-Based Enteral Nutrition</b></p><p>Julie M. Geyer, RD-AP, CNSC</p><p>University of Colorado Hospital, Aurora, CO</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Enteral nutrition (EN) in the hospital setting is traditionally administered by a fixed rate-based feeding method (RBEN). Studies using RBEN have found that, due to interruptions or withholding, actual formula delivery averages 60% to 70% of the prescribed volume. Nutrition provision below energy needs contributes to malnutrition and negative consequences, including increased health care costs, morbidity, and mortality. The American Society for Parenteral and Enteral Nutrition (ASPEN) and the Society of Critical Care Medicine (SCCM) recommend use of a volume-based enteral nutrition feeding method (VBEN) to improve nutrient delivery, decrease energy deficits, and prevent overfeeding.</p><p><b>Methods</b>: This quality improvement study took place at a Level I trauma academic hospital from June 2022 to September 2023. In September 2022, a hospital-wide process improvement committee was assembled for multi-phase implementation of VBEN. Prior to September 2022, unit-based dietitians conducted quality improvement work to address common causes of feeding interruptions. VBEN inclusion criteria included demonstrated tolerance of goal-rate RBEN. The maximum hourly rate was set at 150 mL/hr, and ‘goal’ provision was defined as 90% to 110% of the prescribed formula volume. Patients included in the data collection were tolerating EN at the RBEN goal, and formula intake volumes were taken directly from the feeding pump history. Changes to the electronic medical record (EMR) included creation of a VBEN calculator with row instructions built into the tube feeding flowsheet and a nurse reminder task every 4 hours to recalculate formula intake and adjust the rate as needed. Changes to the formula order on the medication administration record included specification of the VBEN vs. RBEN feeding method and standardized administration instructions (Figure 1).
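</p><p>A minimal sketch of the kind of catch-up arithmetic such a VBEN calculator performs, using the 150 mL/hr cap from the protocol (the function and variable names are hypothetical):</p><pre>
MAX_RATE_ML_HR = 150  # protocol maximum hourly rate

def vben_rate(goal_volume_ml, delivered_ml, hours_left):
    # Hourly rate needed to deliver the remaining 24-hour volume, capped.
    remaining_ml = max(goal_volume_ml - delivered_ml, 0.0)
    return min(remaining_ml / hours_left, MAX_RATE_ML_HR)

# E.g., 1800 mL prescribed, 600 mL delivered after interruptions, 10 h left:
print(vben_rate(1800, 600, 10))  # 120.0 mL/hr
</pre><p>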
Nurses, dietitians, and providers received training on the VBEN workflow and process through e-mail communication, in-person training, an interactive learning-assisted video, and one-on-one coaching.</p><p><b>Results</b>: Prior to June 2023, RBEN was the standard feeding method. Routine quality improvement audits from October 2020 to December 2022 in one intensive care unit demonstrated that, despite strategies to improve formula delivery, ‘goal’ formula provision was met on 50% to 74% of EN days (Table 1). In June 2022, a hospital-wide audit of formula provision was conducted that included all levels of care (floor, intermediate, and intensive care). Across a total of 346 EN days, ‘goal’ formula provision was achieved on 63% of EN days (Table 2). In November 2022, an audit was conducted in the two ICU units selected for phase 1 implementation. Across a total of 154 EN days, ‘goal’ formula provision was achieved on 57% of EN days (Table 2). Phase 1 implementation took place in June 2023, and a post-go-live audit was completed. Across a total of 157 EN days, ‘goal’ formula volume was achieved on 83% of EN days (Table 2). No instances of hypo/hyperglycemia or gastrointestinal complications were reported. Phase 1 was deemed a success, and approval was obtained to continue VBEN implementation in a stepwise fashion for the remaining inpatient units.</p><p>Marcin Folwarski, MD, PhD<sup>1</sup>; Stanisław Kłęk<sup>2</sup>; Karolina Skonieczna-Żydecka<sup>3</sup>; Agata Zoubek-Wójcik<sup>4</sup>; Waldemar Szafrański, MD, PhD<sup>5</sup>; Lidia Bartoszewska<sup>6</sup>; Krzysztof Figuła<sup>7</sup>; Marlena Jakubczyk, MD, PhD<sup>8</sup>; Anna Jurczuk<sup>9</sup>; Przemysław Matras, MD, PhD<sup>10</sup>; Zbigniew Kamocki, MD, PhD<sup>11</sup>; Tomasz Kowalczyk, MD, PhD<sup>12</sup>; Bogna Kwella, MD, PhD<sup>13</sup>; Joanna Sonsala-Wołczyk<sup>14</sup>; Jacek Szopiński, MD, PhD<sup>15</sup>; Krystyna Urbanowicz, MD, PhD<sup>16</sup>; Anna Zmarzly, MD, PhD<sup>14</sup></p><p><sup>1</sup>Division of Clinical Nutrition and Dietetics, Medical University of Gdańsk, Gdansk, Pomorskie, Poland; <sup>2</sup>Surgical Oncology Clinic, National Cancer Institute in Krakow, Maria Sklodowska-Curie National Research Institute of Oncology, Cracow, Poland; <sup>3</sup>Department of Biochemical Science, Pomeranian Medical University in Szczecin, Szczecin, Zachodniopomorskie, Poland; <sup>4</sup>Nutrimed Home Nutrition Center, Warsaw, Poland; <sup>5</sup>Home Enteral and Parenteral Nutrition Unit, General Surgery Department, Nicolaus Copernicus Hospital, Gdansk, Pomorskie, Poland; <sup>6</sup>First Department of General and Transplant Surgery and Clinical Nutrition, Medical University of Lublin, Home Enteral and Parenteral Nutrition Unit S, Lublin, Poland; <sup>7</sup>Nutricare Clinical Nutrition Center, Cracow, Poland; <sup>8</sup>Department of Anaesthesiology and Intensive Care, Collegium Medicum in Bydgoszcz, Nicolaus Copernicus University, Toruń, Poland; <sup>9</sup>Outpatient Clinic of Nutritional Therapy, Clinical Hospital, Bialystok, Poland; <sup>10</sup>First Department of General and Transplant Surgery and Clinical Nutrition, Medical University of Lublin, Home Enteral and Parenteral Nutrition Unit SPSK4, Lublin, Poland; <sup>11</sup>Department of General and Gastroenterological Surgery, Medical University, Bialystok, Poland; <sup>12</sup>Nutricare Clinical Nutrition Center, Cracow, Poland; <sup>13</sup>Department of Clinical Nutrition, Provincial Specialist
Hospital, Olsztyn, Poland; <sup>14</sup>Clinical Nutrition Unit, Gromkowski City Hospital, Wroclaw, Poland; <sup>15</sup>Department of General, Hepatobiliary and Transplant Surgery, Collegium Medicum, Nicolaus Copernicus University in Torun, Torun, Poland; <sup>16</sup>Department of Clinical Nutrition, Provincial Specialist Hospital, Olsztyn, Poland</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Cancer is one of the most common indications for home enteral nutrition (HEN). Malnutrition and weight loss, associated with deterioration in performance status, contribute to poorer outcomes in oncology patients. Systemic inflammation is a characteristic feature of cancer cachexia and may be used as a prognostic factor for short survival. According to the ESPEN guidelines, HEN is indicated for patients with an estimated survival of at least 30 days. Therefore, determining survival is essential for individual care planning, as it informs healthcare professionals about the suitability of HEN and the palliative care strategy.</p><p><b>Methods</b>: In a retrospective multicenter survey, we examined the medical records of cancer patients treated in 2018 across 22 Polish HEN centers. Factors assessed during qualification for HEN included BMI, weight loss, albumin level, total protein level, lymphocyte count, CRP, Prognostic Nutritional Index (PNI), and Eastern Cooperative Oncology Group (ECOG) performance status. The primary endpoint was survival of less than 30 days from the initiation of HEN.</p><p><b>Results</b>: A total of 278 cancer patients (70.14% male, 29.86% female) were included in the study: 51.44% head and neck, 41.37% gastrointestinal, and 7.19% other localizations. Inflammatory factors (albumin level below 3.5 g/dL, <i>p</i> = 0.02; C-reactive protein, <i>p</i> = 0.01; PNI > 45, <i>p</i> = 0.04), a high percentage of weight loss in the last 6 months (<i>p</i> < 0.01), and ECOG performance score (<i>p</i> = 0.01) were associated with poor survival (less than 30 days). Body weight, BMI, lymphocyte count, and total protein level were not correlated with survival.</p><p><b>Conclusion</b>: Assessment of performance status, inflammation, and weight loss during qualification for HEN can predict the short-term survival of cancer patients. This finding highlights the importance of comprehensive assessment before home nutrition initiation. Predicting poor survival can help plan palliative care and determine whether the patient will benefit from HEN.</p><p>June R. Greaves, RD, CNSC, CDN, LD, LDN, LRD<sup>1</sup>; Katharine Morra, RD, CNSC, CSO, LD, LDN<sup>2</sup></p><p><sup>1</sup>Coram CVS Specialty Infusion Services, Meriden, CT; <sup>2</sup>Coram CVS Specialty Infusion Services, Plainfield, IN</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: The objective of this quality improvement project was to determine whether patients were successful in administering tube feeding independently at home following virtual tube feeding instruction by a Registered Dietitian (RD) at a nationwide home infusion company. The aim is to provide information about the process, potentially identifying avenues for further improvement and areas for future research.</p><p><b>Methods</b>: A retrospective review was conducted of 162 patients who received virtual tube feeding instruction from the enteral RD between June 2022 and June 2023.
Virtual instruction was completed for the enteral feeding pump, gravity bag, and bolus/syringe methods of administration. A follow-up call was made to active patients to inquire about their experience with the virtual instruction. For patients who could not be reached, the medical record was reviewed to determine whether inbound calls with questions or issues were received after the virtual instruction. Patients were queried on their confidence in their ability to administer enteral feedings, whether they had any concerns upon completion of the virtual instruction, whether they knew whom to contact afterward, and whether the reference materials provided were helpful. Patients who did not receive virtual instruction, or who were discharged from service, were excluded from the review.</p><p><b>Results</b>: One hundred sixty-two total patients were reviewed as potentially eligible for the analysis; 115 were excluded. Of those excluded, 100 (87%) were no longer on service; 12 (10%) declined virtual instruction due to home health agency instruction, inpatient instruction with nursing or a dietitian prior to the start of care, or assistance from the home infusion company sales team; and 3 (3%) were a “no show” for the scheduled appointment. Eighteen of the remaining eligible patients could not be contacted for follow-up; for these patients, there were no documented inbound calls regarding feeding or equipment questions or concerns. Of the total number of eligible patients, 29 provided telephonic feedback on the virtual instruction experience. Virtual instruction covered the following administration types: enteral pump (86%, n = 25), followed by gravity bag and bolus/syringe (14%, n = 4). Upon completion of the instruction, 27 (93%) felt confident with feeding administration, while 2 (7%) did not feel confident, as they identified as “in person learners”; 24 (83%) did not experience issues/concerns, while 5 (17%) had questions/concerns; 27 (93%) responded knowing who to contact, 2 (7%) did not; 22 (76%) found the reference materials provided helpful, 2 (7%) did not, and 5 (17%) did not review the reference materials.</p><p><b>Conclusion</b>: Technological advances have made virtual instruction possible. Virtual enteral instruction can be a successful tool for teaching patients to administer tube feedings when in-person instruction is not possible in the home care setting. However, consideration should be given to the client's preferred style of learning. As the literature on virtual instruction outcomes is limited, further research on the use of virtual instruction to enhance the process is warranted.</p><p>Danelle A. Olson, RDN; Lisa M. Epp, RDN; Osman Mohamed Elfadil, MBBS; Ryan T. Hurt, MD, PhD; Manpreet S. Mundi, MD</p><p>Mayo Clinic, Rochester, MN</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: The prevalence of bariatric surgery has increased significantly in recent years, as it is the most effective long-term treatment for obesity. The two most common surgeries, sleeve gastrectomy (SG) and Roux-en-Y gastric bypass (RYGB), alter gastrointestinal anatomy, producing significant weight loss as well as remission of obesity-related co-morbidities, including type 2 diabetes. Despite these benefits, bariatric surgery can be associated with significant debilitating complications.
Though the true prevalence and mechanism are unclear, hypoglycemia has been shown to be present in up to 38% of post-surgical RYGB patients and can be very difficult to manage. Currently, there remains a paucity of data regarding the role of enteral nutrition (EN) as a potential therapy.</p><p><b>Methods</b>: A retrospective review of the EMR was conducted for patients seen in our outpatient home enteral nutrition (HEN) clinic for initiation of tube feeding to manage reactive hypoglycemia from March 2017 to July 2023. In addition to baseline clinical characteristics and demographics, we collected data on hypoglycemia incidents, interventions, EN regimens, and outcomes.</p><p><b>Results</b>: Six patients were seen in the HEN clinic with post-bariatric reactive hypoglycemia (mean age 45.5 ± 9.6 years; 66.7% female; mean BMI at HEN initiation 28.6 ± 8.3 kg/m<sup>2</sup>). Five of the 6 patients had undergone RYGB surgery, and 1 of 6 had undergone laparoscopic adjustable gastric banding (LAGB) that was subsequently revised to a sleeve gastrectomy (SG). The time until the development of reactive hypoglycemia after surgery varied within the cohort; on average, the first incident was documented 2.6 ± 3.2 years after surgery. Of note, patients lost, on average, 51.2 ± 28.5 kg after surgery and before they required EN support. We noted only a slight change in weight after EN initiation, as patients remained, on average, at +2.5 kg at one month and at 3 months into HEN. Table 1 shows the patients' profiles. Dietary modification, focusing especially on reduced consumption of refined carbohydrates, was recommended for all patients. However, poor compliance was prevalent, with 5/6 (83%) of patients not adhering to the prescribed diet. In addition to the EN and dietary regimens prescribed for all patients, some received specific treatment(s) to prevent or manage reactive hypoglycemia; in one case, a combination of α-glucosidase inhibitors, somatostatin, and radical diet changes was used. The majority of patients underwent an initial trial of EN through a naso-jejunal tube, which was then converted to a percutaneous tube after efficacy was established (Table 2). Standard polymeric formulas were used for most patients, although one was provided commercial blenderized tube feeds. With the use of EN, 4 of 6 patients had resolution of reactive hypoglycemia, while two continued to experience symptoms. Two patients stopped EN due to feeding complications and non-compliance, while the remaining four continued on EN.</p><p>Anna K. Burneske; Liyun Zhang, MS; Amy Y. Pan, PhD; Theresa Mikhailov, MD, PhD</p><p>Medical College of Wisconsin, Milwaukee, WI</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Patients who are malnourished have worse outcomes. Many standardized tools have been developed to screen for malnutrition in acutely ill pediatric patients: the Pediatric Yorkhill Malnutrition Score (PYMS), the Pediatric Nutrition Screening Tool (PNST), and the Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP). Alternatively, some institutions have developed their own tools for this purpose; these are referred to as “home-grown” tools. Regardless of their origin, none of these tools have been validated in critically ill children. Registered dietitians (RDs) perform nutrition assessments on patients based on the results of these nutrition screenings or based on protocols within their institution.
Virtual Pediatric Systems, LLC (VPS), an international data registry supporting standardized data sharing for research, improved patient care, and benchmarking among pediatric ICUs, developed a nutrition module that captures data for nutritional metrics. VPS has collected data in the nutrition module since October 2019 and collects data for about 10,000 patients per calendar year from the participating centers. The specific aims were to compare the nutrition screening tools to the dietitians' assessments to determine the screening tools' accuracy, and to determine whether standardized screening tools are more accurate than those developed at single centers. We hypothesized that (1) the nutrition screening tools used by participating centers would accurately identify malnourished children, and (2) standardized tools would be more accurate than those developed at single centers.</p><p><b>Methods</b>: In this project, we compared pediatric nutrition screening tools with the assessments performed by RDs to determine whether the screening tools accurately identify malnourished patients. We also determined which nutrition screening tools more accurately identify patients who are malnourished or at risk of becoming malnourished during their hospitalization in the PICU, so that appropriate nutrition therapy can be initiated. We obtained de-identified demographic and clinical data from October 2019 through March 2023 for all patients under 18 years of age from the VPS database from centers participating in the nutrition module. We considered the RD's assessment the gold standard for determining malnutrition and compared the nutrition screening tools against it. The degree of agreement on malnutrition between the nutrition screening tools and the RD's assessment was determined by Cohen's kappa (κ).</p><p><b>Results</b>: After selecting subjects who had both a complete pediatric nutrition screen and an RD assessment, the final cohort contained a total of 9891 patients. Among them, 54% were male; 4% were neonates (≤29 d), 34% infants (<2 y), 35% children (2-12 y), and 26% adolescents (12-18 y). The subjects were 40% White, 17.5% Black, 22.5% Hispanic, 5.7% Asian, and 14.2% other/mixed. The kappa coefficient for the standardized nutrition screening tools was 0.38, which is considered “fair” agreement between the screening tool and the RD “gold standard” assessment. Other unidentified tools, listed as “home-grown” or “other” in VPS, had kappa coefficients ranging from 0.31 to 0.91; a value of 0.91 represents near-perfect agreement with the RD “gold standard.”</p><p><b>Conclusion</b>: These data show only a fair degree of agreement between the standardized screening tools (PYMS, PNST, STAMP) and RD assessments, meaning that these tools do not adequately assess the nutritional status of critically ill children. However, some unidentified hospital-specific tools showed near-perfect agreement with RD assessments, so perhaps there is a better tool for identifying malnourished children in the ICU. Further investigation should be performed to determine why the home-grown tools are superior to the published tools.</p><p><b>Research Trainee Award</b></p><p>Hayley E. Billingsley, PhD, RD, CEP; Michael Dorsch, PharmD, MS; Todd M. Koelling, MD; Scott L.
Hummel, MD, MS</p><p>University of Michigan, Ann Arbor, MI</p><p><b>Financial Support</b>: NHLBI - Award 5R33HL155498-03.</p><p><b>Background</b>: Malnutrition is common in patients with heart failure (HF) and worsens an already poor prognosis. Previous work suggests that sodium restriction, the most common dietary recommendation for patients with HF, may be associated with reduced micronutrient and energy intake. The Mini Nutritional Assessment-Short Form (MNA-SF) is a strong indicator of nutrition status and prognosis in patients with HF, but the association between MNA nutrition status and sodium intake has not been examined. Therefore, this analysis aimed to examine the association between nutrition status and habitual sodium intake in hospitalized patients with HF.</p><p><b>Methods</b>: This is a cross-sectional analysis of patients (≥18 y of age) hospitalized for decompensated HF. Participants were administered the MNA-SF and scored as nourished, at risk of malnutrition, or malnourished based on established cutoffs. Questions on the MNA-SF regarding weight loss and declines in food intake over the previous 3 months were also considered independently. Participants completed the 2014 Block Food Frequency Questionnaire (FFQ) to assess habitual dietary intake. Estimated daily kilocalories (kcals) from the FFQ were divided by estimated energy needs (Harris-Benedict equation × 1.1) to calculate percent (%) estimated energy needs. Estimated protein needs were calculated based on the Academy of Nutrition and Dietetics recommendation of 1.1 g/kilogram (kg) in HF, and estimated protein intake from the FFQ was divided by these needs to calculate % estimated protein needs. Using the FFQ, participants were grouped into sodium intake ≥ or < 2 g per day. Differences between the sodium intake groups were explored using Fisher's exact test, the chi-square test, or the Mann-Whitney U test, as applicable.</p><p><b>Results</b>: Baseline characteristics are presented in Table 1. On the FFQ, participants with sodium intake <2 g reported consuming significantly less of their % estimated energy and protein needs than participants with ≥2 g sodium intake (Figure 1). All patients (n = 12) with sodium intake <2 g per day were malnourished or at risk for malnutrition on the MNA-SF, versus 73% (32) of patients with sodium intake ≥2 g per day (<i>P</i> = 0.051). A greater proportion of patients with daily sodium intake <2 g reported recent weight loss >3 kg (75% [9] vs. 43% [19], <i>P</i> = 0.051). No difference was found in the proportion of participants reporting a decrease in food intake on the MNA-SF (<2 g sodium, 67% [8] vs. ≥2 g sodium, 50% [22], <i>P</i> = 0.305).</p><p><b>Conclusion</b>: In patients hospitalized for HF, habitual sodium intake <2 g per day was associated with inadequate energy and protein intake, confirming previous findings. Despite the high prevalence of obesity in the cohort, sodium intake <2 g per day was also associated with self-reported weight loss >3 kg and a higher likelihood of being at risk for or having malnutrition. Although this cross-sectional analysis cannot determine the directionality of the observed associations, additional studies should examine the impact of personalized nutrition interventions vs. standard-of-care sodium restriction education in HF on clinical outcomes.</p><p><b>Figure 1</b>. Percent estimated energy and protein needs achieved by sodium intake level in hospitalized patients with heart failure.</p><p>Lucia A. Gonzalez Ramirez, cPhD<sup>1,2</sup>; Mary M. Nellis, PhD<sup>3</sup>; Jessica A.
Alvarez, PhD<sup>1,2,4</sup>; Tasha M. Burley<sup>2</sup>; Paula D. Nesbeth, cPhD<sup>1,2</sup>; Chin-An Yang, cPhD<sup>1,2</sup>; Dean P. Jones, PhD<sup>1,3,4</sup>; Thomas R. Ziegler, MD<sup>1,2,4</sup></p><p><sup>1</sup>Nutrition and Health Sciences Program, Laney Graduate School, Emory University, Atlanta, GA; <sup>2</sup>Division of Endocrinology, Metabolism and Lipids, Department of Medicine, Emory University, Atlanta, GA; <sup>3</sup>Clinical Biomarkers Laboratory, and Division of Pulmonary, Allergy, Critical Care and Sleep Medicine, Department of Medicine, Emory University, Atlanta, GA; <sup>4</sup>Center for Clinical and Molecular Nutrition, Department of Medicine, Emory University, Atlanta, GA</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Postprandial metabolism can reveal alterations related to the early stages of cardiovascular disease. However, limited data exist regarding the effects of body composition on postprandial metabolism after a lipid meal challenge. We aimed to characterize the metabolic pathways and metabolites associated with body fat abundance in the postprandial plasma metabolome after an oral lipid challenge.</p><p><b>Methods</b>: Thirty-one healthy individuals between 20 and 50 years old with a lean or overweight/obese body mass index (BMI) were recruited. Participants underwent body composition measurement with dual-energy x-ray absorptiometry (DEXA) to quantify body fat percentage and visceral adipose tissue quantity. A standardized 900-kcal lipid meal challenge (a long-chain triglyceride fat emulsion oral nutritional supplement) with repeated blood sampling was administered. Untargeted plasma high-resolution metabolomics was performed at baseline, 120 minutes, and 360 minutes after the lipid challenge using dual-column liquid chromatography (C18 and HILIC columns with negative and positive electrospray modes, respectively) coupled with high-resolution mass spectrometry (LC-HRMS). Metabolite differences were assessed using a metabolome-wide association study with linear mixed-effect models to study the effects of body fat, time, and the body fat × time interaction, controlling for age and sex, and pathway enrichment analysis was performed.</p><p><b>Results</b>: A total of 12,078 (C18) and 15,041 (HILIC) features (metabolites) were detected in plasma at baseline. Changes over time differed by percent body fat (percent fat × time interaction) for 699 (C18) and 814 (HILIC) features from baseline to 120 minutes, and for 465 (C18) and 478 (HILIC) features from baseline to 360 minutes (all <i>p</i> < 0.05). These features were enriched in pathways that include the TCA cycle and fatty acid, lysine, tyrosine, tryptophan, butanoate, and purine metabolism (Figures 1 and 2). Additionally, changes over time differed by visceral adipose tissue quantity (VAT × time interaction) for 396 (C18) and 2290 (HILIC) features from baseline to 120 minutes, and for 486 (C18) and 520 (HILIC) features from baseline to 360 minutes (all <i>p</i> < 0.05). These features were enriched in pathways that include fatty acid oxidation, omega-3 and omega-6 fatty acid, vitamin C, and pentose phosphate metabolism (Figures 3 and 4).</p><p><b>Best of ASPEN - Malnutrition, Obesity, Nutrition Practice Concepts, and Issues</b></p><p>Ana Paula Pagano, MSc<sup>1</sup>; Taiara Poltronieri, BSc<sup>1,2</sup>; William Evans, PhD<sup>3</sup>; M.
Cristina Gonzalez, MD, PhD<sup>4</sup>; Anil Abraham Joy, MD<sup>5</sup>; Claude Pichard, MD, PhD<sup>6</sup>; Carla Prado, PhD, RD<sup>1</sup></p><p><sup>1</sup>University of Alberta, Edmonton, AB, Canada; <sup>2</sup>Federal University of Rio Grande do Sul, Porto Alegre, Rio Grande do Sul, Brazil; <sup>3</sup>University of California, Berkeley, CA; <sup>4</sup>Federal University of Pelotas, Pelotas, Rio Grande do Sul, Brazil; <sup>5</sup>University of Alberta/Cross Cancer Institute, Edmonton, AB, Canada; <sup>6</sup>Geneva University Hospital, Geneva, Switzerland</p><p><b>Financial Support</b>: ASPEN (American Society for Parenteral and Enteral Nutrition) Rhoads Research Foundation, and the Canadian Institutes of Health Research (CIHR) (FRN 159537).</p><p><b>Background</b>: An accurate understanding of energy requirements is essential for tailored nutritional interventions in patients with cancer. Under- or overestimating these needs can lead to detrimental weight loss or excessive gain. Yet determining energy needs in cancer is challenging due to factors like individual tumor burden, treatment, and inflammation, all of which can influence energy requirements. Current guidelines offer a broad caloric intake range (25-30 kcal/kg/d) as normal values, which lacks strong evidence. As a result, dietitians often rely on predictive equations, which have proven imprecise; meanwhile, the standard techniques available to accurately measure energy requirements are costly, time-consuming, and not practical in clinical settings. In this study, we leveraged a cohort of patients with breast cancer to evaluate the accuracy of a novel bedside device designed to measure resting energy expenditure (REE), comparing it against a gold-standard method.</p><p><b>Methods</b>: REE data were obtained cross-sectionally from adult females with breast cancer (stages I-III) during a 10-minute test with a novel portable device, the Q-NRG® (Cosmed, Rome, Italy), and compared against REE measured during a 1-hour test in a whole-room indirect calorimeter (WRIC) as the gold-standard technique. To assess and describe REE accuracy between methods, we used the paired-samples t-test, or the Wilcoxon signed-rank test in instances of non-normality. Accuracy was determined by the percentage of estimates that fell within 10% of the values measured by WRIC. Additionally, Bland-Altman analysis was conducted to determine bias and establish the lower and upper limits of agreement (LOA). A p-value of less than 0.05 was considered statistically significant.</p><p><b>Results</b>: REE was evaluated in 49 females (age 55.9 ± 11.8 y; 42 with stage I or II and 7 with stage III breast cancer) using both the WRIC and the new portable device. Most patients (63.3%) had a body mass index (BMI) classification within the overweight or obesity categories, and none were categorized as underweight. The new portable device provided accurate measurements for over 70% (n = 35) of patients, with measurements within 10% of those obtained by WRIC. However, the new portable device overestimated REE for 1 patient and underestimated it for 13. Measured REE differed significantly between techniques, with the new portable device underestimating REE compared to the WRIC (1406 ± 262 vs 1508 ± 248 kcal/d; <i>p</i> < 0.001).
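</p><p>Bland-Altman bias and limits of agreement, as used here, reduce to summary statistics of the paired device-minus-WRIC differences; a minimal sketch with hypothetical paired REE values:</p><pre>
import numpy as np

device = np.array([1350.0, 1480.0, 1290.0, 1600.0, 1410.0])  # portable device, kcal/d
wric = np.array([1460.0, 1530.0, 1400.0, 1650.0, 1500.0])    # whole-room calorimeter

diff = device - wric
bias = diff.mean()                      # mean difference
half_width = 1.96 * diff.std(ddof=1)   # 95% limits of agreement around the bias
print(bias, bias - half_width, bias + half_width)  # bias, lower LOA, upper LOA
</pre><p>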
The bias between the new portable device and the WRIC was −6.7% (LOA = −24.9%, 11.6%; variance = 36.5%) or −102 kcal (LOA = −378 kcal, 174 kcal; variance = 552 kcal).</p><p><b>Conclusion</b>: When compared to a gold-standard technique, the new portable device showed good agreement at the group level, with REE measurement discrepancies falling within 10% of values determined by the WRIC. Although greater variability was observed at the individual level, the new portable device accurately assessed REE in comparison to the WRIC for most patients. Thus, the new portable device appears to be a promising tool for estimating the REE of patients with breast cancer, positioning it as a viable option for clinical settings.</p><p>Michelle Brown, MS, RD, LDN, CNSC</p><p>UF Health, Gainesville, FL</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Malnutrition is a highly prevalent issue in the healthcare setting, where the term refers to undernutrition. It occurs as a result of inadequate nutrition intake, impaired absorption, or altered utilization of nutrients; inflammation and hypermetabolism also contribute to its development. Estimates of the prevalence vary and run as high as 54%. In acute care hospitals, the prevalence of malnutrition is 39% when using diagnostic criteria from the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition (AND/ASPEN). Capturing and recognizing malnutrition is important, as this diagnosis is associated with a 3.4× higher rate of in-hospital death, a 1.9× longer length of stay, a 2.2× higher likelihood of being admitted with a serious infection, higher rates of discharge to a rehabilitation or long-term assisted care facility, an increased rate of readmissions, and a 73% increase in hospital costs. Due to the impact of malnutrition on healthcare costs and requirements for care, ICD-10 codes for malnutrition are considered comorbid conditions (CC) or major comorbid conditions (MCC). Accurate diagnosis, treatment, and documentation of malnutrition can improve patient care. Accurate documentation can also help capture complexity for quality metrics while allowing for selection of the correct DRG and base payment, which may increase reimbursement.</p><p><b>Methods</b>: An interdisciplinary nutrition committee at our organization, consisting of dietitians, physicians, nurses, and informatics professionals, completed a quality improvement implementation to improve malnutrition diagnosis rates, documentation, and coding. This was completed in four steps: (1) Identification of malnutrition criteria that could be used across the organization. Our committee elected to use the AND/ASPEN criteria for the diagnosis of malnutrition; these criteria are used by ~85% of hospitals and are widely recognized by payors. (2) Development of a documentation tool that would allow RD malnutrition diagnoses to populate provider progress notes. The hospital's electronic medical record (EMR) was leveraged to accomplish this goal: a novel flowsheet and Smartphrase were developed, which allowed information on malnutrition severity, signs/symptoms, and treatment (entered by the dietitian) to flow into physician progress notes automatically.
This solution met all the “best practices” for documentation identified by our interdisciplinary team: clear signs and symptoms of malnutrition identified; severity of malnutrition indicated and documented consistently between providers; consistent use of diagnostic criteria; and treatment for malnutrition provided and documented. (3) All clinical nutrition staff members were provided with hands-on training in the completion of nutrition-focused physical exams (NFPE), and completion of these exams was prioritized in all nutrition assessments. (4) When the malnutrition Smartphrase was not used, notes were sent to physicians for attestation and signature.</p><p><b>Results</b>: Following this implementation, dietitian-diagnosed malnutrition has been included in physician notes via Smartphrase in 65% of cases. In the six months following NFPE training, malnutrition diagnosis rates increased by 220%, and the percentage of dietitian assessments with a malnutrition diagnosis increased from 13% to 40%. Following the process of sending notes to physicians for attestation and signature, 94% of malnutrition diagnoses are coded in the EMR at discharge from the hospital, and coding queries to physicians decreased by 50%. Hospital reimbursement for dietitian-diagnosed malnutrition has increased from ~$65,000 per quarter to ~$2 million per quarter.</p><p><b>Conclusion</b>: Utilization of appropriate NFPE training, physician-approved diagnostic criteria, and EMR-based documentation solutions can increase diagnosis, documentation, and reimbursement for malnutrition diagnoses in hospitalized patients.</p><p><b>Research Trainee Award</b></p><p>Alan Garcia-Grimaldo<sup>1,2</sup>; Ivan A. Osuna-Padilla<sup>1</sup>; Nadia Rodriguez-Moguel<sup>1</sup>; Martin A. Rios-Ayala<sup>1</sup>; Marycarmen Godinez-Victoria<sup>2</sup></p><p><sup>1</sup>National Institute of Respiratory Diseases, Mexico City, DF, Mexico; <sup>2</sup>Escuela Superior de Medicina, Instituto Politécnico Nacional, Mexico City, DF, Mexico</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Intensive care unit-acquired weakness (ICU-AW) is characterized by peripheral muscle mass wasting, reduced muscle strength, and muscle dysfunction. Respiratory and swallowing-related muscles may also be affected by this condition. This study aimed to analyze the association between ICU-AW incidence and post-extubation dysphagia (P-ED).</p><p><b>Methods</b>: A prospective cohort study was conducted. Patients on mechanical ventilation (MV) admitted to the ICU were included; individuals with a previous diagnosis of myopathy were excluded. The NUTRIC-Score, calf circumference adjusted for BMI, and phase angle (PhA) obtained by bioelectrical impedance were assessed upon admission and after extubation. Biochemical variables (baseline C-reactive protein) were collected from medical records. The SOFA score, APACHE II score, and malnutrition diagnosis using the GLIM criteria were determined upon admission to the ICU. Cumulative energy (CED) and protein (CPD) deficits were calculated during the ICU stay. ICU-AW was diagnosed using the Medical Research Council Scale (MRC-Scale <48) and handgrip strength (<11 kg for men and <7 kg for women). Swallowing function assessment was performed within the first 24 hours after extubation using the Yale Swallowing Protocol (YSP).
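</p><p>A minimal sketch of the ICU-AW criteria stated above, reading the MRC and handgrip thresholds as jointly required (the abstract leaves the combination implicit; names are hypothetical):</p><pre>
def icu_aw(mrc_sum_score, grip_kg, sex):
    # MRC sum score below 48 plus sex-specific handgrip weakness.
    grip_cutoff = 11.0 if sex == 'M' else 7.0
    return mrc_sum_score < 48 and grip_kg < grip_cutoff

print(icu_aw(mrc_sum_score=42, grip_kg=9.0, sex='M'))  # True
</pre><p>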
For patients who did not meet the success criteria defined for the YSP, the volume-viscosity swallow test was performed to corroborate the presence of P-ED. Specific success and failure criteria proposed for each test were used. Mean and median comparison tests were performed for each variable between the group with P-ED and those with normal swallowing. Associations were analyzed using univariate and multivariate logistic and linear regressions. Covariate selection was performed using a stepwise method.</p><p><b>Results</b>: Fifty-four patients were included; 19 (35.2%) were diagnosed with P-ED and 32 (59.3%) with ICU-AW. Patients with P-ED showed lower values for PhA at extubation, MRC-Scale, and handgrip strength at extubation. In addition, more days on invasive MV and higher CED and CPD were observed in this group (Table 1). In the univariate logistic regression analysis, PhA at extubation, CED, CPD, ICU-AW diagnosis, and days on MV were associated with P-ED identification. In multivariate regression analysis, only days on MV and the ICU-AW diagnosis were independently associated with P-ED (Table 2).</p><p><b>Conclusion</b>: Days on invasive mechanical ventilation and ICU-acquired weakness diagnosis were predictors for post-extubation dysphagia. Novel clinical and nutritional strategies are required to prevent ICU-acquired muscle weakness and its consequences, which may improve clinical outcomes and quality of life after extubation.</p><p>Ahron Lee, RD<sup>1,2</sup>; Eun-Mee Kim, RD<sup>1</sup>; Bo-eun Kim, RD<sup>1</sup>; Chi-Min Park, MD, PhD<sup>3</sup>; Sung Nim Han, PhD<sup>2</sup></p><p><sup>1</sup>Department of Dietetics, Samsung Medical Center, Seoul, Korea, Republic of (South); <sup>2</sup>Department of Food and Nutrition, College of Human Ecology, Seoul National University, Seoul, Korea, Republic of (South); <sup>3</sup>Department of Critical Care Medicine and Surgery, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul, Korea, Republic of (South)</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: The importance of “appropriate” nutrition support in the early stages of intensive care unit (ICU) admission is under debate, including which patients require it, when to initiate it, and how much to provide. In this study, the characteristics and clinical outcomes of malnourished patients diagnosed using the Global Leadership Initiative on Malnutrition (GLIM) criteria were examined. Also, the actual implementation of nutritional support and its relationship with clinical outcomes based on nutrition status were investigated.</p><p><b>Methods</b>: This retrospective cohort study included critically ill patients receiving invasive mechanical ventilation who were admitted to the ICU and hospitalized for at least 7 days between January 1, 2020, and December 31, 2022. Nutritional and clinical data during their first 10 days in the ICU were collected. All the patients in this study underwent nutrition assessment by the GLIM criteria. Ninety-day mortality was analyzed according to GLIM-diagnosed malnutrition and its degree. Patients were divided into three energy intake categories (<10 kcal/kg/d, 10–20 kcal/kg/d, and >20 kcal/kg/d) and three protein intake categories (<0.8 g/kg/d, 0.8–1.2 g/kg/d, and >1.2 g/kg/d). 
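These intake categories amount to simple threshold binning; a minimal sketch, assuming pandas and hypothetical per-patient mean intakes (boundary handling is an assumption, as the abstract does not specify it):

```python
# Bin hypothetical mean daily intakes into the stated categories.
import pandas as pd

df = pd.DataFrame({"energy_kcal_kg_d": [8.5, 14.0, 23.1],
                   "protein_g_kg_d": [0.6, 1.0, 1.4]})

df["energy_cat"] = pd.cut(df["energy_kcal_kg_d"],
                          bins=[-float("inf"), 10, 20, float("inf")],
                          labels=["<10", "10-20", ">20"], right=False)
df["protein_cat"] = pd.cut(df["protein_g_kg_d"],
                           bins=[-float("inf"), 0.8, 1.2, float("inf")],
                           labels=["<0.8", "0.8-1.2", ">1.2"], right=False)
print(df)
```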
Information on intake was categorized by phase following ICU admission (days 1–3 for the early acute phase, days 4–6 for the late acute phase, and days 7–10 for the recovery phase). We examined the differences in mortality among groups separated by energy and protein intake at each phase. The analyses were performed for the total cohort, well-nourished, and malnourished groups. Differences in the means and distribution were evaluated, and survival analyses and regression analyses were performed.</p><p><b>Results</b>: A total of 595 patients were included. The prevalence of malnutrition according to the GLIM criteria was 61% (n = 362). The 90-day mortality in the well-nourished and the malnourished groups was 45% and 58%, respectively (<i>P</i> < 0.001). Mortality differed by the degree of malnutrition (well-nourished 45%, moderately malnourished 53%, severely malnourished 61%, <i>P</i> = 0.001). In the early acute phase and late acute phase, there was no difference in mortality among different energy intake groups. However, in the recovery phase, the group with high energy intake (>20 kcal/kg/d) showed lower mortality (hazard ratio (HR) 0.602; 95% confidence interval (CI) 0.413 to 0.877; <i>P</i> = 0.008) in the total cohort. In well-nourished patients, the high energy intake group tended to have lower mortality (HR 0.573; 95% CI 0.318 to 1.034; <i>P</i> = 0.064) in the recovery phase. However, in malnourished patients, the group with high energy intake showed significantly lower mortality (HR 0.549; 95% CI 0.333 to 0.903; <i>P</i> = 0.018) in the recovery phase. In the early acute phase and late acute phase, there was no difference in mortality among different protein intake groups. However, in the recovery phase, the group with moderate protein intake (0.8–1.2 g/kg/d) showed lower mortality (HR 0.770; 95% CI 0.599 to 0.990; <i>P</i> = 0.041) in the total cohort. When well-nourished patients and malnourished patients were analyzed separately, a significantly lower mortality (HR 0.728; 95% CI 0.536 to 0.988; <i>P</i> = 0.042) in the recovery phase was observed with moderate protein intake among malnourished patients.</p><p><b>Conclusion</b>: Malnutrition diagnosed by the GLIM criteria was associated with 90-day mortality and other clinical outcomes. Furthermore, energy and protein intake in the recovery phase after ICU admission was associated with mortality, especially in malnourished patients classified by the GLIM criteria. Therefore, phase-specific nutritional intake tailored to nutrition status may be relevant for optimizing ICU nutrition support strategies.</p><p><b>Best International Abstract</b></p><p><b>International Abstract of Distinction</b></p><p>Fabio Araujo, RD, MHS<sup>1</sup>; Maureen Tosh, PT<sup>1</sup>; Maitreyi Kothandaraman, MD, MSc, FRCPC, CAGF<sup>2</sup>; Juan Posadas, MD, MSc<sup>1,2</sup>; Paul Wischmeyer, MD, EDIC, FASPEN, FCCM<sup>3</sup>; Priscilla Barreto, RD<sup>4</sup>; Chelsia Gillis, RD, PhD, CNSC<sup>5</sup></p><p><sup>1</sup>Alberta Health Services, Calgary, AB, Canada; <sup>2</sup>University of Calgary, Calgary, AB, Canada; <sup>3</sup>Duke University School of Medicine, Durham, NC; <sup>4</sup>Hospital Naval Marcilio Dias, Rio de Janeiro, RJ, Brazil; <sup>5</sup>McGill University School of Human Nutrition, Montreal, QC, Canada</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Functional capacity is the most relevant outcome after critical illness according to ICU survivors. 
This outcome is especially pertinent because, as adult ICU mortality has decreased, more survivors are left with impaired functional capacity, delayed return to work, and low quality of life. Protein via nutrition support (NS) has the potential to mitigate ICU-acquired weakness, but given that current ICU protein targets are benchmarked against mortality and ICU-related complications, it is unknown whether these targets also support functional recovery. To address this gap, we conducted a retrospective cohort study to determine whether different protein intake doses influenced the functional capacity of ICU survivors with a length of stay (LOS) ≥7 days, measured by the Chelsea Critical Care Physical Assessment tool (CPAx) at ICU discharge, a validated measure of functional capacity with robust reliability, measurement error, and responsiveness.</p><p><b>Methods</b>: The medical records of all consecutive patients admitted to a general systems ICU between October 2014 and September 2020 were reviewed. Inclusion criteria were age ≥18 years, survived ICU admission, ICU stay ≥7 days, and received NS. Exclusion criteria included neuromuscular disorders, brain/spinal cord injury, limb amputation, orthopedic fractures, persistent coma during ICU stay, missing CPAx, and mechanical ventilation <3 days. Eligible patients were divided into 4 groups guided by previous literature exploring the effect of daily protein intake in the ICU (g/kg/d) on mortality: LOW (<0.8), MEDIUM (0.8-1.19), HIGH (1.2-1.5), and VERY HIGH (>1.5). Groups with similar CPAx were pooled to enhance precision. The effect of protein dose on CPAx was assessed with analysis of covariance (ANCOVA) adjusting for the confounding variables age, disease severity, length of stay in hospital before ICU admission, duration of mechanical ventilation, and time until start of NS in the ICU. Effect modification by nutritional status was assessed with stratification according to subjective global assessment (SGA A: well-nourished and B/C: malnourished). The effect of energy intake was assessed using the same regression model (<25 and ≥25 kcal/kg/d; <70% and ≥70% daily adequacy).</p><p><b>Results</b>: Inclusion/exclusion criteria were met by 531 patients. CPAx was non-linearly associated with protein doses (Figure 1) and was not statistically different among the LOW, MEDIUM, and VERY HIGH groups. All groups differed from HIGH (<i>p</i> = 0.003), indicating that data could be pooled into 2 groups: HIGH (1.2-1.5 g/kg/d) and POOLED (<1.2 and >1.5 g/kg/d). Baseline characteristics were comparable between both groups (Table 1). Mean CPAx (±standard error) was greater in the HIGH vs POOLED group (30.1 ± 0.7 vs. 26.8 ± 0.6, <i>p</i> = 0.001), suggesting that HIGH was associated with superior functional capacity at discharge. The mean difference (MD) remained statistically significant after adjusting for confounding variables (CPAx MD: 3.4 ± 1.1, <i>p</i> = 0.003 in the 4-group model and 3.3 ± 0.9, <i>p</i> = 0.001 in the 2-group model). Energy intake had no effect on CPAx, whether expressed as kcal/kg/d (28.1 ± 0.6 for <25 vs 27.9 ± 0.8 for ≥25, <i>p</i> = 0.780) or as adequacy (27.3 ± 0.9 for <70% vs 28.4 ± 0.6 for ≥70%, <i>p</i> = 0.641). 
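The covariate-adjusted comparisons reported above follow the general ANCOVA pattern sketched here; statsmodels is assumed, and the data file and column names are purely hypothetical:

```python
# Schematic ANCOVA: outcome modeled on group plus the listed covariates; the
# group coefficient is the covariate-adjusted mean difference.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("icu_cohort.csv")  # hypothetical extract with these columns

model = smf.ols(
    "cpax ~ C(protein_group) + age + disease_severity"
    " + pre_icu_hospital_days + mv_days + days_to_ns_start",
    data=df,
).fit()
print(model.summary())
```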
Nutritional status was not an effect modifier, as the HIGH group had superior CPAx in both well-nourished (MD 3.8 ± 1.7, <i>p</i> = 0.029) and malnourished (MD 2.5 ± 1.1, <i>p</i> = 0.031) patients.</p><p><b>Best of ASPEN - Critical Care and Critical Health Issues</b></p><p><b>International Abstract of Distinction</b></p><p>Chin Han Charles Lew, APD, PhD<sup>1</sup>; Zheng-Yii Lee, PhD<sup>2,3</sup>; Andrew Day, MSc<sup>4</sup>; Xuran Jiang, MSc<sup>4</sup>; Danielle E. Bear, RD, PhD<sup>5,6</sup>; Gordon L. Jensen, MD, PhD<sup>7</sup>; Pauline Y. Ng, MBBS, MRCP(UK), FHKCP, FHKAM<sup>8</sup>; Lauren Tweel, RD, CNSC, MSc<sup>9</sup>; Angela Parillo, RD, LD, CNSC, MSc<sup>10</sup>; Daren K. Heyland, MD, MSc<sup>4</sup>; Charlene Compher, PhD, RD, LDN, FASPEN<sup>11</sup></p><p><sup>1</sup>Dietetics and Nutrition Department, Ng Teng Fong General Hospital, Singapore; <sup>2</sup>Department of Anesthesiology, Faculty of Medicine, Universiti Malaya, 50603 Kuala Lumpur, Kuala Lumpur, Malaysia; <sup>3</sup>Department of Cardiac Anesthesiology & Intensive Care Medicine, Berlin, Germany; <sup>4</sup>Clinical Evaluation Research Unit, Department of Critical Care Medicine, Queen's University, Kingston, ON, Canada; <sup>5</sup>Department of Critical Care, Guy's and St Thomas' NHS Foundation Trust, St Thomas' Hospital, London, United Kingdom; <sup>6</sup>Department of Nutrition and Dietetics, Guy's and St Thomas' NHS Foundation Trust, St Thomas' Hospital, London, United Kingdom; <sup>7</sup>University of Vermont Larner College of Medicine, Burlington, VT; <sup>8</sup>Critical Care Medicine Unit, School of Clinical Medicine, The University of Hong Kong, Hong Kong; <sup>9</sup>Rutgers University, New Brunswick, NJ; <sup>10</sup>The Ohio State University Wexner Medical Center, Department of Clinical Nutrition, Columbus, OH; <sup>11</sup>University of Pennsylvania School of Nursing, Philadelphia, PA</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Pre-existing malnutrition is common among critically ill patients (38-78%), and it can be diagnosed using tools such as the Global Leadership Initiative on Malnutrition (GLIM) criteria and the Academy of Nutrition and Dietetics and American Society for Parenteral and Enteral Nutrition (ASPEN) Indicators of Malnutrition (AAIM). However, it is unclear if these tools or their individual components (nutrition parameters [NPs]), such as weight, diet history, body mass index (BMI), or muscle mass, have better clinical utility and validity in the intensive care unit (ICU) setting, since certain NPs can be easier to obtain (e.g., BMI) than others (e.g., weight history). More importantly, it is unclear if treating malnutrition according to the 2021 ASPEN guidelines (which recommend delivering 12-25 kcal/kg/d of energy and 1.2-2 g/kg/d of protein) is associated with improved clinical outcomes. We investigated whether GLIM, AAIM, and/or selected individual NPs measured at ICU admission were associated with time to discharge alive (TTDA; primary outcome), 60-day mortality, or home discharge, and whether a higher protein delivery modified those associations.</p><p><b>Methods</b>: This was a post hoc analysis of the EFFORT Protein trial (n = 1301), the largest multinational, multicenter trial that compared higher vs. usual protein delivery in critically ill patients. The malnutrition statuses of patients were retrospectively classified according to GLIM and AAIM using NPs that were prospectively collected at ICU admission. 
For GLIM, acute disease-related inflammation formed the etiologic factor for all patients since they were critically ill, and malnutrition severity was classified according to the phenotypic parameters (severity of weight loss, low BMI, reduced muscle mass). For AAIM, a modified approach was adopted as certain NPs were not collected (ie, reduced energy intake or weight loss for periods < 1 month, fluid accumulation, and grip strength); hence, malnutrition status was classified by the patient's weight loss severity and any reduction in energy intake. Multivariable regressions were used to identify whether malnutrition diagnosed by GLIM and AAIM (both dichotomized by “not identified as malnourished” vs. “moderate/severe malnutrition”) and/or individual NPs were associated with outcomes, and whether protein delivery modified their associations.</p><p><b>Results</b>: Table 1 summarizes the characteristics of patients according to their malnutrition status classified by GLIM. Of 1301 predominantly medical admissions, 41% and 14% of the patients were malnourished according to GLIM and AAIM, respectively. Malnutrition diagnosed by GLIM and AAIM was independently associated with extended TTDA (<i>p</i> = 0.03 and <i>p</i> = 0.01, respectively), higher odds of 60-day mortality (<i>p</i> = 0.02 and <i>p</i> = 0.01), and lower odds of home discharge (<i>p</i> = 0.03 and <i>p</i> = 0.05), whereas individual NPs were not (<i>p</i> > 0.10). However, higher protein delivery did not modify the association between malnutrition (diagnosed by GLIM and AAIM) and worse outcomes (Table 2). Notably, in patients with BMI < 18.5 kg/m<sup>2</sup> (n = 78), higher protein delivery was associated with a shorter TTDA (adjusted hazard ratio 2.68, 95% confidence interval [CI] 1.14-6.30) and greater odds of home discharge (adjusted odds ratio 4.61, 95% CI 1.35-15.71) than usual protein delivery.</p><p>Elias Wojahn, B.S.; Liyun Zhang, MS; Amy Y. Pan, PhD; Theresa Mikhailov, MD, PhD</p><p>Medical College of Wisconsin, Milwaukee, WI</p><p><b>Financial Support</b>: Medical College of Wisconsin.</p><p><b>Background</b>: Previous guidelines lacked sufficient data to comment on the safety of enteral nutrition in critically ill children. A more recent study indicated that enteral nutrition was indeed safe for critically ill children receiving vasoactive medication. Additional data in adults indicated that septic shock patients treated with vasoactive medication and given early enteral nutrition have better outcomes than patients given no nutrition. We retrospectively investigated a similar premise in pediatric patients to determine (1) the frequency of use of early enteral versus parenteral nutrition for patients admitted to the pediatric intensive care unit (PICU) with septic shock and receiving vasoactive medication and (2) the impact of early enteral versus parenteral nutrition on PICU length of stay (LOS) and mortality for patients admitted with septic shock and treated with vasoactive medication. 
We hypothesized that (1) clinical practices have changed over recent years such that early enteral nutrition has been administered more frequently to pediatric septic shock patients treated with vasoactive medication and (2) receiving early enteral nutrition as a PICU patient treated for septic shock with vasoactive medications was associated with better outcomes.</p><p><b>Methods</b>: We obtained demographic and outcome data for pediatric patients admitted to Children's Hospital of Wisconsin for septic shock and treated with vasoactive medications within a 5-year range from the Virtual Pediatric Systems, LLC (VPS) database, a data registry for PICU patients. We obtained clinical data, including details of enteral and parenteral nutrition administered and use of vasoactive medications, by chart review. We quantified the use of vasoactive medications by Vasoactive-Inotrope Score (VIS). We quantified the severity of illness by PRISM III probability of death. We considered medical LOS and mortality as clinical outcomes. We compared categorical variables by Chi-square tests and compared continuous variables by Mann-Whitney tests or Kruskal-Wallis tests. <i>P</i> values < 0.05 were considered statistically significant.</p><p><b>Results</b>: We identified 637 patients aged 0-21 years treated in the PICU with a diagnosis of septic shock. Of these, 401 received vasoactive medication, 183 received early enteral nutrition, and 81 received early parenteral nutrition. Those given early parenteral nutrition had longer LOS (median (IQR): 7.0 (2.2-23.2) days) than those not fed (median (IQR): 2.1 (1.1-5.1) days) (<i>p</i> < 0.0001), but did not differ from those fed enterally (median (IQR): 7.9 (3.7-15.2) days) (<i>p</i> = 0.95). After controlling for severity of illness, patients who received early parenteral nutrition were more likely to die than those receiving early enteral nutrition or those who were not fed at all (parenteral vs. enteral: 17.8% vs. 4.6%, <i>p</i> = 0.002; parenteral vs. none: 17.3% vs. 6.7%, <i>p</i> = 0.002). Mortality did not differ between patients who received early enteral nutrition and those not fed (4.6% vs. 6.7%, <i>p</i> = 0.43).</p><p><b>Conclusion</b>: Early enteral nutrition was given more frequently than early parenteral nutrition. Early enteral nutrition was not significantly associated with improved outcomes as measured by length of stay and mortality, but early parenteral nutrition was associated with significantly worse outcomes. This suggests that clinical guidelines should favor the use of enteral feeding in septic shock patients receiving vasoactive medication.</p><p><b>Best of ASPEN - Critical Care and Critical Health Issues</b></p><p><b>International Abstract of Distinction</b></p><p>Lu Ke, PhD<sup>1</sup>; Cheng Lv, PhD Candidate<sup>1</sup>; Lingliang Zhou, MD Candidate<sup>2</sup>; Weiqin Li, PhD<sup>1</sup></p><p><sup>1</sup>Nanjing University, Nanjing, Jiangsu, China; <sup>2</sup>Southeast University, Nanjing, Jiangsu, China</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: There is controversy over the optimal early protein delivery in critically ill patients with acute kidney injury (AKI). This study aimed to evaluate whether the association between early protein delivery and 28-day mortality was impacted by the presence of AKI in critically ill patients.</p><p><b>Methods</b>: This is a secondary analysis of a multicenter cluster-randomized controlled trial enrolling newly admitted critically ill patients (N = 2772). 
Participants with complete data on baseline renal function and 28-day mortality were included in this study. Cox proportional hazards models were used to investigate whether early protein delivery, reflected by mean protein delivery from day 3 to day 5 after enrollment, was associated with 28-day mortality and whether baseline AKI stage modified this association.</p><p><b>Results</b>: Overall, 2,618 patients were included (Table 1), among whom 628 (24.0%) had AKI at enrollment (118 stage I, 97 stage II, 413 stage III). Mean early protein delivery was 0.60 ± 0.38 g/kg/d among the study patients (Figure 1). In the overall study cohort, each 0.1 g/kg/d increase in protein delivery was associated with a 5% reduction in 28-day mortality (hazard ratio [HR] = 0.95; 95% confidence interval [CI] 0.92-0.98, <i>P</i> < 0.001). Also, when stratifying early protein delivery by tertiles, compared with low protein delivery, the risk of 28-day mortality decreased in both the medium protein group (HR = 0.64; 95% CI 0.50-0.82, <i>P</i> < 0.001) and the high protein group (HR = 0.71; 95% CI 0.55-0.91, <i>P</i> = 0.007) after adjusting for potential confounders (Figure 2). The association between early protein delivery and 28-day mortality in patients with different baseline AKI stages showed significant heterogeneity (adjusted interaction <i>P</i> = 0.047). With each 0.1 g/kg/d increase in protein delivery, 28-day mortality decreased by 5% (HR = 0.95; 95% CI 0.92-1.00, <i>P</i> = 0.008) in patients without AKI and by 7% (HR = 0.93; 95% CI 0.86-0.99, <i>P</i> = 0.043) in those with AKI stage III, of whom 72% were on renal replacement therapy upon enrollment. However, these associations were not observed among AKI stage I and II patients. The mortality trends up to day 28 for early protein delivery in different AKI stage groups are depicted in Figure 3.</p><p><b>Conclusion</b>: Higher early protein delivery during days 3-5 of ICU stay was associated with lower 28-day mortality in critically ill patients without AKI and in those with AKI stage III, but not in those with AKI stage I or II.</p><p><b>Figure 3</b>. The trends of 28-day mortality with early protein delivery in different AKI stages.</p><p>Stanislaw J. Gabryszewski, MD, PhD<sup>1</sup>; David A. Hill, MD, PhD<sup>1,2</sup></p><p><sup>1</sup>Children's Hospital of Philadelphia, Philadelphia, PA; <sup>2</sup>University of Pennsylvania, Philadelphia, PA</p><p><b>Financial Support</b>: This work was supported by the National Institutes of Health (Grants T32HD043021 to SJG; K08DK116668 and R01HL162715 to DAH).</p><p><b>Background</b>: The ketogenic diet (KD) is a high-fat, moderate-protein, low-carbohydrate diet that induces ketosis, a metabolic shift characterized by the use of fatty acid-derived ketone bodies rather than glucose to meet energy needs. While the KD is best known as a dietary therapy for refractory epilepsy, there is growing interest in identifying other diseases in which the KD may be therapeutic. Recent studies have revealed the potential of the KD to dampen inflammation and pathology in mouse models of allergic asthma. 
However, it is unclear whether the KD has such immunoregulatory effects in other allergic diseases, such as the gastrointestinal allergy eosinophilic esophagitis (EoE).</p><p><b>Methods</b>: We studied the effects of the KD in a mouse model of EoE in which 10-week-old C57BL/6 mice were topically treated with the vitamin D analog MC903 and the egg white allergen ovalbumin (OVA) on days 0 to 11 to induce eczema-like dermatitis and allergic sensitization, respectively. The effect of the KD following allergic sensitization was studied by feeding mice the KD or a regular diet (RD) starting on day 12. Mice were provided with OVA-supplemented water and gavaged with OVA on days 18-20. On day 21, mice were euthanized and tissues harvested to quantify esophageal eosinophilia and to phenotype immune responses in draining lymph nodes via flow cytometry.</p><p><b>Results</b>: Following induction of EoE, mice in both the KD (n = 17) and RD (n = 17) arms exhibited 100% survival at day 21. Weight recovery (percent of original weight ± SEM) at day 21 was comparable between KD-fed (104.1 ± 1.7%) and RD-fed (99.0 ± 3.2%) mice (<i>p</i> > 0.05). Analysis of esophageal eosinophilia at day 21 revealed significantly decreased numbers (total cells ± SEM) of Siglec-F<sup>+</sup> CD11b<sup>+</sup> eosinophils in KD-fed (711 ± 345 cells) versus RD-fed (880 ± 225 cells) mice (<i>p</i> < 0.05). There was a non-significant reduction in the percentage of esophageal eosinophils (percent of CD45<sup>+</sup> cells ± SEM) in KD-fed (5.1 ± 1.2%) versus RD-fed (8.1 ± 1.5%) mice (<i>p</i> = 0.138). In immunophenotyping of phorbol myristate acetate- and ionomycin-stimulated cells from draining lymph nodes at day 21, there was a significantly increased percentage (percent of CD4<sup>+</sup> T cells ± SEM) of Foxp3<sup>+</sup> T regulatory (Treg) cells in KD-fed (6.5 ± 1.1%) versus RD-fed (3.3 ± 0.4%) mice (<i>p</i> < 0.01).</p><p><b>Conclusion</b>: In this mouse model of OVA-induced EoE, we observed a modest inhibitory effect of the KD on the recruitment of eosinophils to the esophagus. As compared with the RD, the KD was associated with increased proportions of Foxp3<sup>+</sup> Tregs in draining lymph nodes of mice with EoE. Additional mechanistic investigations are warranted, including determination of the necessity of Tregs for KD-induced inhibition of esophageal eosinophilia. This study highlights the promise of immunomodulatory dietary interventions in the context of allergic disease.</p><p>Hassan S. Dashti, PhD, RD<sup>1</sup>; Magdalena Sevilla, Ph.D.<sup>1</sup>; Kris Mogensen, MS, RD-AP, LDN, CNSC<sup>2</sup>; Charlene Compher, PhD, RD, LDN, FASPEN<sup>3</sup></p><p><sup>1</sup>Massachusetts General Hospital, Boston, MA; <sup>2</sup>Brigham and Women's Hospital, Boston, MA; <sup>3</sup>University of Pennsylvania School of Nursing, Philadelphia, PA</p><p><b>Financial Support</b>: Research reported in this publication was supported by the American Society for Parenteral and Enteral Nutrition (ASPEN) Rhoads Research Foundation awarded to Hassan S. Dashti.</p><p><b>Background</b>: Patients living with short bowel syndrome (SBS) receiving home parenteral nutrition (HPN) commonly receive nutritional infusions overnight, contributing to sleep and circadian disruption. Aligning nutritional intake with the circadian clock is expected to yield substantial benefits for vulnerable populations by limiting circadian misalignment (i.e., a mismatch between the circadian system and behaviors) and by influencing other pathways. 
Recent advances in metabolomic profiling (systematic profiling of cellular metabolites, i.e., sugars, amino acids, organic acids, nucleotides, and lipids) have emerged as a promising tool for identifying relevant biological pathways. Our objective was to characterize metabolites that differ between daytime and overnight HPN infusions in adults with SBS habitually receiving HPN.</p><p><b>Methods</b>: The present study was a secondary analysis of a controlled, single-arm 2-week pilot and feasibility trial designed to compare daytime to overnight infusions of HPN in adults with SBS consuming HPN (ClinicalTrials.gov: NCT04743960). Enrolled patients received 1 week of HPN infusions overnight followed by 1 week of HPN infusions during the daytime (approximately 12-hour change in infusion start time). Duration, frequency, and composition of infusions remained identical during the two study periods. Following each 1-week study period, patients had a venous blood sample collected at clinical visits. Plasma samples were analyzed using Ultrahigh Performance Liquid Chromatography-Tandem Mass Spectrometry, and global metabolic profiles were determined. Of 1015 measured metabolites, only the 622 metabolites with non-missing data across all samples were analyzed. Data were normalized to the volume of sample extracted and then log-transformed and z-score scaled prior to analysis. Differential metabolite abundance between the two study periods (daytime vs. overnight) was determined using standard Linear Models for Microarray Data (LIMMA) models adjusted for dietary fasting duration and time since the end of the last HPN infusion. Pathway enrichment analysis was then conducted using MetaboAnalyst's pathway enrichment tool.</p><p><b>Results</b>: Nine patients (age, 52 years; 80% female; BMI 21.3 kg/m<sup>2</sup>) completed the trial and provided two fasting blood samples. Both blood draws were completed at approximately 11:20 am following at least an 8-hour fast and at least 8 hours from the end of an HPN infusion. Changes were detected in 36 metabolites at <i>P</i> < 0.05; top-changing metabolites were mostly long-chain and polyunsaturated fatty acids (dihomo-gamma-linolenic acid, arachidonate [20:4n6], docosahexaenoate [DHA; 22:6n3]) and glycerolipids (Figure 1). No metabolites remained significant at the more stringent false discovery rate (FDR) threshold. Enrichment analysis of the 36 metabolites identified pathways related to the biosynthesis of unsaturated fatty acids, D-arginine and D-ornithine metabolism, and linoleic acid metabolism, among others (Figure 2).</p><p>Astrid Verbiest, MSc<sup>1,2</sup>; Mark K. Hvistendahl, MS, PhD<sup>3</sup>; Federico Bolognani, MD, PhD<sup>4</sup>; Carrie Li, MS, PhD<sup>4</sup>; Nader N. Youssef, MD, MBA, FACG<sup>4</sup>; Francisca Joly, MD, PhD<sup>5</sup>; Palle B. 
Jeppesen, MD, PhD<sup>3</sup>; Tim Vanuytsel, Associate Professor<sup>1,2</sup></p><p><sup>1</sup>Leuven Intestinal Failure and Transplantation Center (LIFT), University Hospitals Leuven, Leuven, Belgium; <sup>2</sup>Translational Research Center for Gastrointestinal Disorders (TARGID), University of Leuven, Leuven, Belgium; <sup>3</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark; <sup>4</sup>VectivBio, Basel, Switzerland; <sup>5</sup>Centre for Intestinal Failure, Department of Gastroenterology and Nutritional Support, Hôpital Beaujon, Clichy, France</p><p><b>Financial Support</b>: This research was supported by VectivBio AG.</p><p><b>Background</b>: Short bowel syndrome (SBS) is a severe organ failure condition with a high risk of developing intestinal failure (SBS-IF) and life-long parenteral support (PS) dependence. Glucagon-like peptide-2 (GLP-2) analogs stimulate adaptation of the remaining intestine, resulting in increased intestinal absorption and reduced PS needs. Extensive literature is available on the effect of the short-acting GLP-2 analog teduglutide in patients without a remaining colon. However, the impact of GLP-2 analogs on fluid and energy absorption in SBS-IF with a colon-in-continuity (CiC) is unclear. Apraglutide (APRA) is a novel, long-acting synthetic GLP-2 analog that is in development for SBS-IF. We performed a pre-defined interim analysis of a phase 2 study in SBS-IF-CiC to investigate the safety and efficacy of 4 weeks of apraglutide treatment based on metabolic balance studies (MBS).</p><p><b>Methods</b>: STARS Nutrition is a 52-week multicenter, open-label phase 2 study in adult patients with SBS-IF-CiC receiving once-weekly subcutaneous apraglutide injections (5 mg). MBS were performed at baseline and after 4 weeks with stable PS, followed by a 48-week PS adjustment period. During MBS, fluid intake was kept constant (individual predefined drinking menu). Duplicates of meals and fluids (wet weight intake), urine, and feces (fecal wet weight output) were collected. Safety was the primary endpoint. Secondary endpoints included changes in fecal wet weight output, urinary output, wet weight absorption, and energy absorption. Data are presented as mean (95% CI). <i>P</i> values < 0.05 were considered significant (Wilcoxon matched-pairs signed rank test).</p><p><b>Results</b>: Nine patients were included and comprise the full study population. Apraglutide was well tolerated, with no dose discontinuation or interruption. No adverse events (AEs) were considered notable based on their nature or severity. At baseline, patients received a weekly PS volume of 10 (range, 4-21) L. Small bowel length was 19 (range, 0-50) cm, and 79% (range, 43-100%) of the colon was in continuity. Fecal wet weight output decreased significantly by 253 (−437 to −68) g/day (<i>p</i> = 0.012). Relative wet weight absorption increased by 9% (1 to 18%) (<i>p</i> = 0.039). There was a numeric increase in urinary output (<i>p</i> = 0.129). No significant changes in energy absorption were observed (Table 1).</p><p>Palle B. 
Jeppesen, MD, PhD<sup>1</sup>; Tim Vanuytsel, Associate Professor<sup>2</sup>; Sukanya Subramanian, Physician<sup>3</sup>; Francisca Joly, MD, PhD<sup>4</sup>; Geert Wanten, Physician<sup>5</sup>; Georg Lamprecht, Physician, Professor<sup>6</sup>; Marek Kunecki, MD<sup>7</sup>; Farooq Rahman, Physician<sup>8</sup>; Thor Nielsen, Statistician<sup>9</sup>; Lykke Graff, MD<sup>9</sup>; Mark Hansen, Physician<sup>9</sup>; Ulrich Pape, Physician<sup>10</sup>; David Mercer, Physician<sup>11</sup></p><p><sup>1</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark; <sup>2</sup>UZ Leuven, Leuven, Belgium; <sup>3</sup>MedStar Georgetown, Washington, DC; <sup>4</sup>Centre for Intestinal Failure, Department of Gastroenterology and Nutritional Support, Hôpital Beaujon, Clichy, France; <sup>5</sup>Radboud University Nijmegen Medical Centre, Nijmegen, Netherlands; <sup>6</sup>University Medical Center Rostock, Rostock, Germany; <sup>7</sup>M. Pirogow Hospital, Wolczanska, Poland; <sup>8</sup>University College London Hospitals, London, United Kingdom; <sup>9</sup>Zealand Pharma A/S, Copenhagen, Denmark; <sup>10</sup>ASKLEPIOS Klinik St. Georg, Hamburg, Germany; <sup>11</sup>Nebraska Medical Center, NE</p><p><b>Financial Support</b>: Zealand Pharma A/S Supported Research.</p><p><b>Background</b>: Reduction of parenteral support (PS) is important for improved outcomes in short bowel syndrome (SBS) patients with intestinal failure (IF). To date, a clinically meaningful within-patient change in PS volume has been regarded as a ≥20% reduction. This threshold, however, is based on clinical experience, and to our knowledge no data-driven analysis has quantified what constitutes a meaningful change in PS volume from the patient perspective. Glepaglutide, a long-acting GLP-2 analog, reduces PS volume needs and improves patient global impression of change (PGIC), a patient-reported outcome (PRO) tool, in SBS-IF patients. Here we report a quantitative analysis of meaningful change in PS volume using PGIC following glepaglutide treatment in the Efficacy and Safety Evaluation (EASE) SBS 1 trial.</p><p><b>Methods</b>: EASE SBS 1 is a multi-center, placebo-controlled, randomized, parallel-group, double-blind phase 3 trial (NCT03690206). Adult patients with chronic SBS-IF requiring PS at least 3 days per week were recruited. Patients were randomized to 24 weeks of treatment with subcutaneous (SC) injections of either 10 mg glepaglutide twice-weekly (TW), 10 mg glepaglutide once-weekly (OW), or placebo. PS volume requirements were evaluated and adjusted using regular fluid balance periods. The primary endpoint was a reduction in weekly PS volume from baseline to week 24. Patients rated their change in overall status from the start of the trial to weeks 12 and 24 on the PGIC, a 7-point Likert scale ranging from very much worse to very much improved. Anchor-based analyses using scatter plots and empirical cumulative distribution functions (eCDFs) were applied to assess the association between PGIC categorical data and % change in PS volume from baseline to weeks 12 and 24. Anchor-based methods use known anchoring measures as external criteria to gain knowledge about what is clinically meaningful to patients.</p><p><b>Results</b>: 99 of the 106 randomized patients completed the trial. Glepaglutide TW treatment significantly reduced mean PS requirements by 47% (5.13 L/wk) from baseline. 
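As a rough sketch of the anchor-based analysis described in the Methods (toy data; NumPy and SciPy assumed), PGIC ratings are correlated with percent PS volume change, and an eCDF of percent change can be inspected within an anchor category:

```python
import numpy as np
from scipy import stats

pgic = np.array([5, 6, 4, 7, 5, 3, 6])            # 7-point Likert ratings
ps_change_pct = np.array([-25, -40, -10, -55, -22, 5, -35])

rho, _ = stats.spearmanr(pgic, ps_change_pct)
tau, _ = stats.kendalltau(pgic, ps_change_pct)     # tau-b by default
print(f"Spearman rho = {rho:.3f}, Kendall tau-b = {tau:.3f}")

# eCDF of % PS change among patients reporting improvement (PGIC >= 5)
improved = np.sort(ps_change_pct[pgic >= 5])
for x, y in zip(improved, np.arange(1, improved.size + 1) / improved.size):
    print(f"PS change {x:>4}%  eCDF {y:.2f}")
```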
Improvement in PGIC was shown with significant differences relative to placebo for both glepaglutide TW (<i>p</i> = 0.002) and OW (<i>p</i> < 0.0001). Using the blinded data sample, the association between PGIC and the % change in PS volume from baseline to week 24 showed that the two endpoints were correlated, with Spearman rank-order and Kendall's tau-b correlation coefficients of 0.353 and 0.285, respectively. After 12 weeks of treatment, the association appeared stronger. Inspection of the eCDFs supported the appropriateness of a 20% PS volume reduction threshold.</p><p><b>Conclusion</b>: Anchor analysis, using PGIC as the anchor measurement, showed that a 20% reduction in PS volume, an outcome measure used in clinical trials, is clinically meaningful to SBS patients.</p><p><b>Abstract of Distinction</b></p><p>Ji Seok Park, MD, MPH<sup>1</sup>; Naseer Sangwan, PhD<sup>1</sup>; Lauren Menke<sup>2</sup>; Gail Cresci, PhD, RD, LD, FASPEN<sup>1</sup></p><p><sup>1</sup>Cleveland Clinic, Cleveland, OH; <sup>2</sup>Case Western Reserve University, Cleveland, OH</p><p><b>Financial Support</b>: 4R00AA023266 (GC) and Standard Process.</p><p><b>Background</b>: A synbiotic is a physical combination of a prebiotic and a probiotic with a general goal of maintaining probiotic viability through co-packaging with its food source. Despite its wide availability, evidence to support its use in a healthy population is limited. This study aimed to test the feasibility and safety of a targeted synbiotic and its effects on gastrointestinal symptoms and the gut microbiota.</p><p><b>Methods</b>: This was a double-blinded, randomized, placebo-controlled, paired crossover pilot study in healthy adults to test the effects of a targeted synbiotic on gut microbiota diversity and abundance. The targeted synbiotic consisted of 2 probiotic strains, <i>Lactobacillus reuteri</i> 3613 (1 × 10<sup>9</sup> CFU) and <i>Lactobacillus plantarum</i> 276 (1 × 10<sup>11</sup> CFU), and a resistant starch (RS) prebiotic, NuBana<sup>TM</sup> RS65G Green Banana Flour (3.84 g/d). Thirty-four healthy participants meeting the pre-defined criteria were enrolled, per a sample size calculation of 24 completers needed to achieve 91% power at a 5% significance level. Participants were randomized to consume the synbiotic versus maltodextrin placebo for 28 days, followed by a 21-day washout period, and then they crossed over to consume the other supplement for 28 days. Gastrointestinal symptoms were assessed, and fecal samples were collected before and after each supplement period. Fecal samples were analyzed by 16S rRNA sequencing, and the Divisive Amplicon Denoising Algorithm 2 (DADA2) and Ribosomal Database Project (RDP) classifier were used for taxonomic profiling. Alpha-diversity was assessed using the Shannon diversity index, and beta-diversity was assessed using Bray-Curtis dissimilarity. Differential abundance analysis was used to identify taxa that differed significantly between the synbiotic and placebo groups. The study was approved by the Cleveland Clinic Institutional Review Board.</p><p><b>Results</b>: Thirty-four participants (13 male, 21 female) were randomized into the study, and 28 participants completed the study, with an average age of 32 ± 7 years. The Shannon diversity index of fecal samples was higher when participants were taking the synbiotic compared to placebo (<i>P</i> = 0.021), suggesting higher microbial richness and evenness during synbiotic consumption. 
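For reference, the Shannon index is H′ = −Σ pᵢ ln pᵢ over taxon proportions; a minimal sketch with hypothetical counts (NumPy assumed):

```python
import numpy as np

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over taxon proportions."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()   # drop zero-count taxa
    return float(-(p * np.log(p)).sum())

print(shannon_index([120, 80, 40, 10, 5]))  # higher = richer and more even
```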
Bray-Curtis dissimilarity was calculated between the synbiotic group and the placebo group and visualized using principal coordinates analysis (PCoA), which showed 2 separate but overlapping clusters. Differential abundance analysis identified 11 taxa, including the butyrate-producing genera <i>Akkermansia</i> and <i>Butyricimonas</i>, that differed significantly between the synbiotic and placebo supplements. All subjects tolerated the supplements well, reporting no changes in gastrointestinal (GI) symptoms.</p><p><b>Conclusion</b>: This pilot study shows that a targeted synbiotic supplement favorably modified gut microbiome diversity and taxa abundance in healthy subjects. Further studies are warranted to test the effects of this targeted synbiotic in clinical scenarios with known gut dysbiosis to determine if modifications can be sustained and associated with disease.</p><p>Kaitlyn Daff, MA, RD, LDN<sup>1</sup>; Gail Cresci, PhD, RD, LD, FASPEN<sup>2</sup></p><p><sup>1</sup>Case Western Reserve University/Cleveland Clinic Lerner Research Institute, Cleveland, OH; <sup>2</sup>Cleveland Clinic, Cleveland, OH</p><p><b>Financial Support</b>: NIH-National Institute of Alcohol Abuse and Alcoholism.</p><p><b>Background</b>: Alcohol use disorder is the leading cause of liver disease in the United States<sup>1</sup>, with an estimated 80% of patients with alcohol-associated end-stage liver disease (AA-ESLD) also presenting with clinical malnutrition and sarcopenia<sup>2</sup>. Gut dysbiosis in alcohol-associated liver disease (ALD) has been well-characterized in the literature, with shifts from a Bacteroidetes- and Firmicutes-dominated population toward an increased abundance of Proteobacteria<sup>3</sup>. Although it is known that the gut microbiome plays a role in the metabolism and production of amino acids, how alcohol-associated gut dysbiosis influences host amino acid homeostasis is less understood. We aimed to test whether the amino acid metabolite profile in patients with AA-ESLD differs from that of patients without liver disease and whether this correlates with changes in the gut microbiota.</p><p><b>Methods</b>: A secondary data analysis was performed from a larger, single-center, non-randomized prospective pilot study in patients awaiting liver transplantation to characterize metabolomic changes in amino acid homeostasis. Urine samples were collected within 24 hours prior to liver transplant, adjusted for urine osmolality, and untargeted metabolomic analysis by UPLC-MS/MS was performed. Fecal samples collected within 24 hours of liver transplant were analyzed by 16S rRNA sequencing for taxonomic profiling. Welch's t-tests were used to determine statistically significant differences in metabolite mean scaled intensities between AA-ESLD and healthy control patients. Spearman's correlations were used to identify associations between amino acid metabolites and gut microbial taxa.</p><p><b>Results</b>: Analysis of the urinary metabolome between AA-ESLD patients (n = 11) and healthy control patients (n = 18) revealed distinct amino acid profiles between groups. Welch's t-tests identified that arginine (<i>p</i> = 0.0016), glutamate (<i>p</i> = 0.0289), tyrosine (<i>p</i> = 0.0003), phenylalanine (<i>p</i> = 0.0002), asparagine (<i>p</i> = 0.0005), tryptophan (<i>p</i> = 0.0001), cystine (<i>p</i> = 0.0017), and taurine (<i>p</i> = 0.0480) were all significantly increased in AA-ESLD patients. 
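The group comparison used here, Welch's unequal-variance t-test, can be sketched as follows with hypothetical scaled intensities (SciPy assumed):

```python
# Welch's t-test compares means without assuming equal group variances.
from scipy import stats

aa_esld = [1.8, 2.1, 1.6, 2.4, 1.9]       # scaled intensities, AA-ESLD group
controls = [1.0, 1.2, 0.9, 1.1, 1.3, 0.8]  # scaled intensities, controls

t, p = stats.ttest_ind(aa_esld, controls, equal_var=False)  # Welch's variant
print(f"t = {t:.2f}, p = {p:.4f}")
```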
When Spearman's correlations were generated, significant positive correlations were identified between Gammaproteobacteria genera and both phenylalanine (<i>p</i> = 0.0167) and tryptophan (<i>p</i> = 0.0349). These data suggest that the microbiome may contribute to the increased concentrations of these amino acids in the urine. Gammaproteobacteria were also positively correlated with glutamine (<i>p</i> = 0.0151) and histidine (<i>p</i> = 0.0476), while negative correlations were found with glycine (<i>p</i> = 0.0071) and creatinine (<i>p</i> = 0.0341).</p><p><b>Conclusion</b>: Urinary amino acid metabolites differ between AA-ESLD patients and those without liver disease. As patients must abstain from alcohol for ~6 months to be eligible for a liver transplant, these data suggest residual effects of AA-ESLD on amino acid homeostasis. Correlations between the microbiome and amino acid metabolites suggest that the unique microbial shifts associated with ALD may play a role in these observed changes to amino acid metabolism.</p><p>Stephanie Merlino Barr, MS, RDN, LD<sup>1,2</sup>; Rosa Hand, PhD, RDN, LD, FAND<sup>2</sup>; Marc Collin, MD<sup>1,2</sup>; Thomas E. Love, PhD<sup>1,2</sup>; Sharon Groh-Wargo, PhD, RDN<sup>1,2</sup></p><p><sup>1</sup>MetroHealth Medical Center, Cleveland, OH; <sup>2</sup>Case Western Reserve University, Cleveland, OH</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Diagnostic criteria for neonatal malnutrition were proposed in 2018 by field experts. This tool has not been validated since its publication. The objective of this study was to assess the agreement and reliability of both the overall malnutrition tool and individual indicators to evaluate how consistently the proposed criteria identify malnutrition in preterm infants.</p><p><b>Methods</b>: A single-center, retrospective cohort study was performed at a level III Neonatal Intensive Care Unit (NICU). The cohort included all preterm infants born between June 2013 and August 2022 who were admitted to the NICU for at least 3 days and did not die before discharge. Malnutrition diagnoses (none/mild/moderate/severe) were assigned to each patient for each indicator, as defined in Table 1; multiple definitions for individual indicators were used to reflect different potential approaches to assessment (eg, growth velocity) or to reflect differences in patient populations (eg, protein and energy intake). The kappa (k) value was used to assess the neonatal malnutrition diagnostic tool's overall inter-indicator reliability; this was calculated separately for indicators used to assess malnutrition in the first two weeks of life and after the first two weeks of life. Each indicator's diagnosis was compared individually to all other indicators' diagnoses to assess inter-indicator reliability; the proportion of overall agreement, McNemar's test statistic, and kappa value were calculated. Acceptable agreement was defined as k > 0.8.</p><p><b>Results</b>: A total of 2946 infants were included in this study. The k values for the malnutrition tool overall indicated poor inter-indicator reliability; for malnutrition diagnoses in the first two weeks of life, k = 0.054; for diagnoses after the first two weeks of life, k = 0.048. Figure 1 depicts the weighted k values for all comparisons of individual indices. Figure 2 depicts the proportions of overall agreement. 
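A minimal sketch of these agreement statistics, assuming scikit-learn and hypothetical per-infant diagnoses from two indicators:

```python
from sklearn.metrics import cohen_kappa_score

indicator_a = ["none", "mild", "moderate", "none", "severe", "none"]
indicator_b = ["none", "none", "moderate", "mild", "severe", "none"]

# Proportion of overall agreement: fraction of identical diagnoses
agreement = sum(a == b for a, b in zip(indicator_a, indicator_b)) / len(indicator_a)
# Cohen's kappa corrects agreement for chance; a weighted kappa is available
# via weights="linear" once severities are coded as ordered integers.
kappa = cohen_kappa_score(indicator_a, indicator_b)
print(f"overall agreement = {agreement:.2f}, kappa = {kappa:.2f}")
```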
For example, the comparison of weight gain velocity (approach 1) with the energy intake malnutrition diagnosis criterion included n = 954 subjects and yielded k = 0.09 with a proportion of overall agreement of 0.28, indicating that both inter-indicator reliability and accuracy were poor. Commonly cited generalized weight gain velocity goals (approaches 2 & 3) had good accuracy and inter-indicator reliability with the recommended method (approach 1), which determines goal weight gain velocity by maintaining weight-for-age z-score (1 vs. 2, k = 0.92; 1 vs. 3, k = 0.88). The generalized linear growth goal (approach 2) had poor accuracy and inter-indicator reliability with the recommended method (approach 1) (k = 0.12). All comparisons of unique indices for malnutrition diagnosis had detectable disagreement in diagnosis patterns as assessed by McNemar's test statistic.</p><p>Amber Hager, BSc, RD; Yiqi Wang, BSc; Sandy Hodgetts, PhD, OT; Lesley Pritchard, PhD, PT; Vera Mazurak, PhD; Susan Gilmour, MD, MSc, FRCPC; Diana R. Mager, MSc, PhD, RD</p><p>University of Alberta, Edmonton, AB, Canada</p><p><b>Financial Support</b>: 2022 ASPEN Rhoads Research Foundation Grant.</p><p><b>Background</b>: Measurement of body composition in young infants and children with chronic liver disease (CLD) can be challenging due to fluid overload, lack of healthy reference data, and lack of non-invasive, validated methods for use at the bedside. The use of ultrasonography to serially measure changes in muscle thickness overcomes many of these limitations, but little comparable data are available in young infants and children (<5 y). The study purpose was to serially measure changes in total biceps, calf, and thigh muscle layer thickness (MLT), subcutaneous adipose tissue thickness (SAT-T), and motor (gross/fine) development in infants and children (<5 y) with CLD. We hypothesized that the trajectory of MLT (thigh, biceps, calf) and SAT-T would be significantly impacted by CLD and informative of gross motor development in infants and children (<5 y).</p><p><b>Methods</b>: Infants and children (4 mo-5 y) with CLD (n = 11) and their age-matched CON (n = 16) were recruited from the Pediatric Liver Clinics/Liver Transplant Clinics at the Stollery Children's Hospital and the community. Participants underwent 2 serial measurements, at baseline and after 6 months, of (1) MLT, echo intensity, and SAT-T of the biceps brachii (BB), rectus femoris (RF), vastus intermedius (VI), soleus, and gastrocnemius (GN) using ultrasound (U/S) and (2) gross motor assessment (Peabody Developmental Motor Scales, 2nd edition [PDMS-2]) in CLD only. Additional variables collected included demographics (age, sex, CLD diagnosis, PELD), SGNA scores, anthropometrics (wt-z, ht-z, head circumference [hc-z], mid-arm circumference [MAC-z]), body composition (fat-free mass [FFM] and fat mass [FM] using BIA), and multiple skinfold thicknesses (SFT; triceps [TSF], biceps, suprailiac, subscapular).</p><p><b>Results</b>: CLD etiology included 73% biliary atresia (n = 8) and 27% other (n = 1 acute liver failure; n = 2 TPN-related cholestasis). No significant differences in age (years), sex, wt-z, ht-z, hc-z, MAC-z, TSF-z, or subscapular-z were noted between groups at baseline (<i>p</i> > 0.05). Thirty percent of CLD children had SGNA scores indicative of mild-moderate malnutrition (SGNA ≥ 2). Total thigh, VI, and soleus MLT were significantly lower in CLD vs CON, and thigh SAT was higher in CLD after 6 months (<i>p</i> < 0.05). 
This was particularly evident in CLD children ≤2 years, who had significantly lower total thigh, VI, RF, and soleus MLT than CON at baseline and after six months (<i>p</i> < 0.05). Total thigh, VI, and RF MLT (absolute and % change over 6 months) were positively related to BIA-derived FFM (r<sup>2</sup> = 0.46-0.47; <i>p</i> < 0.001) and to total motor quotient and gross motor quotient scores (absolute and percentile; r<sup>2</sup> = 0.47, <i>p</i> < 0.001), but not to fine motor quotients (absolute and percentile) of the PDMS-2, particularly in CLD children (<2 y). Biceps and calf (MLT, SAT) were not associated with total motor, gross motor, or fine motor quotients (absolute, percentile) in CLD children.</p><p><b>Conclusion</b>: Children with CLD had significantly lower measures of muscle thickness and higher measures of SAT than CON. Serial measurement of thigh MLT may be informative of the trajectory of fat-free mass and gross motor skill development in young children with CLD.</p><p><b>Abstract of Distinction</b></p><p>Anita Nucci, PhD, RD<sup>1</sup>; Hillary Bashaw, MD<sup>2</sup>; Alexander Kirpich, PhD<sup>1</sup>; Jeffrey Rudolph, MD<sup>3</sup></p><p><sup>1</sup>Georgia State University, Atlanta, GA; <sup>2</sup>Children's Healthcare of Atlanta, Atlanta, GA; <sup>3</sup>UPMC Children's Hospital of Pittsburgh, Pittsburgh, PA</p><p><b>Financial Support</b>: Takeda Pharmaceuticals.</p><p><b>Background</b>: Although survival for children with intestinal failure (IF) has improved with parenteral nutrition (PN), many still fail to maintain adequate somatic growth after achieving enteral autonomy. Few studies have examined growth after weaning from PN, and outcomes have been inconsistent. A glucagon-like peptide-2 (GLP-2) analog has been shown to reduce the volume of and time on PN in some children with short bowel syndrome with 6 months of use. The effect of this analog on growth is unknown. We aim to describe growth patterns in children with IF after PN weaning and during treatment with a GLP-2 analog.</p><p><b>Methods</b>: This retrospective observational study was conducted at two centers for pediatric intestinal rehabilitation (IR) in the US. Eligibility criteria included a diagnosis of IF (PN use ≥60 days within a 74-consecutive-day interval) at <12 months of age. Patients were referred for IR between September 1989 and January 2023. Z-score values for weight and length/height (adjusted for gestational age up to 2 years of age) are described in those who weaned from PN and in those who received a GLP-2 analog (Gattex®) for ≥6 months (2017-2023).</p><p><b>Results</b>: There were 362 children (57% male, 72% white) with a median age at diagnosis of 6 days (interquartile range [IQR], 1-22) eligible for the study. Common diagnoses included necrotizing enterocolitis (28%), gastroschisis (23%), and small bowel atresia (16%). The median gestational age was 34 weeks (IQR, 31-37), the percent small bowel remaining at diagnosis was 23% (IQR, 10-50), and 36% had a functional ileocecal valve. One hundred forty-five children (40%) were successfully weaned from PN (median time to wean = 1.5 y [IQR, 1-2.9]). 123/145 (85%) achieved enteral autonomy (maintenance of normal growth for >3 consecutive months). Median weight and length/height z-scores at the time of PN weaning were −1.04 (IQR, −2.09 to −0.12) and −1.86 (IQR, −3.01 to −0.69), respectively. After weaning from PN, weight and linear growth velocity were maintained in 44% and 39% of children, respectively, in year 1, and in 59% and 55% in year 2. 
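Growth z-scores such as those reported here are conventionally derived from reference LMS parameters, where z = ((X/M)^L − 1)/(L·S) for L ≠ 0; the sketch below uses hypothetical L/M/S values rather than actual Fenton, WHO, or CDC reference data:

```python
import math

def lms_z(x: float, L: float, M: float, S: float) -> float:
    """Z-score from LMS reference parameters (Cole's LMS method)."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1.0) / (L * S)

# e.g., a 10.2 kg weight against hypothetical reference parameters
print(round(lms_z(10.2, L=-0.35, M=11.5, S=0.11), 2))  # ~ -1.11
```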
Acceleration in weight and linear growth velocity was observed in 28% and 34% of children, respectively, in year 1, and in 22% and 31% in year 2. Fourteen children received a GLP-2 analog for a median of 912 days (IQR, 365-1304). Of these, 3 were weaned from parenteral support within 9 months. Changes in weight and linear growth velocity z-scores between GLP-2 start and 2 years post-initiation are shown in Table 1.</p><p>Annemarie Rompca, MD<sup>1</sup>; Morgan McLuckey, MD<sup>2</sup>; Anthony J. Perkins<sup>3</sup>; Xiaoyi Zhang, MD, PhD<sup>1</sup>; Charles Vanderpool, MD<sup>1</sup></p><p><sup>1</sup>Riley Hospital for Children, Indianapolis, IN; <sup>2</sup>Department of Radiology, Indianapolis, IN; <sup>3</sup>Indiana University School of Medicine, Indianapolis, IN</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Inflammatory bowel disease (IBD) can impact patients' nutritional status. Poor oral intake, poor absorption of nutrients, protein loss in stool, and increased energy requirement can contribute to poor nutrition in this patient population. Poor nutritional status can manifest as poor growth, poor weight gain, and sarcopenia, defined as decreased muscle mass and strength. Studies have demonstrated that decreased muscle mass in pediatric IBD patients is associated with a need for escalated therapy, an increased need for surgery, and an increased risk of postoperative complications. We sought to measure muscle mass at IBD diagnosis in our cohort on cross-sectional imaging, compare it to known age- and sex-specific pediatric psoas muscle reference values, and analyze differences in muscle mass between IBD subtypes and correlations with anthropometrics at diagnosis.</p><p><b>Methods</b>: This study is a single-center retrospective study at a tertiary care facility. Patients with new diagnoses of IBD [Crohn's disease (CD), ulcerative colitis (UC), and indeterminate colitis (IC)] ages 6 to 16 at diagnosis from May 15, 2018, through December 31, 2019, were included. Patients with chronic medical conditions or without accessible cross-sectional imaging within 3 months of diagnosis were excluded. Demographic and anthropometric data at diagnosis of IBD were obtained. The psoas muscle area in mm<sup>2</sup> was measured on cross-sectional imaging at lumbar level 3-4 (L3-4) and lumbar level 4-5 (L4-5) bilaterally. Right and left measurements were added together to obtain the total psoas muscle area (TPMA) at each level. These measurements were compared to pediatric psoas muscle area reference values. We used analysis of variance to determine if outcomes differed by IBD type. Spearman correlations were used to assess the relationship between anthropometric measures and outcomes of interest. All analyses were performed using SAS v9.4.</p><p><b>Results</b>: Cross-sectional imaging from 70 patients with newly diagnosed IBD was reviewed. The average age was 11.9 years, and there was a male predominance (n = 42, 60%). Most patients were diagnosed with CD (n = 50, 71.4%), followed by UC (n = 17, 24.3%) and then IC (n = 3, 4.3%). The mean TPMA z-score for all patients was −1.7 at L3-4 and −1.4 at L4-5 (Table 1). Measures of sarcopenia at both lumbar levels for TPMA, and the z-score at L3-4, were significantly different across IBD types (CD vs UC vs IC) (Table 2).</p><p><b>Best of ASPEN - Pediatric, Neonatal, Pregnancy, and Lactation</b></p><p><b>Abstract of Distinction</b></p><p>Adam Russman, MD<sup>1</sup>; Anne McCallister, CPNP<sup>2</sup>; Anthony J. 
Perkins<sup>3</sup>; Charles Vanderpool, MD<sup>4</sup></p><p><sup>1</sup>Children's Medical Center of Dallas, Dallas, TX; <sup>2</sup>Riley Hospital for Children at Indiana University Health, Indianapolis, IN; <sup>3</sup>Indiana University School of Medicine, Indianapolis, IN; <sup>4</sup>Riley Hospital for Children, Indianapolis, IN</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: The Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition (AND/ASPEN) published malnutrition guidelines in 2014. Literature describing clinical outcomes in hospitalized children with a malnutrition diagnosis is limited, and few studies focus on the impact of malnutrition severity subtype on clinical outcomes.</p><p><b>Methods</b>: We analyzed patients admitted to our pediatric hospital from 2019 to 2022, excluding maternal/obstetrics admissions. Patients were diagnosed with malnutrition and assigned a severity subtype by a registered dietitian according to AND/ASPEN guidelines. Unspecified malnutrition was assigned if there was insufficient physician documentation to determine the malnutrition severity subtype. Data on readmission rate, mortality, length of stay (LOS), LOS index, hospital cost, operative procedure (OR, any procedure), and pediatric intensive care unit (ICU) admission were collected. Clinical outcomes were also analyzed based on the malnutrition severity subtype and compared to patients who were not diagnosed with malnutrition. We used the natural log (LOS + 1) and natural log (costs + 1) for LOS and cost analyses since both variables were highly skewed. Mixed effects regression analysis was completed to account for the clustering of repeated admissions. All analyses were performed using SAS v9.4.</p><p><b>Results</b>: Any malnutrition diagnosis was associated with higher 7-, 14-, and 30-day readmission rates compared to patients without a malnutrition diagnosis. Malnourished patients had a higher mortality rate, median LOS, LOS index, cost, ICU admission rate, and operative procedure rate compared to patients without a malnutrition diagnosis (Table 1). Table 2 presents an analysis based on malnutrition severity subtype. Patients with mild, moderate, and severe malnutrition all had significantly higher readmission rates at the 7-, 14-, and 30-day time points compared to patients with no malnutrition; patients with unspecified malnutrition had a higher readmission rate only at 30 days. At all three readmission time points, there were no significant differences in readmission rates between malnutrition severity categories. Severe malnutrition was the only subtype with a significantly increased rate of mortality compared to no malnutrition (<i>p</i> = 0.005). Admissions with mild, moderate, unspecified, and severe malnutrition had significantly higher LOS index, LOS, and total costs than admissions without a malnutrition diagnosis. Mild malnutrition admissions had a significantly higher LOS index than moderate (<i>p</i> = 0.050) and severe (<i>p</i> = 0.014) malnutrition admissions, while unspecified severity admissions had a significantly higher LOS index than severe admissions (<i>p</i> = 0.026). Mild (<i>p</i> = 0.032), moderate (<i>p</i> = 0.015), and severe (<i>p</i> = 0.001) malnutrition admissions had significantly higher LOS than unspecified severity admissions.
Mild (<i>p</i> = 0.011) malnutrition admissions had significantly higher costs than admissions with unspecified malnutrition.</p>
After a second analysis in which 9 cases of catheter infection were not counted due to undetermined alternate infection sources, overall CRBSI and CLABSI rates were reduced to 0.78 and 1.16 per 1000 HPN catheter days, respectively. The second analysis showed CRBSI rates of 1.23 for PICC and 0.51 for TCVC, and CLABSI rates of 2.0 for PICC and 0.64 for TCVC, with no significant differences in CRBSI and a significantly higher rate of CLABSI per 1000 HPN catheter days for PICC lines (p = 0.04). Table 2 shows a statistical analysis of CRBSI and CLABSI rates. In the initial analysis, CRBSI was 1.24 for ML and 0.68 for SL CVCs, and CLABSI was 2.1 for ML and 1.36 for SL CVCs, per 1000 HPN days; however, the differences were not statistically significant. Other problems that necessitated CVC removal were occlusion, malposition, accidental removal, leak, and thrombosis. The removal rate for other complications was 2.0 per 1000 HPN catheter days overall, with 1.78 for TCVC and 2.61 for PICCs, and the differences were not statistically significant.
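All of the rates above follow the standard episodes-per-1000-catheter-days convention. A minimal sketch of that arithmetic (a hypothetical helper, not the authors' code; the episode count below is back-calculated from the reported 0.97 overall rate and 15,474 catheter days):

```python
def rate_per_1000_days(episodes: int, catheter_days: int) -> float:
    """Infection episodes per 1000 catheter days."""
    return 1000 * episodes / catheter_days

# Roughly 15 confirmed episodes over 15,474 HPN catheter days
# reproduces the reported overall CRBSI rate of ~0.97.
print(round(rate_per_1000_days(15, 15474), 2))  # 0.97
```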
Conclusion: We found no significant differences in CRBSI between PICC and TCVC, significantly more CLABSI with PICC as compared to TCVC, and no infections with ports. Although rates of other catheter problems were higher for PICCs, and infection rates were higher for ML than for SL catheters, neither difference reached statistical significance. We illustrate the variation in results between the CRBSI and CLABSI criteria and show that undetermined alternate infection sources complicate reporting. Our results indicate the need for further study, for greater openness to the use of ports, and for choosing SL TCVCs when feasible for long-term HPN.
1Surgical Center, The University of Tokyo Hospital, Bunkyo-ku, Tokyo, Japan; 2Department of Nutrition, St. Luke's International Hospital, Chuo-ku, Tokyo, Japan; 3Operating Room Management and Surgical Metabolism, Graduate School of Medicine, The University of Tokyo, Bunkyo-ku, Tokyo, Japan; 4Gastrointestinal Surgery, The University of Tokyo Hospital, Bunkyo-ku, Tokyo, Japan; 5Nutrition and Dietetics, Kanagawa University of Human Services, Yokosuka City, Kanagawa, Japan
Financial Support: None Reported.
Background: Our previous study clarified that the addition of beta-hydroxy-beta-methylbutyrate (HMB) to TPN partially restores the gut-associated lymphoid tissue (GALT) atrophy caused by lack of enteral nutrition. Because HMB is a metabolite of the amino acid leucine, the recovery effect might derive simply from the increased amount of amino acids in the TPN solution. Alternatively, an increased amino acid content might not restore GALT atrophy by itself, while the amino acid increase together with HMB addition might further prevent the atrophy. Herein, we performed 2 studies to answer these questions using a murine TPN feeding model.
Methods: Experiment 1: Six-week-old male Institute of Cancer Research (ICR) mice were divided into A+ (n = 10) and A++ (n = 10) groups. A catheter was inserted into the right jugular vein of each mouse, and the mice were continuously administered 0.2 mL/h of normal saline solution for 2 days while allowed to take chow and water ad libitum. Then, mice received an isocaloric PN solution with NPC/N 284 (A+) or 135 (A++) without oral food intake for 5 days. After the dietary manipulation, all mice were killed by cardiac puncture under general anesthesia, and the whole small intestine was harvested for GALT cell isolation. GALT cell number and phenotype (B cell, CD4+, CD8+, αβTCR+, and γδTCR+) were evaluated in each tissue (Peyer's patches, PP; intraepithelial spaces, IE; and lamina propria, LP). Nasal washings, bronchoalveolar lavage fluid (BALF), and intestinal washings were collected for IgA level measurement by ELISA. Experiment 2: Mice were randomized to A+H+ (n = 10) and A++H+ (n = 9) groups. The A+H+ mice received PN solution with NPC/N 284 and 2,000 mg/kg BW of Ca-HMB, while the A++H+ animals were given PN solution with NPC/N 135 and 2,000 mg/kg BW of Ca-HMB. After 5 days of PN feeding, the same parameters as in Experiment 1 were evaluated. The Wilcoxon test was used for all parameter analyses, and the significance level was set at less than 5%.
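NPC/N here denotes the nonprotein-calorie-to-nitrogen ratio of the PN solution. A sketch of the conventional calculation (nitrogen grams approximated as amino acid grams / 6.25; the kcal and amino acid values below are illustrative, not the study formulations):

```python
def npc_to_n(nonprotein_kcal: float, amino_acid_g: float) -> float:
    """Nonprotein-calorie-to-nitrogen ratio of a PN solution.

    Nitrogen grams are approximated as amino acid grams / 6.25,
    the conventional protein-to-nitrogen conversion factor.
    """
    nitrogen_g = amino_acid_g / 6.25
    return nonprotein_kcal / nitrogen_g

# Illustrative only: raising amino acids at a fixed nonprotein kcal
# load lowers NPC/N, as in the A+ (284) vs A++ (135) solutions.
print(round(npc_to_n(1000, 22.0)))  # ~284
print(round(npc_to_n(1000, 46.3)))  # ~135
```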
Results: There were no significant differences between the A+ and A++ groups in GALT cell numbers (Table 1), phenotypes (Table 2), or mucosal IgA levels. However, the A++H+ group showed higher LP cell numbers (Table 1) and a higher CD4+ cell percentage in the IE space (Table 2) than the A+H+ group, without significant differences in IgA levels at any mucosal site.
Anam Bashir, MBBS; Lauren L. Karel, BCPS; Margaret Begany, RD, CSPCC, LDN, CNSC; Jennifer Panganiban, MD
Children's Hospital of Philadelphia, Philadelphia, PA
Financial Support: None Reported.
Background: Fish oil-based lipid emulsion (FOLE) is FDA-approved at 1 g/kg/day for the treatment of parenteral nutrition-associated cholestasis (PNAC). Because fat provision is limited at 1 g/kg/day of FOLE, caloric provision is skewed toward dextrose, requiring higher-than-desired glucose infusion rates (GIR) to support weight gain and growth, especially in the neonatal population. There is limited published information on the use of FOLE at doses higher than 1 g/kg/day, and concerns about possible essential fatty acid deficiency at 1 g/kg/day have been raised. Thus, we aim to describe patients who received 1.5 g/kg/day of FOLE at our institution.
Methods: A retrospective IRB-approved chart review was conducted on patients who received parenteral nutrition (PN) at Children's Hospital of Philadelphia between January 2020 and August 2023. The inclusion criteria were children on PN, ages 0 to 18 years, receiving FOLE at a dose of more than 1 g/kg/day for at least 14 days. Cholestasis progression, essential fatty acid deficiency (EFAD), clinically severe post-procedure hemorrhage, and hypertriglyceridemia were the clinical outcomes of interest (Table 1). The progression of cholestatic disease was monitored by conjugated bilirubin levels. A triene to tetraene (T:T) ratio of greater than 0.046 was used to define EFAD, based on Associated Regional and University Pathologists, Inc. (ARUP) normative laboratory values. Mead acid, linoleic acid, and alpha-linolenic acid levels were also collected to reflect essential fatty acid stores (normative values in Table 2). Invasive procedures were defined as those requiring entry into the body through an incision, tunneling, or a cutting technique for vascular procedures. For children younger than 1 year, hypertriglyceridemia was defined as greater than 200 mg/dL, and for older children, greater than 400 mg/dL.
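The two laboratory definitions above translate directly into simple decision rules. A minimal sketch using only the cutoffs stated in this paragraph (hypothetical helpers, for illustration):

```python
def efad_suspected(triene_tetraene_ratio: float) -> bool:
    """EFAD screen per the ARUP T:T cutoff (>0.046) used above."""
    return triene_tetraene_ratio > 0.046

def hypertriglyceridemia(tg_mg_dl: float, age_years: float) -> bool:
    """Age-dependent triglyceride cutoff used above:
    >200 mg/dL under age 1, >400 mg/dL otherwise."""
    limit = 200 if age_years < 1 else 400
    return tg_mg_dl > limit

print(efad_suspected(0.03))            # False: within normal limits
print(hypertriglyceridemia(250, 0.5))  # True for an infant
```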
Results: Nine patients [5 males; mean age 2.6 y (range 2 mo–12.9 y)] with PNALD (defined by serum conjugated bilirubin ≥2 mg/dL and exclusion of other causes of liver disease) were started on FOLE at 1.5 g/kg/day. The purpose of initiating the higher-dose FOLE was to decrease the GIR and/or give additional calories due to suboptimal weight gain on 1 g/kg/day of FOLE. None of the patients developed hypertriglyceridemia. Four patients had improvement of cholestasis, with levels decreasing by more than 2 mg/dL, and four patients continued to have no evidence of cholestasis after prior normalization while on 1 g/kg dosing. One patient experienced an increase in conjugated bilirubin of more than 2 mg/dL, after which the FOLE was decreased to 1 g/kg/day with resolution of cholestasis over three months. Seven patients had an essential fatty acid panel collected, and the T:T ratio was within normal limits in all of them, although five patients had less than optimal levels of linoleic acid. Seven patients had an invasive procedure performed, and only one patient had more than expected bleeding, after circumcision. This patient had a low fibrinogen level (70 mg/dL) and required fresh frozen plasma and packed red blood cell transfusion, with no significant bleeding event thereafter (Table 1).
Diana Mulherin, PharmD, BCNSP, BCCCP, FCCM; Sarah Cogle, PharmD, BCNSP, BCCCP; Vanessa Kumpf, PharmD, BCNSP, FASPEN; Edward Woo, PharmD; David Mulherin, PharmD, BCPS; Madeleine Hallum, MSHS, RDN, CSG, LDN; Ankita Sisselman, MD; Dawn Adams, MD, MS, CNSC
Vanderbilt University Medical Center, Nashville, TN
Financial Support: None Reported.
Background: Copper (Cu) deficiency can lead to poor wound healing, myeloneuropathy, anemia, and cardiac arrhythmias. Deficiency results from poor intake or high losses, which may be seen in adult patients requiring parenteral nutrition (PN), including those with severe malnutrition or large burns, those requiring continuous renal replacement therapy (CRRT), and those with a history of bariatric surgery or malabsorption. A previous formulation of multi-trace elements (MTE) contained Cu 1 mg per dose, and in combination with Cu contamination from other PN ingredients, an increased incidence of hypercupremia was observed in patients requiring long-term PN. As of 2020, the only MTE product for use in adults in the U.S. contains 0.3 mg of Cu. For patients with significant cholestasis or hepatic dysfunction, ASPEN recommends withholding or decreasing Cu doses in PN. Due to a lack of standardized practice, a quality improvement project was initiated to describe practices for ordering Cu in PN and Cu status in acutely ill, hospitalized patients with severe hyperbilirubinemia.
Methods: This was a retrospective evaluation of PN ordering practices of a multidisciplinary nutrition support team (NST) at a large, academic medical center between July 1, 2021, and August 31, 2023. PN encounters (a course of PN treatment during a single inpatient admission) in patients ≥ 18 years of age with severe hyperbilirubinemia (total bilirubin ≥ 10 mg/dL or direct bilirubin ≥ 2 mg/dL) within 5 days before or any time during the PN encounter were included. Patient demographics, frequency of Cu provision in PN, Cu and C-reactive protein (CRP) levels, and CRRT status were assessed using descriptive statistics.
Results: A total of 15,739 PN orders were entered for 1068 patients during the study period. Of those, 155 PN encounters occurred in 144 individual patients with severe hyperbilirubinemia. Baseline demographics are provided in Table 1. A summary of Cu sources (either from the MTE product or as a cupric chloride additive) for each PN encounter is provided in Figure 1. Cu status was assessed in 53 (34%) PN encounters, with a mean concentration of 76.9 (±34.3) mcg/dL. CRP was obtained concurrently with only 58% (n = 31) of Cu levels, with a mean concentration of 125.7 (±95.4) mg/L. CRRT was provided in 44 (28.4%) encounters (Table 2).
Figure 1. Copper sources in PN orders.
Brittney Patterson, MS, RD-AP, CNSC1; Ranna Modir, MS, RD, CNSC, CDE, CCTD1; Jack McKeown1; Rachel Aubyrn1; Javier Lorenzo, MD, FCCM2
1Stanford Health Care, Stanford, CA; 2Stanford University School of Medicine, Stanford, CA
Financial Support: None Reported.
Background: The use of safety alerts in electronic medical records (EMR) aims to improve patient safety, with most alerts directed at medication and nursing workflows. Stanford Health Care (SHC) has added tube feeding regimens (TFR) to the medication administration record (MAR) to further improve patient safety. In critically ill (ICU) patients at high risk for gastrointestinal (GI) complications, the ASPEN/SCCM 2016 guidelines recommend using near-isotonic, fiber-free TFR. A retrospective analysis of 2014-2016 data at SHC found an association between severe GI complications and the initiation of high-risk tube feeding regimens (HRTFR), defined as hyperosmolar and/or high-fiber tube feeding formulas and/or fiber supplements, in ICU patients. To ensure the ASPEN/SCCM guidelines were implemented at SHC, many interventions were put in place, including designing order sets with HRTFR listed toward the bottom; creating specific TFR order sets that removed HRTFR; providing education during new resident orientation, team rounds, and monthly in-services; and granting Registered Dietitians (RDs) tube feeding order-writing privileges. Despite these interventions, HRTFR were still being ordered, with most orders occurring outside of normal RD working hours (8 am to 4 pm). To educate and guide providers to select safe TFR for ICU patients, we aimed to create a novel nutrition support-specific order validation pop-up in the EMR.
Methods: A team of RDs, critical care attendings, and Epic analysts collaborated to create a nutrition support-specific order validation pop-up. ICU patients were defined as those requiring vasopressor support with norepinephrine, epinephrine, vasopressin, and/or phenylephrine. HRTFR were defined as hyperosmolar and/or high-fiber tube feeding formulas and/or fiber supplements. The order validation pop-up was built to trigger under three scenarios: (1) a vasopressor was already active and a HRTFR was ordered, (2) a HRTFR was already active and a vasopressor was ordered, or (3) both orders were placed simultaneously. The pop-up displayed the reason for the alert, explained the importance of avoiding HRTFR, provided safer TFR options, and recommended contacting the RD for guidance. To preserve individualization of patient care, the order validation was overridable, as HRTFR may be appropriate for patients on lower vasopressor doses. After the order validation pop-up was implemented, a chart review was completed between March 2023 and May 2023 to assess the incidence of, and actions following, the triggered order validation pop-up.
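The three trigger scenarios amount to firing whenever an active or newly placed vasopressor order coexists with an active or newly placed HRTFR order. A sketch of that decision logic (names and structure are illustrative, not the actual Epic build):

```python
VASOPRESSORS = {"norepinephrine", "epinephrine", "vasopressin", "phenylephrine"}

def popup_should_fire(active_orders: set[str], new_orders: set[str],
                      hrtfr_orders: set[str]) -> bool:
    """Fire when a vasopressor and a HRTFR coexist among active/new orders.

    Covers all three scenarios above: vasopressor first, HRTFR first,
    or both placed simultaneously.
    """
    orders = active_orders | new_orders
    has_pressor = bool(orders & VASOPRESSORS)
    has_hrtfr = bool(orders & hrtfr_orders)
    return has_pressor and has_hrtfr

# Scenario 2: HRTFR already active, vasopressor being ordered.
print(popup_should_fire({"fiber-supplement"}, {"vasopressin"},
                        {"fiber-supplement"}))  # True
```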
Results: Between March 2023 and May 2023, the order validation pop-up triggered 220 times in a total of 59 patients. Of the 220 triggers, 42 (19%) resulted, per the instructions in the pop-up, in a changed or discontinued order or in the HRTFR not being ordered. Of those 42 triggers that led to a properly adjusted HRTFR, 26 (61%) occurred outside of normal RD hours. The remaining triggers, where no changes were made, involved low-dose vasopressors, vasopressors listed on the MAR but not actively in use, or a HRTFR ordered on the MAR but held per nursing communication orders.
Conclusion: The creation of a novel nutrition support-specific order validation pop-up provided education and guidance to ordering providers. With this additional layer of safety, 42 HRTFR orders between March 2023 and May 2023 were averted or changed to safer TFR, with most of the impact occurring outside of RD working hours.
Best of ASPEN - Enteral Nutrition Therapy
1627 - Victory for Volume-Based Enteral Nutrition
Julie M. Geyer, RD-AP, CNSC
University of Colorado Hospital, Aurora, CO
Financial Support: None Reported.
Background: Enteral nutrition (EN) in the hospital setting is traditionally administered by a fixed rate-based feeding method (RBEN). Studies using RBEN found that, due to interruptions or withholding, actual formula delivery averages 60% to 70% of the prescribed volume. Nutrition provision below energy needs contributes to malnutrition and negative consequences, including increased health care costs and increased morbidity and mortality. The American Society for Parenteral and Enteral Nutrition (ASPEN) and the Society of Critical Care Medicine (SCCM) recommend use of a volume-based enteral nutrition feeding method (VBEN) to improve nutrient delivery, decrease energy deficits, and prevent overfeeding.
Methods: This quality improvement study took place at a Level I trauma, academic hospital from June 2022 to September 2023. In September 2022, a hospital-wide process improvement committee was assembled for multi-phase implementation of VBEN. Prior to September 2022, unit-based dietitians conducted quality improvement work to address common causes of feeding interruptions. Inclusion criteria for VBEN were demonstrated tolerance of goal RBEN. The maximum hourly rate was set at 150 mL/hr, and the ‘goal’ provision was set at 90% to 110% of the prescribed formula volume. Patients included in the data collection were tolerating EN at the RBEN goal, and formula intake volumes were taken directly from the feeding pump history. Changes to the electronic medical record (EMR) included creation of a VBEN calculator with row instructions built into the tube feeding flowsheet and creation of a nurse reminder task every 4 hours to recalculate formula intake and adjust the rate as needed. Changes to the formula order on the medication administration record included specification of the VBEN vs RBEN feeding method and standardized administration instructions (Figure 1). Nurses, dietitians, and providers received training on the VBEN workflow and process through e-mail communication, in-person training, an interactive learning-assisted video, and one-on-one coaching.
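The core arithmetic of a VBEN calculator is catch-up dosing: spread the undelivered volume over the hours left in the feeding day, capped at the stated 150 mL/hr maximum. A sketch under those stated rules (a hypothetical function; the abstract does not publish the actual flowsheet logic):

```python
def vben_rate(goal_volume_ml: float, delivered_ml: float,
              hours_remaining: float, max_rate: float = 150.0) -> float:
    """Recalculated hourly rate for volume-based enteral feeding.

    Spreads the undelivered volume over the hours remaining in the
    feeding day, capped at the 150 mL/hr maximum noted above.
    """
    remaining = max(goal_volume_ml - delivered_ml, 0.0)
    if hours_remaining <= 0:
        return 0.0
    return min(remaining / hours_remaining, max_rate)

# After interruptions: 1440 mL goal, 480 mL given, 12 h left -> 80 mL/hr.
print(vben_rate(1440, 480, 12))  # 80.0
```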
Results: Prior to June 2023, RBEN was the standard feeding method. Routine quality improvement audits from October 2020 to December 2022 in one intensive care unit demonstrated that, despite strategies to improve formula delivery, ‘goal’ formula provision was met on 50% to 74% of EN days (Table 1). In June 2022, a hospital-wide audit of formula provision was conducted that included all levels of care (floor, intermediate, and intensive care); in a total of 346 EN days, ‘goal’ formula provision was achieved on 63% of EN days (Table 2). In November 2022, an audit was conducted in the two ICU units selected for phase 1 implementation; in a total of 154 EN days, ‘goal’ formula provision was achieved on 57% of EN days (Table 2). Phase 1 implementation took place in June 2023, and a post-go-live audit was completed; in a total of 157 EN days, ‘goal’ formula volume was achieved on 83% of EN days (Table 2). No instances of hypo/hyperglycemia or gastrointestinal complications were reported. Phase 1 was deemed a success, and approval was obtained to continue VBEN implementation in a stepwise fashion for the remaining inpatient units.
Marcin Folwarski, MD, PhD1; Stanisław Kłęk2; Karolina Skonieczna-Żydecka3; Agata Zoubek-Wójcik4; Waldemar Szafrański, MD, PhD5; Lidia Bartoszewska6; Krzysztof Figuła7; Marlena Jakubczyk, MD, PhD8; Anna Jurczuk9; Przemysław Matras, MD, PhD10; Zbigniew Kamocki, MD, PhD11; Tomasz Kowalczyk, MD, PhD12; Bogna Kwella, MD, PhD13; Joanna Sonsala-Wołczyk14; Jacek Szopiński, MD, PhD15; Krystyna Urbanowicz, MD, PhD16; Anna Zmarzly, MD, PhD14
1Division of Clinical Nutrition and Dietetics, Medical University of Gdańsk, Gdansk, Pomorskie, Poland; 2Surgical Oncology Clinic at the National Cancer Institute in Krakow at Maria Sklodowska-Curie National Research Institute of Oncology, Cracow, Poland; 3Department of Biochemical Science, Pomeranian Medical University in Szczecin, Szczecin, Zachodniopomorskie, Poland; 4Nutrimed Home Nutrition Center, Warsaw, Poland; 5Home Enteral and Parenteral Nutrition Unit, General Surgery Department, Nicolaus Copernicus Hospital, Gdansk, Pomorskie, Poland; 6First Department of General and Transplant Surgery and Clinical Nutrition, Medical University of Lublin, Home Enteral and Parenteral Nutrition Unit S, Lublin, Poland; 7Nutricare Clinical Nutrition Center, Cracow, Poland; 8Department of Anaesthesiology and Intensive Care, Collegium Medicum in Bydgoszcz, Nicolaus Copernicus University, Toruń, Poland; 9Outpatient Clinic of Nutritional Therapy, Clinical Hospital, Bialystok, Poland; 10First Department of General and Transplant Surgery and Clinical Nutrition, Medical University of Lublin, Home Enteral and Parenteral Nutrition Unit SPSK4, Lublin, Poland; 11Department of General and Gastroenterological Surgery, Medical University of Bialystok, Bialystok, Poland; 12Nutricare Clinical Nutrition Center, Cracow, Poland; 13Department of Clinical Nutrition, Provincial Specialist Hospital, Olsztyn, Poland; 14Clinical Nutrition Unit, Gromkowski City Hospital, Wroclaw, Poland; 15Department of General Hepatobiliary and Transplant Surgery, Collegium Medicum, Nicolaus Copernicus University in Torun, Torun, Poland; 16Department of Clinical Nutrition, Provincial Specialist Hospital, Olsztyn, Poland
Financial Support: None Reported.
Background: Cancer is one of the most common indications for home enteral nutrition (HEN). Malnutrition and weight loss, associated with deterioration in performance status, contribute to poorer outcomes in oncology patients. Systemic inflammation is a characteristic feature of cancer cachexia and may be used as a prognostic factor for short survival. According to the ESPEN guidelines, HEN is indicated for patients with an estimated survival of at least 30 days. Therefore, determining survival is essential for individual care planning, as it informs healthcare professionals about the suitability of HEN and the palliative care strategy.
Methods: In a retrospective multicenter survey, we examined the medical records of cancer patients treated in 2018 across 22 Polish HEN centers. Factors assessed during qualification for HEN included BMI, weight loss, albumin level, total protein level, lymphocyte count, CRP, the Prognostic Nutritional Index (PNI), and Eastern Cooperative Oncology Group (ECOG) performance status. The primary endpoint was survival of less than 30 days from the initiation of HEN.
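PNI is the only composite index in the list above. Assuming the Onodera formulation commonly used in oncology nutrition studies (the abstract does not state which variant was applied), the calculation is:

```python
def onodera_pni(albumin_g_dl: float, lymphocytes_per_mm3: float) -> float:
    """Prognostic Nutritional Index, Onodera formulation (assumed here):
    PNI = 10 x albumin (g/dL) + 0.005 x total lymphocyte count (/mm3)."""
    return 10 * albumin_g_dl + 0.005 * lymphocytes_per_mm3

# Albumin 3.2 g/dL and 1,500 lymphocytes/mm3 -> PNI 39.5, below the
# 45 threshold discussed in the Results.
print(onodera_pni(3.2, 1500))  # 39.5
```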
Results: A total of 278 cancer patients (70.14% male, 29.86% female; 51.44% head and neck, 41.37% gastrointestinal, and 7.19% other localizations) were included in the study. Albumin level below 3.5 g/dL (p = 0.02), C-reactive protein (p = 0.01), PNI > 45 (p = 0.04), a high percentage of weight loss in the last 6 months (p < 0.01), and the ECOG performance score (p = 0.01) were associated with poor survival (less than 30 days). Body weight, BMI, lymphocyte count, and total protein level were not correlated with survival.
Conclusion: Assessment of performance status, inflammation, and weight loss during qualification for HEN can predict short-term survival of cancer patients. This finding highlights the importance of comprehensive assessments before home nutrition initiation. Predicting poor survival can help plan palliative care and determine whether the patient will benefit from HEN.
June R. Greaves, RD, CNSC, CDN, LD, LDN, LRD1; Katharine Morra, RD, CNSC, CSO, LD, LDN2
Background: The objective of this quality improvement project was to determine whether patients were successful in administering tube feeding independently at home following virtual tube feeding instruction by a Registered Dietitian (RD) with a nationwide home care infusion company. The aim is to describe the process, identify avenues for further improvement, and highlight areas for future research.
Methods: A retrospective review was conducted of the 162 patients who received virtual tube feeding instruction from the enteral RD from June 2022 to June 2023. Virtual instruction was completed for the enteral feeding pump, gravity bag, and bolus/syringe methods of administration. A follow-up call was made to active patients to inquire about their experience with the virtual instruction. For patients who could not be reached, the medical record was reviewed to determine whether inbound calls with questions or issues were received after the virtual instruction. Patients were queried on their confidence in administering enteral feedings, any concerns upon completion of the virtual instruction, knowledge of whom to contact afterward, and whether the reference materials provided were helpful. Patients who did not receive virtual instruction or were discharged from service were excluded from the review.
Results: One hundred sixty-two patients were reviewed as potentially eligible for the analysis; 115 were excluded. Of those excluded, 100 (87%) were no longer on service; 12 (10%) declined virtual instruction because of home health agency instruction, inpatient instruction by nursing or a dietitian prior to the start of care, or assistance from the home infusion company sales team; and 3 (3%) were a “no show” for the scheduled appointment. Eighteen of the remaining eligible patients could not be contacted for follow-up; for these patients, there were no documented inbound calls regarding feeding/equipment questions or concerns. Of the eligible patients, 29 provided telephonic feedback on the virtual instruction experience. Virtual instruction covered the following administration types: enteral pump (86%, n = 25), followed by gravity bag and bolus/syringe (14%, n = 4). Upon completion of the instruction, 27 (93%) felt confident with feeding administration, while 2 (7%) did not feel confident, as they identified as “in-person learners”; 24 (83%) did not experience issues/concerns, while 5 (17%) had questions/concerns; 27 (93%) knew whom to contact, and 2 (7%) did not; 22 (76%) found the reference materials provided helpful, 2 (7%) did not, and 5 (17%) did not review the reference materials.
Conclusion: Technological advances have made virtual instruction possible. Virtual enteral instruction can be a successful tool for teaching patients to administer tube feedings when in-person instruction is not possible in the home care setting. However, consideration should be given to the client's preferred learning style. As the literature on virtual instruction outcomes is limited, further research on its use to enhance this process is warranted.
Danelle A. Olson, RDN; Lisa M. Epp, RDN; Osman Mohamed Elfadil, MBBS; Ryan T. Hurt, MD, PhD; Manpreet S. Mundi, MD
Mayo Clinic, Rochester, MN
Financial Support: None Reported.
Background: The prevalence of bariatric surgery has increased significantly in recent years, as it is the most effective long-term treatment for obesity. The two most common surgeries, sleeve gastrectomy (SG) and Roux-en-Y gastric bypass (RYGB), alter gastrointestinal anatomy, producing significant weight loss as well as remission of obesity-related co-morbidities, including type 2 diabetes. Despite these benefits, bariatric surgery can be associated with significant, debilitating complications. Though its true prevalence and mechanism are unclear, hypoglycemia has been shown to be present in up to 38% of post-surgical RYGB patients and can be very difficult to manage. Currently, there remains a paucity of data regarding the role of enteral nutrition (EN) as a potential therapy.
Methods: We conducted a retrospective review of the EMR of patients seen in our outpatient home enteral nutrition (HEN) clinic for initiation of tube feeding for the management of reactive hypoglycemia from March 2017 to July 2023. In addition to baseline clinical characteristics and demographics, we collected data on hypoglycemia incidents, interventions, EN regimens, and outcomes.
Results: Six patients were seen in the HEN clinic with post-bariatric reactive hypoglycemia (mean age 45.5 ± 9.6 years; 66.7% female; mean BMI at HEN initiation 28.6 ± 8.3). Five of the 6 patients had undergone RYGB surgery, and 1 had undergone laparoscopic adjustable gastric banding (LAGB) that was subsequently revised to sleeve gastrectomy (SG). The time to development of reactive hypoglycemia after surgery varied in the cohort; on average, the first incident was documented 2.6 ± 3.2 years after surgery. Of note, patients lost, on average, 51.2 ± 28.5 kg after surgery and before they required EN support. We noted only a slight change in weight after EN initiation, as patients remained, on average, at +2.5 kg at one and three months into HEN. Table 1 shows the patients' profiles. Dietary modification focusing especially on reduced consumption of refined carbohydrates was recommended for all patients; however, poor compliance was prevalent, with 5/6 (83%) of patients not adhering to the prescribed diet. In addition to the EN and dietary regimens prescribed for all patients, some received specific treatment(s) to prevent or manage reactive hypoglycemia; in one case, a combination of α-glucosidase inhibitors, somatostatin, and radical diet changes was used. The majority of patients underwent an initial trial of EN through a naso-jejunal tube, which was then converted to a percutaneous tube after efficacy was established (Table 2). Standard polymeric formulas were utilized for most patients, although one was provided a commercial blenderized tube feed. With the use of EN, 4 of the 6 patients had resolution of reactive hypoglycemia, while two continued to experience symptoms. Two patients stopped EN due to feeding complications and non-compliance, while the remaining four continued on EN.
Anna K. Burneske; Liyun Zhang, MS; Amy Y. Pan, PhD; Theresa Mikhailov, MD, PhD
Medical College of Wisconsin, Milwaukee, WI
Financial Support: None Reported.
Background: Patients who are malnourished have worse outcomes. Many standardized tools have been developed to screen for malnutrition in acutely ill pediatric patients: the Pediatric Yorkhill Malnutrition Score (PYMS), the Pediatric Nutrition Screening Tool (PNST), and the Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP). Alternatively, some institutions have developed their own tools for this purpose, referred to as “home-grown” tools. Regardless of their origin, none of these tools have been validated in critically ill children. Registered dietitians (RDs) perform nutrition assessments on patients based on the results of these nutrition screenings or based on protocols within their institution. Virtual Pediatric Systems, LLC (VPS), an international data registry supporting standardized data sharing for research, improved patient care, and benchmarking among pediatric ICUs, developed a nutrition module that captures data for nutritional metrics. VPS has collected data in the nutrition module since October 2019, accruing about 10,000 patients per calendar year from participating centers. The specific aims were to compare the nutrition screening tools to the dietitians' assessments to determine the screening tools' accuracy and to determine whether standardized screening tools are more accurate than those developed at single centers. We hypothesized that (1) the nutrition screening tools used by participating centers would accurately identify malnourished children, and (2) standardized tools would be more accurate than those developed at single centers.
Methods: In this project, we compared pediatric nutrition screening tools with the assessments performed by RDs to determine whether nutrition screening tools accurately identify malnourished patients. We also determined which nutrition screening tools more accurately identify patients who are malnourished or at risk of becoming malnourished during their hospitalization in the PICU so that the appropriate nutrition therapy can be initiated. We obtained de-identified demographic and clinical data from October 2019 through March 2023 for all patients under 18 years of age from the VPS database from centers participating in the nutrition module. We considered the RD's assessment to be the gold standard for determining malnutrition and compared the nutrition screening tools to the RD's assessment. The degree of agreement in malnutrition between nutrition screening tools and RD's assessment was determined by Cohen's kappa (κ).
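Cohen's κ, the agreement metric used here, corrects raw percent agreement for agreement expected by chance. A minimal sketch with scikit-learn and illustrative labels (not study data):

```python
from sklearn.metrics import cohen_kappa_score

# 1 = malnourished / at risk, 0 = not malnourished (illustrative labels).
screen_flags = [1, 0, 1, 1, 0, 0, 1, 0]
rd_assessment = [1, 0, 0, 1, 0, 1, 1, 0]

# By common convention, kappa of ~0.2-0.4 reads as "fair" agreement
# and >0.8 as "near-perfect", as in the Results below.
print(round(cohen_kappa_score(screen_flags, rd_assessment), 2))
```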
Results: After selecting subjects who had both a complete pediatric nutrition screen and an RD assessment, the final cohort contained a total of 9891 patients. Among them, 54% were male; 4% were neonates (≤29 d), 34% infants (<2 y), 35% children (2-12 y), and 26% adolescents (12-18 y). The subjects were 40% White, 17.5% Black, 22.5% Hispanic, 5.7% Asian, and 14.2% other/mixed. The kappa coefficient for the standardized nutrition screening tools was 0.38, which is considered “fair” agreement between the screening tool and the RD “gold standard” assessment. Other tools, listed as “home-grown” or “other” in VPS, had kappa coefficients ranging from 0.31 to 0.91; a kappa of 0.91 represents near-perfect agreement with the RD “gold standard.”
Conclusion: These data show only a fair degree of agreement between the standardized screening tools (PYMS, PNST, STAMP) and RD assessments, meaning that these tools do not adequately assess the nutritional status of critically ill children. However, some unidentified hospital-specific tools show near-perfect agreement with RD assessments, so perhaps there is a better tool for identifying malnourished children in the ICU. Further investigation should be performed to determine why the home-grown tools are superior to the published tools.
Research Trainee Award
Hayley E. Billingsley, PhD, RD, CEP; Michael Dorsch, PharmD, MS; Todd M. Koelling, MD; Scott L. Hummel, MD, MS
University of Michigan, Ann Arbor, MI
Financial Support: NHLBI - Award 5R33HL155498-03.
Background: Malnutrition is common in patients with heart failure (HF) and worsens an already poor prognosis. Previous work suggests that sodium restriction, the most common dietary recommendation for patients with HF, may be associated with reduced micronutrient and energy intake. The Mini Nutritional Assessment-Short Form (MNA-SF) is a strong indicator of nutrition status and prognosis in patients with HF, but the association between MNA nutrition status and sodium intake has not been examined. Therefore, this analysis aimed to examine the association between nutrition status and habitual sodium intake in hospitalized patients with HF.
Methods: This is a cross-sectional analysis of patients (≥18 y of age) hospitalized for decompensated HF. Participants were administered the MNA-SF and scored as nourished, at risk of malnutrition, or malnourished based on established cutoffs. Questions on the MNA-SF regarding weight loss and declines in food intake over the previous 3 months were also considered independently. Participants completed the 2014 Block Food Frequency Questionnaire (FFQ) to assess habitual dietary intake. Estimated daily kilocalories (kcals) from the FFQ were divided by estimated energy needs (Harris-Benedict equation × 1.1) to calculate percent (%) estimated energy needs. Estimated protein needs were calculated based on the Academy of Nutrition and Dietetics recommendation of 1.1 g/kilogram (kg) in HF, and estimated protein intake from the FFQ was divided by these needs to calculate % estimated protein needs. Using the FFQ, participants were grouped into sodium intake ≥ or < 2 g per day. Differences between groups based on sodium intake were explored using Fisher's exact test, Chi-square, or Mann-Whitney U as applicable.
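The needs calculations described here reduce to a few lines. A sketch assuming the classic Harris-Benedict coefficients (the abstract does not specify which revision was used, so this is illustrative only):

```python
def harris_benedict_kcal(sex: str, weight_kg: float, height_cm: float,
                         age_y: float) -> float:
    """Resting energy needs via the classic Harris-Benedict equation."""
    if sex == "male":
        return 66.5 + 13.75 * weight_kg + 5.003 * height_cm - 6.775 * age_y
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_y

def pct_energy_needs(ffq_kcal: float, sex: str, weight_kg: float,
                     height_cm: float, age_y: float) -> float:
    """FFQ kilocalories as a percentage of Harris-Benedict x 1.1."""
    needs = harris_benedict_kcal(sex, weight_kg, height_cm, age_y) * 1.1
    return 100 * ffq_kcal / needs

def pct_protein_needs(ffq_protein_g: float, weight_kg: float) -> float:
    """FFQ protein as a percentage of the 1.1 g/kg recommendation."""
    return 100 * ffq_protein_g / (1.1 * weight_kg)

# A 70 kg, 170 cm, 60-year-old male reporting 1,400 kcal and 60 g protein:
print(round(pct_energy_needs(1400, "male", 70, 170, 60)))  # ~86
print(round(pct_protein_needs(60, 70)))                    # ~78
```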
Results: Baseline characteristics are presented in Table 1. On FFQ, participants with sodium intake <2 g reported consuming significantly less of their % estimated energy and protein needs than participants with ≥ 2 g sodium intake (Figure 1). All patients (n = 12) with sodium intake <2 g per day were malnourished or at risk for malnutrition on MNA-SF versus 73% (32) of patients with sodium intake ≥2 g per day (P = 0.051). A greater proportion of patients with daily sodium intake <2 g reported recent weight loss >3 kg (75% [9] vs. 43% [19], P = 0.051). No difference was found in the proportion of participants reporting a decrease in food intake on the MNA-SF (<2 g sodium, 67% [8] vs. ≥ 2 g sodium, 50% [22], P = 0.305).
Conclusion: In patients hospitalized for HF, habitual sodium intake <2 g per day was associated with inadequate energy and protein intake, confirming previous findings. Despite the high prevalence of obesity in the cohort, sodium intake <2 g per day was also associated with self-reported weight loss >3 kg and a higher likelihood of being at risk for or having malnutrition. Although this cross-sectional analysis cannot determine the directionality of observed associations, additional studies should examine the impact of personalized nutrition interventions vs. standard-of-care sodium restriction education in HF on clinical outcomes.
Figure 1. Percent estimated energy and protein needs achieved by sodium intake level in hospitalized patients with heart failure.
Lucia A. Gonzalez Ramirez, cPhD1,2; Mary M. Nellis, PhD3; Jessica A. Alvarez, PhD1,2,4; Tasha M. Burley2; Paula D. Nesbeth, cPhD1,2; Chin-An Yang, cPhD1,2; Dean P. Jones, PhD1,3,4; Thomas R. Ziegler, MD1,2,4
1Nutrition and Health Sciences Program, Laney Graduate School, Emory University, Atlanta, GA; 2Division of Endocrinology, Metabolism and Lipids, Department of Medicine, Emory University, Atlanta, GA; 3Clinical Biomarkers Laboratory, and Division of Pulmonary, Allergy, Critical Care and Sleep Medicine, Department of Medicine, Emory University, Atlanta, GA; 4Center for Clinical and Molecular Nutrition, Department of Medicine, Emory University, Atlanta, GA
Financial Support: None Reported.
Background: Postprandial metabolism can reveal alterations related to the early stages of cardiovascular disease. However, limited data exist regarding the effects of body composition on postprandial metabolism after a lipid meal challenge. We aimed to characterize the metabolic pathways and metabolites associated with body fat abundance in the postprandial plasma metabolome after an oral lipid challenge.
Methods: Thirty-one healthy individuals between 20 and 50 years old with a lean or overweight/obese body mass index (BMI) were recruited. Participants underwent body composition measurement with dual-energy x-ray absorptiometry (DEXA) to quantify body fat percentage and visceral adipose tissue quantity. A standardized 900-kcal lipid meal challenge (a long-chain triglyceride fat emulsion oral nutritional supplement) with repeat blood sampling was administered. Untargeted plasma high-resolution metabolomics was performed at baseline, 120 minutes, and 360 minutes after the lipid challenge using dual-column liquid chromatography (C18- and HILIC+ electrospray modes) coupled with high-resolution mass spectrometry (LC-HRMS). Metabolite differences were assessed in a metabolome-wide association study using linear mixed-effect models to study the effects of body fat, time, and the body fat*time interaction, controlling for age and sex, and pathway enrichment analysis was performed.
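In such a metabolome-wide association study, each feature is fit with the same mixed model and the interaction p-value is collected. A sketch of one feature's fit with statsmodels (data and column names are synthetic and illustrative, not the study data):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative long-format data: one row per participant per time point.
rng = np.random.default_rng(0)
n, times = 31, [0, 120, 360]
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n), len(times)),
    "time": np.tile(times, n),
    "body_fat": np.repeat(rng.uniform(15, 45, n), len(times)),
    "age": np.repeat(rng.uniform(20, 50, n), len(times)),
    "sex": np.repeat(rng.choice(["F", "M"], n), len(times)),
})
df["intensity"] = rng.normal(10, 1, len(df))  # one metabolite feature

# Random intercept per subject; body_fat:time is the interaction term
# tested feature-by-feature in the MWAS described above.
fit = smf.mixedlm("intensity ~ body_fat * time + age + sex",
                  data=df, groups=df["subject"]).fit()
print(fit.pvalues["body_fat:time"])
```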
Results: A total of 12,078 (C18) and 15,041 (HILIC) features (metabolites) were detected in plasma at baseline. Changes over time differed by percent body fat (percent fat*time interaction) for 699 (C18) and 814 (HILIC) features from baseline to 120 minutes, and for 465 (C18) and 478 (HILIC) features from baseline to 360 minutes (all p < 0.05). These were enriched in pathways that include TCA cycle, fatty acid, lysine, tyrosine, tryptophan, butanoate, and purine metabolism (Figures 1 and 2). Additionally, changes over time differed by visceral adipose tissue quantity (VAT*time interaction) for 396 (C18) and 2290 (HILIC) features from baseline to 120 minutes, and for 486 (C18) and 520 (HILIC) features from baseline to 360 minutes (all p < 0.05). These were enriched in pathways that include fatty acid oxidation, omega-3 and omega-6 fatty acid, vitamin C, and pentose phosphate metabolism (Figures 3 and 4).
Best of ASPEN - Malnutrition, Obesity, Nutrition Practice Concepts, and Issues
Ana Paula Pagano, MSc1; Taiara Poltronieri, BSc1,2; William Evans, PhD3; M. Cristina Gonzalez, MD, PhD4; Anil Abraham Joy, MD5; Claude Pichard, MD, PhD6; Carla Prado, PhD, RD1
1University of Alberta, Edmonton, AB, Canada; 2Federal University of Rio Grande do Sul, Porto Alegre, Rio Grande do Sul, Brazil; 3University of California, Berkeley, CA; 4Federal University of Pelotas, Pelotas, Rio Grande do Sul, Brazil; 5University of Alberta/Cross Cancer Institute, Edmonton, AB, Canada; 6Geneva University Hospital, Geneva, Switzerland
Financial Support: ASPEN (American Society for Parenteral and Enteral Nutrition) Rhoads Research Foundation, and the Canadian Institutes of Health Research (CIHR) (FRN 159537).
Background: An accurate understanding of energy requirements is essential for tailored nutritional interventions in patients with cancer. Under- or overestimating these needs can lead to detrimental weight loss or excessive gain. Yet determining energy needs in cancer is challenging due to factors like individual tumor burden, treatment, and inflammation, all of which can influence energy requirements. Current guidelines offer only a broad caloric intake range (25-30 kcal/kg/d) presented as normal values, which lacks strong supporting evidence. As a result, dietitians often rely on predictive equations, which have proven imprecise. Meanwhile, the standard techniques available to accurately measure energy requirements are costly, time-consuming, and not applicable to clinical settings. In this study, we leveraged a cohort of patients with breast cancer to evaluate the accuracy of a novel bedside device designed to measure resting energy expenditure (REE), comparing it against a gold-standard method.
Methods: REE data were obtained cross-sectionally from adult females with breast cancer (stages I-III), measured during a 10-minute test with a novel portable device, the Q-NRG® (Cosmed, Roma, Italy), and compared against REE measured during a 1-hour test in a whole-room indirect calorimeter (WRIC) as the gold-standard technique. To compare REE between methods, we used a paired-samples t-test or, for non-normal data, the Wilcoxon signed-rank test. Accuracy was determined by the percentage of estimates that fell within 10% of the values measured by WRIC. Additionally, Bland-Altman analysis was conducted to determine bias and establish the lower and upper limits of agreement (LOA). A p-value of less than 0.05 was considered statistically significant.
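Both agreement statistics described here (bias with 95% limits of agreement, and the within-10% accuracy criterion) reduce to a few lines of array arithmetic. A sketch with illustrative values, assuming the conventional bias ± 1.96·SD limits:

```python
import numpy as np

def bland_altman(device: np.ndarray, wric: np.ndarray):
    """Bias and 95% limits of agreement, device minus WRIC."""
    diff = device - wric
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

def pct_within_10(device: np.ndarray, wric: np.ndarray) -> float:
    """Accuracy as defined above: share of readings within 10% of WRIC."""
    return 100 * (np.abs(device - wric) / wric <= 0.10).mean()

# Illustrative REE values (kcal/d), not study data:
q_nrg = np.array([1350.0, 1280.0, 1500.0, 1420.0])
wric = np.array([1400.0, 1450.0, 1410.0, 1520.0])
print(bland_altman(q_nrg, wric))
print(pct_within_10(q_nrg, wric))  # 75.0
```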
Results: REE was evaluated in 49 females (age 55.9 ± 11.8 y; 42 with stage I or II and 7 with stage III breast cancer) using both WRIC and the new portable device. Most patients (63.3%) had a body mass index (BMI) within the overweight or obesity categories, and none were categorized as underweight. The new portable device provided accurate measurements for over 70% (n = 35) of patients, with values within 10% of those obtained by WRIC. However, the new portable device overestimated REE for 1 patient and underestimated it for 13. Measured REE differed significantly between techniques, with the new portable device underestimating REE compared to WRIC (1406 ± 262 vs 1508 ± 248 kcal/d; p < 0.001). The bias between the new portable device and WRIC was −6.7% (LOA = −24.9%, 11.6%; variance = 36.5%) or −102 kcal (LOA = −378 kcal, 174 kcal; variance = 552 kcal).
Conclusion: When compared to a gold-standard technique, the new portable device showed good agreement, with REE discrepancies falling within 10% of WRIC-determined values for most patients. Although greater variability was observed at the individual level, the new portable device assessed REE accurately relative to the WRIC for most patients. Thus, the new portable device appears to be a promising tool for estimating the REE of patients with breast cancer, positioning it as a viable option for clinical settings.
Michelle Brown, MS, RD, LDN, CNSC
UF Health, Gainesville, FL
Financial Support: None Reported.
Background: Malnutrition is a highly prevalent issue in the healthcare setting, where the term refers to undernutrition. It occurs as a result of inadequate nutrition intake, impaired absorption, or altered utilization of nutrients; inflammation and hypermetabolism also contribute to its development. Estimates of the prevalence vary and run as high as 54%. In acute care hospitals, the prevalence of malnutrition is 39% when using diagnostic criteria from the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition (AND/ASPEN). Capturing and recognizing malnutrition is important, as this diagnosis is associated with a 3.4× higher rate of in-hospital death, a 1.9× longer length of stay, a 2.2× higher likelihood of being admitted with a serious infection, higher rates of discharge to a rehabilitation or long-term assisted care facility, an increased rate of readmissions, and a 73% increase in hospital costs. Due to the impact of malnutrition on healthcare costs and requirements for care, ICD-10 codes for malnutrition are considered comorbid conditions (CC) or major comorbid conditions (MCC). Accurate diagnosis, treatment, and documentation of malnutrition can improve patient care. Accurate documentation can also help capture complexity for quality metrics while allowing selection of the correct DRG and base payment, which may increase reimbursement.
Methods: An interdisciplinary nutrition committee at our organization, consisting of dietitians, physicians, nurses, and informatics professionals, completed a quality improvement implementation to improve malnutrition diagnosis rates, documentation, and coding. This was completed in four steps: (1) Identification of malnutrition criteria that could be used across the organization. Our committee elected to use the AND/ASPEN criteria for the diagnosis of malnutrition; these criteria are used by ~85% of hospitals and are widely recognized by payors. (2) Development of a documentation tool that would allow RD malnutrition diagnoses to populate provider progress notes. The hospital's electronic medical record (EMR) was leveraged to accomplish this goal: a novel flowsheet and Smartphrase were developed that allowed information on malnutrition severity, signs/symptoms, and treatment (entered by the dietitian) to flow into physician progress notes automatically. This solution met all of the documentation “best practices” identified by our interdisciplinary team: clear signs and symptoms of malnutrition identified, severity of malnutrition indicated and documented consistently between providers, consistent use of diagnostic criteria, and treatment for malnutrition provided and documented. (3) All clinical nutrition staff members were provided hands-on training in nutrition-focused physical exams (NFPE), and completion of these exams was prioritized in all nutrition assessments. (4) When the malnutrition Smartphrase was not used, notes were sent to physicians for attestation and signature.
Results: Following this implementation, dietitian-diagnosed malnutrition has been included in physician notes via Smartphrase for 65% of cases. In the six months following NFPE training, malnutrition diagnosis rates increased by 220%. The percentage of dietitian assessments with a malnutrition diagnosis has increased from 13% to 40%. Following the process of sending notes to physicians for attestation and signature, 94% of malnutrition diagnoses are coded in the EMR at discharge from the hospital, and coding queries to physicians decreased by 50%. Hospital reimbursement for dietitian-diagnosed malnutrition has increased from ~$65,000 per quarter to ~$2 million per quarter.
Conclusion: Utilization of appropriate NFPE training, physician-approved diagnostic criteria, and EMR-based documentation solutions can increase diagnosis, documentation, and reimbursement for malnutrition diagnoses in hospitalized patients.
Research Trainee Award
Alan Garcia-Grimaldo1,2; Ivan A. Osuna-Padilla1; Nadia Rodriguez-Moguel1; Martin A. Rios-Ayala1; Marycarmen Godinez-Victoria2
1National Institute of Respiratory Diseases, Mexico City, DF, Mexico; 2Escuela Superior de Medicina, Instituto Politécnico Nacional, Mexico City, DF, Mexico
Financial Support: None Reported.
Background: Intensive care unit-acquired weakness (ICU-AW) is characterized by peripheral muscle mass wasting, reduced muscle strength, and dysfunction. Respiratory and swallowing-related muscles can also be affected by this condition. This study aimed to analyze the association between ICU-AW incidence and post-extubation dysphagia (P-ED).
Methods: A prospective cohort study was conducted. Patients on mechanical ventilation (MV) admitted to the ICU were included; individuals with a previous diagnosis of myopathy were excluded. NUTRIC-Score, calf circumference adjusted for BMI, and phase angle (PhA) obtained by bioelectrical impedance were assessed upon admission and after extubation. Biochemical variables (baseline C-reactive protein) were collected from medical records. SOFA score, APACHE II, and malnutrition diagnosis using the GLIM criteria were determined upon admission to the ICU. Cumulative energy (CED) and protein (CPD) deficits were calculated during the ICU stay. ICU-AW diagnosis was determined using the Medical Research Council Scale (MRC-Scale < 48) and handgrip strength (<11 kg for men and <7 kg for women). Swallowing function was assessed within the first 24 hours after extubation using the Yale Swallowing Protocol (YSP). For patients who did not meet the success criteria defined for the YSP, the volume-viscosity swallow test was performed to corroborate the presence of post-extubation dysphagia (P-ED). The specific success and failure criteria proposed for each test were used. Mean and median comparison tests were performed for each variable between the group with P-ED and those with normal swallowing. Associations were analyzed using univariate and multivariate logistic and linear regressions. Covariate selection was performed using a stepwise method.
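To make the regression step concrete, the sketch below (Python/statsmodels; all column names such as ped, icu_aw, and days_mv are hypothetical placeholders, not the study's dataset) illustrates a univariate screen followed by a multivariable logistic model of the kind described:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("icu_cohort.csv")  # one row per patient; assumed column names

# Univariate screen: each candidate predictor of post-extubation dysphagia (0/1)
for predictor in ["phase_angle", "energy_deficit", "protein_deficit",
                  "icu_aw", "days_mv"]:
    m = smf.logit(f"ped ~ {predictor}", data=df).fit(disp=0)
    print(predictor, round(m.params[predictor], 3), round(m.pvalues[predictor], 3))

# Multivariable model keeping the covariates retained by the stepwise procedure
final = smf.logit("ped ~ days_mv + icu_aw", data=df).fit(disp=0)
print(final.summary())  # odds ratios are exp(coefficients)
```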
Results: Fifty-four patients were included; 19 (35.2%) were diagnosed with P-ED and 32 (59.3%) with ICU-AW. Patients with P-ED showed lower values for PhA at extubation, MRC-Scale, and handgrip strength at extubation. In addition, more days on invasive MV and higher CED and CPD were observed in this group (Table 1). In the univariate logistic regression analysis, PhA at extubation, CED, CPD, ICU-AW diagnosis, and days on MV were associated with P-ED. In the multivariate regression analysis, only days on MV and the ICU-AW diagnosis were independently associated with P-ED (Table 2).
Conclusion: Days on invasive mechanical ventilation, and ICU-acquired weakness diagnosis were predictors for post-extubation dysphagia. Novel clinical and nutritional strategies are required to prevent ICU-acquired muscle weakness and its consequences, which may improve clinical outcomes and quality of life after extubation.
Ahron Lee, RD1,2; Eun-Mee Kim, RD1; Bo-eun Kim, RD1; Chi-Min Park, MD, PhD3; Sung Nim Han, PhD2
1Department of Dietetics, Samsung Medical Center, Seoul, Korea, Republic of (South); 2Department of Food and Nutrition, College of Human Ecology, Seoul National University, Seoul, Korea, Republic of (South); 3Department of Critical Care Medicine and Surgery, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul, Korea, Republic of (South)
Financial Support: None Reported.
Background: The importance of “appropriate” nutrition support in the early stages of intensive care unit (ICU) admission remains under debate with respect to which patients require it, when to initiate it, and how much to provide. In this study, the characteristics and clinical outcomes of malnourished patients diagnosed using the Global Leadership Initiative on Malnutrition (GLIM) criteria were examined. The actual implementation of nutritional support and its relationship with clinical outcomes according to nutrition status were also investigated.
Methods: This retrospective cohort study included critically ill patients receiving invasive mechanical ventilation who were admitted to the ICU and hospitalized for at least 7 days between January 1, 2020, and December 31, 2022. Nutritional and clinical data during their first 10 days in the ICU were collected. All the patients in this study underwent nutrition assessment by the GLIM criteria. The 90-day mortality of patients diagnosed with malnutrition by the GLIM criteria and degree of malnutrition were analyzed. Patients were divided into three energy intake categories (<10 kcal/kg/d, 10–20 kcal/kg/d, and >20 kcal/kg/d) and three protein intake categories (<0.8 g/kg/d, 0.8–1.2 g/kg/d, and >1.2 g/kg/d). Information on intake was categorized by the stage following ICU admission (days 1–3 for the early acute phase, days 4–6 for the late acute phase, and days 7–10 for the recovery phase). We examined the differences in mortality among groups separated by energy and protein intake at each stage. The analyses were performed for the total cohort, well-nourished, and malnourished groups. Differences in the means and distribution were evaluated, and survival analyses and regression analyses were performed.
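As an illustration of the phase-wise intake categorization and survival comparison described above, the following sketch (Python; file names, column names, and the lifelines/pandas workflow are assumptions, with cut points taken from the Methods) shows one way such an analysis could be organized:

```python
import pandas as pd
from lifelines import CoxPHFitter

intake = pd.read_csv("daily_intake.csv")    # assumed: patient_id, icu_day, kcal_kg
outcomes = pd.read_csv("outcomes.csv", index_col="patient_id")  # time_90d, died_90d

# Assign ICU days 1-3, 4-6, and 7-10 to the phases defined in the Methods
intake["phase"] = pd.cut(intake["icu_day"], bins=[0, 3, 6, 10],
                         labels=["early_acute", "late_acute", "recovery"])

# Mean energy intake per patient in the recovery phase, then the three categories
rec = (intake[intake["phase"] == "recovery"]
       .groupby("patient_id")["kcal_kg"].mean())
energy_cat = pd.cut(rec, bins=[-1, 10, 20, 1000], labels=["low", "mid", "high"])

# Cox model for 90-day mortality with the low-intake group as reference
df = outcomes.join(pd.get_dummies(energy_cat, prefix="kcal", dtype=float)).dropna()
cph = CoxPHFitter()
cph.fit(df[["time_90d", "died_90d", "kcal_mid", "kcal_high"]],
        duration_col="time_90d", event_col="died_90d")
cph.print_summary()  # HRs correspond to the category comparisons reported
```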
Results: A total of 595 patients were included. The prevalence of malnutrition according to the GLIM criteria was 61% (n = 362). The 90-day mortality in the well-nourished and the malnourished group was 45% and 58%, respectively (P < 0.001). Mortality differed by the degree of malnutrition (well-nourished 45%, moderately malnourished 53%, severely malnourished 61%, P = 0.001). In the early acute phase and late acute phase, there was no difference in mortality among different energy intake groups. However, in the recovery phase, the group with high energy intake (>20 kcal/kg/d) showed lower mortality (hazard ratio (HR) 0.602; 95% confidence interval (CI) 0.413 to 0.877; P = 0.008) in the total cohort. In well-nourished patients, the high energy intake group tended to have lower mortality (HR 0.573; 95% CI 0.318 to 1.034; P = 0.064) in the recovery phase. However, in malnourished patients, the group with high energy intake showed significantly lower mortality (HR 0.549; 95% CI 0.333 to 0.903; P = 0.018) in the recovery phase. In the early acute phase and late acute phase, there was no difference in mortality among different protein intake groups. However, in the recovery phase, the group with moderate protein intake (0.8–1.2 g/kg/day) showed lower mortality (HR 0.770; 95% CI 0.599 to 0.990; P = 0.041) in the total cohort. When well-nourished patients and malnourished patients were analyzed separately, a significantly lower mortality (HR 0.728; 95% CI 0.536 to 0.988; P = 0.042) in the recovery phase was observed with moderate protein intake among malnourished patients.
Conclusion: Malnutrition diagnosed by the GLIM criteria was associated with 90-day mortality and other clinical outcomes. Furthermore, energy and protein intake at the recovery phase after ICU admission was associated with mortality, especially in malnourished patients classified by the GLIM criteria. Therefore, time-dependent nutritional intake depending on nutrition status may be relevant for optimizing ICU nutrition support strategies.
1Alberta Health Services, Calgary, AB, Canada; 2University of Calgary, Calgary, AB, Canada; 3Duke University School of Medicine, Durham, NC; 4Hospital Naval Marcilio Dias, Rio de Janeiro, RJ, Brazil; 5McGill University School of Human Nutrition, Montreal, QC, Canada
Financial Support: None Reported.
Background: Functional capacity is the most relevant outcome after critical illness according to ICU survivors. This outcome is especially pertinent as adult ICU mortality has been decreasing, leaving more survivors with impaired functional capacity, delayed return to work, and low quality of life. Protein via nutrition support (NS) has the potential to mitigate ICU-acquired weakness, but because current ICU benchmarks are based on mortality and ICU-related complications, it is unknown whether current protein targets also support functional recovery. To address this gap, we conducted a retrospective cohort study to determine whether different protein intake doses influenced the functional capacity of ICU survivors with LOS ≥ 7 days, measured by the Chelsea Critical Care Physical Assessment (CPAx) score at ICU discharge – a validated measure of functional capacity that is robust with respect to reliability, measurement error, and responsiveness.
Methods: The medical records of all consecutive patients admitted to a general systems ICU between October 2014 and September 2020 were reviewed. Inclusion criteria were age ≥18 years, survived ICU admission, ICU stay ≥7 days, and receipt of NS. Exclusion criteria included neuromuscular disorders, brain/spinal cord injury, limb amputation, orthopedic fractures, persistent coma during ICU stay, missing CPAx, and mechanical ventilation <3 days. Eligible patients were divided into 4 groups guided by previous literature exploring the effect of daily protein intake (g/kg/d) in the ICU on mortality: LOW (<0.8), MEDIUM (0.8-1.19), HIGH (1.2-1.5), and VERY HIGH (>1.5). Groups with similar CPAx were pooled to enhance precision. The effect of protein dose on CPAx was assessed with analysis of covariance (ANCOVA) adjusting for the confounding variables age, disease severity, length of hospital stay before ICU admission, duration of mechanical ventilation, and time until start of NS in the ICU. Effect modification by nutritional status was assessed by stratification according to subjective global assessment (SGA A: well-nourished; B/C: malnourished). The effect of energy intake was assessed using the same regression model (<25 and ≥25 kcal/kg/d; <70 and ≥70% daily adequacy).
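The ANCOVA described above can be sketched as follows (Python/statsmodels; variable names such as cpax, protein_group, and sga_group are hypothetical, and this is an illustration of the modeling approach rather than the authors' code):

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cpax_cohort.csv")  # one row per ICU survivor; assumed columns

# CPAx at ICU discharge by protein group, adjusted for the listed confounders
model = smf.ols(
    "cpax ~ C(protein_group) + age + disease_severity + pre_icu_hosp_days"
    " + mv_days + days_to_ns_start",
    data=df,
).fit()
print(model.summary())  # group coefficients give adjusted mean differences

# Stratified re-fit to check effect modification by nutritional status (SGA)
for sga, sub in df.groupby("sga_group"):
    m = smf.ols("cpax ~ C(protein_group) + age + disease_severity"
                " + pre_icu_hosp_days + mv_days + days_to_ns_start",
                data=sub).fit()
    print(sga, m.params.filter(like="protein_group"))
```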
Results: Inclusion/exclusion criteria were met by 531 patients. CPAx was non-linearly associated with protein dose (Figure 1) and was not statistically different among the LOW, MEDIUM, and VERY HIGH groups. All three groups differed from HIGH (p = 0.003), indicating the data could be pooled, giving rise to 2 groups: HIGH (1.2-1.5 g/kg/d) and POOLED (<1.2 and >1.5 g/kg/d). Baseline characteristics were comparable between the two groups (Table 1). Mean CPAx (± standard error) was greater in the HIGH vs POOLED group (30.1 ± 0.7 vs. 26.8 ± 0.6, p = 0.001), suggesting that the HIGH dose was associated with superior functional capacity at discharge. The mean difference (MD) remained statistically significant after adjusting for confounding variables (CPAx MD: 3.4 ± 1.1, p = 0.003 in the 4-group model and 3.3 ± 0.9, p = 0.001 in the 2-group model). Energy intake had no effect on CPAx, whether expressed as kcal/kg/d (28.1 ± 0.6 with <25 kcal/kg/d vs 27.9 ± 0.8 with ≥25 kcal/kg/d, p = 0.780) or as adequacy (27.3 ± 0.9 with <70% vs 28.4 ± 0.6 with ≥70%, p = 0.641). Nutritional status was not an effect modifier, as the HIGH group had superior CPAx in both well-nourished (MD 3.8 ± 1.7, p = 0.029) and malnourished (MD 2.5 ± 1.1, p = 0.031) patients.
Best of ASPEN - Critical Care and Critical Health Issues
International Abstract of Distinction
Chin Han Charles Lew, APD, PhD1; Zheng-Yii Lee, PhD2,3; Andrew Day, MSc4; Xuran Jiang, MSc4; Danielle E. Bear, RD, PhD5,6; Gordon L. Jensen, MD, PhD7; Pauline Y. Ng, MBBS, MRCP(UK), FHKCP, FHKAM8; Lauren Tweel, RD, CNSC, MSc9; Angela Parillo, RD, LD, CNSC, MSc10; Daren K. Heyland, MD, MSc4; Charlene Compher, PhD, RD, LDN, FASPEN11
1Dietetics and Nutrition Department, Ng Teng Fong General Hospital, Singapore; 2Department of Anesthesiology, Faculty of Medicine, Universiti Malaya, 50603 Kuala Lumpur, Kuala Lumpur, Malaysia; 3Department of Cardiac Anesthesiology & Intensive Care Medicine, Berlin, Germany; 4Clinical Evaluation Research Unit, Department of Critical Care Medicine, Queen's University, Kingston, ON, Canada; 5Department of Critical Care, Guy's and St Thomas' NHS Foundation Trust, St Thomas' Hospital, London, United Kingdom; 6Department of Nutrition and Dietetics, Guy's and St Thomas' NHS Foundation Trust, St Thomas' Hospital, London, United Kingdom; 7University of Vermont Larner College of Medicine, Burlington, VT; 8Critical Care Medicine Unit, School of Clinical Medicine, The University of Hong Kong, Hong Kong; 9Rutgers University, New Brunswick, NJ; 10The Ohio State University Wexner Medical Center, Department of Clinical Nutrition, Columbus, OH; 11University of Pennsylvania School of Nursing, Philadelphia, PA
Financial Support: None Reported.
Background: Pre-existing malnutrition is common among critically ill patients (38-78%), and it can be diagnosed using tools such as the Global Leadership Initiative on Malnutrition (GLIM) criteria and the Academy of Nutrition and Dietetics and American Society for Parenteral and Enteral Nutrition (ASPEN) Indicators of Malnutrition (AAIM). However, it is unclear whether these tools or their individual components (nutrition parameters [NPs]), such as weight history, diet history, body mass index (BMI), or muscle mass, have better clinical utility and validity in the intensive care unit (ICU) setting, since certain NPs are easier to obtain (e.g., BMI) than others (e.g., weight history). More importantly, it is unclear whether treating malnutrition according to the 2021 ASPEN guidelines (which recommend delivering 12-25 kcal/kg/d and 1.2-2 g/kg/d of protein) is associated with improved clinical outcomes. We investigated whether GLIM, AAIM, and/or selected individual NPs measured at ICU admission were associated with time to discharge alive (TTDA) (primary outcome), 60-day mortality, or home discharge, and whether higher protein delivery modified those associations.
Methods: This was a post hoc analysis of the EFFORT Protein trial (n = 1301), the largest multinational, multicenter trial comparing higher vs. usual protein delivery in critically ill patients. Malnutrition status was retrospectively classified according to GLIM and AAIM using NPs that were prospectively collected at ICU admission. For GLIM, acute disease-related inflammation formed the etiologic criterion for all patients since they were critically ill, and malnutrition severity was classified according to the phenotypic criteria (severity of weight loss, low BMI, reduced muscle mass). For AAIM, a modified approach was adopted because certain NPs were not collected (i.e., reduced energy intake or weight loss over periods <1 month, fluid accumulation, and grip strength); hence, malnutrition status was classified by the patient's weight loss severity and any reduction in energy intake. Multivariable regressions were used to determine whether malnutrition diagnosed by GLIM and AAIM (both dichotomized as “not identified as malnourished” vs. “moderate/severe malnutrition”) and/or individual NPs were associated with outcomes, and whether protein delivery modified these associations.
Results: Table 1 summarizes the characteristics of patients according to their malnutrition status classified by GLIM. Of 1301 predominantly medical admissions, 41% and 14% of the patients were malnourished according to GLIM and AAIM, respectively. Malnutrition diagnosed by GLIM and AAIM was independently associated with extended TTDA (p = 0.03, p = 0.01), higher odds of 60-day mortality (p = 0.02, p = 0.01), and lower odds of home discharge (p = 0.03, p = 0.05), whereas individual NPs were not (p > 0.10). However, higher protein delivery did not modify the association between malnutrition (diagnosed by GLIM and AAIM) and worse outcomes (Table 2). Notably, in patients with BMI < 18.5 kg/m2 (n = 78), higher protein delivery was associated with a shorter TTDA (adjusted hazard ratio 2.68, 95% confidence interval [CI] 1.14-6.30) and greater odds of home discharge (adjusted odds ratio 4.61, 95%CI 1.35-15.71) than usual protein delivery.
Elias Wojahn, B.S.; Liyun Zhang, MS; Amy Y. Pan, PhD; Theresa Mikhailov, MD, PhD
Medical College of Wisconsin, Milwaukee, WI
Financial Support: Medical College of Wisconsin.
Background: Previous guidelines lacked sufficient data to comment on the safety of enteral nutrition in critically ill children. A more recent study indicated that enteral nutrition is indeed safe for critically ill children receiving vasoactive medication. Additional data in adults indicated that septic shock patients treated with vasoactive medication who were given early enteral nutrition had better outcomes than patients given no nutrition. We retrospectively investigated a similar premise in pediatric patients to determine (1) the frequency of use of early enteral versus parenteral nutrition for patients in the PICU for septic shock receiving vasoactive medication and (2) the impact of early enteral versus parenteral nutrition on PICU length of stay (LOS) and mortality for patients admitted with septic shock and treated with vasoactive medication. We hypothesized that (1) clinical practices have changed over recent years such that early enteral nutrition is administered more frequently to pediatric septic shock patients treated with vasoactive medication and (2) receiving early enteral nutrition as a PICU patient treated for septic shock with vasoactive medications is associated with better outcomes.
Methods: We obtained demographic and outcome data for pediatric patients admitted to Children's Hospital Wisconsin for septic shock and treated with vasoactive medications over a 5-year period from the Virtual Pediatric Systems, LLC (VPS) database, a data registry for PICU patients. We obtained clinical data, including details of enteral and parenteral nutrition administered and use of vasoactive medications, by chart review. We quantified the use of vasoactive medications by the Vasoactive-Inotrope Score (VIS) and the severity of illness by the PRISM3 Probability of Death. We considered medical LOS and mortality as clinical outcomes. We compared categorical variables by Chi-square tests and continuous variables by Mann-Whitney or Kruskal-Wallis tests. P < 0.05 was considered statistically significant.
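The VIS referenced above is a published weighted sum of vasoactive infusion rates; a small sketch of the commonly used weighting is shown below (the function name and example doses are illustrative, and the exact formula should be confirmed against the score's primary reference):

```python
def vasoactive_inotrope_score(dopamine, dobutamine, epinephrine,
                              norepinephrine, milrinone, vasopressin):
    """VIS with the commonly used weights; catecholamines and milrinone in
    mcg/kg/min, vasopressin in units/kg/min (hypothetical helper)."""
    return (dopamine + dobutamine
            + 100 * epinephrine
            + 100 * norepinephrine
            + 10 * milrinone
            + 10_000 * vasopressin)

# Example: epinephrine 0.05 + milrinone 0.5 (mcg/kg/min) -> VIS = 5 + 5 = 10
print(vasoactive_inotrope_score(0, 0, 0.05, 0, 0.5, 0))
```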
Results: We identified 637 patients aged 0-21 years treated in the PICU with a diagnosis of septic shock. Of these, 401 received vasoactive medication, 183 received early enteral nutrition, and 81 received early parenteral nutrition. Those given early parenteral nutrition had a longer LOS (median (IQR): 7.0 (2.2-23.2) days) than those not fed (median (IQR): 2.1 (1.1-5.1) days) (p < 0.0001), but did not differ from those fed enterally (median (IQR): 7.9 (3.7-15.2) days) (p = 0.95). After controlling for severity of illness, patients who received early parenteral nutrition were more likely to die than those receiving early enteral nutrition or those who were not fed at all (parenteral vs. enteral: 17.8% vs. 4.60%, p = 0.002; parenteral vs. none: 17.28% vs. 6.70%, p = 0.002). Mortality did not differ between patients who received early enteral nutrition and those not fed (4.60% vs. 6.70%, p = 0.43).
Conclusion: Early enteral nutrition was given more frequently than early parenteral nutrition. Early enteral nutrition was not significantly associated with improved outcomes as measured by length of stay and mortality, but early parenteral nutrition was associated with significantly worse outcomes. This suggests that clinical guidelines should favor the use of enteral feeding in septic shock patients receiving vasoactive medication.
Best of ASPEN - Critical Care and Critical Health Issues
1Nanjing University, Nanjing, Jiangsu, China; 2Southeast University, Nanjing, Jiangsu, China
Financial Support: None Reported.
Background: There is controversy over optimal early protein delivery in critically ill patients with acute kidney injury (AKI). This study aimed to evaluate whether the association between early protein delivery and 28-day mortality is modified by the presence of AKI in critically ill patients.
Methods: This is a secondary analysis of a multicenter cluster-randomized controlled trial enrolling newly admitted critically ill patients (N = 2772). Participants with complete data on baseline renal function and 28-day mortality were included in this study. Cox proportional hazards models were used to investigate whether early protein delivery, reflected by mean protein delivery from day 3 to day 5 after enrollment, was associated with 28-day mortality and whether baseline AKI stages impacted their association.
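A minimal sketch of the Cox model described above, rescaled so the hazard ratio is expressed per 0.1 g/kg/d as reported in the Results (Python/lifelines; the file and column names are assumptions):

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("protein_cohort.csv")  # assumed: time_28d, died_28d, protein_g_kg_d

# Express protein delivery in 0.1 g/kg/d units so the HR matches the reporting
df["protein_per_01"] = df["protein_g_kg_d"] / 0.1

cph = CoxPHFitter()
cph.fit(df[["time_28d", "died_28d", "protein_per_01"]],
        duration_col="time_28d", event_col="died_28d")
cph.print_summary()  # HR < 1 indicates lower 28-day mortality per 0.1 g/kg/d
```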
Results: Overall, 2,618 patients were included (Table 1), among whom 628 (24.0%) had AKI at enrollment (118 stage I, 97 stage II, 413 stage III). Mean early protein delivery was 0.60 ± 0.38 g/kg/d (Figure 1). In the overall cohort, each 0.1 g/kg/d increase in protein delivery was associated with a 5% reduction in 28-day mortality (hazard ratio [HR] = 0.95; 95% confidence interval [CI] 0.92-0.98, P < 0.001). When early protein delivery was stratified by tertiles, the risk of 28-day mortality decreased relative to low protein delivery in both the medium protein group (HR = 0.64; 95% CI 0.50-0.82, P < 0.001) and the high protein group (HR = 0.71; 95% CI 0.55-0.91, P = 0.007) after adjusting for potential confounders (Figure 2). The association between early protein delivery and 28-day mortality across baseline AKI stages showed significant heterogeneity (adjusted interaction P = 0.047). With each 0.1 g/kg/d increase in protein delivery, 28-day mortality decreased by 5% (HR = 0.95; 95% CI 0.92-1.00, P = 0.008) in patients without AKI and by 7% (HR = 0.93; 95% CI 0.86-0.99, P = 0.043) in those with AKI stage III, of whom 72% were on renal replacement therapy at enrollment. These associations were not observed in AKI stage I and II patients. Mortality trends up to day 28 by early protein delivery in the different AKI stage groups are depicted in Figure 3.
Conclusion: Higher early protein delivery during days 3-5 of ICU stay was associated with improved 28-day mortality in critically ill patients without AKI and with AKI stage III, but not in those with AKI stage I or II.
Figure 3. The trends of 28-day mortality with early protein delivery in different AKI stages.
Stanislaw J. Gabryszewski, MD, PhD1; David A. Hill, MD, PhD1,2
1Children's Hospital of Philadelphia, Philadelphia, PA; 2University of Pennsylvania, Philadelphia, PA
Financial Support: This work was supported by the National Institutes of Health (Grants T32HD043021 to SJG; K08DK116668 and R01HL162715 to DAH).
Background: The ketogenic diet (KD) is a high-fat, moderate-protein, low-carbohydrate diet that induces ketosis, a metabolic shift characterized by the use of fatty acid-derived ketone bodies rather than glucose to meet energy needs. While the KD is best known as a dietary therapy for refractory epilepsy, there is growing interest in identifying other diseases in which the KD may be therapeutic. Recent studies have revealed the potential of the KD to dampen inflammation and pathology in mouse models of allergic asthma. However, it is unclear whether the KD has such immunoregulatory effects in other allergic diseases, such as the gastrointestinal allergy eosinophilic esophagitis (EoE).
Methods: We studied the effects of the KD in a mouse model of eosinophilic esophagitis (EoE) in which 10-week-old C57BL/6 mice were topically treated with the vitamin D analog MC903 and the egg white allergen ovalbumin (OVA) on days 0 to 11 to induce eczema-like dermatitis and allergic sensitization, respectively. The effect of the KD following allergic sensitization was studied by feeding mice KD or a regular diet (RD) starting on day 12. Mice were provided with OVA-supplemented water and gavaged with OVA on days 18-20. On day 21, mice were harvested to quantify esophageal eosinophilia and to phenotype immune responses in draining lymph nodes via flow cytometry.
Results: Following induction of EoE, mice in both the KD (n = 17) and RD (n = 17) arms exhibited 100% survival at day 21. Weight recovery (percent of original weight ± SEM) at day 21 was comparable between KD-fed (104.1 ± 1.7%) and RD-fed (99.0 ± 3.2%) mice (p > 0.05). Analysis of esophageal eosinophilia at day 21 revealed significantly decreased numbers (total cells ± SEM) of Siglec-F+ CD11b+ eosinophils in KD-fed (711 ± 345 cells) versus RD-fed (880 ± 225 cells) mice (p < 0.05). There was a non-significant reduction in the percentage of esophageal eosinophils (percent of CD45+ cells ± SEM) in KD-fed (5.1 ± 1.2%) versus RD-fed (8.1 ± 1.5%) mice (p = 0.138). In immunophenotyping of phorbol myristate acetate and ionomycin-stimulated cells from draining lymph nodes at day 21, there was a significantly increased percentage (percent of CD4+ T cells ± SEM) of Foxp3+ T regulatory (Treg) cells in KD-fed (6.5 ± 1.1%) versus RD-fed (3.3 ± 0.4%) mice (p < 0.01).
Conclusion: In this mouse model of OVA-induced EoE, we observed a modest inhibitory effect of the KD on the recruitment of eosinophils to the esophagus. As compared with the RD, the KD was associated with increased proportions of Foxp3+ Tregs in draining lymph nodes of mice with EoE. Additional mechanistic investigations are warranted, including determination of the necessity of Tregs for KD-induced inhibition of esophageal eosinophilia. This study highlights the promise of immunomodulatory dietary interventions in the context of allergic disease.
1Massachusetts General Hospital, Boston, MA; 2Brigham and Women's Hospital, Boston, MA; 3University of Pennsylvania School of Nursing, Philadelphia, PA
Financial Support: Research reported in this publication was supported by the American Society for Parenteral and Enteral Nutrition (ASPEN) Rhoads Research Foundation awarded to Hassan S. Dashti.
Background: Patients living with short bowel syndrome (SBS) receiving home parenteral nutrition (HPN) commonly receive nutritional infusions overnight, contributing to sleep and circadian disruption. Aligning nutritional intake with the circadian clock is expected to yield substantial benefits for vulnerable populations by limiting circadian misalignment (i.e., a mismatch between the circadian system and behaviors) and influencing other pathways. Recent advancements in metabolic profiling techniques (systematic profiling of cellular metabolites, i.e., sugars, amino acids, organic acids, nucleotides, and lipids) have emerged as a promising tool for identifying relevant biological pathways. Our objective was to characterize metabolites that differ between daytime and overnight HPN infusions in adults with SBS habitually receiving HPN.
Methods: The present study was a secondary analysis of a controlled, single-arm, 2-week pilot and feasibility trial designed to compare daytime with overnight HPN infusions in adults with SBS receiving HPN (ClinicalTrials.gov: NCT04743960). Enrolled patients received 1 week of overnight HPN infusions followed by 1 week of daytime HPN infusions (an approximately 12-hour change in infusion start time). Duration, frequency, and composition of infusions remained identical during the two study periods. Following each 1-week study period, patients had a venous blood sample collected at a clinical visit. Plasma samples were analyzed using ultrahigh-performance liquid chromatography-tandem mass spectrometry, and global metabolic profiles were determined. Of 1015 measured metabolites, only the 622 metabolites with non-missing data across all samples were analyzed. Data were normalized to the volume of sample extracted, then log-transformed and Z-score scaled prior to analysis. Differential metabolite abundance between the two study periods (daytime vs. overnight) was determined using standard Linear Models for MicroArray Data (LIMMA) models adjusted for dietary fasting duration and time since the end of the last HPN infusion. Pathway enrichment analysis was then conducted using MetaboAnalyst's pathway enrichment tool.
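LIMMA is an R/Bioconductor package; as a rough Python analogue of the covariate-adjusted differential abundance step described above, one could fit an ordinary linear model per metabolite, as in the sketch below (file names, column names, and the coding of the period variable are assumptions):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import zscore

metab = pd.read_csv("metabolites.csv", index_col=0)  # samples x metabolites, positive intensities
meta = pd.read_csv("samples.csv", index_col=0)       # period, fast_hours, hrs_since_infusion

# Log-transform and Z-score each metabolite (volume normalization assumed upstream)
X = metab.apply(lambda col: zscore(np.log(col)))

results = []
for m in X.columns:
    d = meta.assign(y=X[m])
    fit = smf.ols("y ~ C(period) + fast_hours + hrs_since_infusion", data=d).fit()
    # coefficient name assumes period is coded "daytime"/"overnight"
    results.append((m, fit.pvalues["C(period)[T.overnight]"]))

top = sorted(results, key=lambda r: r[1])[:36]  # most period-sensitive metabolites
```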
Results: Nine patients (age, 52 years; 80% female; BMI 21.3 kg/m2) completed the trial and provided two fasting blood samples. Both blood draws were completed at approximately 11:20 am following at least an 8-hour fast and at least 8 hours from the end of an HPN infusion. Changes were detected in 36 metabolites at P < 0.05; the top-changing metabolites were mostly long-chain and polyunsaturated fatty acids (dihomo-gamma-linolenic acid, arachidonate (20:4n6), docosahexaenoate (DHA; 22:6n3)) and glycerolipids (Figure 1). No metabolites were significant at the stringent FDR threshold. Enrichment analysis of the 36 metabolites identified pathways related to the biosynthesis of unsaturated fatty acids, D-arginine and D-ornithine metabolism, and linoleic acid metabolism, among others (Figure 2).
Astrid Verbiest, MSc1,2; Mark K. Hvistendahl, MS, PhD3; Federico Bolognani, MD, PhD4; Carrie Li, MS, PhD4; Nader N. Youssef, MD, MBA, FACG4; Francisca Joly, MD, PhD5; Palle B. Jeppesen, MD, PhD3; Tim Vanuytsel, Associate Professor1,2
1Leuven Intestinal Failure and Transplantation Center (LIFT), University Hospitals Leuven, Leuven, Belgium; 2Translational Research Center for Gastrointestinal Disorders (TARGID), University of Leuven, Leuven, Belgium; 3Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark; 4VectivBio, Basel, Switzerland; 5Centre for Intestinal Failure, Department of Gastroenterology and Nutritional Support, Hôpital Beaujon, Clichy, France
Financial Support: This research was supported by VectivBio AG.
Background: Short bowel syndrome (SBS) is a severe organ failure condition with a high risk of developing intestinal failure (SBS-IF) and life-long parenteral support (PS) dependence. Glucagon-like peptide-2 (GLP-2) analogs stimulate adaptation of the remaining intestine resulting in increased intestinal absorption and reduced PS needs. Extensive literature is available on the effect of the short-acting GLP-2 analog teduglutide in patients without a remaining colon. However, the impact of GLP-2 analogs on fluid and energy absorption in SBS-IF with a colon-in-continuity (CiC) is unclear. Apraglutide (APRA) is a novel, long-acting synthetic GLP-2 analog that is in development for SBS-IF. We performed a pre-defined interim analysis of a phase 2 study in SBS-IF-CiC to investigate the safety and efficacy of 4 weeks of apraglutide treatment based on metabolic balance studies (MBS).
Methods: STARS Nutrition is a 52-week, multicenter, open-label phase 2 study in adult patients with SBS-IF-CiC receiving once-weekly subcutaneous apraglutide injections (5 mg). MBS were performed at baseline and after 4 weeks on stable PS, followed by a 48-week PS adjustment period. During MBS, fluid intake was kept constant (individual predefined drinking menu). Duplicates of meals and fluids (wet weight intake), urine, and feces (fecal wet weight output) were collected. Safety was the primary endpoint. Secondary endpoints included changes in fecal wet weight output, urinary output, and wet weight and energy absorption. Data are presented as mean (95% CI). P values < 0.05 were considered significant (Wilcoxon matched-pairs signed rank test).
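The balance-study arithmetic and paired test described above can be sketched as follows (Python/SciPy; column and period labels are hypothetical):

```python
import pandas as pd
from scipy.stats import wilcoxon

bal = pd.read_csv("balance_periods.csv")  # assumed: patient, period, intake_g_d, fecal_g_d

# Wet weight absorption: intake minus fecal output, absolute and relative
bal["absorbed_g_d"] = bal["intake_g_d"] - bal["fecal_g_d"]
bal["pct_absorption"] = 100 * bal["absorbed_g_d"] / bal["intake_g_d"]

# Paired within-patient comparison of baseline vs week 4 (matched-pairs signed rank)
wide = bal.pivot(index="patient", columns="period", values="fecal_g_d")
stat, p = wilcoxon(wide["week4"], wide["baseline"])
print(f"Change in fecal wet weight output: p = {p:.3f}")
```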
Results: Nine patients were included and comprise the full study population. Apraglutide was well tolerated with no dose discontinuation or interruption. No AEs were considered notable based on their nature or severity. At baseline, patients received a weekly PS volume of 10 (range 4-21) L. Small bowel length was 19 (range 0-50) cm and 79 (range 43-100) % of the colon was in continuity. Fecal wet weight output decreased significantly by 253 (−437 to −68) g/day (p = 0.012). Relative wet weight absorption increased by 9 (1 to 18) % (p = 0.039). There was a numeric increase in urinary output (p = 0.129). No significant changes in energy absorption were observed (Table 1).
Palle B. Jeppesen, MD, PhD1; Tim Vanuytsel, Associate Professor2; Sukanya Subramanian, Physician3; Francisca Joly, MD, PhD4; Geert Wanten, Physician5; Georg Lamprecht, Physician, Professor6; Marek Kunecki, MD7; Farooq Rahman, Physician8; Thor Nielsen, Statistician9; Lykke Graff, MD9; Mark Hansen, Physician9; Ulrich Pape, Physician10; David Mercer, Physician11
1Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark; 2UZ Leuven, Leuven, Belgium; 3MedStar Georgetown, Washington, DC; 4Centre for Intestinal Failure, Department of Gastroenterology and Nutritional Support, Hôpital Beaujon, Clichy, France; 5Radboud University Nijmegen Medical Centre, Nijmegen, Netherlands; 6University Medical Center Rostock, Rostock, Germany; 7M. Pirogow Hospital, Wolczanska, Poland; 8University College London Hospitals, London, United Kingdom; 9Zealand Pharma A/S, Copenhagen, Denmark; 10ASKLEPIOS Klinik St. Georg, Hamburg, Germany; 11Nebraska Medical Center, NE
Background: Reduction of parenteral support (PS) is important for improved outcomes in short bowel syndrome (SBS) patients with intestinal failure (IF). A clinically meaningful within-patient change in PS volume has to date been regarded as a ≥20% reduction. This is, however, based on clinical experience, and to our knowledge there has been no data-driven analysis quantifying what constitutes a meaningful change in PS volume from the patient perspective. Glepaglutide, a long-acting GLP-2 analog, reduces PS volume needs and improves the patient global impression of change (PGIC), a patient-reported outcome (PRO) measure, in SBS-IF patients. Here we report a quantitative analysis of meaningful change in PS volume using the PGIC following glepaglutide treatment in the Efficacy and Safety Evaluation (EASE) SBS 1 trial.
Methods: EASE SBS 1 is a multicenter, placebo-controlled, randomized, parallel-group, double-blind phase 3 trial (NCT03690206). Adult patients with chronic SBS-IF requiring PS at least 3 days per week were recruited and randomized to 24 weeks of treatment with SC injections of either 10 mg glepaglutide twice weekly (TW), 10 mg glepaglutide once weekly (OW), or placebo. PS volume requirements were evaluated and adjusted using regular fluid balance periods. The primary endpoint was the reduction in weekly PS volume from baseline to week 24. Patients rated the change in their overall status from the start of the trial to weeks 12 and 24 by PGIC, using a 7-point Likert scale (ranging from very much worse to very much improved). Anchor-based analyses using scatter plots and empirical cumulative distribution functions (eCDF) were applied to assess the association between the categorical PGIC data and the % change in PS volume from baseline to weeks 12 and 24. Anchor-based methods use external criteria to establish what is clinically meaningful to patients based on known anchoring measures.
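A minimal sketch of the anchor-based association step described above (Python/SciPy; the file and column names are assumptions), computing the two rank correlations and an eCDF per PGIC category:

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr, kendalltau

df = pd.read_csv("pgic_ps.csv")  # assumed: pgic (1-7 Likert), ps_pct_change

rho, p_rho = spearmanr(df["pgic"], df["ps_pct_change"])
tau, p_tau = kendalltau(df["pgic"], df["ps_pct_change"])
print(f"Spearman {rho:.3f}, Kendall tau-b {tau:.3f}")

# eCDF of % PS volume change within each PGIC category; the value printed is
# the fraction of patients at or beyond a 20% volume reduction
for level, grp in df.groupby("pgic"):
    x = np.sort(grp["ps_pct_change"].to_numpy())
    ecdf = np.arange(1, len(x) + 1) / len(x)
    print(level, np.interp(-20, x, ecdf))
```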
Results: 99 of the 106 randomized patients completed the trial. Glepaglutide TW treatment significantly reduced mean PS requirements by 47% (5.13 L/wk) from baseline. Improvement in PGIC was shown, with significant differences relative to placebo for both glepaglutide TW (p = 0.002) and OW (p < 0.0001). Using the blinded data sample, the association between PGIC and the % change in PS volume from baseline to week 24 showed that the two endpoints were correlated, with Spearman rank-order and Kendall's tau-b correlation coefficients of 0.353 and 0.285, respectively. After 12 weeks of treatment, the association appeared stronger. Inspection of the eCDF supported the appropriateness of a 20% PS volume reduction threshold.
Conclusion: Anchor analysis, using PGIC as the anchor measurement, showed that the use of 20% reduction in PS volume, an outcome measure used in clinical trials, is considered clinically meaningful to SBS patients.
Abstract of Distinction
Ji Seok Park, MD, MPH1; Naseer Sangwan, PhD1; Lauren Menke2; Gail Cresci, PhD, RD, LD, FASPEN1
1Cleveland Clinic, Cleveland, OH; 2Case Western Reserve University, Cleveland, OH
Financial Support: 4R00AA023266 (GC) and Standard Process.
Background: A synbiotic is a physical combination of a prebiotic and a probiotic, with the general goal of maintaining probiotic viability through co-packaging with its food source. Despite their wide availability, evidence to support synbiotic use in healthy populations is limited. This study aimed to test the feasibility and safety of a targeted synbiotic and its effects on gastrointestinal symptoms and the gut microbiota.
Methods: This was a double-blinded, randomized, placebo-controlled, paired crossover pilot study in healthy adults testing the effects of a targeted synbiotic on gut microbiota diversity and abundance. The targeted synbiotic consisted of 2 probiotic strains, Lactobacillus reuteri 3613 (1 × 10^9 CFU) and Lactobacillus plantarum 276 (1 × 10^11 CFU), and a resistant starch (RS) prebiotic, NuBanaTM RS65G Green Banana Flour (3.84 g/d). Thirty-four healthy participants meeting the pre-defined criteria were enrolled, per a sample size calculation of 24 completers needed to achieve 91% power at a 5% significance level. Participants were randomized to consume the synbiotic or maltodextrin placebo for 28 days, followed by a 21-day washout period, and then crossed over to consume the other supplement for 28 days. Gastrointestinal symptoms were assessed, and fecal samples were collected before and after each supplement period. Fecal samples were analyzed by 16S rRNA sequencing, and the Divisive Amplicon Denoising Algorithm 2 (DADA2) and Ribosomal Database Project (RDP) classifier were used for taxonomic profiling. Alpha-diversity was assessed using the Shannon diversity index, and beta-diversity was assessed using Bray-Curtis dissimilarity. Differential abundance analysis was used to identify taxa that differed significantly between the synbiotic and placebo periods. The study was approved by the Cleveland Clinic Institutional Review Board.
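The two diversity measures named above can be computed directly; the sketch below (Python; the counts-table layout and file name are assumptions) shows a per-sample Shannon index and a pairwise Bray-Curtis distance:

```python
import numpy as np
import pandas as pd
from scipy.spatial.distance import braycurtis

counts = pd.read_csv("asv_counts.csv", index_col=0)  # rows = samples, cols = taxa

def shannon(row):
    """Shannon diversity index of one sample's taxon counts."""
    p = row / row.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

alpha = counts.apply(shannon, axis=1)            # alpha-diversity per sample
bc = braycurtis(counts.iloc[0], counts.iloc[1])  # beta-diversity, one sample pair
print(alpha.head(), bc)
```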
Results: Thirty-four participants (13 males, 21 females) were randomized into the study, and 28 completed it (average age 32 ± 7 years). The Shannon diversity index of fecal samples was higher when participants were taking the synbiotic compared with placebo (P = 0.021), suggesting higher microbial richness and evenness during synbiotic consumption. Bray-Curtis dissimilarity between the synbiotic and placebo periods was visualized using Principal Coordinates Analysis (PCoA), which showed 2 separate but overlapping groups. Differential abundance analysis identified 11 taxa, including the butyrate-producing genera Akkermansia and Butyricimonas, that differed significantly between the synbiotic and placebo supplements. All subjects tolerated the supplements well, reporting no changes in GI symptoms.
Conclusion: This pilot study shows a targeted synbiotic supplement favorably modified gut microbiome diversity and taxa abundance in healthy subjects. Further studies are warranted to test the effects of this targeted synbiotic in clinical scenarios with known gut dysbiosis to determine if modifications can be sustained and associated with disease.
1Case Western Reserve University/Cleveland Clinic Lerner Research Institute, Cleveland, OH; 2Cleveland Clinic, Cleveland, OH
Financial Support: NIH-National Institute of Alcohol Abuse and Alcoholism.
Background: Alcohol use disorder is the leading cause of liver disease in the United States [1], and an estimated 80% of patients with alcohol-associated end-stage liver disease (AA-ESLD) also present with clinical malnutrition and sarcopenia [2]. Gut dysbiosis in alcohol-associated liver disease (ALD) has been well characterized in the literature, with shifts from a Bacteroidetes- and Firmicutes-dominated population toward an increased abundance of Proteobacteria [3]. Although the gut microbiome is known to play a role in the metabolism and production of amino acids, how alcohol-associated gut dysbiosis influences host amino acid homeostasis is less well understood. We aimed to test whether the amino acid metabolite profile of patients with AA-ESLD is distinct from that of patients without disease pathology and whether it correlates with changes in the gut microbiota.
Methods: A secondary data analysis was performed on a larger, single-center, non-randomized prospective pilot study in patients awaiting liver transplantation to characterize metabolomic changes in amino acid homeostasis. Urine samples were collected within 24 hours prior to liver transplant and adjusted for urine osmolality, and untargeted metabolomic analysis by UPLC-MS/MS was performed. Fecal samples collected within 24 hours of liver transplant were sequenced and analyzed using 16S rRNA profiling. Welch's t-tests were used to determine statistically significant differences in metabolite mean scaled intensities between AA-ESLD patients and healthy controls. Spearman's correlations were used to identify associations between amino acid metabolites and gut microbial taxa.
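A hedged sketch of the per-metabolite group comparison described above (Python/SciPy; Welch's test is the unequal-variance, unpaired form, and the file and column names are illustrative):

```python
import pandas as pd
from scipy.stats import ttest_ind, spearmanr

urine = pd.read_csv("urine_metabolites.csv")  # assumed: group + scaled intensities

aa = urine[urine["group"] == "AA-ESLD"]
hc = urine[urine["group"] == "control"]
for metab in ["arginine", "glutamate", "tyrosine", "tryptophan"]:
    t, p = ttest_ind(aa[metab], hc[metab], equal_var=False)  # Welch's t-test
    print(metab, round(p, 4))

# Association between a taxon's abundance and a metabolite across patients
rho, p = spearmanr(urine["gammaproteobacteria"], urine["tryptophan"])
```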
Results: Analysis of the urinary metabolome of AA-ESLD patients (n = 11) and healthy control patients (n = 18) revealed distinct amino acid profiles between groups. Welch's t-tests identified that arginine (p = 0.0016), glutamate (p = 0.0289), tyrosine (p = 0.0003), phenylalanine (p = 0.0002), asparagine (p = 0.0005), tryptophan (p = 0.0001), cystine (p = 0.0017), and taurine (p = 0.0480) were all significantly increased in AA-ESLD patients. Spearman's correlations identified a significant positive correlation between Gammaproteobacteria genera and both phenylalanine (p = 0.0167) and tryptophan (p = 0.0349). These data suggest that the microbiome may contribute to the increased concentrations of these amino acids in the urine. Gammaproteobacteria were also positively correlated with glutamine (p = 0.0151) and histidine (p = 0.0476), while negative correlations were found with glycine (p = 0.0071) and creatinine (p = 0.0341).
Conclusion: Urinary amino acid metabolites differ between AA-ESLD patients and those without liver disease. As patients must abstain from alcohol for ~6 months to be eligible for a liver transplant, these data suggest residual effects of AA-ESLD on amino acid homeostasis. Correlations between the microbiome and amino acid metabolites suggest that the unique microbial shifts associated with ALD may play a role in these observed changes to amino acid metabolism.
Stephanie Merlino Barr, MS, RDN, LD1,2; Rosa Hand, PhD, RDN, LD, FAND2; Marc Collin, MD1,2; Thomas E. Love, PhD1,2; Sharon Groh-Wargo, PhD, RDN1,2
1MetroHealth Medical Center, Cleveland, OH; 2Case Western Reserve University, Cleveland, OH
Financial Support: None Reported.
Background: Diagnostic criteria for neonatal malnutrition were proposed in 2018 by field experts. This tool has not been validated since its publication. The objective of this study was to assess the agreement and reliability of both the overall malnutrition tool and individual indicators to evaluate how consistently the proposed criteria identify malnutrition in preterm infants.
Methods: A single-center, retrospective cohort study was performed at a level III neonatal intensive care unit (NICU). The cohort included all preterm infants born between June 2013 and August 2022 who were admitted to the NICU for at least 3 days and did not die before discharge. Malnutrition diagnoses (none/mild/moderate/severe) were assigned to each patient for each indicator, as defined in Table 1; multiple definitions for individual indicators were used to reflect different potential approaches to assessment (e.g., growth velocity) or differences in patient populations (e.g., protein and energy intake). The kappa (k) value was used to assess the neonatal malnutrition diagnostic tool's overall inter-indicator reliability; this was calculated separately for indicators used to assess malnutrition in the first two weeks of life and after the first two weeks of life. Each indicator's diagnosis was compared individually with all other indicators' diagnoses to assess inter-indicator reliability; the proportion of overall agreement, McNemar's test statistic, and kappa value were calculated. Acceptable agreement was defined as k > 0.8.
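The agreement statistics described above can be sketched as follows (Python; the two diagnosis columns are hypothetical stand-ins for one pair of indicators):

```python
import pandas as pd
from sklearn.metrics import cohen_kappa_score
from statsmodels.stats.contingency_tables import mcnemar

df = pd.read_csv("indicator_diagnoses.csv")  # assumed per-infant diagnoses
order = ["none", "mild", "moderate", "severe"]
a = pd.Categorical(df["weight_velocity_dx"], categories=order, ordered=True).codes
b = pd.Categorical(df["energy_intake_dx"], categories=order, ordered=True).codes

kappa = cohen_kappa_score(a, b, weights="linear")  # weighted kappa on ordinal codes
agreement = (a == b).mean()                        # proportion of overall agreement

# McNemar's test on dichotomized diagnoses (any malnutrition vs none)
tab = pd.crosstab(a > 0, b > 0)
print(kappa, agreement, mcnemar(tab.to_numpy()).pvalue)
```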
Results: A total of 2946 infants were included in this study. The k values for the malnutrition tool overall indicated poor inter-indicator reliability; for malnutrition diagnoses in the first two weeks of life k = 0.054; for diagnoses after the first two weeks of life k = 0.048. Figure 1 depicts the weighted k values for all comparisons of individual indices. Figure 2 depicts the proportions of overall agreement. For example, the weight gain velocity (approach 1) compared to the energy intake malnutrition diagnosis criteria had n = 954 subjects, k = 0.09, and a proportion of overall agreement of 0.28, indicating that both inter-indicator reliability and accuracy were poor. Commonly cited generalized weight gain velocity goals (approaches 2 & 3) had good accuracy and inter-indicator reliability with the recommended method (approach 1) of determining goal weight gain velocity by maintaining weight-for-age z-score (1 vs. 2 k = 0.92, 1 vs. 3 k = 0.88). The generalized linear growth goal (approach 2) had poor accuracy and inter-indicator reliability with the recommended method (approach 1) (k = 0.12). All comparisons of unique indices for malnutrition diagnosis had detectable disagreement in diagnosis patterns as assessed by McNemar's test statistic.
Amber Hager, BSc, RD; Yiqi Wang, BSc; Sandy Hodgetts, PhD, OT; Lesley Pritchard, PhD, PT; Vera Mazurak, PhD; Susan Gilmour, MD, MSc, FRCPC; Diana R. Mager, MSc, PhD, RD
University of Alberta, Edmonton, AB, Canada
Financial Support: 2022 ASPEN Rhoads Research Foundation Grant.
Background: Measurement of body composition in young infants and children with chronic liver disease (CLD) can be challenging because of fluid overload, a lack of healthy reference data, and a shortage of non-invasive, validated bedside methods. The use of ultrasonography to serially measure changes in muscle thickness overcomes many of these limitations, but few comparable data are available in young infants and children (<5 y). The study purpose was to serially measure changes in total bicep, calf, and thigh muscle layer thickness (MLT), subcutaneous adipose tissue thickness (SAT-T), and motor (gross/fine) development in infants and children (<5 y) with CLD. We hypothesized that the trajectories of MLT (thigh, bicep, calf) and SAT-T would be significantly impacted by CLD and informative of gross motor development in infants and children (<5 y).
Methods: Infants and children (4 mo-5 y) with CLD (n = 11) and age-matched controls (CON; n = 16) were recruited from the Pediatric Liver/Liver Transplant Clinics at the Stollery Children's Hospital and from the community. Participants underwent 2 serial measurements, at baseline and after 6 months, of (1) MLT, echo intensity, and SAT-T of the biceps brachii (BB), rectus femoris (RF), rectus intermedius (RI), soleus, and gastrocnemius (GN) using ultrasound (U/S) and (2) gross motor development (Peabody Developmental Motor Scales, 2nd Edition [PDMS-2]), in CLD only. Additional variables collected included demographics (age, sex, CLD diagnosis, PELD), SGNA scores, anthropometrics (wt-z, ht-z, head circumference [hc-z]), body composition (fat-free mass [FFM]/fat mass [FM] using BIA), multiple skinfold thicknesses (SFT; triceps [TSF], biceps, suprailiac, subscapular), and mid-arm circumference (MAC-z).
Results: CLD etiology included 73% biliary atresia (n = 8) and 27% other (n = 1 acute liver failure; n = 2 TPN-related cholestasis). No significant differences in age (years), sex, wt-z, ht-z, hc-z, MAC-z, TSF-z, or subscapular-z were noted between groups at baseline (p > 0.05). Thirty percent of CLD children had SGNA scores indicative of mild-moderate malnutrition (SGNA ≥ 2). Total thigh, RI, and soleus MLT were significantly lower in CLD vs CON, and thigh SAT-T was higher in CLD after 6 months (p < 0.05). This was particularly evident in CLD children ≤ 2 years, who had significantly lower total thigh, RI, RF, and soleus MLT than CON at baseline and after six months (p < 0.05). Total thigh, RI, and RF MLT (absolute and % change over 6 months) were positively related to BIA-FFM measures (r2 = 0.46-0.47; p < 0.001) and to total motor quotient and gross motor quotient scores (absolute and percentile; r2 = 0.47, p < 0.001), but not to fine motor quotients (absolute, percentile) of the PDMS-2, particularly in CLD children (<2 y). Bicep and calf measures (MLT, SAT-T) were not associated with total motor, gross motor, or fine motor quotients (absolute, percentile) in CLD children.
Conclusion: Children with CLD had significantly lower measures of muscle thickness and higher measures of SAT than CON. Serial measurement of thigh MLT may be informative of the trajectory of fat-free mass and gross motor skill development in young children with CLD.
1Georgia State University, Atlanta, GA; 2Children's Healthcare of Atlanta, Atlanta, GA; 3UPMC Children's Hospital of Pittsburgh, Pittsburgh, PA
Financial Support: Takeda Pharmaceuticals.
Background: Although survival for children with intestinal failure (IF) has improved with parenteral nutrition (PN), many still fail to maintain adequate somatic growth after achieving enteral autonomy. Few studies have examined growth after weaning from PN and outcomes have been inconsistent. A glucagon-like peptide-2 (GLP-2) analog has been shown to reduce the volume of and time on PN in some children with short bowel syndrome with 6 months of use. The effect of this analog on growth is unknown. We aim to describe growth patterns in children with IF after PN weaning and during treatment with a GLP-2 analog.
Methods: This retrospective observational study was conducted at two US pediatric intestinal rehabilitation (IR) centers. Eligibility criteria included a diagnosis of IF (PN use ≥60 days within a 74-consecutive-day interval) at <12 months of age. Patients were referred for IR between September 1989 and January 2023. Z-score values for weight and length/height (adjusted for gestational age up to 2 years of age) are described for those who weaned from PN and those who received a GLP-2 analog (Gattex®) for ≥6 months (2017-2023).
Results: There were 362 children (57% male, 72% white) with a median age at diagnosis of 6 days (interquartile range [IQR] 1, 22) eligible for the study. Common diagnoses included necrotizing enterocolitis (28%), gastroschisis (23%), and small bowel atresia (16%). The median gestational age was 34 weeks (IQR 31, 37), the percent small bowel remaining at diagnosis was 23% (IQR 10, 50), and 36% had a functional ileocecal valve. One hundred forty-five children (40%) were successfully weaned from PN (median time to wean = 1.5 y [IQR 1, 2.9]), and 123/145 (85%) achieved enteral autonomy (maintenance of normal growth for >3 consecutive months). Median weight and length/height z-scores at the time of PN weaning were −1.04 (IQR −2.09, −0.12) and −1.86 (IQR −3.01, −0.69), respectively. After weaning from PN, weight and linear growth velocity were maintained in 44% and 39% of children, respectively, in year 1, and in 59% and 55% in year 2. Acceleration in weight and linear growth velocity was observed in 28% and 34%, respectively, in year 1, and in 22% and 31% in year 2. Fourteen children received a GLP-2 analog for a median of 912 days (IQR 365, 1304); of these, 3 were weaned from parenteral support within 9 months. Changes in weight and linear growth velocity z-scores between GLP-2 start and 2 years post-initiation are shown in Table 1.
Annemarie Rompca, MD1; Morgan McLuckey, MD2; Anthony J. Perkins3; Xiaoyi Zhang, MD, PhD1; Charles Vanderpool, MD1
1Riley Hospital for Children, Indianapolis, IN; 2Department of Radiology, Indianapolis, IN; 3Indiana University School of Medicine, Indianapolis, IN
Financial Support: None Reported.
Background: Inflammatory bowel disease (IBD) can affect patients' nutritional status: poor oral intake, poor absorption of nutrients, protein loss in stool, and increased energy requirements can all contribute to poor nutrition in this population. Poor nutritional status can manifest as poor growth, poor weight gain, and sarcopenia, defined as decreased muscle mass and strength. Studies have demonstrated that decreased muscle mass in pediatric IBD patients is associated with a need for escalated therapy, an increased need for surgery, and an increased risk of post-operative complications. We sought to measure muscle mass at IBD diagnosis in our cohort on cross-sectional imaging, compare it with published age- and sex-specific pediatric psoas muscle reference values, and compare muscle mass between IBD subtypes and examine its correlations with anthropometrics at diagnosis.
Methods: This was a single-center retrospective study at a tertiary care facility. Patients ages 6 to 16 years with a new diagnosis of IBD [Crohn's disease (CD), ulcerative colitis (UC), or indeterminate colitis (IC)] from May 15, 2018, through December 31, 2019, were included. Patients with chronic medical conditions or without accessible cross-sectional imaging within 3 months of diagnosis were excluded. Demographic and anthropometric data at IBD diagnosis were obtained. The psoas muscle area (mm2) was measured bilaterally on cross-sectional imaging at lumbar levels 3-4 (L3-4) and 4-5 (L4-5); right and left measurements were summed to obtain the total psoas muscle area (TPMA) at each level. These measurements were compared with pediatric psoas muscle area reference values. Analysis of variance was used to determine whether outcomes differed by IBD type, and Spearman correlations were used to assess the relationship between anthropometric measures and outcomes of interest. All analyses were performed using SAS v9.4.
Results: Cross-sectional imaging from 70 patients with newly diagnosed IBD was reviewed. The average age was 11.9 years, with a male predominance (42 patients, 60%). Most patients were diagnosed with CD (n = 50, 71.4%), followed by UC (n = 17, 24.3%) and IC (n = 3, 4.3%). The mean TPMA z-score across all patients was −1.7 at L3-4 and −1.4 at L4-5 (Table 1). Measures of sarcopenia at both lumbar levels for TPMA, and the z-score at L3-4, differed significantly across IBD types (CD vs UC vs IC) (Table 2).
Best of ASPEN - Pediatric, Neonatal, Pregnancy, and Lactation
Abstract of Distinction
Adam Russman, MD1; Anne McCallister, CPNP2; Anthony J. Perkins3; Charles Vanderpool, MD4
1Children's Medical Center of Dallas, Dallas, TX; 2Riley Hospital for Children at Indiana University Health, Indianapolis, IN; 3Indiana University School of Medicine, Indianapolis, IN; 4Riley Hospital for Children, Indianapolis, IN
Financial Support: None Reported.
Background: The Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition (AND/ASPEN) published malnutrition guidelines in 2014. Literature describing clinical outcomes in hospitalized children with a malnutrition diagnosis is limited and few studies focus on the impact of malnutrition severity subtype on clinical outcomes.
Methods: We analyzed patients admitted to our pediatric hospital from 2019 to 2022, excluding maternal/obstetric admissions. Patients were diagnosed with malnutrition and assigned a severity subtype by a registered dietitian according to AND/ASPEN guidelines; unspecified malnutrition was assigned when there was insufficient physician documentation to determine the severity subtype. Data on readmission rate, mortality, length of stay (LOS), LOS index, hospital cost, operative procedure (OR, any procedure), and pediatric intensive care unit (ICU) admission were collected. Clinical outcomes were analyzed by malnutrition severity subtype and compared with patients who were not diagnosed with malnutrition. We used natural log(LOS + 1) and natural log(cost + 1) transformations for the LOS and cost analyses, since both variables were highly skewed. Mixed effects regression analysis was completed to account for the clustering of repeated admissions. All analyses were performed using SAS v9.4.
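The skew-handling transformation and clustered regression described above (performed in SAS by the authors) can be sketched in Python as follows, with assumed file and column names:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

adm = pd.read_csv("admissions.csv")  # assumed: patient_id, los_days, cost, malnutrition_dx

# Log transforms for highly skewed LOS and cost
adm["log_los"] = np.log(adm["los_days"] + 1)
adm["log_cost"] = np.log(adm["cost"] + 1)

# Random intercept per patient accounts for clustering of repeated admissions
m = smf.mixedlm("log_los ~ C(malnutrition_dx)", data=adm,
                groups=adm["patient_id"]).fit()
print(m.summary())
```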
Results: Any malnutrition diagnosis was associated with higher 7-, 14-, and 30-day readmission rates compared with patients without a malnutrition diagnosis. Malnourished patients had a higher mortality rate, median LOS, LOS index, cost, ICU admission rate, and operative procedure rate than patients without a malnutrition diagnosis (Table 1). Table 2 presents the analysis by malnutrition severity subtype. Patients with mild, moderate, and severe malnutrition all had significantly higher readmission rates at the 7-, 14-, and 30-day time points compared with patients without malnutrition; patients with unspecified malnutrition had a higher readmission rate only at 30 days. At all three readmission time points, there were no significant differences in readmission rates between malnutrition severity categories. The only malnutrition subtype with a significantly increased mortality rate compared with no malnutrition was severe malnutrition (p = 0.005). Admissions with mild, moderate, unspecified, and severe malnutrition had significantly higher LOS index, LOS, and total costs than admissions without a malnutrition diagnosis. Mild malnutrition admissions had a significantly higher LOS index than moderate (p = 0.050) and severe (p = 0.014) malnutrition admissions, while unspecified-severity admissions had a significantly higher LOS index than severe admissions (p = 0.026). Mild (p = 0.032), moderate (p = 0.015), and severe (p = 0.001) malnutrition admissions had significantly higher LOS than unspecified-severity admissions. Mild malnutrition admissions (p = 0.011) had significantly higher costs than admissions with unspecified malnutrition.