Nutrition and Metabolism Research Oral Paper Session Abstracts

Journal of Parenteral and Enteral Nutrition | Pub date: 2024-02-22 | DOI: 10.1002/jpen.2601 | IF 4.1 | Q2 (Nutrition & Dietetics) | CAS Zone 3 (Medicine)
<p>Sunday, March 3, 2024</p><p>SU30 Parenteral Nutrition Therapy</p><p>SU31 Enteral Nutrition Therapy</p><p>SU32 Malnutrition, Obesity, Nutrition Practice Concepts, and Issues</p><p>SU33 Critical Care and Critical Health Issues</p><p>SU34 GI and Other Nutrition and Metabolic-Related Topics</p><p>SU35 Pediatric, Neonatal, Pregnancy, and Lactation</p><p>Ji Seok Park, MD, MPH; Mohamed Tausif Siddiqui, MD; Kristin Izzo, RD; Sara Yacyshyn, MD; Allison Doriot, RD; Aje Kent, MD; Elizabeth Gallant, RD; Miguel Salazar, MD; Eileen Hendrickson, PharmD; Adriana Panciu, PharmD; Basma Rizk, PharmD; Ann Dugan, RN; James Bena, MS; Shannon Morrison, MS; Ruishen Lyu, MS; Anil Vaidya, MD; Gail Cresci, PhD, RD, LD, FASPEN; Donald F. Kirby, MD, FACP, FACN, FACG, AGAF, FASPEN, CNSC, CPNS</p><p>Cleveland Clinic, Cleveland, OH</p><p><b>Financial Support</b>: Cleveland Clinic Center for Human Nutrition Morrison Research and Development Funding.</p><p><b>Background</b>: Preventing catheter-related bloodstream infection (CRBSI) is an essential component of managing patients with chronic intestinal failure dependent on home parenteral nutrition (HPN). Ethanol lock therapy is an effective, evidence-based strategy for decreasing the risk of CRBSI; however, it has become less available due to supply chain issues, so other strategies are needed. The SQ53 wipe is a novel antimicrobial wipe based on a proprietary compound with residual efficacy beyond 24 hours. It is registered under the European Union Biocidal Products Regulation but not with the U.S. Food and Drug Administration. This study aimed to evaluate the effectiveness of the SQ53 wipe in preventing CRBSI in patients receiving HPN. The study was registered on ClinicalTrials.gov (NCT04822467).</p><p><b>Methods</b>: A single-blinded, randomized, placebo-controlled trial was designed. 
About 200 patients meeting pre-defined criteria were contacted. A total of 60 patients were recruited between December 10, 2021, and June 3, 2022, per the sample size calculation. Patients were randomized into a treatment group (SQ53 wipe) and a control group (alcohol wipe). Randomization was stratified by CRBSI risk category (low, high, new) and type of central venous catheter (CVC; tunneled, non-tunneled). Patients were instructed to use the assigned wipe to clean their CVCs before and after HPN infusion per specific instructions. An interim analysis for both efficacy and futility was planned for when the last patient reached 6 months post-randomization. Analyses used Poisson regression to compare all CRBSI (confirmed and suspected), confirmed CRBSI, and CVC exchanges between the two groups. Additional analyses compared outcomes between the 6 months prior to the study and the time in the study, using each patient as their own historical control. Both intention-to-treat (ITT) and per-protocol (PP; &gt;90% adherence) analyses were used.</p><p><b>Results</b>: Fifty-nine patients were randomized. When the two groups were compared in parallel, neither the ITT nor the PP analysis showed statistically significant superiority of the SQ53 wipe over the alcohol wipe in decreasing all CRBSI, confirmed CRBSI, or CVC exchanges. However, the PP analysis suggested that event rates may be lower in the SQ53 group, which had a 34% lower risk of all CRBSI (<i>P</i> = 0.43), a 53% lower risk of confirmed CRBSI (<i>P</i> = 0.52), and a 30% lower risk of CVC exchanges (<i>P</i> = 0.58). Interestingly, when each patient's CRBSI rate during the trial was compared with their previous CRBSI rate, the SQ53 wipe group showed a 74% lower risk of all CRBSI (<i>P</i> = 0.005) in the PP analysis. 
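The Poisson comparison described in the methods boils down to comparing two event rates over person-time. As an illustrative sketch only (the counts below are hypothetical, not the trial's data), a simple Wald test on the log rate ratio captures the idea:

```python
import math

def rate_ratio_test(events_a, days_a, events_b, days_b):
    """Compare two event rates (e.g., CRBSI per catheter day) via a
    Wald test on the log rate ratio, as in a simple Poisson model."""
    rr = (events_a / days_a) / (events_b / days_b)
    se = math.sqrt(1 / events_a + 1 / events_b)  # SE of log(RR)
    z = math.log(rr) / se
    # two-sided p-value from the normal approximation
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return rr, p

# hypothetical counts: 4 infections over 3000 catheter days
# versus 8 infections over 3100 catheter days
rr, p = rate_ratio_test(4, 3000, 8, 3100)
print(f"rate ratio {rr:.2f}, p = {p:.2f}")  # → rate ratio 0.52, p = 0.28
```

A rate ratio of 0.52 corresponds to a roughly 48% lower risk, illustrating how a sizable observed reduction can still fail to reach significance when event counts are small, as in this trial.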
In the high-risk category, every randomized patient had a decreased CRBSI rate compared with their previous experience. All patients tolerated SQ53 well, with no predefined adverse events.</p><p><b>Conclusion</b>: Patients who used the SQ53 wipe more than 90% of the time per specific instructions had a 74% lower CRBSI rate compared with their previous experience. The SQ53 wipe did not show a statistically significant benefit over the alcohol wipe in this study, likely because of augmented catheter hygiene care in the control group and an insufficient sample size.</p><p><b>Abstract of Distinction</b></p><p>Theresa A. Fessler, MS, RDN, CNSC<sup>1</sup>; Mary B. Crandall, PhD, RN<sup>2</sup>; David N. Martin, PhD<sup>2</sup></p><p><sup>1</sup>Morrison Healthcare, University of Virginia Health System, Charlottesville, VA; <sup>2</sup>University of Virginia Health System, Charlottesville, VA</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Catheter-related bloodstream infection (CRBSI) is a serious complication for patients receiving home parenteral nutrition (HPN). The literature is not consistent as to whether there are significant differences in infection risk between central venous catheter (CVC) types, and assessment is complicated by potential alternate infection sources and differing evaluation methods: CRBSI and central line-associated bloodstream infection (CLABSI). 
The goals of this project were to determine whether significant differences in infection rates exist between peripherally inserted central venous catheters (PICC), tunneled central venous catheters (TCVC), and implanted ports, or between single-lumen (SL) and multi-lumen (ML) catheters used for HPN, and to identify rates of CVC removal for other complications.</p><p><b>Methods</b>: A prospective, observational quality improvement project was conducted for adults who received HPN provided by the University of Virginia Continuum Home Infusion Pharmacy from February 2019 through December 2022, with follow-up ending July 31, 2023. Data were collected for 141 CVCs used for 89 patients and included the number of HPN days, indications for HPN (Figure 1), reasons for CVC removal, blood draws, and microbiologic results. CRBSI and CLABSI were determined by the criteria described in Table 1. Figure 2 shows the number of peripheral and CVC blood and catheter tip tests done for the CVCs with suspected infection.</p><p><b>Results</b>: Of the CVCs used for HPN, 63% were PICC, 27% TCVC, and 10% ports, with a total of 15,474 HPN catheter days. The CVCs were 42% SL, 55% double-lumen, and 2% triple-lumen. CRBSI rates were 0.97 episodes per 1000 HPN catheter days overall, with 1.54 for PICC, 0.64 for TCVC, and 0.0 for ports. CLABSI rates were 1.74 episodes per 1000 HPN catheter days overall, with 3.07 for PICC, 0.89 for TCVC, and 0.0 for ports. No significant differences were found between PICC and TCVC in CRBSI; however, PICCs had a significantly higher CLABSI rate per 1000 HPN catheter days than TCVCs (<i>p</i> = 0.005). After a second analysis in which 9 cases of catheter infection were not counted because of undetermined alternate infection sources, overall CRBSI and CLABSI rates were reduced to 0.78 and 1.16 per 1000 HPN catheter days, respectively. 
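The per-1000-catheter-day rates used throughout these results are a simple normalization of episode counts by exposure time. For example, the overall CRBSI rate of 0.97 over 15,474 HPN catheter days corresponds to roughly 15 episodes — a back-calculation on our part, not a count reported in the abstract:

```python
def rate_per_1000(episodes, catheter_days):
    """Infection rate expressed per 1000 catheter days."""
    return 1000 * episodes / catheter_days

# 15 episodes over 15,474 catheter days reproduces the reported 0.97
print(round(rate_per_1000(15, 15474), 2))  # → 0.97
```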
The second analysis showed CRBSI rates of 1.23 for PICC and 0.51 for TCVC, and CLABSI rates of 2.0 for PICC and 0.64 for TCVC, with no significant differences in CRBSI and a significantly higher CLABSI rate per 1000 HPN catheter days for PICC lines (<i>p</i> = 0.04). Table 2 shows a statistical analysis of CRBSI and CLABSI rates. In the initial analysis, CRBSI was 1.24 for ML and 0.68 for SL CVCs, and CLABSI was 2.1 for ML and 1.36 for SL CVCs, per 1000 HPN days; however, the differences were not statistically significant. Other problems that necessitated CVC removal were occlusion, malposition, accidental dislodgement, leak, and thrombosis. The removal rate for other complications was 2.0 per 1000 HPN catheter days overall, with 1.78 for TCVC and 2.61 for PICCs; the differences were not statistically significant.</p><p><b>Conclusion</b>: We found no significant differences in CRBSI between PICC and TCVC, significantly more CLABSIs for PICC than for TCVC, and no infections with ports. Although rates of other catheter problems were higher for PICCs, and infection rates were higher for ML than for SL catheters, neither difference reached statistical significance. Our results illustrate the variation between CRBSI and CLABSI results and show that undetermined alternate infection sources complicate reporting. They also point to the need for more study, greater openness to the use of ports, and choosing SL TCVCs when feasible for long-term HPN.</p><p>Haruka Takayama, RD, PhD<sup>1,2</sup>; Kazuhiko Fukatsu, MD, PhD<sup>1,3</sup>; Midori Noguchi, BA<sup>1</sup>; Kazuya Takahashi, MD, PhD<sup>4</sup>; Nana Matsumoto, RD, MS<sup>3</sup>; Tomonori Narita, MD<sup>4</sup>; Satoshi Murakoshi, MD, PhD<sup>1,5</sup></p><p><sup>1</sup>Surgical Center, The University of Tokyo Hospital, Bunkyo-ku, Tokyo, Japan; <sup>2</sup>Department of Nutrition, St. 
Luke's International Hospital, Chuo-ku, Tokyo, Japan; <sup>3</sup>Operating Room Management and Surgical Metabolism, Graduate School of Medicine, The University of Tokyo, Bunkyo-ku, Tokyo, Japan; <sup>4</sup>Gastrointestinal Surgery, The University of Tokyo Hospital, Bunkyo-ku, Tokyo, Japan; <sup>5</sup>Nutrition and Dietetics, Kanagawa University of Human Services, Yokosuka City, Kanagawa, Japan</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Our previous study clarified that addition of beta-hydroxy-beta-methylbutyrate (HMB) to TPN partially restores gut-associated lymphoid tissue (GALT) atrophy caused by lack of enteral nutrition. Because HMB is a metabolite of the amino acid leucine, the recovery effects might derive from the increased amount of amino acids in the TPN solution. Alternatively, an increased amino acid content might not restore GALT atrophy by itself, but the increase together with HMB addition might further prevent the atrophy. Herein, we performed two studies to answer these questions using a murine TPN feeding model.</p><p><b>Methods</b>: Experiment 1: Six-week-old male Institute of Cancer Research (ICR) mice were divided into A+ (n = 10) and A++ (n = 10) groups. A catheter was inserted into the right jugular vein of each mouse, and 0.2 mL/h of normal saline was administered continuously for 2 days while the mice were allowed chow and water <i>ad libitum</i>. Then, mice received isocaloric PN solution with an NPC/N ratio of 284 (A+) or 135 (A++) without oral food intake for 5 days. After the dietary manipulation, all mice were killed by cardiac puncture under general anesthesia, and the whole small intestine was harvested for GALT cell isolation. GALT cell numbers and phenotypes (B cells, CD4+, CD8+, αβTCR+, and γδTCR+) were evaluated in each tissue (Peyer's patches, PPs; intraepithelial spaces, IE; and lamina propria, LP). 
The nasal washings, bronchoalveolar lavage fluid (BALF), and intestinal washings were collected for IgA level measurement by ELISA. Experiment 2: Mice were randomized to A+H+ (n = 10) and A++H+ (n = 9) groups. The A+H+ mice received PN solution with NPC/N 284 and 2,000 mg/kg BW of Ca-HMB, while the A++H+ animals were given PN solution with NPC/N 135 and 2,000 mg/kg BW of Ca-HMB. After 5 days of PN feeding, the same parameters as in Experiment 1 were evaluated. The Wilcoxon test was used for all parameter analyses, with the significance level set at 5%.</p><p><b>Results</b>: There were no significant differences between the A+ and A++ groups in GALT cell numbers (Table 1), phenotypes (Table 2), or mucosal IgA levels. However, the A++H+ group showed higher LP cell numbers (Table 1) and a higher CD4+ cell percentage in the IE space (Table 2) than the A+H+ group, without significant differences in IgA levels at any mucosal site.</p><p>Anam Bashir, MBBS; Lauren L. Karel, BCPS; Margaret Begany, RD, CSPCC, LDN, CNSC; Jennifer Panganiban, MD</p><p>Children's Hospital of Philadelphia, Philadelphia, PA</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Fish oil-based lipid emulsion (FOLE) is FDA-approved at 1 g/kg/day for the treatment of parenteral nutrition-associated cholestasis (PNAC). Because fat provision is limited at 1 g/kg/day of FOLE, caloric provision, especially in the neonatal population, is skewed toward dextrose, with higher-than-desired glucose infusion rates (GIR) used to support weight gain and growth. There is limited published information on the use of FOLE at doses higher than 1 g/kg/day. Concerns have been raised about possible essential fatty acid deficiency on 1 g/kg/day. 
Thus, we aim to describe patients who received 1.5 g/kg/day of FOLE at our institution.</p><p><b>Methods</b>: A retrospective, IRB-approved chart review was conducted of patients who received parenteral nutrition (PN) at Children's Hospital of Philadelphia between January 2020 and August 2023. Inclusion criteria were children on PN, ages 0 to 18 years, receiving FOLE at a dose of more than 1 g/kg/day for at least 14 days. Cholestasis progression, essential fatty acid deficiency (EFAD), clinically severe post-procedure hemorrhage, and hypertriglyceridemia were the clinical outcomes of interest (Table 1). The progression of cholestatic disease was monitored by conjugated bilirubin levels. A triene-to-tetraene (T:T) ratio greater than 0.046 was used to define EFAD, based on Associated Regional and University Pathologists, Inc. (ARUP) normative laboratory values. Mead acid, linoleic acid, and alpha-linolenic acid levels were also collected to reflect essential fatty acid stores (normative values in Table 2). Invasive procedures were defined as those requiring entry into the body through an incision, tunneling, or a cutting technique for vascular procedures. For children younger than 1 year, hypertriglyceridemia was defined as greater than 200 mg/dl, and for older children, greater than 400 mg/dl.</p><p><b>Results</b>: Nine patients [5 males; mean age 2.6 y (range 2 mo–12.9 y)] with PNALD (defined by serum conjugated bilirubin &gt;= 2 mg/dl and exclusion of other causes of liver disease) were started on FOLE 1.5 g/kg/day. The purpose of initiating the higher FOLE dose was to decrease GIR provision and/or give additional calories due to suboptimal weight gain on 1 g/kg/day of FOLE. None of the patients developed hypertriglyceridemia. Four patients had improvement of cholestasis, with levels decreasing by more than 2 mg/dl, and four patients continued to have no evidence of cholestasis after prior normalization while on 1 g/kg dosing. 
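The outcome definitions in the methods above reduce to simple threshold checks. A minimal sketch (function names are ours, not the study's):

```python
def is_efad(triene_tetraene_ratio):
    """EFAD screen per the ARUP cutoff used in the study (T:T > 0.046)."""
    return triene_tetraene_ratio > 0.046

def is_hypertriglyceridemic(triglycerides_mg_dl, age_years):
    """Age-dependent hypertriglyceridemia cutoffs from the methods:
    > 200 mg/dl under age 1, > 400 mg/dl otherwise."""
    limit = 200 if age_years < 1 else 400
    return triglycerides_mg_dl > limit

print(is_efad(0.03))                      # normal T:T ratio → False
print(is_hypertriglyceridemic(250, 0.5))  # infant above 200 mg/dl → True
```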
One patient experienced an increase in conjugated bilirubin of more than 2 mg/dl, after which the FOLE was decreased to 1 g/kg/day with resolution of cholestasis over three months. Seven patients had an essential fatty acid panel collected, and T:T was within normal limits, although five patients had less-than-optimal levels of linoleic acid. Seven patients had an invasive procedure performed, and only one patient had more bleeding than expected after circumcision. This patient had a low fibrinogen level (70 mg/dL) and required fresh frozen plasma and packed red blood cell transfusions, with no significant bleeding event thereafter (Table 1).</p><p>Diana Mulherin, PharmD, BCNSP, BCCCP, FCCM; Sarah Cogle, PharmD, BCNSP, BCCCP; Vanessa Kumpf, PharmD, BCNSP, FASPEN; Edward Woo, PharmD; David Mulherin, PharmD, BCPS; Madeleine Hallum, MSHS, RDN, CSG, LDN; Ankita Sisselman, MD; Dawn Adams, MD, MS, CNSC</p><p>Vanderbilt University Medical Center, Nashville, TN</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Copper (Cu) deficiency can lead to poor wound healing, myeloneuropathy, anemia, and cardiac arrhythmias. Deficiency results from poor intake or high losses and may be seen in adult patients requiring parenteral nutrition (PN), including those with severe malnutrition, large burns, a need for continuous renal replacement therapy (CRRT), or a history of bariatric surgery/malabsorption. A previous formulation of multi-trace elements (MTE) contained Cu 1 mg per dose, and, in combination with Cu contamination from other PN ingredients, an increased incidence of hypercupremia was observed in patients requiring long-term PN. As of 2020, the only MTE product for use in adults in the U.S. contains 0.3 mg of Cu. For patients with significant cholestasis or hepatic dysfunction, ASPEN recommends withholding or decreasing Cu doses in PN. 
Due to a lack of standardized practice, a quality improvement project was initiated to describe practices for ordering Cu in PN and Cu status in acutely ill, hospitalized patients with severe hyperbilirubinemia.</p><p><b>Methods</b>: This was a retrospective evaluation of the PN ordering practices of a multidisciplinary nutrition support team (NST) at a large academic medical center between July 1, 2021, and August 31, 2023. PN encounters (a course of PN treatment during a single inpatient admission) in patients ≥ 18 years of age with severe hyperbilirubinemia (total bilirubin ≥ 10 mg/dL or direct bilirubin ≥ 2 mg/dL) within 5 days before or any time during the PN encounter were included. Patient demographics, frequency of Cu provision in PN, Cu and C-reactive protein (CRP) levels, and CRRT status were assessed using descriptive statistics.</p><p><b>Results</b>: A total of 15,739 PN orders were entered for 1068 patients during the study period. Of those, 155 PN encounters occurred in 144 individual patients with severe hyperbilirubinemia. Baseline demographics are provided in Table 1. A summary of Cu sources (either from the MTE product or as a cupric chloride additive) for each PN encounter is provided in Figure 1. Cu status was assessed in 53 (34%) PN encounters, with a mean concentration of 76.9 (±34.3) mcg/dL. CRP was obtained concurrently with only 58% (n = 31) of Cu levels, with a mean concentration of 125.7 (±95.4) mg/L. CRRT was provided in 44 (28.4%) encounters (Table 2).</p><p><b>Figure 1</b>. 
Copper sources in PN orders.</p><p>Brittney Patterson, MS, RD-AP, CNSC<sup>1</sup>; Ranna Modir, MS, RD, CNSC, CDE, CCTD<sup>1</sup>; Jack McKeown<sup>1</sup>; Rachel Aubyrn<sup>1</sup>; Javier Lorenzo, MD, FCCM<sup>2</sup></p><p><sup>1</sup>Stanford Health Care, Stanford, CA; <sup>2</sup>Stanford University School of Medicine, Stanford, CA</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: The use of safety alerts in electronic medical records (EMR) aims to improve patient safety, with most alerts directed at medication and nursing workflows. Stanford Health Care (SHC) has added tube feeding regimens (TFR) to the medication administration record (MAR) to further improve patient safety. For critically ill intensive care unit (ICU) patients at high risk for gastrointestinal (GI) complications, the ASPEN/SCCM 2016 guidelines recommend using near-isotonic, fiber-free TFR. A retrospective analysis at SHC between 2014 and 2016 found an association between severe GI complications and high-risk tube feeding regimens (HRTFR) of hyperosmolar, high-fiber tube feeding formulas and/or fiber supplements in ICU patients. To ensure the ASPEN/SCCM guidelines were implemented at SHC, many interventions were put in place, including designing order sets with HRTFR listed toward the bottom; creating specific TFR order sets without HRTFR; providing education during new resident orientation, team rounds, and monthly in-services; and granting Registered Dietitians (RDs) tube feeding order-writing privileges. Despite these interventions, HRTFR were still being ordered, mostly outside normal RD working hours (8 am to 4 pm). To educate and guide providers to select safe TFR for ICU patients, we aimed to create a novel nutrition support-specific order validation pop-up in the EMR.</p><p><b>Methods</b>: A team of RDs, critical care attendings, and Epic analysts collaborated to create a nutrition support-specific order validation pop-up. 
ICU patients were defined as those requiring vasopressor support with norepinephrine, epinephrine, vasopressin, and/or phenylephrine. HRTFR were defined as hyperosmolar, high-fiber tube feeding formulas and/or fiber supplements. The order validation pop-up was built to trigger under three scenarios: (1) a vasopressor was already active and an HRTFR was ordered; (2) an HRTFR was already active and a vasopressor was ordered; or (3) both orders were placed simultaneously. The pop-up displayed the reason for the alert and the importance of avoiding an HRTFR, provided safer TFR options, and recommended contacting the RD for guidance. To preserve individualization of patient care, the order validation was overridable, as an HRTFR may be appropriate for patients on lower vasopressor doses. After the order validation pop-up was implemented, a chart review was completed between March 2023 and May 2023 to assess the incidence of, and actions following, the triggered order validation pop-up.</p><p><b>Results</b>: Between March 2023 and May 2023, the order validation pop-up triggered 220 times in a total of 59 patients. Of the 220 triggers, 42 (19%) resulted, per the instructions in the pop-up, in a changed or discontinued order or in the HRTFR not being ordered. Of those 42 triggers that resulted in a properly adjusted HRTFR, 26 (61%) occurred outside normal RD hours. For the remaining triggers, where no changes were made, patients were found to be on low-dose vasopressors, to have vasopressors listed on the MAR but not actively in use, or to have an HRTFR ordered on the MAR but held per nursing communication orders.</p><p><b>Conclusion</b>: The creation of a novel nutrition support-specific order validation pop-up provided education and guidance to ordering providers. 
With this additional layer of safety, 42 ICU patients between March 2023 and May 2023 were placed on safer TFR, with most of the impact occurring outside RD working hours.</p><p><b>Best of ASPEN - Enteral Nutrition Therapy</b></p><p><b>1627 - Victory for Volume-Based Enteral Nutrition</b></p><p>Julie M. Geyer, RD-AP, CNSC</p><p>University of Colorado Hospital, Aurora, CO</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Enteral nutrition (EN) in the hospital setting is traditionally administered by a fixed, rate-based feeding method (RBEN). Studies of RBEN found that, due to interruptions or withholding, actual formula delivery averages 60% to 70% of the prescribed volume. Nutrition provision below energy needs contributes to malnutrition and negative consequences, including increased health care costs, morbidity, and mortality. The American Society for Parenteral and Enteral Nutrition (ASPEN) and the Society of Critical Care Medicine (SCCM) recommend use of a volume-based enteral nutrition feeding method (VBEN) to improve nutrient delivery, decrease energy deficits, and prevent overfeeding.</p><p><b>Methods</b>: This quality improvement study took place at a Level I trauma, academic hospital from June 2022 to September 2023. In September 2022, a hospital-wide process improvement committee was assembled for multi-phase implementation of VBEN. Prior to September 2022, unit-based dietitians conducted quality improvement to address common causes of feeding interruptions. VBEN inclusion criteria required demonstrated tolerance of goal-rate RBEN. The maximum hourly rate was set at 150 mL/hr. The ‘goal’ provision was defined as 90% to 110% of the prescribed formula volume. Patients included in the data collection were tolerating EN at the RBEN goal, and formula intake volumes were taken directly from the feeding pump history. 
Changes to the electronic medical record (EMR) included creation of a VBEN calculator with row instructions built into the tube feeding flowsheet and creation of a nurse reminder task every 4 hours to recalculate formula intake and adjust the rate as needed. Changes to the formula order on the medication administration record included specification of the VBEN vs RBEN feeding method and standardized administration instructions (Figure 1). Nurses, dietitians, and providers received training on the VBEN workflow and process through e-mail communication, in-person training, an interactive learning-assisted video, and one-on-one coaching.</p><p><b>Results</b>: Prior to June 2023, RBEN was the standard feeding method. Routine quality improvement audits from October 2020 to December 2022 in one intensive care unit demonstrated that, despite strategies to improve formula delivery, actual formula provision met ‘goal’ on 50% to 74% of EN days (Table 1). In June 2022, a hospital-wide audit of formula provision was conducted across all levels of care (floor, intermediate, and intensive care). In a total of 346 EN days, formula provision met ‘goal’ on 63% of EN days (Table 2). In November 2022, an audit was conducted in the two ICU units selected for phase 1 implementation. In a total of 154 EN days, formula provision met ‘goal’ on 57% of EN days (Table 2). Phase 1 implementation took place in June 2023. A post-go-live audit was completed: in a total of 157 EN days, ‘goal’ formula volume was achieved on 83% of EN days (Table 2). No instances of hypo/hyperglycemia or gastrointestinal complications were reported. 
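The 4-hour rate recalculation that nurses perform under VBEN can be sketched as follows. This is our assumed formula (spread the remaining prescribed volume over the remaining hours, capped at the 150 mL/hr maximum from the methods); the hospital's actual EMR calculator may differ:

```python
def vben_rate(goal_volume_ml, delivered_ml, hours_remaining, max_rate=150):
    """Recalculate the hourly rate so the remaining prescribed volume
    is delivered by the end of the feeding window, capped at max_rate."""
    remaining = max(goal_volume_ml - delivered_ml, 0)
    if hours_remaining <= 0:
        return 0
    return min(remaining / hours_remaining, max_rate)

# 1800 mL prescribed, only 400 mL delivered after a hold, 16 hours left:
print(vben_rate(1800, 400, 16))  # → 87.5 mL/hr
```

This catch-up behavior is what lets VBEN recover volume lost to interruptions, in contrast to a fixed RBEN rate that simply forfeits the missed hours.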
Phase 1 was deemed a success, and approval was obtained to continue VBEN implementation in a stepwise fashion for the remaining inpatient units.</p><p>Marcin Folwarski, MD, PhD<sup>1</sup>; Stanisław Kłęk<sup>2</sup>; Karolina Skonieczna-Żydecka<sup>3</sup>; Agata Zoubek-Wójcik<sup>4</sup>; Waldemar Szafrański, MD, PhD<sup>5</sup>; Lidia Bartoszewska<sup>6</sup>; Krzysztof Figuła<sup>7</sup>; Marlena Jakubczyk, MD, PhD<sup>8</sup>; Anna Jurczuk<sup>9</sup>; Przemysław Matras, MD, PhD<sup>10</sup>; Zbigniew Kamocki, MD, PhD<sup>11</sup>; Tomasz Kowalczyk, MD, PhD<sup>12</sup>; Bogna Kwella, MD, PhD<sup>13</sup>; Joanna Sonsala-Wołczyk<sup>14</sup>; Jacek Szopiński, MD, PhD<sup>15</sup>; Krystyna Urbanowicz, MD, PhD<sup>16</sup>; Anna Zmarzly, MD, PhD<sup>14</sup></p><p><sup>1</sup>Division of Clinical Nutrition and Dietetics, Medical University of Gdańsk, Gdansk, Pomorskie, Poland; <sup>2</sup>Surgical Oncology Clinic at the National Cancer Institute in Krakow at Maria Sklodowska-Curie National Research Institute of Oncology, Cracow, Poland; <sup>3</sup>Department of Biochemical Science, Pomeranian Medical University in Szczecin, Szczecin, Zachodniopomorskie, Poland; <sup>4</sup>Nutrimed Home Nutrition Center, Warsaw, Poland; <sup>5</sup>Home Enteral and Parenteral Nutrition Unit, General Surgery Department, Nicolaus Copernicus Hospital, Gdansk, Pomorskie, Poland; <sup>6</sup>First Department of General and Transplant Surgery and Clinical Nutrition, Medical University of Lublin, Home Enteral and Parenteral Nutrition Unit, Lublin, Poland; <sup>7</sup>Nutricare Clinical Nutrition Center, Cracow, Poland; <sup>8</sup>Department of Anaesthesiology and Intensive Care, Collegium Medicum in Bydgoszcz, Nicolaus Copernicus University, Toruń, Poland; <sup>9</sup>Outpatient Clinic of Nutritional Therapy, Clinical Hospital, Bialystok, Poland; <sup>10</sup>First Department of General and Transplant Surgery and Clinical Nutrition, Medical University of Lublin, Home 
Enteral and Parenteral Nutrition Unit SPSK4, Lublin, Poland; <sup>11</sup>Department of General and Gastroenterological Surgery, Medical University, Bialystok, Poland; <sup>12</sup>Nutricare Clinical Nutrition Center, Cracow, Poland; <sup>13</sup>Department of Clinical Nutrition, Provincial Specialist Hospital, Olsztyn, Poland; <sup>14</sup>Clinical Nutrition Unit, Gromkowski City Hospital, Wroclaw, Poland; <sup>15</sup>Department of General, Hepatobiliary and Transplant Surgery, Collegium Medicum, Nicolaus Copernicus University in Torun, Torun, Poland; <sup>16</sup>Department of Clinical Nutrition, Provincial Specialist Hospital, Olsztyn, Poland</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Cancer is one of the most common indications for home enteral nutrition (HEN). Malnutrition and weight loss, associated with deterioration in performance status, contribute to poorer outcomes in oncology patients. Systemic inflammation is a characteristic feature of cancer cachexia and may be used as a prognostic factor for short survival. According to the ESPEN guidelines, HEN is indicated for patients with an estimated survival of at least 30 days. Determining survival is therefore essential for individual care planning, as it informs healthcare professionals about the suitability of HEN and the palliative care strategy.</p><p><b>Methods</b>: In a retrospective multicenter survey, we examined the medical records of cancer patients treated in 2018 across 22 Polish HEN centers. Factors assessed during qualification for HEN included BMI, weight loss, albumin level, total protein level, lymphocyte count, CRP, Prognostic Nutritional Index (PNI), and Eastern Cooperative Oncology Group (ECOG) performance status. 
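The PNI among the factors above is commonly computed with Onodera's formula, combining serum albumin and total lymphocyte count; the abstract does not state which variant the centers used, so this is a sketch of the usual definition:

```python
def prognostic_nutritional_index(albumin_g_dl, lymphocytes_per_mm3):
    """Onodera's PNI: 10 x albumin (g/dL) + 0.005 x lymphocytes (/mm3).
    Lower values indicate worse nutritional/immunologic status."""
    return 10 * albumin_g_dl + 0.005 * lymphocytes_per_mm3

# e.g., albumin 3.5 g/dL and 1600 lymphocytes/mm3:
print(prognostic_nutritional_index(3.5, 1600))  # → 43.0
```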
The primary endpoint was survival of less than 30 days from the initiation of HEN.</p><p><b>Results</b>: A total of 278 cancer patients (51.44% head and neck, 41.37% gastrointestinal, and 7.19% other localizations; 70.14% male, 29.86% female) were included in the study. Albumin level below 3.5 g/dL (<i>p</i> = 0.02), C-reactive protein (<i>p</i> = 0.01), PNI &gt; 45 (<i>p</i> = 0.04), a high percentage of weight loss in the last 6 months (<i>p</i> &lt; 0.01), and ECOG performance score (<i>p</i> = 0.01) were associated with poor survival (less than 30 days). Body weight, BMI, lymphocyte count, and total protein level were not correlated with survival.</p><p><b>Conclusion</b>: Assessment of performance status, inflammation, and weight loss during qualification for HEN can predict short-term survival of cancer patients. This finding highlights the importance of comprehensive assessment before home nutrition initiation. Predicting poor survival can help plan palliative care and determine whether the patient will benefit from HEN.</p><p>June R. Greaves, RD, CNSC, CDN, LD, LDN, LRD<sup>1</sup>; Katharine Morra, RD, CNSC, CSO, LD, LDN<sup>2</sup></p><p><sup>1</sup>Coram CVS Specialty Infusion Services, Meriden, CT; <sup>2</sup>Coram CVS Specialty Infusion Services, Plainfield, IN</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: The objective of this quality improvement project was to determine whether patients were successful in administering tube feeding independently at home following virtual tube feeding instruction by a Registered Dietitian (RD) with a nationwide home care infusion company. 
We hope to provide information about the process, identify avenues for further improvement, and highlight areas for future research.</p><p><b>Methods</b>: A retrospective review was conducted of 162 patients who received virtual tube feeding instruction from the enteral RD between June 2022 and June 2023. Virtual instruction was completed for the enteral feeding pump, gravity bag, and bolus/syringe methods of administration. A follow-up call was made to active patients to inquire about their experience of the virtual instruction. For patients who could not be reached, the medical record was reviewed to determine whether inbound calls with questions or issues were received after the virtual instruction. Patients were queried on their confidence in their ability to provide enteral feedings, any concerns upon completion of the virtual instruction, knowledge of whom to contact after the instruction, and whether the reference materials provided were helpful. Patients who did not receive virtual instruction, or who were discharged from service, were excluded from the review.</p><p><b>Results</b>: One hundred sixty-two patients were reviewed as potentially eligible for the analysis; 115 were excluded. Of those excluded, 100 (87%) were no longer on service; 12 (10%) declined virtual instruction because of home health agency instruction, inpatient instruction with nursing or a dietitian prior to the start of care, or assistance from the home infusion company sales team; and 3 (3%) were a “no show” for the scheduled appointment. Eighteen of the remaining eligible patients could not be contacted for follow-up; for these patients, there were no documented inbound calls regarding feeding or equipment questions or concerns. Of the eligible patients, 29 provided telephonic feedback on the virtual instruction experience. 
Virtual instruction was related to the following administration types: enteral pump (86%, n = 25), followed by gravity bag and bolus/syringe (14%, n = 4). Upon completion of the instruction, 27 (93%) felt confident with feeding administration, while 2 (7%) did not feel confident as they identified as “in-person learners”; 24 (83%) did not experience issues/concerns, while 5 (17%) had questions/concerns; 27 (93%) reported knowing whom to contact, 2 (7%) did not; 22 (76%) found the reference materials provided helpful, 2 (7%) did not, and 5 (17%) did not review the reference materials.</p><p><b>Conclusion</b>: Recent technological advances have made virtual instruction possible. Virtual enteral instruction can be a successful tool for patients learning to administer tube feedings when in-person instruction is not possible in the home care setting. However, consideration should be given to the client's preferred style of learning. As the literature on virtual instruction outcomes is limited, further research on the use of virtual instruction to enhance the process is warranted.</p><p>Danelle A. Olson, RDN; Lisa M. Epp, RDN; Osman Mohamed Elfadil, MBBS; Ryan T. Hurt, MD, PhD; Manpreet S. Mundi, MD</p><p>Mayo Clinic, Rochester, MN</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: The prevalence of bariatric surgery has increased significantly in recent years, as it is the most effective long-term treatment for obesity. The two most common surgeries, Sleeve Gastrectomy (SG) and Roux-en-Y gastric bypass (RYGB), alter gastrointestinal anatomy, producing significant weight loss as well as remission of obesity-related co-morbidities including type 2 diabetes. Despite these benefits, bariatric surgery can be associated with significant debilitating complications. 
Though the true prevalence and mechanism are unclear, hypoglycemia has been shown to be present in up to 38% of post-surgical RYGB patients and can be very difficult to manage. Currently, there remains a paucity of data regarding the role of enteral nutrition (EN) as a potential therapy.</p><p><b>Methods</b>: A retrospective review of the electronic medical records (EMR) was conducted for patients seen in our outpatient home enteral nutrition (HEN) clinic for initiation of tube feeding to manage reactive hypoglycemia from March 2017 to July 2023. In addition to baseline clinical characteristics and demographics, we collected data about hypoglycemia incidents, interventions, EN regimens, and outcomes.</p><p><b>Results</b>: Six patients were seen in the HEN clinic with post-bariatric reactive hypoglycemia (mean age 45.5 ± 9.6 years; 66.7% female; mean BMI at HEN initiation 28.6 ± 8.3). Five of the 6 patients underwent RYGB surgery, and 1/6 underwent laparoscopic adjustable gastric banding (LAGB) that was subsequently revised to sleeve gastrectomy (SG). The duration until the development of reactive hypoglycemia after surgery varied in the cohort. On average, the first incident was documented 2.6 ± 3.2 years after surgery. Of note, patients lost, on average, 51.2 ± 28.5 kg after surgery and before they required EN support. We noted a slight change in weight after EN initiation, as patients remained, on average, at +2.5 kg at one and three months into HEN. Table 1 shows the patients' profiles. Dietary modification focusing especially on reduction in consumption of refined carbohydrates was recommended for all patients. However, poor compliance was prevalent, with 5/6 (83%) of patients not adhering to the prescribed diet. In addition to the EN and dietary regimens prescribed for all patients, some had received specific treatment(s) to prevent or manage reactive hypoglycemia. In one case, a combination of α-glucosidase inhibitors, somatostatin, and radical diet changes was used. 
The majority of patients underwent an initial trial of EN through a naso-jejunal tube, which was then converted to a percutaneous tube after efficacy was established (Table 2). Standard polymeric formulas were utilized for most patients, although one was provided commercial blenderized tube feeds. With the use of EN, 4 out of 6 patients had a resolution of reactive hypoglycemia, while only two continued to experience symptoms. Two patients stopped use of EN due to feeding complications and non-compliance, while the remaining four continued on EN.</p><p>Anna K. Burneske; Liyun Zhang, MS; Amy Y. Pan, PhD; Theresa Mikhailov, MD, PhD</p><p>Medical College of Wisconsin, Milwaukee, WI</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Patients who are malnourished have worse outcomes. Many standardized tools have been developed to screen for malnutrition in acutely ill pediatric patients: Pediatric Yorkhill Malnutrition Score (PYMS), Pediatric Nutrition Screening Tool (PNST), and Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP). Alternatively, some institutions have developed their own tools for this purpose. These tools are referred to as “home-grown”. Regardless of their origin, none of these tools have been validated in critically ill children. Registered dietitians (RDs) perform nutrition assessments on patients based on the results of these nutrition screenings or based on protocols within their institution. The Virtual Pediatric Systems, LLC (VPS), an international data registry supporting standardized data sharing for research, improved patient care, and benchmarking among pediatric ICUs, developed a nutrition module that captures data for nutritional metrics. VPS has collected data from centers in the nutrition module since October 2019 and collects data for about 10,000 patients per calendar year from the centers participating in the nutrition module. 
The specific aims were to compare the nutrition screening tools to the dietitians' assessments to determine the screening tools' accuracy and to determine whether standardized screening tools are more accurate than those developed at single centers. We hypothesized that (1) nutrition screening tools used by participating centers would accurately identify malnourished children, and (2) standardized tools would be more accurate than those developed at single centers.</p><p><b>Methods</b>: In this project, we compared pediatric nutrition screening tools with the assessments performed by RDs to determine whether nutrition screening tools accurately identify malnourished patients. We also determined which nutrition screening tools more accurately identify patients who are malnourished or at risk of becoming malnourished during their hospitalization in the PICU so that the appropriate nutrition therapy can be initiated. We obtained de-identified demographic and clinical data from October 2019 through March 2023 for all patients under 18 years of age from the VPS database from centers participating in the nutrition module. We considered the RD's assessment to be the gold standard for determining malnutrition and compared the nutrition screening tools to the RD's assessment. The degree of agreement in malnutrition between nutrition screening tools and the RD's assessment was determined by Cohen's kappa (κ).</p><p><b>Results</b>: After selecting subjects who had a complete pediatric nutrition screen and RD assessment, the final data cohort contained a total of 9891 patients. Among them, 54% were male; 4% were neonates (&lt;=29 d), 34% infants (&lt;2 y), 35% children (2-12 y), and 26% adolescents (12-18 y). The subjects were 40% White, 17.5% Black, 22.5% Hispanic, 5.7% Asian, and 14.2% other/mixed. The kappa coefficient for the standardized nutrition screening tools was .38, which is considered a “fair” agreement between the screening tool and the RD “gold standard” assessment. 
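For readers who want the mechanics of the agreement statistic used here: Cohen's kappa corrects the observed agreement rate for the agreement expected by chance, derived from each rater's marginal label frequencies. The following is a minimal illustrative sketch with hypothetical labels, not the study's analysis code:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # observed agreement: fraction of cases where the two raters concur
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement: sum over categories of the product of marginal rates
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum((ca[c] / n) * (cb[c] / n) for c in set(rater_a) | set(rater_b))
    return (observed - expected) / (1 - expected)

# hypothetical screening-tool vs dietitian labels (1 = malnourished)
screen = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]
rd     = [1, 0, 0, 0, 1, 0, 1, 0, 1, 0]
kappa = cohen_kappa(screen, rd)
```

With these made-up labels, observed agreement is 0.80 and chance agreement is 0.52, giving a kappa near 0.58; a value of .38, as reported above, sits in the conventional "fair" band (0.21-0.40).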
Other unidentified tools listed as “home-grown” or “other” in VPS had kappa coefficients ranging from .31 to .91; a kappa of .91 represents near-perfect agreement between the screening tool and the RD “gold standard.”</p><p><b>Conclusion</b>: These data show only a fair degree of agreement between the standardized screening tools (PYMS, PNST, STAMP) and RD assessments, meaning that these tools do not adequately assess the nutritional status of critically ill children. However, some unidentified hospital-specific tools show near-perfect agreement with RD assessments, so perhaps there is a better tool for identifying malnourished children in the ICU. Further investigation should be performed to determine why some home-grown tools are superior to the published tools.</p><p><b>Research Trainee Award</b></p><p>Hayley E. Billingsley, PhD, RD, CEP; Michael Dorsch, PharmD, MS; Todd M. Koelling, MD; Scott L. Hummel, MD, MS</p><p>University of Michigan, Ann Arbor, MI</p><p><b>Financial Support</b>: NHLBI - Award 5R33HL155498-03.</p><p><b>Background</b>: Malnutrition is common in patients with heart failure (HF) and worsens an already poor prognosis. Previous work suggests that sodium restriction, the most common dietary recommendation for patients with HF, may be associated with reduced micronutrient and energy intake. The Mini Nutritional Assessment-Short Form (MNA-SF) is a strong indicator of nutrition status and prognosis in patients with HF, but the association between MNA nutrition status and sodium intake has not been examined. Therefore, this analysis aimed to examine the association between nutrition status and habitual sodium intake in hospitalized patients with HF.</p><p><b>Methods</b>: This is a cross-sectional analysis of patients (≥18 y of age) hospitalized for decompensated HF. Participants were administered the MNA-SF and scored as nourished, at risk of malnutrition, or malnourished based on established cutoffs. 
Questions on the MNA-SF regarding weight loss and declines in food intake over the previous 3 months were also considered independently. Participants completed the 2014 Block Food Frequency Questionnaire (FFQ) to assess habitual dietary intake. Estimated daily kilocalories (kcals) from the FFQ were divided by estimated energy needs (Harris-Benedict equation*1.1) to calculate percent (%) estimated energy needs. Estimated protein intake from the FFQ was divided by estimated protein needs, calculated based on the Academy of Nutrition and Dietetics recommendation of 1.1 g/kilogram (kg) in HF, to calculate % estimated protein needs. Using the FFQ, participants were grouped into sodium intake ≥ or &lt; 2 g per day. Differences between groups based on sodium intake were explored using Fisher's exact test, Chi-square, or the Mann-Whitney U test as applicable.</p><p><b>Results</b>: Baseline characteristics are presented in Table 1. On FFQ, participants with sodium intake &lt;2 g reported consuming significantly less of their % estimated energy and protein needs than participants with ≥ 2 g sodium intake (Figure 1). All patients (n = 12) with sodium intake &lt;2 g per day were malnourished or at risk for malnutrition on MNA-SF versus 73% (32) of patients with sodium intake ≥2 g per day (<i>P</i> = 0.051). A greater proportion of patients with daily sodium intake &lt;2 g reported recent weight loss &gt;3 kg (75% [9] vs. 43% [19], <i>P</i> = 0.051). No difference was found in the proportion of participants reporting a decrease in food intake on the MNA-SF (&lt;2 g sodium, 67% [8] vs. ≥ 2 g sodium, 50% [22], <i>P</i> = 0.305).</p><p><b>Conclusion</b>: In patients hospitalized for HF, habitual sodium intake &lt;2 g per day was associated with inadequate energy and protein intake, confirming previous findings. 
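The intake-adequacy arithmetic described in the Methods can be sketched as follows. This is an illustrative sketch only: the classic Harris-Benedict coefficients are assumed, since the abstract does not specify which variant of the equation was used, and the patient values are hypothetical:

```python
def harris_benedict_ree(weight_kg, height_cm, age_y, sex):
    """Resting energy expenditure (kcal/d), classic Harris-Benedict equations."""
    if sex == "male":
        return 66.5 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_y
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_y

def pct_energy_needs(ffq_kcal, weight_kg, height_cm, age_y, sex):
    # estimated energy needs = Harris-Benedict REE * 1.1, per the Methods
    needs = harris_benedict_ree(weight_kg, height_cm, age_y, sex) * 1.1
    return 100 * ffq_kcal / needs

def pct_protein_needs(ffq_protein_g, weight_kg):
    # estimated protein needs = 1.1 g/kg in heart failure, per the Methods
    return 100 * ffq_protein_g / (1.1 * weight_kg)

# hypothetical 70-kg, 170-cm, 50-year-old male reporting 1200 kcal and 60 g protein/d
energy_pct = pct_energy_needs(1200, 70, 170, 50, "male")
protein_pct = pct_protein_needs(60, 70)
```

For this hypothetical patient the reported intake covers roughly 71% of estimated energy needs and 78% of estimated protein needs, the kind of shortfall the Results associate with sodium intake below 2 g per day.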
Despite the high prevalence of obesity in the cohort, sodium intake &lt;2 g per day was also associated with self-reported weight loss &gt;3 kg and a higher likelihood of being at risk for or having malnutrition. Although this cross-sectional analysis cannot determine the directionality of observed associations, additional studies should examine the impact of personalized nutrition interventions vs. standard-of-care sodium restriction education in HF on clinical outcomes.</p><p><b>Figure 1</b>. Percent estimated energy and protein needs achieved by sodium intake level in hospitalized patients with heart failure.</p><p>Lucia A. Gonzalez Ramirez, cPhD<sup>1,2</sup>; Mary M. Nellis, PhD<sup>3</sup>; Jessica A. Alvarez, PhD<sup>1,2,4</sup>; Tasha M. Burley<sup>2</sup>; Paula D. Nesbeth, cPhD<sup>1,2</sup>; Chin-An Yang, cPhD<sup>1,2</sup>; Dean P. Jones, PhD<sup>1,3,4</sup>; Thomas R. Ziegler, MD<sup>1,2,4</sup></p><p><sup>1</sup>Nutrition and Health Sciences Program, Laney Graduate School, Emory University, Atlanta, GA; <sup>2</sup>Division of Endocrinology, Metabolism and Lipids, Department of Medicine, Emory University, Atlanta, GA; <sup>3</sup>Clinical Biomarkers Laboratory, and Division of Pulmonary, Allergy, Critical Care and Sleep Medicine, Department of Medicine, Emory University, Atlanta, GA; <sup>4</sup>Center for Clinical and Molecular Nutrition, Department of Medicine, Emory University, Atlanta, GA</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Postprandial metabolism can identify alterations related to the early stages of cardiovascular disease. However, limited data exist regarding the effects of body composition on postprandial metabolism after a lipid meal challenge. 
We aimed to characterize the metabolic pathways and metabolites associated with body fat abundance in the postprandial plasma metabolome after an oral lipid challenge.</p><p><b>Methods</b>: Thirty-one healthy individuals between 20 and 50 years old with a lean or overweight/obese body mass index (BMI) were recruited. Participants underwent body composition measurement with dual energy x-ray absorptiometry (DEXA) to quantify body fat percentage and visceral adipose tissue quantity. A standardized 900-kcal lipid meal challenge (a long chain triglyceride fat emulsion oral nutritional supplement) with repeat blood sampling was administered. Untargeted plasma high-resolution metabolomics was determined at baseline, 120 minutes, and 360 minutes after the lipid challenge using dual-column liquid chromatography (C18- and HILIC+ electrospray modes), coupled with high-resolution mass spectrometry (LC-HRMS). Metabolite differences were assessed using a metabolome-wide association study with linear mixed-effect models to study effects of body fat, time, and the body fat*time interaction, controlling for age and sex, and pathway enrichment analysis was performed.</p><p><b>Results</b>: A total of 12,078 (C18) and 15,041 (HILIC) features (metabolites) were detected in plasma at baseline. Changes over time differed by percent body fat (percent fat*time interaction) for 699 (C18) and 814 (HILIC) features from baseline to 120 minutes, respectively, and 465 (C18) and 478 (HILIC) features from baseline to 360 minutes, respectively (all <i>p</i> &lt; 0.05). These were enriched in pathways that include TCA cycle, fatty acid, lysine, tyrosine, tryptophan, butanoate, and purine metabolism, Figures 1 and 2. 
Additionally, changes over time differed by visceral adipose tissue quantity (VAT*time interaction) for 396 (C18) and 2290 (HILIC) features from baseline to 120 minutes, respectively, and 486 (C18) and 520 (HILIC) features from baseline to 360 minutes, respectively (all <i>p</i> &lt; 0.05). These were enriched in pathways that include fatty acid oxidation, omega-3 and −6 fatty acid, vitamin C, and pentose phosphate metabolism, Figures 3 and 4.</p><p><b>Best of ASPEN - Malnutrition, Obesity, Nutrition Practice Concepts, and Issues</b></p><p>Ana Paula Pagano, MSc<sup>1</sup>; Taiara Poltronieri, BSc<sup>1,2</sup>; William Evans, PhD<sup>3</sup>; M. Cristina Gonzalez, MD, PhD<sup>4</sup>; Anil Abraham Joy, MD<sup>5</sup>; Claude Pichard, MD, PhD<sup>6</sup>; Carla Prado, PhD, RD<sup>1</sup></p><p><sup>1</sup>University of Alberta, Edmonton, AB, Canada; <sup>2</sup>Federal University of Rio Grande do Sul, Porto Alegre, Rio Grande do Sul, Brazil; <sup>3</sup>University of California, Berkeley, CA; <sup>4</sup>Federal University of Pelotas, Pelotas, Rio Grande do Sul, Brazil; <sup>5</sup>University of Alberta/Cross Cancer Institute, Edmonton, AB, Canada; <sup>6</sup>Geneva University Hospital, Geneva, Switzerland</p><p><b>Financial Support</b>: ASPEN (American Society for Parenteral and Enteral Nutrition) Rhoads Research Foundation, and the Canadian Institutes of Health Research (CIHR) (FRN 159537).</p><p><b>Background</b>: Accurate understanding of energy requirements is essential for tailored nutritional interventions in patients with cancer. Under- or overestimating these needs can lead to detrimental weight loss or excessive gain. Yet, determining energy needs in cancer is challenging due to factors like individual tumor burden, treatment, and inflammation, all of which can influence energy requirements. Current guidelines offer a broad caloric intake range (25-30 kcal/kg/d) set as normal values that lack strong evidence. 
As a result, dietitians often rely on predictive equations, which have proven to be imprecise. Moreover, standard techniques available to accurately measure energy requirements are costly, time-consuming, and not applicable to clinical settings. In this study, we leveraged a cohort of patients with breast cancer to evaluate the accuracy of a novel bedside device designed to measure resting energy expenditure (REE) and compared it against a gold-standard method.</p><p><b>Methods</b>: REE data were obtained cross-sectionally from adult females with breast cancer (stages I-III) measured during a 10-minute test with a novel portable device, the Q-NRG® (Cosmed, Roma, Italy), and compared against REE measured during a 1-hour test in a whole-room indirect calorimeter (WRIC) as a gold-standard technique. To assess and describe the REE accuracy between methods, we utilized the paired-samples t-test, or the Wilcoxon signed-rank test for instances of non-normality. Accuracy was determined by the percentage of estimates that fell within 10% of the values measured by WRIC. Additionally, Bland-Altman analysis was conducted to determine bias and establish the lower and upper limits of agreement (LOA). A p-value of less than 0.05 was considered statistically significant.</p><p><b>Results</b>: REE was evaluated in 49 females (age 55.9 ± 11.8 y; 42 with stage I or II and 7 with stage III breast cancer) using both WRIC and the new portable device. Most patients (63.3%) had a body mass index (BMI) classification within the overweight or obesity categories, and none were categorized as underweight. The new portable device provided accurate measurements for over 70% (n = 35) of patients, with measurements within 10% of those obtained by WRIC. However, the new portable device overestimated REE for 1 patient and underestimated it for 13. 
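For context, the Bland-Altman bias is the mean of the paired device-minus-reference differences, and the 95% limits of agreement are that mean ± 1.96 standard deviations of the differences. A minimal sketch with made-up REE pairs, not the study data:

```python
import statistics

def bland_altman(device, reference):
    """Return (bias, lower LOA, upper LOA) for paired measurements."""
    diffs = [d - r for d, r in zip(device, reference)]
    bias = statistics.mean(diffs)   # mean difference (the bias)
    sd = statistics.stdev(diffs)    # sample standard deviation of differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# hypothetical REE pairs (portable device vs whole-room calorimeter, kcal/d)
device = [1400, 1500, 1450, 1550]
wric = [1500, 1520, 1500, 1500]
bias, lo, hi = bland_altman(device, wric)
```

Note that the limits are symmetric about the bias, which is why a reported bias of −102 kcal with a 552-kcal-wide agreement interval implies one negative and one positive limit.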
Measured REE significantly differed between techniques, with the new portable device underestimating REE compared to WRIC (1406 ± 262 vs 1508 ± 248 kcal/d; <i>p</i> &lt; 0.001). The bias between the new portable device and WRIC was −6.7% (LOA = −24.9%, 11.6%; variance = 36.5%) or −102 kcal (LOA = −378 kcal, 174 kcal; variance = 552 kcal).</p><p><b>Conclusion</b>: When compared to a gold-standard technique, the new portable device showed good agreement at the group level, with REE measurement discrepancies falling within 10% of values determined by the WRIC. Although a greater variability was observed at the individual level, the new portable device accurately assessed REE in comparison to the WRIC for most patients. Thus, the new portable device appears to be a promising tool for estimating REE of patients with breast cancer, positioning it as a viable option for clinical settings.</p><p>Michelle Brown, MS, RD, LDN, CNSC</p><p>UF Health, Gainesville, FL</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Malnutrition is a highly prevalent issue in the healthcare setting. The term malnutrition in the healthcare setting refers to undernutrition. This occurs as a result of inadequate nutrition intake, impaired absorption, or altered utilization of nutrition. Inflammation and hypermetabolism also contribute to the development of malnutrition. Estimates of the prevalence vary and are as high as 54%. In acute care hospitals, the prevalence of malnutrition is 39% when using diagnostic criteria from the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition (AND/ASPEN). 
Capturing and recognizing malnutrition is important, as this diagnosis is associated with a 3.4x higher rate of in-hospital death, 1.9x longer length of stay, 2.2x higher likelihood of being admitted with a serious infection, higher rates of discharge to a rehabilitation or long-term assisted care facility, an increased rate of readmissions, and a 73% increase in hospital costs. Due to the impact of malnutrition on healthcare costs and requirements for care, ICD-10 codes for malnutrition are considered comorbid conditions (CC) or major comorbid conditions (MCC). Accurate diagnosis, treatment, and documentation of malnutrition can improve patient care. Accurate documentation can also help to capture complexity for quality metrics while also allowing for the selection of the correct DRG and base payment, which may increase reimbursement.</p><p><b>Methods</b>: An interdisciplinary nutrition committee at our organization consisting of dietitians, physicians, nurses, and informatics professionals completed a quality improvement implementation to improve malnutrition diagnosis rates, documentation, and coding. This was completed in four steps: (1) Identification of malnutrition criteria that could be used across the organization. Our committee elected to use the AND/ASPEN criteria for the diagnosis of malnutrition. These criteria are used by ~85% of hospitals and are widely recognized by payors. (2) Development of a documentation tool that would allow RD malnutrition diagnoses to populate in provider progress notes. The hospital's electronic medical record (EMR) was leveraged to accomplish this goal. A novel flowsheet and Smartphrase were developed, which allowed information on malnutrition severity, signs/symptoms, and treatment (entered by the dietitian) to flow into physician progress notes automatically. 
This solution met all the “best practices” for documentation that were identified by our interdisciplinary team – clear signs and symptoms of malnutrition identified, the severity of malnutrition indicated and documented consistently between providers, consistent use of diagnostic criteria, and treatment for malnutrition being provided and documented. (3) All clinical nutrition staff members were provided with hands-on training on the completion of nutrition-focused physical exams (NFPE), and the completion of these exams was prioritized in all nutrition assessments. (4) When the malnutrition Smartphrase was not used, notes were sent to physicians for attestation and signature.</p><p><b>Results</b>: Following this implementation, dietitian-diagnosed malnutrition has been included in physician notes via Smartphrase for 65% of cases. In the six months following NFPE training, malnutrition diagnosis rates increased by 220%. The percentage of dietitian assessments with a malnutrition diagnosis has increased from 13% to 40%. Following the process of sending notes to physicians for attestation and signature, 94% of malnutrition diagnoses are coded in the EMR at discharge from the hospital, and coding queries to physicians decreased by 50%. Hospital reimbursement for dietitian-diagnosed malnutrition has increased from ~$65,000 per quarter to ~$2 million per quarter.</p><p><b>Conclusion</b>: Utilization of appropriate NFPE training, physician-approved diagnostic criteria, and EMR-based documentation solutions can increase diagnosis, documentation, and reimbursement for malnutrition diagnoses in hospitalized patients.</p><p><b>Research Trainee Award</b></p><p>Alan Garcia-Grimaldo<sup>1,2</sup>; Ivan A. Osuna-Padilla<sup>1</sup>; Nadia Rodriguez-Moguel<sup>1</sup>; Martin A. 
Rios-Ayala<sup>1</sup>; Marycarmen Godinez-Victoria<sup>2</sup></p><p><sup>1</sup>National Institute of Respiratory Diseases, Mexico City, DF, Mexico; <sup>2</sup>Escuela Superior de Medicina, Instituto Politécnico Nacional, Mexico City, DF, Mexico</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Intensive care unit-acquired weakness (ICU-AW) is characterized by peripheral muscle mass wasting, reduced muscle strength, and dysfunction. Respiratory and swallowing-related muscles could also be affected by this condition. This study aimed to analyze the association between ICU-AW incidence and post-extubation dysphagia (P-ED).</p><p><b>Methods</b>: A prospective cohort study was conducted. Patients on mechanical ventilation (MV) admitted to the ICU were included. Individuals with a previous diagnosis of myopathies were excluded. NUTRIC-Score, calf circumference adjusted to BMI, and phase angle (PhA) obtained by bioelectrical impedance were assessed upon admission and after extubation. Biochemical variables (baseline C-reactive protein [CRP]) were collected from medical records. SOFA score, APACHE II, and malnutrition diagnosis using GLIM criteria were determined upon admission to the ICU. Cumulative energy (CED) and protein (CPD) deficits were calculated during the ICU stay. ICU-AW diagnosis was determined using the Medical Research Council Scale (MRC-Scale &lt;48) and handgrip strength (&lt;11 kg for men, and &lt;7 kg for women). Swallowing function assessment was performed within the first 24 hours after extubation, using the Yale Swallowing Protocol (YSP). For patients who did not meet the success criteria defined for the YSP, the volume-viscosity swallow test was performed to corroborate the presence of post-extubation dysphagia (P-ED). Specific success and failure criteria proposed for each test were used. Mean and median comparison tests were performed for each variable between the group with P-ED and those with normal swallowing. 
Associations were analyzed using univariate and multivariate logistic and linear regressions. Covariate selection was performed using a stepwise method.</p><p><b>Results</b>: Fifty-four patients were included; 19 (35.2%) were diagnosed with P-ED and 32 (59.3%) with ICU-AW. Patients with P-ED showed lower values for PhA at extubation, MRC-Scale, and handgrip strength at extubation. In addition, more days on invasive MV and higher CED and CPD were observed in this group (Table 1). In the univariate logistic regression analysis, PhA at extubation, CED, CPD, ICU-AW diagnosis, and days on MV were associated with P-ED identification. In multivariate regression analysis, only days on MV and the ICU-AW diagnosis were independently associated with P-ED (Table 2).</p><p><b>Conclusion</b>: Days on invasive mechanical ventilation and ICU-acquired weakness diagnosis were predictors of post-extubation dysphagia. Novel clinical and nutritional strategies are required to prevent ICU-acquired muscle weakness and its consequences, which may improve clinical outcomes and quality of life after extubation.</p><p>Ahron Lee, RD<sup>1,2</sup>; Eun-Mee Kim, RD<sup>1</sup>; Bo-eun Kim, RD<sup>1</sup>; Chi-Min Park, MD, PhD<sup>3</sup>; Sung Nim Han, PhD<sup>2</sup></p><p><sup>1</sup>Department of Dietetics, Samsung Medical Center, Seoul, Korea, Republic of (South); <sup>2</sup>Department of Food and Nutrition, College of Human Ecology, Seoul National University, Seoul, Korea, Republic of (South); <sup>3</sup>Department of Critical Care Medicine and Surgery, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul, Korea, Republic of (South)</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: The importance of “appropriate” nutrition support in the early stages of intensive care unit (ICU) admission is under debate regarding patients who require it, time of initiation, and the amount to be provided. 
In this study, the characteristics and clinical outcomes of malnourished patients diagnosed using the Global Leadership Initiative on Malnutrition (GLIM) criteria were examined. Also, the actual implementation of nutritional support and its relationship with clinical outcomes based on nutrition status were investigated.</p><p><b>Methods</b>: This retrospective cohort study included critically ill patients receiving invasive mechanical ventilation who were admitted to the ICU and hospitalized for at least 7 days between January 1, 2020, and December 31, 2022. Nutritional and clinical data during their first 10 days in the ICU were collected. All the patients in this study underwent nutrition assessment by the GLIM criteria. The 90-day mortality of patients diagnosed with malnutrition by the GLIM criteria and degree of malnutrition were analyzed. Patients were divided into three energy intake categories (&lt;10 kcal/kg/d, 10–20 kcal/kg/d, and &gt;20 kcal/kg/d) and three protein intake categories (&lt;0.8 g/kg/d, 0.8–1.2 g/kg/d, and &gt;1.2 g/kg/d). Information on intake was categorized by the stage following ICU admission (days 1–3 for the early acute phase, days 4–6 for the late acute phase, and days 7–10 for the recovery phase). We examined the differences in mortality among groups separated by energy and protein intake at each stage. The analyses were performed for the total cohort, well-nourished, and malnourished groups. Differences in the means and distribution were evaluated, and survival analyses and regression analyses were performed.</p><p><b>Results</b>: A total of 595 patients were included. The prevalence of malnutrition according to the GLIM criteria was 61% (n = 362). The 90-day mortality in the well-nourished and the malnourished group was 45% and 58%, respectively (<i>P</i> &lt; 0.001). Mortality differed by the degree of malnutrition (well-nourished 45%, moderately malnourished 53%, severely malnourished 61%, <i>P</i> = 0.001). 
In the early acute phase and late acute phase, there was no difference in mortality among different energy intake groups. However, in the recovery phase, the group with high energy intake (&gt;20 kcal/kg/d) showed lower mortality (hazard ratio (HR) 0.602; 95% confidence interval (CI) 0.413 to 0.877; <i>P</i> = 0.008) in the total cohort. In well-nourished patients, the high energy intake group tended to have lower mortality (HR 0.573; 95% CI 0.318 to 1.034; <i>P</i> = 0.064) in the recovery phase. However, in malnourished patients, the group with high energy intake showed significantly lower mortality (HR 0.549; 95% CI 0.333 to 0.903; <i>P</i> = 0.018) in the recovery phase. In the early acute phase and late acute phase, there was no difference in mortality among different protein intake groups. However, in the recovery phase, the group with moderate protein intake (0.8–1.2 g/kg/day) showed lower mortality (HR 0.770; 95% CI 0.599 to 0.990; <i>P</i> = 0.041) in the total cohort. When well-nourished patients and malnourished patients were analyzed separately, a significantly lower mortality (HR 0.728; 95% CI 0.536 to 0.988; <i>P</i> = 0.042) in the recovery phase was observed with moderate protein intake among malnourished patients.</p><p><b>Conclusion</b>: Malnutrition diagnosed by the GLIM criteria was associated with 90-day mortality and other clinical outcomes. Furthermore, energy and protein intake at the recovery phase after ICU admission was associated with mortality, especially in malnourished patients classified by the GLIM criteria. 
Therefore, adjusting nutritional intake over time according to nutrition status may be relevant for optimizing ICU nutrition support strategies.</p><p><b>Best International Abstract</b></p><p><b>International Abstract of Distinction</b></p><p>Fabio Araujo, RD, MHS<sup>1</sup>; Maureen Tosh, PT<sup>1</sup>; Maitreyi Kothandaraman, MD, MSc, FRCPC, CAGF<sup>2</sup>; Juan Posadas, MD, MSc<sup>1,2</sup>; Paul Wischmeyer, MD, EDIC, FASPEN, FCCM<sup>3</sup>; Priscilla Barreto, RD<sup>4</sup>; Chelsia Gillis, RD, PhD, CNSC<sup>5</sup></p><p><sup>1</sup>Alberta Health Services, Calgary, AB, Canada; <sup>2</sup>University of Calgary, Calgary, AB, Canada; <sup>3</sup>Duke University School of Medicine, Durham, NC; <sup>4</sup>Hospital Naval Marcilio Dias, Rio de Janeiro, RJ, Brazil; <sup>5</sup>McGill University School of Human Nutrition, Montreal, QC, Canada</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Functional capacity is the most relevant outcome after critical illness according to ICU survivors. This outcome is especially pertinent as adult ICU mortality has been decreasing, leaving more survivors with impaired functional capacity, delayed return to work, and low quality of life. Protein via nutrition support (NS) has the potential to mitigate ICU-acquired weakness, but given that current ICU benchmarks are based on mortality and ICU-related complications, it is unknown whether these protein targets also support functional recovery. 
To address this gap, we conducted a retrospective cohort study to determine whether different protein intake doses influenced the functional capacity of ICU survivors with length of stay (LOS) ≥ 7 days, measured by the Chelsea Critical Care Physical Assessment tool (CPAx) at ICU discharge, a validated measure of functional capacity with established reliability, measurement error, and responsiveness.</p><p><b>Methods</b>: The medical records of all consecutive patients admitted to a general systems ICU between October 2014 and September 2020 were reviewed. Inclusion criteria were age ≥18 years, survived ICU admission, ICU stay ≥7 days, and received NS. Exclusion criteria included neuromuscular disorders, brain/spinal cord injury, limb amputation, orthopedic fractures, persistent coma during ICU stay, missing CPAx, and mechanical ventilation &lt;3 days. Eligible patients were divided into 4 groups guided by previous literature exploring the effect of daily protein intake in the ICU (g/kg/d) on mortality: LOW (&lt;0.8), MEDIUM (0.8-1.19), HIGH (1.2-1.5), and VERY HIGH (&gt;1.5). Groups with similar CPAx were pooled to enhance precision. The effect of protein dose on CPAx was assessed with analysis of covariance (ANCOVA) adjusting for the confounding variables age, disease severity, length of stay in hospital before ICU admission, duration of mechanical ventilation, and time until start of NS in ICU. Effect modification by nutritional status was assessed with stratification according to subjective global assessment (SGA A: well-nourished and B/C: malnourished). The effect of energy intake was assessed using the same regression model (&lt;25 and ≥25 kcal/kg/d; &lt;70 and ≥70% daily adequacy).</p><p><b>Results</b>: Inclusion/exclusion criteria were met by 531 patients. CPAx was non-linearly associated with protein doses (Figure 1) and was not statistically different among the LOW, MEDIUM, and VERY HIGH groups. 
All groups were different from HIGH (<i>p</i> = 0.003), indicating data could be pooled, and giving rise to 2 groups: HIGH (1.2-1.5 g/kg/d) and POOLED (&lt;1.2 and &gt;1.5 g/kg/d). Baseline characteristics were comparable between both groups (Table 1). Mean CPAx (±standard error) was greater in the HIGH vs POOLED groups (30.1 ± 0.7 vs. 26.8 ± 0.6, <i>p</i> = 0.001), suggesting that HIGH was associated with superior functional capacity at discharge. The mean difference (MD) remained statistically significant after adjusting for confounding variables (CPAx MD: 3.4 ± 1.1, <i>p</i> = 0.003 in the 4-group model and 3.3 ± 0.9, <i>p</i> = 0.001 in the 2-group model). Energy intake had no effect on CPAx, whether expressed as kcal/kg/d (28.1 ± 0.6 in &lt;25 kcal/kg vs 27.9 ± 0.8 in ≥25 kcal/kg, <i>p</i> = 0.780) or as adequacy (27.3 ± 0.9 in &lt;70% vs 28.4 ± 0.6 in ≥70%, <i>p</i> = 0.641). Nutritional status was not an effect modifier as the HIGH group had superior CPAx in both well-nourished (MD 3.8 ± 1.7, <i>p</i> = 0.029) and malnourished (MD 2.5 ± 1.1, <i>p</i> = 0.031) patients.</p><p><b>Best of ASPEN - Critical Care and Critical Health Issues</b></p><p><b>International Abstract of Distinction</b></p><p>Chin Han Charles Lew, APD, PhD<sup>1</sup>; Zheng-Yii Lee, PhD<sup>2,3</sup>; Andrew Day, MSc<sup>4</sup>; Xuran Jiang, MSc<sup>4</sup>; Danielle E. Bear, RD, PhD<sup>5,6</sup>; Gordon L. Jensen, MD, PhD<sup>7</sup>; Pauline Y. Ng, MBBS, MRCP(UK), FHKCP, FHKAM<sup>8</sup>; Lauren Tweel, RD, CNSC, MSc<sup>9</sup>; Angela Parillo, RD, LD, CNSC, MSc<sup>10</sup>; Daren K. 
Heyland, MD, MSc<sup>4</sup>; Charlene Compher, PhD, RD, LDN, FASPEN<sup>11</sup></p><p><sup>1</sup>Dietetics and Nutrition Department, Ng Teng Fong General Hospital, Singapore; <sup>2</sup>Department of Anesthesiology, Faculty of Medicine, Universiti Malaya, 50603 Kuala Lumpur, Kuala Lumpur, Malaysia; <sup>3</sup>Department of Cardiac Anesthesiology &amp; Intensive Care Medicine, Berlin, Germany; <sup>4</sup>Clinical Evaluation Research Unit, Department of Critical Care Medicine, Queen's University, Kingston, ON, Canada; <sup>5</sup>Department of Critical Care, Guy's and St Thomas' NHS Foundation Trust, St Thomas' Hospital, London, United Kingdom; <sup>6</sup>Department of Nutrition and Dietetics, Guy's and St Thomas' NHS Foundation Trust, St Thomas' Hospital, London, United Kingdom; <sup>7</sup>University of Vermont Larner College of Medicine, Burlington, VT; <sup>8</sup>Critical Care Medicine Unit, School of Clinical Medicine, The University of Hong Kong, Hong Kong; <sup>9</sup>Rutgers University, New Brunswick, NJ; <sup>10</sup>The Ohio State University Wexner Medical Center, Department of Clinical Nutrition, Columbus, OH; <sup>11</sup>University of Pennsylvania School of Nursing, Philadelphia, PA</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Pre-existing malnutrition is common among critically ill patients (38-78%), and it can be diagnosed using tools such as the Global Leadership Initiative on Malnutrition (GLIM) criteria, and the Academy of Nutrition and Dietetics and American Society for Parenteral and Enteral Nutrition (ASPEN) Indicators of Malnutrition (AAIM). However, it is unclear if these tools or their individual components (nutrition parameters [NPs]), such as weight, diet history, body mass index (BMI), or muscle mass have better clinical utility and validity in the intensive care unit (ICU) setting since certain NPs can be easier to obtain (e.g. BMI) than others (e.g. weight history). 
More importantly, it is unclear if treating malnutrition according to the 2021 ASPEN guidelines (which recommend delivering 12-25 kcal/kg/d of energy and 1.2-2 g/kg/d of protein) is associated with improved clinical outcomes. We investigated whether GLIM, AAIM, and/or selected individual NPs measured at ICU admission were associated with time to discharge alive (TTDA) (primary outcome), mortality (60-day), or home discharge, and whether a higher protein delivery modified those associations.</p><p><b>Methods</b>: This was a post hoc analysis of the EFFORT Protein trial (n = 1301), the largest multinational, multicenter trial that compared higher vs. usual protein delivery in critically ill patients. The malnutrition statuses of patients were retrospectively classified according to GLIM and AAIM using NPs that were prospectively collected at ICU admission. For GLIM, acute disease-related inflammation formed the etiologic factor for all patients since they were critically ill, and malnutrition severity was classified according to the phenotypic parameters (severity of weight loss, low BMI, reduced muscle mass). For AAIM, a modified approach was adopted as certain NPs were not collected (i.e., reduced energy intake or weight loss for periods &lt; 1 month, fluid accumulation, and grip strength); hence, malnutrition status was classified by the patient's weight loss severity and any reduction in energy intake. Multivariable regressions were used to identify whether malnutrition diagnosed by GLIM and AAIM (both dichotomized by “not identified as malnourished” vs. “moderate/severe malnutrition”) and/or individual NPs were associated with outcomes, and whether protein delivery modified their associations.</p><p><b>Results</b>: Table 1 summarizes the characteristics of patients according to their malnutrition status classified by GLIM. Of 1301 predominantly medical admissions, 41% and 14% of the patients were malnourished according to GLIM and AAIM, respectively. 
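The GLIM-based dichotomization described in the Methods can be sketched as a simple rule. The cut-offs below are illustrative examples only, not the trial's exact operational definitions, and the function name is hypothetical.

```python
def glim_status(weight_loss_pct: float, bmi: float,
                low_muscle_mass: bool, age_years: int) -> str:
    """Dichotomized GLIM classification sketch. All patients meet the
    etiologic criterion (disease-related inflammation of critical illness),
    so status hinges on the phenotypic criteria. Thresholds are illustrative."""
    # Example age-specific low-BMI cut-offs (illustrative, not the trial's).
    low_bmi = bmi < (20.0 if age_years < 70 else 22.0)
    # Any phenotypic criterion met -> classified as malnourished.
    phenotypic = weight_loss_pct > 5.0 or low_bmi or low_muscle_mass
    return ("moderate/severe malnutrition" if phenotypic
            else "not identified as malnourished")
```

For example, a 55-year-old with 8% weight loss would be classified as malnourished under this sketch, whereas the same patient with stable weight, normal BMI, and preserved muscle mass would not.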
Malnutrition diagnosed by GLIM and AAIM was independently associated with extended TTDA (<i>p</i> = 0.03, <i>p</i> = 0.01), higher odds of 60-day mortality (<i>p</i> = 0.02, <i>p</i> = 0.01), and lower odds of home discharge (<i>p</i> = 0.03, <i>p</i> = 0.05), whereas individual NPs were not (<i>p</i> &gt; 0.10). However, higher protein delivery did not modify the association between malnutrition (diagnosed by GLIM and AAIM) and worse outcomes (Table 2). Notably, in patients with BMI &lt; 18.5 kg/m<sup>2</sup> (n = 78), higher protein delivery was associated with a shorter TTDA (adjusted hazard ratio 2.68, 95% confidence interval [CI] 1.14-6.30) and greater odds of home discharge (adjusted odds ratio 4.61, 95% CI 1.35-15.71) than usual protein delivery.</p><p>Elias Wojahn, B.S.; Liyun Zhang, MS; Amy Y. Pan, PhD; Theresa Mikhailov, MD, PhD</p><p>Medical College of Wisconsin, Milwaukee, WI</p><p><b>Financial Support</b>: Medical College of Wisconsin.</p><p><b>Background</b>: Previous guidelines lacked sufficient data to comment on the safety of enteral nutrition in critically ill children. A more recent study indicated that enteral nutrition was indeed safe for critically ill children receiving vasoactive medication. Additional data in adults indicated that septic shock patients treated with vasoactive medication and given early enteral nutrition outperformed patients given no nutrition. We retrospectively investigated a similar premise in pediatric patients to determine (1) the frequency of use of early enteral versus parenteral nutrition for patients admitted to the PICU with septic shock and receiving vasoactive medication and (2) the impact of early enteral versus parenteral nutrition on PICU length of stay (LOS) and mortality for patients admitted with septic shock and treated with vasoactive medication. 
We hypothesized that (1) clinical practices have changed over recent years such that early enteral nutrition was administered more frequently to pediatric septic shock patients treated with vasoactive medication and (2) receiving early enteral nutrition as a PICU patient treated for septic shock with vasoactive medications was associated with better outcomes.</p><p><b>Methods</b>: We obtained demographic and outcome data for pediatric patients admitted to Children's Hospital of Wisconsin for septic shock and treated with vasoactive medications over a 5-year period from the Virtual Pediatric Systems, LLC (VPS) database, a data registry for PICU patients. We obtained clinical data, including details of enteral and parenteral nutrition administered and use of vasoactive medications, by chart review. We quantified the use of vasoactive medications by Vasoactive-Inotrope Score (VIS). We quantified the severity of illness by PRISM3 Probability of Death. We considered medical LOS and mortality as clinical outcomes. We compared categorical variables by Chi-square tests and compared continuous variables by Mann-Whitney tests or Kruskal-Wallis tests. <i>P</i> values &lt; 0.05 were considered statistically significant.</p><p><b>Results</b>: We identified 637 patients aged 0-21 years treated in the PICU with a diagnosis of septic shock. Of these, 401 received vasoactive medication, 183 received early enteral nutrition, and 81 received early parenteral nutrition. Those given early parenteral nutrition had longer LOS (median (IQR): 7.0 (2.2-23.2) days) than those not fed (median (IQR): 2.1 (1.1-5.1) days) (<i>p</i> &lt; 0.0001), but did not differ from those fed enterally (median (IQR): 7.9 (3.7-15.2) days) (<i>p</i> = 0.95). After controlling for severity of illness, patients who received early parenteral nutrition were more likely to die than those receiving early enteral nutrition or those who were not fed at all (parenteral vs. enteral: 17.8% vs. 4.60%, <i>p</i> = 0.002; parenteral vs. 
none: 17.28% vs. 6.70%, <i>p</i> = 0.002). Mortality did not differ between patients who received early enteral nutrition and those not fed (4.60% vs. 6.70%, <i>p</i> = 0.43).</p><p><b>Conclusion</b>: Early enteral nutrition was given more frequently than early parenteral nutrition. Early enteral nutrition was not significantly associated with improved outcomes as measured by length of stay and mortality, but early parenteral nutrition was associated with significantly worse outcomes. This suggests that clinical guidelines should favor the use of enteral feeding in septic shock patients receiving vasoactive medication.</p><p><b>Best of ASPEN - Critical Care and Critical Health Issues</b></p><p><b>International Abstract of Distinction</b></p><p>Lu Ke, PhD<sup>1</sup>; Cheng Lv, PhD Candidate<sup>1</sup>; Lingliang Zhou, MD Candidate<sup>2</sup>; Weiqin Li, PhD<sup>1</sup></p><p><sup>1</sup>Nanjing University, Nanjing, Jiangsu, China; <sup>2</sup>Southeast University, Nanjing, Jiangsu, China</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: There is controversy over the optimal early protein delivery in critically ill patients with acute kidney injury (AKI). This study aimed to evaluate whether the association between early protein delivery and 28-day mortality was modified by the presence of AKI in critically ill patients.</p><p><b>Methods</b>: This was a secondary analysis of a multicenter cluster-randomized controlled trial enrolling newly admitted critically ill patients (N = 2772). Participants with complete data on baseline renal function and 28-day mortality were included in this study. 
Cox proportional hazards models were used to investigate whether early protein delivery, reflected by mean protein delivery from day 3 to day 5 after enrollment, was associated with 28-day mortality and whether baseline AKI stages modified this association.</p><p><b>Results</b>: Overall, 2,618 patients were included (Table 1), among whom 628 (24.0%) had AKI at enrollment (118 stage I, 97 stage II, 413 stage III). Mean early protein delivery was 0.60 ± 0.38 g/kg/d among the study patients (Figure 1). In the overall study cohort, each 0.1 g/kg/d increase in protein delivery was associated with a 5% reduction in 28-day mortality (hazard ratio [HR] = 0.95; 95% confidence interval [CI] 0.92-0.98, <i>P</i> &lt; 0.001). Also, when early protein delivery was stratified by tertiles, the risk of 28-day mortality decreased in both the medium protein group (HR = 0.64; 95% CI 0.50-0.82, <i>P</i> &lt; 0.001) and the high protein group (HR = 0.71; 95% CI 0.55-0.91, <i>P</i> = 0.007) compared with low protein delivery, after adjusting for potential confounders (Figure 2). The association between early protein delivery and 28-day mortality in patients with different baseline AKI stages showed significant heterogeneity (adjusted interaction <i>P</i> = 0.047). With each 0.1 g/kg/d increase in protein delivery, the 28-day mortality decreased by 5% (HR = 0.95; 95% CI 0.92-1.00, <i>P</i> = 0.008) in patients without AKI and by 7% (HR = 0.93; 95% CI 0.86-0.99, <i>P</i> = 0.043) in those with AKI stage III, of whom 72% were on renal replacement therapy upon enrollment. However, these associations were not observed among AKI stage I and II patients. 
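Under the Cox model reported here, the per-increment hazard ratio compounds multiplicatively over larger differences in intake. A minimal sketch, assuming the model's log-linearity in the exposure (the 0.95 figure is the overall-cohort estimate from this abstract):

```python
HR_PER_INCREMENT = 0.95  # reported HR per 0.1 g/kg/d increase in protein delivery

def hazard_ratio(delta_g_per_kg_per_day: float) -> float:
    """HR implied by a given difference in mean early protein delivery,
    assuming log-linearity of the Cox model in the exposure."""
    increments = delta_g_per_kg_per_day / 0.1
    return HR_PER_INCREMENT ** increments

# Moving from 0.3 to 0.8 g/kg/d (a 0.5 g/kg/d difference) implies
# HR = 0.95**5, i.e. roughly a 23% lower instantaneous mortality hazard.
```

This compounding is why even a modest per-increment ratio can correspond to a sizable difference between the tertile groups.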
The mortality trends up to day 28 for early protein delivery in different AKI stage groups are depicted in Figure 3.</p><p><b>Conclusion</b>: Higher early protein delivery during days 3-5 of ICU stay was associated with improved 28-day mortality in critically ill patients without AKI and with AKI stage III, but not in those with AKI stage I or II.</p><p><b>Figure 3</b>. The trends of 28-day mortality with early protein delivery in different AKI stages.</p><p>Stanislaw J. Gabryszewski, MD, PhD<sup>1</sup>; David A. Hill, MD, PhD<sup>1,2</sup></p><p><sup>1</sup>Children's Hospital of Philadelphia, Philadelphia, PA; <sup>2</sup>University of Pennsylvania, Philadelphia, PA</p><p><b>Financial Support</b>: This work was supported by the National Institutes of Health (Grants T32HD043021 to SJG; K08DK116668 and R01HL162715 to DAH).</p><p><b>Background</b>: The ketogenic diet (KD) is a high-fat, moderate-protein, low-carbohydrate diet that induces ketosis, a metabolic shift characterized by the use of fatty acid-derived ketone bodies rather than glucose to meet energy needs. While the KD is best known as a dietary therapy for refractory epilepsy, there is growing interest in identifying other diseases in which the KD may be therapeutic. Recent studies have revealed the potential of the KD to dampen inflammation and pathology in mouse models of allergic asthma. However, it is unclear whether the KD has such immunoregulatory effects in other allergic diseases, such as the gastrointestinal allergy eosinophilic esophagitis (EoE).</p><p><b>Methods</b>: We studied the effects of the KD in a mouse model of eosinophilic esophagitis (EoE) in which 10-week-old C57BL/6 mice were topically treated with the vitamin D analog MC903 and the egg white allergen ovalbumin (OVA) on days 0 to 11 to induce eczema-like dermatitis and allergic sensitization, respectively. The effect of the KD following allergic sensitization was studied by feeding mice KD or a regular diet (RD) starting on day 12. 
Mice were provided with OVA-supplemented water and gavaged with OVA on days 18-20. On day 21, mice were harvested to quantify esophageal eosinophilia and to phenotype immune responses in draining lymph nodes via flow cytometry.</p><p><b>Results</b>: Following induction of EoE, mice in both the KD (n = 17) and RD (n = 17) arms exhibited 100% survival at day 21. Weight recovery (percent of original weight ± SEM) at day 21 was comparable between KD-fed (104.1 ± 1.7%) and RD-fed (99.0 ± 3.2%) mice (<i>p</i> &gt; 0.05). Analysis of esophageal eosinophilia at day 21 revealed significantly decreased numbers (total cells ± SEM) of Siglec-F<sup>+</sup> CD11b<sup>+</sup> eosinophils in KD-fed (711 ± 345 cells) versus RD-fed (880 ± 225 cells) mice (<i>p</i> &lt; 0.05). There was a non-significant reduction in the percentage of esophageal eosinophils (percent of CD45<sup>+</sup> cells ± SEM) in KD-fed (5.1 ± 1.2%) versus RD-fed (8.1 ± 1.5%) mice (<i>p</i> = 0.138). In immunophenotyping of phorbol myristate acetate and ionomycin-stimulated cells from draining lymph nodes at day 21, there was a significantly increased percentage (percent of CD4<sup>+</sup> T cells ± SEM) of Foxp3<sup>+</sup> T regulatory (Treg) cells in KD-fed (6.5 ± 1.1%) versus RD-fed (3.3 ± 0.4%) mice (<i>p</i> &lt; 0.01).</p><p><b>Conclusion</b>: In this mouse model of OVA-induced EoE, we observed a modest inhibitory effect of the KD on the recruitment of eosinophils to the esophagus. As compared with the RD, the KD was associated with increased proportions of Foxp3<sup>+</sup> Tregs in draining lymph nodes of mice with EoE. Additional mechanistic investigations are warranted, including determination of the necessity of Tregs for KD-induced inhibition of esophageal eosinophilia. This study highlights the promise of immunomodulatory dietary interventions in the context of allergic disease.</p><p>Hassan S. 
Dashti, PhD, RD<sup>1</sup>; Magdalena Sevilla, Ph.D.<sup>1</sup>; Kris Mogensen, MS, RD-AP, LDN, CNSC<sup>2</sup>; Charlene Compher, PhD, RD, LDN, FASPEN<sup>3</sup></p><p><sup>1</sup>Massachusetts General Hospital, Boston, MA; <sup>2</sup>Brigham and Women's Hospital, Boston, MA; <sup>3</sup>University of Pennsylvania School of Nursing, Philadelphia, PA</p><p><b>Financial Support</b>: Research reported in this publication was supported by the American Society for Parenteral and Enteral Nutrition (ASPEN) Rhoads Research Foundation awarded to Hassan S. Dashti.</p><p><b>Background</b>: Patients living with short bowel syndrome (SBS) receiving home parenteral nutrition (HPN) commonly receive nutritional infusions overnight contributing to sleep and circadian disruption. Aligning nutritional intake with the circadian clock is expected to yield high benefits to vulnerable populations by limiting circadian misalignment (i.e., a mismatch between the circadian system and behaviors) and influencing other pathways. Recent advancements in metabolic profiling techniques (systematic profiling of cellular metabolites, i.e., sugars, amino acids, organic acids, nucleotides, and lipids) have emerged as a promising tool for identifying relevant biological pathways. Our objective was to characterize metabolites that differ between daytime and overnight HPN infusions in adults with SBS habitually receiving HPN.</p><p><b>Methods</b>: The present study was a secondary analysis of a controlled, single-arm 2-week pilot and feasibility trial designed to compare daytime to overnight infusions of HPN in adults with SBS consuming HPN (ClinicalTrials.gov: NCT04743960). Enrolled patients received 1 week of HPN infusions overnight followed by 1 week of HPN infusions during the daytime (approximately 12-hour change in infusion start time). Duration, frequency, and composition of infusions remained identical during the two study periods. 
Following each 1-week study period, patients had a venous blood sample collected at clinical visits. Plasma samples were analyzed using Ultrahigh Performance Liquid Chromatography-Tandem Mass Spectrometry and global metabolic profiles were determined. Of 1015 measured metabolites, only 622 metabolites with non-missing data across all samples were analyzed. Data were normalized to the volume of sample extracted and then log-transformed and scaled with Z-score prior to analysis. Differential metabolite abundance between the two study periods (daytime vs. overnight) was determined using standard Linear Models for Microarray Data (LIMMA) adjusted for dietary fasting duration and time since the end of the last HPN infusion. Pathway enrichment analysis was then conducted using MetaboAnalyst's pathway enrichment tool.</p><p><b>Results</b>: Nine patients (mean age 52 years; 80% female; BMI 21.3 kg/m<sup>2</sup>) completed the trial and provided two fasting blood samples. Both blood draws were completed at approximately 11:20 am following at least an 8-hour fast and at least 8 hours from the end of an HPN infusion. Changes were detected in 36 metabolites at <i>P</i> &lt; 0.05; top-changing metabolites were mostly long-chain and polyunsaturated fatty acids (dihomo-gamma-linolenic acid, arachidonate (20:4n6), docosahexaenoate (DHA; 22:6n3)) and glycerolipids (Figure 1). No metabolites remained significant at the stringent <i>FDR</i> threshold. Enrichment analysis of the 36 metabolites identified pathways related to the biosynthesis of unsaturated fatty acids, D-arginine, D-ornithine metabolism, and linoleic acid metabolism, among others (Figure 2).</p><p>Astrid Verbiest, MSc<sup>1,2</sup>; Mark K. Hvistendahl, MS, PhD<sup>3</sup>; Federico Bolognani, MD, PhD<sup>4</sup>; Carrie Li, MS, PhD<sup>4</sup>; Nader N. Youssef, MD, MBA, FACG<sup>4</sup>; Francisca Joly, MD, PhD<sup>5</sup>; Palle B. 
Jeppesen, MD, PhD<sup>3</sup>; Tim Vanuytsel, Associate Professor<sup>1,2</sup></p><p><sup>1</sup>Leuven Intestinal Failure and Transplantation Center (LIFT), University Hospitals Leuven, Leuven, Belgium; <sup>2</sup>Translational Research Center for Gastrointestinal Disorders (TARGID), University of Leuven, Leuven, Belgium; <sup>3</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark; <sup>4</sup>VectivBio, Basel, Switzerland; <sup>5</sup>Centre for Intestinal Failure, Department of Gastroenterology and Nutritional Support, Hôpital Beaujon, Clichy, France</p><p><b>Financial Support</b>: This research was supported by VectivBio AG.</p><p><b>Background</b>: Short bowel syndrome (SBS) is a severe organ failure condition with a high risk of developing intestinal failure (SBS-IF) and life-long parenteral support (PS) dependence. Glucagon-like peptide-2 (GLP-2) analogs stimulate adaptation of the remaining intestine resulting in increased intestinal absorption and reduced PS needs. Extensive literature is available on the effect of the short-acting GLP-2 analog teduglutide in patients without a remaining colon. However, the impact of GLP-2 analogs on fluid and energy absorption in SBS-IF with a colon-in-continuity (CiC) is unclear. Apraglutide (APRA) is a novel, long-acting synthetic GLP-2 analog that is in development for SBS-IF. We performed a pre-defined interim analysis of a phase 2 study in SBS-IF-CiC to investigate the safety and efficacy of 4 weeks of apraglutide treatment based on metabolic balance studies (MBS).</p><p><b>Methods</b>: STARS Nutrition is a 52-week multicenter, open-label phase 2 study in adult patients with SBS-IF-CiC receiving once-weekly subcutaneous apraglutide injections (5 mg). MBS were performed at baseline and after 4 weeks with stable PS, followed by a 48-week PS adjustment period. During MBS, fluid intake was kept constant (individual predefined drinking menu). 
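The relative wet weight absorption endpoint of a metabolic balance study is derived from the measured intake and fecal output weights. A sketch of the arithmetic (the trial's exact operational formula may differ, for example in how urine output is handled):

```python
def wet_weight_absorption_pct(wet_intake_g: float, fecal_output_g: float) -> float:
    """Percent of ingested wet weight absorbed:
    (intake - fecal output) / intake * 100."""
    return 100.0 * (wet_intake_g - fecal_output_g) / wet_intake_g

# e.g. 3000 g/day wet weight intake with 1200 g/day fecal wet weight
# output corresponds to 60% relative wet weight absorption; a treatment
# that lowers fecal output while intake is held constant raises this value.
```

Because fluid intake was kept constant during the balance periods, a fall in fecal wet weight output translates directly into a rise in this percentage.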
Duplicates of meals and fluids (wet weight intake), urine, and feces (fecal wet weight output) were collected. Safety was the primary endpoint. Secondary endpoints included changes in fecal wet weight output, urinary output, wet weight, and energy absorption. Data are presented as mean (95% CI). <i>P</i> values &lt; 0.05 were considered significant (Wilcoxon matched-pairs signed rank test).</p><p><b>Results</b>: Nine patients were included and comprised the full study population. Apraglutide was well tolerated with no dose discontinuation or interruption. No adverse events (AEs) were considered notable based on their nature or severity. At baseline, patients received a weekly PS volume of 10 (range 4-21) L. Small bowel length was 19 (range 0-50) cm and 79 (range 43-100)% of the colon was in continuity. Fecal wet weight output decreased significantly by 253 (−437 to −68) g/day (<i>p</i> = 0.012). Relative wet weight absorption increased by 9 (1 to 18) % (<i>p</i> = 0.039). There was a numeric increase in urinary output (<i>p</i> = 0.129). No significant changes in energy absorption were observed (Table 1).</p><p>Palle B. 
Jeppesen, MD, PhD<sup>1</sup>; Tim Vanuytsel, Associate Professor<sup>2</sup>; Sukanya Subramanian, Physician<sup>3</sup>; Francisca Joly, MD, PhD<sup>4</sup>; Geert Wanten, Physician<sup>5</sup>; Georg Lamprecht, Physician, Professor<sup>6</sup>; Marek Kunecki, MD<sup>7</sup>; Farooq Rahman, Physician<sup>8</sup>; Thor Nielsen, Statistician<sup>9</sup>; Lykke Graff, MD<sup>9</sup>; Mark Hansen, Physician<sup>9</sup>; Ulrich Pape, Physician<sup>10</sup>; David Mercer, Physician<sup>11</sup></p><p><sup>1</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark; <sup>2</sup>UZ Leuven, Leuven, Belgium; <sup>3</sup>MedStar Georgetown, Washington, DC; <sup>4</sup>Centre for Intestinal Failure, Department of Gastroenterology and Nutritional Support, Hôpital Beaujon, Clichy, France; <sup>5</sup>Radboud University Nijmegen Medical Centre, Nijmegen, Netherlands; <sup>6</sup>University Medical Center Rostock, Rostock, Germany; <sup>7</sup>M. Pirogow Hospital, Wolczanska, Poland; <sup>8</sup>University College London Hospitals, London, United Kingdom; <sup>9</sup>Zealand Pharma A/S, Copenhagen, Denmark; <sup>10</sup>ASKLEPIOS Klinik St. Georg, Hamburg, Germany; <sup>11</sup>Nebraska Medical Center, NE</p><p><b>Financial Support</b>: Zealand Pharma A/S Supported Research.</p><p><b>Background</b>: Reduction of parenteral support (PS) is important for improved outcomes in short bowel syndrome (SBS) patients with intestinal failure (IF). Clinically meaningful within-patient change in PS volume has to date been regarded as a ≥ 20% reduction. This threshold is, however, based on clinical experience, and to our knowledge no data-driven analysis has quantified what constitutes a meaningful change in PS volume from a patient perspective. 
Glepaglutide, a long-acting GLP-2 analog, reduces PS volume needs and improves patient global impression of change (PGIC), a patient-reported outcome (PRO) tool, in SBS-IF patients. Here we report a quantitative analysis of meaningful change in PS volume using PGIC following glepaglutide treatment in the Efficacy and Safety Evaluation (EASE) SBS 1 trial.</p><p><b>Methods</b>: EASE SBS 1 is a multi-center, placebo-controlled, randomized, parallel-group, double-blind phase 3 trial (NCT03690206). Adult patients with chronic SBS-IF requiring PS at least 3 days per week were recruited. Patients were randomized to 24 weeks of treatment with SC injections of either 10 mg glepaglutide twice-weekly (TW), 10 mg glepaglutide once-weekly (OW), or placebo. PS volume requirements were evaluated and adjusted using regular fluid balance periods. The primary endpoint was a reduction in weekly PS volume from baseline to week 24. Patients rated their change in overall status since the start of the trial to weeks 12 and 24 by PGIC, using a 7-point Likert scale (ranging from very much worse to very much improved). Anchor-based analyses using scatter plots and empirical cumulative distribution functions (eCDFs) were applied to assess the association between PGIC categorical data and % change in PS volume from baseline to weeks 12 and 24. Anchor-based methods are used as external criteria to gain knowledge about what is clinically meaningful to patients based on known anchoring measures.</p><p><b>Results</b>: 99 of the 106 randomized patients completed the trial. Glepaglutide TW treatment significantly reduced mean PS requirements by 47% (5.13 L/wk) from baseline. Improvement in PGIC was shown with significant differences relative to placebo for both glepaglutide TW (<i>p</i> = 0.002) and OW (<i>p</i> &lt; 0.0001). 
Using the blinded data sample, the association between PGIC and the PS volume % change from baseline to week 24 showed that the two endpoints were correlated, with Spearman rank-order and Kendall's tau-b correlation coefficients of 0.353 and 0.285, respectively. After 12 weeks of treatment, the association appeared stronger. Inspection of the eCDFs supported the appropriateness of a 20% threshold for PS volume reduction.</p><p><b>Conclusion</b>: Anchor analysis, using PGIC as the anchor measurement, showed that a 20% reduction in PS volume, an outcome measure used in clinical trials, is clinically meaningful to SBS patients.</p><p><b>Abstract of Distinction</b></p><p>Ji Seok Park, MD, MPH<sup>1</sup>; Naseer Sangwan, PhD<sup>1</sup>; Lauren Menke<sup>2</sup>; Gail Cresci, PhD, RD, LD, FASPEN<sup>1</sup></p><p><sup>1</sup>Cleveland Clinic, Cleveland, OH; <sup>2</sup>Case Western Reserve University, Cleveland, OH</p><p><b>Financial Support</b>: 4R00AA023266 (GC) and Standard Process.</p><p><b>Background</b>: A synbiotic is a physical combination of a prebiotic and a probiotic with a general goal of maintaining probiotic viability through co-packaging with its food source. Despite its wide availability, evidence to support its use in a healthy population is limited. This study aimed to test the feasibility and safety of a targeted synbiotic and its effects on gastrointestinal symptoms and the gut microbiota.</p><p><b>Methods</b>: This was a double-blinded, randomized, placebo-controlled, paired crossover pilot study in healthy adults to test the effects of a targeted synbiotic on gut microbiota diversity and abundance. The targeted synbiotic consisted of 2 probiotic strains, <i>Lactobacillus reuteri</i> 3613 (1 × 10<sup>9</sup> CFU) and <i>Lactobacillus plantarum</i> 276 (1 × 10<sup>11</sup> CFU), and a resistant starch (RS) prebiotic NuBana<sup>TM</sup> RS65G Green Banana Flour (3.84 g/d). 
Thirty-four healthy participants meeting the pre-defined criteria were enrolled, based on a sample size calculation indicating that 24 completers were needed to achieve 91% power at a 5% significance level. Participants were randomized to consume the synbiotic versus maltodextrin placebo for 28 days, followed by a 21-day washout period, and then they crossed over to consume the other supplement for 28 days. Gastrointestinal symptoms were assessed, and fecal samples were collected before and after each supplement period. Fecal samples were analyzed by 16S rRNA sequencing, and the Divisive Amplicon Denoising Algorithm 2 (DADA2) and Ribosomal Database Project (RDP) classifier were used for taxonomic profiling. Alpha-diversity was assessed using the Shannon diversity index, and beta-diversity was assessed using Bray-Curtis dissimilarity. Differential abundance analysis was used to identify taxa that differed significantly between the synbiotic and placebo groups. The study was approved by the Cleveland Clinic Institutional Review Board.</p><p><b>Results</b>: Thirty-four participants (13 male, 21 female) were randomized into the study, and 28 participants completed the study with an average age of 32 ± 7 years. Shannon diversity index of fecal samples was higher when participants were taking the synbiotic compared to placebo (<i>P</i> = 0.021), suggesting higher microbial richness and evenness during synbiotic consumption. Bray-Curtis dissimilarity was calculated between the synbiotic group and the placebo group and then was visualized using Principal Coordinates Analysis (PCoA), which showed 2 separate but overlapping groups. Differential abundance analysis identified 11 taxa, including the butyrate-producing genera <i>Akkermansia</i> and <i>Butyricimonas</i>, that differed significantly between the synbiotic and placebo supplements. 
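The Shannon diversity index used for the alpha-diversity comparison is computed directly from taxon proportions. A minimal sketch:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln(p_i)) over observed taxon
    proportions; higher values indicate greater richness and evenness."""
    total = sum(counts)
    props = (c / total for c in counts if c > 0)
    return -sum(p * math.log(p) for p in props)

# A perfectly even 4-taxon community scores ln(4) ~ 1.386, while a
# community dominated by a single taxon scores closer to 0.
```

In practice the index is computed per sample from the DADA2 feature table and then compared between supplement periods.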
All subjects tolerated the supplements well, reporting no changes in GI symptoms.</p><p><b>Conclusion</b>: This pilot study shows that a targeted synbiotic supplement favorably modified gut microbiome diversity and taxa abundance in healthy subjects. Further studies are warranted to test the effects of this targeted synbiotic in clinical scenarios with known gut dysbiosis to determine if modifications can be sustained and associated with disease.</p><p>Kaitlyn Daff, MA, RD, LDN<sup>1</sup>; Gail Cresci, PhD, RD, LD, FASPEN<sup>2</sup></p><p><sup>1</sup>Case Western Reserve University/Cleveland Clinic Lerner Research Institute, Cleveland, OH; <sup>2</sup>Cleveland Clinic, Cleveland, OH</p><p><b>Financial Support</b>: NIH-National Institute of Alcohol Abuse and Alcoholism.</p><p><b>Background</b>: Alcohol use disorder is the leading cause of liver disease in the United States<sup>1</sup>, with an estimated 80% of patients with alcohol-associated end-stage liver disease (AA-ESLD) also presenting with clinical malnutrition and sarcopenia<sup>2</sup>. Gut dysbiosis in alcohol-associated liver disease (ALD) has been well characterized in the literature, with shifts from a Bacteroidetes- and Firmicutes-dominated population towards an increased abundance of Proteobacteria<sup>3</sup>. Although it is known that the gut microbiome plays a role in the metabolism and production of amino acids, how alcohol-associated gut dysbiosis influences host amino acid homeostasis is less understood. We aimed to test whether the amino acid metabolite profile in patients with AA-ESLD is distinct from that of patients without disease pathology and whether this correlates with changes in the gut microbiota.</p><p><b>Methods</b>: A secondary data analysis was performed from a larger, single-center, non-randomized prospective pilot study in patients awaiting liver transplantation to characterize metabolomic changes in amino acid homeostasis. 
Urine samples were collected within 24 hours prior to liver transplant, adjusted for urine osmolality, and untargeted metabolomic analysis by UPLC-MS/MS was performed. Fecal samples collected within 24 hours of liver transplant were sequenced and analyzed using 16S rRNA for profiling. Welch's paired t-tests were generated to determine statistically significant changes in metabolite mean scaled intensities between AA-ESLD and healthy control patients. Spearman's correlations were used to identify associations between amino acid metabolites and gut microbial taxa.</p><p><b>Results</b>: Analysis of the urinary metabolome between AA-ESLD patients (n = 11) and healthy control patients (n = 18) revealed distinct amino acid profiles between groups. Welch's paired t-tests identified that arginine (<i>p</i> = 0.0016), glutamate (<i>p</i> = 0.0289), tyrosine (<i>p</i> = 0.0003), phenylalanine (<i>p</i> = 0.0002), asparagine (<i>p</i> = 0.0005), tryptophan (<i>p</i> = 0.0001), cystine (<i>p</i> = 0.0017) and taurine (<i>p</i> = 0.0480) were all significantly increased in AA-ESLD patients. When Spearman's correlations were generated, significant positive correlations were identified between Gammaproteobacteria genera and phenylalanine (<i>p</i> = 0.0167) and tryptophan (<i>p</i> = 0.0349). These data suggest that the microbiome may contribute to the increased concentrations of these amino acids in the urine. Gammaproteobacteria were also positively correlated with glutamine (<i>p</i> = 0.0151) and histidine (<i>p</i> = 0.0476), while negative correlations were found with glycine (<i>p</i> = 0.0071) and creatinine (<i>p</i> = 0.0341).</p><p><b>Conclusion</b>: Urinary amino acid metabolites differ between AA-ESLD patients and those without liver disease. As patients must abstain from alcohol for ~6 months to be eligible for a liver transplant, these data suggest residual effects of AA-ESLD on amino acid homeostasis. 
Correlations between the microbiome and amino acid metabolites suggest that the unique microbial shifts associated with ALD may play a role in these observed changes to amino acid metabolism.</p><p>Stephanie Merlino Barr, MS, RDN, LD<sup>1,2</sup>; Rosa Hand, PhD, RDN, LD, FAND<sup>2</sup>; Marc Collin, MD<sup>1,2</sup>; Thomas E. Love, PhD<sup>1,2</sup>; Sharon Groh-Wargo, PhD, RDN<sup>1,2</sup></p><p><sup>1</sup>MetroHealth Medical Center, Cleveland, OH; <sup>2</sup>Case Western Reserve University, Cleveland, OH</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Diagnostic criteria for neonatal malnutrition were proposed in 2018 by field experts. This tool has not been validated since its publication. The objective of this study was to assess the agreement and reliability of both the overall malnutrition tool and individual indicators to evaluate how consistently the proposed criteria identify malnutrition in preterm infants.</p><p><b>Methods</b>: A single-center, retrospective cohort study was performed at a level III Neonatal Intensive Care Unit (NICU). The cohort included all preterm infants born between June 2013 and August 2022, who were admitted to the NICU for at least 3 days and did not die before discharge. Malnutrition diagnoses (none/mild/moderate/severe) were assigned to each patient for each indicator, as defined in Table 1; multiple definitions for individual indicators were used to reflect different potential approaches of assessment (eg, growth velocity), or to reflect differences in patient populations (eg, protein and energy intake). The kappa (k) value was used to assess the neonatal malnutrition diagnostic tool's overall inter-indicator reliability; this was calculated separately for indicators used to assess malnutrition in the first two weeks of life and after the first two weeks of life. 
Each indicator's diagnosis was compared individually to all other indicators' diagnoses to assess inter-indicator reliability; proportion of overall agreement, McNemar's test statistic, and kappa value were calculated. Acceptable agreement was defined as k &gt; 0.8.</p><p><b>Results</b>: A total of 2946 infants were included in this study. The k values for the malnutrition tool overall indicated poor inter-indicator reliability; for malnutrition diagnoses in the first two weeks of life k = 0.054; for diagnoses after the first two weeks of life k = 0.048. Figure 1 depicts the weighted k values for all comparisons of individual indices. Figure 2 depicts the proportions of overall agreement. For example, the weight gain velocity (approach 1) compared to the energy intake malnutrition diagnosis criteria had n = 954 subjects, k = 0.09, and a proportion of overall agreement of 0.28, indicating that both inter-indicator reliability and accuracy were poor. Commonly cited generalized weight gain velocity goals (approaches 2 &amp; 3) had good accuracy and inter-indicator reliability with the recommended method (approach 1) of determining goal weight gain velocity by maintaining weight-for-age z-score (1 vs. 2 k = 0.92, 1 vs. 3 k = 0.88). The generalized linear growth goal (approach 2) had poor accuracy and inter-indicator reliability with the recommended method (approach 1) (k = 0.12). All comparisons of unique indices for malnutrition diagnosis had detectable disagreement in diagnosis patterns as assessed by McNemar's test statistic.</p><p>Amber Hager, BSc, RD; Yiqi Wang, BSc; Sandy Hodgetts, PhD, OT; Lesley Pritchard, PhD, PT; Vera Mazurak, PhD; Susan Gilmour, MD, MSc, FRCPC; Diana R. 
Mager, MSc, PhD, RD</p><p>University of Alberta, Edmonton, AB, Canada</p><p><b>Financial Support</b>: 2022 ASPEN Rhoads Research Foundation Grant.</p><p><b>Background</b>: Measurement of body composition in young infants and children with chronic liver disease (CLD) can be challenging due to fluid overload, lack of healthy reference data, and a lack of non-invasive, validated methods for use at the bedside. The use of ultrasonography to serially measure changes in muscle thickness overcomes many of these limitations, but little comparable data is available in young infants and children (&lt;5 y). The study purpose was to serially measure changes in total bicep, calf, and thigh muscle layer thickness (MLT), subcutaneous adipose tissue thickness (SAT-T), and motor (gross/fine) development in infants and children (&lt;5 y) with CLD. We hypothesized that the trajectory of MLT (thigh, bicep, calf) and SAT-T would be significantly impacted by CLD and would be informative of gross motor development in infants and children (&lt;5 y).</p><p><b>Methods</b>: Infants and children (4 mo-5 y) with CLD (n = 11) and age-matched controls (CON; n = 16) were recruited from the Pediatric Liver Clinics/Liver Transplant Clinics at the Stollery Children's Hospital and the community. Participants underwent 2 serial measurements, at baseline and after 6 months, of (1) MLT, echo intensity, and SAT-T of the biceps brachii (BB), rectus femoris (RF), rectus intermedius (RI), soleus, and gastrocnemius (GN) using ultrasound (U/S) and (2) gross motor assessment (Peabody Developmental Motor Scales, 2nd Edition [PDMS-2]) in CLD only. 
Additional variables collected included demographics (age, sex, CLD diagnosis, PELD), SGNA scores, anthropometrics (wt-z, ht-z, head circumference [hc-z]), body composition (fat-free mass [FFM]/fat-mass [FM] using BIA), multiple skinfold thicknesses (SFT; triceps [TSF], biceps, suprailiac, subscapular), and mid-arm circumference (MAC-z).</p><p><b>Results</b>: CLD etiology included 73% biliary atresia (n = 8) and 27% other (n = 1 acute liver failure; n = 2 TPN-related cholestasis). No significant differences in age (years), sex, wt-z, ht-z, hc-z, MAC-z, TSF-z, or subscapular-z were noted between groups at baseline (<i>p</i> &gt; 0.05). Thirty percent of CLD children had SGNA scores indicative of mild-moderate malnutrition (SGNA ≥ 2). Total thigh, RI, and soleus MLT were significantly lower in CLD vs CON, and thigh SAT was higher in CLD after 6 months (<i>p</i> &lt; 0.05). This was particularly evident in CLD children ≤ 2 years, who had significantly lower total thigh, RI, RF, and soleus MLT than CON at baseline and after six months (<i>p</i> &lt; 0.05). Total thigh, RI, and RF MLT (absolute, % change over 6 months) were positively related to BIA-FFM measures (r<sup>2</sup> = 0.46-0.47; <i>p</i> &lt; 0.001) and to total motor quotient and gross motor quotient scores (absolute, percentile; r<sup>2</sup> = 0.47; <i>p</i> &lt; 0.001), but not to fine motor quotients (absolute, percentile) of the PDMS-2, particularly in CLD children (&lt;2 y). Bicep and calf measures (MLT, SAT) were not associated with total motor, gross motor, or fine motor quotients (absolute, percentile) in CLD children.</p><p><b>Conclusion</b>: Children with CLD had significantly lower measures of muscle thickness and higher measures of SAT than CON. 
Serial measurement of thigh MLT may be informative of the trajectory of fat-free mass and gross motor skill development in young children with CLD.</p><p><b>Abstract of Distinction</b></p><p>Anita Nucci, PhD, RD<sup>1</sup>; Hillary Bashaw, MD<sup>2</sup>; Alexander Kirpich, PhD<sup>1</sup>; Jeffrey Rudolph, MD<sup>3</sup></p><p><sup>1</sup>Georgia State University, Atlanta, GA; <sup>2</sup>Children's Healthcare of Atlanta, Atlanta, GA; <sup>3</sup>UPMC Children's Hospital of Pittsburgh, Pittsburgh, PA</p><p><b>Financial Support</b>: Takeda Pharmaceuticals.</p><p><b>Background</b>: Although survival for children with intestinal failure (IF) has improved with parenteral nutrition (PN), many still fail to maintain adequate somatic growth after achieving enteral autonomy. Few studies have examined growth after weaning from PN, and outcomes have been inconsistent. A glucagon-like peptide-2 (GLP-2) analog has been shown to reduce the volume of and time on PN in some children with short bowel syndrome with 6 months of use. The effect of this analog on growth is unknown. We aim to describe growth patterns in children with IF after PN weaning and during treatment with a GLP-2 analog.</p><p><b>Methods</b>: This retrospective observational study was conducted at two centers for pediatric intestinal rehabilitation (IR) in the US. Eligibility criteria included a diagnosis of IF (PN use ≥60 days within a 74 consecutive day interval) at &lt;12 months of age. Patients were referred for IR between September 1989 and January 2023. Z-score values for weight and length/height (adjusted for gestational age up to 2 years of age) are described in those who weaned from PN and in those who received a GLP-2 analog (Gattex®) for ≥6 months (2017-2023).</p><p><b>Results</b>: There were 362 children (57% male, 72% white) with a median age at diagnosis of 6 days (interquartile range [IQR] 1,22) eligible for the study. 
Common diagnoses included necrotizing enterocolitis (28%), gastroschisis (23%), and small bowel atresia (16%). The median gestational age was 34 weeks (IQR 31,37), the percent small bowel remaining at diagnosis was 23% (IQR 10,50), and 36% had a functional ileocecal valve. One hundred forty-five children (40%) were successfully weaned from PN (median time to wean = 1.5 y [IQR 1,2.9]), and 123/145 (85%) achieved enteral autonomy (maintenance of normal growth for &gt;3 consecutive months). Median weight and length/height z-scores at the time of PN weaning were −1.04 (IQR −2.09, −0.12) and −1.86 (IQR −3.01, −0.69), respectively. After weaning from PN, weight and linear growth velocity were maintained in 44% and 39% of children, respectively, in year 1 and in 59% and 55% in year 2. Acceleration in weight and linear growth velocity was observed in 28% and 34%, respectively, in year 1 and in 22% and 31% in year 2. Fourteen children received a GLP-2 analog for a median of 912 days (IQR 365,1304). Of these, 3 were weaned from parenteral support within 9 months. Changes in weight and linear growth velocity z-scores between GLP-2 start and 2 years post-initiation are shown in Table 1.</p><p>Annemarie Rompca, MD<sup>1</sup>; Morgan McLuckey, MD<sup>2</sup>; Anthony J. Perkins<sup>3</sup>; Xiaoyi Zhang, MD, PhD<sup>1</sup>; Charles Vanderpool, MD<sup>1</sup></p><p><sup>1</sup>Riley Hospital for Children, Indianapolis, IN; <sup>2</sup>Department of Radiology, Indianapolis, IN; <sup>3</sup>Indiana University School of Medicine, Indianapolis, IN</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: Inflammatory bowel disease (IBD) can impact patients' nutritional status. Poor oral intake, poor absorption of nutrients, protein loss in stool, and increased energy requirements can contribute to poor nutrition in this patient population. Poor nutritional status can manifest as poor growth, poor weight gain, and sarcopenia, defined as decreased muscle mass and strength. 
Studies have demonstrated that decreased muscle mass in pediatric IBD patients leads to a need for escalated therapy, an increased need for surgery, and an increased risk of post-operative complications. We sought to measure the muscle mass of our cohort at IBD diagnosis on cross-sectional imaging, compare it to known age- and sex-specific psoas muscle reference values for pediatric norms, and analyze differences in muscle mass between IBD subtypes and correlations with anthropometrics at diagnosis.</p><p><b>Methods</b>: This study is a single-center retrospective study at a tertiary care facility. Patients with new diagnoses of IBD [Crohn's disease (CD), ulcerative colitis (UC), and indeterminate colitis (IC)] ages 6 to 16 at diagnosis from May 15, 2018, through December 31, 2019, were included. Those who had chronic medical conditions or no accessible cross-sectional imaging within 3 months of diagnosis were excluded. Demographic and anthropometric data at diagnosis of IBD were obtained. The psoas muscle area in mm<sup>2</sup> was measured on cross-sectional imaging at lumbar level 3-4 (L3-4) and lumbar level 4-5 (L4-5) bilaterally. Right and left measurements were added together to obtain the total psoas muscle area (TPMA) at each level. These measurements were compared to pediatric psoas muscle area reference values. We used analysis of variance to determine if outcomes differed by IBD type. Spearman correlations were used to assess the relationship between anthropometric measures and outcomes of interest. All analyses were performed using SAS v9.4.</p><p><b>Results</b>: Cross-sectional imaging from 70 patients with newly diagnosed IBD was reviewed. The average age was 11.9 years, with a male predominance of 42 patients (60%). Most patients were diagnosed with CD (n = 50, 71.4%), followed by UC (n = 17, 24.3%), and then IC (n = 3, 4.3%). The mean z-score for all patients' TPMA at L3-4 was −1.7. The mean z-score for all patients' TPMA at L4-5 was −1.4 (Table 1). 
Measures of sarcopenia at both lumbar levels for TPMA and z-score at L3-4 were significantly different across IBD types (CD vs UC vs IC) (Table 2).</p><p><b>Best of ASPEN - Pediatric, Neonatal, Pregnancy, and Lactation</b></p><p><b>Abstract of Distinction</b></p><p>Adam Russman, MD<sup>1</sup>; Anne McCallister, CPNP<sup>2</sup>; Anthony J. Perkins<sup>3</sup>; Charles Vanderpool, MD<sup>4</sup></p><p><sup>1</sup>Children's Medical Center of Dallas, Dallas, TX; <sup>2</sup>Riley Hospital for Children at Indiana University Health, Indianapolis, IN; <sup>3</sup>Indiana University School of Medicine, Indianapolis, IN; <sup>4</sup>Riley Hospital for Children, Indianapolis, IN</p><p><b>Financial Support</b>: None Reported.</p><p><b>Background</b>: The Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition (AND/ASPEN) published malnutrition guidelines in 2014. Literature describing clinical outcomes in hospitalized children with a malnutrition diagnosis is limited and few studies focus on the impact of malnutrition severity subtype on clinical outcomes.</p><p><b>Methods</b>: We analyzed patients admitted to our pediatric hospital from 2019 to 2022, excluding maternal/obstetrics admissions. Patients were diagnosed with malnutrition and assigned severity subtype by a registered dietitian according to AND/ASPEN guidelines. Unspecified malnutrition was assigned if there was insufficient physician documentation to determine the malnutrition severity subtype. Data on readmission rate, mortality, length of stay (LOS), LOS index, hospital cost, operative procedure (OR, any procedure), and pediatric intensive care unit (ICU) admission were collected. Clinical outcomes were also analyzed based on the malnutrition severity subtype and compared to patients who were not diagnosed with malnutrition. We used the natural log (LOS + 1) and natural log (costs+1) for LOS and cost analyses since both variables were highly skewed. 
Mixed effects regression analysis was completed to account for the clustering of repeated admissions. All analyses were performed using SAS v9.4.</p><p><b>Results</b>: Any malnutrition diagnosis was associated with a higher 7-, 14-, and 30-day readmission rate compared to patients without a malnutrition diagnosis. Malnourished patients had a higher mortality rate, median LOS, LOS index, cost, ICU admission rate, and operative procedure rate compared to patients without a malnutrition diagnosis (Table 1). Table 2 represents an analysis based on malnutrition severity subtype. Patients with mild, moderate, and severe malnutrition all had significantly higher readmission rates at 7-, 14-, and 30-day time points compared to patients with no malnutrition. Patients with unspecified malnutrition had a higher readmission rate at only 30 days. At all three readmission time points, there were no significant differences in readmission rates between malnutrition severity categories. The only malnutrition subtype with a significantly increased rate of mortality compared to no malnutrition was patients with severe malnutrition (<i>p</i> = 0.005). Admissions with mild, moderate, unspecified, and severe malnutrition had significantly higher LOS index, LOS, and total costs than admissions without a malnutrition diagnosis. Mild malnutrition admissions had a significantly higher LOS index than moderate (<i>p</i> = 0.050) and severe (<i>p</i> = 0.014) malnutrition while unspecified severity admissions had a significantly higher LOS index than severe admissions (<i>p</i> = 0.026). Mild (<i>p</i> = 0.032), moderate (<i>p</i> = 0.015) and severe (<i>p</i> = 0.001) malnutrition admissions had significantly higher LOS than unspecified severity admissions. 
Mild (<i>p</i> = 0.011) malnutrition admission had significantly higher costs than admission with unspecified malnutrition.</p>","PeriodicalId":16668,"journal":{"name":"Journal of Parenteral and Enteral Nutrition","volume":"48 S1","pages":"S5-S59"},"PeriodicalIF":4.1000,"publicationDate":"2024-02-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/jpen.2601","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Parenteral and Enteral Nutrition","FirstCategoryId":"3","ListUrlMain":"https://aspenjournals.onlinelibrary.wiley.com/doi/10.1002/jpen.2601","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"NUTRITION & DIETETICS","Score":null,"Total":0}
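The neonatal malnutrition tool abstract above quantifies inter-indicator agreement with the kappa statistic, taking k > 0.8 as acceptable agreement. A minimal pure-Python sketch of (unweighted) Cohen's kappa for two sets of severity ratings, using hypothetical labels rather than study data (the abstract's weighted-kappa variant is not reproduced here):

```python
def cohens_kappa(ratings_a, ratings_b, categories):
    """Cohen's kappa: chance-corrected agreement between two raters.

    kappa = (p_observed - p_expected) / (1 - p_expected)
    """
    n = len(ratings_a)
    assert n == len(ratings_b) and n > 0
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(1 for a, b in zip(ratings_a, ratings_b) if a == b) / n
    # Expected chance agreement under independence of the two raters.
    p_e = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
              for c in categories)
    if p_e == 1.0:  # degenerate case: both raters use one category only
        return 1.0
    return (p_o - p_e) / (1 - p_e)

# Hypothetical malnutrition severity labels for 8 infants (not study data):
a = ["none", "mild", "mild", "moderate", "none", "severe", "mild", "none"]
b = ["none", "mild", "moderate", "moderate", "none", "severe", "none", "none"]
kappa = cohens_kappa(a, b, ["none", "mild", "moderate", "severe"])
```

With these hypothetical labels, observed agreement is 6/8 but expected chance agreement is substantial, so kappa lands well below the 0.8 acceptability cutoff, which is the same pattern the abstract reports for the tool overall.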

Abstract

Sunday, March 3, 2024

SU30 Parenteral Nutrition Therapy

SU31 Enteral Nutrition Therapy

SU32 Malnutrition, Obesity, Nutrition Practice Concepts, and Issues

SU33 Critical Care and Critical Health Issues

SU34 GI and Other Nutrition and Metabolic-Related Topics

SU35 Pediatric, Neonatal, Pregnancy, and Lactation

Ji Seok Park, MD, MPH; Mohamed Tausif Siddiqui, MD; Kristin Izzo, RD; Sara Yacyshyn, MD; Allison Doriot, RD; Aje Kent, MD; Elizabeth Gallant, RD; Miguel Salazar, MD; Eileen Hendrickson, PharmD; Adriana Panciu, PharmD; Basma Rizk, PharmD; Ann Dugan, RN; James Bena, MS; Shannon Morrison, MS; Ruishen Lyu, MS; Anil Vaidya, MD; Gail Cresci, PhD, RD, LD, FASPEN; Donald F. Kirby, MD, FACP, FACN, FACG, AGAF, FASPEN, CNSC, CPNS

Cleveland Clinic, Cleveland, OH

Financial Support: Cleveland Clinic Center for Human Nutrition Morrison Research and Development Funding.

Background: Preventing catheter-related bloodstream infection (CRBSI) is an essential component of managing patients with chronic intestinal failure who depend on home parenteral nutrition (HPN). Ethanol lock therapy is an effective, evidence-based strategy for decreasing the risk of CRBSI; however, it has become less available due to supply chain issues, so other strategies are needed. The SQ53 wipe is a novel antimicrobial wipe based on a proprietary compound with residual efficacy beyond 24 hours. It is registered under the European Union Biocidal Products Regulation but is not registered with the U.S. Food and Drug Administration. This study aimed to evaluate the effectiveness of the SQ53 wipe in preventing CRBSI in patients receiving HPN. The study was registered on ClinicalTrials.gov (NCT04822467).

Methods: A single-blinded, randomized, placebo-controlled trial was designed. About 200 patients meeting pre-defined criteria were contacted, and 60 patients were recruited between December 10, 2021, and June 3, 2022, per the sample size calculation. Patients were randomized to a treatment group (SQ53 wipe) or a control group (alcohol wipe), with stratified randomization based on CRBSI risk category (low, high, new) and type of central venous catheter (CVC; tunneled, non-tunneled). Patients were instructed to clean their CVCs with the assigned wipe before and after HPN infusion, following specific instructions. An interim analysis for both efficacy and futility was planned for when the last patient reached 6 months post-randomization. Analyses used Poisson regression to compare all CRBSI (confirmed and suspected), confirmed CRBSI, and CVC exchanges between the two groups. Additional analyses compared outcomes between the 6 months prior to the study and the time in the study, using each patient as their own historical control. Both intention-to-treat (ITT) and per-protocol (PP) (>90% adherence) analyses were used.
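The group comparisons described above boil down to comparing event rates over catheter days. A minimal sketch of a rate ratio with a Wald confidence interval on the log scale, using hypothetical counts rather than trial data (the trial itself fit Poisson regression models):

```python
import math

def rate_per_1000(events, catheter_days):
    """Event rate per 1,000 catheter days."""
    return 1000.0 * events / catheter_days

def rate_ratio_ci(events_a, days_a, events_b, days_b, z=1.96):
    """Rate ratio (group A vs group B) with an approximate 95% Wald CI.

    The standard error of log(RR) for two Poisson counts with person-time
    denominators is sqrt(1/events_a + 1/events_b).
    """
    rr = (events_a / days_a) / (events_b / days_b)
    se = math.sqrt(1.0 / events_a + 1.0 / events_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical example: 4 infections over 5,200 days vs 6 over 5,000 days.
rr, lo, hi = rate_ratio_ci(4, 5200, 6, 5000)
pct_lower_risk = (1.0 - rr) * 100.0  # e.g., RR of 0.66 reads as "34% lower risk"
```

The "34% lower risk" phrasing in the Results is exactly this transformation of a rate ratio below 1; a rate ratio of 0.66 corresponds to a 34% lower event rate.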

Results: Fifty-nine patients were randomized into the study. When the two groups were compared in parallel, neither the ITT nor the PP analysis showed statistically significant superiority of the SQ53 wipe over the alcohol wipe in decreasing all CRBSI, confirmed CRBSI, or CVC exchanges. However, the PP analysis suggested that event rates may be lower in the SQ53 group, which had a 34% lower risk of all CRBSI (P = 0.43), a 53% lower risk of confirmed CRBSI (P = 0.52), and a 30% lower risk of CVC exchanges (P = 0.58). Interestingly, when each patient's CRBSI rate during the trial was compared with their previous CRBSI rate, the SQ53 wipe group showed a 74% lower risk of all CRBSI (P = 0.005) in the PP analysis. Every randomized patient in the high-risk category had a decreased CRBSI rate compared to their previous experience. All patients tolerated SQ53 well, with no predefined adverse events.

Conclusion: Patients who used the SQ53 wipe per the specific instructions more than 90% of the time had a 74% decrease in CRBSI rates compared to their previous experience. The SQ53 wipe did not show a statistically significant benefit over the alcohol wipe in this study, attributable to the augmented catheter hygiene in the control group and the insufficient sample size.

Abstract of Distinction

Theresa A. Fessler, MS, RDN, CNSC1; Mary B. Crandall, PhD, RN2; David N. Martin, PhD2

1Morrison Healthcare, University of Virginia Health System, Charlottesville, VA; 2University of Virginia Health System, Charlottesville, VA

Financial Support: None Reported.

Background: Catheter-related bloodstream infection (CRBSI) is a serious complication for patients receiving home parenteral nutrition (HPN). The literature is not consistent as to whether there are significant differences in infection risk between central venous catheter (CVC) types, and assessment is complicated by potential alternate infection sources and different evaluation methods: CRBSI and central line-associated bloodstream infection (CLABSI). The goals of this project were to determine whether significant differences in infection rates exist between peripherally inserted central venous catheters (PICC), tunneled central venous catheters (TCVC), and implanted ports, or between single-lumen (SL) and multi-lumen (ML) catheters used for HPN, and to identify rates of CVC removal for other complications.

Methods: A prospective, observational quality improvement project was conducted for adults who received HPN provided by the University of Virginia, Continuum Home Infusion Pharmacy from February 2019 through December 2022 with follow-up ending July 31, 2023. Data were collected for 141 CVCs used for 89 patients and included number of HPN days, indications for HPN (Figure 1), reasons for CVC removal, blood draws, and microbiologic results. CRBSI and CLABSI were determined by the criteria described in Table 1. Figure 2 shows the number of peripheral and CVC blood and catheter tip tests done for the CVCs with suspected infection.

Results: Of the CVCs used for HPN, 63% were PICCs, 27% TCVCs, and 10% ports, with a total of 15,474 HPN catheter days. The CVCs were 42% SL, 55% double-lumen, and 2% triple-lumen. CRBSI rates were 0.97 episodes per 1000 HPN catheter days overall, with 1.54 for PICCs, 0.64 for TCVCs, and 0.0 for ports. CLABSI rates were 1.74 episodes per 1000 HPN catheter days overall, with 3.07 for PICCs, 0.89 for TCVCs, and 0.0 for ports. No significant differences were found between PICCs and TCVCs in CRBSI; however, PICCs had a significantly higher CLABSI rate per 1000 HPN catheter days than did TCVCs (p = 0.005). In a second analysis, in which 9 cases of catheter infection were not counted due to undetermined alternate infection sources, overall CRBSI and CLABSI rates were reduced to 0.78 and 1.16 per 1000 HPN catheter days, respectively. The second analysis showed CRBSI rates of 1.23 for PICCs and 0.51 for TCVCs, and CLABSI rates of 2.0 for PICCs and 0.64 for TCVCs, with no significant differences in CRBSI and a significantly higher rate of CLABSI per 1000 HPN catheter days for PICC lines (p = 0.04). Table 2 shows a statistical analysis of CRBSI and CLABSI rates. In the initial analysis, CRBSI was 1.24 for ML and 0.68 for SL CVCs, and CLABSI was 2.1 for ML and 1.36 for SL CVCs, per 1000 HPN days; however, the differences were not statistically significant. Other problems that necessitated CVC removal were occlusion, malposition, accidental removal, leakage, and thrombosis. The removal rate for other complications was 2.0 per 1000 HPN catheter days overall, with 1.78 for TCVCs and 2.61 for PICCs; the differences were not statistically significant.
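Rates like those above are compared over unequal catheter-day denominators. One standard approach (not necessarily the one this project used, which is not stated) is the exact conditional test for two Poisson rates: conditional on the total number of events, the events in one group follow a binomial distribution with success probability proportional to that group's share of catheter days. A sketch with hypothetical counts, not the project's data:

```python
from math import comb

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1.0 - p) ** (n - i) for i in range(k, n + 1))

def exact_rate_test(events_a, days_a, events_b, days_b):
    """One-sided exact conditional p-value that group A's rate exceeds B's.

    Conditional on n total events, the count in group A is binomial with
    p = days_a / (days_a + days_b) under the null of equal rates.
    """
    n = events_a + events_b
    p = days_a / (days_a + days_b)
    return binom_tail(events_a, n, p)

# Hypothetical: 12 CLABSIs over 9,000 PICC days vs 3 over 5,500 TCVC days.
p_value = exact_rate_test(12, 9000, 3, 5500)
```

The conditioning step is what makes the test exact for small event counts, where the normal approximation to the rate difference is unreliable.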

Conclusion: We found no significant differences in CRBSI between PICCs and TCVCs, significantly more CLABSIs with PICCs than with TCVCs, and no infections with ports. Although rates of other catheter problems were higher for PICCs, and infection rates were higher for ML than for SL catheters, neither difference reached statistical significance. Our findings illustrate the variation in results between CRBSI and CLABSI criteria and show that undetermined alternate infection sources complicate reporting. They support the need for further study, greater openness to the use of ports, and choosing SL TCVCs when feasible for long-term HPN.

Haruka Takayama, RD, PhD1,2; Kazuhiko Fukatsu, MD, PhD1,3; Midori Noguchi, BA1; Kazuya Takahashi, MD, PhD4; Nana Matsumoto, RD, MS3; Tomonori Narita, MD4; Satoshi Murakoshi, MD, PhD1,5

1Surgical Center, The University of Tokyo Hospital, Bunkyo-ku, Tokyo, Japan; 2Department of Nutrition, St. Luke's International Hospital, Chuo-ku, Tokyo, Japan; 3Operating Room Management and Surgical Metabolism, Graduate School of Medicine, The University of Tokyo, Bunkyo-ku, Tokyo, Japan; 4Gastrointestinal Surgery, The University of Tokyo Hospital, Bunkyo-ku, Tokyo, Japan; 5Nutrition and Dietetics, Kanagawa University of Human Services, Yokosuka City, Kanagawa, Japan

Financial Support: None Reported.

Background: Our previous study clarified that adding beta-hydroxy-beta-methylbutyrate (HMB) to TPN partially restores the gut-associated lymphoid tissue (GALT) atrophy caused by lack of enteral nutrition. Because HMB is a metabolite of the amino acid leucine, the recovery effect might derive from the increased amount of amino acids in the TPN solution. Alternatively, increased amino acid content alone might not restore GALT atrophy, but the increase together with HMB addition might further prevent the atrophy. Herein, we performed 2 studies to answer these questions using a murine TPN feeding model.

Methods: Experiment 1: Six-week-old male Institute of Cancer Research (ICR) mice were divided into A+ (n = 10) and A++ (n = 10) groups. A catheter was inserted into the right jugular vein of each mouse, and the mice were continuously administered 0.2 mL/h normal saline solution for 2 days while allowed chow and water ad libitum. Then, mice received an isocaloric PN solution with an NPC/N ratio of 284 (A+) or 135 (A++), without oral food intake, for 5 days. After the dietary manipulation, all mice were killed by cardiac puncture under general anesthesia, and the whole small intestine was harvested for GALT cell isolation. GALT cell number and phenotype (B cell, CD4+, CD8+, αβTCR+, and γδTCR+) were evaluated in each tissue (Peyer's patches, PPs; intraepithelial spaces, IE; and lamina propria, LP). Nasal washings, bronchoalveolar lavage fluid (BALF), and intestinal washings were collected for IgA level measurement by ELISA. Experiment 2: Mice were randomized to A+H+ (n = 10) and A++H+ (n = 9) groups. The A+H+ mice received PN solution with NPC/N 284 and 2,000 mg/kg BW of Ca-HMB, while the A++H+ animals were given PN solution with NPC/N 135 and 2,000 mg/kg BW of Ca-HMB. After 5 days of PN feeding, the same parameters as in Experiment 1 were evaluated. The Wilcoxon test was used for all analyses, with the significance level set at 5%.
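The Wilcoxon rank-sum comparison used for the GALT parameters can be sketched in pure Python. This computes tie-averaged ranks and the Mann-Whitney U statistic (the rank-sum test's equivalent form); it is illustrative only, with made-up numbers, and omits the p-value step (exact or normal-approximation null distribution):

```python
def average_ranks(values):
    """1-based ranks of values, averaging over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        # Extend j over a run of tied values.
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0  # mean of the tied rank positions
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def mann_whitney_u(x, y):
    """U statistic for group x vs group y (rank-sum form)."""
    r = average_ranks(list(x) + list(y))
    rank_sum_x = sum(r[: len(x)])
    return rank_sum_x - len(x) * (len(x) + 1) / 2.0

# Hypothetical GALT cell counts (x10^7) in two groups (not study data):
u = mann_whitney_u([3.1, 2.8, 3.5, 2.9], [2.2, 2.5, 2.1, 2.6])
```

Here every value in the first group exceeds every value in the second, so U takes its maximum of n1 x n2 = 16, the configuration that yields the smallest possible p-value for these group sizes.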

Results: There were no significant differences between the A+ and A++ groups in GALT cell numbers (Table 1), phenotypes (Table 2), or mucosal IgA levels. However, the A++H+ group showed higher LP cell numbers (Table 1) and a higher CD4+ cell percentage in the IE space (Table 2) than the A+H+ group, without significant differences in IgA levels at any mucosal site.

Anam Bashir, MBBS; Lauren L. Karel, BCPS; Margaret Begany, RD, CSPCC, LDN, CNSC; Jennifer Panganiban, MD

Children's Hospital of Philadelphia, Philadelphia, PA

Financial Support: None Reported.

Background: Fish oil-based lipid emulsion (FOLE) is FDA-approved at 1 g/kg/day for the treatment of parenteral nutrition-associated cholestasis (PNAC). Because fat provision is limited at 1 g/kg/day of FOLE, caloric provision, especially in the neonatal population, is skewed toward dextrose, with higher-than-desired glucose infusion rates (GIR) needed to support weight gain and growth. There is limited published information on the use of FOLE at doses higher than 1 g/kg/day. Concerns about possible essential fatty acid deficiency on 1 g/kg/day have also been raised. Thus, we aim to describe patients who received 1.5 g/kg/day of FOLE at our institution.

Methods: A retrospective IRB-approved chart review was conducted on patients who received parenteral nutrition (PN) at Children's Hospital of Philadelphia between January 2020 and August 2023. Inclusion criteria were children on PN, ages 0 to 18 years, receiving FOLE at a dose of more than 1 g/kg/day for at least 14 days. Cholestasis progression, essential fatty acid deficiency (EFAD), clinically severe post-procedure hemorrhage, and hypertriglyceridemia were the clinical outcomes of interest (Table 1). Progression of cholestatic disease was monitored by conjugated bilirubin levels. A triene to tetraene (T:T) ratio of greater than 0.046 was used to define EFAD based on Associated Regional and University Pathologists, Inc. (ARUP) normative laboratory values. Mead acid, linoleic acid, and alpha-linolenic acid levels were also collected to reflect essential fatty acid stores (normative values in Table 2). Invasive procedures were defined as those requiring entry to the body through an incision and/or tunneling, or a cutting technique for vascular procedures. For children younger than 1 year, hypertriglyceridemia was defined as triglycerides greater than 200 mg/dL, and for older children, greater than 400 mg/dL.
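The laboratory definitions above amount to simple threshold checks. A minimal sketch (thresholds are taken from this abstract; the helper names are hypothetical, not from the study):

```python
# Threshold checks from the abstract's outcome definitions; names are illustrative.
def has_efad(triene_tetraene_ratio):
    """EFAD defined as a triene:tetraene (T:T) ratio > 0.046 (ARUP normative values)."""
    return triene_tetraene_ratio > 0.046

def has_hypertriglyceridemia(triglycerides_mg_dl, age_years):
    """Age-specific cutoffs: >200 mg/dL under 1 year of age, >400 mg/dL otherwise."""
    cutoff = 200 if age_years < 1 else 400
    return triglycerides_mg_dl > cutoff

# A 6-month-old with triglycerides of 250 mg/dL exceeds the infant cutoff.
print(has_hypertriglyceridemia(250, 0.5))  # True
```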

Results: Nine patients [5 males; mean age 2.6 y (range 2 mo–12.9 y)] with PNALD (defined by serum conjugated bilirubin ≥ 2 mg/dL and exclusion of other causes of liver disease) were started on FOLE 1.5 g/kg/day. The purpose of initiating the higher-dose FOLE was to decrease GIR provision and/or provide additional calories for suboptimal weight gain on 1 g/kg/day of FOLE. None of the patients developed hypertriglyceridemia. Four patients had improvement of cholestasis with levels decreasing by more than 2 mg/dL, and four patients continued to have no evidence of cholestasis after prior normalization while on 1 g/kg dosing. One patient experienced an increase in conjugated bilirubin of more than 2 mg/dL, after which the FOLE was decreased to 1 g/kg/day with resolution of cholestasis over three months. Seven patients had an essential fatty acid panel collected, and T:T was within normal limits, although five patients had less than optimal levels of linoleic acid. Seven patients had an invasive procedure performed, and only one patient had more than expected bleeding, after circumcision. This patient had a low fibrinogen level (70 mg/dL) and required fresh frozen plasma and packed red blood cell transfusion, with no significant bleeding event thereafter (Table 1).

Diana Mulherin, PharmD, BCNSP, BCCCP, FCCM; Sarah Cogle, PharmD, BCNSP, BCCCP; Vanessa Kumpf, PharmD, BCNSP, FASPEN; Edward Woo, PharmD; David Mulherin, PharmD, BCPS; Madeleine Hallum, MSHS, RDN, CSG, LDN; Ankita Sisselman, MD; Dawn Adams, MD, MS, CNSC

Vanderbilt University Medical Center, Nashville, TN

Financial Support: None Reported.

Background: Copper (Cu) deficiency can lead to poor wound healing, myeloneuropathy, anemia, and cardiac arrhythmias. Deficiency occurs from poor intake or high losses, which may be seen in adult patients requiring parenteral nutrition (PN), including those with severe malnutrition, large burns, a requirement for continuous renal replacement therapy (CRRT), or a history of bariatric surgery/malabsorption. A previous formulation of multi-trace elements (MTE) contained Cu 1 mg per dose, and in combination with Cu contamination from other PN ingredients, an increased incidence of hypercupremia was observed in patients requiring long-term PN. As of 2020, the only MTE product for use in adults in the U.S. contains 0.3 mg of Cu. For patients with significant cholestasis or hepatic dysfunction, ASPEN recommends withholding or decreasing Cu doses in PN. Due to a lack of standardized practice, a quality improvement project was initiated to describe practices for ordering Cu in PN and Cu status in acutely ill, hospitalized patients with severe hyperbilirubinemia.

Methods: This was a retrospective evaluation of PN ordering practices of a multidisciplinary nutrition support team (NST) at a large, academic medical center between July 1, 2021, and August 31, 2023. PN encounters (a course of PN treatment during a single inpatient admission) in patients ≥ 18 years of age with severe hyperbilirubinemia (total bilirubin ≥ 10 mg/dL or direct bilirubin ≥ 2 mg/dL) within 5 days before or any time during the PN encounter were included. Patient demographics, frequency of Cu provision in PN, Cu and C-reactive protein (CRP) levels, and CRRT status were assessed using descriptive statistics.

Results: A total of 15,739 PN orders were entered on 1068 patients during the study period. Of those, 155 PN encounters occurred in 144 individual patients with severe hyperbilirubinemia. Baseline demographics are provided in Table 1. A summary of Cu sources (either from MTE product or as cupric chloride additive) for each PN encounter is provided in Figure 1. Cu status was assessed in 53 (34%) PN encounters with a mean concentration of 76.9 (±34.3) mcg/dL. CRP was only obtained concurrently with 58% (n = 31) of Cu levels with a mean concentration of 125.7 (±95.4) mg/L. CRRT was provided in 44 (28.4%) encounters (Table 2).

Figure 1. Copper sources in PN orders.

Brittney Patterson, MS, RD-AP, CNSC1; Ranna Modir, MS, RD, CNSC, CDE, CCTD1; Jack McKeown1; Rachel Aubyrn1; Javier Lorenzo, MD, FCCM2

1Stanford Health Care, Stanford, CA; 2Stanford University School of Medicine, Stanford, CA

Financial Support: None Reported.

Background: The use of safety alerts in electronic medical records (EMR) aims to improve patient safety, with most alerts directed at medication and nursing workflows. Stanford Health Care (SHC) has added tube feeding regimens (TFR) to the medication administration record (MAR) to further improve patient safety. In critically ill (ICU) patients who are at high risk for gastrointestinal (GI) complications, the ASPEN/SCCM 2016 guidelines recommend using near-isotonic, fiber-free TFR. A retrospective analysis of 2014-2016 data at SHC found an association between severe GI complications and the initiation of high-risk tube feeding regimens (HRTFR), defined as hyperosmolar, high-fiber tube feeding formulas and/or fiber supplements, in ICU patients. To ensure the ASPEN/SCCM guidelines were implemented at SHC, many interventions were put in place, including designing order sets with HRTFR listed toward the bottom; specific TFR order sets removing HRTFR; education during new resident orientation, team rounds, and monthly in-services; and granting Registered Dietitians (RDs) tube feeding order-writing privileges. However, despite these interventions, HRTFR were still being ordered, with most orders occurring outside of normal RD working hours (8 am to 4 pm). To educate and guide providers in selecting safe TFR for ICU patients, we aimed to create a novel nutrition support-specific order validation pop-up in the EMR.

Methods: A team of RDs, critical care attendings, and Epic analysts collaborated to create a nutrition support-specific order validation pop-up. ICU patients were defined as requiring vasopressor support from norepinephrine, epinephrine, vasopressin, and/or phenylephrine. HRTFR was defined as hyperosmolar, high-fiber tube feeding formulas and/or fiber supplements. The order validation pop-up was built to trigger under the following three scenarios: (1) a vasopressor was already active and a HRTFR was ordered, (2) a HRTFR was already active and a vasopressor was ordered, or (3) both orders were placed simultaneously. The pop-up displayed the reason for the alert and the importance of avoiding a HRTFR, provided safer TFR options, and recommended contacting the RD for guidance. To preserve individualization of patient care, the order validation was overridable, as HRTFR may be appropriate for patients on lower vasopressor doses. After the order validation pop-up was implemented, a chart review was completed between March 2023 and May 2023 to assess the incidence of, and actions following, the triggered order validation pop-up.
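The three trigger scenarios reduce to a single condition: a vasopressor and a HRTFR would coexist, and at least one of them is part of the order being placed. A minimal sketch of that logic, with illustrative names and data structures (not the actual Epic build):

```python
# Hypothetical sketch of the pop-up trigger logic; not the actual EMR rule syntax.
VASOPRESSORS = {"norepinephrine", "epinephrine", "vasopressin", "phenylephrine"}

def should_trigger(active_meds, new_med_orders, hrtfr_active, hrtfr_ordered):
    """Trigger when a vasopressor and a HRTFR would coexist and at least one
    of them belongs to the order being placed (covers all three scenarios)."""
    has_pressor = bool((set(active_meds) | set(new_med_orders)) & VASOPRESSORS)
    has_hrtfr = hrtfr_active or hrtfr_ordered
    something_new = bool(set(new_med_orders) & VASOPRESSORS) or hrtfr_ordered
    return has_pressor and has_hrtfr and something_new

# Scenario 1: vasopressor already running, HRTFR newly ordered -> alert fires
print(should_trigger({"norepinephrine"}, set(), False, True))  # True
```

Because nothing fires when both orders are already active with no new order, the alert interrupts only at decision points, which matches the overridable, order-time design described above.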

Results: Between March 2023 and May 2023, the order validation pop-up triggered 220 times in a total of 59 patients. Out of the 220 triggers, based on the instructions in the pop-up, 42 (19%) resulted in a changed or discontinued order, or the HRTFR was not ordered. Of those 42 triggers that resulted in a properly adjusted HRTFR, 26 (61%) of them occurred outside of normal RD hours. The remaining triggers, where no changes were made, were found to have low dose vasopressors, vasopressors listed on the MAR but not actively being used, or a HRTFR was ordered on the MAR but held per nursing communication orders.

Conclusion: The creation of a novel nutrition support-specific order validation pop-up provided education and guidance to ordering providers. With this additional layer of safety, 42 ICU patients between March 2023 and May 2023 were placed on safer TFR, with most of the impact occurring outside of RD working hours.

Best of ASPEN - Enteral Nutrition Therapy

1627 - Victory for Volume-Based Enteral Nutrition

Julie M. Geyer, RD-AP, CNSC

University of Colorado Hospital, Aurora, CO

Financial Support: None Reported.

Background: Enteral nutrition (EN) in the hospital setting is traditionally administered by a fixed rate-based feeding method (RBEN). Studies of RBEN found that, due to interruptions or withholding, actual formula delivery averages 60% to 70% of the prescribed volume. Nutrition provision below energy needs contributes to malnutrition and negative consequences, including increased health care costs, morbidity, and mortality. The American Society for Parenteral and Enteral Nutrition (ASPEN) and the Society of Critical Care Medicine (SCCM) recommend use of a volume-based enteral nutrition feeding method (VBEN) to improve nutrient delivery, decrease energy deficits, and prevent overfeeding.

Methods: This quality improvement study took place at a Level I trauma, academic hospital from June 2022 to September 2023. In September 2022, a hospital-wide process improvement committee was assembled for multi-phase implementation of VBEN. Prior to September 2022, unit-based dietitians conducted quality improvement work to address common causes of feeding interruptions. VBEN inclusion criteria included patients demonstrating tolerance of goal RBEN. The maximum hourly rate was set at 150 mL/hr. The ‘goal’ provision was defined as 90% to 110% of the prescribed formula volume. Patients included in the data collection were tolerating EN at the RBEN goal, and formula intake volumes were taken directly from the feeding pump history. Changes to the electronic medical record (EMR) included creation of a VBEN calculator with row instructions built into the tube feeding flowsheet and creation of a nurse reminder task every 4 hours to recalculate formula intake and adjust the rate as needed. Changes to the formula order on the medication administration record included specification of VBEN vs RBEN feeding method and standardized administration instructions (Figure 1). Nurses, dietitians, and providers received training on the VBEN workflow and process through e-mail communication, in-person training, an interactive learning-assisted video, and one-on-one coaching.
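The 4-hourly recalculation the nurse reminder task supports can be sketched as: spread the remaining prescribed volume over the hours left in the feeding day, capped at the protocol's 150 mL/hr maximum. The function name and interface below are hypothetical:

```python
MAX_RATE_ML_HR = 150  # protocol maximum hourly rate

def vben_rate(prescribed_volume_ml, volume_delivered_ml, hours_remaining):
    """Catch-up rate for volume-based feeding: remaining prescribed volume
    spread over the remaining hours, capped at the protocol maximum."""
    if hours_remaining <= 0:
        return 0.0
    remaining = max(prescribed_volume_ml - volume_delivered_ml, 0.0)
    return min(remaining / hours_remaining, MAX_RATE_ML_HR)

# 1,680 mL prescribed, 400 mL delivered so far, 16 h left in the feeding day
print(vben_rate(1680, 400, 16))  # 80.0 mL/hr
```

The cap prevents over-aggressive catch-up after long interruptions, which is why a deficit may carry over to the next recalculation window rather than being recovered in a single interval.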

Results: Prior to June 2023, RBEN was the standard feeding method. Routine quality improvement audits from October 2020 to December 2022 in one intensive care unit demonstrated that, despite strategies to improve formula delivery, actual formula provision met ‘goal’ on only 50% to 74% of EN days (Table 1). In June 2022, a hospital-wide audit of formula provision was conducted and included all levels of care (floor, intermediate, and intensive care). In a total of 346 EN days, formula provision met ‘goal’ on 63% of EN days (Table 2). In November 2022, an audit was conducted in the two ICU units selected for phase 1 implementation. In a total of 154 EN days, formula provision met ‘goal’ on 57% of EN days (Table 2). Phase 1 implementation took place in June 2023, and a post-go-live audit was completed. In a total of 157 EN days, ‘goal’ formula volume was achieved on 83% of EN days (Table 2). No instances of hypo/hyperglycemia or gastrointestinal complications were reported. Phase 1 was deemed a success, and approval was obtained to continue VBEN implementation in a stepwise fashion on the remaining inpatient units.

Marcin Folwarski, MD, PhD1; Stanisław Kłęk2; Karolina Skonieczna-Żydecka3; Agata Zoubek-Wójcik4; Waldemar Szafrański, MD, PhD5; Lidia Bartoszewska6; Krzysztof Figuła7; Marlena Jakubczyk, MD, PhD8; Anna Jurczuk9; Przemysław Matras, MD, PhD10; Zbigniew Kamocki, MD, PhD11; Tomasz Kowalczyk, MD, PhD12; Bogna Kwella, MD, PhD13; Joanna Sonsala-Wołczyk14; Jacek Szopiński, MD, PhD15; Krystyna Urbanowicz, MD, PhD16; Anna Zmarzly, MD, PhD14

1Division of Clinical Nutrition and Dietetics, Medical University of Gdańsk, Gdansk, Pomorskie, Poland; 2Surgical Oncology Clinic at the National Cancer Institute in Krakow at Maria Sklodowska-Curie National Research Institute of Oncology, Cracow, Poland; 3Department of Biochemical Science, Pomeranian Medical University in Szczecin, Szczecin, Zachodniopomorskie, Poland; 4Nutrimed Home Nutrition Center, 3, Warsaw, Poland; 5Home Enteral and Parenteral Nutrition Unit, General Surgery Department, Nicolaus Copernicus Hospital, Gdansk, Pomorskie, Poland; 6First Department General and Transplant Surgery and Clinical Nutrition Medical University of Lublin, Home Enteral and Parenteral Nutrition Unit S, Lublin, Poland; 7Nutricare Clinical Nutrition Center, Cracow, Poland; 8Department of Anaesthesiology and Intensive Care Collegium Medicum in Bydgoszcz, Nicolaus Copernicus University, Toruń, Poland; 9Outpatient Clinic of Nutritional Therapy Clinical Hospital, 15-001 Bialystok, Bialystok, Poland; 10First Department General and Transplant Surgery and Clinical Nutrition Medical University of Lublin, Home Enteral and Parenteral Nutrition Unit SPSK4, Lublin, Poland; 11Department of General and Gastroenterological Surgery Medical University, Bialystok, Poland; 12Nutricare Clinical Nutrition Center, Cracow, Poland; 13Department of Clinical Nutrition, Provincial Specialist Hospital, Olsztyn, Poland; 14Clinical Nutrition Unit, Gromkowski City Hospital, Wroclaw, Poland; 15Department of General Hepatobiliary and Transplant Surgery, Collegium Medicum, Nicolaus Copernicus University in Torun, Torun, Poland; 16Department of Clinical Nutrition, Provincial Specialist Hospital, Olsztyn, Poland

Financial Support: None Reported.

Background: Cancer is one of the most common indications for home enteral nutrition (HEN). Malnutrition and weight loss, associated with deterioration in performance status, contribute to poorer outcomes in oncology patients. Systemic inflammation is a characteristic feature of cancer cachexia and may be used as a prognostic factor for short survival. According to the ESPEN guidelines HEN is indicated for patients with an estimated survival of at least 30 days. Therefore, determining survival is essential for individual care planning as it informs healthcare professionals about the suitability of HEN and palliative care strategy.

Methods: In a retrospective multicenter survey, we examined the medical records of cancer patients across 22 Polish HEN centers treated in 2018. Factors assessed during the qualification for HEN included BMI, weight loss, albumin level, total protein level, lymphocyte count, CRP, Prognostic Nutritional Index (PNI), and Eastern Cooperative Oncology Group (ECOG) performance status. The primary endpoint was survival of less than 30 days from the initiation of HEN.

Results: A total of 278 cancer patients (51.44% head and neck, 41.37% gastrointestinal, and 7.19% other localizations) were included in the study (70.14% male, 29.86% female). Inflammatory and nutritional factors, namely albumin level below 3.5 g/dL (p = 0.02), C-reactive protein (p = 0.01), PNI > 45 (p = 0.04), a high percentage of weight loss in the last 6 months (p < 0.01), and ECOG performance score (p = 0.01), were associated with poor survival (less than 30 days). Body weight, BMI, lymphocyte count, and total protein level were not correlated with survival.

Conclusion: Assessment of performance status, inflammation, and weight loss during qualification for HEN can predict short-term survival of cancer patients. This finding highlights the importance of comprehensive assessments before home nutrition initiation. Predicting poor survival can help plan palliative care and determine whether the patient will benefit from HEN.

June R. Greaves, RD, CNSC, CDN, LD, LDN, LRD1; Katharine Morra, RD, CNSC, CSO, LD, LDN2

1Coram CVS Specialty Infusion Services, Meriden, CT; 2Coram CVS Specialty Infusion Services, Plainfield, IN

Financial Support: None Reported.

Background: The objective of this quality improvement project was to determine whether patients were successful in administering tube feeding independently at home following virtual tube feeding instruction by a Registered Dietitian (RD) at a nationwide home care infusion company. The project also aimed to describe the instruction process, identify potential avenues for its improvement, and highlight areas for future research.

Methods: A retrospective review was conducted of 162 patients who received virtual tube feeding instruction from the enteral RD between June 2022 and June 2023. Virtual instruction was completed for the enteral feeding pump, gravity bag, and bolus/syringe methods of administration. A follow-up call was made to active patients to inquire about their experience with the virtual instruction. For patients who could not be reached, the medical record was reviewed to determine whether inbound calls with questions or issues had been received after the virtual instruction. Patients were queried on their confidence in administering enteral feedings, any concerns upon completion of the virtual instruction, knowledge of whom to contact afterward, and whether the reference materials provided were helpful. Patients who did not receive virtual instruction, or who were discharged from service, were excluded from the review.

Results: One hundred sixty-two total patients were reviewed as potentially eligible for the analysis; 115 were excluded. Of those excluded, 100 (87%) were no longer on service; 12 (10%) declined a virtual instruction due to home health agency instruction, inpatient instruction with nursing or dietitian prior to the start of care, or assistance from the home infusion company sales team; 3 (3%) were a “no show” for the scheduled appointment. Eighteen of the remaining eligible patients were unable to be contacted for follow-up. Of those who were unable to be contacted through a follow-up call, there were no documented inbound calls regarding feeding/equipment questions or concerns. Of the total number of eligible patients, 29 provided telephonic feedback on the virtual instruction experience. Virtual instruction was related to the following administration types: enteral pump (86%, n = 25), followed by gravity bag and bolus/syringe (14%, n = 4). Upon completion of the instruction, 27 (93%) felt confident with feeding administration, 2 (7%) did not feel confident as they identified as “in person learners”; 24 (83%) did not experience issues/concerns, 5 (17%) did have questions/concerns; 27 (93%) responded knowing who to contact, 2 (7%) did not; 22 (76%) found reference materials provided helpful, 2 (7%) did not, and 5 (17%) did not review the reference materials.

Conclusion: Technological advances in recent history have made virtual instruction possible. Virtual enteral instruction can be a successful tool for patients to learn how to administer tube feedings when an in-person instruction is not possible in the home care setting. However, consideration should be given to the client's preferred style of learning. Further research in the use of virtual instruction to enhance the process should be considered. As literature is limited on virtual instruction outcomes, additional research is warranted.

Danelle A. Olson, RDN; Lisa M. Epp, RDN; Osman Mohamed Elfadil, MBBS; Ryan T. Hurt, MD, PhD; Manpreet S. Mundi, MD

Mayo Clinic, Rochester, MN

Financial Support: None Reported.

Background: The prevalence of bariatric surgery has increased significantly in recent years, as it is the most effective long-term treatment for obesity. The two most common surgeries, Sleeve Gastrectomy (SG) and Roux-en-Y gastric bypass (RYGB), alter gastrointestinal anatomy, producing significant weight loss as well as remission of obesity-related co-morbidities including type 2 diabetes. Despite these benefits, bariatric surgery can be associated with significant debilitating complications. Though the true prevalence and mechanism are unclear, hypoglycemia has been shown to be present in up to 38% of post-surgical RYGB patients and can be very difficult to manage. Currently, there remains a paucity of data regarding the role of enteral nutrition (EN) as a potential therapy.

Methods: We conducted a retrospective review of the EMR of patients seen in our outpatient home enteral nutrition (HEN) clinic for initiation of tube feeding to manage reactive hypoglycemia from March 2017 to July 2023. In addition to baseline clinical characteristics and demographics, we collected data on hypoglycemia incidents, interventions, EN regimens, and outcomes.

Results: Six patients were seen in the HEN clinic with post-bariatric reactive hypoglycemia (mean age 45.5 ± 9.6 years; 66.7% female; mean BMI at HEN initiation 28.6 ± 8.3). Five of the 6 patients had undergone RYGB surgery, and 1 had undergone laparoscopic adjustable gastric banding (LAGB) that was subsequently revised to sleeve gastrectomy (SG). The time to development of reactive hypoglycemia after surgery varied in the cohort; on average, the first incident was documented 2.6 ± 3.2 years after surgery. Of note, patients lost, on average, 51.2 ± 28.5 kg after surgery and before they required EN support. We noted a slight change in weight after EN initiation, as patients remained, on average, at +2.5 kg at one month and at 3 months into HEN. Table 1 shows the patients' profiles. Dietary modification, focusing especially on reduced consumption of refined carbohydrates, was recommended for all patients. However, poor compliance was prevalent, with 5/6 (83%) of patients not adhering to the prescribed diet. In addition to the EN and dietary regimens prescribed for all patients, some received specific treatment(s) to prevent or manage reactive hypoglycemia; in one case, a combination of α-glucosidase inhibitors, somatostatin, and radical diet changes was used. The majority of patients underwent an initial trial of EN through a naso-jejunal tube, which was then converted to a percutaneous tube after efficacy was established (Table 2). Standard polymeric formulas were utilized for most patients, although one was provided commercial blenderized tube feeds. With the use of EN, 4 of the 6 patients had resolution of reactive hypoglycemia, while two continued to experience symptoms. Two patients stopped EN due to feeding complications and non-compliance, while the remaining four continued on EN.

Anna K. Burneske; Liyun Zhang, MS; Amy Y. Pan, PhD; Theresa Mikhailov, MD, PhD

Medical College of Wisconsin, Milwaukee, WI

Financial Support: None Reported.

Background: Patients who are malnourished have worse outcomes. Many standardized tools have been developed to screen for malnutrition in acutely ill pediatric patients: the Pediatric Yorkhill Malnutrition Score (PYMS), the Pediatric Nutrition Screening Tool (PNST), and the Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP). Alternatively, some institutions have developed their own tools for this purpose, referred to as “home-grown” tools. Regardless of their origin, none of these tools have been validated in critically ill children. Registered dietitians (RDs) perform nutrition assessments on patients based on the results of these nutrition screenings or based on protocols within their institution. Virtual Pediatric Systems, LLC (VPS), an international data registry supporting standardized data sharing for research, improved patient care, and benchmarking among pediatric ICUs, developed a nutrition module that captures data for nutritional metrics. VPS has collected data in the nutrition module since October 2019 and collects data for about 10,000 patients per calendar year from the participating centers. The specific aims were to compare the nutrition screening tools to the dietitians' assessments to determine the screening tools' accuracy and to determine whether standardized screening tools are more accurate than those developed at single centers. We hypothesized that (1) nutrition screening tools used by participating centers would accurately identify malnourished children, and (2) standardized tools would be more accurate than those developed at single centers.

Methods: In this project, we compared pediatric nutrition screening tools with the assessments performed by RDs to determine whether nutrition screening tools accurately identify malnourished patients. We also determined which nutrition screening tools more accurately identify patients who are malnourished or at risk of becoming malnourished during their hospitalization in the PICU so that the appropriate nutrition therapy can be initiated. We obtained de-identified demographic and clinical data from October 2019 through March 2023 for all patients under 18 years of age from the VPS database from centers participating in the nutrition module. We considered the RD's assessment to be the gold standard for determining malnutrition and compared the nutrition screening tools to the RD's assessment. The degree of agreement in malnutrition between nutrition screening tools and RD's assessment was determined by Cohen's kappa (κ).
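Cohen's kappa quantifies agreement between the screening tool and the RD assessment beyond what chance alone would produce. For a 2×2 screen-versus-assessment table it can be computed as follows (a generic sketch; the counts in the example are illustrative, not study data):

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for a 2x2 agreement table:
    a = both rate malnourished, b = screen-positive only,
    c = RD-positive only, d = both rate not malnourished."""
    n = a + b + c + d
    p_observed = (a + d) / n                      # raw agreement
    p_expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # chance agreement
    return (p_observed - p_expected) / (1 - p_expected)

# Illustrative counts only: 20 joint positives, 8 + 12 disagreements, 60 joint negatives.
print(round(cohens_kappa(20, 8, 12, 60), 2))  # 0.52
```

Note how 80% raw agreement collapses to a kappa of about 0.5 once chance agreement is removed, which is why kappa, rather than simple percent agreement, is the appropriate statistic here.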

Results: After selecting subjects who had both a complete pediatric nutrition screen and an RD assessment, the final cohort contained a total of 9891 patients. Among them, 54% were male; 4% were neonates (≤29 d), 34% infants (<2 y), 35% children (2-12 y), and 26% adolescents (12-18 y). The subjects were 40% White, 17.5% Black, 22.5% Hispanic, 5.7% Asian, and 14.2% other/mixed. The kappa coefficient for the standardized nutrition screening tools was .38, which is considered “fair” agreement between the screening tool and the RD “gold standard” assessment. Other unidentified tools listed as “home-grown” or “other” in VPS had kappa coefficients ranging from .31 to .91; a value of .91 represents near-perfect agreement between the screening tool and the RD “gold standard.”

Conclusion: These data show only a fair degree of agreement between the standardized screening tools (PYMS, PNST, STAMP) and RD assessments, meaning that these tools do not adequately assess the nutritional status of critically ill children. However, some unidentified hospital-specific tools show near-perfect agreement with RD assessments, so perhaps there is a better tool for identifying malnourished children in the ICU. Further investigation should be performed to determine why the home-grown tools are superior to the published tools.

Research Trainee Award

Hayley E. Billingsley, PhD, RD, CEP; Michael Dorsch, PharmD, MS; Todd M. Koelling, MD; Scott L. Hummel, MD, MS

University of Michigan, Ann Arbor, MI

Financial Support: NHLBI - Award 5R33HL155498-03.

Background: Malnutrition is common in patients with heart failure (HF) and worsens already poor prognosis. Previous work suggests that sodium restriction, the most common dietary recommendation for patients with HF, may be associated with reduced micronutrient and energy intake. The Mini Nutritional Assessment-Short Form (MNA-SF) is a strong indicator of nutrition status and prognosis in patients with HF, but the association between MNA nutrition status and sodium intake has not been examined. Therefore, this analysis aimed to examine the association between nutrition status and habitual sodium intake in hospitalized patients with HF.

Methods: This is a cross-sectional analysis of patients (≥18 y of age) hospitalized for decompensated HF. Participants were administered the MNA-SF and scored as nourished, at risk of malnutrition, or malnourished based on established cutoffs. Questions on the MNA-SF regarding weight loss and declines in food intake over the previous 3 months were also considered independently. Participants completed the 2014 Block Food Frequency Questionnaire (FFQ) to assess habitual dietary intake. Estimated daily kilocalories (kcals) from the FFQ were divided by estimated energy needs (Harris-Benedict equation × 1.1) to calculate percent (%) estimated energy needs. Estimated protein needs were calculated based on the Academy of Nutrition and Dietetics recommendation of 1.1 g/kilogram (kg) in HF, and estimated protein intake from the FFQ was divided by these needs to calculate % estimated protein needs. Using the FFQ, participants were grouped into sodium intake ≥ or < 2 g per day. Differences between groups based on sodium intake were explored using Fisher's exact test, Chi-square, or Mann-Whitney U as applicable.
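The percent-of-needs calculations above can be sketched as follows. This assumes the classic Harris-Benedict coefficients; the abstract does not state which revision of the equation was used, and the function names are ours:

```python
def harris_benedict(sex, weight_kg, height_cm, age_y):
    """Resting energy expenditure (kcal/day), classic Harris-Benedict form
    (assumed here; the study may have used a revised version)."""
    if sex == "male":
        return 66.47 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_y
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_y

def pct_energy_needs(ffq_kcal_per_day, sex, weight_kg, height_cm, age_y):
    """FFQ energy intake as a percent of estimated needs (Harris-Benedict x 1.1)."""
    needs = harris_benedict(sex, weight_kg, height_cm, age_y) * 1.1
    return 100 * ffq_kcal_per_day / needs

def pct_protein_needs(ffq_protein_g_per_day, weight_kg):
    """FFQ protein intake as a percent of estimated needs (1.1 g/kg in HF)."""
    return 100 * ffq_protein_g_per_day / (1.1 * weight_kg)

# An 80-kg patient eating 66 g protein/day meets 75% of the 1.1 g/kg target.
print(round(pct_protein_needs(66, 80), 1))  # 75.0
```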

Results: Baseline characteristics are presented in Table 1. On FFQ, participants with sodium intake <2 g reported consuming significantly less of their % estimated energy and protein needs than participants with ≥ 2 g sodium intake (Figure 1). All patients (n = 12) with sodium intake <2 g per day were malnourished or at risk for malnutrition on MNA-SF versus 73% (32) of patients with sodium intake ≥2 g per day (P = 0.051). A greater proportion of patients with daily sodium intake <2 g reported recent weight loss >3 kg (75% [9] vs. 43% [19], P = 0.051). No difference was found in the proportion of participants reporting a decrease in food intake on the MNA-SF (<2 g sodium, 67% [8] vs. ≥ 2 g sodium, 50% [22], P = 0.305).

Conclusion: In patients hospitalized for HF, habitual sodium intake <2 g per day was associated with inadequate energy and protein intake, confirming previous findings. Despite the high prevalence of obesity in the cohort, sodium intake <2 g per day was also associated with self-reported weight loss >3 kg and a higher likelihood of being at risk for or having malnutrition. Although this cross-sectional analysis cannot determine the directionality of observed associations, additional studies should examine the impact of personalized nutrition interventions vs. standard-of-care sodium restriction education in HF on clinical outcomes.

Figure 1. Percent estimated energy and protein needs achieved by sodium intake level in hospitalized patients with heart failure.

Lucia A. Gonzalez Ramirez, cPhD1,2; Mary M. Nellis, PhD3; Jessica A. Alvarez, PhD1,2,4; Tasha M. Burley2; Paula D. Nesbeth, cPhD1,2; Chin-An Yang, cPhD1,2; Dean P. Jones, PhD1,3,4; Thomas R. Ziegler, MD1,2,4

1Nutrition and Health Sciences Program, Laney Graduate School, Emory University, Atlanta, GA; 2Division of Endocrinology, Metabolism and Lipids, Department of Medicine, Emory University, Atlanta, GA; 3Clinical Biomarkers Laboratory, and Division of Pulmonary, Allergy, Critical Care and Sleep Medicine, Department of Medicine, Emory University, Atlanta, GA; 4Center for Clinical and Molecular Nutrition, Department of Medicine, Emory University, Atlanta, GA

Financial Support: None Reported.

Background: Postprandial metabolism can identify alterations related to the early stages of cardiovascular disease. However, limited data exist regarding the effects of body composition on postprandial metabolism after a lipid meal challenge. We aimed to characterize the metabolic pathways and metabolites associated with body fat abundance in the postprandial plasma metabolome after an oral lipid challenge.

Methods: Thirty-one healthy individuals between 20 and 50 years old with a lean or overweight/obese body mass index (BMI) were recruited. Participants underwent body composition measurement with dual-energy x-ray absorptiometry (DEXA) to quantify body fat percentage and visceral adipose tissue quantity. A standardized 900-kcal lipid meal challenge (a long-chain triglyceride fat emulsion oral nutritional supplement) with repeat blood sampling was administered. Untargeted plasma high-resolution metabolomics was performed at baseline and at 120 and 360 minutes after the lipid challenge using dual-column liquid chromatography (C18 with negative and HILIC with positive electrospray ionization) coupled with high-resolution mass spectrometry (LC-HRMS). Metabolite differences were assessed in a metabolome-wide association study using linear mixed-effect models to examine the effects of body fat, time, and the body fat*time interaction, controlling for age and sex, followed by pathway enrichment analysis.
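The per-feature mixed-model fit described above can be sketched with `statsmodels` on synthetic data. The variable names, effect sizes, and random-intercept structure here are assumptions; the actual metabolome-wide association study fits one such model per detected feature.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
times = [0, 120, 360]  # minutes after the lipid challenge
subjects = pd.DataFrame({
    "subject": range(31),
    "body_fat": rng.uniform(15, 45, 31),   # percent body fat (synthetic)
    "age": rng.integers(20, 51, 31),
    "sex": rng.integers(0, 2, 31),
})
df = subjects.merge(pd.DataFrame({"time": times}), how="cross")
# Simulate one feature whose postprandial trajectory depends on body fat.
df["feature"] = (10 + 0.002 * df["body_fat"] * df["time"] / 60
                 + rng.normal(0, 0.5, len(df)))

# Linear mixed-effect model: fixed effects for body fat, time, their interaction,
# age, and sex; a random intercept per subject accounts for repeated sampling.
fit = smf.mixedlm("feature ~ body_fat * time + age + sex",
                  df, groups=df["subject"]).fit()
interaction_p = fit.pvalues["body_fat:time"]  # the body fat*time term of interest
```

Features with interaction p < 0.05 would then be passed to pathway enrichment, as in the study.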

Results: A total of 12,078 (C18) and 15,041 (HILIC) features (metabolites) were detected in plasma at baseline. Changes over time differed by percent body fat (percent fat*time interaction) for 699 (C18) and 814 (HILIC) features from baseline to 120 minutes, and for 465 (C18) and 478 (HILIC) features from baseline to 360 minutes (all p < 0.05). These were enriched in pathways including the TCA cycle and fatty acid, lysine, tyrosine, tryptophan, butanoate, and purine metabolism (Figures 1 and 2). Additionally, changes over time differed by visceral adipose tissue quantity (VAT*time interaction) for 396 (C18) and 2290 (HILIC) features from baseline to 120 minutes, and for 486 (C18) and 520 (HILIC) features from baseline to 360 minutes (all p < 0.05). These were enriched in pathways including fatty acid oxidation, omega-3 and −6 fatty acid, vitamin C, and pentose phosphate metabolism (Figures 3 and 4).

Best of ASPEN - Malnutrition, Obesity, Nutrition Practice Concepts, and Issues

Ana Paula Pagano, MSc1; Taiara Poltronieri, BSc1,2; William Evans, PhD3; M. Cristina Gonzalez, MD, PhD4; Anil Abraham Joy, MD5; Claude Pichard, MD, PhD6; Carla Prado, PhD, RD1

1University of Alberta, Edmonton, AB, Canada; 2Federal University of Rio Grande do Sul, Porto Alegre, Rio Grande do Sul, Brazil; 3University of California, Berkeley, CA; 4Federal University of Pelotas, Pelotas, Rio Grande do Sul, Brazil; 5University of Alberta/Cross Cancer Institute, Edmonton, AB, Canada; 6Geneva University Hospital, Geneva, Switzerland

Financial Support: ASPEN (American Society for Parenteral and Enteral Nutrition) Rhoads Research Foundation, and the Canadian Institutes of Health Research (CIHR) (FRN 159537).

Background: Accurate understanding of energy requirements is essential for tailored nutritional interventions in patients with cancer. Under- or overestimating these needs can lead to detrimental weight loss or excessive gain. Yet, determining energy needs in cancer is challenging due to factors like individual tumor burden, treatment, and inflammation, all of which can influence energy requirements. Current guidelines offer only a broad caloric intake range (25-30 kcal/kg/d) as normative values that lack strong supporting evidence. As a result, dietitians often rely on predictive equations, which have proven imprecise. At the same time, the standard techniques available to accurately measure energy requirements are costly, time-consuming, and not applicable to clinical settings. In this study, we leveraged a cohort of patients with breast cancer to evaluate the accuracy of a novel bedside device designed to measure resting energy expenditure (REE), compared against a gold-standard method.

Methods: REE data were obtained cross-sectionally from adult females with breast cancer (stages I-III), measured during a 10-minute test with a novel portable device, the Q-NRG® (Cosmed, Rome, Italy), and compared against REE measured during a 1-hour test in a whole-room indirect calorimeter (WRIC) as the gold-standard technique. To assess agreement in REE between methods, we used paired-samples t-tests, or Wilcoxon signed-rank tests in instances of non-normality. Accuracy was defined as the percentage of device estimates that fell within 10% of the values measured by WRIC. Additionally, Bland-Altman analysis was conducted to determine bias and establish the lower and upper limits of agreement (LOA). A p-value of less than 0.05 was considered statistically significant.
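The agreement statistics described above reduce to a few lines of arithmetic. This is a generic sketch using the conventional bias ± 1.96 SD limits, not the authors' code:

```python
import numpy as np

def bland_altman(device, reference):
    """Bias and 95% limits of agreement: mean difference +/- 1.96 SD of the differences."""
    diff = np.asarray(device, dtype=float) - np.asarray(reference, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)          # sample SD of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

def accuracy_within(device, reference, tol=0.10):
    """Proportion of device readings within +/-10% of the reference (WRIC) value."""
    device = np.asarray(device, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return float(np.mean(np.abs(device - reference) / reference <= tol))
```

A device whose readings were (100, 95, 105, 80) against a constant reference of 100 would have a bias of −5 and 75% of readings within the 10% accuracy band.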

Results: REE was evaluated in 49 females (age 55.9 ± 11.8 y; 42% with stage I or II and 7% with stage III breast cancer) using both WRIC and the new portable device. Most patients (63.3%) had a body mass index (BMI) within the overweight or obesity categories, and none were categorized as underweight. The new portable device provided accurate measurements for over 70% (n = 35) of patients, with measurements within 10% of those obtained by WRIC. However, the new portable device overestimated REE for 1 patient and underestimated it for 13. Measured REE differed significantly between techniques, with the new portable device underestimating REE compared to WRIC (1406 ± 262 vs 1508 ± 248 kcal/d; p < 0.001). The bias between the new portable device and WRIC was −6.7% (LOA = −24.9%, 11.6%; variance = 36.5%) or −102 kcal (LOA = −378 kcal, 174 kcal; variance = 552 kcal).

Conclusion: When compared to a gold-standard technique, the new portable device showed good agreement at the group level, with REE measurement discrepancies falling within 10% of values determined by WRIC. Although greater variability was observed at the individual level, the new portable device accurately assessed REE relative to WRIC for most patients. Thus, the new portable device appears to be a promising tool for estimating REE in patients with breast cancer, positioning it as a viable option for clinical settings.

Michelle Brown, MS, RD, LDN, CNSC

UF Health, Gainesville, FL

Financial Support: None Reported.

Background: Malnutrition is a highly prevalent issue in the healthcare setting, where the term refers to undernutrition. It occurs as a result of inadequate nutrition intake, impaired absorption, or altered utilization of nutrients. Inflammation and hypermetabolism also contribute to the development of malnutrition. Estimates of the prevalence vary and run as high as 54%. In acute care hospitals, the prevalence of malnutrition is 39% when using diagnostic criteria from the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition (AND/ASPEN). Capturing and recognizing malnutrition is important, as this diagnosis is associated with a 3.4x higher rate of in-hospital death, 1.9x longer length of stay, 2.2x higher likelihood of being admitted with a serious infection, higher rates of discharge to a rehabilitation or long-term assisted care facility, an increased rate of readmissions, and a 73% increase in hospital costs. Because of the impact of malnutrition on healthcare costs and requirements for care, ICD-10 codes for malnutrition are considered comorbid conditions (CC) or major comorbid conditions (MCC). Accurate diagnosis, treatment, and documentation of malnutrition can improve patient care. Accurate documentation can also help capture complexity for quality metrics while allowing selection of the correct DRG and base payment, which may increase reimbursement.

Methods: An interdisciplinary nutrition committee at our organization, consisting of dietitians, physicians, nurses, and informatics professionals, completed a quality improvement implementation to improve malnutrition diagnosis rates, documentation, and coding. This was completed in four steps: (1) Identification of malnutrition criteria that could be used across the organization. Our committee elected to use the AND/ASPEN criteria for the diagnosis of malnutrition; these criteria are used by ~85% of hospitals and are widely recognized by payors. (2) Development of a documentation tool that would allow registered dietitian (RD) malnutrition diagnoses to populate provider progress notes. The hospital's electronic medical record (EMR) was leveraged to accomplish this goal. A novel flowsheet and Smartphrase were developed that allowed information on malnutrition severity, signs/symptoms, and treatment (entered by the dietitian) to flow into physician progress notes automatically. This solution met all the "best practices" for documentation identified by our interdisciplinary team: clear signs and symptoms of malnutrition identified, severity of malnutrition indicated and documented consistently between providers, consistent use of diagnostic criteria, and treatment for malnutrition provided and documented. (3) All clinical nutrition staff members were given hands-on training in the completion of nutrition-focused physical exams (NFPE), and completion of these exams was prioritized in all nutrition assessments. (4) When the malnutrition Smartphrase was not used, notes were sent to physicians for attestation and signature.

Results: Following this implementation, dietitian-diagnosed malnutrition has been included in physician notes via Smartphrase for 65% of cases. In the six months following NFPE training, malnutrition diagnosis rates increased by 220%. The percentage of dietitian assessments with a malnutrition diagnosis has increased from 13% to 40%. Following the process of sending notes to physicians for attestation and signature, 94% of malnutrition diagnoses are coded in the EMR at discharge from the hospital, and coding queries to physicians decreased by 50%. Hospital reimbursement for dietitian-diagnosed malnutrition has increased from ~$65,000 per quarter to ~$2 million per quarter.

Conclusion: Utilization of appropriate NFPE training, physician-approved diagnostic criteria, and EMR-based documentation solutions can increase diagnosis, documentation, and reimbursement for malnutrition diagnoses in hospitalized patients.

Research Trainee Award

Alan Garcia-Grimaldo1,2; Ivan A. Osuna-Padilla1; Nadia Rodriguez-Moguel1; Martin A. Rios-Ayala1; Marycarmen Godinez-Victoria2

1National Institute of Respiratory Diseases, Mexico City, DF, Mexico; 2Escuela Superior de Medicina, Instituto Politécnico Nacional, Mexico City, DF, Mexico

Financial Support: None Reported.

Background: Intensive care unit-acquired weakness (ICU-AW) is characterized by peripheral muscle wasting, reduced muscle strength, and muscle dysfunction. Respiratory and swallowing-related muscles can also be affected by this condition. This study aimed to analyze the association between ICU-AW incidence and post-extubation dysphagia (P-ED).

Methods: A prospective cohort study was conducted. Patients on mechanical ventilation (MV) admitted to the ICU were included; individuals with a previous diagnosis of myopathy were excluded. NUTRIC score, calf circumference adjusted for BMI, and phase angle (PhA) obtained by bioelectrical impedance were assessed upon admission and after extubation. Biochemical variables (baseline C-reactive protein) were collected from medical records. SOFA score, APACHE II, and malnutrition diagnosis using GLIM criteria were determined upon admission to the ICU. Cumulative energy (CED) and protein (CPD) deficits were calculated during the ICU stay. ICU-AW was diagnosed using the Medical Research Council scale (MRC scale <48) and handgrip strength (<11 kg for men and <7 kg for women). Swallowing function was assessed within the first 24 hours after extubation using the Yale Swallowing Protocol (YSP). For patients who did not meet the success criteria defined for the YSP, the volume-viscosity swallow test was performed to corroborate the presence of post-extubation dysphagia (P-ED). Specific success and failure criteria proposed for each test were used. Mean and median comparison tests were performed for each variable between the group with P-ED and those with normal swallowing. Associations were analyzed using univariate and multivariate logistic and linear regressions, with covariate selection performed by a stepwise method.
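The diagnostic cutoffs above can be expressed as a small helper. The function is hypothetical, and because the abstract does not state whether both criteria had to be met, this sketch treats either criterion as sufficient; that combination rule is an assumption.

```python
def icu_acquired_weakness(mrc_sum: int, grip_kg: float, sex: str) -> bool:
    """ICU-AW screen using the study's cutoffs: MRC sum score < 48, or
    handgrip strength < 11 kg for men / < 7 kg for women."""
    grip_cutoff = 11.0 if sex == "M" else 7.0
    # NOTE: combining the two criteria with "or" is an assumption;
    # the source does not specify whether both must be met.
    return mrc_sum < 48 or grip_kg < grip_cutoff
```

For instance, a man with an MRC sum of 50 and a grip of 12 kg screens negative, while a woman with normal strength scores but a grip of 6.5 kg screens positive under this rule.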

Results: Fifty-four patients were included; 19 (35.2%) were diagnosed with P-ED and 32 (59.3%) with ICU-AW. Patients with P-ED showed lower values for PhA at extubation, MRC scale score, and handgrip strength at extubation. In addition, more days on invasive MV and higher CED and CPD were observed in this group (Table 1). In the univariate logistic regression analysis, PhA at extubation, CED, CPD, ICU-AW diagnosis, and days on MV were associated with P-ED. In the multivariate regression analysis, only days on MV and ICU-AW diagnosis were independently associated with P-ED (Table 2).

Conclusion: Days on invasive mechanical ventilation and an ICU-acquired weakness diagnosis were predictors of post-extubation dysphagia. Novel clinical and nutritional strategies are required to prevent ICU-acquired muscle weakness and its consequences, which may improve clinical outcomes and quality of life after extubation.

Ahron Lee, RD1,2; Eun-Mee Kim, RD1; Bo-eun Kim, RD1; Chi-Min Park, MD, PhD3; Sung Nim Han, PhD2

1Department of Dietetics, Samsung Medical Center, Seoul, Korea, Republic of (South); 2Department of Food and Nutrition, College of Human Ecology, Seoul National University, Seoul, Korea, Republic of (South); 3Department of Critical Care Medicine and Surgery, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul, Korea, Republic of (South)

Financial Support: None Reported.

Background: The importance of "appropriate" nutrition support in the early stages of intensive care unit (ICU) admission is under debate with respect to which patients require it, when it should be initiated, and how much should be provided. In this study, the characteristics and clinical outcomes of malnourished patients diagnosed using the Global Leadership Initiative on Malnutrition (GLIM) criteria were examined. The actual implementation of nutritional support and its relationship with clinical outcomes by nutrition status were also investigated.

Methods: This retrospective cohort study included critically ill patients receiving invasive mechanical ventilation who were admitted to the ICU and hospitalized for at least 7 days between January 1, 2020, and December 31, 2022. Nutritional and clinical data during their first 10 days in the ICU were collected. All the patients in this study underwent nutrition assessment by the GLIM criteria. The 90-day mortality of patients diagnosed with malnutrition by the GLIM criteria and degree of malnutrition were analyzed. Patients were divided into three energy intake categories (<10 kcal/kg/d, 10–20 kcal/kg/d, and >20 kcal/kg/d) and three protein intake categories (<0.8 g/kg/d, 0.8–1.2 g/kg/d, and >1.2 g/kg/d). Information on intake was categorized by the stage following ICU admission (days 1–3 for the early acute phase, days 4–6 for the late acute phase, and days 7–10 for the recovery phase). We examined the differences in mortality among groups separated by energy and protein intake at each stage. The analyses were performed for the total cohort, well-nourished, and malnourished groups. Differences in the means and distribution were evaluated, and survival analyses and regression analyses were performed.
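The intake bands and post-admission phases described above map to a straightforward categorization. Boundary handling (whether exactly 20 kcal/kg/d or 1.2 g/kg/d falls in the middle band) is an assumption, as the abstract does not specify it:

```python
def energy_category(kcal_per_kg_d: float) -> str:
    """Energy intake bands used in the analysis (boundary handling assumed)."""
    if kcal_per_kg_d < 10:
        return "<10"
    return "10-20" if kcal_per_kg_d <= 20 else ">20"

def protein_category(g_per_kg_d: float) -> str:
    """Protein intake bands used in the analysis (boundary handling assumed)."""
    if g_per_kg_d < 0.8:
        return "<0.8"
    return "0.8-1.2" if g_per_kg_d <= 1.2 else ">1.2"

def icu_phase(day: int) -> str:
    """Phase after ICU admission: days 1-3 early acute, 4-6 late acute, 7-10 recovery."""
    if 1 <= day <= 3:
        return "early acute"
    if 4 <= day <= 6:
        return "late acute"
    if 7 <= day <= 10:
        return "recovery"
    raise ValueError("day outside the 10-day observation window")
```

Each patient-day would then contribute to the mortality comparison within its phase and intake band.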

Results: A total of 595 patients were included. The prevalence of malnutrition according to the GLIM criteria was 61% (n = 362). The 90-day mortality in the well-nourished and the malnourished group was 45% and 58%, respectively (P < 0.001). Mortality differed by the degree of malnutrition (well-nourished 45%, moderately malnourished 53%, severely malnourished 61%, P = 0.001). In the early acute phase and late acute phase, there was no difference in mortality among different energy intake groups. However, in the recovery phase, the group with high energy intake (>20 kcal/kg/d) showed lower mortality (hazard ratio (HR) 0.602; 95% confidence interval (CI) 0.413 to 0.877; P = 0.008) in the total cohort. In well-nourished patients, the high energy intake group tended to have lower mortality (HR 0.573; 95% CI 0.318 to 1.034; P = 0.064) in the recovery phase. However, in malnourished patients, the group with high energy intake showed significantly lower mortality (HR 0.549; 95% CI 0.333 to 0.903; P = 0.018) in the recovery phase. In the early acute phase and late acute phase, there was no difference in mortality among different protein intake groups. However, in the recovery phase, the group with moderate protein intake (0.8–1.2 g/kg/day) showed lower mortality (HR 0.770; 95% CI 0.599 to 0.990; P = 0.041) in the total cohort. When well-nourished patients and malnourished patients were analyzed separately, a significantly lower mortality (HR 0.728; 95% CI 0.536 to 0.988; P = 0.042) in the recovery phase was observed with moderate protein intake among malnourished patients.

Conclusion: Malnutrition diagnosed by the GLIM criteria was associated with 90-day mortality and other clinical outcomes. Furthermore, energy and protein intake in the recovery phase after ICU admission was associated with mortality, especially in malnourished patients classified by the GLIM criteria. Therefore, time-dependent nutritional intake tailored to nutrition status may be relevant for optimizing ICU nutrition support strategies.

Best International Abstract

International Abstract of Distinction

Fabio Araujo, RD, MHS1; Maureen Tosh, PT1; Maitreyi Kothandaraman, MD, MSc, FRCPC, CAGF2; Juan Posadas, MD, MSc1,2; Paul Wischmeyer, MD, EDIC, FASPEN, FCCM3; Priscilla Barreto, RD4; Chelsia Gillis, RD, PhD, CNSC5

1Alberta Health Services, Calgary, AB, Canada; 2University of Calgary, Calgary, AB, Canada; 3Duke University School of Medicine, Durham, NC; 4Hospital Naval Marcilio Dias, Rio de Janeiro, RJ, Brazil; 5McGill University School of Human Nutrition, Montreal, QC, Canada

Financial Support: None Reported.

Background: Functional capacity is the most relevant outcome after critical illness according to ICU survivors. This outcome is especially pertinent as adult ICU mortality has been decreasing, leaving more survivors with impaired functional capacity, delayed return to work, and low quality of life. Protein via nutrition support (NS) has the potential to mitigate ICU-acquired weakness, but given that current ICU benchmarks are based on mortality and ICU-related complications, it is unknown whether these protein targets also support functional recovery. To address this gap, we conducted a retrospective cohort study to determine whether different protein intake doses influenced the functional capacity of ICU survivors with a length of stay (LOS) ≥7 days, measured by the Chelsea Critical Care Physical Assessment tool (CPAx) score at ICU discharge, a validated measure of functional capacity with robust reliability, measurement error, and responsiveness.

Methods: The medical records of all consecutive patients admitted to a general systems ICU between October 2014 and September 2020 were reviewed. Inclusion criteria were age ≥18 years, survived ICU admission, ICU stay ≥7 days, and received NS. Exclusion criteria included neuromuscular disorders, brain/spinal cord injury, limb amputation, orthopedic fractures, persistent coma during ICU stay, missing CPAx, and mechanical ventilation <3 days. Eligible patients were divided into 4 groups guided by previous literature relating daily protein intake in the ICU (g/kg/d) to mortality: LOW (<0.8), MEDIUM (0.8-1.19), HIGH (1.2-1.5), and VERY HIGH (>1.5). Groups with similar CPAx were pooled to enhance precision. The effect of protein dose on CPAx was assessed with analysis of covariance (ANCOVA), adjusting for the confounding variables age, disease severity, length of stay in hospital before ICU admission, duration of mechanical ventilation, and time until start of NS in the ICU. Effect modification by nutritional status was assessed with stratification according to subjective global assessment (SGA A: well-nourished; B/C: malnourished). The effect of energy intake was assessed using the same regression model (<25 and ≥25 kcal/kg/d; <70 and ≥70% daily adequacy).
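The four protein dose groups, and the post hoc pooling of the groups with similar CPAx, can be sketched as follows; the function names are illustrative:

```python
def protein_group(g_per_kg_d: float) -> str:
    """Dose groups from the prior ICU literature:
    LOW <0.8, MEDIUM 0.8-1.19, HIGH 1.2-1.5, VERY HIGH >1.5 g/kg/d."""
    if g_per_kg_d < 0.8:
        return "LOW"
    if g_per_kg_d < 1.2:
        return "MEDIUM"
    if g_per_kg_d <= 1.5:
        return "HIGH"
    return "VERY HIGH"

def pooled_group(g_per_kg_d: float) -> str:
    """Two-group model after pooling: HIGH (1.2-1.5) vs POOLED (<1.2 and >1.5)."""
    return "HIGH" if protein_group(g_per_kg_d) == "HIGH" else "POOLED"
```

Under this grouping, 1.3 g/kg/d falls in HIGH while both 0.5 and 1.6 g/kg/d fall in POOLED.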

Results: Inclusion/exclusion criteria were met by 531 patients. CPAx was non-linearly associated with protein dose (Figure 1) and was not statistically different among the LOW, MEDIUM, and VERY HIGH groups. All groups differed from HIGH (p = 0.003), indicating data could be pooled into 2 groups: HIGH (1.2-1.5 g/kg/d) and POOLED (<1.2 and >1.5 g/kg/d). Baseline characteristics were comparable between the two groups (Table 1). Mean CPAx (±standard error) was greater in the HIGH vs POOLED group (30.1 ± 0.7 vs. 26.8 ± 0.6, p = 0.001), suggesting that HIGH protein intake was associated with superior functional capacity at discharge. The mean difference (MD) remained statistically significant after adjusting for confounding variables (CPAx MD: 3.4 ± 1.1, p = 0.003 in the 4-group model and 3.3 ± 0.9, p = 0.001 in the 2-group model). Energy intake had no effect on CPAx, whether expressed as kcal/kg/d (28.1 ± 0.6 for <25 kcal/kg vs 27.9 ± 0.8 for ≥25 kcal/kg, p = 0.780) or as adequacy (27.3 ± 0.9 for <70% vs 28.4 ± 0.6 for ≥70%, p = 0.641). Nutritional status was not an effect modifier, as the HIGH group had superior CPAx in both well-nourished (MD 3.8 ± 1.7, p = 0.029) and malnourished (MD 2.5 ± 1.1, p = 0.031) patients.

Best of ASPEN - Critical Care and Critical Health Issues

International Abstract of Distinction

Chin Han Charles Lew, APD, PhD1; Zheng-Yii Lee, PhD2,3; Andrew Day, MSc4; Xuran Jiang, MSc4; Danielle E. Bear, RD, PhD5,6; Gordon L. Jensen, MD, PhD7; Pauline Y. Ng, MBBS, MRCP(UK), FHKCP, FHKAM8; Lauren Tweel, RD, CNSC, MSc9; Angela Parillo, RD, LD, CNSC, MSc10; Daren K. Heyland, MD, MSc4; Charlene Compher, PhD, RD, LDN, FASPEN11

1Dietetics and Nutrition Department, Ng Teng Fong General Hospital, Singapore; 2Department of Anesthesiology, Faculty of Medicine, Universiti Malaya, 50603 Kuala Lumpur, Kuala Lumpur, Malaysia; 3Department of Cardiac Anesthesiology & Intensive Care Medicine, Berlin, Germany; 4Clinical Evaluation Research Unit, Department of Critical Care Medicine, Queen's University, Kingston, ON, Canada; 5Department of Critical Care, Guy's and St Thomas' NHS Foundation Trust, St Thomas' Hospital, London, United Kingdom; 6Department of Nutrition and Dietetics, Guy's and St Thomas' NHS Foundation Trust, St Thomas' Hospital, London, United Kingdom; 7University of Vermont Larner College of Medicine, Burlington, VT; 8Critical Care Medicine Unit, School of Clinical Medicine, The University of Hong Kong, Hong Kong; 9Rutgers University, New Brunswick, NJ; 10The Ohio State University Wexner Medical Center, Department of Clinical Nutrition, Columbus, OH; 11University of Pennsylvania School of Nursing, Philadelphia, PA

Financial Support: None Reported.

Background: Pre-existing malnutrition is common among critically ill patients (38-78%) and can be diagnosed using tools such as the Global Leadership Initiative on Malnutrition (GLIM) criteria and the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition (ASPEN) Indicators of Malnutrition (AAIM). However, it is unclear whether these tools or their individual components (nutrition parameters [NPs]), such as weight, diet history, body mass index (BMI), or muscle mass, have better clinical utility and validity in the intensive care unit (ICU) setting, since certain NPs are easier to obtain (e.g., BMI) than others (e.g., weight history). More importantly, it is unclear whether treating malnutrition according to the 2021 ASPEN guidelines (which recommend delivering 12-25 kcal/kg/d and 1.2-2 g/kg/d of protein) is associated with improved clinical outcomes. We investigated whether GLIM, AAIM, and/or selected individual NPs measured at ICU admission were associated with time to discharge alive (TTDA; primary outcome), 60-day mortality, or home discharge, and whether higher protein delivery modified those associations.

Methods: This was a post hoc analysis of the EFFORT Protein trial (n = 1301), the largest multinational, multicenter trial comparing higher vs. usual protein delivery in critically ill patients. Malnutrition status was retrospectively classified according to GLIM and AAIM using NPs that were prospectively collected at ICU admission. For GLIM, acute disease-related inflammation formed the etiologic factor for all patients since they were critically ill, and malnutrition severity was classified according to the phenotypic parameters (severity of weight loss, low BMI, reduced muscle mass). For AAIM, a modified approach was adopted because certain NPs were not collected (i.e., reduced energy intake or weight loss for periods <1 month, fluid accumulation, and grip strength); hence, malnutrition status was classified by the patient's weight loss severity and any reduction in energy intake. Multivariable regressions were used to identify whether malnutrition diagnosed by GLIM and AAIM (both dichotomized as "not identified as malnourished" vs. "moderate/severe malnutrition") and/or individual NPs were associated with outcomes, and whether protein delivery modified their associations.

Results: Table 1 summarizes the characteristics of patients according to their malnutrition status classified by GLIM. Of 1301 predominantly medical admissions, 41% and 14% of the patients were malnourished according to GLIM and AAIM, respectively. Malnutrition diagnosed by GLIM and AAIM was independently associated with extended TTDA (p = 0.03, p = 0.01), higher odds of 60-day mortality (p = 0.02, p = 0.01), and lower odds of home discharge (p = 0.03, p = 0.05), whereas individual NPs were not (p > 0.10). However, higher protein delivery did not modify the association between malnutrition (diagnosed by GLIM and AAIM) and worse outcomes (Table 2). Notably, in patients with BMI < 18.5 kg/m2 (n = 78), higher protein delivery was associated with a shorter TTDA (adjusted hazard ratio 2.68, 95% confidence interval [CI] 1.14-6.30) and greater odds of home discharge (adjusted odds ratio 4.61, 95% CI 1.35-15.71) than usual protein delivery.

Elias Wojahn, B.S.; Liyun Zhang, MS; Amy Y. Pan, PhD; Theresa Mikhailov, MD, PhD

Medical College of Wisconsin, Milwaukee, WI

Financial Support: Medical College of Wisconsin.

Background: Previous guidelines lacked sufficient data to comment on the safety of enteral nutrition in critically ill children. A more recent study indicated that enteral nutrition was indeed safe for critically ill children receiving vasoactive medication. Additional data in adults indicated that septic shock patients treated with vasoactive medication and given early enteral nutrition had better outcomes than patients given no nutrition. We retrospectively investigated a similar premise in pediatric patients to determine (1) the frequency of use of early enteral versus parenteral nutrition for patients admitted to the PICU for septic shock and receiving vasoactive medication and (2) the impact of early enteral versus parenteral nutrition on PICU length of stay (LOS) and mortality for patients admitted with septic shock and treated with vasoactive medication. We hypothesized that (1) clinical practices have changed over recent years such that early enteral nutrition is administered more frequently to pediatric septic shock patients treated with vasoactive medication and (2) receiving early enteral nutrition while being treated for septic shock with vasoactive medications in the PICU is associated with better outcomes.

Methods: We obtained demographic and outcome data for pediatric patients admitted to Children's Hospital of Wisconsin for septic shock and treated with vasoactive medications within a 5-year range from the Virtual Pediatric Systems, LLC (VPS) database, a data registry for PICU patients. We obtained clinical data, including details of enteral and parenteral nutrition administered and use of vasoactive medications, by chart review. We quantified the use of vasoactive medications by the Vasoactive-Inotrope Score (VIS) and the severity of illness by the PRISM3 Probability of Death. We considered medical LOS and mortality as clinical outcomes. We compared categorical variables by Chi-square tests and continuous variables by Mann-Whitney or Kruskal-Wallis tests. P < 0.05 was considered statistically significant.
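The VIS used above to quantify vasoactive support is a weighted sum of infusion rates (per the widely used Gaies et al. formulation); a sketch, with catecholamines and milrinone dosed in µg/kg/min and vasopressin in U/kg/min:

```python
def vasoactive_inotrope_score(dopamine=0.0, dobutamine=0.0, epinephrine=0.0,
                              milrinone=0.0, vasopressin=0.0,
                              norepinephrine=0.0) -> float:
    """Vasoactive-Inotrope Score: dopamine + dobutamine + 100*epinephrine
    + 10*milrinone + 10,000*vasopressin + 100*norepinephrine."""
    return (dopamine + dobutamine
            + 100 * epinephrine
            + 10 * milrinone
            + 10_000 * vasopressin
            + 100 * norepinephrine)
```

For example, dopamine at 5 µg/kg/min plus epinephrine at 0.05 µg/kg/min yields a VIS of 10.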

Results: We identified 637 patients aged 0-21 years treated in the PICU with a diagnosis of septic shock. Of these, 401 received vasoactive medication, 183 received early enteral nutrition, and 81 received early parenteral nutrition. Those given early parenteral nutrition had a longer LOS (median [IQR]: 7.0 [2.2-23.2] days) than those not fed (median [IQR]: 2.1 [1.1-5.1] days) (p < 0.0001), but did not differ from those fed enterally (median [IQR]: 7.9 [3.7-15.2] days) (p = 0.95). After controlling for severity of illness, patients who received early parenteral nutrition were more likely to die than those receiving early enteral nutrition or those who were not fed at all (parenteral vs. enteral: 17.8% vs. 4.6%, p = 0.002; parenteral vs. none: 17.3% vs. 6.7%, p = 0.002). Mortality did not differ between patients who received early enteral nutrition and those not fed (4.6% vs. 6.7%, p = 0.43).

Conclusion: Early enteral nutrition was given more frequently than early parenteral nutrition. Early enteral nutrition was not significantly associated with improved outcomes as measured by length of stay and mortality, but early parenteral nutrition was associated with significantly worse outcomes. This suggests that clinical guidelines should favor the use of enteral feeding in septic shock patients receiving vasoactive medication.

Best of ASPEN - Critical Care and Critical Health Issues

International Abstract of Distinction

Lu Ke, PhD1; Cheng Lv, PhD Candidate1; Lingliang Zhou, MD Candidate2; Weiqin Li, PhD1

1Nanjing University, Nanjing, Jiangsu, China; 2Southeast University, Nanjing, Jiangsu, China

Financial Support: None Reported.

Background: There is controversy over the optimal early protein delivery in critically ill patients with acute kidney injury (AKI). This study aimed to evaluate whether the association between early protein delivery and 28-day mortality was impacted by the presence of AKI in critically ill patients.

Methods: This is a secondary analysis of a multicenter cluster-randomized controlled trial enrolling newly admitted critically ill patients (N = 2772). Participants with complete data on baseline renal function and 28-day mortality were included in this study. Cox proportional hazards models were used to investigate whether early protein delivery, reflected by mean protein delivery from day 3 to day 5 after enrollment, was associated with 28-day mortality and whether baseline AKI stages impacted their association.

Results: Overall, 2,618 patients were included (Table 1), among whom 628 (24.0%) had AKI at enrollment (118 stage I, 97 stage II, 413 stage III). Mean early protein delivery was 0.60 ± 0.38 g/kg/d among the study patients (Figure 1). In the overall study cohort, each 0.1 g/kg/d increase in protein delivery was associated with a 5% reduction in 28-day mortality (Hazard Ratio [HR] = 0.95; 95% confidence interval [CI] 0.92-0.98, P < 0.001). When early protein delivery was stratified by tertiles, the risk of 28-day mortality was lower than with low protein delivery in both the medium protein group (HR = 0.64; 95% CI 0.50-0.82, P < 0.001) and the high protein group (HR = 0.71; 95% CI 0.55-0.91, P = 0.007) after adjusting for potential confounders (Figure 2). The association between early protein delivery and 28-day mortality in patients with different baseline AKI stages showed significant heterogeneity (adjusted interaction P = 0.047). With each 0.1 g/kg/d increase in protein delivery, the 28-day mortality decreased by 5% (HR = 0.95; 95% CI 0.92-1.00, P = 0.008) in patients without AKI and 7% (HR = 0.93; 95% CI 0.86-0.99, P = 0.043) in those with AKI stage III, of whom 72% were on renal replacement therapy upon enrollment. However, these associations were not observed among AKI stage I and II patients. The mortality trends up to day 28 for early protein delivery in different AKI stage groups are depicted in Figure 3.
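The per-increment hazard ratio reported above compounds multiplicatively under a proportional-hazards model. A minimal sketch of that arithmetic, assuming only the abstract's reported HR of 0.95 per 0.1 g/kg/d (the helper function and the 0.5 g/kg/d example are illustrative, not from the study):

```python
import math

# HR = 0.95 per 0.1 g/kg/d increment, as reported in the abstract
hr_per_increment = 0.95

def compound_hr(hr_per_unit: float, n_units: float) -> float:
    """Hazard ratio implied for a difference of n_units increments,
    assuming the proportional-hazards effect is log-linear in dose."""
    return hr_per_unit ** n_units

# e.g. a 0.5 g/kg/d difference corresponds to 5 increments of 0.1 g/kg/d
hr_half_gram = compound_hr(hr_per_increment, 5)
print(round(hr_half_gram, 3))  # 0.774
```

This is why tertile-level contrasts (medium or high vs. low delivery) can show substantially larger mortality differences than the per-increment HR alone suggests.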

Conclusion: Higher early protein delivery during days 3-5 of ICU stay was associated with improved 28-day mortality in critically ill patients without AKI and with AKI stage III, but not in those with AKI stage I or II.

Figure 3. The trends of 28-day mortality with early protein delivery in different AKI stages.

Stanislaw J. Gabryszewski, MD, PhD1; David A. Hill, MD, PhD1,2

1Children's Hospital of Philadelphia, Philadelphia, PA; 2University of Pennsylvania, Philadelphia, PA

Financial Support: This work was supported by the National Institutes of Health (Grants T32HD043021 to SJG; K08DK116668 and R01HL162715 to DAH).

Background: The ketogenic diet (KD) is a high-fat, moderate-protein, low-carbohydrate diet that induces ketosis, a metabolic shift characterized by the use of fatty acid-derived ketone bodies rather than glucose to meet energy needs. While the KD is best known as a dietary therapy for refractory epilepsy, there is growing interest in identifying other diseases in which the KD may be therapeutic. Recent studies have revealed the potential of the KD to dampen inflammation and pathology in mouse models of allergic asthma. However, it is unclear whether the KD has such immunoregulatory effects in other allergic diseases, such as the gastrointestinal allergy eosinophilic esophagitis (EoE).

Methods: We studied the effects of the KD in a mouse model of eosinophilic esophagitis (EoE) in which 10-week-old C57BL/6 mice were topically treated with the vitamin D analog MC903 and the egg white allergen ovalbumin (OVA) on days 0 to 11 to induce eczema-like dermatitis and allergic sensitization, respectively. The effect of the KD following allergic sensitization was studied by feeding mice KD or a regular diet (RD) starting on day 12. Mice were provided with OVA-supplemented water and gavaged with OVA on days 18-20. On day 21, mice were harvested to quantify esophageal eosinophilia and to phenotype immune responses in draining lymph nodes via flow cytometry.

Results: Following induction of EoE, mice in both the KD (n = 17) and RD (n = 17) arms exhibited 100% survival at day 21. Weight recovery (percent of original weight ± SEM) at day 21 was comparable between KD-fed (104.1 ± 1.7%) and RD-fed (99.0 ± 3.2%) mice (p > 0.05). Analysis of esophageal eosinophilia at day 21 revealed significantly decreased numbers (total cells ± SEM) of Siglec-F+ CD11b+ eosinophils in KD-fed (711 ± 345 cells) versus RD-fed (880 ± 225 cells) mice (p < 0.05). There was a non-significant reduction in the percentage of esophageal eosinophils (percent of CD45+ cells ± SEM) in KD-fed (5.1 ± 1.2%) versus RD-fed (8.1 ± 1.5%) mice (p = 0.138). In immunophenotyping of phorbol myristate acetate and ionomycin-stimulated cells from draining lymph nodes at day 21, there was a significantly increased percentage (percent of CD4+ T cells ± SEM) of Foxp3+ T regulatory (Treg) cells in KD-fed (6.5 ± 1.1%) versus RD-fed (3.3 ± 0.4%) mice (p < 0.01).

Conclusion: In this mouse model of OVA-induced EoE, we observed a modest inhibitory effect of the KD on the recruitment of eosinophils to the esophagus. As compared with the RD, the KD was associated with increased proportions of Foxp3+ Tregs in draining lymph nodes of mice with EoE. Additional mechanistic investigations are warranted, including determination of the necessity of Tregs for KD-induced inhibition of esophageal eosinophilia. This study highlights the promise of immunomodulatory dietary interventions in the context of allergic disease.

Hassan S. Dashti, PhD, RD1; Magdalena Sevilla, Ph.D.1; Kris Mogensen, MS, RD-AP, LDN, CNSC2; Charlene Compher, PhD, RD, LDN, FASPEN3

1Massachusetts General Hospital, Boston, MA; 2Brigham and Women's Hospital, Boston, MA; 3University of Pennsylvania School of Nursing, Philadelphia, PA

Financial Support: Research reported in this publication was supported by the American Society for Parenteral and Enteral Nutrition (ASPEN) Rhoads Research Foundation awarded to Hassan S. Dashti.

Background: Patients living with short bowel syndrome (SBS) receiving home parenteral nutrition (HPN) commonly receive nutritional infusions overnight, contributing to sleep and circadian disruption. Aligning nutritional intake with the circadian clock is expected to substantially benefit vulnerable populations by limiting circadian misalignment (i.e., a mismatch between the circadian system and behaviors) and by influencing other pathways. Recent advancements in metabolic profiling (systematic profiling of cellular metabolites, i.e., sugars, amino acids, organic acids, nucleotides, and lipids) have made it a promising tool for identifying relevant biological pathways. Our objective was to characterize metabolites that differ between daytime and overnight HPN infusions in adults with SBS habitually receiving HPN.

Methods: The present study was a secondary analysis of a controlled, single-arm 2-week pilot and feasibility trial designed to compare daytime to overnight infusions of HPN in adults with SBS consuming HPN (ClinicalTrials.gov: NCT04743960). Enrolled patients received 1 week of HPN infusions overnight followed by 1 week of HPN infusions during the daytime (approximately 12-hour change in infusion start time). Duration, frequency, and composition of infusions remained identical during the two study periods. Following each 1-week study period, patients had a venous blood sample collected at clinical visits. Plasma samples were analyzed using Ultrahigh Performance Liquid Chromatography-Tandem Mass Spectrometry and global metabolic profiles were determined. Of 1015 measured metabolites, only 622 metabolites with non-missing data across all samples were analyzed. Data were normalized to the volume of sample extracted and then log-transformed and Z-score scaled prior to analysis. Differential metabolite abundance between the two study periods (daytime vs. overnight) was determined using standard Linear Models for Microarray Data (LIMMA), adjusted for dietary fasting duration and time since the end of the last HPN infusion. Pathway enrichment analysis was then conducted using MetaboAnalyst's pathway enrichment tool.
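The per-metabolite modelling step above can be sketched as ordinary least squares with covariate adjustment — a simplification of LIMMA, which additionally applies empirical-Bayes variance shrinkage. The abundance matrix, covariate, and planted effect below are simulated, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated abundance matrix: rows = samples, columns = metabolites.
n_samples, n_metab = 18, 50
period = np.repeat([0.0, 1.0], n_samples // 2)   # 0 = overnight, 1 = daytime infusion
fasting_hr = rng.normal(10, 1.5, n_samples)      # illustrative adjustment covariate
X = np.column_stack([np.ones(n_samples), period, fasting_hr])

abundance = rng.normal(size=(n_samples, n_metab))
abundance[:, 0] += 3.0 * period                  # plant one true period effect

# z-score each metabolite prior to modelling, as in the abstract
Z = (abundance - abundance.mean(axis=0)) / abundance.std(axis=0, ddof=1)

def period_tstats(X, Y):
    """t-statistics for the period coefficient, one per metabolite column."""
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    dof = X.shape[0] - X.shape[1]
    sigma2 = (resid ** 2).sum(axis=0) / dof
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1] / se

t = period_tstats(X, Z)
print(t.shape)  # one t-statistic per metabolite
```

In the real analysis the resulting P values would then be compared against an FDR threshold, which (as the Results note) none of the 36 nominally significant metabolites survived.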

Results: Nine patients (age, 52 years; 80% female; BMI 21.3 kg/m2) completed the trial and provided two fasting blood samples. Both blood draws were completed at approximately 11:20 am following at least an 8-hour fast and at least 8 hours from the end of an HPN infusion. Changes were detected in 36 metabolites at P < 0.05; top-changing metabolites were mostly long-chain and polyunsaturated fatty acids (dihomo-gamma-linolenic acid, arachidonate (20:4n6), docosahexaenoate (DHA; 22:6n3)) and glycerolipids (Figure 1). No metabolites were significant at the stringent FDR threshold. Enrichment analysis of the 36 metabolites identified pathways related to the biosynthesis of unsaturated fatty acids, D-arginine and D-ornithine metabolism, and linoleic acid metabolism, among others (Figure 2).

Astrid Verbiest, MSc1,2; Mark K. Hvistendahl, MS, PhD3; Federico Bolognani, MD, PhD4; Carrie Li, MS, PhD4; Nader N. Youssef, MD, MBA, FACG4; Francisca Joly, MD, PhD5; Palle B. Jeppesen, MD, PhD3; Tim Vanuytsel, Associate Professor1,2

1Leuven Intestinal Failure and Transplantation Center (LIFT), University Hospitals Leuven, Leuven, Belgium; 2Translational Research Center for Gastrointestinal Disorders (TARGID), University of Leuven, Leuven, Belgium; 3Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark; 4VectivBio, Basel, Switzerland; 5Centre for Intestinal Failure, Department of Gastroenterology and Nutritional Support, Hôpital Beaujon, Clichy, France

Financial Support: This research was supported by VectivBio AG.

Background: Short bowel syndrome (SBS) is a severe organ failure condition with a high risk of developing intestinal failure (SBS-IF) and life-long parenteral support (PS) dependence. Glucagon-like peptide-2 (GLP-2) analogs stimulate adaptation of the remaining intestine resulting in increased intestinal absorption and reduced PS needs. Extensive literature is available on the effect of the short-acting GLP-2 analog teduglutide in patients without a remaining colon. However, the impact of GLP-2 analogs on fluid and energy absorption in SBS-IF with a colon-in-continuity (CiC) is unclear. Apraglutide (APRA) is a novel, long-acting synthetic GLP-2 analog that is in development for SBS-IF. We performed a pre-defined interim analysis of a phase 2 study in SBS-IF-CiC to investigate the safety and efficacy of 4 weeks of apraglutide treatment based on metabolic balance studies (MBS).

Methods: STARS Nutrition is a 52-week multicenter, open-label phase 2 study in adult patients with SBS-IF-CiC receiving once-weekly subcutaneous apraglutide injections (5 mg). MBS were performed at baseline and after 4 weeks with stable PS, followed by a 48-week PS adjustment period. During MBS, fluid intake was kept constant (individual predefined drinking menu). Duplicates of meals and fluids (wet weight intake), urine, and feces (fecal wet weight output) were collected. Safety was the primary endpoint. Secondary endpoints included changes in fecal wet weight output, urinary output, and wet weight and energy absorption. Data are presented as mean (95% CI). P values < 0.05 were considered significant (Wilcoxon matched-pairs signed rank test).

Results: Nine patients were included and comprise the full study population. Apraglutide was well tolerated, with no dose discontinuation or interruption. No adverse events (AEs) were considered notable based on their nature or severity. At baseline, patients received a weekly PS volume of 10 (range 4-21) L. Small bowel length was 19 (range 0-50) cm and 79 (range 43-100) % of the colon was in continuity. Fecal wet weight output decreased significantly by 253 (−437 to −68) g/day (p = 0.012). Relative wet weight absorption increased by 9 (1 to 18) % (p = 0.039). There was a numeric increase in urinary output (p = 0.129). No significant changes in energy absorption were observed (Table 1).

Palle B. Jeppesen, MD, PhD1; Tim Vanuytsel, Associate Professor2; Sukanya Subramanian, Physician3; Francisca Joly, MD, PhD4; Geert Wanten, Physician5; Georg Lamprecht, Physician, Professor6; Marek Kunecki, MD7; Farooq Rahman, Physician8; Thor Nielsen, Statistician9; Lykke Graff, MD9; Mark Hansen, Physician9; Ulrich Pape, Physician10; David Mercer, Physician11

1Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark; 2UZ Leuven, Leuven, Belgium; 3MedStar Georgetown, Washington, DC; 4Centre for Intestinal Failure, Department of Gastroenterology and Nutritional Support, Hôpital Beaujon, Clichy, France; 5Radboud University Nijmegen Medical Centre, Nijmegen, Netherlands; 6University Medical Center Rostock, Rostock, Germany; 7M. Pirogow Hospital, Wolczanska, Poland; 8University College London Hospitals, London, United Kingdom; 9Zealand Pharma A/S, Copenhagen, Denmark; 10ASKLEPIOS Klinik St. Georg, Hamburg, Germany; 11Nebraska Medical Center, NE

Financial Support: Zealand Pharma A/S Supported Research.

Background: Reduction of parenteral support (PS) is important for improved outcome in short bowel syndrome (SBS) patients with intestinal failure (IF). A clinically meaningful within-patient change in PS volume has to date been regarded as a ≥ 20% reduction. This is, however, based on clinical experience, and to our knowledge no data-driven analysis has quantified what constitutes a meaningful change in PS volume from a patient perspective. Glepaglutide, a long-acting GLP-2 analog, reduces PS volume needs and improves patient global impression of change (PGIC), a patient-reported outcome (PRO) tool, in SBS-IF patients. We here report a quantitative analysis of meaningful change in PS volume using PGIC following glepaglutide treatment in the Efficacy and Safety Evaluation (EASE) SBS 1 trial.

Methods: EASE SBS 1 is a multi-center, placebo-controlled, randomized, parallel-group, double-blind phase 3 trial (NCT03690206). Chronic SBS-IF adult patients with a requirement for PS at least 3 days per week were recruited. Patients were randomized to 24 weeks of treatment with SC injections of either 10 mg glepaglutide twice-weekly (TW), 10 mg glepaglutide once-weekly (OW), or placebo. PS volume requirements were evaluated and adjusted using regular fluid balance periods. The primary endpoint was a reduction in weekly PS volume from baseline to week 24. Patients rated their change in overall status since the start of the trial to weeks 12 and 24 by PGIC, using a 7-point Likert scale (ranging from very much worse to very much improved). Anchor-based analysis using scatter plots and empirical cumulative distribution functions (eCDF) was applied to assess the association between PGIC categorical data and % change in PS volume from baseline to weeks 12 and 24. Anchor-based methods are used as external criteria to gain knowledge about what is clinically meaningful to patients based on known anchoring measures.
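The rank-correlation and eCDF steps described above can be sketched as follows; the PGIC ratings and PS volume changes are simulated, and the variable names are illustrative, not trial data:

```python
import numpy as np
from scipy.stats import spearmanr, kendalltau

rng = np.random.default_rng(1)

# Simulated anchor (7-point PGIC) and outcome (% change in weekly PS volume);
# a better global impression is made to track a larger PS volume reduction.
n = 99
pgic = rng.integers(1, 8, n)                          # 1 = very much worse ... 7 = very much improved
ps_change = -8.0 * (pgic - 4) + rng.normal(0, 15, n)  # % change from baseline

# Rank correlations between the anchor and the outcome
rho, _ = spearmanr(pgic, -ps_change)                  # sign flipped: a reduction is an improvement
tau, _ = kendalltau(pgic, -ps_change)

def ecdf(values):
    """Empirical cumulative distribution: sorted values and cumulative shares."""
    x = np.sort(np.asarray(values))
    return x, np.arange(1, x.size + 1) / x.size

# eCDF of % PS change among patients rating themselves improved (PGIC >= 5)
improved = ps_change[pgic >= 5]
x, p = ecdf(improved)
share_at_20pct = float(np.mean(improved <= -20))      # share reaching a >=20% reduction
print(round(rho, 2), round(share_at_20pct, 2))
```

Reading the eCDF within each PGIC category is what lets a candidate threshold (here 20%) be checked against the anchor, as the trial's analysis did.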

Results: Of the 106 randomized patients, 99 completed the trial. Glepaglutide TW treatment significantly reduced mean PS requirements by 47% (5.13 L/wk) from baseline. Improvement in PGIC was shown with significant differences relative to placebo for both glepaglutide TW (p = 0.002) and OW (p < 0.0001). Using the blinded data sample, the association between PGIC and the % change in PS volume from baseline to week 24 showed that the two endpoints were correlated, with Spearman rank-order and Kendall's tau-b correlation coefficients of 0.353 and 0.285, respectively. After 12 weeks of treatment, the association appeared stronger. Upon inspection of the eCDF, these results support the appropriateness of a PS volume reduction threshold of 20%.

Conclusion: Anchor analysis, using PGIC as the anchor measurement, showed that the use of 20% reduction in PS volume, an outcome measure used in clinical trials, is considered clinically meaningful to SBS patients.

Abstract of Distinction

Ji Seok Park, MD, MPH1; Naseer Sangwan, PhD1; Lauren Menke2; Gail Cresci, PhD, RD, LD, FASPEN1

1Cleveland Clinic, Cleveland, OH; 2Case Western Reserve University, Cleveland, OH

Financial Support: 4R00AA023266 (GC) and Standard Process.

Background: A synbiotic is a physical combination of a prebiotic and a probiotic with the general goal of maintaining probiotic viability through co-packaging with its food source. Despite its wide availability, evidence to support its use in a healthy population is limited. This study aimed to test the feasibility and safety of a targeted synbiotic and its effects on gastrointestinal symptoms and the gut microbiota.

Methods: This was a double-blinded, randomized, placebo-controlled, paired crossover pilot study in healthy adults to test the effects of a targeted synbiotic on gut microbiota diversity and abundance. The targeted synbiotic consisted of 2 probiotic strains, Lactobacillus reuteri 3613 (1 × 109 CFU) and Lactobacillus plantarum 276 (1 × 1011 CFU), and a resistant starch (RS) prebiotic, NuBanaTM RS65G Green Banana Flour (3.84 g/d). Thirty-four healthy participants meeting the pre-defined criteria were enrolled per a sample size calculation of 24 completers needed to achieve 91% power at a 5% significance level. Participants were randomized to consume the synbiotic versus maltodextrin placebo for 28 days, followed by a 21-day washout period, and then they crossed over to consume the other supplement for 28 days. Gastrointestinal symptoms were assessed, and fecal samples were collected before and after each supplement period. Fecal samples were analyzed by 16S rRNA sequencing, and the Divisive Amplicon Denoising Algorithm 2 (DADA2) and Ribosomal Database Project (RDP) classifier were used for taxonomic profiling. Alpha-diversity was assessed using the Shannon diversity index, and beta-diversity was assessed using Bray-Curtis dissimilarity. Differential abundance analysis was used to identify significantly different taxa between the synbiotic group and placebo group. The study was approved by the Cleveland Clinic Institutional Review Board.
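A minimal sketch of the two diversity measures named above, computed from toy taxon-count vectors rather than study data:

```python
import numpy as np

def shannon(counts):
    """Shannon diversity index H' = -sum(p_i * ln p_i) over nonzero taxa."""
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def bray_curtis(a, b):
    """Bray-Curtis dissimilarity: 1 - 2*sum(min)/(sum(a)+sum(b)), in [0, 1]."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(1 - 2 * np.minimum(a, b).sum() / (a.sum() + b.sum()))

even = [25, 25, 25, 25]   # evenly spread community
skewed = [97, 1, 1, 1]    # dominated by one taxon

print(round(shannon(even), 2))            # 1.39 (= ln 4): evenness raises H'
print(bray_curtis(even, even))            # 0.0: identical communities
print(bray_curtis(even, skewed) > 0.0)    # True: differing communities
```

A higher Shannon index, as reported during the synbiotic period, reflects greater richness and evenness; Bray-Curtis values near 0 indicate similar community composition, consistent with the overlapping PCoA clusters described in the Results.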

Results: Thirty-four participants were randomized into the study, 13 males and 21 females, and 28 participants completed the study, with an average age of 32 ± 7 years. The Shannon diversity index of fecal samples was higher when participants were taking the synbiotic compared to placebo (P = 0.021), suggesting higher microbial richness and evenness during synbiotic consumption. Bray-Curtis dissimilarity was calculated between the synbiotic and placebo groups and visualized using Principal Coordinates Analysis (PCoA), which showed 2 separate but overlapping groups. Differential abundance analysis identified 11 taxa, including the butyrate-producing genera Akkermansia and Butyricimonas, that were significantly different between the synbiotic and placebo supplements. All subjects tolerated the supplements well, reporting no changes in GI symptoms.

Conclusion: This pilot study shows a targeted synbiotic supplement favorably modified gut microbiome diversity and taxa abundance in healthy subjects. Further studies are warranted to test the effects of this targeted synbiotic in clinical scenarios with known gut dysbiosis to determine if modifications can be sustained and associated with disease.

Kaitlyn Daff, MA, RD, LDN1; Gail Cresci, PhD, RD, LD, FASPEN2

1Case Western Reserve University/Cleveland Clinic Lerner Research Institute, Cleveland, OH; 2Cleveland Clinic, Cleveland, OH

Financial Support: NIH-National Institute of Alcohol Abuse and Alcoholism.

Background: Alcohol use disorder is the leading cause of liver disease in the United States [1], with an estimated 80% of patients with alcohol-associated end-stage liver disease (AA-ESLD) also presenting with clinical malnutrition and sarcopenia [2]. Gut dysbiosis in alcohol-associated liver disease (ALD) has been well-characterized in the literature, with shifts from a Bacteroidetes- and Firmicutes-dominated population towards an increased abundance of Proteobacteria [3]. Although it is known that the gut microbiome plays a role in the metabolism and production of amino acids, how alcohol-associated gut dysbiosis influences host amino acid homeostasis is less understood. We aimed to test whether the amino acid metabolite profile of patients with AA-ESLD is distinct from that of patients without disease pathology and whether it correlates with changes in the gut microbiota.

Methods: A secondary data analysis was performed from a larger, single-center, non-randomized prospective pilot study in patients awaiting liver transplantation to characterize metabolomic changes in amino acid homeostasis. Urine samples were collected within 24 hours prior to liver transplant, adjusted for urine osmolality, and untargeted metabolomic analysis by UPLC-MS/MS was performed. Fecal samples collected within 24 hours of liver transplant were analyzed by 16S rRNA sequencing for microbial profiling. Welch's t-tests were used to identify statistically significant changes in metabolite mean scaled intensities between AA-ESLD and healthy control patients. Spearman's correlations were used to identify associations between amino acid metabolites and gut microbial taxa.
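The two tests named above can be sketched with SciPy on simulated intensities; the group sizes mirror the abstract, but the values, the taxon variable, and the effect sizes are made up for illustration:

```python
import numpy as np
from scipy.stats import ttest_ind, spearmanr

rng = np.random.default_rng(2)

# Simulated scaled intensities for one urinary metabolite in the two groups
aa_esld = rng.normal(1.4, 0.4, 11)   # patients (n = 11)
control = rng.normal(1.0, 0.3, 18)   # healthy controls (n = 18)

# Welch's t-test: equal_var=False drops the equal-variance assumption
t_stat, p_val = ttest_ind(aa_esld, control, equal_var=False)

# Spearman correlation between a taxon's relative abundance and a metabolite
taxon = rng.uniform(0, 0.2, 29)
metabolite = 2.0 * taxon + rng.normal(0, 0.1, 29)
rho, rho_p = spearmanr(taxon, metabolite)

print(round(float(p_val), 4), round(float(rho), 2))
```

Spearman's rank correlation is a natural fit here because microbial relative abundances are rarely normally distributed.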

Results: Analysis of the urinary metabolome between AA-ESLD patients (n = 11) and healthy control patients (n = 18) revealed distinct amino acid profiles between groups. Welch's t-tests identified that arginine (p = 0.0016), glutamate (p = 0.0289), tyrosine (p = 0.0003), phenylalanine (p = 0.0002), asparagine (p = 0.0005), tryptophan (p = 0.0001), cystine (p = 0.0017), and taurine (p = 0.0480) were all significantly increased in AA-ESLD patients. Spearman's correlations identified significant positive correlations between Gammaproteobacteria genera and both phenylalanine (p = 0.0167) and tryptophan (p = 0.0349). These data suggest that the microbiome may contribute to the increased concentrations of these amino acids in the urine. Gammaproteobacteria were also positively correlated with glutamine (p = 0.0151) and histidine (p = 0.0476), while negative correlations were found with glycine (p = 0.0071) and creatinine (p = 0.0341).

Conclusion: Urinary amino acid metabolites differ between AA-ESLD patients and those without liver disease. As patients must abstain from alcohol for ~6 months to be eligible for a liver transplant, these data suggest residual effects of AA-ESLD on amino acid homeostasis. Correlations between the microbiome and amino acid metabolites suggest that the unique microbial shifts associated with ALD may play a role in these observed changes to amino acid metabolism.

Stephanie Merlino Barr, MS, RDN, LD1,2; Rosa Hand, PhD, RDN, LD, FAND2; Marc Collin, MD1,2; Thomas E. Love, PhD1,2; Sharon Groh-Wargo, PhD, RDN1,2

1MetroHealth Medical Center, Cleveland, OH; 2Case Western Reserve University, Cleveland, OH

Financial Support: None Reported.

Background: Diagnostic criteria for neonatal malnutrition were proposed in 2018 by field experts. This tool has not been validated since its publication. The objective of this study was to assess the agreement and reliability of both the overall malnutrition tool and individual indicators to evaluate how consistently the proposed criteria identify malnutrition in preterm infants.

Methods: A single-center, retrospective cohort study was performed at a level III Neonatal Intensive Care Unit (NICU). The cohort included all preterm infants born between June 2013 and August 2022, who were admitted to the NICU for at least 3 days and did not die before discharge. Malnutrition diagnoses (none/mild/moderate/severe) were assigned to each patient for each indicator, as defined in Table 1; multiple definitions for individual indicators were used to reflect different potential approaches of assessment (eg, growth velocity), or to reflect differences in patient populations (eg, protein and energy intake). The kappa (k) value was used to assess the neonatal malnutrition diagnostic tool's overall inter-indicator reliability; this was calculated separately for indicators used to assess malnutrition in the first two weeks of life and after the first two weeks of life. Each indicator's diagnosis was compared individually to all other indicators' diagnoses to assess inter-indicator reliability; proportion of overall agreement, McNemar's test statistic, and kappa value were calculated. Acceptable agreement was defined as k > 0.8.
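The agreement statistics described above can be sketched as follows; the two indicator gradings are hypothetical (none = 0 through severe = 3), and the kappa and McNemar implementations are minimal textbook versions, not the study's SAS code:

```python
import numpy as np

def cohen_kappa(a, b, n_cat):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    a, b = np.asarray(a), np.asarray(b)
    conf = np.zeros((n_cat, n_cat))
    for i, j in zip(a, b):
        conf[i, j] += 1
    conf /= conf.sum()
    p_obs = np.trace(conf)                        # observed agreement
    p_exp = (conf.sum(1) * conf.sum(0)).sum()     # agreement expected by chance
    return (p_obs - p_exp) / (1 - p_exp)

def mcnemar_stat(a, b):
    """McNemar chi-square on discordant pairs of a binarized diagnosis."""
    a, b = np.asarray(a) > 0, np.asarray(b) > 0   # 0 = no malnutrition
    n01 = np.sum(~a & b)
    n10 = np.sum(a & ~b)
    return (n10 - n01) ** 2 / (n10 + n01)

ind1 = [0, 0, 1, 2, 2, 3, 1, 0, 2, 3]   # indicator 1 grades
ind2 = [0, 1, 1, 2, 3, 3, 0, 0, 1, 3]   # indicator 2 grades

agreement = np.mean(np.asarray(ind1) == np.asarray(ind2))
print(round(agreement, 2))                       # 0.6: 6 of 10 grades match
print(round(cohen_kappa(ind1, ind2, 4), 2))      # 0.47: moderate, well below 0.8
print(mcnemar_stat(ind1, ind2))                  # 0.0: discordant pairs balanced
```

Note how raw agreement (0.6) overstates reliability relative to kappa (0.47), which is why the study's acceptability threshold was set on kappa (k > 0.8) rather than on proportion of agreement.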

Results: A total of 2946 infants were included in this study. The k values for the malnutrition tool overall indicated poor inter-indicator reliability; for malnutrition diagnoses in the first two weeks of life k = 0.054; for diagnoses after the first two weeks of life k = 0.048. Figure 1 depicts the weighted k values for all comparisons of individual indices. Figure 2 depicts the proportions of overall agreement. For example, the weight gain velocity (approach 1) compared to the energy intake malnutrition diagnosis criteria had n = 954 subjects, k = 0.09, and a proportion of overall agreement of 0.28, indicating that both inter-indicator reliability and accuracy were poor. Commonly cited generalized weight gain velocity goals (approaches 2 & 3) had good accuracy and inter-indicator reliability with the recommended method (approach 1) of determining goal weight gain velocity by maintaining weight-for-age z-score (1 vs. 2 k = 0.92, 1 vs. 3 k = 0.88). The generalized linear growth goal (approach 2) had poor accuracy and inter-indicator reliability with the recommended method (approach 1) (k = 0.12). All comparisons of unique indices for malnutrition diagnosis had detectable disagreement in diagnosis patterns as assessed by McNemar's test statistic.

Amber Hager, BSc, RD; Yiqi Wang, BSc; Sandy Hodgetts, PhD, OT; Lesley Pritchard, PhD, PT; Vera Mazurak, PhD; Susan Gilmour, MD, MSc, FRCPC; Diana R. Mager, MSc, PhD, RD

University of Alberta, Edmonton, AB, Canada

Financial Support: 2022 ASPEN Rhoads Research Foundation Grant.

Background: Measurement of body composition in young infants and children with chronic liver disease (CLD) can be challenging due to fluid overload, lack of healthy reference data and non-invasive, validated methods to use at the bedside. The use of ultrasonography to serially measure changes in muscle thickness overcomes many of these limitations, but little comparable data is available in young infants and children (<5 y). The study purpose was to serially measure changes in total bicep, calf, and thigh muscle layer thickness (MLT), subcutaneous adipose tissue thickness (SAT-T), and motor (gross/fine) development in infants and children (<5 y) with CLD. We hypothesized that the trajectory of MLT (thigh, bicep, calf) and SAT-T would be significantly impacted by CLD, and informative of gross motor development in infants and children (<5 y).

Methods: Infants and children (4 mo-5 y) with CLD (n = 11) and their age-matched CON (n = 16) were recruited from the Pediatric Liver Clinics/Liver Transplant Clinics at the Stollery Children's Hospital and the community. Participants underwent 2 serial measurements, at baseline and after 6 months, of (1) MLT, echo intensity, and SAT-T of the biceps brachii (BB), rectus femoris (RF), rectus intermedius (RI), soleus, and gastrocnemius (GN) using ultrasound (U/S) and (2) gross motor assessment (Peabody Motor Scale V2 [PDMS-2]; in CLD only). Additional variables collected included demographics (age, sex, CLD diagnosis, PELD), SGNA scores, anthropometrics (wt-z, ht-z, head circumference [hc-z], mid-arm circumference [MAC-z]), body composition (fat-free mass [FFM]/fat mass [FM] using BIA), and multiple skinfold thicknesses (SFT; triceps [TSF], biceps, suprailiac, subscapular).

Results: CLD etiology included biliary atresia in 73% (n = 8) and other causes in 27% (n = 1 acute liver failure; n = 2 TPN-related cholestasis). No significant differences in age (years), sex, wt-z, ht-z, hc-z, MAC-z, TSF-z, or subscapular-z were noted between groups at baseline (p > 0.05). Thirty percent of CLD children had SGNA scores indicative of mild-moderate malnutrition (SGNA ≥ 2). Total thigh, RI, and soleus MLT were significantly lower in CLD vs CON, and thigh SAT was higher in CLD after 6 months (p < 0.05). This was particularly evident in CLD children ≤ 2 years, who had significantly lower total thigh, RI, RF, and soleus MLT than CON at baseline and after six months (p < 0.05). Total thigh, RI, and RF MLT (absolute, % change over 6 months) were positively related to BIA-FFM measures (r2 = 0.46-0.47; p < 0.001) and to total motor quotient and gross motor quotient scores (absolute, percentile; r2 = 0.47, p < 0.001), but not to fine motor quotients (absolute, percentile) of the PDMS-2, particularly in CLD children (<2 y). Bicep and calf measures (MLT, SAT) were not associated with total motor, gross motor, or fine motor quotients (absolute, percentile) in CLD children.

Conclusion: Children with CLD had significantly lower measures of muscle thickness and higher measures of SAT than CON. Serial measurement of thigh MLT may be informative of the trajectory of fat-free mass and gross motor skill development in young children with CLD.

Abstract of Distinction

Anita Nucci, PhD, RD1; Hillary Bashaw, MD2; Alexander Kirpich, PhD1; Jeffrey Rudolph, MD3

1Georgia State University, Atlanta, GA; 2Children's Healthcare of Atlanta, Atlanta, GA; 3UPMC Children's Hospital of Pittsburgh, Pittsburgh, PA

Financial Support: Takeda Pharmaceuticals.

Background: Although survival for children with intestinal failure (IF) has improved with parenteral nutrition (PN), many still fail to maintain adequate somatic growth after achieving enteral autonomy. Few studies have examined growth after weaning from PN and outcomes have been inconsistent. A glucagon-like peptide-2 (GLP-2) analog has been shown to reduce the volume of and time on PN in some children with short bowel syndrome with 6 months of use. The effect of this analog on growth is unknown. We aim to describe growth patterns in children with IF after PN weaning and during treatment with a GLP-2 analog.

Methods: This retrospective observational study was conducted at two centers for pediatric intestinal rehabilitation (IR) in the US. Eligibility criteria included a diagnosis of IF (PN use ≥60 days within a 74 consecutive day interval) at <12 months of age. Patients were referred for IR between September 1989 and January 2023. Z-score values for weight and length/height (adjusted for gestational age up to 2 years of age) are described in those who weaned from PN and in those who received a GLP-2 analog (Gattex®) for ≥6 months (2017-2023).

Results: There were 362 children (57% male, 72% white) with a median age at diagnosis of 6 days (interquartile range [IQR] 1,22) eligible for the study. Common diagnoses included necrotizing enterocolitis (28%), gastroschisis (23%), and small bowel atresia (16%). The median gestational age was 34 weeks (IQR 31,37), the percent small bowel remaining at diagnosis was 23% (IQR 10,50), and 36% had a functional ileocecal valve. One hundred forty-five children (40%) were successfully weaned from PN (median time to wean = 1.5 y [IQR 1,2.9]). Of those weaned, 123/145 (85%) achieved enteral autonomy (maintenance of normal growth for >3 consecutive months). Median weight and length/height z-scores at the time of PN weaning were −1.04 (IQR −2.09, −0.12) and −1.86 (IQR −3.01, −0.69), respectively. After weaning from PN, weight and linear growth velocity were maintained in 44% and 39% of children, respectively, in year 1 and in 59% and 55% in year 2. Acceleration in weight and linear growth velocity was observed in 28% and 34%, respectively, in year 1 and in 22% and 31% in year 2. Fourteen children received a GLP-2 analog for a median of 912 days (IQR 365,1304). Of these, 3 were weaned from parenteral support within 9 months. Changes in weight and linear growth velocity z-scores between GLP-2 start and 2 years post-initiation are shown in Table 1.

Annemarie Rompca, MD1; Morgan McLuckey, MD2; Anthony J. Perkins3; Xiaoyi Zhang, MD, PhD1; Charles Vanderpool, MD1

1Riley Hospital for Children, Indianapolis, IN; 2Department of Radiology, Indianapolis, IN; 3Indiana University School of Medicine, Indianapolis, IN

Financial Support: None Reported.

Background: Inflammatory bowel disease (IBD) can impact patients' nutritional status. Poor oral intake, poor absorption of nutrients, protein loss in stool, and increased energy requirements can contribute to poor nutrition in this patient population. Poor nutritional status can manifest as poor growth, poor weight gain, and sarcopenia, defined as decreased muscle mass and strength. Studies have demonstrated that decreased muscle mass in pediatric IBD patients is associated with a need for escalated therapy, an increased need for surgery, and an increased risk of post-operative complications. We sought to measure muscle mass at IBD diagnosis in our cohort on cross-sectional imaging, compare it to known age- and sex-specific psoas muscle reference values for pediatric norms, and compare muscle mass between IBD subtypes and its correlations with anthropometrics at diagnosis.

Methods: This was a single-center retrospective study at a tertiary care facility. Patients with new diagnoses of IBD [Crohn's disease (CD), ulcerative colitis (UC), or indeterminate colitis (IC)] aged 6 to 16 years at diagnosis from May 15, 2018, through December 31, 2019, were included. Those who had chronic medical conditions or no accessible cross-sectional imaging within 3 months of diagnosis were excluded. Demographic and anthropometric data at IBD diagnosis were obtained. The psoas muscle area in mm2 was measured bilaterally on cross-sectional imaging at lumbar level 3-4 (L3-4) and lumbar level 4-5 (L4-5). Right and left measurements were summed to obtain the total psoas muscle area (TPMA) at each level. These measurements were compared to pediatric psoas muscle area reference values. We used analysis of variance to determine whether outcomes differed by IBD type, and Spearman correlations to assess the relationship between anthropometric measures and outcomes of interest. All analyses were performed using SAS v9.4.
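The abstract's analyses were run in SAS v9.4. As an illustrative sketch only (not the authors' code), the same pair of tests — a one-way ANOVA of psoas muscle area across IBD subtypes and a Spearman correlation against an anthropometric measure — can be expressed with SciPy; all data and variable names below are synthetic and hypothetical.

```python
# Illustrative sketch only: synthetic data, hypothetical variable names.
# The abstract's actual analyses were performed in SAS v9.4.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic total psoas muscle area (TPMA, mm^2) for three IBD subtypes,
# with group sizes matching the abstract (CD n=50, UC n=17, IC n=3)
tpma_cd = rng.normal(900, 150, 50)   # Crohn's disease
tpma_uc = rng.normal(1000, 150, 17)  # ulcerative colitis
tpma_ic = rng.normal(950, 150, 3)    # indeterminate colitis

# One-way ANOVA: does TPMA differ across IBD types?
f_stat, p_anova = stats.f_oneway(tpma_cd, tpma_uc, tpma_ic)

# Spearman correlation between TPMA and a synthetic anthropometric
# measure (here a hypothetical BMI z-score) within the CD group
bmi_z = rng.normal(0, 1, 50)
rho, p_spear = stats.spearmanr(tpma_cd, bmi_z)

print(f"ANOVA: F={f_stat:.2f}, p={p_anova:.3f}")
print(f"Spearman: rho={rho:.2f}, p={p_spear:.3f}")
```

Both tests are rank/variance-based and make no assumption about the pediatric reference values themselves, which in the study were applied beforehand to convert TPMA to z-scores.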

Results: Cross-sectional imaging from 70 patients with newly diagnosed IBD was reviewed. The average age was 11.9 years, with a male predominance (42 patients, 60%). Most patients were diagnosed with CD (n = 50, 71.4%), followed by UC (n = 17, 24.3%) and IC (n = 3, 4.3%). The mean z-score for all patients' TPMA was −1.7 at L3-4 and −1.4 at L4-5 (Table 1). Measures of sarcopenia (TPMA at both lumbar levels and the z-score at L3-4) were significantly different across IBD types (CD vs UC vs IC) (Table 2).

Best of ASPEN - Pediatric, Neonatal, Pregnancy, and Lactation

Abstract of Distinction

Adam Russman, MD1; Anne McCallister, CPNP2; Anthony J. Perkins3; Charles Vanderpool, MD4

1Children's Medical Center of Dallas, Dallas, TX; 2Riley Hospital for Children at Indiana University Health, Indianapolis, IN; 3Indiana University School of Medicine, Indianapolis, IN; 4Riley Hospital for Children, Indianapolis, IN

Financial Support: None Reported.

Background: The Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition (AND/ASPEN) published malnutrition guidelines in 2014. Literature describing clinical outcomes in hospitalized children with a malnutrition diagnosis is limited and few studies focus on the impact of malnutrition severity subtype on clinical outcomes.

Methods: We analyzed patients admitted to our pediatric hospital from 2019 to 2022, excluding maternal/obstetrics admissions. Patients were diagnosed with malnutrition and assigned a severity subtype by a registered dietitian according to AND/ASPEN guidelines. Unspecified malnutrition was assigned when there was insufficient physician documentation to determine the malnutrition severity subtype. Data on readmission rate, mortality, length of stay (LOS), LOS index, hospital cost, operative procedures (OR, any procedure), and pediatric intensive care unit (ICU) admission were collected. Clinical outcomes were also analyzed by malnutrition severity subtype and compared to patients who were not diagnosed with malnutrition. Because both variables were highly skewed, we used natural log transformations, ln(LOS + 1) and ln(cost + 1), for the LOS and cost analyses. Mixed-effects regression analysis was completed to account for the clustering of repeated admissions. All analyses were performed using SAS v9.4.
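The log(x + 1) transform mentioned above is a standard remedy for right-skewed outcomes such as LOS and cost; adding 1 keeps the transform defined for zero-day stays or zero costs. As a minimal sketch (synthetic data, hypothetical names; the actual analysis was a mixed-effects regression in SAS v9.4), its effect on skewness can be demonstrated with NumPy:

```python
# Illustrative sketch only: synthetic right-skewed data, hypothetical names.
# The abstract's mixed-effects regressions were fit in SAS v9.4.
import numpy as np

rng = np.random.default_rng(1)

# Right-skewed length-of-stay (days) and hospital cost (USD)
los = rng.lognormal(mean=1.2, sigma=1.0, size=1000)
cost = rng.lognormal(mean=9.0, sigma=1.2, size=1000)

# np.log1p computes ln(x + 1): defined at x = 0, and it pulls in the
# long right tail so model residuals are closer to normal
log_los = np.log1p(los)
log_cost = np.log1p(cost)

def skew(x):
    """Sample skewness: mean cubed standardized deviation."""
    x = np.asarray(x)
    return np.mean(((x - x.mean()) / x.std()) ** 3)

# Skewness drops substantially after the transform
print(f"LOS skewness: raw={skew(los):.2f}, transformed={skew(log_los):.2f}")
```

Coefficients from a model of ln(LOS + 1) are interpreted on the log scale, which is why the abstract reports median LOS rather than means.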

Results: Any malnutrition diagnosis was associated with higher 7-, 14-, and 30-day readmission rates compared to patients without a malnutrition diagnosis. Malnourished patients had a higher mortality rate, median LOS, LOS index, cost, ICU admission rate, and operative procedure rate compared to patients without a malnutrition diagnosis (Table 1). Table 2 presents the analysis by malnutrition severity subtype. Patients with mild, moderate, and severe malnutrition all had significantly higher readmission rates at the 7-, 14-, and 30-day time points compared to patients with no malnutrition; patients with unspecified malnutrition had a higher readmission rate only at 30 days. At all three readmission time points, there were no significant differences in readmission rates between malnutrition severity categories. Severe malnutrition was the only subtype with a significantly increased mortality rate compared to no malnutrition (p = 0.005). Admissions with mild, moderate, unspecified, and severe malnutrition had significantly higher LOS index, LOS, and total costs than admissions without a malnutrition diagnosis. Mild malnutrition admissions had a significantly higher LOS index than moderate (p = 0.050) and severe (p = 0.014) malnutrition admissions, while unspecified severity admissions had a significantly higher LOS index than severe admissions (p = 0.026). Mild (p = 0.032), moderate (p = 0.015), and severe (p = 0.001) malnutrition admissions had significantly higher LOS than unspecified severity admissions. Mild malnutrition admissions (p = 0.011) had significantly higher costs than admissions with unspecified malnutrition.

HRTFR已经开启,并且订购了血管加压药,或者(3)两个订单同时被放置。弹出窗口显示了警报的原因,避免HRTFR的重要性,提供了更安全的TFR选项,并建议与RD联系以获得指导。为了保持患者护理的个体化,顺序验证是不可推翻的,因为低血管加压剂剂量的患者适合进行HRTFR。 在订单验证弹出窗口实施后,在2023年3月至2023年5月之间完成了图表审查,以评估触发订单验证弹出窗口后的发生率和操作。结果:在2023年3月至2023年5月期间,订单验证弹出框共触发220次,共有59例患者。在220个触发器中,根据弹出式说明,42个(19%)导致订单更改或中断,或者未订购HRTFR。在导致适当调整HRTFR的42个触发因素中,26个(61%)发生在正常RD时间之外。其余的触发因素,在没有改变的情况下,被发现有低剂量的血管加压药,血管加压药列在MAR上,但没有积极使用,或者在MAR上订购了HRTFR,但根据护理沟通命令保留。结论:新型营养支持特定订单验证弹出窗口的创建对订餐提供者具有教育和指导作用。有了这一额外的安全层,2023年3月至2023年5月期间,42名ICU患者接受了更安全的TFR治疗,其中大部分影响发生在RD工作时间之外。最佳的ASPEN -肠内营养治疗-以容量为基础的肠内营养的胜利julie M. Geyer, RD-AP, cnsccolorado大学医院,Aurora, co .财政支持:无报道。背景:肠内营养(EN)在医院设置传统上是由固定率为基础的喂养方法(RBEN)。使用RBEN的研究发现,由于中断或扣留,实际配方交付平均为规定量的60%至70%。低于能量需求的营养供应会造成营养不良和负面后果,包括保健费用增加、发病率和死亡率增加。美国肠外和肠内营养学会(ASPEN)和重症医学学会(SCCM)建议使用基于容量的肠内营养喂养方法(VBEN)来改善营养输送,减少能量不足,防止过度喂养。方法:本质量改进研究于2022年6月至2023年9月在一家一级创伤学术医院进行。2022年9月,成立了全医院流程改进委员会,分多阶段实施VBEN。在2022年9月之前,以单位为单位的营养师进行了质量改进,以解决喂养中断的常见原因。纳入标准包括表现出目标RBEN耐受的患者。最大小时流速设定为150 mL/hr。“目标”供应被设定为规定配方量的90%至110%。数据收集中的患者耐受EN达到RBEN目标,配方奶摄入量直接从喂养泵历史中获取。对电子病历(EMR)的更改包括:创建一个VBEN计算器,在管饲流程中内置行指令,每4小时创建一个护士提醒任务,以重新计算配方摄入量并根据需要调整比率。给药记录中配方单的更改包括VBEN和RBEN喂养方法的规范和标准化的给药说明(图1)。护士、营养师和医疗服务提供者通过电子邮件交流、面对面培训、互动式学习辅助视频和一对一指导等方式接受了VBEN工作流程和流程的培训。结果:在2023年6月之前,RBEN是标准喂养方法。从2020年10月到2022年12月,在一个重症监护室进行的例行质量改进审计表明,尽管采取了改善配方奶粉交付的策略,但实际配方奶粉供应达到“目标”的比例为50%至74%(表1)。2022年6月,对整个医院的配方奶粉供应进行了审计,包括所有级别的护理(基层、中级和重症护理)。在总共346天的工作天数中,63%的工作天数达到了符合“目标”的实际配方规定(表2)。2022年11月,对选择用于第一阶段实施的两个ICU病房进行了审计。在总共154天的工作天数中,57%的工作天数达到了符合“目标”的配方规定(表2)。第一阶段的实施于2023年6月进行。已完成上线后审计。在总共157个EN日中,83%的EN日达到了“目标”公式量(表2)。无低血糖/高血糖或胃肠道并发症报告。第一阶段被认为是成功的,并获得批准继续在剩余的住院单位逐步实施VBEN。 
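The rate arithmetic behind a VBEN calculator like the one described above is simple enough to sketch. Only the 150 mL/hr cap and the 90%–110% "goal" window come from the abstract; the function names and the example numbers below are illustrative assumptions, not the hospital's actual EMR logic.

```python
def vben_rate(daily_goal_ml: float, delivered_ml: float,
              hours_left: float, max_rate_ml_hr: float = 150.0) -> float:
    """Recalculate the pump rate under a volume-based (VBEN) protocol:
    spread the remaining prescribed volume over the hours left in the
    24-hour feeding window, capped at the protocol maximum (150 mL/hr)."""
    remaining = max(daily_goal_ml - delivered_ml, 0.0)
    if hours_left <= 0:
        return 0.0
    return min(remaining / hours_left, max_rate_ml_hr)

def within_goal(delivered_ml: float, prescribed_ml: float) -> bool:
    """'Goal' delivery per the abstract: 90% to 110% of prescribed volume."""
    return 0.9 * prescribed_ml <= delivered_ml <= 1.1 * prescribed_ml

# Hypothetical example: a 1,680 mL/day prescription, with only 280 mL
# delivered after 8 hours because of an interruption. Rate needed for
# the remaining 16 hours: (1680 - 280) / 16 = 87.5 mL/hr.
rate = vben_rate(1680, 280, 16)
```

The cap matters: after a long hold, the naive catch-up rate can exceed what is safe to infuse, so the calculator returns the protocol maximum and the remaining deficit is simply not made up that day.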
Methods: A retrospective review was conducted of approximately 162 patients who received virtual enteral tube feeding instruction from an RD between June 2022 and June 2023. Virtual instruction was completed for the enteral feeding pump, gravity bag, and bolus/syringe administration methods. Investigators placed a follow-up call to active patients asking about their experience with the virtual instruction. For patients who could not be reached, a medical record review was completed to determine whether an inbound call with questions had been received after the virtual instruction. Patients were asked about their confidence in their ability to administer enteral feedings, whether they had any concerns after completing the virtual instruction, whether they knew whom to contact afterward, and whether the reference materials provided were helpful. Patients who did not receive virtual instruction or who had been discharged from service were excluded from this review. Results: A total of 162 patients were assessed for potential eligibility for analysis; 115 were excluded. Of those excluded, 100 (87%) were no longer on service; 12 (10%) declined virtual instruction because of home health agency instruction, caregiver or dietitian inpatient instruction before the start of care, or assistance from the home infusion company sales team; and 3 (3%) did not attend the appointment. Of the remaining eligible patients, 18 could not be reached for follow-up; among them, there were no documented inbound calls regarding feeding or equipment questions or concerns. Of all eligible patients, 29 provided telephone feedback on the virtual instruction experience. Virtual instruction involved the following administration methods: enteral pump (86%, n = 25), followed by gravity bag and bolus/syringe (14%, n = 4). After completing instruction, 27 (93%) were confident with feeding administration, while 2 (7%), who considered themselves "in-person learners," were not; 24 (83%) had no problems/concerns and 5 (17%) did; 27 (93%) reported knowing whom to contact and 2 (7%) did not; 22 (76%) found the reference materials provided helpful, 2 (7%) did not, and 5 (17%) had not reviewed them. Conclusions: Recent technological advances have made virtual instruction possible. Virtual enteral instruction can be a successful tool for patients learning to administer tube feedings when in-person instruction is unavailable in the home care setting. However, the client's preferred learning style should be considered, as should further study of how virtual instruction can be leveraged to improve this process. Because the literature on virtual instruction outcomes is limited, further research is warranted.

Danelle A. Olson, RDN; Lisa M. Epp, RDN; Osman Mohamed Elfadil, MBBS; Ryan T. Hurt, MD; Manpreet S. Mundi, MD. Mayo Clinic, Rochester, MN. Financial Support: None reported.

Background: The prevalence of bariatric surgery has increased significantly in recent years, as it is the most effective long-term treatment for obesity. The two most common procedures, sleeve gastrectomy (SG) and Roux-en-Y gastric bypass (RYGB), alter gastrointestinal anatomy, produce significant weight loss, and lead to remission of obesity-related comorbidities, including type 2 diabetes. Despite these benefits, bariatric surgery can be associated with serious, debilitating complications. Although the true prevalence and mechanism remain unclear, hypoglycemia has been reported in up to 38% of post-RYGB patients and can be very difficult to control. Data remain scarce on the role of enteral nutrition (EN) as a potential treatment. Methods: We retrospectively reviewed the EMR of patients who started tube feeding for reactive hypoglycemia at our outpatient home enteral nutrition (HEN) clinic between March 2017 and July 2023. In addition to baseline clinical characteristics and demographics, we collected data on hypoglycemic events, interventions, EN regimens, and outcomes. Results: Six patients in the HEN clinic presented with post-bariatric reactive hypoglycemia (mean age 45.5 ± 9.6 years, 66.7% female, mean BMI at HEN initiation 28.6 ± 8.3). Five of the 6 patients had undergone RYGB, and 1/6 had laparoscopic adjustable gastric banding (LAGB) followed by a modified sleeve gastrectomy (SG). The interval from surgery to presentation with reactive hypoglycemia varied across the cohort, with the first documented event occurring a mean of 2.6 ± 3.2 years after surgery. Notably, patients lost a mean of 51.2 ± 28.5 kg after surgery and before requiring EN support. Weight changed little after EN initiation, with patients maintaining at a mean of +2.5 kg at one and three months after HEN initiation. Table 1 presents the patients' profiles. Dietary modification, particularly reduced intake of refined carbohydrates, was recommended for all patients; however, poor adherence was common, with 5/6 (83%) of patients not adhering to the prescribed diet. In addition to the EN and dietary regimens prescribed for all patients, some received specific therapies to prevent or control reactive hypoglycemia; in one case, an α-glucosidase inhibitor, somatostatin, and a complete dietary overhaul were used in combination. Most patients underwent an initial trial of EN via nasojejunal tube and were converted to a percutaneous tube once efficacy was established (Table 2). Most patients used a standard polymeric formula, although commercial blenderized tube feeding was offered. With EN, reactive hypoglycemia resolved in 4 of 6 patients, while only 2 continued to have symptoms. Two patients discontinued EN because of feeding complications and nonadherence; the remaining 4 continued on EN.

Anna K. Burnske; Liyun Zhang, MS; Yanping Pan, PhD; Theresa Mikhailov, MD, PhD. Medical College of Wisconsin, Milwaukee, WI. Financial Support: None reported.

Background: Malnourished patients have worse outcomes. Many standardized tools have been developed to screen acutely ill pediatric patients for malnutrition: the Pediatric Yorkhill Malnutrition Score (PYMS), the Pediatric Nutrition Screening Tool (PNST), and the Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP). In addition, some institutions have developed their own tools for this purpose, known as "homegrown" tools. Regardless of their origin, none of these tools has been validated in critically ill children. Registered dietitians (RDs) perform nutrition assessments based on the results of these nutrition screens or per protocols within their institutions. Virtual Pediatric Systems, LLC (VPS), an international data registry that supports standardized data sharing for research, improved patient care, and benchmarking in pediatric ICUs, developed a nutrition module to capture nutrition metric data. VPS has collected data from centers participating in the nutrition module since October 2019, amounting to approximately 10,000 patients per calendar year. The specific aims were to compare nutrition screening tools with dietitian assessments to determine screening tool accuracy, and to determine whether standardized screening tools are more accurate than tools developed at a single center. We hypothesized that (1) the nutrition screening tools used by participating centers would accurately identify malnourished children, and (2) standardized tools would be more accurate than tools developed at a single center. Methods: In this project, we compared pediatric nutrition screening tools with assessments performed by RDs to determine whether the screening tools accurately identify malnourished patients. We also determined which nutrition screening tools more accurately identify patients who are malnourished or at risk of malnutrition during PICU hospitalization, so that appropriate nutrition therapy can be initiated. We obtained de-identified demographic and clinical data from the VPS database for all patients under 18 years of age at centers participating in the nutrition module from October 2019 to March 2023. We considered the RD assessment the gold standard for identifying malnutrition and compared the nutrition screening tools against it. The degree of agreement between the screening tools and the RD assessment regarding malnutrition was determined by Cohen's kappa (κ). Results: After selecting subjects with both a completed pediatric nutrition screen and an RD assessment, the final data cohort included 9,891 patients, of whom 54% were male; 4% were neonates (≤29 days), 34% infants (≤2 years), 35% children (2–12 years), and 26% adolescents (12–18 years). Of the subjects, 40% were White, 17.5% Black, 22.5% Hispanic, 5.7% Asian, and 14.2% other/mixed race. The kappa coefficient for the standardized nutrition screening tools was 0.38, which is considered "fair" agreement between the screening tools and the RD "gold standard" assessment. Kappa coefficients for the other, unidentified tools listed in VPS as "homegrown" or "other" ranged from 0.31 to 0.91, where 0.91 is near-perfect agreement between the screening tool and the RD "gold standard." Conclusions: These data show only fair agreement between the standardized screening tools (PYMS, PNST, STAMP) and RD assessment, meaning these tools do not adequately assess the nutrition status of critically ill children.
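For readers unfamiliar with the agreement statistic used in the abstract above, Cohen's kappa compares observed agreement with the agreement expected by chance from each rater's marginal rates: κ = (p_o − p_e) / (1 − p_e). A minimal sketch for two binary raters (screening tool vs. RD assessment) follows; the function and the data shape are illustrative assumptions, not the study's actual analysis code.

```python
def cohens_kappa(ratings):
    """Cohen's kappa for two binary raters.

    `ratings` is a list of (tool_flag, rd_flag) pairs, e.g. (True, True)
    means the screening tool flagged malnutrition and the RD assessment
    (the 'gold standard') confirmed it. kappa = (p_o - p_e) / (1 - p_e).
    """
    n = len(ratings)
    p_o = sum(a == b for a, b in ratings) / n          # observed agreement
    # chance agreement from each rater's marginal positive rate
    p_tool = sum(a for a, _ in ratings) / n
    p_rd = sum(b for _, b in ratings) / n
    p_e = p_tool * p_rd + (1 - p_tool) * (1 - p_rd)
    return (p_o - p_e) / (1 - p_e)
```

On the usual Landis–Koch scale, 0.21–0.40 is "fair" agreement (hence the abstract's reading of κ = 0.38) and values above 0.80 approach the "near-perfect" range cited for the best homegrown tool.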
Journal description: The Journal of Parenteral and Enteral Nutrition (JPEN) is the premier scientific journal of nutrition and metabolic support. It publishes original peer-reviewed studies that define the cutting edge of basic and clinical research in the field. It explores the science of optimizing the care of patients receiving enteral or IV therapies. Also included: reviews, techniques, brief reports, case reports, and abstracts.