A Randomized Trial of Vitamin D Supplementation and COVID-19 Clinical Outcomes and Long COVID: The Vitamin D for COVID-19 Trial
Pub Date: 2026-03-12 | DOI: 10.1016/j.tjnut.2026.101398
Davaasambuu Ganmaa, Kaitlyn A Cook, Polyna Khudyakov, Dorjbal Enkhjargal, Tsolmon Bilegtsaikhan, Kenneth H Mayer, Allison Clar, Michael Rueschman, Raji Balasubramanian, Aditi Hazra, Howard D Sesso, Valerie E Stone, Patricia Copeland, Georgina Friedenberg, Douglas C Smith, Quanhong Lei, Todd Lee, Emily G McDonald, Tserenkhuu Enkhtsetseg, Erdenebaatar Sumiya, Yansanjav Narankhuu, Myagmarsuren Erdenetuya, Dalkh Tserendagva, Rikard Landberg, Niclas Roxhed, Susanne Rautiainen Lagerström, JoAnn E Manson
Background: Data from randomized controlled trials on whether vitamin D3 supplementation modifies the course of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infections are sparse.
Objectives: We evaluated the effect of vitamin D3 supplementation on healthcare utilization and other clinical outcomes among adults with coronavirus disease 2019 (COVID-19) and their close contacts.
Methods: We conducted a parallel 2-group randomized controlled double-blinded trial targeting free-living adults in the United States and Mongolia. Index participants with newly diagnosed COVID-19 were cluster-randomized with up to one of their cohabiting contacts either to an oral vitamin D3 loading dose of 9600 IU/d for 2 d followed by 3200 IU/d for 4 wk or to placebo. Participants completed weekly questionnaires on healthcare utilization, disease severity, and long COVID (index participants) or new SARS-CoV-2 infection (household contacts). The primary outcome was ≥1 healthcare visits (including hospitalization) or death within 4 wk among the index participants.
Results: Index participants (n = 1747) were a median of 38.0 y old (IQR: 31.1-47.0), 65.6% female/other sex, 4.2% Black non-Hispanic, 4.8% Hispanic/Latinx, 43.2% Asian, 44.3% non-Hispanic White, and 44.9% vitamin D deficient or insufficient (25-hydroxyvitamin D3 <20 ng/mL). Baseline characteristics for the household contacts (n = 277) were similar. The 4-wk cumulative incidence of healthcare utilization in index participants did not significantly differ between the vitamin D3 (n = 863) and placebo (n = 884) groups [cumulative incidences, 0.28 compared with 0.29; odds ratio (OR), 0.97; 95% confidence interval (CI): 0.75, 1.24]. Similar nonsignificant results were observed for the prespecified secondary treatment and prevention outcomes, though per-protocol analyses showed a nonsignificant trend toward benefit of vitamin D3 on the prevalence of long COVID at 8 wk (OR, 0.78; 95% CI: 0.59, 1.03). No safety concerns were identified.
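To make the headline comparison concrete, the sketch below computes an unadjusted odds ratio and Wald 95% confidence interval from a 2×2 table. The event counts are approximations back-calculated from the reported cumulative incidences and group sizes, and the sketch ignores the household clustering that the trial's actual models account for, so the numbers will only roughly match the published estimate.

```python
# Illustrative only: unadjusted odds ratio and Wald 95% CI from a 2x2 table.
# Event counts are approximations back-calculated from the reported cumulative
# incidences (0.28 of n=863; 0.29 of n=884); the trial's models additionally
# accounted for household clustering, which this sketch does not.
import math

events_d3, n_d3 = 242, 863        # vitamin D3 arm (assumed counts)
events_pl, n_pl = 256, 884        # placebo arm (assumed counts)

odds_d3 = events_d3 / (n_d3 - events_d3)
odds_pl = events_pl / (n_pl - events_pl)
or_ = odds_d3 / odds_pl

# Wald confidence interval on the log-odds-ratio scale
se_log_or = math.sqrt(1 / events_d3 + 1 / (n_d3 - events_d3) +
                      1 / events_pl + 1 / (n_pl - events_pl))
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)
print(f"OR = {or_:.2f}, 95% CI: {lo:.2f}, {hi:.2f}")
```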
Conclusions: Among adults with newly diagnosed SARS-CoV-2 infections, vitamin D3 supplementation did not significantly change the 4-wk cumulative incidence of healthcare utilization or COVID-19-related outcomes compared with placebo. Promising results for long COVID warrant further study. This study was registered at clinicaltrials.gov as NCT04536298. First registered on 1 September 2020.
{"title":"A Randomized Trial of Vitamin D Supplementation and COVID-19 Clinical Outcomes and Long COVID: The Vitamin D for COVID-19 Trial.","authors":"Davaasambuu Ganmaa, Kaitlyn A Cook, Polyna Khudyakov, Dorjbal Enkhjargal, Tsolmon Bilegtsaikhan, Kenneth H Mayer, Allison Clar, Michael Rueschman, Raji Balasubramanian, Aditi Hazra, Howard D Sesso, Valerie E Stone, Patricia Copeland, Georgina Friedenberg, Douglas C Smith, Quanhong Lei, Todd Lee, Emily G McDonald, Tserenkhuu Enkhtsetseg, Erdenebaatar Sumiya, Yansanjav Narankhuu, Myagmarsuren Erdenetuya, Dalkh Tserendagva, Rikard Landberg, Niclas Roxhed, Susanne Rautiainen Lagerström, JoAnn E Manson","doi":"10.1016/j.tjnut.2026.101398","DOIUrl":"https://doi.org/10.1016/j.tjnut.2026.101398","url":null,"abstract":"<p><strong>Background: </strong>Data from randomized controlled trials of vitamin D<sub>3</sub> supplementation in modifying the course of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infections are sparse.</p><p><strong>Objectives: </strong>We evaluated the effect of vitamin D<sub>3</sub> supplementation on healthcare utilization and other clinical outcomes among adults with coronavirus disease 2019 (COVID-19) and their close contacts.</p><p><strong>Methods: </strong>We conducted a parallel 2-group randomized controlled double-blinded trial targeting free-living adults in the United States and Mongolia. Index participants with newly diagnosed COVID-19 were cluster-randomized with up to one of their cohabiting contacts either to an oral vitamin D<sub>3</sub> loading dose of 9600 IU/d for 2 d followed by 3200 IU/d for 4 wk or to placebo. Participants completed weekly questionnaires on healthcare utilization, disease severity, and long COVID (index participants) or new SARS-CoV-2 infection (household contacts). The primary outcome was ≥1 healthcare visits (including hospitalization) or death within 4 wk among the index participants.</p><p><strong>Results: </strong>Index participants (n = 1747) were a median of 38.0 y old (IQR: 31.1-47.0), 65.6% female/other sex, 4.2% Black non-Hispanic, 4.8% Hispanic/Latinx, 43.2% Asian, 44.3% non-Hispanic White, and 44.9% vitamin D deficient or insufficient (25-hydroxyvitamin D<sub>3</sub> <20 ng/mL). Baseline characteristics for the household contacts (n = 277) were similar. The 4-wk cumulative incidence of healthcare utilization in index participants did not significantly differ between the vitamin D<sub>3</sub> (n = 863) and placebo (n = 884) groups [cumulative incidences, 0.28 compared with 0.29; odds ratio (OR), 0.97; 95% confidence interval (CI): 0.75, 1.24]. Similar nonsignificant results were observed for the prespecified secondary treatment and prevention outcomes, though per-protocol analyses showed a nonsignificant trend toward benefit of vitamin D<sub>3</sub> on the prevalence of long COVID at 8 wk (OR, 0.78; 95% CI: 0.59, 1.03). No safety concerns were identified.</p><p><strong>Conclusions: </strong>Among adults with newly diagnosed SARS-CoV-2 infections, vitamin D<sub>3</sub> supplementation did not significantly change the 4-wk cumulative incidence of healthcare utilization or COVID-19-related outcomes compared with placebo. Promising results for long COVID warrant further study. This study was registered at clinicaltrials.gov as NCT04536298. 
First registered on 1 September, 2020.</p>","PeriodicalId":16620,"journal":{"name":"Journal of Nutrition","volume":" ","pages":"101398"},"PeriodicalIF":3.8,"publicationDate":"2026-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147458093","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Location-specific predictors of double burden of malnutrition among Nigerian Mother-Child pairs: Re-evaluating dietary quality and socioeconomic factors
Pub Date: 2026-03-12 | DOI: 10.1016/j.tjnut.2026.101478
Beulah F Ortutu, Faidat A Adeleke, Hajara Idris, Chiamaka J Ezenwa, Adeleke A Folasade, Okemudi Nwonye, Linda O Edafioghor, Ifeoma M Egechizuorom, Gideon O Iheme
Background: The double burden of malnutrition (DBM), defined as the coexistence within the same household of maternal overweight/obesity or undernutrition and child undernutrition or overweight/obesity, is a growing concern in Nigeria. Drivers of DBM may differ by location due to urbanization, socioeconomic gradients, and dietary transitions.
Objective: This study examined location-specific predictors of DBM among Nigerian mother-child pairs, with a focus on child dietary quality, maternal education, household food insecurity, and wealth index.
Methods: A descriptive cross-sectional study using a stratified multistage sampling technique was conducted among 1,295 mother-child pairs (children aged 6-23 months) across four Nigerian cities. Child nutritional status was assessed using WHO growth standards, and maternal BMI was classified according to WHO adult cutoffs. Household food insecurity, dietary diversity, minimum meal frequency, and wealth index were measured using validated tools. Associations between predictors and DBM were examined using chi-square tests and generalized estimating equations, including interaction and stratified analyses by location.
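As an illustration of the generalized estimating equations approach described above, here is a minimal sketch on synthetic data; the variable names (dbm, semi_urban, food_insecure, meets_mmf, cluster) are hypothetical stand-ins for the study's measures, not the authors' code.

```python
# Minimal GEE logistic sketch for a clustered binary outcome (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "dbm": rng.integers(0, 2, n),            # 1 = double burden present
    "semi_urban": rng.integers(0, 2, n),     # 1 = semi-urban residence
    "food_insecure": rng.integers(0, 2, n),  # 1 = food-insecure household
    "meets_mmf": rng.integers(0, 2, n),      # 1 = meets minimum meal frequency
    "cluster": rng.integers(0, 40, n),       # sampling cluster identifier
})

model = smf.gee(
    "dbm ~ semi_urban + food_insecure + meets_mmf",
    groups="cluster",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(np.exp(result.params))       # adjusted odds ratios
print(np.exp(result.conf_int()))   # 95% CIs on the OR scale
```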
Results: DBM prevalence was 37.4%, with the most frequent phenotype being overweight mothers and undernourished children (34%). Semi-urban residence (adjusted odds ratio [AOR]: 2.11; 95% CI: 1.93-2.30), food-secure households (AOR: 1.22; 95% CI: 1.09-1.37), and not meeting the minimum meal frequency (AOR: 1.41; 95% CI: 1.04-1.92) were associated with an increased risk of DBM.
Conclusions: DBM among Nigerian mother-child pairs is shaped by dietary factors. Context-specific interventions are needed, with a focus on improving child diet quality in semi-urban areas.
{"title":"Location-specific predictors of double burden of malnutrition among Nigerian Mother-Child pairs: Re-evaluating dietary quality and socioeconomic factors.","authors":"Beulah F Ortutu, Faidat A Adeleke, Hajara Idris, Chiamaka J Ezenwa, Adeleke A Folasade, Okemudi Nwonye, Linda O Edafioghor, Ifeoma M Egechizuorom, Gideon O Iheme","doi":"10.1016/j.tjnut.2026.101478","DOIUrl":"https://doi.org/10.1016/j.tjnut.2026.101478","url":null,"abstract":"<p><strong>Background: </strong>The double burden of malnutrition (DBM), classified as the coexistence of maternal overweight or obesity or maternal undernutrition and child undernutrition or childhood overweight/obesity, within the same household, is an increasing concern in Nigeria. Drivers of DBM may differ by location due to urbanization, socioeconomic gradients, and dietary transitions.</p><p><strong>Objective: </strong>This study examined location-specific predictors of DBM among Nigerian mother-child pairs, with a focus on child dietary quality, maternal education, household food insecurity, and wealth index.</p><p><strong>Methods: </strong>A descriptive cross-sectional study using a stratified multistage sampling technique was conducted among 1,295 mother-child pairs (children aged 6 - 23 months) across four Nigerian cities. Child nutritional status was assessed using WHO growth standards, and maternal BMI was classified according to WHO adult cutoffs. Household food insecurity, dietary diversity, minimum meal frequency, and wealth index were measured using validated tools. Associations between predictors and DBM were examined using chi-square tests and generalized estimating equations, including interaction and stratified analyses by location.</p><p><strong>Results: </strong>DBM prevalence was 37.4%, with the most frequent phenotype being overweight mothers and undernourished children (34%). Semi-urban residence (adjusted odds ratio [AOR]: 2.11; 95% CI: 1.93-2.30), food secure households (AOR: 1.22; 95% CI: 1.09-1.37), and not meeting the minimum meal frequency (AOR: 1.41; 95% CI: 1.04-1.92) were associated with an increased risk of DBM.</p><p><strong>Conclusions: </strong>DBM among Nigerian mother-child pairs is shaped by dietary factors. Context-specific interventions are needed, with a focus on improving child diet quality in semi-urban areas.</p>","PeriodicalId":16620,"journal":{"name":"Journal of Nutrition","volume":" ","pages":"101478"},"PeriodicalIF":3.8,"publicationDate":"2026-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147457967","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Selenium Status is Associated with Living Environment and Dietary Intake among Children Aged 3-17 Years in China
Pub Date: 2026-03-12 | DOI: 10.1016/j.tjnut.2026.101480
Yu Zhou, Shan Jiang, Xuehong Pang, Shujuan Li, Yifan Duan, Shuxia Wang, Wei Cao, Qian Zhang, Tao Xu, Bowen Chen, Yuying Wang, Zhenyu Yang, Wenhua Zhao
Background: Selenium is essential for children's health, and either deficiency or excess can pose serious health risks. However, nationwide data on selenium nutrition in Chinese children are still lacking, as existing studies have been largely limited to historically selenium-deficient regions affected by Keshan disease (KD) and Kashin-Beck disease (KBD).
Objective: This study aims to investigate the relationship between the living environment, geographic factors, dietary intake, and the selenium status of Chinese children aged 3-17 years.
Methods: Data were derived from the National Nutrition and Health Systematic Survey for children aged 0-17 years (CNHSC), conducted between 2019 and 2021. Field surveys were conducted to collect general demographic information, and a food frequency questionnaire (FFQ) was used to assess dietary intake. Blood selenium concentrations were measured using Inductively Coupled Plasma Mass Spectrometry (ICP-MS, Agilent 7700x), with a cutoff value of 70 μg/L indicating inadequate selenium status.
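A small sketch of the cutoff-based classification described above: blood selenium below 70 μg/L flags inadequate status, and a crude odds ratio compares two residence groups. The data and column names are synthetic placeholders, and the crude ratio ignores the adjustments used in the actual analysis.

```python
# Classify inadequate selenium status by the 70 ug/L cutoff and compute a crude OR
# between two residence groups (synthetic data, illustrative only).
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "blood_se_ugL": rng.normal(90, 20, 500),                     # measured blood selenium
    "region": rng.choice(["eastern_city", "rural_west"], 500),   # residence group
})
df["inadequate"] = df["blood_se_ugL"] < 70   # cutoff used in the study

tab = pd.crosstab(df["region"], df["inadequate"])
odds = tab[True] / tab[False]                # odds of inadequacy per region
print(tab)
print("crude OR (rural_west vs eastern_city):",
      round(odds["rural_west"] / odds["eastern_city"], 2))
```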
Results: The results revealed that the blood selenium concentration of children aged 15-17 years (92.93 μg/L) was higher than that of other age groups. Additionally, children from rural western areas, with a selenium concentration of 83.28 μg/L, were significantly more vulnerable to inadequate selenium status compared with those from eastern cities [odds ratio (OR) = 20.56 (10.77-39.25)]. Dietary intake of dairy products [OR = 0.58 (0.44-0.75)], meats [≥7 times per week: OR = 0.48 (0.36-0.64); 3-7 times per week: OR = 0.65 (0.48-0.90)], and aquatic products [OR = 0.53 (0.37-0.75)] were identified as protective factors against inadequate selenium status (p < 0.001).
Conclusion: This study suggests that children in rural western areas may be a priority population for future interventions to improve selenium status. A moderate increase in the consumption of selenium-rich foods, such as dairy products, meats, and aquatic products, is recommended to reduce the prevalence of inadequate selenium status in these regions.
{"title":"Selenium Status is Associated with Living Environment and Dietary Intake among Children Aged 3-17 Years in China.","authors":"Yu Zhou, Shan Jiang, Xuehong Pang, Shujuan Li, Yifan Duan, Shuxia Wang, Wei Cao, Qian Zhang, Tao Xu, Bowen Chen, Yuying Wang, Zhenyu Yang, Wenhua Zhao","doi":"10.1016/j.tjnut.2026.101480","DOIUrl":"https://doi.org/10.1016/j.tjnut.2026.101480","url":null,"abstract":"<p><strong>Background: </strong>Selenium is essential for children's health and either deficiency or excess could pose serious health risks. However, nationwide data on selenium nutrition in Chinese children are still lacking, as current studies were primarily limited to historically deficient regions affected by Keshan disease (KD) and Kashin-Beck disease (KBD).</p><p><strong>Objective: </strong>This study aims to investigate the relationship between the living environment, geographic factors, dietary intake, and the selenium status of Chinese children aged 3-17 years.</p><p><strong>Methods: </strong>Data were derived from the National Nutrition and Health Systematic Survey for children aged 0-17 years (CNHSC), conducted between 2019 and 2021. Field surveys were conducted to collect general demographic information, and a food frequency questionnaire (FFQ) was used to assess dietary intake. Blood selenium concentrations were measured using Inductively Coupled Plasma Mass Spectrometry (ICP-MS, Agilent 7700x), with a cutoff value of 70 μg/L indicating inadequate selenium status.</p><p><strong>Results: </strong>The results revealed that the blood selenium concentration of children aged 15-17 years (92.93 μg/L) was higher than that of other age groups. Additionally, children from rural western areas, with a selenium concentration of 83.28 μg/L, significantly more vulnerable to inadequate selenium status compared to those from eastern cities [odds ratio (OR) = 20.56 (10.77-39.25)]. Dietary intake of dairy products [OR= 0.58 (0.44-0.75)], meats [≥7 times per week: OR= 0.48 (0.36-0.64), 3-7 times per week: OR= 0.65 (0.48-0.90)], and aquatic products [OR= 0.53 (0.37-0.75)] were identified as protective factors against inadequate selenium status (p < 0.001).</p><p><strong>Conclusion: </strong>This study suggests that children in rural western areas may be a priority population for future interventions to improve selenium status. A moderate increase in the consumption of selenium-rich foods, such as dairy products, meats, and aquatic products, is recommended to reduce the prevalence of inadequate selenium status in these regions.</p>","PeriodicalId":16620,"journal":{"name":"Journal of Nutrition","volume":" ","pages":"101480"},"PeriodicalIF":3.8,"publicationDate":"2026-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147457915","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Effects of Soluble Corn Fiber Consumption on Executive Functions and Gut Microbiota in Middle to Older Age Adults: A Randomized Controlled Crossover Trial
Pub Date: 2026-03-11 | DOI: 10.1016/j.tjnut.2026.101473
David A Alvarado, Tori A Holthaus, Shelby Martell, Nicole L Southey, Marco Atallah, Rhea Sarma, David Revilla, Marina Brown, Twinkle Mehta, Naiman A Khan, Hannah D Holscher
Background: Dietary fiber may support cognition through gastrointestinal-microbiota mechanisms, but clinical evidence is limited.
Objectives: We aimed to determine whether soluble corn fiber (SCF) improved cognition and altered fecal microbiota and fermentation end products in adults.
Methods: In a randomized, double-blind, crossover trial, 42 healthy adults (45-75 y) consumed SCF (18 g/d) or a maltodextrin placebo control (CON: 22 g/d) for 4 weeks, separated by a washout. Cognitive outcomes included executive function with event-related potentials, relational memory, neuropsychological performance, and mood. Secondary outcomes included fecal microbiota, metabolomics, and gastrointestinal tolerance. Tertiary analyses related microbial and metabolite changes to cognitive improvements using correlation, mediation, and moderation models, and explored SCF fermentation pathways with 16S-predicted functional profiling, shotgun metagenomics, and in vitro culturing.
Results: SCF improved reaction times (RT) during congruent (β = -9.8 ms, 95% CI: [-18.4, -1.2], FDR P = 0.01) and incongruent (β = -14.2 ms, 95% CI: [-22.8, -5.6], FDR P = 0.003) flanker trials and increased Parabacteroides (∼4-fold, β = 1.44 log, 95% CI [1.01, 1.88], FDR P < 0.001). At the SCF endpoint, congruent RT tended to be inversely associated with fecal acetate (ρ = -0.33) and propionate (ρ = -0.36), while Parabacteroides was marginally positively associated with acetate (ρ = 0.34) (all FDR P < 0.1). Moderation analyses indicated that the SCF-RT relation varied with the magnitude of change in Parabacteroides. At endpoint, SCF increased the predicted functional potential of carbohydrate-related KOs and pathways (FDR P < 0.05). In vitro culturing confirmed that P. distasonis ferments SCF.
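The results above rely on false discovery rate (FDR)-adjusted tests and Spearman correlations; the sketch below shows that workflow on synthetic stand-in data (it is not the trial dataset or the authors' code).

```python
# Spearman correlations between reaction time and fecal metabolites, with
# Benjamini-Hochberg FDR adjustment across tests (synthetic data).
import numpy as np
from scipy.stats import spearmanr
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(2)
n = 42                                            # participants
reaction_time = rng.normal(450, 40, n)            # congruent flanker RT (ms)
metabolites = {                                   # fermentation end products
    "acetate": rng.normal(60, 15, n),
    "propionate": rng.normal(20, 6, n),
    "butyrate": rng.normal(15, 5, n),
}

rhos, pvals = [], []
for name, values in metabolites.items():
    rho, p = spearmanr(reaction_time, values)
    rhos.append((name, rho))
    pvals.append(p)

# Benjamini-Hochberg correction across the metabolite tests
reject, p_adj, _, _ = multipletests(pvals, alpha=0.1, method="fdr_bh")
for (name, rho), p, padj in zip(rhos, pvals, p_adj):
    print(f"{name}: rho={rho:.2f}, raw p={p:.3f}, FDR p={padj:.3f}")
```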
Conclusion: SCF consumption improved attentional inhibition, altered the gut microbiota, and selectively enriched Parabacteroides. Although mediation analyses did not support a direct microbiota-to-cognition pathway, moderation analyses suggested that SCF-related cognitive effects may depend in part on Parabacteroides abundance. Collectively, these findings suggest that certain cognitive benefits of SCF consumption may be partly underpinned by the gut microbiota.
Clinical trial registry: NCT05066425 (https://clinicaltrials.gov/study/NCT05066425).
{"title":"Effects of Soluble Corn Fiber Consumption on Executive Functions and Gut Microbiota in Middle to Older Age Adults: A Randomized Controlled Crossover Trial.","authors":"David A Alvarado, Tori A Holthaus, Shelby Martell, Nicole L Southey, Marco Atallah, Rhea Sarma, David Revilla, Marina Brown, Twinkle Mehta, Naiman A Khan, Hannah D Holscher","doi":"10.1016/j.tjnut.2026.101473","DOIUrl":"https://doi.org/10.1016/j.tjnut.2026.101473","url":null,"abstract":"<p><strong>Background: </strong>Dietary fiber may support cognition through gastrointestinal-microbiota mechanisms, but clinical evidence is limited.</p><p><strong>Objectives: </strong>We aimed to determine whether soluble corn fiber (SCF) improved cognition and altered fecal microbiota and fermentation end products in adults.</p><p><strong>Methods: </strong>In a randomized, double-blind, crossover trial, 42 healthy adults (45-75y) consumed SCF (18g/d) or a maltodextrin placebo control (CON: 22g/d) for 4 weeks, separated by a washout. Cognitive outcomes included executive function with event-related potentials, relational memory, neuropsychological performance, and mood. Secondary outcomes included fecal microbiota, metabolomics, and gastrointestinal tolerance. Tertiary analyses related microbial and metabolite changes to cognitive improvements using correlation, mediation, and moderation models, and explored SCF fermentation pathways with 16S-predicted functional profiling, shotgun metagenomics and in vitro culturing.</p><p><strong>Results: </strong>SCF improved reaction times (RT) during congruent (β = -9.8 ms, 95% CI: [-18.4, -1.2], FDR P = 0.01) and incongruent (β = -14.2 ms, 95% CI: [-22.8, -5.6], FDR P = 0.003) flanker trials and increased Parabacteroides (∼4-fold, β = 1.44 log, 95% CI [1.01, 1.88], FDR P < 0.001). At the SCF endpoint, congruent RT tended to be inversely associated with fecal acetate (ρ = -0.33) and propionate (ρ = -0.36), while Parabacteroides was marginally positively associated with acetate (ρ = 0.34) (all FDR P < 0.1). Moderation analyses indicated that SCF-RT relation varied by Parabacteroides magnitude change. At endpoint, SCF increased predicted functional potential of carbohydrate-related KOs and pathways (FDR P < 0.05). In vitro culturing confirmed P. distasonis ferments SCF.</p><p><strong>Conclusion: </strong>SCF consumption improved attentional inhibition, altered the gut microbiota, and selectively enriched Parabacteroides. Although mediation analyses did not support a direct microbiota-to-cognition pathway, moderation analyses suggested that SCF-related cognitive effects may depend in part on Parabacteroides abundance. Collectively, these findings suggest that certain cognitive benefits of SCF consumption may be partly underpinned by the gut microbiota.</p><p><strong>Clinical trial registry: </strong>NCT05066425 (https://clinicaltrials.gov/study/NCT05066425).</p>","PeriodicalId":16620,"journal":{"name":"Journal of Nutrition","volume":" ","pages":"101473"},"PeriodicalIF":3.8,"publicationDate":"2026-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147457502","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Interactions between Dietary Protein and Vegetable Fiber via the Gut Microbiota Are Associated with Cecal Fermentation Profiles and IgA Responses in Rats
Pub Date: 2026-03-11 | DOI: 10.1016/j.tjnut.2026.101469
Suzuna Shigetomi, Natsumi Fujimoto, Kana Hirano, Tsukasa Matsuda, Chikara Kato, Naomichi Nishimura, Shingo Hino
Background: The influence of dietary protein and its interaction with dietary fiber (DF) on gut microbiota, fermentation, and host immunity, particularly immunoglobulin A (IgA) responses, is not fully understood.
Objective: This study examined how gut microbiota adapt to dietary protein, subsequently influencing fermentation profiles and IgA responses, and how these effects are modulated by co-ingestion of vegetable fiber (VF).
Methods: Male Wistar rats (5 weeks old) were fed one of six diets varying in protein (casein, soy, egg white) and DF (cellulose, VF) source for 14 days (n = 6/group). Cecal microbial composition, organic acids, ammonia, IgA, and immune-related gene expression as well as fecal IgA were analyzed. Data were analyzed using two-way or aligned rank transform (ART) ANOVA. Microbiota composition was assessed using permutational multivariate ANOVA (PERMANOVA) and ANOVA-like differential expression tool version 2 (ALDEx2), and Spearman's rank correlation was applied for microbial co-occurrence network construction and correlation analysis.
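As a rough illustration of the Spearman-based co-occurrence network construction mentioned above, the sketch below builds a network from synthetic taxon abundances; the thresholds and taxon names are arbitrary placeholders, not the study's pipeline.

```python
# Build a Spearman co-occurrence network between taxa (synthetic abundances).
from itertools import combinations

import networkx as nx
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
taxa = [f"taxon_{i}" for i in range(8)]
abundance = {t: rng.lognormal(mean=2.0, sigma=1.0, size=36) for t in taxa}  # 36 cecal samples

G = nx.Graph()
G.add_nodes_from(taxa)
for a, b in combinations(taxa, 2):
    rho, p = spearmanr(abundance[a], abundance[b])
    if abs(rho) >= 0.6 and p < 0.05:   # illustrative edge-inclusion cutoffs
        G.add_edge(a, b, weight=rho)

print(G.number_of_nodes(), "nodes;", G.number_of_edges(), "edges")
```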
Results: The soy protein-VF diet yielded the highest alpha-diversity, while the egg white protein-VF diet yielded the lowest across multiple indices (P < 0.05). Beta-diversity analysis confirmed distinct clustering among dietary groups (P < 0.001), while network analysis showed that protein source affected community structure. The soy protein-VF diet showed increased n-butyrate production relative to the soy protein-cellulose diet (49.09 vs. 17.28 μmol/cecum, P < 0.05). The egg white protein-cellulose diet showed the highest ammonia production, which was suppressed by VF co-ingestion (155.52 vs. 61.66 μmol/cecum, P < 0.05). Notably, cecal IgA showed a positive correlation with ammonia (ρ = 0.67, P-adj. < 0.01).
Conclusions: In rats, dietary protein and its interaction with VF are associated with compositionally distinct microbial signatures that influence fermentation profiles and IgA responses in the cecum. These findings highlight the importance of considering protein-fiber combinations when designing dietary interventions to optimize gut health.
{"title":"Interactions between Dietary Protein and Vegetable Fiber via the Gut Microbiota Are Associated with Cecal Fermentation Profiles and IgA Responses in Rats.","authors":"Suzuna Shigetomi, Natsumi Fujimoto, Kana Hirano, Tsukasa Matsuda, Chikara Kato, Naomichi Nishimura, Shingo Hino","doi":"10.1016/j.tjnut.2026.101469","DOIUrl":"https://doi.org/10.1016/j.tjnut.2026.101469","url":null,"abstract":"<p><strong>Background: </strong>Influence of dietary protein and its interaction with dietary fiber (DF) on gut microbiota, fermentation, and host immunity, particularly immunoglobulin A (IgA) responses, is not fully understood.</p><p><strong>Objective: </strong>This study examined how gut microbiota adapt to dietary protein, subsequently influencing fermentation profiles and IgA responses, and how these effects are modulated by co-ingestion of vegetable fiber (VF).</p><p><strong>Methods: </strong>Male Wistar rats (5 weeks old) were fed one of six diets varying in protein (casein, soy, egg white) and DF (cellulose, VF) source for 14 days (n = 6/group). Cecal microbial composition, organic acids, ammonia, IgA, and immune-related gene expression as well as fecal IgA were analyzed. Data were analyzed using two-way or aligned rank transform (ART) ANOVA. Microbiota composition was assessed using permutational multivariate ANOVA (PERMANOVA) and ANOVA-like differential expression tool version 2 (ALDEx2), and Spearman's rank correlation was applied for microbial co-occurrence network construction and correlation analysis.</p><p><strong>Results: </strong>Soy protein-VF diet yielded the highest alpha-diversity, while egg white protein-VF diet yielded the lowest across multiple indices (P < 0.05). Beta-diversity analysis confirmed distinct clustering among dietary groups (P < 0.001), while network analysis showed that protein source affected community structure. Soy protein-VF diet showed an increase in n-butyrate production relative to soy protein-cellulose diet (49.09 vs. 17.28 μmol/cecum, P < 0.05). Egg white protein-cellulose diet showed the highest ammonia production that was suppressed by VF co-ingestion (155.52 vs. 61.66 μmol/cecum, P < 0.05). Notably, cecal IgA showed a positive correlation with ammonia (ρ = 0.67, P-adj. < 0.01).</p><p><strong>Conclusions: </strong>In rats, dietary protein and its interaction with VF are associated with compositionally distinct microbial signatures that influence fermentation profiles and IgA responses in the cecum. These findings highlight the importance of considering protein-fiber combinations when designing dietary interventions to optimize gut health.</p>","PeriodicalId":16620,"journal":{"name":"Journal of Nutrition","volume":" ","pages":"101469"},"PeriodicalIF":3.8,"publicationDate":"2026-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147457692","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Influence of Singular First Foods on the Infant Gut Microbiome: A Randomized Controlled Trial
Pub Date: 2026-03-11 | DOI: 10.1016/j.tjnut.2026.101470
Lynn E Ferro, Kyle Bittinger, Sabrina P Trudo, Jae Kyeom Kim, Clarisse M Hunt, Benjamin Brewer, Shawn W Polson, Jillian C Trabulsi
Background: The gastrointestinal microbiome, integral to immune function, inflammation, and metabolism, becomes less malleable with age, making early dietary exposures, particularly first complementary foods (CFs), important in its development.
Objective: To evaluate the effect of different first CFs on the infant gut microbiome in a pilot, randomized, controlled trial.
Methods: Vaginally delivered, exclusively human milk (HM) fed infants (n=43) with no prior CF exposure were randomized to one of four groups (oatmeal cereal, beef, carrot, prune) (NCT05492253). Infants were fed the randomized food (with HM) for one week (phase 1), followed by oatmeal cereal for another week (phase 2). Daily stool samples were collected and sequenced (full-length V1-V9 16S rRNA gene amplicons).
Results: In phase 1, oatmeal cereal increased observed ASVs compared to beef (p=0.024). Prune increased Bacteroides ovatus (p=0.001) and Klebsiella pneumoniae (p=0.011), while oatmeal cereal increased Enterococcus spp. (p=0.030) relative to beef. In phase 2, oatmeal cereal following beef resulted in increased Shannon diversity (p=0.0497) and following prune increased Faith's phylogenetic diversity (p=0.015). Unweighted UniFrac distances differed when oatmeal cereal followed prune compared to continuing oatmeal cereal (p=0.042). Veillonella infantium increased with continued oatmeal cereal consumption compared to beef (p=0.002) or carrot (p=0.002) followed by oatmeal cereal. After prune, oatmeal cereal increased Lactobacillus rhamnosus and Enterococcus faecalis, and decreased Klebsiella pneumoniae and Clostridium neonatale (all p<0.05).
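Two of the alpha-diversity metrics reported above (observed ASVs and Shannon diversity) can be computed directly from an ASV count table, as in the sketch below with synthetic counts; Faith's phylogenetic diversity and UniFrac additionally require a phylogenetic tree and are not shown.

```python
# Observed ASVs and Shannon diversity from a samples x ASVs count table (synthetic).
import numpy as np

rng = np.random.default_rng(4)
counts = rng.poisson(lam=5, size=(6, 40))     # 6 stool samples, 40 ASVs

def observed_asvs(row):
    """Number of ASVs with nonzero counts in a sample."""
    return int((row > 0).sum())

def shannon(row):
    """Shannon diversity index (natural log) of one sample."""
    p = row[row > 0] / row.sum()
    return float(-(p * np.log(p)).sum())

for i, row in enumerate(counts):
    print(f"sample {i}: observed ASVs = {observed_asvs(row)}, Shannon = {shannon(row):.2f}")
```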
Conclusions: Beef as a CF is nutritionally desirable because it contains important minerals lacking in HM, yet it resulted in a less diverse microbial profile. Since fruits and vegetables yielded diversity comparable to cereal, future research should investigate whether introducing meat alongside fruits and vegetables offers a balanced alternative to early reliance on cereals, and should further evaluate how first foods influence taxa abundance at the genus and species levels and the resulting immune-related and metabolic pathways.
{"title":"Influence of Singular First Foods on the Infant Gut Microbiome: A Randomized Controlled Trial.","authors":"Lynn E Ferro, Kyle Bittinger, Sabrina P Trudo, Jae Kyeom Kim, Clarisse M Hunt, Benjamin Brewer, Shawn W Polson, Jillian C Trabulsi","doi":"10.1016/j.tjnut.2026.101470","DOIUrl":"https://doi.org/10.1016/j.tjnut.2026.101470","url":null,"abstract":"<p><strong>Background: </strong>The gastrointestinal microbiome, integral to immune function, inflammation, and metabolism, becomes less malleable with age, making early dietary exposures, particularly first complementary foods (CFs), important in its development.</p><p><strong>Objective: </strong>To evaluate the effect of different first CFs on the infant gut microbiome in a pilot, randomized, controlled trial.</p><p><strong>Methods: </strong>Vaginally delivered, exclusively human milk (HM) fed infants (n=43) with no prior CF exposure were randomized to one of four groups (oatmeal cereal, beef, carrot, prune) (NCT05492253). Infants were fed the randomized food (with HM) for one week (phase 1), followed by oatmeal cereal for another week (phase 2). Daily stool samples were collected and sequenced (full-length V1-V9 16S rRNA gene amplicons).</p><p><strong>Results: </strong>In phase 1, oatmeal cereal increased observed ASVs compared to beef (p=0.024). Prune increased Bacteroides ovatus (p=0.001) and Klebsiella pneumoniae (p=0.011), while oatmeal cereal increased Enterococcus spp. (p=0.030) relative to beef. In phase 2, oatmeal cereal following beef resulted in increased Shannon diversity (p=0.0497) and following prune increased Faith's phylogenetic diversity (p=0.015). Unweighted UniFrac distances differed when oatmeal cereal followed prune compared to continuing oatmeal cereal (p=0.042). Veillonella infantium increased with continued oatmeal cereal consumption compared to beef (p=0.002) or carrot (p=0.002) followed by oatmeal cereal. After prune, oatmeal cereal increased Lactobacillus rhamnosus and Enterococcus faecalis, and decreased Klebsiella pneumoniae and Clostridium neonatale (all p<0.05).</p><p><strong>Conclusions: </strong>Beef as a CF is nutritionally desirable as it contains important minerals lacking in HM, yet resulted in a less diverse microbial profile. Since fruit and vegetables yielded comparable diversity to cereal, future research should investigate whether introducing meat alongside fruits and vegetables offers a balanced alternative to early reliance on cereals and further evaluate how first foods influence taxa abundance at the genus and species level and the resulting immune-related and metabolic pathways.</p>","PeriodicalId":16620,"journal":{"name":"Journal of Nutrition","volume":" ","pages":"101470"},"PeriodicalIF":3.8,"publicationDate":"2026-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147457576","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Effect of two-year caloric restriction in the absence of malnutrition on indicators of anemia, iron status, and hepcidin in healthy adults: a randomized clinical trial
Pub Date: 2026-03-11 | DOI: 10.1016/j.tjnut.2026.101474
Cory Dugan, Sai Krupa Das, Susan B Racette, Corby K Martin, Eric Ravussin, Leanne M Redman, Stephen R Hennigar
Background: Caloric restriction (CR) is a promising nutritional intervention for improving metabolic and age-related health outcomes, but its long-term effects on hematologic health remain unclear. Clarifying how prolonged CR affects anemia risk and iron status is essential for evaluating its long-term safety and clinical relevance.
Objectives: To determine the effects of a two-year CR intervention on markers of anemia, iron status, and hepcidin in females and males enrolled in the Comprehensive Assessment of Long-Term Effects of Reducing Intake of Energy (CALERIE™) Phase 2 trial.
Methods: Participants in CALERIE Phase 2 (n=220) were randomized to 25% CR or ad libitum (AL) control. AL continued their habitual diet whereas CR received an intensive intervention to promote CR over 2 years. All participants received a multivitamin/mineral supplement containing 18 mg iron. Fasted blood was collected at baseline (BL) and 12 (M12) and 24 (M24) months, and indicators of iron status (ferritin, soluble transferrin receptor [sTfR], and serum iron), anemia (hemoglobin and hematocrit), and regulators of iron status (hepcidin, CRP, and IL-6) were measured. Six-day diet diaries were collected twice at BL and once each at M12 and M24. An anemia surveillance protocol monitored hemoglobin, hematocrit, red blood cell count, and serum iron throughout the intervention, with medical evaluation and temporary/permanent CR discontinuation as needed. Linear mixed-effects models were used to evaluate the effect of treatment (CR vs. AL), time (BL, M12, M24), and their interaction (treatment × time).
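A minimal sketch of the treatment × time mixed-model structure described above, fit to synthetic data with a random intercept per participant; variable names and values are illustrative assumptions, not the CALERIE data.

```python
# Linear mixed-effects model with a group x time interaction and a random
# intercept per participant (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n_subj = 60
subjects = np.repeat(np.arange(n_subj), 3)
time = np.tile(["BL", "M12", "M24"], n_subj)
group = np.repeat(rng.choice(["CR", "AL"], n_subj), 3)
ferritin = rng.normal(100, 30, n_subj * 3)       # outcome, e.g. serum ferritin (ug/L)

df = pd.DataFrame({"subject": subjects, "time": time, "group": group,
                   "ferritin": ferritin})

model = smf.mixedlm("ferritin ~ C(group) * C(time)", data=df, groups=df["subject"])
result = model.fit()
print(result.summary())
```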
Results: Participants (n=218) were mostly female (70%) with an average age (±SD) of 38.1±7.2 years and a mean BMI of 25.2±1.7 kg/m². At baseline, ferritin (105.5±126.9 μg/L), hepcidin (8.6±5.8 ng/mL), and dietary iron intake (16.1±5.5 mg/day) were similar between groups (p>0.05). There were no group × time interactions for markers of anemia, indicators of iron status, or hepcidin (p>0.05). Despite the anemia surveillance protocol, anemia prevalence remained above 5% in both groups across all timepoints. Low RBC count was the most common trigger, and participants who triggered the protocol had lower hematocrit at M12 (p < 0.001) and M24 (p < 0.001); however, dietary iron intake remained similar and there were no differences in any indicators of iron status or hepcidin between those who triggered the protocol and those who did not.
Conclusions: These findings suggest that prolonged CR in the absence of malnutrition does not adversely affect iron status or hepcidin in healthy adults.
{"title":"Effect of two-year caloric restriction in the absence of malnutrition on indicators of anemia, iron status, and hepcidin in healthy adults: a randomized clinical trial.","authors":"Cory Dugan, Sai Krupa Das, Susan B Racette, Corby K Martin, Eric Ravussin, Leanne M Redman, Stephen R Hennigar","doi":"10.1016/j.tjnut.2026.101474","DOIUrl":"https://doi.org/10.1016/j.tjnut.2026.101474","url":null,"abstract":"<p><strong>Background: </strong>Caloric restriction (CR) is a promising nutritional intervention for improving metabolic and age-related health outcomes, but its long-term effects on hematologic health remain unclear. Clarifying how prolonged CR affects anemia risk and iron status is essential for evaluating its long-term safety and clinical relevance.</p><p><strong>Objectives: </strong>To determine the effects of a two-year CR intervention on markers of anemia, iron status, and hepcidin in females and males enrolled in the Comprehensive Assessment of Long-Term Effects of Reducing Intake of Energy (CALERIE™) Phase 2 trial.</p><p><strong>Methods: </strong>Participants in CALERIE Phase 2 (n=220) were randomized to 25% CR or ad libitum (AL) control. AL continued their habitual diet whereas CR received an intensive intervention to promote CR over 2 years. All participants received a multivitamin/mineral supplement containing 18mg iron. Fasted blood was collected at baseline (BL) and 12 (M12) and 24 (M24) months and indicators of iron status (ferritin, soluble transferrin receptor [sTfR], and serum iron), anemia (hemoglobin and hematocrit), and regulators of iron status (hepcidin, CRP, and IL-6) were measured. Six-day diet diaries were collected twice at BL and once each at M12 and M24. An anemia surveillance protocol monitored hemoglobin, hematocrit, red blood cell count, and serum iron throughout the intervention, with medical evaluation and temporary/permanent CR discontinuation as needed. Linear mixed-effects models were used to evaluate the effect of treatment (CR vs. AL), time (BL, M12, M24), and their interaction (treatment x time).</p><p><strong>Results: </strong>Participants (n=218) were mostly female (70%) with an average age (±SD) of 38.1±7.2 years and a mean BMI of 25.2±1.7 kg/m<sup>2</sup>. At baseline, ferritin (105.5±126.9 μg/L), hepcidin (8.6±5.8 ng/mL), and dietary iron intake (16.1±5.5 mg/day) were similar between groups (p>0.05). There were no group x time interactions for markers of anemia, indicators of iron status, or hepcidin (p>0.05). Despite the anemia surveillance protocol, anemia prevalence remained above 5% in both groups across all timepoints. 
Low RBC count was the most common trigger, and participants who triggered the protocol had lower hematocrit at M12 (p < 0.001) and M24 (p < 0.001); however, dietary iron intake remained similar and there were no differences in any indicators of iron status or hepcidin between those who triggered the protocol and those who did not.</p><p><strong>Conclusions: </strong>These findings suggest that prolonged CR in the absence of malnutrition does not adversely affect iron status or hepcidin in healthy adults.</p>","PeriodicalId":16620,"journal":{"name":"Journal of Nutrition","volume":" ","pages":"101474"},"PeriodicalIF":3.8,"publicationDate":"2026-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147457385","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Use of machine learning to identify determinants of habitual preformed water intake
Pub Date: 2026-03-11 | DOI: 10.1016/j.tjnut.2026.101475
Emma J Stinson, Ethan Collins, Tomas Cabeza De Baca, Marci E Gluck, Manuel Dote-Montero, Susan B Racette, Stavros A Kavouras, Sai Krupa Das, Paolo Piaggi, Susanne Votruba, Ashley Hale, Douglas C Chang
Background: Water intake is vital for health, yet the determinants of preformed water consumption in adults are poorly understood.
Objectives: The aim of this study was to apply machine learning (ML) models to identify factors associated with preformed water intake, defined as water ingestion from plain water, other beverages, and food.
Methods: This secondary analysis used baseline data from 219 participants in the Comprehensive Assessment of Long-term Effects of Reducing Intake of Energy (CALERIE™) 2 trial, a randomized controlled trial with extensive measures of body composition, energy expenditure, and dietary, physiological, psychological, and biomarker variables in healthy adults without obesity. Habitual intake of preformed water was quantified using deuterium and oxygen-18 isotope data obtained during two consecutive 14-day doubly labeled water measurement periods of weight stability. We developed models using linear regression, tree-based models (random forest, gradient boosting, extreme gradient boosting), and penalized regression models (ridge, lasso, elastic net) to identify factors associated with preformed water intake.
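The sketch below illustrates the model-comparison workflow described above on synthetic data: several regressor families are fit and compared by cross-validated RMSE. The feature matrix is a stand-in generated with scikit-learn, not the CALERIE variables.

```python
# Compare regression model families by cross-validated RMSE (synthetic data).
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import ElasticNet, Lasso, Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# stand-in for the 25 candidate predictors of preformed water intake
X, y = make_regression(n_samples=219, n_features=25, noise=25.0, random_state=0)

models = {
    "ridge": make_pipeline(StandardScaler(), Ridge(alpha=1.0)),
    "lasso": make_pipeline(StandardScaler(), Lasso(alpha=0.1)),
    "elastic_net": make_pipeline(StandardScaler(), ElasticNet(alpha=0.1)),
    "random_forest": RandomForestRegressor(n_estimators=200, random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5,
                             scoring="neg_root_mean_squared_error")
    print(f"{name}: mean CV RMSE = {-scores.mean():.1f}")
```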
Results: Based on root mean squared error, the ridge regression model using 25 variables performed best and explained 38% of the variance in preformed water intake. Higher preformed water intake was associated with higher intakes of dietary fiber, protein, and alcohol, a greater total weight of food ingested, and lower intakes of carbohydrate and sodium. Higher preformed water intake was also associated with lower percent body fat and higher fat-free mass and total energy expenditure. Notably, ML models identified alcohol and potassium intake as important predictors that were not selected by traditional linear regression, underscoring their ability to capture nuanced relationships.
Conclusions: These results demonstrate that data-driven ML models using a complex dataset can identify features and patterns associated with an important nutrient that might be missed using traditional statistical approaches and could be used to identify individuals at risk of inadequate hydration.
{"title":"Use of machine learning to identify determinants of habitual preformed water intake.","authors":"Emma J Stinson, Ethan Collins, Tomas Cabeza De Baca, Marci E Gluck, Manuel Dote-Montero, Susan B Racette, Stavros A Kavouras, Sai Krupa Das, Paolo Piaggi, Susanne Votruba, Ashley Hale, Douglas C Chang","doi":"10.1016/j.tjnut.2026.101475","DOIUrl":"https://doi.org/10.1016/j.tjnut.2026.101475","url":null,"abstract":"<p><strong>Background: </strong>Water intake is vital for health, yet the determinants of preformed water consumption in adults are poorly understood.</p><p><strong>Objectives: </strong>The aim of this study was to apply machine learning (ML) models to identify factors associated with preformed water intake, defined as water ingestion from plain water, other beverages, and food.</p><p><strong>Methods: </strong>This secondary analysis used baseline data from 219 participants in the Comprehensive Assessment of Long-term Effects of Reducing Intake of Energy (CALERIE™) 2 trial, a randomized controlled trial with extensive measures of body composition, energy expenditure, and dietary, physiological, psychological, and biomarker variables in healthy adults without obesity. Habitual intake of preformed water was quantified using deuterium and oxygen-18 isotope data obtained during two consecutive 14-day doubly labeled water measurement periods of weight stability. We developed models using linear regression, tree-based models (random forest, gradient boosting, extreme gradient boosting), and penalized regression models (ridge, lasso, elastic net) to identify factors associated with preformed water intake.</p><p><strong>Results: </strong>Based on root mean squared error, the ridge regression model using 25 variables was the best and explained 38% of the variance in preformed water intake. Higher preformed water intake was associated with higher intake of dietary fiber, protein, alcohol, total weight of food ingested, and lower intake of carbohydrate and sodium. Higher preformed water intake also was associated with lower percent body fat and higher fat free mass and total energy expenditure. Notably, ML models identified alcohol and potassium intake as important predictors that were not selected by traditional linear regression, underscoring their ability to capture nuanced relationships.</p><p><strong>Conclusions: </strong>These results demonstrate that data-driven ML models using a complex dataset can identify features and patterns associated with an important nutrient that might be missed using traditional statistical approaches and could be used to identify individuals at risk of inadequate hydration.</p><p><strong>Clinical trial registration: </strong>NCT00427193, clinicaltrials.gov.</p>","PeriodicalId":16620,"journal":{"name":"Journal of Nutrition","volume":" ","pages":"101475"},"PeriodicalIF":3.8,"publicationDate":"2026-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147458032","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Modern Take on Protein Nutrition Meets Evolving Consumer Perceptions
Pub Date: 2026-03-11 | DOI: 10.1016/j.tjnut.2026.101477
Jonathan Clinthorne, Heather J Leidy, Kris Sollid
Protein is an essential nutrient that supports many critical aspects of health across the lifespan. The scientific report from the 2025 Dietary Guidelines Advisory Committee found that certain subgroups, particularly adolescent females, young women, and older adults, are at a higher risk of not consuming the recommended amount of dietary protein. The new 2025-2030 Dietary Guidelines for Americans prioritize a serving of protein with each meal and recommend a healthy range of protein intake for adults of 1.2-1.6 g protein per kg body weight per day. Data from the 2025 Food and Health Survey conducted by the International Food Information Council (IFIC) show that consumer interest in protein has risen dramatically in the last decade, as consumers increasingly report following a high-protein diet and use the protein content of food as a marker of healthfulness. Despite an increase in consumer interest in protein, there is still limited understanding of how healthcare professionals can effectively support increased intake of protein sources among at-risk populations. Past research shows that protein-fortified foods can be used to supplement protein intake in randomized controlled trials that have demonstrated positive health outcomes in study participants. However, many of these foods are considered highly processed, which leads to debate regarding their role in healthy dietary patterns. This Perspective examines consumer perceptions around protein intake and highlights the role of healthcare professionals in providing tailored guidance on protein food choices.
{"title":"A Modern Take on Protein Nutrition Meets Evolving Consumer Perceptions.","authors":"Jonathan Clinthorne, Heather J Leidy, Kris Sollid","doi":"10.1016/j.tjnut.2026.101477","DOIUrl":"https://doi.org/10.1016/j.tjnut.2026.101477","url":null,"abstract":"<p><p>Protein is an essential nutrient that supports many critical aspects of health across the lifespan. The scientific report from the 2025 Dietary Guidelines Advisory Committee found that certain subgroups, particularly adolescent females, young women and older adults, are at a higher risk of not consuming the recommended amount of dietary protein. The new 2025-2030 Dietary Guidelines for Americans prioritize a serving of protein with each meal and recommend a healthy range of protein intake for adults of 1.2-1.6 g protein per kg body weight per day. Data from the 2025 Food and Health Survey conducted by the International Food Information Council (IFIC) show that consumer interest in protein has risen dramatically in the last decade, as consumers increasingly report following a high-protein diet and use the protein content of food as a marker for healthfulness. Despite an increase in consumer interest in protein, there is still limited understanding of how healthcare professionals can effectively support increased intake of protein sources among at-risk populations. Past research shows that protein-fortified foods can be used to supplement protein intake in randomized controlled trials that have demonstrated positive health outcomes in study participants. However, many of these foods are considered highly processed which leads to debate regarding their role in healthy dietary patterns. This Perspective examines consumer perceptions around protein intake and highlights the role of healthcare professionals in providing tailored guidance on protein food choices.</p>","PeriodicalId":16620,"journal":{"name":"Journal of Nutrition","volume":" ","pages":"101477"},"PeriodicalIF":3.8,"publicationDate":"2026-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147458123","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Artificial Intelligence-assisted dietary assessment in adolescent girls in Sri Lanka: Validity against weighed food records and comparison with 24-hour recalls
Pub Date: 2026-03-11 | DOI: 10.1016/j.tjnut.2026.101476
Nilmini Karunarathna, Thushanthi Perera, Odiche Nwabuikwu, Boateng Bannerman, Silas Bempong, Peter McCloskey, Phuong Hong Nguyen, David Hughes, Gloria Folson, Renuka Silva, Aulo Gelli
Background: Reliable dietary data for adolescents in low- and middle-income countries (LMICs) are limited due to high costs and estimation errors in traditional dietary assessment methods. Although technology-assisted dietary assessment tools are becoming popular, few have been validated in LMICs.
Objective: This study validated the PlantVillage Food Recognition Assistance and Nudging Insights (FRANI), an Artificial Intelligence (AI)-assisted mobile application for dietary assessment, against weighed food records (WFR) and multipass 24-hour recalls (24HR) among adolescent girls aged 14-18 years (n=60) in urban/semi-urban communities in Sri Lanka.
Methods: Dietary intake was assessed over 2 non-consecutive days using three methods: FRANI, WFR, and 24HR. The equivalence of nutrient intake was evaluated using mixed-effect models accounting for repeated measures by comparing intake ratios (FRANI/WFR and 24HR/WFR) with 10%, 15%, and 20% equivalence bounds. The concordance correlation coefficient (CCC) was utilized to assess the agreement between methods.
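To illustrate the two agreement checks described above, the sketch below computes a mean FRANI/WFR intake ratio against the ±10/15/20% equivalence bounds and Lin's concordance correlation coefficient on synthetic intakes; the published analysis used mixed-effects models over repeated days and confidence intervals around the ratio, which this simplified version omits.

```python
# Equivalence-bound check on the mean intake ratio and Lin's CCC (synthetic intakes).
import numpy as np

rng = np.random.default_rng(6)
wfr = rng.normal(1900, 300, 60)                 # weighed-record energy intake (kcal)
frani = wfr * rng.normal(1.02, 0.10, 60)        # app-based estimate of the same days

ratio = frani.mean() / wfr.mean()
for bound in (0.10, 0.15, 0.20):
    within = (1 - bound) <= ratio <= (1 + bound)
    print(f"ratio {ratio:.3f} within +/-{int(bound * 100)}% bound: {within}")

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient."""
    sxy = np.cov(x, y, ddof=1)[0, 1]
    return 2 * sxy / (x.var(ddof=1) + y.var(ddof=1) + (x.mean() - y.mean()) ** 2)

print("CCC:", round(lins_ccc(frani, wfr), 2))
```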
Results: FRANI demonstrated equivalence with WFR at the 10% bound for energy and vitamin A; at the 15% bound for protein, fiber, iron, and zinc; and at the 20% bound for fat, niacin, and folate intakes. Comparisons between 24HR and WFR found that no nutrients fell within the 10% bound. Energy, protein, fat, iron, niacin, and vitamin A intakes were equivalent at the 15% bound, while fiber, calcium, folate, and vitamin C intakes were equivalent at the 20% bound. CCCs ranged from 0.49 to 0.89 for FRANI vs. WFR, and 0.44 to 0.84 for 24HR vs. WFR. Omission errors were 2% for FRANI and 12% for 24HR, and intrusion errors were 7% and 9%, respectively.
Conclusions: PlantVillage FRANI application accurately estimated nutrient intakes of adolescent girls in Sri Lanka compared to the WFR. Its performance was at least comparable to the traditional 24HR method, supporting its potential as a scalable alternative for dietary assessment in similar LMIC populations.
{"title":"Artificial Intelligence-assisted dietary assessment in adolescent girls in Sri Lanka: Validity against weighed food records and comparison with 24-hour recalls.","authors":"Nilmini Karunarathna, Thushanthi Perera, Odiche Nwabuikwu, Boateng Bannerman, Silas Bempong, Peter McCloskey, Phuong Hong Nguyen, David Hughes, Gloria Folson, Renuka Silva, Aulo Gelli","doi":"10.1016/j.tjnut.2026.101476","DOIUrl":"https://doi.org/10.1016/j.tjnut.2026.101476","url":null,"abstract":"<p><strong>Background: </strong>Reliable dietary data for adolescents in low- and middle-income countries (LMICs) are limited due to high costs and estimation errors in traditional dietary assessment methods. Although technology-assisted dietary assessment tools are becoming popular, few have been validated in LMICs.</p><p><strong>Objective: </strong>This study validated the PlantVillage Food Recognition Assistance and Nudging Insights (FRANI), an Artificial Intelligence (AI)- assisted mobile application for dietary assessment, against weighed food records (WFR) and multipass 24-hour recalls (24HR) among adolescent girls aged 14-18 years (n=60) in urban/semi-urban communities in Sri Lanka.</p><p><strong>Methods: </strong>Dietary intake was assessed over 2 non-consecutive days using three methods: FRANI, WFR, and 24HR. The equivalence of nutrient intake was evaluated using mixed-effect models accounting for repeated measures by comparing intake ratios (FRANI/WFR and 24HR/WFR) with 10%, 15%, and 20% equivalence bounds. The concordance correlation coefficient (CCC) was utilized to assess the agreement between methods.</p><p><strong>Results: </strong>FRANI demonstrated equivalence with WFR at the 10% bound for energy and vitamin A; 15% for protein, fiber, iron and zinc; and 20% for fat, niacin and folate intakes. Comparisons between 24HR and WFR found that no nutrients fell within the 10% bound. Energy, protein, fat, iron, niacin and vitamin A intakes were equivalent at 15% bound, while fiber, calcium, folate and vitamin C intakes were equivalent at 20% bound. CCCs ranged from 0.49 to 0.89 for FRANI vs. WFR, and 0.44 to 0.84 for 24HR vs. WFR. Omission errors were 2% for FRANI and 12% for 24HR, and intrusion errors were 7% and 9%, respectively.</p><p><strong>Conclusions: </strong>PlantVillage FRANI application accurately estimated nutrient intakes of adolescent girls in Sri Lanka compared to the WFR. Its performance was at least comparable to the traditional 24HR method, supporting its potential as a scalable alternative for dietary assessment in similar LMIC populations.</p>","PeriodicalId":16620,"journal":{"name":"Journal of Nutrition","volume":" ","pages":"101476"},"PeriodicalIF":3.8,"publicationDate":"2026-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147458167","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}