
Plant Methods: Latest Publications

Overexpression of Vitis GRF4-GIF1 improves regeneration efficiency in diploid Fragaria vesca Hawaii 4.
IF 4.7 | CAS Tier 2 (Biology) | Q1 (Biochemical Research Methods) | Pub Date: 2024-10-18 | DOI: 10.1186/s13007-024-01270-8
Esther Rosales Sanchez, R Jordan Price, Federico Marangelli, Kirsty McLeary, Richard J Harrison, Anindya Kundu

Background: Plant breeding played a very important role in transforming strawberries from a niche crop with a small geographical footprint into an economically important crop grown across the planet. However, even modern marker-assisted breeding takes a considerable amount of time, over multiple plant generations, to produce a plant with desirable traits. As a quicker alternative, plants with desirable traits can be raised through tissue culture combined with precise genetic manipulation. Overexpression of morphogenic regulators previously known for their roles in meristem development, the transcription factors Growth-Regulating Factors (GRFs) and GRF-Interacting Factors (GIFs), has provided an efficient strategy for easier regeneration and transformation in multiple crops.

Results: We present a comprehensive protocol for regeneration and transformation of the diploid strawberry Fragaria vesca Hawaii 4 under control conditions, compared with ectopic expression of GRF4-GIF1 chimeras from different plant species. We report that ectopic expression of Vitis vinifera VvGRF4-GIF1 provides significantly higher regeneration efficiency during re-transformation than wild-type plants. In contrast, deregulated expression of a miRNA-resistant version of VvGRF4-GIF1 or of Triticum aestivum (wheat) TaGRF4-GIF1 resulted in abnormalities. Transcriptomic analysis of the different chimeric GRF4-GIF1 lines indicates that differential expression of FvExpansin might be responsible for the observed pleiotropic effects. Similarly, cytokinin dehydrogenase/oxygenase and cytokinin-responsive response regulators also showed differential expression, indicating that the GRF4-GIF1 pathway plays an important role in controlling cytokinin homeostasis.

Conclusion: Our data indicate that ectopic expression of the Vitis vinifera VvGRF4-GIF1 chimera can provide a significant advantage over wild-type plants during strawberry regeneration without the pleiotropic effects seen for the miRNA-resistant VvGRF4-GIF1 or TaGRF4-GIF1.
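Regeneration efficiency comparisons of this kind are usually reported as regenerated explants over total explants per genotype. The snippet below is an illustrative sketch, not taken from the paper, of how such counts could be compared with Fisher's exact test; the counts, variable names, and one-sided alternative are all assumptions.

```python
# Illustrative only: compare regeneration counts of a VvGRF4-GIF1 line against the
# wild-type control with Fisher's exact test. All counts are hypothetical placeholders.
from scipy.stats import fisher_exact

wt_regenerated, wt_total = 18, 60     # hypothetical wild-type counts (regenerated / total explants)
gif_regenerated, gif_total = 41, 60   # hypothetical VvGRF4-GIF1 line counts

table = [
    [gif_regenerated, gif_total - gif_regenerated],
    [wt_regenerated, wt_total - wt_regenerated],
]
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"VvGRF4-GIF1: {gif_regenerated}/{gif_total} vs wild type: {wt_regenerated}/{wt_total}, "
      f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```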

Citations: 0
Resource-optimized CNNs for real-time rice disease detection with ARM Cortex-M microprocessors.
IF 4.7 | CAS Tier 2 (Biology) | Q1 (Biochemical Research Methods) | Pub Date: 2024-10-16 | DOI: 10.1186/s13007-024-01280-6
Hermawan Nugroho, Jing Xan Chew, Sivaraman Eswaran, Fei Siang Tay

This study explores the application of Artificial Intelligence (AI), specifically Convolutional Neural Networks (CNNs), for detecting rice plant diseases using ARM Cortex-M microprocessors. Given the significant role of rice as a staple food, particularly in Malaysia where the rice self-sufficiency ratio dropped from 65.2% in 2021 to 62.6% in 2022, there is a pressing need for advanced disease detection methods to enhance agricultural productivity and sustainability. The research utilizes two extensive datasets for model training and validation: the first dataset includes 5932 images across four rice disease classes, and the second comprises 10,407 images across ten classes. These datasets facilitate comprehensive disease detection analysis, leveraging MobileNetV2 and FD-MobileNet models optimized for the ARM Cortex-M4 microprocessor. The performance of these models is rigorously evaluated in terms of accuracy and computational efficiency. MobileNetV2, for instance, demonstrates a high accuracy rate of 97.5%, significantly outperforming FD-MobileNet, especially in detecting complex disease patterns such as tungro with a 93% accuracy rate. Despite FD-MobileNet's lower resource consumption, its accuracy is limited to 90% across varied testing conditions. Resource optimization strategies highlight that even slight adjustments, such as a 0.5% reduction in RAM usage and a 1.14% decrease in flash memory, can result in a notable 9% increase in validation accuracy. This underscores the critical balance between computational resource management and model performance, particularly in resource-constrained settings like those provided by microcontrollers. In summary, the deployment of CNNs on microcontrollers presents a viable solution for real-time, on-site plant disease detection, demonstrating potential improvements in detection accuracy and operational efficiency. This study advances the field of smart agriculture by integrating cutting-edge AI with practical agricultural needs, aiming to address the challenges of food security in vulnerable regions.
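A rough sketch of the general workflow described above, not the authors' code: train a compact MobileNetV2-based classifier and post-training-quantize it to int8 with TensorFlow Lite so it can fit an ARM Cortex-M class device. The 96 × 96 input size, width multiplier, calibration data, and output file name are assumptions.

```python
# Sketch: small MobileNetV2-style classifier + int8 post-training quantization for a
# microcontroller target. Training data are omitted; calibration uses random placeholders.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 4                 # e.g. the 4-class rice disease dataset
INPUT_SHAPE = (96, 96, 3)       # assumed input size for a Cortex-M memory budget

base = tf.keras.applications.MobileNetV2(
    input_shape=INPUT_SHAPE, include_top=False, weights=None, alpha=0.35)
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=30)   # datasets not shown here

def representative_data():
    # Calibration samples for int8 quantization; random data stands in for real images.
    for _ in range(100):
        yield [np.random.rand(1, *INPUT_SHAPE).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
open("rice_disease_int8.tflite", "wb").write(converter.convert())
```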

Citations: 0
Correction: A comprehensive review of in planta stable transformation strategies.
IF 4.7 | CAS Tier 2 (Biology) | Q1 (Biochemical Research Methods) | Pub Date: 2024-10-15 | DOI: 10.1186/s13007-024-01282-4
Jérôme Bélanger, Tanya Rose Copley, Valerio Hoyos-Villegas, Jean-Benoit Charron, Louise O'Donoughue
{"title":"Correction: A comprehensive review of in planta stable transformation strategies.","authors":"Jérôme Bélanger, Tanya Rose Copley, Valerio Hoyos-Villegas, Jean-Benoit Charron, Louise O'Donoughue","doi":"10.1186/s13007-024-01282-4","DOIUrl":"https://doi.org/10.1186/s13007-024-01282-4","url":null,"abstract":"","PeriodicalId":20100,"journal":{"name":"Plant Methods","volume":"20 1","pages":"158"},"PeriodicalIF":4.7,"publicationDate":"2024-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11476722/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142472422","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Quantification of the fungal pathogen Didymella segeticola in Camellia sinensis using a DNA-based qRT-PCR assay.
IF 4.7 | CAS Tier 2 (Biology) | Q1 (Biochemical Research Methods) | Pub Date: 2024-10-08 | DOI: 10.1186/s13007-024-01284-2
You Zhang, Yiyi Tu, Yijia Chen, Jialu Fang, Fan'anni Chen, Lian Liu, Xiaoman Zhang, Yuchun Wang, Wuyun Lv

The fungal pathogen Didymella segeticola causes leaf spot and leaf blight on tea plant (Camellia sinensis), leading to production losses and affecting tea quality and flavor. Accurate detection and quantification of D. segeticola growth in tea plant leaves are crucial for diagnosing disease severity and evaluating host resistance. In this study, we monitored disease progression and D. segeticola development in tea plant leaves inoculated with a GFP-expressing strain. In addition, a DNA-based qRT-PCR analysis was employed as a more convenient and practical way to detect D. segeticola growth in tea leaves. This method was based on quantifying D. segeticola-specific DNA encoding a Cys2His2 zinc-finger protein (NCBI accession number: OR987684) relative to tea plant Cs18S rDNA1. Unlike ITS and TUB2 sequences, this specific DNA was amplified only in D. segeticola isolates, not in other tea plant pathogens. The assay is also applicable for detecting D. segeticola during interactions with various tea cultivars. Among the five cultivars tested, 'Zhongcha102' (ZC102) and 'Fuding-dabaicha' (FDDB) were more susceptible to D. segeticola than 'Longjing43' (LJ43), 'Zhongcha108' (ZC108), and 'Zhongcha302' (ZC302). Different D. segeticola isolates also exhibited varying levels of aggressiveness towards LJ43. In conclusion, the DNA-based qRT-PCR analysis is a highly sensitive, convenient, and effective method for quantifying D. segeticola growth in tea plant. This technique can be used to diagnose the severity of tea leaf spot and blight or to evaluate tea plant resistance to this pathogen.
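The paper does not spell out its calculation here, but DNA-based relative quantification of this kind is commonly expressed as the target signal normalized to the plant reference (a 2^-ΔCt form, fungal marker vs. Cs18S rDNA). The sketch below illustrates that relation with hypothetical Ct values.

```python
# Minimal sketch of relative quantification: fungal target Ct normalized to the plant
# Cs18S rDNA reference via 2^-ΔCt. The Ct readings below are hypothetical examples.
def relative_fungal_biomass(ct_target: float, ct_reference: float) -> float:
    """Return fungal DNA abundance relative to the plant reference (2^-ΔCt)."""
    delta_ct = ct_target - ct_reference
    return 2 ** (-delta_ct)

# Hypothetical Ct readings for one infected leaf sample
ct_didymella_marker = 27.4   # D. segeticola-specific zinc-finger locus
ct_cs18s_rdna = 16.9         # tea plant Cs18S rDNA reference

print(f"relative D. segeticola DNA: "
      f"{relative_fungal_biomass(ct_didymella_marker, ct_cs18s_rdna):.2e}")
```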

Citations: 0
Hyperspectral imaging for pest symptom detection in bell pepper.
IF 4.7 | CAS Tier 2 (Biology) | Q1 (Biochemical Research Methods) | Pub Date: 2024-10-03 | DOI: 10.1186/s13007-024-01273-5
Marvin Krüger, Thomas Zemanek, Dominik Wuttke, Maximilian Dinkel, Albrecht Serfling, Elias Böckmann

Background: The automation of pest monitoring is highly important for enhancing integrated pest management in practice. In this context, advanced technologies are increasingly being explored. Hyperspectral imaging (HSI) is a technique that has been used frequently in the natural sciences in recent years, and the successful detection of several fungal diseases and some pests has been reported. Various automated measures and image analysis methods offer great potential for enhancing monitoring in practice.

Results: In this study, the use of hyperspectral imaging over a wide spectrum from 400 to 2500 nm is investigated for noninvasive identification and distinction of healthy plants and plants infested with Myzus persicae (Sulzer) and Frankliniella occidentalis (Pergande) on bell pepper. Pest infestations were carried out in netted areas, and images of single plants and dissected leaves were used to train the decision algorithm. Additionally, a specially modified spraying robot was converted into an autonomous platform carrying the hyperspectral imaging system to take images under greenhouse conditions. The algorithm was developed with the XGBoost framework using gradient-boosted trees. Signals from specific wavelengths were found to be associated with the damage patterns of different insects. Under confined conditions, M. persicae and F. occidentalis infestations were distinguished from each other and from the uninfested control for single leaves. Differentiation was still possible when small whole plants were used. However, application under greenhouse conditions did not agree well with the results of manual monitoring.

Conclusion: Hyperspectral images can be used to distinguish sucking pests on bell pepper on the basis of single leaves and intact potted bell pepper plants under controlled conditions. Wavelength reduction methods offer options for multispectral camera usage in greenhouses with tall-growing vegetable crops. The application of automated platforms similar to the one tested in this study could be possible, but for successful pest detection under greenhouse conditions, algorithms should be developed further with real-world conditions fully taken into account.
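The classification step described above can be sketched as a gradient-boosted tree model over per-sample spectra, with feature importances used to shortlist wavelengths for a multispectral setup. This is an illustrative reconstruction on synthetic data, not the authors' code; the 10 nm band spacing, class labels, and hyperparameters are assumptions.

```python
# Sketch: gradient-boosted trees on per-leaf mean spectra (400-2500 nm), then rank
# wavelengths by importance as candidate bands for a multispectral camera. Data are synthetic.
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
wavelengths = np.arange(400, 2501, 10)            # assumed 10 nm sampling
X = rng.random((300, wavelengths.size))           # placeholder reflectance spectra
y = rng.integers(0, 3, size=300)                  # 0 = control, 1 = M. persicae, 2 = F. occidentalis

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.1,
                    objective="multi:softprob", eval_metric="mlogloss")
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# Wavelengths ranked by importance (candidate bands for wavelength reduction)
top = np.argsort(clf.feature_importances_)[::-1][:10]
print("top wavelengths (nm):", wavelengths[top])
```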

Citations: 0
Variation in forest root image annotation by experts, novices, and AI.
IF 4.7 | CAS Tier 2 (Biology) | Q1 (Biochemical Research Methods) | Pub Date: 2024-10-01 | DOI: 10.1186/s13007-024-01279-z
Grace Handy, Imogen Carter, A Rob Mackenzie, Adriane Esquivel-Muelbert, Abraham George Smith, Daniela Yaffar, Joanne Childs, Marie Arnaud

Background: The manual study of root dynamics using images requires huge investments of time and resources and is prone to previously poorly quantified annotator bias. Artificial intelligence (AI) image-processing tools have been successful in overcoming the limitations of manual annotation in homogeneous soils, but their efficiency and accuracy are yet to be widely tested on less homogeneous, non-agricultural soil profiles, e.g., those of forests, from which data on root dynamics are key to understanding the carbon cycle. Here, we quantify variance in root length measured by human annotators with varying experience levels. We evaluate the application of a convolutional neural network (CNN) model, trained using software accessible to researchers without a machine learning background, on a heterogeneous minirhizotron image dataset taken in a multispecies, mature, deciduous temperate forest.

Results: Less experienced annotators consistently identified more root length than experienced annotators. Root length annotation also varied between experienced annotators. The CNN root length results were neither precise nor accurate, taking ~10% of the time but significantly overestimating root length compared to expert manual annotation (p = 0.01). The CNN net root length change results were closer to the manual results (p = 0.08), but substantial variation remained.

Conclusions: Manual root length annotation is contingent on the individual annotator. The only accessible CNN model cannot yet produce root data of sufficient accuracy and precision for ecological applications when applied to a complex, heterogeneous forest image dataset. Continued evaluation and development of accessible CNNs for natural ecosystems are required.
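For context, root length is commonly derived from an annotation or CNN output by skeletonizing the binary root mask and scaling the skeleton pixel count by the image resolution. The sketch below shows that generic step on a synthetic mask; it is an assumption about the pipeline, not the authors' exact implementation.

```python
# Sketch: approximate total root length from a binary segmentation mask by skeletonizing
# and scaling skeleton pixels. The mask and pixel size below are synthetic placeholders.
import numpy as np
from skimage.morphology import skeletonize

def root_length_mm(mask: np.ndarray, mm_per_pixel: float) -> float:
    """Approximate total root length from a binary mask of root pixels."""
    skeleton = skeletonize(mask.astype(bool))
    return float(skeleton.sum()) * mm_per_pixel

# Hypothetical 1000 x 1000 minirhizotron tile with a thin diagonal "root"
mask = np.zeros((1000, 1000), dtype=np.uint8)
for i in range(1000):
    mask[i, max(0, i - 2):i + 3] = 1

print(f"estimated root length: {root_length_mm(mask, mm_per_pixel=0.05):.1f} mm")
```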

Citations: 0
Study on canopy extraction method for narrowband spectral images based on superpixel color gradation skewness distribution features.
IF 4.7 | CAS Tier 2 (Biology) | Q1 (Biochemical Research Methods) | Pub Date: 2024-10-01 | DOI: 10.1186/s13007-024-01281-5
Hongfeng Yu, Yongqian Ding, Pei Zhang, Furui Zhang, Xianglin Dou, Zhengmeng Chen

Background: Crop phenotype extraction devices based on multiband narrowband spectral images can effectively detect the physiological and biochemical parameters of crops, which plays a positive role in guiding the development of precision agriculture. Although canopy extraction from narrowband spectral images is a fundamental algorithm for developing crop phenotype extraction devices, building a real-time, embedded canopy extraction method remains challenging owing to the small difference between canopy and background in narrowband spectral images.

Methods: This study identified and validated the skewed distribution of leaf color gradation in narrowband spectral images. By introducing kurtosis and skewness feature parameters, a canopy extraction method based on the superpixel skewed color gradation distribution was proposed for narrowband spectral images. In addition, different combinations of parameters were used to construct two classifier models, and the contribution of the skewed distribution feature parameters to the proposed canopy extraction method was evaluated to confirm the effectiveness of introducing leaf color skewness distribution features.

Results: Leaf color gradient skewness verification was conducted on 4200 superpixels of different sizes, and 4190 superpixels conformed to the skewed distribution. The canopy-versus-soil-background intersection over union (IoU) obtained with the expanded leaf color skewed distribution feature parameters was 90.41%, whereas that of the traditional Otsu segmentation algorithm was 77.95%, so the canopy extraction method used in this study performed significantly better than traditional threshold segmentation. Using the same training set, Y1 (without skewness parameters) and Y2 (with skewness parameters) Bayesian classifier models were constructed. In evaluating the segmentation effect of introducing the skewness parameters, the average classification accuracies Acc_Y1 of the Y1 model and Acc_Y2 of the Y2 model were 72.02% and 91.76%, respectively, under the same test conditions. This indicates that introducing leaf color gradient skewness parameters can significantly improve the accuracy of Bayesian classifiers for narrowband spectral images of the canopy and soil background.

Conclusions: The introduction of kurtosis and skewness as leaf color skewness feature parameters can expand the expression of leaf color information in narrowband spectral images. The narrowband spectral image canopy extraction method based on superpixel color skewness distribution features can effectively segment the canopy and soil background in narrowband spectral images, thereby providing a new solution for crop canopy phenotype feature extraction.
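A minimal sketch of the kind of pipeline described above, under assumptions: SLIC superpixels on a single-band image, per-superpixel statistics including skewness and kurtosis as features, and a Gaussian naive Bayes classifier separating canopy from soil. The image, labels, superpixel count, and exact feature set are placeholders and differ from the paper's.

```python
# Sketch: per-superpixel skewness/kurtosis features feeding a naive Bayes canopy classifier.
# The image and the soil/canopy labels are synthetic placeholders.
import numpy as np
from scipy.stats import skew, kurtosis
from skimage.segmentation import slic
from sklearn.naive_bayes import GaussianNB

def superpixel_features(image: np.ndarray, n_segments: int = 400):
    """Return per-superpixel [mean, std, skewness, kurtosis] features and the segment map."""
    segments = slic(image, n_segments=n_segments, compactness=10, channel_axis=None)
    feats = []
    for sp_id in np.unique(segments):
        vals = image[segments == sp_id].ravel()
        feats.append([vals.mean(), vals.std(), skew(vals), kurtosis(vals)])
    return np.asarray(feats), segments

# Hypothetical single-band (narrowband) image scaled to [0, 1]
image = np.random.rand(256, 256)
X, segments = superpixel_features(image)
y = np.random.randint(0, 2, size=len(X))        # placeholder labels: 0 = soil, 1 = canopy

clf = GaussianNB().fit(X, y)
pred = clf.predict(X)
canopy_mask = np.isin(segments, np.unique(segments)[pred == 1])
print("canopy fraction:", canopy_mask.mean())
```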

Citations: 0
Measurement of maize stalk shear moduli.
IF 4.7 | CAS Tier 2 (Biology) | Q1 (Biochemical Research Methods) | Pub Date: 2024-09-30 | DOI: 10.1186/s13007-024-01264-6
Joseph Carter, Joshua Hoffman, Braxton Fjeldsted, Grant Ogilvie, Douglas D Cook

Maize is the most widely grown feed crop in the United States. Due to wind storms and other factors, 5% of maize falls over annually. The longitudinal shear modulus of maize stalk tissues is currently unreported and may have a significant influence on stalk failure. To better understand the causes of this phenomenon, maize stalk material properties need to be measured so that they can be used as material constants in computational models that provide detailed analysis of maize stalk failure. This study reports the longitudinal shear modulus of maize stalk tissue obtained through repeated torsion testing of dry, fully mature maize stalks. Measurements focused on the two tissues found in maize stalks: the hard outer rind and the soft inner pith. Uncertainty analysis and comparison of multiple methodologies indicated that all measurements are subject to low error and bias. The results of this study will allow researchers to better understand maize stalk failure modes through computational modeling and, through later studies, to help reduce annual maize losses. This study also provides a methodology that could be used or adapted for measuring tissues from other plants such as sorghum and sugarcane.
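For reference, a longitudinal shear modulus is obtained from torsion data via G = T·L / (J·θ), where T/θ is the torque-twist slope, L the gauge length, and J the polar moment of area of the cross-section. The sketch below works through that relation for a hollow circular approximation of the rind; all geometry and slope values are hypothetical examples, not data from the study.

```python
# Worked sketch of G = (dT/dθ) * L / J for a hollow circular approximation of the rind.
# All numbers are hypothetical example values, not measurements from the paper.
import math

def polar_moment_hollow(d_outer_mm: float, d_inner_mm: float) -> float:
    """Polar second moment of area J (mm^4) for a hollow circular section."""
    return math.pi / 32.0 * (d_outer_mm**4 - d_inner_mm**4)

def shear_modulus_mpa(torque_per_rad_nmm: float, gauge_length_mm: float, j_mm4: float) -> float:
    """G = (dT/dθ) * L / J, with the torque-twist slope in N·mm per radian."""
    return torque_per_rad_nmm * gauge_length_mm / j_mm4

J = polar_moment_hollow(d_outer_mm=20.0, d_inner_mm=16.0)               # hypothetical rind geometry
G = shear_modulus_mpa(torque_per_rad_nmm=2.0e4, gauge_length_mm=100.0,  # hypothetical test values
                      j_mm4=J)
print(f"J = {J:.0f} mm^4, G = {G:.0f} MPa")
```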

Citations: 0
An automated phenotyping method for Chinese Cymbidium seedlings based on 3D point cloud.
IF 4.7 | CAS Tier 2 (Biology) | Q1 (Biochemical Research Methods) | Pub Date: 2024-09-30 | DOI: 10.1186/s13007-024-01277-1
Yang Zhou, Honghao Zhou, Yue Chen

To address the low efficiency and high cost of determining the phenotypic parameters of Cymbidium seedlings by manual approaches, this study proposes a fully automated, point cloud-based measurement scheme for several phenotypic parameters. The key difficulty is designing a segmentation method for individual tillers that matches the species' morphological structure. After determining the branch points, two rounds of segmentation were designed. In the first round, the non-overlapping part of each tiller and the overlapping parts of each ramet are separated using edge point cloud-based segmentation; in the second round, the overlapping part is sliced along the horizontal direction according to the weight ratio of the tillers above, to obtain the complete point cloud of every tiller. The core advantage of the algorithm is that the segmentation follows the tiller growth direction well, and the extracted tiller skeleton points lie close to the actual growth direction, significantly improving the prediction accuracy of the subsequent phenotypic parameters. Five phenotypic parameters, plant height, leaf number, leaf length, leaf width, and leaf area, were calculated automatically. In experiments, the accuracy of the five parameters reached 98.6%, 100%, 92.2%, 89.1%, and 82.3%, respectively, which meets the needs of various phenotyping applications.
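Once individual tillers and leaves are segmented, parameters such as plant height and leaf length reduce to simple geometry on the point cloud. The snippet below is a minimal sketch of that final step under assumptions (z-axis up, ordered skeleton points per leaf, synthetic data); it is not the authors' pipeline.

```python
# Sketch: plant height from the z-extent of a point cloud and leaf length from a polyline
# through ordered skeleton points. Both inputs below are synthetic placeholders (units: mm).
import numpy as np

def plant_height(points: np.ndarray) -> float:
    """Plant height as the vertical extent of the point cloud (z assumed up)."""
    return float(points[:, 2].max() - points[:, 2].min())

def polyline_length(skeleton_points: np.ndarray) -> float:
    """Leaf length as the summed distance along ordered skeleton points (base to tip)."""
    diffs = np.diff(skeleton_points, axis=0)
    return float(np.linalg.norm(diffs, axis=1).sum())

cloud = np.random.rand(5000, 3) * [100.0, 100.0, 350.0]      # hypothetical plant point cloud
leaf_skeleton = np.stack([np.linspace(0, 40, 50),
                          np.zeros(50),
                          np.linspace(0, 180, 50)], axis=1)   # hypothetical ordered leaf skeleton

print(f"plant height: {plant_height(cloud):.1f} mm, "
      f"leaf length: {polyline_length(leaf_skeleton):.1f} mm")
```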

Citations: 0
A deep learning approach for deriving wheat phenology from near-surface RGB image series using spatiotemporal fusion.
IF 4.7 | CAS Tier 2 (Biology) | Q1 (Biochemical Research Methods) | Pub Date: 2024-09-30 | DOI: 10.1186/s13007-024-01278-0
Yucheng Cai, Yan Li, Xuerui Qi, Jianqing Zhao, Li Jiang, Yongchao Tian, Yan Zhu, Weixing Cao, Xiaohu Zhang

Accurate monitoring of wheat phenological stages is essential for effective crop management and informed agricultural decision-making. Traditional methods often rely on labour-intensive field surveys, which are prone to subjective bias and limited temporal resolution. To address these challenges, this study explores the potential of near-surface cameras combined with an advanced deep-learning approach to derive wheat phenological stages from high-quality, real-time RGB image series. Three deep learning models based on three different spatiotemporal feature fusion methods, namely sequential fusion, synchronous fusion, and parallel fusion, were constructed and evaluated for deriving wheat phenological stages from these near-surface RGB image series. Moreover, the impact of different image resolutions, capture perspectives, and model training strategies on the performance of the deep learning models was also investigated. The results indicate that the model using the sequential fusion method is optimal, with an overall accuracy (OA) of 0.935, a mean absolute error (MAE) of 0.069, an F1-score (F1) of 0.936, and a kappa coefficient (Kappa) of 0.924 for wheat phenological stages. In addition, an enhanced image resolution of 512 × 512 pixels and a suitable image capture perspective, specifically a sensor viewing angle of 40° to 60° vertically, introduce more effective features for phenological stage detection, thereby enhancing the model's accuracy. Furthermore, applying a two-step fine-tuning strategy during model training also enhances the model's robustness to random variations in perspective. This research introduces an innovative approach for real-time phenological stage detection and provides a solid foundation for precision agriculture. By accurately deriving critical phenological stages, the methodology developed in this study supports the optimization of crop management practices, which may result in improved resource efficiency and sustainability across diverse agricultural settings. The implications of this work extend beyond wheat, offering a scalable solution that can be adapted to monitor other crops, thereby contributing to more efficient and sustainable agricultural systems.
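A "sequential fusion" design of the kind described, a per-frame CNN feature extractor followed by a temporal model over the image series, can be sketched as follows. The sequence length, encoder depth, and number of phenological stages are assumptions; only the 512 × 512 input size comes from the text, and this is not the authors' exact architecture.

```python
# Sketch of a sequential spatiotemporal fusion model: a CNN encodes each RGB frame,
# then a GRU fuses the per-frame features over time before stage classification.
import tensorflow as tf

SEQ_LEN, H, W, C = 8, 512, 512, 3   # image-series length assumed; 512 x 512 from the text
NUM_STAGES = 7                      # number of phenological stages (assumed)

frame_encoder = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, strides=2, activation="relu", input_shape=(H, W, C)),
    tf.keras.layers.Conv2D(32, 3, strides=2, activation="relu"),
    tf.keras.layers.Conv2D(64, 3, strides=2, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
])

model = tf.keras.Sequential([
    tf.keras.layers.TimeDistributed(frame_encoder, input_shape=(SEQ_LEN, H, W, C)),  # spatial features
    tf.keras.layers.GRU(64),                                                          # temporal fusion
    tf.keras.layers.Dense(NUM_STAGES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```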

Citations: 0