{"title":"概念验证:用于植物鉴定的自主机器视觉软件。","authors":"Nathan Stern, Jonathan Leidig, Gregory Wolffe","doi":"10.1093/jaoacint/qsae091","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>HPTLC is a widely used and accepted technique for identification of botanicals. Current best practices involve subjective comparison of HPTLC-generated images between test samples and certified botanical reference materials based on specific bands.</p><p><strong>Objective: </strong>This research was designed to evaluate the potential of cutting-edge machine vision-based machine learning techniques to automate identification of botanicals using native HPTLC image data.</p><p><strong>Method: </strong>HPTLC images from Ginger and its closely related species and common adulterants were used to create large, synthetic datasets using a deep conditional generative adversarial network. This synthetic dataset was used to train and validate a deep convolutional neural network capable of automatically identifying new HPTLC image data. Performance of both neural networks was evaluated over time using appropriate loss functions as an indicator of their progress during learning. Validation of the overall system was measured via the accuracy of the learned model when applied to real HPTLC data.</p><p><strong>Results: </strong>The machine vision system was able to generate realistic synthetic HPTLC images that were successfully used to train a deep convolutional neural network. The resulting learned model achieved high-accuracy identification from HPTLC images corresponding to Ginger and six other related species.</p><p><strong>Conclusions: </strong>A proof-of-concept HPTLC image-based machine vision system for the identification of botanicals was proven to be feasible and a fully working prototype was validated for several species related to Ginger.</p><p><strong>Highlights: </strong>This use of an autonomous machine-vision system for botanical identification removed the subjectivity inherent to human-based evaluation. The learned model also accurately evaluated botanical HPTLC images significantly faster than its human counterpart, which could save both time and resources.</p>","PeriodicalId":94064,"journal":{"name":"Journal of AOAC International","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-11-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Proof of Concept: Autonomous Machine Vision Software for Botanical Identification.\",\"authors\":\"Nathan Stern, Jonathan Leidig, Gregory Wolffe\",\"doi\":\"10.1093/jaoacint/qsae091\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Background: </strong>HPTLC is a widely used and accepted technique for identification of botanicals. Current best practices involve subjective comparison of HPTLC-generated images between test samples and certified botanical reference materials based on specific bands.</p><p><strong>Objective: </strong>This research was designed to evaluate the potential of cutting-edge machine vision-based machine learning techniques to automate identification of botanicals using native HPTLC image data.</p><p><strong>Method: </strong>HPTLC images from Ginger and its closely related species and common adulterants were used to create large, synthetic datasets using a deep conditional generative adversarial network. 
This synthetic dataset was used to train and validate a deep convolutional neural network capable of automatically identifying new HPTLC image data. Performance of both neural networks was evaluated over time using appropriate loss functions as an indicator of their progress during learning. Validation of the overall system was measured via the accuracy of the learned model when applied to real HPTLC data.</p><p><strong>Results: </strong>The machine vision system was able to generate realistic synthetic HPTLC images that were successfully used to train a deep convolutional neural network. The resulting learned model achieved high-accuracy identification from HPTLC images corresponding to Ginger and six other related species.</p><p><strong>Conclusions: </strong>A proof-of-concept HPTLC image-based machine vision system for the identification of botanicals was proven to be feasible and a fully working prototype was validated for several species related to Ginger.</p><p><strong>Highlights: </strong>This use of an autonomous machine-vision system for botanical identification removed the subjectivity inherent to human-based evaluation. The learned model also accurately evaluated botanical HPTLC images significantly faster than its human counterpart, which could save both time and resources.</p>\",\"PeriodicalId\":94064,\"journal\":{\"name\":\"Journal of AOAC International\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-11-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of AOAC International\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1093/jaoacint/qsae091\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of AOAC International","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1093/jaoacint/qsae091","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Proof of Concept: Autonomous Machine Vision Software for Botanical Identification.
Background: HPTLC is a widely used and accepted technique for the identification of botanicals. Current best practice relies on subjective, band-by-band visual comparison of HPTLC-generated images of test samples against those of certified botanical reference materials.
Objective: This research was designed to evaluate the potential of machine vision-based machine learning techniques to automate the identification of botanicals from native HPTLC image data.
Method: HPTLC images of Ginger, its closely related species, and common adulterants were used to create a large synthetic dataset with a deep conditional generative adversarial network. This synthetic dataset was then used to train and validate a deep convolutional neural network capable of automatically identifying new HPTLC image data. The performance of both neural networks was monitored over time using appropriate loss functions as indicators of learning progress. The overall system was validated by measuring the accuracy of the learned model on real HPTLC data.
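The abstract does not give implementation details, so the following is only a minimal sketch of the synthetic-data stage: a conditional GAN whose generator and discriminator are conditioned on the species label, with the adversarial (BCE) losses serving as the progress indicators monitored during training. The 1-channel 64x64 image size, seven classes, layer widths, and all hyperparameters are illustrative assumptions, not the configuration used in the paper.

```python
# Sketch of a conditional GAN for synthesizing HPTLC lane images (PyTorch).
# Assumed: 1-channel 64x64 images, 7 classes (Ginger plus six related
# species/adulterants). Sizes and hyperparameters are illustrative only.
import torch
import torch.nn as nn

N_CLASSES, LATENT, IMG = 7, 100, 64

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(N_CLASSES, LATENT)   # class conditioning
        self.net = nn.Sequential(
            nn.Linear(LATENT * 2, 256), nn.ReLU(),
            nn.Linear(256, IMG * IMG), nn.Tanh(),       # pixel values in [-1, 1]
        )

    def forward(self, z, labels):
        x = torch.cat([z, self.embed(labels)], dim=1)
        return self.net(x).view(-1, 1, IMG, IMG)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(N_CLASSES, IMG * IMG)
        self.net = nn.Sequential(
            nn.Linear(IMG * IMG * 2, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 1),                          # real/fake logit
        )

    def forward(self, img, labels):
        x = torch.cat([img.flatten(1), self.embed(labels)], dim=1)
        return self.net(x)

def train_step(G, D, opt_g, opt_d, real, labels, bce=nn.BCEWithLogitsLoss()):
    """One adversarial update; the returned BCE losses are the quantities
    one would track over time to monitor learning progress."""
    z = torch.randn(real.size(0), LATENT)
    fake = G(z, labels)

    # Discriminator: push real images toward 1, generated images toward 0.
    opt_d.zero_grad()
    d_loss = bce(D(real, labels), torch.ones(real.size(0), 1)) + \
             bce(D(fake.detach(), labels), torch.zeros(real.size(0), 1))
    d_loss.backward()
    opt_d.step()

    # Generator: try to make the discriminator label fakes as real.
    opt_g.zero_grad()
    g_loss = bce(D(fake, labels), torch.ones(real.size(0), 1))
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```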
Results: The machine vision system generated realistic synthetic HPTLC images that were successfully used to train a deep convolutional neural network. The resulting learned model achieved high-accuracy identification of Ginger and six related species from their HPTLC images.
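To illustrate the validation step, the sketch below pairs a small convolutional classifier (trained on the synthetic images) with an accuracy metric computed over real HPTLC images. The layer sizes and the data loader are hypothetical; the paper's actual network architecture and data interface are not described in the abstract.

```python
# Sketch of the classification/validation stage: a small CNN over 1-channel
# 64x64 HPTLC images, scored by accuracy on real (non-synthetic) data.
import torch
import torch.nn as nn

N_CLASSES, IMG = 7, 64

classifier = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * (IMG // 4) * (IMG // 4), N_CLASSES),  # species logits
)

@torch.no_grad()
def accuracy(model, loader):
    """Fraction of real HPTLC images assigned the correct species label.
    `loader` is assumed to yield (images, labels) batches of shape
    (B, 1, IMG, IMG) and (B,)."""
    model.eval()
    correct = total = 0
    for images, labels in loader:
        preds = model(images).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.numel()
    return correct / total
```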
Conclusions: An HPTLC image-based machine vision system for the identification of botanicals was shown to be feasible, and a fully working proof-of-concept prototype was validated for several species related to Ginger.
Highlights: The use of an autonomous machine vision system for botanical identification removed the subjectivity inherent in human evaluation. The learned model also evaluated botanical HPTLC images accurately and significantly faster than its human counterpart, which could save both time and resources.