# Quality Control of Assay Data: A Review of Procedures for Measuring and Monitoring Precision and Accuracy

**Author:** M. Abzalov
**Journal:** Exploration and Mining Geology
**Published:** 2008-07-01
**DOI:** [10.2113/GSEMG.17.3-4.131](https://doi.org/10.2113/GSEMG.17.3-4.131)
Citations: 59
Abstract
Control of analytical data quality is usually referred to in the mining industry as Quality Assurance and Quality Control (QAQC), and involves the monitoring of sample quality and quantification of analytical accuracy and precision. QAQC procedures normally involve using sample duplicates and specially prepared standards whose grade is known. Numerous case studies indicate that reliable control of sample precision is achieved by using approximately 5% to 10% of field duplicates and 3% to 5% of pulp duplicates. These duplicate samples should be prepared and analyzed in the primary laboratory.
Bias in the analytical results can be identified by inclusion of 3% to 5% of the standard in each sample batch. Several different standards are used, with values spanning the practical range of grades in the actual samples. A blank (a sample in which the concentration of metal of interest is below detection limit) should also be included. Standard samples alone cannot identify biases introduced during sample preparation, and therefore approximately 5% of the duplicate samples (coarse rejects and pulp) should be processed and assayed at another, external, reputable laboratory.
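As an illustration of the standard-based bias check described above, the sketch below compares repeated assays of a certified reference material (CRM) against its certified grade. This is not code from the paper; the CRM grade, assay values, and the 5% acceptance threshold mentioned in the comment are hypothetical, chosen only to show the arithmetic.

```python
# Illustrative sketch (not from the paper): flagging analytical bias using
# certified reference material (CRM) results inserted in assay batches.
# The certified grade and assay values below are hypothetical.

def crm_bias_percent(assays, certified):
    """Mean relative bias (%) of repeated CRM assays vs the certified grade."""
    mean_assay = sum(assays) / len(assays)
    return 100.0 * (mean_assay - certified) / certified

# Hypothetical CRM with certified grade 2.50 g/t Au, assayed in six batches:
crm_assays = [2.48, 2.53, 2.46, 2.55, 2.51, 2.44]
bias = crm_bias_percent(crm_assays, certified=2.50)
print(f"bias = {bias:+.2f}%")
```

A small mean bias (here a fraction of a percent) would usually be considered acceptable; a persistent bias of several percent, or CRM values drifting outside control limits, would prompt investigation of the laboratory.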
This paper discusses techniques for estimating precision and accuracy errors, and reviews the associated diagnostic tools. It is shown that one of the most commonly used methods, the Thompson-Howarth technique, consistently produces lower error estimates than other methods. This reflects the method's underlying assumption of normally distributed errors, which biases its results when the error distribution is skewed. This study concurs with the suggestion of Stanley and Lawie (2007: Exploration and Mining Geology, v. 16, p. 265–274) to use the average coefficient of variation ( CV AVR (%) ) as the universal measure of relative precision error in mine geology applications:
$$
CV_{AVR}(\%) = 100 \times \sqrt{\frac{1}{N}\sum_{i=1}^{N}\frac{2\,(a_i - b_i)^2}{(a_i + b_i)^2}}
$$

where $a_i$ and $b_i$ are the paired original and duplicate assays and $N$ is the number of duplicate pairs.

Based on case studies, an acceptable level of sample precision is proposed for several different deposit types.
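The average coefficient of variation can be sketched directly from the Stanley and Lawie (2007) formula. The duplicate pairs below are hypothetical values for illustration, not data from the paper.

```python
import math

def cv_avr_percent(pairs):
    """Average coefficient of variation (%) from original/duplicate assay pairs
    (Stanley and Lawie, 2007): 100 * sqrt( (1/N) * sum 2(a-b)^2 / (a+b)^2 )."""
    n = len(pairs)
    s = sum(2.0 * (a - b) ** 2 / (a + b) ** 2 for a, b in pairs)
    return 100.0 * math.sqrt(s / n)

# Hypothetical duplicate pairs (original assay, duplicate assay), e.g. g/t Au:
pairs = [(1.20, 1.10), (0.85, 0.90), (2.40, 2.55), (0.30, 0.28)]
print(f"CV_AVR = {cv_avr_percent(pairs):.1f}%")
```

Because each pair contributes its squared relative difference, a single wildly discrepant duplicate can dominate the statistic, which is why the paper recommends inspecting the duplicates with diagnostic plots rather than relying on the summary number alone.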