Software for Implementation of JCGM 106
J. Cheung
DOI: 10.51843/wsproceedings.2013.42

A new guidance document, "Evaluation of measurement data – The role of measurement uncertainty in conformity assessment" (JCGM 106), prepared by the Joint Committee for Guides in Metrology, was published in October 2012. The document provides guidance and procedures for determining an acceptance interval chosen to balance the risks borne by the consumer and the producer. The Standards and Calibration Laboratory (SCL) of the Hong Kong Special Administrative Region has developed a software tool that allows easy calculation of acceptance limits based on the production process, the measurement system capabilities, and the specified consumer and producer risks.
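The balancing of consumer and producer risks described above can be sketched numerically. The following Python fragment is an illustrative sketch, not the SCL tool itself: the Gaussian process and measurement models, the tolerance limits, and all numerical values are assumptions. It computes the global consumer and producer risks of JCGM 106 for a guard-banded acceptance interval by direct numerical integration:

```python
import numpy as np
from math import erf, sqrt, pi

def norm_cdf(x, mu=0.0, sigma=1.0):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def global_risks(mu_p, sigma_p, u_m, t_low, t_up, guard):
    """Global consumer and producer risks for a guarded acceptance interval.

    mu_p, sigma_p : assumed Gaussian production-process distribution
    u_m           : standard measurement uncertainty (Gaussian)
    t_low, t_up   : tolerance limits
    guard         : guard band; acceptance interval is [t_low+guard, t_up-guard]
    """
    a_low, a_up = t_low + guard, t_up - guard
    # Grid over plausible true values of the measurand.
    y = np.linspace(mu_p - 8 * sigma_p, mu_p + 8 * sigma_p, 8001)
    dy = y[1] - y[0]
    p_y = np.exp(-0.5 * ((y - mu_p) / sigma_p) ** 2) / (sigma_p * sqrt(2 * pi))
    # Probability that an item with true value v is *accepted*.
    p_accept = np.array([norm_cdf(a_up, v, u_m) - norm_cdf(a_low, v, u_m)
                         for v in y])
    conforming = (y >= t_low) & (y <= t_up)
    r_consumer = float(np.sum(p_y * p_accept * (~conforming)) * dy)  # accepted but bad
    r_producer = float(np.sum(p_y * (1 - p_accept) * conforming) * dy)  # rejected but good
    return r_consumer, r_producer

# Example: symmetric tolerance of +/-1, centred process, u_m chosen so TUR = 4,
# guard band of two standard uncertainties (all values invented).
rc, rp = global_risks(mu_p=0.0, sigma_p=0.5, u_m=0.125,
                      t_low=-1.0, t_up=1.0, guard=0.25)
```

Sweeping `guard` and re-evaluating the two risks is one way to locate acceptance limits that balance them, which is essentially what a JCGM 106 implementation automates.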
The Use of Hardware Abstraction Layers in Automated Calibration Software
Logan Kunitz
DOI: 10.51843/wsproceedings.2013.21

Today's automated calibration test systems depend heavily on interaction between the automation software and the physical instrumentation being controlled. This interaction creates dependencies between the software and the hardware that can break when an instrument must be replaced due to failure, obsolescence, or external calibration. Developing a Hardware Abstraction Layer (HAL) is a proactive way to mitigate the risks of planned or unplanned instrument replacement: a HAL decouples automated test software from the underlying hardware, facilitating instrument interchangeability. This paper introduces industry-standard, vendor-defined, and user-defined HALs, describes their benefits and uses, and presents a use case for implementing a HAL for a given set of instruments.
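The decoupling a HAL provides can be illustrated with a minimal sketch. Everything below is hypothetical (the class names, the SCPI string, and the 10 V verification step are invented, not from the paper); it only shows the pattern of writing calibration steps against an abstract instrument role rather than a concrete driver:

```python
from abc import ABC, abstractmethod

class DmmHal(ABC):
    """Hypothetical user-defined HAL interface for a DMM role."""
    @abstractmethod
    def measure_dc_voltage(self, range_v: float) -> float: ...

class VendorADmm(DmmHal):
    """Driver-specific implementation; the SCPI string is illustrative only."""
    def __init__(self, session):
        self.session = session  # e.g. a VISA session opened elsewhere
    def measure_dc_voltage(self, range_v: float) -> float:
        return float(self.session.query(f"MEAS:VOLT:DC? {range_v}"))

class SimulatedDmm(DmmHal):
    """Stand-in instrument, useful for offline procedure development."""
    def measure_dc_voltage(self, range_v: float) -> float:
        return 9.99987  # canned reading

def verify_10v_point(dmm: DmmHal, nominal=10.0, tol=0.001):
    """A calibration step written against the HAL, not a concrete instrument.
    Swapping VendorADmm for another implementation requires no change here."""
    reading = dmm.measure_dc_voltage(range_v=10.0)
    return abs(reading - nominal) <= tol

ok = verify_10v_point(SimulatedDmm())
```

Replacing a failed or obsolete instrument then means writing one new `DmmHal` subclass, while every procedure built on `verify_10v_point`-style steps stays untouched.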
Creating a Calibration Measurement Monitoring System for Many, Ever-changing, Complex Instruments
John W. Wilson
DOI: 10.51843/wsproceedings.2013.04

Today's fast-paced instrumentation world pushes the demands on calibration quality systems to new levels:
1) reported measurement uncertainties;
2) tighter test limits through guard banding;
3) a growing number of complex instruments needing calibration;
4) a growing number of automated calibration routines;
5) global delivery of calibrations.
These demands raise several questions and challenges for keeping up with the pace:
1) How does one ensure a calibration is within its claimed measurement uncertainty?
2) Where is the biggest opportunity for improving procedures by reducing uncertainties?
3) How does one identify issues, correlate them to root causes, quantify their impact, communicate them so they get fixed, and verify that they are fixed?
4) How does one accomplish all of this with global teams in a timely and economical fashion?
To address these issues, a Calibration Measurement Monitoring System (CMMS) brings the most pertinent information to bear, expanding on control charts to provide timely checks and to communicate issues and correlate them to root causes quickly and easily. This paper explores the development and implementation of one such system.
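A CMMS of the kind described builds on control charts. As a minimal illustration of the underlying check (a sketch with made-up readings, not the system in the paper), k-sigma limits derived from a baseline of check-standard readings can flag new measurements that warrant investigation:

```python
import statistics

def control_limits(history, k=3.0):
    """Shewhart-style limits from a baseline of check-standard readings."""
    mean = statistics.fmean(history)
    sd = statistics.stdev(history)
    return mean - k * sd, mean + k * sd

def flag_out_of_control(history, new_readings, k=3.0):
    """Return the readings that fall outside the k-sigma limits."""
    lo, hi = control_limits(history, k)
    return [x for x in new_readings if not (lo <= x <= hi)]

# Invented 10 V check-standard history and two new readings.
baseline = [10.0002, 10.0001, 9.9999, 10.0003, 9.9998, 10.0000, 10.0002, 9.9999]
alarms = flag_out_of_control(baseline, [10.0001, 10.0041])
```

A full CMMS would layer run rules, root-cause metadata, and global reporting on top of checks like this one, but the per-measurement decision is essentially the comparison above.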
Producing Valid Results (Risk Mitigation and Measurement Assurance)
Philip Mistretta
DOI: 10.51843/wsproceedings.2013.44

Quality measurements are not produced by accident. They are not the result of a single action, occurrence, or event; they are a collection of planned, interrelated, and cohesive activities that should be considered alongside the development of manufacturing processes, not as an afterthought. Measurement activities are inextricably linked to product quality and manufacturing systems, and collectively they are referred to as Measurement Quality Assurance (MQA). MQA is good for product quality and good for business, and it can even be mandated by law. The United States Code of Federal Regulations, Title 21 (Food and Drugs), states: "Each manufacturer shall ensure that all inspection, measuring, and test equipment, including mechanical, automated, or electronic inspection and test equipment, is suitable for its intended purposes and is capable of producing valid results." (21 CFR 820.72) A comprehensive MQA program designed to mitigate risk has many components. It starts with the product design and the identification of the required measurements and the process tolerances needed to efficiently build a quality process or service. Once the process tolerances have been defined, test equipment must be selected to take these measurements. The selected equipment must be suitable and appropriate for the measurement tasks, and it must be capable of producing valid results. Even the proper instrument can produce invalid results if it is not handled, maintained, used, and stored properly. And even when all of this is done, if it is not documented by objective evidence, the intent of the MQA program, mitigating risk, can be derailed.
Uncertainty of Calibration of Instruments, a Simple Example in Dimensional Metrology
T. Doiron
DOI: 10.51843/wsproceedings.2013.56

How is the calibration of a micrometer using gage blocks different from using a micrometer to calibrate gage blocks? In the measurement community there is considerable confusion about which characteristics of an instrument belong in the uncertainty budget for calibrating that instrument. This talk discusses the issue for a standard hand micrometer and shows that, given the same set of gage blocks and micrometer readings, the uncertainty differs markedly depending on which is the standard and which is the instrument under test.
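The asymmetry the talk describes can be made concrete with a toy uncertainty budget. The component values below are invented for illustration and are not from the talk; the point is only that different components enter the root-sum-square combination depending on which device plays the role of the standard:

```python
from math import sqrt

def rss(*components_um):
    """Root-sum-square combination of standard uncertainty components (micrometres)."""
    return sqrt(sum(u * u for u in components_um))

# Illustrative component values in micrometres (assumed, not from the paper).
u_block_cal = 0.05   # calibration uncertainty of the gage blocks
u_repeat = 0.5       # repeatability of micrometer readings
u_flatness = 0.3     # micrometer anvil flatness/parallelism effects
u_temp = 0.1         # thermal effects

# Case 1: gage blocks are the standard, the micrometer is under test.
# The micrometer's own imperfections are part of what is being *measured*,
# so they do not enter the budget; the budget is dominated by the blocks,
# the repeatability of the comparison, and environment.
u_cal_micrometer = rss(u_block_cal, u_repeat, u_temp)

# Case 2: the micrometer is used as the standard to calibrate blocks.
# Now every imperfection of the micrometer is an uncertainty source.
u_cal_blocks = rss(u_repeat, u_flatness, u_temp)
```

With the same instruments and the same readings, the two budgets combine different components, which is why the resulting uncertainties differ.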
Network for Innovation and Research in Metrology for the Automotive Industry (RIIMSA) in Mexico
F. Mercader-Trejo
DOI: 10.51843/wsproceedings.2013.07

The Network for Innovation and Research in Metrology for the Automotive Industry in Mexico (RIIMSA) was created to address the growing metrology needs of the automotive sector in the central plateau of Mexico, which comprises the states of Guanajuato, Aguascalientes, and Querétaro; the largest Mexico-based automotive suppliers are found in these three states. The initiative was proposed by the Polytechnic University at Santa Rosa Jauregui (UPSRJ) and is supported by the Council of Science and Technology of Querétaro (CONCYTEQ) and the National Metrology Institute of Mexico (CENAM). The purpose of the network is to strengthen the bonds among the scientific and technological capabilities in the region in order to generate synergies among network members, the automotive sector, government, other national and international networks, and society. RIIMSA aims to advance the training of human capital and the research and development of innovative projects in the field of metrology.
New 10V Programmable Josephson Voltage Standard (PJVS) and its Application for the 2014 NCSLI JVS Interlaboratory Comparison
Yi-hua Tang, Johnathan P. Harben, J. Sims
DOI: 10.51843/wsproceedings.2013.18

The National Conference of Standards Laboratories International (NCSLI) is scheduled to start the 10th Josephson Voltage Standard (JVS) Interlaboratory Comparison (ILC) in early 2014. NASA's Kennedy Space Center (KSC), which began operating a 10 V Programmable Josephson Voltage Standard (PJVS) in 2010, is a pivot-lab candidate for the NCSLI JVS ILC. We propose to use the NASA PJVS as a transfer standard for the intercomparison, in addition to the group of Zener standards used in the previous ILC. The superior stability of the 10 V PJVS voltage step enables it to perform the same tasks as the Zener standards while improving the efficiency and effectiveness of the ILC through direct comparison. Recently, NIST performed a comparison between a conventional JVS and the NIST 10 V PJVS to verify the performance of the PJVS. The mean difference between the two systems at 10 V was -0.49 nV, with a combined standard uncertainty of 1.32 nV (k = 1), i.e., a relative combined standard uncertainty of 1.32 parts in 10^10. The advantage of using the 10 V PJVS is that a participating laboratory can compare its conventional JVS against the 10 V PJVS in the same manner as measurements of Zener standards are performed. Owing to the quantum nature of the 10 V PJVS, its superior accuracy and stability will improve the uncertainty of a JVS comparison for the direct-comparison participants to a few parts in 10^10 at 10 V. This would be a substantial improvement over the 2011 ILC, which reported an expanded uncertainty with 95% confidence limits of +220 nV and -150 nV.
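The relative uncertainty quoted in the abstract follows directly from the absolute figure; a short check of the arithmetic (the k = 2 expansion is added here for illustration and is not stated in the abstract):

```python
u_c_nv = 1.32                              # combined standard uncertainty in nV (k = 1)
nominal_v = 10.0                           # comparison voltage in V
relative_u = (u_c_nv * 1e-9) / nominal_v   # dimensionless relative uncertainty
expanded_u_nv = 2.0 * u_c_nv               # expanded uncertainty for k = 2, in nV
# relative_u comes out to 1.32e-10, i.e. 1.32 parts in 10^10
```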
Important Broad-Based Metrology Concepts in the Revised U.S. Micrometer Standard
J. Salsbury
DOI: 10.51843/wsproceedings.2013.25

A revision to the U.S. standard on micrometers, ASME B89.1.13, was approved by the ASME B89 dimensional metrology standards committee in 2012, and final publication is expected in 2013. The standard includes many modern and novel calibration concepts that apply beyond the dimensional field, and the purpose of this paper is to communicate some of its highlights to the larger metrology community. Key issues include defining the measurand, traceability requirements, conformance decision rules, calibration versus verification, and measurement uncertainty. Some of the concepts in the revised ASME B89.1.13 are expected to be controversial, for example the intentional exclusion of the resolution of the unit under test from the estimation of measurement uncertainty. By presenting the new standard in its entirety, it is hoped that others will understand and appreciate the reasoning behind its novel and controversial concepts and be able to apply the ideas not just to micrometers but to other fields of metrology as well.
Comparison of Evaluation Criteria in the Use of Measurement System Based on Regression with Gauge R&R Study
Jonathan Eric Cortez-Rincon, Manuel Darío Hernández-Ripalda, Moisés Tapia-Esquivias, S. Echeverría-Villagómez
DOI: 10.51843/wsproceedings.2013.52

This project involves two approaches to measurement system evaluation: a regression-based method [1] and a Gauge R&R study [2]. There are many tools for the analysis, control, and improvement of processes, but this paper addresses only those that work with the data of the measurements actually made: the regression-based method and the ANOVA method of the Gauge R&R study [3]. In addition to applying different criteria for accepting or rejecting measurements made under given conditions, the approach evaluates with both tools a set of numerical models that satisfy certain statistical conditions. In this way the results of the regression method can be compared with those of the ANOVA-based Gauge R&R study. By obtaining the number of distinct categories [4] and the relation between the projections in the measurements, it is possible to determine whether what is reliable for one system is also suitable for the other. Finally, the paper proposes a comparative table of the evaluation criteria of both systems, showing where the line between acceptance and rejection is crossed.
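The "number of distinct categories" cited above is a standard Gauge R&R acceptance figure. The sketch below uses the usual AIAG-style formulas with invented standard deviations; the paper's data, models, and regression-based criteria are not reproduced here:

```python
from math import sqrt

def grr_percent(sigma_repeat, sigma_reprod, sigma_part):
    """%GRR relative to total variation, a common acceptance figure,
    plus the combined gauge standard deviation."""
    sigma_grr = sqrt(sigma_repeat**2 + sigma_reprod**2)
    sigma_total = sqrt(sigma_grr**2 + sigma_part**2)
    return 100.0 * sigma_grr / sigma_total, sigma_grr

def number_of_distinct_categories(sigma_part, sigma_grr):
    """AIAG 'ndc' metric: how many distinct groups of parts the measurement
    system can reliably resolve (>= 5 is the usual acceptance criterion)."""
    return int(1.41 * sigma_part / sigma_grr)

# Illustrative standard deviations from a hypothetical ANOVA decomposition.
pct, sigma_grr = grr_percent(sigma_repeat=0.08, sigma_reprod=0.06, sigma_part=0.50)
ndc = number_of_distinct_categories(0.50, sigma_grr)
```

Comparing figures like `pct` and `ndc` against the corresponding regression-based criteria is the kind of side-by-side evaluation the paper's comparative table formalizes.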
Investigation on Digital Control Concepts for Dynamic Applications of Electromagnetic Force Compensated Balances
C. Diethold, H. Weis, I. Gushchina, A. Amthor, F. Hilbrunner, T. Frohlich
DOI: 10.51843/wsproceedings.2013.20

This paper deals with optimizing the dynamic performance of electromagnetically force-compensated balances through alternative controller concepts. With respect to measurement uncertainty and achievable resolution, balances based on the principle of electromagnetic force compensation represent the state of the art. Because of the high achievable resolution, implementations are no longer focused solely on static weighing but are increasingly aimed at dynamic applications; typical examples are checkweighers and metering devices in pharmaceutical or food filling plants. There, the demand is not only to obtain high resolution with high reproducibility but also to reach a stable and reliable measurement result in a very short time. One way to achieve this is to optimize the controller, where a PID controller represents the state of the art. For static applications the PID controller performs fairly well, but its limited number of parameters and basic structure restrict the optimization possibilities for dynamic applications. An alternative approach is to implement the controller digitally. The advantage is that the controller and filtering concept can be adapted without restriction to the system being controlled, and that parameters can be adjusted easily and online. With these concepts, the time needed to reach a stable measurement signal can be reduced significantly. In this paper we present detailed investigations of two commercially available real-time platforms (controller-based and FPGA-based) for implementing digital control and filtering algorithms. The hardware restrictions were evaluated, and based on these results possible software realizations were tested. In accordance with the determined capabilities and limitations, a controller was designed. These first investigations emphasize the capabilities and potential of digital controllers.
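As a point of reference for the discussion above, a digital controller in its simplest form is a discretized PID loop. The sketch below uses a toy first-order plant and illustrative gains, not the authors' balance model or their real-time implementation; it only shows the structure such a controller takes in software, where every parameter can be changed online:

```python
class DiscretePid:
    """Textbook discrete PID (backward-difference derivative), a minimal
    stand-in for the state-of-the-art analogue controller discussed above."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy first-order plant: controller output u drives the compensation force.
pid = DiscretePid(kp=2.0, ki=5.0, kd=0.05, dt=0.001)
y = 0.0
for _ in range(5000):          # 5 s of simulated 1 kHz control
    u = pid.update(1.0, y)     # setpoint 1.0 (arbitrary units)
    y += (u - y) * 0.01        # crude plant step (time constant ~0.1 s)
```

In a digital realization on a controller- or FPGA-based platform, this structure can be freely extended with additional filters or replaced by a more elaborate control law, which is exactly the flexibility a fixed analogue PID lacks.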