Progress of Standardization of Urban Infrastructure in Smart City
Pub Date: 2022-09-02 | DOI: 10.3390/standards2030028
Jin Wang, Chang Liu, Liang Zhou, Jiangpei Xu, Jie Wang, Ziqin Sang
After the Smart City initiative was put forward, cities all over the world began pilot practices for developing Smart Cities. This raised a series of questions: what is a Smart City, how do we determine its scope of work, and how do we formulate a new strategic agenda to make cities smarter and more sustainable? The answer lies not only in finding Smart City solutions, but also in research on the definition of Smart City terminology and the determination of the corresponding tasks. Smart City stakeholders (e.g., policy makers, municipalities, solution providers, industry, and academia) jointly develop technical and management standards for these tasks. This paper reports on Smart City standardization planning by the international standards development organizations (SDOs), that is, the standardization framework for Smart Cities. It also presents one important aspect, namely the progress of standardization activities on urban infrastructure being carried out by the International Telecommunication Union (ITU) through its Study Group 20 to support the adoption of information and communication technologies (ICTs) in Smart Cities. These standards cover the classification of urban infrastructure, the interoperability between urban infrastructure and Smart City platforms, and the requirements of specific infrastructure from the perspective of ICT and the Internet of Things (IoT). The paper also provides use cases of the application of some of these standards in cities worldwide.
Video Quality Analysis: Steps towards Unifying Full and No Reference Cases
Pub Date: 2022-09-01 | DOI: 10.3390/standards2030027
P. Topiwala, W. Dai, J. Pian, Katalina Biondi, Arvind Krovvidi
Video quality assessment (VQA) is now a fast-growing field, maturing in the full-reference (FR) case yet challenging in the exploding no-reference (NR) case. In this paper, we investigate some variants of the popular FR VMAF video quality assessment algorithm, using both support vector regression and feedforward neural networks. We also extend it to the NR case, using different features but similar learning, to develop a partially unified framework for VQA. When fully trained, FR algorithms such as VMAF perform very well on test datasets, reaching a 90%+ match in the popular correlation coefficients PCC and SRCC. However, to predict performance in the wild, we train/test them individually for each dataset. With an 80/20 train/test split, we still achieve about 90% performance on average in both PCC and SRCC, with up to 7–9% gains over VMAF, using an improved motion feature and better regression. Moreover, we obtain good performance (about 75%) even if we ignore the reference and treat FR as NR, partly justifying our attempts at unification. In the true NR case, typically involving amateur user-generated data, we draw on many more features, yet still reduce complexity relative to the recent algorithms VIDEVAL and RAPIQUE while achieving performance within 3–5% of them. Moreover, we develop a method to analyze the saliency of features and conclude that, for both VIDEVAL and RAPIQUE, a small subset of their features provides the bulk of the performance. We also touch upon the current best NR methods, MDT-VSFA and PVQ, which reach above 80% performance. In short, we identify encouraging improvements in trainability in FR, while constraining training complexity against leading methods in NR, and elucidate the saliency of features for feature selection.
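As a rough illustration of the regression-and-scoring loop this abstract describes, the sketch below fits a support vector regressor to per-video features and evaluates it with PCC and SRCC on an 80/20 split. The feature values and MOS targets are synthetic placeholders; the paper's actual VMAF-derived features and datasets are not reproduced here.

```python
# Minimal sketch: fit an SVR on per-video quality features and score it with
# PCC/SRCC on an 80/20 split. Feature extraction (e.g., VMAF's VIF and motion
# features) is out of scope, so random placeholders stand in for real data.
import numpy as np
from scipy.stats import pearsonr, spearmanr
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))                                  # placeholder feature vectors
y = X @ rng.normal(size=6) + rng.normal(scale=0.3, size=200)   # placeholder MOS scores

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = SVR(kernel="rbf", C=10.0).fit(X_tr, y_tr)
pred = model.predict(X_te)

pcc, _ = pearsonr(y_te, pred)    # linear correlation with subjective scores
srcc, _ = spearmanr(y_te, pred)  # rank correlation (monotonicity)
print(f"PCC={pcc:.3f}  SRCC={srcc:.3f}")
```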
Forecasting the Competition of International Standardization Preoccupation
Pub Date: 2022-08-22 | DOI: 10.3390/standards2030026
B. Kang, Y. Lee
In the era of the Fourth Industrial Revolution, establishing a TBT system that draws on the knowledge-based view (KBV) is increasingly emphasized as a means of overcoming the scarcity of human resources and the lack of technological capabilities faced by export companies that produce and supply products and services. The WTO TBT Agreement, which is based on the multilateral agreement of the seventh GATT round (the Tokyo Round), consists of 15 articles and 3 annexes to ensure that technical regulations, standards, and conformity assessment systems do not act as technical barriers to trade. The transition to the digital economy has been accelerating, and it presents both a challenge and an opportunity. The US, which is at the center of the international standards competition, has accelerated that competition by invoking supply-chain executive orders and decoupling, and as China moves to implement the policy set out in the China Standards 2035 Plan, the US-China relationship is worsening over the preoccupation with standards. Aspiring to a Chinese counterpart of this US strategy, China, whose effort runs continuously from its 12th through 14th Five-Year Plans, is accelerating its standardization strategy through the Made in China 2025 program. The "double cycle development strategy" and "technological innovation" are its key mid- to long-term policy directions. Korea should develop a Korean-style conformity assessment development model based on the TBT system, a major element of non-tariff barriers, under the WTO/FTA system, promoting the flow of the KBV along with the establishment of a digital transformation system.
Wearable Biosensor Standardization: How to Make Them Smarter
Pub Date: 2022-08-02 | DOI: 10.3390/standards2030025
G. Giorgi, Sarah Tonello
The availability of low-cost plug-and-play devices may contribute to the diffusion of methods and technologies for the personalized monitoring of physiological parameters by wearable devices. This paper focuses on biosensors, an interesting enabling technology for the real-time, continuous acquisition of biological or chemical analytes of physio-pathological interest, e.g., metabolites, protein biomarkers, and electrolytes in biofluids. Currently available commercial biosensors are usually customized, proprietary solutions. However, the efficient and robust development of e-health applications based on wearable biosensors can be eased by device interoperability: even if the different modules belong to different manufacturers, they can be added, upgraded, changed, or removed without affecting the whole data acquisition system. A great effort in this direction has already been made by the ISO/IEC/IEEE 21451 standard, which introduces the concept of smart sensors by defining the main and essential characteristics that these devices should have. Following the guidelines provided by this standard, we propose a set of characteristics that should be considered in the development of a smart biosensor and show how they could be integrated into the existing standard.
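The interoperability idea behind ISO/IEC/IEEE 21451 is that a sensor carries a machine-readable self-description (a TEDS, Transducer Electronic Data Sheet) that a host can query before acquiring data. The sketch below is a loose, hypothetical illustration of that concept for a wearable biosensor; the field names are illustrative choices and do not follow the standard's normative binary TEDS layout.

```python
# Sketch of a TEDS-like self-description for a smart biosensor: the host reads
# the datasheet first, then decides whether the module is interoperable.
# Fields are illustrative, not the normative ISO/IEC/IEEE 21451 TEDS format.
from dataclasses import dataclass

@dataclass
class BiosensorTEDS:
    manufacturer: str
    model: str
    analyte: str            # e.g., "glucose", "lactate", "Na+"
    unit: str               # unit of the reported quantity
    range_min: float
    range_max: float
    sensitivity: float      # output change per unit of analyte
    calibration_date: str   # ISO 8601 date
    sample_rate_hz: float

def describe(teds: BiosensorTEDS) -> str:
    """Render the datasheet so a host application can check compatibility."""
    return (f"{teds.manufacturer} {teds.model}: {teds.analyte} "
            f"[{teds.range_min}-{teds.range_max} {teds.unit}] "
            f"@ {teds.sample_rate_hz} Hz, calibrated {teds.calibration_date}")

glucose = BiosensorTEDS("Acme", "G-100", "glucose", "mmol/L",
                        2.0, 30.0, 0.05, "2022-08-01", 1.0)
print(describe(glucose))
```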
The Rating Scale Paradox: Semantics Instability versus Information Loss
Pub Date: 2022-08-01 | DOI: 10.3390/standards2030024
J. Giacomelli
Rating systems are applied in a wide variety of contexts as a tool to map a large amount of information to a symbol, or notch, chosen from a finite, ordered set. Such a set is commonly known as the rating scale, and its elements represent all the different degrees of quality—in some sense—that a given rating system aims to express. This work investigates a simple yet nontrivial paradox in constructing that scale. When the considered quality parameter is continuous, a bijection must exist between a specific partition of its domain and the rating scale. The number of notches and their meanings are commonly defined a priori, based on the convenience of the rating system's users. However, regarding the partition, the number of subsets and their amplitudes should be chosen a posteriori to minimize the unavoidable information loss due to discretization. Considering the typical case of a creditworthiness rating system based on a logistic regression model, we discuss to what extent this contrast may impact a realistic framework and how a proper rating scale definition may handle it. Indeed, we show that choosing between a priori methods, which privilege the meaning of the rating scale, and a posteriori methods, which minimize information loss, is not strictly necessary: the two approaches can be mixed in a hybrid criterion tunable according to the needs of the rating model's users.
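To make the a priori versus a posteriori contrast concrete, the sketch below discretizes a continuous score (standing in for a logistic-regression output) into seven notches with either fixed equal-width cut points or data-driven quantile cut points, using weighted within-notch variance as a crude proxy for discretization loss. The score distribution and the loss proxy are assumptions for illustration, not the paper's actual criterion.

```python
# Minimal sketch: compare an a priori (equal-width) and an a posteriori
# (quantile) partition of a continuous score into a fixed number of notches.
import numpy as np

rng = np.random.default_rng(1)
scores = rng.beta(2, 8, size=5000)   # placeholder PDs, skewed toward low risk
n_notches = 7

def within_notch_variance(scores, edges):
    """Weighted within-notch variance: a crude proxy for discretization loss."""
    notches = np.clip(np.digitize(scores, edges[1:-1]), 0, len(edges) - 2)
    return sum(scores[notches == k].var() * (notches == k).mean()
               for k in range(len(edges) - 1) if (notches == k).any())

a_priori = np.linspace(0, 1, n_notches + 1)                           # fixed cut points
a_posteriori = np.quantile(scores, np.linspace(0, 1, n_notches + 1))  # fitted to data

print("a priori loss:    ", within_notch_variance(scores, a_priori))
print("a posteriori loss:", within_notch_variance(scores, a_posteriori))
```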
Partial Order as Decision Support between Statistics and Multicriteria Decision Analyses
Pub Date: 2022-07-22 | DOI: 10.3390/standards2030022
L. Carlsen, R. Bruggemann
Evaluation by ranking/rating of data based on a multitude of indicators typically calls for multi-criteria decision analysis (MCDA) methods. In addition to the indicator values, MCDA methods often require further, typically subjective, information. This paper presents a partial-order methodology as an alternative for analyzing multi-indicator systems (MIS), in which the indicator values are included simultaneously in the analysis. A non-technical introduction to the main concepts of partial order is given, along with a discussion of the position of partial order between statistics and MCDA. The paper visualizes examples of a 'simple' partial ordering of a series of chemicals to explain, in this case, unexpected behavior. Further, a generalized method to deal with qualitative inputs from stakeholders/decision makers is suggested, as well as a way to disclose peculiar elements/outliers. The paper finishes by introducing formal concept analysis (FCA), a variety of partial ordering that allows exploration and thus the generation of implications between the indicators. The conclusion and outlook section offers take-home comments and discusses the pros and cons of partial ordering.
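A minimal sketch of the core dominance relation follows: one alternative precedes another in the partial order only if it is at least as good on every indicator, so some pairs remain incomparable rather than being forced into a total ranking. The alternatives and indicator values are hypothetical, with higher values taken as better.

```python
# Minimal sketch of partial ordering over a multi-indicator system:
# pairwise dominance checks leave some pairs incomparable.
import numpy as np

alternatives = {
    "A": [0.9, 0.2, 0.7],
    "B": [0.5, 0.5, 0.5],
    "C": [0.8, 0.1, 0.6],
    "D": [0.4, 0.6, 0.4],
}

def dominates(x, y):
    """x >= y on all indicators and strictly > on at least one."""
    x, y = np.asarray(x), np.asarray(y)
    return bool(np.all(x >= y) and np.any(x > y))

for a in alternatives:
    for b in alternatives:
        if a < b:  # visit each unordered pair once
            if dominates(alternatives[a], alternatives[b]):
                print(f"{a} > {b}")
            elif dominates(alternatives[b], alternatives[a]):
                print(f"{b} > {a}")
            else:
                print(f"{a} and {b} are incomparable")
```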
Food Safety Management System (FSMS) Model with Application of the PDCA Cycle and Risk Assessment as Requirements of the ISO 22000:2018 Standard
Pub Date: 2022-07-22 | DOI: 10.3390/standards2030023
A. Stoyanova, Velichka Marinova, Daniel Stoilov, D. Kirechev
Today's management strategy requires a shift toward change-oriented management. Such management approaches are process- and activity-oriented and are based on the assumption that the future is difficult to predict and ineffective to model. The aim of this study is to present a model of food safety management using a process approach based on the PDCA cycle set out in the international standard ISO 22000:2018, supplementing the regulatory requirements for food safety management. After analyzing the aspects of food safety management, a model is proposed for risk analysis and assessment at the operational and organisational levels. In this study, the FMEA method was used for the risk assessment of the storage of foods of plant origin. The research can be useful for producers and traders in planning and developing food safety management systems according to the requirements of ISO 22000:2018. The implementation of documented rules for compliance with the international standard is aimed at the management and control of processes at the operational and organisational levels of companies' activities. Process management and data analysis are directions for improving activities aimed at minimizing food safety risks.
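As a sketch of the FMEA scoring step mentioned in the abstract, the snippet below ranks failure modes by their risk priority number (RPN), the product of severity, occurrence, and detection scores. The failure modes and scores are hypothetical examples for plant-origin food storage, not values from the study.

```python
# Minimal FMEA sketch: each failure mode gets severity (S), occurrence (O)
# and detection (D) scores on a 1-10 scale; RPN = S * O * D ranks the risks.
failure_modes = [
    # (description,                      S,  O, D)
    ("Temperature abuse during storage", 8,  4, 3),
    ("Mould growth from high humidity",  9,  3, 4),
    ("Pest contamination",               7,  2, 5),
    ("Mislabelled allergen",             10, 1, 2),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for desc, s, o, d in ranked:
    print(f"RPN={s * o * d:3d}  {desc}")
```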
The Development Trend of the Occupational Health and Safety in the Context of ISO 45001:2018
Pub Date: 2022-07-02 | DOI: 10.3390/standards2030021
M. Šolc, P. Blaško, L. Girmanová, J. Kliment
The main task of safety and health at work is to protect the most important thing we have: the health of each of us. Employers can anticipate and prevent risks through properly implemented occupational safety and health management systems. The basic task of this article is to describe the history of safety management systems and to identify the state of implementation of the ISO 45001 system worldwide. The article then describes the ISO 45001 standard from the perspective of the PDCA cycle and outlines the benefits and importance of implementing it. The conclusion deals with the development trend of the occupational health and safety management system according to STN ISO 45001:2019 in the context of occupational accidents in the Slovak Republic.
Does Standardisation Ensure a Reliable Assessment of the Performance of Construction Products?
Pub Date: 2022-06-24 | DOI: 10.3390/standards2030019
E. Szewczak
The implementation of a standard should be preceded by research work aimed at developing the test method, particularly validation experiments. Is this actually so? Numerous experiences of producers and laboratories, and an increasing number of scientific works, prove the opposite. It turns out that some standard methods are poorly suited to assessing the performance of construction products. This is related both to the specificity of the methods and to that of the tested products. This article presents some product assessment problems and the risk of using test methods that have not been fully validated. The risk seems relatively low if laboratories account for their own uncertainty. However, in some cases, additional components that both laboratories and product manufacturers might fail to consider can significantly increase the risk. This indicates the need for continuous work in the reference area.
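The risk mechanism described here can be illustrated numerically: a laboratory that budgets only its own uncertainty underestimates the probability of mis-classifying a product near a specification limit when an unvalidated method contributes a hidden extra component. The sketch below combines the components GUM-style (root sum of squares); all numbers are invented for illustration.

```python
# Minimal sketch: effect of an unaccounted method-uncertainty component on
# the probability of a false conformity decision near a specification limit.
from math import sqrt
from statistics import NormalDist

u_lab = 0.5      # laboratory's own standard uncertainty
u_method = 1.2   # hidden component from a poorly validated method
u_combined = sqrt(u_lab**2 + u_method**2)  # root sum of squares, as in the GUM

limit = 10.0     # specification limit for the measured property
result = 9.6     # measured value, apparently conforming

def risk(u):
    """Probability that the true value actually exceeds the limit."""
    return 1 - NormalDist(mu=result, sigma=u).cdf(limit)

print(f"risk with lab budget only:  {risk(u_lab):.1%}")
print(f"risk with method component: {risk(u_combined):.1%}")
```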
The Effect of Dust Storm on Sea Surface Temperature in the Western Basin of Persian Gulf
Pub Date: 2022-06-23 | DOI: 10.3390/standards2030018
M. Azad, K. Lari, Rana Oudi, T. Sadeghifar, O. Kisi
A dust storm is one of the costliest and most destructive events in many desert regions. This research investigates the effect of dust storms on sea surface temperature (SST) in the western zone of the Persian Gulf, especially Bushehr Province and its beaches, in 2008 and 2009. Climate and sea parameters such as SST, salinity, air temperature, wind velocity and direction, evaporation, horizontal visibility, sunshine hours, and radiation, measured simultaneously over a specific period, were analyzed by comparing each of them with satellite data. Summer SST analysis shows that the maximum SST in the Persian Gulf, along the waters neighboring Bushehr County and the central regions of the northern Persian Gulf, is about 34–36 °C. The SST in these places varies from 28 to 34 °C in summer, and from 29.5 to 31 °C during dust events. The results show that SST increases during dust events; this enhances evaporation, which in turn cools the surface. The added vapor also thickens and extends the cloud cover, effectively reducing incoming short-wave radiation, so both the surface temperature and the vapor decrease. As a result, the decrease in sea surface temperature eventually terminates.
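A minimal sketch of the comparison underlying the figures quoted above, assuming hypothetical column names and toy records: group simultaneous observations by dust condition and compare summer SST statistics.

```python
# Minimal sketch: compare SST statistics between dust and clear days.
# Column names and records are invented; the study's actual data came from
# coastal stations and satellite products.
import pandas as pd

obs = pd.DataFrame({
    "date": pd.to_datetime(["2008-07-01", "2008-07-02", "2008-07-03",
                            "2008-07-04", "2008-07-05", "2008-07-06"]),
    "sst_c": [30.1, 29.8, 30.6, 30.9, 29.5, 30.4],
    "dust":  [False, False, True, True, False, True],
})

summary = obs.groupby("dust")["sst_c"].agg(["mean", "min", "max"])
print(summary)
```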