The EPA Supersites Research Program needs consistency of metadata and data structures to facilitate information sharing among investigators, analysts, and, ultimately, secondary data users. Under the auspices of NARSTO, a successful mechanism was created to develop and implement reporting standards. The development effort included working closely with Supersites data coordinators, investigators, and technical experts, and leveraging existing data standards and practices. Overall, the standards are gaining good acceptance in the atmospheric research community.
"Data and metadata reporting standards for the U.S. Environmental Protection Agency's PM Supersites Research Program." L. Hook, S. W. Christensen, W. Sukloff. Quality Assurance, pp. 155-64, 2002. DOI: 10.1080/713844021
Pub Date: 2001-01-01. DOI: 10.1080/10529410290116829
C. Bethell
The Environmental Protection Agency's Strategic Plan was developed in response to internal and external concerns about the integrity, consistency, and accuracy of EPA's environmental data. This document explains why a Strategic Plan is needed and the methodology used in its development, cites Agency models of excellence, and presents the six recommendations of EPA's Data and Information Quality Strategic Plan.
"Data and information quality strategic plan." Quality Assurance, pp. 63-97.
Pub Date: 2000-07-01. DOI: 10.1080/10529410052852367
D. MacMillan
The Corps of Engineers works with local restoration advisory boards (RABs) to exchange information and develop plans for the restoration of closed military bases for civilian reuse. RAB meetings to discuss progress in the environmental assessment and restoration of former defense sites can be contentious because of the complex technical nature of the information to be shared and the personal stake that members of the community have in ensuring that contaminated areas are restored for safe use. A prime concern of community representatives is often the quality of the data used to make environmental decisions. Laboratory case narratives and data flags may suggest laboratory errors and low data quality to those without an understanding of the information's full meaning. RAB members include representatives from local, state, and tribal governments, the Department of Defense, the Environmental Protection Agency, and the local community. The Corps of Engineers representatives usually include project technical and management personnel, but these individuals may not have sufficient expertise in project quality assurance components and laboratory data quality procedures to completely satisfy community concerns about data quality. Communication of this information to the RAB by a quality assurance professional could serve to resolve some of the questions members have about the quality of acquired data and the proper use of analytical results, and increase community trust that appropriate decisions are being made regarding restoration. Details of the effectiveness of including a quality assurance professional in RAB discussions of laboratory data quality and project quality management are provided in this paper.
"The quality management system as a tool for improving stakeholder confidence." Quality Assurance, pp. 201-4.
Pub Date: 2000-07-01. DOI: 10.1080/10529410052852349
D. R. Claycomb
Environmental data quality improvement continues to focus on analytical laboratory performance, with little, if any, attention given to improving the performance of the field consultants responsible for sample collection. Many environmental professionals assume that the primary opportunity for data error lies within the activities conducted by the laboratory. Experience in the evaluation of environmental data and project-wide quality assurance programs indicates that an often-ignored factor affecting environmental data quality is the manner in which a sample is acquired and handled in the field. If a sample is not properly collected, preserved, stored, and transported in the field, even the best laboratory practices and analytical methods cannot deliver accurate and reliable data (i.e., bad data in equals bad data out). Poor-quality environmental data may result in inappropriate decisions regarding site characterization and remedial action. Field auditing is becoming an often-employed technique for examining the performance of the environmental sampling field team and how that performance may affect data quality. Field audits typically focus on: (1) verifying that field consultants adhere to project control documents (e.g., Work Plans and Standard Operating Procedures [SOPs]) during field operations; (2) providing third-party independent assurance that field procedures, quality assurance/quality control (QA/QC) protocols, and field documentation are sufficient to produce data of satisfactory quality; (3) providing a defense in the event that field procedures are called into question; and (4) identifying ways to reduce sampling costs. Field audits are typically most effective when performed on a surprise basis; that is, the sampling contractor may be aware that a field audit will be conducted during some phase of sampling activities but is not informed of the specific day(s) on which the audit will be conducted.
The audit also should be conducted early in the sampling program so that deficiencies noted during the audit can be addressed before the majority of field activities have been completed. A second audit should be performed as a follow-up to confirm that the recommended changes have been implemented. A field auditor is assigned to the project by matching, as closely as possible, the auditor's experience with the type of field activities being conducted. The auditor uses a project-specific field audit checklist developed from key information contained in the project control documents. Completing the extensive audit checklist during the audit focuses the auditor on evaluating each aspect of the field activities being performed. Rather than examine field team performance after sampling, a field auditor can do so while the samples are being collected and can apply real-time corrective action as appropriate. As a result of field audits, responsible parties often observe vast improvements in their consultant's field procedures and, consequently, r
"The role of field auditing in environmental quality assurance management." Quality Assurance, pp. 189-94.
Pub Date: 2000-07-01. DOI: 10.1080/10529410052852385
J. C. Worthington
Enterprises, including Federal agencies such as the U.S. Environmental Protection Agency (EPA), are now identifying their information as a strategic resource. As part of this new strategy, enterprises address quality system planning. This technical paper presents some of EPA's approaches and techniques for reconciling quality system considerations for science and technical activities with quality system considerations for information technology and resources. The identification of key information quality indicators, management processes, and assessment processes is addressed.
"Two data bases in every garage: information quality systems." Quality Assurance, pp. 225-44.
Pub Date: 2000-07-01. DOI: 10.1080/10529410052852330
S. Bouzdalkina, R. Bath, P. Greenlaw, D. Bottrell
The quality evaluation and assessment of radiological data is the final step in the overall environmental data decision process. This quality evaluation and assessment process is performed outside of the laboratory, and generally the radiochemist is not involved. However, with the laboratory quality management systems in place today, the data packages for radiochemical analyses are frequently much more complex than the project/program manager can effectively handle, and, with little involvement from radiochemists in this process, the potential for misinterpretation of radiological data is increasing. The quality evaluation and assessment of radiochemistry data consists of making three decisions for each sample and result, remembering that the laboratory reports all the data for each analysis as well as the uncertainty in each of these analyses. Therefore, at the data evaluation and assessment stage, the decisions are: (1) is the radionuclide of concern detected (each data point always has a number associated with it)? (2) is the uncertainty associated with the result greater than would normally be expected? and (3) if the laboratory rejected the analysis, are there serious consequences for other samples in the same group? The need for the radiochemist's expertise in this process is clear. Quality evaluation and assessment requires the input of the radiochemist, particularly in radiochemistry, because of the lack of redundancy in the analytical data. This paper describes the role of the radiochemist in the quality assessment of radiochemical data for environmental decision making.
"The radiochemist's role in the quality evaluation and assessment of radiological data in environmental decision making." Quality Assurance, pp. 181-7.
Pub Date: 2000-07-01. DOI: 10.1080/10529410052852394
J. C. Worthington, G. Brilis
Quality assurance techniques used in software development and hardware maintenance/reliability help ensure that data in a computerized information management system are maintained well. However, information workers may not know the quality of the data resident in their information systems. Knowledge of the quality of information and data in an enterprise provides managers with important facts for managing and improving the processes that affect information quality. This paper presents a quality assessment methodology to assist information workers in planning and implementing an effective assessment of their information and data quality. The areas covered include: identifying appropriate information quality indicators; developing assessment procedures; conducting information quality assessments; reporting information assessment results; and tracking improvements in information quality.
"How good are my data? Information quality assessment methodology." Quality Assurance, pp. 245-60.
Pub Date: 2000-07-01. DOI: 10.1080/10529410052852358
L. Johnson, J. C. Worthington
Quality assurance project plans for environmental data collections consider user requirements for the measurements and express these in the form of data quality objectives. User requirements now may include the capture of measurements and associated information in prescribed formats to facilitate entry into computerized information systems. Establishing ahead of time that data requirements may be an important "back seat driver" for an environmental data collection effort can save an organization considerable resources. The planning may also need to accommodate unique requirements associated with the entry of data into data collection systems.
"Data standards are back seat drivers! Methodology for incorporating information quality into quality assurance project plans." Quality Assurance, pp. 195-9.
Pub Date: 2000-07-01. DOI: 10.1080/10529410052852303
G. Johnson
In April 2000, the White House issued Executive Order 13148, Greening the Government Through Leadership in Environmental Management. This Order applies to all appropriate federal facilities that have operations which interact with the environment and includes a number of environmentally-related requirements. The most significant requirement is that all appropriate federal facilities must implement an Environmental Management System (EMS) by December 31, 2005. This Order affects federal laboratories, testing facilities, maintenance facilities, hospitals, and so forth across all federal departments and agencies.
"Overview of Executive Order 13148: requirements for environmental management systems at federal facilities." Quality Assurance, pp. 153-60.
Pub Date: 2000-07-01. DOI: 10.1080/10529410052852286
D. Bottrell, R. Bath
Implementation of a Quality Systems approach to making defensible environmental program decisions depends upon multiple, interrelated components. Often, these components are developed independently and implemented at various facility and program levels in an attempt to achieve consistency and cost savings. The U.S. Department of Energy, Office of Environmental Management (DOE-EM) focuses on three primary system components to achieve effective environmental data collection and use: (1) Quality System guidance, which establishes the management framework to plan, implement, and assess the work performed; (2) a standardized Statement of Work for analytical services, which defines data generation and reporting requirements consistent with user needs; and (3) a laboratory assessment program to evaluate the adherence of work performed to defined needs (e.g., documentation and confidence). This paper describes how DOE-EM fulfills these requirements and realizes cost savings through participation in interagency working groups and integration of system elements as they evolve.
"DOE's quality system program: cooperative development and implementation." Quality Assurance, pp. 139-44.