{"title":"Establishing and Structuring Criteria for Measuring Knowledge Management Efforts","authors":"V. Anantatmula, S. Kanungo","doi":"10.1109/HICSS.2005.247","DOIUrl":"https://doi.org/10.1109/HICSS.2005.247","url":null,"abstract":"Establishing criteria for knowledge management is an imperative aspect of management because it helps determine results. Our research indicated that widely-accepted criteria and performance measures have not been developed for knowledge management. Our survey-based research, which used a questionnaire targeting knowledge management professionals as respondents, was aimed at establishing criteria for assessing knowledge management success. Using these criteria, a computer-assisted model is used to understand the shared underlying organizational framework in which KM operates and to identify how these criteria are linked. These relations should be explored and utilized to improve organizational performance. Future research should focus on translating the soft measures of knowledge management into detailed metrics.","PeriodicalId":355838,"journal":{"name":"Proceedings of the 38th Annual Hawaii International Conference on System Sciences","volume":"34 3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-01-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131153127","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluating the Quality of Health Web Sites: Developing a Validation Method and Rating Instrument","authors":"D. Bomba","doi":"10.1109/HICSS.2005.251","DOIUrl":"https://doi.org/10.1109/HICSS.2005.251","url":null,"abstract":"Within the field of consumer health informatics there is a need to develop transparent validation methods and rating instruments both of sufficient complexity and reliability to help designers, evaluators and patients to evaluate the quality of health web sites and health information on the Web. Further refinement and validation of the Bomba and Land Consumer Health Website Rating Index (v.1) was conducted. This paper reports on the validation approach utilised (a combination of the Delphi Technique and Sullivan's 5 step process) to produce version 2 of the Bomba and Land Index.","PeriodicalId":355838,"journal":{"name":"Proceedings of the 38th Annual Hawaii International Conference on System Sciences","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-01-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131143775","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Neural Network with Forgetting: An ANN Algorithm for Customer","authors":"Q. Ye, Tao Lu, Yijun Li, Wenjun Sun","doi":"10.1109/HICSS.2005.454","DOIUrl":"https://doi.org/10.1109/HICSS.2005.454","url":null,"abstract":"As a useful analytical tool, artificial neural networks (ANNs) are widely applied in analyzing the information stored in enterprise databases and data warehouses. Suitably structured ANNs can also be used for customer segmentation. In establishing the training data sets, there is the problem of selecting customer data drawn from widely varying time periods, as is typical in an enterprise data warehouse. To solve this problem, we introduce a forgetting coefficient into the BP ANN, establishing a new network, the ANN with forgetting, to improve the customer segmentation process.","PeriodicalId":355838,"journal":{"name":"Proceedings of the 38th Annual Hawaii International Conference on System Sciences","volume":"57 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-01-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132848983","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
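The forgetting mechanism the abstract describes — down-weighting training records from older time periods — can be sketched as exponentially decayed sample weights folded into a gradient update. This is a minimal illustration only: the function names, the exponential-decay form, and all hyperparameters are assumptions, not the paper's actual BP formulation.

```python
import numpy as np

def forgetting_weights(ages, beta=0.9):
    """Per-sample weights w_i = beta ** age_i, so older records count less.

    `ages` are in arbitrary units (e.g. quarters since the record entered
    the warehouse); beta in (0, 1) plays the role of a forgetting coefficient.
    """
    return np.asarray(beta, dtype=float) ** np.asarray(ages, dtype=float)

def weighted_gradient_step(X, y, theta, w, lr=0.1):
    """One gradient-descent step on weighted squared error.

    Stands in for one weighted backpropagation update: each sample's error
    is scaled by its forgetting weight before the gradient is accumulated.
    """
    residuals = X @ theta - y
    grad = X.T @ (w * residuals) / w.sum()
    return theta - lr * grad
```

With `beta=0.5`, samples aged 0, 1, 2 receive weights 1, 0.5, 0.25, so the most recent customer records dominate the fit.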
{"title":"Competitive Strategy, Economics and Information Systems: Introduction to the Mini-Track","authors":"E. Clemons, R. M. Dewan, R. Kauffman","doi":"10.1109/HICSS.2005.157","DOIUrl":"https://doi.org/10.1109/HICSS.2005.157","url":null,"abstract":"This full-day mini-track has four sessions covering current research themes on competitive strategy and economic analysis, coupled with leading managerial issues in IS and e-commerce. The first session is on the Economics of Network Pricing Mechanisms and Electronic Markets. Karl Lang and Roumen Vagrov of the City University of New York discuss “A Pricing Mechanism for Digital Content Distribution over Peer-to-Peer Networks.” They model a usage-based pricing scheme for distributing digital content via P2P networks that rewards users who actively participate in the distribution process. Hemant Bhargava of the University of California at Davis and Daewon Sun of Notre Dame University contributed “Quality-Contingent Pricing for Broadband Services.” To resolve consumption uncertainties, they propose the use of contracts involving proportional rebates and threshold quality-contingent pricing. Anindya Ghose of New York University, and Rahul Telang and Ramayya Krishnan of Carnegie Mellon University examine the “Welfare Implications of Electronic Secondary Markets.” They elaborate on new modeling findings that suggest that the presence of used goods in the market should not be a deterrent for value-maximizing sellers to participate.","PeriodicalId":355838,"journal":{"name":"Proceedings of the 38th Annual Hawaii International Conference on System Sciences","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-01-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133339825","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Where to Invest in Information Systems: A CRM Case Study","authors":"G. Cook, Thomas J. Housel","doi":"10.1109/HICSS.2005.692","DOIUrl":"https://doi.org/10.1109/HICSS.2005.692","url":null,"abstract":"This paper proposes a framework for examining the use of the knowledge value-added (KVA) methodology to measure the value added by information systems (IS) to business processes. This approach provided a practical way to examine how estimates of the return on knowledge (ROK) of the use of IS supported the core processes before and after deployment within a large telecommunications company (SBC Telecom). This approach offers a means for estimating the return on investments in IS. The paper begins by reviewing the context of IS investment decision making at SBC Telecom. The method for deriving estimates using KVA is reviewed in the SBC Telecom case. The problems in using this new methodology in the high pressure, time critical context of the case are reviewed along with the ways the research team overcame the initial reluctance to use KVA.","PeriodicalId":355838,"journal":{"name":"Proceedings of the 38th Annual Hawaii International Conference on System Sciences","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-01-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132436509","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Agent Technology, Intelligent Systems and Soft Computing in Management Support","authors":"C. Carlsson, P. Walden","doi":"10.1109/HICSS.2005.69","DOIUrl":"https://doi.org/10.1109/HICSS.2005.69","url":null,"abstract":"The Agent Technology, Intelligent Systems and Soft Computing in Management Support mini-track is part of the movement towards developing effective intelligent systems for problem solving and decision making, and towards building and implementing systems that can deal with complex and ill-structured situations, i.e. contexts in which discovery and learning can positively impact the outcome of the problem solving process. The next generation of modeling tools and support systems will include (but is not limited to) the use of intelligent technologies (machine intelligence, neural nets, genetic algorithms), soft computing (fuzzy logic, approximate reasoning, probabilistic modeling) and advanced mathematical modeling. The use of soft computing methods is gaining in both acceptability and importance as the conflict between rigor and relevance becomes more apparent in a dynamic and quickly changing world. The use of advanced methods gives us more rigorous problem solving and more precise results, which become harder and harder to implement, i.e. they lose in relevance. Soft computing offers a way to keep a rigorous theoretical framework and at the same time to allow for imprecision, which keeps the results relevant. There is an increasing demand for smart systems for interactive planning, problem solving and decision making, by individuals or by groups of users. Future systems will be more robust, more adaptive and easier to use than standard analytical tools. Optimization models (most of the time multiple criteria models) will be more easily incorporated in support systems. The expected end result will give users knowledge-based support that is adapted to the problems they need to solve and the decision making expected of them and, furthermore, to the internal logic of the context in which they will have to carry out their activities. There is a growing interest in soft computing tools, which are used to handle imprecision and uncertainty, and to build flexibility and context adaptability into intelligent systems. The application of soft computing to decision problems is focused on a decision context where fast and correct decision making is becoming instrumental. There is no great consensus on what exactly will form the “new decision context”, but some of the key elements will most probably be: (i) virtual teamwork in different places and in different time zones; (ii) decision support systems on mobile devices; (iii) access to and use of multilayer networks (internet(s), intranets), through which (iv) a multitude of data sources (databases, data warehouses, text files, multimedia sources, etc.) can be reached; with support from (v) intelligent technologies for filtering, sifting and summarizing (software agents, evolutionary computing, neural nets, etc.); (vi) multiple criteria (crisp, soft) algorithms for problem solving; and (vii) semantic web technology to form associative data sources.","PeriodicalId":355838,"journal":{"name":"Proceedings of the 38th Annual Hawaii International Conference on System Sciences","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-01-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134493245","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Looking for Indicators of Media Richness Theory in Distance Education","authors":"W. B. Martz, V. Reddy","doi":"10.1109/HICSS.2005.392","DOIUrl":"https://doi.org/10.1109/HICSS.2005.392","url":null,"abstract":"Distance learning is becoming a popular option for education. Course management platforms (CMP) are used to deliver distance learning courses. These include an abundance of channels by which to communicate and run a distance class. As such, CMP provide an excellent environment to test the implications of Media Richness Theory. Based upon a review of \"interaction\" theories, conjectures are formulated outlining anticipated differences based upon the richness used in the CMP. These conjectures are tested with a set of data from an established distance education program. The results show significant differences, in the direction predicted by Media Richness Theory, for satisfaction, communication and interaction, and perceived technology effectiveness.","PeriodicalId":355838,"journal":{"name":"Proceedings of the 38th Annual Hawaii International Conference on System Sciences","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-01-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115504593","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Service Composition on Top of Exchangable Protocols","authors":"S. Böttcher, Christian Dannewitz","doi":"10.1109/HICSS.2005.539","DOIUrl":"https://doi.org/10.1109/HICSS.2005.539","url":null,"abstract":"Whenever message-oriented middleware is used within applications that run on the heterogeneous software systems of multiple partners, and middleware protocols have to be replaced or adapted to changing tasks, it is considerably advantageous to have a modular system that supports the exchange of protocols and dynamic adaptation to changing requirements. Our approach to middleware allows system developers to generate a modular stack of message-oriented middleware from a set of service specifications. We present a technique by which service specifications can be translated into executable middleware code that includes a message queue, a stack of protocols chosen by the middleware developer, and a set of modules that implement the desired tasks.","PeriodicalId":355838,"journal":{"name":"Proceedings of the 38th Annual Hawaii International Conference on System Sciences","volume":"252 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-01-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115616954","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
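The exchangeable-protocol idea in this abstract — a message queue wrapped by an ordered, swappable stack of protocol modules — can be illustrated with a toy sketch. All class names and the example layers here are invented for illustration; they are not the authors' generated middleware code.

```python
from collections import deque

class Layer:
    """One exchangeable protocol module: encode on send, decode on receive."""
    def encode(self, msg: str) -> str:
        return msg
    def decode(self, msg: str) -> str:
        return msg

class Framing(Layer):
    """Toy framing protocol that brackets the payload."""
    def encode(self, msg): return "<" + msg + ">"
    def decode(self, msg): return msg[1:-1]

class ReverseObfuscation(Layer):
    """Toy stand-in for a transformation layer such as encryption."""
    def encode(self, msg): return msg[::-1]
    def decode(self, msg): return msg[::-1]

class MiddlewareStack:
    """A message queue plus an ordered, swappable stack of protocol layers."""
    def __init__(self, layers):
        self.layers = list(layers)   # replacing this list swaps protocols
        self.queue = deque()
    def send(self, msg):
        for layer in self.layers:            # outbound: apply layers in order
            msg = layer.encode(msg)
        self.queue.append(msg)
    def receive(self):
        msg = self.queue.popleft()
        for layer in reversed(self.layers):  # inbound: unwind in reverse order
            msg = layer.decode(msg)
        return msg
```

Because each layer only sees the message passing through it, a layer can be replaced without touching the queue or the other modules — the modularity the abstract argues for.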
{"title":"The Effect of Sample Size on the Extended Self-Organizing Map Network for Market Segmentation","authors":"M. Kiang, Michael Y. Hu, D. Fisher, R. Chi","doi":"10.1109/HICSS.2005.590","DOIUrl":"https://doi.org/10.1109/HICSS.2005.590","url":null,"abstract":"Kohonen's Self-Organizing Map (SOM) network maps input data to a lower dimensional output map. The extended SOM network further groups the nodes on the output map into a user-specified number of clusters. Kiang, Hu and Fisher used the extended SOM network for market segmentation and showed that the extended SOM provides better results than the statistical approach that reduces the dimensionality of the problem via factor analysis and then forms segments with cluster analysis. In this study we examine the effect of sample size on the extended SOM compared to that on the factor/cluster approach. Comparisons will be made using the correct classification rates between the two approaches at various sample sizes. Unlike statistical models, neural networks are not dependent on statistical assumptions. Thus we expect the results for neural network models to be stable across sample sizes, though they may be sensitive to initial weights and model specifications.","PeriodicalId":355838,"journal":{"name":"Proceedings of the 38th Annual Hawaii International Conference on System Sciences","volume":"60 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-01-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115758175","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
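The Kohonen SOM that the abstract builds on can be sketched minimally as follows, which is enough to experiment with sample-size effects of the kind the study examines. Grid size, decay schedules, and function names are illustrative assumptions; this is the plain SOM, not the extended clustering variant of Kiang et al.

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=20, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal Kohonen SOM: map rows of `data` onto a small 2-D node grid."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.normal(size=(rows * cols, data.shape[1]))
    coords = np.array([(i, j) for i in range(rows) for j in range(cols)], dtype=float)
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)               # decaying learning rate
        sigma = sigma0 * (1.0 - t / epochs) + 1e-2  # shrinking neighborhood width
        for x in data:
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)     # grid distance to BMU
            h = np.exp(-d2 / (2.0 * sigma ** 2))               # neighborhood kernel
            weights += lr * h[:, None] * (x - weights)         # pull nodes toward x
    return weights

def quantization_error(data, weights):
    """Mean distance from each sample to its best-matching unit."""
    d = np.sqrt(((data[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2))
    return d.min(axis=1).mean()
```

Training on subsamples of different sizes and comparing quantization errors (or downstream classification rates) gives a direct way to probe the stability-across-sample-sizes hypothesis stated in the abstract.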
{"title":"Reliability, Electric Power, and Public Versus Private Goods: A New Look at the Role of Markets","authors":"D. Toomey, W. Schulze, R. Schuler, R. Thomas, J. Thorp","doi":"10.1109/HICSS.2005.518","DOIUrl":"https://doi.org/10.1109/HICSS.2005.518","url":null,"abstract":"The economic theory that has been used to support restructuring of the electric power industry has ignored several important technological constraints and public goods that affect the way in which power is delivered. Some of these public goods include voltage, frequency, and reliability of lines. Similarly, engineers, by using security-constrained optimization to incorporate the demand for reliability, have failed to properly define the economic problem. This research attempts to remedy this deficiency through a collaborative effort between economists and engineers to examine the theoretical and empirical properties of a networked power system that provides economically optimal reliability and draw conclusions regarding efficient market design.","PeriodicalId":355838,"journal":{"name":"Proceedings of the 38th Annual Hawaii International Conference on System Sciences","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-01-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115774935","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}