Pub Date: 2016-08-01 | DOI: 10.1109/FSKD.2016.7603538
Min Hou, Jun Han, Jiasha Zhang, Jing Liu
Network education has been developing rapidly, but learners' motivation is often low. Based on incentive theory and motivation theory, this paper proposes incentive measures for each of the four stages of the teaching process: stimulating, guiding, maintaining, and evaluating, and organizes them into a framework of incentive measures for network teaching. On the basis of this framework, and taking Blackboard, which is widely used at Capital Normal University, as an example, the incentive mechanisms of three courses are investigated: Information Technology Education, Instructional Design, and Educational Technology Research Methods. We analyze the current state of the Blackboard platform, identify its advantages and disadvantages, and propose corresponding solutions. This study provides a reference for analyzing incentive measures in network teaching platforms, puts forward new ideas for the study of incentive mechanisms, and deepens our understanding of them.
Title: Research on incentive mechanism in network education platform
Published in: 2016 12th International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery (ICNC-FSKD)
Pub Date: 2016-08-01 | DOI: 10.1109/FSKD.2016.7603433
Xiaoliang Gong, Bozhong Long, Kun Fang, Zongling Di, Yichu Hou, Lei Cao
With the development of the Internet, many people have become trapped in the network; adolescents in particular grow dependent on network games, which disturbs their normal life. In this work, 579 freshmen completed personality questionnaires during their first week at university, and their grade point averages (GPA) were collected after half a year. The questionnaires included the Self-Control Scale (SCS), the Barratt Impulsiveness Scale (BIS), and the Chinese Big Five Personality inventory (CBF). We used multiple clustering algorithms, including FCM, k-means, and hierarchical clustering, to construct models for predicting Internet gaming disorder (IGD) risk. To our knowledge, this is the first attempt to predict IGD risk from personality traits. The results show that the clustering algorithms separated the questionnaire data well into three groups whose members share analogous personality traits related to IGD behavior. However, when compared against the GPA of each group, the performance of the prediction model is not yet satisfactory, and further work is needed to optimize it.
Title: A prediction based on clustering and personality questionnaire data for IGD risk: A preliminary work
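The clustering step described above can be illustrated with a minimal k-means sketch. This is not the authors' pipeline: the paper used FCM, k-means, and hierarchical clustering on SCS/BIS/CBF questionnaire scores, while the toy two-dimensional points below are purely illustrative.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: assign each point to its nearest centroid,
    then recompute centroids, repeating for a fixed number of iterations."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # index of the centroid with the smallest squared distance
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
            clusters[j].append(p)
        # recompute each centroid as the mean of its cluster (keep old if empty)
        centroids = [
            tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters
```

On two well-separated toy groups, the algorithm recovers the grouping; the study's harder problem is that cluster membership must then correlate with an external outcome (GPA), which is where the authors report the model falls short.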
Pub Date: 2016-08-01 | DOI: 10.1109/FSKD.2016.7603147
C. Li, Leifu Gao
To address the problem that traditional satellite remote sensing change detection methods overestimate changed areas, a context-sensitive, similarity-based supervised satellite image change detection method is proposed. Both the context-sensitive magnitude and the direction of change in the vicinity of each pixel are captured by means of local intercept and slope, and an SVM (support vector machine) using the local intercept and slope features is then applied to change detection. In experiments on high-resolution, bi-temporal, multispectral satellite images of earthquake damage, including damaged buildings, the results show that, compared to a standard SVM, the accuracy of change detection is clearly improved and the overestimation of changed areas is effectively reduced.
Title: Context-sensitive similarity based supervised image change detection
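The abstract does not give formulas for the "local intercept and slope". One plausible reading, sketched here under that assumption, is an ordinary least-squares fit of the second date's pixel values against the first date's values within a window around each pixel: the slope then reflects the local direction/gain of change and the intercept its offset.

```python
def local_slope_intercept(window_t1, window_t2):
    """Least-squares fit window_t2 ~ slope * window_t1 + intercept over
    corresponding pixels of two co-registered local neighborhoods."""
    n = len(window_t1)
    mean_x = sum(window_t1) / n
    mean_y = sum(window_t2) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(window_t1, window_t2))
    var = sum((x - mean_x) ** 2 for x in window_t1)
    slope = cov / var if var else 0.0          # flat window: no slope defined
    intercept = mean_y - slope * mean_x
    return slope, intercept
```

An unchanged area with a pure illumination difference would yield a slope near the gain and a stable intercept, while structural change (e.g. a collapsed building) breaks the linear relation; the (slope, intercept) pair per pixel would then feed the SVM as features.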
Pub Date: 2016-08-01 | DOI: 10.1109/FSKD.2016.7603347
Wuying Liu, Lin Wang
With the fast-paced spread of smartphones, binary short text classification (STC) has become a basic and challenging problem, and STC algorithms can be applied successfully to spam filtering for short message service (SMS), WeChat, microblogging, and so on. In this paper, we address the structural features of SMS documents and propose a structural learning framework, which decomposes the complex binary STC problem according to the SMS document structure and predicts the final category by combining several sub-predictions. Supported by our string-frequency index, we also implement several STC domain classifiers. The experimental results show that the performance of two previous STC algorithms can be improved by the structural learning framework, and that within this framework our STC domain classifiers achieve state-of-the-art performance on Chinese SMS spam filtering.
Title: Structural learning framework for binary short text classification
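The abstract specifies the decomposition-and-combine idea but not the concrete sub-classifiers or combiner. The sketch below is a stand-in under loud assumptions: each structural part of a message gets its own trivial keyword-ratio score (the spam lexicon is invented for illustration), and a weighted average of the sub-predictions yields the final binary label.

```python
SPAM_WORDS = {"free", "prize", "winner"}  # toy lexicon, not from the paper

def part_score(tokens):
    """One sub-classifier per structural part: fraction of spam-cue tokens."""
    if not tokens:
        return 0.0
    return sum(t in SPAM_WORDS for t in tokens) / len(tokens)

def classify_sms(parts, weights=None):
    """Combine per-part spam scores into a final binary decision by
    weighted averaging - a simple stand-in for the paper's combiner."""
    scores = [part_score(p) for p in parts]
    if weights is None:
        weights = [1.0] * len(scores)
    total = sum(w * s for w, s in zip(weights, scores))
    return total / sum(weights) >= 0.5
```

The point of the structure is that a strong signal in one part (e.g. a prize-laden opening segment) is not diluted across the whole message, which is one way decomposition can help on very short texts.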
Pub Date: 2016-08-01 | DOI: 10.1109/FSKD.2016.7603457
Xiujie Qu, Yue Sun, Yue Gu, Shuang Yu, Liwen Gao
To address the low matching accuracy caused by the different imaging mechanisms of heterologous images, we propose a novel image registration algorithm based on effective sub-image extraction and bidirectional matching of SURF feature points. The algorithm adopts a coarse-to-fine matching strategy. First, we transform the edge image into the frequency domain with the fast Fourier transform and roughly estimate the transformation parameters from the cross power spectrum. Second, we divide the coarsely matched images into several sub-graphs, pick out the effective sub-graphs according to normalized mutual information, and bidirectionally match the feature points of each effective sub-graph pair using time-domain features, thereby obtaining accurate transformation parameters and completing the fine matching. Experimental results on heterologous images in different scenarios show that the proposed algorithm effectively improves registration accuracy, reaching sub-pixel level.
Title: A high-precision registration algorithm for heterologous image based on effective sub-graph extraction and feature points bidirectional matching
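The sub-graph selection step ranks candidate sub-image pairs by normalized mutual information. A minimal histogram-based sketch of one common NMI definition, Studholme's (H(A) + H(B)) / H(A, B), is shown below; the paper may use a different NMI variant, and the bin count and intensity range here are illustrative defaults.

```python
import math

def normalized_mutual_information(a, b, bins=8, lo=0, hi=256):
    """NMI = (H(A) + H(B)) / H(A, B) for two equal-size intensity patches;
    ranges from 1.0 (independent) to 2.0 (identical up to binning)."""
    def bin_of(v):
        return min(bins - 1, (v - lo) * bins // (hi - lo))
    # joint histogram over (bin_of(a), bin_of(b)) pairs
    joint = {}
    for x, y in zip(a, b):
        key = (bin_of(x), bin_of(y))
        joint[key] = joint.get(key, 0) + 1
    n = len(a)
    px, py = {}, {}
    for (i, j), c in joint.items():  # marginals from the joint histogram
        px[i] = px.get(i, 0) + c
        py[j] = py.get(j, 0) + c
    entropy = lambda counts: -sum(c / n * math.log(c / n) for c in counts)
    hx, hy = entropy(px.values()), entropy(py.values())
    hxy = entropy(joint.values())
    return (hx + hy) / hxy if hxy else 2.0
```

A sub-graph pair scoring high NMI after the coarse alignment is likely to cover genuinely corresponding content, which makes it a safe place to run the finer SURF-based bidirectional matching.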
Pub Date: 2016-08-01 | DOI: 10.1109/FSKD.2016.7603391
Jiwu Peng, Zheng Xiao, Cen Chen, Wangdong Yang
Iterative SpMV (ISpMV) is a key operation in many graph-based data mining and machine learning algorithms. With the growth of big data, the matrices can be so large, perhaps billion-scale, that SpMV cannot be performed on a single computer. It is therefore challenging to implement and optimize SpMV for large-scale data sets. In this paper, we use an in-memory heterogeneous CPU-GPU cluster computing platform (IMHCPs) to efficiently solve the billion-scale SpMV problem. A dedicated and efficient hierarchical partitioning strategy for the sparse matrices and the vector is proposed, which partitions the sparse matrices both among workers in the cluster and among GPUs within each worker. Moreover, the performance of the IMHCPs-based SpMV is evaluated in terms of computation efficiency and scalability.
Title: Iterative sparse matrix-vector multiplication on in-memory cluster computing accelerated by GPUs for big data
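The kernel being distributed is ordinary SpMV over a compressed sparse row (CSR) matrix: each output row is an independent dot product, which is what makes row-block partitioning across workers and GPUs natural. A minimal single-node sketch of the kernel (the paper's hierarchical partitioning and GPU offload are not reproduced here):

```python
def spmv_csr(values, col_idx, row_ptr, x):
    """y = A @ x for a CSR matrix: values holds the nonzeros row by row,
    col_idx their column indices, and row_ptr[r]:row_ptr[r+1] delimits row r."""
    y = []
    for r in range(len(row_ptr) - 1):
        y.append(sum(values[k] * x[col_idx[k]]
                     for k in range(row_ptr[r], row_ptr[r + 1])))
    return y
```

In an iterative setting (ISpMV), y becomes the next x, so the dense vector must be re-broadcast or exchanged between partitions on every iteration; the paper's hierarchical partitioning of both the matrix and the vector is aimed at containing exactly that communication cost.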
Pub Date: 2016-08-01 | DOI: 10.1109/FSKD.2016.7603536
Xiaohong Zhang, Q. Zhan, Xue-ping Wang
The new notion of a non-commutative hyper residuated lattice is introduced and some of its properties are investigated. Moreover, two definitions of hyper pseudo-BCK algebras are discussed, and the following result is proved: every strong non-commutative hyper residuated lattice induces a weak hyper pseudo-BCK algebra. Finally, some mistakes in the literature are pointed out.
Title: Non-commutative hyper residuated lattices and hyper pseudo-BCK algebras
Pub Date: 2016-08-01 | DOI: 10.1109/FSKD.2016.7603188
Lei Song, Rong-Qiang Zeng, Yang Wang, Ming-Sheng Shang
This paper investigates a multi-objective path relinking algorithm in order to optimize a bi-objective unconstrained binary quadratic programming problem. In this algorithm, we integrate the path relinking techniques into hypervolume-based multi-objective optimization, where we propose a method to generate a path and select a set of non-dominated solutions from the generated path for further improvements. Experimental results show that the proposed algorithm is very effective compared with the original multi-objective optimization algorithms.
Title: Solving bi-objective unconstrained binary quadratic programming problem with multi-objective path relinking algorithm
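The path-generation step can be sketched in a few lines: walk from an initiating binary solution to a guiding one, flipping one differing bit per step, so every intermediate solution is a candidate for the non-dominated set. The bi-objective evaluation and the hypervolume-based selection the paper integrates are omitted here; only the path construction is shown.

```python
def path_relink(start, target):
    """Generate the sequence of binary solutions on a path from start to
    target, flipping one differing bit per step (left to right)."""
    path = [list(start)]
    cur = list(start)
    for i, (a, b) in enumerate(zip(start, target)):
        if a != b:
            cur = list(cur)   # copy so earlier path entries stay intact
            cur[i] = b
            path.append(cur)
    return path
```

The path length is the Hamming distance between the endpoints plus one, so relinking two diverse elite solutions yields many intermediate candidates from which the non-dominated ones can be kept for further improvement.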
Pub Date: 2016-08-01 | DOI: 10.1109/FSKD.2016.7603432
Keyang Cheng, Kaifa Hui, Yongzhao Zhan, M. Qi
This paper presents an improved version of ViBe, a robust and efficient background subtraction algorithm for video sequences, with the aim of accelerating ghost suppression. ViBe offers faster processing and a lighter computational load than comparable algorithms. For the sake of real-time background modeling, it uses only the first frame to build the background model during initialization. However, this introduces ghost areas into the subtraction process, which degrades the background model. We put forward a novel method that accelerates the elimination of ghost areas by detecting and reinitializing the ghost regions. Through a series of contrast experiments, our enhanced algorithm is compared with other improved algorithms that share the aim of suppressing ghosts, and the comparison shows that our method performs better at ghost suppression. Using PCC as the evaluation metric, the experiments show that from the second frame onward the PCC of our algorithm improves distinctly over both the original algorithm and the improved algorithms discussed in this paper, while the per-frame processing time still meets real-time requirements.
Title: A novel improved ViBe algorithm to accelerate the ghost suppression
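For context, the core ViBe per-pixel rules the paper builds on can be sketched as follows. The parameters (match radius 20, 2 required matches, subsampling factor 16) are the defaults commonly cited for ViBe; the paper's contribution, detecting ghost regions and reinitializing their sample sets, is not shown here.

```python
import random

def vibe_pixel_is_background(samples, pixel, radius=20, min_matches=2):
    """ViBe classification rule: a pixel is background if it lies within
    `radius` of at least `min_matches` of its stored background samples."""
    matches = sum(abs(s - pixel) <= radius for s in samples)
    return matches >= min_matches

def vibe_update(samples, pixel, rng, subsample=16):
    """Conservative random update: with probability 1/subsample, overwrite
    a randomly chosen stored sample with the current background value."""
    if rng.randrange(subsample) == 0:
        samples[rng.randrange(len(samples))] = pixel
```

Because the sample sets are seeded entirely from frame one, a moving object present in that frame leaves stale samples behind (the ghost); the slow random update eventually absorbs the true background, and the paper's reinitialization step exists to shortcut that slow absorption.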
To address the low safety period of existing source-location privacy protection algorithms, a random-selection fake-source-based algorithm for protecting source-location privacy is put forward. The algorithm selects the phantom source node by random walk. The intermediate node is selected from the nodes on the shortest phantom-source-to-sink path, using a random number together with hop-count information to the sink, and the fake source node is then selected by the intermediate node through another random walk. Analytical results demonstrate that the algorithm can select an intermediate node during each packet transmission. Simulation results show that the algorithm can extend the safety period and improve the level of source-location privacy against a local adversary.
Title: Random selection false source-based algorithm for protecting source-location privacy in WSNs
Authors: Leqiang Bai, Ling Li, Shiguang Qian, Shihong Zhang
Pub Date: 2016-08-01 | DOI: 10.1109/FSKD.2016.7603499
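The random-walk primitive used twice above (once to place the phantom source, once for the intermediate node to place the fake source) can be sketched generically. The network topology, the neighbor function, and the hop count below are hypothetical; the paper's hop-count-aware selection along the shortest phantom-to-sink path is not reproduced.

```python
import random

def random_walk(start, hops, neighbors, rng):
    """Walk a fixed number of uniformly random hops from `start`; the node
    reached serves as the phantom (or fake) source for this transmission."""
    node = start
    for _ in range(hops):
        node = rng.choice(neighbors(node))
    return node
```

Re-running the walk for every packet means successive packets appear to originate from different places, which is what frustrates a local adversary's hop-by-hop traceback and extends the safety period.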