Exploratory data analysis on temperature data of Indian states from 1800–2013 (Analysis of trends in temperature data for prediction modelling)
Pub Date: 2016-10-01
DOI: 10.1109/NGCT.2016.7877475
A. Agrawal, Dharvi Verma, Shilpa Gupta
Information about climate change is required at the global, regional and basin levels for a variety of purposes, including the study of the impact of greenhouse gases. The analyses in this research concern trends in the temperatures of the Indian states. The research begins with an exposition of the analysis methodologies prevalent in exploratory analysis and prediction modelling on temperature data. It then presents the proposed work, in which an analysis of the means of the average temperatures observed across the Indian states from 1800–2013 is summarized; this analysis is found to yield confounding results. The proposed work concludes with a more focused analysis of geographically similar states, namely those lying on the Indo-Gangetic plain, which reveals encouraging results and shows the occurrence of a trend. The research closes with the future scope, which includes modelling to predict the average temperatures expected over the next few decades, a step that would be significant for observing the consequences of global warming in India.
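The paper itself gives no code; as a rough sketch of the kind of per-state trend analysis the abstract describes, one could fit a least-squares slope to each state's yearly mean temperatures. The file name and column names below are assumptions, not from the paper:

```python
import numpy as np
import pandas as pd

# Hypothetical layout: one row per (state, year) with that year's mean
# temperature. Neither the file name nor the columns come from the paper.
df = pd.read_csv("india_temperatures_1800_2013.csv")  # columns: state, year, avg_temp

# Least-squares slope of mean temperature vs. year, in deg C per year;
# a positive slope indicates a warming trend for that state.
trends = {
    state: np.polyfit(group["year"], group["avg_temp"], deg=1)[0]
    for state, group in df.groupby("state")
}

for state, slope in sorted(trends.items(), key=lambda kv: -kv[1]):
    print(f"{state}: {slope:+.4f} degC/year")
```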
{"title":"Exploratory data analysis on temperature data of Indian states from 1800–2013 (Analysis of trends in temperature data for prediction modelling)","authors":"A. Agrawal, Dharvi Verma, Shilpa Gupta","doi":"10.1109/NGCT.2016.7877475","DOIUrl":"https://doi.org/10.1109/NGCT.2016.7877475","url":null,"abstract":"Information about climate changes is required at global, regional and basin levels for a variety of purposes, including the study of impact of the greenhouse gases. The analyses mentioned in this research relate to the observation of trends in the temperatures of the Indian states. The research begins with the exposition of the ongoing analysis methodologies prevalent in exploratory analysis and prediction modeling on temperature data. It further develops into the proposed work, where the analysis of means of the average temperatures observed across the Indian states from 1800–2013 is summarized, which in turn is found to reveal confounding results. The proposed work concludes with further focused analysis of geographically similar states, namely the states lying on the Indo-Gangetic plains, which reveal encouraging results, thereby showing an occurrence of a trend. The research concludes with the propounding of the future scope, which includes modeling for predicting the average temperatures which can be attained over the next few decades, which in turn would be significant for the observation of the corollaries of global warming in India.","PeriodicalId":326018,"journal":{"name":"2016 2nd International Conference on Next Generation Computing Technologies (NGCT)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134109002","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A novel approach to image steganography using quadtree partition
Pub Date: 2016-10-01
DOI: 10.1109/NGCT.2016.7877396
J. Kumar
A number of approaches have been developed for image steganography to date, but none has used a quadtree partition scheme. The quadtree partition divides the image on the basis of variations among the pixel values, so the fine-grained and coarse-grained areas inside the image can be identified. It is far easier and more effective to hide something in the roadside scrub than in the middle of the road. In the same manner, we can identify the dense areas within the image where the pixel values change frequently and randomly. This can be achieved by quadtree partitioning with different thresholds. These areas are therefore the best places to hide a secret message without it being noticed by the human eye. The paper implements the concept and compares it with other approaches.
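The abstract does not include the partition algorithm itself; the minimal sketch below shows one common way to realize a variance-thresholded quadtree, matching the idea of isolating "busy" regions. The variance criterion, parameter names and the power-of-two assumption are ours:

```python
import numpy as np

def quadtree_embed_regions(img: np.ndarray, threshold: float, min_size: int = 4):
    """Return (y, x, size) blocks with high pixel variance -- the dense,
    rapidly changing regions the paper suggests embedding in. Assumes a
    square grayscale image whose side is a power of two."""
    regions = []

    def split(y: int, x: int, size: int) -> None:
        block = img[y:y + size, x:x + size]
        if block.var() <= threshold:      # smooth area: poor hiding place, stop
            return
        if size <= min_size:              # busy area at the finest scale: keep
            regions.append((y, x, size))
            return
        half = size // 2                  # otherwise split into four quadrants
        for dy in (0, half):
            for dx in (0, half):
                split(y + dy, x + dx, half)

    split(0, 0, img.shape[0])
    return regions
```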
{"title":"A novel approach to image steganography using quadtree partition","authors":"J. Kumar","doi":"10.1109/NGCT.2016.7877396","DOIUrl":"https://doi.org/10.1109/NGCT.2016.7877396","url":null,"abstract":"A number of approaches have been developed for image steganography till date but none have used quadtree partition scheme. The quadtree partition is used to divide the image on the basis of variations among the pixel values and therefore we can identify the fine grained and coarse grained areas inside the image. It is very much comfortable and effective to hide something in the roadside scrub rather than hiding it on the middle of the road. In the same manner we can identify the dense areas within the image where the pixel values are frequently and randomly changing. This can be achieved by quadtree partition with different thresholds. These areas are therefore the best areas to hide any secret message without get noticed by a human eye. The paper implements the concept and compares it with other approaches.","PeriodicalId":326018,"journal":{"name":"2016 2nd International Conference on Next Generation Computing Technologies (NGCT)","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131623277","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A novel prime numbers based hashing technique for minimizing collisions
Pub Date: 2016-10-01
DOI: 10.1109/NGCT.2016.7877471
Rohit K. Bhullar, Lokesh Pawar, Vijay Kumar, Anjali
Searching is a prime operation in computer science, and numerous methods have been devised to make it efficient. Hashing is one such technique, with the objective of limiting the search complexity to O(1), i.e., finding the desired item in one attempt. But achieving O(1) complexity is quite difficult and usually not possible, because there is no perfect mapping function for insertion and searching; this imperfection of the hash function results in collisions. The algorithm presented in this article minimizes the number of collisions by removing the problem of clustering. Clustering occurs when data items congregate in one particular area, increasing the number of collisions and hence the number of probes needed to insert and search for an item. In trial runs, the proposed algorithm showed considerable performance improvements over all major hashing algorithms.
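The abstract does not state the exact prime-based construction, so the sketch below shows a standard prime-flavored anti-clustering technique, double hashing with a prime table size; it illustrates the general idea, not the authors' algorithm:

```python
class PrimeDoubleHashTable:
    """Open addressing with double hashing. A prime capacity and a
    nonzero second hash keep probe sequences well spread, avoiding the
    primary clustering of linear probing. Illustrative only -- not
    necessarily the paper's scheme."""

    def __init__(self, capacity: int = 101):   # capacity must be prime
        self.capacity = capacity
        self.slots = [None] * capacity

    def _probes(self, key):
        h1 = hash(key) % self.capacity
        # Second hash in [1, capacity - 2]; coprime with a prime capacity,
        # so the probe sequence visits every slot.
        h2 = 1 + (hash(key) % (self.capacity - 2))
        for i in range(self.capacity):
            yield (h1 + i * h2) % self.capacity

    def insert(self, key, value) -> None:
        for idx in self._probes(key):
            if self.slots[idx] is None or self.slots[idx][0] == key:
                self.slots[idx] = (key, value)
                return
        raise RuntimeError("hash table is full")

    def search(self, key):
        for idx in self._probes(key):
            if self.slots[idx] is None:
                return None                 # hit an empty slot: key is absent
            if self.slots[idx][0] == key:
                return self.slots[idx][1]
        return None
```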
{"title":"A novel prime numbers based hashing technique for minimizing collisions","authors":"Rohit K. Bhullar, Lokesh Pawar, Vijay Kumar, Anjali","doi":"10.1109/NGCT.2016.7877471","DOIUrl":"https://doi.org/10.1109/NGCT.2016.7877471","url":null,"abstract":"Searching is a prime operation in computer science and numerous methods has been devised to make it efficient. Hashing is one such searching technique with objective of limiting the searching complexity to O (1) i.e. finding the desired item in one attempt. But achieving complexity of O (1) is quite difficult or usually not possible. This happens because there is no perfect mapping function for insertion and searching; and this imperfection of hashing function results in collisions. The algorithm and technique presented in this article minimizes the number of collisions by removing the problem of clustering. Clustering occurs when the data items congregates in one particular area thus increasing the number of collisions and results in increased number of probes to insert and search an item. During trials runs the proposed algorithm have shown considerable improvements over all major hashing algorithms in terms of performance.","PeriodicalId":326018,"journal":{"name":"2016 2nd International Conference on Next Generation Computing Technologies (NGCT)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124083719","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A comparison of dual material double gate JLFET with single material double gate JLFET
Pub Date: 2016-10-01
DOI: 10.1109/NGCT.2016.7877397
Ajay Kumar, A. Chaudhry, Vijay Kumar, Vishal S Sharma
This paper presents a comparative study of the dual-material double-gate junctionless transistor and the single-material double-gate junctionless transistor. A review of the basic modelling of the junctionless transistor is also given. The surface potentials of the two structures are compared. A comparison of the threshold voltages of the two devices shows that the single-material gate has the higher threshold voltage.
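For orientation, a commonly cited first-order expression for the threshold voltage of a symmetric double-gate junctionless device under full depletion is sketched below; it is a textbook approximation, not the paper's model, but it shows why the gate material matters, since the work function enters through the flat-band voltage:

```latex
% First-order full-depletion threshold of a symmetric double-gate
% junctionless transistor (textbook approximation, not the paper's model):
\[
  V_{th} \approx V_{FB}
          - \frac{q\,N_D\,T_{si}^{2}}{8\,\varepsilon_{si}}
          - \frac{q\,N_D\,T_{si}\,t_{ox}}{2\,\varepsilon_{ox}}
\]
% V_FB: flat-band voltage (set by the gate work function, where single-
% vs dual-material gates differ); N_D: channel doping concentration;
% T_si: silicon body thickness; t_ox: gate-oxide thickness.
```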
{"title":"A comparison of dual material double gate JLFET with single material double gate JLFET","authors":"Ajay Kumar, A. Chaudhry, Vijay Kumar, Vishal S Sharma","doi":"10.1109/NGCT.2016.7877397","DOIUrl":"https://doi.org/10.1109/NGCT.2016.7877397","url":null,"abstract":"This paper presents a comparative study of dual material double gate junctionless transistor with the single material double gate junctionless transistor. A review of the basic modelling of the Junctionless transistor is also given in the paper. The surface potential of both structures are compared. The threshold voltage compared for both the devices shows that the single material gate has higher threshold voltage.","PeriodicalId":326018,"journal":{"name":"2016 2nd International Conference on Next Generation Computing Technologies (NGCT)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125954214","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Outage and capacity analysis of cascaded diamond shaped cooperative relay network
Pub Date: 2016-10-01
DOI: 10.1109/NGCT.2016.7877460
D. Kumar, Arun Kumar
This paper presents a performance study of a cascaded diamond network. The route that provides the maximum of the minimum available SNRs over all paths is selected. First, the cumulative distribution function (CDF) of the overall SNR of the considered system is derived. Using this CDF, the outage probability and the coding and diversity gains of the system are obtained. The ergodic capacity of the cascaded diamond network is then found. The considered system is shown to have a diversity order of two. Finally, numerical results are presented to discuss the performance of the system.
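The abstract's max-min route selection admits a compact outage expression; under the usual assumption of independent hop SNRs (the notation below is ours, consistent with the abstract's description):

```latex
% Path k is bottlenecked by its weakest hop; the better branch is kept.
\[
  \gamma_k = \min_i \gamma_{k,i}, \qquad
  F_{\gamma_k}(x) = 1 - \prod_i \bigl(1 - F_{\gamma_{k,i}}(x)\bigr)
\]
\[
  F_{\gamma}(x) = F_{\gamma_1}(x)\,F_{\gamma_2}(x), \qquad
  P_{\mathrm{out}} = F_{\gamma}(\gamma_{\mathrm{th}})
\]
% For Rayleigh-type fading each branch CDF grows linearly near x = 0,
% so their product grows as x^2 -- consistent with the claimed
% diversity order of two.
```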
{"title":"Outage and capacity analysis of cascaded diamond shaped cooperative relay network","authors":"D. Kumar, Arun Kumar","doi":"10.1109/NGCT.2016.7877460","DOIUrl":"https://doi.org/10.1109/NGCT.2016.7877460","url":null,"abstract":"This paper gives the performance study of a cascaded diamond network. A route that provides the maximum of minimum value of available SNRs of all paths is elected. At first, the cumulative distribution function (CDF) of the overall SNR of the considered system is found. By using the derived CDF, the outage probability, coding and diversity gain of the considered system are derived. Later, ergodic capacity of the cascaded diamond network is also found. It is shown that the considered system is having a diversity order of two. Finally, numerical outcomes are posed to discuss the performance of the system.","PeriodicalId":326018,"journal":{"name":"2016 2nd International Conference on Next Generation Computing Technologies (NGCT)","volume":"04 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127450407","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Click analytics: What clicks on webpage indicates?
Pub Date: 2016-10-01
DOI: 10.1109/NGCT.2016.7877485
Kawaljit Kaur, H. Singh
The World Wide Web is expanding day by day in its number of users and websites. How can a website stand out among millions of websites and satisfy its users? This is where website evaluation becomes crucial. Website evaluation methods help designers understand user behavior, improve the design and layout of a website, and improve the website experience for users. In this paper, we use an effective but rarely used website evaluation technique called click analytics. Click analytics records where on a webpage a user clicks and how many times. A click-analytics tool called Crazy Egg was used to record the clicks on the home pages of the www.gndu.ac.in and gndualumni.net websites. The data was analyzed and user behavior was interpreted from the clicks. We found that interpreting click data can be a difficult task that needs guidance and expertise. To make the analyst's task easier and to improve websites effectively, we propose a framework that helps website designers and researchers interpret the meaning of the clicks received on each element of a webpage and take corresponding actions to improve the organization, design and navigability of websites, and eventually their profitability.
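The abstract works with per-element click counts; the toy sketch below shows the kind of aggregation involved. The log format and element names are invented for illustration and do not come from the paper or from Crazy Egg's actual export format:

```python
from collections import Counter

# Invented click log: one (element_id, x, y) record per recorded click.
clicks = [
    ("nav_admissions", 102, 48),
    ("nav_admissions", 110, 52),
    ("banner_logo", 20, 15),
    ("footer_contact", 300, 940),
]

per_element = Counter(element for element, _x, _y in clicks)
total = sum(per_element.values())
for element, count in per_element.most_common():
    print(f"{element}: {count} clicks ({100 * count / total:.0f}% of total)")
```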
{"title":"Click analytics: What clicks on webpage indicates?","authors":"Kawaljit Kaur, H. Singh","doi":"10.1109/NGCT.2016.7877485","DOIUrl":"https://doi.org/10.1109/NGCT.2016.7877485","url":null,"abstract":"World Wide Web is expanding day by day in number of users and websites. How a website could stand out in millions of websites to satisfy its users. This is where the role of website evaluation becomes crucial. Website evaluation methods help website designers to understand behavior of users, to improve the design and layout of the website and are useful to improve the website experience for users. In this paper, we will use an effective but very less often used website evaluation technique called Click Analytics. Click analytics records the clicks such as where a user clicks on the webpage and how many times. A click analytics based tool called Crazy Egg has been used to record the clicks on the home page of www.gndu.ac.in and gndualumni.net website. The data was analyzed and the behavior of the users based on clicks was interpreted. We found that interpretation of click data can be difficult task and it needs guidance and expertise. To make the task of analysts easier and to improve websites effectively, we have proposed a framework. This framework is useful to help the website designers and researchers to interpret the meaning of clicks received on each element of webpage and take respective actions to improve organization, design and navigability of websites and eventually the profitability of website.","PeriodicalId":326018,"journal":{"name":"2016 2nd International Conference on Next Generation Computing Technologies (NGCT)","volume":"94 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130912383","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Storage optimization of video surveillance from CCTV camera
Pub Date: 2016-10-01
DOI: 10.1109/NGCT.2016.7877503
Shikhar Arora, Karan Bhatia, V. Amit
Closed-circuit television (CCTV), or video surveillance, is a widely used technology, mostly in the field of security. CCTV can be found in many places, public and private. One of the most challenging problems in installing CCTV cameras at large scale is the storage space occupied by the footage, which is mostly stored on secondary storage devices such as hard disk drives. Compression techniques are therefore applied to reduce the storage space. The proposed idea is to remove redundant adjacent frames. We propose a method that optimizes the storage space occupied by CCTV footage by deleting redundant frames, comparing adjacent frames of the video clip using the MSE (mean squared error) between them. The approach optimizes storage while maintaining the information and quality of the video clip.
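The frame-dropping rule is stated plainly enough to sketch; the MSE threshold below is an assumption, as the abstract does not fix one:

```python
import cv2
import numpy as np

def mse(a: np.ndarray, b: np.ndarray) -> float:
    """Mean squared error between two same-sized grayscale frames."""
    return float(np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2))

def drop_redundant_frames(path: str, threshold: float = 25.0) -> list:
    """Keep a frame only when it differs enough (MSE above threshold)
    from the last kept frame; near-identical adjacent frames are
    discarded, as the paper proposes. The threshold value is an assumption."""
    cap = cv2.VideoCapture(path)
    kept, last_gray = [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if last_gray is None or mse(gray, last_gray) > threshold:
            kept.append(frame)
            last_gray = gray
    cap.release()
    return kept
```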
{"title":"Storage optimization of video surveillance from CCTV camera","authors":"Shikhar Arora, Karan Bhatia, V. Amit","doi":"10.1109/NGCT.2016.7877503","DOIUrl":"https://doi.org/10.1109/NGCT.2016.7877503","url":null,"abstract":"Closed-circuit television (CCTV) or video surveillance is the most useful technology mostly used in the field of security purposes. CCTVs can be found at many places ranging from public to private places. One of the most challenging problem in installing the CCTV cameras at large scale is storage space occupied by the footage. Footage is mostly stored in secondary storage devices like hard disk drives. So, to reduce the storage space, compression techniques are applied. The proposed idea is to remove the adjacent redundant frames. We are proposing a method to optimize the storage space occupied by the CCTV footage by deleting the redundant frames by comparing the adjacent frames using MSE (Mean Squared Error) between the adjacent frames of the video clip. The approach will optimize the storage maintaining the information as well as quality of the video clip.","PeriodicalId":326018,"journal":{"name":"2016 2nd International Conference on Next Generation Computing Technologies (NGCT)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127860373","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Algorithm for deadline based task scheduling in heterogeneous grid environment
Pub Date: 2016-10-01
DOI: 10.1109/NGCT.2016.7877418
Anubha Chauhan, Smita Singh, Sarita Negi, S. Verma
Grid technology is the technology structure that delivers highly efficient performance in a grid environment. The design of an efficient and reliable task scheduling algorithm is one of the challenging issues in grid computing. A novel improvised deadline scheduling algorithm (IDSA) for efficient job execution under deadline constraints is proposed in this paper. The algorithm is compared with renowned task scheduling algorithms, namely the Prioritized Based Deadline Scheduling Algorithm (PDSA) and Earliest Deadline First (EDF), in terms of average tardiness (AT, the total delay between the deadline and the end time of the tasks divided by the total number of tasks) and the number of non-tardy jobs (the total number of tasks finishing before their deadline). The proposed algorithm achieves 27.28% and 30.0% less AT than EDF and PDSA respectively at 4000 tasks. The results of the proposed IDSA for non-delayed tasks are 2.17% and 1.70% higher than EDF and PDSA respectively at 4000 tasks. This demonstrates the performance of the proposed IDSA against existing scheduling algorithms and shows that IDSA is a more suitable scheduling algorithm for grid computing.
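In the notation below (ours, since the abstract's subscripts are garbled), the average-tardiness metric the abstract defines is:

```latex
% C_i: completion time of task i; d_i: its deadline; n: number of tasks.
\[
  \mathrm{AT} = \frac{1}{n} \sum_{i=1}^{n} \max\bigl(0,\; C_i - d_i\bigr)
\]
% A task is non-tardy when C_i <= d_i; the second metric simply counts
% such tasks.
```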
{"title":"Algorithm for deadline based task scheduling in heterogeneous grid environment","authors":"Anubha Chauhan, Smita Singh, Sarita Negi, S. Verma","doi":"10.1109/NGCT.2016.7877418","DOIUrl":"https://doi.org/10.1109/NGCT.2016.7877418","url":null,"abstract":"Grid technology is the structure of technology that provides highly efficient performance in the grid environment. Design of an efficient and reliable task scheduling algorithm is one of the challenging issues in grid computing. A novel improvised scheduling algorithm (IDSA) with deadline limitation for efficient job execution is proposed in this paper. This algorithm is compared with renowned task scheduling algorithms such as Prioritized Based Deadline Scheduling Algorithm (PDSA) and Earliest Deadline First (EDF) in terms of Average Tardiness ARt (the total delay time between the deadline and end time of the task by the total number of tasks) and Number of non-tardy jobs (total number of tasks finishes before their deadline). The proposed algorithm achieves 27.28 % and 30.0 % less ATi than the EDF and PDSA respectively at 4000 number of tasks. The computational results by proposed IDSA for Non-delayed tasks are 2.17 % and 1.70 % higher than the EDF and PDSA respectively at 4000 number of tasks. This enhances the performance of proposed IDSA compare to existing scheduling algorithms and shows IDSA is more suitable scheduling algorithm for grid computing.","PeriodicalId":326018,"journal":{"name":"2016 2nd International Conference on Next Generation Computing Technologies (NGCT)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123788085","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Spatial compression and reconstruction of digital video stream using morphological filters
Pub Date: 2016-10-01
DOI: 10.1109/NGCT.2016.7877515
S. Aparna, M. Naidu
Data compression plays a vital role in various video processing applications. Video compression is basically applied to a video stream frame by frame. Live video processing is a cumbersome task that usually reduces the video transmission rate, so multimedia data transmission is carried out using suitable hardware modems. The question that arises is whether it is possible to achieve live video transmission in compressed form without loss in data or transmission rate. This paper addresses this problem and proposes a universal coder-decoder procedure called UCODEC. Subsampling of video frames and their morphological processing are shown to be very effective for video compression, transmission and reconstruction.
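As a rough illustration of subsampling plus morphological reconstruction, the sketch below halves each spatial dimension and rebuilds the frame with a grayscale closing; the choice of filter and structuring element is our assumption, since the abstract does not specify UCODEC's internals:

```python
import numpy as np
from scipy import ndimage

def compress(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Spatial compression by subsampling: keep every factor-th pixel."""
    return frame[::factor, ::factor]

def reconstruct(small: np.ndarray, factor: int = 2) -> np.ndarray:
    """Upsample by pixel replication, then smooth the blockiness with a
    morphological closing (dilation followed by erosion). The 3x3
    structuring element is an assumption, not from the paper."""
    upsampled = np.kron(small, np.ones((factor, factor), dtype=small.dtype))
    return ndimage.grey_closing(upsampled, size=(3, 3))
```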
{"title":"Spatial compression and reconstruction of digital video stream using morphological filters","authors":"S. Aparna, M. Naidu","doi":"10.1109/NGCT.2016.7877515","DOIUrl":"https://doi.org/10.1109/NGCT.2016.7877515","url":null,"abstract":"Data compression plays a vital role in various video processing applications. Basically video compression is implemented in a video stream frame by frame. Live video processing is a cumbersome task which usually reduces the video transmission rate. So, multimedia data transmission is carried out using suitable hardware modems. The question that arises here is that whether it is possible to achieve live video transmission in a compressed mode without causing loss in data and transmission rate. This paper addresses this problem and proposes a universal coder decoder procedure called UCODEC. Subsampling of video frames and their morphological processing are shown to be very effective for video compression, transmission and reconstruction.","PeriodicalId":326018,"journal":{"name":"2016 2nd International Conference on Next Generation Computing Technologies (NGCT)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125209117","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Dynamic call admission control for QoS provision in mobile multimedia networks using artificial neural networks
Pub Date: 2016-10-01
DOI: 10.1109/NGCT.2016.7877441
Anand Pandey, Sanjeev Kumar, Krishan Kumar
In mobile multimedia communication systems, limited bandwidth is an issue of serious concern. For better utilization of the available resources in a network, the channel allocation scheme plays a very important role in managing the resources available in each cell; this issue must therefore be managed to reduce the call blocking and dropping probabilities. In this paper we propose a new dynamic channel allocation scheme based on handoff calls and traffic mobility using a Hopfield neural network, which improves the capacity of the existing system. The Hopfield method develops a new energy function that allocates channels not only for new calls but also for handoff calls on the basis of traffic-mobility information. Moreover, we examine the performance of traffic mobility with the help of an error back-propagation neural network model to enhance the overall quality of service (QoS) in terms of continuous service availability and inter-cell handoff calls. Our scheme decreases the handoff dropping and call blocking probabilities considerably compared with existing fixed and dynamic channel allocation systems.
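The paper's energy function is not given in the abstract; for orientation, a generic Hopfield energy for channel assignment looks like the following, with the understanding that the authors add mobility- and handoff-specific terms on top:

```latex
% x_{ik} = 1 if cell i uses channel k, else 0. The first term penalizes
% co-channel use in interfering neighbor cells N(i); the second drives
% each cell toward its channel demand D_i. A, B are weighting constants.
\[
  E = \frac{A}{2} \sum_{i} \sum_{j \in N(i)} \sum_{k} x_{ik}\, x_{jk}
    + \frac{B}{2} \sum_{i} \Bigl( \sum_{k} x_{ik} - D_i \Bigr)^{2}
\]
```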
{"title":"Dynamic call admission control for QoS provision in mobile multimedia networks using artificial neural networks","authors":"Anand Pandey, Sanjeev Kumar, Krishan Kumar","doi":"10.1109/NGCT.2016.7877441","DOIUrl":"https://doi.org/10.1109/NGCT.2016.7877441","url":null,"abstract":"In mobile multimedia communication systems, the limited bandwidth is an issue of serious concern. However for the better utilization of available resources in a network, channel allocation scheme plays a very important role to manage the available resources in each cell, hence this issue should be managed to reduce the call blocking or dropping probabilities. In this paper we have proposed the new dynamic channel allocation scheme based on handoff calls and traffic mobility using Hopfield neural network. It will improve the capacity of existing system. Hopfield method develops the new energy function that allocates channel not only for new call but also for handoff calls on the basis of traffic mobility information. Moreover, we have also examined the performance of traffic mobility with the help of error back propagation neural network model to enhance the overall quality of services (QoS) in terms of continuous service availability and intercell handoff calls. Our scheme decreases the call handoff dropping and blocking probability up to a better extent as compared to the other existing systems of fixed channel allocation and dynamic channel allocation.","PeriodicalId":326018,"journal":{"name":"2016 2nd International Conference on Next Generation Computing Technologies (NGCT)","volume":"124 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122486690","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}