Autonomous flight drone for infrastructure (transmission line) inspection (3)
M. Morita, Hironobu Kinjo, Shido Sato
Pub Date: 2017-11-01 DOI: 10.1109/ICIIBMS.2017.8279692
Drones are currently used for a wide range of applications, such as the i-Construction initiative and public surveys [1] conducted by the Ministry of Land, Infrastructure, Transport and Tourism, as well as surveillance and crop-dusting operations [2]. For these purposes, drones are either controlled by a pilot via FPV (first-person view) or fly autonomously by estimating their own position using GPS. GPS data, however, can become faulty under bridges, inside tunnels, or near high-voltage power lines [3], which can lead to drone flight errors. To address this issue, we are developing a system for conducting infrastructure inspections using drones that rely on GPS for autonomous flight control but can also estimate their own position through image processing when GPS is unavailable in situations such as those mentioned above.
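The abstract gives no implementation details for the GPS-to-vision handover; a minimal Python sketch of the switching logic it describes might look as follows, where `gps_position`, `visual_position`, and the HDOP threshold are illustrative assumptions rather than the authors' design.

```python
# Minimal sketch of the GPS/vision fallback described above. The
# function names and the HDOP threshold are illustrative
# assumptions, not the authors' implementation.

def gps_position(fix):
    """Read (x, y, z) from a GPS fix (placeholder)."""
    return fix["x"], fix["y"], fix["z"]

def visual_position(frame, last_position):
    """Estimate position by image processing (placeholder; see the
    ground-image matching sketch later in this section)."""
    return last_position

def estimate_position(fix, frame, last_position, max_hdop=2.0):
    # Trust GPS while a fix exists and its horizontal dilution of
    # precision is acceptable; under bridges, in tunnels, or near
    # high-voltage lines the fix degrades and vision takes over.
    if fix is not None and fix.get("hdop", float("inf")) <= max_hdop:
        return gps_position(fix)
    return visual_position(frame, last_position)
```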
{"title":"Autonomous flight drone for infrastructure (transmission line) inspection (3)","authors":"M. Morita, Hironobu Kinjo, Shido Sato","doi":"10.1109/ICIIBMS.2017.8279692","DOIUrl":"https://doi.org/10.1109/ICIIBMS.2017.8279692","url":null,"abstract":"Drones are currently being used for a wide range of applications, such as for the i-Construction initiative and public surveys [1] conducted by the Ministry of Land, Infrastructure, Transport and Tourism, and for surveillance and for crop-dusting operations [2]. For these purposes, drones are controlled either by a pilot through an FPV, or drones fly autonomously by estimating self-position using GPS. GPS data, however, can become faulty under bridges, inside tunnels, or near high-voltage power lines [3], which could lead to drone flight errors. To address this issue, we are developing a system for conducting infrastructure inspections using drones that basically use GPS for autonomous flight control, but can also estimate self-position through image processing when GPS cannot be used under situations such as those mentioned above.","PeriodicalId":122969,"journal":{"name":"2017 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS)","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132038864","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Emergence of robust cooperative states by iterative internalizations of opponents' personalized values in minority game
Takashi Sato
Pub Date: 2017-11-01 DOI: 10.1109/ICIIBMS.2017.8279742
Adolescence is a period in which individuals begin facing challenging choices. Through these choices and social interactions with peers, adolescents develop the “personalized values” that form the foundations of their actions. It is known that adolescent brains have high plasticity and change dynamically, unlike adult brains. This study discusses the type of behavior that emerges from adolescents, as well as from adult individuals with elevated plasticity. To investigate this, we adopt the minority game (MG), a task built around the kind of choice described above. We implement Elman nets with different learning rates, expressing different degrees of plasticity, as the player models in the MG. Our simulation results show that robust cooperative states can emerge when players that form a network of reference relationships iteratively internalize their opponents' personalized values, irrespective of their varying degrees of plasticity.
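The paper's players are Elman networks; as those details are not given here, the following minimal sketch illustrates only the minority game itself, with simple probability-matching agents standing in for the network players (player count, rounds, and learning rate are assumptions).

```python
# Minimal minority game (MG) sketch. Players here are simple
# probability-matching agents, standing in for the Elman-net
# players of the paper; the game logic is the standard MG.
import random

N_PLAYERS = 11          # odd, so a strict minority always exists
N_ROUNDS = 1000
LEARNING_RATE = 0.05    # crude stand-in for network plasticity

# Each player's probability of choosing side 1.
p = [0.5] * N_PLAYERS
wins = [0] * N_PLAYERS

for _ in range(N_ROUNDS):
    choices = [1 if random.random() < p[i] else 0 for i in range(N_PLAYERS)]
    # The minority side is the one chosen by fewer players.
    minority = 1 if sum(choices) < N_PLAYERS / 2 else 0
    for i, c in enumerate(choices):
        if c == minority:
            wins[i] += 1
        # Nudge each player's preference toward the winning side.
        p[i] += LEARNING_RATE * (float(minority) - p[i])

print("win rates:", [round(w / N_ROUNDS, 2) for w in wins])
```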
{"title":"Emergence of robust cooperative states by iterative internalizations of opponents' personalized values in minority game","authors":"Takashi Sato","doi":"10.1109/ICIIBMS.2017.8279742","DOIUrl":"https://doi.org/10.1109/ICIIBMS.2017.8279742","url":null,"abstract":"Adolescence is a period in which individuals begin facing some challenging choices. Through these choices, and social interactions with peers, adolescent individuals develop their “personalized values” that are the foundations of their actions. It is known that the adolescent brains have high plasticity, and that the adolescent brains change dynamically, other than that of an adult brains. This study discusses the type of behavior that emerges from adolescents, as well as adult individuals with elevated plasticities. To realize this, we adopt the minority game (MG), which is one of the task concerning the choice described above. We implement Elman-nets with different learning rates that express different degrees of the plasticity as the player models in the MG. Our simulation results showed that it is a possibility that robust cooperative states emerge by iteratively internalizing the opponent players' personalized values among players that have a network of reference relationship, irrespective of their varying degrees of plasticity.","PeriodicalId":122969,"journal":{"name":"2017 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS)","volume":"81 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133233150","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Artificial neural network based nuclei segmentation on cytology pleural effusion images
Khin Yadanar Win, S. Choomchuay, K. Hamamoto, Manasanan Raveesunthornkiat
Pub Date: 2017-11-01 DOI: 10.1109/ICIIBMS.2017.8279748
Automated segmentation of cell nuclei is a crucial step toward computer-aided diagnosis systems because the morphological features of cell nuclei are highly associated with cell abnormality and disease. This paper presents the four main stages required for automatic segmentation of cell nuclei in cytology pleural effusion images. First, the image is preprocessed to enhance its quality by applying contrast-limited adaptive histogram equalization (CLAHE). The segmentation itself relies on supervised artificial neural network (ANN) based pixel classification. The boundaries of the extracted nucleus regions are then refined using morphological operations. Finally, overlapping or touching nuclei are identified and split using the marker-controlled watershed method. The proposed method is evaluated on a local dataset of 35 cytology pleural effusion images, achieving a precision of 0.95, recall of 0.86, F-measure of 0.90, and Dice similarity coefficient of 0.92. The entire algorithm took an average of 15 minutes per image. To our knowledge, this is the first attempt to use an ANN for segmentation of cytology pleural effusion images.
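A minimal sketch of the four-stage pipeline follows. Since the trained ANN is not described here, stage 2 substitutes an Otsu threshold as a placeholder for the pixel classifier; kernel sizes and distances are illustrative.

```python
# Sketch of the four-stage pipeline: CLAHE -> pixel classification
# -> morphological refinement -> marker-controlled watershed.
# The paper's trained ANN pixel classifier is replaced here by an
# Otsu threshold; all parameters are illustrative assumptions.
import cv2
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def segment_nuclei(bgr):
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)

    # Stage 1: contrast enhancement with CLAHE.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(gray)

    # Stage 2: pixel classification (placeholder for the ANN);
    # nuclei are assumed darker than the background, hence INV.
    _, mask = cv2.threshold(enhanced, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

    # Stage 3: morphological refinement of the nucleus boundaries.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

    # Stage 4: split touching nuclei with a marker-controlled
    # watershed on the distance transform.
    nuclei = mask > 0
    distance = ndi.distance_transform_edt(nuclei)
    coords = peak_local_max(distance, min_distance=10, labels=nuclei)
    peaks = np.zeros(distance.shape, dtype=bool)
    peaks[tuple(coords.T)] = True
    markers, _ = ndi.label(peaks)
    return watershed(-distance, markers, mask=nuclei)
```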
{"title":"Artificial neural network based nuclei segmentation on cytology pleural effusion images","authors":"Khin Yadanar Win, S. Choomchuay, K. Hamamoto, Manasanan Raveesunthornkiat","doi":"10.1109/ICIIBMS.2017.8279748","DOIUrl":"https://doi.org/10.1109/ICIIBMS.2017.8279748","url":null,"abstract":"Automated segmentation of cell nuclei is the crucial step towards computer-aided diagnosis system because the morphological features of the cell nuclei are highly associated with the cell abnormality and disease. This paper contributes four main stages required for automatic segmentation of the cell nuclei on cytology pleural effusion images. Initially, the image is preprocessed to enhance the image quality by applying contrast limited adaptive histogram equalization (CLAHE). The segmentation process is relied on a supervised Artificial Neural network (ANN) based pixel classification. Then, the boundaries of the extracted cell nuclei regions are refined by utilizing the morphological operation. Finally, the overlapped or touched nuclei are identified and split by using the marker-controlled watershed method. The proposed method is evaluated with the local dataset containing 35 cytology pleural effusion images. It achieves the performance of 0.95%, 0.86 %, 0.90% and 92% in precision, recall, F-measure and Dice Similarity Coefficient respectively. The average computational time for the entire algorithm took 15 mins per image. To our knowledge, this is the first attempt that utilizes ANN as the segmentation on cytology pleural effusion images.","PeriodicalId":122969,"journal":{"name":"2017 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS)","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134246249","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

An attempt to control a 3D object in medical training system using leap motion
Mohamed Atef Seif, Ryosuke Umeda, H. Higa
Pub Date: 2017-11-01 DOI: 10.1109/ICIIBMS.2017.8279705
This paper presents a user interface for manipulating a 3D (three-dimensional) object, converted from CT data in DICOM format, with a Leap Motion device; it can be used as a medical training system for medical students and interns. The converted data can be controlled in a 3D development environment such as Unity. The system consists of a Leap Motion device, a desktop computer, and the display software environment. The experimental results show that the rendered object can be controlled as desired.
{"title":"An attempt to control a 3D object in medical training system using leap motion","authors":"Mohamed Atef Seif, Ryosuke Umeda, H. Higa","doi":"10.1109/ICIIBMS.2017.8279705","DOIUrl":"https://doi.org/10.1109/ICIIBMS.2017.8279705","url":null,"abstract":"This paper presents a user interface of 3D (three-dimensional) object converted from CT data in DICOM format using Leap Motion device that can be used as a medical training system for medical students and interns. The formed data can be controlled in a 3D development environment such as Unity. The system consists of Leap Motion device, desktop computer and the displaying software environment. The experimental results show that we can have a desirable control over the rendered object.","PeriodicalId":122969,"journal":{"name":"2017 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS)","volume":"62 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126140261","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Infrastructure (transmission line) check autonomous flight drone (1)
Hironobu Kinjo, M. Morita, Shido Sato
Pub Date: 2017-11-01 DOI: 10.1109/ICIIBMS.2017.8279694
GPS data can become faulty under bridges, inside tunnels, or near high-voltage power lines, which can lead to drone flight errors. To address this issue, we are developing a system for conducting infrastructure inspections using drones that rely on GPS for autonomous flight control but can also estimate their own position through image processing when GPS is unavailable in such situations. This paper describes a method for self-position estimation using ground images, with the aim of developing an autonomous drone flight system for use in infrastructure inspection.
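The abstract does not specify the image-processing method; one plausible realization of self-position estimation from ground images is to match the downward-facing camera frame against a georeferenced reference image. A minimal ORB feature-matching sketch, with all parameters illustrative:

```python
# Minimal sketch of self-position estimation from ground images:
# match the drone's downward camera frame against a georeferenced
# reference image with ORB features. This is one plausible
# realization of the abstract's description, not the authors' code.
import cv2
import numpy as np

def locate_in_reference(frame_gray, reference_gray):
    orb = cv2.ORB_create(nfeatures=2000)
    kp_f, des_f = orb.detectAndCompute(frame_gray, None)
    kp_r, des_r = orb.detectAndCompute(reference_gray, None)
    if des_f is None or des_r is None:
        return None

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_f, des_r), key=lambda m: m.distance)
    if len(matches) < 8:
        return None

    src = np.float32([kp_f[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_r[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    # Project the frame centre into reference-image (map) coordinates.
    h, w = frame_gray.shape
    centre = np.float32([[[w / 2, h / 2]]])
    return cv2.perspectiveTransform(centre, H)[0, 0]  # (x, y) in pixels
```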
{"title":"Infrastructure (transmission line) check autonomous flight drone (1)","authors":"Hironobu Kinjo, M. Morita, Shido Sato","doi":"10.1109/ICIIBMS.2017.8279694","DOIUrl":"https://doi.org/10.1109/ICIIBMS.2017.8279694","url":null,"abstract":"GPS data can become faulty under bridges, inside tunnels, or near high-voltage power lines, which could lead to drone flight errors. To address this issue, we are developing a system for conducting infrastructure inspections using drones that basically use GPS for autonomous flight control, but can also estimate self-position through image processing when GPS cannot be used under situations such as those mentioned above. This paper describes a method for estimation of self-position using ground images, with the aim of developing an autonomous drone flight system for use in infrastructure inspection.","PeriodicalId":122969,"journal":{"name":"2017 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125098838","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

2-P imaging of mouse visual cortex layer 6 corticothalamic feedback during different behavior states
S. Augustinaite, B. Kuhn
Pub Date: 2017-11-01 DOI: 10.1109/ICIIBMS.2017.8279756
Layer 6 (L6), the deepest lamina of the cerebral cortex, is one of the key structures regulating behavior-state-related information processing within the cortex and various subcortical areas. However, very little is known about the functional significance of the different L6 circuits in vivo. L6 experiments in behaving animals remain challenging because of the difficulty of accessing the recording/imaging site, the complexity of the neuronal circuits, and the heterogeneity of neuron morphology. Here, we focus on the feedback projections from primary visual cortex L6 to the visual thalamus (lateral geniculate nucleus), which regulate visual signal transmission from the retina to the cortex. We developed a method to “dissect” and study these L6 corticothalamic feedback projections in vivo with two-photon (2P) microscopy. With this method, we can reliably image retrogradely labeled corticothalamic and other excitatory L6 neurons throughout the full layer, down to about 850 μm below the dura, in a head-fixed mouse. Up to a few hundred individual neurons can be recorded simultaneously for several hours and/or repeatedly across different days, while the mouse's behavior state is monitored. This allows us to study the cortical feedback to the primary visual thalamus during different behavior states, ranging from full alertness to sleep.
{"title":"2-P imaging of mouse visual cortex layer 6 corticothalamic feedback during different behavior states","authors":"S. Augustinaite, B. Kuhn","doi":"10.1109/ICIIBMS.2017.8279756","DOIUrl":"https://doi.org/10.1109/ICIIBMS.2017.8279756","url":null,"abstract":"Layer 6 (L6), the deepest lamina of cerebral cortex, is one of the key structures regulating behavior state related information processing within the cortex and various subcortical areas. However, very little is known about the functional significance of different L6 circuits in vivo. L6 experiments in the behaving animals still remain challenging due to hard access of the recording / imaging site, complexity of neuronal circuits and heterogeneity in neuron morphology. Here, we focus on primary visual cortex L6 feedback projections to visual thalamus (lateral geniculate nucleus) which regulate visual signal transmission from retina to cortex. We developed a method to “dissect” and study these L6 corticothalamic feedback projections in vivo with 2P microscopy. With this method, we can reliably image retrogradely labeled corticothalamic and other excitatory L6 neurons throughout full layer, down to about 850 μm below dura in a head-fixed mouse. Up to a few hundred individual neurons can be recorded simultaneously for several hours and/or repeatedly recorded during different days, while monitoring mouse behavior state. This allows us to study the cortical feedback to the primary visual thalamus during different behavior states, ranging from full alertness to sleep.","PeriodicalId":122969,"journal":{"name":"2017 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS)","volume":"101 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114818297","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Behavior analysis of a small animal using IoT sensor system
Asuka Noda, O. Fukuda, H. Okumura, K. Arai
Pub Date: 2017-11-01 DOI: 10.1109/ICIIBMS.2017.8279686
The purpose of this paper is to monitor and analyze the behavior of small animals, which has so far received little attention in research. Using IoT sensors and a monitoring camera, the user can observe the animals' behavior. The system can notify the user of IoT sensor events through Twitter and an e-mail service, and the animals' states are recorded in a Google spreadsheet, so that their behavior can be analyzed from the records. Experiments with a Djungarian hamster were conducted to verify the validity of the system. The results revealed that the animal's behavior changed depending on its surrounding environment. Using a Bayesian network, the environment and time period were successfully estimated from the IoT sensor data alone.
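The structure of the paper's Bayesian network is not given here; the following minimal sketch uses a naive-Bayes model, a simple special case, over discretized sensor readings. The variables, states, and example data are illustrative assumptions.

```python
# Minimal sketch of inferring the environment class from discretized
# IoT sensor readings with a naive-Bayes model (a simple special
# case of a Bayesian network). Variables, states, and example
# records below are illustrative assumptions, not the paper's data.
import math
from collections import Counter, defaultdict

def fit(records):
    """records: list of (environment, {sensor: state}) pairs."""
    prior = Counter(env for env, _ in records)
    cond = defaultdict(Counter)  # (env, sensor) -> Counter of states
    for env, sensors in records:
        for sensor, state in sensors.items():
            cond[(env, sensor)][state] += 1
    return prior, cond

def predict(prior, cond, sensors, alpha=1.0):
    """Return the most probable environment given sensor states."""
    total = sum(prior.values())
    best, best_score = None, float("-inf")
    for env, n_env in prior.items():
        score = math.log(n_env / total)
        for sensor, state in sensors.items():
            counts = cond[(env, sensor)]
            # Laplace smoothing over observed states plus one unseen.
            n_states = len(counts) + 1
            score += math.log((counts[state] + alpha) /
                              (sum(counts.values()) + alpha * n_states))
        if score > best_score:
            best, best_score = env, score
    return best

data = [("dark", {"light": "low", "motion": "high"}),
        ("dark", {"light": "low", "motion": "low"}),
        ("bright", {"light": "high", "motion": "low"})]
prior, cond = fit(data)
print(predict(prior, cond, {"light": "low", "motion": "high"}))  # "dark"
```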
{"title":"Behavior analysis of a small animal using IoT sensor system","authors":"Asuka Noda, O. Fukuda, H. Okumura, K. Arai","doi":"10.1109/ICIIBMS.2017.8279686","DOIUrl":"https://doi.org/10.1109/ICIIBMS.2017.8279686","url":null,"abstract":"The purpose of this paper is to monitor and analyze the behavior of small animals, which has been still poorly by many research. By using IoT sensors and a monitoring camera, the user can observe the behavior of small animals. The system can inform the user of IoT sensor events through Twitter and e-mail service. Also, their states are recorded on a Google spreadsheet. Then, we can analyze the behavior based on the records. The experiments were conducted to verify validity of the system with a djungarian hamster. The experimental results revealed that the behavior of small animals were changed depending on environment around them. By using the Bayesian network, the environment and time period were successfully estimated only based on the IoT sensor data.","PeriodicalId":122969,"journal":{"name":"2017 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114969700","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Development of endoscopic system based on fluorescence imaging for detection of colon cancer
Kiri Lee, Jiseop Kim, Byungjun Park, H. Bang, Byungyeon Kim, Youngjae Won, Seungrag Lee
Pub Date: 2017-11-01 DOI: 10.1109/ICIIBMS.2017.8279732
Molecular endoscopic fluorescence imaging can improve the noninvasive detection of lesions in the colon. We developed a fluorescence endoscopic imaging system and obtained fluorescence images of ex vivo mouse colon tissue with the designed image acquisition system.
{"title":"Development of endoscopic system based on fluorescence imaging for detection of colon cancer","authors":"Kiri Lee, Jiseop Kim, Byungjun Park, H. Bang, Byungyeon Kim, Youngjae Won, Seungrag Lee","doi":"10.1109/ICIIBMS.2017.8279732","DOIUrl":"https://doi.org/10.1109/ICIIBMS.2017.8279732","url":null,"abstract":"Molecular endoscopic fluorescence imaging can improve noninvasive detection of lesions in the colon. We developed fluorescence endoscopic imaging system. We obtained fluorescence image of ex vivo mouse colon tissue from designed image acquisition system.","PeriodicalId":122969,"journal":{"name":"2017 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117244581","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Classification of P300 in EEG signals for disable subjects using singular spectrum analysis
H. Tjandrasa, S. Djanali, F. X. Arunanto
Pub Date: 2017-11-01 DOI: 10.1109/ICIIBMS.2017.8279747
Brain-computer interfaces have enabled severely disabled users to communicate with their environments. One method is to use a controlled stimulus to elicit the P300 event-related potential. EEG signals recorded from four disabled subjects during repeated stimuli were processed with a Butterworth bandpass filter and singular spectrum analysis, normalized, separated into two groups of target and non-target trial data, and averaged over every five trials in each group before being classified with a neural network. The purpose of averaging every five target and non-target trials was to bring out the P300 component of the event-related potentials so that the target trials could be differentiated from the non-target trials. Further processing, selecting one of every five processed non-target trials, increased sensitivity by 10.9%, showing that the number of false negatives on target trials was reduced. The classification achieved a maximum accuracy of 92.5%; the average sensitivity, specificity, and accuracy were 70.8%, 89.8%, and 84.6%, respectively.
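A minimal sketch of the described signal chain (Butterworth bandpass, singular spectrum analysis, five-trial averaging) follows; the sampling rate, band edges, SSA window, and rank are illustrative assumptions, not the paper's settings.

```python
# Sketch of the preprocessing chain: Butterworth bandpass, singular
# spectrum analysis (SSA) denoising, and averaging of every 5
# trials. All numeric settings are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(x, fs, lo=0.5, hi=15.0, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def ssa_denoise(x, window=20, rank=3):
    """Keep the leading `rank` SSA components of 1-D signal x."""
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix of lagged windows, shape (window, k).
    traj = np.column_stack([x[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    approx = (u[:, :rank] * s[:rank]) @ vt[:rank]
    # Diagonal averaging back to a 1-D series.
    out = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        out[j:j + window] += approx[:, j]
        counts[j:j + window] += 1
    return out / counts

def average_in_fives(trials):
    """trials: (n_trials, n_samples); average consecutive groups of 5."""
    n = (len(trials) // 5) * 5
    return trials[:n].reshape(-1, 5, trials.shape[1]).mean(axis=1)

fs = 256
trials = np.random.randn(40, fs)  # toy stand-in for EEG epochs
proc = np.array([ssa_denoise(bandpass(t, fs)) for t in trials])
proc = (proc - proc.mean(axis=1, keepdims=True)) / proc.std(axis=1, keepdims=True)
averaged = average_in_fives(proc)  # inputs to the classifier
```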
{"title":"Classification of P300 in EEG signals for disable subjects using singular spectrum analysis","authors":"H. Tjandrasa, S. Djanali, F. X. Arunanto","doi":"10.1109/ICIIBMS.2017.8279747","DOIUrl":"https://doi.org/10.1109/ICIIBMS.2017.8279747","url":null,"abstract":"Brain-computer interfaces have been enabled severely disabled users to communicate with their environments. One method is to use a controlled stimulus to elicit the P300 event-related potential. EEG signals during the repeated stimuli were recorded from four disabled subjects and processed with a Butterworth bandpass filter and Singular Spectrum Analysis, normalized, separated into 2 groups of the target and non-target trial data, and averaged for every 5 trials for each group before classified using a neural network. The purpose of averaging every five target and non-target trials was to emerge the P300 component of even-related potentials so that the target trials could be differentiated from the non-target trials. Further processing by selecting 1 of every 5 processed non-target trials increased the value of sensitivity by 10.9%, it showed that the number of false negatives of target trials was reduced. The results of the classification gave the maximum accuracy of 92.5%. The average values of sensitivity, specificity, and accuracy were 70.8%, 89,8%, and 84.6% respectively.","PeriodicalId":122969,"journal":{"name":"2017 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS)","volume":"380 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122858040","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Detection and identification of animal emotionality-exposing stress incubation in Mice
B. Kuhn, Ray X. Lee, G. Stephens
Pub Date: 2017-11-01 DOI: 10.1109/ICIIBMS.2017.8279754
Despite the long history and wide use of standard behavioral tests to measure emotion in laboratory animals, the approach and its logic have been heavily criticized. Here we address the fundamental logical and quantitative problems with a proof-of-principle testing approach based on fine-scale behavioral analysis. The reported approach detected informative behavioral details and further demonstrated stress incubation after acute psychological trauma in mice, whereas standard analyses gave inconclusive results. This approach provides a technical advance that allows a wide range of potential measurements to be explored using standard behavioral tests, and a more solid basis for conclusions about animal emotionality.
{"title":"Detection and identification of animal emotionality-exposing stress incubation in Mice","authors":"B. Kuhn, Ray X. Lee, G. Stephens","doi":"10.1109/ICIIBMS.2017.8279754","DOIUrl":"https://doi.org/10.1109/ICIIBMS.2017.8279754","url":null,"abstract":"Despite the long history and wide use of standard behavioral tests to measure emotion in laboratory animals, the approach and logic have been heavily criticized. Here we solved the fundamental logical and quantitative problems by a proof-of-principle testing approach with fine-scale behavioral analysis. The reported approach was able to detect informative behavioral details and further prove stress incubation after acute psychological trauma in mice. Standard analyses, in contrast, gave inconclusive results. This approach provides a technical advance allowing exploration of a wide range of potential measurements using standard behavioral tests, and a more solid basis supporting the concluded animal emotionality.","PeriodicalId":122969,"journal":{"name":"2017 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128444806","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}