Tensor-Based Multi-Modality Feature Selection and Regression for Alzheimer's Disease Diagnosis
Pub Date : 2022-10-01 | DOI: 10.5121/csit.2022.121812 (preprint DOI: 10.48550/arXiv.2209.11372)
Computer Science & Information Technology 12(18): 123-134 | Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9985071/pdf/nihms-1874354.pdf
Jun Yu, Zhaoming Kong, Liang Zhan, Li Shen, Lifang He
The assessment of Alzheimer's Disease (AD) and Mild Cognitive Impairment (MCI) associated with brain changes remains a challenging task. Recent studies have demonstrated that combining multi-modality imaging techniques can better reflect pathological characteristics and contribute to more accurate diagnosis of AD and MCI. In this paper, we propose a novel tensor-based multi-modality feature selection and regression method for the diagnosis and biomarker identification of AD and MCI against normal controls. Specifically, we leverage the tensor structure to exploit high-level correlation information inherent in the multi-modality data, and investigate tensor-level sparsity in the multilinear regression model. We present the practical advantages of our method for the analysis of ADNI data using three imaging modalities (VBM-MRI, FDG-PET and AV45-PET) together with clinical parameters of disease severity and cognitive scores. The experimental results demonstrate the superior performance of our proposed method over the state of the art for disease diagnosis and for the identification of disease-specific regions and modality-related differences. The code for this work is publicly available at https://github.com/junfish/BIOS22.
{"title":"Tensor-Based Multi-Modality Feature Selection and Regression for Alzheimer's Disease Diagnosis.","authors":"Jun Yu, Zhaoming Kong, Liang Zhan, Li Shen, Lifang He","doi":"10.5121/csit.2022.121812","DOIUrl":"10.5121/csit.2022.121812","url":null,"abstract":"<p><p>The assessment of Alzheimer's Disease (AD) and Mild Cognitive Impairment (MCI) associated with brain changes remains a challenging task. Recent studies have demonstrated that combination of multi-modality imaging techniques can better reflect pathological characteristics and contribute to more accurate diagnosis of AD and MCI. In this paper, we propose a novel tensor-based multi-modality feature selection and regression method for diagnosis and biomarker identification of AD and MCI from normal controls. Specifically, we leverage the tensor structure to exploit high-level correlation information inherent in the multi-modality data, and investigate tensor-level sparsity in the multilinear regression model. We present the practical advantages of our method for the analysis of ADNI data using three imaging modalities (VBM-MRI, FDG-PET and AV45-PET) with clinical parameters of disease severity and cognitive scores. The experimental results demonstrate the superior performance of our proposed method against the state-of-the-art for the disease diagnosis and the identification of disease-specific regions and modality-related differences. The code for this work is publicly available at https://github.com/junfish/BIOS22.</p>","PeriodicalId":72673,"journal":{"name":"Computer science & information technology","volume":"12 18","pages":"123-134"},"PeriodicalIF":0.0,"publicationDate":"2022-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9985071/pdf/nihms-1874354.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9708291","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Model-Based Systems Engineering Approach with SysML for an Automatic Flight Control System
Pub Date : 2021-07-24 | DOI: 10.5121/CSIT.2021.111101
Haluk Altay, M. F. Solmazgül
Systems engineering is the most important branch of engineering in interdisciplinary studies. Successfully realizing a multidisciplinary complex system is one of the most challenging tasks of systems engineering. Multidisciplinary work brings problems such as defining complex systems, ensuring communication between stakeholders, and establishing a common language among different design teams. For such problems, the traditional systems engineering approach cannot provide an efficient solution. In this paper, a model-based systems engineering approach is applied in a case study and is found to be more efficient. In the case study, the design of a helicopter automatic flight control system was carried out by applying model-based design processes with tool integration. Requirement management, system architecture management and model-based systems engineering processes are explained and applied in the case study. Finally, the model-based systems engineering approach is shown to be more effective than traditional systems engineering methods for complex systems in the aviation and defence industries.
{"title":"Model-Based Systems Engineering Approach with SysML for an Automatic Flight Control System","authors":"Haluk Altay, M. F. Solmazgül","doi":"10.5121/CSIT.2021.111101","DOIUrl":"https://doi.org/10.5121/CSIT.2021.111101","url":null,"abstract":"Systems engineering is the most important branch of engineering in interdisciplinary study. Successfully performing a multidisciplinary complex system is one of the most challenging tasks of systems engineering. Multidisciplinary study brings problems such as defining complex systems, ensuring communication between stakeholders, and common language among different design teams. In solving such problems, traditional systems engineering approach cannot provide an efficient solution. In this paper, a model-based systems engineering approach is applied with a case study and the approach is found to be more efficient. In the case study, the design of the helicopter automatic flight control system was realized by applying model-based design processes with integration of tools. Requirement management, system architecture management and model-based systems engineering processes are explained and applied of the case study. Finally, model-based systems engineering approach is proven to be effective compared with the traditional systems engineering methods for complex systems in aviation and defence industries.","PeriodicalId":72673,"journal":{"name":"Computer science & information technology","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49198311","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Stack and Deal: An Efficient Algorithm for Privacy Preserving Data Publishing
Pub Date : 2021-07-24 | DOI: 10.5121/CSIT.2021.111111
Vikas Thammanna Gowda
Although k-Anonymity is a good way to publish microdata for research purposes, it still suffers from various attacks. Hence, many refinements of k-Anonymity have been proposed, such as l-Diversity and t-Closeness, with t-Closeness being one of the strictest privacy models. Satisfying t-Closeness for a lower value of t may yield equivalence classes with a large number of records, which results in greater information loss. For a higher value of t, equivalence classes are still prone to homogeneity, skewness, and similarity attacks, because equivalence classes can be formed with fewer distinct sensitive attribute (SA) values and still satisfy the constraint t. In this paper, we introduce a new algorithm that overcomes the limitations of k-Anonymity and l-Diversity and yields equivalence classes of size k with greater diversity, in which the frequencies of any SA value differ across equivalence classes by at most one.
{"title":"Stack and Deal: An Efficient Algorithm for Privacy Preserving Data Publishing","authors":"Vikas Thammanna Gowda","doi":"10.5121/CSIT.2021.111111","DOIUrl":"https://doi.org/10.5121/CSIT.2021.111111","url":null,"abstract":"Although k-Anonymity is a good way to publish microdata for research purposes, it still suffers from various attacks. Hence, many refinements of k-Anonymity have been proposed such as ldiversity and t-Closeness, with t-Closeness being one of the strictest privacy models. Satisfying t-Closeness for a lower value of t may yield equivalence classes with high number of records which results in a greater information loss. For a higher value of t, equivalence classes are still prone to homogeneity, skewness, and similarity attacks. This is because equivalence classes can be formed with fewer distinct sensitive attribute values and still satisfy the constraint t. In this paper, we introduce a new algorithm that overcomes the limitations of k-Anonymity and lDiversity and yields equivalence classes of size k with greater diversity and frequency of a SA value in all the equivalence classes differ by at-most one.","PeriodicalId":72673,"journal":{"name":"Computer science & information technology","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45731279","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Lattice Based Group Key Exchange Protocol in the Standard Model
Pub Date : 2021-07-24 | DOI: 10.5121/CSIT.2021.111113
Parhat Abla
Group key exchange schemes allow group members to agree on a session key. Although there are many works on constructing group key exchange schemes, most of them are based on algebraic problems that can be solved by quantum algorithms in polynomial time. Several works have considered lattice-based group key exchange schemes, which are believed to be post-quantum secure, but only in the random oracle model. In this work, we propose a group key exchange scheme based on the ring learning with errors (RLWE) problem. In contrast to existing schemes, our scheme is proved secure in the standard model. To achieve this, we define and instantiate a multi-party key reconciliation mechanism. Furthermore, using a known compiler with lattice-based signature schemes, we can achieve authenticated group key exchange with post-quantum security.
{"title":"Lattice Based Group Key Exchange Protocol in the Standard Model","authors":"Parhat Abla","doi":"10.5121/CSIT.2021.111113","DOIUrl":"https://doi.org/10.5121/CSIT.2021.111113","url":null,"abstract":"Group key exchange schemes allow group members to agree on a session key. Although there are many works on constructing group key exchange schemes, but most of them are based on algebraic problems which can be solved by quantum algorithms in polynomial time. Even if several works considered lattice based group key exchange schemes, believed to be post-quantum secure, but only in the random oracle model. In this work, we propose a group key exchange scheme based on ring learning with errors problem. On contrast to existing schemes, our scheme is proved to be secure in the standard model. To achieve this, we define and instantiate multi-party key reconciliation mechanism. Furthermore, using known compiler with lattice based signature schemes, we can achieve authenticated group key exchange with postquantum security.","PeriodicalId":72673,"journal":{"name":"Computer science & information technology","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41828777","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Dense-Res Net for Endoscopic Image Classification
Pub Date : 2021-07-24 | DOI: 10.5121/CSIT.2021.111108
Quoc-Huy Trinh, Minh Le Nguyen
We propose a method that fine-tunes a combination of DenseNet and ResNet backbones to classify eight classes covering anatomical landmarks, pathological findings, and endoscopic procedures in the GI tract. Our technique relies on transfer learning and combines two backbones, DenseNet-121 and ResNet-101, to improve feature extraction for classifying the target classes. After experimenting with and evaluating our approach, we obtain an F1 score of approximately 0.93 with 80,000 training images and 4,000 test images.
{"title":"Dense-Res Net for Endoscopic Image Classification","authors":"Quoc-Huy Trinh, Minh Le Nguyen","doi":"10.5121/CSIT.2021.111108","DOIUrl":"https://doi.org/10.5121/CSIT.2021.111108","url":null,"abstract":"We propose a method that configures Fine-tuning to a combination of backbone DenseNet and ResNet to classify eight classes showing anatomical landmarks, pathological findings, to endoscopic procedures in the GI tract. Our Technique depends on Transfer Learning which combines two backbones, DenseNet 121 and ResNet 101, to improve the performance of Feature Extraction for classifying the target class. After experiment and evaluating our work, we get accuracy with an F1 score of approximately 0.93 while training 80000 and test 4000 images.","PeriodicalId":72673,"journal":{"name":"Computer science & information technology","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46362793","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Appraisal Study of Similarity-Based and Embedding-Based Link Prediction Methods on Graphs
Pub Date : 2021-07-24 | DOI: 10.5121/CSIT.2021.111106
K. Islam, Sabeur Aridhi, Malika Smaïl-Tabbone
The task of inferring missing links, or predicting future ones, in a graph based on its current structure is referred to as link prediction. Link prediction methods based on pairwise node similarity are well-established approaches in the literature and show good prediction performance on many real-world graphs, even though they are heuristics. On the other hand, graph embedding approaches learn low-dimensional representations of nodes in a graph and are capable of capturing inherent graph features, thus supporting the subsequent link prediction task. This appraisal paper studies a selection of methods from both categories on several benchmark (homogeneous) graphs with different properties from various domains. Beyond the intra- and inter-category comparison of the methods' performance, our aim is also to uncover interesting connections between Graph Neural Network (GNN)-based methods and heuristic ones as a means to alleviate the well-known black-box limitation.
{"title":"Appraisal Study of Similarity-Based and Embedding-Based Link Prediction Methods on Graphs","authors":"K. Islam, Sabeur Aridhi, Malika Smaïl-Tabbone","doi":"10.5121/CSIT.2021.111106","DOIUrl":"https://doi.org/10.5121/CSIT.2021.111106","url":null,"abstract":"The task of inferring missing links or predicting future ones in a graph based on its current structure is referred to as link prediction. Link prediction methods that are based on pairwise node similarity are well-established approaches in the literature and show good prediction performance in many real-world graphs though they are heuristic. On the other hand, graph embedding approaches learn low-dimensional representation of nodes in graph and are capable of capturing inherent graph features, and thus support the subsequent link prediction task in graph. This appraisal paper studies a selection of methods from both categories on several benchmark (homogeneous) graphs with different properties from various domains. Beyond the intra and inter category comparison of the performances of the methods our aim is also to uncover interesting connections between Graph Neural Network(GNN)-based methods and heuristic ones as a means to alleviate the black-box well-known limitation.","PeriodicalId":72673,"journal":{"name":"Computer science & information technology","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44053868","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Impact of E-maintenance over Industrial Processes
Pub Date : 2021-07-24 | DOI: 10.5121/CSIT.2021.111112
Y. Moumen, M. Benhadou, A. Haddout
In the course of the Industry 4.0 era, companies have developed rapidly and have digitized almost their entire business systems in order to meet their performance targets and to keep, or even enlarge, their market share. The maintenance function has naturally followed this trend, as it is considered one of the most important processes in every enterprise: it affects a group of the most critical performance indicators, such as cost, reliability, availability, safety and productivity. E-maintenance emerged in the early 2000s and is now a common term in the maintenance literature, representing the digitalized side of maintenance, whereby assets are monitored and controlled over the internet. According to the literature, e-maintenance has a remarkable impact on maintenance KPIs and aims at ambitious objectives such as zero downtime.
{"title":"Impact of E-maintenance over Industrial Processes","authors":"Y. Moumen, M. Benhadou, A. Haddout","doi":"10.5121/CSIT.2021.111112","DOIUrl":"https://doi.org/10.5121/CSIT.2021.111112","url":null,"abstract":"During the course of the industrial 4.0 era, companies have been exponentially developed and have digitized almost the whole business system to stick to their performance targets and to keep or to even enlarge their market share. Maintenance function has obviously followed the trend as it’s considered one of the most important processes in every enterprise as it impacts a group of the most critical performance indicators such as: cost, reliability, availability, safety and productivity. E-maintenance emerged in early 2000 and now is a common term in maintenance literature representing the digitalized side of maintenance whereby assets are monitored and controlled over the internet. According to literature, e-maintenance has a remarkable impact on maintenance KPIs and aims at ambitious objectives like zero-downtime.","PeriodicalId":72673,"journal":{"name":"Computer science & information technology","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44537698","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The 5 Dimensions of Problem Solving using DINNA Diagram: Double Ishikawa and Naze Naze Analysis
Pub Date : 2021-07-24 | DOI: 10.5121/CSIT.2021.111114
M. Hamoumi, A. Haddout, M. Benhadou
Based on the principle that perfection is a divine criterion, process management exists on the one hand to achieve excellence (near perfection) and on the other hand to avoid imperfection. In other words, Operational Excellence (OE) is one of the approaches that, when used rigorously, aims to maximize performance. Therefore, mastery of problem solving remains necessary to achieve such a performance level. There are many tools that can be used, whether in continuous improvement for the resolution of chronic problems (KAIZEN, DMAIC, Lean Six Sigma…) or in the resolution of sporadic defects (8D, PDCA, QRQC…). However, these methodologies often use the same basic tools (Ishikawa diagram, 5 why, tree of causes…) to identify potential causes and root causes, resulting in three levels of causes: occurrence, non-detection and system. This research presents the development of the DINNA diagram [1] as an effective and efficient process that links the Ishikawa diagram and the 5-why method to identify root causes and avoid recurrence. The ultimate objective is that two working groups with similar skills analysing the same problem separately reach the same result; achieving this requires the consistent application of a robust methodology. We therefore speak of five dimensions: occurrence, non-detection, system, effectiveness and efficiency. As such, the paper offers a solution that is both effective and efficient, helping practitioners of industrial problem solving avoid missing the real root cause and avoid the costs that follow from a wrong decision.
{"title":"The 5 Dimensions of Problem Solving using DINNA Diagram: Double Ishikawa and Naze Naze Analysis","authors":"M. Hamoumi, A. Haddout, M. Benhadou","doi":"10.5121/CSIT.2021.111114","DOIUrl":"https://doi.org/10.5121/CSIT.2021.111114","url":null,"abstract":"Based on the principle that perfection is a divine criterion, process management exists on the one hand to achieve excellence (near perfection) and on the other hand to avoid imperfection. In other words, Operational Excellence (EO) is one of the approaches, when used rigorously, aims to maximize performance. Therefore, the mastery of problem solving remains necessary to achieve such performance level. There are many tools that we can use whether in continuous improvement for the resolution of chronic problems (KAIZEN, DMAIC, Lean six sigma…) or in resolution of sporadic defects (8D, PDCA, QRQC ...). However, these methodologies often use the same basic tools (Ishikawa diagram, 5 why, tree of causes…) to identify potential causes and root causes. This results in three levels of causes: occurrence, no detection and system. The research presents the development of DINNA diagram [1] as an effective and efficient process that links the Ishikawa diagram and the 5 why method to identify the root causes and avoid recurrence. The ultimate objective is to achieve the same result if two working groups with similar skills analyse the same problem separately, to achieve this, the consistent application of a robust methodology is required. Therefore, we are talking about 5 dimensions; occurrence, non-detection, system, effectiveness and efficiency. As such, the paper offers a solution that is both effective and efficient to help practitioners of industrial problem solving avoid missing the real root cause and save costs following a wrong decision.","PeriodicalId":72673,"journal":{"name":"Computer science & information technology","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42885170","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Enhancing Security in Internet of Things Environment by Developing an Authentication Mechanism using COAP Protocol
Pub Date : 2021-07-24 | DOI: 10.5121/CSIT.2021.111103
Samah Mohammed S ALhusayni, Wael Alosaimi
The Internet of Things (IoT) has received huge attention recently due to its emergence, its benefits, and its contribution to improving the quality of human lives. Securing IoT remains an open area of research, as security is the basis for allowing people to use the technology and embrace this development in their daily activities. Authentication is one of the key security elements of Information Assurance (IA), which comprises confidentiality, integrity, availability, non-repudiation, and authentication. Therefore, there is a need to enhance security in current authentication mechanisms. In this report, some of the authentication mechanisms proposed in recent years are presented and reviewed. Specifically, the study focuses on enhancing security in the CoAP protocol, owing to its fit with the characteristics of IoT devices and its need for stronger security, by using a symmetric key together with biometric features in the authentication. This study will help in providing secure authentication technology for IoT data, devices, and users.
{"title":"Enhancing Security in Internet of Things Environment by Developing an Authentication Mechanism using COAP Protocol","authors":"Samah Mohammed S ALhusayni, Wael Alosaimi","doi":"10.5121/CSIT.2021.111103","DOIUrl":"https://doi.org/10.5121/CSIT.2021.111103","url":null,"abstract":"Internet of Things (IoT) has a huge attention recently due to its new emergence, benefits, and contribution to improving the quality of human lives. Securing IoT poses an open area of research, as it is the base of allowing people to use the technology and embrace this development in their daily activities. Authentication is one of the influencing security element of Information Assurance (IA), which includes confidentiality, integrity, and availability, non repudiation, and authentication. Therefore, there is a need to enhance security in the current authentication mechanisms. In this report, some of the authentication mechanisms proposed in recent years have been presented and reviewed. Specifically, the study focuses on enhancement of security in CoAP protocol due to its relevance to the characteristics of IoT devices and its need to enhance its security by using the symmetric key with biometric features in the authentication. This study will help in providing secure authentication technology for IoT data, device, and users.","PeriodicalId":72673,"journal":{"name":"Computer science & information technology","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44394468","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}