Pub Date: 2012-10-01. DOI: 10.1109/WICT.2012.6409166
R. Annauth, H. Rughooputh
OFDM has been recognized as a promising technique for providing high spectral efficiency in future wireless systems. In this paper, a novel optimization technique, the Estimation Distribution Algorithm (EDA), is proposed for multi-objective optimization of OFDM system resources. The performance and effectiveness of the optimizer are analyzed and compared with the state-of-the-art NSGA-II and multi-objective particle swarm optimization algorithms.
R. Annauth, H. Rughooputh, "Optimization of OFDM systems resources using multi-objective Estimation Distribution Algorithms," 2012 World Congress on Information and Communication Technologies, Oct. 2012. DOI: 10.1109/WICT.2012.6409166.
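The abstract above does not detail the optimizer, but the core EDA loop — fit a probability model to the best solutions, then sample the next generation from it — can be sketched on a toy single-objective bit-string problem. This is a minimal univariate EDA (UMDA); all parameter choices are illustrative, not the authors':

```python
import random

def umda_onemax(n_bits=20, pop_size=50, n_select=25, generations=40, seed=1):
    """Univariate Marginal Distribution Algorithm (UMDA), the simplest EDA:
    estimate per-bit probabilities from the fittest individuals, then
    sample the next population from that distribution."""
    rng = random.Random(seed)
    probs = [0.5] * n_bits                      # initial bit-wise model
    for _ in range(generations):
        pop = [[1 if rng.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        pop.sort(key=sum, reverse=True)         # fitness = number of ones
        elite = pop[:n_select]
        # Re-estimate the distribution from the selected individuals,
        # clamping probabilities to avoid premature convergence to 0 or 1.
        probs = [min(0.95, max(0.05, sum(ind[i] for ind in elite) / n_select))
                 for i in range(n_bits)]
    return max(pop, key=sum)

best = umda_onemax()   # should be close to the all-ones string
```

The paper's multi-objective variant would replace the scalar fitness sort with a Pareto-based selection, but the model-fit/sample cycle stays the same.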
Several aspects of ad-hoc networks have attracted particular attention recently, such as energy consumption, quality of service and security. Among these, security has been a predominant concern in military, civil and educational applications. Within security, the detection of suspect or abnormally behaving nodes is one of the most important tasks. We have addressed the malicious node detection problem in ad-hoc networks using a special type of learning automata in an irregular network. We have used irregular cellular learning automata to detect anomalies at two levels. We have also rigorously evaluated the performance of our approach by simulating it with MATLAB and the Glomosim simulator, and have compared our solution with a powerful, similar learning-automata-based protocol named LAID. The simulation results show that our approach is more promising.
Amir Bagheri Aghababa, Amirhosein Fathinavid, Abdolreza Salari, Seyedeh Elaheh Haghayegh Zavareh, "A novel approach for malicious nodes detection in ad-hoc networks based on cellular learning automata," 2012 World Congress on Information and Communication Technologies, Oct. 2012. DOI: 10.1109/WICT.2012.6409055.
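The paper's irregular cellular learning automata are not reproduced here; as a minimal sketch of the underlying building block, a single two-action linear reward-penalty (L_{R-P}) automaton can be simulated as follows (the environment reward probabilities and learning rates are hypothetical):

```python
import random

def lrp(p_reward=(0.8, 0.3), a=0.1, b=0.02, steps=3000, seed=0):
    """Two-action linear reward-penalty automaton. p0 is the probability of
    choosing action 0; on reward the chosen action is reinforced, on penalty
    probability shifts toward the other action. The environment rewards
    action i with probability p_reward[i]."""
    rng = random.Random(seed)
    p0 = 0.5
    for _ in range(steps):
        action = 0 if rng.random() < p0 else 1
        rewarded = rng.random() < p_reward[action]
        if action == 0:
            p0 = p0 + a * (1 - p0) if rewarded else (1 - b) * p0
        else:
            # reward for action 1 shrinks p0; penalty for action 1 grows p0
            p0 = (1 - a) * p0 if rewarded else b + (1 - b) * p0
    return p0

p_mixed = lrp()                                        # stochastic environment
p_sure = lrp(p_reward=(1.0, 0.0), steps=500, seed=1)   # action 0 always better
```

With a clearly better action the automaton converges toward always choosing it; the paper arranges many such automata on an irregular cellular grid and uses their collective state to flag anomalous neighbors.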
Pub Date: 2012-10-01. DOI: 10.1109/WICT.2012.6409072
A. Das, M. Bhattacharya
In the present study, we have applied image enhancement to mammographic images. This study examines the fuzziness/impreciseness of mammograms: inhomogeneous background, low contrast, indistinct borders, small and ill-defined shapes, varying intensities of the suspicious regions, and low distinguishability from their surroundings. Though fuzzy-logic-based contrast enhancement techniques have good potential to handle the impreciseness of mammograms, the more generalized and flexible Vague Set theory is better suited to capture their vagueness.
A. Das, M. Bhattacharya, "Development of advanced contrast enhancement technique for mammographie images," 2012 World Congress on Information and Communication Technologies, Oct. 2012. DOI: 10.1109/WICT.2012.6409072.
Pub Date: 2012-10-01. DOI: 10.1109/WICT.2012.6409200
K. Sreenivasulu, E. Prasad, A. Subramanyam
In this paper we present a novel Vindictive Link detection enabled Distance Vector Routing Protocol (VLDD) for mobile ad hoc networks, through which the computational cost and resource consumption are bounded. While most of the existing models rely on digital signatures, the proposed protocol is based on adept symmetric-key cryptography with the proactive routing topology DSDV. The proposed protocol offers an efficient hash-chaining technique in which each node in the routing path stores only one arcanum (secret value), whereas hash-chain-based protocols require high storage memory as they must preserve multiple arcanums. Other hash-chaining protocols such as SEAD use multiple user authentications, whereas VLDD uses only one authentication tag for giving acknowledgements and for fault declarations. By the efficient use of hash chain elements, the metric values and the sequence numbers on the path can be secured from logical tampering. In contrast with SEAD, the proactive routing topology frequently cited in the secure ad hoc routing literature, which provides security only up to a lower bound of the evaluation metrics, VLDD can provide total protection. We evaluated the scalability of VLDD by comparing it with state-of-the-art algorithms and observed that it is scalable and adaptive over a wide range of divergent situations under different assessment measures. In particular, we show that it scales better on networks dense in node count. The simulation model was developed in the NS-2 environment.
K. Sreenivasulu, E. Prasad, A. Subramanyam, "VLDD: Vindictive Link detection in Distance Vector Routine for ad hoc network," 2012 World Congress on Information and Communication Technologies, Oct. 2012. DOI: 10.1109/WICT.2012.6409200.
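The hash-chaining idea the abstract refers to (also used by SEAD) can be illustrated with a minimal sketch: a node publishes the anchor of a one-way hash chain, then later reveals earlier elements, which anyone can verify by re-hashing up to the anchor. This is a generic illustration, not the VLDD protocol itself:

```python
import hashlib

def make_chain(seed: bytes, length: int):
    """Build a one-way hash chain: element i+1 is the hash of element i.
    The last element is published as the anchor; earlier elements stay
    secret until the node releases them."""
    chain = [seed]
    for _ in range(length):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain

def verify(element: bytes, anchor: bytes, max_steps: int) -> bool:
    """Check that repeatedly hashing `element` reaches the public anchor
    within max_steps hashes; a forger cannot produce such an element
    without inverting the hash."""
    h = element
    for _ in range(max_steps):
        if h == anchor:
            return True
        h = hashlib.sha256(h).digest()
    return h == anchor

chain = make_chain(b"node-secret", 10)
anchor = chain[-1]                       # published once, e.g. in a signed root
ok = verify(chain[3], anchor, 10)        # a released element authenticates
bad = verify(b"forged element", anchor, 10)
```

Protocols like SEAD bind sequence numbers and route metrics to positions in such a chain, which is how tampering with them becomes detectable.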
Pub Date: 2012-10-01. DOI: 10.1109/WICT.2012.6409085
Hanaa Ismail Elshazly, N. Ghali, Abir M. El Korany, A. Hassanien
The use of computational intelligence systems such as rough sets, neural networks, fuzzy sets, genetic algorithms, etc., for prediction and classification has been widely established. This paper presents a generic classification model based on a rough set approach and decision rules. To increase the efficiency of the classification process, a boolean reasoning discretization algorithm is used to discretize the data sets. The approach is tested in a comparative study of three different classifiers (decision rules, naive Bayes and k-nearest neighbor) over three distinct discretization techniques (equal binning, entropy and boolean reasoning). The rough set reduction technique is applied to find all the reducts of the data, which contain the minimal subsets of attributes associated with a class label for prediction. In this paper we adopt the genetic algorithm approach to reach reducts. Finally, decision rules are used as a classifier to evaluate the performance of the predicted reducts and classes. To evaluate the performance of our approach, we present tests on a breast cancer data set. The experimental results show that the overall classification accuracy offered by the employed rough set approach and decision rules is high compared with other classification techniques, including naive Bayes and k-nearest neighbor.
Hanaa Ismail Elshazly, N. Ghali, Abir M. El Korany, A. Hassanien, "Rough sets and genetic algorithms: A hybrid approach to breast cancer classification," 2012 World Congress on Information and Communication Technologies, Oct. 2012. DOI: 10.1109/WICT.2012.6409085.
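Of the three discretization techniques the abstract compares, equal binning is the simplest to illustrate. Here is a minimal equal-width binning sketch (illustrative only, not the paper's implementation):

```python
def equal_width_bins(values, n_bins):
    """Discretize continuous values into n_bins equal-width intervals,
    returning the bin index (0..n_bins-1) for each value."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0    # guard: all-equal values
    out = []
    for v in values:
        idx = int((v - lo) / width)
        out.append(min(idx, n_bins - 1))  # the maximum lands in the last bin
    return out

bins = equal_width_bins([0, 2, 4, 6, 8, 10], 2)
```

Entropy-based and boolean-reasoning discretization instead place cut points using class labels, which is why the paper finds them better suited to classification pipelines.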
Pub Date: 2012-10-01. DOI: 10.1109/WICT.2012.6409150
P. Neenu, S. Kumari, R. Dhuli
In this paper we derive a model for a sensor. Such a model can form the basis of characterising the multirate sensor array in terms of a nonuniform filter bank structure. The capabilities and limitations of the model are also highlighted. Two approaches for obtaining the modeling error are presented. The model is extended for the case when the sensor sampling rate is a rational factor of the observation signal Nyquist rate. The additional error that results in this scenario is analysed.
P. Neenu, S. Kumari, R. Dhuli, "On modeling the multirate sensor array," 2012 World Congress on Information and Communication Technologies, Oct. 2012. DOI: 10.1109/WICT.2012.6409150.
Pub Date: 2012-10-01. DOI: 10.1109/WICT.2012.6409169
D. Ramesh, K. Kumar, B. Ramji
In a distributed database environment, when the coordinator site (root node or process) fails, the environment needs to elect a new one in order to perform the transactional tasks. The elected coordinator takes the lead in performing the activities and continues the functioning. If the previously crashed site recovers from its failures, it again leads the system by taking back the responsibility. In this paper, a recovery instance based on a bi-directional ring election algorithm for the crashed coordinator is proposed. The new algorithm quickly brings the recovered site's state back by sending messages in parallel instances. This work shows how the algorithm makes the recovered site faster and reduces the time needed for the system to resume handling transactions normally.
D. Ramesh, K. Kumar, B. Ramji, "Design of a transaction recovery instance based on bi-directional ring election algorithm for crashed coordinator in distributed database systems," 2012 World Congress on Information and Communication Technologies, Oct. 2012. DOI: 10.1109/WICT.2012.6409169.
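The paper's bi-directional, parallel-message variant is not reproduced here, but the classic unidirectional ring election it builds on — circulate a token around the ring, keep the largest site ID seen, and announce it as the new coordinator — can be sketched as:

```python
def ring_election(ids, initiator=0):
    """One pass of a simple ring election: the initiator's ID circulates
    around the ring; each site replaces the token with max(token, own ID).
    After a full round the surviving (largest) ID is the new coordinator.
    Returns (coordinator_id, number_of_hops)."""
    n = len(ids)
    token = ids[initiator]
    hops = 0
    pos = (initiator + 1) % n
    while pos != initiator:
        token = max(token, ids[pos])     # each site keeps the larger ID
        pos = (pos + 1) % n
        hops += 1
    return token, hops

coordinator, msgs = ring_election([3, 7, 1, 9, 4])
```

Sending in both directions at once, as the paper proposes, roughly halves the time for the token to visit every site, at the cost of more messages in flight.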
Pub Date: 2012-10-01. DOI: 10.1109/WICT.2012.6409219
H. B. Kekre, P. Shrinath
Ultrasound (US) imaging is an important modality for examining clinical problems and is also used as a complement to mammogram images to understand the nature and shape of a breast tumor. An accurate and efficient segmentation method helps radiologists understand and observe the volume of a tumor (growth or shrinkage). Inherent artifacts in US images, such as speckle, attenuation and shadows, are major hurdles to achieving proper segmentation. Along with accuracy, computational efficiency is also a major concern in the segmentation process. In this paper, a VQ-based clustering technique is proposed for US image segmentation with KMCG and KFCG as codebook generation algorithms. A novel technique of sequential cluster clubbing is applied to the clusters obtained from the codebook generation algorithms, and the appropriate cluster is selected as the segmentation result. Besides the original KMCG and KFCG, augmented KMCG and KFCG are also proposed for clustering with different block sizes. The results of all proposed methods are compared with each other and the best result is selected based on two criteria: computational efficiency and accuracy. Finally, the best results among our methods are compared with those of the original watershed and improved watershed transforms.
H. B. Kekre, P. Shrinath, "Tumor demarcation by VQ based clustering and augmentation with KMCG and KFCG codebook generation algorithms," 2012 World Congress on Information and Communication Technologies, Oct. 2012. DOI: 10.1109/WICT.2012.6409219.
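KMCG and KFCG are the authors' specific codebook generation algorithms and are not reproduced here; as a generic stand-in, a VQ codebook can be generated with plain Lloyd (k-means) iterations, which conveys the clustering step such segmentation builds on:

```python
def vq_codebook(vectors, k, iters=10):
    """Generate a k-entry VQ codebook by Lloyd iterations: assign each
    training vector to its nearest codevector (squared Euclidean distance),
    then move each codevector to the centroid of its cluster."""
    code = [list(v) for v in vectors[:k]]          # seed with first k vectors
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in vectors:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(v, code[c])))
            clusters[nearest].append(v)
        for c, members in enumerate(clusters):
            if members:                            # keep empty cells in place
                code[c] = [sum(dim) / len(members) for dim in zip(*members)]
    return code

# Two well-separated groups of 2-D "pixel block" vectors (toy data).
train = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
book = sorted(vq_codebook(train, 2))
```

In the segmentation setting, the "vectors" would be small pixel blocks of the US image, and each codebook cell becomes a candidate tissue cluster.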
Pub Date: 2012-10-01. DOI: 10.1109/WICT.2012.6409079
N. Dey, P. Das, A. B. Roy, A. Das, S. S. Chaudhuri
Intravascular ultrasound (IVUS) is a medical imaging system that uses a specially designed catheter equipped with a small ultrasound probing device at its distal end. This system allows the ultrasound signal to visualize blood vessels from the inside and is capable of imaging diseased vessels and measuring the dimension and composition of plaque. IVUS is an efficient method for detecting various ischemic heart diseases. Presently, a considerable amount of work has been done in telemonitoring, which involves the transmission of biomedical signals through wireless media. The exchange of biomedical signals among hospitals needs efficient and reliable transmission. A watermark is "ownership" information added to multimedia content to prove authenticity, verify signal integrity, and achieve control over the copy process. This paper proposes a novel session-based blind watermarking method with a secret key that embeds binary watermark images into IVUS video. IVUS video is a sensitive diagnostic tool used to detect various cardiovascular diseases by measuring and recording the anatomy of the heart and adjoining blood vessels in exquisite detail. The various anatomical minutiae of the heart and blood vessels are important characteristics that correspond to many severe human cardiac diseases. In the present work, the IVUS video is split into frames, and the watermark embedding technique applies Discrete Wavelet Transformation (DWT) and Discrete Cosine Transformation (DCT) followed by Singular Value Decomposition (SVD). Watermark extraction is achieved by applying inverse DWT, inverse DCT and SVD. In this approach, the peak signal-to-noise ratio (PSNR) between the original and watermarked IVUS video signals and the correlation between the original and extracted watermark images indicate a highly acceptable level of imperceptibility and distortion.
N. Dey, P. Das, A. B. Roy, A. Das, S. S. Chaudhuri, "DWT-DCT-SVD based intravascular ultrasound video watermarking," 2012 World Congress on Information and Communication Technologies, Oct. 2012. DOI: 10.1109/WICT.2012.6409079.
Pub Date: 2012-10-01. DOI: 10.1109/WICT.2012.6409170
P. R. Kumar, S. S. Dhenakaran, K. Sailaja, P. Saikishore
Encryption, performed properly, is an art of protecting data that can win wars and protect property and personal information. It ensures that data is exchanged between the intended persons only; even if eavesdroppers obtain the data, they should not be able to understand it. The Chakra Algorithm is a symmetric-key encryption technique. It encrypts data using the concepts of Cartesian coordinate geometry and circle generation. The process applies translation and rotation of axes: the data is grouped into circles, each circle holding a portion of the data. The Cartesian axes are migrated to the centers of the respective circles and rotated by a certain angle. The collection of angles by which the individual circles are rotated, the coordinates to which they are swapped, the size of the square grid and the radius of each circle together form the symmetric key. Unlike other current algorithms, the Chakra Algorithm does not directly change the data but rather its location.
P. R. Kumar, S. S. Dhenakaran, K. Sailaja, P. Saikishore, "Chakra: A new approach for symmetric key encryption," 2012 World Congress on Information and Communication Technologies, Oct. 2012. DOI: 10.1109/WICT.2012.6409170.
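The paper's full key (rotation angles, swap coordinates, grid size, radii) is not reproduced here; a toy transposition in the same spirit — split the message into "rings" and rotate each ring by a keyed amount, moving characters without changing them — can be sketched as follows (all names and key values are illustrative):

```python
def ring_encrypt(text, key):
    """Toy ring-rotation transposition: split the message into consecutive
    rings (chunks), rotate each ring left by its keyed shift. The list of
    (ring size, shift) pairs is the symmetric key; only positions change,
    never the characters themselves."""
    rings, i = [], 0
    for size, shift in key:
        chunk = text[i:i + size]
        shift %= max(len(chunk), 1)              # guard for empty/short rings
        rings.append(chunk[shift:] + chunk[:shift])
        i += size
    return "".join(rings) + text[i:]             # leftover characters untouched

def ring_decrypt(cipher, key):
    """Decryption rotates each ring back by negating the keyed shifts."""
    inverse = [(size, -shift) for size, shift in key]
    return ring_encrypt(cipher, inverse)

key = [(4, 1), (6, 3)]                           # hypothetical key
cipher = ring_encrypt("HELLOWORLD", key)
plain = ring_decrypt(cipher, key)
```

Like Chakra, this is a pure transposition: security rests entirely on keeping the ring layout and rotations secret, which is why the paper folds grid size and radii into the key as well.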