Disease prediction of infants from DNA sequences remains an open challenge in bioinformatics, which deals with understanding human diseases and identifying new molecular targets for drug discovery. To address this challenge, a novel prototype architecture is designed for infant disease prediction based on the combination of parental DNA using Artificial Neural Networks (ANN). The proposed system integrates microarray technology, digital image processing and artificial neural networks. By applying microarray technology to the parental DNA, a gene expression image can be obtained. The extracted gene image is further characterized with the help of digital image processing. A neural network is trained with the mutated values and outputs the probability of diseases in the infant. The proposed architecture is implemented in MATLAB. This novel system demonstrates that disease prediction for infants is feasible and can be elaborated in future work.
{"title":"Infants Disease Prediction Architecture Using Artificial Neural Networks and Digital Image Processing","authors":"V. Thamaraiselvi, V. K. Kaliappan","doi":"10.1109/WCCCT.2014.25","DOIUrl":"https://doi.org/10.1109/WCCCT.2014.25","url":null,"abstract":"Disease prediction of infants from DNA sequences remains as an open challenge in the area of bioinformatics, which deals with understanding human diseases and in identification of new molecular target for drug discovery. To provide solution, a novel prototype architecture is designed for the disease prediction of infants, based on the combination of parent DNA using Artificial Neural Networks (ANN). The proposed system integrates Micro array Technology, Digital Image Processing and Artificial Neural Networks. By applying micro array technology to the parental DNA, a gene expression image can be obtained. The extracted gene image is further characterized with the help of Digital Image processing. A neural network is trained with the mutated value which lists the probability of diseases to the infants. The proposed architecture is implemented in MATLAB. This novel system ensures that the prediction of disease for infants is possible and can be elaborated in future.","PeriodicalId":421793,"journal":{"name":"2014 World Congress on Computing and Communication Technologies","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114983595","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
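The pipeline this abstract describes ends with a neural network mapping mutated values to a disease probability. A minimal single-neuron sketch in plain Python illustrates that final stage; the feature vectors and labels below are hypothetical stand-ins for the mutated values derived from microarray images, not data or the architecture from the paper.

```python
import math

def train_neuron(samples, labels, epochs=2000, lr=0.5):
    """Train a single sigmoid neuron (the simplest ANN) by gradient descent
    on log-loss."""
    n = len(samples[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # predicted disease probability
            g = p - y                       # gradient of log-loss w.r.t. z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical mutated-value features for four parent-DNA combinations;
# label 1 means the disease was observed.
X = [[0.1, 0.2], [0.2, 0.1], [0.8, 0.9], [0.9, 0.8]]
y = [0, 0, 1, 1]
w, b = train_neuron(X, y)
```

A real system would of course use far richer features and a multi-layer network, but the training loop has the same shape.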
Offering context-aware services to users is one of the main objectives of pervasive computing. A context-aware system needs to know the activities being performed by the user. Deciding what a user is doing at a given time poses a number of challenges. One significant challenge is dealing with the variation in the number, order and duration of the constituent steps of an activity. There is considerable variation in these parameters even when the same user performs the same activity at different times. Though fuzzy finite automata have been used by researchers to overcome this challenge, manual construction of the automata for daily life activities becomes onerous. This paper illustrates how finite automata can be constructed automatically and fuzziness incorporated into them, to recognize user activities in a smart environment. The proposed method is tested with a publicly available dataset and is found to give promising results.
{"title":"Activity Recognition with Fuzzy Finite Automata","authors":"H. Ali, D. Amalarethinam","doi":"10.1109/WCCCT.2014.34","DOIUrl":"https://doi.org/10.1109/WCCCT.2014.34","url":null,"abstract":"Offering context aware services to users is one of the main objectives of pervasive computing. A context aware system needs to know the activities being performed by the user. Deciding what a user is doing at a given time poses a number of challenges. One significant challenge is dealing with the variation in the number, order and duration of the constituent steps of an activity. There happens to be considerable variation in these parameters even if the same user is performing the same activity at different times. Though fuzzy finite automata have been used by researchers to overcome this challenge, manual construction of the automata for daily life activities becomes onerous. This paper illustrates how finite automata can be constructed automatically and fuzziness incorporated into it, to recognize user activities in a smart environment. The proposed method is tested with a publicly available dataset and is found to give promising results.","PeriodicalId":421793,"journal":{"name":"2014 World Congress on Computing and Communication Technologies","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129782799","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
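The key idea above — automata built automatically from observed activities, made tolerant to variation in step number and order — can be sketched as follows. This is an illustrative simplification, not the paper's construction: one chain state per observed step, with a fixed fuzzy membership penalty for out-of-order or missing steps.

```python
def build_automaton(steps):
    """Automatically build a chain automaton from one observed activity:
    one state per constituent step (a simplification of automatic
    construction from training traces)."""
    return {i: step for i, step in enumerate(steps)}

def fuzzy_accept(automaton, observed, miss_penalty=0.5):
    """Degree (0..1) to which an observed event sequence matches the
    activity, lowering membership for out-of-order or missing steps."""
    n = len(automaton)
    state = 0
    degree = 1.0
    for event in observed:
        if state < n and automaton[state] == event:
            state += 1                          # expected next step
        elif event in automaton.values():
            degree = min(degree, miss_penalty)  # known step, wrong position
    for _ in range(state, n):                   # steps never observed at all
        degree = min(degree, miss_penalty)
    return degree

make_tea = build_automaton(["fill_kettle", "boil", "pour"])
```

An exact replay of the steps keeps full membership 1.0, while a trace that skips "boil" drops to the penalty value rather than being rejected outright — the tolerance that crisp automata lack.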
Mammography is the best available technique used by radiologists for screening and early detection of breast cancer. In digital mammography, finding an efficient and precise breast profile segmentation technique is a time-consuming problem. In this research work, a novel hybrid method named M-HBMO (Mammogram based Honey Bees Mating Optimization) algorithm has been proposed to segment the lesion. The cancer profile segmentation is based on texture features and extraction of the lesion. M-HBMO is evaluated against the conventional ROI (region of interest) algorithm. The experiment is conducted with MRI images retrieved from a medical hospital database. The results show that the M-HBMO method accurately segments the breast region in the corresponding MRI images.
{"title":"Detection of Mammograms Using Honey Bees Mating Optimization Algorithm (M-HBMO)","authors":"R. Durgadevi, B. Hemalatha, K. V. K. Kaliappan","doi":"10.1109/WCCCT.2014.52","DOIUrl":"https://doi.org/10.1109/WCCCT.2014.52","url":null,"abstract":"Mammography is the best available technique used by radiologists for screening early detection of breast cancer. In digital mammography the crisis of finding efficient and precise breast profile segmentation technique is time-consuming. In this research work, a novel hybrid method named M-HBMO (Mammogram based Honey Bees Mating Optimization) algorithm has been proposed to segment the lesion. The cancer profile segmentation is based on texture feature and extraction of the lesion. The M-HBMO is evaluated with conventional ROI (region of interest) Algorithm. The experiment is conducted with MRI images retrieved from the medical hospital database. The result proves that the M-HBMO method segments the breast region accurately correspond to respective MRI images.","PeriodicalId":421793,"journal":{"name":"2014 World Congress on Computing and Communication Technologies","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131959680","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Cloud computing has emerged as one of the most influential paradigms in the IT industry in recent years. Because this technology requires users to entrust their valuable data to cloud providers, there have been growing security and privacy concerns over outsourced data. Several schemes employing attribute-based encryption (ABE) have been proposed for access control of outsourced data in cloud computing; however, most of them suffer from inflexibility in implementing complex access control policies. In order to realize scalable, flexible, and fine-grained access control of outsourced data in cloud computing, in this paper we propose hierarchical attribute-set-based encryption (HASBE), which extends ciphertext-policy attribute-set-based encryption (ASBE) with a hierarchical structure of users. The proposed scheme not only achieves scalability due to its hierarchical structure, but also inherits the flexibility and fine-grained access control of ASBE in supporting compound attributes. In addition, HASBE employs multiple value assignments for the access expiration time to deal with user revocation more effectively than existing schemes. We implement our scheme and show through comprehensive experiments that it is both efficient and flexible in handling access control for outsourced data in cloud computing.
{"title":"Improving Cloud Security by Enhanced HASBE Using Hybrid Encryption Scheme","authors":"B. Poornima, T. Rajendran","doi":"10.1109/WCCCT.2014.88","DOIUrl":"https://doi.org/10.1109/WCCCT.2014.88","url":null,"abstract":"Cloud computing has appeared as one of the most influential paradigms in the IT commerce in recent years and this technology needs users to entrust their precious facts and figures to cloud providers, there have been expanding security and privacy concerns on outsourced data. Several schemes employing attribute-based encryption (ABE) have been suggested for get access to control of outsourced data in cloud computing; however, most of them suffer from inflexibility in applying convoluted get access to command principles. In order to recognize scalable, flexible, and finegrained get access to control of outsourced facts and figures in cloud computing, in this paper, we suggest hierarchical attribute-set-based encryption (HASBE) by expanding ciphertext-policy attributeset- based encryption (ASBE) with a hierarchical structure of users. The suggested design not only achieves scalability due to its hierarchical structure, but furthermore inherits flexibility and fine-grained get access to command in carrying compound attributes of ASBE. In addition, HASBE uses multiple worth assignments for access expiration time to deal with client revocation more effectively than living schemes. We apply our scheme and show that it is both effective and flexible in dealing with get access to command for outsourced facts in cloud computing with comprehensive trials.","PeriodicalId":421793,"journal":{"name":"2014 World Congress on Computing and Communication Technologies","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126542710","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
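The HASBE cryptosystem itself is too involved for a short sketch, but the access decision it enforces — every required attribute held, and the key not past its expiration time — can be illustrated in plain Python. The policy format and attribute names below are hypothetical, chosen only to show the shape of an attribute-set check with expiration-based revocation.

```python
import time

def satisfies(policy, attribute_set, now=None):
    """Illustrative attribute-set access check (not the HASBE cryptosystem):
    grant access only if the key has not expired and every required
    attribute is present in the user's attribute set."""
    now = time.time() if now is None else now
    if policy.get("expires", float("inf")) < now:
        return False                      # expired key: revoked user
    return policy["required"] <= attribute_set

# Hypothetical policy: finance-department auditors, key valid until a
# fixed expiration timestamp.
policy = {"required": {"dept:finance", "role:auditor"},
          "expires": 2_000_000_000}
```

In the real scheme this decision is embedded in the ciphertext and enforced by decryption failing, not by a server-side check; the sketch only mirrors the logical policy.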
Customer Relationship Management (CRM) is a widespread strategy and process of identifying, retaining and associating with selective customers in order to sustain their relationship with the organization. Through the application of CRM, greater efficacy and effectiveness in delivering strategies can be achieved. To support CRM in the retail sector, many data mining techniques are applied in identifying, attracting, developing and retaining customer relationships. On the other hand, with CRM alone retailers are unable to predict the demand for products, which may lead to inconsistencies in profit. To overcome this disadvantage, an integrated version of CRM along with Supply Chain Management (SCM) is designed, which derives the stock requirement in SCM from the demand observed in CRM. Thus SCM with CRM enhances the retailers' profit, meets customer demand and retains current customers.
{"title":"Improving the Retailers Profit for CRM Using Data Mining Techniques","authors":"K. Deepa, S. Dhanabal, V. K. Kaliappan","doi":"10.1109/WCCCT.2014.23","DOIUrl":"https://doi.org/10.1109/WCCCT.2014.23","url":null,"abstract":"Customer Relationship Management (CRM) is a widespread strategy and process of identifying, retaining and associating selective customers in order to sustain their relationship with the organization. By the application of CRM, greater efficacy and effectiveness in delivering strategies could be achieved. To support CRM in retail sector, many data mining techniques are applied in identifying, attracting, developing and retaining the customer's relationships. On the other hand, the retailers are unable to predict the demand of products which may leads to inconsistencies in profit by the use of CRM. To overcome this disadvantage, an integrated version of CRM along with Supply Chain Management (SCM) is designed which affords the stock requirement from SCM based on the demand in CRM. Thus SCM with CRM enhances the retailers profit, meet the customer's demand and retain the current customers.","PeriodicalId":421793,"journal":{"name":"2014 World Congress on Computing and Communication Technologies","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126377859","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Numerous modern operating systems natively support both time-sliced and multiprocessor threading with a process scheduler. The kernel of an operating system permits programmers to control threads via the system call interface. Java incorporates the threading facility within the language itself rather than managing threads as a facility of the underlying operating system. This research focuses on how Java can access Pthreads through JNI, allowing Java threads and native Pthreads to execute in a hybrid model.
{"title":"Java Native Pthread for Win32 Platform","authors":"Dr. Bala Dhandayuthapani V, G. M. Nasira","doi":"10.1109/WCCCT.2014.13","DOIUrl":"https://doi.org/10.1109/WCCCT.2014.13","url":null,"abstract":"Numerous advanced operating systems honestly hold both time-sliced and multiprocessor threading with a process scheduler. The kernel of an operating system permits programmers to control threads via the system call interface. Java incorporates threading facility within the language itself rather than managing threads as a facility of the underlying operating system. This research finding focuses on how Java can facilitate Pthreads through JNI, which can exploit in Java threads and Native Pthreads to execute in hybrid model.","PeriodicalId":421793,"journal":{"name":"2014 World Congress on Computing and Communication Technologies","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125178188","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The performance of the K-means clustering algorithm is poor for high-dimensional data sets. The goal of this paper is to reduce high-dimensional data to a meaningful low-dimensional representation, so that the efficiency of the clustering algorithm is elevated. Hence, to improve the efficiency of cluster analysis, the unsupervised quick reduct algorithm (USQR) is used to select features from the high-dimensional data. The selected features are then used to find the initial centroids using the k-MAM initialization technique for k-means. The initial centroids are finally used to find the clusters. The results show that k-MAM with USQR outperforms standard k-means in terms of accuracy and number of iterations on high-dimensional data.
{"title":"An Enhanced Clustering of High Dimensional Datasets Using Unsupervised Quick Reduct Algorithm (USQR) With Rough Set Theory","authors":"P. Gomathi, S. Dhanabal, V. K. Kaliappan","doi":"10.1109/WCCCT.2014.55","DOIUrl":"https://doi.org/10.1109/WCCCT.2014.55","url":null,"abstract":"The performance of K-means clustering algorithm is poor for high dimensions data set. The goal of this paper is to reduce the high dimensional data to a meaningful low dimensional data representation, so that the efficiency of clustering algorithm will be elevated. Hence to improve the efficiency of clustering analysis, unsupervised quick reduct algorithm (USQR) is used for selecting the features from high dimensional data. Then the selected features are used to find the initial centroid using k-MAM initialization technique for k-means. The initial centroids are finally used to find the clusters. The results are compared to k-means and k-MAM with USQR so that outperforms well, in terms of accuracy and number of iterations compared to the k-means, for high dimensional data.","PeriodicalId":421793,"journal":{"name":"2014 World Congress on Computing and Communication Technologies","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116037916","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
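The pipeline above seeds k-means with deterministically chosen initial centroids instead of random ones. The sketch below uses farthest-first seeding as a stand-in for the paper's k-MAM rule (whose exact definition is not given in the abstract), then runs ordinary Lloyd iterations:

```python
def dist2(a, b):
    """Squared Euclidean distance between two points."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def seed_centroids(points, k):
    """Deterministic farthest-first seeding: start from the first point,
    repeatedly add the point farthest from all chosen centroids.
    (Illustrative stand-in for the paper's k-MAM initialization.)"""
    centroids = [points[0]]
    while len(centroids) < k:
        far = max(points, key=lambda p: min(dist2(p, c) for c in centroids))
        centroids.append(far)
    return centroids

def kmeans(points, k, iters=20):
    """Standard Lloyd iterations from the seeded centroids."""
    centroids = seed_centroids(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: dist2(p, centroids[j]))
            clusters[i].append(p)
        centroids = [
            tuple(sum(c) / len(c) for c in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

pts = [(0, 0), (0, 1), (10, 10), (10, 11)]
cents, clusters = kmeans(pts, 2)
```

With deterministic seeding the result no longer depends on a random draw, which is the point of replacing random initialization; feature selection (USQR) would run before this step to shrink the dimensionality of `pts`.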
This paper is an attempt at a detailed study of cryptography algorithms. In resource-constrained systems, Elliptic Curve Cryptography (ECC) is a promising alternative among public-key algorithms, because it provides a similar level of security with shorter keys than conventional integer-based public-key algorithms. ECC over binary fields is taken up with special interest because binary field operations are thought to be more efficient in both space and time. However, software implementations of ECC over binary fields are slow, especially on the low-end processors used in small computing devices such as sensor nodes and mobile phones. This paper studies cryptography algorithms and the software implementation of ECC. First, architectural parameters that may affect the choice of algorithm when implementing ECC in software, such as byte size, are examined. Software suitable for low-end processors is also identified. In addition, the ECC algorithm is implemented for multiparty electronic transactions.
{"title":"Secure Multiparty Electronic Payments Using ECC Algorithm: A Comparative Study","authors":"Dr. K. Ravikumar, A. Udhayakumar","doi":"10.1109/WCCCT.2014.31","DOIUrl":"https://doi.org/10.1109/WCCCT.2014.31","url":null,"abstract":"This paper is an attempt at the detailed study of Cryptography algorithm. In resource constrained system, Elliptic Curve Cryptography is a promising alternative for public algorithms, because it provides similar level of security with proposed shorter keys than conventional integer based public key algorithm. ECC over binary field is taken up with special interest because the operation in binary filed operation, are thought to be more in space and efficient in time. However, ECC's software implementation, on binary field are slow, Specially on low end processors, which are used in small computing devices such as sensors node, mobile phone, etc. This proposed paper, studied the Cryptography algorithms and software implementation of ECC. Firstly, while implementing ECC with software, for example byte size may affect the choice of algorithm some architectural parameters has been examined. Also, identification of software for low-end processors has been done. In addition, the proposed paper has implemented ECC algorithm in Multiparty Electronic transaction.","PeriodicalId":421793,"journal":{"name":"2014 World Congress on Computing and Communication Technologies","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128654746","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
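As a concrete illustration of the elliptic-curve arithmetic underlying such payment schemes, here is a toy Diffie–Hellman key agreement over a small prime field. Note the differences from the paper's setting: it works over binary fields, and real deployments use standardized curves with constant-time implementations; this textbook curve is for exposition only.

```python
# Toy curve y^2 = x^3 + 2x + 2 (mod 17) with generator G = (5, 1)
# of order 19 -- a standard textbook example, far too small for security.
P, A = 17, 2
G = (5, 1)

def add(p, q):
    """Elliptic-curve point addition; None is the point at infinity."""
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                                   # p + (-p) = infinity
    if p == q:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def mul(k, p):
    """Scalar multiplication k*p by double-and-add."""
    r = None
    while k:
        if k & 1:
            r = add(r, p)
        p = add(p, p)
        k >>= 1
    return r

# ECDH: both parties derive the same shared point from exchanged
# public keys a*G and b*G.
a_priv, b_priv = 3, 7
shared_a = mul(a_priv, mul(b_priv, G))
shared_b = mul(b_priv, mul(a_priv, G))
```

The multiparty setting in the paper extends this pairwise agreement; the group law itself is the same.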
Increasing data volumes, offsite data replication, and the greater-than-ever use of content-rich and Big Data applications are compelling IT organizations to optimize their network resources. Trends such as virtualization and cloud computing further emphasize this requirement in the current era of Big Data. To help with this process, companies are increasingly relying on a new generation of WAN optimization techniques, appliances, controllers, platforms and products that are displacing standalone physical appliances by offering more scalability, flexibility, and manageability through the additional inclusion of software to handle Big Data and bring valuable insights through big data analytics. An optimized WAN environment can increase network reliability, accessibility and availability and improve cost profiles. It also improves the performance and consistency of data backup, replication, and recovery processes. This paper covers an introduction to WAN optimization, prominent WAN optimization techniques, WAN optimization products used for Big Data analytics, and finally future trends and research issues of WAN optimization in the ensuing era of Big Data.
{"title":"WAN Optimization Tools, Techniques and Research Issues for Cloud-Based Big Data Analytics","authors":"M. Nirmala","doi":"10.1109/WCCCT.2014.72","DOIUrl":"https://doi.org/10.1109/WCCCT.2014.72","url":null,"abstract":"Increasing data volumes, data replication at offsite, and the greater than ever use of content-rich and Big Data, applications are mandating IT organizations to optimize their network resources. Trends such as Virtualization and Cloud computing further emphasize this requirement of this current era of Big data. To help with this process, companies are increasingly relying on a new generation of WAN optimization Techniques, Appliances, Controllers, Platforms and Products that are displacing standalone physical appliances by offering more scalability, flexibility, and manageability by additional inclusion of software to handle this Big data and bring valuable insights through big data analytics. An optimized WAN environment can increase network reliability, accessibility and availability and improve cost profiles. It also improves the performance and consistency of data backup, replication, and recovery processes. This paper covers the introduction to WAN optimization, prominent WAN optimization techniques, WAN optimization products used for Big data analytics and finally future trends and research Issues of WAN optimization in the ensuing era of Big data.","PeriodicalId":421793,"journal":{"name":"2014 World Congress on Computing and Communication Technologies","volume":"62 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130930088","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Medical databases contain massive volumes of clinical data which could provide valuable information regarding diagnosis, prognosis and treatment plans when mining algorithms are used in an appropriate manner. The irrelevant, redundant and incomplete data in medical databases make the extraction of useful patterns a difficult process. Feature selection, a robust data preprocessing method, selects attributes that enhance the predictive accuracy of classification algorithms. Consistency subset evaluation with a best-first search approach selects a feature subset whose consistency equals that of the full feature set. The selected optimal feature subset is classified using Modlem, a rough set based rule-induction algorithm. The performance of the classification algorithms is evaluated in terms of three metrics, viz., accuracy, sensitivity and specificity.
{"title":"An Empirical Study on the Performance of Rule-Based Classification by Feature Selection","authors":"S. Balakrishnan, M. Babu, P. Krishna","doi":"10.1109/WCCCT.2014.76","DOIUrl":"https://doi.org/10.1109/WCCCT.2014.76","url":null,"abstract":"Medical databases contain massive volume of clinical data which could provide valuable information regarding diagnosis, prognosis and treatment plan when mining algorithms are used in appropriate manner. The irrelevant, redundant and incomplete data in medical databases makes the extraction of useful pattern a difficult process. Feature selection, a robust data preprocessing method selects attributes that enhances the predictive accuracy of classification algorithms. Consistency subset evaluation with best first search approach selects a feature subset of consistence equal to that of full feature set. The optimal feature subset selected is classified using Modlem, a rough set based rule-induction algorithm. The performance of the classification algorithms are evaluated in terms of three metrics viz, Accuracy, Sensitivity and Specificity.","PeriodicalId":421793,"journal":{"name":"2014 World Congress on Computing and Communication Technologies","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116683959","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
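The consistency criterion above can be made concrete: a candidate subset is acceptable when its inconsistency count matches that of the full feature set. A minimal sketch of that measure, using a hypothetical four-record dataset in which feature 0 alone determines the class and feature 1 is redundant:

```python
from collections import Counter, defaultdict

def inconsistency(data, labels, features):
    """Inconsistency count of a feature subset: group the samples that are
    identical on the chosen features; within each group, every sample
    outside the majority class is inconsistent. This is the measure behind
    consistency subset evaluation."""
    groups = defaultdict(list)
    for row, y in zip(data, labels):
        groups[tuple(row[f] for f in features)].append(y)
    return sum(len(g) - max(Counter(g).values()) for g in groups.values())

# Hypothetical clinical records: feature 0 predicts the class on its own.
data = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = ["sick", "sick", "well", "well"]
```

Best-first search would explore subsets and keep the smallest one whose inconsistency equals the full set's; here `[0]` already achieves it, so feature 1 can be dropped before rule induction with Modlem.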