
Latest publications: 2014 International Conference on Contemporary Computing and Informatics (IC3I)

Novel data storage and retrieval in cloud database by using frequent access node encryption
Pub Date : 2014-11-01 DOI: 10.1109/IC3I.2014.7019733
Sadia Syed, P. Teja
Cloud computing offers companies virtually unlimited data storage at attractive cost. However, it also introduces new challenges for protecting both the confidentiality of data and access to it. Sensitive data such as medical records, business, or governmental data cannot be stored unencrypted in the cloud. Moreover, the data may be of interest to many users, and different policies could apply to each. Companies need new mechanisms to query encrypted data without revealing anything to the cloud server, and to enforce access policies on the data. Current security schemes do not allow complex queries over encrypted data in a multi-user setting; instead, they are limited to keyword searches. Moreover, current solutions assume that all users have the same access rights to the data. This paper presents the implementation of a scheme that allows SQL-like queries on encrypted databases in a multi-user setting, while allowing the database owner to assign different access rights to users. We address these issues by combining cloud computing technologies with Attribute-Based Encryption for secure storage and efficient retrieval of data from databases. Here the attribute is the frequently accessed node in the database, which can be encrypted for secure storage and retrieval. In situations where access control alone is not sufficient, database encryption is unavoidable: it provides an additional layer of protection beyond conventional access control techniques and prevents unauthorized users, including intruders who break into the network, from viewing sensitive data. As a result, data remains protected even if the database is successfully attacked or stolen. However, encryption and decryption degrade database performance.
When all information is stored in encrypted form, selections can no longer be made directly on the database content; data must be decrypted first, so a reluctant trade-off between security and performance is normally forced. We present a multi-level threshold attribute-based encryption scheme whose ciphertext size depends only on the size of the policy and is independent of the number of attributes. The attribute can be taken as the most frequently accessed node in the database.
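The idea of answering equality queries over encrypted data without revealing plaintext to the server can be illustrated with a toy sketch. This is not the authors' attribute-based scheme: it uses a deterministic HMAC search tag (a simplified searchable-encryption trick) and placeholder ciphertext strings, purely to show the client/server split the abstract describes.

```python
import hashlib
import hmac

def search_tag(key: bytes, value: str) -> str:
    """Deterministic tag so the server can match equality queries
    without ever seeing the plaintext value or the key."""
    return hmac.new(key, value.encode(), hashlib.sha256).hexdigest()

class ToyEncryptedTable:
    """Server-side store: rows hold opaque ciphertexts plus a search
    tag for one queryable column."""
    def __init__(self):
        self.rows = []

    def insert(self, tag: str, ciphertext: str) -> None:
        self.rows.append((tag, ciphertext))

    def select_by_tag(self, tag: str):
        # The server matches tags blindly; it learns only which rows
        # share a value, never the value itself.
        return [c for t, c in self.rows if t == tag]

key = b"client-secret-key"          # held only by the client
table = ToyEncryptedTable()
table.insert(search_tag(key, "cardiology"), "<enc record 1>")
table.insert(search_tag(key, "oncology"), "<enc record 2>")

# SQL-like equality query: SELECT * WHERE dept = 'cardiology'
hits = table.select_by_tag(search_tag(key, "cardiology"))
```

A real scheme would also encrypt the records themselves and, as in the paper, bind decryption rights to user attributes; the sketch only captures the query path.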
Citations: 1
Analysis and performance evaluation of various image segmentation methods
Pub Date : 2014-11-01 DOI: 10.1109/IC3I.2014.7019614
S. U. Mageswari, C. Mala
Image segmentation is a primary stage in image processing for identifying objects of interest. Segmentation methods can be classified as region based, transform based, edge based, and clustering based. In this paper, segmentation methods including histogram thresholding, watershed, the Canny edge detector, and K-means clustering are studied and analyzed. The experimental results are compared using several evaluation measures, including three standard image segmentation indices: the Rand index, global consistency error, and variation of information.
Citations: 3
Template matching method for Kannada Handwritten recognition based on correlation analysis
Pub Date : 2014-11-01 DOI: 10.1109/IC3I.2014.7019635
C. Aravinda, H. Prakash
Handwriting recognition systems were developed out of a need to automate the conversion of data into electronic format, a process that would otherwise be lengthy and error prone. Building a character recognition system has been a major research area for over a decade, owing to its wide range of applications. Many researchers have discussed techniques for recognizing handwritten characters in different languages. In this paper we adopt a correlation technique for recognizing Kannada handwritten characters. The formation of Kannada characters into their compound form, also called Kagunita, makes recognition more complex. The digitized input image is subjected to various preprocessing techniques, and the processed image is then segmented into individual characters using a simple segmentation algorithm. Each segmented character is correlated with the stored templates, and the template with the maximum correlation value is displayed in an editable format.
Citations: 7
A survey on spectrum handoff techniques in cognitive radio networks
Pub Date : 2014-11-01 DOI: 10.1109/IC3I.2014.7019755
Pushp Maheshwari, Awadhesh Kumar Singh
The last decade has witnessed a tremendous increase in the use of wireless applications in all walks of life. Consequently, the availability of the valuable and limited electromagnetic radio spectrum has emerged as a major challenge. The escalating demand for this limited natural resource places a number of constraints on its usage. Because the licensed spectrum is used inefficiently, a new communication model, cognitive radio, has been developed to exploit the available spectrum opportunistically. In order to use vacant or underutilized frequency bands (henceforth called spectrum holes) opportunistically, cognitive radios may have to switch frequency bands quite often, which can lead to application discontinuity and performance degradation. Efficient techniques are therefore needed to handle spectrum handoff. In this paper, we present a survey of spectrum handoff techniques along with a brief overview of cognitive radio technology. Cognitive user throughput and handoff delay are the two parameters most commonly used to compare handoff techniques.
Citations: 9
Robust classification of primary brain tumor in Computer Tomography images using K-NN and linear SVM
Pub Date : 2014-11-01 DOI: 10.1109/IC3I.2014.7019693
G. Sundararaj, V. Balamurugan
Computer Tomography (CT) images are widely used in diagnosing intracranial hematoma and hemorrhage. In this paper we develop a new approach for automatic classification of brain tumors in CT images. The proposed method consists of four stages: preprocessing, feature extraction, feature reduction, and classification. In the first stage, a Gaussian filter is applied for noise reduction and to make the image suitable for feature extraction. In the second stage, various texture- and intensity-based features are extracted for classification. In the next stage, principal component analysis (PCA) is used to reduce the dimensionality of the feature space, which results in more efficient and accurate classification. In the classification stage, two classifiers are used to classify the experimental images as normal or abnormal: the first is based on k-nearest neighbours and the second on a linear SVM. The experimental results are evaluated using the similarity index (SI), overlap fraction (OF), and extra fraction (EF) metrics. Compared with other neural-network-based classifiers, the proposed technique significantly improves tumor detection accuracy.
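The k-NN stage of the pipeline is simple enough to sketch end to end: classify a query feature vector by majority vote among its k nearest training samples. The feature values below are made-up toy numbers (e.g. a PCA-reduced pair of texture/intensity features), not data from the paper.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs. Predict by
    majority vote among the k nearest training samples."""
    nearest = sorted(train, key=lambda s: euclidean(s[0], query))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

# Toy PCA-reduced features: [f1, f2] per image, labelled by a radiologist.
train = [([0.20, 0.10], "normal"),
         ([0.25, 0.15], "normal"),
         ([0.80, 0.90], "abnormal"),
         ([0.75, 0.85], "abnormal")]
prediction = knn_predict(train, [0.78, 0.88], k=3)
```

With k=3 the vote is robust to a single mislabelled neighbour; the paper's linear SVM would instead learn a separating hyperplane over the same reduced features.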
Citations: 15
Web based and voice enabled IVRS for large scale Malayalam speech data collection
Pub Date : 2014-11-01 DOI: 10.1109/IC3I.2014.7019717
P. Shobana Devi, Divya Das, J. Stephen, V. K. Bhadran
Speech corpora are a vital resource for the development and evaluation of automatic speech recognition systems, as well as for acoustic-phonetic studies. Collecting a huge corpus is not an easy task, and the lack of such resources is one of the reasons for the absence of good-quality speech recognition systems for Indian languages. Here we automate this process by developing a web-based tool for collecting broadband speech data and an IVR system with speech recognition for collecting narrowband speech data. The main features include full support for the typical recording, annotation, and project administration workflow and easy editing of the speech content, with the advantage of a fully localizable user interface. This paper describes in detail the development of the web-based speech collection tool and the IVR system, which together enable end-to-end building of a speech corpus with minimal manual effort.
Citations: 0
Modeling Chinese wall access control using formal concept analysis
Pub Date : 2014-11-01 DOI: 10.1109/IC3I.2014.7019619
S. C. Mouliswaran, Ch. Aswani Kumar, C. Chandrasekar
Chinese wall access control (CWAC) is a well-known access control model suited to the secure sharing of commercial consultancy services: it prevents information flows that would create a conflict of interest for an individual consultant. Our main objective is to model the Chinese wall access control policy using formal concept analysis, which extends and restructures lattice theory. To attain this goal, we develop a formal context for the security aspects of Chinese wall access permissions. We evaluate the proposed method in a common commercial consultancy service-sharing scenario. The analysis results confirm that the proposed method satisfies the constraints of the Chinese wall security policy and its properties, such as the simple security property and the *-property.
Citations: 13
Live forensics analysis: Violations of business security policy
Pub Date : 2014-11-01 DOI: 10.1109/IC3I.2014.7019695
G. Tanwar, A. S. Poonia
Many corporate entities today utilize ICTs to identify opportunities for innovative, customer-centric, value-added products and services. Indeed, information systems have been a key characteristic of growing and successful businesses, as they use ICTs for business value creation. The key motivation for heavy investment in IT infrastructure is to ensure growth in revenue and retention of a sizeable market share. A computer usage policy is a document that provides guidelines regulating the acceptable use of these systems by end users. These guidelines also serve as benchmark metrics for assessing the abuse or misuse of corporate information systems; such misuse and abuse are referred to in this study as computer usage violations. Ten users, selected randomly from within each unit of a multilateral company, were observed for violations. Live computer forensics techniques using EnCase, Microsoft reporting tools, WinHex, and similar tools were employed to investigate these violations. Notwithstanding strict corporate policies, the study revealed that end users violated virtually all computer usage policies. This paper further analyses the causes and effects of computer usage violations and offers measures to mitigate them.
Citations: 2
A trade-off establishment between software complexity and its usability using evolutionary multi-objective optimization (EMO)
Pub Date : 2014-11-01 DOI: 10.1109/IC3I.2014.7019730
Vandana Yadav, Siddharth Lavania, Arun Chaudhary, Namrata Dhanda
In this paper, by applying an evolutionary multi-objective optimization (EMO) technique, we establish a trade-off balance between two conflicting aspects of software: its complexity and its usability (business value).
Citations: 0
Depth recovery from stereo images
Pub Date : 2014-11-01 DOI: 10.1109/IC3I.2014.7019764
P. R. Induchoodan, M. J. Josemartin, P. R. Geetharanjin
In machine vision applications, distance (depth) is an important quantity. This paper describes a stereoscopic depth calculation method that uses images from two identical cameras separated by a small baseline. The method requires camera calibration and rectification, an important step needed to match the images captured by the two cameras. Disparity is then computed using stereo matching; disparity is directly related to depth. The proposed method is useful for planetary vision, autopilots, and similar applications.
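The disparity-to-depth relation the abstract relies on is the standard rectified-stereo formula Z = f·B/d, where f is the focal length in pixels, B the baseline between the cameras, and d the disparity in pixels. A minimal sketch (the numeric values are illustrative, not from the paper):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Rectified pinhole stereo: depth Z = f * B / d.
    focal_px: focal length in pixels; baseline_m: camera separation
    in metres; disparity_px: horizontal pixel shift between views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: f = 700 px, B = 0.12 m, d = 35 px  ->  Z = 2.4 m
z = depth_from_disparity(700.0, 0.12, 35.0)
```

The inverse relationship explains why a small baseline limits useful range: distant points produce tiny disparities, so small matching errors cause large depth errors.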
Citations: 2
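For a rectified stereo pair, the disparity-to-depth relation this abstract relies on is Z = f·B/d, where f is the focal length in pixels, B the camera baseline, and d the disparity in pixels. A minimal sketch follows; the focal-length and baseline values are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: triangulating depth from disparity for a rectified rig.

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Return depth Z = f * B / d in metres for a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: assumed 700 px focal length, 0.1 m baseline, 35 px disparity.
z = depth_from_disparity(35, 700.0, 0.1)
print(z)  # → 2.0 (metres)
```

Note the inverse relationship: halving the disparity doubles the estimated depth, which is why depth resolution degrades for distant objects.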
2014 International Conference on Contemporary Computing and Informatics (IC3I)