
2015 IEEE 2nd International Conference on Recent Trends in Information Systems (ReTIS): Latest Publications

Detection of macula and fovea for disease analysis in color fundus images
Dharitri Deka, J. Medhi, S. Nirmala
For the diagnosis of several retinal diseases, such as Diabetic Retinopathy (DR), Diabetic Macular Edema (DME), and Age-related Macular Degeneration (AMD), detection of the macula and fovea is considered an important prerequisite. If an abnormality such as haemorrhages or exudates falls over the macula, vision is severely affected, and at an advanced stage the patient may become blind. In this paper a new approach for detection of the macula and fovea is presented. The macula is localized by investigating the structure of the blood vessels (BV) in the macular region. The proposed method is tested on both normal and diseased images from the DRIVE, MESSIDOR, DIARETDB1, HRF, and STARE databases.
DOI: 10.1109/ReTIS.2015.7232883 (published 2015-07-09)
Citations: 22
A novel clustering strategy for fingerprinting-based localization system to reduce the searching time
Arka Saha, P. Sadhukhan
Location estimation is essential to the success of location-based services. Since GPS does not work well indoors and in dense urban areas, several indoor localization systems have been proposed in the literature. Among these, fingerprinting-based localization systems, involving a training phase and a positioning phase, are the most widely used. In the training phase, a radio map is constructed by collecting received signal strength (RSS) measurements at a set of known training locations. In the positioning phase, the training location whose stored RSS pattern best matches the currently observed RSS pattern is selected as the estimated location of the object. The positioning accuracy of such systems depends on the grain size of the training locations: better localization accuracy can be achieved by increasing the number of training locations, which in turn increases the comparison cost and the searching time in the positioning phase. Several clustering strategies have been proposed in the literature to reduce the searching time by grouping training locations into clusters, selecting the right cluster in the positioning phase, and then searching within the selected cluster to localize an object. However, selection of a wrong cluster degrades the positioning accuracy of the localization system. Thus, this paper aims at devising a novel clustering strategy that reduces the searching time without compromising the positioning accuracy.
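The two-phase scheme described in the abstract can be sketched in a few lines. The radio map, access-point count, and coordinates below are made-up toy values, and plain Euclidean nearest-neighbor matching stands in for whatever matching metric the paper actually uses:

```python
import math

# Toy radio map built in the training phase: location -> RSS vector (dBm)
# measured from three access points. All values are illustrative.
radio_map = {
    (0, 0): [-40, -70, -80],
    (0, 5): [-55, -60, -75],
    (5, 0): [-60, -72, -58],
    (5, 5): [-70, -55, -50],
}

def locate(observed_rss):
    """Positioning phase: return the training location whose stored RSS
    pattern is closest (Euclidean distance) to the observed pattern."""
    return min(radio_map, key=lambda loc: math.dist(radio_map[loc], observed_rss))

print(locate([-52, -62, -74]))  # closest to the pattern stored at (0, 5)
```

Clustering attacks the cost of this `min` over all training locations: match against a few cluster representatives first, then search only within the winning cluster.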
DOI: 10.1109/ReTIS.2015.7232937 (published 2015-07-09)
Citations: 20
Spine medical image fusion using wiener filter in shearlet domain
Biswajit Biswas, A. Chakrabarti, K. Dey
Medical image fusion combines the functional and anatomical structures captured by different imaging modalities, such as Computed Tomography (CT) and Magnetic Resonance Imaging (MRI). In spine imaging, CT and MR provide complementary information that assists diagnostic and therapeutic decisions. Spine medical image fusion is thus an essential technique that integrates the anatomical detail of the CT image and the functional information of the MR image into a single fused image rich in both. This paper proposes a spine medical image fusion method using a Wiener filter (WF) in the shearlet domain. The shearlet transform (ST) produces shearlet subbands from the CT and MR source images. A dedicated fusion strategy is devised for the lowpass ST subbands, and the processing of the highpass ST subbands is treated in detail. Finally, the fused image is obtained by the inverse shearlet transform (IST). Simulation and experimental results on spine images, evaluated against several well-known techniques on standard quality-assessment indexes, demonstrate the merit of the proposed technique.
DOI: 10.1109/ReTIS.2015.7232910 (published 2015-07-09)
Citations: 12
A new adaptive Cuckoo search algorithm
N. M. Kumar, Maheshwari Rashmita Nath, Aneesh Wunnava, Siddharth Sahany, Rutuparna Panda
This paper presents a new adaptive Cuckoo search (ACS) algorithm based on Cuckoo search (CS) for optimization. The main thrust is to decide the step size adaptively from the fitness value, without using the Levy distribution. A further aim is to enhance performance in terms of both convergence time and the global minimum reached. The performance of ACS on standard benchmark functions shows that the proposed algorithm converges to the best solution in less time than Cuckoo search.
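A minimal sketch of the idea of fitness-driven step sizes. The exact adaptation rule of the paper is not reproduced here; the rule below, where fitter nests take smaller steps and all steps shrink with the iteration count, is only illustrative, as are the greedy replacement and the sphere benchmark function:

```python
import random

def sphere(x):
    """Standard benchmark function: global minimum 0 at the origin."""
    return sum(v * v for v in x)

def adaptive_cuckoo(dim=2, n_nests=15, iters=200, seed=1):
    rng = random.Random(seed)
    nests = [[rng.uniform(-2, 2) for _ in range(dim)] for _ in range(n_nests)]
    for t in range(1, iters + 1):
        fits = [sphere(n) for n in nests]
        best, worst = min(fits), max(fits)
        for i, nest in enumerate(nests):
            # Adaptive step: scaled by the nest's normalized fitness and
            # shrinking as 1/t -- no Levy flights involved.
            rank = 0.0 if worst == best else (fits[i] - best) / (worst - best)
            step = (0.1 + rank) / t
            candidate = [v + step * rng.gauss(0, 1) for v in nest]
            if sphere(candidate) < fits[i]:   # greedy replacement
                nests[i] = candidate
    return min(nests, key=sphere)

print(sphere(adaptive_cuckoo()))  # best fitness found
```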
DOI: 10.1109/ReTIS.2015.7232842 (published 2015-07-09)
Citations: 53
An efficient periodic web content recommendation based on web usage mining
Ravi Khatri, D. Gupta
Nowadays the use of the internet has increased tremendously, so providing information relevant to a user at a particular time is a very important task. Periodic web personalization is the process of recommending the most relevant information to users at the right time. In this paper we propose an improved personalized web recommender model, which considers not only user-specific activities but also other factors related to websites, such as the total number of visitors, the number of unique visitors, the number of users downloading data, the amount of data downloaded, the amount of data uploaded, and the number of advertisements for a particular URL, to provide a better result. The model considers the user's web access activities to extract usage behavior and build a knowledge base; the knowledge base, together with the factors specified above, is then used to predict user-specific content. This advance computation over resources helps the user access the required information more efficiently and effectively.
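The combination of user-specific activity with site-wide factors could be expressed as a weighted score. The factor names, weights, and the sign given to advertisements below are all hypothetical, chosen only to illustrate the kind of ranking such a model produces:

```python
# Site-wide statistics per URL (hypothetical numbers).
url_stats = {
    "news.html":  {"unique_visitors": 300, "downloads": 5,  "ads": 40},
    "guide.html": {"unique_visitors": 120, "downloads": 30, "ads": 10},
}
# Per-user access counts extracted by web usage mining (hypothetical).
user_visits = {"news.html": 3, "guide.html": 12}

def score(url, w_user=0.6, w_site=0.4):
    """Mix personal usage with site-wide engagement; weights are illustrative."""
    s = url_stats[url]
    # Engagement raises the site-wide term; heavy advertising lowers it.
    site_term = s["unique_visitors"] + s["downloads"] - s["ads"]
    return w_user * user_visits[url] + w_site * site_term

ranked = sorted(url_stats, key=score, reverse=True)
print(ranked)
```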
DOI: 10.1109/ReTIS.2015.7232866 (published 2015-07-09)
Citations: 3
Hand gesture recognition of English alphabets using artificial neural network
Sourav Bhowmick, Sushant Kumar, Anurag Kumar
Human computer interaction (HCI) and sign language recognition (SLR), aimed at creating virtual reality and 3D gaming environments, helping deaf-and-mute people, and so on, extensively exploit hand gestures. Segmentation of the hand from the other body parts and the background is the primary need of any hand-gesture-based application system; but gesture recognition systems are often plagued by various segmentation problems, as well as by co-articulation, movement epenthesis, recognition of similar gestures, etc. The principal objective of this paper is to address a few of these problems. We propose a method for recognizing isolated as well as continuous English alphabet gestures, a step towards helping and educating hearing- and speech-impaired people. Classification of the gestures is performed with an artificial neural network. The recognition rate (RR) for isolated gestures is found to be 92.50%, while that for continuous gestures is 89.05% with a multilayer perceptron and 87.14% with a focused time-delay neural network. Compared with other such systems in the literature, these results show the effectiveness of the system.
DOI: 10.1109/ReTIS.2015.7232913 (published 2015-07-09)
Citations: 9
An efficient DCT based image watermarking using RGB color space
Priyanka, Sushila Maheshkar
With the advent of various image processing tools, images can easily be counterfeited or corrupted. Digital image watermarking has emerged as an important tool for the protection and authentication of digital multimedia content. This paper presents a robust DCT-based blind digital watermarking scheme for still color images. The RGB color space is used to decompose the color cover image into three channels, each of which can be treated as a gray-scale image. Experimental results show that the watermarked image sustains good visual quality even after attacks. The proposed technique is better in terms of payload and imperceptibility than available counterparts.
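As a sketch of the general idea (not the paper's exact scheme), one watermark bit can be embedded blindly by quantizing a mid-band DCT coefficient of an 8x8 channel block; the coefficient position (2, 3) and the quantization step `q` below are arbitrary illustrative choices:

```python
import math

def _alpha(k, n):
    return math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)

def dct2(block):
    """Naive orthonormal 2-D DCT-II of an N x N block."""
    n = len(block)
    return [[_alpha(u, n) * _alpha(v, n) * sum(
        block[x][y]
        * math.cos((2 * x + 1) * u * math.pi / (2 * n))
        * math.cos((2 * y + 1) * v * math.pi / (2 * n))
        for x in range(n) for y in range(n))
        for v in range(n)] for u in range(n)]

def idct2(coeffs):
    """Inverse of dct2 (orthonormal, so same cosine kernel)."""
    n = len(coeffs)
    return [[sum(
        _alpha(u, n) * _alpha(v, n) * coeffs[u][v]
        * math.cos((2 * x + 1) * u * math.pi / (2 * n))
        * math.cos((2 * y + 1) * v * math.pi / (2 * n))
        for u in range(n) for v in range(n))
        for y in range(n)] for x in range(n)]

def embed_bit(block, bit, q=16.0):
    """Force the parity of the quantized mid-band coefficient to carry `bit`."""
    c = dct2(block)
    c[2][3] = (2 * round(c[2][3] / (2 * q)) + bit) * q
    return idct2(c)

def extract_bit(block, q=16.0):
    """Blind extraction: recover the bit from the coefficient's parity alone."""
    return int(round(dct2(block)[2][3] / q)) % 2
```

Per the abstract, the same embedding would be applied per RGB channel, each treated as a gray-scale image.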
DOI: 10.1109/ReTIS.2015.7232881 (published 2015-07-09)
Citations: 6
Fast and efficient compressive sensing for wideband Cognitive Radio systems
Naveen Kumar, Neetu Sood
This paper presents a Compressive Spectrum Sensing (CSS) technique for wideband Cognitive Radio (CR) systems that shortens the spectrum sensing interval. Fast and efficient CSS is used to detect the wideband spectrum: samples are taken at a sub-Nyquist rate, and signal acquisition terminates automatically once the samples suffice for the best spectral recovery. To improve sensing performance, we propose a new approach to the sparsifying basis in the context of CSS, based on the Empirical Wavelet Transform (EWT), which adapts to the processed signal spectrum. Simulation results show that the proposed fast and efficient EWT CSS scheme outperforms conventional Discrete Fourier Transform (DFT) and Discrete Cosine Transform (DCT) based schemes in terms of sensing time, detection probability, system throughput, and robustness to noise.
DOI: 10.1109/ReTIS.2015.7232858 (published 2015-07-09)
Citations: 3
Discovering association rules partially devoid of dissociation by weighted confidence
Subrata Datta, Subrata Bose
Forming useful rules from frequent itemsets in a large database is a crucial task in Association Rule mining. Traditionally, association rules refer to associations between two frequent sets and are measured by their confidence. This approach concentrates on positive associations and therefore does not study the effect of dissociation and null transactions. Although the effect of dissociation has been studied in association mining, the impact of null transactions has generally been ignored. Some scholars have identified both positive and negative rules and thereby studied the impact of null transactions, but there is no uniform treatment for including null transactions in either the positive or the negative category. We try to bridge these gaps. We establish a uniform approach to mining association rules by combining the effect of all kinds of transactions in the rules, without categorizing them as positive or negative. We propose to identify frequent sets by weighted support in lieu of support, and to measure rules by weighted confidence in lieu of confidence, for useful positive rule generation, handling negativity through dissociation and a Null Transaction Impact Factor. We show that the weighted-support, weighted-confidence approach increases the chance of discovering rules that are less dissociated than under the traditional support-confidence framework, provided the same levels of minsupp and minconf are maintained in both cases.
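The classical support-confidence computation, plus a hypothetical null-transaction discount, can be illustrated on a toy transaction database. The discount below is our own stand-in, not the paper's weighted-confidence formula:

```python
# Toy transaction database; the last entry is a null transaction with respect
# to the rule's items -- it contains none of them.
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"milk", "butter"},
    {"bread", "milk", "butter"},
    {"eggs"},
]

def support(itemset):
    """Fraction of transactions containing every item of `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(lhs, rhs):
    """Classical confidence of the rule lhs -> rhs."""
    return support(lhs | rhs) / support(lhs)

def weighted_confidence(lhs, rhs):
    """Stand-in for the paper's measure: discount confidence by the share of
    transactions touching the rule's items at all, so null transactions
    cannot inflate the rule's apparent strength."""
    items = lhs | rhs
    active = sum(bool(items & t) for t in transactions) / len(transactions)
    return confidence(lhs, rhs) * active
```

For bread -> milk above, confidence is (2/5)/(3/5) = 2/3, and the discounted value is 2/3 times 4/5, since one of the five transactions touches neither item.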
DOI: 10.1109/ReTIS.2015.7232867 (published 2015-07-09)
Citations: 12
Workload characteristics and resource aware Hadoop scheduler
M. Divya, B. Annappa
Hadoop MapReduce is one of the most widely used platforms for large-scale data processing. A Hadoop cluster has machines with different resources, including memory size, CPU capability, and disk space. This introduces the challenging research issue of improving Hadoop's performance through proper resource provisioning. The work presented in this paper focuses on optimizing job scheduling in Hadoop. A Workload Characteristic and Resource Aware (WCRA) Hadoop scheduler is proposed that classifies jobs as CPU bound or disk I/O bound. Based on performance, nodes in the cluster are classified as CPU busy or disk I/O busy. The amount of primary memory available on a node is ensured to be more than 25% before a job is scheduled on it. Performance parameters of Map tasks, such as the time required to parse the data, map, sort, and merge the result, and of Reduce tasks, such as the time to merge, parse, and reduce, are considered to categorize a job as CPU bound or disk I/O bound. Tasks are assigned priority based on their minimum Estimated Completion Time. Jobs are scheduled on a compute node in such a way that jobs already running on it are not affected. Experimental results show a 30% improvement in performance compared to Hadoop's FIFO, Fair, and Capacity schedulers.
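The placement logic the abstract describes can be sketched as follows. The field names, utilization numbers, and the simple larger-phase-time classification rule are illustrative; only the 25% free-memory floor is taken from the abstract:

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    cpu_time: float   # measured parse/map/sort/merge compute time
    io_time: float    # measured read/spill/merge I/O time

@dataclass
class Node:
    name: str
    cpu_busy: float   # utilization in [0, 1]
    disk_busy: float
    free_mem: float   # fraction of primary memory free
    eta: float        # estimated completion time of already queued work

def classify(job):
    """CPU bound if compute phases dominate, else disk I/O bound."""
    return "cpu" if job.cpu_time > job.io_time else "io"

def place(job, nodes):
    # Require more than 25% free primary memory before scheduling.
    eligible = [n for n in nodes if n.free_mem > 0.25]
    load = (lambda n: n.cpu_busy) if classify(job) == "cpu" else (lambda n: n.disk_busy)
    # Prefer the node least loaded on the job's bottleneck resource,
    # breaking ties by the minimum estimated completion time.
    return min(eligible, key=lambda n: (load(n), n.eta))

nodes = [
    Node("n1", cpu_busy=0.9, disk_busy=0.2, free_mem=0.5, eta=10.0),
    Node("n2", cpu_busy=0.2, disk_busy=0.8, free_mem=0.6, eta=5.0),
    Node("n3", cpu_busy=0.1, disk_busy=0.1, free_mem=0.1, eta=1.0),  # too little memory
]
print(place(Job("sort-heavy", cpu_time=40.0, io_time=10.0), nodes).name)  # n2
```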
DOI: 10.1109/ReTIS.2015.7232871 (published 2015-07-09)
Citations: 10