
Latest publications in the Journal of Artificial Intelligence and Data Mining

Interval Type-II Fuzzy H∞ Frequency Control for an Island Microgrid
Pub Date : 2020-01-01 DOI: 10.22044/JADM.2019.6982.1825
F. Sabahi
Frequency control is one of the key elements in managing the performance of a microgrid (MG) system. Theoretically, model-based controllers may be the ideal control mechanisms; however, they are highly sensitive to model uncertainties and have difficulty preserving robustness. The presence of serious disturbances, the increasing number of MGs, the varying voltage supplies of MGs, and both the independent operation of MGs and their interaction with the main grid make the design of model-based frequency controllers for MGs inherently challenging. This paper proposes an approach that takes advantage of interval Type-II fuzzy logic for modeling an MG system in the process of its robust H∞ frequency control. Specifically, the main contribution of this paper is that the parameters of the MG system are modeled by an interval Type-II fuzzy system (IT2FS), while the MG simultaneously handles perturbations using an H∞ index to control its frequency. The performance of the microgrid equipped with the proposed modeling and controller is then compared with other controllers, such as H2 and μ-synthesis, under changes in the microgrid parameters and perturbations. The comparison shows the superiority and effectiveness of the proposed approach in terms of robustness against uncertainties in the modeling parameters and perturbations.
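The interval Type-II machinery the paper builds on can be illustrated with a minimal sketch: a type-1 triangular membership function is widened into a footprint of uncertainty bounded by lower and upper membership functions. The triangular shape and the `spread` parameter are illustrative assumptions only, not the paper's actual MG model or H∞ controller:

```python
def tri(x, a, b, c):
    """Triangular type-1 membership: rises from a to the peak at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def it2_membership(x, a, b, c, spread):
    """Interval type-2 membership of x: the footprint of uncertainty is
    bounded by an upper MF with a widened support and a lower MF with a
    narrowed one. `spread` (hypothetical) controls the uncertainty band."""
    upper = tri(x, a - spread, b, c + spread)
    lower = tri(x, a + spread, b, c - spread)
    return lower, upper
```

For any input, the membership is the whole interval `[lower, upper]` rather than a single grade, which is how IT2FS models parameter uncertainty.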
Citations: 0
Capturing Outlines of Planar Generic Images by Simultaneous Curve Fitting and Sub-division
Pub Date : 2020-01-01 DOI: 10.22044/JADM.2019.6727.1788
A. Ebrahimi, G. B. Loghmani, M. Sarfraz
In this paper, a new technique is designed to capture the outline of 2D shapes using cubic Bézier curves. The proposed technique avoids the traditional method of optimizing a global squared fitting error and emphasizes local control of the data points. A maximum-error threshold keeps the absolute fitting error below a given criterion and governs the process of curve subdivision. Depending on the specified maximum error, the proposed technique itself subdivides complex segments, and curve fitting is performed simultaneously. A comparative study of experimental results highlights various advantages of the proposed technique, such as accurate representation, low approximation errors, and low computational complexity.
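The evaluate/check-error/subdivide loop described above can be sketched as follows. The control-point placement on the chord is a deliberately naive stand-in for the paper's local fitting; only the structure (fit a segment, subdivide where the maximum error exceeds the threshold) is meant to match:

```python
import math

def bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]."""
    s = 1.0 - t
    return tuple(s**3 * a + 3*s**2*t * b + 3*s*t**2 * c + t**3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

def max_fit_error(points, ctrl):
    """Max distance between uniformly parameterized data points and the
    fitted curve -- the criterion that drives subdivision."""
    n = len(points) - 1
    return max(math.dist(p, bezier(*ctrl, i / n))
               for i, p in enumerate(points))

def fit_outline(points, tol):
    """Recursively subdivide the point sequence until every segment's
    naive Bezier fit (control points placed on the chord -- a placeholder
    for the paper's local fitting) meets the error tolerance."""
    p0, p3 = points[0], points[-1]
    p1 = tuple(a + (b - a) / 3 for a, b in zip(p0, p3))
    p2 = tuple(a + 2 * (b - a) / 3 for a, b in zip(p0, p3))
    if len(points) <= 3 or max_fit_error(points, (p0, p1, p2, p3)) <= tol:
        return [(p0, p1, p2, p3)]
    mid = len(points) // 2  # split at the midpoint and fit each half
    return fit_outline(points[:mid + 1], tol) + fit_outline(points[mid:], tol)
```

A straight run of points stays a single segment, while a curved run is split until each piece satisfies the tolerance.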
Citations: 3
Salt and Pepper Noise Removal using Pixon-based Segmentation and Adaptive Median Filter
Pub Date : 2020-01-01 DOI: 10.22044/JADM.2019.7921.1930
S. A. Amiri
Removing salt-and-pepper noise is an active research area in image processing. In this paper, a two-phase method is proposed for removing salt-and-pepper noise while preserving edges and fine details. In the first phase, noise-candidate pixels, which are likely to be contaminated by noise, are detected. In the second phase, only the noise-candidate pixels are restored using an adaptive median filter. For noise detection, a two-stage method is utilized. First, thresholding is applied to the image to obtain an initial estimate of the noise-candidate pixels. Since some pixels in the image may resemble salt-and-pepper noise, these pixels can be mistakenly identified as noise. Hence, in the second step of noise detection, pixon-based segmentation is used to identify the salt-and-pepper noise pixels more accurately. A pixon is a group of neighboring pixels with similar gray levels. The proposed method was evaluated on several noisy images, and the results show that it removes salt-and-pepper noise accurately and outperforms several existing methods.
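The detect-then-restore structure can be sketched as follows. Detection here simply flags the extreme gray levels 0 and 255, a simplified stand-in for the paper's thresholding plus pixon-based segmentation; restoration then applies an adaptive median that grows its window until it finds clean neighbors:

```python
def adaptive_median_denoise(img, max_win=7):
    """Two-phase salt-and-pepper removal sketch: pixels at the extreme
    gray levels (0 or 255) are treated as noise candidates; only those
    are replaced, by the median of noise-free neighbors, growing the
    window until at least one clean neighbor is found."""
    h, w = len(img), len(img[0])
    noisy = [[img[i][j] in (0, 255) for j in range(w)] for i in range(h)]
    out = [row[:] for row in img]
    for i in range(h):
        for j in range(w):
            if not noisy[i][j]:
                continue  # clean pixels pass through untouched
            for r in range(1, max_win // 2 + 1):
                clean = [img[a][b]
                         for a in range(max(0, i - r), min(h, i + r + 1))
                         for b in range(max(0, j - r), min(w, j + r + 1))
                         if not noisy[a][b]]
                if clean:
                    clean.sort()
                    out[i][j] = clean[len(clean) // 2]
                    break
    return out
```

Because only flagged pixels are rewritten, edges and fine details in uncorrupted regions are preserved, which is the point of the two-phase design.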
Citations: 2
Image Segmentation using Improved Imperialist Competitive Algorithm and a Simple Post-processing
Pub Date : 2019-11-01 DOI: 10.22044/JADM.2019.3935.1464
V. Naghashi, S. Lotfi
Image segmentation is a fundamental step in many image processing applications. In most cases, the image's pixels are clustered based only on their intensity or color information, and neither spatial nor neighborhood information is used in the clustering process. Including the spatial information of pixels, which improves the quality of image segmentation, and using the information of neighboring pixels enhance the accuracy of segmentation. In this paper, the idea of combining the K-means algorithm and the Improved Imperialist Competitive algorithm is proposed. Before applying the hybrid algorithm, a new image is created, and the hybrid algorithm is then employed on it. Finally, a simple post-processing step is applied to the clustered image. Comparing the results of the proposed method on different images with those of other methods shows that, in most cases, the accuracy of the NLICA algorithm is better than that of the other methods.
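Only the K-means component of the hybrid is easy to sketch compactly; the Imperialist Competitive search and the spatial/neighborhood terms the paper adds are omitted here, so this shows just the baseline intensity clustering being improved upon:

```python
import random

def kmeans_1d(values, k, iters=20, seed=0):
    """Plain K-means on pixel intensities: assign each value to the
    nearest center, then recompute centers as cluster means."""
    centers = sorted(random.Random(seed).sample(values, k))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda c: abs(v - centers[c]))
            clusters[idx].append(v)
        # keep a center unchanged if its cluster went empty
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers
```

In the paper's setting, each pixel would then take the label of its nearest final center to produce the segmented image.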
Citations: 0
Morphological Exudate Detection in Retinal Images using PCA-based Optic Disc Removal
Pub Date : 2019-11-01 DOI: 10.22044/JADM.2019.1488
J. Darvish, M. Ezoji
Detecting diabetic retinopathy lesions, such as exudates, in fundus images of the retina can lead to early diagnosis of the disease. A retinal image includes dark areas, such as the main blood vessels and retinal tissue, as well as bright areas, such as the optic disc, optical fibers, and lesions (e.g. exudates). In this paper, a multistage algorithm for the detection of exudates in the foreground is proposed. The algorithm segments the dark background areas in the proper channels of the RGB color space using morphological processing such as closing, opening, and top-hat operations. An appropriate edge detector then discriminates between exudates and cotton-like spots or other artificial effects. To tackle the problem of optical fibers and to discriminate between this brightness and exudates, the main vessels are first detected from the green channel of the RGB color space. The optical fiber areas around the vessels are then marked up. An algorithm that uses a PCA-based reconstruction error is proposed to discard another bright fundus structure, the optic disc. Several experiments have been performed on the HEI-MED standard database and evaluated by comparison with ground-truth images. These results show that the proposed algorithm has a detection accuracy of 96%.
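One of the morphological operations named above, the white top-hat, can be sketched with plain grayscale erosion and dilation. This is a generic illustration of the operator (square structuring element, clipped borders), not the paper's full pipeline:

```python
def _window(img, i, j, r):
    """Pixels of img inside a (2r+1)x(2r+1) window clipped at borders."""
    h, w = len(img), len(img[0])
    return [img[a][b]
            for a in range(max(0, i - r), min(h, i + r + 1))
            for b in range(max(0, j - r), min(w, j + r + 1))]

def erode(img, r=1):
    """Grayscale erosion: each pixel becomes its neighborhood minimum."""
    return [[min(_window(img, i, j, r)) for j in range(len(img[0]))]
            for i in range(len(img))]

def dilate(img, r=1):
    """Grayscale dilation: each pixel becomes its neighborhood maximum."""
    return [[max(_window(img, i, j, r)) for j in range(len(img[0]))]
            for i in range(len(img))]

def white_top_hat(img, r=1):
    """Image minus its opening (erosion then dilation): bright details
    smaller than the structuring element -- exudate candidates -- remain."""
    opened = dilate(erode(img, r), r)
    return [[img[i][j] - opened[i][j] for j in range(len(img[0]))]
            for i in range(len(img))]
```

A small bright spot survives the top-hat while a large bright plateau is suppressed, which is why the operator highlights lesion-sized structures.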
Citations: 4
Depth Improvement for FTV Systems Based on the Gradual Omission of Outliers
Pub Date : 2019-11-01 DOI: 10.22044/JADM.2019.7278.1864
H. Hosseinpour, Seyed A. Moosavie nia, M. Pourmina
Virtual view synthesis is an essential part of computer vision and 3D applications. A high-quality depth map is the main challenge in virtual view synthesis, because the resolution of the depth image is low compared to that of the corresponding color image. In this paper, an efficient and reliable method based on the gradual omission of outliers is proposed to compute reliable depth values. In the proposed method, depth values that are far from the mean of the depth values are omitted gradually. In comparison with other state-of-the-art methods, simulation results show that, on average, the PSNR is 2.5 dB (8.1%) higher, the SSIM is 0.028 (3%) higher, the UNIQUE measure is 0.021 (2.4%) higher, the running time is 8.6 s (6.1%) shorter, and the number of wrong pixels is 1.97 (24.8%) lower.
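The "gradual omission" idea can be sketched as an iterative trim: repeatedly drop the single depth candidate farthest from the mean until the remainder is consistent. The k-sigma stopping rule and `min_keep` floor are illustrative assumptions, not the paper's exact omission schedule:

```python
import statistics

def reliable_depth(depths, k=1.0, min_keep=3):
    """Gradually omit depth candidates far from the mean until all
    remaining values lie within k standard deviations, then return
    the mean of the survivors as the reliable depth value."""
    vals = sorted(depths)
    while len(vals) > min_keep:
        mu = statistics.fmean(vals)
        sd = statistics.pstdev(vals)
        worst = max(vals, key=lambda v: abs(v - mu))
        if sd == 0 or abs(worst - mu) <= k * sd:
            break  # every remaining value is consistent with the mean
        vals.remove(worst)  # drop the farthest value, then re-check
    return statistics.fmean(vals)
```

Dropping one value at a time (rather than all outliers at once) lets the mean and spread re-settle after each omission, which is the "gradual" part.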
Citations: 0
Prediction and Diagnosis of Diabetes Mellitus Using a Water Wave Optimization Algorithm
Pub Date : 2019-11-01 DOI: 10.22044/JADM.2018.6446.1758
S. T. Dehkordi, A. K. Bardsiri, M. Zahedi
Data mining is an appropriate way to discover information and hidden patterns in large amounts of data, where the hidden patterns cannot easily be discovered by ordinary means. One of the most interesting applications of data mining is the discovery of diseases and disease patterns by investigating patients' records. Early diagnosis of diabetes can reduce the effects of this devastating disease. A common way to diagnose the disease is a blood test, which, despite its high precision, has disadvantages such as pain, cost, patient stress, and lack of access to a laboratory. Diabetic patients' information contains hidden patterns that can help assess the risk of diabetes in individuals without performing any blood tests. Using neural networks, as powerful data mining tools, is an appropriate method for discovering hidden patterns in diabetic patients' information. In this paper, in order to discover these hidden patterns and diagnose diabetes, a water wave optimization (WWO) algorithm, a precise metaheuristic, was used along with a neural network to increase the precision of diabetes prediction. The results of our implementation in the MATLAB programming environment, using a diabetes-related dataset, indicate that the proposed method diagnosed diabetes with a precision of 94.73%, sensitivity of 94.20%, specificity of 93.34%, and accuracy of 95.46%, and was more sensitive than methods such as support vector machines, artificial neural networks, and decision trees.
Citations: 9
Entropy-based Consensus for Distributed Data Clustering
Pub Date : 2019-11-01 DOI: 10.22044/JADM.2018.4237.1514
M. Owhadi-Kareshki, M. Akbarzadeh-T.
The increasingly large scale of available data and the more restrictive concerns about its privacy are among the challenging aspects of data mining today. In this paper, Entropy-based Consensus on Cluster Centers (EC3) is introduced for clustering in distributed systems with consideration for the confidentiality of data; i.e. it is the negotiations among local cluster centers that are used in the consensus process, hence no private data are transferred. With the proposed use of entropy as an internal measure of consensus clustering validation at each machine, the cluster centers of local machines with higher expected clustering validity have more influence on the final consensus centers. We also employ the relative cost function of the local Fuzzy C-Means (FCM) and the number of data points in each machine as measures of a machine's relative validity, compared to other machines, and of its reliability, respectively. The utility of the proposed consensus strategy is examined on 18 datasets from the UCI repository in terms of clustering accuracy and speed-up against the centralized version of FCM. Several experiments confirm that the proposed approach yields higher speed-up and accuracy while maintaining data security due to its protected and distributed processing approach.
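The core idea, combining only local centers with entropy-derived weights so no raw data leaves a machine, can be sketched as below. Treating lower entropy of a machine's membership distribution as higher clustering validity, and the exponential weighting, are illustrative assumptions standing in for EC3's exact formulation:

```python
import math

def entropy(membership):
    """Shannon entropy of a machine's normalized cluster-membership
    distribution; a crisper (lower-entropy) clustering is read here as
    a more valid local result."""
    return -sum(p * math.log(p) for p in membership if p > 0)

def consensus_centers(local_centers, memberships, counts):
    """Weighted consensus: each machine's center is weighted by its
    data count and down-weighted by its clustering entropy. Only the
    centers and summary statistics are exchanged, never raw data."""
    weights = [n * math.exp(-entropy(m))
               for n, m in zip(counts, memberships)]
    total = sum(weights)
    dim = len(local_centers[0])
    return tuple(sum(w * c[d] for w, c in zip(weights, local_centers)) / total
                 for d in range(dim))
```

A machine with a crisp membership distribution pulls the consensus center toward its own estimate more strongly than one whose clustering is ambiguous.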
Citations: 2
A Novel Architecture for Detecting Phishing Webpages using Cost-based Feature Selection
Pub Date : 2019-11-01 DOI: 10.22044/JADM.2019.7183.1852
A. Zangooei, V. Derhami, F. Jamshidi
Phishing is one of the luring techniques used to exploit personal information. A phishing webpage detection system (PWDS) extracts features to determine whether a webpage is a phishing page or not. Selecting appropriate features improves the performance of a PWDS; the performance criteria are detection accuracy and system response time. Most of the time consumed by a PWDS arises from feature extraction, which is treated as the feature cost in this paper. Here, two novel features are proposed. They use a semantic similarity measure to determine the relationship between the content and the URL of a page. Since the suggested features do not rely on third-party services such as search-engine results, the feature extraction time decreases dramatically. A login-form pre-filter is utilized to reduce unnecessary calculations and the false positive rate. In this paper, a cost-based feature selection is presented to choose the most effective features. The selected features are employed in the suggested PWDS, and the extreme learning machine algorithm is used to classify webpages. The experimental results demonstrate that the suggested PWDS achieves a high accuracy of 97.6% and a short average detection time of 120.07 milliseconds.
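The URL-to-content relationship feature can be illustrated with a simple lexical stand-in: Jaccard overlap between URL tokens and page tokens. The paper uses a semantic similarity measure, so this token-overlap version is an assumption chosen only to show the shape of the feature:

```python
import re

def tokens(text):
    """Lowercased alphanumeric tokens; URL separators become spaces."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def url_content_similarity(url, page_text):
    """Jaccard overlap between URL tokens and page-content tokens.
    A legitimate page tends to score higher than a phishing page whose
    URL is unrelated to the brand terms in its own content."""
    u, p = tokens(url), tokens(page_text)
    return len(u & p) / len(u | p) if u | p else 0.0
```

No third-party service is queried, which is the property that keeps the extraction time low.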
Citations: 0
Analytical evaluation of an innovative decision-making algorithm for VM live migration
Pub Date : 2019-11-01 DOI: 10.22044/JADM.2018.7178.1847
Majid Tajamolian, M. Ghasemzadeh
To achieve virtual machine live migration, two strategies, "pre-copy" and "post-copy", have been presented. Depending on the operating conditions of the machine, either strategy may perform better than the other. In this article, a new algorithm is presented that automatically decides how the virtual machine live migration takes place. In this approach, the virtual machine memory is considered an informational object that has a revision number and is constantly changing. We have determined precise criteria for evaluating the behavior of a virtual machine and automatically selecting the appropriate live migration strategy. This article also considers different aspects of the required simulations and implementations. Analytical evaluation shows that the proposed scheme and algorithm can significantly improve the virtual machine live migration process.
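The abstract does not give the decision rule itself. A minimal sketch of one plausible rule, assuming the revision number tracks write activity and the choice hinges on page-dirty rate versus link bandwidth — the `safety` factor and the rule itself are illustrative assumptions, not the paper's algorithm:

```python
from dataclasses import dataclass

@dataclass
class MemorySnapshot:
    """VM memory viewed as an informational object with a revision number."""
    revision: int       # incremented whenever tracked writes occur
    dirty_pages: int    # pages modified since the previous snapshot
    timestamp: float    # seconds

def choose_strategy(prev, curr, bandwidth_pages_per_s, safety=0.7):
    """Pick a live-migration strategy from two memory revisions.

    If pages are dirtied faster than the link can ship them, iterative
    pre-copy will not converge, so post-copy is preferred; otherwise
    pre-copy keeps the source copy authoritative and is the safer default.
    """
    dt = curr.timestamp - prev.timestamp
    if dt <= 0 or curr.revision == prev.revision:
        return "pre-copy"  # no observed write activity between revisions
    dirty_rate = curr.dirty_pages / dt
    if dirty_rate > safety * bandwidth_pages_per_s:
        return "post-copy"
    return "pre-copy"
```

Thresholding against a fraction of the measured bandwidth, rather than the full bandwidth, leaves headroom for the final stop-and-copy round under pre-copy.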
{"title":"Analytical evaluation of an innovative decision-making algorithm for VM live migration","authors":"Majid Tajamolian, M. Ghasemzadeh","doi":"10.22044/JADM.2018.7178.1847","DOIUrl":"https://doi.org/10.22044/JADM.2018.7178.1847","url":null,"abstract":"In order to achieve the virtual machines live migration, the two \"pre-copy\" and \"post-copy\" strategies are presented. Each of these strategies, depending on the operating conditions of the machine, may perform better than the other. In this article, a new algorithm is presented that automatically decides how the virtual machine live migration takes place. In this approach, the virtual machine memory is considered as an informational object that has a revision number and it is constantly changing. We have determined precise criteria for evaluating the behavior of a virtual machine and automatically select the appropriate live migration strategy. Also in this article, different aspects of required simulations and implementations are considered. Analytical evaluation shows that using the proposed scheme and the presented algorithm, can significantly improve the virtual machines live migration process.","PeriodicalId":32592,"journal":{"name":"Journal of Artificial Intelligence and Data Mining","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2019-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41462037","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
Journal: Journal of Artificial Intelligence and Data Mining