
Latest publications from the 2015 Eighth International Conference on Contemporary Computing (IC3)

Smartphone-based colorimetric detection to measure Blood Glucose Levels
Pub Date : 2015-08-20 DOI: 10.1109/IC3.2015.7346691
Sarthak Singhal, P. Ralhan, Nishtha Jatana
Diabetes Mellitus is a hazardous disease plaguing many people around the world and is caused by high Blood Glucose Levels (BGL). Patients with diabetes need to check their BGL regularly in order to stay safe. Through this paper, we aim to provide a medical image processing system based on colorimetric detection, particularly for diabetic patients, that measures blood glucose levels by scanning an image of a visual glucose test strip with a smartphone. The proposed technique measures the color change of the scanned test strip using HSV and CIELAB color-model based processing, in which the concepts of color gradient and the CIEDE2000 color-difference formula are applied. The technique was implemented in an Android-based environment and tested using Accu-Chek Active visual test strips along with the corresponding reference color chart. The experimental results show that the technique has promising capability to measure BGL in humans. Further, the technique can also be applied to other visual medical test strips, or even to other measurements that are calculated using a color metric.
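As an illustrative sketch of the colour-comparison step (not the paper's Android implementation), the following Python snippet matches a scanned strip patch against a reference chart using the CIEDE2000 formula via scikit-image; the chart values and the estimate_bgl helper are hypothetical.

```python
# Minimal sketch: match a scanned test-strip patch to the closest reference
# colour using the CIEDE2000 colour difference in CIELAB space.
# Assumes scikit-image and NumPy; chart values below are illustrative only.
import numpy as np
from skimage.color import rgb2lab, deltaE_ciede2000

# Hypothetical reference chart: mean RGB of each printed colour block
# and the blood glucose level (mg/dL) it encodes.
REFERENCE_CHART = {
    0:   (0.95, 0.95, 0.60),
    50:  (0.75, 0.85, 0.55),
    100: (0.55, 0.70, 0.50),
    200: (0.35, 0.55, 0.45),
    300: (0.20, 0.40, 0.40),
}

def estimate_bgl(strip_rgb_patch):
    """Return the glucose level whose chart colour is closest (CIEDE2000)."""
    # Mean colour of the reacted pad, converted to CIELAB.
    patch_lab = rgb2lab(strip_rgb_patch.reshape(1, 1, 3).astype(float))
    best_level, best_de = None, np.inf
    for level, rgb in REFERENCE_CHART.items():
        ref_lab = rgb2lab(np.array(rgb, dtype=float).reshape(1, 1, 3))
        de = deltaE_ciede2000(patch_lab, ref_lab)[0, 0]
        if de < best_de:
            best_level, best_de = level, de
    return best_level, best_de

# Example: an olive-green pad colour maps to one of the chart levels.
level, de = estimate_bgl(np.array([0.52, 0.68, 0.49]))
print(f"Estimated BGL: {level} mg/dL (deltaE2000 = {de:.2f})")
```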
Citations: 12
Prediction of click frauds in mobile advertising
Pub Date : 2015-08-20 DOI: 10.1109/IC3.2015.7346672
Mayank Taneja, Kavyanshi Garg, Archana Purwar, Samarth Sharma
Click fraud represents a serious drain on advertising budgets and can seriously harm the viability of the internet advertising market. This paper proposes a novel framework for the prediction of click fraud in mobile advertising which consists of feature selection using Recursive Feature Elimination (RFE) and classification through a Hellinger Distance Decision Tree (HDDT). RFE is chosen for feature selection as it gave better results than the wrapper approach when evaluated using different classifiers. HDDT is selected as the classifier to deal with the class imbalance issue present in the data set. The efficiency of the proposed framework is investigated on the data set provided by Buzzcity and compared with J48, REPTree, LogitBoost and Random Forest. Results show that the accuracy achieved by the proposed framework is 64.07%, the best among the existing methods under study.
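A minimal sketch of the two-stage pipeline is shown below, assuming scikit-learn and synthetic imbalanced data; since scikit-learn has no Hellinger Distance Decision Tree, a class-weighted CART tree stands in for HDDT purely for illustration.

```python
# Minimal sketch: RFE for feature selection, then a decision-tree classifier
# on the selected features. A class-weighted tree is an illustrative stand-in
# for HDDT; the data is synthetic, not the Buzzcity data set.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=30, n_informative=8,
                           weights=[0.95, 0.05], random_state=0)  # imbalanced
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Stage 1: recursively eliminate features down to the 10 most useful ones.
selector = RFE(DecisionTreeClassifier(random_state=0), n_features_to_select=10)
selector.fit(X_tr, y_tr)

# Stage 2: classify on the reduced feature set, weighting the minority class.
clf = DecisionTreeClassifier(class_weight="balanced", random_state=0)
clf.fit(selector.transform(X_tr), y_tr)
pred = clf.predict(selector.transform(X_te))
print("Accuracy:", accuracy_score(y_te, pred))
```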
Citations: 28
Consistency of Java run-time behavior with design-time specifications
Pub Date : 2015-08-20 DOI: 10.1109/IC3.2015.7346742
Swaminathan Jayaraman, D. Hari, B. Jayaraman
We present a novel framework for formal verification of the run-time behavior of Java programs. We focus on the class of programs with repetitive behavior, such as servers and interactive programs, including programs exhibiting concurrency and non-determinism. The design-time specifications for such programs can be given as UML-like finite-state diagrams, or Kripke structures in the terminology of model checking. In order to verify the run-time behavior of a Java program, we extract a state diagram from the execution trace of the program and check whether the run-time state diagram is consistent with the design-time diagram. We have implemented this framework as an extension of JIVE (Java Interactive Visualization Environment), a state-of-the-art dynamic analysis and visualization tool that constructs object, sequence, and state diagrams for Java program executions. JIVE is available as an open-source plugin for Eclipse and exposes the execution trace to facilitate analyses of program executions. We have tested our extension on a number of programs, and our experiments show that our methodology is effective in helping close the gap between the design and implementation of Java programs.
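The consistency check itself can be illustrated with a small sketch (not JIVE's actual Java trace extraction): every transition observed at run time must be allowed by the design-time diagram, here represented as a plain set of transitions; the state names and the check_consistency helper are hypothetical.

```python
# Minimal sketch of the consistency check: every (state, event, state)
# transition observed in the run-time trace must be allowed by the
# design-time state diagram. Diagrams here are plain transition sets.
DESIGN_SPEC = {                      # design-time diagram (Kripke-style)
    ("Idle", "request", "Serving"),
    ("Serving", "done", "Idle"),
    ("Serving", "error", "Failed"),
    ("Failed", "reset", "Idle"),
}

def check_consistency(runtime_trace, spec, start="Idle"):
    """Replay observed events and report transitions missing from the spec."""
    violations, state = [], start
    for event, next_state in runtime_trace:
        if (state, event, next_state) not in spec:
            violations.append((state, event, next_state))
        state = next_state
    return violations

trace = [("request", "Serving"), ("done", "Idle"),
         ("request", "Serving"), ("timeout", "Idle")]  # last step is unspecified
print(check_consistency(trace, DESIGN_SPEC))  # -> [('Serving', 'timeout', 'Idle')]
```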
Citations: 11
Identifying gamakas in Carnatic music
Pub Date : 2015-08-20 DOI: 10.1109/IC3.2015.7346662
Harsh M. Vyas, M. SumaS., S. Koolagudi, K. Guruprasad
In this work, an effort has been made to identify the gamakas present in a given Carnatic music clip. Gamakas are the embellishment elements used to improve the melody. The identification of gamakas is a very important stage in note transcription. In the proposed method, features that correspond to melodic variations, such as pitch and energy, are used for characterizing the gamakas. The input pitch contour is modelled using a Hidden Markov Model with three states, namely Attack, Sustain and Decay. These states correspond to the ups and downs in the melody of the music. The system is validated using a comprehensive data set consisting of 160 songs from 8 different ragas. An average accuracy of 75.86% is achieved using this method.
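A minimal sketch of the modelling step, assuming hmmlearn and a synthetic pitch contour (the paper's audio feature extraction is not reproduced): a three-state Gaussian HMM is fitted to pitch and energy features, and its decoded states are read as Attack/Sustain/Decay labels.

```python
# Minimal sketch: model a pitch contour with a 3-state HMM (Attack, Sustain,
# Decay) and read off the state sequence. Uses hmmlearn on a synthetic
# contour; the contour shape and features below are illustrative only.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
# Synthetic contour: rising attack, steady sustain, falling decay (in cents).
pitch = np.concatenate([np.linspace(0, 200, 30),        # attack
                        200 + rng.normal(0, 5, 60),     # sustain
                        np.linspace(200, 0, 30)])       # decay
energy = np.abs(np.gradient(pitch))                     # simple energy proxy
X = np.column_stack([pitch, energy])                    # (n_frames, 2)

model = GaussianHMM(n_components=3, covariance_type="diag",
                    n_iter=100, random_state=0)
model.fit(X)                       # unsupervised fit on one contour
states = model.predict(X)          # per-frame state labels
print(states[:10], states[45:55], states[-10:])
```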
Citations: 4
Exploring sentiment analysis on twitter data
Pub Date : 2015-08-20 DOI: 10.1109/IC3.2015.7346686
Manju Venugopalan, Deepa Gupta
The growing popularity of microblogging websites has transformed them into rich resources for sentiment mining. Even though opinion mining has more than a decade of research to boast about, it is mostly confined to the exploration of formal text such as online reviews, news articles, etc. Exploration of the challenges posed by informal, crisp microblog text has taken root, but there is still a long way to go. The proposed work aims at developing a hybrid model for sentiment classification that explores tweet-specific features and uses domain-independent and domain-specific lexicons to offer a domain-oriented approach, and hence analyzes and extracts consumer sentiment towards popular smartphone brands over the past few years. The experiments show that the results improve by around 2 points on average over the unigram baseline.
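A minimal sketch of the hybrid lexicon idea, with toy lexicons and a hypothetical tweet_sentiment helper (not the paper's resources): a domain-specific lexicon overrides a general-purpose one when scoring a tweet.

```python
# Minimal sketch: score a tweet with a domain-independent lexicon, letting a
# small domain-specific (smartphone) lexicon take precedence. Lexicons and
# the example tweet are illustrative stand-ins.
GENERAL_LEXICON = {"good": 1, "great": 2, "bad": -1, "terrible": -2, "love": 2}
SMARTPHONE_LEXICON = {"laggy": -2, "snappy": 2, "bloatware": -2, "battery": 0}

def tweet_sentiment(tweet):
    """Return an aggregate polarity score; domain terms take precedence."""
    score = 0
    for token in tweet.lower().split():
        token = token.strip("#@!.,")          # crude tweet-specific cleanup
        if token in SMARTPHONE_LEXICON:       # domain-specific wins
            score += SMARTPHONE_LEXICON[token]
        elif token in GENERAL_LEXICON:
            score += GENERAL_LEXICON[token]
    return score

print(tweet_sentiment("Love the new camera but the UI is so laggy #phone"))  # 0
```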
Citations: 51
Cluster based load balancing in cloud computing
Pub Date : 2015-08-20 DOI: 10.1109/IC3.2015.7346656
Surbhi Kapoor, Chetna Dabas
For a cloud datacenter, the biggest issue is how to tackle the billions of requests arriving dynamically from end users. To handle such requests efficiently and effectively, there is a need to distribute the load evenly among the cloud nodes. To achieve this goal, various load balancing approaches have been proposed over the past years. Load balancing strategies aim at achieving high user satisfaction by minimizing the response time of tasks and improving resource utilization through even and fair allocation of cloud resources. The traditional Throttled load balancing algorithm is a good approach for load balancing in cloud computing as it distributes incoming jobs evenly among the VMs. Its major drawbacks are that it works well only for environments with homogeneous VMs, does not consider the resource-specific demands of the tasks, and has the additional overhead of scanning the entire list of VMs every time a task arrives. In this paper, these issues are addressed by proposing a Cluster-based Load Balancing algorithm which works well in heterogeneous node environments, considers the resource-specific demands of the tasks, and reduces scanning overhead by dividing the machines into clusters. Experimental results show that our algorithm gives better results in terms of waiting time, execution time, turnaround time and throughput compared to the existing Throttled and Modified Throttled algorithms.
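A minimal sketch of the clustering idea, with illustrative thresholds and VM specs rather than the paper's algorithm verbatim: VMs are grouped by capacity so that an incoming task only scans the cluster matching its resource demand.

```python
# Minimal sketch: group VMs by capacity so an incoming task only scans the
# one cluster that matches its resource demand, instead of the whole VM list.
# Tier thresholds and VM specs are illustrative only.
from collections import defaultdict

VMS = [{"id": 1, "cpu": 2}, {"id": 2, "cpu": 2}, {"id": 3, "cpu": 8},
       {"id": 4, "cpu": 8}, {"id": 5, "cpu": 16}]

def build_clusters(vms):
    """Partition VMs into small/medium/large clusters by CPU capacity."""
    clusters = defaultdict(list)
    for vm in vms:
        tier = "small" if vm["cpu"] <= 2 else "medium" if vm["cpu"] <= 8 else "large"
        clusters[tier].append({**vm, "load": 0})
    return clusters

def assign_task(clusters, task_cpu):
    """Pick the least-loaded VM in the cluster matching the task's demand."""
    tier = "small" if task_cpu <= 2 else "medium" if task_cpu <= 8 else "large"
    vm = min(clusters[tier], key=lambda v: v["load"])   # scan one cluster only
    vm["load"] += task_cpu
    return vm["id"]

clusters = build_clusters(VMS)
print([assign_task(clusters, c) for c in (1, 4, 6, 1)])  # -> [1, 3, 4, 2]
```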
Citations: 53
Enhanced heuristic approach for Traveling Tournament Problem based on Grey Wolf Optimizer
Pub Date : 2015-08-20 DOI: 10.1109/IC3.2015.7346685
D. Gupta, Chand Anand, Tejas Dewan
This paper presents an enhanced heuristic approach combining the Grey Wolf Optimizer (GWO) and simulated annealing (SA) to find an optimum solution for the Travelling Tournament Problem (TTP). In this paper, we tackle the mirrored version of TTP. We use a fast constructive heuristic algorithm to generate schedules. We then integrate an enhanced simulated annealing heuristic based on the prey-proximity model of the Grey Wolf Optimizer to obtain a least-cost tournament schedule. We upgrade the position-updating step of GWO using probabilistic methods and hybridize it with SA to solve TTP while avoiding getting stuck in local minima. Our proposed hybrid algorithm converges to an optimum solution for TTP. We calculate the overall cost of TTP and compare the performance of our algorithm with other metaheuristic algorithms such as MBBO/ESA, BBO/SA, ACO and PSO.
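A minimal sketch of the hybrid search loop, with placeholder cost and move functions rather than the paper's TTP-specific operators: candidates produced by a GWO-style move are accepted using the simulated-annealing criterion so that the search can escape local minima.

```python
# Minimal sketch: a candidate produced by a GWO-style move is accepted with
# the simulated-annealing (Metropolis) criterion, so worse schedules can
# still be taken early on. cost() and gwo_move() are placeholders.
import math
import random

def sa_accept(current_cost, candidate_cost, temperature):
    """Metropolis acceptance rule used by simulated annealing."""
    if candidate_cost <= current_cost:
        return True
    return random.random() < math.exp((current_cost - candidate_cost) / temperature)

def hybrid_search(initial, cost, gwo_move, t0=1000.0, cooling=0.95, iters=500):
    current, best = initial, initial
    temperature = t0
    for _ in range(iters):
        candidate = gwo_move(current)          # probabilistic wolf-position update
        if sa_accept(cost(current), cost(candidate), temperature):
            current = candidate
            if cost(current) < cost(best):
                best = current
        temperature *= cooling                 # geometric cooling schedule
    return best

# Toy usage: minimise a 1-D cost with a random-perturbation "move".
print(hybrid_search(10.0, cost=lambda x: (x - 3) ** 2,
                    gwo_move=lambda x: x + random.uniform(-1, 1)))
```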
Citations: 3
An image processing based method to identify and grade conjunctivitis infected eye according to its types and intensity
Pub Date : 2015-08-20 DOI: 10.1109/IC3.2015.7346658
Joydeep Tamuli, Aishwarya Jain, A. V. Dhan, Anupama Bhan, M. Dutta
Inflammation of the conjunctiva with pain and discomfort in the inner surface of the eyelids is referred to as conjunctivitis. It causes severe pain, a burning sensation or, in extreme cases, blindness. Normally conjunctivitis is detected by eye specialists, and their limited number makes it difficult for everyone to reach them and get diagnosed. This paper describes an automatic, efficient image processing based method to distinguish a conjunctivitis-infected eye from a normal eye and classify the infection according to its type. Statistical and texture features are extracted, PCA is then applied to obtain discriminatory features, and classification is performed using supervised learning methods such as multi-class SVM and KNN. The intensity of the infected eyes is also calculated using the significant red plane. A confusion matrix (plotconfusion) was used to calculate the accuracy, and high accuracy was achieved using this method. In addition, the proposed method is efficient, computationally fast and very low-cost.
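A minimal sketch of the classification stage, using scikit-learn and synthetic feature vectors (the image segmentation and feature extraction are not reproduced): PCA-reduced features are classified with a multi-class SVM, with KNN as the alternative.

```python
# Minimal sketch: feature vectors (stand-ins for statistical and texture
# features per eye image) are reduced with PCA and classified with a
# multi-class SVM and, alternatively, KNN. Data here is synthetic.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic stand-in: 3 classes (e.g. normal and two infection types).
X, y = make_classification(n_samples=300, n_features=40, n_informative=12,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

svm_model = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
knn_model = make_pipeline(PCA(n_components=10), KNeighborsClassifier(n_neighbors=5))

for name, model in [("PCA+SVM", svm_model), ("PCA+KNN", knn_model)]:
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", model.score(X_te, y_te))
```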
Citations: 4
N-Hop broadcast and Street Broadcast Reduction algorithm using OMNeT++ and Google Earth plugin
Pub Date : 2015-08-20 DOI: 10.1109/IC3.2015.7346741
Deepti Agarwal, S. Sharma, Kavita Pandey
VANETs are systems in which the wireless devices installed on vehicles collaborate to transfer messages. The implementation of VANET algorithms on actual devices is preceded by simulation in environments such as OMNeT++ and SUMO. This paper presents a study and implementation of VANET broadcasting algorithms, in particular algorithms that are able to solve the problem of broadcast storming. With efficient use of resources, signal transmission to vehicles is made scalable using two protocols: N-Hop Weighted p-persistence Broadcasting, which is able to disseminate data to vehicles in all regions within the broadcast range, and Street Broadcast Reduction, which covers the area through different roads and effectively sends warning and alert messages to vehicles using distance-based schemes. The problems in VANET broadcasting have been addressed and the algorithms implemented using OMNeT++, SUMO and Google Earth.
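A minimal sketch of the weighted p-persistence rule, with illustrative parameters and a hypothetical should_rebroadcast helper: a receiver relays with probability proportional to its distance from the sender, bounded by the N-hop limit.

```python
# Minimal sketch of weighted p-persistence: on receiving a broadcast, a
# vehicle rebroadcasts with a probability proportional to its distance from
# the sender, so vehicles near the edge of the radio range relay most often.
# The hop-count check bounds the flood to N hops. Parameters are illustrative.
import random

def should_rebroadcast(distance_m, radio_range_m, hop_count, max_hops):
    """Return True if this vehicle should relay the message."""
    if hop_count >= max_hops:                  # N-hop limit reached, stop the flood
        return False
    p = min(distance_m / radio_range_m, 1.0)   # farther receiver -> higher p
    return random.random() < p

# Example: a vehicle 240 m from the sender, 300 m range, hop 1 of at most 3.
print(should_rebroadcast(distance_m=240, radio_range_m=300,
                         hop_count=1, max_hops=3))
```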
Citations: 5
Analysis of the improved knapsack cipher
Pub Date : 2015-08-20 DOI: 10.1109/IC3.2015.7346740
Ashish Jain, N. Chaudhari
The computational complexity of order 256^n recently suggested by Pham for solving the knapsack cipher 0/255 (or improved knapsack cipher) is an optimistic result. In this paper, we systematically analyze the improved knapsack cipher in relation to the complexity claims. During the analysis of the improved knapsack cipher, we found that the public key size must be large to satisfy the complexity constraint. Even when keys of large size are chosen, the improved knapsack cipher is not safe and its security is compromised by lattice-based attacks. Furthermore, the cipher loses its practicality because of the use of large public keys. Hence, we identify a construction of a practical knapsack cipher 0/f, characterize its requirements and demonstrate asymptotically that the computational complexity of the knapsack cipher 0/f and the knapsack cipher 0/1 (the basic or subset-sum cipher) is equal. It is worth pointing out that the proposed knapsack cipher 0/f is a compact and viable option to use as a building block for security purposes in resource-constrained devices such as RFID tags, smart cards and the like.
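A minimal sketch of the knapsack cipher 0/f encryption step for f = 255, with a toy public key that is far too small to be secure (the paper's lattice-based analysis is not shown): each plaintext byte multiplies the corresponding public-key element and the products are summed.

```python
# Minimal sketch of the knapsack cipher 0/f encryption step with f = 255:
# each plaintext byte (0..255) multiplies the corresponding public-key
# element and the products are summed per block. The toy key below is
# illustrative only and far too small to be secure.
def knapsack_encrypt(plaintext: bytes, public_key: list[int]) -> list[int]:
    """Encrypt blocks of len(public_key) bytes as weighted knapsack sums."""
    n = len(public_key)
    ciphertext = []
    for i in range(0, len(plaintext), n):
        block = plaintext[i:i + n]
        ciphertext.append(sum(m * a for m, a in zip(block, public_key)))
    return ciphertext

TOY_PUBLIC_KEY = [2917, 5831, 11677, 23459, 46807, 93629, 187273, 374539]
print(knapsack_encrypt(b"IC3 2015", TOY_PUBLIC_KEY))
```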
Citations: 2