Speckle Noise Suppression Techniques for Ultrasound Images. Changming Zhu, Jun Ni, Yan-bo Li, Guochang Gu. doi:10.1109/ICICSE.2009.26

Speckle is multiplicative noise that degrades visual evaluation in ultrasound imaging. In medical ultrasound image processing, speckle suppression has become an essential step for diagnosis. Recent advancements in ultrasound devices call for more robust despeckling techniques to enhance ultrasound imaging in routine clinical practice. Many denoising techniques have been proposed for effective suppression of speckle noise. This paper compiles the performance of various techniques on medical B-mode ultrasound images.
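The abstract names no specific despeckling algorithm; as a concrete point of reference, below is a minimal sketch of the classic Lee filter, one of the standard techniques that comparisons on B-mode images typically include. The window size and the noise-variance estimator are illustrative assumptions, not parameters from the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, win=7):
    """Classic Lee despeckling filter for multiplicative noise: weight the
    local mean against the observed pixel with an adaptive gain that is
    low in homogeneous regions and high near edges."""
    img = img.astype(np.float64)
    eps = 1e-12
    mean = uniform_filter(img, win)            # local mean
    sq_mean = uniform_filter(img ** 2, win)    # local second moment
    var = np.maximum(sq_mean - mean ** 2, 0)   # local variance
    # Illustrative speckle estimate: image-wide coefficient of variation
    # squared; in practice this is measured in a homogeneous region.
    noise_cv2 = np.median(var / (mean ** 2 + eps))
    gain = np.clip(1.0 - noise_cv2 * mean ** 2 / (var + eps), 0.0, 1.0)
    return mean + gain * (img - mean)
```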
Data Cache Strategy Based on Colony Algorithm in Mobile Computing Environment. Yan Chu, Jianpei Zhang, C. Zhao. doi:10.1109/ICICSE.2009.60

In mobile computing environments, the limited bandwidth of wireless links, the weak capability of mobile devices, and scarce battery power make it difficult for mobile devices to stay connected to the network at all times. Caching part of the data on clients reduces queries to the database server, which alleviates the problem to some extent. Cache strategies for mobile computing environments have therefore become an important research direction. Building on an analysis of existing cache strategies, we further study cache strategies based on mobile agents. To give mobile agents real and effective collaboration, rather than confining collaboration to the conceptual level, we use the ant colony algorithm and propose an ant-colony-based data cache strategy. Finally, experiments demonstrate its superiority.
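The abstract does not give the authors' formulas; the sketch below shows the generic ant-colony ingredients (pheromone trails, probabilistic solution construction, evaporation, reinforcement) applied to choosing which data items a client caches. All weights, update rules, and the item representation are illustrative assumptions.

```python
import random

def ant_colony_cache(items, capacity, n_ants=20, n_iters=50,
                     alpha=1.0, beta=2.0, rho=0.1):
    """Choose which data items a mobile client caches, using the generic
    ant-colony scheme. items: {id: (access_freq, size)} (assumed layout)."""
    tau = {i: 1.0 for i in items}                 # pheromone per item
    best_set, best_score = set(), 0.0
    for _ in range(n_iters):
        iter_best, iter_score = set(), 0.0
        for _ in range(n_ants):
            chosen, used = set(), 0.0
            for i in random.sample(list(items), len(items)):
                freq, size = items[i]
                if used + size > capacity:
                    continue
                # inclusion probability grows with pheromone^alpha and
                # with the heuristic desirability (freq/size)^beta
                score = (tau[i] ** alpha) * ((freq / size) ** beta)
                if random.random() < score / (1.0 + score):
                    chosen.add(i)
                    used += size
            value = sum(items[i][0] for i in chosen)  # total hit frequency
            if value > iter_score:
                iter_score, iter_best = value, chosen
        if iter_score > best_score:
            best_score, best_set = iter_score, iter_best
        for i in tau:   # evaporate, then deposit on the iteration's best set
            tau[i] = (1.0 - rho) * tau[i] + (rho * iter_score if i in iter_best else 0.0)
    return best_set
```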
Ontology Security Strategy of Security Data Integrity. Yulong Meng, Guisheng Yin, Ke Geng. doi:10.1109/ICICSE.2009.43

A variety of heterogeneous data needs to be integrated within a secure domain, and safely integrating security data from various data sources is a challenge for database researchers. In this paper, we introduce a new model for security data integrity, Weight-value-Extended MLS (WEMLS), developed on the basis of MLS. Compared with MLS, WEMLS not only guarantees the security and integrity of data access but also provides a more flexible access mechanism. Finally, WEMLS is validated by proposing an ontology-based security data integration model.
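The exact semantics of the WEMLS weight values are not given in the abstract; purely as a hypothetical reading, the sketch below extends a Bell-LaPadula-style MLS check with a weight threshold that relaxes the usual "no read up" rule by one level, which is one way a weight could make access "more flexible". Both the rule and the threshold are assumptions, not the paper's definition.

```python
from dataclasses import dataclass

@dataclass
class Label:
    level: int      # classification level, higher = more sensitive
    weight: float   # WEMLS-style weight value (assumed semantics)

def can_read(subject: Label, obj: Label, threshold: float = 0.5) -> bool:
    """Hypothetical WEMLS-style read check: classic MLS 'no read up',
    softened so a sufficiently weighted subject may read one level up."""
    if subject.level >= obj.level:
        return True
    return obj.level - subject.level == 1 and subject.weight >= threshold

def can_write(subject: Label, obj: Label) -> bool:
    """Classic MLS 'no write down' check, kept strict for integrity."""
    return subject.level <= obj.level
```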
A Localization Method for Spatial Information Grid in Underwater Environments. Zhe Zhang, Min Song, Daxin Liu, Hongbin Wang. doi:10.1109/ICICSE.2009.47

Position information is important for spatial information grids: inaccurate position data may lead to mistaken decisions or even disastrous actions. In this paper, we study the problem of localization for a spatial information grid in underwater environments. Localization underwater faces many challenges because radio frequency signals are unavailable; in addition, various uncontrolled effects disturb distance estimation and hence degrade localization accuracy. We devise a new method that maps distance information to measurement error in a virtual frame composed of beacons. Furthermore, we propose a linear transformation to infer the measurement error, from which appropriate adjustments can be obtained to make distance estimation accurate. Simulation results show that the proposed method outperforms DV-Distance, especially when the number of beacons is low.
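The paper's virtual-frame error mapping is not detailed in the abstract; for context, here is the standard least-squares multilateration step that any such beacon-distance scheme refines, obtained by subtracting one beacon's range equation from the others to linearize the problem. The beacon layout in the usage line is made up for illustration.

```python
import numpy as np

def multilaterate(beacons, dists):
    """Least-squares position estimate from beacon coordinates and
    measured ranges. Subtracting the first beacon's range equation
    ||x - b_i||^2 = d_i^2 from the others linearizes the problem to
    A x = c, solved by least squares."""
    beacons = np.asarray(beacons, dtype=float)    # (n, dim), n >= dim + 1
    dists = np.asarray(dists, dtype=float)        # (n,)
    b0, d0 = beacons[0], dists[0]
    A = 2.0 * (beacons[1:] - b0)
    c = (d0 ** 2 - dists[1:] ** 2
         + np.sum(beacons[1:] ** 2, axis=1) - np.sum(b0 ** 2))
    x, *_ = np.linalg.lstsq(A, c, rcond=None)
    return x

# e.g. four anchored beacons, ranges to a node near (10, 12, 5)
est = multilaterate([(0, 0, 10), (30, 0, 10), (0, 30, 10), (30, 30, 0)],
                    [16.4, 23.9, 21.2, 27.4])
```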
Topology Awareness on Network Damage Assessment and Control Strategies Generation. Hui He, Hongli Zhang, Lihua Yin, Yongtan Liu. doi:10.1109/ICICSE.2009.50

Facing the threat of large-scale network attacks, it is of great importance to mount an emergency response that mitigates the further damage such attacks cause. To implement a reasonable control strategy, a minimal set of routers is computed in a network that can be used to control and reduce the damage done by a large-scale attack such as a worm or DDoS attack. Our work targets large networks, whereas previous work focused on LANs. We propose rules for choosing these routers based on the network topology and on risk assessment. Many topological factors are considered in the control strategy process, and an entropy-based algorithm for selecting the control router set is put forward. From the distribution of incidents over the topology, a macroscopic epidemic status factor is derived through combined quantitative and qualitative analysis, offering administrators direct, decisive advice for preventing network security events from spreading and for minimizing costs. Finally, experiments validate the evaluation framework and the control algorithm.
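The entropy-based selection algorithm is not spelled out in the abstract; the sketch below shows one plausible core: rank candidate routers by the Shannon entropy of the incident distribution across the subnets behind them, then take the top k. The ranking rule and the data layout are assumptions, not the paper's algorithm.

```python
import math

def shannon_entropy(counts):
    """Shannon entropy (bits) of an incident-count distribution."""
    total = sum(counts)
    if total == 0:
        return 0.0
    return -sum((c / total) * math.log2(c / total)
                for c in counts if c > 0)

def rank_control_routers(router_incidents, k):
    """Pick the k routers whose downstream incident distributions carry
    the most entropy, i.e. that sit in front of the most dispersed
    attack activity. router_incidents: {router: [per-subnet counts]}."""
    ranked = sorted(router_incidents.items(),
                    key=lambda kv: shannon_entropy(kv[1]),
                    reverse=True)
    return [router for router, _ in ranked[:k]]
```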
A Content-Based Self-Feedback E-government Network Security Model. Songzhu Xia, Jianpei Zhang, Jing Yang, Jun Ni. doi:10.1109/ICICSE.2009.28

Based on the theories of intrusion trapping and natural language understanding, and oriented toward e-government security issues, this paper proposes a content-based self-feedback model built around the attacker's point of view. With this model, the concrete information under attack can be brought into focus, while the attack methods, which a standard honey trap would record, are ignored. With the support of honeynets, target sensitivity is taken as the appraisal factor for deciding whether a piece of information needs protection. By adjusting the model's primitive feedback coefficients, we can learn which information the attackers regard as most important. This paper also introduces the concept of the model's domain coefficients. Experiments on real networks show this to be the first successful prediction-and-feedback model for e-government affairs.
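The abstract says only that primitive feedback coefficients are adjusted as attacks are observed; the sketch below is one hypothetical reading, an exponential moving update that raises the sensitivity score of content items seen in honeynet traffic. Every name and constant here is illustrative, not the paper's formulation.

```python
def update_sensitivity(sensitivity, hits, gamma=0.2):
    """Raise the target-sensitivity score of content items attackers touch.

    sensitivity: {item: score in [0, 1]}; hits: items observed in honeynet
    traffic this round. Attacked items move toward 1, the rest decay
    toward 0, with feedback coefficient gamma (assumed semantics)."""
    hit_set = set(hits)
    for item in sensitivity:
        target = 1.0 if item in hit_set else 0.0
        sensitivity[item] += gamma * (target - sensitivity[item])
    return sensitivity
```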
Chromatic Statistical Landscape Features for Retrieval of Color Textured Images. C. Xu, Xian Tong Zhen. doi:10.1109/ICICSE.2009.12

This paper proposes a new texture description method for color images named Chromatic Statistical Landscape Features (CSLF). A pro-hue pseudo intensity image is extracted from the color image by a pro-hue pseudo function, which measures the degree to which a color contains the reference hue component. Besides the pro-hue pseudo intensity image, the saturation and intensity components of the color image likewise form two gray images. The graph of a gray image function is a rumpled surface resembling a landscape, from which six novel texture feature curves are derived; in this way, eighteen feature curves are obtained from the three gray images. CSLF uses these eighteen curves to characterize color texture, and systematic experiments show it offers very high retrieval performance on the VisTex texture collection.
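The six feature-curve definitions are not given in the abstract; as an illustration of the landscape idea only, the sketch below computes a single stand-in curve per gray image, the fraction of the terrain lying above each height, and concatenates it over the three gray images. The actual CSLF curves differ.

```python
import numpy as np

def landscape_area_curve(gray, levels=32):
    """One illustrative landscape feature curve: viewing the gray image
    as a terrain, the fraction of its surface lying above height t, for
    t swept across the image's gray range."""
    gray = gray.astype(np.float64)
    ts = np.linspace(gray.min(), gray.max(), levels)
    return np.array([(gray >= t).mean() for t in ts])

def cslf_descriptor(prohue, saturation, intensity, levels=32):
    """Concatenate the curve over the three gray images (one curve
    standing in for the paper's six per image)."""
    return np.concatenate([landscape_area_curve(ch, levels)
                           for ch in (prohue, saturation, intensity)])
```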
An Adaptive Traffic Sampling Method for Anomaly Detection. Xiaobing He, Wu Yang, Qing Wang. doi:10.1109/ICICSE.2009.32

Random packet sampling is the simplest way to reduce the number of packets a network monitoring system must process. However, anomaly detection accuracy suffers because this method is biased toward large IP flows. To reduce the impact of sampling on network anomaly detection, an adaptive traffic sampling method based on time stratification is proposed. The scheme divides time into strata and samples an incoming packet with a probability given by a decreasing function f of the predicted size of the flow the packet belongs to. Instead of data-streaming algorithms, we use the packet samples and the sampling probability to estimate flow sizes, saving resources. Forced sampling is also employed to improve the estimation accuracy for smaller flows. Experimental results show that our scheme estimates anomalous traffic more accurately than traditional random packet sampling, thereby improving anomaly detection performance.
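The abstract fixes the shape of the scheme: a packet is kept with a probability that decreases with the predicted size of its flow, and that probability is retained so flow sizes can be estimated from the samples. The sketch below uses f(s) = min(1, base/s) as one concrete decreasing f, since the paper's exact f and size predictor are not given; weighting each kept packet by 1/p is the standard Horvitz-Thompson correction implied by estimating flow size from samples and probabilities.

```python
import random

def sample_packet(predicted_flow_size, base=100.0, p_min=0.01):
    """Keep a packet with probability f(s) decreasing in the predicted
    size s of its flow: small flows keep near-full coverage, heavy
    hitters are thinned. Returns the decision and the probability used,
    so a kept packet can be counted with weight 1/p for an unbiased
    flow-size estimate. base and p_min are illustrative constants."""
    p = max(p_min, min(1.0, base / max(predicted_flow_size, 1.0)))
    return random.random() < p, p
```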
A Formal Method for Verifying Production Knowledge Base. Hongtao Huang, Shaobin Huang, Tao Zhang. doi:10.1109/ICICSE.2009.14

The central task in using model checking to verify a production knowledge base is building a system model from the rule set, a fundamental but time-consuming job. This paper presents an efficient formal method for verifying production knowledge bases, with two main contributions. First, we propose a dynamic modeling method that exploits the dynamic, procedural nature of production rules to build the system model, improving modeling efficiency significantly. Second, a conditional transition system extending the standard transition system is given to represent the model; it retains complete information about the actual state transition process, which solves the information-loss problem of static transition systems built by static modeling and improves the efficiency of error diagnosis.
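As an illustration of dynamic modeling, the sketch below forward-fires production rules from an initial fact base and records, on each transition, the condition under which the rule fired, echoing the paper's conditional transition system. The rule representation is an assumption; the paper's formalism is richer.

```python
from collections import deque

def build_transition_system(rules, init_facts):
    """Dynamically build a conditional transition system by forward-firing
    production rules from the initial fact base.

    rules: list of (name, condition, add, delete), each of condition/add/
    delete a frozenset of facts. States are fact bases; each edge keeps
    the condition under which its rule fired."""
    init = frozenset(init_facts)
    states, edges = {init}, []
    queue = deque([init])
    while queue:
        s = queue.popleft()
        for name, cond, add, delete in rules:
            if cond <= s:                     # rule enabled in state s
                t = (s - delete) | add        # successor fact base
                edges.append((s, name, cond, t))
                if t not in states:
                    states.add(t)
                    queue.append(t)
    return states, edges
```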
Intelligent Image Identification to Human Cells. Duan Jun, Zhao Yang, J. Ni. doi:10.1109/ICICSE.2009.40

This paper presents an image processing study aimed at understanding and exploring image identification technology for human cells. It reports several algorithmic improvements, including a Gauss-Laplace operator that lets images of human cells be identified effectively, combined with texture matching based on pixel statistics that ties the algorithms together. To improve recognition precision, image texture characteristics are combined with other image processing techniques. The texture characteristics of human cells are added as specific information to knowledge-based biological systems, and knowledge reasoning and knowledge storage are established within an artificial-intelligence process for identifying the human cells.
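The Gauss-Laplace operator the abstract names is the standard Laplacian of Gaussian; below is a minimal sketch of applying it to a grayscale cell image and reading edges off the zero crossings. The scale sigma and the zero-crossing test are illustrative choices, not the paper's settings.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def cell_edges(gray, sigma=2.0):
    """Gauss-Laplace (Laplacian of Gaussian) response: smooth with a
    Gaussian of scale sigma, then take the Laplacian; zero crossings of
    the response trace cell boundaries."""
    response = gaussian_laplace(gray.astype(np.float64), sigma=sigma)
    edges = np.zeros(gray.shape, dtype=bool)
    # crude zero-crossing test: sign change against the diagonal neighbor
    edges[:-1, :-1] = np.sign(response[:-1, :-1]) != np.sign(response[1:, 1:])
    return edges
```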