Parallel 3-Pixel Labeling Method and its Hardware Architecture Design (doi: 10.1109/IAS.2009.74)
Shyue-Wen Yang, M. Sheu, Jun-Jie Lin, Chuang-Chun Hu, Tzu-Hsuing Chen, S. Tseng
In this paper, we present a parallel connected component labeling method and its VLSI architecture design. The proposed method assigns labels to three pixels of the raster-scan input simultaneously and then rapidly generates three label equivalences. We also present three arrays that handle all label merging. Based on the proposed method, we develop a hardware design for real-time applications. The parallel architecture significantly reduces the total number of execution cycles: experimental results show that our 3-pixel labeling design saves 66% and 33% of the execution cycles compared with 1-pixel and 2-pixel labeling designs, respectively.
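For readers less familiar with connected component labeling, the following is a minimal sequential baseline of the technique the hardware parallelizes: a two-pass raster scan with union-find label merging, processing one pixel per step. It is an illustrative sketch in our own notation, not the paper's architecture; the proposed design performs the equivalent labeling and merging work for three pixels per cycle.

```python
import numpy as np

def label_components(binary):
    """Two-pass raster-scan connected component labeling (4-connectivity),
    one pixel per step, with union-find for label equivalences."""
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=np.int32)
    parent = [0]                              # parent[0] is the background label

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]     # path halving
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[max(ra, rb)] = min(ra, rb)

    next_label = 1
    for y in range(h):                        # first pass: provisional labels and equivalences
        for x in range(w):
            if not binary[y, x]:
                continue
            left = labels[y, x - 1] if x > 0 else 0
            up = labels[y - 1, x] if y > 0 else 0
            if left == 0 and up == 0:
                parent.append(next_label)
                labels[y, x] = next_label
                next_label += 1
            else:
                labels[y, x] = min(l for l in (left, up) if l)
                if left and up:
                    union(left, up)           # record a label equivalence

    for y in range(h):                        # second pass: resolve equivalences
        for x in range(w):
            if labels[y, x]:
                labels[y, x] = find(labels[y, x])
    return labels
```

A direct hardware mapping of this loop handles one pixel per cycle; processing three pixels at once roughly triples the throughput of the labeling pass, which is consistent with the reported 66% cycle saving over 1-pixel labeling.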
{"title":"Parallel 3-Pixel Labeling Method and its Hardware Architecture Design","authors":"Shyue-Wen Yang, M. Sheu, Jun-Jie Lin, Chuang-Chun Hu, Tzu-Hsuing Chen, S. Tseng","doi":"10.1109/IAS.2009.74","DOIUrl":"https://doi.org/10.1109/IAS.2009.74","url":null,"abstract":"In this paper, we present a parallel connected component labeling method and its VLSI architecture design. The proposed method can assign labels to three pixels simultaneously for the raster scan input and then generate three label equivalences rapidly. We also present 3 arrays to process all label mergence. Based on the proposed method, we develop the hardware design for real-time application. The parallel architecture efficiently reduces total execution cycle significantly. From the experimental results, our 3-pixel labeling design can save 66% and 33% of the execution cycle comparing with the designs by 1-pixel labeling and 2-pixel labeling approaches, respectively.","PeriodicalId":240354,"journal":{"name":"2009 Fifth International Conference on Information Assurance and Security","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128964102","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Grounding Grid Corrosion Diagnosis Based on Large Change Sensitivity (doi: 10.1109/IAS.2009.241)
Liqiang Liu, Xianjue Luo, Tao Niu, Kai Wang
A corrosion diagnosis method for substation grounding grids based on large-change sensitivity is proposed. By analyzing the node voltages, the large-change sensitivity of the port voltages with respect to the branch resistances is derived, and a nonlinear optimization model for corrosion diagnosis is established. Traditional methods set up the diagnosis equations from differential sensitivity; in contrast, the equations established in this paper are more appropriate for realistic corrosion cases, in which the resistance changes are large. To ensure the independence of the diagnosis equations, an optimal selection method for the test ports is also proposed. The effectiveness of the algorithm is validated by simulations of substation grounding grids.
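Because corrosion changes a branch resistance by a large amount, the standard tool here is an exact rank-one update rather than a first-order (differential) sensitivity. A minimal sketch, in our own notation and assuming nodal analysis of the grid: with nodal equations $Y V = I$, a change of branch $k$ from $R_k$ to $R_k + \Delta R_k$ changes its conductance by $\delta g_k = 1/(R_k + \Delta R_k) - 1/R_k$ and the admittance matrix by the rank-one term $\delta g_k\, a_k a_k^{\mathsf T}$, where $a_k$ is the incidence vector of branch $k$. By the Sherman-Morrison formula, the node voltages after the change are $$V' = V - \frac{\delta g_k\,(Y^{-1} a_k)\,(a_k^{\mathsf T} V)}{1 + \delta g_k\, a_k^{\mathsf T} Y^{-1} a_k},$$ and the measured port voltages follow as linear functions of $V'$. This large-change relation, nonlinear in $\Delta R_k$, is the kind of dependence on which the diagnosis equations are built.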
{"title":"Grounding Grid Corrosion Diagnosis Based on Large Change Sensitivity","authors":"Liqiang Liu, Xianjue Luo, Tao Niu, Kai Wang","doi":"10.1109/IAS.2009.241","DOIUrl":"https://doi.org/10.1109/IAS.2009.241","url":null,"abstract":"The method of corrosion diagnosis for substation grounding grids based on large change sensitivity is proposed. By analyzing the node voltages, the large change sensitivity of the port voltages in response to the branch resistances is given and the nonlinear optimal model of corrosion diagnosis is established. The diagnosis equations based on the differential sensitivity were set up in traditional methods. Contrasting with them, the equations established in the paper is more appropriate for the real corrosion case. To insure the independency of diagnosis equations, the optimal selection method of test ports is put up. The effectiveness of algorithm is validated by the simulation of substation grounding grids.","PeriodicalId":240354,"journal":{"name":"2009 Fifth International Conference on Information Assurance and Security","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129048333","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An Improved Weighted-Feature Clustering Algorithm for K-anonymity (doi: 10.1109/IAS.2009.311)
Lijian Lu, Xiaojun Ye
Chiu proposed a clustering algorithm that adjusts numeric feature weights automatically for k-anonymity implementation; this approach gives better clustering quality than the traditional generalization and suppression methods. In this paper, we propose an improved weighted-feature clustering algorithm that additionally takes the weights of categorical attributes and the notion of an optimal k-partition into consideration. To show the effectiveness of our method, we run information-loss experiments comparing it with the greedy k-member clustering algorithm.
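To make the weighting concrete, here is a minimal sketch of the kind of mixed-attribute distance such a clustering algorithm can use: numeric attributes are compared on a normalized scale and categorical attributes by a mismatch indicator, each scaled by a feature weight. The function, field layout, and example weights are our own illustrative assumptions, not the paper's definitions.

```python
def weighted_distance(x, y, numeric_weights, categorical_weights, ranges):
    """Weighted distance over a record with numeric and categorical parts.

    x, y:   dicts with 'num' (list of floats) and 'cat' (list of hashables)
    ranges: per-numeric-attribute value ranges used for normalization
    """
    d = 0.0
    for xi, yi, w, r in zip(x['num'], y['num'], numeric_weights, ranges):
        d += w * abs(xi - yi) / r            # normalized numeric difference
    for xi, yi, w in zip(x['cat'], y['cat'], categorical_weights):
        d += w * (0.0 if xi == yi else 1.0)  # simple mismatch for categories
    return d

# Example: records with one numeric (age) and one categorical (zip prefix) attribute
a = {'num': [34], 'cat': ['100**']}
b = {'num': [29], 'cat': ['101**']}
print(weighted_distance(a, b, numeric_weights=[0.7],
                        categorical_weights=[0.3], ranges=[60]))
```

A k-anonymity clustering algorithm then groups records into clusters of at least k members while keeping such distances, and hence the information loss from generalizing each cluster, small.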
{"title":"An Improved Weighted-Feature Clustering Algorithm for K-anonymity","authors":"Lijian Lu, Xiaojun Ye","doi":"10.1109/IAS.2009.311","DOIUrl":"https://doi.org/10.1109/IAS.2009.311","url":null,"abstract":"Chiu proposed a clustering algorithm adjusting the numeric feature weights automatically for k-anonymity implementation and this approach gave a better clustering quality over the traditional generalization and suppression methods. In this paper, we propose an improved weighted-feature clustering algorithm which takes the weight of categorical attributes and the thesis of optimal k-partition into consideration. To show the effectiveness of our method, we do some information loss experiments to compare it with greedy k-member clustering algorithm.","PeriodicalId":240354,"journal":{"name":"2009 Fifth International Conference on Information Assurance and Security","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129110202","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Distributed Video Coding Based on Multiple Description (doi: 10.1109/IAS.2009.196)
Hon-Shing Ma, Yao Zhao, Chunyu Lin, Anhong Wang
In this paper, we propose a novel distributed video coding (DVC) scheme based on the theory of multiple description (MD) coding: the key frames are encoded by an MD codec and each description is transmitted over its corresponding channel. The scheme combines the advantages of DVC with the robustness of MD, and it exploits three different methods to generate multiple descriptions for the key frames, which are essential to the side information. Experiments demonstrate that it achieves better performance than several general DVC methods and, thanks to the MD algorithm, higher robustness over packet-loss channels.
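The paper studies three specific ways of generating the descriptions; as a hedged illustration of the general MD idea (not necessarily one of those three), the sketch below splits a grayscale key frame into two descriptions by even and odd columns, so that a description lost on one channel can still be approximated from the one that arrives.

```python
import numpy as np

def make_descriptions(frame):
    """Split a grayscale key frame into two descriptions (even/odd columns)."""
    return frame[:, 0::2].copy(), frame[:, 1::2].copy()

def reconstruct(d_even=None, d_odd=None, width=None):
    """Rebuild the frame; if one description is missing, approximate it."""
    if d_even is not None and d_odd is not None:
        frame = np.empty((d_even.shape[0], width), dtype=d_even.dtype)
        frame[:, 0::2], frame[:, 1::2] = d_even, d_odd
        return frame
    d = d_even if d_even is not None else d_odd
    return np.repeat(d, 2, axis=1)[:, :width]   # crude column replication

frame = np.arange(16, dtype=np.uint8).reshape(2, 8)
even, odd = make_descriptions(frame)
print(reconstruct(d_even=even, width=8))        # degraded but usable key frame
```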
{"title":"Distributed Video Coding Based on Multiple Description","authors":"Hon-Shing Ma, Yao Zhao, Chunyu Lin, Anhong Wang","doi":"10.1109/IAS.2009.196","DOIUrl":"https://doi.org/10.1109/IAS.2009.196","url":null,"abstract":"In our paper, we propose a novel distributed video coding (DVC) scheme using the theory of multiple description (MD), in which key frame is encoded by MD codec and transmitted over the corresponding channel. This scheme combines the advantage of DVC as well as robustness of MD, and exploits three different methods to generate multiple descriptions for the key frames that are essential to side information. Experiments demonstrate that it can get better performance than some general DVC methods. Besides, it demonstrates higher robustness in packet-loss channel than general DVC due to the MD algorithm.","PeriodicalId":240354,"journal":{"name":"2009 Fifth International Conference on Information Assurance and Security","volume":"02 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129136698","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Intra-mode Dependent Coding Method for Image Compression (doi: 10.1109/IAS.2009.346)
Yung-Chiang Wei, Jui-Che Teng, Chien-Wen Chung
Lossless image compression based on H.264/AVC is a recent extension of the standard, in which block-based intra prediction was originally exploited. DPCM-based intra prediction is an effective technique for improving the compression ratio in this area. In this paper, we propose an intra-mode dependent coding scheme to further improve the performance of DPCM-based compression: different scan orders prior to entropy coding are used to fit the residual statistics produced by prediction under the different intra modes.
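The underlying point is that the statistics of DPCM residuals depend on the intra prediction mode, so the scan feeding the entropy coder should adapt to the mode as well. The sketch below shows one plausible mode-to-scan mapping; the specific assignment and the 4x4 block size are our own assumptions for illustration, not the paper's tables.

```python
import numpy as np

def zigzag(block):
    """Classic 4x4 zigzag scan, returned as a list of coefficients."""
    order = sorted(((r, c) for r in range(4) for c in range(4)),
                   key=lambda rc: (rc[0] + rc[1],
                                   rc[1] if (rc[0] + rc[1]) % 2 == 0 else rc[0]))
    return [block[r, c] for r, c in order]

def mode_dependent_scan(residual, intra_mode):
    """Scan a 4x4 residual block according to the intra prediction mode."""
    if intra_mode == 'vertical':
        return list(residual.T.flatten())   # column-by-column scan
    if intra_mode == 'horizontal':
        return list(residual.flatten())     # row-by-row (raster) scan
    return zigzag(residual)                 # default: zigzag scan

block = np.arange(16).reshape(4, 4)
print(mode_dependent_scan(block, 'vertical'))
```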
{"title":"Intra-mode Dependent Coding Method for Image Compression","authors":"Yung-Chiang Wei, Jui-Che Teng, Chien-Wen Chung","doi":"10.1109/IAS.2009.346","DOIUrl":"https://doi.org/10.1109/IAS.2009.346","url":null,"abstract":"H.264/AVC based lossless image compression is a latest and extent technique for data compression. Block based intra-prediction is originally exploited in this extension. DPCM based intra-prediction is an outstanding technique to improve the compression ratio for this research area. In this paper, we propose an intra-mode dependent coding scheme to further improve the performance of DPCM-based compression technique. Different scan orders prior to entropy coding is considered to fit the various data attributes after predictions for different intra-modes.","PeriodicalId":240354,"journal":{"name":"2009 Fifth International Conference on Information Assurance and Security","volume":"69 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130256309","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Network Traffic Anomaly Detection Based on Self-Similarity Using HHT and Wavelet Transform (doi: 10.1109/IAS.2009.219)
Xiaorong Cheng, Kun Xie, Dong Wang
Network traffic anomaly detection can be performed through self-similarity analysis of the traffic: an abnormal network condition is indicated when the self-similarity parameters estimated from real-time data fall outside their acceptable ranges. A common way to estimate the self-similarity parameter is the wavelet transform; however, the wavelet transform fails to exclude the influence of the periodicity and trend terms of non-stationary signals. Since the Hilbert-Huang Transform (HHT) is particularly well suited to non-stationary signals, this paper designs a refined self-similarity parameter estimation algorithm that combines wavelet analysis with the HHT, and a set of experiments verifies the resulting improvement in both estimation accuracy and network traffic anomaly detection.
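For context, the wavelet part of such an estimator is typically a log-scale regression on detail-coefficient energies. The sketch below shows that baseline alone, assuming the input behaves like fractional Gaussian noise; the EMD/HHT preprocessing the paper adds to remove periodicity and trend terms is not shown, and the parameter choices are illustrative.

```python
import numpy as np
import pywt

def hurst_wavelet(signal, wavelet='db3', levels=8):
    """Estimate the Hurst parameter of a (roughly) fGn-like trace from the
    slope of log2 detail-coefficient energy versus decomposition level."""
    coeffs = pywt.wavedec(signal, wavelet, level=levels)   # [cA_n, cD_n, ..., cD_1]
    details = coeffs[1:][::-1]                             # cD_1 (finest) ... cD_n (coarsest)
    levels_j = np.arange(1, len(details) + 1)
    energy = np.array([np.mean(d ** 2) for d in details])
    slope, _ = np.polyfit(levels_j, np.log2(energy), 1)    # energy ~ 2^{j(2H-1)}
    return (slope + 1) / 2

rng = np.random.default_rng(0)
trace = rng.standard_normal(4096)          # white noise: expect H close to 0.5
print(round(hurst_wavelet(trace), 2))
```

An anomaly detector can then flag intervals whose estimated parameter drifts outside the range observed under normal traffic.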
{"title":"Network Traffic Anomaly Detection Based on Self-Similarity Using HHT and Wavelet Transform","authors":"Xiaorong Cheng, Kun Xie, Dong Wang","doi":"10.1109/IAS.2009.219","DOIUrl":"https://doi.org/10.1109/IAS.2009.219","url":null,"abstract":"Network traffic anomaly detection can be done through the self-similar analysis of network traffic. In this case, the abnormal condition of network can be indicated by investigating if the performance parameters of real time data locate at the acceptable ranges. A common method of estimating self-similar parameter is the Wavelet transform. However, the Wavelet transform fails to exclude the influence of non-stationary signal’s periodicity and trend term. In view of the fact that Hilbert-Huang Transform (HHT) has unique advantage on non-stationary signal treatment, in this paper, a refined self-similar parameter estimation algorithm is designed through the combination of wavelet analysis and Hilbert-Huang Transform and a set of experiments are run to verify the improvement in the accuracy of parameter estimation and network traffic anomaly detection.","PeriodicalId":240354,"journal":{"name":"2009 Fifth International Conference on Information Assurance and Security","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130279549","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Zero Packet Loss Hand-off Mechanism in SIP-Based Wireless Networks (doi: 10.1109/IAS.2009.323)
Ching-Lung Chang, Jia-Yi Syu, Y. Chu
As mobile wireless networks grow in popularity and pervasiveness, seamless mobility becomes essential for uninterrupted services in ubiquitous network environments. In this paper, we propose a zero-packet-loss handoff mechanism for real-time streaming services that builds on the SIP protocol together with mobile-agent, multicast, and buffering techniques to overcome the impact of handoff delay in SIP-based wireless networks. Simulation results show that the proposed scheme performs well in terms of packet loss and provides higher QoS during handoff. Furthermore, we implement and evaluate the scheme with an MP3 music streaming application.
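The buffering component comes down to holding enough media at the receiver (or a nearby agent) to play through the handoff gap while the multicast delivery path is switched. A back-of-the-envelope sizing, with illustrative numbers rather than values from the paper:

```python
def handoff_buffer_packets(handoff_delay_ms, bitrate_kbps, payload_bytes=1316):
    """Minimum number of buffered packets needed to play through a handoff gap."""
    bytes_needed = bitrate_kbps * 1000 / 8 * handoff_delay_ms / 1000
    return int(-(-bytes_needed // payload_bytes))   # ceiling division

# e.g. a 128 kbps MP3 stream and a 300 ms handoff gap
print(handoff_buffer_packets(300, 128))             # -> 4 packets
```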
{"title":"Zero Packet Loss Hand-off Mechanism in SIP-Based Wireless Networks","authors":"Ching-Lung Chang, Jia-Yi Syu, Y. Chu","doi":"10.1109/IAS.2009.323","DOIUrl":"https://doi.org/10.1109/IAS.2009.323","url":null,"abstract":"As mobile wireless networks increase in popularity and pervasiveness, seamless mobility is an important issue for uninterrupted services in ubiquitous network environments. In this paper, we propose a zero packet loss handoff mechanism for real-time streaming services, which is based on the SIP protocol cooperated with mobile agent, multicast, and buffering technique to overcome the impact of handoff delay in SIP-based wireless networks. The simulation results reveal that the proposed scheme has a great performance in packet loss and higher QoS during handoff. Furthermore, we realize and evaluate the proposed scheme by applying MP3-music streaming on the system.","PeriodicalId":240354,"journal":{"name":"2009 Fifth International Conference on Information Assurance and Security","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130291886","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Design of DBA Algorithm in EPON Uplink (doi: 10.1109/IAS.2009.272)
Ya-min Wang, Yan Liu
The Ethernet Passive Optical Network (EPON) combines the advantages of Ethernet and the Passive Optical Network (PON), providing high-speed access at low cost. In this paper, design issues of Dynamic Bandwidth Allocation (DBA) are studied on the basis of CBR (Constant Bit Rate) traffic, and an improved DBA algorithm is proposed. The algorithm effectively ensures the quality of service (QoS) of the entire EPON and the fairness of bandwidth allocation, and it meets the requirements of current networks carrying multiple services with multi-level priorities. It also provides a reference for standardizing multi-service QoS in EPON uplink bandwidth allocation.
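As a reference point for what such a DBA must do each polling cycle, the sketch below grants every ONU the minimum of its reported request and its guaranteed share and then redistributes leftover capacity among still-unsatisfied ONUs. This is the generic limited-service/excess-distribution pattern with illustrative names and numbers, not the paper's improved algorithm.

```python
def allocate_bandwidth(requests, guaranteed, cycle_capacity):
    """Simple DBA: guaranteed share first, then distribute the excess fairly.

    requests:       bytes requested by each ONU via REPORT messages
    guaranteed:     per-ONU guaranteed bytes per polling cycle (e.g. from the SLA)
    cycle_capacity: total upstream bytes available in one polling cycle
    """
    grants = [min(r, g) for r, g in zip(requests, guaranteed)]
    excess = cycle_capacity - sum(grants)
    unsatisfied = [i for i, r in enumerate(requests) if r > grants[i]]
    while excess > 0 and unsatisfied:
        share = excess // len(unsatisfied)        # equal share of the excess
        if share == 0:
            break
        still = []
        for i in unsatisfied:
            extra = min(share, requests[i] - grants[i])
            grants[i] += extra
            excess -= extra
            if requests[i] > grants[i]:
                still.append(i)
        unsatisfied = still
    return grants

print(allocate_bandwidth(requests=[8000, 2000, 15000],
                         guaranteed=[6000, 6000, 6000],
                         cycle_capacity=20000))    # -> [8000, 2000, 10000]
```

A CBR- and priority-aware variant would additionally reserve the constant-rate portion of each ONU's traffic before distributing the excess.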
{"title":"Design of DBA Algorithm in EPON Uplike","authors":"Ya-min Wang, Yan Liu","doi":"10.1109/IAS.2009.272","DOIUrl":"https://doi.org/10.1109/IAS.2009.272","url":null,"abstract":"Ethernet Passive Optical Network (EPON) has combined the advantages of Ethernet and Passive Optical Network (PON). High speed accessing at lower price has been realized in this way. In this paper, based on CBR (Constant Bit Rate), design issues of Dynamic Bandwidth Allocation (DBA) are studied and the improved DBA algorithm is proposed. Ensure effectively Quality of Service (QoS) of the entire EPON and fairness of bandwidth allocation, and meet the request of the current network with multi-service, multi-level priorities. Provide reference for the standard of multi-service QoS in EPON uplink bandwidth allocation.","PeriodicalId":240354,"journal":{"name":"2009 Fifth International Conference on Information Assurance and Security","volume":"57 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127898293","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Number-Theoretic Attack on Lyuu-Wu's Multi-proxy Multi-signature Scheme (doi: 10.1109/IAS.2009.130)
Fanyu Kong, Jia Yu
Y. D. Lyuu and M. L. Wu proposed an improved multi-proxy multi-signature scheme that was claimed to resist forgery attacks. Later, L. Guo and G. Wang presented an insider attack on the Lyuu-Wu scheme. In this paper, we propose a new attack on the Lyuu-Wu scheme that factors the parameters $N$ and $Q$ with efficient number-theoretic algorithms whenever $Q$ is roughly larger than the square root of $N$. It follows that, in this case, the Lyuu-Wu scheme is vulnerable to forgery by the proxy signers.
{"title":"Number-Theoretic Attack on Lyuu-Wu's Multi-proxy Multi-signature Scheme","authors":"Fanyu Kong, Jia Yu","doi":"10.1109/IAS.2009.130","DOIUrl":"https://doi.org/10.1109/IAS.2009.130","url":null,"abstract":"Y. D. Lyuu and M. L. Wu had proposed an improved multi-proxy multi-signature scheme, which was claimed to resist the forge attack. Lately, L. Guo and G. Wang found an inside attack on the Lyuu-Wu's scheme. In this paper, we propose a new attack on Lyuu-Wu's scheme, which can factor the parameter $N$ and $Q$ by using efficient number-theoretic algorithms when $Q$ is roughly larger than the square root of $N$. It follows that Lyuu-Wu's scheme suffers from the forge attack from the proxy signers in that case.","PeriodicalId":240354,"journal":{"name":"2009 Fifth International Conference on Information Assurance and Security","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128863787","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A New Fingerprint Sequences Matching Algorithm for Content-Based Copy Detection (doi: 10.1109/IAS.2009.284)
Rongxian Nie, Guiguang Ding, Jianmin Wang, Li Zhang
Content-based copy detection (CBCD) is increasingly important in information security, assurance, and copyright protection. Sequence matching, one of the key technologies in CBCD, is critical for fast and accurate querying. In this paper, we propose a novel fingerprint sequence matching algorithm based on dynamic programming. The method is applicable to all kinds of fingerprint sequence matching. We evaluate it experimentally, and the results demonstrate that it is effective for sequence matching.
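As an illustration of dynamic-programming sequence matching in this setting, the sketch below performs a generic edit-distance-style alignment over per-frame fingerprints, counting a frame pair as matching when its distance falls below a threshold; the cost model and threshold are our own assumptions rather than the paper's. The best alignment cost between a query of length m and a reference of length n is computed in O(mn) time.

```python
def match_cost(query, reference, frame_dist, threshold=0.3):
    """Edit-distance-style DP alignment between two fingerprint sequences.

    query, reference: lists of per-frame fingerprints (e.g. feature vectors)
    frame_dist:       distance function between two frame fingerprints
    Returns the minimal alignment cost; a small cost suggests a copy.
    """
    m, n = len(query), len(reference)
    dp = [[0.0] * (n + 1) for _ in range(m + 1)]     # dp[0][j] = 0: copy may start anywhere
    for i in range(1, m + 1):
        dp[i][0] = float(i)                          # skipping query frames costs 1 each
        for j in range(1, n + 1):
            d = frame_dist(query[i - 1], reference[j - 1])
            sub = 0.0 if d < threshold else 1.0      # match or substitution
            dp[i][j] = min(dp[i - 1][j - 1] + sub,   # align the two frames
                           dp[i - 1][j] + 1.0,       # frame missing from reference
                           dp[i][j - 1] + 1.0)       # extra frame in reference
    return min(dp[m])                                # the copy may also end anywhere

# Toy example with scalar "fingerprints"
q = [0.10, 0.22, 0.35]
r = [0.90, 0.11, 0.21, 0.36, 0.80]
print(match_cost(q, r, frame_dist=lambda a, b: abs(a - b)))   # -> 0.0
```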
{"title":"A New Fingerprint Sequences Matching Algorithm for Content-Based Copy Detection","authors":"Rongxian Nie, Guiguang Ding, Jianmin Wang, Li Zhang","doi":"10.1109/IAS.2009.284","DOIUrl":"https://doi.org/10.1109/IAS.2009.284","url":null,"abstract":"Content-based copy detection (CBCD) is more and more important in the field of information security, assurance, copyright protection etc. The sequences matching, one of the key technologies in CBCD, is very critical for querying quickly and accurately. In this paper, we propose a novel fingerprint sequences matching algorithm based on the dynamic programming. This method is universal to all kinds of fingerprint sequences matching. We examined this method with experiments at last in this paper. The experimental results demonstrate it’s effective in sequences matching.","PeriodicalId":240354,"journal":{"name":"2009 Fifth International Conference on Information Assurance and Security","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125362600","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}