We propose an algorithm to separate tables and math-zones from document images. The algorithm relies on the spatial characteristics of tables and math-zones in a document. We observe that tables have distinct columns, which implies that the gaps between fields are substantially larger than the gaps between words in text lines, and that in math-zones the characters and symbols are less dense than in normal text lines. These deceptively simple observations have led us to design a simple but powerful table and math-zone detection system with low computational cost.
"Detection and segmentation of tables and math-zones from document images," Sekhar Mandal, S. Chowdhury, A. Das, B. Chanda. Proceedings of the 2006 ACM symposium on Applied computing, 2006-04-23. DOI: 10.1145/1141277.1141469.
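The gap-based observation above can be sketched as a simple heuristic. The function below is an illustrative reconstruction, not the authors' exact algorithm, and the `gap_ratio` threshold is an assumed parameter:

```python
# Illustrative sketch (not the paper's exact algorithm): flag a text line as a
# candidate table row when its widest inter-word gap is much larger than the
# typical word gap, mirroring the observation that field gaps in tables are
# substantially larger than word gaps in ordinary text lines.
def is_table_row(word_boxes, gap_ratio=3.0):
    """word_boxes: list of (x_left, x_right) tuples, sorted left to right."""
    gaps = [b[0] - a[1] for a, b in zip(word_boxes, word_boxes[1:])]
    if len(gaps) < 2:
        return False
    median_gap = sorted(gaps)[len(gaps) // 2]
    # A field gap is "substantially larger" than the typical word gap.
    return median_gap > 0 and max(gaps) > gap_ratio * median_gap
```

For instance, a line whose words cluster into two groups separated by a wide gutter (gaps of 5, 110, 5 pixels) would be flagged, while a line with uniform word gaps would not.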
Delivery of IT projects in today's rapidly changing business environment is a challenge. Conventional investment approaches result in lumpy capital allocations, which encourage managers to include many potential future business requirements in each capital request. This locks in the delivery of future requirements despite high market uncertainty. The resulting projects are large and complex from both a technical and a management perspective. In the management literature, new frameworks are emerging that draw on Real Options valuations to justify early infrastructure investment and provide fine-grained control over business initiatives in an uncertain world. Business managers can then build on the infrastructure by selecting business initiatives to maximise option value. However, this requires engineering approaches that separate infrastructure and business requirements and minimise their dependencies. This paper explores a contingency approach to Requirements Engineering (RE) to minimise initial requirements and maximise future strategic options, challenging the research community's dominant paradigm of completeness, correctness and consistency.
"A contingency view of organizational infrastructure requirements engineering," Karl Cox, S. Bleistein, P. Reynolds, A. Thorogood. Proceedings of the 2006 ACM symposium on Applied computing, 2006-04-23. DOI: 10.1145/1141277.1141628.
This paper presents a domotic house gateway capable of seamlessly interacting with different devices from heterogeneous domotic systems and appliances. The gateway also makes it possible to automate device cooperation through an embedded rule-based engine, which can be dynamically and automatically updated to accommodate new needs and anticipate users' actions. Several practical applications show the effectiveness of the system.
"Domotic house gateway," Paolo Pellegrino, Dario Bonino, Fulvio Corno. Proceedings of the 2006 ACM symposium on Applied computing, 2006-04-23. DOI: 10.1145/1141277.1141730.
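The embedded rule-based engine described above can be illustrated with a minimal sketch. The rule structure, device names and thresholds here are hypothetical, not taken from the paper:

```python
# Minimal sketch of a rule-based automation engine of the kind the gateway
# embeds; rules are (condition, action) pairs evaluated against a shared
# device-state dictionary whenever an event arrives.
class RuleEngine:
    def __init__(self):
        self.rules = []  # list of (condition, action) callables

    def add_rule(self, condition, action):
        """Rules can be added (or replaced) at run time, allowing the
        engine to be updated dynamically as in the paper's gateway."""
        self.rules.append((condition, action))

    def on_event(self, state):
        """Fire every rule whose condition holds in the current state."""
        for condition, action in self.rules:
            if condition(state):
                action(state)


engine = RuleEngine()
# Hypothetical cooperation rule: "when someone enters a dark room,
# switch the light on."
engine.add_rule(
    lambda s: s.get("presence") and s.get("lux", 0) < 50,
    lambda s: s.update(light="on"),
)
state = {"presence": True, "lux": 10}
engine.on_event(state)
```

Because rules are plain data added at run time, the rule set can be swapped out without restarting the gateway, which is the property the abstract emphasises.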
Practical experience has shown that separating security enforcement code from functional code, using separation-of-concerns techniques such as behavioural reflection, leads to improvements in code understandability and maintainability. However, using these techniques requires a consistent and declarative way to specify policies. We have developed a prototype tool that allows the use of Ponder policies that are enforced by the Kava metaobject protocol. The prototype translates high-level policies into configuration files used to enforce the policies upon Java applications.
"Policy-driven reflective enforcement of security policies," I. Welch, Fan Lu. Proceedings of the 2006 ACM symposium on Applied computing, 2006-04-23. DOI: 10.1145/1141277.1141645.
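The general idea of compiling a declarative policy into an enforcement wrapper that sits between callers and functional code can be sketched as follows; this is a hypothetical illustration and does not use the real Ponder or Kava APIs:

```python
# Hypothetical illustration of policy-driven enforcement: a declarative
# policy (here a plain dict) is turned into a wrapper that checks the
# policy before every call, keeping security code out of functional code.
policy = {"target": "delete_user", "require_role": "admin"}  # toy policy


def enforce(policy, func):
    """Compile a declarative policy into an interception wrapper."""
    def wrapper(caller_role, *args, **kwargs):
        if caller_role != policy["require_role"]:
            raise PermissionError(
                f"{policy['target']} requires role {policy['require_role']}")
        return func(*args, **kwargs)
    return wrapper


def delete_user(name):
    # Functional code: contains no security logic at all.
    return f"deleted {name}"


delete_user_guarded = enforce(policy, delete_user)
```

The functional code stays oblivious to security concerns; changing the policy dict changes enforcement without touching `delete_user`, which is the maintainability benefit the abstract describes.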
The heterogeneity of architectural constraint languages makes the transformation of constraints throughout the development process difficult. Indeed, these languages have significantly different metamodels, which makes the definition of mapping rules complex. In this paper, we present an approach that aims at simplifying transformations of architectural constraints. It is based on an architectural constraint language (ACL), which includes one core constraint expression language and different profiles. Each profile is defined upon a metamodel that represents the architectural abstractions manipulated at each stage of the development process.
"Simplifying transformation of software architecture constraints," Chouki Tibermacine, Régis Fleurquin, Salah Sadou. Proceedings of the 2006 ACM symposium on Applied computing, 2006-04-23. DOI: 10.1145/1141277.1141568.
We investigate conditions under which an infinite set of atomic messages can be replaced with one or two values without affecting the correctness of a security protocol. The work is conducted using the strand spaces formalism, but the results apply to all protocol analysis techniques, and should be of particular value to those using model checking. The implications of the central result are discussed.
"To infinity and beyond or, avoiding the infinite in security protocol analysis," J. Heather, Steve A. Schneider. Proceedings of the 2006 ACM symposium on Applied computing, 2006-04-23. DOI: 10.1145/1141277.1141359.
Several different techniques and software tools aim to improve the accuracy of results computed in a fixed finite precision. Here we focus on a method to improve the accuracy of polynomial evaluation. It is well known that the Fused Multiply and Add operation, available on some microprocessors such as the Intel Itanium, slightly improves the accuracy of the Horner scheme. In this paper, we propose an accurate compensated Horner scheme specially designed to take advantage of the Fused Multiply and Add. We prove that the computed result is as accurate as if it were computed in twice the working precision. The algorithm we present is fast, since it only requires well-optimizable floating-point operations performed in the same working precision as the given data.
"Improving the compensated Horner scheme with a fused multiply and add," S. Graillat, P. Langlois, N. Louvet. Proceedings of the 2006 ACM symposium on Applied computing, 2006-04-23. DOI: 10.1145/1141277.1141585.
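The compensated Horner scheme can be sketched in a few lines. The paper's variant derives the product error term with the Fused Multiply and Add; since an FMA primitive is not universally available in Python (`math.fma` is only in very recent versions), this sketch uses Dekker's splitting for the error-free product instead, which computes the same error terms in software:

```python
# Sketch of the compensated Horner scheme: run the classic Horner recurrence,
# capture the rounding error of every product and sum with error-free
# transformations, and accumulate those errors in a correction term.
def two_sum(a, b):
    """Knuth's TwoSum: s + e == a + b exactly, s = fl(a + b)."""
    s = a + b
    z = s - a
    e = (a - (s - z)) + (b - z)
    return s, e

def split(a):
    """Veltkamp splitting of a double into two 26-bit halves."""
    c = 134217729.0 * a  # 2**27 + 1
    hi = c - (c - a)
    return hi, a - hi

def two_prod(a, b):
    """Dekker's TwoProduct: p + e == a * b exactly, p = fl(a * b).
    (The paper replaces this with a single FMA: e = fma(a, b, -p).)"""
    p = a * b
    a1, a2 = split(a)
    b1, b2 = split(b)
    e = a2 * b2 - (((p - a1 * b1) - a2 * b1) - a1 * b2)
    return p, e

def comp_horner(coeffs, x):
    """Compensated Horner evaluation; coeffs[i] is the coefficient of x**i."""
    n = len(coeffs) - 1
    s = coeffs[n]
    c = 0.0                       # running compensation term
    for i in range(n - 1, -1, -1):
        p, pi = two_prod(s, x)    # product and its rounding error
        s, sigma = two_sum(p, coeffs[i])  # sum and its rounding error
        c = c * x + (pi + sigma)  # Horner recurrence on the errors
    return s + c                  # result accurate as if in doubled precision
```

On an ill-conditioned example such as the expanded form of (1 - x)^3 evaluated near x = 1, plain Horner loses almost all significant digits while the compensated result stays close to the exact value.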
MPEG-7 has become a key standard for multimedia searching, filtering and retrieval research. Understanding the experiences of users of MPEG-7-based tools is necessary if we are to improve how MPEG-7 metadata is applied in practice. COSMOS-7 enables structured modeling and filtering of MPEG-7-compliant metadata for digital video. We describe two COSMOS-7 front end systems: COSMOSIS, for modeling digital video metadata, and the Filtering Manager, for filtering digital video metadata. We then present an empirical evaluation of these front end systems undertaken with a sample of end users from a London, UK, theater company. Our results reveal that end users progress through a number of key stages when modeling and filtering video content.
"MPEG-7 in action: end user experiences with COSMOS-7 front end systems," H. Agius, M. Angelides. Proceedings of the 2006 ACM symposium on Applied computing, 2006-04-23. DOI: 10.1145/1141277.1141591.
Advances in data acquisition hardware and embedded systems have led to the data stream era. A growing number of emerging applications, from business to scientific to industrial, continuously generate open-ended data streams. In practice, such data cannot be stored: they must be queried and analyzed as they arrive and then discarded right away. In many cases, we need to extract some sort of knowledge from these continuous streams, which challenges the scalability of many batch-learning methods. Therefore, this new field has attracted researchers from different disciplines over the past few years. Examples of data streams include customer click streams, network event logs, telephone records, large sets of web pages, multimedia data, scientific data, and sets of retail chain transactions. Applications include credit card fraud protection, target marketing, and intrusion detection, for which it is not possible to collect all relevant input data. In these environments, KDD systems have to operate online under memory and time limitations.
"Editorial message: special track on data streams," J. Aguilar-Ruiz, Francisco J. Ferrer-Troyano. Proceedings of the 2006 ACM symposium on Applied computing, 2006-04-23. DOI: 10.1145/1141277.1141425.
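A classic example of the bounded-memory constraint described above is reservoir sampling, which maintains a uniform random sample of a stream that cannot be stored; it is shown here purely as an illustration of the setting, not as an algorithm from the track:

```python
import random

# Reservoir sampling: keep a uniform random sample of k items from a stream
# of unknown (possibly unbounded) length, using O(k) memory and one pass.
def reservoir_sample(stream, k, rng=None):
    rng = rng or random.Random()
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)        # fill the reservoir first
        else:
            j = rng.randrange(i + 1)      # item i survives with prob. k/(i+1)
            if j < k:
                reservoir[j] = item       # evict a random earlier item
    return reservoir
```

Each item is inspected once and discarded, exactly the "query and analyze on arrival" regime the editorial describes.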
Careless development of web-based applications results in vulnerable code being deployed and made available to the whole Internet, creating easily exploitable entry points for the compromise of entire networks. To ameliorate this situation, we propose an approach that composes a web-based anomaly detection system with a reverse HTTP proxy. The approach is based on the assumption that a web site's content can be split into security-sensitive and non-sensitive parts, which are distributed to different servers. The anomaly score of a web request is then used to route suspicious requests to copies of the web site that do not hold sensitive content. By doing this, it is possible to serve anomalous but benign requests that do not require access to sensitive information, considerably reducing the impact of false positives. We developed a prototype of our approach and evaluated its applicability with respect to several existing web-based applications, showing that our approach is both feasible and effective.
"An anomaly-driven reverse proxy for web applications," Fredrik Valeur, G. Vigna, Christopher Krügel, E. Kirda. Proceedings of the 2006 ACM symposium on Applied computing, 2006-04-23. DOI: 10.1145/1141277.1141361.
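The score-based routing decision can be sketched as follows; the anomaly scorer, server names and threshold below are illustrative stand-ins, not the authors' trained detector:

```python
# Sketch of anomaly-driven routing: suspicious requests go to a replica of
# the site that holds no sensitive content, so anomalous-but-benign requests
# are still served instead of being blocked outright.
SENSITIVE_SERVER = "https://full.example.org"    # hypothetical: full content
SANITIZED_SERVER = "https://public.example.org"  # hypothetical: no sensitive data

def anomaly_score(request_path):
    """Toy scorer: unusually long or quote-bearing paths look anomalous.
    (A real deployment would use a trained web anomaly detector.)"""
    score = 0.0
    if len(request_path) > 64:
        score += 0.5
    if "'" in request_path or '"' in request_path:
        score += 0.5
    return score

def route(request_path, threshold=0.5):
    """Route suspicious requests to the replica without sensitive content."""
    if anomaly_score(request_path) >= threshold:
        return SANITIZED_SERVER
    return SENSITIVE_SERVER
```

A false positive is thus downgraded from "request blocked" to "request served without sensitive data", which is how the approach reduces the cost of misclassification.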