Object types are abstract specifications of object behaviors; object behaviors are abstractly indicated by object component interdependencies; and program verification is based on object behaviors. Conventional object type systems do not take object component interdependencies into account. As a result, distinct object behaviors are conflated, which can lead to fundamental typing/subtyping loopholes and to problems in program verification. In this paper, we first identify a program verification problem caused by loose conventional object typing/subtyping, which is in turn caused by the neglect of object component interdependencies. Then, as a new object typing scheme, we introduce object type graphs (OTG), in which object component interdependencies are integrated into object types. Finally, we show how the verification problem can be resolved under OTG.
Cong-Cong Xing. "Enhancing program verifications by restricting object types." In Proceedings of the 2006 ACM Symposium on Applied Computing (SAC '06), April 23, 2006. DOI: 10.1145/1141277.1141705.
Many numerical computations, whether solutions to systems of differential equations or optimization problems from applied areas such as protein folding, do not come with guaranteed results. In many situations we have a numerical solution, and we may even have a theorem guaranteeing that this numerical solution eventually converges to the exact one, but the algorithm itself provides no guaranteed bound on the difference between the approximate solution and the desired exact one. Consequently, in some practical situations, numerical solutions are much farther from the (unknown) exact solutions than users assume.
M. Ceberio, V. Kreinovich, M. Rueher. "Editorial: track reliable computations and their applications." In Proceedings of the 2006 ACM Symposium on Applied Computing (SAC '06), April 23, 2006. DOI: 10.1145/1141277.1141661.
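A standard way to obtain the guaranteed bounds this editorial calls for is to maintain an enclosure rather than a point estimate. The following sketch (illustrative only; the function and tolerance are our own choices, not from the editorial) brackets a root by bisection, so the exact answer is provably inside the returned interval, unlike a plain iterative approximation:

```python
# Interval-bisection sketch: unlike a bare Newton-style iteration, bisection
# maintains a guaranteed enclosure [lo, hi] of the root of f throughout.

def bisect_enclosure(f, lo, hi, width=1e-9):
    """Return an interval [lo, hi] guaranteed to contain a root of f,
    assuming f is continuous and f(lo), f(hi) have opposite signs."""
    assert f(lo) * f(hi) < 0, "need a sign change to guarantee a root"
    while hi - lo > width:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid          # sign change is in the left half
        else:
            lo = mid          # sign change is in the right half
    return lo, hi

# Enclose sqrt(2) as the root of x^2 - 2 on [1, 2].
lo, hi = bisect_enclosure(lambda x: x * x - 2, 1.0, 2.0)
```

The point of the sketch is the contract: the caller receives a width bound on the error, which is exactly what the editorial notes most numerical algorithms fail to provide.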
We propose an algorithm to separate tables and math-zones from document images. The algorithm relies on the spatial characteristics of tables and math-zones in a document: tables have distinct columns, so the gaps between fields are substantially larger than the gaps between words in text lines, and in math-zones the characters and symbols are less dense than in normal text lines. These deceptively simple observations have led us to design a simple but powerful table and math-zone detection system with low computational cost.
Sekhar Mandal, S. Chowdhury, A. Das, B. Chanda. "Detection and segmentation of tables and math-zones from document images." In Proceedings of the 2006 ACM Symposium on Applied Computing (SAC '06), April 23, 2006. DOI: 10.1145/1141277.1141469.
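The gap heuristic the abstract describes can be sketched in a few lines. This is a hypothetical reconstruction, not the authors' implementation; the word-box representation and the threshold factor are our own assumptions:

```python
# Flag a text line as a table row when its largest inter-word gap greatly
# exceeds the typical (median) word gap -- the column-gap observation from
# the abstract. Word boxes are (x_start, x_end) pairs, sorted left to right.

def is_table_row(word_boxes, factor=3.0):  # factor is an assumed threshold
    if len(word_boxes) < 2:
        return False
    gaps = [b[0] - a[1] for a, b in zip(word_boxes, word_boxes[1:])]
    typical = sorted(gaps)[len(gaps) // 2]  # median inter-word gap
    return max(gaps) > factor * max(typical, 1)

# A normal text line: roughly uniform gaps between words.
text_line = [(0, 40), (45, 90), (95, 130), (135, 180)]
# A table row: two fields separated by a wide column gap.
table_row = [(0, 40), (45, 80), (200, 240), (245, 290)]
```

The same relative-density idea, applied to character spacing instead of word gaps, would cover the math-zone case.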
Delivery of IT projects in today's rapidly changing business environment is a challenge. Conventional investment approaches result in lumpy capital allocations, which encourage managers to include many potential future business requirements in each capital request. This locks in the delivery of future requirements despite high market uncertainty. The resulting projects are large and complex from both a technical and a management perspective. In the management literature, new frameworks are emerging that draw on Real Options valuations to justify early infrastructure investment and provide fine-grained control over business initiatives in an uncertain world. Business managers can then build on the infrastructure by selecting business initiatives to maximise option value. However, this requires engineering approaches that separate infrastructure from business requirements and minimise the dependencies between them. This paper explores a contingency approach to Requirements Engineering (RE) that minimises initial requirements and maximises future strategic options, challenging the research community's dominant paradigm of completeness, correctness and consistency.
Karl Cox, S. Bleistein, P. Reynolds, A. Thorogood. "A contingency view of organizational infrastructure requirements engineering." In Proceedings of the 2006 ACM Symposium on Applied Computing (SAC '06), April 23, 2006. DOI: 10.1145/1141277.1141628.
For the fourth consecutive year, the annual ACM-SAC symposium is hosting this Software Engineering track. A few changes have taken place since last year: from last year's SE-track team, only Sung Shin and Stefan Gruner are continuing to organise the track, and, on the occasion of the 21st SAC symposium, the track's previous subtitle "Methods, Practices and Tools" has been changed to "Sound Solutions for the 21st Century". Moreover, for the very first time, this track is not only supported by the ACM via the SAC symposium but also endorsed by two further software engineering societies, namely Formal Methods Europe (FME) and the European Association of Software Science and Technology (EASST).
Stefan Gruner, Sung-uk Shin. "Editorial message." In Proceedings of the 2006 ACM Symposium on Applied Computing (SAC '06), April 23, 2006. DOI: 10.1145/1141277.1141684.
This study examines the application of preference programming approaches and techniques for decision support during pre-negotiations over services. In hierarchical decision analysis models, the need for multi-attribute evaluation techniques that can incorporate uncertainty directly in the modeling phase has led to the use of the 'interval' approach, in which preference judgments are presented as ranges covering all possible value estimates. This paper reports the results of applying an interval preference programming approach and technique in decision support scenarios for reasoning during pre-negotiations over services. The aim has been to critically evaluate the approach and establish its applicability for ranking multi-dimensional service offers. Our experimental results using interval SMART in pre-negotiation decision-making scenarios show that while the dominance relations among alternatives remained unchanged after the introduction of uncertainty intervals, the rank order and dominance relations of the alternatives may change when alternatives with inferior values are added or dropped.
P. Tsvetinov, A. Underwood, Taizan Chan. "Dominance and ranking issues applying interval techniques in pre-negotiations for services." In Proceedings of the 2006 ACM Symposium on Applied Computing (SAC '06), April 23, 2006. DOI: 10.1145/1141277.1141482.
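Interval scoring and the resulting dominance test can be sketched as follows. This is a hedged illustration in the spirit of interval SMART; the weights and value intervals are invented for the example, not taken from the study:

```python
# Each alternative gets a [low, high] score by accumulating weighted
# attribute value intervals; A dominates B when A's worst case is at
# least B's best case, so no realisation of the uncertainty reverses them.

def interval_score(weights, value_intervals):
    lo = sum(w * v[0] for w, v in zip(weights, value_intervals))
    hi = sum(w * v[1] for w, v in zip(weights, value_intervals))
    return lo, hi

def dominates(a, b):
    return a[0] >= b[1]  # a's lower bound meets b's upper bound

weights = [0.5, 0.3, 0.2]                      # illustrative attribute weights
offer_a = interval_score(weights, [(0.8, 0.9), (0.7, 0.8), (0.6, 0.7)])
offer_b = interval_score(weights, [(0.3, 0.5), (0.4, 0.5), (0.2, 0.4)])
```

When two offers' score intervals overlap, neither dominates, which is precisely where the rank-order instabilities reported in the abstract can arise as alternatives are added or dropped.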
In this paper we present a tool that assists in the automated analysis of a Java application, aimed at two purposes: (i) identifying class structure and, within this, micro-architectures that conform to known design patterns; (ii) providing visual representations of classes, concerns and their relationships. This affords a more abstract view of the analysed application, letting its structure emerge more clearly and its components be understood separately. As a result, it becomes easier for developers to assess whether well-known desirable characteristics, notably those favouring modularity and concern separation, or rather bad design choices, have been incorporated into the application. The proposed approach can be helpful both when undertaking a new development effort and when reverse engineering an existing application with a view to its evolution.
G. Pappalardo, E. Tramontana. "Automatically discovering design patterns and assessing concern separations for applications." In Proceedings of the 2006 ACM Symposium on Applied Computing (SAC '06), April 23, 2006. DOI: 10.1145/1141277.1141647.
The loadable kernel modules supported by Linux provide many benefits, such as a small kernel, on-demand loading, and easy software upgrades. However, since modules execute in privileged mode, a trivial misuse in a module may cause a critical system halt or a deadlock. This paper presents a kernel resource protector that shields the kernel from faults generated by modules. The protector models the system with two kinds of objects: module objects and resource objects. By observing the interrelations between the two, the protector can detect misuses by modules and take action to resolve erroneous situations. An implementation study has shown that the protector can find memory leaked by modules and can reclaim it without degrading system performance. The proposed protector makes Linux more robust, which is indispensable in systems equipped with NVRAM (Non-Volatile RAM) such as FRAM and PRAM.
Jongmoo Choi, Seungjae Baek, Sung Y. Shin. "Design and implementation of a kernel resource protector for robustness of Linux module programming." In Proceedings of the 2006 ACM Symposium on Applied Computing (SAC '06), April 23, 2006. DOI: 10.1145/1141277.1141621.
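The bookkeeping idea behind the protector, tracking the interrelation between module objects and the resources they hold so leaks can be reclaimed, can be modelled in user space. This is an illustrative sketch of the concept, not the authors' kernel code:

```python
# Per-module resource ledger: every allocation is attributed to a module
# object, so resources a module never freed can be found and reclaimed.

class ResourceProtector:
    def __init__(self):
        self.held = {}  # module name -> set of outstanding resource ids

    def on_alloc(self, module, resource_id):
        self.held.setdefault(module, set()).add(resource_id)

    def on_free(self, module, resource_id):
        self.held.get(module, set()).discard(resource_id)

    def reclaim(self, module):
        """Return (and forget) the resources the module never freed."""
        return self.held.pop(module, set())

p = ResourceProtector()
p.on_alloc("netdrv", 0x100)   # "netdrv" and the ids are hypothetical
p.on_alloc("netdrv", 0x104)
p.on_free("netdrv", 0x100)
leaked = p.reclaim("netdrv")  # the allocation the module forgot to free
```

In the kernel setting the same ledger would be populated by interposing on the allocation and release paths that modules call.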
Component-oriented programming facilitates the development of reusable application parts encapsulated behind well-defined interfaces. There is, however, a tension between compatibility and evolution: the interface of a component may constrain refactoring, or evolving an interface may require the manual development of multiple ad hoc adaptation layers. We present the declarative language VIDL for specifying component interface evolution. VIDL allows components to evolve while automatically generating efficient adapter code that statically guarantees interface compatibility with components that rely on earlier versions of the interface.
Emanuela P. Lins, U. Schultz. "Supporting transparent evolution of component interfaces." In Proceedings of the 2006 ACM Symposium on Applied Computing (SAC '06), April 23, 2006. DOI: 10.1145/1141277.1141658.
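The adapter pattern that such generated code embodies can be illustrated with a toy example. This is not VIDL itself, and the interfaces below are invented; it only shows the shape of the compatibility layer an evolution specification would generate:

```python
# A client written against interface version 1 keeps working against
# version 2 through a generated-style adapter that fills in the new
# required parameter.

class ServiceV2:
    """Evolved interface: greet() gained a required 'formal' parameter."""
    def greet(self, name, formal):
        return ("Dear " if formal else "Hi ") + name

class V1Adapter:
    """Adapter exposing the earlier V1 interface over a V2 component."""
    def __init__(self, v2):
        self._v2 = v2

    def greet(self, name):
        # V1 had no notion of formality; the adapter pins a default.
        return self._v2.greet(name, formal=False)

legacy_view = V1Adapter(ServiceV2())
```

The benefit of generating such adapters from a declarative specification, rather than writing them by hand, is that compatibility can be checked statically for every anterior interface version at once.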
We present a method for checking whether macro definitions written in C respect their specification. We are interested in simple time-bounded reactive properties. Using the abstract interpretation framework and a compact representation of sets of traces, we provide a formalization of the specification, the semantics, and the algorithms that build a representation of the set of traces.
Guillaume Capron. "Static analysis of time bounded reactive properties of Boolean symbols." In Proceedings of the 2006 ACM Symposium on Applied Computing (SAC '06), April 23, 2006. DOI: 10.1145/1141277.1141707.