A weighted directed graph G_Q = (V, E) is defined for Q, the set of problems known to be NP-complete: each vertex v_i in V is an NP-complete problem p_i in Q, and the weight of an edge in E is the complexity of the transformation used to prove the NP-completeness of a problem. If the complexity of problem p_i relative to problem p_j is defined as the minimum complexity of a reduction from p_i to p_j (considering all paths from v_i to v_j in G_Q), then the relative complexity graph of Q is the weighted graph G_QR = (V, E'), with the weight of edge e'_ij being the minimum complexity of p_i relative to p_j. An O(n^3) variant of the shortest path problem on directed graphs, similar to Floyd's algorithm for all-pairs shortest paths, is used to construct G_QR. G_QR can be updated with an O(n^2) algorithm when a reduction of smaller complexity is found or a new edge is added to the graph, and with an O(n) algorithm when a new vertex (corresponding to the discovery of a new NP-complete problem) is added.
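The Floyd-style construction can be sketched as follows. As a simplifying assumption not taken from the paper, each edge weight is modeled as the degree d of the polynomial O(n^d) bounding the reduction; composing an O(n^a) reduction with an O(n^b) one then costs O(n^(a*b)), since the first may blow the input up to size O(n^a).

```python
import math

def relative_complexity(n, edges):
    """Floyd-style closure over reduction complexities (a sketch).

    edges[(i, j)] = d means an O(n^d) reduction from p_i to p_j.
    Composition multiplies degrees, so we minimize over all paths
    with * as the path-extension operator instead of +.
    """
    INF = math.inf
    w = [[INF] * n for _ in range(n)]
    for i in range(n):
        w[i][i] = 1  # identity reduction: a linear copy
    for (i, j), d in edges.items():
        w[i][j] = min(w[i][j], d)
    for k in range(n):            # O(n^3), as in Floyd's algorithm
        for i in range(n):
            for j in range(n):
                if w[i][k] * w[k][j] < w[i][j]:
                    w[i][j] = w[i][k] * w[k][j]
    return w

# A quadratic reduction followed by a cubic one (degree 2*3 = 6)
# beats a direct degree-7 reduction.
w = relative_complexity(3, {(0, 1): 2, (1, 2): 3, (0, 2): 7})
```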
"A graph for NP-complete problems" by G. Sampath. ACM-SE 35, April 2, 1997. DOI: 10.1145/2817460.2817501.
Many authors have used the continuous relaxation of linear formulations of quadratic 0-1 optimization problems subject to linear constraints in order to obtain a bound on the optimal value by linear programming. But usually the optimal solutions are non-integer vectors, and thus are not feasible for the 0-1 problem. In this paper, we propose a linear-programming-based scheme for building ε-approximate polynomial-time algorithms for quadratic 0-1 maximization problems subject to linear constraints. Using this scheme, we obtain ε-approximate polynomial-time algorithms for several basic problems: the maximization of an unconstrained quadratic posiform, an assignment problem which contains k-max-cut as a particular case, k-max-cut, the k-cluster problem on bipartite graphs, and the bipartitioning problem (max-cut where one side has cardinality k).
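The standard linearization behind such formulations replaces each product x_i x_j with a variable y_ij constrained by y_ij <= x_i, y_ij <= x_j, y_ij >= x_i + x_j - 1, and y_ij >= 0. A minimal check (illustrative of the technique, not the authors' specific scheme) shows that at every 0-1 point these constraints pin y_ij to exactly x_i * x_j, so the linear and quadratic objectives agree on integer solutions:

```python
from itertools import product

def forced_y(xi, xj):
    """Feasible interval for y_ij under the standard linearization:
    y <= x_i, y <= x_j, y >= x_i + x_j - 1, y >= 0."""
    lo = max(0.0, xi + xj - 1.0)
    hi = min(xi, xj)
    return lo, hi

# At each of the four 0-1 points the interval collapses to the product,
# so only the fractional points of the relaxation can differ.
for xi, xj in product([0, 1], repeat=2):
    lo, hi = forced_y(xi, xj)
    assert lo == hi == xi * xj
```

The gap between the relaxation's fractional optimum and the best 0-1 point is exactly what an ε-approximate rounding scheme has to control.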
"Linear programming to approximate quadratic 0-1 maximization problems" by A. Billionnet and Frédéric Roupin. ACM-SE 35, April 2, 1997. DOI: 10.1145/2817460.2817503.
The Dynamic Brain Project will create an interactive database on the human brain, available via the Internet. In addition to medical images and textual information, the completed project will give users access to all types of data, including video and sound. One of the primary tasks is to create a point-and-click navigation interface for the user, giving access to the data through a 3-dimensional, volumetric image generated from medical scans. The project uses Postgres, an object-oriented database management system. Data relations (tables) are spatially linked to the interface image by the x,y,z-coordinates to which they belong. These coordinates are determined by building the volume with an octree data structure and outputting the spatially registered data to a Postgres table. This paper summarizes the database schema and the octree algorithm for the project.
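The octree-to-table step can be sketched in a few lines. This is an illustrative toy, not the project's actual program: leaves hold voxel values, and rows() emits (x, y, z, value) tuples in the shape of a spatially registered Postgres relation.

```python
class Octree:
    """Minimal octree over a cubic volume of side `size` (a power of 2)."""

    def __init__(self, origin=(0, 0, 0), size=8):
        self.origin, self.size = origin, size
        self.value, self.children = None, None

    def insert(self, p, value):
        if self.size == 1:                      # voxel-sized leaf
            self.value = value
            return
        half = self.size // 2
        ox, oy, oz = self.origin
        # Child index from one bit per axis: high/low half of the cube.
        idx = ((p[0] >= ox + half) * 4 +
               (p[1] >= oy + half) * 2 +
               (p[2] >= oz + half))
        if self.children is None:
            self.children = [None] * 8
        if self.children[idx] is None:
            child_origin = (ox + half * (idx >> 2 & 1),
                            oy + half * (idx >> 1 & 1),
                            oz + half * (idx & 1))
            self.children[idx] = Octree(child_origin, half)
        self.children[idx].insert(p, value)

    def rows(self):
        """Yield (x, y, z, value) rows ready to load into a table."""
        if self.value is not None:
            yield (*self.origin, self.value)
        if self.children:
            for c in self.children:
                if c:
                    yield from c.rows()

t = Octree(size=8)
t.insert((3, 5, 1), 42)
```

Each emitted row carries the x,y,z-coordinates that link the table back to the volumetric interface image.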
"Building a registered volume database: an object-oriented octree program" by Lynn W. Jones. ACM-SE 35, April 2, 1997. DOI: 10.1145/2817460.2817465.
In automatically assigning part-of-speech tags to scientific text, we find a high error rate when tagging main verbs. To reduce the occurrence of this serious error, we have created a neural network to search for main verbs that have been mis-tagged by a rule-based tagger. In this paper we describe our efforts to evolve the connection weights for the neural network, and to fractally configure another neural network for the same task.
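Evolving connection weights rather than backpropagating them can be sketched with a (1+1)-style loop: mutate all weights, keep the candidate only if its error drops. The network shape and data here are toy stand-ins, not the authors' verb-detection setup.

```python
import math
import random

def net(w, x):
    """Tiny fixed-topology 2-2-1 network with tanh units."""
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return math.tanh(w[6] * h0 + w[7] * h1 + w[8])

def evolve(data, gens=2000, sigma=0.3, seed=0):
    """(1+1)-style evolution of connection weights: mutate, keep if better."""
    rng = random.Random(seed)
    w = [rng.uniform(-1, 1) for _ in range(9)]
    def err(weights):
        return sum((net(weights, x) - t) ** 2 for x, t in data)
    best = err(w)
    for _ in range(gens):
        cand = [wi + rng.gauss(0, sigma) for wi in w]
        e = err(cand)
        if e < best:                 # elitist: never accept a worse child
            w, best = cand, e
    return w, best
```

Because only improvements are accepted, the error is monotonically non-increasing over generations.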
"Improving the identification of verbs" by Lynellen D. S. Perry. ACM-SE 35, April 2, 1997. DOI: 10.1145/2817460.2817470.
In this paper, a way of extending the EER model is described. The new extension can represent functions that previously could not be specified in an ER diagram, and it allows a user to incorporate existing ER diagrams into the diagram being constructed. This new extension is called an operator. Operators are described in depth, with examples drawn from a company database.
"Operators in an extended entity-relationship model and their applications" by S. Claverie. ACM-SE 35, April 2, 1997. DOI: 10.1145/2817460.2817463.
ACM Computing Week 1995 was evaluated using computer-based survey forms. The electronic evaluations were e-mailed to conference attendees who filled out the evaluation forms and e-mailed them to Nancy Wahl, the ACM Computing Week '95 Activities Evaluation Chairperson. The forms were processed by a computer program which extracted, tabulated, and formatted results. This was the first time that electronic evaluations were used to evaluate Computing Week. The design of the computer-based survey forms is compared to the design of paper-based survey forms. The difficulties encountered in developing a program that can parse survey forms looking for answers that were put in unexpected places are described and suggestions for improving the evaluation process are outlined.
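The core parsing difficulty is tolerant extraction. A minimal sketch of the idea, assuming a hypothetical numbered-question format rather than the actual '95 form layout: capture whatever non-empty text follows each question marker up to the next marker, so an answer typed on the wrong line is still found.

```python
import re

def extract_answers(text, questions):
    """Tolerant survey-form parser (illustrative, not the original program).

    For each numbered question, capture everything after "N." or "N)"
    up to the next question number or end of text, so answers placed
    on the following line are still recovered.
    """
    answers = {}
    for q in questions:
        pat = re.compile(rf"^\s*{q}[.)]\s*(.*?)(?=^\s*\d+[.)]|\Z)",
                         re.M | re.S)
        m = pat.search(text)
        answers[q] = m.group(1).strip() if m else None
    return answers

# Answer 2 was typed on its own line, but is still picked up.
form = "1. good\n2)\nexcellent\n3. fine"
```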
"Computer-based evaluation forms" by N. J. Wahl and Jason N. Denton. ACM-SE 35, April 2, 1997. DOI: 10.1145/2817460.2817491.
Coordinating activities in cross-disciplinary projects is a complex task. Complexity levels increase when members of groups involved in the project are physically separated and work at different times with different computing tools. One aspect of the Multi-Disciplinary Design and Analysis Group's research at Clemson University is to explore the factors which will successfully lead to the selection or development of a computing application that will support group efforts carried out by students and faculty belonging to different disciplines. This paper outlines our efforts to understand the functionality desired by scientists and engineers when they work in a collaborative environment, to evaluate some existing groupware applications, and to review a system that offers a number of essential features. In addition, we provide a brief background of computer supported cooperative work and groupware, detail the methodology used to address groupware selection, summarize and interpret the results, and conclude with recommendations.
"Matching groupware to user needs: an exploratory study" by Jill S. Kirschman, Chad W. Patton, and Mukarram H. Shah. ACM-SE 35, April 2, 1997. DOI: 10.1145/2817460.2817511.
Java offers programmers three variable classes: local variables, instance variables, and constants. We compare the performance characteristics of each of these variable types, with a focus on local variables and the Local Variable Table (LVT). The LVT is an array containing the values of local variables and parameters for each code block. We demonstrate that when local variables are stored in the first four entries of the LVT, instructions using those variables execute more quickly. To exploit this performance benefit, we present a modified Sun Java compiler that optimizes the LVT using a most-frequently-used (MFU) ordering. By ordering the LVT, we achieve speedups on traditional benchmark programs.
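The MFU idea can be sketched outside the compiler. The motivation is that JVM loads and stores of slots 0-3 have dedicated one-byte opcodes (e.g. iload_0 through iload_3), while higher slots need a two-byte instruction. This Python sketch (not the modified Sun compiler) counts uses in a simplified instruction stream and assigns the lowest slots to the busiest locals; note that in a real JVM method, `this` and parameters occupy fixed initial slots, a constraint this toy ignores.

```python
from collections import Counter

def mfu_slots(instructions):
    """Assign LVT slot numbers by most-frequently-used ordering.

    `instructions` is a list of (op, operand) pairs; only "load" and
    "store" touch locals. The most-used variable gets slot 0, the next
    slot 1, and so on, pushing hot locals into the fast slots 0-3.
    """
    uses = Counter(var for op, var in instructions if op in ("load", "store"))
    ordered = [v for v, _ in uses.most_common()]
    return {v: slot for slot, v in enumerate(ordered)}

# "i" is used three times and "sum" twice, so "i" lands in slot 0.
code = [("load", "i"), ("load", "i"), ("store", "sum"),
        ("load", "sum"), ("load", "i"), ("const", 7)]
```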
"Java dynamic LVT optimization" by Danny Mace and H. C. Grossman. ACM-SE 35, April 2, 1997. DOI: 10.1145/2817460.2817518.
This paper introduces an evolutionary/systematic hybrid that combines evolutionary hill-climbing search with the systematic search concept of arc revision to quickly find solutions to Fuzzy Constraint Satisfaction Problems (FCSPs). The performance of this hybrid on 250 randomly generated FCSPs in which the fuzzy constraints are evenly distributed among the variables is compared with its performance on 250 randomly generated FCSPs where the fuzzy constraints are unevenly distributed. The results provide some interesting insights into the role that fuzzy constraint network topology plays in evolutionary search.
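The hill-climbing half of such a hybrid can be sketched as follows (the arc-revision component is omitted; the constraint and domain choices are illustrative, not the paper's generator). Each fuzzy constraint returns a satisfaction degree in [0, 1], and an assignment's quality is the minimum degree over all constraints, as usual for fuzzy CSPs.

```python
import random

def fuzzy_hill_climb(domains, constraints, steps=500, seed=0):
    """Hill climbing on a fuzzy CSP (an illustrative sketch).

    domains:     {var: list of candidate values}
    constraints: {(var, ...): f} where f returns a degree in [0, 1]
    Quality is the min degree; sideways moves are accepted so the
    search can drift across plateaus.
    """
    rng = random.Random(seed)
    assign = {v: rng.choice(d) for v, d in domains.items()}
    def quality(a):
        return min(f(*(a[v] for v in vs)) for vs, f in constraints.items())
    best = quality(assign)
    for _ in range(steps):
        v = rng.choice(list(domains))
        cand = dict(assign)
        cand[v] = rng.choice(domains[v])
        q = quality(cand)
        if q >= best:
            assign, best = cand, q
    return assign, best

# One soft constraint: x + y should be near 4.
domains = {"x": [0, 1, 2, 3, 4], "y": [0, 1, 2, 3, 4]}
constraints = {("x", "y"): lambda x, y: 1 - abs(x + y - 4) / 4}
```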
"Fuzzy constraint network topology and evolutionary hill-climbing" by G. Dozier, A. Homaifar, J. Bowen, and A. Esterline. ACM-SE 35, April 2, 1997. DOI: 10.1145/2817460.2817495.
A crucial need exists for innovative means to access, secure, and maintain distributed or networked resources such as those found on the Internet. Instead of centralized controllers, schedulers, and priority schemes, we advocate decentralized approaches such as software agents with autonomous protocols. We have proposed protocols for agents that represent tasks and negotiate for access to computing resources such as Web files, graphics processors, databases, and special output devices. Four different market protocols have been developed (single action, auction, barter, and challenge), and through simulation the associated system performance has been analyzed by monitoring agent and task performance: processing times, total system times, resource availability, resource utilization, and system efficiency. Experimental results show that agents using market protocols are more effective than standard hosted approaches, encouraging further exploration of non-von Neumann architectures and hostless network operating systems.
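A minimal sketch of the auction-style flavor of such protocols, with hypothetical agent names and valuations (the paper's actual protocols and simulation are richer): each agent bids its private value for a resource, capped by its remaining budget, and the winner pays its bid. No global scheduler is involved; allocation emerges from the local bids.

```python
def market_allocate(resources, agents):
    """Allocate resources by repeated first-price sealed-bid auctions.

    agents: {name: {"budget": b, "values": {resource: v}}}
    Returns {resource: winning_agent}; winners' budgets are debited,
    so early wins constrain later bidding.
    """
    allocation = {}
    for r in resources:
        bids = {a: min(info["budget"], info["values"].get(r, 0))
                for a, info in agents.items()}
        winner = max(bids, key=bids.get)
        if bids[winner] <= 0:
            continue  # no agent wants r at a positive price
        allocation[r] = winner
        agents[winner]["budget"] -= bids[winner]
    return allocation

# Agent A outbids B for the GPU but spends most of its budget doing so,
# leaving B to take the database.
agents = {"A": {"budget": 10, "values": {"gpu": 8, "db": 2}},
          "B": {"budget": 5, "values": {"gpu": 5, "db": 4}}}
```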
"Software agents and the role of market protocols" by R. Gagliano and Martin D. Fraser. ACM-SE 35, April 2, 1997. DOI: 10.1145/2817460.2817484.