"Two results concerning the power of two-way deterministic Pushdown Automata"
Daniel Martin and J. Gwynn. doi:10.1145/800192.805729, 1973-08-27.

It is known that there is no one-way, non-deterministic Pushdown Automaton (1NPDA) which is a universal machine for the class of Finite Automata [6]. We will show that there is a two-way, deterministic Pushdown Automaton (2DPDA), U, which is a universal machine for the class of Finite Automata (FA). Our method parallels Knuth and Bigelow's construction of a language which is not context-sensitive but which is the acceptance set of some stack automaton [5]; that is, we will construct a language which is not context-free but which is accepted by a 2DPDA.
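The extra power of the two-way head can be seen on the classic non-context-free language {a^n b^n c^n}: no one-way PDA accepts it, but a 2DPDA can sweep back over the input and reuse its stack. Below is a minimal Python simulation of such a machine; the phase structure is ours, for illustration, and is not the paper's construction U.

```python
def accepts_2dpda(w: str) -> bool:
    """Simulate a two-way deterministic PDA accepting {a^n b^n c^n : n >= 1},
    a language that is not context-free. Endmarkers delimit the input, as is
    standard for two-way automata."""
    tape = "<" + w + ">"           # endmarkers around the input
    stack, head = [], 1

    # Phase 1: move right over the a's, pushing one symbol per 'a'.
    while tape[head] == "a":
        stack.append("A"); head += 1
    if not stack:
        return False
    # Phase 2: move right over the b's, popping one symbol per 'b'.
    while tape[head] == "b":
        if not stack:
            return False
        stack.pop(); head += 1
    if stack:                      # unequal numbers of a's and b's
        return False
    # Phase 3: sweep left back to the start of the b-block, then move right
    # again, re-pushing one symbol per 'b' -- this is the two-way trick.
    start_c = head
    while tape[head - 1] == "b":
        head -= 1
    while head < start_c:
        stack.append("B"); head += 1
    # Phase 4: move right over the c's, popping one symbol per 'c'.
    while tape[head] == "c":
        if not stack:
            return False
        stack.pop(); head += 1
    return tape[head] == ">" and not stack
```

Because the machine is deterministic and each phase moves the head monotonically, the whole computation is linear in the input length.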
"A communications device for the severely handicapped"
G. Vanderheiden, D. Lamers, A. M. Volk, C. Geisler. doi:10.1145/800192.805743, 1973-08-27.

A student-initiated project has led to the use of a new technique and the development of a new device which can provide a means of communication for severely handicapped people, especially useful for those who are both mute and unable to use a typewriter. The device, called the Auto-Com (Auto-Monitoring Communication Board), was developed especially for use by individuals with afflictions which result in severe spastic motions.

For these individuals, most devices that utilize switches are extremely difficult or impossible to operate, because normal switches, levers and keyboards all rely on a discrete motion of some form for their operation. Because of sporadic motions, these people are constantly making errors due to false triggering of the switches. These errors are very frustrating for them and usually result in rapid rejection of the device.

Some devices have been developed which overcome the spastic-motion problem by using large, gross motor movements as the signal source. In this manner switches can be successfully used, but their number is limited, and special techniques are then required to specify all of the alphanumeric characters with these few switches.

In contrast, the Auto-Com copes with the problem of spastic motion by using the lack of motion, rather than its presence, as its control signal. Thus any sudden jerks, movements or lack of control will be ignored, allowing error-free control. This technique also allows the close packing of signal switches: more than 80 of them are located in a 12" × 15" area. This large number of switches allows a simple one-to-one relationship between the switches and the characters, making understanding and operation of the unit simple, even for a young child. This direct specification of each letter also reduces the time needed by the user to select a letter.

The current model of the Auto-Com consists of a sensing board, a handpiece and several output devices. The sensing board is similar in appearance to a "language board", a commonly used communication technique for the handicapped. A typical language board is a flat piece of wood with letters, numbers and some common words painted on it; the handicapped person communicates by pointing out the letters of his message to another person. The Auto-Com works in much the same manner except that it is auto-monitoring: it does not require the presence of another person. Like the language board, the surface of the sensing board is hard and smooth and has letters painted on it. Communication is accomplished by sliding the handpiece over the board's surface until its black post (a magnet) is located over a desired letter. A magnetic reed switch located directly underneath that letter is then closed. If the magnet is kept there for a short (adjustable) period of time, the letter is printed on the TV screen or teletype unit. Since the magnet does not need to stand absolutely still, but only remain within an area surrounding the letter, the …
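The dwell-based selection principle (absence of motion as the control signal) can be sketched in a few lines. The class name, sampling scheme, and dwell threshold below are illustrative assumptions, not the Auto-Com's actual parameters:

```python
from dataclasses import dataclass


@dataclass
class DwellSelector:
    """Accept a character only after the pointer has rested over the same
    cell for `dwell_samples` consecutive samples; any motion to a different
    cell resets the count, so spastic jerks select nothing."""
    dwell_samples: int = 5      # e.g. 5 samples at 10 Hz = 0.5 s dwell
    _last_cell: object = None
    _count: int = 0

    def sample(self, cell):
        """Feed one position sample (the cell under the magnet, or None if
        off the board). Returns the selected character once the dwell
        criterion is met, else None."""
        if cell is not None and cell == self._last_cell:
            self._count += 1
        else:                    # movement to a new cell restarts the timer
            self._last_cell, self._count = cell, 1
        if cell is not None and self._count >= self.dwell_samples:
            self._count = 0      # require a fresh dwell to repeat a letter
            return cell
        return None
```

A jerk that drags the magnet across several cells produces only short visits, none reaching the dwell threshold, so no false triggering occurs.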
"Preliminary design for POPSY - a POlyProcessor SYstem"
M. Harrison. doi:10.1145/800192.805677, 1973-08-27.

This paper outlines the objectives and preliminary design for an experimental POlyProcessor SYstem (POPSY) which we hope to build at the Courant Institute. The motivation for this work is our belief that the dramatic reduction in hardware costs which we expect to result from large-scale integration (LSI) will affect the architecture of computer systems in a significant way. In particular, we expect the following main results:
- the reduction in importance of program efficiency
- the use of software to replace hardware
The SETL work [1] is concerned with one approach to the first; here we are concerned with the second.
"Implementation of a dynamic tree searching algorithm in a chess programme"
Gerhard Wolf. doi:10.1145/800192.805704, 1973-08-27.

Algorithms searching game trees for games like chess, checkers, etc. must have two features: they must find a good solution by inspecting all relevant parts of the tree, and they must do so within acceptable time. Accordingly, most existing algorithms have two principal weaknesses: inconsistency and redundancy. By inconsistency I mean that not all relevant parts of the tree are considered, or that parts are considered or pruned for incorrect reasons. Typical inconsistencies are fixed depth limits, incorrect quiescence analysis, and the ordinary methods of forward pruning. Consistency is a sufficient, but fortunately not always necessary, condition for correct results.
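One standard remedy for the fixed-depth inconsistency is to make the cutoff depend on the position rather than on depth alone: keep expanding nodes that are not quiescent. The sketch below is a generic negamax alpha-beta formulation of that idea, not the algorithm implemented in the paper; a production program would also impose a hard depth floor so the quiescence extension cannot recurse forever.

```python
def alphabeta(node, depth, alpha, beta, quiescent, children, evaluate):
    """Negamax alpha-beta with a simple quiescence extension: the search
    stops at depth 0 only if the position is quiet; 'noisy' positions are
    searched deeper. `evaluate` scores a node from the side to move."""
    kids = children(node)
    if not kids or (depth <= 0 and quiescent(node)):
        return evaluate(node)
    best = -float("inf")
    for child in kids:
        score = -alphabeta(child, depth - 1, -beta, -alpha,
                           quiescent, children, evaluate)
        best = max(best, score)
        alpha = max(alpha, score)
        if alpha >= beta:       # cutoff: remaining siblings are redundant
            break
    return best
```

The beta cutoff is the classic answer to the redundancy weakness: subtrees that cannot affect the root value are never visited.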
"The EZ GCD algorithm"
J. Moses and D. Yun. doi:10.1145/800192.805698, 1973-08-27.

This paper presents a preliminary report on a new algorithm for computing the Greatest Common Divisor (GCD) of two multivariate polynomials over the integers. The algorithm is strongly influenced by the method used for factoring multivariate polynomials over the integers. It uses an extension of the Hensel lemma approach originally suggested by Zassenhaus for factoring univariate polynomials over the integers. We point out that the cost of the Modular GCD algorithm applied to sparse multivariate polynomials grows at least exponentially in the number of variables appearing in the GCD. This growth is largely independent of the number of terms in the GCD. The new algorithm, called the EZ (Extended Zassenhaus) GCD Algorithm, appears to have a computing bound which in most cases is a polynomial function of the number of terms in the original polynomials and the sum of the degrees of the variables in them. Especially difficult cases for the EZ GCD Algorithm are described. Applications of the algorithm to the computation of contents and square-free decompositions of polynomials are indicated.
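For contrast with the modular and Hensel-based methods, the classical Euclidean algorithm over the rationals already computes univariate GCDs, but its intermediate coefficients grow badly; that growth, and its multivariate analogue, is what motivates algorithms like Modular GCD and EZ GCD. The following is a baseline sketch of the classical method, not the EZ algorithm itself:

```python
from fractions import Fraction


def poly_gcd(f, g):
    """Monic GCD of two univariate polynomials over the integers, given as
    coefficient lists from low to high degree, via the classical Euclidean
    algorithm over the rationals. Correct but prone to intermediate
    coefficient blowup -- the cost the EZ approach is designed to avoid."""
    f = [Fraction(c) for c in f]
    g = [Fraction(c) for c in g]

    def trim(p):                 # drop leading zero coefficients
        while p and p[-1] == 0:
            p.pop()
        return p

    f, g = trim(f), trim(g)
    while g:
        r = f[:]                 # compute the remainder of f divided by g
        while len(r) >= len(g) and trim(r):
            q = r[-1] / g[-1]
            shift = len(r) - len(g)
            for i, c in enumerate(g):
                r[i + shift] -= q * c
            r = trim(r)
        f, g = g, r
    return [c / f[-1] for c in f]   # normalize to a monic polynomial
```

Running this on inputs with a large common factor makes the denominators of the intermediate remainders explode even when input and output coefficients are small, which is exactly the observed weakness of fraction-free and rational PRS schemes.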
"COKO III and the future of inter-snap judgment communication"
E. W. Kozdrowicki and Dennis W. Cooper. doi:10.1145/800192.805706, 1973-08-27.

A grandmaster usually spends a lifetime collecting knowledge or information about the game. Some of this knowledge is given to COKO in the form of a 12,000-line FORTRAN program. Using this knowledge COKO plays very poorly, but at the super rate of approximately one move/sec. The use of a brute-force selective tree searching procedure yields an order of magnitude improvement in performance at the standard rate of 3 min./move. Perhaps three orders of magnitude additional improvement is needed to defeat the world champion, a gap which must be bridged, if ever, by programming more chess knowledge into the machine. In addition, "inter-snap judgment communication" is described as a natural, powerful procedure frequently used by humans to guide their selective search, and as a point of emphasis for future development.
"A Fortran language anticipation and prompting system"
John H. Pinc and Earl J. Schweppe. doi:10.1145/800192.805701, 1973-08-27.

An experimental interactive system has been developed on an intelligent terminal which accepts only syntactically correct Fortran statements and otherwise assists the user in preparing Fortran programs. Whenever possible the system anticipates the syntax of the statement implied by an initial input and supplies the general form of the statement directly beneath the line on which input is being accepted. In some cases (the function heading) decisions are inverted from the language, and in others (statement numbers) inputs are automatically positioned. In general, only syntax within statements is checked, but some global checking (multiple main programs) is performed. Alternate forms of statements are often displayed, with the unwanted ones being eliminated as entries are made. The techniques used demonstrate pedagogic as well as productive potential.
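The anticipation idea, displaying the general form of the statement implied by the initial input and narrowing the alternatives as typing continues, can be sketched with a keyword-to-template table. The templates below are hypothetical placeholders, not the system's actual forms:

```python
# Hypothetical statement templates keyed by leading keyword; the real
# system's forms and coverage differ.
TEMPLATES = {
    "DO":     "DO <label> <var> = <start>, <end>[, <step>]",
    "IF":     "IF (<logical expression>) <statement>",
    "GOTO":   "GOTO <label>",
    "READ":   "READ (<unit>, <format>) <list>",
    "WRITE":  "WRITE (<unit>, <format>) <list>",
    "FORMAT": "FORMAT (<edit descriptors>)",
}


def anticipate(partial: str):
    """Return the templates whose keyword the partial input could still
    become, mimicking how the system shows the general statement form
    beneath the input line and eliminates alternatives as entries are
    made."""
    prefix = partial.strip().upper()
    if not prefix:
        return []
    return [tmpl for kw, tmpl in sorted(TEMPLATES.items())
            if kw.startswith(prefix)]
```

After "W" only the WRITE form remains on display; after "G", only GOTO, so the prompt converges to a single template as the user types.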
"Towards a person-centered computer technology"
R. Kling. doi:10.1145/800192.805740, 1973-08-27.

Contemporary computer designers are largely machine-centered: they emphasize function and standards of elegance, efficiency and computing power. Yet the side effects of systems may enhance or diminish the well-being of their various users. Person-centered standards that promote a sense of competence and autonomy are outlined. The coupling of flexible software with responsive organizations is suggested as a means of enhancing the personal competence and self-esteem of computer users.
"Automatic Steno translation"
Raoul N. Smith. doi:10.1145/800192.805687, 1973-08-27.

The first preliminary computer translation of machine-produced stenographic notes was described in Salton (1959). Since then other approaches have been suggested, but none seems to have succeeded. In the meantime the need for a solution becomes more and more pressing; the impetus is coming from two user areas, court reporting and speech recognition. The purpose of this paper is to describe the problems of automatic Steno-English translation and their solutions.
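At its core, machine-steno translation starts from stroke-to-word dictionary lookup; the hard problems begin with strokes that are ambiguous, span several words, or are absent from the dictionary. A toy sketch of the lookup layer (the stroke spellings below are invented, not a real steno theory):

```python
# Toy stroke dictionary -- real stenographic theories map thousands of
# chorded strokes, and one stroke may expand differently in context.
STENO_DICT = {
    "-T": "the",
    "KORT": "court",
    "AOEU": "I",
    "OB": "object",
}


def translate(strokes):
    """Translate a sequence of steno strokes by per-stroke dictionary
    lookup, marking unknown strokes -- the 'untranslate' problem a full
    Steno-English system must solve with context and morphology."""
    return " ".join(STENO_DICT.get(s, f"[{s}?]") for s in strokes)
```

Everything beyond this lookup (resolving homophonous strokes, joining multi-stroke words, inflecting output) is where the linguistic difficulty of the task lies.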
"Device gain - a measure of system component simultaneous operation"
J. Hoffman. doi:10.1145/800192.805725, 1973-08-27.

As computer systems become more complex and costly, the day-to-day efficient utilization of their capacity becomes more critical. At the same time, the system complexity makes analysis more difficult, and in some cases nearly impossible. An almost "standard solution" to the analysis problem is the replacement of the current system with a larger, faster, more complex, and more costly computer configuration. The need for accurate analysis leading to more appropriate management decisions is then postponed. Unfortunately, the cycle repeats itself: the inability to accurately assess the operation and performance of the computer system is passed on to the new system, and the effects of this analysis deficiency are not immediately felt because of the increased speed and capacity of the new system. This type of solution, however, is becoming so costly that it is imperative to find an alternative. It is in this context that the present analysis and management tool, device gain, is introduced.
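The abstract does not define the metric, but one plausible formalization of "device gain" as a measure of simultaneous component operation (an assumption on our part, not necessarily the paper's definition) is total device busy time divided by the wall-clock time during which at least one device is busy:

```python
def device_gain(intervals):
    """Compute a plausible 'device gain' from busy intervals (start, end)
    across all devices: total busy time / union of busy time. This
    definition is our assumption for illustration."""
    busy = sum(end - start for start, end in intervals)
    # Union of the intervals via a sorted sweep, merging overlaps.
    covered, cur_start, cur_end = 0.0, None, None
    for start, end in sorted(intervals):
        if cur_end is None or start > cur_end:   # disjoint: close the run
            if cur_end is not None:
                covered += cur_end - cur_start
            cur_start, cur_end = start, end
        else:                                    # overlap: extend the run
            cur_end = max(cur_end, end)
    if cur_end is not None:
        covered += cur_end - cur_start
    return busy / covered if covered else 0.0
```

Under this reading, a gain of 1.0 means strictly serial operation, while a gain approaching the number of devices means near-total overlap, the simultaneity the paper's title refers to.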