Results



Mathematical and computational models of transformational grammar

Classics

We were led to this comparison by the observation that the computer model is weaker in three important ways: search depth is not unbounded, structures matching variables cannot be compared, and structures matching variables cannot be moved. Thus, every recursively enumerable language is generated by a transformational grammar with limited search depth, without equality comparisons of variables, and without moving structures corresponding to variables. On the other hand, both mathematical models allow unbounded depth of analysis; both allow equality comparisons of variables, although the Ginsburg-Partee model compares


Some techniques for proving correctness of programs which alter data structures

Classics

We will extend Floyd's proof system for flow diagrams to handle commands which process lists. McCarthy and Painter (1967) deal with arrays by introducing 'change' and 'access' functions so as to write a[i] := a[j] + 1 as a := change(a, i, access(a, j) + 1). King (1969), in mechanising Floyd's technique, gives a method for such assignments which, however, introduces case analysis that sometimes becomes unwieldy. Let us recall briefly the technique of Floyd (1967) for proving correctness of programs in flow diagram form. We will here retain the inductive method of Floyd for dealing with flow diagrams containing loops, but give methods for coping with more complex kinds of assignment command.
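
A minimal sketch (in Python, with the array modelled as an immutable tuple; the 'change'/'access' names follow the convention mentioned above, the rest is illustrative) of how such an assignment can be read as a pure rebinding rather than a mutation:

```python
def access(a, i):
    """Read element i of the array value a."""
    return a[i]

def change(a, i, v):
    """Return a new array value equal to a except that element i is v."""
    b = list(a)
    b[i] = v
    return tuple(b)

# The assignment a[i] := a[j] + 1, rewritten as a := change(a, i, access(a, j) + 1):
a = (5, 7, 9)
i, j = 0, 2
a = change(a, i, access(a, j) + 1)
assert a == (10, 7, 9)
```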


An approach to the frame problem, and its implementation

Classics

This paper proposes a method for handling the frame problem in representing conceptual, or natural-language-type, information. The method is part of a larger calculus for expressing conceptual information, called PCF-2, which is described in Sandewall (1972) and which is a modification and extension of Sandewall (1971a). When the STRIPS schema adds a fact, PLANNER would add the corresponding fact to the data base using the primitive thassert. In this context, by epistemological information we mean a notation together with a set of rules (for example, logical axioms) which describe permissible deductions.
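
A minimal sketch, in Python with hypothetical names, of the parallel drawn above between a STRIPS-style state change and PLANNER's assertion of the corresponding facts: an operator's add- and delete-lists are applied to a set-valued fact database.

```python
def apply_operator(facts, add_list, delete_list):
    """Return the fact database after one STRIPS-style state change:
    delete-list facts are removed, add-list facts are asserted."""
    return (facts - delete_list) | add_list

facts = {("at", "robot", "roomA"), ("at", "box", "roomA")}

# Operator: push the box from roomA to roomB.
facts = apply_operator(
    facts,
    add_list={("at", "robot", "roomB"), ("at", "box", "roomB")},
    delete_list={("at", "robot", "roomA"), ("at", "box", "roomA")},
)
assert ("at", "box", "roomB") in facts and ("at", "box", "roomA") not in facts
```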


Artificial Intelligence: A General Survey (The Lighthill Report)

Classics

In forming such a view the Council has available to it a great deal of specialist information through its structure of Boards and Committees, particularly from the Engineering Board and its Computing Science Committee and from the Science Board and its Biological Sciences Committee. To supplement the important mass of specialist and detailed information available to the Science Research Council, its Chairman decided to commission an independent report by someone outside the AI field but with substantial general experience of research work in multidisciplinary fields, including fields with mathematical, engineering and biological aspects. Such a personal view of the subject might be helpful to other lay persons, such as Council members, in the process of preparing to study specialist reports and recommendations and working towards detailed policy formation and decision taking. In scientific applications, there is a similar look beyond conventional data processing to the problems involved in large-scale data banking and retrieval. The vast field of chemical compounds is one which has lent itself to ingenious and effective programs for data storage and retrieval and for the inference of chemical structure from mass-spectrometry and other data.


On generality and problem solving: a case study using the DENDRAL program

Classics

"Heuristic DENDRAL is a computer program written to solve problems of inductive inference in organic chemistry. This paper will use the design of Heuristic DENDRAL and its performance on different problems for a discussion of the following topics: 1. the design for generality; 2. the performance problems attendant upon too much generality; 3. the coupling of expertise to the general problem solving processes; 4. the symbiotic relationship between generality and expertness, and the implications of this symbiosis for the study and design of problem solving systems. We conclude the paper with a view of the design for a general problem solver that is a variant of the "big switch" theory of generality."See also: Stanford University Report (ACM Citation)In Meltzer, B. and Michie, D. (Eds.), Machine Intelligence 6, pp. 165–190. Edinburgh University Press


Bi-Directional Search

Classics

Ph.D. dissertation "Bi-directional and heuristic search in path problems" (Stanford, Computer Science, 1970), summarized in this article in Machine Intelligence 6 (1971). In the uni-directional algorithms, the search proceeds from an initial node forward until the goal node is encountered. Problems for which the goal node is explicitly known can be searched backward from the goal node. An algorithm combining both search directions is bi-directional. This method has not seen much use because book-keeping problems were thought to outweigh the possible search reduction. The use of hashing functions to partition the search space provides a solution to some of these implementation problems. However, a more serious difficulty is involved. To realize significant savings in bi-directional search, the forward and backward search trees must meet in the 'middle' of the space. The potential benefits from this technique motivate this paper's examination of the theoretical and practical problems in using bi-directional search.
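
A minimal sketch of the idea in Python, assuming an unweighted, undirected graph given as an adjacency dictionary (the function name and representation are illustrative, not from the paper): two breadth-first frontiers, one growing forward from the start and one backward from the goal, are expanded in alternation, and the search stops as soon as they meet.

```python
from collections import deque

def bidirectional_search(graph, start, goal):
    """Alternately expand forward and backward BFS frontiers until they meet.

    `graph` maps a node to an iterable of neighbours. Returns the length of
    a shortest path, or None if start and goal are not connected.
    """
    if start == goal:
        return 0
    dist_f, dist_b = {start: 0}, {goal: 0}
    frontier_f, frontier_b = deque([start]), deque([goal])

    def expand(frontier, dist, other_dist):
        for _ in range(len(frontier)):           # expand exactly one BFS layer
            node = frontier.popleft()
            for nxt in graph.get(node, ()):
                if nxt in other_dist:            # the two search trees meet
                    return dist[node] + 1 + other_dist[nxt]
                if nxt not in dist:
                    dist[nxt] = dist[node] + 1
                    frontier.append(nxt)
        return None

    while frontier_f and frontier_b:
        met = expand(frontier_f, dist_f, dist_b)
        if met is not None:
            return met
        met = expand(frontier_b, dist_b, dist_f)
        if met is not None:
            return met
    return None

# Example: shortest path length from 'a' to 'e' in a small undirected graph.
g = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c", "e"], "e": ["d"]}
assert bidirectional_search(g, "a", "e") == 4
```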


Experiments with a pleasure seeking automaton

Classics

Attempts to write 'intelligent' computer programs have commonly involved the choice for attack of some particular aspect of intelligent behaviour, together with the choice of some relevant task, or range of tasks, which the program must perform. Toda (1962), in a whimsical and illuminating paper, has discussed the problems facing an automaton in a simple artificial environment. The reader may find it illuminating to imagine himself (the automaton) before a screen on which is displayed a complex pattern which changes from time to time (sequence of states). These are: 1. the subjective environment graph (figure 1), 2. the stored graph which is that portion of the subjective environment graph which the automaton has stored in its memory as a result of its experience (figure 2 (b)), and 3. the option graph which is that fragment of the stored graph which the automaton 'knows' how to reach (figure 2(c)).
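
A minimal sketch in Python (names hypothetical, not from the paper) of how the three structures nest: the option graph is a fragment of the stored graph, which is in turn a portion of the subjective environment graph.

```python
# Full subjective environment graph: transitions that exist between states.
environment_graph = {("s1", "s2"), ("s2", "s3"), ("s3", "s4"), ("s4", "s1")}

# Stored graph: transitions the automaton has experienced and kept in memory.
stored_graph = {("s1", "s2"), ("s2", "s3")}

# Option graph: transitions the automaton currently 'knows' how to take.
option_graph = {("s1", "s2")}

assert option_graph <= stored_graph <= environment_graph
```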


The Programming Language LISP

Classics

"Among the new languages for instructing computers is a remarkable one called LISP. The name comes from the first three letters of LIST and the first letter of PROCESSING. Not only is LISP a language for instructing computers but it is also a formal mathematical language, in the same way as elëmentary algebra when rigorously defined and used is a formal mathematical language.The LISP language and its implementation on the IBM 7090 computer were worked out by a group including John McCarthy, Stephen B. Russell , Daniel J. Edwards, Paul W. Abrahams, Timothy P. Hart, Michael I. Levin, Marvin L. Minsky, and others.LISP is designed primarily for processing data consisting of lists of symbols. It has been used for symbolic calculations in differential and integral calculus, electrical circuit theory, mathematical logic , game playing, and other fields of intelligent handling of symbols."Information International, Inc, Cambridge, Mass.


Syntactic dependence and the computer generation of coherent discourse

Classics

The two primary components of the experimental computer program consisted of a phrase structure generation grammar capable of generating grammatical nonsense, and a monitoring system which would abort the generation process whenever it was apparent that the dependency structure of a sentence being generated was not in harmony with the dependency relations existing in an input source text. Potential applications include automatic kernelizing, question answering, automatic essay writing, and automatic abstracting systems. This paper sets forth the hypothesis that there is in the English language a general principle of transitivity of dependence among elements and describes an experiment in the computer generation of coherent discourse that supports the hypothesis. Given as input a set of English sentences, if we hold constant the set of vocabulary tokens and generate grammatical English statements from that vocabulary with the additional restriction that their transitive dependencies agree with those of the input text, the resulting sentences will all be truth-preserving paraphrases derived from the original set.
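
A minimal sketch in Python of the monitoring idea, under the assumption that dependencies are recorded as (governor, dependent) pairs; the names and the subset test are illustrative, not the program's actual mechanism.

```python
def transitive_closure(pairs):
    """Close a set of (governor, dependent) pairs under transitivity."""
    closure = set(pairs)
    changed = True
    while changed:
        changed = False
        snapshot = list(closure)
        for a, b in snapshot:
            for c, d in snapshot:
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

# Dependencies extracted from the input source text (toy example).
source_deps = {("dog", "barked"), ("barked", "loudly")}
allowed = transitive_closure(source_deps)

def coherent(sentence_deps):
    """The monitor keeps a generated sentence only if every dependency it
    asserts already lies in the closure of the source text's dependencies."""
    return sentence_deps <= allowed

assert coherent({("dog", "loudly")})       # follows by transitivity
assert not coherent({("loudly", "dog")})   # reversed dependency is rejected
```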