In brief, we believe that programs for learning large games will need to have at their disposal good rules for learning small games. Each separate box functions as a separate learning machine: it is brought into play only when the corresponding board position arises, and its sole task is to arrive at a good choice of move for that specific position. The demon's task is to make his choices in successive plays in such a way as to maximise his expected number of wins over some specified period. By a development of Laplace's Law of Succession we can determine this probability; it defines the score associated with the node N. To make a move the automaton examines all the legal alternatives and chooses the move leading to the position having the highest associated score, ties being decided by a random choice.
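The move-selection rule described above can be sketched as follows. The (wins + 1)/(plays + 2) form is the standard statement of Laplace's Law of Succession; the bookkeeping names (`records` as a map from move to a wins/plays pair) are illustrative assumptions, not taken from the paper.

```python
import random

def laplace_score(wins, plays):
    """Laplace's Law of Succession: the estimated probability of a win
    after observing `wins` wins in `plays` trials is (wins + 1) / (plays + 2).
    With no data at all this gives 1/2, a sensible prior."""
    return (wins + 1) / (plays + 2)

def choose_move(moves, records):
    """Examine all legal alternatives and choose the move whose resulting
    position has the highest Laplace score, breaking ties at random, as in
    the abstract. `records` maps each move to a (wins, plays) tuple."""
    scores = {m: laplace_score(*records[m]) for m in moves}
    best = max(scores.values())
    return random.choice([m for m in moves if scores[m] == best])
```

Note that the tie-breaking by `random.choice` matches the abstract's "ties being decided by a random choice" exactly; the rest of the bookkeeping is a sketch.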
The two primary components of the experimental computer program were a phrase structure generation grammar capable of generating grammatical nonsense, and a monitoring system which would abort the generation process whenever it was apparent that the dependency structure of a sentence being generated was not in harmony with the dependency relations existing in an input source text. Potential applications include automatic kernelizing, question answering, automatic essay writing, and automatic abstracting systems.
Introduction
This paper sets forth the hypothesis that there is in the English language a general principle of transitivity of dependence among elements, and describes an experiment in the computer generation of coherent discourse that supports the hypothesis. Given a set of English sentences as input, if we hold the set of vocabulary tokens constant and generate grammatical English statements from that vocabulary under the additional restriction that their transitive dependencies agree with those of the input text, the resulting sentences will all be truth-preserving paraphrases derived from the original set.
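The monitoring idea above can be sketched as a set-containment test: take the transitive closure of the dependency relations found in the source text, and reject any generated sentence asserting a dependency outside that closure. This is a minimal sketch of the principle, assuming dependencies are represented as (governor, dependent) pairs; the representation and function names are illustrative, not from the paper.

```python
from itertools import product

def transitive_closure(pairs):
    """Close a set of (governor, dependent) pairs under composition:
    if (a, b) and (b, c) are present, add (a, c). Repeat to a fixed point.
    (itertools.product snapshots its inputs, so adding to `closure`
    inside the loop is safe; new pairs are picked up on the next pass.)"""
    closure = set(pairs)
    changed = True
    while changed:
        changed = False
        for (a, b), (c, d) in product(closure, repeat=2):
            if b == c and (a, d) not in closure:
                closure.add((a, d))
                changed = True
    return closure

def in_harmony(candidate_deps, source_deps):
    """A generated sentence is acceptable only if every dependency it
    asserts already lies in the closure of the source text's dependencies."""
    return set(candidate_deps) <= transitive_closure(source_deps)
```

For example, a source text asserting that the dog barks and that the barking is loud licenses a paraphrase linking the dog to loudness, but not the reverse direction.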
"This report describes some experiments in constructing a compiler that makes use of heuristic problem-solving techniques such as those incorporated in the General Problem Solver (GPS). The experiments were aimed at the dual objectives of throwing light on some of the problems of constructing more powerful programming languages and compilers, and of testing whether the task of writing a computer program can be regarded as a "problem" in the sense in which that term is used in GPS. The present paper is concerned primarily with the second objective--with analyzing some of the problem-solving processes that are involved in writing computer programs. At the present stage of their development, no claims will be made for the heuristic programming procedures described here as practical approaches to the construction of compilers. Their interest lies in what they teach us about the nature of the programming task."
See also: Artificial intelligence and self-organizing systems: Experiments with a heuristic compiler. JACM 10(4), 493-506.
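The GPS sense of "problem" invoked above is means-ends analysis: repeatedly find a difference between the current state and the goal, then apply an operator indexed by that difference. The following is a generic sketch of that loop, not the paper's actual compiler; all names and the numeric example are illustrative assumptions.

```python
def means_ends(state, goal, operators, describe_difference):
    """A minimal means-ends analysis loop in the GPS style: detect a
    difference between `state` and `goal`, look up an operator believed
    to reduce that difference, apply it, and repeat until no difference
    remains. Returns the final state and the trace of differences reduced,
    or (None, trace) if some difference has no matching operator."""
    trace = []
    while state != goal:
        diff = describe_difference(state, goal)
        if diff not in operators:
            return None, trace  # no operator reduces this difference
        state = operators[diff](state)
        trace.append(diff)
    return state, trace
```

In the compiler setting, the "state" would be a partially written program and the "differences" discrepancies between its behavior and a specification; the toy numeric instance in the test below only exercises the control structure.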
A large high-speed general-purpose digital computer (IBM 7090) was programmed to solve elementary symbolic integration problems at approximately the level of a good college freshman. The program is called SAINT, an acronym for "Symbolic Automatic INTegrator." The SAINT program is written in LISP (McCarthy, 1960), and most of the work reported here is the substance of a doctoral dissertation at the Massachusetts Institute of Technology (Slagle, 1961). This discussion concerns the SAINT program and its performance. Some typical samples of SAINT's external behavior are given so that the reader may think in concrete terms.
Journal of the ACM, Vol. 10, No. 4, pp. 507-520, October 1963.
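SAINT's basic mechanism was to match integrands against a table of immediately integrable standard forms, falling back on heuristic transformations when no form matched. The sketch below illustrates only the standard-form step, for the power rule, in Python rather than LISP; the polynomial representation (a map from exponent to coefficient) is a simplifying assumption, and SAINT itself handled far richer expressions.

```python
from fractions import Fraction

def integrate_polynomial(terms):
    """Integrate a polynomial given as {power: coefficient}, term by term,
    using the standard form  integral of c*x^n dx = c*x^(n+1)/(n+1)  for n != -1.
    This models only the 'immediately integrable standard form' idea:
    each term is matched directly against one known antiderivative."""
    result = {}
    for n, c in terms.items():
        if n == -1:
            # x^-1 integrates to log|x|; outside this table of forms.
            raise ValueError("x^-1 is not a power-rule standard form")
        result[n + 1] = Fraction(c, n + 1)
    return result
```

For instance, 3x^2 + 1 integrates to x^3 + x (constant of integration omitted). A SAINT-like system layers heuristic rewrites (substitution, integration by parts) on top of such a table until some rewrite reduces the problem to a standard form.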