We may regard the subject of artificial intelligence as beginning with Turing's article 'Computing Machinery and Intelligence' (Turing 1950) and with Shannon's (1950) discussion of how a machine might be programmed to play chess. In this case we have to say that a machine is intelligent if it solves certain classes of problems requiring intelligence in humans, or survives in an intellectually demanding environment. However, we regard the construction of intelligent machines as fact manipulators as the best bet both for constructing artificial intelligence and for understanding natural intelligence. Given this notion of intelligence, the following kinds of problems arise in constructing the epistemological part of an artificial intelligence.
In brief, we believe that programs for learning large games will need to have at their disposal good rules for learning small games. Each separate box functions as a separate learning machine: it is only brought into play when the corresponding board position arises, and its sole task is to arrive at a good choice of move for that specific position. The demon's task is to make his choices in successive plays in such a way as to maximise his expected number of wins over some specified period. By a development of Laplace's Law of Succession we can determine the probability of a win; this defines the score associated with the node N. To make a move, the automaton examines all the legal alternatives and chooses the move leading to the position having the highest associated score, ties being decided by a random choice.
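The scoring and move-selection scheme just described can be sketched as follows. This is a minimal illustration, not the paper's program: Laplace's Law of Succession in its standard form estimates the probability of a win after w wins in n trials as (w + 1)/(n + 2), and the move names and statistics table below are invented for the example.

```python
import random

def succession_score(wins, trials):
    """Laplace's Law of Succession: estimated win probability
    after `wins` wins in `trials` trials."""
    return (wins + 1) / (trials + 2)

def choose_move(legal_moves, stats, rng=random):
    """Choose the move leading to the position with the highest
    associated score; decide ties by a random choice."""
    best = max(succession_score(*stats[m]) for m in legal_moves)
    tied = [m for m in legal_moves
            if succession_score(*stats[m]) == best]
    return rng.choice(tied)

# (wins, trials) recorded for the position each move leads to
stats = {"a": (3, 4), "b": (0, 0), "c": (5, 10)}
move = choose_move(["a", "b", "c"], stats)
```

Note that an untried position (0 wins in 0 trials) scores 1/2 rather than 0, so unexplored moves are not ruled out from the start.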
The outline is drawn of a hypothetical machine to recognise speech, comprising a basic recogniser working on short segments of acoustic waveform only, on to which may be added further structures to use knowledge of speaker characteristics, speech statistics, syntax rules, and semantics, in order to improve the recognition performance. Suppose one tried to implement a recogniser by telling the machine to store every new pattern it encountered, together with a label telling it what word or words the pattern represented, with the intention of recognising an arbitrarily large vocabulary for an arbitrarily large proportion of the total population of speakers. The fifth section will describe briefly some work which is being carried out at Standard Telecommunication Laboratories towards implementing a real machine, and the final section will contain conclusions. If, as is highly probable for ASR, the speech is transmitted through a telephone link, the problems of noise and distortion can be quite severe, and include noises due to handling the handset, clicks and hisses in the speech band, limitation of the bandwidth to the range between 300 and 3400 cps, and pre-emphasis of the signal.
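The naive store-every-pattern recogniser considered above can be sketched as a labelled lookup table. This is an illustrative assumption, not the paper's design: the pattern representation (a tuple of quantised measurements) and the class name are invented, and the sketch exists to show the scheme's weakness — it answers nothing for any pattern not seen verbatim in training, which is why it cannot scale to arbitrary vocabularies and speakers.

```python
class TableRecogniser:
    """Naive recogniser: store each pattern with the word(s) it
    represented, and answer queries by exact lookup."""

    def __init__(self):
        self.table = {}  # pattern -> set of word labels

    def train(self, pattern, word):
        self.table.setdefault(pattern, set()).add(word)

    def recognise(self, pattern):
        # Returns None for any pattern not stored verbatim --
        # the combinatorial weakness the section describes.
        return self.table.get(pattern)

r = TableRecogniser()
r.train((1, 4, 2), "yes")
r.train((3, 0, 5), "no")
```

Even a slightly different utterance of "yes" produces a pattern absent from the table, so coverage would have to grow with every speaker and every acoustic condition.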
The design of classification computers is discussed in the first paper; the design of conditional probability computers is discussed in a third paper (Uttley 1958). Nervous transmission is in terms of standard impulses, which meet the requirements of binary classification. However, at low levels in nervous systems, intensity is signalled in terms of impulse frequency. If, at higher levels, patterns are distinguished by classification, then intensity must not be signalled in terms of frequency but in terms of 'place'.
These and other considerations are offered to justify earlier suggestions that the mechanization of intellect requires a hybrid information-system, wherein the conditional probabilities of digital decision-processes are determined by a separate (though interacting) computing process which could operate best on 'analogue' principles. As is illustrated convincingly in a recent paper on problem-solving, there is a fundamental difference between a solution by a strictly formalised procedure and what is termed a 'heuristic' solution entailing the crossing of a logical gap, in that the first is logically reversible and repeatable, while the second is not. "Established rules of inference offer public paths for drawing intelligent conclusions from existing knowledge." Any information-system with 'intellect' must be capable of activity beyond such rules; the degree of logical indeterminacy (the amount of selective information lacking) defines the width of the logical gap crossed in the solution.
He is currently a Research Psychologist at the Cornell Aeronautical Laboratory, Inc., in Buffalo, New York, where he is Project Engineer responsible for Project PARA (Perceiving and Recognizing Automaton). FRANK ROSENBLATT. SUMMARY: A theoretical brain model, the perceptron, has been developed at the Cornell Aeronautical Laboratory in Buffalo, New York. SYMBOLIC LOGIC: Only a few months before the Office of Naval Research began its support of the perceptron program at the Cornell Aeronautical Laboratory, John von Neumann, one of the most outstanding advocates of the proposition that man might some day achieve an artificial device working on the same principles as the human brain, wrote the following prophetic passage (ref. 4): "Logics and mathematics in the central nervous system...must structurally be essentially different from those languages to which our common experience refers..." What von Neumann is saying here deserves careful consideration. Similarly, if we are ingenious enough to write a set of exact rules for minimizing the cost of some business operation, we can program a computer to minimize cost, and so on for other such complex problems.
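The perceptron's learning behaviour is usually stated as an error-correction rule: weights are adjusted only when the unit's response is wrong, moving the decision toward the misclassified input. The sketch below shows that rule in its standard textbook form, not Rosenblatt's own implementation; the tiny two-input dataset (an AND-like task), the learning rate, and the epoch count are illustrative assumptions.

```python
def predict(weights, bias, x):
    """Threshold unit: fire (1) if the weighted sum exceeds zero."""
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if s > 0 else 0

def train(samples, epochs=20, rate=1.0):
    """Perceptron error-correction rule: adjust weights toward
    inputs that were misclassified, leave correct ones alone."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            error = target - predict(weights, bias, x)
            weights = [w + rate * error * xi
                       for w, xi in zip(weights, x)]
            bias += rate * error
    return weights, bias

# A linearly separable task (logical AND), which the rule can learn.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights, bias = train(samples)
```

Because the task is linearly separable, the rule converges to a set of weights that classifies all four inputs correctly; on a non-separable task it would cycle indefinitely.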