Generalization by Weight-Elimination with Application to Forecasting
Weigend, Andreas S., Rumelhart, David E., Huberman, Bernardo A.
Inspired by the information-theoretic idea of minimum description length, we add a term to the backpropagation cost function that penalizes network complexity. We give the details of the procedure, called weight-elimination, describe its dynamics, and clarify the meaning of the parameters involved. From a Bayesian perspective, the complexity term can be usefully interpreted as an assumption about the prior distribution of the weights. We use this procedure to predict the sunspot time series and the notoriously noisy series of currency exchange rates. 1 Introduction: Learning procedures for connectionist networks are essentially statistical devices for performing inductive inference. There is a tradeoff between two goals: on the one hand, we want such devices to be as general as possible so that they are able to learn a broad range of problems.
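To make the complexity term concrete, here is a minimal sketch (Python with NumPy) of a weight-elimination style cost of the form error plus lam * sum_i (w_i^2/w0^2) / (1 + w_i^2/w0^2); the function names and the parameters lam and w0 follow common expositions of the procedure and are not taken from the paper's own code.

    import numpy as np

    def weight_elimination_cost(errors, weights, lam, w0):
        # sum-of-squares error plus the complexity penalty
        # lam * sum_i (w_i^2 / w0^2) / (1 + w_i^2 / w0^2)
        scaled = (weights / w0) ** 2
        penalty = lam * np.sum(scaled / (1.0 + scaled))
        return np.sum(errors ** 2) + penalty

    def weight_elimination_penalty_grad(weights, lam, w0):
        # derivative of the penalty with respect to each weight,
        # added to the ordinary backpropagation gradient during training
        scaled = (weights / w0) ** 2
        return lam * 2.0 * weights / (w0 ** 2 * (1.0 + scaled) ** 2)

Weights much smaller than w0 are penalized roughly quadratically, while weights much larger than w0 contribute a nearly constant cost of about lam, which is what lets unneeded weights decay toward zero without strongly shrinking the important ones.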
Discrete Affine Wavelet Transforms For Analysis And Synthesis Of Feedforward Neural Networks
Pati, Y. C., Krishnaprasad, P. S.
In this paper we show that discrete affine wavelet transforms can provide a tool for the analysis and synthesis of standard feedforward neural networks. It is shown that wavelet frames for L2(R) can be constructed based upon sigmoids. The spatio-spectral localization property of wavelets can be exploited in defining the topology and determining the weights of a feedforward network. Training a network constructed using the synthesis procedure described here involves minimization of a convex cost functional and therefore avoids pitfalls inherent in standard backpropagation algorithms. Extension of these methods to L2(R^N) is also discussed.
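As a rough illustration of the idea, and not the authors' actual frame construction, the sketch below assembles a zero-mean, localized function from shifted sigmoids and then forms the dilated and translated family psi_{m,n}(x) = a^{m/2} psi(a^m x - n b) used in discrete affine wavelet analysis; the specific shifts, the 1/3 weighting, and the defaults a = 2, b = 1 are illustrative choices.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sigmoid_wavelet(x):
        # a narrow bump minus a scaled wide bump; the 1/3 factor makes the
        # two integrals (2 and 6) cancel, so the result has zero mean and
        # decays quickly away from the origin (illustrative, not the paper's psi)
        narrow = sigmoid(x + 1.0) - sigmoid(x - 1.0)
        wide = sigmoid(x + 3.0) - sigmoid(x - 3.0)
        return narrow - wide / 3.0

    def psi_mn(x, m, n, a=2.0, b=1.0):
        # discrete affine family: dilation by a^m, translation by n * b
        return a ** (m / 2.0) * sigmoid_wavelet(a ** m * x - n * b)

In the network view suggested by the abstract, each psi_{m,n} would correspond to a small group of sigmoidal units with shared input scaling a^m and biases fixed by the shifts, so the choice of (m, n) pairs determines the topology and the first-layer weights before any training takes place.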
On the Circuit Complexity of Neural Networks
Roychowdhury, V. P., Siu, K. Y., Orlitsky, A., Kailath, T.
Viewing n-variable boolean functions as vectors in R^{2^n}, we invoke tools from linear algebra and linear programming to derive new results on the realizability of boolean functions using threshold gates. Using this approach, one can obtain: (1) upper bounds on the number of spurious memories in Hopfield networks, and on the number of functions implementable by a depth-d threshold circuit; (2) a lower bound on the number of orthogonal input ...
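The linear-programming viewpoint can be made concrete with a small sketch: list the function's truth table over all 2^n inputs and ask an LP whether some weights and threshold separate the true points from the false ones. The unit margin and the use of scipy.optimize.linprog are my own choices for illustration, not constructions from the paper.

    import numpy as np
    from itertools import product
    from scipy.optimize import linprog

    def threshold_realizable(f, n):
        # f maps an n-tuple of 0/1 inputs to a truth value; we test whether
        # weights w and threshold t exist with w.x - t >= 1 on true points
        # and w.x - t <= -1 on false points (any positive margin works,
        # since a feasible (w, t) can be rescaled).
        rows, rhs = [], []
        for bits in product([0, 1], repeat=n):
            x = np.array(bits, dtype=float)
            sign = 1.0 if f(bits) else -1.0
            # constraint sign * (w.x - t) >= 1, written as an upper bound;
            # the LP variables are w_1, ..., w_n, t
            rows.append(np.append(-sign * x, sign))
            rhs.append(-1.0)
        res = linprog(c=np.zeros(n + 1), A_ub=np.array(rows), b_ub=np.array(rhs),
                      bounds=[(None, None)] * (n + 1), method="highs")
        return res.success

For example, threshold_realizable(lambda b: b[0] and b[1], 2) returns True, since AND is a threshold function, while threshold_realizable(lambda b: b[0] != b[1], 2) returns False, since XOR is not linearly separable.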
AAAI News
Travel support for students: applicants must have fulfilled the volunteer and reporting requirements for previous awards. This year, eligibility extends to students who submit a letter of recommendation from a faculty supervisor in lieu of a paper, to student authors from foreign institutions, and to foreign scholars. Now exempt from California sales tax: recent California legislation, Senate Bill 89 (Chapter 461, Statutes of 1991), signed by the governor at press time, provides AAAI with an exemption from California sales tax. Videotapes: please send tapes that present a broad view of your lab's research efforts, that can be shown to a large portion of the AI community, and that can be run in parallel on several screens; do not send tapes of a particular project or lecture.
Where's the AI?
I survey four viewpoints about what AI is. I describe a program exhibiting AI as one that can change as a result of interactions with the user. Such a program would have to process hundreds or thousands of examples as opposed to a handful. Because AI is a machine's attempt to explain the behavior of the (human) system it is trying to model, the ability of a program design to scale up is critical. Researchers need to face the complexities of scaling up to programs that actually serve a purpose. The move from toy domains into concrete ones has three big consequences for the development of AI. First, it will force software designers to face the idiosyncrasies of their users. Second, it will act as an important reality check between the language of the machine, the software, and the user. Third, the scaled-up programs will become templates for future work. The four viewpoints are that (1) AI means magic bullets, (2) AI means inference engines, (3) AI means getting a machine to do something you didn't think a machine could do (the "gee whiz" view), and (4) AI means having a machine learn. For a variety of reasons, some of which I discuss in this article, the newly formed Institute for the Learning Sciences has been concentrating its efforts on building high-quality educational software for use in business and in elementary and secondary schools.
Enabling Technology for Knowledge Sharing
Neches, Robert, Fikes, Richard E., Finin, Tim, Gruber, Thomas, Patil, Ramesh, Senator, Ted, Swartout, William R.
Building new knowledge-based systems today usually entails constructing new knowledge bases from scratch. It could instead be done by assembling reusable components. System developers would then only need to worry about creating the specialized knowledge and reasoners new to the specific task of their system. This new system would interoperate with existing systems, using them to perform some of its reasoning. In this way, declarative knowledge, problem-solving techniques, and reasoning services could all be shared among systems. This approach would facilitate building bigger and better systems cheaply. The infrastructure to support such sharing and reuse would lead to greater ubiquity of these systems, potentially transforming the knowledge industry. This article presents a vision of the future in which knowledge-based system development and operation is facilitated by infrastructure and technology for knowledge sharing. It describes an initiative currently under way to develop these ideas and suggests steps that must be taken in the future to try to realize this vision.
A Performance Evaluation of Text-Analysis Technologies
Lehnert, Wendy, Sundheim, Beth
A performance evaluation of 15 text-analysis systems was recently conducted to realistically assess the state of the art for detailed information extraction from unconstrained continuous text. Reports associated with terrorism were chosen as the target domain, and all systems were tested on a collection of previously unseen texts released by a government agency. Based on multiple strategies for computing each metric, the competing systems were evaluated for recall, precision, and overgeneration. The results support the claim that systems incorporating natural language-processing techniques are more effective than systems based on stochastic techniques alone. A wide range of language-processing strategies was employed by the top-scoring systems, indicating that many natural language-processing techniques provide a viable foundation for sophisticated text analysis. Further evaluation is needed to produce a more detailed assessment of the relative merits of specific technologies and establish true performance limits for automated information extraction.
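For reference, the three measures named in the abstract can be written out explicitly. This is a hedged sketch using the usual MUC-style definitions over template-fill counts (correct, partial, possible, actual, spurious); the evaluation itself applied additional scoring rules not reproduced here.

    def muc_scores(correct, partial, possible, actual, spurious):
        # recall: fraction of the answer key recovered (partial fills
        # count for half credit); precision: fraction of system responses
        # that were right; overgeneration: fraction of system responses
        # that were spurious
        matched = correct + 0.5 * partial
        recall = matched / possible if possible else 0.0
        precision = matched / actual if actual else 0.0
        overgeneration = spurious / actual if actual else 0.0
        return recall, precision, overgeneration

A system can trade these off against one another: generating more fills tends to raise recall while lowering precision and raising overgeneration, which is why all three were reported together.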
AAAI News
This year's conference featured a new program format, with accepted papers grouped by sets of related research issues and time to interact in a session following each set of presentations. A talk by Jim Greeno addressed modeling. Highlights from the program included Constraint Reasoning and Component Technologies and Planning, Perception, and Robotics; the emergence of the Planning, Perception, and Robotics forum reflected a recent trend among the accepted papers. A panel entitled "How Long Until the Household Robot: The State of the Art in Robotics" featured speakers from industry and Carnegie Mellon's Robotics Institute, with videotapes and a live robot providing an impressive demonstration. For the first time, Innovative Applications of Artificial Intelligence (IAAI) presentations and AI Online interactive panels were presented concurrently with the conference, running for two consecutive days.
Applied AI News
Blue Cross/Blue Shield of Virginia (Richmond, VA) has developed an expert system to classify, evaluate, and process medical claims. The system, called MedScreen, reportedly can process up to 500 claims in 45 minutes, an operation that used to take several days to complete. AT&T's Merrimack Valley Works (North Andover, MA) has developed the Expert Capacity and Material System (XCAM), an expert system which simplifies forecast evaluations for a manufacturing operation; the system automates the analysis of ... The US Army Laboratory Command's Human Engineering Laboratory (Aberdeen Proving Ground, MD) has awarded a $2.4 million contract to Carnegie Group (Pittsburgh, PA) to continue work on a knowledge-based logistics planning system. NRM has been successfully deployed in a number of Australian banks, as well as a food storage and distribution center. IBM (Armonk, NY) and Dragon Systems (Newton, MA) have jointly developed VoiceType, a speech recognition system, based on elements of ..., that allows hands-free typing. ICL (Birmingham, England) has completed a pilot test of an intelligent system for field service diagnosing ...; ICL used a laptop-based ...
An Overview of Some Recent and Current Research in the AI Lab at Arizona State University
Findler, Nicholas V., Sengupta, Uttam
The applications include the user-advised construction of an assembly-line balancing system and a self-optimizing street-light control system. Other work has been in the area of forecasting and interpolating econometric indicators; the estimation is based on simulation models, and the system can serve as a module of an expert system in need of numeric or functional estimates of hidden variables. Additional research involves a generalized production-rule strategy and geographically distributed operations and knowledge bases.