Extended ICA Removes Artifacts from Electroencephalographic Recordings
Jung, Tzyy-Ping, Humphries, Colin, Lee, Te-Won, Makeig, Scott, McKeown, Martin J., Iragui, Vicente, Sejnowski, Terrence J.
Severe contamination of electroencephalographic (EEG) activity by eye movements, blinks, muscle, heart and line noise is a serious problem for EEG interpretation and analysis. Rejecting contaminated EEG segments results in a considerable loss of information and may be impractical for clinical data. Many methods have been proposed to remove eye movement and blink artifacts from EEG recordings. Often regression in the time or frequency domain is performed on simultaneous EEG and electrooculographic (EOG) recordings to derive parameters characterizing the appearance and spread of EOG artifacts in the EEG channels. However, EOG records also contain brain signals [1, 2], so regressing out EOG activity inevitably involves subtracting a portion of the relevant EEG signal from each recording as well. Regression cannot be used to remove muscle noise or line noise, since these have no reference channels. Here, we propose a new and generally applicable method for removing a wide variety of artifacts from EEG records. The method is based on an extended version of a previous Independent Component Analysis (ICA) algorithm [3, 4] for performing blind source separation on linear mixtures of independent source signals with either sub-Gaussian or super-Gaussian distributions. Our results show that ICA can effectively detect, separate and remove activity in EEG records from a wide variety of artifactual sources, with results comparing favorably to those obtained using regression-based methods.
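The pipeline the abstract describes — unmix the channels into independent components, identify the artifactual ones, zero them, and project back — can be sketched on synthetic data. This is only an illustrative stand-in, not the paper's extended-infomax algorithm: it uses scikit-learn's FastICA, a hypothetical 3x3 mixing matrix, simulated sources, and a simple highest-kurtosis heuristic to flag the spiky "blink" component.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 4000)
osc = np.sin(2 * np.pi * 10 * t)                       # ongoing oscillatory "brain" source
sq = np.sign(np.sin(2 * np.pi * 3 * t))                # sub-Gaussian square-wave source
blink = np.where(rng.random(t.size) < 0.01, 8.0, 0.0)  # sparse, super-Gaussian artifact
S_true = np.c_[osc, sq, blink]

A = np.array([[1.0, 1.0, 1.0],     # hypothetical mixing into 3 "channels"
              [0.5, 2.0, 1.0],
              [1.5, 1.0, 2.0]])
X = S_true @ A.T                   # shape (n_samples, n_channels)

ica = FastICA(n_components=3, random_state=0)
S = ica.fit_transform(X)           # estimated independent components

# Flag the most super-Gaussian component (highest excess kurtosis) as the artifact.
z = (S - S.mean(axis=0)) / S.std(axis=0)
kurt = (z ** 4).mean(axis=0) - 3.0
bad = int(np.argmax(kurt))

S_clean = S.copy()
S_clean[:, bad] = 0.0                            # zero the artifact component ...
X_clean = S_clean @ ica.mixing_.T + ica.mean_    # ... and project back to channels

# Channel-wise correlation with the artifact, before and after cleaning;
# clean_corr should be far below raw_corr if the separation succeeded.
raw_corr = max(abs(np.corrcoef(X[:, i], blink)[0, 1]) for i in range(3))
clean_corr = max(abs(np.corrcoef(X_clean[:, i], blink)[0, 1]) for i in range(3))
```

Zeroing a component rather than regressing against a reference channel is the point of the method: no EOG channel is needed, so no brain signal correlated with EOG is subtracted along with the artifact.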
Stacked Density Estimation
Smyth, Padhraic, Wolpert, David
One frequently estimates density functions for which there is little prior knowledge on the shape of the density and for which one wants a flexible and robust estimator (allowing multimodality if it exists). In this context, the methods of choice tend to be finite mixture models and kernel density estimation methods. For mixture modeling, mixtures of Gaussian components are frequently assumed and model choice reduces to the problem of choosing the number k of Gaussian components in the model (Titterington, Smith and Makov, 1986). For kernel density estimation, kernel shapes are typically chosen from a selection of simple unimodal densities such as Gaussian, triangular, or Cauchy densities, and kernel bandwidths are selected in a data-driven manner (Silverman 1986; Scott 1994). As argued by Draper (1996), model uncertainty can contribute significantly to predictive error in estimation. While usually considered in the context of supervised learning, model uncertainty is also important in unsupervised learning applications such as density estimation. Even when the model class under consideration contains the true density, if we are only given a finite data set, then there is always a chance of selecting the wrong model. Moreover, even if the correct model is selected, there will typically be estimation error in the parameters of that model. (Also with the Jet Propulsion Laboratory 525-3660, California Institute of Technology, Pasadena, CA 91109.)
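The stacking idea behind the paper — hedge against model uncertainty by combining several density estimators with weights chosen on held-out data — can be sketched minimally with two base models. This is an illustrative reduction, not the paper's procedure: it stacks just a 2-component Gaussian mixture and a fixed-bandwidth kernel density estimate, and finds a single convex weight by grid search on validation log-likelihood rather than learning a full weight vector via EM; the data, bandwidth, and grid are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(1)
# Bimodal data: a case where a single simple model may misfit
data = np.concatenate([rng.normal(-2.0, 0.5, 300),
                       rng.normal(1.5, 1.0, 300)])[:, None]
rng.shuffle(data)
train, val = data[:400], data[400:]

gmm = GaussianMixture(n_components=2, random_state=0).fit(train)
kde = KernelDensity(bandwidth=0.3).fit(train)

lp_gmm = gmm.score_samples(val)   # held-out log density under each base model
lp_kde = kde.score_samples(val)

# Stack: pick the convex weight alpha maximizing the held-out log-likelihood
# of the combined density alpha * p_gmm + (1 - alpha) * p_kde.
alphas = np.linspace(0.0, 1.0, 101)
with np.errstate(divide="ignore"):  # log(0) = -inf is fine inside logaddexp
    lls = [np.logaddexp(np.log(a) + lp_gmm, np.log(1 - a) + lp_kde).sum()
           for a in alphas]
best = int(np.argmax(lls))
alpha = alphas[best]
```

Because the grid includes alpha = 0 and alpha = 1, the stacked combination can never score worse on the held-out set than the better of the two base models, which is the sense in which stacking addresses model uncertainty.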
The DARPA High-Performance Knowledge Bases Project
Cohen, Paul R., Schrag, Robert, Jones, Eric, Pease, Adam, Lin, Albert, Starr, Barbara, Gunning, David, Burke, Murray
Now completing its first year, the High-Performance Knowledge Bases Project promotes technology for developing very large, flexible, and reusable knowledge bases. The project is supported by the Defense Advanced Research Projects Agency and includes more than 15 contractors in universities, research laboratories, and companies. The evaluation of the constituent technologies centers on two challenge problems, in crisis management and battlespace reasoning, each demanding powerful problem solving with very large knowledge bases. This article discusses the challenge problems, the constituent technologies, and their integration and evaluation.
Applied AI News
The National Aeronautics and Space Administration (NASA) Jet Propulsion Laboratory (Pasadena, Calif.) has developed a chip, designed for use on board vehicles, that has been licensed by automaker Ford Motor (Dearborn, Mich.). NASA Goddard Space Flight Center (Greenbelt, Md.) has developed a system, designed to capture and maintain key scientific knowledge while reducing common errors, that improves the process for outside scientists wanting to use NASA's space telescopes. A government bureau in the Philippines (Quezon City, Philippines) has adopted an intelligent agent-based software system to manage mission-critical tax processes across the Philippines. RoyScot Trust, the asset finance arm of the Royal Bank of Scotland (Edinburgh, Scotland), has implemented an expert system-based solution to automate the credit-underwriting process. Johnson Controls (Milwaukee, Wis.) integrates component math data with work-cell visualization software so that engineers can simulate work cells.
Highly Autonomous Systems Workshop
Doyle, Richard, Rasmussen, Robert, Man, Guy, Patel, Keyur
Researchers and technology developers from the National Aeronautics and Space Administration (NASA), other government agencies, academia, and industry recently met in Pasadena, California, to take stock of past and current work and future challenges in the application of AI to highly autonomous systems. The meeting was catalyzed by new opportunities in developing autonomous spacecraft for NASA and was in part a celebration of the fictional birth year of the HAL-9000 computer.