
Collaborating Authors

 Williamson, Robert C.




The Entropy Regularization Information Criterion

Neural Information Processing Systems

Effective methods of capacity control via uniform convergence bounds for function expansions have been largely limited to Support Vector machines, where good bounds are obtainable by the entropy number approach. We extend these methods to systems with expansions in terms of arbitrary (parametrized) basis functions and a wide range of regularization methods covering the whole range of general linear additive models. This is achieved by a data dependent analysis of the eigenvalues of the corresponding design matrix.
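The capacity analysis described above hinges on the eigenvalue spectrum of the design matrix built from the basis functions. A minimal sketch of computing that spectrum, assuming synthetic 1-D data and Gaussian bump basis functions (all data, centers, and widths here are illustrative choices, not taken from the paper):

```python
import numpy as np

# Illustrative setup (assumption): n = 50 data points, k = 10 parametrized
# Gaussian basis functions phi_j(t) = exp(-(t - c_j)^2 / w^2).
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=50)        # 1-D inputs (assumption)
centers = np.linspace(-1.0, 1.0, 10)        # basis-function centers (assumption)
width = 0.3                                 # common width (assumption)

# Design matrix Phi[i, j] = phi_j(x_i).
Phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / width ** 2)

# Eigenvalues of the k x k matrix Phi^T Phi (symmetric PSD, so eigvalsh);
# a data-dependent capacity bound inspects how quickly these decay.
eigvals = np.linalg.eigvalsh(Phi.T @ Phi)[::-1]   # sorted descending
print(eigvals)
```

The rapid decay of the spectrum for smooth basis functions is what makes the resulting entropy-number bounds tight.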


Support Vector Method for Novelty Detection

Neural Information Processing Systems

Suppose you are given some dataset drawn from an underlying probability distribution P and you want to estimate a "simple" subset S of input space such that the probability that a test point drawn from P lies outside of S equals some a priori specified
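The estimator sketched in this abstract is the one-class SVM. A minimal illustration using scikit-learn's `OneClassSVM`, where `nu` plays the role of the a priori specified probability (the data, kernel, and parameter values are assumptions for the demo, not taken from the paper):

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))   # sample from the unknown distribution P (assumption)

# nu upper-bounds the fraction of training points allowed to fall outside
# the estimated region S, matching the a priori specified probability.
clf = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1).fit(X)

labels = clf.predict(X)                 # +1 inside the estimated region S, -1 outside
outlier_frac = np.mean(labels == -1)    # fraction flagged as outliers, controlled by nu
print(outlier_frac)
```

Test points can then be scored with `clf.predict` or `clf.decision_function` to flag novelties relative to the training distribution.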





Examples of learning curves from a modified VC-formalism

Neural Information Processing Systems

We examine the issue of evaluation of model specific parameters in a modified VC-formalism. Two examples are analyzed: the 2-dimensional homogeneous perceptron and the 1-dimensional higher order neuron. Both models are solved theoretically, and their learning curves are compared against true learning curves. It is shown that the formalism has the potential to generate a variety of learning curves, including ones displaying "phase transitions."




Rational Parametrizations of Neural Networks

Neural Information Processing Systems

ℝ is typically a sigmoidal function such as (1.2), but other choices than (1.2) are possible and of interest.

