Williamson, Robert C.
The Entropy Regularization Information Criterion
Smola, Alex J., Shawe-Taylor, John, Schölkopf, Bernhard, Williamson, Robert C.
Effective methods of capacity control via uniform convergence bounds for function expansions have been largely limited to Support Vector machines, where good bounds are obtainable by the entropy number approach. We extend these methods to systems with expansions in terms of arbitrary (parametrized) basis functions and a wide range of regularization methods covering the whole range of general linear additive models. This is achieved by a data dependent analysis of the eigenvalues of the corresponding design matrix.
Support Vector Method for Novelty Detection
Schölkopf, Bernhard, Williamson, Robert C., Smola, Alex J., Shawe-Taylor, John, Platt, John C.
Shrinking the Tube: A New Support Vector Regression Algorithm
Schölkopf, Bernhard, Bartlett, Peter L., Smola, Alex J., Williamson, Robert C.
Examples of learning curves from a modified VC-formalism
Kowalczyk, Adam, Szymanski, Jacek, Bartlett, Peter L., Williamson, Robert C.
We examine the issue of evaluation of model specific parameters in a modified VC-formalism. Two examples are analyzed: the 2-dimensional homogeneous perceptron and the 1-dimensional higher order neuron. Both models are solved theoretically, and their learning curves are compared against true learning curves. It is shown that the formalism has the potential to generate a variety of learning curves, including ones displaying "phase transitions."
Rational Parametrizations of Neural Networks
Helmke, Uwe, Williamson, Robert C.