The Generalisation Cost of RAMnets
Given unlimited computational resources, it is best to use a criterion of minimal expected generalisation error to select a model and determine its parameters. However, it may be worthwhile to sacrifice some generalisation performance for higher learning speed. A method for quantifying sub-optimality is set out here, so that this choice can be made intelligently. Furthermore, the method is applicable to a broad class of models, including the ultra-fast memory-based methods such as RAMnets. This brings the added benefit of providing, for the first time, the means to analyse the generalisation properties of such models in a Bayesian framework.
Computational Invention of Cadences and Chord Progressions by Conceptual Chord-Blending
Eppe, Manfred (IIIA-CSIC, ICSI) | Confalonieri, Roberto (IIIA-CSIC) | MacLean, Ewen (University of Edinburgh) | Kaliakatsos, Maximos (University of Thessaloniki) | Cambouropoulos, Emilios (University of Thessaloniki) | Schorlemmer, Marco (IIIA-CSIC) | Codescu, Mihai (University of Magdeburg) | Kühnberger, Kai-Uwe (University of Osnabrück)
We present a computational framework for chord invention based on a cognitive-theoretic perspective on conceptual blending. The framework builds on algebraic specifications, and solves two musicological problems. It automatically finds transitions between chord progressions of different keys or idioms, and it substitutes chords in a chord progression by other chords of a similar function, as a means to create novel variations. The approach is demonstrated with several examples where jazz cadences are invented by blending chords in cadences from earlier idioms, and where novel chord progressions are generated by inventing transition chords.
- Europe > United Kingdom > England > Oxfordshire > Oxford (0.04)
- Europe > Germany > Saxony-Anhalt > Magdeburg (0.04)
- North America > United States (0.04)
- (3 more...)
- Media > Music (1.00)
- Leisure & Entertainment (1.00)
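The blending idea described in the abstract above can be illustrated with a deliberately simplified sketch. The paper itself works with algebraic specifications; here, as a purely hypothetical stand-in, chords are modelled as attribute sets and a blend keeps the salient attributes of both input chords. The attribute names and the `blend` helper are illustrative inventions, not the paper's formalism.

```python
# Toy attribute-based chord blending (hypothetical representation; the
# paper uses algebraic specifications, not this simplification).
# Penultimate chord of a perfect cadence (G7 resolving to C major):
perfect_penultimate = {"root": "G",
                       "has_leading_tone": True,   # B resolves up to C
                       "has_seventh": True}        # F resolves down to E
# Penultimate chord of a Phrygian cadence:
phrygian_penultimate = {"root": "Db",
                        "has_flat_second": True}   # Db resolves down to C

def blend(a, b, salient):
    """Combine the salient attributes of two chords, preferring `a` on conflicts."""
    out = {k: v for k, v in b.items() if k in salient}
    out.update({k: v for k, v in a.items() if k in salient})
    return out

salient = {"root", "has_leading_tone", "has_seventh", "has_flat_second"}
blended = blend(phrygian_penultimate, perfect_penultimate, salient)
print(blended)
# The blend keeps the Phrygian root (Db) together with the leading tone and
# seventh of the dominant chord -- a chord resembling the tritone substitution.
```

This is only meant to convey the flavour of combining features of chords from different idioms; the actual framework resolves such conflicts via its algebraic machinery rather than a fixed preference order.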
The Generalisation Cost of RAMnets
Rohwer, Richard, Morciniec, Michal
We follow a similar approach to (Zhu & Rohwer, to appear 1996) in using a Gaussian process to define a prior over the space of functions, so that the expected generalisation cost under the posterior can be determined. The optimal model is defined in terms of the restriction of this posterior to the subspace defined by the model. The optimum is easily determined for linear models over a set of basis functions. We go on to compute the generalisation cost (with an error bar) for all models of this class, which we demonstrate to include the RAMnets.
- North America > Canada > Ontario > Toronto (0.14)
- Europe > United Kingdom (0.04)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models (0.69)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (0.47)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (0.47)
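The abstract above describes placing a Gaussian prior over functions and computing the expected generalisation cost for linear models over basis functions. The following is a minimal sketch of that general idea using a finite polynomial basis with a Gaussian weight prior (a finite-dimensional stand-in for the paper's Gaussian-process prior); the basis, data, and hyperparameters are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear model over fixed basis functions (here, low-degree polynomials).
def phi(x, degree=3):
    return np.vander(x, degree + 1, increasing=True)  # basis expansion

# Toy training data: noisy samples of a smooth target function.
n, noise = 20, 0.1
x = rng.uniform(-1, 1, n)
y = np.sin(2 * x) + noise * rng.standard_normal(n)

Phi = phi(x)
alpha = 1.0                 # prior precision on the weights
beta = 1.0 / noise**2       # noise precision
A = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi   # posterior precision
w_mean = beta * np.linalg.solve(A, Phi.T @ y)           # posterior mean weights

# Expected squared-error generalisation cost: predictive variance plus noise
# variance, averaged over a test distribution.
x_test = rng.uniform(-1, 1, 500)
Pt = phi(x_test)
pred_var = np.einsum("ij,ij->i", Pt @ np.linalg.inv(A), Pt)
expected_cost = np.mean(pred_var + 1.0 / beta)
print(expected_cost)
```

Under this kind of posterior, the expected cost of any restricted model class can be compared against the optimum, which is the comparison the abstract proposes for fast memory-based methods such as RAMnets.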
The Generalisation Cost of RAMnets
Rohwer, Richard, Morciniec, Michal
Neural Computing Research Group, Aston University, Aston Triangle, Birmingham B4 7ET, UK.
Abstract: Given unlimited computational resources, it is best to use a criterion of minimal expected generalisation error to select a model and determine its parameters. However, it may be worthwhile to sacrifice some generalisation performance for higher learning speed. A method for quantifying sub-optimality is set out here, so that this choice can be made intelligently. Furthermore, the method is applicable to a broad class of models, including the ultra-fast memory-based methods such as RAMnets. This brings the added benefit of providing, for the first time, the means to analyse the generalisation properties of such models in a Bayesian framework.
1 Introduction: In order to quantitatively predict the performance of methods such as the ultra-fast RAMnet, which are not trained by minimising a cost function, we develop a Bayesian formalism for estimating the generalisation cost of a wide class of algorithms.
- Europe > United Kingdom (0.24)
- North America > Canada > Ontario > Toronto (0.14)