Haussler, David
Exploiting Generative Models in Discriminative Classifiers
Jaakkola, Tommi, Haussler, David
Generative probability models such as hidden Markov models provide a principled way of treating missing information and dealing with variable length sequences. On the other hand, discriminative methods such as support vector machines enable us to construct flexible decision boundaries and often result in classification performance superior to that of the model based approaches. An ideal classifier should combine these two complementary approaches. In this paper, we develop a natural way of achieving this combination by deriving kernel functions for use in discriminative methods such as support vector machines from generative probability models.
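A minimal sketch of the combination the abstract describes, in the spirit of the Fisher kernel this paper develops: fit a generative model on the data, map each example to the gradient of its log-likelihood with respect to the model parameters (the Fisher score), and use inner products of these score vectors as a kernel for a discriminative classifier. The one-dimensional Gaussian model, the identity approximation to the Fisher information, and all names below are illustrative assumptions, not code from the paper.

    import numpy as np

    def fisher_score(x, mu, sigma2):
        # Gradient of log N(x | mu, sigma2) with respect to (mu, sigma2):
        # this vector is the Fisher score of the example x.
        d_mu = (x - mu) / sigma2
        d_sigma2 = ((x - mu) ** 2 - sigma2) / (2.0 * sigma2 ** 2)
        return np.array([d_mu, d_sigma2])

    def fisher_kernel(x, y, mu, sigma2):
        # Inner product of Fisher scores; replacing the inverse Fisher
        # information matrix with the identity is a common simplification.
        return fisher_score(x, mu, sigma2) @ fisher_score(y, mu, sigma2)

    # Fit the generative model (here by moment matching), then build the
    # Gram matrix for use in any kernel method, e.g. a support vector machine.
    data = np.array([0.2, 1.1, 0.9, 1.5, 0.7])
    mu, sigma2 = data.mean(), data.var()
    K = np.array([[fisher_kernel(a, b, mu, sigma2) for b in data] for a in data])
    print(K.shape)  # (5, 5) Gram matrix over the five examples

In the paper the score vectors are combined through the inverse Fisher information matrix; dropping it, as above, shortens the sketch without changing the basic construction.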
What Size Net Gives Valid Generalization?
Baum, Eric B., Haussler, David
We address the question of when a network can be expected to generalize from m random training examples chosen from some arbitrary probability distribution, assuming that future test examples are drawn from the same distribution. Among our results are the following bounds on appropriate sample vs. network size.
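For orientation, the bounds the abstract refers to are usually cited in roughly the following form, where N is the number of computational units and W the number of weights in the network; constants are omitted, and the precise statement should be taken from the paper itself. If a feedforward network of linear threshold units fits at least a 1 - \varepsilon/2 fraction of

\[
  m \;\ge\; O\!\left(\frac{W}{\varepsilon}\,\log\frac{N}{\varepsilon}\right)
\]

random training examples, then with high confidence it misclassifies at most a fraction \varepsilon of future examples drawn from the same distribution, while for some distributions

\[
  m \;\ge\; \Omega\!\left(\frac{W}{\varepsilon}\right)
\]

examples are necessary.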
Quantifying the Inductive Bias in Concept Learning
Haussler, David