A PAC-Bayes approach to the Set Covering Machine
Laviolette, François, Marchand, Mario, Shah, Mohak
PAC-Bayes Learning of Conjunctions and Classification of Gene-Expression Data
Marchand, Mario, Shah, Mohak
We propose a "soft greedy" learning algorithm for building small conjunctions of simple threshold functions, called rays, defined on single real-valued attributes. We also propose a PAC-Bayes risk bound which is minimized for classifiers achieving a nontrivial tradeoff between sparsity (the number of rays used) and the magnitude of the separating margin of each ray. Finally, we test the soft greedy algorithm on four DNA micro-array data sets.
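For readers unfamiliar with the terminology, a ray is a one-sided threshold on a single real-valued attribute, and the learned classifier is the conjunction (logical AND) of a small set of rays; sparsity is simply the number of rays kept. The sketch below is an illustrative rendering of that hypothesis class only: the class and function names are invented here, and the soft greedy selection and margin computation are not shown.

```python
class Ray:
    """One-sided threshold function on a single real-valued attribute."""
    def __init__(self, attr, threshold, direction):
        self.attr = attr            # index of the attribute this ray tests
        self.threshold = threshold  # threshold value on that attribute
        self.direction = direction  # +1 fires when x[attr] >= threshold, -1 when <=

    def fires(self, x):
        return self.direction * (x[self.attr] - self.threshold) >= 0

def conjunction_predict(rays, x):
    """A conjunction classifies x as positive only if every ray fires."""
    return int(all(ray.fires(x) for ray in rays))

# Example: x is positive iff attribute 0 >= 0.5 and attribute 3 <= 2.0.
rays = [Ray(0, 0.5, +1), Ray(3, 2.0, -1)]
print(conjunction_predict(rays, [0.7, 9.9, 9.9, 1.5]))  # 1
```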
The Decision List Machine
Sokolova, Marina, Marchand, Mario, Japkowicz, Nathalie, Shawe-taylor, John S.
We introduce a new learning algorithm for decision lists to allow features that are constructed from the data and to allow a tradeoff between accuracy and complexity. We bound its generalization error in terms of the number of errors and the size of the classifier it finds on the training data. We also compare its performance on some natural data sets with the set covering machine and the support vector machine.
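A decision list evaluates its rules in order and outputs the label attached to the first feature that fires, falling back to a default label. The sketch below illustrates that structure only and is not the authors' implementation; in particular, the step of constructing features from the training data is omitted.

```python
def decision_list_predict(rules, default_label, x):
    """rules: ordered (feature, label) pairs, where each feature is a
    boolean function of x (in the paper, features may themselves be
    constructed from the data). Returns the label attached to the
    first feature that fires, else the default label."""
    for feature, label in rules:
        if feature(x):
            return label
    return default_label

# Example with two hand-written rules and a default label of 0.
rules = [(lambda x: x[0] > 1.0, 1), (lambda x: x[1] < 0.0, 0)]
print(decision_list_predict(rules, 0, [2.0, 3.0]))  # 1
```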
Learning Stochastic Perceptrons Under k-Blocking Distributions
Marchand, Mario, Hadjifaradji, Saeed
Such distributions represent an important step beyond the case where each input variable is statistically independent, since the 2k-blocking family contains all the Markov distributions of order k. By stochastic perceptron we mean a perceptron which, upon presentation of input vector x, outputs 1 with probability $f(\sum_i w_i x_i - \theta)$.
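The defining probability can be read directly as sampling code. In this sketch the activation f defaults to a logistic sigmoid purely for illustration; the abstract specifies only the form $f(\sum_i w_i x_i - \theta)$, and the function name is invented here.

```python
import numpy as np

def stochastic_perceptron_sample(w, theta, x, f=None, rng=None):
    """Return 1 with probability f(sum_i w_i * x_i - theta), else 0.
    The logistic default for f is an assumption made for this sketch."""
    if f is None:
        f = lambda z: 1.0 / (1.0 + np.exp(-z))
    if rng is None:
        rng = np.random.default_rng()
    p = f(np.dot(w, x) - theta)
    return int(rng.random() < p)

# Example: three inputs, unit weights, threshold 1.0.
print(stochastic_perceptron_sample([1.0, 1.0, 1.0], 1.0, [0.2, 0.4, 0.9]))
```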