Chauvin, Yves
Hidden Markov Models for Human Genes
Baldi, Pierre, Brunak, Søren, Chauvin, Yves, Engelbrecht, Jacob, Krogh, Anders
Human genes are not continuous but rather consist of short coding regions (exons) interspersed with highly variable non-coding regions (introns). We apply HMMs to the problem of modeling exons, introns and detecting splice sites in the human genome. Our most interesting result so far is the detection of particular oscillatory patterns, with a minimal period of roughly 10 nucleotides, that seem to be characteristic of exon regions and may have significant biological implications.
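The core task described here, segmenting a DNA sequence into exon and intron regions, can be sketched with a toy two-state HMM decoded by the Viterbi algorithm. All parameters below are illustrative assumptions, not the trained values from the paper:

```python
import numpy as np

# Toy two-state HMM (exon / intron) over DNA symbols.
states = ["exon", "intron"]
symbols = {"A": 0, "C": 1, "G": 2, "T": 3}

# Transition and emission probabilities (hypothetical values).
trans = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
emit = np.array([[0.20, 0.30, 0.30, 0.20],   # exon: GC-rich (assumed)
                 [0.30, 0.20, 0.20, 0.30]])  # intron: AT-rich (assumed)
start = np.array([0.5, 0.5])

def viterbi(seq):
    """Most likely state path for a DNA string (log-space Viterbi)."""
    obs = [symbols[c] for c in seq]
    n, k = len(obs), len(states)
    logp = np.full((n, k), -np.inf)
    back = np.zeros((n, k), dtype=int)
    logp[0] = np.log(start) + np.log(emit[:, obs[0]])
    for t in range(1, n):
        for j in range(k):
            scores = logp[t - 1] + np.log(trans[:, j])
            back[t, j] = np.argmax(scores)
            logp[t, j] = scores[back[t, j]] + np.log(emit[j, obs[t]])
    path = [int(np.argmax(logp[-1]))]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return [states[s] for s in reversed(path)]

print(viterbi("GCGCGCATATAT"))
```

With these made-up emission biases, the decoder labels the GC-rich prefix as exon and the AT-rich suffix as intron; the sticky self-transitions (0.9) discourage spurious switching.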
Hidden Markov Models in Molecular Biology: New Algorithms and Applications
Baldi, Pierre, Chauvin, Yves, Hunkapiller, Tim, McClure, Marcella A.
Hidden Markov Models (HMMs) can be applied to several important problems in molecular biology. We introduce a new convergent learning algorithm for HMMs that, unlike the classical Baum-Welch algorithm, is smooth and can be applied online or in batch mode, with or without the usual Viterbi most likely path approximation. Left-right HMMs with insertion and deletion states are then trained to represent several protein families including immunoglobulins and kinases. In all cases, the models derived capture all the important statistical properties of the families and can be used efficiently in a number of important tasks such as multiple alignment, motif detection, and classification.
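The left-right architecture with insertion and deletion states mentioned in the abstract can be sketched as a transition matrix over match, insert, and delete states. The model length and all probabilities below are hypothetical, chosen only to illustrate the left-right constraint (no backward transitions along the match backbone):

```python
import numpy as np

L = 3  # number of match positions (toy model length)
n = 3 * L
T = np.zeros((n, n))

# State indexing helpers: M1..ML, I1..IL, D1..DL (0-based positions).
def M(i): return i          # match state at position i
def I(i): return L + i      # insert state after position i
def D(i): return 2 * L + i  # delete state at position i

for i in range(L):
    T[M(i), I(i)] = 0.1           # match -> insert
    T[I(i), I(i)] = 0.4           # insert self-loop (runs of insertions)
    if i + 1 < L:
        T[M(i), M(i + 1)] = 0.8   # match -> next match
        T[M(i), D(i + 1)] = 0.1   # match -> delete (skip a position)
        T[D(i), M(i + 1)] = 0.5   # delete -> re-enter match backbone
        T[D(i), D(i + 1)] = 0.5   # delete -> continue deleting
        T[I(i), M(i + 1)] = 0.6   # insert -> next match

# Left-right property: no transition ever moves to an earlier position,
# so e.g. T[M(1), M(0)] stays zero by construction.
```

End-of-model transitions are omitted here, which is why boundary rows do not sum to one; a full profile architecture would add explicit begin and end states.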
Generalization Dynamics in LMS Trained Linear Networks
Chauvin, Yves
Recent progress in network design demonstrates that nonlinear feedforward neural networks can perform impressive pattern classification for a variety of real-world applications (e.g., Le Cun et al., 1990; Waibel et al., 1989). Various simulations and relationships between the neural network and machine learning theoretical literatures also suggest that too large a number of free parameters ("weight overfitting") could substantially reduce generalization performance.
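The setting studied here, a linear network trained by LMS (gradient descent on squared error) with more free parameters than training examples, can be sketched with synthetic data. Dimensions, noise level, and learning rate below are arbitrary assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_val, d = 20, 200, 30   # more weights than training points

# Synthetic linear teacher with additive noise (hypothetical data).
w_true = rng.normal(size=d)
X_tr = rng.normal(size=(n_train, d))
X_va = rng.normal(size=(n_val, d))
y_tr = X_tr @ w_true + 0.5 * rng.normal(size=n_train)
y_va = X_va @ w_true + 0.5 * rng.normal(size=n_val)

w = np.zeros(d)
lr = 0.01
train_err, val_err = [], []
for step in range(500):
    grad = X_tr.T @ (X_tr @ w - y_tr) / n_train  # LMS gradient
    w -= lr * grad
    train_err.append(np.mean((X_tr @ w - y_tr) ** 2))
    val_err.append(np.mean((X_va @ w - y_va) ** 2))

# Training error shrinks monotonically; with many free parameters
# relative to the data, validation error typically bottoms out earlier.
print(train_err[-1], min(val_err), val_err[-1])
```

Tracking both curves over training time is what makes the generalization dynamics visible: the gap between the epoch of minimum validation error and continued training-error descent is the overfitting regime the abstract refers to.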