Cross-conformal predictors
The method of conformal prediction produces set predictions that are automatically valid in the sense that their unconditional coverage probability is equal to or exceeds a preset confidence level ([14], Chapter 2). A more computationally efficient method of this kind is inductive conformal prediction ([12], [14], Section 4.1, [1]). However, inductive conformal predictors are typically less predictively efficient, in the sense of producing larger prediction sets than conformal predictors. Motivated by the method of cross-validation [11, 13], this note explores a hybrid method, which we call cross-conformal prediction.

We are mainly interested in the problems of classification and regression, in which we are given a training set of examples, each consisting of an object and a label, and are asked to predict the label of a new test object; in classification the labels are elements of a given finite set, and in regression they are real numbers. If we are asked to predict labels for more than one test object, the same prediction procedure can be applied to each test object separately.

In this introductory section and in our empirical studies we consider the problem of binary classification, in which labels can take only two values, which we encode as 0 and 1. We always assume that the examples (both the training examples and the test examples, consisting of given objects and unknown labels) are generated independently from the same probability measure; this assumption will be called the assumption of randomness.
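The idea sketched above can be illustrated with a toy implementation. The following is a minimal sketch, not the paper's own code: the training set is split into folds, and for each postulated label of the test object a p-value is obtained by ranking the test object's nonconformity score against the scores of each held-out fold, computed from a model trained on the remaining folds. The nonconformity measure used here (distance of an object to the mean of the objects sharing its label) and the one-dimensional data are our own simplifying assumptions, chosen only to keep the example self-contained.

```python
def class_mean(examples, label):
    """Mean of the objects in `examples` carrying the given label."""
    vals = [x for x, y in examples if y == label]
    return sum(vals) / len(vals)

def nonconformity(train, example):
    # Deliberately crude nonconformity score (our assumption, not the
    # paper's): distance of the object to the mean of the objects in
    # `train` sharing its label.
    x, y = example
    return abs(x - class_mean(train, y))

def cross_conformal_p_values(training, x_test, n_folds=5):
    """Cross-conformal p-values for the two candidate labels 0 and 1."""
    n = len(training)
    # Partition the training set into n_folds folds.
    folds = [training[k::n_folds] for k in range(n_folds)]
    p = {}
    for y in (0, 1):  # each postulated label of the test object
        count = 0
        for k in range(n_folds):
            # Train on everything except fold k.
            rest = [ex for j, f in enumerate(folds) if j != k for ex in f]
            a_test = nonconformity(rest, (x_test, y))
            # Rank the test score against the held-out fold's scores.
            count += sum(1 for ex in folds[k]
                         if nonconformity(rest, ex) >= a_test)
        p[y] = (count + 1) / (n + 1)
    return p
```

At a confidence level 1 − ε, the prediction set consists of the labels whose p-value exceeds ε; on well-separated data the p-value of the wrong label is small, so the set is typically a singleton.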
Aug-3-2012