Reviews: Adversarial Surrogate Losses for Ordinal Regression

Neural Information Processing Systems 

The paper proposes an adversarial approach to ordinal regression, building on recent work on adversarial learning for cost-sensitive losses. The proposed method is shown to be consistent and to have favourable empirical performance compared to existing methods.

The basic idea of the paper is simple yet interesting: since ordinal regression can be viewed as a type of multiclass classification, and the latter has recently been attacked with some success by adversarial learning approaches, one can combine the two to derive adversarial ordinal regression approaches. By itself this would make the contribution a little narrow, but the paper further shows that the adversarial loss for this particular problem admits a tractable form (Theorem 1), which allows for efficient optimisation. Fisher consistency of the approach also follows as a consequence of existing results for the cost-sensitive case, and is a salient feature of the approach.