Deep Submodular Peripteral Networks
Arnav M. Das
–Neural Information Processing Systems
Seemingly unrelated, learning a scaling from oracles offering graded pairwise comparisons (GPC) is underexplored, despite a rich history in psychometrics. In this paper, we introduce deep submodular peripteral networks (DSPNs), a novel parametric family of submodular functions, and methods for their training using a GPC-based strategy to connect and then tackle both of the above challenges. We introduce a newly devised GPC-style "peripteral" loss, which leverages numerically graded relationships between pairs of objects (sets, in our case). Unlike traditional contrastive learning, or RLHF preference ranking, our method utilizes graded comparisons, extracting more nuanced information than binary-outcome comparisons, and contrasts sets of any size (not just two). We also define a novel suite of automatic sampling strategies for training, including active-learning-inspired submodular feedback. We demonstrate DSPNs' efficacy in learning submodularity from a costly target submodular function and demonstrate their superiority in both experimental design and online streaming applications.
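For readers unfamiliar with submodularity, the property DSPNs are built to preserve is diminishing returns: adding an element to a smaller set helps at least as much as adding it to a larger superset. A minimal sketch using facility location, a classic submodular function (the similarity matrix here is illustrative and not from the paper):

```python
import numpy as np

# Illustrative data: 6 items embedded in 3 dimensions, with nonnegative
# pairwise similarities sim[i, j] (hypothetical, not from the paper).
rng = np.random.default_rng(0)
X = rng.random((6, 3))
sim = X @ X.T

def facility_location(S):
    """f(S) = sum over all ground items of their best coverage by S."""
    if not S:
        return 0.0
    return float(sim[:, sorted(S)].max(axis=1).sum())

def gain(S, v):
    """Marginal gain of adding element v to set S."""
    return facility_location(S | {v}) - facility_location(S)

# Submodularity (diminishing returns): for A ⊆ B and v ∉ B,
# gain(A, v) >= gain(B, v).
A, B, v = {0}, {0, 1, 2}, 4
print(gain(A, v) >= gain(B, v) - 1e-9)  # True
```

A DSPN learns a parametric function with this diminishing-returns guarantee, rather than using a fixed hand-designed one like facility location.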