Revisiting Area Convexity: Faster Box-Simplex Games and Spectrahedral Generalizations
Arun Jambulapati, Kevin Tian

Neural Information Processing Systems

We develop a deeper understanding of area convexity's relationship with more conventional analyses of extragradient methods [Nem04, Nes07]. We also give improved solvers for the subproblems required by variants of the [She17] algorithm, designed through the lens of relative smoothness [BBT17, LFN18].
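To make the comparison concrete, the conventional extragradient step referenced above takes an extrapolation ("midpoint") gradient step, then updates from the original iterate using the midpoint's gradient. A minimal sketch for an unconstrained bilinear game min_x max_y xᵀAy (all names and step sizes here are illustrative, not the paper's method):

```python
import numpy as np

def extragradient(A, x0, y0, eta=0.1, iters=200):
    """Extragradient iteration for the bilinear saddle-point problem
    min_x max_y x^T A y. Each iteration: (1) step to a midpoint,
    (2) update the original iterate using the midpoint's gradient."""
    x, y = x0.copy(), y0.copy()
    for _ in range(iters):
        # (1) Extrapolation step to the midpoint.
        x_mid = x - eta * (A @ y)
        y_mid = y + eta * (A.T @ x)
        # (2) Update using gradients evaluated at the midpoint.
        x = x - eta * (A @ y_mid)
        y = y + eta * (A.T @ x_mid)
    return x, y
```

For the bilinear game the saddle point (with invertible A) is the origin, and the extragradient iterates spiral inward for a small enough step size, whereas plain gradient descent-ascent spirals outward; this is the standard contrast the more conventional analyses formalize.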




Appendix to Weakly Coupled Deep Q-Networks: A. Proofs

Neural Information Processing Systems

We prove the first part of the proposition (weak duality) by induction. It is well-known that, by the convergence of the value iteration algorithm, Q … Consider a state s ∈ S and a feasible action a ∈ A(s). We use an induction argument. … B(w), which follows from the convergence of value iteration. A.2 Proof of Theorem 1. Proof. We now state the following lemma.



Neural Multi-Objective Combinatorial Optimization with Diversity Enhancement (Appendix): A. Reference Point and Hypervolume Ratio

Neural Information Processing Systems

At inference time, each submodel solves its corresponding subproblem. The input dimension of the node features varies across problems. A masking mechanism is applied at each decoding step to guarantee solution feasibility; for MOTSP, already-visited nodes are masked. NHDE-M typically requires somewhat more inference time than MDRL with the same number of weight vectors.
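The feasibility masking described above can be sketched as follows: before choosing the next node, the scores of visited nodes are set to negative infinity so they receive zero probability after the softmax. This is a generic sketch under assumed names (`scores`, `visited`, `masked_decode_step`), not the paper's implementation:

```python
import numpy as np

def masked_decode_step(scores, visited):
    """One decoding step: mask visited nodes so only unvisited
    nodes can be selected, then pick greedily from a softmax."""
    masked = np.where(visited, -np.inf, scores)  # forbid visited nodes
    probs = np.exp(masked - masked.max())        # numerically stable softmax
    probs /= probs.sum()
    return int(np.argmax(probs)), probs          # greedy selection

# Greedy decoding of a small 4-node tour: each step selects an
# unvisited node, so the result is a feasible permutation.
n = 4
rng = np.random.default_rng(0)
scores = rng.normal(size=n)
visited = np.zeros(n, dtype=bool)
tour = []
for _ in range(n):
    nxt, _ = masked_decode_step(scores, visited)
    visited[nxt] = True
    tour.append(nxt)
```

Because every visited node is assigned probability zero, the decoded tour is guaranteed to be a permutation of the nodes, which is exactly the feasibility property the mask enforces for MOTSP.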