



LOG: Active Model Adaptation for Label-Efficient OOD Generalization

Neural Information Processing Systems

This work discusses how to achieve worst-case Out-Of-Distribution (OOD) generalization for a variety of distributions based on a relatively small labeling cost.


Toward the Fundamental Limits of Imitation Learning

Neural Information Processing Systems

We then propose a novel algorithm based on minimum-distance functionals in the setting where the transition model is given and the expert is deterministic. The algorithm is suboptimal by at most $\lesssim |\mathcal{S}| H^{3/2} / N$, matching our lower bound.


Uniform Error Bounds for Gaussian Process Regression with Application to Safe Control

Armin Lederer, Jonas Umlauft, Sandra Hirche

Neural Information Processing Systems

Key to the application of such models in safety-critical domains is the quantification of their model error. Gaussian processes provide such a measure, and uniform error bounds have been derived, which allow safe control based on these models.
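The abstract's idea can be sketched numerically: fit a GP posterior and wrap its standard deviation into a high-probability envelope of the form |f(x) − μ(x)| ≤ √β·σ(x). This is a minimal illustration, not the paper's construction; the kernel, data, and the scaling constant `beta` are all hypothetical stand-ins for the bound derived by Lederer et al.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    # Squared-exponential kernel matrix between 1-D point sets a and b.
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / ls**2)

# Toy data: noisy observations of sin(x).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 5.0, 20)
y = np.sin(X) + 0.1 * rng.standard_normal(20)

# Standard GP posterior at a grid of test points.
Xs = np.linspace(0.0, 5.0, 100)
K = rbf(X, X) + 0.1**2 * np.eye(len(X))      # kernel matrix + noise
Ks = rbf(Xs, X)
mu = Ks @ np.linalg.solve(K, y)              # posterior mean
var = 1.0 - np.einsum('ij,ij->i', Ks, np.linalg.solve(K, Ks.T).T)
sigma = np.sqrt(np.maximum(var, 0.0))        # posterior std. deviation

# Hypothetical uniform error envelope: with probability >= 1 - delta,
# |f(x) - mu(x)| <= sqrt(beta) * sigma(x) for all x (beta is illustrative).
beta = 4.0
upper = mu + np.sqrt(beta) * sigma
lower = mu - np.sqrt(beta) * sigma
```

A safe controller would then treat `[lower, upper]` as the set of plausible dynamics and act conservatively with respect to it.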




On Exact Computation with an Infinitely Wide Neural Net

Sanjeev Arora, Simon S. Du, Wei Hu, Zhiyuan Li, Russ R. Salakhutdinov, Ruosong Wang

Neural Information Processing Systems

Moreover, at random initialization, H(0) converges to a deterministic kernel H^∞, the Neural Tangent Kernel ker(·,·) (Equation (2)), as the width goes to infinity. If H(t) = H^∞ for all t, then (3) becomes du(t)/dt = −H^∞ (u(t) − y). Suppose σ(z) = max(0, z), 1/κ = poly(1/ε, log(n/δ)) and d₁ = d₂ = ··· = d_L = m with m ≥ poly(1/κ, L, 1/λ₀, n, log(1/δ)).
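The linear ODE above has the closed form u(t) = y + exp(−H^∞ t)(u(0) − y), so the training residual decays at rates set by the eigenvalues of H^∞. A small sketch, with a hypothetical 3×3 PSD matrix standing in for the NTK:

```python
import numpy as np

# Hypothetical fixed NTK matrix H_inf (symmetric PSD) for n = 3 training points.
H = np.array([[2.0, 0.5, 0.2],
              [0.5, 1.5, 0.3],
              [0.2, 0.3, 1.0]])
y = np.array([1.0, -1.0, 0.5])   # training targets
u0 = np.zeros(3)                 # network outputs at initialization

def u_at(t):
    # Closed-form solution of du/dt = -H (u - y):
    #   u(t) = y + exp(-H t) (u(0) - y), via eigendecomposition of H.
    w, V = np.linalg.eigh(H)
    expm = V @ np.diag(np.exp(-w * t)) @ V.T
    return y + expm @ (u0 - y)

# Residual ||u(t) - y|| shrinks monotonically; its slowest mode decays
# like exp(-lambda_min(H) * t), mirroring the NTK training-dynamics result.
def residual(t):
    return np.linalg.norm(u_at(t) - y)
```

Evaluating `residual` at increasing `t` shows the exponential convergence of gradient flow under a constant kernel.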