



fa3a3c407f82377f55c19c5d403335c7-AuthorFeedback.pdf

Neural Information Processing Systems

Extended " T able 2" in submitted paper. Extended " T able 3" in submitted paper. We thank reviewers for their comments, and will carefully revise paper considering these comments. Q1 (R1): References and comparison with a baseline that learns embeddings only through a standard convnet. In Tab.2 of this rebuttal, the state-of-the-art method of AISI [7] also depends on We will give more details of these compared methods in paper for clarity.




Appendix

Neural Information Processing Systems

B.1 Baseline GHN: GHN-1. GHNs were designed for NAS, which typically makes strong assumptions about the choice of operations and their possible dimensions to make search and learning feasible.




Navigating the Effect of Parametrization for Dimensionality Reduction

Neural Information Processing Systems

Parametric dimensionality reduction methods have gained prominence for their ability to generalize to unseen datasets, an advantage that traditional approaches typically lack. Despite their growing popularity, there remains a prevalent misconception among practitioners that parametric and non-parametric methods perform equivalently. Here, we show that these methods are not equivalent: parametric methods retain global structure but lose significant local detail.
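The "generalize to unseen data" property can be sketched with a minimal example. The code below uses a plain PCA projection as a stand-in for a learned parametric map (the paper's methods and data are not reproduced here; everything below is illustrative): once the parameters are fitted on training data, new points are embedded directly, whereas a non-parametric method such as t-SNE has no such map and must be re-run from scratch on the new data.

```python
import numpy as np

rng = np.random.default_rng(0)
scales = np.array([5.0, 3.0, 0.5, 0.2, 0.1])

# "Train" and unseen "test" data: 5-D points with variance
# concentrated in the first two coordinate directions.
X_train = rng.normal(size=(200, 5)) * scales
X_test = rng.normal(size=(10, 5)) * scales

# Fit a parametric (linear) reduction: the top-2 principal directions.
mean = X_train.mean(axis=0)
_, _, vt = np.linalg.svd(X_train - mean, full_matrices=False)
components = vt[:2]  # the learned parameters of the map

# The fitted map embeds points it has never seen during fitting.
Z_test = (X_test - mean) @ components.T
print(Z_test.shape)  # (10, 2)
```

The design point is that `mean` and `components` fully define the embedding function, so applying it to `X_test` costs one matrix product; a non-parametric embedding has no analogous reusable parameters.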


Tree! I am no Tree! I am a Low Dimensional Hyperbolic Embedding

Neural Information Processing Systems

Note that we have d(z, w) = (y, z)_w if and only if d(z, w) = (x, z)_w.
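The quantity (x, z)_w above is the standard Gromov product, (x, z)_w = (d(x, w) + d(z, w) - d(x, z)) / 2. A minimal sketch, evaluated on an illustrative tree metric that is not taken from the paper (unit-length path w - a - z - x):

```python
def gromov(d, x, z, w):
    """Gromov product (x, z)_w = (d(x,w) + d(z,w) - d(x,z)) / 2."""
    return (d[x][w] + d[z][w] - d[x][z]) / 2

# Pairwise distances on the path w - a - z - x with unit edges.
pts = ["w", "a", "z", "x"]
pos = {"w": 0, "a": 1, "z": 2, "x": 3}
d = {p: {q: abs(pos[p] - pos[q]) for q in pts} for p in pts}

# Here z lies on the geodesic from w to x, so d(z, w) = (x, z)_w:
# (3 + 2 - 1) / 2 = 2.
print(gromov(d, "x", "z", "w"))  # 2.0
print(d["z"]["w"])               # 2
```

In a tree, d(z, w) = (x, z)_w exactly when z lies on the geodesic between x and w, which is the kind of on-path condition the note's equivalence is expressing.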