Distilled Wasserstein Learning for Word Embedding and Topic Modeling

Hongteng Xu, Wenlin Wang, Wei Liu, Lawrence Carin

Neural Information Processing Systems

The word distributions of topics, their optimal transports to the word distributions of documents, and the embeddings of words are learned in a unified framework. When learning the topic model, we leverage a distilled underlying distance matrix to update the topic distributions and smoothly calculate the corresponding optimal transports.
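The abstract above describes computing optimal transports between topic and document word distributions under a distance matrix derived from word embeddings. A minimal sketch of that computation, assuming entropically regularized transport solved with standard Sinkhorn iterations (an illustration, not the paper's released code; all names and values here are hypothetical):

```python
import numpy as np

# Hypothetical sketch: entropic optimal transport between a topic's word
# distribution and a document's word distribution, with the cost matrix
# built from distances between word embeddings.
def sinkhorn(p, q, C, reg=0.5, n_iter=500):
    """Approximate the optimal transport plan between distributions p and q."""
    K = np.exp(-C / reg)              # Gibbs kernel from the cost matrix
    u = np.ones_like(p)
    for _ in range(n_iter):
        v = q / (K.T @ u)             # scale columns toward marginal q
        u = p / (K @ v)               # scale rows toward marginal p
    return u[:, None] * K * v[None, :]

# Toy vocabulary of 4 words; cost = squared distance between embeddings.
rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 2))                          # word embeddings
C = ((emb[:, None, :] - emb[None, :, :]) ** 2).sum(-1)
topic = np.array([0.4, 0.3, 0.2, 0.1])                 # topic word distribution
doc = np.array([0.25, 0.25, 0.25, 0.25])               # document word distribution
T = sinkhorn(topic, doc, C)
print(T.sum(axis=1))   # row marginals approximate the topic distribution
print(T.sum(axis=0))   # column marginals approximate the document distribution
```

The returned plan `T` is the transport the abstract refers to: its marginals recover the two word distributions, and its cost depends on the embedding-based distance matrix, which is what the paper's distilled distance matrix replaces.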



SupplementaryMaterial

Neural Information Processing Systems

This work was supported by Institute of Information & Communications Technology Planning & Evaluation (IITP) grant (No. 2019-0-00075, Artificial Intelligence Graduate School Program (KAIST)), National Research Foundation of Korea (NRF) grant (NRF2020H1D3A2A03100945), and Data Voucher grant (2021-DV-I-P-00114), funded by the Korea government (MSIT).

The dataset contains question-SQL pairs if the question is answerable.

Are relationships between individual instances made explicit (e.g., users' movie ratings, social network links)? N/A.

Are there any errors, sources of noise, or redundancies in the dataset? Question templates are created to have slots that are later filled with pre-defined values and records from the database. EHRSQL is based on patients in MIMIC-III and eICU.
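The datasheet answer above describes generating question-SQL pairs by filling slots in paired templates with database records. A minimal sketch of that mechanism, assuming simple string-formatting templates (the template text, table, and values here are illustrative, not drawn from the actual EHRSQL dataset):

```python
# Hypothetical EHRSQL-style template filling: a natural-language question
# template and an SQL template share slots, and both are filled with the
# same record drawn from the database. All names/values are illustrative.
question_template = "What was the last {test} result of patient {patient_id}?"
sql_template = (
    "SELECT value FROM labevents "
    "WHERE subject_id = {patient_id} AND label = '{test}' "
    "ORDER BY charttime DESC LIMIT 1"
)

record = {"patient_id": 10026, "test": "hemoglobin"}  # illustrative record
question = question_template.format(**record)
sql = sql_template.format(**record)
print(question)
print(sql)
```

Because both templates are filled from the same record, each generated question is answerable by construction against the underlying database, which is the property the datasheet answer highlights.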