1. Efficient Checkpoint-Based Meta-Learning With Gradient Matching. Yoonho Lee and Juho Lee. [abstract] [bibtex]
  2. Diversity Matters When Learning From Ensembles. Giung Nam, Jongmin Yoon, Yoonho Lee, and Juho Lee. [abstract] [bibtex]
  3. Amortized Probabilistic Detection of Communities in Graphs. Yueqi Wang*, Yoonho Lee*, Pallab Basu, Juho Lee, Yee Whye Teh, Liam Paninski, and Ari Pakman. [abstract] [paper] [code] [bibtex]
  4. On the Distribution of Penultimate Activations of Classification Networks. Minkyo Seo*, Yoonho Lee*, and Suha Kwak. UAI 2021. [abstract] [bibtex]
  1. Bootstrapping Neural Processes. Juho Lee*, Yoonho Lee*, Jungtaek Kim, Eunho Yang, Sung Ju Hwang, and Yee Whye Teh. NeurIPS 2020. [abstract] [paper] [video] [code] [bibtex]
  2. Neural Complexity Measures. Yoonho Lee, Juho Lee, Sung Ju Hwang, Eunho Yang, and Seungjin Choi. NeurIPS 2020. [abstract] [paper] [blog] [video] [code] [bibtex]
  1. Discrete Infomax Codes for Supervised Representation Learning. Yoonho Lee, Wonjae Kim, Wonpyo Park, and Seungjin Choi. arXiv:1905.11656. [abstract] [paper] [bibtex]
  2. Deep Amortized Clustering. Juho Lee, Yoonho Lee, and Yee Whye Teh. Sets and Parts Workshop @ NeurIPS 2019 (oral). [abstract] [paper] [bibtex]
  3. Learning Dynamics of Attention: Human Prior for Interpretable Machine Reasoning. Wonjae Kim and Yoonho Lee. NeurIPS 2019. [abstract] [paper] [code] [bibtex]
  4. Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks. Juho Lee, Yoonho Lee, Jungtaek Kim, Adam Kosiorek, Seungjin Choi, and Yee Whye Teh. ICML 2019. [abstract] [paper] [code] [bibtex]
  1. Gradient-based Meta-learning with Learned Layerwise Metric and Subspace. Yoonho Lee and Seungjin Choi. ICML 2018. [abstract] [paper] [video] [code] [bibtex]