
I’m a CS Ph.D. student at Stanford, advised by Chelsea Finn and part of the IRIS Lab. My research is supported by the KFAS Doctoral Fellowship. Previously, as my alternative service for the South Korean military, I worked as a research scientist at Kakao and AITRICS, where I collaborated with Juho Lee. Before that, I completed my master’s degree (CS, advised by Seungjin Choi) and undergraduate degree (math) at POSTECH.

My research focuses on developing neural network models that can learn and make reliable decisions even in changing environments. To accomplish this, I believe we should move beyond the independent and identically distributed (i.i.d.) paradigm and address the nonstationary nature of real-world data in both our learning and evaluation procedures.

selected publications

  1. DetectGPT: Zero-Shot Machine-Generated Text Detection using Probability Curvature
    Eric Mitchell, Yoonho Lee, Alexander Khazatsky, Christopher D. Manning, Chelsea Finn
    ICML 2023 (oral) [abstract] [paper] [website] [code] [demo]
  2. Conservative Prediction via Transductive Confidence Minimization
    Caroline Choi*, Fahim Tajwar*, Yoonho Lee*, Huaxiu Yao, Ananya Kumar, Chelsea Finn
    ICLR 2023 workshops: TrustML, ME-FoMo [abstract] [paper]
  3. Project and Probe: Sample-Efficient Domain Adaptation by Interpolating Orthogonal Features
    Annie S. Chen*, Yoonho Lee*, Amrith Setlur, Sergey Levine, Chelsea Finn
    ICLR 2023 workshops: TrustML (oral), ME-FoMo [abstract] [paper]
  4. Surgical Fine-Tuning Improves Adaptation to Distribution Shifts
    Yoonho Lee*, Annie S. Chen*, Fahim Tajwar, Ananya Kumar, Huaxiu Yao, Percy Liang, Chelsea Finn
    ICLR 2023
    NeurIPS 2022 workshops: DistShift, ICBINB
    [abstract] [paper]
  5. Diversify and Disambiguate: Out-of-Distribution Robustness via Disagreement
    Yoonho Lee, Huaxiu Yao, Chelsea Finn
    ICLR 2023
    ICML workshops: PODS, SCIS
    [abstract] [paper] [website] [code]
  6. Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks
    Juho Lee, Yoonho Lee, Jungtaek Kim, Adam Kosiorek, Seungjin Choi, Yee Whye Teh
    ICML 2019 [abstract] [paper] [code]
  7. Gradient-based Meta-learning with Learned Layerwise Metric and Subspace
    Yoonho Lee, Seungjin Choi
    ICML 2018 [abstract] [paper] [video] [code]