Tyler LaBonte

PhD Student in Machine Learning
School of Industrial and Systems Engineering
Georgia Institute of Technology
Office: CODA S1249H

Email | CV | Resume
Scholar | LinkedIn | GitHub | Twitter

Fellowship Materials
Favorite Expository Articles

Georgia Tech undergrads: Please contact me! I am happy to chat about research (or anything, really).

About Me

I am a fourth-year PhD student in Machine Learning at the Georgia Institute of Technology, advised by Vidya Muthukumar and Jacob Abernethy. I completed my BS in Applied and Computational Mathematics at the University of Southern California, where I was advised by Shaddin Dughmi and was a Trustee Scholar and Viterbi Fellow. My work has been generously supported by the DoD NDSEG Fellowship.

I am interested in advancing our scientific understanding of deep learning and using theoretical insights to design methods that work well in practice. My current focus is characterizing the generalization phenomena of overparameterized neural networks and developing provable algorithms for robust learning, particularly under distribution shift. The ultimate goal of my research is to enable the safe and trusted deployment of deep learning systems in high-consequence applications such as medicine, defense, and energy.

My industry research experience has focused on the capabilities of large-scale language and vision models. During my PhD, I interned at Google, where I leveraged the Gemini LLM for hardware-software co-design, and at Microsoft Research, where I developed Detection Transformers for weakly supervised object detection.

Publications

Conference Articles

  1. The Group Robustness is in the Details: Revisiting Finetuning under Spurious Correlations.
    Tyler LaBonte, John C. Hill, Xinchen Zhang, Vidya Muthukumar, and Abhishek Kumar.
    NeurIPS 2024. [arxiv] [code] [poster]
  2. Towards Last-layer Retraining for Group Robustness with Fewer Annotations.
    Tyler LaBonte, Vidya Muthukumar, and Abhishek Kumar.
    NeurIPS 2023. [arxiv] [code] [poster] [video]
  3. Scaling Novel Object Detection with Weakly Supervised Detection Transformers.
    Tyler LaBonte, Yale Song, Xin Wang, Vibhav Vineet, and Neel Joshi.
    WACV 2023. [arxiv] [code] [poster]

Journal Articles

  1. Student Misconceptions of Dynamic Programming: A Replication Study.
    Michael Shindler, Natalia Pinpin, Mia Markovic, Frederick Reiber, Jee Hoon Kim, Giles Pierre Nunez Carlos, Mine Dogucu, Mark Hong, Michael Luu, Brian Anderson, Aaron Cote, Matthew Ferland, Palak Jain, Tyler LaBonte, Leena Mathur, Ryan Moreno, and Ryan Sakuma.
    Computer Science Education, 32(3):288–312, 2022.
  2. Quantifying the Unknown Impact of Segmentation Uncertainty on Image-Based Simulations.
    Michael C. Krygier, Tyler LaBonte, Carianne Martinez, Chance Norris, Krish Sharma, Lincoln N. Collins, Partha P. Mukherjee, and Scott A. Roberts.
    Nature Communications, 12(1):5414, 2021.

Workshop Articles

  1. Saving a Split for Last-layer Retraining can Improve Group Robustness without Group Annotations.
    Tyler LaBonte, Vidya Muthukumar, and Abhishek Kumar.
    ICML 2023 Workshop on Spurious Correlations, Invariance, and Stability. [poster]
  2. Dropout Disagreement: A Recipe for Group Robustness with Fewer Annotations.
    Tyler LaBonte, Vidya Muthukumar, and Abhishek Kumar.
    NeurIPS 2022 Workshop on Distribution Shifts. [poster]
  3. Scaling Novel Object Detection with Weakly Supervised Detection Transformers.
    Tyler LaBonte, Yale Song, Xin Wang, Vibhav Vineet, and Neel Joshi.
    CVPR 2022 Workshop on Transformers for Vision. [poster]

Theses

  1. Finding the Needle in a High-Dimensional Haystack: Oracle Methods for Convex Optimization.
    Tyler LaBonte.
    Undergraduate Thesis, University of Southern California, 2021.
    Winner of the USC Discovery Scholar distinction.

Manuscripts

  1. We Know Where We Don't Know: 3D Bayesian CNNs for Credible Geometric Uncertainty.
    Tyler LaBonte, Carianne Martinez, and Scott A. Roberts.
    Manuscript, 2019. [code]

Selected Awards

  1. Simons Institute Deep Learning Theory Workshop Travel Grant ($2,000)
  2. DoD NDSEG Fellowship ($170,000)
  3. NSF Graduate Research Fellowship ($138,000, declined)
  4. USC Discovery Scholar (research distinction awarded to fewer than 100 USC graduates)
  5. USC Trustee Scholar (full scholarship worth $225,000)
  6. USC Viterbi Fellow (research funding worth $24,000)

Industry Research Experience

  1. Machine Learning Research Intern, Google (2023)
  2. Machine Learning Research Intern, Microsoft Research (2021–2022)
  3. Machine Learning Research Intern, Google X (2020)
  4. Machine Learning Research Intern, Sandia National Labs (2019–2020)

Advising

  1. Xinchen Zhang, Georgia Tech MS (2024–)
  2. John C. Hill, Georgia Tech BS/MS → Georgia Tech PhD (2022–2024)
  3. Pratik Deolasi, Georgia Tech BS → MathWorks (2021–2022)
  4. Rishit Mohan Ahuja, Georgia Tech BS → Georgia Tech MS (2021–2022)

Teaching

  1. Lecturer/TA (8 lectures), Georgia Tech CS 7545: Machine Learning Theory (2024)
  2. Lecturer/TA (12 lectures), Georgia Tech CS 7545: Machine Learning Theory (2023)
  3. Teaching Assistant, USC CSCI 270: Intro to Algorithms and Theory of Computing (2021)
  4. Instructor, USC Center for AI in Society: Introduction to Machine Learning (2020)
  5. Teaching Assistant, USC CSCI 170: Discrete Methods in Computer Science (2019)

Reviewing

  1. Reviewer, NeurIPS 2024
  2. Reviewer, ICLR 2024
  3. Reviewer, NeurIPS 2023

Service and Leadership

  1. Student Organizer, Learning Theory Alliance Workshop (2023)
  2. System Administrator, Georgia Tech ML Theory GPU Cluster (2022–)
  3. Organizer, Georgia Tech ML Theory Reading Group (2021–2023)
  4. Projects Lead, USC Center for AI in Society (2019)
  5. Associate Director of Robotics, USC Viterbi K-12 STEM Center (2018)
  6. Robotics Mentor, USC Viterbi K-12 STEM Center (2017–2018)

Other Activities

  1. Fleet Captain, Georgia Tech Sailing Club (2023–)
  2. House Chair, USC Hawai'i Club (2020–2021)
  3. Vice President of Finance, USC Hawai'i Club (2019–2020)