Tyler LaBonte

PhD Student in Machine Learning
School of Industrial and Systems Engineering
Georgia Institute of Technology
Office: CODA S1249H

Email | CV | Resume
Scholar | LinkedIn | GitHub | Twitter

Fellowship Materials
Favorite Expository Articles

Georgia Tech undergrads: Please contact me! I am happy to chat about research (or anything, really).

About Me

I am a fourth-year PhD student in Machine Learning at the Georgia Institute of Technology, advised by Vidya Muthukumar and Jacob Abernethy. I completed my BS in Applied and Computational Mathematics at the University of Southern California, where I was advised by Shaddin Dughmi and was a Trustee Scholar and Viterbi Fellow. My work has been generously supported by the DoD NDSEG Fellowship.

I am interested in foundational aspects of generalization in machine learning. My research goal is to advance our scientific and mathematical understanding of deep learning and to leverage theoretical insights to design practical algorithms. My current focus includes group robustness under spurious correlations and generalization in overparameterized models.

During my PhD, I have interned at Microsoft Research, where I am currently investigating the reasoning proficiency of multimodal vision-language models (VLMs), and at Google, where I used the Gemini LLM for hardware-software co-design.


Publications

An asterisk (*) denotes equal contribution.

Conference Articles

  1. Task Shift: From Classification to Regression in Overparameterized Linear Models.
    Tyler LaBonte*, Kuo-Wei Lai*, and Vidya Muthukumar.
    AISTATS 2025. [arxiv] [code]
  2. The Group Robustness is in the Details: Revisiting Finetuning under Spurious Correlations.
    Tyler LaBonte, John C. Hill, Xinchen Zhang, Vidya Muthukumar, and Abhishek Kumar.
    NeurIPS 2024. [arxiv] [code] [poster] [video]
  3. Towards Last-layer Retraining for Group Robustness with Fewer Annotations.
    Tyler LaBonte, Vidya Muthukumar, and Abhishek Kumar.
    NeurIPS 2023. [arxiv] [code] [poster] [video]
  4. Scaling Novel Object Detection with Weakly Supervised Detection Transformers.
    Tyler LaBonte, Yale Song, Xin Wang, Vibhav Vineet, and Neel Joshi.
    WACV 2023. [arxiv] [code] [poster]

Journal Articles

  1. Student Misconceptions of Dynamic Programming: A Replication Study.
    Michael Shindler, Natalia Pinpin, Mia Markovic, Frederick Reiber, Jee Hoon Kim, Giles Pierre Nunez Carlos, Mine Dogucu, Mark Hong, Michael Luu, Brian Anderson, Aaron Cote, Matthew Ferland, Palak Jain, Tyler LaBonte, Leena Mathur, Ryan Moreno, and Ryan Sakuma.
Computer Science Education, 32(3):288–312, 2022.
  2. Quantifying the Unknown Impact of Segmentation Uncertainty on Image-Based Simulations.
    Michael C. Krygier, Tyler LaBonte, Carianne Martinez, Chance Norris, Krish Sharma, Lincoln N. Collins, Partha P. Mukherjee, and Scott A. Roberts.
    Nature Communications, 12(1):5414, 2021. [code]

Workshop Articles

  1. On the Unreasonable Effectiveness of Last-layer Retraining.
    John C. Hill, Tyler LaBonte, Xinchen Zhang, and Vidya Muthukumar.
    ICLR 2025 Workshop on Spurious Correlations and Shortcut Learning.
  2. Saving a Split for Last-layer Retraining can Improve Group Robustness without Group Annotations.
    Tyler LaBonte, Vidya Muthukumar, and Abhishek Kumar.
    ICML 2023 Workshop on Spurious Correlations, Invariance, and Stability. [code] [poster]
  3. Dropout Disagreement: A Recipe for Group Robustness with Fewer Annotations.
    Tyler LaBonte, Vidya Muthukumar, and Abhishek Kumar.
    NeurIPS 2022 Workshop on Distribution Shifts. [code] [poster]
  4. Scaling Novel Object Detection with Weakly Supervised Detection Transformers.
    Tyler LaBonte, Yale Song, Xin Wang, Vibhav Vineet, and Neel Joshi.
    CVPR 2022 Workshop on Transformers for Vision. [code] [poster]

Theses

  1. Finding the Needle in a High-Dimensional Haystack: Oracle Methods for Convex Optimization.
    Tyler LaBonte.
    Undergraduate Thesis, University of Southern California, 2021.
    Winner of the USC Discovery Scholar distinction.

Manuscripts

  1. We Know Where We Don't Know: 3D Bayesian CNNs for Credible Geometric Uncertainty.
    Tyler LaBonte, Carianne Martinez, and Scott A. Roberts.
    Manuscript, 2019. [code]

Selected Awards

  1. Simons Institute Deep Learning Theory Workshop Travel Grant ($2,000)
  2. DoD NDSEG Fellowship ($170,000)
  3. NSF Graduate Research Fellowship ($138,000—declined)
  4. USC Discovery Scholar (research distinction awarded to fewer than 100 USC graduates)
  5. USC Trustee Scholar (full scholarship worth $225,000)
  6. USC Viterbi Fellow (research funding worth $24,000)

Industry Research Experience

  1. Machine Learning Research Intern, Microsoft Research (2025)
  2. Machine Learning Research Intern, Google (2023)
  3. Machine Learning Research Intern, Microsoft Research (2021—2022)
  4. Machine Learning Research Intern, Google X (2020)
  5. Machine Learning Research Intern, Sandia National Labs (2019—2020)

Advising

  1. Xinchen Zhang—Georgia Tech MS (2024—2025)
  2. John C. Hill—Georgia Tech BS/MS → Georgia Tech PhD (2022—2024)

Teaching

  1. Lecturer/TA (8 lectures), Georgia Tech CS 7545: Machine Learning Theory (2024)
  2. Lecturer/TA (12 lectures), Georgia Tech CS 7545: Machine Learning Theory (2023)
  3. Teaching Assistant, USC CSCI 270: Intro to Algorithms and Theory of Computing (2021)
  4. Instructor, USC Center for AI in Society: Introduction to Machine Learning (2020)
  5. Teaching Assistant, USC CSCI 170: Discrete Methods in Computer Science (2019)

Academic Service

  1. Program Committee, ICLR Workshop on Spurious Correlations & Shortcut Learning (2025)
  2. Reviewer, International Conference on Machine Learning (2025)
  3. Organizer, Georgia Tech ML Theory Reading Group (2021—2023, 2025)
  4. System Administrator, Georgia Tech ML Theory GPU Cluster (2022—2025)
  5. Reviewer, International Conference on Learning Representations (2024)
  6. Reviewer, Conference on Neural Information Processing Systems (2023—2024)
  7. Student Organizer, Learning Theory Alliance Workshop (2023)

Other Activities

  1. Fleet Captain, Georgia Tech Sailing Club (2023—2025)
  2. House Chair, USC Hawai'i Club (2020—2021)
  3. Vice President of Finance, USC Hawai'i Club (2019—2020)
