About Me
I am a fourth-year PhD student in Machine Learning at the Georgia Institute of Technology, advised by Vidya Muthukumar and Jacob Abernethy. I completed my BS in Applied and Computational Mathematics at the University of Southern California, where I was advised by Shaddin Dughmi and was a Trustee Scholar and Viterbi Fellow. My work has been generously supported by the DoD NDSEG Fellowship.
I am interested in advancing our scientific understanding of deep learning and using theoretical insights to design methods that work well in practice. My current focus is characterizing the generalization phenomena of overparameterized neural networks and developing provable algorithms for robust learning, particularly under distribution shift. The ultimate goal of my research is to enable the safe and trusted deployment of deep learning systems in high-consequence applications such as medicine, defense, and energy.
My industry research experience has focused on the capabilities of large-scale language and vision models. During my PhD, I interned at Google, where I leveraged the Gemini LLM for hardware-software co-design, and at Microsoft Research, where I developed Detection Transformers for weakly supervised object detection.
Publications
An asterisk (*) denotes equal contribution.
Preprints
- Task Shift: From Classification to Regression in Overparameterized Linear Models.
Tyler LaBonte*, Kuo-Wei Lai*, and Vidya Muthukumar.
Under review.
Conference Articles
- The Group Robustness is in the Details: Revisiting Finetuning under Spurious Correlations.
Tyler LaBonte, John C. Hill, Xinchen Zhang, Vidya Muthukumar, and Abhishek Kumar.
NeurIPS 2024. [arxiv] [code] [poster]
- Towards Last-layer Retraining for Group Robustness with Fewer Annotations.
Tyler LaBonte, Vidya Muthukumar, and Abhishek Kumar.
NeurIPS 2023. [arxiv] [code] [poster] [video]
- Scaling Novel Object Detection with Weakly Supervised Detection Transformers.
Tyler LaBonte, Yale Song, Xin Wang, Vibhav Vineet, and Neel Joshi.
WACV 2023. [arxiv] [code] [poster]
Journal Articles
- Student Misconceptions of Dynamic Programming: A Replication Study.
Michael Shindler, Natalia Pinpin, Mia Markovic, Frederick Reiber, Jee Hoon Kim, Giles Pierre Nunez Carlos, Mine Dogucu, Mark Hong, Michael Luu, Brian Anderson, Aaron Cote, Matthew Ferland, Palak Jain, Tyler LaBonte, Leena Mathur, Ryan Moreno, and Ryan Sakuma.
Computer Science Education, 32(3):288–312, 2022.
- Quantifying the Unknown Impact of Segmentation Uncertainty on Image-Based Simulations.
Michael C. Krygier, Tyler LaBonte, Carianne Martinez, Chance Norris, Krish Sharma, Lincoln N. Collins, Partha P. Mukherjee, and Scott A. Roberts.
Nature Communications, 12(1):5414, 2021. [code]
Workshop Articles
- Saving a Split for Last-layer Retraining can Improve Group Robustness without Group Annotations.
Tyler LaBonte, Vidya Muthukumar, and Abhishek Kumar.
ICML 2023 Workshop on Spurious Correlations, Invariance, and Stability. [code] [poster]
- Dropout Disagreement: A Recipe for Group Robustness with Fewer Annotations.
Tyler LaBonte, Vidya Muthukumar, and Abhishek Kumar.
NeurIPS 2022 Workshop on Distribution Shifts. [code] [poster]
- Scaling Novel Object Detection with Weakly Supervised Detection Transformers.
Tyler LaBonte, Yale Song, Xin Wang, Vibhav Vineet, and Neel Joshi.
CVPR 2022 Workshop on Transformers for Vision. [code] [poster]
Theses
- Finding the Needle in a High-Dimensional Haystack: Oracle Methods for Convex Optimization.
Tyler LaBonte.
Undergraduate Thesis, University of Southern California, 2021.
Winner of the USC Discovery Scholar distinction.
Manuscripts
- We Know Where We Don't Know: 3D Bayesian CNNs for Credible Geometric Uncertainty.
Tyler LaBonte, Carianne Martinez, and Scott A. Roberts.
Manuscript, 2019. [code]
Selected Awards
- Simons Institute Deep Learning Theory Workshop Travel Grant ($2,000)
- DoD NDSEG Fellowship ($170,000)
- NSF Graduate Research Fellowship ($138,000; declined)
- USC Discovery Scholar (research distinction awarded to fewer than 100 USC graduates)
- USC Trustee Scholar (full scholarship worth $225,000)
- USC Viterbi Fellow (research funding worth $24,000)
Industry Research Experience
- Machine Learning Research Intern, Google (2023)
- Machine Learning Research Intern, Microsoft Research (2021–2022)
- Machine Learning Research Intern, Google X (2020)
- Machine Learning Research Intern, Sandia National Labs (2019–2020)
Advising
- Xinchen Zhang, Georgia Tech MS (2024–)
- John C. Hill, Georgia Tech BS/MS → Georgia Tech PhD (2022–2024)
Teaching
- Lecturer/TA (8 lectures), Georgia Tech CS 7545: Machine Learning Theory (2024)
- Lecturer/TA (12 lectures), Georgia Tech CS 7545: Machine Learning Theory (2023)
- Teaching Assistant, USC CSCI 270: Intro to Algorithms and Theory of Computing (2021)
- Instructor, USC Center for AI in Society: Introduction to Machine Learning (2020)
- Teaching Assistant, USC CSCI 170: Discrete Methods in Computer Science (2019)
Reviewing
- Reviewer, ICML 2025
- Reviewer, NeurIPS 2024
- Reviewer, ICLR 2024
- Reviewer, NeurIPS 2023
Service and Leadership
- System Administrator, Georgia Tech ML Theory GPU Cluster (2022–)
- Student Organizer, Learning Theory Alliance Workshop (2023)
- Organizer, Georgia Tech ML Theory Reading Group (2021–2023)
- Projects Lead, USC Center for AI in Society (2019)
- Associate Director of Robotics, USC Viterbi K-12 STEM Center (2018)
- Robotics Mentor, USC Viterbi K-12 STEM Center (2017–2018)
Other Activities
- Fleet Captain, Georgia Tech Sailing Club (2023–)
- House Chair, USC Hawai'i Club (2020–2021)
- Vice President of Finance, USC Hawai'i Club (2019–2020)