Tyler LaBonte

PhD Student in Machine Learning
Dept. of Industrial & Systems Engineering
Georgia Institute of Technology

Email | CV | Resume
Scholar | LinkedIn | GitHub | Twitter

Georgia Tech undergrads: Please contact me! I am happy to chat about research (or anything, really).

About Me

I am a first-year PhD student in Machine Learning at the Georgia Institute of Technology, advised by Tuo Zhao, and a Machine Learning Research Intern at Microsoft Research, advised by Neel Joshi. My work is generously supported by the DoD NDSEG Fellowship.

I received my BS in Applied and Computational Mathematics from the University of Southern California, where I was advised by Shaddin Dughmi and was a Trustee Scholar and Viterbi Fellow. My senior thesis in convex optimization received the USC Discovery Scholar distinction for exemplary research. During my undergraduate studies, I was a Machine Learning Research Intern at Google X and Sandia National Laboratories.

I use theory to develop robust and reliable machine learning models that can be deployed in high-consequence applications. My current focus is optimization and generalization in deep learning, where I seek to reconcile theoretical understanding with empirical phenomena. I also enjoy working on self-supervised and weakly supervised machine learning, particularly for computer vision applications.

In 2021, I was one of two undergraduates to receive both the DoD NDSEG and NSF GRFP fellowships in Computer Science; the other was Ethan Fahnestock. To democratize the fellowship application process, I have made my essays available here.

Publications

Preprints

  1. M.C. Krygier, T. LaBonte, C. Martinez, C. Norris, K. Sharma, L.N. Collins, P.P. Mukherjee, and S.A. Roberts. Quantifying the Unknown: Impact of Segmentation Uncertainty on Image-Based Simulations. Under submission to Nature Communications, 2020.
  2. T. LaBonte, C. Martinez, and S.A. Roberts. We Know Where We Don't Know: 3D Bayesian CNNs for Credible Geometric Uncertainty. Preprint, 2019.

Theses

  1. T. LaBonte. Finding the Needle in a High-Dimensional Haystack: Oracle Methods for Convex Optimization. Undergraduate Thesis, 2021. USC Discovery Scholar distinction.

Awards

  1. DoD NDSEG Fellowship ($170,000)
  2. NSF Graduate Research Fellowship ($138,000; declined)
  3. USC Discovery Scholar (research distinction for <100 USC graduates)
  4. USC Trustee Scholar ($225,000)
  5. USC Viterbi Fellow ($24,000)

Industry Experience

  1. Machine Learning Research Intern, Microsoft Research (2021)
  2. Machine Learning Research Intern, Google X (2020)
  3. Machine Learning Research Intern, Sandia National Labs (2019)
  4. Machine Learning Engineer Intern, Air Force Research Lab (2018)

Teaching

  1. USC CSCI 270: Intro to Algorithms and Theory of Computing (2021)
  2. USC Center for AI in Society: Introduction to Machine Learning (2020)
  3. USC CSCI 170: Discrete Methods in Computer Science (2019)

Service and Leadership

  1. Projects Lead, USC Center for AI in Society (2019)
  2. Associate Director of Robotics Outreach, USC Viterbi K-12 STEM Center (2018)
  3. Volunteer VEX Robotics Mentor, USC Viterbi K-12 STEM Center (2017–2018)