I am a PhD student at UC Berkeley interested in statistics and machine learning, advised by Michael Mahoney. I also work closely with Jason Klusowski at Princeton. Before coming to Berkeley, I was a student at Arizona State University, where I studied mathematics and economics, and completed a master’s thesis under Sebastien Motsch in the mathematics department. I was also the machine learning lead in the Luminosity Lab. Outside of university, I spent two summers as a research intern at Salesforce Research, and have also worked for Amazon.com in the past. See my CV for more.
I love to travel and spend as much time outdoors as possible. I always take my camera wherever I go; feel free to check out some of my photos.
Since coming to Berkeley, I have been primarily interested in theoretical aspects of deep learning, most recently the question of generalization. Namely, I'm interested in when and how we can fit extremely complicated, expressive models to data and still expect these models to perform well on new, unseen data.
Below is a list of papers I’ve co-authored.
Yang, Y., Hodgkinson, L., Theisen, R., Zou, J., Gonzalez, J. E., Ramchandran, K., Mahoney, M. W. Taxonomizing Local Versus Global Structure in Neural Network Loss Landscapes. Conference on Neural Information Processing Systems, 2021. [ArXiv]
Theisen, R., Wang, H., Varshney, L. R., Xiong, C., Socher, R. Evaluating State-of-the-Art Classification Models Against Bayes Optimality. Conference on Neural Information Processing Systems, 2021. [ArXiv]
Cao, F., Motsch, S., Reamy, A., Theisen, R. Asymptotic Flocking for the Three-Zone Model. Mathematical Biosciences and Engineering, 2020. [AIMS]
Theisen, R., Klusowski, J. M., Wang, H., Keskar, N., Xiong, C., Socher, R. Global Capacity Measures for Deep ReLU Networks via Path Sampling. Submitted, 2019. [ArXiv]
Theisen, R. Convergence Results for Two Models of Interaction. Master's Thesis, advised by Sebastien Motsch. [Download]
Email: theisen [at] berkeley.edu