
Kevin Tian
kjtian (at) stanford (dot) edu
About me: I am a fifth-year Ph.D. student in the Theory Group in Computer Science at Stanford (since 2016). I am broadly interested in the interplay between continuous optimization and algorithm design in areas such as spectral graph theory, stochastic processes, high-dimensional statistics, convex geometry, and machine learning. I am fortunate to be advised by Aaron Sidford. I completed my undergraduate studies in Computer Science and Math at MIT from 2012 to 2015. I also care greatly about STEM education, diversity, and related outreach.
In my free time, I enjoy dabbling in guitar and piano (and various quirkier instruments), running and biking (slowly), and watching an unhealthy number of semi-educational YouTube videos. I am an avid fan of the San Antonio Spurs, Philadelphia Eagles, and Texas Longhorns. I am from the beautiful city of Austin, Texas, home of the best barbecue in the world. Please do not hesitate to get in touch if we share interests.
Website design credits go to Sunny Tian (check out some of her awesome work here). Photo credits go to Sonya Han.
My awesome (frequent) collaborators: Yair Carmon, Arun Jambulapati, Yujia Jin, Yin Tat Lee, Jerry Li, Swati Padmanabhan, Ruoqi Shen, Ilias Diakonikolas, Daniel Kane, Daniel Kongsgaard
Preprints
- List-Decodable Mean Estimation in Nearly-PCA Time
  with Ilias Diakonikolas, Daniel Kane, Daniel Kongsgaard, and Jerry Li
  arXiv preprint, 2020.
- Structured Logconcave Sampling with a Restricted Gaussian Oracle
  with Yin Tat Lee and Ruoqi Shen
  arXiv preprint, 2020.
- Semi-Streaming Bipartite Matching in Fewer Passes and Less Space
  with Yujia Jin and Aaron Sidford
  arXiv preprint, 2020.
- Well-Conditioned Methods for Ill-Conditioned Systems: Linear Regression with Semi-Random Noise
  with Jerry Li, Aaron Sidford, and Huishuai Zhang
  arXiv preprint, 2020.
Publications
- Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration
  with Michael B. Cohen and Aaron Sidford
  Innovations in Theoretical Computer Science (ITCS), 2021.
- Acceleration with a Ball Optimization Oracle
  with Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, and Aaron Sidford
  Neural Information Processing Systems (NeurIPS), 2020. Oral presentation (top 1% of submissions).
- Robust Sub-Gaussian Principal Component Analysis and Width-Independent Schatten Packing
  with Arun Jambulapati and Jerry Li
  Neural Information Processing Systems (NeurIPS), 2020. Spotlight presentation (top 3% of submissions).
- Coordinate Methods for Matrix Games
  with Yair Carmon, Yujia Jin, and Aaron Sidford
  Foundations of Computer Science (FOCS), 2020.
- Logsmooth Gradient Concentration and Tighter Runtimes for Metropolized Hamiltonian Monte Carlo
  with Yin Tat Lee and Ruoqi Shen
  Conference on Learning Theory (COLT), 2020.
- Positive Semidefinite Programming: Mixed, Parallel, and Width-Independent
  with Arun Jambulapati, Yin Tat Lee, Jerry Li, and Swati Padmanabhan
  Symposium on Theory of Computing (STOC), 2020.
- Variance Reduction for Matrix Games
  with Yair Carmon, Yujia Jin, and Aaron Sidford
  Neural Information Processing Systems (NeurIPS), 2019. Oral presentation (top 0.5% of submissions).
- A Direct Õ(1/ϵ) Iteration Parallel Algorithm for Optimal Transport
  with Arun Jambulapati and Aaron Sidford
  Neural Information Processing Systems (NeurIPS), 2019.
- A Rank-1 Sketch for Matrix Multiplicative Weights
  with Yair Carmon, John C. Duchi, and Aaron Sidford
  Conference on Learning Theory (COLT), 2019.
- Coordinate Methods for Accelerating ℓ∞ Regression and Faster Approximate Maximum Flow
  with Aaron Sidford
  Foundations of Computer Science (FOCS), 2018. Invited to the FOCS SICOMP Special Issue.
- CoVeR: Learning Covariate-Specific Vector Representations with Tensor Decompositions
  with Teng Zhang and James Zou
  International Conference on Machine Learning (ICML), 2018.
- Learning Populations of Parameters
  with Weihao Kong and Gregory Valiant
  Neural Information Processing Systems (NeurIPS), 2017. Here is a humorous informational video about the paper.
Earlier published work
- A novel k-mer set memory (KSM) motif representation improves regulatory variant prediction
  with Yuchun Guo, Haoyang Zeng, Xiaoyun Guo, and David K. Gifford
  Genome Research, 2018.
- Predicting gene expression in massively parallel reporter assays: a comparative study
  with Anat Kreimer, Haoyang Zeng, et al.
  Human Mutation, 2017.
- K-mer Set Memory (KSM) Motif Representation Enables Accurate Prediction of the Impact of Regulatory Variants
  with Yuchun Guo, Haoyang Zeng, and David K. Gifford
  International Conference on Research in Computational Molecular Biology (RECOMB), 2017.
- On the Power Dominating Sets of Hypercubes
  with Nathaniel Dean, Alexandra Ilic, Ignacio Ramirez, and Jian Shen
  International Conference on Computational Science and Engineering (CSE), 2011.
Teaching and professional service
At Stanford, I was a graduate teaching assistant for MS&E 213/CS 269O: Introduction to Optimization Theory and MS&E 313/CS 269G: Almost Linear Time Graph Algorithms. At MIT, I was a teaching assistant for 6.006: Introduction to Algorithms. In the 2018-19 academic year, I ran the Stanford Optimization for Algorithm Design reading group. I was the Stanford Theory Lunch organizer for Spring 2020.
Reviewer for: STOC 2019, COLT 2019, STOC 2020, ICALP 2020, ICML 2020, NeurIPS 2020, SODA 2021, STACS 2021, STOC 2021