Uthaipon (Tao) Tantipongpipat

I am a machine learning researcher on the META (Machine learning Ethics, Transparency, and Accountability) team at Twitter Cortex.

CV  •   Resume  •   Publications  •   Google Scholar  •   LinkedIn   •   Github  •   Research Statement
(Last updated June 2020)


I graduated from the Algorithms, Combinatorics, and Optimization (ACO) PhD program at the Georgia Institute of Technology, based in the School of Computer Science. My research interests include machine learning algorithms, combinatorial optimization, differential privacy, and fairness in machine learning. I was grateful to be advised by Mohit Singh. Our research focused on finding better (randomized and deterministic) polynomial-time approximation algorithms for optimal design problems in statistics. The first algorithmic work is joint with Aleksandar (Sasho) Nikolov and Vivek Madan. My colleagues Vivek Madan, Mohit Singh, Weijun Xie, and I were also the first to prove theoretical guarantees for commonly used heuristics, namely local search (Fedorov exchange) and the greedy algorithm, in this work.

In addition, I work on differential privacy. My first project, with Rachel Cummings and Sara Krehbiel, was on growing databases. Part of the work was presented at TPDP 2017. Rachel and I were also part of the team that won first prize and the people's choice award ($20,000 total) in NIST's privacy challenge. Our solution generates synthetic data with differential privacy via GANs and was presented at TPDP 2018. More recently, I have been exploring the practicality of employing differential privacy in training large deep learning models (during a summer 2019 internship, hosted by Janardhan Kulkarni and Sergey Yekhanin).

Another recent direction of my work is fairness in machine learning. In particular, my coauthors and I defined a notion of fairness in PCA (see the blog post on this notion) and proposed algorithms for two groups. We later extended the result to fairness over multiple groups and more general objectives. The algorithms have proven theoretical guarantees and scale to large datasets. The implementation is publicly available on GitHub.


As an undergraduate, I studied mathematics at the University of Richmond. My undergraduate research was in discrete geometry and algebraic combinatorics, mainly on bent functions (coding theory) and partial difference sets.

My undergraduate thesis, under the supervision of James Davis, was in the areas of algebraic combinatorics, coding theory, and discrete geometry, specifically on Cameron-Liebler line classes and partial difference sets. It includes a new non-existence result for partial difference sets in a certain class of abelian groups.

I am originally from Bangkok, Thailand, where I graduated from high school at Bangkok Christian College. During middle and high school, I participated in and very much enjoyed national and international mathematics competitions and the serious training that came with them.


For papers marked with *, the author order is alphabetical, as is conventional in the theoretical computer science community.

  1. Fast and Memory Efficient Differentially Private-SGD via JL Projections
    * Zhiqi Bu, Sivakanth Gopi, Janardhan Kulkarni, Yin Tat Lee, Judy Hanwen Shen, and Uthaipon Tantipongpipat.

  2. λ-Regularized A-Optimal Design and its Approximation by λ-Regularized Proportional Volume Sampling
    Uthaipon Tantipongpipat.

  3. Differentially Private Mixed-Type Data Generation For Unsupervised Learning
    Uthaipon Tantipongpipat, Chris Waites, Digvijay Boob, Amaresh Ankit Siva, and Rachel Cummings.
    Code on GitHub

  4. Maximizing Determinants under Matroid Constraints
    * Vivek Madan, Aleksandar Nikolov, Mohit Singh, and Uthaipon Tantipongpipat.
    Symposium on Foundations of Computer Science (FOCS) 2020.

  5. Multi-Criteria Dimensionality Reduction with Applications to Fairness
    Uthaipon Tantipongpipat, Samira Samadi, Jamie Morgenstern, Mohit Singh, and Santosh Vempala.
    Conference on Neural Information Processing Systems (NeurIPS) 2019, spotlight (top 2.5% of submitted papers).
    Code on GitHub | GT Press Release

  6. Combinatorial Algorithms for Optimal Design
    * Vivek Madan, Mohit Singh, Uthaipon Tantipongpipat, and Weijun Xie.
    Conference on Learning Theory (COLT) 2019.

  7. Proportional Volume Sampling and Approximation Algorithms for A-Optimal Design
    * Aleksandar Nikolov, Uthaipon Tantipongpipat, and Mohit Singh.
    ACM-SIAM Symposium on Discrete Algorithms (SODA) 2019.

  8. The Price of Fair PCA: One Extra Dimension
    Samira Samadi, Uthaipon Tantipongpipat, Jamie Morgenstern, Mohit Singh, and Santosh Vempala.
    Conference on Neural Information Processing Systems (NeurIPS) 2018.
    Website on fair PCA | Code on GitHub | Poster | GT Press Release

  9. Differential Privacy for Growing Databases
    * Rachel Cummings, Sara Krehbiel, Kevin Lai, and Uthaipon Tantipongpipat.
    Conference on Neural Information Processing Systems (NeurIPS) 2018.

  10. A Combinatorial Approach to Ebert's Hat Game with Many Colors
    Uthaipon Tantipongpipat.
    The Electronic Journal of Combinatorics 21.4 (2014): P4-33.


  • Microsoft Research intern, Redmond, WA. Summer 2019.
    Supervisors: Janardhan Kulkarni and Sergey Yekhanin
    Research topic: implementation of differential privacy in large-scale deep machine learning models (NLP) and privacy analysis of correlation clustering.



[email protected]