I work on randomized algorithms for scalable machine learning. By replacing expensive exact algorithms with lightweight approximate methods, we can substantially reduce the computation, memory, and energy a program needs. Machine learning is an ideal application area because learning algorithms can adapt to the noise introduced by the approximation.

My current work is on efficient approximate algorithms for low-level building blocks of machine learning, such as kernel sums or near-neighbor search. I am particularly interested in simple methods with theoretical guarantees that also work well in a web-scale production environment.
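As an illustration of this flavor of method (a generic textbook technique, not an implementation from any of the papers below), here is a minimal sketch of approximating a Gaussian-kernel sum with random Fourier features. All variable names and parameter choices are my own for the example: after a one-time featurization of the dataset, each query's kernel sum collapses to a single dot product instead of a pass over all points.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, n_features, sigma, rng):
    # Random Fourier features: E[z(x) . z(y)] = exp(-||x - y||^2 / (2 sigma^2))
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

n, d, sigma = 1000, 4, 2.0
X = rng.normal(size=(n, d))   # dataset
q = rng.normal(size=(1, d))   # query point

# Featurize dataset and query with the SAME random projection.
Z = rff_features(np.vstack([X, q]), n_features=4096, sigma=sigma, rng=rng)
z_data, z_q = Z[:-1], Z[-1]

# Approximate kernel sum: the dataset features are summed once in
# preprocessing, then each query costs one O(n_features) dot product.
approx = float(z_q @ z_data.sum(axis=0))

# Exact kernel sum for comparison: sum_i exp(-||q - x_i||^2 / (2 sigma^2)).
exact = float(np.exp(-((X - q) ** 2).sum(axis=1) / (2 * sigma**2)).sum())
```

The approximation error shrinks as `n_features` grows, so the feature count trades accuracy for per-query cost.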

Benjamin Coleman, Anshumali Shrivastava. 2020. The Web Conference **(WWW20)**.

Benjamin Coleman, Richard G. Baraniuk, Anshumali Shrivastava. 2020. International Conference on Machine Learning **(ICML20)**.

Benjamin Coleman*, Benito Geordie*, Li Chou, R. A. Leo Elworth, Todd J. Treangen, and Anshumali Shrivastava (preprint, under review)

*Equal contribution

Benjamin Coleman, Anshumali Shrivastava (preprint, under review)

Benjamin Coleman*, Gaurav Gupta*, John Chen, Anshumali Shrivastava (preprint, under review)

*Equal contribution

John Chen, Benjamin Coleman, Anshumali Shrivastava (preprint, under review)