The Life and Times of

Cyrus Cousins

Finitely Wise, Infinitely Curious

About Me

Abridged Biography

I am Cyrus Cousins, a visiting assistant professor at Brown University in the BIGDATA group, where I also completed my doctoral studies under the tutelage of the great Eli Upfal. Before arriving at Brown, I earned my undergraduate degree in computer science, mathematics, and biology from Tufts University. My research interests lie primarily in developing novel techniques to bound uniform convergence rates and generalization error in exotic settings, and in applying these bounds to tasks of real-world interest, most notably in data science, empirical game theory, and fair machine learning.

My favorite theorem is the Dvoretzky-Kiefer-Wolfowitz Inequality, and my favorite algorithm is simulated annealing.
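For readers unfamiliar with it, the standard statement of the Dvoretzky-Kiefer-Wolfowitz inequality (with the tight constant due to Massart) bounds the uniform deviation between the empirical CDF of n i.i.d. samples and the true CDF F:

    \Pr\Bigl[\,\sup_{x \in \mathbb{R}} \bigl|\hat{F}_n(x) - F(x)\bigr| > \varepsilon\Bigr] \le 2e^{-2n\varepsilon^2} \qquad \text{for all } \varepsilon > 0 .

That is, a single sample of size n pins down the entire distribution function to uniform accuracy ε, with confidence improving exponentially in nε².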

Research Overview

In my research, I strive to strike a delicate balance between theory and practice. On the theory side, my work primarily lies in sample complexity analysis for machine learning algorithms, as well as time complexity analysis and probabilistic guarantees for efficient sampling-based routines in randomized algorithms and statistical data science [1] [2] [3] [4]. In addition to statistical analysis, much of my work deals with delicate computational questions, such as how to optimally characterize and estimate the sample complexity of various estimation tasks (with applications to oblivious algorithms, which achieve near-optimal performance while requiring limited a priori knowledge), as well as the development of fair-PAC learning, with accompanying computational and statistical reductions between classes of learnable models.

On the practical side, much of my early work was motivated by the observation that modern methods in statistical learning theory (Rademacher averages and localized Rademacher averages) often yield vacuous or unsatisfying guarantees, so I strove to understand why, and to show sharper bounds, with particular emphasis on constant factors and performance in the small-sample setting. From there, I have worked to apply the statistical methods developed for these approaches to myriad practical settings, including statistical data science tasks, the analysis of machine learning algorithms, and more recently, fairness-sensitive machine learning algorithms.
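To make these quantities concrete, here is a minimal Monte Carlo sketch (a hypothetical illustration, not code from my papers) that estimates the empirical Rademacher average of a small finite function class and plugs it into a standard uniform convergence bound; the threshold class, sample size, and confidence level below are placeholder choices.

    import numpy as np

    def empirical_rademacher(F_values, n_draws=1000, seed=None):
        """Monte Carlo estimate of the empirical Rademacher average of a
        finite function class: E_sigma[ sup_f (1/n) sum_i sigma_i f(x_i) ].

        F_values: array of shape (num_functions, n), F_values[j, i] = f_j(x_i).
        """
        rng = np.random.default_rng(seed)
        _, n = F_values.shape
        total = 0.0
        for _ in range(n_draws):
            sigma = rng.choice([-1.0, 1.0], size=n)  # random Rademacher signs
            total += np.max(F_values @ sigma) / n    # supremum over the finite class
        return total / n_draws

    def deviation_term(rademacher, n, delta):
        """Standard one-sided guarantee: with probability at least 1 - delta,
        every [0, 1]-valued f in the class satisfies
        E[f] <= (empirical mean of f) + 2 * rademacher + 3 * sqrt(ln(2/delta) / (2n))."""
        return 2.0 * rademacher + 3.0 * np.sqrt(np.log(2.0 / delta) / (2.0 * n))

    # Placeholder class: 50 threshold functions evaluated on 200 uniform samples.
    rng = np.random.default_rng(0)
    x = rng.uniform(size=200)
    thresholds = np.linspace(0.0, 1.0, 50)
    F_values = (x[None, :] <= thresholds[:, None]).astype(float)
    rad = empirical_rademacher(F_values, seed=1)
    print(f"Rademacher average ~ {rad:.3f}; deviation term ~ {deviation_term(rad, 200, 0.05):.3f}")

Even in this toy setting, the additive confidence term is comparable to the Rademacher term at small n, which is precisely the small-sample slack that sharper bounds aim to reduce.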

By blurring the line between theory and practice, I have been able to adapt rigorous theoretical guarantees to novel settings. For example, my work on adversarial learning from weak supervision stemmed from a desire to apply statistical learning theory techniques in the absence of sufficient labeled data. Conversely, I have also been able to treat theoretical problems that previously seemed unmotivated or irrelevant; my work in fair machine learning led to the fair-PAC learning formalism, where power-means over per-group losses (rather than averages) are minimized. The motivation to optimize power-means derives purely from the economic theory of cardinal welfare, but the value of this learning concept only becomes apparent when one observes that many of the desirable (computational and statistical) properties of risk minimization directly translate to power-mean minimization.
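To make the power-mean objective concrete, the sketch below (with hypothetical per-group losses and group weights, not data from my work) computes the weighted power mean of per-group losses: p = 1 recovers the usual utilitarian average, larger p weighs disadvantaged groups more heavily, and p → ∞ approaches the worst-off group's loss.

    import numpy as np

    def power_mean(losses, weights, p):
        """Weighted power mean M_p(l; w) = (sum_g w_g * l_g**p)**(1/p), for p >= 1.

        Over per-group losses, p = 1 is the utilitarian (weighted) average,
        and p -> infinity approaches the maximum (worst-off group) loss.
        """
        losses = np.asarray(losses, dtype=float)
        weights = np.asarray(weights, dtype=float)
        weights = weights / weights.sum()  # normalize group weights
        if np.isinf(p):
            return float(losses.max())     # egalitarian limit
        return float(np.sum(weights * losses ** p) ** (1.0 / p))

    # Hypothetical per-group losses and group proportions.
    losses, weights = [0.10, 0.25, 0.40], [0.5, 0.3, 0.2]
    for p in [1, 2, 4, np.inf]:
        print(f"p = {p}: M_p = {power_mean(losses, weights, p):.3f}")

Since the power mean is convex in the losses for p >= 1, many of the standard tools of risk minimization carry over directly, which is exactly the translation described above.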

Research Statements

My research statement is publicly available (5 pages).

The best (mostly current) overview of my research is given in my thesis summary (4 pages). The piece is a non-mathematical, but still somewhat technical, overview of my dissertation.

The best mathematical overview of my research for general audiences is given in this piece (5 pages). Here the focus is less on applications and implications, and more on intuition for the deeper mathematical connections between my various areas of study. Results are selected for elegance and simplicity, and the piece should be broadly accessible to anyone with a basic grounding in probability and statistics.

News

2019

  • I will be returning to the Labs group at Two Sigma Investments to work with Larry Rudolph.

2018

  • I have accepted a summer internship offer with Two Sigma Investments, and will be working with Matteo Riondato in the Labs group.


Major Projects

  1. Axiomatically Justified and Statistically Sound Fair Machine Learning
  2. Fair Adversarial Machine Learning from Partial Demographic Information
  3. Making mean-estimation more efficient using an MCMC trace variance approach: DynaMITE
  4. Adversarial Multi Class Learning under Weak Supervision with Performance Guarantees
  5. Sharp Uniform Convergence Bounds through Empirical Centralization
  6. CADET: Interpretable Parametric Conditional Density Estimation with Decision Trees and Forests
  7. Empirical Game Theoretic Analysis

My dissertation: Bounds and Applications of Concentration of Measure in Fair Machine Learning and Data Science


A Complete List of Publications

Find My Work

  1. DBLP
  2. Google Scholar
  3. ResearchGate
  4. Scopus
  5. arXiv


Teaching

Professor

Graduate Teaching Assistant


Curriculum Vitae (CV)