This lecture series honors Paris Kanellakis, a distinguished computer scientist who was an esteemed and beloved member of Brown CS. Paris joined us in 1981 and became a full professor in 1990. His research area was theoretical computer science, with an emphasis on the principles of database systems, logic in computer science, the principles of distributed computing, and combinatorial optimization.
Constantinos Daskalakis of MIT will deliver the 19th Paris C. Kanellakis Memorial Lecture ("Learning from Censored and Dependent Data") on Thursday, December 5, in CIT 368:
"Machine Learning is invaluable for extracting insights from large volumes of data. A key assumption enabling many methods, however, is having access to training data comprising independent observations from the entire distribution of relevant data. In practice, data is commonly missing due to measurement limitations, legal restrictions, or data collection and sharing practices. Moreover, observations are commonly collected on a network, a spatial or a temporal domain and may be intricately dependent. Training on data that is censored or dependent is known to lead to Machine Learning models that are biased.
In this talk, we overview recent work on learning from censored and dependent data. We propose a learning framework which is widely applicable, and instantiate this framework to obtain computationally and statistically efficient methods for linear, logistic and probit regression from censored or dependent samples, in high dimensions. We complement these theoretical findings with experiments showing the practicality of the framework in training Deep Neural Network models on biased data. Our findings are enabled through connections to Statistical Physics, Concentration and Anti-concentration of measure, and properties of Stochastic Gradient Descent, and resolve classical challenges in Statistics and Econometrics."
Constantinos (a.k.a. “Costis”) Daskalakis is a Professor of Computer Science at MIT and a member of CSAIL. He works on computation theory and its interface with game theory, economics, probability theory, statistics, and machine learning. He holds a Diploma in Electrical and Computer Engineering from the National Technical University of Athens, Greece, and a PhD in Computer Science from UC Berkeley. He has been honored with the ACM Doctoral Dissertation Award, the Kalai Prize from the Game Theory Society, the Sloan Foundation Fellowship, the Microsoft Faculty Fellowship, the SIAM Outstanding Paper Prize, the ACM Grace Murray Hopper Award, the Simons Investigator Award, the Bodossaki Foundation Distinguished Young Scientists Award, and the Nevanlinna Prize from the International Mathematical Union.
To watch the recording of a lecture or read its abstract, click its title.
|2018||Algorithms: Theory meets Practice||Robert E. Tarjan (Princeton)|
|2017||Below P vs. NP: Conditional Quadratic-Time Hardness for Big Data Problems||Piotr Indyk (MIT)|
|2016||Donald Knuth (Stanford)|
|2015||Shafi Goldwasser (MIT)|
|2014||Daniel Spielman (Yale)|
|2013||Jon Kleinberg (Cornell)|
|2012||Cynthia Dwork (Microsoft)|
|2011||Andrew Yao (Tsinghua University)|
|2010||Moshe Vardi (Rice University)|
|2009||John C. Mitchell (Stanford)|
|2008||Anna Karlin (University of Washington)|
|2006||Eugene Myers (Howard Hughes Medical Institute)|
|2005||Richard Karp (UC Berkeley)|
|2004||Michael Rabin (Harvard)|
|2003||Reconfigurable Atomic Memory for Dynamic Networks||Nancy Lynch (MIT; delivered by Alex Shvartsman)|
|2002||Christos Papadimitriou (UC Berkeley)|
|2001||Mihalis Yannakakis (Avaya Laboratories)|