Department of Mathematics

Charles Amick Memorial Lectures in Applied Mathematics

Charles J. Amick was an applied mathematician at the University of Chicago who died in 1991 at the age of 39. The lecture series was established in 1993 to honor the memory of a life cut painfully short.


2018 Speaker: L. Craig Evans (UC Berkeley)

Lecture 1: Riccati equations and nonlinear PDE

Tuesday, March 6, 2018, 4:00pm - Eckhart 202

Abstract: In this expository lecture, I will discuss some partial differential equation analogs of Riccati equation tricks from ordinary differential equation theory.
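
Background note (for orientation only; the PDE analogs in the lecture may of course differ): the prototypical ODE trick is the logarithmic substitution that linearizes a scalar Riccati equation,

\[
  y' + y^2 = V(x), \qquad y = \frac{u'}{u}
  \;\Longrightarrow\;
  y' + y^2 = \frac{u''}{u},
\]

so the nonlinear Riccati equation for \(y\) becomes the linear equation \(u'' = V u\) for \(u\). One PDE analog in the same spirit is the Hopf-Cole transformation, which linearizes certain viscous Hamilton-Jacobi (Burgers-type) equations.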

Lecture 2: Weak KAM 1: Formalism and Foundations

Wednesday, March 7, 2018, 4:00pm - Eckhart 202

Abstract: I will briefly describe both the Lagrangian and Hamiltonian approaches to so-called weak KAM theory and explain how interesting information about dynamics is encoded within certain associated nonlinear partial differential equations.
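
For reference (stated informally here, not as part of the abstract), one central PDE in weak KAM theory is the cell, or ergodic, problem: for each vector \(P\) one seeks a constant \(\overline{H}(P)\) and a periodic function \(v\) solving, in the viscosity sense,

\[
  H\big(P + D_x v(x),\, x\big) \;=\; \overline{H}(P), \qquad x \in \mathbb{T}^n .
\]

The effective Hamiltonian \(\overline{H}\) and the associated solutions \(v\) carry the dynamical information referred to above; on the Lagrangian side the same information appears through minimizers of the action \(\int L(x,\dot{x})\,dt\), in the spirit of Mather's theory.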

Lecture 3: Weak KAM 2: Recent Progress and Conjectures

Thursday, March 8, 2018, 4:00pm - Eckhart 202

Abstract: I will continue with various themes of the previous lecture and will, in particular, discuss some recent conjectures concerning weak KAM theory and distinguished solutions of related matrix Riccati equations.


2016 Speaker: Jennifer Chayes (Microsoft Research)

Lecture 1: Modeling and Estimating Massive Networks: Overview

Friday, October 28, 2016, 4:00pm–5:00pm, Ryerson 251

Abstract: There are numerous examples of sparse massive networks, including the Internet, the WWW, and online social networks. How do we model and learn these networks? In contrast to conventional learning problems, where we have many independent samples, for these networks we can often get only one independent sample. How do we use a single snapshot today to learn a model for the network, and hence predict a similar but larger network in the future? For relatively small or moderately sized networks, it is appropriate to model the network parametrically and attempt to learn its parameters. For massive networks, a non-parametric representation is more appropriate. I review the theory of graph limits (graphons), developed over the last decade to describe limits of dense graphs and, more recently, sparse graphs of unbounded degree, such as power-law graphs. I then show how to use these graphons to give consistent estimators of non-parametric models of sparse networks, and moreover how to do this in a way that protects the privacy of individuals on the network. This first lecture gives an overview of results, while the final two focus more on details and methods.
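
As a concrete illustration of the graphon framework, the sketch below (not material from the lecture; the particular graphon w and sparsity parameter rho are hypothetical choices) shows how a graphon generates a random graph: each vertex receives a latent uniform label, and each edge appears independently with probability proportional to the graphon evaluated at the endpoint labels.

import numpy as np

def sample_graphon_graph(w, n, rho=1.0, seed=None):
    """Sample an n-vertex W-random graph from a graphon w: [0,1]^2 -> [0, inf).

    Each vertex i gets a latent label x_i ~ Uniform[0,1]; the edge {i, j}
    appears independently with probability min(1, rho * w(x_i, x_j)).
    rho = 1 recovers the classical dense model; letting rho -> 0 as n grows
    is one way to produce sparse graphs of unbounded degree.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(size=n)               # latent vertex labels
    adj = np.zeros((n, n), dtype=int)     # adjacency matrix
    for i in range(n):
        for j in range(i + 1, n):
            if rng.uniform() < min(1.0, rho * w(x[i], x[j])):
                adj[i, j] = adj[j, i] = 1
    return adj

# Illustrative (hypothetical) graphon producing heavier-tailed degrees:
A = sample_graphon_graph(lambda x, y: ((1 - x) * (1 - y)) ** -0.25, n=300, rho=0.05, seed=0)
print("average degree:", A.sum(axis=1).mean())

Sending rho to 0 as n grows gives the sparse regime discussed in the abstract; consistently estimating w from a single sampled graph A is the statistical problem the lectures address.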

Lecture 2: Limits and Stochastic Models for Sparse Massive Networks

Tuesday, November 1, 2016, 4:00pm–5:00pm, Eckhart 202

Abstract: Graphons are obtained as limits of sequences of graphs, and provide non-parametric ways of modeling and generating networks. In this lecture, I focus on analytical methods to obtain graphons as limits of sparse networks of unbounded degree. These networks include growing power-law networks, such as Facebook and Google, in which vertices can disconnect from previous friends and contacts (known as non-projective sequences of networks). A fundamental tool in the study of networks and other random structures is the Szemerédi Regularity Lemma, which has been used extensively in combinatorics and other fields to discover apparent order within random structures. We show how the conventional (dense-network) form of the lemma breaks down for sparse networks, and how to extend it in the sparse case.
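
For orientation (recalled here, not taken from the lecture): a convenient weak (Frieze-Kannan) form of the regularity lemma says that for every \(\varepsilon > 0\), any graph \(G\) on \(n\) vertices admits a vertex partition \(\mathcal{P}\) into at most \(2^{O(1/\varepsilon^2)}\) classes with

\[
  \| A_G - A_{\mathcal{P}} \|_{\square} \;\le\; \varepsilon\, n^2 ,
\]

where \(A_{\mathcal{P}}\) is the adjacency matrix averaged over the blocks of \(\mathcal{P}\) and \(\|\cdot\|_{\square}\) is the cut norm. For a sparse graph with \(o(n^2)\) edges, the error term \(\varepsilon n^2\) already exceeds the total edge weight, so this dense form of the statement becomes vacuous; meaningful sparse versions must rescale by the edge density, which is the kind of extension alluded to in the abstract.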

Lecture 3: Exchangeability and Estimation of Sparse Massive Networks

Wednesday, November 2, 2016, 4:00pm–5:00pm, Eckhart 206

Abstract: Graphons are obtained as limits of sequences of graphs, and provide non-parametric ways of modeling and generating networks. In this final lecture, I give an alternative way to model massive sparse networks, in this case networks that retain all previous connections as they grow. While the approach of Lecture 2 was more analytic, this lecture focuses on statistical considerations. I show how to extend the classic de Finetti Theorem for infinite exchangeable sequences, and the later Aldous-Hoover Theorem for infinite exchangeable dense arrays, to the case of sparse arrays. I also prove that graphons can be used to consistently estimate massive sparse networks and, in certain cases, to do so in a way that preserves the privacy of the individuals on the network.
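
For reference, the two classical representation theorems mentioned above can be stated informally as follows (the sparse extension discussed in the lecture is not reproduced here). De Finetti: an infinite exchangeable sequence of 0/1-valued random variables is a mixture of i.i.d. Bernoulli sequences,

\[
  \mathbb{P}(X_1 = x_1, \dots, X_n = x_n)
  \;=\; \int_0^1 \theta^{\sum_i x_i}\,(1-\theta)^{\,n - \sum_i x_i}\, \mu(d\theta),
\]

for some mixing measure \(\mu\) on \([0,1]\). Aldous-Hoover: an infinite jointly exchangeable array satisfies

\[
  (X_{ij})_{i,j \ge 1} \;\overset{d}{=}\; \big( f(U,\, U_i,\, U_j,\, U_{ij}) \big)_{i,j \ge 1},
\]

for some measurable \(f\) and i.i.d. Uniform\([0,1]\) variables \(U\), \((U_i)\), \((U_{ij} = U_{ji})\). For dense random graphs, this representation amounts to sampling edges independently with probabilities \(W(U_i, U_j)\) for a graphon \(W\), which is the link to the estimation results described in the abstract.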


Past Amick Lecturers include: Andrew Majda, Joseph Keller, John Ball, Martin Kruskal, Paul Roberts, David Ruelle, John Guckenheimer, Percy Deift, Keith Moffatt, Ingrid Daubechies, Yann Brenier, Felix Otto, Claude Bardos, George Papanicolaou, Michael Brenner, Ronald Coifman, Pierre-Louis Lions, Claude Le Bris, L. Mahadevan, Jeffrey Rauch, Peter Constantin, Emmanuel Candès, and Amit Singer.