RTG Summer Lectures on Data Science

1:30–5:30 pm
Stevanovich Center, MS 112
5727 S. University Avenue
Chicago, Illinois 60637

Lectures 1 & 2: Latent factor models
Speaker: David Bindel, Cornell University
Monday, June 17, 2019, 1:30–2:30 PM and 3:00–4:00 PM

Approximate low-rank factorizations pervade matrix data analysis and are often interpreted in terms of latent factor models. After discussing the ubiquitous singular value decomposition (a.k.a. PCA), we turn to factorizations such as the interpolative decomposition and the CUR factorization, which offer advantages in interpretability and ease of computation. We then discuss constrained approximate factorizations, particularly non-negative matrix factorizations and topic models, which are often especially useful for decomposing data into sparse parts. Unfortunately, these decompositions may be very expensive to compute, at least in principle. But in many practical applications one can make a separability assumption that allows for relatively inexpensive algorithms. In particular, we show how the separability assumption enables efficient linear-algebra-based algorithms for topic modeling, and how linear algebraic preprocessing can be used to “clean up” the data and improve the quality of the resulting topics.
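
As a small illustration of the unconstrained-versus-constrained contrast above, here is a minimal Python sketch (a toy example using NumPy and scikit-learn; the synthetic data, rank, and library choices are assumptions, not code from the lecture) that factors a synthetic document-term matrix first with a truncated SVD and then with a non-negative matrix factorization, whose factors are typically easier to read as topics:

    # Toy comparison of an unconstrained low-rank factorization (truncated SVD)
    # with a constrained one (NMF) on a synthetic document-term matrix.
    # Illustrative sketch only; not the speaker's code.
    import numpy as np
    from sklearn.decomposition import TruncatedSVD, NMF

    rng = np.random.default_rng(0)

    # Synthetic corpus: 100 "documents" mixing 3 latent "topics" over 50 "terms".
    topics = rng.dirichlet(np.full(50, 0.1), size=3)      # 3 x 50 topic-term rows
    weights = rng.dirichlet(np.full(3, 0.5), size=100)    # 100 x 3 mixing weights
    X = weights @ topics + 0.01 * rng.random((100, 50))   # noisy low-rank, non-negative

    # Unconstrained low-rank factorization (SVD / PCA view).
    svd = TruncatedSVD(n_components=3, random_state=0).fit(X)
    print("SVD captured variance:", svd.explained_variance_ratio_.sum())

    # Non-negative factorization: X ~ W H with W, H >= 0.
    nmf = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
    W = nmf.fit_transform(X)        # document-topic weights
    H = nmf.components_             # topic-term loadings, readable as "topics"
    print("NMF reconstruction error:", nmf.reconstruction_err_)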

Seminar I: Universal sparsity of deep ReLU networks
Speaker: Dennis Elbrächter, University of Vienna
Monday, June 17, 2019, 4:30–5:30 PM

We consider the approximation capabilities of deep neural networks. Specifically, we introduce (or assimilate) a number of key concepts that allow us to compare neural networks to classical representation systems (e.g. wavelets, shearlets, and Gabor systems, or more generally any system generated from a mother function through translation, dilation, and modulation). This enables us to establish that any function class is (asymptotically) at least as sparse with respect to (ReLU) neural networks as it is in any 'reasonable' classical representation system.
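
For orientation, one standard way (from nonlinear approximation theory; not spelled out in the abstract) to make "sparsity with respect to a representation system" precise is via the best M-term approximation error, with the network analogue counting nonzero weights:

    \Gamma_M(f, \mathcal{D}) = \inf_{I \subset \mathbb{N},\ |I| \le M,\ (c_i)_{i \in I}} \Big\| f - \sum_{i \in I} c_i \varphi_i \Big\|_{L^2},
    \qquad
    \Gamma_M^{\mathrm{NN}}(f) = \inf_{\Phi \,:\, \#\mathrm{weights}(\Phi) \le M} \| f - \Phi \|_{L^2}.

"At least as sparse" then roughly means that, up to constants and polynomial factors in M, \Gamma_M^{\mathrm{NN}}(f) decays at least as fast over the function class as \Gamma_M(f, \mathcal{D}) does for the classical system \mathcal{D}.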
