Past Events

2025

John Reinitz - Memorial Service

3:00–4:30 pm Bond Chapel

A memorial service to honor Dr. John Reinitz will be held in Bond Chapel, followed by a reception at the Quad Club.

Apr 9

Student Seminar: Yu Gui

11:30 am–1:30 pm Jones 304

Tuesday, April 8, 2025, at 1:00 PM, in Jones 304, 5747 S. Ellis Avenue
PhD Dissertation Defense Presentation
Yu Gui, Department of Statistics, The University of Chicago
“Statistical Learning and Inference in Weakly Specified Settings: Shifted Distributions and Unlabeled Data”

Apr 8

Student Seminar: Rohan Hore

1:00–3:00 pm Searle 236

Monday, April 7, 2025, at 1:00 PM, in Searle 236, 5735 S. Ellis Avenue
PhD Dissertation Defense Presentation
Rohan Hore, Department of Statistics, The University of Chicago
“Assumption-lean approaches to modern statistical inference”

Apr 7

Statistics Colloquium: David Dunson

11:30 am–12:30 pm Jones 303

David Dunson, Statistical Science and Department of Mathematics, Duke University
Title: Deep latent class regression

Abstract: High-dimensional categorical data arise in diverse scientific domains and are often accompanied by covariates. Latent class regression models are routinely used in such settings, reducing dimensionality by assuming conditional independence of the categorical variables given a single latent class that depends on covariates through a logistic regression model. However, such methods become unreliable as the dimensionality increases. To address this, we propose a flexible family of deep latent class models. Our model satisfies key theoretical properties, including identifiability and posterior consistency, and we establish a Bayes oracle clustering property that ensures robustness against the curse of dimensionality. We develop efficient posterior computation methods, validate them through simulation studies, and apply our model to joint species distribution modeling in ecology. The theory and methods can be easily extended beyond categorical observed data.

Joint work with Yuren Zhou & Yuqi Gu

Apr 7

Student Seminar: Subhodh Kotekal

3:30–5:30 pm Jones 304

Friday, April 4, 2025, at 3:30 PM, in Jones 304, 5747 S. Ellis Avenue
PhD Dissertation Defense Presentation
Subhodh Kotekal, Department of Statistics, The University of Chicago
“Minimax hypothesis testing in large-scale inference”

Apr 4

Student Seminar: Qian Liu

2:30–3:00 pm Jones 226

Friday, April 4, 2025, at 2:30 PM, in Jones 226, 5747 S. Ellis Avenue
Master’s Thesis Presentation
Qian Liu, Department of Statistics, The University of Chicago
“Fitting Mixed-Effects Location Scale Models with Ordinal Outcomes”

Apr 4

Statistics Colloquium: Qingyuan Zhao

11:30 am–12:30 pm Jones 303

Qingyuan Zhao, Statistical Laboratory and Department of Pure Mathematics and Mathematical Statistics, University of Cambridge

Title: The symbiosis of statistical genetics and causal inference

Abstract: Statistical genetics and causal inference share an intertwined beginning in the first half of the 20th century. Many fundamental concepts in causal inference, such as Fisher’s randomization principle for experimental design and Wright’s path analysis, originated from genetic problems, but the two fields grew apart later. In this talk, I will argue that the two fields can (and perhaps should) still learn from each other by highlighting two examples from my own research. The first example concerns Mendelian randomization, a popular method in genetic epidemiology to estimate the causal effect of a heritable risk factor. I will use the modern theory of causal graphical models and randomization inference to give a precise understanding of the assumptions and sources of bias in Mendelian randomization. My second example concerns heritability, an important concept in the broad “nature versus nurture” debate. I will review some existing notions of heritability and propose a counterfactual definition that does not rely on any parametric model for the phenotypic trait. A key feature of counterfactual heritability is that it is generally not exactly identified but can be bounded from above and below using empirical data, reflecting our intrinsic uncertainty about the level of interaction between gene and (unmeasured) environment. I will also discuss how this motivates a measure of counterfactual explainability for black-box models and how this connects to the functional analysis of variance.

Mar 31

Statistics Colloquium: Victor Veitch

11:30 am–12:30 pm Jones 303

Victor Veitch, Department of Statistics and the Data Science Institute, University of Chicago

Title: Statistical Views on LLM Post-Training

Abstract: Typically, large language models are refined with a post-training procedure aimed at biasing their outputs to have desirable properties such as helpfulness, harmlessness, factualness, and so forth. These desiderata are often elicited by pairwise comparisons of LLM responses. This comparative reward signal creates some subtleties in how post-training should be conducted. I’ll discuss some ways of formalizing the goal of post-training and methods for achieving these goals.

Mar 24

Statistics Colloquium: Sara Algeri

11:30 am–12:30 pm Jones 303

Sara Algeri, School of Statistics, University of Minnesota
“When Pearson’s Chi-square and other divisible statistics are not goodness-of-fit tests”

Mar 3

Joint Colloquium with DSI: Jeremy Bernstein

11:30 am–12:30 pm Jones 303

Jeremy Bernstein, Computer Science and Artificial Intelligence Laboratory, MIT
“Metrized deep learning: Fast & scalable optimization via programmatic theory”

Feb 24