
Statistics Colloquium: Yiqiao Zhong
11:30 am–12:30 pm Jones 303
Yiqiao Zhong
Department of Statistics
University of Wisconsin-Madison
Title: Compositionality in Large Language Models: Emergence, Generalization, and Geometry
Abstract: Large language models (LLMs) have demonstrated remarkable reasoning abilities through novel techniques such as in-context learning and chain-of-thought (CoT) reasoning. Empirically, key reasoning skills often emerge only at larger scales or after prolonged training. Yet the underlying mechanism of LLM reasoning, namely how compositional representations are formed and organized, remains poorly understood.
In this talk, I present recent progress toward uncovering emergent compositional structure through controlled synthetic experiments on small transformers and targeted intervention studies on modern LLMs. First, I show that learning a key compositional structure is essential for out-of-distribution generalization, and that this learning process undergoes sharp phase transitions during training. At a critical stage, an intermediate low-dimensional “bridge subspace” emerges, serving as a shared representation connecting multiple layers. Second, using arithmetic composition as a minimal testbed for CoT reasoning, I demonstrate that autoregressive training on reasoning traces exhibits distinct reasoning phases. In particular, causally faithful reasoning emerges only when training noise lies below a critical threshold.
Together, these findings suggest that core statistical principles such as low-dimensional subspaces and causality may provide key foundations for advancing the interpretability and transparency of LLMs.

Statistics Colloquium: Stefan Wager
11:30 am–12:30 pm Jones 303
Stefan Wager
Department of Statistics
Stanford University
Title: TBA
Abstract: TBA

Bahadur Memorial Lectures: Nancy Reid (Day 1)
11:30 am–12:30 pm Jones 303
Title: “Lies, Damned Lies, and Statistics”
Abstract: This is the title I used the first time I taught the U of T First-Year Seminar course, many years ago. I was nervous about the prospect of giving a seminar-style course for students fresh from high school, and unsure how to distinguish it from a run-of-the-mill introductory statistics course. As it turned out, however, the experience had a big impact on my teaching, research, and views on statistical science. Although much has changed in our field in the years since, the basic principles of reasoning with uncertainty have not. In this talk I will reflect on my experiences in trying to convey the ongoing importance of statistical science and perhaps hazard a guess about the future.

Bahadur Memorial Lectures: Nancy Reid (Day 2)
3:00–4:00 pm DSI 105
Title: “All Models are Wrong”
Abstract: This talk will consider the assessment of semiparametric and other highly parametrized models from the perspective of foundational principles of parametric statistical inference. It is cast as a generalised version of the Fisherian sufficiency/co-sufficiency separation, replacing out-of-sample prediction error by a type of within-sample prediction error. The theory is illustrated through several examples, including a post-reduction inference approach to confidence sets for sparse regression models. This is joint work with Heather Battey.

Statistics Colloquium: Aravindan Vijayaraghavan
11:30 am–12:30 pm Jones 303
Aravindan Vijayaraghavan
Department of Computer Science
Northwestern University
Title: TBA
Abstract: TBA

Statistics Colloquium: David Blei
11:30 am–12:30 pm Jones 303
David Blei
Departments of Statistics and Computer Science
Columbia University
Title: TBA
Abstract: TBA