Hybrid

Zheng (Tracy) Ke, Associate Professor of Statistics, Harvard University Department of Statistics

Mon Feb 16, 2026, 4:00 p.m. – 5:00 p.m.
Kline Tower, 13th Floor, Rm. 1327
219 Prospect Street, New Haven, CT 06511

Title: Poisson-process Topic Model for Integrating Knowledge from Pre-trained Language Models

Abstract: Topic modeling is traditionally applied to word counts without accounting for the context in which words appear. Recent advancements in large language models (LLMs) offer contextualized word embeddings, which capture deeper meaning and relationships between words. We aim to leverage such embeddings to improve topic modeling. 

We use a pre-trained LLM to convert each document into a sequence of word embeddings. This sequence is then modeled as a Poisson point process, with its intensity measure expressed as a convex combination of K base measures, each corresponding to a topic. To estimate these topics, we propose a flexible algorithm that integrates traditional topic modeling methods, enhanced by net-rounding applied before and kernel smoothing applied after. One advantage of this framework is that it treats the LLM as a black box, requiring no fine-tuning of its parameters. Another is that any traditional topic modeling approach can be integrated as a plug-in module without modification.
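As a rough illustration of the pipeline sketched above (a minimal sketch only: random vectors stand in for the LLM embeddings, k-means centers stand in for the net, and scikit-learn's LDA serves as the plug-in topic model; all names and parameter choices here are illustrative assumptions, not the speaker's implementation):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)

# Stand-in for the LLM step: each "document" becomes a cloud of
# embedding vectors in R^d (random here; a real pipeline would use
# contextualized embeddings from a pre-trained model).
docs = [rng.normal(size=(50, 8)) for _ in range(20)]

# Net-rounding: snap every embedding to a finite net of points.
# K-means centers are one plausible (illustrative) choice of net.
M = 30  # net size, chosen arbitrarily for the sketch
net = KMeans(n_clusters=M, n_init=10, random_state=0).fit(np.vstack(docs))

# Rounded counts: a document-by-net-point matrix, the analogue of the
# word-count matrix in classical topic modeling.
counts = np.zeros((len(docs), M), dtype=int)
for i, d in enumerate(docs):
    for j in net.predict(d):
        counts[i, j] += 1

# Plug-in module: any traditional topic model on the counts; LDA here.
K = 3
lda = LatentDirichletAllocation(n_components=K, random_state=0).fit(counts)
topics_on_net = lda.components_ / lda.components_.sum(axis=1, keepdims=True)

def topic_intensity(x, k, h=1.0):
    """Kernel smoothing: carry topic k's discrete weights on the net
    back to a smooth intensity on the embedding space."""
    d2 = ((net.cluster_centers_ - x) ** 2).sum(axis=1)
    return float(topics_on_net[k] @ np.exp(-d2 / (2 * h**2)))

print(topics_on_net.shape)  # (3, 30)
```

Each row of `topics_on_net` is a discrete topic distribution over the net, and `topic_intensity` smooths it into a continuous intensity on the embedding space.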

Assuming each topic is a β-Hölder smooth intensity measure on the embedded space, we establish the rate of convergence of our method. We also provide a minimax lower bound and show that the rate of our method matches the lower bound when β ≤ 1. Additionally, we apply our method to two real datasets, providing evidence that it offers an advantage over traditional topic modeling approaches.

3:30 p.m. - Pre-talk meet-and-greet teatime, 219 Prospect Street, 13th floor; light snacks and beverages will be available in the kitchen area. For more details and upcoming events, visit our website at https://statistics.yale.edu/calendar.