Jaume de Dios, UCLA
1:30 PM - 2:30 PM
A central problem in generative machine learning is that of sampling from probability distributions: given the density of a random variable (for example, as a black-box function that one can query), generate samples from a random variable whose distribution is “similar enough” to the given one.
Significant effort has been devoted to designing ever more efficient algorithms, ranging from relatively simple ones, such as rejection sampling, to increasingly sophisticated ones, such as Langevin-based or diffusion-based models. In this talk, we will focus on the converse question: finding universal complexity lower bounds that no algorithm can beat.
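As a point of reference for the simplest algorithm mentioned above, here is a minimal sketch of rejection sampling (not from the talk; the function names and the Gaussian example target are illustrative assumptions): the density is queried only as a black box, and a proposal is accepted with probability proportional to the ratio of target to envelope.

```python
import math
import random

def rejection_sample(log_density, proposal, log_envelope, n):
    """Draw n samples from the density proportional to exp(log_density),
    accepting a proposed x with probability exp(log_density(x) - log_envelope(x)).
    Requires log_envelope(x) >= log_density(x) for all x the proposal can emit."""
    samples = []
    while len(samples) < n:
        x = proposal()
        if math.log(random.random()) < log_density(x) - log_envelope(x):
            samples.append(x)
    return samples

# Illustrative target: a standard Gaussian, whose log density -x^2/2 is
# strictly concave and smooth (the setting of the talk).
random.seed(0)
log_f = lambda x: -0.5 * x * x                 # black-box log density, up to a constant
proposal = lambda: random.uniform(-6.0, 6.0)   # simple uniform proposal
log_M = lambda x: 0.0                          # envelope: exp(0) >= exp(-x^2/2)
xs = rejection_sample(log_f, proposal, log_M, 2000)
```

The acceptance rate of such a scheme degrades rapidly with dimension, which is one reason lower bounds of the kind discussed in the talk are informative.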
We will do so in the case when the log density is a strictly concave smooth function. In this case, we will be able to construct tight bounds in low dimensions using a modification of Perron’s sprouting construction for Kakeya sets.
Based on joint work with Sinho Chewi, Jerry Li, Chen Lu, and Shyam Narayanan.
Event contact: iosevich at gmail dot com