On Friday, Sep 25, we will welcome Mr. Tin Nguyen to give a seminar on “Independent finite approximations for Bayesian nonparametric inference: construction, error bounds, and practical implications”. Tin Nguyen (Nguyễn Danh Tín) is a PhD student at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL), working with Tamara Broderick on computational statistics. His research website is http://www.mit.edu/~tdn/.
Time: 10:00 am – 11:30 am (GMT+7), Friday, Sep 25, 2020
This is an online seminar and will be streamed on our Facebook fanpage, VinAI Research.
Bayesian nonparametrics based on completely random measures (CRMs) offers a flexible modeling approach when the number of clusters or latent components in a dataset is unknown. However, managing the infinite dimensionality of CRMs often leads to slow computation. Practical inference typically relies on either integrating out the infinite-dimensional parameter or using a finite approximation: a truncated finite approximation (TFA)
or an independent finite approximation (IFA). The atom weights of TFAs are constructed sequentially, while the atoms of IFAs are independent, which (1) makes them well-suited for parallel and distributed computation and (2) facilitates more convenient inference schemes. While IFAs have been developed in certain special cases in the past, there has not yet been a general template for their construction or a systematic comparison to TFAs. We show how to construct IFAs for approximating distributions in a large family of CRMs, encompassing all those typically used in practice. We quantify the approximation error between IFAs and the target nonparametric prior, and prove that, in the worst case, TFAs provide more component-efficient approximations than IFAs. However, in experiments on image denoising and topic modeling tasks with real data, we find that the error of Bayesian approximation methods overwhelms any finite approximation error, and IFAs perform very similarly to TFAs.
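To make the distinction concrete, here is a minimal sketch (not the speaker's code) of the two approximation styles for one well-known CRM, the beta process with mass parameter alpha. The IFA side follows the classical finite model with independent Beta(alpha/K, 1) weights; the TFA side follows the truncated stick-breaking construction, where each weight is built sequentially from the previous ones. The parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative settings (assumptions, not from the talk).
alpha, K = 2.0, 50  # mass parameter and number of components

# IFA: K atom weights drawn independently as Beta(alpha/K, 1).
# Independence across atoms is what allows parallel/distributed sampling.
ifa_weights = rng.beta(alpha / K, 1.0, size=K)

# TFA: truncated stick-breaking construction for the beta process:
# pi_k = nu_1 * nu_2 * ... * nu_k with nu_i ~ Beta(alpha, 1),
# so each weight depends sequentially on all earlier sticks.
nu = rng.beta(alpha, 1.0, size=K)
tfa_weights = np.cumprod(nu)

# Both are K-dimensional approximations of the same infinite-dimensional prior;
# the TFA weights decay monotonically, while the IFA weights are exchangeable.
print(ifa_weights.shape, tfa_weights.shape)
```

Note how the TFA weights form a decreasing sequence by construction, whereas the IFA weights carry no ordering; this is the structural difference behind the parallelism and inference-convenience points in the abstract.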