Variational Inference for Neyman-Scott Processes (2023)

by Chengkuan Hong and Christian R. Shelton


Abstract: Neyman-Scott processes (NSPs) have been applied across a range of fields to model points or temporal events with a hierarchy of clusters. Markov chain Monte Carlo (MCMC) is typically used for posterior sampling in the model. However, MCMC's long mixing time can make inference slow, which in turn slows model learning and prediction. We develop the first variational inference (VI) algorithm for NSPs, and give two examples of suitable variational posterior point process distributions. Our method minimizes the inclusive Kullback-Leibler (KL) divergence for VI to obtain the variational parameters. We generate samples from the approximate posterior point processes much faster than MCMC, as we can estimate them directly without any MCMC steps or gradient descent. We include synthetic and real-world data experiments that demonstrate our VI algorithm achieves better prediction performance than MCMC when computational time is limited.
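To make the generative model concrete, the following is a minimal sketch of sampling one realization of a temporal Neyman-Scott process: latent "parent" points are drawn from a homogeneous Poisson process, and each parent spawns a Poisson-distributed number of observed "child" events around it. The function name, parameter names, and the Gaussian offspring kernel are illustrative assumptions, not the paper's specific parameterization.

```python
import numpy as np

def sample_nsp(rate_parents=2.0, mean_children=5.0, child_sd=0.1,
               T=10.0, rng=None):
    """Draw one realization of a temporal Neyman-Scott process on [0, T].

    Parents form a homogeneous Poisson process with intensity
    `rate_parents`; each parent independently spawns a Poisson(
    `mean_children`) number of offspring, displaced from the parent by
    Gaussian noise with std. dev. `child_sd` (an illustrative kernel
    choice). Returns the latent parents and the observed events,
    both sorted and restricted to the window [0, T].
    """
    rng = np.random.default_rng(rng)
    # Number of parents on [0, T] is Poisson with mean rate_parents * T.
    n_parents = rng.poisson(rate_parents * T)
    parents = rng.uniform(0.0, T, size=n_parents)
    events = []
    for p in parents:
        n_kids = rng.poisson(mean_children)
        kids = p + rng.normal(0.0, child_sd, size=n_kids)
        # Keep only offspring that fall inside the observation window.
        events.extend(kids[(kids >= 0.0) & (kids <= T)])
    return np.sort(parents), np.sort(np.asarray(events))

parents, events = sample_nsp(rng=0)
```

Posterior inference in this model means recovering the latent parents given only the observed events; the paper's contribution is approximating that posterior with a variational point process fit by minimizing the inclusive KL divergence, rather than sampling it with MCMC.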

Download Information

Chengkuan Hong and Christian R. Shelton (2023). "Variational Inference for Neyman-Scott Processes." International Conference on Artificial Intelligence and Statistics. pdf

Bibtex citation

@inproceedings{HonShe23,
   author = "Chengkuan Hong and Christian R. Shelton",
   title = "Variational Inference for {N}eyman-{S}cott Processes",
   year = 2023,
   booktitle = "International Conference on Artificial Intelligence and Statistics",
   booktitleabbr = "AISTATS",
}