Over the last month I have been making some interesting connections between different areas of statistics. Explaining these connections (or even the things being connected) is perhaps an overly ambitious aim for a blog post, but I think it is worth a try!

The three problems below are all interconnected.

__MCMC for doubly intractable problems__

This typically occurs when you want to use MCMC to estimate parameters of a model where the normalization term is intractable *and* it depends on the parameters you are interested in estimating.

With the basic MCMC approach, the normalization term needs to be evaluated on every iteration of the algorithm. Recent advances in MCMC (the Pseudo-Marginal approach) have led to algorithms where a suitable approximation to the normalization term can be used while the algorithm still gives the same results (asymptotically) as the basic algorithm.
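To make the idea concrete, here is a minimal sketch (plain Python, all names illustrative, not any particular paper's algorithm) of pseudo-marginal Metropolis-Hastings. The model is a toy Gaussian whose likelihood is actually tractable; we deliberately replace it with a noisy but unbiased estimator to stand in for an intractable one. The crucial detail is that the likelihood estimate for the current state is stored and recycled across iterations rather than recomputed:

```python
import math
import random

# Toy data: observations from N(theta, 1); prior theta ~ N(0, 10^2).
random.seed(1)
data = [random.gauss(2.0, 1.0) for _ in range(50)]

def log_like(theta):
    # Exact log-likelihood (tractable here; stands in for an intractable one).
    return sum(-0.5 * (y - theta) ** 2 - 0.5 * math.log(2 * math.pi)
               for y in data)

def noisy_log_like(theta, s=0.3):
    # Unbiased estimator of the *likelihood*: multiply by lognormal noise
    # with mean one, i.e. add N(-s^2/2, s^2) on the log scale.
    return log_like(theta) + random.gauss(-0.5 * s * s, s)

def log_prior(theta):
    return -0.5 * (theta / 10.0) ** 2  # N(0, 100), up to a constant

def pseudo_marginal_mh(n_iter=5000, step=0.3):
    theta = 0.0
    log_l_hat = noisy_log_like(theta)   # current estimate is *recycled*
    samples = []
    for _ in range(n_iter):
        prop = theta + random.gauss(0.0, step)
        log_l_prop = noisy_log_like(prop)  # fresh estimate for the proposal
        log_alpha = (log_l_prop + log_prior(prop)) - (log_l_hat + log_prior(theta))
        if math.log(random.random()) < log_alpha:
            theta, log_l_hat = prop, log_l_prop
        samples.append(theta)
    return samples

samples = pseudo_marginal_mh()
post_mean = sum(samples[1000:]) / len(samples[1000:])
print(round(post_mean, 2))
```

Because the estimator's noise has mean one, the chain still targets the exact posterior; a noisier estimator would not bias the result, but it would make the chain stick and mix more slowly.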

__Reversible Jump MCMC__

This occurs when one of the things you don't know is the number of parameters in your model (e.g. the number of clusters or components in a mixture model).

With the basic MCMC approach it is difficult to make good proposals for the parameter values when your proposal changes the number of parameters in the model. However, the Pseudo-Marginal approach can also be applied to Reversible Jump MCMC. This yields parameter proposals that are more likely to be accepted, which makes the method much more computationally efficient because less time is spent generating proposals that are subsequently rejected.
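For reference, the acceptance probability for a move from model $m$ (parameters $\theta$) to model $m'$ (parameters $\theta'$), with auxiliary variables $u, u'$ used to match dimensions, proposal densities $g, g'$ for them, and model-jump probabilities $j$, has the following form (the notation here is illustrative and varies across the literature):

$$
\alpha = \min\!\left(1,\;
\frac{p(y \mid \theta', m')\, p(\theta' \mid m')\, p(m')\, j(m \mid m')\, g'(u')}
     {p(y \mid \theta, m)\, p(\theta \mid m)\, p(m)\, j(m' \mid m)\, g(u)}
\left|\frac{\partial(\theta', u')}{\partial(\theta, u)}\right|\right)
$$

Every factor in this ratio has to be well matched for a cross-dimensional move to stand a reasonable chance of acceptance, which is why naive proposals are so often rejected.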

__MCMC parameter estimation for nonlinear stochastic processes__

This is useful for some models in econometrics, epidemiology, and chemical kinetics. Suppose you have some data (e.g. stock prices, infection numbers, chemical concentrations) and a nonlinear stochastic model for how these quantities evolve over time. You may be interested in inferring posterior distributions for the parameter values in your model.

The Pseudo-Marginal approach was used to develop a method called particle MCMC that does this parameter estimation more efficiently than basic MCMC.
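As an illustration (a hedged sketch, not the algorithm from any particular paper), here is particle marginal Metropolis-Hastings on a toy linear-Gaussian state-space model, chosen so the code stays short. A bootstrap particle filter provides an unbiased estimate of the likelihood $p(y \mid \phi)$, and that estimate is plugged straight into a Metropolis-Hastings chain over the parameter $\phi$:

```python
import math
import random

random.seed(2)

# Toy state-space model (illustrative choice):
#   x_t = phi * x_{t-1} + v_t,  v_t ~ N(0, 1)   (hidden state)
#   y_t = x_t + w_t,            w_t ~ N(0, 1)   (observation)
TRUE_PHI, T = 0.8, 60
x, ys = 0.0, []
for _ in range(T):
    x = TRUE_PHI * x + random.gauss(0, 1)
    ys.append(x + random.gauss(0, 1))

def log_norm_pdf(y, mu):
    return -0.5 * (y - mu) ** 2 - 0.5 * math.log(2 * math.pi)

def pf_loglike(phi, n=150):
    # Bootstrap particle filter: an unbiased estimator of p(y_{1:T} | phi)
    # (unbiased on the natural scale, not the log scale).
    particles = [0.0] * n
    ll = 0.0
    for y in ys:
        particles = [phi * xp + random.gauss(0, 1) for xp in particles]  # propagate
        ws = [math.exp(log_norm_pdf(y, xp)) for xp in particles]         # weight
        ll += math.log(sum(ws) / n)
        particles = random.choices(particles, weights=ws, k=n)           # resample
    return ll

def pmmh(n_iter=300, step=0.1):
    # Particle marginal Metropolis-Hastings with a flat prior on (-1, 1).
    phi, ll = 0.5, pf_loglike(0.5)
    chain = []
    for _ in range(n_iter):
        prop = phi + random.gauss(0, step)
        if -1 < prop < 1:
            ll_prop = pf_loglike(prop)       # fresh likelihood *estimate*
            if math.log(random.random()) < ll_prop - ll:
                phi, ll = prop, ll_prop      # current estimate recycled
        chain.append(phi)
    return chain

chain = pmmh()
est = sum(chain[100:]) / len(chain[100:])
print(round(est, 2))
```

In this linear-Gaussian toy the exact likelihood is actually available via the Kalman filter, which would give the "basic" algorithm; the point of the sketch is that the noisy particle-filter estimate is enough, and it remains usable when the dynamics are nonlinear and no exact filter exists.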

In all three cases (doubly intractable, reversible jump, nonlinear stochastic processes), applying basic MCMC results in the need to evaluate the density of an intractable marginal probability distribution. Somewhat miraculously, the Pseudo-Marginal approach obviates the need to evaluate this probability density exactly, while still preserving key theoretical properties of the algorithm related to convergence and accurate estimation.
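A one-paragraph sketch of why this works (the standard extended-space argument): suppose $W \ge 0$ is an unbiased estimator of the intractable likelihood, $\mathbb{E}[W \mid \theta] = p(y \mid \theta)$, with density $q_\theta(w)$. Pseudo-marginal MCMC is ordinary Metropolis-Hastings on the extended space $(\theta, w)$ targeting

$$\tilde\pi(\theta, w) \propto p(\theta)\, w\, q_\theta(w),$$

whose $\theta$-marginal is

$$\int \tilde\pi(\theta, w)\, dw \propto p(\theta) \int w\, q_\theta(w)\, dw = p(\theta)\, p(y \mid \theta) \propto p(\theta \mid y).$$

So the chain targets the exact posterior even though only estimates of the likelihood are ever evaluated.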