
Efficient use of exact samples


Abstract

Propp and Wilson (Random Structures and Algorithms (1996) 9: 223–252, Journal of Algorithms (1998) 27: 170–217) described a protocol called coupling from the past (CFTP) for exact sampling from the steady-state distribution of a Markov chain Monte Carlo (MCMC) process. In it a past time is identified from which the paths of coupled Markov chains starting at every possible state would have coalesced into a single value by the present time; this value is then a sample from the steady-state distribution.
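
To make the CFTP protocol concrete, here is a minimal sketch in Python for a small finite state space, using a brute-force coupling that tracks every starting state (the function names and the random-walk example are illustrative only, not taken from the paper). The key points are that the random numbers for each past time are stored and reused as the starting time is pushed further back, and that the common value at time 0 is returned only once every path has coalesced.

```python
import random

def cftp(states, update, seed=0):
    """Coupling from the past for a small finite-state chain (illustrative sketch).

    `update(x, u)` advances state x one step using the common random number u.
    The value returned is an exact draw from the stationary distribution,
    provided the coupled paths coalesce with probability one.
    """
    rng = random.Random(seed)
    randomness = []            # shared U(0,1) draws; randomness[k-1] is used at time -k
    T = 1
    while True:
        # Extend the stored randomness back to time -T (reuse old draws, never resample).
        while len(randomness) < T:
            randomness.append(rng.random())
        # Run every possible starting state from time -T up to time 0,
        # applying the SAME random number to all paths at each time step.
        current = {x: x for x in states}
        for t in range(T, 0, -1):
            u = randomness[t - 1]
            current = {x0: update(x, u) for x0, x in current.items()}
        if len(set(current.values())) == 1:   # all paths coalesced by time 0
            return next(iter(current.values()))
        T *= 2                                # otherwise start further in the past

# Example: a reflecting random walk on {0, ..., 4}.
def walk(x, u):
    return min(x + 1, 4) if u < 0.5 else max(x - 1, 0)

exact_draw = cftp(range(5), walk)   # an exact draw from the walk's stationary distribution
```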

Unfortunately, producing an exact sample typically requires a large computational effort. We consider the question of how to make efficient use of the sample values that are generated. In particular, we make use of regeneration events (cf. Mykland et al. Journal of the American Statistical Association (1995) 90: 233–241) to aid in the analysis of MCMC runs. In a regeneration event, the chain is in a fixed reference distribution; this allows the chain to be broken up into a series of tours which are independent, or nearly so (though they do not represent draws from the true stationary distribution).
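
As a hypothetical illustration of the regeneration idea (the function below is only a sketch, not taken from the paper, and assumes one has already recorded which time points are regeneration times), the chain's output can be cut at those times into tours, and a stationary mean can be estimated by the ratio of the total of the tour sums to the total of the tour lengths.

```python
def tour_estimate(values, is_regeneration):
    """Ratio estimate of a stationary mean from regeneration tours (sketch).

    `values[t]` holds g(X_t); `is_regeneration[t]` is True when time t starts a
    new tour.  Tours are independent (or nearly so), so the estimate is the sum
    of the tour sums divided by the sum of the tour lengths.  In practice the
    partial segments before the first and after the last regeneration are often
    discarded; this sketch keeps them for simplicity.
    """
    tour_sums, tour_lengths = [], []
    current_sum, current_len = 0.0, 0
    for g, regen in zip(values, is_regeneration):
        if regen and current_len > 0:          # close the previous tour
            tour_sums.append(current_sum)
            tour_lengths.append(current_len)
            current_sum, current_len = 0.0, 0
        current_sum += g
        current_len += 1
    if current_len > 0:                        # final (possibly incomplete) tour
        tour_sums.append(current_sum)
        tour_lengths.append(current_len)
    return sum(tour_sums) / sum(tour_lengths)
```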

In this paper we consider using CFTP and related algorithms to create tours. In some cases the elements of these tours are exactly in the stationary distribution; the tour lengths may be fixed or random. This allows us to combine the precision of exact sampling with the efficiency of using entire tours.
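
One simple way to combine the two ingredients, shown below as a hedged sketch rather than any of the estimators actually proposed in the paper, is to start each fixed-length tour from an independent exact draw (e.g. a fresh CFTP run) and then average g along the tours: because every tour starts exactly in the stationary distribution, each visited state is marginally stationary, and tours started from independent draws are independent. The names `cftp_draw`, `update_random`, and `g` are placeholders.

```python
def exact_tour_mean(g, update_random, cftp_draw, n_tours=100, tour_len=50):
    """Estimate E_pi[g] from fixed-length tours started at exact samples (sketch).

    Each tour begins at an independent exact draw from the stationary
    distribution, so every state visited during the tour is marginally
    stationary and the averaged estimate is unbiased.
    """
    total = 0.0
    for _ in range(n_tours):
        x = cftp_draw()                 # exact draw, e.g. from a fresh CFTP run
        for _ in range(tour_len):
            total += g(x)
            x = update_random(x)        # one forward step of the same chain
    return total / (n_tours * tour_len)
```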

Several algorithms and estimators are proposed and analysed.


References

  • Asmussen S. 1987. Applied Probability and Queues. New York, John Wiley & Sons.

  • Fill J.A. 1998. An interruptible algorithm for perfect sampling via Markov chains. Annals of Applied Probability 8: 131–162.

  • Fill J.A., Machida M., Murdoch D.J., and Rosenthal J.S. 2000. Extension of Fill's perfect rejection sampling algorithm to general chains. Random Structures and Algorithms 16, to appear.

  • Gelfand A.E., Hills S.E., Racine-Poon A., and Smith A.F.M. 1990. Illustration of Bayesian inference in normal data models using Gibbs sampling. Journal of the American Statistical Association 85: 972–985.

  • Gelfand A.E. and Smith A.F.M. 1990. Sampling-based approaches to calculating marginal densities. Journal of the American Statistical Association 85: 398–409.

  • Gilks W.R. 1996. Full conditional distributions. In: Gilks W.R., Richardson S., and Spiegelhalter D.J. (Eds.), Markov Chain Monte Carlo in Practice. Chapman and Hall, pp. 75–88.

  • Green P.J. and Murdoch D.J. 1999. Exact sampling for Bayesian inference: towards general purpose algorithms (with discussion). In: Bernardo J.M., Berger J.O., Dawid A.P., and Smith A.F.M. (Eds.), Bayesian Statistics 6, Oxford University Press, pp. 301–321.

  • Häggström O., van Lieshout M.N.M., and Møller J. 1996. Characterisation results and Markov chain Monte Carlo algorithms including exact simulation for some spatial point processes. Technical Report R-96-2040, Department of Mathematics, Aalborg University. To appear in Bernoulli.

  • Kendall W.S. 1997. On some weighted Boolean models. In: Jeulin D. (Ed.), Advances in Theory and Applications of Random Sets. Singapore, World Scientific Publishing Company, pp. 105–120.

  • Kendall W.S. 1998. Perfect simulation for the area-interaction point process. In: Accardi L. and Heyde C.C. (Eds.), Probability Towards 2000. New York, Springer.

  • Meyn S.P. and Tweedie R.L. 1994. Computable bounds for convergence rates of Markov chains. Annals of Applied Probability 4: 981–1011.

  • Møller J. 1997. Perfect simulation of conditionally specified models. Technical Report R-97-2006, Department of Mathematics, Aalborg University. To appear in J. Roy. Stat. Soc., Ser. B.

  • Murdoch D.J. and Green P.J. 1997. Exact sampling from a continuous state space. Scandinavian Journal of Statistics 25: 483–502.

  • Mykland P., Tierney L., and Yu B. 1995. Regeneration in Markov chain samplers. Journal of the American Statistical Association 90: 233–241.

  • Propp J.G. and Wilson D.B. 1996. Exact sampling with coupled Markov chains and applications to statistical mechanics. Random Structures and Algorithms 9: 223–252.

  • Propp J.G. and Wilson D.B. 1998. How to get a perfectly random sample from a generic Markov chain and generate a random spanning tree of a directed graph. Journal of Algorithms 27: 170–217.

  • Rosenthal J.S. 1995. Minorization conditions and convergence rates for Markov chain Monte Carlo. Journal of the American Statistical Association 90: 558–566.

  • Smith A.F.M. and Roberts G.O. 1993. Bayesian computation via the Gibbs sampler and related Markov chain Monte Carlo methods (with discussion). Journal of the Royal Statistical Society, Series B 55: 3–24.

  • Tierney L. 1994. Markov chains for exploring posterior distributions (with discussion). Annals of Statistics 22: 1701–1762.


Cite this article

Murdoch, D.J., Rosenthal, J.S. Efficient use of exact samples. Statistics and Computing 10, 237–243 (2000). https://doi.org/10.1023/A:1008991527785
