
1 edition of On the computational complexity of MCMC-based estimators in large samples found in the catalog.

On the computational complexity of MCMC-based estimators in large samples

by Alexandre Belloni


Published by Massachusetts Institute of Technology, Dept. of Economics in Cambridge, MA.
Written in English


About the Edition

This paper studies the computational complexity of Bayesian and quasi-Bayesian estimation in large samples carried out using a basic Metropolis random walk. The framework covers cases where the underlying likelihood or extremum criterion function is possibly non-concave, discontinuous, and of increasing dimension. Using a central limit framework to provide structural restrictions for the problem, it is shown that the algorithm is computationally efficient. Specifically, it is shown that the running time of the algorithm in large samples is bounded in probability by a polynomial in the parameter dimension d, and in particular is of stochastic order d² in the leading cases after the burn-in period. The reason is that, in large samples, a central limit theorem implies that the posterior or quasi-posterior approaches a normal density, which restricts the deviations from continuity and concavity in a specific manner, so that the computational complexity is polynomial. An application to exponential and curved exponential families of increasing dimension is given. Keywords: Computational Complexity, Metropolis, Large Samples, Sampling, Integration, Exponential family, Moment restrictions. JEL Classifications: C1, C11, C15, C6, C63.
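
As a rough illustration of the sampler the abstract refers to, here is a minimal random-walk Metropolis sketch in Python. It is not the authors' code; the target density, proposal scale, chain length, and burn-in are illustrative assumptions.

import numpy as np

def log_quasi_posterior(theta):
    # Illustrative stand-in for a (possibly non-concave) log posterior or
    # quasi-posterior in dimension d; a standard normal is used for simplicity.
    return -0.5 * np.sum(theta ** 2)

def random_walk_metropolis(log_target, d, n_iter=20000, step=None, seed=0):
    """Basic Metropolis random walk in R^d with Gaussian proposals."""
    rng = np.random.default_rng(seed)
    step = step or 2.4 / np.sqrt(d)      # common rule-of-thumb proposal scale
    theta = np.zeros(d)                  # starting value
    log_p = log_target(theta)
    draws = np.empty((n_iter, d))
    accepted = 0
    for t in range(n_iter):
        proposal = theta + step * rng.standard_normal(d)
        log_p_new = log_target(proposal)
        # Accept with probability min(1, target(proposal) / target(current)).
        if np.log(rng.uniform()) < log_p_new - log_p:
            theta, log_p = proposal, log_p_new
            accepted += 1
        draws[t] = theta
    return draws, accepted / n_iter

# After discarding a burn-in period, the remaining draws are (approximately)
# samples from the target; the paper's result is that in large samples the
# post-burn-in cost grows only polynomially (order d² in the leading cases).
draws, acc_rate = random_walk_metropolis(log_quasi_posterior, d=10)
post_burn_in = draws[5000:]
print("acceptance rate:", acc_rate)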

Edition Notes

Statement: Alexandre Belloni [and] Victor Chernozhukov
Series: Working paper series / Massachusetts Institute of Technology, Dept. of Economics -- working paper 07-08; Working paper (Massachusetts Institute of Technology. Dept. of Economics) -- no. 07-08.
Contributions: Chernozhukov, Victor; Massachusetts Institute of Technology. Dept. of Economics

The Physical Object
Pagination: 37 p.
Number of Pages: 37

ID Numbers
Open Library: OL25480346M
OCLC/WorldCat: 122268578

Alexandre Belloni is a Professor of Decision Sciences at the Fuqua School of Business at Duke University. He received his Ph.D. in Operations Research from the Massachusetts Institute of Technology and a degree in Mathematical Economics from IMPA.

In high dimensions, the principal tool for carrying out the integrals is an MCMC sampler based on the Metropolis algorithm. The greater efficiency of MCMC stems from its ability, after an initial burn-in period, to generate samples in parameter space in direct proportion to the joint target probability distribution.
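
To make the sampling-in-proportion-to-the-target point concrete, here is a small self-contained sketch (the bimodal target, proposal scale, and burn-in length are assumptions, not taken from any cited work) that approximates integrals against the target by averaging post-burn-in draws:

import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Illustrative non-concave (bimodal) log density: mixture of two normals.
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

# Random-walk Metropolis in one dimension.
x, log_p = 0.0, log_target(0.0)
chain = []
for _ in range(50000):
    prop = x + rng.normal(scale=1.0)
    lp = log_target(prop)
    if np.log(rng.uniform()) < lp - log_p:
        x, log_p = prop, lp
    chain.append(x)

draws = np.array(chain[10000:])           # discard burn-in
# Because the retained draws occur in proportion to the target density,
# integrals against it reduce to sample averages:
print("E[x]     ~", draws.mean())         # near 0 by symmetry of the target
print("P(x > 0) ~", (draws > 0).mean())   # near 0.5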

You might also like
clinical examination of patients with organic cerebral disease

Exploring the park

Quantitative Literacy

Veto-S. 21, Senate Document Number 104-7, U.S. Senate, 104th Congress, 1st Session.

Betrayed trust

Long time dead

Student group supervision

Veer Vinod

prospect of flowers

Republic of Fritz Hansen.

A description of the Ordnance Survey Large scale plans..

High tech and the high seas

Law and politics in space

Interaction cross sections of elementary particles

Rovings in the Pacific, from 1837 to 1849

On the computational complexity of MCMC-based estimators in large samples, by Alexandre Belloni

MCMC-based Estimators in Large Samples: computational complexity analysis of the MCMC algorithm for computation of () and sampling from (). Our analysis of computational complexity builds on several fundamental papers studying the computational complexity of Metropolis procedures.

Title: On the Computational Complexity of MCMC-based Estimators in Large Samples. Authors: Alexandre Belloni, Victor Chernozhukov.

Abstract: In this paper we examine the implications of the statistical large sample theory for the computational complexity of Bayesian and quasi-Bayesian estimation carried out using Metropolis random walks. Cited by: 5.

Citation: Belloni, Alexandre, and Victor Chernozhukov. "On the computational complexity of MCMC-based estimators in large samples." The Annals of Statistics 37, no. 4 (August).

The computational complexity of MCMC-based estimators in large samples is discussed in [56], where the implications of the statistical large sample theory for the computational complexity of Bayesian and quasi-Bayesian estimation are examined.

CiteSeerX - Document Details (Isaac Councill, Lee Giles, Pradeep Teregowda): In this paper we examine the implications of the statistical large sample theory for the computational complexity of Bayesian and quasi-Bayesian estimation carried out using Metropolis random walks.

Our analysis is motivated by the Laplace-Bernstein-Von Mises central limit theorem, which states that in large samples the posterior or quasi-posterior approaches a normal density.

BibTeX: @MISC{Belloni_onthe, author = {Alexandre Belloni and Victor Chernozhukov}, title = {On the computational complexity of MCMC-based estimators in large samples}, year = {}}
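
As a purely illustrative numeric sketch of that normal-approximation statement (the Bernoulli model, uniform prior, and sample sizes below are assumptions), one can compare an exact posterior with its normal approximation as the sample size grows:

import numpy as np
from scipy.stats import beta, norm

rng = np.random.default_rng(2)
theta_true = 0.3
grid = np.linspace(1e-4, 1 - 1e-4, 4001)

for n in (20, 200, 2000):
    y = rng.binomial(1, theta_true, size=n)
    s = y.sum()
    # Exact posterior under a uniform (Beta(1, 1)) prior: Beta(s + 1, n - s + 1).
    exact = beta.pdf(grid, s + 1, n - s + 1)
    # Normal approximation centered at the MLE with variance I(theta_hat)^{-1} / n.
    theta_hat = s / n
    sd = np.sqrt(theta_hat * (1 - theta_hat) / n)
    approx = norm.pdf(grid, loc=theta_hat, scale=sd)
    # Total variation distance between the two densities (Riemann sum on the grid).
    tv = 0.5 * np.sum(np.abs(exact - approx)) * (grid[1] - grid[0])
    print(f"n = {n:5d}  total variation distance ~ {tv:.3f}")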

ON THE COMPUTATIONAL COMPLEXITY OF MCMC-BASED ESTIMATORS IN LARGE SAMPLES. By Alexandre Belloni and Victor Chernozhukov. Abstract. In this paper we examine the implications of the statistical large sample theory for the computational complexity of Bayesian and quasi-Bayesian estimation carried out using Metropolis random walks.

We establish the polynomial-time computational complexity of these algorithms. These results suggest that, at least in large samples, Bayesian and quasi-Bayesian estimators can be computationally efficient alternatives to maximum likelihood and extremum estimators.
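
A hypothetical sketch of that idea (not from the paper; the least-absolute-deviations criterion, grid, and data are assumptions, and the quasi-posterior is taken proportional to the exponentiated sample criterion): the quasi-posterior mean computed from a non-smooth criterion lands close to the corresponding extremum estimator. For this one-dimensional illustration the integral is done on a grid rather than by MCMC.

import numpy as np

rng = np.random.default_rng(3)
n = 500
y = rng.standard_normal(n) + 1.0          # data with true median 1.0

def Q_n(theta):
    # Non-smooth sample criterion (least absolute deviations); its maximizer
    # is the sample median, an extremum estimator.
    return -np.mean(np.abs(y - theta))

grid = np.linspace(0.0, 2.0, 2001)
q = np.array([Q_n(t) for t in grid])

# Extremum estimator: maximize the criterion directly.
extremum_est = grid[np.argmax(q)]

# Quasi-Bayesian (Laplace-type) estimator: mean of the quasi-posterior
# proportional to exp(n * Q_n(theta)) under a flat prior on the grid.
w = np.exp(n * (q - q.max()))              # subtract the max for numerical stability
w /= w.sum()
quasi_bayes_est = np.sum(w * grid)

print("extremum (sample median):", extremum_est)
print("quasi-posterior mean    :", quasi_bayes_est)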

Massachusetts Institute of Technology, Department of Economics, Working Paper Series: ON THE COMPUTATIONAL COMPLEXITY OF MCMC-BASED ESTIMATORS IN LARGE SAMPLES.

Belloni, A., Chernozhukov, V.: On the computational complexity of MCMC-based estimators in large samples. Ann. Stat. 37(4) ().

() On the Computational Complexity of MCMC-Based Estimators in Large Samples. SSRN Electronic Journal. Alexandre Belloni & Victor Chernozhukov, "On the computational complexity of MCMC-based estimators in large samples," CeMMAP working papers CWP12/07, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.

Application domains: MCMC methods are primarily used for calculating numerical approximations of multi-dimensional integrals, for example in Bayesian statistics, computational physics, computational biology and computational linguistics.

In Bayesian statistics, the recent development of MCMC methods has made it possible to compute large hierarchical models that require integrations over a large number of unknown parameters.

Belloni A, Chernozhukov V (): On the computational complexity of MCMC-based estimators in large samples. Ann Stat 37(4). Bickel PJ, Chernoff H (): Asymptotic distribution of the likelihood ratio.
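
As a sketch of how MCMC handles such hierarchical integration, here is a two-block Gibbs sampler for a toy normal-normal model with known variances (entirely illustrative; the model and settings are assumptions, not drawn from the cited works):

import numpy as np

rng = np.random.default_rng(4)

# Toy hierarchical model: y_j ~ N(theta_j, sigma2), theta_j ~ N(mu, tau2),
# flat prior on mu; sigma2 and tau2 treated as known for simplicity.
J, sigma2, tau2 = 8, 1.0, 0.5
y = rng.normal(loc=2.0, scale=np.sqrt(sigma2 + tau2), size=J)  # observed group estimates

n_iter, burn_in = 20000, 5000
mu = y.mean()
theta = y.copy()
mu_draws = np.empty(n_iter)
theta_draws = np.empty((n_iter, J))

for t in range(n_iter):
    # Conditional of each theta_j given mu: normal with precision-weighted mean.
    prec = 1.0 / sigma2 + 1.0 / tau2
    mean = (y / sigma2 + mu / tau2) / prec
    theta = rng.normal(mean, np.sqrt(1.0 / prec))
    # Conditional of mu given all theta_j (flat prior): N(mean(theta), tau2 / J).
    mu = rng.normal(theta.mean(), np.sqrt(tau2 / J))
    mu_draws[t], theta_draws[t] = mu, theta

# Post-burn-in averages approximate the high-dimensional posterior integrals.
print("posterior mean of mu       ~", mu_draws[burn_in:].mean())
print("posterior means of theta_j ~", theta_draws[burn_in:].mean(axis=0).round(2))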

The initial large sample work on Bayesian estimators was done by Laplace (see Stigler () for a detailed review). Further early work of Bernstein () and von Mises () has been considerably extended in both econometric and statistical research, cf. Ibragimov and Has'minskii (), Bickel and Yahav (), Andrews (b), Phillips.

Approximate Bayesian computation (ABC) constitutes a class of computational methods rooted in Bayesian statistics that can be used to estimate the posterior distributions of model parameters.

In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model and thus quantifies the support the data lend to particular parameter values; ABC methods bypass explicit evaluation of the likelihood by simulating from the model instead.
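
A minimal rejection-ABC sketch along these lines (the normal model, uniform prior, mean summary statistic, and tolerance are all illustrative assumptions):

import numpy as np

rng = np.random.default_rng(5)

# Observed data from an assumed model: y_i ~ N(theta, 1), theta unknown.
theta_true = 0.7
y_obs = rng.normal(theta_true, 1.0, size=100)
s_obs = y_obs.mean()                      # summary statistic

def simulate_summary(theta, n=100):
    """Simulate data from the model and return its summary statistic."""
    return rng.normal(theta, 1.0, size=n).mean()

# Rejection ABC: draw theta from the prior, simulate data, and keep those thetas
# whose simulated summary falls within a tolerance of the observed one.
# No likelihood evaluations are needed.
tol = 0.05
prior_draws = rng.uniform(-3.0, 3.0, size=50000)   # Uniform(-3, 3) prior
accepted = [t for t in prior_draws if abs(simulate_summary(t) - s_obs) < tol]

posterior = np.array(accepted)
print("accepted draws   :", posterior.size)
print("ABC posterior mean ~", posterior.mean())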

In many cases, Bayesian parameter estimation is faster than classical maximum likelihood estimation.

In this paper we illustrate the computational advantages of Bayesian estimation using MCMC in several popular latent variable models. These new groups of estimators can all achieve substantially smaller variances and may even reach the minimum variance. One common feature of these estimators is that they are all computationally intensive and, as a result, are suitable only for relatively small samples.

Such limitations are particularly serious for the MCMC-based approach. Method III applies a multi-threaded MCMC-based inference technique to jointly optimize over CVIM in a level set, involving implicit shape matching without target pre-segmentation.

It was shown that Method III significantly outperforms the first two, but it suffers from high computational complexity due to the MCMC-based shape inference.
