MCMC questions

975 questions tagged "mcmc".

As an exercise to learn how to manually code MCMC, I've built a Metropolis-Hastings sampler on top of a multinomial-Dirichlet posterior distribution. Since a closed form solution exists, I can compare ...
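
A minimal sketch of such an exercise: a random-walk Metropolis sampler on the first $k-1$ simplex coordinates, targeting a Dirichlet posterior whose closed-form mean is known (the concentration values below are made-up illustration numbers, standing in for prior + observed counts).

```python
import numpy as np

def log_dirichlet(p, alpha):
    """Unnormalised log-density of Dirichlet(alpha) at a simplex point p."""
    return float(np.sum((alpha - 1.0) * np.log(p)))

def mh_dirichlet(alpha, n_steps=20000, step=0.1, seed=0):
    """Random-walk Metropolis on the first k-1 simplex coordinates."""
    rng = np.random.default_rng(seed)
    k = len(alpha)
    p = np.full(k, 1.0 / k)                  # start at the simplex centre
    out = np.empty((n_steps, k))
    for t in range(n_steps):
        q = p.copy()
        q[:-1] = p[:-1] + step * rng.normal(size=k - 1)
        q[-1] = 1.0 - q[:-1].sum()
        # Proposals leaving the simplex have zero target density: reject.
        if np.all(q > 0.0):
            if np.log(rng.uniform()) < log_dirichlet(q, alpha) - log_dirichlet(p, alpha):
                p = q
        out[t] = p
    return out

# Posterior concentration = prior + observed counts (made-up numbers).
alpha = np.array([3.0, 5.0, 12.0])
draws = mh_dirichlet(alpha, seed=1)
print(draws.mean(axis=0), alpha / alpha.sum())   # the two should be close
```

Because the posterior is conjugate, the chain's sample mean can be checked directly against the analytic mean `alpha / alpha.sum()`.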

Let $(\Omega,\mathcal A,\operatorname P)$ be a probability space, $\pi$ be a probability measure on $(\mathbb R,\mathcal B(\mathbb R))$, and $(X_n)_{n\in\mathbb N}$ be a real-valued stationary stochastic ...

I have a very noisy/multimodal likelihood function for a 6-parameter model. The popular emcee sampler fails miserably (no matter how many chains I use and for how ...

I have a model with 5 continuous and 1 discrete parameter. I am using PyMC2 to implement slice sampling. I have a custom likelihood function that returns the log likelihood value that gets passed to ...

This is obviously overkill for this problem, but I thought it would help cement the concepts for me. The problem: Suppose there are two bowls of cookies. Bowl 1 contains 30 vanilla cookies and 10 ...
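
For reference, the standard version of this problem (Bowl 1: 30 vanilla and 10 chocolate; Bowl 2: 20 of each; a bowl is picked at random and a vanilla cookie is drawn — counts assumed here, since the excerpt is truncated) has a one-line Bayes-table solution:

```python
# Counts assumed from the standard version of the problem:
# Bowl 1 holds 30 vanilla / 10 chocolate, Bowl 2 holds 20 / 20.
priors = {"bowl1": 0.5, "bowl2": 0.5}          # a bowl is picked at random
like_vanilla = {"bowl1": 30 / 40, "bowl2": 20 / 40}

unnorm = {b: priors[b] * like_vanilla[b] for b in priors}
total = sum(unnorm.values())
posterior = {b: v / total for b, v in unnorm.items()}
print(posterior["bowl1"])   # 0.6
```

An MCMC sampler run on the same two-state posterior should concentrate 60% of its draws on Bowl 1, which is what makes this a useful sanity check.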

I've read that serial tempering is an approach for "MCMC sampling from a sum of parametrized distributions". I've only found two papers (Marinari and Parisi and Geyer and Thompson) introducing this ...

I am modelling plant dispersal using a generalised normal distribution (wikipedia entry), which has the probability density function: $$ \frac{b}{2a\Gamma(1/b)} e^{-(\frac{d}{a})^b} $$ where $d$ is ...
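
The density above is straightforward to implement and to sanity-check numerically; a sketch (the scale $a=2$ and shape $b=1.5$ are made-up values, and $|d|$ is used so the density is defined on the whole real line):

```python
import numpy as np
from math import gamma

def gen_normal_pdf(d, a, b):
    """Generalised normal pdf: b / (2 a Gamma(1/b)) * exp(-(|d|/a)**b)."""
    return b / (2.0 * a * gamma(1.0 / b)) * np.exp(-(np.abs(d) / a) ** b)

# Sanity check: as a probability density it should integrate to 1 over R.
x = np.linspace(-60.0, 60.0, 200001)
area = float(np.sum(gen_normal_pdf(x, 2.0, 1.5)) * (x[1] - x[0]))
print(area)   # ~1.0
```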

I'm using the emcee package to sample the distribution of a single parameter, using a uniform prior and 8 chains. In this toy example, my likelihood is defined ...

I'm currently trying to use the Metropolis-Hastings algorithm to sample from a posterior distribution of the form $$p(\theta | y ) \propto \prod_{ij} \phi (\theta_{ij}) \times \prod_{i=1}^n \pi_{y_i}...

Let's say I have $n$ posterior samples of $\theta_1$ and $\theta_2$. I suppose that any region $R$ which contains exactly $(1-\alpha)n$ of the points will be an approximate $(1-\alpha)\times100$ ...
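
For a single parameter's marginal, one concrete construction of such a region is the shortest interval containing $\lceil(1-\alpha)n\rceil$ of the sorted draws (an empirical HPD interval); a sketch, checked on made-up standard-normal draws:

```python
import numpy as np

def hpd_interval(samples, alpha=0.05):
    """Shortest interval containing ceil((1 - alpha) * n) of the draws."""
    x = np.sort(np.asarray(samples))
    n = len(x)
    m = int(np.ceil((1.0 - alpha) * n))
    widths = x[m - 1:] - x[: n - m + 1]   # width of every candidate interval
    i = int(np.argmin(widths))
    return x[i], x[i + m - 1]

rng = np.random.default_rng(0)
draws = rng.normal(size=50000)
lo, hi = hpd_interval(draws)
print(lo, hi)   # near (-1.96, 1.96) for a standard normal
```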

I'm currently reading the Probabilistic Programming and Bayesian Methods for Hackers "book". I've read a few chapters, and I was thinking about the first chapter, where the first example with pymc consists ...

I have WinBUGS code for a zero-inflated Poisson (ZIP) model. I obtained this code from my lab at university, and the person who wrote it is not available for me to ask questions. Here is the code: ...

In a previous question I asked if I could scale the likelihood as my MCMC process advanced, to keep the acceptance fraction within a reasonable range (~0.2-0.5). I was told that this is not a valid ...

I'm reading about the Approximate Bayesian Computation (ABC) method, and I came across two rather popular approaches: Sequential Monte Carlo (SMC) methodology to sample sequentially from a ...

I've come across a very good text on Bayes/MCMC. It suggests that standardising your independent variables will make an MCMC (Metropolis) algorithm more efficient, but also that it may reduce (...

Gibbs sampling is a powerful and popular technique for drawing samples from Bayesian networks (BNs). Metropolis sampling is another popular technique, though - in my opinion - a less accessible method. ...

I am currently in the process of implementing a model for soccer result prediction in JAGS. Actually, I have implemented several, but I have reached my most difficult challenge yet: A model described ...

I am studying MCMC approaches to HMMs and factorial HMMs. I am reading the paper 'Introduction to Hidden Markov Models and Bayesian Networks': http://mlg.eng.cam.ac.uk/zoubin/papers/ijprai.pdf In ...

I am trying to understand the zero-inflated Poisson (ZIP) model used in Bayesian regression modelling. I came across code here for the ZIP model. My question is related to the 3rd line of code within ...

I'm running a simulation in the following framework. I have some responses $\theta_{ik}$, and since $K$ is very large, I try to use a Bayesian factor model to reduce the dimension. The following is the factor part: ...

I need to study Markov chain Monte Carlo methods; more specifically, I need to study the Metropolis-Hastings algorithm and everything around it, such as convergence criteria. Can anyone recommend a book or a paper, ...

I am aware that both are methods of sampling from the posterior. MC integration replaces the integral with an average over a Monte Carlo sample. Is this sample independent? Gibbs sampling is a class of MCMC ...

There are different kinds of MCMC algorithms: Metropolis-Hastings Gibbs Importance/rejection sampling (related). Why would one use Gibbs sampling instead of Metropolis-Hastings? I suspect there ...
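
One standard answer: when the full conditionals are available in closed form, Gibbs needs no proposal tuning and never rejects. A toy illustration for a standard bivariate normal (a made-up target), where each conditional is itself normal:

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_steps=20000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.
    Each full conditional is N(rho * other, 1 - rho**2): no step-size
    tuning, no rejections."""
    rng = np.random.default_rng(seed)
    x = y = 0.0
    sd = np.sqrt(1.0 - rho ** 2)
    out = np.empty((n_steps, 2))
    for t in range(n_steps):
        x = rng.normal(rho * y, sd)    # draw x | y
        y = rng.normal(rho * x, sd)    # draw y | x
        out[t] = (x, y)
    return out

draws = gibbs_bivariate_normal(0.8, seed=1)
print(np.corrcoef(draws[:, 0], draws[:, 1])[0, 1])   # ~0.8
```

The trade-off runs the other way when the conditionals are not tractable: then a Metropolis-Hastings step (or Metropolis-within-Gibbs) is the usual fallback.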

I am working with a rather noisy and multi-modal likelihood. I've found that in order to obtain reasonable results from my Bayesian MCMC sampler (emcee, an affine ...

I've read a lot of articles that use the pymc Python module to apply MCMC algorithms to real-life problems. I found that all the examples assume various kinds of distribution ...

I'm doing an MCMC simulation, and the posterior is hard to sample. Suppose I need to sample a vector $\beta \sim N(M_{\beta} , \Sigma_{\beta})1_{\beta_{K}>0}$, which means $\beta$ is a vector of length ...

I'm reading the original NUTS paper by Hoffman and Gelman, but I couldn't fully understand the recursive doubling process. The following figure is taken from the paper. The NUTS process starts ...

In Bayesian inference, we usually sample from the posterior $f(\theta_1,\theta_2|-)$ via MCMC to compute point estimates for the parameters of interest. I am investigating an alternate form of ...

I am running an MCMC sampler with a model that uses Cash's C statistic for the likelihood (along with Gaussian priors), which is supposed to resemble a chi-squared distribution in the limit of large ...

In the Metropolis–Hastings algorithm for sampling a target distribution, let: $\pi_{i}$ be the target density at state $i$, $\pi_j$ be the target density at the proposed state $j$, $h_{ij}$ be the ...
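
With that notation, the acceptance probability for the move $i \to j$ is $a_{ij} = \min\{1,\ \pi_j h_{ji} / (\pi_i h_{ij})\}$; as a sketch (the density values below are made up):

```python
def mh_acceptance(pi_i, pi_j, h_ij, h_ji):
    """Acceptance probability for a proposed move i -> j:
    min(1, (pi_j * h_ji) / (pi_i * h_ij)),
    where h_ij is the density of proposing j from i."""
    return min(1.0, (pi_j * h_ji) / (pi_i * h_ij))

# With a symmetric proposal (h_ij == h_ji) this reduces to the plain
# Metropolis ratio min(1, pi_j / pi_i).
down = mh_acceptance(0.2, 0.1, 0.5, 0.5)   # downhill move: ratio 0.5
up = mh_acceptance(0.1, 0.2, 0.5, 0.5)     # uphill move: always accepted
print(down, up)
```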

I have been trying to understand the Metropolis-Hastings algorithm in order to write code for estimating the parameters of a model (i.e. $f(x)=a x$). According to the literature, the Metropolis-...
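
A bare-bones version of that exercise, assuming Gaussian noise with known $\sigma$ and a flat prior on $a$ (the true value, noise level, and step size below are all made up):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from f(x) = a*x with Gaussian noise (made-up a and sigma).
a_true, sigma = 2.0, 0.5
x = np.linspace(0.0, 1.0, 50)
y = a_true * x + rng.normal(0.0, sigma, size=x.size)

def log_posterior(a):
    """Flat prior on a, Gaussian likelihood with known sigma."""
    return -0.5 * float(np.sum((y - a * x) ** 2)) / sigma ** 2

a, chain = 0.0, []
for _ in range(20000):
    a_prop = a + 0.2 * rng.normal()          # symmetric random-walk proposal
    if np.log(rng.uniform()) < log_posterior(a_prop) - log_posterior(a):
        a = a_prop
    chain.append(a)

a_hat = float(np.mean(chain[5000:]))         # discard burn-in
print(a_hat)   # close to a_true = 2.0
```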

I'm reading Learning in Graphical Models (Jordan 1998), and in D. J. C. MacKay's chapter 'Introduction to Monte Carlo Methods' (page 201) it says this: It's not clear to me what this is referring to. "...

I have used PyMC3 to perform inference on a Bayesian logistic regression model. I want to find the posterior over the weights $\beta \in \mathbb{R}^K$ given a Gaussian prior $\mathcal{N}(0,100 \...

I am having trouble interpreting the output of an MCMC logistic regression run in R using the MCMCpack package. Unfortunately, I have had very little luck finding sources on the web. I am assuming that ...

I guess I understand the equation of the detailed balance condition, which states that for transition probability $q$ and stationary distribution $\pi$, a Markov Chain satisfies detailed balance if $$...
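
The condition $\pi_i q_{ij} = \pi_j q_{ji}$ for all $i, j$ is easy to verify numerically for a finite-state chain; a sketch with a made-up reversible two-state transition matrix:

```python
import numpy as np

def satisfies_detailed_balance(P, pi, tol=1e-12):
    """Check pi_i * P[i, j] == pi_j * P[j, i] for all pairs (i, j)."""
    flow = pi[:, None] * P          # flow[i, j] = pi_i * P_ij
    return bool(np.allclose(flow, flow.T, atol=tol))

# A reversible two-state chain with stationary distribution (0.75, 0.25):
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])
pi = np.array([0.75, 0.25])
print(satisfies_detailed_balance(P, pi), np.allclose(pi @ P, pi))   # True True
```

Detailed balance implies stationarity (summing $\pi_i q_{ij} = \pi_j q_{ji}$ over $i$ gives $\pi P = \pi$), which the second check confirms; the converse does not hold in general.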

Debugging MCMC programs is notoriously difficult. The difficulty arises because of several issues, some of which are: (a) Cyclic nature of the algorithm We iteratively draw parameters conditional on ...

I am running a Metropolis-Hastings MCMC to find the distribution of a parameter that takes real, positive values. I was considering using the truncated normal distribution, and was wondering if I have ...
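
If the truncated-normal proposal is centred at the current state, it is not symmetric, so the Hastings correction $q(x \mid x')/q(x' \mid x)$ must be kept in the acceptance ratio; a sketch with a hypothetical Exponential(1) target (all tuning numbers are made up):

```python
import numpy as np
from math import erf, log, sqrt, pi

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def trunc_logpdf(x, mean, sd):
    """Log density at x of a Normal(mean, sd) truncated to (0, inf)."""
    z = (x - mean) / sd
    return (-0.5 * z * z - log(sd) - 0.5 * log(2.0 * pi)
            - log(1.0 - norm_cdf(-mean / sd)))

def sample_trunc(mean, sd, rng):
    """Rejection-sample a Normal(mean, sd) conditioned on being positive."""
    while True:
        x = rng.normal(mean, sd)
        if x > 0.0:
            return x

def mh_positive(log_target, n_steps=5000, sd=0.5, x0=1.0, seed=0):
    """MH for a positive parameter with a truncated-normal proposal centred
    at the current state; the asymmetric proposal requires the Hastings
    correction terms below."""
    rng = np.random.default_rng(seed)
    x, out = x0, np.empty(n_steps)
    for t in range(n_steps):
        xp = sample_trunc(x, sd, rng)
        log_ratio = (log_target(xp) - log_target(x)
                     + trunc_logpdf(x, xp, sd) - trunc_logpdf(xp, x, sd))
        if np.log(rng.uniform()) < log_ratio:
            x = xp
        out[t] = x
    return out

# Hypothetical target: Exponential(1), i.e. log density -x on (0, inf).
draws = mh_positive(lambda x: -x, seed=1)
print(draws.mean())   # ~1.0
```

Dropping the two `trunc_logpdf` terms would bias the chain, which is exactly the pitfall this kind of question is about.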

I currently am trying to figure out what would be the best option to perform MCMC sampling for a model which may show some kind of pathological behavior for some parameter combinations. The concrete ...

I am currently testing some multilevel models in pymc3 and found that the hyperpriors get distorted when I run the level only to generate the prior. The hyperpriors I am using are generating ...

Slice sampling asks us to draw uniformly from $f^{-1}(\,]y,+\infty[\,)$ (see the Wikipedia page). However, how can we be sure that a uniform distribution defined over the set $f^{-1}(\,]y,+\infty[\,)$ is in fact proper? If I had to ...
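
For an integrable $f$ and $y>0$, the slice $\{x : f(x) > y\}$ has finite Lebesgue measure (by Markov's inequality it is at most $\int f / y$), so the uniform on it is proper. A 1-D stepping-out/shrinkage slice sampler in the style of Neal (2003), with a standard-normal target as a made-up example:

```python
import numpy as np

def slice_sample_1d(logf, x0, n_steps=5000, w=1.0, seed=0):
    """1-D slice sampler with stepping-out and shrinkage (after Neal 2003)."""
    rng = np.random.default_rng(seed)
    x = x0
    out = np.empty(n_steps)
    for t in range(n_steps):
        logy = logf(x) + np.log(rng.uniform())   # auxiliary height under f(x)
        lo = x - w * rng.uniform()               # randomly placed bracket
        hi = lo + w
        while logf(lo) > logy:                   # step out to the left
            lo -= w
        while logf(hi) > logy:                   # step out to the right
            hi += w
        while True:                              # shrink until inside slice
            xp = rng.uniform(lo, hi)
            if logf(xp) > logy:
                x = xp
                break
            if xp < x:
                lo = xp
            else:
                hi = xp
        out[t] = x
    return out

draws = slice_sample_1d(lambda x: -0.5 * x * x, 0.0)
print(draws.mean(), draws.std())   # near 0 and 1 for the standard normal
```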

I'm using MCMC to simulate the distribution of some parameters in a Bayesian hierarchical model, which has the following form: $$\gamma_{ik} \sim Ber(\omega_{ik}).$$ Then I make a logit-...

I am trying to interpret the regression coefficients of a covariate in a Bayesian linear regression problem. More specifically, I am trying to determine if the regression coefficient has an important ...

I am a new user of WinBUGS. I am running a model with 2 chains. When my model has finished running, I have the following posterior density plot of my parameter: The plot only shows one distribution (i....

When running MCMC sampling, a common measure of performance is the effective sample size (ESS). There are lots of different ways to estimate the ESS from samples e.g. https://arxiv.org/abs/1011.0175. ...
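
One of the simplest estimators is $n / (1 + 2\sum_k \rho_k)$, with the autocorrelation sum truncated at the first non-positive lag (an initial-positive-sequence rule); a sketch, checked on made-up i.i.d. draws and on an AR(1) chain:

```python
import numpy as np

def effective_sample_size(x, max_lag=1000):
    """ESS = n / (1 + 2 * sum of autocorrelations), truncating the sum
    at the first non-positive lag."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    var = float(np.mean(xc * xc))
    s = 0.0
    for lag in range(1, max_lag):
        rho = float(np.mean(xc[:-lag] * xc[lag:])) / var
        if rho <= 0.0:
            break
        s += rho
    return n / (1.0 + 2.0 * s)

rng = np.random.default_rng(0)
ess_iid = effective_sample_size(rng.normal(size=10000))
# An AR(1) chain with phi = 0.9 should come out near n*(1-phi)/(1+phi).
ar = np.empty(10000)
ar[0] = 0.0
for t in range(1, 10000):
    ar[t] = 0.9 * ar[t - 1] + rng.normal()
ess_ar = effective_sample_size(ar)
print(ess_iid, ess_ar)   # ESS near n for i.i.d. draws; far smaller for AR(1)
```

More careful estimators (e.g. pairing lags, batch means) differ mainly in how they truncate or smooth the autocorrelation sum.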

For numerical Bayesian inference we have Posterior $\propto$ Prior $\times$ Likelihood. In MCMC we do not need to calculate the denominator in Bayes' rule. My question is: can I multiply the likelihood by a large ...
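
Multiplying the likelihood by a fixed constant changes nothing: the Metropolis-Hastings acceptance ratio depends only on ratios of posterior densities, so the constant cancels exactly (rescaling adaptively as the chain runs is a different matter and breaks the invariance). A tiny demonstration with a made-up target:

```python
import numpy as np

def accept_prob(log_post, x, xp, log_c=0.0):
    """MH acceptance probability; log_c plays the role of multiplying the
    unnormalised posterior by a fixed constant exp(log_c)."""
    return min(1.0, float(np.exp((log_post(xp) + log_c) - (log_post(x) + log_c))))

log_post = lambda x: -0.5 * x ** 2      # unnormalised standard-normal target

p_plain = accept_prob(log_post, 0.5, 1.2)
p_scaled = accept_prob(log_post, 0.5, 1.2, log_c=50.0)
print(p_plain, p_scaled)    # equal up to floating-point rounding
```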

I am trying to estimate a Bayesian hierarchical model using the random-walk Metropolis-Hastings algorithm. While in a non-hierarchical model the algorithm is straightforward, I am not sure I am ...

I'm reading about Markov chains and I'm starting to bump into these drift conditions, and their relationship with a chain's ergodic properties. The drift condition is that there exists a "scale ...

I believe MCMC could be utilized to estimate the MAP. At least there is an option in packages like PyMC. I just started reading about Bayesian Optimization, but the first thing that hit me was that ...

I am interested in generating samples from a density $\pi(\theta)$ to construct a histogram for $\pi(\theta)$ and to use these samples to generate samples of $f(\theta)$ for some function $f$. I may ...

I have a BSTS model and need the forecast for the entire period. For example, my training set is from 2008 to 2016 and my testing set is from 2017 Jan to 2018 Jan. Now I need the predicted values for 2008 ...
