# markov-chains questions - 1answer

3,065 markov-chains questions.

### 5 Markov's Mouse and the Maze

I recently started to learn about Markov chains and had a problem regarding the expected time to absorption: Problem: Markov has an untrained mouse that he places in a maze. The mouse may move ...
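The excerpt is cut off, but expected time to absorption is typically computed from the fundamental matrix $(I-Q)^{-1}$, where $Q$ is the transient-to-transient block of the transition matrix. A minimal sketch with a hypothetical two-room maze plus an absorbing exit (the maze layout and probabilities below are assumptions, not the exercise's actual numbers):

```python
import numpy as np

# Hypothetical maze: states 0 and 1 are transient rooms, state 2 is the
# absorbing exit. From each room the mouse moves to the other room with
# probability 1/2 and exits with probability 1/2. Q holds only the
# transient-to-transient transition probabilities.
Q = np.array([[0.0, 0.5],
              [0.5, 0.0]])

# Expected number of steps to absorption from each transient state:
# t = (I - Q)^{-1} 1   (fundamental-matrix identity)
t = np.linalg.solve(np.eye(2) - Q, np.ones(2))
print(t)
```

By symmetry both rooms give the same expected exit time here; the same linear solve works for any transient block $Q$.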

### 2 Markov Chain problem from Grimmett and Stirzaker Ex 6.1.2.a

Problem 2 in section 6.1 of Grimmett and Stirzaker (G+S) asks: a die is rolled repeatedly; which of the following are Markov chains? Supply the transition matrix for those that are: a.) the ...
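The sub-parts are truncated here. A classic process from this family of exercises is the running maximum of the rolls, which is a Markov chain: given that the current maximum is $i$, the next roll leaves it at $i$ with probability $i/6$ and raises it to any $j>i$ with probability $1/6$. A sketch of that transition matrix for a fair die (the choice of sub-part is an assumption, since the excerpt is cut off):

```python
import numpy as np

# Running maximum M_n of fair-die rolls; row i corresponds to current
# maximum i+1 (states 1..6).
P = np.zeros((6, 6))
for i in range(6):
    P[i, i] = (i + 1) / 6      # roll <= current max: maximum unchanged
    for j in range(i + 1, 6):
        P[i, j] = 1 / 6        # roll j+1 > current max: maximum jumps to j+1
print(P.sum(axis=1))           # each row sums to 1
```

State 6 is absorbing (P[5, 5] = 1), as it should be: once a 6 appears, the maximum never changes.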

### Controlling mixing time of Metropolis-Hastings in terms of the original chain?

0 answers, 4 views probability-theory markov-chains
Let M be a reversible Markov chain with stationary distribution Q. Suppose we construct a Metropolis-Hastings proposal in order to get a chain N with stationary distribution P. I'd like to bound the ...

### Markov Chain problem from Grimmett and Stirzaker Ex 6.1.2.a part 2

0 answers, 10 views markov-chains markov-process
I got the first part of this question answered; see the link below: Markov Chain problem from Grimmett and Stirzaker Ex 6.1.2.a There is, however, a second part that I still don't understand: The original ...

### Covariance of a Markov chain

Suppose I have a transition density (matrix) $K$, and I want to calculate the covariance between random variables at two different times. \begin{align*} Cov_{\mu_0}(X_n,X_{n+k})&=E_{\mu_0}(X_n-E_{...
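The display above is truncated by the excerpt. Assuming the standard definition, the computation presumably continues as the usual expansion of the covariance in terms of the $k$-step kernel $K^k$ and the marginal $\mu_n = \mu_0 K^n$:

```latex
\begin{align*}
\operatorname{Cov}_{\mu_0}(X_n, X_{n+k})
  &= E_{\mu_0}\!\big[(X_n - E_{\mu_0}X_n)(X_{n+k} - E_{\mu_0}X_{n+k})\big] \\
  &= \sum_{i}\sum_{j} i\, j\, \mu_n(i)\, K^k(i,j)
     \;-\; \Big(\sum_i i\,\mu_n(i)\Big)\Big(\sum_j j\,\mu_{n+k}(j)\Big).
\end{align*}
```

Here the cross moment uses $P_{\mu_0}(X_n = i,\, X_{n+k} = j) = \mu_n(i)\,K^k(i,j)$, which is exactly where the transition matrix enters.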

### 1 Strong Markov property of Markov chain

I found an exercise in an old exam without a solution. Let $X_{0},X_{1},\dots$ be a Markov chain with state space $S$ and $N$ a stopping time. Let $\mathbb{E}_{x}(\cdot)$ denote the conditional ...

### 3 What is the necessary and sufficient condition of Markov chain sample average converging to the expectation wrt the stationary distribution?

The ergodic theorem says that for an irreducible and positive-recurrent Markov chain $P$, any initial distribution $\lambda$, and $(x_n)_{n\geq 0} \sim \text{Markov}(\lambda, P)$, it follows that for any bounded ...
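For reference, the conclusion of the ergodic theorem alluded to here (the excerpt is cut off) is the almost-sure convergence of sample averages to the stationary expectation:

```latex
\frac{1}{n}\sum_{k=0}^{n-1} f(x_k)
  \;\xrightarrow[n\to\infty]{\text{a.s.}}\;
  \sum_{x \in S} f(x)\,\pi(x)
  \qquad \text{for every bounded } f : S \to \mathbb{R},
```

where $\pi$ is the unique stationary distribution of $P$. The question is then which of these hypotheses (irreducibility, positive recurrence) are also necessary.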

### Ergodic Markov chains and eigenvalues

I just read on Wikipedia that a way to check whether a Markov chain is ergodic is to compute the eigenvalues of the transition matrix: if all of them except the eigenvalue $1$ have modulus less than $1$, then the chain ...
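A quick numerical check of this criterion on a small, hypothetical chain (the matrix below is an assumption for illustration). For an irreducible, aperiodic finite chain, $1$ is a simple eigenvalue of $P$ and every other eigenvalue lies strictly inside the unit circle; a periodic chain, by contrast, has additional eigenvalues of modulus $1$, which is why the criterion uses moduli:

```python
import numpy as np

# Hypothetical 2-state chain (irreducible and aperiodic by inspection).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

eig = np.linalg.eigvals(P)
mods = sorted(abs(eig), reverse=True)
print(mods)  # largest modulus is 1; the rest are strictly smaller
```

For this matrix the eigenvalues are $1$ and $0.7$ (trace $1.7$, determinant $0.7$), so the criterion is satisfied.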

### 2 Properties of Discrete Time Continuous State Markov Process

I have a problem setup that reduces to finding the stationary state of a discrete time continuous state stationary Markov process. Is that state unique, as it is in the discrete state case? ...

### 2 Transient states of a Markov chain and average number of visits

I am a bit confused about the following: suppose that we have a Markov chain and a state $i$ that is transient, i.e. $f_i<1$. Since $i$ is transient, we cannot define the average time of ...

### Not sure how to solve or set up this Markov chain question

2 answers, 45 views probability-theory markov-chains
The maternity ward has 2 beds. Admissions are made at the beginning of the day. On each day, there is a probability of $1/2$ that no admission will arrive and probability $1/2$ that only one potential ...

### -1 How to identify birth rates and death rates of a birth-death process

0 answers, 20 views markov-chains queueing-theory
Consider a queueing system accepting two types of arrivals: some customers arrive alone, according to a Poisson process of rate $\lambda > 0$, whereas other customers arrive two by two, with ...

### Detailed proof of Central Limit Theorem for Markov Chains

Can someone please provide a reference to a detailed proof of the CLT for Markov chains? I have the paper "MC's for MCMC'ists", but I have a problem understanding a crucial part, namely the transition from line 2 to ...

### 1 Finding $\mathbb{E}(s^T)$ for simple symmetric random walk on $\mathbb{Z}$

Let $X_n$ be a simple, symmetric random walk on $\mathbb{Z}$ with $X_0=0$. Let $$T=\inf\{n\ge 1 : X_n=0\}$$ Compute $\mathbb{E}(s^T)$ for fixed $s\in(0,1)$. Apologies if this has been asked ...
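This is a standard first-step/generating-function computation; a sketch of the usual argument:

```latex
\text{Let } F(s) = \mathbb{E}_{1}\!\left(s^{\tau_0}\right) \text{ be the generating
function of the first-passage time from } 1 \text{ to } 0.
\text{ Conditioning on the first step from } 1,
\qquad F(s) = \frac{s}{2} + \frac{s}{2}\,F(s)^2,
```

solving the quadratic and taking the root with $F(0)=0$ gives $F(s) = \dfrac{1-\sqrt{1-s^2}}{s}$. Conditioning the return time $T$ on the walk's first step (to $+1$ or $-1$, symmetric cases) then yields

```latex
\mathbb{E}\!\left(s^{T}\right) = s\,F(s) = 1 - \sqrt{1-s^{2}},
\qquad s \in (0,1).
```

As a sanity check, $\mathbb{E}(s^T) \to 1$ as $s \to 1^-$, consistent with $T < \infty$ almost surely (the walk is recurrent) but $\mathbb{E}(T) = \infty$.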

### 2 The probability of the first jump occurring after $t$ is always positive

Let $\mathbb{S}$ be countable and $\Omega$ be the set of all right-continuous functions $\omega:[0,\infty)\rightarrow \mathbb{S}$. Let $X_t(\omega) = \omega(t)$ denote a continuous-time Markov chain. ...

### Weather forecasting problem using Markov chains

1 answer, 26 views markov-chains markov-process
Suppose that the weather tomorrow depends only on the weather today and that there are $3$ states for the weather: Sun, Cloudy, Rainy. Suppose today is day $1$ and the weather is Sun; what's the probability of ...
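The excerpt omits the actual transition probabilities, so the matrix below is an assumption purely for illustration. The general recipe is the same regardless: the distribution on day $n$ is the day-1 distribution times $P^{n-1}$:

```python
import numpy as np

# Hypothetical transition matrix over (Sun, Cloudy, Rainy); the real
# exercise will supply its own probabilities.
P = np.array([[0.6, 0.3, 0.1],   # from Sun
              [0.3, 0.4, 0.3],   # from Cloudy
              [0.2, 0.4, 0.4]])  # from Rainy

# Day 1 is Sun, so the day-1 distribution is the point mass e_Sun;
# the day-3 distribution is e_Sun @ P^2.
start = np.array([1.0, 0.0, 0.0])
day3 = start @ np.linalg.matrix_power(P, 2)
print(day3)  # (P(Sun), P(Cloudy), P(Rainy)) on day 3
```

With these assumed numbers, the probability of Sun on day 3 is $0.6 \cdot 0.6 + 0.3 \cdot 0.3 + 0.1 \cdot 0.2 = 0.47$.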