Trouble with an expression of expected value

I am reading a proof that a finite irreducible Markov chain has a unique positive stationary distribution $\pi$ satisfying $\pi_j=\frac{1}{\mu_j}$ for all $j$, where $\mu_j$ is the mean return time to state $j$. My problem is not with the proof as a whole, but with one specific part of it that I don't understand. First, some necessary bits:


...

Let $Y_n$ be the number of steps from the $(n-1)$st visit to $j$ to the $n$th visit to $j$. By the strong Markov property and the law of large numbers, $$\lim_{n\rightarrow\infty}\frac{Y_1+Y_2+\dots+Y_n}{n}=\mu_j$$ with probability $1$.
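(Not part of the quoted proof, just a sanity check I find helpful: a minimal simulation sketch in Python, with a concrete $3$-state transition matrix chosen purely for illustration, showing the running average of the return times $Y_k$ to a state $j$ settling near $\mu_j=1/\pi_j$.)

```python
import numpy as np

# Illustration only: a made-up 3-state irreducible chain, not from the proof.
rng = np.random.default_rng(0)

P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
j = 0                       # state whose return times we record
n_steps = 200_000

state = j
returns = []                # Y_1, Y_2, ... : steps between successive visits to j
steps_since_visit = 0
for _ in range(n_steps):
    state = rng.choice(3, p=P[state])
    steps_since_visit += 1
    if state == j:
        returns.append(steps_since_visit)
        steps_since_visit = 0

# Empirical mean return time (Y_1 + ... + Y_n) / n
print("empirical mu_j:", np.mean(returns))

# Compare with 1 / pi_j, where pi is the left eigenvector of P for eigenvalue 1
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
pi /= pi.sum()
print("1 / pi_j      :", 1 / pi[j])
```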

Let the indicator variable $I_m$ equal $1$ if $X_m=j$ and $0$ otherwise. Then $\sum_{m=0}^{n-1}I_m$ is the number of visits to $j$ in the first $n$ steps of the chain. In the long run, $$\lim_{n\rightarrow\infty}\mathbb{E}\left(\frac{1}{n}\sum_{m=0}^{n-1}I_m\right)=\lim_{n\rightarrow\infty}\frac{1}{n}\sum_{m=0}^{n-1}\mathbb{E}(I_m)=\lim_{n\rightarrow\infty}\frac{1}{n}\sum_{m=0}^{n-1}P_{ij}^{m}.$$
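(Again as an illustration only, with the same made-up matrix as above and an assumed starting state $i$: the Cesàro average $\frac{1}{n}\sum_{m=0}^{n-1}P_{ij}^{m}$ can be checked numerically against $\pi_j$.)

```python
import numpy as np

# Illustration only: same made-up 3-state matrix; i and j are arbitrary choices.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
i, j, n = 1, 0, 10_000

# (1/n) * sum_{m=0}^{n-1} (P^m)_{ij}
Pm = np.eye(3)              # P^0
cesaro = 0.0
for _ in range(n):
    cesaro += Pm[i, j]
    Pm = Pm @ P
cesaro /= n

# Stationary distribution pi from the left eigenvector of P for eigenvalue 1
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
pi /= pi.sum()
print(cesaro, pi[j])        # both should be close to pi_j = 1/mu_j
```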

[Now to the part I have trouble with.] Since there are $n$ visits to $j$ by time $Y_1+\dots+Y_n$, for large $n$, $$\frac{1}{n}\sum_{m=0}^{n-1}I_m\approx\frac{n}{Y_1+\dots+Y_n},$$ giving $$\lim_{n\rightarrow\infty}\frac{1}{n}\sum_{m=0}^{n-1}P_{ij}^{m}=\lim_{n\rightarrow\infty}\frac{n}{Y_1+\dots+Y_n}=\frac{1}{\mu_j}.$$
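(This is not from the proof I'm quoting, but my understanding is that the approximation itself can be made precise by a standard renewal-type sandwich bound; here $R_n$ is notation I introduce for the number of returns to $j$ by time $n$, assuming the chain starts in $j$.) Since the $k$th return to $j$ happens at time $Y_1+\dots+Y_k$, the count $R_n$ satisfies $$Y_1+\dots+Y_{R_n}\le n< Y_1+\dots+Y_{R_n+1},$$ so dividing through by $R_n$ and using the law of large numbers above gives $n/R_n\rightarrow\mu_j$ with probability $1$, hence $$\frac{1}{n}\sum_{m=0}^{n-1}I_m=\frac{R_n+O(1)}{n}\longrightarrow\frac{1}{\mu_j}\quad\text{with probability }1.$$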

...


So, specifically for the last equation: why is the LHS equal to the limit of $n/(Y_1+\dots+Y_n)$? Above we take the expectation of $\frac{1}{n}\sum_m I_m$. What happens to the expectation? Shouldn't we need to take the expectation of $n/(Y_1+\dots+Y_n)$ as well?

