
Fisher information of a Poisson-distributed random variable


Let's consider a printer queue. We know that the number of printer jobs approximately follows a Poisson distribution, so $P_{\vartheta}(X=k)=e^{-\vartheta}\frac{\vartheta^k}{k!}$, where $\vartheta\in]0,\infty[$. We estimate the expected number of printer jobs $\vartheta$ by $\frac{1}{n}\sum\limits_{i=1}^nX_i$. Compute the Fisher information $I(\vartheta):=\mathbb{E}_{\vartheta}\left(\left(\frac{d\ln(P_{\vartheta}(X))}{d\vartheta}\right)^2\right)$.

We know that if $(X_1,\dots,X_n)$ are independent random variables whose joint distribution factors as $P_X(\vartheta)=f(X_1,\vartheta)\cdots f(X_n,\vartheta)$, then

$$\mathbb{E}_{\vartheta}\left(\left(\frac{d\ln(P_{\vartheta}(X))}{d\vartheta}\right)^2\right)=n\cdot \mathbb{E}_{\vartheta}\left(\left(\frac{d\ln(P_{\vartheta}(X_i))}{d\vartheta}\right)^2\right).$$

If we consider the $n$ independent observations $X:=(X_1,\dots,X_n)$, where each $X_i$ is the number of printer jobs in a certain period of time, the joint probability is given by\begin{align*} &P_{\vartheta}(\{X=(x_1,\dots,x_n)\})=P_{\vartheta}(\{X_1=x_1\})\cdots P_{\vartheta}(\{X_n=x_n\})\\ &=\frac{e^{-\vartheta}\vartheta^{x_1}}{(x_1!)}\dots \frac{e^{-\vartheta}\vartheta^{x_n}}{(x_n!)}=\frac{e^{-n\vartheta}\vartheta^{\sum\limits_{i=1}^nx_i}}{\prod\limits_{i=1}^n(x_i!)}. \end{align*}

Since $\frac{d}{d\vartheta}\ln f(X_i,\vartheta)=\frac{X_i}{\vartheta}-1$ and $\mathbb{E}_{\vartheta}\left(\left(\frac{X_i}{\vartheta}-1\right)^2\right)=\frac{\mathbb{V}(X_i)}{\vartheta^2}=\frac{1}{\vartheta}$, applying the statement above yields $I(\vartheta)=\frac{n}{\vartheta}$.
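As a quick sanity check (not part of the exercise), the value $I(\vartheta)=\frac{n}{\vartheta}$ can be verified by Monte Carlo: simulate many samples of size $n$, evaluate the score $-n+\frac{1}{\vartheta}\sum_i X_i$ at the true parameter, and average its square. The values $\vartheta=2.5$ and $n=10$ below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 2.5       # hypothetical true Poisson mean, chosen only for the check
n = 10            # sample size
n_sims = 200_000  # number of simulated samples

# Each row is one sample (X_1, ..., X_n); the score of the joint log-likelihood
# at the true parameter is  -n + (sum_i X_i) / theta.
samples = rng.poisson(theta, size=(n_sims, n))
score = -n + samples.sum(axis=1) / theta

# Fisher information is the expected squared score; compare with n / theta.
print(np.mean(score**2))  # ~ 4.0 (Monte Carlo estimate)
print(n / theta)          # = 4.0
```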

However, the sample solution says:

\begin{align*}&I(\vartheta)=\sum\limits_{x=0}^{\infty}\frac{\left(\frac{dP_{\vartheta}(X=x)}{d\vartheta}\right)^2}{P_{\vartheta}(X=x)}=\sum\limits_{x=0}^{\infty}\frac{1}{x!}e^{\vartheta}\vartheta^{-x}\left(-e^{-\vartheta}\vartheta^x+e^{-\vartheta}x\vartheta^{x-1}\right)^2\\&= \sum\limits_{x=0}^{\infty}\frac{\vartheta^x}{x!}e^{-\vartheta}\left(\frac{x}{\vartheta}-1\right)^2=\frac{\mathbb{V}(X)}{\vartheta^2}=\frac{1}{\vartheta}.\end{align*}

This makes no sense to me. Why does the sum run over $x$ from $0$ to $\infty$, and why is there only one random variable instead of a vector representing the $n$ observations?
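(For what it's worth, the sample solution's single-observation series can also be checked numerically by truncating it; the value $\vartheta=2.5$ and the truncation point below are arbitrary choices for illustration.)

```python
import math

theta = 2.5  # hypothetical parameter value, for illustration only
x_max = 50   # truncation point; the Poisson(2.5) tail beyond this is negligible

# Truncated series  sum_x  e^{-theta} * theta^x / x!  *  (x/theta - 1)^2
fisher_single = sum(
    math.exp(-theta) * theta**x / math.factorial(x) * (x / theta - 1) ** 2
    for x in range(x_max + 1)
)

print(fisher_single)  # ~ 0.4
print(1 / theta)      # = 0.4
```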

