
Expectation of product of functions of random variables


If $X$ and $Y$ are independent continuous random variables (each with finite expectation), then the expectation of their product equals the product of their individual expectations.

This can be proved by starting from the definition and using the fact that, by independence, the joint density factors as $f_{X,Y}(x,y)=f_X(x)f_Y(y)$:$$\mathbb{E}(XY)=\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} xyf_X(x)f_Y(y)\, dy\, dx$$

Since $x$ is a constant with respect to $y$, we can move the $x$ terms outside of the integral:$$=\int_{-\infty}^{\infty} xf_X(x)\left(\int_{-\infty}^{\infty} yf_Y(y) dy\right) dx$$

Since the inner integral over $y$ does not depend on $x$, we can move it outside the integral over $x$:$$=\left(\int_{-\infty}^{\infty} xf_X(x)\,dx\right) \left(\int_{-\infty}^{\infty} yf_Y(y)\, dy\right)=\mathbb{E}(X)\,\mathbb{E}(Y)$$

Similarly (useful when showing that the mgf of a sum of independent random variables is the product of the mgfs), we can also show that:$$ \mathbb{E} \left[e^{tX} e^{tY}\right]=\mathbb{E} \left[e^{tX} \right]\mathbb{E} \left[e^{tY}\right]$$
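For example, writing $M_X(t)=\mathbb{E}\left[e^{tX}\right]$ for the mgf, independence of $X$ and $Y$ gives$$M_{X+Y}(t)=\mathbb{E}\left[e^{t(X+Y)}\right]=\mathbb{E}\left[e^{tX}e^{tY}\right]=\mathbb{E}\left[e^{tX}\right]\mathbb{E}\left[e^{tY}\right]=M_X(t)\,M_Y(t).$$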

But, in general, when is it true that:$$ \mathbb{E} \left[g(X) h(Y)\right]=\mathbb{E} \left[g(X) \right]\mathbb{E} \left[h(Y)\right]$$

Is it always true, or are there useful rules and properties for when it is (or is not) true? I think that replacing $x$ and $y$ in the proof above with $g(x)$ and $h(y)$ should work, but I am not sure whether any other nuance needs to be taken care of.
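As a sanity check (not a proof), here is a quick Monte Carlo comparison of the two sides for one arbitrary choice of $g$ and $h$; the specific distributions and functions below are just illustrative assumptions:

```python
import numpy as np

# Monte Carlo sanity check that E[g(X) h(Y)] = E[g(X)] E[h(Y)]
# when X and Y are sampled independently.
rng = np.random.default_rng(0)
n = 1_000_000

x = rng.normal(loc=1.0, scale=2.0, size=n)  # X ~ Normal(1, 4)
y = rng.exponential(scale=0.5, size=n)      # Y ~ Exponential(mean 0.5), independent of X

g = np.sin           # an arbitrary bounded g
h = lambda t: t**2   # an arbitrary h with finite expectation under Y

lhs = np.mean(g(x) * h(y))            # estimate of E[g(X) h(Y)]
rhs = np.mean(g(x)) * np.mean(h(y))   # estimate of E[g(X)] E[h(Y)]

print(lhs, rhs)  # the two estimates agree up to Monte Carlo error
```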

Note: This is an extension of a similar property in the discrete case that I proved some time back (similar in that the same idea would be used to prove the discrete version). I haven't taken the "proof" from any source.

