Here is a probability game description:
You generate a uniformly random number in the interval $(0,1)$. You may generate additional random numbers as many times as you like, for a fee of \$0.02 per generation; each decision can be made with knowledge of all previously generated values. Your payout is the maximum of all the numbers you generate. Under the optimal strategy, what is your expected payout?
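For concreteness, here is one way the game could be coded, parameterized by an arbitrary stopping rule (the function names and structure are my own, not part of the problem statement):

```python
import random

def play(stop, fee=0.02, max_rolls=1000):
    """Play one round of the game under a given stopping rule.

    `stop(best, rolls)` returns True when the player keeps the current
    maximum `best` after `rolls` draws. The first draw is free; each
    additional draw costs `fee`. Returns the net payout.
    """
    best = random.random()
    rolls = 1
    while not stop(best, rolls) and rolls < max_rolls:
        best = max(best, random.random())
        rolls += 1
    return best - fee * (rolls - 1)

# The fixed-n strategy I analyze below corresponds to the rule
# lambda best, rolls: rolls >= n
```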
For this game, my approach was to determine the number of rolls $n$ at which the expected difference between the maximum of $n$ rolls and the maximum of $n-1$ rolls falls below $0.02$. This would be the point at which I stop rolling, since the fee outweighs the expected gain. Setting this up as follows:
$X = \max(X_1, \ldots, X_n)$, $Y = \max(X_1, \ldots, X_{n-1})$
$E(X) - E(Y) \leq 0.02$

Since the expected maximum of $n$ i.i.d. $U(0,1)$ draws is $\frac{n}{n+1}$, this becomes

$\frac{n}{n+1} - \frac{n-1}{n} = \frac{1}{n(n+1)} \leq 0.02$
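As a quick sanity check on the identity $E[\max(X_1, \ldots, X_n)] = \frac{n}{n+1}$ used here, a small Monte Carlo sketch (function name is mine):

```python
import random

def avg_max(n, trials=200_000):
    """Monte Carlo estimate of E[max of n i.i.d. Uniform(0,1) draws]."""
    total = 0.0
    for _ in range(trials):
        total += max(random.random() for _ in range(n))
    return total / trials

for n in (1, 2, 6):
    print(n, avg_max(n), n / (n + 1))  # estimates should track n/(n+1)
```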
Solving this gives $n(n+1) \geq 50$, i.e. $n \approx 6.59$ at equality. Then, as the number of rolls is discrete, I take $n = 6$ (the marginal gain of a seventh roll, $\frac{1}{56} < 0.02$, no longer covers the fee). The expected payout should then be:
$\text{Payout} = \frac{6}{7} - 0.02 \times 5 \approx 0.76$

(the first roll is free, so only five fees are paid).
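To double-check this arithmetic, here is a small simulation of the fixed-six-roll strategy (a sketch; the function name is my own):

```python
import random

def fixed_n_payout(n=6, fee=0.02, trials=200_000):
    """Average net payout of always rolling exactly n times
    (first roll free, so n - 1 fees are paid)."""
    total = 0.0
    for _ in range(trials):
        best = max(random.random() for _ in range(n))
        total += best - fee * (n - 1)
    return total / trials

print(fixed_n_payout())  # ≈ 6/7 - 0.10 ≈ 0.757
```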
However, the answer is 0.82. What is wrong with my logic?