You’re betting on the final two football games of the season for your home team, the Fiddle-delphia Eagles. Thanks to the wonders of time travel, you happen to know that the Eagles will win one game and lose the other. Unfortunately, you can’t remember in which order they do so. Maybe the Eagles win the first game and lose the second, or maybe they lose the first and win the second.
You have $\$100$, of which you can bet any amount (including fractions of pennies) that the Eagles will win. So if you bet $x$ dollars on the first game and the Eagles win, you’ll have $100+x$ dollars to bet on the second game. But if the Eagles lose that first game, you’re left with $100-x$ dollars to bet on the second game.
You want to implement a betting strategy that guarantees you’ll have as much money as possible after both games. If you did so, then after the two games how much money would you be guaranteed to have?
In this case, where you know there are only two possible outcomes for the final two games (WL or LW), you know with perfect clarity what the outcome of the final game is once you observe the outcome of the first game. Now here's where we get into some semantics: what exactly does "you're betting ... for your home team" mean? If you are allowed to bet on either the Figgles, as they are affectionately known, or their opponents, then you can end up with a guaranteed $\$200$ by betting nothing on the first game and then doubling up on the second game with the sure lock. However, that wouldn't be a very fun answer, so let's instead assume that we can only bet on the Figgles to win and that the question involves deciding how much to optimally bet on the first game and on the second game.
Assume that we bet $x \in [0,100]$ on the first game, observe the outcome $o \in \{W, L\}$, and then bet $y \in [0, U(x,o)]$ on the second game, where $$U(x,o) = \begin{cases} 100+x, &\text{if $o = W$;}\\ 100-x, &\text{if $o = L$.}\end{cases}$$ Since we know that if $o = W$ then the Figgles will lose the second game, and vice versa, the final money balance will be $$V(x, y, o) = \begin{cases} 100 + x - y, & \text{if $o = W$;}\\ 100 - x + y, & \text{if $o = L$,}\end{cases}$$ and the problem becomes $$\max_{x, y} \min \{ V(x,y, W), V(x,y, L) \},$$ where the second-game bet $y$ may depend on the observed outcome $o$. Since this is a multistage problem, we should first optimize the choice of $y$ in each case. In particular, $\hat{y}(W) = 0$, since if the Figgles won the first game they will lose the second, and $\hat{y}(L) = 100 - x$, since you might as well double up everything on the sure win knowing that the Figgles lost the first game.
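As a quick sanity check on the second-stage logic, here is a minimal Python sketch of $U$, $V$, and $\hat{y}$; the function names (`bankroll_after`, `final_balance`, `best_second_bet`) are my own, purely for illustration.

```python
def bankroll_after(x, o):
    """U(x, o): money available for the second game after betting x on the first."""
    return 100 + x if o == "W" else 100 - x

def final_balance(x, y, o):
    """V(x, y, o): final money, using the fact that the second game is the opposite of o."""
    if o == "W":   # won the first game, so the second is a loss: we lose y
        return 100 + x - y
    else:          # lost the first game, so the second is a win: we gain y
        return 100 - x + y

def best_second_bet(x, o):
    """y-hat(o): bet nothing after a first-game win, bet everything after a loss."""
    return 0 if o == "W" else bankroll_after(x, o)

# Example: bet 25 on the first game.
for o in ("W", "L"):
    y = best_second_bet(25, o)
    print(o, y, final_balance(25, y, o))   # W -> 125, L -> 150
```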
So we can reduce the problem by introducing $$\hat{V}(x,o) = V(x, \hat{y}(o), o) = \begin{cases} 100 + x, &\text{if $o = W$;}\\ 2 ( 100 - x ), &\text{if $o = L$,}\end{cases}$$ and then solving for the optimal strategy we get an ending money balance of $$\max_{x \in [0,100]} \min \{ \hat{V}(x, W), \hat{V}(x, L) \} = \max_{x \in [0,100]} \min \{ 100 + x, 200 - 2x \} = 133.\bar{3}$$ when originally wagering $\hat{x} = 33.\bar{3}$ on the first game. (Since $100+x$ is increasing and $200-2x$ is decreasing in $x$, the maximum of the minimum occurs where the two branches meet, $100 + x = 200 - 2x$, i.e. at $x = 100/3$.)
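To double-check the algebra numerically, a short self-contained brute-force sweep over first-game bets (again, all names here are hypothetical helpers, not part of the puzzle) recovers the same numbers:

```python
def hat_V(x, o):
    """V-hat(x, o): final balance under the optimal second-stage bet y-hat(o)."""
    return 100 + x if o == "W" else 2 * (100 - x)

def guaranteed(x):
    """Worst case over the two possible orderings (WL or LW)."""
    return min(hat_V(x, o) for o in ("W", "L"))

# Sweep x over [0, 100] in penny-sized steps and keep the best worst case.
best_x = max((x / 100 for x in range(0, 10001)), key=guaranteed)
print(best_x, guaranteed(best_x))   # approximately 33.33 and 133.33
```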