Alice thinks of two integers, not both equal. She writes them down, and places one (at random) in each of her hands. She presents both of her closed hands to Bob, who gets to pick one. Alice will show Bob the integer in the hand he picks. Now Bob has to guess which hand contains the larger integer. This is a very simple game, but it has some very interesting properties.
First, suppose that Bob knows nothing of the process that Alice uses to pick her integers -- he doesn't know her distribution, or any bounds on the values she might choose. Can he win more than 50% of the time? It would seem that he cannot: if he picks one of her hands and sees some number i, then for every distribution she might be using under which he would expect i to be the smaller number, there is a translated distribution under which he would expect i to be the larger number. Now what if Alice has a spy, gets to hear Bob's strategy before the game begins, and can change her strategy accordingly? Does Bob have any hope now?
Well, Bob can of course guarantee that he wins 50% of the time, even if Alice knows his strategy, by simply guessing a hand at random and guessing that it contains the larger integer (and this is the basic rationale for randomized algorithms -- Alice can't thwart this strategy even if she knows what Bob will do). In fact, to do this, Bob doesn't even have to look at the number in Alice's hand when she shows it to him. But it turns out that if he does look, he can guarantee that he wins with probability strictly more than 50%, no matter what Alice does, even if she knows his strategy ahead of time. Here's one thing Bob can do: before the game begins, Bob can pick a random integer Q from some distribution that has support over all of the integers (say he flips a coin until he gets heads; if that takes i flips, he flips one more coin. If it comes up heads, his number is i; otherwise it is 1 - i. This puts positive probability on every integer). Bob then picks one of Alice's hands at random, and looks at the number in it. Bob pretends that the number in Alice's other hand is Q + 0.5, and plays accordingly. That's it.
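Bob's strategy is easy to simulate. Here is a minimal sketch in Python -- the pair (3, 7) below is just an illustrative choice of Alice's numbers, and the Q-distribution is the two-sided geometric coin scheme (with tails mapping i to 1 - i so that every integer is reachable):

```python
import random

def sample_q():
    # Flip a coin until heads; i is the total number of flips (P(i) = 1/2^i).
    i = 1
    while random.random() < 0.5:
        i += 1
    # A second flip picks between i and 1 - i, so every integer is reachable.
    return i if random.random() < 0.5 else 1 - i

def bob_plays_round(left, right):
    """One round: Bob picks a hand at random, looks at its number,
    and pretends the other hand holds q + 0.5."""
    threshold = sample_q() + 0.5
    seen, other = (left, right) if random.random() < 0.5 else (right, left)
    guesses_seen_hand = seen > threshold       # imagined other value is below what he sees
    return guesses_seen_hand == (seen > other) # True iff Bob guessed correctly

random.seed(0)
trials = 200_000
rate = sum(bob_plays_round(3, 7) for _ in range(trials)) / trials
print(rate)  # strictly above 0.5 (about 0.56 for this pair)
```

The margin above 50% depends on Alice's pair -- the farther apart her numbers, the likelier Q + 0.5 lands between them -- but it is positive for every pair.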
Why does this work? It's funny that an overactive imagination can help Bob win a game like this -- after all, he doesn't really have any idea what number is in Alice's other hand. But as we will see, it does work. Suppose Alice's numbers are L and R, and that L is less than R. There are three cases. First, Q + 0.5 might be smaller than both L and R: then whichever hand Bob picks, the number he sees exceeds Q + 0.5, so he guesses that the hand he picked is larger, and he wins exactly when he happened to pick R -- probability 1/2. Second, Q + 0.5 might be larger than both L and R: then Bob always guesses Alice's other hand, and he wins exactly when he happened to pick L -- again probability 1/2. Third, Q + 0.5 might fall between L and R: then if Bob picked L, he sees a number below Q + 0.5 and guesses the other hand, which is correct; and if he picked R, he sees a number above Q + 0.5 and guesses the hand he picked, which is also correct -- he wins with probability 1.
The point is, the third case occurs with positive probability no matter what numbers Alice picked, because Bob picked Q from a distribution with support over all of the integers! Say case 3 occurs with probability p > 0. Then Bob's probability of winning is (1-p)*0.5 + p = 0.5 + p/2 > 0.5!
Strangely, things can get funnier if Bob knows Alice's strategy. Let's change Bob's incentives a bit: suppose now Bob gets paid an amount of money equal to the value in the hand that he guessed was larger, minus the value in the hand that he guessed was smaller (so if Bob guesses correctly he wins money, but if he guesses incorrectly, he loses money). Suppose Alice picks her integers as follows: with probability 1/2, she picks (1, 3). With probability 1/4, she picks (3, 9). With probability 1/8, she picks (9, 27). In general, with probability 1/2^i she picks (3^(i-1), 3^i). It is easy to verify that this is a well-defined probability distribution (in fact, Alice can implement it by flipping a coin until she gets heads: if that takes i flips, she picks the pair (3^(i-1), 3^i)).
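Alice's coin-flip implementation can be checked empirically; a small sketch, with the coin simulated by Python's random module:

```python
import random
from collections import Counter

def alice_pair():
    # Flip a coin until heads; i flips total selects the pair (3^(i-1), 3^i),
    # which therefore occurs with probability 1/2^i.
    i = 1
    while random.random() < 0.5:
        i += 1
    return 3 ** (i - 1), 3 ** i

random.seed(1)
n = 100_000
counts = Counter(alice_pair() for _ in range(n))
print(counts[(1, 3)] / n, counts[(3, 9)] / n, counts[(9, 27)] / n)
# roughly 1/2, 1/4, and 1/8 respectively
```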
Now how should Bob play? If he picks a hand and sees a '1', he knows that Alice's other hand must contain a 3, and so he should guess that her other hand is larger, guaranteeing a payoff of 2 (rather than a guaranteed loss of 2). What if he sees a '3'? In this case, he knows that her other hand holds either a 1 (with probability 2/3) or a 9 (with probability 1/3). His expected value for guessing that the other hand is larger is (1/3)*(9-3) - (2/3)*(3-1) = 2/3. His expected value for guessing that the hand he sees is larger is -2/3. In general, conditioned on seeing 3^i for i > 0, the event that Alice's other hand contains a larger number is half as likely as the event that it contains a smaller number, and we can calculate that Bob's expected value for switching hands is 2*3^(i-2), while his expected value for not switching is -2*3^(i-2). So what should Bob do?
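The conditional expected values above can be verified exactly with rational arithmetic; a sketch of the calculation:

```python
from fractions import Fraction

def ev_switch(i):
    """Expected payoff of guessing 'the other hand is larger' after seeing 3^i
    (for i >= 1): the other hand holds 3^(i-1) with probability 2/3 and
    3^(i+1) with probability 1/3."""
    gain = Fraction(1, 3) * (3 ** (i + 1) - 3 ** i)
    loss = Fraction(2, 3) * (3 ** i - 3 ** (i - 1))
    return gain - loss

print(ev_switch(1))  # 2/3, as computed for seeing a '3'
for i in range(1, 8):
    # matches the closed form 2 * 3^(i-2); staying is exactly the negative
    assert ev_switch(i) == 2 * Fraction(3) ** (i - 2)
```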
He should pick a hand at random. He should look at the number. If he sees a 1, he should switch hands. Otherwise, he should calculate the expected value of switching, and if it is higher than the expected value of staying, he should switch. Equivalently, he should pick a hand at random and look at the number. If he sees a 1, he should switch hands. Otherwise, he should switch hands (since we calculated that the expected value of switching is ALWAYS larger, no matter what Bob saw). Equivalently, he should pick a hand, look at the number, and switch hands. Equivalently, he should pick a hand, NOT look at the number, and switch hands (since we calculated that the value he would see is immaterial to his decision). So we have argued that picking a hand at random and guessing that the other one contains the larger number has a positive expected value, while picking a hand at random and guessing that IT contains the larger number has a negative expected value.
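One way to make the tension concrete (this bookkeeping is my own, not from the text): both functions below tally the expected payoff of "always switch", just grouped differently -- once by the value Bob sees, once by the pair Alice drew.

```python
from fractions import Fraction

def by_seen_value(n):
    # Group by the value Bob sees: P(see v) * E[switch payoff | see v],
    # summed over v = 1 and v = 3^i for i = 1..n.  Every term is positive.
    total = Fraction(1, 4) * 2                 # P(see 1) = 1/4, switching gains 2
    for i in range(1, n + 1):
        p_see = Fraction(3, 2 ** (i + 2))      # P(Bob sees 3^i)
        total += p_see * 2 * Fraction(3) ** (i - 2)
    return total

def by_pair(n):
    # Group by the pair Alice drew: within each pair, the gain from switching
    # out of the smaller hand exactly cancels the loss from switching out of
    # the larger hand, so every term is zero.
    total = Fraction(0)
    for i in range(1, n + 1):
        gap = 3 ** i - 3 ** (i - 1)
        total += Fraction(1, 2 ** i) * (Fraction(1, 2) * gap - Fraction(1, 2) * gap)
    return total

print(by_pair(30))               # 0 at every truncation
print(float(by_seen_value(30)))  # partial sums grow without bound
```

Grouped one way, the partial sums are all zero; grouped the other way, they grow without bound -- the underlying series is not absolutely convergent, so the two tallies never have to agree.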
But wait a minute -- the two strategies are clearly equivalent: both of them involve picking the left hand with probability 50% and the right hand with probability 50%, so they must both yield the same expected value. What gives? Surely the value of Bob's strategy isn't affected by the physical act of looking at the number in the hand that he picks (since he won't act on it). Yet we have just argued that two clearly equivalent strategies have different expected values... Hm. This is sometimes referred to as a variant of the two envelopes paradox. You can read some supposed explanations of it and decide whether they are convincing (I'm not completely convinced). It seems related to the St. Petersburg "paradox".