Is there a house advantage in a "double-or-nothing" coin flip game?
from HandwovenConsensus@lemm.ee to nostupidquestions@lemmy.world on 01 Sep 06:22
https://lemm.ee/post/41147568

The idea is that if the coin flip goes in the player’s favor, they win double their bet. After winning, they can either collect their winnings or risk them all on another coin flip for a chance at doubling them again. The initial bet is fixed at, let’s say, $1.

Mathematically, this seems like a fair game. The expected value of each individual round is zero for both house and player.

Intuitively, though, I can’t shake the notion that the player will tend to keep flipping until they lose. In theory, it isn’t the wrong decision to keep flipping since the expected value of the flip doesn’t change, but it feels like it is.
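
To make that intuition concrete, here’s a rough simulation sketch (my own toy model, assuming the player simply commits in advance to cashing out after a fixed number of wins, which is only one possible strategy):

```python
import random

def play_chain(target_wins: int) -> int:
    """Double-or-nothing chain on a $1 stake: cash out only after
    target_wins consecutive wins; any loss wipes out the whole pot."""
    pot = 1
    for _ in range(target_wins):
        if random.random() < 0.5:  # losing flip: the pot goes back to the house
            return 0
        pot *= 2                   # winning flip: the pot doubles
    return pot

trials = 1_000_000
target = 5
payouts = [play_chain(target) for _ in range(trials)]
print("average payout:", sum(payouts) / trials)                            # ~1.0, the $1 stake
print("walked away with anything:", sum(p > 0 for p in payouts) / trials)  # ~1/32
```

The average payout hovers around the $1 stake no matter how long the chain is, but the fraction of players who walk away with anything halves with every extra round.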

Any insight?

#nostupidquestions

solrize@lemmy.world on 01 Sep 06:28 next collapse

If both players have infinite bankrolls, but only one of them is allowed to stop the game once they are ahead, the one with the option of stopping has an advantage. They can play until they are in the lead, then stop. The reason this doesn’t work in real life is that real bankrolls aren’t infinite.
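
As a rough illustration of how a finite bankroll breaks the “stop once you’re ahead” trick, here’s a toy simulation (my own framing, not OP’s exact game): fair even-money $1 bets, and you quit the moment you’re $1 up.

```python
import random

def quit_when_ahead(bankroll: int) -> bool:
    """Fair even-money $1 bets; stop as soon as we're $1 ahead of where we
    started. Returns True if we got ahead, False if we went broke first."""
    money = bankroll
    while 0 < money <= bankroll:
        money += 1 if random.random() < 0.5 else -1
    return money > bankroll

trials = 200_000
bankroll = 10
got_ahead = sum(quit_when_ahead(bankroll) for _ in range(trials))
print("got ahead by $1:", got_ahead / trials)        # ~10/11 with a $10 bankroll
print("went broke first:", 1 - got_ahead / trials)   # ~1/11
```

You usually do end up ahead, but the rare ruin costs the whole $10, so the expected value is still zero: the “advantage” of being allowed to stop only exists if you can never be forced to stop by going broke.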

See also: en.wikipedia.org/wiki/Gambler's_ruin

rimu@piefed.social on 01 Sep 06:58 next collapse

And, in real life, the house has a much larger bankroll.

HandwovenConsensus@lemm.ee on 01 Sep 09:44 collapse

I don’t know if that applies to this scenario. In this game, the player is always in the lead until they aren’t, but I don’t see how that works in their favor.

solrize@lemmy.world on 01 Sep 11:39 collapse

Oh wait you mean the player has to stop if they lose? That’s different.

HandwovenConsensus@lemm.ee on 01 Sep 19:04 collapse

Well, they have to start over with a $1 bet.

OsrsNeedsF2P@lemmy.ml on 01 Sep 07:27 next collapse

If you have $100 and you bet $1 at a time, indefinitely, you will eventually lose it all.

More generally (simplified to assume you’re always betting the same amount):

P(ruin after X bets) = (edit: I removed my formula because it was wrong…but I’m sure you could mathematically prove a formula)
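
For what it’s worth, the classic gambler’s-ruin result for a fair game (not the formula I removed, and it gives the chance of eventual ruin rather than ruin within X bets) says that a player with bankroll a, making even-money $1 bets against a house with bankroll b, eventually goes broke with probability

```latex
P(\text{ruin}) = \frac{b}{a+b} \;\to\; 1 \quad \text{as } b \to \infty
```

So against a house with a much larger bankroll, repeating fixed bets in a perfectly fair game still ends in ruin almost surely.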

HandwovenConsensus@lemm.ee on 01 Sep 09:40 collapse

You’re saying that the player pays a dollar each time they decide to “double-or-nothing”? I was thinking they’d only be risking the dollar they bet to start the game.

That change in the ruleset would definitely tilt the odds in the house’s favor.

ada@lemmy.blahaj.zone on 01 Sep 08:27 next collapse

Once the player loses, the chain ends, and the house wins. So as long as the house can afford to keep pushing the player into trying again, they’re going to create more opportunities for the player to return their winnings to the house.

HandwovenConsensus@lemm.ee on 01 Sep 09:37 collapse

Right, and as the chain continues, the probability of the player maintaining their streak becomes infinitesimal. But the potential payout scales at the same rate.

If the player goes for 3 rounds, they only have a 1/8 chance of winning… but they’ll get 8 times their initial bet. So it’s technically a fair game, right?
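
Spelling that out (assuming the fixed $1 stake and cashing out only after exactly three wins):

```latex
E[\text{payout}] = \tfrac{1}{8}\cdot \$8 + \tfrac{7}{8}\cdot \$0 = \$1
```

which is exactly the $1 staked, so each fixed-length chain is indeed a fair bet in expectation.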

rowinxavier@lemmy.world on 01 Sep 10:01 next collapse

If everyone has the same amount of starting capital it is a fair game assuming both can opt out at any time.

That said, the house appears unable to opt out (they definitely can, you just don’t think about that part), and the house has more capital. From the house’s side, each round has only three possible outcomes: half the time the player loses, a quarter of the time the player wins and plays another round, and a quarter of the time the player wins and walks away. The only outcome where the player actually comes out ahead is the last one; in the other 75% of cases the house either wins outright or gets another 50/50 chance to take everything back in the next round.

[deleted] on 01 Sep 10:20 next collapse
.
Hobbes@startrek.website on 02 Sep 00:47 collapse

They have a 50/50 chance each round. Doesn’t matter how many rounds they’ve won.

HandwovenConsensus@lemm.ee on 02 Sep 01:06 collapse

I’m looking at the game as a whole. The player has a 1 in 8 chance of winning 3 rounds overall.

commie@lemmy.dbzer0.com on 01 Sep 11:40 next collapse

the house can only make $1 per play, and the bettor can make a functionally unlimited amount.

see the martingale strategy. you are basically sticking the house with a martingale strategy in which you get to decide when they bet.

HandwovenConsensus@lemm.ee on 01 Sep 19:36 collapse

But the odds of the player managing to do so shrink in proportion to the payout. In theory, if 8 players each decide to go for three rounds, on average one of them will win, and the losses from the other 7 will pay for that player’s winnings.

You’re right that the house is performing a Martingale strategy. That’s a good insight. That may actually be the source of the house advantage. The scenario is ideal for a Martingale strategy to work.

General_Effort@lemmy.world on 03 Sep 10:33 collapse

That looks like the St. Petersburg Paradox. Much ink has been spilled over it.

The expected payout is infinite. At any point, the “rational” (profit-maximizing) decision is to keep flipping, since you wager a finite sum of money to win an infinite sum. It’s very counter-intuitive, hence called a paradox.
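
In the standard St. Petersburg setup (the payout is 2^k dollars if the first loss comes on flip k, which happens with probability 2^-k), the expectation diverges:

```latex
E[\text{payout}] = \sum_{k=1}^{\infty} \frac{1}{2^{k}} \cdot 2^{k} = \sum_{k=1}^{\infty} 1 = \infty
```

Each doubling of the payout is exactly offset by a halving of its probability, so every possible streak length contributes the same $1 and the sum never converges.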

In reality, a casino has finite money. You can work out how many coin flips it takes to bankrupt it. So you can work out how likely it is to reach that point with a given, finite sum of money. Martingale strategies have already been mentioned.

HandwovenConsensus@lemm.ee on 05 Sep 19:05 collapse

Not quite the same, since in my scenario the player loses everything after a loss while in the St. Petersburg Paradox it seems they keep their winnings. But it does seem relevant in explaining that expected value isn’t everything.