Craps Math (WARNING: math content)
This assumes you understand the game and like math. If either of those isn't true, you should probably just leave now.
I was trying to determine the effect of making a bet of 1 with odds of 5 versus a bet of 2 with odds of 4. I know the first should have the higher EV, but I wanted to see by how much. Anyway, this is pissing me off. Here's my spreadsheet...
http://spreadsheets.google.com/pub?k...lx4jYynUVGySNA
Column A: the dice total
Col B: # of combinations
Col C: col B divided by 36, the % chance of rolling that total
Col D/E: the % won/lost from each roll with a pass-line bet. A 7 is always a winner, but 4, 5, 6, 8, 9, and 10 (45689T) only win a certain % of the time. I'm basically only counting throws with the point off.
Col F/G: col D/E * col C -> the % won/lost times the chance of that number being thrown
I am 100% confident in the columns so far (they show a 49.3% expected win rate, which is right). Now to add odds; I'm not sure where my thinking goes wrong.
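For what it's worth, that 49.3% can be double-checked outside the spreadsheet. Here's a quick sketch (the variable names are mine, not from the sheet):

```python
from fractions import Fraction as F

# Ways to roll each total with two dice (column B of the sheet)
ways = {t: 6 - abs(t - 7) for t in range(2, 13)}

p_win = F(0)
for total, n in ways.items():
    p = F(n, 36)                    # column C: chance of this come-out roll
    if total in (7, 11):
        p_win += p                  # come-out winner
    elif total not in (2, 3, 12):
        # point established: win if the point repeats before a 7,
        # which happens n times out of the n + 6 relevant rolls
        p_win += p * F(n, n + 6)

print(p_win)          # exact win probability as a fraction
print(float(p_win))   # ≈ 0.4929, the 49.3% in the sheet
```

The exact value comes out to 244/495, which is where the familiar -0.0141 pass-line EV (488/495 - 1 = -7/495) comes from.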
Col H/I: the number of bets won/lost.
When a 4 hits you win the flat bet + 2 * the odds (odds on the 4 pay 2:1); when you lose, you lose the flat bet + the odds. This seems logically sound.
Col J/K: the EV, i.e. the number of bets won/lost times the chance of that outcome occurring.
Col L: the sum of the +EV and the -EV.
But the sum is always -0.014 for a bet of one and doesn't change with the odds, when you're supposed to get a better edge with increased odds. Does anyone know what's wrong? (You should be able to edit the sheet: if you change the odds the EV doesn't change, but if you change the bet it does. Please don't save changes.)
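A direct calculation reproduces exactly what the sheet shows. Sketching columns H through L in code (pass_ev and odds_pays are my own names; the odds payouts 2:1 on 4/10, 3:2 on 5/9, 6:5 on 6/8 are true odds), the dollar EV stays at -7/495 no matter how big the odds bet is, because a bet paid at true odds contributes exactly zero EV:

```python
from fractions import Fraction as F

# Ways to roll each total with two dice
ways = {t: 6 - abs(t - 7) for t in range(2, 13)}

# True-odds payouts on the odds bet, per point
odds_pays = {4: F(2, 1), 10: F(2, 1),
             5: F(3, 2),  9: F(3, 2),
             6: F(6, 5),  8: F(6, 5)}

def pass_ev(flat, odds):
    """Exact dollar EV of a pass-line bet of `flat` with `odds` behind it."""
    ev = F(0)
    for total, n in ways.items():
        p = F(n, 36)
        if total in (7, 11):            # come-out win: flat bet only
            ev += p * flat
        elif total in (2, 3, 12):       # come-out loss: flat bet only
            ev -= p * flat
        else:                           # point established
            p_make = F(n, n + 6)        # make the point before a 7
            win = flat + odds * odds_pays[total]
            lose = flat + odds
            ev += p * (p_make * win - (1 - p_make) * lose)
    return ev

print(float(pass_ev(1, 0)))   # flat bet alone: ≈ -0.0141
print(float(pass_ev(1, 5)))   # same -7/495: the odds portion is zero-EV
print(float(pass_ev(2, 4)))   # twice the flat bet: ≈ -0.0283
```

So if this matches the spreadsheet's logic, the -0.014 isn't a bug: increasing the odds only lowers the edge as a percentage of the *total* amount wagered, not the expected dollar loss, which is driven entirely by the flat bet.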