
Let's play a game - part 2

PostPosted: Mon Nov 12, 2012 4:45 pm
by Haggis_McMutton
After the roaring success of the previous installment I present you with this new completely unrealistic game.

I hold two envelopes. Each contains a sum of money between $1 and $100. Neither of us has any knowledge about the envelopes other than this fact.
If you choose to play the game, you get to pick one envelope and look inside to see how much money it has. Then you can choose to keep your current envelope or switch and keep the other one. Once the choice is made you're stuck with the envelope you picked.

If you had to pay a fee to play the game, how much would you be willing to pay? (no $0 or minus infinity shenanigans, you know what I mean).
Does the peek at the cash in the first envelope offer you any valuable info? Can this be used to increase your chances of getting the envelope with more money?
How about if, instead of us having no information about the money in the envelopes, we knew for sure that the amounts in both were randomly chosen (from a uniform distribution, i.e. every value between $1 and $100 has a 1% chance of being in either envelope). Does this change anything?


P.S. as before, if you know exactly what this is don't post the wikipedia page just yet.

Re: Let's play a game - part 2

PostPosted: Mon Nov 12, 2012 6:27 pm
by jimboston
$5... maybe $10

No real thought put into the statistics here.

It's gambling and the numbers are small enough for me to risk up to $10.

Re: Let's play a game - part 2

PostPosted: Mon Nov 12, 2012 6:28 pm
by jimboston
jimboston wrote:$5... maybe $10

No real thought put into the statistics here.

It's gambling and the numbers are small enough for me to risk up to $10.


Well maybe $20 if I'm playing at a place that provides free drinks too...

Re: Let's play a game - part 2

PostPosted: Mon Nov 12, 2012 8:28 pm
by /
If this were a casino-style game, one would think they would charge at least $50.50 just to be able to break even, giving a 50% chance to win something and a 50% chance to lose something.
But still, that's not an appealing prospect, since the loss and gain can be as little as 50 cents, eh... I suppose if there's two envelopes it's two shots for the price of one, so that's a plus. I would be willing to pay between $20 and $40 depending on how lucky I felt at the moment.

Re: Let's play a game - part 2

PostPosted: Mon Nov 12, 2012 8:57 pm
by rdsrds2120
What a nice little paradox.

BMO

Re: Let's play a game - part 2

PostPosted: Mon Nov 12, 2012 9:03 pm
by x-raider
Haggis_McMutton wrote:After the roaring success of the previous installment I present you with this new completely unrealistic game.

I remember that. It was great. Thanks for doing this again.

Haggis_McMutton wrote:Does the peek at the cash in the first envelope offer you any valuable info? Can this be used to increase your chances of getting the envelope with more money?

It depends. Are they different values? If so then I'd always swap after getting $1 at first. But either way, as the value of the first envelope goes up, the chances of receiving more than that in the 2nd diminish. At that point it would only be logical to settle for anything above $50. But I personally would settle for $35.

Haggis_McMutton wrote:How about if instead of us having no information about the money in the envelopes we knew for sure that the amounts in both were randomly chosen (from a uniform distribution, i.e. every value between $1 and $100 has a 1% chance of being in either envelope). Does this change anything?

It changes everything. I assumed that already. I haven't considered the alternative. My brain will hurt if I try.

Haggis_McMutton wrote:If you had to pay a fee to play the game, how much would you be willing to pay?

As the professor said. Except since $0 is not a valid value, the average (probability-wise) would be $50.50 and I'd want to make a profit. You might just pay more coz it's fun. But since I said I'd settle for $35 I guess I'd still only pay $40. As the house I'd charge $51.

Haha, fast-posted with one line by Mr.BMO

Re: Let's play a game - part 2

PostPosted: Mon Nov 12, 2012 9:22 pm
by Lootifer
I haven't encountered this before, but it sounds very much like a simple form of blackjack.

Hard one to calculate; there are two possible reasons why the second envelope gives you a slight advantage.

Firstly the envelopes may not be independent; that is, they may come from a cache of 100 envelopes (or some other finite number); thus by knowing one number you can infer estimates on the remaining population (this is the same principle on which counting cards in blackjack is based).

Secondly the law of averages still applies in uniform distributions. The mean will still be 50.5 given enough samples; thus if your first envelope is less than this number the most rational decision is to select the second one. Yes you may lose even more by doing this (say you picked $47 in your first envelope and $12 in your second) but on average if you always pick based on this rule you will come out slightly better off.

Now I can't be bothered working out the impact of these effects, but the population is big enough for them to not mean much; I still wouldn't pay any more than $50.50.
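The switch-below-the-mean rule can be checked by brute force. The sketch below assumes the amounts are independent whole-dollar draws with every value $1..$100 equally likely (the iid-uniform reading of the puzzle; the original post doesn't promise it), and sweeps every possible switching threshold:

```python
# Brute-force check of threshold strategies, assuming each envelope holds
# an independent uniform whole-dollar amount from $1..$100 (an assumption;
# the puzzle itself doesn't promise this).

def expected_payoff(t):
    """Exact average payoff of switching whenever the first envelope shows
    at most t, taken over all 100 * 100 equally likely (first, second) pairs."""
    total = 0
    for first in range(1, 101):
        for second in range(1, 101):
            total += second if first <= t else first
    return total / (100 * 100)

best = max(range(101), key=expected_payoff)   # t = 0 means never switch
print(best, expected_payoff(best))            # 50 63.0
```

Under those assumptions the sweep lands on a threshold of $50, paying $63.00 on average instead of the $50.50 of a blind pick, so the peek is worth real money.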

Re: Let's play a game - part 2

PostPosted: Mon Nov 12, 2012 10:09 pm
by Haggis_McMutton
jimboston wrote:Well maybe $20 if I'm playing at a place that provides free drinks too...


Ah, the key to being a real winner in any gambling establishment: make sure you cost them more in free booze than they take from you in gambling losses.

rdsrds2120 wrote:What a nice little paradox.

BMO


What a tease. I think there's more than one (pseudo)-paradox associated with this problem.
Elaborate plox.

x-raider wrote:Does the peek at the cash in the first envelope offer you any valuable info? Can this be used to increase your chances of getting the envelope with more money ? It depends. Are they different values? If so then I'd always swap from picking again from getting $1 at first. But either way, as the value of the first envelope goes up the chances of you receiving more than that in the 2nd diminish. At which point It would only be logical to settle for anything above $50. But I personally would settle for $35.
How about if instead of us having no information about the money in the envelope we knew for sure that the amounts in both were randomly chosen (from a uniform distribution, i.e. every value between $1 and $100 has a 1% chance of being in either envelope). Does this change anything? It changes everything. I assumed that already. I haven't considered the alternative. My brain will hurt if I try.

So the alternative would be that they come from ANY distribution.
Worst case scenario, this means the universe might be ruled by some malevolent deity that knows exactly what your strategy will be and which envelope you'll look at first, and chooses sums to put in the envelopes specifically designed to f*ck up your plan. So, if we're not assuming the uniform distribution, worst case scenario, your plan would have to be "adversary-proof" (i.e. even if the deity existed and knew your plan you should still be able to come out ahead somehow).

But it actually isn't too hard to extrapolate from the uniform distribution to the adversary distribution after you figure out the trick for the uniform (yeah, there's a trick).

Lootifer wrote:I haven't encountered this before, but it sounds very much like a simple form of blackjack.

Hard one to calculate; there are two possible reasons why the second envelope gives you a slight advantage.

Firstly the envelopes may not be independent; that is, they may come from a cache of 100 envelopes (or some other finite number); thus by knowing one number you can infer estimates on the remaining population (this is the same principle on which counting cards in blackjack is based).

Secondly the law of averages still applies in uniform distributions. The mean will still be 50.5 given enough samples; thus if your first envelope is less than this number the most rational decision is to select the second one. Yes you may lose even more by doing this (say you picked $47 in your first envelope and $12 in your second) but on average if you always pick based on this rule you will come out slightly better off.

Now I can't be bothered working out the impact of these effects, but the population is big enough for them to not mean much; I still wouldn't pay any more than $50.50.


Hey, you're back. I'm gonna have to find excuses to bump my previous math/statistics related threads now.

Your strategy is correct for the uniform case, but you're missing a big factor in how that strategy affects your estimated payoff. You should be willing to pay more.
Write a script if you don't believe me (I wrote a short one to convince myself ... wait do you write code? I forget)
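A minimal version of such a script, in Python, assuming iid uniform whole-dollar amounts and the switch-at-$50-or-less rule:

```python
import random

# Simulate: peek at the first envelope, switch if it holds $50 or less.
# Assumes both amounts are independent uniform draws from $1..$100.

def play_once(rng):
    first = rng.randint(1, 100)    # amount seen in the opened envelope
    second = rng.randint(1, 100)   # amount in the other envelope
    return second if first <= 50 else first

rng = random.Random(0)             # fixed seed for repeatability
runs = 100_000
average = sum(play_once(rng) for _ in range(runs)) / runs
print(f"average payoff over {runs} runs: ${average:.2f}")
```

On a run this size the average comes out near $63, comfortably above the $50.50 a blind pick would earn.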

Re: Let's play a game - part 2

PostPosted: Mon Nov 12, 2012 10:23 pm
by Lootifer
Interesting; instinct tells me it's not much, but I'll believe you :)

And I don't really code much, but wait, I can do this empirically through a spreadsheet... BRB

Re: Let's play a game - part 2

PostPosted: Mon Nov 12, 2012 10:26 pm
by Lootifer
Dizzam! Makes a way bigger difference than I expected; no way a casino would ever take this up unless it was loaded in some fashion.

For those interested:
[spoiler: spreadsheet results]

Re: Let's play a game - part 2

PostPosted: Mon Nov 12, 2012 10:45 pm
by Haggis_McMutton
Lootifer wrote:Dizzam! Makes a way bigger difference than I expected; no way a casino would ever take this up unless it was loaded in some fashion.

For those interested:
[spoiler: spreadsheet results]


Yeah, that's about what I'm getting. On 100k runs you usually get about $62.4.
This is actually more than I expected. With my initial calculation I was expecting about $58. Not sure where the difference is coming from.
Cool! This is why I post these things. Thought I knew what was going on, but it turns out I don't exactly.

So yeah, the interesting questions for anyone who cares seem to be:
Why do you get this sum? (trying to figure this one out myself) and
How can this be best adapted to non-uniform distributions?
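On the first question: under the iid-uniform assumption the sum splits cleanly into a keep part and a switch part. You keep exactly when the first envelope shows $51-$100 (half the time, averaging $75.50 over those cases), and otherwise you switch into a fresh independent draw worth $50.50 on average:

```python
# Split the expected payoff of "switch if the first envelope shows <= $50",
# assuming both amounts are independent uniform draws from $1..$100.

keep_part   = sum(range(51, 101)) / 100   # E[first, counted only when kept] = 37.75
switch_part = 0.50 * 50.5                 # P(switch) * E[second draw]       = 25.25
print(keep_part + switch_part)            # 63.0
```

That gives $63.00 exactly, so the simulated $62.4 looks to be in the neighbourhood of this figure.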

Re: Let's play a game - part 2

PostPosted: Tue Nov 13, 2012 12:14 pm
by AndyDufresne
So much maths. Haggis, I need your phone number in case I need to phone a friend.


--Andy