# Thread: the two envelopes problem. My take. I want to know opinions)

1. ## the two envelopes problem. My take. I want to know opinions)

The two envelopes paradox

“You have two indistinguishable envelopes that each contain money. One contains twice as much as the other. You may pick one envelope and keep the money it contains. You pick at random, but before you open the envelope, you are offered the chance to take the other envelope instead.”
Two envelopes problem - Wikipedia, the free encyclopedia

This is a problem of expected value:

Suppose random variable X can take value x1 with probability p1, value x2 with probability p2, and so on, up to value xk with probability pk. Then the expectation of this random variable X is defined as

E(X) = x1·p1 + x2·p2 + ... + xk·pk

Since all probabilities pi add up to one (p1 + p2 + ... + pk = 1), the expected value can be viewed as the weighted average, with pi’s being the weights:

E(X) = (x1·p1 + x2·p2 + ... + xk·pk) / (p1 + p2 + ... + pk)

If all outcomes xi are equally likely (that is, p1 = p2 = ... = pk), then the weighted average turns into the simple average. This is intuitive: the expected value of a random variable is the average of all values it can take; thus the expected value is what one expects to happen on average. If the outcomes xi are not equally probable, then the simple average must be replaced with the weighted average, which takes into account the fact that some outcomes are more likely than the others. The intuition however remains the same: the expected value of X is what one expects to happen on average.

Expected value - Wikipedia, the free encyclopedia
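As a quick numerical illustration of the definition above (my own sketch; the die example is not from the thread):

```python
# Expected value of a discrete random variable: sum over value * probability.
def expected_value(values, probs):
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(v * p for v, p in zip(values, probs))

# Equally likely outcomes (a fair six-sided die): the weighted average
# collapses to the simple average, exactly as the quoted text says.
print(expected_value([1, 2, 3, 4, 5, 6], [1/6] * 6))  # ~3.5
```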

My contribution
On expected value:
E(x) = expected value
P = probability
V = value
EP = P1 + P2 + ... + Pn (the probabilities sum to 1)
EV = V1 + V2 + ... + Vn (the sum of all values)
E(x) = P1·V1 + P2·V2 + ... + Pn·Vn

V1 = currency value inside Envelope 1
V2 = currency value inside Envelope 2
Vx = expected currency value of a randomly chosen envelope (i.e., the expected value of either of the two envelopes)

EV= V1+V2
The problem can be set up in two ways: either V1 = 2·V2 or V1 = 0.5·V2. Whichever you assume, the total value of both envelopes must and will remain the same.
(1) EV= V1+V2
(2) If V1 > V2 then EV = 2X + X = 3X. If EV = 30 then X = 10, V1 = 20, V2 = 10.
(3) If V1 < V2 then EV = X + 0.5X = 1.5X. If EV = 30 then X = 20, V1 = 10, V2 = 20.

In other words, regardless of which equation for EV you use, you will find the values 10 and 20, which again add to 30; therefore the expected value should be 15.
Determining the expected value Vx

Vx = P1·V1 + P2·V2

Using equation (2): Vx = ½(2X) + ½(X) = 1.5X, and EV = 2X + X = 3X. If EV = 30 then X = 10, so Vx = 1.5X = 1.5(10) = 15.

Using equation (3): Vx = ½(X) + ½(0.5X) = 0.75X, and EV = X + 0.5X = 1.5X. If EV = 30 then X = 20, so Vx = 0.75X = 0.75(20) = 15.
Regardless of which equation you use or which envelope you pick, the sum of the values of both envelopes has to remain constant; in this sample, EV = 30 and Vx = 15.
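A quick numeric check of equations (2) and (3) with the sample total EV = 30 (a sketch; variable names are mine):

```python
EV = 30  # total money in both envelopes (the sample value used above)

# Equation (2): envelopes hold 2X and X, so EV = 3X.
X = EV / 3
vx_2 = 0.5 * (2 * X) + 0.5 * X

# Equation (3): envelopes hold X and 0.5X, so EV = 1.5X.
X = EV / 1.5
vx_3 = 0.5 * X + 0.5 * (0.5 * X)

print(vx_2, vx_3)  # 15.0 15.0
```

Both parameterizations land on Vx = 15, half the total, as claimed.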

The paradox
Vx = P1·V1 + P2·V2
Vx = ½(2X) + ½(0.5X) = 5/4 X
However, Vx ≠ 5/4 X,
because EV ≠ 2X + 0.5X = 2.5X.
EV = 3X or 1.5X depending on which equation is used (either equation 2 or 3).
The reason for the paradox is that, rather than using either equation (2) or (3), it uses an equation that does not equal the total value of both envelopes, because it combines a portion of equation (2) and a portion of equation (3) into a new one.
In other words, it assumes that EV = 2.5X,
which we know is not true.

If, rather than money values, we have two pills, a red pill and a blue pill, and we distribute them between two envelopes,
the expected content of a randomly chosen envelope is
(a) Vx = ½ Blue + ½ Red
(b) Total of colors = Blue + Red
The total of colors in (b) has to be present in the expected-value formula (a).
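The no-advantage claim can also be checked by brute-force simulation (my own sketch; the amounts 10/20 and the trial count are arbitrary):

```python
import random

# One round of the game: two envelopes, one holding twice the other.
# Returns (value of the envelope kept, value after switching).
def play(small=10):
    envelopes = [small, 2 * small]
    random.shuffle(envelopes)
    return envelopes[0], envelopes[1]

trials = 100_000
keep = switch = 0
for _ in range(trials):
    kept, other = play()
    keep += kept
    switch += other

print(keep / trials, switch / trials)  # both close to 15, half of 30
```

Keeping and switching average out to the same value, half the combined total.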

Let me know your opinions guys.

2. ## Re: the two envelopes problem. My take. I want to know opinions)

Another way to see the problem is that an amount of money has been distributed between two envelopes, one envelope holding 1/3 of the amount and the other holding 2/3 of the amount.

EV = (1/3)EV + (2/3)EV

Vx = ½·(1/3)EV + ½·(2/3)EV = ½·EV

Meaning there is no need to switch envelopes.

3. ## Re: the two envelopes problem. My take. I want to know opinions)

Originally Posted by Verdugo
Another way to see the problem is that an amount of money has been distributed between two envelopes, one envelope holding 1/3 of the amount and the other holding 2/3 of the amount.

EV = (1/3)EV + (2/3)EV

Vx = ½·(1/3)EV + ½·(2/3)EV = ½·EV

Meaning there is no need to switch envelopes.
The way you are looking at it here is from the view of someone who has not chosen either envelope, but in the problem the person has already chosen one envelope and must decide whether to switch. I think this is the main source of the paradox. When reading about the paradox, we think of it as if we haven't chosen either envelope yet, and it seems obvious that both envelopes have the same expected value; but when we try to resolve the paradox, we calculate it from the viewpoint of a person who has already chosen one envelope and is making a decision. Switching perspectives between calculating the expected value and judging whether the result makes sense is what creates the apparent contradiction, so we should not be surprised that we reach an irrational conclusion.

4. ## Re: the two envelopes problem. My take. I want to know opinions)

Originally Posted by Shakarri
The way you are looking at it here is from the view of someone who has not chosen either envelope, but in the problem the person has already chosen one envelope and must decide whether to switch. I think this is the main source of the paradox. When reading about the paradox, we think of it as if we haven't chosen either envelope yet, and it seems obvious that both envelopes have the same expected value; but when we try to resolve the paradox, we calculate it from the viewpoint of a person who has already chosen one envelope and is making a decision. Switching perspectives between calculating the expected value and judging whether the result makes sense is what creates the apparent contradiction, so we should not be surprised that we reach an irrational conclusion.

(a) If you assume the envelope you selected is 2X, the other envelope has to be X; it can't be X/2. Meaning Vx = ½(2X) + ½(X) = 1.5X, and since V1 + V2 = 3X,
Vx = ½(V1 + V2)

(b) If you assume the envelope you selected is X/2, the other envelope has to be X; it can't be 2X.
Meaning Vx = ½(X/2) + ½(X) = 0.75X, and since V1 + V2 = 1.5X,
Vx = ½(V1 + V2)

(c) If the envelope you selected is X, then you have to choose what the total of both envelopes is: either 2X + X or X + 0.5X.

If you say the envelope you selected is X and the total of both envelopes is 2X + X, then you are in case (a), explained above.

If you say the envelope you selected is X and the total of both envelopes is X + 0.5X, then you are in case (b), explained above.

Either way, Vx = ½(V1 + V2).
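Both cases can be verified in a couple of lines (a sketch with X = 1):

```python
X = 1.0

# Case (a): you hold 2X, the other envelope holds X.
v1, v2 = 2 * X, X
vx_a = 0.5 * v1 + 0.5 * v2   # = 1.5X
assert vx_a == 0.5 * (v1 + v2)

# Case (b): you hold X/2, the other envelope holds X.
v1, v2 = X / 2, X
vx_b = 0.5 * v1 + 0.5 * v2   # = 0.75X
assert vx_b == 0.5 * (v1 + v2)

print(vx_a, vx_b)  # 1.5 0.75
```

In each case Vx equals half the combined total, so the totals differ but the conclusion is the same.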

5. ## Re: the two envelopes problem. My take. I want to know opinions)

You calculated the expected value of each envelope, not the expected value of switching. What you currently have plays a role in the pay-off of switching.

6. ## Re: the two envelopes problem. My take. I want to know opinions)

Originally Posted by Shakarri
You calculated the expected value of each envelope, not the expected value of switching. What you currently have plays a role in the pay-off of switching.
I think I calculated the value of each envelope first. Then I calculated the expected value of switching, which is the average value of both envelopes.

In case (a), one envelope is 2X and the other envelope is X.

The expected value of switching envelopes = ½(2X) + ½(X) = 1.5X, and since the total of both envelopes is 3X, the expected value of switching is ½(3X). No need to switch, as the expected value is half the total anyway.

In case (b), one envelope is X and the other envelope is 0.5X.

The expected value of switching envelopes is ½(X) + ½(0.5X) = 0.75X, and since the total of both envelopes is 1.5X, the expected value is ½ of the total of both envelopes. No need to switch.

7. ## Re: the two envelopes problem. My take. I want to know opinions)

Your calculation is more the intuitive reasoning behind saying "it doesn't make sense to always switch to the second envelope".
Do you see that your calculation is just the reason why we say it doesn't make sense to always switch? There are two ways to evaluate the expected value of switching: your way, and the usual way shown in the paradox. We already know intuitively that there should be no reason to switch, but to debunk the paradox you also have to explain why the usual evaluation of switching is incorrect.

8. ## Re: the two envelopes problem. My take. I want to know opinions)

I want to elaborate on my point, but since we are replying quite rapidly I will post a new reply instead of editing, in case you are replying at this moment and miss my edit.

The problem posed can be evaluated in two ways.

1. If we picked X and we switch to 2X then the gain is X; if we picked X and we switch to X/2 then the gain is -X/2. The expected gain is then 0.5(X) + 0.5(-X/2) = X/4, therefore we should switch.

2. The total value in the envelopes is 3X/2. The expected value of each envelope is 3X/4; neither envelope has a higher expected value, so switching is pointless. Therefore we should not switch.

Both of these evaluations of the problem have their own proof, which I have summarized into two lines each. You have proven that it is futile to switch, but I have proven that it is worth switching. We both have proofs of conflicting statements, so one of us must be wrong. Proving your theory is not enough to disprove mine, as both proofs appear valid. You must find the error in one proof or the other in order to overcome the paradox.
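The two conflicting evaluations can be laid side by side with X = 1 (a sketch of the argument above):

```python
X = 1.0

# Evaluation 1 (the paradox): expected gain from switching away from X.
gain = 0.5 * X + 0.5 * (-X / 2)   # = X/4 > 0, so "switch"

# Evaluation 2: the total is fixed at 3X/2, each envelope is worth 3X/4,
# so switching changes nothing.
each = (1.5 * X) / 2              # = 3X/4 for both envelopes

print(gain, each)  # 0.25 0.75
```

Both computations are internally consistent, which is exactly why the error has to be found in one of the setups rather than in the arithmetic.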

Perhaps you hadn't seen your proof before and thought it brought new evidence to light, but generally only proof 1 is shown; proof 2 isn't shown because it should be obvious that one envelope is no better than the other, and it is informally accepted that proof 1 reaches a result that is not right.

9. ## Re: the two envelopes problem. My take. I want to know opinions)

Originally Posted by Shakarri
I want to elaborate on my point, but since we are replying quite rapidly I will post a new reply instead of editing, in case you are replying at this moment and miss my edit.

The problem posed can be evaluated in two ways.

1. If we picked X and we switch to 2X then the gain is X; if we picked X and we switch to X/2 then the gain is -X/2. The expected gain is then 0.5(X) + 0.5(-X/2) = X/4, therefore we should switch.
I will show that what you are describing is false.

Let's say there are 3 envelopes.

The envelope you pick has amount X (because you opened it).
One of the other two envelopes has 2X and the other has X/2 (you don't know which envelope has which amount, but you know both amounts exist).
Should you switch?
The average value of the other two envelopes (the expected value of switching) = ½(2X) + ½(X/2) = 5/4 X.

Since you have X, the expected gain from switching is 5/4 X − X = ¼ X, meaning you should switch.

This is the equation that you are using, which, as I have demonstrated, is for another problem altogether.
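The three-envelope variant described above can be simulated directly (a sketch; X = 10 and the trial count are arbitrary):

```python
import random

# You hold X; the other two envelopes hold 2X and X/2, one each.
def expected_gain(trials=100_000, x=10):
    gain = 0.0
    for _ in range(trials):
        other = random.choice([2 * x, x / 2])  # the envelope you switch to
        gain += other - x
    return gain / trials

print(expected_gain())  # close to x/4 = 2.5: in THIS game switching pays
```

Here the ½(2X) + ½(X/2) calculation really does apply, because 2X and X/2 both exist, unlike in the two-envelope game.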

10. ## Re: the two envelopes problem. My take. I want to know opinions)

Please don't take me as an annoying user trying to poke holes in your proof; I am just trying to show you why your addition does not solve the paradox.
The calculation I did may be equivalent to one from a different problem, but why do the problems have to be the same to share the same calculation? Your calculation is equivalent to the problem of having two envelopes in front of you and choosing which to open knowing you won't be able to switch; that does not invalidate your calculation.

11. ## Re: the two envelopes problem. My take. I want to know opinions)

Originally Posted by Shakarri
Please don't take me as an annoying user trying to poke holes in your proof; I am just trying to show you why your addition does not solve the paradox.
The calculation I did may be equivalent to one from a different problem, but why do the problems have to be the same to share the same calculation? Your calculation is equivalent to the problem of having two envelopes in front of you and choosing which to open knowing you won't be able to switch; that does not invalidate your calculation.
you are not annoying. I actually appreciate your responses.

If you think the envelope you have is X, there is a 50% chance the other envelope is 0.5X (in that case, you hold the bigger envelope), but then the total of both envelopes is X + 0.5X = 1.5X.
If you think the envelope you have is X, there is a 50% chance the other envelope is 2X (in that case, you hold the smaller envelope), but then the total of both envelopes is 2X + X = 3X.

What I'm trying to explain is that you need to decide what the total of the two envelopes is. That information can't be ignored when doing the calculation.

The funny thing about this false paradox is that the information about one envelope being more valuable than the other is useless, because even if one envelope held 0.99999 of the sum of both envelopes, either envelope would have an expected value of 0.5 times the combined total.
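That last point is easy to verify for any split of the total (a sketch; the fractions are arbitrary):

```python
total = 30.0
for fraction in (0.5, 2 / 3, 0.99999):
    v1 = fraction * total          # one envelope's share
    v2 = total - v1                # the other envelope's share
    vx = 0.5 * v1 + 0.5 * v2      # expected value of a 50/50 random pick
    print(round(vx, 6))            # 15.0 every time
```

However lopsided the split, a 50/50 random pick always averages out to half the total.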

12. ## Re: the two envelopes problem. My take. I want to know opinions)

It may be more intuitive to think of it this way. Suppose you choose an envelope, open it and find it contains $20. This means you know the other envelope has either $10 or $40 in it. Is it worth a $20 bet to take a chance of winning either $10 or $40 with 50:50 odds? The answer is yes; in fact it would be a break-even proposition if you knew the other envelope had either $0 or $40, so knowing that you will win at least $10 makes this a bet in your favor. So you take that bet, exchange your $20 for the sealed envelope, and on average you will get back (1/2)(10+40) = $25.

It's a different game if you don't know that you have the $20 in hand, i.e., if you don't open the envelope before deciding to switch. This is the scenario the OP followed, but it's not consistent with the game of the so-called paradox. Under this scenario there's a 1/4 chance that you are holding the $10, a 1/2 chance that you are holding the $20, and a 1/4 chance that you are holding the $40. The expected value is $22.50. If you switch envelopes the expected value becomes (1/4)(20) + (1/2)(1/2)(10+40) + (1/4)(20) = $22.50. So there is no advantage in switching.

So it matters knowing that you have the mid-value envelope in hand.
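ebaines's unopened-envelope scenario can be checked by direct enumeration (a sketch of that 1/4, 1/2, 1/4 model):

```python
# Two equally likely pairs: ($10, $20) and ($20, $40). You hold a random
# envelope from a random pair, so P(hold 10) = 1/4, P(20) = 1/2, P(40) = 1/4.
hold_ev = 0.25 * 10 + 0.50 * 20 + 0.25 * 40

# Switching: from 10 you always get 20, from 40 you always get 20,
# and from 20 you get 10 or 40 with equal odds.
switch_ev = 0.25 * 20 + 0.50 * 0.5 * (10 + 40) + 0.25 * 20

print(hold_ev, switch_ev)  # 22.5 22.5
```

Holding and switching have the same expected value in the unopened game, while opening the envelope and seeing $20 changes the calculation.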

13. ## Re: the two envelopes problem. My take. I want to know opinions)

Originally Posted by ebaines
So it matters knowing that you have the mid-value envelope in hand.
And yes, we know the mid value. The mid value (which is also the expected value) is the sum of both envelopes divided by the number of envelopes (2).

Expected value of any envelope = (value of Envelope 1 + value of Envelope 2 + ... + value of Envelope N) / N

Regardless of the proportions of money among the envelopes, regardless of how many envelopes there are, and regardless of whether we have switched or not.

I still think the confusion comes from combining two unrelated equations: X + 0.5X and 2X + X.

14. ## Re: the two envelopes problem. My take. I want to know opinions)

Combining both formulas
There is a game with 3 cards. The front of each card reads one of the following: X/2, X, or 2X. The back of each card has a unique color that lets only the host know which card is which.
Before the game begins, the host randomly removes one of these two cards from the game: X/2 or 2X. The participant has no knowledge of which of the two has been removed.
This means the two cards left in the game form one of the following two possible sets:
1) X/2, X
2) X, 2X

If the game is set 1, he wins only if he holds the card “X”.
If the game is set 2, he wins only if he holds the card “2X”.

The participant is asked to select one card at random, and he is allowed to switch it as long as he has seen neither the card he initially got nor the card left on the table.
Nor is the participant allowed to know which card was taken out of the game.

Should he switch? Let’s see

Which card does he have in his hand?
This is based on the chances of him holding each card.
Chances:
Card “X”: the chance of X being in his hand is 1/2, since this card is always in the game.
In the other 1/2 of the cases, he holds either X/2 or 2X, so each of those cards has a 1/4 chance of being in his hand.
So the expected card in his hand is given by: 1/2 (X) + 1/4 (X/2) + 1/4 (2X)

What is the winning value of each card?
X: “X” wins when the other card in the game is X/2 but loses if the other card is 2X, so X has a 50% chance of winning when it is in his hand.
X/2: this card always loses, so its winning value is 0%.
2X: this card always wins, so its winning value is 100%.

So what is the expected value of the card he has (his chance of winning the game)?

1/2 (50%) + 1/4 (0%) + 1/4 (100%) = 50%

So the chance of winning the game is 50%. No need to switch. In other words, there is no paradox.
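The card game can be simulated to confirm the 50% figure (my own sketch; trial count arbitrary):

```python
import random

def card_game(trials=100_000):
    wins = 0
    for _ in range(trials):
        removed = random.choice(["X/2", "2X"])         # host removes one card
        table = [c for c in ("X/2", "X", "2X") if c != removed]
        hand = random.choice(table)                     # participant's pick
        # "2X" always wins; "X" wins only when "2X" is out of the game.
        if hand == "2X" or (hand == "X" and removed == "2X"):
            wins += 1
    return wins / trials

print(card_game())  # close to 0.5, so switching gains nothing
```

By symmetry the card left on the table also wins 50% of the time, which matches the conclusion that there is no reason to switch.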