
Game theory that shouldn't work.


posted on Jun, 26 2014 @ 08:21 PM
Imagine that there are two blindfolded players, player A and player B, and a dealer. The dealer has a pile of brass coins and a pile of zinc coins. The dealer randomly puts two zinc coins and one brass coin into a holder, which holds the coins in numbered slots 1, 2 and 3. Player A is asked to choose a slot at random. In this example, for the first trial, he chooses slot number 1. The dealer then removes one of the zinc coins from the remaining slots. Player A now changes his mind and chooses the remaining coin instead of his first choice. Player B is given the coin player A first picked. After a run of a hundred trials it consistently turns out that player A ends up with 60 brass coins and player B ends up with 40.

In James May's programme, where this was demonstrated, they used cans of beer held to their heads to jazz it up: two were shaken and one was left unshaken, and after making their choices they had to open the can and hold it to their heads.

Isn't this a bit weird? The two blindfolded players should actually have a 50/50 chance, and over that number of trials the result should end up around 50/50, not 60/40.

The rationale given was this: player A's first choice gave him only a 33-and-a-third percent chance of picking the brass coin, the nice shiny one let's say. Player A's choice left the dealer with either one or two zinc coins, of which he removed one zinc coin. By changing his choice to the only coin that was left, and letting player B have what he first picked at random, player A leaves B with the 33% chance and keeps the 66% chance of the shiny coin for himself. Which is what happens in all the trials on a constant basis. But hang on, both were blindfolded, and both should have gone home with an equal amount of the shiny coins. Pick the bones out of that, and if there are any other weird betting strategies, let's use them advantageously.
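For anyone who would rather test the claim than argue about it, here is a minimal Python sketch of the game as described above. The function name and the 100,000-trial count are my own choices, and I am assuming the dealer always knows where the brass coin is, only ever removes a zinc coin from the slots player A did not pick, and that player A always switches. Under those assumptions the switcher ends up with the brass coin about two thirds of the time, in the same ballpark as the 60/40 split reported above.

import random

def play_round(switch=True):
    slots = ["zinc", "zinc", "brass"]
    random.shuffle(slots)                      # dealer loads the three slots at random
    first_pick = random.randrange(3)           # blindfolded player A picks a slot
    # the dealer removes a zinc coin from one of the slots A did not pick
    removable = [i for i in range(3) if i != first_pick and slots[i] == "zinc"]
    removed = random.choice(removable)
    if switch:
        # A moves to the only slot that is neither his first pick nor the removed one
        final_pick = next(i for i in range(3) if i not in (first_pick, removed))
    else:
        final_pick = first_pick
    return slots[final_pick] == "brass"

trials = 100_000
wins = sum(play_round(switch=True) for _ in range(trials))
print(f"Player A (always switching) got the brass coin in {wins / trials:.1%} of rounds")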




posted on Jun, 26 2014 @ 08:33 PM
a reply to: anonentity

No. It is not weird.

It is math.



Edit: Come to think of it, those are not mutually exclusive :-D (But really, it is just math!)
edit on 26-6-2014 by DupontDeux because: (no reason given)



posted on Jun, 26 2014 @ 08:44 PM
a reply to: anonentity

"Isn't this a bit weird? as the two blindfold players should actually have a 50/50 chance and the number of trails should end up around 50/50? and not 60/40."
Not at all. You see, the 50/50 is the odds. Ideally, that would mean, yes, both players would end up with the same amount. Realistically, this isn't so: it's still going to land on one side more than the other. Try it for yourself. Pick up a quarter, or any other coin for that matter, and flip it 20 times. Tell me what you get.
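If you do not have a quarter handy, here is a throwaway Python sketch of that experiment (the 20 flips and the five runs are just illustrative numbers I picked): each run of 20 flips rarely comes out exactly 10 and 10.

import random

# flip a fair coin 20 times, five separate times, and see how often it lands exactly 10/10
for run in range(1, 6):
    heads = sum(random.random() < 0.5 for _ in range(20))
    print(f"run {run}: {heads} heads, {20 - heads} tails")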



posted on Jun, 26 2014 @ 08:50 PM
Probability is only absolute when an infinite number of trials is run. One of the first things I learned in statistics. That's where things like outliers and standard deviation come into play when graphing outcomes.
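A rough illustration of that point, assuming a plain 50/50 event (the trial counts below are arbitrary): the observed proportion drifts toward the true probability as the number of trials grows, but it never has to land on it exactly.

import random

true_p = 0.5
for n in (10, 100, 1_000, 10_000, 100_000):
    hits = sum(random.random() < true_p for _ in range(n))
    print(f"{n:>6} trials: observed proportion {hits / n:.3f}")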



posted on Jun, 26 2014 @ 08:56 PM
The first player has a 33.3% chance of getting a brass coin, but the second player has either a 50% or a 0% chance, depending on the outcome of the first pick. Allowing player A to change his choice doesn't really change the outcome.

Can you provide a video demonstration of the game, because I'm not sure that I understand the mechanism? How is a brass coin selected in 100% of the rounds? It would seem that both players would end up with a zinc coin some of the time.



posted on Jun, 26 2014 @ 09:00 PM
a reply to: LucidWarrior

True! But that's not really the spirit of the problem. There is math to support that this outcome will almost never end up 50/50, whereas if you flip coins, they pretty much are 50/50. This guy thinks otherwise, but he is still only saying that aerodynamics make it about 51/49 in favor of heads-up.

That's beside the point, though. Examine the problem again in the OP and think about the three coins in the initial pool: the probability of randomly selecting brass and the probability of randomly selecting zinc are not equal, and each decision leads to two different outcomes… This is a good indicator off the bat that it is not 50/50.
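Worked out exactly, as a small sketch of that reasoning (the variable names are mine): the first pick is brass with probability 1/3 and zinc with probability 2/3, and switching simply swaps which of those two numbers you end up holding.

from fractions import Fraction

p_first_pick_brass = Fraction(1, 3)    # one brass coin among three slots
p_first_pick_zinc = 1 - p_first_pick_brass

# if the first pick was zinc, the dealer must remove the other zinc, so switching wins
p_switch_wins = p_first_pick_zinc
p_stick_wins = p_first_pick_brass
print(p_stick_wins, p_switch_wins)     # prints 1/3 2/3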



posted on Jun, 26 2014 @ 09:03 PM
a reply to: Lynk3

That is not correct. The more data you get, the more accurately you can predict it; however, playing with "infinity" complicates things. In fact, in some cases, the more data you have, the more inaccurate its predictive qualities become, mainly due to things like outliers and anomalies. Plus, in math, ESPECIALLY statistics, defining infinity is a royal pain. When using a t-distribution or a small-sample analysis, many statisticians treat 30 as being close enough to infinity… in which case, 30 trials or sample selections is very easy to do whilst testing.
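For what it is worth, that "30 is close enough to infinity" rule of thumb is easy to eyeball; this sketch assumes SciPy is installed and just compares two-sided 95% critical values. By around 30 degrees of freedom the Student t value is already very close to the normal one.

from scipy.stats import t, norm

for df in (5, 10, 30, 100):
    print(f"df={df:>3}: t critical value (97.5th percentile) = {t.ppf(0.975, df):.3f}")
print(f"normal critical value (97.5th percentile) = {norm.ppf(0.975):.3f}")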



posted on Jun, 26 2014 @ 09:04 PM
This is a famous problem, and it has a name. It is called "The Monty Hall Problem" and it even has its own Wikipedia page.

en.wikipedia.org...

Should you stick with the door you selected? Or should you switch? (You have to know the TV show "Let's Make A Deal" to get that reference...)

So -- I have thought about this literally for years, and can give you an example that definitely clears up why, when one of your choices is narrowed down from some number to only two, you should always switch. If anyone is interested, I can post this explanation -- but I have a feeling this excellent thread is too technical, and probably won't last long.

That's unfortunate --

edit on 26-6-2014 by Axial Leader because: striving for perfection



posted on Jun, 26 2014 @ 09:16 PM
Could it have something to do with the methods/orders in which the dealer placed the coins during the trials?



posted on Jun, 26 2014 @ 09:24 PM


Here you go. The Monty Hall problem in a little under 5 minutes. One choice ends up winning about twice as often as the other.



posted on Jun, 26 2014 @ 09:32 PM
a reply to: PhysicsAdept

I didn't mean do an infinite number of trials. I meant that the more trials you do, the more balanced the outcomes tend to be. I should've made that clearer. I understand that infinity complicates things, because you can't calculate a concept. As for flipping a quarter, I do believe the minuscule weight difference between the head and the eagle logos plays a part in how gravity rotates the coin in mid-air.
edit on 0146k3 by Lynk3 because: (no reason given)



posted on Jun, 26 2014 @ 09:42 PM
a reply to: Lynk3

No, but you did say:



Probability is only absolute when an infinite number of trials is run.


I am saying that is erroneous. Probability can be absolute, or at least can be calculated with spectacular accuracy under special circumstances. If I say, pick a number between 1 and 100 at random, there is a 1 percent chance you will pick 37; that's just how probability works. Now, in a real situation where we do not know the possible outcomes, we measure it using statistics. We take a sample and use it to predict samples in the future. Again, in some cases you can literally pretend an infinite number of trials is 30 trials (odd, I know; I can provide a source if needed… I remember this from a stats class I took, and I'd have to reference something I have not yet read if you need one.)

As for calculating a concept: again, it's quite easy. Research the concept of limits, or anything in calculus for that matter, to see how infinity is actually used in calculations all the time.
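The "pick a number between 1 and 100" example is easy to check empirically too; a throwaway sketch, assuming a uniform random pick (the one-million-trial count is arbitrary):

import random

trials = 1_000_000
hits = sum(random.randint(1, 100) == 37 for _ in range(trials))
print(f"picked 37 in {hits / trials:.2%} of {trials} trials (theory says 1.00%)")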



posted on Jun, 26 2014 @ 09:51 PM

originally posted by: LuXiferGriM
The first player has a 33.3% chance of getting a brass coin, but the second player has either a 50% or a 0% chance, depending on the outcome of the first pick. Allowing player A to change his choice doesn't really change the outcome.

Can you provide a video demonstration of the game, because I'm not sure that I understand the mechanism? How is a brass coin selected in 100% of the rounds? It would seem that both players would end up with a zinc coin some of the time.


They don't get a brass coin every time; it's the percentage of rounds in which each player gets the brass coin. A gets it 60% of the time and B 40%, ish.



posted on Jun, 26 2014 @ 09:53 PM
a reply to: anonentity

This is a variation of the Monty Hall problem.

In this case, the dealer is biased, just like Monty.



posted on Jun, 26 2014 @ 09:56 PM
a reply to: Axial Leader

The key to the Monty Hall problem is that Monty is not randomly opening a door. He's biased.



posted on Jun, 26 2014 @ 09:57 PM
"the dealer then removes one of the zinc coins left in the remaining slot"

Although the players are blindfolded, the dealer is not, and so he can artificially reduce player A's chances of ending up with a zinc coin by removing one that he can see.

If player A originally picks the brass coin, P(1/3), the dealer is free to remove either of the two remaining zinc coins, and switching lands A on the zinc coin that is left.

If player A originally picked a zinc coin, P(2/3), the dealer must remove the other zinc coin, leaving only the brass coin, P(1), for the switch. That gives overall odds of 2/3 of getting the brass coin by switching.

So it is always beneficial to switch your original choice, because in two thirds of cases switching guarantees the brass coin.
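That case analysis can be checked by brute force, enumerating every equally likely layout and first pick. This is only a sketch of the argument above, and it assumes the dealer always removes a zinc coin from the unchosen slots; which of two available zinc coins he removes makes no difference to the result.

from itertools import permutations

switch_wins = stick_wins = total = 0
for slots in set(permutations(["brass", "zinc", "zinc"])):   # 3 distinct layouts
    for first_pick in range(3):                              # 3 equally likely first picks
        total += 1
        stick_wins += slots[first_pick] == "brass"
        # dealer removes a zinc coin from the other slots; the last slot is the switch target
        others = [i for i in range(3) if i != first_pick]
        removed = next(i for i in others if slots[i] == "zinc")
        switched_to = next(i for i in others if i != removed)
        switch_wins += slots[switched_to] == "brass"

print(f"stick wins {stick_wins}/{total} cases, switch wins {switch_wins}/{total} cases")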



posted on Jun, 26 2014 @ 10:05 PM
a reply to: PhysicsAdept

I remember the 30 rule, but let's just say, in that particular high school course, the students had to teach the teacher how to teach us statistics. Not because of his teaching methods but because he didn't know how to teach statistics. Thank you for the correction with specific examples.



posted on Jun, 26 2014 @ 10:34 PM
Nice! There are some good answers here.

This is what I've discovered -- the Monty Hall Problem becomes a lot easier to understand if you are working with larger numbers.

Picture a situation where you have 100 cups face down -- a diamond is hidden in one of the cups. You can keep the diamond if you guess right.

You pick cup 88 -- at random. You have only one chance in a hundred of keeping that diamond!

Now the dealer turns over all the other cups except one -- cup 25 is left face down -- all the other cups are shown to be empty.

Now you see two cups face down -- cup 88, which you initially picked, and cup 25, which the dealer left face down.

Should you stick with your original guess? Or switch?

If you decide to stick, you aren't paying enough attention. Obviously the diamond is under cup 25 -- THAT IS WHY THE DEALER LEFT THAT CUP FACE DOWN! You should definitely switch.

You see -- when you switch, all you are doing is gambling that your first guess was WRONG! This is a good bet, since with three cups you have only one chance in three of guessing right on your first pick.

In the case of 100 cups, it becomes even more obvious -- you only had one chance in 100 (1% chance) of guessing correctly on your first guess, so there is a 99% chance that you guessed wrong, and the diamond is under the cup the dealer picked.

It is a great bet that if you switch, you will get the diamond.
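The cup picture is easy to simulate for any number of cups; here is a quick sketch (the function name and trial count are mine, and the dealer is assumed never to turn over the diamond cup or your cup). With 3 cups switching wins about 67% of the time, and with 100 cups about 99%.

import random

def switch_wins(n_cups):
    diamond = random.randrange(n_cups)
    first_pick = random.randrange(n_cups)
    # the dealer turns over every other cup except one, never revealing the diamond
    kept = diamond if first_pick != diamond else random.choice(
        [i for i in range(n_cups) if i != first_pick])
    return kept == diamond          # switching wins exactly when the first pick was wrong

for n in (3, 100):
    trials = 100_000
    wins = sum(switch_wins(n) for _ in range(trials))
    print(f"{n:>3} cups: switching wins {wins / trials:.1%} of the time")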



posted on Jun, 26 2014 @ 10:37 PM

originally posted by: ImaFungi
Could it have something to do with the methods/orders in which the dealer placed the coins during the trials?


Let's try and up the ante a bit.

Instead of a dealer, let's have a random number generator. The generator has three screens, and after it is switched on the number "1" will stop on screen 1, 2 or 3. The "1" that is travelling between screens 1, 2 and 3 represents the brass coin, and the two empty screens represent the two zinc ones.

Let's get rid of player B, who does nothing, and turn player A into another random generator, programmed to pick the number 1, 2 or 3, representing the dealer generator's three screens on which the "1" might appear. Then, when the dealer generator has removed a screen from the game, which represents removing a zinc coin, generator A switches its choice.

What we have done is change the dealer into the "dealer generator", and player A is playing the house, so to speak. If random number generator A had simply been guessing which screen the "1" would fall on, it would have got it right about 33 times in a run of a hundred. Under this scenario it gets it right about 66 times in a run of a hundred.

Then change the question to: how can you score more than 50% hits against a random number generator when guessing which screen the number 1 will appear on? Answer: do it this way, with three screens, by giving yourself a 2/3 chance and not 50/50.
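That restatement translates into code almost word for word; a rough sketch, assuming both "generators" are just uniform random picks and the dealer generator always blanks a losing screen that generator A did not pick (the function name is mine):

import random

def hits_per_hundred(switch):
    hits = 0
    for _ in range(100):
        winning_screen = random.randrange(3)    # where the "1" will appear
        guess = random.randrange(3)             # generator A's first pick
        blanked = random.choice([i for i in range(3)
                                 if i != guess and i != winning_screen])
        if switch:
            guess = next(i for i in range(3) if i not in (guess, blanked))
        hits += guess == winning_screen
    return hits

print("without switching:", hits_per_hundred(False), "hits in 100")
print("with switching:   ", hits_per_hundred(True), "hits in 100")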



posted on Jun, 26 2014 @ 11:20 PM
a reply to: PhysicsAdept

This is where Bayes' Law comes into play, because in the real world, there is no such thing as a fair coin or a fair toss.
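A toy illustration of that Bayesian view, entirely my own sketch rather than anything from the thread: start from a uniform Beta(1, 1) prior on the coin's heads probability and update it flip by flip. The posterior mean drifts toward whatever the coin actually does instead of assuming a perfectly fair 50/50.

import random

alpha, beta = 1.0, 1.0     # Beta(1, 1): a flat prior on P(heads)
true_p = 0.51              # a slightly unfair coin, purely as an example

for flip in range(1, 1001):
    heads = random.random() < true_p
    alpha += heads             # count of heads seen so far (plus the prior's 1)
    beta += not heads          # count of tails seen so far (plus the prior's 1)
    if flip in (10, 100, 1000):
        print(f"after {flip:>4} flips: posterior mean P(heads) = {alpha / (alpha + beta):.3f}")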


