Math Oddities



Written by:sfabriz
Published by:Nightscript
Published on:2006-08-28 10:30:52
Topic:Maths
This article will show you some mathematical oddities that I find very interesting and curious.

What is a math oddity? Something counter-intuitive, something that you may not know, or just something particularly curious. We're going to see four examples from four different fields of mathematics:

  • Mathematics
  • Geometry
  • Probability
  • Arithmetic


Also, I will be leaving you with a mathematical problem that I like a lot.

Chapter 1: Do you know number 1?

All the time people talk about numbers: when they go shopping they have to pay, when they set an appointment they have to calculate time or date differences, or when they talk about their relatives and say "yes, I have eleven cousins." A very basic level of math is required to do things like these, and if I asked you "do you really know the number 1?", you would answer without any hesitation "of course!". So here's the question:

0.9 is (less than / equal to / greater than) 1?
The answer would be "less than". Nice work.

Another question:

1.1 is (less than / equal to / greater than) 1?

The answer would be "greater than". Nice work again.

But what would your answer be to this question:

0.999… is (less than / equal to / greater than) 1?

I'm sure that no one would answer "greater than", but I'm also sure that most of you would answer "less than", which is just as wrong as "greater than".
The correct answer, in fact, is "equal to".
You can't imagine how many people have told me that no, 0.999… is surely less than 1, that even if the difference between the two numbers is a very very small number, there must still be some difference.
I say: no difference! 0.999… and 1 are the same exact identical number.

Let's see why:

I need you to be familiar with the summation notation.
A summation (the capital Greek letter sigma, Σ) is a math symbol that represents a sum. The argument can vary depending on whether it's related to the index or not. The index is something that walks a discrete path between two bounds. For example, if I asked you to calculate the sum of the first ten numbers, you would write a program that goes from 1 to 10 and sums all the numbers.
In a programming language, something like:

sum=0;
for (i=1; i<=10; i++) {
   sum = sum + i;
}
print( sum );


Thanks to Mr. Gauss (who worked this out as a young schoolboy!) we know that sum will be 10*11/2 = 55. But that's not the point here.

If you have to write this in math symbols, this is what you write:

  sum = Σ (i = 1 to 10) i = 55
Now let's go back to our problem:

You surely agree with the fact that:

  0.999… = 0.9 + 0.09 + 0.009 + …

and you can write the summation in a short form like this:

  0.999… = Σ (i = 1 to ∞) 9/10^i

also we know that:

  Σ (i = 1 to ∞) x^i = x/(1 − x)   for |x| < 1

and hence:

  Σ (i = 1 to ∞) 9/10^i = 9 · (1/10)/(1 − 1/10) = 9 · (1/9) = 1

and concluding:

  0.999… = 1
Nice, isn't it? Some of you may have already known this. Some of you maybe not.
The most common argument from those who disagree is that the summation goes to infinity, and hence its sum must be an approximation. This is a very common mistake, partly because 0.999… itself is something that involves the concept of ∞. Anyway, if you still aren't convinced of the truth of what I said, please ask your math teacher and you'll see!

Ready for the next chapter? Let's go!


Chapter 2: Big Circles!

This one I like a lot.
Consider the earth. You know that its shape is like a sphere, just a little compressed at the poles. You surely know the equator, the circumference running around the middle of the earth, perpendicular to its axis. Its radius is about 6378.135 km; let's say 6378 km, okay? The exact value is not important for our purposes here. Now, to find how long the equator is we simply calculate the circumference from the radius with the simple formula C=2πr, r being the radius and π being the very famous number 3.141592… (the Greek letter pi).

The equator circumference is C=2*π*6378=40074 km. About 40000 km. It's a loooooong way isn't it?

Let's do some geometry. Imagine you have a very, very big sheet on which you can draw this circumference. It's 40000 km, so the sheet must be enormous. Now that you have drawn this one, let's draw another with a different length. Instead of 40,000,000 meters, make the second one 40,000,001 meters: just one meter longer than the first. Be careful to pick the same center, so the two circumferences are concentric, which means the distance between them is constant. Having added a single meter, the second circumference is a little larger than the first one. Since you cannot find a sheet that big, draw them in your mind and think: how large would the gap between them be? Could you fit an atom, a needle, or a cat through it? Remember, you just added one meter to a circumference 40 million meters long, so presumably the gap between the two circumferences won't be that big, hehe.
Surprisingly, a cat will be able to pass between them. It's not magic! Let's see why.

Let's call C the equator circumference (40,000,000 meters long) and C' the modified one (40,000,001 meters long). Let's calculate back the two radii r and r'.

Inverting the formula given above you get that r=C/(2*π) and obviously r'=C'/(2*π).

Do the math and you'll get r = 40000000/(2*π) = 6366197.72 meters and r' = 40000001/(2*π) = 6366197.88 meters.

The difference between the two radii, and hence the distance between the two circumferences, is about 16 cm! That easily allows a cat to pass between them! Surprised? I guess so! I had a face like a monkey the first time I saw this. The mind tends to assume that adding just one meter to a very, very large circumference can't produce that great a change, but it does. The key to understanding this is that the relation between the circumference and the radius is linear, so you can write the formula this way:

C' = C + 1  =>  r' = C'/(2*π) = (C+1)/(2*π) = r + 1/(2*π) ≈ r + 0.16 m

Nice, huh? Want some more? Let's proceed.


Chapter 3: Flip a coin!

Many, many times I have heard people talk about chances and probability. I too used to talk about probability a lot, until I studied it and understood that I didn't know anything about it. In fact, most of the time when we predict an event we are wrong, because our intuition for probability is quite bad.
An example? Have you ever played the lottery? hehe...

Ok, let's pick an easy example of how the human mind sometimes gets it wrong.

Have you ever played a football match? (by the way... WORLD CHAMPIONS!!!)
Well, when you start a match you usually flip a coin to decide who gets the ball first. The two captains each choose a side: Tail or Head. If the result is Tail, the captain who chose Tail gets the ball; otherwise the other one does. Very easy.
A perfectly balanced coin, flipped by a skilled flipper, has a 50% chance of falling on Tail and a 50% chance of falling on Head.

In probability jargon we will say that p[T]=0.5 and p[H] = 0.5 = (1-p[T]).
T and H are the two events (coin fallen on Tail - coin fallen on Head) and obviously their probability is the same.

If p[A]=0 then A is called impossible event and is never going to happen.
If p[A]=1 then A is called certain event and it's the only thing that can happen.

So when you divide the omega-set (the set containing the events that can happen), the sum of the probabilities assigned to all the events must be 1. That's why, when we consider just one event (and its opposite), if I say that an event E has a 20% probability of happening (p[E]=0.2), you instantly know that it also has an 80% probability (0.8) of not happening.
If you have a 40% probability of winning a game, then you have a 60% probability of losing it.

Now, if you flip a coin 1000 times it's very likely that you'll get about 500 Heads and 500 Tails. Well, you'll probably get something like "about 500" Heads and (1000 − about 500) Tails, or so: not a perfect result, but something quite close to it. This is because, the probability of getting a Head being 0.5, if you flip 1000 times you expect about 500 Heads. That's right, and in fact it works.

Now, you pick your perfectly balanced coin, start flipping, and incredibly you get 500 Heads in the first 500 flips. This is a very rare event, but it can nonetheless happen. The chance of it happening is 2^(-500), which is a very, very small number.
I ask you: what is the probability of getting Tail in the remaining 500 flips?

Now you think a bit and come up with this reasoning: if I flip a coin 1000 times, it's very likely I'll get about 500 Heads and 500 Tails. So, to reach that result, the probability of Tail coming up now must be very high, since I expect a balanced result at the end of the 1000 flips. Resuming: the probability of Tail coming up now is much higher than 0.5.

That's a common line of reasoning, but it's wrong. It's true that when you flip a coin 1000 times you get about 500 Tails and 500 Heads, more or less, but that doesn't affect our situation here. Coin flips are in fact independent events, each one from the others, and whatever result you have in hand now, the probability of getting a Tail on the next flip is the same as before: 50%. Every flip has p[T]=0.5. Hence, you'll probably end up with something around 750 Heads and 250 Tails when you finish the 1000 flips: having 500 Heads already, you have 500 flips left, and they will likely split around 250 Heads and 250 Tails, giving you around 750 Heads and 250 Tails at the end.

What is wrong in the common reasoning is that we treat the 1000 flips as somehow dependent on one another, but they are not! A coin doesn't know what results came before the nth flip, and those results can't tell you what you'll get on the next one.

Here in Italy we have a lottery game called lotto (and probably elsewhere too...). There are 10 cities on the sheet, and every city gets 5 numbers. Since the numbers are drawn from a bucket containing 90 numbers (from 1 to 90), 10x5=50 numbers come out during a game, while the other 40 don't. The next week the whole routine happens again: another 50 numbers are drawn randomly from the bucket and distributed over the cities. You bet on some numbers coming out for a specific city (or you bet without specifying the city, but the payout is much lower). Depending on how many numbers you guess correctly, you win a prize. If you guess one number correctly you win (let's use fake numbers here) 10 x what you bet; if you guess 2 numbers correctly, 20 x what you bet, and so on.

Some numbers are called late-comers because they haven't come out for many, many weeks. You can't imagine how many people base their play on the idea that, since those numbers are late, they must have a higher probability of coming out than the others. This is totally wrong! Every number has the same probability of coming out, and numbers don't remember whether they were drawn in the last game, because one week's extraction is completely independent of all the others. But people think it isn't, and bet most of their money on the late-comers.

The principle that proves this reasoning wrong is the same as with the coin. If the events really were dependent on one another, you could expect a specific result; but since they're not, it's very likely that you will just waste your money.

Ok, I think we're ready for the next one. Relax, this one is the easiest one.

Chapter 4: A square curiosity.

This is a nice curiosity that came up while I was talking to my uncle, who designs machinery and deals with numbers all day long.
He told me that when you square a positive number that ends in 5, the square is very easy to calculate: take the number without its final 5, multiply it by the next number up, and concatenate the result with 25.

Example: let's square number 35.
Consider just the number 3 (35 without the 5). Multiply it by the next number (4) to get 12. Concatenate it with 25 and you get 1225, which is indeed the square of 35.

Example 2: let's square 125
Consider just the number 12 (125 without the 5). Multiply it by the next number (13) to get 156. Concatenate it with 25 and you get 15625, which is indeed the square of 125.

I love these tricks! But I don't like to learn something without understanding why it works, so I came up with this little proof of this rule.

First of all, how can we describe mathematically a number ending in 5? We write it as 10*k+5, for any k in the natural numbers (0 included).
So if k==0 then 10*0+5=5.
If k==12 we get our friend 125, in fact 10*12+5=125, and so on.

Let's square this number:

(10*k+5)^2 = 100*k^2 + 100*k + 25
           = 100*(k^2 + k) + 25
           = 100*k*(k+1) + 25
and we're done!

Multiplying k*(k+1) is what we did when we multiplied the number (without its 5) by the next one. Multiplying by 100 and adding 25 is the left shift by two decimal positions and the concatenation of 25.

This trick is very easy, but nonetheless very nice I think.

Ok, now that your brain is well trained in math tricks, I will propose a very, very nice problem. I won't post the solution, and I ask you, please, when you solve it, not to post the solution here as a comment, so the other readers can think about it with their own heads.

But let's dig in...

Chapter 5: The missing euro.

Three guys walk into a hotel and ask for a room with three beds, just for one night. The man at the desk tells them the cost is 30 euros. They pay 10 euros each and walk upstairs to their room. After some minutes the man knocks at their door, apologizing for a mistake: he charged them the high-season price instead of the low-season one, and since it is now low season, he must give them 5 euros back.
They are happy and appreciate the man's honesty, so they tell him: "from these 5 euros we will just take back 1 euro each, and we will give you the remaining 2 euros as a tip for your honesty". The man thanks the guys and walks out.
Now, they paid 10 euros each (30 in total). They take back 1 euro each (3 euros in total). In this way they have paid 10-1=9 euros each, so in total they have now paid 9*3=27 euros. Plus the 2 euros tip given to the man we get 27+2=29 euros. Hey?! There is a missing euro here! Can you find it?

If you need help with this problem just drop me a mail (the mail address is specified in my account page).
Don't google it, don't ask for it, solve it!

Hope you enjoyed the trip.
Best wishes to you all guys,

sfabriz


Comments:
anilg
2006-08-28 10:36:06
Great article, as always :)

The part about probability I found very refreshing, reminded me of Scott Adams' stuff from God's Debris.

Nice work
anilg
2006-08-28 10:36:52
Oh.. and do I give out the solution of the honest man problem here??
sfabriz
2006-08-28 10:55:56
Thank you. For the solution, if you want, you can send me a mail. Not posting it here will result in a major enjoyment for the people who will be interested into solving it.
Cheers
innocentius
2006-08-28 14:48:39
Here's another classic that I feel obligated to contribute:

You're watching Monty Phan's "Let's Make A Deal." A contestant is faced with three garages - one has a new car, the other two have Eskimo Pies. The contestant chooses Garage #1. First, Monty Phan opens Garage #3; it contains Eskimo Pies. The contestant then says, "Wait! Can I switch?"

Believe it or not, by switching to Garage #2, he's increased his chances of winning. WHY?
paranoia
2006-08-28 15:19:41
hehe monty hall problem. everyone should write a program to monte carlo test it.
sfabriz
2006-08-28 17:40:35
I never heard of monty hall problem. Very nice I have to say.
kdemetter
2006-08-28 18:34:50
i wrote a program to test it (monty hall) .

but it doesn't seem to show that he has more chance of winning with #1 or #2 . But maybe i misunderstood it .

the program makes about a 1000 runs in which it randomly places the car in one of the boxes , and counts the number of times the car is in box #1 and #2
Quantris
2006-08-29 00:07:02
Monty Hall is usually stated like:

You pick one of the three doors. Then Monty opens one of the remaining doors to show that there is nothing behind it (or, an Eskimo Pie). Then you decide whether you should stick to your original door or switch to the last door. The important thing is that Monty always eliminates a bad door.
kdemetter
2006-08-29 02:23:01
ah thx .

i was wondering about that 3rd door in the example
Anonymous
2006-08-29 04:47:25
Well, it's a paradox, but you can seize this problem with a statistical analysis. You have a 1/3 chance of picking the car and a 2/3 chance of picking a goat. If you don't switch, knowing where a goat is isn't helpful and you stick to your 1/3 probability of winning the car. But if you switch, in case you chose a goat, the other one gets shown to you, so you then choose the car.
E.G.
1- choose goat 1 -> see goat 2 -> switch to the car
2- choose goat 2 -> see goat 1 -> switch to the car
3- choose the car -> see random goat -> switch to the other goat

As you see, when you switch, you win the car 2 times out of 3, which means with 2/3 of probability. If you don't switch and stick to your first choice you win the car just when you're lucky enough to pick it at first, with 1/3 of probability.
kdemetter
2006-08-29 08:43:44
well , i made some changes to my program , and it actually proves you're right .
jonoringading
2006-08-29 12:59:53
the Monty Hall problem was also mentioned in Marilyn vos Savant's column... and even though she has the highest IQ, there is a logic problem. as in the Anonymous goat examples, it is unfair to say that there is only one weight for item 3.
it should say:
3- choose the car -> see goat 1 -> switch to goat2
4- choose the car -> see goat 2 - switch to goat1
this way you are correctly back to 50-50.
if you think about it, you are still choosing between 2 doors, even if you stay where you are, which is the point of the coin flip warning...
kdemetter
2006-08-29 14:05:21
it's confusing for sure .

i wrote a program to try it out , and it shows that the success rate is higher when you switch , than when you stick with the same . ( about 2x , though it varies )

but of course , i could have made a mistake somewhere , it wouldn't be the first time :-)

kdemetter
2006-08-29 14:29:17
i put the source on my osidrive , so you can check if it's working correctly

http://www.osix.net/modules/folder/index.php?tid=12204&action=vf
innocentius
2006-08-29 14:32:10
Sorry about the omission. It's a little bit easier to understand if you use a deck of cards. Say you have to pick the Ace of Spades. You draw your card - a 1/52 chance of getting the ace. Then, someone throws away fifty of the other cards. Now it's clear that you should switch, because one of the two cards is an ace. Since the one you're holding has a 1/52 probability of being correct, the other must have a 51/52 probability of being correct, so it is better to switch. The Monty Hall problem is similar, but on a smaller scale, so it is more difficult to visualize.
paranoia
2006-08-29 15:20:22
jonoringading: no the odds actually are 2/3. splitting choice 3 into two different options is invalid, since the only difference between your 3 and 4 is the door monty opens, not the door you've chosen.

using your options, here is the full case list:

1 - choose goat 1 (1/3) -> see goat 2 (1/1) = switch to car (1/3 * 1/1) = 1/3

2 - choose goat 2 (1/3) -> see goat 1 (1/1) = switch to car (1/3 * 1/1) = 1/3

3 - choose car (1/3) -> see goat 1 (1/2 -- NOTE THE DIFFERENCE HERE) = switch to goat 2 (1/3 * 1/2) = 1/6

4 - choose car (1/3) -> see goat 2 (1/2 -- NOTE THE DIFFERENCE HERE) = switch to goat 1 (1/3 * 1/2) = 1/6

the coin flip warning only works for independent events, the two events of choosing a first door and switching are not independent, because monty is not picking his door to reveal at random. he will, with 100% odds open a losing door.

innocentius provides a nice example of a more extreme case.

and if you still don't believe us, then run kdemetter's code.
paranoia
2006-08-29 15:40:03
here's code (it hasn't been tested or compiled because I'm at work and my dev comp is being lame), it is nicely commented though

#include <stdlib.h>
#include <stdio.h>
#include <time.h>

#define NUM_ITER 10000

int main(int argc, char *argv[]) {
  srand(time(NULL));

  int winsWithSwitch = 0;
  int winsWithoutSwitch = 0;

  for(int q=0;q<NUM_ITER;q++) {
    // Randomly pick the winner
    int winner = rand()%3;

    // And randomly pick the first choice
    int firstChoice = rand()%3;

    int opened;

    for(int i=0;i<3;i++) {
      // eliminate the door if it isn't the winner and isn't the first choice
      if(i!=winner && i!=firstChoice) opened = i;
    }

    int secondChoice;
    for(int i=0;i<3;i++) {
      // second choice would be the door that isn't opened and isn't the first choice
      if(i!=opened && i!=firstChoice) secondChoice = i;
    }

    if(firstChoice==winner) winsWithoutSwitch++;
    if(secondChoice==winner) winsWithSwitch++;

    // Note now that there are only two options, and firstChoice != secondChoice (see code above),
    // and neither of the two equals opened (again, see code above). So either firstChoice or
    // secondChoice will be the winner. Nothing has changed about firstChoice since it was
    // initially chosen, so why would it automagically have better odds of winning once Monty
    // opened a losing door?
    //
    // ANSWER: It wouldn't, but secondChoice would, because it is essentially the two non-firstChoice
    // doors collapsed into one choice by Monty's revealing of a losing door.
  }

  // Proof of my claim
  printf("Wins without switching: %i\nWins with switching: %i\n", winsWithoutSwitch, winsWithSwitch);

  return 0;
}
sfabriz
2006-08-29 19:49:34
http://en.wikipedia.org/wiki/Monty_Hall_problem

this should clear your doubts.
cheers
anilg
2006-08-29 23:58:06
All those ints strewn everywhere gives me the creeps
paranoia
2006-08-30 14:08:21
hooray for wikiality!

anyway that's a good wikipedia article. should make it easier for me to explain the solution to people in the future.
jonoringading
2006-08-31 11:03:38
Sorry folks, I still don't believe you
int secondChoice;
for(int i=0;i<3;i++) {
// second choice would be the door that isn't opened and isn't the first choice
if(i!=opened && i!=firstChoice) secondChoice = i;
}

is not correct
the second choice is simply another random choice between the 2 unchosen doors, you are building in bias by not including the choice of sticking with the first door.
janus
2006-08-31 11:15:00
it is correct: secondChoice is the door that you would pick when you switch, firstChoice the one that you picked first.
paranoia
2006-08-31 18:06:41
janus is right, secondChoice is used to count wins if you DO switch, firstChoice is used to count wins if you DON'T switch. my code doesn't make a random choice, it counts how often you'd win if you stuck with one or the other of the two strategies for every single time you got to play the contest and then compares the two. see:

if(firstChoice==winner) winsWithoutSwitch++;
if(secondChoice==winner) winsWithSwitch++;
.
.
.
// Proof of my claim
printf("Wins without switching: %i\nWins with switching: %i\n", winsWithoutSwitch, winsWithSwitch)

kdemetter
2006-08-31 18:39:01
jonoringading .

there are only 3 doors .

- the first is randomly chosen .
- the bad door is chosen by taking a wrong door that isn't the first .

so there is only one left , and that's the second choice .

so the second door is not randomly chosen .

If all these arguments don't convince you , nothing will.
kdemetter
2006-08-31 18:40:29
in my previous post :
"second door " should be "second choice "
Anonymous
2006-09-02 03:35:59
it's pretty obvious..
before being revealed: the first door has 1/3 odds,
after being revealed, that door STILL has 1/3 odds, so obviously the remaining doors must share between them the remaining 2/3 odds.. you are given that one of them is 0/3, thus the 3rd door must be 2/3 odds... simple logic: forget all the induction and math :P
OzMick
2006-09-15 05:24:10
Understanding the problem is actually easier by flipping the premise of success, and extending the number of doors. If you have 50 doors, there is a 49/50 chance of picking a loser door, almost guaranteed. You are guaranteed to get to the final 2 doors, so stick with that door till the final 2. If you've managed to pick a LOSER at first, it gets inverted to a winner in the final 2. Well that is what changed my thinking on the issue, makes sense now. Assuming random chance *will* give you a 50-50 chance, but the tactic will give you a slight advantage if you stick with your choice till the last 2 and swap.
muzzy
2007-01-20 15:23:57
The 0.999...=1 thing is easier to prove without involving any math whatsoever. For real numbers, if two numbers aren't equal, there are going to be more numbers between them. So, if 0.999... was less than 1, you should be able to find some other number in between them. However, whatever you choose as the number between them, it's intuitively true that 0.999... is going to be greater than that. So, since there are no numbers in between them, it means they must be equal.
muzzy
2007-01-20 15:56:08
Monty Hall problem is lovely. It's important to remember that probability always applies to a specific source of knowledge, in this case the placement of prize over the three doors is what's random. After Monty opens one door, that source doesn't change. It's not uniformly over the two doors now, the original source of probability is still the same - over the three doors.

Now, as Monty opens the door, you have been revealed new information about the original source of probability. This new knowledge is not merely "this door was wrong one", the information you receive is "if either of these two doors is the right one, it's the other one". This is because the information isn't independent from the original source of probability.

Your original choice had 1/3 chance, the other two had 1/3 each. Since Monty reveals which of the other two you should've picked if either of them was correct, that door now holds the probability for both of the doors. This is because, again, the probability is related to knowledge that existed before any door was opened.

Ofcourse, if you always pick the leftmost door as your first choice, then you cannot know what the probability distribution was for the original doors. For this reason, you should always pick the first door randomly, through a true source of randomness such as throwing a dice. After that, it no longer matters which door the prize was placed behind, the Monty's source of randomness is modulated by your dice's source of randomness so each door will have a perfect 1/3 chance at the beginning. Assuming you have a perfect dice, anyway.

Remember kids. There are no random choices, there are no random numbers. There are only sources of randomness, sources of unknown information.
muzzy
2007-01-21 10:22:34
Anyway, the flip-a-coin thing above is a nasty one because the fallacy seems so obviously false, yet common people often fall for its variations. One particular example I remember from years ago: I was sitting in McDonald's and listening to other people chatting.

There had just been elections, and the votes were being counted, with one area having a delay. Two guys were arguing about how the delayed area is going to vote, and the logic was: "X is currently winning a little, but according to polls Y ends up winning a little, so a lot of people in that area are voting for Y. Really lot". I was amazed to think that anyone could do such a logical mistake.

Then again, I remember thinking like that about dice throwing when I was 8 or something. Throwing 6 three times in a row was unlikely, so OFCOURSE it must've been still equally unlikely to get 6 even if you threw it twice first. RPG games permanently fixed (most of) my misconceptions about probability, though. It's a good thing kids play a lot of games nowadays :)
sfabriz
2007-01-22 04:14:01
Well, it can be proved in several ways. Another one is:
x = 0.999…
then 10 * x = 9.999…
But 10x − x = 9x, hence
9x = 9.999… − 0.999… = 9
And since 9x = 9, x = 1 (and therefore 0.999… = 1)

The reason why I chose to demonstrate it with the summation is that I like that particular demonstration, which leaves no room for questions or disagreement.
Cheers
Anonymous
2008-07-14 00:19:12
sfabriz, your proof is wrong, 9x isn't 9, its 8.9 repeating.(9*0.999...)
lsharp
2009-10-11 22:25:57
37037037 times any multiplier from 3 thru 27 has interesting results
CodeX
2009-10-12 08:53:20
37037037 * 3 = 111111111
37037037 * 4 = 148148148
37037037 * 5 = 185185185
37037037 * 6 = 222222222
37037037 * 7 = 259259259
37037037 * 8 = 296296296
37037037 * 9 = 333333333
37037037 * 10 = 370370370
37037037 * 11 = 407407407
37037037 * 12 = 444444444
37037037 * 13 = 481481481
37037037 * 14 = 518518518
37037037 * 15 = 555555555
37037037 * 16 = 592592592
37037037 * 17 = 629629629
37037037 * 18 = 666666666
37037037 * 19 = 703703703
37037037 * 20 = 740740740
37037037 * 21 = 777777777
37037037 * 22 = 814814814
37037037 * 23 = 851851851
37037037 * 24 = 888888888
37037037 * 25 = 925925925
37037037 * 26 = 962962962
37037037 * 27 = 999999999
Domuk
2009-10-12 19:45:45
Hah, that is pretty cool.
MaxMouse
2009-10-13 13:12:57
37037037 * 28 = 1037037036
CodeX
2009-10-13 14:19:46
37037037 * 2 = 74074074
CodeX
2009-10-13 14:32:21
also noticed
37037037^2 * 3^2 = 12345678987654321
which is both a palindrome and nearly pandigital; also, when it's squared and multiplied by 81n^2 it gives repeaty results
37037037^2 * (81 * 1^2) = 111111110888888889
37037037^2 * (81 * 2^2) = 444444443555555556
37037037^2 * (81 * 3^2) = 999999998000000001
37037037^2 * (81 * 4^2) = 1777777774222222224
37037037^2 * (81 * 5^2) = 2777777772222222225
37037037^2 * (81 * 6^2) = 3999999992000000004
37037037^2 * (81 * 7^2) = 5444444433555555561
37037037^2 * (81 * 8^2) = 7111111096888888896
37037037^2 * (81 * 9^2) = 8999999982000000009
37037037^2 * (81 * 10^2) = 11111111088888888900
37037037^2 * (81 * 11^2) = 13444444417555555569
37037037^2 * (81 * 12^2) = 15999999968000000016
37037037^2 * (81 * 13^2) = 18777777740222222241
37037037^2 * (81 * 14^2) = 21777777734222222244
37037037^2 * (81 * 15^2) = 24999999950000000025
37037037^2 * (81 * 16^2) = 28444444387555555584
37037037^2 * (81 * 17^2) = 32111111046888888921
37037037^2 * (81 * 18^2) = 35999999928000000036
37037037^2 * (81 * 19^2) = 40111111030888888929
Then again doing anything with 9s of it tends to make something with repeaty qualities
Copyright Open Source Institute, 2006