Probabilities are a wonderful business tool, but they are alternately under-used or mis-used. Today, however, I have a quiz question for folks, a probability question that entertained me this morning.

Someone recently gamed the Missouri lottery. It is a Pick-3 game, which in Missouri’s case means you pick three digits between 0 and 9 to create your entry. You pay a dollar for the privilege, and if your entry is selected you win $600. The odds are against you: with 1,000 possible numbers to choose among, your expected payoff on a $1 bet is $0.60.
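That $0.60 figure follows directly from the prize and the number of possible entries. A minimal sketch of the arithmetic:

```python
# Expected value of a $1 Pick-3 ticket: one winning number out of 1,000,
# paying a $600 prize.
p_win = 1 / 1000
prize = 600
expected_payoff = p_win * prize
print(expected_payoff)  # 0.6
```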

But the lottery also has a bonus draw which changes things. It works like this:

… the lottery also has a special drawing from a bag with seven balls in it – six white and one orange – that determines whether a second, bonus set of winning numbers will be drawn that day.

If a white one is drawn, they leave it out of the mix and nothing special happens. But if they draw the orange ball, it triggers a second drawing, giving ticket holders a second chance to win.

As you can see, if you know that the orange ball is going to be drawn then the odds tilt in your favor. You can spend $1,000 to cover all 1,000 possible winning numbers, and then be assured of winning $600 on each of two drawings. Your total winnings would be $1,200, a guaranteed 20% gain.
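The same arithmetic, applied to the case where the bonus drawing is certain, shows the guaranteed 20% gain described above:

```python
# If the orange ball is certain to be drawn, covering every number
# guarantees a win in each of the two drawings.
cost = 1000           # $1 per entry, covering all 1,000 numbers
prize = 600
winnings = 2 * prize  # one $600 prize per drawing
gain = winnings - cost
print(gain, gain / cost)  # 200 0.2, i.e., a guaranteed 20% return
```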

And that is precisely what recently happened. After six days of drawing white balls, only the orange ball was left. An entrepreneurial fellow in St. Louis realized that he had a chance to make some money, so he pulled together friends and spent $23,000 on the lottery, netting $4,600 the next day when two numbers were drawn and the group won twice. (As a side note, there are various limits in the lottery to prevent people from busting the state by betting billions.)

So here is my question: What are the odds of this happening? What, in other words, are the odds that you go six days and have only the orange ball left? According to the state of Missouri, the odds are 1 in 4.3, but a simple analysis would suggest 1 in 7 (which can be thought of as the odds of not drawing the orange ball for six days, and then drawing it). Anyone want to point out the flaw in the analysis?
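For anyone who wants to check the simple analysis empirically, here is a minimal Monte Carlo sketch of the drawing process: shuffle a bag of six white balls and one orange ball, draw without replacement, and count how often the orange ball is the last one left.

```python
import random

# Simulate the bonus-ball process: balls are drawn without replacement,
# so "six white days then orange" means the orange ball is drawn seventh.
def orange_last(rng):
    balls = ["W"] * 6 + ["O"]
    rng.shuffle(balls)              # a random drawing order for the 7 balls
    return balls.index("O") == 6    # orange is the seventh (last) ball

rng = random.Random(42)
trials = 200_000
hits = sum(orange_last(rng) for _ in range(trials))
print(hits / trials)  # close to 1/7, about 0.1429
```

By symmetry, the orange ball is equally likely to occupy any of the seven positions in the drawing order, which is why the estimate lands near 1/7.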

The odds are 1 in 7… I have no idea what they’re smoking in the state of Missouri.

My guess is someone got creative with some math:

http://www.google.com/search?hl=en&lr=&ie=UTF-8&q=1%2F7+*+2%2F6+*+3%2F5+*+4%2F4+*+5%2F2+*+6%2F1+&btnG=Search

OK, this puzzled me, too, when I first read it.

Don’t the odds follow a progression, of sorts, i.e., the chance of drawing the orange ball on the first day is 1 in 7. But, on the second day it is 1 in 6, then 1 in 5, and so on.

So, for the orange ball to be the last ball left on the seventh day, the odds become 7 × 6 × 5 × 4 × 3 × 2 = 5,040, i.e., 1 in 5,040.

In other words, don’t the odds start fresh each day?

Or, am I way off base?