Take a look at Champernowne's Constant. It's a ridiculously easy sequence to make, and yet it fooled programs designed to root out underlying order in seemingly random numbers.

David Gawen Champernowne was born in 1912. While still an undergraduate, he published a deceptively simple number. Champernowne's Constant is formed by taking the sequence of whole numbers (1, 2, 3, 4, 5, and so on) and putting them behind a decimal point. So a long stretch of Champernowne's Constant looks like this:

0.123456789101112131415161718192021222324252627282930...

It's just the whole numbers in order with the commas between them removed. Champernowne's Constant is what mathematicians call a "normal" number, and that normality is the key to fooling early computers looking for patterns. Select any single digit from a long stretch of Champernowne's Constant, and there is a 10 percent chance of getting a 9. There is also a 10 percent chance of getting a 0, or any other digit.
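That even distribution is easy to check empirically. Here's a minimal Python sketch (the 100,000 cutoff is an arbitrary sample size) that builds a long prefix of the constant and tallies each digit:

```python
from collections import Counter

# Build a long prefix of Champernowne's Constant by concatenating 1, 2, 3, ...
digits = "".join(str(n) for n in range(1, 100001))

counts = Counter(digits)
total = len(digits)
for d in "0123456789":
    print(d, round(counts[d] / total, 4))
```

Each digit lands near 10 percent. In any finite prefix, 0 lags slightly behind the others because counting numbers never begin with 0, but the gap shrinks as the prefix grows.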

Now take a sample of two digits from any part of Champernowne's Constant. What will the result be? If someone were to look for the pair 41, how likely would they be to find it? Well, the number 41 occurs once among the numbers one through one hundred, and that pattern repeats every hundred numbers, so over a long stretch the pair 41 turns up roughly once every 100 digit positions. (Unless the computer were searching the specific, narrow section of Champernowne's Constant built from 410, 411, 412, 413, and so on, where 41 opens every number and shows up far more often.)
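The once-per-hundred estimate can be checked the same way, by scanning a prefix of the constant for overlapping occurrences of 41 (again, the 100,000 cutoff is just an arbitrary sample size):

```python
# Build a long prefix of Champernowne's Constant
digits = "".join(str(n) for n in range(1, 100001))

# Count overlapping two-digit windows that read "41"
windows = len(digits) - 1
hits = sum(1 for i in range(windows) if digits[i:i + 2] == "41")
print(hits / windows)  # comes out close to 1/100
```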

Now consider a sequence of digits that is truly random. Each digit has a 10 percent chance of showing up in each slot, just as in Champernowne's Constant. So a person looking for the pair 41 has a one-in-ten chance of getting a 4 as the first digit, and a one-in-ten chance of getting a 1 as the second digit. The chance of picking any two adjacent digits and getting 41? One in one hundred. The chance of getting a specific three-digit number? One in a thousand. And so on.
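The same one-in-a-hundred figure falls out of a quick simulation with genuinely (pseudo)random digits; the seed and trial count here are arbitrary choices:

```python
import random

random.seed(1)  # fixed seed so the sketch is repeatable
trials = 100_000
hits = 0
for _ in range(trials):
    # Draw two independent random digits and read them as a pair
    pair = f"{random.randrange(10)}{random.randrange(10)}"
    if pair == "41":
        hits += 1
print(hits / trials)  # hovers around 0.01
```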

This is why Champernowne's Constant fooled early programs meant to check whether certain sequences of numbers were truly random. The programs looked to see if each one-digit number, two-digit number, three-digit number, and so on showed up as often as it should have if the digits were truly random, and they did. It's just that they also showed up exactly that often because a person was simply counting.
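A toy version of such a block-frequency test is easy to sketch, and Champernowne's Constant sails through it (the prefix length and block sizes here are arbitrary choices, not any specific historical test):

```python
from collections import Counter

def block_frequencies(digits: str, k: int) -> dict[str, float]:
    """Frequency of every overlapping k-digit block in the string."""
    n = len(digits) - k + 1
    counts = Counter(digits[i:i + k] for i in range(n))
    return {block: c / n for block, c in counts.items()}

champernowne = "".join(str(i) for i in range(1, 100001))
for k in (1, 2):
    freqs = block_frequencies(champernowne, k)
    expected = 10 ** -k  # what a truly random digit stream would give
    worst = max(abs(f - expected) for f in freqs.values())
    print(f"k={k}: worst deviation from {expected} is {worst:.5f}")
```

Every block shows up close to its random-sequence frequency, which is exactly what the early tests measured, and exactly why they were fooled.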

[Via Math is Fun, Mathworld, and Simon]