Strong Passwords NEED Entropy

I just finished reading an article on Ars Technica titled "Ask Ars: Where should I store my passwords?". There was a specific paragraph that I took issue with, which in turn prompted me to write this post. It is:

"Still, it would take thousands of years to crack an 8-character password when checking both small and capital letters, spaces, and numbers. That's on a low-power computer, but the time it takes to crack a string of characters goes up exponentially the more characters you use. So again, use a long password and you can foil even the Watsons of today for long enough that you would probably decide on a whim to change your password before the password is solved."

I guess I should expect more out of them, but I was disappointed, and I want to use this blog post to explain why. If you write an article about password security, regardless of the angle from which you approach it, and you don't mention entropy to your readers, you are doing them a GREAT disservice. Let me rephrase:

If you don't mention entropy in your article about passwords, you did it wrong.

If you've taken physics in secondary school or as an undergraduate, chances are good that you've heard about entropy. If not, you'll learn about it now. Entropy in computer science is very similar to entropy in physics. Put simply, entropy is the total number of states that a system can be in. For example, if you have 3 cups and 4 ping pong balls, how many ways can you arrange all 4 ping pong balls in the 3 cups? Assuming that 4 ping pong balls will fit in 1 cup, and that the balls are indistinguishable (only the count in each cup matters), you can arrange the ping pong balls 15 different ways: {4,0,0},{3,1,0},{3,0,1},{2,2,0},{2,0,2},{2,1,1},{1,1,2},{1,2,1},{1,3,0},{1,0,3},{0,3,1},{0,1,3},{0,2,2},{0,4,0},{0,0,4}. So, this system has an entropy of 15. Entropy in physics is used to derive the Ideal Gas Law, among other things, by counting the possible states a gas can occupy, and it's useful for explaining why some systems behave one way but not in reverse (such as temperature "flowing" from hot to cold, and never the reverse).
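
If you don't want to take my word for that count, a few lines of Python will enumerate the states (just an illustrative sketch, nothing more):

    from itertools import product

    # A state is (balls in cup 1, balls in cup 2, balls in cup 3);
    # enumerate every combination of 0-4 balls per cup that sums to 4.
    states = [s for s in product(range(5), repeat=3) if sum(s) == 4]
    print(len(states))  # 15
    print(states)       # [(0, 0, 4), (0, 1, 3), ..., (4, 0, 0)]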

It turns out that entropy has a great deal of use in computer science. For example, on most Unix-like operating systems, there are /dev/random and /dev/urandom devices. These devices are useful for extracting random bits to build encryption keys, one-time session keys, seeds for probability outcomes, and so on. These devices hold entropy. For /dev/random, environmental "noise" is gathered, such as mouse movements, keyboard timings, and disk activity, and thrown into an entropy pool. This pool is then hashed with the SHA-1 algorithm, yielding 160 bits of output at a time. Thus, when generating a 160-bit key, the data in /dev/random can be used directly. If more bits are needed, the already-hashed bits are hashed again, additional noise is gathered from the environment, and the results are appended until the requested number of bits is satisfied. The point is, /dev/random and /dev/urandom are sources of entropy.
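
You can pull bits from this pool yourself; here's a quick Python example (os.urandom() draws from the same kernel source that backs /dev/urandom):

    import os

    # Read 20 bytes (160 bits) of randomness from the kernel's pool,
    # suitable as raw material for a key or a seed.
    key_material = os.urandom(20)
    print(key_material.hex())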

So, what does this have to do with passwords? Your password has a certain amount of entropy, which means it belongs to a pool of possible passwords sharing that same amount of entropy. The question, though, is: "how do you calculate the amount of entropy in a password?" Thankfully, we don't have to think too terribly hard about this one. If you've taken college algebra, the math is pretty straightforward. Entropy in information comes from a branch of probability called "information theory". Any message contains some amount of entropy, and we can measure that entropy in binary bits. The formula for calculating it is:

H = L * log_2(N)

H is the size of the message measured in binary bits. L is the length of the message, in our case the length of your password. log_2() is the logarithm function, base 2, and N is the number of possible symbols in the password (lowercase letters alone provide 26 possible characters, uppercase letters provide an additional 26, the digits provide 10, and punctuation provides 32 possible characters on a United States English keyboard). I rewrote the equation so you can find it with the common log function on your calculator:

H = L * log(N) / log(2)

Having this formula makes calculating the entropy of passwords straightforward. Here are some examples (a short script that reproduces them follows the list):

  • password: 38 bits (8 * log_2(26))
  • RedSox: 34 bits (6 * log_2(52))
  • B1gbRother|$alw4ysriGHt!?: 164 bits (25 * log_2(94))
  • deer2010: 41 bits (8 * log_2(36))
  • l33th4x0r: 46 bits (9 * log_2(36))
  • !Aaron08071999Keri|: 125 bits (19 * log_2(94))
  • PassWord: 46 bits (8 * log_2(52))
  • 4pRte!aii@3: 72 bits (11 * log_2(94))

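If you want to check these numbers yourself, the formula translates directly into a few lines of Python (entropy_bits() is just my throwaway helper name, not a standard function):

    import math

    def entropy_bits(length, symbols):
        """H = L * log2(N) for a password of `length` characters
        drawn from a set of `symbols` possible characters."""
        return length * math.log2(symbols)

    print(round(entropy_bits(8, 26)))   # "password" -> 38
    print(round(entropy_bits(6, 52)))   # "RedSox"   -> 34
    print(round(entropy_bits(8, 52)))   # "PassWord" -> 46
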
Question: what buys you more entropy, length or the size of the character set? If you passed college algebra, you'll recognize that the answer is length: H grows linearly with L but only logarithmically with N (if you need convincing, graph the log function on your calculator, then graph a multiple of it). Of course, you shouldn't ignore using lowercase, uppercase, numbers, and punctuation in your password, but they won't buy you as much entropy as length will. This is why I prefer the term "passphrase" over "password", as it conveys this concept to the user.
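
To make that concrete, compare doubling the length of a lowercase-only password against keeping it at 8 characters but drawing from all 94 printable symbols:

    import math

    print(16 * math.log2(26))   # ~75.2 bits: 16 lowercase characters
    print(8 * math.log2(94))    # ~52.4 bits: 8 characters, full symbol set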

Here's a table showing how long your password must be, given the character set it draws from, to achieve a certain entropy. Say you want an entropy of 64 bits using only numbers: your password would need to be 20 characters long. If you wanted an entropy of 80 bits using characters from the entire printable ASCII set, it would only need to be 13 characters long. (A short script that regenerates the table follows it.)

Entropy (H)   Numbers (10)   Alphabet (52)   Alphanumeric (62)   All ASCII (94)
32            10             6               6                   5
40            13             8               7                   7
64            20             12              11                  10
80            25             15              14                  13
96            29             17              17                  15
128           39             23              22                  20
160           49             29              27                  25
192           58             34              33                  30
224           68             40              38                  35
256           78             45              43                  40
384           116            68              65                  59
512           155            90              86                  79
1024          309            180             172                 157
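
These lengths come from solving the formula for L and rounding up: L = ceil(H / log_2(N)). A few lines of Python will regenerate the table:

    import math

    sets = {"Numbers": 10, "Alphabet": 52, "Alphanumeric": 62, "All ASCII": 94}
    for h in (32, 40, 64, 80, 96, 128, 160, 192, 224, 256, 384, 512, 1024):
        # Minimum length needed to reach h bits with each character set.
        row = [math.ceil(h / math.log2(n)) for n in sets.values()]
        print(h, row)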

So, how much entropy should you have in your password? What is considered "strong"? Well, let us look at Distributed.net. They are working on two projects: Optimal Golomb Rulers and cracking a 72-bit RC5 message. Let's look at the RC5 project. In January 1997, RSA Laboratories issued a secret-key challenge. They generated random keys ranging from 40 bits to 128 bits, provided the ciphertext for each message, and offered a $1,000 prize to the person who finds the secret key that decrypts it. In order to know whether or not you had found the key, they gave you the first two words of each message.

Currently, as already mentioned, Distributed.net is working on the 72-bit key from that challenge. If you check the stats page, you can see that it has been running for 3,017 days as of the writing of this post, and at the current pace it would take roughly 200,000 days to search the entire key space. Now, of course, it is probable that they will find the key before exhausting the entire space, but when that will happen is anyone's guess. Needless to say, 200,000 days, or about 550 years at the current pace, is substantially long. If they kept that pace up for an 80-bit key, it would take them roughly 140,000 years to search the entire space. By contrast, it took them only 1,726 days, or about 4.7 years, to find the 64-bit key, and only 193 days, or about 6 months, to find the 56-bit key.
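
The scaling is simple but brutal: every additional bit doubles the key space. Extrapolating from the 200,000-day figure above (an extrapolation at the current pace, not a benchmark):

    # Eight more bits means 2^8 = 256 times the search space.
    days_72 = 200_000
    days_80 = days_72 * 2 ** (80 - 72)
    print(days_80 / 365.25)   # ~140,000 years at the same pace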

So, I think that gives you a good rule of thumb to go by. 72 bits of entropy in your password seems strong enough for the short term, but it probably wouldn't hurt to increase your passwords to 80 bits of entropy for the long term. Of course, I don't think I need to mention not using your name, birthdate, or other silliness in your password; that's been beaten to death plenty online. Search Google for generating strong passwords, and you'll find plenty of guides (none of which mention entropy either, I would be willing to bet).

Now, to be fair, the Ars Technica article was mostly about STORING your passwords, not creating strong ones. I've already written about this before, and I think http://passwordcard.org is the perfect solution: it stores strong passwords with a good amount of entropy, and it does not require any software once the card is generated. It stays with you in your wallet, so if you visit a computer that doesn't have your encrypted password database, or you are not allowed to install software on the machine to restore that database, the password card is a perfect fit. It's entirely platform-independent. Just pull it out of your wallet or purse, type in your password, and move on with your life. All you need to remember for each password is three things:

  1. The starting column and row of your password.
  2. The length of the password.
  3. The path your password takes on the card.

I've been using it since it was first released, and I use it for all my passwords on all my accounts, whether web-based, key-based, or account-based. Every password is unique, they all contain more than 100 bits of entropy, and every password follows the "best rules" for creating strong passwords. I've typed many of them enough to have them memorized; I rarely pull out my card (there are also Android and Apple applications, if you want them).

So, the next time you read an article about password strength, do a quick search for the word "entropy". If it's not mentioned, take the article with a grain of salt, or at least point the author to this post and suggest that they discuss entropy with their readers. Entropy is your friend.
