
Creating Strong Passwords Without A Computer, Part 0 - Understanding Entropy

I've written a new series that investigates the art of creating very strong passwords without the aid of a computer. Sure, there are many software applications that will generate strong passwords for you, each with their own advantages and disadvantages. They're so plentiful that it would be impossible to outline them all. However, generating passwords without the aid of a computer can be a very challenging, yet very rewarding process. It may even be impractical or impossible to install software on the computer you're using at the moment you need to generate a password.

Introduction

Before we start diving into passwords, we need to define what a "strong password" is. I've defined this many times on this blog, and as long as I keep blogging about passwords, I'll continue to define it. A "strong password" is one that is defined by entropy: the more entropy, the stronger the password. Further, a "strong password" is one that is defined by true randomness, which means it was not influenced by a human being during its creation.

We're told that passwords must have the following characteristics when creating them:

  • Passwords should be long.
  • Passwords should contain both uppercase and lowercase letters.
  • Passwords should contain digits.
  • Passwords should contain special punctuation characters.
  • Passwords should be unique for every account.
  • Passwords should be easy to remember.
  • Passwords should not be written down.
  • Passwords should not be based on dictionary words.
  • Passwords should not contain birthdates, anniversaries, or other personally identifiable information.

These rules can be difficult to follow, because they make passwords difficult to remember, especially if you have a unique password for every account. It's almost like creating a witches' brew, just so you can create the perfect password potion:

Double, double toil and trouble;
Fire burn and cauldron bubble.
Fillet of a fenny snake,
In the cauldron boil and bake;
Eye of newt, and toe of frog,
Wool of bat, and tongue of dog,
Adder’s fork, and blind-worm’s sting,
Lizard’s leg, and howlet’s wing,
For a charm of powerful trouble,
Like a hell-broth boil and bubble.

Personally, I find it all a bit confusing, and even more annoying that security-conscious blogs endorse that garbage. I tend to keep things much simpler:

  • A password must contain great amounts of entropy (we'll quantify this in a minute).
  • A password must be truly random (no human interference).

So, what is entropy, and how do we quantify it? Let's begin.

Defining Entropy

Entropy can be found in many fields of study. Abstractly, entropy is defined as:

1. a thermodynamic quantity representing the unavailability of a system's thermal energy for conversion into mechanical work, often interpreted as the degree of disorder or randomness in the system.
2. lack of order or predictability; gradual decline into disorder. "a marketplace where entropy reigns supreme".

So entropy is a measure of disorder, or unpredictability. For our needs, we'll use entropy from Claude Shannon's Information Theory, a branch of Mathematics, which is defined mathematically as follows:

H = L \cdot \log_2(N)

where:

  • H = entropy in binary bits.
  • L = the number of symbols chosen for your message (length).
  • N = the number of unique possibilities for each symbol.

Or, if you would like to enter this on your calculator:

H = L \cdot \frac{\log(N)}{\log(2)}
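
If you would rather let a script do the arithmetic, here is a minimal Python sketch of that same formula (the function name entropy_bits is just mine for illustration):

    import math

    def entropy_bits(length, symbol_count):
        """Shannon entropy, in bits, of a message of `length` symbols,
        each chosen uniformly at random from `symbol_count` possibilities."""
        return length * math.log(symbol_count, 2)

    # A 10-character password built only from the 26 lowercase letters:
    print(entropy_bits(10, 26))   # ~47.0 bits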

This can be proven quite simply. First off, let us define the logarithm. As with standard mathematics, we'll define it as the inverse of the exponential function. In other words:

y = \log_b(x) \iff x = b^y

Suppose we want to find the entropy of a 13-character password taking advantage of all 94 possible printable symbols on the ASCII keyboard. If all 94 printable symbols are a valid choice for each of the 13 characters in the password, then the total number of combinations the password could be, written as a power of 2 (so that the exponent H is the entropy in bits), is:

94^{13} = 2^H

Now, a property of logarithms is the ability to change base. Because entropy is defined in bits, or base-2, I'll change the base of my logarithm, as follows:

H = \log_2\left(94^{13}\right) = \frac{\log\left(94^{13}\right)}{\log(2)}

Rewriting the equation, we get the following result:

H = 13 \cdot \frac{\log(94)}{\log(2)} \approx 85.21 \text{ bits}

And thus, we've arrived at our conclusion.
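
As a quick sanity check of that derivation, a couple of lines of Python confirm that counting the full search space and using the L \cdot \log(N)/\log(2) shortcut give the same answer:

    import math

    combinations = 94 ** 13                       # every possible 13-character password
    h_direct = math.log(combinations, 2)          # log2 of the whole search space
    h_formula = 13 * math.log(94) / math.log(2)   # L * log(N) / log(2)

    print(h_direct, h_formula)                    # both print ~85.21 bits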

This assumes that the message was chosen completely at random. However, if there is human intervention in creating the message, then the equation gets much more difficult. As such, for the purposes of this post, and most of my posts detailing entropy, we'll assume that the password or message was chosen completely at random.

Some Entropy Equation Examples

If you create a message from lowercase letters only, then the number of unique possibilities for each symbol is 26, as there are only 26 letters in the English alphabet. So, each character provides only 4.7 bits of entropy:

\log_2(26) \approx 4.7 \text{ bits per character}

If you create a message from lowercase letters, uppercase letters, and digits, then the number of unique possibilities for each symbol is 62, as there are 26 unique lowercase letters, 26 unique uppercase letters, and 10 digits. So, each character would provide only 5.95 bits of entropy:

\log_2(62) \approx 5.95 \text{ bits per character}
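
To save a few trips to the calculator, here is a small Python sketch that prints the per-character entropy for a handful of common character sets (the set sizes are the only inputs; the labels are mine):

    import math

    charsets = {
        "lowercase only": 26,
        "lowercase + uppercase": 52,
        "lowercase + uppercase + digits": 62,
        "all 94 printable ASCII": 94,
    }

    for name, size in charsets.items():
        print(f"{name:32s} {math.log(size, 2):5.2f} bits per character")

Multiply the per-character figure by the length of the password to get its total entropy.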

A Needle In A Haystack

Knowing how much entropy is needed when creating a password can be tricky. Is 50-bits enough? Do I need 60-bits instead? 80-bits? 128? 256? More? In order to get a good firm grasp on quantifying entropy, I want to first create an analogy:

Entropy is to a haystack, as your password is to a needle.

Finding your password means searching through your entropy space. Your password should have a minimum amount of entropy, which means it will NOT be found in any entropy space smaller than the one it was drawn from. So if your haystack, or entropy space, is 256-bits in size, your needle, or password, should not be found in 255-bit haystacks, or smaller.

To quantify this a bit: in a previous post, I demonstrated that a SHA1 hash has an effective output space of 61-bits due to cryptanalysis techniques, and it turns out that this is far too small. At the time of this writing, the bitcoin network is processing 2^61 SHA256 hashes every 76 seconds using specialized hardware called ASICs. That computing power is only going to grow, as there is financial gain to be made from mining.

In other words, with these ASICs, 30 quadrillion pieces of hay can be analyzed every second. If your haystack has 2^61 pieces of hay, one of which is actually your needle, this haystack can be fully processed in 76 seconds flat.
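
Those figures are easy to double-check. Assuming, as above, that the network chews through 2^61 hashes every 76 seconds, a few lines of Python reproduce both numbers:

    haystack = 2 ** 61          # pieces of hay (hashes) in the haystack
    seconds = 76                # time the bitcoin network needs to process them all
    rate = haystack / seconds   # pieces of hay per second

    print(f"{rate:.2e} hashes per second")                        # ~3.0e16, i.e. ~30 quadrillion
    print(f"{haystack / 30e15:.0f} seconds at 30 quadrillion/s")  # ~77 seconds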

How Entropy Relates To Passwords

If we continue using the same bitcoin network model as our speed benchmark, then at 30 quadrillion pieces of hay (passwords) every second, exhaustively searching haystacks of the following sizes would take (a quick calculation sketch follows the list):

  • 64-bits: 10 minutes.
  • 72-bits: 2 days.
  • 80-bits: 15 months.
  • 88-bits: 327 years.
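
Here is the promised calculation sketch behind those figures, assuming the roughly 30 quadrillion guesses per second established above:

    # Time to exhaust haystacks of various sizes at the bitcoin network's rate.
    rate = 2 ** 61 / 76   # guesses per second (~3.0e16)

    for bits in (64, 72, 80, 88):
        seconds = 2 ** bits / rate
        print(f"{bits}-bit haystack: {seconds:,.0f} seconds "
              f"(~{seconds / 86400:,.1f} days, ~{seconds / 31557600:,.1f} years)")

Keep in mind these are worst-case times to search the entire haystack; on average, the needle is found in about half that time.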

In the past, I've shown that 72-bits of entropy (your haystack) seems to be sufficient for passwords (your needle), using the RSA 72-bit distributed computing project on distributed.net as a benchmark. After analyzing the bitcoin network's mining operations, and how trivial it is to build specialized ASICs for these tasks, I'm beginning to think that you should have at least 80-bits of entropy for your passwords. As computing power continues to grow, that number will only increase.

As a result, to create a "strong password", I would recommend the following:

  • A password must contain at least 80-bits of entropy.
  • A password must be truly random.

Conclusion

Entropy is how we measure unpredictability. If we need any sort of unpredictability, such as making a password (the needle in the haystack) hard to find, then the more entropy we have, the better off we are. On computer systems, entropy is stored in what's called an "entropy pool". The larger the pool, the more reliably the operating system can generate true random numbers for security applications, such as GPG and long-term SSL keys. The same can be said for passwords: the more entropy we have, the stronger our passwords can become.

So, don't starve yourself by selecting weak 8-10 character passwords, or by trying to generate random data in your head. Generate truly random passwords backed by large amounts of entropy.
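
Although this series is about generating passwords without a computer, for comparison here is a minimal Python sketch of doing it with one, using the standard library's secrets module as the source of randomness and the 94 printable ASCII characters as the symbol set (the 80-bit target and the variable names are mine):

    import math
    import secrets
    import string

    symbols = string.ascii_letters + string.digits + string.punctuation  # 94 printable characters
    bits_per_char = math.log(len(symbols), 2)                            # ~6.55 bits each
    length = math.ceil(80 / bits_per_char)                               # 13 characters clears 80 bits

    password = "".join(secrets.choice(symbols) for _ in range(length))
    print(password, f"({length * bits_per_char:.1f} bits of entropy)")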
