How many bits of entropy per character in various encoding schemes.
By Encoding Scheme
The number of symbols (characters) in each encoding scheme, the multiplier (multiply a byte count by this to get how many characters are needed to store it), and the maximum number of bits per character.
Encoding | # Chars | Multiplier | Bits per Char |
---|---|---|---|
Byte | 256 | 1 | 8 |
ASCII | 128 | 8/7 | 7 |
Base64 | 64 | 4/3 | 6 |
AlphaNum (Base62) | 62 | - | 5.954 (approx) |
LowerNum | 36 | - | 5.169 (approx) |
Base32 | 32 | 8/5 | 5 |
Hex | 16 | 2 | 4 |
Numeric | 10 | - | 3.321 (approx) |
Octal | 8 | 8/3 | 3 |
Binary | 2 | 8 | 1 |
See also https://en.wikipedia.org/wiki/Password_strength#Random_passwords
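As a rough sketch, the "Bits per Char" and "Multiplier" columns come straight from log2 of the symbol count (the bitsPerChar and multiplier names here are just for illustration):
function bitsPerChar(symbols) {
  // bits of entropy per character = log2(number of symbols)
  return Math.log(symbols) / Math.log(2);
}

function multiplier(symbols) {
  // characters needed per 8-bit byte
  return 8 / bitsPerChar(symbols);
}

bitsPerChar(64); // ~6
multiplier(64); // ~1.33 (i.e. 4/3)
multiplier(32); // ~1.6 (i.e. 8/5)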
By Bits vs Minimum Number of Characters
How many characters are needed in each encoding to store at least a given number of bits.
Bits | Byte (256) | ASCII (128) | Base64 (64) | Base62 | Base32 | Hex (16) | Dec (10) | Octal (8) | Binary (2) |
---|---|---|---|---|---|---|---|---|---|
8 bits | 1 B | 2 | 2 | 2 | 2 | 2 | 3 | 3 | 8 |
19 bits | 3 B | 3 | 4 | 4 | 4 | 5 | 6 | 7 | 19 |
24 bits | 3 B | 4 | 4 | 5 | 5 | 6 | 8 | 8 | 24 |
29 bits | 4 B | 5 | 5 | 5 | 6 | 8 | 9 | 10 | 29 |
32 bits | 4 B | 5 | 6 | 6 | 7 | 8 | 10 | 11 | 32 |
48 bits | 6 B | 7 | 8 | 9 | 10 | 12 | 15 | 16 | 48 |
60 bits | 8 B | 9 | 10 | 11 | 12 | 15 | 19 | 20 | 60 |
64 bits | 8 B | 10 | 11 | 11 | 13 | 16 | 20 | 22 | 64 |
72 bits | 9 B | 11 | 12 | 13 | 15 | 18 | 22 | 24 | 72 |
96 bits | 12 B | 14 | 16 | 17 | 20 | 24 | 29 | 32 | 96 |
120 bits | 15 B | 18 | 20 | 21 | 24 | 30 | 37 | 40 | 120 |
128 bits | 16 B | 19 | 22 | 22 | 26 | 32 | 39 | 43 | 128 |
144 bits | 18 B | 21 | 24 | 25 | 29 | 36 | 44 | 48 | 144 |
192 bits | 24 B | 28 | 32 | 33 | 39 | 48 | 58 | 64 | 192 |
256 bits | 32 B | 37 | 43 | 43 | 52 | 64 | 78 | 86 | 256 |
512 bits | 64 B | 74 | 86 | 86 | 103 | 128 | 155 | 171 | 512 |
1024 bits | 128 B | 147 | 171 | 172 | 205 | 256 | 309 | 342 | 1024 |
4096 bits | 512 B | 586 | 683 | 688 | 820 | 1024 | 1234 | 1366 | 4096 |
- 19 bits - common for OTP
- 29 bits - minimum recommendation for online systems
- 96 bits - minimum recommendation for offline systems
- 128 bits - common for API keys
- 256 bits - common for overkill
- 4096 bits - common for prime numbers (sparse keyspace)
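Each cell above is just ceil(bits / bits-per-character) for that symbol set. A minimal sketch (the charsNeeded name here is just for illustration; the full generator at the end of this page calls it getNumChars):
function charsNeeded(bits, symbols) {
  // smallest whole number of characters that can hold `bits` bits
  return Math.ceil(bits / (Math.log(symbols) / Math.log(2)));
}

charsNeeded(128, 62); // 22 (the Base62 column of the 128-bit row)
charsNeeded(29, 10); // 9 (the Dec column of the 29-bit row)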
Reference Tables (Base64, Decimal, Hex)
A quick lookup for the maximum entropy in an encoded string of a given minimum length:
Base64
Base64 is also a good approximation of the ASCII characters people actually use in their passwords - they may include a `!`, `$`, or space, but not with any randomness - so the extra possible “special character” entropy is maybe on par with base64’s two extra characters.
Base64 Chars | Bits | Bytes |
---|---|---|
2 | 8 bits | 1 |
4 | 24 bits | 3 |
6 | 32 bits | 4 |
8 | 48 bits | 6 |
10 | 60 bits | > 7 |
11 | 64 bits | 8 |
12 | 72 bits | 9 |
16 | 96 bits | 12 |
20 | 120 bits | 15 |
22 | 128 bits | 16 |
24 | 144 bits | 18 |
32 | 192 bits | 24 |
var crypto = require("crypto");
var n = 16;
crypto.randomBytes(n).toString("base64").replace(/=/g, "").length; // 22 chars for 16 bytes (128 bits)
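Going the other direction - from a target number of bits to a Base64 string - a sketch (the 128-bit target is just an example):
var crypto = require("crypto");
var targetBits = 128;
var numBytes = Math.ceil(targetBits / 8); // 16 bytes
crypto.randomBytes(numBytes).toString("base64").replace(/=/g, ""); // 22 chars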
Decimal
Commonly used for PINs and OTP.
Decimal Chars | Bits | Bytes |
---|---|---|
3 | 8 bits | 1 |
4 | > 13 bits | > 1 |
5 | > 16 bits | > 2 |
6 | > 19 bits (common for OTP) | > 2 |
8 | > 26 bits | > 3 |
9 | > 29 bits (minimum recommendation for online systems) | > 3 |
10 | > 33 bits | > 4 |
12 | > 39 bits | > 4 |
var crypto = require("crypto");
var n = 4;
parseInt(crypto.randomBytes(n).toString("hex"), 16).toString(10).length; // up to 10 digits for 4 bytes (leading zeros drop off)
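If what you actually want is a fixed-length numeric code (say, a 6-digit OTP), a sketch using crypto.randomInt (assuming a Node.js version recent enough to have it):
var crypto = require("crypto");
// uniform over 000000..999999, zero-padded - about 19.9 bits
var otp = String(crypto.randomInt(0, 1000000)).padStart(6, "0");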
Hex
Hex is easy to compute in your head (2 characters per byte, or the bit count divided by 4), but just for reference:
Hex Chars | Bits | Bytes |
---|---|---|
2 | 8 bits | 1 |
6 | 24 bits | 3 |
8 | 32 bits (29+ recommended for online systems) | 4 |
10 | 40 bits | 5 |
12 | 48 bits | 6 |
16 | 64 bits | 8 |
20 | 80 bits | 10 |
24 | 96 bits (min recommendation for offline systems) | 12 |
32 | 128 bits | 16 |
var crypto = require("crypto");
var n = 4;
crypto.randomBytes(n).toString("hex").length; // 8 - two hex chars per byte
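For example, a 128-bit token (the API key case from the notes above) is 16 random bytes, which is 32 hex characters:
var crypto = require("crypto");
crypto.randomBytes(16).toString("hex"); // 32 hex chars = 128 bits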
See also https://en.wikipedia.org/wiki/Password_strength#Required_bits_of_entropy.
How to Calculate Bits of Entropy per Character
You can arrive at the bits of entropy per character in a string with `Math.log(n) / Math.log(2)`.
For example:
function getBits(n) {
return Math.log(n) / Math.log(2);
}
getBits(256); // 8 (Byte)
getBits(62); // 5.954 (AlphaNumeric)
getBits(36); // 5.169 (Case-Insensitive AlphaNumeric)
getBits(32); // 5 (Crockford Base32)
getBits(10); // 3.321 (Numeric)
getBits(2); // 1 (Binary)
Note: In the example here I truncate (floor) the value rather than rounding because my moral belief is that the nature of entropy is such that it must only be rounded down - an entropy of 1.9999 bits is simply not 2. 😉
`Math.log(n) / Math.log(b)` is the inverse of `Math.pow(b, e)`, but whereas `Math.pow(b, e)` lets you specify the exponent, `Math.log(n)` does not - so you have to resort to math - the change-of-base rule, in this case. Any two logs taken in the same base (the natural log, base e, in JavaScript’s case) and divided yield the log of the numerator in the base of the denominator.
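For what it’s worth, modern JavaScript also has Math.log2, which does the change of base for you:
Math.log2(62); // same as Math.log(62) / Math.log(2), about 5.954
Math.log2(10); // about 3.321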
Calculating Entropy in Unicode Strings
Unicode strings have different entropy capacities depending on their encoding, and each extra byte in a multi-byte sequence carries fewer usable bits per byte (the leading bits are used as length and continuation markers).
UTF-8, for example:
Sequence Bytes | Payload Bits per Byte | Total Bits |
---|---|---|
1 | 7 | 7 |
2 | 5+6 | 11 |
3 | 4+6+6 | 16 |
4 | 3+6+6+6 | 21 |
(Sequences of 5 and 6 bytes appeared in the original UTF-8 design but were removed by RFC 3629, so 21 payload bits per character is the ceiling.)
I don’t know why you’d want to use that - maybe you allow emojis in passwords? - but anyway, there it is.
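But if you did, here’s a sketch that counts the maximum payload bits a UTF-8 string could carry (the function names are just for illustration, and this is a ceiling on capacity, not the actual entropy of the string):
function utf8PayloadBits(codePoint) {
  // payload bits by UTF-8 sequence length
  if (codePoint <= 0x7f) return 7; // 1 byte
  if (codePoint <= 0x7ff) return 11; // 2 bytes
  if (codePoint <= 0xffff) return 16; // 3 bytes
  return 21; // 4 bytes
}

function maxPayloadBits(str) {
  var bits = 0;
  for (var ch of str) {
    // for..of iterates by code point, so surrogate pairs count once
    bits += utf8PayloadBits(ch.codePointAt(0));
  }
  return bits;
}

maxPayloadBits("password🔑"); // 8 * 7 + 21 = 77 bits, at most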
How to Generate Such Tables
"use strict";
console.info("| Bits | Bytes | ASCII | Base64 | 62 | 32 | Hex | Dec | Octal | Binary |");
console.info("| ---: | ----: | ----: | -----: | --: | --: | --: | --: | ----: | -----: |");
[
8, 19, 24, 29, 32, 48, 60, 64, 72, 96, 120, 128, 144, 192, 256, 512, 1024,
4096,
].forEach(function (bits) {
var B = getNumChars(bits, 256);
var a = getNumChars(bits, 128);
var b64 = getNumChars(bits, 64);
var b62 = getNumChars(bits, 62);
var b32 = getNumChars(bits, 32);
var h = getNumChars(bits, 16);
var d = getNumChars(bits, 10);
var o = getNumChars(bits, 8);
var b = getNumChars(bits, 2);
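// note: b (the Binary column) equals the bit count itself, so it also fills the first "Bits" column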
console.info(
`| %s bits | %s B | %s | %s | %s | %s | %s | %s | %s | %s |`,
b,
B,
a,
b64,
b62,
b32,
h,
d,
o,
b
);
});
function getNumChars(bits, base) {
return Math.ceil(bits / getBitsPerChar(base));
}
function getBitsPerChar(n) {
// number of symbols in set
return Math.log(n) / Math.log(2);
}