This week's topic is Big Numbers in JavaScript.

As you may know, JavaScript has a sordid past when it comes to numbers and binary. Until recently, bitwise operations could only be done on 31-bit signed integers (or 32-bit unsigned if you use the triple right shift `>>>`), and other mathematical calculations were only accurate so long as the result didn't exceed 53 bits (`Number.MAX_SAFE_INTEGER`, which is 2^53 - 1).

Just this year, however, `BigInt`, `BigInt64Array`, and `BigUint64Array` have landed in both Node and Chrome. Seeing that there's also (minimal) documentation for it on MDN suggests that it may be coming to Firefox soon.

TL;DR:

For the impatient, here's a sampling of how BigInts work:

```js
// Decimal String to BigInt
var bn = BigInt("12345678901234690123456789012345678990");

// Hex to BigInt (note the '0x' prefix)
var bn = BigInt("0x80808080808080808080808080808080");

// Binary to BigInt (note the '0b' prefix)
var bn = BigInt("0b1000000010000000100000001000000010000000100000001000000010000000");

// Octal to BigInt (note the '0o' prefix)
var bn = BigInt("0o100200401002004010020040100200");
```

(Note that base64 is left out; you'll have to use `atob` to decode to a binary string and then convert that to hex.)
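The base64 step can be sketched like this (the base64 value and variable names here are illustrative, and `atob` is assumed to be available, as it is in browsers and recent versions of Node):

```js
// Sketch: base64 -> binary string -> hex string -> BigInt.
// atob decodes base64 to a binary string: one byte per character code.
var b64 = "/wD/AP8A"; // decodes to the bytes ff 00 ff 00 ff 00
var bin = atob(b64);
var hex = '';
for (var i = 0; i < bin.length; i += 1) {
  var h = bin.charCodeAt(i).toString(16);
  hex += (h.length % 2 ? '0' : '') + h; // pad each byte to two hex digits
}
var bn = BigInt('0x' + hex);
console.log(bn.toString(16)); // "ff00ff00ff00"
```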

```js
// BigInt to Decimal
var d = bn.toString(10);

// BigInt to Hex
var hex = bn.toString(16);
if (hex.length % 2) { hex = '0' + hex; }

// BigInt to Binary
var b = bn.toString(2);
while (b.length % 8) { b = '0' + b; }
```

If you're interested in the benefits and quirks, keep reading below.

What's BigInt for?

Broadly speaking a "Big Number" is any number that is "wider" (has more bits) than what the CPU can easily handle.

Most modern computers have multiple 64-bit CPU cores. This tends to mean that they can most easily do computations on 64-bit numbers.

Although computers have handled 64-bit numbers since... pretty much forever, it has actually become more efficient to work with larger numbers as we've moved from 8-bit and 16-bit processors to 32-bit and 64-bit processors (and as RAM has grown exponentially).

Although GPUs (on graphics cards for games and engineering) have increased to 256-bit, I think we've stopped at 64-bit on CPUs because the numbers we need beyond that vary so widely in size.

There's a class of problems that are easily calculated with 64 bits. There's no comparable class of problems well suited to 128 bits. The next jump varies across very large ranges - between 256-bit and 4096-bit for asymmetric cryptography (PKI) - or even bigger for certain scientific calculations.

Beyond 64-bit, it's simply more efficient to have the occasional "slow" BigInt operation than to build specific hardware to handle such large bit widths.

In the cases where that doesn't hold true, the specific applications use GPUs or CPU clusters.

The bottom line: It's nicer to have arbitrarily large numbers conveniently available when we need them.

What's Good

`BigInt`s give us a lot of goodies. In particular, here are a few things that weren't easy to do before:

• Conversion between long hex and decimal strings, and vice-versa
• Arbitrary precision arithmetic (great for dealing with money)
• Arbitrary precision bitwise operations (great for cryptography)
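To make the precision point concrete, here's a quick sketch (values chosen for illustration): `Number` silently loses integer precision past 2^53, while `BigInt` arithmetic stays exact.

```js
// Number arithmetic silently loses precision past 2^53
console.log(2 ** 64 === (2 ** 64) + 1);  // true (!) - both round to the same float
console.log(Number.MAX_SAFE_INTEGER);    // 9007199254740991 (2^53 - 1)

// BigInt arithmetic stays exact at any width
var big = 2n ** 64n;
console.log((big + 1n).toString());      // "18446744073709551617"
```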

In particular, this will make it easier to develop future crypto libraries for devices and browsers that are a little more efficient (and cost far less in terms of dependency management, dependency security, and end-user experience).

It's nice to have a way to accurately represent these large numbers.

```js
var bn = BigInt("0xFF00FF00FF00FF00FF00FF00FF00FF00");
```

Note that although you might have expected something like `BigInt.parseInt` to complement `Number.parseInt`, instead you use the same prefixes that you would use for literal numbers, but in string form.

Alternatively, there's a new literal syntax (the `n` suffix):

```js
var bn = 0xFF00FF00FF00FF00FF00FF00FF00FF00n;

var bn = 12346789012345678901234567890124567890n;
```

Generally speaking, comparison operations can mix and match `Number`s and `BigInt`s. For example:

```js
BigInt(-1) < 0 // true
```

However, arithmetic operations can only be done between numbers of the same type:

```js
BigInt("0x10001") % 2 // throws a TypeError
```
```js
BigInt("0x10001") % 2n // works like a charm
```
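When you do need to combine the two, convert the `Number` explicitly first (a small sketch; the values are illustrative):

```js
var n = 2;                  // a plain Number, e.g. from user input
var bn = BigInt("0x10001"); // 65537n
var r = bn % BigInt(n);     // convert first, then operate
console.log(r.toString());  // "1"
```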

What's Still Broken

There are still a few old quirks (and some new ones) and, in particular, working with signed (negative) numbers is still quite half-baked, and that's not likely to change.
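For example (a sketch of the signedness quirk): a negative `BigInt` stringifies with a minus sign rather than as two's complement, so encoding one as bytes takes an extra step such as `BigInt.asUintN`.

```js
var neg = BigInt(-255);
console.log(neg.toString(16)); // "-ff" - a sign, not two's-complement bits

// Reinterpret as an unsigned 64-bit value to get two's-complement form
console.log(BigInt.asUintN(64, neg).toString(16)); // "ffffffffffffff01"
```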

Here are some things to note:

• The bitwise `~` operator is essentially useless.
• Non-decimal encodings (hex, octal, binary) lack proper padding
• Encoding negative numbers fails royally
• Converting between `BigInt`s and ArrayBuffers is sloppy
• You can't mix and match JavaScript `Number`s with `BigInt`s... except sometimes
• BigInts are a new type - just like `String` or `Boolean`
• `1 !== BigInt(1)`
• You can't `Uint8Array.from( BigUint64Array.from([ BigInt(1) ]) )`
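The last few quirks in that list can be demonstrated directly (a sketch; `try`/`catch` is used here just to capture the error name):

```js
// Strict equality never holds across types; loose equality coerces
console.log(1 === BigInt(1)); // false
console.log(1 == BigInt(1));  // true

// Typed-array conversion throws rather than converting element types
var result;
try {
  result = Uint8Array.from(BigUint64Array.from([BigInt(1)]));
} catch (e) {
  result = e.name;
}
console.log(result); // "TypeError"
```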

That's too much material to cover here, which leads to this question:

What Now? Where to Next?

I've done quite a bit of exploration and written a few more articles to help you get done what you need to get done (or at least to the extent that I felt like learning it):

Between these articles I've covered everything that seemed important in regard to `BigInt`s.

By AJ ONeal