What is the most performant way to find the minimum number of bits needed to represent a natural number (i.e. without a sign) in Javascript? Is there a way to do it without using loops?
The code below, for example, works for every integer between 0 and 2^30 - 1 (it enters an infinite loop for anything larger):
    var x = 9001;
    var bits = 1;
    while (x >= (1 << bits))
        bits++;
And this one works up to 2^53 (the largest integer guaranteed to be representable without loss of precision), and somewhat beyond:
    var x = 9001;
    var bits = 1;
    var limite = 2;
    while (x >= limite) {
        bits++;
        limite *= 2;
    }
But they both use loops, essentially asking: does it fit in 1 bit? does it fit in 2 bits? does it fit in 3 bits? and so on. I was curious to know whether there is a better way to do this.
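For what it's worth, here is a loop-free sketch (my own suggestion, not part of the original question). It relies on the fact that for x >= 1 the bit length is floor(log2(x)) + 1, computed exactly via the binary string representation; for values that fit in 32 bits, `Math.clz32` (ES2015) gives the answer directly:

```javascript
// Bit length via the binary string: toString(2) produces the number's
// binary digits, and their count is exactly the number of bits needed.
function bitLength(x) {
  if (x === 0) return 1;          // zero still takes one bit to write ("0")
  return x.toString(2).length;
}

// For values up to 2^32 - 1, Math.clz32 counts leading zeros in the
// 32-bit representation, so the bit length is 32 minus that count.
function bitLength32(x) {
  return x === 0 ? 1 : 32 - Math.clz32(x);
}

console.log(bitLength(9001));    // 14
console.log(bitLength32(9001));  // 14
```

Both avoid loops entirely; the string version works for the whole safe-integer range up to 2^53.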
Note: I'm only interested in knowing how many bits are needed, not in actually producing that representation. Especially since Javascript does not have int, long, unsigned int, etc. - it uses double for everything... (and when it does use those types internally, it does not expose them to the programmer)
Would I sound even dumber than I look if I asked why this question?
– Bruno Augusto
@Brunoaugusto Originally, I wanted to store a bit mask and a unique identifier in a single Number. For that I needed to know whether the identifier (variable) "fits" in the space that remains beyond the mask (fixed). As I wrote the question, I realized that deciding whether or not it fits is easy - I just subtract the size of the mask from the 53 available bits and check whether the identifier is between zero and the largest number that fits in that space. But I remained curious whether there is a way to determine the minimum number of bits needed for a given number, so I submitted the question anyway. – mgibsonbr
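The fit check described in that comment could be sketched like this (a hypothetical example; the mask width of 10 bits is an assumption for illustration, not from the discussion):

```javascript
// Assumed mask width; the remaining bits of the 53 safe-integer bits
// are available for the identifier.
var MASK_BITS = 10;
var ID_BITS = 53 - MASK_BITS;            // space left for the identifier
var MAX_ID = Math.pow(2, ID_BITS) - 1;   // largest identifier that fits

// The identifier fits iff it is between zero and MAX_ID, inclusive.
function fits(id) {
  return id >= 0 && id <= MAX_ID;
}

console.log(fits(12345));   // true
console.log(fits(-1));      // false
```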
In the general case, calculating the logarithm of a number - even approximately - is complicated, but in binary it is easy: just count the bit positions from the most significant set bit down to bit zero, inclusive. But can you do this with a single operation? I don't know, and I thought maybe someone knew a way...
– mgibsonbr
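Following up on that comment, a log-based one-liner is possible via `Math.log2` (ES2015), though this is only a sketch: because the logarithm is computed in floating point, the result could in principle be off by one near exact powers of two, which the string-length trick avoids.

```javascript
// bits(x) = floor(log2(x)) + 1 for x >= 1.
// Caution: floating-point rounding in Math.log2 may misbehave near
// exact powers of two; x.toString(2).length is the exact alternative.
function bitsViaLog(x) {
  return Math.floor(Math.log2(x)) + 1;
}

console.log(bitsViaLog(9001));  // 14
console.log(bitsViaLog(1));     // 1
```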