We have 16-, 32- and 64-bit integer values: short, int and long, respectively. But why don't we have 128-bit integers? Or 256-bit integers?
I ask because we may need to store an extremely large number that does not fit in the types we know today.
Basically, because the need is rare. Real-world problems rarely need such large numbers. For the rare cases where you do need one, it is usually better to work with a type that has total elasticity, like the BigInteger of C# or Java.
This type has the advantage of never overflowing: you can start small and grow without problems. Of course, it takes up more space and is less efficient. If you need it, use it. It can represent any number that a human being can write and that fits in a computer's memory.
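As an illustration, a minimal sketch in C# using .NET's System.Numerics.BigInteger (the type mentioned above); the value 2^200 is just an arbitrary example of a number too large for any fixed-size type:

    using System;
    using System.Numerics;

    class Program
    {
        static void Main()
        {
            // 2^200 does not fit in any fixed-size integer type,
            // but BigInteger simply grows to hold it.
            BigInteger big = BigInteger.Pow(2, 200);
            Console.WriteLine(big);
            // 1606938044258990275541962092341162602522202993782792835301376
        }
    }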
Basically, it is a pragmatic choice of the language not to pollute itself with something little needed. This type would probably have to be considered primitive, or something like that, in the language, and then it would have to be handled in many places; it gets complicated.
A minor side reason is that processors do not work directly with that many bits in their simplest instructions. But this is not decisive: even the 64-bit type had, and still has, this problem on some architectures, and the same goes for other numeric types. There are architectures that support this, but they are not common and they are not doing the basic processing that CPUs usually do. One reason for processors to avoid it is also that they would have to be built much larger just to work with these numbers, and they would not use all of that width for addressing, which would be too high a cost for little use and gain. But again, this is not the main reason we do not have this type.
On the other hand, nothing prevents someone from creating such a type if it is very important to you. Most languages allow it in a better or worse way; even Java will allow it, in a way that is not much worse, and in general all languages accept it in a worse way. On most architectures, a type like this would have to be composed of two 64-bit values, with the proper handling to always produce the expected result, as in the sketch below.
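A minimal sketch in C# of what such a composed type could look like (UInt128Sketch is a hypothetical name, not a library type; only unsigned addition is implemented):

    using System;

    // Hypothetical 128-bit unsigned integer built from two 64-bit halves.
    // Only addition is shown; a real type would also need subtraction,
    // multiplication, division, comparisons, etc.
    readonly struct UInt128Sketch
    {
        public readonly ulong High; // most significant 64 bits
        public readonly ulong Low;  // least significant 64 bits

        public UInt128Sketch(ulong high, ulong low) { High = high; Low = low; }

        public static UInt128Sketch operator +(UInt128Sketch a, UInt128Sketch b)
        {
            ulong low = a.Low + b.Low;             // wraps around modulo 2^64
            ulong carry = low < a.Low ? 1UL : 0UL; // wrap-around means a carry occurred
            return new UInt128Sketch(a.High + b.High + carry, low);
        }

        public override string ToString() => $"0x{High:X16}{Low:X16}";
    }

    class Program
    {
        static void Main()
        {
            var x = new UInt128Sketch(0, ulong.MaxValue);
            var y = new UInt128Sketch(0, 1);
            Console.WriteLine(x + y); // 0x00000000000000010000000000000000
        }
    }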
It has nothing to do with integers, but we do have a 128-bit type (technically 96 bits are used to represent the magnitude, or 97 if we consider the sign as part of the magnitude; the rest is used to represent the scale, with some bits reserved for future implementations). It is just not an integer, it is the decimal, which does not prevent it from being used as an integer as well; it is just not guaranteed to be one.
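That layout can be inspected directly with decimal.GetBits, a documented .NET API that returns the four 32-bit words of the 128-bit representation; a small sketch:

    using System;

    class Program
    {
        static void Main()
        {
            // bits[0..2] hold the 96-bit magnitude (low, mid, high);
            // bits[3] packs the scale (bits 16-23) and the sign (bit 31).
            int[] bits = decimal.GetBits(-1.5m);  // -1.5 = -(15 / 10^1)
            Console.WriteLine(bits[0]);                // 15 (magnitude, low word)
            Console.WriteLine(bits[1]);                // 0  (magnitude, mid word)
            Console.WriteLine(bits[2]);                // 0  (magnitude, high word)
            Console.WriteLine((bits[3] >> 16) & 0xFF); // 1  (scale: divide by 10^1)
            Console.WriteLine((bits[3] >> 31) & 1);    // 1  (sign: negative)
        }
    }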
https://en.wikipedia.org/wiki/128-bit
– Victor Stafusa
On VHDL I had ¯\_(ツ)_/¯
– Woss
A curiosity: Numbers.app, Apple's equivalent of Excel, has used a 128-bit calculation engine since its latest version
– vinibrsl