The method `System.currentTimeMillis` returns the timestamp value in milliseconds (also known as "thousandths of a second"). But let's take it one step at a time.
As you saw in the documentation, the return value is described as "the difference, in milliseconds, between the current date/time and 1 January 1970 at midnight, in UTC".
This date (1 January 1970, at midnight, in UTC, which you can also see written in ISO 8601 format as 1970-01-01T00:00Z) is called the Unix Epoch, and we can think of it, to simplify, as the "zero instant".
What `System.currentTimeMillis` returns is a number (a `long`, in this case) that represents the amount of milliseconds that have passed since that instant.
For example, running it just now, I got the value 1579182398676 (more than 1 trillion 579 billion milliseconds). It is important to note that this value is the same all over the world: any computer, anywhere in the world, that called this method at the same instant I did would get the same result (assuming its clock is not off, etc.).
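For reference, a minimal example of reading this value (the number printed will, of course, be different each time you run it):

```java
public class Timestamp {
    public static void main(String[] args) {
        // Amount of milliseconds elapsed since 1970-01-01T00:00Z (the Unix Epoch)
        long millis = System.currentTimeMillis();
        System.out.println(millis); // e.g. 1579182398676
    }
}
```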
That is, the timestamp represents a single instant, a point on the timeline. The catch is that in each part of the world (in each time zone), this same value can correspond to a different date and time. For example, the timestamp 1579182398676 corresponds to:
- 16 January 2020, at 13:46:38.676 in London
- 16 January 2020, at 10:46:38.676 in São Paulo
- 17 January 2020, at 02:46:38.676 in Auckland (New Zealand)
- and in each part of the world, it may be a different date and/or time
All the dates and times above correspond to the same timestamp (1579182398676), that is, to the same instant.
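You can check this yourself with the `java.time` API (more on it below); a quick sketch, with the time zone names chosen just for this example:

```java
import java.time.Instant;
import java.time.ZoneId;
import java.util.List;

public class SameInstant {
    public static void main(String[] args) {
        // The same timestamp used above
        Instant instant = Instant.ofEpochMilli(1579182398676L);

        // One instant, three different local dates/times
        for (String zone : List.of("Europe/London", "America/Sao_Paulo", "Pacific/Auckland")) {
            System.out.println(zone + " -> " + instant.atZone(ZoneId.of(zone)));
        }
    }
}
```

The `Instant` is always the same; only the `ZoneId` applied to it changes, which is exactly why each line shows a different local date and/or time.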
Whether it is standardized or not depends on what you mean by "standardized".
`System.currentTimeMillis` returns this value in milliseconds, but there are other languages/APIs that work with this value in seconds, as is the case with PHP's `time` function, for example. Python's `datetime` module, in turn, has the `datetime.timestamp()` method, which also returns the value in seconds, but unlike PHP it returns a `float` that also carries the fractions of a second.
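In Java you can get the equivalent value in seconds by simply dividing by 1000; a minimal sketch that keeps the fraction separately:

```java
public class MillisToSeconds {
    public static void main(String[] args) {
        long millis = System.currentTimeMillis();

        // Whole seconds since the Unix Epoch (a value in seconds, as PHP's time() uses)
        long seconds = millis / 1000;

        // The remaining fraction of a second, in milliseconds
        long fraction = millis % 1000;

        System.out.println(seconds + "." + String.format("%03d", fraction));
    }
}
```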
In Java itself, since JDK 8, we have the `java.time` API, which has nanosecond precision (9 decimal places in the fraction of a second), and this results in loss of precision when converting the types of this API to `Date` or `Calendar` (see the beginning of this answer for some details about that). The catch is that methods like `Instant.now()` use "the best clock available in the system", which may or may not have nanosecond precision (so the number of decimal places returned depends on the environment in which the code runs).
Not to mention that the `java.time.Instant` class can work both with the value in milliseconds (with the `ofEpochMilli` and `toEpochMilli` methods) and with the value in seconds (with `ofEpochSecond` and `getEpochSecond`), which is quite rare, since most languages and APIs only work with one of these options.
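Roughly:

```java
import java.time.Instant;

public class EpochConversions {
    public static void main(String[] args) {
        // Building an Instant from a value in milliseconds, and from a value in seconds
        Instant fromMillis = Instant.ofEpochMilli(1579182398676L);
        Instant fromSeconds = Instant.ofEpochSecond(1579182398L);

        // Reading the value back in both units
        System.out.println(fromMillis.toEpochMilli());    // 1579182398676
        System.out.println(fromMillis.getEpochSecond());  // 1579182398 (the fraction is discarded)
        System.out.println(fromSeconds.getEpochSecond()); // 1579182398
    }
}
```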
That is, even though many languages use the concept of a timestamp, you must take these differences into account. Should we consider it standardized because they all use the same idea, or not standardized because each one can return the value with a different precision?
Another detail is that the Unix Epoch is not the only value used as the "zero instant". In .NET, for example, a `DateTime` uses the amount of "ticks" since 1 January of the year 1 (not 1901 nor 2001, but literally the year 1), at midnight, in UTC (a "tick" being equal to 100 nanoseconds). Moreover, there are several other epochs used in computing.