Problem of the Year 2038


4

What is the problem of the year 2038?

  • Is it similar to the millennium bug?

  • Do solutions to avoid data disruption already exist?

  • Related: https://answall.com/q/100231/112052

  • 2

    It’s the new end of time (literally), written by 32-bit programmers.

3 answers

4


The problem comes from the way many systems have chosen to count time. Several systems/languages/APIs currently use Unix timestamps, which measure the amount of time elapsed since a predefined instant.

The instant chosen as the start (the "zero instant") is known as the Unix Epoch, and its value is January 1, 1970, at midnight, in UTC (1970-01-01T00:00:00Z). Timestamps are values that indicate how much time has passed since that zero instant.
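In C, for example, this is essentially what the standard time function returns. A minimal sketch, assuming a POSIX-like system where time_t counts seconds since the Unix Epoch:

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        /* Seconds elapsed since 1970-01-01T00:00:00Z */
        time_t now = time(NULL);
        printf("Seconds since the Unix Epoch: %lld\n", (long long)now);
        return 0;
    }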

Many systems keep this value in signed 32-bit integers representing the number of seconds since the Unix Epoch. The greatest value this type can hold is 2,147,483,647 (slightly more than two billion). And 2,147,483,647 seconds after the Unix Epoch amounts to 2038-01-19T03:14:07Z (January 19, 2038, 03:14:07, UTC).

One second after that, the next value would be 2,147,483,648, but since this exceeds the maximum that a signed 32-bit integer supports, an overflow occurs and the value becomes negative. More precisely, it becomes -2,147,483,648, which is equivalent to about 2 billion seconds before the Unix Epoch, i.e. 1901-12-13T20:45:52Z (December 13, 1901, at 20:45:52 UTC).
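To make the wraparound concrete, here is a minimal sketch that simulates a signed 32-bit timestamp. It assumes the host has a 64-bit time_t so gmtime can print both dates, and since signed overflow is undefined behavior in C, the wrap is simulated with unsigned arithmetic:

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void) {
        /* The largest value a signed 32-bit timestamp can hold. */
        int32_t t = INT32_MAX;              /* 2,147,483,647 */

        time_t wide = (time_t)t;
        printf("Last instant:     %s", asctime(gmtime(&wide)));
        /* -> Tue Jan 19 03:14:07 2038 */

        /* One second later the value wraps around to the most
           negative 32-bit number. */
        t = (int32_t)((uint32_t)t + 1);     /* -2,147,483,648 */

        wide = (time_t)t;
        printf("One second later: %s", asctime(gmtime(&wide)));
        /* -> Fri Dec 13 20:45:52 1901 */

        return 0;
    }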


Any application/system/language/API that uses signed 32-bit integers to store timestamps is subject to this problem. If other types are used, the problem occurs at a different date.

If the system uses unsigned 32-bit integers, the highest possible value is 4,294,967,295, so in that case a similar problem will only occur in February 2106.
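Again assuming a platform with a 64-bit time_t, the limit date for an unsigned 32-bit counter can be checked in a couple of lines:

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void) {
        /* Largest value an unsigned 32-bit timestamp can hold: 2^32 - 1 */
        time_t limit = (time_t)UINT32_MAX;  /* 4,294,967,295 */
        printf("%s", asctime(gmtime(&limit)));
        /* -> Sun Feb  7 06:28:15 2106 */
        return 0;
    }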

Many languages and systems already use 64-bit timestamps, for example, which ensures a much wider range (but future generations may still face the year 292,277,026,596 problem, unless computer architecture - and timekeeping, among other factors - changes dramatically by then).
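That year can be sanity-checked with simple arithmetic: divide the largest signed 64-bit value by the average number of seconds in a Gregorian year and add it to 1970. This is a rough estimate that ignores leap seconds and calendar details:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* Average Gregorian year: 365.2425 days * 86,400 s = 31,556,952 s */
        long double year = 1970.0L + (long double)INT64_MAX / 31556952.0L;
        printf("64-bit timestamps overflow around the year %.0Lf\n", year);
        /* -> ~292,277,026,596 */
        return 0;
    }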

In any case, the solutions vary greatly from one system to another, since everything depends on how much each application relies on the types that hold timestamp values, and on how it manipulates those values.

For example, if an application uses signed 32-bit integers and handles dates prior to 1970 (i.e., dates whose timestamp is a negative number), it could not simply switch to unsigned 32-bit integers (which only support non-negative values).
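To see why, note that negative timestamps are how pre-1970 instants are encoded in the first place. A minimal sketch, again assuming a 64-bit time_t:

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        /* One day (86,400 s) before the Unix Epoch */
        time_t t = -86400;
        printf("%s", asctime(gmtime(&t)));
        /* -> Wed Dec 31 00:00:00 1969 */
        return 0;
    }

An unsigned type would reinterpret that same bit pattern as a date far in the future, silently corrupting every pre-1970 value.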

Many systems can be recompiled/changed to use 64-bit types, but depending on the case this may not be possible without causing backward-compatibility problems. Each case is different, and there is no universal solution.
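As a starting point, one can at least check what a given build uses. A quick probe (the exact width of time_t is implementation-defined):

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        printf("time_t on this build: %zu bits\n", sizeof(time_t) * 8);
        if (sizeof(time_t) < 8)
            puts("Warning: subject to the 2038 problem.");
        else
            puts("64-bit time_t: safe well past 2038.");
        return 0;
    }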

  • 1

    Now I understand why the old programmer put all the dates in strings. Brain 100

2

What is the bug: it is the exhaustion of the 32-bit number sequence used to count time from mid-day on January 1, 1970, adopted mainly by the C language and its derivatives. The maximum number is 2,147,483,647 and it will be reached on 01/19/2038; from there, the count wraps around to a negative value and climbs back toward zero. So anyone with systems that use future dates may have problems with this.

To solve it, one alternative is to change the way the date is stored from 32 bits to 64 bits, which gives many, many more years of survival.

In short, that’s it, but I honestly don’t believe it’s anything major; maybe it’s more of an unnecessary panic, like the Millennium Bug. As far as I remember, no nuclear bomb was dropped because of that, hehehe.

If you search around, there’s already plenty of material on this.

I have researched this because my system uses future dates due to long-term financing, but I haven’t run into many difficulties.

I hope I’ve helped.

  • 1

    In fact the count is made from midnight of 1/1/1970 in UTC: https://en.wikipedia.org/wiki/Unix_time - of course that same instant could be noon in some part of the world, depending on the time zone, but UTC is generally used as a reference for this :-)

  • Good point! :)

-1

What is it: this possible problem became popular knowledge through the song “Gangnam Style”: the video became so famous that it almost reached the maximum value of views, and from that the phenomenon became famous. In that year, 2038, 32-bit processors may stop working, because they will no longer be able to count time.

In 2038, at 3:14:05 on March 19, computers using 32-bit systems will not be able to handle the date change, as they will have reached their maximum count limit.

What has to be done? Systems that use 32-bit processors will stop working or be replaced within the next 24 years. Let’s hope that’s enough time to plan the infrastructure changes.

  • 3

    In fact, the Y2038 bug has been known for a long time (well before YouTube existed), and the Gangnam Style case had nothing to do with dates exactly, but rather with the view counter (since the video was about to exceed the maximum that 32 bits support). Although it is related to the same thing (the maximum number 32 bits can hold), it is incorrect to state that the year 2038 problem arose from Gangnam Style.

  • I considered these facts implicit, but I’ve corrected it.

  • Another point (and sorry for being so pedantic) is that the time limit is 3:14 am UTC. But locally, depending on the time zone, it could be a different time and day (for example, in California that same instant will be 19:14 on the 18th). Anyway, the problem will occur in 2038 for everyone :-)

  • Ah, and the correct date is January 19 (not March) 2038, at 3:14:07 (7 seconds, not 5) in UTC :-)

  • 2

    Still, systems and processors of 32 bits or less can work perfectly well as long as they use data structures correctly. There is no problem with an 8-bit processor storing dates far beyond 2038; the problem is the programming, not the processor (see the sketch below).
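For instance, here is a minimal sketch of that idea: an explicit 64-bit timestamp type that works the same on any word size, since the compiler emits multi-word arithmetic where needed (the type name is just illustrative):

    #include <stdio.h>
    #include <stdint.h>

    /* Seconds since the Unix Epoch, held in an explicit 64-bit integer;
       fine even on 8- or 16-bit CPUs. */
    typedef int64_t my_timestamp;

    int main(void) {
        my_timestamp t = 4102444800;  /* 2100-01-01T00:00:00Z, well past 2038 */
        printf("%lld seconds since the epoch\n", (long long)t);
        return 0;
    }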
