Anyone remember the really bad Y2K explanations?
Like "The year 2000 problem is, that computers think it's 1900 instead."
And I never understood why computers would care if it's 2000 or 1900 if they just use the last two digits anyway.
Well, the year 2038 problem made me realize why Y2K was a problem at all and how that stuff works.
And it's thanks to capitalism that these are even problems at all.
@compufox @maxine Time on most computers is measured as seconds since the Unix epoch (1970-01-01 00:00:00 GMT). In January 2038, the number of seconds will be more than what a 32-bit signed integer can store, so it'll overflow and make computers think it's December 1901 instead.
The only real fix is for systems, programs, and data formats that use Unix time to switch from 32-bit integers to 64-bit.
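A minimal sketch of what that looks like in practice, assuming a modern machine where time_t is already 64 bits (the 32-bit wraparound is simulated here with explicit casts, so the names and casts are just for illustration):

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    /* Largest value a 32-bit signed counter of seconds can hold. */
    int32_t last_second = INT32_MAX;  /* 2147483647 */

    /* Interpreted as seconds since 1970-01-01 00:00:00 GMT,
     * that's 2038-01-19 03:14:07. */
    time_t ok = (time_t)last_second;
    printf("last 32-bit second: %s", asctime(gmtime(&ok)));

    /* One second later the 32-bit counter wraps around to INT32_MIN,
     * which maps back to 1901-12-13 20:45:52. */
    int32_t wrapped = (int32_t)((uint32_t)last_second + 1u);
    time_t bad = (time_t)wrapped;
    printf("after overflow:     %s", asctime(gmtime(&bad)));

    /* With a 64-bit integer the same arithmetic just keeps counting
     * forward into 2038 and beyond. */
    int64_t fine = (int64_t)last_second + 1;
    time_t good = (time_t)fine;
    printf("with 64-bit time:   %s", asctime(gmtime(&good)));
    return 0;
}
```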
@arielmt @compufox Worth mentioning: any system newer than about the mid-'90s should be able to switch to 64-bit time.