2100 and 2400 will be a shitshow
Not as much as 2038
Yeah, that's a different shitshow, but agreed it's likely to be worse - like Y2K, the effects are smeared out before and after the date.
Why?
Because of the Year 2038 problem.
32-bit systems will stop working. The Unix timestamp, which increases by 1 every second and started at the first second of 1970, will overflow the maximum of a signed 32-bit integer. Bad things will follow.
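If you want to see what that looks like, here's a quick sketch (TypeScript just for illustration; the real problem lives in C code using a 32-bit time_t):

// Simulate a signed 32-bit seconds counter wrapping past its maximum.
const INT32_MAX = 2 ** 31 - 1; // 2,147,483,647 seconds since 1970-01-01T00:00:00Z
const wrapped = (INT32_MAX + 1) | 0; // `| 0` truncates to a signed 32-bit value

console.log(new Date(INT32_MAX * 1000).toISOString()); // 2038-01-19T03:14:07.000Z, the last good second
console.log(wrapped);                                  // -2147483648
console.log(new Date(wrapped * 1000).toISOString());   // 1901-12-13T20:45:52.000Z, hello "bad things"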
2038 will certainly be a shit show
Yeah but I’ll be dead so not my problem lmao
Nah.
Same thing happened in 2000 and it was a mouse’s fart.
Because of months of preparation. I know, I was doing it.
And now that every time library has been updated, we’re safe until our grandchildren reimplement those bugs in a language that has not yet been invented.
I've already seen reimplementations of 2-digit dates here and there.
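Usually something like this (my own toy example, not anyone's actual code): a two-digit year plus a guessed century pivot, which works right up until it doesn't.

// Hypothetical 2-digit date handling: pick a century with a pivot year.
function expandTwoDigitYear(yy: number, pivot = 50): number {
  return yy < pivot ? 2000 + yy : 1900 + yy; // below the pivot means 20xx, otherwise 19xx
}

console.log(expandTwoDigitYear(99)); // 1999 - fine
console.log(expandTwoDigitYear(24)); // 2024 - fine
console.log(expandTwoDigitYear(51)); // 1951 - or was that supposed to be 2051?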
LOL fuck those guys.
Fortunately I will not be involved. Hopefully I can make something from 2038 though.
You're not the only one foreseeing a nice consultant payday there.
Why
2100 is not a leap year (divisible by 100 but not by 400). 2400 is a leap year (divisible by 400). Developing for dates is a minefield.
Because naive code thinks they're both leap years, but only 2400 actually is.
0 === year % 4
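That's the naive check, and it's exactly what bites in 2100. The full Gregorian rule, for comparison (just a sketch):

// Naive: divisible by 4. Calls 2100 a leap year, which it isn't.
const naiveIsLeap = (year: number) => 0 === year % 4;

// Full rule: divisible by 4, except centuries, except every 400th year.
const isLeap = (year: number) => year % 4 === 0 && (year % 100 !== 0 || year % 400 === 0);

console.log(naiveIsLeap(2100), isLeap(2100)); // true false
console.log(naiveIsLeap(2400), isLeap(2400)); // true true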
Luckily, none of us will be there.
Won’t the computer’s clock reset every time you go to sleep and stop cranking the power generator?
Yeah, who knows if our computers will be sticks by either date.