May. 9th, 2012 13:50
The Year 2038 Problem
[ Context for noncomputergeeks: A lot of computer systems based on Unix or Unix-derived software store times as the number of seconds since the beginning of 1970. Midnight UTC on Dec 31, 1969 / Jan 1, 1970 is time "0". If you use a signed 32-bit number to store time, the highest possible value corresponds to a date in January 2038; over the past decade, the computer world has been shifting from 32-bit to 64-bit systems, but there are still lots of data formats based on storing 32-bit values. ]
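You can see exactly where that limit lands with a couple of lines of Python (just illustrating the arithmetic in the paragraph above):

```python
from datetime import datetime, timezone

# Largest value a signed 32-bit integer can hold
MAX_INT32 = 2**31 - 1  # 2,147,483,647

# Interpreted as seconds since the Unix epoch (1970-01-01 00:00 UTC),
# that's the last representable moment before the rollover.
rollover = datetime.fromtimestamp(MAX_INT32, tz=timezone.utc)
print(rollover)  # 2038-01-19 03:14:07+00:00
```

One second later, a signed 32-bit counter wraps to a large negative number, i.e. a date in December 1901.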
I used to think we have plenty of time to convert all of our computing and data formats to 64 bit times before the year 2038. But an email discussion at work made me look at it in a new light:
"Back when I worked at a bank, we started seeing bugs in 2008 for 30 year mortgages."
64-bit time_t is the right thing to do, but I bet we'll see all sorts of ugliness like what's in the messages database on the iPhone -- there'll be a bit flag that specifies a different epoch start. Epoch 0 starts in 1970, epoch 1 starts in 2038...
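A flagged-epoch hack like that might look something like this sketch -- the field layout here is entirely hypothetical, not any actual iPhone format: the top bit of a 32-bit field selects the epoch, and the low 31 bits count seconds from it.

```python
from datetime import datetime, timezone

EPOCH_SPAN = 2**31  # seconds covered by each 31-bit epoch window

def decode_flagged_time(raw32):
    """Decode a 32-bit field: high bit picks the epoch, low 31 bits are seconds.

    Epoch 0 starts at 1970-01-01 UTC; epoch 1 starts 2^31 seconds later, in 2038.
    (Hypothetical layout, for illustration only.)
    """
    epoch_flag = (raw32 >> 31) & 1
    seconds = raw32 & 0x7FFFFFFF
    return datetime.fromtimestamp(epoch_flag * EPOCH_SPAN + seconds, tz=timezone.utc)

print(decode_flagged_time(0))        # 1970-01-01 00:00:00+00:00
print(decode_flagged_time(1 << 31))  # 2038-01-19 03:14:08+00:00
```

Of course, this is just unsigned 32-bit time in disguise, so it only buys another 68 years (until 2106) before the same problem comes back.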
Although, the Y2K problem was due to storing dates as (too short) strings, so at least those databases won't break again until 2100...
1) I love that "noncomputergeeks" can be multiply parsed.
2) It's interesting that this problem is less hyped in the popular press -- I assume that's in part because it doesn't have a Nice Round Number associated with it, like Y2K did.
Thank you, Unix-using mortgage lenders.