Year 2038 problem

[Figure: Example showing how the date would reset, at 03:14:08 UTC on 19 January 2038.]

The year 2038 problem (also known as the Unix Millennium Bug, or Y2K38 by analogy to the Y2K problem) may cause some computer software to fail before or in the year 2038. The problem affects all software and systems that store system time as a signed 32-bit integer and interpret this number as the number of seconds since 00:00:00 UTC on Thursday, 1 January 1970.[1] The latest time that can be represented this way is 03:14:07 UTC on Tuesday, 19 January 2038.[2] Times beyond this moment will "wrap around" and be stored internally as a negative number, which these systems will interpret as a date in 1901 rather than 2038. This is likely to cause problems for users of these systems due to erroneous calculations.
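
The wraparound can be demonstrated directly. The following C program is a minimal illustrative sketch (not drawn from a cited source), and assumes a platform whose gmtime() accepts negative time_t values, as the GNU C Library's does:

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void) {
        /* The last second a signed 32-bit counter can hold:
           2147483647 decodes to Tue Jan 19 03:14:07 2038 UTC. */
        time_t last = (time_t)INT32_MAX;
        printf("INT32_MAX -> %s", asctime(gmtime(&last)));

        /* One tick later the bit pattern wraps to INT32_MIN,
           which decodes to Fri Dec 13 20:45:52 1901 UTC. */
        time_t wrapped = (time_t)INT32_MIN;
        printf("INT32_MIN -> %s", asctime(gmtime(&wrapped)));
        return 0;
    }

On such a platform the program prints the last representable second, Tue Jan 19 03:14:07 2038, followed by the wrapped date, Fri Dec 13 20:45:52 1901.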

Further, while most programs will only be affected in or very close to 2038, programs that work with future dates begin to run into problems much sooner. For example, a program that works with dates 20 years in the future would have to be fixed no later than 2018.

Because most 32-bit Unix-like systems store and manipulate time in this format, it is usually called Unix time, and so the year 2038 problem is often referred to as the Unix Millennium Bug. However, non-Unix operating systems and software that store and manipulate time this way are just as vulnerable.

Early problems

In May 2006, reports surfaced of an early manifestation of the Y2038 problem in the AOLserver software. The software had been designed with a kludge to handle a database request that should "never" time out. Rather than handling this special case explicitly, the initial design simply specified an arbitrary time-out date far in the future: the default configuration specified that the request should time out after one billion seconds. One billion seconds (approximately thirty-two years) after 9:27:28 pm on 12 May 2006 is beyond the 2038 cutoff date. Thus, after this time, the time-out calculation overflowed and returned a date that was actually in the past, causing the software to crash. When the problem was discovered, AOL's server managers had to edit the configuration file and set the time-out to a lower value.[3][4]
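
The arithmetic of the failure can be reconstructed in a short illustrative sketch (the variable names are invented for the example, not AOLserver's actual code, and the timestamp assumes the quoted local time was US Eastern Daylight Time, i.e. epoch second 1147483648):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        int32_t now     = 1147483648;  /* 9:27:28 pm EDT, 12 May 2006 (assumed) */
        int32_t timeout = 1000000000;  /* "never" time out: one billion seconds */

        /* Compute in 64 bits, then truncate the way a 32-bit time_t
           would; the final cast is implementation-defined but yields
           the two's-complement wraparound on mainstream compilers. */
        int32_t deadline = (int32_t)(uint32_t)((int64_t)now + timeout);

        printf("deadline = %ld (%s now)\n", (long)deadline,
               deadline < now ? "BEFORE" : "after");
        return 0;
    }

The 64-bit sum, 2147483648, exceeds the 32-bit maximum by exactly one; truncated, it becomes -2147483648, a "deadline" decades in the past.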

Solutions

There is no straightforward and general fix for this problem for existing CPU and operating system combinations, existing file systems, or existing binary data formats. Changing the definition of the time_t data type to a 64-bit type would break binary compatibility for software and data storage, and may affect any code that deals with the binary representation of time. Changing time_t to an unsigned 32-bit integer, which would keep timestamps accurate until the year 2106, would affect programs that deal with time differences or with dates before 1970, and thus would also break binary compatibility in many cases.
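
The trade-off of the unsigned reinterpretation can be seen in another illustrative sketch:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* Unsigned 32 bits cover about 136 years past 1970,
           running out early in the year 2106. */
        printf("unsigned range ~ %.1f years past 1970\n",
               UINT32_MAX / 31556952.0);   /* mean Gregorian year */

        /* But a pre-1970 date such as -86400 (31 December 1969)
           is misread as a huge positive value: */
        int32_t before_epoch = -86400;
        printf("-86400 reinterpreted: %lu\n",
               (unsigned long)(uint32_t)before_epoch);
        return 0;
    }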

Most operating systems for 64-bit architectures already use 64-bit integers for their time_t, and these operating systems are becoming more common, particularly in desktop and server environments. Using a signed 64-bit value pushes the wraparound date more than twenty times further into the future than the present age of the universe: approximately 292 billion years from now, on Sunday, 4 December 292,277,026,596 AD.[2] As of 2007, however, hundreds of millions of 32-bit systems are deployed, many in embedded systems, and it is far from certain that they will all be replaced by 2038. Additionally, 32-bit applications running on 64-bit systems are likely to be affected by the issue.
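
The figure can be checked with back-of-the-envelope arithmetic (an illustrative sketch; the 13.8-billion-year age of the universe is an assumption of the example, not taken from the cited source):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        double year  = 31556952.0;               /* seconds per mean Gregorian year */
        double years = (double)INT64_MAX / year; /* about 2.92e11 years */
        printf("signed 64-bit range: %.3g years\n", years);
        printf("multiple of a 13.8e9-year universe: %.1f\n", years / 13.8e9);
        return 0;
    }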

Despite the eighteen-to-twenty-four-month generational update cycle of modern computer systems technology, embedded computers may operate unchanged for the life of the system they control. The use of 32-bit time_t has also been encoded into some file formats, which means it can live on well beyond the life of the machines for which those file formats were originally designed.

Alternative proposals have been made (some of which are already in use), including storing either milliseconds or microseconds since an epoch (typically either 1 January 1970 or 1 January 2000) in a signed 64-bit integer, providing a minimum range of 300,000 years.[5][6] Other proposals for new time representations provide different precisions, ranges, and sizes (almost always wider than 32 bits), as well as solving other related problems, such as the handling of leap seconds.
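
The quoted ranges follow from the same arithmetic; this illustrative sketch computes the positive span for millisecond and microsecond resolutions (the full signed span is about twice each figure):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        double year = 31556952.0;  /* seconds per mean Gregorian year */
        printf("64-bit milliseconds: %.3g years\n",
               INT64_MAX / 1000.0 / year);      /* about 2.92e8 years */
        printf("64-bit microseconds: %.3g years\n",
               INT64_MAX / 1000000.0 / year);   /* about 2.92e5 years */
        return 0;
    }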

References

  1. ^ "The Open Group Base Specifications Issue 6 IEEE Std 1003.1, 2004 Edition (definition of epoch)". IEEE and The Open Group. The Open Group. 2004. Retrieved 2008-03-07.
  2. ^ a b Diomidis Spinellis (2006). Code Quality: The Open Source Perspective. Effective Software Development Series (illustrated ed.). Adobe Press. ISBN 0321166078.
  3. ^ "The Future Lies Ahead". 28 June 2006. Retrieved 2006-11-19.
  4. ^ "Weird 'memory leak' problem in AOLserver 3.4.2/3.x". 2006-05-12.
  5. ^ "Unununium Time". Archived from the original on 4 August 2006. Retrieved 2006-11-19.
  6. ^ Sun Microsystems. "Java API documentation: System.currentTimeMillis". Retrieved 2007-05-07.