8
votes

As we know, all dates created with the JavaScript Date constructor are calculated in milliseconds from 01 January 1970 00:00:00 Universal Time (UTC), with a day containing 86,400,000 milliseconds. This implies that JS uses a UNIX timestamp. I set my system date to something beyond 2038 (say 14 Nov 2039) and ran this script:

    <script>
      var d = new Date();
      alert(d.getFullYear()+" "+d.getMonth()+" "+d.getDate());
    </script>

It successfully alerts 2039 10 14, unlike PHP, which prints "9 Oct, 1903 07:45:59".

How does JS handle this? An explanation would be appreciated, as I am confused!


5 Answers

14
votes

32-bit PHP uses 32-bit integers, whose maximum value puts the last UNIX timestamp they can express in the year 2038. That's widely known as the Y2K38 problem, and it affects virtually all 32-bit software using UNIX timestamps. Moving to 64-bit builds, or to libraries that work with other timestamp representations (in the case of PHP, the DateTime class), solves this problem.

JavaScript doesn't have integers, only floats, whose range is far larger than any practical timestamp (but which in return lose precision for very large values).
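
To make that concrete, here is a minimal sketch you can paste into a console (the commented values assume UTC and are my own illustration, not taken from the question):

    // The last instant a signed 32-bit UNIX timestamp (in seconds) can hold:
    var limitSeconds = Math.pow(2, 31) - 1;          // 2147483647
    new Date(limitSeconds * 1000).toISOString();     // "2038-01-19T03:14:07.000Z"

    // JavaScript stores its time value as a 64-bit float (in milliseconds),
    // so a date past 2038 is still represented exactly:
    new Date("2039-11-14T00:00:00Z").getTime();      // 2204841600000, comfortably above 2^31 - 1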

5
votes

Javascript doesn't have integer numbers, only floating point numbers (details can be found in the standards document).

That means that you can represent some really large numbers, but at the cost of precision. A simple test is this:

    i = 1384440291042
     => 1384440291042
    i = 13844402910429
     => 13844402910429
    i = 138444029104299
     => 138444029104299
    i = 1384440291042999
     => 1384440291042999
    i = 13844402910429999
     => 13844402910430000
    i = 138444029104299999
     => 138444029104300000
    i = 1384440291042999999
     => 1384440291043000000
    i = 13844402910429999999
     => 13844402910430000000

As you can see, the number is not guaranteed to be kept exact. The outer limit of integer precision in JavaScript (where you will actually get back the same value you put in) is 9007199254740992. That would be good up until 285428751-11-12T07:36:32+00:00 according to my conversion test :)

The simple answer is that JavaScript internally uses a larger data type than the long int (4 bytes, 32 bit) that is used for the C-style epoch ...
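
A minimal sketch of that precision boundary (2^53), if you want to check it in a console yourself:

    var limit = Math.pow(2, 53);    // 9007199254740992
    limit - 1;                      // 9007199254740991 -- still exact
    limit + 1;                      // 9007199254740992 -- rounded: the odd neighbour cannot be represented
    limit + 1 === limit;            // true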

2
votes

This implies that JS uses UNIX timestamp.

Just a sidenote: a Unix timestamp counts seconds since 1970, while JS time is milliseconds since 1970. So a JS timestamp would stop fitting in a 32-bit int much earlier (but JS does not use a 32-bit int for this).
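
A small sketch of the difference (values in the comments assume UTC):

    var int32Max = Math.pow(2, 31) - 1;          // 2147483647

    // Interpreted as *seconds* (UNIX style), the 32-bit limit is reached in 2038:
    new Date(int32Max * 1000).toISOString();     // "2038-01-19T03:14:07.000Z"

    // Interpreted as *milliseconds* (JS style), it only covers about 25 days:
    new Date(int32Max).toISOString();            // "1970-01-25T20:31:23.647Z"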

2
votes

It can. Try out new Date(8640000000000000)

Sat Sep 13 275760 03:00:00 GMT+0300 (Eastern European Summer Time)

Year 275760 is a bit beyond 2038 :)

Read the spec section 15.9.1.1

http://ecma-international.org/ecma-262/5.1/#sec-15.9.1.1

A Date object contains a Number indicating a particular instant in time to within a millisecond. Such a Number is called a time value. A time value may also be NaN, indicating that the Date object does not represent a specific instant of time.

Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. In time values leap seconds are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day. ECMAScript Number values can represent all integers from –9,007,199,254,740,992 to 9,007,199,254,740,992; this range suffices to measure times to millisecond precision for any instant that is within approximately 285,616 years, either forward or backward, from 01 January, 1970 UTC.

The actual range of times supported by ECMAScript Date objects is slightly smaller: exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.

The exact moment of midnight at the beginning of 01 January, 1970 UTC is represented by the value +0.
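
That ±8,640,000,000,000,000 ms range is enforced: one millisecond past it, the time value is clipped to NaN. A quick sketch (the printed strings vary by engine and time zone):

    new Date(8640000000000000);     // Sat Sep 13 275760 ... -- the latest representable instant
    new Date(8640000000000001);     // Invalid Date -- the time value is out of range, so it becomes NaN
    new Date(-8640000000000000);    // the earliest representable instant, in the year -271821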

1
votes

The year 2038 problem applies only to signed 32-bit timestamps, which PHP and some other systems use. A signed 32-bit timestamp's range runs out in January 2038, when the count of seconds since 1970 exceeds the largest value a signed 32-bit integer can hold.

From the Wikipedia article (emphasis mine):

The year 2038 problem may cause some computer software to fail at some point near the year 2038. The problem affects all software and systems that both store system time as a signed 32-bit integer, and interpret this number as the number of seconds since 00:00:00 UTC on Thursday, 1 January 1970.[1] The furthest time that can be represented this way is 03:14:07 UTC on Tuesday, 19 January 2038.[2] ... This is caused by integer overflow. The counter "runs out" of usable digits, "increments" the sign bit instead, and reports a maximally negative number (continuing to count up, toward zero). This is likely to cause problems for users of these systems due to erroneous calculations.

Storing a timestamp in a variable with a greater range solves the problem.
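
You can even simulate the overflow in JavaScript, because bitwise operators coerce their operands to signed 32-bit integers (a minimal sketch; the commented values assume UTC):

    var max32 = 2147483647;                             // 2^31 - 1 seconds: 2038-01-19T03:14:07Z
    (max32 + 1) | 0;                                    // -2147483648 -- the counter wraps to a maximally negative number
    new Date(((max32 + 1) | 0) * 1000).toISOString();   // "1901-12-13T20:45:52.000Z" -- an instant far in the past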