I'm confused. In JavaScript:
> new Date('2012-1-15') - new Date('2012-01-15')
21600000
Why is that? (21600000 / 1000 / 3600 == 6, so the two dates are exactly six hours apart.)
The date format yyyy-mm-dd (2012-01-15) is parsed as a UTC date, while yyyy-m-dd (2012-1-15) is parsed as a local date. You can see this by calling .toString() on each:
> (new Date( '2012-01-15' )).toString()
"Sat Jan 14 2012 16:00:00 GMT-0800 (Pacific Standard Time)"
> (new Date( '2012-1-15' )).toString()
"Sun Jan 15 2012 00:00:00 GMT-0800 (Pacific Standard Time)"
Note that I am in California, hence the Pacific Standard Time. If you are in a different time zone, you will get different results.
When JavaScript parses a date string, it tries the standard ISO 8601 format (which yyyy-mm-dd matches) before falling back to localized, implementation-specific formats. The ISO format may end in a timezone offset from GMT, and when that offset is missing (as it is here) it is assumed to be 0, i.e. UTC. To get the same instant from both strings you would need the full timestamp with an explicit offset: 2012-01-15T00:00:00-08:00.
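For example, assuming a machine in Pacific Standard Time (UTC-8) and an engine that parses '2012-1-15' as local time, spelling out the offset makes the two expressions refer to the same instant:
> new Date('2012-01-15T00:00:00-08:00') - new Date('2012-1-15')
0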
The result of new Date('2012-1-15') is implementation-dependent (ECMAScript standard, clause 15.9.4.2).
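As a sketch of one way to sidestep string parsing entirely (the variable names are just illustrative), you can pass numeric components, which behave the same in every engine:
var localDate = new Date(2012, 0, 15);           // month is 0-indexed, so 0 === January; always local midnight
var utcDate   = new Date(Date.UTC(2012, 0, 15)); // always midnight UTC, regardless of the machine's zone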
var a = new Date('2012-1-16');  // non-ISO string: parsed as local time
var b = new Date('2012-01-16'); // ISO yyyy-mm-dd string: parsed as UTC
alert(a);
alert(b);
In the first case, the constructor sets the time to 00:00 in your local time zone; in the second case, it initializes the time relative to GMT+00:00.
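A rough way to see the difference, assuming a machine at UTC-8 (your numbers will vary with your time zone):
a.toISOString();            // "2012-01-16T08:00:00.000Z" -- local midnight is 08:00 UTC
b.toISOString();            // "2012-01-16T00:00:00.000Z" -- already UTC midnight
(a - b) / (1000 * 60 * 60); // 8 -- the gap equals the local offset from UTC in hours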