This caused me a bit of a headache last night and I wanted to understand why the getDate method in the Date object is 1-based (returns values from 1-31) while the getMonth method is 0-based (returns values from 0-11).
I suppose months are 0-based because Java did it the same way when the JavaScript language was designed.
EDIT: Oracle has since taken down the older Java documentation, but there is an archived version of that page.
In JavaScript, counters start at zero.
Months do not necessarily have to be represented by a digit. "Months" is a countable sequence, and the first element of a sequence is referred to by index zero.
In real life, however, days are written as fixed numerals. Although days are also countable, it would be extremely confusing to represent the first day of the month as day zero.
So I dropped Brendan Eich a tweet asking him the question (for those who don't know, he is the creator of JS), and his response was:
@magrangs because that is how java.util.Date did it.
https://twitter.com/BrendanEich/status/179610205317902337
Hard to tell, but I have a suspicion that the month is 0-based so it can be used directly as an index into an array of month names:
var months = ["Jan", "Feb", /* ... */];
months[new Date().getMonth()];