If you are looking for something provided out of the box by .NET, the answer is no. You would have to write your own extension method to do this.
The month is zero-based in JavaScript.
Days and years are one-based.
Go figure.
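For instance, a quick sketch of the asymmetry (the date used here is arbitrary):

var d = new Date(2012, 0, 15);   // month 0 = January, so this is 15 January 2012
alert(d.getMonth());             // 0  (zero-based)
alert(d.getDate());              // 15 (one-based)
alert(d.getFullYear());          // 2012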
UPDATE
The reason this is so, according to the creator of JavaScript, is:
JS had to "look like Java" only less so, be Java's dumb kid brother or boy-hostage sidekick. Plus, I had to be done in ten days or something worse than JS would have happened.
http://www.jwz.org/blog/2010/10/every-day-i-learn-something-new-and-stupid/#comment-1021
As Eric said, this is because months are represented as a 0-11 range.
This is a common behavior - same is true of Perl results from localtime(), and probably many other languages.
This is likely originally inherited from Unix's localtime() call. (do "man localtime")
The reason is that days and years are plain integers, while the month number is used as an index into an array, and array indexing in most languages (especially C, where the underlying call is implemented on Unix) starts at 0.
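To see why an index-style month is convenient in JavaScript itself, here is a minimal sketch (the monthNames array is just an illustrative helper, not part of the language):

var monthNames = ["January", "February", "March", "April", "May", "June",
                  "July", "August", "September", "October", "November", "December"];
var now = new Date();
alert(monthNames[now.getMonth()]);   // the zero-based month indexes the array directly
alert(now.getMonth() + 1);           // add 1 when you want the human-readable month number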
// Date.UTC is a static function (not a constructor); like the Date
// constructor, it expects a zero-based month, so getMonth() is passed
// through without a +1 adjustment.
var date1 = new Date();
// Date.UTC(year, month, day [, hrs] [, min] [, sec])
date1 = Date.UTC(date1.getFullYear(), date1.getMonth(), date1.getDate(),
                 date1.getHours(), date1.getMinutes(), date1.getSeconds());
// getTime() already returns milliseconds since the epoch.
var date2 = new Date();
date2 = date2.getTime();
alert(date1);
alert(date2);