In ECMA-compliant implementations, time is measured in milliseconds since the first of January 1970 UTC.
Leap seconds are ignored, and each day is assumed to contain exactly 86,400,000 milliseconds. The available range of number values spans roughly 18 quadrillion (plus or minus 2^53), which is sufficient to measure, to millisecond accuracy, a time period of nearly 286,000 years forwards or backwards from 01-January-1970 UTC.
Date objects don't use this entire range of values and cope only with 100 million days (8,640,000,000,000,000 milliseconds) either side of 01-January-1970 UTC. Still, that is a time period that covers just over half a million years. So, no Y2K crisis there (probably).
The exact moment of midnight at the beginning of 01-January-1970 UTC is represented by the value 0.
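For example, constructing a Date from the time value 0 yields the epoch itself, and negative values reach back before it:

```javascript
// Time value 0 is midnight at the start of 01-January-1970 UTC.
const epoch = new Date(0);
console.log(epoch.toISOString()); // "1970-01-01T00:00:00.000Z"

// Negative time values represent moments before the epoch:
// here, exactly one day (86,400,000 ms) earlier.
const dayBefore = new Date(-86400000);
console.log(dayBefore.toISOString()); // "1969-12-31T00:00:00.000Z"
```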
The time range may not be the same as that provided by the underlying host environment. For example, classic Macintosh dates and times are based on a start time of the first of January 1904 and are measured in seconds. The adjustment is trivial in computational terms but may be missing in some implementations.