Last updated: March 18, 2024
Most computer systems in the world use 1 January 1970 as the epoch. In this tutorial, we’ll learn why the world agreed on that date.
Epoch, by definition, is a point in time that we use as a reference point for measuring time. It doesn’t necessarily have to have a specific meaning, so long as people agree on it.
There are many epoch dates throughout history. Let’s look at a few notable ones in the computing world:
Unix was developed in 1969 and first released in 1971. Initially, Unix didn’t use 1 January 1970 as the epoch.
In early versions, Unix measured system time in 60-hertz intervals, and the system used a 32-bit unsigned integer to store the value. The data type could only represent a span of time of less than 829 days, or about 2.5 years. Because of this, the time represented by 0 (the epoch) had to be set in the very recent past, and 1 January 1971 was selected.
In later versions in the early 1970s, Unix system time incremented every second. As a result, this increased the span of time a 32-bit unsigned integer could represent to around 136 years.
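We can sanity-check both spans with a quick back-of-the-envelope calculation (a Python sketch, purely for illustration):

```python
# Span of a 32-bit unsigned counter ticking at 60 Hz (early Unix)
ticks = 2**32
days_at_60hz = ticks / 60 / 86_400
print(f"{days_at_60hz:.1f} days")  # about 828.5 days, i.e. roughly 2.27 years

# Span of the same counter incrementing once per second (later Unix)
years_at_1hz = ticks / (86_400 * 365.25)
print(f"{years_at_1hz:.1f} years")  # about 136.1 years
```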
After that, according to Dennis Ritchie, Unix engineers arbitrarily selected 1 January 1970 00:00:00 UTC as the epoch simply because it was a convenient date to work with.
Even though most systems in the world use Unix time, it also has limitations.
Since Unix time has a resolution of one second, we need a different data type to represent timestamps with sub-second precision.
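As one illustration, Python's standard `time` module offers both a float-valued clock and an integer nanosecond clock; the latter uses a wider integer representation precisely to carry sub-second precision without rounding:

```python
import time

# Seconds since the epoch as a float (sub-second digits are
# limited by double precision)
print(time.time())

# Nanoseconds since the epoch as an integer, avoiding
# floating-point rounding (available since Python 3.7)
print(time.time_ns())
```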
Many Unix programs still use signed 32-bit integers to store the time, which can only represent integer values from −2³¹ to 2³¹ − 1 inclusive. Consequently, the latest time that we can store is 2³¹ − 1 (2,147,483,647) seconds after the epoch, which is 03:14:07 UTC on Tuesday, 19 January 2038. Attempting to increment this value by one more second causes an integer overflow.
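We can check that boundary directly (a small Python illustration):

```python
from datetime import datetime, timezone

# Largest value a signed 32-bit integer can hold
max_int32 = 2**31 - 1  # 2,147,483,647

# Interpreted as seconds since the Unix epoch
print(datetime.fromtimestamp(max_int32, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00
```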
Using the 64-bit integer data type to store Unix time solves this issue, as the range of dates representable with it is over 584 billion years.
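The arithmetic behind that figure is straightforward (a quick Python check):

```python
# Total span of a signed 64-bit counter of seconds, in years
span_years = 2**64 / (86_400 * 365.25)
print(f"{span_years:.3e} years")  # about 5.846e+11, i.e. ~584.6 billion years
```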
The Unix epoch 0 is at 00:00:00 UTC on 1 January 1970, so the system uses a negative number to represent any timestamp before that.
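A portable way to see this is to add a negative offset to the epoch (a Python sketch; using `timedelta` avoids platform quirks some systems have with negative timestamps):

```python
from datetime import datetime, timedelta, timezone

epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

# A timestamp of -1 second lands in the last second of 1969
print(epoch + timedelta(seconds=-1))  # 1969-12-31 23:59:59+00:00
```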
In Unix time, every day is precisely 86,400 seconds. Unlike UTC or TAI, Unix time has no way to represent a leap second; when one occurs, the clock has to be adjusted by repeating or skipping a second.
In this article, we learned about some notable epoch dates in the computing world, a brief history of Unix time and its limitations, and the reason why Unix engineers selected 1 January 1970 as the epoch time.