A Unix timestamp is simply the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC. That's it. No months, no timezones, no date formatting, just an integer. This simplicity is precisely why it's used everywhere: databases, APIs, log files, file systems, and authentication tokens. But that same simplicity creates pitfalls that trip up even experienced developers.
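To see one in the wild, Python's standard library will print the current count directly. A minimal sketch:

```python
import time
from datetime import datetime, timezone

# Current Unix timestamp: whole seconds elapsed since 1970-01-01 00:00:00 UTC
now = int(time.time())
print(now)

# The same instant rendered as a human-readable UTC date
print(datetime.fromtimestamp(now, tz=timezone.utc))
```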
Why January 1, 1970?
The date is called the "Unix Epoch." When Ken Thompson and Dennis Ritchie built the first versions of Unix at Bell Labs in the early 1970s, they needed a reference point for measuring time. January 1, 1970 was chosen somewhat arbitrarily as a convenient round date close to the start of Unix's development. The count started at 0 on that date and increases by 1 every second. As of April 2026, the Unix timestamp is approximately 1,776,000,000 (roughly 1.78 billion seconds since the epoch).
How Timestamps Work in Practice
32-bit vs. 64-bit: The 2038 Problem
A 32-bit signed integer can hold values from −2,147,483,648 to 2,147,483,647. The maximum Unix timestamp that fits in 32 bits is therefore 2,147,483,647, which corresponds to January 19, 2038, 03:14:07 UTC. At that moment, systems that store time in a signed 32-bit integer will overflow, wrapping around to a negative value and potentially causing failures, similar to the Y2K bug but for Unix-based systems.
Most modern systems use 64-bit timestamps, which won't overflow for approximately 292 billion years. But legacy embedded systems, older databases, and some IoT devices may still be running 32-bit time. If you're working with systems that will be operational past 2038, verify that they use 64-bit timestamps.
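A quick way to see the wraparound is to force a post-2038 value into a signed 32-bit field. This is a minimal sketch in Python using the struct module to simulate the overflow; how a real system fails depends on how it stores time_t.

```python
import struct
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
MAX_INT32 = 2**31 - 1  # 2,147,483,647

# The last moment a signed 32-bit counter can represent
print(EPOCH + timedelta(seconds=MAX_INT32))  # 2038-01-19 03:14:07+00:00

# One second later, the value wraps to the most negative 32-bit integer
overflowed = (MAX_INT32 + 1) & 0xFFFFFFFF
wrapped = struct.unpack("<i", struct.pack("<I", overflowed))[0]
print(wrapped)                               # -2147483648
print(EPOCH + timedelta(seconds=wrapped))    # 1901-12-13 20:45:52+00:00
```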
Millisecond Precision
Standard Unix time is in seconds, but many systems use millisecond timestamps (13 digits instead of 10). JavaScript's `Date.now()` returns milliseconds, as does Java's `System.currentTimeMillis()`. If you see a 13-digit timestamp like 1712880000000, divide by 1000 to get the standard Unix time of 1712880000.
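If you accept timestamps from mixed sources, a common defensive trick is to normalize on the way in. A small sketch (the helper name and the 13-digit threshold are illustrative, not from any standard library):

```python
def normalize_to_seconds(ts: int) -> int:
    """Treat 13-digit values as milliseconds and 10-digit values as seconds."""
    return ts // 1000 if ts >= 1_000_000_000_000 else ts

print(normalize_to_seconds(1712880000000))  # 1712880000 (milliseconds -> seconds)
print(normalize_to_seconds(1712880000))     # 1712880000 (already seconds)
```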
Converting Timestamps
Unix Timestamp to Human-Readable Date
- Python: `datetime.fromtimestamp(1712880000, tz=timezone.utc)`
- JavaScript: `new Date(1712880000 * 1000).toISOString()`
- MySQL: `FROM_UNIXTIME(1712880000)`
- Linux CLI: `date -d @1712880000 --utc`
Human-Readable Date to Unix Timestamp
- Python: `int(datetime(2024, 4, 12, tzinfo=timezone.utc).timestamp())`
- JavaScript: `Math.floor(new Date('2024-04-12T00:00:00Z').getTime() / 1000)`
- MySQL: `UNIX_TIMESTAMP('2024-04-12 00:00:00')`
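The two Python one-liners invert each other. A small round-trip check, using 1712880000 as the timestamp for 2024-04-12 00:00:00 UTC:

```python
from datetime import datetime, timezone

ts = 1712880000

# Timestamp -> human-readable UTC date
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())  # 2024-04-12T00:00:00+00:00

# Human-readable UTC date -> timestamp, round-tripping to the same integer
assert int(datetime(2024, 4, 12, tzinfo=timezone.utc).timestamp()) == ts
```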
The Timezone Trap
This is where most timestamp bugs originate. A Unix timestamp represents a specific moment in time and is always in UTC. The problem arises during conversion:
Scenario: A user in Tokyo (UTC+9) creates a calendar event for "April 12, 2024, 09:00." The server stores the timestamp 1712880000. A user in New York (UTC−4) views the event and sees "April 11, 2024, 20:00." This is correct — it's the same moment in time, just displayed in a different timezone.
The bug: If the server naively converts "09:00" without specifying UTC, it might store the timestamp for 09:00 UTC — which is 18:00 in Tokyo. The user's event appears at the wrong time.
Rule: Always store and transmit timestamps in UTC. Convert to local time only for display. Never store a timestamp that was created from a local time without an explicit timezone offset.
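Here is a sketch of the scenario above in Python, using the standard zoneinfo module (Python 3.9+). The timezone names are the IANA identifiers for Tokyo and New York:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Correct: label the wall-clock time with its timezone before converting
tokyo_event = datetime(2024, 4, 12, 9, 0, tzinfo=ZoneInfo("Asia/Tokyo"))
print(int(tokyo_event.timestamp()))  # 1712880000 -> 2024-04-12 00:00:00 UTC

# Same instant, displayed for a viewer in New York (UTC-4 in April)
print(tokyo_event.astimezone(ZoneInfo("America/New_York")))  # 2024-04-11 20:00-04:00

# The bug: treating the Tokyo wall-clock time as UTC stores the wrong instant
wrong = datetime(2024, 4, 12, 9, 0, tzinfo=timezone.utc)
print(int(wrong.timestamp()))  # 1712912400 -- nine hours late, 18:00 in Tokyo
```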
Practical Uses of Unix Timestamps
- API rate limiting: Store a timestamp with each request. Reject requests where `current_time - last_request_time < interval`.
- Cache invalidation: Include a timestamp in cache keys. When data is updated, the timestamp changes, automatically invalidating old cache entries.
- Database indexing: Timestamps are integers, making them extremely fast to index and compare. Range queries (`WHERE created_at > ?`) benefit from this.
- Authentication tokens: JWTs include an `exp` (expiration) claim as a Unix timestamp. The server checks `current_time > exp` to determine whether the token has expired (see the sketch after this list).
- Log analysis: Log timestamps are easier to sort and correlate across systems when they're in Unix format.
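The token-expiry item is the simplest to make concrete. A minimal sketch of the `exp` comparison only, not a full JWT validator (signature verification and the other claims are out of scope here):

```python
import time

def is_expired(claims: dict) -> bool:
    """Return True if the token's exp claim (a Unix timestamp) is in the past."""
    return time.time() > claims["exp"]

# Example: a token that expires one hour from now
claims = {"sub": "user-123", "exp": int(time.time()) + 3600}
print(is_expired(claims))  # False, until an hour passes
```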
Quick Conversion Reference
- 1 minute = 60 seconds
- 1 hour = 3,600 seconds
- 1 day = 86,400 seconds
- 1 week = 604,800 seconds
- 1 year (365 days) = 31,536,000 seconds
- Leap year (366 days) = 31,622,400 seconds
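In code, these durations are often kept as named constants. A small illustrative sketch (the names are not from any standard library):

```python
import time

MINUTE = 60
HOUR = 60 * MINUTE   # 3,600
DAY = 24 * HOUR      # 86,400
WEEK = 7 * DAY       # 604,800
YEAR = 365 * DAY     # 31,536,000 (non-leap)

# Example: a session that expires one week from now
expires_at = int(time.time()) + WEEK
```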
For quick conversions without writing code or opening a terminal, Risetop's Unix timestamp converter lets you convert between timestamps and human-readable dates instantly. It handles both 10-digit (seconds) and 13-digit (milliseconds) formats and shows the time in multiple timezones.
Conclusion
Unix timestamps are deceptively simple — an integer counting seconds since 1970. But timezone handling, precision differences (seconds vs. milliseconds), and the looming 2038 problem make them a topic worth understanding deeply. The golden rule is simple: store in UTC, convert for display, and always be explicit about your timezone.