Unix Timestamp Converter

Convert between Unix epoch timestamps and human-readable dates instantly

Current Unix Timestamp

Timestamp → Date

Seconds (10-digit)
Milliseconds (13-digit)

Date → Timestamp

Seconds
Milliseconds

Quick Reference

Frequently Asked Questions

What is a Unix timestamp?
A Unix timestamp (also called epoch time) is the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC, not counting leap seconds. It's widely used in computing to represent points in time.
What is the difference between seconds and milliseconds timestamps?
Seconds timestamps currently have 10 digits (e.g., 1680000000), while millisecond timestamps have 13 digits (e.g., 1680000000000). JavaScript's Date API uses milliseconds by default, while most Unix systems and APIs use seconds.
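As a quick sketch in Python, the two units differ only by a factor of 1000 and name the same UTC instant:

```python
import datetime

ts_seconds = 1680000000    # 10-digit seconds timestamp
ts_millis = 1680000000000  # 13-digit milliseconds timestamp

# Converting between the two is a factor of 1000.
assert ts_millis // 1000 == ts_seconds

# Both identify the same UTC instant.
dt = datetime.datetime.fromtimestamp(ts_seconds, tz=datetime.timezone.utc)
print(dt.isoformat())  # 2023-03-28T10:40:00+00:00
```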
What is the Unix epoch date?
The Unix epoch is January 1, 1970, 00:00:00 UTC. This is the starting point from which Unix timestamps are calculated.
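In Python, converting timestamp 0 shows the epoch directly:

```python
import datetime

# Timestamp 0 is the epoch itself.
epoch = datetime.datetime.fromtimestamp(0, tz=datetime.timezone.utc)
print(epoch)  # 1970-01-01 00:00:00+00:00
```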
When will 32-bit Unix timestamps overflow?
32-bit signed Unix timestamps will overflow on January 19, 2038, at 03:14:07 UTC (the Year 2038 problem), when the count exceeds 2³¹ − 1 seconds. Systems must migrate to 64-bit time values before then.
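You can verify the overflow moment in Python by converting the largest value a signed 32-bit integer can hold:

```python
import datetime

# Largest value a signed 32-bit time_t can hold.
max_32bit = 2**31 - 1  # 2147483647
overflow_moment = datetime.datetime.fromtimestamp(max_32bit, tz=datetime.timezone.utc)
print(overflow_moment)  # 2038-01-19 03:14:07+00:00
```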
How do I get the current Unix timestamp?
In JavaScript: Math.floor(Date.now()/1000). In Python: int(time.time()). In Bash: date +%s. This page shows the live current timestamp at the top.
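In Python, both the seconds and milliseconds forms come from the standard `time` module:

```python
import time

ts_seconds = int(time.time())            # current 10-digit seconds timestamp
ts_millis = time.time_ns() // 1_000_000  # current 13-digit milliseconds timestamp
print(ts_seconds, ts_millis)
```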
Can Unix timestamps represent dates before 1970?
Yes. Because Unix timestamps are signed integers (32-bit or 64-bit), negative values represent dates before the epoch (e.g., -86400 = December 31, 1969, 00:00:00 UTC).
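A negative value converts like any other when a timezone is given explicitly:

```python
import datetime

# -86400 seconds = exactly one day before the epoch.
before_epoch = datetime.datetime.fromtimestamp(-86400, tz=datetime.timezone.utc)
print(before_epoch)  # 1969-12-31 00:00:00+00:00
```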
How accurate are Unix timestamps?
Standard Unix timestamps have 1-second resolution; millisecond timestamps resolve to 1 ms. Some systems expose nanosecond resolution (e.g., Python's time.time_ns()).
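The three resolutions are related by simple integer division, as this Python sketch shows:

```python
import time

ns = time.time_ns()      # nanosecond-resolution timestamp
ms = ns // 1_000_000     # millisecond resolution
s = ns // 1_000_000_000  # standard one-second resolution
print(s, ms, ns)
```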
Why do programmers use Unix timestamps?
Unix timestamps are timezone-independent, easy to store as numbers, simple to compare and calculate differences, and supported by virtually all programming languages and databases.
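For example, computing an elapsed duration needs no timezone or calendar logic at all:

```python
# Differences and comparisons are plain integer arithmetic,
# with no timezone or calendar handling involved.
start = 1680000000
end = 1680003600
assert end > start
print(end - start)  # 3600 seconds, i.e. one hour
```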