Unix Timestamp Converter

Convert between Unix timestamps and human-readable dates.

Unix Timestamps Explained

A Unix timestamp is the number of seconds elapsed since January 1, 1970, 00:00:00 UTC. Right now it's a 10-digit number in the low billions. This simple format avoids timezone confusion and makes date arithmetic easy: add 86400 to move one day forward.

JavaScript uses milliseconds instead of seconds, so JS timestamps have 13 digits. Databases and APIs use both formats. If your timestamp seems 1000x too big or small, that's probably why.
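The conversions above can be sketched in a few lines of Python (the tool itself is language-agnostic; this is just an illustration using the standard `datetime` module):

```python
from datetime import datetime, timezone

ts_seconds = 1640995200          # 10-digit Unix timestamp (seconds)
ts_millis = ts_seconds * 1000    # 13-digit JavaScript-style timestamp (milliseconds)

# Convert a seconds timestamp to a readable UTC datetime
dt = datetime.fromtimestamp(ts_seconds, tz=timezone.utc)
print(dt.isoformat())            # 2022-01-01T00:00:00+00:00

# Date arithmetic: adding 86400 seconds moves one day forward
tomorrow = datetime.fromtimestamp(ts_seconds + 86400, tz=timezone.utc)
print(tomorrow.date())           # 2022-01-02
```

Note that `tz=timezone.utc` is what keeps the output timezone-independent; omitting it would apply the machine's local timezone.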

Notable Timestamps

  • 0: Jan 1, 1970 00:00:00 UTC (the epoch)
  • 1000000000: Sept 9, 2001 - "billennium"
  • 2147483647: Jan 19, 2038 - 32-bit overflow (Y2K38)

Understanding Unix Timestamps

Unix timestamps (also called Epoch time or POSIX time) represent the number of seconds elapsed since January 1, 1970, 00:00:00 UTC. This standardized format eliminates timezone ambiguity and simplifies date calculations across different systems and programming languages.

The "Year 2038 Problem" affects 32-bit systems where timestamps will overflow on January 19, 2038. Modern systems use 64-bit integers, which won't overflow for billions of years. JavaScript uses milliseconds (13-digit timestamps), while Unix typically uses seconds (10-digit).
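The 32-bit overflow boundary is easy to verify. A minimal Python sketch (Python integers are arbitrary-precision, so this only demonstrates where a 32-bit signed counter would wrap):

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2147483647, the largest 32-bit signed integer

# The last moment a 32-bit signed timestamp can represent
dt = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(dt)  # 2038-01-19 03:14:07+00:00
```

One second later, a 32-bit counter wraps to a negative value, which is the Y2K38 failure mode.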

Common Timestamp Values

  • 0: January 1, 1970 00:00:00 UTC (Unix Epoch)
  • 1000000000: September 9, 2001 01:46:40 UTC
  • 1609459200: January 1, 2021 00:00:00 UTC

When Developers Actually Use This

Unix timestamps are everywhere in software — database records, API responses, log files, JWT expiry fields — but they're completely unreadable to humans in their raw form. When you see 1711234567 in a database column or a JWT payload, you have no immediate sense of whether that's yesterday, last year, or three hours from now. Converting it instantly tells you it's March 23, 2024 at 22:56 UTC — which is the information you actually need to debug the issue at hand.

The reverse conversion is equally common. If you're writing a query to fetch records created in the last 30 days, constructing a JWT with a specific expiry time, or setting a cache TTL in an API response, you need to know what Unix timestamp corresponds to a specific date and time. Doing this math mentally or with a calculator is error-prone, especially across timezones. A common mistake is calculating a timestamp in your local timezone and then having it behave unexpectedly on a server running UTC — this tool lets you be explicit about timezone so you get the exact value you intend.
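The reverse conversion can be sketched the same way. This example uses a hypothetical JWT expiry and 30-day query cutoff purely for illustration; the key point is passing `tzinfo=timezone.utc` so the result doesn't silently depend on the machine's local timezone:

```python
from datetime import datetime, timedelta, timezone

# Timestamp for a specific UTC moment (e.g., a JWT "exp" claim)
expiry = datetime(2024, 3, 23, 22, 56, tzinfo=timezone.utc)
print(int(expiry.timestamp()))   # 1711234560

# Cutoff for "records created in the last 30 days"
cutoff = datetime.now(timezone.utc) - timedelta(days=30)
cutoff_ts = int(cutoff.timestamp())
```

Constructing the same `datetime` without `tzinfo` would be interpreted in local time, reproducing exactly the "works on my laptop, wrong on the UTC server" mistake described above.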

Frequently Asked Questions

What's the difference between seconds and milliseconds?

Unix timestamps in seconds are 10 digits (e.g., 1640995200). JavaScript uses milliseconds, which are 13 digits (e.g., 1640995200000). Divide by 1000 to convert milliseconds to seconds for this tool.
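A common defensive pattern is to detect the unit from the magnitude. This is a heuristic sketch, not a standard API — it assumes any value above 10^12 is milliseconds, which holds for all plausible modern dates:

```python
def normalize_to_seconds(ts: int) -> int:
    """Accept a timestamp in seconds or milliseconds; return seconds."""
    # 13-digit values (> 10^12) are almost certainly milliseconds:
    # 10-digit second timestamps cover roughly 2001 through 2286.
    if ts > 10**12:
        return ts // 1000
    return ts

print(normalize_to_seconds(1640995200000))  # 1640995200 (milliseconds input)
print(normalize_to_seconds(1640995200))     # 1640995200 (seconds input, unchanged)
```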

How do timezones affect timestamps?

Unix timestamps are timezone-independent: they always represent UTC. When displaying as a date, your local timezone is applied. Store timestamps in UTC and convert to local time only for display.
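To see this in practice, the same timestamp can be rendered in two zones — the underlying value never changes, only the display. A sketch using Python's `zoneinfo` (America/New_York is chosen arbitrarily as an example zone):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ts = 1609459200  # January 1, 2021 00:00:00 UTC

utc_dt = datetime.fromtimestamp(ts, tz=timezone.utc)
ny_dt = datetime.fromtimestamp(ts, tz=ZoneInfo("America/New_York"))

print(utc_dt)  # 2021-01-01 00:00:00+00:00
print(ny_dt)   # 2020-12-31 19:00:00-05:00  (same instant, different wall clock)
```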

Can timestamps be negative?

Yes! Negative timestamps represent dates before January 1, 1970. For example, -86400 represents December 31, 1969. Most systems support negative timestamps, though some older systems may not.
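The negative case works the same way in most languages' date libraries. A quick Python check (some platforms, notably older Windows builds, reject negative timestamps in certain calls, so treat this as a Linux/macOS sketch):

```python
from datetime import datetime, timezone

# One day before the epoch
dt = datetime.fromtimestamp(-86400, tz=timezone.utc)
print(dt)  # 1969-12-31 00:00:00+00:00
```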