Unix Timestamp Conversion Guide: How Epoch Time Works

Unix timestamps are the universal language of time in computing. Every time you see a database created_at column, a JWT exp claim, or an API response with a date field, there's a good chance Unix timestamps are involved. This guide explains what they are, how to convert them, and the common pitfalls to watch out for.

What Is a Unix Timestamp?

A Unix timestamp (also called "epoch time" or "POSIX time") is the number of seconds that have elapsed since January 1, 1970 00:00:00 UTC — a moment known as the "Unix epoch." It's a single integer that unambiguously represents a point in time, regardless of timezone.
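Because a timestamp is just a count of seconds, plain integer arithmetic recovers its structure. A quick sanity-check sketch in Python, using 1710201600 (March 12, 2024 00:00:00 UTC) as an example:

```python
# A timestamp is just seconds since the epoch, so integer math decomposes it.
ts = 1710201600  # March 12, 2024 00:00:00 UTC

days, remainder = divmod(ts, 86400)  # 86,400 seconds per day
print(days)       # 19794 whole days since Jan 1, 1970
print(remainder)  # 0 → the timestamp lands exactly on a UTC midnight
```

A remainder of zero confirms the instant falls exactly on a UTC midnight, independent of any timezone.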

  • 0 = January 1, 1970 00:00:00 UTC
  • 1000000000 = September 9, 2001 01:46:40 UTC
  • 1710201600 = March 12, 2024 00:00:00 UTC
  • 2147483647 = January 19, 2038 03:14:07 UTC (the 32-bit limit)
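These milestones are easy to verify with Python's standard datetime module:

```python
from datetime import datetime, timezone

# Convert each milestone back to a human-readable UTC datetime.
for ts in (0, 1_000_000_000, 2_147_483_647):
    print(ts, datetime.fromtimestamp(ts, tz=timezone.utc).isoformat())
# 0 1970-01-01T00:00:00+00:00
# 1000000000 2001-09-09T01:46:40+00:00
# 2147483647 2038-01-19T03:14:07+00:00
```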

Seconds vs Milliseconds

Different systems use different precisions. Knowing which one you're dealing with is critical:

  • Seconds (10 digits): Unix/Linux, Python time.time(), PHP time()
  • Milliseconds (13 digits): JavaScript Date.now(), Java System.currentTimeMillis()
  • Microseconds (16 digits): Python time.time_ns() // 1000 (note that datetime.timestamp() returns float seconds, not a 16-digit integer)
  • Nanoseconds (19 digits): Go time.Now().UnixNano()

// Quick check: seconds vs milliseconds
const ts = 1710201600;      // 10 digits → seconds
const tsMs = 1710201600000; // 13 digits → milliseconds

// Convert between them
const seconds = Math.floor(tsMs / 1000);
const milliseconds = ts * 1000;

💡 Rule of thumb: If the number has 10 digits, it's seconds. If it has 13 digits, it's milliseconds. This simple check saves hours of debugging.
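The rule of thumb can be encoded as a small helper. This is an illustrative sketch (the name guess_unit is made up for this article, not a standard API):

```python
def guess_unit(ts: int) -> str:
    """Guess a timestamp's unit from its digit count (heuristic only)."""
    digits = len(str(abs(int(ts))))
    return {
        10: "seconds",
        13: "milliseconds",
        16: "microseconds",
        19: "nanoseconds",
    }.get(digits, "unknown")

print(guess_unit(1710201600))     # seconds
print(guess_unit(1710201600000))  # milliseconds
```

Note the heuristic only holds for contemporary dates: second-precision timestamps before September 2001 have fewer than 10 digits.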

Timestamp Conversion in JavaScript

// Current timestamp
const nowSec = Math.floor(Date.now() / 1000);  // seconds
const nowMs = Date.now();                        // milliseconds

// Timestamp → Date object
const date = new Date(1710201600 * 1000); // multiply by 1000 for Date()
console.log(date.toISOString());
// "2024-03-12T00:00:00.000Z"

// Date → Timestamp
const ts = Math.floor(new Date("2024-03-12T00:00:00Z").getTime() / 1000);
console.log(ts); // 1710201600

// Formatted output
console.log(date.toLocaleString("en-US", { timeZone: "America/New_York" }));
// "3/11/2024, 8:00:00 PM"

Timestamp Conversion in Python

import datetime
import time

# Current timestamp
now_sec = int(time.time())          # 1741737600
now_ms = int(time.time() * 1000)    # 1741737600000

# Timestamp → datetime
dt = datetime.datetime.fromtimestamp(1710201600, tz=datetime.timezone.utc)
print(dt)  # 2024-03-12 00:00:00+00:00

# datetime → Timestamp
ts = int(dt.timestamp())
print(ts)  # 1710201600

# Formatted output
print(dt.strftime("%Y-%m-%d %H:%M:%S %Z"))
# "2024-03-12 00:00:00 UTC"

Command Line Conversion

# Current timestamp
date +%s                              # 1741737600

# Timestamp → human-readable (Linux)
date -d @1710201600                   # Tue Mar 12 00:00:00 UTC 2024

# Timestamp → human-readable (macOS)
date -r 1710201600                    # Tue Mar 12 00:00:00 UTC 2024

# Human-readable → timestamp (Linux)
date -d "2024-03-12 00:00:00 UTC" +%s # 1710201600

Timezone Handling

Unix timestamps are always in UTC — they represent a fixed point in time. The timezone only matters when displaying the time to humans:

// Same timestamp, different timezones
const ts = 1710201600;
const date = new Date(ts * 1000);

// UTC
date.toLocaleString("en-US", { timeZone: "UTC" });
// "3/12/2024, 12:00:00 AM"

// New York (UTC-4 in March)
date.toLocaleString("en-US", { timeZone: "America/New_York" });
// "3/11/2024, 8:00:00 PM"

// Tokyo (UTC+9)
date.toLocaleString("en-US", { timeZone: "Asia/Tokyo" });
// "3/12/2024, 9:00:00 AM"

⚠️ Common bug: Parsing date strings without an explicit timezone offset (e.g. new Date("2024-03-12T00:00")) uses the server's local timezone, so the resulting timestamp varies with server configuration. Always specify UTC explicitly, e.g. an ISO 8601 string ending in Z.
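The same one-instant, many-renderings idea works in Python with the standard zoneinfo module (Python 3.9+), shown here with the March 12, 2024 00:00:00 UTC timestamp:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ts = 1710201600  # March 12, 2024 00:00:00 UTC
dt = datetime.fromtimestamp(ts, tz=timezone.utc)

# One instant, three local renderings:
print(dt.isoformat())                               # 2024-03-12T00:00:00+00:00
print(dt.astimezone(ZoneInfo("America/New_York")))  # 2024-03-11 20:00:00-04:00
print(dt.astimezone(ZoneInfo("Asia/Tokyo")))        # 2024-03-12 09:00:00+09:00
```

Note that New York is UTC-4 here because US daylight saving time began on March 10, 2024.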

The Y2038 Problem

Systems that store Unix timestamps as 32-bit signed integers will overflow on January 19, 2038 at 03:14:07 UTC. The maximum 32-bit signed integer is 2,147,483,647, which corresponds to this date.

After this moment, 32-bit timestamps would wrap around to negative values (interpreted as dates in 1901). Most modern systems have migrated to 64-bit timestamps, which won't overflow for about 292 billion years. However, embedded systems and legacy databases may still be affected.
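The wrap-around can be simulated by forcing the arithmetic through a 32-bit signed integer. A sketch using Python's struct module to reinterpret the bit pattern:

```python
import struct
from datetime import datetime, timezone

max32 = 2_147_483_647                   # largest 32-bit signed value
overflowed = (max32 + 1) & 0xFFFFFFFF   # keep only the low 32 bits

# Reinterpret the unsigned bit pattern as a signed 32-bit integer.
wrapped = struct.unpack("<i", struct.pack("<I", overflowed))[0]
print(wrapped)  # -2147483648

# As a timestamp, that negative value lands back in 1901.
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))
# 1901-12-13 20:45:52+00:00
```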

💡 Best practice: Always use 64-bit integers for timestamps. In databases, use BIGINT instead of INT for timestamp columns.

Notable Epoch Milestones

  • Epoch: 0 — Jan 1, 1970 00:00:00 UTC
  • Billennium: 1000000000 — Sep 9, 2001 01:46:40 UTC
  • 2 Billion: 2000000000 — May 18, 2033 03:33:20 UTC
  • Y2038: 2147483647 — Jan 19, 2038 03:14:07 UTC
  • Negative epoch: -1 — Dec 31, 1969 23:59:59 UTC
