Open any server log, database record, or API response and you'll find numbers like 1745318400. These are Unix timestamps — a universal way to represent a moment in time as a single integer. Converting them in your head is impossible, but understanding them is essential for anyone working with data, APIs, or backend systems.
What Is a Unix Timestamp?
A Unix timestamp (also called epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC — an arbitrary date chosen when Unix was being designed. Negative values represent times before 1970.
0 → 1970-01-01 00:00:00 UTC (the Unix epoch)
1745318400 → 2025-04-22 10:40:00 UTC
-86400 → 1969-12-31 00:00:00 UTC
Timestamps have several properties that make them useful:
- Unambiguous — a number like 1745318400 means exactly one moment in time, everywhere on Earth
- Timezone-independent — timestamps are always UTC; local time is a display concern
- Sortable — higher numbers are later in time; you can sort a list of timestamps numerically
- Arithmetic-friendly — adding 86,400 to any timestamp advances it by exactly one day
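The sortable and arithmetic-friendly properties above can be sketched in a few lines of Python (purely illustrative, using the example timestamp from earlier):

```python
from datetime import datetime, timedelta, timezone

ts = 1745318400  # the example timestamp from above (2025-04-22 10:40:00 UTC)

# Sortable: numeric order is chronological order
events = [ts + 3600, ts, ts + 86400]
assert sorted(events) == [ts, ts + 3600, ts + 86400]

# Arithmetic-friendly: adding 86,400 seconds lands exactly one UTC day later
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
one_day_later = datetime.fromtimestamp(ts + 86400, tz=timezone.utc)
assert one_day_later == dt + timedelta(days=1)
```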
Seconds, Milliseconds, and Microseconds
Here's where timestamps get confusing: not all systems use the same unit.
| Unit | Example | Used by |
|---|---|---|
| Seconds | 1745318400 | Unix/Linux, most server APIs, database DATETIME columns |
| Milliseconds | 1745318400000 | JavaScript (Date.now()), Java (System.currentTimeMillis()), many web APIs |
| Microseconds | 1745318400000000 | Python datetime (microsecond precision), PostgreSQL TIMESTAMP, some logging systems |
| Nanoseconds | 1745318400000000000 | Go (time.Now().UnixNano()), Python (time.time_ns()), high-precision systems |
When a timestamp value looks "wrong" — like a date in January 1970 when you expected 2025 — it usually means the units got mixed up: interpreting a seconds value as milliseconds lands you just weeks after the epoch, while interpreting milliseconds as seconds throws you tens of thousands of years into the future. Divide by 1,000 to go from milliseconds to seconds.
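One rough defense against unit mix-ups is to guess the unit from the magnitude of the value: current second-timestamps are ten digits, millisecond ones thirteen, and so on. A small Python sketch (the threshold values are my own assumption, not any standard):

```python
def to_seconds(ts: int) -> float:
    """Normalize a timestamp to seconds by guessing its unit from its size.

    Heuristic thresholds (an assumption, not a standard): values below
    10**11 are seconds (valid until roughly the year 5100), below 10**14
    milliseconds, below 10**17 microseconds, anything larger nanoseconds.
    """
    n = abs(ts)
    if n < 10**11:        # seconds
        return float(ts)
    elif n < 10**14:      # milliseconds
        return ts / 1_000
    elif n < 10**17:      # microseconds
        return ts / 1_000_000
    else:                 # nanoseconds
        return ts / 1_000_000_000

assert to_seconds(1745318400) == 1745318400.0
assert to_seconds(1745318400000) == 1745318400.0
assert to_seconds(1745318400000000000) == 1745318400.0
```

Heuristics like this are fine for log inspection; for production code, pin down the unit in the API contract instead of guessing.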
How to Convert a Unix Timestamp
Use DevZone's Unix Timestamp Converter to instantly convert any timestamp to a human-readable date, or convert a date back to a Unix timestamp. The tool handles seconds and milliseconds and shows the result in UTC and your local timezone.
Manual conversion in code
JavaScript:
```javascript
// Timestamp → Date
const ts = 1745318400;
const date = new Date(ts * 1000); // Date expects milliseconds
console.log(date.toISOString()); // "2025-04-22T10:40:00.000Z"

// Current timestamp (milliseconds)
const nowMs = Date.now();

// Current timestamp (seconds)
const nowSec = Math.floor(Date.now() / 1000);
```
Python:
```python
import datetime
import time

# Timestamp → datetime
ts = 1745318400
dt = datetime.datetime.fromtimestamp(ts, tz=datetime.timezone.utc)
print(dt.isoformat())  # 2025-04-22T10:40:00+00:00

# Current timestamp (seconds)
now = int(time.time())
```
SQL (PostgreSQL):
```sql
-- Timestamp → date
SELECT to_timestamp(1745318400);

-- Current timestamp
SELECT extract(epoch FROM now())::integer;
```
Bash:
```bash
# Timestamp → human-readable (GNU date; on macOS/BSD use: date -r 1745318400)
date -d @1745318400

# Current timestamp
date +%s
```
Common Timestamp Arithmetic
Once you understand that timestamps are just numbers representing seconds, date arithmetic becomes straightforward.
Add one day:
1745318400 + 86400 = 1745404800
There are 86,400 seconds in a day (24 × 60 × 60).
Add one hour:
1745318400 + 3600 = 1745322000
Find the difference between two dates:
```javascript
const start = 1745318400;
const end = 1745491200;
const diffSeconds = end - start;
const diffDays = diffSeconds / 86400; // 2 days
```
Check if a timestamp is in the past:
```javascript
const expiry = 1745318400;
const isExpired = expiry < Math.floor(Date.now() / 1000);
```
Dealing with Timezones
Timestamps themselves have no timezone — they always represent UTC. Timezone conversion is a display concern:
```javascript
const ts = 1745318400;
const date = new Date(ts * 1000);

// UTC display
date.toISOString(); // "2025-04-22T10:40:00.000Z"

// Local time display (depends on user's system timezone)
date.toLocaleString(); // varies

// Specific timezone
date.toLocaleString("en-US", { timeZone: "America/New_York" });
// "4/22/2025, 6:40:00 AM"
```
A common mistake is storing timestamps in local time and then wondering why they drift when you deploy to a server in a different timezone. Always store and transmit timestamps in UTC, and convert to local time only at the display layer.
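The store-UTC, display-local rule looks like this in Python, using the standard-library zoneinfo module (Python 3.9+; requires system timezone data):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ts = 1745318400  # stored value: always UTC, no timezone baked in

# Parse once as an aware UTC datetime...
utc_dt = datetime.fromtimestamp(ts, tz=timezone.utc)

# ...and convert only when rendering for a user
ny = utc_dt.astimezone(ZoneInfo("America/New_York"))
tokyo = utc_dt.astimezone(ZoneInfo("Asia/Tokyo"))

# All three are the same instant; only the display differs
assert utc_dt == ny == tokyo
print(ny.isoformat())     # 2025-04-22T06:40:00-04:00
print(tokyo.isoformat())  # 2025-04-22T19:40:00+09:00
```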
The Year 2038 Problem
Unix timestamps stored as 32-bit signed integers overflow on January 19, 2038, at 03:14:07 UTC. One second later, the value wraps to −2,147,483,648, which represents December 13, 1901.
Most modern systems use 64-bit integers, which push the overflow date billions of years into the future. But embedded systems, older databases, and legacy code may still use 32-bit timestamps. If you're building a system that will store data past 2038, verify your timestamp storage is 64-bit.
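You can reproduce the 32-bit wraparound with Python's struct module, which packs integers at fixed widths (runs on platforms whose C library accepts pre-1970 timestamps, such as Linux):

```python
import struct
from datetime import datetime, timezone

overflow = 2**31  # one second past the 32-bit limit (2038-01-19 03:14:08 UTC)

# Packing as unsigned 32-bit, then unpacking as signed, mimics the wraparound
wrapped, = struct.unpack("<i", struct.pack("<I", overflow % 2**32))
assert wrapped == -2**31  # -2,147,483,648

# The wrapped value lands back in 1901
dt = datetime.fromtimestamp(wrapped, tz=timezone.utc)
assert dt.year == 1901
```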
FAQ
What is "epoch time"?
Epoch time is another name for Unix timestamp. The "epoch" is the reference point — January 1, 1970, UTC — from which the count begins.
Why 1970?
It's an artifact of Unix history. Early Unix developers needed a fixed reference point for their time libraries, and January 1, 1970 was chosen somewhat arbitrarily as a recent, round date.
How do I get the current timestamp in my terminal?
On macOS/Linux: date +%s
On Windows (PowerShell): [DateTimeOffset]::UtcNow.ToUnixTimeSeconds() — the older Get-Date -UFormat %s trick also works in PowerShell 7+, but gave timezone-skewed results in Windows PowerShell 5.1.
Can timestamps represent times before 1970?
Yes — as negative integers. date -u -d @-86400 on Linux prints Wed Dec 31 00:00:00 UTC 1969 (without -u, date shows local time instead).
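The same works in Python, which accepts negative timestamps directly (again, on platforms whose C library supports pre-1970 values, which includes Linux and macOS):

```python
from datetime import datetime, timezone

# -86400 seconds is exactly one day before the epoch
dt = datetime.fromtimestamp(-86400, tz=timezone.utc)
assert dt.isoformat() == "1969-12-31T00:00:00+00:00"
```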
Why does JavaScript return milliseconds but most other languages return seconds?
Historical accident. JavaScript's Date object was designed for high-precision timing in browsers, so it defaults to milliseconds. Most server-side languages default to seconds. Always check the units when working across language boundaries.