DEVELOPER TOOL

Timestamp Converter

Convert between Unix timestamps and human-readable date formats. Supports seconds, milliseconds, ISO 8601, and natural date strings.

Unix timestamp (seconds): 1700000000
Unix timestamp (milliseconds): 1700000000000
ISO 8601: 2023-11-14T22:13:20Z
Date string: Nov 14, 2023 10:13 PM

Time in Computing: From Epoch to ISO

The Unix Epoch: January 1, 1970

The Unix epoch — midnight UTC on January 1, 1970 — is computing's universal time reference point. A Unix timestamp is simply the count of seconds elapsed since that moment: timestamp 0 = January 1, 1970 00:00:00 UTC, timestamp 1000000000 = September 9, 2001 01:46:40 UTC, and timestamp 1700000000 = November 14, 2023. This single integer representation makes time arithmetic trivial: the difference between two events is just subtraction, and adding 86400 always advances exactly one day.
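The epoch arithmetic described above can be sketched in a few lines of JavaScript (the `Date` constructor takes milliseconds, so the seconds value is multiplied by 1000):

```javascript
// Convert a Unix timestamp in seconds to a Date, then do day arithmetic.
const ts = 1700000000;                 // Nov 14, 2023 22:13:20 UTC
const date = new Date(ts * 1000);      // Date() expects milliseconds
console.log(date.toISOString());       // "2023-11-14T22:13:20.000Z"

// Adding 86400 seconds advances exactly one day on the UTC timeline.
const nextDay = new Date((ts + 86400) * 1000);
console.log(nextDay.toISOString());    // "2023-11-15T22:13:20.000Z"
```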

The Year 2038 Problem

2³¹ − 1 = 2,147,483,647 → Jan 19, 2038 03:14:07 UTC

Systems storing Unix timestamps as 32-bit signed integers will overflow on January 19, 2038 at 03:14:07 UTC. After this moment, the counter wraps to -2,147,483,648, interpreted as December 13, 1901. This is computing's next Y2K-style milestone. Modern 64-bit systems use 64-bit timestamps, extending the range to approximately 292 billion years in both directions. JavaScript uses 64-bit floating-point numbers (IEEE 754), safely handling dates until the year 275,760.

Time Zones: The Developer's Nemesis

Unix timestamps are inherently UTC — they have no timezone. The moment timestamp 1700000000 is the same instant everywhere on Earth. But converting to human-readable form requires a timezone: 1700000000 is “Nov 14, 2023 5:13 PM” in New York (UTC-5) but “Nov 15, 2023 7:13 AM” in Seoul (UTC+9). Daylight Saving Time adds another layer: the same timezone's offset can change twice a year. This tool shows both UTC and your browser's local time so you can immediately see the difference and avoid off-by-hours bugs.

Seconds, Milliseconds, and ISO 8601

Seconds (10 digits): 1700000000

Milliseconds (13 digits): 1700000000000

ISO 8601: 2023-11-14T22:13:20.000Z

Unix traditionally uses seconds, but JavaScript's Date.now() returns milliseconds, and many APIs (Stripe, Discord, Snowflake IDs) use millisecond precision. This tool auto-detects the format based on digit count. ISO 8601 is the human-readable standard — its YYYY-MM-DDTHH:mm:ss.sssZ format is unambiguous, sortable as a string, and understood by every major programming language and database.

Frequently Asked Questions

How does the tool distinguish seconds from milliseconds?

Timestamps with 10 or fewer digits are treated as seconds (valid through the year 2286). Timestamps with 13 digits are treated as milliseconds. This heuristic correctly handles all current and near-future dates. If your timestamp falls outside the expected range, the tool will display an error.

Why does my timestamp show the wrong time?

The most common cause is timezone confusion. Unix timestamps are always UTC. If you received a timestamp labeled “PST” but it was actually generated in UTC, the displayed local time will be off by 8 hours. Another common issue: some systems (Java, JavaScript) use milliseconds while others (Unix, Python) use seconds. Entering a millisecond timestamp as seconds shows a date 1000x too far in the future.

What is the best way to store dates in a database?

For most applications, store timestamps as UTC (either Unix timestamps or ISO 8601 with the Z suffix) and convert to local time only at display time. This avoids DST ambiguity and makes cross-timezone queries trivial. PostgreSQL's TIMESTAMPTZ and MySQL's TIMESTAMP types handle this automatically. Avoid storing local times without timezone information — you lose the ability to unambiguously reconstruct the moment.