itime.day

Unix Expert

Unix timestamp debugger

Paste seconds, milliseconds, microseconds, nanoseconds, or an ISO date. The debugger identifies the unit, normalizes the instant, and generates implementation snippets.

Detected unit: seconds

Digits: 10

Safe integer: yes

Reason: 10 digits or fewer is usually Unix seconds.

Implementation snippets
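A minimal sketch of the digit-count heuristic described above. The function names `detectUnit` and `toMillis` are illustrative, not the tool's actual API, and the classification is a heuristic, not a guarantee:

```javascript
// Classify a numeric timestamp string by digit count.
// Heuristic: current Unix seconds are 10 digits, milliseconds 13,
// microseconds 16, nanoseconds 19.
function detectUnit(input) {
  const digits = input.trim().replace(/^-/, "").length;
  if (digits <= 10) return "seconds";
  if (digits <= 13) return "milliseconds";
  if (digits <= 16) return "microseconds";
  return "nanoseconds";
}

// Normalize to epoch milliseconds for use with Date.
// Note: micro/nanosecond inputs lose sub-millisecond precision here.
function toMillis(input) {
  const n = Number(input);
  switch (detectUnit(input)) {
    case "seconds":      return n * 1000;
    case "milliseconds": return n;
    case "microseconds": return n / 1000;
    default:             return n / 1e6;
  }
}
```

For example, `toMillis("1700000000")` yields `1700000000000`, which feeds directly into `new Date(...)`.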

2038 boundary

Signed 32-bit Unix seconds overflow after 2038-01-19 03:14:07 UTC. Modern 64-bit systems are not limited by this boundary, but old C APIs, embedded systems, and database schemas can still break.

32-bit max second: 2147483647 (2038-01-19T03:14:07.000Z)

One second later: 2147483648 (2038-01-19T03:14:08.000Z)
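JavaScript's Date stores epoch milliseconds in a 64-bit float, so the boundary values above round-trip without overflow. A quick check:

```javascript
// JS Date uses a double for epoch milliseconds, so the
// signed 32-bit seconds boundary is not a limit here.
const maxInt32 = 2147483647;

console.log(new Date(maxInt32 * 1000).toISOString());
// → 2038-01-19T03:14:07.000Z
console.log(new Date((maxInt32 + 1) * 1000).toISOString());
// → 2038-01-19T03:14:08.000Z
```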

Unix Expert FAQ

How do I know if a timestamp is seconds or milliseconds?

Current Unix seconds are usually 10 digits, while JavaScript milliseconds are usually 13 digits. If a date lands in 1970 or far in the future, the unit is probably wrong.

Why are microseconds and nanoseconds risky in JavaScript?

They often exceed Number.MAX_SAFE_INTEGER. Use strings, BigInt, or a split seconds-plus-nanoseconds representation to avoid silently losing precision.
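A sketch of the difference, using an arbitrary 19-digit nanosecond value for illustration:

```javascript
// A 19-digit nanosecond timestamp exceeds
// Number.MAX_SAFE_INTEGER (9007199254740991, ~16 digits).
const nanos = "1700000000123456789";

// Parsing as Number silently rounds to the nearest double:
console.log(Number(nanos));  // precision lost
console.log(BigInt(nanos));  // exact: 1700000000123456789n

// Safer still: split into seconds + nanos without Number at all.
const seconds = BigInt(nanos) / 1_000_000_000n;        // 1700000000n
const remainderNanos = BigInt(nanos) % 1_000_000_000n; // 123456789n
```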

Does timezone change a Unix timestamp?

No. Unix time represents one instant. Timezone only changes the human-readable display.
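A short demonstration of this: the epoch value is identical everywhere, and only the formatted string depends on the timezone.

```javascript
// One instant, two displays: the epoch value never changes.
const ts = new Date(1700000000 * 1000);
console.log(ts.getTime()); // 1700000000000 in every timezone

console.log(ts.toLocaleString("en-US", { timeZone: "UTC" }));
console.log(ts.toLocaleString("en-US", { timeZone: "Asia/Tokyo" }));
// Different wall-clock strings, same underlying instant.
```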

What is the 2038 problem?

Signed 32-bit Unix seconds overflow after 2147483647. Old systems may wrap to a negative date even though modern 64-bit systems are safe.
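The wraparound can be simulated with a typed array, which uses true 32-bit signed integer storage:

```javascript
// Simulate a signed 32-bit time_t one second past its maximum.
const t = new Int32Array([2147483647]);
t[0] += 1; // overflows and wraps to the 32-bit minimum

console.log(t[0]); // -2147483648
console.log(new Date(t[0] * 1000).toISOString());
// → 1901-12-13T20:45:52.000Z
```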
