Understand Unix timestamps, seconds vs milliseconds, and how to convert between epoch values and readable dates.
Unix timestamps show up everywhere: logs, JWT claims, databases, caches, and analytics pipelines. They are compact and reliable for machines, but opaque for humans.
The most common source of confusion is not the epoch itself. It is whether a system is using seconds or milliseconds.
A Unix timestamp is the number of seconds or milliseconds elapsed since 00:00:00 UTC on January 1, 1970 (the Unix epoch).
Computers use it as a neutral, sortable time representation that avoids locale-specific date formatting.
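As a quick sketch in TypeScript (assuming a Node or browser runtime), the current moment can be read in both units: Date.now() returns milliseconds since the epoch, and dividing by 1000 gives the seconds form.

// Current Unix time in both units.
// Date.now() counts milliseconds since 1970-01-01T00:00:00Z.
const nowMs: number = Date.now();                 // 13 digits today, e.g. 1710500000000
const nowSec: number = Math.floor(nowMs / 1000);  // 10 digits today, e.g. 1710500000

console.log(nowMs, nowSec);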
For example, 1710500000 is a 10-digit seconds timestamp, while 1710500000000 is its 13-digit milliseconds equivalent. The tool converts Unix timestamps to ISO dates and back.
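For illustration, here is a minimal TypeScript sketch of both directions using the built-in Date object; it uses the example values above, which correspond to 2024-03-15T10:53:20Z.

// Seconds epoch -> ISO 8601: Date works in milliseconds, so scale up first.
const epochSeconds = 1710500000;
const iso = new Date(epochSeconds * 1000).toISOString();
console.log(iso); // "2024-03-15T10:53:20.000Z"

// ISO 8601 -> seconds epoch: Date.parse returns milliseconds.
const backToSeconds = Math.floor(Date.parse(iso) / 1000);
console.log(backToSeconds); // 1710500000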
As a quick heuristic, 10 digits usually means seconds and 13 digits usually means milliseconds, but the source system should still be your real authority.
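In code, that heuristic often becomes a small normalizing helper. The function below is a hypothetical sketch, not a standard API; the 1e11 cutoff simply treats anything up to 11 digits as seconds and anything larger as milliseconds.

// Hypothetical helper: normalize an epoch of unknown unit to milliseconds.
// Values below 1e11 are assumed to be seconds (covering dates up to roughly
// the year 5138); larger values are assumed to already be milliseconds.
function toMilliseconds(epoch: number): number {
  return epoch < 1e11 ? epoch * 1000 : epoch;
}

console.log(toMilliseconds(1710500000));    // 1710500000000
console.log(toMilliseconds(1710500000000)); // 1710500000000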
The timestamp itself is timezone-neutral and anchored to UTC.
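A short sketch of what that means in practice: the same epoch value can be rendered in different timezones, but it still names one instant, and the numeric value never changes with the viewer's locale. The timezone names below are just examples.

const instant = new Date(1710500000 * 1000); // one fixed instant

console.log(instant.toISOString()); // "2024-03-15T10:53:20.000Z"
console.log(instant.toLocaleString("en-US", { timeZone: "America/New_York" }));
console.log(instant.toLocaleString("ja-JP", { timeZone: "Asia/Tokyo" }));

// The underlying epoch value is the same regardless of display timezone.
console.log(instant.getTime() / 1000); // 1710500000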