Example A: Convert a 10-digit Unix timestamp
Input: 1700000000
Detected: Seconds (10 digits)
UTC: 2023-11-14T22:13:20Z
Local Example: 2023-11-15 06:13:20 GMT+08:00
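Example A can be reproduced in one line of JavaScript, assuming the input is already known to be seconds:

```javascript
// Multiply 10-digit Unix seconds by 1000 to get the milliseconds JavaScript expects.
const utc = new Date(1700000000 * 1000).toISOString();
// utc === "2023-11-14T22:13:20.000Z"
```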
A timestamp, also called a Unix timestamp or epoch time, is a number that represents a single instant in time.
In most systems it is counted from 1970-01-01 00:00:00 UTC, which makes it easy to compare, store, and exchange time across logs, APIs, and databases.
A 10-digit value is usually Unix seconds, while a 13-digit value is usually milliseconds.
One of the most common mistakes is using JavaScript Date.now() output as if it were seconds. Date.now() returns milliseconds, so mixing the two creates a 1000x offset.
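The two units are easy to compare side by side in a browser console or Node:

```javascript
// Date.now() is milliseconds since the Unix epoch (13 digits today).
const ms = Date.now();
// Divide by 1000 and floor to get Unix seconds (10 digits today).
const s = Math.floor(ms / 1000);
// Passing ms where s is expected (or vice versa) is off by a factor of ~1000.
```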
A Unix timestamp does not contain timezone information. It represents the same instant everywhere.
UTC output stays the same for everyone, but local time changes with the viewer's timezone and daylight saving rules. That is why the same timestamp can show different clock times in different places.
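For example, one instant can be formatted for two different zones; the `Asia/Shanghai` zone below is just an illustration:

```javascript
const instant = new Date(1700000000 * 1000); // one global instant

// Same Date object, two different wall-clock renderings.
const utcClock = instant.toLocaleString("en-US", { timeZone: "UTC" });
const shanghaiClock = instant.toLocaleString("en-US", { timeZone: "Asia/Shanghai" });
// The strings differ, but both describe the identical moment in time.
```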
By convention, Unix timestamps are usually stored in seconds. A 10-digit value is commonly seconds, while a 13-digit value is commonly milliseconds.
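The digit-count heuristic can be sketched as a small helper; `detectUnit` is a hypothetical name, not a standard API:

```javascript
// Guess the unit of a Unix timestamp from its digit count:
// 10 digits usually means seconds, 13 digits usually means milliseconds.
function detectUnit(ts) {
  const digits = String(Math.trunc(Math.abs(ts))).length;
  if (digits === 10) return "seconds";
  if (digits === 13) return "milliseconds";
  return "unknown"; // shorter or longer values need explicit context
}
```

For instance, `detectUnit(1700000000)` returns `"seconds"` while `detectUnit(1700000000000)` returns `"milliseconds"`. It is only a heuristic: very old or far-future second values fall outside the 10-digit range.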
The timestamp represents the same instant globally, but the formatted local clock time depends on timezone and daylight saving rules.
No. A Unix timestamp is timezone-agnostic and only becomes UTC or local time when it is formatted for display.
Most timestamp tools display the current timestamp live. In JavaScript, Date.now() returns the current time in milliseconds, and Math.floor(Date.now() / 1000) gives the current Unix seconds.
Paste the timestamp into a converter and read the UTC or local datetime output. The most important step is confirming whether the input is seconds or milliseconds.
Paste the Unix timestamp, let the tool detect the unit, and read the full UTC or local datetime output. If you only need clock output, the same conversion also helps you convert unix timestamp to time.
Choose or type a date-time value, then convert it into Unix seconds or milliseconds. Whether the input is interpreted as UTC or as browser-local time changes which instant it maps to, so confirm the intended timezone first.
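Assuming an ISO 8601 input string, the reverse conversion is a one-liner; the trailing Z pins the input to UTC:

```javascript
// Parse an ISO 8601 string (the Z suffix means UTC) into epoch milliseconds,
// then floor-divide down to Unix seconds.
const unixSeconds = Math.floor(Date.parse("2023-11-14T22:13:20Z") / 1000);
// unixSeconds === 1700000000
// Without the Z suffix, the same string is parsed as local time,
// which generally yields a different instant.
```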
Yes. Negative timestamps represent moments before 1970-01-01 00:00:00 UTC, although support can vary across runtimes and databases.
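JavaScript's Date handles negative timestamps transparently, which makes the pre-1970 behavior easy to verify:

```javascript
// -86400 seconds is exactly one day before the Unix epoch.
const preEpoch = new Date(-86400 * 1000).toISOString();
// preEpoch === "1969-12-31T00:00:00.000Z"
```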
Some 32-bit systems store Unix seconds in a signed 32-bit integer, which overflows on 19 January 2038 (the "Year 2038 problem"). Modern 64-bit systems are not affected by that limit.
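The overflow moment follows directly from the width of the integer; a quick check, assuming the runtime itself uses 64-bit time:

```javascript
// Largest value a signed 32-bit integer can hold.
const max32 = 2 ** 31 - 1; // 2147483647
// One second after this, a 32-bit time_t wraps around: the "Year 2038" moment.
const rollover = new Date(max32 * 1000).toISOString();
// rollover === "2038-01-19T03:14:07.000Z"
```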
Need the broader Unix Timestamp Converter home page with related tools?