Timestamp → Date
Date → Timestamp
daytics is free. Help keep it that way.
Built by one person. No ads, no paywalls. If it saved you time, a coffee goes a long way.
How to Use This Tool
A Unix timestamp is the number of seconds that have elapsed since 00:00:00 UTC on 1 January 1970 (the "Unix epoch"). It's the standard time representation in Unix-like systems, databases, APIs, JWTs, and log files — portable, unambiguous, and easy to compare. This converter goes both ways: enter a timestamp to see the human-readable date in UTC, local time, and ISO-8601, or enter a date to get its timestamp in seconds and milliseconds.
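The same two-way conversion can be sketched in a few lines of Python (a minimal illustration, not the tool's own code; the example timestamp is arbitrary):

```python
from datetime import datetime, timezone

ts = 1700000000  # seconds since the Unix epoch
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())  # 2023-11-14T22:13:20+00:00

# and back again: date -> timestamp
back = int(dt.timestamp())
print(back)  # 1700000000
```

Note that the round trip is exact because the timestamp carries no timezone: it always names the same instant, and only the rendering (UTC, local, ISO-8601) changes.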
How to Use
- On the left panel: paste a Unix timestamp, choose seconds or milliseconds, and click Convert. You'll see UTC, local time, ISO-8601, and a relative ("3 hours ago") representation.
- On the right panel: pick a date and time, or click Use current time, and convert to timestamps in both seconds and milliseconds.
- The Live current timestamp at the bottom ticks in real time — useful as a reference when debugging.
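The right-panel direction (date and time in, timestamps out) amounts to this sketch in Python, using an arbitrary example date:

```python
from datetime import datetime, timezone

# a chosen date and time -> timestamps in seconds and milliseconds
dt = datetime(2024, 6, 1, 12, 0, 0, tzinfo=timezone.utc)
seconds = int(dt.timestamp())
millis = seconds * 1000
print(seconds, millis)  # 1717243200 1717243200000
```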
Frequently Asked Questions
What is a Unix timestamp?
A single integer representing an instant in time: the number of seconds since 1970-01-01 00:00 UTC. Also called "epoch time" or "POSIX time".
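You can read your system's current epoch time directly, which is what a live-ticking display polls under the hood (a one-liner sketch, not the tool's implementation):

```python
import time

now = int(time.time())  # current Unix timestamp in seconds
print(now)
```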
Why does it start in 1970?
Unix was developed at Bell Labs around 1970, and the engineers chose a recent round date as the zero point. That arbitrary choice has stuck as the universal epoch for most computing platforms.
Seconds vs milliseconds?
Most back-end systems and Unix tooling use seconds. Most JavaScript, Java, and modern APIs use milliseconds. Use the selector above to match what your system expects — 10-digit values are almost always seconds, 13-digit values are milliseconds.
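The digit-count rule of thumb above can be automated. This is a heuristic sketch (the function name and cutoff are illustrative, and it assumes present-day magnitudes — it would misjudge very old or far-future dates):

```python
def to_seconds(ts: int) -> int:
    """Normalize a timestamp to seconds, guessing the unit from digit count."""
    if len(str(abs(ts))) >= 13:  # 13+ digits: almost certainly milliseconds
        return ts // 1000
    return ts                    # ~10 digits: treat as seconds

print(to_seconds(1700000000000))  # 1700000000 (was milliseconds)
print(to_seconds(1700000000))     # 1700000000 (already seconds)
```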
What happens after 2038?
On 19 January 2038 at 03:14:07 UTC, signed 32-bit integer timestamps overflow. Systems that still use 32-bit Unix time will see times jump backwards or fail. Most modern systems use 64-bit timestamps, which won't overflow for ~292 billion years.
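You can verify the exact rollover instant yourself: the largest value a signed 32-bit integer can hold is 2^31 − 1 seconds past the epoch.

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2147483647, the last second a signed 32-bit clock can represent
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00
```

One second later the value wraps to −2^31, which a 32-bit system reads as 13 December 1901.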
