UTC Clock
Coordinated Universal Time
What is UTC?
UTC (Coordinated Universal Time) is the global time standard that all other time zones are defined relative to. Unlike local time, UTC never changes for daylight saving. Servers log events, APIs return timestamps, and databases store records in UTC so that time-sensitive data stays consistent regardless of where it is processed or read.
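A UTC clock like the one on this page can be derived from a single Date value. The sketch below is an illustration, not this page's actual implementation; the utcClock name is invented here.

```javascript
// Render the current UTC wall-clock time as "HH:MM:SS".
// toISOString() always reports the instant in UTC, so slicing out
// the time portion gives a daylight-saving-proof clock reading.
function utcClock(d = new Date()) {
  return d.toISOString().slice(11, 19); // characters 11-18 are "HH:MM:SS"
}

console.log(utcClock()); // current UTC time, e.g. "14:30:00"
```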
What is a Unix Timestamp?
A Unix timestamp counts the number of seconds (or milliseconds) that have elapsed since 00:00:00 UTC on January 1, 1970. This moment is called the Unix epoch. Timestamps are timezone-agnostic — the same integer value means the same instant everywhere on Earth.
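In plain JavaScript (no libraries), the current instant can be captured as a timestamp in both units and converted back to a UTC string:

```javascript
// Current instant as a Unix timestamp, in milliseconds and in seconds.
const ms = Date.now();
const seconds = Math.floor(ms / 1000); // drop sub-second precision

// The same integer names the same instant everywhere on Earth:
// converting back always yields a UTC rendering (the trailing "Z").
const iso = new Date(seconds * 1000).toISOString();
console.log(seconds, iso);
```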
Tips
- Always store dates and times in UTC in your database. Convert to local time only at display time.
- Unix timestamps in seconds fit in a signed 32-bit integer only until January 19, 2038. Use a 64-bit integer for anything that needs to outlive that date.
- ISO 8601 format (2024-05-06T14:30:00Z) is human-readable and sorts correctly as a string, making it ideal for logs and filenames.
- The Z at the end of an ISO timestamp stands for Zulu time, which is another name for UTC.
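The string-sortability point above can be checked directly: for ISO 8601 strings that share the same UTC format, a plain lexicographic sort is also a chronological sort.

```javascript
const logs = [
  "2024-05-06T14:30:00Z",
  "2023-12-31T23:59:59Z",
  "2024-01-01T00:00:00Z",
];
logs.sort(); // plain string sort, no date parsing needed
// → chronological order, oldest first
```

Note this only holds when every string uses the same zone designator and precision; mixing offsets like +02:00 with Z breaks the property.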
Frequently Asked Questions
Is UTC the same as GMT?
For practical purposes, yes. GMT is a time zone; UTC is the international time standard, maintained with atomic clocks and kept within 0.9 seconds of mean solar time by leap seconds, so the difference is invisible to most applications. Developers use UTC; meteorologists and navigators often still say GMT.
Why do APIs return Unix timestamps instead of formatted dates?
Timestamps are a single integer with no ambiguity about timezone, locale, or date format. Any language can convert them to a local time. Formatted strings require parsing and carry implicit assumptions about format and timezone.
How do I convert a Unix timestamp to a date in JavaScript?
Pass milliseconds to the Date constructor: new Date(timestamp * 1000) for a seconds-based timestamp, or new Date(timestamp) if already in milliseconds.
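A short worked example (the timestamp value is just an illustration):

```javascript
const secTs = 1715005800;   // seconds-based (10 digits, common in APIs)
const msTs = 1715005800000; // the same instant in milliseconds (13 digits)

const fromSeconds = new Date(secTs * 1000); // scale seconds up to ms
const fromMillis = new Date(msTs);          // already in ms, pass directly

// Both represent 2024-05-06T14:30:00 UTC.
console.log(fromSeconds.toISOString()); // "2024-05-06T14:30:00.000Z"
```

A quick heuristic when the unit is undocumented: for present-day dates, a 10-digit timestamp is in seconds and a 13-digit one is in milliseconds.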
What is the Year 2038 problem?
32-bit signed integers can store values up to 2,147,483,647. Unix timestamps in seconds will exceed that value on January 19, 2038 at 03:14:07 UTC, so systems still using 32-bit time storage will overflow. Most modern systems have migrated to 64-bit time values, which remain valid for roughly another 292 billion years.
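The exact rollover moment can be verified in JavaScript, which stores numbers as 64-bit floats and so is not itself affected; the |0 below merely simulates 32-bit truncation.

```javascript
const INT32_MAX = 2 ** 31 - 1; // 2147483647, max signed 32-bit value

// The last second representable as a signed 32-bit Unix timestamp:
const lastSecond = new Date(INT32_MAX * 1000);
console.log(lastSecond.toISOString()); // "2038-01-19T03:14:07.000Z"

// One second later, a 32-bit value wraps to a large negative number
// (JavaScript bitwise operators truncate their operands to 32 bits):
const wrapped = (INT32_MAX + 1) | 0; // -2147483648
```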