Unit of time
A unit of time is a standardized measure used to quantify duration, intervals, or the passage of events in the physical world. In the International System of Units (SI), the base unit of time is the second (symbol: s), defined as the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom, at rest and at a temperature of 0 K.[1]

Historically, the second was derived from astronomical phenomena. In 1956 the International Committee for Weights and Measures (CIPM) defined it as 1/31,556,925.9747 of the tropical year for 1900 January 0 at 12 hours ephemeris time, a quantity known as the ephemeris second,[2] and the 11th General Conference on Weights and Measures (CGPM) ratified that definition in 1960. To achieve greater precision and independence from Earth's irregular rotation, the 13th CGPM redefined the second in 1967 using the caesium atomic transition, establishing an atomic time standard that remains in use today.[2]

Common derived and accepted units of time build upon the second for practical applications. The minute (min) equals 60 seconds, the hour (h) equals 3,600 seconds, and the day (d) equals 86,400 seconds; these non-SI units are explicitly accepted for use with the SI because of their widespread adoption in everyday and scientific contexts.[3] Larger intervals, such as the week (7 days), the month, and the year (approximately 365.25 days), are calendar-based and not formally part of the SI but are essential for chronology and scheduling.[3] In physics and engineering, smaller subunits such as the millisecond (10^{-3} s) and microsecond (10^{-6} s) enable precise measurements in fields ranging from electronics to cosmology.[4]
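Expressed as a relation between the defining constant and the unit, the definition above amounts to the following restatement of the SI definition, using no quantities beyond those already given:

```latex
\Delta\nu_{\mathrm{Cs}} = 9\,192\,631\,770\ \mathrm{Hz}
\qquad\Longrightarrow\qquad
1\ \mathrm{s} = \frac{9\,192\,631\,770}{\Delta\nu_{\mathrm{Cs}}}
```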
Historical Development

Ancient Units
Prehistoric humans relied on observable natural cycles to conceptualize and measure time, with the alternation of day and night serving as the most fundamental unit, driven by the Earth's rotation relative to the Sun.[5] The recurring phases of the Moon, completing a cycle approximately every 29.5 days, provided an early basis for tracking shorter periods akin to months, while seasonal changes tied to the Sun's annual path enabled longer-term divisions for agriculture and rituals.[5] These cycles lacked precise quantification but formed the practical foundation for all subsequent timekeeping across cultures.[6]

In ancient Mesopotamia, particularly among the Sumerians and later Babylonians, time units evolved from these natural observations into more structured systems around 2000 BCE. The Sumerians approximated the year at 360 days, aligning it with a simplified zodiacal circle divided into 12 equal parts of 30 days each, which facilitated early calendrical planning.[7] The Babylonians adopted the division of the day into 24 hours and refined it with a sexagesimal (base-60) system, subdividing each hour into 60 minutes and each minute into 60 seconds, a framework that emphasized divisibility for astronomical and administrative purposes.[8]

Ancient Egyptians, around 1500 BCE, developed practical devices to divide the solar day into 12 daytime hours and 12 nighttime hours, using sundials to track the Sun's shadow during daylight and water clocks (clepsydras) to measure intervals at night or in low light.[9] These hours varied in length seasonally to reflect the changing duration of daylight, prioritizing alignment with natural solar cycles over fixed equality, and supported temple rituals, labor organization, and Nile flood predictions.[9]

Greek and Roman civilizations adapted these concepts into longer cycles for civic and religious use. The Greeks established the Olympiad as a four-year interval between the Olympic Games starting in the 8th century BCE, serving as a standardized era for dating historical events and synchronizing calendars across city-states.[10] In Rome, the nundina formed an eight-day market cycle during the Republic, marking periodic assemblies and trade days within the lunar month, which complemented but differed from emerging seven-day planetary weeks influenced by Hellenistic astronomy.[11]

Ancient Chinese calendars incorporated the xun as a ten-day week, evident in Shang dynasty records from around 1250–1046 BCE, where it structured divinations, sacrifices, and administrative tasks alongside lunar months and solar years.[12] This decimal-based division reflected the broader ten-stem (tiangan) system, providing a rhythmic subunit for the 60-day sexagenary cycle used in cyclical dating.[13]

Transition to Modern Standards
The introduction of mechanical clocks in 14th-century Europe represented a pivotal technological advancement in timekeeping, allowing for the precise subdivision of hours into minutes and seconds, which facilitated more accurate daily scheduling and astronomical observations.[14] These devices, first appearing in northern Italy around 1270 and spreading across the continent, shifted reliance from sundials and water clocks to escapement mechanisms driven by weights, enabling public towers to chime the hours reliably.[15]

In 1582, Pope Gregory XIII promulgated the Gregorian calendar through the papal bull Inter gravissimas, correcting the Julian calendar's accumulated error by omitting 10 days (October 4 was followed directly by October 15) to realign the calendar with the solar year of approximately 365.2425 days.[16] The reform also refined the leap year rules: years divisible by 4 are leap years, except for century years, which must be divisible by 400 to qualify, thus reducing the average year length to match solar cycles more closely over centuries.[17]

The 19th century's railway boom drove further standardization, as disparate local times caused scheduling chaos; in response, North American railroads adopted five time zones in 1883, while the International Meridian Conference in Washington, D.C., in 1884 established the Greenwich meridian as the prime reference, formalizing Greenwich Mean Time (GMT) for global coordination.[18] This conference, attended by delegates from 25 nations, recommended dividing the world into 24 time zones of 15 degrees each, laying the groundwork for international synchronization despite uneven adoption.[19]

Early 20th-century efforts addressed irregularities in Earth's rotation by introducing ephemeris time, with the ephemeris second defined in 1956 as 1/31,556,925.9747 of the tropical year at the 1900 epoch, derived from Simon Newcomb's solar tables to provide a uniform scale based on orbital motion rather than variable day lengths.[20]

A bold but short-lived experiment occurred with the French Revolutionary Calendar, enacted in 1793 and abolished in 1805, which divided the day into 10 decimal hours of 100 minutes each, with minutes further split into 100 seconds, aiming for metric consistency but failing amid resistance from traditionalists and practical disruptions in agriculture and trade.[21]

Scientific Definitions
The Second in SI
The second, symbol s, is the SI base unit of time. It is defined by taking the fixed numerical value of the caesium frequency Δν_Cs, the unperturbed ground-state hyperfine transition frequency of the caesium-133 atom, to be 9 192 631 770 when expressed in the unit Hz, which is equal to s⁻¹.[22] This definition, established in 1967 by the 13th General Conference on Weights and Measures (CGPM) and reaffirmed in the 2019 SI revision, replaced earlier astronomical standards, providing a stable and reproducible measure independent of Earth's rotation. The value was chosen to closely match the previous ephemeris second, ensuring continuity in timekeeping.[2]

Historical refinements leading to this definition began in the 17th century with pendulum-based clocks, pioneered by Christiaan Huygens in 1656, which achieved accuracies of about 10 seconds per day by regulating time through gravitational swings, though the formal unit remained tied to the solar day.[2] By the 1920s, quartz crystal oscillators improved precision to around three seconds per year, as demonstrated in early NIST quartz clocks insulated against environmental noise.[2] The shift to atomic standards occurred in 1955, when the National Physical Laboratory (NPL) in the UK developed the first practical caesium-beam atomic clock, with NIST building its own in 1959, measuring the hyperfine transition frequency with unprecedented stability and paving the way for the 1967 SI adoption.[23] An interim step was the 1956 ephemeris second, defined by the International Committee for Weights and Measures (CIPM) as 1/31,556,925.9747 of the tropical year for 1900 January 0 at 12 hours ephemeris time, bridging mean solar time and atomic measurements.[24]

Modern realizations of the second using caesium atomic clocks achieve fractional uncertainties of about 10^{-16}, corresponding to an accuracy of one second in 300 million years, as exemplified by NIST-F2.[25] This precision underpins critical applications, including GPS satellite synchronization, where timing errors must not exceed nanoseconds for accurate positioning, and telecommunications networks, which rely on atomic time for data packet ordering and signal synchronization.[26] The 2019 SI revision by the CGPM reaffirmed the second's definition unchanged while redefining the entire system through fixed values of fundamental constants, including the caesium hyperfine frequency Δν_Cs = 9,192,631,770 Hz, enhancing overall metrological consistency without altering time measurement.[27]
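The quoted stability figure follows from simple arithmetic: at a fractional frequency uncertainty of about 10⁻¹⁶, a clock accumulates roughly one second of error only after about 10¹⁶ elapsed seconds. A minimal Python check of that order-of-magnitude claim (variable names here are illustrative, not from any standard library):

```python
# Order-of-magnitude check: how long until a caesium fountain clock with a
# fractional frequency uncertainty of ~1e-16 accumulates one second of error?

CAESIUM_HZ = 9_192_631_770        # periods of the Cs-133 hyperfine transition per SI second
FRACTIONAL_UNCERTAINTY = 1e-16    # approximate accuracy of NIST-F2-class fountain clocks
SECONDS_PER_YEAR = 365.25 * 86_400

seconds_until_1s_error = 1 / FRACTIONAL_UNCERTAINTY           # ~1e16 s
years_until_1s_error = seconds_until_1s_error / SECONDS_PER_YEAR

print(f"One SI second = {CAESIUM_HZ:,} caesium periods")
print(f"~1 s of error after about {years_until_1s_error:.2e} years")  # ~3.2e8, i.e. ~300 million
```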
Other Fundamental Units

In fields such as astronomy and relativity, several units of time are defined beyond the SI second to address specific theoretical or observational needs, often tying into natural scales like celestial motions or fundamental constants. The sidereal second measures time based on Earth's rotation relative to distant stars, forming the basis for sidereal time systems used in celestial mechanics to track stellar positions. It is defined such that a sidereal day consists of 86,164.0905 mean solar seconds, making one sidereal second approximately 0.99727 mean solar seconds.[28][29] This accounts for the Earth's orbital motion around the Sun, which makes the sidereal day about four minutes shorter than the mean solar day.[30]

The Julian century serves as a standardized long-duration unit in astronomical computations, equivalent to 36,525 mean solar days or precisely 3,155,760,000 seconds. It facilitates calculations involving gradual phenomena like the precession of the equinoxes, where rates are expressed in arcseconds per Julian century to simplify historical and predictive models.[31][32]

In relativity, the light-second—defined as the distance light travels in vacuum during one second (exactly 299,792,458 meters)—is often used interchangeably with the second in spacetime diagrams and analyses of causal structure. This approach sets the speed of light c to unity, allowing time intervals to be measured in light-seconds for events like signal propagation or the temporal extent of event horizons in black hole metrics, where the horizon's "size" corresponds to the time light takes to traverse it.[33][34]

The Planck time represents the fundamental timescale in theories of quantum gravity, derived from the gravitational constant G, the reduced Planck constant \hbar, and c: t_P = \sqrt{\frac{\hbar G}{c^5}} \approx 5.39 \times 10^{-44} \ \text{seconds}. It denotes the shortest interval at which classical spacetime concepts break down, marking the scale where quantum fluctuations dominate gravitational effects and a unified theory is required.

In particle physics, the Fermi time—approximately 10^{-24} seconds—characterizes the timescale for light to traverse a typical nucleon (diameter about 1 femtometer), providing a benchmark for strong interaction processes within nuclear matter. This duration aligns with the expansion and cooling phases in high-energy nuclear collisions, where quark-gluon plasmas form and evolve before hadronization.[35]
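The numerical values quoted above follow directly from the stated constants; a short Python sketch (using rounded CODATA-style constant values, with illustrative variable names) reproduces them:

```python
import math

c = 299_792_458            # speed of light, m/s (exact)
G = 6.674_30e-11           # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.054_571_817e-34   # reduced Planck constant, J s

# Planck time: t_P = sqrt(hbar * G / c^5)
t_planck = math.sqrt(hbar * G / c**5)
print(f"Planck time ~ {t_planck:.3e} s")          # ~5.39e-44 s

# Sidereal second: sidereal day of 86,164.0905 mean solar seconds / 86,400
print(f"Sidereal second ~ {86_164.0905 / 86_400:.5f} mean solar s")   # ~0.99727

# Fermi-scale time: light crossing a ~1 fm nucleon
print(f"Fermi time ~ {1e-15 / c:.1e} s")          # ~3e-24 s, i.e. of order 1e-24
```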
Common Units

Sexagesimal Divisions
The sexagesimal divisions of time, based on a base-60 numeral system, structure the day into hours, minutes, and seconds for practical use in everyday life, navigation, and technology. This system divides the mean solar day—approximately 86,400 seconds—into 24 equal hours, each further subdivided into 60 minutes and each minute into 60 seconds, facilitating precise timekeeping without decimal fractions in most contexts.[36]

The hour represents one twenty-fourth of the mean solar day, a convention tracing back to ancient Egyptian timekeeping, where the day was divided into 12 hours of daylight and 12 of night using duodecimal (base-12) divisions observed via sundials and star clocks. Later standardization incorporated the sexagesimal subdivisions for minutes, influenced by Babylonian astronomical practices that emphasized base-60 calculations for their divisibility. In modern usage, the hour serves as a fundamental interval in scheduling, work shifts, and global coordination, with each hour equaling 3,600 seconds.[37][36]

The minute is defined as one-sixtieth of an hour, or 60 seconds, providing a convenient scale for short-duration activities like cooking or meetings. This division enhances the hour's utility by allowing finer granularity while maintaining the sexagesimal framework's advantages, such as easy halving into 30 minutes or quartering into 15 minutes.[38]

The second forms the base unit of these divisions, serving as the SI unit of time and underpinning all higher sexagesimal units. One hour thus comprises 3,600 seconds, and one minute 60 seconds, ensuring consistency in scientific measurements and daily applications.[1]

Time notation often employs either a 24-hour clock, which runs continuously from 00:00 to 23:59 to avoid ambiguity across day-night cycles, or a 12-hour clock supplemented by AM (ante meridiem, before noon) and PM (post meridiem, after noon) indicators, the latter being more common in casual Anglo-American contexts but prone to confusion in international settings. The ISO 8601 standard formalizes the 24-hour format as HH:MM:SS (hours:minutes:seconds), promoting interoperability in data exchange, computing, and documentation worldwide.[39][40]
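As a minimal illustration of these fixed ratios, the following Python sketch converts a count of seconds into the ISO 8601-style HH:MM:SS notation described above and back again (function names are illustrative):

```python
def seconds_to_hms(total_seconds: int) -> str:
    """Convert a second count into sexagesimal HH:MM:SS notation."""
    hours, remainder = divmod(total_seconds, 3_600)   # 1 h = 3,600 s
    minutes, seconds = divmod(remainder, 60)          # 1 min = 60 s
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}"

def hms_to_seconds(hms: str) -> int:
    """Convert HH:MM:SS notation back into a second count."""
    hours, minutes, seconds = (int(part) for part in hms.split(":"))
    return hours * 3_600 + minutes * 60 + seconds

print(seconds_to_hms(45_296))        # '12:34:56'
print(hms_to_seconds("12:34:56"))    # 45296
```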
Calendar-Based Units

Calendar-based units of time extend beyond the fixed divisions of the day to encompass longer periods derived from astronomical cycles, particularly the Earth's rotation, its orbit, and the Moon's phases. These units form the backbone of civil calendars, balancing solar and lunar observations to maintain alignment with seasons and cultural practices. The day serves as the foundational unit, with the mean solar day—the average time for the Sun to return to the same meridian position—equaling approximately 86,400 seconds, or 24 hours, as established in historical definitions of the ephemeris second.[41] In contrast, the sidereal day, measured relative to distant stars, lasts about 23 hours, 56 minutes, and 4 seconds, reflecting the Earth's rotation period without accounting for orbital motion around the Sun.

The week introduces a seven-day cycle not directly tied to celestial mechanics but rooted in ancient Babylonian astronomy, where the lunar cycle's quarters suggested divisions every 7.4 days, approximated to seven for ritual purposes, such as marking "evil days" or rest periods.[42] This structure was later integrated into Judeo-Christian traditions through the biblical account of creation in seven days, culminating in the Sabbath as a day of rest, influencing its global adoption in calendars.[43]

The month builds on lunar observations, with the synodic month—the time from one new Moon to the next—averaging 29.53 days, so that 12 such months yield a 354-day year in purely lunar calendars, which require periodic intercalation to synchronize with the solar year and prevent seasonal drift.[44] In the Gregorian calendar, months vary from 28 to 31 days, yielding an average length of 30.436875 days per month to approximate the solar year.[45]

The year represents the longest common calendar-based unit, primarily the tropical year of approximately 365.2422 days, defined as the time for Earth to complete one orbit relative to the vernal equinox, driving seasonal cycles.[45] The sidereal year, measured against fixed stars, is slightly longer at about 365.2564 mean solar days, because the precession of the equinoxes shifts the equinox point and shortens the tropical year relative to Earth's full orbital period.[46] To accommodate the fractional day, leap years insert an extra day on February 29: the Julian calendar averages 365.25 days by adding a leap day every four years, while the Gregorian reform refines this to 365.2425 days on average by skipping leap days in certain century years, enhancing accuracy over centuries.[45]
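The leap-year rules described above can be stated in a few lines of Python; computing the long-run averages they imply reproduces the calendar lengths quoted in this section (a sketch, not a calendrical library):

```python
def is_gregorian_leap_year(year: int) -> bool:
    """Gregorian rule: divisible by 4, except century years not divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def is_julian_leap_year(year: int) -> bool:
    """Julian rule: every fourth year is a leap year."""
    return year % 4 == 0

# Average year length over one full 400-year Gregorian cycle
gregorian_days = sum(366 if is_gregorian_leap_year(y) else 365 for y in range(1, 401))
print(gregorian_days / 400)        # 365.2425 days per year
print(gregorian_days / 400 / 12)   # 30.436875 days per month on average
print(sum(366 if is_julian_leap_year(y) else 365 for y in range(1, 5)) / 4)  # 365.25 (Julian)
```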
Specialized Units

In Physics and Astronomy
In physics, the concept of time is central to understanding relativistic effects, where the distinction between proper time and coordinate time becomes crucial. Proper time, denoted as \tau, represents the time interval measured by a clock following a specific worldline between two events, independent of any coordinate system. In contrast, coordinate time t is the time component in a chosen reference frame, such as that of a stationary observer. This differentiation arises in special relativity, where the invariance of the spacetime interval leads to observable discrepancies in time measurements across different frames.[47]

A key manifestation is time dilation, where a clock moving at velocity v relative to an observer experiences less elapsed proper time than the coordinate time t recorded by the stationary observer. The relationship is given by the formula \tau = t \sqrt{1 - \frac{v^2}{c^2}}, where c is the speed of light; this equation derives from the Lorentz transformation and has been experimentally verified through phenomena like muon decay in cosmic rays, where observed lifetimes exceed predictions made without relativistic corrections. In general relativity, gravitational fields further modify this, but the special relativistic case highlights how units of time, such as seconds, vary contextually for physical processes.[48]

In astrophysics, black holes introduce extreme timescales governed by quantum effects. The Hawking evaporation time t_H, the duration for a black hole to lose its mass via Hawking radiation, scales with its mass M according to t_H = \frac{5120 \pi G^2 M^3}{\hbar c^4}, where G is the gravitational constant and \hbar is the reduced Planck constant; for a solar-mass black hole (M \approx 2 \times 10^{30} kg), this yields approximately 10^{67} years, vastly exceeding the current age of the universe and underscoring the stability of stellar-mass black holes. This timescale emerges from semiclassical calculations balancing quantum particle emission against gravitational binding.[49]

Cosmological models employ units like the Hubble time to quantify the universe's expansion history. Defined as t_H \approx 1/H_0, where H_0 is the present-day Hubble constant (approximately 70 km/s/Mpc), it gives a characteristic timescale of roughly 14 billion years; refined models incorporating dark energy yield a measured age of the universe of about 13.8 billion years. On galactic scales, light travel times illustrate vast distances: traversing the Milky Way's diameter of roughly 100,000 light-years requires 100,000 years at light speed, explaining why we observe ancient stellar events from our position. Pulsars, rapidly rotating neutron stars, operate on millisecond scales, with rotation periods as short as about 1.4 milliseconds for the fastest known examples, enabling precise timing for gravitational wave detection and navigation.[50][51][52]

A notable astronomical unit of time is the cosmic year, the period for the Sun to complete one orbit around the Milky Way's center, spanning 225 to 250 million Earth years at an orbital speed of about 230 km/s. This long cycle contextualizes galactic dynamics, such as the Sun's passage through spiral arms every few tens of millions of years, influencing star formation rates.[53]
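Both of the timescales quoted above follow from inserting standard constant values into the stated formulas; a brief Python sketch (rounded constants, illustrative variable names) reproduces the orders of magnitude:

```python
import math

c = 299_792_458                  # speed of light, m/s
G = 6.674_30e-11                 # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.054_571_817e-34         # reduced Planck constant, J s
SECONDS_PER_YEAR = 365.25 * 86_400

# Hawking evaporation time for a solar-mass black hole: t = 5120*pi*G^2*M^3 / (hbar*c^4)
M_sun = 2e30                     # kg
t_evap = 5120 * math.pi * G**2 * M_sun**3 / (hbar * c**4)
print(f"Hawking evaporation ~ {t_evap / SECONDS_PER_YEAR:.1e} years")  # of order 1e67 years

# Hubble time for H0 ~ 70 km/s/Mpc
Mpc_in_km = 3.0857e19
H0 = 70 / Mpc_in_km              # 1/s
print(f"Hubble time ~ {1 / H0 / SECONDS_PER_YEAR / 1e9:.1f} billion years")  # ~14
```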
In Computing and Technology

In computing and technology, time units are adapted from standard SI definitions to suit digital systems, where precision at microsecond and sub-microsecond scales is critical for scheduling, performance measurement, and synchronization. These units enable efficient resource allocation in operating systems, hardware timing in processors, and data representation in software protocols, often prioritizing computational efficiency over absolute physical accuracy.

The jiffy serves as an informal unit in computing, frequently denoting a brief interval for animations and frame updates, such as 1/60 second (approximately 16.67 milliseconds) to match typical frame rates or 1/100 second (10 milliseconds) for smoother rendering.[54] In Unix-like systems, particularly the Linux kernel, a jiffy represents the duration between consecutive timer interrupts, known as the kernel's "heartbeat," with its length determined by the HZ configuration parameter—commonly 10 milliseconds (HZ=100) for general-purpose systems or 1 millisecond (HZ=1000) for low-latency environments like real-time applications.[55] This variability allows kernels to balance responsiveness and power efficiency, as higher tick rates increase interrupt overhead but improve scheduling precision.[56]

A tick refers to the basic interval of a system's clock, used for task scheduling and timekeeping in operating systems. In Microsoft Windows, the default system timer tick is on the order of 10 to 16 milliseconds, though it can be adjusted via multimedia timers to finer resolutions like 1 millisecond for audio and video processing.[57] In Linux kernels, the tick aligns with the jiffy duration, often set to 1 millisecond in high-performance configurations to support precise timing for multimedia and embedded systems.[58] These intervals ensure periodic context switches, preventing any single process from monopolizing the CPU while maintaining system stability.

The nanosecond, equal to 10^{-9} seconds, is fundamental to measuring processor performance, where clock speeds are expressed in gigahertz (GHz), indicating billions of cycles per second. For instance, a 1 GHz processor completes one cycle every 1 nanosecond, while a 4 GHz unit achieves cycles in 0.25 nanoseconds, allowing rapid instruction execution in modern computing tasks like data processing and simulations.[59] This scale underscores the need for nanoscale timing in hardware design, as even slight delays can impact overall system throughput.

Epoch time, or the Unix timestamp, quantifies time as the number of seconds elapsed since the Unix epoch—January 1, 1970, at 00:00:00 UTC—providing a compact, integer-based representation for dates in software.[60] This convention simplifies cross-system synchronization in distributed computing but faces limitations in 32-bit implementations, leading to the Year 2038 problem: on January 19, 2038, at 03:14:07 UTC, a signed 32-bit counter reaches its maximum of 2^{31} − 1 seconds (roughly 68 years after the epoch) and then wraps to a negative value, potentially causing systems to jump back to dates in late 1901 or fail in time-dependent operations like file timestamps and database queries (see the sketch at the end of this section).[61] Mitigation involves transitioning to 64-bit integers, which extend the range far beyond practical needs.

Femtosecond units, at 10^{-15} seconds, are employed in advanced technologies like femtosecond lasers, which deliver ultrashort pulses for precise applications in micromachining, biomedical imaging, and telecommunications.
These lasers enable non-thermal ablation in materials processing, minimizing heat damage, and support high-resolution techniques such as multiphoton microscopy for cellular studies.[62]
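As a concrete illustration of the 32-bit limit discussed above, the following Python sketch shows the largest representable signed 32-bit Unix timestamp, the UTC instant it corresponds to, and the date a naive wraparound would produce:

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
MAX_INT32 = 2**31 - 1            # 2,147,483,647 seconds

# Last instant representable by a signed 32-bit seconds-since-epoch counter
print(EPOCH + timedelta(seconds=MAX_INT32))   # 2038-01-19 03:14:07+00:00

# One second later the counter wraps to -2**31, which naive code reads as a date in 1901
print(EPOCH + timedelta(seconds=-2**31))      # 1901-12-13 20:45:52+00:00
```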
Interrelations and Conversions

Mathematical Relations
The sexagesimal division of time, inherited from ancient Babylonian mathematics, establishes fixed ratios relative to the second: one minute equals 60 seconds, one hour equals 60 minutes or 3,600 seconds, and one mean solar day equals 24 hours or 86,400 seconds.[41] These ratios form the basis for interconnecting everyday time units within the broader framework of the International System of Units (SI), where the second serves as the fundamental unit.[4]

SI prefixes enable the expression of time intervals across decimal scales by multiplying or dividing the second by powers of 10. For submultiples, the deci- prefix denotes 10^{-1} seconds (one decisecond), while centi- denotes 10^{-2} seconds (one centisecond); for multiples, kilo- denotes 10^{3} seconds (one kilosecond, equivalent to about 16.67 minutes), and mega- denotes 10^{6} seconds (one megasecond, equivalent to about 11.57 days).[63] These prefixes facilitate precise quantification in scientific contexts, such as millisecond measurements in particle physics or megasecond timescales in astrophysics, without altering the base unit definition.[22]

Calendar-based units relate to the second through astronomical periods approximated in SI terms. The mean tropical year, the time for Earth to complete one orbit relative to the vernal equinox, spans approximately 31,556,925.9747 seconds.[64] Similarly, the synodic lunar month, the interval between successive new moons as viewed from Earth, averages about 2,551,442 seconds (or 29 days, 12 hours, 44 minutes).[65] These values provide the mathematical foundation for converting between solar and lunar calendars, though they vary slightly due to orbital perturbations.[66]

To convey the vast scales of time, relative comparisons highlight the second's role: the estimated age of the universe, based on cosmic microwave background data, is approximately 4.35 × 10^{17} seconds (derived from 13.82 billion years).[67] This underscores the exponential growth in time measurements from atomic to cosmological domains.

In relativistic physics, time units interconnect through equations accounting for gravitational effects, such as the Schwarzschild time dilation formula for an observer near a non-rotating black hole of mass M: the proper time interval \Delta \tau at radial distance r from the center relates to coordinate time t by \Delta \tau = t \sqrt{1 - \frac{2GM}{rc^2}}, where G is the gravitational constant and c is the speed of light.[68] This relation, derived from the 1916 Schwarzschild metric solution to Einstein's field equations, illustrates how time intervals in seconds dilate predictably under strong gravity, with the factor approaching zero at the event horizon (r = 2GM/c^2).
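A short Python sketch evaluates the Schwarzschild dilation factor from the formula above; the chosen mass and radii are illustrative only:

```python
import math

c = 299_792_458          # speed of light, m/s
G = 6.674_30e-11         # gravitational constant, m^3 kg^-1 s^-2

def schwarzschild_dilation_factor(mass_kg: float, r_m: float) -> float:
    """sqrt(1 - 2GM/(r c^2)): proper time elapsed per unit coordinate time at radius r."""
    return math.sqrt(1 - 2 * G * mass_kg / (r_m * c**2))

M_sun = 2e30
r_s = 2 * G * M_sun / c**2                     # Schwarzschild radius, roughly 3 km
for r in (10 * r_s, 2 * r_s, 1.01 * r_s):
    print(f"r = {r / r_s:5.2f} r_s  ->  dtau/dt = {schwarzschild_dilation_factor(M_sun, r):.4f}")
# The factor tends to 0 as r approaches r_s, i.e. proper time "freezes" relative to
# coordinate time at the event horizon.
```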
Conversion Tables

Conversion tables provide quick references for transforming between various units of time, aiding in scientific calculations, engineering applications, and everyday estimations. These tables draw from established standards in metrology, astronomy, and geology, ensuring accuracy for practical conversions.

Table 1: SI Prefixes Applied to the Second
SI prefixes scale the base unit of time, the second (s), by powers of 10, as defined by the International Bureau of Weights and Measures (BIPM).

| Prefix | Symbol | Factor | Example |
|---|---|---|---|
| kilo | k | 10³ | 1 ks = 1,000 s |
| hecto | h | 10² | 1 hs = 100 s |
| deca | da | 10¹ | 1 das = 10 s |
| (none) | - | 10⁰ | 1 s = 1 s |
| deci | d | 10⁻¹ | 1 ds = 0.1 s |
| centi | c | 10⁻² | 1 cs = 0.01 s |
| milli | m | 10⁻³ | 1 ms = 0.001 s |
| micro | μ | 10⁻⁶ | 1 μs = 10⁻⁶ s |
| nano | n | 10⁻⁹ | 1 ns = 10⁻⁹ s |
| pico | p | 10⁻¹² | 1 ps = 10⁻¹² s |
Table 2: Sexagesimal Units to Decimal Seconds
Sexagesimal systems, rooted in ancient divisions of the hour and day, convert to decimal seconds as follows; in astronomy, angular degrees relate to time via Earth's rotation, where 1° corresponds to 4 minutes of time since the full circle (360°) aligns with 24 hours.[69]

| Unit | Equivalent in Seconds |
|---|---|
| 1 minute | 60 s |
| 1 hour | 3,600 s |
| 1 day | 86,400 s |
| 1° (astronomical time) | 240 s (4 minutes) |
Table 3: Calendar-Based Units to Seconds
Calendar units approximate longer periods; a week is exactly 7 days, while the Julian year, used in astronomy, totals precisely 31,557,600 seconds based on 365.25 mean solar days of 86,400 seconds each, as adopted by the International Astronomical Union (IAU).[70]

| Unit | Equivalent in Seconds |
|---|---|
| 1 week | 604,800 s |
| 1 mean synodic month | 2,551,442 s |
| 1 mean tropical year | 31,556,926 s |
| 1 Julian year | 31,557,600 s |
Table 4: Specialized Units and Scales
Specialized units include light-time measures in physics, where 1 light-second represents the distance light travels in 1 second, equivalent to a time interval of 1 second or approximately 3.17 × 10^{-8} Julian years; in computing, a jiffy often denotes 1/60 second (≈0.0167 s) tied to 60 Hz power cycles.[71][72]

| Unit/Scale | Equivalent in Seconds |
|---|---|
| 1 light-second (time interval) | 1 s (≈ 3.17 × 10^{-8} years) |
| 1 jiffy (computing) | ≈ 0.0167 s |
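The factors in Tables 1–4 lend themselves to a simple lookup-based converter; the sketch below hard-codes the values quoted in this section (names are illustrative, not from any standard library):

```python
# Seconds per unit, taken from the values quoted in Tables 2-4 above.
SECONDS_PER_UNIT = {
    "second": 1,
    "minute": 60,
    "hour": 3_600,
    "day": 86_400,
    "week": 604_800,
    "mean synodic month": 2_551_442,
    "mean tropical year": 31_556_926,
    "Julian year": 31_557_600,
}

def convert(value: float, from_unit: str, to_unit: str) -> float:
    """Convert a duration between any two units listed in the table."""
    return value * SECONDS_PER_UNIT[from_unit] / SECONDS_PER_UNIT[to_unit]

print(convert(1, "Julian year", "day"))     # 365.25
print(convert(2, "week", "hour"))           # 336.0
print(convert(90, "minute", "hour"))        # 1.5
```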