How can we avoid needing a leap year/second?

  • Given the Earth's current speed around the Sun and its current rate and axis of rotation, what is the best way to keep time so as to avoid a leap year? How many hours per day and days per year would keep things balanced so that we never need to add or remove days from the year? Further, how many minutes per hour and seconds per minute should we have to avoid a leap second?


    All we need to do is drop Mount Everest into the Marianas Trench. That should speed up the Earth's rotation enough that leap seconds are no longer necessary.

    @MikeScott Then we just need to speed up the Earth's orbit around the sun somehow to get rid of leap years.

  • Leap years exist for two reasons:

    • There is not an integer number of days in a year.

    • People perceive a need to keep the seasons where they are on the calendar.

    Given the above, there is no way to avoid leap years, or something similar. Defining the calendar year as a fixed number of days (e.g., 365 days) would result in the seasons shifting by about one day every four years, since the tropical year is roughly 365.2422 days long.
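
    The size of that drift is easy to work out. Here is a minimal sketch (illustrative Python, not from the thread), assuming a tropical year of about 365.2422 days:

    ```python
    # Drift of a fixed 365-day calendar against the seasons, assuming a
    # tropical year of ~365.2422 days.
    TROPICAL_YEAR = 365.2422  # days

    def seasonal_drift(years, calendar_days=365):
        """Days by which the seasons shift after `years` calendar years."""
        return years * (TROPICAL_YEAR - calendar_days)

    for y in (4, 100, 400):
        print(f"after {y:3d} years the seasons shift by {seasonal_drift(y):6.2f} days")
    # after 4 years: ~0.97 days, i.e. roughly one day per four years
    ```
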
    Leap seconds exist for two reasons:

    • The length of a day as measured by an atomic clock is not constant.

    • People perceive a need to keep midnight at midnight and noon at noon.

    Given the above, there is no way to avoid leap seconds, or something similar. Defining the day as a fixed number of atomic clock seconds (e.g., 86400) would result in your clock and the Sun disagreeing on mean local noon, initially by a very small amount, though one that slowly accumulates.
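
    To see the scale involved, here is a back-of-the-envelope sketch; the ~2 ms/day excess length of day is an assumption for illustration (the real excess wanders and is not constant):

    ```python
    # How quickly a day slightly longer than 86400 SI seconds forces a
    # one-second correction, assuming a ~2 ms/day excess (illustrative
    # figure; the real excess varies).
    EXCESS_MS_PER_DAY = 2.0

    days_to_one_second = 1000.0 / EXCESS_MS_PER_DAY
    print(f"one second accumulates in ~{days_to_one_second:.0f} days "
          f"(~{days_to_one_second / 365.25:.1f} years)")
    ```
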
    That said, there are serious proposals to eliminate leap seconds. Some people, such as those who use UTC to timestamp financial transactions, do not like them. So far, those proposals have been rejected. The standard response is that it's not UTC that's broken; it's the use of UTC in contexts where it shouldn't be used that is broken. If you need a monotonically increasing time scale, use TAI or GPS time instead; a sketch of the conversion appears below.
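
    As a minimal sketch of what that conversion looks like (illustrative code, not from the answer; the table is deliberately partial, listing only the most recent leap seconds, where a real implementation would carry the full IERS list):

    ```python
    # Converting UTC to TAI with a leap-second table. GPS time, for
    # comparison, runs at TAI - 19 s.
    from datetime import datetime, timedelta

    LEAP_TABLE = [            # (effective UTC date, TAI - UTC in seconds)
        (datetime(2012, 7, 1), 35),
        (datetime(2015, 7, 1), 36),
        (datetime(2017, 1, 1), 37),
    ]

    def utc_to_tai(utc):
        """TAI instant for a UTC datetime (valid here for dates >= 2012-07-01)."""
        offset = max(off for eff, off in LEAP_TABLE if utc >= eff)
        return utc + timedelta(seconds=offset)

    print(utc_to_tai(datetime(2020, 6, 1)))  # 2020-06-01 00:00:37
    ```
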
    What difficulties would arise if separate units were created for "long civil second", "median civil second", and "short civil second", with the first being defined as being one part in 60 million longer than a "normal" second, the middle being a "normal" second, and the last defined as being one part in 60 million shorter, and civil time were specified as switching among the different kinds of seconds for different years? Applying a 1 microsecond per minute correction for a year would seem less disruptive than occasionally applying a one-second correction all at once.

    That might seem simpler, but I think it's actually messier that way. Definition of a second here: http://physics.nist.gov/cuu/Units/second.html If you allow "seconds" to vary with the Earth's rotation, then lots of things change with it, including the numerical value for the speed of light. Besides, I don't think it's very disruptive to add or remove a second every 6 months, or however often it's done.

    Most proposals to get rid of leap seconds are based around simply allowing midnight/noon to be wrong, since they would only become noticeably wrong after centuries, particularly because time zones are already off by tens of minutes depending on where you are.

    We could avoid leap seconds if we were okay with solar time and atomic time being unrelated things.

    Avoiding leap seconds is utterly trivial: don't use SI seconds but what I call "calendar seconds", aka UT1.

    @supercat - That would be a huge step backwards. From late 1920 to 1972, official time in the US was kept by the radio station WWV. The clocks used to generate that signal had their tick rate modulated to stay within 0.1 seconds of UT2 (a time standard that was officially deprecated in 1972, along with GMT). There are many, many reasons why it is better to have clocks tick at a uniform rate as opposed to a rate dictated by the rotating Earth. If you want a uniform tick rate (and many people do), you'll have to put up with leap seconds.

    For completeness' sake, it's worth noting that we are actually OK with noon and midnight being wrong in the U.S. (and other places) for about half a year: during Daylight Saving Time. And now that I think about it, for most of a time zone the sun is not directly overhead at noon. Between Washington, DC and Ohio, the solar time changes by about an hour (the sun rises about an hour later in Ohio), but they are in the same time zone.
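
    The size of that solar-time spread follows directly from the rotation rate; a small sketch (illustrative, with nominal zone-edge longitudes that are my assumption, not from the comment):

    ```python
    # The Earth turns 360 degrees in 24 hours, i.e. 15 degrees/hour, so
    # solar noon shifts by 4 minutes per degree of longitude.
    def solar_offset_minutes(lon_west_a, lon_west_b):
        """Minutes by which solar noon at longitude b lags longitude a."""
        return (lon_west_b - lon_west_a) * 4.0

    # A time zone nominally spans 15 degrees, so solar noon can vary by
    # about an hour across it (nominal edge longitudes assumed here):
    print(solar_offset_minutes(67.5, 82.5))  # 60.0 minutes
    ```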

    @DavidHammen: you don't *have* to put up with leap seconds; you could let mean local noon slip. As Todd says, people live with it being up to 90 minutes out (DST plus half a time zone) and barely notice; even more so in large places with a single time zone, such as China. But there comes a point where it's genuinely inconvenient in practice. I don't think we ever actually need two leap seconds in a year, so keeping them really amounts to taking on a small responsibility now rather than kicking the can 3,000+ years down the road and letting someone else deal with a one-hour discrepancy.

    @SteveJessop - As I mentioned in my answer, there are serious proposals to eliminate leap seconds: just let our concept of solar time drift. I'll update my answer to make that stick out a bit more. Regarding the need for more than two leap seconds per year: we came perilously close in the late 1970s / early 1980s; the Earth's rotation rate has sped back up a bit since then. That won't last forever. The Earth's rotation is slowly slowing down, and within a few centuries the concept of only two leap seconds per year will seem rather quaint (if we still use leap seconds then).

    @DavidHammen: What I am proposing is not that things just arbitrarily slip, but rather that precise corrections be linearly interpolated over the course of a year. If the "linear time" representing midnight, Jan. 1, 2016 UTC is lt0, and the "civil time" is ct0, and that year was designated as using long civil seconds, then within that year the relationship between civil time and linear time would be ct = ct0 + (lt - lt0) * 60000001/60000000. Applications which need to track relative time would use linear time (which would simply be a number of seconds since some mark), while...

    ...applications which needed to track time of day could either use linear time and apply the correction (if they had time bases more accurate than 16 ppb), or simply use a local time base and periodically sync to a civil time base (the more typical action, given that most common time bases are off by a few parts per million anyway). I'm not sure what is gained by applying corrections a whole second at a time rather than specifying that they be applied continuously. How is adding an individual leap second better than adding 525,600,000,000 individual leap picoseconds?
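
    In code, that hypothetical "long/median/short civil second" scheme (mirroring the formula given in the comments above; this is a sketch of the proposal, not any real standard) would look something like this:

    ```python
    # Civil time runs faster or slower than linear (atomic) time by one
    # part in 60 million, with the designation fixed per year.
    RATE = {
        "long":   60_000_001 / 60_000_000,  # the comment's formula for a long year
        "median": 1.0,
        "short":  59_999_999 / 60_000_000,
    }

    def civil_time(lt, lt0, ct0, kind):
        """Civil time for linear time lt (seconds since an epoch), given
        the linear/civil times lt0, ct0 at the start of the year and the
        year's designation."""
        return ct0 + (lt - lt0) * RATE[kind]

    # Over one 365-day year, a "long" designation corrects by about half
    # a second -- matching the 525,600,000,000-picosecond figure above:
    print(365 * 86400 * (RATE["long"] - 1))  # ~0.5256 s
    ```
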

    Leap seconds can be avoided by directly using solar time.

Content dated before 7/24/2021, 11:53 AM is licensed under CC BY-SA with attribution.