Expected more
Facebook/Meta has a lot of smart engineers working for it. I would have expected a bit more from these people than "We don't like this solution, please don't do it" – i.e. an alternative solution.
Meta's engineering team has proposed doing away with leap seconds. Time Lords at the International Earth Rotation and Reference Systems Service occasionally decree to add a leap second – usually a 3601st second in an hour – to reflect the changing speed of the Earth's rotation and ensure that our measurements of time remain …
From the article it sounded like they were adding support to an existing solution: run the clock a little slower for a few hours around the time when a leap second is scheduled.
This has benefits and problems: it will work well with software that does not understand leap seconds. It will work badly with software that does understand leap seconds. It adds yet another possibility for confusion when computers using different time standards have to agree on the time.
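For anyone unfamiliar with smearing, here is a minimal sketch of what a linear smear does, in Python; the 24-hour window and the centring on the leap instant are assumptions for illustration, not anyone's documented implementation:

def smear_fraction(t, leap_instant, window=86400.0):
    # Fraction of the extra leap second absorbed by time t (all values in epoch seconds).
    start = leap_instant - window / 2.0
    if t <= start:
        return 0.0
    if t >= start + window:
        return 1.0
    return (t - start) / window

# While the smear is active, every second shown by the clock is stretched by a
# factor of (1 + 1/window): about 11.6 parts per million for a 24-hour window.
# That stretching is exactly the trade-off the rest of this thread argues over.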
Facebook are being reasonable here: identifying a problem. Pointing at one possible part of a solution. Not inflicting on the world a solution that solves only their own problems. Trying to get the problem addressed in a forum where many stakeholders have a voice. Hopefully we will end up with one more time standard that suits many people instead of dozens of new ones.
One of the biggest problems with handling time is the massively fragmented way we handle it. Virtually every broken system of time calculation is alive and well in some area of the world, industry, or academia. The worst part of the problem is that most of us reflexively think we understand it and that it is somehow reducible to something simple or easy.
That may be true when we operate as individuals, dealing with stuff from our immediate environment on the scale of minutes or hours within the same day, but it breaks down rapidly from there.
We fixed our system of time, understandably and narcissistically, on the large ball of dirt we are presently stuck on. It wobbles in fairly complex ways that took us literally millennia to figure out. Trying to keep clocks in sync with it while maintaining both metrics of time and a calendar that predate our understanding of them is a lesson in both humility and futility.
Meta's engineers should be marched out into the parking lot and flogged with a kippered haddock for falling prey to the same kind of problems that they are ostensibly complaining about. Changing the duration of a second for aperiodic and non-deterministic windows of time is going to break a bunch of other systems and assumptions. They can't fix other people's mistaken assumptions by introducing another, incompatible version of the same idea – which, ironically, was THEIR mistaken assumption. Come on guys, XKCD... Also, who really thinks that "which second is this?" is a worse problem than "what is a second? Are all seconds created equal? Starting when? Where is the record of all unequal seconds back to some arbitrary point, say the Big Bang? How about handling time during the period when the solar system was forming?"
Any "solution" would still require everyone to understand how the new system worked and then in most cases re-write everything back to the beginning of time. Good luck with that.
Even if you think you are clever and define a safe and arbitrary time reference and provide a well-documented conversion function back to current earth standard time, everything will still fall apart the second you have to deal with anything sourced from outside your organization. Time isn't easy, and in most cases any idea of time that doesn't carefully define its frames of reference isn't going to be precise, predictable, reliable or deterministic. If that is something you need your time to be, put in the work or fail, possibly spectacularly.
Meta's engineers should be marched out into the parking lot and flogged with a kippered haddock for falling prey to the same kind of problems that they are ostensibly complaining about. Changing the duration of a second for aperiodic and non-deterministic windows of time is going to break a bunch of other systems and assumptions.
Those knock-on effects are precisely what will sink this. Since the metre is defined in terms of c and the second, smearing has the effect of making a metre longer.
Tell Zuck this proposal will make his cock smaller and there'll soon be a U turn.
Ha! Changing a metre is only the beginning of the problem. With that goes mass, the speed of light, all forces and accelerations... the electrical world will go wonky too, with the volt, ampere etc. changing. Planets will mis-align with calculations ... you'll probably find that the value of your bank account will change too, and certainly any investments linked to gold (see "mass"). Road speed limits will vary. It might be fun :)
I once tried explaining to a corporate tour that the clock in the studio we were in was accurate. This is a slave clock running off a master clock in the server room and we know that the second pulse is highly accurate. The group are C-suite types with no technical understanding so this is confusing. One of them tells me that his bedside clock has a radio in it, that makes it super accurate. His clock isn’t just accurate with the seconds but the hours and minutes too.
I asked if he’d ever needed to change the time on his clock and he said “No”. Well, we might need to if, for example, we are recording something for future broadcast. It’s no good if the presenter looks up at the clock and reads out the current time as 2pm when the show will be going out at 9pm. Another one asked how we play out a show so accurately that the time on the clock during recording matches the time at broadcast. So I then explained about playout systems, which a few of them found utterly incomprehensible.
Trying to explain the same thing to my mum involved the “it’s all done by science” line when she said it was baffling her.
"This has benefits and problems: it will work well with software that does not understand leap seconds. It will work badly with software that does understand leap seconds. It adds yet another possibility for confusion when computers using different time standards have to agree on the time."
Agreed. Leap seconds have been around for a while now and most systems deal with them; they are a known factor. Facebook's backing of the "smearing" is even noted by them as complex to do. So why change? Why not just put the effort into making sure existing stuff deals with the present system instead of creating a new "complex" system that most systems probably don't support?
Leap seconds have been around for a while now and most systems deal with them...
My thoughts entirely.
Dear Fuckberks, if you still have systems that have a problem with leap seconds, here's an obvious solution: Fix the bloody things!
Reinventing the wheel and coming up with a square version is always a bad idea.
Well, yes and no. We'd temporarily speed up the earth's rotation; it would then return to normal. But if we'd gained a second, we'd keep it. Sort of as if you had a clock running a bit slow and set it ahead; you wouldn't "fix" the long-term slowness of the clock, but it would temporarily be closer to the actual time.
Strange as it may seem, the human race is actually having an effect on the earth's rotation, though in the wrong direction to fix this problem. There's always been a seasonal component to the earth's rotation, caused by ice melting and re-freezing at the poles. There is now a secular component due to ice melting and not re-freezing; the meltwater tends to flow away from the poles and, via conservation of angular momentum, the earth turns at a slightly slower rate. This will cause more leap seconds, not fewer.
I suppose you could argue that a side benefit of "fixing" global warming would be that if ice started re-forming near the poles, we might get the earth's rotation rate back up to match the SI second, eliminating the need for leap seconds. Fine-tuning might then be required, in the form of emitting or capturing CO2 to get the rotation rate Just So.
Sorry, lapsed into astronomer-speak, where "secular" = "linearly changing over time", usually opposed to "periodic" = "changing back and forth around some mean value". The seasonal changes are periodic. The overall trend to a slowing earth is secular.
I probably also should have noted that most of the gradual (secular) change in the earth's rotation rate is, of course, due to lunar tides. That causes a transfer of angular momentum; the moon recedes at about 3 cm/year, and the earth rotates slightly more slowly.
I conjecture that the downvoters thought I was blaming global warming for leap seconds. It's a measurable factor, but that's mostly just because we've gotten really, really good at measuring the earth's rotation. And, in fact, we've had a slight speed-up in recent years; we may yet see a negative leap second. Kinda early to say, but there have been reversals of that sort (all before leap seconds were adopted.)
If we do reach the point where there's a negative leap second, I'd expect lots of arguing in favor of abolishing leap seconds. I'd bet lots of code is not currently set up for such... come to think of it, my own code doesn't account for negative leap seconds, and it'd be some serious work to do so. (Though on the bright side, we don't have the "one second repeating" issue that occurs with positive leap seconds.)
> the moon recedes at about 3 cm/year, and the earth rotates slightly more slowly.
So if we attach a tether to the Artemis missions, lasso the Moon and pull it back towards us, we can remove the need for any more leap seconds *and* ensure that our descendants will still be able to enjoy a proper Solar Eclipse?
All of the problems identified are down to crap software implementations that do not allow for leap seconds or are simply not tested. Over the same periods many systems just kept going and nothing troublesome was reported.
So maybe they should actually test their systems on known and predictable-a-year-ahead events before deploying them?
Also worth noting that for decades there have been systems that get round these issues by keeping either "GPS time" or TDT and then applying an offset, just as DST hour changes and global time zones are handled in UNIX. So there already are tested solutions to this problem if you simply cannot allow for non-uniform time.
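For illustration, the offset approach amounts to something like this in Python (the constant is the published value since the 2017 leap second; a real system would take it from the GPS navigation message or the IERS bulletins rather than hard-coding it):

GPS_MINUS_UTC = 18   # seconds; GPS time = TAI - 19 s, and TAI - UTC = 37 s since 2017

def gps_to_utc(gps_epoch_seconds):
    # GPS time ticks uniformly with no leap seconds; UTC is recovered by
    # subtracting the current offset at the edges, in the same way local
    # time is recovered from UTC by applying a zone/DST offset.
    return gps_epoch_seconds - GPS_MINUS_UTC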
So how many other software systems have fallen over on DST changes? Remember Windows 95 doing that?
Now it won't, because the cron times are stored as UTC. This was done as the bug fix for some older Unix versions, which just kept the text versions of the dates. Some other variants (HP for sure) also had an obscure cron bug that applied if the server was rebooted in that magic hour.
"So how many other software systems have fallen over on DST changes? Remember Windows 95 doing that?"
Although in this case, I think eliminating DST might work as a solution. Leap seconds are kind of annoying in the same way that leap year rules are; if we could ignore them, that would be convenient, but if we did, things would start getting a little messed up and we'd spend more time calculating for it than we did just designing the system to handle it.
Are you thinking of the 1024 week number rollover in GPS (about 19.6 years)?
The last time was 2019 and the next 2038.
There is a new version of the protocol which adds 3 bits to the week number, so its first rollover is in 2137.
The underlying second count, however, keeps ticking uninterrupted, so when a leap second occurs the offset is just increased (or, rarely, decreased).
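A quick sanity check of those dates in Python, assuming the GPS epoch of 6 January 1980 (GPS time ignores leap seconds, so a plain calendar delta is close enough here):

from datetime import datetime, timedelta

gps_epoch = datetime(1980, 1, 6)
print(gps_epoch + timedelta(weeks=1024))   # first 10-bit rollover, August 1999
print(gps_epoch + timedelta(weeks=2048))   # second rollover, 7 April 2019
print(gps_epoch + timedelta(weeks=3072))   # the next one, late 2038
print(gps_epoch + timedelta(weeks=8192))   # first rollover of the 13-bit field, ~2137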
So if Zuck gets his way, there wouldn't be exactly 60000 ms between 00:00:00 and 00:01:00 on a leap second day? That's astonishing behavior and tbh I'd rather keep intervals exact and live with an extra inserted second, than have intervals perhaps off by a few hundred ms, if they're a few hours in length. Imagine playing a two hour movie and discovering that your audio/visual sync gets progressively more and more fucked up because naively taking the system time and assuming 1s = 1000ms = 60 frames = 48000 samples breaks during leap second smearing :/
tbh I'd rather keep intervals exact and live with an extra inserted second, than have intervals perhaps off by a few hundred ms, if they're a few hours in length
When the leap second is inserted, NTP, maintaining time on equipment, will smear it over a period of time to correct all of the computer clocks. What you are fearing is already the norm.
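For a sense of scale, a back-of-the-envelope in Python, assuming a 24-hour linear smear (the window length is an implementation choice, not a given):

window = 24 * 3600             # smear window in seconds
rate_error = 1.0 / window      # ~11.6 ppm while the smear is active
two_hours = 2 * 3600           # the two-hour film from the post above
print(two_hours * rate_error)  # ~0.083 s of stretch across the whole interval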
Meta are free to switch from using UTC to TAI or GPS, if they so choose. They don't have to persuade anyone else to do this.
The impact will be on libraries which convert utime to human-readable date/time: those will need changing to account for leap seconds. I'm sure Meta can afford the resources to maintain their own libc fork and their own TZ database.
Their applications will also need to be careful not to hard-code things like
day = utime / 86400
time = utime % 86400
which are quite tempting short-cuts for programmers.
Using TAI internally would be the simplest solution; leap second manipulations would only need to be done when converting to and from UTC (and timezones based on UTC). Hardcoded conversions like
tai_day = tai_utime/86400
tai_time = tai_utime%86400
are perfectly valid in TAI.
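A rough sketch of that split in Python (the two table rows are real published TAI-UTC values, but they are only illustrative; a production system would load the full list from tzdata or the IERS bulletins rather than hard-coding it):

# Each entry: (Unix-UTC second at which the offset takes effect, TAI - UTC in seconds)
LEAP_TABLE = [
    (63072000,   10),   # 1972-01-01, when TAI-UTC was fixed at 10 s
    # ... the intervening entries elided ...
    (1483228800, 37),   # 2017-01-01, the most recent leap second
]

def tai_to_utc(tai_seconds):
    # Find the offset in force and strip it off; everything upstream of this
    # function can keep using tai_day = tai_seconds // 86400 with a clear conscience.
    offset = LEAP_TABLE[0][1]
    for boundary, off in LEAP_TABLE:
        if tai_seconds - off >= boundary:
            offset = off
    return tai_seconds - offset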
I understand the problem leap seconds are solving, but do we need them? We are talking about a difference between two ways of "measuring" time that diverge by about a minute every century. If we just stop adding them and let the two diverge, it's going to take >5,000 years for it to make an hour's difference. We are very happy and generally cope with applying an arbitrary one-hour DST correction to our time a couple of times a year (with the exception of those countries that don't do DST). All that would need to be done to fix the divergence of the two time systems would be to cancel one of those two arbitrary DST changes every 5,000 years. The people who need to be aware of the difference between the two methods are already familiar with there being an offset between them and with how this changes as leap seconds are added. They should be quite capable of coping with this offset getting bigger over the next 5,000 years. For the rest of us it's a minor change to the way the DST change is calculated in 5,000 years.
Exactly. Because where it matters, systems already account for the human IERS decision to add the leap second or not. They will not stop working because the extra second is added this year or next, or even deferred for years. It's just an offset that builds up and gets nulled out from time to time. It had built up to 10 seconds before we started nulling it out. If we hadn't done anything it would now be 37.
I agree, but why even adjust at all? In modern society, there is no real need for the calendar year to coincide with the solar year. Already now it doesn't: midwinter is ten days before New Year, so letting it slide even further doesn't matter. Our months don't coincide with the phases of the moon, so why should our year coincide with the Earth's orbit around the Sun?
We could even drop leap days every (roughly) four years, and it wouldn't matter. We could even decide that every other month is 30 days and the rest 31 days (getting rid of the irregularity of February), making a year 366 days instead of 365.25 days. Or we could make months 30 days each, which makes a year 360 days, which divides evenly by many numbers -- the reason we have 360 seconds to an hour and 360 degrees to a circle. And while we are at it, drop time zones and use CET everywhere. So what if school starts at 03:00 in some countries and at 17:00 in other countries?
Astronomers already use a different year that aligns with the positions of stars (other than our own), called the sidereal year, so they can keep using astronomically accurate time, but the rest of us don't have to.
Try putting a purely lunar calendar on your wall for the next 20 mean tropical years, and see how well you adapt to a calendar year not coïnciding with the mean tropical year.
the reason we have 360 seconds to an hour and 360 degrees to a circle.
You might want to recheck your seconds-per-hour formula.
"And while we are at it, drop time zones and use CET everywhere. So what if school starts at 03:00 in some countries and at 17:00 in other countries?"
Well, it makes scheduling things harder. When you go to another country, you have to calculate in your head to figure out when people normally wake up or eat lunch, instead of using the same numbers. If you want to describe to an international audience how you were called in the middle of the night, you can just say "I got a call at 20:30" and let them figure out where you probably were at the time. Not to mention that, for those fortunate enough to be in Europe or Africa, you get to be asleep when the days change, whereas those on other continents would switch from Tuesday to Wednesday during normal daylight hours. We have reasons to describe the time relative to a day, and the sun's out at different times for different places. Time zones work pretty well for that.
"Our months don't coincide with the phases of the moon, so why should our year coincide with the Earth's orbit around the Sun."
Because the sun controls the climate, and it's kind of useful to have data about times of year that don't change. We currently can look at climate data and know that July is a summer month in the northern hemisphere and a winter month in the southern, so we can add new data to an average for that month and have useful numbers. If we eventually have to calculate that July is now a spring month in the northern hemisphere but it used to be a summer one, so the data for this July should be averaged with the old data from May, at least until July 7th when we should start averaging it into the data for old June, that would be a bit annoying. Similarly for things that are scheduled in the year. If you want to have a scheduled summer holiday, it would be complicated to keep moving it as the calendar slid the months into times you didn't like.
We don't need to do that with the moon because the moon's control over conditions here is lower. Full moon versus new moon is, for most people, not really important. Those who do care about that can use calendars based on it, or more often have lunar information written on their calendars. If it was doing things everyone cared about, like the sun is, we'd probably still be using it with ours.
"In modern society, there is no real need for the calendar year to coincide with the solar year."
I think this is more about the length of a day. Eliminate leap seconds and it might take a while for the change in sunup, sundown and noon to be noticed. But pretty soon all the night owls (and high school kids) will complain about having to get up so damned early.
The basic problem here is that "second" is about the only SI unit that shares a name with an older unit. The second used to be defined as 1/86,400 of a day. In the SI it is based on the transition frequency of a caesium atom, which, unlike an earth day, will never vary, ever. Over the years the two definitions have diverged, but everybody still thinks of the second as a fraction of a day.
Just let UTC tick without the leap seconds, apply them to GMT instead. Then offset local time zones from GMT instead of UTC.
Software can catch up slowly with the new plan or not at all.
Smearing is ok but not great. For those who are against it, have you considered what your computer does when the NTP servers tell it it’s fast or slow?
Yes: just count seconds since the epoch and things become simpler.
Put the complexity on converting epoch seconds to a form that a human will understand. That is already quite complicated as it has to cope with time zones and daylight savings. This can have 1.30 am happening twice on some days or not at all.
Part of the problem is that POSIX defines a day as 24 x 3600 seconds; it does not allow + or - 1. Fixing this would be hard (lots of legacy software to check, and then the conversion code), but it would then let us just count seconds.
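The "1.30 am happening twice" case is easy to demonstrate with the stock library (Python 3.9+ zoneinfo; Europe/London is picked purely as an example):

from datetime import datetime
from zoneinfo import ZoneInfo

london = ZoneInfo("Europe/London")
# The clocks went back at 02:00 on 2021-10-31, so 01:30 happened twice that
# morning; 'fold' selects which of the two instants is meant.
first  = datetime(2021, 10, 31, 1, 30, tzinfo=london, fold=0)
second = datetime(2021, 10, 31, 1, 30, tzinfo=london, fold=1)
print(second.timestamp() - first.timestamp())   # 3600.0: same wall time, an hour apart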
It's a good idea, but it solves the wrong problem. What we need is to smear the leap hours that cause us so many problems when we make them go forward or back for "daylight saving". Let's introduce leap minutes every day for 60 days, then 60 negative leap minutes, when we need to put the clocks back. Let the machines handle that and our bodies won't notice. Bonus, your kids won't try to get you up an hour early when the clocks "go back".
The ensuing debacle might persuade us to abandon "daylight saving" altogether.
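Taking the proposal at face value, the schedule would look something like the Python below; the day numbers are invented, and only the one-minute-a-day-for-60-days shape comes from the post above:

def clock_offset_minutes(day_of_year):
    # Hypothetical smeared-DST schedule: drift forward a minute a day in
    # spring, hold through summer, drift back a minute a day in autumn.
    if 60 <= day_of_year < 120:
        return day_of_year - 59           # spring: +1 minute per day
    if 120 <= day_of_year < 270:
        return 60                         # summer: a full hour ahead
    if 270 <= day_of_year < 330:
        return 60 - (day_of_year - 269)   # autumn: -1 minute per day
    return 0                              # winter: standard time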
Please arrange to do this on the exact moment I get paid every month, so that the payment system goes back in time after the first payment is made, and then pays me again a second later.
However you choose to mess with time, there will always be edge cases like this that cause problems.
Don't, just don't. I recall having to argue long and hard to get a new version of a commercial CCTV system to switch over to storing UTC as an offset from an epoch - the previous versions had all stored the time as read from an RTC chip - ie, as ddmmyyyyhhmmss.cc (to the centi-second, no less). And they tried to do all the calculations in that format as well, such as "record at high frame rate for the next 17 minutes": at least that meant there was no need to worry about leapseconds as the errors from trying to do time arithmetic like that could make leapdays vanish into the noise!
There will be only one outcome from this:
Meta, who are ultimately suffering from Not Invented Here syndrome, will add a new, simplified (Invented Here) Meta standard time.
"Yay, get rid of leap days as well."
What level of compensation are you planning to give all those people whose birthday fell on the now defunct and non-existent February 29th? And how about compensating the pubs, restaurants or whatever that will lose out on the money all those people, no longer having a birthday to celebrate, would have spent? Your flippant remark could cause people to go out of business! Have you no heart?
So managing leap seconds is hard, so FZuck it! We'll just not do it.
I seem to recall Indiana taking much the same tack at or around the turn of the (20th) century. That didn't end well, either.
Reading through the comments is interesting and enlightening as usual but no one has yet asked the obvious question: why on earth does it matter to Facebook?
They can't seriously be suggesting that they have all their servers time aligned to the nearest picosecond? They'll be using NTP servers like the rest of us so perhaps all they need is a custom NTP client that can be informed when a leap second is coming-up and apply it as it would any other clock change - either instantaneously or spread slowly over a predefined number of seconds - the smear that they are talking about.
They may well have APIs that are time sensitive – e.g. which distributed DB change happened first – but then if picoseconds matter I suspect they'll find that there are instances where the client they think came second actually tried to make the connection first (in the real world) but packet collisions and a retry meant it came second.
Picosecond aligned clocks? I hope none of those are in the Penthouse trying to talk to systems down in the basement. Or sitting on a particularly dense piece of Earth.
https://www.u-tokyo.ac.jp/focus/en/articles/z0508_00097.html
https://m.youtube.com/watch?v=EDirgdlPEuA
So how does time work if we colonise other planets? Do they get their own timezones relative to themselves, or to some sort of Earth standard time? Will we have to co-ordinate their leap days and leap seconds with ours? What happens once there are a thousand worlds in the Earth Emp... er, Federation? When will my birthday be if I'm on Pluto? Are dog years the same on Venus? Important questions!
At that point, we will have a lot more things we have to do. My guess: you'll have a universal time counter, which may still be UTC but in any case it will work the same. Each planet will have its own seasonal calendar for things that are climate dependent. Anything without climate (E.G. closed space stations) will use the most common calendar, probably Earth's. Weekdays will be aligned. Planets that don't have a 24-hour sun cycle will have very annoyed inhabitants until we can figure out how to make our brains deal with the unbalanced sleep cycles. Of these predictions, the last is likely to cause the most problems, and is independent of what the clocks say. I think that, by the time we have working travel to these planets where the people can survive long-term, we'll have a better method of measuring time and keeping it coordinated across planets.
There are two conflicting definitions of a second.
The SI definition (which is the one that computers should use, as it is measurable and constant) is related to a specific number of vibrations of a particular atom in a known state (IIRC). As with SI standards, sometimes they change them, to something more accurate, so this may no longer be the case...
The other definition, is based upon a second being a sixtieth of a minute, which itself is a sixtieth of an hour, which in turn is one twenty-fourth of a day. That's the "common sense" definition, and it's fine for most uses, but it's absolutely terrible for scientific and technical uses because the length of the Earth's day varies due to various types of orbital wobble*, and is actually getting longer over the long term.
The problem arises when you try to measure days using SI seconds, and expect (for example) midday to always fall at the point when the sun is highest in the sky. Over time, this "wanders", leading to the need to insert leap-seconds in order to keep those things in synch (note that "negative leap-seconds" that Meta is so afeared of are very unlikely because days are, on the whole, getting longer, not shorter). "Smearing" is an awful way of handling this, if you're using the difference between two times to measure something accurately.
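Which is why, for measuring a duration rather than labelling an instant, the usual advice is to reach for a monotonic clock. A minimal Python illustration (do_work is a hypothetical stand-in for whatever is being timed):

import time

start = time.monotonic()          # never jumps when the wall clock is stepped
do_work()                         # hypothetical workload being measured
elapsed = time.monotonic() - start
print(f"took {elapsed:.3f} s")    # a duration, independent of leap-second labelling

Whether a smear or an NTP slew still bleeds into the monotonic rate depends on the platform, but at least nothing jumps by a whole second.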
The root of this problem really is that we are measuring two different things with the same unit and pretending that they are the same.
My preferred solution to this is to introduce some sort of "Universal Time" (not the same as UTC) which doesn't have leap seconds or smearing, so would wander from sidereal time over time, but would therefore also give a reliable measure of the difference between two time points.
Of course, that doesn't work with relativity, which screws everything up by the fact that fast-moving clocks tick more slowly than slow-moving ones. Fast- or slow-moving, that is, with regard to any particular frame of reference.
Are we all bleeding out of our ears yet trying to get our brains round this?
*not the technical term
My preferred solution to this is to introduce some sort of "Universal Time" (not the same as UTC) which doesn't have leap seconds or smearing, so would wander from sidereal time over time, but would therefore also give a reliable measure of the difference between two time points.
You mean like one of these?
https://en.wikipedia.org/wiki/International_Atomic_Time
https://en.wikipedia.org/wiki/Global_Positioning_System#Timekeeping
https://gssc.esa.int/navipedia/index.php/Time_References_in_GNSS
If they are getting this worked up over handling leap seconds, then with any luck their systems will just lock up completely when they are confronted with the relativistic effects of merely being in orbit.
So the O'Neill Colony will be a rugged frontier but blessedly free of Facebook!
Time is actually quite simple for those that understand high-school physics – which Facebook coders don't seem to.
Time is a physical measurement. The unit is called a second, and is defined via the transition frequency of the caesium 133 atom. The most common time periods we measure in everyday life (and model in information systems) are the number of such units it takes the Earth to complete one rotation about its axis (a "day") and the number of units it takes the Earth to complete one revolution around the Sun (a "year"), both in the astronomical (extra-solar-system) frame of reference. Neither of the two is a whole number of basic units and, in addition, since they are both manifestations of the behavior of irregularly shaped and geometrically unstable physical bodies, they are irregular, variable over time, and impossible to predict with anywhere near the same precision as that with which we measure time. (NB: "impossible to predict...")
If those Metaverse self-proclaimed "software engineers" would carefully examine the above paragraph (and let it sink in, over a period of time commensurate with their education in the natural sciences), I am pretty sure they would understand just how absurd a call for doing away with leap seconds is.