Could they choose a different time
Nice time to do it - when everyone who is supposed to be handling a potential fallout is completely plastered off their t*ts...
The Time Lords at the Earth Orientation Center of the International Earth Rotation Service have again decided the world needs an extra second and have picked New Year's Eve as the best moment for the extra sliver of time. As the Center's notice states, at 23h 59m 59s on December the 31st you'll need to hold off your …
As if there was much fallout from previous leap seconds. Yes, there are real-time systems which need to be synchronized to high precision and have independent clocks. Yes, there is going to be a bit of faffing around to ensure that the time difference caused by clock drift is accounted for. That was always the case, and hopefully you have learned at least a little about how to handle it. And, on a positive note, there are going to be fewer such systems live on a day which is considered a holiday by a large part of our civilization. For one, I am quite certain there will not be any trading on 1st Jan.
Well, they could choose a better time but where is the fun in that? Think of all those Swiss Tonys rubbing their hands in glee at the prospect of yet another Millennium Bug. Time again for, "trebles all round lads, but not the sysadmin - he's got to man the phones and look like we're really earning the money by monitoring the clients' networks"
Mid-morning, what time-zone?
Doing it at midnight UTC makes much more sense, because it won't screw up people who are working in most timezones, and you'll have 9 hours to fix anything that does go wrong.
However, I agree on the date. There are better days to choose.
I'm infinitely more concerned, however, that a leap second mechanism that's existed for years, if not decades, still doesn't have a general solution in any software package dependent on time. It shouldn't be something we're wasting so much time on any more, just as you shouldn't be using any software that doesn't understand February 29th.
"still doesn't have a general solution in any software package dependent on time."
Well, the Unix standards are quite clear on it, as an example from the strftime manpage:
"Second as a decimal number [00,60]; the range of values is [00,60] rather than [00,59] to allow for the occasional leap second.".
If implementers choose not to follow the standard that's hardly the fault of the leap second model. Much the same applies to day-of-year calculations, which must allow for 1-366, not 1-365.
Well, leaving aside the point that "never test for an error you can't handle" was intended as a joke:
1. How do you know about ALL the errors?
Good design. If you can get an error you didn't know about, you have a design flaw, and probably a security bug.
2. Even if you do know about an error, how do you know you can't handle it?
Well, as a trivial example, what happens if you have a C program that calls exit(), and execution continues from after the exit() call? Officially that can't happen, exit() is defined not to return, but do you test to see if, due to a bug, it does? What would you do if, by some error, exit() did return? You can't just call exit() if exit() fails...
In general, there's no point in checking to see if exit() returns, it should never do so and there's nothing you can do to improve the situation if it does.
3. Isn't testing part of the answer to both questions?
No, good design is.
Vendor guidance? That'd be nice; we had to request it from some of ours, as they hadn't yet considered/documented it. Like when we had to prompt them that a certificate was about to expire and would stop their software from working... suddenly a 'patch' was issued to work around it. To be installed at our expense, of course...
Windows does essentially the same thing: ignores the leap second and treats the updated time after the event as clock skew, adjusting over an hour or so.
Your junior devs will never be good enough to handle leap seconds correctly.
Your server clock is not that accurate anyway.
It doesn't matter for most applications.
If you are not sure whether it matters for your application, it doesn't. If it did you would know because you would have an atomic time source in your lab.
Then they cause problems for themselves - as you are fully one second out of whack if you ever compare "seconds since a certain time" against the result of "current time in seconds".
And your server clock can be accurate to thousandths of a second without even trying. NTP alone provides millisecond accuracy.
And it doesn't matter for most applications, but it DOES matter for the ones that fall over when it changes if you haven't taken it into account. Word isn't going to throw a fit, but anything billing, accounting, collating statistics, or reading data each second is going to mess up unless you allow for it. And that's EXACTLY why you program those as if they DO matter and not just slew clocks. E.g. the 59th second of Dec 31st will collect TWICE as much - transactions, temperature readings, billing periods, etc. - as any other second in the year, which might well trigger alerts and compensations that you DO NOT want.
No need for atomic clocks.
But an extreme need for people to learn that - when programming anything reliant on the clock - the OS clock can be extremely unreliable.
And it's more to do with people doing things like programming seconds fields that only go from 0 to 59, so a 60 causes them to crash, loop, or worse.
There's no excuse - just as there's no excuse for programming leap years wrong (every four years? WRONG!). The specification is out there, and if you don't program to the specification, you're going to have trouble.
The leap second is easily noticeable using a shortwave radio tuned into a time signal (WWV, CHU, or European equivalent) where they'll actually insert the 60th second.
Most easily by comparison to a clock just precisely set in the previous minutes. Good clocks for reference include a computer with a recent manual NTP update, a GPS with a trustworthy time display, or a self-setting clock (the ones with the embedded LW receiver).
Get everything set up and aligned, and if you count the seconds in that final minute, then you'll notice the extra second at the appointed time. Your local clock(s) will suddenly be off by one second.
It's hugely entertaining... (<- LOL)
Typical crystal oscillators are accurate to about 1-10 seconds per day.
Most servers only update time via NTP a few times a day, and many only weekly, less often, or not at all.
The leap second is of the same order as the normal time skew which occurs on commodity hardware.
Nobody is suggesting you should allow the leap second to simply be added to the preceding second.
The proposition is that it is gradually adjusted over the subsequent hour or so, resulting in around 0.03% inaccuracy in duration (one second in 3,600) during the period of adjustment, a slight excess of transactions per wall-clock second, errors when comparing elapsed time to wall-clock time, and so forth.
Some twit put a big rock into orbit that goes round once per month, slowing down the planet. The rock speeds up, orbits higher, and the extra circumference means it takes more time to go around the Earth. Eventually, each day will be a moonth long. The temporary solution is migration towards the poles. By moving closer to the axis of rotation you will make the Earth spin faster. If you take your really big oil tanker from the equator to the north pole you will speed up the Earth by about one second every ten million years.
Time goes forward. Adding a leap second makes it go forward a bit faster, for a short while.
Now, subtracting a leap second must be more dangerous - as it makes time go backwards.
Having looked, I can find no record of humans ever having subtracted a leap second. So it is something we (the BIH) have no experience of handling.
Worriers should worry an awful lot about this. Time might stop and not restart. With time stopped, we might all live forever. The world might end, or we might all spend forever waiting for it to end - soon!
NS "Adding a leap second makes it go forward a bit faster..."
A Leap Second of the sort planned is similar to a pause. Like a 'Built-In Hold' during a rocket launch count down. Or like the 29th of February.
So what is it that you believe goes "forward a bit faster"?
If you're referring to our clocks, they're effectively pausing for one second. That's the opposite of both going "forward" and going "a bit faster".
If you're referring to Newton's pre-Einstein Absolute Time, well it doesn't exist. And even if you thought it did exist, then you'll not be thinking that we're making it do anything (what with being Absolute Time and all).
I suspect you're directionally confused by the word 'Leap' in Leap Second. It's a misnomer. It should be called a 'Pause Second' for these extra seconds. But they were following the misnomer convention of Leap Year, which is also the exact opposite of a 'Leap', as we pause the Feb/Mar transition for an extra day.
What value does this have? Really, none. Everyone is already in sync even if it is off by a second or two or three. Keeping time in sync with aliens from another planet provides no value. (If there are people it does cause an issue for, then change your clocks; don't fuck with the rest of the world's.)
(Someone who was burned by the 2012 fuckup; at least I didn't run the airline that went down, and of course NTP did nothing to address the issue.)
Quite a lot of value. Remember your history:
"The October Revolution (Russian: Октя́брьская револю́ция, tr. Oktjabrjskaja revoljucija; IPA: [ɐkˈtʲabrʲskəjə rʲɪvɐˈlʲutsɨjə]), officially known in the Soviet literature as the Great October Socialist Revolution (Russian: Вели́кая Октя́брьская социалисти́ческая револю́ция, tr. Velikaja Oktjabrjskaja socialističeskaja revoljucija), and commonly referred to as Red October, the October Uprising or the Bolshevik Revolution, was a seizure of state power instrumental in the larger Russian Revolution of 1917. It took place with an armed insurrection in Petrograd traditionally dated to 25 October 1917 (by the Julian or Old Style calendar, which corresponds to 7 November 1917 in the Gregorian or New Style calendar)."
"The Gregorian Calendar was first introduced by Pope Gregory XIII - which is how the calendar got its name. This calendar has been implemented by several countries because the Julian calendar assumes a full year is 365.25 days whereas it is actually 11 minutes less. So, the Julian calendar many countries felt wasn't a true year so they made the change.
The Gregorian calendar was able to make up for this 11 minute difference by not making years divisible by 100 leap years. This means that the year 2100, for example, wouldn't be a leap year, whereas in the Julian calendar it would be."
Time matters because there is inherent and ascribed importance to the point in a cycle at which an event occurs. Count the number of seconds from the beginning of time if you will, but rejoice in its calibrated relevance to our everyday lives.
Is midnight UTC on 31st December
If only because here in Enn Zedd it's lunchtime on New Year's Day and there's hardly any work being done so if things do go TITSUP, mostly nobody will notice. On account of being on summer holiday and only being concerned about cold beer and hot BBQs - neither of which will mind too much if there's an NTP hiccup
The last time they changed it on 30th June which was potentially a real PITA as lunchtime (NZ) 1st July is just an ordinary busy work time.
Biting the hand that feeds IT © 1998–2020