Unix time
Seems pretty clear you should never adjust Unix time, but rather adjust the time zone data used to turn it into yyyy-mm-dd hh:mm:ss.
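For what it's worth, a minimal sketch of the point on a POSIX box (the zone names are just examples): the seconds count is never touched, only the rules used to render it.

    /* the time_t value is never adjusted; only the zone rules
       used to render it change */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    int main(void) {
        time_t now = time(NULL);   /* seconds since 1970-01-01 00:00 UTC */
        char buf[64];

        setenv("TZ", "Europe/London", 1);
        tzset();
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S %Z", localtime(&now));
        printf("%s\n", buf);

        setenv("TZ", "Australia/Sydney", 1);   /* change the rules, not the count */
        tzset();
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S %Z", localtime(&now));
        printf("%s\n", buf);
        return 0;
    }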
The measurement and regulation of time could start to change this week if an ITU meeting in Geneva, Switzerland, gives the nod. International Telecommunication Union members are discussing whether Co-ordinated Universal Time (UTC) should be set using a system that does not factor in the Earth's imperfect spin – which …
The issue is what happens if the leap second adjustment is made during some critical operation and whether different equipment remains synchronised during the time update. Space missions typically pick some fixed time datum (e.g. launch) and measure time relative to that datum, so such adjustments are largely irrelevant.
Oh really?
But the devil is in the detail. Many Unix machines used to have their hardware clocks set by counting the elapsed seconds since, I think, 1st Jan 1970! Then the software clocks were synced to atomic clocks using NTP or the like. I'm not sure that this still holds, but nevertheless this is important because time must be tracked even when the kernel is not running.
Not sure that I agree that they can easily cope with leap seconds. That has to be only a maybe.
So if we go with the atomic time & ignore sunrise, does this mean that at some point we'll end up with '12 midnight' some time in the middle of the afternoon? Or maybe starting work at '9am' - two hours after sunset.
IMO, despite the difficulties behind a computer fix, fixing people's perception that 9am should be in 'the morning' and midnight should be 'at night' is going to be much much harder.
The time under discussion here is UTC (or GMT) rather than local time. Considering that where I live (Australia), the sun generally rises somewhere between 7:30 PM and 9:00 PM UTC, for the majority of the planet UTC doesn't correspond to sunrise/sunset anyway.
So having UTC refer to a fixed number of seconds rather than the wobbly rock we're sitting on does make sense. Although calculating local time might become a bit more fiddly than just adding or subtracting X hours from UTC, the difference would be small enough for the next few thousand years that for most practical purposes it would be insignificant.
Consider this: In the past, noon was defined as when the sun was at its zenith. That failed when we started having transportation fast enough, and clocks accurate enough, to allow detecting that noon here wasn't noon there.
So then we moved to time zones, and accepted that noon didn't always mean "highest sun".
Now, between electric light and world-wide communications, time being tied to the fact we live on a round ball is getting unwieldy - my co-workers in the UK and I have to deal with a six-hour difference on a daily basis, and one of my co-workers in California just gets up 2 hours early to avoid the issue.
So what if TAI is minutes or more off what the earth is doing? Very few people live where noon in their time zone is solar zenith, so we are already dealing with this. Just get over it.
(/me pulls pin...) And let's just skip Daylight Savings - pick a time offset and stay with it (/me throws....)
Daylight savings was invented by people who thought you could take time off the start of the day, tag it on the end and have a longer day. IT IS AN ABOMINATION UNTO NUGGAN.
However, leap seconds and suchlike are a necessary evil; otherwise you end up with the calendar year slowly drifting through the sidereal year. Not adding a leap second will very, very, very slowly cause havoc with calendars and lead to some future generation having to reset the calendar to make it match the sidereal year again, as with the switch from the Julian to the Gregorian calendar. Our descendants in the year 28822 will rue the day we stopped adding leap seconds.
The trouble with your proposal is that it makes it very difficult to accurately work out the time between two dates. The answer would necessarily be ambiguous! For example, the answer to the question 'how many seconds were there in 2011?' would depend on whether you're taking the current length of a second as the basis, or the length of a second at the very end of 2011, or the mean second length during 2011, and so on.
The REAL answer to the problem is to get the operating system boys to sort out time properly. Pretty much every OS, programming library and application out there has always completely ignored leap seconds purely because the original programmers were too lazy to find out what UTC actually is, which is crazy considering UTC was defined decades ago back in the early days of computing.
The only people who have actually got it right, so far as I know, are the astronomers; not surprising, as they actually care about accurate time differences over long periods of time. The IAU's SOFA source code library has all the routines needed to accurately convert between UTC, TAI, etc., taking proper account of leap seconds.
The only disadvantage of SOFA is that it needs a static table of leap second data manually updated (and your code recompiled) every time there is a new leap second (they're not predictable in advance). It uses this table of all the leap seconds there have ever been when converting between TAI and UTC. In this day and age it should be trivial for something like NTP to communicate that table to OSes automatically. It would take some work to update apps to use something like SOFA instead of the inaccurate libraries that are used in the mainstream today, but it would completely solve the problem for ever.
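To make the idea concrete, here's a rough sketch of how such a static table works - this is just the shape of the thing, not the actual SOFA API, and the table is truncated to the most recent entries:

    #include <stdio.h>

    /* TAI-UTC in seconds, effective from the given date; a real table
       runs all the way back to 1972, when the offset was 10 s */
    struct leap { int year, month; int tai_utc; };

    static const struct leap table[] = {
        {1999, 1, 32},
        {2006, 1, 33},
        {2009, 1, 34},   /* most recent leap second as of early 2012 */
    };

    int tai_minus_utc(int year, int month) {
        int d = 10;      /* value at UTC's 1972 starting point */
        for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
            if (year > table[i].year ||
                (year == table[i].year && month >= table[i].month))
                d = table[i].tai_utc;
        return d;
    }

    int main(void) {
        printf("TAI-UTC in Feb 2012: %d s\n", tai_minus_utc(2012, 2));
        return 0;
    }

Every new leap second means a new row, which is exactly why the table has to be distributed - recompiled in SOFA's case, or pushed over NTP as suggested above.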
Changing OSes and software is probably a whole lot easier than changing all the laws, working practices, train timetables, etc. etc. when humanity finally gets fed up with 0900 being increasingly early in the solar day. My great-great-...-great-grandchildren don't want to be contractually bound to turn up to work at 0900 if that's in the middle of the night.
I reckon that this is something that Linus Torvalds could single-handedly solve. He can change the Linux kernel and has enough influence over NTP and glibc to make it happen. If Linux gets it right, everyone else might follow.
Getting it right in the OSes would make a tremendous difference to software programmers who have to worry about time. For example, how many times have Apple failed to get their iPhone OS alarm clock to actually work as an alarm clock? How many people with electronic calendars have been frustrated by the inability to properly deal with daylight savings and time zones?
Spoken like someone who is completely unfamiliar with the differences between sidereal, solar, and atomic time; the origins of time keeping; the development of naval navigation; or even the importance of knowing when to plant the crops, which really, is pretty basic and critically important when you get right down to it.
Well, the current system has to deal with the odd 'quarter of a day' error. So why not do 'Gregorian calendar with this year's fudge factor' and add that factor to the seconds:
1 civil second = 1 atomic second + (or minus) 0.0000000072, or whatever.
That takes care of all the inconvenient dicking around and would be smoother for all concerned. As a rider to this policy, I would suggest a law compelling every single farmer to own a fucking torch (with a public emergency fund for really skint farmers), and then we can stop all this summertime/wintertime crap.
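For scale (assuming, for the sake of argument, one leap second absorbed per year): smearing it evenly across the year stretches each civil second by

    1 s / (365 x 86,400 s) = 1 / 31,536,000 ~ 3.2 x 10^-8

i.e. about 32 nanoseconds per second, so a fudge factor of roughly that order.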
This argument is rubbish. I'm an ex-pat living in Sweden, and where I live, in the winter at its worst the sun doesn't rise until 09:30 and then sets at 14:30. Builders don't stop work here just because it's winter and it's dark (they're building a new 18-storey apartment block across from where I work and they've been working all winter), and there are no significant increases in traffic accidents just because it's dark (there are more due to the onset of winter weather, but not because of the dark). As I said, this argument is rubbish.
The repetition is good, but you need to work harder on multiple capital letters, and maybe add some excess punctuation. But good marks for completely missing the key point: I was relating accident rate to increased traffic density, not hours of darkness. I'm quite sure you're right that the Swedish outdoor trades adapt well to winter darkness, but I'm less sure that it's a useful comparison to southern English ones.
'A Method For Deterministic Time Synchronisation And Co-Ordination In Fixed And Moving Reference Frames For Portable Electronic Devices With Or Without Roundy Corner Bits'
I think the ITU are knackered whatever they do. I imagine Apple have already patented something like the above through the *cough* USPTO.
What I'd like to see is a good shake up of the entire calendar, starting with putting New Years Day where it (roughly) should be... December 21st.
@moiety: There's already such a fund. It's called the Common Agricultural Policy. France nicks loads of it. It keeps those inefficient, not-so-hardworking rural French farmers, well, not very busy but quids in ;)
I think New Year used to start at the beginning of Spring or the Vernal Equinox, as new growth for the New Year started. January and February were added as sort of "leap months".
As for twice per year juggling with the clock or moving UK to GMT + 2 or whatever, why? Start times for work and school are pure conventions. In Switzerland, work tends to start at around 0800, lunch tends to be at about 1200. As somebody wrote, farmers are ruled by real time as indicated by the sun. Just start the school/working day at the naturally appropriate time for the season.
I never can understand why travelling to work in the dark, when one is tired and rushing, should be any safer than going home in the dark, when presumably most people are awake as they have no intention of going to sleep as soon as they get home. At school in England, in the days when we had daily sport, in Winter we simply adjusted our timetable to have sport in the afternoon and lessons began at about 16.30 i.e. we made use of the daylight without adjusting the clocks. Surely, that is the proper way if we are not mindless machines.
Sad that anyone wants our clocks regulated by some means that is unrelated to the rhythm of the physical world in which we live. As it was, the UK did not adjust from an artificial to an almost natural calendar until 1752, by which time the eleven-day disparity caused a lot of upset, even riots. Now we want to return to a dislocated, artificial (if perfect) system that will need a more awkward adjustment at some unspecified time, in some very roughly specified future society.
Time measurement is just a tool. Tools should fit the normal user in his normal environment, not the other way around.
To clarify: where it should be - to me. I wasn't actually trying to draw a reference to bygone days.
IMO the most 'logical' day for New Year's Day is when the sun is at (or one day past) its lowest altitude above the horizon, and definitely not when the sun is at one of the equinoctial points... But then I also see sidereal as more intuitive and natural than, say, solar :)
"Time measurement is just a tool. Tools should fit the normal user in his normal environment, not the other way around."
Ah'ha. HCT? Human Circadian Time. That may put me at odds with the bloke next door who is on permanent nights. He's on HCT +12. (I jest of course).
But whether it was equinox or solstice seems to depend on the culture. Early months also tended to be exactly 30 days, but this led to other obvious problems. We will always have to deal with occasionally straightening out a mess, because sidereal /= solar /= atomic, and each has its legitimate use and purpose. Although it does seem to me that for most practical purposes these days we would be better off holding off on leap seconds until there is a leap minute, and then making one large shift that everyone is actually prepared for, rather than more frequent shifts for which we are unprepared and which are prone to truly bollix things up.
A lot of the history of time measurement, especially dates, was developed because of all the problems caused by calendars that didn't synch with terrestrial rotation. Now it's proposed to reverse this!
If, by the year 37whatsit, our time sync systems are not dealing with time far, far better than they do now, I think our descendants will need shooting. Still, maybe by then we'll be off this planet and terran time will be irrelevant. Stardate, anyone? But sufficient unto the day and all that.
Whilst I totally respect his academic credentials, I'm not sure that "abandoning leap seconds would break sundials" or claiming ">5000 years of human practice" - neither of which could claim any degree of accuracy beyond +/- several minutes, at least for most of the 5000 years - are the strongest points he could have made ...
So long as humans want to go to sleep sometime after dark and wake up sometime around about when the sun comes up, we will need a time scale that is aligned with the sun.
We either stop caring about 0800-ish being the time we wake up and go to work, or we arrange matters so that 0800-ish is when the sun comes up. Trying to coordinate the former across the world without causing a lot of havoc is going to be difficult, because actually there is so much in our lives that is based on clock time lining up with solar time.
If the ITU does abandon leap seconds, the whole world would occasionally have to re-align every time-dependent aspect of our lives (timetables, contracts, laws, telephone systems, etc. etc.) to keep it in step with the fact that humans live solar lives. Changing all that in one go and getting it right sounds a lot harder than dealing with a leap second every now and then.
so things that haven't been designed and/or tested properly to work with leap seconds use the time standard which has leap seconds.
well, duh.
those things should use TAI, the time standard without leap seconds.
the changeover may be problematic (adding in the 34-second TAI-UTC offset accumulated so far in one lump), but it only needs to be done once.
personally i still can't see the point in daylight saving time. fucking do-gooders. I don't wish to get up an hour earlier every day for the whole summer and autumn.
At the present moment all time zones are defined as UTC plus or minus an offset.
If UTC abandons its synchronisation with the Earth's spin, then by 3752 it will be 2 hours ahead, which means all the time zones will be incorrect. So in December, for example, it will still be dark at 10am in the UK.
An alternative is that UTC doesn't add leap seconds, but each and every time zone subtracts a second. So on 30th June British Summer Time (BST) changes from UTC + 1 hour to UTC + 59 minutes 59 seconds. This would have to happen simultaneously in every single time zone, with the co-operation of every single country; otherwise when dealing with NY, for example, you would have to adjust your time by 5 hours and a second, or by 4 hours 59 minutes 59 seconds, depending upon who didn't change their time zone.
Personally I think it's a lot simpler just to add a leap second to UTC every now and again.
That would be the devil in the details, then. AFAIK, predicting the need for leap seconds is not practical if you want UTC and solar time to be precisely aligned. The Earth wobbles and no-one is quite sure in advance how much it is going to shake next year.
However ... over the very long term it is certainly possible (which is why people are making predictions for the next millennium or two) so a reasonable compromise would appear to be: "Legislate for some Gregorian-style fudge that will work for the next thousand years or so and just accept that you might be several seconds out every now and again.".
This has the enormous benefit that someone who burns the current rules into a ROM for a non-networked, inaccessible device doesn't get caught short by an earthquake on the other side of the planet.
You mean it'll only be another thirteen hundred years before this becomes a problem? Clearly nothing will do but that we replace our entire timekeeping infrastructure within the next year!
Want a hand grenade? Here's one: One wonders whether somebody's press release generator should've been switched out of climate-change mode before the info on the ITU debate was fed into the input hopper.
Strikes me the solution is pretty obvious - we need two "times" and one "interval" (the second):
a) a 'scientific' time that is rigorously defined and that increments at the standard interval - the second - forever, and does not have leap seconds - let's call this "epoch time", and
b) a 'practical' time that is aligned to the scientific standard, i.e. it uses the same interval (the second) but is adjusted via a local "offset" which provides the local time that we see - the Yanks call this "wall clock time".
Leap seconds are applied to "wall clock time" via the local offset as required to keep the time right (within 0.9 seconds), such that sunrise and sunset work and people's watches work.
Important systems such as international telecommunications, computer networks and scientific experiments use "epoch time", and simple humans use "wall clock time".
GPS already does this with the difference between GPS time (counted continuously since its epoch, 6th Jan 1980) and UTC, via its "UTC offset". We can reuse this idea - all we need to do is take the existing Unix time_t epoch time, extend it to 64 bits (a uint64_t), synchronise it to the 300+ atomic clocks in the world, and call this the international standard.
The trick here is to have one internal standard that just keeps counting without interruption, and a local representation which is adjusted on use/on display, i.e. on output, without changing the underlying master source.
unix does it already... can't be hard...
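Something like this, purely illustrative - made-up types and numbers, just the shape of the idea:

    #include <stdint.h>
    #include <stdio.h>

    typedef uint64_t epoch_t;     /* seconds since the agreed epoch, never adjusted */

    struct wall_offset {
        int32_t tz_seconds;       /* e.g. +3600 for BST */
        int32_t leap_seconds;     /* accumulated leap seconds, e.g. 34 */
    };

    /* civil time = uninterrupted count, plus zone offset, minus the
       leap seconds civil time has absorbed; applied only on output */
    epoch_t wall_clock(epoch_t t, const struct wall_offset *o) {
        return t + o->tz_seconds - o->leap_seconds;
    }

    int main(void) {
        struct wall_offset london = { 0, 34 };   /* winter 2012-ish values */
        epoch_t t = 1327968000;                  /* some arbitrary count */
        printf("wall seconds: %llu\n",
               (unsigned long long)wall_clock(t, &london));
        return 0;
    }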
Mike
Your "scientific" time exists already, and is called TAI.
ISTM the *real* problem is that the ITU doesn't have a proper understanding of computers. Of course, it's obvious to us reading the Reg that anything that needs an unvarying timebase, such as financial transactions, should use TAI, and ordinary people wanting a "civil time" should use something simple that roughly corresponds to the sun's position in the sky. There would be no great technological barrier to that happening, and I can't believe the ITU quite understands that.
The underlying problem is that the Earth's speed of rotation is slowing down. At some point we will have a 25-hour day if you use the atomic-clock second, but a day will remain 24 hours, with 60 minutes per hour and 60 seconds per minute. You can't use an offset to adjust for that. Besides which, the whole concept of time as something which is fixed is much more akin to the now-dead concept of the ether than most people realise.
Clearly a statement from someone who's never heard of the Egyptian civil calendar (365 days and never mind the astronomical discrepancy because it was easier for bookkeeping), the Mayan calendars (3 of them, none synchronized with the Earth's orbit), the Jewish calendar (Metonic cycle), the Islamic calendar (strictly lunar)...
1. Thermodynamically challenged High Frequency Traders. Mine's the one with the enthalpy in the pocket, and I want it all back.
2. Gregorian Calendar challenged Financiers who want all the Leap Days at the end of time where they belong if their formulas are to work.
Isn't there an empty planet around somewhere ? They'll need cable TV, of course; we're not barbarians.
If the Boffins solve the existential problem of Mondays, please get back to me.
People here absolutely miss the big joke in all this, and it is astonishing once a reader gets the big picture with the smallest of effort. There are 1461 days in 4 years, and this matches up with 1461 rotations in 4 circuits of the Earth around the Sun. On Feb 29th there will be another sunrise and sunset due to the rotation of the Earth, and this day will close out 4 orbital circuits of the Earth that began on March 1st 2008. Simple enough, given that if you divide 1461 rotations by 4 orbits you arrive at 365 1/4 rotations for 1 orbital circuit.
You see, the dominant view is not the common-sense 365 1/4 rotations/days in a year/orbital circuit but a mind-numbing 366 1/4 rotations in a year, so how they are going to fit 1465 rotations into a 4-year period is beyond comprehension. And this is not some college 'flat earth society' joke, this is actually mainstream policy -
"The Earth spins on its axis about 366 and 1/4 times each year, but there are only 365 and 1/4 days per year." Goddard Space Flight Center
So when everyone here steps into intellectual oblivion after they make a late-17th-century error official policy (John Flamsteed came up with that conclusion in 1677), say goodbye to thousands of years of astronomy.
The error is in counting days as rotations, and ignoring the fact that if we orbit the Sun but have zero days (i.e. it's always noon at Greenwich), then we'll have rotated once, not zero times.
So if we've observed the rotation (by the sun apparently traversing the sky) 365.25 times in an orbit, then we've actually rotated one more time than that (of course, if we rotated the other way it would be one less, but we don't).
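The arithmetic bears this out: 365.25 solar days per orbit means 366.25 rotations relative to the stars, so one rotation (the sidereal day) takes

    365.25 / 366.25 x 86,400 s ~ 86,164 s ~ 23 h 56 min 4 s

which is precisely the figure astronomers quote: about four minutes short of the solar day.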
Every day the sun should rise at 6:00, and set at 18:00, everywhere in the world.
There is no need to pretend that time elapses at some fixed period.
For people in the northern climates, this means that winter day-time clock would tick-tock faster during the day than the night, in order to fit a consistent number of tick-tocks into a significantly shorter time.
We can leave it to the pointy heads to make sure our little devices co-ordinate properly so buses will be on time, and planes won't bump into one another.
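A toy version of the scheme, in case the pointy heads want a head start (the sunrise/daylight inputs are made up; the rule is just that daylight always spans exactly twelve clock hours):

    #include <stdio.h>

    /* real_s: real seconds elapsed since today's sunrise;
       day_len: real seconds of daylight today.
       Returns 'seasonal' seconds past 06:00, with daylight always
       mapped onto exactly 12 clock hours (43,200 s). */
    double seasonal_seconds(double real_s, double day_len) {
        return real_s * (43200.0 / day_len);
    }

    int main(void) {
        double winter_day = 8 * 3600.0;   /* 8 real hours of daylight */
        /* 4 real hours after sunrise in mid-winter: the clock says noon */
        printf("%.1f h past 06:00\n",
               seasonal_seconds(4 * 3600.0, winter_day) / 3600.0);
        return 0;
    }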
I prefer Pink Floyd to marching music....
ps: sherlock for his 7% solution
DST typically does little to save money, but it is a political thing so politicians look like they've done something. The spots in Indiana that started observing DST ended up using more electricity rather than less, due to using more electricity for air conditioning in their homes: if people weren't in their homes for an extra hour of daylight in the evening, they'd need less AC. For much of the northern US, "Daylight Wastage" (a term I invented for an hour subtracted from the evening) makes more sense, because AC consumes more electricity than light bulbs. In a good chunk of the southern US it doesn't matter: if you turn off the AC during the day to save power, it'll take most of the night to catch back up. Summertime 100F+ heat and sunshine make for easy solar and wind power (in West Texas), unlike the UK, which has problems getting consistent power from either; but it also means you have to keep the AC on all the time during the summer, sucking down a lot more power, which is a problem the UK doesn't have.
Regardless, artificially jacking with the clock for DST is nuts. If a government is going to do something that crazy, just legislate *suggested* "Standard" Business hours instead. Much less destructive than switching clocks around willy nilly.
Why not just split the difference? In 32719381274 years, when a leap hour is required, the adjustment of one hour will be pretty significant; but today, leap seconds every few years is a bit much. Why not just do leap minutes every few centuries, or however long it works out to be? Not as big a hit as a leap hour, but less of a nuisance than leap seconds. (So what if the bus is a minute late? It's already bloody late; an extra minute won't matter.)
*Nuke: Oh, so that's why the button on the atomic clock was labelled 'Do Not Push This Button' !!
Why is it that whenever this topic comes up in the press, some hack puts in "Oh noes, is b0rk GPS yes?"?
GPS and other satellites, spacecraft, space stations, probes and such, already live with the fact that their clocks run faster than ours, being further up or out of the gravity well. Synchronisation between spacecraft and ground stations already has to cater for clock drift due to relativistic effects and the odd leap second here and there to account for the Earth's rotation is a trivial adjustment by comparison.
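For the doubtful, the "clocks run faster up there" claim is easy to check on the back of an envelope with the standard weak-field approximation (the orbit radius etc. are textbook round numbers, not mission data):

    #include <stdio.h>

    int main(void) {
        const double GM = 3.986004418e14;  /* Earth's GM, m^3/s^2 */
        const double c  = 299792458.0;     /* speed of light, m/s */
        const double Re = 6.371e6;         /* mean Earth radius, m */
        const double r  = 2.656e7;         /* GPS orbit radius, ~26,560 km */

        /* gravitational blueshift: a clock higher in the well runs fast */
        double grav = GM / (c * c) * (1.0 / Re - 1.0 / r);
        /* special-relativistic dilation: a moving clock runs slow */
        double vel = (GM / r) / (2.0 * c * c);   /* circular orbit: v^2 = GM/r */

        printf("net rate: %+.1f microseconds/day\n",
               (grav - vel) * 86400.0 * 1e6);    /* ~ +38 us/day */
        return 0;
    }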
Thus while this one is often presented as the headline argument, it isn't. At all.
"We must not give up the >5,000 years old human practice of defining time through Earth's rotation because of unfounded worries of some air traffic control engineers,"
Right, so even if something is technically incorrect and wrong, as long as we have been doing it long enough we will say it's correct. Oh well, good news for religious people... another 3000 years and we'll have to agree there is a god.
I mean, there is no reason why the number stored on my computer needs to be easily translatable into an hh:mm:ss time. In fact, looking at the timezone databases, it isn't anyhow.
Not every system which needs time needs a wall clock. So why not just run those systems which don't on some other timescale like GPS? And whenever we have some human time interface, we can just adapt the time in a suitable way. In the meanwhile, we'll just add a "timescale" attribute to times, just like we already do for timezones.
This has been done in the past, when the railways came along. Before that, every place had its local solar time. Time zones are already a departure from the idea that time is kept by the position of the stars.
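A sketch of what that "timescale" attribute might look like (purely illustrative types, not any existing library): tag every timestamp with its scale, and refuse to mix them silently.

    #include <stdint.h>
    #include <stdio.h>

    enum timescale { TS_UTC, TS_TAI, TS_GPS };

    struct stamp {
        int64_t seconds;       /* count on the named scale */
        enum timescale scale;
    };

    /* differencing only makes sense on a uniform scale with no leap
       seconds, so insist both stamps share a non-UTC scale */
    int64_t elapsed(struct stamp a, struct stamp b) {
        if (a.scale == TS_UTC || a.scale != b.scale) {
            fprintf(stderr, "refusing to difference mixed/UTC stamps\n");
            return -1;
        }
        return b.seconds - a.seconds;
    }

    int main(void) {
        struct stamp t0 = { 1000, TS_TAI }, t1 = { 1010, TS_TAI };
        printf("%lld s\n", (long long)elapsed(t0, t1));
        return 0;
    }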
If Earth stood still, it would have mid-day, mid-night, sun-up and sun-down as 4 corners. Each rotation of earth has 4 mid-days, 4 mid-nights, 4 sun-ups and 4 sun-downs.
The sixteen(16) space times demonstrates cube proof of 4 full days simultaneously on earth within one (1) rotation. The academia created 1 day greenwich time is bastardly queer and dooms future youth and nature to a hell.
Ignorance of 4 day harmonic cubic nature indicts humans as unfit to live on earth
This is utterly stupid. I can't believe that scientists can't figure out a simple solution to the problem. But then, they never have been able to solve the world's problems, only create more of them. For those that can't deal with leap seconds, have a universal clock that is simply the number of seconds from a set point in time, rather than the awkward 24/60/60/1000 breakdown that we currently have. Then it can all be in decimal. It could be counted from the beginning of the millennium, the new year in 2001. We could then add leap seconds to our hearts' content, and programmers for scientific systems and in spaceflight could use a period of time called a kilosecond, which would equate to 16 2/3 minutes, and a megasecond, which would be about 11 1/2 days. A gigasecond would be 31.7 years. And then everyone would finally be happy.
The first is TheWife's monthly cycle. If you are married, you'll grok.
The second is the seasonal clock handily provided by the Solar Year & the Earth's axial tilt with respect to its orbit. It is totally out of my control, but I plant my fields & breed my critters by it, as humans have since time immemorial. Trying to change this is a fool's errand.
The third is the clock provided by the Master clock on my network, which syncs up to an atomic clock once per day (ntp.org works for most purposes ... I use something else), which all of my machines adhere to. This is for computer record keeping more than anything else.
Context is the key. There is no "SingleTimeStandard[tm]", and never will be. With the exception of TheWife's, of course ;-)
As a side-note, I don't wear a wristwatch ... and haven't in over a third of a century (since my HP-01, back in 1977, in fact). In my mind, they are completely pointless. Everywhere you look these days you can see something giving you a pretty good approximation of "local time". Humans living life to the second or minute (or even ten minutes!) is counter productive. Even when baking bread ... Relax, be patient, learn to make homebrewed beer :-)