Don't people test edge cases any more?
Text required so might as well be this pedantic note
- it's not "divisible by 4" as there are 365.2425 days in a year, not 365.25. So it's divisible by 4 unless divisible by 100 unless divisible by 400.
Today is February 29, an unusual day in that it is added to the common 28 in years that are multiples of four to keep the calendar in sync with the astronomical year. This kludge prevents our seasons from drifting out of whack, but it presents a problem for computers and software, which have to be programmed to account for the …
As implemented by numerous good libraries that handle time.
The real problem is that people assume they understand time and dates. It is THE most difficult subject, to say nothing of time zones and leap seconds. Any compiled list of falsehoods is necessarily incomplete and just the beginning. Or take a look at the days that were removed from the calendar.
Do you think year 0 (zero) exists?
It generally does, in that if you use a library that has existed and worked correctly in many countries for a lot of years, they probably considered time zones and leap days. It's usually not too hard to find something to help with time. Most programming language standard libraries and operating systems have that handled. Unless you have something they can't handle, the chances are that you will not benefit by either writing your own or trying to find someone else's library for the task.
If you're going to do so, perform the following basic tests:
1. Look through their documentation. If they mention oddities of time zones that they handle, they probably work. If they sound like students putting out something on GitHub, maybe not.
2. Check leap year information. Run this code, or your language's equivalent (a Python sketch follows this list): foreach (int i in [2000, 2100, 2200, 2300, 2400]) print(is_leap_year(i))
If you get true, false, false, false, true, that's a good sign. If you don't, don't use it.
3. Check what they did the last time some country decided to mess with daylight saving time for no reason. For example, you could see whether and how quickly they updated the time rules that Greenland changed in 2023. If they're using the typical sources of information, this could be automatic.
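For what it's worth, a minimal Python version of that second check; is_leap_year here is just a stand-in reference implementation, not any particular library's function:

def is_leap_year(year: int) -> bool:
    # Gregorian rule: every 4 years, except centuries, except every 400 years
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

for y in [2000, 2100, 2200, 2300, 2400]:
    print(y, is_leap_year(y))   # True, False, False, False, True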
I agree with you. A case I hit this year (not in my code) was using a library to do 5 digit date validation. The library allowed for 29FEB, but only if a year was included. As the year had not been determined at that point (it could be future, today, or past), validation failed.
Ostensibly the code was 29FEB-safe and the library validated it, yet it still failed.
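I don't know which library that was, but you can reproduce the same class of failure in plain Python: parse a day-and-month with no year and strptime quietly assumes 1900, which was not a leap year:

from datetime import datetime

print(datetime.strptime("28FEB", "%d%b"))   # 1900-02-28 00:00:00 - the default year is 1900
try:
    datetime.strptime("29FEB", "%d%b")
except ValueError as e:
    print(e)   # "day is out of range for month" - 1900 was not a leap year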
I was asked to check if we could handle this a couple of weeks ago.
Our system has been in production since 2006 so I'm pretty sure we've been through this before, but someone in the upper strata obviously had a bit of a panic over their morning coffee when they realised that four years ago 29/2 fell on the weekend, so it's been *8* years since we did this!
We store dates as Julian numbers, so really, there is absolutely no possibility of our core functions not working. They quite literally DO NOT KNOW what date it is, in the anthropomorphic sense. Anyone who encodes dates any other way is an idiot. YYYYMMDD is a display format (one of a great many), it's NOT a suitable internal representation. Sadly few developers understand the difference between these concepts.
The input validation functions do care, but I already fixed those to recognise 29/2 several years ago (along with the correct leap year function, the one that's good for 2100...). We inherited the date class from another team, they were superstars...
I had never heard of the Julian date concept before. Thanks for introducing me to the concept. I've got a programming project in mind that'll be able to make good use of this, I think. Don't worry, I'm not a professional programmer who's about to display his ignorance about time. I'm doing that here!
If your application mostly deals with date arithmetic then any representation that is an offset from a known epoch is what you want. Just so long as the range of the offset covers the needs of your problem domain. Julian numbers are probably overkill for most commercial applications but a little inadequate if you are simulating the formation of the solar system (I'm guessing you would use different units for that one though).
Stuff like number of days between two date points becomes trivially easy. But converting to an input or output representation that a human can understand is more expensive. Debugging can be a nuisance too. Some domains have complex rules for day counting e.g. number of business days between two points, we do that with a list of non-business days.
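A rough illustration in Python - date.toordinal() is a proleptic Gregorian day count rather than a true Julian Day Number, but the principle is identical:

from datetime import date

start, end = date(2024, 2, 1), date(2024, 3, 1)
print((end - start).days)                    # 29 - the leap day just comes out in the wash
print(end.toordinal() - start.toordinal())   # same answer via the plain day-count representation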
Martin Fowler's Analysis Patterns is a very good read on this and other subjects.
And define your requirements. Do you really need to be able to display any date? - because that's a lot of complexity that you probably don't really care about. You can save, probably, some months of work just by setting a starting date and specifying a single calendar.
We store dates as Julian numbers, so really, there is absolutely no possibility of our core functions not working. They quite literally DO NOT KNOW what date it is, in the anthropomorphic sense. Anyone who encodes dates any other way is an idiot.
Julian numbers are fine, but make sure your programs can handle day 366 in a leap year. That was the one that tripped up the Norwegian railways in (yup, here it comes) the year 2000, a leap year, on the last day of that year.
Well, the WWW can enlighten you on that one. From the first result from my search: "In 2011, Samoa switched time zones, skipping directly from Thursday to Friday."
It's not that time and date calculations are inherently complex or difficult; they're not. What's complex and difficult is accounting for the arbitrary changes introduced by powerful people and organizations, in the environment of human politics.
Time calculations are inherently complex and difficult.
Every object in the universe has its own proper time. If you do any kind of measurement based on physics (i.e. any measurement whatsoever) you must use proper time. It's the correct time for description of how physical processes go for the specific observer.
Different observers can disagree on which of two events occurred first (if the events are space-like separated).
The idea that there is some single time barely works at the scale of counties with our current time measurement precision.
The mess created by humans is just a big and ugly database of weird rules. But it is not fundamentally mind boggling – unlike anything coming from relativity.
But "divisible by 4" is entirely adequate for use. Even worked for 2000[1], when quite a lot of things didn't without change, courtesy of the "unless" qualifier.
As I have observed before, if anything I've written is still in use for something important come 2100, I would be flattered and very surprised, if only I were not dead.
[1] Rather amusingly, the one bit of leap year logic that had to be changed as part of the Y2K project was where some eejit had tried to be clever, but had been ignorant of the 400 year exception. KISS engineering wins every time.
"But "divisible by 4" is entirely adequate for use ... if anything I've written is still in use for something important come 2100, I would be ... very surprised"
Which is exactly the sort of cavalier attitude that caused the Y2K situation in the first place. (Different technical problem of course, but the same short-term thinking.)
"Which is exactly the sort of cavalier attitude"
Yes and no. Like the previous commenter I've also adopted the principle that the Y2100 problem simply isn't something I need to concern myself with in the code I write, because I'd be willing to bet my life on none of my code (at least none that I've ever written so far or am currently in the process of writing) still running without modification by then.
I mean, I'm most assuredly already going to be dead by then anyway, given that I've just entered my 5th decade on this planet, but the principle remains - I KNOW none of my code will still be used as-is by then, because all of the date-handling code I've written so far has been for embedded systems which, no matter how much longevity the customers might try to eke out of them, will themselves no longer still be operating by then.
And should any of my code still find itself being used by other engineers by then, and should it then cause them to run into Y2100 issues, more fool them for not paying attention to the clear comments in said code explaining the short-cuts/limitations/optimisations/etc taken...
So under certain circumstances (and embedded coding is an area where certain circumstances occur rather frequently, given the limitations of the hardware on which our code operates) it's really not cavalier to take such an attitude, it may well be an entirely pragmatic and reasonable approach to developing code that does what it needs to do without incurring unnecessary overheads which may impact on other aspects of the system.
Depending on what kind of embedded systems you’re building, that may not be true. Stuff like railways comes to mind. 75 years is old, but not out of the question. I believe parts of the NY subway signalling go back to the 1930s.
"And should any of my code still find itself being used by other engineers by then, and should it then cause them to run into Y2100 issues, more fool them for not paying attention to the clear comments in said code explaining the short-cuts/limitations/optimisations/etc taken..."
I worked on railway software that was still in production 30 years later. By then, they had lost the source code. Undeterred, they built a PDP-11 emulator for it (not that the non-standard pre-ANSI C would have compiled as was). It was finally retired when enough new railway stations had been built that a statically allocated array was now too small and it couldn’t load its data at startup! And yes, that array bound would have been #defined, but without the source code...
Someone trying to restore a car or electronic toy in 2124 also won’t have the source code.
"... has been for embedded systems which, no matter how much longevity the customers might try to eke out of them, will themselves no longer still be operating by then."
Older hardware systems can live nearly-forever in compatibility modes and emulation. I choked on my own spit when I started reading the instruction set of the then-new PowerPC CPU and saw something from the IBM System/360 series: "Branch and Link Register?!!"
(Mine's the one with an RPi box in the pocket running an emulator of [Dead or Alive on Windows CE on a Hitachi SH-4 CPU].)
You can't assume that dates like 2100 will only be of concern to programs that deal with "now", though. Anyone born today has an excellent chance of still being alive in 2100, so any programs that are expected to deal with life assurance or pensions for them may well have to handle dates up to 2120 and beyond. You don't want pension forecast reports to fail just because someone didn't care whether 2100 was a leap year or not.
A previous employer of mine had massive budgets for software, but not hardware (software being the "in" thing). So we had to be really careful with hardware resources (memory, disk space, CPU performance). In the real-time* situation we were coding, performing one IF THEN test rather than two or more allowed critical time dependent operations to be performed within strict time constraints. I suggested collapsing two tests down into one by using a 'magic number' of 31/12/1999 to compare against, on the understanding that the hardware would be upgraded prior to the millennium. This was accepted by management and was well documented (backside covered? Check).
Another problem with many applications is that they use different concepts of time. We used 2-second time. This was to fit into whatever data-width we were lumbered with at the time. I forget where the baseline was set, but we regularly converted to/from 2-second time into normal time to interface with reality. So we had to concoct our own library to do these things.
*These being typical applications that could have caused big Y2K problems.
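Something along these lines, presumably - the epoch below is invented, since the real baseline is long forgotten:

from datetime import datetime, timedelta, timezone

EPOCH = datetime(1980, 1, 1, tzinfo=timezone.utc)    # hypothetical baseline

def from_two_second_time(ticks: int) -> datetime:
    return EPOCH + timedelta(seconds=2 * ticks)       # each tick is two seconds

def to_two_second_time(when: datetime) -> int:
    return int((when - EPOCH).total_seconds()) // 2

print(from_two_second_time(to_two_second_time(datetime(2024, 2, 29, 12, 0, tzinfo=timezone.utc))))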
Not quite.
It's a leap year when the year is divisible by 4, unless the year is divisible by 100 and not divisible by 1000. I think I said that properly.
2024 is a leap year.
2000 was a leap year.
1900 was not a leap year.
Plus, years that are divisible by 25000 are not leap years.
I'm not sure where you got this, but it is not correct. For example, the year 2400 is a leap year according to the Gregorian calendar, but using your calculations, it would not be one. Your rules also give a different average year length: 365.24096 days. The Gregorian calendar's cycle is 400 years in length and repeats after that. There is no rule based on 1000 or 25000 years.
There is only one unless in their rule:
"It's a leap year when the year is divisible by 4, unless the year is divisible by 100 and not divisible by 1000."
2400 is divisible by 4, divisible by 100, not divisible by 1000. As I structure their statement, that does not allow 2400 to be a leap year. I don't see how you find a second unless, nor how you can make 2400 a leap year and 2300 not one using divisors of 4, 100, 1000, and 25000 singly. It's also just incorrect based on Gregorian rules. There is no rule for divisors of 1000 or 25000 and there is one for 400.
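Easy enough to check by brute force in Python; counting leap years over one full cycle of each rule set gives 365.2425 days for the Gregorian rules and the 365.24096 mentioned above for the 4/100/1000/25000 version:

def gregorian(y):
    return y % 4 == 0 and (y % 100 != 0 or y % 400 == 0)

def proposed(y):   # divisible by 4, unless by 100 and not by 1000; never if divisible by 25000
    return y % 4 == 0 and not (y % 100 == 0 and y % 1000 != 0) and y % 25000 != 0

print(365 + sum(gregorian(y) for y in range(1, 401)) / 400)       # 365.2425
print(365 + sum(proposed(y) for y in range(1, 25001)) / 25000)    # 365.24096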
Plus, years that are divisible by 25000 are not leap years.
Note that in 24999 years, the solar and astronomical calendar will have diverged by half a day, and the extra leap adjustment will suddenly make it wrong by half a day in the other direction. That won't help people: we mostly care about the cumulative effect over multiple 25000-year periods, and we don't even see any benefit at all until another 12500 years after the adjustment.
They've already effectively decided to stop using leap seconds, partly because of the predicted problems caused by skipping seconds. No reason to expect anyone to want to shift everything by one exceptional day in 25000 years, even if we are still using the same calendar.
Had a problem just this morning at the BP station. System refused my BP Visa card but no problems with my ELAN Visa card. Had me worried for a minute as we've had repeated problems over the years with credit card fraud and I never find out about it until card is refused. Never gave it a thought that it might be "leap day".
Not a leap year issue, but I had a classic y2k one only the other day. A spreadsheet at work required me to enter a future date for something. Being lazy, I entered "1/1/30" and it parsed it to 1st January 1930. A bit of experimentation revealed some logic in there that was something like "if YY<34 then year =19YY else year = 20YY".
Partly my fault for entering a 2 digit year, but I do think the interface should have returned "use a 4 digit year, you plonker!"
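Roughly this, presumably, in Python terms - the pivot of 34 is just what the experimentation suggested, not anything official:

def expand_two_digit_year(yy: int, pivot: int = 34) -> int:
    # below the pivot -> 19xx, otherwise 20xx, which is how "1/1/30" came out as 1930
    return 1900 + yy if yy < pivot else 2000 + yy

print(expand_two_digit_year(30))   # 1930
print(expand_two_digit_year(99))   # 2099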
Err, edge points are the *only* places you need to test. Just before the boundary, just on the boundary and just after the boundary. If you know the direction of the equality then you can drop one of those three.
In between the boundaries you follow the same code paths so testing random samples does nothing to improve code coverage.
The key of course is knowing where the boundaries lie. If you wrote it then you should know the answer to that. If not then a combination of white and black box testing may be what you want.
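A hypothetical example to make that concrete - only the tests hugging the boundary add any coverage:

def bulk_discount(qty: int) -> float:
    # hypothetical rule: 10% off for orders of 100 items or more
    return 0.10 if qty >= 100 else 0.0

assert bulk_discount(99) == 0.0    # just before the boundary
assert bulk_discount(100) == 0.10  # on the boundary
assert bulk_discount(101) == 0.10  # just after it (droppable once you know the >= direction)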
As I had to learn when recently auditing interest calculations.
But February does have 30 days!
That's the 30 day interest calculation method. Annual interest is divided by 12 to get monthly interest, monthly interest is divided by 30 to get daily interest, March 31 gets no interest, March 1 gets 2 or 3 days interest. Still sometimes used for contracts (typically with annual compounding, fixed amounts, and daily interest required for point-in-time valuation).
If you are doing (another example) minimum-monthly-balance, monthly compounding, it doesn't matter how many days are in the month, and the 30-day month assumption just gives you value for the days in the first fractional month of a contract.
That interest method (and others like it) were designed for people doing calculation by hand, but still exist today because they are in standard contracts.
Sometimes markets switch to a new standard contract, but often there is no real reason to do so, and it takes a very long time.
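For the curious, a simplified Python sketch of that 30-day (30/360) day count - real contracts pin down the end-of-month adjustments more precisely than this:

def days_30_360(y1, m1, d1, y2, m2, d2):
    # every month treated as 30 days; this is roughly the US 30/360 end-of-month adjustment
    d1 = min(d1, 30)
    if d1 == 30:
        d2 = min(d2, 30)
    return 360 * (y2 - y1) + 30 * (m2 - m1) + (d2 - d1)

print(days_30_360(2024, 2, 28, 2024, 3, 1))    # 3 - a phantom "Feb 30th" gets counted
print(days_30_360(2024, 3, 30, 2024, 3, 31))   # 0 - March 31st earns no interest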
This is why you use a date library for financial calculations.
Look on it as a regular test to find out which of those who entered the job market in the last four years should never have been sat in front of a keyboard, at least, not one connected to anything. Wasn't it 1988 when a lot of Sun systems crashed? And before it even got to February I think. And my then manager's multi-page "Is it Friday yet" function couldn't tell when it was Friday.
"Just one of those things" if the software isn't calibrated for the event, which to us is highly suggestive of human error.
Wonderful.
The Antikythera Mechanism had, as far as we can tell, built-in features to deal with the variability of the temporal cycle. It's all way above my head, but scholars have shown that the variability of the Moon's phases, intercalary adjustments and so on were factored into the design of the mechanism, which was, at our best guess, designed circa 200 BCE. See here
That's 2200 years ago so why hasn't the world learned to take things like leap years into account yet?
That's 2200 years ago so why hasn't the world learned to take things like leap years into account yet?
Well, we can thank in part the Millennial mindset that anything that happened before they were born is not important. Factor in the corollary that they can do anything better than the previous version (even if they don't have the slightest idea how to do it).
You know, something from 200 BCE just isn't shiny enough.
Yes, millennials can be blamed for that, as long as you're* willing to take the blame for everything that breaks in 2038. I'm willing to bet that that will be more things.
* I'm guessing that, because you said this, you're not a millennial and probably of the appropriate age to have been around when someone made the 32-bit signed time solution. You probably weren't the one to do it, but we're blaming generations for the actions of a member, so it's your fault.
Actually I was the one who filed a bug report to Micros~1 when they released that abomination laughingly referred to as "Quick C", complaining about how they buggered up the time_t and related structures. One of the problems was signed values; another was keying the beginning of the epoch at 1Jan1900 (which, because of yet another bug, was really 31Dec1899). To the best of my reading, nothing in the C standard (up through C99) requires the epoch to start at 1Jan1970, nor requires signed (or unsigned) time_t values of any magnitude. A time_t could be a 64-bit unsigned integer and be standards compliant (unless, of course, C standards beyond C99 changed that... but I doubt they would).
Bloody brilliant thing. For the Moon position, it had two gears beside each other, one at an angle with a slot, driven by a pin on the other gear. This way it would speed up and slow down as it went around, just like the Moon does as it seems to go round the Earth.
An exercise in basic programming and an exercise in BASIC programming.
And yet 2000 was a leap year, which meant that anyone who only partially understood the rules would get caught out. If you only knew the "every 4 years" part, you'd be fine. If you remembered the "except every 100 years" and forgot the "unless it's the 400 year mark", you'd get it wrong. One of those occasions where being mostly wrong was better than being mostly right...
As for 2100 - there's still a push from some to get rid of leap years/days, so we may not be working with the current calendar by then anyway. Which means we'll likely have other software bugs in date functions to deal with. The three certainties in computing: BGP errors, DNS failures and incorrect time/date functions.
Unless we alter the Earths orbit we are stuck with having to make periodic adjustments to keep the year aligned. No amount of changes to the calendar or the measurement of time will change that.
I suppose we could alter the rotation rate and make the days a bit longer and chop 1.25 of them off the year. Technically doable but it would mean bleeding off a lot of angular momentum.
And screwing up every human's sleep cycle. We're very hard coded for a 24-hour sleep cycle. Bad things happen when that changes too much. There are some people who have a circadian rhythm that's not aligned with that, but most of them are caused by total blindness where the light triggers used to form normal rhythms are unavailable. From online descriptions, the experience seems very unpleasant.
I’m not so sure. I recall some experiments, from the 1970s or earlier, when people were put in windowless (but otherwise comfortable, livable) environments for several weeks on end, with no cues as to when was night or day. Their natural body clocks seem to lengthen to a period of more like 26-27 hours.
I have not read the study, but I do wonder a few things about it. Specifically, how long they kept that up, because anyone who has stayed awake all night to do something knows that you can do something like that occasionally and be generally okay, but doing it too often has some really negative results. Also, the schedule is probably different depending on how much exertion or stress a person goes through, so people living in an experimental environment probably aren't doing the same amount of stuff as a typical person at a job, a student, or a parent.
I did it when I was a student (apart from the hiding from daylight part) and both my sons did it extensively during the Covid lockdowns.
We may not count as fully human but none of us had (or have) any trouble at all in adopting a diurnal cycle well over 24 hours.
《Their natural body clocks seem to lengthen to a period of more like 26-27 hours.》
One such experiment was deep underground - with no natural day night forcing.
Nuclear powered submarine would be another good environment in which to test this but in practice a normal 24 hour day would be maintained I imagine.
"Unless we alter the Earths orbit we are stuck with having to make periodic adjustments to keep the year aligned."
That's only because some of us care about it, usually for historical reasons. Other calendar systems don't worry so much about when the "new year" starts in relation to the Earth's position around the sun, or about a weird fixation on specific religious things happening on certain dates in relation to the seasons. After all, Easter is pretty variable in when it happens already :-)
There's no real reason why we should not just decide that a "year" is 365 days and allow the calendar to slowly change such that "new year" moves in terms of our orbital position. Does it matter if December is in Winter or Summer[1]?
[1] No matter which hemisphere you are in :-)
"Does it matter if December is in Winter or Summer[1]?"
It does to me. If, for example, we're tracking climate changes, we can ask a question like "how has the rainfall in December varied over the past hundred years" and get an answer. If December keeps moving between seasons, it's no longer going to work as accurately. In order to calculate it, you'd have to phrase the question like "how has the rainfall during the period from twenty days before the solstice to eleven days after the solstice" and do all the calculations. This isn't just relevant for months; we get the same effect if we calculate the rainfall between November 17th and January 3rd, assuming there's a reason to do it, because that or any other set of dates consistently refers to almost the same time in the solar year.
I admit to being one of those who doesn't understand why we benefit from changing the clocks for summer time and thinking that we could manage equally well without doing it. Many countries have managed that. But I know why there are leap years and we cannot get rid of them. Most societies two millennia ago understood that and had figured out some method of handling it, so it should be pretty obvious how necessary it is.
"But I know why there are leap years and we cannot get rid of them. Most societies two millennia ago understood that and had figured out some method of handling it, so it should be pretty obvious how necessary it is."
Not really. In a strictly agricultural society without an easily accessible technology telling you what time of year to plant, some form of regular calendar that didn't slowly get out of synch with the seasons across the generations was vital, likewise where certain organised religions needed to exert control, but today, not so much. Without leap years, it would take ~700 years for the June/December seasons to entirely switch. Most people would barely notice the difference across their natural lifespan :-)
It would take significantly less time for December to become October, which would be relatively inconvenient even if it wasn't a complete reversal. However, even if you don't care about that, we get into philosophical areas about why we even have a calendar. If we don't care about the consistency between our date counting and seasons, why have months? In fact, why track years? Just count everything in days so you can give an age when necessary, and ignore everything else. I think we still find that having a way to describe, consistently, times in relation to solar movement, to be something we want to continue doing. Doing that inaccurately is not any better than not doing it at all.
The planting calendar is one of the most important tools hung on the wall of the seed barn.
I have a stack of them going back to the late 1800s. When combined with the farm journals, I have a pretty good idea of what I need to plant, when, this Spring (which looks to be wetter than normal, but not outside historical data).
If you are far enough from the equator, the benefit is in getting up earlier because sunrise is earlier, so you get more daylight hours before you go to bed in relation to sunset.
Some misguided folk suggest going onto permanent daylight saving all year round. The downside to that is waking up earlier in winter when it’s still dark. Not so much fun then.
"Where I live its not fully daylight until after 9am in late Dec. Without DST adjustment that would be 10am. So some justification to the travel safety argument."
I think you have that backward. In winter, the clocks are still on standard time, so there is no adjustment and 9:00 is 9:00. I'm assuming winter from the late sunrise. Thus, without DST, it would be exactly the same. The difference comes in the summer only. Of course, you could institute winter time where the clocks go extra forward, making that 9:00 into 8:00, but only if you're willing to have a rather early sunset.
"you get more daylight hours before you go to bed"
Shifting the day doesn't make it longer: You lose on one end what you have gained on the other, so what's the point? You get to wake up one hour earlier, in total darkness, just to get some more daylight in the evening, when you're sitting in front of your TV? Not convincing.
That is the point. Typical city dwellers do not get up much before sunrise, but they do stay up long after sunset. Daylight saving simply caters to this characteristic to maximize our use of available daylight hours — hence the “saving” bit.
"Typical city dwellers do not get up much before sunrise, but they do stay up long after sunset."
Have you seen a city dweller lately? I mean, since electricity was invented? They all live indoors, all lights on, and who cares if there is still some amount of daylight outside. After all, there is no day or night in a city, it's just the sky illumination which turns off part of the day.
The argument of people staying up late and thus needing additional daylight was valid in the 19th and early 20th century, nowadays people live totally disconnected from the sun cycle.
In my mind, this is no problem, just get up earlier. This is especially true for farmers who can start their day as early as they want to because their fields don't register for appointments on their calendar. The clock doesn't need to be changed for that to happen when they could just set their alarm back an hour some time in the spring when they decide the sunrise justifies it, or even better, by ten minutes six times throughout a month so there's less of a jolt to the schedule. I don't need you to get up earlier in order for me to do so.
Alarm? The cows queuing up outside the milking parlor let us know the day is starting. (Some people swear by roosters, personally I swear at them ... so there are none on the Ranch.)
The clock has no bearing on day-to-day life around here. It's all tied to the sun and weather. No amount of the Government dicking around with what the clock says will ever change that. Even the Vet, Farrier and various delivery folks use time no more precise than morning, noonish, afternoon or evening.
The "solution" I keep banging on about is to do a one-off move of the clocks by half an hour in spring or autumn, then leave them forever like that. Will work for anywhere there is Daylight Saving. One reason why it might not be liked as a solution is that GMT will then only exist as a virtual reference point for those of us in Blighty. Another tradition consigned to history.
Ok there will be a big upheaval initially, but in the long run... Perhaps the apocalypse is so near that it is not worth the effort.
Joh Bjelke-Petersen more than most with gems like:
You can’t sit on a fence, a barbed wire fence at that, and have one ear to the ground.
The Canadian environmentalist, David Suzuki, named him the greatest deadhead in the universe at one stage. :)
Australian politicians generally aren't greatly endowed in the gray matter department, and Queensland ones don't even seem to have been waiting in the queue.
For the record Queensland doesn't have daylight saving but the other three eastern states (NSW,VIC,TAS) in the same timezone (UTC+10) do which can be a PITA.
When it mattered to me I just set the watch to UTC and add 12 sub 1 in Sydney and add 12 sub 2 in Brisbane. (remember it's modulo 12 :)
A consequence of buggering a couple of watches changing the time backwards and forward.
That was interesting. Stupid worked. Clever worked. The halfwits in between failed. In 2100 the halfwits will be fine, only stupid will fail.
And my prediction is we will get rid of utc. Only solar time to keep things working, and atomic time with 37 seconds offset for precision. Posix time will be solar time.
One of the best descriptions of the "unless 400" rule was found in a VAX/VMS Software Problem Report (SPR). I can no longer find the original, but a transcription is still available from Hewlett-Packard at [http://h41379.www4.hpe.com/openvms/products/year-2000/leap.html]. Highly recommended reading!
Exactly 40 years ago, I was working for an IT company which supplied software for pathology labs in hospitals - basically databases of results of blood tests, urine tests, and other unmentionable things.
Quality wasn't very good, but, having been there for only a year, I was trying to improve it.
Feb 29 rolled around, so I sat back and waited for the phone to ring. Nothing. No complaints. Had we got through it? No. The next day, March 1st, the complaints came in.
At some time in the previous four years, before I joined, someone had added a "delta check" facility to the software. This checked a patient's latest results against their previous results, and raised an alarm if they were changing too quickly.
Whoever programmed the delta check had forgotten about February having 29 days every four years. So when it compared new results against older ones, it calculated the time difference to be 24 hours less than it really was... and all hell broke loose as a large number of patients were flagged as needing attention.
Only good thing about it was that eight years later, New Scientist magazine published an article which I had written about it. I had realised that in 1992 they would have an issue actually dated 29th February, so I submitted an article recounting the above leap year woes, and then looking forward to 01 January 2000 - one of the early mentions of what became known as the Millennium Bug.
The next week they published a letter by one Arthur C Clarke, saying "interesting article, but I described this problem, and a solution, in my book......". I later saw several very similar letters from him, on other topics, so he must have had a standard template that he just added the appropriate details to before firing it off to the magazine.
Unisys 2200 systems log files use a double word integer (72 bit) to store the time. The starting date is December 31st 1899.
While in development the starting date was January 1st 1900. Luckily during development, a large customer wrote their own date conversion routine. They discovered the OS team didn't know 1900 wasn't a leap year.
Things were far enough along that the OS team didn't want to fix it "correctly" so they just changed the documented starting date.
Leap days are not going anywhere.
Almost nothing anyone says or does will change the fact that the number of days in a year is not an integer; that is to say, the Earth does not rotate around its own axis a whole number of times in the time it takes to complete a full turn around the Sun.
Almost nothing anyone says or does will change the fact that the number of days in a year is not an integer; that is to say, the Earth does not rotate around its own axis a whole number of times in the time it takes to complete a full turn around the Sun.
No, that's one of the other myths about timekeeping. A day is the (average) period from one noon until the next. For the Sun to return to its highest point the Earth has to rotate slightly more than once, since it has moved around its orbit in the interim. A sidereal day (a single rotation) is about 4 minutes shorter.
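The arithmetic, roughly - over a year the Earth makes one more rotation than there are solar days, so:

mean_solar_day = 86400.0                                 # seconds
sidereal_day = mean_solar_day * 365.2425 / 366.2425      # one extra rotation per year
print(sidereal_day)                                       # ~86164 s, i.e. about 23h 56m 4s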
> Earth does not rotate around its own axis a whole number of times in the time it takes to complete a full turn around the Sun.
So we either need to push the earth closer or further away to shorten (fixed 365) and lengthen (366) the number of days it takes to get around the sun
As the planet is getting warmer I suggest we push it out.
And there is another variable - the moon is gradually slowing the spin of the earth down just for good measure
It was tongue-in-cheek (notice the original icon).
But if you're happy to have sunrise at midday, and sunset at midnight, because it makes computing easier - why not have spring in July because it makes computing simpler?
We know, from countless ERP fiascos, that you fit your processes to the computers - not the other way around. Man's time keeping will have to match what the computers can manage. (Still, slightly, tongue-in-cheek)
This is not altogether unbelievable. Certain genes have actually had to be renamed; fittingly enough, because Microsoft Excel was incorrectly interpreting their names in cells of type "auto" as dates and storing them internally as such, which led to them being mis-displayed on systems configured for other localities.
why not have spring in July
This type of dating actually depends on governments. Here in Australia the government defines that the seasons start at the beginning of a month regardless of nature. Other places define the start of a season as the equinox or solstice. To me, nature defines when they start.
Removing leap seconds is actually reasonable.
There are two reasonable time systems: Solar time where the sun is at its highest above Greenwich exactly at noon every single day. Which can be measured within 25 microseconds. And atomic time, where each second lasts exactly one SI second, within a few nanoseconds every year. Currently 37 seconds ahead of solar time.
UTC is a perversion that tries to match up atomic time and solar time. Every time atomic time goes too far ahead of solar time they insert a leap second. So you get a mix of solar time before the decimal point and atomic time to the right of the decimal point. With solar time you get the time within 25 microseconds without problems. Atomic time gives you the time within nanoseconds. With UTC getting the time within a second is hard.
"There are two reasonable time systems"
There is a difference between "time", the dimension, and "what time is it?", clock/calendar time.
I run on three major clocks, and one minor one.
The first is TheWife's monthly cycle. If you are married to a woman, you'll grok.
The second is the seasonal clock handily provided by the Solar Year & the Earth's axial tilt with respect to its orbit. It is totally out of my control, but I plant my fields & breed my critters by it, as humans have since time immemorial. Trying to change this is a fool's errand.
The third is the clock provided by the Master clock on my network, which syncs up to an atomic clock once per day (ntp.org works for most purposes ... I use something else), which all of my machines adhere to. This is for computer record keeping more than anything else.
Context is key. There is no "SingleTimeStandard[tm]", and never will be. With the exception of The Wife's, of course.
The minor fourth clock is my dive watch. I wear it when appropriate. It's kinda important ... but it could be completely out of sync with the three major clocks in my life and it wouldn't matter at all.
As a side-note, I don't wear a wristwatch day-to-day ... and haven't in nearly half a century (since my HP-01, back in 1977). In my mind, they are completely pointless. Everywhere you look these days you can see something giving you a pretty good approximation of "local time". Humans living life to the second or minute (or even ten minutes!) is counter productive. Even when baking bread ten minutes either way won't kill you, or the loaf ... Relax, be patient, learn to make cheese, cure meat and brew beer.
"The first is TheWife's monthly cycle. If you are married to a woman, you'll grok."
I'm genuinely curious here: Do you not have your own monthly cycle?
Oestrogen affects the growth of facial and body hair. If you shave with a blade, you can actually feel the resistance offered by the hairs changing from one day to the next, coming down the handle; and it's easy enough to demonstrate experimentally that this is not just your razor blade getting blunt.
It's my personal contention that this phenomenon is where lycanthropy legends originated; since it's no great leap to imagine a man preferring to believe he is turning into a wolf during a certain phase of the moon (which, conveniently, tends to line up fairly well with the oestrogen cycle; there may well be an evolutionary explanation, as an external clue to your own fertility sounds like the sort of survival advantage that could be significant over thousands of generations, and the period of the cycle sounds like the sort of thing that would be hereditary), rather than accept his body undergoes the same cyclic changes as a woman's body.
Oh, and if the quartz clock in my kitchen is in anything like sync with local solar time, you may infer from that fact that the battery has recently been changed. It's used strictly for measuring time differences (especially since the timer on my microwave broke; so as long as there is any life left in the magnetron, it's wedged on full power and controlled with the switch on the wall socket), and whatever it may lose or gain over the duration of cooking a meal is unlikely to be significant. Even although it drifts a lot over the life of a battery, which may well include daylight saving changes. It's not really worth getting the clock down from the wall to adjust the hours and minutes hands, if I'm only paying mind to the seconds hand!
"Solar time where the sun is at its highest above Greenwich exactly at noon every single day."
But, due to the earth's orbit not being circular, the interval between this can vary by +/-O(15 mins). That's why we had to invent a mean day and mean time for practical day-to-day applications. There are just four days in the year when local apparent solar time and mean time are about aligned at Greenwich - as governed by the "equation of time". (Which, I guess, is an example of regularising our time keeping to make computing easier.)
"Every time atomic time goes too far ahead of solar time they insert a leap seconds. So you get a mix of solar time before the decimal point and atomic time to the right of the decimal point."
Yes, UTC is kept within 1 second of UT1. But UT1 is not a measure of solar time: it's a measure of the angle of the earth (the position of the equinox) remembering it takes about 4mins less than 24 hours for the earth to rotate through 360°. Let me quote the Explanatory Supplement to the Astronomical Almanac:
Although it would be possible to define a system of time measurement in terms of the hour angle of the sun, such a system could never be precisely related to sidereal time and could not, therefore, be determined by star transits.
I'm suspecting your 25μs is, anyway, the error in difference between UTC and UT1 (IERS Bulletin B, 12th Feb, gives the mean formal error UT1-UTC as 0.0253ms)
A consequence of this is the mean sun used in GMT isn't even the real mean sun - it's a fictitious one. (Hence the "about aligned" in my opening para.)
I also think it means your assertion that "you get a mix of solar time before the decimal point and atomic time to the right of the decimal point." is incorrect. I get what you're saying; it's not a bad mental model. But UTC is an approximate measure of what you call "solar time" (i.e. the earth's rotation) that is advancing in SI seconds as measured on the "surface" of the earth.
And the point is, as humans, we like that angle. IIRC, the historical record shows 1hr/1000 years is plausible. If the poles disappear quickly, it could get even worse. Nobody is going to accept sunrise at midday. (Pre-millennial projections overestimated the number of leap seconds we would have. So maybe it will sign-flip and go negative; I'm sure negative leap seconds will really piss everybody off even more.) But drop leap seconds, and we'll either introduce leap hours or time zone shifts, and end up with timezones that are 18hrs off "atomic time". So we'll be saving ourselves some work in the short term but creating a lot of pain further down the line.
The 25 microseconds is the precision with which we can determine “highest position of sun above Greenwich”. In that time the Earth at Greenwich moves about 1cm below the sun. What actually is measured is the position of some quasars in the middle of the night.
Earth rotation speed is off by 0 to 4 or so milliseconds every day; that determines how the difference between UTC and UT1 changes, which can be 100 times more in a day. Plus the 25 microseconds are not cumulative. They don’t add up. Every day we get the exact UT1 value with an error of +/- 25 microseconds.
So if you really want Earth rotation as the basis, ut1 gives you that within 25 microseconds, UTC is off up to 0.9 seconds.
Why is this a problem? I spent 35 years as a computer programmer and never once encountered a problem with February the 29th. One of the Amstrad CPC mags back in the 1980s published a BASIC function that could tell you the day of the week for any date going back hundreds of years. It could even be modified to handle the various leap days when the Gregorian calendar was adopted around the world (e.g. in Protestant countries 11 days were removed from September in 1752).
If anyone's code is confused then whoever wrote it made a right bog of it.
Back in those days, though, we had Analysts, Programmers, DBAs and other specialists.
Now half of development is by lowest cost script kiddies.
I've just been part of setting up a new office in India, interviewing supposedly senior developers. Watching their eyes flicking around when I asked questions, resulted in the same two responses from me:
"Are you reading from Post-It-Notes all over your wall?"
"Are you looking this up on the Internet?"
I even told one guy that ChatGPT gave a better summary than he did trying to read from Wikipedia!
My senior people in the new India office are of an equivalent standard to my first-year summer-holiday interns before they go back to Uni to learn more.
The bean counters just see wage costs, not the amount of teaching time we have to put into these people, fixing their bugs, etc. until they leave as soon as they think they have learned enough to get the next job.
For a cheaper site, I prefer Romania.
They speak better English
They are close to Central European Time
They are loyal to the company
But again the bean counters kyboshed that idea.
The English-speaking world changed in 1752. Catholic countries changed in 1582. In places where Orthodox Christian churches dominate, civil calendars changed at various other years. Greece, for instance, didn't switch until 1923. At least some Orthodox (e.g. Russian Orthodox) churches still use the Julian calendar as their liturgical calendar.
All of this has led to one of my favorite questions... If it is claimed that someone was born on 29 February 1900, can that be true? Lots of people have such a poor grasp of the Gregorian calendar that they don't see why there is a problem. Those that do understand the calendar ask, "Where were they born?"
Um..no. It was published in a magazine aimed at the young computer enthusiast. The Amstrad CPC was an 8-bit Z80 based Microcomputer sold in the early 80s.
As published the function worked according to the protestant version of the Gregorian calendar (it was a UK magazine after all) but the line of code that handled that calculation was followed by several similar lines commented out with information as to which sky fairy sect they related to.
Ah, the CPC 464. I have fond memories of that. Probably more playing "Harrier Attack" than of writing some incredibly elementary BASIC to make a little character man run across the screen.
I have less fond memories of waiting 17 minutes for a game to load from tape, from (very hazy) memory it was called "Sultan's Maze", only for it to be, well, crap.
If they only they had some prior warning this extra day that happens every four years was going to happen. The concept of leap year has only been around since the 3rd century B.C. and introduced into the Gregorian calendar in 1582 so it's understandable they haven't got the hang of it yet. I wish them luck in 4 years time when it unexpectedly happens again for the 112th time.
Can't be building pyramids in winter
Or, more accurately, can't be building pyramids during the Nile flood and the month or so afterwards..
(That was when the farmers planted their crops in the nice rich mud that the floods had left. There isn't a huge amount of seasonal variation in the weather there!)
Here I was thinking "I haven't worried about date problems since the 90s" And then it was only because we were prefixing ticket numbers with 2 digit years and going from 991234 to 001234 blew up our sorts on reporting. Lesson learned.
Then you made me look at my watch, out of curiosity. It says it's Thursday 3/1. I guess it's going to be 2/28 again today, and maybe tomorrow it will be back to normal.
My digital watch also says "TH 3/01". I'm waiting until it says "FR 3/02" before I fix it.
Getting leap days wrong is not a new thing. I remember on 2004-02-29 when many at work were complaining their cellphones said 03-01. I fixed my Nokia something-or-other by removing the battery, after putting it back and restarting it picked up the correct date from the network. Worked for my boss as well, different cellphone manufacturer.
My Casio G-shock went 28, 29 and is now showing 1st. I remember that the manual for my first Casio digital watch bought in the early 80s said it knew about leap years and would be good until something like 2100. At the time knowing what I did I couldn't see why it would have a problem beyond the year 2100 but I suppose Casio's marketing department felt that 'it'll calculate the date correctly forever' to be too broad a claim.
So..yeah. I don't understand why this year has suddenly sprung a problem on the IT world and if I think about it I become increasingly alarmed. I only retired in August so surely things can't have gone to shit that quickly?
OK, I can just about accept that a few things like petrol pumps might go wrong, due to some amateurs somewhere not thinking too carefully when hacking out code. But come on, a watch, which has the one main function of working with date and time............. smart that most definitely is not.
Well, my 1989-vintage Casio F-91W digital watch isn't aware of leap years. At midnight it jumped to the 1st of March. I had to put it back to 29th of February. Now let's see if it's smart enough to know what to do when the 29th ends.
> February 29
Also, it's the day many people don't get paid for working.
If you are paid weekly, then the extra day just forms part of the normal working week and employers pay their staff for the leap-day.
But if you are paid monthly, then you get the same amount for February 2024 as you got for February 2023 (assuming no intervening pay rise - an increasingly common complaint). Even though you work an extra day in 2024.
Where I work, we are paid a (mostly) fixed monthly amount that is calculated upon a specific yearly number of hours worked (1,607 I think; it's the number of non-holiday days multiplied by seven hours) divided by 12 (months).
As we clock in and clock out, the time worked is subtracted from that big scary number. The aim is to be +/- 14 hours by the end of the "year" (April 1st to March 31st).
So we don't get paid extra for the 29th, but we do get seven hours deducted so we're not losing.
If you're paid the same amount for every month, then you're getting more per hour or day in February, whether it has 28 or 29 days, because it doesn't have 30 or 31 days and every other month does. If it's something based on the number of days and they treat it as 28, then maybe you have a point, but I'd need to see that algorithm to understand if it applies or not.
@Pete 2 "But if you are paid monthly, then you get the same amount fof February 2024 as you got for February 2023 (assuming no intervening pay rise - an increasingly common complaint). Even though you work an extra day in 2024."
It's not the frequency you get paid, it's what you are being paid, salary (normally per annum) or wage (normally per hour). Waged you are paid for the actual hours worked, Salaried is a fixed amount for a year paid incrementally at regular intervals.
Waged does not have to be paid weekly; in the past I had hourly-paid jobs that were paid monthly and even fortnightly. Salaried is normally, in my experience, paid monthly but not always; one of my past employers paid 4-weekly, so 13 pay packets a year.
BTW the number of working days in a year can vary by 2 if your working week is Monday to Friday, 2024 has 2 more working days than 2023.
If you pay salaried employees biweekly, there are normally 26 pay periods in a year.
Every 4 or so leap years you end up with 27. In the 23 years I've worked for my employer this was never a problem until one bean counter realized that we would be paying salaried people 1 extra pay period. They devised a plan to reduce every pay amount so that the annual amount would match the salary. The backlash was deafening!
The simple answer is to pay people semi-monthly but our payroll software can't do that! It can only pay people hourly in periods divisible by 1 week. (i.e. weekly or biweekly). We are using one of the largest financial programs in the world.
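It's easy to see where the 27th period comes from if you just count paydays fourteen days apart - some years simply fit one more in:

from datetime import date, timedelta

def biweekly_paydays(first_payday: date) -> int:
    count, d = 0, first_payday
    while d.year == first_payday.year:
        count += 1
        d += timedelta(days=14)
    return count

print(biweekly_paydays(date(2021, 1, 1)))   # 27 - 1 Jan and 31 Dec 2021 are both paydays
print(biweekly_paydays(date(2023, 1, 6)))   # 26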
"just one of those things that caused payment software to have a glitch."
Just who the f~1 are they getting to write their software?
Unix Time starts counting from Jan 01 1970
Windows File Time starts counting from Jan 01 1601
----------------
Python Script to test if leap year is handled correctly ..
----------------
from datetime import datetime, timedelta
# Create a datetime object for February 28th, 2020 (a leap year)
date = datetime(2020, 2, 28)
# Add one day to the date
next_day = date + timedelta(days=1)
# Print the result - should be 2020-02-29, not 2020-03-01
print(next_day)
assert next_day == datetime(2020, 2, 29), "leap day not handled"
Just who the f~1 are they getting to write their software?
Exactly! When I did my compsci degree I remember one of the first year assignments was a date calculator and it had to manage leap years. Pay peanuts get monkeys...
I'm just gobsmacked as to how anyone could make this mistake!
"We have all that still to look forward to" in 2038.
Not all of us. I expect to have no more computer time or any other kind of problems by then.
"OK Boomer" you may well say, but sometimes it's reassuring not to have to worry about things. "Right. We can just cross that one off the list."
Not to mention that this must be a fairly new bit of software, or at least has had a major update in the last four years. I suspect the latter since it seems to be being used across a fairly wide range of customers. But either way, someone wrote or updated those date functions in the last four years, or they'd have already been hit by it and fixed it in 2020.
The BBC had an article about "leaplings" and many of them complained about not being able to enter their correct birth date into many forms
"When taking out a new phone contract, she failed the credit check because 29 February didn't appear as a date on the system. "
"the dropdown boxes on online forms will only list 28 days for February"
https://www.bbc.com/news/uk-68404617
Will send you birthday greetings each year for good customer relationships (i.e. local businesses like dentists, auto shops, restaurants)
Or maybe that's just a US thing. Anyway, I receive a few around my birthday each year. I have a friend born on Feb. 29th, and she once said she only gets those things every four years. Because her "birthday" doesn't come up on those systems the other three years!
One of my investment portfolios has a poor web interface, especially when searching for documents: date ranges are impossible to choose if they stretch outside two calendar months. To work around this (so they must be aware people need to) they offer some presets: last three months, last six months and last twelve months. This morning, I needed to use "last twelve months", and it dutifully set the range to 01/03/2023 to 29/02/2024. Great! It worked! So I click "OK" and get "Error - date ranges of more than 365 days are not allowed".
In my dark days in the mid-90s in College doing some obligatory web design course, I ran into this exact bug. Pushing random values into the National Express website, the user could select the 30th of February or the 31st of September. This would of course promptly crash the form.
I think the point being that these organisations just don't care if it works. Slapped together, untested, on the cheap by some outsourced outfit. Dick Jones and his ED-209 programme à la RoboCop is out in full force.
Considering that ideas presented in said film were supposed to be deeply satirical, that so much of it is now in fact reality...
One day, Starship Troopers will be considered a visionary film too, one suspects. For now, cult following will have to suffice!
In about 4 million years leap days will no longer be necessary.
Actually, I only came here to complain about you stripping me of my silver status.
Can anyone suggest an unpopular sport for me to try at the next Olympics? Obviously, if there are fewer competitors I'll have more chance of winning, given that there isn't much time left for me to train. Gold would be good, but I'd settle for Silver.
It's a shame that posting snarky comments on the Internet isn't an Olympic sport yet.
You need to post 100 times in the past year to get Bronze.
On top of that, you'll need 2,000 upvotes to get Silver.
You already have the 2,000 upvotes, so I suspect that if you "test" post 100 times to a junk thread over in the user forums, you'll be silver again. Well, 99 times ... you've already posted once. The user forums are kind of hard to find these days, so here ya go:
https://forums.theregister.com/section/user/
Welcome back. Have a beer :-)
It is a big issue on the day.
A temporary workaround is issued.
The permanent fix task goes in the work backlog...
Then descends the priority list - well, there are another 4 years to get it fixed!
In 4 years time, it fails again.
... and no one is left in the tech team or the business stream who even remembers.
Been there.
a huge amount of new code out there
It will be an error in the date validation code in an interface. By report it's a problem with the credit card interface. Date calculations within a program are pretty much a solved problem, but interfaces between disparate systems, using 3rd-party channel definitions, are not a solved problem, so you have validation applied at the interface. And yes, there is still a lot of new code being written for applications connecting to backends over interfaces. The whole IoT space is new, and the B2B space is still evolving.
but interfaces between disparate systems
You could be on to something! The EFTPOS interface ISO8583 or equivalent should not be a problem as other things would have stopped as well. But I'm interested in the firmware in the actual pump. Strange that it just happened this leap year and not last leap year...
So they always say the Millennium Bug was nothing, that it was hyped. I always said it's because stuff was fixed before it. Doesn't this prove it could or would have been an issue if nothing had been done?
Funny, boring story: I was still in college in 1999, just finishing an HND, and I never remember anyone mentioning Y2K to us. So when my brother-in-law asked me a quiz question at the time that was in the paper, I knew nothing about it. Don't know why I missed it.
My place were all over it, mostly because half the hardware they had needed BIOS updates to keep working. Those old 486s with poorly coded BIOSes soldiered on hard and long!
I'm sure some of the contractors of the era took the pish with Y2K bug threats; but there were genuinely places and patches that needed to be done. I'd say it's only in the last couple years with the attention moving onto cyber security that BIOS updates are back on the radar of many operators. IME and AGESA firmware being so complex and likely sources of exploits alone being the obvious examples of why you do actually need to stay on top of them.
"Don't know why I missed it."
In the 2 years leading up to 2000, I got paid an awful lot of money re-certifying stuff that I had already certified to be Y2K compliant some 10-20 years earlier. Same for the embedded guys & gals. By the time 2000 came around, most of the hard work had been done a decade or more in the past ... the re-certification was pure management bullshit, so they could be seen as doing something ... anything! ... useful during the beginning of the dot-bomb bubble.
Look for similar bullshit/misdirection leading up to 2038 ... despite the fact that all of the important systems that would be affected either already have been modified, or can easily be, making the so-called "problem" non-existent. (Certain hardware that was stupidly hard-coded being the exception, but most of that will probably be landfill by then anyway.)
My dad was offered a bunch of money to certify an expensive business-critical system. He was already too old for that shit: he explained to them that, if the system was not Y2K compliant, the dates printed on the optional paper copies they didn't print would be incorrect, and since they didn't care about that, they could just wait and see.
The millennium bug wasn't nothing. I worked my butt off replacing a banking package that wasn't Y2K compliant. Got a shitload of frequent flier points, enough for a round-the-world trip with accommodation for two people.
It didn't happen because a load of people did a load of work.
One of my treasures:
Calendrical Calculations
Nachum Dershowitz and Edward M. Reingold
Cambridge University Press, 1997
First edition, but there are a few later editions which would have enlarged on this fascinating topic. Mine came with a CD-ROM with Scheme (Lisp) code implementing the book's calculations.
I think the Hobbits had the best calendar - something like 12 months each of 30 days and get blotto for the extra five days - bit like Xmas - NY. Over time they'd probably over-observe the blotto period by the odd day or so, so that it averages out to 5.25 days per year.
The first day of the Hobbit new year started on the same day of the week so calendars and diaries were reusable. :)
How can some software, in the year 2024, not properly account for the additional day in a leap year? The rules for when a February 29th is added (or not) have existed for several hundred years, well before there was even a remote inkling of computers around.
It just goes to show that there are far too many "programmers" around these days that might know about all the latest paradigms and other fluff in programming, but have lost all connection to real world problems...
"It just comes to show that there are far too many "programmers" around these days that might know about all the latest paradigms and other fluff in programming, but have lost all connection to real world problems..."
Well, what do you expect when Management in the Corporate World is firing old programmers and hiring wet-behind-the-ears new graduates[0] with absolutely zero street smarts? Throw in so-called "DevOps" and its insistence that QA can be dispensed with (as a money saving measure, don'tchaknow) along with Marketing's attitude of "just ship it, we don't care if it's useful to anybody, some schmuck will buy it!" and Bob's your Auntie.
The proverbial thinking man can probably see that it's only going to get worse before it gets better ... and a techie with an entrepreneurial bent can undoubtedly figure out how to profit from this shortsightedness on the part of marketing and management.
[0] Round about 2000 I started interviewing "programmers" fresh out of school who didn't know what the heap and the stack are (much less how the compiler uses them) on a fairly regular basis. Nowadays it's normal for the youngsters to have many gaps of that nature in their education. I fear we are losing something very important that is going to prove to be almost impossible to get back.
Hear Hear! Back in 1990ish I'd written 6502 code that correctly converted dates to days-from-400-year-epoch (which then gives you day-of-week by MOD'ing by 7), and increment-this-date-to-next, all working. (Converted it to PDP11 a while back). Being a finite and manageable set of inputs, it was easy enough to code some scaffolding to run through the entire set and check the output was correct. Dunt people do testing today? Or even basic sums?
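That exhaustive check is even easier today - the full Gregorian cycle is 146,097 days, which happens to be an exact multiple of 7, so the day-of-week pattern repeats every 400 years. A quick Python equivalent:

from datetime import date

cycle = date(2400, 1, 1).toordinal() - date(2000, 1, 1).toordinal()
print(cycle, cycle % 7)   # 146097 0 - the whole pattern repeats every 400 years

# day-of-week by MOD'ing the day count by 7, checked against the library
# for every day in a full 400-year cycle (ordinal 1 = Monday, 1 Jan 0001)
for n in range(date(2000, 1, 1).toordinal(), date(2400, 1, 1).toordinal()):
    assert n % 7 == date.fromordinal(n).isoweekday() % 7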
Getting it right is easy. Figuring out what you need to get right is hard. Someone reported their software calculated “average revenue for the last 5 years” and failed on Feb 29th in leap years because “5 years before today” didn’t exist.
Now imagine they wanted “average revenue for the last four years”. The difference is that the date “four years before today” will always exist until Feb 29th 2104. That’s the first time when “four years before a leap year” is not a leap year.
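That first failure is easy to reproduce - naive "same date N years ago" arithmetic blows up on a leap day:

from datetime import date

today = date(2024, 2, 29)
try:
    print(today.replace(year=today.year - 5))   # "5 years before today" does not exist
except ValueError as e:
    print(e)                                    # day is out of range for month
print(today.replace(year=today.year - 4))       # 2020-02-29 - fine, right up until 29 Feb 2104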