
Collapsing global economy and rising fascism.
So no real change then - Situation normal.
Microsoft has released a (very) miniature rival to the Millennium Bug into the wild with a glitch in Outlook that takes the user instead back to the 20th century if they seek dates beyond December 31, 2029. Those with long enough memories will recall the Millennium Bug, a feature of software design resulting in two-digit date …
That paragraph, and the one about contradicting views on the Y2K bug, are filled with the kind of subtleties that the old Reg was chock-full of and that made me want to read it instead of greyer publications, but which are, unfortunately, getting rarer in current articles.
As others say, bring Dabbsy back!!
I believe most Microsoft products currently assume that two-digit years from 30 upwards are 19nn, and I doubt it's unique to them.
Excel certainly works that way: enter 31/12/29 and you get 31/12/2029; enter 01/01/30 and you get 01/01/1930. For MS SQL, though, 01/01/50 is the changeover date.
(/me yells into a pillow in frustration, then takes a deep, calming breath) For those of you who were not involved with Y2K testing or remediation:
There is a limited range of dates expressible using just two digits for the year.
"Date windowing" is a technique which allows the use of two-digit years. The farther back (in the past) you set the lower bound of the window, the fewer future dates are expressible within the window. The window is not stretchable. It covers a range of 100 years.
Microsoft picked what they thought was a reasonable lower bound for their window, and most (not all) other companies used that same lower bound for theirs.
If you want to express dates beyond 2029, you will have to type in four digits for the year. This is not a bug; it's an inherent limitation of date windowing where the lower bound of the window is set to 1930.
Software which will not allow you to type in four digits for the year, or which will not display four-digit years, does have (unremediated) bugs.
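A minimal sketch of that fixed-window expansion, assuming the 1930 lower bound described above (the function name and structure are illustrative, not anyone's actual code):

```python
def expand_fixed_window(yy: int, lower_bound: int = 1930) -> int:
    # Fixed 100-year window starting at lower_bound. With 1930,
    # two-digit years 30-99 become 1930-1999 and 00-29 become
    # 2000-2029, matching the Outlook behaviour discussed above.
    assert 0 <= yy <= 99
    candidate = lower_bound - lower_bound % 100 + yy  # same century as the lower bound
    return candidate if candidate >= lower_bound else candidate + 100
```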
A common method is a sliding window, e.g. $thisyear-60 to $thisyear+40, assuming more dates in the past are going to be manually typed in. (Because most of the dates my code flings around go into file stamps, I use 1980-2079; see the sketch below.)
But really, the solution is for the software to scream at the user and refuse to accept manually typed two-digit years.
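A minimal sketch of that sliding window, assuming the $thisyear-60 / $thisyear+40 split just mentioned (illustrative names, not anyone's production code):

```python
from datetime import date

def expand_sliding_window(yy: int, years_back: int = 60) -> int:
    # 100-year window that slides with the clock:
    # [thisyear - years_back, thisyear + (99 - years_back)].
    assert 0 <= yy <= 99
    lower = date.today().year - years_back
    candidate = lower - lower % 100 + yy  # same century as the lower bound
    return candidate if candidate >= lower else candidate + 100
```

Run today, a two-digit 30 comes out as 2030; run the same code decades from now and the same input lands in a different year, which is exactly the reproducibility problem raised below.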
Sliding date windows: beware!
Sliding date windows ~might~ be okay for manual data input, but should not be used for programmatic data input, nor for internal calculations. If you use sliding date windows for manual data input, aren't you depending on your users to know the limits of the sliding window, and to know that today, the maximum two-digit date they should use is DD-MM-YY? If users don't always correctly remember-and-compensate, they may inadvertently introduce data errors. This sounds like a recipe for trouble.
Using sliding windows with a fixed set of programmatic input data will produce output which varies depending on the date when the program was run.
Using sliding windows in internal calculations with a fixed set of data will produce output which varies depending on when the program was run.
I don't want my mortgage payments or drug dosages varying based on "stuff".
Programs that output incomplete dates for other programs to consume need to be taken out behind the shed and beaten with a shovel.
The programmers who *wrote* such an abomination any time in the last 30 years need the same, along with an additional dose of carpet and quicklime...
Actually storing dates as a two-digit year has been Really Dumb for quite some time now.
Yes, there is an option, but it's not an 'Excel' or 'Outlook' option, it's a 'System' option.
On Win7 it's in 'Region and Language' -- which is not intuitive, but it's where you select which calendar your system uses.
Where it is on Win10 or 11, or 365, I leave as an exercise for the reader.
It's in the same place in Windows 11 (and presumably therefore in 10, too). It's one of the few remaining things you need to launch the old Control Panel for, there isn't an equivalent in Settings.
Interestingly, in Windows 11 the default setting is 1950 -> 2049, so dates up to 31/12/49 should fall into the 21st Century correctly.
GJC
MS' crime is that the window hasn't kept up with the current date; it's probably remained the same in Outlook for two decades.
This is assuming that the only date windowing happening now is for user input and everything works behind the scenes... I'm sure it does, right?
Icon is for "depending on who you believe" in the article. They still know how to wind up commentards of a certain age when they want to despite the po-faced new house style (they're turning into Ars Technica...).
Date windowing should just be for user input at the keyboard where the year is immediately converted and the user can see if the wrong century was chosen. Nobody's going to stop getting their pension at -12 as a result of moving the window.
But by not moving the window, it will cause problems at around 2030 or so.
Absolutely.
Unlike Y2K, where users couldn't enter a four digit year because the database or the data entry screen wouldn't allow it, this is entirely a user problem where users can't be bothered to enter a four digit year.
If Microsoft wanted to fix it they'd require four digit year entries in Office.
We'd approve, but the common users would scream bloody murder.
IMO the sliding date window is ONLY acceptable for general use if entering a two digit year always means the current year: if you want to input a date in any other year, then enter four digits. This assumes the system is a financial one and is only dealing with precisely known dates that are close to the present, AND that a European/American style calendar is being used.
Outside financial circles the ways a date is represented and its (implied) accuracy can vary widely.
I once worked on a music-related system that dealt with dates that, amongst other things, needed to record the date when a piece of music was written or when a composer was alive. As a result it had to deal with dates ranging from 55BC to the present and with varying degrees of accuracy, ranging from "11th century", i.e. some time in that century, up to and including exactly known future dates, such as the projected date of a concert. Needless to say, in this system every date needed to include a code indicating its accuracy and its required format when it was being entered or displayed.
If the current year is meant then a day and month should be sufficient. If provision is made to enter a 2-digit year then the sensible default is the year closest to the current year, which is what the user is most likely to expect (see the sketch below).
Historical dates are a nightmare. E.g. the archivist classified something as "Early C14th" but looking at it I suspect it could be before another dated in the 1290s and none of the witnesses who are found in other, dated, documents are exclusively C14th.
I've used LibreOffice entering dates - as 4 digits - back to the C12th. I have no problem with using 4-digit formats. But if the year is set to 2 digits the default should be what a user might find most reasonable, and I doubt a date almost a century in the past, compared to one a few years ahead, is likely to be considered reasonable.
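A minimal sketch of that "default to the year closest to the current year" rule (hypothetical helper, just to make the suggestion concrete):

```python
from datetime import date

def expand_nearest_year(yy: int) -> int:
    # Pick whichever full year ending in these two digits lies closest
    # to today; ties (exactly 50 years either way) resolve to the past here.
    assert 0 <= yy <= 99
    this_year = date.today().year
    base = this_year - this_year % 100 + yy  # same century as today
    return min(base - 100, base, base + 100, key=lambda y: abs(y - this_year))
```

In 2025, "30" expands to 2030 and "95" to 1995, which matches the "a few years ahead beats a century ago" expectation.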
This might have been reasonably considered Good Practice back in 1999 but I can't imagine anyone being able to put forward a cogent argument for its being that almost a quarter of a century later.
If you mean the "cookie consent" banner, it's actually unnecessary as it hides a link at the bottom of the page (Your consent options) that should do the same thing. The myth that cookie banners are required by law is actually a massive con. The law requires consent to be sought for non-essential trackers (including cookies) but it's actually unlawful to try and coerce consent by interfering with access to a site (e.g. by use of intrusive banners). The problem is that if there were just a passive link, nobody would follow it and agree to non-essential trackers, so the industry has decided to make it annoying so we are tempted to click through thoughtlessly, giving them the supposed right to use non-essential trackers. I say 'supposed' because it's actually breaking the law. It's notable that the banner you are apparently objecting to is much larger than strictly necessary and obscures the foot of the page content.
The problem though is that the regulators don't really give two hoots about this kind of law breaking, to the extent that the UK govt. is trying to remove the consent requirement on the ironic basis that people don't like the essentially illegal cookie banners. So because people object to businesses breaking the law by coercing consent in order to track us, we're going to make what they're doing lawful by allowing them to track us without consent (so they won't need to use the annoying banners any more).
I like your rant but I think the poster is referring to the 'What are the biggest tech-related initiatives impacting your organisation right now?' inline panel that appears at the bottom of every article, whether you've answered it or not.
Maybe it gets blocked by certain ad/tracker blockers, but I get it here at work.
Sounds like a great way to get a load of unwanted noise/garbage responses messing up your survey.
Is there any way for them to intelligently filter out random selections hammered in by people under the (mistaken) impression if they do so it'll go away, or by those who just dislike the thing full stop and want to mess it up?
Can confirm, the millennium bug was mostly "an overblown scam to suck money into the tech industry", peddled by people who mostly didn't know the difference between a bug and a virus, and had magic snake-oil fixes for sale.
Because obviously if my 1990s electromechanical washing machine thinks it hasn't been invented yet, it is going to spew water everywhere and spontaneously combust.
Me too - and, having fixed it, pulled all-nighters around the new year because management weren't 100% convinced that our fixes worked.
Luckily they did - although the downside is that we now have to put up with conspiracy theorists claiming that it’s all a scam.
I now work in a field adjacent to medical research and I see the same dimwits denigrating the efforts of my colleagues by saying that the Covid vaccine is a scam.
Same dimwits, different decade.
In 1998 I was part of a team of 70 in the UK working on Y2K fixes for combined banking & insurance systems created in the 1970s. Plus a software house in India working on the simpler fixes. Plus an automated s/w updating tool running in the US for the easy-to-spot issues. Plus a team of people working every night to fix crashes caused by having birth dates before 1900 and policy expiry dates after 2000 on a system that had been written with 2-digit dates in many hundreds of different fields across a similarly large number of programs. For that particular business, it was a potentially terminal issue that had to be fixed and tested well before 31/12/99.
Kudos for your efforts, but your higher-level management sucked for not having started the Y2K work earlier.
Properly-done, Y2K testing and remediation would not be a down-to-the-wire finish. (Our management sucked, too. If we'd started a year earlier, we would have finished "comfortably" on time, rather than "barely" on time. On the plus side, in 1999 I made more money from overtime than from my normal pay.)
Kudos for your efforts, but your higher-level management sucked for not having started the Y2K work earlier.
It should have been started earlier, but management did suck big time. In 1991 I got a negative comment for using four-digit years in programs I wrote during my education as a programmer. I countered with two questions:
1) If this were for real instead of education, what would be the expected lifetime of the program?
2) If this were for real instead of education, what would be the chance of the program being used beyond that expected lifetime?
The answer to the first question still (marginally) allowed for a two digit year, the answer to the second was embarrassed silence.
As a bonus, they were impressed by my correct leap year check using a single division by four.
A simple divide-by-four will work between 1900 March 1 and 2100 February 28. Which might have led the above to be expanded to...
3) What's the likelihood this will be used after 2100 February 29?
The nature of the assignment isn't mentioned, but if it involved birth dates... in 1991, my then-100 year old grandfather, and many others born before 1900 March 1, still walked the face of the earth. If it involved hundred-year mortgages, there could have been problems nine years later, on 2000 February 29.
I first checked if the last two digits were zero. If that was the case, I moved the first two digits to another variable; if it wasn't, I moved the last two digits to that same variable. Division of that variable by four would indicate whether it was a leap year or not. That method works between 15-10-1582 1) and 28-02-4000 2).
1) Or whatever date was locally used for the conversion from the Julian to the Gregorian calendar.
2) Next exception; if the program is still in use when that becomes relevant, they are welcome to recall me by that time.
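A minimal sketch of that single-division check as described (the poster's stated validity range being the local Julian-to-Gregorian switchover through 28-02-4000):

```python
def is_leap(year: int) -> bool:
    # Century years (last two digits 00): test the first two digits instead;
    # otherwise test the last two digits. One divide-by-four then decides.
    pair = year // 100 if year % 100 == 0 else year % 100
    return pair % 4 == 0
```

So 1900 tests 19 (not leap), 2000 tests 20 (leap) and 2024 tests 24 (leap), matching the full Gregorian rule within that range.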
It's a clever enough solution alright, but is it understandable 2000 years later when they're debugging it?
After all, films tell us the future will be built on old code.
Not sure I understand why it shouldn't work for the year 4000 and after? 4000 will still be a leap-year won't it?
The year 4000 won't be a leap year, next exception as I wrote. The rounding errors on all of those decimals will accrue to a whole day (too much) by that time. But it is sufficiently far away none of us will have to deal with it (unless we are recalled ;) ).
It's the "just" that is the issue there. As part of a local authority teaching team we had to use a chunk of our limited budget to get our laptops Y2K tested, even though they were standalone machines, no networking and not system critical in any way.
But paying for the test was compulsory. For all of them. And that was just us. I'm sure the same waste of money was replicated in lots of places. Everything had to be certified. There was no risk assessment element.
we discovered Y2K bugs in life-affecting software
Indeed. At the time, someone posted to RISKS (if memory serves) an account of fixing software for a dialysis machine that went into cleaning mode if the date 9/9/99 was entered. That's another variant of the Y2K bug – an assumption that your software won't be used after a not-far-distant date – which would have killed people if not corrected.
People routinely underestimate how long their software will be in use. I support a commercial software package that was last updated about two decades ago; we have a handful of customers who are still using it. (That is, I officially provide development support for it. We haven't actually had any support questions about it in years, and even then they were generally "we moved this to a new machine and no one still working here has any idea how to adjust the configuration".) Last year I got rid of a VHS VCR that only supported two-digit years, so it no longer had the correct day-of-week in its display (which I didn't care about, but it's another example).
Customers occasionally give us pieces of their application source code to help diagnose some obscure problem, and it's not unusual to see change dates from the 1980s or even 1970s.
On the night of 31/Dec/1999 I was on duty, after a year and a half working on that project for a large bank.
At around 1 AM I went to the nearest ATM and checked my balance and latest account movements (my account was not with the bank I was working for). There was an interest credit of around the equivalent of 3000€. Resisting the urge to spend it there and then, I went back, showed the slip to my co-workers and pondered on what would happen from there on. At 8 AM, after an uneventful night on the job, I went down and checked my balance again. Without a trace of that earlier payment, it now showed the correct and, unfortunately, much smaller interest deposit...
Someone's night was indeed a lot more eventful than mine ;)
the millennium bug was mostly "an overblown scam to suck money into the tech industry"
It's not so long ago that I posted this here, as I have done a few times before but obviously it needs to be repeated:
I had a client who had a non-Y2K compatible version of their accounts S/W. Their old alleged hot standby server (also previously commented on) wouldn't run the newer version so both servers were replaced and UAT completed to the accountants' satisfaction. We planned to cut over between Christmas & New Year. The accountants chickened out and wouldn't let us until they'd closed out 1999 accounts in mid January. It was a torrid couple of weeks with the vendors dialling in (a modem on a serial port!) several times each week to fix the data.
Yes, Y2K really was a genuine problem.
In the late 1990s I was working with Oracle Forms (kids, ask the oldest person in your IT department) and I can state for a fact that we had to do a lot of work and even more testing to fix a slew of Y2K bugs. Without this work, many 1000s of applications in every field, affecting 100s of thousands of end users, would have been impacted.
Most basically, we implemented customer-settable date windowing including a runtime parameter for the base date. But we also did a laundry list of other fixes, changes, and user options including adding support for the RRRR date format. And we had to deal properly with data from Oracle 6 databases.
And even then, many customers had to do work themselves e.g. to fix form layouts with field expansion to allow 4 digit dates; or to correct date arithmetic and related logic.
So yeah, anybody who says it was a scam is flaunting their ignorance, and El Reg is really not a great place to do that.
Yet another example of why automatic updates are a bad idea - particularly stealthy updates.
Not really a glitch -- this is (AFAIK) documented behavior within the Microsoft Office products. They picked xx30 as the cutoff, probably arbitrarily, back in the 1990s, when xx30 would be pretty far from the current year in either direction. Solution? Use a Y2K-compliant date! If you're going to put in a non-Y2K-compliant year, they have to do something after all -- the reasonable thing may be to put the current millennium and century on there, except then people would be crying up a storm when they put in "1/1/99" and get 2099 out (... for a few more decades, then they could complain that when they put in "1/1/01" it's giving them 2001 instead of 2101).
Of course, for compatibility with existing files, it will probably have to ALWAYS treat 00-29 as 2000-2029, and 30-99 as 1930-1999, even when it's 2040. After all, otherwise all your existing files with improper dates would suddenly have the date jump ahead 100 years.
At least LO fixed Excel's "29th of February 1900" bug.
Legend has it that it was a deliberate bug so Excel would calculate dates the same way as Lotus 1-2-3. In order to gain market share, it apparently was more important to get things wrong the same way the market leader did.
Hey, me too. As a student in the mid-1980s I had a summer coding job at Satchwell Control Systems and one of the things I wrote was for standard time/summer time switching. And I can say for a fact that unless somebody else fixed it, that code (a) was in buildings including at least one major hospital at the turn of the millennium and (b) would not have worked properly after that time.
Office 95 and Office 97 certainly had what I would consider a more serious bug than 2 digit years.
The ability to copy/read password protected documents in seconds without the need for a password or anything other than Office itself.
Word documents could be compared to a blank document and give you all the contents of the password protected file. Simply because everything was different.
Showing a very large hotel admin office that I could easily read thousands of their clients' documents from conferences, including some from my company's competitors, caused panic. Especially as even back then they had several back doors installed that allowed remote access to the same files.
For the record, rather than open a competitor's file, I had the secretary create a new one; I then opened it and cut and pasted it into a new document so it looked the same, without demonstrating how I did so. My ethics and mission were to get their 'IT' support to do something about it, and to stop them blaming the network issues they had on my company's equipment.
Thankfully MS sorted that issue out in Office. I never did find out if the hotel decided to delete all the documents they kept from previous conferences or not as I was never asked to do a call out on our equipment there.
"Outlook sends the user back to 1930, where they may have to deal with a collapsing global economy and rising fascism."
I know it's been commented on above, but this was a masterpiece of under-the-radar comment. If I wasn't a coward I would be able to select the coffee-spattered keyboard icon.
Many early databases also stored the date as two digits (or at least defaulted to that - Oracle 6 and earlier, I think?). Consequently, you ended up with windowing logic in the application itself in order to slide the 100 year range available to the coder to an application-appropriate window.
Many early databases also stored the date as two digits
That is a nice trick, storing a date as two digits. Can you please explain that technique, as I need at least five digits for a date when I use the Julian date format or the week number and day number format, in both cases with two digits for the year.
On a related note, I know of at least one person who managed to store full dates in six digits by conversion to a number of days from an (arbitrary) base date. That gives a window of 2 * 2,740 years.
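A minimal sketch of that day-count trick, with a made-up base date (six unsigned digits of days cover roughly 2,740 years; counting days on both sides of the base, as the poster implies, gives the 2 * 2,740-year window):

```python
from datetime import date, timedelta

BASE_DATE = date(1850, 1, 1)  # arbitrary, illustrative epoch

def to_day_count(d: date) -> int:
    return (d - BASE_DATE).days  # the compact integer to store

def from_day_count(days: int) -> date:
    return BASE_DATE + timedelta(days=days)
```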
The dates were often stored in the databases as character strings. Usually in a local and non-transportable format of course... why use the international standard YYYYMMDD (or YYMMDD) which is easily sortable when a local standard that is not easily sortable can be used instead?
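A quick sketch of the sortability point: YYYYMMDD strings sort lexicographically into date order, while a local DD/MM/YYYY layout does not:

```python
iso = ["19991231", "20000101", "20291231"]           # YYYYMMDD, chronological
local = ["31/12/1999", "01/01/2000", "31/12/2029"]   # same dates, DD/MM/YYYY

assert sorted(iso) == iso      # string sort matches date sort
assert sorted(local) != local  # string sort scrambles the local format
```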
Other than legibility for developers who are unable to understand YYYYMMDD-formatted dates, there were few database advantages to storing the date in text form, and the text form even allowed invalid dates to be recorded, which causes no end of problems when encountered. However, the comparison gets a bit muddied depending on how the date is actually stored in the database; dates are usually stored either as character strings or as a numeric offset from a base date.
Where storing dates in a textual form has an advantage is where the developer just takes the raw character string and throws it out to the user, possibly with separators inserted between the date components extracted by character position.
The alternative is that, as noted by the previous poster, a date is recorded in the database as a numeric value with an offset from a given arbitrary point in time. This is much more efficient by way of data storage, dates cannot be invalid, and day-based date arithmetic is much simpler. The major downside is that translating such a numerical date into the date components of year, month and day can be computationally quite expensive, and when looking at the raw data the date cannot be easily seen. Converting character strings into the date components is simpler, but the values must then be checked for validity, which, compared to an efficient number-to-date conversion algorithm, makes the two translations quite similar processing-wise.
Date manipulation such as adding or subtracting months is tedious regardless of the underlying format due to variable month lengths, let alone leap years, and even adding or subtracting years is tedious for similar reasons, as 29th February is only valid in a leap year.