COBOL?
Has someone forgotten about binary coded decimal?
A new year bug has scuppered card transactions at thousands of Australian shops for four days so far, because systems at the Bank of Queensland say it is now 2016. The glitch has meant most debit cards have effectively expired. In the ensuing chaos, shops served by Bank of Queensland have been forced to introduce temporary …
If the dates are stored or manipulated as only two digits at some point, then there is a very suggestive relationship between 10 and 16 if you consider decimal vs hexadecimal: the digits "10" read as hex (or as a packed BCD byte misread as binary) come out as 16. It's a long shot, but after several days I guess they've checked the more obvious ideas.
Of course, after the Y2K fuss, two-digit date manipulations shouldn't really be allowed ...
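Purely as a hypothetical sketch of that guess (nothing below is known about BoQ's actual systems), here is how a two-digit year packed as BCD turns 10 into 16 when the byte is misread as a plain binary number:

<?php
// Hypothetical illustration only: a two-digit year stored as packed BCD.
// The year "10" (i.e. 2010) packs into the single byte 0x10.
$year_bcd = 0x10;                                          // BCD encoding of "10"
$misread  = $year_bcd;                                     // byte taken as plain binary: 16
$decoded  = (($year_bcd >> 4) * 10) + ($year_bcd & 0x0F);  // nibbles decoded as BCD: 10
echo "misread as binary: $misread, decoded as BCD: $decoded\n";

Note that 0x00 through 0x09 decode identically either way, so a fault like this could sit unnoticed right through 2000-2009 and only bite when the tens digit ticked over.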
If you read the article, and the link to the work-around, you'll see it's affecting BlackBerry and other handsets as well. If the phone is configured to use the operator timestamp for texts rather than the handset timestamp, then people see the problem.
On WM (WinMo) it's fairly simple to toggle the behaviour to use the handset timestamp rather than the operator timestamp. (I wonder if the operators are running *nix of some sort. Does BoQ run *nix or Windows behind the scenes for the EFTPOS terminals?)
When Y2K came around and nothing happened, I always said it was because us code monkeys prepared for it and solved it before it became a problem. But everybody said we were full of shit. They never realised that nothing happened not because it was just hype, but because we prevented it.
And this sort of thing is what happens when we don't expect it and thus can't prepare for it. And this is a minor glitch compared to what Y2K would have been. So now, all of you who took the piss out of us coders when Y2K failed to materialise, imagine how much worse than this it would have been if we'd just sat on our arses and done nothing.
Roll on January 2038... we'll fix that one too, never fear. (64-bit time = start 0 at the Big Bang and wrap 584.5 billion years on - never a problem again!)
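For what it's worth, the 584.5-billion-year figure checks out: an unsigned 64-bit counter of seconds only wraps after 2^64 of them. A quick back-of-the-envelope check:

<?php
// Rough check of the wrap-around period of an unsigned 64-bit second counter.
$seconds_per_year = 365.25 * 24 * 3600;            // ~31,557,600 seconds
$wrap_years = pow(2, 64) / $seconds_per_year;      // 2^64 seconds expressed in years
printf("%.1f billion years\n", $wrap_years / 1e9); // ~584.5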
We'll never know what Y2K would have been without your heroic efforts. However, we can be fairly sure that it couldn't have been within an order of magnitude of what was being predicted by the army of self-appointed consultants who stepped in to save us. At one point in the late 90s, the Gartner Group was saying it would cost several hundred billion dollars to fix, which on any reasonable assumptions was far more than the software cost to write in the first place.
Predictions of doom were also rather less credible when one remembered that only real-time systems would be affected (since any system used for forward planning had clearly been handling post-2K dates since time immemorial) and that in many cases one could assess the exposure simply by setting clocks forward. Yes, I'm sure you can cite lots of obscure scenarios where these two rules don't apply, but they covered the vast majority of the code being "fixed" at great expense by "Y2K consultants".
You were clearly spending too much of the 90s reviewing code. Those of us who were watching the Y2K fraud (and what else can you call it when knowledgeable people sell squillions of pounds of services to non-knowledgeable ones by invoking FUD) have a very different memory of that decade.
Here we go again. Another ignorant troll from somebody who has no idea of what they are talking about. The Y2K work we did was not some big con, but "non-knowledgeable" morons like you will never accept that we actually did any work because you didn't see anything go wrong and would prefer to believe that it was a big conspiracy to con poor old you.
We "time-travelled" one of our mainframes and discovered over 2,000 instances of Y2K code faults which we had to correct, unit test, system test, integration test and user acceptance test. That's just one mainframe, let alone the others, or the mid-range systems/applications, or the faults that time-travelling the systems didn't reveal but could only be found by going through the code, line by fucking line.
Then of course there are the huge savings we made and the efficiency we gained (from economies of scale) from all the other non-Y2K bugs we found and fixed during the process.
If you think it was a waste of time, then you're the kind of idiot we can well do without in IT.
If I say that something is happening, it is I who must show the evidence. Saying that someone doesn’t "believe" in AGW because there is no evidence AGAINST it is nonsense.
- Where is the hard data that shows that HUMANS are responsible? There isn’t.
- Is the World’s climate changing? Yes. It always is.
- What are we doing to ensure our survival when the climate DOES change and crops around the world start dying from drought, too much heat or too much cold? Nothing, because everyone is too busy with the fucking fossil fuels, something that in 50 or 60 years we will no longer need or have because it is not renewable. We should be planning what to do when the freaking next ice age starts, not harassing governments and industries to make sure "BIG OIL" is stopped.
It is sad when scientists do their research with political agendas in mind instead of the pure pursuit of knowledge, be it on the left or right of the political spectrum.
I assume AGW stands for Anthropogenic Global Warming. And indeed, so far there is no real evidence that it is true. But since it has become such a popular topic in the mind of the general public, politicians are now following suit.
I experienced the Y2K issue in 1986 when I took a splendid old lady to the then brand new hospital at Milton Keynes. Daisy, bless her, had slipped and broken her wrist and sat patiently in a chair as we spoke to the girl on the desk and tried to book her in.
The problem, of course, was that Daisy was born in 1898 (or something like that), and when 98 was put into the two-digit year box, the terminal simply spat it out again as impossible: we had not got to 1998 yet, so that had to be wrong.
A discussion round the dinner table a few days later raised the issue, and Daisy's grandson later went on to become a very successful businessman. One of his businesses was based on solving the Y2K bug, and he sold it to M$!
The Y2K issue was very real, even if it was caused by lack of foresight in the first place!
As I pointed out in a previous post, 2038 is coming and by then all systems will be on at least 64-bit time if not more. 64-bit time will easily cover us from the Big Bang to well after the end of the Stelliferous Era, at which point anything descended from us won't have any use for computers anyway.
As to using 4 characters to store the year - nearly every database I've worked with stores dates as Unix timestamps, aka the 32-bit integer that's going to wrap in 2038. And most programs I've worked on also compute date differences etc with the integer timestamp as well and only convert to dd-mm-yyyy when it's time to display results to the user. (It's much easier to add 30 days to today by going $expire = time()+(30 * 86400); rather than futzing about with splitting days, months, and years and adding/subtracting from monthday arrays.)
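For anyone curious exactly where that wrap lands, here's a minimal sketch (assuming the usual signed 32-bit timestamp, whose maximum is 2^31 - 1 seconds past the 1970 epoch):

<?php
// The last moment a signed 32-bit Unix timestamp can represent.
$last = 2147483647;                               // 2^31 - 1
echo gmdate('D, d M Y H:i:s', $last), " UTC\n";   // Tue, 19 Jan 2038 03:14:07 UTC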
In any case, any computer we're using 7,990 years from now will be as far beyond us as the internet is to Hammurabi, and the likelihood of us still using the current date system then is about on par with us using the Babylonian calendar now, so somehow I don't think it'll be an issue! ;)