Coop bank never looked so good
With their outdated and behind-the-curve systems...
Nationwide UK customers have been unable to see payments on their online banking systems since this morning, in the latest in a string of digital glitches at the bank. According to Downdetector, Nationwide customers have been reporting problems with internet banking since 8.37am this morning (July 24), following a day of …
"Is the SECOND time this month I have been left with no access to my own money due to YOUR mistakes! It's disgraceful! 2/2"
Jesus, some people are ungrateful. Two downtimes in a month after years of uptime? Oh yeah, you don't remember the good times, do you? No, just the bad.
Sod off and put your money under the mattress next time if you're so worried about having access to your cash.
"Is the SECOND time this month I have been left with no access to my own money due to YOUR mistakes! It's disgraceful! 2/2"
Jesus, some people are ungrateful. Two downtimes in a month after years of uptime? Oh yeah, you don't remember the good times, do you?
Whilst I get your point about the uptimes: good times? It's a bank; they've never been my pal and are quite happy to charge for the smallest infraction.
They probably use Virgin Media as their ISP... however the total failure of much of VM's network since midday Friday (21/7/2017) is astonishingly absent from the press. It's only just beginning to come back online now, 10:00 Monday (24/7/2017). VM even took their own service status page offline rather than actually communicate with any of their customers about the failure.
Am I the only one who thinks the occasional planned weekend maintenance is perfectly fine, even for a bank? This only affected telephone and online banking; ATMs and cards were still usable, and customers were notified well in advance.
Surely it's the price we pay if a bank wants to invest in keeping their infrastructure modernised? Better that than the "if it ain't broke, don't touch it" approach favoured by banks towards the end of the 20th century.
Ex 1) An update needs to change data formats. For that you need the data to be in a known state - and not changing - while the upgrade and testing processes run. So you have to stop all processes that change the data.
Ex 2) A code change is being applied that needs a restart of the process. The new program does not play well with the old program, so all copies need to update at the same time. Even more fun if multiple programs need to change.
Ex 3) An upgrade of the entire application suite, applying multiple code and data changes all at once, changing every executable of said suite across multiple hosts at the same time, quite often the entire DB, and sometimes even moving the whole lot to new hardware and a new OS version.
I'm sure there are many other examples (a rough sketch of the first one is below).
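To make Ex 1 concrete, here's a minimal sketch of why the writers have to stop first. It assumes a quiesced database with nothing else touching it, uses SQLite purely so the example is self-contained, and the payments table and column names are invented for illustration - this is not anyone's real banking schema.

```python
import sqlite3

def migrate_payment_format(db_path: str) -> None:
    """Ex 1 in miniature: change a data format in place.

    Only safe because nothing else is writing to the database while it
    runs - which is exactly why the maintenance window exists.
    """
    conn = sqlite3.connect(db_path)
    conn.isolation_level = None            # manage the transaction explicitly
    try:
        conn.execute("BEGIN IMMEDIATE")    # take a write lock: data must not change mid-migration
        # Old format: amount stored as a string like "12.34".
        # New format: amount stored as integer pence.
        conn.execute("ALTER TABLE payments ADD COLUMN amount_pence INTEGER")
        conn.execute(
            "UPDATE payments SET amount_pence = "
            "CAST(ROUND(CAST(amount AS REAL) * 100) AS INTEGER)"
        )
        # Verify before committing - the 'testing process' part of the window.
        bad = conn.execute(
            "SELECT COUNT(*) FROM payments WHERE amount_pence IS NULL"
        ).fetchone()[0]
        if bad:
            raise RuntimeError(f"{bad} rows failed conversion - rolling back")
        conn.execute("COMMIT")
    except Exception:
        if conn.in_transaction:
            conn.execute("ROLLBACK")       # backout: leave the old format untouched
        raise
    finally:
        conn.close()
```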
So while you're complaining that you couldn't check your balance for a few hours, a whole bunch of techies have been working all feckin' weekend (plus a dry-run or two that you don't see), and when something unexpected happened, not only did they have to fix the problem, they also had to contend with a bunch of PHBs getting on their case.
AC 'cos I do this myself and certain PHBs don't like suggestions that things don't always go entirely as expected.
Of course things don't always go as planned and cock-ups happen. But what kind of bank doesn't have an alternate site they can fail over to? All the examples given would need much shorter outages if one existed: test and start the new services at the alternate site, then take a shorter outage while the data is copied over; real banks do this all the time (roughly the pattern sketched below). Questions should be asked at Nationwide.
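For what it's worth, a rough sketch of that kind of cutover. Every helper here is a hypothetical placeholder for whatever replication, health-check and traffic-switching tooling the bank actually has - none of this is Nationwide's real setup, it just shows where the customer-visible outage shrinks to.

```python
import time

# Placeholders - stand-ins for real deployment, monitoring and routing tools.
def deploy_new_version(site: str) -> None:
    pass                 # placeholder: run the deployment pipeline against `site`

def smoke_test(site: str) -> bool:
    return True          # placeholder: run automated checks against `site`

def stop_writes(site: str) -> None:
    pass                 # placeholder: quiesce the writers at `site`

def replication_lag_seconds(standby: str) -> float:
    return 0.0           # placeholder: ask the replication tooling

def switch_traffic(to_site: str) -> None:
    pass                 # placeholder: repoint DNS / load balancer

def cutover(primary: str = "site-a", standby: str = "site-b") -> None:
    # 1. Do the slow work (deploy + test) on the standby while the primary
    #    keeps serving customers - no outage yet.
    deploy_new_version(standby)
    if not smoke_test(standby):
        raise RuntimeError("new version failed smoke tests; primary untouched")

    # 2. The only customer-visible outage is the final sync and the switch.
    stop_writes(primary)                        # short freeze, not a whole weekend
    while replication_lag_seconds(standby) > 0:
        time.sleep(1)                           # wait for the last transactions to copy
    switch_traffic(to_site=standby)             # standby becomes the new primary
```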
By the way, the techies are handsomely rewarded for doing their job; I also do it for a living.
The original maintenance slot was due to finish at 12pm UK time on Sunday and overran by at least 13 hours (I received a text notification overnight, around 1:17am I think). The downtime highlights clear problems in the project plans for implementation and backout. A "no-go" should have been declared a lot earlier, and even if it was, a 12-hour backout plan doesn't seem good.
Banks and other companies are in an increasingly difficult position because of 24-hour working. The standard "batch" processes the banks have relied on for many years (and still do) just don't work any more. These sorts of companies need people with bigger vision and better ideas.
Unfortunately, this is not always feasible, and it does sometimes make sense to see operations through even if it means extending the maintenance. I have first-hand experience of an issue with a production database where indexes were being added and the time it would take was underestimated. The decision was made to roll it back, which essentially meant cancelling the operation and letting the engine roll back. When the change was rescheduled with a bigger window, it actually only took another hour on top of the point where the "no-go" decision had been made. The rollback, however, took a lot longer...
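As a back-of-the-envelope way of framing that call - the helper name and all the numbers below are made up for illustration, not from the actual incident - the key point is that cancelling isn't free: the engine still has to undo the partial work, and that can take longer than simply finishing.

```python
def go_or_no_go(est_remaining_h: float,
                est_rollback_h: float,
                window_remaining_h: float) -> str:
    """Crude decision aid for an overrunning change.

    All figures are estimates from monitoring; the numbers used below
    are invented.
    """
    if est_remaining_h <= window_remaining_h:
        return "carry on - finishing fits inside the window"
    if est_rollback_h < est_remaining_h:
        return "no-go - roll back now and reschedule with a bigger window"
    return "carry on - rolling back would take even longer than finishing"

# Made-up numbers for a situation like the one above: the build needed
# about an hour more, the window was nearly gone, and undoing the partial
# work would have taken roughly four hours.
print(go_or_no_go(est_remaining_h=1, est_rollback_h=4, window_remaining_h=0.5))
```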
When you let Risk Management and the business decide which risks to accept on your IT systems, sometimes those risks are realised.
Old-school technical and engineering approaches may seem painful to the uninitiated, but they work when implemented correctly. All companies need people with this mindset who have a direct interest in ensuring the continuity of our systems. You cannot outsource this responsibility.
AC because now I work for an outsourcer and have direct experience and knowledge.
I was called in to a Sunday meeting with a Building Society which was migrating and merging the IT systems of a rival which it had taken over some years earlier.
The PM was so tired he could barely walk. He had worked every day for 18 months, with a day and a half off. I made sure I never banked with them. It also brought home the importance of holidays and weekends.
To be fair, they are actually one of the most reliable when it comes to online access.
Almost all of their outages are communicated beforehand, and being in the IT game I can understand how scheduled maintenance can overrun. It happens.
Truth be told, no access to the app or online banking is nothing compared to the full NO-access-to-any-of-your-money-whatsoever outages NatWest or the rest of that gang have on a frustratingly regular basis. If Nationwide have overrunning scheduled maintenance (whereas the others seem to have no maintenance at all...) that limits app/web access in aid of staving off any NatWest-style full-on outage, I'm all for it.
SAP Banking Services, like other SAP products, requires downtime. Everything other than the bootstrap is in the database and there's only one copy of the code/user data, so you will take downtime when you do maintenance. And if it goes wrong, there's no likely quick fix. Someone should have warned them. Oh yeah, they did.
Oh no... I couldn't access my online banking for 30hrs... what a crime... someone should be shot for this outrage...
May the first persons up against the wall... be those whining little toerags whose cries of faux outrage point to an utter lack of anything interesting or meaningful happening in their sad little lives.
I'm a customer... I couldn't access my bank statements until this morning... end result... I waited... disruption to my life... feck all... If anything, the only actual real-life result from this whole thing is that it's made me despise these little whiny tossers even more... Gawd forbid anything monumental happens that actually disrupts their lives... they couldn't cope with the stress; their heads would explode from all the ranting and raving they'd be doing... We'd see bodies lying in the street, headless, with blood and brain matter splattered all over the nearest cash points (which were working fine, as were card transactions... you just couldn't look at your statements online, you feckless losers).
I've had various Nationwide savings and current accounts for donkey's years - in fact I've migrated other accounts to them because they were the most reliable and helpful of all the bunch I've had over the years, Midland, First direct, Halifax, B&B etc etc. I've seen other banks come and go and watched numerous IT disasters with some amusement, but I've no complaint at all over my Nationwide service, (and no, I have no affiliation with them...).
Nationwide: We're carrying out planned maintenance this weekend. From midnight Saturday 5 August until 5am on Sunday 6 August.
For over 3 years Nationwide's service has been excellent. Unlike Barclays, they haven't suddenly closed my account and accused me of laundering money. Unfortunately, I must say I'm slightly alarmed, because it seems the planned maintenance happens every couple of weeks. It's beyond inconvenient, especially for those who are self-employed. I have a personal account with Lloyds and have never once been locked out of online banking in over 4 years (could just be good luck and coincidence, but it has me thinking of switching).
Nationwide Internet Banking and Mobile App banking are both down again today. They're giving no ETA for restoring services. And speaking as a customer, when I just phoned their telephone banking to try and make payments etc., I was told they have no way of doing any banking over the phone. In other words, there's no Plan B other than finding a branch and walking in, which I find slightly astonishing.