Six feet deep and rising
I worked at a Fortune 100 healthcare company in the early 2000s that was on the very edge of the Midwest in the USA, near where they have a very large horse race and some incredibly pure tap water that comes from limestone wells. (That last part will be important.) I was not in IT, but I was highly dependent on many IT systems and had friends in various IT departments, so I heard bits of this story from six different directions. (Some details may be modified slightly to protect the guilty.)
For reasons no one could rationally explain, a data center wound up in a basement. And in highly predictable fashion, two months before the DC was to be relocated to a new facility, an 8" fire line on the 3rd floor blew out. It took hours to get the building's water supply shut off, and by then the DC was literally neck-deep in water. By chance (or more likely laziness), all the UPS and power transfer switches were located on the floor above (I think they cannibalized a loading dock), so there was not a horrific >BANG< or >ZZORZTCH<.
Fire marshals evicted everyone from the building, but as there had not been a >BANG< or a >ZZORZTCH<, they let the power stay on. In Theory this was just to catch the last business day's worth of processing and push it over to the DR facility.
Reality & Murphy grabbed beers while they pointed and laughed at Theory.
There was a migration plan, multiple backups, offsite disaster recovery, and all the business continuity stuff you would expect. What the cynics among you will already expect is that none of it had ever been properly tested, so it was all FUBAR. The backup system was pointed at the DEV servers, the offsite DR copy had been used for a data anonymization project that had accidentally stripped all user identifiers, and a Shakespearean comedy of other errors had transpired to render the whole thing a dumpster fire.
But for some reason, the servers were still online, their lights twinkling merrily under all that water. A frantic data exodus began. Network hacks were bolted on to increase bandwidth, cables were run in through above-ground windows to pull data out over the building LAN to laptops with external drives; name a desperate stunt the fire marshals would allow, and somebody did it.
The magic is that for four days the lights stayed on in that data center, and during those days every bit and byte was streamed out of it. The new facility was booted up and business functions resumed in less than a calendar week.
AFAIK, no heads rolled. All the senior IT staff had seen the potential for doom and had kept plenty of receipts. From the whispers I heard, various VPs had demanded that things like the anonymization project be done on crash timelines, in violation of corporate processes, but had gotten sign-off from the C-suite.