You were told to clean up our systems, not delete 8,000 crucial files

Congratulations on making it through the first week of 2019, and welcome to the first Who, Me? of the year. This is the column in which Reg readers reminisce about the right royal cockups of days gone by, and this week we meet "Sam". Sam was thinking back to 1998, when he had just been hired as tech support at a government …

  1. storner
    FAIL

    Backups

    Sounds like backing up those hard disks was not on Sam's agenda. Considering the reliability of hard disks back then, maybe it should have been.

    1. Pete 2 Silver badge

      Re: Backups

      Yup, first rule of document shredding: photocopy everything before starting.

      1. Yet Another Anonymous coward Silver badge

        Re: Backups

        Stuff you want to keep is on the shared network drive.

        Stuff labelled "temp" on your local drive doesn't get backed up

        Stuff you want to use again is kept in the recycle bin, that's why it's called "recycle"

        1. Evil Auditor Silver badge
          Coffee/keyboard

          Re: Backups

          Stuff you want to use again is kept in the recycle bin, that's why it's called "recycle"

          You just made me draw rather inappropriate attention in an otherwise serious, calm airport lounge.

        2. TonyJ

          Re: Backups

          No word of a lie, I once had the director of a company use the deleted items in Outlook to store important stuff he didn't want others to look at.

          He'd hit on the idea that turning off the delete on exit and storing things in there was a smart thing to do and couldn't be dissuaded.

          1. joewilliamsebs

            Re: Backups

            I've had the exact same thing, which he suddenly remembered the day after approving a GPO to enable emptying the folder on exit for all users.

            Apparently that's the "Things I don't want in the inbox, don't need to file away, but might want to look at again you never know" folder.

            1. Anonymous Coward
              Anonymous Coward

              Re: Backups

              I have had the exact same thing as well. This was the head of legal for a large corp and we used Lotus Notes - she kept everything in her recycle bin like an archive and we found out and told her how bad this was but she wouldn't listen. We were later bought by another company and one of their existing processes cleared out the recycle bin overnight on the server :D 30 thousand docs apparently, all gone. I am sure they had printed versions...

          2. phuzz Silver badge

            Re: Backups

            I had a boss who would delete almost everything, and only leave the most important stuff in his inbox (to be deleted when he'd dealt with it). He was an IT manager and had a strong technical background, and was only dissuaded after I emptied his deleted mails whilst fixing another problem, eventually agreeing that maybe archive folders were what he really needed.

            Yeah Malc, you know I'm talking about you mate ;)

        3. Frank Bitterlich

          Re: Backups

          Stuff you want to use again is kept in the recycle bin, that's why it's called "recycle"

          A long time ago (not too much later than Sam's story) I was doing routine maintenance on a Mac for an office worker of the company I was working for... and that included emptying the trash can (as the Recycle Bin is called in Mac OS). Cue some serious berating about how I dare empty the trash – she was "keeping important files in there"...

          Sounds funny, but apparently some people have trouble understanding what the word "trash" means, and still get beyond flipping burgers in their professional career.

        4. chuBb.

          Re: Backups

          Stuff you want to use again is kept in the recycle bin, that's why it's called "recycle"

          I actually had this happen to me: emptied the recycle bin, and the office ground to a halt as that's where the receptionist stored all her documents to be reused. She was also incensed that, after the same cleaning session, her cursor was "broken" as it was no longer a carrot, and how could she search the internet without all of the search bars in IE....

        5. Anonymous Coward
          Anonymous Coward

          Re: Backups

          I remember back in 2002/3, doing data migration on job-centres. We migrated a user's emails. This was the time when mail accounts were quite small, so they all used pst files to archive.

        One user for some reason filed away a whole load of emails in her bin. Apparently she didn't know how to create folders. At that time we migrated pst files, but always cleaned them up first.

          Oh the horror. She lost all her "important" emails. She started to sling shit, but no one would switch the fan on for her.

      2. Anonymous Coward
        Anonymous Coward

        Re: Backups

        Like the time we were migrating for the prison service.

        In one particular prison, the governor’s secretary was so paranoid that she had never deleted an email. Everything was sorted and filed in the appropriate folder in her pst files.

        And just in case there was ever a power or computer failure and she needed to refer to an email, she had printed out every single email ever sent or received and had them filed away in filing cabinets.

    2. Dimmer Silver badge

      Re: Backups

      Oops, there goes $10 mil in bitcoins.

    3. hoola Silver badge

      Re: Backups

      Backups = less grey hair

      Fairly recently I was replacing an ancient G7 Microserver for a small business. This contained everything: data, accounts, email, AD and so on. However, most of it was backed up. Due to a lack of resources (it being a small business) the best option was to transplant the OS to the new hardware and fix the odds and sods around networking.

      The grand plan:

      Create a disk image backup of the old server

      Build the replacement Microserver at home and on the appointed day take it in.

      Update the disk image then turn off the old server

      Swap the power and networking then restore the images into a HyperV VM as this is easier for future management and expansion.

      Back home I was messing with another Microserver of the same generation, replacing the Smart Array on it (you can see where this is going). Both were accessed through RDP sessions and the base OS was the same.

      Both servers have the same number of disks (not logical volumes) and for some reason I had the SSA open on both. I needed to delete the array on the local server but managed to accomplish this on the server at the remote end.

      Much swearing (even the kids realised something had gone badly wrong). The swap was supposed to happen on the Saturday morning, everyone had gone home at the business and the disk images were with me. Copies were on the local disks in the server but that array was gone. Fortunately the OS was on a separate disk, so at least I could sit and stare at my incompetence.

      A phone call to the boss of the small business to explain the situation and grovel.

      The upside was that I had the old server & my disk images so I could start again and get everything back. Fortunately, apart from email, there were no crucial files or plans needed for the Saturday it was down so the original migration could be completed.

      Finally get everything working ready for Monday so they have only been down for 1 day. A few lost emails from the period it was briefly up but it could have been far worse.

      You can never have enough backups but they are only any use if you test them!

  2. chivo243 Silver badge
    Facepalm

    Recycle Bin

    The secret, save for later folder! People kill me...

    1. Steve Davies 3 Silver badge
      Thumb Up

      Re: Recycle Bin

      Yep the save for later folder until... Windows decides that it is already too full and starts deleting the oldest files to make room for the new ones.

      Yep, this got me a few years ago. Thankfully, my paranoia had made me take a full backup of the system first.

      1. HelpfulJohn

        Re: Recycle Bin

        "Yep the save for later folder until... Windows decides that it is already too full and starts deleting the oldest files to make room for the new ones."

        I have a person to whom I am the resident cheap labour help desk. She uses the Win-ten Desktop as file storage. Windows helpfully moves stuff around and deletes or hides old files that haven't been used for a while.

        I tried folding her files into folders on the Desktop, "PDFs", "Tools", "Bills" and such so now she has those *and* a whole slew of new files.

        She is bright but she just doesn't get the idea of file systems.

        Or backups.

    2. Anonymous Coward
      Anonymous Coward

      Re: Recycle Bin

      [I have seen at least 1 person actually do this. They could not find the "my documents" in windows 10, so renamed the recycle bin... ... ... I still blame Windows 10!]

    3. Montreal Sean

      Re: Recycle Bin

      Years ago I worked as a system admin in a small (120 user) accounting firm.

      We migrated to Windows XP on all workstations, started enforcing quotas on user shares on the server, made a couple of changes to the way Outlook worked - including emptying the bin in Outlook on program close.

      These changes came after months of info emails and warnings about said changes.

      I lost count of how many users screamed the first time they closed Outlook because they lost years of emails.

      1. Vector

        Re: Recycle Bin

        "These changes came after months of info emails and warnings about said changes."

        Ah! That was your mistake.

        You expected users to read!

        1. Prst. V.Jeltz Silver badge

          Re: Recycle Bin

          "You expected users to read!"

          Tru dat.

          "months of emails and warnings ..."

          Pretty naive to think *any* of the users would pay *any* attention to that, at least without some kind of force-feedback system like putting "Please reply to confirm you have read and understood this" at the end. At least then the total lack of replies would mean you know no one's listening to you.

          1. JimboSmith

            Re: Recycle Bin

            One of my ex employers was migrating from Novell to Outlook and they were trying to minimise the pain involved. Therefore staff were told to get the size of their mailbox down to under 2GB. I hadn't realised just how big some people had let their mailboxes get. People in my department had some whoppers, 6+GB, which had to be slimmed down. So I explained to the most technically challenged staff how easy it was to save file attachments to the hard drive or the server. I also explained that if users didn't slim their mailboxes down by the deadline, these wouldn't be transferred to Outlook. Instead an empty mailbox would be created in Outlook and only new emails would fill it. That last one really spurred people on because the majority of the buggers were terrified about losing their emails. It also wasn't true, but it worked like a charm. The head of the project asked me how I'd got my department whipped into shape so quickly. When I told her she laughed out loud.

        2. Anonymous Coward
          Anonymous Coward

          Re: Recycle Bin

          I expect the email sent from IT to the losers went straight to their deleted items!

  3. Mike 125

    xfer

    TMP/temp was always a dumb name: all files are, to varying degrees, temporary. I use a 'transfer' directory: stuff en route to an as-yet-unknown destination. It's otherwise known as 'to_be_sorted'.

    But like temp, it's also never empty...

    1. Terry 6 Silver badge

      Re: xfer

      Yeah, I got one of those.

      1. Ryan 7

        Re: xfer

        They're traditionally named "New Folder (2)", and nested by era.

        1. Prst. V.Jeltz Silver badge

          Re: xfer

          Let's not forget the traditional

          "stuff from old hard drive"

          "stuff from other computer"

    2. VinceH

      Re: xfer

      I also have one of those.

      But worse, whenever I move to a new computer, I decide to 'start again' in how I organise my (local) files. I therefore used to create a directory for the data brought forward from the old system until I could find the time to properly integrate it into the new system. So I have a directory containing data from the old computer, which contains a directory containing data from the old old computer, containing a directory from... you get the picture. :)

      I'm now sensible. They're no longer recursive, and are instead self-contained folders on the NAS. :)

      1. PaulR79

        Re: xfer

        I feel called out reading this comment. "Old laptop" and "old backup" etc have places on my computers. One day I will find a duplicate file utility and sort it out but that day is not today!

        1. herman Silver badge

          Re: xfer

          That dedup utility is called 'hardlink'
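
          (hardlink's exact flags differ between the versions shipped by different distros, so as a rough, hedged sketch of the same idea using only standard tools - list groups of identical files by checksum first, then decide what to link or delete; the path is made up:)

              # Group duplicate files under ~/old_backups by SHA-256 digest.
              find ~/old_backups -type f -print0 \
                | xargs -0 sha256sum \
                | sort \
                | uniq -w64 --all-repeated=separate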

        2. Pascal

          Re: xfer

          I got into the even worse habit of just hooking up the old drive to a new system. My current home desktop has its own fast SSD + big HDD pair, and then it also has the drives from the 2 previous generations hooked up "to be sorted later".

          Yes there are backups of that mess.

          But I don't see myself sorting it out any time soon.

          Maybe next system.

          1. Steve Davies 3 Silver badge

            Re: xfer

            I know how you feel.

            You take copies of 'stuff' and over the years you have a whole stack of HDDs full of mostly junk but some hidden gems.

            I spent 10 days before Crimble sorting my data out while I was housebound with a severely sprained ankle.

            I started with almost 40TB of data. That is now 6TB including 2.3TB of photographs[1]. There were backups of projects that I worked on in the early 1990s... Oh the memories...

            Still some way to go but I now have a pile of empty HDDs. Half of them will go to recycling once I've done a security wipe. That will make it a lot less tempting.

            [1] The new camera that 'her indoors' gave me for Crimble/Birthday will really eat up storage as it has a 48M pixel sensor... Perhaps I should think again about donating those HDDs?

            1. Charles Calthrop

              Re: xfer

              why delete it?

              I'd have created a new HDD with your 'best ofs' but left the rest. Spending ages removing things just so you can throw out an HDD is not something I would do.

            2. timmy 2

              Re: xfer

              Photographs, ay?

              (he asked him "knowingly"?)

              Snap snap, grin grin, wink wink, nudge nudge, say no more?

          2. Terry 6 Silver badge

            Re: xfer

            I kind of do that. My home PC has its own 2TB HDD and a pair of salvaged 500s, of which one was from a previous PC and one salvaged from a PVR. But I do assimilate and organise them. First by moving stuff from the old PC into a partition to free up the rest of the drive, then by reorganising the drives over time. Now one 500 is purely for images of this PC's and my laptop's C: drives. The other is purely for data backups. All run automatically. I likewise have a caddy with an old laptop HDD that I use for an extra external b/u from time to time. (As well as a pair of proper external drives that I keep connected but swap round two or three times a year.) There's stuff I couldn't bear to lose, as well as the stuff I need.

            Currently using a freeware program called "Personal Backup" by Dr. J Rathlev, plus Macrium Reflect.

        3. Doctor Syntax Silver badge

          Re: xfer

          "Old laptop" and "old backup" etc have places on my computers.

          And "From key" - the USB drive on my keyring.

          1. Simon Harris

            Re: xfer

            Only the one 'from key'? - I've got quite a collection of folders with names USB_STICK, SD_CARD, etc. where I need something for external storage/camera use but can't be arsed to sort it out properly, so just dump it to the hard drive 'temporarily' and then wipe the external drive.

            1. Doctor Syntax Silver badge
              Happy

              Re: xfer

              "Only the one 'from key'?"

              It's a big key.

              Camera stuff just goes straight into ~/Pictures/Lumix, ~/Pictures/Nikon or whatever.

        4. elDog

          Re: xfer

          Even more pernicious are those who name a project with a "new" name such as NewApplication. Then when it's time to upgrade yet again, the new name must be NewestApplication, then EvenMoreNew, etc.

          1. ARGO

            Re: xfer

            I once worked in a lab where we were testing a laser. So obviously all the data files were called "Latest" :-/

            1. JJKing
              Facepalm

              Re: xfer

              Was it attached to a shark?

          2. Joe W Silver badge
            FAIL

            Re: xfer

            In my current job people still do this. When I started I asked about version control, development and test systems, things that I have been using for my own projects for... a while, let's use that (getting depressed about my age? Naaah...). Blank faces all around. I have to do some (a lot of...) educating of colleagues. Or leave again.

          3. Antonius_Prime

            Re: xfer

            It's here that Sir PTerry's Contributions to Computing become apparent.

            One, Two, Three, Many. Many-One, Many-Two...

            (Base 16, I know, but easier to remember than 0-F)

            1. Terry 6 Silver badge

              Re: xfer

              Bit older than Sir T, that one. The Hebrew (and I think Arabic) for 4, arba, comes from arbey, which means many.

              Literally counting 1, 2, 3, many back in the depths of prehistory.

      2. Anonymous Coward
        Anonymous Coward

        Re: xfer

        Me too. Mine are called "From old drive", and are about 6 layers deep.

      3. The Oncoming Scorn Silver badge
        Facepalm

        Re: xfer

        On a laptop deployment project I had a user with a nested backup structure like that, going back 10+ years.

        1. swm

          Re: xfer

          I have files going back more than 30 years. Some of them are still useful (for some definition of "useful").

          1. cantankerous swineherd

            Re: xfer

            complete with .com files doomed never to run again.

            1. Criggie

              Re: xfer

              DOSBox and Wine make those old apps breathe again (possibly).

              Then again, some of the best are packaged in your fav distro - Debian has the old Sopwith Camel as a package ready to go with `apt install sopwith`

          2. J.G.Harston Silver badge

            Re: xfer

            The earliest files (of my own) I've got are from 1984-ish. The earliest with a datestamp/embedded date is 1985.

      4. Aitor 1

        Re: xfer

        I still do that.

        I have files from my 386 in my current computer.

      5. jelabarre59

        Re: xfer

        I therefore used to create a directory for the data brought forward from the old system until I could find the time to properly integrate it into the new system. So I have a directory containing data from the old computer, which contains a directory containing data from the old old computer, containing a directory from...

        So it's not just me...

    3. Zarno

      Re: xfer

      In a similar vein to the circular filing cabinet, I use a ramdisk for all the random stuff that I don't care about, but need for between a few days and maybe never.

      It's also set as the download location for things from the browser.

      If it nukes, it nukes, and the ethereal nature and fixed size of 1G max keeps me from having "accidental clutter".

      I also have "NukeAfterUsed" "DeleteMePlease" "DO_NOT_DELETE", yada yada bing bang names for folders.

      1. doublelayer Silver badge

        Re: xfer

        When I create temporary files, I put dates in the file names that are my estimate of the latest date I'll need them. My rule to myself, which I've been pretty good at following, is that if I find one of these, don't know what it is, and the date in the name is more than a week ago, that file gets deleted without my putting any effort into figuring out what it is. So far, that has never been a problem.
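
        (A hedged sketch of automating that rule - not the actual setup described above - assuming bash, GNU date, and scratch files whose names embed the use-by date as YYYY-MM-DD:)

            #!/bin/bash
            # Delete scratch files whose embedded "needed until" date is over a week past.
            cutoff=$(date -d '7 days ago' +%Y%m%d)          # GNU date
            for f in ~/scratch/*; do
                stamp=$(basename "$f" | grep -oE '[0-9]{4}-[0-9]{2}-[0-9]{2}' | head -n1)
                [ -n "$stamp" ] || continue                 # no date in the name: leave it alone
                if [ "${stamp//-/}" -lt "$cutoff" ]; then
                    rm -v -- "$f"
                fi
            done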

        1. Tom Wood

          Re: xfer

          I have a cron job that runs once a day and deletes files from ~/tmp (not /tmp) after 7 days and ~/Downloads after 30 days.

          Stuff of a 'here's some notes I might need this afternoon' nature gets saved to ~/tmp. If it's something genuinely useful then it must be filed properly or hit the bit bucket next week.
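
          A minimal sketch of that kind of crontab, assuming GNU find and that cron sets HOME as usual (retention periods as above, run times arbitrary):

              # m h dom mon dow  command
              30 3 * * *  find "$HOME/tmp"       -type f -mtime +7  -delete
              40 3 * * *  find "$HOME/Downloads" -type f -mtime +30 -delete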

          1. Prst. V.Jeltz Silver badge
            Coat

            Re: xfer

            Well, looking at the above 100 posts....

            You're all a bunch of messy bastards!

  4. Terry 6 Silver badge

    The roots

    So many seeds of modern bad practice in that. Not Sam's (arguably). But asking people (the staff) to run before they can walk is still an issue. Giving staff big complex systems and no training. Clunky, unintuitive software. Failure to invest in proper support, failure to invest in safety procedures.

    Nothing seems to have changed. Except that the challenges have got bigger.

    1. John Brown (no body) Silver badge

      Re: The roots

      "Clunky, unintuitive software."

      As GUI-based desktops became the norm, apps were strongly encouraged if not actually forced to follow the desktop style guides, thus making it relatively easy for most users to get the hang of many new apps, at least at the most basic level. With the switch to web apps running in a browser, the designers have been unleashed once again and it's like the Wild West out there. Everyone has their own ideas about where menus should go, what should be in them and where to place buttons.

      Just today, I've used three separate browser-based apps for work and every one of them has the log out button in a completely different place, to mention but one oddity.

      1. Pascal Monett Silver badge

        That also has to do with the fact that most of the developers that learned those GUI lessons are now retired and the new generation is not aware of the issue.

        We'll need another 20 years of this mayhem before standards are set by habit and the situation returns to something more controlled and easier to cope with.

      2. Michael Wojcik Silver badge

        Re: The roots

        As GUI-based desktops became the norm, apps were strongly encouraged if not actually forced to follow the desktop style guides, thus making it relatively easy for most users to get the hang of many new apps, at least at the most basic level.

        I don't think that helped much, if at all. In my experience, UI standardization mostly provided users with a false sense of confidence and made it more difficult to recognize changes in context, so they'd erroneously try to apply knowledge about one application to another.

        UI standards also constrain innovation. Often that's a good thing - GUI products that depart from the platform's UI standards often have abysmal user interaction models and user experiences. (Antivirus products are usually a good example of this, for some reason; apparently developers of security pablum all think they're great UI designers as well.) But it also means we're still using OSes with foolish "desktop" metaphors like Recycle Bin.

  5. Anonymous Coward Silver badge
    Holmes

    "old days of 320MB IDE drives"

    "Normally I'd clean it all out, empty the Recycle Bin"

    BS

    In the days of 320MB IDE drives it would've likely been Windows 3.11, which didn't have a Recycle Bin. The 'undelete' command saved my donkey several times, but the Recycle Bin came later.

    1. Locky

      W95 had a Recycle Bin, definitely in the 320MB era

      1. DJ Smiley

        Also government systems? It could be as bad as win xp on 320Mb ;)

        1. Ryan 7

          Not in 1998.

    2. Alister

      In the days of 320MB IDE drives it would've likely been Windows 3.11

      What absolute rubbish. Desktop hard drive sizes didn't regularly exceed 1GB until well into the nineties; certainly anything up to Windows 2000 machines would not have had anything more than 500MB.

      1. d3vy

        "certainly anything up to Windows 2000 machines would not have had anything more than 500MB."

        My first windows 95 machine had a 4.3gb Seagate drive,

        That would have been 96/97 so I call BS on that.

        1. BigSLitleP

          Considering I was repairing laptops with smaller than 1GB HDDs with Win95 on them back in 96/97, I call bullshit on your call of bullshit.

          1. theDeathOfRats

            (Standard) laptops still tend to have HDs smaller than desktops'. It was the same in 96/97. I think we used to fit 1.2GB normally back then, but 2.5GB was not unusual (if the customer wanted to pay for it). Though it would have been a bit hard to install a Quantum Bigfoot in a laptop, yes.

          2. Anonymous Coward
            Anonymous Coward

            I call BS on...

            Beware of using the above phrase. It signifies that the author doesn't know what they're talking about, yet purports to know better than someone else who does.

          3. Anonymous Coward
            Anonymous Coward

            Two points:-

            (a) He didn't say it was a laptop drive- as still is the case today, desktop drives were generally of higher capacity than laptop ones. (As I noted elsewhere, the 3.2 GB drive I had in a desktop PC bought in early 1998 was already considered a little on the small side by buying guides at that time).

            (b) He was talking about a *new* machine, whereas if you were *repairing* machines they were more likely to be older. And back then HDD sizes were increasing (and the price-per-gigabyte falling) far faster than they are today, so even 18 months (say) would have made a noticeable difference in that respect.

            Regardless, the fact that some of the laptops you repaired back then had sub-1GB hard drives at that time doesn't mean that was the largest anyone was using.

          4. d3vy

            @bigs

            Who said anything about laptops?

            From the article I got the impression it's a desktop.

            Look up the Seagate Medalist; I think the model was the 4310. It was 4.3GB and was in the first PC I bought (second hand) with my own money in '97/98, so I'm absolutely certain that desktop PCs had moved past sub-1GB drives by the mid 90s.

        2. Dabooka

          I remember the 2Gb file size limit in 95 / ME

          It caused a hell of a nightmare when trying to edit a video for whatever reason in the early 2000s, and was a good excuse to upgrade everything, so certainly some machines would've had higher than 320mb.

          However being in a government organisation 2 or 3 years earlier, I suspect there was a greater chance the hard drive would've been a more modest size though, so 320-500mb sounds feasible.

          1. John Brown (no body) Silver badge

            Re: I remember the 2Gb file size limit in 95 / ME

            "However being in a government organisation 2 or 3 years earlier, I suspect there was a greater chance the hard drive would've been a more modest size though, so 320-500mb sounds feasible."

            Absolutely this! At the start of the 90's, a 40MB HDD was pretty standard. By the end of the 90's, IBM had introduced a 16GB Deskstar HDD and it wasn't unusual for PCs to be expected to last AT LEAST 5 years back then, often much longer.

            Back when 40GB IDE HDDs were still standard items, I was dealing with at least one customer still using original IBM PCs, booting from floppy onto an IPX/SPX network to share files and a laser printer, running WordStar on MS-DOS. When PCs cost over £1000 each and £1000 was a lot of money, PCs were often purchased from the capital budget and were given a depreciation value/duration of a minimum of 5 years.

            1. Criggie

              Re: I remember the 2Gb file size limit in 95 / ME

              I've seen toner cartridges entered into accounts as capital items, because they cost over $250.

              I've even seen fibre cables installed inside walls as capital items, depreciated over 20 years.

              I've even been shat on by the technophobe accountants in 1998, for throwing out a 14.4k modem, because it was ~5 years into a 20 year depreciation cycle and had a book value of ~$1400 remaining. This was in the day when a new 56k modem was about $150 and dialup minutes had a cost, so upgrading to faster made a lot of sense. Thank Christ for a diplomatic boss who saved my job when I was young and knew no better.

              1. katrinab Silver badge

                Re: I remember the 2Gb file size limit in 95 / ME

                Not your fault their depreciation policy was complete nonsense - the idea that it would still have value in 2013 when BT had turned off their dial-up service because nobody was using it.

        3. Anonymous Coward
          Anonymous Coward

          "My first windows 95 machine had a 4.3gb Seagate drive,

          That would have been 96/97 so I call BS on that."

          And was it a corporate build or a custom build/home spec?

          Large corporate rollouts I saw in 2000-2003 with WinXP were on 40-250GB hard drives as data was typically stored centrally. i.e. Dell GX240's

          Even now 160GB/250GB SSD's are common in corporate devices in my experience.

        4. juice

          The household's first PC was a P100 with a whopping 8MB of RAM, running Win95. IIRC, we purchased it in December 1995, and it came with an 850MB hard drive - which the manufacturer (AST?) had kindly stuffed full of low-resolution tutorial videos.

          It's worth bearing in mind that there's been quite a few hardware limits on how big a HDD could be, as detailed here:

          https://www.tldp.org/HOWTO/Large-Disk-HOWTO-4.html

          When Win95 came out, using more than 540MB (528 after formatting) was still problematic - in 1996, I had huge issues getting a 1GB drive working with a hand-me-down 486, and getting anything over 2GB working was a nightmare for a year or two after that.

          1. Dave K

            2GB was the big limit at the time as it's the limit that a FAT16 partition can be. I remember a few Win95 PCs at work (used for video work) that had 4GB hard drives and which were partitioned into 2GB chunks due to the FAT16 limit.

            Later OEM releases of Win95 did come with some support for FAT32, but it was Windows 98 before FAT32 became mainstream.

        5. ShortLegs

          I can relate, but it would have been a SCSI drive. Seagate did not have a 4.3GB (E)IDE drive at that time. They did offer the Barracuda SCSI in 2.1, 4.3 and 9.1GB flavours, the latter two tending to be SCSI UW and half-height... and weighing a ton.

          I remember Computer Weekly (or whatever the trade weekly rag was back then) reporting that Barclays had had an issue with two drives failing in a RAID5 setup, and a "massive" 45GB array was at risk :-)

        6. heyrick Silver badge

          My first windows 95 machine had a 4.3gb Seagate drive,

          My first W95 machine had a 520MB harddisc and a big pile of bull to trick the BIOS into understanding the disc (how many heads?!?!).

          My Acorn A5000, on the other hand, had a whole gigabyte in two partitions.

        7. Anonymous Coward
          Anonymous Coward

          Yes, that sounds absolutely plausible.

          I bought my first PC in April 1998. It was an utterly unremarkable upper-budget/lower-midrange model that came with a 3.2 GB HDD, and I still remember even *that* was considered a bit on the small side compared to the recommended 4.3 GB. As with your machine, it came with Windows 95 (#).

          A year later, I wanted to play around with Linux, and ended up buying an 8 GB second hard drive- and even *that* wasn't especially big by that point and IIRC cost circa £110. (Hard drive prices/capacity were falling/increasing so fast during the 90s and early 2000s that even a year back then could make a noticeable difference).

          Anyway... all this was before Windows 2000 came out in late 1999, so the suggestion that we were all running 500MB drives at most back then is- as you suggest- nonsense.

          (#) Albeit one of the later OEM-only versions that was in practice far closer to Windows 98 (which came out a couple of months later).

        8. David Neil

          Worked at a large insurance company in 98/99

          >1Gb HDD on desktop was not uncommon

      2. bpfh

        At IBM in 1998...

        When a 1994-era 540MB drive failed it got a direct upgrade to 2GB, as there were no smaller FRU HDDs available as replacement stock, and we had 4GB Aptivas floating around being deployed new.

      3. Anonymous Coward
        Anonymous Coward

        For a project at work - yes, I was a generous soul - I requested a PC which had 2x 540 MB drives (from poor memory, and don't ask what else there was in terms of RAM, but it cost over a grand, and that was deducted from my salary over the course of the year - while I was using it at work and yet buying it for my own use!)

        Anyway, that was late 98 or into 99 and probably using Win 98. (As an aside, I remember the support guy at Microsoft who sounded like his jaw had dropped when I told him I was processing 1 million records in MS Access... and boy was it slow when I had to 'compress' to remove the deleted records, going from drive to drive so the disks were not being thrashed as much as they would have been with both source and destination on a single larger-capacity drive.)

        1. juice

          I suspect you're thinking of 1989, not 1998 ;)

          By 1998/1999, even budget HDDs were in the GB range.

          In fact, someone with far more time on their hands than I have did some research (https://www.jcmit.net/diskprice.htm), and the cheapest 3.5" drive they could find in 1999 was a 6GB unit at $160 (which probably would have been around £160, given the usual 1:1 exchange rate for technology pricing).

          (I would have done a bit more of a dig on archive.org, but their year filter seems a bit b0rked...)

          So by 1999 the manufacturers would have long since stopped making 540MB drives, though there were probably still some floating around the channels...

      4. ShortLegs

        Not true. The 528MB limit was encountered in the early-to-mid 90s. 850MB IDE disks were available in 1994/5, 1.2GB in 1996, whilst SCSI disks were available up to 9.1GB.

      5. commonsense

        "certainly anything up to Windows 2000 machines would not have had anything more than 500MB."

        That's not really true. 16GB hard disks were available in 1997. Windows 2000's system requirements were "2GB, with 650MB free space". The GB era was very much in full flow.

        The idea of 320MB drives and Windows 95 is perfectly cromulent though.

      6. Dave K

        Sorry, my father's Gateway 2000 PC bought in early 97 had a 2GB EIDE drive and ran Windows 95. I believe he did pay extra for it, the standard drive at the time was nearer 1.2GB I seem to recall.

        Certainly I remember my Windows 98 PC at the end of the 90s having a 6.4GB EIDE drive. To suggest that Windows 2000 PCs wouldn't have had more than 500MB, well you're several years out I'm afraid.

        There's a Wikimedia chart showing capacities over the years here: https://upload.wikimedia.org/wikipedia/commons/a/a1/Hard_drive_capacity_over_time.png

      7. Tomato Krill

        Our Dell machine was bought around 98, and had a 2GB hard disk.

        And cost over 2k...

        1. Anonymous Coward
          Anonymous Coward

          My first computer (mine, not family's) was in 1998, and had a 6GB drive. I selected that machine partly because it was CHEAP.

      8. Stevie

        Bah!

        My gov't-issued NT4 workstation had a 520 MB hard drive in 1999, but everyone in a position to do so was going larger than that if they could. Can't remember what was "du jour" standard in sane workplaces, but I'd guess at a gig on account of it being twice what I had.

        I remember having to admit this to a nice lady at Sams publishing when she asked why I didn't solve a problem with a "book on CD*" by copying it to my hard drive.

        *Unix Unleashed, System Administrators Edition. Life and career saving book. Can't praise highly enough.

      9. katrinab Silver badge

        My first computer, in 1996, had a 1.2GB hard drive.

      10. FIA Silver badge

        What absolute rubbish. Desktop hard drive sizes didn't regularly exceed 1GB until well into the nineties; certainly anything up to Windows 2000 machines would not have had anything more than 500MB.

        A fresh install of Windows 2000 is around 555MB.

        1. Dave K

          It's amazing how someone is going around down-voting all of these comments without thinking about it.

          FWIW, the *minimum requirements* for Windows 2000 according to MS are a 1GB hard drive with 5GB being recommended.

          Hence any machines running Windows 2000 with a 500MB drive would be below MS's minimum requirements and hence a very old and unlikely scenario.

    3. TRT

      I saw True Lies over the weekend. Arnie gatecrashes a party at a mansion and plugs a relay dongle into the PC running an Arabic version of Windows 3.11

      I'd forgotten how old that film must be!

      1. Afernie

        Yeah, I'd forgotten about that as well. Amusingly, the files are copied via modem in the space of seconds and our hero's techie sidekick declares that decrypting the bad guy's files will take "a few minutes". Some things never change, and Hollywood's depiction of magical technology is one of them.

        1. TRT

          Like Jurassic Park's use of SGI's IRIX button fly interface and a 10 year old kid gleefully rubbing their hands exclaiming "This is a Unix system. I know this. It's like a phone book - it tells you everything."

          1. smot

            ...or the Atari Portfolio used in Terminator 2 to open a door security lock....

          2. whitepines

            I always got a chuckle out of that scene -- many Unix systems will in fact tell you how to use them, IFF you know the command names already (man, -h). However what was being said did not, as you say, match the action on the screen.

            Who else got a kick out of the quite real Connection machine in the background, sitting doing nothing but blinking its lights?

            1. Tim99 Silver badge

              BSD apropos and whatis - Bill Joy - 1979.

      2. Dabooka
        Happy

        1994?

        Yes I know I could check IMDB but where's the fun in that?!

        First movie I saw with my (future) missus.

      3. heyrick Silver badge

        I'd forgotten how old that film must be!

        Don't forget the Epic Gushing over tech specs in the film "Hackers"... it's insanely great that it comes equipped with a 28k8 modem! And a PCI bus! (which, if I recall, was somehow connected to RISC taking over the world <shrugs>).

        Of course, the biggest WTF was all these people going into phone cabins. Like, these little public glass and metal boxes...with really big telephones inside! These days they're about as historical a rarity as blue police boxes.

    4. ShortLegs

      Or not. DOS/BIOS used CHS addressing back then, and the max limit was 1024 cylinders, 16 heads, 63 sectors per track, giving a maximum disk size of 528MB. By 1993/4 this had become an issue, as consumer hard disks were available that approached this size; often a new hard disk would be sold with a floppy containing a manufacturer-supplied driver to work around the issue, e.g. SeaTools, DriveManager.

      That said, IIRC Windows 95 was never supplied via DGITS/CCTA; the supported OSes were Windows 3.11 and WinNT 3/4.

  6. ArrZarr Silver badge
    Unhappy

    Don't recall ever deleting anything, I do recall accidentally moving a site in an FTP with Filezilla (which will happily proceed with questionable instructions without blinking) into a subfolder, taking it offline for a few hours...

  7. Chairman of the Bored

    A file so large...

    Must be very useful...

    But now it is gone.

    Gummnit job. Constraints: thou shalt retain copies of thy emails. Copies thereof shalt thou keep. Both sent and received shalt thou keep. Outlook .pst files shall be thy instrument of archiving.

    And lo! Despite mass storage being too cheap to meter, thy great organization shall set a limit of 512MB for each .pst. File size warning? We know nothing thereof, and shall not configure.

    So it came to pass - like a good little boy I dutifully copied all of my emails into their .pst bins. One overflowed without warning and became utterly corrupted. The IT guy attempted to fix it, and the results were ... disappointing. Claims that the network share upon which the .pst files reside was backed up proved to be as hollow as the archive files generated from them.

    A couple of days later I was asked to produce some emails on demand per a FOIA request. THAT did not go well.

    1. defiler

      Re: A file so large...

      Bear in mind that back in Days of Yore, a PST was limited to 2GB, and would throw a bit of a strop when it hit that wall, in much the same way as a car will throw a big strop when it races into a wall... I suspect the IT folks were trying to shield themselves from that kind of horror, without realising that they'd made it 1.5GB more likely.

      1. Snapper

        Re: A file so large...

        What is it with Microsoft and their file systems? I was supporting quite a few clients in the 90s and early 2000s that ran Outlook for Mac (until macOS 10.3 in 2003 Apple didn't have their own viable email client). Even though the Mac files were stored in a non-PST format library there was still a 2GB limit. As soon as you hit 2GB the whole library was toast.

        1. Killfalcon

          Re: A file so large...

          "Sure the address space could be higher, but no one will ever need two freaking gigs for emails: it's basically plain text, FFS."

          I think my personal favourite example of "no-one will ever need that much" coding was in Deluxe Paint back on Ye Olde Amiga. If the RAM used by the picture you were working on exceeded 512KB, it threw an out-of-memory warning, because, well... when they wrote it, no-one had more than that, nor did anyone sell more than that, so they set an upper bound on the test.

          However, it was just a warning (not an error or a crash), so if you'd gotten one of the 1MB expansions Commodore later introduced, the program would carry on happily after, letting you do all the fancy gradient fills you could want.

          1. Anonymous Coward
            Anonymous Coward

            Re: A file so large...

            Sod the Amiga extension, just cut the relevant tracks, remove the old soldered on RAM chips and solder 1MB of faster ones on the mobo for 1MB VRAM; that is what I did.

            Or are you talking pre-A500 Amiga? Cos the A500 add-on 20MB HDD had space for 2MB of extra RAM, giving you a MASSIVE 3MB of RAM.

            (HDD later upgraded to a HUMONGUS 200MB)

            1. Killfalcon

              Re: A file so large...

              I'm fairly sure the code was written pre-A500, but my folks had the A500+, and later this chunky bolt-on wedge-thing that pushed it to (IIRC) 2MB.

              Some of the details elude me, I was maybe 12 at the time...

        2. btsfh

          Re: A file so large...

          At least the 2G file size limit makes sense in the context of 32-bit compilers of the era. Even Unix variants of the era using mbox format rather than maildir tended to have a ~2G mailbox limit. This has caused me to lose email in the (increasingly distant) past. Now it goes in a tool using maildir whose only limitation is partition size, and is lost because I never remember a unique enough portion to ever find it again. :)

          1. heyrick Silver badge

            Re: A file so large...

            "At least the 2G file size limit makes sense in the context of 32-bit compilers of the era."

            No, 32 bit can happily count up to 4GB. The problem is that C's concept of error checking is rubbish, so file operations need to use signed variables (and throw away a potential 2GB) so that they can return a value of -1 if they need to. That, coupled with the number of times I've seen software that utterly mixes up signed and unsigned... It's not a surprise, but it's not the fault of 32 bit.

        3. This post has been deleted by its author

      2. Anonymous Coward Silver badge
        Facepalm

        Re: A file so large...

        A very old version of PSP would claim that a disk was 'read only' if it had more than 1TB free space and therefore wouldn't save to it (details are fuzzy, but that's the gist)

        'dd if=/dev/random' to create some filler files on the new 2TB "future proof" drive saved me trying to explain that to the client. Yes, I excluded them from the backup routine too.
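
        (For anyone curious, a filler file of that sort can be knocked up with something like the below - size and path purely illustrative, and /dev/urandom used here because /dev/random can block while it waits for entropy:)

            # Write ~100GB of incompressible padding onto the drive.
            dd if=/dev/urandom of=/mnt/newdrive/filler01.bin bs=1M count=102400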

    2. tim 13

      Re: A file so large...

      For a FoI request, if you haven't got it, you don't need to provide it

    3. Norman Nescio

      PST files on a network? - NOTWORK

      Um, I hate to say this, but even Microsoft have made it clear that accessing PST files within Outlook across a network is an unsupported configuration.

      Microsoft Technet Blogs: Network Stored PST files … don’t do it!

      and

      Microsoft Knowledge Base: KB 297019: Limits to using personal folders (.pst) files over LAN and WAN links

      You don't explicitly say that is what you were doing, but your posting is a bit suggestive:

      Outlook .pst files shall be thy instrument of archiving.

      ...

      Claims that the network share upon which the .pst files reside was backed up proved to be as hollow as the archive files generated from them.

      Of course, if you copied the mail to an Archive PST locally, then shut down Outlook, then copied the closed PST file to the network share, and did not try to access it using Outlook across the network, it would probably (this is Microsoft, after all) be OK. I used to do precisely that, until Corporate IT decided our network share data was getting too big*, and it became allowable to store PSTs locally on IT-issued USB connected encrypted portable disk drives.

      *We were allowed only a couple of hundred megabytes of Outlook mailbox each. There was also an email retention policy enforced that acted automatically on mailboxes deleting emails older than a certain age. I successfully pointed out that I needed certain of my emails to be retained for at least the length of customer contracts (which could be multi-year), ending up needing to juggle multiple USB disks (for backup) containing some** contractually useful emails.

      **Read 'a lot of'.

      1. Chairman of the Bored

        Re: PST files on a network? - NOTWORK

        @Norman,

        "Um, I hate to say this, but even Microsoft have made it clear that accessing PST files within Outlook across a network is an unsupported configuration."

        You are absolutely, totally correct. However, my organization was not entirely competent. Calling the admin "tossers" is probably a bridge too far though as I doubt they could achieve and sustain, let alone produce output.

        We were expressly prohibited from keeping any email .pst on local storage - regardless of Microsoft's direction - because there was no backup solution provided for local machines. And in any case the local storage was barely sufficient for Windows Vista (shudder). More storage? Not unless you are a senior executive. Backup critical data? If you're not a senior exec, it's not critical. Beancounters.

        That's called a stupid sandwich. For extra sauce... we were prohibited from putting "technical" data on the network shares, since our IT contractor owned and operated the net, not the Government. So by definition we had no backups of whatever technical data we could somehow produce on that POS of a system.

        I do not miss that job.

        1. Is It Me

          Re: PST files on a network? - NOTWORK

          I had similar fun with PST files in a previous job; there was one member of staff who worked on 2 computers and constantly complained that he couldn't access his archive - no matter how many times we told him he had to close Outlook on one computer before opening Outlook on the other one.

          Then there was the Microsoft backup PST tool that would copy the PST file to a network share each time you closed Outlook. Great if people let it finish running before shutting down the computer.

          The next problem came when Outlook was upgraded past 2003 and the tool stopped working.

          Not long after that we moved to a hosted Exchange service with MUCH larger mailboxes and pretty much did away with PST archives.

  8. imanidiot Silver badge
    Joke

    TMP=temporary? I hope not!

    TMP stands for Turbo Molecular Pump, clearly. Anybody who says otherwise is wrong. Though a hunk of metal that size spinning at 30.000 to 60.000 RPM isn't exactly a permanent fixture either...

    1. druck Silver badge

      Re: TMP=temporary? I hope not!

      Either you are specifying the RPM to 3 decimal places, or you are using a decimal point where you should use a comma.

      1. Androgynous Cupboard Silver badge

        Re: TMP=temporary? I hope not!

        Far beyond the seas, there are strange countries filled with wild men that use decimal points where you would use a comma, and vice versa, when formatting numbers.

        1. imanidiot Silver badge

          Re: TMP=temporary? I hope not!

          I'm just used to the . as the thousand separator and the comma as the decimal sign. Can't help it, that's what I've been seeing most of my life in my region (western europe)

        2. druck Silver badge

          Re: TMP=temporary? I hope not!

          In case you had not noticed, this is an English language website, so correct use of the decimal point is required.

          1. imanidiot Silver badge
            Trollface

            Re: TMP=temporary? I hope not!

            In case you hadn't noticed, I don't care.

          2. Androgynous Cupboard Silver badge

            Re: TMP=temporary? I hope not!

            And if you continue to misuse it the punishment is 20,00 lines!

    2. dfsmith

      Re: TMP=temporary? I hope not!

      I named my (long-running) temperature logging data "temp.log". After nearly deleting it years later, I renamed it "degC.log". Don't get me started on template files.

  9. Vanir

    TMP =

    Terrible Management Practices?

    The solution as described seems to reflect an Agile 'value': Responding to change over following a plan.

    Which begs the question: what was the plan? Assuming there was one.

    Perhaps all this was a test to drive development of managerial competence.

  10. Norman Nescio

    Users do what works for them

    In the absence of clear rules regarding the use of files with a .TMP extension, or placed in a TMP directory (or tmp folder*), I wouldn't be too quick to blame the user.

    That said, I have experienced pathological user behaviour using Microsoft Outlook, where one user used the 'Deleted Items' folder as their archive. Their client was set up to NOT empty the 'Deleted Items' folder on program exit, and the folder held several years of previous emails that they wanted to keep. The user's workflow was sensible from their point of view: they read an incoming email, did whatever was necessary and 'deleted' the email, recovering it from the 'Deleted Items' folder if it needed to be referenced again at a later date. I had been called in because the version of Outlook being used had a size limit on the mailbox storage file, and was having problems.

    Some user education followed.

    NN

    *If you look at the Linux Filesystem Hierarchy Standard, the behaviour of a tmp directory varies according to where in the hierarchy it is found.

    3.17. /tmp : Temporary files

    3.17.1. Purpose

    The /tmp directory must be made available for programs that require temporary files.

    Programs must not assume that any files or directories in /tmp are preserved between invocations of the program.

    Rationale

    IEEE standard P1003.2 (POSIX, part 2) makes requirements that are similar to the above section.

    Although data stored in /tmp may be deleted in a site-specific manner, it is recommended that files and directories located in /tmp be deleted whenever the system is booted.

    FHS added this recommendation on the basis of historical precedent and common practice, but did not make it a requirement because system administration is not within the scope of this standard.

    ...

    5.2. Requirements

    ...

    The following directories, or symbolic links to directories, are required in /var.

    Directory Description

    [/var/]tmp Temporary files preserved between system reboots

    Of course, while the standard quoted is relatively sensible, someone doing a 'man hier' can get different answers:

    /tmp This directory contains temporary files which may be deleted with no notice, such as by a regular job or at system boot up.

    /var/tmp Like /tmp, this directory holds temporary files stored for an unspecified duration.

    Although the man pages do link to the Filesystem Hierarchy Standard, it would have to be a particularly dedicated user that chased down the exact rules, and even then, assuming that developers have been just as dedicated and have implemented things correctly can be dangerous.

    1. Anonymous Coward
      Anonymous Coward

      Re: Users do what works for them

      Users using the deleted items 'folder' as a 'message dealt with' folder is quite common. I've seen it many times.

    2. Roland6 Silver badge

      Re: Users do what works for them

      >using Microsoft Outlook

      I found it useful to ensure that the folders MS Office uses for temp/recovery files, typically in one or other Windows temp directory, are omitted from any automatic disk/temp-file housekeeping. It has enabled me several times to recover files for users.

    3. Paul Crawford Silver badge

      Re: Linux Filesystem Hierarchy Standard

      Behind that is the possible case that /tmp is a ramdrive and small, while /var/tmp is expected to be on non-volatile storage and much larger. In the ramdrive case a reboot will inevitably wipe the directory even if the OS has no explicit step to do so.

      Debian-based systems like Ubuntu wipe /tmp on reboot only, whereas RedHat-based systems typically delete from /tmp via a cron job based on the last access time being a week or two ago.
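
      The RedHat-style clean-up is roughly what tmpwatch (and later systemd-tmpfiles) automates; a rough hand-rolled equivalent, access-time based with an arbitrary ten-day threshold, would be something like:

          # Run from root's crontab: prune /tmp files not accessed in over 10 days,
          # then remove any directories left empty. -xdev stops it crossing mounts.
          find /tmp -xdev -type f -atime +10 -delete
          find /tmp -xdev -mindepth 1 -type d -empty -delete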

    4. sebbb

      Re: Users do what works for them

      Oh, I tried so hard to change people's minds in my old IT job in the NHS, you really get the best: PSTs on network drives, sized about 10-12GB each, people systematically "deleting" e-mails once they needed to be archived, e-mails with gross, giant .doc and .xls monsters... and then the best part: SMB2 for remote access over VPN. I don't miss that job.

    5. Paul Hovnanian Silver badge

      Re: Users do what works for them

      Some years ago, I worked at an outfit supporting a suite of applications hosted on some HP-UX servers. One day, we started to get calls from the factory that the critical apps were down and they had stopped building airplanes. Upon logging into the system, I found that the /tmp subdirectory had been deleted. Calling the sysadmin, I found that the IT department had recently hired a new manager. One not familiar with the systems we used. He took it upon himself to start poking around and asking questions. One being, "What's this /tmp subdirectory for?" When informed* that it was used for storing "junk" he ordered that said subdirectory be deleted immediately. "We don't store junk on our production servers!" Being a real type A manager (I can guess what the A really stood for) nobody dared disobey his orders.

      I managed to resurrect the stuff I was responsible for by setting my TMPDIR environment variable to another path. But it did take several days of intermittent system operation before admins could go over this guy's head and straighten the whole /tmp fiasco out.

      *Whoever supplied this incorrect description is really the one who deserves the flogging. One has to be very careful when briefing a PHB or the error may propagate out of your control.
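
      The TMPDIR escape hatch works because most temp-file libraries consult the environment before falling back to /tmp; Python's tempfile module, for one, documents the lookup order as TMPDIR, then TEMP, then TMP, then the platform defaults. A quick illustration, with a made-up ~/scratch path:

        import os, tempfile

        print(tempfile.gettempdir())        # usually /tmp on Unix

        # Point future temp files somewhere safer (example path).
        safe_dir = os.path.expanduser("~/scratch")
        os.makedirs(safe_dir, exist_ok=True)
        os.environ["TMPDIR"] = safe_dir
        tempfile.tempdir = None             # drop the cached default
        print(tempfile.gettempdir())        # now ~/scratch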

  11. Anonymous Coward
    Anonymous Coward

    At one job a production database was being run out of /tmp. Because the non-English-speaking dev didn’t realise it meant temporary, and it was the only directory he had write access to...

    1. Tom 38

      That's crazy! Everyone knows the DBs should live in /var/tmp

  12. Groaning Ninny

    Storing things in the Recycle Bin.

    First things first, this isn't a "friend of a friend", it's about something I did. This is real, despite the fact it's too stupid to be true.

    As the main sysadmin I was regularly called on to help out with all the trivial stuff - clearing up disk space, removing unintended programs and so on. One chap reported he was running low on disk space on his laptop, so I did the normal thing of removing the NT Service Pack backups, large temporary files and so on, and emptied the Recycle Bin. Nothing really all that radical, only it turns out he stored documents in the Recycle Bin because it saved space.

    1. Snapper

      Re: Storing things in the Recycle Bin.

      Been there, done that, eaten the T-shirt!

  13. FuzzyWuzzys
    Facepalm

    Then you had the opposite: the paranoid user who maintains 58 copies of every one of 5,000 documents in very carefully organised file structures and dirs. Their organisational skill in maintaining the structure manually on disk is a work of pure art. Well, that is until the day the HD heads crash and you ask if they ever did backups to something other than the HD... ah, now we have a problem...

  14. defiler

    Like leaving documents on top of the bin

    My dad did that once with a whole bunch of weekly progress reports for a petrochemical construction project, as it was the only available surface. He wasn't best pleased when the cleaners disposed of them... Luckily you could look out of the window to see the progress, and it wasn't something signed off by the client. But then he's managed to be a fairly lucky chap when it's counted. Could fall into the Clyde and come out with a salmon...

    1. GlenP Silver badge

      Re: Like leaving documents on top of the bin

      Back in my first, Civil Service, job any document in or on the bin that hadn't been torn through would be returned to your desk. Sensible policy (although, judging by some news reports, it didn't seem to be followed by more senior people!)

    2. J.G.Harston Silver badge

      Re: Like leaving documents on top of the bin

      A friend of mine moved house, and put loads of belongings in BIN bags. While he was doing some shuffling, the BIN men came and disposed of the filled BIN bags for him.

  15. d3vy

    I was cleaning up files on an old web server many years ago and came across an ASP file.

    Rather than opening it and reading the contents, I thought "I'll just run it, it's in wwwroot, what could go wrong?", cracked open a browser and typed the URL.

    After a surprisingly long time I got the message "complete": nothing else, just that text on a plain white page.

    Turns out I'd stumbled on a script that someone had knocked up during development, and forgotten to delete, which initialised the holiday system... Of course, being ASP, this used DSNs to connect to the database, and being on a live box that meant live data.

    21k employees' holiday records for 10+ years, wiped out.

    Luckily the DBAs were quite accommodating and we sorted it within the hour... But that taught me a very important lesson!

    1. Anonymous Coward
      Anonymous Coward

      Presumably if you found an unlabelled jar in the garage, you would take a spoonful of it to taste?

      1. Korev Silver badge
        Joke

        Well luckily, people don't use Java outside of the server room much any more....

      2. d3vy

        @anon.

        Definitely, a jar of mystery delights, delicious.

        In my defense, it was the end of a long day, I'd inspected and removed hundreds if not thousands of shite files and I just wasn't thinking... Live and learn.

  16. Tom 7

    Thems my initials thems is.

    And were used as a file extension to indicate ownership on our VAX. And of course they were also used to denote temporary files, so I used to get all sorts of shit when the disks filled up and the system manager wasn't around. People would come to me demanding I delete my files so their jobs could run. When I was in the right mood I could, of course, work out which .TMP files had been generated by their jobs and delete those, so their jobs would no longer run.

    1. Dunstan Vavasour

      Re: Thems my initials thems is.

      For the first 20 years of my life it didn't matter that my initials are DEV.

      1. Doctor Syntax Silver badge

        Re: Thems my initials thems is.

        "my initials are DEV"

        Could be worse: DEVOPS.

  17. Anonymous Coward
    Anonymous Coward

    deleting to make space, one more story

    "Have you ever deleted something crucial? Did you manage to get it back?"

    As a matter of fact, answers are, for me: yes, no ;)

    This was a quite big OpenMail server (then an HP product). We had severe issues with space occupancy going through the roof, despite using quotas on every mailbox.

    It turned out quotas didn't apply to the inbox, only to folders, in order to avoid bouncing emails to customers. 1200 users on this system.

    The issue was aggravated by the fact that many people switched from the proprietary email client, which used native objects in the mailstore, to IMAP clients, which would cause duplication of every single email from native to IMAP format (yes, OpenMail was very backward).

    One day, out of despair, after seeing some mail reports and the fact that many users would leave emails to rot forever in their inbox, I decided to set up a script removing every email older than one year from the inbox of every user.

    It did well for space occupancy and, as I expected, no one complained. Except this dude who was storing everything in his inbox (no folders). Turned out I had totally smashed his archives, lol. He was quite upset, but since we could only recover a full mail store from backups, not a single mailbox, he had to do ...

  18. Anonymous Coward
    Anonymous Coward

    A "proper" OS...

    A "proper" OS... allows truely "temporary" files to be created that are automatically deleted when the last reference is closed (or the system is restarted). Does Windows support this?

    1. Anonymous Coward
      Anonymous Coward

      Re: A "proper" OS...

      by your implication, unix doesn't need any /tmp directories...

      1. Anonymous Coward
        Anonymous Coward

        "by your implication, unix doesn't need any /tmp directories..."

        Who said that?

        1. Anonymous Coward
          Anonymous Coward

          Re: "by your implication, unix doesn't need any /tmp directories..."

          You did.

          The article is about problems with the storage of temporary files. You said that "proper" systems can automatically delete a temporary file on final close, implying that the problem wouldn't exist on these "proper" systems.

          Well, unix is one of those "proper" systems, so why does unix require temporary directories to hold temporary files? ... unless your whole point is completely irrelevant to the story.

          But hey, I didn't downvote you, so at least 2 others feel the same way!

          1. really_adf

            Re: "by your implication, unix doesn't need any /tmp directories..."

            why does unix require temporary directories to hold temporary files?

            Apart from supporting cases where persistence beyond the last reference closing is desirable, it lets root control where the data can be (temporarily) stored, through mountpoints and directory permissions (you must be able to create a file somewhere).

            1. Anonymous Coward
              Anonymous Coward

              Re: "by your implication, unix doesn't need any /tmp directories..."

              Exactly... I was asking the question based on the previous poster's argument.
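
            The Unix answer to the original question, for completeness, is the old create-then-unlink idiom: the directory still decides where the bytes live (mountpoint, quota, permissions), but the file itself disappears the moment the last reference is closed. A minimal Python sketch:

                import os, tempfile

                # Create a real file under /tmp, then drop its name straight away.
                fd, path = tempfile.mkstemp(dir="/tmp")
                os.unlink(path)                  # name gone; the inode lives on

                os.write(fd, b"scratch data")    # still usable via the descriptor
                os.lseek(fd, 0, os.SEEK_SET)
                print(os.read(fd, 1024))         # b'scratch data'
                os.close(fd)                     # last reference closed: space reclaimed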

    2. swm

      Re: A "proper" OS...

      The Dartmouth time sharing system II had "catalogued" files (and directories) and "scratch" files (and directories). The former were in the directory hierarchy while the latter only lived while the process was active. You could move a file from one state to the other. Once we wanted to give a copy of the system to someone so we "uncatalogued" the top of the student file hierarchy and ran a logical dump of the root of the file system. When we were done, just for fun, we dropped the student file hierarchy and watched the system cleanup and reclaim space. After an hour we killed the system and rebooted from backups.

    3. Ken Hagan Gold badge

      Re: A "proper" OS...

      "Does Windows support this?"

      Yes. Use FILE_FLAG_DELETE_ON_CLOSE (writing from memory) when you create the file.

      Supported since NT 3.1, probably because NT was designed to be a superset of both POSIX and VMS (and probably also OS/2 and DOS) and because Dave Cutler knew his shit.
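
      Without writing any Win32, the same behaviour can be poked at from Python: os.O_TEMPORARY (Windows-only) requests delete-on-close from the C runtime, which as far as I know rides on the flag named above, and tempfile.TemporaryFile is documented to be destroyed as soon as it is closed on every platform. A rough sketch, with an illustrative filename:

        import os, sys, tempfile

        if sys.platform == "win32":
            # Windows-only: ask the CRT for delete-on-close behaviour.
            path = os.path.join(tempfile.gettempdir(), "scratch.bin")
            fd = os.open(path, os.O_RDWR | os.O_CREAT | os.O_TEMPORARY |
                         getattr(os, "O_BINARY", 0))
            os.write(fd, b"gone when closed")
            os.close(fd)
            print(os.path.exists(path))      # False: file vanished with the handle
        else:
            # Portable equivalent: destroyed as soon as it is closed.
            with tempfile.TemporaryFile() as f:
                f.write(b"gone when closed")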

    4. Roland6 Silver badge

      Re: A "proper" OS...

      I think a "proper" OS would facilitate the use of a scratch partition, so that truely "temporary" files are held separately to the normal filesystem. One of the big complaints with Windows is that it didn't facilitate sensible disk partitioning and usage...

      Whilst this could be forgiven in early editions, for versions derived from NT and particularly server versions this omission is unforgivable.

  19. jms222

    ZFS and rotating snapshots

    No substitute for backups but rotating snapshots are great for this sort of thing and various kinds of finger trouble.

    I should know better, but I once wrote some production code that carried on using something in /tmp even after it went into live use.

    1. Jamie Jones Silver badge

      Re: ZFS and rotating snapshots

      I live by snapshots (though mainly on UFS, not ZFS).

      Amongst other things, I always do a diff of a file I've been editing to make sure I've not accidentally made typos, or injected some weird characters.

      No longer do rogue problems crop up that can be traced to a spurious ^Z in the source code!

  20. Admiral Grace Hopper

    A tale as old as time

    Way back in Jurassic times, developing on mainframes, a smart girl *coff* asked if our relatively new environment, which had been set up for a new project, was included in the schedules that backed everything up to tape*. We checked, it wasn't, so our semi-technical team leader raised the paperwork, then said he would work late to "tidy things up" before the first backup, as we had been so productive that the first backup was likely to be quite large.

    You can guess how this went, can't you?

    Fortunately, we had enough compilation listings to allow us to re-type it all back in within a week.

    * Proper tapes, with Joe 90 reels, vacuum feeds, reel clips that broke when hit by footballs (another story for another time), with tape accessible to grumpy tape ops who were not above giving it a tweak to stretch it and render it unreadable 2 hours into the batch run if you'd pissed them off.

    1. This post has been deleted by its author

  21. irrelevant

    rm -rf /

    Not me, thankfully, but I got a panicked late-night call from one of my colleagues who, at that time, was stationed at a major customer. One urgent dash down to Stoke later, I found the doors unlocked, the alarm off, and not a soul in sight. My colleague arrived soon after, as he had slightly further to travel.

    It seemed their on-site IT tech had attempted a late-night restore of a backup, and wanted to clear out the relevant folder first. But he issued the rm in root... He called it in, but had bolted by the time we got there. I don't think he was ever seen again.

    We managed to recover the system; he had done the usual backups first, thankfully. But had we not, it would probably have taken down the company. Then known as Midland Cellular, it went on to be better known as Phones4u. Now, with hindsight....

    1. Doctor Syntax Silver badge

      Re: rm -rf /

      "Then known as Midland Cellular, it went on to be better known as Phones4u."

      That might explain something. I had a 4 week gig in Phones4u days as a 2 week holiday cover plus a week either side. ISTR it took just about all that time to do the paperwork to get some disk (probably 2GB) allocated under LVM by the admin team, plus permission so that we on the database team could add it as another chunk to the Informix database. It did strike me as a bit over-cautious.

      1. Joe W Silver badge

        Re: rm -rf /

        The classic is trying to remove the hidden files and directories by doing

        rm -rf .*

        as root. (The catch being that .* also matches .. , so an older rm would cheerfully recurse into the parent directory and everything above it; newer implementations refuse, but not everyone was running those.)

  22. Baldrickk

    Nuked Network Drive

    Was called in to help another team with some build issues / updates to do with changing code, coupled with an SCM migration.

    Made the required changes, ran the build script.

    Only this build script required environment variables to be set.

    Environment variables that were not checked to ensure that they had been created.

    Environment variables that turned an rm -rf from emptying the target build directory into targeting the root of the drive.

    Said drive was a networked location.

    By the time I realised what was happening it had munched about half of the data stored on the drive, including some data critical to building something else, which is what alerted us that something was wrong.

    I'm thankful for working backups.

    1. Phil O'Sophical Silver badge

      Re: Nuked Network Drive

      We had a similar problem. Sysadmin came in one morning to some complaints of missing files. They continued over the course of the morning, and it was only when he noticed that the complainants were arriving in roughly alphabetical order of username that he twigged what had happened. QA test system was doing "rm -rf ${TESTROOT}/" with TESTROOT unset, but with the NFS automounter configured. Not running as root, so the damage only really started when it got to /home and found all the files with "group" delete permissions...

      Many people were very relieved that we had nightly offline backups.
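
      Both of these stories come down to the same missing guard: a clean-up step that trusted a variable which might be empty. In shell, rm -rf "${TESTROOT:?not set}" would have stopped the job dead; a rough Python equivalent of the same belt-and-braces check is sketched below (the variable name is borrowed from the comment above, everything else is illustrative):

        import os, shutil, sys
        from pathlib import Path

        def clean_build_dir(env_var="TESTROOT"):
            """Delete the tree named by env_var, refusing the classic footguns:
            an unset or empty variable, or a path that resolves to a filesystem
            root or a home directory."""
            value = os.environ.get(env_var)
            if not value:
                sys.exit(f"{env_var} is not set; refusing to delete anything")

            target = Path(value).resolve()
            if target == Path(target.anchor) or target == Path.home():
                sys.exit(f"{env_var}={target} looks far too broad; refusing")

            shutil.rmtree(target)

        if __name__ == "__main__":
            clean_build_dir()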

  23. Anonymous Coward
    Anonymous Coward

    Domain admin rights? Domain admins don't need those, right?

    About 5 years ago I was tasked with cleaning up our AD rights. Unfortunately I removed all of our domain admin accounts from the domain admin group... I knew what I'd done the instant I logged out, went back to my PC to log in, and started having problems accessing things.

    We were a small company so there was no replication to fall back on. After a period of considering how best to hand-write my resignation, I vaguely remembered the other IT tech's bad habit of not logging out of RDP sessions, and since I knew his password...

    So I promptly used RDP as him to connect back to the server I'd just managed to screw up - he was still logged in, so still had admin rights applied - reset permissions and then never mentioned this to anyone.

  24. Kobus Botes
    Flame

    Same user - same problem twice

    I had a user in the late 1990's/early 2000's (a most unreasonable man and a director to boot) who caused me no end of troubles.

    The first problem arose when he logged a call for a problem with Outlook freezing, not responding, et cetera.

    Upon checking the machine and Outlook, I found that his pst file was close to 2GB in size (Windows 98 SE and Office 95). It turned out that about 90% of the size was taken up by his deleted items, so I told him what the problem was and asked whether it was OK to delete them, or whether I should copy them to another pst file should he want to keep them. The answer was "No - it is deleted."

    About ten minutes later I had a frantic call from Head Office about this user's mail that I had deleted and that I should restore immediately, as all his important stuff was gone. Luckily they backed me on this one once I had explained what the situation was (although he was sore with me for a long time about the incident and the fact that I could not recover his deleted items).

    In order to prevent future occurrences I then gave him space on a server, where he could save his documents, and I wrote a little batch file that he could run at any time (provided Office was closed) to back up his documents as well as his pst's. I also stressed that he should run the file every Friday before calling it a day. So far so good.

    Then he bought himself a copy of Windows XP towards the end of 2002 (as the company had said that no machines would be upgraded to XP - only new machines would come with XP, and machines would only be refreshed in the normal cycle. His laptop was also self-purchased, because the company did not buy laptops unless you really, REALLY needed one, and he NEEDED one).

    So he logged a call for me to come and install XP NOW, as he needed it for an important meeting the next day. Once again my explanation that it would take about three days for the rebuild (cataloguing everything on the machine, getting the install disks for Office XP that he had also bought, which were at home, as well as a number of other essential, cannot-do-without programs, then installing everything as well as the service pack that I had to download and install, and including a fairly generous contingency to allow for possible problems) did not go down well, to say the least.

    I vaguely remember that I was severely stressed for time, probably a slew of new machines I had to build (all new machines came naked and everything had to be installed by yours truly), which also did not help.

    Stressing the importance of his running the backup script every day until the appointed installation time, since my having to do it would only stretch out the build time, and sending him an e-mail with all the details of what he needed to do beforehand, while also apologising for the fact that he would have to make do with an old desktop while his machine was out of commission, I was obviously left with a very grumpy guy.

    Come the day of installation I installed his desktop and set it up for him, including copying my script and making sure that everything worked. The next question was whether he had done the backups as requested (to which the reply was that of course it had been done, just get out of the office and get cracking).

    Come delivery time, less than two days later, everything went swimmingly, until he started checking that everything had been done correctly, only to query where his latest documents were. My insistence that everything that had been there had been restored did not go down well. Upon checking, I saw that the last backup he had made was some months prior to the reinstall (I had no access to users' folders on the server - it was all tied to the users' domain accounts).

    The upshot was that I had to remove his hard drive and take it to a professional outlet to try and recover as much as possible (he did lose a number of mail attachments, but luckily mostly unimportant stuff - most of the other missing items could be recovered from the people who had sent them to him or to whom he had sent them), since my e-mail regarding the rebuild did not explicitly mention that he had to run the backup script and that everything on his hard disk would otherwise be lost forever. Plus I got a written warning for failing to ensure that no documents were lost.

    Fun times.

    ------------------------------------------------------------------> What I felt like at the time.

    1. Anonymous Coward
      Anonymous Coward

      Re: Same user - same problem twice

      And that is why I always did PC resets/OS installs onto a new hard drive and kept the original to one side to be used for another upgrade. Similar with servers - for major new upgrades with specialist software, where there were only one or two servers for the application and database, I would buy new servers, commission the software and copy the data onto those, so the old ones could always be fired up at any time to find that important lost customisation script you had changed, etc.

      1. Is It Me
        Thumb Down

        Re: Same user - same problem twice

        Not everywhere has the budget for that.

        Most of the places wouldn't have had the budget for the spare desktop drives (even assuming re-use after a month or so), let alone the new servers.

        1. Doctor Syntax Silver badge

          Re: Same user - same problem twice

          "Not everywhere has the budget for that."

          Somehow they always have the budget to cope with it going wrong. Either that or they're out of business.

      2. Antron Argaiv Silver badge
        Thumb Up

        Re: Same user - same problem twice

        I do the same. HDDs are about $100. Cheap insurance, and, once you have ensured that all the data has been transferred, you can use the old drive as a secondary (or for something else).

        1. Anonymous Coward
          Anonymous Coward

          Re: Same user - same problem twice

          Well, here in the NHS, when migrating from W7 to W10 we replace the hard drive with a shiny new SSD (costing £30), then keep the old one for a bit in case the user has some data on it (they haven't), and then we pay someone to come round, stick a spike through the old ones, and take them away.

  25. Anonymous Coward
    Anonymous Coward

    scratch

    I still use a scratch disk for temp files; it's much faster to clean up (just format it) even though disks are getting bigger ... my first scratch disk was 10MB.

    1. Swarthy

      Re: scratch

      "Always mount a scratch monkey."

  26. Doctor Syntax Silver badge

    mv * in root

    Just caught it too late. As cd and echo are shell built-ins I could navigate and list the ruins but not do anything about it. A reboot from the SCO install disks would let me sort it out but they didn't include the custom driver for the RAID. It took most of the next day to get someone to email us the driver. Putting the driver on a floppy, booting from install disk and putting everything back took minutes.

  27. Anonymous Coward
    Anonymous Coward

    no! .tmp is for template

    Not the culprit this time: back in 1990, while cleaning up, another administrator cleared the file system of .tmp files, to the howls of a senior scientist who had stored many report templates with the extension .tmp.

  28. Cynic_999

    My method

    About once a year, I buy a new HDD or recycle an older one and install an OS onto it from scratch. I then copy all the files I think I still need from the old HDD. Over the course of the following couple of months I will need to mount the old HDD at increasingly infrequent intervals to grab some file/licence etc. that I had neglected to copy.

    This has the effect of clearing out a load of accumulated crud (stale cookies, leftover files from unwanted installs etc.)

    After about 18 months I figure that the old HDD can be recycled as it is highly unlikely to contain anything that I still need. (Files that are definitely important or irreplaceable get backed up separately).

    1. Prst. V.Jeltz Silver badge

      Re: My method

      I keep all my stuff on a NAS, therefore all my computers just have the OS, and maybe a couple of apps - just like at work.

      So any of them can be rebuilt with minimum prep. Not that I seem to need to do that as often as in the old days - probably 'cos everything's web based these days and we're not installing shite all day.

  29. Jamie Jones Silver badge

    All the admins fault

    You should never assume that files in a user's storage area follow your naming convention, however common it might be in your area of work.

    As others have said, it could be .template, or the user's initials.

    Reminds me of the time a sysadmin at a university deleted a file called "penis" that contained important research data by some people in the biology department...

  30. Anonymous Coward
    Anonymous Coward

    Back in the early 90's...

    ...so pre-Outlook days, we were piloting an email system for, er, a large HMG department. During the pilot an updated version of the email client became available which we went to install.

    After installing the new client, one of the fairly senior people taking part in the pilot called to complain that he had lost his emails, which was odd as we had migrated his Inbox successfully.

    We had to explain to him, as tactfully as possible, that using the Deleted Items folder as an email archive wasn't the greatest idea in the world...

  31. ecarlseen

    Oh, it gets worse...

    It's one thing to have users "store" old email and files in the Recycle Bin or trash folder or whatever. There's enough of this lunacy going around that I would guess it's a small double-digit percentage of people (frightening!).

    But it gets worse. Much, much worse.

    We dealt with a vertical-market ERP vendor (now fairly dominant in their field) who for years would store critical local machine configuration files and scanned document data in subfolders of C:\TEMP. They would then have pearl-clutching, screaming fucktard shit-fits whenever an admin had the temerity (oh my!) to actually delete stuff in C:\TEMP. Eventually they knocked this particular bit of stupid off, but to this day they still do things that make my head explode...

    1. Doctor Syntax Silver badge

      Re: Oh, it gets worse...

      I had a gig with someone who used a vertical-market ERP system (on SCO).

      The box had been set up with very few separate partitions. An overnight job wrote working files into /tmp but cleaned up nicely after itself. Except for the odd time when something triggered the job into just keeping on running, so the file grew and grew. By the morning the partition holding /tmp, /, /bin etc. was at 99%. The box was unresponsive, probably because it had also filled memory and the OS was thrashing.

      AFAICR the process couldn't be killed, either because it wouldn't terminate with an unwritten buffer or maybe because it was so sluggish it was taking a few hours to terminate. Attempts to free up space failed - the monster was writing faster than the box could list files for me to delete. It didn't help that the box was in a branch office over a hundred miles from my desk. I think we had to wait till someone came in to the office and hit the switch for us. Oddly it didn't seem to do any actual damage other than needing an fsck on reboot.

  32. Anonymous Coward
    Anonymous Coward

    Have you ever deleted something crucial?

    Regardless of what the natives say, before you upgrade, alter or delete anything, you make a full backup.

    1. Prst. V.Jeltz Silver badge

      Re: Have you ever deleted something crucial?

      how do you save any space if you back up everything you delete?

  33. David Given
    Stop

    core

    There's the old legend from the university Unix days of the geologist asking the admins what happened to their thesis, which they'd saved in their home directory and which was now missing. What had they called it? Well, they were studying the Earth's core, so it was just 'core'... and the automated core-dump deletion cron job had nuked it.

  34. billdehaan
    Facepalm

    Been there, done that, had the stitches pulled

    Oh, this story definitely rings a bell.

    Back around 1988 or so, I worked in a shared lab environment. There was no network, and it was very much "every man for himself" development.

    Every group had its own naming convention, naturally. And this was in the days of DOS, with 8.3 file and directory names. So, I created a root directory DEVTEAMS, under which I'd put a READ.ME file stating that the subdirectories could be used by anyone. I figured that way, anyone who backed up the DEVTEAMS directory would back up every group's work, and we'd have multiple backups. Backups were done with floppies, which were painfully slow, so I tried to compartmentalize all the development into one folder tree.

    One machine had both a transputer card (remember those?) and a special video card. My group was doing video work; another group was doing work on the transputer, so we created \DEVTEAMS\VIDPROJ for the video project. I even created \DEVTEAMS\TRANSPTR for the other team, though they never used it.

    Now, for those unfamiliar with transputers, they are/were massively parallel processors, which used a non-sequential language called Occam to take advantage of this. However, because of this parallelism, source files were not stored sequentially. A 12 line C program could be "hello.c", but the equivalent wasn't "occam.c". Instead, it would be 12 files with names like ~24nkj24.jd8 and the like, which the Occam editor would link into the environment.

    One day, we went to run a video test, and discovered that there was less than 2kb free on the 20MB drive. So, one of my teammates cleaned up enough space on the disk to run the test. A day later, the manager of the other group became hysterical that our group had destroyed six months of their work.

    Fortunately, my teammate had done a backup of the machine before wiping it, however, the other team's project directory wasn't on the backups.

    Where my team and the others used DEVTEAMS, this group decided to go their own way. They decided to use their group members' initials for the directory name. So James, Uri, Norm, and Kwok put all of their transputer files in the subdirectory... C:\JUNK

    Yes, on a shared machine, they set up a directory called JUNK, and filled it with 300 binary files with names like $3j5a1.d7x, and were shocked when people looking to clean out dead files didn't realize that those were critical project files.

    Although they weren't so critical that their team ever bothered to back them up, of course.

  35. quartzz

    I swear computers now are where they _should_ have been 20 years ago.

    1. Roland6 Silver badge

      ? Given there isn't much difference between the mess created by W2K/Office2K and Win10/Office2016 when the plug is pulled (or Windows is shut down) with files open, I'm not sure I agree with you.

      1. quartzz

        I mean more about compatibility: an email sent with an image will generally show up with the image visible, and a hard drive from this system will generally work with that system (provided they are both SATA, and so on).

  36. Anonymous Coward
    Anonymous Coward

    You were told to clean up our systems, not delete 8,000 crucial files

    I clicked on this article thinking it was about another Windows update glitch.

  37. Griffo

    Lost all a CIO's emails during an Exchange Migration

    I once had to complete an AD and Exchange migration for a company. I don't recall the reason why exactly, but they needed to move to a new AD so a full migration was necessary.

    When I configured the new Exchange environment, I set up some basic policies - you know, like remove all mail from the Deleted Items folder after 30 days etc.

    A week or so after the migration was completed, I got an urgent "please explain" email from the CIO; he wanted to know why I was so incompetent that I had managed to lose all his emails.

    Naturally I went straight to the logs to see that, yes, all X items had copied across, so I went to ask him which emails he was referring to. At which point I learnt that he, no shit, stored every single email he wanted to come back to in the 'Deleted Items' folder. On their previous server they had no policies, so they stayed there until he deleted them a second time. I never could get my head around his logic that this was a good place to store them...

    Anyway, luckily I still had a PST of his old mailbox, so disaster was averted. But what a muppet.

  38. Luiz Abdala
    Windows

    Never delete. Auto-backup.

    My old man would simply buy a new hard drive, disconnect the old one, format the new one and install the latest Windows on it, and reattach the old one as a slave. Everything is backed up... Using the old hard drive itself as a backup was a somewhat genius move on his part. No time wasted, and the system has a bootable drive in itself...

    Except he did it 3 times in a row from a machine to the next, and never threw away the old drives.

    So his machine had at one point a 2GB drive, a 10GB, and a 40GB HDD. Each one with a flavor of Windows, all of them perfectly functional if they were formatted on the same motherboard. Aaaand we could only have 4 IDE drives in a single machine.... back then...

    So when I had a 80GB drive installed for him, he asked me to save all of his files. So I got all the older drives copied into the 40GB one, slaved that to the 80GB, and burned everything to CD-Rs and a SINGLE USB stick. Just in case.

    THEN I could format and donate the drives.

    But I literally made copies of everything I was deleting.

    Nobody deletes anymore.

    1. Prst. V.Jeltz Silver badge

      Re: Never delete. Auto-backup.

      Many other posts have said the same.

      Y'all have backup media 90% filled with c:\winnt, system32, program files - and other shit you don't need.

  39. ICPurvis47
    Thumb Up

    Inherited HDD

    Many moons ago (pre 1990), when I was just getting to grips with DOS4 on a two-floppy (5¼") XT, a colleague of mine was entrusted with a similar machine with a 20MB hard drive expansion card in order to fulfil his position as Membership Secretary of his local Crown Green Bowling club. He complained that he could not update the membership database or add any new game fixtures, and would I have a look at it please. It appeared that the previous incumbent of that post had been paranoid about virus infections, and had added every antivirus suite he could lay his hands on, and everything was installed in the C:\ directory, which was at full capacity, having 255 entries. The poor old HDD was struggling and had thrown up a few bad sectors as well. I persuaded the club to buy him a new (30MB) hard drive card, copied all of the important files across and organised them into a proper directory structure, then removed the old drive and returned the newly working computer to him, with the admonition NOT to add any more free-on-the-cover-of-a-magazine antivirus software, and to keep the one kosher AV I had reinstalled up to date. I then low-level formatted the 20MB drive to remove the bad sectors, reformatted it and installed DOS4, and installed the drive card in my XT as payment for the service I had provided.

  40. J.G.Harston Silver badge

    Oh dear....

    .... that day when I discovered something had SET TMP=C:\DOS ....

  41. ricardian

    Here's something to bring back memories!

    https://youtu.be/nLy_jEbuY-U

    1. Tim99 Silver badge
      Windows

      Thanks, yes, I look like this >>======>

      Probably because I was one of the XT/AT installers, supporters, and system architecture & software specifiers at a very large public utility. A tip was to put a spare key in the IBM installer ring folder as nobody ever looked at them after the system was working - Many people "lost" the key/folder when they moved office, so we also kept a register of which key fitted each machine. Our standard install was a WordStar or DEC WPS/IBM Displaywrite word processor (Depending on whether the user's job was as an engineer/scientist or administrator/secretary - Later we standardized on WordPerfect); Lotus 1-2-3; and R:Base or dBase III if required. Generally a suitable pseudo-menu batch file was supplied by us and called from AUTOEXEC.BAT on start-up. Some people paid for the IBM DOS menuing program which was OK, or later we supplied Norton Commander or XTree. When Windows became common we noticed that people who had used NC seemed to prefer separate Windows Explorer windows, and XTree users tended to drag files into one window, so what they learnt first seemed to stick with them later...

  42. RegGuy1 Silver badge

    Problem with users -- or software?

    Hmm, what do all these horror stories suggest?

    Is it that users, who are often not technical and only use the mail system as part of their daily job, are stupid, and should know better?

    Or could you argue that software is not user-friendly enough? If this is known behaviour that happens so often, is it not the responsibility of the software companies to provide software that makes it easy and intuitive for non-technical people to use?

    We can all laugh at stupid users, but is the stupid software not to blame?

  43. Aussiekraut

    Did something similar

    About the same time as the OP, I had a little issue with an Outlook user.

    To keep disk space usage in check, I configured Outlook to automatically empty the "deleted items" folder on exit. One day I worked on my boss's machine and did the same. The next day I got a frantic phone call from my boss telling me all his emails were gone. So off I went and had a look, and they were all there. He then said no, not those - the ones in the rubbish bin. I told him I had set Outlook to empty it on exit and he almost took my head off. Apparently, when he wanted to get an email out of his face but keep it for later, he just hit delete and used his rubbish bin as his mail archive! Well, this was all POPed mail and there was no backup of workstation disks (the boss was also a stingy bugger), so all his old mails were gone for good. He wasn't impressed, and didn't like being asked whether he also used the rubbish bin under his desk as a filing cabinet and told the cleaner off for emptying it. But he took my advice to keep mails somewhere other than the rubbish to heart, and luckily didn't sack me either.
