Out of Office...
There are several cases where auto-answer is not a good idea, but things improved a lot once mail servers started to use a database back-end to store only ONE copy of each email.
Friday is upon us once again, bringing with it all the promise of the weekend and a cheeky adult beverage or eight. Unless, of course, you're cursed to be On Call. Today's tale comes from an individual we're going to call "Al" and concerns that triumvirate of evil: a cluttered Windows desktop, Lotus Notes, and the Devil's …
At one company we used Novell GroupWise. This allowed you to delay sending emails until a particular time - very handy for sending press releases when needed, say, after 12am. It was also great for slagging people off after you'd left the company, making it look like you were working late, etc. So anyway, two staff members at this particular company were having a friendly bit of banter via email. Unbeknownst to each other, they were both going on holiday at the same time, and both had the clever idea of sending the other a mildly insulting message after they'd left for sunnier skies. Neither of them had "send once" set on their out of office. So every time an automatic response was sent, it triggered another from the other mailbox.
The mail server filled up fairly quickly, as one of them had included a picture of a beach in their out of office. Alarms started going off on the network manager's desktop as the server started to reach capacity. Then we started to lose the ability to send and receive emails before the mail accounts were identified and finally dealt with.
The culprits weren't named, but what the network manager said in the post-mortem was interesting. In the five years since GroupWise was installed, this was the first instance of that happening. When pressed further on that, he said most people didn't know you could delay sending email. That led a lot of staff to believe it was people on his team - I couldn't possibly comment... although I know exactly who it was.
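The mechanics of that storm are easy to sketch. This is a toy model - the names, the attachment size and the server quota are all made up for illustration - but it shows why a "send once" flag is the difference between two stray replies and a full mail server:

```python
# Toy model of the storm above: two mailboxes, each with an auto-reply
# and a holiday snap attached, ping-ponging until the server fills.
# All names and sizes here are invented for illustration.

BEACH_PHOTO_BYTES = 2_000_000     # assumed size of the beach picture
SERVER_QUOTA_BYTES = 500_000_000  # assumed server capacity

def simulate(send_once: bool) -> int:
    """Return how many auto-replies are generated before the storm stops."""
    stored = 0                    # bytes accumulated on the server
    replies = 0
    already_replied = set()       # (responder, victim) pairs seen so far
    sender, recipient = "alice", "bob"   # the original jibe lands first
    while stored < SERVER_QUOTA_BYTES:
        # the recipient's out-of-office fires back at the sender
        if send_once and (recipient, sender) in already_replied:
            break
        already_replied.add((recipient, sender))
        stored += BEACH_PHOTO_BYTES
        replies += 1
        sender, recipient = recipient, sender   # which triggers the other one
    return replies

print(simulate(send_once=True))   # 2 - one reply each, then silence
print(simulate(send_once=False))  # 250 - runs until the quota is gone
```

With the flag set, each responder speaks once and the loop dies; without it, the only thing that stops the exchange is the disk running out.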
Except it's now back, done manually.
It never really helped for message content. It was always about attachments.
MS are now encouraging people to send attachments via OneDrive instead. So now we get single-instance storage, at the cost of forcing the sender and recipient to manually deal with it.
Such is progress.
MS are now encouraging people to send attachments via OneDrive
Fortunately, as we are NHS, One Drive is blocked! Every now and then a new (non-medical) manager takes offence at this and starts a process to get it unblocked but they move on so quickly that the idea is dropped.
Mail on the original ARPANET had a problem with two people responding to "out of office" messages with "out of office" messages (ccing all of the other recipients). Things filled up fast until protocols were developed not to respond to automatically generated messages.
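Those conventions were eventually codified in things like RFC 3834's Auto-Submitted header. A minimal sketch of the checks an auto-responder might make before replying - the header names are the real ones, the function itself is illustrative:

```python
from email.message import EmailMessage

def should_auto_reply(msg: EmailMessage) -> bool:
    """Decline to auto-reply to anything that is itself automatic,
    bulk, or a bounce - the RFC 3834 family of conventions."""
    if msg.get("Auto-Submitted", "no").lower() != "no":
        return False    # generated by another auto-responder
    if msg.get("Precedence", "").lower() in {"bulk", "junk", "list"}:
        return False    # mailing lists and bulk mail
    if msg.get("From", "").lower().startswith("mailer-daemon"):
        return False    # delivery status / bounce messages
    return True

msg = EmailMessage()
msg["From"] = "colleague@example.com"
print(should_auto_reply(msg))     # True - a human, reply away

msg["Auto-Submitted"] = "auto-replied"
print(should_auto_reply(msg))     # False - another OOO, stay silent
```

A real responder should also stamp its own outgoing replies with `Auto-Submitted: auto-replied`, so the responder at the other end can make the same check.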
A Windows 95 PC with 2GB of RAM? Sorry, that's incorrect.
When Windows 95 came along, a decent PC would have in the region of 16MB to 32MB of RAM, and it wasn't uncommon to find lower-end systems sporting 8MB. In fact if I recall, Win95 had problems booting if you had more than 480MB of RAM in your system.
We had bits, and we felt lucky.
Our father woke us up at one o'clock, two hours before we went to sleep, murdered us in cold blood, made us install Computer Associates middleware before eating a breakfast of cold gravel, then sent us t' mine via Lotus Notes. In the hot inky blackness of the netherworld we'd wind individual bytes of memory with razorwire, whilst the Foreman read aloud from a selection of early PHP code. Then at ten o'clock they'd kill us by flooding the mine with SPX packets as the salespeople played illegal copies of DOOM. Our corpses would float up, where they'd revive us and make us fix a field full of HP Laserjets with "PC Load Letter" errors. At the end of week, we'd be expected to pay a shilling for privilege.
And you try telling that to the kids of today. They won't believe you!
Lotus Notes gave me a career for 15 years, so I'm not usually one to bad mouth it.
That website was out of date even when it was published, as it mostly deals with Lotus Notes 4 - version 5 shipped in 1999.
The biggest problem Notes had was just that IBM underinvested in it massively. The backend was superb, and didn't need much more investment. The client needed some improvement, but only got it in fits and starts. Classic IBM management failure, really.
Its day is past now. You don't have to use the same app for mail, calendar, to-do and so forth. Notes as a platform was ultimately killed by what's also slowly killing off Outlook - the web browser. Outlook took some Notes mail seats temporarily, that's all. ;-)
(And we still don't have any great mail clients. My feature list for a mythical "perfect mail client" has features from about five different email clients, and would be hard to build. Especially as I'd like it to be cross platform on desktop, web, Android and iOS. Ultimately, we muddle through with what we've got.)
Yup, have to agree - most of it didn't even apply to 5. By the time they got a UI that was usable, it was bloated and slow. The Notes webmail client, though, was brilliant.
As out-of-office replies were sent via an agent in the nsf (every 15 minutes, if I remember correctly), the impact would have taken longer to be felt, as long as the Domino server transaction logs didn't run out of space.
Even the scheduled agent OOO went away - I think in Release 7. There had always been a trigger for agents to run "on new mail", but the concern was that an email to everyone would overwhelm the agent manager queuing system, and not all of them would get processed. IBM did a little work so that the Router could run simple agents itself, rather than sending a trigger to the agent manager process.
That meant that OOO became effectively instant without large load on the server.
And the Notes OOO had, since version 4 at least, kept a list of who it had already replied to. By default, it was set to only send one reply during the entire duration. That really helped keep email storms at bay. Of course, there's always someone who'll turn that off, but Notes made it easy enough to spot and react to because you had decent logging and a server console you could look at...
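That "list of who it had already replied to" is a simple idea. A sketch of the behaviour (the class and its storage are invented for illustration - Notes kept its list inside the mail database itself):

```python
# Sketch of the "one reply per sender for the whole absence" behaviour
# described above. Names and storage are made up; only the idea of
# remembering who has already been notified is taken from Notes.

class OutOfOffice:
    def __init__(self, body):
        self.body = body
        self.already_notified = set()   # senders told once already

    def reply_for(self, sender):
        """Return the auto-reply text, or None if this sender has
        already had their one reply for the whole absence."""
        if sender in self.already_notified:
            return None
        self.already_notified.add(sender)
        return self.body

ooo = OutOfOffice("On holiday, back on the 14th.")
print(ooo.reply_for("bob@example.com"))   # On holiday, back on the 14th.
print(ooo.reply_for("bob@example.com"))   # None - no storm possible
```

Once the second lookup returns nothing, a ping-pong between two absent colleagues stops after a single round trip.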
Anyway, enough rambling. I'm showing my age. ;-)
Yes although you had to be sure to apply the mail template on the nsf and enable the setting on the Domino config.
I seem to remember using Designer a few times to disable agents that for some reason hadn't disabled.
Also the console really was a nice feature, just annoying when someone installed Domino and decided to put a password on the server id file and create a new Notes Network.
One of the best features in Lotus Notes, which has STILL not been ported into Outlook by Microshaft, is the ability to detach attachments and save them to separate storage, thereby maintaining the audit trail but saving space on the Domino/Exchange server. We have enough trouble with full mailboxes where I work as it is, current circumstances have just exacerbated the problem!!
DAOS really is your friend there, only storing one (encrypted, unidentifiable from the OS) copy of identical attachments per server, so a, say, 55GB mail file only takes about 9GB on disk, plus the space for attachments it shares with other mailboxes. Vastly superior to the old SCOS in use up to v7, which was a nightmare to maintain.
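The idea behind that is easy to sketch. This is not how DAOS is actually implemented - just a toy illustration of content-addressed, single-instance storage, with the encryption and on-disk layout left out:

```python
import hashlib

# Rough sketch of single-instance attachment storage in the DAOS style:
# attachments are stored once, keyed by a digest of their content, and
# mail files hold only references. Real DAOS also encrypts the stored
# objects and reference-counts them so orphans can be pruned.

class AttachmentStore:
    def __init__(self):
        self.objects = {}    # digest -> attachment bytes (one copy each)
        self.refcount = {}   # digest -> number of mails referencing it

    def put(self, content):
        key = hashlib.sha256(content).hexdigest()
        if key not in self.objects:        # only the first copy costs disk
            self.objects[key] = content
        self.refcount[key] = self.refcount.get(key, 0) + 1
        return key                         # the mail file stores this ticket

    def disk_usage(self):
        return sum(len(c) for c in self.objects.values())

store = AttachmentStore()
spreadsheet = b"Q3 figures..." * 1000
for _ in range(50):            # fifty people mail the same file around
    store.put(spreadsheet)
print(store.disk_usage())      # one copy's worth of bytes, not fifty
```

That is why a 55GB logical mail file can collapse to a fraction of that on disk: the sales team's fiftieth copy of the spreadsheet costs a reference, not another copy.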
"One of the best features in Lotus Notes, which has STILL not been ported into Outlook by Microshaft, is the ability to detach attachments and save them to separate storage"
Eudora did that. Meant I could keep years of message boxes of a reasonable size for instant access. But now have to use Thunderbird which keeps messages + attachments in one large blob which is v inconvenient :( And searching and signature handling not as good as in Eudora.
Any suggestions for a better alternative to Thunderbird? I'm happy to pay for it.
A good weekend to all Commentards.
Actually if memory serves Exchange 2003, and 2007 to an extent, did do single instancing of messages / attachments.
They took it away in 2010 when storage became 'cheap' although I think even then 2010 did attachment compression.
Obviously the user would still be 'charged' for the logical space taken up by the storage limits but, especially with sales teams or accountants who insisted on e-mailing massive spreadsheets to each other, the size of the database could be way less than the sum of the individual mailboxes so to speak.
I use Aquamail.
In the many, many options there's the ability to force plain text format, to force replying at the bottom and to determine your quote prefix.
It may not be entirely what you want, but it's not bad. You can back up the settings to a cloud account once you've got it configured, which is handy.
FairEmail is a superb open source email client for Android. It offers this feature (just tap the pen icon above the quoted email in your reply to edit it inline), and loads more beside.
I don't do email on a tablet (phone and pc only), but I imagine it will work fine there for you too.
Well worth a look.
K-9 hasn't been updated since 2018, so it is a security time bomb at this moment. I can't find Kaiten on the Play Store today, but that always used to be just a paid skin on top of K-9 anyway, so I cannot think it will be any better off security-wise.
The lack of updates for K-9 is the reason I switched away. FairEmail seems like the natural successor. And there is a set of features (eye-candy stuff, not substantial functions) you can pay for if you would like to support the developer.
Some may find its enlightened privacy-enhancing features too much, but you can turn those off if you wish.
That website was out of date even when it was published, as it mostly deals with Lotus Notes 4 - version 5 shipped in 1999.
I started my full-time professional career in 2004; we ditched Notes 4 in 2011. I think it was our homebrew HR system that was the final thing stopping us from ditching it.
One of my first coding jobs was rewriting the RTF output on one of our websites so that it matched the shit-brown theme that someone had decided our Notes install should be.
many companies still use Lotus Notes.
I was using it at IBM the other year, many councils still have it lurking somewhere.
The retailer I now work at still has it lurking in the background.
The best thing about Lotus Notes is that it's built on a DB and effectively permits you to build apps on it - email, forms, processes, messaging etc, all within Notes. It's massive and bewildering but served/serves a function that nothing else seems to.
I'm quite glad I don't have to use or support it, but appreciate what it did.
We used to have a Lotus Notes server at work. Nothing major, just a few users and a small server.
After a couple of years, I moved into the office of the Notes administrator (Oracle DBA primarily). I was thinking about how we were told Notes would be the next big thing and we would all be getting accounts a couple of years previously, and how nothing had happened. So, I asked him what happened to the Notes server. He replied “What do you think I’ve been resting my feet on under the desk?”. Sure enough, it was there, on its side. He’d quietly migrated the users back to IMAP (they were only using email anyway), turned it off and moved the server under his desk.
In my day me an' my brother 'ad to work together, one reading out 'ex digits while t'uther typed.
And we 'ad none yer fancy routable SPX. LAT was all we 'ad or we could carry the bits ourselves! As fer yer HP Laserjets... we 'ad Epson FX-80's and 'ad to feed the paper manually. All we 'ad t' eat were the holes that fell on t'floor!
I spoke to my nephew, who was attempting to make home videos during the lockdown. The aspiring Spielberg had called for advice on something technical. The phone was struggling to edit the footage or apply any special effects. Turns out his phone was a very old one his mother had upgraded from. It had just 1GB of RAM and not much onboard storage, although he did have a micro SD card to expand on this. I explained patiently that the computers when I started work didn't have 1GB as total storage, let alone just RAM.
He didn't believe me at first and, after I insisted, just said "You must be really old then", instantly killing off his next birthday present. He redeemed himself and his future present when I asked how old he thought I was. Twenty-five, when I'm nearly twice that, made me smile.
No. They still teach them those basics at school. Despite the insane requirement for every kid to be taught "coding".* Despite the calls (frequently from within El Reg commentard comments) to change school IT to something that wasn't just about how to use Word/Excel/Access (or equivalents), this is still something that they need to learn.
*Everyone, from CEO to caretaker (janitor), needs to be able to use the standard office programmes, find their way round a computer and use a keyboard, even in the 2020s. Almost no one needs to be able to code, which is just the 21st-century version of us all having to learn metal- and woodwork when I was at school in the 70s. School should be about gaining an education and generalisable skills, not about training to do the job that you are expected to enter because you are one of the working classes and didn't go to private school.
My first computer was better. It was a COSMAC Elf with 4K of RAM. I splurged when I got it, so added the 4K expansion RAM board because 256 bytes just didn't seem like enough.
So I guess I lied, I had 4.256K of RAM.
But sadly I never had it running Basic as it only had a hex keypad for entry. But it did have a video output.
I can confirm such a beast existed, I was running it.
I'd found a "memory tree" card that allowed you to add many more sticks of RAM than the motherboard had sockets, supposedly so the hobbyist could use a bunch of 1, 2, & 4MB sticks to max out their system. I'd filled mine with the max capacity available at the time (employee discounts FTW!), and used a utility to turn most of it into a RAM disk. Windows had more memory than it knew what to do with, the RD allowed me to save downloads faster than the HDD could maintain write speeds, & as long as I remembered to save from RD>>HDD before shutting down, I could zip along faster than my piddly little 'puter had any right to be.
Even better was using the RD to load programs into in which to run them. I quickly found out which ones could be safely run that way, which ones ran too fast when purely in memory, & which refused to run at all. Another DOS based RD utility allowed me to create one from a pure DOS prompt, thus allowing me to run my DOS games the same way. It was heaven...
Right up until the damned thing caught fire & turned itself into slag.
*Weeps rainbow arcs of sparkly tears at the memories*
RAMdisks FTW all the way, now, I say. Got my Windows TEMP on one (among other things). But if anyone can tell me exactly why some things (notably AMD graphics drivers) will refuse to install with the TEMP on a RAMdisk – giving a useless error code, I might add – then I'd be fascinated to know. It's certainly not because it doesn't have enough room: I checked what it actually uses and it's way below the 8GB I have free.
It fails because they dump installer files into TEMP before rebooting into a pre-logged-in state that hasn't loaded the RAM disk. Just point the TEMP (and probably TMP) environment variables to a real drive (or comment them out to revert to defaults) before installing, point them back to the RAM disk after, and reboot again.
This may have been fixed, the last update just worked before I remembered to apply the fix. Or it might just be leftovers from the previous install and I got lucky.
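For what it's worth, the repointing half of that workaround can be scripted so TEMP and TMP are only redirected for the duration of the install. A sketch - the installer path is made up, and as the reply above notes, this only helps for the phase before the reboot; the variables may need to stay repointed across the reboot itself:

```python
import os
import subprocess
from contextlib import contextmanager

@contextmanager
def temp_on(path):
    """Temporarily point TEMP and TMP at another directory (e.g. a real
    drive instead of the RAM disk), restoring them on the way out."""
    saved = {k: os.environ.get(k) for k in ("TEMP", "TMP")}
    os.environ["TEMP"] = os.environ["TMP"] = path
    try:
        yield
    finally:
        for k, v in saved.items():
            if v is None:
                os.environ.pop(k, None)
            else:
                os.environ[k] = v

# Hypothetical usage - both paths are invented for illustration:
# with temp_on(r"C:\InstallTemp"):
#     subprocess.run([r"C:\Downloads\driver-setup.exe"], check=True)
```

The context manager guarantees the variables go back to the RAM disk even if the installer falls over.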
When 95 launched the price of RAM increased due to demand.
Price rocket after Kobe earthquake.
Seem to remember paying around £100 for a single 8MB SIMM.
So 2GB would have been extortionate even if you could find a machine to take it
"Price rocket after Kobe earthquake"
The first generation of RAM-raiders made it much worse. The high cost led to the robberies, and I made a good few bob replacing RAM for an office full of PCs that had lost their memory over the weekend. The extra demand drove up the price even further which, of course, made nicking RAM an even more lucrative attraction.
It was a bonus if the RAM-raiders were amateurs. New motherboards all round.
The Kobe earthquake was in January 1995, just months before Win95 launched, so the timing fits - DRAM prices did spike that year.
In 1993 there was a fire at a chemical plant where they brewed the stuff for chip cases; I still have some 4MB RAM SIMMs where they had just stuck the chip itself onto the circuit board, sealing it with black goop (with the appropriate brand name 'Topless'), but that scarcity was just about over when W95 was launched.
I had a look at a few Thinkpad models from around 1998; newer laptops would have been installed with W98 standard, although businesses might have reinstalled W95 for support reasons. The largest machines from that era had a 400MHz P2 with 512MB or 544MB RAM; largest SO-DIMM they could take was 256MB.
Memory expansion cards were not uncommon for desktop systems, although with Pentium/PCI systems they tended to be dedicated for particular mobos. ISTR there were some generic ones for VLB, and of course ISA, but those would have maxed out at 16MB. With laptops you'd be running into space problems, although I can figure there having been docking stations with extra memory.
 'Frank' would of course have told them he couldn't work with W95, possibly even installing his own pirated W98 copy.
I worked for a PC manufacturer in that era as tech support/repair shop, and having access to Stuff We Never Could Afford Ourselves, we did load up a machine or two with... shall we say... excessive specs...
So yes.. there's been at least one Win95 box with 2 GB because we did build one, to see if it would actually work, mostly.
From memory: Yes it worked, even though any performance improvement past 64MB was actually negligible, and it did have problems addressing the full memory space. It did work pretty well ( and lightning-fast for the time once you got stuff loaded) because we resorted to using the overkill past 128 MB as virtual hard disks. Which did work a treat.
We did get a serious bollocking over it, because, well... using expensive server-grade hardware for Experiments tends to be Frowned Upon From Above... But hey... For Science!! ;-)
I had 64MB of RAM in my Windows 95 PC -- at the time that was considered a lot of memory.
I remember that I had to take half of it out whenever I wanted to play GTA, because DOS4GW didn't like having as much physical RAM as virtual RAM. When I got in touch with someone at Rational they were all "oh; we never considered that someone might actually have 64MB of memory".
@JimboSmith - my kids (teenagers) initially didn't believe that when I was their age, the internet basically didn't exist (at least for Joe Public). Neither did mobile phones.
And then I pointed out that today I have more storage space on my person (phone and thumb drives on my keyring) than the entire department had when I was doing my first degree, and that their phones (not exactly cutting edge models) have more processing power than the Space Shuttles did, and their calculator probably has more than put man on the moon.
To put it in context, my PhD thesis (written in LaTeX) was entirely stored (including diagrams) on two floppy disks (and backed up on several more plus the tiny HDD of the machine), and written on a 486 PC with less memory than my current watch.
"my PhD thesis (written in LaTeX) was entirely stored (including diagrams) on two floppy disks"
@Custard [OT]: as a matter of interest, have you ever tried recompiling your thesis on a recent LaTeX version? I sometimes wonder about the longevity of digital formats. I'm guessing markup based systems are probably the safest bet
Coat: mine hosted, in the distant past, a Selectric maths golfball and a set of Rotring pens...
@keithpeter - nope, I've got the paper bound book copy on the shelf behind me, and I'm sure if I dig around somewhere I might have an electronic copy somewhere. In the worst case, it's more or less a text file with some now non-processed mark-up.
But as it was 22 years ago (eeek!) I must say it's not something I've really ever gone back to...
I had never seen (other than in films/on TV) a computer before I got to 6th form in the late 70s. There was a rumour that the secondary school had one, somewhere, but I never found it. At 6th form they had not one, but two - Exidy Sorcerers; one had 16k, the other had 32k RAM, and storage was on cassette.
I bought one while I was at 6th form and was one of only a very small number of people to have a computer. Mine was an Ohio Superboard (1MHz - yes, that's not a typo) 6502 and 1k byte (no, that's not a typo either) of RAM (expandable to a whopping 8kbytes of RAM). And I'm sat here typing on a rather aged laptop, grumbling about being limited to only 8G of RAM (hardware limit) with more than 8G of swap space in use.
As an aside, it's interesting to note that years ago I was in business selling and supporting computers. My colleague had this story about how he went to demonstrate an accounting package - and after an hour someone asked why he kept pressing buttons on the keyboard. They genuinely thought that you just showed (e.g.) an invoice to the computer and it read it. Now, getting on for half a century later, we are almost to the point where that can be done - AIUI some of the advanced features can now interpret a scan (or photo in the mobile enabled age) of an invoice - or at least make a half decent stab at it.
The downside to all this is that we are using applications and OSs that take up gigabytes of disk space, and most people have no idea how any of it actually works. Even with something like the Arduino, there's a huge layer of middleware to isolate the programmer from what's actually happening - a far cry from the days of hand assembling code to run on my computer with only 1k of RAM.
I suspect I don't have a computing device in the house that isn't orders of magnitude more capable than those that first put a man on the moon ! Actually, that's not quite true, the old Superboard is still in the attic.
my kids (teenagers) initially didn't believe that when I was their age, the internet basically didn't exist
Show them the movie Soylent Green. Made in 1973, it showed the far future world of ... 2022.
In the film's climax, the hero is being chased, and desperately has to get the information he learned out to the world. It's a major plot point that during the chase, he can't find a telephone booth to use :-)
"and their calculator probably has more than put man on the moon."
Unless it's a Texas Instruments graphing calculator. I once worked out that my calculator (TI-82, I think) had about half the RAM used in the lunar landings. Now 20 years later they're selling calculators with, uh, oh wait, about the same capability...
I bought a Gateway 2000 P90 with 16MB EDO RAM. The supplied memory checking tool started spitting out errors when I tested it after delivery. I contacted Gateway and they put another 16MB EDO in the post. I installed that and re-ran the tests, same result.
I ran the PC for a week or so, until the engineer turned up, with 32MB, what a luxury! It ran stably and flawlessly.
The engineer had a new motherboard and he swapped the old one out. Memory test threw up the same errors... It turns out that the P90 was the first Gateway PC to use EDO memory and the testing program didn't understand EDO, so reported every memory location as faulty! Ah well, back to 16MB it was.
Back when W95 came out I had a 386/40 with 16MB of memory (and four hard disks, the two largest of which were 160MB, reformatted to 320MB by using an RLL controller). A friend had problems running W95 on a 4MB 386, and I suggested upgrading to 16MB. He then went shopping, and came back with a 75MHz Pentium with, again, just 4MB RAM. On which W95 ran just as shitty. I had just installed OS/2, and in comparison it flew.
A bit, but it is still a fair amount. My first proper job was in 2004 at a local council, and the Windows 2000 PCs there had between 128MB and 512MB of RAM. To be fair, Win 2k ran like crap with 128MB and I remember upgrading many a system to 256MB after receiving another support ticket of "PC running very slowly".
>In 2002 I built a system--far from my first--with 2GB of ECC RAM. It also had dual
> Opteron-240 processors and 3 WD 36GB 10K RPM SATA-I Raptor HDDs.
> The memory alone cost $500.
That must have been mid 2003. Opteron 240 introduced April 2003 and available shortly after that. WD Raptors were available in 2003.
> The OS on it was SuSE 9.2 because that was the only Linux distro available that
> was a full 64-bit system.
That part does not make sense and urged me to reply... With 2 GB RAM you don't actually need a 64-bit OS; it actually wastes more RAM. Was there a specific need for a 64-bit OS, other than the customer needing to compensate for his short d*?
In 2006 it would have been different: AMD added virtualization to the CPU, which only a 64-bit OS can really use: https://en.wikipedia.org/wiki/List_of_AMD_Opteron_microprocessors#Opteron_1200-series_%22Santa_Ana%22_%2890_nm%29
Windows NT 4.0 can handle 2 GB of RAM - with the latest service pack, even 4 GB. During the last breaths of NT 4.0, a reinstall required a reduction to 2 GB of RAM, then Service Pack 6.0a, and then you could have the full 4 GB again. The number of user machines with 2 GB during the NT 4.0 era was pretty thin, but they existed!
Since Windows 95 and NT 4.0 look so alike and are just one year apart it could simply have been a blunder in story telling.
We had a developer who spent all of his time talking and not doing much work. Customers loved him because he could talk the talk. The rest of us muttered in our beards and did the work he should have done. He was "poached" to work for one of our customers as the architect. It was a classic win-win situation.
A year later at a conference I was talking to someone from that customer and asked how the new architect was. The reply was "He was bloody useless, didn't know anything, and was moved to a sales position within 3 months".
Remembering that in some irredeemable cases, the right position is at worst not in your organisation, and at best with a competitor.
It reminds me of a comment I got when I was leading an install team once and was a man down. The leader of the larger team working at a nearby customer offered to lend me one of his engineers (who wasn't the most useful tool in the toolbox), and one of the other straight-talking Scottish engineers in his team came back with "don't do that, then he'll be two men down!"
Suffice it to say, we made do with who we had until my engineer returned.
I once worked on an outsourced helpdesk, one of the guys there was always asking for assistance with the simplest of support calls, or giving incorrect advice. I won't give his full name for fear of a lawsuit, but he gained the nickname behind his back of 'Useless Eustace'.
A few years later I was contracting at another outsourcer, I happened to look at the user profiles on the PC I was using and spotted a familiar name. I turned to one of my colleagues and said 'Oh, I see you've had XXXXX working here', his response was 'You mean Useless Eustace...?'
Had something similar, a User whose first name was not Wayne although that's what we called him when he wasn't around. He would take it on himself to move his PC half-way across the office (despite his boss telling him not to. Every. Bloody. Week.) then log a call complaining his mouse wasn't plugged in - of course not, he had deliberately left it on his proper desk. He would often move a networked printer (HP LaserJet 3) because he could not be bothered to get up and walk two desks there and back. He had lots of other tricks but if I give too many away, he might recognise himself. He might anyway.
His surname was Kerr...
I can't begin to describe the number of Windows issues I've had to spend far longer than is sane trying to resolve because the first few pages of search engine results are just Microsoft forums with the same useless advice from Microsoft shills, which inevitably starts off with "run sfc /scannow" and either ends with no resolution or with the OP having reinstalled Windows.
> We had a developer who spent all of his time talking and doing not much work. Customer's loved him because he could talk the talk
I once went for a job interview, but came second because the company felt that the "winner" had better technical skills.
C'est la vie. Until I got a phone call a few weeks later; turned out the guy had completely blagged the interview and had the technical skills of an untrained monkey, so they'd sacked him off and had to start the entire recruitment process again...
I got my first job that way as well, although thankfully they couldn't be arsed going through all the interviews again and simply offered me the job as I was apparently their "second best" candidate. Thought they were just being kind when they told me that originally, until they phoned me two weeks later to offer me the job.
Offspring was applying for paid undergrad internships as a year within her degree.
One large company wouldn't confirm she hadn't got the post. Even several weeks after.
Eventually they admitted that she was best candidate and the others had been told "no" weeks before- but they'd had to decide between her and giving the role to someone already in the company (i.e. not an undergrad intern). And it was "no". So she accepted the other one she'd already been offered.
Then company called and offered her the post she'd been turned down for anyway.
Too late. She'll spend her year with a large multinational tech company. Probably not as interesting a product line (she's not a techie) but much better working conditions and opportunities.
We had almost exactly the same scenario years ago of two bouncing out-of-office messages, neither with "send once" set. It took a while to manage to stop it happening as the server was so overloaded. I can't recall now whether that was with Notes or with the predecessor system which was a proprietary email box connecting to the world via an ISDN line, I suspect it was the latter.
That reminds me of an incident we had at a previous place, way back around the turn of the millennium, when email was in its infancy. Our first email server was a package whose name I can't recall, running on a Mac IIcx that was a hand-me-down from the art & design dept. We were migrating to a more capable server, and to avoid the "go round and do everyone's settings at once" trick, did mailboxes one at a time - and as each user moved, we set up a forward from the old server to the new one.
It was working fine, then one day we "had a problem". The two servers had different mail size limits set. A largish mail came in, and was forwarded to the newer server, where it exceeded the mail size limit and got bounced - back to the old mail server. The old mail server then attempted to deliver the bounce message (which included ALL of the original large message) to the user at the new mail server. It was actually rather fun to watch - it went fairly slowly, as both servers were not exactly the fastest computers around, and we were probably still on 10M Ethernet back then.
Even more fun was when someone external sent a 10MByte email to a user at a remote site. Our internet was 64k ISDN, and it was another 64k link to the remote site. When the email "hadn't arrived", the external user resent it. Then they sent it again, and again, and ... Back then, Demon allowed you to query the mail queue - and there were several of these 10MB messages blocking the line and causing all the other email to be delayed. "Explanations were given" to the user regarding the practicalities of large emails, and why not to expect instant delivery, and why not to have them resent.
One of our consultants had been working at a major customer's site for several years, and had an e-mail account in the customer's system. Because he was rarely in the office, he set up a similar rule in Exchange to forward all of his internal corporate e-mails to his alternate e-mail address with the customer, with receive acknowledgement enabled. Everything worked fine until the contract he was working on was completed (successfully, I should add); the customer deleted his e-mail account at COB on the Friday.
Urgent call for me on Sunday afternoon from one of my company's directors - they needed to get a significant proposal out to another customer PDQ, but the e-mail system had ground to a halt. A not-so-quick remote log on to the e-mail system later (for some reason, soon to be discovered, the corporate Internet connection was very slow as well), I had a look at the external e-mail server's logs (no, not Exchange - the relay server was Postfix, something I was really glad I had set up) and discovered that it had over 30,000 e-mails queued for transmission - the two e-mail servers had spent the entire weekend playing ping-pong with the messages until things had reached pretty catastrophic levels. More to the point, the process was continuing even as I watched.
Emergency action was very quickly taken. The relay server was shut down (the metaphorical silence was deafening) and its outgoing queue was purged. Incoming messages for the guilty party were temporarily redirected to a bit-bucket (to allow the customer's e-mail server to calm down and purge its queue), and a message was sent to everyone in the company saying that any outgoing e-mails might have been lost and might therefore need to be resent. A more thorough report was sent to the director, who had a little "chat" with the guilty party on Monday morning (while I was resetting things back to normal).
Why would you tell them something that isn't true? Chrome only keeps as many tabs actually running as your machine can handle - at most, since tabs that haven't been touched for a while get suspended regardless.
If for some reason it offends you, show them how to use a tab-saver.
The question of interest is why users are so scared about closing down running programmes/web pages. It points to a fear of not being able to find stuff. Which itself implies poor training and arguably poor software that just isn't as clear and intuitive as we would like to think it is.
"Could Windows 95 even accept 2GB of Ram?"
Theoretically yes. There was some sort of limit at ~512MB with all Win9x systems that needed configuration trickery to overcome.
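For the curious, the "configuration trickery" was usually capping the VCache disk cache in system.ini, since on Win9x it could otherwise grow large enough to exhaust the system arena at high RAM sizes. Something along these lines (the exact figure is from memory, so treat it as indicative):

```ini
; system.ini - cap the Win9x disk cache so it doesn't eat the
; system address space on machines with ~512MB or more of RAM
[vcache]
MaxFileCache=524288   ; value in KB, i.e. a 512MB ceiling
```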
"We are talking 1994/1995 here, had someone given him a machine that was supposed to be the company server by accident?"
I'm pretty sure there were no x86 computers at the time that could sport 2GB.
A Compaq Proliant 4500 (a high-end Pentium server at the time) could be fitted with 1GB of memory. You just needed an extra daughter card to house them all, plus the latest 64MB 72-pin SIMM modules. A single such module would cost (at the time) maybe $4,000 - $5,000, so we're talking about a server that would easily cost way over $100K (including the hard drives etc.).
I built a Pentium 100 computer in early 1995 and the 32MB RAM was easily the costliest part when the norm was perhaps 8MB of memory.
256MB was starting to be the norm around 2002 or 2003 and 2GB was getting to be a standard baseline around/after Windows 7 was launched in 2009.
So yes, Al is talking complete bollocks here.
Not exactly on topic, but it does involve email. I had been testing some new software and had an email sent every minute at each db resync. Well, I travelled home, relaxed a minute, and read This Very Article before checking my email. Yep, as you can guess, me, my team, and worse, my boss - who has been following this project - had just got dozens of happy sync reports, inboxes filling fast.
Thanks REG for the spark of recall.
And now come the calls.....
I was once tasked with cutting costs in an office that had about a hundred users, which paid vendors for custom data feeds. Some feeds were cheap, some were expensive, and some were astronomically expensive.
The system had built up over several years, both the original users and the support staff had moved on, and the only documentation guaranteed to be accurate was the current invoices. Looking back at the pricing logs over the years was like reading "Hollywood accounting", as vendors had given special deals to compete with one another, the undocumented specifics of which were lost to time.
As explained, the job was supposed to be looking at the network logs, seeing what data feeds a user was using, asking him if he really needed them, then cancelling the unneeded data feed to save money.
In a lot of cases, a user would be accessing a high cost data feed but only using a portion of the data, and what they were using was available from another vendor on a different (and cheaper) feed, and we could switch them over.
The theory was that the users would tell us when they weren't using something, and we could cut it.
The theory was, of course, incorrect. Every user was adamant that they needed what they currently had. No changes would be tolerated. Everything was essential, including the user who was using the fourth most expensive feed despite covering everything but the feed's clock with other windows. Why did she need this particular feed? Because she "liked the way the clock looked".
I actually quoted that in my report.
Obviously, the users weren't going to be any help, so we resorted to a more effective method of analysis: brute force.
Now, our network analysis showed which feeds were really being used all the time, and we left those alone. It was the "blue moon" ones - things that were only accessed once a day, or weekly - that were the best candidates for culling. So, we simply unplugged them, and waited to hear screams.
If we unplugged something and a user screeched that his feed was down less than 30 seconds later, that one was obviously in use, and stayed in the "keep, analyse later" pile. If no one complained, we'd wait to see how long it was before there was a complaint, and after a while, downgrade the speed/usage rate (some of these feeds had metered options rather than just unlimited), or cancel it altogether.
We saved enough money to meet the goal, and life went on.
I left the company and went on to greener pastures after Christmas. One day, in mid-July, I got a phone call from my replacement. It turned out that user X had tried to access his "favourite screen", the one he used every damned day, and it turned out that the feed had been cancelled. By me. Five months earlier. And yet it had taken him half a year to notice that his favourite screen had been empty for several months...
All those fresh-faced college students returning from a summer of fun in the sun followed directions which included auto-forwarding mail to yourself. Enough students auto-forwarded to themselves to crash the VAX/VMS system. They finally figured out not to include that particular instruction by my junior year.
Biting the hand that feeds IT © 1998–2020