Buffer overflow DoS attack.
Self-initiated. Caused by using the system as it is advertised.
Sometimes you take the call. Sometimes you place it. Welcome to an On Call where our reader is on the other side of the telephone. Today's story comes from "Mark" (not his real name) and concerns the time he was charged with developing a document management application for the university department where he worked. The plan was …
> 640KB is enough for anyone
Which is often attributed to Bill Gates, but he never said this.
The variant from 1980, in which he supposedly said that "nobody would ever need 640K of computer memory", is wrong too.
The source could be an IBM PC designer saying it, but even that is not really proven - though it is more plausible.
It is, as it seems, simply a myth. Like Abraham Lincoln said: don't believe everything you read on the internet.
He didn't implement it - why are you spouting such nonsense? Why do you post such things in a public forum? If you're going to troll, learn to be a better troll.
Above 640 KB, 0xA0000 to 0xBFFFF was reserved for VGA, 0xC0000 to 0xDFFFF for option ROMs and video BIOS, 0xE0000 to 0xEFFFF for other BIOS uses, and even 0xF0000 to 0xFFFFF was reserved in some boxes.
It is IBM PC HARDWARE DESIGN. If you had more than 640 KB, the memory that would have occupied 0xA0000 to 0xFFFFF was "shifted up" by 384 KB. Still remember himem.sys or EMS?
Today the same is done for 64-bit OSes: "simply" remap / shift it. 32-bit OSes (where the limit actually bites) don't do it: since PCI memory mapping steals between 512 MB and 1.5 GB of RAM from the top of the lower 4 GB, they cannot make use of the full 4 GB.
Ok, pedantic mode on…
“ even 0xF0000 to 0xFFFFF was reserved in some boxes.”
0xFFFF0 had to be reserved at the very least, as the Intel 8086 processor used it as the bootstrap address on power up - it generally contained little more than a jump to the real startup routines, but it had to be there. One of the defined methods for rebooting a PC was to JMP FFFF:0000 (segmented addressing = 0xFFFF0). See Intel_8086_Overview.pdf
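The segment:offset arithmetic is easy to check; a couple of lines of Python (illustrative only, not from the comment) show why FFFF:0000 lands on 0xFFFF0:

```python
def linear(segment: int, offset: int) -> int:
    """8086 real-mode address: 16-bit segment shifted left 4 bits,
    plus offset, wrapped to the 20-bit address bus."""
    return ((segment << 4) + offset) & 0xFFFFF

# The reset/bootstrap vector: 16 bytes below the top of the 1 MB space.
print(hex(linear(0xFFFF, 0x0000)))  # 0xffff0
```

Note the 20-bit wrap: FFFF:0010 comes back around to address 0, which is exactly the quirk the A20 gate existed to control.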
“It is IBM PC HARDWARE DESIGN. If you had more than 640 KB the memory from 0xA0000 to 0xFFFFF the memory was "shifted up" by 384 KB. Still remember himem.sys or EMS?”
IBM maybe, but not the PC - the PC could only address 1MB. So it couldn’t really be “shifted up”. EMS involved “shifting sideways” where a segment of memory would be swapped in and out of the upper address space so you could access more than the 1MB, but not all at once.
* You import the data by going to this drop-down and selecting option X.
* There *is* no option X!
* You import the data by going to this drop-down and selecting option X.
* LISTEN TO ME, THERE IS NO OPTION X!
round and round in circles....
* To import the data you must have ABC access permissions set by your manager, and then you will be able to select option X.
WHY THE FUCK DOESN'T OPTION X APPEAR THEN TO TELL ME THAT'S WHERE IT IS AND TELL ME I HAVE TO ASK FOR EXTRA PERMISSIONS!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Some days I really really really want to go to wherever the programmers are with a long stick and repeatedly hit them very very hard.
Had that one too whilst trying to fix a discrepancy on a stock system. Talking to the support bloke who tells me to use the reconciliation function from one of the drop-down menus.
Me: Which one is it under?
I don’t remember, and I’m not at my desk so just try them all.
Me: Okay I’ve done that and it’s not there.
Did you go into the sub menus?
Me: Yes I tried everything I could find.
Okay I’ll call you when I get back to my desk.
Some minutes pass…..
I’m at my desk and it’s the third one along, third one down.
Me: Nope not there.
What version are you using and what privilege level are you?
Me: Twelve and I’m level two
Ah you need level Three……you should be Three if you’re support, is there someone who has that there?
At this point, dear reader, the store manager logs in and reconcile is there: third menu along, third option down.
Me: Why isn’t it visible at lower privilege levels even if disabled?
Well, people might be tempted to try and use it.
Me: So put a message box to come up saying you need to be Three or above
I’m in customer service not development, thank you!
And wait for the support calls complaining that lots of menu items have appeared/disappeared for no apparent reason.
When an ex-colleague last spoke to me they’d just switched to Office 365 on the web. This was dressed up as a “making working anywhere easier” move, whereas it was actually just a cost-saving exercise. Suddenly IT were deluged with irate staff wanting to know what had happened to features they used in Office. Custom dictionaries, or rather the lack of them, being one of the biggest bugbears.
When you say that… A few days ago my wife couldn’t pay in a cheque with her Barclays banking app. So I tried it on my phone; couldn’t do it either. Googled it, and the instructions say click here, click there, then select “pay in cheque”.
I call Barclays support. The very friendly and competent lady says “I’ll try this” and 10 seconds later “the option is gone on my phone as well”. I said thank you very much, I’ll try it again later, and made her promise to tell her developers not to _remove_ a button when they have problems but to disable it instead.
Few hours later it worked again.
In the early days of SQL Server 6 (when it was much the same as Sybase 4.2) we sent in a question to MS that must have been better than we suspected as within two days we had a stinking god of programming sent over from Seattle to see what we were doing. I pity his neighbours on the aeroplane, his physical hygiene really was deficient.
I was surprised at the difficulty of the problem as we weren't trying to do anything complicated - we couldn't, it was our first time playing with relational databases and we were still bashing the rocks together.
After three days staring at the screen, grunting occasionally and honking out the office, there was a particularly loud grunt, a flurry of typing and he said, "Got it. I'm going back".
Three days later we had a patch and all was well with our database.
Did you know there's a maximum subfolder limit in a Microsoft Exchange mailbox?
We didn't, until we logged the PSS Request with Microsoft.
In their defence, neither they nor we expected a user to attempt to file their email into 5,000 subfolders manually, and by the time we worked that out, the scope of the problem was well beyond a technical one...
I'm sorry but anyone who has managed Enterprise level mail systems will know that users start at the WTF level and then see how deep they can dig.
We had a team who insisted that they needed a new sub-folder for every day so they could easily find emails and another team who had to have the full company name of the sender as the sub-folder name, with sub-sub folders for the different offices of said company leading to paths such as Joe Soap Ltd/Joe Soap Ltd - Croydon Office/December 2000/Received. Doesn't take long to hit all manner of limits with those ideas
The only way to curtail this level of file management idiocy is to charge them by the subfolder, and length of folder name, plus a premium for spaces and other stupid chars.
Also annual "Stupidest filepath in the company" awards.
Instead we give them ever more sophisticated file systems which "enable" ever dumber behavior.
Let's remember here that the users are (hopefully) using the stored emails to conduct the business the company exists to conduct. They should be enabled to store and find the documents they need effectively.
In the past big filing systems would have all sorts of weird codings, so that the discussion about production problems of the gimbal-bracket spacers for Acme-co's MarkII version 2 widget, 1983 model, was DDDLK/X23X/9782/3 - so that if you knew that you could go straight to it, and if you didn't you were in trouble.
Shouldn't it be possible to provide users with something better? Perhaps knowing a folder name, even a hierarchical folder name isn't the answer. Maybe it is the answer. But shouldn't it be IT's job to do something radical such as find out what the requirements are, what the constraints are and devise something that works?
I've seen folders on the server for wireless drivers; they were called
New Cisco drivers
Latest Cisco drivers
Then we had laptops with Intel WiFi hardware. IIRC, they were stored in the "latest Cisco drivers" folder...
I'm just going to store the stuff I need on my desktop pc and not worry about that
Our SAs who Must Not Be Questioned set up all the trad Unix versioning tools as "SA permission required", for reasons that passeth all understanding.
I asked my Boss - who was hot to trot on getting rid of blah.old, blah.<date> files littering our servers - for a copy of the git software (I had no download or install privs). Refused.
I explained it was in use around the world *and* by all our vendors. Refused. He told me we had a "Microsoft product" that would do nicely.
"What's it called?" Dunno.
"Is there any training to be had for it?" No.
"Where's the documentation for it?" Dunno.
"Who's the admin?" Dunno.
So I went away and in a rage wrote a perl script that would make copies of any file into a .old version. You typed notgit myprog.sql and would get myprog.sql.old but - and here is the !clever bit - it would first look for all existing .old files and push each one down a level by appending another .old.
So if this was your fourth "version" of myprog.sql there would be myprog.sql.old, myprog.sql.old.old and myprog.sql.old.old.old too.
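The original was Perl and isn't quoted in the comment; a minimal Python sketch of the behaviour described (the name notgit comes from the story, everything else is my guess) might look like:

```python
import shutil
from pathlib import Path

def notgit(filename: str) -> None:
    """Copy `filename` to `filename`.old, first pushing any existing
    .old, .old.old, ... copies one 'old' deeper - deepest first, so
    nothing gets clobbered along the way."""
    src = Path(filename)
    # Collect the existing chain: file.old, file.old.old, ...
    versions = []
    candidate = Path(str(src) + ".old")
    while candidate.exists():
        versions.append(candidate)
        candidate = Path(str(candidate) + ".old")
    # Rename deepest first: file.old.old -> file.old.old.old, etc.
    for v in reversed(versions):
        v.rename(str(v) + ".old")
    shutil.copy2(src, str(src) + ".old")
```

Run it three times on myprog.sql and you end up with myprog.sql.old, myprog.sql.old.old and myprog.sql.old.old.old, exactly as described.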
I showed my boss when he came round to nag me. He was livid. I explained with my best "keen innocence" look that one could keep track of changes and fork from old versions by simply counting the "old"s and subtracting one, but he was unimpressed and shouted at me for not using the Microsoft product.
I did my "hurts borne manfully" face and explained that I would have done so but that I did not know what it was called, how to use it, where to find the documentation for it and could not contact the admin for guidance because I was unaware of his name, at which point my boss started to make wheezing noises and stomped off.
Later, we were in a meeting and he announced our dotnet devs had a new version of Toad that had a versioning engine in it. He said he did not know what the underlying technology was. I told him I did. He asked me for the name.
"Let's remember here that the users are (hopefully) using the stored emails to conduct the business the company exists to conduct."
Therein lies a potentially big problem...e-mails should be regarded as transitory, not as formal records
Suppose the ever popular Alice and Bob work for separate companies, but they figure out a plan to make one or both of their businesses better.
Alice sends Bob the revised process description.
To cover their backs they both send a copy to their respective bosses, Agatha and Bill.
Agatha and Bill realise that this has commercial implications, so send copies to their respective commercial, legal and QA departments. The finance dept. may also get involved.
There are now (at least) 10 - 12 copies of the document floating around.
In quite short order, Alice and Bob may make minor changes, the commercial and legal folks will also want to communicate with their counterparts and change things ...
If you believe that all 12 copies of the document will remain aligned, I have a bridge to sell you :-)
When recovering from a SharePoint/Exchange outage, the choice of 'point of truth' will be stressful for the (innocent and not involved) IT techie [who will pick the 'wrong' one from the point of view of many].
I know that using Git, SharePoint ... would ease the issue but, in my experience, getting management types to use them properly (or even at all) is an exercise in futility and frustration when simply attaching stuff to e-mails is easy to do.
If, by some miracle, things turn out well, Agatha and Bill celebrate and get their bonuses; if normality ensues, Alice and Bob suffer career limiting reviews for 'not following process'. Either way Microsoft gets its licence income.
"Therein lies a potentially big problem...e-mails should be regarded as transitory, not as formal records"
In the sort of situation you describe there are likely to be additional documents of some nature with the emails as covering notes, just as they would have been exchanged by letter post back in the day. But also in that situation Alice or Agatha and Bob or Bill, whoever are authorised to act for their companies, can agree on a final version by email.
I've written elsewhere in these comments that the significant unit of communication is the thread, not the individual messages. In the situation you describe the back and forth emails between managers will actually keep track of the changes. They might not be able to handle any more rigorous form of change control but email with its threading will be the one thing they can and will handle.
Twenty years ago we were doing business whereby additions to a contract and consequent additions to the agreed data dictionary, inter-developer discussions of additions to the XML schema and test data were all carried out by email. Only samples of the final product required anything more. It wasn't difficult and it didn't lead to any problems even though one of the other parties was one of the notorious Usual Suspects in the realms of outsourcers.
I have seen this happen in real life.
There was a point in time when a minor dispute arose. The 'working process' (as defined by Alice and Bob -- the people actually doing the job) document held by the "A" organisation differed in several small, but apparently significant, ways from the 'commercial' document which the "B" organisation was using. This caused some dispute in payments.
Tracking down the sources of changes, who suggested, who agreed, and who approved them... through individual e-mail messages was a nightmare [thankfully not for me, but it was so bad that even I had sympathy for the commercial/financial teams]. Even then, locating all of the various copies of the downloaded attachment documents was never truly completed (and as for those on backup tapes, archives...)
E-mail traffic for 'meta data' could just about be managed; for anything more serious than that, there should be references to one, agreed, change managed point of truth.
I'd like to say that painful lessons were learnt, but ... well "messengers were shot"
E-mail is good at generating [large amounts of] *data* ; less good at maintaining *information*
e-mails should be regarded as transitory, not as formal records
Indeed, this is a load of nonsense.
In the nineteenth century, business correspondence was manually copied into chapbooks. When flat filing was invented, it would be copied onto separate sheets, then filed. The practice continued under various technological advances such as vertical filing and carbon copies. (See for example Yates, Control Through Communication.)
The practice has persisted because business correspondence reifies critical institutional memory that is not otherwise captured through formal recording mechanisms.
I have email messages in my archives that are more than a quarter-century old. Occasionally I consult them. They remain relevant.
This attitude that no one needs to keep email – espoused even by some quite clever people – is naive and unsupportable.
> Even Exchange nowadays (well, for most of this century) is smart enough to only store a single copy of an email message addressed to many.
Nope, they removed that in Exchange 2010. The reasoning was: customers have more than one database anyway, and more than one Exchange server, often one for each subsidiary. Using Single Instance Storage within one database does not save that much space, well below the 10% margin for such a setup. So it was removed in favour of performance, at the cost of disk space.
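For anyone who never met it, the idea of single-instance storage fits in a few lines - this is a toy model for illustration, nothing like Exchange's actual implementation:

```python
import hashlib

class SingleInstanceStore:
    """Toy single-instance storage: one stored copy per distinct body;
    mailboxes hold references (content hashes), not copies."""
    def __init__(self):
        self.blobs = {}       # sha256 hex digest -> message body
        self.mailboxes = {}   # recipient -> list of digests

    def deliver(self, recipients, body):
        key = hashlib.sha256(body.encode()).hexdigest()
        self.blobs.setdefault(key, body)  # stored once, however many recipients
        for user in recipients:
            self.mailboxes.setdefault(user, []).append(key)

store = SingleInstanceStore()
store.deliver(["alice", "bob", "carol"], "10 MB newsletter body")
# three mailbox entries, but only one stored copy
```

Drop the shared blob table and every mailbox carries its own full copy - roughly the disk-for-I/O trade described above.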
Yep. The loss of Single Object Storage (or whatever they called it) was why we migrated from Groupwise directly into Live@EDU/Office365. I went to the "Intro to Exchange" course in like 2010 as we were thinking it was time to take GW out to pasture (*sniff*), so I figured it would be good to take a couple of classes on Exchange administration. Once I heard they'd removed that feature from 2010, I told the boss, and we decided to skip on-prem, and eventually went to O365 in 2014. The good news was I didn't have to sit through any more of those courses. The bad news was we had to migrate from Groupwise 8 to O365. That was a pain I hope to never, EVER experience again. 6 months of my life wasted while slowly, slowly, slooooooowly shuffling email messages out to O365. So slow, so fraught with problems and issues. And once we finally got all the kinks worked out and thought we were gonna fly, Microsoft's throttling kicked in. Never, ever, again will I go through that.
And honestly, getting rid of Single Object was a stupid idea for on-prem, especially for a small org with 1200 users who have thousands of emails from the past 15 years. Folks who NEVER deleted the weekly chirpy, 10-megabyte graphic-heavy newsletter from Marketing. So we would lose a few Gigs of storage on the server for EACH of those newsletters? Eh, no thanks, chaps. Luckily at the time MS was giving away all-you-can-eat O365 licensing for all EDU volume-licensed schools, so it didn't take too much debate to make that decision. Once the migration was done, it worked pretty well. Would be better (even today) if Microsoft wasn't constantly fucking with the back-end stuff, though.
They got rid of it because it was a problem! Single Object Storage (appropriate acronym SOS) was a major cause of Exchange database failures. Not to mention the overhead of keeping track of attachment changes.
By the time they removed it, storage was becoming cheap. It just wasn't worth the headaches to keep it!
Funny, Groupwise always had that feature AFAIK, and in the 15 years I adminned our Groupwise system, we never had a database issue more serious than a few orphaned attachments that GWCheck fixed with no sweat.
I do wonder if one reason they removed it at that time was to push more people towards their upcoming O365. Storage may have been "cheap" for corporate, but for EDU, anything more expensive than "Free" wasn't cheap. So letting Microsoft worry about the storage space required from losing SOS was a big (possibly the biggest) reason we went to O365.
I can't say I'm sad that I'll never have "Exchange Admin" on my resume. Sounds like a total nightmare.
Just as well.
I remember an email going round the UK arm of a multinational I used to work in, circa 2003, stating that our Exchange servers were running out of disk space and we all needed to urgently archive or delete old messages.
We dutifully went through our inboxes deleting our unneeded messages.
Cue a snippy email a few days later saying that we were getting dangerously close to our limits so we urgently needed to delete unwanted mails.
We'd all deleted over half the emails in our inboxes, as we could all see from our quotas, but the management were looking at server reports and were convinced we'd done nothing.
Turns out that having a few thousand employees delete the staff newsletters attached as bloated Word documents to the regular emails from senior management and HR doesn't make the damnedest bit of difference on deduped storage, and all it achieved was wasting several man-years of labour.
...which would all be forgivable if it wasn't a multinational IT services company, who made a lot of money managing Exchange servers for clients.
We could have saved a shedload of cash if they'd just given the advice at the beginning that they gave us the day before email would otherwise have died:
Archive all your individual email over x months old. (Cue everyone panicking about failing to follow procedure on filtering project-related correspondence into project mailboxes.)
Every year, my company gives document training. The training is simple: all documentation, including email, can potentially be subpoenaed for a court case. Therefore, it is not transitory; each email is a stored document. There are even document destruction rules we are supposed to follow.
I don't follow them though. I maintain as little paper as possible, and have most of my email set to autodelete after 90 days. If the company wants it retained they can do it. After 30+ years, evidently I've not been important enough to matter because nobody has ever complained. Nor can I be accused of hiding anything - my email rules were set up shortly after starting and I never change them beyond adding an occasional "delete all email from @domain" rule every time IT wants to do a phishing test. They only use a few domains so it didn't take long to stop their obvious tests.
And that's a whole different pisser there. Once I reported a suspicious email that turned out to be a real, legitimate, not spam email and they tried to make me attend a bunch of training on how to recognize spam.
Their emails and training clearly say "report anything even remotely suspicious." What they actually did was train me to stop reporting anything. You give disciplinary training to the guy that clicks the link that loads the virus, not the guy that reports something suspicious that turns out to be a false alarm. Better the occasional false alarm than a ransomware lock.
That presupposes a company-standard way of doing things that meets the company's needs.
That sentence implies quite a lot of preparatory work. What stands out here is that all too often that's not been done so either there's nothing to teach or what's taught isn't guaranteed to meet the needs.
Ah, but you forget that as soon as the Clever Young Things are hired they fall under the powerful illusion that companies buy, maintain and replace computers for a living.
Ollie White had a radical suggestion in his course on Material Requirements Planning: Take your software staff onto the shop floor and show them what the company does and who does it. That way, that bill of materials breakout is no longer a report that can be delivered just so good, a bit late if necessary, but a vital cog in the wheel that grinds the grain to make the bread for the table of the programmer.
I did a contract at a large concern that manufactured postage machines where they turned this on its head and had manufacturing bods as IT team managers. Unfortunately, they forgot to do the walk-through for the team members too, so there was still a basic disconnect with what the IT team hunchbacks thought they did for a living.
It is truly depressing the number of System Administrators in my current workplace who believe they are the top of the heap - and therefore must not be questioned when they make decisions - rather than plumbers needed to unblock the too-small pipes they installed for the real job we do.
Time for a snack and a cuppa.
With Windows 2000 against Novell NetWare servers it was possible to create paths that were too long for MS Office apps to read. What's more, the practical limit wasn't completely consistent. As I recall the documented limit at the time was 216 characters, but various apps would actually read anything up to 230/240 - Excel and Word were different IIRC!
The users seemed to love moving deep directory trees, so it was a surprisingly common problem - sufficiently so that I had to put together a utility that ran a daily report and emailed the recorded file owner to warn them and ask them to do something about it.
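The reporting half of such a utility is simple enough. A bare-bones sketch in Python (the 216-character limit is the one mentioned above; the owner lookup and the nag email are left out):

```python
import os

def overlong_paths(root, limit=216):
    """Walk `root` and yield (path, length) for every file whose
    full path exceeds `limit` characters."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            if len(full) > limit:
                yield full, len(full)
```

Feed the output into whatever looks up the recorded file owner and sends the daily warning.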
Especially when Windows "helpfully" shows only part of the path with ellipses, the amount it shows varying by which application or file manager you are using to dig down with, because the address bar isn't allowed to scale with the content in case it messes up the pretty layout of tool icons.
Depending on what's being organized, that makes sense. First, it puts the subfolders into a sorted order even if the labels are not in alphabetical order. Sometimes this is unnecessary. Sometimes it is desired. Second, it allows people to be sent to a place with a numbered key, not a full path. With text labels after the numbers, it retains the ability for them to find things if they don't have a numbered key. I don't do it, but I can see some reasons why others could want to and get benefits from doing so.
But I do find that Sharepoint quickly becomes impossible to use if you consider the full file path to be naming information, because Sharepoint doesn't (seem to) treat them as such in your search.
It also doesn't show the full path/breadcrumb trail (at least not how it's configured here) so all I get on my screen is the filename and the site name. I'm looking at 10 near-identical filenames on the same site just now.
Hate it hate it hate it.
Except, in Outlook, you do actually kind of have to do this. If you search for an email *it only tells you the folder it's in and not where that folder is*.
It's possible they've fixed this in later versions, or have it buried under some convoluted series of options, but this has come up several times where users have accidentally put something in the wrong folder and can't find it.
This drove me mad for years but there is a way around it, IF (big IF) you can find the email by a search. Once you find it, open it, click on the body, type CTRL+SHIFT+F to bring up advanced find and then you can search back through BROWSE on the top right to locate the folder
Still buggered if you can't find the email at all though!
And please don't shout at me if everyone knows this. I discovered it a year ago and I'm still chuffed, even when using it every day
What is it with Exchange search?
I asked a colleague for some information, which was supplied by return of email. I sent a "thankyou".
For the next year or so, whenever I had forgotten the detail in that information I would search for the email using a search term which was in the email's subject line but didn't crop up very often, so I didn't have thousands to check through to find the right one.
For most of that year, Outlook (desktop app) would find the email where I requested the information, and the email where I said "thankyou" but it wouldn't find the one in-between. Now this wasn't the end of the world as my thankyou had quoted the reply so I could scroll down and get the information so I just left it alone as "one of those things".
Then suddenly, towards the end of the period when I needed to refer to this information, doing exactly the same search *would* bring up the reply email! What happened there?
Depending on how it's set up, and which version, and what you are searching, and what connector you are using, Exchange and Outlook use indexed search, via one of many versions of MS search indexing.
Personally, I hate indexed search. I've got it turned off everywhere. Yes, mail stores and disks are bigger now -- but CPUs and disks are faster now, so I still don't want indexed search deciding for me which bits will be indexed.
And of course each space in a name on SharePoint gets translated to %20, and similarly any 'foreign' characters.
I never did persuade TPTB to stick to underscores to prevent the 'out of cheese' path errors because of excessive folder nesting, or even just to drop a couple of levels to save space... you don't need a "Working File/" folder and that just wasted 15 characters!
maximum folder depth on Sharepoint
Well, apparently you are not supposed to use folders in Sharepoint - you "simply" should tag all the bloody files. At least that is what I was told by some millennial Sharepoint Wiz Kid.
I can't -for reason of comment rules- write here how much I hate Sharepoint.
Hit the Exchange limits a couple of times in the support days. Usually resulted in major issues.
More surprising to me (at the time) were the surprisingly small file/folder limits in DFSR under Windows 2008 (IIRC 100k objects). This wasn't documented anywhere public, but we found out after the client's file servers collapsed in a heap after they'd hit around 8x that limit. Cue about a month of frantic calls with MS around-the-world support before we got it working again - on Windows 2012 R2, which had a higher limit.
DFS-R is only good for nearly static folders. If you have folders with lots of activity it will fall over. And if you have async DFS-R, which probably most have, it gets even better when users start to work on the same file. Extra fun: have a database file there. But even with sync DFS-R you can have a lot of fun... especially with database files like Access.
We had a similar thing years ago, but it was a limit in the namespace allowed for nested folder names in Windows. Think it was around the Win98 era and I can't remember what the limit was, but we had a particularly awkward user who liked to do things his way even if his way was bonkers to normal people! And he had very deeply nested folders. He also had a PC under his desk that held an exported copy of his GroupWise mailbox - that's all - because he didn't want to use the new email system to access all his old email that was sat in GroupWise! So he had the PC and accessed the old email via GroupWise WebAccess on the box. Same user also has about a million applications and tabs open at any one time and never shuts his box down. So eventually things get a bit shite and we take great pleasure in getting him to reboot his box to fix the issue! I did try and get him to change things but just got a "well that's how I like to work" answer, which is fine, but don't then moan when things start to break because you're working in a bonkers fashion! LOL
"Did you know there's a maximum subfolder limit in a Microsoft Exchange mailbox?"
Once upon a time business communication was largely by bits of paper. Big companies received lots of bits of paper. They filed them in folders. They filed the folders in filing cabinets.
If they needed more folders they bought more folders.
If they needed more filing cabinets they ... well, it depends: they might take the view that only really important stuff older than X years needed to be kept and the rest was dumped; they might take the view that it was important to keep old stuff but access time wasn't important, so it could be bundled up and stored in some cheap, off-site location; they might just buy more filing cabinets. Whatever the choice, extra storage was a matter of company policy.
Now we have allegedly wonderful electronic systems replacing the paper. It takes less room. It should enable companies to manage mail more efficiently than in the days of paper, and what happens? Storage is limited by vendor decisions and individual users are setting their own storage policies because companies (perhaps rightly, based on expectations of better facilities) don't.
I'm afraid MrsAC deals with this simply by having 80,000 emails in her inbox, although to be fair, the search function in the latest Outlook is actually getting quite good!
I also discovered this week that my mother uses the Trash folder for emails that she might want to come back to later (shortly before I had discovered that the iOS default is to delete the trash after 7 days...)
How many times do we read this?
And why? Because email clients are badly designed. Their defaults for handling read mail are unhelpful to non-existent. To some extent it's because email clients originally started out as message handling systems. Usually they now manage to link messages into threads, but all too often that seems to be an afterthought and not a core function (see 3 below).
How about something fairly simple:
1. The inbox is a list of unread messages. The moment a message is read it is removed from the inbox and cannot be replaced.
2. The Trash is a queue of messages or threads waiting to be deleted according to some set of rules - time or quantity related doesn't matter so much. Once the criteria are met it's gone for good. While it's in there any message or thread can be retrieved.
3. The unit of interest is not the message, it's the thread/discussion/conversation/call it what you will. Messages are their components. A singleton message - one with, as yet, no reply - is simply a component of a thread with one item; all threads start this way. Placing sent messages in a separate "Sent" folder where they won't get matched up with their replies or the messages to which they themselves are replies is just stupid.
4. By default all sent and all read messages initially get placed in a common folder. We'll call it something such as "Current". The contents of Current are presented as threads, not individual messages (see 3 above).
5. We don't want Current to grow indefinitely (the clue's in the name) so we'll have some sort of ageing rules to move threads that haven't been added to or read for some time into less current folders. The ultimate such folders will be archives, probably on a yearly basis.
6. In addition to such core management the user can devise any additional organisation they please, hierarchical, cross-referenced or whatever. For extra points such folders can be represented in the OS filing system. That means that if you have a folder for Project Rhubarb you can drag the email client's folder of email threads relating to Project Rhubarb into it as a sub-folder. Or when you opt to create such a folder in the client you can elect to place it there.
7. Some of 6 could be automated by rules - all emails from example.com addresses can be set up to go into Example Inc's folder. All emails from the banks' email addresses go into the relevant bank's folder in the Finance folder and so on. There's a mass of possibilities for helping and guiding the user to good practices.
8. Nothing described above need represent the actual storage of the threads and the messages which are their components. That can be done by using a plain old database. The client's folders are just pointers into the data. The "threads" in the OS folders are just files that contain sufficient information to tell the client where to find them on a read-only basis.
9. If the users really want to shove a file back into the Inbox to deal with later and can't, then maybe a Pending tray could be provided.
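Point 8 above can be sketched with a toy schema (the table names and layout are my own illustration, not from any real client): messages live once in a database, and "folders" are merely pointers at threads.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE threads  (id INTEGER PRIMARY KEY, subject TEXT);
CREATE TABLE messages (id INTEGER PRIMARY KEY,
                       thread_id INTEGER REFERENCES threads(id),
                       sender TEXT, body TEXT, read INTEGER DEFAULT 0);
-- Folders don't hold messages; they point at threads.
CREATE TABLE folders  (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE folder_threads (folder_id INTEGER REFERENCES folders(id),
                             thread_id INTEGER REFERENCES threads(id));
""")

db.execute("INSERT INTO threads VALUES (1, 'Project Rhubarb')")
db.execute("INSERT INTO messages VALUES (1, 1, 'alice@example.com', 'Kickoff', 1)")
db.execute("INSERT INTO folders VALUES (1, 'Current')")
db.execute("INSERT INTO folder_threads VALUES (1, 1)")

# Rule 1 falls out for free: the "Inbox" is just a query for unread messages.
unread = db.execute("SELECT COUNT(*) FROM messages WHERE read = 0").fetchone()[0]
print(unread)  # 0 - the one message has been read, so the inbox is empty
```

With this layout, "moving" a thread between Current, an archive, or a Project Rhubarb folder is one row in `folder_threads`; the message bodies never move.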
A number of mail platforms do allow you to configure pretty much everything you are asking for.
At risk of a million downvotes (and because it is what I support) I am going to say that even I could write the code in a Notes mailfile to make it behave that way.
The problem is that in almost every deployment, a mailfile/database is implemented out of the box and left to the user to customise, thus limiting the options.
"I'm afraid MrsAC deals with this simply by having 80,000 emails in her in box, although to be fair, the search function in the latest Outlook is actually getting quite good!"
It is better than it used to be, but why does it list the "Top 3 Results" then below that, list all the results, including the same Top 3 Results in the top 3 positions? And it never has my result in the Top 3 Results even though every search term I entered is definitely in the one I'm looking for? It seems the "Top 3 Results" are just the most recent emails which happen to match the criteria.
Lots of hidden limits in Exchange and Outlook.
The length of a BCC field and contact list. Both had (had?) a limit around 7,000 characters, after which they fail in an unhelpful way.
From a user perspective the problem is not the limit itself but that it's tied to the string length rather than the number of addresses.
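As a toy illustration of that complaint (the 7,000-character figure comes from the comment above and is not an official documented limit), a character-based cap admits wildly different numbers of recipients depending on how long their addresses happen to be:

```python
LIMIT = 7000  # characters, per the comment above; not a documented figure

def addresses_that_fit(addresses):
    """Count how many addresses fit in a semicolon-separated field of LIMIT chars."""
    joined, count = "", 0
    for a in addresses:
        candidate = a if not joined else joined + "; " + a
        if len(candidate) > LIMIT:
            break
        joined, count = candidate, count + 1
    return count

short = ["a@x.co"] * 2000
long_ = ["firstname.lastname@very-long-subdomain.example.com"] * 2000
print(addresses_that_fit(short))  # hundreds of short addresses fit...
print(addresses_that_fit(long_))  # ...but far fewer long ones
```

Which is exactly why a user can't predict when the field will blow up: the effective recipient limit varies with every list.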
"...MS Onedrive. Save a file to my Onedrive folder. Go into Outlook, try and attach that file to a message - "Sorry, you don't have permission to do that"
WTF ?? I own that soddin' document! What extra permissions do I need? MS has no idea...."
I've found I get all sorts of weird errors if I try to attach a document from OneDrive that is currently open in e.g. Word.
Just as a check - make sure it isn't open anywhere before you try to attach it. It may or may not make a difference.
Ah, yes, shitty error messages. I once wrote a little program that ran as an NT service and would select a random error number and display the message as a popup dialog at random intervals. There are a *lot* of weird Windows error messages.
There is nothing like a dialog popping up with the message: "The control file blocks have been destroyed.", along with an abort, retry, fail.
We would install it on unsuspecting co-workers that left their workstations unlocked. Make sure your sysadmin and security are in on the gag.
A year or two ago a customer decided to keep all the files for a project I was helping with on Onedrive ("It's in the Cloud, so it must be safe."). As we were getting close to the completion of the project I got an e-mail from them saying their files had disappeared - they couldn't see anything when they logged in to Onedrive and were more than a little unhappy about that.
Oddly enough, when I logged in to the same Onedrive I could see all their files - so I downloaded everything, and sent it to them with the suggestion that keeping local copies of their data might not be a bad idea, old-fashioned though that concept may be.
A good weekend to all Commentards.
What I really like about OneDrive is that its list of forbidden characters differs from the list of forbidden characters defined for the local file system.
So in some cases, when trying to open in the web interface a file created on a computer, you get a message like "Sorry Dave, I can't do that" and you have to rename your file on the computer, then wait for the synchronisation, and then you can use it everywhere...
Bouncing around between different OS's has taught me to only use alphanumerics in file names, and maybe a - or _ (if I'm feeling fancy).
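That habit is easy to encode as a check. The character set below is my own conservative choice, not any service's official list:

```python
import re

# Letters, digits, then optionally dots, hyphens and underscores.
# Deliberately stricter than any one OS or sync service requires.
PORTABLE = re.compile(r"[A-Za-z0-9][A-Za-z0-9._-]*")

def is_portable(name: str) -> bool:
    """True if the filename should survive every OS and sync service."""
    return PORTABLE.fullmatch(name) is not None

print(is_portable("report_2024-final.txt"))  # True
print(is_portable('notes "draft".txt'))      # False - " is banned on Windows/OneDrive
```

The point of being stricter than necessary is precisely the one the comment makes: the intersection of everyone's rules is the only safe set.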
It's the same with passwords, eg, I don't have a £ in a password, in case I end up having to try and logon to a machine with a US keyboard layout. (and yes, I have had that problem)
I used to use £ in passwords for that same reason. Trouble is many USian servers won't let you.
I'm a reseller for a European AV supplier... they moved their hosting to a US company we'll call "WhalesFarce". Now the character is blocked from use.
Happens far too often for no logical reason on many servers. Even seen it blocked on some hardware like routers. Let me have my £££ back.
Raspbian - Debian derivative for the Raspberry Pis. The Pi, the product of a UK company, yes?
Debian, keyboard set to UK, no problem with £ at command line, never has been as long as I can remember.
Raspbian, GUI application such as KWrite, LO etc. no problem with £.
Raspbian command line, keyboard set to UK - doesn't like £ at the command line, or in programs such as vi run from the command line, at all. To be fair, it's some time since I had occasion to set one up so it might have been fixed in later versions.
The trouble is, your user name will probably come out fine (because both en-GB and en-US use QWERTY). That means there's no way to notice that the computer is using a different map than the keyboard you're using, so you'll blithely type @ when you thought you pressed ".
Almost as much fun as trying to set up a computer in French, when you only have a UK keyboard.
> When your keyboard selection is wrong, use the ALT number keypad to enter your special character. [ALT]156 is £ in high-bit ASCII. [ALT]35 is #, which is £ in 7-bit ASCII character sets.
Window Key + Space cycles between the installed keyboards on a system, even on the logon screen.
I used to work somewhere that did a lot of remote support around the world using various remote access technologies. If a special character was needed in a password we tended to use ! as we found that it tended to move around the least on the regional keyboards that we encountered
It's the same with passwords, eg, I don't have a £ in a password, in case I end up having to try and logon to a machine with a US keyboard layout. (and yes, I have had that problem)
That won't be enough for a Turkish keyboard.... because the I key is the dotless ı, and the dotted i key is elsewhere.
(And this forum does not support kbd tags :-( )
You gotta love Turkish keyboards - I've seen that dotless I blow testing software right out of the water because the software assumed it was a lowercase L. Or, in one case, just simply couldn't deal with it and threw a 'test failed' every time it tried to do a text comparison.
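The dotless ı trap is easy to reproduce. In Python, for instance, the default (locale-independent) Unicode case mappings show exactly why a naive case-insensitive comparison falls over on Turkish text:

```python
dotless_i = "ı"    # U+0131 LATIN SMALL LETTER DOTLESS I
dotted_cap = "İ"   # U+0130 LATIN CAPITAL LETTER I WITH DOT ABOVE

# Uppercasing the dotless ı yields plain ASCII "I"...
print(dotless_i.upper() == "I")   # True

# ...but lowercasing "İ" does NOT round-trip to ASCII "i": it becomes
# "i" followed by U+0307 COMBINING DOT ABOVE - two code points.
lowered = dotted_cap.lower()
print(len(lowered))               # 2
print(lowered == "i")             # False

# So "lowercase both sides and compare" silently fails for Turkish text,
# and anything that assumes ı is a lowercase L fails even harder.
```

Doing this correctly needs locale-aware case folding, which most "text comparison" code in test harnesses never has.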
I ordered a Gigabyte Aorus keyboard recently from a company in Italy because the one I wanted was not apparently available in the UK. I specified a UK keyboard layout and -tada!!- was sent a Turkish one. I then found that the company didn't have a returns policy. YFW??? As it happens, it was one of the very few times I'd ordered anything through Amazon and their customer service was very helpful. Within ten minutes they confirmed the lack of returns policy, issued me a full refund and gave instructions to keep the then-useless and very expensive keyboard. Three days and £30 later it's now sporting a new set of Corsair UK keycaps.
I've had that issue with sharepoint as well, it allowed me to upload the file, but then no-one could retrieve it, turns out it was a hyphen which wasn't allowed. The admin modified some config file which then allowed it... if it's a valid character which works when you modify the config, why was it disabled by default, and why did it let me upload the file without warning me that no-one would be able to download it?
Had a similar problem with a social services application I used to support. Database was Oracle, ASCII (I think) character set. Users would copy and paste from various other applications, Word, Excel, whatever. Application would happily swallow whatever was copy pasted in, and then throw an unhandled error when you tried to retrieve that particular form because it contained what ASCII considered as control characters. Took a while to work out what was going on but earned me a lot of brownie points when I twigged and could fix the offending form generally within 30 minutes of being aware of the problem. Honestly, it was frequently a 5 minute job but never let the end users know it's easy. Supplier took 24 hours to fix such issues, their long term fix was to suggest we upgraded, at considerable expense, to their latest and greatest rewrite of the application. We did consider that, but it went nowhere, partly because their 'expert' didn't understand that their application when running on a Solaris box, pointing at a database on another Solaris box, didn't need, and in fact probably couldn't, map to the C:\ drive.
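The fix described - stripping what ASCII treats as control characters from pasted text before it reaches the database - can be sketched like this (my own minimal version, not the supplier's):

```python
import re

# C0 control characters except tab (0x09) and newline (0x0A), plus DEL (0x7F).
# These are what Word/Excel paste operations tend to smuggle into form fields.
CONTROL_CHARS = re.compile(r"[\x00-\x08\x0b-\x1f\x7f]")

def sanitise(text: str) -> str:
    """Remove control characters that break retrieval, keeping tabs/newlines."""
    return CONTROL_CHARS.sub("", text)

dirty = "Case notes\x00 pasted\x0b from Word\x1f"
print(sanitise(dirty))  # "Case notes pasted from Word"
```

Doing this on the way *in* is the cheap fix; the application in the story only blew up on the way out, which is why the bad forms sat undetected until someone tried to read them.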
If you're M$ you let it load to 99 percent in 3 seconds, hold at 99 percent for 3 hours, then put up a generic "++Out Of Cheese Error; Please Reboot Universe++" error. Guess it's easier than throwing up a ++Filename "..."; Invalid character in name, remove "-"++ error. I mean, the computer knows what isn't flying, so state it.
40 years ago... I remember working with someone who had written a program which worked. They added a comment, and it failed to compile with an internal error.
Take out a different comment and it worked. Cue lots of head scratching from the rest of the team.
The problem was that for a program up to a certain size, the compiler kept it in memory. Over a certain size, it had to spill and use an intermediate file.
There was a bug in the code which spilled to the file. Obvious with hindsight.
I never understood why anybody bought sharepoint.
It's a bad clone of some free open source Wiki software, except it doesn't work properly, and you have to pay for it.
I just love the feature where you copy and paste a picture into a page and it displays lovely on the screen. Then when you save the page and reopen it, the picture isn't there anymore.
Because you can't actually copy and paste a picture, you have to click insert picture and then upload it. And even then, if you make changes to the picture and upload it again, it carries on using the original one unless you change the filename.
If you are pasting text from other sources, the formatting and line spacing and font sizes get all screwed up and you have to resort to manually editing the HTML to make it look right.
And unless you insist on some kind of hierarchical page structure, most of the pages that users create are never looked at because they are impossible to find.
"most of the pages that users create are never looked at because they are impossible to find"
That was my experience of it as well - but in our case the indexing would also stop for no apparent reason, so documents couldn't be found by searching either. The sysadmins eventually cobbled together some sort of script that would re-enable indexing at the end of every day and get it to re-index *everything* to catch the stuff that had been missed, since there was no way of limiting it to stuff that had been missed while the indexing was stopped.
As you point out there are a lot of free, open source alternatives, so in the development team we refused to use Sharepoint for our internal docs, using our own wiki instead. Nowadays I'm stuck with Atlassian's eye wateringly expensive products, which at least seem to work well compared to Sharepoint.
I never understood why anybody bought sharepoint.
I think most smaller companies that use(d) Sharepoint just installed the free Windows SharePoint Services (later renamed Sharepoint Foundation) that was a component within Windows Server or could be downloaded from the Micros~1 website. Simple to set up initially - although the management and troubleshooting never have been! Free versions of Sharepoint are not available anymore.
The free Sharepoint was adequate for file sharing, creating lists and forms, and MS Office had built-in support for working with SP sites and files, but it was a lite version of what Micros~1 wanted to sell to the bigger organisations.
"It's a bad clone of some free open source Wiki software, except it doesn't work properly, and you have to pay for it."
It's the integration with Office and Windows that sure beats a wiki. I don't think SPS is so much used for internal websites but more for file sharing, questionnaire forms and such; it is quite customisable. The Sharepoint folder structures can be easily synced to your PCs.
I'm not advocating Sharepoint and I have a strong dislike to the HTML interface, but I can see why it is used in many companies. Purely for document management in all its glory I would choose something else, M-Files for example.
The worst part of SharePoint was trying to talk to it from outside code - its "API from hell". We simply needed to extract some text data and then send a boolean back in. I spent about two weeks on it, and in the end we simply bought a shareware web-scraper util for $5, coded around that in about a day, and it worked perfectly for about 3 years!
The amount of crap support libs you needed to load up and that CAML shite, just to get some text out!
My gripe with SP, as a Joe user only (I'm not interested in coding for it), is that it seems like every time you restrict permissions on one folder, it becomes invisible, and new users will simply never know it exists at all!
When working with it as a repository of docs, it's intolerable!
Also, yes, I've never seen a search work ...
And unless you insist on some kind of hierarchical page structure, most of the pages that users create are never looked at because they are impossible to find.
On the contrary, I've had SP working well (for a given value of 'well') for years with all docs (hunners of them) in one location, categorised using metadata; means that a document can be found by type, product and so on, without having to guess which folder it is in (or worse, multiple versions in different folders). Use revision control and insist on check-in comments and you have a decent revision history.
Search is still *^%$ though.
"I never understood why anybody bought sharepoint."
At a guess, partly because it was the (sort of) follow-on to FrontPage. I vaguely remember at one point MS renamed the FrontPage Client to Sharepoint. Or maybe it was the server component. Eh, the memories are fuzzy, and I never cared for it myself, but it was something like that.
So, not that long ago, I found a bug in the office 2010 VBA editor. It wasn't quite as bad as a hard-cap on lines, but, in essence - too much code could crash it.
I was working with this horrifically complex macro - basically an entire application that happened to use Excel as a GUI. Periodically the users would ask for new functionality, and I'd be tasked with updating it (I assume in a past life I was a terrible sinner). VBA isn't hard to work with really, and this monster had a built-in error-stack, good naming conventions, lots of comments, pretty much the ideal for a gigantic excel macro, if such a thing has to exist! And it was oh so functionalised, no repeated code, lots of tiny sub-functions being called whenever that would have happened.
For some reason, it crashed a lot, but only when being tested with the VBA editor open.
If I closed the VBE window and just clicked GUI buttons, it worked fine! But I couldn't so much as run a test function with the VBE open, lest it hard-crash to desktop. After a few times of re-typing the same changes, the first thing I did was make sure it saved itself before doing any tests, in case I forgot, and then I went digging in the (online) manuals.
Turns out that because VBA has access to the entire Office object model, that includes _the VBA editor itself_.
So now the code checks to see if it's being run with the VBE open, and if so, it closes the VBE window to re-open either at the end of the process or if the error handler is invoked.
Amazing the lengths we can go to to avoid re-implementing decade-old excel macros in a better behaved language, eh? Still it pays my bills...
"VBA has access to the entire Office object model, that includes _the VBA editor itself_."
Oh yes. Leading to the ability to do things that are either funny, or sabotage, depending on whether it's for fun or for work. IIRC you can step back through the code you've already run, change it, and execute it again, for example.
It can even be sabotage in the furtherance of work. I have a co-worker whose job is essentially reverse-engineering certain workbooks constructed by our partners, which are usually locked with password protection. I had to assist recently with getting some unlocking macros working - one of which unprotected any protected VBA modules by (apparently) calling Windows libs to patch the running Excel application to return success on the unlocking check. Because of course the VBA code is run by the Excel application, which is allowed to modify itself...
Incidentally it turns out unlocking the VBA code is very handy for picking up hints about what the password might be for the worksheets. Because they're doing fancy stuff from VBA in these workbooks, they have to unprotect the worksheets and re-protect them in the code, so you can pick up the password by searching for calls to the relevant methods.
A friend of mine made a living for a few years as a consultant, going to companies and unlocking their excel worksheets for them. Aside from that particular trick, depending on the Office version you might be able to find/overwrite the password in the file with a hex editor, or up until Office 2010 you could just have a macro guess the password.
Their hashing was not great, so you'd get collisions all over the place, usually you'd find a 6-character string that would be accepted.
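For the curious, the legacy worksheet-protection hash (as reverse-engineered and published in the OpenOffice.org documentation of the Excel file format; treat this sketch as illustrative rather than authoritative) folds any password down to a single 16-bit value, which is why collisions and short accepted strings are so plentiful:

```python
def excel_sheet_hash(password: str) -> int:
    """Legacy Excel worksheet-protection hash (.xls era), per the
    OpenOffice.org reverse-engineered file-format documentation."""
    h = 0
    for ch in reversed(password):
        h ^= ord(ch)
        h = ((h << 1) & 0x7FFF) | (h >> 14)   # 15-bit rotate left
    return h ^ len(password) ^ 0xCE4B

# The whole hash space is 16 bits, so by pigeonhole every password shares
# its hash with a vast number of other strings - including short ones.
print(hex(excel_sheet_hash("")))   # 0xce4b
```

With only 65,536 possible values, a macro trying candidate strings doesn't need to find *your* password, just any string with the same hash, hence the 6-character collisions the comment mentions.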
A lot of companies have critical stuff sat locked that was built by someone who's left the company. It works, but until they need to change it they just don't really realise how much trouble it is. And provided they actually own the file, it is legal to crack the password, and software can be found to do so.
Myself? I need a very good reason to put a password on stuff. I don't often deal with critical customer data, and the main use for protected worksheets is "stop me accidentally breaking something", so why set a password?
2001 is probably well into the era where this sort of thing was waning, as the migration to primarily internet-based support was already well underway by the time Y2K rolled around, but a decade earlier the behemoth that was Microsoft telephone support was truly a wonder to behold.
My own dealings with it were back in the summer of 1993, when my just-survived-freshman-year self parlayed my 1/4-of-a-bachelor's-degree "experience" into a summer job doing on-site development for a business graphics service in Manhattan. The shop operated under a business model where they would meet their clients wherever they were at, technologically, rather than imposing the file-type and -format requirements typical of more regimented, higher-volume services. In order to uphold their claim that clients could submit files in any format, from any application, and have them made into whatever combination of slides, printed documents, etc. they requested invariably meant that preflight involved a fairly robust arsenal of middleware translators, parsers, reformatters, etc. sitting in the path between the customer files and the slide/document printers, and that was where I came in.
Since the bulk of the development was being done in Visual C for their Windows production workflow, at some point during the course of the summer I ended up hitting some issue that, like Mark, left me dialing the support number listed in the software manual.
What a trip. Never before or since have I encountered a phone-support infrastructure so massive in both scope and complexity, not to mention so actively used that it easily justified the entire production. The call center maintained dozens of support queues for all of Bill's various products and services, so at any time there were easily hundreds of callers holding to reach a support engineer. And while we all waited? No canned, tedious hold-music loops for these hopeful petitioners, no sir! Instead, a live DJ "hosted" the on-hold experience, interspersing his musical selections with updates on wait times for the various support queues in exactly the manner of a drive-time radio DJ giving traffic reports.
Considering there was at least one occasion where I spent a solid 45 minutes on hold, the novelty was not unwelcome.
a live DJ "hosted" the on-hold experience
There's always something somewhere out to make on-hold worse. Usually it's recorded messages telling you that you can use the web-site whose failings drove you to try to phone in. But a DJ person with verbal diarrhoea would take some beating.
Only thing better would be a cynical & sarcastic DJ...
"Hey folks, the queue times are continuing to climb. If you just joined our program, it really sucks to be you. I could tell you how much the company values our customers, but if that was honestly true we would have enough help on staff.
Remember, if you don't want to hold the line any longer it is easier for us to ignore you if you open a support case on-line."
Seriously, I would sign an NDA & hold harmless contract just to listen to a DJ poke fun at the company for having me on hold so long. For those of us in the trenches - myself, the DJ, and the overworked tech taking my call - we might as well get a good laugh out of the situation.
When I worked for a bank, I was always given software to pilot as I always seemed to find bugs in it. I wasn't looking for them, I just somehow found them but when I reported them, it would be along the lines of "I did A, B, C and then D and it didn't work. However, if I do A, B, D, it works." rather than "Uh... doesn't work. Bye."
Similar to me - I could break anything (and still can, though retired). If you wanted something (major)bug-free, let me loose. Don't let me anywhere near a time-critical release that has been tested to death - by others. I'll find the showstopper.
It's a b****r when I actually want something to work for me...
... last time I dealt with MS libraries I noticed my code was doing nothing, so I went into the debugger, and there it was: "ExceptionNotImplemented" or something like that was triggered when calling documented methods ... had to go to the old COM libraries to get screenshots of Office documents
In Content Manager 3.0 (the ancestor of SharePoint, released after MS bought CM 2.0), MS had implemented 2 different syntaxes for searching documents, one basic and one advanced.
When testing the search feature, we noticed that there was no difference in the results when we tried to use advanced syntax (like searching for wordA near wordB at less than 10 words distance).
So we sent an inquiry to MS, and got the answer that the advanced syntax was not implemented, with a nice smiley.
The client we were working for was a government, that didn't like the smiley at all when we forwarded it the answer...
I was working on a budgeting system and my boss ran up a prototype in 1-2-3 for DOS to quickly show the management how it could look. But we would write it in C++...
Famous last words. Of course, the management saw the demo, said "great, make it so"... "In 1-2-3, because all our users have that."
The problem was, the tool would pull down data from an Oracle database, running on a VAX and the 1-2-3 proof of concept would pull in the various text files and work on them. Thankfully, 1-2-3 could use extended memory and all 1MB RAM on the laptops could be used, although it was dog slow, because 1-2-3.
We tried repeatedly to get the management to let us develop a real solution, but no, the p-o-c, or p-o-s as we called it, "worked" on a test dataset, so we had to get it working on the full dataset.
It started off fine, but it quickly grew and it ended up using about a couple of dozen different worksheets. Not something you really worry about these days; a few dozen worksheets are fine when you have 16GB RAM. But back then, it was starting to bang on the 1MB limit.
The other problem is, it was written using 1-2-3 macros. That wasn't a real language, not even a macro language like Excel used, let alone VisualBasic for Applications, which was still a couple of years away. No, we are talking "keystroke macros", cells on a particular worksheet that would be executed and it emulated the keystrokes the user would make manually (E.g. "/fstest.wk3" would save the current file as "test.wk3". You could reference other cells, which would hold text or formulas to calculate the file name, for example, taking the name of the product and using that as the filename). You could also have dynamic ranges in the macro, by using formulas in the cells that are being executed - self modifying code! :-O
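The flavour of those keystroke macros can be mimicked in a few lines (a toy of my own, nothing like the real 1-2-3 engine): the "program" is just cell contents replayed as keystrokes, and because those cells can themselves be formulas, the program can rewrite itself as it runs:

```python
def run_macro(cells, state):
    """Replay a list of 'cells' as keystroke commands, 1-2-3 style (toy only)."""
    for keystrokes in cells:
        if keystrokes.startswith("/fs"):        # "/fs<name>" = File, Save as <name>
            state["saved_as"] = keystrokes[3:]
    return state

# A cell whose contents were computed by a "formula" at runtime -
# the macro that eventually executes depends on the data in the sheet.
product = "rhubarb"
macro = ["/fs" + product + ".wk3"]

print(run_macro(macro, {}))  # {'saved_as': 'rhubarb.wk3'}
```

Debugging that is exactly as miserable as the story suggests: the code you single-step may not be the code that ran at full speed, because the cells may have changed in between.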
It all ran smoothly and then, suddenly, it didn't, it crashed in a big heap. So we single-stepped the macros, and it worked flawlessly. Run the macro, crash, go back and single step, worked, run crash, step works... Hmm, strange and, if you can't see where it is crashing, because it works when you debug it, you can't really move forward.
So we contacted Lotus. They were perplexed and asked for a copy of the code, which was duly sent, along with a set of test files.
The solution came back about a week later, "use C++, Lotus 1-2-3 was never made to do anything this complex!"
Needless to say, that didn't go down well with the management.
About 25 ish years ago my boss at the time did the same thing, but as the company only employed 20 people it was left for him to sort out.
It did work and made ordering faster, but could be a problem when it went wrong.
It then got extended to do stock control as well - I can’t remember if there was a proper database involved.
In all the various stories about "manglement" there is a certain very common thread. The difference between managers who listen to frontline staff and make decisions based on information coming up, and those who make decisions based on wishes and demand frontline staff implement them.
There's a variety of manglement (usually most of them) who can't grasp the difference between a demo that only just works and a product that just works. Always build in a few crashes or at least gaps in the demo so you can say "That's because we can't implement it properly in $DemoVehicle. It'll be OK when we write it properly in $TargetVehicle".
If I'm not mistaken, some prototyping software creates GUIs that look hand-drawn -- a script font, lines and widgets that look as though they were drawn freehand, etc. The theory is that, even if the buttons are clickable and the fields fill-in-able, if it looks like a sketch, people won't mistake it for the real thing.
Or you could have used vp-planner. A Lotus clone with read write access to dbf files using lotus style macros.
Perfect for a small business multi-user forecasting application
Never keep data in spreadsheets! Use them for data entry and reporting only.
That was in the mid 80s before Lotus sued paperback software out of existence.
I was still using it till the mid 90s when TM1 appeared… but that’s another story
I was on site at a customer's once, and found a compiler bug. A reasonably simple test expression was failing, and reviewing the output assembler showed that the compiler was clearly generating incorrect code.
A call to the support hotline produced the response "the guy who supports that product is on holiday, could you call back in 2 weeks?". A somewhat lukewarm line, I think.
Refactoring the code worked around it, I don't think we ever got a fix.
I had a problem that cost me a few days.
The AV or IT spyware would delete my work when I rebooted the system.
The files could be worked on and copied to and from OneDrive fine, and then after a reboot they were gone from both.
I never have found out what was wrong with the contents of those source code files, and IT just said "no it doesn't".
I have worked for some companies that specialised in Sharepoint tooling and I think I can honestly say that it got slightly less bad over time. I remember being on a call with some very senior Microsoft Sharepoint people where one of them said "the Sharepoint Platform is more like an operating system" and that felt extremely bad to me, like this mediocre file-sharing tool packed with barely-working features is like an operating system? Now I don't want to use any of your operating systems either.
Last time I worked with a Sharepoint-based platform, we found ourselves step-by-step moving the product off Sharepoint and every time we did, it got a little better. By the time it was a standalone product that could access Sharepoint for some things it was endlessly faster, tidier, and more reliable.
I too had to create a massive VBA macro to convert data in Excel format so it could be imported (also in Excel format) into another tool. The macro was working great until I added a couple lines of additional code and then suddenly it didn't. It turns out there is an overall character limit for a Module. I had to break up the code into functions so that they could be placed in different Modules just to get around the character limit.
A few years back I used to work with diverse content management systems, but I had never seen anything like BorkingPoint.
We had a project for a "Scan-to-workflow" solution and it was decided to use SharePoint. The architecture is horrible, any custom code makes it a nightmare to upgrade, and the infamous Microsoft Workflow Foundation underpinning the automatic processes is a drop in the ocean. The search engine was a CPU-eating thing, horrible and insecure.
In the end we ended up using a third-party BPM tool for SharePoint based on J2EE just to make it at least functional.
Ahh! Another bonus feature: if you wanted to publish the SharePoint site to the internet, Microsoft would prescribe the now-defunct Forefront... another nightmare of a tool.
I worked for a company that used a terrible issue-tracking system. I've probably blocked the memories of what made it terrible, but finally, it seemed likely to go away.
It fell over completely one day, and a wag noted that the ticket ID numbers had been approaching a bit under 33000. Yep, turns out they'd used a signed, 16-bit integer for the ticket ID. I thought for sure that we were finally rid of it, but sadly, they fixed that issue.
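The failure mode is easy to demonstrate. Here's a sketch emulating a signed 16-bit ID column (the schema itself is my guess at what they did):

```python
import ctypes

def next_ticket_id(current: int) -> int:
    """Increment a ticket ID the way a signed 16-bit column would store it."""
    return ctypes.c_int16(current + 1).value

print(next_ticket_id(32766))  # 32767 - the last valid ID
print(next_ticket_id(32767))  # -32768 - wraparound, and the system falls over
```

A signed 16-bit integer tops out at 32,767, which matches the "a bit under 33000" the wag spotted; one more insert and the value wraps negative.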
I wrote a COBOL program on an IBM mainframe that needed to extend a record in a VSAM KSDS spanned dataset. That's a valid operation (just read the VSAM manual), but it didn't work. I couldn't figure out why not. The systems programmers in our shop couldn't figure it out. The local IBM support folks couldn't figure it out.
So it got sent to the VSAM development group in (IIRC) the Netherlands. About six months later, we got a reply: while what I was doing was valid in VSAM, the COBOL compiler didn't implement it. So they were going to change the COBOL manual to say You Can't Do That.
(IIRC, my solution was to read the record, delete it from the dataset, build a new record with the same key, and add that to the dataset.)
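That delete-and-reinsert workaround applies to any keyed store that can't grow a record in place. A toy Python sketch of the pattern, with a plain dict standing in for the keyed dataset and a hypothetical `extend_record` helper:

```python
def extend_record(dataset, key, extra):
    # Can't extend the record in place, so: read it, delete it,
    # then add a new, longer record back under the same key.
    old = dataset[key]            # read the existing record
    del dataset[key]              # delete it from the dataset
    dataset[key] = old + extra    # add the rebuilt, longer record
    return dataset[key]

ds = {"0001": b"HEADER"}
extend_record(ds, "0001", b"|MORE-DATA")
print(ds["0001"])  # b'HEADER|MORE-DATA'
```

It costs an extra delete and insert per update, but it sidesteps the compiler's unimplemented rewrite-with-length-change entirely.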
The only place I would use SharePoint is transparently behind the MS Teams application. I like Teams, though the use of docs and wikis should generally stick to just the main channel for a team. It's been very useful.
I worked with the first few versions of SharePoint and hate it to this day.
I got "banned" from MS tech support once, back in the mid-90s. I was working on a project in VB 3 and was calling an API function, as advised in one of the VB manuals I had. Problem was, the manual had the wrong value for one of the constants that the function needed. This was before the Internet got big, so if the manual was wrong, I was SOL. I spent most of a day trying to figure out what value it wanted, but no dice. Luckily, this was back when MS still offered free, live (and American, but I digress) tech support for VB. So I rang them up, sat on hold for the required amount of time, then got a tech on the line. After a few minutes of talking about the issue, I was told "The manual is correct."
"But, eh, the code is still blowing up."
"No, the constant in the manual is correct."
"But it doesn't work."
"Sorry, sir, I don't have anything more for you."
At that point, younger me got a bit hot and let loose a couple of choice words with the strongest Southern accent I could muster.
"Goodbye, sir, and please do not call Microsoft Tech Support again."
It was 4 years before I had the nerve to call them back and use my real name. By then, they'd started charging for support.
And the value that I was looking for? One of the other guys in the company had a new copy of Visual C++ with the corresponding boat-anchor box full of manuals. After a bit of looking, I found the correct value in one of them, and it was not the same as what was in the VB manual. But it finally worked.
I was involved with some Sharepoint stuff very early in its evolution - about 2001 I think. And what not many people know is that Microsoft at that time had two completely different products, both badged as Sharepoint. From memory (it's over 20 years ago so don't @ me for not getting the details exactly right), one was built around a database, and the other was built around NTFS with extra metadata. They had completely different models and APIs, but they were both "Sharepoint".