I’m not old, you’re old!
What do Cristiano Ronaldo, Bruno Mars and Windows have in common? They're all 35 years old. It is three and a half decades since Microsoft Windows 1.0 was unleashed upon an unsuspecting world. Tottering atop MS-DOS, Windows 1.0 was released on 20 November 1985. A graphical multitasking shell, it would usher in an era of …
My impression was that Mars is generally considered a moderately competent musician, but I think I've only ever heard one of his pieces, and that in passing. I'm reasonably confident I wouldn't be interested in hearing more of it.
Ronaldo apparently is some sort of sportsball participant.
I did use - and, briefly, write software for - Windows 1.0, though that project quickly moved to requiring Windows 2.0, with additional features if you were running Windows/286 or /386. Would I rather spend an hour using Windows 1.0, listening to Bruno Mars music, or watching Cristiano Ronaldo run around and kick things? Good question. Can I watch one of these films instead?
Xerox refused to invest any serious resources in commercializing what PARC had invented. A $100K-plus workstation with no serious marketing effort behind it is not exactly a serious effort. So in 1979 Jobs was invited to a demo of what the guys at PARC had built and offered a chance to turn it into a mass-market technology. People at PARC like Larry Tesler jumped at the opportunity and went to work at Apple to turn this into reality.
That's where the Lisa desktop and applications actually came from. Which begat the Mac the following year.
Whereas Windows 1 was a straight rip-off of Visi On, by VisiCorp. Win 1 was demoed in 1983 as ready to ship "real soon now" but only shipped anything stable in 1985. A year after the Mac made it completely obsolete. It took MS another 5 years to even start ripping off the Mac in a half-assed way (Win 3.0), and over a decade before they got anything usable (Win95). The best description of Win 3 I ever saw was: MS read Inside Mac very carefully from cover to cover, copying what they saw, but completely and totally missed the point. Which is why Win16/Win32 was always completely incoherent. Both at the UI and API level. MS just never got it.
Xerox did sell quite a few desktop publishing systems. Some featured dual processors way before the rest of the crowd. I worked on implementing MP/M for their 820-II 16/8 desktop machine. The OS was extended to swap jobs between the 8 and 16 bit processors. A lot of PARC technology also ended up in mainframes. Yes, Xerox did manufacture the Sigma range from the late 60's onwards.
Windows ripped off MacOS.
The Mac team paid for or were freely given the handful of core concepts first seriously implemented by Xerox PARC.
The MacOS team then did maHOOsive work on translating them into something which actually fit how humans thought and saw and moved. Have a look just at the huge range of experimentation on window structures, on menu structures, on feedback mechanisms. Just the surviving pencil sketches I've seen are a tribute to the intelligence and human-centric focus of the MacOS team.
Next time you choose a menu option, pay attention to the feedback you're given. Be aware the MacOS team went through dozens of iterations of testing before settling on that one as the best.
Look up Fitts's Law.
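For reference, Fitts's law in its common Shannon formulation predicts pointing time from target distance and width. A minimal illustrative sketch, where the `a` and `b` constants are made-up placeholders rather than measured values:

```python
import math

def fitts_time(distance, width, a=0.1, b=0.15):
    """Shannon formulation of Fitts's law: MT = a + b * log2(D/W + 1).
    a and b are device/user constants; these defaults are illustrative only."""
    return a + b * math.log2(distance / width + 1)

# A big target (e.g. a menu bar pinned to the screen edge, which you can
# overshoot against) is much faster to hit than a tiny in-window button.
big = fitts_time(distance=500, width=200)
small = fitts_time(distance=500, width=10)
```

This is part of why the Mac's screen-edge menu bar is so fast to hit: you can slam the pointer against the edge, making the target effectively very deep.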
Next time you choose a menu option, pay attention to the feedback you're given.
Ha ha ha - A millennial programmer was doing some UI work - some menu options would pop a dialog and some would act directly. They had omitted the ellipses in the former, and were blissfully unaware of their significance. I resorted to firing up Excel or Word and going through the menus to demonstrate. The penny took a while to drop.
This was several months after they had declined my offer to lend them one of my books on UI design. Even so, being unaware of, and incurious about, the significance of ellipses in everyday applications is what is so depressing, especially from an application programmer.
It's all turning into hieroglyphics now. TeamViewer has a side bar which might as well be full of suns, gods, and cats for all the meaning it conveys. Now try telling someone who opens TeamViewer which one they need to click on to get to the screen which has their ID and password they can give out.
Seriously, what has happened to UI design? The rules were simple enough to understand and somehow we've got to this mess today. If the OSes themselves don't follow the rules third-party software won't either.
the first thing I do with a new gmail account is turn the icons into text
the industry seems absolutely hellbent on "removing details". it probably started off with Apple doing it right, then the industry copying it badly/unnecessarily
one Vaio notebook I tried once, the trackpad had the buttons integrated. but there were no indent lines to separate/highlight the buttons, it was just a single surface, which had some clickable area at the lower left/right
no thanks. if something is a button, I want it to have some plastic molding/lines which shows it as such
I also *hate* the trend of putting buttons anywhere other than on the front panel (eg, underneath the border facing down, or on the rear of a flatscreen display)
Windows has always had a delay between moving the pointer out of the highlighted menu item and the sub menu closing, so if you're fast enough you'll make it to the sub menu even if you clip the corner. Newer UI elements like the breadcrumb drop down menus in Windows Explorer have no delay at all because morons.
It looks like macOSX has implemented a similar delay.
And they're both wrong.
Originally, macOS defined a V shaped region (rotated 90 degrees, obvs.) with its point located slightly on the other side of where the mouse cursor was when the sub menu was displayed. As long as the mouse pointer remained in that region (until reaching the sub menu) the sub menu would remain open.
That's better, but not quite perfect if you ask me. I think the actual orientation of the V region should be dependent on the position of the sub menu. Most sub menus open with their tops flush to the parent item they belong to and extend downwards. Therefore the V should (usually) be orientated to allow more downward movement than upward movement.
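The "safe region" described above boils down to a point-in-triangle test run on each mouse move while a submenu is open. A minimal sketch of the idea; the coordinate names and sign convention here are my own, not any OS's actual code:

```python
def cross(o, a, b):
    # z-component of the cross product (a - o) x (b - o)
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def in_safe_triangle(p, apex, sub_top, sub_bottom):
    """True if pointer p is inside the triangle spanned by the apex
    (just behind where the pointer was when the submenu opened) and the
    near corners of the submenu. While True, keep the submenu open."""
    d1 = cross(apex, sub_top, p)
    d2 = cross(sub_top, sub_bottom, p)
    d3 = cross(sub_bottom, apex, p)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)  # all same sign (or on an edge)

# Pointer heading diagonally toward a submenu at x=100 stays "safe":
# in_safe_triangle((50, 60), (0, 50), (100, 0), (100, 100)) -> True
```

Biasing the region toward downward movement, as suggested above, is then just a matter of placing `sub_bottom` further below the parent item than `sub_top` is above it.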
It's little things like that that show how shit macOSX is compared to earlier versions. I used to love using a Mac at work back in the mid 90s. These days I hate it. (XCode certainly doesn't help there though - it is a gargantuan sack of shit of monumental proportions.)
Having actually used a Xerox box for real work, I can say that the UI was vastly superior to the early MacOS and to windows until 3.1.
Not only that, the word processor and spreadsheet were 10 years ahead of anything else available. Not that anyone used the spreadsheet app, as the tables in the word processor were just as capable.
But then again it took 4 cpus and lots of money to get there.
Non-overlapping windows, most of which cannot be moved or sized. No substantial concept of a desktop — boots into a text prompt and runs one application at a time. Every application does more or less whatever it wants with the UI. No pull-down menus. No drag and drop at all — including for file manipulation. No concept of file associations, workflow is always application-first.
And, of course, it runs like a snail because the CPU is built of discrete components and it runs an interpreted language.
If you were to ask me to pick a computer for general office productivity purposes, between 1984 and 1990 it'd be any Mac, from 1990 to 1995 it'd be Windows or Mac, whatever, from 1995 to 2001 I'd rather use Windows, and after that it'd be back to whatever my employer was willing to provide.
I'd have taken the Amiga a thousand times over the alternatives - and indeed did. A2000, Deluxe Paint and The Works! Platinum Edition, both brilliant programs. Even now. Compatible with most of the major file formats of the day; I do believe The Works could read Lotus 1-2-3. Wordworth a few years later gave the miggy a very potent word processor option too, though it did really need an accelerator and a bit of extra memory over the vanilla 68000.
In my opinion, the business world plumped for IBM and clones frankly because they were not especially informed buyers. Unproductive, obtuse DOS-derived systems or a nice, fast and responsive GUI? NASA knew what a good thing the 2000 was; even the space shuttle telemetry labs used them right into the 2000s, long after they had any right to still be in service.
For compatibility - and comedy value - try installing the Works today under UAE. Create a spreadsheet or word processor file and transfer it over to a modern PC. Modern Excel and Word read those outputs from the 80's perfectly.
By comparison, Windows 1 and 2 were nothing more than forgettable toys. 3.1 was a massive leap and, for me at least, the peak-Windows experience. As a developer, it was, of course, terrible to program for.
Ugh, I forgot about the Amiga; a disadvantage of allowing history to be rewritten by the victors — the main thoughts in my head were merely 'avoid Windows prior to 3.1', followed by 'avoid Mac OS 8 & 9, as even original-release Windows 95 has more chance of recovering from a crashed application'.
That said, I think the business world went for PCs just because of the wide array of vendors and the practicalities of procurement. Which is also one, but not the only, reason why you rarely see Mac-based workplaces today (outside of a couple of niches).
Former Xerox user as well, we had portrait black-on-white screens, WYSIWYG document editing and a building-wide network for sharing all this. Plus, there were multiple languages available, important for the government agency using the system. I was sad to see it go, but it was getting old and slow and then the internet came!
Now the Alto truly was revolutionary in the mid 1970s but a much more primitive machine. The only really commercial machine was the Star. I only ever saw Altos at trade shows. The only time I got any time on a Xerox Star would have been in 1985. A co-worker's roommate had one. I think the setup he had would have cost about $35K if bought at the time it shipped a few years before. It was acquired as the result of a failed project. The usual story. We had a $100K-plus Symbolics 3600 and a whole office of Lisas / Macs doing dev work. I also had an early alpha of Apple's port of Smalltalk-80 running on one of the Macs at the time. So that's what I was comparing the Star against.
The Star was interesting to use from an historical perspective but that was about it. In many ways the Sun 3 and Apollo Domain workstations were much more interesting machines at the time. For about the same price.
The Lisa by 1983 did pretty much all the software stuff the Star did and in a much more integrated and coherent manner. The seamless networking integration on the Lisa / Mac end had pretty much caught up by that stage. I did the AppleTalk stack integration in 1985 and for the dev machines it was just rolled into the office network. We had been doing network printing for years but had no need for laser printers (from HP), so just all daisy/dot/line printers. All networked. In fact the day I connected the dev Macs and Lisa to the office network in 1984 it was so normal by that stage it was no big deal.
The Alto was utterly revolutionary when it first shipped. The Star, which was the first real commercial version, was still a very big deal when it shipped in 1981. But two or three years later it was no big deal.
That's how fast the business evolved back then. In 1981 the Xerox Star was the future for $35K/$40K. Three years later I had most of that on my desk for under $4K. And I could even travel with my Mac as carry-on luggage. Try doing that with a Star.
Saying all that nothing will match the sheer excitement of the first time I read about the Alto and what it could do. I think it was in Creative Computing in 1977. An even greater sense of amazement than seeing the first advert for the IMSAI 8080 in 1975. A universe away from the ASR 33's and PDP 8 I was using at the time.
You know you are getting old when you can remember a time when we had a very exciting future with infinite possibilities...
I use the latest MacOS and Windows 10 regularly (work machine / home machine). And the claimed superior usability of Mac is long behind it. Home / End keys do the same thing in every text based Windows application. In Mac applications - inconsistent. In a terminal Home and End don't take you to the start and end of the line you are typing (that would be too useful), but to the top and bottom of your terminal session. But in some applications they do take you to the start and end of the line. It's nice to guess I suppose.
Don't want to use the mouse? Press Alt on a Windows machine, and underlines show you what letter to press for a menu item. Is there an equivalent on the Mac? Maybe, but it isn't very discoverable and I haven't worked out how to do it yet.
Snap a window to the left/right on Windows - Windows key + left/right-arrow. Want to do it on a Mac? Either install a 3rd party app like ShiftIt, or aim for the small green button at the top left of a window and go for it with your mouse.
Having problems with (for example) your built-in camera? Windows: open the Device Manager and check that all is working, update the drivers if necessary. MacOS: cross your fingers and go through the arcane process for an NVRAM reset.
"And they both ripped off the Xerox Alto environment."
Back then the concept of "ripping off" didn't really exist in the computing world. As one example, almost all computers shipped with complete source code, not just to the OS, but also to the applications. Consider that nobody attending The Mother of All Demos had to sign an NDA.
It wasn't until after Bill Gates' "Open Letter to Hobbyists" that the big-wigs in the computer world started jealously guarding this kind of thing.
The Alto GUI was basically a total steaming pile of garbage. It had many, many clever ideas in it, but the first Apple and Windows GUIs were so superior it's not even funny.
People who think that Mac/Windows ripped off the GUI should definitely go and watch those videos (search CuriousMarc). I was personally surprised at how little it had, given the "Jobs copied it" stuff you see everywhere...
You do realise you're comparing a computer from 1973 to one from 1984, right?
Also, for all the superiority of the Mac and Windows 95 GUIs (MS got there 10 years after everyone else, as usual), they somehow both managed to miss the document-centric nature of the Alto's GUI.
Does anybody talk about the latest CPU ripping off Shockley's transistor? Turing's machine? Von Neumann's implementation?
There are plenty of nearly valid reasons to hate MS/Apple/Sun/Linux/... that aren't based on a lack of originality. Set Theory's specialization in Boolean Algebra was the first glorious standing-on-the-shoulders-of-giants moment in computing. The trend has continued ever since; be glad, not spite-filled.
Only if by 'ripped off' you mean: Apple paid Xerox in share options (shortly pre-IPO, so no shares yet), Microsoft licensed relevant additional details from Apple.
In both cases the relevant companies proactively paid for what they were reproducing. Only Apple tried to turn around and sue afterwards, claiming that Microsoft's licence was limited solely to Windows 1.0.
The court found in Microsoft's favour in the latter case at least partly because they found that many of the things Apple claimed to own had only been licensed from Xerox, not purchased.
So I don't think there's even a moral case for 'ripped off' here, no matter how stretched. All copied ideas were paid for.
There are those that maintain that the Mac was the result of an internal power struggle inside Apple. Well, that and Jobs getting fired.
> The original Mac, only a single floppy for storage and not really enough memory was more a proof of concept.
The first 128K dev machines (dual floppy) were shipped early February 1984 the first 512K's and HD20's by September. Still usable as dev machines even when we got the first Mac II's at the end of '86. I was using dual monitor Mac Plus's in 1986 and a 4 monitor continuous desktop Mac II in 1987. With a zero config network to three other Macs. Some proof of concept.
> W95 looks to have owed rather a lot in functionality to MWM, HP VUE and their successor, CDE with a good deal of underpinnings from HP New Era.
Nope. Run Win 95 side by side with any of the above. Then run System 7. From 1991. That's how wrong you are. Going from Win3.1 to Win 95 was quite jarring. Going from System 7 was very straight forward. Apart from all the stuff Win95 was still missing. Win7 was the first release that pretty much got there. Which is why most serious dev work at the time was still done on Win NT 3.x with testing on Win95 if needed.
Hmmm, three downvotes?
Well, part of my day job from 1987 onwards (till maybe mid 90s) involved trying out almost every windowing system known to mankind at the time (and there were a lot) that ran on desktop PCs and their ilk, to see if any were viable port candidates for our very successful software product lines. The Unix ones were very late to the game. Although some had very creative ideas in various nooks and crannies. Once one found them.
I remember one very funny conversation with a VP in 1989 trying to explain to him why HP New Wave application software was going to have even lower sales than OS/2. Which were minuscule. Or why GEM and AmigaOS were such great successes in Europe but total flops in the US. The difference in disposable income. There was a lot more in the US so it was Macs and IBM PC's as home PC's.
I think the last such review I did was for Open Doc in 1995. Felt sorry for the guy Apple sent out to try to persuade the company why it was a viable port platform. Some really great ideas (still have the dev docs) but lacking any practical, coherent end-user delivery strategy, and with MS holding the keys to all the most important business categories and pushing OLE so hard, Open Doc was never going to get any momentum. And so it proved.
NeXT had the slickest demos at trade shows at the time and it looked beautiful but boy did they get narky if you dared deviate from the demo script. After hearing the story of the trials of the Next dev machine with one of the companies they were desperate to get to port their application to NextStep I was no longer surprised why the Next stand dollies got very surly. The term Potemkin village comes to mind.
So I have personally seen and used a lot of windowing systems. Too damn many of them. Although the sweet spot is still MacOS System 7.6.1. That was the perfect balance between complexity and simplicity. I still miss it. Because it seems that's as good as it gets.
MS Office (1.0) (Win3.1 iirc)
That was when MacOS's advantage over MSWindows collapsed to near negligible for 99% of business purchasers.
It essentially imitated all the casually visible interface of MacOS for 99% of normal business applications. OK, so it was application(suite)-specific. Who cares if that's 100% of the applications you use every day? I can copy-paste words n pictures n graphs n do everything -- what's Mac got that this hasn't?
..."well you can't do that for/between any _other_ apps..." "Why would I need to do that, these are everything I ever need."
MacOS's marketshare was growing strongly despite the massive You can't get fired if you buy IBM syndrome. But from that point on, it turned over and collapsed.
I remember seeing MS Office on Windows for the first time Jan 1993. My blood went cold. "Mac's lost."
I was right.
MacOS's marketshare was growing strongly despite the massive You can't get fired if you buy IBM syndrome
MacOS was growing in several markets, but Windows was growing faster. The entire market was growing; home computers were becoming less of a curiosity and more acceptable. They still weren't necessities, by any means, but they were no longer oddities.
A PC in 1987 was still the price of a used car, and Macs were considerably more expensive. And that's what did MacOS in.
A person with a $3,000 PC might invest $99 to buy Windows to try it out, but he wasn't going to spend $5,000 to try out a Mac.
Even then, that $99 was a hurdle, and Microsoft recognized it. That's why they included a copy of Windows with pretty much everything they sold. You bought a Microsoft Mouse? Here's a runtime copy of Windows with some graphics programs. You bought a Microsoft game? Here's the DOS and the Windows versions, together. Oh, you don't have Windows? That's okay, it comes with a Windows runtime.
Sure, those runtime Windows ran like a dog, and needed more memory than the user had, but he could afford to buy more memory far more easily than he could afford a Mac.
There was a push at the time to get MacOS on PCs, one that Jobs fought against. He railed against "crap PC hardware", and he was right. PC hardware quality was all over the map, as was pricing. Macs were stable, because they were single-source. But that came at a cost that most people weren't willing to pay.
And by the same token, if you spend the same amount of money a Mac cost and put it into a high end PC, then the gap between MacOS and Windows narrowed considerably.
Mac still won a lot of markets, notably education and graphics, but the expected educational followup never happened. There was a lot of talk that parents of kids with Macs at school would buy a Mac for home, and some did. But most walked into the computer section of the department store, saw the Mac and the PC next to each other, got sticker shock at the cost of the Mac, and went home with a PC.
MacOS made some inroads with the "Hackintoshes", as they were known, but one of Jobs' first actions when returning to Apple was to kill them, so that was that.
You're jumbling a lot of things, markets, concepts, and timeframes here.
Also, look up "marketshare" -- I used the word precisely; you seem to have read it as market size. And note all your points are re the retail personal micro-segment -- essentially ignored by MS in favour of locking up the 800lb gorilla of the business segment.
In the timeframe I referenced:
Apple got a few big but lucky wins in education, but primarily held post-MSOffice just the DTP/graphics micro-segment and a diminishing portion of retail personal (max.maybe 1-2% of the market, Apple's share of even that diminishing as the 90% average people just bought what they were using at work: MSWin+MSOffice).
MacOS had a stranglehold on DTP/graphics (a then extremely high premium segment) entirely due to Postscript being only available on MacOS. Because Warnock loved MacOS, disliked MSWindows, and despised what Gates was doing to the industry.
Then Jobs eliminated that by announcing at a major joint presentation that he & MS had signed a deal to develop a replacement for Postscript (only MS fulfilled their end of the bargain, and we still use it today; you're probably looking at it right now). John Warnock was sitting right behind Jobs at the time...
He was white and shaking when it was his turn to get up and speak.
When he got back to the office, he was spitting feathers re Jobs's backstabbing and kicked off the port of PS to Windows immediately.
And that was the beginning of the end for MacOS's last safe market segment.
MacOS peaked but died with 8.6.
MacOSX makes me cry. Better with large font-sets but less functional than MacOS 4.
Avi Tevanian (Jobs II's head of devel for Mac) went on public record as stating his goal was to destroy Macintosh, and he succeeded.
Jobs was long gone from Apple the day of that TrueType press conference. I remember it well. Although Warnock was a really good guy, Adobe was kind of difficult to deal with at the time on the subject of the encrypted Type 1 fonts etc. Most other companies were not willing to pay the very high licensing fees Adobe were demanding, so we went elsewhere for outline fonts. So not much sympathy that day when Adobe got poleaxed. After that Adobe were much more reasonable to deal with.
Regarding market share numbers pre Win 95 they really did not change that much. Worldwide. MacOS was around 10% hardware revenue, 20% software revenue (at least in US), with Win 2x/3x just cannibalizing MS/DOS share. DOS boxes only sold about 1.1 applications per unit whereas MacOS was in the 3 to 4 range most years. Mac people bought more software for each machine. DOS boxes usually only used to run one application.
After Win95 the MacOS market share held in there for a few years but unit sales collapsed to 2% soon after the 1997 debacle. But MacOS software revenue share also collapsed and with it the very rich eco-system of small software companies. All gone by the early 2000's. Despite all the Mac laptops one might see actual MacOS market share in the last decade has never reached the level it had in the early / mid 1990's and the software revenue numbers are very anemic. And some markets are gone for good. Japan which was once 1/3 of revenue. DTP long gone. Education long gone. Video production almost gone. In all cases these markets were lost because of very deliberate and often malicious decisions by Jobs and the Nextie minions.
Not that it really matters that much as Apple have been a consumer electronics company for almost the last twenty years. Apple Computer, Inc is long dead. Very long dead.
Sorry, I wasn't clear, or rather I blurred Mac & Apple. My marketshare refs were strictly re Apple, or to be read in the sense of what success did MacOS make directly for its creators. Hence: # of boxes shifted, vs competitors' #s. I did not include the surrounding 3rd party software ecosystem nor was I considering $relativities.
My recollection of the marketshare graphs I was watching grimly at the time is of a ~'90 peak over 10%, winding down hard after the MS Office release to about 4-5% by the time I saw it, to sub-2%. IIRC 1.6 or 1.8%.
(By the bye, re 3rd Party Software, here's some tech insight re coding apps on early MacOS:
Quite like the custom Mac tool for receiving a binary over the serial port from the external compiler.
Very few people used the serial download approach in my experience. At least from Lisa Workshop. It was even flakier than the Lisa Workshop compilers and linkers. A lot easier to put the final executable on a floppy and stick that in the test Mac. Floppy net. The first Mac product I worked on in '84 bypassed the Lisa completely and just built on a VAX and downloaded to a bootstrapper running on the Mac. There was a great two-Mac debugger called MacDB that proved useful, some unique features, but in the end Moto's own Macsbug won that battle.
I know some products were developed in MDS in '84/'85 and there were a few C compilers kicking around, but once the betas of MPW became stable (Spring '86?) everyone immediately shifted to MPW running on the Mac and threw out the Lisas. Literally. I think MPW 0.8 was stable enough to build successfully. I know by the time the official release came out we had been using it for day to day development for quite a long time.
Then the following year Lightspeed C (Think C) became stable enough for heavy-duty development and everyone moved to that pretty quickly. MPW was only used for very specialized tasks. Saying that, the MPW 1.0 C compiler produced the best optimized 68K I've ever seen. The 2.x/3.x compilers were just average. And MPW Shell is still the best shell I've ever used. With the MPW Commando tool still unique and unsurpassed in making CLI shells usable even after 35 years.
Then everyone moved to CodeWarrior in 1993 when Symantec destroyed Think C, and then post 1997 Metrowerks gave up on CodeWarrior Mac and that was it. The first version of XCode that you could actually build and compile successfully from the GUI shell was 3.0 in 2007. XCode 1.x and 2.x were CLI only if you wanted to get work done. Up till then pretty much the only usable Carbon dev system was obsolete versions of the by then abandoned CodeWarrior Mac. Which still worked great. As does the Win32 version of CodeWarrior to this day. Still running a version from 1999 for looking at old project code. Last time I checked it still builds and debugs fine on x86 but the debugger does not work on x64. Still, pretty good for a 20-plus-year-old IDE.
I still miss the guys at Metrowerks. The absolute best dev tools company in my four decades as a professional developer. By a large margin. Really great guys.
Why do all the most competent, rational, sane, and EXCELLENT comments on ElReg get either no votes or massively downvoted?
Mate, your post is somewhere between, and yet both of, brilliant and agonising.
Brilliant for the accuracy, historicity, and validity.
Agonising for the reminder of what we've lost and how far we've gone backwards.
(BeOS should have been the next MacOS, written clean as a MacOS copy for the newly economically-achievable mass market via new hardware -- so insanely fast! -- 25 yrs later and still, on top kit, current OSs can't match it. Plus the "relational" OS. But Gassée trashed the apps market/developers. And refused to market. Because people should just reward the brilliance. (He was also responsible for trashing all the early Macs' expansion/integration capacities, and, worse, inculcating and profoundly establishing the Apple culture of "Fuck You" on the hardware side. (Eg, FireWire makes me cry. Eg, allowing the addition of a SCSI port was his big massive major concession for too many years.)))
MacOS remained, and still remains, far and away the most real-world productive of any OS platform to date. People used to casually hack their kernel, and not only not realise that that was what they were doing, but not even need to know what that even frikkin meant. Autocorrect? (At the KEYBOARD level, regardless of specific custom code for a particular app or desk accessory?) Cf. Thunder, the kernel-hack progenitor.
MacOSX peaked at Panther 10.3 --approx. 70% of MacOS-- and has gone backwards from there for users. Sometimes startlingly backwards, sometimes just subtly but miserably backwards. But always backwards.
(I was for many years the top tech guy for the London Mac Users Group and the Oxford Mac Users Group (for a few years, the oxmacug home page had a photo with us hard core and Woz). But I bailed, broken finally, at IIRC 10.5, when Apple deliberately created an internal firewall to block NASs they disapproved of. Massive carnage for many many users. I managed to find a way around it, but it required people plugging in an approved NAS plus then carefully stepping through my list of commandline instructions. Well, I'm an oldskool Unix head but that's PRECISELY the sort of shit that Mac was intended to NOT be. I'd already rewritten TimeMachine to allow customisation for various MUGers (turns out I'd accidentally written it in '97 for a different purpose), sorted out & written all sorts of unnecessary shit to get people back to what they used to not just have but took for granted. "Enough. I can't take it any more. This is the antithesis of why I like Mac."
Metrowerks. CodeWarrior. Awesome awesome brilliant tools. ...People.
And what people did with it....
Metrowerks didn't give up on CodeWarrior: they got driven out of the market by Jobs. And Avi Tevanian. Deliberately: mens rea, "strategy".
No, I think it's genuinely the first. NextStep seemingly got there independently, but later.
Also props to RISC OS for being the first mainstream option to push vector fonts. Unless you started screwing up your system with proprietary Adobe nonsense, both Windows and the Mac were bitmap-only until the early '90s.
They're all flashy, massively self regarding media products that make loads of money, despite also being widely disliked.
It's 25 years this Autumn since I waved goodbye to the old monochrome dumb character terminal on my desk at work and excitedly put together and booted up a Windows 3 PC for the first time. On saving my first Word document, the machine crashed completely then spontaneously rebooted.
And it apparently took 10 years to reach this level of reliability.
...and burger master.
There was some insanely clever shit in the early versions of Windows.
Just research how Microsoft's coders got a page-based memory manager reliably working in Intel's 'real mode' with next to no help from the processor; and then read how they uplifted this to protected mode without breaking anything. Google (or Bing, whatever) thunks and stack walking. Look at how small the Windows kernel was; read about how BitBlt built custom blitters on the stack and then executed them. Next, read about how all those mysterious VxDs were demand-loaded components for a brand-new 32-bit multitasking OS that wasn't Windows, that we didn't even know we were running at the time, because it virtualised DOS and Windows so well.
Then wonder how a company with coders like this came up with IE, Teams and ShitePoint.
But hear hear. Gates was toxic for the industry but the actual coders did wonders in a crippled environment and with no real overarching architect or even design.
Check Raymond Chen's blog to read the tech insights of a startlingly competent and rational coder, not just at the deep level like your points but also right up to consumer level. Just his explanations of how much ongoing and deep work they had/have to do to facilitate major apps is awesome.
"Then wonder how a company with coders like this came up with IE, Teams and ShitePoint."
"Enthused by then-CEO Steve Ballmer, who had originally dismissed the iPhone..."
Everybody leaves a legacy, although direct causality isn't always that clear to us mere mortals. We all see it when a start-up with passionate inventors realises it has to work with M&S people (and to their standards) if it actually wants to be "successful"...
Does anybody know what the old fumbling CancerMaster is doing nowadays?
Developers! Developers! Developers! (Repeat 5-20 times; insert faux bouncing and near-caricature levels of streaming sweat.)
I believe he retired with a few private islands up his sleeve. The benefits of crawling up the arse of a person born super-privileged, yeah? (Gates inherited the modern equivalent of a quarter of a billion dollars, in case you've ever wondered about the balls of dropping out of Harvard with no job.) Clark(?) and Ballmer knew a good thing when they saw it and stuck to it.
Hell, Clark dragged Gates back out of Harvard after he panicked and ran back.
I spent 4 years within 3 reports of the board of a several-trillion-$ software company. Accidentally, amusingly -- jackrabbited from tech saviour to CEO for temporary political reasons, which stuck, not least for turning around a multinational company (subsumed and destroyed by the takeover by the new, larger US company). With a fair whack of head-to-head interaction with competitors and equal-sized or larger clients.
I can assure you, at length, that they are FAR less intelligent than the catastrophic stupidity you have observed from the outside.
Then wonder how a company with coders like this came up with IE, Teams and ShitePoint.
I've wondered how a company that came up with the 3D Skeuomorphic look for Windows 3.x _AND_ OS/2 PM 1.2 (or was it 1.3?) could *SUDDENLY* *ABANDON* *IT* *ALL* and *GO* *BACK* to a 2D FLATTY FLATSO look that Windows 1.x and 2.x had... _ESPECIALLY_ since the reason Windows 3.x _WAS_ _SO_ _SUCCESSFUL_ is _EASILY_ explained by that very same 3D Skeuomorphic look that they *ABANDONED* in Windows 8 through 10 !!!
So yeah, same *kind* of frustration, more or less.
I still use mc (Midnight Commander, a GPLv3 clone of the Norton offering) occasionally on my *nix systems. It's a useful tool, and a lot more powerful than it looks at first glance. Recommended.
N.B. Be VERY careful if you choose to run it as root ... it will do exactly what you tell it to do. Don't say I didn't warn you.
In the late '90s I used to host install parties at Foothill JC in Los Altos (sometimes at De Anza or Cañada). The first time around (late '95 or early '96), I pointed out the existence of mc, (release 3.0 or 3.1). Several of the newbies latched onto it as being somewhat familiar. I guess I hadn't impressed on them the necessity of creating a user account, especially when they were just starting out with *nix ... Much hilarity ensued. I corrected my approach, and the issue almost went away (there is always one ...).
Oddly enough, the single biggest mistake was "tidying" the file system.
 BSD, Slackware and/or Minix, Coherent if you had your own set of disks.
Raise a glass with you. The last Windows I used was 0.8, as a student picking up stray contracts in either '83 or '84. I still had the 8 beta disks & hugeass folder until a friend with a fetish for useless technology adopted it a few years back. Weird-ass piece of shit (the software, not my friend).
At the time I ran a pirated copy of QNX 1.0 on my XT. I have still never installed or owned a Windows machine since then, though I was parachuted onto them occasionally.
@ Blackjack: The good news is that the days of being stuck with Windows if you want to play games are over. Steam's SteamPlay feature automatically takes care of almost all Steam games and runs them on Linux as easily as on Windows with a single click of the install/play button. For titles other than those on Steam, try Lutris, which can automatically take care of pretty much everything else you might want to play, from install discs to emulators and other launchers such as EA Origin, Ubisoft's UPlay, Battle.net, etc. Individual compatibility can be checked on protondb.com. Gaming on Linux has never been so good.
"What's the wafer-thin mint that finally causes Windows to explode?"
20H2 update in my case. Well, it nearly caused me to explode last weekend. I've got a crappy 2GB/32GB laptop with W10. "Update to 20H2?" OK. "I need more disk space, please insert a USB drive." OK. Hours later, install fails.
"Update to 20H2?" Retry. Hours later, install fails.
Dig about. Somewhere there is a log of updates. Mystery eight-digit error code. Look online.
"This error code appears if you try to update using a FAT32-formatted USB drive. Use an NTFS-formatted drive instead."
Seriously? No check when I insert the USB drive, to say: "This one isn't suitable, please use another or reformat this one"? Just "USB drive found. Should I use this one?"
Oh, Windows installer technology is a horrible pile of rubbish, and its error handling is particularly bad. I don't think I've ever seen MSI actually translate an error code; it reports all of them as "unknown", even though they're almost always standard Windows errors, for which it could get an explanatory string from the standard Windows FormatMessage API. Just completely incompetent.
On the other hand, it's astoundingly slow.
Growing up, I was vaguely aware of this 'Windows 3.x' thing, yet we used MS DOS (5.x) at home, though I have vague memories of playing games on a C64 as well. I never knew there was this 'Windows 1.0' thing as well, even if it made sense in hindsight considering the 'Windows 3.x' thing. MS DOS was what I used the most, and was what ran on the IBM PS/2 386 SX system (with Model-M keyboard) that my father got me as my first PC when his work was selling off old systems in the 90s.
At the primary school everything was MSX machines, and the high school was using NT 3.51 or 4 (on ancient Pentium 1 machines). I have used Windows 95 (SR2) some, but I mostly got started with Windows 98 (SE, of course) on account of being apparently a whippersnapper when it comes to Windows.
Doesn't keep me from being completely grumpy about 'modern Windows' (i.e. anything past Windows 7) and pining for the wonderful days of running Windows 2000 on a Celeron 400 with 64 MB RAM with zero issues. An experience which made me decide against continuing my self-flagellation with SuSE 6.3 in '98 (First Year of the Linux Desktop, IIRC) when I figured Win98SE had pushed me too far. Those were the days.
Not sure if I'm pining more for MS DOS or Windows 2000, though. Maybe a bit of both :)
... was at some point borking a W2000 install for the n-th time and asking myself: why can't I trace all this and know exactly what is wrong? Why is "reinstall from scratch" the only means of fixing things? A CD with Red Hat was the starting point. Windows gave me a true perspective on the value of open source: the ability to peek at absolutely everything and to trace and change it. In addition to the other freedoms, of course, but that was the key point for me: understanding what went wrong and fixing it.
I now use both Windows and Linux professionally (SW development), and wherever I have the option, I choose Linux.
I spent far too many years than I care to remember developing for Windows (Server side stuff). I called everyone in MS from Gates down all sorts of names.
Then in 2016, I threw my toys out of the pram and said 'No More'.
Windows has been expunged from my life. What a difference.
Now I develop on MacOS and Linux with Linux as the target OS (R-Pi systems)
Because MacOS is Unix under the hood (as near as makes no difference to me) it works very well.
My main Linux box has not been rebuilt since 2012. No need. That wasn't the case on Windows unless I ran the Server OS on my desktop.
Personally, my first foray into the Penguin Universe was at some point in 1991...
When I downloaded, onto an uncountable number of floppies, the SLS distribution from ftp.funet.fi.
The kernel was at version 0.8-something-or-other... and despite all my coaxing (through edits of the configuration files in vi) I never managed to get X Window working.
But it was fun, and it got me hooked enough on *nix-likes that I spent years afterwards working with Solaris and HP-UX (with a side of Linux).
Around 1994, I got X-windows working well enough on a dual-boot Win3.1/Linux system to use it as an X-term off of a Sun machine.
The CAD software I was using (Viewlogic) came in Windows and Unix flavours. The Windows version required QEMM, and if you had too much on a page or too many pages open in the schematic, Windows would roll over and die. This did not happen, of course, on the Sun system. So, to me, the path to productivity was clear -- use the Sun system. Booting into Win3.1 was required to do "office work": Word, Excel, email, etc. Boot into Linux when doing schematics.
The boss didn't like it but he left me alone, once I showed him that there were two separate HDDs, one supplied by him, and one supplied by me. Mine had Linux, his was the one he had issued me with the computer and was unchanged. And my productivity went way up.
This was the point where I began to take Linux seriously. Same hardware, but much better performance. I still use Windows for work (need to edit and create Office documents and get my email on the corporate Outlook), but it's Linux, WINE and an XP VM at home. The VM runs oddball negative-scanner software and a legal copy of Lightroom; Linux does everything else.
The world has "upgraded" and Windows 10 requires 4GB of RAM and 32GB of disk space. That's an "upgrade"?
Windows 10 requires 2GB of RAM.
Linux hardware requirements these days are not too dissimilar from those requirements; the days of using a 386sx with 4MB have passed a long time ago.
Linux hardware requirements these days are not too dissimilar from those requirements
NOT true... I run Linux on older hardware quite a bit, actually. I still have an old laptop (from 2003-ish) that has Linux on it, an older Debian release, that I used for consulting a year and a half ago when I needed a Linux box and they didn't have one available for me. All I needed was ssh and some file and network utilities to talk to a couple of RPi devices, but Windows is _SO_ pathetic for that, so I used a 2003-ish laptop with only 512MB of RAM and a relatively TINY hard drive as a PRODUCTIVITY booster.
A week later they handed me a year-old CPU box that I took home and put Linux on, overnight. No problems since.
Now I _WILL_ admit that *GLUTTONOUS* *PIG* applications like Chromium and Firefox [which manage memory about as well as an alcoholic manages booze] won't run very well on that old laptop, since they're so "modern" and all, but pretty much everything ELSE [including gimp] seems to work JUST FINE on a system with less than 1GB of RAM, running X on a 1024x768 laptop screen, with a Debian release that barely pre-dates its inclusion of systemd.
I bought it when it came out - along with GEM and something called DoubleDos. Windows 1.0 was number 3 in popularity for me - but even then, I could see the writing on the wall.
I also remember buying Oracle v1 for DOS - not a patch on Paradox (which came a little later).
WinMob 7 was supported by developers, but MS kept moving the goalposts, then brought out WinMob 8, which was incompatible with 7, so everyone's work was wasted. With MS's track record of dropping support for its own products like a fresh turd, no wonder developers walked away and put their effort into iOS and Droid.
Back OT, I remember Win 3.11 being pretty functional and usable on what today looks like kit no more powerful than a calculator. It was pretty stable for me as well, as long as you didn't ask too much of it.
yep, you read it...
The original Amstrad PCs (PC1512, PC1640, PC6128, ...) came with GEM... and it wasn't on top of DOS.
Too bad it didn't survive Windows 1.0 and ended up as a limited 68000 (Atari ST) product.
And yes, while I never used Windows 1.0, I did use Windows 2.0/286 (because at that time there was a different Windows for the then top-of-the-line 80286s).
My father bought one at work to scan parts of blueprints... and no, it didn't have Photoshop; the imaging software at that time was ImageIn.
Well, he computerised his company around that time, and people in the office learned the fun of Multiplan, dBase and more. (Those in the machining halls didn't care; they already had enough fun transforming bits of metal into tons of heavy machinery.)
Was the GDI system. Set up your printer and it just works for everything.
I'm crusty and near-dead enough to remember the days before. I had to set up some weird non-standard laser printer that only understood Epson LQ (yeah, dot matrix) and its own protocol that nothing else seemed to speak.
So, WordPerfect 5.1, xywrite, FoxPro, and various other things. All set up one by one, all with their own quirks and oddities.
Then along came Windows 3.1, and the printer garbage only needed to be done once; everything that printed within Windows then worked.
Windows hasn't been part of my life since 2001, when I switched to Linux and KDE/Gnome.
Windows was, and still is, an awful OS. It arguably held computing back by a decade because of its appalling deficiencies in networking and security.
Every time I see pagefile.sys, I cringe.
Azure doesn't work without Linux. Pretty much every serious advance in computing in the last 20 years has happened on Linux.
And now ESR speculates that Windows will become just an emulation layer on top of Linux.
I won't miss windows.
Biting the hand that feeds IT © 1998–2020