
A company that loses the consumer market and ends up being an expensive business-only player will eventually fail.
The man in charge of Microsoft founder Paul Allen's investment vehicle has suggested Microsoft should be broken up. Allen left Microsoft in 1982 after an illness, but retained a huge parcel of shares and still owns a couple of billion dollars' worth of Microsoft stock. That stake is managed by Vulcan Capital, whose chief investment …
IBM demonstrated that business services and back-end hardware, even while killing off all its consumer tech, can result in a thriving business.
What I suspect is really the case is that businesses that do R&D thrive. All these people circling round MS, saying it should split up, would be the first to kill off R&D, and that would be the death of MS.
@AC 00:05 What have the Romans ever given us?
While the Romans probably didn't invent the aqueduct, they certainly made huge advances in making its benefits easily available to lots of people.
Sort of how MS took a bunch of Unix technologies and made them available in ready-to-roll form as Active Directory. Before that it was their acquisition of MS-DOS.
"Before that it was their acquisition of MS-DOS."
An "acquisition" is not R&D!
"Sort of how MS took a bunch of Unix technologies as made them available in a ready-to-roll form as Active Directory."
Actually Active Directory was based on Novell Directory Services. Sure, both are based on X.500, which was released several years prior. NDS had an issue upon release and Novell offered a workaround before an actual fix was released. Microsoft released AD and, surprise surprise, it had the same flaw and Microsoft offered the same workaround. Microsoft vowed an actual fix but never delivered one, instead waiting until a later release of Windows to actually fix it.
So, both of your examples actually just proved my prior statement that copying others is not R&D. The fact is, Microsoft has done very little innovation. Vista was supposed to have a lot of features found in other OSes available at the time. The schedule for Vista kept being pushed further back until Microsoft pulled the plug on a lot of those features. Vista also saw features available in XP removed, only to be added back in with Windows 7. Vista had a 5 1/2 year development process but was an incomplete OS compared to XP and Windows 7.
Microsoft has a cash cow of products and they are just riding that wave. Many companies have been in that position in the past and they all came crashing down at some point.
So, with Linux being just a copy of Unix, do you mean Linux involved no R&D at all?
Sometimes reinventing the wheel is not the way to go; sometimes it is. Microsoft does R&D, copying and acquisitions like any other company. Where do you believe the multitouch technology that made Apple's mobile devices successful came from? From internal R&D? Nope, it was an acquisition (FingerWorks). What is Android built on? A copy of Java, and Apache Harmony. Lots of internal R&D there too.... Research: "free java library" -> "I'm feeling lucky".
MS may have purchased the QDOS product, but it was barely functional and certainly needed a very large amount of bug fixing and development before it was released. It was then, IIRC, totally re-written from the ground up for the next major release.
You seem to imply that because a product is based on something else, it somehow needed no development or research? AD didn't just magic into existence, and it certainly uses technologies which aren't available to Novell or X.500 users.
MS do a large amount of R&D for its own sake; you'll see articles fairly regularly on The Reg about things such as their no-glasses 3D screens and various other flashy stuff. Try doing a few internet searches for "Microsoft Research" and see what you find; it's pretty interesting.
> MS May have purchased the QDOS poduct, but it was barely functional and certainly needed a very large amount of bug fixes and development before it was released.
When MS bought DOS from SCP it was _already_ being distributed as SCP-DOS (or 86-DOS) and was sold with SCP's Zebra 8086 S100 boxes.
Certainly it was modified to use MS's FAT file system rather than the CP/M cloned file system.
> It was then IIRC totally re-written from the ground up for the next major release.
It was rewritten by IBM after Gary Kildall demonstrated PC-DOS 1.0 showing a DRI copyright notice that was buried deep in the code. Both MS and SCP were CP/M OEMs: SCP with their Zebra 8-bit systems and MS with the Z80 SoftCard. It has been alleged that SCP decompiled the CP/M BDOS (there were commented decompilers available at that time - see Byte magazine ads) and put the code through Intel's 8-to-16-bit ASM converter (also available) to arrive at the initial QDOS.
After IBM rewrote PC-DOS as 1.1 it was passed back to MS to create MS-DOS 1.25.
It was rewritten again to make 2.x and to include hard disk support (which had been available for several years on CP/M).
R&D ?
My favourite illustration of Windows since 3.1 was stacking glasses on the bar, one for each "layer" of the OS; around Win98SE I had to change to paper cups, it was that unstable ...
Windows was always a single-user system, adapted/updated/modified to work with the WWW; that's why it's drowning in viruses, and Java ...
Bill jumped ship, now rats fight for scraps, before it goes down ....
@JamesTQuirk - Thank you for demonstrating that you know nothing about modern Windows. The NT/2k/XP series of OSes have absolutely nothing to do with the 3.1/95/98 series. The NT OS is multi-user from the ground up.
Also, Bill hardly jumped ship, what with him being the Chairman of the board of Microsoft.
@ Anonymous Coward (Windows Certified Thingy?), are you somebody who paid Microsoft a wad of cash to be shown how to actually use & organise their easy-to-use software ...?
You're so right. I've been using Debian since about '94; I saw the writing on the wall. For me, Windows gets buried under a Linux install, ALWAYS .. So yes, "modern" Windows has no value for me, I met its ancestors ....
Why bother commenting on comment threads about Windows when you by your own admission have no knowledge of the OS?
I've been using Linux since about 1995, I've also been using Windows since 3.1 and DOS from way before then. I know for certain that I wouldn't compare Linux from 1995 to Linux today, so why would you think that Windows is just the same?
How do you even know that Windows has no value to you, if you don't know enough about it to know that it's not a single user system?
Yeah, Windows 8 will be a problem as soon as they stop bundling laptops with anything else, before I buy a new one, but I will find a copy in the junk pile soon ...
I have multiple copies of XP and Win7 here; I've seen them, fixed them, & I charge Windows types heaps for it, de-virus/trojan etc regularly, for others that use Windows. How people can use that environment bewilders me. I know it's old legacy software (the name says it all) that business is too tight to replace. Well, sorry I prodded your fragile Windows egos, but it sucks; it sucked 20, 10, 5 years ago & yesterday as far as I am concerned ...
I have knowledge of Windows "NT" or greater, but it tells me to avoid it .....
It was a guess; I was too busy to write it down, and there was no artificial EGO, like Twitter, Facebook, things that teenagers use these days. And not sure, did you look at 68K dev? As I said, I first installed on an Amiga 2000, but if it was 19 years ago, who cares; Windows is still a limp dick ...
> The NT OS is multi user from the ground up.
That is simply not true. Originally Cutler intended the new NT to be multiuser but Gates had this removed because he wanted to sell a machine to each user and not one copy of Windows for several users.
Later, Citrix (started by Ed Iacobucci of OS/2 fame) added actual multiuser facilities to NT 3.51 by licensing the NT source code. When NT 4 was released MS refused to release the source code to this until Citrix cross-licensed their multiuser code and communications protocols back to MS (for free?) so that MS could build TSE.
While NT may have serial reuse with different logins, it is not (apart from Terminal Services) actually multi-user (i.e. multiple concurrent users).
Even with Citrix or TSE there are several nasty hacks needed because there is no real separation between the users on one machine.
Erm, lots of Enterprise level software runs on Linux (and commercial Unix OSes) within businesses.
Linux is extremely well established in the Enterprise, certainly in the financial sector.
Running large volume web based systems on Windows is pretty much non existent at the high end of this industry. You sound like you would be quite surprised how much of a consumer you actually are of Linux based systems when you use the web.
To be fair, most desktop PCs within businesses use Windows for everyday user applications, but I think you're ignoring the backend systems which power the real mission-critical stuff a little too much.
MSFT has a very successful business suite called Office, that is used by BOTH businesses and consumers.
If MSFT spun off the "consumer" businesses, what happens to Office?
MSFT has the opportunity to LEVERAGE the success of Office to gain market share in mobile.
The POTENTIAL gains in mobile from leveraging the ownership of Office are immense.
It is short sighted of Vulcan to call for the split up of MSFT at this time, in my opinion.
There is a powerful synergy between consumer and business, based upon the OFFICE suite.
In the future, almost everything will be accessed from the cloud.
But whose cloud?
MSFT has the ability to become one of the top cloud destinations, which will become the key to future profitability.
The OFFICE suite, SKYPE, and gaming content will be key attractions for choosing MSFT's cloud over the competition.
Business people are also "consumers". The trend is NOT to carry 2 phones or 2 mobile computers.
Most Business people want ONE mobile device from each category, that can handle ALL of their needs.
Bottom line, there are significant synergies gained by being both a business supplier, and a consumer supplier.
Just look at BlackBerry... their business offerings are suffering because people do not want to carry around 2 devices...
If you are a multi-billionaire, or the manager of $15 billion, you can have an assistant who follows you around carrying all your various devices... could it be that these super-rich people do not understand how most people operate in the real world?
Please put down the Kool-Aid. You sound almost as bad as the Fanbois.
OpenOffice/LibreOffice killed the Office synergy. Sure, I use Office at work, never at home. I don't want to be stuck on the upgrade treadmill for an occasional-use product. In fact, were it not for the Windows monopoly there would be no Office monopoly for MS today. WordPerfect was a far superior product, even when Office was the only working GUI suite for Windows 95A. Of course, back in those days a full copy of WordPerfect set me back $495. Upgrades were more affordable, of course. MS cut their money supply by selling competitive upgrade copies for $99. WordPerfect tried the same thing, but without the cash flow from a monopoly OS, it didn't help. Although I did know people who owned copies of both during that time, because once you had one the other was only $100 more. One enterprising co-worker even managed to bag both of them for only $200.
We're moving toward platform agnosticism not platform dependencies. Best of breed not monopoly lock-in.
I'm not sure if they are better off as one company or better off split apart. Maybe they can get synergies. Maybe forcing everything into the same mold is what is holding them back. I'm sure as hell in the camp that thinks that's how they screwed up Windows 8. But I'm willing to concede they might be able to maintain a root code base that is similar across devices. Linux certainly does, even when it gets specialized for an appliance as dedicated as a car GPS.
IBM is cleaning up by focusing on business and of course computer research.
The consumer end is really either gadgets or games. Throw in a little search and help and you are there. Samsung came from nowhere, just like Apple did, just like Sony did, just like the next fad manufacturer will. That's the problem with the consumer market: your time in the sun always comes to an end. So target IBM's business and research market; there, IBM is not so much the competitor to beat as the competitor to align with. Security is also floating around, but thanks to the NSA that is going to end up a one-country-at-a-time market; nobody will trust anyone going forward (the fiscal harm the NSA has done to US business is staggering, trillions of dollars will go begging).
So MSN, Xbox (gaming division), Bing (back to MSN Search), and a whole bunch of cash for restructuring and to keep the share value up: basically creating MSN consumer products.
Everything else stays with M$. Add in clearing out deadwood for both divisions, and M$ will be able to trim right down and clean up.
Meanwhile MSN will charge right at Apple and Google, two targets with big revenue to sink its teeth into (focus is all MSN needs).
I've always liked Bing, but until recently it did not have a good option for sorting search results by date, like Google. Now, at least Bing has introduced the ability to search for the past day, week and month, so I'm finding it more useful. They need to add the ability to sort search results by year, and by custom date, like Google.
Bing's image and video search features are already quite superior to Google's from my perspective, and their maps are very good, although the maps don't yet offer auto-complete like Google's maps.
Mind you, it could mean they develop a search engine of their own instead of merely throwing up a script kiddie wrapper round a real one.
@Bob - We've been through this before countless times:
What happened was a Google engineer found a way to game the Bing toolbar (which watches your visited URLs and returns them to MS), so that he visited a page which was present only in a malformed Google search. The Bing toolbar picked up the URL from the browser and sent it back to MS, and the systems at MS basically said "we've not seen this URL before, let's include it in Bing."
You repeat this claim again and again; you are corrected again and again but still do it. This is an entirely dishonest way to behave.
http://searchengineland.com/bing-why-googles-wrong-in-its-accusations-63279
I do believe the Microsoft shills round here are following a script. Why, just last week when the painful reminder of Uncle Fester's "Linux is a cancer" rant came up, they all piped up right on cue with some pathetic argument about it all being in the past so it doesn't count - similar to this Bing-copies-Google one, in fact. Well, the article on Google's official blog hasn't changed in the almost 3 years since it was written, so I'd say it's as much a certainty as it is hilarious watching the apologists try to squirm out of it each time it's brought up.
Which it will be again, I'd warrant.
It's people like you that bring computers to their knees for no good reason and then insist more money must be spent.
Need to run a 30KB DOS executable? No problem, that just means pissing away 256MB of memory and 10GB of disk space on an XP VM...
Oh, and that still doesn't work quite right because it's now on a separate VM rather than alongside everything else it needs...
Round here you'd be told where to go in no short order.
I can't speak for your desktop PC, but mine is Core i7 3770 water cooled, 32GB fast RAM, SSD boot drive and 5TB secondary storage, plus a decent GPU.
I build a PC for my business to last 5 years, 7 with luck because unlike most businesses I have dealt with, I know that 'best value' is not the same as 'least cost'. Whether you're a serious home user or you make your livelihood with your PC, it should be money well spent, not a question of 'how much'.
https://secure.flickr.com/photos/haizo_baum/sets/72157633423490011/
So, No, using the resources available is NOT pissing them away and I won't even feel such a flea-bite.
If you can't get all the resources you need into one XP VM, maybe you need someone to build you an appliance under VMware, that works fine for me too.
I've been in places where they relied on old programs because they were too tight to get a replacement written. I have warned managers of the need to migrate from DOS and 16-bit applications for safety and maintainability too - until I gave it away as a waste of time in 2004!
In my opinion, it is those managers who put their companies at such risk that should be fired, pronto.
In that case it's simple: we'd never hire you.
As a slightly different but essentially identical example: I work at a university where we have a telescope mount controlled from an old XENIX app. We were able to upgrade that to OpenServer 5, but even that is 15 years old now. Anything more recent via XENIX emulation simply doesn't work, since it needs semi-direct hardware access via ioctls. Cost of a new and equivalent mount? £130,000. Are you offering to stump up?
That's still small fry: for example Royal Mail have hundreds of millions tied up in Integrated Mail Processors. There's a fancy front end on some of the newer machines but underneath it all they are still DOS based. Are they supposed to scrap that investment on your whim too?
Remember, you haven't qualified your statement in any way so you are either ignorant of reality or chequebook happy. There are plenty of apps out there tied to one specific platform or other for any number of reasons. Simply wishing them away doesn't work.
Well, I never understood why managers need the latest car/laptop/tablet/phone model, but don't care about upgrading software even when it's getting too old.
Your telescope mount software can't be upgraded? Fine - stay with old software, and maybe hardware, if the software can't be run on newer kit - or virtualize it (and you can still give VMs direct access to hardware if you need it). The latest Hubble upgrade required technicians to work on hardware and software from the '80s - and they did, and didn't ask, "hey, we need <whatever> to support VMS and CP/M and 8" floppies!"
But you can't ask for endless compatibility with old software from newer operating systems; there's a point where going forward means cutting backward compatibility.
If a company like Royal Mail is unable to plan investments to keep its software updated as technology changes, well, there's a lot to be worried about. Do they plan to deliver mail still by horse, steam trains and Morse code? Or did they plan - and invest - to deliver mail using more modern media?
As everything else, software is not something "buy once, use forever". Its obsolescence and upgrade cycle must be planned as everything else - machinery, cars, etc. If management fails it, well, it means there are the wrong people in the wrong place.
"don't care about upgrading software even when it's getting too old"
What do you mean by "too old"?
If you mean it is not doing the job that is required of it, then yes, something needs to be done. But if you have something special that does what is required efficiently and reliably, why change it?
We have an ancient DOS application and it runs for months on end without fault, only restarted when system updates need a reboot or configuration changes have to be made. To re-write that would be man-years of effort, followed by a year or more of bug-fixing, and all to get back to exactly where we currently are in terms of function.
To me that represents business resources that could be used more effectively elsewhere.
And having done so, by your book we would then be told in 1-2 years' time "oh, you don't do it that way any more, the .net/HAL/silverlight/etc assumptions have changed so you need to"... and so go through a lot of said pain again, and again.
As the grammatically-challenged, but pertinent saying goes: If it ain't broke - don't fix it.
"If it ain't broke - don't fix it."
That's what lazy sysadmins spending their work hours watching porn on the Internet, and managers more interested in spending their budgets on new company cars, want you to believe. And that's what made the IT estate a paraphernalia of insecure, badly configured systems running old, unmaintainable software.
You have two choices: let the old software run on old systems, or upgrade it because of lack of support. Everything has an obsolescence cycle - why should software last forever? Are you working at a fifty-year-old desk and chair, lighting it with an old incandescent lamp, typing on a keyboard from the '80s, and using a green-phosphor CRT monitor? Do you still use perfectly working 10Mb hubs? No, you have changed them, even though they still worked, to get something more "modern". Face it - software needs to be upgraded too, and you have to plan and invest for that as well. A well-written DOS application can become a Windows console application with very few changes. If the code is no longer maintainable, or lost, well, the problem is not Windows not supporting DOS apps any longer....
It's funny, there are people here who would kill for the latest mobe (while any old mobe can still place calls perfectly), but Windows has to support thirty-year-old code. BTW - how many 16-bit apps can Linux run? How many old PowerPC applications can the latest OS X run?
"typing on a keyboard from the '80s, and using a green phosphor CRT monitor?"
Actually I do still use a 10+ year old keyboard because it has a much better feel than your typical $5 ones that almost everyone now supplies. But I have changed my monitor to a better one. Not "modern", but better.
"A well written DOS application can become a Windows console applications with very few changes."
Excuse me while I stop laughing. Have you actually tried to port a complex bit of software that was written to use the old DOS graphics libraries and that assumed direct hardware I/O was permitted?
"BTW - how many 16 bit apps can Linux run?"
AFAIK there never were any 16-bit (i.e. segment/offset memory model) programs for Linux, as it started life in a flat 32-bit memory model. Same as Windows NT.
But today a 64-bit Linux can run a lot of 16-bit DOS programs using dosemu. While MS, with billions of dollars to spend, has decided that their equivalent ntvdm (which is how they supported DOS and Win3.1/95/98 stuff) will not be supported any more.
Ultimately with Linux I can do something about it if things break for me, but for Windows I am screwed if MS decided to change something, and after years of Windows usage that is something I now value a lot.
To reply to my AC self:
"assumed direct hardware I/O was permitted"
While ntvdm will not permit general I/O, it emulates serial & parallel port I/O behaviour reasonably well.
However with dosemu if needed (and desperate/bold/stupid, delete as applicable) you can give direct I/O rights to the DOS program for special hardware access.
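For anyone curious what that escape hatch looks like: dosemu takes it from its configuration file. The port range below is purely illustrative (a hypothetical ISA card), and the exact directive syntax should be checked against the dosemu.conf documentation shipped with your version:

```
# dosemu.conf sketch (assumed syntax; verify against your dosemu docs)
# Give the DOS program direct access to a card at ports 0x300-0x31f.
# This bypasses the emulation entirely, so it also bypasses its safety.
$_ports = "range 0x300,0x31f"
```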
It was Intel that removed Virtual 8086 mode from x64 processors running in long mode. There was little MS could do to support DOS in Windows without it, except via emulation. But why spend time supporting really outdated applications and cripple the system with folder redirection and lower security?
It was time to cut some ties with the past and look forward. Why no one complained when Apple removed PowerPC apps support from OSX?
And what DOS graphical applications are you still running today, other than games?
"little MS could do to support DOS in Windows without it, but with emulation"
Well dosemu manages such emulation and they don't have MS' budget for programming staff.
"Why no one complained when Apple removed PowerPC apps support from OSX?"
I think you will find a lot of folk complained. However, not many Apple users have built business or science systems on their products, most have been in the creative/content-generation business who are happier with changing for fashion.
"And what graphical applications are you still running today but games?"
Anything that uses menus and/or a mouse. So basically that is pretty much everything interactive, other than file I/O or database-like programs (which are often easier to port, as long as they were written in C/FORTRAN/Pascal and not in something obscure or using x86 assembly in places).
And right there is the business need to re-write the code to comply with modern security standards. If you allow the 16-bit app direct access you've opened up a hole a mile wide that hackers will sail an oil tanker through.
No it's not pretty, but buck up and pay the money to secure the system. The university telescope you can probably make a case for a waiver on. It doesn't need to be networked. But it will need a decent security protocol as well given Stuxnet.
Want some qualifications, eh?
Firstly, assertions like "we'd never hire you" carry no weight at all when you're over 50 and apparently already past it.
Meanwhile, I make a living writing real-time embedded systems right down to the metal on ARM processors.
One particular device is a classic closed-loop motor control job to throw tennis balls accurately, finding and tracking a star should be child's play today if a processor from 15 years ago could do it.
http://www.tennismatic.com/main/page_products_ball_machines_t200_series.html
In the past I have done IBM 3270 screen-scraping for Australia's first Mobile Data courier system for Skypak, The original Freight Management System for Ansett Airfreight, and the Sortation Plant control program for TNT's Enfield Sortation Plant which handled 4 parcels per second at maximum load UNDER WINDOWS! as much as any Royal Mail plant. I have real-time down cold.
Not having to fly Silent Trader aircraft overnight from Adelaide alone saved Ansett millions.
Feel free to verify any of this with my old manager - http://www.linkedin.com/pub/john-szwec/7/493/94b
I run Cross Compiling, Simulation and PCB design mostly on my PC and for the chap who could have bought a PC each year, I say "9 women can't have a child in 1 month", you wouldn't have the same capability as this PC from year 1 and probably not by year 5 either.
@ mad physicist Fiona
I remember helping somebody called Fiona, with "fig-forth" language for a telescope control system about 30 years ago ...
Anyway, maybe I don't understand, but VMing an old system and running it there is a possible answer; I have DOS 3 to 7 as VMs in Linux. Or, if things get really nasty, run an old PC at the controls with an 8/16/32-bit OS + software, interfaced to the main system via network to drive the mechanics, but maybe there are things I don't see ...
It's just that I have helped a lot of "mad" PC people over the years, and a lot of it is interfacing one piece of gizmotron to another. Once I used the guts of an old printer plotter to repair the controls on a large industrial laser version (cutting dashboard panels for Landrovers from steel sheet); saved him big bucks, & the tight-arse is probably still using it. Should have given him a bigger bill, because I noticed/remembered the control board looked the same. My point is, there is always a way to do things, so other things can move on .....
Using a VM is a good solution as long as you only need "standard" I/O such as serial ports (on normal Baud rates) or parallel.
If you have any special I/O cards, or rely on something odd in the settings of normal I/O (as I found trying a non-standard Baud rate that the UART chip should support) you are likely out of luck with a VM.
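For the standard cases at least, a host serial port can be handed through to a guest. A minimal sketch with QEMU (the disk image name is hypothetical), which also shows why the limits above exist: the guest only ever talks to the hypervisor's emulated UART, so odd Baud rates and special I/O cards remain out of reach:

```
# Hand the host's first serial port through to the guest's COM1.
# The guest talks to QEMU's emulated 16550 UART, which relays to
# /dev/ttyS0 on the host - fine for standard Baud rates, but any
# register-level trickery stops at the emulated UART.
qemu-system-i386 -m 64 -hda dos622.img -serial /dev/ttyS0
```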
Using a VM is a good solution as long as you only need "standard" I/O such as serial ports (on normal Baud rates) or parallel.
Among other things...
People who think VMs are the answer to everything should read Bugnion et al., "Bringing Virtualization to the x86 Architecture with the Original VMware Workstation" (ACM TOCS 2012; requires access to ACM pubs, but if you're not an ACM member, what are you doing in a technical discussion about computing, eh? And TOCS is likely available at a research library near you). VMs for the x86 architecture family have to have special-case handling of a number of tricky things real-mode code tries to do, and it's not feasible to handle all of them. That's why OS/2, for example, has never been supported by VMware.
OK, but I also suggested tying a PC (with the relevant OS + hardware) to the gizmotron that needs it, controlled over the network. So OK, you have 8/16/32-bit in use, just not on the main network; security issues can be firewalled away, usually ...
Any thoughts on that? Because now I am interested; I spend a lot of time rescuing old data for people...
I am wondering because of "Steam" OS etc. for Linux, which now allows a Windows PC to be used to run Windows DRM software over Linux; complete nonsense if you ask me. I have old software/PCs here, 8086 up, so running old stuff is an interest; I try to run the OS the software/environment needs. Getting old apps/games working can be hard work; sometimes booting/running the OS & software in a small FAT16 partition on the HD solves a lot of issues ...
I get by with booting the Windows 7 install that is on the HD as a VM in Xubuntu. Just like the HP that's in the shop, except it's a VM; but I am in 64-bit Xubuntu/Windows and running VT extensions. Works fine, I can use Steam, play Halo etc.
" but if you're not an ACM member, what are you doing in a technical discussion about computing, eh?"
This is year 43 for me in PCs; I can handle it. But as always I am learning; the day you stop is the day you die. So if that disturbs you "technical types", good ......
Unless your business is CAD or similar, I suggest you get real. For the money you wasted on your "business" PC, my business could have bought one new PC EVERY YEAR and still have money left over. Most companies would rather spend the money where it really matters - on their networks or the back end systems that actually do the work. How does having a powerful client PC help improve database performance on my 10 year old P4-Xeon based server?
As for rewriting old DOS software, how the hell is a business that uses an old application supposed to go about getting it rewritten if the third-party developer doesn't want to know? Attempting to update it yourself would be illegal and possibly nearly impossible, would it not? What if you need a COM or parallel port too? Good luck getting that working in a VM.
I can't speak for your desktop PC, but mine is Core i7 3770 water cooled, 32GB fast RAM, SSD boot drive and 5TB secondary storage, plus a decent GPU.
That's my point.
Try living in the real world for a moment. Consider any corp with 1000+ desktops. How many of those are liquid-cooled i7s with SSDs and 32GB of RAM? The typical price for a new office PC base unit seems to be in the £300 range these days, and yes, those machines generally will be expected to last a 3 or 5 year replacement cycle. Look at what spec that budget gets you.
Sure, you can piss away money so you can in turn piss away resources for no good reason, but the real world isn't like that. Yes, best value is the driver here. Ask which is better value over a ten year cycle - three £300 machines or two £1500 machines?
"I run Windows 7 Pro' 64-bit. It has a 32-bit Windows XP VM to run all that stuff, no further development required"
And what happens to your XP VM come April 2014? Oh yes, it goes out of support, so no more bug fixes for the same flawed code that later Windows versions share. How well protected is your 64-bit system from malware in the 32-bit VM?
Can we have a new 32-bit Windows that is supported?
One that supports 16-bit applications?
One that supports older 32-bit drivers for legacy hardware?
Oh, you just told us that is a waste of space...
Well, Microsoft will stop mainstream Windows 7 support on January 13, 2015. But the company will keep providing extended support until January 14, 2020.
http://www.pcworld.com/article/2010820/how-long-will-microsoft-support-windows-7.html
I run multiple firewalls from different vendors in accordance with best-practice recommendations.
I also have my previous PCs and test machines, currently: Windows Vista 32-bit, XP 32-bit and Windows 8.1 64-bit, as well as various Linux flavours, all still working fine.
Only bad workmen begrudge spending money on good tools that will last well and perform well.
If that legacy 16-bit software can't run on a standalone legacy box that doesn't need servicing OR
a VM that's firewalled and isolated as hell from the rest of the system, it shouldn't be anywhere NEAR a modern OS.
Yes, I'm an old DOS hand from about version 3.0. Yes, in some ways it was a lot easier. It is also completely insecure.
And insisting that our 64-bit OS systems handle it at a native level, without an emulator in between, is insane. There's a good reason we no longer give any random application direct access to the hardware. In fact, the reasons are plural, not singular.
MS's huge income from some business units allows them to keep some really bad stuff alive.
Take for example, MS's foray into phones and other Windows CE activities. This should have died a death in 2001. However the infinite MS cash mountain has allowed these products to be kept alive with vast marketing budgets. This has caused confusion in the marketplace.
@Charles, I disagree!
MSFT has added an alternative mobile platform, which created more competition, which spurs innovation, which is good for the end customer.
MSFT has INVESTED many billions of dollars into R&D, and that has resulted in a very good mobile phone operating system today.
It does not matter if their initial products were deficient, and not competitive.
Society benefits from competition!
Society is benefiting from MSFT competing in the mobile arena.
No, I expect Bing, Xbox, and possibly even RT to zombie on. MS almost never lets an idea die. They put it on the shelf for a few years, pull it back off, blow the dust off, hand it to a new team of developers to shine it up a bit, and then release it with a new, jazzier name. In fact, I think the only two things they've really let die are Bob and Encarta.
“The search business and even Xbox, which has been a very successful product, are detracting from that."
Last I looked, the Xbox has made a loss of $3 billion since they entered the gaming arena. That's successful?
I guess the Surface is a booming success by that measure too, along with the Zune...
If I were an MS investor I'd want the Xbox, Surface, and most such things culled to get them back on track.
And I have considered Windows a consumer product of dubious quality that sneaked into business, creating the first and largest catastrophe in the history of IT. And even if I had the chance to live my life again, I cannot see how that would change the facts. Bill Gates went for the consumer personal computer from the very beginning. For a while he played around with Xenix, a *nix system that would have saved us and business billions and billions, but he took the fast and dirty road. A personal success, no doubt, as he became the richest guy among those who cannot hide. A catastrophe, all the same, especially for business.
DOS and Windows came into business because the incumbents thought they had business sewn up; they were happy to charge £10k for a workstation, and thousands for dumb terminals and their infrastructure. The file servers at the time were big-iron minicomputers you paid through the nose for. I hardly think that MS can be blamed for producing good-enough, cost-effective alternatives. No-one cared if a file server went down once a month when they were paying such a tiny fraction of the cost of the big-iron Unix systems. Novell may have been more reliable as a server, but they never had a full client, so yes, you could get a more cost-effective, reliable server side, but still not a cost-effective client. This sealed Novell's fate, and that of Windows also.
And you paid yearly fees for software licenses... and had very little choice of software and development tools. It's funny how many people who started using a real computer with DOS/Windows keep on complaining, when they could do so only because DOS/Windows made business computers affordable and versatile, while the previous wave of home PCs were very closed systems with little or no interoperability even across successive models, not upgradable and barely expandable.
Were DOS and Windows perfect? No, not at all. But they worked well enough and were cheap enough to allow many people to get a decent computer. And without DOS/Windows, I guess there would not have been any Linux - because the x86 hardware most people run it on would not have been so widely available.
> while the previous wave of home PCs were very closed systems with little or no interoperability even across successive models, not upgradable and barely expandable.
You obviously weren't even born then. Many systems from the mid-70s used CP/M, on which MS-DOS was based. There were several CP/M clones such as CDOS and Turbo-DOS. Many machines were based on the S100 bus (e.g. the Altair) and these had swappable and upgradable boards.
Even the Apple II from 1977 could take add-in boards such as the Z80 Softcard that ran CP/M. All CP/M, MP/M and clone machines were interoperable because they could all run the same software.
> And without DOS/Windows, I guess there would have not been any Linux - because the x86 hardware most run it on would not have been so widely available.
Linux, and Unix before it, can run on almost any CPU, it didn't need x86. MS-DOS was forever stuck on 8086/8088 CPUs (or 8086 mode on others) because it could only run in real mode. Granted Windows could switch to 80386 mode.
Linux was the first OS to run on AMD's x86-64 and Intel's copy of it, well before Windows got around to it.
I suppose you will claim that it is Windows RT that makes ARM CPUs widely available.
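The 1MB ceiling mentioned above falls straight out of real-mode address arithmetic; a quick sketch (Python used only for the sums):

```python
# Real-mode 8086 addressing: physical address = segment * 16 + offset.
def real_mode_addr(segment: int, offset: int) -> int:
    return (segment << 4) + offset

# Highest address reachable with 16-bit segment and offset registers:
top = real_mode_addr(0xFFFF, 0xFFFF)
print(hex(top))  # 0x10ffef -- just over 1 MiB (the "HMA" on a 286 and later)

# Without switching out of real mode, nothing beyond this is addressable,
# which is the ~1MB ceiling being argued about here.
```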
I was speaking about "home PCs", not the business models, which were very expensive even then. Most CP/M machines, and Apple's, were too expensive for home use.
"Many systems from the mid-70s used CP/M"
Yes, and that was the only common denominator. Everything else was often "proprietary", even if they were built around the same CPU. Sure, an "add-on card" to run CP/M on an Apple... a cool way: put a computer inside another to run another OS. "Interoperability" was at the source code level, not the binary level. Whereas all DOS/Windows machines could run the same executables, and boards and peripherals were (and are) compatible across different models and brands thanks to the use of common standards.
"Linux, and Unix before it, can run on almost any CPU"
Sure, but if the CPU and the hardware around it cost 10k+, how many could afford it? If MS had not licensed DOS to clone manufacturers, there would never have been the low-cost PC clones, nor the spread of Linux thanks to them. Linux would maybe be just another OS used only in universities.
Without PCs, Intel would maybe not have become the powerhouse it is now.
"MS-DOS was forever stuck on 8086/8088 CPUs (or 8086 mode on others) because it could only run in real mode."
False. DOS extenders made DOS applications run in protected mode on the 286 and 386, and access more than 640K. Maybe you weren't even born then, or maybe you were just playing with PCs and not coding.
"Windows RT that make ARM CPUs widely available."
No, but neither was it iOS nor Android. Smartphones and other mobile devices used them - including Windows CE - well before both.
> Most CP/M machines, and Apple's, were too expensive for home use.
Untrue. The Apple II was a home computer. CP/M machines were no more expensive, and usually cheaper, than MS-DOS ones when those became available.
> "Interoperability" was at source code level, not binary level.
Completely untrue. Any CP/M machine could run any CP/M binary executable. They were configurable for the screen controls, but then so were many MS-DOS programs*.
> While all DOS/Windows machines could run the same executables, and board and peripherals were (and are) compatible among different models and brands thanks to the use of common standards.
Complete nonsense. In the early days there were many computers running MS-DOS that were _not_ IBM PCs. HP, Wang, DEC, Apricot and SCP all built machines that were _completely_ different and used either S100 or proprietary buses for their boards. Later, many moved to IBM compatibility with the ISA bus; then there was the PS/2 with the incompatible MicroChannel, then EISA, PCI, PCI Express.
You won't push an ISA board into a PCI slot.
> If MS had not licensed DOS to clone manufacturers, there would have never been the low-cost PC clones
No, there would have _continued_ to be low-cost CP/M, CP/M-86 and Concurrent-CP/M machines - at around the same price as IBM-PC clones.
> Without PCs, Intel would have not maybe become the powerhouse it is now.
I don't know why you tie Intel to DOS; before there was DOS, Intel had the CP/M and CP/M-86 markets and several others. There were also Motorola, Zilog and others, but they were all equally capable of making high-volume, low-cost CPUs and systems.
> DOS extenders made DOS applications run in protected mode on 286 and 386, and access more then 640k.
No, you are completely wrong yet again. DOS extenders could switch the machine between real mode and extended mode in order to swap RAM pages between real memory and extended memory, but neither DOS applications nor DOS itself* _EVER_ ran in anything but real mode, and they could never directly access anything above the 1MB address space. If they wanted to access data not currently in that address space, they asked the EMS manager to fetch it to where they could access it.
Also, the 640KB limit was not a DOS limit but a restriction of the IBM PC. Other machines running MS-DOS could access the full 1MB address space of the 8086 model.
> Smartphones and other mobile devices used them - including Windows CE - well before both.
And home computers (Acorn Archimedes) and millions of other embedded devices well before those.
* For example, Turbo Pascal was available for CP/M, CP/M-86, MS-DOS and PC-DOS. The MS-DOS version could be configured for serial terminal screens (or ANSI if you had a display adaptor card and monitor).
* Actually there was an 80286 MS-DOS: versions 4.0 and 4.1 (not to be confused with the much later 4.01) were based on 3.1 and 3.2 respectively. It was used by Wang, Siemens and ICL (where I worked with it) and was known as 'European DOS'. It was intended to run protected-mode programs and provided some multitasking with background tasks. It was dumped when OS/2 was started.
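The EMS mechanism described a few posts up - applications asking the manager to fetch pages to where real-mode code can see them - is classic bank switching. A toy sketch (Python, purely illustrative: the class and method names are made up, and real DOS programs went through the INT 67h interface rather than anything like this):

```python
PAGE_SIZE = 16 * 1024   # LIM EMS logical pages are 16 KB
FRAME_SLOTS = 4         # the page frame below 1MB holds 4 such pages (64 KB)

class EmsManager:
    """Toy model of EMS bank switching (names invented for illustration)."""
    def __init__(self, expanded_pages: int):
        # Expanded memory lives outside the CPU's real-mode address space.
        self.expanded = [bytearray(PAGE_SIZE) for _ in range(expanded_pages)]
        # Which logical page is currently visible in each page-frame slot.
        self.mapping = [None] * FRAME_SLOTS

    def map_page(self, logical: int, slot: int) -> None:
        # The app asks the manager to make a page visible in the frame --
        # the "fetch it to where they could access it" step described above.
        self.mapping[slot] = logical

    def read(self, slot: int, offset: int) -> int:
        return self.expanded[self.mapping[slot]][offset]

    def write(self, slot: int, offset: int, value: int) -> None:
        self.expanded[self.mapping[slot]][offset] = value

ems = EmsManager(expanded_pages=64)   # 1 MB of expanded memory
ems.map_page(logical=42, slot=0)      # make page 42 visible in the frame
ems.write(slot=0, offset=0, value=0xAB)
ems.map_page(logical=7, slot=0)       # bank switch: page 42 is hidden again
ems.map_page(logical=42, slot=1)      # but its data persists
print(hex(ems.read(slot=1, offset=0)))  # 0xab
```

Only the mapping changes on a switch; the data stays put. That made it cheap enough to be bearable, but it was still no substitute for a flat protected-mode address space.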
> Sure, but if the CPU and the hardware around costs 10k+, how many could afford it?
Why do you pick 10k+? Ataris and Amigas, 68000-based, were perfectly capable of running Linux, and did so, and were priced at much less than an IBM PC or clone of the time.
There is nothing about an IBM PC clone that suddenly makes it cheaper than other non-IBM-PC machines of the day. It wasn't as if the components were cheaper if you mentioned IBM or Bill Gates's name.
> Linux was the first OS to run on AMD's x86-64 and Intel's copy of it, well before Windows got around to it.
Actually, no. NetBSD was first. Its truly modular structure meant the device drivers simply worked, as opposed to the rewrites needed on Linux. All that was needed was memory management code for the new architecture. The NetBSD team got their x64 port out in five days. It took the Linux crowd six months.
> Actually, no. NetBSD was first.
https://en.wikipedia.org/wiki/X86-64#Linux
"""Linux was the first operating system kernel to run the x86-64 architecture in long mode, starting with the 2.4 version in 2001 (prior to the physical hardware's availability)"""
While NetBSD had been working on x86-64 since 2001, it didn't run until after Linux.
> """Linux was the first operating system kernel to run the x86-64 architecture in long mode, starting with the 2.4 version in 2001 (prior to the physical hardware's availability)"""
> While NetBSD was working on x86-64 since 2001 it didn't run until after Linux.
Well, have that completely pointless bone if you want it. He's clearly not talking about the same thing. You are talking about getting isolated code fragments running on an EMULATOR, versus the device drivers and everything else that makes up a complete system on HARDWARE.
Two different metrics entirely, and all you are doing here is emphasizing how long the work actually took. Linux ISN'T always the first with everything, we already knew that. How long did you have to wait for real POSIX threads or NAT, again? Yes, I know you had cheap copies of each - clone() and IP masquerading - but the real deals?
The future is not guided and/or created by internetworking information and intelligence search engines, for such is the realm and raison d'être of virtual knowledge placement machines .... with the simple complexity of pleasure robot programming in the what you give is what you get gold standard class, that which excels and exceeds and leads all others to follow in full and reasonable expectation of insatiable satisfaction for constant progress.
Beware and be aware though, that all sub-prime product delivery systems sow the seeds and feeds of one's own destruction and catastrophic delivery systems collapse.
Oops, that Valiant Victor AIdVenTuring post should be started with ....
The future is not just justly guided and/or created by internetworking information and intelligence search engines, etc etc ....
Apologies for the earlier early duff intelligence/disinformation/misdirection.
If they were to split, it would be a good thing - hopefully not a Vista/8 again. Though when you spend a lot of time with M$ stuff you realise the different product groups almost behave like little companies. The ADK can't be bundled with SCCM? Shared Excel files don't work on a DFS share?
Still think Microsoft would be a better company if it was split into different companies. A consumer company and a business company.
Windows and Office would stay 'Business', but the business side would sell the underlying OS to the consumer division for them to build their consumer systems on.
That way, they could stop ruining the enterprise products!
I am slightly confused by the assertion that Microsoft has failed to get into the consumer business.
I see lots of Xboxes around. I also find that most people I know have a PC or two running windows at home.
It's true that they've rather missed out on phones and tablets. But surely the failing then is failing to hang on to the consumer business, rather than not having got into it in the first place?